This application claims priority to and the benefit of Taiwan Application Serial Number 109125338, filed on Jul. 27, 2020, the entire content of which is incorporated herein by reference as if fully set forth below in its entirety and for all applicable purposes.
The disclosure generally relates to an object detection method, system, and storage medium, and more particularly, to an object detection method, system, and storage medium for a 3D mammogram.
The conventional method for detecting objects in the 3D mammogram is to process the 2D slice images of the 3D mammogram. For example, pixel features of the 2D slice image, such as the texture and the brightness, are taken as parameters to analyze, and the selected pixels of the 2D slice image are used to detect the object of the 3D mammogram. However, because the conventional method analyzes the 2D slice images, when the ranges of the 2D slice images which are determined to contain the object are spliced to recover the 3D mammogram, the border of the object range in the 3D mammogram is not continuous. This reduces the precision of detecting the object range in the 3D mammogram.
The conventional method for detecting objects in the 3D mammogram includes the deep learning algorithm, the machine learning algorithm, and so on. Such an algorithm requires a large amount of feature data to train the model. If the amount of data is not enough, the model becomes over-fitting or under-fitting.
On the other hand, the conventional method for detecting objects in the 3D mammogram uses the deep learning algorithm, the machine learning algorithm, and so on, and applies a single decision, such that the performance of detecting the objects in the 3D mammogram is limited by the performance of the single model and the feature it relies on.
In summary, the conventional approaches suffer from several problems: the range of the object in the 3D mammogram, recovered from the object ranges detected in the 2D slice images, is not continuous, so the detection result is not precise; the amount of feature data is not enough, so the model is over-fitting or under-fitting; and the single decision reduces the performance of detecting the object. Accordingly, how to solve these problems is a challenge faced by people skilled in the art.
An embodiment of the present disclosure provides an object detection method suitable for a 3D mammogram. The 3D mammogram includes a plurality of voxels. The object detection method includes steps of: controlling N filters to execute a filtering computation on the 3D mammogram respectively to generate N 3D filtering images, where N is an integer larger than 1; computing a difference value between the Mth and the (M−1)th 3D filtering images to generate a plurality of 3D differential images, where M is an integer between 1 and N; executing a filtering computation on the plurality of 3D differential images to generate a plurality of 3D smooth differential images; computing a difference variation among the plurality of voxels of the plurality of 3D smooth differential images to obtain a blurriness value of the plurality of voxels; using the blurriness value of the plurality of voxels in a decision module to execute a plurality of first decision operators to generate a plurality of first decision results, and using one or more of the plurality of first decision results to execute a plurality of second decision operators to generate a plurality of second decision results; and executing a final decision operator by using the plurality of first decision results and the plurality of second decision results to generate a detection object of the 3D mammogram.
One aspect of the present disclosure is to provide an object detection system. The object detection system includes a storage device and a processor. The storage device is configured to store a 3D mammogram. The 3D mammogram includes a plurality of voxels. The processor is connected with the storage device, and the processor includes N filters, a plurality of differential image operators, a blurriness value computation operator, and a decision module. The N filters are configured to execute a filtering computation on the 3D mammogram respectively to generate N 3D filtering images, where N is an integer larger than 1. The plurality of differential image operators is connected with the N filters, and the plurality of differential image operators is configured to compute a difference value between the Mth and the (M−1)th 3D filtering images to obtain a plurality of 3D differential images, where M is an integer between 1 and N. The blurriness value computation operator is connected with the plurality of differential image operators, and the blurriness value computation operator is configured to execute a filtering computation on the plurality of 3D differential images to generate a plurality of 3D smooth differential images. The blurriness value computation operator is further configured to compute a difference variation among the plurality of 3D smooth differential images to obtain a blurriness value of the plurality of voxels. The decision module is connected with the blurriness value computation operator, and the decision module is configured to use the blurriness value of the plurality of voxels to execute a plurality of first decision operators to generate a plurality of first decision results, to use one or more of the plurality of first decision results to execute a plurality of second decision operators to generate a plurality of second decision results, and to execute a final decision operator by using the plurality of first decision results and the plurality of second decision results to generate a detection object of the 3D mammogram.
One aspect of the present disclosure is to provide a non-transitory computer-readable storage medium storing computer-executable code including instructions for causing a processor to perform operations including: acquiring a 3D mammogram comprising a plurality of voxels; controlling N filters to execute a filtering computation on the 3D mammogram respectively to generate N 3D filtering images, where N is an integer larger than 1; computing a difference value between the Mth and the (M−1)th 3D filtering images to generate a plurality of 3D differential images, where M is an integer between 1 and N; executing a filtering computation on the plurality of 3D differential images to generate a plurality of 3D smooth differential images; computing a difference variation among the plurality of 3D smooth differential images to obtain a blurriness value of the plurality of voxels; using the blurriness value of the plurality of voxels in a decision module to execute a plurality of first decision operators to generate a plurality of first decision results, and using one or more of the plurality of first decision results to execute a plurality of second decision operators to generate a plurality of second decision results; and executing a final decision operator by using the plurality of first decision results and the plurality of second decision results to generate a detection object of the 3D mammogram.
It is to be understood that both the foregoing general description and the following detailed description are by way of example, and are intended to provide further explanation of the disclosure as claimed.
The disclosure can be more fully understood by reading the following detailed description of the embodiments, with reference made to the accompanying drawings as described below. It should be noted that the features in the drawings are not necessarily to scale. The dimensions of the features may be arbitrarily increased or decreased for clarity of discussion.
The technical terms "first", "second", and similar terms are used to describe elements for distinguishing the same or similar elements or operations, and are not intended to limit the technical elements or the order of the operations in the present disclosure. Furthermore, the element symbols/labels can be used repeatedly in each embodiment of the present disclosure. The same and similar technical terms can be represented by the same or similar symbols/labels in each embodiment. The repeated symbols/labels are provided for simplicity and clarity, and they should not be interpreted to limit the relation of the technical terms among the embodiments.
Reference is made to
In some embodiments, the object detection system 100 is configured to detect an object in the 3D mammogram. In the disclosure, the object refers to a breast lesion, for example, a mass, a calcification, and so on. It should be noted that the object of the disclosure is not limited thereto. The object detection system 100 computes a blurriness value of the 3D mammogram and trains a decision operator of a decision module by using the blurriness value and/or a combination of feature values. Therefore, the image analysis method can be applied to an indicated object, such as the mass, to detect the mass in the 3D mammogram. In some embodiments, the output device 130 is a display device. If the processor 110 detects the mass or the calcification in the 3D mammogram, the processor 110 tags a range of the mass or the calcification, and the output device 130 displays the coordinates of the range which covers the breast lesion in the 3D mammogram. For the sake of simplicity, the term "object" in the disclosure represents the image feature of the breast lesion.
Reference is made to
In some embodiments, the N filters 112a-112n apply different parameters such that the N filters 112a-112n generate different filtering results.
Reference is made incorporating with
In step S310, the N filters 112a-112n execute a filtering computation on the 3D mammogram respectively to generate the N 3D filtering images, where N is an integer larger than 1.
Reference is made to
I_k = I_0 ⊗ G_k, where 1 ≤ k ≤ N (function 1)
In function 1, I_0 is the 3D mammogram, I_k is the kth 3D filtering image, G_k is the kth 3D Gaussian low-pass filter, and the operator ⊗ is the convolution operator. The parameter G_k of function 1 is computed by function 2:
G_k = g_x ⊗ g_y ⊗ g_z (function 2)
In function 2, g_x, g_y, and g_z are one-dimensional Gaussian low-pass filters, where g_x is applied as the filter in the x-direction of the 3D mammogram, g_y is applied as the filter in the y-direction of the 3D mammogram, and g_z is applied as the filter in the z-direction of the 3D mammogram. The parameter g_x of function 2 is computed by function 3.
In function 3, σ_x is the standard deviation of g_x, and L_x is the length of g_x. α_x is a weighting factor of g_x, and α_x is computed by function 4:
α_x = Σ_{i=0}^{L_x−1} exp(−(i − (L_x−1)/2)² / (2σ_x²)) (function 4)
Similarly, g_y and g_z are also computed by function 3 and function 4.
Based on functions 1 to 4, after the filtering computation of the N filters is executed on the 3D mammogram respectively, the N 3D filtering images I_1, …, I_N are generated.
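As a rough illustration of functions 1 to 4, the following sketch builds N separable 3D Gaussian low-pass filters and convolves each of them with a 3D mammogram volume. It is a minimal sketch under stated assumptions, not the claimed implementation: the explicit discrete form of function 3 is assumed to be a standard normalized Gaussian, NumPy/SciPy are used only as example libraries, and the kernel lengths, standard deviations, and stand-in volume are placeholders.

    import numpy as np
    from scipy.ndimage import convolve1d

    def gaussian_1d(sigma, length):
        # 1D Gaussian low-pass filter of the given length (functions 3 and 4);
        # the coefficients are normalized so that they sum to one (assumed form).
        i = np.arange(length)
        g = np.exp(-((i - (length - 1) / 2.0) ** 2) / (2.0 * sigma ** 2))
        return g / g.sum()

    def filter_3d(volume, sigma, length):
        # Function 2: separable 3D Gaussian low-pass filter G_k = g_x * g_y * g_z,
        # applied as three 1D convolutions along the z-, y-, and x-directions.
        out = volume.astype(np.float64)
        for axis in (0, 1, 2):
            out = convolve1d(out, gaussian_1d(sigma, length), axis=axis, mode='nearest')
        return out

    # Function 1: I_k = I_0 convolved with G_k for k = 1..N
    # (sigmas taken from the mass example described further below).
    sigmas = [3, 7, 11, 15]                    # placeholder standard deviations
    I0 = np.random.rand(64, 128, 128)          # stand-in for a 3D mammogram volume
    filtered = [filter_3d(I0, s, length=4 * s + 1) for s in sigmas]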
In step S320, a difference value between the Mth and the (M−1)th 3D filtering images is computed to generate a plurality of 3D differential images.
Reference is made to
D_k = |I_{k−1} − I_k|, where 1 ≤ k ≤ N (function 5)
In function 5, the operator |·| is the voxel-wise absolute value operator.
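Continuing the sketch above, function 5 can be expressed as the voxel-wise absolute difference between consecutive 3D filtering images, with I_0 taken as the original volume:

    # Function 5: D_k = |I_{k-1} - I_k|, with I_0 the unfiltered 3D mammogram.
    images = [I0] + filtered                   # I_0, I_1, ..., I_N from the sketch above
    diffs = [np.abs(images[k - 1] - images[k]) for k in range(1, len(images))]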
In step S330, for each voxel of the 3D mammogram, a difference variation among the corresponding voxels of a plurality of 3D smooth differential images is computed to obtain a blurriness value of the voxel.
Reference is made to
H_k = D_k ⊗ Ω, where 1 ≤ k ≤ N (function 6)
In function 6, Ω is a 3D averaging filter, and H_k is the kth 3D smooth differential image. The difference variation among the voxels of the plurality of 3D smooth differential images is then computed to obtain the blurriness value of the voxels. The blurriness value of the voxel is computed by function 7:
B = Σ_{i=1}^{N} w_i·H_i (function 7)
In function 7, B is a blurriness value image, and w_i is a weighting value corresponding to the ith 3D smooth differential image. In the disclosure, the difference variation between a center voxel and the voxels around the center voxel is calculated by a numerical estimation. The larger the blurriness value is, the smaller the difference variation between the center voxel and the voxels around it, such that the visual effect is soft. The smaller the blurriness value is, the larger the difference variation between the center voxel and the voxels around it, such that the visual effect is sharp.
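Continuing the sketch, functions 6 and 7 can be approximated with a uniform 3D averaging filter followed by a weighted sum; the size of the averaging window and the weighting values w_i below are assumptions, not values given in the disclosure:

    from scipy.ndimage import uniform_filter

    # Function 6: H_k = D_k convolved with Omega, where Omega is a 3D averaging
    # filter (here a 5x5x5 uniform window, an assumed size).
    smooth_diffs = [uniform_filter(d, size=5, mode='nearest') for d in diffs]

    # Function 7: B = sum_i w_i * H_i, the per-voxel blurriness value image
    # (equal weights are an assumption).
    weights = np.ones(len(smooth_diffs)) / len(smooth_diffs)
    B = sum(w * h for w, h in zip(weights, smooth_diffs))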
When the type of the breast lesion is different, the image features of the 3D mammogram (such as the object size, the brightness distribution, the texture, and the contour) are also different accordingly. For example, the size of the mass is larger than the size of the calcification, the distribution of the mass is more concentrated than the distribution of the calcification, and the quantity of the calcifications is larger than the quantity of the masses. Therefore, the filter parameters of functions 1 to 7 have to be set according to the object type to be detected in the 3D mammogram. In some embodiments, the filters 112a-112n apply the 3D Gaussian low-pass filter, where σ is the standard deviation, i.e., σ_x = σ_y = σ_z = σ. When the 3D Gaussian low-pass filters use appropriate standard deviations, the 3D mammogram is processed by the appropriate filters to generate the appropriate 3D filtering images, 3D differential images, 3D smooth differential images, and blurriness value image. Accordingly, the object is detected correctly.
In some embodiments, the object detection method for detecting the mass applies four 3D Gaussian low-pass filters, and the four 3D Gaussian low-pass filters apply the standard deviations σ_1 = 3, σ_2 = 7, σ_3 = 11, σ_4 = 15. In some embodiments, the object detection method for detecting the calcification applies four 3D Gaussian low-pass filters, and the four 3D Gaussian low-pass filters apply the standard deviations σ_1 = 3, σ_2 = 5, σ_3 = 7, σ_4 = 9. That is, the plurality of 3D Gaussian low-pass filters apply different standard deviations based on the object type. In some embodiments, the N standard deviations σ_1, σ_2, …, σ_N of the N 3D Gaussian low-pass filters satisfy the requirement σ_1 < σ_2 < … < σ_N.
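By way of illustration only, the standard deviations selected per object type in the embodiments above could be organized as a simple lookup, with the increasing-sigma requirement checked explicitly; the dictionary name and structure are assumptions, not part of the disclosure:

    # Standard deviations per object type, taken from the embodiments above;
    # they must be strictly increasing: sigma_1 < sigma_2 < ... < sigma_N.
    SIGMAS_BY_OBJECT = {
        "mass":          [3, 7, 11, 15],
        "calcification": [3, 5, 7, 9],
    }

    def sigmas_for(object_type):
        sigmas = SIGMAS_BY_OBJECT[object_type]
        assert all(a < b for a, b in zip(sigmas, sigmas[1:])), "sigmas must increase"
        return sigmas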
In step S340, the blurriness value of the plurality of voxels is used to execute a plurality of first decision operators to generate a plurality of first decision results, and one or more of the plurality of first decision results are used to execute a plurality of second decision operators to generate a plurality of second decision results.
Reference is made incorporating with
The decision module 118 receives the feature data F. In some embodiments, the feature data F makes use of the blurriness value which is outputted by the blurriness value computation operator 116. In some other embodiments, the feature data F makes use of a combination of the blurriness value and other features.
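Continuing the sketch, the feature data F could be assembled per voxel from the blurriness value alone, or from the blurriness value combined with other features; the additional intensity feature shown here is purely an assumed example:

    # Per-voxel feature data F: the blurriness value alone ...
    F = B.reshape(-1, 1)

    # ... or the blurriness value combined with another feature (assumed example:
    # the original voxel intensity from I0).
    F_combined = np.column_stack([B.reshape(-1), I0.reshape(-1)])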
As shown in
Reference is made incorporating with
The decision module in the disclosure, which is different from the conventional method that uses a single decision strategy, applies a plurality of decision operators, such as regression, classification, and voting processes. The plurality of first decision results generated by the plurality of decision operators can be regression values, classification results, voting results, and so on. The first decision results are then processed by one or more decision operators, and one or more second decision results are generated. That is, each decision operation is performed based on the previous decision results. The final decision operator generates the detection result of the object. Therefore, in the object detection procedure, the processor 110 detects the object in a detecting 3D mammogram, and the object can be the mass or the calcification. The term "a detecting 3D mammogram" refers to an image to be examined to determine whether the object exists therein.
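One way to picture such a cascaded, multi-operator decision module is sketched below, continuing the earlier sketch. The specific estimators, the stacking of their outputs, the placeholder training labels, and the final voting threshold are illustrative assumptions (scikit-learn is used only as an example library), not the operators recited in the disclosure:

    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import RandomForestClassifier

    # Placeholder training labels on a random subsample of voxels (real labels
    # would come from annotated mammograms).
    rng = np.random.default_rng(0)
    idx = rng.choice(F.shape[0], size=20000, replace=False)
    labels = rng.integers(0, 2, size=idx.size)

    # First decision operators: several independent operators applied to the feature data F.
    first_ops = [LogisticRegression(max_iter=200),
                 DecisionTreeClassifier(max_depth=5),
                 RandomForestClassifier(n_estimators=20)]
    first_results = []
    for op in first_ops:
        op.fit(F[idx], labels)
        first_results.append(op.predict_proba(F)[:, 1])

    # Second decision operators: operate on one or more of the first decision results.
    stacked = np.column_stack(first_results)
    second_op = LogisticRegression(max_iter=200)
    second_op.fit(stacked[idx], labels)
    second_results = second_op.predict_proba(stacked)[:, 1]

    # Final decision operator: combine the first and second decision results
    # (here, a simple average with a threshold) to mark object voxels.
    combined = np.column_stack(first_results + [second_results])
    object_mask = (combined.mean(axis=1) > 0.5).reshape(B.shape)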
In some embodiments, when the processor 110 detects the object in the detecting 3D mammogram, the range of the object will be generated. Reference is made to
Reference is made to
The range 514 denoted as the detection object of
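As one illustrative way to derive such a range from a voxel-level detection result (continuing the earlier sketch), the bounding coordinates of the largest connected component of the detected voxels could be taken as the tagged range; the connected-component approach is an assumption, not the claimed method:

    from scipy.ndimage import label, find_objects

    # Label the connected components of the detected voxel mask and take the
    # bounding box of the largest component as the tagged range of the object.
    labeled, n = label(object_mask)
    if n > 0:
        sizes = [(labeled == i).sum() for i in range(1, n + 1)]
        largest = int(np.argmax(sizes)) + 1
        zs, ys, xs = find_objects(labeled)[largest - 1]
        print("object range (z, y, x):",
              (zs.start, zs.stop), (ys.start, ys.stop), (xs.start, xs.stop))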
In some embodiments, a non-transitory computer-readable storage medium that stores executable code including instructions is provided. The executable code is loaded into the processor 110 of
Accordingly, the object detection method, the object detection system suitable for the 3D mammogram, and the non-transitory computer-readable storage medium compute the blurriness value of the voxels of the 3D mammogram to determine whether each voxel is a part of the object, and thereby detect the object in the 3D mammogram. Because of the imaging characteristics of the 3D mammogram, the computed blurriness value can be regarded as a feature of the imaging focus of the object, and this feature is equivalent to the depth information of the 3D image. Therefore, the blurriness value of the voxel can be used to detect the object in the 3D mammogram. For example, the imaging of the mass is blurry, and the range of the mass in the 3D mammogram includes voxels which have large blurriness values; the imaging of the calcification is sharp, and the range of the calcification in the 3D mammogram includes voxels which have small blurriness values. Furthermore, because the decision module in the disclosure combines the plurality of decision operators, the problem of the over-fitting or under-fitting model for determining the voxels of the 3D mammogram can be alleviated.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the present disclosure without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the present disclosure cover modifications and variations of this disclosure provided they fall within the scope of the following claims.