Embodiments of the disclosure relate to the field of breast skinline detection.
Breast cancer is a type of malignancy occurring in both men and women. Existing diagnostic imaging techniques for breast lesion detection and diagnosis include, but are not limited to, ultrasound imaging, magnetic resonance imaging, computerized tomography scans, and x-ray mammography. Often, x-ray mammography is used in screening a breast for early-stage detection and diagnosis of breast lesions. Examples of x-ray mammography techniques include film-based x-ray mammography, digital breast tomography, and full-field digital mammography.
It is noted that, while diagnosing breast lesions, thickening of the skin and skin retractions are indications of malignancy. It is also noted that micro-calcifications found on, or immediately below, a breast skinline are considered benign. In one example, the breast skinline can be defined as a demarcation line that separates a breast region from a background region. Accurate knowledge of the breast skinline and of the position of abnormalities relative to the breast skinline is needed for diagnosing breast lesions. Often, the position of the abnormalities is reported relative to the breast skinline. A mammography technician, upon finding a suspicious lesion in one view, must locate the suspicious lesion in another view at the same distance from the breast skinline. Further, the mammography technician has to ensure that equal amounts of tissue, between the breast skinline and the chest wall, are visualized in all views taken. The breast skinline and the relative position of the nipple act as a registration aid and a marker for detecting and reporting abnormalities in the breast region. In existing x-ray mammography techniques, visualization of the breast skinline is difficult and error prone. Also, detection of the breast skinline requires human intervention. In one example, inaccurate detection of the breast skinline can cause failure to diagnose breast lesions. In another example, inaccurate detection of the breast skinline can cause certain cancerous regions of the breast to be overlooked.
An example of a method for determining skinline in a digital mammogram image includes smoothening the digital mammogram image to yield a smoothened image. The method includes determining gradient in the digital mammogram image to yield a gradient map. Further, the method includes extracting breast region from the digital mammogram image based on the smoothened image and the gradient map using fuzzy rule based pixel classification to yield a binary image. The method includes filtering the binary image to remove noise and to yield a filtered image. The method includes extracting boundary of the breast region in the filtered image. The method includes detecting the skinline based on the boundary of the breast region.
An example of a method for determining skinline in a digital mammogram image by an image processing unit includes smoothening the digital mammogram image to yield a smoothened image. The method includes determining gradient in the digital mammogram image to yield a gradient map. The method includes extracting breast region from the digital mammogram image based on the smoothened image and the gradient map using fuzzy rule based pixel classification to yield a binary image. The method includes filtering the binary image to remove noise and to yield a filtered image. The method includes extracting boundary of the breast region in the filtered image. The method includes filtering the digital mammogram image based on a homomorphic filtering technique to yield a homomorphic filtered image. The method also includes detecting the skinline based on the smoothened image, the gradient map, and the homomorphic filtered image.
An example of an image processing unit (IPU) for determining skinline in a digital mammogram image includes an image acquisition unit that electronically receives the digital mammogram image. The IPU includes a digital signal processor (DSP) responsive to the digital mammogram image to de-noise the digital mammogram image, smoothen the digital mammogram image to yield a smoothened image, determine gradient in the digital mammogram image to yield a gradient map, extract breast region from the digital mammogram image based on the smoothened image and the gradient map using fuzzy rule based pixel classification to yield a binary image, filter the binary image to remove noise and to yield a filtered image, extract boundary of the breast region in the filtered image, filter the digital mammogram image based on a homomorphic filtering technique to yield a homomorphic filtered image, and detect the skinline based on at least one of the smoothened image, the gradient map, and the homomorphic filtered image.
In the accompanying figures, similar reference numerals may refer to identical or functionally similar elements. These reference numerals are used in the detailed description to illustrate various embodiments and to explain various aspects and advantages of the disclosure.
Various embodiments discussed in the disclosure pertain to determining a breast skinline in a digital x-ray mammogram. The breast skinline, hereinafter referred to as the skinline, can be defined as a demarcation line that separates a breast region from a background region. In one example, the background region includes a region outside the body. Accurate determination of the skinline is required to detect and diagnose breast lesions.
An environment 100 for determining the skinline is shown in
The determining of skinline is explained in conjunction with
Referring to
At step 205, a digital mammogram image is received. The digital mammogram image can be received from an image source or an image detector, for example the x-ray detector 115. The digital mammogram image, hereinafter referred to as the image, can be an uncompressed 8-, 10-, 12- or 14-bit grayscale image.
At step 210, the image is de-noised. De-noising the image includes removing speckle noise and salt-pepper noise from the image. The speckle noise can be defined as granular noise that exists in the image as a result of random fluctuations in the return signal from an object that is no larger than a pixel. The salt-pepper noise can be defined as randomly occurring white and black pixels in the image caused by quick transients, such as faulty switching, while capturing the image.
In some embodiments, the de-noising includes removing the speckle noise and the salt-pepper noise using a median filter.
The median filter can be referred to as a non-linear digital filtering technique and can be used to prevent edge blurring. A median of the neighboring pixel values can be calculated. The median can be calculated by repeating the following steps for each pixel in the image:
a) Storing the neighboring pixels in an array. The neighboring pixels can be selected based on a shape, for example a box or a cross. The array can be referred to as a window and is odd sized.
b) Sorting the window in numerical order.
c) Selecting the median from the window as the pixel value.
In one example, the median filter can be a 3×3 median filter.
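For illustration only, the median filtering step can be sketched in Python as follows; the use of scipy.ndimage and the default 3×3 window are assumptions, since the disclosure does not prescribe a particular implementation.

import numpy as np
from scipy.ndimage import median_filter

def denoise(image: np.ndarray, window: int = 3) -> np.ndarray:
    # Remove speckle and salt-pepper noise: each output pixel is the median
    # of an odd-sized window around it, which suppresses isolated bright and
    # dark pixels while preserving edges.
    assert window % 2 == 1, "the median window is odd sized"
    return median_filter(image, size=window)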
At step 215, the image is smoothened to yield a smoothened image. In one example, smoothening includes convolving the image with a finite sized averaging mask, for example an N×N averaging mask. The convolution can be defined as a mathematical operation that involves selecting a window of a finite size and shape, for example an N×N window, and scanning the window across the image to output a pixel value that is a weighted sum of the input pixels within the window. The window can be considered as a filter that filters the image to smoothen or sharpen the image. Each pixel of the smoothened image represents the average gray level value of the pixels surrounding the corresponding pixel.
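A minimal sketch of the smoothening step is given below; the uniform (box) averaging mask and the illustrative window size of 15 pixels are assumptions.

import numpy as np
from scipy.ndimage import uniform_filter

def smoothen(image: np.ndarray, n: int = 15) -> np.ndarray:
    # Convolve with an N x N averaging mask: each output pixel is the mean
    # gray level of the N x N window centered on the corresponding input pixel.
    return uniform_filter(image.astype(np.float32), size=n)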
At step 220, the gradient in the image is determined to yield a gradient map. The gradient in the image, hereinafter referred to as the image gradient, can be determined using a gradient detection technique, for example a Sobel operator. The Sobel operator can be used to compute an approximate value of the image gradient. The gradient map represents the value of the gray level gradient at each pixel location. In one example, the image gradient represents the magnitude and direction of change in gray level values.
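The gradient map can be sketched, for example, with the Sobel operator as follows; combining the horizontal and vertical responses into a gradient magnitude is a common choice and is assumed here.

import numpy as np
from scipy.ndimage import sobel

def gradient_map(image: np.ndarray) -> np.ndarray:
    # Approximate the gray level gradient with the Sobel operator and return
    # its magnitude at each pixel location.
    img = image.astype(np.float32)
    gx = sobel(img, axis=1)  # response to horizontal changes
    gy = sobel(img, axis=0)  # response to vertical changes
    return np.hypot(gx, gy)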
At step 225, the image is filtered based on a homomorphic filtering technique to yield a homomorphic filtered image. The homomorphic filtering technique includes mapping the spatial domain representation of the image to another domain, for example a frequency domain, and performing filtering in the frequency domain. The homomorphic filtering technique enhances the contrast of the image. The homomorphic filtering technique is further explained in conjunction with
At step 230, the breast region is extracted from the image based on the smoothened image and the gradient map using fuzzy rule based pixel classification to yield a binary image. The binary image can be defined as an image whose pixel values are represented by binary values.
The fuzzy rule based pixel classification includes checking a rule base. The rule base is based on the average gray level value and the image gradient, and is used to determine pixels representing the breast region and pixels representing the background region.
The checking of the rule base includes receiving the smoothened image and the gradient map. The fuzzy rule based pixel classification makes use of linguistic variable graphs to demarcate the breast region from the background region. The linguistic variable graphs are predefined based on experimentation. A first linguistic variable (A) graph corresponds to the average gray level value and the related certainty of its being LOW or HIGH, and a second linguistic variable (G) graph corresponds to the image gradient and the related certainty of its being LOW or HIGH. For a first pixel, the certainty of the first pixel having a LOW value or a HIGH value in the first linguistic graph is determined. Similarly, the certainty for other pixels in the first linguistic graph is determined. Likewise, the certainty of the first pixel and other pixels having a LOW value or a HIGH value in the second linguistic graph is determined. Based on the LOW and HIGH certainties in the graphs, each pixel of the image is classified as belonging to the background region (Bg) or the breast region (Br) using the following rules:
If the average gray level value (A) of the pixel is LOW “AND” the gradient value (G) of the pixel is LOW, then the pixel belongs to the background region (Bg). The “AND” operator represents the minimum of the two values.
If the average gray level value (A) of the pixel is LOW “AND” the gradient value (G) of the pixel is HIGH, or if the average gray level value (A) is HIGH, then the pixel belongs to the breast region (Br).
The first linguistic graph and the second linguistic graph are further explained in conjunction with
At step 235, the binary image is filtered to remove noise. The binary image can be filtered using morphological filtering techniques, for example morphological opening-closing with a binary mask and a connected component labeling technique to yield a filtered image. In one example, the morphological opening-closing with a binary mask of radius N pixels can be defined as a technique to fill holes in the breast region and the background region. In another example, the connected component labeling technique can be defined as a technique to detect and connect regions filled with holes in the image.
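One way to sketch this filtering step is given below; the disk radius and the choice to keep the largest connected component are illustrative assumptions, since the disclosure only names opening-closing with a binary mask and connected component labeling.

import numpy as np
from scipy.ndimage import binary_opening, binary_closing, label

def clean_binary_mask(binary: np.ndarray, radius: int = 5) -> np.ndarray:
    # Build a disk-shaped binary mask of the given radius (example value).
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    disk = x * x + y * y <= radius * radius
    # Opening-closing fills small holes and removes small specks.
    mask = binary_closing(binary_opening(binary.astype(bool), structure=disk), structure=disk)
    # Connected component labeling; here the largest component is retained.
    labels, count = label(mask)
    if count == 0:
        return mask
    sizes = np.bincount(labels.ravel())
    sizes[0] = 0  # ignore the background label
    return labels == np.argmax(sizes)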
At step 240, boundary of the breast region is extracted. In one example, the boundary of the breast region is extracted using morphological boundary extraction techniques.
In one embodiment, the morphological boundary extraction technique can be performed using two steps, for example an erosion step followed by a subtraction step. In another embodiment, the morphological boundary extraction technique can be performed using a dilation step followed by a subtraction step. Erosion, dilation, and subtraction are morphological operations. In a morphological operation, the value of each pixel in an output image is based on a comparison of the corresponding pixel in an input image with its neighboring pixels. By choosing the size and shape of the neighborhood, an appropriate morphological operation can be performed that is sensitive to specific shapes in the input image. In one example, the morphological operation of dilation adds pixels to object boundaries, while the morphological operation of erosion removes pixels on object boundaries. In another example, the morphological operation of subtraction takes two images as input and produces as output a third image whose pixel values are those of a first image minus the corresponding pixel values of a second image.
In yet another embodiment, the morphological boundary extraction technique can include a single step of erosion, dilation, or subtraction. The boundary extracted using the morphological boundary extraction technique is an approximate boundary of the breast region and is further processed to determine the accurate boundary of the breast region. The morphological boundary extraction technique is further explained in conjunction with
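A sketch of the two-step boundary extraction (erosion or dilation followed by subtraction) on the binary breast mask is given below; scipy.ndimage and the default structuring element are assumptions.

import numpy as np
from scipy.ndimage import binary_erosion, binary_dilation

def approximate_boundary(mask: np.ndarray, use_erosion: bool = True) -> np.ndarray:
    # Subtraction of binary images is expressed here as AND-with-NOT, leaving
    # a thin outline of the breast region.
    mask = mask.astype(bool)
    if use_erosion:
        return mask & ~binary_erosion(mask)   # erosion followed by subtraction
    return binary_dilation(mask) & ~mask      # dilation followed by subtraction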
At step 245, the skinline is detected based on the extracted boundary of the breast region. The skinline is detected based on an active contour technique. The active contour technique uses the smoothened image, the gradient map, and the homomorphic filtered image as inputs to determine the skinline. The active contour technique is an energy minimizing technique that is used to detect image contours, for example lines and edges in the image. In one example, the active contour technique uses a greedy snake algorithm to detect the image contours. The greedy snake algorithm tracks the image contours and matches them to determine the accurate boundary of the breast region, thereby determining an accurate skinline. The active contour technique at any instant of time tries to minimize an energy function and hence is termed an active technique. Further, the image contours slither while minimizing the energy function and hence the contours are termed snakes. The active contour technique is further described in “Snakes: Active contour models,” Kass, M., Witkin, A., and Terzopoulos, D., International Journal of Computer Vision, pp. 321-331, 1988, which is incorporated herein by reference in its entirety.
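The disclosure does not give the snake's exact energy terms or parameters; the following greedy snake sketch uses a common formulation (continuity, curvature, and an external energy image, for example the negative normalized gradient magnitude), and all weights, the search window size, and the iteration count are illustrative assumptions.

import numpy as np

def greedy_snake(points, ext_energy, alpha=1.0, beta=1.0, gamma=1.2,
                 iterations=100, win=1):
    # points: (N, 2) array of (row, col) snake points, e.g. sampled from the
    # approximate boundary. ext_energy: image whose low values attract the
    # snake (e.g. negative, normalized gradient magnitude).
    pts = points.astype(float).copy()
    h, w = ext_energy.shape
    for _ in range(iterations):
        # Average spacing between consecutive points of the closed contour.
        diffs = np.diff(pts, axis=0, append=pts[:1])
        mean_dist = np.mean(np.linalg.norm(diffs, axis=1))
        moved = False
        for i in range(len(pts)):
            prev_pt, next_pt = pts[i - 1], pts[(i + 1) % len(pts)]
            best, best_e = pts[i].copy(), np.inf
            for dr in range(-win, win + 1):
                for dc in range(-win, win + 1):
                    cand = pts[i] + (dr, dc)
                    r, c = int(round(cand[0])), int(round(cand[1]))
                    if not (0 <= r < h and 0 <= c < w):
                        continue
                    cont = abs(mean_dist - np.linalg.norm(cand - prev_pt))
                    curv = float(np.sum((prev_pt - 2 * cand + next_pt) ** 2))
                    e = alpha * cont + beta * curv + gamma * ext_energy[r, c]
                    if e < best_e:
                        best_e, best = e, cand
            if not np.array_equal(best, pts[i]):
                pts[i], moved = best, True
        if not moved:
            break  # the energy can no longer be reduced locally
    return pts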
After the skinline is detected, the image can be classified into the breast region and the background region.
At step 250, the skinline can be marked, and the image with the marked skinline and breast map can be further processed for breast lesion detection and diagnosis.
It is noted that one or more of these steps can be performed in parallel, for example step 225 can be performed in parallel with step 215 or step 220.
Referring to
At step 252, a digital mammogram image is received. The digital mammogram image, hereinafter referred to as the image can be received from an x-ray detector, for example the x-ray detector 115.
At step 254, the image is de-noised to remove speckle noise and salt-pepper noise.
At step 256, an approximate skinline is extracted. The approximate skinline can be extracted using morphological boundary extraction techniques.
At step 258, contrast of the image is enhanced. It is noted that step 258 can be performed in parallel with step 256.
At step 260, an accurate skinline is detected. The accurate skinline can be detected using an active contour technique.
At step 262, a marked breast skinline and a breast map are generated. The breast map can be defined as a map constituting features of the breast, including details of suspicious lesions. In some embodiments, the breast map can also be referred to as a breast mask. The skinline can be marked, and the image with the marked skinline and the breast map can be further processed for breast lesion detection and diagnosis. The breast lesion detection and diagnosis using the marked skinline is further explained in
Referring to
At step 264, a digital mammogram image is received.
At step 266, skinline is detected in the digital mammogram image. Detection of the skinline in the digital mammogram image is performed based on the following steps. The digital mammogram image is first de-noised. The digital mammogram image is then smoothened to yield a smoothened image. Further, gradient in the digital mammogram image is determined to yield a gradient map. The digital mammogram image is filtered based on a homomorphic filtering technique to yield a homomorphic filtered image. The breast region is extracted from the digital mammogram image based on the smoothened image and the gradient map using fuzzy rule based pixel classification to yield a binary image. The binary image is filtered to remove noise and to yield a filtered image. The binary image can be filtered using morphological filtering techniques. Further, boundary of the breast region is extracted. In one example, the boundary of the breast region is extracted using morphological boundary extraction techniques. The skinline is then detected using an active contour technique.
At step 268, a breast mask is generated. The breast mask includes a marked skinline. The breast mask is further used to define regions of interest for breast lesion detection and diagnosis by image analysis, and for region of interest (ROI) based compression of the digital mammogram image.
At step 270, the regions of interest defined by the breast mask are further processed for breast lesion detection and diagnosis. The image is analyzed and region of interest based compression algorithms are implemented. Further, the analyzed image is used for breast lesion detection and diagnosis.
At step 272, an abnormality marked image is generated. The abnormality marked image includes regions of the breast where suspected lesions have been found.
The breast 110 is placed between an x-ray source 105 and a detector 115. In one example, the x-ray source 105 can be a linear accelerator that generates x-rays by accelerating electrons. In one example, the detector 115 can be an x-ray detector and can detect x-rays. Examples of the detector 115 include, but are not limited to, a photographic plate, a Geiger counter, a scintillator, and a semiconductor detector. The image of the breast 110 is captured by the detector 115. In one embodiment, an imaging setup 370 is required to position the x-ray source 105 and the detector 115.
An image acquisition module 325 electronically receives the image of the breast 110 from an image detector, for example the detector 115. In one example, the image acquisition module 325 can be a video processing subsystem (VPSS). The IPU 305 includes a digital signal processor (DSP) 310, coupled to the communication bus 330, that receives the image of the breast 110 and processes the image. The IPU 305 includes a micro-processor unit (MPU) 315 and a graphics processing unit (GPU) 320 that process the image in conjunction with the DSP 310. The GPU 320 can process image graphics. The MPU 315 controls the operation of components in the IPU 305 and includes instructions to perform processing of the image on the DSP 310.
The storage device 350 and the display 355 can be used for outputting the result of processing. In some embodiments, the DSP 310 also processes a skinline detected breast image, which is used for breast lesion detection and diagnosis. The DSP 310 also generates the abnormality marked image, which can then be displayed, transmitted or stored, and observed. The abnormality marked image is displayed on the display 355 using a display controller 345.
The logarithmic unit 405 receives an input x-ray image that can be represented as a function f(x, y). The input x-ray image f(x, y) can be expressed as a product of incident radiation (i(x, y)) and attenuation offered by tissue along different paths taken by the x-ray through the tissue (t(x, y)) as given below:
f(x, y)=i(x, y)×t(x, y)
Output of the logarithmic unit 405 can be expressed as g(x, y) and can be calculated as given below:
g(x, y)=ln f(x, y)
g(x, y)=ln i(x, y)+ln t(x, y)
The DFT unit 410 receives the output g(x, y) and computes the Fourier transform of g(x, y). In one example, the Fourier transform can be defined as a mathematical operation that transforms a signal in the spatial domain to a signal in the frequency domain. The Fourier transform of g(x, y) can be calculated as given below:
F{g(x, y)}=F{ln i(x, y)}+F{ln t(x, y)}
Or
G(u, v)=I(u, v)+T(u, v)
Where I(u, v) is the Fourier transform of ln i(x, y) and T(u, v) is the Fourier transform of ln t(x, y).
The homomorphic filtering unit 415 applies a filter represented by response function H(u, v) on G(u, v) to output S(u, v). The output S(u, v) can be calculated as given below:
S(u, v)=H(u, v)·G(u, v)
S(u, v)=H(u, v)·I(u, v)+H(u, v)·T(u, v)
The IDFT unit 420 calculates the inverse Fourier transform of S(u, v) to output S(x, y). The output S(x, y) is in the spatial domain and can be calculated as given below:
F⁻¹{S(u, v)}=S(x, y)=i′(x, y)+t′(x, y)
The exponential unit 425 calculates the exponential of S(x, y) to output S′(x, y). The output S′(x, y) gives an enhanced image and can be calculated as given below:
exp(S(x, y))=exp[i′(x, y)]×exp[t′(x, y)]
S′(x, y)=i″(x, y)×t″(x, y)
Now, i″(x, y) and t″(x, y) are the illumination and attenuation components of the enhanced image. An illumination component tends to vary gradually across the image. An attenuation component tends to vary rapidly across the image. It is noted that there is a step change at the skinline-air interface in the enhanced image. Therefore, by applying a frequency domain filter like the homomorphic filtering unit 415 having a frequency response as shown in
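A minimal sketch of the logarithm, DFT, filter, inverse DFT, and exponential chain described above is given below; the Gaussian-shaped high-frequency-emphasis response H(u, v) and its gamma/cutoff values are assumptions, since the disclosure specifies the chain but not the exact frequency response.

import numpy as np

def homomorphic_filter(image, gamma_low=0.5, gamma_high=2.0, cutoff=30.0):
    f = image.astype(np.float64) + 1.0              # offset avoids log(0)
    g = np.log(f)                                   # g = ln i + ln t
    G = np.fft.fftshift(np.fft.fft2(g))             # G(u, v), DC at the center
    rows, cols = g.shape
    u = np.arange(rows) - rows / 2.0
    v = np.arange(cols) - cols / 2.0
    d2 = u[:, None] ** 2 + v[None, :] ** 2          # squared distance from DC
    # High-frequency emphasis: attenuates the slowly varying illumination
    # component and boosts the rapidly varying attenuation component.
    H = (gamma_high - gamma_low) * (1.0 - np.exp(-d2 / (2.0 * cutoff ** 2))) + gamma_low
    S = H * G                                       # S(u, v) = H(u, v) . G(u, v)
    s = np.real(np.fft.ifft2(np.fft.ifftshift(S)))  # back to the spatial domain
    return np.exp(s) - 1.0                          # enhanced image S'(x, y)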
In some embodiments, A can have a value between the thresholds A1 and A2. G can also have a value between the thresholds G1 and G2.
In one example, let A1=1 and A2=2
If A=0.7, then A<A1 and is considered LOW with 100 percent certainty
If A=2.7, then A>A2 and is considered HIGH with 100 percent certainty
If A=1.3, then A is between A1 and A2. A has 0.7 certainty of being LOW or in other words 0.3 certainty of being HIGH.
In another example, let G1=2 and G2=3
If G=0.7, then G<G1 and is considered LOW with 100 percent certainty
If G=3.7, then G>G2 and is considered HIGH with 100 percent certainty
If G=2.3, then G is between G1 and G2. G has 0.7 certainty of being LOW or in other words 0.3 certainty of being HIGH.
In yet another example, let G1=2 and G2=3
If G=0.7, then G<G1 and is considered LOW with 100 percent certainty
If G=3.7, then G>G2 and is considered HIGH with 100 percent certainty
If G=2.7, then G is between G1 and G2. G has 0.3 certainty of being LOW or, in other words, 0.7 certainty of being HIGH.
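The certainties in the above examples are consistent with a piecewise-linear membership between the two thresholds, which can be sketched as follows; the linear interpolation itself is an assumption inferred from the worked numbers.

def low_certainty(value, t1, t2):
    # Certainty that `value` is LOW for thresholds t1 < t2: 1 below t1,
    # 0 above t2, decreasing linearly in between; the HIGH certainty is 1 - LOW.
    if value <= t1:
        return 1.0
    if value >= t2:
        return 0.0
    return (t2 - value) / (t2 - t1)

# Worked examples from the text: with A1=1, A2=2, low_certainty(1.3, 1, 2) is about 0.7;
# with G1=2, G2=3, low_certainty(2.3, 2, 3) is about 0.7 and low_certainty(2.7, 2, 3) is about 0.3.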
A rule base can be created by defining a pixel as a pixel representing the background region if the average gray level value of the pixel is a first predefined value (LOW) and the gradient value of the pixel is the first predefined value (LOW). It is noted that the background region is a low intensity homogeneous region and hence the average gray level value of the pixel is LOW and the gradient value of the pixel is LOW. The pixels representing the background region can be defined based on the following rule:
If the average gray level value (A) of the pixel is LOW “AND” the gradient value (G) of the pixel is LOW, then the pixel belongs to the background region (Bg). The “AND” operator represents the minimum of the two values.
The rule base can be created by defining the pixel as a pixel representing the breast region if the average gray level value of the pixel is the first predefined value (LOW) and the gradient value of the pixel is a second predefined value (HIGH), or by defining the pixel as a pixel representing the breast region if the average gray level value of the pixel is the second predefined value (HIGH). It is noted that the breast region is a high intensity non-homogeneous region and hence the average gray level value of the pixel is HIGH and the gradient value of the pixel is HIGH. The pixels representing the breast region can be defined based on the following rule:
If the average gray level value (A) of the pixel is LOW “AND” the gradient value (G) of the pixel is HIGH, or if the average gray level value (A) is HIGH, then the pixel belongs to the breast region (Br).
The rule base can be further explained with the following examples:
If A is 0.7 (LOW) and G is 0.3 (HIGH), then the pixel value is the minimum of 0.7 and 0.3, that is, 0.3 (HIGH). Hence, the pixel belongs to the breast region.
If A is 0.7 (LOW) and G is 0.6 (LOW), then the pixel value is the minimum of 0.7 and 0.6, that is, 0.6 (LOW). Hence, the pixel belongs to the background region.
If A is 0.3 (HIGH), then the pixel belongs to the breast region.
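The two rules can be sketched as firing strengths, with “AND” taken as the minimum and “or” as the maximum; the final step that turns these strengths into a crisp breast/background label follows the worked examples above and is not reproduced here, since the disclosure does not spell out a defuzzification rule.

def rule_strengths(a_low, g_low):
    # a_low / g_low: certainties that the average gray level (A) and the
    # gradient (G) are LOW; the HIGH certainties are their complements.
    a_high, g_high = 1.0 - a_low, 1.0 - g_low
    bg = min(a_low, g_low)                 # A LOW AND G LOW -> background (Bg)
    br = max(min(a_low, g_high), a_high)   # (A LOW AND G HIGH) or A HIGH -> breast (Br)
    return bg, br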
b(i, j)=p(i, j)⊕(∧q, ∀q∈N4(p(i, j)))
Where ⊕ represents a logical exclusive OR operation, ∧ represents a logical AND operation, and N4(•) represents a 4-neighbourhood around the pixel in the argument.
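A direct sketch of the above expression on a binary mask p is given below; treating pixels outside the image as background (zero padding) is an assumption.

import numpy as np

def boundary_from_formula(p: np.ndarray) -> np.ndarray:
    # b(i, j) = p(i, j) XOR (AND over the 4-neighbourhood of p(i, j)).
    p = p.astype(bool)
    padded = np.pad(p, 1, mode="constant", constant_values=False)
    up = padded[:-2, 1:-1]
    down = padded[2:, 1:-1]
    left = padded[1:-1, :-2]
    right = padded[1:-1, 2:]
    neigh_and = up & down & left & right   # logical AND of the four neighbours
    return p ^ neigh_and                   # exclusive OR with the centre pixel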
Referring to
Referring to
The breast region 810 is extracted based on the smoothened image 1105 and the gradient map 1205 using a fuzzy rule based pixel classification to yield a binary image 1405. The binary image 1405 is shown in
The skinline 815 that is detected using the techniques in the disclosure is accurate and easy to visualize. The skinline 815 can act as a registration aid in comparing images of the left and right breasts or in comparing views of the same breast taken at different times. Further, the skinline 815 can be used to define a region of interest for abnormality detection and image compression. The detected skinline 815 can reduce computational requirements of subsequent image analysis stages for breast lesion detection and diagnosis.
In the foregoing discussion, the terms “coupled” or “connected” refer to either a direct electrical or mechanical connection between the devices connected, or an indirect connection through intermediary devices.
The foregoing description sets forth numerous specific details to convey a thorough understanding of embodiments of the disclosure. However, it will be apparent to one skilled in the art that embodiments of the disclosure may be practiced without these specific details. Some well-known features are not described in detail in order to avoid obscuring the disclosure. Other variations and embodiments are possible in light of above teachings, and it is thus intended that the scope of disclosure not be limited by this Detailed Description, but only by the Claims.