The present invention generally relates to medical image processing and more particularly to detecting candidate anatomical abnormalities in medical images.
The field of medical imaging has seen significant advances since X-rays were first used to determine anatomical abnormalities. Medical imaging hardware has progressed in the form of newer machines such as Magnetic Resonance Imaging (MRI) scanners, Computed Axial Tomography (CAT) scanners, etc. Because of the large amount of image data generated by such modern medical scanners, there is a need for image processing techniques that automatically determine the presence of anatomical abnormalities in scanned medical images.
Recognizing anatomical structures within digitized medical images presents multiple challenges. The first concern is the accuracy of recognition. Another concern is the speed of recognition. Because medical images are an aid for a doctor in diagnosing a disease or condition, the speed of recognition is of utmost importance in helping the doctor reach an early diagnosis. Hence, there is a need for recognition techniques that provide accurate and fast recognition of anatomical structures in medical images.
Digital medical images are constructed using raw image data obtained from a scanner, for example, a CAT scanner, an MRI scanner, etc. Digital medical images are typically either 2-D images made of pixel elements or 3-D images made of volume elements (“voxels”). Such 2-D or 3-D images are processed using medical image recognition techniques to determine the presence of anatomical structures such as cysts, tumors, polyps, etc. However, given the amount of image data generated by any given image scan, it is preferable that an automatic technique point out anatomical features in selected regions of an image to a doctor for further diagnosis of any disease or condition.
Feature-based recognition techniques are used to determine the presence of anatomical structures in medical images. However, feature-based recognition techniques suffer from accuracy problems. Hence, there is a need for non-feature-based recognition techniques that provide improved recognition of anatomical features in medical images.
In one exemplary embodiment of the invention, a method for determining the presence of predetermined objects within a digital image is provided. The method comprises computing a gradient field of the digital image and applying at least one predetermined mask to the gradient field. Further, the method involves generating a response image from the application of the mask to the gradient field. The method then determines the presence of at least one predetermined object from the response image.
In another exemplary embodiment of the invention, a method is provided where the gradient field has an X-component, a Y-component and a Z-component corresponding to the X, Y and Z axes of the digital image, and the mask includes an X-filter, a Y-filter and a Z-filter. A vector convolution is generated by applying the X-filter to the X-component, the Y-filter to the Y-component and the Z-filter to the Z-component to generate three response image components. These three response image components are added to generate the response image.
In yet another exemplary embodiment of the invention, a method detects spherical and semi-spherical objects in the digital image. In another aspect of the invention, a method detects elliptical and semi-elliptical objects in the digital image.
Exemplary embodiments of the invention are described with reference to the accompanying drawings, of which:
At step 20, surface normals are calculated for the sub-volume showing the colon. The surface normals are represented in image form as X, Y and Z slices 22, 24 and 26. The gradient fields are computed in step 20, and the gradient field calculation is performed on the complete volume. The strongest gradient field magnitudes occur in the transition areas between different tissues and organs. The transition is captured by the imaging modality (e.g., a CT, MRI or PET scan) as an intensity change. In the present example of a colon, the intensity change appears as a transition between the tissue and the lumen. Multiple techniques exist to calculate the gradient fields, and any one of them can be used for this purpose.
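The gradient-field and surface-normal computation described above can be sketched in code. This is a minimal illustration only; Python with NumPy is an assumption (the source specifies no implementation), and the function names are hypothetical.

```python
import numpy as np

def gradient_field(volume):
    """Gradient field of a 3-D volume via central differences.

    The strongest magnitudes occur at transitions between tissues
    (e.g., the colon wall / lumen boundary).
    """
    fx, fy, fz = np.gradient(volume.astype(np.float64))
    return fx, fy, fz

def surface_normals(volume, eps=1e-12):
    """Surface normals: the gradient field scaled to unit length."""
    fx, fy, fz = gradient_field(volume)
    mag = np.sqrt(fx ** 2 + fy ** 2 + fz ** 2) + eps  # eps avoids 0/0
    return fx / mag, fy / mag, fz / mag
```

Any other gradient estimator (e.g., Sobel-type derivative kernels) could be substituted, consistent with the statement that multiple techniques exist for this purpose.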
Thereafter, masks of varying sizes are applied to the surface normalized image. The number of masks can be one or more, and the types of masks to be used depend on the candidate anatomical feature being searched for. In the present illustration, to determine lesions in the colon, spherical and semi-spherical candidates are to be searched for in the input image. Therefore, the masks for such an application are designed to detect spherical or semi-spherical objects within the input image.
The gradient field of an object that is spherical or semi-spherical diverges relative to the centroid of the object. When this gradient field is convolved with the diverging vector field of a mask of a given size, a response volume is obtained. This response image can be used to characterize spherical or semi-spherical objects.
Steps 28 and 40 form a loop that iterates over all mask sizes. In each iteration, a single mask is applied to the surface normalized image to compute raw DGFR responses at step 30. The raw DGFR response image is shown as image slices 22, 24 and 26, representing X, Y and Z slices, respectively. The raw DGFR response is calculated by a convolution of the surface normals with filters, a process that is described below in detail.
The diverging gradient field response is computed using the formula:

DGFRx(x,y,z) = Σ(i,j,k)∈Ω³ Vx(i,j,k)·fx(x−i, y−j, z−k)

DGFRy(x,y,z) = Σ(i,j,k)∈Ω³ Vy(i,j,k)·fy(x−i, y−j, z−k)    (I)

DGFRz(x,y,z) = Σ(i,j,k)∈Ω³ Vz(i,j,k)·fz(x−i, y−j, z−k)

where fx, fy and fz are the X, Y and Z components of the gradient field,

Vx(x,y,z) = x/√(x² + y² + z²),

Vy(x,y,z) = y/√(x² + y² + z²),

Vz(x,y,z) = z/√(x² + y² + z²),

Ω = [−floor(maskSize/2), +floor(maskSize/2)].
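The filters and the per-component convolution of Equation (I) can be sketched as follows. Python with NumPy and SciPy's `ndimage.convolve` is an assumption (the source prescribes no implementation), the function names are hypothetical, and the zero-padding boundary mode is an illustrative choice.

```python
import numpy as np
from scipy.ndimage import convolve

def diverging_filters(mask_size):
    """Vx, Vy, Vz of Equation (I): a unit vector field diverging
    from the centre of the cubic domain Omega x Omega x Omega."""
    half = mask_size // 2                    # floor(maskSize/2)
    u = np.arange(-half, half + 1, dtype=np.float64)
    x, y, z = np.meshgrid(u, u, u, indexing="ij")
    r = np.sqrt(x ** 2 + y ** 2 + z ** 2)
    r[r == 0] = 1.0                          # centre voxel maps to 0
    return x / r, y / r, z / r

def dgfr_response(fx, fy, fz, mask_size):
    """Raw DGFR response for one mask size: convolve each gradient
    component with the matching filter component, then add the
    three intermediate response images."""
    vx, vy, vz = diverging_filters(mask_size)
    dgfr_x = convolve(fx, vx, mode="constant")
    dgfr_y = convolve(fy, vy, mode="constant")
    dgfr_z = convolve(fz, vz, mode="constant")
    return dgfr_x + dgfr_y + dgfr_z
```

The final line implements the additive combination of the intermediate response images discussed below.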
Then, at step 38, the raw DGFR response image is used to compute features that can be used to recognize the presence of any candidate lesion in the colon. For example, features such as sphericity and maximum response region can be computed to determine the presence of lesions having spherical or semi-spherical shapes. At step 42, after the loop of steps 28 and 40 has iterated over all mask sizes, the features corresponding to all mask sizes found during DGFR processing are sent to an output device or medium. Finally, based on the response image, features can be computed and used either for candidate generation or as input to a classifier.
In the above Equation (I), the intermediate response images are represented as DGFRx, DGFRy and DGFRz, while Ω is the domain of all valid coordinates in the image. The intermediate response images are combined to form the final response image. The combination can be performed using multiple techniques. For example, one technique involves adding the intermediate response images. This addition is expressed below as:
DGFR(x,y,z) = DGFRx(x,y,z) + DGFRy(x,y,z) + DGFRz(x,y,z) for x, y, z ∈ Ω
Further, the intermediate response images can be processed or combined using either a single function G or a combination of functions F(G(…)). Typical examples of such functions are the absolute value and the square; many other functions can be used. Application of an absolute value function to combine the intermediate response images is expressed as:
DGFR(x,y,z) = Abs(DGFRx(x,y,z)) + Abs(DGFRy(x,y,z)) + Abs(DGFRz(x,y,z)) for x, y, z ∈ Ω
Similarly, application of a squaring function to combine the intermediate response images is expressed as:
DGFR(x,y,z) = Square(DGFRx(x,y,z)) + Square(DGFRy(x,y,z)) + Square(DGFRz(x,y,z)) for x, y, z ∈ Ω
While the above Equation (I) is described for a three-dimensional digital image, the equation with its filters and gradient fields can be applied to two-dimensional or binary digital images, where the Z-component of the filters and gradient images is dropped and only the X and Y components are used to calculate the DGFR response image. Further, a DGFR process can be applied to any number of image dimensions, typical examples of which are 2-D, 3-D and 4-D images.
In the present example, the filters Vx(x,y,z), Vy(x,y,z) and Vz(x,y,z) are designed to detect spherical or semi-spherical objects. However, any appropriate filters can be used to determine a particular shape of the candidate object. For example, filters for elliptical and semi-elliptical objects can be designed as below:
Vx(x,y,z) = a·x/√(x² + y² + z²),

Vy(x,y,z) = b·y/√(x² + y² + z²),

Vz(x,y,z) = c·z/√(x² + y² + z²),

or using approximations of the above functions.
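The elliptical filters above differ from the spherical ones only in the per-axis weights a, b and c. A minimal sketch, again assuming Python with NumPy and hypothetical function names:

```python
import numpy as np

def elliptical_filters(mask_size, a=1.0, b=1.0, c=1.0):
    """Elliptical variant of the diverging filters: the X, Y and Z
    components are weighted by a, b and c, biasing the response
    toward ellipsoidal objects.  a = b = c = 1 recovers the
    spherical filters."""
    half = mask_size // 2
    u = np.arange(-half, half + 1, dtype=np.float64)
    x, y, z = np.meshgrid(u, u, u, indexing="ij")
    r = np.sqrt(x ** 2 + y ** 2 + z ** 2)
    r[r == 0] = 1.0
    return a * x / r, b * y / r, c * z / r
```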
A DGFR process applied in an exemplary embodiment of the invention is used to detect spherical or semi-spherical objects in an image volume. A DGFR process is an intuitive technique that relies on the nature of the image gradient fields to detect anatomical features in images, rather than a feature-based approach that relies on specific features of the objects to be detected (such as sphericity). A DGFR process is, in its simplest form, an approach for filtering, i.e., highlighting, as well as a sophisticated algorithm for performing automatic detection of abnormal anatomical structures such as colonic polyps, aneurysms, lung nodules, etc. In addition, it can be used to obtain other descriptive characteristics of a candidate lesion useful for its identification and classification.
Vector convolution of a diverging gradient field with an image having a spherical or semi-spherical object produces a high response, because the gradient field of an object having a spherical or semi-spherical shape typically diverges relative to the centroid at all points.
Pane 62 represents the filter Vx(x,y,z); pane 64 represents the filter Vy(x,y,z); and pane 66 represents the filter Vz(x,y,z) in Equation (I). These filters, forming the mask image 60, are convolved with the fx, fy and fz components of the gradient field in Equation (I) above.
Three-dimensional convolution is a computationally expensive procedure. Hence, separate filters (Vx(x,y,z), Vy(x,y,z), Vz(x,y,z)) for each axis are used; approximations of such filters can also be applied. Each of these separable filters is individually convolved, typically by taking a vector inner product, with the fx, fy and fz components of the gradient field, and the results are then added to obtain a DGFR response in an exemplary embodiment of the invention. Hence, using separable filters provides a relatively faster, computationally less expensive method of calculating a DGFR response image in an exemplary embodiment of the invention.
In the present example, masks with odd sizes varying from 5×5×5 to 25×25×25 are used to compute a response. Multiple mask sizes are used so that, for a polyp of a given size, masks both smaller and larger than the polyp are applied and at least one mask close to the polyp's size provides a strong response. All possible shifts of the masks are used to compute the vector convolution and obtain the response volume. These per-size responses are then integrated to generate a single response volume. Thus, response images are generated as a result of convolving the template masks with the derivative image.
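The multi-scale procedure above can be sketched as follows. Python with NumPy/SciPy and the function names are assumptions, and the voxelwise-maximum integration rule is only one plausible reading of "integrated": the source says the per-size responses are combined into a single volume without fixing the rule.

```python
import numpy as np
from scipy.ndimage import convolve

def _diverging_filters(mask_size):
    # Diverging unit vector field over the cubic mask domain.
    half = mask_size // 2
    u = np.arange(-half, half + 1, dtype=np.float64)
    x, y, z = np.meshgrid(u, u, u, indexing="ij")
    r = np.sqrt(x ** 2 + y ** 2 + z ** 2)
    r[r == 0] = 1.0
    return x / r, y / r, z / r

def multiscale_dgfr(fx, fy, fz, sizes=range(5, 26, 2)):
    """Apply masks of every odd size from 5x5x5 to 25x25x25 and
    integrate the per-scale responses into a single volume (here:
    voxelwise maximum, an illustrative assumption)."""
    best = None
    for m in sizes:
        vx, vy, vz = _diverging_filters(m)
        resp = (convolve(fx, vx, mode="constant")
                + convolve(fy, vy, mode="constant")
                + convolve(fz, vz, mode="constant"))
        best = resp if best is None else np.maximum(best, resp)
    return best
```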
The response image can be further processed. For example, a threshold is applied to each of these responses, and the shape of the resulting response image is analyzed. In particular, the thresholded response is approximated by an ellipsoid using eigenvalue decomposition, and the ratio of the largest eigenvalue to the smallest eigenvalue is estimated. This value gives an estimate of the sphericity of the response and can be used as a feature (along with other statistics of the response in a given region).
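One way to realize the thresholding and eigenvalue analysis above is to eigen-decompose the second-moment (covariance) matrix of the thresholded voxel coordinates; this particular fitting choice, like the Python/NumPy setting and the function name, is an assumption for illustration.

```python
import numpy as np

def sphericity_ratio(response, threshold):
    """Threshold the response, approximate the surviving region by
    an ellipsoid via eigen-decomposition of the second-moment
    (covariance) matrix of its voxel coordinates, and return the
    ratio of the largest to the smallest eigenvalue.  Ratios near 1
    indicate a sphere-like response."""
    pts = np.argwhere(response > threshold).astype(np.float64)
    if len(pts) < 4:
        return np.inf                  # too few voxels for an ellipsoid
    cov = np.cov(pts, rowvar=False)    # 3x3 second-moment matrix
    evals = np.linalg.eigvalsh(cov)    # ascending order
    if evals[0] <= 0:
        return np.inf                  # degenerate (flat or linear) region
    return evals[-1] / evals[0]
```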
Referring to
The computer platform 101 also includes an operating system and micro instruction code. The various processes and functions described herein may either be part of the micro instruction code or part of the application program (or a combination thereof) which is executed via the operating system. In addition, various other peripheral devices may be connected to the computer platform such as an additional data storage device and a printing device.
It is to be further understood that, because some of the constituent system components and method steps depicted in the accompanying figures may be implemented in software, the actual connections between the system components (or the process steps) may differ depending upon the manner in which the present invention is programmed. Given the teachings of the present invention provided herein, one of ordinary skill in the related art will be able to contemplate these and similar implementations or configurations of the present invention.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the appended claims.
This application claims the benefit of U.S. Provisional Application No. 60/519,522, filed on Nov. 12, 2003, entitled “DGFR: Filtering and Automatic Detection of Abnormal Anatomical Structures in Medical Images”, the content of which is fully incorporated herein by reference.
Number | Date | Country
---|---|---
20050105800 A1 | May 2005 | US
Number | Date | Country
---|---|---
60519522 | Nov 2003 | US