The present invention relates generally to Computer-Assisted Detection (CAD) and, in particular, to a CAD method for automatically detecting lesions in volumetric medical images.
Techniques for acquiring medical image data, rendering 3D models of an object from such data and allowing an operator to view the virtual object are well known. For example, U.S. Pat. No. 6,343,936 to Kaufman et al. entitled “System and Method for Performing a Three Dimensional Virtual Examination, Navigation and Visualization” discloses such systems and methods. Virtual colonoscopy is an important application for such systems. The ability to have the colon interior examined using a non-invasive medical imaging method as opposed to conventional invasive techniques will make the procedure more widely used and will lead to early detection of cancerous polyps in a larger number of cases. In this regard, the '936 patent discloses a method of detecting polyps within a colon lumen. The disclosure in the '936 patent presents an initial method for effectively detecting polyps. Nonetheless, improvements to such techniques are desirable to further enhance the results of automatic polyp detection.
The present invention is directed to a computer-based method for automatically detecting abnormal lesions in volumetric medical images. The method includes the steps of feature extraction and fusion. The first step is computing a gradient feature to extract the Layer of Partial Volume Effect (LPVE) between the different tissues that relate to specific organs. The LPVE is combined with the result of voxel classification to provide tissue classification. After tissue classification, the contour of the tissue boundary is determined. The gradient feature is also analyzed to determine the direction in which intensity changes. The direction in which intensity changes most dramatically is selected as the normal vector for the voxels on the contour of the tissue boundary.
The second step is to determine a local surface patch on the contour for each voxel on the contour. A local landmark system is created on the patch, and a Euclidean Distance Transform Vector (EDTV) is then computed based on those landmarks. The EDTV is substantially independent of the image coordinate system and is substantially invariant to translation and rotation. The EDTV is the basic shape feature for lesion detection. A vector classification algorithm for pattern recognition based on EDTVs is also provided. The voxels on the contour of the tissue boundary can be grouped into areas based on similar patterns to form a lesion patch and a local lesion volume. Each such area can be further analyzed to estimate the likelihood of a lesion.
These and other features and advantages of the present invention will become apparent from the following detailed description of preferred embodiments, which is to be read in connection with the accompanying drawings.
The present invention is directed to a computer assisted diagnostic (CAD) method for automatically detecting suspicious lesions in volumetric medical images. A general overview of the present invention is shown in the flow chart of
To facilitate a clear understanding of the present invention, illustrative examples are provided herein which describe the method as a sequential procedure with multiple steps for processing CT virtual colonoscopy images. However, the invention is not limited to applications of this kind. It is to be appreciated that the invention may be used with any combination of the techniques described herein and may be applied to lesion detection for any organ. Moreover, the present invention is equally applicable to images of other modalities, provided that an intensity difference exists between the tissue of interest and its surrounding tissues.
The basic steps of the present method include extracting the contour of a virtual object, such as a colonic lumen; performing geometry feature analysis on the contour; detecting suspicious regions, such as polyps, based on the geometry features; analyzing additional features of the polyp volume; and estimating the likelihood that a suspicious region corresponds to a real polyp.
Referring to
The volumetric image data is subjected to voxel classification (step 105). In the case of virtual colonoscopy, the colon can be insufflated with air or CO2 during image scanning. The remaining residual stool and fluid in the colon are usually tagged at higher intensity. As described in the '936 patent, the intensities of air, soft tissue, and tagged stool/fluid are in ascending order, with an approximately 500 HU difference between each in image intensity. This intensity difference enables differentiation of these materials. However, the boundary between the regions may be slowly varying due to partial volume effects. The range of the partial volume effect depends on both the width of collimation and the image resolution, i.e., the voxel size in the image data. Moreover, the intensity may be blurred due to noise. Therefore, more sophisticated voxel classification, such as described in the '936 patent or in U.S. Pat. No. 6,331,116 (“'116 patent”), which is hereby incorporated by reference in its entirety, may be desired. The method disclosed in the '116 patent classifies each voxel based on its local intensity vector.
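By way of illustration only, the following Python sketch shows one simple way such an intensity-based voxel classification could be realized. The HU cut-off values, the label names, and the function classify_voxels are illustrative assumptions and are not prescribed by the present method; a more sophisticated classifier, such as that of the '116 patent, would be preferred in practice.

```python
import numpy as np

# Illustrative sketch only: coarse intensity-based voxel labeling for CT
# virtual colonoscopy. The HU cut-offs below are assumed values chosen to
# separate air/CO2, soft tissue, and tagged stool/fluid; they are not
# prescribed by the method described above.

AIR, SOFT_TISSUE, TAGGED = 0, 1, 2

def classify_voxels(ct_hu: np.ndarray,
                    air_max: float = -700.0,
                    tagged_min: float = 200.0) -> np.ndarray:
    """Assign each voxel a coarse material label from its HU intensity."""
    labels = np.full(ct_hu.shape, SOFT_TISSUE, dtype=np.uint8)
    labels[ct_hu <= air_max] = AIR          # insufflated lumen (air/CO2)
    labels[ct_hu >= tagged_min] = TAGGED    # tagged residual stool/fluid
    return labels

if __name__ == "__main__":
    toy_volume = np.random.uniform(-1000, 600, size=(8, 8, 8))  # toy HU data
    print(np.bincount(classify_voxels(toy_volume).ravel(), minlength=3))
```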
In addition to voxel classification, a gradient feature for the voxels in the volumetric image data is also calculated (step 110). The gradient feature is defined as the directional first-order derivative. In the digital image data, voxels are located on a discrete integer grid. Generally, the grid has different step lengths along different coordinate directions due to variations of the scanning parameters. The present method calculates the gradient feature in a manner which is adaptive to the anisotropic voxel size as well as to the layer of partial volume effect (LPVE), which depends on both collimation and the voxel size of CT images. Usually, the larger the collimation is, the larger the range of partial volume effects. On the other hand, the gradient feature should describe the shape variation at that point.
For each voxel in the volumetric image data, five directional derivatives are calculated, as illustrated in
The calculation of the first-order derivatives is described in the following expressions. Let xi be the intensity of the i-th neighbor voxel and let Y={yi: i=1, 2, . . . , 5} denote the vector of directional derivatives. Equation 1 shows how to calculate the Y values. The neighborhood is defined in
y1=|x12−18x4+18x6−x14|/(12*λxy),
y2=|x9−18x1+18x3−x11|/(12*λy),
y3=|x13−18x5+18x7−x15|/(12*λxy),
y4=|x10−18x2+18x0−x8|/(12*λx),
y5=|x16−x17|/(2*λz) (1)
where λ* is the step length along direction “*.” In Equation 1, the central-5-point formula is employed as an example, however, the invention is not limited to this embodiment. Any other suitable formula can be utilized to calculate the first-order derivative and account for voxel size.
The maximum value among the absolute values of the directional derivatives of Equation 1 is defined as the Gradient Feature Value (GFV). By thresholding the GFV, the LPVE can be separated from the homogeneous tissue regions, since a homogeneous region has a much lower GFV than the tissue boundary, where the intensity changes dramatically. The threshold can be pre-set for a specific application. In the present invention as applied to virtual colonoscopy CT images, a threshold of about 50 has been found to be useful. In other words, if the GFV of a voxel is larger than 50, the voxel is considered to be in the LPVE.
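As an illustration of step 110 and the GFV thresholding, the following sketch computes five directional first-order derivatives and the resulting GFV. It assumes that the five directions are the two in-plane axes, the two in-plane diagonals, and the slice axis, and it applies the standard central-5-point weights along all five directions, whereas Equation 1 uses a two-point difference along z; the exact neighbor indexing of Equation 1 depends on a figure not reproduced here. The function names and the example voxel size are illustrative.

```python
import numpy as np

# Illustrative sketch of the gradient feature (GFV) and LPVE thresholding.
# Assumptions: the five directions are x, y, z and the two in-plane diagonals,
# and the standard central-5-point derivative weights are used for all five
# directions (the exact neighborhood of Equation 1 is defined by a figure not
# reproduced here). Step lengths are scaled by the anisotropic voxel size.

def directional_derivative(vol, step, spacing):
    """|f(-2h) - 8f(-h) + 8f(+h) - f(+2h)| / (12*h) along an integer offset."""
    ax = (0, 1, 2)
    shift = lambda k: np.roll(vol, tuple(-k * s for s in step), axis=ax)
    return np.abs(shift(-2) - 8 * shift(-1) + 8 * shift(1) - shift(2)) / (12.0 * spacing)

def gradient_feature_value(vol, voxel_size=(1.0, 0.7, 0.7)):
    """GFV = maximum absolute directional derivative at each voxel."""
    dz, dy, dx = voxel_size
    dxy = np.hypot(dx, dy)                        # physical step along a diagonal
    directions = [((0, 0, 1), dx), ((0, 1, 0), dy), ((1, 0, 0), dz),
                  ((0, 1, 1), dxy), ((0, 1, -1), dxy)]
    derivs = [directional_derivative(vol, s, h) for s, h in directions]
    return np.max(np.stack(derivs), axis=0)

if __name__ == "__main__":
    ct = np.random.uniform(-1000, 600, size=(16, 16, 16))   # toy HU volume
    lpve_mask = gradient_feature_value(ct) > 50.0            # threshold of about 50
    print(int(lpve_mask.sum()), "voxels assigned to the LPVE")
```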
From the voxel classification of step 105 and the gradient feature of step 110, the various tissue regions as well as the layer of partial volume effect (LPVE) between regions are extracted in step 115.
The determination of the LPVE can be fused with that of the voxel classification to determine the region of the colon lumen and the colonic wall layer. There are several voxel classification methods applicable to this application. In one embodiment, the self-adaptive vector quantization algorithm, which is disclosed in U.S. Pat. No. 6,331,116, is used. The GFV represents the intensity change at a given location, while the voxel classification addresses the intensity homogeneity. By combining GFV thresholding and voxel classification, more reliable and complete information can be obtained. Moreover, the partial volume effect can be accounted for in an appropriate way. This combination is referred to herein as Intensity-Gradient Hybrid Segmentation (IGHS). The segmentation result of IGHS is the LPVE around the homogeneous tissue region. The layer of partial volume effects may be several voxels thick or more. The one-voxel-thick shell of the layer that is connected to the colon lumen is referred to herein as the Contour Of the Colonic Lumen (COCL) (step 120).
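A minimal sketch of the fusion and COCL-extraction idea follows. It assumes two precomputed boolean masks, lumen (from the voxel classification) and lpve (from GFV thresholding), and uses simple morphological dilation to find the one-voxel-thick shell of the LPVE touching the lumen; the mask names and the toy data are illustrative, and the classification in the described embodiment uses the self-adaptive vector quantization of the '116 patent.

```python
import numpy as np
from scipy import ndimage

# Illustrative sketch of extracting the Contour Of the Colonic Lumen (COCL):
# the one-voxel-thick shell of the LPVE that touches the colon lumen. The
# inputs `lumen` and `lpve` are assumed boolean masks produced by the voxel
# classification and GFV thresholding described above.

def extract_cocl(lumen: np.ndarray, lpve: np.ndarray) -> np.ndarray:
    """Return LPVE voxels immediately adjacent to the lumen (one voxel thick)."""
    near_lumen = ndimage.binary_dilation(lumen)   # lumen plus its 1-voxel border
    return lpve & near_lumen & ~lumen

if __name__ == "__main__":
    lumen = np.zeros((16, 16, 16), dtype=bool)
    lumen[4:12, 4:12, 4:12] = True                                   # toy lumen
    lpve = ndimage.binary_dilation(lumen, iterations=2) & \
           ~ndimage.binary_erosion(lumen, iterations=1)              # toy boundary layer
    print(int(extract_cocl(lumen, lpve).sum()), "COCL voxels")
```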
For each voxel on the COCL, the direction in which intensity changes most dramatically is determined. Referring to
In step 125, the contour of the regions of interest is evaluated and a set of local landmarks for the local patch contour is determined. As illustrated in
In step 130, the landmarks can be used to compute a Euclidean Distance Transform Vector (EDTV). The EDTV consists of the set of all distances between any pair of landmarks in a predetermined order. For example, when nine landmarks (numbered 1 through 9) are used, a 36-dimensional EDTV results, which includes the distances between landmarks 1-2, 1-3, 1-4, 1-5, 1-6, 1-7, 1-8, 1-9, 2-3, 2-4, 2-5, 2-6, 2-7, 2-8, 2-9, 3-4, 3-5, 3-6, 3-7, 3-8, 3-9, 4-5, 4-6, 4-7, 4-8, 4-9, 5-6, 5-7, 5-8, 5-9, 6-7, 6-8, 6-9, 7-8, 7-9 and 8-9. The order in which the distances are determined among the landmarks is not critical so long as the order is known and is repeatable. Since the elements of the EDTV are relative distances, the EDTV is independent of the coordinate system and is invariant to translation and rotation. This makes the EDTV a good shape feature vector.
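The EDTV computation itself is straightforward; the following sketch forms the vector of pairwise landmark distances in a fixed order and demonstrates its translation invariance. How the nine landmarks are placed on the local patch is not shown here, and the array layout is an assumption.

```python
import numpy as np
from itertools import combinations

# Illustrative sketch of the Euclidean Distance Transform Vector (EDTV): all
# pairwise distances between the local landmarks taken in a fixed, repeatable
# order. With nine landmarks this yields a 36-dimensional vector.

def compute_edtv(landmarks: np.ndarray) -> np.ndarray:
    """landmarks: (N, 3) physical coordinates -> (N*(N-1)/2,) distance vector."""
    pairs = combinations(range(len(landmarks)), 2)   # order: 1-2, 1-3, ..., 8-9
    return np.array([np.linalg.norm(landmarks[i] - landmarks[j]) for i, j in pairs])

if __name__ == "__main__":
    pts = np.random.rand(9, 3)                       # toy landmark coordinates
    edtv = compute_edtv(pts)
    print(edtv.shape)                                # (36,)
    # The EDTV is unchanged by a rigid translation of all landmarks.
    print(np.allclose(edtv, compute_edtv(pts + np.array([5.0, -2.0, 1.0]))))
```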
The EDTV provides a geometric representation of the region of interest which can be used in classification (step 140) and pattern recognition (step 135). A polyp library can be created based on a training set of EDTVs. The training set is a set of 36-dimensional EDTVs that are extracted from available polyp samples in virtual colonoscopy images. Any known feature extraction algorithm can be applied to the training set to classify the training EDTVs into several typical clusters, and a representative EDTV for each cluster is generated. One solution is to use principal component analysis. The representative EDTVs serve as templates for the kinds of local shapes associated with possible polyps. This library can be updated as more EDTVs related to polyps are obtained. The feature extraction procedure can also be applied to a training set that includes EDTVs related to both polyps and the normal colon wall/folds. In that case, representative EDTVs for the normal wall/folds can also be created.
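The following sketch illustrates the library-building idea under a substitution: rather than the principal component analysis named above, it clusters a training set of EDTVs with plain k-means and keeps the cluster centroids as representative EDTVs. The array training_edtvs, the cluster count, and the function name are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

# Illustrative sketch of building a polyp library of representative EDTVs.
# k-means is used here as a stand-in clustering step; the text above names
# principal component analysis as one option. `training_edtvs` is an assumed
# (n_samples, 36) array of EDTVs extracted from confirmed polyp samples.

def build_polyp_library(training_edtvs: np.ndarray, n_clusters: int = 5) -> np.ndarray:
    """Cluster the training EDTVs and return one representative EDTV per cluster."""
    centroids, _ = kmeans2(training_edtvs, n_clusters, minit="points")
    return centroids

if __name__ == "__main__":
    training_edtvs = np.abs(np.random.randn(200, 36))    # toy training set
    print(build_polyp_library(training_edtvs).shape)     # (5, 36)
```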
The current invention detects suspicious polyp locations in virtual colonoscopy images based on the EDTV. When a new set of image data is acquired, the colon lumen and the LPVE are determined first. Then, the COCL is delineated. Following that, the EDTV is calculated for each voxel on the COCL. The distances between each EDTV and the representative EDTVs in the library are computed, and the minimum distance is then determined. If the minimum distance is less than or equal to a pre-set threshold, the current voxel is associated with a suspicious polyp. Otherwise, the current voxel is associated with the normal colon wall. The threshold can be determined by finding the minimum distance between the EDTVs of the normal wall and the representative EDTVs.
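A minimal sketch of this matching step follows; the threshold value and the array shapes are illustrative, and the distance measure is assumed to be Euclidean.

```python
import numpy as np

# Illustrative sketch of flagging suspicious COCL voxels: each voxel's EDTV is
# compared against every representative EDTV in the polyp library, and the
# voxel is marked suspicious when its minimum distance to the library is at or
# below a pre-set threshold.

def flag_suspicious(edtvs: np.ndarray, library: np.ndarray, threshold: float) -> np.ndarray:
    """edtvs: (n_voxels, d); library: (n_reps, d) -> boolean mask of suspicious voxels."""
    dists = np.linalg.norm(edtvs[:, None, :] - library[None, :, :], axis=2)
    return dists.min(axis=1) <= threshold

if __name__ == "__main__":
    library = np.abs(np.random.randn(5, 36))               # toy representative EDTVs
    edtvs = np.abs(np.random.randn(1000, 36))              # toy EDTVs of COCL voxels
    mask = flag_suspicious(edtvs, library, threshold=2.0)  # illustrative threshold
    print(int(mask.sum()), "suspicious voxels")
```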
The suspicious voxels on the COCL can be further grouped based on a nearest distance rule. In other words, if the distance between two suspicious voxels is less than a threshold, they are considered to relate to the same polyp. The distance threshold is determined by the voxel size, i.e., the image resolution. For example, for a 0.7×0.7×1.0 mm3 voxel size, the distance threshold can be set to 1.5 mm. After grouping of suspicious voxels, the area on the COCL that is related to the same polyp is dilated to a thickness of 1.5 mm to create a contiguous surface area on the COCL. This contiguous surface area is referred to herein as the Polyp Patch (PP).
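One way to realize the nearest-distance grouping is single-linkage grouping via connected components of a proximity graph, as sketched below; the helper names and the toy data are illustrative, and the subsequent 1.5 mm dilation into the Polyp Patch is not shown.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

# Illustrative sketch of grouping suspicious COCL voxels with a nearest-distance
# rule: voxels closer than the distance threshold (e.g. 1.5 mm) are linked, and
# each connected group of linked voxels is treated as one polyp candidate.

def group_suspicious_voxels(coords_mm: np.ndarray, max_dist_mm: float = 1.5) -> np.ndarray:
    """coords_mm: (n, 3) physical coordinates -> integer group label per voxel."""
    n = len(coords_mm)
    pairs = np.array(list(cKDTree(coords_mm).query_pairs(r=max_dist_mm)))
    if len(pairs) == 0:
        return np.arange(n)                               # every voxel is its own group
    graph = csr_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
    _, labels = connected_components(graph, directed=False)
    return labels

if __name__ == "__main__":
    pts = np.vstack([np.random.rand(20, 3), np.random.rand(20, 3) + 10.0])  # two clumps
    print(np.unique(group_suspicious_voxels(pts)))
```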
Each location which may be a polyp is processed to estimate the likelihood that the region represents a polyp (step 150). Each patch is associated with a suspicious polyp location. The smallest parallelepiped volume of voxels that contains the PP is then determined and used as the polyp volume. Geometry features of the volume, such as its size, volume, and diameter, can be computed. Further feature analysis can be applied to the polyp volume. For example, texture analysis as disclosed in U.S. Pat. No. 5,971,676, which is hereby incorporated by reference in its entirety, can be used to further analyze a suspicious region. Moreover, multiple feature analyses can be applied and their results fused to estimate the likelihood that the volume contains a polyp. A list of all suspicious regions is created in step 145. The list of suspicious polyps can be sorted in a number of ways depending on user preference. For example, the list can be sorted in descending order of likelihood or in descending order of the size of the polyp volume.
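The sketch below illustrates how a polyp volume and a few simple geometry features might be derived from a Polyp Patch. Here the smallest parallelepiped is taken to be the axis-aligned bounding box of the patch voxels, which is an interpretation rather than a construction specified above; the texture analysis and feature fusion used for the likelihood estimate are not shown.

```python
import numpy as np

# Illustrative sketch of deriving a polyp volume and simple geometry features
# from one Polyp Patch. The "smallest parallelepiped" is approximated by the
# axis-aligned bounding box of the patch voxels (an assumed interpretation).

def polyp_volume_features(patch_idx: np.ndarray, voxel_size=(1.0, 0.7, 0.7)) -> dict:
    """patch_idx: (n, 3) integer voxel indices of one Polyp Patch."""
    lo, hi = patch_idx.min(axis=0), patch_idx.max(axis=0)
    extent_mm = (hi - lo + 1) * np.asarray(voxel_size)     # bounding-box edge lengths
    return {
        "bbox_min": lo,
        "bbox_max": hi,
        "volume_mm3": float(np.prod(extent_mm)),           # bounding-box volume
        "diameter_mm": float(np.linalg.norm(extent_mm)),   # bounding-box diagonal
        "n_patch_voxels": int(len(patch_idx)),
    }

if __name__ == "__main__":
    toy_patch = np.array([[10, 12, 12], [10, 13, 12], [11, 12, 13], [11, 13, 13]])
    print(polyp_volume_features(toy_patch))
```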
The suspicious region list is provided to the display equipment where the user interface is running. The user can interactively sort the polyps in the list based on different features. The PP is coded with a specific color to alert the reviewer as he/she navigates inside the colon lumen. When the user confirms a polyp in the list, the associated EDTVs of that PP are automatically stored for further improvement of the polyp library.
All the features can further be transformed into a format for rendering in the 3D view to facilitate detection. This is the so-called virtual biopsy.
This application claims the benefit of U.S. Provisional Application Ser. No. 60/322,046, entitled “Advanced Navigation and Detection for Virtual Examination,” which was filed on Sep. 14, 2001.
Number | Name | Date | Kind |
---|---|---|---|
4367216 | Mutzel et al. | Jan 1983 | A |
4391280 | Miller | Jul 1983 | A |
4493039 | Gregory | Jan 1985 | A |
4630203 | Szirtes | Dec 1986 | A |
4710876 | Cline et al. | Dec 1987 | A |
4719585 | Cline et al. | Jan 1988 | A |
4729098 | Cline et al. | Mar 1988 | A |
4737921 | Goldwasser et al. | Apr 1988 | A |
4751643 | Lorensen et al. | Jun 1988 | A |
4791567 | Cline et al. | Dec 1988 | A |
4823129 | Nelson | Apr 1989 | A |
4831528 | Crawford et al. | May 1989 | A |
4874362 | Wiest et al. | Oct 1989 | A |
4879668 | Cline et al. | Nov 1989 | A |
4984157 | Cline et al. | Jan 1991 | A |
4985834 | Cline et al. | Jan 1991 | A |
4985856 | Kaufman | Jan 1991 | A |
4987554 | Kaufman | Jan 1991 | A |
4993415 | Long | Feb 1991 | A |
5006109 | Douglas et al. | Apr 1991 | A |
5023072 | Cheng | Jun 1991 | A |
5038302 | Kaufman | Aug 1991 | A |
5047772 | Ribner | Sep 1991 | A |
5056020 | Feldman et al. | Oct 1991 | A |
5095521 | Trousset et al. | Mar 1992 | A |
5101475 | Kaufman | Mar 1992 | A |
5127037 | Bynum | Jun 1992 | A |
5166876 | Cline et al. | Nov 1992 | A |
5170347 | Tuy et al. | Dec 1992 | A |
5187658 | Cline et al. | Feb 1993 | A |
5204625 | Cline et al. | Apr 1993 | A |
5229935 | Yamagishi et al. | Jul 1993 | A |
5245538 | Lis | Sep 1993 | A |
5261404 | Mick et al. | Nov 1993 | A |
5265012 | Amans et al. | Nov 1993 | A |
5270926 | Tam | Dec 1993 | A |
5283837 | Wood | Feb 1994 | A |
5295488 | Lloyd et al. | Mar 1994 | A |
5299288 | Glassman et al. | Mar 1994 | A |
5322070 | Goodman et al. | Jun 1994 | A |
5331550 | Stafford et al. | Jul 1994 | A |
5345490 | Finnigan et al. | Sep 1994 | A |
5361763 | Kao et al. | Nov 1994 | A |
5365927 | Roemer et al. | Nov 1994 | A |
5371778 | Yanof et al. | Dec 1994 | A |
5442733 | Kaufman et al. | Aug 1995 | A |
5458111 | Coin | Oct 1995 | A |
5611025 | Lorensen et al. | Mar 1997 | A |
5623586 | Höhne | Apr 1997 | A |
5630034 | Oikawa et al. | May 1997 | A |
5699799 | Xu et al. | Dec 1997 | A |
5734384 | Yanof et al. | Mar 1998 | A |
5782762 | Vining | Jul 1998 | A |
5971767 | Kaufman | Oct 1999 | A |
5986662 | Argiro et al. | Nov 1999 | A |
6130671 | Argiro | Oct 2000 | A |
6149594 | Rock et al. | Nov 2000 | A |
6215893 | Leshem et al. | Apr 2001 | B1 |
6219059 | Argiro | Apr 2001 | B1 |
6272366 | Vining | Aug 2001 | B1 |
20010055016 | Krishnan | Dec 2001 | A1 |
20020164061 | Paik et al. | Nov 2002 | A1 |
Number | Date | Country |
---|---|---|
9613207 | May 1996 | WO |
9811524 | Mar 1998 | WO |
9837517 | Aug 1998 | WO |
0055812 | Sep 2000 | WO |
0055814 | Sep 2000 | WO |
Number | Date | Country
---|---|---|
60322046 | Sep 2001 | US |