Claims
- 1. A method for automated analysis of textural differences present on a diagnostic medical image, said image comprising a plurality of pixels, with each pixel having a gray level assigned thereto, said method comprising:
defining a region of interest (ROI) on said image; performing at least one first order texture measure within said ROI to describe a frequency of occurrence of all gray levels assigned to pixels of said ROI; performing at least one second order texture measure within said ROI to describe spatial interdependencies between the pixels of said ROI; and classifying said ROI as belonging to a tissue pathology class based upon said first and second order texture measures obtained.
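By way of a non-limiting illustration, the first order texture measures recited in claim 1 are conventionally computed from the gray-level histogram of the ROI. The following sketch computes three standard histogram statistics (mean, variance, entropy); the function and parameter names are the editor's, not the patent's, and this is one plausible realization rather than the claimed implementation:

```python
import math

def first_order_measures(roi_pixels, num_levels=256):
    """Histogram-based (first order) texture measures for a flat list of
    ROI gray levels: mean, variance, and entropy of the normalized
    frequency-of-occurrence histogram."""
    n = len(roi_pixels)
    # Frequency of occurrence of each gray level within the ROI.
    hist = [0] * num_levels
    for g in roi_pixels:
        hist[g] += 1
    p = [h / n for h in hist]                        # normalized histogram
    mean = sum(g * p_g for g, p_g in enumerate(p))
    var = sum((g - mean) ** 2 * p_g for g, p_g in enumerate(p))
    entropy = -sum(p_g * math.log2(p_g) for p_g in p if p_g > 0)
    return {"mean": mean, "variance": var, "entropy": entropy}
```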
- 2. The method of claim 1, wherein prior to defining an ROI, said method further comprises:
removing structures within said image by assigning pixels that form said structures a particular gray level.
- 3. The method of claim 1, wherein prior to defining an ROI, said method further comprises:
forming pixel regions within said image, wherein said pixels that are located adjacent one another are assigned a common gray level provided that said adjacent pixels' gray levels differ by an insignificant amount; and performing said at least one second order texture measure only within said ROI.
- 4. The method of claim 3, wherein said commonly assigned gray level is the average of all said adjacent pixels' gray levels that differ in value by an insignificant amount.
- 5. The method of claim 3, wherein said adjacent pixels' gray levels differing by an insignificant amount is a difference in gray level of 20 or less.
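Claims 3-5 describe forming pixel regions by merging adjacent pixels whose gray levels differ by an insignificant amount (20 or less) and assigning the region the average gray level. One plausible realization is region growing over 4-connected neighbors; the sketch below is illustrative only, with hypothetical names, and merges any pixel whose gray level is within the threshold of an adjacent region member:

```python
from collections import deque

def form_regions(image, threshold=20):
    """Region-growing sketch: a pixel joins a region when its gray level
    differs by at most `threshold` from an adjacent pixel already in the
    region; every pixel of a region is then reassigned the region's
    (rounded) average gray level."""
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]
    visited = [[False] * cols for _ in range(rows)]
    for sr in range(rows):
        for sc in range(cols):
            if visited[sr][sc]:
                continue
            # Grow a region of mutually similar adjacent pixels.
            region, queue = [], deque([(sr, sc)])
            visited[sr][sc] = True
            while queue:
                r, c = queue.popleft()
                region.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and not visited[nr][nc]
                            and abs(image[nr][nc] - image[r][c]) <= threshold):
                        visited[nr][nc] = True
                        queue.append((nr, nc))
            # Assign the common (average) gray level to the whole region.
            avg = round(sum(image[r][c] for r, c in region) / len(region))
            for r, c in region:
                out[r][c] = avg
    return out
```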
- 6. The method of claim 3, wherein prior to forming pixel regions within said image, said method further comprises:
converting said image from an 11-bit format to an 8-bit format.
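The 11-bit to 8-bit conversion of claim 6 can be realized as a simple linear rescale of the gray-level range; the patent does not fix the exact scheme, so the mapping below is only one plausible choice:

```python
def convert_11bit_to_8bit(image):
    """Linearly rescale 11-bit gray levels (0-2047) to 8-bit (0-255)."""
    return [[round(g * 255 / 2047) for g in row] for row in image]
```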
- 7. The method of claim 1, wherein prior to said classifying, said method further comprises:
centering a pixel block about each pixel of said ROI; determining the average of absolute gray level intensity differences of each possible pixel-pair separated by a distance “d” within said pixel block; and assigning the pixel, about which said pixel block is centered, a stochastic fractal value based upon said average absolute gray level intensity differences obtained.
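Claim 7's stochastic fractal value is commonly obtained via a fractional-Brownian-motion formulation: the average absolute intensity difference at separation d scales as d^H, so the slope H of a log-log fit yields a fractal dimension D = 3 - H. The sketch below (illustrative names; horizontal and vertical pixel pairs only, one assumed formulation rather than the claimed one) computes that estimate for a pixel block:

```python
import math

def stochastic_fractal_value(block, max_d=4):
    """For each distance d, average |I(p) - I(q)| over all horizontal and
    vertical pixel pairs at separation d within the block, fit the slope
    H of log(average) vs log(d), and return D = 3 - H."""
    rows, cols = len(block), len(block[0])
    xs, ys = [], []
    for d in range(1, max_d + 1):
        diffs = []
        for r in range(rows):
            for c in range(cols):
                if c + d < cols:
                    diffs.append(abs(block[r][c] - block[r][c + d]))
                if r + d < rows:
                    diffs.append(abs(block[r][c] - block[r + d][c]))
        if diffs and sum(diffs) > 0:
            xs.append(math.log(d))
            ys.append(math.log(sum(diffs) / len(diffs)))
    # Least-squares slope of the log-log plot gives the Hurst exponent H.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    h = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return 3.0 - h
```

For a linear intensity ramp the average difference grows exactly linearly with d (H = 1), giving D = 2, the expected value for a smooth surface.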
- 8. The method of claim 7, wherein said method further comprises:
assigning said pixels of said ROI only one of two binary values dependent upon their respective gray levels; mapping said image onto a grid of super-pixels of increasing size “e”; determining the number of said super-pixels that are one binary value within said ROI; and determining a geometric fractal value of said ROI based upon said determined number of super-pixels that are said one binary value.
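The geometric fractal value of claim 8 corresponds to a box-counting dimension: the binarized ROI is overlaid with grids of super-pixels of increasing size e, the super-pixels containing a foreground pixel are counted, and the slope of log(count) versus log(1/e) is the dimension. A minimal sketch, assuming grids anchored at the image origin (names illustrative):

```python
import math

def geometric_fractal_value(binary, box_sizes=(1, 2, 4)):
    """Box-counting sketch: for each super-pixel side length e, count the
    super-pixels containing at least one foreground (value 1) pixel,
    then fit log(count) vs log(1/e); the slope is the geometric
    (box-counting) fractal dimension."""
    rows, cols = len(binary), len(binary[0])
    xs, ys = [], []
    for e in box_sizes:
        count = 0
        for r0 in range(0, rows, e):
            for c0 in range(0, cols, e):
                if any(binary[r][c]
                       for r in range(r0, min(r0 + e, rows))
                       for c in range(c0, min(c0 + e, cols))):
                    count += 1
        xs.append(math.log(1 / e))
        ys.append(math.log(count))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

A fully filled region yields dimension 2, as expected for a solid planar area.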
- 9. The method of claim 1, wherein prior to said classifying, said method further comprises:
eliminating said first order and second order texture measures which are redundant or fail to properly distinguish a particular tissue pathology class.
- 10. The method of claim 1, wherein prior to said classifying, said method further comprises:
providing known samples of particular tissue pathology classes; performing first order and second order texture measures from said samples; and storing said first order and second order texture measures obtained from said samples in association with said particular tissue pathology classes.
- 11. The method of claim 10, wherein said method further comprises:
determining the probability that said ROI belongs to a particular tissue pathology class based upon said stored texture measures and said first and second order texture measures performed on said ROI; and classifying said ROI to the particular tissue pathology class for which said probability is the highest.
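Claims 10 and 11 together describe storing texture measures from known samples and assigning the ROI to the class of highest probability. One common realization (assumed here, not specified by the claims) models each class as an independent Gaussian per feature; the names below are illustrative:

```python
import math

def classify_roi(roi_features, training):
    """Maximum-probability classification sketch: estimate a per-feature
    Gaussian (mean, variance) for each tissue pathology class from the
    stored training measures, then return the class whose log-likelihood
    for the ROI's feature vector is highest."""
    best_class, best_logp = None, -math.inf
    for label, samples in training.items():
        n = len(samples)
        logp = 0.0
        for i in range(len(roi_features)):
            vals = [s[i] for s in samples]
            mu = sum(vals) / n
            var = sum((v - mu) ** 2 for v in vals) / n or 1e-9  # avoid /0
            logp += (-0.5 * math.log(2 * math.pi * var)
                     - (roi_features[i] - mu) ** 2 / (2 * var))
        if logp > best_logp:
            best_class, best_logp = label, logp
    return best_class
```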
- 12. The method of claim 1, wherein said performing at least one second order texture measure further comprises:
determining gray level run-lengths that exist within said ROI by inspecting the existence of consecutive, collinear pixels that possess the same gray level in said ROI.
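The run-length determination of claim 12 can be sketched as follows; this illustrative version (hypothetical names) counts only horizontal runs, whereas a full run-length analysis would typically also scan vertical and diagonal directions:

```python
def gray_level_run_lengths(roi):
    """Count maximal runs of consecutive, collinear (here: horizontal)
    pixels that share one gray level. Returns a dict mapping
    (gray_level, run_length) -> number of such runs in the ROI."""
    runs = {}
    for row in roi:
        c = 0
        while c < len(row):
            start = c
            # Advance past every pixel equal to the run's gray level.
            while c < len(row) and row[c] == row[start]:
                c += 1
            key = (row[start], c - start)
            runs[key] = runs.get(key, 0) + 1
    return runs
```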
- 13. The method of claim 1, wherein said performing at least one second order texture measure further comprises:
determining the number of times a particular gray level and another particular gray level occur at a separation distance “d” within said ROI.
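Claim 13 describes the counting that underlies a gray level co-occurrence matrix. The sketch below (illustrative names; horizontal separation only, symmetric counting assumed) tallies how often gray levels i and j occur at distance d; Haralick-style second order measures are then derived from such a matrix:

```python
def cooccurrence_counts(roi, d=1, num_levels=4):
    """Count how often gray level i and gray level j occur at horizontal
    separation d within the ROI, accumulating both orderings so the
    resulting co-occurrence matrix is symmetric."""
    m = [[0] * num_levels for _ in range(num_levels)]
    for row in roi:
        for c in range(len(row) - d):
            i, j = row[c], row[c + d]
            m[i][j] += 1
            m[j][i] += 1   # symmetric: count the pair in both orders
    return m
```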
- 14. The method of claim 1, wherein said method further comprises:
assigning a color code to said ROI that is indicative of said particular tissue pathology class assigned to said ROI.
- 15. The method of claim 1, wherein said diagnostic medical image is a computed tomography (CT) image.
- 16. A method for automated analysis of a diagnostic medical image, said image comprising a plurality of pixels, with each pixel having a particular gray level assigned thereto, said method comprising:
forming pixel regions within said image by assigning a common gray level to a group of pixels which possess substantially similar gray levels; obtaining texture measures from said image which describe relationships between said pixels; and classifying areas of said image to a particular tissue pathology class based upon said texture measures obtained.
- 17. An apparatus, comprising:
image input means adapted to receive a diagnostic medical image, said image comprising a plurality of pixels, with each pixel having a particular gray level assigned thereto; and processor means adapted to perform texture measures on a group of pixels within said image, said texture measures providing information on an occurrence frequency of gray levels assigned to said group of pixels and spatial interdependencies between particular pixels of said group of pixels, said processor means being further adapted to classify said group of pixels to a tissue pathology class based upon said texture measures obtained.
- 18. The apparatus of claim 17 further comprising:
display means for displaying a graphical user interface and said received image.
- 19. The apparatus of claim 18 further comprising:
user input means for use in conjunction with said graphical user interface, wherein a user of said apparatus is able to select a plurality of options of said apparatus by use of said user input means and said graphical user interface.
- 20. The apparatus of claim 19, wherein said user input means is a computer mouse.
- 21. The apparatus of claim 17, wherein said processor is further adapted to remove structures within said image by assigning pixels that form said structures a particular gray level.
- 22. The apparatus of claim 17, wherein said processor is further adapted to form pixel regions within said image, wherein said pixels that are located adjacent one another are assigned a common gray level provided that said adjacent pixels' gray levels differ by an insignificant amount.
- 23. The apparatus of claim 22, wherein said commonly assigned gray level is the average of all said adjacent pixels' gray levels that differ in value by an insignificant amount.
- 24. The apparatus of claim 22, wherein said adjacent pixels' gray levels differing by an insignificant amount is a difference in gray level of 20 or less.
- 25. The apparatus of claim 17, wherein said processor is further adapted to convert said received image from an 11-bit format to an 8-bit format.
- 26. The apparatus of claim 17, wherein said processor is further adapted to center a pixel block about each pixel of said group of pixels; determine the average of absolute gray level intensity differences of each possible pixel-pair separated by a distance “d” within said pixel block; and assign the pixel, about which said pixel block is centered, a stochastic fractal value based upon said average absolute gray level intensity differences obtained.
- 27. The apparatus of claim 26, wherein said processor is further adapted to assign said pixels of said group of pixels only one of two binary values dependent upon their respective gray levels; map said image onto a grid of super-pixels of increasing size “e”; determine the number of said super-pixels that are one binary value within said group of pixels; and determine a geometric fractal value of said group of pixels based upon said determined number of super-pixels that are said one binary value.
- 28. The apparatus of claim 17, wherein said processor is further adapted to eliminate said texture measures which are redundant or fail to properly distinguish a particular tissue pathology class.
- 29. The apparatus of claim 17, wherein said image input means is further adapted to receive samples of images that are known to possess a particular tissue pathology class; said processor is further adapted to determine texture measures from said received samples of images; and said apparatus further comprises:
storage means for storing said texture measures obtained from said samples in association with said particular tissue pathology class.
- 30. The apparatus of claim 29, wherein said processor is further adapted to determine the probability that said group of pixels belongs to a particular tissue pathology class based upon said stored texture measures in said storage means and said texture measures performed on said group of pixels; and said processor is further adapted to classify said group of pixels to the particular tissue pathology class for which said probability is the highest.
- 31. The apparatus of claim 17, wherein at least one of said texture measures is determined from gray level run-lengths that exist within said group of pixels by inspecting the existence of consecutive, collinear pixels that possess the same gray level.
- 32. The apparatus of claim 17, wherein at least one of said texture measures is determined from the number of times a particular gray level and another particular gray level, within said group of pixels, occur at a separation distance “d”.
- 33. The apparatus of claim 18, wherein said processor is further adapted to assign a color code to said group of pixels that is indicative of said tissue pathology class classified to said group of pixels and said display means displays said color code on said image at the location of said group of pixels.
- 34. The apparatus of claim 17, wherein said texture measures comprise at least one first order texture measure to describe a frequency of occurrence of all gray levels assigned to said group of pixels and at least one second order texture measure to describe spatial interdependencies between the pixels of said group of pixels.
- 35. The apparatus of claim 17, wherein said diagnostic medical image is a computed tomography (CT) image.
- 36. A method for automated analysis of textural differences present on a diagnostic medical image, said image comprising a plurality of pixels, with each pixel having a gray level assigned thereto, said method comprising:
forming pixel regions within said image, wherein said pixels that are located adjacent one another are assigned a common gray level provided that said adjacent pixels' gray levels differ by an insignificant amount; defining a region of interest (ROI) on said image; performing at least one first order texture measure within said ROI to describe a frequency of occurrence of all gray levels assigned to pixels of said ROI; performing at least one second order texture measure within said ROI to describe spatial interdependencies between the pixels of said ROI; and classifying said ROI as belonging to a tissue pathology class based upon said first and second order texture measures obtained.
- 37. A method for automated analysis of textural differences present on a diagnostic medical image of the pulmonary region, said image comprising a plurality of pixels, with each pixel having a gray level assigned thereto, said method comprising:
defining a region of interest (ROI) on said image; performing at least one first order texture measure within said ROI to describe a frequency of occurrence of all gray levels assigned to pixels of said ROI; performing at least one second order texture measure within said ROI to describe spatial interdependencies between the pixels of said ROI; centering a pixel block about each pixel of said ROI; determining the average of absolute gray level intensity differences of each possible pixel-pair separated by a distance “d” within said pixel block; assigning the pixel, about which said pixel block is centered, a stochastic fractal value based upon said average absolute gray level intensity differences obtained; and classifying said ROI as belonging to a tissue pathology class based upon said first and second order texture measures and said stochastic fractal value obtained.
PRIORITY CLAIM
[0001] This application claims priority of U.S. Provisional Application No. 60/037,067, filed Feb. 12, 1997, entitled: “Method and Apparatus for Analyzing CT Images to Determine the Presence of Pulmonary Parenchyma”, by Renuka Uppaluri, Theophano Mitsa, Eric A. Hoffman, Geoffrey McLennan, and Milan Sonka, Atty Docket No.: IOWA:013PZ1/MOG.
Provisional Applications (1)

| Number   | Date     | Country |
| -------- | -------- | ------- |
| 60037067 | Feb 1997 | US      |
Continuations (1)

|        | Number   | Date     | Country |
| ------ | -------- | -------- | ------- |
| Parent | 09022093 | Feb 1998 | US      |
| Child  | 10251446 | Sep 2002 | US      |