Claims
- 1. A method of analyzing a plurality of views of an object, said object including an edge portion partially extending from a surface of said object into an internal volume of said object, comprising the step of:
analyzing each acquired view, wherein the step of analyzing each acquired view includes analysis of said edge portion.
- 2. The method of claim 1, wherein the step of analyzing each acquired view comprises:
detecting at least one region of concern; and classifying said detected region of concern.
- 3. The method of claim 2, wherein said object comprises breast tissue, and wherein the step of classifying said detected region of concern classifies pathologies within said breast tissue.
- 4. The method of claim 2, further comprising the steps of:
correlating a detected region of concern in a first view of said plurality of views with a corresponding region in a second view of said plurality of views; and weighting said classification with a weighting factor corresponding to a degree of correlation.
- 5. The method of claim 4, wherein the step of correlating comprises at least one of:
determining that a corresponding region does not exist; and comparing at least one of:
a shape of the detected region of concern; a size of the detected region of concern; a contrast of the detected region of concern; and a contrast distribution of the detected region of concern.
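Claims 4 and 5 describe weighting a classification by a degree of correlation obtained by comparing features of corresponding regions across views. A minimal sketch of one way such a degree could be computed and applied (the feature names, the normalised-difference measure, and the multiplicative weighting are illustrative assumptions, not the claimed method):

```python
def correlation_degree(region_a, region_b, keys=("size", "contrast")):
    """Degree of correlation in [0, 1] between two detected regions,
    from normalised differences of illustrative features.
    A missing corresponding region would map to a degree of 0."""
    diffs = [abs(region_a[k] - region_b[k]) / max(region_a[k], region_b[k], 1e-9)
             for k in keys]
    return 1.0 - sum(diffs) / len(diffs)

def weighted_classification(score, degree):
    """Weight a raw classification score by the degree of correlation."""
    return score * degree
```

A region that matches its counterpart exactly yields a degree of 1.0, leaving the classification score unchanged; dissimilar regions down-weight it.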
- 6. The method of claim 1, further comprising a step of reconstructing a plurality of reconstructed planes from said plurality of views.
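The plane reconstruction of claim 6 can be illustrated with the classical shift-and-add method, in which each projection view is shifted by the parallax corresponding to the chosen plane depth and the shifted views are averaged. This is a generic sketch of one well-known technique; the claim does not specify this particular algorithm:

```python
import numpy as np

def shift_and_add(views, shifts):
    """Reconstruct one plane from tomosynthesis views: shift each
    projection by its per-view parallax (in pixels) and average.
    Structures lying in the selected plane reinforce; others blur."""
    plane = np.zeros_like(views[0], dtype=float)
    for view, shift in zip(views, shifts):
        plane += np.roll(view, shift, axis=1)
    return plane / len(views)
```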
- 7. The method of claim 6, further comprising the steps of:
detecting at least one region of concern in a reconstructed plane; and classifying said detected region of concern in the reconstructed plane.
- 8. The method of claim 7, further comprising the steps of:
correlating a detected region of concern in a first reconstructed plane of said plurality of reconstructed planes with at least one of:
a corresponding region in a second reconstructed plane of said plurality of reconstructed planes; and a corresponding region in at least one of said plurality of views; and weighting said classification with a weighting factor corresponding to a degree of correlation.
- 9. The method of claim 1, further comprising a step of reconstructing a three dimensional (3D) image of said object from said plurality of views.
- 10. The method of claim 9, further comprising the steps of:
detecting regions of concern in said reconstructed 3D image; and classifying said detected regions of concern in said reconstructed 3D image.
- 11. The method of claim 10, further comprising the steps of:
correlating a detected region of concern in said 3D image with a corresponding region in at least one view of said plurality of views; and weighting said classification with a weighting factor corresponding to a degree of correlation.
- 12. The method of claim 10, further comprising at least one of the steps of:
summing image data along a line to simulate a ray projection along that line; determining a maximum intensity projection within a detected region of concern; determining a minimum intensity projection within a detected region of concern; determining a mean intensity projection within a detected region of concern; and determining a median intensity projection within a detected region of concern.
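The projection operations listed in claim 12 map directly onto array reductions along a ray axis through a reconstructed volume. A sketch of one possible implementation (the axis convention and the NumPy reductions are assumptions, not the claimed processing chain):

```python
import numpy as np

def intensity_projections(volume, axis=0):
    """Sum, maximum, minimum, mean, and median intensity projections
    along a simulated ray direction through a 3D volume."""
    return {
        "sum": volume.sum(axis=axis),        # simulated ray projection
        "max": volume.max(axis=axis),        # maximum intensity projection
        "min": volume.min(axis=axis),        # minimum intensity projection
        "mean": volume.mean(axis=axis),
        "median": np.median(volume, axis=axis),
    }
```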
- 13. The method of claim 1, further comprising the steps of:
angularly displacing a radiation source about a pivot point to a position corresponding to said view; radiating said object via said radiation source; and detecting radiation from said object.
- 14. The method of claim 13, wherein the total angular displacement of view acquisition is less than 360°.
- 15. The method of claim 14, wherein the total angular displacement of view acquisition is less than about 90°.
- 16. The method of claim 15, wherein the total angular displacement of view acquisition is less than about 60°.
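Claims 13 through 16 describe acquiring views while the source pivots through a limited total sweep (under 360°, about 90°, or about 60°). A sketch of computing evenly spaced source positions for such a sweep (even spacing and centring on the pivot normal are illustrative assumptions):

```python
def source_angles(total_sweep_deg, n_views):
    """Evenly spaced angular positions (degrees), centred on zero,
    for a radiation source pivoting through a limited total sweep."""
    if n_views == 1:
        return [0.0]
    step = total_sweep_deg / (n_views - 1)
    return [-total_sweep_deg / 2.0 + i * step for i in range(n_views)]
```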
- 17. The method of claim 2, wherein said plurality of views are acquired with an x-ray modality, further comprising the steps of:
correlating said detected region of concern with a corresponding region in an image acquired by one of an ultrasound modality and a nuclear medicine modality; and weighting said classification with a weighting factor corresponding to a degree of correlation.
- 18. The method of claim 17, wherein the step of correlating comprises at least one of:
determining that a corresponding region does not exist; and comparing at least one of:
a shape of the detected region of concern; a size of the detected region of concern; a contrast of the detected region of concern; and a contrast distribution of the detected region of concern.
- 19. A program product for causing a machine to analyze a plurality of views from a tomosynthesis system, said tomosynthesis system imaging an object including an edge portion partially extending from a surface of said object into an internal volume of said object, said program product causing the machine to perform the step of:
analyzing each acquired view, wherein the step of analyzing each acquired view includes analysis of said edge portion.
- 20. The program product of claim 19, further causing the machine to perform the step of:
reconstructing a plurality of reconstructed planes from said plurality of views.
- 21. The program product of claim 20, further causing the machine to perform the steps of:
detecting at least one region of concern in a reconstructed plane; and classifying said detected region of concern in the reconstructed plane.
- 22. The program product of claim 21, further causing the machine to perform the steps of:
correlating a detected region of concern in a first reconstructed plane of said plurality of reconstructed planes with at least one of:
a corresponding region in a second reconstructed plane of said plurality of reconstructed planes; and a corresponding region in at least one of said plurality of views; and weighting said classification with a weighting factor corresponding to a degree of correlation.
- 23. The program product of claim 19, further causing the machine to perform the step of reconstructing a three dimensional (3D) image of said object from said plurality of views.
- 24. The program product of claim 23, further causing the machine to perform the steps of:
detecting regions of concern in said reconstructed 3D image; and classifying said detected regions of concern in said reconstructed 3D image.
- 25. The program product of claim 24, further causing the machine to perform the steps of:
correlating a detected region of concern in said 3D image with a corresponding region in at least one view of said plurality of views; and weighting said classification with a weighting factor corresponding to a degree of correlation.
- 26. The program product of claim 19, further causing the machine to perform the steps of:
angularly displacing a radiation source about a pivot point to a position corresponding to said view; radiating said object via said radiation source; and detecting radiation from said object.
- 27. The program product of claim 26, wherein the total angular displacement of view acquisition is less than 360°.
- 28. The program product of claim 21, wherein said plurality of views are acquired with an x-ray modality, further causing the machine to perform the steps of:
correlating said detected region of concern with a corresponding region in an image acquired by one of an ultrasound modality and a nuclear medicine modality; and weighting said classification with a weighting factor corresponding to a degree of correlation.
- 29. A tissue imaging device, comprising:
a radiation source for emitting radiation through tissue to be imaged, said radiation source being angularly displaceable through a plurality of emission positions corresponding to a plurality of views; a detector positioned to detect radiation emitted through said tissue, said detector generating a signal representing a view of said tissue; and a processor electrically coupled to said detector for analyzing said signal, wherein said processor analyzes at least one of said plurality of views, said analysis including analysis of an edge portion of said tissue partially extending from a surface of said tissue into an internal volume of said tissue.
- 30. The tissue imaging device of claim 29, wherein said processor further:
reconstructs a plurality of reconstructed planes from said plurality of views.
- 31. The tissue imaging device of claim 30, wherein said processor further:
detects at least one region of concern in a reconstructed plane; and classifies said detected region of concern in the reconstructed plane.
- 32. The tissue imaging device of claim 31, wherein said processor further:
correlates a detected region of concern in a first reconstructed plane of said plurality of reconstructed planes with at least one of:
a corresponding region in a second reconstructed plane of said plurality of reconstructed planes; and a corresponding region in at least one of said plurality of views; and weights said classification with a weighting factor corresponding to a degree of correlation.
- 33. The tissue imaging device of claim 29, wherein said processor further:
reconstructs a three dimensional (3D) image of said tissue from said plurality of views.
- 34. The tissue imaging device of claim 33, wherein said processor:
detects regions of concern in said reconstructed 3D image; and classifies said detected regions of concern in said reconstructed 3D image.
- 35. The tissue imaging device of claim 34, wherein said processor further:
correlates a detected region of concern in said 3D image with a corresponding region in at least one view of said plurality of views; and weights said classification with a weighting factor corresponding to a degree of correlation.
- 36. The tissue imaging device of claim 29, wherein the radiation source is angularly displaceable through less than 360° about said tissue.
- 37. The tissue imaging device of claim 29, further comprising one of:
an ultrasound imager comprising:
an ultrasound source for emitting ultrasound through tissue to be imaged; and an ultrasound detector positioned to detect ultrasound emitted through said tissue to generate an ultrasound image; and a nuclear medicine detector to generate a nuclear medicine image, wherein said processor further:
detects a region of concern in said plurality of views; correlates said detected region of concern with a corresponding region in one of said ultrasound image and said nuclear medicine image; and weights said classification with a weighting factor corresponding to a degree of correlation.
- 38. A method of analyzing an object with a multi-modality imaging system, comprising the steps of:
detecting a region of concern in at least one of a first image of said object generated by a first modality and a second image of said object generated by a second modality; classifying said detected region of concern; correlating said region of concern with a corresponding region in the other of said first image and said second image; and weighting said classification with a weighting factor corresponding to a degree of correlation, wherein said first modality is different from said second modality.
- 39. The method of claim 38, wherein said first modality and said second modality each comprises one of x-ray, ultrasound, and nuclear medicine.
- 40. The method of claim 38, wherein at least one of said first image and said second image comprises a three dimensional (3D) reconstructed image of said object.
- 41. The method of claim 38, further comprising at least one of:
determining that a corresponding region does not exist; and comparing at least one of:
a shape of the detected region of concern; a size of the detected region of concern; a contrast of the detected region of concern; and a contrast distribution of the detected region of concern.
- 42. The method of claim 38, further comprising the step of:
fusing said first image with said second image.
- 43. The method of claim 38, further comprising the steps of:
registering said first image with said second image; and correcting differences in spatial resolution between said first image and said second image.
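The resolution correction of claim 43 can be illustrated by resampling the coarser modality's image onto the finer modality's grid so that corresponding regions can be compared pixel for pixel. A nearest-neighbour sketch (the integer upsampling factor is an assumption; a real system would use proper interpolation after registration):

```python
import numpy as np

def match_resolution(coarse, factor):
    """Nearest-neighbour upsampling of a coarse image by an integer
    factor along both axes, so two modality images share one grid."""
    return np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)
```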
- 44. An imaging system for imaging an object, comprising:
means for generating a first image of said object from x-ray radiation; means for generating a second image of said object from ultrasound; means for detecting a region of concern in at least one of said first image and said second image; means for correlating said detected region of concern with a corresponding region in the other of said first image and said second image; means for at least one of:
determining whether said detected region of concern is present in said corresponding region in said other of said first image and said second image; and comparing at least one of:
a shape of the detected region of concern; a size of the detected region of concern; a contrast of the detected region of concern; and a contrast distribution of the detected region of concern; means for classifying said detected region of concern; and means for weighting said classification in relation to a degree of correlation.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH & DEVELOPMENT
[0001] The government may have rights in this invention pursuant to Subcontract 22287 issued from the Office of Naval Research/Henry M. Jackson Foundation.