The present invention relates generally to analysis of medical imaging data, and, more particularly, to detecting poor quality in three-dimensional (3D) reconstructions in a biological cell imager.
3D tomographic reconstructions require projection images as input. A projection image assumes that the object of interest is translucent to the source of exposure, such as light transmitted through the object. The projection image then comprises the integration of absorption by the object along each ray from the source to the plane of projection. In optical projection tomography, light in the visible spectrum is used as the source of exposure.
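The line-integral idea behind a projection image can be sketched in a few lines of code. The following is a minimal numpy illustration using axis-aligned views of a toy 2D absorption map; a real system integrates along rays at arbitrary view angles, and nothing here reflects the actual instrument implementation.

```python
import numpy as np

def projection(absorption, axis):
    """Discrete analogue of a projection image: integrate (sum) the
    absorption of the object along parallel rays, here taken along a
    grid axis for simplicity."""
    return absorption.sum(axis=axis)

# A toy 2D "object": a dense rectangle inside an empty field.
obj = np.zeros((8, 8))
obj[3:5, 2:6] = 1.0

p0 = projection(obj, axis=0)   # rays travel down the columns (0 deg view)
p90 = projection(obj, axis=1)  # rays travel across the rows (90 deg view)
```

Each element of `p0` and `p90` is the total absorption encountered along one ray, which is the quantity a projection image records.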
In the case of producing projections from biological cells, the cells are typically stained with hematoxylin, an absorptive stain that attaches to proteins found in cell chromosomes. Cell nuclei are approximately 15 microns in diameter, and in order to promote reconstruction of sub-cellular features it is necessary to maintain sub-micron resolution. At sub-micron resolution, the wavelength of the illuminating source is in the same spatial range as the biological objects of interest, which can result in undesirable refraction effects. As a result, a standard projection image cannot be formed. To avoid these undesirable effects, the camera aperture is kept open while the plane of focus is swept through the cell. This approach to imaging samples the entire cellular volume equally, resulting in a pseudo-projection image. A good example of an optical tomography system has been published as United States Patent Application Publication 2004-0076319 on Apr. 22, 2004, corresponding to U.S. Pat. No. 7,738,945, issued Jun. 15, 2010, to Fauver et al. and entitled “Method and Apparatus for Pseudo-Projection Formation for Optical Tomography.” U.S. Pat. No. 7,738,945 is incorporated herein by reference.
An optical tomography system may advantageously employ scores for classifying objects of interest, for example, to detect lung cancer in its pre-invasive and treatable stage. In order to do so with accuracy and reliability, the classification scores must be based on good quality 3D reconstruction images of the objects being classified. One example of an optical tomography system, built by VisionGate, Inc. of Gig Harbor, Wash., assignee of this application, is marketed under the trademark “Cell-CT™.” In one aspect, the Cell-CT™ optical tomography system employs scores designed to provide an indication of lung cancer in its pre-invasive and treatable stage.
While it is generally understood that poor quality 3D reconstructions may adversely affect classification results in optical tomography systems, an automated system for detecting such poor quality 3D reconstructions has been lacking until now. The system and method disclosed herein provide, for the first time, a solution for detecting poor quality 3D reconstructions, useful in an optical tomography system, for example.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
A system and method for detecting poor quality images in an optical tomography system is presented. The system includes an acquisition means for acquiring a set of projection images of an object having a center of mass, where each of the set of projection images is acquired at a different angle of view. A reconstruction means is coupled to receive the projection images, for reconstruction of the projection images into 3D reconstruction images. A quality means for classification of the 3D reconstruction images uses selected features that characterize poor quality reconstructions.
While the novel features of the invention are set forth with particularity in the appended claims, the invention, both as to organization and content, will be better understood and appreciated, along with other objects and features thereof, from the following detailed description taken in conjunction with the drawings, in which:
The following disclosure describes several embodiments and systems for imaging an object of interest. Several features of methods and systems in accordance with example embodiments of the invention are set forth and described in the figures. It will be appreciated that methods and systems in accordance with other example embodiments of the invention can include additional procedures or features different from those shown in the figures.
Example embodiments are described herein with respect to biological cells. However, it will be understood that these examples are for the purpose of illustrating the principles of the invention, and that the invention is not so limited. Additionally, methods and systems in accordance with several example embodiments of the invention may not include all of the features shown in these figures. Throughout the figures, like reference numbers refer to similar or identical components or procedures.
Unless the context requires otherwise, throughout the specification and claims which follow, the word “comprise” and variations thereof, such as “comprises” and “comprising,” are to be construed in an open, inclusive sense, that is, as “including, but not limited to.”
Reference throughout this specification to “one example” or “an example embodiment,” “one embodiment,” “an embodiment” or various combinations of these terms means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Generally, as used herein, the following terms have the following meanings when used within the context of optical tomography processes:
As used in this specification, the terms “processor” and “computer processor” encompass a personal computer, a microcontroller, a microprocessor, a field programmable object array (FPOA), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA), a programmable logic array (PLA), or any other digital processing engine, device or equivalent including related memory devices, transmission devices, pointing devices, input/output devices, displays and equivalents.
Referring now to
In the example, the plurality of pseudo-projection images, here exemplified by pseudo-projection images 22A, 22B and 22C, are shown acquired at angles of 0°, 90° and 180°, respectively. It will be understood that these are merely examples and that the number of pseudo-projection images actually acquired will typically be several hundred. The reconstruction processor 35 may be of the type described in Fauver et al., referenced hereinabove. The quality score classifier may, for example, advantageously assign numerical scores to the reconstructed 3D images, where the numerical scores have values scaled to represent degrees of image quality. In other embodiments the quality score classifier may simply sort poor quality images from other images.
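The scoring behavior just described can be illustrated with a toy classifier. The sketch below maps a reconstruction's feature vector to a score scaled to the interval [0, 1] with a logistic function, and sorts poor quality images by thresholding that score. The feature values, weights, and threshold here are illustrative assumptions, not the classifier actually used in the system.

```python
import numpy as np

def quality_score(features, weights, bias=0.0):
    """Map a feature vector to a quality score in [0, 1] via a logistic
    function; higher scores represent better reconstruction quality.
    Weights and bias are hypothetical placeholders."""
    z = float(np.dot(features, weights)) + bias
    return 1.0 / (1.0 + np.exp(-z))

def is_poor_quality(features, weights, bias=0.0, threshold=0.5):
    """Binary sort: flag reconstructions whose score falls below threshold."""
    return quality_score(features, weights, bias) < threshold
```

A classifier of this form supports both modes mentioned in the text: the scaled numerical score itself, or a simple poor-versus-other sort.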
Having described the major components of an optical tomography system including a quality score classifier, it is now useful to an understanding of the invention to describe an example of the operation of such a system. Taken in substantially chronological order, an example of operation may include the following functions.
Among other things, good quality classification depends on good quality 3D reconstructions in step 6. Issues governing quality arise from detrimental effects that may be introduced by the operation of a given optical tomography system and characteristics relating to deficient correction of random cell motion occurring during image capture. If cells are not properly in focus in the set of pseudo-projections, or if the cell moves off the camera frame during capture, the resulting reconstruction will not be ideal. Likewise, if proper corrections for the random motions arising during image capture are not made, the various features of the cell will not reinforce one another in the reconstruction, thus compromising reconstruction quality.
Poor quality images may result in distorted reconstructions entering the classification stream, producing unpredictable results reflected in incorrect or distorted classification scoring. Therefore, poor quality reconstructions need to be detected to ensure the integrity of classification. A method for detecting poor quality reconstructions in cases where, for example, pseudo-projection images were not collected in an ideal way, where registration was not successful, or where other factors affected image quality is described in detail herein, sufficient for one skilled in the art to make and use the invention.
As described further herein, detection of poor quality reconstructions may be carried out by various methods, including poor quality detection based on features describing streaking in the reconstruction, poor quality detection based on a comparison between a fixed focal plane image and a reconstructed slice, poor quality detection using parameters of a cosine fit to center-of-mass trends, and the like. It has been observed that streaking may have various causes: image quality issues due to poor focus and random motions affecting cell alignment have similar streaking effects on reconstructions.
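One illustrative streak-sensitive feature can be sketched as follows. Streaking deposits intensity outside the true cell, so the mean absolute intensity of the background region tends to rise in poor reconstructions. This is a hedged example of the kind of feature the text refers to, not the patented feature set.

```python
import numpy as np

def background_streak_energy(slice_img, cell_mask):
    """Illustrative streak feature: mean absolute intensity of the
    region outside the segmented cell. slice_img is a 2D reconstruction
    slice; cell_mask is a boolean mask of voxels belonging to the cell.
    Higher values suggest streaking artifacts in the background."""
    background = slice_img[~cell_mask]
    return float(np.abs(background).mean())
```

A clean reconstruction has near-zero background energy, while a streaked one does not, giving the classifier a simple separating feature.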
Referring now to
Referring now to
In some circumstances, the correction algorithm does not converge to an appropriate solution and poor alignment is observed in the acquired set of corrected pseudo-projections that are used as input to the filtered-backprojection algorithm. As a result, cell morphology does not reinforce in the backprojection. The effect of poor alignment is similar to that of poor focus. Lack of good quality alignment produces streaking in the reconstruction.
Comparing
For some applications, segmentation development may be initiated with annotations of reconstructions made by hand-drawn cell boundaries. These boundaries serve as a reference to guide development. The resulting segmentation algorithm includes identification of a threshold selected for the particular cell under examination. In one example, threshold selection follows a procedure wherein a cell segmentation program first selects fifteen slices near the center of the reconstruction. For each slice, a range of thresholds is applied, and an area derivative and a second derivative are computed for each threshold. To select a threshold for each slice, a negative second derivative is located at a threshold higher than that of the maximum area derivative. A global threshold is chosen using a percentile of the selected slice thresholds. Finally, the largest object is kept, and any holes in it are filled using digital techniques.
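The per-slice threshold procedure above can be sketched in code. This is a minimal illustration of the described steps (threshold sweep, area derivative, second derivative, percentile-based global threshold); the slice count, threshold grid, and percentile are parameters the text leaves open, and are assumptions here.

```python
import numpy as np

def slice_threshold(slice_img, thresholds):
    """Sweep thresholds over one slice, track segmented area, and pick
    the first threshold above the peak of the area derivative where the
    second derivative turns negative. Falls back to the highest
    threshold if no such point is found."""
    areas = np.array([(slice_img >= t).sum() for t in thresholds], dtype=float)
    d1 = np.gradient(areas, thresholds)   # area derivative vs. threshold
    d2 = np.gradient(d1, thresholds)      # second derivative of area
    peak = int(np.argmax(np.abs(d1)))     # threshold of greatest area change
    for i in range(peak + 1, len(thresholds)):
        if d2[i] < 0:
            return float(thresholds[i])
    return float(thresholds[-1])

def global_threshold(slices, thresholds, percentile=50):
    """Global threshold: a percentile of the per-slice thresholds."""
    picks = [slice_threshold(s, thresholds) for s in slices]
    return float(np.percentile(picks, percentile))
```

In practice the slices passed to `global_threshold` would be the fifteen central reconstruction slices named in the text, and the final mask would then be cleaned by keeping the largest object and filling holes.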
Referring now to
Referring now to
Another technique for assessing reconstruction quality is to compare reconstruction slices with their corresponding fixed focal plane slices. So long as they are well focused, fixed focal plane slices should be free of whatever distortions were introduced into the reconstruction during image capture or processing. Therefore, these images form an excellent reference to judge reconstruction quality. Referring now to
Features derived to judge reconstruction quality are formed by creating a difference image between the fixed focus and reconstruction slice images. In contrast with the features of Table 1 above, difference image features are computed only for those voxels that are associated with the cell. A low average difference for the portion of the images containing the cell reflects good reconstruction quality.
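The difference-image feature just described can be sketched directly: compute the per-voxel difference between the fixed focal plane image and the corresponding reconstruction slice, restricted to voxels inside the cell. The use of the mean absolute difference is an assumption of this sketch; other summary statistics could serve the same purpose.

```python
import numpy as np

def cell_difference_feature(fixed_focus, recon_slice, cell_mask):
    """Mean absolute difference between a fixed focal plane image and
    the corresponding reconstruction slice, computed only over voxels
    associated with the cell. Low values indicate good reconstruction."""
    diff = np.abs(fixed_focus.astype(float) - recon_slice.astype(float))
    return float(diff[cell_mask].mean())
```

Because the fixed focal plane image is free of reconstruction-stage distortions, a large value of this feature flags a reconstruction whose content has drifted from the reference.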
Another useful method for detection of poor quality images employs parameters of cosine fitting to center of mass trends. As indicated by
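The cosine-fitting idea can be sketched as follows: under ideal rotation, the center of mass of the cell traces a sinusoid across view angles, so the residual of a least-squares cosine fit is one indicator of poor registration. The fit form and the use of the RMS residual are assumptions of this sketch; the specific fitting parameters used by the system are those described in the text.

```python
import numpy as np

def cosine_fit_residual(angles_deg, com_positions):
    """Fit a*cos(theta) + b*sin(theta) + c to the center-of-mass trend
    across view angles by linear least squares and return the RMS
    residual. A large residual suggests the trend is not sinusoidal,
    i.e. the pseudo-projections are poorly registered."""
    theta = np.radians(np.asarray(angles_deg, dtype=float))
    y = np.asarray(com_positions, dtype=float)
    A = np.column_stack([np.cos(theta), np.sin(theta), np.ones_like(theta)])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sqrt(np.mean((A @ coeffs - y) ** 2)))
```

A well-registered acquisition yields a residual near zero, while jitter in the center-of-mass trend produces a clearly elevated residual.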
Referring now to
With respect to the example of
With respect to the example of
Referring now to
While specific embodiments of the invention have been illustrated and described herein, it is realized that numerous modifications and changes will occur to those skilled in the art. It is therefore to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit and scope of the invention.
Number | Name | Date | Kind |
---|---|---|---
3657537 | Wheeless, Jr. | Apr 1972 | A |
3705771 | Friedman et al. | Dec 1972 | A |
3748468 | Hartman | Jul 1973 | A |
3999047 | Green | Dec 1976 | A |
4175860 | Bacus | Nov 1979 | A |
4360885 | Edgar | Nov 1982 | A |
4786165 | Yamamoto | Nov 1988 | A |
4858128 | Nowak et al. | Aug 1989 | A |
4873653 | Grosskopf | Oct 1989 | A |
5141609 | Sweedler | Aug 1992 | A |
5148502 | Tsujiuchi | Sep 1992 | A |
5159398 | Maekawa et al. | Oct 1992 | A |
5189518 | Nishida et al. | Feb 1993 | A |
5402460 | Johnson | Mar 1995 | A |
5539800 | Katsevich | Jul 1996 | A |
5550892 | Katsevich | Aug 1996 | A |
5552605 | Arata | Sep 1996 | A |
5644388 | Maekawa et al. | Jul 1997 | A |
5680484 | Ohyama | Oct 1997 | A |
5760901 | Hill | Jun 1998 | A |
5835617 | Ohta et al. | Nov 1998 | A |
5878103 | Sauer | Mar 1999 | A |
5909476 | Cheng et al. | Jun 1999 | A |
5915048 | Hill et al. | Jun 1999 | A |
5987158 | Meyer | Nov 1999 | A |
6026174 | Palcic | Feb 2000 | A |
6078681 | Silver | Jun 2000 | A |
6091983 | Alfano et al. | Jul 2000 | A |
6130958 | Rohler et al. | Oct 2000 | A |
6165734 | Garini | Dec 2000 | A |
6201628 | Basiji | Mar 2001 | B1 |
6211955 | Basiji | Apr 2001 | B1 |
6215587 | Alfano et al. | Apr 2001 | B1 |
6249341 | Basiji et al. | Jun 2001 | B1 |
6252979 | Lee | Jun 2001 | B1 |
6388809 | MacAulay | May 2002 | B1 |
6442235 | Koppe et al. | Aug 2002 | B2 |
6473176 | Basiji | Oct 2002 | B2 |
6512807 | Pohlman et al. | Jan 2003 | B1 |
6519355 | Nelson | Feb 2003 | B2 |
6522775 | Nelson et al. | Feb 2003 | B2 |
6542573 | Schomberg | Apr 2003 | B2 |
6591003 | Chu et al. | Jul 2003 | B2 |
6608682 | Ortyn et al. | Aug 2003 | B2 |
6636623 | Nelson et al. | Oct 2003 | B2 |
6697508 | Nelson | Feb 2004 | B2 |
6741730 | Rahn et al. | May 2004 | B2 |
6770893 | Nelson | Aug 2004 | B2 |
6823204 | Grass et al. | Nov 2004 | B2 |
6850587 | Karimi | Feb 2005 | B1 |
6944322 | Johnson et al. | Sep 2005 | B2 |
6975400 | Ortyn et al. | Dec 2005 | B2 |
7003143 | Hewitt | Feb 2006 | B1 |
7141773 | Kaplan et al. | Nov 2006 | B2 |
7197355 | Nelson | Mar 2007 | B2 |
7218393 | Sharpe et al. | May 2007 | B2 |
7224540 | Olmstead et al. | May 2007 | B2 |
7260253 | Rahn et al. | Aug 2007 | B2 |
7274809 | MacAulay et al. | Sep 2007 | B2 |
7440535 | Netsch et al. | Oct 2008 | B2 |
7505549 | Ohishi et al. | Mar 2009 | B2 |
7505551 | Grass et al. | Mar 2009 | B2 |
7539529 | Schmitt et al. | May 2009 | B2 |
7738945 | Fauver et al. | Jun 2010 | B2 |
7811825 | Fauver et al. | Oct 2010 | B2 |
7835561 | Meyer et al. | Nov 2010 | B2 |
20020122167 | Riley et al. | Sep 2002 | A1 |
20030199758 | Nelson | Oct 2003 | A1 |
20040076319 | Fauver et al. | Apr 2004 | A1 |
20040228520 | Dresser | Nov 2004 | A1 |
20050085721 | Fauver et al. | Apr 2005 | A1 |
20060023219 | Meyer et al. | Feb 2006 | A1 |
20060066837 | Ortyn et al. | Mar 2006 | A1 |
20060068371 | Ortyn et al. | Mar 2006 | A1 |
20060093200 | Sharpe et al. | May 2006 | A1 |
20060204071 | Ortyn et al. | Sep 2006 | A1 |
20070146873 | Ortyn et al. | Jun 2007 | A1 |
20070215528 | Hayenga et al. | Sep 2007 | A1 |
20070258122 | Chamgoulov et al. | Nov 2007 | A1 |
20080175455 | John et al. | Jul 2008 | A1 |
20080285827 | Meyer et al. | Nov 2008 | A1 |
20090103792 | Rahn et al. | Apr 2009 | A1 |
Number | Date | Country |
---|---|---
1704874 | Sep 2006 | EP |
Number | Date | Country
---|---|---
20100296713 A1 | Nov 2010 | US |