Claims
- 1. An imaging apparatus, comprising:
a motorized stage; a camera focussed relative to said motorized stage; and a processor coupled to said camera, wherein said processor contains instructions which, when executed by said processor, cause said processor to:
capture an image that is incident on said camera, wherein the image includes a plurality of pixels and the pixels have a characteristic; establish the characteristic for each pixel; and determine which pixels contain a target image based on the characteristic of the pixels.
- 2. The imaging apparatus of claim 1, wherein said characteristic includes pixel intensity.
- 3. The imaging apparatus of claim 1, wherein said characteristic includes pixel color.
- 4. The imaging apparatus of claim 1, wherein said determining which pixels contain a target image based on the relative intensity of the pixels includes:
determining a mean intensity of the pixels; comparing the intensity of each pixel to the mean intensity; and dividing the pixels into a group of non-target image pixels having high intensities and a group of target image pixels having an intensity lower than the high intensity pixels.
- 5. The imaging apparatus of claim 1, wherein said determining which pixels contain a target image based on the relative intensity of the pixels includes:
determining an intensity standard deviation for the pixels that provides an amount of variation in pixel intensity; comparing the intensity of each pixel to the intensity standard deviation; and dividing the pixels into a group of non-target image pixels having low standard deviations and a group of target image border pixels having a standard deviation that is greater than the standard deviation of the pixels having low standard deviations.
- 6. The imaging apparatus of claim 1, further comprising a pulsed light directed toward said motorized stage.
- 7. The imaging apparatus of claim 1, further comprising a stage position sensor adjacent said motorized stage.
- 8. An imaging apparatus, comprising:
a motorized stage; a camera having a lens directed toward said motorized stage; and a processor coupled to said camera, wherein said processor contains instructions which, when executed by said processor, cause said processor to:
select at least three points of a sample adjacent said motorized stage; determine stage position for each selected point; focus said camera on each selected point; determine object distance from the camera lens to the sample at each selected point; and develop a focus surface based on stage position and object distance for the at least three selected points.
- 9. The imaging apparatus of claim 8, wherein said selecting at least three points of a sample adjacent the motorized stage includes selecting points dependent on a characteristic of the image of those points.
- 10. The imaging apparatus of claim 9, wherein selecting points dependent on a characteristic of the image of those points includes selecting the darkest regions.
- 11. The imaging apparatus of claim 9, wherein selecting points dependent on a characteristic of the image of those regions includes selecting the lightest regions.
- 12. The imaging apparatus of claim 9, wherein selecting points dependent on a characteristic of the image of those regions includes selecting points having a high contrast relative to the regions.
- 13. The imaging apparatus of claim 8, wherein said selecting at least three points of a sample adjacent the motorized stage includes:
determining a distribution of the at least three selected points within the sample; determining whether at least one selected point lies within each of at least two predetermined areas; and selecting additional points until at least one point lies within each predetermined area.
- 14. The imaging apparatus of claim 13, further comprising:
determining for a selected point whether an error equal to the difference between the object distance at the selected point and the object distance of the surface at that point is greater than a predetermined error limit; deleting the point from the selection if the error is greater than the predetermined error limit; and developing a focus surface based on stage position and object distance for the remaining selected points.
- 15. The imaging apparatus of claim 13, further comprising:
determining for a selected point whether an error equal to the difference between the object distance at the selected point and the object distance of the surface at that point is greater than a predetermined error limit; selecting an additional point from the area containing the point whose error is greater than the predetermined error limit; determining stage position for the additional selected point; focusing said camera on the additional selected point; determining object distance from the camera lens to the sample at the additional selected point; and developing a focus surface based on stage position and object distance for the selected points.
- 16. An imaging apparatus, comprising:
a motorized stage; a camera focussed relative to said motorized stage; a stage position sensor adjacent said motorized stage; and a pulsed light directed toward said motorized stage and coupled to said stage position sensor such that said pulsed light illuminates in response to said stage position sensor.
- 17. The imaging apparatus of claim 16, further comprising a processor coupled to said camera, said pulsed light and said stage position sensor, wherein said processor contains instructions which, when executed, cause said processor to:
initiate motion of the motorized stage; energize the pulsed light when the stage position sensor indicates the motorized stage is in a predetermined position; and capture an image by way of the camera while the pulsed light is energized.
- 18. The imaging apparatus of claim 16, wherein the motorized stage moves continuously while capturing images.
- 19. A system for creating a high throughput montage image of microscope slides, the system comprising:
an optical system that comprises at least one camera, a motorized stage for moving a slide while an image of the slide is captured, a pulsed light illumination system that optically stops motion on the motorized stage while allowing continuous physical movement of the motorized stage, and a stage position detector that controls firing of the pulsed light illumination system at predetermined positions of the motorized stage; a plurality of first components that identifies tissue regions on the slide in the optical system and determines locations of tissue on the slide, wherein the plurality of first components use information about the locations to generate control parameters for the motorized stage and the camera; a plurality of second components that use the control parameters to ensure that a high-quality montage image is captured; and means for capturing a high-quality montage image, by the plurality of second components, by maintaining motion of the motorized stage and synchronization of the optical system, thereby enabling accurate focus control of optical elements without requiring the stage to be stopped and refocused at each tile location.
- 20. The system of claim 19, wherein the optical system is a bright field microscope.
- 21. The system of claim 19, wherein the pulsed light illumination system is a standard strobe light, and wherein stage location is determined by the stage position detector, the stage location executes the strobe light and the system does not depend on uniform motion of the motorized stage over an imaged area to execute the strobe light.
- 22. The system of claim 21, wherein the camera is free running and the motorized stage speed is matched to the camera's frame rate to an extent that prevents execution of the pulsed light illumination system from falling outside of an exposure window.
- 23. The system of claim 19, wherein the pulsed light illumination system is any pulsed light source.
- 24. The system of claim 19, wherein the stage position detector is a Ronchi ruler that is attached to the motorized stage, wherein the Ronchi ruler is a pattern of alternating light and dark bands that are equally spaced along a substrate.
- 25. The system of claim 24, wherein the stage position detector utilizes a light sensor that is mechanically isolated from the Ronchi ruler, wherein as the ruler passes under the sensor, a series of electronic pulses that correspond to the alternating light and dark bands of the Ronchi ruler is generated and the series of electronic pulses is used to monitor the position and direction of the motorized stage.
- 26. The system of claim 19, wherein the position system uses only position information from a stage controller without an external feedback sensor.
- 27. The system of claim 19, wherein the pulsed light illumination system is fired whenever the stage position detector determines that the motorized stage has moved into a neighboring field of view of the camera.
- 28. The system of claim 19, wherein signals from the stage position detector represent motions of the motorized stage, and wherein timing of the signals varies depending on speeds of the motorized stage.
- 29. The system of claim 19, wherein an absolute speed of the motorized stage is not relevant.
- 30. The system of claim 21, wherein a stage position as determined by the stage position detector executes the camera and the pulsed light illumination system, wherein the camera is not free running and each camera frame corresponds to an equally spaced positional change that is independent of a stage velocity.
- 31. The system of claim 21, wherein the plurality of first components include an image cropping component for identifying tissue regions on the slide to be scanned, wherein the image cropping component:
determines an approximate location of a slide boundary by searching upper and lower intervals corresponding to regions expected to contain upper and lower edges of the slide; re-examines the approximate location to determine a more accurate location; and removes portions of the image falling outside of the determined slide boundary.
- 32. The system of claim 31, wherein the image cropping component converts a copy of the thumbnail image to a grayscale image.
- 33. The system of claim 31, wherein the image cropping component identifies pixel blocks that are likely to contain remaining boundary edges and flags the blocks as edges that should not be considered for high-resolution imaging.
- 34. The system of claim 19, wherein the plurality of first components include a tissue finding component that locates regions in the thumbnail image that contain tissue of interest to a specialist.
- 35. The system of claim 34, wherein a cropped image is inputted into the tissue finding component from an image cropping component, wherein the tissue finding component identifies tissue regions by applying a sequence of filters that incorporate knowledge of typical appearance and location of tissue and non-tissue slide regions and outputs a tiling matrix having values that indicate which tiles should be imaged.
- 36. The system of claim 35, wherein a first filter analyzes mean pixel intensity to generate a threshold value for making an initial classification of tissue versus non-tissue regions.
- 37. The system of claim 35, wherein a first filter analyzes a difference between pixel intensities to generate a threshold value for making an initial classification of tissue versus non-tissue regions.
- 38. The system of claim 35, wherein a first filter analyzes mean and standard deviation of local pixel intensities and combines the mean and the standard deviation to generate a threshold value for making an initial classification of tissue versus non-tissue regions.
- 39. The system of claim 38, wherein the intensities are used to differentiate tissue-containing regions from blank regions and other non-tissue containing regions and the standard deviation represents the amount of variation in pixel values and is therefore a good indicator of the border between tissue and the blank slide.
- 40. The system of claim 35, wherein morphological filters are applied to an array representing selected and unselected regions which in turn represent tissue and non-tissue regions to refine classification based on size and position of neighboring groups of potential tissue pixels, wherein the morphological filters process pixels of the cropped image in groups that correspond to slide regions that can be imaged individually during a high-resolution scanning process.
- 41. The system of claim 40, wherein the morphological filters ensure that tiles that are partially filled with tissue are classified as tissue-containing tiles.
- 42. The system of claim 35, wherein other image characteristics can be used to distinguish tissue from items that are not of interest.
- 43. The system of claim 19, wherein the plurality of first components include a scan control component that interprets a tile matrix, outputted by a tissue finding component, and transposes positions of the tile matrix into actual stage coordinates for microscopic imaging.
- 44. The system of claim 19, wherein the plurality of second components comprise:
a focus point selection component that evaluates tissue regions of a thumbnail image and selects several points to initially focus microscope optics on a point-by-point basis; a focal surface determination component that uses focus positions to generate a three-dimensional data set corresponding to optical specimen distance at each stage location, wherein data points in the data set are used as input to a routine that generates control parameters for a slide scanning process; and a scan component that captures the high-quality montage image by maintaining motion of a stage and synchronization of a microscopic imaging system during montage image acquisition, thereby enabling accurate focus control of optical elements without requiring stopping and refocusing of the stage at each tile location and substantially reducing montage acquisition time.
- 45. The system of claim 44, wherein the focus point selection component selects positions based on their relative contrast within the thumbnail image and their overall distribution with respect to a tissue coverage area.
- 46. The system of claim 44, wherein focus points are definable through an input file.
- 47. The system of claim 44, wherein focus points are definable through a user interface.
- 48. The system of claim 44, wherein focus points are predefined and repeated for each slide, without any tissue-finding preprocessing, when specimen locations are reproducible on the slides.
- 49. The system of claim 44, wherein the number of data points required depends on the actual three-dimensional structure defined by the specimen and the slide and the geometrical dimension of the surface to be fit.
- 50. The system of claim 49, wherein once the surface is determined, an error function is calculated to determine a fit accuracy, wherein if the accuracy exceeds expected limits additional points can be acquired and the surface is recalculated.
- 51. The system of claim 44, wherein the specimen is positioned such that each camera image is aligned within the equivalent single pixel spacing.
- 52. The system of claim 44, wherein to maintain focus during the scanning process, the stage is positioned at a proper focal position as determined by the focus surface parameters.
- 53. A method for creating a high throughput montage image of microscope slides, the method comprising:
placing a slide to be imaged in a slide holder on a motorized stage; pre-scanning a thumbnail image of the slide to identify tissue locations on the slide; generating necessary control parameters to scan only the regions of interest under microscopic optics; capturing a high-quality montage image by enabling accurate focus control of optical elements without requiring that the motorized stage be stopped and refocused at each tile location in the montage image; controlling a tiling process by moving the motorized stage; capturing image tiles with precise alignment by executing a strobe illumination system whenever a stage position sensor determines that the motorized stage has moved to a neighboring field of view of a camera; scanning each row of locations identified to contain tissue; and removing the slide and inserting another slide to be imaged.
- 54. The method of claim 53, wherein pre-scanning further comprises:
flat-field correcting the thumbnail image using a blank slide and a similar image obtained from a camera that captured the thumbnail image; cropping the thumbnail image by an image cropping component; inputting a cropped image into a tissue finding component, wherein the tissue finding component identifies tissue regions by applying a sequence of filters that incorporate knowledge of typical appearance and location of tissue and non-tissue slide regions and outputs a tiling matrix whose values indicate which tiles should be imaged; and interpreting the tiling matrix, by a scan control component, and transposing positions of the tiling matrix into actual stage coordinates for microscopic imaging.
- 55. The method of claim 53, wherein generating further comprises:
evaluating tissue regions of the thumbnail image and selecting several focus positions to initially focus microscope optics on a point-by-point basis; placing each focus position under the microscope optics, wherein the best-fit focus at each position is determined; generating a three-dimensional data set corresponding to an optimal specimen distance at each stage location, wherein data points in the data set are used as input to a routine that generates necessary control parameters for a slide scanning process; and passing the tissue information and control parameters to a component that is responsible for the motion of a stage and synchronization of a microscopic imaging system during a montage image acquisition.
- 56. The method of claim 55, further comprising determining a surface on the slide that represents a focal position of a tissue sample and using information from the surface to automatically control focus as the tissue sample is scanned under microscope optics.
- 57. The method of claim 55, wherein evaluating further comprises selecting positions based on their relative contrast within the thumbnail image and their overall distribution with respect to a tissue coverage area.
- 58. The method of claim 55, wherein evaluating further comprises allowing a user to define focus points.
- 59. The method of claim 55, wherein, in said generating, the number of data points required depends on the actual three-dimensional structure defined by the specimen and the slide and the geometrical dimension of the surface to be fit, wherein an error function is calculated to determine a fit accuracy once the surface has been determined, and if the accuracy exceeds expected limits additional points can be acquired and the surface is recalculated.
- 60. The method of claim 55, further comprising positioning the specimen such that each camera image is aligned within the equivalent single pixel spacing.
- 61. The method of claim 55, further comprising positioning the stage at a proper focal position as determined by focus surface parameters.
- 62. The method of claim 55, further comprising computing a vertical velocity, as a function of a parameter and subsequently of time, from the derivative of the focal surface if imaging is accomplished during continuous motion of the stage in an x-direction via an imaging arrangement.
- 63. The method of claim 62, further comprising using the velocity to control the optical position and maintain focus as images are acquired continuously across a row.
- 64. The method of claim 62, further comprising requiring a new velocity function for each row scanned based on a stepped y-position.
- 65. The method of claim 54, wherein cropping further comprises the steps of:
determining an approximate location of a slide boundary by searching upper and lower intervals corresponding to regions expected to contain upper and lower edges of the slide; re-examining the approximate location to determine a more accurate location; and removing portions of the image falling outside of the determined slide boundary.
- 66. The method of claim 65, further comprising converting a copy of the thumbnail image to a grayscale image.
- 67. The method of claim 66, further comprising identifying pixel blocks that are likely to contain remaining boundary edges and flagging these blocks as edges that should not be considered for high resolution imaging.
- 68. The method of claim 65, further comprising analyzing mean and standard deviation of local pixel intensities and combining the mean and the standard deviation to generate a threshold value.
- 69. The method of claim 68, further comprising using the intensities to differentiate tissue-containing regions from blank regions and other non-tissue containing regions.
- 70. The method of claim 65, further comprising applying morphological filters to threshold standard deviation data to refine classification based on size and position of neighboring groups of potential tissue pixels, whereby the morphological filters process pixels of the cropped image in groups that correspond to slide regions that can be imaged individually during a high-resolution scanning process.
- 71. The method of claim 53, wherein controlling comprises attaching a Ronchi ruler to the motorized stage.
- 72. The method of claim 71, wherein controlling further comprises utilizing a light sensor that is mechanically isolated from the Ronchi ruler, whereby as the Ronchi ruler passes under the light sensor a series of electronic pulses that correspond to alternating light and dark bands of the Ronchi ruler is generated.
- 73. The method of claim 53, wherein capturing further comprises capturing images of tiles with precise alignment until a row is finished.
- 74. The method of claim 53, wherein capturing further comprises capturing images of tiles with precise alignment until a controlling program tells the system to stop.
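Claims 24-27 and 71-72 describe firing the pulsed illumination whenever the stage position detector (a Ronchi ruler read by a mechanically isolated light sensor) indicates the stage has moved into the neighboring field of view, independent of stage speed. A minimal sketch of that position-keyed triggering; the `fire_strobe` callback and the +1/-1 direction-aware edge counts are assumptions, not taken from the claims:

```python
def strobe_controller(edge_stream, band_pitch_um, fov_width_um, fire_strobe):
    """Fire the pulsed illumination once per field of view of stage travel.

    edge_stream yields +1 or -1 for each light/dark band edge detected over
    the Ronchi ruler (sign gives stage direction).  Because firing is keyed
    to counted position rather than time, it does not depend on uniform or
    absolute stage speed.
    """
    edges_per_fov = max(1, round(fov_width_um / band_pitch_um))
    travelled = 0
    for edge in edge_stream:
        travelled += edge
        if abs(travelled) >= edges_per_fov:
            fire_strobe()          # stage has entered the neighboring field of view
            travelled = 0
```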
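Claims 31 and 65 describe locating the slide boundary by searching only the upper and lower intervals expected to contain the slide edges, then discarding everything outside the boundary. A minimal sketch assuming NumPy, a grayscale thumbnail array, and an assumed search-band fraction (`edge_band`) that the claims do not specify:

```python
import numpy as np

def crop_to_slide(gray, edge_band=0.15):
    """Crop a grayscale thumbnail to the detected upper and lower slide edges.

    Only the top and bottom `edge_band` fraction of rows is searched for the
    strongest row-to-row brightness change, taken as the approximate edge
    location; rows outside the boundary are removed.
    """
    h = gray.shape[0]
    band = max(1, int(h * edge_band))
    row_profile = gray.mean(axis=1)              # mean intensity of each row
    gradient = np.abs(np.diff(row_profile))      # brightness change between rows
    top = int(gradient[:band].argmax()) + 1      # strongest edge in the upper interval
    bottom = (h - 1 - band) + int(gradient[-band:].argmax())
    return gray[top:bottom + 1, :]
```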
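Claims 36-39 and 68-69 describe combining the mean and standard deviation of local pixel intensities into a threshold for an initial tissue / non-tissue classification, with the mean separating tissue from the blank slide and the standard deviation marking tissue borders. A minimal sketch assuming NumPy; the weighting factor `k` and the exact combination rule are assumptions, not taken from the claims:

```python
import numpy as np

def initial_tissue_classification(gray, tile_h, tile_w, k=1.5):
    """Return a boolean tiling matrix: True where a tile likely contains tissue.

    Each tile is compared against a global threshold built from the image mean
    and standard deviation: darker-than-background tiles (tissue) or
    high-variation tiles (tissue / blank-slide borders) are selected.
    """
    rows, cols = gray.shape[0] // tile_h, gray.shape[1] // tile_w
    g_mean, g_std = gray.mean(), gray.std()
    threshold = g_mean - k * g_std                    # combined mean/std threshold
    tiling_matrix = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            tile = gray[r * tile_h:(r + 1) * tile_h,
                        c * tile_w:(c + 1) * tile_w]
            tiling_matrix[r, c] = tile.mean() < threshold or tile.std() > k * g_std
    return tiling_matrix
```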
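Claims 40-41 and 70 describe applying morphological filters to the thresholded tissue map so the classification reflects the size and position of neighboring groups of tissue tiles, and so tiles only partially filled with tissue are still imaged. A minimal sketch using SciPy's binary morphology; the particular choice of closing followed by dilation is an assumption:

```python
from scipy.ndimage import binary_closing, binary_dilation

def refine_tiling_matrix(tiling_matrix):
    """Morphologically refine a boolean tissue / non-tissue tiling matrix."""
    closed = binary_closing(tiling_matrix)   # fill small gaps inside tissue regions
    grown = binary_dilation(closed)          # include neighboring tiles so partially
                                             # filled border tiles are kept as tissue
    return grown
```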
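Claims 43 and 54 describe a scan control component that interprets the tiling matrix and transposes tile positions into actual stage coordinates for microscopic imaging. A minimal sketch; the stage origin and field-of-view dimensions are assumed calibration parameters:

```python
import numpy as np

def tiles_to_stage_coordinates(tiling_matrix, origin_x_um, origin_y_um,
                               fov_w_um, fov_h_um):
    """Translate selected (row, col) tile indices into stage coordinates in microns."""
    coordinates = []
    for (r, c), selected in np.ndenumerate(tiling_matrix):
        if selected:
            coordinates.append((origin_x_um + c * fov_w_um,
                                origin_y_um + r * fov_h_um))
    return coordinates
```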
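Claims 8, 14-15, 49-50 and 59 describe fitting a focus surface to at least three (stage position, object distance) points, evaluating an error for each point against the fitted surface, and removing or re-acquiring points whose error exceeds a predetermined limit before refitting. A minimal sketch assuming NumPy and a planar surface (a higher-order surface would need more points); here poorly fitting points are simply dropped and the surface refit, as in claim 14:

```python
import numpy as np

def fit_focus_surface(points, error_limit):
    """Fit z = a*x + b*y + c to focus points and drop points that fit poorly.

    `points` is an iterable of (x, y, z): stage position (x, y) and the measured
    object distance z at best focus.  While at least three points remain, the
    worst-fitting point is removed whenever its residual exceeds `error_limit`
    and the plane is refit.
    """
    pts = np.asarray(list(points), dtype=float)
    while True:
        A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
        coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
        errors = np.abs(A @ coeffs - pts[:, 2])      # error function per point
        worst = int(errors.argmax())
        if errors[worst] <= error_limit or len(pts) <= 3:
            return coeffs, errors                    # (a, b, c) and per-point errors
        pts = np.delete(pts, worst, axis=0)          # drop the outlier and refit
```

Once the coefficients are known, the focal position at any stage location (x, y) follows directly from the fitted surface, which is how the stage can be held at the proper focal position during scanning (claims 52 and 61).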
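Claims 62-64 describe computing a vertical velocity from the derivative of the focal surface when imaging during continuous x motion, using it to hold focus across a row, and recomputing it for each stepped y position. A minimal worked sketch for the planar surface assumed above; for a higher-order surface dz/dt would vary with x rather than being constant along the row:

```python
def row_focus_profile(a, b, c, y_row_um, stage_speed_x_um_s):
    """Focal height at the row start and vertical velocity for one scanned row.

    For a planar focal surface z(x, y) = a*x + b*y + c, scanning a row at fixed
    y with dx/dt = stage_speed_x gives dz/dt = a * dx/dt, so the optics can be
    driven at a constant vertical velocity across the row; a new pair is
    computed for each stepped y position.
    """
    z_at_row_start = b * y_row_um + c        # focal height at x = 0 for this row
    dz_dt = a * stage_speed_x_um_s           # time derivative of the focal surface
    return z_at_row_start, dz_dt
```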
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application is a continuation-in-part of U.S. patent application Ser. No. 09/757,703, filed Jan. 11, 2001, U.S. patent application Ser. No. 09/758,037, filed Jan. 11, 2001, and U.S. patent application Ser. No. 09/788,666, filed Feb. 21, 2001, all of which are currently pending and assigned to the assignee of the present invention.
Continuation in Parts (3)

| | Number | Date | Country |
|---|---|---|---|
| Parent | 09/757,703 | Jan 2001 | US |
| Child | 09/919,452 | Jul 2001 | US |
| Parent | 09/758,037 | Jan 2001 | US |
| Child | 09/919,452 | Jul 2001 | US |
| Parent | 09/788,666 | Feb 2001 | US |
| Child | 09/919,452 | Jul 2001 | US |