The present invention relates to an image pickup apparatus, such as a digital microscope, which obtains an image of an object.
In recent years, attention has been given to image pickup apparatuses that acquire, as computerized images, outer shape information from the entire sample and details of cellular tissues and display the computerized images on a monitor for observation.
This type of image pickup apparatus is characterized in that the size of an object is large (several millimeters to several tens of millimeters) in contrast to the resolution (<1 μm) of an objective lens necessary to observe the object. Accordingly, to form an image with a high resolution and a wide field of view, it is necessary to obtain one whole image by combining images of different portions of the object taken by an objective lens that has a narrow field of view, but has a high resolution.
However, when defocus is measured and focusing is performed for each portion of the object, it takes much time to obtain one whole image. Accordingly, PTL 1 discloses that focusing is performed at three or more points on a slide glass, which holds a sample (object), to obtain the tilt of the slide glass and that focal positions at points other than the three or more points are estimated by calculation. PTL 2 discloses that an area where a sample exists is obtained beforehand, focal positions at three reference points in the area are measured, and a focal position at an arbitrary position is calculated from a plane equation including the three points.
In PTL 1 and PTL 2, the plane equation including the three points on the surface of the object is obtained from the focal positions at the three points. However, the surface of an actual sample is not always flat. For this reason, an image obtained by the methods described in PTL 1 and PTL 2 may be blurred because the focal position at an arbitrary position is greatly displaced from the actual focal position, or more time may be required because focusing must be performed again.
An image pickup apparatus according to an aspect of the present invention includes: a measuring section configured to measure a surface shape of an object; an image pickup section configured to obtain images of different areas of the object formed on an image plane of an image pickup optical system by a plurality of image pickup elements; a focal-position detecting unit configured to detect a focal position of the object where a focal-position detecting point of the object is focused on the image plane; and a focal-position determining unit configured to determine a focal position of the object at a point different from the focal-position detecting point of the object on the basis of a detection result of the focal-position detecting unit and a measurement result of the measuring section. The image pickup section takes the images of the different areas of the object on the basis of a determination result of the focal-position determining unit in a state in which the images are focused on the plurality of image pickup elements.
Image pickup apparatuses according to embodiments of the present invention will be described below.
The main image pickup system 10 includes an illumination optical system 100 that guides light from a light source unit 110 to an irradiated surface on which a sample 225 is placed, an image pickup optical system 300 that forms an image of the sample 225, and an image pickup element unit 400 in which a plurality of image pickup elements 430 are arranged on an image plane of the image pickup optical system 300. The measuring optical system 20 includes a position measuring device 510 that measures the position of a sample stage 210, a light source 520 that illuminates the sample 225, a half mirror 530, a camera 540 that measures the position of the sample 225, and a camera sensor 550 that measures the surface shape of the sample 225. For example, the sample 225 is placed between a slide glass and a cover glass (the glasses are not illustrated; sometimes the cover glass is not used) to form a prepared slide 220. The prepared slide 220 is placed on the sample stage 210, and is conveyed between the main image pickup system 10 and the measuring optical system 20 by the sample stage 210.
Hereinafter, the optical axis of the image pickup optical system 300 is referred to as a Z-direction, and a plane perpendicular to the optical axis of the image pickup optical system 300 is referred to as an XY-plane.
These structures will be described in detail along the procedure described below.
First, the sample 225 is placed at a position where the sample 225 can be measured with the measuring optical system 20 (Step 101).
Then, the measuring optical system 20 measures a size, an image pickup area, an image pickup position (sample reference point), and a surface shape of the sample 225 (Step 102).
The camera 540 takes an image of the sample 225 by using light applied from the light source 520 via the half mirror 530 and transmitted through the sample 225, in order to recognize the position of the sample 225 on the sample stage 210. The size, image pickup area, image pickup position, etc. of the sample 225 are thereby measured. The camera sensor 550 is a Shack-Hartmann wavefront sensor, and measures the surface shape of the sample 225. In general, when the cover glass is placed on the sample 225, the surface shape of the sample 225 follows the surface shape of the cover glass. For this reason, when the cover glass is placed on the sample 225, the surface shape of the cover glass may be measured as the surface shape of the sample 225.
The sample stage 210 can change the position of the prepared slide 220 in the Z-, X-, and Y-directions and can tilt the prepared slide 220 with respect to the Z-direction, and is driven so that the sample 225 coincides with the irradiated surface.
The stage drive amount is calculated from positional relationship data, obtained beforehand during assembly of the apparatus, among three points, namely, the stage position (measured by the position measuring device 510), the image coordinates, and a reference position in the main image pickup system (main reference point), so that the main reference point and the sample reference point BP0 coincide with each other.
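As an illustration only, the following is a minimal sketch of such a stage-drive calculation. It assumes a simple affine calibration between camera-image coordinates and stage coordinates; the function and variable names, and the calibration model itself, are assumptions and not taken from the specification.

```python
# Minimal sketch (not the patented implementation): compute a stage drive
# amount so that the sample reference point BP0 coincides with the main
# reference point. The affine-calibration model and all names are assumptions.
import numpy as np

def stage_drive_amount(bp0_image_xy, main_ref_stage_xy, image_to_stage):
    """Return the XY stage displacement that brings BP0 onto the main reference point.

    bp0_image_xy      : (x, y) of BP0 in camera-image coordinates
    main_ref_stage_xy : (x, y) of the main reference point in stage coordinates
    image_to_stage    : 2x3 affine matrix mapping image coordinates to stage
                        coordinates, calibrated beforehand during assembly
    """
    x, y = bp0_image_xy
    bp0_stage = image_to_stage @ np.array([x, y, 1.0])  # BP0 in stage coordinates
    return np.asarray(main_ref_stage_xy) - bp0_stage    # drive amount (dx, dy)
```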
In this way, the image pickup area 400a in the main image pickup operation, the surface shape of the sample, and the position of the sample (sample reference point BP0) are determined.
Next, a description will be given of a method for measuring the surface shape of the sample 225 or the cover glass with the camera sensor 550. As described above, the camera sensor 550 is a Shack-Hartmann wavefront sensor, and includes an image pickup element 551 and a microlens array 552.
When the surface of the sample 225 or the cover glass is ideally flat, the imaging points formed by the microlens array 552 are regularly arranged on the image pickup element 551. When the surface has undulation, the imaging points are displaced from these regular positions, and the surface shape can be obtained from the local slopes derived from the displacements.
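As a rough illustration of this principle, the sketch below converts per-lenslet spot displacements into a height map. The slope-from-displacement relation, the simple cumulative-sum integration, and all names are simplifying assumptions, not the method of the specification.

```python
# Minimal sketch, under simplifying assumptions, of turning Shack-Hartmann spot
# displacements into a surface-shape (height) map: each lenslet's spot shift
# gives a local slope, and the slopes are integrated over the lenslet grid.
import numpy as np

def surface_from_spot_shifts(dx, dy, focal_length, pitch):
    """dx, dy        : 2-D arrays of spot displacements per lenslet [m]
       focal_length  : lenslet focal length [m]
       pitch         : lenslet pitch, i.e. sample spacing on the surface [m]
       returns an approximate height map [m]"""
    slope_x = dx / focal_length            # local slope along x
    slope_y = dy / focal_length            # local slope along y
    # crude path integration: cumulative sums along each axis, then averaged
    hx = np.cumsum(slope_x, axis=1) * pitch
    hy = np.cumsum(slope_y, axis=0) * pitch
    height = 0.5 * (hx + hy)
    return height - height.flat[0]         # heights relative to a reference point
```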
Similarly to the camera sample reference point BP0, the sensor sample reference point BP1 is determined so that the image pickup area in the main image pickup system 10 coincides with the image pickup area 400a determined by the measuring optical system 20. That is, the sensor sample reference point BP1 is determined at a position corresponding to the camera sample reference point BP0 in the image pickup area 400a. For this reason, the sensor sample reference point BP1 is uniquely determined by determining the camera sample reference point BP0.
Here, the coordinates of the sensor sample reference point BP1 are taken as (a1, b1), and the surface shape data is expressed with reference to this point.
Next, to take an image of the sample 225, the sample stage 210 is driven so that the camera sample reference point BP0 coincides with the main reference point (Step 103).
Referring again to the main image pickup system 10, the image pickup element unit 400 will now be described.
The image pickup element unit 400 includes an image pickup stage 410, an electric circuit board 420, image pickup elements 430, and a focus sensor 440.
For example, the focus sensor 440 may be a two-dimensional image pickup element that can process the contrast of an image of a uniformly illuminated sample at high speed, or may be formed by a plurality of actinometers that determine the focal position from the light quantity. Here, a description will be given of a structure of the focus sensor 440 for acquiring focal-position information and of a focal-position acquisition method adopted when a plurality of actinometers are used.
When acquiring the focal-position information, reliability can be enhanced by using dark-field illumination so that only light scattered by the sample 225 is obtained. For example, only scattered light from the sample 225 can be acquired by setting the numerical aperture NA of the illumination optical system 100 to be larger than the numerical aperture NA of the image pickup optical system 300 so that illumination light does not enter the image pickup optical system 300.
Further, any of a plurality of image pickup elements 430 may be selected as a focus sensor instead of using the sensor only for focusing, a specific pixel in the selected image pickup element may be set as a main reference point, and focusing may be performed by using the above-described method.
By the above-described structure and method, a focal position is determined by the focus sensor 440.
A focal position of the sample 225 at the camera sample reference point BP0 is found with the focus sensor 440 while moving the sample stage 210 in the Z-direction (Step 104).
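A minimal sketch of such a focus search follows. The stage and sensor interfaces (`stage.move_z`, `focus_sensor.read`) are hypothetical, and the variance-based focus metric is an assumed stand-in for whatever contrast or light-quantity evaluation the focus sensor 440 actually performs.

```python
# Minimal sketch (assumed, not from the specification) of finding the focal
# position at BP0 by scanning the sample stage in Z and maximizing a focus
# metric computed from the focus sensor 440.
import numpy as np

def focus_metric(image):
    """Simple contrast metric: variance of the image intensities."""
    return float(np.var(image))

def find_focal_position(stage, focus_sensor, z_start, z_stop, step):
    best_z, best_score = z_start, -np.inf
    for z in np.arange(z_start, z_stop, step):
        stage.move_z(z)                          # drive the sample stage in the Z-direction
        score = focus_metric(focus_sensor.read())
        if score > best_score:
            best_z, best_score = z, score
    stage.move_z(best_z)                         # return to the best-focus position
    return best_z
```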
Here, the sample 225 is placed so that the camera sample reference point BP0 and the focus sensor 440 have a conjugate positional relationship with the image pickup optical system 300. An image of the sample 225 is sometimes taken with focus not only on the surface of the sample 225 but also on the inside of the sample 225. Hence, the focal-position detecting point can be set not only on the surface of the sample 225 but also inside the sample 225.
After focus is obtained at the camera sample reference point BP0, the surface shape data obtained by the measuring optical system 20 is applied to the entire sample 225 (Step 105).
First, the camera sample reference point BP0 and the main reference point are caused to have a focus relationship between an object point and an image point in the image pickup optical system 300. In portions other than the camera sample reference point BP0, focal positions are determined by the processing unit 610 serving as the focal-position determining unit on the basis of the detection result of the focus sensor 440 and the surface shape data obtained beforehand. At this time, when the sensor sample reference point BP1 is set at a position corresponding to the camera sample reference point BP0 in the image pickup area 400a, the surface shape data obtained beforehand is applied with reference to the focal position at the camera sample reference point BP0. That is, the focal position at the camera sample reference point BP0 is caused to correspond to the sensor sample reference point BP1 serving as the reference point of the surface shape data, and the difference (surface shape) from the sensor sample reference point BP1 is applied as the defocus amount in the Z-direction, thereby determining the focal position of the entire surface of the sample. When the sensor sample reference point BP1 is set at a position different from the position corresponding to the camera sample reference point BP0, the position corresponding to the camera sample reference point BP0 in the surface shape data is caused to correspond to the focal position at the camera sample reference point BP0. Then, the surface shape data is applied to the entire surface of the sample.
By doing this, the focal position can be obtained from the surface to the inside of the sample 225 with a small number of focusing operations. However, as for the defocus amount on the image pickup section side, the optical (lateral) magnification β of the image pickup optical system 300 must be considered. As an example, it is assumed that the image pickup optical system forms an image an odd number of times and that a defocus Zxy from the sensor sample reference point BP1 is present at an arbitrary point (Xxy, Yxy) on the sample. In this case, on the image plane side, a defocus of Zxy×β^2 is applied at the point (−Xxy×β, −Yxy×β) on the XY-plane.
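The mapping stated above can be written compactly as follows; this is only an illustrative restatement of the relation in the text (an inverted image for an odd number of imaging operations, with longitudinal magnification β^2), and the function name is an assumption.

```python
# Minimal sketch of the object-to-image mapping described above: an object-side
# point (X, Y) with defocus Z maps, for lateral magnification beta and an odd
# number of imaging operations (inverted image), to the image-side point
# (-X*beta, -Y*beta) with defocus Z*beta**2 (longitudinal magnification).
def image_side_defocus(x_obj, y_obj, z_defocus_obj, beta):
    x_img = -x_obj * beta
    y_img = -y_obj * beta
    z_img = z_defocus_obj * beta ** 2   # longitudinal magnification is beta squared
    return x_img, y_img, z_img

# Example: with beta = 10, a 1 um surface undulation at (0.5 mm, -0.2 mm) on the
# sample corresponds to a 100 um defocus at (-5 mm, 2 mm) on the image plane.
```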
When the entire surface is actually focused, the relative positions between the sample stage 210 and the image pickup elements 430 are changed so that the sample stage 210 and the image pickup elements 430 have a conjugate relationship (Step 106).
Through the above-described procedure, the entire surface is focused and an image is obtained. Since the plurality of image pickup elements 430 are separately arranged in the image pickup section of the first embodiment, a whole image of the sample cannot be taken in one image pickup operation. For this reason, it is necessary to form a whole image of the sample by performing image pickup operations while moving the sample 225 and the image pickup element unit 400 relative to each other in the plane perpendicular to the optical axis of the image pickup optical system 300 and then combining the obtained separate images.
Hereinafter, a description will be given of the relationship between the motion of the sample 225 and the sample stage 210, and the image pickup optical system 300 and the image pickup element unit 400 when the entire sample is taken as one image.
The first image pickup operation is performed at an initial position, and subsequent image pickup operations are performed after shifting the sample 225 and the image pickup element unit 400 relative to each other so that the areas between the separately arranged image pickup elements 430 are captured.
In this way, the whole image of the sample is obtained. To obtain an in-focus image, focusing is performed through Steps 104 to 106 described above.
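The sketch below illustrates one possible tiling loop of this kind. The stage and sensor interfaces, the pixel-coordinate bookkeeping, and the choice to refocus at every relative position are all assumptions for illustration, not the specific sequence claimed in the specification.

```python
# Minimal sketch (hypothetical interfaces) of building a whole image by
# repeating focusing and image pickup while shifting the sample relative to the
# separately arranged image pickup elements, then pasting the tiles together.
import numpy as np

def capture_whole_image(stage, sensors, offsets_xy, canvas_shape, tile_shape):
    """offsets_xy   : list of relative XY shifts (in image-plane pixels)
       canvas_shape : size of the combined mosaic
       tile_shape   : size of one image pickup element's image"""
    canvas = np.zeros(canvas_shape)
    for ox, oy in offsets_xy:
        stage.move_xy(ox, oy)                 # shift sample vs. image pickup elements
        stage.refocus()                       # refocus (cf. Steps 104 to 106) at this position
        for sensor in sensors:
            tile = sensor.capture()           # image from one image pickup element 430
            x0, y0 = sensor.position_on_canvas(ox, oy)
            canvas[y0:y0 + tile_shape[0], x0:x0 + tile_shape[1]] = tile
    return canvas
```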
By the method described above, an in-focus and high-resolution whole image is formed by using the optical system with a wide angle of view and a plurality of image pickup elements.
According to the above-described method, it is possible to more accurately determine the focal position of the object at an arbitrary position and to obtain a whole image of the object in a shorter time.
In the first embodiment, the surface shape of the sample 225 is measured, and the camera sample reference point BP0 is aligned with the main reference point. The focal position of the image pickup optical system is determined at the camera sample reference point BP0, and the image pickup elements or the like are driven along undulation of the surface shape, so that the focal positions are also determined at a plurality of points other than the point BP0, and an in-focus whole image of the sample is obtained.
However, if the prepared slide 220 is tilted by impact or the like during transportation from the measuring optical system 20 to the main image pickup system 10, the tilt needs to be corrected. In this case, an in-focus whole image of the sample may be obtained by calculating the tilt of the sample 225 from focal positions measured by three or more focus sensors that are arranged in the image pickup element unit 400 so as not to be aligned in a straight line, and by correcting the tilt with the sample stage 210.
A focusing method adopted in this case will be described along the following focusing procedure.
One of the three reference points serves as a camera sample reference point BP0 that is the basis of the focal position of the entire sample, and the other reference points serve as tilt detection points TP.
Next, when the focal position at the camera sample reference point BP0 is determined, the differences in focal position (Z-direction) between the camera sample reference point BP0 and the tilt detection points TP are calculated (Step 202).
Then, the differences in focal position (Z-direction) between the camera sample reference point BP0 and the tilt detection points TP are calculated from the surface shape measured beforehand by the measuring optical system 20 (Step 203).
The differences in focal position obtained in Step 202 and Step 203 are compared (Step 204). When the comparison result is within a predetermined range, tilt correction by the sample stage 210 is not performed, and focusing is completed. When the comparison result is outside the predetermined range, a tilt amount is calculated (Step 205).
The sample stage 210 is driven according to the tilt amount calculated in Step 205 to correct the tilt so that the difference in focal position (Z-direction) between the camera sample reference point BP0 and the tilt detection points TP falls within the predetermined range (Step 206).
By the above-described steps, the surface shape of the sample 225 is measured, the focal position at the camera sample reference point BP0 is adjusted, the defocus amount is calculated according to the surface shape (undulation), and the tilt is corrected by driving the sample section 200, so that an in-focus whole image of the sample can be obtained. When the tilt is large, the procedure may return from Step 206 to Step 201, and the same steps may be repeated.
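As an illustration of the tilt calculation in Step 205, the sketch below fits a plane through three non-collinear focal positions and derives tilt angles for the stage to correct. This is an assumed, generic plane-fit formulation, not the patented algorithm, and the names and sign conventions are assumptions.

```python
# Minimal sketch (an assumption, not the patented algorithm) of estimating the
# tilt of the prepared slide from focal positions measured at three reference
# points that are not aligned in a straight line.
import numpy as np

def slide_tilt(points):
    """points : three (x, y, z) focal positions, z measured along the optical axis.
       Returns (tilt_about_x, tilt_about_y) in radians."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in points)
    normal = np.cross(p1 - p0, p2 - p0)      # normal of the plane through the three points
    nx, ny, nz = normal / np.linalg.norm(normal)
    tilt_about_y = np.arctan2(nx, nz)        # slope of the plane along x
    tilt_about_x = np.arctan2(ny, nz)        # slope of the plane along y
    return tilt_about_x, tilt_about_y
```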
This allows more accurate focusing.
In the first embodiment and the second embodiment, the optical axes of the image pickup optical system and the measuring optical system are different. In the third embodiment, in contrast, the measuring optical system is arranged so that its optical axis is partially aligned with that of the main image pickup system.
A focusing method adopted in this case will be described along the following focusing procedure.
Next, a sample stage 210 is driven on the XY-plane to adjust the image pickup area of the sample 225 so that the camera sample reference point BP0 and a focus sensor (main reference point) have a conjugate positional relationship with an image pickup optical system 300 (Step 303).
Then, a focal position at the camera sample reference point BP0 is found with the focus sensor while driving the sample stage 210 in the Z-direction (Step 304). At this time, the sample 225 is placed so that the camera sample reference point BP0 and the focus sensor have a conjugate positional relationship with the image pickup optical system 300.
As described in Step 105 of the first embodiment, after focus is obtained at the camera sample reference point BP0, surface shape data obtained by the measuring optical system 20 is applied to the entire sample while the main reference point and a sensor sample reference point BP1 serving as a reference point of the surface shape data are aligned (Step 305).
To focus the entire sample, the relative positions between the sample stage and the image pickup elements are changed so that the sample stage and the image pickup elements have a conjugate positional relationship (Step 306).
In Steps 304 to 306, a tilt detection operation may be performed with a plurality of focus sensors, similarly to the second embodiment.
The above method allows a whole image of the sample to be accurately formed in a short time.
In the first to third embodiments, the surface shape of the sample is measured with the Shack-Hartmann wavefront sensor, the focus is adjusted at the reference point in the main image pickup system, and the focal position of the entire sample is indirectly determined on the basis of the measured surface shape.
In the fourth embodiment, in contrast, a plurality of focus sensors 440 arranged in the image pickup element unit 400 are used to measure focal positions directly at a plurality of points on the sample.
First, the sample stage 210 is placed at a position where the sample can be measured with the main image pickup system 10 (Step 401), and a size, an image pickup area 400a, a camera sample reference point BP0, and a surface shape of the sample 225 are measured with the measuring optical system 20 (Step 402).
Next, the sample stage 210 is driven on the XY-plane to adjust the image pickup area so that the camera sample reference point BP0 and the focus sensors 440 (main reference point) have a conjugate positional relationship with the image pickup optical system 300 (Step 403).
Then, a focal position of the sample 225 at the camera sample reference point BP0 is found while driving the sample stage 210 in the Z-direction of the image pickup optical system 300, and focal positions are also measured with the focus sensors 440 placed at positions that are not conjugate with the camera sample reference point BP0 (Step 404).
Then, a focal position of the entire surface, including portions where the focus sensors 440 are not provided, can be calculated from the focal positions measured at a plurality of points (Step 405).
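One simple way to carry out such a calculation, shown only as an assumed illustration, is to fit a plane in the least-squares sense to the focal positions measured at the focus sensors 440 and evaluate it at any other point; the fitting model and names are assumptions.

```python
# Minimal sketch (an assumed method) of estimating the focal position over the
# entire surface from focal positions measured at several focus-sensor points,
# here by a least-squares plane fit z(x, y) = c0 + c1*x + c2*y.
import numpy as np

def fit_focal_plane(xs, ys, zs):
    """xs, ys, zs : focal positions (x, y, z) measured at the focus sensors 440.
       Returns a function estimating the focal position at an arbitrary (x, y)."""
    xs, ys, zs = (np.asarray(a, dtype=float) for a in (xs, ys, zs))
    A = np.column_stack([np.ones_like(xs), xs, ys])       # design matrix for the plane
    coeffs, *_ = np.linalg.lstsq(A, zs, rcond=None)
    return lambda x, y: coeffs[0] + coeffs[1] * x + coeffs[2] * y
```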
To focus the entire surface on the basis of the calculated focal positions, the relative positions between the sample and the image pickup elements are changed so that the sample and the image pickup elements have a conjugate positional relationship (Step 406).
In Step 404, to find the focal positions more accurately, the accuracy in focusing the entire surface may be increased by driving the sample stage 210 in the XY-plane and measuring focal positions with the focus sensors at a larger number of points.
In the above-described embodiments, the image pickup apparatus of the present invention is applied to the microscope. While the transmissive optical system that focuses transmitted light of light applied to the sample onto the image plane is adopted in the embodiments, an epi-illumination optical system may be adopted.
While some embodiments have been described, images of a plurality of samples can be taken in a short time by performing operations in parallel (simultaneously) in the main image pickup system and the measuring optical system as in the first embodiment and the second embodiment. That is, the measuring optical system measures the surface shape of a first sample, and at the same time, the main image pickup system takes an image of a second sample.
When the image pickup apparatus takes images of a small number of samples, it can be made compact by partially aligning the optical axes of the main image pickup system and the measuring optical system as in the third embodiment and the fourth embodiment.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2011-162157, filed Jul. 25, 2011, which is hereby incorporated by reference herein in its entirety.