The present invention relates to an image pickup apparatus such as a camera, and an image pickup method using the image pickup apparatus.
In recent years, distance measurement apparatuses for measuring the distance to an object (the object to which the distance is measured) based on the parallax between a plurality of image pickup optical systems have been used for following-distance measurement for automobiles, autofocus systems for cameras, and three-dimensional shape measurement systems.
In such a distance measurement apparatus, a pair of image pickup optical systems arranged in the left-right direction or in the vertical direction form images on the respective image pickup areas, and the distance to the object is detected through triangulation using the parallax between those images.
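Purely as an illustrative aside (not part of the original disclosure), the triangulation computation described above can be sketched as follows; the function name and the numeric values are hypothetical.

```python
def distance_by_triangulation(parallax_px, baseline_m, focal_len_px):
    """Distance to the object from the parallax between a stereo image pair.

    parallax_px: disparity between the two images, in pixels (> 0)
    baseline_m: separation of the two optical axes, in meters
    focal_len_px: focal length expressed in pixel units
    """
    if parallax_px <= 0:
        raise ValueError("parallax must be positive")
    return baseline_m * focal_len_px / parallax_px

# A 10-pixel parallax with a 0.1 m baseline and a 500-pixel focal length
# corresponds to an object 5 m away.
print(distance_by_triangulation(10, 0.1, 500))  # → 5.0
```

This is why precise parallelism of the two optical axes matters: any uncorrected axis misalignment shifts the measured parallax directly into the distance estimate.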
The DFD (Depth From Defocus) method is known as a scheme for measuring the distance from a single image pickup optical system to an object. The DFD method is an approach in which the distance is calculated by analyzing the amount of blur in the obtained image; however, it is not possible to determine from a single image whether a blur is a pattern of the object itself or is caused by the object distance, and therefore methods for estimating the distance from a plurality of images have been used (Patent Document 1, Non-Patent Document 1).
[Patent Document 1] Japanese Patent No. 3110095
[Patent Document 2] Japanese Laid-Open Patent Publication No. 2010-39162
[Non-Patent Document 1] Xue Tu, Youn-sik Kang and Murali Subbarao, “Depth and Focused Image Recovery from Defocused Images for Cameras Operating in Macro Mode”, in Two- and Three-Dimensional Methods for Inspection and Metrology V, edited by Peisen S. Huang, Proceedings of the SPIE, Vol. 6762, pp. 676203 (2007).
Configurations using a plurality of image pickup optical systems increase the size and cost of the image pickup apparatus. Moreover, it is necessary to provide a plurality of image pickup optical systems of uniform characteristics and to ensure that the optical axes of the plurality of image pickup optical systems are parallel to one another with high precision, making manufacture more difficult; furthermore, a calibration process for determining camera parameters is needed, requiring a large number of steps.
With such DFD methods as disclosed in Patent Document 1 and Non-Patent Document 1, it is possible to calculate the distance to an object with a single image pickup optical system. With the methods of Patent Document 1 and Non-Patent Document 1, however, it is necessary to obtain a plurality of images in a time-division manner while varying the distance to the object in focus (the focal length). When such an approach is applied to a movie, misalignment occurs between images due to the time lag between image-capturing operations, thereby lowering the distance measurement precision.
Patent Document 2 discloses an image pickup apparatus in which the optical path is divided by a prism and images are captured by two image pickup surfaces with different back focuses, thereby making it possible to measure the distance to an object in a single iteration of image capture. However, such a method requires two image pickup surfaces, thereby increasing the size of the image pickup apparatus and significantly increasing the cost.
The present invention has been made in order to solve such problems as described above, and a primary object thereof is to provide an image pickup apparatus and an image pickup method capable of obtaining brightness information with which it is possible to calculate the object distance using a single image pickup optical system.
An image pickup apparatus of the present invention includes: a lens optical system having a plurality of regions including six regions having such optical characteristics that focal characteristics are made different from one another; an image pickup device having a plurality of pixels on which light beams having passed through the lens optical system are incident; and an arrayed optical device arranged between the lens optical system and the image pickup device for making light beams having passed through the six regions incident respectively on different pixels on the image pickup device.
An image pickup system of the present invention includes: an image pickup apparatus of the present invention; and a signal processing device for calculating a distance to an object using brightness information of a plurality of pixels obtained respectively from six different pixels on which light beams having passed through six regions of the image pickup apparatus are incident.
An image pickup method of the present invention uses an image pickup apparatus including: a lens optical system having a plurality of regions including six regions having such optical characteristics that focal characteristics are made different from one another; an image pickup device having a plurality of pixels on which light beams having passed through the lens optical system are incident; and an arrayed optical device arranged between the lens optical system and the image pickup device, the method including: making light beams having passed through the six regions incident respectively on different pixels on the image pickup device by means of the arrayed optical device; and calculating a distance to an object using brightness information of a plurality of pixels obtained respectively from six different pixels on which light beams having passed through six regions are incident.
Another image pickup apparatus of the present invention includes: a lens optical system having a plurality of regions including three regions having such optical characteristics that focal characteristics are made different from one another; an image pickup device having a plurality of pixels on which light beams having passed through the lens optical system are incident, the center points of the pixels being located at the apices of a regular hexagon; and an arrayed optical device arranged between the lens optical system and the image pickup device for making light beams having passed through the three regions incident on different pixels on the image pickup device.
Still another image pickup apparatus of the present invention includes: a lens optical system having a plurality of regions including four regions having such optical characteristics that focal characteristics are made different from one another; an image pickup device including a plurality of pixels on which light beams having passed through the lens optical system are incident, the positions of the center points of the pixels in the row direction being shifted from one row to the next by half the pixel arrangement pitch; and an arrayed optical device arranged between the lens optical system and the image pickup device for making light beams having passed through the four regions incident on different pixels on the image pickup device.
According to the present invention, it is possible to obtain brightness information with which the object distance can be calculated through image capture using a single image pickup system. In the present invention, it is not necessary to make uniform the characteristics or the positions of a plurality of image pickup optical systems as with an image pickup apparatus using a plurality of image pickup optical systems, thus allowing for a reduction in the number of steps and facilitating the manufacturing process. Moreover, where a movie is captured using an image pickup apparatus of the present invention, it is possible to measure the accurate distance to an object even if the position of the object varies over the passage of time.
Embodiments of the image pickup apparatus of the present invention will now be described with reference to the drawings.
(Embodiment 1)
The lens optical system L has six optical regions D1, D2, D3, D4, D5 and D6.
In the present embodiment, light beams having passed through the six optical regions D1, D2, D3, D4, D5 and D6 pass through the lens L2 and then are incident on the arrayed optical device K. The arrayed optical device K causes the light beams having passed through the six optical regions D1, D2, D3, D4, D5 and D6 to be incident on six pixel groups P1, P2, P3, P4, P5 and P6 of the image pickup device N, respectively. A plurality of pixels belong to each of the six pixel groups P1, P2, P3, P4, P5 and P6. For example, in
The first signal processing section C1 outputs images I1, I2, I3, I4, I5 and I6 obtained from the pixel groups P1, P2, P3, P4, P5 and P6, respectively. Since the optical characteristics of the six optical regions D1, D2, D3, D4, D5 and D6 are different from one another, the degrees of sharpness (values calculated by using the brightness) of the images I1, I2, I3, I4, I5 and I6 are different from one another depending on the object distance. The storage section Me stores the correlation between the degree of sharpness and the object distance for each of the light beams having passed through the optical regions D1, D2, D3, D4, D5 and D6. In the second signal processing section C2, it is possible to obtain the distance to the object based on the degrees of sharpness for the images I1, I2, I3, I4, I5 and I6 and the correlations.
As shown in
(a) is a diagram showing, on an enlarged scale, the arrayed optical device K and the image pickup device N shown in
Note that the optical elements M1 are preferably arranged in a hexagonal close-packed pattern so that pixels arranged to be at the apices of a regular hexagon can be covered efficiently.
The arrayed optical device is designed so that the majority of the light beams B1, B2, B3, B4, B5 and B6 having passed through the optical regions D1, D2, D3, D4, D5 and D6 on the optical device L1 arrives at the pixel groups P1, P2, P3, P4, P5 and P6 on the image pickup surface Ni, respectively. Specifically, this configuration can be realized by appropriately setting parameters, such as the refractive index of the arrayed optical device K, the distance from the image pickup surface Ni, and the radius of curvature of the surface of the optical element M1.
Now, the first signal processing section C1 shown in
The images I1, I2, I3, I4, I5 and I6 are images obtained by the light beams B1, B2, B3, B4, B5 and B6 having passed through the optical regions D1, D2, D3, D4, D5 and D6 having such optical characteristics that focal characteristics are made different from one another. The second signal processing section C2 calculates the distance to the object by using the degree of sharpness (brightness information) of a plurality of images obtained for a plurality of pixel groups among the first to sixth pixel groups P1 to P6. In the present embodiment, using the images I1, I2, I3, I4, I5 and I6, it is possible to precisely obtain the distance to an object at a short distance, as compared with a method where the number of divisions of optical regions is smaller. That is, it is possible to precisely obtain the distance to the object through (e.g., a single iteration of) image capture using a single image pickup optical system (the lens optical system L).
The stop S is a region through which light beams of all field angles pass. Therefore, by inserting, in the vicinity of the stop S, a surface having optical characteristics for controlling focal characteristics, it is possible to similarly control the focal characteristics of light beams of all field angles. That is, in the present embodiment, it is preferred that the optical device L1 is provided in the vicinity of the stop S. As the optical regions D1, D2, D3, D4, D5 and D6, having such optical characteristics that focal characteristics are made different from one another, are arranged in the vicinity of the stop S, the light beams can be given focal characteristics according to the number of divisions of the regions.
Next, a specific method for obtaining the object distance will be described.
The relationship between the object distance and the degree of sharpness, shown in a graph, is as shown in
Where E denotes the degree of sharpness in a block of a predetermined size, it can be obtained based on the difference in brightness value between adjacent pixels by using Expression 1, for example.
In Expression 1, Δxi,j is the difference between the brightness value of the pixel at a coordinate point (i, j) within an image block of a predetermined size and the brightness value of the pixel adjacent to it in the x direction, Δyi,j is the corresponding difference with respect to the pixel adjacent in the y direction, and k is a coefficient. It is preferred that Δyi,j is multiplied by a predetermined coefficient.
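As an illustrative sketch only: Expression 1 itself is not reproduced in this text, so the form below (a sum of root-sum-square brightness differences in the x and y directions, with the coefficient k applied to the y difference) is an assumption consistent with the description above, not the authoritative formula.

```python
import math

def sharpness(block, k=1.0):
    """Degree of sharpness E of a brightness block (list of rows).

    Assumed form: E = sum over the block of sqrt(dx^2 + (k*dy)^2),
    where dx and dy are brightness differences between a pixel and its
    neighbor in the x and y directions, and k weights the y difference.
    """
    h, w = len(block), len(block[0])
    e = 0.0
    for i in range(h - 1):
        for j in range(w - 1):
            dx = block[i][j + 1] - block[i][j]   # difference in the x direction
            dy = block[i + 1][j] - block[i][j]   # difference in the y direction
            e += math.sqrt(dx * dx + (k * dy) * (k * dy))
    return e

flat = [[100] * 4 for _ in range(4)]     # uniform block: no edges, E = 0
edge = [[0, 0, 255, 255]] * 4            # vertical edge: strong x gradient
print(sharpness(flat), sharpness(edge))  # a sharper block scores higher
```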
Next, a method for obtaining the degree of sharpness E in a block of a predetermined size based on a Fourier-transformed frequency spectrum will be described. Since the image is two-dimensional, the degree of sharpness for a block of a predetermined size is obtained by using a two-dimensional Fourier transform.
FIGS. 7(a) to 7(c) each show the brightness distribution of an image block having a size of 16×16. The degree of sharpness decreases in the order of
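The frequency-spectrum approach can be sketched as follows. The exact spectral weighting used to derive the degree of sharpness is not specified in this text, so taking the fraction of spectral energy outside the DC (zero-frequency) component is an illustrative assumption; the hand-rolled 2-D DFT merely keeps the sketch dependency-free.

```python
import cmath

def fourier_sharpness(block):
    """Sharpness of a square brightness block as the share of its 2-D DFT
    energy outside the DC component (illustrative definition only).
    A blurred block concentrates energy at low frequencies, so this
    fraction falls as blur increases."""
    n = len(block)
    total = 0.0
    dc = 0.0
    for u in range(n):
        for v in range(n):
            s = sum(block[x][y] * cmath.exp(-2j * cmath.pi * (u * x + v * y) / n)
                    for x in range(n) for y in range(n))
            power = abs(s) ** 2
            total += power
            if u == 0 and v == 0:
                dc = power  # zero-frequency (mean brightness) energy
    return 0.0 if total == 0 else (total - dc) / total

blurred = [[8] * 4 for _ in range(4)]   # uniform block: energy only at DC
sharp = [[0, 255, 0, 255]] * 4          # alternating columns: high frequency
print(fourier_sharpness(blurred), fourier_sharpness(sharp))
```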
Now, the range of Z in
When the image pickup apparatus is used, the ratio between the degrees of sharpness of the images I1, I2, I3, I4, I5 and I6 produced for the respective pixel groups P1, P2, P3, P4, P5 and P6 is obtained for each arithmetic block from the data obtained as a result of a single iteration of image capture. Then, the object distance can be obtained by using the correlations stored in the storage section Me (the correlations between the object distance and the ratio between the degrees of sharpness of any two of the images). Specifically, for each arithmetic block, the ratio between degrees of sharpness in the correlation is compared with the value of the ratio between the degrees of sharpness of the images I1, I2, I3, I4, I5 and I6. Then, the object distance corresponding to the value at which the two match is used as the distance to the object at the time of the image-capturing operation.
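The lookup step above can be sketched as follows. The correlation table values are hypothetical stand-ins for the calibration data held in the storage section Me, and the single-pair lookup is a simplification of comparing the ratios for all image pairs.

```python
def distance_from_ratio(measured_ratio, correlation):
    """Look up the object distance whose stored sharpness ratio best
    matches the measured one.

    correlation: list of (object_distance, sharpness_ratio) pairs — the
    pre-stored relationship for one pair of images (e.g. I1 and I2).
    """
    return min(correlation, key=lambda dr: abs(dr[1] - measured_ratio))[0]

# Hypothetical monotonic correlation over the measurable distance range.
table = [(1.0, 0.5), (2.0, 0.8), (3.0, 1.0), (4.0, 1.3), (5.0, 1.7)]
print(distance_from_ratio(1.25, table))  # nearest stored ratio is 1.3 → 4.0
```

Uniqueness of the lookup is exactly why the text requires the ratios to be all different from one another over the measured range: a repeated ratio value would map one measurement to two distances.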
In order to uniquely obtain the object distance based on the ratios between the degrees of sharpness of the images I1, I2, I3, I4, I5 and I6, the ratios between the degrees of sharpness need to be all different from one another over a predetermined object distance range.
In
Note that the relationship between the object distance and the degree of sharpness is dictated by the radius of curvature of the surface of the optical regions D1, D2, D3, D4, D5 and D6, the spherical aberration characteristics, and the refractive index. That is, the optical regions D1, D2, D3, D4, D5 and D6 need to have such optical characteristics that the ratios between the degrees of sharpness of the images I1, I2, I3, I4, I5 and I6 are all different from one another over a predetermined distance range.
Note that in the present embodiment, the object distance may be obtained by using a value other than the degree of sharpness, e.g., the contrast, as long as it is a value calculated using the brightness (brightness information). The contrast can be obtained, for example, from the ratio between the maximum brightness value and the minimum brightness value within a predetermined arithmetic block. While the degree of sharpness is a difference between brightness values, the contrast is a ratio between brightness values. The contrast may be obtained from the ratio between a point of the maximum brightness value and a point of the minimum brightness value, or it may be obtained, for example, from the ratio between the average of some higher brightness values and the average of some lower brightness values. Also where the object distance is obtained using the contrast, as in the case where the degree of sharpness is used, correlations between the object distance and the contrast ratio are stored in advance in the storage section Me. By obtaining the contrast ratio between the images I1, I2, I3, I4, I5 and I6 for each block, it is possible to obtain the object distance using the correlations.
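The two contrast definitions mentioned above can be sketched as follows (illustrative only; the guard against division by zero for fully dark blocks is an implementation assumption, not from the source).

```python
def contrast(block):
    """Contrast of a brightness block as the ratio of the maximum to the
    minimum brightness value within the block."""
    values = [v for row in block for v in row]
    lo = min(values)
    if lo == 0:
        lo = 1  # assumed guard: avoid division by zero for dark pixels
    return max(values) / lo

def contrast_robust(block, frac=0.25):
    """Variant using the average of some higher brightness values over
    the average of some lower ones, as the text also suggests."""
    values = sorted(v for row in block for v in row)
    n = max(1, int(len(values) * frac))  # how many extreme values to average
    low_mean = sum(values[:n]) / n
    high_mean = sum(values[-n:]) / n
    return high_mean / max(low_mean, 1e-9)

block = [[50, 100], [200, 25]]
print(contrast(block), contrast_robust(block))
```

The robust variant is less sensitive to a single noisy pixel, which is the usual reason for averaging several extreme values instead of taking one maximum and one minimum.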
Note that the present embodiment may employ either the method of obtaining the degree of sharpness from the difference between brightness values of adjacent pixels, or the method of obtaining the degree of sharpness through a Fourier transform. Note, however, that since the degree of sharpness is a relative value, the values obtained by the former method and by the latter method differ. Therefore, the method of obtaining the degree of sharpness used for obtaining the correlations (the correlations between object distances and degrees of sharpness stored in advance) and the method of obtaining the degree of sharpness at the time of image capture need to be matched with each other.
In the present embodiment, the optical system of the image pickup apparatus may be an image-side telecentric optical system. Then, even if the field angle changes, the principal ray is incident on the arrayed optical device K at an angle close to 0 degrees, and it is therefore possible to reduce the crosstalk between the light beams arriving at the respective pixel groups P1, P2, P3, P4, P5 and P6 over the entire image pickup area.
In the present embodiment, an image-side non-telecentric optical system may be used as the lens optical system L. In such a case, since the radii of curvature of the six regions of the optical device L1 are different from one another, the magnifications of the obtained images I1, I2, I3, I4, I5 and I6 differ from one region to another. If the ratio between degrees of sharpness is then calculated for each image region, the predetermined regions being referenced are shifted from one another off the optical axis, and the ratio between degrees of sharpness cannot be obtained correctly. In such a case, a correction is made so that the magnifications of the images I1, I2, I3, I4, I5 and I6 are generally equal to one another, and the ratio between degrees of sharpness over a predetermined region is then obtained, making it possible to obtain the ratio correctly.
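The magnification correction can be sketched as follows. Nearest-neighbor resampling and scaling about the image origin are simplifying assumptions for the sketch; an actual implementation would interpolate and resample about the optical axis.

```python
def normalize_magnification(img, mag):
    """Resample a brightness image (list of rows) by 1/mag with
    nearest-neighbor sampling, so that images formed at different
    magnifications cover the same field before their sharpness
    ratios are compared."""
    h, w = len(img), len(img[0])
    nh, nw = int(h / mag), int(w / mag)  # size of the normalized image
    return [[img[min(h - 1, int(i * mag))][min(w - 1, int(j * mag))]
             for j in range(nw)] for i in range(nh)]

# A hypothetical 4×4 image at 2× magnification, normalized back to 2×2
# so its blocks align with those of a unit-magnification image.
big = [[r * 10 + c for c in range(4)] for r in range(4)]
print(normalize_magnification(big, 2.0))
```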
In Embodiment 1, the areas of the optical regions D1, D2, D3, D4, D5 and D6 (the areas as viewed from a direction along the optical axis) are made equal to one another (generally equal area). With such a configuration, the exposure time can be made equal for the pixel groups P1, P2, P3, P4, P5 and P6. Where the areas of the optical regions D1, D2, D3, D4, D5 and D6 are different from one another, it is preferred that the exposure time is varied among the pixel groups P1, P2, P3, P4, P5 and P6 or a brightness adjustment is performed after image capture.
As described above, according to the present embodiment, correlations between the object distance and the ratios between the degrees of sharpness (or the contrasts) of the images obtained from the six optical regions D1, D2, D3, D4, D5 and D6 of the optical device L1 are stored in advance, and the distance to an object can be obtained from the ratios between the degrees of sharpness (or the contrasts) of the captured images I1, I2, I3, I4, I5 and I6 and the stored correlations. That is, by performing, for example, a single iteration of image capture using an image pickup apparatus of the present embodiment, it is possible to obtain brightness information with which the object distance can be measured, and the object distance can then be calculated using that brightness information. Since the distance to an object can thus be obtained through (e.g., a single iteration of) image capture using a single image pickup optical system (the lens optical system L), it is not necessary to make uniform the characteristics and positions of a plurality of image pickup optical systems, as is required with an image pickup apparatus using a plurality of image pickup optical systems. Moreover, where a movie is captured using an image pickup apparatus of the present embodiment, it is possible to accurately measure the distance to an object even if the position of the object varies with the passage of time.
Note that with an arrangement such that the center points of the pixels are at the apices of a regular hexagon on the image pickup surface Ni, the number of kinds of optical characteristics of the optical regions D1, D2, D3, D4, D5 and D6 may be three instead of six. That is, as shown in
As shown in
(Embodiment 2)
Embodiment 2 is different from Embodiment 1 in that the region of the optical device L1 is divided into seven. In the present embodiment, contents similar to those of Embodiment 1 will not be described in detail.
The stop S is installed in the vicinity of the lens optical system L, and has a single opening.
In the present embodiment, light beams having passed through the seven optical regions D1, D2, D3, D4, D5, D6 and D7 pass through the lens L2 and then are incident on the arrayed optical device K. The arrayed optical device K causes the light beams having passed through the seven optical regions D1, D2, D3, D4, D5, D6 and D7 to be incident on the pixel groups P1, P2, P3, P4, P5, P6 and P7 (shown in
While the optical region D1 has a different shape from the optical regions D2, D3, D4, D5, D6 and D7 in Embodiment 2, the optical regions D1, D2, D3, D4, D5, D6 and D7 have an equal area. With such a configuration, the exposure time can be made equal between the pixel groups P1, P2, P3, P4, P5, P6 and P7 on which light beams from the optical regions are incident. Note that where the optical regions have different areas, it is preferred that the exposure time is made different between pixels depending on their areas, or the brightness is adjusted in the image generation process.
The broken line s denotes the position of the stop S.
In the present embodiment, the configuration of the arrayed optical device K is similar to that of Embodiment 1, and the perspective view of the arrayed optical device K of the present embodiment is similar to that of
(a) is a diagram showing, on an enlarged scale, the arrayed optical device K and the image pickup device N shown in
The arrayed optical device K is arranged so that the surface thereof on which the optical elements M4 are formed is facing the image pickup surface Ni. On the image pickup surface Ni, a plurality of pixels P are arranged in n rows (n is an integer greater than or equal to 2), for example. As shown in
The arrayed optical device K is arranged in the vicinity of the focal point of the lens optical system L, and is arranged at a position away from the image pickup surface Ni by a predetermined distance. On the image pickup surface Ni, the microlenses Ms are provided so as to cover the surfaces of seven pixels p1, p2, p3, p4, p5, p6 and p7 included in the pixel groups P1, P2, P3, P4, P5, P6 and P7, respectively.
The arrayed optical device K is configured so that one optical element M4 corresponds to the seven pixels p1, p2, p3, p4, p5, p6 and p7 included in the pixel groups P1, P2, P3, P4, P5, P6 and P7, respectively. The arrayed optical device is designed so that the majority of the light beams B1, B2, B3, B4, B5, B6 and B7 having passed through the optical regions D1, D2, D3, D4, D5, D6 and D7 on the optical device L1 arrives at the pixel groups P1, P2, P3, P4, P5, P6 and P7 on the image pickup surface Ni, respectively. Specifically, this configuration can be realized by appropriately setting parameters, such as the refractive index of the arrayed optical device K, the distance from the image pickup surface Ni, and the radius of curvature of the surface of the optical element M4.
Now, the first signal processing section C1 shown in
In Embodiment 2, the relationship between the object distance and the degree of sharpness is as shown in
As described above, the present embodiment is configured so that seven different images can be obtained simultaneously by seven regions having such optical characteristics that focal characteristics are made different from one another, and it is therefore possible to obtain the distance to an object through (e.g., a single iteration of) image capture using a single image pickup optical system. With this configuration, it is possible to expand the object distance range over which the distance can be measured, as compared with the embodiment shown in
Note that where the positions of the center points of the pixels in the row direction are arranged while being shifted from one row to another by half the arrangement pitch on the image pickup surface Ni, the number of kinds of optical characteristics of the optical regions D1, D2, D3, D4, D5, D6 and D7 may be four instead of seven. That is, as shown in
(Other Embodiments)
Note that while Embodiments 1 and 2 are examples where curved surface configurations, etc., for making focal characteristics different from one another are arranged on the object-side surface of the optical device L1, such curved surface configurations, etc., may be arranged on the image-side surface of the optical device L1.
While the lens L2 has a single-lens configuration, it may be configured with a plurality of lenses or a plurality of lens groups.
A plurality of optical regions may be formed on the optical surface of the lens L2 arranged in the vicinity of the stop.
While the optical device L1 is arranged on the object side with respect to the position of the stop, it may be arranged on the image side with respect to the position of the stop.
Embodiments 1 and 2 are directed to an image pickup apparatus including the first signal processing section C1, the second signal processing section C2, and the storage section Me (shown in
Note that with the distance measurement method of the present invention, correlations between the degree of sharpness and the object distance do not always have to be used. For example, the object distance may be obtained by substituting the obtained degree of sharpness or contrast into an expression representing the relationship between the degree of sharpness or the contrast and the object distance.
It is preferred that the optical elements (microlenses) of the microlens array of Embodiments 1 and 2 are in a rotationally symmetric shape with respect to the optical axis within a range of a predetermined radius of each optical element. Hereinafter, description will be made in comparison with microlenses having a rotationally asymmetric shape with respect to the optical axis.
(a1) is a perspective view showing a microlens array having a rotationally asymmetric shape with respect to the optical axis. Such a microlens array is formed through patterning using a resist obtained by forming a quadrangular prism-shaped resist on the array and performing a heat treatment to round the corner portions of the resist.
(a3) is a diagram showing the results of a light beam tracking simulation in a case where the microlens shown in
(b1) is a perspective view showing a microlens array having a rotationally symmetric shape with respect to the optical axis. A microlens having such a rotationally symmetric shape can be formed on a glass plate, or the like, through a thermal imprinting or UV imprinting process.
(b2) shows the contour lines of the microlens having a rotationally symmetric shape. With a microlens having a rotationally symmetric shape, the radius of curvature in the longitudinal and lateral directions is equal to that in the diagonal direction.
(b3) is a diagram showing the results of a light beam tracking simulation in a case where the microlens shown in
An image pickup apparatus according to the present invention is useful as an image pickup apparatus such as a digital still camera or a digital video camera. It is also applicable to a distance measurement apparatus for monitoring the surroundings of an automobile and a person in an automobile, or a distance measurement apparatus for a three-dimensional information input for a game device, a PC, a portable terminal, and the like.
A Image pickup apparatus
L Lens optical system
L1 Optical device
L2 Lens
D1, D2, D3, D4, D5, D6, D7 Optical region
S Stop
K Arrayed optical device
N Image pickup device
Ni Image pickup surface
Me Storage section
Ms Microlens on image pickup device
M1, M2, M3, M4 Microlens (optical element) of arrayed optical device
P1, P2, P3, P4, P5, P6, P7 Light-receiving device (pixel group) on image pickup device
p1, p2, p3, p4, p5, p6, p7 Pixel
C1, C2 First, second signal processing section
| Number | Date | Country | Kind |
|---|---|---|---|
| 2011-099165 | Apr 2011 | JP | national |
| Filing Document | Filing Date | Country | Kind | 371c Date |
|---|---|---|---|---|
| PCT/JP2012/000728 | 2/3/2012 | WO | 00 | 8/28/2013 |
| Publishing Document | Publishing Date | Country | Kind |
|---|---|---|---|
| WO2012/147245 | 11/1/2012 | WO | A |
| Number | Name | Date | Kind |
|---|---|---|---|
| 4560863 | Matsumura et al. | Dec 1985 | A |
| 5576975 | Sasaki et al. | Nov 1996 | A |
| 20010015763 | Miwa et al. | Aug 2001 | A1 |
| 20040125230 | Suda | Jul 2004 | A1 |
| 20070017993 | Sander | Jan 2007 | A1 |
| 20070279618 | Sano et al. | Dec 2007 | A1 |
| 20080277566 | Utagawa | Nov 2008 | A1 |
| 20100171854 | Yokogawa | Jul 2010 | A1 |
| 20110085050 | Dowski et al. | Apr 2011 | A1 |
| 20120033105 | Yoshino | Feb 2012 | A1 |
| Number | Date | Country |
|---|---|---|
| 61-076310 | May 1986 | JP |
| 05-302831 | Nov 1993 | JP |
| 07-035545 | Feb 1995 | JP |
| 07-060211 | Jun 1995 | JP |
| 2000-152281 | May 2000 | JP |
| 3110095 | Sep 2000 | JP |
| 2001-227914 | Aug 2001 | JP |
| 2004-191893 | Jul 2004 | JP |
| 2006-184065 | Jul 2006 | JP |
| 2006-184844 | Jul 2006 | JP |
| 2008-051894 | Mar 2008 | JP |
| 2009-198376 | Sep 2009 | JP |
| 2010-039162 | Feb 2010 | JP |
| 2012-039255 | Feb 2012 | JP |
| Entry |
|---|
| International Search Report for corresponding International Application No. PCT/JP2012/000728 mailed Apr. 17, 2012. |
| Preliminary Report on Patentability for corresponding International Application No. PCT/JP2012/000728 dated Jul. 4, 2013 and partial English translation. |
| Tu et al., “Two- and Three-Dimensional Methods for Inspection and Metrology V”, Edited by Huang, Peisen S. Proceedings of the SPIE, vol. 6762, pp. 676203 (2007), entitled “Depth and Focused Image Recovery from Defocused Images for Cameras Operating in Macro Mode”. |
| Number | Date | Country | |
|---|---|---|---|
| 20130329042 A1 | Dec 2013 | US |