The present invention relates to an image sensor and an image capturing apparatus.
As a focus detection method performed by an image capturing apparatus, an on-imaging surface phase difference method is used in which focus detection by a phase difference method is performed using focus detection pixels formed in an image sensor.
U.S. Pat. No. 4,410,804 discloses an image capturing apparatus using a two-dimensional image sensor in which one microlens and a plurality of photoelectric conversion units are formed in each pixel. The plurality of photoelectric conversion units are configured to receive light components that have passed through different regions of the exit pupil of an imaging lens via one microlens, thereby dividing the pupil. A correlation amount is calculated from focus detection signals output from pixels (focus detection pixels) each including a plurality of photoelectric conversion units, and an image shift amount is obtained from the correlation amount, thereby performing focus detection by the phase difference method. Further, Japanese Patent Laid-Open No. 2001-083407 discloses generating an image signal by adding focus detection signals output from a plurality of photoelectric conversion units for each pixel.
Japanese Patent Laid-Open No. 2000-156823 discloses an image capturing apparatus in which pairs of focus detection pixels are partially arranged in a two-dimensional image sensor formed from a plurality of imaging pixels. The pairs of focus detection pixels are configured to receive light components from different regions of the exit pupil of an imaging lens via a light shielding layer having openings, thereby dividing the pupil. An image signal is acquired by imaging pixels arranged on most part of the two-dimensional image sensor. A correlation amount is calculated from focus detection signals of the partially arranged focus detection pixels, and an image shift amount is obtained from the calculated correlation amount, thereby performing focus detection by the phase difference method, as disclosed.
In focus detection using the on-imaging surface phase difference method, the defocus direction and the defocus amount can simultaneously be detected by focus detection pixels formed in an image sensor. It is therefore possible to perform focus control at a high speed.
However, the on-imaging surface phase difference method has a problem: when the variation range of the incident angle of light from the imaging lens (imaging optical system) on a peripheral portion of the image sensor is large, the pupil deviation between the entrance pupil of the sensor and the exit pupil of the imaging lens becomes large, the base-line length cannot be secured, and consequently the focus detection quality of the on-imaging surface phase difference method may deteriorate.
The present invention has been made in consideration of the above situation, and realizes focus detection by the on-imaging surface phase difference method under a wide range of conditions even in a case where the variation range of the incident angle of light from the imaging optical system on a peripheral portion of the image sensor is large.
According to the present invention, provided is an image sensor in which a plurality of pixels each having a plurality of photoelectric conversion units for receiving light fluxes that have passed through different partial pupil regions of an imaging optical system are arrayed, wherein an entrance pupil distance Zs of the image sensor with respect to a minimum exit pupil distance Lmin of the imaging optical system and a maximum exit pupil distance Lmax of the imaging optical system satisfies a condition of

4/{3×(1/Lmin+1/Lmax)}<Zs<4/(1/Lmin+1/Lmax).
Further, according to the present invention, provided is an image sensor in which a plurality of pixels each having a plurality of photoelectric conversion units for receiving light fluxes that have passed through different partial pupil regions of an imaging optical system are arrayed, wherein an entrance pupil distance Zs of the image sensor with respect to a maximum image height R of the image sensor satisfies the condition of 2.33R<Zs<6.99R.
Furthermore, according to the present invention, provided is an image sensor in which a plurality of pixels each having a plurality of photoelectric conversion units for receiving light fluxes that have passed through different partial pupil regions of an imaging optical system are arrayed, wherein a maximum image height and an entrance pupil distance of the image sensor are determined such that a deviation amount between an entrance pupil of the image sensor and an exit pupil of the imaging optical system with respect to each of the plurality of photoelectric conversion units falls within a predetermined range.
Further, according to the present invention, provided is an image capturing apparatus, comprising the image sensor as described above, that is capable of connecting to an exchangeable imaging optical system.
Further, according to the present invention, provided is an image capturing apparatus comprising: the imaging optical system; and the image sensor as described above.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention, and together with the description, serve to explain the principles of the invention.
Exemplary embodiments of the present invention will be described in detail in accordance with the accompanying drawings.
A third lens group 105 (focus lens) carries out focus adjustment by moving forward and backward along the optical axis. A low-pass optical filter 106 is an optical element for the purpose of reducing false color and moiré of a sensed image. An image sensor 107 is composed of a two-dimensional CMOS photo sensor and the surrounding circuitry, and disposed on an imaging plane of the imaging optical system.
A zoom actuator 111 carries out a magnification-change operation by rotation of a cam barrel, not shown, to move the first lens group 101 through the second lens group 103 forward and backward along the optical axis. An aperture-shutter actuator 112 controls the diameter of the opening of the aperture-shutter 102 and adjusts the amount of light for image sensing, and also controls the exposure time during still image sensing. A focus actuator 114 moves the third lens group 105 forward and backward along the optical axis to adjust the focus.
An electronic flash 115 for illuminating an object is used during image sensing. A flash illumination device that uses a Xenon tube is preferable, but an illumination device comprised of a continuous-flash LED may also be used. An AF auxiliary flash unit 116 projects an image of a mask having a predetermined opening pattern onto an object field through a projective lens to improve focus detection capability with respect to dark objects and low-contrast objects.
The CPU 121 controls the camera main unit in various ways within the image capturing apparatus. The CPU 121 may, for example, have a calculation unit, ROM, RAM, A/D converter, D/A converter, communication interface circuitry, and so forth. In addition, the CPU 121, based on predetermined programs stored in the ROM, drives the various circuits that the camera has, and executes a set of operations of AF, image sensing, image processing, and recording.
An electronic flash control circuit 122 controls firing of the electronic flash 115 in synchrony with an image sensing operation. An auxiliary flash drive circuit 123 controls firing of the AF auxiliary flash unit 116 in synchrony with a focus detection operation. An image sensor drive circuit 124 controls the image sensing operation of the image sensor 107 as well as A/D-converts acquired image signals and transmits the converted image signals to the CPU 121. An image processing circuit 125 performs such processing as γ conversion, color interpolation, JPEG compression and the like on the images acquired by the image sensor 107.
A focus drive circuit 126 controls the drive of the focus actuator 114 based on the focus detection result to drive the third lens group 105 reciprocally in the optical axis direction, thereby performing focus adjustment. An aperture-shutter drive circuit 128 controls the drive of the aperture-shutter actuator 112, thereby driving the opening of the aperture-shutter 102. A zoom drive circuit 129 drives the zoom actuator 111 in accordance with the zoom operation of the user.
A display device 131, such as an LCD, displays information relating to the image sensing mode of the camera, preview images before image sensing, confirmation images after image sensing, focus state display images during focus detection, and the like. An operating switch group 132 is composed of a power switch, a release (image sensing trigger) switch, a zoom operation switch, an image sensing mode selection switch, and the like. A detachable flash memory 133 records captured images.
A pixel group 200 includes pixels of 2 columns×2 rows. A pixel 200R having an R (red) spectral sensitivity is arranged at the upper left position, pixels 200G having a G (green) spectral sensitivity are arranged at the upper right and lower left positions, and a pixel 200B having a B (blue) spectral sensitivity is arranged at the lower right position. Each pixel is formed from a first focus detection pixel 201 and a second focus detection pixel 202 arrayed in 2 columns×1 row.
A number of arrays of 4 (columns)×4 (rows) pixels (8 (columns)×4 (rows) focus detection pixels) shown in
Each of the photoelectric conversion units 301 and 302 may be formed as a pin structure photodiode including an intrinsic layer between a p-type layer and an n-type layer or a p-n junction photodiode without an intrinsic layer, as needed.
In each pixel, a color filter 306 is formed between the microlens 305 and the photoelectric conversion units 301 and 302. The spectral transmittance of the color filter may be changed between the focus detection pixels, as needed, or the color filter may be omitted.
Light that has entered the pixel 200G shown in
The pixels 200R and 200B shown in
The correspondence between pupil division and the pixel structure according to this embodiment shown in
A first partial pupil region 501 of the first focus detection pixel 201 represents a pupil region that is almost conjugate with the light receiving surface of the photoelectric conversion unit 301 having a center of gravity decentered in the −x direction via the microlens 305, and light beams that have passed through the first partial pupil region 501 are received by the first focus detection pixel 201. The first partial pupil region 501 of the first focus detection pixel 201 has a center of gravity decentered to the +x side on the pupil plane.
A second partial pupil region 502 of the second focus detection pixel 202 represents a pupil region that is almost conjugate with the light receiving surface of the photoelectric conversion unit 302 having a center of gravity decentered in the +x direction via the microlens 305, and light beams that have passed through the second partial pupil region 502 are received by the second focus detection pixel 202. The second partial pupil region 502 of the second focus detection pixel 202 has a center of gravity decentered to the −x side on the pupil plane.
Light beams that have passed through a pupil region 500 are received by the whole pixel 200G including the photoelectric conversion units 301 and 302 (first focus detection pixel 201 and the second focus detection pixel 202). Reference numeral 400 denotes an opening of the aperture-shutter 102.
The embodiment shows an example in which the pupil region is divided into two in the horizontal direction. However, the pupil region may be divided in the vertical direction as needed.
It should be noted that, in the aforesaid example, a plurality of image sensing pixels each formed with the first focus detection pixel and the second focus detection pixel are arrayed, however, the present invention is not limited to this. The image sensing pixels, the first focus detection pixels and the second focus detection pixels may be formed independently, and the first focus detection pixels and the second focus detection pixels may be partially provided among an array of the image sensing pixels.
In this embodiment, the first focus detection signal is formed by collecting signals corresponding to received light from the first focus detection pixels 201 of the respective pixels of the image sensor, the second focus detection signal is formed by collecting signals corresponding to received light from the second focus detection pixels 202 of the respective pixels of the image sensor, and focus detection is performed using the first and second focus detection signals. Further, by adding the signals from the first focus detection pixel 201 and the second focus detection pixel 202 for each pixel of the image sensor 107, an image signal (a captured image) with a resolution corresponding to the number of effective pixels N is generated.
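The per-pixel signal handling described above can be sketched as follows. This is an illustrative sketch only; the function name and the sample data are hypothetical and not part of the embodiment:

```python
# Each pixel yields a pair (a, b) of sub-pixel readouts from the first and
# second focus detection pixels 201 and 202.

def build_signals(pixel_pairs):
    """Collect the sub-pixel readouts into the two focus detection signals,
    and sum each pair to form the per-pixel image signal."""
    first = [a for a, b in pixel_pairs]      # first focus detection signal
    second = [b for a, b in pixel_pairs]     # second focus detection signal
    image = [a + b for a, b in pixel_pairs]  # image (captured) signal
    return first, second, image

first, second, image = build_signals([(10, 12), (11, 9), (8, 8)])
```

The two focus detection signals feed the correlation computation for phase difference AF, while the summed signal is used as the captured image.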
At the sensor entrance pupil distance Zs, the first partial pupil region 501, which is the light receiving region (entrance pupil) of the first focus detection pixel 201 of each pixel of the image sensor 107, and the second partial pupil region 502, which is the light receiving region of the second focus detection pixel 202, intersect substantially on the optical axis. Considering the overlap between the sensor entrance pupil (the first partial pupil region 501 and the second partial pupil region 502) and the imaging lens exit pupil at the sensor entrance pupil distance Zs, the pupil deviation amount between the sensor entrance pupil and the imaging lens exit pupil at the minimum exit pupil distance Lmin is P1. Similarly, the pupil deviation amount between the sensor entrance pupil and the imaging lens exit pupil at the maximum exit pupil distance Lmax is P2. If either the pupil deviation amount P1 or the pupil deviation amount P2 is large, the base-line length cannot be secured, and there are cases in which the focus detection performance of the phase difference AF deteriorates.
Accordingly, in the present embodiment, the sensor entrance pupil distance Zs is configured to satisfy the following condition so that the pupil deviation amounts P1 and P2 are restrained.
First, let θs be the incident angle, on a pixel located at the maximum image height R of the image sensor 107, of light from the sensor entrance pupil distance Zs on the optical axis; let θmax be the incident angle on the same pixel of light from the minimum exit pupil distance Lmin on the optical axis; and let θmin be the incident angle on the same pixel of light from the maximum exit pupil distance Lmax on the optical axis. In order to restrain the pupil deviation amounts P1 and P2 and secure the base-line length, in this embodiment, the angle θs is set within a range that satisfies the following expression (1), which defines a neighboring range of the average incident angle:

(θmax+θmin)/4<θs<3×(θmax+θmin)/4 (1)
Further, θs, θmax, and θmin can be approximated by the following expressions (2):

θs≈R/Zs, θmax≈R/Lmin, θmin≈R/Lmax (2)
By substituting the expressions (2) into the expression (1), the following expression (3), which gives the condition that the sensor entrance pupil distance Zs should satisfy, is obtained:

4/{3×(1/Lmin+1/Lmax)}<Zs<4/(1/Lmin+1/Lmax) (3)
Therefore, in this embodiment, in order to restrain the pupil deviation amounts P1 and P2 and secure the base-line length, the entrance pupil distance Zs of the image sensor 107 is configured to satisfy the expression (3) with respect to the minimum exit pupil distance Lmin and the maximum exit pupil distance Lmax of the imaging optical system.
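The range that expression (3) places on Zs can be checked numerically. The helper below is an illustrative sketch, not part of the patent; it assumes the bounds take the form 4/{3×(1/Lmin+1/Lmax)} and 4/(1/Lmin+1/Lmax), which is consistent with the limiting case of expression (4):

```python
import math

def zs_bounds(Lmin, Lmax):
    """Zs range implied by expression (3), assuming the form
    4/{3*(1/Lmin + 1/Lmax)} < Zs < 4/(1/Lmin + 1/Lmax)."""
    inv = 1.0 / Lmin + (0.0 if math.isinf(Lmax) else 1.0 / Lmax)
    return 4.0 / (3.0 * inv), 4.0 / inv

# With Lmax = infinity and Lmin = R / 0.572 the bounds reduce to
# roughly 2.33R and 6.99R, i.e. expression (4).
R = 21.63                         # mm, example maximum image height
lo, hi = zs_bounds(R / 0.572, math.inf)
```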
In the case of a lens-exchangeable camera, lenses with various optical conditions, from wide-angle lenses to telephoto lenses, may be attached. In this case, as a condition for the maximum exit pupil distance Lmax of the imaging optical system, it is desirable to set Lmax=∞ in order to cope with a telecentric lens. Further, as a condition for the minimum exit pupil distance Lmin of the imaging optical system, it is desirable to restrain the marginal illumination decrease, expressed by the cosine fourth law with respect to the center image height, to ½ (half) or less. Therefore, under the condition of cos^4(θmax)=½, it is desirable that the maximum incident angle θmax of light incident on the pixel located at the maximum image height R of the image sensor 107 from the minimum exit pupil distance Lmin on the optical axis is θmax=32.8°=0.572 [rad]. Accordingly, from the expressions (2), with the maximum image height R of the image sensor 107, it is desirable that the minimum exit pupil distance Lmin is Lmin=R/0.572.
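The value θmax=32.8°≈0.572 rad follows directly from the half-illumination condition cos^4(θmax)=½. A quick numeric check (illustrative; the value of R is an example):

```python
import math

# Half-illumination angle of the cosine fourth law: cos^4(theta) = 1/2,
# so theta_max = arccos(2 ** (-1/4)).
theta_max = math.acos(0.5 ** 0.25)   # about 0.572 rad = 32.8 degrees
R = 21.63                            # mm, example maximum image height
Lmin = R / theta_max                 # from expressions (2): Lmin = R / theta_max
```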
By substituting Lmin=R/0.572 and Lmax=∞ into the expression (3), the following condition expression (4) for the sensor entrance pupil distance Zs is obtained.
2.33R<Zs<6.99R (4)
In this embodiment, since the horizontal size H of the image sensor 107 is 36 mm, the vertical size V of the same is 24 mm, and the maximum image height R of the same is 21.63 mm, the condition expression (4) for the sensor entrance pupil distance Zs is 50.4 mm<Zs<151.2 mm.
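The numbers quoted for this embodiment can be verified in a few lines (illustrative sketch; H and V are the full-frame sensor dimensions stated above):

```python
import math

H, V = 36.0, 24.0              # horizontal and vertical sensor size in mm
R = math.hypot(H / 2, V / 2)   # maximum image height = half the diagonal
lo, hi = 2.33 * R, 6.99 * R    # bounds of condition expression (4)
```

This reproduces R≈21.63 mm and the range 50.4 mm<Zs<151.2 mm.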
Further, the condition of the minimum exit pupil distance Lmin of the imaging optical system may be determined on the basis of the pupil intensity distribution of the pupil region 500, which is the combined area of the first partial pupil region 501 and the second partial pupil region 502 of the image sensor 107. Here, the pupil intensity distribution of the pupil region 500 at the central image height is denoted by PI0(θ), and the pupil intensity distribution of the pupil region 500 at the maximum image height R is denoted by PIR(θ). As the condition of the minimum exit pupil distance Lmin of the imaging optical system, it is desirable to restrain the pupil intensity distribution PIR(θ=θmax_PIR) at the incident angle θmax_PIR [rad] at the maximum image height R to ½ (half) or less of the pupil intensity distribution PI0(θ=0) at the incident angle θ=0 [rad] at the central image height. Therefore, it is desirable to determine the incident angle θmax_PIR of light from the minimum exit pupil distance Lmin on the optical axis based on the condition of PIR(θ=θmax_PIR)=0.5×PI0(θ=0), and to determine the minimum exit pupil distance Lmin=R/θmax_PIR from the expressions (2).
By substituting Lmin=R/θmax_PIR and Lmax=∞ into the expression (3), the following condition expression (5) for the sensor entrance pupil distance Zs is obtained:

4R/(3×θmax_PIR)<Zs<4R/θmax_PIR (5)
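One way to apply this criterion in practice is to locate, on a sampled pupil intensity distribution, the angle at which PIR(θ) falls to half of PI0(0) and derive Lmin from it. The sketch below is hypothetical: the sample data, the helper name, and the use of linear interpolation are assumptions for illustration, not part of the patent:

```python
import math

def lmin_from_pupil_intensity(thetas, pir_samples, pi0_at_zero, R):
    """Find the angle theta_max_PIR at which PIR(theta) first falls to half
    of PI0(0), interpolating linearly between samples, and return
    Lmin = R / theta_max_PIR (or None if the half level is never crossed)."""
    half = 0.5 * pi0_at_zero
    for i in range(len(thetas) - 1):
        p0, p1 = pir_samples[i], pir_samples[i + 1]
        if p0 >= half > p1:  # half level crossed in this interval
            t0, t1 = thetas[i], thetas[i + 1]
            theta_half = t0 + (p0 - half) * (t1 - t0) / (p0 - p1)
            return R / theta_half
    return None

# Hypothetical sampled distribution, roughly Gaussian in the angle.
thetas = [0.0, 0.2, 0.4, 0.6, 0.8]
pir = [math.exp(-(t / 0.5) ** 2) for t in thetas]
Lmin = lmin_from_pupil_intensity(thetas, pir, pi0_at_zero=1.0, R=21.63)
```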
As described above, the image sensor of the embodiment has a configuration in which a plurality of pixels are arrayed, wherein each pixel has a plurality of photoelectric conversion units for receiving light fluxes that have passed through different partial pupil regions of the imaging optical system, and in which the sensor entrance pupil distance Zs satisfies the expression (3) with respect to the minimum exit pupil distance Lmin of the imaging optical system and the maximum exit pupil distance Lmax of the imaging optical system.
Further, the image sensor of the embodiment has a configuration in which a plurality of pixels are arrayed, wherein each pixel has a plurality of photoelectric conversion units for receiving light fluxes that have passed through different partial pupil regions of the imaging optical system, and in which the sensor entrance pupil distance Zs satisfies the expression (4) with respect to the maximum image height R of the image sensor.
With the configuration as described above, the focus detection by the on-imaging surface phase difference method is possible over a wide range of conditions in a case where a changing range of the incident angle of light from the imaging optical system on a pixel at a peripheral image height of the image sensor is large.
In the first embodiment as described above, each pixel of the image sensor 107 is divided by 2 in the x direction and by 1 (that is, not divided) in the y direction. However, the present invention is not limited to this; the image sensor 107 may comprise pixels that are divided in different numbers and in different ways from the pixels shown in
In this modification, a pixel group 700 shown in
A number of arrays of 4 (columns)×4 (rows) pixels (8 (columns)×8 (rows) focus detection pixels) shown in
In the modification, by adding signals from the first focus detection pixel 701 to the fourth focus detection pixel 704 for each pixel of the image sensor 107, an image signal (a captured image) with the resolution corresponding to the number of the effective pixels N is generated. Except for this, the modification is similar to the above embodiment.
With the configuration as described above, the focus detection by the on-imaging surface phase difference method is possible over a wide range of conditions in a case where a changing range of the incident angle of light from the imaging optical system on a pixel at a peripheral image height of the image sensor is large.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application Nos. 2016-016173, filed on Jan. 29, 2016 and 2016-234423, filed on Dec. 1, 2016, which are hereby incorporated by reference herein in their entirety.
Number | Date | Country | Kind |
---|---|---|---|
2016-016173 | Jan 2016 | JP | national |
2016-234423 | Dec 2016 | JP | national |
This application is a continuation of application Ser. No. 15/974,173, filed May 8, 2018, which is a continuation of application Ser. No. 15/416,388, filed Jan. 26, 2017, which issued as U.S. Pat. No. 10,015,390 on Jul. 3, 2018, the entire disclosures of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
4410804 | Stauffer | Oct 1983 | A |
4755868 | Hodges | Jul 1988 | A |
7974017 | Katakura | Jul 2011 | B2 |
8390943 | Uemura | Mar 2013 | B2 |
9377601 | Kusaka | Jun 2016 | B2 |
10015390 | Fukuda | Jul 2018 | B2 |
10477099 | Fukuda | Nov 2019 | B2 |
20040141086 | Mihara | Jul 2004 | A1 |
20060024038 | Kawamura et al. | Feb 2006 | A1 |
20080277566 | Utagawa | Nov 2008 | A1 |
20110122510 | Uchida | May 2011 | A1 |
20110228163 | Isaka et al. | Sep 2011 | A1 |
20120002165 | Saito | Jan 2012 | A1 |
20120075729 | Uemura | Mar 2012 | A1 |
20120249846 | Nishio et al. | Oct 2012 | A1 |
20120249852 | Fukuda | Oct 2012 | A1 |
20130057968 | Tang et al. | Mar 2013 | A1 |
20140071322 | Fukuda | Mar 2014 | A1 |
20140111681 | Fukuda | Apr 2014 | A1 |
20140226038 | Kimura | Aug 2014 | A1 |
20150015765 | Lee | Jan 2015 | A1 |
20150296129 | Ishikawa | Oct 2015 | A1 |
20150378129 | Yuza | Dec 2015 | A1 |
20160088245 | Nakata | Mar 2016 | A1 |
20160255267 | Takamiya | Sep 2016 | A1 |
20160307326 | Wang | Oct 2016 | A1 |
20160349492 | Maetaki | Dec 2016 | A1 |
20170192209 | Yamahiro | Jul 2017 | A1 |
20170272643 | Tamaki et al. | Sep 2017 | A1 |
20170295331 | Fukuda et al. | Oct 2017 | A1 |
20170310913 | Takada | Oct 2017 | A1 |
20170353680 | Fukuda | Dec 2017 | A1 |
20180003923 | Fukuda | Jan 2018 | A1 |
Number | Date | Country |
---|---|---|
103780848 | May 2014 | CN |
2 548 462 | Sep 2017 | GB |
2000-156823 | Jun 2000 | JP |
2001-083407 | Mar 2001 | JP |
2009-145527 | Jul 2009 | JP |
2010175955 | Aug 2010 | JP |
2013257507 | Dec 2013 | JP |
2014006275 | Jan 2014 | JP |
Entry |
---|
The above U.S. Patent Application Publication #10 was cited in a U.K. Search Report dated Jul. 14, 2017, which is enclosed, that issued in the corresponding U.K. Patent Application No. 1700961.4. |
The above foreign patent document #4 was cited in the May 20, 2019 Japanese Office Action, which is enclosed without an English Translation, that issued in Japanese Patent Application No. 2016234423.
The above U.S. Patent #6, U.S. Patent Application Publication #24, and Foreign patent document #5 were cited in the Sep. 3, 2019 Chinese Office Action, which is enclosed with an English Translation, that issued in Chinese Patent Application No. 201710051977. |
The above foreign patent documents were cited in the Sep. 14, 2020 Japanese Office Action, which is enclosed without an English Translation, that issued in Japanese Patent Application No. 2019160641. |
Number | Date | Country | |
---|---|---|---|
20200014842 A1 | Jan 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15974173 | May 2018 | US |
Child | 16577052 | US | |
Parent | 15416388 | Jan 2017 | US |
Child | 15974173 | US |