The aspect of the embodiments relates to an image capturing apparatus, a method of controlling the same, and a non-transitory computer-readable medium.
In the case of a camera that includes a telephoto lens with a bright f-number, the depth of field is generally shallow. Therefore, in a case where such a camera is used to perform shooting from a diagonal (non-orthogonal) direction relative to a subject plane while AF is functioning, the obtained image is in focus only in the vicinity of the center and is out of focus elsewhere. If the so-called tilt shooting technique, in which the optical axis of the lens is inclined relative to the solid-state image sensor, is used under the same circumstances, the in-focus range can be expanded.
Japanese Patent Laid-Open No. 2021-76777 proposes an image capturing apparatus that, in order to realize high-speed focusing during tilt shooting, uses so-called image-plane phase-difference AF with a solid-state image sensor that includes phase-difference detection pixels.
According to an embodiment of the disclosure, an apparatus is provided with a sensor including a plurality of pixels that are arrayed two-dimensionally, at least a part of the plurality of pixels being phase-difference detection pixels that each include a first conversion unit and a second conversion unit for performing phase-difference AF, the apparatus comprising: a driving mechanism configured to change an angle of an image plane of the sensor relative to a main plane of an optical system; and a readout unit configured to read out signals obtained in the first conversion unit and the second conversion unit in an order corresponding to the angle.
According to another embodiment of the disclosure, a method of controlling an apparatus provided with a sensor including a plurality of pixels that are arrayed two-dimensionally, at least a part of the plurality of pixels being phase-difference detection pixels that each include a first conversion unit and a second conversion unit for performing phase-difference AF, and a driving mechanism configured to change an angle of an image plane of the sensor relative to a main plane of an optical system, comprises reading out signals obtained in the first conversion unit and the second conversion unit in an order corresponding to the angle.
According to an embodiment of the disclosure, a non-transitory computer-readable medium stores one or more programs which, when executed by a computer comprising one or more processors and one or more memories, cause the computer to control an apparatus provided with a sensor including a plurality of pixels that are arrayed two-dimensionally, at least a part of the plurality of pixels being phase-difference detection pixels that each include a first conversion unit and a second conversion unit for performing phase-difference AF, and a driving mechanism configured to change an angle of an image plane of the sensor relative to a main plane of an optical system, wherein the one or more programs further cause the computer to read out signals obtained in the first conversion unit and the second conversion unit in an order corresponding to the angle.
Further features of the disclosure will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
Hereinafter, embodiments will be described in detail with reference to the attached drawings. Note, the following embodiments are not intended to limit the scope of the claimed invention. Multiple features are described in the embodiments, but limitation is not made to a disclosure that requires all such features, and multiple such features may be combined as appropriate. Furthermore, in the attached drawings, the same reference numerals are given to the same or similar configurations, and redundant description thereof is omitted.
In a case where focusing is performed during tilt shooting with use of the image-plane phase-difference AF on the image capturing apparatus disclosed in Japanese Patent Laid-Open No. 2021-76777, a sensitivity difference arises between a plurality of photoelectric conversion units depending on a tilt angle. As a result, the accuracy of ranging in phase-difference detection pixels decreases, the accuracy of in-focus shooting decreases, and the speed of focusing slows down.
One embodiment of the disclosure can suppress a decrease in the accuracy of ranging in an image capturing apparatus that performs tilt shooting with use of a solid-state image sensor that includes phase-difference detection pixels.
The main control unit 105 is composed of a CPU, a ROM that stores a program executed by the CPU, and a RAM that is used by the CPU as a working area. The main control unit 105 obtains an instruction from the user via the operation unit 160. Then, the main control unit 105 controls the entire apparatus by controlling the focus control unit 102, solid-state image sensor 103, tilt control unit 104, and signal processing unit 106.
The imaging optical system 101 is composed of a plurality of lenses. Under control of the main control unit 105, the focus control unit 102 causes a focus lens inside the imaging optical system 101 to move along the Z-axis indicated by an arrow 150 by driving a non-illustrated driving mechanism, such as a stepper motor. In this way, the focus control unit 102 can adjust the focus position of the imaging optical system 101.
The solid-state image sensor 103 is axially supported in such a manner that it is rotatable in the X-Z plane (an arrow 151 shown in the figure) so that the angle of tilt relative to the optical axis direction can be changed. Under control of the main control unit 105, the tilt control unit 104 can change the angle of the image plane of the solid-state image sensor 103 relative to the main plane of the imaging optical system 101 (a later-described tilt angle) by driving a non-illustrated driving mechanism, such as a stepper motor. It is assumed that this tilt angle is set by the user operating the operation unit 160.
The following describes the relationship among the subject plane, the lens, and the image plane of the solid-state image sensor in tilt shooting with reference to
In tilt shooting, according to the Scheimpflug principle, the image plane 103a, the main plane 108 of the imaging optical system 101, and the subject plane 107 intersect at a single straight line 110 that extends in the Y-axis direction. Therefore, the subject plane 107 is inclined relative to the main plane 108 of the imaging optical system 101. That is to say, in tilt shooting, causing the subject plane (focus plane) 107 to coincide with the subject 109 that is inclined relative to the main plane 108 of the imaging optical system 101 enables image capture in which a wide range of the subject 109 is in focus. The angle θ formed by the image plane 103a of the solid-state image sensor 103 and the main plane 108 of the imaging optical system 101 is called a tilt angle.
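Although not part of the original description, the geometric relationship can be made concrete with a thin-lens sketch; the symbols u, v, f, and φ below are introduced here only for illustration.

$$\frac{1}{u}+\frac{1}{v}=\frac{1}{f},\qquad \tan\varphi=\frac{u}{v}\,\tan\theta,$$

where f is the focal length, v is the distance from the main plane 108 to the center of the image plane 103a, u is the conjugate subject distance, θ is the tilt angle measured from the untilted orientation of the image plane (the orientation perpendicular to the optical axis), and φ is the resulting inclination of the subject plane 107 relative to the main plane 108. Under these assumptions, increasing the tilt angle θ increases the inclination φ of the in-focus plane, which is why tilt shooting can bring an obliquely oriented subject 109 into focus over a wide range.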
The arrangement is such that the microlens 114 places an exit pupil 120 of the imaging optical system 101 and the first photoelectric conversion unit 112 and the second photoelectric conversion unit 113 in a conjugate positional relationship. As a result, a light beam that has mainly passed through the right half of the imaging optical system 101 is directed to the first photoelectric conversion unit 112, and a light beam that has mainly passed through the left half of the imaging optical system 101 is directed to the second photoelectric conversion unit 113. Thus, an amount of displacement of the subject from the focus position can be obtained by detecting an amount of image displacement between a first image, which is generated from the pixel signals obtained by the first photoelectric conversion units 112 of the plurality of phase-difference detection pixels 111, and a second image, which is generated from the pixel signals obtained by the corresponding second photoelectric conversion units 113.
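As a rough illustration of how such an image displacement amount could be computed from the first and second images, a minimal sketch is given below. The function name, the use of a sum-of-absolute-differences correlation, and the search range are assumptions for illustration; the embodiments do not specify a particular correlation method, and converting the shift into a defocus amount requires an optics-dependent factor that is not modeled here.

```python
import numpy as np

def image_displacement(a_line, b_line, max_shift=16):
    """Estimate the shift (in pixels) that best aligns the first (A) image
    with the second (B) image along the pupil-division direction, using a
    sum-of-absolute-differences (SAD) correlation."""
    a = np.asarray(a_line, dtype=np.float64)
    b = np.asarray(b_line, dtype=np.float64)
    best_shift, best_sad = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            sad = np.abs(a[s:] - b[:len(b) - s]).mean()
        else:
            sad = np.abs(a[:s] - b[-s:]).mean()
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift
```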
<Sensitivity Difference among Photoelectric Conversion Units>
First, a description is given of a case where the tilt angle is 0 degrees as in
In a case where the exit pupil distance of the imaging optical system 101 is infinite, the first photoelectric conversion unit 112 and the second photoelectric conversion unit 113 of a phase-difference detection pixel in a periphery region outside the central region of the solid-state image sensor 103 have the same sensitivity. However, due to, for example, a general demand for size reduction in an imaging optical system, the exit pupil distance is often finite. In view of this,
As apparent from
Similarly, as apparent from
Next, a description is given of a case where the tilt angle is large. In a case where the tilt angle is large, the exit pupil 120 of the imaging optical system 101 is inclined relative to the image plane 103a. Therefore, as shown in
The phase-difference detection pixel 111 that is displaced in the −X direction on the image plane 103a is located farther from the center of the exit pupil than the phase-difference detection pixel 111 in the central region is. Therefore, as shown in
On the other hand, as shown in
To summarize the above, in a case where tilt shooting has been performed on the image capturing apparatus that uses the solid-state image sensor 103 including the phase-difference detection pixels 111, the magnitude relationship between the sensitivities of the first photoelectric conversion unit and the second photoelectric conversion unit in a phase-difference detection pixel varies depending on the magnitude of the tilt angle and the position of the image plane 103a. A table of
In
In the image capturing apparatus according to the present embodiment, in order to suppress a decrease in the accuracy of ranging that occurs due to the sensitivity difference between the first photoelectric conversion unit and the second photoelectric conversion unit, the order of readout from the first photoelectric conversion unit and the second photoelectric conversion unit is changed in accordance with the tilt angle. The following describes readout of pixel signals and the advantageous effects of the present embodiment.
In the image capturing apparatus of the present embodiment, a pixel signal S1 obtained in one of the photoelectric conversion units, as well as a sum S1+2 of the pixel signals obtained in both of the photoelectric conversion units, is read out. Then, a pixel signal S2 of the other photoelectric conversion unit is obtained by subtracting S1 from S1+2; that is to say, so-called summation readout is used. Although the following describes a case where a signal of PD_A is read out first as an example, it is sufficient to interchange A and B in a case where a signal of PD_B is read out first. It is sufficient to prepare, as peripheral circuits for interchanging A and B, two types of vertical scanning circuits in which the timings of TX_A and TX_B are interchanged, and to switch which vertical scanning circuit is connected in accordance with the column.
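A minimal software-level sketch of this summation readout is given below. The MockPixel class and its transfer_and_read method are hypothetical stand-ins for the pixel circuit (noise is ignored); only the order of operations and the subtraction reflect the readout described here.

```python
class MockPixel:
    """Hypothetical stand-in for one phase-difference detection pixel.
    Charges are in arbitrary units and readout is treated as noiseless."""
    def __init__(self, q_a, q_b):
        self._charge = {"A": q_a, "B": q_b}
        self._fd = 0.0                       # floating diffusion contents

    def transfer_and_read(self, unit):
        self._fd += self._charge[unit]       # TX_A or TX_B: transfer charge to the FD
        return self._fd                      # SEL: read the FD level


def summation_readout(pixel, read_b_first=False):
    """Read S1 from one unit, then the summed signal S1+2, and recover the
    other unit's signal as S2 = S1+2 - S1.  Setting read_b_first=True
    corresponds to interchanging the TX_A and TX_B timings."""
    first, second = ("B", "A") if read_b_first else ("A", "B")
    s1 = pixel.transfer_and_read(first)
    s_sum = pixel.transfer_and_read(second)
    return {first: s1, second: s_sum - s1, "sum": s_sum}


print(summation_readout(MockPixel(q_a=120.0, q_b=80.0)))
# {'A': 120.0, 'B': 80.0, 'sum': 200.0}
```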
First, after RST is turned ON at time t3, SEL is turned ON at time t4; in this way, a noise level is read out. Subsequently, TX_A is turned ON at time t5, and SEL is turned ON at time t6; in this way, a pixel signal of the photoelectric conversion unit PD_A is obtained. Finally, TX_B is turned ON at time t7, and SEL is turned ON at time t8; in this way, a sum of an image signal of the photoelectric conversion unit PD_A and an image signal of PD_B is obtained.
A description is now given of noise carried by a pixel signal that is read out from the solid-state image sensor 103. The dominant noise components are optical shot noise Ns and readout circuit noise Nr. The optical shot noise occurs at the time of photoelectric conversion, and its magnitude depends on the signal and equals the square root of the signal amount. On the other hand, the readout circuit noise Nr occurs when a pixel signal is read out from the FD (floating diffusion) region via the SF (source follower), does not depend on the magnitude of the signal, and takes a constant value. As the optical shot noise and the readout circuit noise are independent, their combined value is the root sum of squares.
Therefore, the signal-to-noise ratio SN1 of the pixel signal S1 that has been read out first is indicated by expression (1), and the signal-to-noise ratio SN1+2 of a summed signal S1+2 is indicated by expression (2).
On the other hand, the signal-to-noise ratio SN2 of the pixel signal S2, which is obtained by subtracting the pixel signal S1 from the summed signal S1+2, is indicated by expression (3), as the readout noise of both read operations is added thereto.
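The expressions themselves are not reproduced above. Under the noise model just described (shot noise equal to the square root of the signal amount, a constant readout noise Nr added once per read operation, independent contributions combined in quadrature), they would take roughly the following form; this is a reconstruction from the stated assumptions, not a verbatim copy of the original expressions.

$$\mathrm{SN}_1=\frac{S_1}{\sqrt{S_1+N_r^{2}}}\quad\text{(1)},\qquad
\mathrm{SN}_{1+2}=\frac{S_1+S_2}{\sqrt{S_1+S_2+N_r^{2}}}\quad\text{(2)},\qquad
\mathrm{SN}_2=\frac{S_2}{\sqrt{S_2+2N_r^{2}}}\quad\text{(3)}$$

The factor of two in expression (3) reflects the fact that S2 is formed by subtracting two independently read values, so the readout noise enters twice in quadrature, while the photoelectric charge of the first unit is common to both read operations and cancels in the subtraction.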
As can be understood from comparison between expression (1) and expression (3), in a case where the magnitudes of S1 and S2 are the same, the signal-to-noise ratio of the pixel signal is lower in the photoelectric conversion unit from which the pixel signal has been read out later than in the photoelectric conversion unit from which the pixel signal has been read out first.
<Read from Unit with Low Sensitivity First>
As stated earlier, in order to suppress a decrease in the accuracy of ranging that occurs due to a sensitivity difference between the first photoelectric conversion unit and the second photoelectric conversion unit, the image capturing apparatus of the present embodiment changes the order of readout from the first photoelectric conversion unit and the second photoelectric conversion unit in accordance with a tilt angle. Specifically, the main control unit 105 controls the readout unit 161 to read out a pixel signal from the photoelectric conversion unit with a low sensitivity first, and read out a pixel signal from the photoelectric conversion unit with a high sensitivity later. The following describes the reason why a decrease in the accuracy of ranging can be suppressed.
First, the following is a discussion of a case where a pixel signal from the photoelectric conversion unit with a high sensitivity is read out first, and a pixel signal from the photoelectric conversion unit with a low sensitivity is read out later. As stated earlier, in a case where the same amount of light is incident on the photoelectric conversion units, the signal-to-noise ratio of the pixel signal is lower in the photoelectric conversion unit from which the pixel signal has been read out later than in the photoelectric conversion unit from which the pixel signal has been read out first.
That is to say, in a case where the pixel signal from the photoelectric conversion unit with a high sensitivity is read out first and the pixel signal from the photoelectric conversion unit with a low sensitivity is read out later, the magnitude of the signal is smaller in the latter, and noise is worse in the latter as well. As the accuracy of detection of an image displacement amount is mainly determined based on the pixel signal that is assumed to have a lower signal-to-noise ratio, the accuracy of ranging decreases in a case where the pixel signal from the photoelectric conversion unit with a low sensitivity is read out later.
On the other hand, in a case where the pixel signal from the photoelectric conversion unit with a low sensitivity is read out first and the pixel signal from the photoelectric conversion unit with a high sensitivity is read out later, the magnitude of the signal is smaller in the former, but the noise characteristics are more favorable in the former. Therefore, reading out the pixel signal from the low-sensitivity photoelectric conversion unit first and the pixel signal from the high-sensitivity photoelectric conversion unit later suppresses a decrease in the accuracy of ranging compared to the opposite order.
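Using the reconstructed expressions above, the effect can be checked numerically; all names and numbers below are arbitrary illustrative assumptions.

```python
import math

def snr_read_first(s, nr):        # one readout-noise contribution, as in expression (1)
    return s / math.sqrt(s + nr ** 2)

def snr_read_later(s, nr):        # two readout-noise contributions, as in expression (3)
    return s / math.sqrt(s + 2 * nr ** 2)

s_low, s_high, nr = 100.0, 400.0, 5.0   # signal amounts and readout noise, arbitrary units

# High-sensitivity unit read first, low-sensitivity unit read later:
print(snr_read_first(s_high, nr), snr_read_later(s_low, nr))   # ~19.4, ~8.2

# Low-sensitivity unit read first, high-sensitivity unit read later (as proposed):
print(snr_read_first(s_low, nr), snr_read_later(s_high, nr))   # ~8.9, ~18.9
```

Since the accuracy of ranging is limited by the weaker signal, the relevant figure is the smaller of the two ratios, which improves from about 8.2 to about 8.9 when the low-sensitivity unit is read out first.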
<Making Change in Accordance with Magnitudes of Pixel Signals>
Note that as the difference between expression (1) and expression (3) is the readout noise, a pixel signal of either photoelectric conversion unit may be read out first in a case where the optical shot noise is sufficiently larger than the readout noise. That is to say, whether or not to designate the order of readout of the pixel signals from the photoelectric conversion units may be switched in accordance with the magnitudes of the pixel signals.
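A minimal sketch of this decision is shown below; the threshold factor is an illustrative assumption, not a value taken from the embodiments.

```python
import math

def order_must_be_designated(expected_signal, readout_noise, factor=10.0):
    """Return True when the readout order should follow the sensitivity
    relationship.  When the shot noise (square root of the signal) is much
    larger than the readout noise, the extra readout-noise term in the
    later-read signal is negligible and either unit may be read out first."""
    return math.sqrt(expected_signal) < factor * readout_noise
```

For example, with a readout noise of 5 and the factor above, the order would be designated for signals below 2500 (arbitrary units) and left unconstrained for brighter pixels.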
As shown in
Note that as can be understood from
As shown in
<Moving Borderline in Accordance with Tilt Angle>
The larger the tilt angle, the greater the inclination of the exit pupil. Therefore, the borderline at which the order of readout from the first photoelectric conversion unit and the second photoelectric conversion unit is reversed is set so that it extends in the direction perpendicular to the pupil-division direction and shifts toward the side where the distance from the image plane to the subject plane is relatively short as the tilt angle increases.
The borderline at which the order of readout from the first photoelectric conversion unit and the second photoelectric conversion unit is reversed may move continuously in accordance with the tilt angle. This configuration increases the proportion of pixels in which a pixel signal from the photoelectric conversion unit with a low sensitivity is read out first, and improves the accuracy of ranging. It should be noted that the borderline may instead be changed in a stepwise manner. For example, the order of readout may be changed depending on whether or not the tilt angle is equal to or larger than a second threshold.
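The sketch below illustrates one way such a moving borderline could be realized per pixel column. The numeric tilt-angle range, the linear movement of the borderline, and the mapping of each side of the borderline to a readout order are assumptions made for illustration, chosen to be consistent with the behavior described in the later embodiments.

```python
def first_read_unit(column_x, tilt_angle_deg, min_tilt_deg=0.0, max_tilt_deg=40.0):
    """Choose which photoelectric conversion unit to read out first for a
    phase-difference detection pixel at normalized horizontal position
    column_x in [-1, +1] (+X being the side where the distance from the
    image plane to the subject plane is relatively short).

    The borderline is perpendicular to the pupil-division direction and
    moves continuously toward +X as the tilt angle increases, so the
    proportion of pixels whose second unit is read out first grows with
    the tilt angle."""
    t = (tilt_angle_deg - min_tilt_deg) / (max_tilt_deg - min_tilt_deg)
    t = min(max(t, 0.0), 1.0)
    border_x = -1.0 + 2.0 * t            # -X edge when untilted, +X edge at maximum tilt
    # Assumed mapping: on the -X side of the borderline the second (+X side)
    # unit is read out first; on the +X side the first (-X side) unit is.
    return "second" if column_x < border_x else "first"
```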
<Designating Pixels Associated with First Readout Only in Case Where Tilt Angle is Large>
Furthermore, as can be understood from comparison between
Note that although the foregoing has described a case where a phase-difference detection pixel includes two photoelectric conversion units, it may include three or more photoelectric conversion units. In this case, a pixel signal of the photoelectric conversion unit with the lowest sensitivity may be read out first, and then pixel signals may be read out in sequence in ascending order of sensitivity.
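For the case of three or more photoelectric conversion units, the generalization amounts to sorting the units by sensitivity; a one-line sketch (unit names and sensitivity values are illustrative) is:

```python
def readout_order(sensitivities):
    """Read the lowest-sensitivity unit first, then the rest in ascending
    order of sensitivity, e.g. {"A": 0.7, "B": 1.0, "C": 1.3} -> ["A", "B", "C"]."""
    return sorted(sensitivities, key=sensitivities.get)
```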
A second embodiment is now described. The present second embodiment pertains to the structure of the solid-state image sensor according to the first embodiment. The solid-state image sensor according to the present second embodiment is given reference sign 203, and an image plane thereof is denoted by 203a. As the other constituents are the same as those of the first embodiment, they will be described using the same reference signs as in the first embodiment.
Specifically, as shown in
In the image capturing apparatus according to the second embodiment, the microlenses of the phase-difference detection pixels 211 are decentered in accordance with the exit pupil 220 of the imaging optical system 101. Therefore, in a case where the tilt angle is 0 degrees as in
As described above, even in a case where the structures of the phase-difference detection pixels have been optimized in accordance with the exit pupil of the imaging optical system, when tilt shooting has been performed, the magnitude relationship between the sensitivities of the first photoelectric conversion unit and the second photoelectric conversion unit in a phase-difference detection pixel varies depending on the tilt angle. For this reason, in the image capturing apparatus according to the second embodiment as well, in order to suppress a decrease in the accuracy of ranging that occurs due to the sensitivity difference between the first photoelectric conversion unit and the second photoelectric conversion unit, the order of readout from the first photoelectric conversion unit and the second photoelectric conversion unit is changed in accordance with the tilt angle. Specifically, in a case where the tilt angle is larger than a preset threshold, a decrease in the accuracy of ranging is suppressed by reading out a pixel signal from the second photoelectric conversion unit first.
As described above, in the image capturing apparatus according to the second embodiment, the amounts of decentering of the microlenses in the phase-difference detection pixels are adjusted so that the first photoelectric conversion unit and the second photoelectric conversion unit have the same sensitivity in the case of a first tilt angle. Also, the image capturing apparatus is configured so that, in a case where the tilt angle is equal to or larger than a second tilt angle that is larger than the first tilt angle, a signal from the photoelectric conversion unit located on the side where the distance from the image plane to the subject plane is relatively short (the +X direction) is read out first.
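Expressed as a sketch, the decision rule of the second embodiment reduces to a single threshold comparison; the threshold value below is an illustrative assumption.

```python
def first_read_unit_second_embodiment(tilt_angle_deg, second_tilt_angle_deg=20.0):
    """Read the second (+X side) photoelectric conversion unit first when the
    tilt angle is equal to or larger than the second tilt angle; otherwise
    keep the default order (first unit first)."""
    return "second" if tilt_angle_deg >= second_tilt_angle_deg else "first"
```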
A third embodiment is now described. The present third embodiment pertains to the structure of the solid-state image sensor according to the first embodiment. The solid-state image sensor according to the present third embodiment is given reference sign 303, and an image plane thereof is denoted by 303a. As the other constituents are the same as those of the first embodiment, they will be described using the same reference signs as in the first embodiment.
The present third embodiment is an example in which the microlenses of the phase-difference detection pixels 311 in the solid-state image sensor 303 have been optimized in line with a case where the tilt angle is large.
Specifically, in the present third embodiment, each of the microlenses of the phase-difference detection pixels 311 in every region of the solid-state image sensor 303 is decentered in the +X direction relative to the center of the pixel thereof. Also, the amounts of decentering of the microlenses change either continuously, or in a stepwise manner, so that they become the largest in a periphery region in the −X direction, and the smallest in a periphery region in the +X direction. By adopting such a configuration, the sensitivity difference between the first photoelectric conversion unit and the second photoelectric conversion unit can be reduced in a case where the tilt angle is large.
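One way this position-dependent decentering could be parameterized is sketched below; the numeric range and the linear or stepwise interpolation are illustrative assumptions, not values from the embodiments.

```python
def microlens_decenter(column_x, d_min=0.0, d_max=0.3, stepwise=False, steps=5):
    """Decentering amount of the microlens (in the +X direction, arbitrary
    units) for a phase-difference detection pixel at normalized position
    column_x in [-1, +1]: largest at the -X periphery and smallest at the
    +X periphery, varying either continuously or in a stepwise manner."""
    t = (1.0 - column_x) / 2.0           # 1.0 at the -X edge, 0.0 at the +X edge
    if stepwise:
        t = round(t * (steps - 1)) / (steps - 1)
    return d_min + (d_max - d_min) * t
```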
Note that with regard to one phase-difference detection pixel according to the present third embodiment, the photoelectric conversion unit on the −X side on the image plane 303a is referred to as the first photoelectric conversion unit, and the one on the +X side is referred to as the second photoelectric conversion unit, similarly to the first and second embodiments.
In the image capturing apparatus according to the present third embodiment, the microlenses of the phase-difference detection pixels 311 are decentered in accordance with the exit pupil 320 of the imaging optical system in a case where the tilt angle is large. Therefore, in a case where the tilt angle is large as in
As described above, even in a case where the structures of the phase-difference detection pixels have been optimized in line with a case where the tilt angle is large, the magnitude relationship between the sensitivities of the first photoelectric conversion unit and the second photoelectric conversion unit in a phase-difference detection pixel varies depending on the tilt angle. For this reason, in the image capturing apparatus according to the third embodiment as well, in order to suppress a decrease in the accuracy of ranging that occurs due to the sensitivity difference between the first photoelectric conversion unit and the second photoelectric conversion unit, the order of readout from the first photoelectric conversion unit and the second photoelectric conversion unit is changed in accordance with the tilt angle. Specifically, in a case where the tilt angle is small, a decrease in the accuracy of ranging can be suppressed by reading out a pixel signal from the first photoelectric conversion unit first.
In the image capturing apparatus according to the third embodiment, the amounts of decentering of the microlenses in the phase-difference detection pixels are adjusted so that the first photoelectric conversion unit and the second photoelectric conversion unit have the same sensitivity in the case of a third tilt angle. Also, the image capturing apparatus is configured so that, in a case where the tilt angle is smaller than a fourth tilt angle that is smaller than the third tilt angle, a signal from the photoelectric conversion unit located on the side where the distance from the image plane to the subject plane is relatively long (the −X direction) is read out first.
As has been described in the second and third embodiments, the position of the exit pupil of the imaging optical system 101 varies depending on the tilt angle. Therefore, the amounts of decentering of the microlenses can be optimized for an intermediate tilt angle. The image capturing apparatus according to a fourth embodiment is an example in which the amounts of decentering of the microlenses have been optimized for an angle that is exactly the midpoint between the smallest tilt angle and the largest tilt angle that are preset for tilt shooting.
In the case of the image capturing apparatus according to the fourth embodiment as well, the magnitude relationship between sensitivities varies depending on the tilt angle and the position of a phase-difference detection pixel on the image plane, and therefore the order of readout from the photoelectric conversion units can be changed in accordance with these factors. Specifically, the proportion of the phase-difference detection pixels in which readout from the first photoelectric conversion unit is performed first is increased as the tilt angle decreases, and the proportion of the phase-difference detection pixels in which readout from the second photoelectric conversion unit is performed first is increased as the tilt angle increases. In this way, a decrease in the accuracy of ranging that occurs due to the sensitivity difference between the photoelectric conversion units can be suppressed.
A fifth embodiment is now described. It pertains to a surveillance system that uses the image capturing apparatus described in the above first to fourth embodiments.
The client apparatus 501 is, for example, an external device such as a PC, and the network 502 is composed of a wired LAN, a wireless LAN, or the like. Furthermore, it is permissible to adopt a configuration in which power is supplied to the image capturing apparatus 503 via the network 502.
Embodiment(s) of the disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the disclosure has been described with reference to exemplary embodiments, it is to be understood that the disclosure is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2022-148405, filed Sep. 16, 2022, which is hereby incorporated by reference herein in its entirety.