The present invention relates to a control apparatus that provides a focus detection.
Japanese Patent Laid-Open No. (“JP”) 58-24105 discloses a focus detection apparatus that performs a focus detection using a two-dimensional image sensor (image pickup element) that includes a micro lens for each pixel, based on a phase difference in a pixel divided by a so-called pupil dividing method. The focus detection apparatus disclosed in JP 58-24105 can detect a focus state by dividing a photoelectric converter in each pixel into a plurality of portions, and by receiving, through the micro lens and the divided photoelectric converters, light fluxes that have passed through mutually different areas on the pupil of the imaging optical system. JP 2016-71275 discloses an imaging apparatus that, in addition to the phase difference AF, performs a focus detection using a contrast autofocus (AF) that evaluates the contrast of image data.
In the focus detection apparatus disclosed in JP 58-24105, the signal used for the phase difference AF corresponds to an area thinned out from the entire image, and thus information useful for the phase difference AF, such as a high-contrast edge region, may fall within the thinned-out area. In that case, the focus detection cannot be accurately performed by the phase difference AF.
The imaging apparatus disclosed in JP 2016-71275 performs the focus detection using both the phase difference AF and the contrast AF, but the contrast AF may require a long focusing time and may not achieve a high speed focus detection.
The present invention provides a control apparatus, an imaging apparatus, and a control method, which can provide an accurate and fast focus detection.
A control apparatus according to one aspect of the present invention includes a focus detector configured to perform a focus detection by a phase difference method based on an image signal output from an image sensor, a driver configured to change a relative position between an imaging optical system and the image sensor in a direction orthogonal to an optical axis in the imaging optical system, and a controller configured to control the focus detector and the driver so as to perform the focus detection by changing the relative position between the imaging optical system and the image sensor to each of a first position and a second position.
An imaging apparatus according to another aspect of the present invention includes an image sensor, a focus detector configured to perform a focus detection by a phase difference method based on an image signal output from the image sensor, a driver configured to change a relative position between an imaging optical system and the image sensor in a direction orthogonal to the optical axis of the imaging optical system, and a controller configured to control the focus detector and the driver so as to perform the focus detection by changing the relative position between the imaging optical system and the image sensor to each of a first position and a second position.
A control method according to another aspect of the present invention includes the steps of changing a relative position between an imaging optical system and an image sensor in a direction orthogonal to the optical axis of the imaging optical system to each of a first position and a second position, and performing a focus detection by a phase difference method based on an image signal output from the image sensor.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Referring now to the accompanying drawings, a detailed description will be given of embodiments according to the present invention.
Referring now to
A divided image generating circuit 102 generates two images (divided images) based on the output signal from the image sensor 101. A phase difference detecting accelerator circuit 103 performs processing for correcting an optical distortion in each of the two images (divided images) generated by the divided image generating circuit 102, and a correlation calculation (focus detection) for detecting the phase difference between the two images. In other words, the phase difference detecting accelerator circuit 103 constitutes a focus detector configured to perform the focus detection by the phase difference method based on the image signal output from the image sensor 101, which photoelectrically converts the optical image formed via the lens unit 100. The output signal from the phase difference detecting accelerator circuit 103 is written in the memory 108.
An image signal processing circuit 104 combines the two images (image signals) output from the image sensor 101 to generate a video signal, and performs plural optical correction processing, electric noise processing, and the like for the generated video signal. An image memory 107 temporarily stores the video signal generated by the image signal processing circuit 104. An image processing circuit 105 converts the video signal into a predetermined video data format. A recording circuit 106 records an image in a recording medium (not shown).
A contrast evaluating circuit (contrast evaluator) 112 evaluates the contrast state in a plurality of predetermined areas in the video signal temporarily stored in the image memory 107. A shake detecting sensor 113 detects vibration information (shake information) of the imaging apparatus 10. A sensor driving circuit (driver) 111 changes the position of the image sensor 101 (the relative position between the image sensor 101 and the lens unit 100) in order to perform image stabilization processing and high-resolution processing. A CPU 109 governs control over the various circuits in the imaging apparatus 10, the focus control, the lens driving control, and the like. A memory 108 stores a program and data used by the CPU 109. A lens driving circuit (driver) 110 drives the focus lens, the zoom lens, the diaphragm, the shift lens, and the like in the lens unit 100. Signal processing (focus detection processing), which will be described later in this embodiment, is mainly executed by the CPU 109 and the phase difference detecting accelerator circuit 103. The control apparatus according to this embodiment includes at least the CPU 109 and the phase difference detecting accelerator circuit 103.
Referring now to
The image sensor 101 is a Bayer array type image sensor, and each pixel on the phase difference detecting line has two divided photoelectric conversion elements for each RGB pixel which share one micro lens. For example, the R pixel is halved into an A image pixel (first pixel) 202 and a B image pixel (second pixel) 203. Hereinafter, the images (image signals) output from the A image pixel 202 and the B image pixel 203 will be referred to as A image (A image signal) and B image (B image signal), respectively. Similarly, each of the G pixel (G1 pixel, G2 pixel) and the B pixel is also halved into a corresponding one of the A image pixels 204, 206, and 208 and a corresponding one of the B image pixels 205, 207, and 209.
The imaging apparatus 10 using the thus-configured image sensor 101 can generate an image signal to be recorded or processed, as in the prior art, by combining (adding) the A image output from the A image pixel 202 and the B image output from the B image pixel 203. In addition, the imaging apparatus 10 obtains an original signal used for the focus detection (phase difference detection) by separately treating the outputs of the A image pixel 202 and the B image pixel 203 as two images, or left and right divided images. On the other hand, in the block used only for the image generation, each RGB pixel has a single photoelectric conversion element for a single micro lens; this block includes the R pixel 211, the G1 pixel 212, the G2 pixel 213, and the B pixel 214.
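The dual use of the divided pixels described above can be sketched as follows. This is a minimal illustration, not the apparatus itself; the array values and the function name are assumptions for the example.

```python
import numpy as np

# Hypothetical sketch: each pixel on a phase difference detecting line is
# split into an A sub-pixel and a B sub-pixel sharing one micro lens.
# Adding A and B reproduces the normal image signal as in the prior art;
# keeping them separate yields the image pair used for phase difference
# detection.
def split_and_combine(a_image: np.ndarray, b_image: np.ndarray):
    """Return (combined image signal, (A image, B image))."""
    combined = a_image + b_image          # signal for recording/display
    return combined, (a_image, b_image)   # pair for the correlation step

a = np.array([10, 12, 30, 28], dtype=np.int32)   # A image samples
b = np.array([11, 13, 29, 27], dtype=np.int32)   # B image samples
combined, (a_img, b_img) = split_and_combine(a, b)
print(combined.tolist())  # [21, 25, 59, 55]
```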
Specifically, one evaluation value is generated by integrating, over each of the areas 510 to 550, the contrast evaluation value (such as a peak value after predetermined filtering, or a line-direction integrated value of adjacent differences) obtained on the phase difference detecting line 402. Similarly, one evaluation value each is generated by integrating, over each of the areas 510 to 550, the contrast evaluation values obtained in the areas 403, 404, and 405. Therefore, four contrast evaluation values are generated in the AF frame 401.
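One concrete reading of the "line-direction integrated value of adjacent differences" is sketched below, assuming the evaluation value is the sum of absolute adjacent differences along each line; the exact filter used by the contrast evaluating circuit 112 is not specified in the text.

```python
import numpy as np

# Hypothetical sketch of the per-area contrast evaluation: each line's
# evaluation value is the integrated absolute difference between adjacent
# pixels along the line direction, and one value per area is produced by
# integrating the per-line values.
def line_contrast(line: np.ndarray) -> float:
    return float(np.abs(np.diff(line.astype(np.int64))).sum())

def area_contrast(lines) -> float:
    # Integrate the per-line evaluation values into one value for the area.
    return sum(line_contrast(l) for l in lines)

area = [np.array([10, 10, 200, 200]),   # one strong edge: |190|
        np.array([50, 52, 51, 50])]     # nearly flat line: 2 + 1 + 1
print(area_contrast(area))  # 194.0
```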
Next follows a description of the operation of the imaging apparatus 10 according to this embodiment. When the imaging apparatus 10 starts up and the lenses, circuits, and the like are initialized in predetermined initialization processing, an image signal is taken from the image sensor 101. When a signal (image signal) for development to record or display an image is input to the image signal processing circuit 104, the A image and the B image from each phase difference detecting pixel are first added to each other (an added signal is generated) so that the pixel is treated as a single pixel, as in the prior art (normal, non-divided pixels require no such addition). Then, the image signal processing circuit 104 performs optical correction processing and electrical correction processing for the added signal, and temporarily stores it in the image memory 107. For example, when it is recorded as a captured image, the added signal is converted into a predetermined format (a motion image or still image format such as MPEG-2, MP4, or JPEG) via the image processing circuit 105 and the image memory 107, and recorded in the recording medium through the recording circuit 106.
The image information for the specified AF frame and specified area read out of the image memory 107 is input into the contrast evaluating circuit 112, the contrast is evaluated, and the contrast evaluation result is recorded in the memory 108. On the other hand, for the focus detection (phase difference detection), the output signal from the phase difference detecting pixels is input from the image sensor 101 into the divided image generating circuit 102, undergoes compression processing and correction processing based on predetermined settings, and the A image and the B image are generated. Here, two pixels are added in the horizontal direction and two pixels are added in the vertical direction for each of the A image and the B image, and the result is sent to the phase difference detecting accelerator circuit 103 as addition result data for RG(G1, G2)B. The phase difference detecting accelerator circuit 103 performs a correlation calculation for the phase difference detection, and the calculation result is temporarily output to the memory 108. The CPU 109 performs final processing on the calculation result and detects the defocus amount of the focus lens.
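The two-horizontal, two-vertical pixel addition described above amounts to 2x2 block binning of each divided image. A minimal sketch, assuming the image dimensions are even:

```python
import numpy as np

# Hypothetical sketch of the 2x2 addition applied to each of the A and B
# images before they are sent to the correlation calculation: two pixels
# are added in the horizontal direction and two in the vertical direction.
def bin_2x2(img: np.ndarray) -> np.ndarray:
    h, w = img.shape  # assumes even height and width
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

a_image = np.arange(16).reshape(4, 4)   # toy 4x4 divided image
print(bin_2x2(a_image).tolist())  # [[10, 18], [42, 50]]
```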
Referring now to
First, in the step S901, when receiving an instruction to start the AF operation (AF trigger) by half-pressing the release button or the like, the CPU 109 proceeds to the step S902. In the step S902, the CPU 109 sets an AF frame of a predetermined frame size at a position specified by the user, such as one center point. Next, in the step S903, the CPU 109 performs the focus detection processing on the AF frame (specified frame) set in the step S902. The details of the focus detection processing will be described later.
Next, in the step S904, the CPU 109 calculates a focus lens driving amount based on the focus detection data (the defocus amount obtained by the focus detection processing) on the specified frame. For example, if a nearly in-focus state is determined based on the obtained defocus amount, the CPU 109 sets the lens driving amount to the defocus amount itself, that is, the amount that moves the focus lens to the in-focus position, so that the in-focus state is obtained by the next lens driving. On the other hand, when the defocus amount is large, for example about 2 mm or more with a blurred area, the CPU 109 limits the driving amount, for example to about 1 mm, because the focus detection accuracy slightly deteriorates in such a state. Next, in the step S905, the CPU 109 drives the lens using the lens driving circuit 110 based on the lens driving amount determined in the step S904.
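The driving-amount decision in the step S904 can be sketched as follows. The 2 mm and 1 mm values come from the example above; treating them as function parameters and the clamping behavior for negative defocus are assumptions for illustration.

```python
# Hypothetical sketch of the step S904 decision: near the in-focus state
# the defocus amount itself is used as the lens driving amount; for a
# large defocus amount (about 2 mm or more) the drive is limited (to about
# 1 mm here) because the focus detection accuracy slightly deteriorates,
# and the remaining distance is covered by the next detection cycle.
def lens_driving_amount(defocus_mm: float,
                        large_defocus_mm: float = 2.0,
                        limited_drive_mm: float = 1.0) -> float:
    if abs(defocus_mm) >= large_defocus_mm:
        # Drive only part of the way toward focus, keeping the sign.
        return limited_drive_mm if defocus_mm > 0 else -limited_drive_mm
    return defocus_mm  # nearly in focus: drive by the defocus amount itself

print(lens_driving_amount(0.3))   # 0.3
print(lens_driving_amount(-3.5))  # -1.0
```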
Next, in the step S906, if the CPU 109 determines that the lens is in-focus by the lens driving in the step S905, the CPU 109 proceeds to the step S907 and ends the AF control. On the other hand, if the CPU 109 determines that the lens is not in-focus by the lens driving in the step S905 (in an out-of-focus state), the CPU 109 returns to the step S903 and repeats the steps S903 to S906.
Referring now to
First, in the step S1001 in
An operation mode A (first mode) is a wobbling or reciprocating mode with a fixed amount in which the imaging apparatus 10 is physically fixed, and used for a searching operation for an object with focus detection frames at multiple points or the like. An operation mode B (second mode) is a mode that changes the shift lens driving amount in accordance with the frame rate, and is mainly used for a high frame rate of the imaging apparatus 10. An operation mode C (third mode) is used for the defocus state, gives priority to the recorded image quality, and switches the driving amount based on the phase difference focus detection result and the image contrast evaluation result. This embodiment performs different focus detection processing as follows based on the three operation modes A, B, and C. However, this embodiment is not limited to these operation modes, and may have another operation mode. The operation mode may be selected by the CPU 109 in accordance with a user operation on an unillustrated operation unit, or the CPU 109 may select an operation mode suitable for a determination result of determining a scene based on the image signal.
In the operation mode A, the flow proceeds to the step S1102, and the CPU 109 selects a fixed value L1 as the shift lens driving amount. The fixed value L1 can be set arbitrarily; for example, it may be half the interval between the phase difference detecting lines. Next, in the step S1103, the CPU 109 determines whether or not the image frame used for the focus detection is an even-numbered frame. When the image frame is an even-numbered frame, the shift lens driving amount is maintained at the fixed value L1, and this flow ends. On the other hand, when the image frame is an odd-numbered frame, this flow ends after the sign of the fixed value L1 is inverted to set the shift lens driving amount to L1×(−1). Thereby, the shift lens wobbles or reciprocates with a width of the fixed value L1 on a frame-by-frame basis. In other words, the relative position between the imaging optical system and the image sensor is changed to each of the first position and the second position. Referring now to
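The frame-parity wobbling of operation mode A (the steps S1102 to S1103) can be sketched as follows; the concrete line interval is an assumed example value.

```python
# Hypothetical sketch of operation mode A: a fixed shift lens driving
# amount L1 (for example half the interval between the phase difference
# detecting lines) keeps its sign on even-numbered frames and is inverted
# on odd-numbered frames, so the shift lens wobbles between a first
# position (+L1) and a second position (-L1).
def mode_a_driving_amount(frame_number: int, l1: float) -> float:
    return l1 if frame_number % 2 == 0 else -l1

wd = 8.0      # assumed interval between phase difference detecting lines
l1 = wd / 2   # fixed value L1: half the line interval
print([mode_a_driving_amount(n, l1) for n in range(4)])  # [4.0, -4.0, 4.0, -4.0]
```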
On the other hand, if the operation mode B is selected in the step S1101 in
Hs = Wd / ((FrameRate / 30) / 2 + 1)    (1)
If FrameRate/30 is an odd number in the expression (1), 1 is subtracted from it in order to round it into an even number. Next, in the step S1107, the CPU 109 calculates the driving direction with the current frame, and ends this flow.
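Expression (1) together with the even-rounding rule can be sketched as follows; the Wd and frame rate values are assumed examples.

```python
# Hypothetical sketch of expression (1) for operation mode B: the shift
# amount Hs is derived from the detecting-line interval Wd and the frame
# rate. When FrameRate/30 is odd, 1 is subtracted to round it into an
# even number before applying the expression.
def mode_b_shift_amount(wd: float, frame_rate: int) -> float:
    ratio = frame_rate // 30
    if ratio % 2 == 1:
        ratio -= 1          # round an odd FrameRate/30 down to even
    return wd / (ratio / 2 + 1)

print(mode_b_shift_amount(8.0, 60))            # 2/2+1 = 2  -> 4.0
print(mode_b_shift_amount(8.0, 90))            # 3 -> 2     -> 4.0
print(round(mode_b_shift_amount(8.0, 120), 2)) # 4/2+1 = 3  -> 2.67
```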
When the operation mode C is selected in the step S1101 in
When the contrast of the phase difference detecting line is lower than the contrast th1 in the step S1108, the flow proceeds to the step S1109. In the step S1109, the CPU 109 extracts, from among the three areas (lines) other than the phase difference detecting line, the line that provides the maximum contrast evaluation value equal to or higher than the contrast th2. In other words, the CPU 109 determines whether or not there is a highest-contrast line with a value equal to or higher than the contrast th2.
If there is no line with the contrast th2 or higher in the step S1109, since the contrast is low in any of the lines, the flow proceeds to the step S1111 and the CPU 109 sets the driving amount to zero. On the other hand, if there is the line having the contrast th2 or higher in the step S1109, the flow proceeds to the step S1110. In the step S1110, the CPU 109 sets the driving amount from the current state to that line (having the contrast th2 or higher), and ends this flow.
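The mode C branching (the steps S1108 to S1111) can be sketched as follows. The threshold values, the line indexing, and the idea that the driving amount equals the line offset from the current position are all assumptions for illustration.

```python
# Hypothetical sketch of the operation mode C decision: if the phase
# difference detecting line itself has enough contrast (>= th1), no shift
# is needed; otherwise the highest-contrast other line at or above th2
# becomes the target; if no line reaches th2, the driving amount is zero.
def mode_c_driving_amount(line_contrasts, current_index, th1, th2):
    if line_contrasts[current_index] >= th1:
        return 0.0                      # detecting line already usable
    best = max((c, i) for i, c in enumerate(line_contrasts)
               if i != current_index)   # (contrast, index) of best line
    if best[0] < th2:
        return 0.0                      # every line is low contrast
    return float(best[1] - current_index)  # drive toward the best line

# Index 0 is the phase difference detecting line; 1-3 are the other areas.
print(mode_c_driving_amount([10, 40, 90, 60], 0, th1=100, th2=50))  # 2.0
```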
When the shift lens driving amount in the step S1001 in
Next, in the step S1004, the CPU 109 generates the A image and the B image using the divided image generating circuit 102. Next, in the step S1005, the CPU 109 performs band-pass filter processing for the A image and the B image using the phase difference detecting accelerator circuit 103. Next, in the step S1006, the CPU 109 performs a correlation calculation between the A image and the B image using the phase difference detecting accelerator circuit 103. Next, in the step S1007, the CPU 109 evaluates the reliability as to whether the focus detection result is correct or not based on the information of the correlation calculation process and the calculated contrasts of the A image and the B image. Next, in the step S1008, the CPU 109 obtains the final focus detection result and ends the focus detection processing.
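The correlation calculation in the step S1006 can be sketched as a sum-of-absolute-differences search; the text does not name the exact correlation metric of the accelerator circuit 103, so SAD here is an assumption, and the test signals are toy data.

```python
import numpy as np

# Hypothetical sketch of the correlation calculation: the B image is
# shifted against the A image, the sum of absolute differences (SAD) is
# evaluated over the overlap at each shift, and the shift with the
# minimum SAD is taken as the phase difference between the two images.
def phase_difference(a_img: np.ndarray, b_img: np.ndarray, max_shift: int) -> int:
    best_shift, best_sad = 0, None
    n = len(a_img)
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)    # overlapping region
        sad = np.abs(a_img[lo:hi].astype(np.int64)
                     - b_img[lo - s:hi - s]).sum()
        if best_sad is None or sad < best_sad:
            best_shift, best_sad = s, sad
    return best_shift

a = np.array([0, 0, 10, 50, 10, 0, 0, 0])
b = np.array([0, 0, 0, 0, 10, 50, 10, 0])   # A pattern shifted by 2
print(phase_difference(a, b, max_shift=3))  # -2
```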
In this way, the CPU 109 drives the shift lens so as to change the relative relationship (relative position) in the optical-axis orthogonal direction between the shift lens and the image sensor 101 to the first position and the second position. Thus, even when the high-contrast position such as the eye of the object 410 illustrated in
While this embodiment describes that the shift lens is driven in the optical-axis orthogonal direction, the present invention is not limited to this embodiment and may drive the image sensor 101 in the optical-axis orthogonal direction so as to change the relative position between the imaging position of light from the lens unit 100 and the image sensor 101.
This embodiment can change the positional relationship between the object position and the phase difference detecting line by a zooming operation that changes the focal length, or by driving the zoom lens in the lens unit 100 in the optical axis direction. Then, the relative position between the lens unit (optical system) 100 and the image sensor 101 changes in the optical axis direction. Hence, the relative position may be changed at a timing other than the recording timing of the image so as to avoid affecting the recorded image, in particular in the one-shot AF and in a live view without recording. Further, this embodiment can change the relative position every predetermined period, such as every image frame (per frame rate).
This embodiment provides the image sensor 101 with a plurality of phase difference detecting lines at predetermined intervals (thinning intervals), but the present invention is not limited to this embodiment. For example, this embodiment is applicable to a configuration that provides phase difference detecting pixels over the entire surface of the image sensor (so as to achieve the two-image division on the entire surface) and performs thinned-out reading due to factors such as a high frame rate scheme or limited image processing capability. Further, this embodiment provides phase difference detecting lines thinned out in the vertical direction, but the present invention is not limited to this embodiment and may be applied to phase difference detecting lines thinned out in the horizontal direction. In order to prevent a recorded or displayed image from shifting when the shift lens or the like is driven for the phase difference detection, a correction (electronic image stabilization) may be made that shifts, by the shift amount, the position at which the recorded image is cut out of the overall image output from the image sensor. This embodiment can repetitively perform the focus detection on the object with high accuracy.
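The electronic-image-stabilization style correction mentioned above can be sketched as a crop window offset by the shift amount; the crop geometry (a centered window, integer pixel shifts) is an assumption for illustration.

```python
import numpy as np

# Hypothetical sketch: when the shift lens (or sensor) is moved for phase
# difference detection, the crop window cut out of the overall sensor
# output is moved by the same shift so that the recorded or displayed
# image does not appear to move.
def crop_with_shift(full_image: np.ndarray, crop_h: int, crop_w: int,
                    shift_y: int, shift_x: int) -> np.ndarray:
    h, w = full_image.shape
    top = (h - crop_h) // 2 + shift_y    # offset the centered crop window
    left = (w - crop_w) // 2 + shift_x
    return full_image[top:top + crop_h, left:left + crop_w]

full = np.arange(36).reshape(6, 6)       # toy 6x6 sensor output
print(crop_with_shift(full, 2, 2, 1, 0).tolist())  # [[20, 21], [26, 27]]
```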
Referring now to
First, in the step S1201, the CPU 109 determines whether or not it is a shift lens driving timing. Referring now to
First, in the step S801, the CPU 109 detects a motion vector (motion vector information) based on the latest captured image information (image signal). Next, in the step S802, the CPU 109 analyzes the output signal from the shake detecting sensor 113 and detects a fluctuation amount (shake information) of the imaging apparatus 10. Next, in the step S803, the CPU 109 calculates a duration for which the shift lens is not driven, based on the motion vector detected in the step S801 and the fluctuation amount of the imaging apparatus 10 detected in the step S802. This duration is a period over which the object area entering the phase difference detecting line is expected to change in accordance with the movement of the object and the motion (each moving amount) of the imaging apparatus 10. In other words, when the object and the imaging apparatus 10 fluctuate largely (when each moving amount is large), the relationship between the object position and the phase difference detecting line changes even if the shift lens and the image sensor 101 are not actively driven. Hence, setting the duration can reduce the influence on the recorded image.
This embodiment converts each moving amount into a number of lines on the captured image, and sets a long duration (a period for which the shift lens is not driven) when the moving amount is equal to or larger than a predetermined amount; when the moving amount is small, a short duration is set. For example, when the moving amount is equal to or larger than the interval Wd between the phase difference detecting lines, a long duration for which the shift lens is not driven is maintained. When the moving amount is at least half the interval Wd but smaller than Wd, the duration is maintained for a predetermined number of frames, such as three frames. When the moving amount is small, the shift lens is actively driven without setting a duration.
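The duration decision in the steps S801 to S803 can be sketched as follows. The concrete frame counts, the Wd value, and the threshold interpretation are assumptions for illustration; the text only fixes the three-frame example and the Wd comparison.

```python
# Hypothetical sketch of the no-drive duration decision: the moving amount
# (derived from the motion vector and the shake sensor) is converted into
# a number of lines on the captured image, and the period for which the
# shift lens is not driven is chosen accordingly.
def no_drive_duration_frames(moving_amount_lines: float, wd: float) -> int:
    if moving_amount_lines >= wd:
        return 6            # large motion: rely on the scene's own movement
    if moving_amount_lines >= wd / 2:
        return 3            # moderate motion: hold for a few frames
    return 0                # small motion: drive the shift lens actively

wd = 8.0                    # assumed interval between detecting lines
print([no_drive_duration_frames(m, wd) for m in (10.0, 5.0, 1.0)])  # [6, 3, 0]
```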
At the shift lens driving timing in the step S1201 in
As described above, this embodiment aggressively captures the object and maintains the focus detection performance in the AF frame, even when the phase difference detecting lines are thinned out. In addition to the focus detection of the phase difference method, the contrast evaluation of the AF frame area of the image may be introduced to detect the object position, thereby improving the focus detection frequency by the phase difference method and continuing the accurate and fast focus detection processing.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
According to the embodiments, it is possible to provide a control apparatus, an imaging apparatus, a control method, and a storage medium capable of performing high-precision and high-speed focus detection.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2018-058643, filed on Mar. 26, 2018, which is hereby incorporated by reference herein in its entirety.