The present technique relates to imaging apparatuses. More particularly, the present technique relates to an imaging apparatus that performs focus control, a method of controlling the imaging apparatus, and a program for causing a computer to implement the method.
In recent years, imaging apparatuses, such as a digital video camera (a recorder with a camera, for example) that generates images (image data) by imaging objects such as scenery and persons, and records the generated images as image content, have become widely popular. Also, a large number of imaging apparatuses that automatically perform focus control have been proposed so as to prevent imaging failures caused by user operations.
For example, there is a suggested imaging apparatus that performs focus control by using the intensity of the contrast in image data. Also, there is a suggested imaging apparatus that estimates the position to which the focus lens is to be moved, by using two images captured at different focal distances (see Patent Document 1, for example).
By the above described conventional technique, focus control can be performed with the use of image data, and there is no need to provide an additional device for performing focus control in an imaging apparatus.
However, depending on objects and imaging conditions, the focus lens might be moved to a position that is not the focusing position, or it might take time to move the focus lens to the focusing position. Therefore, it is critical to perform focus control appropriately in accordance with objects and imaging conditions.
The present technique has been developed in view of those circumstances, and aims to perform appropriate focus control.
The present technique has been developed to solve the above problems, and a first aspect thereof is an imaging apparatus that includes a control unit that performs control to set a first mode or a second mode based on a predetermined condition, the first mode being for performing an auto focus process by moving a focus lens based on the contrast in an image generated by an imaging unit, the second mode being for performing an auto focus process by moving the focus lens based on the result of a matching process performed between a first image and a second image, the first image and the second image being generated by the imaging unit with the focus lens being located in different positions. The first aspect of the present technique is also a method of controlling the imaging apparatus, and a program for causing a computer to implement the method. Accordingly, the first mode or the second mode is effectively set based on the predetermined condition.
In the first aspect, the control unit may determine whether switching between the first mode and the second mode is necessary based on the position of the focus lens and a history of the result of the matching process. Accordingly, necessity or unnecessity of switching between the first mode and the second mode is effectively determined based on the position of the focus lens and a history of the result of the matching process.
In the first aspect, in a case where the first mode is set, the control unit may perform control to set the second mode when the predetermined condition is satisfied, the predetermined condition being that the focus lens converges on one position, and the difference between the position of the focus lens and a focusing position estimated based on a history of the result of the matching process is larger than a threshold value. Accordingly, where the first mode is set, the second mode can be effectively set when the focus lens converges on one position, and the difference between the position of the focus lens and the focusing position estimated based on the history of the result of the matching process is larger than the threshold value.
In the first aspect, in a case where the second mode is set, the control unit may perform control to set the first mode when the predetermined condition is satisfied, the predetermined condition being that the difference between the position of the focus lens and a focusing position estimated based on a history of the result of the matching process is smaller than a threshold value. Accordingly, where the second mode is set, the first mode can be effectively set when the difference between the position of the focus lens and the focusing position estimated based on the history of the result of the matching process is smaller than the threshold value.
In the first aspect, in a case where the second mode is set, the control unit may perform control to set the first mode when the predetermined condition is satisfied, the predetermined condition being that the difference between the position of the focus lens and a focusing position estimated based on a history of the result of the matching process is smaller than a threshold value, and a weighted distribution of the history of the estimated focusing position is smaller than a threshold value. Accordingly, where the second mode is set, the first mode can be effectively set when the difference between the position of the focus lens and the focusing position estimated based on the history of the result of the matching process is smaller than the threshold value, and the weighted distribution of the history of the estimated focusing position is smaller than the threshold value.
In the first aspect, the imaging apparatus may further include a posture detecting unit that detects a change in the posture of the imaging apparatus, and the control unit may determine whether the switching is necessary without using the result of the matching process as the history when the detected posture change is larger than a threshold value. Accordingly, necessity or unnecessity of the switching is effectively determined without the use of the result of the matching process as the history when the detected posture change is larger than the threshold value.
In the first aspect, the control unit may determine whether the switching is necessary without using the result of the matching process as the history when the difference between the luminance detection value in the first image and the luminance detection value in the second image is larger than a threshold value. Accordingly, necessity or unnecessity of the switching is effectively determined without the use of the result of the matching process as the history when the difference between the luminance detection value in the first image and the luminance detection value in the second image is larger than the threshold value.
In the first aspect, the control unit may determine whether the switching is necessary without using the result of the matching process as the history when the difference between the aperture value at the time of the generation of the first image and the aperture value at the time of the generation of the second image is larger than a threshold value. Accordingly, necessity or unnecessity of the switching is effectively determined without the use of the result of the matching process as the history when the difference between the aperture value at the time of the generation of the first image and the aperture value at the time of the generation of the second image is larger than the threshold value.
In the first aspect, the imaging apparatus may further include a posture detecting unit that detects a change in the posture of the imaging apparatus, and the control unit may perform control to set the first mode when the detected posture change is larger than a threshold value. Accordingly, the first mode is effectively set when the detected posture change is larger than the threshold value.
In the first aspect, the control unit may perform control to set the first mode when the difference between the luminance detection value in the first image and the luminance detection value in the second image is larger than a threshold value. Accordingly, the first mode is effectively set when the difference between the luminance detection value in the first image and the luminance detection value in the second image is larger than the threshold value.
In the first aspect, the control unit may perform control to set the first mode when the difference between the aperture value at the time of the generation of the first image and the aperture value at the time of the generation of the second image is larger than a threshold value. Accordingly, the first mode is effectively set when the difference between the aperture value at the time of the generation of the first image and the aperture value at the time of the generation of the second image is larger than the threshold value.
According to the present technique, an excellent effect can be achieved so as to perform appropriate focus control.
The following is a description of modes (hereinafter referred to as embodiments) for carrying out the present technique. Explanation will be made in the following order.
1. First Embodiment (Focus Control: Example in which contrast AF mode and 2-image matching AF mode are switched based on predetermined conditions)
The imaging apparatus 100 includes an imaging lens 101, an imaging device 102, an analog signal processing unit 103, an A/D (Analog/Digital) converter 104, and a digital signal processing unit 105. The imaging apparatus 100 also includes a liquid crystal panel 106, a viewfinder 107, a recording device 108, an object detecting unit 109, a gyro sensor 110, and a control unit 120. The imaging apparatus 100 also includes an EEPROM (Electrically Erasable and Programmable Read Only Memory) 131. The imaging apparatus 100 also includes a ROM (Read Only Memory) 132 and a RAM (Random Access Memory) 133. The imaging apparatus 100 also includes an operation unit 140, a TG (Timing Generator) 151, a motor driver 152, a focus lens drive motor 153, and a zoom lens drive motor 154. The imaging apparatus 100 is realized by a digital still camera or a digital video camera (such as a recorder with a camera) that can perform an AF (Auto Focus) process, for example.
The imaging lens 101 is a lens that gathers light from an object, and supplies the gathered light to the imaging device 102. The imaging lens 101 includes a zoom lens, a focus lens, an iris, an ND (Neutral Density) mechanism, a shift-type image-stabilizing lens, and the like. The zoom lens is a lens for continuously changing the focal length. The focus lens is a lens for focusing on the object. The iris is designed to change the aperture diameter. The ND mechanism is a mechanism for inserting an ND filter. The shift-type image-stabilizing lens is a lens for correcting shaking of the user's hand during an image capturing operation. The focus lens is driven by the focus lens drive motor 153, and moves back and forth with respect to the object. In this manner, a focusing function is realized. The zoom lens is driven by the zoom lens drive motor 154, and moves back and forth with respect to the object. In this manner, a zooming function is realized.
The imaging device 102 is a photoelectric conversion element that receives light entering from the object via the imaging lens 101, and converts the light into an electrical signal (image signal). The image signal (analog signal) generated through this conversion is supplied to the analog signal processing unit 103. That is, an optical image of the object that enters via the imaging lens 101 is formed in the imaging area of the imaging device 102, and the imaging device 102 performs an imaging operation in that situation, to generate an image signal (analog signal). The imaging device 102 is driven by the TG 151. The imaging device 102 may be a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like.
Under the control of the control unit 120, the analog signal processing unit 103 performs analog processing such as denoising on the image signal (analog signal) supplied from the imaging device 102. The analog signal processing unit 103 supplies the image signal (analog signal) subjected to the analog processing, to the A/D converter 104.
Under the control of the control unit 120, the A/D converter 104 converts the image signal (analog signal) supplied from the analog signal processing unit 103 into a digital signal, and supplies the A/D-converted image signal (digital signal) to the digital signal processing unit 105.
Under the control of the control unit 120, the digital signal processing unit 105 performs digital processing such as gamma correction on the image signal (digital signal) supplied from the A/D converter 104, and supplies the digitally-processed image signal (digital signal) to respective components. For example, the digital signal processing unit 105 supplies the digitally-processed image signal (digital signal) to the liquid crystal panel 106 and the viewfinder 107 to display images. The digital signal processing unit 105 also performs a compressing process on the digitally-processed image signal (digital signal), and supplies the image data subjected to the compressing process (compressed image data), to the recording device 108 to record the image data.
The liquid crystal panel 106 is a display panel that displays respective images based on the image signal (image data) supplied from the digital signal processing unit 105. The liquid crystal panel 106 displays the image signal (image data) supplied from the digital signal processing unit 105 as a through image, for example. The liquid crystal panel 106 also displays image data recorded in the recording device 108 as a list image, for example. The liquid crystal panel 106 may be a display panel such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) panel.
The viewfinder 107 is an electronic viewfinder (EVF) that displays respective images based on the image signal (image data) supplied from the digital signal processing unit 105.
The recording device 108 is a recording device that records the image signal (image data) supplied from the digital signal processing unit 105. The recording device 108 also supplies recorded image data to the digital signal processing unit 105. The recording device 108 may be included in the imaging apparatus 100, or may be detachable from the imaging apparatus 100. The recording device 108 may be a flash memory or a DV tape.
Under the control of the control unit 120, the object detecting unit 109 analyzes the image signal (image data) supplied from the digital signal processing unit 105, and detects the object included in the image. The detection result (detection information) is then output to the control unit 120. For example, the object detecting unit 109 detects the face of a person included in the image corresponding to the image signal (image data) supplied from the digital signal processing unit 105, and outputs face information about the detected face to the control unit 120. The face detection method may be a method of detecting a face by performing matching between a template in which luminance distribution information about the face is recorded and an actual image (see JP 2004-133637 A, for example), a method of detecting a face based on the feature quantity of the skin-colored portion or the face of the person included in the image data, or the like. The face detection information contains the position and the size of the detected face in the image. The object detecting unit 109 also has the function of recognizing, from the image signal (image data) supplied from the digital signal processing unit 105, the object to be tracked by the AF.
The gyro sensor 110 detects the angular velocity of the imaging apparatus 100, and outputs the detected angular velocity to the control unit 120. Since the angular velocity of the imaging apparatus 100 is detected by the gyro sensor 110, a change of the posture of the imaging apparatus 100 is detected. A sensor (such as an acceleration sensor) other than a gyro sensor may be used to detect acceleration, motion, tilt, and the like of the imaging apparatus 100, and detect the posture or a change of the posture of the imaging apparatus 100 based on the results of the detection.
The control unit 120 includes a CPU (Central Processing Unit), and controls respective processes to be performed by the imaging apparatus 100 based on a program stored in the ROM 132. For example, the control unit 120 performs the respective processes to realize respective functions such as an AF function to focus on the object, an AE (Auto Exposure) function to adjust brightness, and a WB (White Balance) function to adjust the white balance. The control unit 120 also outputs, to the motor driver 152, control information about the focus lens for focus tracking in accordance with AF, a manual operation, or a zooming operation, control information about the zoom lens in accordance with a zooming operation, and the like.
The EEPROM 131 is a nonvolatile memory that is capable of holding data while the imaging apparatus 100 is off, and stores image data, various kinds of auxiliary information, and various kinds of setting information.
The ROM 132 is a memory that stores the program, the operational parameters, and the like to be used by the control unit 120.
The RAM 133 is a working memory that stores a program to be used by the control unit 120, parameters that vary at the time of execution of the program, and the like.
The operation unit 140 receives an operation input from a user, such as a REC button (recording button) operation, a zooming operation, or a touch panel operation, and supplies the content of the received operation input to the control unit 120.
Under the control of the control unit 120, the TG 151 generates a drive control signal for driving the imaging device 102, and causes the imaging device 102 to be driven.
Under the control of the control unit 120, the motor driver 152 drives the focus lens drive motor 153 and the zoom lens drive motor 154 to drive respective lenses (the focus lens, the zoom lens, and the like). Specifically, the motor driver 152 converts control signals (control information for driving the respective motors) output from the control unit 120 into voltages, and outputs the respective converted voltages to the focus lens drive motor 153 and the zoom lens drive motor 154. The motor driver 152 then drives the respective motors to drive the respective lenses.
The focus lens drive motor 153 is a motor that moves the focus lens based on the voltage that is output from the motor driver 152. The zoom lens drive motor 154 is a motor that moves the zoom lens based on the voltage that is output from the motor driver 152.
The imaging apparatus 100 includes a posture detecting unit 210, an imaging unit 220, an image processing unit 230, a recording control unit 240, a content storage unit 241, a display control unit 250, and a display unit 251. The imaging apparatus 100 also includes a control unit 260, a history information holding unit 261, a contrast AF processing unit 270, a 2-image matching AF processing unit 280, and an operation receiving unit 290.
The posture detecting unit 210 detects a change in the posture (angular velocity) of the imaging apparatus 100, and outputs information (posture information) about the detected change in the posture (angular velocity) to the control unit 260. The posture detecting unit 210 corresponds to the gyro sensor 110 shown in
The imaging unit 220 generates image data (an image signal), and outputs the generated image data to the image processing unit 230, the contrast AF processing unit 270, and the 2-image matching AF processing unit 280. The imaging unit 220 also moves the focus lens to realize the AF function under the control of the contrast AF processing unit 270 or the 2-image matching AF processing unit 280. The imaging unit 220 corresponds to the imaging lens 101, the imaging device 102, the focus lens drive motor 153, and the zoom lens drive motor 154 shown in
In accordance with an instruction from the control unit 260, the image processing unit 230 performs various kinds of image processing on the image data output from the imaging unit 220, and outputs the image data subjected to the various kinds of image processing, to the recording control unit 240, the display control unit 250, and the control unit 260. The image processing unit 230 corresponds to the analog signal processing unit 103, the A/D converter 104, and the digital signal processing unit 105 shown in
In accordance with an instruction from the control unit 260, the recording control unit 240 performs recording control on the content storage unit 241. For example, the recording control unit 240 records the image data output from the image processing unit 230 as image content (a still image file or a moving image file) in the content storage unit 241. The recording control unit 240 corresponds to the digital signal processing unit 105 and the control unit 120 shown in
The content storage unit 241 is a recording medium that stores various kinds of information (such as image content) under the control of the recording control unit 240. The content storage unit 241 corresponds to the recording device 108 shown in
The display control unit 250 causes the display unit 251 to display the image output from the image processing unit 230 in accordance with an instruction from the control unit 260. The display control unit 250 corresponds to the digital signal processing unit 105 and the control unit 120 shown in
The display unit 251 is a display panel that displays respective images under the control of the display control unit 250. The display unit 251 corresponds to the liquid crystal panel 106 and the viewfinder 107 shown in
The control unit 260 controls the respective components in the imaging apparatus 100 based on a control program stored in a memory (not shown). For example, the control unit 260 performs control to set the contrast AF mode (first mode) or the 2-image matching AF mode (second mode) based on predetermined conditions. Here, the contrast AF mode is the mode in which the contrast AF processing unit 270 performs an auto focus process by moving the focus lens based on the contrast in an image generated by the imaging unit 220. The 2-image matching AF mode is the mode in which the 2-image matching AF processing unit 280 performs an auto focus process by moving the focus lens based on the result of a matching process between a first image and a second image. The first image and the second image are two images that are generated by the imaging unit 220, with the focus lens being located in different positions. The control on the setting of those modes will be described later in detail, with reference to
The history information holding unit 261 is a holding unit that sequentially holds histories of the results of matching processes performed by the 2-image matching AF processing unit 280. The history information holding unit 261 corresponds to the RAM 133 shown in
The contrast AF processing unit 270 performs an auto focus process by moving the focus lens based on the contrast in an image generated by the imaging unit 220 in a case where the contrast AF mode is set. The contrast AF process will be described later in detail, with reference to
The 2-image matching AF processing unit 280 performs an auto focus process by moving the focus lens based on the result of a matching process between the first image and the second image generated by the imaging unit 220, with the focus lens being located in different positions. The 2-image matching AF process will be described later in detail, with reference to
The operation receiving unit 290 is an operation receiving unit that receives an operation that is input by a user, and outputs control signals (operation signals) in accordance with the contents of the received operation, to the control unit 260. The operation receiving unit 290 corresponds to the operation unit 140 shown in
The contrast AF process is now described. At present, imaging apparatuses, such as digital video cameras (recorders with cameras), that have the function of automatically focusing on a principal object (an AF function) during a moving image capturing operation are widely used. This AF function may be a contrast AF function that performs focus control based on contrast measurement, for example. With this contrast AF function, the intensity of the contrast in image data acquired via a lens is determined, and the position of the focus lens is determined accordingly.
Specifically, with the contrast AF function, focus control is performed by using intensity information about the contrast in an image acquired in the imaging apparatus 100. For example, a specific area in a captured image is set as a signal acquisition area (a spatial frequency extraction area) for focus control. This specific area is also referred to as a range measurement frame (frequency detection frame). As the contrast in the specific area becomes higher, the specific area is determined to be in focus. As the contrast becomes lower, the specific area is determined to be out of focus. In view of this, the contrast AF function drives and adjusts the focus lens to such a position as to achieve the highest contrast.
Specifically, the high-frequency component of the specific area is extracted, integral data of the extracted high-frequency component is generated, and the intensity of the contrast is determined based on the generated integral data of the high-frequency component.
That is, images are acquired while the focus lens is moved from one position to other positions, and a filtering process (such as high-pass filtering) is performed on the luminance signals of the respective images, to obtain AF evaluation values indicating the contrast intensities of the respective images.
In a case where there is an object that comes into focus at some position of the focus lens, the AF evaluation value plotted against the position of the focus lens draws a curve. The position of the peak of this curve (or the position where the contrast value of the image becomes largest) is the focusing position.
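For illustration only, the contrast evaluation described above can be sketched in Python as follows. The function names, the simple difference-based high-pass filter, and the capture_at callable (standing in for image acquisition with the focus lens at a given position) are assumptions made for this sketch and are not part of the embodiment itself.

```python
import numpy as np

def af_evaluation_value(luma, frame):
    """Contrast AF evaluation value: high-frequency energy inside the
    range measurement frame (illustrative; a real implementation would
    use a dedicated high-pass filter and integration circuit)."""
    y0, y1, x0, x1 = frame                       # range measurement frame
    roi = luma[y0:y1, x0:x1].astype(np.float64)
    # Simple high-pass filtering: absolute horizontal and vertical differences.
    return np.abs(np.diff(roi, axis=0)).sum() + np.abs(np.diff(roi, axis=1)).sum()

def find_focus_position(capture_at, lens_positions, frame):
    """Scan focus lens positions and return the position that gives the
    highest evaluation value, i.e. the peak of the evaluation curve."""
    values = [af_evaluation_value(capture_at(p), frame) for p in lens_positions]
    return lens_positions[int(np.argmax(values))]
```

In an actual apparatus the evaluation value would be computed on successive frames while the focus lens is driven, rather than by scanning a fixed list of positions.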
As described above, in the contrast AF, a focusing operation is performed based only on the information about an image formed by an imaging device (an imager), and therefore, there is no need to provide a range measuring optical system as well as an imaging optical system in the imaging apparatus. In view of this, the contrast AF is widely used in imaging apparatuses such as digital still cameras and digital video cameras.
However, with the contrast AF, correct focusing might not be performed when the object satisfies a certain condition, such as when a high-luminance point source is included in the object.
As described above, with the contrast AF, as the focus lens approaches the focusing position, the contrast evaluation value becomes higher. As the focus lens moves further away from the focusing position, the contrast evaluation value becomes lower. However, when a high-luminance point source is included in the object, the contrast evaluation value might not have its peak at the focusing position, and the focus lens might be moved to a position that is not the focusing position.
There is also an AF function other than the contrast AF: an AF function realized by a 2-image matching process (see JP 2011-128623 A, for example). The AF function realized by the 2-image matching process will be described below in detail, with reference to
The 2-image matching process is a process to estimate a focusing position by matching two images that are generated by changing the position of the focus lens (see JP 2011-128623 A, for example). The 2-image matching AF process is also an AF process to move the focus lens based on the focusing position estimated by the 2-image matching process (see JP 2011-128623 A, for example).
In the 2-image matching process, the distance (an arrow 336) to the focusing position (an arrow 335) is estimated by using the two images 331 and 332 generated in the two different focus lens positions. This calculation is repeated several times, to increase precision.
Here, the change in blurring between the images 331 and 332 can be modeled by an image conversion function P expressed by the equation (1) shown below. In the equation (1), fA represents the image 331, and fB represents the image 332.
fA*P=fB Equation (1)
Here, * represents a two-dimensional convolution. Also, the image conversion function P can be approximated by using a series of convolutions with a blurring kernel K, as shown below in the equation (2).
P=K*K*K* . . . *K Equation (2)
As the blurring kernel K, the following matrix may be used, for example.
Here, the amount of the blurring difference between the images 331 and 332 can be measured by counting the number of convolutions in the equation (2). That is, the amount of the blurring difference can be measured by counting the number of convolutions used until the images 331 and 332 become the same. An example of the result of this measurement calculation is shown in
Here, the number of iterations indicated by the ordinate axis corresponds to the amount of the blurring difference between the two images, and serves as an index value for estimating the distance from the current position of the focus lens to the focusing position.
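The convolution-counting idea of equations (1) and (2) can be sketched as follows, assuming a 3x3 Gaussian-like kernel in place of the blurring kernel K (whose actual matrix is not reproduced here) and using the mean absolute difference as the matching criterion; both choices are assumptions made for this illustration.

```python
import numpy as np
from scipy.signal import convolve2d

# Assumed stand-in for the blurring kernel K; the matrix used in the
# embodiment is not shown in this text.
K = np.array([[1, 2, 1],
              [2, 4, 2],
              [1, 2, 1]], dtype=np.float64) / 16.0

def count_blur_iterations(f_a, f_b, max_iter=50):
    """Repeatedly convolve the sharper image f_a with K (equation (2)) and
    count how many convolutions bring it closest to f_b. The count is an
    index of the blurring difference between the two images."""
    current = f_a.astype(np.float64)
    target = f_b.astype(np.float64)
    best_iter = 0
    best_err = np.abs(current - target).mean()
    for i in range(1, max_iter + 1):
        current = convolve2d(current, K, mode='same', boundary='symm')
        err = np.abs(current - target).mean()
        if err < best_err:
            best_iter, best_err = i, err
    return best_iter
```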
The images to be used in the 2-image matching process are now described. When the 2-image matching process is performed, central areas (certain proportions) of captured images can be extracted during the AF of a moving image, and be used in the calculation. Instead of the central areas of captured images, designated areas (areas (such as rectangular areas) in captured images) designated by the user of the imaging apparatus 100 may be extracted, and be used in the calculation. In this case, the user can focus on a particularly desired object. However, in the peripheral portions of the captured images, the precision of focusing position estimation might become lower due to a phenomenon called lens aberration. Therefore, there is a possibility that the precision of determination as to switching between the contrast AF mode and the 2-image matching AF mode will become lower, or the time required for the AF in the 2-image matching AF mode will become longer. In view of this, areas close to the centers of captured images are preferably designated as the designated areas.
In the 2-image matching AF process, two images with different focus lens positions are required in the calculation, and therefore, the time intervals at which an output is obtained become longer than those in the contrast AF process. As a result, the precision of position estimation in the vicinity of the focusing position might become lower than that in the contrast AF process.
In the 2-image matching AF process, a certain time interval is also required to acquire two images. Therefore, the precision of focusing position estimation might become lower in a situation where the user is moving the imaging apparatus 100 (a situation where panning or tilting is being performed, for example), a situation where the object is moving or transforming, or the like.
In the 2-image matching AF process, the difference between the focus lens positions of two images is also used. Therefore, in a situation where the size of the aperture of the imaging unit 220 (the imaging lens 101) is changing by virtue of an automatic exposure adjustment function or the like, the precision of focusing position estimation might become lower.
Therefore, if an AF process (a 2-image matching AF process) based on a focusing position estimated by a 2-image matching process is performed in the above described situations, the focus lens might be moved to a wrong position.
In view of this, in the first embodiment of the present technique, appropriate focus control is performed in such situations.
In a case where the contrast AF mode is set, the control unit 260 first determines whether the focus lens converges on one position (351).
As described above, only when the focus lens converges on one position, is the following process (to determine whether a first condition is satisfied) performed. For example, the contrast AF process has a larger number of outputs in a certain period of time than the 2-image matching AF process, and can advantageously cope with movement of the object. With such favorable features being taken into account, the AF mode can be switched to the 2-image matching AF mode, only when a wrong focusing position is estimated by the contrast AF process. However, in a case where the object being imaged is an object not compatible with the contrast AF process, the focus lens might keep moving in a wrong direction until converging on one position.
Therefore, in a case where the contrast AF mode is set, the following process (to determine whether the first condition is satisfied) may be unconditionally performed at regular intervals (or irregularly). That is, in a case where the contrast AF mode is set, a processing result (an index value) of the 2-image matching process may be regularly or irregularly referenced. In such a case, the focus lens can be prevented from continuing to move in a wrong direction even if the object is not compatible with the contrast AF process. Also, in a case where the focus lens is relatively far from the focusing position, the AF mode can be promptly switched to the 2-image matching AF mode.
In a case where the focus lens converges on one position (351), the control unit 260 references history information (one or more histories of focusing position estimation index values obtained by the 2-image matching process) held by the history information holding unit 261. The control unit 260 then determines whether the first condition is satisfied (352). The first condition is that the difference between the focusing position estimated based on the histories of the processing results of the 2-image matching process and the current position of the focus lens is equal to or larger than a threshold value. The estimated focusing position determined based on the histories may be the average of the estimated focusing positions held by the history information holding unit 261, for example.
In a case where the first condition is satisfied (352), the control unit 260 determines that the focusing position of the contrast AF is wrong, and switches the AF mode to the 2-image matching AF mode (353). For example, as shown in “a” in
In a case where the first condition is not satisfied (352), the control unit 260 cannot determine that the focusing position of the contrast AF is wrong, and therefore, does not change AF modes (354).
In a case where the focus lens does not converge on any position (351), the control unit 260 does not change the currently set AF mode, either (354).
As the first condition, some other condition may be used. For example, the first condition may be that the difference between an estimated focusing position calculated by a weighted averaging method and the current position of the focus lens is equal to or larger than a threshold value. In a case where this condition is not satisfied, AF modes are not changed. In a case where this condition is satisfied, the AF mode is switched to the 2-image matching AF mode. The estimated focusing position calculated by the weighted averaging method is expressed by the equation (3) shown below.
Also, the first condition may be that each estimated focusing position included in the histories is compared with the current position of the focus lens, and the number of estimated focusing positions exceeding a threshold value is equal to or larger than a certain number. In a case where this condition is not satisfied, AF modes are not changed. In a case where this condition is satisfied, the AF mode is switched to the 2-image matching AF mode.
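A minimal sketch of the first-condition check, assuming that the history holds estimated focusing positions produced by the 2-image matching process and that they are combined by a simple average as described above; the function and parameter names are illustrative only.

```python
def first_condition_satisfied(history, lens_position, threshold):
    """First condition: the difference between the current focus lens position
    and the focusing position estimated from the history of 2-image matching
    results is equal to or larger than a threshold value."""
    if not history:
        return False
    estimated = sum(history) / len(history)      # simple average of the history
    return abs(lens_position - estimated) >= threshold

def should_switch_to_matching_af(lens_converged, history, lens_position, threshold):
    """Switch from the contrast AF mode to the 2-image matching AF mode only
    when the lens has converged on one position and the first condition holds."""
    return lens_converged and first_condition_satisfied(history, lens_position, threshold)
```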
In a case where the 2-image matching AF mode is set, the control unit 260 determines whether a second condition is satisfied (361). The second condition is that the difference between the current position of the focus lens and the focusing position estimated based on the histories of the processing results of the 2-image matching process is equal to or smaller than a threshold value, and that the weighted distribution of the histories of estimated focusing positions is equal to or smaller than a threshold value.
Here, an estimated focusing position calculated by combining one or more histories (an estimated focusing position calculated by a weighted averaging method) is calculated according to the equation (3) shown below. The weighted distribution of the histories of estimated focusing positions (weighted average distribution) is calculated according to the equation (4) shown below.
Here, the N unbiased processing results d1, . . . , dN of the 2-image matching process are obtained, with di distributed as N(μ, σi2). N represents the number of pairs of images (two images) used in the 2-image matching process, and μ represents the distance to the actual focusing position. The maximum likelihood estimator (MLE) of μ is determined by weighted averaging, and σi2 represents the variance of di.
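Equations (3) and (4) themselves are not reproduced in this text. Under the model just described, the standard inverse-variance weighted forms consistent with the description would be as follows; this is an assumed reconstruction, not the original equations.

```latex
% Assumed reconstruction of equations (3) and (4); the original equations
% are not reproduced in this text.
% (3) Estimated focusing position by weighted averaging
%     (maximum likelihood estimate of \mu under d_i \sim N(\mu, \sigma_i^2)):
\hat{\mu} = \frac{\sum_{i=1}^{N} d_i / \sigma_i^{2}}{\sum_{i=1}^{N} 1 / \sigma_i^{2}}
% (4) Weighted distribution (variance) of the estimated focusing position:
\operatorname{Var}(\hat{\mu}) = \frac{1}{\sum_{i=1}^{N} 1 / \sigma_i^{2}}
```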
In a case where the second condition is satisfied (361), the control unit 260 switches the AF mode to the contrast AF mode (362). Specifically, since the precision of fine adjustment of the focus lens to the focusing position is higher in the contrast AF mode, the contrast AF mode is set after the focus lens enters the vicinity of the focusing position. Accordingly, focus control can be performed with high precision.
Since the condition that the weighted distribution of the histories of estimated focusing positions is equal to or smaller than a threshold value is included in the second condition, whether the focus lens is currently located in the vicinity of the focusing position can be sensed with higher precision. However, the condition that the weighted distribution of the histories of estimated focusing positions is equal to or smaller than a threshold value may be excluded from the second condition.
As described above, the control unit 260 determines whether switching between the contrast AF mode (the first mode) and the 2-image matching AF mode (the second mode) is necessary based on the position of the focus lens and the histories of matching processing results.
Specifically, in a case where the focus lens converges on one position and the first condition is satisfied in the contrast AF mode, the control unit 260 performs control to set the 2-image matching AF mode. As described above, the first condition may be that the difference between the position of the focus lens and the focusing position estimated based on the histories of matching processing results is larger than a threshold value.
In a case where the second condition is satisfied in the 2-image matching AF mode, the control unit 260 performs control to set the contrast AF mode. As described above, the second condition may be that the difference between the position of the focus lens and the focusing position estimated based on the histories of matching processing results is smaller than a threshold value, and the weighted distribution of the histories of estimated focusing positions is smaller than a threshold value.
In a case where the difference between the position of the focus lens and the focusing position estimated based on the histories of matching processing results is smaller than a threshold value in the 2-image matching AF mode, the control unit 260 may also set the contrast AF mode.
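A companion sketch of the second-condition check, assuming the inverse-variance weighted forms given above for the estimated focusing position and its weighted distribution; the function names, the per-result variances, and the thresholds are illustrative only.

```python
def weighted_stats(estimates, variances):
    """Inverse-variance weighted mean of the estimated focusing positions and
    the variance ("weighted distribution") of that mean."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * d for w, d in zip(weights, estimates)) / sum(weights)
    return mean, 1.0 / sum(weights)

def should_switch_to_contrast_af(estimates, variances, lens_position,
                                 position_threshold, distribution_threshold):
    """Second condition: the focus lens is close to the estimated focusing
    position and the weighted distribution of the history is small."""
    if not estimates:
        return False
    mean, distribution = weighted_stats(estimates, variances)
    return (abs(lens_position - mean) <= position_threshold
            and distribution <= distribution_threshold)
```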
[Example of Determination as to Whether the 2-Image Matching Process is Necessary with the Use of Angular Velocities]
During a moving image capturing operation, the user might perform panning or tilting. However, in cases where the imaging apparatus 100 is moving due to panning, tilting, or the like (or where the optical axis direction differs between two images) in the 2-image matching process, a wrong focusing position estimation result is often output.
Therefore, when the two images to be used in the 2-image matching process are acquired, the posture detecting unit 210 (the gyro sensor 110) detects a change in posture (angular velocity) (401 through 406). The control unit 260 then determines whether the change in posture (angular velocity) detected by the posture detecting unit 210 (the gyro sensor 110) is equal to or larger than a threshold value (a determination process for the 2-image matching process) (411, 421, and 431).
If the change in posture (angular velocity) is smaller than the threshold value, the 2-image matching AF processing unit 280 performs the 2-image matching process by using the two images corresponding to the times of the acquisitions of the angular velocities used in the comparison (412, 422, and 432). The control unit 260 then causes the history information holding unit 261 to hold, as history information, the processing result of the 2-image matching process performed by the 2-image matching AF processing unit 280 (413, 423, and 433).
On the other hand, if the change in posture (angular velocity) is equal to or larger than the threshold value, the 2-image matching AF processing unit 280 does not use the two images corresponding to the times of the acquisitions of the angular velocities used in the comparison to perform the 2-image matching process (412, 422, and 432). That is, the control unit 260 determines whether AF mode switching is necessary without using the matching processing result as the history when the change in posture (angular velocity) is larger than the threshold value (0, for example).
The threshold value for the change in posture (angular velocity) preferably becomes higher as the angle of view becomes wider. However, when the threshold value is set, it is preferable to take other conditions (such as an image stabilization condition) into consideration.
Also, the control unit 260 may set the contrast AF mode (the first mode) under the condition that the change in posture (angular velocity) is larger than the threshold value.
[Example of Determination as to Whether the 2-Image Matching Process is Necessary with the Use of Luminance Detection Values]
During a moving image capturing operation, the object might move or transform. However, in cases where the position or the shape of the object differs between the two images in the 2-image matching process, a wrong focusing position estimation result is often output.
Therefore, when the two images to be used in the 2-image matching process are acquired, the control unit 260 calculates the luminance detection values in the images generated by the imaging unit 220 (451 through 456). The control unit 260 then determines whether the difference between the calculated two luminance detection values is equal to or larger than a threshold value (a determination process for the 2-image matching process) (415, 425, and 435). A luminance detection value is the total value or the average value of the luminance values in the detection frame in an image.
If the difference between the two luminance detection values is smaller than the threshold value, the object can be determined not to have moved or transformed. Therefore, the 2-image matching AF processing unit 280 performs the 2-image matching process by using the two images corresponding to the times of the calculations of the two luminance detection values (412, 422, and 432). The control unit 260 then causes the history information holding unit 261 to hold, as history information, the processing result of the 2-image matching process performed by the 2-image matching AF processing unit 280 (413, 423, and 433).
On the other hand, if the difference between the two luminance detection values is equal to or larger than the threshold value, the object can be determined to have moved or transformed. Therefore, the 2-image matching AF processing unit 280 does not use the two images corresponding to the times of the calculations of the two luminance detection values in the 2-image matching process (412, 422, and 432). That is, the control unit 260 determines whether AF mode switching is necessary without using the matching processing result as the history when the difference between the two luminance detection values is larger than the threshold value.
Also, the control unit 260 may set the contrast AF mode (the first mode) under the condition that the difference between the two luminance detection values is larger than the threshold value.
[Example of Determination as to Whether the 2-Image Matching Process is Necessary with the Use of Aperture Values]
During a moving image capturing operation, the size of the aperture might change due to an automatic exposure control function or the like. Here, the 2-image matching process estimates the focusing position by using the blurring levels in images. Therefore, in cases where there is a difference in the shape of the point spread function or in the focal depth between the two images, such as when the size of the aperture changes, a wrong focusing position estimation result is often output.
Therefore, when the two images to be used in the 2-image matching process are acquired, the control unit 260 acquires the aperture values (F values) of the imaging unit 220 (461 through 466). The control unit 260 then determines whether the difference between the acquired two aperture values is equal to or larger than a threshold value (a determination process for the 2-image matching process) (416, 426, and 436).
If the difference between the two aperture values is smaller than the threshold value, the 2-image matching AF processing unit 280 performs the 2-image matching process by using the two images corresponding to the times of the acquisitions of the two aperture values (412, 422, and 432). The control unit 260 then causes the history information holding unit 261 to hold, as history information, the processing result of the 2-image matching process performed by the 2-image matching AF processing unit 280 (413, 423, and 433).
On the other hand, if the difference between the two aperture values is equal to or larger than the threshold value, the 2-image matching AF processing unit 280 does not use the two images corresponding to the times of the calculations of the two aperture values in the 2-image matching process (412, 422, and 432). That is, the control unit 260 determines whether AF mode switching is necessary without using the matching processing result as the history when the difference between two aperture values is larger than the threshold value (0, for example).
Also, the control unit 260 may set the contrast AF mode (the first mode) under the condition that the difference between the two aperture values is larger than the threshold value.
First, the control unit 260 determines whether the two images (the first image and the second image) to be used in the 2-image matching process have been acquired (step S901). If the two images have not been acquired (or if only one image has been acquired) (step S901), the operation moves on to step S905.
If the two images have been acquired (step S901), the control unit 260 performs a determination process for the 2-image matching process (step S920). This determination process will be described later in detail, with reference to
The control unit 260 then determines whether the 2-image matching process is determined to be possible by the determination process for the 2-image matching process (step S902). If the 2-image matching process is determined not to be possible, the operation moves on to step S905. If the 2-image matching process is determined to be possible (step S902), the control unit 260 performs the 2-image matching process by using the acquired two images (step S903). The control unit 260 then causes the history information holding unit 261 to hold the processing result of the 2-image matching process as history information (step S904).
The control unit 260 then determines whether the currently set AF mode is the contrast AF mode or the 2-image matching AF mode (step S905). If the currently set AF mode is the contrast AF mode (step S905), the control unit 260 determines whether the focus lens converges on one position (step S906). If the focus lens converges on one position (step S906), the control unit 260 determines whether the first condition described above is satisfied (step S907).
If the first condition is satisfied (step S907), the control unit 260 sets the 2-image matching AF mode (step S908). That is, the AF mode is switched from the contrast AF mode to the 2-image matching AF mode.
If the focus lens does not converge on any position (step S906), or if the first condition is not satisfied (step S907), the control unit 260 does not change AF modes (step S909). That is, since the contrast AF mode is set as the AF mode, the contrast AF processing unit 270 performs the contrast AF process (step S909).
If the currently set AF mode is the 2-image matching AF mode (step S905), the control unit 260 determines whether the second condition described above is satisfied (step S910). If the second condition is satisfied (step S910), the control unit 260 sets the contrast AF mode (step S911). That is, the AF mode is switched from the 2-image matching AF mode to the contrast AF mode.
If the second condition is not satisfied (step S910), the control unit 260 does not change AF modes (step S912). That is, since the 2-image matching AF mode is set as the AF mode, the 2-image matching AF processing unit 280 performs the 2-image matching process (step S912). It should be noted that step S909 is an example of the first processing step in the claims. Step S912 is an example of the second processing step in the claims. Steps S905 through S908, S910, and S911 are an example of the control step in the claims.
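The overall mode control of steps S901 through S912 could be organized as in the following sketch; the mode constants, the state dictionary, and the callables standing in for the processing units and condition checks are assumptions made for illustration and do not correspond to actual names in the embodiment.

```python
CONTRAST_AF = "contrast AF mode"
MATCHING_AF = "2-image matching AF mode"

def af_control_step(state, images, matching_allowed, run_matching,
                    lens_converged, first_condition, second_condition):
    """One pass of the AF mode control (cf. steps S901 through S912).
    state holds the current mode and the history of matching results;
    all callables are placeholders for the processes described above."""
    if len(images) == 2 and matching_allowed(*images):              # S901, S920, S902
        state['history'].append(run_matching(*images))              # S903, S904
    if state['mode'] == CONTRAST_AF:                                # S905
        if lens_converged() and first_condition(state['history']):  # S906, S907
            state['mode'] = MATCHING_AF                             # S908
        # otherwise keep the contrast AF mode and run the contrast AF process (S909)
    else:
        if second_condition(state['history']):                      # S910
            state['mode'] = CONTRAST_AF                             # S911
        # otherwise keep the 2-image matching AF mode and run that process (S912)
    return state['mode']
```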
First, the posture detecting unit 210 detects the posture of the imaging apparatus 100 (step S921). The control unit 260 calculates a change in posture (angular velocity) based on the currently detected posture of the imaging apparatus 100 and the previously detected posture of the imaging apparatus 100, and determines whether the angular velocity is lower than a threshold value (step S922).
If the angular velocity is lower than the threshold value (step S922), the control unit 260 calculates the luminance detection value in an image generated by the imaging unit 220 (step S923). The control unit 260 then determines whether the difference between the currently calculated luminance detection value and the previously calculated luminance detection value is smaller than a threshold value (step S924).
If the difference is smaller than the threshold value (step S924), the control unit 260 acquires the aperture value in the imaging unit 220 (step S925). The control unit 260 then determines whether the difference between the currently acquired aperture value and the previously acquired aperture value is smaller than a threshold value (step S926).
If the difference is smaller than the threshold value (step S926), the control unit 260 determines that the 2-image matching process is possible (step S927).
If the angular velocity is equal to or greater than the threshold value (step S922), the control unit 260 determines that the 2-image matching process is not possible (step S928). Likewise, if the difference in luminance detection value is equal to or larger than the threshold value (step S924), or if the difference in aperture value is equal to or larger than the threshold value (step S926), the control unit 260 determines that the 2-image matching process is not possible (step S928).
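The determination process of steps S921 through S928, which combines the angular velocity, luminance detection value, and aperture value checks described earlier, can be sketched as follows; the dictionary fields, the way the previous values are remembered, and the threshold parameters are illustrative assumptions.

```python
def matching_process_possible(prev, curr, angle_threshold,
                              luminance_threshold, aperture_threshold):
    """Determination process for the 2-image matching process (cf. S921-S928).
    prev and curr hold the detected posture (angle), the luminance detection
    value, and the aperture value (F value) at the times the two candidate
    images were acquired."""
    # S921-S922: change in posture (treated here as the angular velocity).
    if abs(curr['posture'] - prev['posture']) >= angle_threshold:
        return False                                               # S928
    # S923-S924: difference between the two luminance detection values.
    if abs(curr['luminance'] - prev['luminance']) >= luminance_threshold:
        return False                                               # S928
    # S925-S926: difference between the two aperture values.
    if abs(curr['aperture'] - prev['aperture']) >= aperture_threshold:
        return False                                               # S928
    return True                                                    # S927
```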
As described above, according to the first embodiment of the present technique, 2-image matching AF is performed on an object such as a high-luminance point source that is not compatible with contrast AF. Accordingly, the focus lens can be moved to the vicinity of the correct focusing position. Also, as the contrast AF mode is set in the vicinity of the focusing position, precision can be maintained in the vicinity of the focusing position.
Also, execution of the 2-image matching process that leads to a wrongly estimated focusing position can be prevented. Accordingly, the precision of the focus lens moving direction and distance can be improved. Also, the error rate and the speed of AF can be improved. Further, unnecessary calculations can be avoided in advance. That is, appropriate focus control can be performed.
As described above, according to the first embodiment of the present technique, hybrid moving-image AF that combines the contrast AF process and the 2-image matching process can be realized.
Although the imaging apparatus 100 including the imaging unit 220 has been described as an example of the first embodiment of the present technique, an embodiment of the present technique can be applied to an imaging apparatus (an electronic device) from which an imaging unit can be detached. Also, an embodiment of the present technique can be applied to electronic devices, such as a portable telephone device with an imaging function, and a portable terminal device with an imaging function (a smartphone, for example).
It should be noted that the above described embodiment is merely an example for embodying the present technique, and the matters in the embodiment correspond to the respective inventive matters in the claims. Likewise, the inventive matters in the claims correspond to the respective matters with like names in the embodiment of the present technique. However, the present technique is not limited to the embodiment, and can also be embodied by making various modifications to the embodiment without departing from the scope of the technique.
The processing procedures described in the above described embodiment may be regarded as a method involving the series of procedures, and may also be regarded as a program for causing a computer to carry out the series of procedures or a recording medium storing the program. This recording medium may be a CD (Compact Disc), a MD (MiniDisc), a DVD (Digital Versatile Disk), a memory card, a Blu-ray Disc (a registered trademark), or the like.
The present technique may also be in the following forms.
(1) An imaging apparatus including a control unit that performs control to set a first mode or a second mode based on a predetermined condition, the first mode being for performing an auto focus process by moving a focus lens based on the contrast in an image generated by an imaging unit, the second mode being for performing an auto focus process by moving the focus lens based on the result of a matching process performed between a first image and a second image, the first image and the second image being generated by the imaging unit with the focus lens being located in different positions.
(2) The imaging apparatus of (1), wherein the control unit determines whether switching between the first mode and the second mode is necessary based on the position of the focus lens and a history of the result of the matching process.
(3) The imaging apparatus of (1) or (2), wherein, where the first mode is set, the control unit performs control to set the second mode when the predetermined condition is satisfied, the predetermined condition being that the focus lens converges on one position, and the difference between the position of the focus lens and a focusing position estimated based on a history of the result of the matching process is larger than a threshold value.
(4) The imaging apparatus of any of (1) through (3), wherein, where the second mode is set, the control unit performs control to set the first mode when the predetermined condition is satisfied, the predetermined condition being that the difference between the position of the focus lens and a focusing position estimated based on a history of the result of the matching process is smaller than a threshold value.
(5) The imaging apparatus of any of (1) through (4), wherein, where the second mode is set, the control unit performs control to set the first mode when the predetermined condition is satisfied, the predetermined condition being that the difference between the position of the focus lens and a focusing position estimated based on a history of the result of the matching process is smaller than a threshold value, and a weighted distribution of the history of the estimated focusing position is smaller than a threshold value.
(6) The imaging apparatus of (2), further including
a posture detecting unit that detects a change in the posture of the imaging apparatus,
wherein the control unit determines whether the switching is necessary without using the result of the matching process as the history when the detected posture change is larger than a threshold value.
(7) The imaging apparatus of (2), wherein the control unit determines whether the switching is necessary without using the result of the matching process as the history when the difference between the luminance detection value in the first image and the luminance detection value in the second image is larger than a threshold value.
(8) The imaging apparatus of (2), wherein the control unit determines whether the switching is necessary without using the result of the matching process as the history when the difference between the aperture value at the time of the generation of the first image and the aperture value at the time of the generation of the second image is larger than a threshold value.
(9) The imaging apparatus of any of (1) through (8), further including
a posture detecting unit that detects a change in the posture of the imaging apparatus,
wherein the control unit performs control to set the first mode when the detected posture change is larger than a threshold value.
(10) The imaging apparatus of any of (1) through (9), wherein the control unit performs control to set the first mode when the difference between the luminance detection value in the first image and the luminance detection value in the second image is larger than a threshold value.
(11) The imaging apparatus of any of (1) through (10), wherein the control unit performs control to set the first mode when the difference between the aperture value at the time of the generation of the first image and the aperture value at the time of the generation of the second image is larger than a threshold value.
(12) A method of controlling an imaging apparatus, including:
a first processing step of performing an auto focus process by moving a focus lens based on the contrast in an image generated by an imaging unit when a first mode is set;
a second processing step of performing an auto focus process by moving the focus lens based on the result of a matching process performed between a first image and a second image when a second mode is set, the first image and the second image being generated by the imaging unit with the focus lens being located in different positions; and
a control step of performing control to set the first mode or the second mode based on a predetermined condition.
(13) A program for causing a computer to carry out:
a first processing step of performing an auto focus process by moving a focus lens based on the contrast in an image generated by an imaging unit when a first mode is set;
a second processing step of performing an auto focus process by moving the focus lens based on the result of a matching process performed between a first image and a second image when a second mode is set, the first image and the second image being generated by the imaging unit with the focus lens being located in different positions; and
a control step of performing control to set the first mode or the second mode based on a predetermined condition.
Foreign application priority data: Application No. 2011-280956, filed Dec 2011, Japan (JP), kind: national.
International filing: PCT/JP2012/082626, filed 12/17/2012 (WO), kind: 00, 371(c) date: 5/8/2014.