The present technology relates to an imaging apparatus.
In single lens reflex cameras of the related art, a so-called dedicated phase difference sensor is mounted to realize fast autofocus. On the other hand, compact cameras, mirrorless cameras, and the like generally employ a contrast detection autofocus (hereinafter referred to as AF) system. In addition, in order to realize fast AF in such cameras, a method of embedding phase difference detection elements in the image sensor itself has been proposed (refer to Japanese Unexamined Patent Application Publication No. 2000-156823).
Furthermore, a method of mounting both a dedicated phase difference detecting module (hereinafter referred to as a dedicated AF sensor) and a phase difference detecting image-plane sensor (hereinafter referred to as an image-plane AF sensor) that uses the above-described technique has also been proposed in order to obtain the advantages of both sensors.
In an imaging apparatus in which both the dedicated AF sensor and the image-plane AF sensor are mounted, unnecessary light reflected off the dedicated AF sensor may be incident on the image sensor during photographing, particularly in a strong backlight state, which may adversely affect photographing and focus detection.
It is desirable to provide an imaging apparatus that can prevent a backlight state from adversely affecting an image sensor in a configuration in which both a dedicated AF sensor and an image-plane AF sensor are mounted.
According to an embodiment of the present technology, there is provided an imaging apparatus including a first focus detection unit that is provided in an image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through a photographing lens, and a second focus detection unit that is provided so as to be positioned above the image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through the photographing lens.
According to an embodiment of the present technology, an adverse effect of a backlight state on an image sensor in a configuration in which both a dedicated AF sensor and an image-plane AF sensor are mounted can be prevented.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Hereinafter, embodiments of the present technology will be described with reference to the appended drawings. Note that description will be provided in the following order.
First, an example of a configuration of an imaging apparatus 100 according to the related art will be described with reference to
As shown in
The semi-transmissible mirror 130 is provided between the photographing lens 122 and the image sensor 140 in the housing 110. Light from a subject is incident on the semi-transmissible mirror 130 through the photographing lens 122. The semi-transmissible mirror 130 reflects part of the subject light incident through the photographing lens 122 toward the dedicated AF sensor 160 positioned below the mirror, reflects part of the subject light toward the pentaprism 170 positioned above the mirror, and transmits the remaining part toward the image sensor 140. In addition, a total-reflection mirror 131 is provided as a sub-mirror on the image sensor 140 side of the semi-transmissible mirror 130. The total-reflection mirror 131 guides subject light that has been transmitted through the semi-transmissible mirror 130 to the dedicated AF sensor 160. During an AF operation, subject light for dedicated AF is transmitted through the semi-transmissible mirror 130, bent downward by the total-reflection mirror 131, and then incident on the dedicated AF sensor 160. During photographing, the semi-transmissible mirror 130 and the total-reflection mirror 131 are retracted, and the subject light is guided to the image sensor 140.
The image sensor 140 for generating photographed images is provided inside the housing 110. As the image sensor 140, a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like is used. The image sensor 140 photoelectrically converts subject light incident through the photographing lens 122 into an amount of electric charge, and thereby generates images. Image signals undergo predetermined signal processes such as a white balance adjustment process or a gamma correction process, and then finally are stored in a storage medium in the imaging apparatus 100, an external memory, or the like as image data.
The image sensor 140 has R (Red) pixels, G (Green) pixels, and B (Blue) pixels, which are general imaging pixels, and phase difference detection elements for phase difference focus detection. The pixels constituting the image sensor photoelectrically convert incident light from a subject into an amount of electric charge and output a pixel signal.
In addition, in
In this manner, the image sensor 140 has the image-plane AF sensor 150 using the phase difference detection elements in addition to the general pixels, and the imaging apparatus 100 can perform so-called image-plane phase difference AF (Autofocus) using an output from the image-plane AF sensor 150.
The dedicated AF sensor 160 is provided below the semi-transmissible mirror 130 inside the housing 110 so as to be positioned in front of the image sensor 140. The dedicated AF sensor 160 is a dedicated autofocus sensor of, for example, a phase difference detection AF system, a contrast detection AF system, or the like. As the AF system, the phase difference detection system and the contrast AF system may also be combined. In order to satisfactorily perform AF in a dark place or on a subject with low contrast, AF auxiliary light may be emitted and an AF evaluation value may be obtained from the returning light. Subject light collected by the photographing lens is reflected by the semi-transmissible mirror and then incident on the dedicated AF sensor 160. A focus detection signal detected by the dedicated AF sensor 160 is supplied to a processing unit that performs computation of a defocusing amount in the imaging apparatus 100.
Description will return to the configuration of the imaging apparatus 100. The pentaprism 170 is a prism having a pentagonal cross-section; it reflects subject light incident from below so as to invert the image vertically and horizontally, thereby forming an upright image. The subject image made upright by the pentaprism 170 is guided in the direction of the finder 180. The finder 180 functions as an optical finder through which subjects are checked during photographing. A user can check an image of a subject by looking into the finder window.
The display 190 is provided in the housing 110. The display 190 is a flat display such as an LCD (Liquid Crystal Display) or an organic EL (Electroluminescence) display. Image data obtained by processing an image signal output from the image sensor 140 in a signal processing unit (not shown) is supplied to the display 190, and the display 190 displays the image data as a real-time image (a so-called through image). In
The imaging apparatus 100 according to the related art is configured as described above. When photographing is performed with the imaging apparatus 100 and the sun lies in the photographing direction so that the imaging apparatus is in a strong backlight state, there is concern that unnecessary light reflected off a face of the dedicated AF sensor 160 will be incident on the image sensor 140 as shown in
Next, a configuration of an imaging apparatus according to the present technology will be described.
The imaging apparatus 1000 according to the present technology has a housing 1001, an optical imaging system 1010 provided with a photographing lens 1011, a semi-transmissible mirror 1002, an image sensor 1030, an image-plane AF sensor 1031, a dedicated AF sensor 1020, an electronic view finder 1003, and a display 1004. It should be noted that, since the configurations of the housing 1001, the optical imaging system 1010, the image sensor 1030, the image-plane AF sensor 1031, and the display 1004 are the same as those of the imaging apparatus of the related art described above, description thereof will not be repeated.
The semi-transmissible mirror 1002 is provided in the housing 1001 between the photographing lens 1011 and the image sensor 1030 positioned in the housing 1001. Subject light is incident on the semi-transmissible mirror 1002 via the photographing lens 1011. The semi-transmissible mirror 1002 reflects part of the subject light incident through the photographing lens in the direction of the dedicated AF sensor 1020 positioned above, and transmits part of the subject light toward the image sensor 1030.
The dedicated AF sensor 1020 is provided so as to be positioned above the semi-transmissible mirror 1002 and in front of the image sensor 1030 in the housing 1001. The dedicated AF sensor 1020 is a dedicated autofocus module of, for example, a phase difference detection system, or a contrast AF system. Subject light collected by the photographing lens 1011 is reflected on the semi-transmissible mirror 1002, and then incident on the dedicated AF sensor 1020. A focus detection signal detected by the dedicated AF sensor 1020 is supplied to a processing unit that performs computation of a defocusing amount in the imaging apparatus 1000.
In
The areas indicated by crosses in
There are cases in which it is difficult to dispose the AF areas of the dedicated AF sensor 1020 uniformly at equal intervals because of constraints on their arrangement in the dedicated optical system. For this reason, when detection results of the dedicated AF areas and the image-plane AF areas are compared as in the present technology, it is preferable to align the positions of the two kinds of AF areas. To this end, the image-plane AF areas are unevenly disposed so that their positions are associated with the positions of the dedicated AF areas as shown in
The electronic view finder (EVF) 1003 is provided in the housing 1001. The electronic view finder 1003 has, for example, a liquid crystal display, an organic EL display, or the like. Image data obtained by processing an image signal output from the image sensor 1030 in a signal processing unit (not shown) is supplied to the electronic view finder 1003, and the electronic view finder 1003 displays the image data as a real-time image (through image).
The imaging apparatus according to the present technology is configured as described above. In the imaging apparatus according to the present technology, the dedicated AF sensor 1020 is provided above the semi-transmissible mirror 1002 in the housing 1001 of the imaging apparatus 1000. Thus, even when the sun is in the photographing direction and the imaging apparatus is in a strong backlight state, there is no case in which unnecessary light is reflected off a face of the dedicated AF sensor 1020 and incident on the image sensor 1030 as shown in
It should be noted that, since a light source such as the sun or an illuminating device is positioned higher than an imaging apparatus during photographing in many cases, light is also incident on the imaging apparatus from above. Thus, by providing the dedicated AF sensor 1020 above the image sensor as in the present technology, it is possible to prevent unnecessary light from being reflected on the dedicated AF sensor 1020 and incident on the image sensor 1030.
It should be noted that, in the present technology, the dedicated AF sensor 1020 is provided at the position at which the pentaprism would be provided in the related art; thus, the pentaprism need not be provided, and an electronic view finder is preferably used as the finder.
The imaging apparatus 1000 of
The optical imaging system 1010 is configured to include the photographing lens 1011 for collecting light from a subject on the image sensor 1030 (including a focus lens, a zoom lens, and the like), a lens drive mechanism 1012 that adjusts focus by moving the focus lens, a shutter mechanism, an iris mechanism, and the like. These are driven based on control signals from the control unit 1070 and the focus control unit 1075. The lens drive mechanism 1012 realizes an AF operation by moving the photographing lens 1011 in the optical axis direction in accordance with a defocusing amount supplied from the focus control unit 1075. A light image of a subject obtained through the optical imaging system 1010 is formed on the image sensor 1030 serving as an imaging device.
The dedicated AF sensor 1020 is a dedicated autofocus sensor of, for example, the phase difference detection AF system, the contrast detection AF system, or the like. Subject light collected by the photographing lens 1011 is reflected on the semi-transmissible mirror and incident on the dedicated AF sensor 1020. A focus detection signal detected by the dedicated AF sensor 1020 is supplied to the defocusing amount computation unit 1071. The dedicated AF sensor 1020 corresponds to a first focus detection unit according to an embodiment of the present disclosure. Thus, a defocusing amount obtained from detection of focus by the dedicated AF sensor 1020 corresponds to a first defocusing amount according to an embodiment of the present disclosure.
The image sensor 1030 has R (Red) pixels, G (Green) pixels, and B (Blue) pixels, which are normal imaging pixels, and phase difference detection elements for detecting a phase difference focus. The pixels constituting the image sensor 1030 photoelectrically convert light incident from a subject into an amount of electric charge, and output a pixel signal. In addition, the image sensor 1030 finally outputs an imaging signal that includes the pixel signal to the pre-processing circuit 1040. As the image sensor 1030, a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like is used. It should be noted that a detailed configuration of the image sensor 1030 will be described later.
The image-plane AF sensor 1031 is a sensor for autofocus that includes a plurality of phase difference detection elements. A focus detection signal detected by the image-plane AF sensor 1031 is supplied to the defocusing amount computation unit 1071. A detailed configuration of the image-plane AF sensor 1031 will be described later. The image-plane AF sensor 1031 corresponds to a second focus detection unit according to an embodiment of the present disclosure. Thus, a defocusing amount obtained from detection of focus by the image-plane AF sensor 1031 corresponds to a second defocusing amount according to an embodiment of the present disclosure.
The pre-processing circuit 1040 performs a sample-and-hold operation and the like on the imaging signal output from the image sensor 1030 so that a satisfactory S/N (Signal to Noise) ratio is maintained through a CDS (Correlated Double Sampling) process. Furthermore, the gain is controlled through an AGC (Auto Gain Control) process, A/D (Analog to Digital) conversion is performed, and a digital image signal is output.
The camera processing circuit 1050 performs signal processes such as a white balance adjustment process, a color correction process, a gamma correction process, a Y/C conversion process, an AE (Auto Exposure) process, and the like on the image signal output from the pre-processing circuit 1040.
The image memory 1060 is a buffer memory configured as a volatile memory, for example, a DRAM (Dynamic Random Access Memory), which temporarily stores image data that has undergone the predetermined processes of the pre-processing circuit 1040 and the camera processing circuit 1050.
The control unit 1070 is constituted by, for example, a CPU, a RAM, a ROM, and the like. The ROM stores programs read and operated by the CPU, and the like. The RAM is used as a work memory of the CPU. The CPU controls the entire imaging apparatus 1000 by executing various processes according to the programs stored in the ROM and issuing commands.
In addition, the control unit 1070 functions as the defocusing amount computation unit 1071, the defocusing amount selection unit 1072, the defocusing amount decision unit 1073, the defocusing amount correction unit 1074, and the focus control unit 1075 by executing a predetermined program. Each of these units may instead be realized not by a program but by dedicated hardware having the corresponding function. In that case, the imaging apparatus 1000 is configured to include the hardware.
The defocusing amount computation unit 1071 computes a defocusing amount that indicates a deviation amount from focus, based on a phase difference detection signal acquired by the dedicated AF sensor 1020 or the image-plane AF sensor 1031. The defocusing amount selection unit 1072 selects which of the defocusing amount obtained from a detection result of the dedicated AF sensor 1020 (hereinafter referred to as a dedicated defocusing amount) and the defocusing amount obtained from a focus detection result of the image-plane AF sensor 1031 (hereinafter referred to as an image-plane defocusing amount) will be used in focus control, and employs the selected amount. A detailed process performed by the defocusing amount selection unit 1072 will be described later.
The defocusing amount decision unit 1073 performs a process of deciding a defocusing amount for each image-plane AF area based on the image-plane defocusing amount computed based on the focus detection result of the image-plane AF sensor. A detailed process of the defocusing amount decision unit 1073 will be described later. The defocusing amount correction unit 1074 performs a correction process of an image-plane defocusing amount. A detailed process performed by the defocusing amount correction unit 1074 will be described later. The focus control unit 1075 controls the lens drive mechanism 1012 of the optical imaging system 1010 based on the employed defocusing amount to perform a focus adjustment process.
The graphic I/F 1080 generates an image signal for display on the display unit 1090 from the image signal supplied from the control unit 1070, and supplies the generated signal to the display unit 1090 so that the image is displayed. The display unit 1090 is a display unit configured as, for example, an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an organic EL (Electro-luminescence) panel, or the like. The display unit 1090 displays a through image being captured, an image recorded in the storage medium 1120, and the like.
The input unit 1100 includes, for example, a power button for switching between on and off of power, a release button for instructing start of recording a captured image, an operator for zoom adjustment, a touch screen integrated with the display unit 1090, and the like. When an input operation is performed on the input unit 1100, a control signal according to the input is generated and output to the control unit 1070. Then, the control unit 1070 performs an arithmetic operation process and control according to the control signal.
The R/W 1110 is an interface connected to the storage medium 1120, in which image data generated from imaging and the like is recorded. The R/W 1110 writes data supplied from the control unit 1070 on the storage medium 1120, and outputs data read from the storage medium 1120 to the control unit 1070. The storage medium 1120 is a large-capacity storage medium, for example, a hard disk, a Memory Stick (registered trademark of Sony Corporation), an SD memory card, or the like. Images are stored in a compressed state in a format such as JPEG. In addition, EXIF (Exchangeable Image File Format) data including information on the stored images and additional information such as capture dates is also stored in association with the images.
Herein, a basic operation of the imaging apparatus 1000 described above will be described. Before an image is captured, signals obtained from photoelectric conversion of light sensed by the image sensor 1030 are sequentially supplied to the pre-processing circuit 1040. The pre-processing circuit 1040 performs a CDS process, an AGC process, and the like on the input signals, and further performs conversion of the signals into image signals.
The camera processing circuit 1050 performs an image quality correction process on the image signals supplied from the pre-processing circuit 1040, and supplies the result to the graphic I/F 1080 via the control unit 1070 as signals of a camera through image. Accordingly, the camera through image is displayed on the display unit 1090. A user can adjust an angle of view while viewing the through image displayed on the display unit 1090.
In this state, when the shutter button of the input unit 1100 is pressed, the control unit 1070 outputs a control signal to the optical imaging system 1010 to cause a shutter included in the optical imaging system 1010 to operate. Accordingly, image signals for one frame are output from the image sensor 1030.
The camera processing circuit 1050 performs an image quality correction process on the image signals for one frame supplied from the image sensor 1030 via the pre-processing circuit 1040, and supplies the processed image signals to the control unit 1070. The control unit 1070 encodes and compresses the input image signals and supplies the generated encoded data to the R/W 1110. Accordingly, a data file of a captured still image is stored in the storage medium 1120.
Meanwhile, when an image file stored in the storage medium 1120 is reproduced, the control unit 1070 reads the selected still image file from the storage medium 1120 through the R/W 1110 according to an input operation on the input unit 1100. The read image file is subjected to a decompression and decoding process. Then, the decoded image signals are supplied to the graphic I/F 1080 via the control unit 1070. Accordingly, the still image stored in the storage medium 1120 is displayed on the display unit 1090.
The phase difference detection elements are embedded in the image sensor 1030 as shown in, for example,
In the phase difference detection elements disposed as described above, a plurality of phase difference detection elements are grouped into an AF area (for example, the rectangular frame indicated by a thick line in
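As an illustration only (not the processing actually performed by the imaging apparatus 1000), the following Python sketch shows how the paired signals of one AF area could be compared by a simple sum-of-absolute-differences search to estimate an image shift; the signal contents, the maximum search shift, and the lens conversion factor are all assumptions.

import numpy as np

def estimate_phase_shift(left, right, max_shift=16):
    # Search the shift (in pixels) that best aligns the two line signals of
    # one AF area using a sum of absolute differences (SAD).
    best_shift, best_sad = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s):len(left) + min(0, s)]
        b = right[max(0, -s):len(right) + min(0, -s)]
        sad = np.abs(np.asarray(a, dtype=float) - np.asarray(b, dtype=float)).mean()
        if sad < best_sad:
            best_shift, best_sad = s, sad
    return best_shift

# Synthetic usage: the second signal is the first one shifted by three pixels.
line = np.sin(np.linspace(0.0, 6.0, 64))
print(estimate_phase_shift(line, np.roll(line, 3)))   # prints -3 for this pair
# A defocusing amount could then be derived from the shift through a
# lens-dependent conversion factor (hypothetical, not given in the text).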
Next, a process executed by the imaging apparatus 1000 will be described. First, an overview of a focusing process executed in the present embodiment will be described with reference to
First,
In
When a subject within a predetermined defocusing amount is not detected by the dedicated AF sensor 1020 even after a predetermined time elapses from the pause of the focus adjustment, focus adjustment is performed so as to focus on another subject having the minimum defocusing amount detected by the dedicated AF sensor 1020, as shown in
Even when the subject that was previously focused and traced enters AF areas of the dedicated AF sensor 1020 again as shown in
It should be noted that, when the subject being traced is not a subject that a user desires, the input of the AF instruction is first released by the user (for example, release of half-pressing of the shutter) to pause the autofocus process. Then, there is no focus on any subject as shown in
In addition, when the user inputs an AF instruction again (for example, half-presses the shutter), focus adjustment is performed so that focus is on the proximate subject as shown in
In the present technology as described above, a subject can be focused and traced with high accuracy by using the dedicated AF sensor 1020 and the image-plane AF sensor 1031 together.
First, in Step S1, the defocusing amount computation unit 1071 computes defocusing amounts. The computation of the defocusing amounts is performed based on each of a focus detection result of the image-plane AF sensor 1031 and a focus detection result of the dedicated AF sensor 1020. In other words, a defocusing amount is computed based on the focus detection result of the image-plane AF sensor 1031 and a defocusing amount is computed based on the focus detection result of the dedicated AF sensor 1020.
Next, in Step S2, the defocusing amount selection unit 1072 performs a defocusing amount selection process. The defocusing amount selection process is a process of selecting which of the defocusing amounts of the image-plane AF sensor 1031 and the dedicated AF sensor 1020 will be used in focus control as a defocusing amount. Details of the defocusing amount selection process will be described later.
Next, in Step S3, the focus control unit 1075 controls driving of the focus lens based on the defocusing amount selected from the defocusing amount selection process. Accordingly, focus control is performed. Furthermore, a focus determination process in Step S4 is a process of checking whether or not focus is on a subject that a user desires in a focus adjustment process. In the imaging apparatus 1000, the process is repeated as long as the user inputs an AF instruction (for example, half-presses the shutter).
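The following is a minimal Python sketch of the loop of Steps S1 to S4 under the assumption that the sensors, the selector, and the focus controller are represented by placeholder objects; it illustrates the control flow only and is not the actual firmware of the imaging apparatus.

def af_loop(dedicated_af, image_plane_af, selector, focus_controller, af_button):
    # Repeat Steps S1 to S4 while the AF instruction (for example, a
    # half-pressed shutter button) is held.
    while af_button.is_half_pressed():
        dedicated = dedicated_af.compute_defocus()           # Step S1 (dedicated AF sensor)
        image_plane = image_plane_af.compute_defocus()       # Step S1 (image-plane AF sensor)
        selected = selector.select(dedicated, image_plane)   # Step S2: selection process
        if selected is not None:
            focus_controller.drive_lens(selected)            # Step S3: focus control
        focus_controller.check_focus()                       # Step S4: focus determination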
Next, the defocusing amount selection process included in the overall flowchart described above will be described with reference to the flowchart of
In addition, the determination of Step S101 may be made based on, for example, whether or not the focus detection result of the image-plane AF sensor 1031 can be used at the exposure timing. The exposure timing of the image-plane AF sensor 1031 is not synchronized with that of the dedicated AF sensor 1020 because readout of the image sensor is constrained. Thus, when the detection timing (end of exposure) of the image-plane AF sensor 1031 is acquired and deviates significantly from the end-of-exposure timing of the dedicated AF sensor 1020, the focus detection result of the image-plane AF sensor 1031 is not employed. In this manner, when the determination of Step S101 is performed and the focus detection result of the image-plane AF sensor 1031 is not valid, the process proceeds to Step S102 (No in Step S101).
Then, in Step S102, a proximate defocusing amount among a plurality of defocusing amounts computed based on focus detection results of a plurality of dedicated AF areas is selected as a defocusing amount to be used in focus control (hereinafter the selected defocusing amount is referred to as a selected defocusing amount). When there are 11 AF areas of the dedicated AF sensor 1020 as shown in
Description will return to Step S101. In Step S101, when the focus detection result of the image-plane AF sensor 1031 is determined to be valid, the process proceeds to Step S103 (Yes in Step S101). Then, an image-plane defocusing amount decision process is performed in Step S103. The image-plane defocusing amount decision process is a process of computing defocusing amounts for each of a plurality of image-plane AF areas (hereinafter referred to as image-plane defocusing amounts), and deciding an image-plane defocusing amount. Details of the image-plane defocusing amount decision process will be described later.
When an image-plane defocusing amount is decided, it is checked whether or not the imaging apparatus 1000 is in a proximity priority mode next in Step S104. The proximity priority mode is a mode in which focus is on a most proximate subject within all focus areas. When the imaging apparatus 1000 is in the proximity priority mode (Yes in Step S104), the value of a proximate defocusing amount among defocusing amounts of the dedicated AF areas (hereinafter referred to as dedicated defocusing amounts) is selected as a selected defocusing amount in Step S105. This is because a value of a proximate defocusing amount among the defocusing amounts is set to be selected according to the mode when the imaging apparatus 1000 is in the proximity priority mode. On the other hand, when the imaging apparatus 1000 is found not in the proximity priority mode in Step S104, the process proceeds to Step S106 (No in Step S104).
Next, in Step S106, it is determined whether or not the dedicated defocusing amounts obtained by the dedicated AF sensor 1020 are equal to or smaller than a first threshold value, which is a predetermined threshold value. This determination is made on all of the dedicated defocusing amounts. When any of the dedicated defocusing amounts is equal to or smaller than the first threshold value, the process proceeds to Step S107 (Yes in Step S106), and the minimum amount among the dedicated defocusing amounts obtained for each of the plurality of dedicated AF areas is selected as the selected defocusing amount.
On the other hand, when all of the dedicated defocusing amounts obtained by the dedicated AF sensor 1020 exceed the first threshold value, the process proceeds to Step S108 (No in Step S106). Next, in Step S108, it is determined whether or not the image-plane defocusing amounts obtained by the image-plane AF sensor 1031 are equal to or smaller than a second threshold value, which is a predetermined threshold value. When any of the image-plane defocusing amounts is equal to or smaller than the second threshold value, the process proceeds to Step S109 (Yes in Step S108), and the minimum amount among the image-plane defocusing amounts obtained for each of the plurality of image-plane AF areas is selected as the selected defocusing amount.
On the other hand, when all of the image-plane defocusing amounts of the image-plane AF sensor 1031 are determined in Step S108 to exceed the second threshold value, the process proceeds to Step S110 (No in Step S108). Then, in Step S110, the minimum amount among the dedicated defocusing amounts obtained for each of the plurality of dedicated AF areas is selected as the selected defocusing amount. Next, a stabilization process is performed in Step S111.
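A sketch of the selection logic of Steps S101 to S110, written in Python under stated assumptions: the two arguments are non-empty lists of per-area defocusing amounts, the threshold values are hypothetical, the threshold tests are read as being satisfied when any area satisfies them, and the choice of the proximate amount is a placeholder because the sign convention of the defocusing amounts is not specified in the text.

FIRST_THRESHOLD = 100    # hypothetical value
SECOND_THRESHOLD = 150   # hypothetical value

def pick_nearest(amounts):
    # Placeholder for "select the amount corresponding to the most proximate
    # subject"; the real sign convention of defocusing amounts is not modelled.
    return max(amounts)

def select_defocus(dedicated, image_plane, image_plane_valid, proximity_priority):
    # dedicated / image_plane: non-empty lists of per-area defocusing amounts.
    if not image_plane_valid:                                  # No in Step S101
        return pick_nearest(dedicated)                         # Step S102
    # Step S103: the image-plane defocusing amount decision process runs here.
    if proximity_priority:                                     # Yes in Step S104
        return pick_nearest(dedicated)                         # Step S105
    if any(abs(d) <= FIRST_THRESHOLD for d in dedicated):      # Yes in Step S106
        return min(dedicated, key=abs)                         # Step S107
    if any(abs(d) <= SECOND_THRESHOLD for d in image_plane):   # Yes in Step S108
        return min(image_plane, key=abs)                       # Step S109
    return min(dedicated, key=abs)                             # Step S110 (Step S111 follows)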
Herein, the stabilization process will be described with reference to the flowchart of
First, in Step S201, it is determined whether or not the selected defocusing amount is a value within a predetermined reference range. When the defocusing amount is within the reference range, the process proceeds to Step S202, and a count value is set to 0. This count value will be described later. Then, in Step S203, the selected defocusing amount is employed as the defocusing amount to be used in focus control. In Step S203, the defocusing amount to be used in focus control is thus decided. The employed defocusing amount is supplied to the focus control unit 1075.
Description will return to Step S201. In Step S201, when the selected defocusing amount is determined not to be in the reference range, the process proceeds to Step S204 (No in Step S201). Next, in Step S204, it is checked whether or not a defocusing amount of an object (for example, the face of a person, or the like) is obtained. When a defocusing amount of the object is obtained, the process proceeds to Step S203 (Yes in Step S204), and the selected defocusing amount is employed as a defocusing amount to be used in focus control.
On the other hand, when a defocusing amount of the object (for example, the face of a person, or the like) is not obtained, the process proceeds to Step S205 (No in Step S204), and it is checked whether or not the imaging apparatus 1000 is in the proximity priority mode. When the imaging apparatus 1000 is in the proximity priority mode, the process proceeds to Step S203 (Yes in Step S205), and the selected defocusing amount is employed as a defocusing amount to be used in focus control.
When the imaging apparatus 1000 is found not in the proximity priority mode in Step S205, the process proceeds to Step S206 (No in Step S205), and it is determined whether or not the subject is a moving object. Determining whether or not the subject is a moving object can be performed using a moving object detection technique of the related art. When the subject is a moving object, the process proceeds to Step S203 (Yes in Step S206), and the selected defocusing amount is employed as a defocusing amount to be used in focus control.
On the other hand, when the subject is not a moving object, the process proceeds to Step S207 (No in Step S206). Next, it is checked whether or not a count value is equal to or greater than a third threshold value in Step S207. When the count value is equal to or greater than the third threshold value, the process proceeds to Step S203 (Yes in Step S207), and the selected defocusing amount is employed as a defocusing amount to be used in focus control.
On the other hand, when the count value is not equal to or greater than the third threshold value, the process proceeds to Step S208 (No in Step S207), and 1 is added to the count value. Then, in Step S209, the selected defocusing amount is not employed, and as a result, focus control using driving of the focus lens based on the defocusing amount is not performed either.
In the stabilization process, when the answers to all of the determinations from Step S201 to Step S206 are No, the defocusing amount is not within the reference range, a defocusing amount is not detected for the object, the imaging apparatus is not in the proximity priority mode, and the subject is not a moving object. In this case, focus control is not performed until the count value becomes equal to or greater than the third threshold value. Accordingly, a stand-by state in which focus control is paused until the count value reaches the third threshold value can be realized. In addition, since focus control is performed based on a defocusing amount as long as the defocusing amount is within the reference range, a significant change of the employed defocusing amount can be prevented. When the count value is smaller than the third threshold value, 1 is added to the count value in Step S208, and when the count value is equal to or greater than the third threshold value, the selected defocusing amount is employed as the defocusing amount to be used in focus control in Step S203. Thus, the length of the stand-by state can be adjusted by setting the threshold value.
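A minimal Python sketch of the stabilization process of Steps S201 to S209, assuming that the reference range test is an absolute-value comparison, that the count value persists between frames in a small state object, and that the threshold values are placeholders.

REFERENCE_RANGE = 50     # hypothetical value
THIRD_THRESHOLD = 5      # hypothetical number of frames to wait

class AFState:
    def __init__(self):
        self.count = 0   # persists between frames

def stabilize(selected_defocus, state, object_defocus_available,
              proximity_priority, subject_is_moving):
    if abs(selected_defocus) <= REFERENCE_RANGE:   # Yes in Step S201
        state.count = 0                            # Step S202
        return selected_defocus                    # Step S203: employ the amount
    if object_defocus_available:                   # Yes in Step S204
        return selected_defocus
    if proximity_priority:                         # Yes in Step S205
        return selected_defocus
    if subject_is_moving:                          # Yes in Step S206
        return selected_defocus
    if state.count >= THIRD_THRESHOLD:             # Yes in Step S207
        return selected_defocus
    state.count += 1                               # Step S208
    return None                                    # Step S209: the lens is not driven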
Next, the image-plane defocusing amount decision process performed in Step S103 of the defocusing amount selection process will be described with reference to the flowchart of
First, in Step S301, a maximum value is substituted for the image-plane defocusing amount. Substituting the maximum value for the image-plane defocusing amount corresponds to initialization. For example, assume that the image-plane defocusing amount is defined as signed 16-bit data. In this case, the range that the image-plane defocusing amount can take is −32768 to +32767. Since "image-plane defocusing amount = maximum value" corresponds to initialization, the maximum value +32767 is substituted. The image-plane defocusing amount substituted with the maximum value is called the image-plane defocusing amount for comparison, because it is compared against the image-plane defocusing amounts obtained for the respective image-plane AF areas when their magnitudes are evaluated.
Next, in Step S302, 1 is added to a variable i for counting the number of image-plane AF areas (i=i+1). This variable i is a value from 1 to the maximum number of image-plane AF areas. Thus, when there are 100 image-plane AF areas, for example, the image-plane AF areas are numbered from 1 to 100, and the variable i has a value from 1 to 100. Accordingly, the image-plane defocusing amount decision process is performed on all of the image-plane AF areas by looping the processes of the following Step S303 to Step S306.
Next, in Step S303, in an image-plane AF area corresponding to the variable i to be processed, it is checked whether or not a luminance value is equal to or greater than a predetermined value, and thereby it is determined whether or not the area has low contrast. When the area is determined not to have low contrast, the process proceeds to Step S304 (No in Step S303).
Next, in Step S304, the absolute value of the image-plane defocusing amount for comparison is compared to the absolute value of the image-plane defocusing amount in the image-plane AF area corresponding to the variable i. As a result of the comparison, when the absolute value of the image-plane defocusing amount in the ith image-plane AF area is smaller than the absolute value of the image-plane defocusing amount for comparison, the process proceeds to Step S305 (Yes in Step S304). Then, in Step S305, it is set that "the absolute value of the image-plane defocusing amount for comparison = the absolute value of the image-plane defocusing amount," and the defocusing amount of the ith image-plane AF area is decided.
On the other hand, when the absolute value of the image-plane defocusing amount in the ith image-plane AF area is equal to or greater than the absolute value of the image-plane defocusing amount for comparison in Step S304, the process proceeds to Step S306 (No in Step S304) without performing the process of Step S305. In addition, when the area is determined to have low contrast in Step S303, the process also proceeds to Step S306 (Yes in Step S303) without performing the process of Step S305. In this case, since the process of Step S305 is not performed, the image-plane defocusing amount is not decided.
Next, in Step S306, it is determined whether or not the variable i reaches the number of image-plane AF areas. When the variable i does not reach the number of image-plane AF areas, the process proceeds to Step S302 (No in Step S306). Then, the processes from Step S302 to Step S306 are repeated until the variable i reaches the number of image-plane AF areas. Accordingly, the processes from Step S302 to Step S306 are performed on all of the image-plane AF areas.
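A minimal Python sketch of the loop of Steps S301 to S306 under the assumption that each image-plane AF area is represented by an object holding its defocusing amount and a low-contrast test; it keeps the amount with the smallest absolute value, starting from the 16-bit maximum used for initialization.

MAX_16BIT = 32767

def decide_image_plane_defocus(areas):
    # areas: objects with a .defocus value and an .is_low_contrast() test.
    comparison = MAX_16BIT                          # Step S301: initialization
    decided = None
    for area in areas:                              # Steps S302/S306: loop over all areas
        if area.is_low_contrast():                  # Yes in Step S303: skip this area
            continue
        if abs(area.defocus) < abs(comparison):     # Step S304
            comparison = area.defocus               # Step S305: update the comparison value
            decided = area.defocus                  # and decide this area's amount
    return decided                                  # Step S307 follows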
When the variable i reaches the number of image-plane AF areas, the process proceeds to Step S307 (Yes in Step S306). Then, in Step S307, a previously-decided image-plane defocusing amount determination process is performed.
Herein, the previously-decided image-plane defocusing amount determination process will be described with reference to the flowchart of
First, in Step S401, it is determined whether or not the previously decided image-plane defocusing amount is equal to or smaller than a fourth threshold value, which is a predetermined threshold value. When the image-plane defocusing amount is equal to or smaller than the fourth threshold value, the process proceeds to Step S402 (Yes in Step S401). Then, in Step S402, the previously decided image-plane defocusing amount is decided as the image-plane defocusing amount again.
On the other hand, when the previously decided image-plane defocusing amount is determined in Step S401 to exceed the fourth threshold value, the process proceeds to Step S403 (No in Step S401). Then, in Step S403, the defocusing amounts of the image-plane AF areas in the periphery of the image-plane AF area for which the previously decided image-plane defocusing amount was obtained are computed.
The peripheral areas are, for example, the eight image-plane AF areas surrounding the image-plane AF area for which the previously decided defocusing amount was computed, the four areas above, below, to the left of, and to the right of that area, or the like.
Next, in Step S404, it is checked whether or not defocusing amounts have been computed for all image-plane AF areas in the periphery of the image-plane AF areas. The processes of Step S403 and Step S404 are repeated until image-plane defocusing amounts of all of the peripheral image-plane AF areas are computed (No in Step S404).
Then, after the computation of the defocusing amounts is performed for all of the peripheral AF areas, the process proceeds to Step S405 (Yes in Step S404). Next, in Step S405, it is determined whether a minimum value of the defocusing amounts of all of the peripheral AF areas is less than or equal to the fourth threshold value, and when the value is determined to be less than or equal to the fourth threshold value, the process proceeds to Step S406 (Yes in Step S405).
Then, in Step S406, the minimum value of the defocusing amounts of all of the peripheral AF areas is decided to be the image-plane defocusing amount. In other words, when the previously decided image-plane defocusing amount exceeds the threshold value, the defocusing amount of the peripheral image-plane AF area corresponding to the movement destination of the subject, that is, the area to which the subject has moved, is employed as the image-plane defocusing amount.
When the minimum value of the defocusing amounts of all of the peripheral AF areas is determined to be more than the fourth threshold value in Step S405, the image-plane defocusing amount decided in the process of the flowchart of
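A minimal Python sketch of Steps S401 to S406 under stated assumptions: the comparisons against the fourth threshold value use absolute values, and the defocusing amounts of the peripheral image-plane AF areas are supplied as a list.

FOURTH_THRESHOLD = 80    # hypothetical value

def determine_previous_defocus(previous_defocus, peripheral_defocus_list):
    if abs(previous_defocus) <= FOURTH_THRESHOLD:        # Yes in Step S401
        return previous_defocus                          # Step S402: reuse it as is
    if peripheral_defocus_list:                          # Steps S403/S404: peripheral amounts
        minimum = min(peripheral_defocus_list, key=abs)
        if abs(minimum) <= FOURTH_THRESHOLD:             # Yes in Step S405
            return minimum                               # Step S406: the subject moved nearby
    return None   # fall back to the amount decided by the loop of Steps S301 to S306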
As described above, either the defocusing amount obtained by the dedicated AF sensor 1020 or the defocusing amount obtained by the image-plane AF sensor 1031 is selected and used in focus control. Accordingly, autofocus over a wide range by the image-plane AF sensor 1031 can be made compatible with highly accurate autofocus by the dedicated AF sensor 1020.
Next, a process of increasing accuracy of an image-plane defocusing amount by correcting the image-plane defocusing amount when a subject leaves all of the dedicated AF areas and is positioned on the image-plane AF areas as shown in
First, in Step S501, the dedicated AF sensor 1020 and the image-plane AF sensor 1031 respectively perform focus detection. Next, in Step S502, it is determined whether or not focus is on a subject (main subject) targeted by a user among subjects (whether or not a subject to be traced is decided). When focus is not on the main subject, the process proceeds to Step S503 (No in Step S502).
Next, in Step S503, it is checked whether or not focus detection by the dedicated AF sensor 1020 has been performed. When the focus detection by the dedicated AF sensor 1020 has been performed, the process proceeds to Step S504, and AF control is performed based on the defocusing amount obtained from the focus detection by the dedicated AF sensor 1020. As long as focus detection by the dedicated AF sensor 1020 is performed, AF control is performed in Step S504 based on the defocusing amount obtained by the dedicated AF sensor 1020. It should be noted that the AF control in Step S504 corresponds to the AF control process in Step S3 of the flowchart of
On the other hand, when the focus detection by the dedicated AF sensor 1020 has not been performed in Step S503, the process proceeds to Step S505 (No in Step S503). Then, in Step S505, a process for an AF out-of-control time is performed. When AF control is not available because focus detection by the dedicated AF sensor 1020 has not been performed, the imaging apparatus 1000 is, for example, placed in a photographing-unavailable state with the release button nullified. Such nullification of the release button may be cancelled when, for example, focus detection is subsequently performed by the dedicated AF sensor 1020.
Description will return to Step S502. When focus is determined to be on the subject targeted by the user among subjects in Step S502, the process proceeds to Step S506 (Yes in Step S502). Next, in Step S506, it is checked whether or not focus detection has been performed by the dedicated AF sensor 1020 or the image-plane AF sensor 1031. When the focus detection is performed by neither the dedicated AF sensor 1020 nor the image-plane AF sensor 1031, the process proceeds to Step S505, and the process for AF out-of-control time is performed (No in Step S506). The process for AF out-of-control time is, for example, nullification of the release button as described above. This is because photographing is difficult to perform when neither the dedicated AF sensor 1020 nor the image-plane AF sensor 1031 is available to perform focus detection. Nullification of the release button may be cancelled when, for example, focus detection is performed by the dedicated AF sensor 1020 thereafter.
On the other hand, when focus detection is determined to have been performed by the dedicated AF sensor 1020 or the image-plane AF sensor 1031 in Step S506, the process proceeds to Step S507 (Yes in Step S506). Next, in Step S507, it is determined whether or not the main subject is being focused and traced. This determination can be made by checking whether or not, among the plurality of AF areas, there is an area having a focus deviation amount equal to or smaller than a predetermined value, that is, an AF area in which the main subject of the previous AF operation is substantially in focus.
When the main subject is not focused or traced, the process proceeds to Step S503 (No in Step S507). Then, if focus detection by the dedicated AF sensor 1020 is possible in Step S503, AF control is performed based on a defocusing amount detected by the dedicated AF sensor 1020 in Step S504. In addition, if focus detection by the dedicated AF sensor 1020 is unavailable in Step S503, the process for AF out-of-control time is performed in Step S505.
When the main subject is confirmed as being traced in Step S507, the process proceeds to Step S508 (Yes in Step S507). Next, in Step S508, it is checked whether or not the area in which the main subject is detected as being traced is a dedicated AF area. When the main subject is detected in a dedicated AF area (Yes in Step S508), the display unit displays the areas of the dedicated AF sensor 1020 and the image-plane AF sensor 1031 in Step S509.
In the display of the area in Step S509, for example, crosses overlapping the subject among crosses indicating the image-plane AF areas may be indicated by thick lines as shown in
Next, in Step S510, the difference between a defocusing amount in the dedicated AF area overlapping the subject and a defocusing amount in the image-plane AF area is computed, and stored in a storage unit, a cache memory, or the like of the imaging apparatus 1000.
As a method for computing the difference, for example, there is a method for obtaining the difference of respective defocusing amounts detected in an overlapping dedicated AF area and image-plane AF area. In addition, the difference may be obtained by associating a defocusing amount of one dedicated AF area and the average of defocusing amounts of a plurality of image-plane AF areas in the periphery of the dedicated AF area. Furthermore, the difference of defocusing amounts is also affected by an aberration property of the photographing lens 1011, and thus when, for example, a subject is positioned apart from substantially the center of a frame, an offset amount may be added to the difference, considering an aberration amount of the photographing lens 1011.
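A minimal Python sketch of one possible difference computation for Step S510, assuming the averaging variant described above and a hypothetical linear aberration offset; the parameter names and the offset model are assumptions, not values given in the text.

def compute_correction_difference(dedicated_defocus, peripheral_image_plane_defocus,
                                  subject_offset_from_center=0.0,
                                  aberration_offset_per_pixel=0.0):
    # peripheral_image_plane_defocus: non-empty list of image-plane defocusing
    # amounts around the overlapping dedicated AF area.
    average_image_plane = (sum(peripheral_image_plane_defocus)
                           / len(peripheral_image_plane_defocus))
    difference = dedicated_defocus - average_image_plane
    # Optional compensation for lens aberration when the subject is off-center
    # (linear model, purely illustrative).
    difference += aberration_offset_per_pixel * subject_offset_from_center
    return difference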
As will be described later in detail, the difference is used to correct focus adjustment when the main subject leaves all of the dedicated AF areas and is positioned only in the image-plane AF areas.
Next, in Step S504, AF control is performed based on the defocusing amount of the dedicated AF sensor 1020. This is because, when the main subject overlaps a dedicated AF area, AF control is better performed using the defocusing amount of the dedicated AF sensor 1020, which provides higher AF accuracy than the image-plane AF sensor 1031. Then, the process returns to Step S501.
Description will return to Step S508. When the area in which the main subject is detected as being traced is determined not to be a dedicated AF area in Step S508, the process proceeds to Step S511 (No in Step S508).
The area in which the main subject is being traced is not a dedicated AF area when the main subject is detected in the image-plane AF areas only by the image-plane AF sensor 1031. Thus, next in Step S511, the image-plane AF area in which the main subject is detected is specified. As a method for specification, for example, an area for which a defocusing amount equal to or smaller than a predetermined value is detected is specified from a plurality of image-plane AF areas near a dedicated AF area in which a main subject has been detected, and a subject detected in the specified area is assumed to be the same subject as the main subject.
Next, in Step S512, the plurality of image-plane AF areas considered to overlap the main subject are grouped, and a predetermined data process such as an averaging process of defocusing amounts detected in the image-plane AF areas is performed so that tracing of AF is smoothly performed.
Next, in Step S513, it is determined whether or not the plurality of grouped image-plane AF areas are near the position of the main subject in the previous process. Tracing is continued only when the plurality of grouped image-plane AF areas are near the area in which the subject was detected in the previous focus detection, so that a subject other than the main subject is not erroneously focused on. Here, being near means, for example, that the areas are adjacent to each other.
When the plurality of grouped image-plane AF areas are not near the position of the main subject in the previous process, the process proceeds to Step S505 (No in Step S513). Then, in Step S505, the process for AF out-of-control time is performed. The process for AF out-of-control time is the same as described above.
On the other hand, when the plurality of grouped image-plane AF areas are near the position of the main subject in the previous process, the process proceeds to Step S514 (Yes in Step S513). Then, in Step S514, using the difference of the defocusing amounts computed and stored in Step S510, the defocusing amount detected by the image-plane AF sensor 1031 is corrected.
In general, the accuracy of focus detection by the image-plane AF sensor is lower than that of the dedicated AF sensor in many cases. Thus, in a state in which the dedicated AF sensor 1020 can perform focus detection, the difference between the two focus detection results is computed for dedicated AF areas and image-plane AF areas that overlap each other. Then, when the subject overlaps only image-plane AF areas, the focus detection result of the image-plane AF sensor 1031 is corrected using the difference. Accordingly, the image-plane AF sensor 1031 alone can perform focus detection with accuracy comparable to that of the dedicated AF sensor 1020.
Next, in Step S515, the areas traced by the image-plane AF sensor 1031 are displayed. In the display of the areas in Step S515, for example, the crosses and the frame overlapping the subject, among the crosses indicating the image-plane AF areas and the frames indicating the dedicated AF areas, may be indicated by thick lines as shown in
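A minimal Python sketch, under stated assumptions, of how the stored difference could be maintained and applied: while the main subject overlaps a dedicated AF area the difference is refreshed (Step S510), and when the subject overlaps only image-plane AF areas the last stored difference corrects the image-plane defocusing amount (Step S514). The additive sign convention is an assumption.

class ImagePlaneDefocusCorrector:
    def __init__(self):
        self.stored_difference = 0.0

    def update(self, dedicated_defocus, image_plane_defocus):
        # Called while the main subject overlaps a dedicated AF area (Step S510).
        self.stored_difference = dedicated_defocus - image_plane_defocus

    def correct(self, image_plane_defocus):
        # Called when the subject overlaps only image-plane AF areas (Step S514).
        return image_plane_defocus + self.stored_difference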
Then, in Step S516, AF control is performed based on the corrected defocusing amount of the image-plane AF sensor 1031. The AF control corresponds to the AF control process in Step S3 of the flowchart of
As described above, in the image-plane defocusing amount correction process, when both the dedicated AF sensor 1020 and the image-plane AF sensor 1031 can perform focus detection, the difference between a defocusing amount of the dedicated AF sensor 1020 and a defocusing amount of the image-plane AF sensor 1031 is constantly computed. Then, when the subject leaves all of the dedicated AF areas and only the image-plane AF sensor 1031 can perform focus detection, the defocusing amount of the image-plane AF sensor 1031 is corrected using the computed difference. Accordingly, the accuracy of focus detection by the image-plane AF sensor 1031 can be improved, and highly accurate autofocus and a wide range of AF areas can both be achieved.
Next, a second embodiment of the present technology will be described.
The subject detection unit 1076 detects a subject from an image of the supplied image data. The subject is, for example, the face of a person or the like. In the second embodiment, the subject is a person, and a case in which the face of the person is detected will be exemplified. However, the target to be detected by the subject detection unit 1076 does not have to be the face of a person; animals, buildings, and the like are also possible as long as they are detectable objects.
As a detection method, template matching based on the shape of a face, template matching based on the luminance distribution of a face, a method based on feature amounts of the skin or face of a person included in an image, and the like can be used. In addition, these methods may be combined to increase the accuracy of face detection. It should be noted that, since the constituent elements other than the subject detection unit 1076 are the same as those of the first embodiment, description thereof will not be repeated.
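As an illustration only, the following Python sketch performs face detection with an OpenCV cascade classifier; this is a stand-in for the template matching and feature-amount methods described above, and it assumes that OpenCV and its bundled cascade file are available.

import cv2

def detect_faces(gray_image):
    # gray_image: an 8-bit grayscale image as a NumPy array.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    # Returns a list of (x, y, width, height) rectangles for detected faces.
    return cascade.detectMultiScale(gray_image, scaleFactor=1.1, minNeighbors=5)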
Next, a process performed in the second embodiment will be described. First, an overview of a focusing process performed in the present embodiment will be described with reference to
In the first example of
Then, when focus is on the subject, and then the subject moves as shown in
In the second example of
In addition, when focus is on the subject, and then the subject moves as shown in
On the other hand, when the subject does not enter the AF areas within the predetermined period of time, another subject positioned in the AF areas is focused on as shown in
Next, the defocusing amount selection process included in the overall flowchart described above will be described with reference to the flowcharts of
After an image-plane defocusing amount decision process is performed in Step S1001, the process proceeds to Step S1002. It should be noted that the image-plane defocusing amount decision process of the second embodiment will be described later in detail. However, the image-plane defocusing amount decision process of the second embodiment is also a process in which defocusing amounts are computed for each of a plurality of image-plane AF areas and an image-plane defocusing amount is decided in the same manner as in the first embodiment.
Next, in Step S1002, it is determined whether or not the face of a subject has been detected in a photographed screen. When the face has not been detected, the process proceeds to Step S104 (No in Step S1002).
On the other hand, when the face has been detected, the process proceeds to Step S1003 (Yes in Step S1002). Next, in Step S1003, it is determined whether or not the detected face overlaps dedicated AF areas. When the face overlaps the dedicated AF areas, a minimum defocusing amount among the defocusing amounts of the dedicated AF areas located in the region detected as the face is set to be a selected defocusing amount in Step S1004 (Yes in Step S1003).
When the detected face does not overlap the dedicated AF areas in Step S1003, the process proceeds to Step S1005 (No in Step S1003). Next, in Step S1005, it is determined whether or not the detected face overlaps image-plane AF areas. When the face overlaps the image-plane AF areas, a minimum defocusing amount among the defocusing amounts of the plurality of image-plane AF areas located in the region detected as the face is set to be a selected defocusing amount in Step S1006 (Yes in Step S1005).
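A minimal Python sketch of Steps S1002 to S1006 under stated assumptions: each AF area is represented by an object holding its defocusing amount and an overlap test against the detected face rectangle, and returning None stands for falling back to the selection of the first embodiment (Step S104 onward).

def select_face_defocus(face_rect, dedicated_areas, image_plane_areas):
    if face_rect is None:                                         # No in Step S1002
        return None    # fall back to the selection from Step S104 onward
    dedicated_hits = [a.defocus for a in dedicated_areas if a.overlaps(face_rect)]
    if dedicated_hits:                                            # Yes in Step S1003
        return min(dedicated_hits, key=abs)                       # Step S1004
    image_plane_hits = [a.defocus for a in image_plane_areas if a.overlaps(face_rect)]
    if image_plane_hits:                                          # Yes in Step S1005
        return min(image_plane_hits, key=abs)                     # Step S1006
    return None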
Since other processes are the same as those of the first embodiment, description thereof will not be repeated. It should be noted that a stabilization process is also the same as that of the first embodiment.
Next, an image-plane defocusing amount decision process in the second embodiment will be described with reference to the flowchart of
First, in Step S3001, a maximum value is substituted for an image-plane face defocusing amount. The image-plane face defocusing amount refers to the defocusing amount of an image-plane AF area overlapping the region detected as the face of the subject in the photographed screen. Substituting the maximum value for the image-plane face defocusing amount corresponds to initialization. For example, assume that the image-plane face defocusing amount is defined as signed 16-bit data. In this case, the range that the image-plane face defocusing amount can take is −32768 to +32767. Since "image-plane face defocusing amount = maximum value" corresponds to initialization, the maximum value +32767 is substituted. The image-plane face defocusing amount substituted with the maximum value is called the image-plane face defocusing amount for comparison, because it is compared against the image-plane defocusing amounts obtained for the respective image-plane AF areas overlapping the face region when their magnitudes are evaluated.
In addition, in Step S3001, the maximum value is also substituted for the image-plane defocusing amount for comparison in the same manner as in the first embodiment. The handling of the variable i in Step S302 is likewise the same as in the first embodiment.
In Step S303, when the area is determined not to have low contrast, the process proceeds to Step S3002 (No in Step S303). Next, in Step S3002, it is checked whether or not the image-plane AF area corresponding to the variable i, among the plurality of image-plane AF areas, overlaps the region detected as the face.
When the image-plane AF area corresponding to the variable i overlaps the face region, the process proceeds to Step S3003 (Yes in Step S3002). Next, in Step S3003, the absolute value of the image-plane face defocusing amount for comparison is compared to the absolute value of the image-plane defocusing amount in an image-plane AF area. As a result of the comparison, when the absolute value of the image-plane defocusing amount in the ith image-plane AF area is smaller than the absolute value of the image-plane face defocusing amount for comparison, the process proceeds to Step S3004 (No in Step S3003). Then, in Step S3004, the defocusing amount of the ith image-plane AF area overlapping the face region is decided.
On the other hand, when the absolute value of the image-plane defocusing amount in the ith image-plane AF area is greater than the absolute value of the image-plane face defocusing amount for comparison in Step S3003, the process proceeds to Step S304 (Yes in Step S3003) without performing the process of Step S3004. In addition, when the image-plane AF area corresponding to the variable i does not overlap the face region in Step S3002, the process also proceeds to Step S304 (No in Step S3002) without performing the process of Step S3004. Since the process of Step S3004 is not performed in this case, the image-plane defocusing amount of the ith image-plane AF area is not decided as a face defocusing amount. As described above, in the second embodiment, the defocusing amount of the image-plane AF area overlapping the region detected as the face is decided.
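A minimal Python sketch of the face-region variant of the decision loop (Steps S3001 to S3004) under the same assumptions as the earlier sketch: among the image-plane AF areas that are not low contrast and overlap the detected face region, the defocusing amount with the smallest absolute value is kept.

MAX_16BIT = 32767

def decide_face_defocus(areas, face_rect):
    face_comparison = MAX_16BIT                      # Step S3001: initialization
    decided = None
    for area in areas:
        if area.is_low_contrast():                   # Step S303
            continue
        if not area.overlaps(face_rect):             # No in Step S3002
            continue
        if abs(area.defocus) < abs(face_comparison): # Step S3003
            face_comparison = area.defocus           # Step S3004: decide this area's amount
            decided = area.defocus
    return decided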
The processes in the second embodiment are performed as described above. In the second embodiment, since focus control is performed based on the defocusing amount of the AF area overlapping the region detected as the face of the subject, focus control is possible based on the face positions as shown in
The processes in the present technology are performed as described above. In general, when a subject leaves all of the AF areas of the dedicated AF sensor 1020 in a state in which the subject has been focused and traced, there are cases in which another subject present in the background of the subject targeted by the user is focused on instead. According to the present technology, however, since a subject can be detected by the image-plane AF sensor 1031 over a wide range, focus can be kept on the subject once it has been focused, even when the subject leaves all of the AF areas of the dedicated AF sensor 1020, and erroneous focusing on another subject can be prevented.
In addition, when a tracing operation is performed with focus on a subject whom the user desires and another subject approaches and enters the frame, there are cases in which the latter subject is focused on instead. According to the present technology, however, once focus is on a subject, the focus is not shifted even when another subject approaches, and focus can be kept continuously on the subject whom the user desires.
In addition, since the image-plane AF sensor 1031 having a wide focus range is used in addition to the dedicated AF sensor 1020, even when a position of a subject is significantly changed, the subject can be reliably detected and traced. Furthermore, when the face or the like of a subject is detected, and the face or the like overlaps image-plane AF areas, focus control is performed using image-plane defocusing amounts thereof, and thus a subject can be traced in a more extensive range than before.
Hereinabove, although the embodiments of the present technology have been described in detail, the present technology is not limited to the embodiments described above, and can be variously modified based on the technical gist thereof.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Additionally, the present technology may also be configured as below.
(1) An imaging apparatus including:
a first focus detection unit that is provided in an image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through a photographing lens; and
a second focus detection unit that is provided so as to be positioned above the image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through the photographing lens.
(2) The imaging apparatus according to (1), wherein the second focus detection unit is a dedicated phase difference focus detection module.
(3) The imaging apparatus according to (1) or (2), wherein the first focus detection unit includes a phase difference focus detection element provided in the image sensor.
(4) The imaging apparatus according to any one of (1) to (3), further including:
an optical member that splits subject image light that has passed through the photographing lens into incident light of the image sensor and incident light of the dedicated phase difference focus detection module.
(5) The imaging apparatus according to any one of (1) to (4), further including:
an electronic view finder that displays an image obtained using the image sensor.
The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2012-199534 filed in the Japan Patent Office on Sep. 11, 2012, the entire content of which is hereby incorporated by reference.