IMAGING APPARATUS

Abstract
There is provided an imaging apparatus including a first focus detection unit that is provided in an image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through a photographing lens, and a second focus detection unit that is provided so as to be positioned above the image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through the photographing lens.
Description
BACKGROUND

The present technology relates to an imaging apparatus.


In single lens reflex cameras of the related art, a so-called dedicated phase difference sensor is mounted to realize fast autofocus. On the other hand, compact cameras, mirrorless cameras, and the like generally employ a contrast detection autofocus (hereinafter referred to as AF) system. In addition, in order to realize fast AF in such cameras, a method of embedding phase difference detection elements in the image sensor has been proposed (refer to Japanese Unexamined Patent Application Publication No. 2000-156823).


Furthermore, a method of mounting both a dedicated phase difference detecting module (hereinafter referred to as a dedicated AF sensor) and a phase difference detecting image-plane sensor (hereinafter referred to as an image-plane AF sensor) that uses the above-described technique has also been proposed in order to obtain the advantages of both sensors.


SUMMARY

In such an imaging apparatus in which both the dedicated AF sensor and the image-plane AF sensor are mounted, unnecessary light reflected off the dedicated AF sensor is incident on the image sensor during photographing, particularly in a strong backlight state, which may adversely affect photographing and focus detection.


It is desirable to provide an imaging apparatus that can prevent a backlight state from adversely affecting an image sensor in a configuration in which both a dedicated AF sensor and an image-plane AF sensor are mounted.


According to an embodiment of the present technology, there is provided an imaging apparatus including a first focus detection unit that is provided in an image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through a photographing lens, and a second focus detection unit that is provided so as to be positioned above the image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through the photographing lens.


According to an embodiment of the present technology, an adverse effect of a backlight state on an image sensor in a configuration in which both a dedicated AF sensor and an image-plane AF sensor are mounted can be prevented.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic cross-sectional diagram illustrating an outlined configuration of an imaging apparatus according to the related art;



FIG. 2 is a diagram illustrating a configuration of an image sensor;



FIG. 3A is a diagram illustrating an example of an output of phase difference focus detection when there is no unnecessary incident light, and FIG. 3B is a diagram illustrating an example of an output of phase difference focus detection when there is unnecessary incident light;



FIG. 4 is a schematic cross-sectional diagram illustrating an outlined configuration of an imaging apparatus according to the present technology;



FIG. 5 is a diagram illustrating a disposition of image-plane AF areas and dedicated AF areas on a photographed screen;



FIG. 6 is a block diagram illustrating a configuration of the imaging apparatus according to the present technology;



FIG. 7 is a diagram for describing a configuration of image-plane AF areas;



FIG. 8 is a diagram for describing another configuration of image-plane AF areas;



FIGS. 9A, 9B, 9C, and 9D are diagrams for describing an overview of a process in a first embodiment;



FIGS. 10A, 10B, 10C, and 10D are diagrams for describing an overview of another process in the first embodiment;



FIG. 11 is a diagram for describing an overview of still another process in the first embodiment;



FIG. 12 is an overall flowchart for describing the processes in the first embodiment;



FIG. 13 is a flowchart for describing a defocusing amount selection process in the first embodiment;



FIG. 14 is a flowchart for describing a stabilization process;



FIG. 15 is a flowchart for describing an image-plane defocusing amount decision process in the first embodiment;



FIG. 16 is a flowchart for describing a previously decided image-plane defocusing amount determination process;



FIG. 17 is a flowchart for describing an image-plane defocusing amount correction process;



FIG. 18 is a flowchart for describing the image-plane defocusing amount correction process;



FIG. 19 is a block diagram illustrating a configuration of an imaging apparatus according to a second embodiment of the present technology;



FIGS. 20A, 20B, 20C, and 20D are diagrams for describing a first example of an overview of a process in the second embodiment;



FIGS. 21A, 21B, 21C, and 21D are diagrams for describing a second example of the overview of the process in the second embodiment;



FIG. 22 is a flowchart for describing another defocusing amount selection process in the first embodiment;



FIG. 23 is a flowchart for describing the defocusing amount selection process in the first embodiment; and



FIG. 24 is a flowchart for describing an image-plane defocusing amount decision process in the second embodiment.





DETAILED DESCRIPTION OF THE EMBODIMENT(S)

Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.


Hereinafter, embodiments of the present technology will be described with reference to the appended drawings. Note that description will be provided in the following order.

  • <1. Embodiments>
  • [1-1. Configuration of an imaging apparatus of the related art]
  • [1-2. Configuration of an imaging apparatus according to an embodiment of the present technology]
  • <2. First embodiment of a process in the imaging apparatus>
  • [2-1. Configuration of the imaging apparatus]
  • [2-2. Overview of a process]
  • [2-3. Defocusing amount selection process]
  • [2-4. Image-plane defocusing amount decision process]
  • [2-5. Image-plane defocusing amount correction process]
  • <3. Second embodiment of a process in the imaging apparatus>
  • [3-1. Configuration of the imaging apparatus]
  • [3-2. Overview of a process]
  • [3-3. Defocusing amount selection process]
  • <4. Modified example>


1. Embodiments
[1-1. Configuration of an Imaging Apparatus of the Related Art]

First, an example of a configuration of an imaging apparatus 100 according to the related art will be described with reference to FIG. 1. The imaging apparatus 100 has a housing 110, an optical imaging system 120, a semi-transmissible mirror 130, an image sensor 140, a phase difference detection element 150 embedded in the image sensor (hereinafter referred to as an image-plane AF sensor 150), a dedicated phase difference AF module 160 (hereinafter referred to as a dedicated AF sensor 160), a pentaprism 170, a finder 180, and a display 190.


As shown in FIG. 1, the optical imaging system 120 is provided in the housing 110 constituting the main body of the imaging apparatus 100. The optical imaging system 120 is, for example, a so-called replaceable lens unit, and is provided with a photographing lens 122, a diaphragm, and the like inside a lens barrel 121. The photographing lens 122 is driven by a focus drive system (not shown), and designed to enable AF operations. It should be noted that the optical imaging system 120 may be configured as one body with the housing 110.


The semi-transmissible mirror 130 is provided between the photographing lens 122 and the image sensor 140 in the housing 110. Light from a subject is incident on the semi-transmissible mirror 130 through the photographing lens 122. The semi-transmissible mirror 130 reflects part of the subject light incident through the photographing lens 122 in the direction of the dedicated AF sensor 160 positioned below the mirror, reflects part of the subject light in the direction of the pentaprism 170 positioned above the mirror, and further transmits part of the subject light toward the image sensor 140. In addition, a total-reflection mirror 131 is provided as a sub-mirror on the image sensor 140 side of the semi-transmissible mirror 130. The total-reflection mirror 131 guides subject light that has been transmitted through the semi-transmissible mirror 130 to the dedicated AF sensor 160. During an AF operation, subject light for dedicated AF is transmitted through the semi-transmissible mirror 130, bent downward by the total-reflection mirror 131, and then incident on the dedicated AF sensor 160. In addition, during photographing, the semi-transmissible mirror 130 and the total-reflection mirror 131 are retracted, and the subject light is guided to the image sensor 140.


The image sensor 140 for generating photographed images is provided inside the housing 110. As the image sensor 140, a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like is used. The image sensor 140 photoelectrically converts subject light incident through the photographing lens 122 into an amount of electric charge, and thereby generates images. Image signals undergo predetermined signal processes such as a white balance adjustment process and a gamma correction process, and are then stored as image data in a storage medium in the imaging apparatus 100, an external memory, or the like.


The image sensor 140 has R (Red) pixels, G (Green) pixels, and B (Blue) pixels, which are general imaging pixels, and phase difference detection elements for phase difference focus detection. The pixels constituting the image sensor photoelectrically convert incident light from a subject into an amount of electric charge, and output a pixel signal.



FIG. 2 is a diagram illustrating an array state of the general pixels and the phase difference detection elements of the image sensor 140. R indicates R (Red) pixels, G indicates G (Green) pixels, and B indicates B (Blue) pixels, all of which are general imaging pixels.


In addition, in FIG. 2, P1 indicates a first phase difference detection element, and P2 indicates a second phase difference detection element. The phase difference detection elements are configured to form pairs of P1 and P2, and perform pupil division of light from the photographing lens 122. The phase difference detection elements P1 and P2 have an optical feature different from that of the general imaging pixels. It should be noted that, in FIG. 2, G pixels are set as phase difference detection elements. This is because there are twice as many G pixels as there are R pixels or B pixels. However, the phase difference detection elements are not limited to the G pixels.


In this manner, the image sensor 140 has the image-plane AF sensor 150 using the phase difference detection elements in addition to the general pixels, and the imaging apparatus 100 can perform so-called image-plane phase difference AF (Autofocus) using an output from the image-plane AF sensor 150.


The dedicated AF sensor 160 is provided below the semi-transmissible mirror 130 inside the housing 110 so as to be positioned in front of the image sensor 140. The dedicated AF sensor 160 is a dedicated autofocus sensor of, for example, a phase difference detection AF system, a contrast detection AF system, or the like. As an AF system, the phase difference detection system and the contrast AF system may be combined. In order to satisfactorily perform AF in a dark place or on a subject with low contrast, AF auxiliary light may be emitted and an AF evaluation value may be obtained from the returning light. Subject light collected by the photographing lens is reflected by the semi-transmissible mirror and then incident on the dedicated AF sensor 160. A focus detection signal detected by the dedicated AF sensor 160 is supplied to a processing unit that performs computation of a defocusing amount in the imaging apparatus 100.


Description will return to the configuration of the imaging apparatus 100. The pentaprism 170 is a prism having a pentagonal cross-section, and reflects subject light incident from below so as to invert the subject image vertically and horizontally, thereby forming an upright image. The upright subject image formed by the pentaprism 170 is guided in the direction of the finder 180. The finder 180 functions as an optical finder through which subjects are checked during photographing. A user can check an image of a subject by looking into the finder window.


The display 190 is provided in the housing 110. The display 190 is a flat display such as an LCD (Liquid Crystal Display) or an organic EL (Electroluminescence) display. Image data obtained by processing an image signal output from the image sensor 140 in a signal processing unit (not shown) is supplied to the display 190, and the display 190 displays the image data as a real-time image (a so-called through image). In FIG. 1, the display 190 is provided on the back side of the housing, but the disposition is not limited thereto, and the display may be provided on an upper face or the like of the housing, and may be of a movable type or a detachable type.


The imaging apparatus 100 according to the related art is configured as described above. When photographing is performed with the imaging apparatus 100, if the sun lies in the photographing direction and the imaging apparatus is accordingly in a strong backlight state, there is concern that unnecessary light reflected off a face of the dedicated AF sensor 160 will be incident on the image sensor 140 as shown in FIG. 1, adversely affecting detection of a focus by the image-plane AF sensor 150.



FIGS. 3A and 3B are diagrams showing signal output examples of a phase difference focus detection system of the image-plane AF sensor 150. Generally, when no unnecessary light is incident in the phase difference focus detection system, the two images (a P1 image and a P2 image) have substantially the same shape and the same output level as shown in FIG. 3A. On the other hand, when the imaging apparatus is in a strong backlight state and unnecessary light is incident on an image sensor with phase difference detection elements embedded, the two images have different shapes, or the output level of one of the two images gradually decreases as shown in FIG. 3B, and thus accurate detection of a focus is difficult.


[1-2. Configuration of an Imaging Apparatus According to an Embodiment of the Present Technology]

Next, a configuration of an imaging apparatus according to the present technology will be described. FIG. 4 is a schematic cross-sectional diagram illustrating an outlined configuration of the imaging apparatus 1000 according to the present technology.


The imaging apparatus 1000 according to the present technology has a housing 1001, an optical imaging system 1010 provided with a photographing lens 1011, a semi-transmissible mirror 1002, an image sensor 1030, an image-plane AF sensor 1031, a dedicated AF sensor 1020, an electronic view finder 1003, and a display 1004. It should be noted that, since the configurations of the housing 1001, the optical imaging system 1010, the image sensor 1030, the image-plane AF sensor 1031, and the display 1004 are the same as those of the imaging apparatus of the related art described above, description thereof will not be repeated.


The semi-transmissible mirror 1002 is provided in the housing 1001 between the photographing lens 1011 and the image sensor 1030. Subject light is incident on the semi-transmissible mirror 1002 via the photographing lens 1011. The semi-transmissible mirror 1002 reflects part of the subject light incident through the photographing lens in the direction of the dedicated AF sensor 1020 positioned above, and transmits part of the subject light toward the image sensor 1030.


The dedicated AF sensor 1020 is provided so as to be positioned above the semi-transmissible mirror 1002 and in front of the image sensor 1030 in the housing 1001. The dedicated AF sensor 1020 is a dedicated autofocus module of, for example, a phase difference detection system or a contrast AF system. Subject light collected by the photographing lens 1011 is reflected by the semi-transmissible mirror 1002 and then incident on the dedicated AF sensor 1020. A focus detection signal detected by the dedicated AF sensor 1020 is supplied to a processing unit that performs computation of a defocusing amount in the imaging apparatus 1000.



FIG. 5 is a diagram illustrating AF areas of the dedicated AF sensor 1020 on a photographed screen (hereinafter referred to as dedicated AF areas) and AF areas of the image-plane AF sensor 1031 on the photographed screen (hereinafter referred to as image-plane AF areas).


In FIG. 5, the areas indicated by square frames are the dedicated AF areas. As understood from FIG. 5, the dedicated AF areas are disposed in a narrower range than the image-plane AF areas, and concentrated substantially in the vicinity of the center. The dedicated AF sensor 1020 can detect a focus with higher accuracy than the image-plane AF sensor 1031.


The areas indicated by crosses in FIG. 5 are the image-plane AF areas. As understood from FIG. 5, the image-plane AF areas are spread over a wide range, and can capture a subject over a wide range.


There are cases in which it is difficult to uniformly dispose the AF areas at an equal interval in the dedicated AF sensor 1020 due to the disposition of the areas in its dedicated optical system. For this reason, when detection results of the dedicated AF areas and the image-plane AF areas are compared as in the present technology, it is preferable to align the positions of the two kinds of AF areas. To this end, the image-plane AF areas are unevenly disposed so that the positions of the image-plane AF areas are associated with the positions of the dedicated AF areas as shown in FIG. 5. The method of disposition will be described later.


The electronic view finder (EVF) 1003 is provided in the housing 1001. The electronic view finder 1003 has, for example, a liquid crystal display, an organic EL display, or the like. Image data obtained by processing an image signal output from the image sensor 1030 in a signal processing unit (not shown) is supplied to the electronic view finder 1003, and the electronic view finder 1003 displays the image data as a real-time image (through image).


The imaging apparatus according to the present technology is configured as described above. In the imaging apparatus according to the present technology, the dedicated AF sensor 1020 is provided above the semi-transmissible mirror 1002 in the housing 1001 of the imaging apparatus 1000. Thus, even when the sun is in the photographing direction and the imaging apparatus is accordingly in a strong backlight state, unnecessary light is not reflected off a face of the dedicated AF sensor 1020 onto the image sensor 1030, as shown in FIG. 4. Thus, it is possible to prevent unnecessary light from adversely affecting detection of a focus by the image-plane AF sensor 1031.


It should be noted that, since a light source such as the sun or an illuminating device is positioned higher than an imaging apparatus during photographing in many cases, light is also incident on the imaging apparatus from above. Thus, by providing the dedicated AF sensor 1020 above the image sensor as in the present technology, it is possible to prevent unnecessary light from being reflected on the dedicated AF sensor 1020 and incident on the image sensor 1030.


It should be noted that, in the present technology, the dedicated AF sensor 1020 is provided at the position at which a pentaprism would be provided in the related art, and thus a pentaprism is not provided; the electronic view finder is therefore preferably used as the finder.


<2. First Embodiment of a Process in the Imaging Apparatus>
[2-1. Configuration of the Imaging Apparatus]

The imaging apparatus 1000 of FIG. 6 is configured to include the optical imaging system 1010, the dedicated AF sensor 1020, the image sensor 1030, the image-plane AF sensor 1031, a pre-processing circuit 1040, a camera processing circuit 1050, an image memory 1060, a control unit 1070, a graphic I/F (Interface) 1080, a display unit 1090, an input unit 1100, an R/W (reader and writer) 1110, and a storage medium 1120. The control unit 1070 functions as a defocusing amount computation unit 1071, a defocusing amount selection unit 1072, a defocusing amount decision unit 1073, a defocusing amount correction unit 1074, and a focus control unit 1075.


The optical imaging system 1010 is configured to include the photographing lens 1011 for collecting light from a subject on the image sensor 1030 (including a focus lens, a zoom lens, and the like), a lens drive mechanism 1012 that adjusts focus by moving the focus lens, a shutter mechanism, an iris mechanism, and the like. The system is driven based on a control signal from the control unit 1070 and the focus control unit 1075. The lens drive mechanism 1012 realizes an AF operation by moving the photographing lens 1011 in an optical axis direction corresponding to a defocusing amount supplied from the focus control unit 1075. A light image of a subject obtained through the optical imaging system 1010 is formed on the image sensor 1030 serving as an imaging device.


The dedicated AF sensor 1020 is a dedicated autofocus sensor of, for example, the phase difference detection AF system, the contrast detection AF system, or the like. Subject light collected by the photographing lens 1011 is reflected by the semi-transmissible mirror and incident on the dedicated AF sensor 1020. A focus detection signal detected by the dedicated AF sensor 1020 is supplied to the defocusing amount computation unit 1071. The dedicated AF sensor 1020 corresponds to a first focus detection unit according to an embodiment of the present disclosure. Thus, a defocusing amount obtained from focus detection by the dedicated AF sensor 1020 corresponds to a first defocusing amount according to an embodiment of the present disclosure.


The image sensor 1030 has R (Red) pixels, G (Green) pixels, and B (Blue) pixels, which are normal imaging pixels, and phase difference detection elements for phase difference focus detection. The pixels constituting the image sensor 1030 photoelectrically convert light incident from a subject into an amount of electric charge, and output a pixel signal. In addition, the image sensor 1030 finally outputs an imaging signal that includes the pixel signal to the pre-processing circuit 1040. As the image sensor 1030, a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like is used. It should be noted that a detailed configuration of the image sensor 1030 will be described later.


The image-plane AF sensor 1031 is a sensor for autofocus that includes a plurality of phase difference detection elements. A focus detection signal detected by the image-plane AF sensor 1031 is supplied to the defocusing amount computation unit 1071. A detailed configuration of the image-plane AF sensor 1031 will be described later. The image-plane AF sensor 1031 corresponds to a second focus detection unit according to an embodiment of the present disclosure. Thus, a defocusing amount obtained from detection of focus by the image-plane AF sensor 1031 corresponds to a second defocusing amount according to an embodiment of the present disclosure.


The pre-processing circuit 1040 performs a CDS (Correlated Double Sampling) process including sample-and-hold processing and the like on the imaging signal output from the image sensor 1030 so that a satisfactory S/N (Signal to Noise) ratio is maintained. Furthermore, gain is controlled in an AGC (Auto Gain Control) process, A/D (Analog to Digital) conversion is performed, and a digital image signal is thereby output.


The camera processing circuit 1050 performs signal processes such as a white balance adjustment process, a color correction process, a gamma correction process, a Y/C conversion process, an AE (Auto Exposure) process, and the like on the image signal output from the pre-processing circuit 1040.


The image memory 1060 is a buffer memory configured as a volatile memory, for example, a DRAM (Dynamic Random Access Memory), and temporarily stores image data that has undergone the predetermined processes by the pre-processing circuit 1040 and the camera processing circuit 1050.


The control unit 1070 is constituted by, for example, a CPU, a RAM, a ROM, and the like. The ROM stores programs read and operated by the CPU, and the like. The RAM is used as a work memory of the CPU. The CPU controls the entire imaging apparatus 1000 by executing various processes according to the programs stored in the ROM and issuing commands.


In addition, the control unit 1070 functions as the defocusing amount computation unit 1071, the defocusing amount selection unit 1072, the defocusing amount decision unit 1073, the defocusing amount correction unit 1074, and the focus control unit 1075 by executing a predetermined program. Each of these units may alternatively be realized as dedicated hardware having the corresponding function, rather than by a program. In this case, the imaging apparatus 1000 is configured to include that hardware.


The defocusing amount computation unit 1071 computes a defocusing amount that indicates a deviation amount from focus based on a phase difference detection signal acquired by the dedicated AF sensor 1020 or the image-plane AF sensor 1031. The defocusing amount selection unit 1072 performs a process of selecting which of a defocusing amount obtained from a detection result of the dedicated AF sensor 1020 (hereinafter referred to as a dedicated defocusing amount) and a defocusing amount obtained from a focus detection result of the image-plane AF sensor 1031 (hereinafter referred to as an image-plane defocusing amount) will be used in focus control, and employs the selected amount. A detailed process performed by the defocusing amount selection unit 1072 will be described later.
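
As an informal illustration only (not part of the embodiment), the following sketch shows one common way a defocusing amount can be derived from a pair of pupil-divided signal sequences: the image shift between the two sequences is found by a simple correlation search and converted into a defocusing amount. The function name, the sum-of-absolute-differences metric, and the conversion coefficient shift_to_defocus are assumptions.

    def compute_defocus(p1, p2, shift_to_defocus=1.0, max_shift=16):
        # p1, p2: output sequences of the paired phase difference detection
        # elements (the P1 image and the P2 image).
        best_shift, best_score = 0, float("inf")
        n = len(p1)
        for s in range(-max_shift, max_shift + 1):
            lo, hi = max(0, s), min(n, n + s)
            if hi <= lo:
                continue
            # Mean absolute difference over the overlapping region.
            score = sum(abs(p1[i] - p2[i - s]) for i in range(lo, hi)) / (hi - lo)
            if score < best_score:
                best_shift, best_score = s, score
        # The detected image shift (phase difference) is converted into a
        # defocusing amount by a lens-dependent coefficient (assumed here).
        return best_shift * shift_to_defocus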


The defocusing amount decision unit 1073 performs a process of deciding a defocusing amount for each image-plane AF area based on the image-plane defocusing amount computed based on the focus detection result of the image-plane AF sensor. A detailed process of the defocusing amount decision unit 1073 will be described later. The defocusing amount correction unit 1074 performs a correction process of an image-plane defocusing amount. A detailed process performed by the defocusing amount correction unit 1074 will be described later. The focus control unit 1075 controls the lens drive mechanism 1012 of the optical imaging system 1010 based on the employed defocusing amount to perform a focus adjustment process.


The graphic I/F 1080 causes an image to be displayed by generating an image signal for displaying the image on the display unit 1090 from the image signal supplied from the control unit 1070 and supplying the signal to the display unit 1090. The display unit 1090 is a display unit configured as, for example, an LCD (Liquid Crystal Display), a PDP (Plasma Display Panel), an organic EL (Electro-luminescence) panel, or the like. The display unit 1090 displays a through image being captured, an image recorded in the storage medium 1120, and the like.


The input unit 1100 includes, for example, a power button for switching between on and off of power, a release button for instructing start of recording a captured image, an operator for zoom adjustment, a touch screen integrated with the display unit 1090, and the like. When an input operation is performed on the input unit 1100, a control signal according to the input is generated and output to the control unit 1070. Then, the control unit 1070 performs an arithmetic operation process and control according to the control signal.


The R/W 1110 is an interface connected to the storage medium 1120 in which image data generated from imaging, and the like, is recorded. The R/W 1110 writes data supplied from the control unit 1070 to the storage medium 1120, and outputs data read from the storage medium 1120 to the control unit 1070. The storage medium 1120 is a large-capacity storage medium, for example, a hard disk, a Memory Stick (registered trademark of Sony Corporation), an SD memory card, or the like. Images are stored in a compressed state in the form of, for example, JPEG or the like. In addition, EXIF (Exchangeable Image File Format) data including information on the stored images and additional information such as imaging dates is also stored therein in association with the images.


Herein, a basic operation of the imaging apparatus 1000 described above will be described. Before an image is captured, signals obtained from photoelectric conversion of light sensed by the image sensor 1030 are sequentially supplied to the pre-processing circuit 1040. The pre-processing circuit 1040 performs a CDS process, an AGC process, and the like on the input signals, and further performs conversion of the signals into image signals.


The camera processing circuit 1050 performs an image quality correction process on the image signals supplied from the pre-processing circuit 1040, and supplies the result to the graphic I/F 1080 via the control unit 1070 as signals of a camera through image. Accordingly, the camera through image is displayed on the display unit 1090. A user can adjust an angle of view while viewing the through image displayed on the display unit 1090.


In this state, when the shutter button of the input unit 1100 is pressed, the control unit 1070 outputs a control signal to the optical imaging system 1010 to cause a shutter included in the optical imaging system 1010 to operate. Accordingly, image signals for one frame are output from the image sensor 1030.


The camera processing circuit 1050 performs an image quality correction process on the image signals for one frame supplied from the image sensor 1030 via the pre-processing circuit 1040, and supplies the processed image signals to the control unit 1070. The control unit 1070 encodes and compresses the input image signals and supplies the generated encoded data to the R/W 1110. Accordingly, a data file of a captured still image is stored in the storage medium 1120.


Meanwhile, when an image file stored in the storage medium 1120 is reproduced, the control unit 1070 reads the selected still image file from the storage medium 1120 through the R/W 1110 according to an input operation on the input unit 1100. The read image file is subjected to decompression and decoding processes. Then, the decoded image signals are supplied to the graphic I/F 1080 via the control unit 1070. Accordingly, the still image stored in the storage medium 1120 is displayed on the display unit 1090.


The phase difference detection elements are embedded in the image sensor 1030 as shown in, for example, FIG. 7 so as not to affect a photographed image. In the horizontal direction, pairs of elements (P and Q in the drawing) that have partially opened apertures and are pupil-divided for detecting a phase difference are disposed in a line. In addition, in the vertical direction, lines of the phase difference detection elements are embedded at an interval of several lines.


In the phase difference detection elements disposed as described above, a plurality of phase difference detection elements are set as a group to be an AF area (for example, the rectangular frame indicated by a thick line in FIG. 7), and an arithmetic operation for focus detection is performed for each area. Accordingly, by shifting the setting of the AF areas as shown in FIG. 8, the uneven disposition of the AF areas shown in FIG. 5 is possible. It should be noted that the disposition of the AF areas can be made uneven by software processing, but the AF areas can also be unevenly disposed by making the disposition of the phase difference detection elements in the image sensor 1030 itself uneven.
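
As a rough illustration of the grouping described above (and not a description of the embodiment itself), the following sketch collects phase difference detection element coordinates into AF areas whose origins are given per area; shifting the per-area origins yields an uneven disposition such as that of FIG. 5. The function name, the rectangular area shape, and the origins argument are assumptions.

    def build_af_areas(element_positions, area_width, area_height, origins):
        # element_positions: (x, y) coordinates of phase difference detection
        # elements on the image sensor.
        # origins: one assumed (x, y) origin per image-plane AF area, shifted so
        # that the image-plane AF areas line up with the dedicated AF areas.
        areas = []
        for (ox, oy) in origins:
            members = [(x, y) for (x, y) in element_positions
                       if ox <= x < ox + area_width and oy <= y < oy + area_height]
            areas.append(members)
        return areas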


[2-2. Overview of a Process]

Next, a process executed by the imaging apparatus 1000 will be described. First, an overview of a focusing process executed in the present embodiment will be described with reference to FIGS. 9A to 11. FIGS. 9A to 11 show dedicated AF areas within a photographed screen, image-plane AF areas within the photographed screen, and a subject traced using autofocus. In FIGS. 9A to 11, dashed-lined squares indicate the dedicated AF areas of the dedicated AF sensor 1020, and dashed-lined crosses indicate the image-plane AF areas of the image-plane AF sensor 1031.


First, FIG. 9A shows a state in which a subject is not present and autofocus is not performed. When a subject appears as shown in FIG. 9B, and a user inputs an AF instruction (for example, half-presses a shutter), a defocusing amount is first computed based on a focus detection result of the dedicated AF sensor 1020, and focus is set on the most proximate subject (hereinafter referred to as the proximate subject) based on the defocusing amount. To be specific, focus is set on the proximate subject by driving the photographing lens 1011 based on the defocusing amount to adjust its focus. In FIGS. 9A to 9D, AF areas in which focus is on the proximate subject are indicated by solid lines.



FIG. 9C shows a case in which the subject moves after focus is on the proximate subject. Also in this case, focus is adjusted so that the subject proximate to the current focus position (the subject with a minimum defocusing amount) remains in focus, using the defocusing amounts computed based on the respective focus detection results of the dedicated AF sensor 1020 and the image-plane AF sensor 1031. In FIG. 9C, the dedicated AF area and the image-plane AF areas in which the proximate subject is focused are all indicated by solid lines.



FIG. 9D shows a case in which the subject moves and then leaves all AF areas of the dedicated AF sensor 1020. In this case, if the subject is positioned within an image-plane AF area, focus is kept on the subject with a minimum defocusing amount using the defocusing amount of the image-plane AF sensor 1031. Thus, focus is not lost from the subject.


In FIG. 9D, the crosses of AF areas in which focus is on the subject are indicated by solid lines. It should be noted that, in the present technology, when a subject leaves all of the dedicated AF areas and is positioned only on image-plane AF areas, a process of increasing the accuracy of the defocusing amount is performed by the defocusing amount correction unit. Details of the process will be described later.



FIG. 10A shows a case in which the subject further moves, and leaves all AF areas of the dedicated AF sensor 1020 and the image-plane AF sensor 1031. In this case, the focus adjustment process is paused for a predetermined time at the final focus position until the subject is detected again by the dedicated AF sensor 1020.


When the subject within a predetermined defocusing amount is not detected by the dedicated AF sensor 1020 even after the predetermined time elapses from the pause of the focus adjustment, focus adjustment is performed so as to focus on another subject with a minimum defocusing amount of the dedicated AF sensor 1020 as shown in FIG. 10B. Accordingly, a subject being traced is changed. In FIG. 10B, the square of an AF area in which focus is on the subject is indicated by a solid line.


Even when the subject that was previously focused and traced enters AF areas of the dedicated AF sensor 1020 again as shown in FIG. 10C after the subject being traced is changed, focus adjustment is performed so that focus is on the changed subject.


It should be noted that, when the subject being traced is not a subject that a user desires, the input of the AF instruction is first released by the user (for example, release of half-pressing of the shutter) to pause the autofocus process. Then, there is no focus on any subject as shown in FIG. 10D.


In addition, when the user inputs an AF instruction again (for example, half-presses the shutter), focus adjustment is performed so that focus is on the proximate subject as shown in FIG. 11.


In the present technology as described above, a subject can be focused and traced with high accuracy by using the dedicated AF sensor 1020 and the image-plane AF sensor 1031 together.



FIG. 12 is an overall flowchart for describing the processes performed by the imaging apparatus 1000 as shown in FIGS. 9A to 11.


First, in Step S1, the defocusing amount computation unit 1071 computes defocusing amounts. The computation of the defocusing amounts is performed based on each of a focus detection result of the image-plane AF sensor 1031 and a focus detection result of the dedicated AF sensor 1020. In other words, a defocusing amount is computed based on the focus detection result of the image-plane AF sensor 1031 and a defocusing amount is computed based on the focus detection result of the dedicated AF sensor 1020.


Next, in Step S2, the defocusing amount selection unit 1072 performs a defocusing amount selection process. The defocusing amount selection process is a process of selecting which of the defocusing amounts of the image-plane AF sensor 1031 and the dedicated AF sensor 1020 will be used in focus control as a defocusing amount. Details of the defocusing amount selection process will be described later.


Next, in Step S3, the focus control unit 1075 controls driving of the focus lens based on the defocusing amount selected in the defocusing amount selection process. Accordingly, focus control is performed. Furthermore, the focus determination process in Step S4 is a process of checking whether or not focus is on the subject that the user desires. In the imaging apparatus 1000, the process is repeated as long as the user inputs an AF instruction (for example, half-presses the shutter).
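
The overall flow of FIG. 12 can be summarized by the following sketch. It is only an outline under assumed names; the read, selection, and drive functions are placeholders supplied by the caller rather than elements of the embodiment.

    def af_loop(read_dedicated_defocus, read_image_plane_defocus,
                select_defocus, drive_lens, af_instruction_held):
        # Steps S1 to S4 of FIG. 12, repeated while the AF instruction
        # (for example, half-pressing of the shutter) is held.
        while af_instruction_held():
            dedicated = read_dedicated_defocus()        # Step S1 (dedicated AF sensor)
            image_plane = read_image_plane_defocus()    # Step S1 (image-plane AF sensor)
            selected = select_defocus(dedicated, image_plane)   # Step S2
            if selected is not None:
                drive_lens(selected)                    # Step S3: focus control
            # Step S4: focus determination is left to the surrounding control.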


[2-3. Defocusing Amount Selection Process]

Next, the defocusing amount selection process included in the overall flowchart described above will be described with reference to the flowchart of FIG. 13. First, in Step S101, it is determined whether or not a focus detection result of the image-plane AF sensor 1031 is valid. This determination is made based on, for example, a setting state of the imaging apparatus 1000 made by the user. The determination based on the setting state is made by confirming which mode the user has selected when the imaging apparatus 1000 is configured to select either an AF mode in which the image-plane AF sensor 1031 and the dedicated AF sensor 1020 are used together or another AF mode in which only the dedicated AF sensor 1020 is used. The focus detection result of the image-plane AF sensor 1031 is determined to be valid when the mode in which both sensors are used is selected, and is determined not to be valid when the AF mode in which only the dedicated AF sensor 1020 is used is selected.


In addition, the determination of Step S101 may be made based on, for example, whether or not the focus detection result of the image-plane AF sensor 1031 can be used at an exposure timing. The exposure timing of the image-plane AF sensor 1031 is not synchronized with that of the dedicated AF sensor 1020, since readout of the image sensor is restricted. Thus, the detection timing (exposure end timing) of the image-plane AF sensor 1031 is acquired, and when it deviates significantly from the exposure end timing of the dedicated AF sensor 1020, the focus detection result of the image-plane AF sensor 1031 is not employed. In this manner, when the determination of Step S101 is performed and the focus detection result of the image-plane AF sensor 1031 is not valid, the process proceeds to Step S102 (No in Step S101).


Then, in Step S102, the most proximate defocusing amount among a plurality of defocusing amounts computed based on the focus detection results of a plurality of dedicated AF areas is selected as a defocusing amount to be used in focus control (hereinafter the selected defocusing amount is referred to as a selected defocusing amount). When there are 11 AF areas of the dedicated AF sensor 1020 as shown in FIG. 5, for example, the most proximate defocusing amount among the 11 defocusing amounts is set to be the selected defocusing amount.


Description will return to Step S101. In Step S101, when the focus detection result of the image-plane AF sensor 1031 is determined to be valid, the process proceeds to Step S103 (Yes in Step S101). Then, an image-plane defocusing amount decision process is performed in Step S103. The image-plane defocusing amount decision process is a process of computing defocusing amounts for each of a plurality of image-plane AF areas (hereinafter referred to as image-plane defocusing amounts), and deciding an image-plane defocusing amount. Details of the image-plane defocusing amount decision process will be described later.


When an image-plane defocusing amount is decided, it is next checked in Step S104 whether or not the imaging apparatus 1000 is in a proximity priority mode. The proximity priority mode is a mode in which focus is set on the most proximate subject within all of the focus areas. When the imaging apparatus 1000 is in the proximity priority mode (Yes in Step S104), the value of the most proximate defocusing amount among the defocusing amounts of the dedicated AF areas (hereinafter referred to as dedicated defocusing amounts) is selected as the selected defocusing amount in Step S105. This is because, in the proximity priority mode, the value of the most proximate defocusing amount among the defocusing amounts is to be selected according to the mode. On the other hand, when the imaging apparatus 1000 is determined not to be in the proximity priority mode in Step S104, the process proceeds to Step S106 (No in Step S104).


Next, in Step S106, it is determined whether or not the dedicated defocusing amounts obtained by the dedicated AF sensor 1020 are equal to or smaller than a first threshold value that is a predetermined threshold value. This determination is made on all of the dedicated defocusing amounts. When the dedicated defocusing amounts are equal to or smaller than the first threshold value, the process proceeds to Step S107 (Yes in Step S106), and a minimum amount among the dedicated defocusing amounts obtained for each of the plurality of dedicated AF areas is selected as a selected defocusing amount.


On the other hand, when the dedicated defocusing amounts obtained by the dedicated AF sensor 1020 are greater than the first threshold value, the process proceeds to Step S108 (No in Step S106). Next, in Step S108, it is determined whether the defocusing amounts obtained by the image-plane AF sensor 1031 are equal to or smaller than a second threshold value that is a predetermined threshold value. When the defocusing amounts are equal to or smaller than the second threshold value, the process proceeds to Step S109 (Yes in Step S108), and a minimum amount among the image-plane defocusing amounts obtained for each of the plurality of image-plane AF areas is selected as the selected defocusing amount.


On the other hand, when the defocusing amounts of the image-plane AF sensor 1031 are determined to be greater than the second threshold value in Step S108, the process proceeds to Step S110 (No in Step S108). Then, in Step S110, a minimum amount among the defocusing amounts obtained for each of the plurality of dedicated AF areas is selected as the selected defocusing amount. Next, a stabilization process is performed in Step S111.
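
The branch structure of Steps S101 to S111 described above can be outlined by the following sketch. It is a simplified reading, not the embodiment itself: the sign convention assumed for the most proximate amount, the treatment of per-area amounts as plain lists, and the threshold comparisons on absolute values are all assumptions.

    def most_proximate(amounts):
        # Assumption: a larger signed defocusing amount corresponds to a more
        # proximate subject; the actual sign convention is not specified here.
        return max(amounts)

    def smallest(amounts):
        # The defocusing amount closest to the in-focus state.
        return min(amounts, key=abs)

    def select_defocus(dedicated, image_plane, image_plane_valid,
                       proximity_priority, first_threshold, second_threshold):
        # Steps S101/S102: without a valid image-plane result, use the dedicated result.
        if not image_plane_valid or not image_plane:
            return most_proximate(dedicated)
        # Step S103 (image-plane defocusing amount decision) is assumed to have
        # already reduced image_plane to the decided per-area amounts.
        # Steps S104/S105: proximity priority mode.
        if proximity_priority:
            return most_proximate(dedicated)
        # Steps S106/S107: prefer the dedicated sensor while its result is small.
        if min(abs(d) for d in dedicated) <= first_threshold:
            return smallest(dedicated)
        # Steps S108/S109: otherwise use the image-plane sensor if its result is small.
        if min(abs(d) for d in image_plane) <= second_threshold:
            return smallest(image_plane)
        # Step S110: otherwise fall back to the dedicated sensor; Step S111
        # (the stabilization process) is applied to the result afterward.
        return smallest(dedicated)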


Herein, the stabilization process will be described with reference to the flowchart of FIG. 14. The stabilization process is a process of employing a selected defocusing amount as is only when the defocusing amount has not changed significantly. Accordingly, focus control can be stabilized without the defocusing amount changing sharply by a large amount.


First, in Step S201, it is determined whether or not the selected defocusing amount is a value within a predetermined reference range. When the defocusing amount is within the reference range, the process proceeds to Step S202, and a count value is set to 0. This count value will be described later. Then, next, in Step S203, the selected defocusing amount is employed as the defocusing amount to be used in focus control. In Step S203, the defocusing amount to be used in focus control is decided. The employed defocusing amount is supplied to the focus control unit 1075.


Description will return to Step S201. In Step S201, when the selected defocusing amount is determined not to be in the reference range, the process proceeds to Step S204 (No in Step S201). Next, in Step S204, it is checked whether or not a defocusing amount of an object (for example, the face of a person, or the like) is obtained. When a defocusing amount of the object is obtained, the process proceeds to Step S203 (Yes in Step S204), and the selected defocusing amount is employed as a defocusing amount to be used in focus control.


On the other hand, when a defocusing amount of the object (for example, the face of a person, or the like) is not obtained, the process proceeds to Step S205 (No in Step S204), and it is checked whether or not the imaging apparatus 1000 is in the proximity priority mode. When the imaging apparatus 1000 is in the proximity priority mode, the process proceeds to Step S203 (Yes in Step S205), and the selected defocusing amount is employed as a defocusing amount to be used in focus control.


When the imaging apparatus 1000 is determined not to be in the proximity priority mode in Step S205, the process proceeds to Step S206 (No in Step S205), and it is determined whether or not the subject is a moving object. Determining whether or not the subject is a moving object can be performed using a moving object detection technique of the related art. When the subject is a moving object, the process proceeds to Step S203 (Yes in Step S206), and the selected defocusing amount is employed as the defocusing amount to be used in focus control.


On the other hand, when the subject is not a moving object, the process proceeds to Step S207 (No in Step S206). Next, it is checked whether or not a count value is equal to or greater than a third threshold value in Step S207. When the count value is equal to or greater than the third threshold value, the process proceeds to Step S203 (Yes in Step S207), and the selected defocusing amount is employed as a defocusing amount to be used in focus control.


On the other hand, when the count value is not equal to or greater than the third threshold value, the process proceeds to Step S208 (No in Step S207), and 1 is added to the count value. Then, in Step S209, the selected defocusing amount is not employed, and as a result, focus control using driving of the focus lens based on the defocusing amount is not performed either.


In the stabilization process, when the answers to all of the determinations from Step S201 to Step S206 are No, the defocusing amount is not in the reference range, a defocusing amount is not detected for the object, the imaging apparatus is not in the proximity priority mode, and the subject is not a moving object. In this case, focus control is not performed until the count value becomes equal to or greater than the third threshold value. Accordingly, a stand-by state in which focus control is paused until the count value becomes equal to or greater than the third threshold value can be realized. In addition, since focus control is performed based on a defocusing amount as long as the defocusing amount is in the reference range, a significant change of the employed defocusing amount can be prevented. When the count value is less than the third threshold value, 1 is added to the count value in Step S208, and when the count value is equal to or greater than the third threshold value, the selected defocusing amount is employed as the defocusing amount to be used in focus control in Step S203. Thus, the length of the stand-by state can be adjusted by setting of the threshold value.
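
The stabilization process of FIG. 14 can be outlined by the following sketch. It is an outline under assumed interfaces; in particular, whether the count value is reset on the paths through Steps S204 to S207 follows the reading of the flowchart given above and is not guaranteed.

    class StabilizerState:
        def __init__(self):
            self.count = 0

    def stabilize(selected, state, in_reference_range, object_detected,
                  proximity_priority, subject_is_moving, third_threshold):
        # Steps S201 to S203: a defocusing amount within the reference range is
        # employed immediately and the count value is reset to 0.
        if in_reference_range(selected):
            state.count = 0
            return selected
        # Steps S204 to S206: employ the amount anyway when a defocusing amount
        # of an object (for example, a face) is obtained, when the proximity
        # priority mode is set, or when the subject is a moving object.
        if object_detected or proximity_priority or subject_is_moving:
            return selected
        # Step S207: once the count value reaches the third threshold value,
        # the amount is employed.
        if state.count >= third_threshold:
            return selected
        # Steps S208 and S209: otherwise increment the count value and stand by
        # without driving the focus lens.
        state.count += 1
        return None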


[2-4. Image-Plane Defocusing Amount Decision Process]

Next, the image-plane defocusing amount decision process performed in Step S103 of the defocusing amount selection process will be described with reference to the flowchart of FIG. 15. The image-plane defocusing amount decision process is performed by the defocusing amount decision unit 1073. The image-plane defocusing amount decision process is a process of deciding defocusing amounts for each image-plane AF area from a focus detection result of the image-plane AF sensor 1031.


First, in Step S301, a maximum value is substituted for an image-plane defocusing amount. Substituting the maximum value for the image-plane defocusing amount corresponds to performing initialization. For example, the image-plane defocusing amount is assumed to be defined as signed 16-bit data. In this case, the range that the image-plane defocusing amount can take is "−32768 to +32767." Since "image-plane defocusing amount=maximum value" corresponds to initialization, the maximum value "+32767" is substituted for the amount. The image-plane defocusing amount substituted with the maximum value is called an image-plane defocusing amount for comparison because it is compared against the image-plane defocusing amounts obtained for each image-plane AF area when their magnitudes are determined.


Next, in Step S302, 1 is added to a variable i for counting the number of image-plane AF areas (i=i+1). This variable i is a value from 1 to the maximum number of image-plane AF areas. Thus, when there are 100 image-plane AF areas, for example, the image-plane AF areas are numbered from 1 to 100, and the variable i has a value from 1 to 100. Accordingly, the image-plane defocusing amount decision process is performed on all of the image-plane AF areas by looping the processes of the following Step S303 to Step S306.


Next, in Step S303, in an image-plane AF area corresponding to the variable i to be processed, it is checked whether or not a luminance value is equal to or greater than a predetermined value, and thereby it is determined whether or not the area has low contrast. When the area is determined not to have low contrast, the process proceeds to Step S304 (No in Step S303).


Next, in Step S304, the absolute value of the image-plane defocusing amount for comparison is compared to the absolute value of the image-plane defocusing amount in the image-plane AF area corresponding to the variable i. As a result of the comparison, when the absolute value of the image-plane defocusing amount in the ith image-plane AF area is smaller than the absolute value of the image-plane defocusing amount for comparison, the process proceeds to Step S305 (Yes in Step S304). Then, in Step S305, it is set that "the absolute value of the image-plane defocusing amount for comparison=the absolute value of the image-plane defocusing amount," and the defocusing amount of the ith image-plane AF area is decided.


On the other hand, in Step S304, when the absolute value of the image-plane defocusing amount in the ith image-plane AF area is equal to or greater than the absolute value of the image-plane defocusing amount for comparison, the process proceeds to Step S306 (No in Step S304) without performing the process of Step S305. In addition, even when the area is determined to have low contrast in Step S303, the process proceeds to Step S306 (Yes in Step S303) without performing the process of Step S305. In this case, since the process of Step S305 is not performed, the image-plane defocusing amount is not decided.


Next, in Step S306, it is determined whether or not the variable i reaches the number of image-plane AF areas. When the variable i does not reach the number of image-plane AF areas, the process proceeds to Step S302 (No in Step S306). Then, the processes from Step S302 to Step S306 are repeated until the variable i reaches the number of image-plane AF areas. Accordingly, the processes from Step S302 to Step S306 are performed on all of the image-plane AF areas.


When the variable i reaches the number of image-plane AF areas, the process proceeds to Step S307 (Yes in Step S306). Then, in Step S307, a previously-decided image-plane defocusing amount determination process is performed.
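
The loop of Steps S301 to S306 can be outlined by the following sketch, using the signed 16-bit initialization described above. The data layout (parallel lists per image-plane AF area) is an assumption made for illustration.

    def decide_image_plane_defocus(area_defocus, area_low_contrast):
        # area_defocus[i]: the image-plane defocusing amount computed for the
        # i-th image-plane AF area.
        # area_low_contrast[i]: True when that area was judged in Step S303
        # to have low contrast.
        comparison = 32767        # Step S301: initialize with the maximum value
        decided = None
        for i in range(len(area_defocus)):     # Steps S302 and S306: loop over all areas
            if area_low_contrast[i]:           # Step S303: skip low-contrast areas
                continue
            # Steps S304 and S305: keep the amount with the smallest absolute value.
            if abs(area_defocus[i]) < abs(comparison):
                comparison = area_defocus[i]
                decided = area_defocus[i]
        # Step S307 (the previously-decided determination of FIG. 16) follows.
        return decided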


Herein, the previously-decided image-plane defocusing amount determination process will be described with reference to the flowchart of FIG. 16. When similar defocusing amounts are obtained from a plurality of separate image-plane AF areas, for example, there is concern that the focus position will change frequently and that focus will not be on the main subject. Thus, the previously-decided image-plane defocusing amount determination process is a process for preventing minute changes in focus by continuing to use the previously decided image-plane defocusing amount as the image-plane defocusing amount when the image-plane defocusing amount decided in the previous process is equal to or smaller than a predetermined amount.


First, in Step S401, it is determined whether or not the previously decided image-plane defocusing amounts are equal to or smaller than a fourth threshold value that is a predetermined threshold value. When the image-plane defocusing amounts are equal to or smaller than the fourth threshold value, the process proceeds to Step S402 (Yes in Step S401). Then, in Step S402, the previously decided image-plane defocusing amounts are decided as image-plane defocusing amounts again.


On the other hand, in Step S401, when the image-plane defocusing amounts are determined to be equal to or greater than the fourth threshold value, the process proceeds to Step S403 (No in Step S401). Then, in Step S403, defocusing amounts of peripheral image-plane AF areas of the image-plane AF area for which the previously decided image-plane defocusing amount is obtained are computed.


The peripheral areas are, for example, the eight image-plane AF areas surrounding the image-plane AF area for which the previously decided defocusing amount was computed, the four areas above, below, to the left, and to the right thereof, or the like.


Next, in Step S404, it is checked whether or not defocusing amounts have been computed for all image-plane AF areas in the periphery of the image-plane AF areas. The processes of Step S403 and Step S404 are repeated until image-plane defocusing amounts of all of the peripheral image-plane AF areas are computed (No in Step S404).


Then, after the computation of the defocusing amounts is performed for all of the peripheral AF areas, the process proceeds to Step S405 (Yes in Step S404). Next, in Step S405, it is determined whether a minimum value of the defocusing amounts of all of the peripheral AF areas is less than or equal to the fourth threshold value, and when the value is determined to be less than or equal to the fourth threshold value, the process proceeds to Step S406 (Yes in Step S405).


Then, in Step S406, the minimum value of the defocusing amounts of all of the peripheral AF areas is decided to be the image-plane defocusing amount. In other words, when the previously decided defocusing amount of the image-plane AF area is greater than the fourth threshold value, the defocusing amount of the peripheral image-plane AF area corresponding to the destination to which the subject is assumed to have moved is employed as the image-plane defocusing amount.


When the minimum value of the defocusing amounts of all of the peripheral AF areas is determined to be greater than the fourth threshold value in Step S405, the image-plane defocusing amount decided in the process of the flowchart of FIG. 15 is decided as the image-plane defocusing amount rather than the previously decided image-plane defocusing amount (No in Step S405).
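
The previously-decided determination of FIG. 16 can be outlined by the following sketch. The helper is only an illustration; how the peripheral image-plane AF areas are looked up and what value the fourth threshold takes are assumptions.

    def determine_with_previous(previous, candidate, peripheral, fourth_threshold):
        # previous: the previously decided image-plane defocusing amount.
        # candidate: the amount decided in the loop of FIG. 15.
        # peripheral: defocusing amounts of the peripheral image-plane AF areas.
        # Steps S401 and S402: keep the previous amount while it stays small.
        if abs(previous) <= fourth_threshold:
            return previous
        # Steps S403 to S405: examine the peripheral image-plane AF areas.
        if peripheral:
            nearest = min(peripheral, key=abs)
            if abs(nearest) <= fourth_threshold:
                # Step S406: the subject is assumed to have moved to that area.
                return nearest
        # No in Step S405: use the amount decided in FIG. 15.
        return candidate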


As described above, either the defocusing amount obtained by the dedicated AF sensor 1020 or the defocusing amount obtained by the image-plane AF sensor 1031 is selected and used in focus control. Accordingly, wide-range autofocus by the image-plane AF sensor 1031 can be made compatible with high-accuracy autofocus by the dedicated AF sensor 1020.


[2-5. Image-Plane Defocusing Amount Correction Process]

Next, a process of increasing accuracy of an image-plane defocusing amount by correcting the image-plane defocusing amount when a subject leaves all of the dedicated AF areas and is positioned on the image-plane AF areas as shown in FIG. 9D will be described. FIGS. 17 and 18 are flowcharts showing a flow of an image-plane defocusing amount correction process. The image-plane defocusing amount correction process is for correcting an image-plane defocusing amount based on the difference between a defocusing amount obtained by the dedicated AF sensor 1020 and a defocusing amount obtained by the image-plane AF sensor 1031. The image-plane defocusing amount correction process is performed by the defocusing amount correction unit 1074.


First, in Step S501, the dedicated AF sensor 1020 and the image-plane AF sensor 1031 respectively perform focus detection. Next, in Step S502, it is determined whether or not focus is on a subject (main subject) targeted by a user among subjects (whether or not a subject to be traced is decided). When focus is not on the main subject, the process proceeds to Step S503 (No in Step S502).


Next, in Step S503, it is checked whether or not the focus detection by the dedicated AF sensor 1020 has been performed. When the focus detection by the dedicated AF sensor 1020 has been performed, the process proceeds to Step S504 (Yes in Step S503), and AF control is performed based on the defocusing amount obtained from the focus detection by the dedicated AF sensor 1020. As long as the focus detection by the dedicated AF sensor 1020 is performed, AF control is performed in Step S504 based on the defocusing amount obtained by the dedicated AF sensor 1020. It should be noted that the AF control in Step S504 corresponds to the AF control process in Step S3 of the flowchart of FIG. 12.


On the other hand, when the focus detection by the dedicated AF sensor 1020 has not been performed in Step S503, the process proceeds to Step S505 (No in Step S503). Then, in Step S505, a process for an AF out-of-control time is performed. When AF control is not available because focus detection by the dedicated AF sensor 1020 has not been performed, the imaging apparatus 1000 is, for example, placed in a photographing-unavailable state in which the release button is nullified. Such nullification of the release button may be cancelled when, for example, focus detection by the dedicated AF sensor 1020 is performed thereafter.


Description will return to Step S502. When focus is determined to be on the subject targeted by the user among subjects in Step S502, the process proceeds to Step S506 (Yes in Step S502). Next, in Step S506, it is checked whether or not focus detection has been performed by the dedicated AF sensor 1020 or the image-plane AF sensor 1031. When the focus detection is performed by neither the dedicated AF sensor 1020 nor the image-plane AF sensor 1031, the process proceeds to Step S505, and the process for AF out-of-control time is performed (No in Step S506). The process for AF out-of-control time is, for example, nullification of the release button as described above. This is because photographing is difficult to perform when neither the dedicated AF sensor 1020 nor the image-plane AF sensor 1031 is available to perform focus detection. Nullification of the release button may be cancelled when, for example, focus detection is performed by the dedicated AF sensor 1020 thereafter.


On the other hand, when the focus detection is determined to have been performed by the dedicated AF sensor 1020 or the image-plane AF sensor 1031 in Step S506, the process proceeds to Step S507 (Yes in Step S506). Next, in Step S507, it is determined whether or not the main subject is focused and traced. This determination is made, for example, by checking whether or not there is, among the plurality of AF areas, an area having a focus deviation amount equal to or smaller than a predetermined value, and whether or not there is an AF area in which focus was substantially on the main subject in the previous AF operation.
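
One possible reading of the determination in Step S507 is sketched below; the function and parameter names are hypothetical, and the condition is simplified so that an area with a sufficiently small focus deviation must coincide with the area that was substantially in focus in the previous AF operation.

def main_subject_is_traced(af_areas, previous_focused_area, deviation_threshold):
    """Step S507 (sketch): decide whether the main subject is still focused and traced.

    af_areas              -- iterable of (area_id, focus_deviation) pairs
    previous_focused_area -- area id that was substantially in focus in the previous AF operation
    deviation_threshold   -- predetermined value for the focus deviation amount
    """
    for area_id, deviation in af_areas:
        # A small focus deviation in the previously focused area indicates
        # that the main subject is still being traced.
        if abs(deviation) <= deviation_threshold and area_id == previous_focused_area:
            return True
    return False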


When the main subject is not focused or traced, the process proceeds to Step S503 (No in Step S507). Then, if focus detection by the dedicated AF sensor 1020 is possible in Step S503, AF control is performed based on a defocusing amount detected by the dedicated AF sensor 1020 in Step S504. In addition, if focus detection by the dedicated AF sensor 1020 is unavailable in Step S503, the process for AF out-of-control time is performed in Step S505.


When the main subject is confirmed as being traced in Step S507, the process proceeds to Step S508 (Yes in Step S507). Next, in Step S508, it is checked whether or not the area in which the main subject is detected as being traced is a dedicated AF area. When the main subject is detected in a dedicated AF area (Yes in Step S508), the display unit displays the areas of the dedicated AF sensor 1020 and the image-plane AF sensor 1031 in Step S509.


In the display of the areas in Step S509, for example, the crosses overlapping the subject, among the crosses indicating the image-plane AF areas, may be indicated by thick lines as shown in FIG. 9D. Thereby, the user can easily recognize the current subject and the areas in which the subject is detected. In addition, the areas may be displayed by coloring the crosses overlapping the subject instead of, or in addition to, the display of the thick lines.


Next, in Step S510, the difference between a defocusing amount in the dedicated AF area overlapping the subject and a defocusing amount in the image-plane AF area is computed, and stored in a storage unit, a cache memory, or the like of the imaging apparatus 1000.


As a method for computing the difference, for example, the difference between the respective defocusing amounts detected in a dedicated AF area and an image-plane AF area that overlap each other may be obtained. Alternatively, the difference may be obtained by associating the defocusing amount of one dedicated AF area with the average of the defocusing amounts of a plurality of image-plane AF areas in the periphery of that dedicated AF area. Furthermore, since the difference of the defocusing amounts is also affected by an aberration property of the photographing lens 1011, when, for example, a subject is positioned away from substantially the center of the frame, an offset amount that takes the aberration amount of the photographing lens 1011 into account may be added to the difference.
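
A minimal sketch of such a difference computation is given below, under the assumption that either a single overlapping image-plane AF area or the average of several peripheral areas is supplied, and that the aberration offset is provided from a lens-dependent value; all names are hypothetical.

def compute_af_difference(dedicated_defocus, overlapping_image_plane_defocus,
                          subject_off_center=False, aberration_offset=0.0):
    """Step S510 (sketch): difference between the dedicated AF result and the image-plane AF result.

    dedicated_defocus               -- defocusing amount of the dedicated AF area overlapping the subject
    overlapping_image_plane_defocus -- defocusing amounts of the image-plane AF area(s) associated
                                       with that dedicated AF area (one or more values)
    subject_off_center              -- True when the subject is away from substantially the frame center
    aberration_offset               -- offset derived from the aberration property of the photographing lens
    """
    # Either one overlapping area or the average of several peripheral areas may be used.
    image_plane_defocus = (sum(overlapping_image_plane_defocus)
                           / len(overlapping_image_plane_defocus))

    difference = dedicated_defocus - image_plane_defocus

    # When the subject is away from the center of the frame, lens aberration also
    # contributes to the difference, so an offset may be added.
    if subject_off_center:
        difference += aberration_offset

    return difference  # stored in a storage unit or cache memory of the apparatus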


As will be described in detail, the difference is used to correct focus adjustment when the main subject leaves all of the dedicated AF areas and is positioned only in the image-plane AF areas.


Next, in Step S504, AF control is performed based on the defocusing amount of the dedicated AF sensor 1020. This is because, when the main subject overlaps a dedicated AF area, AF control is better performed using the defocusing amount of the dedicated AF sensor 1020, which shows higher AF accuracy than the image-plane AF sensor 1031. Then, the process returns to Step S501.


Description will return to Step S508. When the area in which the main subject is detected as being traced is determined not to be a dedicated AF area in Step S508, the process proceeds to Step S511 (No in Step S508).


The area in which the main subject is being traced is not a dedicated AF area when the main subject is detected in the image-plane AF areas only by the image-plane AF sensor 1031. Thus, next in Step S511, the image-plane AF area in which the main subject is detected is specified. As a method for specification, for example, an area for which a defocusing amount equal to or smaller than a predetermined value is detected is specified from a plurality of image-plane AF areas near a dedicated AF area in which a main subject has been detected, and a subject detected in the specified area is assumed to be the same subject as the main subject.


Next, in Step S512, the plurality of image-plane AF areas considered to overlap the main subject are grouped, and a predetermined data process such as an averaging process of defocusing amounts detected in the image-plane AF areas is performed so that tracing of AF is smoothly performed.


Next, in Step S513, it is determined whether or not the plurality of grouped image-plane AF areas are near the position of the main subject in the previous process. This is a process for continuing tracing only when the plurality of grouped image-plane AF areas are near the area in which the subject was detected in the previous focus detection, so that focus is not shifted to a subject other than the main subject. Here, being near means, for example, a state in which the areas are adjacent to each other.
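
The specification, grouping, and proximity check of Steps S511 to S513 might look like the following sketch; the area identifiers are assumed to be grid coordinates, and the adjacency test, threshold, and averaging are hypothetical simplifications of the predetermined data process described above.

def group_and_check_proximity(candidate_areas, defocus_by_area,
                              defocus_threshold, previous_subject_areas):
    """Steps S511-S513 (sketch): specify, group, and check the image-plane AF areas covering the main subject.

    candidate_areas        -- image-plane AF area ids near the dedicated AF area in which
                              the main subject was previously detected
    defocus_by_area        -- mapping from area id to its detected defocusing amount
    defocus_threshold      -- predetermined value used to decide that an area overlaps the subject
    previous_subject_areas -- area ids in which the subject was detected in the previous focus detection
    """
    # Step S511: areas whose defocusing amount is small enough are assumed to contain the subject.
    grouped = [a for a in candidate_areas
               if abs(defocus_by_area[a]) <= defocus_threshold]
    if not grouped:
        return None, False

    # Step S512: a simple data process (here, averaging) smooths the group's defocusing amount.
    group_defocus = sum(defocus_by_area[a] for a in grouped) / len(grouped)

    # Step S513: tracing continues only when the group neighbors the previous subject position.
    near_previous = any(are_neighbors(a, b)
                        for a in grouped for b in previous_subject_areas)
    return group_defocus, near_previous


def are_neighbors(area_a, area_b):
    """Hypothetical adjacency test; assumes area ids are (row, column) grid coordinates."""
    return abs(area_a[0] - area_b[0]) <= 1 and abs(area_a[1] - area_b[1]) <= 1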


When the plurality of grouped image-plane AF areas are not near the position of the main subject in the previous process, the process proceeds to Step S505 (No in Step S513). Then, in Step S505, the process for AF out-of-control time is performed. The process for AF out-of-control time is the same as described above.


On the other hand, when the plurality of grouped image-plane AF areas are near the position of the main subject in the previous process, the process proceeds to Step S514 (Yes in Step S513). Then, in Step S514, using the difference of the defocusing amounts computed and stored in Step S510, the defocusing amount detected by the image-plane AF sensor 1031 is corrected.


In general, the accuracy of focus detection by the image-plane AF sensor is lower than that by the dedicated AF sensor in many cases. Thus, while the dedicated AF sensor 1020 can perform focus detection, the difference between the two focus detection results is computed in areas where dedicated AF areas and image-plane AF areas overlap each other. Then, when a subject overlaps only image-plane AF areas, the focus detection result of the image-plane AF sensor 1031 is corrected using the difference. Accordingly, the image-plane AF sensor 1031 alone can perform focus detection with accuracy comparable to that of the dedicated AF sensor 1020.
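
Taken together, Steps S510 and S514 amount to storing an offset while both sensors see the subject and adding it back when only the image-plane AF sensor does. A minimal sketch, with hypothetical names, follows.

class ImagePlaneDefocusCorrector:
    """Sketch of the image-plane defocusing amount correction (Steps S510 and S514).

    While both sensors can detect focus on the same subject, the difference between
    their results is stored; when only the image-plane AF sensor covers the subject,
    that stored difference is added back to its result.
    """

    def __init__(self):
        self.stored_difference = 0.0

    def update_difference(self, dedicated_defocus, image_plane_defocus):
        # Step S510: computed whenever the main subject overlaps a dedicated AF area.
        self.stored_difference = dedicated_defocus - image_plane_defocus

    def correct(self, image_plane_defocus):
        # Step S514: applied when the subject overlaps only image-plane AF areas.
        return image_plane_defocus + self.stored_difference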


Next, in Step S515, the areas traced by the image-plane AF sensor 1031 are displayed. In the display of the areas in Step S515, for example, the crosses and the frame overlapping the subject, among the crosses indicating the image-plane AF areas and the frames indicating the dedicated AF areas, may be indicated by thick lines as shown in FIG. 9C. Accordingly, the user can easily recognize the areas in which the subject is currently detected. In addition, the areas may be displayed by coloring the crosses and the frame overlapping the subject instead of, or in addition to, the display of the thick lines.


Then, in Step S516, AF control is performed based on the corrected defocusing amount of the image-plane AF sensor 1031. The AF control corresponds to the AF control process in Step S3 of the flowchart of FIG. 12.


As described above, in the image-plane defocusing amount correction process, when both the dedicated AF sensor 1020 and the image-plane AF sensor 1031 can perform focus detection, the difference between a defocusing amount of the dedicated AF sensor 1020 and a defocusing amount of the image-plane AF sensor 1031 is constantly computed. Then, when a subject leaves all dedicated AF areas and only the image-plane AF sensor 1031 can perform focus detection, the defocusing amount of the image-plane AF sensor 1031 is corrected using the computed difference. Accordingly, the accuracy of focus detection by the image-plane AF sensor 1031 can be improved, and high-accuracy autofocus can be made compatible with a wide range of AF areas.


3. Second Embodiment
[3-1. Configuration of the Imaging Apparatus]

Next, a second embodiment of the present technology will be described. FIG. 19 is a block diagram illustrating another configuration of the imaging apparatus 1000 according to the second embodiment. The imaging apparatus 1000 in the second embodiment has a subject detection unit 1076.


The subject detection unit 1076 detects a subject from an image of supplied image data. The subject is, for example, the face of a person or the like. In the second embodiment, a case in which the subject is a person and the face of the person is detected will be exemplified. However, a target to be detected by the subject detection unit 1076 does not have to be the face of a person; animals, buildings, and the like are also possible as long as they are detectable objects.


As a detection method, template matching based on the shape of a face, template matching based on luminance distribution of a face, a method based on feature amounts of skin or the face of a person included in an image, and the like can be used. In addition, the methods can be combined in order to increase accuracy in face detection. It should be noted that, since the constituent elements other than the subject detection unit 1076 are the same as those of the first embodiment, description thereof will not be repeated.


[3-2. Overview of a Process]

Next, a process performed in the second embodiment will be described. First, an overview of a focusing process performed in the present embodiment will be described with reference to FIGS. 20A to 21D. FIGS. 20A to 20D show a first example of the second embodiment, and FIGS. 21A to 21D show a second example of the second embodiment. FIGS. 20A to 21D show dedicated AF areas in a photographed screen, image-plane AF areas in the photographed screen, and subjects traced using autofocus. In FIGS. 20A to 21D, dashed-lined squares indicate AF areas of the dedicated AF sensor 1020, and dashed-lined crosses indicate AF areas of the image-plane AF sensor 1031.


In the first example of FIGS. 20A to 20D, the face of a subject to be photographed is first detected in the photographed screen as shown in FIG. 20A. The face of the subject is positioned on a dedicated AF area and image-plane AF areas. In this case, focus control is performed using defocusing amounts in the areas overlapping the subject as shown in FIG. 20B. It should be noted that, when the face of the subject overlaps both dedicated AF areas and image-plane AF areas, focus control may be performed based on a defocusing amount detected by the dedicated AF sensor 1020. This is because the dedicated AF sensor 1020 exhibits higher accuracy in focus detection than the image-plane AF sensor 1031.


Then, when focus is on the subject and the subject then moves as shown in FIG. 20C, focus control is performed based on the defocusing amount of the AF areas in which the subject that has moved is positioned. In addition, when the position of the face of the subject leaves all of the AF areas as shown in FIG. 20D, the imaging apparatus 1000 holds the process in a standby state for a predetermined period of time. When the subject enters the AF areas again within the predetermined period of time, focus control is performed based on the defocusing amount of the AF areas in which the face of the subject is positioned. On the other hand, when the subject does not enter the AF areas within the predetermined period of time, another subject positioned in the AF areas is focused on as shown in FIG. 20D.


In the second example of FIGS. 21A to 21D, the face of a subject to be photographed in the photographed screen is first detected as shown in FIG. 21A. The face of the subject is positioned in image-plane AF areas. In this case, focus control is performed using a defocusing amount of the image-plane AF areas overlapping the face as shown in FIG. 21B.


In addition, when focus is on the subject and the subject then moves as shown in FIG. 21C, focus control is performed based on the defocusing amount of the AF areas in which the subject that has moved is positioned. When the position of the face of the subject leaves all of the AF areas as shown in FIG. 21D, the imaging apparatus 1000 holds the process in a standby state for a predetermined period of time. When the subject enters the AF areas again within the predetermined period of time, focus control is performed based on the defocusing amount of the AF areas in which the face of the subject is positioned.


On the other hand, when the subject does not enter the AF areas within the predetermined period of time, another subject positioned in the AF areas is focused on as shown in FIG. 21D. It should be noted that the flowchart of the entire process is the same as that of the first embodiment shown in FIG. 12.


[3-3. Defocusing Amount Selection Process]

Next, the defocusing amount selection process included in the overall flowchart described above will be described with reference to the flowcharts of FIGS. 22 and 23. Since the processes other than those in Steps S1001 to S1006 in the flowcharts of FIGS. 22 and 23 are the same as those in the first embodiment, description thereof will not be repeated.


After an image-plane defocusing amount decision process is performed in Step S1001, the process proceeds to Step S1002. It should be noted that the image-plane defocusing amount decision process of the second embodiment will be described later in detail. However, the image-plane defocusing amount decision process of the second embodiment is also a process in which defocusing amounts are computed for each of a plurality of image-plane AF areas and an image-plane defocusing amount is decided in the same manner as in the first embodiment.


Next, in Step S1002, it is determined whether or not the face of a subject has been detected in a photographed screen. When the face has not been detected, the process proceeds to Step S104 (No in Step S1002).


On the other hand, when the face has been detected, the process proceeds to Step S1003 (Yes in Step S1002). Next, in Step S1003, it is determined whether or not the detected face overlaps dedicated AF areas. When the face overlaps the dedicated AF areas, a minimum defocusing amount among the defocusing amounts of the dedicated AF areas located in the region detected as the face is set to be a selected defocusing amount in Step S1004 (Yes in Step S1003).


When the detected face does not overlap the dedicated AF areas in Step S1003, the process proceeds to Step S1005 (No in Step S1003). Next, in Step S1005, it is determined whether or not the detected face overlaps image-plane AF areas. When the face overlaps the image-plane AF areas, a minimum defocusing amount among the defocusing amounts of the plurality of image-plane AF areas located in the region detected as the face is set to be a selected defocusing amount in Step S1006 (Yes in Step S1005).
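
The selection of Steps S1002 to S1006 can be illustrated by the sketch below; reading "minimum defocusing amount" as the value with the smallest absolute value is an assumption, and the function, parameter, and mapping names are hypothetical.

def select_defocus_with_face(face_detected, dedicated_face_areas, image_plane_face_areas,
                             dedicated_defocus, image_plane_defocus, fallback):
    """Steps S1002-S1006 (sketch): select a defocusing amount based on the detected face.

    dedicated_face_areas / image_plane_face_areas -- AF area ids located in the region detected as the face
    dedicated_defocus / image_plane_defocus       -- mappings from area id to defocusing amount
    fallback                                      -- selection process of the first embodiment (Step S104 onward)
    """
    if not face_detected:                      # No in Step S1002
        return fallback()

    if dedicated_face_areas:                   # Yes in Step S1003 -> Step S1004
        return min((dedicated_defocus[a] for a in dedicated_face_areas), key=abs)

    if image_plane_face_areas:                 # Yes in Step S1005 -> Step S1006
        return min((image_plane_defocus[a] for a in image_plane_face_areas), key=abs)

    return fallback()                          # the face overlaps no AF area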


Since other processes are the same as those of the first embodiment, description thereof will not be repeated. It should be noted that a stabilization process is also the same as that of the first embodiment.


Next, an image-plane defocusing amount decision process in the second embodiment will be described with reference to the flowchart of FIG. 24. It should be noted that, since processes other than those in Steps S3001 to S3004 in the flowchart of FIG. 24 are the same as those in the first embodiment, description thereof will not be repeated.


First, in Step S3001, a maximum value is substituted for an image-plane face defocusing amount. The image-plane face defocusing amount refers to a defocusing amount of image-plane AF areas overlapping the region detected as the face of a subject in a photographed screen. Substituting the maximum value for the image-plane face defocusing amount corresponds to performing initialization. For example, the image-plane face defocusing amount is assumed to be defined as signed 16-bit data. In this case, the range the image-plane face defocusing amount can take is −32768 to +32767. Since "image-plane face defocusing amount = maximum value" corresponds to initialization, the maximum value +32767 is substituted for the amount. The image-plane face defocusing amount substituted with the maximum value is called the image-plane face defocusing amount for comparison because it is compared when the magnitudes of the image-plane defocusing amounts obtained for each image-plane AF area overlapping the face region are determined.


In addition, in Step S3001, the maximum value is substituted for an image-plane defocusing amount for comparison in the same manner as in the first embodiment. In Step S302, substituting 1 for a variable i is also the same as in the first embodiment.


In Step S303, when the area is determined not to have low contrast, the process proceeds to Step S3002 (No in Step S303). Next, in Step S3002, it is checked whether or not the image-plane AF area corresponding to the variable i among the plurality of image-plane AF areas overlaps the region detected as the face.


When the image-plane AF area corresponding to the variable i overlaps the face region, the process proceeds to Step S3003 (Yes in Step S3002). Next, in Step S3003, the absolute value of the image-plane face defocusing amount for comparison is compared to the absolute value of the image-plane defocusing amount in the ith image-plane AF area. As a result of the comparison, when the absolute value of the image-plane defocusing amount in the ith image-plane AF area is smaller than the absolute value of the image-plane face defocusing amount for comparison, the process proceeds to Step S3004 (No in Step S3003). Then, in Step S3004, the defocusing amount of the ith image-plane AF area overlapping the face region is decided as the image-plane face defocusing amount.


On the other hand, when the absolute value of the image-plane defocusing amount in the ith image-plane AF area is greater than the absolute value of the image-plane face defocusing amount for comparison in Step S3003, the process proceeds to Step S304 (Yes in Step S3003) without performing the process of Step S3004. In addition, when the image-plane AF area corresponding to the variable i does not overlap the face region in Step S3002, the process also proceeds to Step S304 (No in Step S3002) without performing the process of Step S3004. Since the process of Step S3004 is not performed in these cases, the defocusing amount of the ith image-plane AF area is not adopted as the image-plane face defocusing amount. As described above, in the second embodiment, the defocusing amount of the image-plane AF area overlapping the region detected as the face is decided in this manner.
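
The loop of FIG. 24 may be summarized as follows; the predicates low_contrast and overlaps, and the other names, are hypothetical stand-ins for the checks of Steps S303 and S3002, and the initialization value mirrors the signed 16-bit example above.

def decide_face_defocus(image_plane_areas, defocus_by_area, face_region,
                        low_contrast, overlaps):
    """FIG. 24 (sketch, Steps S3001-S3004): decide the image-plane face defocusing amount.

    image_plane_areas -- area ids of all image-plane AF areas
    defocus_by_area   -- mapping from area id to its image-plane defocusing amount
    face_region       -- region detected as the face by the subject detection unit
    low_contrast      -- predicate: True when an area has low contrast (Step S303)
    overlaps          -- predicate: True when an area overlaps the face region (Step S3002)
    """
    MAX_SIGNED_16BIT = 32767          # Step S3001: initialization with the maximum value
    face_defocus = MAX_SIGNED_16BIT

    for area in image_plane_areas:            # Steps S302/S304/S305: loop over the areas
        if low_contrast(area):                # Yes in Step S303: skip low-contrast areas
            continue
        if not overlaps(area, face_region):   # No in Step S3002: skip areas outside the face region
            continue
        candidate = defocus_by_area[area]
        # Steps S3003/S3004: keep the amount with the smaller absolute value.
        if abs(candidate) < abs(face_defocus):
            face_defocus = candidate

    return face_defocus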


The processes in the second embodiment are performed as described above. In the second embodiment, since focus control is performed based on the defocusing amount of the AF area overlapping the region detected as the face of the subject, focus control based on the position of the face is possible as shown in FIGS. 20A to 21D.


The processes in the present technology are performed as described above. In general, when a subject leaves all AF areas of the dedicated AF sensor 1020 in a state in which the subject has been focused and traced, there are cases in which another subject present in the background of the subject targeted by a user is focused on. However, according to the present technology, since a subject can be detected by the image-plane AF sensor 1031 in a wide range, focus can be kept on the subject, once the subject is focused, even when the subject leaves all AF areas of the dedicated AF sensor 1020, and erroneous focusing on another subject can be prevented.


In addition, when a tracing operation is performed with focus on a subject whom a user desires and another subject approaches and enters the frame, there are cases in which the latter subject is focused on. However, according to the present technology, once focus is on a subject, the focus is not shifted to another subject even when that other subject approaches, and focus can be kept continuously on the subject whom the user desires.


In addition, since the image-plane AF sensor 1031 having a wide focus range is used in addition to the dedicated AF sensor 1020, even when a position of a subject is significantly changed, the subject can be reliably detected and traced. Furthermore, when the face or the like of a subject is detected, and the face or the like overlaps image-plane AF areas, focus control is performed using image-plane defocusing amounts thereof, and thus a subject can be traced in a more extensive range than before.


4. Modified Example

Hereinabove, although the embodiments of the present technology have been described in detail, the present technology is not limited to the embodiments described above, and can be variously modified based on the technical gist thereof.


It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.


Additionally, the present technology may also be configured as below.


(1) An imaging apparatus including:


a first focus detection unit that is provided in an image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through a photographing lens; and


a second focus detection unit that is provided so as to be positioned above the image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through the photographing lens.


(2) The imaging apparatus according to (1), wherein the second focus detection unit is a dedicated phase difference focus detection module.


(3) The imaging apparatus according to (1) or (2), wherein the first focus detection unit includes a phase difference focus detection element provided in the image sensor.


(4) The imaging apparatus according to any one of (1) to (3), further including:


an optical member that splits subject image light that has passed through the photographing lens into incident light of the image sensor and incident light of the dedicated phase difference focus detection module.


(5) The imaging apparatus according to any one of (1) to (4), further including:


an electronic view finder that displays an image obtained using the image sensor.


The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2012-199534 filed in the Japan Patent Office on Sep. 11, 2012, the entire content of which is hereby incorporated by reference.

Claims
  • 1. An imaging apparatus comprising: a first focus detection unit that is provided in an image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through a photographing lens; and a second focus detection unit that is provided so as to be positioned above the image sensor, and outputs a signal for phase difference focus detection by sensing subject image light that has passed through the photographing lens.
  • 2. The imaging apparatus according to claim 1, wherein the second focus detection unit is a dedicated phase difference focus detection module.
  • 3. The imaging apparatus according to claim 1, wherein the first focus detection unit includes a phase difference focus detection element provided in the image sensor.
  • 4. The imaging apparatus according to claim 1, further comprising: an optical member that splits subject image light that has passed through the photographing lens into incident light of the image sensor and incident light of the dedicated phase difference focus detection module.
  • 5. The imaging apparatus according to claim 1, further comprising: an electronic view finder that displays an image obtained using the image sensor.
Priority Claims (1)
Number: 2012-199534; Date: Sep 2012; Country: JP; Kind: national