The disclosure of Japanese Patent Application No. 2008-212407, which was filed on Aug. 21, 2008, is incorporated herein by reference.
1. Field of the Invention
The present invention relates to an electronic camera. More particularly, the present invention relates to an electronic camera for adjusting an imaging parameter in a manner to match an object scene.
2. Description of the Related Art
According to one example of this type of electronic camera, a CPU individually detects the proportion, within a central region of an object scene, of a subject whose motion amount exceeds a threshold value, and the corresponding proportion within a surrounding region of the object scene. When the difference between the two detected proportions is large, the CPU adjusts a photographing parameter in a manner to match a sport scene.
However, in the above-described electronic camera, the regions to be noticed for detecting the proportions are fixedly allocated, i.e., fixed to the central region and the surrounding region. Therefore, there is a limit to the capability of determining the sport scene and, by extension, to the capability of adjusting the imaging parameter.
An electronic camera according to the present invention comprises: an imager, having an imaging surface on which an object scene is captured, for repeatedly outputting an object scene image; a detector for detecting one or more motion areas indicating motion exceeding a reference from among a plurality of areas on the object scene, based on the object scene image outputted from the imager; and an adjuster for adjusting an imaging parameter by comparing a pattern of the one or more motion areas detected by the detector with a predetermined area pattern.
Preferably, each of the plurality of areas has a plurality of small areas, and the detector includes a motion coefficient calculator for calculating a plurality of motion coefficients respectively corresponding to the plurality of small areas, an extractor for extracting a motion coefficient exceeding a reference value from among the plurality of motion coefficients calculated by the motion coefficient calculator, and a specifier for specifying, as the motion area, an area to which the small area corresponding to the motion coefficient extracted by the extractor belongs.
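As an illustration of this calculator/extractor/specifier decomposition, the following is a minimal sketch in Python; all function and parameter names are invented for the illustration and do not come from the source.

```python
def detect_motion_areas(coeffs, area_of_small, ref_value):
    """coeffs        -- dict: small-area id -> motion coefficient
    area_of_small -- dict: small-area id -> enclosing area id
    ref_value     -- the reference value the coefficients are judged against
    Returns the set of motion areas."""
    # Extractor: keep only the motion coefficients exceeding the reference value.
    exceeding = [sid for sid, ec in coeffs.items() if ec > ref_value]
    # Specifier: a motion area is any area that owns such a small area.
    return {area_of_small[sid] for sid in exceeding}
```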
Preferably, the detector repeatedly executes a detecting process, and the adjuster includes a creator for repeatedly creating a pattern of the one or more motion areas and a setter for setting the imaging parameter to a predetermined parameter with reference to the number of times that a predetermined condition is satisfied between the pattern created by the creator and the predetermined area pattern.
Preferably, the predetermined condition includes a condition under which the predetermined area pattern contains the pattern created by the creator.
Preferably, the imaging parameter adjusted by the adjuster is equivalent to an imaging parameter that matches a sport scene.
According to the present invention, an imaging controlling program product is executed by a processor of an electronic camera provided with an imager, having an imaging surface on which an object scene is captured, for repeatedly outputting an object scene image. The imaging controlling program product comprises: a detecting step of detecting a motion area indicating motion exceeding a reference from among a plurality of areas on the object scene, based on the object scene image outputted from the imager; and an adjusting step of adjusting an imaging parameter by comparing a pattern of the motion area detected by the detecting step with a predetermined area pattern.
According to the present invention, an imaging controlling method is executed by an electronic camera provided with an imager, having an imaging surface on which an object scene is captured, for repeatedly outputting an object scene image. The imaging controlling method comprises: a detecting step of detecting one or more motion areas indicating motion exceeding a reference from among a plurality of areas on the object scene, based on the object scene image outputted from the imager; and an adjusting step of adjusting an imaging parameter by comparing a pattern of the one or more motion areas detected by the detecting step with a predetermined area pattern.
The above-described features and advantages of the present invention will become more apparent from the following detailed description of the embodiment when taken in conjunction with the accompanying drawings.
With reference to
It is noted that the imaging surface is covered with a primary color filter not shown, and the electric charges produced by each of a plurality of pixels placed on the imaging surface have color information of any one of R (Red), G (Green), and B (Blue).
When the power is turned on in order to execute a through image process under an imaging task, a CPU 34 commands a driver 18c to repeat an exposure operation and a thinning-out reading-out operation. The driver 18c, in response to a vertical synchronization signal Vsync periodically generated from an SG (Signal Generator) not shown, exposes the imaging surface and reads out one portion of the electric charges produced on the imaging surface in a raster scanning manner. From the imaging device 16, raw image data based on the read-out electric charges is periodically outputted.
A camera processing circuit 20 performs processes, such as a white balance adjustment, a color separation, and a YUV conversion, on the raw image data outputted from the imaging device 16, so as to produce image data of a YUV format. The produced image data is written into an SDRAM 24 through a memory control circuit 22. An LCD driver 26 repeatedly reads out the image data accommodated in the SDRAM 24 through the memory control circuit 22, and drives an LCD monitor 28 based on the read-out image data. As a result, a real-time moving image (through image) of the object scene is displayed on a monitor screen.
With reference to
An AE/AWB evaluating circuit 30 integrates one portion of Y data belonging to each evaluation area, out of Y data outputted from the camera processing circuit 20, at each generation of the vertical synchronization signal Vsync. Thereby, 256 integral values, i.e., 256 AE/AWB evaluation values, are outputted from the AE/AWB evaluating circuit 30 in response to the vertical synchronization signal Vsync.
An AF evaluating circuit 32 integrates a high frequency component of one portion of Y data belonging to each evaluation area at each generation of the vertical synchronization signal Vsync. Thereby, 256 integral values, i.e., 256 AF evaluation values are outputted from the AF evaluating circuit 32 in response to the vertical synchronization signal Vsync.
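As a rough sketch of what the two evaluating circuits compute per frame, the following Python fragment integrates Y data per evaluation area. The 16-by-16 layout of the 256 areas is an assumption, and since the text does not specify the AF circuit's high-frequency filter, a horizontal first difference stands in for it.

```python
import numpy as np

def evaluate_frame(y, grid=16):
    """Integrate Y data per evaluation area: one integral for AE/AWB and
    one for AF. y is a 2-D luminance array whose height and width are
    assumed divisible by grid; grid=16 assumes a 16-by-16 layout."""
    h, w = y.shape
    bh, bw = h // grid, w // grid
    # High-frequency component of Y, approximated by a first difference
    # along each row (the actual filter is not given in the text).
    hf = np.abs(np.diff(y.astype(np.int64), axis=1, append=y[:, -1:]))
    ae_awb = np.zeros((grid, grid), dtype=np.int64)
    af = np.zeros((grid, grid), dtype=np.int64)
    for m in range(grid):
        for n in range(grid):
            rows = slice(m * bh, (m + 1) * bh)
            cols = slice(n * bw, (n + 1) * bw)
            ae_awb[m, n] = y[rows, cols].sum()  # AE/AWB evaluation value
            af[m, n] = hf[rows, cols].sum()     # AF evaluation value
    return ae_awb, af
```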
Before a shutter button 36s arranged in a key input device 36 is manipulated, an imaging mode is set to a normal mode. The CPU 34 executes a through image-use AE/AWB process that matches the normal mode based on the AE/AWB evaluation values outputted from the AE/AWB evaluating circuit 30, and calculates an appropriate aperture amount, an appropriate exposure time period, and an appropriate white balance adjusting gain that match the normal mode, as an appropriate imaging parameter. The calculated appropriate aperture amount, appropriate exposure time period, and appropriate white balance adjusting gain are set to the drivers 18b, 18c, and the camera processing circuit 20, respectively. As a result, a brightness and a white balance of the through image displayed on the LCD monitor 28 are moderately adjusted.
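The text does not state how the appropriate exposure time period is derived from the AE/AWB evaluation values; a simple proportional correction toward an assumed target luminance is one plausible sketch, with TARGET_LUMA and the normalization both being assumptions.

```python
TARGET_LUMA = 118  # assumed mid-scale target; not given in the text

def adjust_exposure_time(ae_awb_values, exposure_time):
    """Scale the exposure time so the average evaluation value approaches
    the target. ae_awb_values holds the 256 per-area integrals, assumed
    normalized to a 0-255 luminance scale."""
    average = sum(ae_awb_values) / len(ae_awb_values)
    if average <= 0:
        return exposure_time  # avoid division by zero in a black scene
    return exposure_time * TARGET_LUMA / average
```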
When the shutter button 36s is half-depressed, the CPU 34 executes a mode setting process so as to select an imaging mode according to the object scene, and thereafter, executes an AF process based on the AF evaluation values outputted from the AF evaluating circuit 32 and a recording-use AE process based on the AE/AWB evaluation values outputted from the AE/AWB evaluating circuit 30.
The AF process is executed based on the AF evaluation values outputted from the AF evaluating circuit 32. The focus lens 12 is moved in an optical-axis direction by the driver 18a, and is placed at a focal point by a so-called hill-climbing process.
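A hill-climbing search of this kind can be sketched as follows, assuming a callable af_value(pos) that returns the AF evaluation value with the focus lens at position pos; both names, and the reverse-and-halve step policy, are invented for the illustration.

```python
def hill_climb_focus(af_value, pos, step, min_step=1):
    """Climb toward the lens position maximizing the AF evaluation value;
    reverse and halve the step each time the value stops improving."""
    best = af_value(pos)
    while abs(step) >= min_step:
        candidate = af_value(pos + step)
        if candidate > best:
            pos, best = pos + step, candidate  # still climbing
        else:
            step = -step // 2                  # overshot the peak
    return pos
```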
The recording-use AE process is executed in a manner that matches the selected imaging mode. Thereby, an optimal aperture amount and an optimal exposure time period are calculated as an optimal imaging parameter of the current imaging mode. The calculated optimal aperture amount and optimal exposure time period are respectively set to the drivers 18b and 18c, similarly to the above-described case. As a result, the brightness of the through image displayed on the LCD monitor 28 is adjusted to an optimal value.
When the shutter button 36s is fully depressed, the recording-use AWB process is executed in a manner that matches the selected imaging mode. Thereby, an optimal white balance adjusting gain is calculated as an optimal imaging parameter of the current imaging mode. The calculated optimal white balance adjusting gain is set to the camera processing circuit 20, similarly to the above-described case. As a result, the white balance of the through image displayed on the LCD monitor 28 is adjusted to an optimal value.
Upon completion of the recording-use AWB process, a recording process is executed. The CPU 34 commands the driver 18c to execute a main exposure operation and an all-pixel reading-out operation, one time each. The driver 18c performs the main exposure on the imaging surface in response to the generation of the vertical synchronization signal Vsync, and reads out all the electric charges produced on the imaging surface in a raster scanning manner. As a result, high-resolution raw image data representing the object scene is outputted from the imaging device 16.
The outputted raw image data is subjected to processes similar to those described above, and as a result, high-resolution image data of a YUV format is secured in the SDRAM 24. An I/F 38 reads out the high-resolution image data thus accommodated in the SDRAM 24 through the memory control circuit 22, and then records the read-out image data on a recording medium 40 in a file format. It is noted that the through image process is resumed at a time point when the high-resolution image data is accommodated in the SDRAM 24. Also, the imaging mode is returned to the normal mode.
The mode setting process is executed as follows: Firstly, the 256 evaluation areas allocated on the imaging surface are divided into 16 evaluation groups in a manner that follows bold lines shown in
To the 16 divided evaluation groups, identification numbers G_0 to G_15 are allocated in a manner shown in
When the vertical synchronization signal Vsync is generated, based on the AE/AWB evaluation values acquired in the hatching area shown in
EC(M, N)=ΔVL(M, N)*256/VLmax(M, N) [Equation 1]
The calculated motion coefficient EC(M, N) is compared with a reference value REF, and when the motion coefficient EC(M, N) exceeds the reference value REF, a variable CNT_M is incremented. That is, the incremented variable is equivalent to a variable allocated to the evaluation group including the hatching area in which a motion coefficient exceeding the reference value REF is obtained.
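The per-area test and the group counters can be sketched directly from Equation 1. In the sketch below, how ΔVL(M, N) and VLmax(M, N) are derived from the AE/AWB evaluation values is an assumption (a frame-to-frame luminance difference and its possible maximum), and the value of REF is illustrative; the loop bounds follow the steps S37 to S53 described later (N = 0 to 4, M = 0 to 15).

```python
REF = 32  # illustrative reference value; the text does not state it

def update_group_counters(delta_vl, vl_max, cnt):
    """delta_vl[m][n], vl_max[m][n] -- the ΔVL(M, N) and VLmax(M, N) of the
    text for group M (0..15) and hatching area N (0..4); vl_max is assumed
    positive. cnt -- list of 16 counters CNT_0..CNT_15."""
    for m in range(16):
        for n in range(5):
            ec = delta_vl[m][n] * 256 / vl_max[m][n]  # Equation 1
            if ec > REF:
                cnt[m] += 1  # a motion coefficient in group M exceeded REF
    return cnt
```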
Upon completion of the above-described operation for the 80 hatching areas shown in
Therefore, as shown in
When the number of evaluation groups forming the motion pattern MP is equal to or more than “3” and equal to or less than “9”, a matching process for determining whether or not the motion pattern MP satisfies a sport scene condition is executed, on the assumption that a dynamic object is present in the object scene. Specifically, it is determined whether or not the evaluation groups forming the motion pattern MP are included in the evaluation groups forming any one of predetermined motion patterns DP_0 to DP_6 shown in
On the other hand, when the number of evaluation groups forming the motion pattern MP falls below “3” or exceeds “9”, the variable K is decremented in a range where “0” is a lowest value, on the assumption that the dynamic object is not present in the object scene.
Between a motion pattern MP at a topmost level or a second level in the right column in
When the updated variable K reaches “4”, it is determined that the object scene is equivalent to the sport scene. As a result, the imaging mode is finalized to the sport mode. Also, unless the variable K reaches “4”, another scene determining & mode finalizing process is executed in parallel. It is noted that in this case, the imaging mode may be finalized to a mode different from the sport mode. However, when the scene is not explicitly determined even after an elapse of a 30-frame period from a start of the mode setting process, the imaging mode is finalized to the normal mode. The above-described recording-use AE/AWB process is executed in a manner that matches the imaging mode thus finalized.
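Taken together, the group count, the counter K, and the 30-frame limit form a small per-frame state machine. A minimal sketch of it follows, assuming helpers motion_pattern(frame) and satisfies_sport_condition(mp) for the pattern creation and matching steps, and omitting the parallel determination of other scenes.

```python
def select_imaging_mode(frames, motion_pattern, satisfies_sport_condition):
    """Return "sport" once K reaches 4, or "normal" when no scene is
    determined within a 30-frame period."""
    k = 0
    for frame_no, frame in enumerate(frames, start=1):
        mp = motion_pattern(frame)             # set of motion groups
        if 3 <= len(mp) <= 9:                  # a dynamic object is assumed present
            if satisfies_sport_condition(mp):
                k += 1                         # pattern met the sport scene condition
        else:
            k = max(0, k - 1)                  # decrement, floored at "0"
        if k >= 4:
            return "sport"
        if frame_no >= 30:                     # 30-frame period elapsed
            return "normal"
    return "normal"
```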
The CPU 34 executes in parallel a plurality of tasks, including an imaging task shown in
With reference to
In a step S5, it is determined whether or not the shutter button 36s is half-depressed, and as long as the determination result indicates NO, the through image-use AE/AWB process in a step S7 is repeated. As a result, the brightness and the white balance of the through image are moderately adjusted in a manner according to the normal mode.
When the shutter button 36s is half-depressed, the mode setting process is executed in a step S9 in order to select an imaging mode that matches the object scene. In a step S11, the AF process is executed, and in a step S13, the recording-use AE process is executed. As a result of the process in the step S11, the focus lens 12 is placed at the focal point. The recording-use AE process in the step S13 is executed in a manner that matches the selected imaging mode.
In a step S15, it is determined whether or not the shutter button 36s is fully depressed, and in a step S23, it is determined whether or not the manipulation of the shutter button 36s is cancelled. When YES is determined in the step S15, the process proceeds to a step S17 in which the recording-use AWB process is executed in a manner that matches the selected imaging mode. Upon completion of the recording-use AWB process, the process executes a recording process in a step S19 and a through image process in a step S21, and then returns to the step S3. When YES is determined in the step S23, the process returns to the step S3 as it is.
The mode setting process in the step S9 is executed according to a subroutine shown in
In a step S41, the motion coefficient EC(M, N) is calculated according to the above-described Equation 1, and in a step S43, it is determined whether or not the calculated motion coefficient EC(M, N) exceeds the reference value REF. Herein, when NO is determined, the process proceeds to a step S47 as it is, whereas when YES is determined, the process increments the variable CNT_M in a step S45, and then proceeds to the step S47.
In the step S47, it is determined whether or not the variable N reaches “4”, and in a step S49, it is determined whether or not the variable M reaches “15”. When NO is determined in the step S47, the process increments the variable N in a step S51, and then returns to the step S41. When YES is determined in the step S47 and NO is determined in the step S49, the process increments the variable M in a step S53, and then returns to the step S39.
When YES is determined in both of the steps S47 and S49, the process proceeds to a step S55 in which the motion pattern MP is created with reference to the variables CNT_0 to CNT_15. Specifically, out of the variables CNT_0 to CNT_15, every variable showing a numerical value equal to or more than “1” is extracted, and the evaluation group corresponding to each extracted variable is specified as a motion group. Then, a pattern formed by the specified motion groups is created as the motion pattern MP.
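A sketch of this step S55 creation follows, representing the motion pattern MP as a set of group numbers; the set representation is an assumption, since the text only requires that the pattern record which groups moved.

```python
def create_motion_pattern(cnt):
    """cnt -- the 16 counters CNT_0..CNT_15. Every group whose counter is
    equal to or more than 1 becomes a motion group; the set of motion
    groups is the motion pattern MP."""
    return frozenset(m for m, c in enumerate(cnt) if c >= 1)
```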
In a step S57, it is determined whether or not the number of evaluation groups (=MG) forming the motion pattern MP is equal to or more than “3” and equal to or less than “9”. Herein, when NO is determined, the variable K is decremented in a range equal to or more than “0”, and then the process proceeds to a step S71. On the other hand, when YES is determined, the process proceeds to a step S61 in which the matching process for determining whether or not the motion pattern MP created in the step S55 satisfies the sport scene condition is executed. A flag FLG_sprt is set to “1” when the motion pattern MP satisfies the sport scene condition, and set to “0” when it does not.
In a step S63, it is determined whether or not the flag FLG_sprt indicates “1”. When NO is determined, the process proceeds to the step S71, whereas when YES is determined, the variable K is incremented in a step S65. In a step S67, it is determined whether or not the variable K is equal to or more than “4”. When NO is determined, the process proceeds to the step S71, whereas when YES is determined, the imaging mode is finalized to the sport mode in a step S69. Upon completion of the process in the step S69, the process is restored to a routine at a hierarchically upper level.
In the step S71, another scene determining & mode finalizing process is executed. In a step S73, it is determined whether or not the imaging mode is finalized by the process in the step S71, and in a step S75, it is determined whether or not a 30-frame period has elapsed from a start of the mode setting process. When YES is determined in the step S73, the process is restored to the routine at a hierarchically upper level as it is, and when YES is determined in the step S75, the process finalizes the imaging mode to the normal mode, and then is restored to the routine at a hierarchically upper level. When NO is determined in both of the steps S73 and S75, the process returns to the step S33.
The matching process in the step S61 is executed according to a subroutine shown in
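Although the figure containing that subroutine is not reproduced here, its effect as described above is an inclusion test. A minimal sketch, assuming each predetermined pattern DP_0 to DP_6 is represented as a set of group numbers:

```python
def matching(mp, predetermined_patterns):
    """mp -- the motion pattern MP as a set of group numbers;
    predetermined_patterns -- the patterns DP_0 to DP_6, each a set.
    Returns 1 (FLG_sprt) when MP is included in one of them, else 0."""
    return 1 if any(mp <= dp for dp in predetermined_patterns) else 0
```

The subset operator mp <= dp expresses the condition of the step S61: every motion group in MP must also belong to the candidate predetermined pattern.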
As understood from the above description, the imaging device 16 has an imaging surface for capturing an object scene, and repeatedly outputs the object scene image. The CPU 34 notices the evaluation areas B_0 to B_4 belonging to each of the evaluation groups G_0 to G_15 allocated on the object scene (S37 to S39 and S47 to S53), so as to calculate the motion coefficient in each noticed evaluation area based on the object scene image outputted from the imaging device 16 (S41). Also, the CPU 34 specifies the evaluation group including the evaluation area in which the motion coefficient exceeds the reference value REF, as a motion group indicating motion exceeding the reference (S43 to S45). Moreover, the CPU 34 creates the pattern of the one or more specified motion groups as the motion pattern MP (S55), and compares the created motion pattern MP with the predetermined motion patterns DP_0 to DP_6 so as to adjust the imaging parameter (S57 to S69, S13, and S17).
When a cause of the motion exceeding the reference is a camera shake on the imaging surface, all of the evaluation groups G_0 to G_15 are detected as motion groups. On the other hand, when a cause of the motion exceeding the reference is a motion of an object present in the object scene, one portion of the evaluation groups G_0 to G_15 is detected as motion groups. Which of these causes, i.e., the camera shake on the imaging surface or the motion of an object, results in the motion is determined by comparing the pattern of the detected motion groups, i.e., the motion pattern MP, with the predetermined motion patterns DP_0 to DP_6. When this determination result is referenced, it becomes possible to adjust the imaging parameter according to the cause of the motion. Thus, an improvement in the capability of adjusting the imaging parameter is implemented.
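The discrimination described here reduces to a simple test on the motion pattern; a sketch follows, where the full-group set corresponds to the 16 groups G_0 to G_15 and the three-way return value is an illustrative simplification.

```python
ALL_GROUPS = frozenset(range(16))  # the 16 evaluation groups G_0 to G_15

def classify_motion_cause(mp):
    """mp -- the motion pattern MP as a set of group numbers."""
    if mp == ALL_GROUPS:
        return "camera shake"   # every group moved together
    if mp:
        return "object motion"  # only one portion of the groups moved
    return "no motion"
```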
Moreover, the accuracy for detecting the motion of an object depends on the number of the evaluation areas allocated on the imaging surface. The larger the number of evaluation areas, the higher the detection accuracy. However, the increase in number of evaluation areas makes it difficult to comprehend the behavior of the motion of an object.
In this embodiment, the evaluation areas allocated on the imaging surface are grouped, and the pattern of the evaluation groups to which the evaluation areas in which the motion occurs belong is noticed. Thus, irrespective of an increase in the number of evaluation areas allocated on the imaging surface, it becomes easy to comprehend the nature of the motion of an object. Thereby, an improvement in the imaging capability for a dynamic object scene is implemented.
It is noted that in this embodiment, the determination of the sport scene is assumed. However, when the form of the predetermined motion patterns DP_0 to DP_6 is appropriately changed, the present invention can also be applied to the adjustment of an imaging parameter of a surveillance camera or a Web camera.
Moreover, this embodiment is so designed that when the scene is not explicitly determined even after an elapse of the 30-frame period from a start of the mode setting process, the imaging mode is finalized to the normal mode (see the steps S75 to S77 in
Also, in this embodiment, when the imaging mode is finalized to the sport mode, the mode setting process is ended (see the step S69 in
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.