The disclosure of Japanese Patent Application No. 2010-217698, which was filed on Sep. 28, 2010, is incorporated herein by reference.
1. Field of the Invention
The present invention relates to an electronic camera. More particularly, the present invention relates to an electronic camera which adjusts an imaging condition in response to a touch operation.
2. Description of the Related Art
According to one example of this type of apparatus, an electronic camera is provided with a liquid crystal display portion for confirming a photographed image, a touch panel installed on the liquid crystal display portion, a shooting lens, an imaging unit including an AF module, a CCD, and the like, a ranging circuit which measures a distance to a desired object of an image inputted by the imaging unit, and an autofocus driving circuit for driving the imaging unit so as to focus on the desired object.
Moreover, out of objects within a photographing range displayed on the liquid crystal display portion, an object existing at an arbitrary position in the photographing range is focused on and photographed by the following processes: a focus block, i.e., a position intended to be focused on, is designated on the touch panel, a distance to the object in the designated focus block is measured by the ranging circuit, and the autofocus driving circuit is controlled based on a measured result.
However, the above-described apparatus does not describe any manner of adjusting an imaging condition other than designating the focus block by the touch panel operation. Therefore, it is unclear whether another imaging condition adjustment manner, such as face detection, can be used simultaneously. As a result, the manner of adjusting the imaging condition may be limited.
An electronic camera according to the present invention comprises: an imager, having an imaging surface capturing an optical image, which outputs an electronic image corresponding to the optical image; an executor which executes, along a reference criterion, a process of adjusting an imaging condition based on a partial image appearing at a designated position out of the electronic image outputted from the imager; a searcher which searches for a specific object image coincident with a dictionary image from the electronic image outputted from the imager; a first selector which selects a specific criterion as the reference criterion when the designated position is equivalent to a position of the specific object image discovered by the searcher; and a second selector which selects a criterion different from the specific criterion as the reference criterion when the designated position is different from the position of the specific object image discovered by the searcher.
According to the present invention, a computer program embodied in a tangible medium, which is executed by a processor of an electronic camera provided with an imager, having an imaging surface capturing an optical image, which outputs an electronic image corresponding to the optical image, the program comprising: an executing instruction to execute, along a reference criterion, a process of adjusting an imaging condition based on a partial image appearing at a designated position out of the electronic image outputted from the imager; a searching instruction to search for a specific object image coincident with a dictionary image from the electronic image outputted from the imager; a first selecting instruction to select a specific criterion as the reference criterion when the designated position is equivalent to a position of the specific object image discovered based on the searching instruction; and a second selecting instruction to select a criterion different from the specific criterion as the reference criterion when the designated position is different from the position of the specific object image discovered based on the searching instruction.
According to the present invention, an imaging control method executed by an electronic camera provided with an imager, having an imaging surface capturing an optical image, which outputs an electronic image corresponding to the optical image, the imaging control method comprising: an executing step of executing, along a reference criterion, a process of adjusting an imaging condition based on a partial image appearing at a designated position out of the electronic image outputted from the imager; a searching step of searching for a specific object image coincident with a dictionary image from the electronic image outputted from the imager; a first selecting step of selecting a specific criterion as the reference criterion when the designated position is equivalent to a position of the specific object image discovered by the searching step; and a second selecting step of selecting a criterion different from the specific criterion as the reference criterion when the designated position is different from the position of the specific object image discovered by the searching step.
The above described features and advantages of the present invention will become more apparent from the following detailed description of the embodiment when taken in conjunction with the accompanying drawings.
With reference to
With reference to
When a power source is applied, in order to execute a moving-image taking process, a CPU 36 commands a driver 18c to repeat an exposure procedure and an electric-charge reading-out procedure under the imaging task. In response to a vertical synchronization signal Vsync periodically generated from an SG (Signal Generator) not shown, the driver 18c exposes the imaging surface and reads out the electric charges produced on the imaging surface in a raster scanning manner. From the image sensor 16, raw image data that is based on the read-out electric charges is cyclically outputted.
A pre-processing circuit 20 performs processes, such as digital clamp, pixel defect correction, gain control, and the like, on the raw image data outputted from the image sensor 16. The raw image data on which these processes are performed is written into a raw image area 24a of an SDRAM 24 through a memory control circuit 22.
A post-processing circuit 26 reads out the raw image data accommodated in the raw image area 24a through the memory control circuit 22, and performs processes such as a color separation process, a white balance adjusting process, and a YUV converting process on the read-out raw image data. Moreover, the post-processing circuit 26 executes a zoom process for display on the image data complying with a YUV format. As a result, display image data complying with the YUV format is created. The display image data is written into a display image area 24b of the SDRAM 24 by the memory control circuit 22.
An LCD driver 28 repeatedly reads out the display image data accommodated in the display image area 24b through the memory control circuit 22, and drives an LCD monitor 30 based on the read-out image data. As a result, a real-time moving image (a live view image) of the scene is displayed on a monitor screen.
With reference to
An AE/AF/AWB evaluating circuit 34 integrates Y data belonging to the evaluation area EVA for each divided area, out of the Y data produced by the pre-processing circuit 20, every time the vertical synchronization signal Vsync is generated. Thereby, 256 integral values (256 AE evaluation values) are outputted from the AE/AF/AWB evaluating circuit 34 in response to the vertical synchronization signal Vsync.
Moreover, the AE/AF/AWB evaluating circuit 34 extracts a high-frequency component of the Y data belonging to the same evaluation area EVA, out of the Y data outputted from the pre-processing circuit 20, every time the vertical synchronization signal Vsync is generated, so as to integrate the extracted high-frequency component for each divided area. Thereby, 256 integral values (256 AF evaluation values) are outputted from the AE/AF/AWB evaluating circuit 34 in response to the vertical synchronization signal Vsync.
Moreover, the AE/AF/AWB evaluating circuit 34 integrates RGB data belonging to the same evaluation area EVA for each divided area, out of the RGB data outputted from the pre-processing circuit 20, every time the vertical synchronization signal Vsync is generated. Thereby, 256 integral values (256 AWB evaluation values) are outputted from the AE/AF/AWB evaluating circuit 34 in response to the vertical synchronization signal Vsync.
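The per-area integration described above can be sketched as follows. This is an illustration, not the circuit's actual implementation; the 16x16 division of the evaluation area EVA (16 x 16 = 256 divided areas) and the plane dimensions are assumptions.

```python
# Illustrative sketch (not from the specification) of producing the 256
# AE evaluation values: the evaluation area EVA is assumed to be divided
# into a 16x16 grid, and the luminance (Y) data belonging to each divided
# area is integrated into one value per area.
def integrate_evaluation_values(y_data, grid=16):
    """y_data is a list of rows of luminance values; height and width
    are assumed divisible by `grid`. Returns grid x grid integral values."""
    h, w = len(y_data), len(y_data[0])
    bh, bw = h // grid, w // grid          # divided-area height and width
    values = [[0] * grid for _ in range(grid)]
    for y in range(h):
        for x in range(w):
            values[y // bh][x // bw] += y_data[y][x]
    return values

y_plane = [[1] * 64 for _ in range(64)]    # hypothetical 64x64 Y plane
ae_values = integrate_evaluation_values(y_plane)
print(len(ae_values) * len(ae_values[0]))  # -> 256 integral values
```

The AF and AWB evaluation values would follow the same pattern, integrating the high-frequency component of the Y data and the RGB data, respectively, per divided area.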
When a touch operation is not performed on the monitor screen, under the imaging task, the CPU 36 executes a simple AE process based on the output from the AE/AF/AWB evaluating circuit 34 so as to calculate an appropriate EV value. An aperture amount and an exposure time period that define the calculated appropriate EV value are respectively set to the drivers 18b and 18c. As a result, a brightness of the live view image is roughly adjusted.
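One way an aperture amount and an exposure time period can "define" an EV value is the standard APEX relation EV = AV + TV, with AV = 2·log2(F-number) and TV = log2(1/exposure time). The sketch below is a hypothetical illustration of that relation, not the specification's AE program diagram; the fixed F2.8 aperture is an assumption.

```python
# Hypothetical sketch: given an appropriate EV value from the simple AE
# process, fix the aperture and solve for the exposure time that realizes
# that EV via the APEX relation EV = AV + TV.
import math

def split_ev(ev, f_number=2.8):
    """Return (aperture F-number, exposure time in seconds) defining `ev`."""
    av = 2 * math.log2(f_number)   # aperture value
    tv = ev - av                   # time value, since EV = AV + TV
    return f_number, 2.0 ** (-tv)  # exposure time t = 2**(-TV)

f, t = split_ev(12.0)
# Recombining the pair reproduces the original EV value.
assert abs(2 * math.log2(f) + math.log2(1 / t) - 12.0) < 1e-9
```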
In parallel with the moving-image taking process, under a face detecting task, the CPU 36 repeatedly issues a searching request toward a face detecting circuit 44.
The face detecting circuit 44 is configured as shown in
When the image in the comparing frame structure coincides with the registered face image, the face detecting circuit 44 registers a size and a position of the comparing frame structure at a current time point on a register 44e as shown in
In response to the searching end notification sent back from the face detecting circuit 44, the CPU 36 determines whether or not searching for a face of a person is successful. When at least one comparing frame structure is registered in the register 44e, it is determined that searching for the face image is successful. In contrast, when no comparing frame structure is registered in the register 44e, it is determined that searching for the face image is unsuccessful.
When searching for the face image is successful, the CPU 36 detects comparing frame structure information registered in the register 44e so as to issue a face-frame-structure character display command corresponding to the detected comparing frame structure information toward the LCD driver 28. The LCD driver 28 drives the LCD monitor 30 with reference to the thus-applied face-frame-structure character display command. A face-frame-structure character is displayed on the LCD monitor 30 in a manner to surround a face of a person appearing in the live view image. Thus, when the scene shown in
When searching for the face image is unsuccessful, the CPU 36 issues a face-frame-structure character non-display command toward the LCD driver 28. As a result, the face-frame-structure character displayed on the LCD monitor 30 disappears.
When the touch operation is performed on the monitor screen in a state where the live view image is displayed on the LCD monitor 30, a touch position is detected by a touch sensor 32, and the detected result is applied to the CPU 36.
Subsequently, under the imaging task, the CPU 36 determines whether or not the touch position is inside any one of the one or more comparing frame structures registered in the register 44e. When the touch position is inside any one of the comparing frame structures, the CPU 36 determines that a person is designated by the touch operation, and sets an adjustment criterion of the imaging condition for a portrait scene.
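The hit test described above can be sketched as a point-in-rectangle check against the registered comparing frame structures. The function name and the (x, y, size) entry format are hypothetical; the specification only states that a size and a position are registered on the register 44e.

```python
# Minimal sketch of the touch-position hit test: the portrait criterion
# is chosen only when the touch falls inside one of the comparing frame
# structures registered by the face detecting circuit (squares assumed).
def touch_in_face_frame(touch, frames):
    """touch is (x, y); frames is a list of (x, y, size) entries."""
    tx, ty = touch
    for fx, fy, size in frames:
        if fx <= tx < fx + size and fy <= ty < fy + size:
            return True
    return False

frames = [(100, 80, 64)]                          # one registered frame
print(touch_in_face_frame((120, 100), frames))    # -> True  (portrait)
print(touch_in_face_frame((10, 10), frames))      # -> False (other scenes)
```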
When the touch position is not inside any of the comparing frame structures, the CPU 36 determines that a person is not designated by the touch operation, and then determines which one of a plurality of photographed scenes other than the portrait scene, i.e., a night-view scene or a landscape scene, the scene is equivalent to. Each of the night-view scene determination and the landscape scene determination is executed based on the AE evaluation values, the AF evaluation values and the AWB evaluation values outputted from the AE/AF/AWB evaluating circuit 34.
As a result of the determination, when the scene is equivalent to the night-view scene, the CPU 36 sets the adjustment criterion of the imaging condition for the night-view scene. When the scene is equivalent to the landscape scene, the CPU 36 sets the adjustment criterion of the imaging condition for the landscape scene. When the scene is not equivalent to any of the night-view scene and the landscape scene, the CPU 36 sets the adjustment criterion of the imaging condition for a default scene.
Subsequently, the CPU 36 extracts AE evaluation values, AF evaluation values and AWB evaluation values corresponding to the touch position, from among the 256 AE evaluation values, the 256 AF evaluation values and the 256 AWB evaluation values outputted from the AE/AF/AWB evaluating circuit 34.
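The extraction step can be illustrated as mapping the touch coordinates to the divided area containing them, then taking that area's integral values as the partial evaluation values. The 16x16 grid and the screen dimensions below are assumptions carried over from the 256-area description; the specification does not state how many areas around the touch position are used.

```python
# Hedged sketch: map a touch position to the index of the divided area
# (row, col) in an assumed 16x16 grid covering the monitor screen, so the
# AE/AF/AWB evaluation values of that area can be extracted.
def touch_to_area_index(touch, screen_size, grid=16):
    tx, ty = touch
    w, h = screen_size
    col = min(tx * grid // w, grid - 1)   # clamp for touches on the edge
    row = min(ty * grid // h, grid - 1)
    return row, col

row, col = touch_to_area_index((320, 240), (640, 480))
print(row, col)  # -> 8 8 : the divided area at the screen center
```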
The CPU 36 executes a strict AE process that is based on the extracted partial AE evaluation values along the set adjustment criterion. An aperture amount and an exposure time period that define an optimal EV value calculated by the strict AE process are respectively set to the drivers 18b and 18c. As a result, the brightness of the live view image is adjusted to a brightness in which a part of the scene equivalent to the touch position is noticed.
Upon completion of the strict AE process, the CPU 36 executes an AF process that is based on the extracted partial AF evaluation values along the set adjustment criterion. As a result, the focus lens 12 is placed at a focal point in which a part of the scene equivalent to the touch position is noticed, and thereby, a sharpness of the live view image is improved.
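Since the AF evaluation values are integrals of a high-frequency component, the AF process is a contrast-detection search: the focal point is the lens position where the partial AF evaluation value peaks. The sketch below illustrates that idea only; the search strategy, step size, and response curve are hypothetical, not the specification's procedure.

```python
# Illustrative contrast-AF sketch: sample the partial AF evaluation value
# at candidate focus lens positions and place the lens at the position
# giving the maximum value (sharpest partial scene).
def contrast_af(af_value_at, positions):
    """af_value_at(pos) returns the partial AF evaluation value with the
    focus lens at pos; returns the in-focus lens position."""
    return max(positions, key=af_value_at)

# Hypothetical response curve peaking at lens position 30.
curve = lambda pos: -(pos - 30) ** 2
best = contrast_af(curve, range(0, 101, 5))
print(best)  # -> 30
```

A real implementation would typically refine the step size near the peak rather than scan exhaustively; the exhaustive scan keeps the sketch short.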
Upon completion of the AF process, the CPU 36 executes an AWB process that is based on the extracted partial AWB evaluation values along the set adjustment criterion. Thereby, an appropriate white balance adjustment gain is calculated. The calculated appropriate white balance adjustment gain is set to the post-processing circuit 26, and as a result, the white balance of the live view image is adjusted to a white balance in which a part of the scene equivalent to the touch position is noticed.
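One common way such a white balance adjustment gain could be derived from the partial AWB evaluation values (the per-area RGB integrals) is a gray-world assumption: scale the R and B channels so their integrals match G. This is an illustration of the idea, not the specification's algorithm.

```python
# Sketch: compute per-channel white balance gains from the partial AWB
# evaluation values under a gray-world assumption (hypothetical method).
def awb_gains(r_sum, g_sum, b_sum):
    """Return (R gain, G gain, B gain) equalizing the channel integrals."""
    return g_sum / r_sum, 1.0, g_sum / b_sum

gr, gg, gb = awb_gains(r_sum=900.0, g_sum=1200.0, b_sum=800.0)
# Applying the gains equalizes the three channel integrals.
assert abs(gr * 900.0 - 1200.0) < 1e-9 and abs(gb * 800.0 - 1200.0) < 1e-9
```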
In a case where the adjustment criterion of the imaging condition is set for the portrait scene, after the AWB process is completed, the CPU 36 executes portrait adjusting processes such as a skin color emphasizing process and a noise removal process. As a result, a sharpness of an image representing a skin color portion of the person is improved.
When the imaging condition is thus adjusted, a still-image taking process and a recording process are executed. One frame of the display image data at a time point at which the touch operation is performed on the monitor screen is taken into a still-image area 24c by the still-image taking process. The taken one frame of the image data is read out from the still-image area 24c by an I/F 38 which is started up in association with the recording process, and is recorded on a recording medium 40 in a file format.
The CPU 36 executes a plurality of tasks including the imaging task shown in
With reference to
When the determined result of the step S3 is updated from NO to YES, in a step S7, an adjustment criterion selecting process is executed. As a result, a photographed scene is determined in a manner which is different depending on the touch position so as to select an adjustment criterion corresponding to a determined photographed scene as the adjustment criterion of the imaging condition.
In a step S9, out of the 256 AE evaluation values, the 256 AF evaluation values and the 256 AWB evaluation values outputted from the AE/AF/AWB evaluating circuit 34, the AE evaluation values, the AF evaluation values and the AWB evaluation values corresponding to the touch position are extracted. In a step S11, an imaging condition adjusting process is executed based on the thus-extracted partial AE evaluation values, AF evaluation values and AWB evaluation values. As a result, the imaging condition is adjusted by noticing a partial scene equivalent to the touch position.
In a step S13, the still-image taking process is executed. As a result, one frame of the display image data at the time point at which the touch operation is performed on the monitor screen is taken into the still-image area 24c. In a step S15, the recording process is executed. As a result, the one frame of the image data taken into the still-image area 24c is read out so as to be recorded on the recording medium 40 in the file format.
The adjustment criterion selecting process in the step S7 is executed according to a subroutine shown in
When the determined result is NO, in a step S25, it is determined whether or not the scene is equivalent to the night-view scene, and in a step S29, it is determined whether or not the scene is equivalent to the landscape scene. When YES is determined in the step S25, in a step S27, the adjustment criterion of the imaging condition is set for the night-view scene. When YES is determined in the step S29, in a step S31, the adjustment criterion of the imaging condition is set for the landscape scene. When NO is determined in both of the steps S25 and S29, in a step S33, the adjustment criterion of the imaging condition is set for a default scene. Upon completion of the processes in the steps S23, S27, S31 or S33, the process returns to the routine in an upper hierarchy.
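The branching of the adjustment criterion selecting subroutine can be condensed as below. The predicate names are hypothetical; the step comments map onto the steps named above.

```python
# Compact sketch of the adjustment criterion selecting subroutine:
# portrait when the touch position hits a comparing frame structure,
# otherwise night view, landscape, or default, in that order.
def select_criterion(touch_in_face, is_night_view, is_landscape):
    if touch_in_face:       # -> S23: portrait criterion
        return "portrait"
    if is_night_view:       # S25 YES -> S27
        return "night-view"
    if is_landscape:        # S29 YES -> S31
        return "landscape"
    return "default"        # S25 and S29 both NO -> S33

print(select_criterion(True, False, False))   # -> portrait
print(select_criterion(False, False, False))  # -> default
```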
The imaging condition adjusting process in the step S11 is executed according to a subroutine shown in
In a step S43, the AF process that is based on the partial AF evaluation values extracted in the step S9 is executed along the adjustment criterion set in the step S7. As a result, the focus lens 12 is placed at the focal point in which the partial scene equivalent to the touch position is noticed, and thereby, the sharpness of the live view image is improved.
In a step S45, the AWB process that is based on the partial AWB evaluation values extracted in the step S9 is executed along the adjustment criterion set in the step S7. Thereby, the appropriate white balance adjustment gain is calculated. The calculated appropriate white balance adjustment gain is set to the post-processing circuit 26, and as a result, the white balance of the live view image is adjusted to the white balance in which the partial scene equivalent to the touch position is noticed.
In a step S47, it is determined whether or not the adjustment criterion of the imaging condition is set for the portrait scene. When a determined result is NO, the process returns to the routine in an upper hierarchy, whereas when the determined result is YES, the process returns to the routine in an upper hierarchy via a process in a step S49.
In the step S49, the portrait adjusting processes such as the skin color emphasizing process and the noise removal process are executed. As a result, the sharpness of the image representing the skin color portion of the person is improved.
With reference to
When the searching end notification is sent back from the face detecting circuit 44, in a step S55, it is determined whether or not searching for the face image is successful. When at least one comparing frame structure is registered in the register 44e, it is determined that searching for the face image is successful, and the process advances to a step S57. Thereafter, the process returns to the step S51. In contrast, when no comparing frame structure is registered in the register 44e, it is determined that searching for the face image is unsuccessful, and the process advances to a step S59. Thereafter, the process returns to the step S51.
In the step S57, the face-frame-structure character display command is issued toward the LCD driver 28. The LCD driver 28 drives the LCD monitor 30 with reference to the thus-applied face-frame-structure character display command. The face-frame-structure character is displayed on the LCD monitor 30 in a manner to surround the face of the person appearing in the live view image. In the step S59, the face-frame-structure character non-display command is issued toward the LCD driver 28. As a result, the face-frame-structure character displayed on the LCD monitor 30 disappears.
As can be seen from the above-described explanation, the image sensor 16 has the imaging surface capturing the scene and outputs the electronic image, and the CPU 36 executes, along the reference criterion, the process of adjusting the imaging condition based on the partial image appearing at the designated position out of the electronic image outputted from the image sensor 16. The CPU 36 and the face detecting circuit 44 search for the specific object image coincident with the dictionary image from the electronic image outputted from the image sensor 16. Moreover, the CPU 36 determines whether or not the designated position is equivalent to the position of the specific object image discovered by the CPU 36 and the face detecting circuit 44. The CPU 36 selects the specific criterion as the reference criterion when a determined result is positive, whereas the CPU 36 selects the criterion different from the specific criterion as the reference criterion when the determined result is negative.
It is noted that, in this embodiment, control programs equivalent to the multi task operating system and the plurality of tasks executed thereby are previously stored in the flash memory 42. However, a communication I/F 46 for connecting to an external server may be arranged in the digital camera 10 as shown in
Moreover, in this embodiment, the processes executed by the CPU 36 are divided into the imaging task shown in
Moreover, in this embodiment, it is determined which of the portrait scene, the night-view scene, the landscape scene and the default scene the scene is equivalent to. However, it may be determined that the scene is equivalent to a photographed scene other than these scenes.
Moreover, in this embodiment, it is determined whether or not the touch position is inside of the comparing frame structure by searching for the face image of the person. However, it may be determined whether or not the touch position is inside of the comparing frame structure by searching for an image other than the face image of the person.
Moreover, in this embodiment, the present invention is explained by using a digital still camera; however, the present invention may also be applied to a digital video camera, a cell phone unit, a smartphone, and the like.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
2010-217698 | Sep 2010 | JP | national |