The disclosure of Japanese Patent Application No. 2007-219412 which was filed on Aug. 27, 2007 is incorporated herein by reference.
1. Field of the Invention
The present invention relates to a video camera. More particularly, the present invention relates to a video camera that prevents generation of a flicker resulting from a beat interference between an exposure time period of an imaging element and a blinking cycle of a fluorescent lamp.
2. Description of the Related Art
According to one example of this type of video camera, an electric-charge accumulation time period, i.e., an exposure time period, of an imaging element is set to the same time period as a blinking cycle (=T) of a fluorescent lamp or to an integral multiple of the cycle T (=2T, 3T, 4T, . . . ). Thereby, generation of a flicker resulting from a beat interference between the exposure time period of the imaging element and the blinking cycle of the fluorescent lamp is prevented. However, in order to determine whether or not the flicker occurs, an object scene image spanning several frames to ten-odd frames needs to be referenced. Therefore, when whether or not the flicker occurs is determined all the time, the processor load increases.
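The beat interference behind this flicker can be pictured with a short numerical sketch. The following C program is an illustration only and not part of the patent; the sin²-shaped lamp intensity model, the 50-Hz mains frequency, the 1/60-second frame cycle, and the 1/250-second exposure are assumptions chosen for the example. It integrates the lamp intensity over the exposure window of successive frames and shows that the captured energy fluctuates from frame to frame unless the exposure time is an integral multiple of the blinking cycle T.

    /* Illustrative sketch (not from the patent): the light output of a fluorescent
     * lamp driven at mains frequency f is roughly proportional to sin^2(2*pi*f*t),
     * i.e. it blinks with cycle T = 1/(2f).  Integrating that intensity over the
     * exposure window of each frame shows that the captured brightness varies from
     * frame to frame (flicker) unless the exposure time is an integer multiple of T. */
    #include <math.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    static double lamp_energy(double start, double exposure, double mains_hz)
    {
        /* closed form of the integral of sin^2(w*t) over [start, start+exposure] */
        double w = 2.0 * M_PI * mains_hz;
        double a = start, b = start + exposure;
        return (b - a) / 2.0 - (sin(2.0 * w * b) - sin(2.0 * w * a)) / (4.0 * w);
    }

    int main(void)
    {
        double frame = 1.0 / 60.0;      /* frame cycle of the imager (assumed)      */
        double exposure = 1.0 / 250.0;  /* not a multiple of T = 1/100 s (assumed)  */
        for (int i = 0; i < 6; i++)     /* energy differs frame by frame -> flicker */
            printf("frame %d: %f\n", i, lamp_energy(i * frame, exposure, 50.0));
        return 0;
    }

With a 50-Hz lamp and a 60-frames-per-second imager, the printed energies repeat with a period of three frames, which is exactly the low-frequency brightness beat perceived as flicker.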
A video camera according to the present invention, comprises: an imager, having an imaging surface irradiated with an optical image of an object scene, for repeatedly generating an object scene image; a determiner for repeatedly determining whether or not a specific variation exceeding a reference is generated in the object scene captured by the imager, in parallel with a generating process of the object scene image by the imager; and a flicker processor for determining whether or not a flicker occurs in the object scene image generated by the imager when a determination result of the determiner is updated from a negative result to an affirmative result so as to execute a flicker countermeasure process.
Preferably, the imager generates the object scene image in a first cycle, and the determiner determines whether or not the specific variation is generated in a second cycle longer than the first cycle.
Preferably, the flicker processor executes a flicker determination at a time when a designated period has elapsed from the updating of the determination result.
Further preferably, there is further provided a first changer for changing the designated period according to a variation amount of the object scene captured by the imager.
Preferably, there is further provided a second changer for changing a flicker determining precision according to a variation amount of the object scene captured by the imager.
Preferably, there is further provided an adjustor for repeatedly adjusting an imaging parameter based on the object scene image generated by the imager, in which the determiner executes a determining process by paying attention to a variation of the imaging parameter adjusted by the adjustor.
Further preferably, the imaging parameter includes an exposure amount. Also, the imaging parameter includes a white-balance adjustment gain.
An imaging-control program product according to the present invention is an imaging-control program product executed by a processor of a video camera comprising an imager having an imaging surface irradiated with an optical image of an object scene, for repeatedly generating an object scene image, the imaging-control program product, comprising: a determining step of repeatedly determining whether or not a specific variation exceeding a reference is generated in the object scene captured by the imager, in parallel with a generating process of the object scene image by the imager; and a flicker processing step of determining whether or not a flicker occurs in the object scene image generated by the imager when a determination result of the determining step is updated from a negative result to an affirmative result so as to execute a flicker countermeasure process.
An imaging control method according to the present invention is an imaging control method of a video camera comprising an imager, having an imaging surface irradiated with an optical image of an object scene, for repeatedly generating an object scene image, the imaging control method, comprising: a determining step of repeatedly determining whether or not a specific variation exceeding a reference is generated in the object scene captured by the imager, in parallel with a generating process of the object scene image by the imager; and a flicker processing step of determining whether or not a flicker occurs in the object scene image generated by the imager when a determination result of the determining step is updated from a negative result to an affirmative result so as to execute a flicker countermeasure process.
The above-described features and advantages of the present invention will become more apparent from the following detailed description of the embodiment when taken in conjunction with the accompanying drawings.
With reference to
When a power supply is turned on, a CPU 30 instructs a TG (Timing Generator) 22 configuring the image sensor 16 to repeatedly perform an exposure and an electric-charge reading-out. The TG 22 applies to the imaging portion 18 a plurality of timing signals responding to a vertical synchronization signal Vsync outputted from an SG (Signal Generator) 24 in order to repeatedly execute an exposure operation of the imaging surface and a reading-out operation of the electric charges obtained thereby. The vertical synchronization signal Vsync is outputted from the SG 24 every 1/60 seconds, and the raw image signal of each frame generated in the imaging portion 18 is read out every 1/60 seconds in raster-scanning order.
The raw image signal outputted from the imaging portion 18 is subjected to a series of processes, such as correlated double sampling, automatic gain adjustment, and A/D conversion, by a CDS/AGC/AD circuit 20 configuring the image sensor 16. A signal-processing circuit 26 performs processes, such as white balance adjustment, color separation, and YUV conversion, on the raw image data outputted from the CDS/AGC/AD circuit 20, and writes YUV-formatted image data to an SDRAM 34 through a memory control circuit 32. It is noted that the white balance adjustment is executed by a white-balance adjusting circuit 26w.
An LCD driver 36 reads out the image data thus written to the SDRAM 34 through the memory control circuit 32 every 1/60 seconds, and drives an LCD monitor 38 based on the read-out image data. As a result, a real-time moving image (through image) of the object scene is displayed on a monitor screen.
A luminance evaluation circuit 28 evaluates the brightness (luminance) of the object scene every 1/60 seconds based on Y data generated by the signal-processing circuit 26. The CPU 30 calculates an optimal EV value based on luminance evaluation values acquired by the luminance evaluation circuit 28, and sets an aperture amount and an exposure time period, which define the calculated optimal EV value, to the aperture unit 14 and the TG 22, respectively. As a result, the brightness of the through image displayed on the LCD monitor 38 is appropriately adjusted.
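The patent does not disclose how the CPU 30 splits the optimal EV value into an aperture amount and an exposure time period. The sketch below is therefore only a hypothetical illustration that assumes the standard relation EV = log2(N²/t) and a fixed, pre-chosen f-number; the function and structure names are not from the patent.

    /* Hypothetical sketch: split an optimal EV value into an aperture amount and
     * an exposure time such as might be programmed into the aperture unit 14 and
     * the TG 22.  EV = log2(N^2 / t)  =>  t = N^2 / 2^EV (assumed relation). */
    #include <math.h>
    #include <stdio.h>

    struct exposure_setting {
        double f_number;      /* aperture amount for the aperture unit 14 */
        double exposure_sec;  /* exposure time period for the TG 22       */
    };

    static struct exposure_setting split_ev(double optimal_ev, double f_number)
    {
        struct exposure_setting s;
        s.f_number = f_number;
        s.exposure_sec = (f_number * f_number) / pow(2.0, optimal_ev);
        return s;
    }

    int main(void)
    {
        struct exposure_setting s = split_ev(12.0, 2.8);   /* EV 12 at f/2.8 */
        printf("f/%.1f, %f s\n", s.f_number, s.exposure_sec);
        return 0;
    }

For EV 12 at f/2.8 this yields roughly 1/500 second, consistent with the usual exposure-value tables.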
The CPU 30 also calculates an optimal gain for a white balance adjustment based on RGB-formatted image data outputted from the white-balance adjusting circuit 26w, and sets the calculated optimal gain to the white-balance adjusting circuit 26w. Thereby, the white balance of the through image is appropriately adjusted.
When a recording start operation is performed by a key input device 46, the CPU 30 instructs an I/F 40 to perform a recording process. The I/F 40 reads out the image data accommodated in the SDRAM 34 through the memory control circuit 32 every 1/60 seconds, and creates, in the recording medium 42, a moving-image file including the read-out image data. Such a recording process is ended in response to a recording end operation by the key input device 46.
With reference to
To solve this, the CPU 30 executes a flicker determining process that references the object scene image captured by the imaging surface, and executes a flicker countermeasure process when the determination result indicates a generation of the flicker. As a result of the flicker countermeasure process being executed, the exposure time period of the imaging surface is set to an integral multiple of 1/120 seconds for a 60-Hz commercial power supply, or to an integral multiple of 1/100 seconds for a 50-Hz commercial power supply.
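A minimal sketch of such a countermeasure, under the assumption that the mains frequency has already been identified, is shown below; the function name and interface are hypothetical and not taken from the patent.

    /* Hypothetical sketch of the flicker countermeasure: force the exposure time
     * to an integral multiple of the lamp blinking cycle (1/120 s for a 60-Hz
     * supply, 1/100 s for a 50-Hz supply) so that every frame integrates the same
     * amount of lamp light. */
    #include <math.h>

    double flicker_free_exposure(double current_sec, int mains_hz)
    {
        double cycle = (mains_hz == 60) ? 1.0 / 120.0 : 1.0 / 100.0;
        double n = floor(current_sec / cycle + 0.5);   /* nearest integral multiple   */
        if (n < 1.0)
            n = 1.0;                                   /* at least one full cycle     */
        return n * cycle;                              /* value to program into TG 22 */
    }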
However, in order to determine whether or not the flicker occurs, an object scene image spanning several frames to ten-odd frames is needed, and thus, when the flicker determining process is executed all the time, the load of the CPU 30 increases. Therefore, the CPU 30 calculates, as ΔEV, a difference of optimal EV values obtained at different timings, and executes the flicker determining process when the calculated difference ΔEV exceeds a threshold value TH1.
Further, the determining reference of whether or not the flicker determining process should be started depends upon a variation amount of the optimal EV value, and thus, when the orientation of the imaging surface is frequently changed between a bright portion and a dark portion, there is a concern that the flicker determining process is frequently started. Also, it may take some time from a variation of the object scene in which the difference ΔEV exceeds the threshold value TH1 until the object scene stabilizes.
Therefore, the CPU 30 calculates, as the ΔEV, a difference between an optimal EV value acquired one second earlier and the latest optimal EV value, repeatedly resets and starts a timer TM (timer setting value: 10 seconds), and executes the flicker determining process on condition that ΔEV > the threshold value TH1 is satisfied and the timer TM has timed out. The flicker determining process is executed according to a procedure shown in
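The trigger condition can be pictured with the following sketch. It is an illustration only: the structure names, the assumed value of TH1, and the exact point at which the timer TM is reset are simplifications, while the one-second EV comparison, the 10-second timer value, and the ΔEV > TH1 condition come from the description above.

    /* Hypothetical sketch of the trigger logic: ΔEV is the difference between the
     * optimal EV value acquired one second earlier and the latest one, and the
     * flicker determination is requested (FLG = 1) only when ΔEV exceeds TH1 and
     * the timer TM has timed out. */
    #include <math.h>
    #include <stdbool.h>

    #define TH1        1.0     /* threshold for the EV variation (assumed value)  */
    #define TM_SET_SEC 10.0    /* timer setting value given in the description    */

    struct trigger_state {
        double ev_one_sec_ago;  /* optimal EV value acquired one second earlier */
        double tm_remaining;    /* remaining time of the timer TM, in seconds   */
        bool   flg;             /* FLG: request for the flicker determination   */
    };

    /* called once per second with the latest optimal EV value */
    void update_trigger(struct trigger_state *s, double latest_ev, double dt_sec)
    {
        double delta_ev = fabs(latest_ev - s->ev_one_sec_ago);

        s->tm_remaining -= dt_sec;
        if (delta_ev > TH1 && s->tm_remaining <= 0.0) {
            s->flg = true;                 /* ask the flicker task to run        */
            s->tm_remaining = TM_SET_SEC;  /* reset & start the timer TM         */
        }
        s->ev_one_sec_ago = latest_ev;
    }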
The CPU 30 executes a plurality of tasks including an imaging condition task shown in
With reference to
With reference to
With reference to
When the flag FLG indicates "0", the process returns to the step S33, and when the flag FLG indicates "1", the process advances to a step S39. In the step S39, the flicker determining process is executed during a period of several frames to ten-odd frames, and in a step S41, it is determined whether or not a result of the flicker determining process indicates the generation of the flicker. When NO is determined in this step, the process directly advances to a step S45, and on the other hand, when YES is determined, the process executes the flicker countermeasure process in a step S43, and then advances to the step S45. In the step S45, the flag FLG is returned to "0", and thereafter, the process returns to the step S33.
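This loop can be summarized by the C sketch below. The function names and placeholder bodies are assumptions; only the correspondence to the flowchart steps noted in the comments comes from the description above.

    #include <stdbool.h>

    /* Hypothetical stand-ins for the processes named in the flowchart; in the
     * camera these would reference the object scene image and the TG 22. */
    static bool flag_flg;                                          /* FLG shared with the imaging condition task */
    static bool run_flicker_determination(void) { return false; } /* step S39 (placeholder)                      */
    static void apply_flicker_countermeasure(void) {}             /* step S43 (placeholder)                      */
    static void wait_for_next_frame(void) {}                      /* wait for the next Vsync (placeholder)       */

    /* One pass of the flicker task corresponding to the steps S33 to S45. */
    static void flicker_task_step(void)
    {
        if (!flag_flg) {                       /* FLG is "0": loop back toward the step S33     */
            wait_for_next_frame();
            return;
        }
        if (run_flicker_determination())       /* steps S39/S41: several to ten-odd frames      */
            apply_flicker_countermeasure();    /* step S43: executed only when flicker is found */
        flag_flg = false;                      /* step S45: return FLG to "0"                   */
    }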
As is apparent from the above description, the image sensor 16 has the imaging surface irradiated with the optical image of the object scene, and repeatedly generates the object scene image. The CPU 30 repeatedly determines whether or not a specific variation exceeding a reference (a variation of ΔEV > TH1) is generated in the object scene captured by the image sensor 16, in parallel with the generating process of the object scene image by the image sensor 16 (S27). When the determination result is updated from a negative result to an affirmative result, the CPU 30 waits for an elapse of a designated period (the remaining time period of the timer TM), then executes the flicker determining process (S39), and executes the flicker countermeasure process, as needed (S43).
Thus, the flicker determining process is executed when the specific variation exceeding the reference is generated in the object scene. In other words, unless the specific variation is generated, the flicker determining process is suspended. Thereby, it becomes possible to quickly prevent the generation of the flicker while suppressing the increase in load resulting from the flicker determination.
It is noted that in this embodiment, the time period set to the timer TM in each of the steps S31 and S35 is a fixed value (=10 seconds), and the waiting time period from when the flag FLG is updated to "1" until the flicker determining process is executed (=the designated period) is the remaining time period of the timer TM. This waiting time period may be changed according to a variation amount of the object scene. In this case, the CPU 30 preferably executes processes in steps S51 to S59 shown in
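As a purely hypothetical illustration of that modification (the concrete content of the steps S51 to S59 is not reproduced in the text available here, so the scaling rule, the bounds, and the function name below are assumptions), the waiting period could be shortened as the variation amount grows:

    /* Hypothetical sketch: derive the designated waiting period from the variation
     * amount of the object scene.  The idea that a larger ΔEV shortens the wait,
     * and the 2-second lower bound, are assumptions for illustration only. */
    double waiting_period_sec(double delta_ev, double th1)
    {
        double wait = 10.0;                 /* nominal value of the timer TM  */
        if (delta_ev > th1)
            wait = 10.0 / (delta_ev / th1); /* larger variation -> shorter wait */
        if (wait < 2.0)
            wait = 2.0;                     /* lower bound (assumed)          */
        return wait;
    }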
With reference to
It is noted that in the processes shown in
Also, in this embodiment, the precision of the flicker determining process is not particularly associated with the variation amount of the object scene. However, the precision of the flicker determining process (that is, the time period spent on the flicker determining process) may be changed according to the variation amount of the object scene. In this case, the CPU 30 preferably executes processes in steps S61 to S71 shown in
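Again purely as an illustration (the steps S61 to S71 are not reproduced here, so the mapping below from variation amount to the number of referenced frames is an assumption), the precision of the flicker determination could be traded off as follows:

    /* Hypothetical sketch: choose how many frames the flicker determining process
     * references according to the variation amount of the object scene.  The
     * concrete mapping (4 to 16 frames) is an assumption for illustration only. */
    int flicker_determination_frames(double delta_ev, double th1)
    {
        if (delta_ev > 4.0 * th1)
            return 4;    /* large variation: decide quickly with fewer frames */
        if (delta_ev > 2.0 * th1)
            return 8;
        return 16;       /* small variation: spend more frames for precision  */
    }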
With reference to
It is noted that in the processes shown in
Furthermore, in this embodiment, when the variation amount of the optimal EV value exceeds the threshold value TH1, the flag FLG is set to "1". However, the flag FLG may be controlled with reference to the variation amount of the optimal gain for a white balance adjustment. In this case, processes in steps S25′ to S27′ shown in
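A sketch of that alternative trigger might look like the following; it is hypothetical, since the concrete steps S25′ to S27′ are not reproduced here, and the choice of comparing the R and B channel gains as well as the threshold TH2 are assumptions.

    /* Hypothetical sketch of the alternative trigger: raise FLG when the optimal
     * white-balance gain calculated for the white-balance adjusting circuit 26w
     * varies by more than a threshold.  The R/B gain pair and TH2 are assumptions. */
    #include <math.h>
    #include <stdbool.h>

    struct wb_gain { double r; double b; };   /* gains applied to the R and B channels */

    bool wb_variation_exceeds(struct wb_gain prev, struct wb_gain latest, double th2)
    {
        double delta = fabs(latest.r - prev.r) + fabs(latest.b - prev.b);
        return delta > th2;   /* if true, the caller sets the flag FLG to "1" */
    }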
With reference to
Furthermore, in this embodiment, the CMOS-type image sensor is used. However, a CCD-type image sensor may be used instead. Also, in this embodiment, the generation cycle of the vertical synchronization signal Vsync is assumed to be 1/60 seconds. However, the generation cycle of the vertical synchronization signal Vsync is not limited thereto.
Additionally, according to
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.