This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2007-115807, filed on Apr. 25, 2007, the entire contents of which is incorporated herein by reference.
1. Field
The embodiments relate to a white balance adjusting device which performs white balance adjustment on image data to be processed, and a related imaging apparatus and white balance adjusting program.
2. Description of the Related Art
Techniques for performing white balance adjustment on digital image data are known. Among them, a technique has been proposed which varies the weights of pieces of information used for white balance adjustment according to luminance information to prevent color failure due to an object color in the white balance adjustment (refer to Japanese Unexamined Patent Application Publication No. 2002-232906, for example).
Incidentally, color contamination may occur in object color boundary portions included in image data, with the result that white balance adjustment is performed improperly due to color failure. The technique of the above publication No. 2002-232906 cannot address this problem.
A proposition of the embodiments is therefore to provide a white balance adjusting device, an imaging apparatus, and a white balance adjusting program which can suppress color failure and perform white balance adjustment properly.
To attain the above proposition, a white balance adjusting device according to the present embodiment includes a dividing unit which divides an image to be processed into plural small areas; a calculating unit which calculates evaluation values of each of the small areas based on color information of the image; a judging unit which makes a judgment as to whether to use the evaluation values of a small area of attention for white balance calculation based on a relationship between the evaluation values of the small area of attention and the evaluation values of small areas adjacent to the small area of attention among the evaluation values of the plural small areas; and a calculating unit which performs white balance calculation based on a judgment result of the judging unit.
For example, the judging unit makes the judgment based on whether the evaluation values are included in achromatic regions which are defined in advance in a predetermined chromaticity coordinate system.
For example, the judging unit judges that the evaluation values of the small area of attention should be used for white balance calculation if the evaluation values of the small area of attention are included in the achromatic regions and the evaluation values of all of the small areas adjacent to the small area of attention are included in the achromatic regions.
For example, the judging unit judges that the evaluation values of the small area of attention should be used for white balance calculation if the evaluation values of the small area of attention are included in the achromatic regions and color differences between the small area of attention and all of the small areas adjacent to the small area of attention are smaller than or equal to a predetermined value.
Furthermore, for example, the calculating unit determines a weight to be applied to the evaluation values of the small area of attention according to the color differences when they are used for white balance calculation.
An imaging apparatus having any of the above white balance adjusting devices is effective as a specific form of implementation of the present embodiment.
A white balance adjusting program, obtained by expressing any of the above white balance adjusting devices in program form, for realizing white balance adjustment on image data to be processed is also effective as a specific form of implementation of the present embodiment.
A first embodiment will be described below with reference to the drawings. The first embodiment is directed to an electronic camera which is equipped with a white balance adjusting device according to the present embodiment.
At step S1, the WB calculating unit 6 obtains image data of an image to be processed from the video signal processing circuit 4. The image data is RGB image data.
At step S2, the WB calculating unit 6 divides the obtained image into plural small areas. In this embodiment, as illustrated in
At step S3, the WB calculating unit 6 calculates evaluation values for each of the small areas produced at step S2. The evaluation values are two values, that is, (average of R values)/(average of G values) and (average of B values)/(average of G values) which will be referred to as R/G and B/G, respectively. The WB calculating unit 6 stores the calculated evaluation values R/G's and B/G's of the respective small areas in the memory 7 (the evaluation values are stored temporarily), and moves to step S4.
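The computation of steps S2 and S3 can be sketched as follows; the list-of-tuples image representation, the grid size, and the function name are illustrative assumptions and not part of the embodiment:

```python
# Sketch (assumptions: rgb is a list of rows of (R, G, B) tuples, and
# the image dimensions divide evenly by the grid size).

def block_evaluation_values(rgb, rows, cols):
    """Split the image into a rows x cols grid of small areas and return,
    per area, the evaluation values (avg R / avg G, avg B / avg G)."""
    h = len(rgb)
    w = len(rgb[0])
    bh, bw = h // rows, w // cols  # small-area height and width
    values = []
    for by in range(rows):
        row_vals = []
        for bx in range(cols):
            r_sum = g_sum = b_sum = 0
            for y in range(by * bh, (by + 1) * bh):
                for x in range(bx * bw, (bx + 1) * bw):
                    r, g, b = rgb[y][x]
                    r_sum += r
                    g_sum += g
                    b_sum += b
            # Averages share the same pixel count, so the ratios of the
            # sums equal the ratios of the averages (R/G and B/G).
            row_vals.append((r_sum / g_sum, b_sum / g_sum))
        values.append(row_vals)
    return values
```

For a uniformly gray image, every small area yields R/G = B/G = 1, which is the achromatic case the subsequent judgment steps look for.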
At step S4, the WB calculating unit 6 performs judgment on the evaluation values of each small area calculated at step S3. More specifically, the WB calculating unit 6 performs judgment on the evaluation values of each small area based on whether or not the evaluation values calculated at step S3 are included in predetermined achromatic regions in a predetermined color plane illustrated in
For example, in the case of an image in which a top-left area is white, a bottom-right area is green, and a blue area exists between them (see
Therefore, the WB calculating unit 6 stores “OK” in the memory 7 for the small areas indicated by “W”, “W/B”, or “B/G” in
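The judgment of step S4 might be sketched as below; the rectangular achromatic regions and their numeric bounds in the R/G–B/G plane are purely hypothetical placeholders (an actual camera would use regions tuned to its sensor and expected light sources):

```python
# Hypothetical achromatic regions in the (R/G, B/G) plane, one rectangle
# per assumed light-source family. The numbers are placeholders only.
ACHROMATIC_REGIONS = [
    # (r_g_min, r_g_max, b_g_min, b_g_max)
    (0.4, 0.8, 1.2, 2.0),   # e.g. a shade/cloudy-like region
    (0.7, 1.3, 0.7, 1.3),   # e.g. a daylight-like region
    (1.2, 2.0, 0.3, 0.8),   # e.g. an incandescent-like region
]

def judge_achromatic(r_g, b_g, regions=ACHROMATIC_REGIONS):
    """Return "OK" if the evaluation values fall inside any achromatic
    region, "NG" otherwise (mirroring the OK/NG labels in the text)."""
    for r_lo, r_hi, b_lo, b_hi in regions:
        if r_lo <= r_g <= r_hi and b_lo <= b_g <= b_hi:
            return "OK"
    return "NG"
```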
At step S5, the WB calculating unit 6 selects a small area of attention. The WB calculating unit 6 selects small areas of attention in order starting from the small area that is located at the top-left corner.
At step S6, the WB calculating unit 6 judges whether or not all the small areas (i.e., eight small areas) adjacent to the small area selected at step S5 are associated with “OK”. If judging that all of such small areas are associated with “OK”, the WB calculating unit 6 moves to step S7. On the other hand, if there is at least one small area that is associated with “NG”, the WB calculating unit 6 moves to step S9 (described later).
Among the small areas for which “OK” was stored by the judgment of step S4, it is not appropriate to use, for white balance calculation, the evaluation values of the small areas indicated by “W/B” because color contamination occurs there. Likewise, the small areas indicated by “B/G” are not actually achromatic areas and were detected erroneously by the judgment of step S4. However, both kinds of small areas can be excluded from subjects of white balance calculation by virtue of the judgment of step S6.
At step S7, the WB calculating unit 6 adds the evaluation values of the small area selected at step S5 to TR/G and TB/G, respectively. Symbols TR/G and TB/G represent the sums of evaluation values and are initialized (TR/G=0, TB/G=0) before the process is started.
At step S8, the WB calculating unit 6 increments, by one, parameter CountT in the memory 7. Then, the WB calculating unit 6 moves to step S9. Parameter CountT is the count of a counter for counting the number of evaluation values that have been added to TR/G and TB/G at step S7, and is initialized (CountT=0) before the process is started.
At step S9, the WB calculating unit 6 increments, by one, parameter CountB in the memory 7. Then, the WB calculating unit 6 moves to step S10. Parameter CountB is the count of a counter for counting the number of small areas that have been selected at step S5, and is initialized (CountB=0) before the process is started.
If it is judged at step S6 that there is at least one small area that is associated with “NG” (step S6: no), steps S7 and S8 are skipped. That is, the evaluation values of the small area of attention are excluded from subjects of white balance calculation.
At step S10, the WB calculating unit 6 judges, based on the count CountB stored in the memory 7, whether or not the judgment of step S6 has been made for all the small areas. If judging that the judgment of step S6 has been made for all the small areas, the WB calculating unit 6 moves to step S11. On the other hand, if judging that the judgment of step S6 has not been made for all the small areas yet, the WB calculating unit 6 returns to step S5 and selects the next small area as a small area of attention.
At step S11, the WB calculating unit 6 calculates average values of the evaluation values according to the following Equations (1) and (2):
(Average value of evaluation values R/G)=TR/G/CountT (1)
(Average value of evaluation values B/G)=TB/G/CountT (2)
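One possible reading of steps S5 through S11 is sketched below; the `labels`/`values` data structures are assumed helpers (labels from the step-S4 judgment, values from step S3), and the skipping of border areas, which lack eight neighbors, is an assumption the text does not spell out:

```python
# Sketch of steps S5-S11: accumulate the evaluation values of every
# small area whose 3x3 neighborhood (itself plus eight adjacent areas)
# is entirely "OK", then average per Equations (1) and (2).
# Border areas are skipped here because they lack eight neighbors
# (one possible reading; the text does not specify border handling).

def average_evaluation_values(labels, values):
    rows, cols = len(labels), len(labels[0])
    t_rg = t_bg = 0.0   # TR/G and TB/G, initialized to 0
    count_t = 0         # CountT, initialized to 0
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            neighborhood = [labels[i + dy][j + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            if all(lab == "OK" for lab in neighborhood):
                rg, bg = values[i][j]
                t_rg += rg            # step S7
                t_bg += bg
                count_t += 1          # step S8
    if count_t == 0:
        return None  # no usable small areas; a fallback would be needed
    return t_rg / count_t, t_bg / count_t  # Equations (1) and (2)
```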
At step S12, the WB calculating unit 6 calculates Rgain and Bgain based on the average values of evaluation values calculated at step S11. A specific method for calculating Rgain and Bgain is the same as in well-known techniques.
As described above, in the first embodiment, a dividing unit which divides an image to be processed into plural small areas, a calculating unit which calculates evaluation values of each small area based on color information of the image, and a judging unit which judges whether to use the evaluation values of a small area of attention for white balance calculation based on a relationship between the evaluation values of the small area of attention and the evaluation values of small areas adjacent to the small area of attention among the evaluation values of the plural small areas are provided and white balance calculation is performed based on a judgment result of the judging unit. Therefore, color failure is suppressed and white balance adjustment can be performed properly.
Furthermore, in the first embodiment, the judging unit makes the judgment based on whether or not the evaluation values are included in achromatic regions which are defined in advance in a predetermined chromaticity coordinate system. Therefore, small areas that may cause color contamination can be excluded from subjects of white balance calculation.
Still further, in the first embodiment, the judging unit judges that the evaluation values of the small area of attention should be used for white balance calculation if the evaluation values of the small area of attention are included in the achromatic regions and the evaluation values of all of the small areas adjacent to the small area of attention are included in the achromatic regions. Therefore, achromatic areas can be selected with high accuracy and used for white balance calculation.
A second embodiment will be described below with reference to the drawings. The second embodiment is a modification of the first embodiment. In the second embodiment, only components, operations, etc. that are different from those in the first embodiment will be described in detail; components that are the same as in the first embodiment are given the same reference symbols as in the first embodiment.
The second embodiment is directed to an electronic camera which is the same in configuration as the electronic camera 1 of
At steps S21-S25, the WB calculating unit 6 performs the same processing as at steps S1-S5 in
At step S26, as at step S6 in
At step S27, the WB calculating unit 6 calculates color differences ΔC between the small area selected at step S25 and all the small areas adjacent to it, respectively.
ΔC(x, y) = √{(R/G(x, y) − R/G(0, 0))² + (B/G(x, y) − B/G(0, 0))²} (3)
In Equation (3), (x, y) is one of eight combinations, that is, (−1, −1), (−1, 0), (−1, +1), (0, −1), (0, +1), (+1, −1), (+1, 0), and (+1, +1). The WB calculating unit 6 calculates color differences ΔC for these eight combinations of (x, y), stores them in the memory 7 (the calculated color differences ΔC are stored temporarily), and moves to step S28. A calculated color difference ΔC(x, y) stored in the memory 7 may be reused for the same combination of (x, y) corresponding to a different small area of attention (i.e., a color difference ΔC(x, y) is calculated only when each combination (x, y) appears for the first time).
At step S28, the WB calculating unit 6 judges whether or not the maximum value of the color differences ΔC(x, y) calculated at step S27 is smaller than or equal to a predetermined threshold value k. If judging that it is smaller than or equal to the predetermined threshold value k, the WB calculating unit 6 moves to step S29. On the other hand, if judging that it is larger than the predetermined threshold value k, the WB calculating unit 6 moves to step S31 (described later).
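Equation (3) and the threshold judgment of step S28 might be sketched as follows; the grid indexing and the value of the threshold k are illustrative assumptions:

```python
import math

# The eight (y, x) offsets of the adjacent small areas relative to the
# small area of attention at (0, 0), as enumerated for Equation (3).
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def max_color_difference(values, i, j):
    """Largest color difference Delta-C between small area (i, j) and its
    eight adjacent areas; values[i][j] holds (R/G, B/G)."""
    rg0, bg0 = values[i][j]
    deltas = []
    for dy, dx in OFFSETS:
        rg, bg = values[i + dy][j + dx]
        deltas.append(math.hypot(rg - rg0, bg - bg0))  # Equation (3)
    return max(deltas)

def use_for_white_balance(values, i, j, k=0.1):
    """Step S28: use the area only if every adjacent area is close in
    color (maximum Delta-C at most the threshold k, an assumed value)."""
    return max_color_difference(values, i, j) <= k
```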
At step S29, as at step S7 in
At step S30, as at step S8 in
At step S31, as at step S9 in
Steps S29 and S30 are skipped if it is judged at step S26 that there is at least one small area that is associated with “NG” (step S26: no) or if it is judged at step S28 that the maximum value of the color differences ΔC(x, y) is larger than the predetermined threshold value k (step S28: no). That is, the evaluation values of the small area of attention are excluded from subjects of white balance calculation.
At step S32, as at step S10 in
At step S33, as at step S11 in
At step S34, as at step S12 in
As described above, in the second embodiment, the judging unit judges that the evaluation values of the small area of attention should be used for white balance calculation if the evaluation values of the small area of attention are included in the achromatic regions and the color differences between the small area of attention and all of the small areas adjacent to the small area of attention are smaller than or equal to the predetermined value. Therefore, the second embodiment provides, in addition to the advantages of the first embodiment, an advantage that small areas that may cause color contamination can be excluded more reliably from subjects of white balance calculation and hence the evaluation values of small areas with low degrees of color unevenness can be used for white balance calculation.
A third embodiment will be described below with reference to the drawings. The third embodiment is a modification of the first and second embodiments. In the third embodiment, only components, operations, etc. that are different from those in the first and second embodiments will be described in detail; components that are the same as in the first and second embodiments are given the same reference symbols as in those embodiments.
The third embodiment is directed to an electronic camera which is the same in configuration as the electronic camera 1 of
At steps S41-S45, the WB calculating unit 6 performs the same processing as at steps S1-S5 in
At step S46, as at step S6 in
At step S47, the WB calculating unit 6 calculates color differences ΔC(x, y) for the eight combinations of (x, y) in the same manner as at step S27 in
At step S48, the WB calculating unit 6 judges whether or not the maximum value of the color differences ΔC(x, y) calculated at step S47 is smaller than or equal to a predetermined threshold value m. If judging that it is smaller than or equal to the predetermined threshold value m, the WB calculating unit 6 moves to step S52 (described later). On the other hand, if judging that it is larger than the predetermined threshold value m, the WB calculating unit 6 moves to step S49.
At step S49, the WB calculating unit 6 judges whether or not the maximum value of the color differences ΔC(x, y) calculated at step S47 is larger than or equal to a predetermined threshold value n. If judging that it is larger than or equal to the predetermined threshold value n, the WB calculating unit 6 moves to step S54 (described later). On the other hand, if judging that it is smaller than the predetermined threshold value n, the WB calculating unit 6 moves to step S50.
If the maximum value of the color differences ΔC(x, y) is larger than the threshold value m and smaller than the threshold value n, at step S50, the WB calculating unit 6 calculates a weight W according to the following Equation (4) based on the maximum value of the color differences ΔC(x, y):
W=1−{(a−m)/(n−m)} (4)
where a is the maximum value of the color differences ΔC(x, y).
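Equation (4), together with the threshold judgments of steps S48 and S49, can be sketched as a piecewise weight function; the values of m and n are assumed tuning parameters, not values given in the text:

```python
# Sketch of steps S48-S51: the weight W falls linearly from 1 to 0 as
# the maximum color difference a rises from threshold m to threshold n.
# The default m and n below are placeholder tuning values.

def evaluation_weight(a, m=0.05, n=0.25):
    """Weight W applied to the evaluation values of a small area whose
    maximum color difference is a."""
    if a <= m:
        return 1.0                      # fully trusted (step S48: yes)
    if a >= n:
        return 0.0                      # excluded (step S49: yes)
    return 1.0 - (a - m) / (n - m)      # Equation (4)
```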
At step S51, the WB calculating unit 6 multiplies the evaluation values of the small area selected at step S45 by the weight W calculated at step S50. Then, the WB calculating unit 6 moves to step S52.
At step S52, as at step S7 in
At step S53, as at step S8 in
At step S54, as at step S9 in
Steps S52 and S53 are skipped if it is judged at step S49 that the maximum value of the color differences ΔC(x, y) is larger than or equal to the threshold value n (step S49: yes). That is, the weight W for a small area whose maximum value of the color differences ΔC(x, y) is larger than or equal to the threshold value n is set to 0.
That is, if the maximum value of the color differences ΔC(x, y) is sufficiently small, the weight W is determined so that the evaluation values of the small area of attention contribute greatly to white balance calculation. On the other hand, if the maximum value of the color differences ΔC(x, y) is sufficiently large, the evaluation values of the small area of attention are excluded from subjects of white balance calculation.
At step S55, as at step S10 in
At step S56, as at step S11 in
At step S57, as at step S12 in
As described above, in the third embodiment, a weight to be applied to the evaluation values when they are used for white balance calculation is determined according to the color differences. Therefore, the third embodiment provides, in addition to the advantages of the first embodiment, an advantage that the degree of contribution of the evaluation values of each small area of attention to white balance calculation can be adjusted according to color differences.
Although in each of the above-described embodiments image data of an image to be processed is RGB image data, the present embodiment can similarly be applied to a case that image data of an image to be processed is image data in another color space (e.g., YCbCr image data).
Although each of the above-described embodiments is directed to the electronic camera which is equipped with the white balance adjusting device, the present embodiment can similarly be applied to an image processing apparatus using a computer. In this case, it is appropriate to store, in the computer, a program corresponding to the flowchart of
The many features and advantages of the embodiments are apparent from the detailed specification and, thus, it is intended by the appended claims to cover all such features and advantages of the embodiments that fall within the true spirit and scope thereof. Further, since numerous modifications and changes will readily occur to those skilled in the art, it is not desired to limit the inventive embodiments to the exact construction and operation illustrated and described, and accordingly all suitable modifications and equivalents may be resorted to, falling within the scope thereof.
Number | Date | Country | Kind |
---|---|---|---|
2007-115807 | Apr 2007 | JP | national |
Number | Date | Country |
---|---|---|
A-2000-165896 | Jun 2000 | JP |
A-2002-23296 | Jan 2002 | JP |
A-2002-232906 | Aug 2002 | JP |
A-2003-224864 | Aug 2003 | JP |
WO 2007026303 | Mar 2007 | WO |
Number | Date | Country
---|---|---
20080266417 A1 | Oct 2008 | US