1. Field of the Invention
The present invention relates to a backlight detection device and a backlight detection method used to determine whether an image is in a backlight state.
2. Description of the Related Art
In an exposure process in which an imaging apparatus such as a digital camera, a video camera, etc., is utilized, photographing against backlight generally produces a result in which the background portion is too bright and the really-interesting photographic subject, i.e., the target portion, is too dark. Therefore backlight detection is widely employed in various functions, such as intelligent scene recognition and automatic exposure control, of an imaging apparatus such as a digital camera, a video camera, etc.
A backlight detection technique is proposed in the below cited reference No. 1. In this backlight detection technique, the central and bottom portions of an image screen are determined as a main subject area, and the top portion of the image screen is determined as a background area. Then the brightness difference between the main subject area and the background area is calculated. If the brightness difference is greater than a predetermined threshold value, it is determined that the image on the image screen is in a backlight state; otherwise it is determined that the image on the image screen is in a non-backlight state.
An automatic backlight detection technique used in a video camera is proposed in the below cited reference No. 2. In this automatic backlight detection technique, a predetermined template is adopted to determine a main subject area and a background area. If brightness difference between the main subject area and the background area is relatively high, it is determined that the corresponding image is in a backlight state.
Furthermore, a backlight detection method proposed in the below cited reference No. 3 is as follows: first a predetermined detection area is set on a portion of an image sensing plane; then backlight is detected based on the difference in level between the video signals corresponding to the inside and the outside of the predetermined detection area; finally, based on the detected result, it is determined whether the image is in a backlight state.
And a backlight detection method proposed in the below cited reference No. 4 is as follows: first plural detection frames are set based on division of an imaging surface; then the brightness level of each of the set detection frames is detected; next the ratio between the brightness level detected from the detection frame having the lowest detected brightness level and the average value of the brightness levels detected from the other detection frames is calculated; finally, if the ratio is greater than or equal to a predetermined value, it is determined that the corresponding image is in a backlight state. An existing problem in this method is that the area having the lowest brightness level is not always a subject area, i.e., there is a possibility of wrong determination with regard to the subject area; as a result, there may also be, to some extent, a possibility of wrong determination with regard to the backlight state.
Furthermore a problem universally existing in the above-mentioned backlight detection techniques is that the division into the subject and background areas is carried out according to predetermined subject and background areas that are fixed regardless of the circumstances of the image. As a result, if a real target is not located in the predetermined areas or the predetermined template, then the backlight detection cannot be achieved; moreover, if the subject area is determined only by using brightness, then wrong determination of the subject area may occur. In either case, the performance of the backlight detection may be severely influenced.
The disadvantages of the prior art are overcome by the present invention. The present invention relates to image processing and pattern recognition, and provides a backlight detection device and a backlight detection method used to determine whether an image is in a backlight state. The backlight detection device and the backlight detection method in the embodiments of the present invention can be applied to an imaging apparatus such as a digital camera, a video camera, etc., without determining a subject area and a background area in advance; that is, in a case where there is no fixed subject area and no fixed background area, the backlight state is automatically detected. Furthermore the embodiments of the present invention do not depend only on brightness when determining the subject area and the background area. In the embodiments of the present invention, the subject area and the background area are automatically determined according to area growth started from a focal position; as a result, it is possible to determine the backlight state based on the brightness difference between the subject area and the background area.
According to one aspect of the present invention, a backlight detection device used to determine whether an image is in a backlight state is provided. The backlight detection device comprises a pixel value acquiring unit used to acquire a pixel value of each of pixels in the image; a focal position determination unit used to determine a focal position in the image; a subject area determination unit used to determine, based on the pixel values of the pixels in the image, a subject area starting from the focal position by using area growth processing so as to divide the image into the subject area and a background area; a brightness difference calculation unit used to calculate brightness difference between the subject area and the background area; and a backlight determination unit used to determine, based on the brightness difference, whether the image is in the backlight state so as to detect the image in the backlight state.
According to another aspect of the present invention, a backlight detection method used to determine whether an image is in a backlight state is provided. The backlight detection method comprises a pixel value acquiring step of acquiring a pixel value of each of pixels in the image; a focal position determination step of determining a focal position in the image; a subject area determination step of determining, based on the pixel values of the pixels in the image, a subject area starting from the focal position by using area growth so as to divide the image into the subject area and a background area; a brightness difference calculation step of calculating brightness difference between the subject area and the background area; and a backlight determination step of determining, based on the brightness difference, whether the image is in the backlight state so as to detect the image in the backlight state.
Furthermore a predetermination process may also be carried out based on a brightness histogram before determining the focal position; in other words, an image that is apparently not in the backlight state may be directly discarded (i.e. follow-on processing is not applied to this kind of image) so as to increase detection speed.
The backlight detection device and the backlight detection method used to determine whether the image is in the backlight state according to the embodiments of the present invention can be applied to various imaging apparatuses for determining a backlight state; the determination processing may be carried out not only before an image is finally formed but also in a process of post-processing the finally formed image.
Hereinafter, the embodiments of the present invention will be concretely described with reference to the drawings.
The backlight detection device according to the embodiment of the present invention may deal with a hierarchical color image formed by an imaging apparatus such as a digital camera, a video camera, etc. The pixel value acquiring unit 11 may acquire four channel values of each of pixels in the hierarchical color image. That is, a brightness channel value L, a red channel value R, a green channel value G, and a blue channel value B may be acquired; here R, G, and B stand for brightness values of red, green, and blue, respectively. Here it should be noted that all of the values of R, G, B and L may be automatically obtained by a conventional imaging apparatus based on a known technique in an image capturing process.
The pixel value acquiring unit 11 acquires signals of the pixel values of all of the pixels in the image; the signals are applied to a follow-on backlight detection process. Since the aim is to carry out backlight detection, it is possible to use a preview image whose resolution is lower than that of an image finally generated by an imaging apparatus; in this way, for example, it is possible to satisfy the demand for real-time processing in the imaging apparatus. The low-resolution preview image may be automatically and directly detected and obtained by an imaging apparatus such as a digital camera, a video camera, etc. An actual example of the preview image is an image displayed on a liquid crystal display of an imaging apparatus such as a digital camera, a video camera, etc., before pressing down a shutter button, wherein the resolution of the image is lower than that of a final image generated by the imaging apparatus after pressing down the shutter button in the same condition. Here it should be noted that it is apparent that the embodiments of the present invention may also be applied to the final image generated by the imaging apparatus.
The human face detection unit 121 may utilize various conventional human detection techniques, for example, a human face detection method disclosed in the cited reference No. 5 and human face detection techniques disclosed in the cited references No. 6 and No. 7, to carry out determination and detection of a human face.
If the human face detection unit 121 determines that there is not a human face in the image, then the human face detection unit 121 transmits the image to the automatic focusing unit 122. The automatic focusing unit 122 carries out automatic focusing processing with regard to the image so as to automatically obtain a focal area; this focal area serves as the focal position. Here it should be noted that the automatic focusing processing may be realized by a conventional imaging apparatus based on a known technique.
The subject area determination unit 13 may determine a subject area in the image by employing a known area growth algorithm. For example, the subject area determination unit 13 may first create an m×n matrix M (here m and n are counting numbers) whose size is equal to the size of the image being processed; in other words, the respective elements in the matrix M correspond to the respective pixels in the image being processed. The initial value of each of the elements in the matrix M is set to 0. Then an m×n matrix b whose size is equal to the size of the matrix M is created. Since the size of the matrix b is equal to the size of the matrix M, the elements in the matrix b also correspond to the pixels in the image being processed. According to the result (i.e. the focal position) obtained by the focal position determination unit 12, among all the elements in the matrix b, the initial values of the elements corresponding to the pixels located at the focal position in the image are set to 1, and the initial values of the other elements are set to 0.
Next an area growth process is carried out. It is supposed that b(x,y) stands for the respective elements in the matrix b, and M(x,y) stands for the respective elements in the matrix M; here x and y stand for the row position and the column position of each of the elements in the corresponding matrix, respectively, and both x and y are counting numbers. Then the respective elements b(x,y) in the matrix b are checked in sequence. If the value of an element b(x,y) is 1 and the value of the corresponding element M(x,y) is 0, then the value of the corresponding element M(x,y) is set to 1, and (x,y) is determined as a start point from which the area growth process, growing toward the neighboring points, begins. If it is supposed that a start point is (x0,y0), then the start point merges with a neighboring point (xi,yi) only when the following equation (1) is satisfied; here i refers to an index of the eight neighboring points, and is a counting number between 1 and 8.
abs(G(xi,yi)−G(x0,y0))+abs(R(xi,yi)−R(x0,y0))+abs(B(xi,yi)−B(x0,y0))<d (1)
Here d is a predetermined threshold value; abs( ) refers to the calculation of an absolute value; R( ), G( ), and B( ) refer to the R, G, and B channel values, respectively. When the area growth process started from the start point stops, each element in the matrix M corresponding to a pixel that is close to the start point and has a similar color expression has been set to 1. After all the elements in the matrix b are checked, the subject area determination unit 13 outputs the matrix M as the result of the subject area determination processing. The pixels corresponding to the elements whose values in the matrix M are 1 form the subject area, and the pixels corresponding to the elements whose values in the matrix M are 0 form the background area.
The above-mentioned process may be expressed by the following STEPS.
STEP 1: creating an m×n matrix M in which all element values are initialized to 0, i.e. M(x,y)=0;
STEP 2: creating a stack S whose contents are initialized to “empty” values;
STEP 3: setting x=0, y=0, and a predetermined threshold value d;
STEP 4: if (M(x,y)==0 and b(x,y)==1), then setting M(x,y)=1 and carrying out, by using the stack S, the area growth process started from the start point (x,y);
STEP 5: x=x+1;
STEP 6: if (x≧m), then x=0 and y=y+1;
STEP 7: if (y≧n), then exiting; otherwise going to STEP 4.
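As an illustration only, the STEPS above might be sketched as follows. The function name, the representation of the image as an m×n grid of (R, G, B) channel-value tuples, and the use of a plain list as the stack S are assumptions for the sketch, not the patented implementation itself.

```python
def grow_subject_area(image, focus_mask, d):
    # image: m x n grid of (R, G, B) channel values (an assumed representation);
    # focus_mask: the m x n matrix b (1 at focal-position pixels, 0 elsewhere);
    # d: the predetermined threshold value of equation (1).
    # Returns the m x n matrix M: 1 marks the subject area, 0 the background.
    m, n = len(focus_mask), len(focus_mask[0])
    M = [[0] * n for _ in range(m)]                      # STEP 1
    for y in range(n):                                   # STEPS 5-7: scan all (x, y)
        for x in range(m):
            if M[x][y] == 0 and focus_mask[x][y] == 1:   # STEP 4
                stack = [(x, y)]                         # STEP 2: stack S of start points
                M[x][y] = 1
                while stack:
                    x0, y0 = stack.pop()
                    r0, g0, b0 = image[x0][y0]
                    # examine the eight neighboring points (xi, yi), i = 1..8
                    for dx in (-1, 0, 1):
                        for dy in (-1, 0, 1):
                            xi, yi = x0 + dx, y0 + dy
                            if (dx, dy) == (0, 0):
                                continue
                            if not (0 <= xi < m and 0 <= yi < n) or M[xi][yi] == 1:
                                continue
                            ri, gi, bi = image[xi][yi]
                            # equation (1): merge when the summed channel differences < d
                            if abs(gi - g0) + abs(ri - r0) + abs(bi - b0) < d:
                                M[xi][yi] = 1
                                stack.append((xi, yi))
    return M
```

For example, with a focal position in the middle of a small image whose left portion shares the focal pixel's color and whose right portion is much brighter, only the left portion is merged into the subject area.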
The brightness difference calculation unit 14 calculates brightness difference between the subject area and the background area. If the brightness difference between the subject area and the background area is greater than or equal to a predetermined threshold value, then the backlight determination unit 15 determines that this image is in a backlight state, or in other words, this image is a backlight image; otherwise the backlight determination unit 15 determines that this image is in a non-backlight state, or in other words, this image is a non-backlight image.
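The operation of the brightness difference calculation unit 14 and the backlight determination unit 15 might be sketched as follows; the use of the per-area mean brightness as the "brightness difference" statistic is an assumption for the sketch, since the specification does not fix the statistic.

```python
def is_backlight(L, M, threshold):
    # L: m x n grid of brightness channel values; M: the subject-area matrix
    # output by the subject area determination unit 13 (1 = subject, 0 = background).
    # The per-area mean brightness used below is an illustrative assumption.
    subject, background = [], []
    for row_L, row_M in zip(L, M):
        for value, flag in zip(row_L, row_M):
            (subject if flag == 1 else background).append(value)
    if not subject or not background:
        return False  # degenerate division of the image: nothing to compare
    mean_subject = sum(subject) / len(subject)
    mean_background = sum(background) / len(background)
    # backlight state: background brighter than subject by at least `threshold`
    return mean_background - mean_subject >= threshold
```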
As a further improvement of the embodiments of the present invention, as shown in the drawings, a predetermination unit 20 may additionally be provided.
For example, the predetermination unit 20 may utilize the pixel value acquiring unit 11 to acquire the brightness channel values in the pixel values of all of the pixels in the image, and then predetermine, based on a brightness histogram of this image, whether this image is a candidate backlight image or a non-backlight image. In a case where this image is predetermined as the candidate backlight image, this image is output to the focal position determination unit 12 for carrying out the follow-on processing. In a case where this image is predetermined as the non-backlight image, the processing applied to this image stops. The predetermination unit 20 may obtain a classification function by carrying out training based on plural known sample images which are in a backlight state and plural known sample images which are in a non-backlight state.
In the testing process, the predetermination unit 20 deals with a test image prepared to be processed. In STEP 211, a brightness histogram of the test image is extracted and serves as the feature of this test image. In STEP 212, the obtained classification function is applied to the feature (i.e. the brightness histogram) of the test image so as to predetermine whether the test image is a non-backlight image. Here it should be noted that since this determination processing is predetermination processing, if the test image is predetermined as a backlight image, then it may be called a candidate image. If the test image is predetermined as a non-backlight image, then the processing applied to the test image stops; if the test image is predetermined as a backlight image, then the test image is transmitted to the focal position determination unit 12 for carrying out the follow-on processing.
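The feature extraction of STEP 211 might be sketched as follows; the bin count (32) and the normalization to frequencies are assumptions of the sketch, since the specification only states that the feature is a brightness histogram of the image.

```python
def brightness_histogram(L_channel, bins=32):
    # L_channel: m x n grid of brightness channel values in [0, 255].
    # Returns a normalized histogram usable as a feature vector f_i.
    hist = [0] * bins
    count = 0
    for row in L_channel:
        for v in row:
            # map a brightness value in [0, 255] to one of `bins` buckets
            hist[min(int(v) * bins // 256, bins - 1)] += 1
            count += 1
    return [h / count for h in hist]
```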
In particular, according to the SVM method, in the training process, plural flagged backlight sample images and non-backlight sample images serve as positive samples and negative samples, respectively. With regard to each of the samples, a feature vector fi is extracted; here i is an index of the samples, and is a counting number. The feature vector is, for example, a brightness histogram. If it is supposed that p positive samples and q negative samples are adopted, then the total number k=p+q; here p, q, and k are counting numbers. As a result, a feature vector set F={fi} (i=1, . . . k) can be obtained, and a flag set Y={yi} (i=1, . . . k) can be obtained too; here yi is a class flag corresponding to the feature vector fi, and can be defined as follows.
yi=1 (when fi is extracted from a positive sample); yi=−1 (when fi is extracted from a negative sample) (2)
Before STEP 202 is carried out, first a kernel function K is selected; in this embodiment, it is possible to select a linear kernel function defined as follows.
K(g,h)=g·h (3)
That is, the kernel function K calculates the inner product of the two vectors g and h as shown in the above equation (3).
In the training process, according to the SVM training algorithm, nv vectors are selected from the feature vector set F to form a support vector set V={vi} for determining the classification function; here i is an index, and i=1, . . . , nv. And according to this training algorithm, a weight ai is given to the respective vectors vi.
In the testing process, in STEP 212, with regard to a feature vector v (i.e. the brightness histogram extracted in STEP 211) of the test image prepared to be processed, the determination may be carried out by adopting a classification function fun( ) defined by the following equation (4).
fun(v)=Σi=1nv yi*ai*K(vi,v)+b (4)
Here yi is a class flag corresponding to the vector vi and b is a constant calculated by the SVM training algorithm.
In a case where the linear kernel function is adopted, equation (5) may be obtained based on the above-mentioned classification function as follows.
fun(v)=(Σi=1nv yi*ai*vi)·v+b=w·v+b (5)
Since all of yi, ai, vi, and nv are known quantities after the training process, (Σi=1nv yi*ai*vi) may be expressed as w. And since w may be calculated in advance, the determination time in the testing process is not influenced.
As for the feature vector v of the test image prepared to be processed, its class flag yv may be defined as follows.
If the calculation result of the classification function fun( ) with regard to the feature vector v is greater than or equal to 0, then the class flag yv of the feature vector v is set to 1; this means that the test image corresponding to the feature vector v may be classified as a positive sample, i.e., in this embodiment, the test image is predetermined as a candidate backlight image and the follow-on determination processing will be carried out. If the calculation result of the classification function fun( ) with regard to the feature vector v is less than 0, then the class flag yv of the feature vector v is set to 0; this means that the test image corresponding to the feature vector v may be classified as a negative sample, i.e., in this embodiment, the test image is predetermined as a non-backlight image and its processing stops (i.e. the follow-on determination processing will not be carried out).
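The classification function fun( ) of equation (4), taken with the linear kernel of equation (3) and the decision rule just described, might be sketched as follows. Here V, y, a, and b stand for the support vector set, class flags, weights, and constant produced by the SVM training algorithm; any concrete values used with this sketch are illustrative.

```python
def fun(v, V, y, a, b):
    # Equation (4): fun(v) = sum_i yi * ai * K(vi, v) + b, with the linear
    # kernel of equation (3): K(g, h) = g . h (the inner product).
    K = lambda g, h: sum(gi * hi for gi, hi in zip(g, h))
    return sum(yi * ai * K(vi, v) for vi, yi, ai in zip(V, y, a)) + b

def class_flag(v, V, y, a, b):
    # The decision rule: fun(v) >= 0 -> yv = 1 (candidate backlight image),
    # fun(v) < 0 -> yv = 0 (non-backlight image; processing stops).
    return 1 if fun(v, V, y, a, b) >= 0 else 0
```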
Here it should be noted that the above-mentioned support vector machine (SVM) method is just used as an example to explain how to carry out the training and how to carry out the predetermination with regard to the test image prepared to be processed; in other words, those skilled in the art can understand that it is also possible to adopt other known machine learning methods, for example, k-NN, AdaBoost, etc., to train the classifier and to carry out the predetermination of whether the test image is a backlight image.
Furthermore a backlight detection method used to determine whether an image is in a backlight state is provided. The backlight detection method comprises a pixel value acquiring step, which may be executed by the pixel value acquiring unit 11, of acquiring a pixel value of each of the pixels in the image; a focal position determination step, which may be executed by the focal position determination unit 12, of determining a focal position in the image; a subject area determination step, which may be executed by the subject area determination unit 13, of determining, based on the pixel values of the pixels in the image, a subject area starting from the focal position by using area growth processing so as to divide the image into the subject area and a background area; a brightness difference calculation step, which may be executed by the brightness difference calculation unit 14, of calculating brightness difference between the subject area and the background area; and a backlight determination step, which may be executed by the backlight determination unit 15, of determining, based on the brightness difference, whether the image is in the backlight state so as to detect the image in the backlight state.
The image may be a hierarchical color image, and the pixel value may include a brightness channel value, a red channel value, a green channel value, and a blue channel value of the corresponding pixel. Furthermore the image may be a preview image whose resolution is lower than that of an image finally generated by an imaging apparatus; in particular, an actual example of the preview image is, for example, an image displayed on a liquid crystal display of an imaging apparatus such as a digital camera, a video camera, etc. before pressing down a shutter button.
The focal position determination step comprises a human face detection step, which may be executed by the human face detection unit 121, of determining whether there is a human face in the image, wherein, if there is the human face in the image, then a human face area is detected and serves as the focal position; and an automatic focusing step, which may be executed by the automatic focusing unit 122, of carrying out automatic focusing processing so as to automatically obtain a focal area serving as the focal position if the human face detection step determines that there is not a human face in the image.
If the brightness difference between the subject area and the background area is greater than or equal to a predetermined value, then the backlight determination step determines that the image is in the backlight state.
The backlight detection method according to the embodiment of the present invention may further comprise a predetermination step, which may be executed by the predetermination unit 20, of predetermining the image as a candidate backlight image or a non-backlight image based on the above-mentioned brightness histogram of the image, wherein, in a case where the image is predetermined as the candidate backlight image, the image is output to the focal position determination step, and in a case where the image is predetermined as the non-backlight image, processing with regard to the image stops.
Furthermore, in the predetermination step, a classification function is obtained by carrying out training according to plural known sample images which are in a backlight state and plural known sample images which are in a non-backlight state.
A series of operations described in this specification can be executed by hardware, software, or a combination of hardware and software. When the operations are executed by software, a computer program can be installed in a dedicated built-in storage device of a computer so that the computer can execute the computer program. Alternatively, the computer program can be installed in a common computer by which various types of processes can be executed so that the common computer can execute the computer program.
For example, the computer program may be stored in a recording medium such as a hard disk or a ROM in advance. Alternatively, the computer program may be temporarily or permanently stored (or recorded) in a movable recording medium such as a floppy disk, a CD-ROM, a MO disk, a DVD, a magnetic disk, or a semiconductor storage device. Such movable recording media may also be provided as packaged software for the purpose of distribution.
While the present invention is described with reference to the specific embodiments chosen for purpose of illustration, it should be apparent that the present invention is not limited to these embodiments, but numerous modifications could be made thereto by those skilled in the art without departing from the basic concept and scope of the present invention.
The present application is based on Chinese Priority Patent Application No. 201010120389.5 filed on Mar. 9, 2010, the entire contents of which are hereby incorporated by reference.