This application claims priority to and benefit of Chinese Patent Application Serial No. 200710077567.9, filed in the State Intellectual Property Office of the P.R. China on Dec. 4, 2007, the entire contents of which are incorporated herein by reference.
The present invention relates to image processing, and in particular to a method of image edge enhancement.
A digital image's quality can be improved by enhancing the image's edge. Conventional methods of image edge enhancement often require an interpolation operation of an image to obtain a full color RGB image followed by a second-order Laplace operation and edge trend specific operation of the full color RGB image to enhance the image's edge. These methods are not only computationally expensive, but also require a large amount of memory space, e.g., a pre-storage of two to four rows of RGB data would occupy six to twelve rows of static random access memory (SRAM).
The purpose of embodiments of the present invention is to provide a method of image edge enhancement to solve or alleviate the aforementioned problems associated with the conventional approaches.
One embodiment of the present invention comprises the steps of: determining the edge trend of an image in accordance with the second order gradient values of a center pixel in different directions; performing an interpolation operation for the center pixel and calculating the absent color components of the center pixel; and performing the edge enhancement operation for the image in the interpolation module in accordance with the original color components of the center pixel and the image edge trend based on the Bayer data.
In some embodiments of the present invention, the image edge enhancement process takes into account the influence of the green component values of different pixels surrounding the center pixel, and adopts a noise-resistant, self-adaptive edge enhancement algorithm, to suppress noise on the image edge. Thus, the resulting image has a clear image edge. In addition, the fact that the process performs image edge enhancement in the interpolation module based on the Bayer data can significantly reduce the consumption of memory space.
The aforementioned features and advantages of the invention as well as additional features and advantages thereof will be more clearly understood hereinafter as a result of a detailed description of embodiments when taken in conjunction with the drawings.
Like reference numerals refer to corresponding parts throughout the several views of the drawings.
In some embodiments of the present invention, the edge enhancement operation independently processes as many pixels along the same edge as possible and then averages the processing results associated with those pixels to suppress the influence of noise on the edge. The operation includes: using a 3×5 Bayer matrix and a 5×3 Bayer matrix to determine the second order gradient values in the horizontal and vertical directions, respectively, and using a 5×5 Bayer matrix to determine the second order gradient values in the 135 degree direction and in the 45 degree direction, respectively, so as to determine the image's edge trend and perform a corresponding edge enhancement scheme accordingly.
An image edge is characterized by the brightness contrast on its two sides, one side being relatively bright and the other relatively dark. The greater the contrast between the bright side and the dark side, the more prominent the edge. The pixels on or near the image edge are located either on the bright side or on the dark side. Decreasing the brightness of the pixels on the dark side, increasing the brightness of the pixels on the bright side, or both can achieve the edge enhancement effect for the image.
Step S101 is to determine the second order gradient values for the center pixel in different directions.
In one embodiment of the present invention, a 3×5 Bayer matrix and a 5×3 Bayer matrix are used to determine the center pixel's second order gradient value H in the horizontal direction and the center pixel's second order gradient value V in the vertical direction, respectively. A 5×5 Bayer matrix is used to determine the center pixel's second order gradient value X in the 135 degree direction and the center pixel's second order gradient value Y in the 45 degree direction.
In some embodiments of the present invention, if the original component of the center pixel is blue, as shown in the accompanying drawing, the second order gradient value H in the horizontal direction of the center pixel B33 is determined from the components H1, H2 and H3, wherein:
H1=|G32−G34|+|2B33−B31−B35|;
H2=|R22−R24|+|2G23−G21−G25|;
H3=|R42−R44|+|2G43−G41−G45|.
And the second order gradient value V in the vertical direction of the center pixel B33 is determined from the components V1, V2 and V3, wherein:
V1=|G23−G43|+|2B33−B13−B53|;
V2=|R22−R42|+|2G32−G12−G52|;
V3=|R24−R44|+|2G34−G14−G54|.
The second order gradient value X in the 135 degree direction is:
X=|R22−R44|+|2B33−B11−B55|.
The second order gradient value Y in the 45 degree direction is:
Y=|R24−R42|+|2B33−B15−B51|.
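To make the computation concrete, the following is a minimal Python sketch of the four gradient values for a blue center pixel. The 5×5 planes r, g and b (stored as 6×6 lists so that indices 1..5 match the text) and the 2:1:1 weights used to combine H1, H2, H3 (and V1, V2, V3) are illustrative assumptions, since the text does not reproduce the combining formula:

    # Second order gradients for a blue center pixel B33 on a 5x5 window.
    # r, g, b are 6x6 lists indexed 1..5 to match the text (index 0 unused).
    # Combining H1..H3 (and V1..V3) with 2:1:1 weights is an assumption.
    def gradients_blue_center(r, g, b):
        h1 = abs(g[3][2] - g[3][4]) + abs(2*b[3][3] - b[3][1] - b[3][5])
        h2 = abs(r[2][2] - r[2][4]) + abs(2*g[2][3] - g[2][1] - g[2][5])
        h3 = abs(r[4][2] - r[4][4]) + abs(2*g[4][3] - g[4][1] - g[4][5])
        v1 = abs(g[2][3] - g[4][3]) + abs(2*b[3][3] - b[1][3] - b[5][3])
        v2 = abs(r[2][2] - r[4][2]) + abs(2*g[3][2] - g[1][2] - g[5][2])
        v3 = abs(r[2][4] - r[4][4]) + abs(2*g[3][4] - g[1][4] - g[5][4])
        h = (2*h1 + h2 + h3) / 4.0       # assumed weighting
        v = (2*v1 + v2 + v3) / 4.0       # assumed weighting
        x = abs(r[2][2] - r[4][4]) + abs(2*b[3][3] - b[1][1] - b[5][5])
        y = abs(r[2][4] - r[4][2]) + abs(2*b[3][3] - b[1][5] - b[5][1])
        return h, v, x, y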
If the original component of the center pixel is red, as shown in the accompanying drawing, the second order gradient value H in the horizontal direction of the center pixel R33 is determined from the components H1, H2 and H3, wherein:
H1=|G32−G34|+|2R33−R31−R35|;
H2=|B22−B24|+|2G23−G21−G25|;
H3=|B42−B44|+|2G43−G41−G45|.
And the second order gradient value V in the vertical direction of the center pixel R33 is determined from the components V1, V2 and V3, wherein:
V1=|G23−G43|+|2R33−R13−R53|;
V2=|B22−B42|+|2G32−G12−G52|;
V3=|B24−B44|+|2G34−G14−G54|.
The second order gradient value X in the 135 degree direction is:
X=|B22−B44|+|2R33−R11−R55|.
The second order gradient value Y in the 45 degree direction is:
Y=|B24−B42|+|2R33−R15−R51|.
If the original component of the center pixel is green and its left and right adjacent pixels are blue, as shown in the accompanying drawing, the second order gradient value H in the horizontal direction of the center pixel G33 is determined from the components H1, H2 and H3, wherein:
H1=|B32−B34|+|2G33−G31−G35|;
H2=|G22−G24|+|2R23−R21−R25|;
H3=|G42−G44|+|2R43−R41−R45|.
And the second order gradient value V in the vertical direction of the center pixel G33 is determined from the components V1, V2 and V3, wherein:
V1=|R23−R43|+|2G33−G13−G53|;
V2=|G22−G42|+|2B32−B12−B52|;
V3=|G24−G44|+|2B34−B14−B54|.
The second order gradient value X in the 135 degree direction is:
X=|2G33−G22−G44|+|2G33−G11−G55|.
The second order gradient value Y in the 45 degree direction is:
Y=|2G33−G24−G42|+|2G33−G15−G51|.
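For a green center pixel, both diagonal gradients use only green samples, which the following short Python sketch illustrates (the plane g stored as a 6×6 list with 1-based indexing is an illustrative convention, not part of the patented method):

    # Diagonal second order gradients for a green center pixel G33.
    # g is a 6x6 list indexed 1..5 to match the text (index 0 unused).
    def diagonal_gradients_green_center(g):
        x = abs(2*g[3][3] - g[2][2] - g[4][4]) + abs(2*g[3][3] - g[1][1] - g[5][5])
        y = abs(2*g[3][3] - g[2][4] - g[4][2]) + abs(2*g[3][3] - g[1][5] - g[5][1])
        return x, y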
If the original component of the center pixel is green and its left and right adjacent pixels are red, as shown in the accompanying drawing, the second order gradient value H in the horizontal direction of the center pixel G33 is determined from the components H1, H2 and H3, wherein:
H1=|R32−R34|+|2G33−G31−G35|;
H2=|G22−G24|+|2B23−B21−B25|;
H3=|G42−G44|+|2B43−B41−B45|.
And the second order gradient value V in the vertical direction of the center pixel G33 is determined from the components V1, V2 and V3, wherein:
V1=|B23−B43|+|2G33−G13−G53|;
V2=|G22−G42|+|2R32−R12−R52|;
V3=|G24−G44|+|2R34−R14−R54|.
The second order gradient value X in the 135 degree direction is:
X=|2G33−G22−G44|+|2G33−G11−G55|.
The second order gradient value Y in the 45 degree direction is:
Y=|2G33−G24−G42|+|2G33−G15−G51|.
Step S102 is to determine the edge trend of the image in accordance with the second order gradient value of the center pixel in different directions.
In some embodiments of the present invention, the maximum Max_HV and the minimum Min_HV of H and V, and the maximum Max_XY and the minimum Min_XY of X and Y, are calculated respectively. Max_HV and Max_XY are then compared with a predefined flat region threshold value TH_flat.
If both Max_HV and Max_XY are smaller than TH_flat, the center pixel is deemed to be a pixel in a flat region, i.e., no edge enhancement is necessary.
If at least one of Max_HV and Max_XY is greater than or equal to TH_flat, Min_HV is compared with Min_XY. If Min_HV<Min_XY and H−V<−TH_edge, wherein TH_edge is a predefined edge region threshold value, the center pixel is deemed to be a pixel on the horizontal edge; if Min_HV<Min_XY and H−V>TH_edge, the center pixel is deemed to be a pixel on the vertical edge. If Min_HV>Min_XY and X−Y<−TH_edge, the center pixel is deemed to be a pixel on the 135 degree edge; if Min_HV>Min_XY and X−Y>TH_edge, the center pixel is deemed to be a pixel on the 45 degree edge.
A pixel that does not match any of the above-mentioned four scenarios is deemed to be a pixel in a non-flat, non-edge, generic region. In this case, the edge enhancement operation is optional.
A software program implementing the aforementioned process may be written, for example, as the following minimal Python sketch, in which the function name classify_edge_trend, the threshold arguments th_flat and th_edge, and the returned labels are illustrative rather than prescribed:
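    # Edge trend classification (step S102), a minimal sketch.
    # th_flat and th_edge correspond to the thresholds TH_flat and TH_edge;
    # the function name and the returned labels are illustrative only.
    def classify_edge_trend(h, v, x, y, th_flat, th_edge):
        max_hv, min_hv = max(h, v), min(h, v)
        max_xy, min_xy = max(x, y), min(x, y)
        if max_hv < th_flat and max_xy < th_flat:
            return "flat"            # flat region: no enhancement necessary
        if min_hv < min_xy:
            if h - v < -th_edge:
                return "horizontal"  # horizontal edge
            if h - v > th_edge:
                return "vertical"    # vertical edge
        elif min_hv > min_xy:
            if x - y < -th_edge:
                return "135"         # 135 degree edge
            if x - y > th_edge:
                return "45"          # 45 degree edge
        return "generic"             # non-flat, non-edge region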
Step S103 is to perform an interpolation operation for the center pixel along the edge and to calculate the absent color components of the pixel.
In one embodiment of the present invention, the interpolation operation begins with building a 5×5 Bayer matrix by treating the pixel to be interpolated as the center pixel, selecting an interpolation scheme according to the determined color variation trend, and calculating two interpolating component values of the center pixel using the selected interpolation scheme in consideration of the color continuity and the edge continuity at the center pixel. In some embodiments, the color variation trend is consistent with the aforementioned edge detection result.
After the interpolation operation, all three color components of the center pixel, i.e., (R33, G33, B33), are determined.
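The exact interpolation formulas are not reproduced above; as one illustrative possibility, an edge-directed interpolation of the missing green component at a blue (or red) center pixel might look like the following Python sketch, in which the window layout and the averaging choices are assumptions:

    # Edge-directed interpolation of the missing green value at the center
    # of a 5x5 window 'g' of green samples (indexed 1..5; g[3][3] unknown).
    # The averaging formulas below are illustrative assumptions.
    def interpolate_green_at_center(g, trend):
        if trend == "horizontal":
            return (g[3][2] + g[3][4]) / 2.0   # average along the edge
        if trend == "vertical":
            return (g[2][3] + g[4][3]) / 2.0
        # flat, diagonal, or generic region: average of the four neighbors
        return (g[2][3] + g[4][3] + g[3][2] + g[3][4]) / 4.0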
Step S104 is to perform edge enhancement for the image in accordance with the original color components of the center pixel, the edge trend of the image, and the Bayer data.
In some embodiments of the present invention, the color (chrominance) of the center pixel should remain roughly unchanged after the edge enhancement. Assuming the center pixel's surrounding pixels have the same color but different brightness, the conversion formulae from the RGB space to the YCbCr space are as follows:
Y=0.299*R+0.587*G+0.114*B (1)
Cb=−0.1687*R−0.3313*G+0.5*B (2)
Cr=0.5*R−0.4187*G−0.0813*B (3)
wherein Y is the brightness component and Cb and Cr are the chrominance components.
Note that, according to the formulae (1), (2) and (3), if R, G and B each vary by the same amount "Diff", Cb and Cr remain unchanged and Y varies by the same amount "Diff". In other words, the variation amount "Diff" determined from one color component can be applied to the other two color components. As a result of edge enhancement according to some embodiments of the present invention, the color of the center pixel does not change, but the brightness component Y at the edge is enhanced.
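As a quick numeric check (illustrative only), the Cb and Cr coefficients each sum to zero while the Y coefficients sum to one, so a common offset "Diff" shifts Y by Diff and leaves Cb and Cr untouched:

    # Verify the invariance numerically for an arbitrary pixel and offset.
    def rgb_to_ycbcr(r, g, b):
        y  =  0.299 * r + 0.587 * g + 0.114 * b
        cb = -0.1687 * r - 0.3313 * g + 0.5 * b
        cr =  0.5 * r - 0.4187 * g - 0.0813 * b
        return y, cb, cr

    y0, cb0, cr0 = rgb_to_ycbcr(90, 120, 60)
    y1, cb1, cr1 = rgb_to_ycbcr(90 + 25, 120 + 25, 60 + 25)  # Diff = 25
    assert abs((y1 - y0) - 25) < 1e-9   # Y varies by exactly Diff
    assert abs(cb1 - cb0) < 1e-9        # Cb unchanged
    assert abs(cr1 - cr0) < 1e-9        # Cr unchanged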
In some embodiments of the present invention, the edge enhancement operation comprises the following steps:
Step 1: Determine the brightness of the edge center (Y_center) by averaging the brightness on both sides of the edge;
Step 2: Determine the difference between the brightness of the center pixel (Y33) and the brightness of the edge center (Y_center), i.e., En_Diff=Y33−Y_center;
Step 2.1: If En_Diff<0, the center pixel is located on the dark side of the image edge;
Step 2.2: If En_Diff>0, the center pixel is located on the bright side of the image edge;
Step 3: Multiply the brightness difference (En_Diff) by a gain factor (Edge_Gain_t) and define the new brightness of the center pixel (Y33_new) such that:
Y33_new−Y_center=Edge_Gain_t*En_Diff,
i.e., Y33_new=Y_center+Edge_Gain_t*En_Diff.
By plugging the definition of Y_center, i.e., Y_center=Y33−En_Diff, into the equation above, the new brightness of the center pixel (Y33_new) can be expressed as:
Y33_new=Y33+(Edge_Gain_t−1)*En_Diff=Y33+Edge_Gain*En_Diff,
wherein Edge_Gain=Edge_Gain_t−1 is the amplification parameter whose range is 0~3, and the default value is 1.
According to the RGB-YCbCr conversion formulae (1)-(3), Y varies by Diff while Cb and Cr remain the same when R, G and B all vary by Diff. Thus, this edge enhancement scheme only needs to determine the variation of one color component, which can then be applied to the other two components. In other words, this edge enhancement scheme enhances the brightness of the center pixel without altering its color. In some embodiments, considering that the number of green pixels is twice the number of red or blue pixels, the scheme first performs edge enhancement to the green component G of the center pixel to determine En_Diff_G, and then performs edge enhancement to the red component R and the blue component B, respectively. After the edge enhancement, the new values of the three color components of the center pixel are determined in accordance with the following equations:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
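A compact sketch of this update, assuming En_Diff_G has already been computed for the green component:

    # Apply the shared green-component difference to all three channels.
    # edge_gain corresponds to Edge_Gain (range 0..3, default 1).
    def enhance_pixel(r33, g33, b33, en_diff_g, edge_gain=1.0):
        delta = edge_gain * en_diff_g
        return r33 + delta, g33 + delta, b33 + delta

In practice the results would typically also be clamped to the valid sample range (e.g., 0 to 255), although the text does not spell this out.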
Below are descriptions of the various edge enhancement scenarios in connection with the original color component of the center pixel and the image's edge trend at the center pixel:
Scenario One:
If the original color component of the center pixel is blue, perform the edge enhancement to the center pixel as shown in the accompanying drawing.
Option (1):
If Min_HV<Min_XY and H−V<−TH_edge, treat the center pixel as a pixel on the horizontal edge and perform edge enhancement to the center pixel on the horizontal edge.
Step (a):
Determine the difference (En_Diff_G_cen) between the value of the green component of the center pixel (G33) and the value of the green component of the edge center of the edge at which the center pixel (3, 3) is located.
First, the value (G_center) of the green component of the edge center is defined as:
In the formula (1.1), G23 and G43 are known, but G13 and G53 are unknown. Because the center pixel is deemed to be a pixel on the horizontal edge, G13 and G53 can be treated as pixels on the horizontal edge, i.e.,
Note that the green component difference (En_Diff_G_cen) is defined as:
En_Diff_G_cen=G33−G_center (1.4)
By plugging the formulae (1.1), (1.2), and (1.3) into (1.4), En_Diff_G_cen can be expressed as:
Step (b):
Determine the green component difference (En_Diff_G_lef) associated with the pixel (3, 2) to the left of the center pixel (3, 3).
First, the value (G_lef_center) of the green component of the edge center corresponding to the pixel (3, 2) is defined as:
In the formula (1.6), G12 and G52 are known, but G22 and G42 are unknown. Because the center pixel is a pixel on the horizontal edge, G22 and G42 can be treated as pixels on the horizontal edge, i.e.,
Note that the green component difference (En_Diff_G_lef) is defined as:
En_Diff_G_lef=G32−G_lef_center (1.9)
By plugging the formulae (1.6), (1.7), and (1.8) into (1.9), En_Diff_G_lef can be expressed as:
Step (c):
Determine the green component difference (En_Diff_G_rig) associated with the pixel (3, 4) to the right of the center pixel (3, 3).
First, the value (G_rig_center) of the green component of the edge center corresponding to the pixel (3, 4) is defined as:
In the formula (1.11), G14 and G54 are known, but G24 and G44 are unknown. Because the center pixel is a pixel on the horizontal edge, G24 and G44 can be treated as pixels on the horizontal edge, i.e.,
Note that the green component difference (En_Diff_G_rig) is defined as:
En_Diff_G_rig=G34−G_rig_center (1.14)
By plugging the formulae (1.11), (1.12), and (1.13) into (1.14), En_Diff_G_rig can be expressed as:
Step (d):
Perform edge enhancement to the center pixel (3, 3).
To suppress the noise's influence on the image edge enhancement, the weighted average of En_Diff_G_cen, En_Diff_G_lef and En_Diff_G_rig is defined as:
By plugging the formulae (1.5), (1.10), and (1.15) into (1.16), the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
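The overall structure of option (1) can be illustrated with the following Python sketch; the specific interpolation averages and the 2:1:1 weighting of the three differences are assumptions standing in for the formulas (1.1)-(1.16), not the patented expressions themselves:

    # Option (1) sketch: horizontal edge, blue center pixel B33.
    # g is a 6x6 list of green values indexed 1..5 to match the text;
    # g[3][3] is the interpolated green of the center pixel.
    def en_diff_g_horizontal_blue(g):
        # Greens missing at blue/red sites, interpolated along the edge.
        g13 = (g[1][2] + g[1][4]) / 2.0
        g53 = (g[5][2] + g[5][4]) / 2.0
        g22 = (g[2][1] + g[2][3]) / 2.0
        g42 = (g[4][1] + g[4][3]) / 2.0
        g24 = (g[2][3] + g[2][5]) / 2.0
        g44 = (g[4][3] + g[4][5]) / 2.0
        # Differences against the per-column edge centers.
        cen = g[3][3] - (g13 + g[2][3] + g[4][3] + g53) / 4.0
        lef = g[3][2] - (g[1][2] + g22 + g42 + g[5][2]) / 4.0
        rig = g[3][4] - (g[1][4] + g24 + g44 + g[5][4]) / 4.0
        return (2*cen + lef + rig) / 4.0   # assumed 2:1:1 weighting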
Option (2):
If Min_HV<Min_XY and H−V>TH_edge, treat the center pixel as a pixel on the vertical edge and perform edge enhancement to the center pixel on the vertical edge.
Step (a):
Determine the difference (En_Diff_G_cen) between the value of the green component of the center pixel (G33) and the value of the green component of the edge center of the edge at which the center pixel (3, 3) is located.
First, the value (G_center) of the green component of the edge center is defined as:
In the formula (1.18), G32 and G34 are known, but G31 and G35 are unknown. Because the center pixel is deemed to be a pixel on the vertical edge, G31 and G35 can be treated as pixels on the vertical edge, i.e.,
Note that the green component difference (En_Diff_G_cen) is defined as:
En_Diff_G_cen=G33−G_center (1.21)
By plugging the formulae (1.18), (1.19), and (1.20) into (1.21), En_Diff_G_cen can be expressed as:
Step (b):
Determine the green component difference (En_Diff_G_up) associated with the pixel (2, 3) above the center pixel (3, 3).
First, the value (G_up_center) of the green component of the edge center corresponding to the pixel (2, 3) is defined as:
In the formula (1.23), G21 and G25 are known, but G22 and G24 are unknown. Because the center pixel is a pixel on the vertical edge, G22 and G24 can be treated as pixels on the vertical edge, i.e.,
Note that the green component difference (En_Diff_G_up) is defined as:
En_Diff_G_up=G23−G_up_center (1.26)
By plugging the formulae (1.23), (1.24), and (1.25) into (1.26), En_Diff_G_up can be expressed as:
Step (c):
Determine the green component difference (En_Diff_G_dow) associated with the pixel (4, 3) below the center pixel (3, 3).
First, the value (G_dow_center) of the green component of the edge center corresponding to the pixel (4, 3) is defined as:
In the formula (1.28), G41 and G45 are known, but G42 and G44 are unknown. Because the center pixel is a pixel on the vertical edge, G42 and G44 can be treated as pixels on the vertical edge, i.e.,
Note that the green component difference (En_Diff_G_dow) is defined as:
En_Diff_G_dow=G43−G_dow_center (1.31)
By plugging the formulae (1.28), (1.29), and (1.30) into (1.31), En_Diff_G_dow can be expressed as:
Step (d):
Perform edge enhancement to the center pixel (3, 3).
To suppress the noise's influence on the image edge enhancement, the weighted average of En_Diff_G_cen, En_Diff_G_up and En_Diff_G_dow is defined as:
By plugging the formulae (1.22), (1.27), and (1.32) into (1.33), the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (3):
If Min_HV>Min_XY and X−Y<−TH_edge, treat the center pixel as a pixel on the 135 degree edge and perform edge enhancement to the center pixel on the 135 degree edge.
First, the value (G_center) of the green component of the edge center is defined as:
In the formula (1.35), G24 and G42 are unknown. Because the center pixel is treated as a pixel on the 135 degree edge, G24 and G42 can be treated as pixels on the 135 degree edge, i.e.,
Note that the green component difference (En_Diff_G_cen) is defined as:
En_Diff_G_cen=G33−G_center (1.38)
By plugging the formulae (1.35), (1.36), and (1.37) into (1.38), En_Diff_G_cen can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (4):
If Min_HV>Min_XY and X−Y>TH_edge, treat the center pixel as a pixel on the 45 degree edge and perform edge enhancement to the center pixel on the 45 degree edge.
First, the value (G_center) of the green component of the edge center is defined as:
In the formula (1.40), G22 and G44 are unknown. Because the center pixel is deemed to be a pixel on the 45 degree edge, G22 and G44 can be treated as pixels on the 45 degree edge, i.e.,
Note that the green component difference (En_Diff_G_cen) is defined as:
En_Diff_G_cen=G33−G_center (1.43)
By plugging the formulae (1.40), (1.41), and (1.42) into (1.43), En_Diff_G_cen can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Scenario Two:
If the original color component of the center pixel is red, perform the edge enhancement to the center pixel as shown in the accompanying drawing.
Option (1):
If Min_HV<Min_XY and H−V<−TH_edge, treat the center pixel as a pixel on the horizontal edge and perform edge enhancement to the center pixel on the horizontal edge.
By analogy to option (1) of scenario one, the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (2):
If Min_HV<Min_XY and H−V>TH_edge, treat the center pixel as a pixel on the vertical edge and perform edge enhancement to the pixel on the vertical edge.
By analogy to option (2) of scenario one, the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (3):
If Min_HV>Min_XY and X−Y<−TH_edge, treat the center pixel as a pixel on the 135 degree edge and perform edge enhancement for the pixel on the 135 degree edge.
By analogy to option (3) of scenario one, the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (4):
If Min_HV>Min_XY and X−Y>TH_edge, treat the center pixel as a pixel on the 45 degree edge and perform edge enhancement for the pixel on the 45 degree edge.
By analogy to option (4) of scenario one, the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Scenario Three:
If the original color component of the center pixel is green and the pixels on the left and right sides of the center pixel are blue, perform the edge enhancement to the center pixel as shown in the accompanying drawing.
Option (1):
If Min_HV<Min_XY and H−V<−TH_edge, treat the center pixel as a pixel on the horizontal edge and perform edge enhancement to the center pixel on the horizontal edge.
Step (a):
Determine the difference (En_Diff_G_cen) between the value of the green component of the center pixel (G33) and the value of the green component of the edge center of the edge at which the center pixel (3, 3) is located.
First, the value (G_center) of the green component of the edge center is defined as:
In the formula (3.1), G13 and G53 are known, but G23 and G43 are unknown. Because the center pixel is a pixel on the horizontal edge, G23 and G43 can be treated as pixels on the horizontal edge, i.e.,
Note that the green component difference (En_Diff_G_cen) is defined as:
En_Diff_G_cen=G33−G_center (3.4)
By plugging the formulae (3.1), (3.2), and (3.3) into (3.4), En_Diff_G_cen can be expressed as:
Step (b):
Determine the green component difference (En_Diff_G_lef) associated with the pixel (3, 2) to the left of the center pixel (3, 3).
First, the value (G_lef_center) of the green component of the edge center associated with the pixel (3, 2) is defined as:
In the formula (3.6), G22 and G42 are known, but G12 and G52 are unknown. Because the center pixel is a pixel on the horizontal edge, G12 and G52 can be treated as pixels on the horizontal edge, i.e.,
Note that the green component difference (En_Diff_G_lef) is defined as:
En_Diff_G_lef=G32−G_lef_center (3.9)
And G32 can also be treated as a pixel on the horizontal edge, i.e.,
By plugging the formulae (3.6), (3.7), (3.8), and (3.10) into (3.9), En_Diff_G_lef can be expressed as:
Step (c):
Determine the green component difference (En_Diff_G_rig) associated with the pixel (3, 4) to the right of the center pixel (3, 3).
First, the value (G_rig_center) of the green component of the edge center corresponding to the pixel (3, 4) is defined as:
In the formula (3.12), G24 and G44 are known, but G14 and G54 are unknown. Because the center pixel is a pixel on the horizontal edge, G14 and G54 can be treated as pixels on the horizontal edge, i.e.,
Note that the green component difference (En_Diff_G_rig) is defined as:
En_Diff_G_rig=G34−G_rig_center (3.15)
And G34 can also be treated as a pixel on the horizontal edge, i.e.,
By plugging the formulae (3.12), (3.13), (3.14), and (3.16) into (3.15), En_Diff_G_rig can be expressed as:
Step (d):
Perform edge enhancement to the center pixel.
To suppress the noise's influence on the image edge enhancement, the weighted average of En_Diff_G_cen, En_Diff_G_lef and En_Diff_G_rig is defined as:
By plugging the formulae (3.5), (3.11), and (3.17) into (3.18), the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
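This option differs from its scenario one counterpart in that the side pixels (3, 2) and (3, 4) are blue sites whose green values must themselves be estimated (formulae (3.10) and (3.16)). Below is a Python sketch under the same assumptions as before; the interpolation averages and 2:1:1 weights are illustrative, not the patented formulas:

    # Scenario three, option (1) sketch: horizontal edge, green center
    # pixel G33 with blue left/right neighbors. g is indexed 1..5.
    def en_diff_g_horizontal_green(g):
        g23 = (g[2][2] + g[2][4]) / 2.0   # greens missing at red sites
        g43 = (g[4][2] + g[4][4]) / 2.0
        g12 = (g[1][1] + g[1][3]) / 2.0   # greens missing at blue sites
        g52 = (g[5][1] + g[5][3]) / 2.0
        g14 = (g[1][3] + g[1][5]) / 2.0
        g54 = (g[5][3] + g[5][5]) / 2.0
        g32 = (g[3][1] + g[3][3]) / 2.0   # side pixels are blue sites too
        g34 = (g[3][3] + g[3][5]) / 2.0
        cen = g[3][3] - (g[1][3] + g23 + g43 + g[5][3]) / 4.0
        lef = g32 - (g12 + g[2][2] + g[4][2] + g52) / 4.0
        rig = g34 - (g14 + g[2][4] + g[4][4] + g54) / 4.0
        return (2*cen + lef + rig) / 4.0   # assumed 2:1:1 weighting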
Option (2):
If Min_HV<Min_XY and H−V>TH_edge, treat the center pixel as a pixel on the vertical edge and perform edge enhancement for the pixel on the vertical edge.
Step (a):
Determine the difference (En_Diff_G_cen) between the value of the green component of the center pixel (G33) and the value of the green component of the edge center of the edge at which the center pixel (3, 3) is located.
First, the value (G_center) of the green component of the edge center is defined as:
In the formula (3.20), G31 and G35 are known, but G32 and G34 are unknown. Because the center pixel is deemed to be a pixel on the vertical edge, G32 and G34 can be treated as pixels on the vertical edge, i.e.,
Note that the green component difference (En_Diff_G_cen) is defined as:
En_Diff_G_cen=G33−G_center (3.23)
By plugging the formulae (3.20), (3.21), and (3.22) into (3.23), En_Diff_G_cen can be expressed as:
Step (b):
Determine the green component difference (En_Diff_G_up) associated with the pixel (2, 3) above the center pixel (3, 3).
First, the value (G_up_center) of the green component of the edge center corresponding to the pixel (2, 3) is defined as:
In the formula (3.25), G22 and G24 are known, but G21 and G25 are unknown. Because the center pixel is a pixel on the vertical edge, G21 and G25 can be treated as pixels on the vertical edge, i.e.,
Note that the green component difference (En_Diff_G_up) is defined as:
En_Diff_G_up=G23−G_up_center (3.28)
And G23 can also be treated as a pixel on the vertical edge, i.e.,
By plugging the formulae (3.25), (3.26), (3.27), and (3.29) into (3.28), En_Diff_G_up can be expressed as:
Step (c):
Determine the green component difference (En_Diff_G_dow) associated with the pixel (4, 3) below the center pixel (3, 3).
First, the value (G_dow_center) of the green component of the edge center corresponding to the pixel (4, 3) is defined as:
In the formula (3.31), G42 and G44 are known, but G41 and G45 are unknown. Because the center pixel is a pixel on the vertical edge, G41 and G45 can be treated as pixels on the vertical edge, i.e.,
Note that the green component difference (En_Diff_G_dow) is defined as:
En_Diff_G_dow=G43−G_dow_center (3.34)
And G43 can also be treated as a pixel on the vertical edge, i.e.,
By plugging the formulae (3.31), (3.32), (3.33), and (3.35) into (3.34), En_Diff_G_dow can be expressed as:
Step (d):
Perform edge enhancement to the center pixel (3, 3).
To suppress the noise's influence on the image edge enhancement, the weighted average of En_Diff_G_cen, En_Diff_G_up and En_Diff_G_dow is defined as:
By plugging the formulae (3.24), (3.30), and (3.36) into (3.37), the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (3):
If Min_HV>Min_XY and X−Y<−TH_edge, treat the center pixel as a pixel on the 135 degree edge and perform edge enhancement for the center pixel on the 135 degree edge.
Step (a):
Determine the difference (En_Diff_G_cen) between the value of the green component of the center pixel (G33) and the value of the green component of the edge center of the edge at which the center pixel (3, 3) is located.
First, the value (G_center) of the green component of the edge center is defined as:
And the green component difference (En_Diff_G_cen) is defined as:
En_Diff_G_cen=G33−G_center (3.40)
By plugging the formula (3.39) into (3.40), En_Diff_G_cen can be expressed as:
Step (b):
Determine the green component difference (En_Diff_G_lu) associated with the pixel (2, 2) on the upper left side of the center pixel (3, 3).
First, the value (G_lu_center) of the green component of the edge center corresponding to the pixel (2, 2) is defined as:
And the green component difference (En_Diff_G_lu) is defined as:
En_Diff_G_lu=G22−G_lu_center (3.43)
By plugging the formula (3.42) into (3.43), En_Diff_G_lu can be expressed as:
Step (c):
Determine the green component difference (En_Diff_G_rd) associated with the pixel (4, 4) on the lower right side of the center pixel (3, 3).
First, the value (G_rd_center) of the green component of the edge center corresponding to the pixel (4, 4) is defined as:
And the green component difference (En_Diff_G_rd) associated with the pixel (4, 4) is:
En_Diff_G_rd=G44−G_rd_center (3.46)
By plugging the formula (3.45) into (3.46), En_Diff_G_rd can be expressed as:
Step (d):
Perform edge enhancement to the center pixel.
To suppress the noise's influence on the image edge enhancement, the weighted average of En_Diff_G_cen, En_Diff_G_lu and En_Diff_G_rd is defined as:
By plugging the formulae (3.41), (3.44), and (3.47) into (3.48), the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (4):
If Min_HV>Min_XY and X−Y>TH_edge, treat the center pixel as a pixel on the 45 degree edge and perform edge enhancement to the center pixel on the 45 degree edge.
Step (a):
Determine the green component difference (En_Diff_G_cen) associated with the center pixel (3, 3).
First, the value (G_center) of the green component of the edge center is defined as:
And the green component difference (En_Diff_G_cen) of the edge center associated with the center pixel is:
En_Diff_G_cen=G33−G_center (3.51)
By plugging the formula (3.50) into (3.51), En_Diff_G_cen can be expressed as:
Step (b):
Determine the green component difference (En_Diff_G_ru) associated with the pixel (2, 4) on the upper right side of the center pixel (3, 3).
First, the value (G_ru_center) of the green component of the edge center associated with the pixel (2, 4) is defined as:
And the green component difference (En_Diff_G_ru) of the edge associated with the pixel (2, 4) is defined as:
En_Diff_G_ru=G24−G_ru_center (3.54)
By plugging the formula (3.53) into (3.54), En_Diff_G_ru can be expressed as:
Step (c):
Determine the green component difference (En_Diff_G_ld) associated with the pixel (4, 2) on the lower left side of the center pixel (3, 3).
First, the value (G_ld_center) of the green component of the edge center corresponding to the pixel (4, 2) is defined as:
And the green component difference (En_Diff_G_ld) of the edge center associated with the pixel (4, 2) is defined as:
En_Diff_G_ld=G42−G_ld_center (3.57)
By plugging the formula (3.56) into (3.57), En_Diff_G_ld can be expressed as:
Step (d):
Perform edge enhancement to the center pixel.
To suppress the noise's influence on the image edge enhancement, the weighted average of En_Diff_G_cen, En_Diff_G_ld and En_Diff_G_ru is defined as:
By plugging the formulae (3.52), (3.55), and (3.58) into (3.59), the weighted average can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Scenario Four:
If the original color component of the center pixel is green and the pixels on the left and right sides of the center pixel are red, perform the edge enhancement to the center pixel as shown in the accompanying drawing.
Option (1):
If Min_HV<Min_XY and H−V<−TH_edge, treat the center pixel as a pixel on the horizontal edge and perform edge enhancement to the center pixel on the horizontal edge.
By analogy to option (1) of scenario three, En_Diff_G can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (2):
If Min_HV<Min_XY and H−V>TH_edge, treat the center pixel as a pixel on the vertical edge and perform edge enhancement to the pixel on the vertical edge.
By analogy to option (2) of scenario three, En_Diff_G can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (3):
If Min_HV>Min_XY and X−Y<−TH_edge, treat the center pixel as a pixel on the 135 degree edge and perform edge enhancement for the pixel on the 135 degree edge.
By analogy to option (3) of scenario three, En_Diff_G can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
Option (4):
If Min_HV>Min_XY and X−Y>TH_edge, treat the center pixel as a pixel on the 45 degree edge and perform edge enhancement for the pixel on the 45 degree edge.
By analogy to option (4) of scenario three, En_Diff_G can be expressed as:
The new values of the three components R, G and B of the center pixel (3, 3) after edge enhancement are:
R33_new=R33+Edge_Gain*En_Diff_G;
G33_new=G33+Edge_Gain*En_Diff_G;
B33_new=B33+Edge_Gain*En_Diff_G.
In some embodiments of the present invention, the aforementioned image edge enhancement process takes into account the influence of the green component values of different pixels surrounding the center pixel, and adopts a noise-resistant, self-adaptive edge enhancement algorithm to suppress noise on the image edge. Thus, the resulting image has a clear edge. In addition, because the process performs image edge enhancement in the interpolation module based on the Bayer data, it can significantly reduce the consumption of memory space.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.