The present disclosure relates to image processing technologies, and more particularly to an image processing method and device.
As living standards improve, demands on the display quality of electronic products are increasingly high. In existing techniques, image processing is performed during screen display in order to improve the display quality. During the image processing, it is usually necessary to adjust the saturation of images to make the display more colorful.
For color image saturation enhancement, it is important to ensure that no out-of-boundary issue occurs in the RGB color space and to keep the color tone unchanged. Color image enhancement is usually performed in the HSI (Hue, Saturation, Intensity) and HSV (Hue, Saturation, Value) spaces through conversion. However, a color space transformation problem occurs when processed images are converted back to the RGB space. Generally, clipping approaches are adopted to map out-of-boundary values to boundary values. This may cause some details to be lost and the color tone to change. Further, this space transformation approach takes time, consumes much computation, and is low in efficiency.
The present invention provides an image processing method for solving image color tone inconsistency caused by values going out of the boundaries of the color space.
To achieve the above object, the technical schemes provided in the present disclosure are described below.
The present disclosure provides an image processing method, including:
Step S10: according to a first function f1(x), processing components of a point Ai(ri,gi,bi) of an original image in RGB color space to obtain A0(r0,g0,b0)=f1(xi), where i is a natural number;
Step S20: according to a second function f2(x), processing the point Ai(ri,gi,bi) of the original image to obtain a processed saturation S0=f2(xi); and
Step S30: letting f2(xi)=1 to determine the point A0(r0,g0,b0); processing the point Ai in the RGB color space if max(r0,g0,b0)≤1; and converting the point Ai into CMY color space for image processing if max(r0,g0,b0)>1.
In accordance with a preferred embodiment of the present disclosure, Step S10 includes:
Step S11: selecting the point Ai(ri,gi,bi) from the original image in the RGB color space; and
Step S12: according to the first function f1(x), processing each component of the point Ai of the original image in the RGB color space to obtain A0(r0,g0,b0)=f1(xi),
wherein processing each component of the point Ai of the original image in the RGB color space is to stretch (α) and translate (β) the components of the point Ai in the RGB color space, where the first function is f1(x)=αx+β.
In accordance with a preferred embodiment of the present disclosure, Step S20 includes:
Step S21: selecting the point Ai(ri,gi,bi) from the original image in the RGB color space;
Step S22: according to a third function f3(x), determining saturation Si=f3(xi) of the point Ai(ri,gi,bi) of the original image; and
Step S23: according to the second function f2(x), processing the saturation Si of the original image to obtain the processed saturation S0=f2(xi).
In accordance with a preferred embodiment of the present disclosure, the saturation of the point Ai(ri,gi,bi) of the original image is Si=1−3×min[ri,gi,bi]/(ri+gi+bi); and
the processed saturation obtained by processing the saturation Si of the original image according to the second function f2(x) is S0=f2(x)=1−3×(α·min[ri,gi,bi]+β)/(ri+gi+bi).
In accordance with a preferred embodiment of the present disclosure, brightness of the original image remains unchanged before and after image processing, and values of α and β in f1(x) are obtained using S0=f2(xi) and A0(r0,g0,b0)=f1(xi).
In accordance with a preferred embodiment of the present disclosure, the CMY color space is a color model based on subtractive color mixture, and the point Ai is processed in the CMY color space using a fourth function f4(x)=1−x, where x represents each component of the point Ai(ri,gi,bi) in the RGB color space.
The present disclosure further provides an image processing device, including:
an image processing module configured to process components of a point Ai(ri,gi,bi) of an original image in RGB color space according to a first function f1(x) to obtain A0(r0,g0,b0)=f1(xi), where i is a natural number, process the point Ai(ri,gi,bi) of the original image according to a second function f2(x) to obtain a processed saturation S0=f2(xi), determine the point A0(r0,g0,b0), process the point Ai in the RGB color space if max(r0,g0,b0)≤1, and convert the point Ai into CMY color space for image processing if max(r0,g0,b0)>1.
In accordance with a preferred embodiment of the present disclosure, the image processing module is configured to stretch (α) and translate (β) each component of the point Ai of the original image in the RGB color space, where the first function is f1(x)=αx+β.
In accordance with a preferred embodiment of the present disclosure, the image processing module is configured to determine saturation Si=f3(xi) of the point Ai(ri,gi,bi) of the original image according to a third function f3(x), where Si=1−3×min[ri,gi,bi]/(ri+gi+bi); and
the image processing module is further configured to process the saturation Si of the original image according to the second function f2(x) to obtain the processed saturation S0=f2(x)=1−3×(α·min[ri,gi,bi]+β)/(ri+gi+bi).
The present disclosure further provides an image processing method, including:
Step S10: according to a first function f1(x), processing components of a point Ai(ri,gi,bi) of an original image in RGB color space to obtain A0(r0,g0,b0)=f1(xi), where i is a natural number;
Step S20: according to a second function f2(x), processing the point Ai(ri,gi,bi) of the original image to obtain a processed saturation S0=f2(xi); and
Step S30: letting f2(xi)=1 to determine the point A0(r0,g0,b0), and processing the point Ai in the RGB color space if max(r0,g0,b0)≤1; using a fourth function f4(x) to convert the point Ai into CMY color space for image processing if max(r0,g0,b0)>1.
In accordance with a preferred embodiment of the present disclosure, Step S10 includes:
Step S11: selecting the point Ai(ri,gi,bi) from the original image in the RGB color space; and
Step S12: according to the first function f1(x), processing each component of the point Ai of the original image in the RGB color space to obtain A0(r0,g0,b0)=f1(xi),
wherein processing each component of the point Ai of the original image in the RGB color space is to stretch (α) and translate (β) the components of the point Ai in the RGB color space, where the first function is f1(x)=αx+β.
In accordance with a preferred embodiment of the present disclosure, Step S20 includes:
Step S21: selecting the point Ai(ri,gi,bi) from the original image in the RGB color space;
Step S22: according to a third function f3(x), determining saturation Si=f3(xi) of the point Ai(ri,gi,bi) of the original image; and
Step S23: according to the second function f2(x), processing the saturation Si of the original image to obtain the processed saturation S0=f2(xi).
In accordance with a preferred embodiment of the present disclosure, the saturation of the point Ai(ri,gi,bi) of the original image is Si=1−3×min[ri,gi,bi]/(ri+gi+bi); and
the processed saturation obtained by processing the saturation Si of the original image according to the second function f2(x) is S0=f2(x)=1−3×(α·min[ri,gi,bi]+β)/(ri+gi+bi).
In accordance with a preferred embodiment of the present disclosure, brightness of the original image remains unchanged before and after image processing, and values of α and β in f1(x) are obtained using S0=f2(xi) and A0(r0,g0,b0)=f1(xi).
In accordance with a preferred embodiment of the present disclosure, the CMY color space is a color model based on subtractive color mixture, and the point Ai is processed in the CMY color space using a fourth function f4(x)=1−x, where x represents each component of the point Ai(ri,gi,bi) in the RGB color space.
Beneficial effects of the present disclosure are described below. The present disclosure provides an image processing method and device. Points of an original image in its color space are filtered, and a space transformation is performed for the points that may go out of boundary. The present disclosure can thereby efficiently solve the image distortion issue caused by values going out of the boundaries of the color space, ensure an unchanged color tone, and improve display quality. Also, image saturation is enhanced directly in the RGB color space, which improves computational efficiency.
For explaining the technical schemes used in the prior art and in the embodiments of the present disclosure more clearly, the drawings to be used in describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings below show only some embodiments of the present disclosure, and those of ordinary skill in the art can obtain other drawings from these drawings without making any inventive effort.
The following descriptions of the respective embodiments are specific embodiments capable of being implemented for illustrating the present disclosure with reference to the appended figures. In describing the present disclosure, spatially relative terms such as “upper”, “lower”, “front”, “back”, “left”, “right”, “inner”, “outer”, “lateral”, and the like, may be used herein for ease of description as illustrated in the figures. Therefore, the spatially relative terms used herein are intended to illustrate the present disclosure for ease of understanding, and are not intended to limit the present disclosure. In the appended drawings, units with similar structures are indicated by the same reference numbers.
In existing image processing methods, color images are converted into HSI (Hue, Saturation, Intensity) and HSV (Hue, Saturation, Value) color spaces and are processed in these color spaces. When they are converted back to RGB color space, some details may be lost and color tone may be changed because of being out of the boundaries of the color space. The present disclosure provides an image processing method, and embodiments of the present disclosure can avoid these drawbacks.
In Step S10, components of a point Ai(ri,gi,bi) of an original image in RGB color space are processed according to a first function f1(x) to obtain A0(r0,g0,b0)=f1(xi), where i is a natural number.
In the RGB color space, the point Ai(ri,gi,bi) is selected from the original image. Each component of the point Ai of the original image is processed in the RGB color space according to the first function f1(x) to obtain A0(r0,g0,b0)=f1(xi).
Processing each component of the point Ai of the original image in the RGB color space is to stretch (α) and translate (β) the components of the point Ai in the RGB color space.
In Step S20, the point Ai(ri,gi,bi) of the original image is processed according to a second function f2(x) to obtain a processed saturation S0=f2(xi).
In the RGB color space, the point Ai(ri,gi,bi) is selected from the original image. Saturation Si=f3(xi) of the point Ai(ri,gi,bi) of the original image is determined according to a third function f3(x). The saturation Si of the original image is processed according to the second function f2(x) to obtain the processed saturation S0=f2(xi).
The saturation is obtained according to Si=1−3×min[ri,gi,bi]/(ri+gi+bi). The processed saturation is obtained according to S0=1−3×(α·min[ri,gi,bi]+β)/(ri+gi+bi).
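As an illustrative sketch (not part of the claimed method), the saturation formula above can be written directly on RGB components assumed to be normalized to [0,1]; the function name is chosen here for clarity:

```python
def saturation(r, g, b):
    """HSI-style saturation: S = 1 - 3 * min(r, g, b) / (r + g + b)."""
    total = r + g + b
    if total == 0:
        return 0.0  # a pure black pixel has no defined hue or saturation
    return 1.0 - 3.0 * min(r, g, b) / total
```

For example, the pixel (0.5, 0.3, 0.2) has saturation 1−3×0.2/1.0=0.4, while any gray pixel (r=g=b) has saturation 0.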
Brightness of the original image remains unchanged before and after image processing, and values of α and β in f1(x) are obtained using S0=f2(xi) and A0(r0,g0,b0)=f1(xi).
In Step S30, let f2(xi)=1 to determine the point A0(r0,g0,b0). The point Ai is processed in the RGB color space if max(r0,g0,b0)≤1.
If max(r0,g0,b0)>1, the point Ai is converted into CMY color space for image processing.
In the above formulas, let f2(xi)=1 to determine values of r0,g0,b0 for the point A0(r0,g0,b0).
According to the magnitudes of the obtained values of r0,g0,b0, a corresponding color space is selected to process the point Ai.
If max(r0,g0,b0)≤1, the point Ai is processed in the RGB space using the second function f2(x). If max(r0,g0,b0)>1, the point Ai is converted into CMY color space using a fourth function f4(x) and processed in the CMY color space.
The converted point is processed in the CMY color space. After being processed, the point is converted from the CMY color space to the RGB color space using the fourth function f4(x) again.
The CMY color space is a color model based on subtractive color mixture. The fourth function is f4(x)=1−x , where x represents each component of the point Ai(ri,gi,bi) in the RGB color space.
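Because f4(x)=1−x is its own inverse, the same function converts between the two spaces in both directions. A minimal sketch, assuming normalized components and illustrative function names:

```python
def f4(x):
    """Fourth function f4(x) = 1 - x (an involution: f4(f4(x)) = x)."""
    return 1.0 - x

def rgb_to_cmy(r, g, b):
    """Convert an RGB point to CMY by applying f4 to each component."""
    return f4(r), f4(g), f4(b)

def cmy_to_rgb(c, m, y):
    """Convert back; f4 is its own inverse, so the mapping is identical."""
    return f4(c), f4(m), f4(y)
```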
For instance, a point Ai(ri,gi,bi) is selected in the RGB color space, where i is a natural number. After being processed in the RGB color space, the point is denoted as A0(r0,g0,b0).
(1) The first function f1(x)=αx+β is used to stretch (α) and translate (β) the point Ai(ri,gi,bi) in the RGB color space to obtain the following equations:
r0=αri+β (1-1)
g0=αgi+β (1-2)
b0=αbi+β (1-3)
(2) Brightness of the original image remains unchanged before and after image processing. Based on this principle, the following equations are obtained:
li=ri+gi+bi (1-4)
l0=r0+g0+b0 (1-5)
According to equations (1-1)˜(1-5), a relation between α and β is obtained:
ri+gi+bi=α(ri+gi+bi)+3β (1-6)
(3) Saturation of the point Ai(ri,gi,bi) of the original image is determined according to the third function f3(x). The following equation is obtained:
Si=1−3×min[ri,gi,bi]/(ri+gi+bi) (1-7)
(4) The saturation Si of the original image is processed according to the second function f2(x) to obtain a processed saturation S0 as below:
S0=1−3×(α·min[ri,gi,bi]+β)/(ri+gi+bi) (1-8)
Accordingly, values of α and β can be obtained according to equations (1-6) and (1-8).
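For illustration only, the two constraints can be solved in closed form: summing the stretch-and-translate equations under unchanged brightness gives l=αl+3β with l=ri+gi+bi, and combining this with the saturation expression yields α=S0/Si and β=l(1−α)/3. A minimal Python sketch under these assumptions (components normalized to [0,1], a chromatic pixel so Si>0; the function name is illustrative):

```python
def solve_alpha_beta(r, g, b, s_target):
    """Solve for stretch alpha and translate beta that move the pixel's
    saturation S_i to s_target while keeping brightness l = r + g + b fixed.

    From S = 1 - 3 * min / l and l = alpha * l + 3 * beta one obtains
    alpha = s_target / S_i and beta = l * (1 - alpha) / 3."""
    l = r + g + b
    s_i = 1.0 - 3.0 * min(r, g, b) / l  # assumes a chromatic pixel (S_i > 0)
    alpha = s_target / s_i
    beta = l * (1.0 - alpha) / 3.0
    return alpha, beta
```

For the pixel (0.5, 0.3, 0.2) and target saturation 0.7, this gives α=1.75 and β=−0.25; the processed point (0.625, 0.275, 0.1) keeps the brightness 1.0 and reaches saturation 0.7.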
In equation (1-8), let f2(xi)=1 to determine values of r0,g0,b0 for the point A0(r0,g0,b0). According to the magnitudes of the obtained values of r0,g0,b0, a corresponding color space is selected to process the point Ai.
If max(r0,g0,b0)≤1, the point Ai is processed in the RGB space using the second function f2(x). If max(r0,g0,b0)>1, the point Ai is converted into CMY color space using a fourth function f4(x) and processed in the CMY color space.
(5) According to the fourth function f4(x), the point Ai(ri,gi,bi) is processed in the CMY color space to obtain a point Ai(ci,mi,yi) in the CMY color space as below:
ci=1−ri (1-9)
mi=1−gi (1-10)
yi=1−bi (1-11)
(6) The first function f1(x)=αx+β is used to stretch (α) and translate (β) the point Ai(ci,mi,yi) in the CMY color space to obtain a processed point A0(c0,m0,y0) as below:
c0=αci+β (1-12)
m0=αmi+β (1-13)
y0=αyi+β (1-14)
(7) The saturation Si of the original image is processed according to the second function f2(x) to obtain a processed saturation S0 as below:
S0=1−3×(α·min[ci,mi,yi]+β)/(ci+mi+yi) (1-15)
(8) The point A0(c0,m0,y0) is converted from the CMY color space into the RGB color space using the fourth function f4(x), as below:
r0=1−c0 (1-16)
g0=1−m0 (1-17)
b0=1−y0 (1-18)
Accordingly, the afore-described method can filter points of the original image and perform space transformation for the points that may be out of boundary to solve the out-of-boundary issue.
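Putting steps (1) through (8) together, an end-to-end sketch of the flow might look as follows. This is illustrative only: component values are assumed normalized to [0,1], the pixel is assumed chromatic, and the closed-form α=S0/Si, β=l(1−α)/3 is derived here from the unchanged-brightness constraint rather than stated in the disclosure:

```python
def enhance_pixel(r, g, b, s_target):
    """Enhance the saturation of one pixel toward s_target, falling back to
    the CMY color space when the stretched RGB values would leave [0, 1]."""
    def sat(x, y, z):
        # HSI-style saturation: S = 1 - 3 * min / sum
        t = x + y + z
        return 0.0 if t == 0 else 1.0 - 3.0 * min(x, y, z) / t

    def solve(x, y, z):
        # brightness-preserving stretch/translate: alpha = S0/Si, beta = l(1-alpha)/3
        l = x + y + z
        a = s_target / sat(x, y, z)  # assumes a chromatic pixel (Si > 0)
        return a, l * (1.0 - a) / 3.0

    alpha, beta = solve(r, g, b)
    r0, g0, b0 = (alpha * v + beta for v in (r, g, b))
    if max(r0, g0, b0) <= 1.0:
        return r0, g0, b0  # in-gamut: stay in the RGB space
    # out of boundary: convert with f4(x) = 1 - x, process in CMY, convert back
    c, m, y = (1.0 - v for v in (r, g, b))
    alpha, beta = solve(c, m, y)
    c0, m0, y0 = (alpha * v + beta for v in (c, m, y))
    return 1.0 - c0, 1.0 - m0, 1.0 - y0
```

For (0.5, 0.3, 0.2) with target 0.7 the RGB branch suffices; for the bright pixel (0.9, 0.8, 0.7) the stretched red component would exceed 1, so the point is processed in CMY and mapped back, keeping brightness unchanged and staying inside the gamut.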
The present disclosure further provides an image processing device. The device includes an image processing module.
Firstly, in the image processing module, a point Ai(ri,gi,bi) is selected from the original image in the RGB color space. Each component of the point Ai of the original image is processed in the RGB color space according to the first function f1(x)=αx+β to obtain a processed point A0(r0,g0,b0)=f1(xi).
The image processing is to stretch (α) and translate (β) each component of the point Ai of the original image in the RGB color space.
After that, in the image processing module, the point Ai(ri,gi,bi) is selected from the original image in the RGB color space. Saturation Si=f3(xi) of the point Ai(ri,gi,bi) of the original image is determined according to the third function f3(x). The saturation Si of the original image is processed according to the second function f2(x) to obtain a processed saturation S0=f2(xi).
The saturation is obtained according to Si=1−3×min[ri,gi,bi]/(ri+gi+bi). The processed saturation is obtained according to S0=1−3×(α·min[ri,gi,bi]+β)/(ri+gi+bi).
Brightness of the original image remains unchanged before and after image processing, and values of α and β in f1(x) are obtained using S0=f2(xi) and A0(r0,g0,b0)=f1(xi).
Finally, in the image processing module, let f2(xi)=1 to determine values of r0,g0,b0 for the point A0(r0,g0,b0). According to the magnitudes of the obtained values of r0,g0,b0, a corresponding color space is selected to process the point Ai.
If max(r0,g0,b0)≤1, the point Ai is processed in the RGB space using the second function f2(x). If max(r0,g0,b0)>1, the point Ai is converted into CMY color space using a fourth function f4(x) and processed in the CMY color space.
The converted point is processed in the CMY color space. After being processed, the point is converted from the CMY color space to the RGB color space using the fourth function f4(x) again.
The CMY color space is a color model based on subtractive color mixture. The fourth function is f4(x)=1−x , where x represents each component of the point Ai(ri,gi,bi) in the RGB color space.
For instance, a point Ai(ri,gi,bi) is selected in the RGB color space, where i is a natural number. After being processed in the RGB color space, the point is denoted as A0(r0,g0,b0).
(1) The first function f1(x)=αx+β is used to stretch (α) and translate (β) the point Ai(ri,gi,bi) in the RGB color space to obtain the following equations:
r0=αri+β (2-1)
g0=αgi+β (2-2)
b0=αbi+β (2-3)
(2) Brightness of the original image remains unchanged before and after image processing. Based on this principle, the following equations are obtained:
li=ri+gi+bi (2-4)
l0=r0+g0+b0 (2-5)
According to equations (2-1)˜(2-5), a relation between α and β is obtained:
ri+gi+bi=α(ri+gi+bi)+3β (2-6)
(3) Saturation of the point Ai(ri,gi,bi) of the original image is determined according to the third function f3(x). The following equation is obtained:
Si=1−3×min[ri,gi,bi]/(ri+gi+bi) (2-7)
(4) The saturation Si of the original image is processed according to the second function f2(x) to obtain a processed saturation S0 as below:
S0=1−3×(α·min[ri,gi,bi]+β)/(ri+gi+bi) (2-8)
Accordingly, values of α and β can be obtained according to equations (2-6) and (2-8).
In equation (2-8), let f2(xi)=1 to determine values of r0,g0,b0 for the point A0(r0,g0,b0). According to the magnitudes of the obtained values of r0,g0,b0, a corresponding color space is selected to process the point Ai.
If max(r0,g0,b0)≤1, the point Ai is processed in the RGB space using the second function f2(x). If max(r0,g0,b0)>1, the point Ai is converted into CMY color space using a fourth function f4(x) and processed in the CMY color space.
(5) According to the fourth function f4(x), the point Ai(ri,gi,bi) is processed in the CMY color space to obtain a point Ai(ci,mi,yi) in the CMY color space as below:
ci=1−ri (2-9)
mi=1−gi (2-10)
yi=1−bi (2-11)
(6) The first function f1(x)=αx+β is used to stretch (α) and translate (β) the point Ai(ci,mi,yi) in the CMY color space to obtain a processed point A0(c0,m0,y0) as below:
c0=αci+β (2-12)
m0=αmi+β (2-13)
y0=αyi+β (2-14)
(7) The saturation Si of the original image is processed according to the second function f2(x) to obtain a processed saturation S0 as below:
S0=1−3×(α·min[ci,mi,yi]+β)/(ci+mi+yi) (2-15)
(8) The point A0(c0,m0,y0) is converted from the CMY color space into the RGB color space using the fourth function f4(x), as below:
r0=1−c0 (2-16)
g0=1−m0 (2-17)
b0=1−y0 (2-18)
Accordingly, the afore-described method can filter points of the original image and perform space transformation for the points that may be out of boundary to solve the out-of-boundary issue.
The present disclosure provides an image processing method and device. Each component of a certain point of an original image is processed in RGB color space. The points of the original image in the RGB color space are filtered according to the processed results. By performing a space transformation for the points that may go out of the boundary and converting them from the RGB color space into CMY color space, the present disclosure can efficiently solve the image distortion issue caused by values going out of the boundaries of the color space, ensure an unchanged color tone, and improve display quality. Also, image saturation is enhanced directly in the RGB color space, which improves computational efficiency.
Above all, while the preferred embodiments of the present disclosure have been illustrated and described in detail, various modifications and alterations can be made by persons skilled in this art. The embodiments of the present disclosure are therefore described in an illustrative but not restrictive sense. It is intended that the present disclosure should not be limited to the particular forms as illustrated, and that all modifications and alterations which maintain the spirit and scope of the present disclosure are within the scope as defined in the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
201710760193.4 | Aug 2017 | CN | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/CN2017/114520 | 12/5/2017 | WO | 00 |