SYSTEM, METHOD AND COMPUTER PROGRAM PRODUCT FOR COLOR PROCESSING OF POINT-OF-INTEREST COLOR

Information

  • Patent Application
  • Publication Number: 20120176395
  • Date Filed: January 06, 2011
  • Date Published: July 12, 2012
Abstract
Methods and systems to manipulate color processing parameters to allow the detection of an arbitrary color of interest. Such reconfigurations may enable general point-of-interest color processing. Color mapping curves may also be configured, to accomplish the tasks of color correction, enhancement, de-saturation, and color compression.
Description
BACKGROUND

The processing of point-of-interest color has been a pervasive problem in graphics processing. This problem often arises in the processing of skin tone colors, where off-hue skin tones need to be corrected while maintaining a hue shift for non-skin tone colors. In addition to correction, the enhancement of a color of interest is also a common problem.


Existing technology attempts to determine the possibility of a pixel being a skin color pixel, and to enhance the color intensity of the pixel according to its likelihood of being a skin color. This process of saturation enhancement, however, cannot necessarily correct a certain color when it is off-hue. The skin tone correction test that is often used is not generally extendable to an arbitrary color.





BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES


FIG. 1 illustrates a color cluster in a three-dimensional (3D) YUV space, where a color of interest may be located at the center of the cluster.



FIG. 2 is a flowchart illustrating the process of determining the likelihood that the color of a pixel is found at the center of a color cluster, according to an embodiment.



FIG. 3 illustrates rectangular and diamond-shaped projections of a color cluster on a hue-saturation (HS) plane, according to an embodiment.



FIG. 4 is a flowchart illustrating the process of determining the likelihood that the color of a pixel is a point-of-interest color in the UV plane, according to an embodiment.



FIG. 5 illustrates two piece-wise linear functions (PWLFs) that may be used to approximate the projection of a color cluster on the YV plane, according to an embodiment.



FIG. 6 is a flowchart illustrating the process of determining the likelihood that the color of a pixel is a point-of-interest color in the YV plane, according to an embodiment.



FIG. 7 illustrates a PWLF that may be used to approximate the projection of a color cluster on the YU plane, according to an embodiment.



FIG. 8 is a flowchart illustrating the process of determining the likelihood that the color of a pixel is a point-of-interest color in the YU plane, according to an embodiment.



FIG. 9 illustrates examples of functions that may be constrained for purposes of adjustment of saturation, hue, or for color compression, according to an embodiment.



FIG. 10 is a flowchart illustrating the process of adjusting the saturation of a pixel, according to an embodiment.



FIG. 11 is a flowchart illustrating the process of adjusting the hue of a pixel, according to an embodiment.



FIG. 12 is a flowchart illustrating the process of performing color compression for a pixel, according to an embodiment.



FIG. 13 is a block diagram illustrating the structural modules of an implementation of an embodiment of the invention.



FIG. 14 is a block diagram illustrating the structural modules of a software or firmware implementation of an embodiment of the invention.





DETAILED DESCRIPTION

An embodiment is now described with reference to the figures, where like reference numbers indicate identical or functionally similar elements. Also in the figures, the leftmost digit of each reference number corresponds to the figure in which the reference number is first used. While specific configurations and arrangements are discussed, it should be understood that this is done for illustrative purposes only. A person skilled in the relevant art will recognize that other configurations and arrangements can be used without departing from the spirit and scope of the description. It will be apparent to a person skilled in the relevant art that this can also be employed in a variety of other systems and applications other than what is described herein.


In the system described herein, color processing parameters may be manipulated to allow the detection of an arbitrary color of interest. Such reconfigurations may enable general point-of-interest color processing. Color mapping curves may also be configured, to accomplish the tasks of color correction, enhancement, de-saturation, and color compression.



FIG. 1 shows a cluster of colors in a three-dimensional (3D) YUV domain where the center of the cluster may be considered as the point-of-interest color. The specification of such a 3D cluster may be accomplished by taking the intersection of its projections onto the three planes (i.e., UV, YV and YU).


The process of detecting a point-of-interest color is illustrated in FIG. 2, according to an embodiment. At 210, the likelihood that an input pixel p is the point-of-interest color in the UV plane (Uc, Vc) may be determined. This probability is shown as likelihoodpUV. At 220, the likelihood that an input pixel p is the point-of-interest color (Yc, Vc) in the YV plane may be determined. This probability is shown as likelihoodpYV. At 230, the likelihood that an input pixel p is the point-of-interest color (Yc, Uc) in the YU plane may be determined. This latter probability may be simplified to a one-dimensional projection over the Y-axis in the illustrated embodiment, as will be described below. This probability is shown as likelihoodpY. At 240, the likelihood that the pixel p is the point-of-interest color at the center of the color cluster may be determined. This probability is shown as likelihoodp.
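For illustration only, the overall flow of FIG. 2 may be sketched in Python as follows. This is a minimal sketch, not the claimed implementation; the three per-plane likelihood functions are assumed to be supplied by the caller under hypothetical names.

```python
# Minimal sketch of the FIG. 2 flow: the overall likelihood is the minimum of
# the three per-plane soft decisions. The three callables are hypothetical
# placeholders supplied by the caller; each maps a (Y, U, V) pixel to a value
# in [0, 1].

def point_of_interest_likelihood(pixel, likelihood_uv, likelihood_yv, likelihood_y):
    """Return likelihood_p for a (Y, U, V) pixel tuple."""
    return min(likelihood_uv(pixel), likelihood_yv(pixel), likelihood_y(pixel))
```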


The generation of these likelihoods may be described as follows. While the distribution of a point-of-interest color can result in an arbitrary projection shape on the UV plane, a rectangle and a diamond shape in a transformed coordinate plane, i.e., the hue-saturation (HS) plane, may be utilized to approximate the shape of the projection in order to facilitate its detection in the UV plane:










$$
\begin{bmatrix} S \\ H \end{bmatrix}
=
\begin{bmatrix} \cos\theta_c & -\sin\theta_c \\ \sin\theta_c & \cos\theta_c \end{bmatrix}
\begin{bmatrix} U - U_c \\ V - V_c \end{bmatrix}.
\qquad (1)
$$







(Uc, Vc) and θc in the above equation represent the projection of the 3D distribution center c (i.e., the point-of-interest color) on the UV plane and the orientation angle of this projection. As shown in eq. 1, the values U and V may be shifted initially by Uc and Vc respectively. The processing of eq. 1 may create S and H coordinates that correspond to the input coordinates (U, V), where the projection may be rotated by the angle θc. The likelihood of a pixel being a point-of-interest color pixel in the UV plane may decrease with the distance from (Uc, Vc):











$$
\mathrm{likelihood}^{UV}_p = \min\left(R_{factor},\ D_{factor}\right), \qquad (2)
$$

$$
R_{factor} =
\begin{cases}
\min\!\left[\dfrac{H_{max} - |H_p|}{2^{\,B_{margin}-5}},\ \dfrac{S_{max} - |S_p|}{2^{\,B_{margin}-5}}\right], & \text{for all } |H_p| < H_{max}\ \&\ |S_p| < S_{max}\\[2ex]
0, & \text{otherwise}
\end{cases}
$$

$$
D_{factor} =
\begin{cases}
1, & \text{for all } dist < (D_L - D_{margin})\\[0.5ex]
\dfrac{D_L - dist}{2^{\,B_{d\_margin}-5}}, & \text{for all } (D_L - D_{margin}) \le dist < D_L\\[2ex]
0, & \text{otherwise}
\end{cases}
\qquad (3)
$$

$$
dist = \left|S_p - dS\right| + \left(1/\tan\beta\right)\left|H_p - dH\right| \qquad (4)
$$







Rfactor and Dfactor in eq. 2-4 represent the soft decision, or likelihood, of an input pixel p being the point-of-interest color (Uc, Vc), determined from the rectangular and diamond-shaped projections respectively. (Hp, Sp) in eq. 3 and 4 is the input pixel p in the HS plane, while Hmax, Smax, and Bmargin, together with DL, Dmargin, Bd_margin, dH, dS, and β, are parameters specifying the ranges of the rectangle and diamond shapes as shown in FIG. 3.


The process for determining the likelihood that p is the color of interest in the UV plane is illustrated in FIG. 4, according to an embodiment. At 410, the color cluster, centered at the point-of-interest color, may be projected on the UV plane. At 420, this projection may be approximated in the HS plane. At 430, the likelihood that the pixel p is the point-of-interest color may be determined from the rectangular-shaped projection, as described in eq. 3 above. This probability is shown as Rfactor. At 440, the likelihood that the pixel p is the point-of-interest color may be determined from the diamond-shaped projection, as described in eqs. 3 and 4 above. This probability is shown as Dfactor. At 450, the likelihood that the pixel p is the point-of-interest color in the UV plane may be determined, likelihoodpUV=min(Rfactor, Dfactor).
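As a concrete, non-normative illustration of eqs. 1-4, the following Python sketch shifts and rotates a (U, V) sample into the HS plane and combines the rectangle and diamond soft decisions. The parameter names mirror the symbols above; the clamping of each factor to [0, 1] reflects one plausible reading of the equations and is an assumption, not a definitive implementation.

```python
import math

def likelihood_uv(u, v, u_c, v_c, theta_c_deg,
                  h_max, s_max, b_margin,
                  d_l, d_margin, b_d_margin,
                  d_h=0.0, d_s=0.0, beta_deg=45.0):
    """Soft decision that (u, v) belongs to the cluster centered at (u_c, v_c).

    Follows eqs. 1-4: shift by the cluster center, rotate by theta_c into the
    hue-saturation (HS) plane, then take the minimum of the rectangle factor
    and the diamond factor (eq. 3).
    """
    theta = math.radians(theta_c_deg)
    du, dv = u - u_c, v - v_c
    # Eq. 1: rotate the shifted coordinates into the HS plane.
    s = math.cos(theta) * du - math.sin(theta) * dv
    h = math.sin(theta) * du + math.cos(theta) * dv

    # Eq. 3, rectangle factor: ramps down toward 0 near the rectangle boundary.
    if abs(h) < h_max and abs(s) < s_max:
        r_factor = min((h_max - abs(h)) / 2 ** (b_margin - 5),
                       (s_max - abs(s)) / 2 ** (b_margin - 5))
        r_factor = min(r_factor, 1.0)          # assumed clamp to [0, 1]
    else:
        r_factor = 0.0

    # Eq. 4: L1-style distance defining the diamond shape.
    dist = abs(s - d_s) + (1.0 / math.tan(math.radians(beta_deg))) * abs(h - d_h)

    # Eq. 3, diamond factor: 1 inside the margin, ramp near the edge, else 0.
    if dist < d_l - d_margin:
        d_factor = 1.0
    elif dist < d_l:
        d_factor = min((d_l - dist) / 2 ** (b_d_margin - 5), 1.0)
    else:
        d_factor = 0.0

    # Eq. 2: combine the two soft decisions.
    return min(r_factor, d_factor)
```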


The same method of utilizing a rectangle in conjunction with a diamond shape to represent the projection of a 3D color cluster on a 2D plane may be applied to the soft decisions in the YV and YU planes. Nevertheless, accurate modeling of the projection on the UV plane may contribute more to the detection of a point-of-interest color than the projections on the YV and YU planes. This leads to a simplification of the detection process on the YV and YU planes: the projection on the YV plane may be approximated by the use of two piecewise linear functions (PWLFs), while the projection on the YU plane may be further simplified to a one-dimensional projection over the Y axis.


The likelihood in the YV plane may then be calculated as





likelihoodpYV=min(detL, detU)  (5)


where










$$
det_L =
\begin{cases}
1, & \text{for all } V_p > \left[f_L(Y_p) + M_{VY\_L}\right]\\[0.5ex]
\left[V_p - f_L(Y_p)\right] / M_{VY\_L}, & \text{for all } f_L(Y_p) \le V_p \le \left[f_L(Y_p) + M_{VY\_L}\right]\\[0.5ex]
0, & \text{otherwise,}
\end{cases}
\qquad (6)
$$

and

$$
det_U =
\begin{cases}
1, & \text{for all } V_p < \left[f_U(Y_p) - M_{VY\_U}\right]\\[0.5ex]
\left[f_U(Y_p) - V_p\right] / M_{VY\_U}, & \text{for all } \left[f_U(Y_p) - M_{VY\_U}\right] \le V_p \le f_U(Y_p)\\[0.5ex]
0, & \text{otherwise.}
\end{cases}
\qquad (7)
$$







Here, Yp and Vp represent the Y and V values of the input pixel p, respectively. MVY_U and MVY_L are parameters that specify the confidence margins in the YV plane. fL and fU are the two PWLFs mapping Y to V, and each of them may be composed of four anchor points as shown in FIG. 5.


The process for determining likelihood in the YV plane is illustrated in FIG. 6, according to an embodiment. At 610, the functions fL and fU may be defined. At 620, the value detL may be computed according to eq. 6 above; at 630, the value detU may be computed according to eq. 7 above. At 640, the likelihood that the pixel p is the point-of-interest color in the YV plane may be determined, likelihoodpYV=min(detL, detU).
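A sketch of eqs. 5-7 follows, again for illustration only. The lower and upper bounds fL and fU are represented as four-anchor-point piecewise linear functions evaluated with numpy.interp; the anchor points and margins shown in the usage comment are made-up values, not figures from this description.

```python
import numpy as np

def likelihood_yv(y_p, v_p, f_l_anchors, f_u_anchors, m_vy_l, m_vy_u):
    """Soft decision in the YV plane per eqs. 5-7.

    f_l_anchors and f_u_anchors are (y_values, v_values) pairs describing the
    four-anchor-point PWLFs fL and fU of FIG. 5.
    """
    f_l = float(np.interp(y_p, *f_l_anchors))   # lower bound fL(Yp)
    f_u = float(np.interp(y_p, *f_u_anchors))   # upper bound fU(Yp)

    # Eq. 6: detL ramps from 0 at fL(Yp) up to 1 at fL(Yp) + M_VY_L.
    if v_p > f_l + m_vy_l:
        det_l = 1.0
    elif f_l <= v_p <= f_l + m_vy_l:
        det_l = (v_p - f_l) / m_vy_l
    else:
        det_l = 0.0

    # Eq. 7: detU ramps from 1 at fU(Yp) - M_VY_U down to 0 at fU(Yp).
    if v_p < f_u - m_vy_u:
        det_u = 1.0
    elif f_u - m_vy_u <= v_p <= f_u:
        det_u = (f_u - v_p) / m_vy_u
    else:
        det_u = 0.0

    # Eq. 5: combine the two soft decisions.
    return min(det_l, det_u)

# Example with made-up anchor points (Y values must be increasing for np.interp):
# likelihood_yv(120, 140, ([16, 80, 160, 235], [120, 130, 135, 128]),
#                         ([16, 80, 160, 235], [150, 165, 170, 160]), 8, 8)
```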


The likelihood over the 1D Y axis may be given by a PWLF,





likelihoodpY=fms(Yp)  (8)


where fms: Y→ms, and ms ∈ [0,1]. The PWLF fms may be controlled by four anchor points, as shown in FIG. 7. Yp in the above equation is the Y value of the pixel p.


A process for determining the likelihood of the pixel p in the YU plane, as approximated by a projection on the Y axis, is illustrated in FIG. 8, according to an embodiment. At 810, the function fms may be defined. At 820, the likelihood that pixel p is the point-of-interest color in the YU plane may be approximated by the likelihood that Yp has the appropriate value along the Y axis, likelihoodpYU = likelihoodpY = fms(Yp).
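The 1D likelihood of eq. 8 reduces to evaluating a four-anchor-point PWLF; a minimal sketch with hypothetical anchor values follows.

```python
import numpy as np

def likelihood_y(y_p, anchor_y, anchor_ms):
    """Eq. 8: likelihood_p^Y = f_ms(Yp), where f_ms is a four-anchor PWLF
    mapping Y to a confidence ms in [0, 1]."""
    return float(np.interp(y_p, anchor_y, anchor_ms))

# Hypothetical anchors: full confidence for mid-range luma, tapering at the ends.
# likelihood_y(100, [16, 60, 180, 235], [0.0, 1.0, 1.0, 0.0])  -> 1.0
```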


The likelihood that the input pixel p is a pixel with the point-of-interest color may then be determined by





likelihoodp=min(likelihoodpUV, likelihoodpYV, likelihoodpY)  (9)


As an example, a skin tone detection module may be set to have (Uc, Vc)=(110, 154) for 8-bit input data, where θc may be approximately 128° and (dS, dH)=(0, 0). Such a module may use similar size settings for the rectangle and the diamond shape to approximate an ellipse-like skin tone (ST) projection on the UV plane. In an embodiment, (Uc, Vc) may be considered to be the skin tone with the greatest confidence level, and may reflect the exact target color in the above point-of-interest color detector.


To map θc in the above point-of-interest color detector, the angle θc may be decomposed into a hue angle of the exact point-of-interest color c (i.e., θhc) and a hue angle offset (i.e., θoffset):





θc = θhc + θoffset  (10)


where







$$
\theta_{hc} =
\begin{cases}
\tan^{-1}\!\left[\dfrac{(V_{c\_8bit} - 128)/224}{(U_{c\_8bit} - 128)/224}\right], & \text{for all } U_{c\_8bit} > 128\ \&\ V_{c\_8bit} > 128\\[2ex]
180^\circ + \tan^{-1}\!\left[\dfrac{(V_{c\_8bit} - 128)/224}{(U_{c\_8bit} - 128)/224}\right], & \text{for all } U_{c\_8bit} < 128\\[2ex]
360^\circ + \tan^{-1}\!\left[\dfrac{(V_{c\_8bit} - 128)/224}{(U_{c\_8bit} - 128)/224}\right], & \text{for all } U_{c\_8bit} > 128\ \&\ V_{c\_8bit} < 128\\[2ex]
0, & \text{otherwise.}
\end{cases}
\qquad (11)
$$








When the dissimilarity between an input pixel p and the exact color point c in hue and saturation is treated equally in the positive and negative directions in the calculation of likelihood, the result is a horizontally or vertically (or both) symmetric projection shape aligned with the direction of c. With a non-zero θoffset (approximately 3°), a color detection module may detect a tilted ST-projected UV distribution, which implies non-equal treatment of the directional hue change from the skin color center. In other words, θoffset may be viewed as a parameter granting freedom in the orientation of the distribution of any color cluster.
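The quadrant-by-quadrant arctangent of eq. 11 can be expressed compactly with a two-argument arctangent; the sketch below is an equivalent formulation under that assumption and returns the hue angle in degrees in [0°, 360°).

```python
import math

def hue_angle_of_center(u_c_8bit, v_c_8bit):
    """Eq. 11: hue angle of the exact point-of-interest color, in degrees.

    The 8-bit chroma values are recentered at 128 and normalized by 224;
    math.atan2 resolves the quadrant, matching the piecewise arctangent cases.
    """
    u = (u_c_8bit - 128) / 224.0
    v = (v_c_8bit - 128) / 224.0
    if u == 0.0 and v == 0.0:
        return 0.0                       # degenerate center, no defined hue
    return math.degrees(math.atan2(v, u)) % 360.0

def theta_c(u_c_8bit, v_c_8bit, theta_offset_deg=0.0):
    """Eq. 10: theta_c = theta_hc + theta_offset."""
    return hue_angle_of_center(u_c_8bit, v_c_8bit) + theta_offset_deg

# Example with the skin tone setting mentioned above:
# theta_c(110, 154, 3.0) is roughly 124.7 + 3 degrees.
```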


The value likelihoodp may have several uses. As noted above, this value may be used to detect a point-of-interest color. It may also be used for purposes of color adjustment, such as manipulation of saturation and hue. This value may also be used for color compression.


To accomplish this, two PWLFs may be defined, denoted gmaph and gmaps; an input pixel p may then be adjusted in the HS plane according to its likelihood of being the target color:










$$
H_{p\_out} =
\begin{cases}
H_p + \Delta H_p \times likelihood_p, & |H_p| \le H_{max},\quad \Delta H_p = \left[g_{map\_h}(H_p) - H_p\right]\\[0.5ex]
H_p, & \text{otherwise}
\end{cases}
$$

$$
S_{p\_out} =
\begin{cases}
S_p + \Delta S_p \times likelihood_p, & |S_p| \le S_{max},\quad \Delta S_p = \left[g_{map\_s}(S_p) - S_p\right]\\[0.5ex]
S_p, & \text{otherwise}
\end{cases}
\qquad (12)
$$







where gmaph: H→H and gmaps: S→S. Examples of these functions are shown in FIG. 9, according to an embodiment. The pixel may then be transformed from the HS domain back into the UV domain. If θoffset is set to zero for point-of-interest color detection, adjustment of the saturation of a color pixel while keeping its hue close to the original value may be provided by setting gmaph and gmaps with the following constraints:









$$
\begin{cases}
g_{map\_h}(H_p) \approx H_p, & \text{for all } 0 \le H_p \le H'_{+},\ H'_{+} < H_{max}\\[0.5ex]
g_{map\_h}(H_p) \approx H_p, & \text{for all } (-H'_{-}) \le H_p \le 0,\ H'_{-} < H_{max}\\[0.5ex]
g_{map\_s}(S_p) \ge S_p, & \text{for all } |S_p| \le S_{max}
\end{cases}
\qquad (13)
$$







Since gmaph and gmaps have been confined with gmaph(±Hmax)=±Hmax and gmaps(±Smax)=±Smax, the effective range of saturation enhancement may be specified by (H′−, H′+, Smax) in eq. (13).


The functionality of de-saturation may be achieved by applying the following constraints to gmaph and gmaps:









$$
\begin{cases}
g_{map\_h}(H_p) \approx H_p, & \text{for all } 0 \le H_p \le H_{max}\\[0.5ex]
g_{map\_h}(H_p) \approx H_p, & \text{for all } (-H_{max}) \le H_p \le 0\\[0.5ex]
g_{map\_s}(S_p) \le S_p, & \text{for all } |S_p| \le S_{max}
\end{cases}
\qquad (14)
$$







The processing for saturation adjustment is illustrated in FIG. 10, according to an embodiment. At 1010, the likelihood that an input pixel p is the point-of-interest color (Uc, Vc) in the UV plane may be determined. This probability is shown as likelihoodpUV. At 1020, the likelihood that an input pixel p is the point-of-interest color (Yc, Vc) in the YV plane may be determined. This probability is shown as likelihoodpYV. At 1030, the likelihood that an input pixel p is the point-of-interest color (Yc, Uc) in the YU plane may be determined. This probability may be simplified to a one-dimensional projection over the Y-axis in the illustrated embodiment. This probability is shown as likelihoodpY. At 1040, the likelihood that the pixel p is the point-of-interest color at the center of the color cluster may be determined. This probability is shown as likelihoodp. At 1050, color saturation at p may be adjusted by constraining the functions gmaph and gmaps.
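To make the mapping step concrete, the sketch below applies the eq. 12 update to the S coordinate with a gmap_s satisfying the eq. 13 constraint gmap_s(Sp) ≥ Sp over |Sp| ≤ Smax and pinned at gmap_s(±Smax) = ±Smax. The anchor-point values are illustrative assumptions, not values taken from this description.

```python
import numpy as np

def adjust_saturation(s_p, likelihood_p, s_max):
    """Eq. 12 applied to the S coordinate, with a gmap_s chosen to satisfy the
    eq. 13 constraint gmap_s(Sp) >= Sp for |Sp| <= Smax while keeping
    gmap_s(+/-Smax) = +/-Smax. The anchor points below are illustrative only.
    """
    if abs(s_p) > s_max:
        return s_p                          # outside the mapped range: unchanged

    # Illustrative 4-anchor PWLF with gmap_s(s) >= s on [-Smax, Smax].
    anchors_in = [-s_max, -0.5 * s_max, 0.5 * s_max, s_max]
    anchors_out = [-s_max, -0.25 * s_max, 0.75 * s_max, s_max]
    g_s = float(np.interp(s_p, anchors_in, anchors_out))

    delta_s = g_s - s_p                     # Delta S_p in eq. 12
    return s_p + delta_s * likelihood_p     # blend by the detection likelihood
```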


Color correction for hue may also be performed. In an embodiment, this may be achieved by setting gmaph and gmaps with the following constraints:









$$
\begin{cases}
g_{map\_h}(H_p) \le H_p, & \text{for all } 0 \le H_p \le H'_{+},\ H'_{+} < H_{max}\\[0.5ex]
g_{map\_h}(H_p) \ge H_p, & \text{for all } (-H'_{-}) \le H_p \le 0,\ H'_{-} < H_{max}\\[0.5ex]
g_{map\_s}(S_p) \approx S_p, & \text{for all } |S_p| \le S_{max}
\end{cases}
\qquad (15)
$$







The processing for hue adjustment is illustrated in FIG. 11, according to an embodiment. At 1110, the likelihood that an input pixel p is the point-of-interest color (Uc, Vc) in the UV plane may be determined. This probability is shown as likelihoodpUV. At 1120, the likelihood that an input pixel p is the point-of-interest color (Yc, Vc) in the YV plane may be determined. This probability is shown as likelihoodpYV. At 1130, the likelihood that an input pixel p is the point-of-interest color (Yc, Uc) in the YU plane may be determined. This probability may be simplified to a one-dimensional projection over the Y-axis in the illustrated embodiment. This probability is shown as likelihoodpY. At 1140, the likelihood that the pixel p is the point-of-interest color at the center of the color cluster may be determined. This probability is shown as likelihoodp. At 1150, hue at p may be adjusted by constraining the functions gmaph and gmaps as shown above in eq. 15.
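For completeness, one illustrative gmap_h consistent with the eq. 15 constraints (hue pulled toward zero inside ±H′±, fixed at ±Hmax) is sketched below; the `pull` factor and the anchor layout are arbitrary assumptions.

```python
import numpy as np

def gmap_h_correct(h_p, h_minus, h_plus, h_max, pull=0.5):
    """Illustrative gmap_h for hue correction (eq. 15): inside [-H'-, H'+] the
    hue is pulled toward 0 by the factor `pull`, so gmap_h(Hp) <= Hp for
    positive Hp and >= Hp for negative Hp, with gmap_h(+/-Hmax) = +/-Hmax.
    """
    anchors_in = [-h_max, -h_minus, h_plus, h_max]
    anchors_out = [-h_max, -h_minus * (1 - pull), h_plus * (1 - pull), h_max]
    return float(np.interp(h_p, anchors_in, anchors_out))
```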


Moreover, the functions gmaph and gmaps may be constrained for purposes of color compression (i.e., moving a pixel in a local neighborhood towards the central color) by establishing flat central intervals, as follows:









$$
\begin{cases}
g_{map\_h}(H_p) \le H_p, & \text{for all } 0 \le H_p \le H'_{+},\ H'_{+} < H_{max}\\[0.5ex]
g_{map\_h}(H_p) \ge H_p, & \text{for all } (-H'_{-}) \le H_p \le 0,\ H'_{-} < H_{max}\\[0.5ex]
g_{map\_s}(S_p) \le S_p, & \text{for all } 0 \le S_p \le S'_{+},\ S'_{+} < S_{max}\\[0.5ex]
g_{map\_s}(S_p) \ge S_p, & \text{for all } (-S'_{-}) \le S_p \le 0,\ S'_{-} < S_{max}.
\end{cases}
\qquad (16)
$$







Here the values S′− and S′+ are user-controllable parameters specifying the range covered for color compression.


The processing for color compression is illustrated in FIG. 12, according to an embodiment. At 1210, the likelihood that an input pixel p is the point-of-interest color (Uc, Vc) in the UV plane may be determined. This probability is shown as likelihoodpUV. At 1220, the likelihood that an input pixel p is the point-of-interest color (Yc, Vc) in the YV plane may be determined. This probability is shown as likelihoodpYV. At 1230, the likelihood that an input pixel p is the point-of-interest color (Yc, Uc) in the YU plane may be determined. This probability may be simplified to a one-dimensional projection over the Y-axis in the illustrated embodiment. This probability is shown as likelihoodpY. At 1240, the likelihood that the pixel p is the point-of-interest color at the center of the color cluster may be determined. This probability is shown as likelihoodp. At 1250, color compression for p may be performed by constraining the functions gmaph and gmaps according to eq. 16 above.
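One way to realize the "flat central interval" for the S axis is a PWLF that maps the whole interval [−S′−, S′+] to zero, i.e., onto the central color, while leaving the endpoints ±Smax fixed. The sketch below is such an illustrative gmap_s; the specific S′± and Smax values in the usage comment are arbitrary.

```python
import numpy as np

def gmap_s_compress(s_p, s_minus, s_plus, s_max):
    """Illustrative gmap_s for color compression (eq. 16): a flat central
    interval maps every S in [-S'-, S'+] to 0 (the central color), which
    satisfies gmap_s(Sp) <= Sp for 0 <= Sp <= S'+ and gmap_s(Sp) >= Sp for
    (-S'-) <= Sp <= 0, while keeping gmap_s(+/-Smax) = +/-Smax.
    """
    anchors_in = [-s_max, -s_minus, s_plus, s_max]
    anchors_out = [-s_max, 0.0, 0.0, s_max]
    return float(np.interp(s_p, anchors_in, anchors_out))

# With likelihood_p from the detector, the compressed output follows eq. 12:
# s_out = s_p + (gmap_s_compress(s_p, 10, 10, 60) - s_p) * likelihood_p
```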


The processing described above may be implemented using structural modules as illustrated in FIG. 13, according to an embodiment. Module 1303 may represent a point-of-interest color detector, and module 1306 may represent a module that performs color enhancement, correction, and/or compression. Within module 1303, a module 1310 may receive an input pixel p, where the color of p is specified in Y, U, V coordinates. Module 1310 may also receive a definition of a color of interest, (Uc, Vc). Module 1310 may perform a shift of (U, V) by (Uc, Vc), resulting in coordinates (Y, U−Uc, V−Vc). At module 1320, corresponding coordinates (Y, S, H) may be produced, according to the processing described above with respect to eq. 1.


Module 1330 may receive the coordinates (Y, S, H) and determine the likelihood that the pixel p is located at a specified location in the UV plane. To accomplish this, module 1330 may use the values (Smax, Hmax) and (DL, dS, dH). At module 1340, the likelihood that p is located at a location on the Y axis may be determined, using the function fms. At module 1350, the likelihood that p is located at a location in the YV plane may be determined, using the PWLFs fL and fU. Note that in the embodiment of FIG. 13, the likelihood of the pixel in the UV plane may be determined, followed by the determination with respect to the Y axis, followed by the determination with respect to the YV plane. This sequence is not meant to be limiting; in an alternative embodiment, the sequence of these operations may differ.


The resulting likelihoodp may then be generated and sent to a module 1360, where the saturation (or de-saturation) or hue may be adjusted, and/or where color compression may be performed. In the illustrated embodiment, the operation of module 1360 uses the values ΔSp and ΔHp. These latter values may be generated at module 1370, using the PWLFs gmaph and gmaps as described above with respect to eq. 12.


The new coordinates that result from the operation of module 1360 may then be passed to module 1380. Here, the rotation of module 1320 may be undone. At module 1390, a shift of the U and V coordinates may be performed, to account for the shift performed at module 1310. This may result in a final output as shown in the embodiment of FIG. 13.
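Putting the pieces together, the FIG. 13 data path might be sketched as follows: shift and rotate into (Y, S, H), compute the overall likelihood, adjust S and H via the gmap functions, then rotate and shift back to (Y, U, V). The callables passed in are hypothetical helpers supplied by the caller (for example, the sketches earlier in this description adapted to operate on the rotated coordinates); the composition of the steps, not the specific parameter values, is what the sketch illustrates.

```python
import math

def process_pixel(y, u, v, u_c, v_c, theta_c_deg,
                  likelihood_fn, gmap_h, gmap_s, h_max, s_max):
    """Illustrative FIG. 13 data path: detect, adjust in HS, transform back."""
    theta = math.radians(theta_c_deg)
    du, dv = u - u_c, v - v_c                         # module 1310: shift
    s = math.cos(theta) * du - math.sin(theta) * dv   # module 1320: rotate (eq. 1)
    h = math.sin(theta) * du + math.cos(theta) * dv

    likelihood_p = likelihood_fn(y, s, h)             # modules 1330-1350 combined

    # Module 1360: eq. 12 update, gated by the per-axis ranges.
    h_out = h + (gmap_h(h) - h) * likelihood_p if abs(h) <= h_max else h
    s_out = s + (gmap_s(s) - s) * likelihood_p if abs(s) <= s_max else s

    # Modules 1380/1390: undo the rotation, then undo the shift.
    u_out = math.cos(theta) * s_out + math.sin(theta) * h_out + u_c
    v_out = -math.sin(theta) * s_out + math.cos(theta) * h_out + v_c
    return y, u_out, v_out
```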


One or more features disclosed herein, including the modules shown in FIG. 13, may be implemented in hardware, software, firmware, and combinations thereof, including discrete and integrated circuit logic, application specific integrated circuit (ASIC) logic, and microcontrollers, and may be implemented as part of a domain-specific integrated circuit package, or a combination of integrated circuit packages. The term software, as used herein, refers to a computer program product including a computer readable medium having computer program logic stored therein to cause a computer system to perform one or more features and/or combinations of features disclosed herein.


A software or firmware embodiment of the processing described above is illustrated in FIG. 14. System 1400 may include a programmable processor 1420 and a body of memory 1410 that may include one or more computer readable media that store computer program logic 1440. Memory 1410 may be implemented as one or more of a hard disk and drive, removable media such as a compact disk and drive, flash memory, or a random access memory (RAM) or read-only memory (ROM) device, for example. Processor 1420 and memory 1410 may be in communication using any of several technologies known to one of ordinary skill in the art, such as a bus. Processor 1420 may be a special purpose graphics processor or a general purpose processor being used as a graphics processor. Logic contained in memory 1410 may be read and executed by processor 1420. One or more I/O ports and/or I/O devices, shown collectively as I/O 1430, may also be connected to processor 1420 and memory 1410.


In an embodiment, computer program logic 1440 may include the logic modules 1450 and 1460. Point-of-interest color detection logic 1450 may be responsible for the processing described above with respect to reference 1303 of FIG. 13, as well as FIGS. 2, 4, 6 and 8. Color enhancement/correction/compression logic module 1460 may be responsible for the processing described above with respect to reference 1306 of FIG. 13, as well as FIGS. 10-12.


Methods and systems are disclosed herein with the aid of functional building blocks illustrating the functions, features, and relationships thereof. At least some of the boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries may be defined so long as the specified functions and relationships thereof are appropriately performed.


While various embodiments are disclosed herein, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail may be made therein without departing from the spirit and scope of the methods and systems disclosed herein. Thus, the breadth and scope of the claims should not be limited by any of the exemplary embodiments disclosed herein.

Claims
  • 1. A method, comprising: in a graphics processor, determining the likelihood (likelihoodpUV) that a pixel p is an arbitrary point-of-interest color in a UV plane;determining the likelihood (likelihoodpYV) that pixel p is the point-of-interest color in a YV plane;determining the likelihood (likelihoodpY) that pixel p is the point-of-interest color along a Y axis; anddetermining the likelihood (likelihoodp) that pixel p is the point-of-interest color at the center of a color cluster, likelihoodp=min(likelihoodpUV, likelihoodpYV, likelihoodpY).
  • 2. The method of claim 1, wherein said determining of likelihoodpUV comprises: projecting the color cluster on the UV plane;approximating the projection in an hue-saturation (HS) plane;determining a likelihood Rfactor that the pixel p is the point-of-interest color from a rectangular projection in the HS plane, where
  • 3. The method of claim 1, wherein said determining of likelihoodpYV comprises: determining
  • 4. The method of claim 1, wherein said determining of likelihoodpY comprises: determining likelihood Y=fms(Yp),where fms is a four segment PWLF, fms: Y→ms, ms ε[1,0] and Yp is the Y value of the pixel p.
  • 5. The method of claim 1, further comprising: adjusting the saturation of the pixel p by constraining a piece-wise linear function (PWLF) gmap—s according to gmap—s(Sp)≧Sp, for all |Sp|≦Smax where Sp is the coordinate of the pixel p on the S axis and Smax is a parameter defining a maximum value for S in a projection of the color cluster in a hue-saturation (HS) plane; anddetermining an adjusted saturation
  • 6. The method of claim 1, further comprising: de-saturating the color of the pixel p by constraining a piece-wise linear function (PWLF) gmap—s according to gmap—s(Sp)≦Sp, for all |Sp|≦Smax where Sp is the coordinate of the pixel p on the S axis and Smax is a parameter defining a maximum value for S in a projection of the color cluster in a hue-saturation (HS) plane; anddetermining an adjusted saturation
  • 7. The method of claim 1, further comprising: correcting the color of the pixel p by constraining a piece-wise linear function (PWLF) gmap—h according to gmap—h(Hp)≦Hp, for all 0≦Hp≦H′+, H′+<Hmax gmap—h(Hp)≧Hp, for all (−H′−)≦Hp<0, H′−<Hmax where Hp is the coordinate of the pixel on the H axis, Hmax is a parameter defining a maximum value for H in a projection of the color cluster in a hue-saturation (HS) plane, and (H′−, H′+, Smax) define the effective range of saturation enhancement; anddetermining an adjusted hue
  • 8. The method of claim 1, further comprising: performing color compression by constraining a piece-wise linear function (PWLF) gmap—h according to gmap—h(Hp)≦Hp, for all 0≦Hp≦H′+<Hmax gmap—h(Hp)≧Hp, for all (−H′−)≦Hp≦0, H′−<Hmax where Hp is the coordinate of the pixel on the H axis, Hmax is a parameter defining a maximum value for H in a projection of the color cluster in a hue-saturation (HS) plane, and (H′−, H′+, Smax) define the effective range of saturation enhancement;constraining a PWLF gmap—s according to gmap—s(Sp)≦Sp, for all 0≦Sp≦S′+, S′+<Smax gmap—s(Sp)≧Sp, for all (−S′−)≦Sp≦0, S′−<Smax where Sp is the coordinate of the pixel on the S axis, Smax is a parameter defining a maximum value for S in the projection of the color cluster in the HS plane, and S′− and S′+ are user controllable parameters that specify the range covered for color compression;determining an adjusted saturation,
  • 9. A system, comprising: a processor; anda memory in communication with said processor, wherein said memory stores a plurality of processing instructions configured to direct said processor to determine the likelihood (likelihoodpUV) that a pixel p is an arbitrary point-of-interest color in a UV plane;determine the likelihood (likelihoodpYV) that pixel p is the point-of-interest color in a YV plane;determine the likelihood (likelihoodpY) that pixel p is the point-of-interest color along a Y axis; anddetermine the likelihood (likelihoodp) that pixel p is the point-of-interest color at the center of a color cluster, likelihoodp=min(likelihoodpUV, likelihoodpYV, likelihoodpY).
  • 10. The system of claim 9, wherein said processing instructions configured to direct said processor to determine likelihoodpUV comprises processing instructions configured to direct said processor to: project the color cluster on the UV plane;approximate the projection in an hue-saturation (HS) plane;determine a likelihood Rfactor that the pixel p is the point-of-interest color from a rectangular projection in the HS plane, where
  • 11. The system of claim 9, wherein said processing instructions configured to direct said processor to determine likelihoodpYV comprises processing instructions configured to direct said processor to: determine
  • 12. The system of claim 9, wherein said processing instructions configured to direct said processor to determine likelihoodpY comprise processing instructions configured to direct said processor to: determine likelihood Y=fms(Yp),where fms is a four segment PWLF, fms: Y→ms, ms ε[1,0] and Yp is the Y value of the pixel p.
  • 13. The system of claim 9, wherein said processing instructions further comprise processing instructions configured to direct said processor to: adjust the saturation of the pixel p by constraining a piece-wise linear function (PWLF) gmap—s according to gmap—s(Sp)≧Sp, for all |Sp|≦Smax where Sp is the coordinate of the pixel p on the S axis and Smax is a parameter defining a maximum value for S in a projection of the color cluster in a hue-saturation (HS) plane; anddetermining an adjusted saturation
  • 14. The system of claim 9, wherein said processing instructions further comprise processing instructions configured to direct said processor to: de-saturate the color of the pixel p by constraining a piece-wise linear function (PWLF) gmap—s according to gmap—s(Sp)≦Sp, for all |Sp≦Smax where Sp is the coordinate of the pixel p on the S axis and Smax is a parameter defining a maximum value for S in the projection of the color cluster in a hue-saturation (HS) plane; anddetermining an adjusted saturation
  • 15. The system of claim 9, wherein said processing instructions further comprise processing instructions configured to direct said processor to: correct the color of the pixel p by constraining a piece-wise linear function (PWLF) gmap—h according to gmap—h(Hp)≦Hp, for all 0≦Hp≦H′+, H′+<Hmax gmap—h(Hp)≧Hp, for all (−H′−)≦Hp≦0, H′−<Hmax where Hp is the coordinate of the pixel on the H axis, Hmax is a parameter defining a maximum value for H in a projection of the color cluster in a hue-saturation (HS) plane, and (H′−, H′+, Smax) define the effective range of saturation enhancement; anddetermine an adjusted hue
  • 16. The system of claim 9, wherein said processing instructions further comprise processing instructions configured to direct said processor to: perform color compression by constraining a piece-wise linear function (PWLF) gmap—h according to gmap—h(Hp)≦Hp, for all 0≦Hp≦H′+, H′+<Hmax gmap—h(Hp) ≧Hp, for all (−H′−)≦Hp≦0, H′−<Hmax where Hp is the coordinate of the pixel on the H axis, Hmax is a parameter defining a maximum value for H in a projection of the color cluster in a hue-saturation (HS) plane, and (H′−, Smax) define the effective range of saturation enhancement;constraining a PWLF gmap—s according to gmap—s(Sp)≦Sp, for all 0≦Sp≦S′+, S′+<Smax gmap—s(Sp)≧Sp, for all (−S′−)≦Sp<0, S′−<Smax where Sp is the coordinate of the pixel on the S axis, Smax is a parameter defining a maximum value for S in the projection of the color cluster in the HS plane, and S′− and S′+ are user controllable parameters that specify the range covered for color compression;determining an adjusted saturation,
  • 17. A computer program product including a non-transitory computer readable medium having computer program logic stored therein, the computer program logic including: logic to cause a processor to determine the likelihood (likelihoodpUV) that a pixel p is an arbitrary point-of-interest color in a UV plane;logic to cause the processor to determine the likelihood (likelihoodpYV) that pixel p is the point-of-interest color in a YV plane;logic to cause the processor to determine the likelihood (likelihoodpY) that pixel p is the point-of-interest color along a Y axis; andlogic to cause the processor to determine the likelihood (likelihoodp) that pixel p is the point-of-interest color at the center of a color cluster, likelihoodp=min(likelihoodpUV, likelihoodpYV, likelihoodpY).
  • 18. The computer program product of claim 17, wherein said logic to cause the processor to determine the likelihoodpUV comprises: logic to cause the processor to project the color cluster on the UV plane;logic to cause the processor to approximate the projection in an hue-saturation (HS) plane;logic to cause the processor to determine a likelihood Rfactor that the pixel p is the point-of-interest color from a rectangular projection in the HS plane, where
  • 19. The computer program product of claim 17, wherein said logic to cause the processor to determine the likelihoodpYV comprises: logic to cause the processor to determine
  • 20. The computer program product of claim 17, wherein said logic to cause the processor to determine the likelihoodpY comprises: logic to cause the processor to determine likelihood Y=fms(Yp),where fms is a four segment PWLF, fms: Y→ms, ms ε[1,0] and Yp is the Y value of the pixel p.
  • 21. The computer program product of claim 17, further comprising: logic to cause the processor to adjust the saturation of the pixel p by constraining a piece-wise linear function (PWLF) gmap—s according to gmap—s(Sp)≧Sp, for all |Sp|≦Smax where Sp is the coordinate of the pixel p on the S axis and Smax is a parameter defining a maximum value for S in a projection of the color cluster in a hue-saturation (HS) plane; andlogic to cause the processor to determine an adjusted saturation
  • 22. The computer program product of claim 17, further comprising: logic to cause the processor to de-saturate the color of the pixel p by constraining a piece-wise linear function (PWLF) gmap according to gmap—s(Sp)≦Sp, for all |Sp|≦Smax where Sp is the coordinate of the pixel p on the S axis and Smax is a parameter defining a maximum value for S in a projection of the color cluster in a hue-saturation (HS) plane; andlogic to cause the processor to determining an adjusted saturation
  • 23. The computer program product of claim 17, further comprising: logic to cause the processor to correct the color of the pixel p by constraining a piece-wise linear function (PWLF) gmap—h according to gmap—h(Hp)≦Hp, for all 0≦Hp≦H′+, H′+<Hmax gmap—h(Hp)≧Hp, for all (−H′−)≦Hp≦0, H′−<Hmax where Hp is the coordinate of the pixel on the H axis, Hmax is a parameter defining a maximum value for H in a projection of the color cluster in a hue-saturation (HS) plane, and (H′−, H′+, Smax) define the effective range of saturation enhancement; andlogic to cause the processor to determine an adjusted hue
  • 24. The computer program product of claim 17, further comprising: logic to cause the processor to perform color compression by constraining a piece-wise linear function (PWLF) gmap—h according to gmap—h(Hp)≦Hp, for all 0≦Hp≦H′+, H′+<Hmax gmap—h(Hp)≧Hp, for all (−H′−)≦Hp≦0, H′−<Hmax where Hp is the coordinate of the pixel on the H axis, Hmax is a parameter defining a maximum value for H in a projection of the color cluster in a hue-saturation (HS) plane, and (H′−, H′+, Smax) define the effective range of saturation enhancement;constraining a PWLF gmap—s according to gmap—s(Sp)≦Sp, for all 0≦Sp<S′+, S′+<Smax gmap—s(Sp)≧Sp, for all (−S′−)≦Sp≦0, S′−<Smax where Sp is the coordinate of the pixel on the S axis, Smax is a parameter defining a maximum value for S in the projection of the color cluster in the HS plane, and S′− and S′+ are user controllable parameters that specify the range covered for color compression;determining an adjusted saturation,