Liquid crystal display systems typically use backlights. Traditionally, the backlight produced constant and even light regardless of the content, with the liquid crystal cells controlling the brightness of the image. However, a constant backlight has several disadvantages: high power consumption, especially at high ambient light, heat generation, and a reduction in the dynamic range of the display. One solution for better control of the backlight replaces the constant backlight panel with an array of solid-state light emitters, such as light-emitting diodes (LEDs), with the number of LEDs being far less than the number of LCD elements. This allows the backlight to be adjusted according to the brightness in regions of the image, but has the disadvantage of increasing the cost and size of the device. Therefore, there is a desire to use the fewest possible zones. In the extreme case of a single zone, the overall brightness of the entire image can be used.
When using a backlight, the input image is typically downsampled to a resolution that corresponds to the LED array size. There are several methods that can be used to downsample the data. One method lowpass filters the data before downsampling and then adjusts that data to take into account the amount of light leaking from adjacent LED zones, where a zone consists of the area that is in front of the LED. Each zone represents the LCD elements/pixels closest to a particular LED, or group of LEDs, that are controlled together. To save driver cost and allow for a thinner panel, a zone might consist of several LEDs that are controlled together so that they act like a single LED at a larger distance from the LCD panel.
Another method controls the LED value based on the maximum image data value for an LED zone. Another method might look at the histogram data of the input image associated with the zone. In any of the above approaches, the zone area might also be increased so that it overlaps with adjacent zones. In addition, low pass filtering might be combined with the other methods. Some systems may also apply a spatial or temporal weight to the data. These approaches represent just some of the ways of calculating the LED values.
However they are determined, once one has the LED values for the LED array, the system needs to adjust the input image pixels to achieve a desired image value. A typical desired image value is the input image value. The image value results from the LED backlight illumination at a pixel multiplied by the transmittance of the pixel.
When the dynamic range of a display is increased, it may also be desirable to increase the dynamic range and/or adjust the look of the image to take advantage of the increase. In addition, because the spatial resolution of the LED array is much lower than that of the input image, compromises might be required to reduce the level of artifacts or reduce power. In addition, for high ambient light levels, even the maximum LED illumination may not be enough. These compromises might result in an LED illumination too low to allow perfect reproduction of the original image. That is, perfect reproduction might require a pixel transmittance of greater than 100%, which is impossible. In the current art, a value corresponding to a transmittance of greater than 100% requires either a soft clipping circuit or results in areas of the image with no detail.
In different lighting conditions, the relative brightness perception produced by the same actual screen brightness differs. In a bright outdoor environment, the phone screen appears too dark, while in a dark environment the phone screen appears too bright.
Ambient light and adaptive display adjustment is used to increase or decrease the display backlight and pixel values depending on the environment and picture content. This adjustment can dramatically improve the visibility and reduce power consumption at the same time. An ambient light sensor measures the ambient light illumination.
For mobile devices equipped with an LCD (Liquid Crystal Display), the backlight consumes a considerable percentage of the total energy; the backlight of a typical handheld device consumes 20%-40% of the total system power. Dynamically dimming the backlight is considered an effective method to save energy. Based on image content, the backlight level is automatically decreased and the image pixel values are correspondingly changed. Backlight dimming reduces power consumption. Using pixel compensation, the pixel value is increased so that the perceived image luminance matches before and after the backlight adjustment.
The embodiments here rely upon a novel ambient light and adaptive display adjustment method. Both the display backlight and the image pixels are adjusted for better viewing performance and lower power consumption. If the environment is dark, the backlight is decreased and the image pixels are changed based only on the image content. That is, if the image is dark, then the backlight will be decreased from the ambient light setting and the pixel values increased to save power. The display backlight will be increased as the environment brightness increases. If the upper limit of the display backlight is still not enough for the current ambient light, the pixel values will also be increased using a similar pixel compensation process, resulting in better viewing for high ambient light conditions.
The process first calculates the backlight level suitable for displaying the content at the current ambient light. A piece-wise linear function as shown in
Based on image content, the pixel compensation weight and the amount by which to decrease the backlight can be obtained. If the image is dark, the compensation weight is large and the backlight level is decreased from the value determined by the ambient light. Pixel compensation is used to adjust the pixel value if necessary. The compensation weight depends on both the ambient light backlight and the image content. The content backlight and the ambient light backlight are combined to get the actual output backlight sent to the display, while the final compensation weight is the sum of the content compensation weight and the ambient light compensation weight.
If the ratio of the upper limit of the display backlight to the product of the ambient light backlight and the content backlight is larger than 1, then the ambient light compensation weight is zero. Otherwise, the smaller the ratio, the larger the compensation weight.
As discussed above, the system first determines the backlight value. Based on ambient light illumination, the backlight value suitable to the viewer is calculated. First, a piece-wise linear function is used to map the ambient light sensor value to the ambient light backlight.
where AL is the ambient light illumination and BL_AL_raw is the calculated AL backlight. Note that BL_AL_raw may be larger than the upper limit of the display backlight; that is, it may have a value greater than 1.
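By way of illustration only, the following is a minimal sketch of such a piece-wise linear mapping, assuming hypothetical breakpoints al_points and bl_points, which the text does not specify; as noted above, the resulting BL_AL_raw may exceed 1.

```python
import numpy as np

def ambient_to_backlight(AL, al_points, bl_points):
    """Piece-wise linear mapping from the ambient light sensor reading AL
    to the raw ambient light backlight BL_AL_raw.

    al_points / bl_points are hypothetical breakpoints for illustration;
    the result may exceed 1.0 (the panel limit), which is handled later
    by the ambient light pixel compensation.
    """
    # np.interp interpolates linearly between breakpoints and clamps
    # outside their range.
    return float(np.interp(AL, al_points, bl_points))

# Hypothetical example: 0 lux -> 0.05, 500 lux -> 0.6, 10000 lux -> 1.3.
BL_AL_raw = ambient_to_backlight(800.0,
                                 al_points=[0.0, 500.0, 10000.0],
                                 bl_points=[0.05, 0.6, 1.3])
```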
The process then does temporal filtering of AL backlight. The basic formula of temporal filtering is as follows:
BL_AL_i = Gain_TF*(BL_AL_{i−1} − BL_AL_raw) + BL_AL_raw,
where BL_AL_{i−1} is the previous AL backlight after temporal filtering, BL_AL_i is the current AL backlight after temporal filtering, and 0≤Gain_TF≤1 is the gain of the temporal filtering.
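A minimal sketch of one step of this recursive filter, using the naming above:

```python
def temporal_filter(bl_al_prev, bl_al_raw, gain_tf):
    """One step of BL_AL_i = Gain_TF*(BL_AL_{i-1} - BL_AL_raw) + BL_AL_raw.

    gain_tf = 0 follows the new ambient light value immediately; a gain
    close to 1 changes the backlight slowly, avoiding visible flicker.
    """
    assert 0.0 <= gain_tf <= 1.0
    return gain_tf * (bl_al_prev - bl_al_raw) + bl_al_raw
```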
After determining the backlight value, an image luma histogram is used to calculate the content compensation weight. The pixel luma is equal to max(R,G,B) for the pixel (R,G,B). The number of histogram bins can be selected in the range from 8 to 64. Each bin stores the sum of the luma of all pixels located in that bin. For each bin, the sum of pixel luma is normalized by dividing by the image width and image height, or by the width and height of the LED zone if being used with a display with more than one LED backlight zone. Denote the normalized sum by binsum_norm(i), 0≤i≤N−1, where i is the bin number of the histogram. A piece-wise mapping curve is used to map the normalized bin sum to the bin backlight. Each bin sum has a unique piece-wise mapping curve.
where x0(i)≥0, k0(i)≥0, g0(i)≥0, k1(i)≥0, g1(i)≥0 are parameters.
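The following sketch builds the normalized per-bin luma sums and maps each to a bin backlight. The per-bin mapping curve itself is not reproduced above, only its parameters are named, so the two-segment piece-wise linear form used here is merely an assumed illustration of such a curve.

```python
import numpy as np

def bin_backlights(img_rgb, bin_params, n_bins=16):
    """Normalized per-bin luma sums binsum_norm(i) and per-bin backlights
    binBL(i) for an image with values in [0, 1].

    bin_params[i] = (x0, k0, g0, k1, g1) for bin i; the two-segment
    piece-wise linear curve below is an assumed illustration, not the
    exact mapping of the text.
    """
    h, w, _ = img_rgb.shape
    luma = img_rgb.max(axis=2)                      # pixel luma = max(R, G, B)
    bins = np.minimum((luma * n_bins).astype(int), n_bins - 1)
    # Sum of the luma of the pixels in each bin, normalized by image area.
    binsum_norm = np.bincount(bins.ravel(), weights=luma.ravel(),
                              minlength=n_bins) / (h * w)

    bin_bl = np.empty(n_bins)
    for i, x in enumerate(binsum_norm):
        x0, k0, g0, k1, g1 = bin_params[i]
        bin_bl[i] = g0 + k0 * x if x < x0 else g1 + k1 * (x - x0)
    return binsum_norm, np.clip(bin_bl, 0.0, 1.0)
```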
Based on the bin backlight, the histogram backlight histBL can be obtained. It can be calculated as the maximum of binBL(i), 0≤i≤N−1, or as the sum of binBL(i), 0≤i≤N−1. If histBL is small, the content-based compensation weight is large. The content-based compensation may also be referred to as the LCD compensation, where LCD represents the display panel whether or not an actual LCD panel or other pixelated display is used. If histBL is large, the content-based compensation weight is small. For example, the function of the content-based pixel compensation weight, w_compen_C, can be taken as
f(x)=(1−x)^3,
where x denotes histBL and f(x) is the compensation weight. For a hardware implementation, a LUT mapping the backlight to the compensation weight can be used for additional flexibility.
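A minimal sketch of this step, using the cubic above (a 1D LUT could replace it):

```python
import numpy as np

def content_compensation_weight(bin_bl, use_max=True):
    """histBL from the bin backlights and the content-based compensation
    weight w_compen_C = (1 - histBL)^3: dark content gives a large weight."""
    if use_max:
        hist_bl = float(np.max(bin_bl))
    else:
        hist_bl = float(np.clip(np.sum(bin_bl), 0.0, 1.0))
    w_compen_c = (1.0 - hist_bl) ** 3
    return hist_bl, w_compen_c
```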
The luma average of the original image and the luma average of the maximally compensated image are used to calculate the backlight. The pixel luma is calculated as max(R,G,B) for the pixel (R,G,B). The original luma average (org_avg) is equal to the sum of the original pixel luma divided by the number of image pixels. For the maximally compensated image, the luma average (max_compen_avg) is equal to the sum of its compensated pixel luma divided by the number of image pixels, assuming that the compensation weight (w_compen_C) equals one. This allows all the values necessary to calculate the backlight value to be obtained as the image is received. The details of calculating the pixel compensation and w_compen_C are given in the "pixel compensation" subsection below.
After the original luma average org_avg, the max compensation luma average max_compen_avg, and the content-based compensation weight w_compen_C are obtained, the content-based backlight adjustment is calculated using the following formulas:
compen_avg=org_avg+w_compen_C*(max_compen_avg−org_avg),
where α≥0 can be taken as 2.2 for typical LCD panel and LED response curves. For a hardware implementation, the value (•)^α may be stored in a LUT. Because org_avg≤compen_avg, BL_C is in the interval [0,1].
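The compen_avg formula above, followed by the closing step, can be sketched as below. Note that the exact expression for BL_C is not reproduced in the text, so the ratio-and-exponent form used here is only an assumption consistent with the stated properties (a (•)^α LUT, and BL_C in [0,1] because org_avg ≤ compen_avg).

```python
def content_backlight(org_avg, max_compen_avg, w_compen_c, alpha=2.2):
    """Content-based backlight from the original and maximally compensated
    luma averages.

    compen_avg follows the formula above; the final line is an ASSUMED
    form of BL_C chosen to match the stated properties, not a quotation
    of the text.
    """
    compen_avg = org_avg + w_compen_c * (max_compen_avg - org_avg)
    bl_c = (org_avg / compen_avg) ** alpha if compen_avg > 0 else 1.0
    return compen_avg, bl_c
```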
The content backlight and the ambient light backlight are combined to get the output backlight as follows:
BL_AL_C=BL_AL*BL_C*BL_USER
BL_o=min(BL_LMT, BL_AL_C), where BL_AL is the ambient light backlight, BL_C is the content backlight, BL_USER is the maximum backlight set by the user, and BL_LMT denotes the upper limit of the backlight for the current setting; BL_LMT may be BL_USER, 100% of the screen's maximum backlight, or somewhere in between, depending on the specific implementation by the manufacturer. The manufacturer might also make this dependent on the remaining battery capacity or on whether the device is plugged in.
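A minimal sketch of this combination step:

```python
def combine_backlights(bl_al, bl_c, bl_user, bl_lmt):
    """BL_AL_C = BL_AL * BL_C * BL_USER, clamped to BL_o = min(BL_LMT, BL_AL_C)."""
    bl_al_c = bl_al * bl_c * bl_user
    bl_o = min(bl_lmt, bl_al_c)
    return bl_al_c, bl_o
```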
If the required backlight BL_AL_C is larger than the upper limit of the panel backlight, ambient light compensation is also needed. If BL_AL_C is not larger than BL_LMT, the ambient light compensation weight (w_compen_AL) is zero. In addition, if automatic adjustment of the backlight based on ambient light is turned off, then w_compen_AL will also be zero because BL_AL is forced to be less than BL_LMT. If BL_AL_C is larger than BL_LMT, then the screen backlight is not enough for the ambient light and the image content. That is, the actual backlight level is lower than what was used to calculate the content-based compensation value. Therefore, the ambient light pixel compensation should take effect. The smaller min(1, BL_LMT/BL_AL_C) is, the larger the compensation weight. If BL_LMT/BL_AL_C is close to zero, the compensation weight is large and close to one. Similar to the calculation of the content compensation weight, using, for example, f(x)=(1−x)^3 or in general a one-dimensional look-up table (1D LUT), the compensation weight w_compen_AL can be obtained. The final pixel compensation weight is the sum of the content compensation weight and the ambient light compensation weight, for example
w_compen=min(1,w_compen_C+w_compen_AL).
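For example, using the same cubic as for the content weight (a 1D LUT could be substituted), this step can be sketched as:

```python
def compensation_weights(bl_al_c, bl_lmt, w_compen_c):
    """Ambient light compensation weight and final pixel compensation weight."""
    ratio = min(1.0, bl_lmt / bl_al_c) if bl_al_c > 0 else 1.0
    w_compen_al = (1.0 - ratio) ** 3      # zero whenever the panel can keep up
    w_compen = min(1.0, w_compen_c + w_compen_al)
    return w_compen_al, w_compen
```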
For better pixel compensation for the ambient light, a pixel offset may also be used, as discussed in more detail below. The larger the w_compen_AL, the larger the compensation offset.
Pixel compensation is used to increase the pixel value if necessary, for example, if the ambient light backlight is larger than the upper limit of the screen backlight, or if the backlight has been decreased for the image content. Based on the compensation weight and the compensation gain curve, each pixel is adjusted. The relationship between the compensation gain curve and the compensation curve is as follows:
pixel compensation gain=(compensated pixel)/(original pixel).
For a strictly accurate representation, the compensation gain curve would equal a constant, but doing so does not adjust the dynamic range or provide a soft clip. Therefore, the curve in general is a more complex function.
For the detailed implementation, the compensation curve is replaced by the compensation gain curve.
A LUT may store the max compensation gain curve for different gray levels. For a pixel (R,G,B), the adjusted pixel is obtained as follows:
RGB Bright Pixel Stretch:
(R′,G′,B′)=(R,G,B)*gain_BS(Y),
where Y is equal to max(R,G,B) and gain_BS is the adjustment gain of the bright pixel stretch.
Compensation Gain Calculation:
gain_compen=1+w_compen*(gain_max(Y′)−1),
where Y′ is equal to max(R′,G′,B′) and gain_max is the max compensation gain stored in a 1D LUT.
RGB Compensation:
(R″,G″,B″)=gain_compen*(R′,G′,B′),
where, (R″,G″,B″) is the compensated pixel.
AL Offset Adjustment:
(R′″,G′″,B′″)=Offset_AL+(1−Offset_AL)*(R″,G″,B″),
where Offset_AL=w_offset_AL*max_Offset_AL and 0≤max_Offset_AL≤1 is a parameter.
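A minimal per-pixel sketch of the four steps above, assuming float RGB values in [0, 1] and treating gain_BS and gain_max as 1D LUTs indexed by the pixel maximum (the LUT contents themselves are tuning data not given here):

```python
import numpy as np

def lut_lookup(lut, x):
    """Linear interpolation into a 1D LUT defined on [0, 1]."""
    positions = np.linspace(0.0, 1.0, len(lut))
    return float(np.interp(x, positions, lut))

def compensate_pixel(rgb, gain_bs_lut, gain_max_lut, w_compen, offset_al):
    """Bright pixel stretch, compensation gain, RGB compensation and AL
    offset for one pixel.  offset_al = w_offset_AL * max_Offset_AL."""
    rgb = np.asarray(rgb, dtype=float)

    # 1. RGB bright pixel stretch, driven by Y = max(R, G, B).
    y = rgb.max()
    rgb1 = rgb * lut_lookup(gain_bs_lut, y)

    # 2. Compensation gain, blending the max gain with w_compen.
    y1 = rgb1.max()
    gain_compen = 1.0 + w_compen * (lut_lookup(gain_max_lut, y1) - 1.0)

    # 3. RGB compensation.
    rgb2 = gain_compen * rgb1

    # 4. Ambient light offset lifts the result toward the offset level.
    rgb3 = offset_al + (1.0 - offset_al) * rgb2
    return np.clip(rgb3, 0.0, 1.0)
```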
As discussed above, bright pixels may undergo stretching because image compensation will reduce the contrast of high grayscale pixels. Bright pixel stretch is used to compensate for the contrast loss. An 8-bin grayscale histogram is generated to represent the grayscale distribution of the image. Based on the histogram, each grayscale obtains a contrast adjustment gain.
For an 8-bin histogram, each bin stores the number of pixels that fall in the range of the bin. For each pixel (R,G,B), max(R,G,B) is taken as the pixel grayscale. After dividing by the image width and image height, the histogram represents the grayscale distribution of all the pixels in the image. Denote the histogram by hist_BS[0˜7]. The contrast adjustment deltas are calculated as follows:
delta[i]=min(Lmt_Delta_BS[i],K_Delta_BS[i]*max(0,hist_BS[7−i]−TH_HIST_BS)),i=0, . . . ,4,
where Lmt_Delta_BS[0˜4]≥0, K_Delta_BS[0˜4]≥0, and TH_HIST_BS≥0 are parameters.
Low-pass filtering of the histogram is used to increase the stability of the deltas:
delta_LP[i]=delta[i]+K_LPF_BS*(delta[i+1]−delta[i]),i=0, . . . ,3,
where 0≤K_LPF_BS≤1 denotes the gain of the low-pass filtering.
Temporal filtering of the contrast delta is used to improve temporal stability:
delta_TP[i,t]=delta_LP[i]+K_TF_BS*(delta_TP[i,t−1]−delta_LP[i]),i=0, . . . ,4,
where 0≤K_TF_BS≤1 denotes the gain of the temporal filtering.
If the pixel compensation weight is small, the contrast loss of high grayscale pixels is small. Therefore, the contrast delta is multiplied by the pixel compensation weight to reduce the value:
delta_BS[i]=delta_TP[i]*w_compen,i=0, . . . ,4,
where 0≤w_compen≤1 denotes the compensation weight.
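The histogram, delta, low-pass and temporal filtering steps above can be sketched together as follows (the parameter arguments correspond to Lmt_Delta_BS, K_Delta_BS, TH_HIST_BS, K_LPF_BS and K_TF_BS):

```python
import numpy as np

def bright_stretch_deltas(img_rgb, prev_delta_tp, w_compen,
                          lmt_delta, k_delta, th_hist, k_lpf, k_tf):
    """Contrast adjustment deltas delta_BS[0..4] for bright pixel stretch."""
    h, w, _ = img_rgb.shape
    gray = img_rgb.max(axis=2)                          # max(R, G, B) in [0, 1]
    bins = np.minimum((gray * 8).astype(int), 7)
    hist_bs = np.bincount(bins.ravel(), minlength=8) / (h * w)

    # Raw deltas: delta[i] looks at the brightest bins (bin 7 - i).
    delta = np.array([min(lmt_delta[i],
                          k_delta[i] * max(0.0, hist_bs[7 - i] - th_hist))
                      for i in range(5)])

    # Low-pass filter along the delta index for stability.
    delta_lp = delta.copy()
    for i in range(4):
        delta_lp[i] = delta[i] + k_lpf * (delta[i + 1] - delta[i])

    # Temporal filter against the previous frame's deltas.
    delta_tp = delta_lp + k_tf * (np.asarray(prev_delta_tp) - delta_lp)

    # Small compensation weight -> small contrast loss -> small stretch.
    return delta_tp * w_compen
```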
Based on the contrast delta, bright pixel stretch calculates the corresponding adjusted grayscale value after contrast adjustment. Denote the adjusted values by value_BS[0˜8]. For an 8-bit grayscale value, the original values of value_BS[0˜8] are {0,32,64,96,128,160,192,224,256}, shown in
where BIT denotes the bit width of the grayscale value; for 8-bit video, BIT=8.
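Given the adjusted anchors value_BS[0˜8], however they are updated from the deltas by the formula referenced above, the per-grayscale stretch gain gain_BS(Y) used earlier can, for example, be obtained by piece-wise linear interpolation of the anchors; the following sketch only illustrates that assumed interpolation step.

```python
import numpy as np

def gain_bs_from_anchors(value_bs, bit=8):
    """Per-grayscale gain_BS derived from the adjusted anchors value_BS[0..8].

    The original anchors for BIT = 8 are {0, 32, ..., 256}.  Interpolating
    the adjusted anchors and dividing by the input level is an assumed
    implementation of the stretch gain, not a quotation of the text.
    """
    levels = 1 << bit                                # 256 for 8-bit video
    orig = np.arange(0, levels + 1, levels // 8)     # {0, 32, ..., 256}
    gray = np.arange(1, levels)                      # skip 0 to avoid 0/0
    adjusted = np.interp(gray, orig, value_bs)
    gain = np.ones(levels)
    gain[1:] = adjusted / gray
    return gain                                      # indexed by Y in [0, levels - 1]
```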
In this manner, an ambient light adjustment to a display backlight is accomplished. The adjustment takes into account not only the ambient light in the environment of the display, but the image content to be displayed as well. The image data is adjusted based upon the content and the backlight to arrive at new output image data and backlight values to be displayed.
Although there has been described to this point a particular embodiment for a method and apparatus for image data based compensation for an LED backlight, it is not intended that such specific references be considered as limitations upon the scope of this invention except in-so-far as set forth in the following claims.