Dynamic dimming LED backlight

Information

  • Patent Grant
  • 10431166
  • Patent Number
    10,431,166
  • Date Filed
    Thursday, August 7, 2014
  • Date Issued
    Tuesday, October 1, 2019
Abstract
Disclosed herein is a system for controlling the interactions of light between adjacent subsections of a dynamic LED backlight. Preferred embodiments contain a dividing wall positioned between each adjacent subsection of the LED backlight. The dividing wall may be in contact with the LED backlight and extend away from the LED backlight. The dividing wall may prohibit light from a first subsection from entering an adjacent second subsection at its full luminance. The luminance for each adjacent subsection may be approximately half of the full luminance of each subsection, when measured at the location of the dividing wall.
Description
TECHNICAL FIELD

Disclosed embodiments relate generally to an LED backlight having individually controlled subsections and an associated liquid crystal display.


BACKGROUND OF THE ART

Liquid Crystal Displays (LCDs) contain several layers which work in combination to create a viewable image. A backlight is used to generate the rays of light that pass through what is commonly referred to as the LCD stack, which typically contains several layers that perform either basic or enhanced functions. The most fundamental layer within the LCD stack is the liquid crystal material, which may be actively configured in response to an applied voltage in order to pass or block a certain amount of light which is originating from the backlight. The layer of liquid crystal material is divided into many small regions which are typically referred to as pixels. For full-color displays these pixels are typically further divided into independently-controllable regions of red, green and blue subpixels, where the red subpixel has a red color filter, blue subpixel has a blue color filter, and green subpixel has a green color filter. These three colors are typically called the primary colors. Of course, some displays may use additional color filters (such as adding a yellow filter) and these could also be used with the embodiments herein.


The light which is passing through each subpixel originates as “white” (or broadband) light from the backlight, although in general this light is far from being uniform across the visible spectrum. The subpixel color filters allow each subpixel to transmit a certain amount of each color (red, green or blue). When viewed from a distance, the three subpixels appear as one composite pixel and by electrically controlling the amount of light which passes for each subpixel color the composite pixel can produce a very wide range of different colors due to the effective mixing of light from the red, green, and blue subpixels.


Currently, the most common illumination source for LCD backlight assemblies is the fluorescent tube, but the industry is moving toward light emitting diodes (LEDs). Environmental concerns, small space requirements, lower energy consumption, and long lifetime are some of the reasons that the LCD industry is beginning the widespread usage of LEDs for backlights.


LCDs are becoming popular not only for home entertainment purposes but also as informational/advertising displays in both indoor and outdoor locations. When used for information/advertising purposes, the displays may remain ‘on’ for extended periods of time and thus see much more use than a traditional home theater display. Further, when displays are used in areas where the ambient light level is fairly high (especially outdoors), the displays must be very bright in order to maintain adequate picture brightness. When used for extended periods of time and/or outdoors, overall energy consumption can become an issue. Thus, it is desirable to limit the power consumption of these displays as much as possible while maintaining image fidelity.


SUMMARY

Exemplary embodiments provide a backlight with individually controlled subsections. The luminance for each subsection can be controlled based on the image data being sent to the LCD. The incoming image data may be analyzed to determine the requirements for each subsection, and some may be selectively ‘dimmed’ if they correspond to portions of the image which do not require the full luminance output of the backlight. Selectively dimming portions of the backlight allows for several benefits, including but not limited to reduced power consumption, longer product lifetime, and higher contrast ratios.


These and other objects are achieved by a device as described in the following detailed description.





BRIEF DESCRIPTION OF THE DRAWINGS

A better understanding will be obtained from a reading of the following detailed description and the accompanying drawings wherein identical reference characters refer to identical parts and in which:



FIG. 1 is a front view of a backlight with individually controlled subsections;



FIG. 2 is a front view of LCD image data where the image is divided into several subimages;



FIG. 3 is a histogram of a subimage;



FIG. 4 is a flow chart for one embodiment for analyzing the subimage histogram data;



FIG. 5 is a front view of the backlight where each subsection is being driven at the appropriate luminance level based off the histogram data for the corresponding subimage;



FIG. 6 is a front view of the re-scaled LCD image data;



FIG. 7 is a front view of the backlight from FIG. 5 after diffusion;



FIG. 8 is the image resulting from combining the diffuse backlight of FIG. 7 with the rescaled LCD image of FIG. 6;



FIG. 9 is a surface plot of a fully illuminated subsection of the backlight that has been convolved with a Gaussian filter;



FIG. 10 is a plot of relative luminance versus physical position on a pair of adjacent subsections when using the virtual subsection method;



FIG. 11 is a perspective view of one embodiment for controlling the ‘bleeding’ of light between adjacent subsections of the backlight;



FIG. 12 is a plot of relative luminance versus physical position on subsections when using pre-determined brightness profiles; and



FIG. 13 is a schematic view of one embodiment for the physical architecture of controlling the dynamic backlight.





DETAILED DESCRIPTION


FIG. 1 shows a backlight 10 which has been divided into several individually-controllable subsections 15. The backlight 10 produces light through a plurality of LEDs (not shown) which are mounted to the front face of the backlight 10. In this example, an 8×8 array of subsections 15 is shown. However, any number, shape, and size of subsections may be used with the various embodiments. The number of actual subsections may depend upon the size of the display, cost, the desired complexity of the controlling circuitry, and the desired level of power savings. In general, a greater number of subsections provides a higher level of control and performance by the system. It should be noted that lines 16 are only used to represent the divisions regarding control of the subsections 15 and are not required as actual lines or physical divisions of the backlight 10.



FIG. 2 provides the LCD image data 20, where this image is divided into subimages 22 which correspond with the subsections 15 of the backlight 10 (shown in FIG. 1). Again, the lines 26 are only used to represent the divisions of the subimages and are not physical divisions of the LCD and should not be visible through the LCD assembly.



FIG. 3 shows a plot of histogram data for one of the subimages 22 shown in FIG. 2. The brightness index value is shown along the x-axis and the number of pixels within the subimage which have the corresponding brightness index value is shown along the y-axis. Here, the brightness index values range from 0 (no saturation) to 255 (fully saturated). Three separate plots are shown in FIG. 3: red subpixels 37, blue subpixels 30, and green subpixels 35. It can be observed from this plot that the red subpixels will control the brightness requirements for the subsection of the backlight as the red subpixel histogram data is skewed to the right of the green 35 and blue 30 data. Further, it can also be observed that the blue data 30 is bimodal, meaning that there are two peaks in the data, a first one 31 near zero and a second one 32 near 60. This bimodal characteristic will be discussed further below.


The histogram data for each subimage is analyzed to determine the proper luminance level for the backlight subsection corresponding to each subimage. FIG. 4 shows one embodiment for analyzing the histogram data for each channel (in this example: red, green, and blue) to determine the proper luminance setting for the backlight subsection.


Once the histogram data has been created 40, a first average μ1 and standard deviation σ1 are calculated 41. The following is one method for calculating these values and analyzing them:


Let N=the total number of pixels (red, green, or blue) in the subimage.


Denote the histogram as H(i) where i ranges from 0 to 255







Calculate the average from:

    μ1 = (1/N) · Σ (i = 0 to 255) i · H(i)

Calculate the standard deviation from:

    σ1 = sqrt[ (1/N) · Σ (i = 0 to 255) H(i) · i² − μ1² ]







The initial luminance value for this subsection of the backlight may then be calculated 42 as the average value plus one and a half standard deviations: Yi = μ1 + 1.5·σ1. It should be noted that one and a half standard deviations was chosen as effective for one embodiment. Depending on several factors, some systems may require more or less than 1.5 standard deviations for adequate system performance. This variable could be adjusted for each display.
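Purely as an illustration of this first pass, a minimal Python sketch is given below, assuming the channel histogram is available as a 256-element array; the function name and the use of NumPy are choices of this sketch rather than part of the described embodiment.

    import numpy as np

    def initial_luminance(hist, k=1.5):
        # hist: 256-bin histogram H(i) for one channel of one subimage
        hist = np.asarray(hist, dtype=np.float64)
        n = hist.sum()                       # N = total pixels in the subimage
        i = np.arange(256)                   # brightness index values 0..255
        mu1 = (i * hist).sum() / n           # mu1 = (1/N) * sum of i*H(i)
        sigma1 = np.sqrt((hist * i ** 2).sum() / n - mu1 ** 2)
        return mu1 + k * sigma1              # initial luminance Yi = mu1 + 1.5*sigma1

    # e.g. hist = np.bincount(red_subimage.ravel(), minlength=256)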


The backlight luminance can range from ‘off’ to ‘full on’ and these points, along with all of the settings in between, should be calibrated with the brightness index values from the histogram which can also vary from 0 (off) to 255 (full on). Thus, once the initial luminance value is calculated it may be compared with the maximum value of 255 (see step 43). If the initial luminance value is greater than 255, then the backlight luminance for this subsection is simply set to full on (255) and is stored for this channel (go directly from step 43 to step 47). The use of ‘channel’ herein denotes one of the primary colors that are used to create the image within the LCD. As discussed above, a typical LCD contains three channels (Red, Green, and Blue) but other LCD designs may use additional colors (such as Yellow) and thus may contain 4 or more channels.


Next, the histogram data for this channel may be tested for a bimodal distribution 44. This step may be performed because if the distribution contains multiple peaks, simply averaging and adding some number of standard deviations may completely miss a peak which would require a higher backlight level. For example, in reference to FIG. 3, as mentioned above, the blue curve 30 may be considered bimodal. The initial luminance Yi for the blue curve 30 may fall somewhere in between peaks 31 and 32, thus missing the peak 32 which requires the highest amount of backlight (i.e. if the blue curve were driving the backlight level, the minimum luminance level would have to be closer to 70 to ensure that peak 32 achieves its necessary illumination). In this particular case however, it would not affect the outcome of the analysis because the highest luminance value among the three channels is the value which will finally be used for the subsection (see step 48 in FIG. 4). However, the test for bimodal distribution may still be performed to ensure that the driving color (in this particular case the red channel is actually the driving color) does not contain several peaks such that one would not be adequately illuminated.


The following is one method for determining if a histogram is bimodal 44. Using Otsu's algorithm, find the optimal separation point between distributions in the histogram:

C = nB(T)·nO(T)·[μB(T)−μO(T)] (Otsu's algorithm)

    • where:
    • T is the threshold value and ranges from 0 to 255
    • nB(T) is the number of pixels that fall below the threshold value
    • nO(T) is the number of pixels that fall above the threshold value
    • μB(T) is the average value of the pixels below the threshold value
    • μO(T) is the average value of the pixels above the threshold value
    • Perform Otsu's algorithm for each value of T in the histogram and determine the T which corresponds to the maximum value of C (this will be referred to as Tmax, also known as the Otsu Threshold).
    • Compare Tmax to the first average value μ1.
    • If, |Tmax−μ1|≤Δ, then the histogram data is not bimodal and the luminance value for the subsection is equal to the initial luminance value.

      Yf=Yi
    • Note, Δ may be selected for each display setup and may need to be adjusted depending on the type of display and what is being shown on the display. Acceptable results have been found for some displays with a Δ value near 10.
    • If, |Tmax−μ1|>Δ, then the histogram data is bimodal and the following steps should be performed:
      • Calculate a second average and a second standard deviation based on the histogram data to the right of the Otsu Threshold Tmax. (see step 45 in FIG. 4)







      Set j = Tmax.

      N = Σ (i = j+1 to 255) H(i)    // set N to the new sample size

      Calculate the second average from:

      μ2 = (1/N) · Σ (i = j+1 to 255) i · H(i)











    • Calculate the Second Standard Deviation from:










      σ2 = sqrt[ (1/N) · Σ (i = j+1 to 255) H(i) · i² − μ2² ]







The final luminance value (Yf) for this channel can then be calculated 46 as the second average plus one standard deviation: Yf = μ2 + 1.0·σ2. Again, acceptable results have been found by using one standard deviation, but different display setups may require a different number of standard deviations. This final luminance value should be compared to the maximum luminance value possible (255) and if it is larger than this value, the luminance value will simply be stored as the maximum luminance of 255 (if Yf > 255 then Yf = 255). The final luminance value for this channel is then stored 47 and steps 40-47 are repeated for the remaining two channels. Finally, when the final luminance value for all three channels (R, G, and B) has been determined, the values are compared with one another and the largest final luminance value Yf is stored 48 as the proper luminance value for the backlight subsection.
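For illustration only, the sketch below strings steps 41 through 48 together for one channel in Python, using the Otsu criterion exactly as written above and a Δ of 10; the function name, the NumPy dependency, and the simple loop-based search for Tmax are assumptions of this sketch rather than a description of the actual hardware implementation.

    import numpy as np

    def channel_luminance(hist, k1=1.5, k2=1.0, delta=10):
        # steps 41-47 of FIG. 4 for one channel of one subimage
        hist = np.asarray(hist, dtype=np.float64)
        i = np.arange(256)
        n = hist.sum()
        mu1 = (i * hist).sum() / n
        sigma1 = np.sqrt(max((hist * i ** 2).sum() / n - mu1 ** 2, 0.0))
        y = mu1 + k1 * sigma1                    # initial luminance Yi
        if y >= 255:
            return 255.0                         # step 43: full on, store and stop

        # Otsu separation point: maximize C(T) = nB(T)*nO(T)*[muB(T) - muO(T)]
        best_c, t_max = -1.0, 0
        for t in range(1, 255):
            nb, no = hist[:t].sum(), hist[t:].sum()
            if nb == 0 or no == 0:
                continue
            mub = (i[:t] * hist[:t]).sum() / nb
            muo = (i[t:] * hist[t:]).sum() / no
            c = nb * no * abs(mub - muo)
            if c > best_c:
                best_c, t_max = c, t

        if abs(t_max - mu1) > delta:             # bimodal: redo stats above Tmax
            n2 = hist[t_max + 1:].sum()
            if n2 > 0:
                mu2 = (i[t_max + 1:] * hist[t_max + 1:]).sum() / n2
                sigma2 = np.sqrt(max((hist[t_max + 1:] * i[t_max + 1:] ** 2).sum() / n2
                                     - mu2 ** 2, 0.0))
                y = mu2 + k2 * sigma2            # Yf = mu2 + 1.0*sigma2
        return min(y, 255.0)

    # Step 48: the subsection uses the largest value over the three channels,
    # e.g. subsection_y = max(channel_luminance(h) for h in (hist_r, hist_g, hist_b))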



FIG. 5 shows what the backlight 10 may look like once each of the luminance values has been stored and the corresponding subsections are driven at their proper luminance values (after Gamma correction has been performed, if necessary; see below for more information on Gamma correction). This may involve a conversion of the luminance values to current/voltage levels and can easily be accomplished by one skilled in the art by creating a linear relationship where luminance level 0 corresponds with 0 amps (or volts) and luminance level 255 corresponds to x amps (or volts), where x represents the power level that generates the maximum luminance from the LEDs. It can be easily observed from FIG. 5 that some subsections are completely on (white) while others range from slightly gray to dark gray. The capability of dimming these sections of the backlight will save power as well as provide a deeper black/dark color since the backlight is not shining through the liquid crystal material at full luminance.


However, LCD subpixel voltages are typically determined based on a ‘full on’ backlight, and when sections of the backlight are dimmed, the subpixel voltages may need to be rescaled (‘adjusted’) to ensure that the picture fidelity remains high and the proper colors are produced by the display. One method for rescaling the LCD subpixel voltages is to divide the subpixel voltage by the ratio of the proper luminance level to the maximum luminance. FIG. 6 shows the resulting LCD image data (without the adjusted backlight levels) once it has been rescaled based on the calculated backlight luminance levels.


For example, subsection 50 shown in FIG. 5 may have a luminance level of 128. This would be 128 out of a possible 255 (maximum luminance), resulting in 128/255 = approximately ½. As an illustration, assume that one of the subpixel voltages for subsection 50 was originally 1 mV. To rescale this subpixel voltage, divide 1 mV by ½, giving a new subpixel voltage of 2 mV. Assuming that we are dealing with a normally black LCD stack (voltage is required to orient the crystals to pass light), this increase in subpixel voltage is required because we have decreased the backlight level. Thus, from FIG. 5 we know that the backlight will decrease approximately 50% at subsection 50, so to create the original colors in the image, the subpixel voltage must be increased to allow more light through the liquid crystals. The seemingly brighter resulting LCD image for subsection 50 can be observed in FIG. 6. Note that FIG. 6 only shows the image data and does not take into account the actual backlight levels that are illuminating the LCD, so although subsection 50 appears lighter, this will be accounted for once the new backlight levels are applied.


As a second example, subsection 55 shown in FIG. 5 may have a luminance level of 255 (maximum luminance). This would be 255/255, or 1. Thus, for any original subpixel voltage for subsection 55, say V, the resulting scaled subpixel voltage would be identical because the backlight subsection remains at full on: V/1=V. This can be observed in FIG. 5 as subsection 55 appears white. Also notice that subsection 55 in FIG. 6 appears identical to the original image in FIG. 2 because the backlight remains at ‘full on’ so the subpixel voltages have not been altered from their original settings.
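Purely as a sketch of the arithmetic in the two examples above (the names are illustrative, not part of the described embodiment):

    def rescale_subpixel(voltage, subsection_luminance, max_luminance=255.0):
        # divide the original drive value by the ratio of the subsection's
        # backlight luminance to the maximum luminance
        return voltage / (subsection_luminance / max_luminance)

    rescale_subpixel(1.0, 128)   # ~2.0 mV, the subsection 50 example
    rescale_subpixel(1.0, 255)   # 1.0 mV, unchanged, the subsection 55 example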


It is common in LCD assemblies to place a light diffusing/scattering element (herein ‘diffuser’) in between the backlight and the liquid crystal material in order to provide a more uniform appearance of light through the display. Without the diffuser, the LED point-sources of light may be visible through the final display. Thus, when the backlight from FIG. 5 is placed behind a diffuser, the resulting luminance pattern can be seen in FIG. 7. Further, when the diffused backlight from FIG. 7 is placed behind the rescaled LCD image data from FIG. 6, the resulting image from the LCD is shown in FIG. 8.


As can be easily observed, the diffusing properties alter the actual luminance levels of the backlight, especially near the edges of the subsections. Looking at subsection 50 for example, the luminance in the center 51 is acceptable, while the luminance near the edges 52 has been increased due to ‘bleed over’ from brighter adjacent subsections 60.


One method discovered to account for this phenomenon is the creation of a ‘virtual backlight’ or ‘VB’ where the ‘bleed over’ behavior of adjacent subsections can be mathematically modeled and accounted for during the rescaling of the LCD subpixel voltages. There are many methods for mathematically modeling a given backlight in order to create a VB.


One method for creating the VB may be referred to as ‘virtual subsections’ and is based on the use of a stored matrix of data that represents the appearance of a single, fully illuminated subsection in the backlight assembly as seen through the diffuser. FIG. 9 provides a surface plot of a fully illuminated subsection 90 that has been convolved with a Gaussian filter. The subsection 90 has a width (W) 93, a height (H) 92, and a tail (T) 95, where W, H, and T are each measured in pixels. The tail 95 represents the subpixels which may be impacted by the luminance from adjacent subsections of the backlight; in other words, it is illumination of the subsection that extends beyond the physical edge of the subsection 90. Thus, the dimensions of the stored matrix for the subsection would be (2T+W)×(2T+H). Because the virtual subsection is larger than the actual subsection, the adjacent subsections may be overlapped and the principle of additive light may be used to blend the edges of the subsections.
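The stored matrix could be generated offline in many ways; the sketch below, which assumes SciPy is available and uses illustrative dimensions for a 1080×1920 panel split 8×8 (W = 240, H = 135, and T = 90 as in FIG. 10, with an arbitrary blur width), is one possibility rather than the method actually used.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def virtual_subsection(w, h, t, sigma):
        # one fully illuminated W x H subsection padded by the tail T on every
        # side, then convolved with a Gaussian filter; the stored matrix has
        # (2T + H) rows and (2T + W) columns
        block = np.zeros((h + 2 * t, w + 2 * t))
        block[t:t + h, t:t + w] = 1.0
        return gaussian_filter(block, sigma=sigma)

    profile = virtual_subsection(w=240, h=135, t=90, sigma=45)   # sigma is a guess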



FIG. 10 illustrates the relative luminance versus physical position on a pair of adjacent subsections. The x-axis of this figure represents the pixel location while the y-axis represents the relative luminance of the backlight subsections. Relative luminance refers to the percentage of the backlight luminance Y which was determined for the subsection in FIG. 4. Thus, 0.5 would represent one-half of the luminance, 0.25 would represent one-quarter of the luminance, etc. The plots for a first subsection 100 and an adjacent second subsection 101 are shown. The line 105 represents the physical dividing line between the subsections. Looking at the first subsection 100, at pixel zero the full luminance level is recorded. The relative luminance decreases as the pixel location increases (as we approach the division between the subsections 105). At pixel 90, only half of the full luminance level is recorded. As the pixel location continues to increase (as we move away from the division between the subsections 105) the relative luminance continues to decrease until it reaches zero at pixel 180. Thus, for this example the tail T of each subsection may be 90 pixels long. A symmetrically-opposite trend can be seen with the plot for the adjacent subsection 101.


It should be noted that because the plots for the adjacent subsections 100 and 101 are symmetrical about line 105 and about the relative luminance of 0.5, if the subsections were driven to the same backlight luminance level they would blend to create 100% luminance across the line 105 between the subsections. Obviously, at line 105 the VB data for each subsection is at 0.5 or 50% of the backlight luminance for that subsection, so if each subsection were driven to the same backlight luminance, these would add together to create the same luminance level across the line 105. If the subsections were driven to different luminance levels, the VB data will blend between the two luminance levels. For example, at pixel location 38 within subsection 100, the VB data should be 90% of the luminance for subsection 100 plus 10% of the luminance for subsection 101.
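A one-dimensional sketch of this blending is shown below; a raised-cosine curve is used here only as a stand-in for the Gaussian-shaped profiles of FIG. 10 (in practice the weights would come from the stored virtual-subsection data), and the key property is that the two complementary weights always sum to one across the border.

    import numpy as np

    def border_blend(y_a, y_b, tail=90):
        # relative-luminance mix over the 2*tail pixels straddling the border;
        # the border itself sits at p = tail, where each weight is 0.5
        p = np.arange(2 * tail)
        w_a = 0.5 * (1.0 + np.cos(np.pi * p / (2 * tail)))   # 1 -> 0.5 -> 0
        return w_a * y_a + (1.0 - w_a) * y_b

    # equal drive levels blend to a flat profile; at p = 38 this mix is roughly
    # 90% of the first subsection and 10% of the second, as in the text above
    profile = border_blend(200.0, 100.0)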


Obviously, the relationship shown in FIG. 10 is only applied to adjacent subsection edges and to subpixels which are within the ‘tail’ portion of the adjacent subsections. Thus, subsection edges which are not adjacent to any other subsections (i.e. along the perimeter of the overall display) may not show this relationship and may simply use 100% of the luminance level as the VB data for that subsection.


By using the luminance values for each backlight subsection along with the model for backlight luminance along the subsection edges, an array of VB data for each subsection can be stored and then combined to create a larger array which contains VB data for each pixel in the display. As discussed above, the original subpixel voltages may then be divided by the ratio of VB data over the maximum backlight value in order to properly rescale the original LCD image data.
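One way to sketch this assembly in Python, reusing the virtual_subsection profile from the earlier sketch and taking the per-subsection luminance values from the FIG. 4 analysis as an array called levels; the names and the additive-overlap bookkeeping are illustrative rather than the patented implementation:

    import numpy as np

    def build_virtual_backlight(levels, profile, sub_w, sub_h, tail):
        # levels: per-subsection luminance values (e.g. an 8 x 8 array);
        # profile: the (2T + H) x (2T + W) virtual-subsection matrix;
        # overlapping tails simply add, following the principle of additive light
        rows, cols = levels.shape
        vb = np.zeros((rows * sub_h + 2 * tail, cols * sub_w + 2 * tail))
        for r in range(rows):
            for c in range(cols):
                vb[r * sub_h:r * sub_h + profile.shape[0],
                   c * sub_w:c * sub_w + profile.shape[1]] += levels[r, c] * profile
        return vb[tail:-tail, tail:-tail]      # crop back to the panel resolution

    def rescale_image(image, vb, max_level=255.0):
        # divide the original data by the ratio of VB data to the maximum value
        return np.clip(image / np.maximum(vb / max_level, 1e-6), 0, 255)

    # e.g. for an 8x8 backlight behind a 1920x1080 panel:
    # vb = build_virtual_backlight(levels, profile, sub_w=240, sub_h=135, tail=90)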


It should be noted that although a Gaussian curve has been used in FIG. 10 to represent the relationship between adjacent subsections, this is not required. For some embodiments a linear relationship or exponential function may provide a more appropriate mathematical representation of what is actually occurring with the diffused backlight. Other mathematical models are discussed below. This brings up an interesting point to keep in mind when designing this type of system. Either a mathematical system can be derived to model the existing physical backlight or the physical backlight may be designed so that it performs similar to a selected mathematical model.


If using the Gaussian relationship shown in FIG. 10, it may be advantageous to design the physical system such that this type of relationship actually exists. For example, the backlight and diffuser should be designed such that only 50% luminance exists at the overlapping edge of each subsection. FIG. 11 shows one method for accomplishing this specific embodiment, where an array of dividing walls 120 has been used between the backlight LEDs 125 and the diffusing element (not shown). FIG. 11 is a simplified figure, as only a 3×3 array is shown and LEDs are not shown in every subsection. However, as discussed above, the number of backlight subsections can vary depending on many different factors, and one skilled in the art can easily modify the simplified FIG. 11 into an 8×8 array (or any other arrangement) with LEDs in every subsection.


Preferably, there would be a gap between the end of the dividing walls 120 and the diffuser. This would prevent any of the dividing walls 120 from being visible through the final display. The precise geometry of the dividing walls 120 and their relationship to the diffuser may require fine tuning for each display. Acceptable results have been found for 70 inch LCD displays where the dividing walls 120 are about two to three inches high with a gap between the dividing wall 120 and diffuser of 30-40 mm.


As mentioned above, other mathematical models may be used to simulate the backlight through the diffuser. One other method is to use a point spread function (PSF). If the diffuser is treated like an optical low pass filter, then a 2D filter operation can be performed on the virtual backlight. One could also modify the PSF by observing that a diffused backlight only requires a blurring operation along the boundaries between subsections.
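A compact sketch of this low-pass-filter view is given below, again assuming SciPy; the blur width and the nearest-neighbour upsampling step are choices of this sketch rather than details given in the description.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def psf_virtual_backlight(levels, sub_w, sub_h, sigma):
        # replicate each subsection's luminance over its pixel area, then treat
        # the diffuser as an optical low-pass filter by blurring the whole array
        coarse = np.kron(levels, np.ones((sub_h, sub_w)))
        return gaussian_filter(coarse, sigma=sigma)

    # vb = psf_virtual_backlight(levels, sub_w=240, sub_h=135, sigma=45)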


An examination of the edges between a fully illuminated subsection and an adjacent dimmer subsection constructed via the Gaussian Point Spread Function reveals a series of common curves. FIG. 12 shows the change in relative illumination from 1 to 0.5 (curve 130), 1 to 0.25 (curve 132), and 1 to 0 (curve 134). If we denote Z(x) as the curve that goes from 1 to zero, then it is possible to recreate any change in brightness between adjoining subsections with the equation: f(x)=y1+Z(x)·(y0−y1) where y0 is the brightness of the starting subsection and y1 is the brightness of the ending subsection.


Thus, a two-step process for this method could include: (1) create a series of changing brightness lines that run vertically down the middle of each subsection using the above formula (if the subsections are rectangular, then a “longer” brightness function will be required for this operation); and (2) starting at the top of the VB, create a series of horizontal brightness curves using the data from step 1 as the endpoints for each curve.
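As a rough illustration of the brightness formula used in both steps, the sketch below uses a smooth stand-in for Z(x); in practice Z(x) would be taken from the measured or modeled 1-to-0 curve 134 of FIG. 12.

    import numpy as np

    def z_curve(n=180):
        # stand-in for Z(x): falls smoothly from 1 to 0 over n samples
        return 0.5 * (1.0 + np.cos(np.pi * np.linspace(0.0, 1.0, n)))

    def brightness_between(y0, y1, n=180):
        # f(x) = y1 + Z(x) * (y0 - y1): brightness running from the starting
        # subsection (y0) to the ending subsection (y1)
        return y1 + z_curve(n) * (y0 - y1)

    curve_130 = brightness_between(1.0, 0.5)    # the 1 -> 0.5 transition
    curve_132 = brightness_between(1.0, 0.25)   # the 1 -> 0.25 transition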


A final technique to produce a virtual backlight would be through the use of Bezier Curves. In this approach, cubic splines could be used to interpolate between the subsection centers and thus simulate diffusion. For each point in the Virtual Backlight, the following equation would be calculated:

B(t) = (1−t)³·P0 + 3t(1−t)²·P1 + 3t²(1−t)·P2 + t³·P3, t∈[0,1].
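A direct evaluation of this curve in Python is sketched below; the choice of control points is purely illustrative, with P0 and P3 standing for two subsection-centre brightness values and P1, P2 shaping the transition between them.

    import numpy as np

    def cubic_bezier(p0, p1, p2, p3, samples=64):
        # B(t) = (1-t)^3*P0 + 3t(1-t)^2*P1 + 3t^2(1-t)*P2 + t^3*P3, t in [0, 1]
        t = np.linspace(0.0, 1.0, samples)
        return ((1 - t) ** 3 * p0 + 3 * t * (1 - t) ** 2 * p1
                + 3 * t ** 2 * (1 - t) * p2 + t ** 3 * p3)

    curve = cubic_bezier(200.0, 200.0, 80.0, 80.0)   # 200 -> 80 between two centres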


As discussed above, once the data for the VB has been generated, it may be divided into the corresponding subpixel voltages in order to properly rescale the LCD video image. This can be accomplished in many ways. Because division is typically a time-consuming operation, one exemplary embodiment may use a 256 byte lookup table of 8-bit scaling factors. These would be multiplied by each pixel and then followed by an 8-bit shift. The 8-bit shift can be skipped if only the upper byte of the product is used. If an overflow occurs, the resulting pixel value would be 255.
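A fixed-point sketch of this multiply-and-shift idea is shown below; the table entries here are wider than the 8-bit factors mentioned above, purely to keep the arithmetic of the example obvious, so this approximates the described approach rather than reproducing it exactly.

    import numpy as np

    # factor ~ 255*256/VB, so (pixel * factor) >> 8 approximates pixel * 255 / VB
    VB_TABLE = np.array([65280 if v == 0 else (255 * 256) // v for v in range(256)],
                        dtype=np.uint32)

    def rescale_pixel(pixel, vb_value):
        # multiply by the table entry, keep the upper byte, clamp overflow to 255
        product = int(pixel) * int(VB_TABLE[vb_value])
        return min(product >> 8, 255)

    rescale_pixel(100, 128)   # ~199: roughly doubled when the backlight is halved
    rescale_pixel(100, 255)   # 100: unchanged at a 'full on' backlight value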


Before driving the backlight subsections with the appropriate luminance values, gamma correction may be applied. This step may help correct the contrast and may also provide additional power savings. Assuming backlight intensities from 0 to 255, one method of gamma correction may be: I = 255·(Y/255)^γ where γ is typically equal to 2.2 (but this may be varied depending on the application). For example, assume that the luminance value (Y) for the subsection was calculated to be 128. When this value is used in the gamma equation above, the actual intensity of the backlight (I) is calculated to be 56. This backlight intensity (I) can now be converted to actual voltage/current and sent to the appropriate backlight subsection. Also, the re-scaled image data can now be sent to the LCD as the backlight is updated.
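For reference, the same calculation in a couple of lines of Python (names illustrative) reproduces the 128 to 56 example:

    def gamma_correct(y, gamma=2.2, max_level=255.0):
        # I = 255 * (Y / 255)^gamma: convert the calculated luminance Y into the
        # intensity I actually driven onto the backlight subsection
        return max_level * (y / max_level) ** gamma

    gamma_correct(128)   # ~56, matching the example above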


An example for the physical architecture which could perform the operations as discussed above is now presented. It should be pointed out that this architecture is only an example and those skilled in the art could modify this example or create other types of physical architecture which are capable of performing the operations discussed herein.



FIG. 13 shows a schematic representation of one example for the physical architecture. This specific example assumes the following: the input is RGB data on a 24-bit wide data bus; an 8×8 backlight array is used; the output is RGB data on a 24-bit wide data bus; an external pixel clock is available; the maximum LCD resolution is 1080 by 1920 for a total of 2,073,600 pixels; the Samsung LTI700HD01 is the assumed LCD; and the design should support a pixel clock of 148.5 MHz.


Two frame buffers 200 may be used to store the incoming frame and to process and output the outgoing frame. Each frame buffer should store 2,073,600 RGB values and the width of the frame buffer should be at least 24 bits. Eight three-channel histogram accumulators 210 may be used for statistical processing. Each accumulator 210 should consist of 256 15-bit counters. There may be accumulators for each of the three color channels (if using an RGB-type LCD). The output of each counter should be double buffered. Two virtual backlight buffers 215 may be used to store the newly created virtual backlight based on incoming image data and to rescale the gain of outgoing LCD data.


The embodiment for the architecture described here would implement the steps above using a “Pitch and Catch” approach. While one block is ‘catching’ and analyzing the incoming video data, the other block is scaling and ‘pitching’ video data to the output. As shown in FIG. 13, the upper half of the system is in “catch” mode. During this phase, incoming RGB data is sampled by the histogram accumulators 210 while being stored in the frame buffer 200. After 135 lines have been buffered, the contents of the twenty-four histogram accumulators 210 are made available to the digital signal processor 220 (DSP). The DSP 220 then calculates the brightness of each of the corresponding subsections and updates the virtual backlight buffer 215. This process is repeated seven more times for the remaining video data. Note that the last eight subsections placed into the virtual backlight may have to be calculated during the “vertical retrace” period.
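Purely as a software model of this "catch" phase (not the DSP/hardware implementation itself), the sketch below walks an incoming frame in eight bands of 135 lines and reuses the channel_luminance sketch from the FIG. 4 discussion; all names and the frame layout are illustrative.

    import numpy as np

    def catch_phase(frame, bands=8, cols=8):
        # frame: H x W x 3 array of 8-bit RGB values (1080 x 1920 x 3 here)
        rows = frame.shape[0] // bands        # 1080 / 8 = 135 lines per band
        width = frame.shape[1] // cols        # 1920 / 8 = 240 pixels per subsection
        levels = np.zeros((bands, cols))
        for b in range(bands):                # one band of 135 buffered lines
            band = frame[b * rows:(b + 1) * rows]
            for c in range(cols):
                sub = band[:, c * width:(c + 1) * width]
                hists = [np.bincount(sub[..., ch].ravel(), minlength=256)
                         for ch in range(3)]  # the three histogram accumulators
                levels[b, c] = max(channel_luminance(h) for h in hists)
        return levels                         # per-subsection backlight levels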


The lower half of the system is operating in “pitch” mode. During this phase, each pixel from the input buffer 200 is divided by the corresponding pixel in the virtual backlight buffer 215 and sent to the video out MUX. To speed execution and avoid the use of a hardware divider, a lookup table may be used to determine a scaling factor. This factor can then be used to rescale the RGB data with three 8×8 multipliers. Concurrent with the rescaling operation, the individual subsections of the backlight matrix will be updated synchronously using the values calculated during the “catch” phase.


It should be noted that the system and method described herein have been described with reference to each ‘frame,’ and in an exemplary embodiment the backlight subsections would be updated for each ‘frame.’ However, there are many different frame rates of video as well as different refresh rates of LCD displays (e.g., 60 Hz, 120 Hz, 240 Hz, etc.). As used herein, the term ‘frame’ represents each time that the pixel voltages are updated for the LCD display. Thus, the backlight subsections should preferably be updated (and the LCD subpixel voltages re-scaled) each time that a new set of subpixel data is sent to the LCD display.


Having shown and described preferred embodiments, those skilled in the art will realize that many variations and modifications may be made to the described embodiments while still remaining within the scope of the claims. Thus, many of the elements indicated above may be altered or replaced by different elements which will provide the same result and fall within the spirit of the claimed embodiments. It is the intention, therefore, to limit the invention only as indicated by the scope of the claims.

Claims
  • 1. An LED backlight assembly for a liquid crystal display (LCD), the LED backlight assembly comprising: an LED backlight having a number of individually controllable subsections, wherein each subsection comprises a number of red light emitting LEDs, a number of green light emitting LEDs, and a number of blue light emitting LEDs, wherein simultaneous illumination of each of said number of red, green, and blue light emitting LEDs within a given subsection is configured to produce white light, and wherein each of said subsections comprises an equal number of red, green, and blue light emitting LEDs;an unpenetrated dividing wall extending away from the LED backlight and separating adjacent subsections of the LED backlight;a diffusing element placed in front of the LED backlight;a gap located between the dividing wall and the diffusing element;a histogram accumulator in electrical communication with the LED backlight;a frame buffer; anda digital signal processor in electrical communication with the LED backlight, wherein the digital signal processor is also in electrical communication with the LCD and is configured to send virtual backlight data to the LCD.
  • 2. A system for controlling the interactions of light between adjacent subsections of a dynamic LED backlight, the system comprising: a dividing wall positioned between adjacent subsections of the dynamic LED backlight, wherein the dividing wall extends below a diffusing element such that a gap is created between the diffusing element and an upper portion of the dividing wall;wherein the dividing wall extends away from the LED backlight at a distance which creates a luminance overlap of adjacent subsections of 50% within the gap when measured at the dividing wall;wherein said dynamic LED backlight comprises a number of color producing LEDs that are configured to be illuminated simultaneously to produce white light.
  • 3. The system of claim 2 wherein: the dividing wall is adapted to produce a Gaussian relationship for the interactions of light between adjacent subsections of the LED backlight.
  • 4. The system of claim 2 further comprising: a histogram accumulator in electrical communication with the LED backlight.
  • 5. The system of claim 2 wherein: the dividing wall extends less than three inches away from the LED backlight.
  • 6. A system for controlling the interactions of light between adjacent subsections of a dynamic LED backlight, the system comprising: a first subsection of the LED backlight;a second subsection of the LED backlight positioned adjacent to the first subsection so as to create a border between the first and second LED subsections; anda dividing wall positioned along the border and extending away from the LED backlight such that approximately one-half of the luminance of the first subsection and approximately one-half of the luminance of the second subsection is present at the dividing wall;a gap located between the dividing wall and a diffusing element such that the border extends within the gap;wherein each of said first subsection of the LED backlight and said second subsection of the LED backlight comprise a number of color producing LEDs that are configured to be illuminated simultaneously to produce white light;wherein the first subsection and the second subsection are configured to be illuminated at approximately one-half of the full luminance when measured at the border and the remainder of the first subsection and the second subsection are configured to be illuminated at approximately full luminance when measured away from the border.
  • 7. The system of claim 6, further comprising: a histogram accumulator in electrical communication with the first and second subsections of the LED backlight.
  • 8. The system of claim 6 wherein: the full luminance of the first subsection is not permitted to cross the border into the second subsection.
  • 9. The system of claim 6 wherein: the luminance of the first subsection is gradually reduced as one moves from the border and into the second subsection.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of and claims priority to U.S. application Ser. No. 13/850,854 filed on Mar. 26, 2013, now U.S. Pat. No. 8,803,790, which is in turn a divisional of and claims priority to U.S. application Ser. No. 13/722,537 filed on Dec. 20, 2012, now U.S. Pat. No. 8,704,752, which is in turn a divisional of and claims priority to U.S. application Ser. No. 12/793,474 filed on Jun. 3, 2010, now U.S. Pat. No. 8,350,799, which is a non-provisional application of U.S. Provisional Application No. 61/183,592 filed Jun. 3, 2009, each of which is herein incorporated by reference in its entirety.

US Referenced Citations (284)
Number Name Date Kind
1812919 Balder Jul 1931 A
3510973 Mazzocco, Sr. May 1970 A
4257084 Reynolds Mar 1981 A
4804953 Castleberry Feb 1989 A
5040878 Eichenlaub Aug 1991 A
5046805 Simon Sep 1991 A
5066106 Sakamoto et al. Nov 1991 A
5363149 Furuno et al. Nov 1994 A
5365354 Jannson et al. Nov 1994 A
5381309 Borchardt Jan 1995 A
5440324 Strickling et al. Aug 1995 A
5453855 Nakamura et al. Sep 1995 A
5528720 Winston et al. Jun 1996 A
5598068 Shirai Jan 1997 A
5661578 Habing et al. Aug 1997 A
5856854 Hyun Jan 1999 A
6027222 Oki et al. Feb 2000 A
6166389 Shie et al. Dec 2000 A
6307216 Huh et al. Oct 2001 B1
6400101 Biebl et al. Jun 2002 B1
6409356 Nishimura Jun 2002 B1
6419372 Shaw et al. Jul 2002 B1
6421103 Yamaguchi Jul 2002 B2
6437673 Nishida et al. Aug 2002 B1
6446467 Lieberman et al. Sep 2002 B1
6481130 Wu Nov 2002 B1
6556258 Yoshida et al. Apr 2003 B1
6601984 Yamamoto et al. Aug 2003 B2
6636003 Rahm et al. Oct 2003 B2
6683639 Driessen-Olde Scheper et al. Jan 2004 B2
6762815 Lee Jul 2004 B2
6789921 Deloy et al. Sep 2004 B1
6805468 Itoh et al. Oct 2004 B2
6842204 Johnson Jan 2005 B1
6860628 Robertson et al. Mar 2005 B2
6936968 Cross et al. Aug 2005 B2
6949772 Shimizu et al. Sep 2005 B2
6958743 Shin et al. Oct 2005 B2
6982686 Miyachi et al. Jan 2006 B2
7012379 Chambers et al. Mar 2006 B1
7015650 McGrath Mar 2006 B2
7018054 Miyashita et al. Mar 2006 B2
7025474 Campbell et al. Apr 2006 B2
7038186 De Brabander et al. May 2006 B2
7040794 Bernard May 2006 B2
7045828 Shimizu et al. May 2006 B2
7049761 Timmermans et al. May 2006 B2
7053557 Cross et al. May 2006 B2
7057590 Lim et al. Jun 2006 B2
7178963 Ueda et al. Feb 2007 B2
7190416 Paukshto et al. Mar 2007 B2
7194158 Schultheis et al. Mar 2007 B2
7210839 Jung May 2007 B2
7218812 Maxwell et al. May 2007 B2
7232250 Chuang Jun 2007 B2
7250637 Shimizu et al. Jul 2007 B2
7259403 Shimizu et al. Aug 2007 B2
7307391 Shan Dec 2007 B2
7307614 Vinn Dec 2007 B2
7324080 Hu et al. Jan 2008 B1
7327416 Lee et al. Feb 2008 B2
7347706 Wu et al. Mar 2008 B1
7352940 Charters et al. Apr 2008 B2
7375381 Shimizu et al. May 2008 B2
7421167 Charters et al. Sep 2008 B2
7427140 Ma Sep 2008 B1
7473019 Laski Jan 2009 B2
7481553 Kim et al. Jan 2009 B2
7481566 Han Jan 2009 B2
7510299 Timmermans et al. Mar 2009 B2
7513637 Kelly et al. Apr 2009 B2
7542108 Saito et al. Jun 2009 B2
7546009 Kukulj et al. Jun 2009 B2
7639220 Yoshida et al. Dec 2009 B2
7682047 Hsu et al. Mar 2010 B2
7738746 Charters et al. Jun 2010 B2
7781979 Lys Aug 2010 B2
7795574 Kennedy et al. Sep 2010 B2
7813694 Fishman et al. Oct 2010 B2
7853288 Ma Dec 2010 B2
7982706 Ichikawa et al. Jul 2011 B2
8021900 Maxwell et al. Sep 2011 B2
8064744 Atkins et al. Nov 2011 B2
8120595 Kukulj et al. Feb 2012 B2
8125163 Dunn et al. Feb 2012 B2
8194031 Yao et al. Jun 2012 B2
8233115 Hadlich et al. Jul 2012 B2
8274626 Choi et al. Sep 2012 B2
8294168 Park et al. Oct 2012 B2
8351013 Dunn et al. Jan 2013 B2
8400430 Dunn et al. Mar 2013 B2
8508155 Schuch Aug 2013 B2
8529993 Charters et al. Sep 2013 B2
8648993 Dunn et al. Feb 2014 B2
8674390 Harris et al. Mar 2014 B2
8674963 Cornish et al. Mar 2014 B2
8803790 Wasinger et al. Aug 2014 B2
8810501 Budzelaar et al. Aug 2014 B2
8829815 Dunn et al. Sep 2014 B2
8842366 Arnett et al. Sep 2014 B2
9030129 Dunn et al. May 2015 B2
9141329 Reicher et al. Sep 2015 B1
9167655 Dunn et al. Oct 2015 B2
9348174 Dunn et al. May 2016 B2
9812047 Schuch et al. Nov 2017 B2
9867253 Dunn et al. Jan 2018 B2
9924583 Schuch et al. Mar 2018 B2
10126579 Dunn et al. Nov 2018 B2
10191212 Dunn Jan 2019 B2
20010009508 Umemoto et al. Jul 2001 A1
20010033726 Shie et al. Oct 2001 A1
20020043012 Shibata et al. Apr 2002 A1
20020126078 Horibe et al. Sep 2002 A1
20030026085 Ueda et al. Feb 2003 A1
20030043312 Nishida et al. Mar 2003 A1
20030227428 Nose Dec 2003 A1
20040062029 Ato Apr 2004 A1
20040113044 Ishiguchi Jun 2004 A1
20050094391 Campbell et al. May 2005 A1
20050105303 Emde May 2005 A1
20050117323 King Jun 2005 A1
20050140640 Oh Jun 2005 A1
20050140848 Yoo et al. Jun 2005 A1
20050162737 Whitehead et al. Jul 2005 A1
20050265019 Sommers et al. Dec 2005 A1
20060012985 Archie, Jr. et al. Jan 2006 A1
20060055012 Hsin Chen et al. Mar 2006 A1
20060072299 Lai Apr 2006 A1
20060077686 Plan et al. Apr 2006 A1
20060082700 Gehlsen et al. Apr 2006 A1
20060087521 Chu et al. Apr 2006 A1
20060092346 Moon et al. May 2006 A1
20060092348 Park May 2006 A1
20060092618 Tanaka May 2006 A1
20060125418 Bourgault Jun 2006 A1
20060197474 Olsen Sep 2006 A1
20060214904 Kimura et al. Sep 2006 A1
20060221612 Song et al. Oct 2006 A1
20060238367 Tsuchiya Oct 2006 A1
20060262079 Seong et al. Nov 2006 A1
20060279946 Park et al. Dec 2006 A1
20060289201 Kim et al. Dec 2006 A1
20070001997 Kim Jan 2007 A1
20070013647 Lee et al. Jan 2007 A1
20070013828 Cho et al. Jan 2007 A1
20070021217 Wu Jan 2007 A1
20070047808 Choe Mar 2007 A1
20070070615 Joslin et al. Mar 2007 A1
20070097321 Whitehead et al. May 2007 A1
20070115686 Tyberghien May 2007 A1
20070127144 Gao Jun 2007 A1
20070139574 Ko et al. Jun 2007 A1
20070139929 Yoo et al. Jun 2007 A1
20070147037 Wang Jun 2007 A1
20070153515 Hong et al. Jul 2007 A1
20070171353 Hong Jul 2007 A1
20070171623 Zagar et al. Jul 2007 A1
20070171676 Chang Jul 2007 A1
20070177071 Egi et al. Aug 2007 A1
20070195535 Artwohl et al. Aug 2007 A1
20070198638 Omura et al. Aug 2007 A1
20070206158 Kinoshita et al. Sep 2007 A1
20070222910 Hu Sep 2007 A1
20070230218 Jachim et al. Oct 2007 A1
20070268234 Wakabayashi et al. Nov 2007 A1
20070279369 Yao et al. Dec 2007 A1
20070297163 Kim et al. Dec 2007 A1
20070297172 Furukawa et al. Dec 2007 A1
20080019147 Erchak et al. Jan 2008 A1
20080036940 Song et al. Feb 2008 A1
20080043463 Park Feb 2008 A1
20080049164 Jeon et al. Feb 2008 A1
20080068836 Hatanaka et al. Mar 2008 A1
20080074372 Baba et al. Mar 2008 A1
20080089064 Wang Apr 2008 A1
20080101086 Lee May 2008 A1
20080106527 Cornish et al. May 2008 A1
20080111949 Shibata et al. May 2008 A1
20080143916 Fujino et al. Jun 2008 A1
20080151527 Ueno et al. Jun 2008 A1
20080158468 Kim et al. Jul 2008 A1
20080170178 Kubota Jul 2008 A1
20080170400 Maruyama Jul 2008 A1
20080212305 Kawana et al. Sep 2008 A1
20080231196 Weng et al. Sep 2008 A1
20080266331 Chen Oct 2008 A1
20080272999 Kurokawa Nov 2008 A1
20080276507 Hines Nov 2008 A1
20080278432 Ohshima Nov 2008 A1
20080284942 Mahama et al. Nov 2008 A1
20080291686 Cull et al. Nov 2008 A1
20090002990 Becker et al. Jan 2009 A1
20090009102 Kahlman et al. Jan 2009 A1
20090015755 Bang et al. Jan 2009 A1
20090021461 Hu et al. Jan 2009 A1
20090033612 Roberts et al. Feb 2009 A1
20090058795 Yamazaki Mar 2009 A1
20090061945 Ma Mar 2009 A1
20090085859 Song Apr 2009 A1
20090091634 Kennedy et al. Apr 2009 A1
20090109165 Park et al. Apr 2009 A1
20090135167 Sakai et al. May 2009 A1
20090135583 Hillman et al. May 2009 A1
20090174840 Lee et al. Jul 2009 A1
20090196069 Iwasaki Aug 2009 A1
20090243501 Dunn et al. Oct 2009 A1
20090244884 Trulaske, Sr. Oct 2009 A1
20090284457 Botzas et al. Nov 2009 A1
20090289580 Dunn et al. Nov 2009 A1
20100039440 Tanaka et al. Feb 2010 A1
20100102735 Chang et al. Apr 2010 A1
20100109553 Chang et al. May 2010 A1
20100165240 Cho et al. Jul 2010 A1
20100194296 Park Aug 2010 A1
20100220258 Dunn et al. Sep 2010 A1
20100231563 Dunn et al. Sep 2010 A1
20100307800 Wee et al. Dec 2010 A1
20100313592 Pae Dec 2010 A1
20110007228 Yoon et al. Jan 2011 A1
20110013114 Dunn et al. Jan 2011 A1
20110083460 Thomas et al. Apr 2011 A1
20110102704 Dunn et al. May 2011 A1
20110116000 Dunn et al. May 2011 A1
20110141724 Erion Jun 2011 A1
20110164434 Derichs Jul 2011 A1
20110205145 Lin et al. Aug 2011 A1
20110242437 Yoo et al. Oct 2011 A1
20110242839 Dunn et al. Oct 2011 A1
20110283199 Schuch et al. Nov 2011 A1
20120050958 Sanford et al. Mar 2012 A1
20120062819 Dunn et al. Mar 2012 A1
20120086344 Schuch Apr 2012 A1
20120098794 Kleinert et al. Apr 2012 A1
20120105424 Lee et al. May 2012 A1
20120134139 Jang et al. May 2012 A1
20120154712 Yu et al. Jun 2012 A1
20120212520 Matsui et al. Aug 2012 A1
20120212956 Chen Aug 2012 A1
20120242926 Hsu et al. Sep 2012 A1
20120250329 Suehiro et al. Oct 2012 A1
20120274882 Jung Nov 2012 A1
20120299891 Fujiwara et al. Nov 2012 A1
20120314447 Huang Dec 2012 A1
20120327039 Kukulj Dec 2012 A1
20130016080 Dunn et al. Jan 2013 A1
20130016296 Fujita et al. Jan 2013 A1
20130027633 Park et al. Jan 2013 A1
20130063326 Riegel Mar 2013 A1
20130094160 Narumi Apr 2013 A1
20130163277 Kim et al. Jun 2013 A1
20130258659 Edon Oct 2013 A1
20130278868 Dunn et al. Oct 2013 A1
20140016355 Ajichi Jan 2014 A1
20140078407 Green et al. Mar 2014 A1
20140085564 Hendren et al. Mar 2014 A1
20140104538 Park et al. Apr 2014 A1
20140134767 Ishida et al. May 2014 A1
20140144083 Artwohl et al. May 2014 A1
20140160365 Kwong et al. Jun 2014 A1
20140268657 Dunn et al. Sep 2014 A1
20140285477 Cho et al. Sep 2014 A1
20140340375 Dunn et al. Nov 2014 A1
20140361969 Wasinger et al. Dec 2014 A1
20150009653 Dunn et al. Jan 2015 A1
20150153506 Dunn Jun 2015 A1
20150219954 Kubo Aug 2015 A1
20150226996 Ohashi Aug 2015 A1
20150245443 Dunn et al. Aug 2015 A1
20150346525 Wolf et al. Dec 2015 A1
20160037606 Dunn et al. Feb 2016 A1
20160103275 Diaz et al. Apr 2016 A1
20160238876 Dunn et al. Aug 2016 A1
20160300549 Zhang Oct 2016 A1
20160334666 Liu Nov 2016 A1
20160335705 Williams et al. Nov 2016 A1
20160338181 Schuch et al. Nov 2016 A1
20160338182 Schuch et al. Nov 2016 A1
20160351133 Kim et al. Dec 2016 A1
20160358538 Schuch et al. Dec 2016 A1
20170059938 Brown et al. Mar 2017 A1
20170248823 Dunn et al. Aug 2017 A1
20180012566 Lin et al. Jan 2018 A1
20180048849 Dunn Feb 2018 A1
20180061297 Schuch et al. Mar 2018 A1
Foreign Referenced Citations (83)
Number Date Country
2004283319 May 2005 AU
2007216782 Sep 2007 AU
2536130 May 2005 CA
2688214 Nov 2008 CA
1836179 Sep 2006 CN
101432647 May 2007 CN
10104685 Oct 2007 CN
101339272 Jan 2009 CN
101351765 Jan 2009 CN
101681222 Mar 2010 CN
0313331 Apr 1989 EP
1678534 Jul 2006 EP
1805539 Jul 2007 EP
2156276 May 2008 EP
1941342 Jul 2008 EP
153110 Nov 1920 GB
302007 Feb 2006 IN
032009 May 2008 IN
152010 Dec 2009 IN
11095214 Apr 1999 JP
2002064842 Feb 2002 JP
2002209230 Jul 2002 JP
2002366121 Dec 2002 JP
2004004581 Jan 2004 JP
2007509372 Oct 2004 JP
2004-325629 Nov 2004 JP
2004325629 Nov 2004 JP
2005228996 Aug 2005 JP
2005236469 Sep 2005 JP
2005-292939 Oct 2005 JP
2008518251 Oct 2005 JP
2005-332253 Dec 2005 JP
2006-198344 Aug 2006 JP
2007080872 Mar 2007 JP
2009535723 May 2007 JP
200876755 Apr 2008 JP
2008112719 May 2008 JP
2008256819 Oct 2008 JP
2009036964 Feb 2009 JP
2009512898 Mar 2009 JP
2009231473 Oct 2009 JP
2010509622 Mar 2010 JP
2010527100 Aug 2010 JP
2010282109 Dec 2010 JP
2011081424 Apr 2011 JP
20-0286961 Aug 2002 KR
1020070003755 Feb 2006 KR
20070005637 Jan 2007 KR
1020070084554 May 2007 KR
20080013592 Feb 2008 KR
20080063414 Jul 2008 KR
20080074972 Aug 2008 KR
1020090007776 Jan 2009 KR
20100019997 Feb 2010 KR
1020050033986 Apr 2014 KR
101796718 Nov 2017 KR
200615598 May 2006 TW
200802054 Jan 2008 TW
200808925 Feb 2008 TW
200809285 Feb 2008 TW
200809287 Feb 2008 TW
200828093 Jul 2008 TW
200912200 Mar 2009 TW
201030376 Aug 2010 TW
201038114 Oct 2010 TW
WO9608892 Mar 1996 WO
WO2005051054 Jun 2005 WO
WO2005093703 Oct 2005 WO
WO2006001559 Jan 2006 WO
WO2006109237 Oct 2006 WO
WO2007052777 May 2007 WO
WO2005040873 May 2008 WO
WO2008138049 Nov 2008 WO
WO2008152832 Dec 2008 WO
WO2009004574 Jan 2009 WO
2009004574 Aug 2009 WO
WO2010080624 Jul 2010 WO
WO2010129271 Nov 2010 WO
WO2011100429 Aug 2011 WO
WO2011143719 Nov 2011 WO
WO2014158642 Oct 2014 WO
WO2015003130 Jan 2015 WO
WO2018031753 Feb 2018 WO
Non-Patent Literature Citations (14)
Entry
Wikipedia, Gradient-index optics, 2016.
Patrick Frantz & Deania Fernandez, Printed Circuit Boards (PCBs), Feb. 18, 2004, 2 Pages, Version 1.1.
Teravision Corp., LCD-TV Panel Control Board Specification, Nov. 2007, 24 Pages.
Supertex Inc., Constant Off-time, Buck-based LED Drivers Using HV9910, Nov. 2, 2004, 4 Pages.
Grin Tech, Grin Lenses, Aug. 25, 2016, 4 Pages.
Supertex Inc., Universal High Brightness LED Driver, 2007, 8 Pages.
Shigeru Aoyama, Akihiro Funamoto & Koichi Imanaka, Hybrid normal-reverse prism coupler for light-emitting diode backlight systems, Oct. 1, 2006, 6 Pages, vol. 45, No. 28.
Panel-Brite, Inc., High Brightness LED Backlight Technology, Mar. 11, 2009, 1 Page.
RPO, How Digital Waveguide Touch Works, Sep. 15, 2011, 1 Page.
Dave Roos, How Transmissive Film Works, article, 2008, 9 pages.
Schott, Glass made of Ideas—OPALIKA, 2016, 2 pages.
Anandan, LED Backlight: Enhancement of picture quality on LCD screen, Oct. 8-12, 2006, 5 pages.
Lu, Color shift reduction of a multi-domain IPS-LCD using RGB-LED backlight, 2006, 10 pages.
Anandan, M., Progress of LED backlights for LCDs, Journal of the SID, 2008, pp. 287-310, 16/2.
Related Publications (1)
Number Date Country
20140361969 A1 Dec 2014 US
Provisional Applications (1)
Number Date Country
61183592 Jun 2009 US
Divisions (2)
Number Date Country
Parent 13722537 Dec 2012 US
Child 13850854 US
Parent 12793474 Jun 2010 US
Child 13722537 US
Continuations (1)
Number Date Country
Parent 13850854 Mar 2013 US
Child 14453966 US