The present disclosure relates to an image processing system and method for creating a Field Sequential Color (FSC) image using Delta Sigma (ΔΣ) Pulse Density Modulation (PDM), and more particularly to an image processing system and method used in digital displays.
Display technology has become ubiquitous in daily life. Applications include smartphones, tablets, laptops, monitors, televisions (TVs), Augmented Reality (AR) and Virtual Reality (VR) head-mounted displays (HMDs), and signage. As these technologies proliferate, the share of the household energy budget that they consume grows. Energy efficiency must improve if the world's economies are to meet their climate change abatement targets while continuing to grow.
Liquid Crystal Displays (LCDs) were invented in the 1960s and are non-emissive, meaning that they require a backlight unit (BLU). Organic Light Emitting Diodes (OLEDs) have emerged due to their thinness and black levels. MicroLEDs (μLEDs) and miniLEDs (mLEDs) are the newest contenders: mLEDs normally act as a backlight source for LCDs, while μLEDs compete in small devices such as VR/AR HMDs.
An emissive display converts electrical energy into light. Each pixel emits light and each pixel turns on/off individually. Emissive displays are distinguished by a deep black level, high contrast and fast response time. The primary emissive display method is an OLED (shown in
OLEDs have some technical issues:
OLEDs are difficult to manufacture and suffer from color balance and uniformity issues, especially for the color blue. Non-uniformity means that adjacent pixels look different; uniformity means that adjacent pixels look the same. Many techniques have been proposed to address the color balance issue, including the use of white OLEDs with a color filter as shown in
A non-emissive display, sometimes called transmissive, reflective, or passive, uses optics to bend light; such displays are collectively termed spatial light modulators. Light from a source such as a light-emitting diode (LED), an mLED, or sunlight is bent. The primary non-emissive display is the Liquid Crystal Display (LCD), which is used in automotive, TV, and signage applications, but there are others such as digital light processing (DLP) and liquid crystal on silicon (LCoS). The device structure is bulkier, but it is modular, which allows it to be easily manufactured and allows each module to advance separately. LCDs are easy to manufacture, but they cannot produce true black, suffer from color inversion (i.e., poor viewing angle), and are not efficient. A typical LCD is shown in
Thus, there is a present need for a non-emissive display technology that produces bright, clear, and colorful images while increasing efficiency. For smartphones and tablets, low power consumption leads to longer battery life. For large-screen TVs and computer displays, energy commissions, such as Energy Star in the US, set the power regulations. The present disclosure and invention take a novel approach of using a Field Sequential Color (FSC) coding methodology and applying a Delta Sigma (ΔΣ) Pulse Density Modulation (PDM) circuit, which solves the problems mentioned above. FSC ΔΣ PDM builds an image over time, is frameless, and has no spatial properties. In FSC ΔΣ PDM color mixing, a display presents to the observer several mono-colored frames in sequence, which are then perceived as a single full-color frame. The human eye uses temporal integration to blend them, as shown in
The present disclosure is an image processing system and a method implemented in a display driver. Using modulation and backlight controls, the display changes the way that videos and/or images are displayed by using Field Sequential Color (FSC) Delta Sigma (ΔΣ) Pulse Density Modulation (PDM). An incoming video/image of N bits at a frame rate of F1 is converted to M bits at the display rate of F2, where N≥M and F2≥F1. The F2 frames are sent to a sequential color picker, which outputs frames containing one color, followed by the next color in a sequential pattern. The advantages of this approach are reduced power consumption, increased color saturation, increased contrast, and increased brightness.
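Purely as an illustrative, non-limiting sketch of this conversion (the Python-style function and variable names below are hypothetical and not part of the disclosure, and M=1 is assumed for brevity), one N-bpc sub-pixel value, normalized to the 0-1 range, produces F2/F1 M-bpc output values:

# Illustrative sketch only: one normalized N-bpc sub-pixel value produces
# F2/F1 one-bit (M=1) output values; the sequential color picker (described later)
# then keeps a single color per output frame.
def pdm_output_stream(normalized_input, oversampling_ratio):
    residual, outputs = 0.0, []
    for _ in range(oversampling_ratio):         # F2/F1 output frames per input frame
        new_value = normalized_input + residual
        out = 1.0 if new_value >= 1.0 else 0.0  # M=1: nearest level equal to or under new_value
        residual = new_value - out
        outputs.append(out)
    return outputs

print(pdm_output_stream(0.5, 6))   # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]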
A visual artifact called "color breakup" (CBU) has been a recurring concern with FSC displays and is commonly perceived as a rainbow appearing in the video. Color breakup is the imperfect overlap of frames on the retina caused by a difference in the relative velocity between displayed objects and an observer's eyes, for example, during saccadic eye movements or smooth pursuit of moving objects. Color breakup is reduced as the frame rate increases; the slower the update rate, the more prevalent the CBU. The disclosed invention solves the color breakup problem and eliminates CBU by performing ΔΣ PDM. The FSC algorithm uses Delta-Sigma Pulse Density Modulation (ΔΣ PDM). The original video/image is inputted, the FSC algorithm breaks the video/image down into subpixels, and each subpixel is then run through the FSC algorithm as shown in
LCD panels have internal row and column drivers, much like DRAM. Row drivers activate the rows of the display, while column drivers set the required voltage on all of the dots in the activated row and thus supply voltages to the LCD panel. An LCD panel comprises a matrix of pixels, divided into, for example, red, green, and blue "sub-pixels".
The invention and disclosure introduce a way to achieve FSC using a massively parallel FSC algorithm. The FSC algorithm allows a massively parallel architecture to be built because no pixel is related to any other pixel within the same frame. The term frame here refers to a full-resolution image such as 1080P, 4K, or an equivalent. The FSC algorithm converts every subpixel using the following formula:
New_Value=Input_Video/Image_Value+Residual_Value;
Output_Video/Image=NearestValueEqualorUnder(New_Value);
Residual_Value=New_Value−Output_Video/Image.
Note: Output_Video/Image is a normalized floating-point number between 0 and 1. Output_Video/Image corresponds to the value M. The values are spread equally between 0 and 1 in increments of 1/(2^M−1).
Example: If M=2 (2-bit-depth video), the unit interval is divided into 3 increments, and the Output_Video/Image is one of {0, 1/3, 2/3, 1.0}. If M=3 (3-bit-depth video output), the interval is divided into 7 increments: {0, 1/7, 2/7, 3/7, 4/7, 5/7, 6/7, 1.0}.
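A minimal sketch of this per-subpixel update and of the "nearest value equal or under" quantization follows, assuming normalized 0-1 values and Python as the illustration language; the function names are illustrative only and not part of the disclosure:

def nearest_value_equal_or_under(x, m_bits):
    # Output levels are spread evenly between 0 and 1 in increments of 1/(2**m_bits - 1).
    steps = (1 << m_bits) - 1
    return min(int(x * steps + 1e-9), steps) / steps

def fsc_subpixel_step(input_value, residual, m_bits):
    # No subpixel depends on any other subpixel, so this step can run massively in parallel.
    new_value = input_value + residual
    output = nearest_value_equal_or_under(new_value, m_bits)
    return output, new_value - output      # (Output_Video/Image, Residual_Value)

print(nearest_value_equal_or_under(0.89, 2))   # 0.666..., i.e. 2/3 for M=2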
FSC changes the mono-colored Backlight Unit (BLU) to an individually colored BLU. The new BLU normally uses RGB LEDs, but other diodes and/or functionally equivalent elements/devices may be used. The system illuminates Red (R), followed by Green (G), followed by Blue (B); the order of the colors is not important. A typical LED/LCD stack is shown in
Running the FSC algorithm increases the aperture ratio by 3×. Only the Red, Green, or Blue BLU illuminates on each frame. When not using FSC, the Red, Green, and Blue pixels must be illuminated in succession, which requires three column drivers, one for each color. This invention uses FSC, and therefore only one column driver is needed, because only Red, Green, or Blue is shown per frame; this reduces the number of column drivers by 2/3. Moreover, this reduction in the number of column drivers has the further advantage that the pixels per inch (PPI) increases by 3× and the contrast is improved, because scattering off of the column drivers, which are made of metal, is reduced (as shown in
Making displays run faster is desirable to reduce eye fatigue and to sell into the gaming market. Delta Sigma PDM requires less time to resolve the least significant bit (LSB) than Pulse Width Modulation (PWM). The system has to meet the human eye's integration time, which is 0.6 seconds. This translates into 420 fps to resolve N=8 bpc using M=1 bpc (refer to Table 2).
In order to more clearly illustrate the embodiments of the present disclosure, a brief description of the drawings is given below. The following drawings are only illustrative of some of the embodiments of the present disclosure and for a person of ordinary skill in the art, other drawings or embodiments may be obtained from these drawings without an inventive effort.
The technical solutions of the present disclosure will be clearly and completely described below with reference to the drawings. The embodiments described are only some of the embodiments of the present disclosure, rather than all of the embodiments. All other embodiments that are obtained by a person of ordinary skill in the art on the basis of the embodiments of the present disclosure without an inventive effort shall be covered by the protective scope of the present disclosure.
In the description of the present disclosure, it is to be noted that the orientational or positional relation denoted by the terms such as “center”, “upper”, “lower”, “left”, “right”, “vertical”, “horizontal”, “inner” and “outer” is based on the orientation or position relationship indicated by the figures, which only serves to facilitate describing the present disclosure and simplify the description, rather than indicating or suggesting that the device or element referred to must have a particular orientation, or is constructed or operated in a particular orientation, and therefore cannot be construed as a limitation on the present disclosure. In addition, the terms “first”, “second” and “third” merely serve the purpose of description and should not be understood as an indication or implication of relative importance.
In the description of the present disclosure, it should be noted that unless otherwise explicitly specified and defined, the terms “install”, “link” and “connect” shall be understood in the broadest sense, which may, for example, refer to fixed connection, detachable connection or integral connection; may refer to mechanical connection or electrical connection; may refer to direct connection or indirect connection by means of an intermediate medium; and may refer to communication between two elements. A person of ordinary skill in the art would understand the specific meaning of the terms in the present disclosure according to the specific situations.
The invention is an image processing system or method implemented within a display driver and is a novel way to display images, whether the image(s)/video(s) is/are still or moving. As the modulation scheme within the image processing system, the invention produces Field Sequential Color (FSC) using Delta Sigma (ΔΣ) Pulse Density Modulation (PDM). The system uses ΔΣ PDM and oversamples the input, thereby breaking the input into digital components (see
New_Value=Input_Video/Image_Value+Residual_Value;
Output_Video/Image=NearestValueEqualorUnder(New_Value);
Residual_Value=New_Value−Output_Video/Image.
As an example, if the initial Residual_Value=0 and the Input_Video/Image_Value=0.89 (on a 0-1 scale), then the New_Value=0.89+0=0.89. If M=2, the four possible values are 0, 1/3, 2/3, and 1.0. Therefore, the Output_Video/Image=0.67 (the nearest value that is equal or under is 2/3), and the Residual=0.89−0.67=0.22. On the next frame, if the Input_Video/Image_Value does not change, then the New_Value=0.89+0.22=1.11. Thus, the Output_Video/Image=1.0 and the Residual=1.11−1.0=0.11. The residual value is saved and will be used on the next frame at the same pixel location. The output frames occur at the display rate F2, and the ratio F2/F1 defines how many outputs occur for each input.
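The worked example above can be reproduced with a few lines of illustrative Python (a sketch only; the variable names are not part of the disclosure):

# Two output frames for a constant input of 0.89 with M=2 (levels {0, 1/3, 2/3, 1.0}).
levels = [0.0, 1/3, 2/3, 1.0]
residual = 0.0
for frame in range(2):
    new_value = 0.89 + residual                               # the input does not change
    output = max(v for v in levels if v <= new_value + 1e-9)  # nearest value equal or under
    residual = new_value - output
    print(frame, round(output, 2), round(residual, 2))
# Frame 0: output 0.67 (2/3), residual 0.22; frame 1: output 1.0, residual 0.11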
A series of shifters and adders can be implemented using different technologies, for example, a semiconductor-based technology (see
The ΔΣ PDM output is not an image. In fact, the output has no frame properties; the ΔΣ PDM output is frame-less (see
Since the human eye is the integrator, the image must be resolved within the human eye's integration time, which is 0.6 seconds. This makes ΔΣ PDM faster than PWM (refer to Table 2). The advantages of this approach are reduced power consumption, increased color saturation, increased brightness, increased contrast, and increased PPI.
Definition of Terms
Dithering hides banding by noisily transitioning from one color to another. This does not increase the bits-per-component.
A pixel is a point within the image. A pixel is made up of three or four components, such as red, green, and blue (RGB), or cyan, magenta, yellow, and black (CMYK). Components are also referred to as sub-pixels/subpixels. Throughout this document, we refer to bits-per-component (bpc), which is also known as bits per subpixel.
The present invention, which discloses a system and method for creating a Field Sequential Color (FSC) using Delta Sigma (ΔΣ) Pulse Density Modulation (PDM) for digital displays, is described in detail below with reference to the figures.
The LCD comprises an analyzer; a liquid crystal (LC); a thin film transistor (TFT) array; and a polarizer. The backlight unit (BLU) comprises a diffuser film; a color converter, which can be a quantum dot (QD) color converter or any equivalent color converter; and at least one blue LED or a plurality of blue LEDs.
When comparing the present invention to a traditional ΔΣ block diagram of
Reference number 1 is a residual from the previous iteration. The first iteration is pre-defined; a good first-order approximation of the residual is a distribution of random numbers across the image. The residual is divided into its color components. The color components can be in any color space, such as RGB (Red-Green-Blue) or CMYK (Cyan-Magenta-Yellow-Black). These color components are often termed sub-pixels/subpixels.
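Purely as an illustrative sketch of this initialization (the array dimensions and the 0-1 range of the random values are assumptions for illustration only):

# First-iteration residual: a distribution of random numbers across the image,
# one value per subpixel (here a 1080P frame with three color components).
import random
height, width, components = 1080, 1920, 3
initial_residual = [[[random.random() for _ in range(components)]
                     for _ in range(width)] for _ in range(height)]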
Reference number 2 is a video/image. The video is also divided into its color components. The video can be inputted at any frame rate F1 (0=still image; 15 fps, 24 fps, etc.).
Reference number 3 is an oversampling module. The oversampling module 3 can be software (i.e., code or an algorithm) and/or hardware such as a chip, an application processor (AP), and/or a timing controller (TCON). The oversampling module 3 can be implemented in many different ways depending on the underlying hardware. A common way is to use an N-bpc adder: Box 1 is added to Box 2 for each component. If the summation overflows the N-bpc adder, then the output value is incremented. For example, if M=1, the video input is 81 (0-255 range), and the residual from the previous frame is 200 (0-255 range), then the summation is 281, which overflows the adder; the output value=1 and the residual for the next iteration=26. The output value, defined in M-bpc, does not define a color level per frame; ΔΣ PDM is frameless. Instead, the M-bpc values are integrated over time by the eye to form the image. The M-bpc values averaged over the oversampling frequency will approximate the original input video at N-bpc, and they will be equivalent if the oversampling frequency is high enough, as shown in Table 1.
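A minimal sketch of this adder-based implementation follows, assuming M=1, N=8, and that an overflow emits an output of 1 and carries forward the sum minus the full-scale value of 255 (a reading consistent with the residual of 26 in the example above); the function name is illustrative only:

def one_bit_delta_sigma_step(video_in, residual, full_scale=255):
    # N-bpc accumulation: add the stored residual (box 1) to the incoming value (box 2).
    summation = video_in + residual
    if summation > full_scale:                 # the N-bpc adder overflows
        return 1, summation - full_scale       # output incremented; remainder carried forward
    return 0, summation                        # no overflow; the whole sum is carried forward

out, res = one_bit_delta_sigma_step(81, 200)   # example from the text
print(out, res)                                # 1 26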
Reference number 4 is a module providing the desired F2 fps. The module 4 can be software (i.e. code or algorithm) and/or hardware such as a chip, an application processor (AP) and/or a timing controller (TCON). F2 is nominally set to the display's frequency and M is set to achieve the desired goal. The goal may be equivalency, bandwidth reduction, power reduction, or Mura (i.e. lack of uniformity) correction.
Reference number 5 is an output value in M-bpc. In the above example, the value=1.
Reference number 6 is a Sequential Color Picker. In an example, the sequential color picker chooses among Red, Green or Blue. When the sequential color picker chooses Red, the sequential color picker Nulls (zeros) all information about Green and Blue.
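A minimal sketch of the sequential color picker follows, assuming RGB sub-pixels and one color per output frame cycled in a fixed (but otherwise arbitrary) order; the names are illustrative only:

COLOR_ORDER = ("red", "green", "blue")   # the order of the colors is not important

def pick_sequential_color(rgb_values, frame_index):
    # Null (zero) every component except the one selected for this output frame.
    selected = frame_index % len(COLOR_ORDER)
    return tuple(v if i == selected else 0 for i, v in enumerate(rgb_values))

print(pick_sequential_color((1, 1, 0), 0))   # (1, 0, 0): a red frame keeps only the red value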
Reference number 7 is an output to the display, which will show the M-bpc value.
A low pass filter integrates the output values over time; nominally, this is the human eye. The low pass filter can alternatively be a camera running at the input frequency (F1), or any other device that performs a low pass filter function.
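As an illustrative sketch of this low pass filter function (a simplification under the stated assumptions, not the claimed implementation), averaging the M-bpc output stream over the integration window approximates the original input value; for example, averaging the stream produced in the earlier sketch recovers the input of 0.5:

def low_pass_filter(m_bpc_outputs):
    # Temporal integration: the mean of the M-bpc outputs approximates the N-bpc input.
    return sum(m_bpc_outputs) / len(m_bpc_outputs)

print(low_pass_filter([0.0, 1.0, 0.0, 1.0, 0.0, 1.0]))   # 0.5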