The present disclosure relates generally to electronic displays and, more particularly, to gain applied to display an image or image frame on an electronic display.
This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present techniques, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
Electronic devices often use electronic displays to provide visual representations of information by displaying one or more images. Such electronic devices may include computers, mobile phones, portable media devices, tablets, televisions, virtual-reality headsets, and vehicle dashboards, among many others. To display an image, an electronic display may control light emission from display pixels based at least in part on image data, which indicates target characteristics of the image. The electronic displays may be calibrated to compensate for a current drop due to resistance on a path from a power supply, such as a power management integrated circuit (PMIC), to the electronic display. The compensation may be determined and/or tuned based on a white point for the electronic display. However, this compensation may result in overcompensation for non-white colors, resulting in oversaturation of at least some colors.
A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
The present disclosure generally relates to improving perceived image quality on an electronic display. To display an image, the electronic display may control light emission from its display pixels based at least in part on image data that indicates target characteristics (e.g., luminance) at image pixels in the image. In some instances, the image data may be generated by an image data source.
An electronic display may experience display variations based on resistance of connections between a power supply and emissive elements of the display (e.g., current drop). To correct for these display variations, the electronic device (e.g., including the display) may set drive levels that produce a target white point for white pixels. However, nonwhite pixels may be oversaturated by this compensation. Furthermore, color accuracy of the display may be decreased by cross-talk on an emissive element from data signals for other emissive elements in the display.
To address white point overcompensation and/or cross-talk, a multi-dimensional color lookup table (CLUT) may be used to convert incoming image data into compensated and/or corrected image data. For example, the CLUT may be populated to map incoming data values to correct for upcoming white point overcompensation. In other words, the mapping may be used to invert the overcompensation. Use of the CLUT enables correction of non-linear white point overcompensation by choosing values that undo the overcompensation, mapped using empirical data and/or calculations. Furthermore, the mapping in the CLUT may account for data values of adjacent channels that may cause cross-talk between the emissive element data paths, compensating for the cross-talk by reducing or eliminating cross-talk-based color inaccuracies. In other words, empirical data reflecting cross-talk variations may be input into the CLUT to adjust a subpixel based on other subpixels, such as pixel values (e.g., including multiple subpixel values) of a pixel and/or adjacent pixels.
Various aspects of this disclosure may be better understood upon reading the following detailed description and upon reference to the drawings in which:
One or more specific embodiments of the present disclosure will be described below. These described embodiments are only examples of the presently disclosed techniques. Additionally, in an effort to provide a concise description of these embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but may nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, it should be understood that references to “one embodiment” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features.
The present disclosure generally relates to electronic displays, which may be used to present visual representations of information, for example, as images in one or more image frames. To display an image, an electronic display may control light emission from its display pixels based at least in part on image data that indicates target characteristics of the image. For example, the image data may indicate target luminance (e.g., brightness) of specific color components in a portion (e.g., image pixel) of the image, which when blended (e.g., averaged) together may result in perception of a range of different colors.
An electronic display may experience display variations based on resistance of connections between a power supply and emissive elements of the display (e.g., current drop). To correct for these display variations, the electronic device (e.g., including the display) may set drive levels that produce a target white point for white pixels. However, nonwhite pixels may be oversaturated by this compensation. Furthermore, color accuracy of the display may be decreased by cross-talk on an emissive element from data signals for other emissive elements in the display.
To address white point overcompensation and/or cross-talk, a multi-dimensional color lookup table (CLUT) may be used to convert incoming image data into compensated and/or corrected image data. For example, the CLUT may be populated to map incoming data values to correct for upcoming white point overcompensation. In other words, the mapping may be used to invert the overcompensation. Use of the CLUT enables correction of non-linear white point overcompensation by choosing values that undo the overcompensation, mapped using empirical data and/or calculations. Furthermore, the mapping in the CLUT may account for data values of adjacent channels that may cause cross-talk between the emissive element data paths, compensating for the cross-talk by reducing or eliminating cross-talk-based color inaccuracies. In other words, empirical data reflecting cross-talk variations may be input into the CLUT to adjust a subpixel based on other subpixels, such as pixel values (e.g., including multiple subpixel values) of a pixel and/or adjacent pixels.
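As a minimal sketch of the inversion idea described above, the following populates a lookup table that undoes a known, non-linear overcompensation curve. The gain function here is a hypothetical stand-in for empirically measured white-point overcompensation; the disclosure does not specify a particular curve.

```python
def overcompensation(value):
    """Hypothetical measured output (0-255) after white-point compensation.

    Illustrative only; in practice this curve would come from empirical
    measurement of the panel.
    """
    return min(255, round(value * (1.0 + 0.15 * value / 255)))

# Build an inverse map: for each compensated output, record the input that
# produced it. Looking up a value in this map "undoes" the overcompensation.
inverse = {}
for v in range(256):
    inverse[overcompensation(v)] = v

def corrected(value):
    # Use the nearest populated entry, since the measured curve may not
    # cover every output code.
    key = min(inverse, key=lambda k: abs(k - value))
    return inverse[key]
```

A real CLUT would be multi-dimensional (indexed by all subpixel channels at once); this one-dimensional version only illustrates the invert-the-mapping principle.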
In some embodiments, tone compensation, brightness compensation, device-specific calibrations, and linear accessibility filters may also be used to select values to populate the CLUT to map incoming data to corrected and/or compensated data. Additionally or alternatively, device-specific calibrations, brightness compensations, linear accessibility filters, and/or tone compensation may be performed in other parts of a display pipeline including the CLUT.
Furthermore, the CLUT may be any suitable size. For example, the size of the CLUT may be based on a number of available colors for the electronic display and/or other parameters. Moreover, the number of dimensions of the CLUT may be set according to a number of indexes used to look up data. For example, if a subpixel value is to be compensated and/or corrected from a pixel having three subpixels, the CLUT may have at least three dimensions.
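The dimensionality described above can be sketched concretely. Below, a three-dimensional CLUT is indexed by (R, G, B) driving values, one dimension per subpixel; the 17-point grid density and the identity population are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

# A 17x17x17 grid is a common LUT density (an assumption here); each entry
# stores a corrected (R, G, B) triple in normalized [0, 1] units.
N = 17
clut = np.zeros((N, N, N, 3), dtype=np.float32)

# Identity population: grid point (i, j, k) maps to itself. A calibrated
# table would instead hold empirically derived corrected values.
grid = np.linspace(0.0, 1.0, N)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
clut[..., 0], clut[..., 1], clut[..., 2] = r, g, b
```

A pixel with four subpixels (e.g., RGBW) would analogously use a four-dimensional table, with each entry storing four corrected values.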
With the foregoing in mind, one embodiment of an electronic device 10 that utilizes an electronic display 12 is shown in
In the depicted embodiment, the electronic device 10 includes the electronic display 12, input devices 14, input/output (I/O) ports 16, a processor core complex 18 having one or more processor(s) or processor cores, local memory 20, a main memory storage device 22, a network interface 24, a power source 26, and image processing circuitry 27. The various components described in
As depicted, the processor core complex 18 is operably coupled with local memory 20 and the main memory storage device 22. In some embodiments, the local memory 20 and/or the main memory storage device 22 may be tangible, non-transitory, computer-readable media that store instructions executable by the processor core complex 18 and/or data to be processed by the processor core complex 18. For example, the local memory 20 may include random access memory (RAM) and the main memory storage device 22 may include read only memory (ROM), rewritable non-volatile memory such as flash memory, hard drives, optical discs, and the like.
In some embodiments, the processor core complex 18 may execute instructions stored in the local memory 20 and/or the main memory storage device 22 to perform operations, such as generating source image data. As such, the processor core complex 18 may include one or more general purpose microprocessors, one or more application specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), or any combination thereof.
As depicted, the processor core complex 18 is also operably coupled with the network interface 24. Using the network interface 24, the electronic device 10 may be communicatively coupled to a network and/or other electronic devices. For example, the network interface 24 may connect the electronic device 10 to a personal area network (PAN), such as a Bluetooth network, a local area network (LAN), such as an 802.11x Wi-Fi network, and/or a wide area network (WAN), such as a 4G or LTE cellular network. In this manner, the network interface 24 may enable the electronic device 10 to transmit image data to a network and/or receive image data from the network.
Additionally, as depicted, the processor core complex 18 is operably coupled to the power source 26. In some embodiments, the power source 26 may provide electrical power to operate the processor core complex 18 and/or other components in the electronic device 10. Thus, the power source 26 may include any suitable source of energy, such as a rechargeable lithium polymer (Li-poly) battery and/or an alternating current (AC) power converter.
Furthermore, as depicted, the processor core complex 18 is operably coupled with I/O ports 16 and the input devices 14. In some embodiments, the I/O ports 16 may enable the electronic device 10 to interface with various other electronic devices. Additionally, in some embodiments, the input devices 14 may enable a user to interact with the electronic device 10. For example, the input devices 14 may include buttons, keyboards, mice, trackpads, and the like. Additionally or alternatively, the electronic display 12 may include touch sensing components that enable user inputs to the electronic device 10 by detecting occurrence and/or position of an object touching its screen (e.g., surface of the electronic display 12).
In addition to enabling user inputs, the electronic display 12 may facilitate providing visual representations of information by displaying images (e.g., in one or more image frames). For example, the electronic display 12 may display a graphical user interface (GUI) of an operating system, an application interface, text, a still image, or video content. To facilitate displaying images, the electronic display 12 may include a display panel with one or more display pixels. Additionally, each display pixel may include one or more subpixels, which each control luminance of one color component (e.g., red, blue, or green).
As described above, the electronic display 12 may display an image by controlling luminance of the subpixels based at least in part on corresponding image data (e.g., image pixel image data and/or display pixel image data). In some embodiments, the image data may be received from another electronic device, for example, via the network interface 24 and/or the I/O ports 16. Additionally or alternatively, the image data may be generated by the processor core complex 18 and/or the image processing circuitry 27.
As described above, the electronic device 10 may be any suitable electronic device. To help illustrate, one example of a suitable electronic device 10, specifically a handheld device 10A, is shown in
As depicted, the handheld device 10A includes an enclosure 28 (e.g., housing). In some embodiments, the enclosure 28 may protect interior components from physical damage and/or shield them from electromagnetic interference. Additionally, as depicted, the enclosure 28 surrounds the electronic display 12. In the depicted embodiment, the electronic display 12 is displaying a graphical user interface (GUI) 30 having an array of icons 32. By way of example, when an icon 32 is selected either by an input device 14 or a touch-sensing component of the electronic display 12, an application program may launch.
Furthermore, as depicted, input devices 14 open through the enclosure 28. As described above, the input devices 14 may enable a user to interact with the handheld device 10A. For example, the input devices 14 may enable the user to activate or deactivate the handheld device 10A, navigate a user interface to a home screen, navigate a user interface to a user-configurable application screen, activate a voice-recognition feature, provide volume control, and/or toggle between vibrate and ring modes. As depicted, the I/O ports 16 may also open through the enclosure 28. In some embodiments, the I/O ports 16 may include, for example, an audio jack to connect to external devices.
To further illustrate, another example of a suitable electronic device 10, specifically a tablet device 10B, is shown in
As described above, the electronic display 12 may display images based at least in part on image data received, for example, from the processor core complex 18 and/or the image processing circuitry 27. Additionally, as described above, the image data may be processed before being used to display an image on the electronic display 12. In some embodiments, a display pipeline may process the image data, for example, based on gain values associated with corresponding pixel positions to facilitate improving perceived image quality of the electronic display 12.
To help illustrate, a portion 34 of the electronic device 10 including a display pipeline 36 is shown in
As depicted, the portion 34 of the electronic device 10 also includes an image data source 38, a display driver 40, a controller 42, and external memory 44. In some embodiments, the controller 42 may control operation of the display pipeline 36, the image data source 38, and/or the display driver 40. To facilitate controlling operation, the controller 42 may include a controller processor 50 and controller memory 52. In some embodiments, the controller processor 50 may execute instructions stored in the controller memory 52. Thus, in some embodiments, the controller processor 50 may be included in the processor core complex 18, the image processing circuitry 27, a timing controller in the electronic display 12, a separate processing module, or any combination thereof. Additionally, in some embodiments, the controller memory 52 may be included in the local memory 20, the main memory storage device 22, the external memory 44, internal memory 46 of the display pipeline 36, a separate tangible, non-transitory, computer readable medium, or any combination thereof.
In the depicted embodiment, the display pipeline 36 is communicatively coupled to the image data source 38. In this manner, the display pipeline 36 may receive image data corresponding with an image to be displayed on the electronic display 12 from the image data source 38, for example, in a source (e.g., RGB) format. In some embodiments, the image data source 38 may be included in the processor core complex 18, the image processing circuitry 27, or a combination thereof.
As described above, the display pipeline 36 may process the image data received from the image data source 38. To process the image data, the display pipeline 36 may include one or more image data processing blocks 54. For example, in the depicted embodiment, the image data processing blocks 54 include a color manager 56. Additionally or alternatively, the image data processing blocks 54 may include an ambient adaptive pixel (AAP) block, a dynamic pixel backlight (DPB) block, a white point correction (WPC) block, a subpixel layout compensation (SPLC) block, a burn-in compensation (BIC) block, a panel response correction (PRC) block, a dithering block, a subpixel uniformity compensation (SPUC) block, a content frame dependent duration (CDFD) block, an ambient light sensing (ALS) block, or any combination thereof. The color manager 56 controls and/or compensates color in the displayed image presented on the electronic display 12.
After processing, the display pipeline 36 may output processed image data, such as display pixel image data, to the display driver 40. Based at least in part on the processed image data, the display driver 40 may apply analog electrical signals to the display pixels of the electronic display 12 to display images in one or more image frames. In this manner, the display pipeline 36 may operate to facilitate providing visual representations of information on the electronic display 12.
To help illustrate, one embodiment of a process 60 for operating the display pipeline 36 is described in
As described above, the display pipeline 36 may receive image pixel image data, which indicates target luminance of color components at points (e.g., image pixels) in an image, from the image data source 38 (block 62). In some embodiments, the image pixel image data may include other display parameters, such as pixel greyscale levels, compensation settings, accessibility settings, brightness settings, and/or other factors that may change the appearance of the display. In some embodiments, the image pixel image data may be in a source format. For example, when the source format is an RGB format, image pixel image data may indicate target luminance of a red component, target luminance of a blue component, and target luminance of a green component at a corresponding pixel position.
Additionally, the controller 42 may instruct the display pipeline 36 to process the image pixel image data to determine display pixel image data to correct white point overcompensation (block 64) and output the display pixel image data to the display driver 40 (block 66). To determine the display pixel image data, the display pipeline 36 may convert image data from a source format to a display format based on the various display parameters. In some embodiments, the display pipeline 36 may determine the display format based at least in part on the layout of subpixels in the electronic display 12. For example, the display pipeline 36 may use white-point compensation to compensate for current drop in the panel while also utilizing white-point correction to correct potential overcompensation of the white point.
To help illustrate white-point compensation and overcompensation correction, a portion 70 of the display 12 is presented in
Each of the pixels in the portion 70 emits light using an emissive element 79. The emissive element 79 may include an organic light-emitting diode (OLED) and/or any other emissive element. An amount of light emitted from the emissive elements 79 is based on a respective current 80, 82, or 84. For example, the current 80 controls how much red light is emitted from a corresponding emissive element 79, the current 82 controls how much green light is emitted from a corresponding emissive element 79, and the current 84 controls how much blue light is emitted from a corresponding emissive element 79.
The magnitude of the currents 80, 82, and 84 is controlled by the voltage difference between ELVDD 86 and ELVSS 88. However, due to resistances 90 in the connections to a power supply (e.g., PMIC), the voltage across the portion 72 may be different from the difference between ELVDD 86 and ELVSS 88. In other words, ΔELVDD 92 and ΔELVSS 94 may cause a driving current (e.g., the current 80) through the corresponding emissive element 79 to be reduced. This reduction may be referred to as the current drop on the panel of the display 12.
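The current-drop effect can be sketched numerically: the voltage actually available at an emissive element is the supply difference (ELVDD − ELVSS) minus the IR drops across the supply-path resistances. All values below are illustrative assumptions, not figures from the disclosure.

```python
# Supply rails and path resistances (all values assumed for illustration).
elvdd, elvss = 4.6, -2.0    # supply rails, volts
r_vdd, r_vss = 3.0, 2.5     # supply-path resistances, ohms
i_drive = 0.020             # element driving current, amps

# IR drops on the ELVDD and ELVSS paths (the deltas in the text).
delta_elvdd = i_drive * r_vdd
delta_elvss = i_drive * r_vss

# Voltage remaining across the pixel after the current drop.
v_available = (elvdd - elvss) - (delta_elvdd + delta_elvss)
```

Because the drops scale with the driving current, brighter content sees a larger reduction, which is why the compensation described next is tuned against the maximum-drive white point.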
To address current drop, the display pipeline 100 (e.g., the display pipeline 36) attempts to compensate by tuning currents through the emissive elements 79 to produce a white point corresponding to a greyscale value of 255, e.g., combining maximum driving of the subpixels. This white point compensation is performed in the display pipeline 100, specifically in a white point compensation transform block 102. This white point compensation transform block 102 may receive various parameters that control this compensation. For example, the white point compensation transform block 102 may utilize a tone compensation 104, brightness compensation 106, and primary calibration 108 to determine the white point for the display 12. The tone compensation 104 may compensate for ambient light (e.g., color and/or brightness). For example, the tone compensation 104 may be used to compensate for colors and brightness of ambient light to ensure that the appearance of the displayed image is the same between different ambient light conditions. Additionally or alternatively, the tone compensation 104 may be used to set certain tones for display images based on settings. For example, a night mode may be used to reduce blue light emission by adjusting the white point determined from the white point compensation transform block 102. The brightness compensation 106 is based on a brightness setting that is used for the display 12. The primary calibration 108 may include panel-specific calibration factors to correct for panel variability.
The color manager 56 may include a three-dimensional color lookup table (CLUT) 110 that may be used to convert the image data from one format to another. The color manager 56 may also be used to convert image data into a suitable panel gamut (e.g., display range of colors) for the display 12 using panel gamut conversion parameters 112 in a pre-CLUT transformation block 113. The panel gamut conversion parameters 112 may include a palette of physical colors available for display using the display 12. The color manager 56, using the three-dimensional lookup table 110, may also be used to adjust image data based on linear accessibility filters 114 and non-linear accessibility features 116. The linear accessibility filters 114 may include various linear filters that change the appearance of display data on the display 12. For example, these linear accessibility filters 114 may include color filters that adjust the incoming data to compensate for color vision deficiency. For instance, the color filters may include a grayscale filter, a red/green filter for Protanopia, a green/red filter for Deuteranopia, a blue/yellow filter for Tritanopia, and/or other custom filters. Since these linear accessibility filters 114 are linear, these filters may be applied in the pre-CLUT transformation block 113 in the pipeline 100 before the CLUT 110. The color manager 56 may also include a pre-CLUT range map block 115 that maps colors from the image data to the CLUT 110.
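Because the accessibility filters described above are linear, each can be expressed as a 3×3 matrix applied to an RGB pixel before the CLUT. The grayscale filter below uses Rec. 709 luma weights as an illustrative coefficient choice; the disclosure does not specify particular filter coefficients.

```python
import numpy as np

# Grayscale accessibility filter as a linear 3x3 matrix: every output
# channel is the same luma-weighted sum of the input channels (Rec. 709
# weights, an assumption for illustration).
grayscale = np.array([
    [0.2126, 0.7152, 0.0722],
    [0.2126, 0.7152, 0.0722],
    [0.2126, 0.7152, 0.0722],
])

pixel = np.array([1.0, 0.0, 0.0])   # pure red, normalized [0, 1]
filtered = grayscale @ pixel        # equal R = G = B, i.e., gray
```

Being a plain matrix product, such a filter composes with the other pre-CLUT linear transforms (e.g., gamut conversion) into a single matrix, which is why it can live before the CLUT 110 in the pipeline.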
The non-linear accessibility features 116 may include other accessibility features that are non-linear and change the appearance of display data on the display 12. For example, the non-linear accessibility features 116 may include an inversion mode that inverts colors in the image data to aid in readability for those with certain vision deficiencies. These non-linear accessibility features may be applied in a post-CLUT range map 118 and/or a post-CLUT transform block 120.
The display pipeline 100 may include other processing blocks. For example, the illustrated embodiment of the display pipeline 100 includes an ambient adaptive pixel (AAP) block 122 and a dynamic pixel backlight (DPB) block 124. The AAP block 122 may adjust pixel values in the image content in response to ambient conditions. The DPB block 124 may adjust a backlight setting for the display 12 according to the image content. For example, in some embodiments, the DPB block 124 may perform histogram equalization on image data and decrease the backlight output to reduce power consumption without changing the appearance of the image data on the display 12.
Note that color accuracy of the display 12 is at least partially driven by white point compensation in the white point compensation transform block 102 (e.g., on a frame-by-frame basis). As previously noted, white point compensation using a white point (e.g., grayscale value 255 for multiple pixels) may address some issues with current drop. However, performing white point compensation based on the white point may cause oversaturation of nonwhite colors due to overcompensation, since the compensation is based on the white point rather than the nonwhite color (e.g., R=0, G=100, and B=0). Moreover, color accuracy issues may be derived from cross-talk that changes (e.g., increases) an emission level away from a target value for the display as the emission target value increases. For example,
To address these issues, the display pipeline 36, 100 may utilize the three-dimensional CLUT 110 to modulate luminance of subpixels based on total current level in the display 12 and/or compensations for the data. In other words, modulation of a luminance level of a subpixel is a function of current through other channels. To aid in explanation,
The display pipeline 36, 100 then utilizes the CLUT 110 to look up a driving level for an emissive element of the multiple emissive elements based at least in part on the driving values for the multiple emissive elements (block 156). By looking up a driving level for the emissive element (e.g., green subpixel) based on other emissive elements (e.g., red and blue subpixels), the effect of cross-talk on the display 12 may be reduced and/or eliminated. Additionally or alternatively to using multiple channel information to calculate driving levels of a single subpixel, in some embodiments, the lookup table may include the compensation information to correct for oversaturation and/or other compensation issues. The electronic device 10 then drives the emissive element to the driving level (block 158).
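The lookup in block 156 can be sketched as follows: a corrected driving value for each subpixel is read from a three-dimensional CLUT indexed by all three subpixel values, so the result depends on the other channels as described above. Trilinear interpolation between grid points is an assumption for illustration; the disclosure does not name an interpolation scheme.

```python
import numpy as np

def clut_lookup(clut, rgb):
    """Trilinearly interpolate a (N, N, N, 3) CLUT at normalized RGB."""
    n = clut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.minimum(pos.astype(int), n - 2)   # lower grid corner
    f = pos - lo                              # fractional position in cell
    out = np.zeros(3)
    # Blend the 8 grid points surrounding the input color.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                out += w * clut[lo[0] + dr, lo[1] + dg, lo[2] + db]
    return out

# Identity CLUT for demonstration: a calibrated table would instead hold
# cross-talk- and overcompensation-corrected driving levels.
n = 5
g = np.linspace(0.0, 1.0, n)
ident = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
```

With a calibrated (non-identity) table, the green output for (R=0, G=0.4, B=0) can differ from the green output for (R=1, G=0.4, B=1), which is exactly the cross-channel dependence the block exploits.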
Although the foregoing embodiments include using a three-dimensional CLUT, some embodiments may utilize a multi-dimensional CLUT that includes a different number of dimensions than three. For example, when a pixel includes a different number of subpixels (e.g., 4 subpixels RGBW), the CLUT may have a number of dimensions that match the number of subpixels in a pixel.
Furthermore, each of the display pipelines 100, 170, 174, and 176 includes a CLUT 110 in a static location. However, in some embodiments, the CLUT 110 may be located at a different location in a display pipeline. For example, instead of using software compensation of cross-talk as previously discussed, the CLUT 110 may be moved closer to an end of the display pipeline to reduce cross-talk without convoluting the LUT data to deal with cross-talk.
The specific embodiments described above have been shown by way of example, and it should be understood that these embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function] . . . ” or “step for [perform]ing [a function] . . . ”, it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).
Number | Date | Country | |
---|---|---|---|
20190080656 A1 | Mar 2019 | US |