A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
The present disclosure generally relates to electronic devices with display panels, and more particularly, to schemes for sub-pixel uniformity correction (SPUC) on a display panel. For example, image processing circuitry may include a SPUC block to adjust the analog voltage applied (e.g., by pixel drive circuitry) to the display pixels by adjusting the image data provided thereto. In general, the SPUC block may convert input image data from a gray level domain to a voltage domain, compensate the voltage data, and convert the compensated voltage data back to the gray level domain. However, as presently recognized, in some scenarios, the voltage compensation may decrease or increase the value of the voltage data to a level that is clipped by the minimum or maximum gray level, respectively, when converted back to the gray level domain.
As such, in some embodiments, the SPUC block may utilize a calibrated V2G mapping that expands the headroom and/or footroom of the gray level domain with respect to the voltage domain. In other words, a lower voltage data level may map to the lowest gray level and/or a higher voltage data level may map to the highest gray level relative to the G2V mapping used prior to the voltage compensation. The calibrated V2G mapping may generate, from the compensated voltage data, compensated image data that is indicative of the same range of luminances, but a wider range of voltages, than the input image data. Furthermore, the compensated image data may be provided to the pixel drive circuitry to drive the display pixels at the compensated voltages.
Additionally, the pixel drive circuitry may utilize a calibrated G2V (gray-to-voltage) mapping (e.g., an inverse mapping of the calibrated V2G mapping) to obtain the desired voltage levels for driving the display pixels. In other words, the extended voltage range of the compensated voltage data may be realized at the pixel drive circuitry, and the analog voltages corresponding to the compensated voltage data may be supplied to the display pixels. As such, by utilizing the calibrated V2G mapping and the calibrated G2V mapping, the headroom and/or footroom in the gray level domain may be increased to accommodate the compensated voltage data levels that would otherwise be clipped by the gray level domain.
Various refinements of the features noted above may exist in relation to various aspects of the present disclosure. Further features may also be incorporated in these various aspects as well. These refinements and additional features may exist individually or in any combination. For instance, various features discussed below in relation to one or more of the illustrated embodiments may be incorporated into any of the above-described aspects of the present disclosure alone or in any combination. The brief summary presented above is intended only to familiarize the reader with certain aspects and contexts of embodiments of the present disclosure without limitation to the claimed subject matter.
Various aspects of this disclosure may be better understood upon reading the following detailed description and upon reference to the drawings in which:
One or more specific embodiments of the present disclosure will be described below. These described embodiments are only examples of the presently disclosed techniques. Additionally, in an effort to provide a concise description of these embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but may nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, it should be understood that references to “one embodiment” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Furthermore, the phrase A “based on” B is intended to mean that A is at least partially based on B. Moreover, the term “or” is intended to be inclusive (e.g., logical OR) and not exclusive (e.g., logical XOR). In other words, the phrase A “or” B is intended to mean A, B, or both A and B.
Electronic devices often use electronic displays to present visual information. Such electronic devices may include computers, mobile phones, portable media devices, tablets, televisions, virtual-reality headsets, and vehicle dashboards, among many others. To display an image, an electronic display controls the luminance (and, as a consequence, the color) of its display pixels based on corresponding image data received at a particular resolution. For example, an image data source may provide image data as a stream of pixel data, in which data for each display pixel indicates a target luminance (e.g., brightness and/or color) of one or more display pixels located at corresponding pixel positions. In some embodiments, image data may indicate luminance per color component, for example, via red component image data, blue component image data, and green component image data, collectively referred to as RGB image data (e.g., RGB, sRGB). Additionally or alternatively, image data may be indicated by a luma channel and one or more chrominance channels (e.g., YCbCr, YUV, etc.), grayscale (e.g., gray level), or other color basis. It should be appreciated that a luma channel, as disclosed herein, may encompass linear, non-linear, and/or gamma-corrected luminance values.
The display pixels of an electronic display may include self-emissive pixels such as light-emitting diodes (LEDs) (e.g., organic light-emitting diodes (OLEDs), micro-LEDs (μLEDs), active matrix organic light-emitting diodes (AMOLEDs), etc.) or transmissive pixels such as on a liquid crystal display (LCD). However, due to various properties associated with manufacturing of the display (e.g., manufacturing variations), driving the display pixels (e.g., crosstalk or other electrical anomalies), and/or other characteristics related to the display, different display pixels provided with the same gray level of image data may output different amounts of light (e.g., luminance). As such, in some embodiments, the image data may be processed to account for one or more physical or digital effects associated with displaying the image data.
For example, in some embodiments, image processing circuitry may include a sub-pixel uniformity correction (SPUC) block (e.g., SPUC circuitry) to adjust the driving current and/or voltage for each pixel to account for differences in output luminance between the pixels, such as due to manufacturing variance. Indeed, some display pixels may exhibit different luminance outputs at the same voltage/current than other pixels, and such differences may be noted and/or preprogrammed during manufacturing so that they may be accounted for. As discussed herein, the SPUC block may account for an adjustment to the voltage of a display pixel. However, as should be appreciated, the change in voltage may also be associated with a change in current, and the techniques discussed herein may be utilized to adjust either driving current or voltage.
In general, the SPUC block may adjust the driving current and/or voltage (e.g., provided by pixel drive circuitry) for the display pixels by adjusting the image data provided thereto. Furthermore, as the change to be applied is indicative of a change in applied voltage, the change to the image data may be applied in a voltage domain that represents the image data as a digital value (e.g., voltage data) of the voltage to be applied to a pixel. In some embodiments, input image data may be converted from a gray level domain to the voltage domain, generating voltage data, based on a G2V (gray-to-voltage) mapping. The G2V mapping may be panel specific (e.g., specific to the individual display panel, a manufactured batch of display panels, a model of the display panel, etc.). For example, the G2V mapping may provide a gamma or other optical calibration that correlates the gray levels to amounts of luminance desired to be displayed, which may vary based on the display panel. Additionally, as variations in manufacturing may cause differences between different display panels, the voltage compensation (e.g., adjustment) made in the voltage domain may be based on a voltage compensation map, which may also be panel specific (e.g., specific to the individual display panel, a manufactured batch of display panels, a model of the display panel, etc.).
Traditionally, the compensated voltage data may be converted back to the gray level domain via a V2G (voltage-to-gray) mapping that is the inverse of the G2V mapping. However, as presently recognized, in some scenarios, the voltage compensation may adjust the desired voltage level to a level that is clipped by the minimum or maximum gray level. For example, if a particular display pixel is naturally brighter than the average (e.g., expected) display pixel, a negative voltage compensation may be added to the voltage data to reduce the applied voltage. However, if the voltage compensation would reduce the voltage data past a lower threshold voltage data level corresponding to the lowest gray level, the compensated voltage data may be effectively clipped at the lower threshold voltage data level when mapped back to the gray level domain (e.g., via the V2G mapping). In a similar manner, a naturally dimmer display pixel (e.g., relative to the average display pixel of the display panel) may receive a voltage compensation that is effectively clipped at an upper threshold voltage data level corresponding to a maximum gray level (e.g., when mapped back to the gray level domain, such as via the V2G mapping). Such clipping may reduce the effectiveness of the SPUC and result in visible artifacts, such as luminance variations between display pixels.
As such, embodiments of the present disclosure may include a calibrated V2G mapping that expands the headroom and/or footroom of the gray level domain with respect to the voltage domain. In other words, the lowest gray level may be mapped to a calibrated lower threshold voltage level less than the lower threshold voltage data level and/or the highest gray level may be mapped to a calibrated higher threshold voltage level greater than the upper threshold voltage data level. Furthermore, in some embodiments, the calibrated V2G mapping may remap (e.g., relative to the G2V mapping) a portion of the gray levels, such as those below a lower tap point threshold and/or above an upper tap point threshold, such that certain tap points of the mappings are the same. Moreover, in some embodiments, the calibrated V2G mapping may remap (e.g., relative to the G2V mapping) the entire spectrum of gray levels, relative to the voltage domain, such that one or zero tap points remain in common with the G2V mapping. The calibrated V2G mapping may generate compensated image data indicative of the same range of luminances, but a wider range of voltages than the input image data. Furthermore, the compensated image data may be provided to the pixel drive circuitry to drive the display pixels at the compensated voltages.
In general, the pixel drive circuitry may convert gray level domain image data to the voltage domain for determining and/or selecting the desired analog voltages to drive the display pixels therewith. For example, the pixel drive circuitry may receive display image data (e.g., the compensated image data or display image data based on the compensated image data) and utilize one or more digital-to-analog converters, multiplexers, or other circuitry to generate, select, or otherwise obtain the analog voltages to drive the display pixels. As should be appreciated, in some embodiments, one or more image processing techniques, such as dithering, may occur on the compensated image data prior to being transmitted to the pixel drive circuitry. However, as the G2V mapping used prior to the voltage compensation does not map to the extended voltage range of the compensated image data, a calibrated G2V (gray-to-voltage) mapping may be utilized at the pixel drive circuitry to obtain the desired voltage levels for driving the display pixels. Moreover, the calibrated G2V mapping may be the inverse mapping (e.g., with the same tap points) of the calibrated V2G mapping. In other words, the extended voltage range of the compensated voltage data may be realized at the pixel drive circuitry to supply corresponding analog voltages to the display pixels. As should be appreciated, the pixel drive circuitry may or may not convert the display image data to digital voltage data (e.g., compensated voltage data) in the voltage domain prior to selecting the analog voltage for a display pixel. For example, in some embodiments, the pixel drive circuitry may directly select the analog voltage based on the display image data in accordance with the calibrated G2V mapping (e.g., implemented in hardware or software). Alternatively, the digital voltage data may be generated based on the display image data and the calibrated G2V mapping, and the analog voltage may be selected based on the digital voltage data. As such, by utilizing the calibrated V2G mapping and the calibrated G2V mapping, the headroom and/or footroom in the gray level domain may be increased to accommodate the compensated voltage data levels that would otherwise be clipped by the gray level domain.
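By way of a non-limiting illustration, the following Python sketch walks through the flow described above under simplified assumptions: the G2V mapping and the calibrated V2G/G2V mappings are treated as linear stand-ins, and all names and numeric values (e.g., g2v, v2g_calibrated, the voltage limits) are hypothetical rather than taken from the disclosure.

```python
# Illustrative end-to-end SPUC flow: gray -> voltage, per-pixel compensation,
# calibrated V2G back to gray, and calibrated G2V at the drive side.
import numpy as np

GRAY_MAX = 255
V_MIN, V_MAX = 1.0, 5.0              # voltage range of the original G2V mapping
V_MIN_CAL, V_MAX_CAL = 0.8, 5.2      # expanded footroom/headroom (calibrated)

def g2v(gray):
    """Original gray-to-voltage mapping (linear stand-in for a calibrated LUT)."""
    return V_MIN + (gray / GRAY_MAX) * (V_MAX - V_MIN)

def v2g_calibrated(volts):
    """Calibrated voltage-to-gray mapping with expanded head/footroom."""
    gray = (volts - V_MIN_CAL) / (V_MAX_CAL - V_MIN_CAL) * GRAY_MAX
    return np.clip(np.round(gray), 0, GRAY_MAX)

def g2v_calibrated(gray):
    """Inverse of the calibrated V2G mapping, applied at the drive side."""
    return V_MIN_CAL + (gray / GRAY_MAX) * (V_MAX_CAL - V_MIN_CAL)

input_gray = np.array([0, 10, 128, 245, 255], dtype=float)
delta_v    = np.array([-0.15, -0.18, 0.05, 0.12, 0.17])   # per-pixel compensation

voltage_data      = g2v(input_gray)                    # gray level -> voltage domain
compensated_volts = voltage_data + delta_v             # apply compensation map
compensated_gray  = v2g_calibrated(compensated_volts)  # back to (calibrated) gray
drive_volts       = g2v_calibrated(compensated_gray)   # realized at drive circuitry

# With the inverse of the original G2V mapping, values outside [V_MIN, V_MAX]
# would clip; with the calibrated pair they survive the round trip to within
# gray-level quantization.
print(np.round(drive_volts - compensated_volts, 3))
```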
With the foregoing in mind,
The electronic device 10 may include one or more electronic displays 12, input devices 14, input/output (I/O) ports 16, a processor core complex 18 having one or more processors or processor cores, local memory 20, a main memory storage device 22, a network interface 24, a power source 26, and image processing circuitry 28. The various components described in
The processor core complex 18 is operably coupled with local memory 20 and the main memory storage device 22. Thus, the processor core complex 18 may execute instructions stored in local memory 20 or the main memory storage device 22 to perform operations, such as generating or transmitting image data to display on the electronic display 12. As such, the processor core complex 18 may include one or more general purpose microprocessors, one or more application specific integrated circuits (ASICs), one or more field programmable logic arrays (FPGAs), or any combination thereof.
In addition to program instructions, the local memory 20 or the main memory storage device 22 may store data to be processed by the processor core complex 18. Thus, the local memory 20 and/or the main memory storage device 22 may include one or more tangible, non-transitory, computer-readable media. For example, the local memory 20 may include random access memory (RAM) and the main memory storage device 22 may include read-only memory (ROM), rewritable non-volatile memory such as flash memory, hard drives, optical discs, or the like.
The network interface 24 may communicate data with another electronic device or a network. For example, the network interface 24 (e.g., a radio frequency system) may enable the electronic device 10 to communicatively couple to a personal area network (PAN), such as a BLUETOOTH® network, a local area network (LAN), such as an 802.11x Wi-Fi network, or a wide area network (WAN), such as a 4G, Long-Term Evolution (LTE), or 5G cellular network.
The power source 26 may provide electrical power to operate the processor core complex 18 and/or other components in the electronic device 10. Thus, the power source 26 may include any suitable source of energy, such as a rechargeable lithium polymer (Li-poly) battery and/or an alternating current (AC) power converter.
The I/O ports 16 may enable the electronic device 10 to interface with various other electronic devices. The input devices 14 may enable a user to interact with the electronic device 10. For example, the input devices 14 may include buttons, keyboards, mice, trackpads, and the like. Additionally or alternatively, the electronic display 12 may include touch sensing components that enable user inputs to the electronic device 10 by detecting occurrence and/or position of an object touching its screen (e.g., surface of the electronic display 12).
The electronic display 12 may display a graphical user interface (GUI) (e.g., of an operating system or computer program), an application interface, text, a still image, and/or video content. The electronic display 12 may include a display panel with one or more display pixels to facilitate displaying images. Additionally, each display pixel may represent one of the sub-pixels that control the luminance of a color component (e.g., red, green, or blue). As used herein, a display pixel may refer to a collection of sub-pixels (e.g., red, green, and blue subpixels) or may refer to a single sub-pixel.
As described above, the electronic display 12 may display an image by controlling the luminance output (e.g., light emission) of the sub-pixels based on corresponding image data. In some embodiments, pixel or image data may be generated by or received from an image source, such as the processor core complex 18, a graphics processing unit (GPU), storage device 22, or an image sensor (e.g., camera). Additionally, in some embodiments, image data may be received from another electronic device 10, for example, via the network interface 24 and/or an I/O port 16. Moreover, in some embodiments, the electronic device 10 may include multiple electronic displays 12 and/or may perform image processing (e.g., via the image processing circuitry 28) for one or more external electronic displays 12, such as connected via the network interface 24 and/or the I/O ports 16.
The electronic device 10 may be any suitable electronic device. To help illustrate, one example of a suitable electronic device 10, specifically a handheld device 10A, is shown in
The handheld device 10A may include an enclosure 30 (e.g., housing) to, for example, protect interior components from physical damage and/or shield them from electromagnetic interference. The enclosure 30 may surround, at least partially, the electronic display 12. In the depicted embodiment, the electronic display 12 is displaying a graphical user interface (GUI) 32 having an array of icons 34. By way of example, when an icon 34 is selected either by an input device 14 or a touch-sensing component of the electronic display 12, an application program may launch.
Input devices 14 may be accessed through openings in the enclosure 30. Moreover, the input devices 14 may enable a user to interact with the handheld device 10A. For example, the input devices 14 may enable the user to activate or deactivate the handheld device 10A, navigate a user interface to a home screen, navigate a user interface to a user-configurable application screen, activate a voice-recognition feature, provide volume control, and/or toggle between vibrate and ring modes. Moreover, the I/O ports 16 may also open through the enclosure 30. Additionally, the electronic device may include one or more cameras 36 to capture pictures or video. In some embodiments, a camera 36 may be used in conjunction with a virtual reality or augmented reality visualization on the electronic display 12.
Another example of a suitable electronic device 10, specifically a tablet device 10B, is shown in
Turning to
As described above, the electronic display 12 may display images based at least in part on image data. Before being used to display a corresponding image on the electronic display 12, the image data may be processed, for example, via the image processing circuitry 28. Moreover, the image processing circuitry 28 may process the image data for display on one or more electronic displays 12. For example, the image processing circuitry 28 may include a display pipeline, memory-to-memory scaler and rotator (MSR) circuitry, warp compensation circuitry, or additional hardware or software means for processing image data. The image data may be processed by the image processing circuitry 28 to reduce or eliminate image artifacts, compensate for one or more different software or hardware related effects, and/or format the image data for display on one or more electronic displays 12. As should be appreciated, the present techniques may be implemented in standalone circuitry, software, and/or firmware, and may be considered a part of, separate from, and/or parallel with a display pipeline or MSR circuitry.
To help illustrate, a portion of the electronic device 10, including image processing circuitry 28, is shown in
The electronic device 10 may also include an image data source 38, a display panel 40, and/or a controller 42 in communication with the image processing circuitry 28. In some embodiments, the display panel 40 of the electronic display 12 may be a self-emissive display (e.g., organic light-emitting-diode (OLED) display, micro-LED display, etc.), a transmissive display (e.g., liquid crystal display (LCD)), or any other suitable type of display panel 40. In some embodiments, the controller 42 may control operation of the image processing circuitry 28, the image data source 38, and/or the display panel 40. To facilitate controlling operation, the controller 42 may include a controller processor 44 and/or controller memory 46. In some embodiments, the controller processor 44 may be included in the processor core complex 18, the image processing circuitry 28, a timing controller in the electronic display 12, a separate processing module, or any combination thereof and execute instructions stored in the controller memory 46. Additionally, in some embodiments, the controller memory 46 may be included in the local memory 20, the main memory storage device 22, a separate tangible, non-transitory, computer-readable medium, or any combination thereof.
The image processing circuitry 28 may receive source image data 48 corresponding to a desired image to be displayed on the electronic display 12 from the image data source 38. The source image data 48 may indicate target characteristics (e.g., pixel data) corresponding to the desired image using any suitable source format, such as an RGB format, an αRGB format, a YCbCr format, and/or the like. Moreover, the source image data may be fixed or floating point and be of any suitable bit-depth. Furthermore, the source image data 48 may reside in a linear color space, a gamma-corrected color space, or any other suitable color space. Moreover, as used herein, pixel data/values of image data may refer to individual color component (e.g., red, green, and blue) data values corresponding to pixel positions of the display panel.
As described above, the image processing circuitry 28 may operate to process source image data 48 received from the image data source 38. The image data source 38 may include captured images (e.g., from one or more cameras 36), images stored in memory, graphics generated by the processor core complex 18, or a combination thereof. Additionally, the image processing circuitry 28 may include one or more image data processing blocks 50 (e.g., circuitry, modules, processing stages, algorithms, etc.) such as a sub-pixel uniformity correction (SPUC) block 52. As should be appreciated, multiple other processing blocks 54 may also be incorporated into the image processing circuitry 28, such as a pixel contrast control (PCC) block, color management block, a dither block, a blend block, a warp block, a scaling/rotation block, etc. before and/or after the SPUC block 52. The image data processing blocks 50 may receive and process source image data 48 and output display image data 56 in a format (e.g., digital format, image space, and/or resolution) interpretable by the display panel 40. Further, the functions (e.g., operations) performed by the image processing circuitry 28 may be divided between various image data processing blocks 50, and, while the term “block” is used herein, there may or may not be a logical or physical separation between the image data processing blocks 50. After processing, the image processing circuitry 28 may output the display image data 56 to the display panel 40. Based at least in part on the display image data 56, analog electrical signals may be provided, via pixel drive circuitry 58, to display pixels 60 of the display panel 40 to illuminate the display pixels 60 at a desired luminance level and display a corresponding image.
The pixel drive circuitry 58 may be utilized to provide suitable power to the display pixels 60. In some embodiments, the pixel drive circuitry 58 may generate one or more gamma reference voltages (e.g., supplied by a gamma generator) to be applied to the display pixels 60 to achieve the desired luminance outputs. To help illustrate,
In some embodiments, the electronic display 12 may use analog voltages to power display pixels 60 at various voltages that correspond to different luminance levels. For example, the display image data 56 may correspond to original (e.g., source image data 48) or processed image data and contain target luminance values for each display pixel 60. As used herein, pixels or pixel data may refer to a grouping of sub-pixels (e.g., individual color component pixels such as red, green, and blue) or the sub-pixels themselves. Moreover, the pixel drive circuitry 58 may include one or more display drivers 66 (also known as source drivers, data drivers, column drivers, etc.), source latches 68, source amplifiers 70, and/or any other suitable logic/circuitry to provide the appropriate analog voltage(s) to the display pixels 60, based on the display image data 56. For example, in some embodiments, the circuitry of the display drivers 66 or additional circuitry coupled thereto may include one or more DACs and/or multiplexers to select (e.g., from the gamma bus 64), generate, or otherwise obtain an analog voltage based on the display image data 56. As such, the pixel drive circuitry 58 may apply power at a corresponding voltage and/or current to a display pixel 60 to achieve a target luminance output from the display pixel 60, based on the display image data 56. Such power, at the appropriate analog voltages for each display pixel 60, may travel down analog datalines 72 to the display pixels 60.
In some embodiments, the different analog voltages may be generated by a gamma generator 62 via one or more digital-to-analog converters (DACs) 74, amplifiers 76, and/or resistor strings (not shown). As discussed above, the different analog voltages supplied by the gamma bus 64 may correspond to at least a portion of the values of the display image data 56. For example, 8-bit display image data 56 per color component may correspond to 256 different gray levels and, therefore, 256 different analog voltages per color component. Indeed, display image data 56 corresponding to 8-bits per color component may yield millions or billions of color combinations and, in some embodiments, may include the brightness of the electronic display 12 for a given frame. As should be appreciated, the display image data 56 and corresponding voltage outputs may be associated with any suitable bit-depth depending on implementation and/or may use any suitable color space (e.g., RGB (red/green/blue), sRGB, Adobe RGB, HSV (hue/saturation/value), YUV (luma/chroma/chroma), Rec. 2020, etc.). Furthermore, the gamma bus 64 may include more or fewer analog voltages than the corresponding bit-depth of the display image data 56. For example, in some embodiments, the same analog voltages may be used for multiple gray levels, for example, via interpolation between analog voltages and/or pulse-width modulation of current flow to obtain the different perceived luminance outputs. In some embodiments, the gamma generator 62 and/or pixel drive circuitry 58 may provide the display pixels 60 with negative voltages relative to a reference point (e.g., ground). As should be appreciated, the positive and negative voltages may be used in a similar manner to operate the display pixels 60, and they may have mirrored or different mappings between voltage level and target gray level.
Additionally, in some embodiments, different color component display pixels 60 (e.g., a red sub-pixel, a green sub-pixel, a blue sub-pixel, etc.) may have different mappings between voltage level and target gray level. For example, display pixels 60 of different color components may have different luminance outputs given the same driving voltage/current. As such, in some embodiments, one or more gamma buses 64 may be used for each color component and/or voltage polarity. As should be appreciated, the mappings between voltage level and target gray level may depend on the type of display pixels (e.g., LCD, LED, OLED, etc.), the brightness setting, a color hue setting, temperature, contrast control, pixel aging, etc., and, therefore, may depend on implementation.
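As a non-limiting sketch of the indexing described above (an 8-bit gray code per color component selecting among 256 gamma reference voltages, with separate mappings per color component), the fragment below builds hypothetical per-color gamma buses; the power-law shape and voltage endpoints are invented for illustration and are not values from the disclosure.

```python
# Hypothetical per-color gamma buses: one array of 256 reference voltages per
# color component, indexed directly by an 8-bit gray code.
import numpy as np

BITS = 8
codes = np.arange(2 ** BITS)

def build_gamma_bus(v_low, v_high, gamma=2.2):
    """256 reference voltages following a simple power-law stand-in curve."""
    return v_low + (codes / codes.max()) ** (1.0 / gamma) * (v_high - v_low)

gamma_buses = {
    "red":   build_gamma_bus(1.0, 4.6),
    "green": build_gamma_bus(1.0, 4.2),
    "blue":  build_gamma_bus(1.0, 5.0),
}

def select_drive_voltage(color: str, gray_code: int) -> float:
    """Select the analog reference voltage for an 8-bit gray code."""
    return float(gamma_buses[color][gray_code])

print(select_drive_voltage("green", 0), select_drive_voltage("green", 255))
```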
As discussed herein, the display pixels 60 (e.g., sub-pixels of individual color components) of an electronic display 12 may exhibit variations therebetween, such as due to manufacturing variances or other factors. For example, some display pixels 60, even those of the same color component, may exhibit different luminance outputs when supplied with the same voltage/current. As such, when displaying an image, image artifacts such as luminance variations may be perceived at one or more locations across the electronic display 12. As such, in some embodiments, the image processing circuitry 28 may include the SPUC block 52 (e.g., SPUC circuitry) to adjust the driving current and/or voltage for different pixels to account for differences in output luminance therebetween, such as due to manufacturing variance.
In general, the SPUC block 52 may receive input image data 78 and generate compensated image data 80 based thereon to adjust the driving voltage (e.g., applied via the pixel drive circuitry 58) for the display pixels 60, as shown in
As such, in some embodiments, the SPUC block 52 may include a gray-to-voltage domain conversion sub-block 82 to convert the input image data 78 from a gray level domain to the voltage domain, generating voltage data 84, based on a G2V (gray-to-voltage) mapping 86. The G2V mapping 86 may be panel specific (e.g., specific to the individual display panel 40, a manufactured batch of display panels 40, a model of the display panel 40, etc.). Additionally, the G2V mapping 86 may provide or be based on (e.g., in accordance with) a gamma or other optical calibration that correlates the gray levels of the gray level domain to amounts of luminance that are desired to be displayed. Further, in some embodiments, different brightness settings of the electronic display 12 may correspond to different G2V mappings 86. For example, in some embodiments, a given gray level may correspond to different amounts of output luminance, and therefore different driving voltages, at different brightness settings of the electronic display 12. As should be appreciated, the brightness setting may be indicative of a global brightness (e.g., maximum total brightness) of the electronic display 12, and may be based on a user setting, a time of day, an ambient light reading, or other factor.
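As one non-limiting way to picture the operation of the gray-to-voltage domain conversion sub-block 82, the sketch below implements a G2V mapping 86 as a tap-point lookup table with linear interpolation and a separate table per brightness setting; the tap points and brightness labels are assumptions chosen for illustration, not calibration data from the disclosure.

```python
# Sketch of a panel-specific G2V mapping built from calibration tap points,
# with linear interpolation between taps and per-brightness-setting tables.
import numpy as np

# {brightness_setting: (gray tap points, voltage tap points)} -- invented values.
G2V_TAPS = {
    "low":  (np.array([0, 64, 128, 192, 255]), np.array([1.2, 1.9, 2.6, 3.2, 3.8])),
    "high": (np.array([0, 64, 128, 192, 255]), np.array([1.0, 2.0, 3.0, 4.0, 5.0])),
}

def g2v(gray, brightness="high"):
    """Convert gray-level image data to voltage data via the tap-point LUT."""
    taps_gray, taps_v = G2V_TAPS[brightness]
    return np.interp(gray, taps_gray, taps_v)

input_image = np.array([[0, 12, 255], [128, 200, 64]], dtype=float)
voltage_data = g2v(input_image, brightness="high")
print(voltage_data)
```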
Furthermore, the SPUC block 52 may include a voltage compensation sub-block 88 that applies a voltage compensation map 90 to generate compensated voltage data 92. In general, the voltage compensation map 90 may include a per-pixel map of compensations (e.g., voltage adjustments) to be applied to voltage data 84 of each display pixel 60. For example, each display pixel 60 or group of display pixels 60 may have an independent voltage compensation associated therewith. Furthermore, in some embodiments, the voltage compensation map 90 may depend on the brightness setting of the electronic display 12. Moreover, as variations in manufacturing may cause differences between different display panels 40, the voltage compensations (e.g., adjustments) of the voltage compensation map 90 may be panel specific (e.g., specific to the individual display panel, a manufactured batch of display panels, a model of the display panel, etc.).
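Continuing the same illustrative sketch, applying the voltage compensation map 90 amounts to a per-pixel addition in the voltage domain; the compensation values below are hypothetical and would, in practice, come from panel calibration (and could be selected per brightness setting).

```python
# Sketch: apply a per-pixel voltage compensation map to the voltage data.
import numpy as np

voltage_data = np.array([[1.00, 1.19, 5.00],
                         [3.00, 4.14, 2.00]])          # from the G2V conversion
compensation_map = np.array([[-0.15, +0.05, +0.12],
                             [ 0.00, -0.08, +0.03]])   # per-pixel delta-V (volts)

compensated_voltage_data = voltage_data + compensation_map
print(compensated_voltage_data)
```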
Additionally, the SPUC block 52 may include a voltage-to-gray domain conversion sub-block 94 to convert the compensated voltage data 92 into the compensated image data 80 (e.g., in the gray level domain). Traditionally, the compensated voltage data 92 may be converted back to the gray level domain via a V2G (voltage-to-gray) mapping that is the inverse of the G2V mapping 86. However, as presently recognized, in some scenarios, the voltage compensations (e.g., of the voltage compensation map 90) may adjust one or more values of the compensated voltage data 92 to a level that would be clipped by the minimum or maximum gray level if converted using such a V2G mapping. For example, as illustrated in the graph 96 of
To achieve the target luminance 108 for the first display pixel 102, a voltage compensation 112 (ΔV) may be applied (e.g., according to the voltage compensation map 90) to the voltage data 84 to reduce the applied voltage to a reduced voltage level 114. However, in some scenarios, the voltage compensation 112 may reduce the voltage data level 100 past a lower threshold voltage data level 116 and into a clipped region 118. The lower threshold voltage data level 116 may correspond to the lowest gray level (e.g., in the gray level domain) and, thus, the compensated voltage data 92 may be effectively clipped at the lower threshold voltage data level 116 when mapped back to the gray level domain (e.g., via the V2G mapping). Furthermore, the difference between the target luminance 108 and a clipped luminance 120 (e.g., corresponding to the lower threshold voltage data level 116) may result in a perceivable luminance variation, such as a pixel luminance (e.g., clipped luminance 120) that is higher than desired.
In a similar manner, the profile of a second display pixel 122 may be naturally dimmer than the profile of the average display pixel 104 of the display panel 40, and the voltage compensation 112 may be effectively clipped at an upper threshold voltage data level 124, corresponding to a maximum gray level (e.g., when mapped back to the gray level domain, such as via the V2G mapping), as shown in the graph 126 of
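To make the clipping behavior concrete, the following non-limiting numeric sketch uses a linear stand-in for the G2V mapping and its uncalibrated inverse; the voltage limits and compensation magnitudes are invented for illustration.

```python
# Clipping with an uncalibrated V2G mapping (the exact inverse of G2V):
# compensated voltages outside the original voltage range collapse to the
# minimum or maximum gray level.
import numpy as np

V_MIN, V_MAX, GRAY_MAX = 1.0, 5.0, 255   # lower/upper threshold voltage data levels

def g2v(gray):
    return V_MIN + gray / GRAY_MAX * (V_MAX - V_MIN)

def v2g_uncalibrated(volts):
    """Inverse of the linear G2V mapping; values outside [V_MIN, V_MAX] clip."""
    gray = (volts - V_MIN) / (V_MAX - V_MIN) * GRAY_MAX
    return np.clip(np.round(gray), 0, GRAY_MAX)

# A naturally brighter pixel near black receives a negative compensation that
# lands below V_MIN; a naturally dimmer pixel near white lands above V_MAX.
compensated = np.array([g2v(2) - 0.20, g2v(253) + 0.20])
clipped_gray = v2g_uncalibrated(compensated)
realized = g2v(clipped_gray)
print(compensated)   # approx. [0.83, 5.17] -- outside the original range
print(realized)      # [1.0, 5.0] -- clipped, so the target luminance is missed
```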
As such, in some embodiments, the voltage-to-gray domain conversion sub-block 94 may utilize a calibrated V2G mapping 128 that expands the headroom and/or footroom of the gray level domain with respect to the voltage domain. To help illustrate, the graph 130 of
Furthermore, in some embodiments, the calibrated V2G mapping 128 may remap (e.g., relative to the G2V mapping) a portion of the gray levels 134, such as those below a lower tap point threshold 136 and/or above an upper tap point threshold 138, to correspond to lower (e.g., relative to the G2V mapping 86) voltage data levels 100 and/or higher (e.g., relative to the G2V mapping 86) voltage data levels 100, respectively, as shown in the graph 140 of
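One plausible (assumed, not disclosed) way to construct such a calibrated V2G mapping 128 in software is sketched below: tap points between the lower and upper tap point thresholds keep their original G2V voltages, while the outer segments are rescaled to reach expanded voltage limits; all tap values are illustrative.

```python
# Illustrative construction of a calibrated V2G mapping that only remaps the
# gray levels below/above the tap point thresholds.
import numpy as np

GRAY_MAX = 255
taps_gray = np.array([0, 32, 128, 224, 255], dtype=float)   # gray tap points
taps_v    = np.array([1.0, 1.5, 3.0, 4.5, 5.0])             # original G2V voltages

LOWER_TAP, UPPER_TAP = 32, 224          # tap point thresholds (unchanged region)
V_MIN_CAL, V_MAX_CAL = 0.8, 5.2         # expanded footroom/headroom limits

taps_v_cal = taps_v.copy()
v_low_anchor  = taps_v[taps_gray == LOWER_TAP][0]
v_high_anchor = taps_v[taps_gray == UPPER_TAP][0]
low_mask, high_mask = taps_gray < LOWER_TAP, taps_gray > UPPER_TAP
# Rescale [V_MIN, v_low_anchor] onto [V_MIN_CAL, v_low_anchor] and
# [v_high_anchor, V_MAX] onto [v_high_anchor, V_MAX_CAL].
taps_v_cal[low_mask] = v_low_anchor - (v_low_anchor - taps_v[low_mask]) * (
    (v_low_anchor - V_MIN_CAL) / (v_low_anchor - taps_v[0]))
taps_v_cal[high_mask] = v_high_anchor + (taps_v[high_mask] - v_high_anchor) * (
    (V_MAX_CAL - v_high_anchor) / (taps_v[-1] - v_high_anchor))

def v2g_calibrated(volts):
    """Convert compensated voltage data to gray levels with expanded range."""
    return np.clip(np.round(np.interp(volts, taps_v_cal, taps_gray)), 0, GRAY_MAX)

print(taps_v_cal)                                   # [0.8 1.5 3.  4.5 5.2]
print(v2g_calibrated(np.array([0.85, 3.0, 5.15])))  # out-of-old-range values survive
```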
The compensated image data 80 may undergo additional processing, such as via one or more other processing blocks 54 (e.g., a dither block) to generate the display image data 56 or be provided to the pixel drive circuitry 58 as the display image data 56. However, returning to
As should be appreciated, the calibrated V2G mapping 128 may effectively create a calibrated gray level domain different (e.g., with respect to corresponding luminances for the individual gray levels) from the gray level domain of the input image data 78. As such, the calibrated gray level domain may not align with the same gamma or other optical calibration of the G2V mapping 86. However, by utilizing the calibrated G2V mapping 148 (e.g., within the pixel drive circuitry 58) to convert the calibrated gray level domain to the analog voltage levels to be applied to the display pixels 60, deviations from the gamma or other optical calibration may be canceled, at least in part, as the calibrated G2V mapping 148 is the inverse of the calibrated V2G mapping 128.
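As a brief check under the same illustrative linear assumptions used above, composing the calibrated V2G mapping with the calibrated G2V mapping recovers the compensated voltage levels (to within gray-level quantization), which is the cancellation described in the preceding paragraph.

```python
# Round-trip check: calibrated G2V (drive side) is the inverse of the
# calibrated V2G (image processing side), so compensated voltages survive.
import numpy as np

GRAY_MAX = 255
V_MIN_CAL, V_MAX_CAL = 0.8, 5.2

def v2g_calibrated(volts):
    gray = (volts - V_MIN_CAL) / (V_MAX_CAL - V_MIN_CAL) * GRAY_MAX
    return np.clip(np.round(gray), 0, GRAY_MAX)

def g2v_calibrated(gray):
    return V_MIN_CAL + gray / GRAY_MAX * (V_MAX_CAL - V_MIN_CAL)

compensated_volts = np.array([0.85, 2.40, 5.15])   # includes values outside [1.0, 5.0]
round_trip = g2v_calibrated(v2g_calibrated(compensated_volts))
print(np.max(np.abs(round_trip - compensated_volts)))  # small quantization error only
```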
As discussed above, the G2V mapping 86 may be specific to the display panel 40 (e.g., type, model, individual, batch, etc.). Moreover, the G2V mapping 86 may be identified during or after manufacturing and the electronic display 12, or image processing circuitry 28 thereof, may be preprogrammed (e.g., in software or hardware) with the G2V mapping 86. For example, a set of look-up-tables (LUTs) or algorithms may identify G2V mappings 86 for different color component display pixels 60 and/or different brightness settings of the electronic display 12. Similarly, the calibrated V2G mapping 128 and/or calibrated G2V mapping 148 may be implemented in hardware or software and may be preprogrammed (e.g., during manufacturing) or generated (e.g., by the image processing circuitry 28) based on the preprogrammed G2V mapping 86.
Additionally, in some embodiments, the image processing circuitry 28 may determine whether to utilize the calibrated V2G mapping 128 and calibrated G2V mapping 148 based on a variance in the voltage data levels 100 of the G2V mapping 86. To help illustrate,
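The disclosed decision criterion is described with reference to a figure not reproduced here, so the following check is purely an assumed heuristic for illustration: use the calibrated mappings whenever any compensated value would fall outside the voltage range of the original G2V mapping.

```python
# Assumed (not disclosed) heuristic for deciding whether calibrated mappings
# are needed: would any compensated voltage fall outside the original range?
import numpy as np

def needs_calibrated_mappings(voltage_data, compensation_map, v_min, v_max):
    """Return True if any compensated value would be clipped by [v_min, v_max]."""
    compensated = voltage_data + compensation_map
    return bool(np.any(compensated < v_min) or np.any(compensated > v_max))

voltage_data = np.array([1.02, 2.50, 4.97])
compensation = np.array([-0.10, 0.00, 0.08])
print(needs_calibrated_mappings(voltage_data, compensation, v_min=1.0, v_max=5.0))  # True
```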
Furthermore, to help illustrate techniques of the present disclosure,
By utilizing a calibrated V2G mapping 128 as part of a sub-pixel uniformity correction, the headroom and/or footroom in the gray level domain may be increased to accommodate voltage data levels 100 of compensated voltage data 92 that would otherwise be clipped by conversion into the gray level domain. Furthermore, by utilizing a calibrated G2V mapping 148 when selecting analog voltages (e.g., via pixel drive circuitry 58), the extended range (e.g., non-clipped range) of the compensated voltage data 92 may be implemented in the pixel drive circuitry 58 to provide corresponding analog voltages to the display pixels 60. As such, perceivable artifacts such as luminance variations on the electronic display 12 may be reduced or eliminated. Furthermore, although the flowcharts discussed above are shown in a given order, in certain embodiments, process/decision blocks may be reordered, altered, deleted, and/or occur simultaneously. Additionally, the flowcharts are given as illustrative tools and further decision and process blocks may also be added depending on implementation.
The specific embodiments described above have been shown by way of example, and it should be understood that these embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.
It is well understood that the use of personally identifiable information should follow privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy of users. In particular, personally identifiable information data should be managed and handled so as to minimize risks of unintentional or unauthorized access or use, and the nature of authorized use should be clearly indicated to users.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function] . . . ” or “step for [perform]ing [a function] . . . ”, it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).
This application claims priority to U.S. Provisional Application No. 63/599,394, entitled “Sub-Pixel Uniformity Correction Clip Compensation Systems and Methods,” and filed Nov. 15, 2023, which is incorporated by reference herein in its entirety.