The present disclosure generally relates to electronic displays and, more particularly, to testing and compensating for pixel degradation in an electronic display based on current and voltage values sensed in a reference array.
Flat panel displays, such as light-emitting diode (LED) displays or organic-LED (OLED) displays, are commonly used in a wide variety of electronic devices, including consumer electronics such as televisions, computers, and handheld devices (e.g., cellular telephones, audio and video players, gaming systems, and so forth). Such display panels typically provide a flat display in a relatively thin package that is suitable for use in a variety of electronic goods. In addition, such devices may use less power than comparable display technologies, making them suitable for use in battery-powered devices or in other contexts where it is desirable to minimize power usage.
Electronic displays typically include pixels arranged in a matrix to display an image that may be viewed by a user. Individual pixels of an LED display may generate light as current is applied to each pixel. Current may be applied to each pixel by programming a voltage to the pixel that is converted by circuitry of the pixel into the current. The circuitry of the pixel that converts the voltage into the current may include, for example, thin film transistors (TFTs). However, certain operating conditions, such as aging or temperature, may affect the amount of current applied to a pixel in relation to the programmed voltage.
Display panel sensing allows for operational properties of pixels of an electronic display to be identified to improve the performance of the electronic display. For example, variations in temperature and periphery circuit (e.g., an LDO) aging (among other things) across the electronic display cause pixels on the display to behave differently. Indeed, the same image data programmed on different pixels of the display could appear to be different due to the variations in temperature and periphery circuit aging. For example, a pixel emits an amount of light, gamma, or grey level based at least in part on an amount of current supplied to a diode (e.g., an LED) of the pixel. For voltage-driven pixels, a target voltage may be applied to the pixel to cause a target current to be applied to the diode (e.g., as expressed by a current-voltage relationship or curve) to emit a target gamma value. Variations may affect a pixel by, for example, changing the resulting current that is applied to the diode when applying the target voltage. Without appropriate compensation, these variations could produce undesirable visual artifacts.
Accordingly, the techniques and systems described below may be used to test and compensate for the functionality of various components of the display. A compensation manager may compensate for degradation of one or more pixels in the display based on current and voltage values sensed from a reference array that is used for the testing. The compensation manager may determine a current through circuitry of each pixel of the display to confirm operation of each pixel and corresponding components.
Various refinements of the features noted above may exist in relation to various aspects of the present disclosure. Further features may also be incorporated in these various aspects as well. These refinements and additional features may exist individually or in any combination. For instance, various features discussed below in relation to one or more of the illustrated embodiments may be incorporated into any of the above-described aspects of the present disclosure alone or in any combination. The brief summary presented above is intended only to familiarize the reader with certain aspects and contexts of embodiments of the present disclosure without limitation to the claimed subject matter.
Various aspects of this disclosure may be better understood upon reading the following detailed description and upon reference to the drawings described below.
One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, it should be understood that references to “one embodiment” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Furthermore, the phrase A “based on” B is intended to mean that A is at least partially based on B. Moreover, the term “or” is intended to be inclusive (e.g., logical OR) and not exclusive (e.g., logical XOR). In other words, the phrase A “or” B is intended to mean A, B, or both A and B.
Electronic displays are ubiquitous in modern electronic devices. As electronic displays gain ever-higher resolutions and dynamic range capabilities, image quality has increasingly grown in value. In general, electronic displays contain numerous picture elements, or “pixels,” that are programmed with image data. Each pixel emits a particular amount of light based at least in part on the image data. By programming different pixels with different image data, graphical content including images, videos, and text can be displayed.
Display panel sensing allows for operational properties of pixels of an electronic display to be identified to improve the performance of the electronic display. For example, variations in temperature and periphery circuit (e.g., an LDO) aging (among other things) across the electronic display cause pixels on the display to behave differently from calibration. Indeed, the same calibrated image data could appear to be different due to the variations in temperature and periphery circuit aging. For example, a pixel emits an amount of light, gamma, or grey level based at least in part on an amount of current supplied to a diode (e.g., an LED) of the pixel. For voltage-driven pixels, a target voltage may be applied to the pixel to cause a target current to be applied to the diode (e.g., as expressed by a current-voltage relationship or curve) to emit a target gamma value. Variations may affect a pixel by, for example, changing the resulting current that is applied to the diode when applying the target voltage. Without appropriate compensation, these variations could produce undesirable visual artifacts.
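By way of a non-limiting illustration of the current-voltage relationship described above, the following Python sketch inverts a set of hypothetical sensed IV tap points to estimate the drive voltage expected to yield a target diode current. The tap values and function names are assumptions made solely for illustration and are not part of the disclosed implementation.

```python
import numpy as np

# Hypothetical IV tap points for one pixel: drive voltage (V) vs. diode current (uA).
sensed_voltages = np.array([2.0, 2.5, 3.0, 3.5, 4.0])
sensed_currents = np.array([0.1, 0.4, 1.2, 2.8, 5.5])

def drive_voltage_for_target_current(target_current_ua):
    """Invert the sensed IV relationship to estimate the voltage expected to
    produce the target diode current (linear interpolation between taps)."""
    # np.interp expects increasing x values; current rises monotonically with voltage here.
    return float(np.interp(target_current_ua, sensed_currents, sensed_voltages))

print(drive_voltage_for_target_current(1.0))  # roughly 2.9 V for this made-up curve
```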
Accordingly, the techniques and systems described below may be used to test and compensate for functionality of various components of the display. Testing circuitry is coupled to an active array and a reference array of the display. The testing circuitry may determine current and/or voltage for driving pixels of the active array based at least in part on current and/or voltage data sensed from the reference array. The testing circuitry may determine a current through circuitry of each pixel of the reference array and compensate for variations of the display, such as temperature and periphery circuit aging.
With this in mind, a block diagram of an electronic device 10 is shown in
The electronic device 10 shown in
The processor core complex 12 may carry out a variety of operations of the electronic device 10, such as causing the electronic display 18 to perform display panel sensing and generate a current-voltage (IV) curve that may be used to adjust image data to be displayed on the electronic display 18. The processor core complex 12 may include any suitable data processing circuitry to perform these operations, such as one or more microprocessors, one or more application specific processors (ASICs), or one or more programmable logic devices (PLDs). In some cases, the processor core complex 12 may execute programs or instructions (e.g., an operating system or application program) stored on a suitable article of manufacture, such as the local memory 14 and/or the main memory storage device 16. In addition to instructions for the processor core complex 12, the local memory 14 and/or the main memory storage device 16 may also store data to be processed by the processor core complex 12. By way of example, the local memory 14 may include random access memory (RAM) and the main memory storage device 16 may include read only memory (ROM), rewritable non-volatile memory such as flash memory, hard drives, optical discs, or the like.
The electronic display 18 may display image frames, such as a graphical user interface (GUI) for an operating system or an application interface, still images, or video content. The processor core complex 12 may supply at least some of the image frames. The electronic display 18 may be a self-emissive display, such as an organic light-emitting diode (OLED) display, a micro-LED display, a micro-OLED type display, or a liquid crystal display (LCD) illuminated by a backlight. In some embodiments, the electronic display 18 may include a touch screen, which may allow users to interact with a user interface of the electronic device 10. The electronic display 18 may employ display panel sensing to identify operational variations of the electronic display 18. This may allow the processor core complex 12 to adjust image data that is sent to the electronic display 18 to compensate for these variations, thereby improving the quality of the image frames appearing on the electronic display 18.
The input structures 22 of the electronic device 10 may enable a user to interact with the electronic device 10 (e.g., pressing a button to increase or decrease a volume level). The I/O interface 24 may enable the electronic device 10 to interface with various other electronic devices, as may the network interface 26. The network interface 26 may include, for example, interfaces for a personal area network (PAN), such as a Bluetooth network, for a local area network (LAN) or wireless local area network (WLAN), such as an 802.11x Wi-Fi network, and/or for a wide area network (WAN), such as a cellular network. The network interface 26 may also include interfaces for, for example, broadband fixed wireless access networks (WiMAX), mobile broadband wireless networks (mobile WiMAX), asynchronous digital subscriber lines (e.g., ADSL, VDSL), digital video broadcasting-terrestrial (DVB-T) and its extension DVB Handheld (DVB-H), ultra wideband (UWB), alternating current (AC) power lines, and so forth. The power source 29 may include any suitable source of power, such as a rechargeable lithium polymer (Li-poly) battery and/or an alternating current (AC) power converter.
In certain embodiments, the electronic device 10 may take the form of a computer, a portable electronic device, a wearable electronic device, or other type of electronic device. Such computers may include computers that are generally portable (such as laptop, notebook, and tablet computers) as well as computers that are generally used in one place (such as conventional desktop computers, workstations and/or servers). In certain embodiments, the electronic device 10 in the form of a computer may be a model of a MacBook®, MacBook® Pro, MacBook Air®, iMac®, Mac® mini, or Mac Pro® available from Apple Inc. of Cupertino, Calif. By way of example, the electronic device 10, taking the form of a notebook computer 10A, is illustrated in
User input structures 22, in combination with the electronic display 18, may allow a user to control the handheld device 10B. For example, the input structures 22 may activate or deactivate the handheld device 10B, navigate a user interface to a home screen or a user-configurable application screen, and/or activate a voice-recognition feature of the handheld device 10B. Other input structures 22 may provide volume control, or may toggle between vibrate and ring modes. The input structures 22 may also include a microphone that may obtain a user's voice for various voice-related features and a speaker that may enable audio playback and/or certain phone capabilities. The input structures 22 may also include a headphone input that may provide a connection to external speakers and/or headphones.
Turning to
Similarly,
The display panel 61 may also include a sensing analog front end (AFE) 66 to perform analog sensing of the response of the pixels 63, 65 to input data. In some embodiments, the AFE 66 may be used for sensing in both the active array 62 and the reference array 64. In alternative or additional embodiments, there may be at least a first AFE used for sensing in the active array 62 and at least a second AFE used for sensing in the reference array 64.
The display panel 61 may also include a digital-to-analog converter (DAC) 92. The DAC 92 may be a gamma DAC and provide gamma correction to the pixels 63, 65 of the active array and/or reference array based on the IV curve, as discussed below. As illustrated, the DAC 92 is coupled to and shared by the reference array 64 and the active array 62. In additional or alternative embodiments, at least a first DAC may be coupled to the active array 62 and at least a second DAC may be coupled to the reference array 64.
As illustrated, the compensation circuitry 104 includes an optical calibration 98, a reference array manager 96, a compensation manager 102, and a clock 94. The optical calibration 98 may provide a target current value to be applied to a diode of a pixel 63, 65 to emit a target gamma value. In some embodiments, the target current value may be based on a current and/or voltage sensed at the respective pixel 65 in the reference array 64.
The reference array manager 96 may include a timing controller (TCON) and may share some functionality of the processor core complex 12 with respect to the reference array 64. For example, the reference array manager 96 may cause the reference array 64 to perform a sensing operation and generate a current-voltage (IV) curve that may be used to adjust image data to be displayed on the electronic display 18 via the active array 62. In some embodiments, the reference array manager 96 may control timing of the sensing operation of the reference array 64. For example, the reference array manager 96 may determine an interval at which the sensing operation is performed on the pixels 65 of the reference array 64.
The compensation manager 102 may receive the IV curve (e.g., IV data) from the reference array manager 96 and determine an amount of compensation to adjust the image data. In some embodiments, the compensation manager 102 may generate an IV spline coefficient for the reference array 64. In some embodiments, the compensation manager may be implemented via hardware elements (including circuitry), software elements (including machine-executable instructions stored on a tangible, non-transitory medium, such as the local memory 14 or the main memory storage device 16 discussed with respect to
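One way to realize an IV spline coefficient of the kind mentioned above is to fit a cubic spline through sensed IV tap points; the following sketch does so with SciPy using hypothetical tap values, purely as an illustration and not as the disclosed implementation.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical tap points sensed from the reference array: voltage (V) -> current (uA).
tap_voltages = np.array([2.0, 2.5, 3.0, 3.5, 4.0])
tap_currents = np.array([0.1, 0.4, 1.2, 2.8, 5.5])

# Fit a cubic spline through the IV taps; its piecewise-polynomial
# coefficients stand in for the "IV spline coefficient" described above.
iv_spline = CubicSpline(tap_voltages, tap_currents)
print(iv_spline.c.shape)  # (4, 4): four coefficients for each of the four intervals
print(iv_spline(3.25))    # estimated current between two taps
```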
In some embodiments, the compensation manager 102 may send signals across gate lines of the display panel 61 to cause a row of pixels 63 of the active array 62 to become activated and programmable, at which point the compensation manager 102 may transmit the image data (e.g., input data to be displayed by the display panel 61) across data lines (e.g., via the gate drivers 90) to program the pixels 63 to display a particular grey level (e.g., individual pixel brightness). By supplying the image data to different pixels 63 of different colors, full-color images may be programmed into the pixels 63 of the active array 62 of the display panel 61.
The compensation manager 102 may also send signals across gate lines to cause a row of pixels 65 of the reference array 64 to become activated and programmable. For example, the compensation manager 102 may send signals to the pixels 65 of the reference array 64 via the gate drivers 90. The reference array 64 may not be visible to a user of the electronic device 10. For example, the reference array 64 may be covered by an opaque structure or material (e.g., black material) that blocks the reference array 64 from view. In some embodiments, the reference array 64 may wrap around an edge or back side of the electronic device 10 such that it is hidden from view.
In some embodiments, the compensation manager 102 may send sense control signals 68 to cause the display 18 to perform display panel sensing. In response, the display 18 may send sense feedback representing digital information relating to the operational variations of the display 18. The sense feedback may be input to the compensation manager 102 and take any suitable form. An output of the compensation manager 102 may take any suitable form and may be converted into a compensation value that, when applied to the image data, appropriately compensates for operational changes of the display 18 (e.g., resulting in global changes to the display 18). This may result in greater fidelity of the image data, reducing or eliminating visual artifacts that would otherwise occur due to the operational variations of the display 18.
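By way of illustration only, the following sketch shows one simple way a compensation value could be applied to image data before it is programmed into the panel. The multiplicative gain model and the names used are assumptions, not the disclosed compensation scheme.

```python
import numpy as np

def apply_compensation(image_data, compensation_gain):
    """Apply a per-pixel compensation value (modeled here as a multiplicative
    gain derived from sense feedback) to 8-bit image data before programming."""
    compensated = image_data.astype(np.float32) * compensation_gain
    return np.clip(np.rint(compensated), 0, 255).astype(np.uint8)

# Example: brighten pixels whose sensed current has drooped by about 5%.
frame = np.full((4, 4), 128, dtype=np.uint8)
gain = np.full((4, 4), 1.05, dtype=np.float32)
print(apply_compensation(frame, gain))  # values of about 134
```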
After the electronic device is turned on at operation 120, the burst sensing operation 122 may be initialized to consecutively sense a current and/or voltage at multiple pixels 65 of the reference array 64. In some embodiments, the current-voltage data from the burst sensing operation 122 may be used to generate one or more taps to be used for determining compensation values for the pixels 63 of the active array 62. As an example, between 5 and 20 taps may be used. That is, the burst sensing operation 122 may be used to determine tap points on the IV curve to be compared to the target current value from the optical calibration 98.
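A minimal sketch of a burst sensing pass is shown below, assuming a hypothetical sense_pixel(v) measurement callable in place of the AFE 66; the tap count and voltage range are illustrative only.

```python
import numpy as np

NUM_TAPS = 12  # within the 5-20 tap range mentioned above; the value is illustrative

def burst_sense(sense_pixel, tap_voltages):
    """Consecutively sense the reference-array current at each tap voltage
    after power-on to establish baseline points on the IV curve."""
    return np.array([sense_pixel(v) for v in tap_voltages])

def fake_sense(v):
    # Stand-in for real hardware: a diode-like exponential response.
    return 0.02 * np.exp(1.4 * v)

tap_voltages = np.linspace(2.0, 4.0, NUM_TAPS)
baseline_currents = burst_sense(fake_sense, tap_voltages)
print(list(zip(np.round(tap_voltages, 2), np.round(baseline_currents, 2))))
```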
After the burst sensing operation 122, data 124 may be sent to the active array 62 to program the pixels 63 thereof. The normal sensing operations 116, 118 may be performed on the pixels 65 of the reference array 64 during a vertical blank 126 of each image frame programmed at the pixels 63 of the active array 62. In some embodiments, the normal sensing operations 116, 118 may be time triggered to occur at a given time interval (e.g., two seconds). In this way, the normal sensing operations 116, 118 may continuously track current and/or temperature drift of the pixels 63, 65.
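The sketch below illustrates one way such time-triggered scheduling could work, checking at each vertical blank whether an assumed two-second interval has elapsed; the class and method names are hypothetical.

```python
import time

class NormalSenseScheduler:
    """Decide, at each vertical blank, whether a normal sensing pass is due."""

    def __init__(self, interval_s=2.0):  # example interval from the description above
        self.interval_s = interval_s
        self.last_sense = time.monotonic()

    def on_vertical_blank(self):
        now = time.monotonic()
        if now - self.last_sense >= self.interval_s:
            self.last_sense = now
            return True   # run a normal sensing operation during this blank period
        return False      # skip; current/temperature drift is tracked on a later frame
```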
The timing diagram 110 illustrates how the burst sensing operation 114 occurs at the start-up of the electronic device 10 and is used to determine baseline data for the IV curve. Then, the normal sensing operations 116, 118 are executed to obtain current sensing data that is used to update the IV curve. The various sensing operations 114, 116, 118 are used to obtain sensing data to generate and update the IV curve to improve an accuracy of the IV curve over time and track the operation of the electronic display 18.
The compensation manager 102 multiplies the delta by a loop gain 138 and clamps 140 the output of the loop gain 138 to reduce any sudden increase in the voltage applied to the corresponding pixel 63, which could cause a sudden change in luminance during operation of the display panel 61. The loop gain 138 and the clamping threshold 140 may depend on the slope of the IV curve. Thus, the value of the loop gain 138 and the clamping threshold 140 may be determined to match or closely resemble the slope of the IV curve. This may lead to a smooth settling of the voltage applied to the corresponding pixel 63 in the active array 62. In some embodiments, the clamping 140 may be performed using a clamping threshold that is grey-level dependent. That is, for a low grey level of the corresponding pixel, a high clamping threshold 140 may be used. The output of the clamping threshold 140 is one or more tapcode adjustments 146 for the gamma tap points 145. An accumulator 142 adds the tapcode adjustments 146 and the gamma tap points 145. An output of the accumulator 142 is one or more gamma tap points 141 for the reference array 64. In some embodiments, spline interpolation 144 may be applied to the burst sensing data 130 for smoothing of the IV curve. In that case, the output of the spline interpolation 144 includes the initial gamma tap points 145 for the reference array 64.
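A compact sketch of the loop described above (delta, loop gain, grey-level-dependent clamp, accumulation into the gamma tap points) follows; the gain, clamp values, and array shapes are assumptions made for illustration.

```python
import numpy as np

def update_reference_tapcodes(tapcodes, sensed_currents, target_currents,
                              loop_gain=0.25, clamp_by_grey=None):
    """One iteration of the tapcode feedback loop: error -> loop gain ->
    grey-level-dependent clamp -> accumulate into the gamma tap points."""
    if clamp_by_grey is None:
        # Hypothetical clamp profile: larger steps allowed at low grey levels,
        # smaller steps at high grey levels, per the description above.
        clamp_by_grey = np.linspace(8.0, 2.0, len(tapcodes))
    delta = target_currents - sensed_currents                         # error per tap
    step = np.clip(loop_gain * delta, -clamp_by_grey, clamp_by_grey)  # gain + clamp
    return tapcodes + step                                            # accumulator output
```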
The compensation manager 102 compensates for variations in temperature and periphery circuit aging and ensures that a proper voltage is applied to pixels 63 in the active array 62 based on the input image data. In this way, the compensation manager 102 may improve the performance of the display 18 by reducing visible anomalies.
To generate gamma tap points 143 for the active array 62, cubic spline interpolation 147 may be applied to a combination of calibrated pixel currents 148, the gamma tap points 141 of the reference array 64, and the target currents 132. The calibrated pixel currents 148 may be current values that when applied to various pixels of the display 18 cause a target luminance of the pixels at selected grey levels. The gamma tap points 143 for the active array 62 are used to drive pixel drivers of pixels 63 in the active array 62. That is, the gamma tap points 143 for the active array 62 and the gamma tap points 141 of the reference array 64 are provided to the DAC 92 of the display panel 61, as discussed with respect to
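As an illustrative sketch only, and not the disclosed combination step, cubic spline interpolation between target currents and reference-array tap codes can be used to look up drive codes for the calibrated pixel currents; the names and numbers below are assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def active_array_tapcodes(ref_tapcodes, target_currents, calibrated_currents):
    """Fit a cubic spline through (target current -> reference tapcode) pairs,
    then evaluate it at the calibrated pixel currents to obtain drive codes
    for the active array."""
    order = np.argsort(target_currents)  # CubicSpline requires increasing x
    code_of_current = CubicSpline(target_currents[order], ref_tapcodes[order])
    return code_of_current(calibrated_currents)

# Example with made-up numbers.
refs = np.array([40.0, 80.0, 120.0, 160.0, 200.0])  # reference tapcodes
targets = np.array([0.1, 0.4, 1.2, 2.8, 5.5])       # target currents (uA)
calibrated = np.array([0.2, 1.0, 3.0])              # calibrated pixel currents (uA)
print(active_array_tapcodes(refs, targets, calibrated))
```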
As shown, the IV curve 150 includes various curves corresponding to different temperatures of the reference array 64 and/or the pixels 65 of the reference array 64. For example, a first IV curve 152 may correspond to a temperature of about 50 degrees Celsius, a second IV curve 154 may correspond to a temperature of about 35 degrees Celsius, a third IV curve 156 may correspond to a temperature of about 20 degrees Celsius, and a fourth IV curve 158 may correspond to a temperature of about 5 degrees Celsius. Thus, the IV curve 150 may be based at least in part on a temperature of the reference array 64 and/or the pixels 65 of the reference array 64.
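If IV taps have been characterized at several temperatures, one simple way to account for the temperature dependence described above is to blend between the two nearest characterized curves; the numbers in the sketch below are made up for illustration.

```python
import numpy as np

# Hypothetical IV tap currents (uA), one row per characterized temperature (deg C).
temps = np.array([5.0, 20.0, 35.0, 50.0])
currents_by_temp = np.array([
    [0.08, 0.9, 2.2, 4.6],   # 5 C
    [0.10, 1.1, 2.7, 5.4],   # 20 C
    [0.12, 1.3, 3.1, 6.1],   # 35 C
    [0.15, 1.6, 3.6, 7.0],   # 50 C
])

def iv_taps_at_temperature(t_celsius):
    """Linearly interpolate each tap column between the two nearest
    characterized temperatures."""
    return np.array([np.interp(t_celsius, temps, currents_by_temp[:, k])
                     for k in range(currents_by_temp.shape[1])])

print(iv_taps_at_temperature(27.5))  # halfway between the 20 C and 35 C curves
```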
The flowchart 161 begins at operation 162 where a target current 132 is obtained by the compensation manager 102. The optical calibration unit 98 may provide the target current 132 to the compensation manager 102. The target current 132 may be based on a target luminance level (e.g., brightness level) for a corresponding pixel at a particular grey level. That is, the target current 132 when applied to pixels of the display 18 satisfies a target luminance of the display panel 61 at a particular grey level. The target current 132 may be a target convergence current of the reference array 64. That is, the target current 132 is a current to which the reference array 64 is expected to converge. At operation 164, a pixel current is sensed and obtained by the compensation manager 102. The pixel current may be received from the analog front end (AFE) 66 of the display 18. The pixel current may be measured at a corresponding pixel 65 of the reference array 64 and converted to a digitized current by the AFE 66.
At operation 166, the compensation manager 102 determines a difference between the target current 132 and the pixel current. At operation 168, the compensation manager 102 computes tapcode adjustments 146 (e.g., adjustments to the gamma tap points 141, 143) by applying a loop gain 138 and a clamping threshold 140 to the difference 136 determined at operation 166. The loop gain 138 and the clamping threshold 140 may be computed to match a slope of a current-voltage curve generated from burst sensing data 130, as discussed above with respect to
At operation 170, the compensation manager 102 generates reference array tapcodes (e.g., gamma tap points) based on the tapcode adjustments and initial tapcodes generated from the burst sensing operation. To do so, the compensation manager 102 may aggregate (e.g., combine) the tapcode adjustments 146 and the initial tapcodes 145 to adjust the initial tapcodes 145 based on the tapcode adjustments 146. As discussed with respect to
At operation 172, the compensation manager 102 computes active array tapcodes 143 (e.g., gamma tap points 143 of the active array 62) based on the reference array tapcodes 141 determined at operation 170, the target current 132, and a calibration current 148. The calibration current 148 may be a current that when applied to various pixels of the display 18 causes a target luminance of the pixels at selected grey levels. The active array tapcodes 143 may be used to drive the pixels 63 of the active array 62 via one or more drivers (e.g., source drivers and/or gate drivers) of the display 18. That is, the active array tapcodes 143 may be provided to the DAC 92 of the display panel 61 to drive the pixels 63 of the active array 62. During operation of the display 18, the compensation manager 102 may periodically obtain a new target current 132 and pixel current, and update the current-voltage curve and tapcodes 143 used to drive the pixels 63 of the active array 62 using the operations of the flowchart 161. That is, the operations of the flowchart 161 may be used to update and track the current-voltage curve over time and compensate for variations in temperature and periphery circuit aging, among other things.
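Tying the flowchart together, the sketch below runs one illustrative compensation pass; the operation numbers in the comments refer to the description above, and all gains, clamps, and codes are assumptions rather than the disclosed implementation.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def compensation_pass(target_currents, sensed_currents, ref_tapcodes,
                      calibrated_currents, loop_gain=0.25, clamp=4.0):
    """One pass through the flowchart: the caller supplies the target and
    sensed currents (operations 162 and 164); the reference-array tapcodes are
    updated (operations 166-170) and mapped to active-array tapcodes (172)."""
    delta = target_currents - sensed_currents                    # operation 166
    adjustments = np.clip(loop_gain * delta, -clamp, clamp)      # operation 168
    ref_tapcodes = ref_tapcodes + adjustments                    # operation 170
    order = np.argsort(target_currents)
    code_of_current = CubicSpline(target_currents[order], ref_tapcodes[order])
    active_tapcodes = code_of_current(calibrated_currents)       # operation 172
    return ref_tapcodes, active_tapcodes
```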
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function] . . . ” or “step for [perform]ing [a function] . . . ,” it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).
This application claims the benefit of U.S. Provisional Application No. 63/083,676, filed Sep. 25, 2020, and entitled “REFERENCE ARRAY CURRENT SENSING,” which is incorporated herein by reference in its entirety for all purposes.