This disclosure relates to systems and methods for reducing or eliminating front-of-screen (FoS) artifacts due to operation of an under-display sensor of an electronic display.
Electronic displays may be found in numerous electronic devices, from mobile phones to computers, televisions, automobile dashboards, and augmented reality or virtual reality glasses, to name just a few. Electronic displays with self-emissive display pixels produce their own light. Self-emissive display pixels may include any suitable light-emissive elements, including light-emitting diodes (LEDs) such as organic light-emitting diodes (OLEDs) or micro-light-emitting diodes (μLEDs). By causing different display pixels to emit different amounts of light, individual display pixels of an electronic display may collectively produce images.
If an electronic device includes a sensor disposed under the electronic display (e.g., to reduce the size of a bezel of the electronic display), the operation of the sensor could disrupt the operation of the electronic display, or vice versa. Some sensors (e.g., a proximity sensor), when activated, may emit a signal (e.g., a beam) through the electronic display. Emitting the signal through the electronic display may, depending at least on the timing of activation of the sensor, interfere with the operation of one or more pixels of the electronic display, causing the one or more pixels to display a brightness different from what was intended. Such interference may result in front-of-screen (FoS) artifacts, which may negatively impact user experience. Likewise, the operation of the sensor may be impacted by the programming of the electronic display. For example, the signal emitted through the electronic display may be affected by electromagnetic interference generated by the operation of the electronic display.
A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
An electronic device that has a sensor disposed under the electronic display may reduce or eliminate front-of-screen artifacts by coordinating the timing of the display or sensor to reduce interference or by determining a compensation signal to offset the interaction between these subsystems. For example, an electronic device may determine (e.g., via a processor, a controller, and so on) a timing profile for the under-display sensor based at least partially on spatially weighted statistics (SWS), such that the under-display sensor may activate (e.g., emit a beam) at a time in which interference on the pixels of the display may be reduced or eliminated.
However, in some cases, the under-display sensor may cause some interference (e.g., voltage drop or voltage rise on display pixels) even when a timing profile is applied. In some embodiments, mitigation techniques may be used to reduce or eliminate the impacts of the interference. For example, if, after activating the under-display sensor based on the timing profile, a voltage error is determined for a previous frame or subframe, a compensation signal (e.g., a voltage of opposite polarity and similar or identical magnitude) may be applied to the next frame or subframe, such that the errors average out and the FoS impact of the interference is reduced or eliminated.
Various refinements of the features noted above may exist in relation to various aspects of the present disclosure. Further features may also be incorporated in these various aspects as well. These refinements and additional features may exist individually or in any combination. For instance, various features discussed below in relation to one or more of the illustrated embodiments may be incorporated into any of the above-described aspects of the present disclosure alone or in any combination. The brief summary presented above is intended only to familiarize the reader with certain aspects and contexts of embodiments of the present disclosure without limitation to the claimed subject matter.
Various aspects of this disclosure may be better understood upon reading the following detailed description and upon reference to the drawings described below in which like numerals refer to like parts.
One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements. The terms “including” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, it should be understood that references to “some embodiments,” “embodiments,” “one embodiment,” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Furthermore, the phrase A “based on” B is intended to mean that A is at least partially based on B. Moreover, the term “or” is intended to be inclusive (e.g., logical OR) and not exclusive (e.g., logical XOR). In other words, the phrase A “or” B is intended to mean A, B, or both A and B.
To reduce the size of a bezel around the electronic display, a sensor may be disposed beneath a display panel of the electronic display. In some embodiments, the sensor may be coupled to or disposed near a laser. For example, a proximity sensor may emit a beam at a nearby object (e.g., a user of the electronic device). The proximity sensor may receive the reflection of the emitted beam to determine the distance between the proximity sensor (e.g., and therefore the electronic device) and the nearby object.
However, in some cases, the operation of the sensor (e.g., the emission of the beam) may produce undesirable effects in the electronic device. Continuing with the previous example, if the proximity sensor fires an emission during certain operations of the electronic device (e.g., programming image data into pixels of the electronic device), the emission may induce a voltage on the pixels being programmed. The induced voltage may distort the pixel output, causing the pixel to display a different color or brightness than was intended by the image data, resulting in undesirable effects (e.g., front-of-screen (FoS) artifacts) that may negatively impact user experience.
To mitigate undesirable outcomes such as FoS artifacts, spatially weighted statistics (SWS) may be gathered to determine a more favorable or optimized emission timing profile that causes the sensor to operate during times when the undesirable outcomes may be mitigated or eliminated. However, in some situations, even after analyzing the SWS, operation of the sensor may still cause some, albeit reduced, interference or distortion on the electronic display. For example, using SWS to determine a timing profile for the operation of a proximity sensor may result in a smaller residual induced voltage (e.g., and thus a smaller voltage error) on the pixels of the electronic display, but even the smaller induced voltage may still lead to undesirable FoS artifacts. To address this, compensation techniques may be used to offset the residual voltage error. In some embodiments, a processor or controller of the electronic device may determine the residual voltage error experienced on the electronic display during a first frame and adjust the data on a subsequent second frame to offset the residual voltage error on the first frame. For example, if a processor determines that the operation of an under-display sensor causes a voltage error of 10 millivolts (mV) on a row of pixels during a first frame, the processor may adjust the image data for the subsequent frame such that the voltage on the same row of pixels has a −10 mV offset. Consequently, the voltage errors may average out and the residual FoS artifacts may be further reduced or eliminated.
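By way of a non-limiting illustration, the frame-to-frame compensation described above may be sketched in software as follows. The function name and the specific voltages are hypothetical; a real implementation may operate in display driver circuitry rather than software.

```python
# Illustrative sketch of offsetting a residual voltage error on one frame by
# applying an equal-and-opposite offset on the next frame. All names and
# values here are hypothetical.

def compensate_next_frame(intended_mv, residual_error_mv):
    """Subtract the measured residual error so the two-frame average is unbiased."""
    return [v - residual_error_mv for v in intended_mv]

# A +10 mV error measured on a row of pixels during frame N...
error_mv = 10.0
frame_n_plus_1 = [500.0, 512.0, 498.0]  # intended pixel voltages, in mV
compensated = compensate_next_frame(frame_n_plus_1, error_mv)
# ...becomes a -10 mV offset applied to the same row on frame N+1.
assert compensated == [490.0, 502.0, 488.0]
```

Averaged over the two frames, each pixel's voltage error sums to approximately zero, which is the intended mitigation effect.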
With this in mind, an example of an electronic device 10, which includes an electronic display 12 that may benefit from these features, is shown in
In addition to the electronic display 12, as depicted, the electronic device 10 includes one or more input devices 14, one or more input/output (I/O) ports 16, a processor core complex 18 having one or more processors or processor cores and/or image processing circuitry, memory 20, one or more storage devices 22, a network interface 24, and a power supply 26. The various components described in
The processor core complex 18 is operably coupled with the memory 20 and the storage device 22. As such, the processor core complex 18 may execute instructions stored in the memory 20 and/or the storage device 22 to perform operations, such as generating or processing image data. The processor core complex 18 may include one or more microprocessors, one or more application-specific integrated circuits (ASICs), one or more field-programmable gate arrays (FPGAs), or any combination thereof.
In addition to instructions, the memory 20 and/or the storage device 22 may store data, such as image data. Thus, the memory 20 and/or the storage device 22 may include one or more tangible, non-transitory, computer-readable media that store instructions executable by processing circuitry, such as the processor core complex 18, and/or data to be processed by the processing circuitry. For example, the memory 20 may include random access memory (RAM) and the storage device 22 may include read only memory (ROM), rewritable non-volatile memory, such as flash memory, hard drives, optical discs, and/or the like.
The network interface 24 may enable the electronic device 10 to communicate with a communication network and/or another electronic device 10. For example, the network interface 24 may connect the electronic device 10 to a personal area network (PAN), such as a Bluetooth network, a local area network (LAN), such as an 802.11x Wi-Fi network, and/or a wide area network (WAN), such as a fourth-generation wireless network (4G), LTE, or fifth-generation wireless network (5G), or the like. In other words, the network interface 24 may enable the electronic device 10 to transmit data (e.g., image data) to a communication network and/or receive data from the communication network.
The power supply 26 may provide electrical power to operate the processor core complex 18 and/or other components in the electronic device 10, for example, via one or more power supply rails. Thus, the power supply 26 may include any suitable source of electrical power, such as a rechargeable lithium polymer (Li-poly) battery and/or an alternating current (AC) power converter. A power management integrated circuit (PMIC) may control the provision and generation of electrical power to the various components of the electronic device 10.
The I/O ports 16 may enable the electronic device 10 to interface with another electronic device 10. For example, a portable storage device may be connected to an I/O port 16, thereby enabling the electronic device 10 to communicate data, such as image data, with the portable storage device.
The input devices 14 may enable a user to interact with the electronic device 10. For example, the input devices 14 may include one or more buttons, one or more keyboards, one or more mice, one or more trackpads, and/or the like. Additionally, the input devices 14 may include touch sensing components implemented in the electronic display 12, as described further herein. The touch sensing components may receive user inputs by detecting occurrence and/or position of an object contacting the display surface of the electronic display 12.
In addition to enabling user inputs, the electronic display 12 may provide visual representations of information by displaying one or more images (e.g., image frames or pictures). For example, the electronic display 12 may display a graphical user interface (GUI) of an operating system, an application interface, text, a still image, or video content. To facilitate displaying images, the electronic display 12 may include a display panel with one or more display pixels. The display pixels may represent sub-pixels that each control a luminance of one color component (e.g., red, green, or blue for a red-green-blue (RGB) pixel arrangement).
The electronic display 12 may display an image by controlling the luminance of its display pixels based at least in part on image data associated with corresponding image pixels. In some embodiments, the image data may be generated by an image source, such as the processor core complex 18, a graphics processing unit (GPU), an image sensor, and/or the memory 20 or the storage devices 22. Additionally, in some embodiments, image data may be received from another electronic device 10, for example, via the network interface 24 and/or an I/O port 16.
One example of the electronic device 10, specifically a handheld device 10A, is shown in
The handheld device 10A includes an enclosure 30 (e.g., housing). The enclosure 30 may protect interior components from physical damage and/or shield them from electromagnetic interference. In the depicted embodiment, the electronic display 12 is displaying a graphical user interface (GUI) 32 having an array of icons 34. By way of example, when an icon 34 is selected either by an input device 14 or a touch sensing component of the electronic display 12, an application program may launch.
Input devices 14 may be provided through the enclosure 30. As described above, the input devices 14 may enable a user to interact with the handheld device 10A. For example, the input devices 14 may enable the user to activate or deactivate the handheld device 10A, navigate a user interface to a home screen, navigate a user interface to a user-configurable application screen, activate a voice-recognition feature, provide volume control, and/or toggle between vibrate and ring modes. The I/O ports 16 also open through the enclosure 30. The I/O ports 16 may include, for example, a Lightning® or Universal Serial Bus (USB) port.
The electronic device 10 may take the form of a tablet device 10B, as shown in
Turning to
The display pixels 54 may each include one or more self-emissive elements, such as light-emitting diodes (LEDs) (e.g., organic light-emitting diodes (OLEDs) or micro-light-emitting diodes (μLEDs)); however, other pixel types may be used with the systems and methods described herein, including but not limited to liquid-crystal display (LCD) pixels, digital micromirror devices (DMDs), or the like. Different display pixels 54 may emit different colors. For example, some of the display pixels 54 may emit red light, some may emit green light, and some may emit blue light. Thus, the display pixels 54 may be driven to emit light at different brightness levels to cause a user viewing the electronic display 12 to perceive an image formed from different colors of light. The display pixels 54 may also correspond to hue and/or luminance levels of a color to be emitted and/or to alternative color combinations, such as combinations that use red (R), green (G), blue (B), or others.
The scan driver circuitry 76 may provide scan signals (e.g., pixel reset, data enable, on-bias stress, emission (EM)) on scan lines 80 to control the display pixels 54 by row. For example, the scan driver circuitry 76 may cause a row of the display pixels 54 to become enabled to receive a portion of the image data 74 from data lines 82 from the data driver circuitry 78. In this way, an image frame of the compensated image data 74 may be programmed onto the display pixels 54 row by row. Other examples of the electronic display 12 may program the display pixels 54 in groups other than by row. When the scan driver circuitry 76 provides an emission signal to certain pixels 54, those pixels 54 may emit light according to the image data 74 with which those pixels 54 were programmed.
As previously stated, in some scenarios, a sensor may be disposed beneath a display panel of the electronic display 12 (e.g., to reduce a bezel size of the electronic display 12). In some embodiments, an under-display sensor 90 may emit a beam when activated. For example, a proximity sensor may emit a beam at a nearby object (e.g., a user of the electronic device 10). The proximity sensor may receive the reflection of the emitted beam to determine the distance between the proximity sensor (e.g., and therefore the electronic device 10) and the nearby object. As another example, the under-display sensor 90 may include a fingerprint scanner, a thermal sensor, an ambient light sensor, and so on.
However, in some cases, the activation of the under-display sensor 90 (e.g., the emission of the beam) may produce undesirable effects in the electronic device 10. For example, if the under-display sensor 90 fires an emission during certain operations of the electronic device 10 (e.g., programming the image data 74 into the display pixels 54 of the electronic device 10), the emission may cause undesirable effects (e.g., front-of-screen (FoS) artifacts) by inducing a voltage on the display pixels 54 being programmed, causing distortion that may lead the display pixels 54 to output a color or brightness that does not correspond to the image data 74. To mitigate undesirable outcomes such as FoS artifacts, spatially weighted statistics may be gathered to determine a more favorable or optimized emission timing profile that causes the under-display sensor 90 to operate during times when the undesirable outcomes may be mitigated or eliminated.
In process block 802 of the method 800, the processor core complex 18 receives the image data 74. The image data 74 may include image data corresponding to a current frame or a previous frame or subframe of content to be displayed on the electronic display 12. In process block 804, the processor core complex 18 may obtain spatially weighted statistics (SWS). The processor core complex 18 (or other circuitry that may be used in determining SWS) may collect statistics (e.g., histogram statistics) on red, green, and blue (RGB) color values of the image data 74 to determine enhanced or optimized timing for activation of an under-display sensor. The processor core complex 18 may collect statistics (e.g., histogram statistics) on a luma (Y) value, the Y value representing a weighted combination of the RGB color values.
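A non-limiting sketch of collecting such histogram statistics, including a luma (Y) value formed as a weighted combination of the RGB color values, may look as follows. The luma weights and the bin count are illustrative assumptions, not values mandated by this disclosure.

```python
# Illustrative sketch of SWS-style histogram collection. The luma weights
# and the 65-bin layout below are hypothetical example values.

def luma(r, g, b, weights=(0.25, 0.625, 0.125)):
    """Weighted combination of the RGB color values; weights are illustrative."""
    wr, wg, wb = weights
    return wr * r + wg * g + wb * b

def histogram(values, num_bins=65, max_code=4095):
    """Accumulate equally spaced histogram bins over [0, max_code]."""
    width = (max_code + 1) / num_bins
    bins = [0] * num_bins
    for v in values:
        bins[min(int(v / width), num_bins - 1)] += 1
    return bins

pixels = [(0, 0, 0), (4095, 4095, 4095), (1024, 2048, 512)]
y_hist = histogram([luma(r, g, b) for r, g, b in pixels])
assert sum(y_hist) == len(pixels)
assert y_hist[0] == 1 and y_hist[-1] == 1  # darkest and brightest pixels
```

Separate histograms of this form could likewise be accumulated per color channel before any timing decision is made.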
In process block 806, the processor core complex 18 determines a timing profile for the under-display sensor.
Returning to
In process block 810, the processor core complex 18 may apply the pixel compensation determined in the process block 808. Continuing with the previous example, the processor core complex 18 may apply the +10 mV voltage compensation (e.g., to offset a voltage error of −10 mV on the previous subframe) to the image data 74 corresponding to the current subframe. Consequently, the voltage errors of the previous subframe and the current subframe may average out, reducing or eliminating the FoS impact of the voltage error on the previous subframe. While the method 800 is described as being performed by the processor core complex 18 (e.g., software executed by the processor core complex 18), it should be noted that the method 800 may be performed by dedicated physical hardware, physical hardware components implementing software instructions or software modules or components, or any combination thereof.
A region of the electronic display 12 may be determined as an active region for gathering the SWS 1036. Display pixels 54 in the active region may contribute to the collected SWS 1036. The active region may be defined in terms of start coordinates along each dimension and a size relative to the start coordinate along each dimension. An active region may have dimensions ranging from 25×25 to 100×100, and may be entirely included within a given frame. The active region may be selected based on the position of the pixels in the active region in relation to the under-display sensor 90. For example, pixels that are directly above the under-display sensor 90 or pixels on a row directly above the under-display sensor 90 may be most likely to be affected by the under-display sensor 90, and thus may be included in the selected active region.
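As a non-limiting illustration, membership in such an active region (defined by start coordinates and a size along each dimension) may be expressed as follows. The region coordinates below are hypothetical.

```python
# Illustrative sketch of active-region membership for SWS collection.
# The 50x50 region and its start coordinates are hypothetical values.

def in_active_region(x, y, start, size):
    """True when pixel (x, y) lies inside the SWS active region."""
    (sx, sy), (w, h) = start, size
    return sx <= x < sx + w and sy <= y < sy + h

# Hypothetical 50x50 region positioned over the under-display sensor.
region_start, region_size = (100, 0), (50, 50)
assert in_active_region(120, 10, region_start, region_size)
assert not in_active_region(10, 10, region_start, region_size)
```

Only pixels for which this predicate holds would contribute to the collected statistics.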
In some embodiments, the image data 74 may be sent to range mapping circuitry 1002. In other embodiments, the range mapping circuitry 1002 may be bypassed and the image data 74 may be sent directly to selection circuitry 1010 (e.g., a multiplexer) and selection circuitry 1012 (e.g., a multiplexer). Performing range mapping via the range mapping circuitry 1002 may provide higher precisions for certain intensity levels of the image data 74 when the histogram statistics are collected. For example, applying a gamma curve prior to collecting the SWS 1036 may provide a greater number of samples for dark intensities. The range mapping circuitry 1002 may use a separate one-dimensional lookup table (1D LUT) for 1004, 1006, and 1008 for the red, green, and blue color components, respectively.
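One way to picture the effect of such range mapping is a gamma-shaped 1D LUT, which spreads dark input codes over a wider span of output codes so that the subsequent histogram resolves dark intensities more finely. The LUT size, code range, and gamma value below are illustrative assumptions.

```python
# Illustrative gamma-shaped 1D LUT for range mapping prior to histogram
# collection. Size 257, 12-bit codes, and gamma 2.2 are hypothetical values.

def build_gamma_lut(size=257, max_code=4095, gamma=2.2):
    """Gamma-shaped 1D LUT: dark input codes map to a wider output range."""
    return [round(max_code * (i / (size - 1)) ** (1.0 / gamma))
            for i in range(size)]

lut = build_gamma_lut()
assert lut[0] == 0 and lut[-1] == 4095
# The darkest ~10% of input codes occupy well over 30% of the output range,
# giving the histogram more samples (finer resolution) for dark intensities.
assert lut[26] > 0.3 * 4095
```

Three such LUTs, one per color component, would correspond to the separate red, green, and blue tables described above.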
Independent histograms (e.g., R histogram 1030, G histogram 1032, and B histogram 1034) may be generated for the red, green, and blue (RGB) color values. The spacing of the histogram bins may be determined from settings in a histogram bin spacing register and may be adjusted independently for the red, green, and blue color values and the Y value 1018. In one example, the histograms may be equally spaced or non-equally spaced with 65 bins spanning the range [0, 2^12−1]. The RGB color values used for the determination of the R histogram 1030, the G histogram 1032, and/or the B histogram 1034 may be based on the RGB values at the input of the range mapping circuitry 1002 (e.g., the RGB values of the image data 74) or the output (e.g., 1014) of the range mapping circuitry 1002.
As previously stated, a luma (Y) value 1018 may represent a weighted combination of the RGB color values. In Y-derivation circuitry 1016, a Y histogram 1026 may be populated based on the Y value 1018 that is determined as a weighted combination of the RGB color values. The RGB values used for the determination of the Y histogram 1026 may be the RGB values at the input (e.g., the image data 74) or the output (e.g., 1014) of the range mapping circuitry 1002. Similar to the RGB color values, the determined Y value 1018 may go through range mapping (e.g., in the Y-derivation range mapping circuitry 1020) such that certain intensity levels may have high precision when the SWS 1036 are collected.
The Y-derivation range mapping circuitry 1020 may use a 1D LUT (Y LUT 1022) for the Y component. When the Y value 1018 has a range [0, 2^12−1], 257 entries of the Y LUT 1022 cover the range [0, 2^12]. When an input value falls between LUT entries, the output value may be linearly interpolated. The result may then be clamped to the range [0, 2^12−1]. When range mapping is bypassed for Y (e.g., such that the Y value 1018 is sent directly to an input of selection circuitry 1024) and the Y value 1018 has values in the range [0, 2^22−1], the value may be right-shifted by 10 bits prior to collection of the Y histogram 1026.
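A non-limiting sketch of the interpolate-and-clamp behavior described above, assuming a 257-entry LUT covering [0, 2^12] with 16 input codes per LUT interval, is shown below.

```python
# Illustrative sketch of 257-entry LUT interpolation with clamping, plus the
# right-shift used when range mapping is bypassed for a 22-bit Y value.

def map_y(y, lut):
    """Linearly interpolate a 257-entry LUT over [0, 2**12], then clamp."""
    step = (1 << 12) // 256              # 16 input codes per LUT interval
    i = min(y // step, 255)
    frac = y - i * step
    out = lut[i] + (lut[i + 1] - lut[i]) * frac // step
    return max(0, min(out, (1 << 12) - 1))

identity_lut = [i * 16 for i in range(257)]   # last entry is 4096 = 2**12
assert map_y(100, identity_lut) == 100
assert map_y(4095, identity_lut) == 4095      # clamped to [0, 2**12 - 1]

# When range mapping is bypassed and Y spans [0, 2**22 - 1], a right shift
# by 10 bits reduces it to the 12-bit histogram range:
assert ((1 << 22) - 1) >> 10 == (1 << 12) - 1
```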
As the FoS artifact (e.g., a local contrast artifact) induced by the beams 904 may vary spatially, the collected SWS 1036 that aid in determining the timing profile for activating the under-display sensor 90 may be weighted based on a spatial position of the display pixel(s) 54 within the active region of the SWS collection circuitry 1000. This weighting may be achieved by use of a programmed map of weight values stored in spatial weight map circuitry 1038. For example, the spatial weight map may provide a greatest weighting factor for display pixels 54 disposed directly above the under-display sensor 90 and may provide decreasing weighting factors as the spatial position of the display pixels 54 increases in distance from the under-display sensor 90. In some embodiments, the weight values may be interpolated by the interpolation circuitry 1040.
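The distance-based weighting may be pictured with the following non-limiting sketch. The maximum weight and falloff rate are hypothetical values; an actual implementation may instead use a programmed weight map.

```python
# Illustrative distance-based spatial weighting: greatest weight directly
# above the sensor, decaying with distance. All constants are hypothetical.
import math

def spatial_weight(px, py, sensor_x, sensor_y, max_weight=16, falloff=0.05):
    """Weight decays with distance from the sensor position (illustrative)."""
    d = math.hypot(px - sensor_x, py - sensor_y)
    return max(1, round(max_weight / (1.0 + falloff * d)))

# Pixels directly above the sensor get the greatest weight...
assert spatial_weight(200, 10, 200, 10) == 16
# ...and the weight falls off with distance.
assert spatial_weight(200, 200, 200, 10) < 16
```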
The active region 1102 may represent the electronic display 12 or a portion of the electronic display 12 which displays content during a given frame or subframe. The blanking regions 1104 may delineate the end of the active region 1102. The activation signal may be asserted during at least a portion of the section 1106 for a previous frame N. The activation signal may be deasserted prior to the SWS collection regions 1108 to prevent the under-display sensor 90 from emitting the beams 904 while the SWS collection circuitry 1000 gathers SWS 1036 on an incoming frame N+1. For example, the timing profile for the under-display sensor 90 to emit the beams 904 may be based at least partially on frame content anticipated to be displayed in the incoming frame N+1. Anticipating when certain frame content may be displayed on the electronic display 12 may decrease the likelihood of unintentionally emitting the beams 904 during programming of the display pixels 54. In some embodiments, anticipated frame content for a current frame or incoming frame N+1 may be based at least partially on a previous frame N. Accordingly, prior to displaying the content of the frame N+1 on the electronic display 12, SWS 1036 may be collected regarding the incoming frame N+1.
The timing profile may be determined based on the SWS 1036, and may be used by the processor core complex 18 to assert and/or deassert the activation signal to cause the under-display sensor 90 to emit the beams 904 in a way that may reduce the likelihood of causing distortion (e.g., voltage error) on the display pixels 54 in the active region 1102. The activation signal may be asserted, according to the determined timing profile, at a certain time (e.g., at the time 1110 at the earliest) during the section 1106 to cause the under-display sensor to start running on the fast clocks.
Once the activation signal is asserted, it may be deasserted at any point before the time 1112 based on the determined timing profile. As previously stated, deassertion of the activation signal may cause the under-display sensor 90 to emit the beams 904. As it may be desirable to avoid emitting the beams 904 during the SWS collection regions 1108, the deassertion of the activation signal (and thus the firing of the beams 904) may occur, at the latest, at the time 1112 for the frame N. Once the frame content of the frame N leaves the electronic display 12, the SWS collection circuitry 1000 may begin collecting SWS to determine a timing profile for the frame N+1.
If the processor core complex 18 determines that the relevant frame is not within the active region 1102, the processor core complex 18 determines, in the process block 1206, a timing profile based on the relevant frame not being within the active region 1102. If the processor core complex 18 determines that the relevant frame is within the active region 1102, in process block 1212 the processor core complex 18 may (e.g., via the SWS collection circuitry 1000) collect relevant SWS (e.g., 1036). Based on the relevant SWS, the processor core complex 18 determines in query block 1214 whether to activate the under-display sensor 90 immediately. If, in query block 1214, the processor core complex 18 determines that the activation signal may be activated immediately, in process block 1216 the processor core complex 18 configures the activation signal. However, if the processor core complex 18 determines in query block 1214 that the activation signal may be delayed, in process block 1218 the processor core complex 18 may set an activation signal pending time. For example, setting a pending time for the activation signal may delay the deassertion of the activation signal (and thus delay the activation of the under-display sensor 90). In the process block 1206, the timing profile may be determined based on whether the activation signal was configured (e.g., in the process block 1216) or was set to pending (e.g., in the process block 1218).
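The decision flow described above may be sketched, in non-limiting form, as follows. The function and its boolean inputs are hypothetical names standing in for the outcomes of the query blocks.

```python
# Illustrative sketch of the activation decision flow. The inputs stand in
# for the outcomes of the active-region check and the SWS-based query.

def plan_activation(frame_in_active_region, sws_allows_immediate):
    """Return (configure_now, set_pending) mirroring the decision flow above."""
    if not frame_in_active_region:
        return True, False    # timing profile determined without SWS gating
    if sws_allows_immediate:
        return True, False    # configure the activation signal immediately
    return False, True        # set a pending time to delay deassertion

assert plan_activation(True, True) == (True, False)
assert plan_activation(True, False) == (False, True)
```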
As previously discussed, in some scenarios there may remain some amount of residual error (e.g., voltage error) on the display pixels 54 after activating the under-display sensor 90 according to the timing profile. To address this issue, mitigation actions may be performed on the electronic display 12 to account and correct for the residual error.
In some embodiments, the LUT 1308 and the LUT 1310 are interpolated via interpolation circuitry 1312. The results of the interpolation of the LUT 1308 and the LUT 1310 may be combined based on a mix factor 1314. The interpolation circuitry 1312 may perform the interpolation using any appropriate interpolation algorithm (e.g., a hybrid barycentric-bilinear algorithm). The interpolation circuitry 1312 may have multiple modes that may be activated based on the relationship between the current frame pixel value 1302 and the previous frame pixel value 1304. In some embodiments, the interpolation circuitry 1312 may operate in a normal mode. In the normal mode, the interpolation circuitry 1312 may interpolate the LUTs 1306. In a first bypass mode, the interpolation may be bypassed whenever the current frame pixel value 1302 is equal to the previous frame pixel value 1304. In a second bypass mode, the interpolation may be bypassed whenever the current frame pixel value 1302 is less than or equal to the previous frame pixel value 1304. In a third bypass mode, the interpolation may be bypassed whenever the current frame pixel value 1302 is greater than or equal to the previous frame pixel value 1304. In a fourth bypass mode, the interpolation may be bypassed whenever the absolute value of the difference between the current frame pixel value 1302 and the previous frame pixel value 1304 is less than or equal to a programmed threshold.
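The four bypass conditions may be summarized with the following non-limiting sketch; the mode constants are hypothetical names for the register settings that would select among them.

```python
# Illustrative sketch of the interpolation bypass modes. Mode names are
# hypothetical stand-ins for configuration register values.
NORMAL, BYPASS_EQ, BYPASS_LE, BYPASS_GE, BYPASS_DELTA = range(5)

def should_bypass(mode, cur, prev, threshold=0):
    """Mirror the four bypass conditions for skipping LUT interpolation."""
    if mode == BYPASS_EQ:
        return cur == prev
    if mode == BYPASS_LE:
        return cur <= prev
    if mode == BYPASS_GE:
        return cur >= prev
    if mode == BYPASS_DELTA:
        return abs(cur - prev) <= threshold
    return False  # NORMAL mode: always interpolate

assert should_bypass(BYPASS_EQ, 128, 128)
assert should_bypass(BYPASS_DELTA, 130, 128, threshold=4)
assert not should_bypass(NORMAL, 128, 128)
```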
An FoS artifact caused by the residual error induced by operation of the under-display sensor 90 may vary spatially (e.g., based on the location of the under-display sensor 90). Due to the spatial variance, it may be beneficial for the residual error mitigation circuitry 1300 to obtain information regarding a spatial position of the FoS artifact on the electronic display 12 (e.g., by obtaining a position of the affected display pixels 54). The residual error mitigation circuitry 1300 may apply a weight to a compensation value 1318 (e.g., received from selection circuitry 1316 based on the output of the LUTs 1306 and/or the interpolation circuitry 1312) based on a spatial weighting map 1320 (e.g., a programmed map of weight values) to spatially weight the compensation value 1318. For example, a greater weighting factor may be applied to the compensation value 1318 if the affected display pixels 54 are near (e.g., directly above) the under-display sensor 90, while a lesser weighting factor may be applied to the compensation value 1318 if the affected display pixels 54 are farther from the under-display sensor 90. In some embodiments, the spatial weight outputted by the spatial weight map 1320 may be interpolated via interpolation circuitry 1322. The residual error mitigation circuitry 1300 may output a current frame output pixel value 1324 based on the compensation value 1318. For example, the residual error mitigation circuitry 1300 may apply a positive compensation value 1318 if the previous frame pixel value 1304 had a negative voltage error (e.g., resulting in a darker pixel output than was intended) or may apply a negative compensation value 1318 if the previous frame pixel value 1304 had a positive voltage error (e.g., resulting in a brighter pixel output than was intended). Accordingly, an FoS artifact may be mitigated over a number of frames by averaging out the voltage errors on the display pixels 54. 
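As a non-limiting illustration of the polarity and weighting behavior just described, a compensation value of opposite sign to the residual error, scaled by the pixel's spatial weight, may be sketched as follows. The weight range is a hypothetical value.

```python
# Illustrative sketch: opposite-polarity compensation scaled by the pixel's
# spatial weight. The 16-level weight range is a hypothetical value.

def weighted_compensation(residual_error_mv, weight, max_weight=16):
    """Opposite-polarity compensation, scaled by the pixel's spatial weight."""
    return -residual_error_mv * (weight / max_weight)

# A positive error (too bright) gets a negative compensation, and vice versa;
# pixels directly above the sensor (full weight) get the full correction.
assert weighted_compensation(10.0, 16) == -10.0
assert weighted_compensation(-4.0, 8) == 2.0
```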
It should be noted that, in addition to the current frame pixel value 1302 and the previous frame pixel value 1304, the compensation value 1318 may be based on a timing profile as discussed above.
A defined active region may be stored in memory. For example, the current frame input pixel value 1302 and/or the current frame output pixel value 1324 within the defined active region may be fed into selection circuitry 1326 and stored in memory (e.g., static RAM (SRAM)) with 10 bits of accuracy. The memory may be memory mapped and may be accessed by the processor core complex 18.
The specific embodiments described above have been shown by way of example, and it should be understood that these embodiments may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function] . . . ” or “step for [perform]ing [a function] . . . ,” it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).
It is well understood that the use of personally identifiable information should follow privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy of users. In particular, personally identifiable information data should be managed and handled so as to minimize risks of unintentional or unauthorized access or use, and the nature of authorized use should be clearly indicated to users.
This application claims priority to U.S. Provisional Application No. 63/403,600, filed Sep. 2, 2022, entitled “Display Pipeline Compensation for a Proximity Sensor Behind Display Panel,” the disclosure of which is incorporated by reference in its entirety for all purposes.