This relates generally to electronic devices, and, more particularly, to electronic devices with displays.
Electronic devices such as laptop computers, cellular telephones, and other equipment are sometimes provided with displays and light sensors. For example, ambient light sensors may be incorporated into a device to provide the device with information on current lighting conditions. Ambient light readings may be used in controlling the device. If, for example, bright daylight conditions are detected, an electronic device may increase display brightness to compensate. Similarly, display brightness may be decreased in low light conditions to avoid eye strain.
Conventional methods of dimming a display in low light may reduce the perceived quality of colors in images due to reduced sensitivity of cones in the retina.
An electronic device may include a display, an ambient light sensor, and control circuitry. The ambient light sensor may be configured to measure the color and brightness (e.g., illuminance) of ambient light. The control circuitry may adjust the brightness, color, and/or contrast of the display based on the brightness and/or color of ambient light.
The control circuitry may use a mesopic vision model to compensate images that are displayed in low light conditions when the measured ambient light brightness is below a threshold. The mesopic vision model may use a tone mapping process and a color mapping process to compensate for the reduced color sensitivity of the retina in low light conditions. The control circuitry may iteratively adjust weights associated with the tone mapping process and the color mapping process until a perceived (retinal) version of the compensated image in the measured ambient light conditions closely matches a perceived (retinal) version of the original image in reference ambient light conditions. The control circuitry may use a joint optimization process that balances the contrast and color of the compensated image.
An illustrative electronic device of the type that may be provided with one or more light sensors is shown in
As shown in
Input-output circuitry in device 10 such as input-output devices 12 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output devices 12 may include buttons, joysticks, scrolling wheels, touch pads, keypads, keyboards, microphones, speakers, tone generators, vibrators, cameras, light-emitting diodes and other status indicators, data ports, etc. A user can control the operation of device 10 by supplying commands through input-output devices 12 and may receive status information and other output from device 10 using the output resources of input-output devices 12.
Input-output devices 12 may include one or more displays such as display 14. Display 14 may be insensitive to touch or may be a touch screen display that includes a touch sensor for gathering touch input from a user. A touch sensor for display 14 may be based on an array of capacitive touch sensor electrodes, acoustic touch sensor structures, resistive touch components, force-based touch sensor structures, a light-based touch sensor, or other suitable touch sensor arrangements.
Input-output devices 12 may also include sensors 18. Sensors 18 may include one or more ambient light sensors such as ambient light sensor 20 and other sensors (e.g., a capacitive proximity sensor, a light-based proximity sensor, a magnetic sensor, an accelerometer, a force sensor, a touch sensor, a temperature sensor, a pressure sensor, a compass, a microphone or other sound sensor, or other sensors).
Ambient light sensor 20 for device 10 may be a color ambient light sensor having an array of detectors each of which is provided with a color filter. If desired, the detectors in ambient light sensor 20 may be provided with color filters of different respective colors. Information from the detectors may be used to measure the total amount of ambient light that is present in the vicinity of device 10. For example, the ambient light sensor may be used to determine whether device 10 is in a dark or bright environment. Based on this information, control circuitry 16 can adjust display brightness and/or color for display 14 or can take other suitable action. This is, however, merely illustrative. If desired, ambient light sensor 20 may be insensitive to color and may be configured to measure ambient light brightness only. Arrangements in which ambient light sensor 20 is a color ambient light sensor that measures both brightness and color of ambient light are sometimes described herein as an illustrative example.
Ambient light sensors 20 may be used to make ambient light intensity (e.g., brightness, illuminance, and/or luminous flux per unit area) measurements. Ambient light intensity measurements, which may sometimes be referred to as ambient light illuminance measurements, may be used by device 10 to adjust display brightness, contrast, color, and/or other characteristics of the display. Ambient light sensors 20 may also, if desired, be used to make measurements of ambient light color (e.g., color coordinates, correlated color temperature, or other color parameters representing ambient light color). Control circuitry 16 may convert these different types of color information to other formats, if desired (e.g., a set of red, green, and blue sensor output values may be converted into color chromaticity coordinates and/or may be processed to produce an associated correlated color temperature, etc.).
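As a minimal sketch of the conversion just described, red, green, and blue sensor outputs can be mapped to chromaticity coordinates and then to a correlated color temperature. The sRGB-to-XYZ matrix and McCamy's approximation below are standard published formulas, but the assumption that the sensor channels behave like linear sRGB primaries is purely illustrative:

```python
def rgb_to_xy(r, g, b):
    # Linear sRGB -> CIE XYZ (D65 white), standard conversion matrix.
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    total = x + y + z
    # Project XYZ onto the chromaticity plane.
    return (x / total, y / total)

def mccamy_cct(x, y):
    # McCamy's cubic approximation of correlated color temperature (kelvin)
    # from CIE 1931 (x, y) chromaticity coordinates.
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33
```

For equal red, green, and blue readings this yields the D65 chromaticity (x ≈ 0.313, y ≈ 0.329) and a correlated color temperature near 6500 K, which is the expected daylight white point.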
Color information and illuminance information from ambient light sensor 20 can be used to adjust the operation of device 10. For example, the color cast of display 14 (e.g., the white point of display 14) may be adjusted in accordance with the color of ambient lighting conditions. If, for example, a user moves device 10 from a cool lighting environment (e.g., an outdoor blue sky environment) to a warm lighting environment (e.g., an incandescent light environment), the warmth of display 14 may be increased accordingly, so that the user of device 10 does not perceive display 14 as being overly cold. If desired, the ambient light sensor may include an infrared light sensor. In general, any suitable actions may be taken based on color measurements and/or total light intensity measurements (e.g., adjusting display brightness, adjusting display content, changing audio and/or video settings, adjusting sensor measurements from other sensors, adjusting which on-screen options are presented to a user of device 10, adjusting wireless circuitry settings, etc.).
A perspective view of a portion of an illustrative electronic device is shown in
Display 14 may be protected using a display cover layer such as a layer of transparent glass, clear plastic, sapphire, or other clear layer. Openings may be formed in the display cover layer. For example, an opening may be formed in the display cover layer to accommodate a button, a speaker port, or other components. Openings may be formed in housing 22 to form communications ports (e.g., an audio jack port, a digital data port, etc.), to form openings for buttons, etc.
Display 14 may include an array of display pixels formed from liquid crystal display (LCD) components, an array of electrophoretic pixels, an array of plasma pixels, an array of organic light-emitting diode pixels or other light-emitting diodes, an array of electrowetting pixels, or pixels based on other display technologies. The array of pixels of display 14 forms an active area AA. Active area AA is used to display images for a user of device 10. Active area AA may be rectangular or may have other suitable shapes. Inactive border area IA may run along one or more edges of active area AA. Inactive border area IA may contain circuits, signal lines, and other structures that do not emit light for forming images. To hide inactive circuitry and other components in border area IA from view by a user of device 10, the underside of the outermost layer of display 14 (e.g., the display cover layer or other display layer) may be coated with an opaque masking material such as a layer of black ink. Optical components (e.g., a camera, a light-based proximity sensor, an ambient light sensor, status indicator light-emitting diodes, camera flash light-emitting diodes, etc.) may be mounted under inactive border area IA. One or more openings (sometimes referred to as windows) may be formed in the opaque masking layer of IA to accommodate the optical components. For example, a light component window such as an ambient light sensor window may be formed in a peripheral portion of display 14 such as region 24 in inactive border area IA. Ambient light from the exterior of device 10 may be measured by ambient light sensor 20 in device 10 after passing through region 24 and the display cover layer.
If desired, optical components such as ambient light sensor 20 may instead or additionally be mounted in active area AA of display 14. For example, as shown in
A top view of a portion of display 14 is shown in
Display driver circuitry may be used to control the operation of pixels 36. The display driver circuitry may be formed from integrated circuits, thin-film transistor circuits, or other suitable circuitry. Display driver circuitry 30 of
To display the images on display pixels 36, display driver circuitry 30 may supply image data to data lines D while issuing clock signals and other control signals to supporting display driver circuitry such as gate driver circuitry 34 over path 38. If desired, display driver circuitry 30 may also supply clock signals and other control signals to gate driver circuitry 34 on an opposing edge of display 14.
Gate driver circuitry 34 (sometimes referred to as row control circuitry) may be implemented as part of an integrated circuit and/or may be implemented using thin-film transistor circuitry. Horizontal control lines G in display 14 may carry gate line signals such as scan line signals, emission enable control signals, and other horizontal control signals for controlling the display pixels 36 of each row. There may be any suitable number of horizontal control signals per row of pixels 36 (e.g., one or more row control signals, two or more row control signals, three or more row control signals, four or more row control signals, etc.).
The region on display 14 where the display pixels 36 are formed may sometimes be referred to herein as the active area. Electronic device 10 has an external housing with a peripheral edge. The region surrounding the active area and within the peripheral edge of device 10 is the border region. It is generally desirable to minimize the border region of device 10. For example, device 10 may be provided with a full-face display 14 that extends across the entire front face of the device. If desired, display 14 may also wrap around over the edge of the front face so that at least part of the lateral edges or at least part of the back surface of device 10 is used for display purposes.
Control circuitry 16 may gather ambient light sensor data from ambient light sensor 20 to adaptively determine how to adjust display light and display colors based on ambient lighting conditions. If desired, control circuitry 16 may control display 14 using other information such as time information from a clock, calendar, and/or other time source, location information from location detection circuitry (e.g., Global Positioning System receiver circuitry, IEEE 802.11 transceiver circuitry, or other location detection circuitry), user input information from a user input device such as a touchscreen (e.g., touchscreen display 14) or keyboard, etc.
Ambient light sensor 20 may be used to measure the color and/or intensity of ambient light. Control circuitry 16 may adjust the operation of display 14 based on the color and/or intensity of ambient light. In adjusting the output from display 14, control circuitry 16 may take into account the chromatic adaptation function of the human visual system. This may include, for example, adjusting the white point of display 14 based on the color and/or brightness of ambient light measured by ambient light sensor 20. If, for example, a user moves device 10 from a cool lighting environment (e.g., outdoor light having a relatively high correlated color temperature) to a warm lighting environment (e.g., indoor light having a relatively low correlated color temperature), the “warmth” of display 14 may be increased accordingly by adjusting the white point of display 14 to a warmer white (e.g., a white with a lower color temperature), so that the user of device 10 does not perceive display 14 as being overly cold.
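The white point adjustment described above can be sketched as a partial pull of the display white toward the ambient correlated color temperature. The reference white, blend strength, and clamping range used here are illustrative assumptions, not values from this description:

```python
def display_white_point_cct(ambient_cct, reference_cct=6500.0,
                            strength=0.4, min_cct=2700.0, max_cct=7500.0):
    # Pull the display white point partway (strength in [0, 1]) from the
    # reference white toward the ambient correlated color temperature,
    # clamped to a safe range so the display never looks strongly tinted.
    cct = reference_cct + strength * (ambient_cct - reference_cct)
    return min(max_cct, max(min_cct, cct))
```

With these assumed parameters, warm 2700 K ambient light moves the white point to roughly 5000 K, warming the display without matching the ambient light exactly.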
Control circuitry 16 may also adjust the brightness of display 14 based on ambient light conditions. When ambient light sensor 20 detects bright light (e.g., outdoors, in an office, etc.), control circuitry 16 may increase the brightness of display 14. When ambient light sensor 20 detects dim ambient light (e.g., in a bedroom, at night, etc.), control circuitry 16 may decrease the brightness of display 14. Care must be taken, however, to account for the changes that occur in the human visual system under different lighting conditions. The retina operates using two photoreceptor systems: rods and cones. In bright light, a user's photopic vision is activated and the retina primarily uses the three cone types (L, M, and S cones) to see color, while the rods remain mostly saturated. Under moderately dim ambient light conditions, a user's mesopic vision is activated, with both rods and cones being used to see color, but with lower perceived quality. In very dim light, scotopic vision is activated and only rods are active.
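The three regimes just described can be classified directly from the adaptation luminance. The threshold values below are approximations; published sources place the mesopic range roughly between 0.001–0.01 cd/m² at the low end and 3–10 cd/m² at the high end:

```python
def vision_regime(luminance_cd_m2):
    # Classify the adaptation regime of the human visual system from
    # luminance in cd/m^2. Thresholds are approximate, not exact values.
    if luminance_cd_m2 < 0.005:
        return "scotopic"   # rods only
    if luminance_cd_m2 > 5.0:
        return "photopic"   # cones dominate; rods saturated
    return "mesopic"        # both rods and cones contribute
```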
As rods become more active in low light conditions, colors may appear washed out. As a result, merely dimming a display in low light conditions without accounting for the reduced color sensitivity of the cones in the retina will cause displayed images to appear washed out. To account for the reduced color sensitivity in mesopic lighting conditions (e.g., dim ambient light conditions), control circuitry 16 may use a mesopic vision model to compensate images for colorfulness as well as brightness and contrast, since these three properties are interdependent.
By compensating images in low light conditions for the reduced color sensitivity of the human visual system, images on display 14 such as compensated output image 48 may exhibit consistent vivid color even when display 14 is operating at a lower brightness level. In addition to maintaining vivid colors in low light, using mesopic vision model 46 to compensate images in low light may result in higher contrast for better readability, sufficient brightness for eye comfort without excessive power consumption, power savings with reduced headroom for displaying high dynamic range image content, and better high dynamic range experience in low light with better image quality and highlights.
Mesopic vision model 46 may include a photopic vision component and a scotopic vision component. The weights applied to each of the photopic vision component and the scotopic vision component may change depending on the brightness of the ambient light, with a greater weight generally being applied to the photopic component than the scotopic component when ambient light is brighter, and a greater weight generally being applied to the scotopic component than the photopic component when ambient light is dimmer. Mesopic vision model 46 may be based on a two-stage model including the cone and opponent stages and may assume rod intrusion at the opponent stage. Model 46 may include a gradual and/or nonlinear shift in spectral luminous efficiency to account for the spectral sensitivity difference between photopic and scotopic vision and the nonlinearity of rod influence on the luminance channel. Model 46 may assume a decrease in the chromatic component with decreasing illuminance to account for the reduction of saturation at low illuminance levels. Model 46 may also assume that the red-green component and yellow-blue component change with illuminance levels independently of one another, which will help account for hue shifts that occur with decreasing illuminance.
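The brightness-dependent weighting of the photopic and scotopic components can be sketched as a weighted blend whose weight ramps nonlinearly (here, logarithmically) with adaptation luminance. The range endpoints and the shape of the ramp are illustrative assumptions; they stand in for the model's actual nonlinear shift in spectral luminous efficiency:

```python
import math

def photopic_weight(adaptation_cd_m2, lo=0.005, hi=5.0):
    # Logarithmic ramp of the photopic weight across an assumed mesopic
    # range [lo, hi] cd/m^2, clamped to [0, 1] outside that range.
    t = (math.log10(max(adaptation_cd_m2, 1e-9)) - math.log10(lo)) / (
        math.log10(hi) - math.log10(lo))
    return min(1.0, max(0.0, t))

def mesopic_luminance(l_photopic, l_scotopic, m):
    # Blend the photopic and scotopic luminance components with weight m
    # in [0, 1]: m -> 1 in bright light, m -> 0 in very dim light.
    return m * l_photopic + (1.0 - m) * l_scotopic
```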
Control circuitry 16 may use one or more additional models to assign an image quality score to images that are compensated using mesopic vision model 46. For example, as shown in
Models 54 may be used to assess whether mesopic vision model 46 has applied the appropriate tone mapping and color mapping to input image 44. For example, contrast model 50 may be used to produce a contrast quality score indicating how close the contrast of the perceived compensated image 48 in the measured (dim) ambient light conditions is to that of the perceived original image in normal (bright) ambient light conditions. Color discrimination model 52 may be used to produce a color quality score indicating how close the colors (e.g., colorfulness) of the perceived compensated image 48 in the measured ambient light conditions are to that of the perceived original image in normal ambient light conditions. The lower the image quality scores output by models 54, the closer the compensated image 48 in low light is likely to be to the original image in normal ambient light. If the score is too high (e.g., above some threshold, indicating that compensated image 48 in low light conditions is too noticeably different from the original input image 44 in well-lit conditions), control circuitry 16 may adjust the tone mapping and/or the color mapping applied by mesopic vision model 46 until model 54 produces a sufficiently low score. In other words, mesopic vision model 46 may be configured to iteratively adjust the compensation applied to images in low light to minimize a function that jointly optimizes both contrast and colorfulness in compensated image 48.
In each of these curves, low content luminance values are associated with black and low gray levels, and high content luminance values are associated with white and high gray levels. Curve 56 is associated with a display pixel luminance value of DL1 visible to the user for a content luminance value of CL1, curve 58 is associated with a display pixel luminance value of DL2 for content luminance CL1, and curve 60 is associated with a display pixel luminance value DL3 for content luminance CL1. The luminance level DL2 is brighter than luminance level DL1, because curve 58 is associated with a brighter set of output luminance values from pixels 36 than curve 56. Similarly, luminance level DL3 is brighter than luminance level DL2 because curve 60 is associated with a brighter set of output luminance values from pixels 36 than curve 58. White image pixels (e.g., pixels at content luminance level CL2) are all associated with the same display luminance level DL4 (e.g., the brightest output available from pixels 36 in display 14), so the mappings of curves 56, 58, and 60 will all produce a display luminance of DL4 for a content luminance of CL2.
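A family of tone curves with this behavior can be sketched as follows: each curve renders dark content differently while pinning white content to the same maximum display luminance. The power-law shape and parameter values are illustrative assumptions, not the curves of the actual model:

```python
def tone_curve(content_luminance, black_out, white_out=1.0, gamma=2.2):
    # Map normalized content luminance [0, 1] to display luminance.
    # All curves in the family reach the same white_out at content 1.0
    # but differ in how dark content is rendered (black_out), mirroring
    # curves that share DL4 at CL2 while diverging at low gray levels.
    v = content_luminance ** gamma   # illustrative power-law shape
    return black_out + (white_out - black_out) * v
```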
Due to the interaction between colorfulness, contrast, and brightness, control circuitry 16 may use a joint optimization process that tunes the strengths (e.g., weights) of color compensation and contrast enhancement in order to balance the benefits of color, contrast, and brightness in low light conditions. In other words, the iterative tone mapping process described in connection with
Illustrative mapping functions that may be applied in low light conditions to compensate for the reduced color sensitivity of the human eye are shown in
To account for the behavior of the cones and rods in the human eye, the mapping functions used by control circuitry 16 (e.g., the mapping functions of
Since input image 44 may originally be represented in RGB color space (e.g., a color space defined by RGB digital display control values for red, green, and blue pixels in display 14), control circuitry 16 may first convert the original input image 44 to perceived image 70 in an opponent color space (e.g., having a luma channel representing brightness and two chroma channels representing the red-green channel and the blue-yellow channel of the human eye). An opponent color space defines light and color based on how the light and color are received in the retina. The goal for control circuitry 16 is to compare and closely match the retinal version of original image 44 in the reference ambient light condition to the retinal version of compensated image 48 in the measured ambient light condition. Since ambient lighting conditions will affect how an image is perceived by the retina, control circuitry 16 may use a reference ambient light brightness when mapping image 44 to perceived image 70 in the opponent color space under the reference ambient light conditions. Image 70 may, for example, have a luma component representing brightness, a first chroma component representing the red-green channel, and a second chroma component representing the blue-yellow channel.
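A simple opponent-space conversion of the kind described above can be sketched as a linear map producing a luma channel and two chroma difference channels. The luma weights below are the familiar Rec. 601 values, and the chroma definitions are simplified stand-ins; the actual model's transform is not specified here:

```python
def rgb_to_opponent(r, g, b):
    # Illustrative linear map from RGB to an opponent representation:
    # one luma (brightness) channel plus red-green and blue-yellow
    # chroma channels, analogous to the opponent stage of the retina.
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    red_green = r - g
    blue_yellow = b - 0.5 * (r + g)
    return luma, red_green, blue_yellow
```

A neutral gray input produces zero in both chroma channels, which is the defining property of an opponent representation.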
Before, after, and/or in parallel with mapping image 44 to perceived reference image 70, control circuitry 16 may apply a tone mapping and color compensation to image 44 based on the measured ambient light conditions (e.g., based on the ambient light brightness measured by ambient light sensor 20) to produce compensated image 48. This may include, for example, applying a tone mapping curve to image 44 as described in connection with
After compensating image 44 to produce compensated image 48, control circuitry 16 may convert compensated image 48 to perceived compensated image 74 in the opponent color space. Since ambient lighting conditions will affect how an image is perceived by the retina, control circuitry 16 may use the measured ambient light brightness from ambient light sensor 20 when mapping compensated image 48 to perceived compensated image 74 in the opponent color space under the measured ambient light conditions. Image 74 may, for example, have a luma component representing brightness, a first chroma component representing the red-green channel, and a second chroma component representing the blue-yellow channel.
Control circuitry 16 may use perception contrast and color discrimination model 54 to compare the perceived input image 70 in the reference ambient light conditions to the perceived compensated image 74 in the measured ambient light conditions. This may include, for example, using perception contrast model 50 to determine an image contrast quality score based on the difference in contrast between perceived image 70 and perceived image 74 and using color discrimination model 52 to assign a color quality score based on the difference in colorfulness between perceived image 70 and perceived image 74. Lower scores may be associated with more closely matched images than higher scores (if desired). The two scores may be combined into a single image quality score to determine an overall quality of compensated image 48. If the image quality score is too high, control circuitry 16 may adjust the weights associated with the tone mapping (e.g.,
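The combination of the two scores into a single image quality score can be sketched as a weighted sum. The 50/50 default split is an illustrative assumption; in practice the weights would be tuning parameters of the joint optimization:

```python
def combined_quality_score(contrast_score, color_score,
                           contrast_weight=0.5, color_weight=0.5):
    # Weighted combination of the contrast and color quality scores;
    # lower values indicate a closer match between the perceived
    # compensated image and the perceived reference image.
    return contrast_weight * contrast_score + color_weight * color_score
```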
During the operations of box 80, control circuitry 16 may receive input image 44 (sometimes referred to as original image 44) and may apply a tone mapping of mesopic vision model 46 to input image 44 with one or more first weights. The first weights may be an initial guess, may be based on the measured ambient light from sensor 20, and/or may be based on other factors. This may include, for example, applying one of the tone mapping curves of
During the operations of box 82, control circuitry 16 may apply one or more color mapping functions of mesopic vision model 46 to input image 44 with one or more second weights. The second weights may be an initial guess, may be based on the measured ambient light from sensor 20, and/or may be based on other factors. This may include, for example, applying the brightness mapping curve of
During the operations of block 84, control circuitry 16 may determine an image quality score of compensated image 48 (e.g., image 48 to which tone mapping and color mapping of mesopic vision model 46 have been applied) using perception contrast and color discrimination model 54. This may include, for example, converting compensated image 48 to an opponent color space and comparing the perceived compensated image 48 (e.g., the retinal version of the compensated image 48) in the measured ambient light conditions to the perceived original image 44 (e.g., the retinal version of the original image 44) in reference ambient light conditions, as described in connection with
During the operations of block 86, control circuitry 16 may compare the image quality score determined during the operations of block 84 with a threshold image quality score (e.g., so that the overall difference between compensated image 48 in the measured low light conditions and original image 44 in reference light conditions is less than some JND value). If the image quality score is greater than the threshold, control circuitry 16 may adjust the first weights associated with the tone mapping (e.g.,
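The iterative loop described in the blocks above can be sketched as follows. The tone-mapping, color-mapping, and quality-scoring callables are placeholders for the models described in this section, and the greedy update rule is an illustrative choice of optimizer, not the one used by control circuitry 16:

```python
def optimize_compensation(image, apply_tone, apply_color, quality_score,
                          threshold, initial_weights, step=0.1, max_iters=50):
    # Iteratively adjust the tone-mapping and color-mapping weights until
    # the quality score (lower = closer match to the reference percept)
    # falls below the threshold, or the iteration budget runs out.
    tone_w, color_w = initial_weights
    compensated = apply_color(apply_tone(image, tone_w), color_w)
    score = quality_score(compensated)
    while score > threshold and max_iters > 0:
        max_iters -= 1
        # Greedy step: strengthen whichever mapping lowers the score more.
        candidates = [(tone_w + step, color_w), (tone_w, color_w + step)]
        tone_w, color_w = min(
            candidates,
            key=lambda w: quality_score(apply_color(apply_tone(image, w[0]), w[1])))
        compensated = apply_color(apply_tone(image, tone_w), color_w)
        score = quality_score(compensated)
    return compensated, (tone_w, color_w), score
```

With toy mappings that simply add their weight and a score measuring distance from a target, the loop converges once the compensated value reaches the target, illustrating the stop-below-threshold behavior of block 86.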
The foregoing is merely illustrative and various modifications can be made by those skilled in the art without departing from the scope and spirit of the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of provisional patent application No. 63/326,167, filed Mar. 31, 2022, which is hereby incorporated by reference herein in its entirety.