This application claims priority to GB Application No. 1621901.6, filed Dec. 21, 2016, under 35 U.S.C. § 119(a). The above-referenced patent application is incorporated by reference in its entirety.
The present disclosure relates to a method of controlling display of image data representing an image on a display device, a display controller and a display system.
Known display devices have backlights arranged to illuminate pixels of the display device. The pixels operate as optical switches to obscure a varying proportion of light from the backlight in order to produce display effects ranging from a darkest display effect, which may be intended to correspond with a complete obstruction of light from the backlight, to a lightest display effect, which may correspond with a minimal obstruction of light from the backlight. Backlights come in various forms, including globally-adjusted backlights, in which the luminance of the backlight is adjusted equally across the entire display, one-dimensional locally-adjusted backlights, in which the luminance of the backlight may be adjusted differently along strip-like regions arranged side by side along one dimension of the display, and two-dimensional locally-adjusted backlights, in which the luminance of the backlight may be adjusted differently in square or rectangular regions arranged in a two-dimensional array across the display.
Known display devices suffer from light leakage. This means that the pixels are unable to completely obscure light from the backlight even for the darkest display effect. In other words, light may leak from the backlight and be visible to an observer even in an off state of the display device. Dark display effects may therefore be lighter than desired or intended, leading to a corresponding reduction in contrast of the display device.
It is known to reduce light leakage of a display device by locally dimming the backlight for dark image regions. However, a locally dimmed backlight as described above is relatively complex and many devices do not have local dimming. Even if local dimming is available, the size and configuration of the dimmable regions is unlikely to correspond with the size and shape of the light and dark regions of an image.
It is desirable to provide a method of controlling display of image data representing an image on a display device that offers improved display quality.
According to some embodiments, a method of controlling display of image data representing an image on a display device is provided. The method includes determining one or more features of the image data. The method includes, in dependence on the determining, adjusting a display luminance of the display device, and applying a spatially-variant tone mapping operation to the image data.
According to some other embodiments, a display controller for controlling display of image data representing an image on a display device is provided. The display controller includes a luminance adjustment unit. The display controller includes a tone mapping module. The luminance adjustment unit is arranged to adjust a display luminance of a display device in dependence on a determination of one or more features of the image data. The tone mapping module is arranged to apply a spatially-variant tone mapping operation to the image data in dependence on the determination.
According to some other embodiments, a display system is provided. The display system includes a display device. The display system includes a display controller for controlling display of image data representing an image on the display device. The display controller includes a luminance adjustment unit. The display controller includes a tone mapping module. The luminance adjustment unit is arranged to adjust a display luminance of a display device in dependence on a determination of one or more features of the image data. The tone mapping module is arranged to apply a spatially-variant tone mapping operation to the image data in dependence on the determination.
Further features will become apparent from the following description, given by way of example only, which is made with reference to the accompanying drawings.
Details of the method according to examples will become apparent from the following description, with reference to the figures. In this description, for the purpose of explanation, numerous specific details of certain examples are set forth. Reference in the specification to “an example” or similar language means that a particular feature, structure, or characteristic described in connection with the example is included in at least that one example, but not necessarily in other examples. It should further be noted that certain examples are described schematically with certain features omitted and/or necessarily simplified for ease of explanation and understanding of the concepts underlying the examples.
Examples described herein provide a method of controlling display of image data representing an image on a display device. The method includes determining one or more features of the image data and, in dependence on the determining, adjusting a display luminance of the display device and applying a spatially-variant tone mapping operation to the image data.
The display luminance can for example be adjusted to reduce light leakage from pixels of the display device, for example to improve the contrast of the display device. The spatially-variant tone mapping operation (discussed in further detail below) may be used to compensate for the change in display luminance to enhance detail in the image such that the detail remains visible despite the change in display luminance. This can improve the display quality of the image displayed by the display device.
In examples, the display luminance may be adjusted and the spatially-variant tone mapping operation may be applied in dependence on the level of compression-noise in the image data to control or reduce the visibility of compression artifacts in the image (explained further below). For example, for an image suffering from extensive compression artifacts, a smaller amount of spatially-variant tone mapping may be applied to avoid undesirably enhancing these artifacts in the image, or the display luminance may be adjusted to a lesser extent so that these artifacts are less noticeable to a viewer.
The display device may be a transmissive or transflective display device such as a liquid crystal display (LCD) device, an electrowetting display device or an electrophoretic display device. Alternatively, the display device may be a display device in which pixels or picture elements of the display device generate light, such as an organic light emitting diode (OLED) display device or a plasma display device. The display device may be part of an electronic or computing device such as a television, laptop, tablet or smartphone.
The display device 3 in the example of
The display device 3 of
Various different light sources may be used for the backlight. In the example of
The backlight may emit white light or light of other colors. In examples in which the backlight includes LEDs, such as the example of
The backlight in this example is arranged such that pixels of the display device 3 are located between the LEDs 14 and a viewing side of the display device 3. The LEDs may therefore be considered to be behind the pixels of the display device 3. However, in other examples, a backlight or light sources of the backlight may be arranged around an edge of the display device 3 panel, with a diffusion panel arranged to spread light substantially evenly behind the pixels, for example such that differences in illumination of pixels of the display device 3 are imperceptible to a viewer of the display device 3. For example, the backlight may include a plurality of light sources, such as LEDs, located along one or more edges of the display device 3 panel, such as along a top and a bottom edge. The display device 3 in such examples may further include one or more light guides or diffusers for distributing light from the light sources across the display device 3 panel to illuminate the pixels. The light sources may not be individually controlled or controllable, to simplify the driving of the display. Instead, the light sources may be controlled by one common controller, for example such that each light source is set at the same intensity. For example, where the display device 3 is a display device of a smartphone, the backlight typically includes strips of LEDs that are globally controlled.
Irrespective of the precise arrangement of the backlight, in examples the backlight is located for light from the backlight to pass through pixels of the display device 3, for example from a rear side of the display device 3 to a viewing side of the display device 3. In this way, the backlight illuminates the pixels and contributes to a display luminance of the display device 3. Display luminance is typically understood as referring to the luminous intensity per unit area of light travelling in a given direction, for example passing through a particular solid angle. Subjectively, the display luminance may be considered to correspond to brightness, which is the visual perception of the display luminance. For example, a higher display luminance will generally be experienced by a viewer as a brighter display. The display luminance of a display device in examples with a backlight typically depends on the intensity of the backlight and the transmittance of the pixels of the display device. Thus, the display luminance of such a display device can be controlled by controlling the intensity of the backlight or the proportion of light transmitted by the display device pixels, as described further below. In examples without a backlight, for example emissive display devices such as OLED or plasma display devices, the display luminance may be controlled by controlling the luminance or intensity of each of the pixels.
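For illustration only, the following Python sketch models the relationship described above for a backlit display, with per-pixel luminance approximated as backlight intensity multiplied by pixel transmittance; the function name, the example values and the leakage bound are assumptions rather than part of the described display device.

```python
import numpy as np

def display_luminance(backlight_nits: np.ndarray, transmittance: np.ndarray) -> np.ndarray:
    """Approximate per-pixel display luminance as backlight intensity
    multiplied by pixel transmittance (both given per pixel)."""
    return backlight_nits * transmittance

# A uniform 500-nit backlight and pixels whose transmittance cannot fall
# below 0.001, modelling light leakage in the darkest pixel state.
backlight = np.full((4, 4), 500.0)
transmittance = np.clip(np.random.rand(4, 4), 0.001, 1.0)
print(display_luminance(backlight, transmittance))
```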
Image data 4 representing the image is input to the display controller 2 and output as display data 5 to the display device 3. An output of the display controller 2 may be connected to a display driver (not shown in
Input data 6 may be converted to the image data 4 by a processor 7, the conversion including for example video or image decoding. In other examples, the input data 6 is input directly to the display controller 2 as the image data 4 without conversion or further processing. In these other examples, the processor 7 may be absent and the display controller 2 may convert or decode the image data 4 as needed.
The image data may include the intensity values of each pixel of the image, which may be stored with a greyscale or brightness level of, for example, 0 to 255 per color band for 8-bit data. A greyscale level of 0 for example corresponds with a darkest intensity (e.g. black) and a greyscale level of 255 for example corresponds with a lightest intensity (e.g. white), with greyscale levels between 0 and 255 corresponding with an intermediate intensity between black and white. The image data may also include color data relating to the color of the image represented by the image data. For example, when the image is a color image, a pixel value of an intensity or brightness of each pixel may be stored separately for each color channel. If a pixel is represented by, for example, three primary colors such as in the RGB (red, green, blue) or YUV color spaces (where Y represents the luma of the color, U represents the difference between the blue component of the color and the luma and V represents the difference between the red component of the color and the luma), the visual appearance of each pixel may be represented by three intensity values, one for each primary color, for example with a bit precision of 8 bits per color channel. As will be appreciated by the skilled person, the image data may represent the image using any suitable representation, which may be different from the examples set out above, which are merely illustrative.
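As a non-limiting illustration of the representations described above, the short Python sketch below stores an 8-bit value per color channel and derives a YUV-style luma and color differences; the BT.601 luma weights are used only as a familiar example, and the scaling of the difference channels depends on the color space actually used.

```python
def rgb_to_yuv(r: int, g: int, b: int) -> tuple[float, float, float]:
    """Convert an 8-bit RGB pixel (values 0-255 per channel) to a luma and
    two (unscaled) color-difference values."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # BT.601 luma weights, for example
    u = b - y                              # blue difference
    v = r - y                              # red difference
    return y, u, v

print(rgb_to_yuv(255, 255, 255))  # white: maximum luma, zero color difference
print(rgb_to_yuv(0, 0, 0))        # black: zero luma
```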
The image data 4 may be in any suitable format for representing images. In examples, the image data 4 is in a more than 8-bit format, such as a higher than 8-bit floating point format. The image data 4 may, for example, be in an HDR (high dynamic range) format such as the JPEG XT format.
The display controller 2 includes a luminance adjustment unit 8 for controlling the luminance of the display device 3. In examples such as that of
The luminance control signal 13 output by the luminance adjustment unit 8 may be selected from a look-up table, for example. The look-up table may include a mapping of features of the image data to a particular display luminance. For example, a particular feature may correspond with a particular luminance control signal 13 to be output by the luminance adjustment unit 8. The mapping in the look-up table may depend on a construction of pixels of the display device 3, for example a level of light leakage of pixels of the display device 3 as determined under reference conditions, such as in factory conditions. In further examples, the mapping or the luminance control signal 13 may also depend on a user input. For example, a user may be able to adjust the display luminance depending on their viewing preferences, which may depend on the viewing conditions such as whether the user is in a dark or light location or the nature of the content. For example, the optimal or desired display luminance for black text on a white background may be different from that for white text on a black background or for video content. The display luminance may be adjusted for example by a user selecting a particular display luminance for the display device 3, for example by interacting with a computing device coupled to the display device 3. In other examples, the luminance control signal 13 may be calculated by the luminance adjustment unit 8 based on the features of the image data. For example, where the features of the image data include or are representative of an amount of compression-noise in the image data, the luminance control signal 13 may be calculated based on the amount of compression-noise.
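The look-up-table behaviour described above may be sketched as follows; the feature names, scale factors and the user-preference parameter are hypothetical and are shown only to illustrate how a luminance control signal could be selected in dependence on a determined feature and a user input.

```python
# Hypothetical feature-to-luminance mapping; values are illustrative only.
LUMINANCE_LUT = {
    "hdr_with_dark_patches": 0.5,       # dim the backlight for dark HDR content
    "hdr_without_dark_patches": 1.0,    # leave the backlight unchanged
    "sdr_high_compression_noise": 1.0,  # avoid dimming when artifacts would show
}

def luminance_control_signal(feature: str, user_scale: float = 1.0) -> float:
    """Return a backlight scale factor for a determined feature, optionally
    modified by a user brightness preference."""
    return LUMINANCE_LUT.get(feature, 1.0) * user_scale

print(luminance_control_signal("hdr_with_dark_patches"))       # 0.5
print(luminance_control_signal("sdr_high_compression_noise"))  # 1.0
```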
The display controller 2 of
The spatially-variant tone mapping operation applied by the tone mapping module 12 may enhance detail or contrast in the image, while still ensuring the image appears relatively “natural” to an observer. To do this, the tone mapping is spatially-variant and therefore may be asymmetric in the brightness domain, such that a greater amount of tone mapping is applied to dark regions of the image than relatively bright regions, for example by altering an intensity value of relatively dark portions of the image to a greater extent than relatively bright portions. This mimics the behavior of the human eye, which has a relatively high dynamic range, and which is capable of seeing detail in even relatively dark regions of an image. The spatially-variant tone mapping operation may therefore be spatially non-uniform, with a greater amount of tone mapping applied to certain spatial regions of the image compared with other spatial regions. The tone mapping may be continuous and smoothly-varying in both spatial and luminance dimensions. The intensity range of pixels corresponding with detail to preserve in the image in dark and/or light areas may therefore be increased and the intensity range of other areas of the image may be decreased. The spatially-variant tone mapping may therefore be used to adjust or alter the dynamic range of the image, which in examples is the ratio between intensities of the brightest and darkest parts of the image. Various different tone mapping algorithms may be used for the spatially-variant tone mapping operation. For example, a suitable algorithm is the Orthogonal Retina-Morphic Image Transform (ORMIT) algorithm.
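A simplified stand-in for a spatially-variant, brightness-asymmetric tone mapping operation is sketched below; it is not the ORMIT algorithm, merely a generic curve whose strength falls off with pixel brightness so that dark regions are enhanced more than bright regions.

```python
import numpy as np

def spatially_variant_tone_map(img: np.ndarray, strength: float = 1.0,
                               gamma: float = 0.5) -> np.ndarray:
    """img: intensities normalised to [0, 1]. Dark pixels are lifted by a
    gamma-style curve; the lift fades out as brightness increases. In practice
    a local (neighbourhood) brightness measure could drive the weight."""
    lifted = img ** gamma               # boosts dark intensities the most
    weight = strength * (1.0 - img)     # strongest weighting in dark regions
    return (1.0 - weight) * img + weight * lifted

print(spatially_variant_tone_map(np.linspace(0.0, 1.0, 5)))
```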
The luminance adjustment unit 8 and the tone mapping module 12 in examples such as
The one or more features of the image data may be any features or characteristics of the image data that may affect the display quality upon a change in display luminance and/or spatially-variant tone mapping. For example, the one or more features may include a feature representative of a level of compression-noise in the image data representing the image. In this example, applying spatially-variant tone mapping to image data with a relatively high level of compression-noise, for example to attempt to compensate for a reduction in display luminance, may increase the appearance of unsightly compression artifacts in the image, reducing the display quality.
The compression-noise may be visible as pixels with an incorrect (for example noticeably darker or lighter) intensity, around features of the image such as edges or regions corresponding to a transition from a light to a dark image region. There may also or instead be visible “blocks” of pixels with the same intensity around such image features, rather than pixels with smoothly varying intensities. Compression-noise such as this, sometimes referred to as compression artifacts, is typically caused by the quantization step of a lossy encoding operation applied to input data to generate the image data, such as a JPEG (Joint Photographic Experts Group, ISO/IEC 10918) or JPEG XT (ISO/IEC 18477) encoding operation, performed for example by the processor 7. This step generally involves rounding of various components to integer values, thereby reducing the information associated with the quantized image data after encoding. The visibility of such compression artifacts may depend on the extent or amount of compression applied to the input data 6 to obtain the image data 4 received by the display controller 2.
In order to determine the feature representative of the level of compression-noise in the image data it may not be necessary to calculate an exact or precise level or amount of compression-noise in the image data. Instead, it may be sufficient to determine or estimate an approximate or rough amount of compression-noise that is expected to be present in the image data. For example, the one or more features of the image data may include a feature that the image data satisfies a predetermined compression-noise criterion. This feature may be derived from a calculation of the compression-noise in the image data or this feature may be ascertained based on another characteristic of the image data that indicates an approximate or expected level of compression-noise in the image data.
For example, a format of the image data may be indicative or representative of the approximate level of compression-noise in the image data. In such cases, the compression-noise criterion may relate to the format of the image data. For example, if the format of the image data is a low dynamic range (LDR, sometimes referred to as SDR or standard dynamic range) format such as the JPEG file format, this may be indicative that the image data has a relatively high level of compression-noise. In contrast, if the format of the image data is a high dynamic range (HDR) format such as the JPEG XT file format, this may indicate that the image data has a relatively low level of compression-noise.
In these examples, the feature representative of the level of compression-noise in the image data may relate to a format of the image data, such as a high dynamic range format. In such cases, the determining the feature representative of the level of compression-noise in the image data may involve, solely or in conjunction with other steps, determining the format of the image data and assessing whether the format is of at least one predetermined format considered to satisfy the predetermined compression-noise criterion. If the format is of the at least one predetermined format, the display luminance of the display device may be adjusted and the spatially-variant tone mapping operation may be applied to the image data. Where the feature relates to a high dynamic range format, the determining the feature representative of the level of compression-noise in the image data may thus involve determining that the image data is in an HDR format.
In other examples, the predetermined compression-noise criterion may correspond to another criterion such as a threshold level of compression-noise in the image data. In such cases, the determining the feature representative of the level of compression-noise in the image data may include processing the image data, for example to calculate the amount or level of the compression-noise in the image data. The calculated level of compression-noise can then be compared against the threshold level of compression-noise to ascertain whether the predetermined compression-noise criterion is satisfied.
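The two ways of determining whether the predetermined compression-noise criterion is satisfied, as described above, may be sketched as follows; the format names, the noise estimator and the threshold value are illustrative assumptions.

```python
HDR_FORMATS = {"jpeg_xt", "hdr10"}  # assumed to carry relatively little compression noise

def satisfies_noise_criterion_by_format(image_format: str) -> bool:
    """Criterion based purely on the format of the image data."""
    return image_format.lower() in HDR_FORMATS

def satisfies_noise_criterion_by_estimate(estimated_noise: float,
                                          threshold: float = 0.05) -> bool:
    """Criterion based on an estimated noise level (e.g. a blockiness measure
    computed from the image data) compared against a threshold."""
    return estimated_noise < threshold

print(satisfies_noise_criterion_by_format("JPEG_XT"))  # True
print(satisfies_noise_criterion_by_estimate(0.12))     # False
```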
To put the method according to examples such as
In the example of
In the example of
If it is determined that the image does not include dark patches, the image is displayed 25 on the display device 3. In
If the image is determined to be an HDR image and to include dark patches, the display luminance is reduced and the spatially-variant tone mapping operation is applied to the image data to increase contrast in a dark area of the image 23. The image is then displayed 24 on the display device.
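The decision flow described above may be sketched as follows; the dark-patch test, the factor of 2 and the simple tone mapping used here are assumptions for illustration only.

```python
import numpy as np

def control_display(img: np.ndarray, is_hdr: bool,
                    dark_threshold: float = 0.05,
                    dark_fraction: float = 0.2):
    """img: intensities in [0, 1]. Returns (backlight_scale, image_to_display)."""
    has_dark_patches = np.mean(img < dark_threshold) > dark_fraction
    if is_hdr and has_dark_patches:
        backlight_scale = 0.5                                  # reduce display luminance
        mapped = np.where(img < 0.5, np.clip(img * 2.0, 0.0, 1.0), img)
        return backlight_scale, mapped                         # boost dark regions only
    return 1.0, img                                            # display the image as-is

img = np.concatenate([np.full(80, 0.02), np.full(20, 0.9)])
scale, _ = control_display(img, is_hdr=True)
print(scale)  # 0.5: the backlight is dimmed and dark detail is compensated
```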
By reducing the display luminance, light leakage from a light source supplying light to pixels of the display device may be reduced. This may improve the image quality for a viewer by making dark areas of the image appear darker, for example so that these dark areas are closer to a desired dark state, and by increasing the contrast of the display device. Reducing display luminance in this way may also reduce power consumption of the display device.
However, by reducing the display luminance, detail in dark areas of the image may no longer be visible to an observer. To maintain a display quality of the image, the spatially-variant tone mapping operation can be applied to increase contrast in these dark areas, to enhance, intensify or increase the visibility of detail in these image regions without compression-noise becoming visible in the image.
If, however, the image suffers from a higher level of compression-noise, increasing the contrast in dark areas of the image to compensate for a reduction in display luminance would not only increase the visibility of detail in these regions but would also increase the visibility of compression artifacts in these regions. This may be unsightly. Thus, in examples such as
In further examples, the adjustment in the display luminance and the amount of spatially-variant tone mapping applied to the image data may vary, for example smoothly, in dependence on the level of compression-noise in the image data or in dependence on a feature resulting from an analysis of image content of the image such as a darkness or pixel intensity of the darkest patch or patches of the image. For example, a decrease in display luminance may correspond with an increase in spatially-variant tone mapping and vice versa.
The amount of spatially-variant tone mapping applied to the image data depends on a tone mapping strength in examples. The tone mapping strength may be or correspond with a particular, e.g. a pre-determined, gain G. The gain G may be expressed as:

G = D_TM/D   (1)

where D is the dynamic range of the image data before the spatially-variant tone mapping operation and D_TM is an output dynamic range to be obtained after the spatially-variant tone mapping operation. The output dynamic range may for example correspond with a suitable dynamic range for detail to be visible in dark regions of the image. The gain G may be calculated from the adjustment in the display luminance. For example, if the display luminance is reduced by a factor x, the gain G may be set to equal this factor x so that the output dynamic range is increased by the factor x to compensate for the reduction in the display luminance. As the tone mapping operation is spatially-variant, the gain G may vary in different image regions. For example, the gain G may be set to equal this factor x in dark image regions, but may be set to 1 (for example, so that there is no change in the dynamic range) for bright image regions. In other examples, the relationship between the display luminance and the gain G may be different, for example with the gain G equal to a function of the factor x for a reduction in display luminance by the factor x.
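The relationship between the luminance adjustment and the gain G of equation (1) described above may be sketched as follows; the darkness threshold and the stepwise (rather than smoothly varying) gain are illustrative simplifications.

```python
import numpy as np

def tone_mapping_gain(img: np.ndarray, luminance_reduction_factor: float,
                      dark_threshold: float = 0.25) -> np.ndarray:
    """img: intensities in [0, 1]. Returns a per-pixel gain G: equal to the
    luminance reduction factor x in dark regions and 1 in bright regions.
    In practice the gain may vary smoothly rather than stepwise."""
    return np.where(img < dark_threshold, luminance_reduction_factor, 1.0)

print(tone_mapping_gain(np.array([0.05, 0.1, 0.6, 0.9]), luminance_reduction_factor=2.0))
# -> [2. 2. 1. 1.]: dark pixels are compensated, bright pixels are left unchanged
```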
An input value α to the spatially-variant tone mapping operation may be derived from the gain G as follows:
where G is the gain defined in (1), and G_max is the maximum gain achievable with a maximum tone mapping strength.
The input value α may be considered to represent a strength of the tone mapping transformation, which may take a value between 0 and 1, for example an amount or magnitude by which each pixel's intensity or brightness is altered by the tone mapping operation. The input value may be different for different pixels in the image, for example due to a different gain G for different pixels, in order to achieve an amount of tone mapping which varies across the image, i.e. spatially-variant tone mapping. For example, the input value may vary in accordance with pixel intensity so that the tone mapping is stronger (for example with a higher input value) in darker parts of the image with low pixel intensity values, and is weaker in brighter parts of the image, as described above. This allows stronger enhancement of the shadows without affecting the bright regions. For example, a pixel-by-pixel gain may be calculated by dividing, for each pixel, the pixel intensity obtained with maximum, or at least relatively strong, tone mapping applied by the pixel intensity obtained with zero, or at least relatively weak, tone mapping applied. The pixel-by-pixel gain may be used as the input value to the spatially-variant tone mapping operation and may be applied to the image data by multiplying the pixel intensity value of each pixel with the corresponding gain value for that pixel. As noted above, the tone mapping operation may be the ORMIT algorithm. In this case, the input value is the ORMIT α parameter.
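The derivation of the input value α and of a pixel-by-pixel gain may be sketched as follows; the linear mapping α = (G − 1)/(G_max − 1) used here is an assumption for illustration (giving α = 0 for no gain and α = 1 at the maximum achievable gain), not a relationship defined in the text.

```python
import numpy as np

def alpha_from_gain(gain: np.ndarray, g_max: float) -> np.ndarray:
    """Assumed linear mapping: alpha = 0 when G = 1 (no gain), alpha = 1 when
    G = G_max (maximum achievable gain)."""
    return np.clip((gain - 1.0) / (g_max - 1.0), 0.0, 1.0)

def pixel_by_pixel_gain(strongly_mapped: np.ndarray, weakly_mapped: np.ndarray) -> np.ndarray:
    """Ratio of the strongly tone-mapped intensity to the weakly (or not)
    tone-mapped intensity for each pixel, as described in the text."""
    return strongly_mapped / np.maximum(weakly_mapped, 1e-6)

print(alpha_from_gain(np.array([1.0, 2.0, 4.0]), g_max=4.0))            # [0. 0.333 1.]
print(pixel_by_pixel_gain(np.array([0.2, 0.6]), np.array([0.1, 0.6])))  # [2. 1.]
```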
In further examples, the spatially-variant tone mapping operation may be or include a so-called alpha-blending operation. In these examples, the input value α to the spatially-variant tone mapping operation may be the input to the alpha-blending operation, which governs a relative contribution to the image data used for displaying the image on the display device of first image data representing the image with a first amount of spatially-variant tone mapping applied and second image data representing the image with a second amount of spatially-variant tone mapping applied. The input to the alpha-blending operation may be calculated from the gain G in the same or a similar manner as the gain G is calculated for the example of the ORMIT algorithm.
In this case, the pixel intensity values may be modified as:
I_out = I_1 × (1 − α) + I_2 × α   (3)
where I_out is the output intensity value for the image data, I_1 is the pixel intensity value from the first image data and I_2 is the pixel intensity value from the second image data, which may be obtained by applying an initial amount of spatially-variant tone mapping to the first image data.
Other blending schemes are also possible. For example, the pixel intensity values may instead be modified as:
I_out = √(I_1² × (1 − α) + I_2² × α)   (4)
where I_out, I_1, I_2 and α are as previously defined.
The alpha-blending procedure may be considered to be an overlaying or combining of two versions of the same image; one with no tone mapping applied (corresponding to the first image data) and one with non-zero tone mapping applied (corresponding to the second image data), which may be with maximal tone mapping applied, for example. In further examples, the first image data may also be tone mapped compared with the image data prior to application of the spatially-variant tone mapping operation, but with a different amount of tone mapping than the initial amount of spatially-variant tone mapping applied to generate the second image data.
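The two blending schemes of equations (3) and (4) may be implemented directly, for example as in the following sketch, in which α may vary per pixel; the example intensity and α values are illustrative.

```python
import numpy as np

def alpha_blend_linear(i1: np.ndarray, i2: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """Equation (3): linear blend of un-tone-mapped and tone-mapped intensities."""
    return i1 * (1.0 - alpha) + i2 * alpha

def alpha_blend_sqrt(i1: np.ndarray, i2: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """Equation (4): blend of the squared intensities, followed by a square root."""
    return np.sqrt(i1 ** 2 * (1.0 - alpha) + i2 ** 2 * alpha)

i1 = np.array([0.1, 0.5, 0.9])     # first image data: no tone mapping applied
i2 = np.array([0.3, 0.6, 0.9])     # second image data: tone mapping applied
alpha = np.array([1.0, 0.5, 0.0])  # stronger blending in darker pixels
print(alpha_blend_linear(i1, i2, alpha))
print(alpha_blend_sqrt(i1, i2, alpha))
```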
If the characteristic representative of the level of compression-noise in the image data of the image is lower than that of the image data of the previously displayed image, the image is displayed 32 on the display device. This may be the case for example where the characteristic representative of the level of compression-noise in the image data is compared against a predetermined compression-noise criterion. If the characteristic representative of the level of compression-noise in the image data of the image and the previously displayed image both satisfy the predetermined compression-noise criterion, for example if both the image and the previously displayed image are HDR images with relatively low compression-noise, then it may not be necessary to further alter the display luminance or the tone mapping.
In contrast, if the characteristic indicative of the level of compression-noise in the image data is above that of the previously displayed image, for example if the previously displayed image was in an HDR format and the image is in an SDR format, the method of
Unlike the method of
In the examples of
In examples in which the display device includes a backlight, the intensity of the backlight may be substantially spatially uniformly adjusted. This may simplify the driving of the display device, for example as this obviates the need to individually adjust the light incident on each pixel. For example, the intensity of each light source of the backlight, for example each LED 14 in the example display device 3 of
In further examples, though, the display luminance may be adjusted by applying a spatially-variant adjustment to the display luminance. In these further examples, the display luminance for some pixels may be altered by a different amount or proportion than for other pixels. For example, by applying the spatially-variant adjustment, the display luminance may be altered from being spatially substantially uniform, or uniform within human perception, to being spatially non-uniform. Alternatively, where the display luminance prior to adjustment is already spatially non-uniform, the spatially-variant adjustment may further alter the relative intensity or brightness of different pixels of the display device.
In these further examples, the display device may include a backlight including a plurality of light sources, each at a different location. A spatially-variant adjustment to the display luminance in these cases may be performed by adjusting a first intensity of a first light source of the plurality of light sources so that the first intensity is different from a second intensity of a second light source of the plurality of light sources. For example, as described above, the backlight may include light sources that each illuminate a strip or stripe of the display device panel. There may be, for example, one LED per strip, with each LED at an edge of the panel, and with a light guide arranged to distribute the light from each LED along the strip of pixels. In this example, the spatially-variant adjustment may be applied by adjusting the intensity of one of the LEDs, for example so that it is brighter or darker than the other LEDs. This typically leads to a correspondingly brighter or darker strip of pixels of the display device panel, for example such that the display luminance of the display device is spatially non-uniform. In other examples, the spatially-variant adjustment may be performed by adjusting the first intensity of the first light source by a different amount, factor or proportion from that applied to a second intensity of a second light source. For example, the first intensity may be decreased by a factor of 2 and the second intensity may be decreased by a factor of 4. These examples may therefore allow the display luminance to be adjusted in one or two dimensions, depending on the precise structure of the backlight, which may be different from that described above. For example, as described above, there may be one or more light sources per pixel or group of pixels rather than merely light sources located at an edge of the display device panel.
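A spatially-variant backlight adjustment of the kind described above may be sketched as follows for a strip-lit panel; the per-strip statistics and scale factors are illustrative assumptions.

```python
import numpy as np

def per_strip_backlight(img: np.ndarray, n_strips: int,
                        dim_factor: float = 0.5,
                        dark_mean: float = 0.1) -> np.ndarray:
    """img: H x W intensities in [0, 1], with backlight strips running across
    the width. Returns one backlight scale factor per strip (1.0 = unchanged)."""
    strips = np.array_split(img, n_strips, axis=0)
    return np.array([dim_factor if strip.mean() < dark_mean else 1.0 for strip in strips])

img = np.vstack([np.full((4, 8), 0.02),   # dark content in the upper strip
                 np.full((4, 8), 0.80)])  # bright content in the lower strip
print(per_strip_backlight(img, n_strips=2))  # -> [0.5 1. ]: only the dark strip is dimmed
```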
By applying such a spatially-variant adjustment to the display luminance, the display quality may be further improved. For example, if a portion of the image is relatively bright, the display luminance may not be adjusted for pixels of the display device that are used for displaying that portion of the image. However, for pixels corresponding to a different portion of the image that is relatively dark, the display luminance may be reduced and a spatially-variant tone mapping operation may be applied to the image data representing that different portion of the image to compensate for the reduction in display luminance. Moreover, locally adjusting the display luminance in this way may further reduce the power consumption of the display device. For example, a local reduction in display luminance may result in a power saving of up to around 30% compared with a global reduction in display luminance.
In the examples of
Referring back to
The ambient light signal 10 is output by an ambient light sensor 11 that measures the ambient light level near the display device 3. The ambient light sensor may include one or more photo-detectors for measuring the ambient light level; the use of multiple sensors may increase the reliability of the measurement of diffuse ambient light. The ambient light level may be determined at the viewing side and/or the rear side of the display device 3.
In the example of
In other examples, the adjustment of the display luminance of the display device and the application of the spatially-variant tone mapping operation may depend on a level of reflection of ambient light or an expected level of reflection of ambient light from the display device (sometimes referred to as “screen glare”). This generally depends on the ambient light level and the display luminance, for example the intensity of the backlight in display devices with a backlight, as well as other factors such as the construction of the display device or the cleanliness of the display device screen. Screen glare is typically a problem in bright or sunny conditions and may be reduced by increasing the display luminance. However, increasing the display luminance typically also increases the amount of light leakage for dark image regions, making black image patches appear dark grey rather than black. Thus, there may be a trade-off between screen glare and light leakage. For example, in conditions in which screen glare dominates, the display luminance may be increased until such point that the amount of light leakage starts to become substantially equal to the level or expected level of reflection of ambient light from the display device. This may be determined for example by comparing the intensity of light leaking from pixels of the display device and the intensity of reflected light. At this point, the display luminance may not be increased further, to avoid the light leakage becoming even more visible. The spatially-variant tone mapping operation may be applied to the image data as described above, to counteract the change in display luminance.
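The trade-off between screen glare and light leakage described above may be sketched as follows; the reflectance, leakage ratio and step size are illustrative assumptions, and the reflected-luminance estimate assumes an approximately Lambertian screen surface.

```python
def adjust_for_glare(backlight_nits: float, ambient_lux: float,
                     screen_reflectance: float = 0.04,
                     leakage_ratio: float = 0.001,
                     max_nits: float = 1000.0,
                     step: float = 50.0) -> float:
    """Brighten the backlight in steps while the light leaking through the
    darkest pixel state stays below the estimated reflected ambient light."""
    # Rough reflected luminance of an approximately Lambertian screen surface.
    reflected_nits = screen_reflectance * ambient_lux / 3.14159
    while backlight_nits < max_nits and backlight_nits * leakage_ratio < reflected_nits:
        backlight_nits = min(backlight_nits + step, max_nits)
    return backlight_nits

print(adjust_for_glare(backlight_nits=300.0, ambient_lux=20000.0))  # bright conditions
```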
As can be seen from the first sub-graph 40a, initially the image includes a central light region 43a, with a maximum pixel intensity of Imax. The central light region 43a is surrounded by dark regions 44a, 45a, with a lower pixel intensity. In this example, the darkest or lowest intensity parts of the dark regions 44a, 45a are intended to correspond with a black or darkest display state of the display device. However, due to light leakage from pixels of the display device, the darkest parts of the dark regions 44a, 45a have a pixel intensity of Δ rather than 0, for example due to the extra photons escaping from the pixels in these regions. The parts of the dark regions 44a, 45a with an intensity higher than Δ represent detail in the dark regions 44a, 45a, which will be visible to the user due to the difference in intensity compared with the darkest parts of the dark regions 44a, 45a.
In the example of
Similarly, the intensity of the darkest parts of the dark regions 44b, 45b in the second sub-graph 40b is Δ/2 rather than Δ. However, a spatially-variant tone mapping operation has been applied to the image data representing the dark regions 44b, 45b in the second sub-graph 40b, which compensates for the reduction in the display luminance in the dark regions 44b, 45b. This can enhance the detail in these dark image regions, which would otherwise be lost. Despite this, the absolute difference between the lightest parts of the dark regions 44b, 45b, which, as explained above, may correspond with features or detail in the dark regions 44b, 45b, and the darkest parts of the dark regions 44b, 45b remains unchanged in the second sub-graph 40b compared with the first sub-graph 40a. This is because the tone mapping operation is applied in the RGB domain. Thus, even though the RGB intensity values are increased by a factor of 2 to compensate for the reduction in the display luminance by a factor of 2, the pixel luminance, which corresponds with the RGB intensity values multiplied by the display luminance, remains unchanged. However, the contrast of the dark regions, for example taken as the difference between the brightest and darkest parts of the dark regions divided by the average intensity of the dark regions, is greater in the second sub-graph 40b than in the first sub-graph 40a due to the reduction in the average intensity in the dark regions. As the human visual system is more sensitive to changes in contrast than to changes in absolute luminance, this improvement in contrast will improve the image quality as perceived by a viewer for the second sub-graph 40b compared to the first sub-graph 40a. Thus, detail in the dark regions of the image displayed in accordance with the second sub-graph 40b will be more visible than for the first sub-graph 40a.
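The effect described above may be checked numerically with illustrative values for Δ and the dark-region detail, as in the following sketch: the absolute difference between detail and black level is preserved while the contrast of the dark region increases.

```python
delta = 0.02                 # leakage luminance at the original display luminance
detail = delta + 0.10        # luminance of detail in the dark region

delta_2 = delta / 2          # display luminance halved, so leakage is halved
detail_2 = delta_2 + 0.10    # detail RGB doubled, so its luminance is unchanged

def contrast(bright: float, dark: float) -> float:
    return (bright - dark) / ((bright + dark) / 2)

print(detail - delta, detail_2 - delta_2)                    # same absolute difference
print(contrast(detail, delta), contrast(detail_2, delta_2))  # ~1.43 before, ~1.67 after
```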
In a typical LCD device, the contrast ratio will be around 1000:1. For such a display device, the display luminance may be reduced by a factor of between 2 and 4, and the gain obtained by the spatially-variant tone mapping may be increased by a corresponding factor of between 2 and 4, for example in dark parts of the image, although the precise display luminance adjustment and gain will generally depend on the one or more features of the image data.
The display device used to display the image of
In the example of
In this example, the intensity of the first and third light sources in the first and third zones 146b, 148b has not been adjusted. This may be the case where, for example, the display quality of the parts of the image in the first and third zones 146b, 148b is already considered sufficiently high or if these parts of the image are relatively free of detail. No tone mapping has been applied to the portions of the image corresponding to the first and third zones 146b, 148b in this example as the display luminance of the first and third light sources is unchanged. Thus, in examples such as
The above examples are to be understood as illustrative examples. Further examples are envisaged. For example, in
It is to be understood that any feature described in relation to any one example may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the examples, or any combination of any other of the examples. Furthermore, equivalents and modifications not described above may also be employed without departing from the scope of the accompanying claims.