Emissive displays (e.g. LCD or LED displays) are becoming increasingly present in the environment. Such displays require power to display content and emit light when the display is on. The emission of light makes these displays visually intrusive in certain environments (e.g. in a bedroom at night) unlike, for example, a painting, a poster or wallpaper.
The embodiments described below are not limited to implementations which solve any or all of the disadvantages of known emissive displays.
The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not intended to identify key features or essential features of the claimed subject matter nor is it intended to be used to limit the scope of the claimed subject matter. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
A method of operating an emissive display is described in which an ambient light level is detected using a light sensor. If the detected ambient light level is in a predefined region, the method comprises setting a backlight level to a minimum level, generating a correction factor based on the detected ambient light level and modifying color values of pixels using the correction factor.
Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
Like reference numerals are used to designate like parts in the accompanying drawings.
The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present examples are constructed or utilized. The description sets forth the functions of the example and the sequence of operations for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
As described above, the emission of light makes emissive displays visually intrusive in certain environments, such as in a bedroom at night or in a theatre or cinema or other environments with low light levels (e.g. because the emissive displays glow brightly). Even in more brightly lit environments, such displays are more obtrusive than a piece of paper attached to a wall or noticeboard and this may result in information overload.
Described herein is a method of operating a display device that varies the brightness and chromaticity of the display dependent upon the ambient light, such that the displayed content more closely resembles content displayed on paper. Using the methods described herein, the brightness range of an emissive display is extended (e.g. compared to known emissive displays or to the same emissive display where the methods described herein are not implemented) to enable the brightness and chromaticity of the display to be more closely matched to paper over a larger range of ambient light levels. This makes the emissive display less visually intrusive. In particular, the methods described herein enable the brightness and chromaticity of the display to be more closely matched to paper at darker ambient light levels. This makes the emissive display less visually intrusive at such darker ambient light levels and more generally, reduces the power consumption of the emissive display (e.g. compared to known emissive displays or to the same emissive display where the methods described herein are not implemented).
When operated using the methods described herein, the display device may be described as a ‘calm’ display device because it is not obtrusive and may fade into the background from the perspective of a viewer (e.g. unlike a standard emissive display which glows brightly and hence attracts the attention of a viewer). In examples where the display device does not operate using the methods described herein all of the time, the display device may be described as having a ‘calm mode’. When operating in calm mode, the display device implements the methods described herein.
The display device 100 comprises an emissive display 102 (which comprises a backlight 104) and an ambient light sensor 106. The display device 100 may be a standalone display device or may be integrated into another device (e.g. the computing device 110, a home appliance, etc.). Although the ambient light sensor 106 is shown within the display device 100, it will be appreciated that in other examples, the ambient light sensor 106 may be separate from the display device 100 but positioned on, or very close to, the front (i.e. the surface displaying the content) of the display device 100 (e.g. close to the emissive display 102).
The ambient light sensor 106 is arranged to obtain light levels (e.g. intensities) at a plurality of different, pre-defined wavelengths (or wavelength ranges) at the same point in time, where each of these pre-defined wavelengths or wavelength ranges may be referred to as a channel. In various examples the ambient light sensor 106 may be arranged to obtain light levels at three or more different, pre-defined wavelengths (or wavelength ranges), e.g. one corresponding to each of the red, green and blue channels, at the same point in time. By obtaining light levels for all of the channels at the same point in time, color flickering artifacts, which might otherwise occur as a consequence of false color readings during lighting changes, can be avoided. The ambient light sensor 106 may, for example, comprise a single sensor (e.g. which can detect the separate colors) or multiple sensors (e.g. three sensors, one for each of the red, green and blue channels). The ambient light sensor 106 may comprise one analog-to-digital converter (ADC) and multiple sample and hold circuits to enable light levels to be obtained for the red, green and blue channels at the same point in time. Alternatively, the ambient light sensor 106 may comprise multiple (e.g. three) ADCs such that all color channels can be read simultaneously. In various examples, the ambient light sensor 106 may additionally comprise a fourth channel (and optionally a fourth ADC), the clear channel, which may be used in calibration, as described below.
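Purely as an illustration of the 'same point in time' constraint described above, a multi-channel reading might be represented in software as in the following Python sketch; the AmbientSample type and the sensor.latch_and_read() call are hypothetical and do not correspond to any particular sensor driver.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class AmbientSample:
    """One ambient light reading in which all channels share a single sample instant."""
    timestamp: float             # seconds; one timestamp for all channels
    red: int                     # raw ADC counts for the red channel
    green: int                   # raw ADC counts for the green channel
    blue: int                    # raw ADC counts for the blue channel
    clear: Optional[int] = None  # optional fourth (clear) channel, used for calibration

def read_ambient(sensor) -> AmbientSample:
    """Hypothetical driver call: the hardware latches R, G and B together (e.g. via
    sample-and-hold circuits or parallel ADCs), so the three values describe the same
    instant and false-color readings during lighting changes are avoided."""
    t, r, g, b, c = sensor.latch_and_read()  # assumed hardware/driver API
    return AmbientSample(timestamp=t, red=r, green=g, blue=b, clear=c)
```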
In contrast to paper (which has a relatively uniform angular response, scattering incoming light in all directions), many RGB sensors are particularly sensitive to perpendicular light (i.e. light incident at 90° to the surface of the sensor 106), and so a system which is calibrated using perpendicular light may be too dim when the incident ambient light comes from an oblique angle. In various examples the ambient light sensor 106 may comprise a means for reducing the sensitivity of the sensor to the angle of incident light, such as an enclosure 130 which attenuates perpendicular light; an example arrangement is shown in the accompanying drawings.
The computing device 110 comprises one or more processors 112 which are microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device in order to generate display information (which is output to the display device 100). In some examples, for example where a system on a chip architecture is used, the processors 112 include one or more fixed function blocks (also referred to as accelerators) which implement a part of the method of generating display information in hardware (rather than software or firmware). Platform software comprising an operating system 114 or any other suitable platform software may be provided at the computing device to enable application software 116 to be executed on the device.
The computer executable instructions are provided using any computer-readable media that is accessible by the computing device 110. Computer-readable media includes, for example, computer storage media such as memory 118 and communications media. Computer storage media, such as memory 118, includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or the like. Computer storage media includes, but is not limited to, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM), electronic erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that is used to store information for access by a computing device. In contrast, communication media embody computer readable instructions, data structures, program modules, or the like in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Although the computer storage media (memory 118) is shown within the computing device 110 it will be appreciated that the storage is, in some examples, distributed or located remotely and accessed via a network or other communication link (e.g. using a communication interface 122).
The memory 118 may be used to store not only the computer-executable instructions for the operating system 114 and application software 116, but also data used by the operating system 114 and/or application software 116, such as calm display calibration data 121 (e.g. brightness and chromaticity calibration data).
The computing device 110 also comprises graphics hardware 124 arranged to output display information to the display device 100 which may be separate from or integral to the computing device 110. The display information may provide a graphical user interface and the display information that is output may be calm display information in order that the display device 100 can operate in the calm mode. In examples where the computing device 110 and display device 100 are not co-located and/or the two devices communicate via a network, the graphics hardware 124 may be arranged to output the display information to the display device 100 via the communication interface 122.
The graphics hardware 124 uses gamma lookup tables (LUTs) 120 as a last step of screen content composition; the LUTs may be stored as part of the graphics hardware 124 (e.g. in hardware) or elsewhere (e.g. in memory 118). Gamma LUTs 120 were originally intended to compensate for non-linearities in CRT monitors; however, in the methods described herein, they are used for a different purpose, as detailed below.
The computing device 110 may further comprise an input/output controller arranged to receive and process input from one or more devices, such as a user input device (e.g. a mouse, keyboard, camera, microphone, proximity sensor or other sensor). In some examples the user input device detects voice input, user gestures or other user actions and provides a natural user interface (NUI). This user input may be used as an input to the operating system 114 and/or the application software 116, e.g. to control what content is displayed on the display device 100 and/or whether the display is operating in calm mode. In an embodiment the display device 100 also acts as the user input device if it is a touch sensitive display device. The input/output controller may also output data to devices other than the display device, e.g. a locally connected printing device.
Any of the input/output controller, display device 100 and the user input device may comprise NUI technology which enables a user to interact with the computing-based device in a natural manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls and the like. Examples of NUI technology that are provided in some examples include but are not limited to those relying on voice and/or speech recognition, touch and/or stylus recognition (touch sensitive displays), gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. Other examples of NUI technology that are used in some examples include intention and goal understanding systems, motion gesture detection systems using depth cameras (such as stereoscopic camera systems, infrared camera systems, red green blue (RGB) camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, three dimensional (3D) displays, head, eye and gaze tracking, immersive augmented reality and virtual reality systems and technologies for sensing brain activity using electric field sensing electrodes (electro encephalogram (EEG) and related methods).
In various examples, the computing device 110 may be an embedded system.
The method comprises detecting the ambient light level using a light sensor (block 202) and then, dependent upon the detected ambient light level, brightness and chromaticity matching is performed in different ways (blocks 204-210). The brightness and chromaticity matching is performed using calibration data 121 (e.g. brightness and chromaticity calibration data) and methods for generating this calibration data 121 are described below. The outputs from the method are one or more gamma scaling factors 212 (e.g. three gamma scaling factors, one for each primary color) and a backlight brightness level 214 (which may also be referred to as a backlight intensity value). The gamma scaling factors 212 which are output are used as linear coefficients in the gamma lookup table 120 which is used when rendering content onto the display device 100 (i.e. the values in the gamma LUT are multiplied by the scaling factors). The backlight level 214 which is output is used to set the level of the backlight 104 in the emissive display 102.
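For illustration only, the overall structure of this method might be sketched in Python as follows, anticipating the regions and the brightness profile that are described in more detail below. The function and parameter names, the use of the channel mean as a scalar ambient level, the identity placeholder for the chromaticity matching and the linear approximation of the negative brightness curve are all assumptions made for the sketch and are not details taken from the description.

```python
def calm_display_update(ambient_rgb, cutoff, linear_points, min_backlight,
                        chromaticity_match=lambda rgb: (1.0, 1.0, 1.0)):
    """One iteration of the calm-mode method (illustrative only).

    ambient_rgb        -- (r, g, b) sensor reading taken at a single instant
    cutoff             -- ambient light level separating the first and second regions
    linear_points      -- (ambient_level, backlight_level) calibration points above the cut-off
    min_backlight      -- minimum, non-zero backlight level used in the second region
    chromaticity_match -- callable returning per-primary gamma scaling factors
                          (an identity placeholder here; a fuller version is sketched later)
    Returns (gamma_scaling_factors, backlight_level).
    """
    level = sum(ambient_rgb) / 3.0                  # scalar ambient level (an assumption)
    gamma = list(chromaticity_match(ambient_rgb))   # chromaticity matching in both regions

    if level >= cutoff:
        # First region: brightness matching picks the backlight level by interpolating
        # between the calibration points above the cut-off.
        pts = sorted(linear_points)
        level_c = min(max(level, pts[0][0]), pts[-1][0])
        backlight = pts[-1][1]
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= level_c <= x1:
                backlight = y0 + (y1 - y0) * (level_c - x0) / ((x1 - x0) or 1.0)
                break
    else:
        # Second region: backlight pinned at its minimum; a correction factor (1 + B),
        # with B taken from the negative brightness curve (approximated here as linear),
        # scales the gamma scaling factors instead.
        backlight = min_backlight
        b = max(-1.0, level / cutoff - 1.0)
        gamma = [g * (1.0 + b) for g in gamma]

    return tuple(gamma), backlight

# Example usage (arbitrary numbers):
points = [(10.0, 0.05), (200.0, 0.6), (1000.0, 1.0)]
print(calm_display_update((40, 50, 45), cutoff=10.0, linear_points=points, min_backlight=0.05))
print(calm_display_update((2, 3, 2), cutoff=10.0, linear_points=points, min_backlight=0.05))
```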
A further representation of the method is shown in the accompanying drawings.
The first and second regions (as used in blocks 204 and 208) may be defined using a single threshold value 1100 (as depicted in the accompanying drawings).
In other examples, there may be more than two regions (as depicted in the accompanying drawings).
The thresholds 1100, 1200 which are used to define the different regions may be pre-defined and fixed or may vary in some way. In various examples, the value of the threshold that is used to define the first and second regions may vary according to the displayed content and/or the color of the ambient light and/or based on other parameters.
The chromaticity matching (block 302), which is performed in both blocks 206 and 210, uses chromaticity calibration data 304; an example of this calibration data is shown in the accompanying drawings.
As shown in the accompanying drawings, the chromaticity calibration data 304 comprises a plurality of calibration points 512-516, each comprising a sensed ambient light chromaticity (e.g. ûA′v̂A′, ûB′v̂B′, ûC′v̂C′) and a corresponding desired display output (e.g. uA′vA′, uB′vB′, uC′vC′); three such calibration points form a triangle 511 in the sensor chromaticity space.
The chromaticity matching (block 302) operation can be described with reference to the accompanying diagrams.
As described above, for each of the calibration points 512-516 (e.g. ûA′v̂A′, ûB′v̂B′, ûC′v̂C′), the calibration data comprises a desired display output (e.g. uA′vA′, uB′vB′, uC′vC′) and the triangle 511′ in the display space is also shown in the accompanying drawings.
The chromaticity matching (block 302) therefore performs the following transformations (in blocks 502, 504, 506 and 508 respectively): sensor RGB → û′v̂′ → u′v′ → display RGB.
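For illustration only, the following Python sketch implements one possible reading of blocks 502-508. The sRGB conversion matrices, the clamped-barycentric choice of 'closest triangle', the luminance used when converting back from u′v′ and the normalization of the result into gamma scaling factors are all assumptions made for the sketch rather than details taken from the description above.

```python
import numpy as np
from itertools import combinations

# Linear-RGB <-> CIE XYZ matrices (sRGB primaries, D65) -- an assumption; in practice
# the sensor and the panel would be characterized individually.
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])
XYZ_TO_RGB = np.linalg.inv(RGB_TO_XYZ)

def rgb_to_uv(rgb):
    """Linear RGB -> CIE 1976 UCS chromaticity (u', v')."""
    X, Y, Z = RGB_TO_XYZ @ np.asarray(rgb, dtype=float)
    d = X + 15.0 * Y + 3.0 * Z
    return np.array([4.0 * X / d, 9.0 * Y / d])

def uv_to_rgb(uv, Y=1.0):
    """CIE 1976 UCS (u', v') at luminance Y -> linear RGB (clipped at zero)."""
    u, v = uv
    X = Y * 9.0 * u / (4.0 * v)
    Z = Y * (12.0 - 3.0 * u - 20.0 * v) / (4.0 * v)
    return np.clip(XYZ_TO_RGB @ np.array([X, Y, Z]), 0.0, None)

def barycentric(p, tri):
    """Barycentric coordinates of point p with respect to a 3x2 triangle array."""
    a, b, c = tri
    w1, w2 = np.linalg.solve(np.column_stack([b - a, c - a]), p - a)
    return np.array([1.0 - w1 - w2, w1, w2])

def chromaticity_match(sensor_rgb, cal_sensor_uv, cal_display_uv):
    """Sensed ambient color -> per-primary gamma scaling factors.

    cal_sensor_uv  -- (N, 2) sensed chromaticities of the calibration points
    cal_display_uv -- (N, 2) corresponding desired display outputs
    """
    p = rgb_to_uv(sensor_rgb)
    best = None
    for idx in combinations(range(len(cal_sensor_uv)), 3):
        tri = cal_sensor_uv[list(idx)]
        try:
            w = barycentric(p, tri)          # may lie outside the triangle
        except np.linalg.LinAlgError:
            continue                         # skip degenerate (collinear) triangles
        w = np.clip(w, 0.0, None)
        w /= w.sum()                         # crude projection onto the triangle
        err = np.linalg.norm(w @ tri - p)
        if best is None or err < best[0]:
            best = (err, idx, w)

    _, idx, w = best
    # Interpolate the desired display outputs with the same weights, then convert
    # back to RGB and normalize so the largest channel has a scaling factor of 1.
    target_uv = w @ cal_display_uv[list(idx)]
    rgb = uv_to_rgb(target_uv)
    return rgb / max(rgb.max(), 1e-9)

# Example usage with made-up calibration points:
cal_sensor = np.array([rgb_to_uv(c) for c in [(1, .2, .2), (.2, 1, .2), (.2, .2, 1), (1, 1, 1)]])
cal_display = np.array([rgb_to_uv(c) for c in [(1, .4, .3), (.4, 1, .4), (.4, .4, 1), (1, .95, .9)]])
print(chromaticity_match((0.9, 0.8, 0.7), cal_sensor, cal_display))
```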
If the sensed ambient light level is in the first region (‘Yes’ in block 204), the final RGB values that are determined from the chromaticity matching (in block 508) are used as linear coefficients (i.e. the gamma scaling factors 212) in the gamma LUTs 120 (as used by the graphics hardware 124). In this way the chromaticity matching is applied globally to all content displayed on the display device 100 without extra computational overhead.
If, however, the sensed ambient light level is in the second region (‘Yes’ in block 208), then a correction factor is applied to these final RGB values (in block 210) and this correction factor is determined using the brightness calibration data as described below. The resulting values are then used as linear coefficients (i.e. the gamma scaling factors 212) in the gamma LUTs 120 (as used by the graphics hardware 124). As in the case for the first region, by using the gamma LUTs the chromaticity matching is applied globally to all content displayed on the display device 100 without extra computational overhead.
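For illustration only, the following sketch builds per-primary gamma LUTs whose entries are multiplied by the gamma scaling factors and, when the ambient level is in the second region, by the correction factor. The base gamma of 2.2, the 256-entry LUT size and the integer output encoding are assumptions made for the sketch.

```python
import numpy as np

def build_gamma_luts(gamma_scale, correction=1.0, lut_size=256, gamma=2.2):
    """Build per-primary gamma LUTs with the scaling applied as linear coefficients,
    so the matching is applied globally to all displayed content.

    gamma_scale -- (gR, gG, gB) from the chromaticity matching
    correction  -- (1 + B) factor used when the ambient level is in the second region
    """
    x = np.linspace(0.0, 1.0, lut_size)
    base = x ** (1.0 / gamma)                                   # illustrative base response curve
    luts = [np.clip(base * g * correction, 0.0, 1.0) for g in gamma_scale]
    return [np.round(lut * (lut_size - 1)).astype(np.uint16) for lut in luts]

# First region: correction defaults to 1; second region: pass correction = 1 + B.
red_lut, green_lut, blue_lut = build_gamma_luts((0.9, 0.85, 0.7), correction=0.4)
```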
An example of brightness calibration data 602 (or brightness profile) is shown in the accompanying drawings.
In various examples, the cut-off level 606 defines the threshold which separates the first region and second region (as used in blocks 204 and 208).
If the detected ambient light level (from block 202) is in the second region (‘Yes’ in block 208), e.g. it is below the cut-off level 606, the correction factor is determined from the negative brightness curve 608. For example, if the detected ambient light level is at a level indicated by arrow 612, then a parameter B (which in this example is negative) is determined using the curve 608 to be a value indicated by arrow 614 and the correction factor (which is between 0 and 1) is given by (1+B).
As described above, if the detected ambient light level (from block 202) is in the second region, e.g. it is below the cut-off level 606, then the correction factor is determined from the part of the brightness calibration data that is the negative brightness curve and the backlight level is set to a minimum (but non-zero) value (block 210). In contrast, the brightness matching (block 306) which is performed when the detected ambient light level is in the first region (in block 206), e.g. if it is above the cut-off level 606, uses the part of the brightness calibration data 308 which is above the cut-off level 606 (e.g. the linear part of the brightness profile). For example, if the detected ambient light level is at a level indicated by arrow 622, then the backlight level is set to a value indicated by arrow 624. In various examples, there may be two or more calibration points above the cut-off level 606 and the brightness matching (in block 206) may interpolate (e.g. linearly interpolate) between the calibration points.
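The brightness profile lookups described above might, for illustration, be implemented as in the following sketch; whereas the earlier sketch approximated the negative brightness curve as linear, this one interpolates both parts of the profile from stored points. The dictionary layout and the use of linear interpolation throughout are assumptions made for the sketch.

```python
import numpy as np

def lookup_brightness(ambient_level, profile):
    """Return (backlight_level, correction_factor) from a brightness profile.

    profile["negative"]      -- (ambient_level, B) points with B in [-1, 0], below the cut-off
    profile["linear"]        -- (ambient_level, backlight_level) points above the cut-off
    profile["cutoff"]        -- cut-off ambient light level 606
    profile["min_backlight"] -- minimum, non-zero backlight level
    """
    if ambient_level >= profile["cutoff"]:
        # First region: interpolate the backlight level; the correction factor is 1.
        xs, ys = zip(*sorted(profile["linear"]))
        return float(np.interp(ambient_level, xs, ys)), 1.0
    # Second region: backlight at its minimum; B interpolated from the negative curve
    # and the correction factor given by (1 + B), which lies between 0 and 1.
    xs, ys = zip(*sorted(profile["negative"]))
    b = float(np.interp(ambient_level, xs, ys))
    return profile["min_backlight"], 1.0 + b

profile = {
    "cutoff": 10.0,
    "min_backlight": 0.05,
    "negative": [(0.0, -1.0), (2.0, -0.8), (5.0, -0.45), (10.0, 0.0)],
    "linear": [(10.0, 0.05), (200.0, 0.6), (1000.0, 1.0)],
}
print(lookup_brightness(3.0, profile))    # second region -> (0.05, ~0.32)
print(lookup_brightness(120.0, profile))  # first region  -> (~0.37, 1.0)
```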
In various examples, one or more filtering operations may be introduced into the method to reduce display flickering. A first optional filtering operation may be included to reduce quantization flicker and a second optional filtering operation may be included to reduce sampling flicker. Quantization flicker is caused by the ambient light level being exactly at the boundary of two digital values reported by the ambient light sensor 106, resulting in a stream of alternating values. The effect of this quantization flicker is more noticeable in lower light conditions. To mitigate quantization flicker, an averaging filter 310 may be placed on the sensor output, as shown in the accompanying drawings.
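A minimal sketch of such filtering is shown below. The sliding-window average corresponds to the averaging filter 310; the deadband filter is only an assumed, generic example of a second filtering stage, since the second filtering operation is not detailed above, and the window length and threshold are arbitrary choices.

```python
from collections import deque

class AveragingFilter:
    """Sliding-window mean over recent readings (cf. averaging filter 310), smoothing the
    alternation between adjacent digital values that causes quantization flicker."""
    def __init__(self, window=8):
        self.buf = deque(maxlen=window)

    def update(self, reading):
        self.buf.append(float(reading))
        return sum(self.buf) / len(self.buf)

class DeadbandFilter:
    """Only pass on a new value when it moves by more than a threshold -- an assumed,
    generic way of suppressing small sample-to-sample changes."""
    def __init__(self, threshold=0.5, initial=0.0):
        self.threshold = threshold
        self.value = initial

    def update(self, reading):
        if abs(reading - self.value) > self.threshold:
            self.value = reading
        return self.value

# Example: chain the two filters on a stream of raw ambient readings.
avg, dead = AveragingFilter(window=4), DeadbandFilter(threshold=0.5)
for raw in [10, 11, 10, 11, 10, 11, 30, 31, 30, 31]:
    print(raw, round(dead.update(avg.update(raw)), 2))
```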
Although the methods described above use an ambient light level detected by a single ambient light sensor 106, in other examples the display device 100 may comprise a plurality of ambient light sensors 106 which are positioned in different places around the emissive display 102.
Although the methods described above output both one or more gamma scaling factors 212 and a backlight level, with a correction being applied to the gamma scaling factors (in block 210) if the detected light level is in the second region (‘Yes’ in block 208), in other examples, the methods may use a global shader (to generate modified RGB values) instead of applying a correction factor to the gamma scaling factors. In such an example, if the detected light level is in the second region (‘Yes’ in block 208), a correction factor is calculated based on the detected light level (as described above) but instead of using this to modify the gamma scaling factor(s), the correction factor is instead input as a parameter to a global shader. The global shader then modifies the RGB values of the pixels to be displayed instead of using the gamma LUTs. In other examples, a global shader may be used in combination with gamma LUTs, such that the global shader modifies the RGB values of the pixels prior to application of the gamma LUTs.
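For illustration, a CPU stand-in for such a global shader is sketched below; a real implementation would run on the GPU, and the numpy representation of the frame is an assumption made for the sketch.

```python
import numpy as np

def apply_correction_shader(frame_rgb, gamma_scale, correction):
    """Scale every pixel's RGB values by the per-primary gamma scaling factors and the
    correction factor, instead of (or before) applying the gamma LUTs.

    frame_rgb   -- H x W x 3 float array with values in [0, 1]
    gamma_scale -- (gR, gG, gB) from the chromaticity matching
    correction  -- (1 + B) correction factor for the second region
    """
    scale = np.asarray(gamma_scale, dtype=float) * correction
    return np.clip(frame_rgb * scale, 0.0, 1.0)

frame = np.random.rand(4, 4, 3)                      # placeholder frame
out = apply_correction_shader(frame, (0.9, 0.85, 0.7), correction=0.4)
```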
In a further variation on the methods described above, the methods may be used only to perform brightness matching and not chromaticity matching, as shown in the accompanying drawings.
Use of the methods described herein may reduce the power consumption of the display device (e.g. because the backlight level is reduced when in calm mode). Furthermore, use of the methods may increase productivity (e.g. because displays do not need to be switched off at night and on again in the morning) and may increase security (e.g. because security updates can be applied more quickly if devices are not manually switched off at night).
The chromaticity and brightness calibration data 304, 308 described above may be generated in any suitable way. Example methods of generating the calibration data are described below which involve a visual comparison (by a human viewer) of the emissive display and a piece of paper under the same lighting conditions. Alternatively, the calibration data may be generated without human involvement, with the visual comparison instead being performed using two cameras with high spectral accuracy. By using calibration data generated in this way, the emissive display, when operated as described above, is not only ‘calm’ in the sense that it fades into the background, but the colors and brightness used also make the content look as if it is rendered on a reflective surface, such as content printed onto paper. As a consequence of the color matching to paper, the methods described herein may reduce eye strain or otherwise make content more accessible to users (e.g. because their eyes do not need to adjust to the unnatural light emitted by the display). In addition, or instead, it may help people to sleep better (e.g. by reducing any effect of viewing the display on a viewer's circadian timing).
The calibration data may, in various examples, not be generated for each display device 100 but instead may be generated for a device type, where a device type is defined as a particular combination of sensor type and emissive display type (e.g. because any variation between sensors of the same type and/or emissive displays of the same type will not affect the calibration data significantly).
The generation of the brightness calibration data 308 is described below with reference to the accompanying drawings.
Two examples 800, 801 of the arrangement of hardware which is used for the brightness calibration process are shown in the accompanying drawings.
The emissive display 102/display device 100 and the ambient light sensor(s) 106, 806 are connected to a computing device 110 which generates the brightness calibration data 308. This computing device 110 may be the same as the computing device 110 which implements the methods described above.
As shown in the accompanying drawings, the brightness calibration process comprises two stages 72, 73, which are described below.
In the first stage 72, the backlight unit 104 in the emissive display 102 is set to a minimum brightness level (block 706), e.g. so that it is operating at the cut-off level 606, and light sensor readings are stored for a plurality of different values of gamma scaling factor between 0 and 1 (block 710). The ambient light sensor which is used in this first stage is facing the emissive display 102 such that it senses light emitted by the emissive display 102 and the measured values may be considered to be brightness levels. The first stage 72 generates at least two data pairs, each comprising a light sensor reading (where this reading corresponds to light emitted by the emissive display), SE, and a gamma scaling factor, G. Having generated all the data pairs, the light sensor readings are scaled to a relative scale from 0 to 1 (i.e. by dividing each reading by the maximum light sensor reading). This generates a monotonic curve going from (0,0) to (1,1). In the examples described herein, however, the convention uses negative values and hence one may be subtracted from each of the values to generate a monotonic negative brightness curve going from (−1,−1) to (0,0).
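For illustration only, the normalization described above might be expressed as in the following sketch; the pairing of axes in the output and the example readings are assumptions made for the sketch.

```python
def negative_brightness_curve(stage1_pairs):
    """Convert first-stage data pairs (gamma scaling factor G, emitted-light reading SE)
    into a monotonic negative brightness curve: readings are scaled to [0, 1] by the
    maximum reading, then one is subtracted from each value so the points run from
    (-1, -1) to (0, 0). Which coordinate is later treated as the light level and which
    as B is an assumption made for this sketch."""
    max_reading = max(se for _, se in stage1_pairs)
    curve = []
    for g, se in sorted(stage1_pairs, key=lambda p: p[1]):
        scaled = se / max_reading                # in [0, 1]
        curve.append((scaled - 1.0, g - 1.0))    # shift both values to the negative convention
    return curve

# Example with made-up first-stage readings (G, SE):
pairs = [(0.0, 1.0), (0.25, 60.0), (0.5, 190.0), (0.75, 420.0), (1.0, 800.0)]
print(negative_brightness_curve(pairs))
```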
The second stage 73 provides a link between the performance of the emissive display 102 and how a human perceives the display.
Finally, the two parts of the calibration curve are combined by using the cut-off brightness level (from the second stage) to scale the y-values of the negative brightness curve (which, as generated in the first stage, range from −1 to 0), thereby obtaining a negative brightness curve 608 as shown in the accompanying drawings.
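The combination step might, under one interpretation, be expressed as in the sketch below. Rather than implementing any particular scaling of the y-values, the sketch follows the first further example set out later and maps the normalized first-stage readings onto ambient light levels below the cut-off, pairing them with B values; the dictionary layout matches the earlier brightness-profile sketch and is an assumption, not a prescribed data format.

```python
def combine_brightness_profile(stage1_curve, cutoff_level, linear_points, min_backlight):
    """Combine the two calibration stages into one brightness profile.

    stage1_curve  -- (scaled_reading - 1, G - 1) pairs from the first stage,
                     running from (-1, -1) to (0, 0)
    cutoff_level  -- cut-off ambient light level 606 obtained from the second stage
    linear_points -- (ambient_level, backlight_level) pairs above the cut-off
    The mapping of normalized readings onto ambient light levels between 0 and the
    cut-off is an interpretation, not a definitive reading of the combination step."""
    negative = [(cutoff_level * (1.0 + x), y) for x, y in stage1_curve]  # x -> ambient level, y = B
    return {
        "cutoff": cutoff_level,
        "min_backlight": min_backlight,
        "negative": sorted(negative),
        "linear": sorted(linear_points),
    }

# Example usage with the output of the previous sketch (values rounded):
stage1 = [(-1.0, -1.0), (-0.93, -0.75), (-0.76, -0.5), (-0.48, -0.25), (0.0, 0.0)]
profile = combine_brightness_profile(stage1, cutoff_level=10.0,
                                     linear_points=[(10.0, 0.05), (200.0, 0.6)],
                                     min_backlight=0.05)
```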
The generation of the chromaticity calibration data 304 is described below with reference to the accompanying drawings.
A controllable light source is set to each of a plurality of colors (e.g. three primary colors and a plurality of non-primary colors) and, for each color, in response to a user initiated trigger, a color calibration point is stored comprising the color of the controllable light source and the detected ambient light color value from the ambient light sensor. The corresponding desired display output for each calibration point may be determined by visual comparison of the emissive display 102 with a piece of paper under the same lighting conditions, as described above.
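Purely as a hypothetical harness (the callables and the interactive prompt below are not part of the examples above), the collection of such calibration points might be scripted as follows; the recorded readings could then be converted to chromaticities (e.g. with the rgb_to_uv() helper from the earlier sketch) for use in chromaticity matching.

```python
import numpy as np

def collect_chromaticity_calibration(colors, read_sensor_rgb, read_matched_display_rgb):
    """For each color of the controllable light source, wait for a user-initiated trigger,
    then record the sensed ambient color and the display color judged to match paper under
    that light. The two callables stand in for the sensor driver and the matching UI."""
    sensed, matched = [], []
    for color in colors:   # e.g. three primary colors plus several non-primary colors
        input(f"Set the light source to {color} and press Enter when the display matches paper")
        sensed.append(read_sensor_rgb())
        matched.append(read_matched_display_rgb())
    return np.array(sensed), np.array(matched)
```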
The methods of operation of an emissive display device described above may be applied to all of the pixels of a display or only to a subset of the pixels of the display.
In various examples, the methods described herein (i.e. the calm mode of operation) may be used for a plurality of pixels around the edge of the display 1000 (e.g. those pixels in region 1002), whilst the remaining, central pixels (e.g. those in region 1004) display content in a normal mode.
By changing the shapes of the two regions 1002, 1004, not only can the display 1000 appear to have a non-rectangular shape, but the apparent shape of the display 1000 can be varied over time and the display may be made to appear to move; in some examples, the display 1000 may appear to be a plurality of smaller displays.
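As an illustration of how the two regions might be represented in software, the following sketch builds a boolean mask selecting a border of pixels to be driven in calm mode; the rectangular border, the numpy representation and the dimensions are assumptions made for the sketch.

```python
import numpy as np

def edge_region_mask(height, width, border):
    """Boolean mask selecting a border of pixels (cf. region 1002) to be driven in calm
    mode, while the interior (cf. region 1004) shows content normally. Any mask shape
    could be used, and varying it over time changes the apparent shape of the display."""
    mask = np.zeros((height, width), dtype=bool)
    mask[:border, :] = mask[-border:, :] = True
    mask[:, :border] = mask[:, -border:] = True
    return mask

mask = edge_region_mask(1080, 1920, border=120)   # illustrative dimensions
# frame[mask] could then be blended towards the calm (paper-matched) background color.
```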
As described above, using the methods described herein makes the emissive display less visually intrusive. In various examples, one or more presence sensors may additionally be used to switch the emissive display 102 (and/or the backlight 104) on and off, or to switch between normal mode and calm mode, dependent upon whether a user is detected in the vicinity of the display device 100.
In further examples, a more intelligent sensing system may be used instead of one or more presence sensors. For example, the system may comprise one or more cameras (e.g. in communication with, or integral to, the display device 100 and/or the computing device 110) and the computing device 110 may comprise image processing software configured to determine whether a user is looking towards (e.g. based on eye tracking) or facing (e.g. based on tracking body position) the display device 100 or not, and to switch the emissive display 102 (and/or backlight 104) on and off, or to switch between normal mode and calm mode, in response to this determination (e.g. to switch the emissive display 102 off if the user is determined to not be looking or facing towards the display device and to switch the emissive display 102 on if the user is determined to be looking or facing towards the display device, where, when switched on, the display device 100 in combination with the computing device 110 operates as described above).
Although the present examples are described and illustrated herein as being implemented in a system as shown in the accompanying drawings, the system described is provided as an example and not a limitation. As those skilled in the art will appreciate, the present examples are suitable for application in a variety of different types of systems.
Further examples are described below and additional examples may comprise various combinations of the following features. Features may be combined in any manner (e.g. such that an example may comprise any subset of the features of the first further example set out below or any subset of the features of the third further example set out below).
A first further example provides a method of operating an emissive display, the method comprising: detecting an ambient light level using a light sensor; and in response to detecting an ambient light level in a predefined region, setting a backlight level to a minimum level, generating a correction factor based on the detected ambient light level and modifying color values of pixels using the correction factor.
The method of the first further example may further comprise: in response to detecting an ambient light level in the predefined region, additionally performing chromaticity matching.
The predefined region may be a second region and the method of the first further example may further comprise: in response to detecting an ambient light level in a first region, performing chromaticity and brightness matching independently. Performing brightness matching for a detected ambient light level in the first region may comprise: accessing brightness calibration data, the brightness calibration data comprising a negative brightness curve defining calibration factors for ambient light levels below a cut-off level; and determining a calibration factor corresponding to the detected ambient light level based on the negative brightness curve in the brightness calibration data.
In the method of the first further example modifying color values of content to be displayed using the correction factor may comprise: applying the correction factor to one or more gamma scaling factors output from the chromaticity matching to generate modified gamma scaling factors.
The predefined region may be a second region and the method of the first further example may further comprise: comparing the detected ambient light level to a threshold, wherein if the detected ambient light level exceeds the threshold, the ambient light level is in the first region and if the detected ambient light level does not exceed the threshold, the ambient light level is in the second region. The threshold may be determined based on a minimum, non-zero brightness level of the emissive display. The threshold may be determined based on a minimum, non-zero brightness level of a backlight unit in the emissive display.
In the method of the first further example performing chromaticity matching may comprise: converting sensor data from the ambient light sensor into an additive color space; accessing chromaticity calibration data and projecting the converted sensor data onto a closest triangle formed by three calibration points in the chromaticity calibration data; linearly interpolating between display output data for each of the three calibration points to generate a display output data point for the converted sensor data; and converting the display output data point back into RGB color space from the additive color space to generate one or more gamma scaling factors. The additive color space may be a CIE 1976 UCS.
In the method of the first further example generating a correction factor based on the detected ambient light level may comprise: accessing brightness calibration data; and determining a backlight level corresponding to the detected ambient light level based on the brightness calibration data.
The method of the first further example may further comprise: generating brightness calibration data. Generating the brightness calibration data may comprise: setting each pixel in the emissive display to white and setting a backlight in the emissive display to a minimum brightness level; adjusting a gamma scaling factor used to generate display information output to the emissive display and storing light sensor readings for a plurality of different values of gamma scaling factor, wherein the light sensor is positioned such that it captures light emitted by the emissive display such that each stored light sensor reading is a detected emitted light level; setting each pixel in the emissive display to white using a maximum gamma scaling factor; setting a brightness level setting of the backlight to a first value; in response to a user initiated trigger, storing a first data pair comprising the first value and a light sensor reading, wherein the light sensor is positioned such that it captures ambient light falling on the emissive display such that the light sensor reading is a first detected ambient light level; setting a brightness level setting of the backlight to a second value; in response to a user initiated trigger, storing a second data pair comprising the second value and a second detected ambient light level; and generating a first part of the brightness calibration data by converting the detected emitted light levels for the plurality of different gamma scaling factors to a plurality of data points specifying gamma scaling factors for a plurality of different detected ambient light levels using at least the first and second data pairs.
The method of the first further example may further comprise generating chromaticity calibration data. Generating the chromaticity calibration data may comprise: setting a controllable light source to a primary color; in response to a user initiated trigger, storing a first color calibration point comprising the color of the controllable light source and a light sensor reading, wherein the light sensor is positioned such that it captures ambient light falling on the emissive display such that the light sensor reading is a detected ambient light color value; and repeating the setting and storing, in response to a user initiated trigger, to store a plurality of other color calibration points at two other primary colors and a plurality of non-primary colors.
A second further example provides a method of operating an emissive display, the method comprising: detecting an ambient light level using a light sensor; in response to detecting an ambient light level in a first region, performing chromaticity and brightness matching independently; and in response to detecting an ambient light level in a second region, setting a backlight level to a minimum level, performing chromaticity matching, generating a correction factor based on the detected ambient light level and modifying color values of pixels using the correction factor.
A third further example provides a system comprising a computing device, the computing device comprising: a processor; graphics hardware configured to output display information to a display device comprising an emissive display and an ambient light sensor; and a memory arranged to store computer-executable instructions, that when executed by the processor, cause the computing device to: detect an ambient light level using an ambient light sensor; and in response to detecting an ambient light level in a predefined region, set a backlight level to a minimum level, generate a correction factor based on the detected ambient light level and modify color values of content to be displayed using the correction factor.
In the third further example the computing device may be further arranged to store chromaticity calibration data and wherein the memory may be further arranged to store computer-executable instructions, that when executed by the processor, cause the computing device to: in response to detecting an ambient light level in the predefined region, additionally perform chromaticity matching.
In the third further example the predefined region may be a second region and the memory may be further arranged to store computer-executable instructions, that when executed by the processor, cause the computing device to:
in response to detecting an ambient light level in a first region, perform chromaticity and brightness matching independently.
In the third further example the system may further comprise the display device, the display device comprising: an emissive display comprising a backlight; and the ambient light sensor. The ambient light sensor may comprise means for attenuating perpendicular light.
In the third further example the computing device may be further arranged to store gamma lookup tables and wherein modifying color values of content to be displayed using the correction factor may comprise: applying the correction factor to one or more gamma scaling factors output from the chromaticity matching to generate modified gamma scaling factors.
In the third further example generating a correction factor based on the detected ambient light level may comprise: accessing brightness calibration data; and determining a backlight level corresponding to the detected ambient light level based on the brightness calibration data.
A fourth further example provides one or more tangible device-readable media with device-executable instructions that, when executed by a computing system, direct the computing system to perform operations comprising: detecting an ambient light level using a light sensor; and in response to detecting an ambient light level in a predefined region, setting a backlight level to a minimum level, generating a correction factor based on the detected ambient light level and modifying color values of content to be displayed using the correction factor.
The term ‘computer’ or ‘computing-based device’ is used herein to refer to any device with processing capability such that it executes instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms ‘computer’ and ‘computing-based device’ each include personal computers (PCs), servers, mobile telephones (including smart phones), tablet computers, set-top boxes, media players, games consoles, personal digital assistants, wearable computers, and many other devices.
The methods described herein are performed, in some examples, by software in machine readable form on a tangible storage medium e.g. in the form of a computer program comprising computer program code means adapted to perform all the operations of one or more of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. The software is suitable for execution on a parallel processor or a serial processor such that the method operations may be carried out in any suitable order, or simultaneously.
This acknowledges that software is a valuable, separately tradable commodity. It is intended to encompass software, which runs on or controls “dumb” or standard hardware, to carry out the desired functions. It is also intended to encompass software which “describes” or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
Those skilled in the art will realize that storage devices utilized to store program instructions are optionally distributed across a network. For example, a remote computer is able to store an example of the process described as software. A local or terminal computer is able to access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that, by utilizing conventional techniques known to those skilled in the art, all or a portion of the software instructions may be carried out by a dedicated circuit, such as a digital signal processor (DSP), programmable logic array, or the like.
Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to ‘an’ item refers to one or more of those items.
The operations of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
The term ‘comprising’ is used herein to mean including the method blocks or elements identified, but that such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
The term ‘subset’ is used herein to refer to a proper subset such that a subset of a set does not comprise all the elements of the set (i.e. at least one of the elements of the set is missing from the subset).
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the scope of this specification.