This disclosure relates generally to electronic devices, and more particularly to electronic devices having displays.
Portable electronic device usage has become ubiquitous. A vast majority of the population carries a smartphone, tablet computer, or laptop computer daily to communicate with others, to stay informed, to consume entertainment, and to manage their lives.
As the technology incorporated into these portable electronic devices has become more advanced, so too has their feature set. A modern smartphone includes more computing power than a desktop computer of only a few years ago. Additionally, while early generation portable electronic devices included physical keypads, most modern portable electronic devices include touch-sensitive displays. It would be advantageous to have an improved electronic device utilizing methods for adjusting the display settings to improve the user experience.
The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views and which together with the detailed description below are incorporated in and form part of the specification, serve to further illustrate various embodiments and to explain various principles and advantages all in accordance with the present disclosure.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present disclosure.
Before describing in detail embodiments that are in accordance with the present disclosure, it should be observed that the embodiments reside primarily in combinations of method steps and apparatus components related to extracting a merged brightness adjustment model from a filtered merged brightness adjustment model dataset and controlling, using one or more processors of an electronic device, a display brightness using the merged brightness adjustment model. Any process descriptions or blocks in flow charts should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included, and it will be clear that functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved. Accordingly, the apparatus components and method steps have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
Embodiments of the disclosure do not recite the implementation of any commonplace business method aimed at processing business information, nor do they apply a known business process to the particular technological environment of the Internet. Moreover, embodiments of the disclosure do not create or alter contractual relations using generic computer functions and conventional network operations. Quite to the contrary, embodiments of the disclosure employ methods that, when applied to electronic device and/or user interface technology, improve the functioning of the electronic device itself and improve the overall user experience, thereby overcoming problems specifically arising in the realm of the technology associated with electronic device user interaction.
It will be appreciated that embodiments of the disclosure described herein may be comprised of one or more conventional processors and unique stored program instructions that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of combining some display brightness values corresponding to some ambient light values selected from a previously generated brightness adjustment model stored in memory with at least one user defined display brightness and at least one sensed ambient light value to obtain a merged brightness adjustment model that is a non-decreasing, monotonic function for a set of increasing ambient light values, and adjusting a display brightness level as a function of a sensed ambient light level measured by a light sensor and the merged brightness adjustment model as described herein. The non-processor circuits may include, but are not limited to, light sensors, a radio receiver, a radio transmitter, signal drivers, clock circuits, power source circuits, and user input devices. As such, these functions may be interpreted as steps of a method to perform combining a subset of display brightness and ambient light value pairs to obtain a combined brightness adjustment model dataset, filtering the combined brightness adjustment model dataset to obtain a filtered merged brightness adjustment model dataset, extracting a merged brightness adjustment model from the filtered merged brightness adjustment model dataset, and controlling an output brightness of an electronic device as a function of the merged brightness adjustment model. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic.
Of course, a combination of the two approaches could be used. Thus, methods and means for these functions have been described herein. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ASICs with minimal experimentation.
Embodiments of the disclosure are now described in detail. Referring to the drawings, like numbers indicate like parts throughout the views. As used in the description herein and throughout the claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise: the meaning of “a,” “an,” and “the” includes plural reference, the meaning of “in” includes “in” and “on.” Relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
As used herein, components may be “operatively coupled” when information can be sent between such components, even though there may be one or more intermediate or intervening components between, or along the connection path. The terms “substantially,” “essentially,” “approximately,” “about,” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within ten percent, in another embodiment within five percent, in another embodiment within one percent and in another embodiment within one-half percent. The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. Also, reference designators shown herein in parentheses indicate components shown in a figure other than the one in discussion. For example, talking about a device (10) while discussing figure A would refer to an element, 10, shown in a figure other than figure A.
As noted above, electronic devices having displays that function as their primary user interfaces have become ubiquitous. While electronic devices not too long ago had physical keypads and controls, today most all smartphones, tablet computers, and similar devices utilize a touch-sensitive display as their main user interface. The majority of these devices use displays that project light from the electronic device, through or from pixels or other image defining structures, out to the user's eyes. In contrast to reflective displays such as “e-ink” or other similar technologies, the overall brightness of these light-emitting displays can be adjusted. For instance, many users prefer a lesser amount of display light in dim environments and prefer greater amounts of display light in brighter environments.
Some electronic devices are able to “adaptively” adjust the brightness of their displays. Such devices use a light sensor to measure an amount of ambient light, and then automatically adjust the display brightness level in accordance with the ambient light value. Google™ even offers an open-source software solution called Turbo™ that provides one technique for implementing adaptive brightness features in Android™ devices.
While Turbo™ works adequately in practice, it and other similar adaptive brightness solutions are not without certain drawbacks. Illustrating by example, some countries have legal restrictions in place that prevent the use of such adaptive brightness systems altogether. Moreover, some of these algorithms can be slow to adjust to user preferences, can consume excessive amounts of battery capacity, and are unable to be implemented or respond “on the fly.” For instance, some adaptive brightness algorithms can take over a week to adjust to user-defined input changing a brightness preference. They can also become non-responsive at times to user requests.
Advantageously, embodiments of the present disclosure provide an improved adaptive brightness system that solves these problems. In one or more embodiments, a method in an electronic device comprises merging a subset of display brightness and corresponding ambient light value pairs selected from a brightness adjustment model defining a plurality of display brightness values corresponding to a plurality of ambient light values on a one-to-one basis that is stored in a memory of the electronic device with one or more user defined display brightness and corresponding ambient light value pairs received from user input occurring at a user interface of the electronic device to obtain a merged brightness adjustment model dataset. In one or more embodiments, the method then filters the merged brightness adjustment model dataset to obtain a filtered merged brightness adjustment model dataset. A merged brightness adjustment model is then extracted from the filtered merged brightness adjustment model dataset. In one or more embodiments, one or more processors of an electronic device then control a display brightness of a display of the electronic device using the merged brightness adjustment model.
Advantageously, this method, as well as others described below, provides a much faster convergence between the merged brightness adjustment model and user input adjusting preferred display brightness settings. Embodiments of the disclosure also remain fully responsive to any and all user requests for display brightness adjustments. In contrast to prior art display brightness adjustment systems, embodiments of the disclosure can be trained “on the fly” after a single user interaction requesting a display brightness adjustment.
One of the primary advantages offered by embodiments of the disclosure is that the methods, when implemented in an electronic device to control the display brightness of the display, require far less computational processing power than do prior art methods. Other advantages will be described below. Still others will be obvious to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, an electronic device comprises a light sensor measuring ambient light levels within an environment of the electronic device and a memory storing a previously generated brightness adjustment model. In one or more embodiments, the brightness adjustment model defines a plurality of display brightness and corresponding ambient light value pairs that correspond on a one-to-one basis. In one or more embodiments, the electronic device includes a user interface that receives user input defining at least one preferred user display brightness for at least one sensed ambient light value detected by a light sensor.
In one or more embodiments, one or more processors then combine, optionally using an isotonic regression model, some display brightness values corresponding to some ambient light levels selected from the brightness adjustment model with the at least one user defined display brightness and the at least one corresponding sensed ambient light value to obtain a merged brightness adjustment model. In one or more embodiments, the merged brightness adjustment model is a non-decreasing, monotonic function for a set of increasing ambient light values. In one or more embodiments, the one or more processors then adjust the display brightness level as a function of a sensed ambient light level measured by the light sensor and a corresponding brightness level selected from the merged brightness adjustment model.
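The isotonic combining step described above can be sketched in code. The pool-adjacent-violators routine below is a minimal, illustrative stand-in for a production isotonic regression implementation, and the lux and nit values shown are hypothetical examples rather than values taken from any embodiment:

```python
def pav_isotonic(values, weights=None):
    """Pool-adjacent-violators: return the non-decreasing sequence
    closest (in weighted least squares) to `values`."""
    if weights is None:
        weights = [1.0] * len(values)
    # Each block holds [weighted mean, total weight, count of points].
    blocks = []
    for v, w in zip(values, weights):
        blocks.append([v, w, 1])
        # Merge backwards while adjacent blocks violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            w = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / w, w, n1 + n2])
    result = []
    for mean, _, count in blocks:
        result.extend([mean] * count)
    return result

# Hypothetical (ambient lux, display nits) pairs from a stored brightness
# adjustment model, merged with one user-set pair that breaks monotonicity
# (the user dimmed the display at 200 lux).
lux = [1, 10, 50, 200, 500, 1000]
nits = [5.0, 20.0, 60.0, 40.0, 180.0, 400.0]  # 40.0 is the user override
merged = pav_isotonic(nits)
# `merged` is non-decreasing over the increasing lux values,
# as the combining step requires.
```

The violating pair (60.0, 40.0) is pooled to its mean, so the merged model honors the user's preference while remaining a non-decreasing, monotonic function of ambient light.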
This technique, implemented in an electronic device by the one or more processors to control display brightness, provides a novel system and signal flow that automatically predicts a user's preferred level of display brightness (typically measured in units called “nits” from the Latin “nitere,” which means “to shine”) for a measured ambient light level (typically measured in a unit of illuminance referred to as a “lux,” which is one lumen of light per square meter). In its simplest form, the technique includes obtaining several reference points from a previously generated brightness adjustment model, referred to as “display brightness and corresponding ambient light value pairs,” and merging those with user interactions adjusting a display brightness preference, referred to as “user defined display brightness and corresponding ambient light value pairs.”
This merging, which is performed using an isotonic regression to preserve non-decreasing monotonicity in one or more embodiments, can be filtered to “smooth” the otherwise generally piecewise linear output of the merging operation to obtain a filtered merged brightness adjustment model dataset. In one or more embodiments, a one-dimensional Gaussian convolution model is used to perform the filtering. As will be described below with reference to
Thereafter, one or more processors of the electronic device can extract a merged brightness adjustment model from the filtered merged brightness adjustment model dataset. In one or more embodiments, this comprises fitting the data from the filtered merged brightness adjustment model dataset using a monotonic cubic spline. Once the merged brightness adjustment model is obtained, the one or more processors can control the display brightness of its display using the merged brightness adjustment model. In one or more embodiments, the one or more processors do this by obtaining a sensed ambient light level from a light sensor, referencing the merged brightness adjustment model to determine a corresponding display brightness, and then causing the light of the display—or the display itself—to adjust its brightness to the referenced display brightness from the merged brightness adjustment model.
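The one-dimensional Gaussian convolution used for the filtering step can be sketched as follows; the kernel width and the sample nit values are illustrative assumptions only:

```python
import math

def gaussian_filter_1d(data, sigma=1.0):
    """Smooth `data` with a discrete 1-D Gaussian kernel.
    Edges are handled by renormalizing the truncated kernel."""
    radius = max(1, int(3 * sigma))  # truncate the kernel at 3 sigma
    kernel = [math.exp(-(i * i) / (2.0 * sigma * sigma))
              for i in range(-radius, radius + 1)]
    smoothed = []
    for i in range(len(data)):
        acc = 0.0
        norm = 0.0
        for k, w in enumerate(kernel):
            j = i + k - radius
            if 0 <= j < len(data):
                acc += w * data[j]
                norm += w
        smoothed.append(acc / norm)
    return smoothed

# Hypothetical merged dataset: nit values after the isotonic merge,
# smoothed to soften the sharp corners of the piecewise-linear output.
merged_nits = [5.0, 20.0, 50.0, 50.0, 180.0, 400.0]
filtered_nits = gaussian_filter_1d(merged_nits, sigma=1.0)
```

Because every output value is a positively weighted average of nearby inputs, the smoothed values stay within the range of the merged dataset while the abrupt transitions are softened ahead of the spline fit.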
Using this technique, the isotonic regression algorithm works in tandem with filtering, be it via a Gaussian filter, a weighted filter, or other type of filter, to create a new, merged dataset from a previous brightness adjustment model and user interaction data adjusting display brightness preferences. Thereafter, an interpolation model such as a monotonic cubic spline can be fitted to the generated dataset. This resulting “fitted” model can then be used to automatically predict a user-desired display brightness level for a sensed ambient light level. Accordingly, one or more processors of an electronic device can control the display brightness of its display using the merged brightness adjustment model for a given ambient light level.
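The extraction and control steps can likewise be sketched in code. For brevity, the sketch below substitutes monotone piecewise-linear interpolation for the monotonic cubic spline described above; the fitted lux-to-nit points and the sensed reading are hypothetical:

```python
import bisect

def make_brightness_model(lux_points, nit_points):
    """Return a lookup function mapping a sensed lux value to nits.
    Piecewise-linear interpolation is used here as a simple, monotone
    stand-in for a monotonic cubic spline fit."""
    def model(lux):
        # Clamp readings outside the modeled range.
        if lux <= lux_points[0]:
            return nit_points[0]
        if lux >= lux_points[-1]:
            return nit_points[-1]
        # Locate the bracketing segment and interpolate linearly.
        i = bisect.bisect_right(lux_points, lux)
        x0, x1 = lux_points[i - 1], lux_points[i]
        y0, y1 = nit_points[i - 1], nit_points[i]
        return y0 + (y1 - y0) * (lux - x0) / (x1 - x0)
    return model

# Hypothetical filtered merged brightness adjustment model (lux -> nits).
lux_points = [1, 10, 50, 200, 500, 1000]
nit_points = [5.0, 20.0, 50.0, 52.0, 180.0, 400.0]
model = make_brightness_model(lux_points, nit_points)

# Control step: read the light sensor, look up the display brightness.
sensed_lux = 125                 # hypothetical sensor reading
target_nits = model(sensed_lux)  # brightness level to apply to the display
```

In operation, the one or more processors would repeat the final two lines each time the light sensor reports a new ambient light level, applying the returned nit value to the display's light sources.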
Turning now to
This illustrative electronic device 100 is shown in
Starting from the top, a fascia 104 is provided. In this illustrative embodiment, the fascia 104 defines a major face of the housing 101 disposed above the display. The fascia 104 may be manufactured from glass or a thin film sheet. The fascia 104 is a covering or housing, which may or may not be detachable. Suitable materials for manufacturing the cover layer include clear or translucent plastic film, glass, plastic, or reinforced glass. Reinforced glass can comprise glass strengthened by a process such as a chemical or heat treatment. The fascia 104 may also include an ultra-violet barrier. Such a barrier is useful both in improving the visibility of display 102 and in protecting internal components of the electronic device 100.
Printing may be desired on the front face of the fascia 104 for various reasons. For example, a subtle textural printing or overlay printing may be desirable to provide a translucent matte finish atop the fascia 104. Such a finish is useful to prevent cosmetic blemishing from sharp objects or fingerprints. The fascia 104 can include a plurality of indium tin oxide or other electrodes, which function as a capacitive sensor, to convert the display 102 to a touch-sensitive display. Where configured to be touch sensitive, users can deliver user input to the display 102 by delivering touch input from a finger, stylus, or other objects disposed proximately with the display.
Beneath the fascia 104 is disposed the display 102. The display 102 is supported by the housing 101 of the electronic device 100. In one embodiment, the display 102 comprises an organic light emitting diode (OLED) display. One or more active elements 105 can be operable to project light outwardly from the housing 101 of the electronic device 100 and through the fascia 104 to a user. Illustrating by example, if the display 102 is an OLED display, each active element 105 can be configured as a single OLED. When a voltage is applied to the OLED, the resulting current moves electrons and holes to cause light emission. By contrast, where the display 102 is a traditional light emitting display, the one or more active elements 105 may be pixels of a backlight that project light through liquid crystal elements to cause light to be emitted through the fascia 104 to the eyes of a user. In one or more embodiments, the one or more active elements 105 are controllable such that the overall display brightness can be adjusted to a desired level by one or more processors of the electronic device 100.
In one embodiment, the imager 103 comprises a digital camera. The imager 103 could alternatively comprise multiple cameras that are proximately disposed with the display 102. Where multiple cameras are used as the imager 103, these cameras can be oriented along the electronic device 100 spatially in various ways. Illustrating by example, in one embodiment the cameras can be clustered near one another. In another embodiment, the cameras can be oriented spatially across the surface area defined by the display 102, e.g., with one camera in the center and four other cameras, one disposed in each of the four corners of the housing 101.
Where multiple cameras are used, the one or more processors can capture and record the ambient light level of the environment 106 around the electronic device 100. In other embodiments, the imager 103 can be replaced by a simple light sensor. In still other embodiments, a light sensor can be used in addition to the imager 103 to determine ambient light levels.
One or more processors of the electronic device 100 can then use this information to adjust the display brightness of the display 102 by changing the amount of light the one or more active elements 105 (be they OLEDs, a backlight, or other type of element) emit through the fascia 104 to the eyes of a user. In some embodiments, the one or more processors can use the ambient light level to adjust other display parameters, such as by modifying the levels of the display output, e.g., color intensity and color balance, as a function of pixel locations on the display 102 to brighten dark corners (relative to the center), align consistent color balance, and so forth, thereby improving image quality in a real time, closed-loop feedback system.
In one embodiment, the imager 103 is capable of metering scenes to adjust its settings, capturing images, and previewing images. When images are captured, the captured image is recorded to memory. When images are previewed, the images are delivered to the one or more processors of the electronic device for presentation on the display 102. When previewing images, the images can either be temporarily written to memory or delivered directly to the display 102 as electronic signals, with only temporary buffering occurring in the one or more processors.
This explanatory electronic device 100 also includes a housing 101. Features can be incorporated into the housing 101. Examples of such features include a microphone or speaker port. A user interface component, which may be a button or touch sensitive surface, can also be disposed along the housing 101.
Turning now to
In one or more embodiments, the display 202 may optionally be touch-sensitive. In one embodiment where the display 202 is touch-sensitive, the display 202 can serve as a primary user interface for an electronic device. Users can deliver user input to the display 202 of such an embodiment by delivering touch input from a finger, stylus, or other objects disposed proximately with the display 202. In one embodiment, the display 202 is configured as an active-matrix organic light emitting diode (AMOLED) display. However, it should be noted that other types of displays, including liquid crystal displays, OLED displays, twisted nematic displays, light emitting diode displays, and so forth could be used and would be obvious to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, one or more processors 201 are operable with the display 202 and other components of the electronic devices configured in accordance with embodiments of the disclosure. The one or more processors 201 can include a microprocessor, a group of processing components, one or more ASICs, programmable logic, or other type of processing device. The one or more processors 201 can be operable with the various components of the electronic devices configured in accordance with embodiments of the disclosure. The one or more processors 201 can be configured to process and execute executable software code to perform the various functions of the electronic devices configured in accordance with embodiments of the disclosure.
A storage device, such as memory 207, can optionally store the executable software code used by the one or more processors 201 during operation. The memory 207 may include either or both static and dynamic memory components, and may be used for storing both embedded code and user data. The software code can embody program instructions and methods to operate the various functions of the electronic devices configured in accordance with embodiments of the disclosure, and also to execute software or firmware applications and modules. The one or more processors 201 can execute this software or firmware, and/or interact with modules, to provide device functionality.
In this illustrative embodiment, the schematic block diagram 200 also includes an optional communication circuit 204 that can be configured for wired or wireless communication with one or more other devices or networks. The networks can include a wide area network, a local area network, and/or personal area network. The communication circuit 204 may also utilize wireless technology for communication, such as, but not limited to, peer-to-peer or ad hoc communications such as HomeRF, Bluetooth and IEEE 802.11, and other forms of wireless communication such as infrared technology. The communication circuit 204 can include wireless communication circuitry, one of a receiver, a transmitter, or a transceiver, and one or more antennas.
The one or more processors 201 can also be operable with other components 205. The other components 205 can include an acoustic detector, such as a microphone. The other components 205 can also include one or more proximity sensors to detect the presence of nearby objects. The other components 205 may include video input components such as optical sensors, mechanical input components such as buttons, touch pad sensors, touch screen sensors, capacitive sensors, motion sensors, and switches. Similarly, the other components 205 can include output components such as video, audio, and/or mechanical outputs. Other examples of output components include audio output components such as speaker ports or other alarms and/or buzzers and/or a mechanical output component such as vibrating or motion-based mechanisms. The other components 205 may further include an accelerometer to show vertical orientation, constant tilt and/or whether the device is stationary.
The display 202 can be operable with one or more light sources 203 that are operable to project light to the eyes of a user. As noted above, where the display 202 comprises an OLED or AMOLED display, the light sources 203 can comprise OLEDs or AMOLEDs that are active to project light. In other display technologies, such as light emitting diode or twisted nematic displays, the one or more light sources 203 may comprise a backlight, a pixelated backlight, or other lighting apparatus operable to project light. In one or more embodiments, the one or more light sources 203 are adjustable so that the display brightness of the display 202 can be controlled by the one or more processors 201.
The imager 206 can be configured as an “intelligent” imager that captures one or more images from an environment of an electronic device into which the schematic block diagram 200 is situated. The intelligent imager can then determine whether objects within the images match predetermined criteria using object recognition or other techniques. For example, an intelligent imager can operate as an identification module configured with optical recognition such as image recognition, character recognition, visual recognition, facial recognition, color recognition, shape recognition and the like. Advantageously, the intelligent imager can be used as a facial recognition device to detect the presence of a face of a subject, as well as whether that face is clearly depicted in the images captured by the intelligent imager or whether the face is at least partially obscured.
Illustrating by example, in one embodiment the intelligent imager can capture one or more photographs of a person. The intelligent imager can then compare the images to a reference file stored in memory to confirm beyond a threshold probability that the person's face sufficiently matches the reference file.
One or more sensors 208 can be operable with the one or more processors 201. The one or more sensors 208 may include a microphone, an earpiece speaker, and/or a second loudspeaker. The one or more sensors 208 may also include touch actuator selection sensors, proximity sensors, a touch pad sensor, a touch screen sensor, a capacitive touch sensor, and one or more switches. The one or more sensors 208 can also include audio sensors and video sensors (such as a camera).
Illustrating by example, in one or more embodiments the one or more sensors 208 comprise a gaze detector. The gaze detector can comprise sensors for detecting the user's gaze point. Electronic signals can then be delivered from the sensors to a gaze detection processing engine for computing the direction of the user's gaze in three-dimensional space. The gaze detector can further be configured to detect a gaze cone corresponding to the detected gaze direction, which is a field of view within which the user may easily see without diverting their eyes or head from the detected gaze direction. The gaze detector can be configured to alternately estimate gaze direction by inputting to the gaze detection processing engine images representing one or more photographs of a selected area near or around the eyes. Other techniques for detecting gaze will be obvious to those of ordinary skill in the art having the benefit of this disclosure.
The one or more sensors 208 can also include a light sensor 209. In one or more embodiments, the light sensor 209 can detect changes in optical intensity, color, light, or shadows from the environment of the electronic device in which the schematic block diagram 200 is operational. In one or more embodiments, the light sensor 209 can measure an ambient light level in accordance with a predefined unit, one example of which is a lux. An infrared sensor can be used in conjunction with, or in place of, the light sensor 209 in one or more embodiments. Similarly, a temperature sensor can be included with the one or more sensors 208 to monitor the temperature about an electronic device.
The one or more processors 201 can be responsible for performing the primary functions of the electronic devices configured in accordance with one or more embodiments of the disclosure. For example, in one embodiment the one or more processors 201 comprise one or more circuits operable with one or more user interface controls 210, which can include the display 202, to present presentation information to a user. The executable software code used by the one or more processors 201, optionally stored in the memory 207, can be configured as one or more modules that are operable with the one or more processors 201. Such modules can store instructions, control algorithms, and so forth.
In one embodiment, these modules include an adaptive brightness modeling component 211. In one embodiment, the adaptive brightness modeling component 211 comprises software stored in the memory 207. However, in another embodiment the adaptive brightness modeling component 211 can comprise hardware components or firmware components integrated into the one or more processors 201 as well.
In one or more embodiments, the adaptive brightness modeling component 211 is operable with the user interface controls 210, the imager 206, and/or the light sensor 209. The adaptive brightness modeling component 211 is also operable with the one or more processors 201. In some embodiments, the one or more processors 201 can control the adaptive brightness modeling component 211. In other embodiments, the adaptive brightness modeling component 211 can operate independently, merging a subset of display brightness and corresponding ambient light value pairs 212 selected from a brightness adjustment model 213 stored in the memory 207 with one or more user defined display brightness and corresponding ambient light value pairs 214 to obtain a merged brightness adjustment model dataset 215, filtering the merged brightness adjustment model dataset 215 to obtain a filtered merged brightness adjustment model dataset 216, and extracting a merged brightness adjustment model 217 from the filtered merged brightness adjustment model dataset 216 so that the one or more processors 201 can control the display brightness of the display 202 using the merged brightness adjustment model 217. The adaptive brightness modeling component 211 can receive data from the various sensors 208, including the light sensor 209, or the other components 205. In one or more embodiments, the one or more processors 201 are configured to perform the operations of the adaptive brightness modeling component 211.
In one or more embodiments, the adaptive brightness modeling component 211 is operable to combine, using an isotonic regression model, some display brightness values corresponding to some ambient light values selected from the previously generated brightness adjustment model 213 stored in the memory 207 with at least one user defined display brightness and at least one corresponding ambient light value sensed by the light sensor 209 to obtain a merged brightness adjustment model 217. In one or more embodiments, the merged brightness adjustment model 217 is a non-decreasing, monotonic function for a set of increasing ambient light values. From this merged brightness adjustment model 217, the one or more processors 201 can adjust a brightness level of the display 202 as a function of the sensed ambient light level measured by the light sensor 209 and the merged brightness adjustment model 217.
In one or more embodiments, the adaptive brightness modeling component 211, prior to the one or more processors 201 adjusting the brightness level of the display 202, can filter a merged brightness adjustment model dataset 215 obtained from the display brightness values corresponding to the ambient light values selected from the brightness adjustment model 213 stored in the memory 207 that are combined with the at least one user defined display brightness and the at least one corresponding sensed ambient light value to obtain a filtered merged brightness adjustment model dataset 216. Additionally, the adaptive brightness modeling component 211 can extract the merged brightness adjustment model 217 from the filtered merged brightness adjustment model dataset 216 as well. Illustrating by example, the adaptive brightness modeling component 211 may apply a monotonic cubic spline to the filtered merged brightness adjustment model dataset 216 to extract the merged brightness adjustment model 217 in one or more embodiments. Examples of how this can occur are described below with reference to
In one or more embodiments, the merged brightness adjustment model 217 extracted by the adaptive brightness modeling component 211 defines a number of nits per pixel for each ambient light value of the set of increasing ambient light values of the merged brightness adjustment model 217. In one or more embodiments, the adaptive brightness modeling component 211 repeats this process, thereby continuing to generate merged brightness adjustment models for use by the one or more processors 201 to adjust the display brightness of the display 202 continually and “on the fly.” Said differently, in one or more embodiments the adaptive brightness modeling component 211 repeats using the previously generated merged brightness adjustment model 217 as the brightness adjustment model 213 from which some display brightness and corresponding ambient light value pairs 212 are selected to be combined with at least one user defined display brightness and a corresponding ambient light value 214 sensed by the light sensor 209 to obtain a new merged brightness adjustment model. The one or more processors 201 can then adjust the display brightness of the display 202 as a function of a present ambient light value sensed by the light sensor 209 and the new merged brightness adjustment model 217. In one or more embodiments, this recurrence occurs multiple times within a twenty-four-hour period.
In one or more embodiments, the one or more processors 201 may generate commands based upon the output from the adaptive brightness modeling component 211. Illustrating by example, the one or more processors 201 may obtain an ambient light value measured by the light sensor 209 and then may reference the merged brightness adjustment model 217 to control the display brightness of the display 202 by selecting a corresponding display brightness pair for the obtained ambient light value.
It is to be understood that
Turning now to
In one or more embodiments, the brightness adjustment model constitutes a previously generated merged brightness adjustment model created by the method 300 shown in
At step 302, the method 300 receives user input defining at least one user defined display brightness and corresponding ambient light value pair preferred by a user. In one or more embodiments, the at least one user defined display brightness and corresponding ambient light value pair defines a preferred display brightness setting identified by the user for a given ambient light level. If, for example, the brightness adjustment model stored in memory from which the display brightness and corresponding ambient light value pairs were selected at step 301 had the display too bright at a bright ambient light level, the user may enter at least one user defined display brightness and corresponding ambient light value pair to reduce the display brightness. By contrast, if the brightness adjustment model from which the display brightness and corresponding ambient light value pairs were selected at step 301 had the display brightness too dim at a low ambient light level, the at least one user defined display brightness and corresponding ambient light value pair may cause the display brightness to increase, and so forth.
In one or more embodiments, the user merely enters the at least one user defined display brightness, while a light sensor of the electronic device measures the corresponding ambient light value. Accordingly, in one or more embodiments step 302 comprises a user interface of an electronic device receiving one or more user defined display brightness and ambient light value pairs. These pairs are received from user input occurring at the user interface of the electronic device, as the user adjusts the display brightness to a desired value.
At step 303, the display brightness and corresponding ambient light value pairs selected from the brightness adjustment model at step 301 and the user defined display brightness and corresponding ambient light value pairs received at step 302 are merged to obtain a merged brightness adjustment model dataset. In one or more embodiments, the merged brightness adjustment model dataset defines a non-decreasing, monotonic function of display brightness levels for a set of increasing ambient light values. In one or more embodiments, the merged brightness adjustment model dataset is piecewise linear after the display brightness and corresponding ambient light value pairs selected from the brightness adjustment model at step 301 and the user defined display brightness and corresponding ambient light value pairs received at step 302 are merged.
In one or more embodiments, the merging occurring at step 303 comprises applying an isotonic regression to a combination of the display brightness and corresponding ambient light value pairs selected from the brightness adjustment model at step 301 and the user defined display brightness and corresponding ambient light value pairs received at step 302. Said differently, in one or more embodiments step 303 comprises combining a subset of display brightness and ambient light value pairs and the one or more user defined display brightness and ambient light value pairs using an isotonic regression model to obtain a combined brightness adjustment model dataset defining a non-decreasing, monotonic function. Accordingly, in one or more embodiments the merging occurring at step 303 preserves a non-decreasing monotonicity for the combined brightness adjustment model dataset.
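The merging at step 303 can be sketched with the pool-adjacent-violators algorithm, the standard procedure for fitting an isotonic regression. The sketch below is illustrative only: the function name and the list-of-(ambient lux, brightness)-tuples representation are assumptions, not part of the disclosure.

```python
def isotonic_merge(pairs):
    """Merge (ambient_lux, brightness) pairs into a non-decreasing dataset
    using the pool-adjacent-violators algorithm (PAVA)."""
    pts = sorted(pairs)            # order by increasing ambient light value
    blocks = []                    # each block: [brightness_sum, count, lux_values]
    for lux, level in pts:
        blocks.append([float(level), 1, [lux]])
        # Pool while the newest block's mean falls below its predecessor's,
        # which would violate non-decreasing monotonicity.
        while len(blocks) > 1 and blocks[-1][0] / blocks[-1][1] < blocks[-2][0] / blocks[-2][1]:
            total, count, xs = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
            blocks[-1][2] += xs
    merged = []
    for total, count, xs in blocks:
        mean = total / count       # pooled blocks share their mean brightness
        for lux in xs:
            merged.append((lux, mean))
    return merged
```

Because pooled samples are replaced by their block mean, a user defined pair that would make brightness decrease with increasing ambient light is averaged into its neighbors rather than discarded, which is how monotonicity is preserved.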
In addition to merging, in one or more embodiments step 303 comprises the method 300 filtering the merged brightness adjustment model dataset to obtain a filtered merged brightness adjustment model dataset. In one or more embodiments, the filtered merged brightness adjustment model dataset defines a continuous function. Illustrating by example, in one or more embodiments the merged brightness adjustment model dataset is piecewise linear since it is created by applying an isotonic regression model to the display brightness and corresponding ambient light value pairs selected from the brightness adjustment model at step 301 and the user defined display brightness and corresponding ambient light value pairs received at step 302. To remove these “steps” from the merged brightness adjustment model dataset, in one or more embodiments step 303 comprises filtering the merged brightness adjustment model dataset to obtain a filtered merged brightness adjustment model dataset that is a continuous function devoid of any steps that may be artifacts from the isotonic regression model. This filtering performed at step 303 can occur in a variety of ways.
In one or more embodiments, the filtering comprises applying a Gaussian filter to the merged brightness adjustment model dataset to obtain the filtered merged brightness adjustment model dataset. Illustrating by example, in one or more embodiments the filtering comprises applying a one-dimensional Gaussian convolution model to the merged brightness adjustment model dataset to obtain the filtered merged brightness adjustment model dataset.
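A minimal sketch of such a one-dimensional Gaussian convolution follows, assuming the merged dataset's brightness levels arrive as a plain list; the index clamping at the edges, and the `sigma` and `radius` defaults, are illustrative choices rather than values taken from the disclosure.

```python
import math

def gaussian_smooth(values, sigma=2.0, radius=4):
    # Discrete 1-D Gaussian convolution with index clamping at the edges,
    # so a constant input comes out unchanged.
    kernel = [math.exp(-(j * j) / (2.0 * sigma * sigma))
              for j in range(-radius, radius + 1)]
    total = sum(kernel)
    kernel = [k / total for k in kernel]       # normalise so weights sum to 1
    n = len(values)
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - radius, 0), n - 1)   # clamp at boundaries
            acc += w * values[idx]
        out.append(acc)
    return out
```

With clamped (replicated) edges the convolution preserves a non-decreasing input, so the smoothed dataset remains a valid monotone brightness curve.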
While the application of a one-dimensional Gaussian convolution model works well in practice, experimental testing has demonstrated that in low light the one-dimensional Gaussian convolution model can result in display brightness levels being too high for very low ambient light levels. An example of this will be shown and described below with reference to
To correct for this, in another embodiment the filtering occurring at step 303 comprises applying an average of even instances of the merged brightness adjustment model dataset and odd instances of the merged brightness adjustment model dataset to obtain the filtered merged brightness adjustment model dataset. This method of filtering provides markedly improved performance for low display brightness and corresponding ambient light value pairs. Thus, in one or more embodiments step 303 comprises applying an average of even instances of the subset of display brightness and ambient light value pairs and odd instances of the subset of display brightness and ambient light value pairs to the combined brightness adjustment model dataset to obtain a continuous function that is non-decreasing for an increasing set of ambient light value pairs.
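The disclosure does not spell out the exact even/odd pairing, but one plausible reading is that each sample is averaged with its nearest neighbor of opposite parity, which flattens the “steps” left by the isotonic regression without the low-light inflation of the Gaussian filter. The sketch below follows that reading, with hypothetical names:

```python
def even_odd_average(values):
    # One plausible reading of the even/odd averaging filter: each output
    # sample averages a value with its nearest opposite-parity neighbour.
    out = []
    n = len(values)
    for i in range(n):
        partner = i + 1 if i + 1 < n else i - 1   # opposite-parity neighbour
        out.append((values[i] + values[partner]) / 2.0)
    return out
```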
At optional step 304, weighting can be applied to the filtered merged brightness adjustment model dataset. Embodiments of the disclosure contemplate that the “new” merged brightness adjustment model extracted at step 305 should not be strikingly different from the brightness adjustment model used at step 301 in response to a user defined display brightness and corresponding ambient light value pair. This is true because a user is likely to prefer subtle changes in display brightness over an abrupt transition that takes a very bright display to a very dark one instantly. Accordingly, when optional step 304 is included, the weighting applied ensures that the merged brightness adjustment model extracted at step 305 is not far from the brightness adjustment model used at step 301.
In one or more embodiments, optional step 304 comprises weighting instances of the filtered merged brightness adjustment model dataset as a function of a difference between at least one display brightness and corresponding ambient light value pair and at least one corresponding user defined display brightness and corresponding ambient light value pair. In one or more embodiments, optional step 304, which occurs prior to the extracting occurring at step 305, comprises weighting instances of the filtered merged brightness adjustment model dataset as a function of an inverse of the difference between the at least one display brightness and corresponding ambient light value pair and the corresponding user defined display brightness and its corresponding ambient light value pair. Thus, if the difference between the display brightness values of the brightness adjustment model used at step 301 and the user defined brightness level received at step 302 for a given ambient light level is large, the weighting factors applied at optional step 304 will be reduced. By contrast, if the difference between the display brightness values of the brightness adjustment model used at step 301 and the user defined brightness level received at step 302 for a given ambient light level is small, the weighting factors applied at optional step 304 will be increased, and so forth.
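A hedged sketch of such inverse-difference weighting follows, assuming the stored model's brightness levels and the user's preferred levels are aligned by ambient light value; the `eps` term that avoids division by zero is an illustrative addition, not something the disclosure specifies.

```python
def inverse_difference_weights(model_levels, user_levels, eps=1.0):
    # Weight each sample inversely to the gap between the stored model's
    # brightness and the user's chosen brightness at the same ambient light
    # value: large user corrections get small weights, keeping the new
    # model close to the old one; small corrections get large weights.
    return [1.0 / (abs(m - u) + eps) for m, u in zip(model_levels, user_levels)]
```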
At step 305, the method 300 extracts a merged brightness adjustment model from the filtered merged brightness adjustment model dataset. In one or more embodiments, this step 305 comprises applying a monotonic cubic spline to the filtered merged brightness adjustment model dataset to obtain the merged brightness adjustment model.
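One way to realise a monotonic cubic spline is the Fritsch–Carlson monotone cubic Hermite interpolant, the same scheme behind SciPy's `PchipInterpolator`. The pure-Python sketch below evaluates it at a single query point; it is an illustration of the technique, not the disclosed implementation.

```python
import math

def pchip_eval(xs, ys, xq):
    # Fritsch–Carlson monotone cubic Hermite interpolation evaluated at xq.
    # xs must be strictly increasing; if ys is non-decreasing, the
    # interpolant is too (no overshoot between knots).
    n = len(xs)
    h = [xs[i + 1] - xs[i] for i in range(n - 1)]
    delta = [(ys[i + 1] - ys[i]) / h[i] for i in range(n - 1)]
    m = [0.0] * n                          # tangents at the knots
    m[0], m[-1] = delta[0], delta[-1]
    for i in range(1, n - 1):
        m[i] = 0.0 if delta[i - 1] * delta[i] <= 0 else (delta[i - 1] + delta[i]) / 2.0
    for i in range(n - 1):                 # limit tangents to keep monotonicity
        if delta[i] == 0.0:
            m[i] = m[i + 1] = 0.0
        else:
            a, b = m[i] / delta[i], m[i + 1] / delta[i]
            s = a * a + b * b
            if s > 9.0:
                t = 3.0 / math.sqrt(s)
                m[i], m[i + 1] = t * a * delta[i], t * b * delta[i]
    # Locate the segment containing xq and evaluate the cubic Hermite basis.
    i = max(0, min(n - 2, next((k for k in range(n - 1) if xq <= xs[k + 1]), n - 2)))
    t = (xq - xs[i]) / h[i]
    h00 = (1 + 2 * t) * (1 - t) ** 2
    h10 = t * (1 - t) ** 2
    h01 = t * t * (3 - 2 * t)
    h11 = t * t * (t - 1)
    return h00 * ys[i] + h10 * h[i] * m[i] + h01 * ys[i + 1] + h11 * h[i] * m[i + 1]
```

The tangent-limiting step is what distinguishes this from an ordinary cubic spline: it clamps the knot slopes so the fitted curve never dips between two non-decreasing data points.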
At step 306, the method 300 controls an output brightness of a display of an electronic device as a function of the merged brightness adjustment model. In one or more embodiments, this comprises detecting, using a light sensor or other sensor of an electronic device, an ambient light level of an environment of the electronic device. Thereafter, the display brightness of the electronic device is controlled using the merged brightness adjustment model by adjusting the display brightness to a level defined by the merged brightness adjustment model and the ambient light level of the environment of the electronic device.
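Once the merged brightness adjustment model is available, the control at step 306 reduces to a lookup: given a sensed ambient light value, find the bracketing pairs and read off (or interpolate) the display brightness. The representation below, a sorted list of (lux, brightness) pairs with linear interpolation and clamping outside the modelled range, is an illustrative assumption.

```python
import bisect

def lookup_brightness(model, lux):
    # model: sorted list of (ambient_lux, display_brightness) pairs, an
    # illustrative representation of the merged brightness adjustment model.
    xs = [p[0] for p in model]
    ys = [p[1] for p in model]
    if lux <= xs[0]:
        return ys[0]                       # clamp below the modelled range
    if lux >= xs[-1]:
        return ys[-1]                      # clamp above the modelled range
    i = bisect.bisect_right(xs, lux) - 1
    t = (lux - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])
```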
In one or more embodiments, the method 300 can then repeat. The merged brightness adjustment model extracted at step 305 becomes the brightness adjustment model from which the display brightness and corresponding ambient light value pairs are selected at step 301. One example of how this can occur will be illustrated and described below with reference to
Turning now to
One or more processors of the electronic device then merge, or combine, the display brightness and corresponding ambient light value pairs 212 with the user defined display brightness and corresponding ambient light value pairs 214 to obtain one or both of a merged brightness adjustment model dataset 215 and/or a filtered merged brightness adjustment model dataset 216. When the signal flow diagram is running “on the fly,” the filtering is omitted and only the merging occurs. However, when the signal flow diagram is in a learning mode, the merging and filtering both occur. Instances of each will be illustrated and described below with reference to
In one or more embodiments, the merging comprises applying an isotonic regression model 401 to a combination of the display brightness and corresponding ambient light value pairs 212 and the user defined display brightness and corresponding ambient light value pairs 214 to preserve a non-decreasing, monotonic function that is the merged brightness adjustment model dataset 215.
An example of the merged brightness adjustment model dataset 215 is shown in
Turning now back to
Turning now back to
Turning now back to
As shown in
As shown, the isotonic regression model (401), working in tandem with a filter, one example of which is the one-dimensional Gaussian convolution model (402), generates a new dataset, the filtered merged brightness adjustment model dataset (216), from a previous brightness adjustment model (213) and user interaction data represented by the user defined display brightness and corresponding ambient light value pairs (214). Then, an interpolation, one example of which is the application of a monotonic cubic spline (403), can be fitted to the filtered merged brightness adjustment model dataset (216). Finally, the resulting merged brightness adjustment model (217) can be used to automatically predict the proper display brightness for a given ambient light level.
Said differently, for a given baseline set of reference points set forth in a brightness adjustment model, the method (300) of
To illustrate this, turn now to
To initially show how user input is used to adjust the brightness adjustment model 602, turn now to
Turning now to
That this “almost instant” response is faster than the prior art display brightness adjustment curve (601) of
As shown at step 902, when additional user defined display brightness and corresponding ambient light value pairs 214 are received, the merged brightness adjustment model 217 of embodiments of the disclosure much more accurately tracks the user defined display brightness and corresponding ambient light value pairs 214 than does the prior art display brightness adjustment curve 601. As shown in
Turning now to
The operational diagram 1100 shows a typical day where the method (300) of
By contrast, at stages 1102, 1103, and 1104, the method (300) of
Another interesting feature shown in
By contrast, at stage 1105, which is a training mode, all user defined display brightness and corresponding ambient light value pairs received during the twenty-four-hour period are considered when generating the new merged brightness adjustment model. Thus, as shown in
As noted above with reference to
To correct for this, in another embodiment an alternate filtering occurs. Additionally, weighting can be used to prevent large, dramatic changes occurring in the merged brightness adjustment model in response to user input. Turning now to
As with the signal flow diagram of
One or more processors of the electronic device then merge, or combine, the display brightness and corresponding ambient light value pairs 212 with the user defined display brightness and corresponding ambient light value pairs 214 to obtain one or both of a merged brightness adjustment model dataset 215 and/or a filtered merged brightness adjustment model dataset 216. When the signal flow diagram is running “on the fly,” the filtering is omitted and only the merging occurs. However, when the signal flow diagram is in a learning mode, the merging and filtering both occur.
As before, in one or more embodiments the merging comprises applying an isotonic regression model 401 to a combination of the display brightness and corresponding ambient light value pairs 212 and the user defined display brightness and corresponding ambient light value pairs 214 to preserve a non-decreasing, monotonic function that is the merged brightness adjustment model dataset 215. Since this merged brightness adjustment model dataset 215 can be piecewise linear, a filtering step can be applied. However, in contrast to the signal flow diagram of
Instead, the filtering 1502 uses an average of the isotonic regression data. Illustrating by example, in one or more embodiments the filtering comprises applying an average of even instances of the merged brightness adjustment model dataset 215 and odd instances of the merged brightness adjustment model dataset 215 to obtain the filtered merged brightness adjustment model dataset 216. This method of filtering provides markedly improved performance for low display brightness and corresponding ambient light value pairs. This is shown in
Beginning with
However, turning now to
Turning now back to
In one or more embodiments, the weighting 1505 instances of the filtered merged brightness adjustment model dataset 216 occurs as a function of a difference between at least one display brightness and corresponding ambient light value pair 212 and at least one corresponding user defined display brightness and corresponding ambient light value pair 214. In one or more embodiments, the weighting 1505 instances of the filtered merged brightness adjustment model dataset 216 occurs as a function of an inverse of the difference between the at least one display brightness and corresponding ambient light value pair 212 and the corresponding user defined display brightness and its corresponding ambient light value pair 214.
Thus, if the difference between the display brightness values of the brightness adjustment model 213 and the user defined brightness level for a given ambient light level is large, the weighting factors will be reduced. By contrast, if the difference between the display brightness values of the brightness adjustment model 213 and the user defined brightness level for a given ambient light level is small, the weighting factors will be increased, and so forth. Equations (1600, 1700) for weighting 1505 in this manner are shown in
Regardless of whether weighting is employed, the merged brightness adjustment model 217 is then extracted from the filtered merged brightness adjustment model dataset 216. In one or more embodiments, this comprises applying a monotonic cubic spline 403 to the filtered merged brightness adjustment model dataset to obtain the merged brightness adjustment model 217. It should be noted that other splines, e.g., cubic splines, can be used in place of the monotonic cubic spline 403 in other embodiments. This is true with the signal flow diagram of
One or more processors of the electronic device can then control the display brightness 404 as a function of the ambient light level detected by a light sensor and the merged brightness adjustment model 217 by referencing a particular display brightness 404 for the sensed ambient light level and causing the display to output a luminous flux for that display brightness 404.
Turning now to
At 2001, a method in an electronic device comprises merging, by one or more processors of the electronic device:
a subset of display brightness and corresponding ambient light value pairs selected from a brightness adjustment model defining a plurality of display brightness values corresponding to a plurality of ambient light values on a one-to-one basis stored in a memory of the electronic device; and
one or more user defined display brightness and corresponding ambient light value pairs received from user input occurring at a user interface of the electronic device to obtain a merged brightness adjustment model dataset.
At 2001, the method comprises filtering, by the one or more processors, the merged brightness adjustment model dataset to obtain a filtered brightness adjustment model dataset. At 2001, the method comprises extracting, by the one or more processors, a merged brightness adjustment model from the filtered brightness adjustment model dataset. Finally, at 2001, the method comprises controlling, by the one or more processors, a display brightness of a display of the electronic device using the merged brightness adjustment model.
At 2002, the method of 2001 further comprises detecting, by one or more sensors operable with the one or more processors, an ambient light level of an environment of the electronic device. At 2002, the controlling the display brightness of the electronic device using the merged brightness adjustment model adjusts the display brightness to a level defined by the merged brightness adjustment model and the ambient light level of the environment of the electronic device.
At 2003, the merged brightness adjustment model dataset of 2002 defines a non-decreasing, monotonic function for a set of increasing ambient light values. At 2004, the merged brightness adjustment model dataset of 2003 is piecewise linear, and the filtered brightness adjustment model dataset defines a continuous function.
At 2005, the merging of 2004 comprises applying an isotonic regression to a combination of the subset of display brightness and corresponding ambient light value pairs and the one or more user defined display brightness and corresponding ambient light value pairs. At 2006, the filtering of 2005 comprises applying a Gaussian filter to the merged brightness adjustment model dataset to obtain the filtered brightness adjustment model dataset. At 2007, the Gaussian filter comprises a one-dimensional Gaussian convolution model.
At 2008, the filtering of 2005 comprises applying an average of even instances of the merged brightness adjustment model dataset and odd instances of the merged brightness adjustment model dataset to obtain the filtered brightness adjustment model dataset. At 2009, the extracting of 2005 comprises applying a monotonic cubic spline to the filtered brightness adjustment model dataset to obtain the merged brightness adjustment model.
At 2010, the method of 2009 further comprises, prior to the extracting, weighting instances of the filtered brightness adjustment model dataset as a function of a difference between at least one display brightness and corresponding ambient light value pair and at least one corresponding user defined display brightness and corresponding ambient light value pair. At 2011, the weighting of 2010 occurs as an inverse of the difference between the at least one display brightness and corresponding ambient light value pair and the at least one corresponding user defined display brightness and corresponding ambient light value pair.
At 2012, an electronic device comprises a light sensor measuring ambient light levels within an environment of the electronic device. At 2012, the electronic device comprises a memory storing a brightness adjustment model defining a plurality of display brightness values corresponding to a plurality of ambient light values on a one-to-one basis.
At 2012, the electronic device comprises a user interface receiving user input defining at least one user defined display brightness for at least one sensed ambient light value and a display. At 2012, the electronic device comprises one or more processors operable with the display and controlling a display brightness level.
At 2012, the one or more processors combine, using an isotonic regression model, some display brightness values corresponding to some ambient light values selected from the brightness adjustment model with the at least one user defined display brightness and at least one sensed ambient light value to obtain a merged brightness adjustment model. At 2012, the merged brightness adjustment model is a non-decreasing, monotonic function for a set of increasing ambient light values. At 2012, the one or more processors adjust the display brightness level as a function of a sensed ambient light level measured by the light sensor and the merged brightness adjustment model.
At 2013, the one or more processors of 2012, prior to adjusting the display brightness level as the function of the sensed ambient light level measured by the light sensor and the merged brightness adjustment model, filter a merged brightness adjustment model dataset obtained from the some display brightness values corresponding to the some ambient light values selected from the brightness adjustment model with the at least one user defined display brightness and at least one sensed ambient light value to obtain a filtered brightness adjustment model dataset. At 2013, the one or more processors extract the merged brightness adjustment model from the filtered brightness adjustment model dataset.
At 2014, the one or more processors of 2013 apply a monotonic cubic spline to the filtered brightness adjustment model dataset to extract the merged brightness adjustment model. At 2015, the display of 2014 comprises an organic light emitting diode display. At 2015, the merged brightness adjustment model defines a number of nits per pixel of the organic light emitting diode display for each ambient light value of the set of increasing ambient light values.
At 2016, the one or more processors of 2012 further repeat the combining some display brightness values corresponding to some ambient light values selected from the brightness adjustment model with the at least one user defined display brightness and the at least one sensed ambient light value to obtain the merged brightness adjustment model. At 2016, the one or more processors adjust the display brightness level as the function of the sensed ambient light level measured by the light sensor and the merged brightness adjustment model multiple times within a twenty-four-hour period.
At 2017, the at least one user defined display brightness of 2016 and the at least one sensed ambient light value employed during a final repeat of the combining the some display brightness values corresponding to the some ambient light values selected from the brightness adjustment model with the at least one user defined display brightness and the at least one sensed ambient light value to obtain the merged brightness adjustment model and the adjusting the display brightness level as the function of the sensed ambient light level measured by the light sensor and the merged brightness adjustment model comprises all user defined display brightness and corresponding sensed ambient light values received during the twenty-four hour period. At 2017, all other repeats of the combining the some display brightness values corresponding to the some ambient light values selected from the brightness adjustment model with the at least one user defined display brightness and the at least one sensed ambient light value to obtain the merged brightness adjustment model and the adjusting the display brightness level as the function of the sensed ambient light level measured by the light sensor and the merged brightness adjustment model use fewer than the all user defined display brightness and the corresponding ambient light values received during the twenty-four hour period.
At 2018, a method in an electronic device comprises selecting, by one or more processors of the electronic device, a subset of display brightness and ambient light value pairs from a brightness adjustment model. At 2018, the method comprises receiving, by a user interface of the electronic device, one or more user defined display brightness and ambient light value pairs.
At 2018, the method comprises combining, by the one or more processors, the subset of display brightness and ambient light value pairs and the one or more user defined display brightness and ambient light value pairs using an isotonic regression model to obtain a combined brightness adjustment model dataset defining a non-decreasing, monotonic function. At 2018, the method comprises filtering, by the one or more processors, the combined brightness adjustment model dataset to obtain a filtered brightness adjustment model dataset.
At 2018, the method comprises extracting, by the one or more processors, a merged brightness adjustment model from the filtered brightness adjustment model dataset using a monotonic cubic spline. At 2018, the method comprises controlling, by the one or more processors, an output brightness of a display of the electronic device as a function of the merged brightness adjustment model.
At 2019, the filtering of 2018 comprises applying a one-dimensional Gaussian convolution model to the combined brightness adjustment model dataset. At 2020, the filtering of 2018 comprises applying an average of even instances of the subset of display brightness and ambient light value pairs and odd instances of the subset of display brightness and ambient light value pairs to the combined brightness adjustment model dataset. At 2020, the method further comprises weighting instances of the filtered brightness adjustment model dataset as a function of a difference between at least one display brightness and corresponding ambient light value pair and at least one corresponding user defined display brightness and corresponding ambient light value pair.
In the foregoing specification, specific embodiments of the present disclosure have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present disclosure as set forth in the claims below. Thus, while preferred embodiments of the disclosure have been illustrated and described, it is clear that the disclosure is not so limited. Numerous modifications, changes, variations, substitutions, and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present disclosure as defined by the following claims.
Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present disclosure. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims.
This application is a continuation application claiming priority and benefit under 35 U.S.C. § 120 from U.S. application Ser. No. 17/965,547, filed Oct. 13, 2022, which is incorporated by reference for all purposes.
 | Number | Date | Country
---|---|---|---
Parent | 17965547 | Oct 2022 | US
Child | 18121989 | | US