Ambient light conditions can affect a user's ability to view a computer display and the user's perception of color and brightness of content displayed on the display. In some instances, color adjustment algorithms and brightness adjustment algorithms are used to adjust one or more color parameters of the content and to adjust a luminance of the display, respectively, to account for the ambient light conditions.
According to one aspect of the present disclosure, a system is provided comprising a display, a first processor, and a second processor. The system further comprises an image sensor configured to output image sensor data to the first processor or the second processor, and an ambient light sensor configured to output ambient light data to the first processor and/or the second processor. A memory stores instructions executable by the first processor to determine whether the image sensor is in use by an application executed at the first processor. On condition that the image sensor is determined not to be in use by the application, the image sensor data is blocked from the first processor. The image sensor data is routed to the second processor to thereby enable the second processor to execute a color adjustment algorithm configured to use at least the image sensor data and the ambient light data to adjust one or more color parameters of content displayed on the display, and to execute a brightness adjustment algorithm configured to use at least the image sensor data and the ambient light data to adjust a luminance of the display.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Ambient light conditions can affect a user's ability to view a computer display and the user's perception of color and brightness of content, such as images and text, displayed on the display. In some instances, color adjustment algorithms and brightness adjustment algorithms are used to adjust one or more color parameters of the content and to adjust a luminance of the display, respectively, to account for the ambient light conditions.
In some instances, the one or more color parameters and the luminance of the display are adjusted using data from an ambient light sensor, such as an ambient brightness sensor and/or an ambient color sensor. However, raw data output by the ambient brightness sensor and/or the ambient color sensor can be inaccurate and/or potentially unreliable. In addition, the performance of brightness adjustment algorithms and color adjustment algorithms may vary based upon a type of ambient illumination (e.g., indoor illumination or outdoor illumination) and may further vary based upon a type of light source used (e.g., sunlight or an incandescent lightbulb).
Further, significant error can occur in some usage environments due to low light (e.g., below 100 lux), relatively poor spatial and spectral resolution of ambient light sensors relative to other types of optical sensors (e.g., an image sensor, such as a digital camera sensor), infrared (IR) noise, a non-linear sensor response, and part-to-part variance between sensors. These conditions can impair performance of a display device, adversely affect a user's perception of displayed content, and generally degrade the user experience. It can also be challenging to characterize ambient lighting conditions when a combination of light sources is present (also referred to herein as “mixed lighting”) due to spectral overlap between different light sources. For example, spectral leakage from an IR source may be detected in a visible spectral range, distorting color and brightness parameters that are calculated based upon visible light intensity.
In some instances, image sensors (e.g., a digital camera sensor, such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor) are used to determine ambient lighting conditions. These image sensors have higher spatial and/or spectral resolution and a greater field of view than ambient light sensors (e.g., simple photodiodes), enabling the use of computer vision to determine ambient lighting conditions (e.g., via image classification). Computer vision can extract useful data for color and/or brightness adjustment algorithms, such as the existence of multiple different light sources. However, the use of image sensors may present user privacy concerns, such as the potential for the image sensors to capture private data (e.g., personally identifiable information, financial information, or regulated information) while a user is not aware that the image sensors are operational.
To address these issues, examples are disclosed that relate to detecting whether an image sensor is in use by an application executed at a first processor. On condition that an image sensor is not in use by an application executed at the first processor, the image sensor data is blocked from the first processor, and the image sensor data is routed to a second processor to thereby enable the second processor to use at least the image sensor data and the ambient light data to execute a color adjustment algorithm and a brightness adjustment algorithm. In this manner, the computing device can use a combination of the image sensor data and the ambient light data to accurately determine ambient lighting conditions and display content accordingly, while preventing the first processor from potentially receiving private information from the image sensor data when the image sensor data is not already in use at the first processor.
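The routing policy described above can be sketched at a high level. The function below is purely illustrative (the names and return fields are not part of the disclosure); it only models the decision of which processor receives a captured frame:

```python
def dispatch_frame(in_use_by_app: bool, frame, ambient_sample):
    """Route a captured frame per the policy sketched above (names illustrative)."""
    if in_use_by_app:
        # The application already consumes the frames, so the first
        # processor may also use them for color/brightness adjustment.
        return {"target": "first_processor", "frame": frame, "ambient": ambient_sample}
    # Otherwise the frame is withheld from the first processor and the
    # second processor runs the adjustment algorithms in isolation.
    return {"target": "second_processor", "frame": frame, "ambient": ambient_sample}
```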
With reference now to
The computing device 102 comprises a display 104, a first processor 106, and a second processor 108. In some examples, the first processor 106 comprises a central processing unit (CPU) of the computing device 102 or a system on a chip (SoC), and serves as a platform for executing an operating system and other software (e.g., application 134). In some such examples, the second processor 108 comprises an auxiliary processor configured to perform supplementary functions. For example, as described in more detail below and in one potential advantage of the present disclosure, the second processor 108 is configured to receive and process image sensor data while this data is blocked from the first processor 106.
It will also be appreciated that, while the methods and processes disclosed herein are described with reference to two processors located at the same computing device, the methods and processes disclosed herein are capable of being implemented using any other suitable number of processors (e.g., three or more processors), and such processors could be distributed on any other suitable number of computing devices (e.g., two or more devices). In this manner, some potentially expensive processes (e.g., computer vision algorithms) can be distributed across a plurality of processors or offloaded from the computing device 102.
The computing device 102 further comprises a memory 110. In some examples, the memory 110 is accessible only by the first processor 106 and stores instructions executable by the first processor 106 to implement at least a portion of the methods and processes described herein. In these examples the computing device 102 also comprises an additional memory 111 that is accessible only by the second processor 108 and stores instructions executable by the second processor 108 to implement at least a portion of the methods and processes described herein. Accordingly, and in one potential advantage of the present disclosure, the separation of the memory 110 and the additional memory 111 prevents the first processor 106 from accessing potentially private data stored in the additional memory 111 as described further below. Additional aspects of the computing device 102 are described in more detail below with reference to
The system 100 of
In the example of
The image sensor data 116 output by the image sensor 112 has a first image resolution that is higher than a second image resolution of the ambient light data 118 output by the ambient light sensor 114. In some examples, the ambient light sensor 114 comprises fewer pixel sensor units than the image sensor 112, and/or the ambient light sensor 114 averages the outputs of the pixel sensor units over at least a portion of the sensor. In this manner, noise is distributed over an area of the sensor.
Further, in some examples, the second image resolution is less than a threshold resolution 136 that enables facial recognition. Accordingly, and as described in more detail below with reference to
In some examples, the ambient light sensor 114 additionally or alternatively includes a filter, such as an infrared filter, a mask, or another material that impedes the passage of light to the ambient light sensor 114. The filter reduces noise via photon diffusion and/or by blocking a potentially noisy spectral range (e.g., IR). The filter also prevents the ambient light sensor 114 from resolving private information, such as human faces.
Compared to the ambient light sensor 114, the image sensor 112 utilizes an optical stack having higher transmissivity and percent transmittance, thereby enabling better color approximation and sensitivity in low light environments, along with lower noise in the image sensor data 116. In addition, and as described in more detail below with reference to
In some examples, the image sensor 112 has a larger diode area than the ambient light sensor 114, increasing sensitivity. In this manner, the image sensor 112 may maintain a linear sensor response in low-light conditions (e.g., less than or equal to 100 lux). In some examples, the image sensor 112 exhibits a linear response to light levels greater than or equal to 10 lux. Accordingly, and in one potential advantage of the present disclosure, the image sensor 112 may be used to evaluate ambient lighting conditions in low-light environments.
In some examples, the image sensor 112 has a greater number of color channels than the ambient light sensor 114, enabling the image sensor data 116 to be binned into a greater number of spectral ranges than the ambient light data 118. These additional color channels enable the computing device 102 to estimate a light source type, light intensity, and light color. Such additional information, as well as the greater sensitivity and resolution of the image sensor 112 relative to the ambient light sensor 114, enables the computing device to accurately determine ambient light conditions using a combination of the image sensor data 116 and the ambient light data 118.
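As a hedged illustration of such binning, a multi-channel reading can be averaged into named spectral ranges. The four-channel layout and band map below are hypothetical, not a real sensor configuration:

```python
import numpy as np

def bin_spectral_channels(reading: np.ndarray, bands: dict) -> dict:
    """Average a per-pixel multi-channel reading of shape (H, W, C) into named bands.

    `bands` maps a band name to a list of channel indices; the mapping used
    here is illustrative only.
    """
    return {name: float(reading[..., idx].mean()) for name, idx in bands.items()}

# Hypothetical 4-channel sensor: three visible channels plus one IR channel.
sample = np.zeros((2, 2, 4))
sample[..., 3] = 8.0  # only the IR channel records signal in this example
bands = {"visible": [0, 1, 2], "infrared": [3]}
```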
As described in more detail below, and in one potential advantage of the present disclosure, the image sensor data 116 and the ambient light data 118 are used to execute a color adjustment algorithm 120 and a brightness adjustment algorithm 122, while also preventing the higher-resolution image sensor data from being available to the first processor 106 to thereby protect user privacy. Advantageously and as described further below, by utilizing the higher-resolution image sensor data 116 along with the ambient light data 118, the color adjustment algorithm 120 and brightness adjustment algorithm 122 can provide more accurate color and brightness adjustments that are more closely tailored to the particular ambient light conditions of a user's environment.
The color adjustment algorithm 120 is configured to use at least the image sensor data 116 and the ambient light data 118 to adjust one or more color parameters 124 of content 126 displayed on the display 104. The display 104 is configured to output any suitable type of content. Some examples of suitable content include, but are not limited to, an image 128 and text 130. The one or more color parameters 124 include any suitable color parameters. Some examples of suitable color parameters include, but are not limited to, color space values, such as red, green, and blue (RGB) display values. In one potential advantage of the present disclosure, the one or more color parameters 124 are adjusted to value(s) that allow a user to accurately perceive a color of the content.
The brightness adjustment algorithm 122 is configured to use at least the image sensor data 116 and the ambient light data 118 to adjust a luminance 132 of the display 104. In this manner, the luminance 132 of the display 104 is adjusted to a level that allows a user to accurately perceive brightness and contrast in the content 126. It will also be appreciated that, while the brightness adjustment algorithm 122 and the color adjustment algorithm 120 are depicted as separate algorithms in
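One minimal sketch of how fused sensor data could drive both adjustments is shown below. The von Kries-style channel scaling and the logarithmic lux-to-nits curve are illustrative stand-ins, not the disclosed algorithms, and all constants are assumptions:

```python
import math

def adjust_color_and_brightness(rgb, ambient_white, ambient_lux,
                                min_nits=5.0, max_nits=400.0):
    """Illustrative combined adjustment.

    rgb: display color values to adjust.
    ambient_white: per-channel gains estimated from fused sensor data,
        normalized so the largest gain is 1.0 (hypothetical representation).
    ambient_lux: scalar illuminance estimate.
    """
    # Von Kries-style scaling toward the estimated ambient white point.
    adjusted = tuple(c * w for c, w in zip(rgb, ambient_white))
    # Logarithmic lux-to-luminance mapping: ~1 lux -> min_nits, ~10,000 lux -> max_nits.
    t = min(max(math.log10(max(ambient_lux, 1.0)) / 4.0, 0.0), 1.0)
    luminance = min_nits + t * (max_nits - min_nits)
    return adjusted, luminance
```

A real implementation would derive the white point and luminance curve from calibration data rather than fixed constants.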
In some examples, the color adjustment algorithm 120 and the brightness adjustment algorithm 122 are executed at the first processor 106. As described in more detail below with reference to
The system 200 is configured to detect whether an image sensor 206 is in use by an application 208 executed at the SoC 202. With reference to
Accordingly and as described further below, on condition that the image sensor is in use by the application 208, the sensor processor 204 is configured to route the image sensor data 210 to the SoC 202 and the first processor. In this manner, the first processor receives and uses the image sensor data 210 in combination with the ambient light data to execute the color adjustment algorithm 120 and the brightness adjustment algorithm 122 as described above. In this example, the sensor processor 204 routes the image sensor data 210 to the SoC 202 by passing the sensor enable signal 214 to a power system 216 via OR gate 220. The power system 216 supplies power to the image sensor 206 and thereby causes the image sensor 206 to activate and output the image sensor data 210. The sensor enable signal 214 further causes a first switch 218 to close, enabling the image sensor data 210 to reach the sensor processor 204. It will also be appreciated that access to the image sensor data 210 may be subject to additional conditions, such as permissions and user consent, to protect user privacy.
In some examples, when the image sensor data is routed to the sensor processor 204, the image sensor data is blocked from the second processor. In the present example, the sensor enable signal 214 also passes to a NOT gate 222, which flips the bit from “1” to “0” and passes the “0” to the second processor 224 (which can serve as the second processor 108 of
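The gating described above can be modeled as simple boolean logic. In the sketch below, the second-processor request line and the return fields are hypothetical additions for illustration; only the OR gate, NOT gate, and switch behavior follow the example above:

```python
def sensor_routing(app_enable: bool, second_request: bool) -> dict:
    """Simulate the gating of the example above (signal names illustrative).

    app_enable models the sensor enable signal 214; second_request models a
    hypothetical request line from the second processor.
    """
    power_on = app_enable or second_request        # OR gate 220 feeding power system 216
    route_to_first = app_enable                    # first switch 218 closes on the enable signal
    route_to_second = (not app_enable) and second_request  # NOT gate 222 gates the second path
    return {"sensor_powered": power_on,
            "to_first_processor": route_to_first,
            "to_second_processor": route_to_second}
```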
With reference again to
With reference again to
As shown in
In the example of
As shown in
With reference again to
For example, the image sensor 112 is configured to use a high-performance image capture mode 138 when the image sensor data 116 is in use by the application 134. In the high-performance image capture mode 138, the image sensor 112 is configured to output image sensor data 116 at a higher resolution and/or frame rate than in the low-power image capture mode 140. In this manner, the image sensor data 116 has a suitable resolution or frame rate for the application 134.
Additionally, the image sensor 112 is configured to use the low-power image capture mode 140 when the image sensor data 116 is not in use by the application. The image sensor 112 may consume less power by operating at a lower resolution and/or frame rate than in the high-performance image capture mode 138. In addition, operating at a lower resolution and/or frame rate further prevents potentially private information, such as human faces, from being discernable in the image sensor data 116.
In other examples, high-resolution image sensor data is downsampled to the resolution that is less than the threshold resolution 136. Accordingly, and in one potential advantage of the present disclosure, downsampling may remove at least some identifiable features (e.g., human faces) from the high-resolution image data.
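A minimal sketch of such privacy-preserving downsampling, assuming block averaging and a hypothetical threshold value, is:

```python
import numpy as np

# Hypothetical stand-in for the threshold resolution 136 below which
# facial recognition is assumed infeasible.
THRESHOLD_PX = 32

def privacy_downsample(frame: np.ndarray, target: int = THRESHOLD_PX) -> np.ndarray:
    """Block-average a (H, W, C) frame so neither spatial dimension exceeds target."""
    h, w = frame.shape[:2]
    fy, fx = max(1, h // target), max(1, w // target)
    # Trim so the frame divides evenly into fy-by-fx blocks, then average each block.
    trimmed = frame[: (h // fy) * fy, : (w // fx) * fx]
    return trimmed.reshape(h // fy, fy, w // fx, fx, -1).mean(axis=(1, 3))
```

Averaging rather than decimating preserves the aggregate light levels that the adjustment algorithms need while discarding fine spatial detail.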
The image 500 has a first resolution that enables recognition of a plurality of facial features of the user 502, such as hair 504, nose 506, eyes 508, ears 510, and fine facial structure 512. Such a detailed depiction of the user can be advantageous for some applications, such as a camera application, a video conferencing application, or a facial recognition application.
As described in more detail below, the additional information that can be gleaned from the image sensor data allows the identification of pixels that are relevant to the color adjustment algorithm and/or the brightness adjustment algorithm; these relevant pixels are incorporated as an additional input to the algorithms along with the ambient light data. The relevant and/or irrelevant pixels in the image sensor data are additionally or alternatively used to compensate for noise in the ambient light data, which can increase the reliability of the algorithmic outputs relative to using the ambient light data alone.
In some examples, and with reference again to
In a prophetic example of a use-case scenario, a computer vision algorithm 141 is used to identify the presence of a human (e.g., the user 502) in the image 514 of
In some examples, the locations of the one or more pixels in the background area 518 are used to select relevant pixels in the ambient light data that correspond to the same real-world location. Correspondingly, other locations are used to exclude other pixels that are potential sources of noise. The relevant pixels in the ambient light data and/or the background pixels are provided as input to the color adjustment algorithm and/or the brightness adjustment algorithm.
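A minimal sketch of using only background pixels, assuming a person mask produced by a separate segmentation step (not shown), is:

```python
import numpy as np

def background_light_estimate(frame: np.ndarray, person_mask: np.ndarray) -> np.ndarray:
    """Mean per-channel value over background pixels only.

    person_mask is a boolean (H, W) array that is True where a person was
    detected; producing it (e.g., with a computer vision model) is outside
    this sketch.
    """
    background = frame[~person_mask]   # boolean indexing drops person pixels
    return background.mean(axis=0)     # per-channel mean of the remainder
```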
In other examples, the ambient light data is additionally or alternatively adjusted to fit the image sensor data. For example, the ambient light data 118 of
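One way such an adjustment could be sketched is as a weighted blend of the two estimates; the fixed blend weight below is illustrative, and a real system might vary it with confidence in each source:

```python
def fuse_ambient_reading(als_lux: float, camera_lux: float, weight: float = 0.5) -> float:
    """Blend the ambient light sensor reading toward the camera-derived estimate.

    weight = 0.0 trusts the ALS entirely; weight = 1.0 trusts the camera entirely.
    """
    return (1.0 - weight) * als_lux + weight * camera_lux
```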
In some examples, the computing device 102 is configured to segment one or more pixels that correspond to a reflective object from the image sensor data and use the one or more pixels that correspond to the reflective object to account for non-ambient light reflected by the reflective object. In some examples, the segmentation of the reflective object is performed at the first processor 106, the second processor 108, one or more additional processors, or a combination thereof. Further, the segmentation and the execution of the color adjustment algorithm and/or the brightness adjustment algorithm may be performed at the same processor or at different processors.
With reference again to
In some examples, the computing device 102 of
With reference again to
With reference again to
The image 500 of
With reference again to
With reference now to
At 802 of
On condition that the image sensor is in use by the application, at 808 the method 800 includes routing the image sensor data to the first processor. For example, the image sensor data 116 of
At 810, the method 800 includes causing execution of a color adjustment algorithm at the first processor to use at least the image sensor data and the ambient light data to adjust one or more color parameters of content displayed on the display. The method 800 also includes, at 812, causing execution of a brightness adjustment algorithm at the first processor to use at least the image sensor data and the ambient light data to adjust a luminance of the display. In this manner, the higher resolution image sensor data is utilized to adjust one or more color parameters to value(s) that allow a user to accurately perceive a color of the content, and adjust the luminance 132 of the display 104 to a level that allows a user to accurately perceive brightness and contrast in the content 126.
In some examples and as noted above, the first processor 106 of
With reference now to 806, and on condition that the image sensor is not in use by the application, at 816 the method 800 includes blocking the image sensor data from the first processor. For example, the computing device 102 is configured to block the image sensor data 116 from the first processor 106, and the system 200 is configured to block the image sensor data 210 from the sensor processor 204. In this manner, the first processor is prevented from receiving potentially private information from the image sensor data. At 818, the method 800 includes routing the image sensor data to the second processor to thereby enable the second processor to execute the color adjustment algorithm to use at least the image sensor data and the ambient light data to adjust one or more color parameters of content displayed on the display, and execute the brightness adjustment algorithm to use at least the image sensor data and the ambient light data to adjust a luminance of the display. For example, the image sensor data 116 of
At 820, the method 800 may include, on condition that the image sensor is not in use by the application, receiving the image sensor data at a resolution that is less than a threshold resolution that enables facial recognition. In some examples, the image sensor 112 of
At 822, the image sensor data may have a first image resolution, and the ambient light data may have a second image resolution that is less than the first image resolution. For example, the ambient light sensor 114 of
With reference now to
At 826, the method 800 may include segmenting pixels that correspond to a reflective object from the image sensor data; and using the pixels that correspond to the reflective object to account for light reflected by the reflective object. For example, pixels that correspond to the shirt 526 of
At 828, the method 800 may include using the image sensor data to detect a low light condition or a mixed light condition. For example, image sensor data can be used to identify the low light conditions depicted in
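A hedged heuristic sketch of both detections, using a mean-lux test for low light and per-pixel chromaticity spread for mixed lighting (all thresholds are illustrative assumptions), is:

```python
import numpy as np

def classify_lighting(frame: np.ndarray, mean_lux: float,
                      low_light_lux: float = 100.0,
                      chroma_spread: float = 0.05) -> str:
    """Classify ambient lighting from a (H, W, 3) frame and a lux estimate.

    Low mean lux -> low light; a wide spread in per-pixel chromaticity
    suggests multiple differently colored sources -> mixed lighting.
    """
    if mean_lux < low_light_lux:
        return "low_light"
    totals = frame.sum(axis=-1, keepdims=True) + 1e-6
    chroma = (frame / totals).reshape(-1, 3)   # normalized r, g, b per pixel
    if chroma.std(axis=0).max() > chroma_spread:
        return "mixed_light"
    return "uniform_light"
```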
At 830, the method 800 may include segmenting pixels that correspond to a light source from the image sensor data; and using at least the pixels that correspond to the light source to execute the color adjustment algorithm and the brightness adjustment algorithm. For example, an area 530 corresponding to the lamp 522 of
In some embodiments, the methods and processes described herein may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
Computing system 900 is shown in simplified form. Computing system 900 can take the form of one or more personal computers, server computers, tablet computers, network computing devices, mobile computing devices, mobile communication devices (e.g., smart phone), wearable computing devices, and/or other computing devices. In some examples, the computing device 102 of
Computing system 900 includes a logic subsystem 902, a storage subsystem 904, and a display subsystem 906. Computing system 900 can optionally include an input subsystem 908, a communication subsystem 910, and/or other components not shown in
Logic subsystem 902 includes one or more physical devices configured to execute instructions. For example, logic subsystem 902 can be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions can be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result. For example, logic subsystem 902 can be used to execute instructions to perform the method 800 of
Logic subsystem 902 can include one or more processors configured to execute software instructions. Additionally or alternatively, logic subsystem 902 can include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of logic subsystem 902 can be single-core or multi-core, and the instructions executed thereon can be configured for sequential, parallel, and/or distributed processing. Individual components of logic subsystem 902 optionally can be distributed among two or more separate devices, which can be remotely located and/or configured for coordinated processing. Aspects of logic subsystem 902 can be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
Storage subsystem 904 includes one or more physical devices configured to hold instructions executable by logic subsystem 902 to implement the methods and processes described herein. For example, storage subsystem 904 can hold instructions executable to perform the method 800 of
Storage subsystem 904 can include removable and/or built-in devices. Storage subsystem 904 can include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage subsystem 904 can include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
It will be appreciated that storage subsystem 904 includes one or more physical devices. However, aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration.
Aspects of logic subsystem 902 and storage subsystem 904 can be integrated together into one or more hardware-logic components. Such hardware-logic components can include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), SoCs, and complex programmable logic devices (CPLDs), for example.
The terms “program” and “application” may be used to describe an aspect of computing system 900 implemented to perform a particular function. In some cases, a program or application may be instantiated via logic subsystem 902 executing instructions held by storage subsystem 904. It will be understood that different programs and applications may be instantiated from the same service, code block, object, library, routine, API, function, etc. Likewise, the same program or application may be instantiated by different services, code blocks, objects, routines, APIs, functions, etc. The terms “program” and “application” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
It will be appreciated that a “service”, as used herein, is an application executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server-computing devices.
When included, display subsystem 906 can be used to present a visual representation of data held by storage subsystem 904. This visual representation can take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the storage subsystem 904, and thus transform the state of the storage subsystem, the state of display subsystem 906 can likewise be transformed to visually represent changes in the underlying data. For example, display subsystem 906 can be configured to display the content 126 of
Display subsystem 906 can include one or more display devices utilizing virtually any type of technology. Such display devices can be combined with logic subsystem 902 and/or storage subsystem 904 in a shared enclosure, or such display devices can be peripheral display devices.
When included, input subsystem 908 can comprise or interface with one or more user-input devices such as a keyboard, mouse, touch screen, or joystick. In some embodiments, the input subsystem 908 can comprise or interface with selected natural user input (NUI) componentry. Such componentry can be integrated or peripheral, and the transduction and/or processing of input actions can be handled on- or off-board. Example NUI componentry can include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity. For example, input subsystem 908 can be configured to receive user inputs while performing the method 800 and/or displaying the content 126.
When included, communication subsystem 910 can be configured to communicatively couple computing system 900 with one or more other computing devices. Communication subsystem 910 can include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem can be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, communication subsystem 910 can allow computing system 900 to send and/or receive messages to and/or from other devices via a network such as the Internet. For example, communication subsystem 910 can be used to receive data from or send data to another computing system. As another example, the communication subsystem may be used to communicate with other computing systems during execution of method 800 in a distributed computing environment.
The following paragraphs provide additional support for the claims of the subject application. One aspect provides a system, comprising: a display; a first processor; a second processor; an image sensor configured to output image sensor data to the first processor or the second processor; an ambient light sensor configured to output ambient light data to the first processor and/or the second processor; and a memory storing instructions executable by the first processor to: determine that the image sensor is not in use by an application executed at the first processor; and on condition that the image sensor is not in use by the application: block the image sensor data from the first processor; and route the image sensor data to the second processor to thereby enable the second processor to execute a color adjustment algorithm configured to use at least the image sensor data and the ambient light data to adjust one or more color parameters of content displayed on the display, and execute a brightness adjustment algorithm configured to use at least the image sensor data and the ambient light data to adjust a luminance of the display. The instructions may be additionally or alternatively executable to determine that the image sensor is in use by the application executed at the first processor; and on condition that the image sensor is in use by the application: route the image sensor data to the first processor; cause execution of the color adjustment algorithm at the first processor to use at least the image sensor data and the ambient light data to adjust one or more color parameters of content displayed on the display; and cause execution of the brightness adjustment algorithm at the first processor to use at least the image sensor data and the ambient light data to adjust a luminance of the display.
The image sensor data may additionally or alternatively have a first image resolution, and the ambient light data may additionally or alternatively have a second image resolution that is less than the first image resolution. The instructions may be additionally or alternatively executable by the second processor to segment background pixels from the image sensor data; and use the background pixels to execute the color adjustment algorithm and the brightness adjustment algorithm. The instructions may be additionally or alternatively executable by the second processor to segment pixels that correspond to a reflective object from the image sensor data; and use the pixels that correspond to the reflective object to account for light reflected by the reflective object. The instructions may be additionally or alternatively executable by the second processor to use the image sensor data to detect a low light condition or a mixed light condition. The instructions may be additionally or alternatively executable by the second processor to segment pixels that correspond to a light source from the image sensor data; and use at least the pixels that correspond to the light source to execute the color adjustment algorithm and the brightness adjustment algorithm. The instructions may be additionally or alternatively executable by the second processor to receive environmental context data; and use the environmental context data to adjust the one or more color parameters and/or adjust the luminance of the display. The instructions may be additionally or alternatively executable by the second processor to, on condition that the image sensor is not in use by the application, receive the image sensor data at a resolution that is less than a threshold resolution that enables facial recognition.
Another aspect provides, at a computing device comprising a display, a first processor, a second processor, an image sensor configured to output image sensor data to the first processor or the second processor, and an ambient light sensor configured to output ambient light data to the first processor and/or the second processor, a method for adjusting one or more color parameters of an image displayed on the display and adjusting a luminance of the display, the method comprising: determining that the image sensor is not in use by an application executed at the first processor; and on condition that the image sensor is not in use by the application: blocking the image sensor data from the first processor, and routing the image sensor data to the second processor to thereby enable the second processor to execute a color adjustment algorithm configured to use at least the image sensor data and the ambient light data to adjust one or more color parameters of content displayed on the display, and execute a brightness adjustment algorithm configured to use at least the image sensor data and the ambient light data to adjust a luminance of the display. The method may additionally or alternatively include determining that the image sensor is in use by the application executed at the first processor; and on condition that the image sensor is in use by the application: routing the image sensor data to the first processor; causing execution of the color adjustment algorithm at the first processor to use at least the image sensor data and the ambient light data to adjust one or more color parameters of content displayed on the display; and causing execution of the brightness adjustment algorithm at the first processor to use at least the image sensor data and the ambient light data to adjust a luminance of the display. 
The method may additionally or alternatively include, wherein the image sensor data has a first image resolution, and wherein the ambient light data has a second image resolution that is less than the first image resolution. The method may additionally or alternatively include segmenting background pixels from the image sensor data; and using the background pixels to execute the color adjustment algorithm and the brightness adjustment algorithm. The method may additionally or alternatively include segmenting pixels that correspond to a reflective object from the image sensor data; and using the pixels that correspond to the reflective object to account for light reflected by the reflective object. The method may additionally or alternatively include using the image sensor data to detect a low light condition or a mixed light condition. The method may additionally or alternatively include segmenting pixels that correspond to a light source from the image sensor data; and using at least the pixels that correspond to the light source to execute the color adjustment algorithm and the brightness adjustment algorithm. The method may additionally or alternatively include receiving environmental context data; and using the environmental context data to adjust the one or more color parameters and/or adjust the luminance of the display. The method may additionally or alternatively include, on condition that the image sensor is not in use by the application, receiving at the second processor the image sensor data at a resolution that is less than a threshold resolution that enables facial recognition.
Another aspect provides a computing device, comprising: a display; a first processor; a second processor; an image sensor configured to output image sensor data to the first processor or the second processor, wherein the image sensor data has a first image resolution; an ambient light sensor configured to output ambient light data to the first processor and/or the second processor, wherein the ambient light data has a second image resolution less than the first image resolution, and wherein the second image resolution is less than a threshold resolution that enables facial recognition; and a memory storing instructions executable by the first processor to: detect whether the image sensor is in use by an application executed at the first processor; on condition that the image sensor is in use by the application: route the image sensor data to the first processor, cause execution of a color adjustment algorithm configured to use at least the image sensor data and the ambient light data to adjust one or more color parameters of content displayed on the display, and cause execution of a brightness adjustment algorithm configured to use at least the image sensor data and the ambient light data to adjust a luminance of the display; and on condition that the image sensor is not in use by the application: block the image sensor data from the first processor, and route the image sensor data to the second processor at a third image resolution that is less than the threshold resolution to thereby enable the second processor to use at least the image sensor data and the ambient light data to execute the color adjustment algorithm and the brightness adjustment algorithm. The instructions may be additionally or alternatively executable by the second processor to: segment background pixels from the image sensor data; and use the background pixels to execute the color adjustment algorithm and the brightness adjustment algorithm.
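The reduction of the image sensor data to a third image resolution below the facial-recognition threshold can be sketched as repeated 2x decimation of a pixel grid. The stride-of-two strategy and the grid representation are illustrative assumptions; any downscaling that yields a resolution below the threshold would serve the same purpose.

```python
def downsample_below_threshold(image, max_height, max_width):
    """Halve a 2D pixel grid until both dimensions fall below the
    threshold resolution that would enable facial recognition.

    image: list of rows, each row a list of pixel values.
    Assumes max_height and max_width are at least 2.
    """
    while len(image) >= max_height or len(image[0]) >= max_width:
        # Keep every other row and every other column (2x decimation).
        image = [row[::2] for row in image[::2]]
    return image
```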
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2031189 | Mar 2022 | NL | national |
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/US2023/062709 | 2/16/2023 | WO | |