This disclosure relates in general to the field of computing, and more particularly, to a display with an integrated illuminator.
Some emerging trends in electronic devices include the use of a camera. A camera (or webcam) is a video camera that feeds or streams an image or video in real time to or through a computer to a computer network, such as the Internet. The cameras are typically relatively small devices that sit on a desk, attach to a user's monitor, or are built into the hardware of the electronic device. The cameras can be used during a video chat session involving two or more people, with conversations that include live audio and video, as well as during video calls, teleconferences, and other camera-related processes.
To provide a more complete understanding of the present disclosure and features and advantages thereof, reference is made to the following description, taken in conjunction with the accompanying figures, wherein like reference numerals represent like parts, in which:
The FIGURES of the drawings are not necessarily drawn to scale, as their dimensions can be varied considerably without departing from the scope of the present disclosure.
The following detailed description sets forth examples of devices, apparatuses, methods, and systems relating to a display with an integrated illuminator. Features such as structure(s), function(s), and/or characteristic(s), for example, are described with reference to one embodiment as a matter of convenience; various embodiments may be implemented with any suitable one or more of the described features.
In the following description, various aspects of the illustrative implementations will be described using terms commonly employed by those skilled in the art to convey the substance of their work to others skilled in the art. However, it will be apparent to those skilled in the art that the embodiments disclosed herein may be practiced with only some of the described aspects. For purposes of explanation, specific numbers, materials, and configurations are set forth in order to provide a thorough understanding of the illustrative implementations. However, it will be apparent to one skilled in the art that the embodiments disclosed herein may be practiced without the specific details. In other instances, well-known features are omitted or simplified in order not to obscure the illustrative implementations.
The terms “over,” “under,” “below,” “between,” and “on” as used herein refer to a relative position of one layer or component with respect to other layers or components. For example, one layer disposed over or under another layer may be directly in contact with the other layer or may have one or more intervening layers. Moreover, one layer disposed between two layers may be directly in contact with the two layers or may have one or more intervening layers. In contrast, a first layer “directly on” a second layer is in direct contact with that second layer. Similarly, unless explicitly stated otherwise, one feature disposed between two features may be in direct contact with the adjacent features or may have one or more intervening layers.
Implementations of the embodiments disclosed herein may be formed or carried out on a substrate, such as a non-semiconductor substrate or a semiconductor substrate. In one implementation, the non-semiconductor substrate may be silicon dioxide, an inter-layer dielectric composed of silicon dioxide, silicon nitride, titanium oxide, and other transition metal oxides. Although a few examples of materials from which the non-semiconductor substrate may be formed are described here, any material that may serve as a foundation upon which a non-semiconductor device may be built falls within the spirit and scope of the embodiments disclosed herein.
In another implementation, the semiconductor substrate may be a crystalline substrate formed using a bulk silicon or a silicon-on-insulator substructure. In other implementations, the semiconductor substrate may be formed using alternate materials, which may or may not be combined with silicon, that include but are not limited to germanium, indium antimonide, lead telluride, indium arsenide, indium phosphide, gallium arsenide, indium gallium arsenide, gallium antimonide, or other combinations of group III-V or group IV materials. In other examples, the substrate may be a flexible substrate including 2D materials such as graphene and molybdenum disulphide, organic materials such as pentacene, transparent oxides such as indium gallium zinc oxide, poly/amorphous (low-temperature deposition) III-V semiconductors and germanium/silicon, and other non-silicon flexible substrates. Although a few examples of materials from which the substrate may be formed are described here, any material that may serve as a foundation upon which a semiconductor device may be built falls within the spirit and scope of the embodiments disclosed herein.
In the following detailed description, reference is made to the accompanying drawings that form a part hereof wherein like numerals designate like parts throughout, and in which is shown, by way of illustration, embodiments that may be practiced. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present disclosure. Therefore, the following detailed description is not to be taken in a limiting sense. For the purposes of the present disclosure, the phrase “A and/or B” means (A), (B), or (A and B). For the purposes of the present disclosure, the phrase “A, B, and/or C” means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B, and C). Reference to “one embodiment” or “an embodiment” in the present disclosure means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” or “in an embodiment” are not necessarily all referring to the same embodiment. The appearances of the phrase “for example,” “in an example,” or “in some examples” are not necessarily all referring to the same example.
Turning to
Turning to
Display 108 can be any display that allows the pixel brightness of each pixel or group of pixels in the display to be set independently. More specifically, display 108 can be a micro-light emitting diode (microLED) display, light emitting diode (LED) display, organic LED (OLED) display, or some other type of display where each pixel or group of pixels in the display can be set independently. In one embodiment, at least one portion of the display is configured to illuminate the user through a full illumination configuration or full brightness configuration while the other portion of display 108 is configured to display content viewable by the user. A microLED display includes arrays of microLEDs forming the individual pixel elements. MicroLEDs are microscopic-scale versions of the LEDs used today in a plethora of applications and are based on the same gallium nitride technology. MicroLED dimensions are less than 100 μm, or about two orders of magnitude smaller than a conventional LED die. Some microLEDs are as small as 3 μm on a side.
When camera 110 is on and capturing a video or picture of a user in ambient light, display illumination engine 112 can analyze the captured video or picture and adjust the illumination of the user using one or more portions of display 108. For example, if the captured video or picture indicates that the user is not being illuminated properly by the ambient light, display illumination engine 112 can increase the brightness from display 108 by adjusting one or more areas of display 108 to a full brightness configuration to increase the lighting on the user. Portions of the display that have been reconfigured for full brightness may not be suitable to display content, so content presented on display 108 may need to be resized to accommodate the one or more areas of display 108 that are adjusted to a full brightness configuration. In a specific example, display 108 includes a timing controller (TCON) and the TCON is configured to resize the image on display 108 and adjust the brightness of display 108. This means that the resizing of the image on display 108 and the adjusting of the brightness of display 108 are done on the backend and are not done by a central processing unit of electronic device 100a, 100b, or 100c or by a processor or logic on a system on a chip (SoC) of electronic device 100a, 100b, or 100c.
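By way of illustration only, the following is a minimal Python sketch of that flow, assuming an RGB frame from camera 110 and a framebuffer for display 108 represented as numpy arrays; the luminance test, threshold, and band width are illustrative stand-ins rather than details taken from this disclosure.

```python
import numpy as np

# Illustrative constants; actual values would be tuned per device.
UNDEREXPOSED_MEAN_LUMA = 0.35   # assumed fraction of full-scale brightness
ILLUMINATION_BAND_PX = 120      # assumed width of each border band

def mean_luminance(frame: np.ndarray) -> float:
    """Mean luminance of an (H, W, 3) RGB frame, as a fraction of full scale."""
    luma = (0.299 * frame[..., 0] + 0.587 * frame[..., 1]
            + 0.114 * frame[..., 2])
    return float(luma.mean()) / 255.0

def illuminate_if_needed(frame: np.ndarray, framebuffer: np.ndarray) -> np.ndarray:
    """If the captured frame looks underexposed, carve full-brightness bands
    off both sides of the framebuffer to act as an integrated illuminator."""
    if mean_luminance(frame) >= UNDEREXPOSED_MEAN_LUMA:
        return framebuffer                              # ambient light adequate
    framebuffer[:, :ILLUMINATION_BAND_PX, :] = 255      # left illumination band
    framebuffer[:, -ILLUMINATION_BAND_PX:, :] = 255     # right illumination band
    return framebuffer
```

Content shown between the bands would then be rescaled, which in the TCON example above happens in the display hardware rather than in this host-side sketch.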
In a specific illustrative example, display illumination engine 112 can be configured to adjust the brightness of display 108 and the lighting on the user during video calls, teleconferences, other camera-related processes, and other applications that require a certain amount of illumination. In a specific example, display 108 is a microLED display. Display illumination engine 112 can be configured to resize the incoming image and set the LEDs needed for the backlight to ultrabright levels, allowing for a display with an integrated illuminator in a power-efficient, operating system (OS) agnostic way.
It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present disclosure. Substantial flexibility is provided by electronic devices 100a-100c in that any suitable arrangements and configuration may be provided without departing from the teachings of the present disclosure.
For purposes of illustrating certain example techniques of electronic devices 100a-100c, the following foundational information may be viewed as a basis from which the present disclosure may be properly explained. End users have more media and communications choices than ever before. A number of prominent technological trends are currently afoot (e.g., more computing devices, more online video services, more Internet traffic, etc.), and these trends are changing the media delivery landscape. One change is the use of a camera. The term “camera” as used herein includes a webcam, a web camera, or some other device that can capture a video image or picture of a user.
As used herein, a camera (or webcam) is a video camera that feeds or streams an image or video in real time to or through a computer to a computer network, such as the Internet. The cameras are typically relatively small devices built into the hardware or chassis of the electronic device, are attached to a user's monitor, or sit on a desk next to the electronic device. The camera can be used during a video chat session involving two or more people, with conversations that include live audio and video, during video calls, teleconferences, etc. In addition, camera software enables users to record a video or stream the video on the Internet. Because video streaming over the Internet requires bandwidth, the video streams typically use some type of compression. The maximum resolution of an electronic device's camera is also lower than that of most handheld video cameras, as higher resolutions would be reduced during transmission. The lower resolution enables the cameras to be relatively inexpensive compared to most standalone video cameras, but the resulting quality is adequate for video chat sessions. However, due to the lower resolution, lighting can be an important factor for good image quality. The cameras typically include a lens, an image sensor, and supporting electronics, and may also include one or even two microphones for sound.
Most current electronic devices are equipped with cameras. In many cases, especially for handheld electronic devices, there are two cameras, one on the front side of the electronic device or on the same side of a general display screen, and the other one on the back side of the electronic device. One fairly widespread usage of the electronic devices is a video call, or video conference in some instances, during which both video images and audio signals are transmitted and received. Most likely the video images are captured with the front side camera, allowing a user of the electronic device to see the display on the electronic device and to be visible at the same time. Video calls enable the callers to hear and see the other person at the same time. Combined with the mobile capacity of the handheld electronic devices, video calls strongly facilitate communication and interaction between the parties.
However, one drawback of the video call conducted on an electronic device is the unpredictable and often far-from-ideal illumination of the user, which can render the video calls less attractive or even make it impossible for participants on the video call to see the user. This problem is especially acute for handheld electronic devices. More specifically, due to the inherent mobility of handheld electronic devices, video calls conducted with handheld electronic devices may be carried out in locations that have poor or inconsistent lighting. For example, instead of an illuminated conference room, a user of a handheld electronic device may find themselves participating in a video call while in a car, in a dark room, or in some places with weak or impossible-to-adjust lighting, making it difficult for the electronic device to properly capture the user's image.
Some current systems have an external illuminator device for illuminating one or more users in front of a webcam, a communication terminal having a bulb for emitting light, a reflector operatively associated with the bulb for projecting the emitted light, and an arm disposed between the bulb and the terminal for connection to the terminal. The bulb can be adjusted or positioned relative to the webcam to provide viewing of the user through the webcam. Another current system can include an external device for illuminating one or more users in front of a webcam, a communications terminal having a frame, and an external screen having a plurality of bulbs. The plurality of bulbs are disposed in the frame of the terminal to provide illumination to the face or faces of the user. Other current systems for illuminating a user include an external universal lighting system for use with a computer webcam. These systems often include a base clamping mechanism affixed to the electronic device with a light array adjustably connected to the base clamping mechanism for illuminating the user. A diffuser lens can be flexibly connected to the base clamping mechanism and sealingly positioned over the webcam for diffusing received light to try to create a clear image of the illuminated user prior to transmission over the communication network. However, these solutions are bulky and heavy. In addition, one trend in modern devices is to eliminate the bezel around the display, leaving no room to place conventional illuminators or most of the current systems used for illumination of a user. What is needed is a display with an integrated illuminator.
A device configured to include a display with an integrated illuminator, as outlined in
For example, if the captured video or picture indicates that the user is not being illuminated properly by the ambient light, the display illumination engine can increase the brightness from display 108 by adjusting one or more areas of display 108 to a full brightness configuration so the brightness of the light from the display can be used to increase the lighting on the user. More specifically, the top, a first side, an opposite second side, and/or bottom of the display can be configured to full brightness or illumination. In addition, the width of the portion or portions of the display that are configured to full brightness or illumination can be adjusted depending on the lighting that needs to be on the user.
Also, content or an image on the display can be resized to accommodate the one or more areas of the display that are adjusted to a full brightness configuration. In a specific example, the display includes a TCON and the TCON is configured to resize the content or image on the display and to adjust the brightness of the light from the display. This means that the resizing of the image on the display and the adjustment of the brightness of the light from the display are done on the backend and are not being done by a central processing unit of the electronic device or by a processor or logic on an SoC.
In an example implementation, electronic devices 100a-100c are meant to encompass a computer, a personal digital assistant (PDA), a laptop or electronic notebook, a cellular telephone, a mobile device, a smartphone, a tablet, a wearable, an Internet-of-Things (IoT) device, a network element, or any other device that includes a user facing camera and a display. Electronic devices 100a-100c may include any suitable hardware, software, components, modules, or objects that facilitate the operations thereof, as well as suitable interfaces for receiving, transmitting, and/or otherwise communicating data or information in a network environment. This may be inclusive of appropriate algorithms and communication protocols that allow for the effective exchange of data or information. Electronic devices 100a and 100b may include virtual elements.
In regards to the internal structure associated with electronic devices 100a-100c, electronic devices 100a-100c can include memory elements for storing information to be used in the operations outlined herein. Electronic devices 100a-100c may keep information in any suitable memory element (e.g., random access memory (RAM), read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), application specific integrated circuit (ASIC), etc.), software, hardware, firmware, or in any other suitable component, device, element, or object where appropriate and based on particular needs. Any of the memory items discussed herein should be construed as being encompassed within the broad term ‘memory element.’ Moreover, the information being used, tracked, sent, or received in electronic devices 100a-100c could be provided in any database, register, queue, table, cache, control list, or other storage structure, all of which can be referenced at any suitable timeframe. Any such storage options may also be included within the broad term ‘memory element’ as used herein.
In certain example implementations, the functions outlined herein may be implemented by logic encoded in one or more tangible media (e.g., embedded logic provided in an ASIC, digital signal processor (DSP) instructions, software (potentially inclusive of object code and source code) to be executed by a processor, or other similar machine, etc.), which may be inclusive of non-transitory computer-readable media. In some of these instances, memory elements can store data used for the operations described herein. This includes the memory elements being able to store software, logic, code, or processor instructions that are executed to carry out the activities described herein.
In an example implementation, elements of electronic devices 100a-100c may include software modules (e.g., display illumination engine 112, light detecting engine 122, video quality image engine 124, screen adjustment engine 126, light adjustment engine 128, and image on screen adjustment engine 130, etc.) to achieve, or to foster, operations as outlined herein. These modules may be suitably combined in any appropriate manner, which may be based on particular configuration and/or provisioning needs. In example embodiments, such operations may be carried out by hardware, implemented externally to these elements, or included in some other network device to achieve the intended functionality. Furthermore, the modules can be implemented as software, hardware, firmware, or any suitable combination thereof. These elements may also include software (or reciprocating software) that can coordinate with other network elements in order to achieve the operations, as outlined herein.
Additionally, electronic devices 100a-100c may include one or more processors that can execute software, logic, or an algorithm to perform activities as discussed herein. A processor can execute any type of instructions associated with the data to achieve the operations detailed herein. In one example, the processors could transform an element or an article (e.g., data) from one state or thing to another state or thing. In another example, the activities outlined herein may be implemented with fixed logic or programmable logic (e.g., software/computer instructions executed by a processor) and the elements identified herein could be some type of a programmable processor, programmable digital logic (e.g., a field programmable gate array (FPGA), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM)) or an ASIC that includes digital logic, software, code, electronic instructions, or any suitable combination thereof. Any of the potential processing elements, modules, and machines described herein should be construed as being encompassed within the broad term ‘processor.’
Turning to
If the captured video or picture indicates that the user lighting is insufficiently bright, display illumination engine 112 can increase the brightness from display 108 by adjusting one or more areas of display 108 to increase the lighting on the user. In one example, display illumination engine 112 can analyze a histogram of the captured video or image and determine if the exposure is above a predefined threshold. In some examples, light sensor 132 can be configured to help determine an amount of light or illumination that is on a user. In different embodiments, display illumination engine 112 can determine whether the lighting is sufficient based on the sensor data received from light sensor 132, the analysis of the captured video or picture, or both.
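As a hedged sketch of that check, the fragment below builds a luminance histogram, flags underexposure when most of the pixel mass falls in the dark bins, and optionally folds in a reading from light sensor 132; the bin cutoff, dark-fraction limit, and lux floor are assumptions for illustration only.

```python
import numpy as np
from typing import Optional

DARK_BIN_CUTOFF = 64        # luma bins below this count as "dark" (assumed)
DARK_FRACTION_LIMIT = 0.6   # >60% dark pixels => underexposed (assumed)
MIN_AMBIENT_LUX = 50.0      # assumed ambient-light floor for the sensor path

def is_underexposed(frame: np.ndarray, ambient_lux: Optional[float] = None) -> bool:
    """Histogram-based exposure check, optionally combined with a
    light-sensor reading, as the description above contemplates."""
    luma = (0.299 * frame[..., 0] + 0.587 * frame[..., 1]
            + 0.114 * frame[..., 2]).astype(np.uint8)
    hist, _ = np.histogram(luma, bins=256, range=(0, 256))
    dark_fraction = hist[:DARK_BIN_CUTOFF].sum() / hist.sum()
    too_dark = dark_fraction > DARK_FRACTION_LIMIT
    if ambient_lux is not None:
        too_dark = too_dark or ambient_lux < MIN_AMBIENT_LUX
    return bool(too_dark)
```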
Turning to
In some examples, display illumination engine 112 can increase the brightness from display 108 by adjusting one or more areas of display 108 to a full brightness configuration. More specifically, display illumination engine 112 can create illumination regions 120a-120c on display 108 and the lighting or illumination in illumination regions 120a-120c can be adjusted to a full brightness configuration. Also, display image 118 on display 108 can be resized to accommodate illumination regions 120a-120c on display 108. In a specific example, display 108 includes a TCON and the TCON is configured to resize display image 118 on display 108 and adjust the brightness and/or size of illumination regions 120a-120c. This means that the resizing of display image 118 on display 108 and adjusting the brightness and/or size of illumination regions 120a-120c is done on the backend and it is not being done by a central processing unit of electronic device 100a or by a processor or logic on an SoC.
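The geometry of such a resize can be sketched as follows, assuming rectangular illumination bands along the top and both sides (matching the general layout of illumination regions 120a-120c) and the kind of uniform, aspect-preserving scale a TCON could apply; the band widths and display dimensions are illustrative parameters, not values from this disclosure.

```python
from typing import Tuple

def content_rect(display_w: int, display_h: int,
                 top_band: int, side_band: int) -> Tuple[int, int, int, int]:
    """(x, y, width, height) left for content after carving illumination
    bands off the top and both sides of the display."""
    return (side_band, top_band,
            display_w - 2 * side_band, display_h - top_band)

def scale_to_fit(src_w: int, src_h: int,
                 dst_w: int, dst_h: int) -> Tuple[int, int]:
    """Uniformly scale content to fit the content rectangle while
    preserving its aspect ratio."""
    s = min(dst_w / src_w, dst_h / src_h)
    return int(src_w * s), int(src_h * s)

# Example: a 3840x2160 display with a 120 px top band and 100 px side
# bands leaves a 3640x2040 content rectangle; 3840x2160 content scales
# to 3626x2040 within it.
x, y, w, h = content_rect(3840, 2160, top_band=120, side_band=100)
print(scale_to_fit(3840, 2160, w, h))
```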
Turning to
In some examples, display illumination engine 112 can dynamically adjust the dimensions and/or location of the one or more regions of display 108 having a full brightness configuration. More specifically, display illumination engine 112 can dynamically adjust the placement and size of illumination regions 120a-120c on display 108 to create sufficient brightness to illuminate the user. In other examples, display illumination engine 112 can dynamically adjust the dimensions and placement of the illumination regions along with the brightness of the illumination regions to create sufficient brightness to illuminate the user while reducing user discomfort from the illumination. For example, illumination regions 120b and 120c in
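One way to read "dynamically adjust" is as a closed loop: widen (or brighten) the regions, re-measure the light on the user through the camera or light sensor, and stop once a target is met or a cap is reached. The sketch below assumes a caller-supplied measurement callback; none of these names or values come from this disclosure.

```python
from typing import Callable

def grow_band_until_lit(measure_lux: Callable[[int], float],
                        target_lux: float,
                        max_band_px: int = 240,
                        step_px: int = 20) -> int:
    """Widen the illumination band in steps, re-measuring the light on the
    user after each step, until the target is met or the cap is hit."""
    band = step_px
    while band < max_band_px and measure_lux(band) < target_lux:
        band += step_px
    return band

# Toy usage with a linear stand-in for the measurement callback:
band = grow_band_until_lit(lambda px: 0.5 * px, target_lux=60.0)
print(band)  # 120, since 0.5 * 120 >= 60
```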
Turning to
In some examples, display illumination engine 112 can cause the brightness from display 108 to be decreased by causing one or more areas of display 108 to be adjusted to a less than full brightness configuration or turned off. More specifically, display illumination engine 112 can turn off illumination regions 120b and 120c on display 108 and reduce the illumination from illumination region 120a. Also, display image 118 on display 108 can be resized to accommodate illumination region 120a and the absence of illumination regions 120b and 120c on display 108.
Turning to
Turning to
Light detecting engine 122 is configured to determine an amount of light or illumination that is on a user. For example, light detecting engine 122 can receive data from a light sensor (e.g., light sensor 132) that can be used to determine an amount of light or illumination that is on a user. Video quality image engine 124 can be configured to analyze a video image of the user and determine if the amount of light or illumination that is on the user is properly illuminating the user. For example, if video quality image engine 124 analyzes a video image of the user and determines that the image of the user is too dark or that the user is only partially illuminated, then the user is not properly illuminated.
Screen adjustment engine 126 can be configured to cause one or more illumination regions (e.g., illumination regions 120a-120c) to be located on display 108. For example, screen adjustment engine 126 can create illumination regions 120a-120c on display 108 by increasing pixel brightness to full pixel brightness in illumination regions 120a-120c, as illustrated in
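At the framebuffer level, the work attributed to screen adjustment engine 126 (and, for dimming, light adjustment engine 128) might look like the following sketch, where full brightness corresponds to a pixel level of 255 for 8-bit pixels; the region list and function shape are assumptions for illustration.

```python
import numpy as np
from typing import List, Tuple

Region = Tuple[int, int, int, int]  # (x, y, width, height)

def apply_illumination_regions(framebuffer: np.ndarray,
                               regions: List[Region],
                               level: int = 255) -> np.ndarray:
    """Paint each region of an (H, W, 3) framebuffer at the given pixel
    level; level=255 corresponds to the full brightness configuration."""
    for x, y, w, h in regions:
        framebuffer[y:y + h, x:x + w, :] = level
    return framebuffer

# Example: a top band plus left and right bands, like regions 120a-120c.
fb = np.zeros((2160, 3840, 3), dtype=np.uint8)
fb = apply_illumination_regions(fb, [(0, 0, 3840, 120),
                                     (0, 0, 100, 2160),
                                     (3740, 0, 100, 2160)])
```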
Turning to
Turning to
Elements of
Turning to the infrastructure of
In the system, network traffic, which is inclusive of packets, frames, signals, data, etc., can be sent and received according to any suitable communication messaging protocols. Suitable communication messaging protocols can include a multi-layered scheme such as the Open Systems Interconnection (OSI) model, or any derivations or variants thereof (e.g., Transmission Control Protocol/Internet Protocol (TCP/IP), user datagram protocol/IP (UDP/IP)). Messages through the network could be made in accordance with various network protocols (e.g., Ethernet, Infiniband, OmniPath, etc.). Radio signal communications over a cellular network may also be provided in the system. Suitable interfaces and infrastructure may be provided to enable communication with the cellular network.
The term “packet” as used herein, refers to a unit of data that can be routed between a source node and a destination node on a packet switched network. A packet includes a source network address and a destination network address. These network addresses can be Internet Protocol (IP) addresses in a TCP/IP messaging protocol. The term “data” as used herein, refers to any type of binary, numeric, voice, video, textual, or script data, or any type of source or object code, or any other suitable information in any appropriate format that may be communicated from one point to another in electronic devices and/or networks. The data may help determine a status of a network element or network. Additionally, messages, requests, responses, and queries are forms of network traffic, and therefore, may comprise packets, frames, signals, data, etc.
Turning to
If the lighting for the user is acceptable, then the system returns to 602 and a (new) image of the user is analyzed. If the lighting for the user is not acceptable, then the display lighting is adjusted, as in 606. For example, if the system determines that the user is not properly illuminated, then the system can increase the brightness and/or the intensity of one or more illumination regions on the display by adjusting one or more areas of display 108 to a full brightness configuration. Also, if the system determines that the user is illuminated too much, then one or more illumination regions may be removed from the display and/or the intensity of one or more illumination regions can be decreased. In an example, the system can use machine learning or analysis to help determine if the settings are acceptable to the user and if the user is illuminated too much and/or the illumination causes discomfort to the user. For example, if the system detects that the user is squinting at the display, machine learning or analysis can determine that the illumination is too bright and/or causes discomfort to the user. The machine learning can adjust the illumination setting by reconfiguring the illumination region. The reconfiguration can include moving the position of the illumination region, changing the size and/or shape of the illumination region, adjusting the brightness and/or the intensity of the illumination region, and/or some other reconfiguration of the illumination region. In addition, the machine learning or analysis can be used to determine a user's preference for the location and/or brightness of the illumination regions.
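The loop implied by 602, 604, and 606 can be sketched with caller-supplied callbacks, including a placeholder for the machine-learning comfort check described above; every interface name here is an assumption rather than an API defined by this disclosure.

```python
import time
from typing import Any, Callable

def illumination_loop(capture: Callable[[], Any],
                      lighting_acceptable: Callable[[Any], bool],
                      adjust_lighting: Callable[[Any], None],
                      causes_discomfort: Callable[[Any], bool],
                      reconfigure_regions: Callable[[], None],
                      period_s: float = 1.0,
                      max_iterations: int = 10) -> None:
    """Periodic analyze/check/adjust loop (602 -> 604 -> 606)."""
    for _ in range(max_iterations):
        frame = capture()                    # 602: capture and analyze image
        if not lighting_acceptable(frame):   # 604: lighting acceptable?
            adjust_lighting(frame)           # 606: adjust display lighting
        if causes_discomfort(frame):         # e.g., squint detected by ML
            reconfigure_regions()            # move/resize/dim the regions
        time.sleep(period_s)
```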
Turning to
Turning to
Turning to
As illustrated in
Processors 902a and 902b may also each include integrated memory controller logic (MC) 908a and 908b respectively to communicate with memory elements 910a and 910b. Memory elements 910a and/or 910b may store various data used by processors 902a and 902b. In alternative embodiments, memory controller logic 908a and 908b may be discrete logic separate from processors 902a and 902b.
Processors 902a and 902b may be any type of processor and may exchange data via a point-to-point (PtP) interface 912 using point-to-point interface circuits 914a and 914b respectively. Processors 902a and 902b may each exchange data with a chipset 916 via individual point-to-point interfaces 918a and 918b using point-to-point interface circuits 920a-920d. Chipset 916 may also exchange data with a high-performance graphics circuit 922 via a high-performance graphics interface 924, using an interface circuit 926, which could be a PtP interface circuit. In alternative embodiments, any or all of the PtP links illustrated in
Chipset 916 may be in communication with a bus 928 via an interface circuit 930. Bus 928 may have one or more devices that communicate over it, such as a bus bridge 932 and I/O devices 934. Via a bus 936, bus bridge 932 may be in communication with other devices such as a keyboard/mouse 938 (or other input devices such as a touch screen, trackball, etc.), communication devices 940 (such as modems, network interface devices, or other types of communication devices that may communicate through a network), audio I/O devices 942, and/or a data storage device 944. Data storage device 944 may store code 946, which may be executed by processors 902a and/or 902b. In alternative embodiments, any portions of the bus architectures could be implemented with one or more PtP links.
The computer system depicted in
Turning to
In this example of
Ecosystem SOC 1000 may also include a subscriber identity module (SIM) I/F 1018, a boot read-only memory (ROM) 1020, a synchronous dynamic random-access memory (SDRAM) controller 1022, a flash controller 1024, a serial peripheral interface (SPI) master 1028, a suitable power control 1030, a dynamic RAM (DRAM) 1032, and flash 1034. In addition, one or more embodiments include one or more communication capabilities, interfaces, and features such as instances of Bluetooth™ 1036, a 3G modem 1038, a global positioning system (GPS) 1040, and an 802.11 Wi-Fi 1042.
In operation, the example of
Processor core 1100 can also include execution logic 1114 having a set of execution units 1116-1 through 1116-N. Some embodiments may include a number of execution units dedicated to specific functions or sets of functions. Other embodiments may include only one execution unit or one execution unit that can perform a particular function. Execution logic 1114 performs the operations specified by code instructions.
After completion of execution of the operations specified by the code instructions, back-end logic 1118 can retire the instructions of code 1104. In one embodiment, processor core 1100 allows out of order execution but requires in order retirement of instructions. Retirement logic 1120 may take a variety of known forms (e.g., re-order buffers or the like). In this manner, processor core 1100 is transformed during execution of code 1104, at least in terms of the output generated by the decoder, hardware registers and tables utilized by register renaming logic 1110, and any registers (not shown) modified by execution logic 1114.
Although not illustrated in
It is important to note that the operations in the preceding flow diagram (i.e.,
Although the present disclosure has been described in detail with reference to particular arrangements and configurations, these example configurations and arrangements may be changed significantly without departing from the scope of the present disclosure. Moreover, certain components may be combined, separated, eliminated, or added based on particular needs and implementations. Additionally, although electronic devices 100a-100c have been illustrated with reference to particular elements and operations that facilitate the communication process, these elements and operations may be replaced by any suitable architecture, protocols, and/or processes that achieve the intended functionality of electronic devices 100a-100c.
Numerous other changes, substitutions, variations, alterations, and modifications may be ascertained to one skilled in the art and it is intended that the present disclosure encompass all such changes, substitutions, variations, alterations, and modifications as falling within the scope of the appended claims. In order to assist the United States Patent and Trademark Office (USPTO) and, additionally, any readers of any patent issued on this application in interpreting the claims appended hereto, Applicant wishes to note that the Applicant: (a) does not intend any of the appended claims to invoke paragraph six (6) of 35 U.S.C. section 112 as it exists on the date of the filing hereof unless the words “means for” or “step for” are specifically used in the particular claims; and (b) does not intend, by any statement in the specification, to limit this disclosure in any way that is not otherwise reflected in the appended claims.
Example A1 is an electronic device including a user facing camera to capture a video stream of a user, a display presenting content, and display illumination logic to determine the user is insufficiently illuminated in the video stream, in response to the determination, reconfigure a first portion of the display as an illumination region and a second portion of the display as a content region, and increase a brightness of one or more pixels in the illumination region to better illuminate the user.
In Example A2, the subject matter of Example A1 can optionally include where reconfiguring the first portion of the display as an illumination region and the second portion of the display as a content region includes scaling the content to fit within the second portion of the display.
In Example A3, the subject matter of any one of Examples A1-A2 can optionally include where the display includes micro light emitting diodes (microLEDs) and the illumination region of the display includes microLEDs at full brightness.
In Example A4, the subject matter of any one of Examples A1-A3 can optionally include where the video stream from the user facing camera is analyzed to determine that the user is insufficiently illuminated in the video stream.
In Example A5, the subject matter of any one of Examples A1-A4 can optionally include where dimensions and location of the illumination region are dependent on a current illumination of the user.
In Example A6, the subject matter of any one of Examples A1-A5 can optionally include where the brightness of the one or more pixels in the illumination region is dependent on a current illumination of the user.
In Example A7, the subject matter of any one of Examples A1-A6 can optionally include a light sensor, where output from the light sensor is used to determine that the user is insufficiently illuminated.
In Example A8, the subject matter of any one of Examples A1-A7 can optionally include where the content is a video conference.
In Example A9, the subject matter of any one of Examples A1-A8 can optionally include where the illumination region is a ring shape surrounding the content region to simulate a ring light.
Example M1 is a method including capturing content using a user facing camera, displaying the content on a display, determining that the content is insufficiently illuminated, in response to the determination, reconfiguring a first portion of the display as an illumination region and a second portion of the display as a content region, and increasing a brightness of one or more pixels in the illumination region to better illuminate the content.
In Example M2, the subject matter of Example M1 can optionally include scaling the content to fit within the content region of the display when the first portion of the display is reconfigured as an illumination region and the second portion of the display is reconfigured as a content region.
In Example M3, the subject matter of any one of the Examples M1-M2 can optionally include where the display includes micro light emitting diodes (microLEDs) and the illumination region of the display includes microLEDs at full brightness.
In Example M4, the subject matter of any one of the Examples M1-M3 can optionally include where the captured content is a video stream of a user and the video stream from the user facing camera is analyzed to determine that the user is insufficiently illuminated in the video stream.
In Example M5, the subject matter of any one of the Examples M1-M4 can optionally include where dimensions and location of the illumination region are dependent on a current illumination of the user.
Example S1 is a system including one or more processors, a user facing camera to capture a video stream of a user, a display presenting content, and display illumination logic. The display illumination logic can cause the one or more processors to, determine the user is insufficiently illuminated in the video stream, in response to the determination, reconfigure a first portion of the display as an illumination region and a second portion of the display as a content region, and increase a brightness of one or more pixels in the illumination region to better illuminate the user.
In Example S2, the subject matter of Example S1 can optionally include where reconfiguring the first portion of the display as an illumination region and the second portion of the display as a content region includes scaling the content to fit within the second portion of the display.
In Example S3, the subject matter of any one of the Examples S1-S2 can optionally include where the display includes micro light emitting diodes (microLEDs) and the illumination region of the display includes microLEDs at full brightness.
In Example S4, the subject matter of any one of the Examples S1-S3 can optionally include where the video stream from the user facing camera is analyzed to determine that the user is insufficiently illuminated in the video stream.
In Example S5, the subject matter of any one of the Examples S1-S4 can optionally include where dimensions and location of the illumination region are dependent on a current illumination of the user.
In Example S6, the subject matter of any one of the Examples S1-S5 can optionally include a light sensor, where output from the light sensor is used to determine that the user is insufficiently illuminated.