This disclosure relates generally to optics, and in particular to a head mounted device.
A head mounted device is a wearable electronic device, typically worn on the head of a user. Head mounted devices may include one or more electronic components for use in a variety of applications, such as gaming, aviation, engineering, medicine, entertainment, activity tracking, and so on. Head mounted devices may include a display to present virtual images to a wearer of the head mounted device. When a head mounted device includes a display, it may be referred to as a head mounted display. Head mounted devices may have user inputs so that a user can control one or more operations of the head mounted device.
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Embodiments of adaptive control of optical transmission in augmented reality (AR) devices are described herein. In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
A head mounted device (and related method) for adaptive control of optical transmission, as provided in this disclosure, addresses a situation, such as in an augmented reality (AR) implementation, in which a virtual image is superimposed over a scene of an environment external to the head mounted device. Due to the brightness level of scene light (e.g., ambient light) in the scene, it may be difficult for a user of the head mounted device to see the details of the virtual image in the field of view (FOV) of the head mounted device, for example, if a high brightness level of the scene light reduces the contrast of the virtual image with respect to the scene. Accordingly, the head mounted device is provided with the capability to dim the scene light that propagates through the head mounted device, so that the scene light can be dimmed when needed in an adaptive and dynamic manner, thereby improving the contrast and overall visibility of the virtual image.
Determining whether dimming is appropriate may be based on a plurality of inputs to processing logic provided by a corresponding plurality of sensors. These sensors may include an ambient light sensor, a display brightness sensor, a stack transmission sensor, a temperature sensor, an eye-tracking camera, and so forth. For instance, a head mounted device may include a light sensor configured to generate light data in response to measuring scene light in an external environment of the head mounted device, a display configured to present a virtual image to an eyebox area of the head mounted device, a near-eye dimming element configured to modulate a transmission of the scene light to the eyebox area in response to a transmission command, and processing logic configured to adjust the transmission command of the dimming element in response to a brightness level of the virtual image and the light data generated by the light sensor.
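As a minimal illustration (not part of this disclosure), the sensor readings enumerated above might be bundled into a single structure consumed by the processing logic; all field names and units below are hypothetical assumptions:

```python
from dataclasses import dataclass

@dataclass
class SensorInputs:
    """One sample of the sensor inputs consumed by the processing logic.

    All field names and units are illustrative assumptions.
    """
    ambient_lux: float          # ambient light sensor reading (lux)
    display_nits: float         # measured display brightness (nits)
    stack_transmission: float   # fraction of scene light transmitted (0..1)
    temperature_c: float        # dimming-element temperature (Celsius)
    pupil_diameter_mm: float    # from the eye-tracking camera

# Example: a bright outdoor scene with the display at moderate brightness.
sample = SensorInputs(
    ambient_lux=20_000.0,
    display_nits=500.0,
    stack_transmission=0.85,
    temperature_c=31.0,
    pupil_diameter_mm=2.4,
)
```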
By using the information/data from these sensors in combination, the processing logic for the head mounted device is able to more accurately monitor brightness in the scene and in the display, determine whether some adjustment to the dimming element and/or to the display is needed in order to achieve an appropriate contrast result, perform the adjustments, etc., with the monitoring, determinations, and adjustments being performed in an automatic and more efficient manner as the user moves within or between scenes, views different/multiple virtual images, experiences scene changes, etc. These and other embodiments are described in more detail in connection with
Cameras 108A and 108B may image the eyebox region directly or indirectly. For example, optical elements 110A and/or 110B may have an optical combiner that is configured to redirect light from the eyebox to the cameras 108A and/or 108B. In some implementations, near-infrared light sources (e.g., LEDs or vertical-cavity surface-emitting lasers) illuminate the eyebox region with near-infrared illumination light, and cameras 108A and/or 108B are configured to capture infrared images. Cameras 108A and/or 108B may include a complementary metal-oxide semiconductor (CMOS) image sensor. A near-infrared filter that receives a narrow-band near-infrared wavelength may be placed over the image sensor so that the image sensor is sensitive to the narrow-band near-infrared wavelength while rejecting visible light and wavelengths outside the narrow-band. The near-infrared light sources may emit the narrow-band wavelength that is passed by the near-infrared filters.
Sensor 160 is positioned on frame 102, and/or positioned on or otherwise proximate to either or both of optical elements 110A and 110B, or elsewhere in head mounted device 100. Sensor(s) 160 may include one or more of an ambient light sensor (including an RGB camera, a monochromatic camera, a photodiode, etc.) or a temperature sensor. As will be described below, the data provided by sensor(s) 160 may be used by processing logic to control dimming or to otherwise control characteristics (such as brightness, contrast, etc.) of head mounted device 100 with respect to a scene and a virtual image that are presented in a field of view of head mounted device 100.
While
When head mounted device 100 includes a display, it may be considered to be a head mounted display. Head mounted device 100 may be considered to be an augmented reality (AR) head mounted display. While
Illumination layer 130A is shown as including a plurality of in-field illuminators 126. In-field illuminators 126 are described as “in-field” because they are in a field of view (FOV) of a user of the head mounted device 100. In-field illuminators 126 may be in a same FOV in which a user views a display of the head mounted device 100, in an embodiment. In-field illuminators 126 may be in a same FOV in which a user views an external environment of the head mounted device 100 via scene light 191 propagating through near-eye optical elements 110. Scene light 191 is from the external environment of head mounted device 100. While in-field illuminators 126 may introduce minor occlusions into the near-eye optical element 110A, the in-field illuminators 126, as well as their corresponding electrical routing, may be so small as to be unnoticeable or insignificant to a wearer of head mounted device 100. In some implementations, however, illuminators 126 are not in-field; rather, illuminators 126 could be out-of-field.
As shown in
As shown in
Optically transparent layer 120A is shown as being disposed between the illumination layer 130A and the eyeward side 109 of the near-eye optical element 110A. The optically transparent layer 120A may receive the infrared illumination light emitted by the illumination layer 130A and pass the infrared illumination light to illuminate the eye of the user. As mentioned above, the optically transparent layer 120A may also be transparent to visible light, such as scene light 191 received from the environment and/or image light 141 received from the display layer 140A. In some examples, the optically transparent layer 120A has a curvature for focusing light (e.g., display light and/or scene light) to the eye of the user. Thus, the optically transparent layer 120A may, in some examples, be referred to as a lens. In some aspects, the optically transparent layer 120A has a thickness and/or curvature that corresponds to the specifications of a user. In other words, the optically transparent layer 120A may be a prescription lens. However, in other examples, the optically transparent layer 120A may be a non-prescription lens.
Transparency modulator layer 150A may be superimposed over display layer 140A at a backside 111, such that transparency modulator layer 150A is facing a scene that is being viewed by the user in the FOV of head mounted device 100. According to various embodiments, transparency modulator layer 150A may include a dimming element that is configured to control an amount (e.g., intensity) of scene light 191 that is transmitted through optical element 110A. The dimming element may be controlled to reduce or increase an intensity of scene light 191, so as to provide an appropriate contrast between a scene and a virtual image that are presented in a FOV of head mounted device 100.
For example,
Therefore,
According to various embodiments that will be described later below, a region of interest (ROI) may be defined for virtual image 212, such that the amount of dimming may be performed dependent upon whether the ROI is positioned over a relatively brighter area of scene 202. The ROI can have, for example, a size and shape that generally corresponds to the external outline of virtual image 212 (e.g., a ROI in the shape of a tiger). As another example, the ROI can have a more general shape, such as a rectangle, box, ellipse, polygon, etc. that encompasses the external outline of virtual image 212.
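As a minimal sketch of the second approach, assuming the virtual image is supplied as an RGBA pixel array whose alpha channel marks visible content, a rectangular ROI encompassing the virtual image's outline might be computed as follows; the function name and margin parameter are hypothetical:

```python
import numpy as np

def bounding_box_roi(virtual_image_rgba: np.ndarray, margin: int = 4):
    """Return a rectangular ROI (x0, y0, x1, y1) that encompasses the
    visible (non-transparent) pixels of a virtual image.

    Assumes the virtual image is an RGBA array whose alpha channel
    marks where content will be overlaid on the scene.
    """
    alpha = virtual_image_rgba[..., 3]
    ys, xs = np.nonzero(alpha > 0)
    if xs.size == 0:
        return None  # nothing to overlay, so no ROI
    h, w = alpha.shape
    x0 = max(int(xs.min()) - margin, 0)
    y0 = max(int(ys.min()) - margin, 0)
    x1 = min(int(xs.max()) + margin, w - 1)
    y1 = min(int(ys.max()) + margin, h - 1)
    return (x0, y0, x1, y1)
```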
Head mounted device 400 may include an optical element 410 that includes a transparency modulator layer 450, a display layer 440, and an illumination layer 430. Additional optical layers (not specifically illustrated) may also be included in example optical element 410. For example, a focusing lens layer may optionally be included in optical element 410 to focus scene light 456 and/or virtual images included in image light 441 generated by display layer 440. Transparency modulator layer 450 (which includes a dimming element) modulates the intensity of incoming scene light 456 so that the scene light 459 that propagates to eyebox region 201 may have a reduced intensity when compared to the intensity of incoming scene light 456.
Display layer 440 presents virtual images in image light 441 to an eyebox region 201 for viewing by an eye 203. Processing logic 470 is configured to drive virtual images onto display layer 440 to present image light 441 to eyebox region 201. Processing logic 470 is also configured to adjust a brightness of display layer 440. In some implementations, adjusting a display brightness of display layer 440 includes adjusting the intensity of one or more light sources of display layer 440. All or a portion of display layer 440 may be transparent or semi-transparent to allow scene light 456 from an external environment to become incident on eye 203 so that a user can view their external environment in addition to viewing virtual images presented in image light 441, such as described above with respect to
Transparency modulator layer 450 may be configured to change its transparency to modulate the intensity of scene light 456 that propagates to the eye 203 of a user. Processing logic 470 may be configured to drive an analog or digital signal onto transparency modulator layer 450 in order to modulate the transparency of transparency modulator layer 450. In an example implementation, transparency modulator layer 450 includes a dimming element comprised of liquid crystals wherein the alignment of the liquid crystals is adjusted in response to a drive signal from processing logic 470 to modulate the transparency of transparency modulator layer 450. Other suitable technologies that allow for electronically and/or optically controlled dimming of the dimming element may be included in transparency modulator layer 450. Example technologies may include, but are not limited to, electrically activated guest host liquid crystal technology in which a guest host liquid crystal coating is present on a lens surface, photochromic dye technology in which photochromic dye embedded within a lens is activated by ultraviolet (UV) or blue light, or other dimming technologies that enable controlled dimming through electrical, optical, mechanical, and/or other activation techniques.
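As an illustrative sketch only, a drive signal for such a dimming element might be derived by interpolating a factory-measured calibration curve; the voltages and transparency values below are invented placeholders, since the true response depends on the dimming technology used:

```python
import numpy as np

# Hypothetical factory calibration: drive voltage vs. resulting transparency.
# A real liquid-crystal dimming element would have its own measured curve.
_CAL_VOLTS = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
_CAL_TRANSPARENCY = np.array([0.90, 0.75, 0.55, 0.35, 0.20, 0.10])

def drive_signal_for_transparency(target: float) -> float:
    """Map a target transparency (0..1) to a drive voltage by
    interpolating the calibration curve (clamped to its range)."""
    target = float(np.clip(target, _CAL_TRANSPARENCY.min(),
                           _CAL_TRANSPARENCY.max()))
    # np.interp needs increasing x, so interpolate on the reversed curve.
    return float(np.interp(target, _CAL_TRANSPARENCY[::-1], _CAL_VOLTS[::-1]))
```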
Illumination layer 430 includes light sources 426 configured to illuminate an eyebox region 201 with infrared illumination light 427. Illumination layer 430 may include a transparent refractive material that functions as a substrate for light sources 426. Infrared illumination light 427 may be near-infrared illumination light. Camera 477 is configured to directly image eye 203, in the illustrated example of
Camera 477 may include a complementary metal-oxide semiconductor (CMOS) image sensor, in some implementations. An infrared filter that receives a narrow-band infrared wavelength may be placed over the image sensor so that it is sensitive to the narrow-band infrared wavelength while rejecting visible light and wavelengths outside the narrow-band. Infrared light sources (e.g., light sources 426) such as infrared LEDs or infrared VCSELs that emit the narrow-band wavelength may be oriented to illuminate eye 203 with the narrow-band infrared wavelength. Camera 477 may capture eye-tracking images of eyebox region 201. Eyebox region 201 may include eye 203 as well as surrounding features in an ocular area such as eyebrows, eyelids, eye lines, etc. Processing logic 470 may initiate one or more image captures with camera 477, and camera 477 may provide eye-tracking images 479 to processing logic 470. Processing logic 470 may perform image processing to determine the size and/or position of various features of the eyebox region 201. For example, processing logic 470 may perform image processing to determine a pupil position or pupil size of pupil 266. Light sources 426 and camera 477 are merely an example eye-tracking configuration, and other suitable eye-tracking systems and techniques may also be used to capture eye data, in implementations of the disclosure.
In the illustrated implementation of
Ambient light sensor 423 may be comprised of a 2D sensor (e.g., a camera) capable of mapping a solid angle FOV onto a 2D pixel array. Multiple such 2D sensors (cameras) may be provided, each of which may include optical elements, modules, data readout, analog-to-digital converters, etc. Ambient light sensor 423 may also be sensitive to color and brightness of a scene, thereby mapping the scene accurately across the spectral range. Ambient light sensor 423 may also be polarization-sensitive and thereby capable of detecting S versus P polarized light, and may be configured to capture and transmit data at frame rates on the same order of magnitude as the display frame rate.
In the illustrated implementation, processing logic 470 is configured to receive ambient light measurement 429 from ambient light sensor 423. Processing logic 470 may also be communicatively coupled to ambient light sensor 423 to initiate the ambient light measurement.
In some embodiments, transparency modulator layer 450 is made up of one or more materials that are sensitive to temperature, such that temperature changes (e.g., increases or decreases in temperature due to ambient temperature, incident energy such as sunlight, heat generated during operation, etc.) may affect the transparency performance (e.g., light transmission capability) of the dimming element. Hence, a temperature sensor 431 can be provided in/on or near transparency modulator layer 450 so as to detect the temperature of transparency modulator layer 450, and to provide a corresponding temperature measurement 432 to processing logic 470.
Furthermore, in some embodiments, a display brightness sensor 433 may be provided within, behind, or in front of display layer 440 so as to sense/measure the brightness of display layer 440, and then provide a corresponding display brightness measurement 434 to processing logic 470. For example, the brightness of display layer 440 can typically be determined by processing logic 470 by knowing the input power provided to display layer 440 and then comparing this input power with known brightness values (such as via a lookup table). The contents of the lookup table and other known values may be derived from factory settings or other known characteristics of display layer 440 at the time of manufacture.
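A minimal sketch of such a lookup-table estimate, with invented calibration values standing in for the factory data, might look like the following:

```python
import numpy as np

# Hypothetical factory lookup table: display input power (mW) -> brightness (nits).
_POWER_MW = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
_BRIGHTNESS_NITS = np.array([80.0, 180.0, 400.0, 850.0, 1700.0])

def estimate_display_brightness(input_power_mw: float) -> float:
    """Estimate display brightness from input power by interpolating
    factory calibration data, as described above."""
    return float(np.interp(input_power_mw, _POWER_MW, _BRIGHTNESS_NITS))
```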
However, the brightness characteristics/performance of display layer 440 may change over time and with age/use. Thus, display brightness sensor 433 provides a more accurate/true and real-time brightness value for display layer 440.
Display brightness sensor 433 may be positioned at any one or more locations that are suitable to determine the brightness of display layer 440. For example, display brightness sensor 433 may be located at an input and/or output of a waveguide (e.g., waveguide 158A in
In operation, transparency modulator layer 450 may be driven to various transparency values by processing logic 470 in response to one or more of eye data, ambient light measurements 429, temperature measurement 432, display brightness measurement 434 and/or other display brightness data, or other input(s) or combinations thereof. By way of example, a pupil diameter of an eye may indicate that scene light 456 is brighter than the user prefers, or the ambient light sensor 423 may indicate that the intensity of scene light 456 is too high, such that the user may have difficulty viewing a virtual image in a scene. Other measurements of an ocular region of the user (e.g., dimensions of the eyelids, the sclera, the number of lines in corner region 263, etc.) may indicate that the user is squinting and that scene light 456 may be brighter than the user prefers. Inputs from the temperature sensor 431 and display layer 440 may also be received at processing logic 470. Thus, transparency modulator layer 450 may be driven by processing logic 470 to a transparency that makes the user more comfortable with the intensity of scene light 459 that propagates through transparency modulator layer 450, and/or driven to a transparency that changes an intensity of scene light 456 so as to improve the visibility of virtual image(s) superimposed on a scene. The transparency of transparency modulator layer 450 may be modulated to various levels, for example between 10% transparent and 90% transparent (or other ranges), in response to the eye data, the ambient light measurement, the display brightness, and so forth.
The order in which some or all of the process blocks and related components appear in process 500 (and in any other process/method disclosed herein) should not be deemed limiting. Rather, one of ordinary skill in the art having the benefit of the present disclosure will understand that some of the process blocks may be executed in a variety of orders not illustrated, or even in parallel. Furthermore, some process blocks may be modified, combined, eliminated, or supplemented with additional process blocks.
For the process 500 of
Display 508 may be operated/controlled by a display controller 512. Display 508 is configured to present a virtual image (monocularly or binocularly) to an eyebox area (e.g., the area of the eye 504) of the head mounted device, and is configured to adjust a brightness level of the virtual image in response to commands from display controller 512.
An ambient light sensor 516 is configured to generate light data in response to measuring light at scene 502 in the external environment of the head mounted device. In operation, ambient light sensor 516 provides the light data or other signals to a processing kernel 518. Processing kernel 518 may be a signal processing kernel, for example, that is part of the processing logic (e.g., processing logic 470 in
With respect to dimming element 506, dimming controller 514 controls (e.g., electrically, optically, etc.) the transmission characteristics (e.g., amount of dimming) of dimming element 506. Based on the control signals provided by dimming controller 514 to dimming element 506, the processing logic is able to estimate a stack transmission at a process block 524, such as via a lookup table that contains factory calibration information. This estimate of the stack transmission is provided as a second input to process block 522. Stack transmission may be estimated/measured in a more accurate manner, as compared to using factory calibration information, using a stack transmission sensor that will be described further below in
Analogously to the dimming controller 514, display controller 512 provides control signals and/or other signals to display 508. Based on the signal(s) provided by display controller 512 to display 508, the processing logic is able to estimate display brightness at a process block 526, such as via a lookup table that contains factory calibration information. This estimate of the display brightness is provided as a third input to process block 522. Display brightness may be estimated/measured in a more accurate manner, as compared to using factory calibration information, using a display brightness sensor that will be described further below in
In process block 522, which may also form part of the processing logic, a contrast or contrast value for the virtual content (e.g., one or more virtual images) is computed based on at least some of the above-described first, second, and third inputs. The contrast value may represent an amount of visibility or clarity of the virtual content relative to the scene 502. Example formulas for computing the contrast value may be the following:
contrast = 1 + display/scene, wherein display and scene are the respective brightness values of display 508 and scene 502 in nits or lux, or
contrast = 1 + display/(transmittance × scene × reflectance), wherein transmittance is the stack transmission computed at process block 524 and reflectance represents the reflectivity of the transparency modulator layer.
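Expressed as code, a minimal sketch of these two formulas (with function names and nits-based units as assumptions) might be:

```python
def contrast_simple(display_nits: float, scene_nits: float) -> float:
    """contrast = 1 + display / scene."""
    return 1.0 + display_nits / scene_nits

def contrast_with_stack(display_nits: float, scene_nits: float,
                        transmittance: float, reflectance: float) -> float:
    """contrast = 1 + display / (transmittance * scene * reflectance)."""
    return 1.0 + display_nits / (transmittance * scene_nits * reflectance)
```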
The contrast value may be compared to a threshold, in which contrast values below the threshold would require adjustment (e.g., dimming) of the optical transmission of dimming element 506, and contrast values above the threshold (and up to a certain maximum value) would require little or no adjustment of the optical transmission of dimming element 506.
The contrast threshold may differ based on the use case. For example, the threshold may be different for a use case in which the scene is indoors versus outdoors; a use case for virtual reality (VR) versus augmented reality (AR); a use case in which a scene is inside a bright room versus a scene in a relatively darker room; etc. The various thresholds for contrast values may be stored in a lookup table and used at a process block 528.
In process block 528, the processing logic determines whether the computed contrast value is greater than the threshold. If the computed contrast value is greater than the threshold (“YES” at process block 528), then nothing is done at process block 530 (e.g., no change is made to the optical transmission of dimming element 506). The processing logic may then repeat process 500 described above for another set of first, second, third inputs.
If, however, the computed contrast is determined to be less than the threshold (“NO” at process block 528), then the processing logic checks at a block 532 as to whether the brightness of display 508 may be increased so as to increase the contrast. For instance, the processing logic checks whether the brightness of display 508 is below a maximum value, and if below (“YES” at process block 532), the processing logic instructs display controller 512 to increase the brightness by changing an amount or other value (e.g., amplitude and/or direction) of electrical actuation or by making other changes to the electrical input(s) to display 508.
If, however, the brightness of display 508 is unable to be increased any further (“NO” at process block 532), then the processing logic changes the optical transmission of dimming element 506 at a process block 534. For instance, the processing logic instructs dimming controller 514 to increase the dimming of dimming element 506, by changing an amount of electrical/optical actuation or by making other changes to the electrical/optical input(s) to dimming element 506 (e.g., changing the value of an actuation signal, such as amplitude and/or direction values). The change in transmission can vary from 0% to 100%, and may be applied to the entire visible spectrum. Furthermore, the change in transmission can happen at different transition times, and the rate of the transition can be manipulated as appropriate in various embodiments.
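The decision flow of process blocks 528 through 534 can be summarized in a short sketch; the step sizes and the return convention are illustrative assumptions rather than values from this disclosure:

```python
def adjust_for_contrast(contrast: float, threshold: float,
                        display_nits: float, max_display_nits: float,
                        transmission: float, step_nits: float = 50.0,
                        step_transmission: float = 0.05):
    """One iteration of the decision flow described above: if contrast is
    above the threshold, do nothing; otherwise try raising display
    brightness first, and only dim the scene once brightness is maxed out.

    Returns updated (display_nits, transmission). Step sizes are
    illustrative assumptions.
    """
    if contrast > threshold:
        return display_nits, transmission          # block 530: do nothing
    if display_nits < max_display_nits:             # block 532: headroom left?
        return min(display_nits + step_nits, max_display_nits), transmission
    # Block 534: no brightness headroom, so increase dimming
    # (i.e., reduce the optical transmission of the dimming element).
    return display_nits, max(transmission - step_transmission, 0.0)
```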
The process 500 then repeats as described above for another set of first, second, and third inputs.
As previously explained above with respect to
Therefore, to improve the detection of bright areas that are actually visible to the user, another embodiment uses an RGB camera as ambient light sensor 516 and uses an image processing kernel as processing kernel 518. As such, the effect of IR lighting is more effectively filtered out from scene 502, and the detection of visible bright areas (on which a virtual image is superimposed) can be improved by treating the outline of the virtual image as a region of interest (ROI) at the bright area(s) of scene 502.
In such an embodiment, the computation of brightness at process block 520 may involve considering the average brightness of scene 502, the peak brightness of scene 502, the average brightness over the ROI, the peak brightness over the ROI, the variance in brightness over the ROI, and/or other factors.
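A minimal sketch of these ROI statistics, assuming a per-pixel luminance map derived from the RGB camera and a rectangular ROI, might be:

```python
import numpy as np

def roi_brightness_stats(luminance: np.ndarray, roi) -> dict:
    """Brightness statistics used in the contrast computation: scene-wide
    average and peak, plus average, peak, and variance over the ROI.

    `luminance` is a 2D per-pixel brightness map derived from the RGB
    camera; `roi` is an (x0, y0, x1, y1) rectangle.
    """
    x0, y0, x1, y1 = roi
    patch = luminance[y0:y1 + 1, x0:x1 + 1]
    return {
        "scene_mean": float(luminance.mean()),
        "scene_peak": float(luminance.max()),
        "roi_mean": float(patch.mean()),
        "roi_peak": float(patch.max()),
        "roi_variance": float(patch.var()),
    }
```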
In process block 602, compensation of photopic sensitivity of the user is performed on the brightness of scene 502 that was computed at process block 520, and the result is provided as the first input to process block 522 for the contrast computation. For example, some users (e.g., as they age) may have visual sensitivities to certain colors under different lighting conditions.
Thus at process block 602, compensation may be performed by multiplying/scaling the computed brightness by a photopic sensitivity curve. For instance, the brightness may be computed at process block 520 based at least on the average brightness of scene 502, the peak brightness of scene 502, the peak brightness over the ROI, and the variance in brightness over the ROI, and then multiplied at process block 602 by one or more values in a photopic sensitivity curve that corresponds to the user.
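As one hedged illustration, a photopically weighted luminance map can be derived from RGB camera data using the standard Rec. 709 luma coefficients (which approximate the eye's photopic response), with a user-specific scale factor standing in for the per-user sensitivity curve described above:

```python
import numpy as np

# Rec. 709 luma coefficients approximate the standard photopic response
# of the eye; a per-user curve could replace or rescale these weights.
_PHOTOPIC_RGB_WEIGHTS = np.array([0.2126, 0.7152, 0.0722])

def photopic_luminance(rgb_image: np.ndarray,
                       user_scale: float = 1.0) -> np.ndarray:
    """Convert an RGB camera frame to a photopically weighted luminance
    map, optionally scaled by a user-specific sensitivity factor."""
    return user_scale * rgb_image.astype(np.float64) @ _PHOTOPIC_RGB_WEIGHTS
```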
In process block 702, the processing logic obtains/computes a running average of scene 502 over the last N frames of images taken by the RGB camera, wherein N may be an integer greater than 1. One purpose of taking the running average is to provide increased robustness against flickering light in scene 502.
For example, there may be a latency between when scene brightness is computed (for a single frame) and when the transmittance of dimming element 506 is adjusted based on that computed brightness. Due to the latency and if flickering light is present, the adjustment of the dimming element 506 might end up being performed when the original brightness (based on which the transmittance was computed) is no longer present or has changed. Thus, the transmittance adjustments may be ineffective in that the adjustments are not synchronized with rapid/flickering brightness changes, thereby not achieving the desired visual enhancements for the virtual image and potentially resulting in annoyance to the user.
By using the running average of N frames of scene 502 at process block 702, adjustments in the transmittance may be performed at process block 534 that are more stable and less annoying to the user.
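A minimal sketch of such an N-frame running average (the window size of 8 frames is an arbitrary assumption) might be:

```python
from collections import deque
import numpy as np

class RunningSceneBrightness:
    """Running average of scene brightness over the last N frames,
    smoothing out flicker before the transmittance is adjusted."""

    def __init__(self, n_frames: int = 8):
        self._history = deque(maxlen=n_frames)

    def update(self, frame_brightness: float) -> float:
        """Add the latest per-frame brightness and return the average
        over the most recent N frames."""
        self._history.append(frame_brightness)
        return float(np.mean(self._history))
```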
Component 802 may be a display brightness sensor (e.g., display brightness sensor 433 shown in
Hence, the use of component 802 (display brightness sensor) serves to reduce the uncertainty in the determination of the brightness of display 508, regardless of the source of the uncertainty. In operation, component 802 measures actual brightness of display 508 and provides this information as an output in analog or digital format, and the processing logic in turn provides (at process block 804) the measured brightness as the third input to process block 522 for computation of the contrast.
The display brightness sensor may be located near the in-coupling grating so as to capture light that does not couple into the grating, near the boundary at the edge of the waveguide, or at other location(s). A disparity sensor may also be used as the display brightness sensor since the disparity sensor can capture some of the light coming from display 508.
A display brightness sensor can also be added to assemblies such as mounts, lenses, etc. of the head mounted device, as one or more tiny photodiode sensors facing display 508 instead of scene 502 (e.g., positioned like VCSELs but not facing the eye).
The display brightness sensor can track the absolute brightness of display 508 through a prior calibration, or track the relative change in brightness of display 508 in real time. Also, the display brightness sensor can generate brightness measurement data at rates comparable to the display frame rate, can measure the average display brightness, the peak brightness, or both, and can measure across all wavelengths and the full field of view.
As previously explained above, the pupil size of eye 504 may vary from one user to another, and may also vary according to different lighting or other different conditions. For instance, pupil size may change due to the user's age and/or due to brightness.
However, the brightness measured by ambient light sensor 516 might not be the same as the brightness perceived by eye 504 through the optical stack. The estimate of transmission of the optical stack at any given time (at the process block 524) may be based on factory calibration of optical elements, including dimming element 506. More accurate estimation may be provided by using camera 902 to measure pupil size at process block 904.
The measured pupil size may then be used by the processing logic at process block 906 to provide a more accurate estimate of the stack transmission. As such, the camera 902 may operate as or in conjunction with a stack transmission sensor 908 for generating a transmission light measurement/estimate (as well as performing other operations such as tracking gaze of scene 502 by the user). This estimate of the stack transmission is then provided as an input to process block 522 for computation of the contrast.
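One hypothetical way to fold the measured pupil size into the transmission estimate is sketched below; it uses the classical Moon–Spencer pupil model and an invented correction rule with an assumed gain, neither of which is prescribed by this disclosure:

```python
import math

def expected_pupil_diameter_mm(luminance_cdm2: float) -> float:
    """Classical Moon-Spencer model of pupil diameter as a function of
    adapting luminance (cd/m^2); one of several published models."""
    return 4.9 - 3.0 * math.tanh(0.4 * math.log10(max(luminance_cdm2, 1e-6)))

def refine_stack_transmission(transmission_estimate: float,
                              scene_luminance_cdm2: float,
                              measured_pupil_mm: float,
                              gain: float = 0.1) -> float:
    """Nudge the factory-calibrated transmission estimate toward a value
    consistent with the measured pupil size. If the pupil is larger than
    the model predicts for the light expected at the eye, the stack is
    likely transmitting less than estimated, and vice versa. The update
    rule and gain are illustrative assumptions.
    """
    expected = expected_pupil_diameter_mm(
        transmission_estimate * scene_luminance_cdm2)
    error_mm = measured_pupil_mm - expected
    # Larger-than-expected pupil -> lower the transmission estimate.
    return min(max(transmission_estimate - gain * error_mm, 0.0), 1.0)
```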
The camera 902 may also provide other types of eye-tracking data to the processing logic to enable the processing logic to determine head pose and eye pose of the user, thereby enabling capability to make a prediction about where the virtual image will be overlaid on top of scene 502 in the next several frames or cycles. The processing logic has contextual awareness of the virtual content being delivered and can determine the relationship of this virtual content with respect to areas in scene 502, and can therefore make contrast adjustments based on where the virtual content is located or will be located.
With respect to stack transmission sensor 908 that generates a transmission light measurement, the transmission light measurement can be provided at process block 524 (via dimming controller 514) and/or at process block 906. As such, this transmission light measurement may represent a real-time measurement that is more accurate than a transmission light measurement obtained during factory calibration. Stack transmission sensor 908 may be located at or near the surface of dimming element 506, and multiple stack transmission sensors can be located on both surfaces of dimming element 506 (e.g., inside and outside).
Temperature sensor 1002 may be coupled to dimming element 506 so as to measure the temperature of dimming element 506, since the transmission characteristics of dimming element 506 may change in response to changes in temperature. The measured temperatures may be provided to dimming controller 514, and used by the processing logic to estimate the stack transmission at process block 524 (now shown in solid lines in
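A minimal sketch of such a temperature-based estimate, interpolating an invented calibration curve of transmission versus temperature at a fixed drive level, might be:

```python
import numpy as np

# Hypothetical calibration: transmission of the dimming element at a fixed
# drive level, measured at several temperatures (Celsius).
_CAL_TEMPS_C = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
_CAL_TRANSMISSION = np.array([0.42, 0.45, 0.50, 0.54, 0.57, 0.59])

def temperature_compensated_transmission(temp_c: float) -> float:
    """Estimate stack transmission at the measured temperature by
    interpolating the calibration curve (clamped at its endpoints)."""
    return float(np.interp(temp_c, _CAL_TEMPS_C, _CAL_TRANSMISSION))
```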
In a process block 1102, the processing logic receives a plurality of inputs provided by a corresponding plurality of sensors. The plurality of sensors may include the ambient light sensor 516, temperature sensor 1002, display brightness sensor 802, stack transmission sensor 908, camera 902, etc., such that the plurality of inputs are associated with a brightness of the scene light and the brightness level of display 508.
In a process block 1104, the processing logic determines a contrast value based on the plurality of inputs. The contrast value corresponds to a contrast of the virtual image that is overlaid on scene 502. The contrast value may indicate whether the virtual image is satisfactorily visible to the user of the head mounted device. For instance, if the scene is too bright, or the virtual image is superimposed over a bright area of the scene, the details of the virtual image may be difficult for the user to see.
In a process block 1106, the processing logic determines that the contrast value is below a threshold, thereby indicating that the user may have difficulty viewing details of the virtual image due to excessive brightness in scene 502. As explained previously above, the threshold value for contrast may vary from one use case to another.
In a process block 1108, the processing logic increases the contrast, in response to determining that the contrast value is below the threshold, by changing at least one of an optical transmission of dimming element 506 through which the scene light passes, or the brightness level of display 508. Factors such as the ROI of the virtual image over scene 502, the transmission characteristics (e.g., properties) of dimming element 506, changing brightness characteristics of display 508, the temperature of dimming element 506, the pupil size of eye 504, and/or other factors can influence the determination of whether to change the contrast and, if so, the technique by which the contrast may be changed.
Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, and any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
The term “processing logic” (e.g., processing logic 470) in this disclosure may include one or more processors, microprocessors, multi-core processors, application-specific integrated circuits (ASICs), and/or field-programmable gate arrays (FPGAs) to execute operations disclosed herein. In some embodiments, memories (not illustrated) are integrated into the processing logic to store instructions to execute operations and/or store data. Processing logic may also include analog or digital circuitry to perform the operations in accordance with embodiments of the disclosure.
A “memory” or “memories” (e.g. memory 475) described in this disclosure may include one or more volatile or non-volatile memory architectures. The “memory” or “memories” may be removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Example memory technologies may include RAM, ROM, EEPROM, flash memory, CD-ROM, digital versatile disks (DVD), high-definition multimedia/data storage disks, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information for access by a computing device.
Networks may include any network or network system such as, but not limited to, the following: a peer-to-peer network; a Local Area Network (LAN); a Wide Area Network (WAN); a public network, such as the Internet; a private network; a cellular network; a wireless network; a wired network; a wireless and wired combination network; and a satellite network.
Communication channels or any communication links/connections may include or be routed through one or more wired or wireless communication links utilizing IEEE 802.11 protocols, Bluetooth, SPI (Serial Peripheral Interface), I2C (Inter-Integrated Circuit), USB (Universal Serial Bus), CAN (Controller Area Network), cellular data protocols (e.g. 3G, 4G, LTE, 5G), optical communication networks, Internet Service Providers (ISPs), a peer-to-peer network, a Local Area Network (LAN), a Wide Area Network (WAN), a public network (e.g. “the Internet”), a private network, a satellite network, or otherwise.
A computing device may include a desktop computer, a laptop computer, a tablet, a phablet, a smartphone, a feature phone, a server computer, or otherwise. A server computer may be located remotely in a data center or be stored locally.
The processes explained above are described in terms of computer software and hardware. The techniques described may constitute machine-executable instructions embodied within a tangible or non-transitory machine (e.g., computer) readable storage medium, that when executed by a machine will cause the machine to perform the operations described. Additionally, the processes may be embodied within hardware, such as an application specific integrated circuit (“ASIC”) or otherwise.
A tangible non-transitory machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.