The present disclosure generally relates to near-eye display technology, and more particularly, to a method and a system for correcting nonuniformity of a near-eye display.
Near-eye displays (NEDs) may be provided as augmented reality (AR) displays, virtual reality (VR) displays, head-up or head-mounted displays, or other displays. An NED generally includes an image generator and optical paths including optical combiners. The image generator is commonly a projector with micro displays (e.g., micro-LED (light-emitting diode), micro-OLED (organic light-emitting diode), LCOS (liquid crystal on silicon), or DLP (digital light processing)) and an integrated optical lens. The optical combiner includes reflective and/or diffractive optics, such as a freeform mirror/prism, birdbath, cascaded mirrors, or grating coupler (waveguide). A virtual image is rendered from the NED to human eyes with or without ambient light.
Uniformity is one performance factor for evaluating the imaging quality of an NED. Nonuniformity can be caused by imperfections of the display pixels and of the optical paths that guide the light emitted by the display, and manifests as variation in the global distribution and/or variation in local zones, known as Mura. A visual artefact may appear as a mottled appearance, a bright spot, a black spot, or a cloudy appearance. For NEDs such as AR/VR displays, visual artefacts are also observable on the virtual image rendered by the display system, where nonuniformity may appear in both luminance and chromaticity. Moreover, the visual artefact caused by nonuniformity is more noticeable than in traditional displays because of the display's closeness to the human eyes.
Therefore, there is a need to reduce the nonuniformity of an NED.
Embodiments of the present disclosure provide a method for correcting nonuniformity of an NED. The method includes: generating and displaying a first plurality of test patterns for a display of the NED; obtaining, in response to the first plurality of test patterns, a first plurality of images at an end of an optical path coupled to the display; fitting a mapping relationship for each of display pixels of the display according to the first plurality of test patterns and the first plurality of images, the mapping relationship of a display pixel mapping the display pixel and a corresponding image pixel in each image of the first plurality of images; and determining a correction scheme for correcting nonuniformity of the NED based on the mapping relationship of each of the display pixels.
Embodiments of the present disclosure provide a system for correcting nonuniformity of an NED. The system includes: a light measuring device (LMD) configured to: obtain, in response to a first plurality of test patterns being displayed by a display of the NED, a first plurality of images at an end of an optical path coupled to the display; and a processor configured to: fit a mapping relationship for each of display pixels of the display according to the first plurality of test patterns and the first plurality of images, the mapping relationship of a display pixel mapping the display pixel and a corresponding image pixel in each image of the first plurality of images; and determine a correction scheme for correcting nonuniformity of the NED based on the mapping relationship of each of the display pixels.
Embodiments of the present disclosure provide a non-transitory computer-readable storage medium storing a set of instructions that are executable by one or more processors of a device to cause the device to perform operations for correcting nonuniformity of an NED, the operations including: generating and displaying a first plurality of test patterns for a display of the NED; obtaining, in response to the first plurality of test patterns, a first plurality of images at an end of an optical path coupled to the display; fitting a mapping relationship for each of display pixels of the display according to the first plurality of test patterns and the first plurality of images, the mapping relationship of a display pixel mapping the display pixel and a corresponding image pixel in each image of the first plurality of images; and determining a correction scheme for correcting nonuniformity of the NED based on the mapping relationship of each of the display pixels.
Embodiments and various aspects of the present disclosure are illustrated in the following detailed description and the accompanying figures. Various features shown in the figures are not drawn to scale.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise represented. The implementations set forth in the following description of exemplary embodiments do not represent all implementations consistent with the invention. Instead, they are merely examples of apparatuses and methods consistent with aspects related to the invention as recited in the appended claims. Particular aspects of the present disclosure are described in greater detail below. The terms and definitions provided herein control, if in conflict with terms and/or definitions incorporated by reference.
In some embodiments, NED 110 includes an image generator 111. Image generator 111 is provided as one or more micro displays (e.g., one display for one eye), such as micro-LED displays, micro-OLED displays, LCOS displays, or DLP displays, and each of the micro displays can be configured as a light engine with an additional projector lens. In some embodiments, the display may be coupled with a plurality of lenses (also referred to as a "lens group," "designed optics," etc.) for adjusting the image displayed by the micro display in a manner suitable for human eyes. The micro display of image generator 111 includes a micro light-emitting array which forms an active emitting area. The projected image from the light engine through the designed optics is transferred to human eyes via an optical path including an optical combiner (not shown). The optics of the optical combiner can be reflective and/or diffractive optics, such as a freeform mirror/prism, birdbath, cascaded mirrors, or grating coupler (waveguide).
In some embodiments, a driving module 112, for example, a driver, can be further provided to drive NED 110 for image display. Driving module 112 can be coupled to communicate with NED 110, specifically with image generator 111 of NED 110. That is, driving module 112 can be configured to drive image generator 111, via driving signals, to display an image on the micro displays.
It is to be noted that "left" and "right" mentioned in the present disclosure are from the perspective of a person (i.e., a user or a viewer of system 200) shown in
As can be appreciated, first micro display 210 can be used to display a right image, while second micro display 220 can be used to display a left image captured or rendered at a different angle from the right image. When simultaneously viewing the left image and the right image, the brain of the viewer combines these two images into a three-dimensional scene. However, if the uniformity within either the left image or the right image is not ideal, the "sense of place" of the three-dimensional scene created by these two images can be affected. The nonuniformity can be caused by one or more of second micro display 220, first micro display 210, lens group 211, or lens group 221. For example, when driven by a signal indicating the same intensity of brightness, some display pixels in either or both of first micro display 210 and second micro display 220 may be brighter than the others. In addition, the wavelengths of the three fundamental colors emitted by the display pixels differ, so their propagation properties along the optical paths differ, which may also cause nonuniformity in the images rendered by VR system 200 to human eyes.
Referring to
Referring back to
Processing module 104 is configured to evaluate and improve the uniformity of the virtual image rendered by NED 110. In some embodiments, processing module 104 can be included in a computer or a server. In some embodiments, processing module 104 can be deployed in the cloud, which is not limited herein. In some embodiments, processing module 104 can include one or more processors.
At step S402, a first plurality of test patterns is generated for the display of the NED to display. For example, with further reference to AR system 300 in
In the present disclosure, a pixel in a display is referred to as a display pixel to distinguish it from an image pixel in an image described below. As appreciated, the display pixel is a hardware component, while the image pixel is an imaging representation which can be encoded as a set of data.
In some embodiments, all the display sub-pixels in first micro display 310 or second micro display 350 may need correction. In this situation, the test patterns can be generated for the displays to measure their performance under different imaging situations. For example, first micro display 310 and second micro display 350 include three kinds of fundamental display sub-pixels, i.e., red display sub-pixels, green display sub-pixels, and blue display sub-pixels; hence complete nonuniformity correction of AR system 300 may require testing all these display sub-pixels. That is, a first plurality of test patterns can be generated to light all the red display sub-pixels, all the green display sub-pixels, and all the blue display sub-pixels within display pixels, for observing the property of these display pixels in a designed sequence. As first micro display 310 and second micro display 350 are typically driven by signals in RGB chroma space, in some embodiments, the first plurality of test patterns can include at least one of: (1) a second plurality of red patterns each corresponding to a red-scale value, for example, red patterns with RGB values of (64, 0, 0), (96, 0, 0), (128, 0, 0), (160, 0, 0), (192, 0, 0), and (224, 0, 0); (2) a third plurality of green patterns each corresponding to a green-scale value, for example, green patterns with RGB values of (0, 64, 0), (0, 96, 0), (0, 128, 0), (0, 160, 0), (0, 192, 0), and (0, 224, 0); or (3) a fourth plurality of blue patterns each corresponding to a blue-scale value, for example, blue patterns with RGB values of (0, 0, 64), (0, 0, 96), (0, 0, 128), (0, 0, 160), (0, 0, 192), and (0, 0, 224). In this manner, the red display sub-pixels, green display sub-pixels, and blue display sub-pixels of first micro display 310 or second micro display 350 can be lit by the red patterns, the green patterns, and the blue patterns with different intensities (color-scales) in sequence.
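As a non-limiting sketch of how such a pattern sequence could be generated in software, the snippet below builds full-screen single-channel frames from the color-scale values listed above; the function name, array layout, and the 640×480 example resolution are illustrative assumptions and not part of the disclosure.

```python
import numpy as np

def generate_test_patterns(width, height, scales=(64, 96, 128, 160, 192, 224)):
    """Generate single-channel test patterns at the listed color-scale values.

    Each pattern is a full-screen (height, width, 3) uint8 RGB frame that
    lights only the red, green, or blue display sub-pixels at one scale.
    """
    patterns = []
    for channel, name in enumerate(("red", "green", "blue")):
        for scale in scales:
            frame = np.zeros((height, width, 3), dtype=np.uint8)
            frame[:, :, channel] = scale  # drive only one sub-pixel color
            patterns.append((name, scale, frame))
    return patterns

# Example: 18 patterns (3 channels x 6 color-scale values) for a 640x480 display
patterns = generate_test_patterns(640, 480)
print(len(patterns))
```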
As described above, waveguide 322 and optical combiner 323 are provided in optical path 320, while waveguide 362 and optical combiner 363 are provided in optical path 360. The imaging quality of AR system 300 can be affected by optical path 320 or optical path 360. Specifically, at least a part of the nonuniformity in AR system 300 may be caused by optical path 320 or optical path 360. This source of nonuniformity can be eliminated, or at least reduced, by the correction method described herein.
Referring back to
Referring back to
At sub-step S502, the first plurality of images is downsampled to obtain a first plurality of intermediate images having a target resolution. The images captured by an imaging module may have a high resolution (e.g., 9000×6000), which may be too large for image processing. In some embodiments, the resolution of these images can be lowered by pixel decimation. As used herein, pixel decimation refers to a process by which the number of image pixels in an image is reduced, i.e., downsampled. For example, with further reference to
In some embodiments, before downsampling at sub-step S502, the first plurality of images can be pre-processed with image enhancement, image segmentation, or blob analysis to establish a coordinates mapping relationship between the display and the first plurality of images. For example, the display pixel in a relative location of the display (e.g., in ¼ location horizontally and ¾ location vertically) can be mapped to image pixels with the same relative location (e.g., in ¼ location horizontally and ¾ location vertically) in the image.
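The following is a rough sketch of sub-step S502 and the coordinate mapping just described, assuming block averaging as the decimation method and a simple proportional mapping in place of the image enhancement, segmentation, and blob analysis mentioned above; all names and the example resolutions are illustrative.

```python
import numpy as np

def decimate_to_target(image, target_h, target_w):
    """Downsample a captured image to the target resolution by block averaging.

    `image` is an (H, W) or (H, W, C) array; it is cropped to the nearest
    multiple of the target size, then each block is averaged into one
    intermediate-image pixel.
    """
    h, w = image.shape[:2]
    bh, bw = h // target_h, w // target_w
    cropped = image[: bh * target_h, : bw * target_w]
    blocks = cropped.reshape(target_h, bh, target_w, bw, *image.shape[2:])
    return blocks.mean(axis=(1, 3))

def map_display_to_image(disp_row, disp_col, disp_shape, img_shape):
    """Map a display pixel to the image pixel at the same relative location."""
    disp_h, disp_w = disp_shape
    img_h, img_w = img_shape
    return (int(round(disp_row * (img_h - 1) / (disp_h - 1))),
            int(round(disp_col * (img_w - 1) / (disp_w - 1))))

# Example: a 6000x9000 luminance capture reduced to a 480x640 intermediate image
captured = np.random.rand(6000, 9000)
intermediate = decimate_to_target(captured, 480, 640)
print(intermediate.shape)                                # (480, 640)
print(map_display_to_image(360, 160, (480, 640), (6000, 9000)))
```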
At sub-step S504, the mapping relationship of the objective display pixel is fitted according to the first plurality of test patterns and the first plurality of intermediate images.
The method described above for forming the intermediate images can also be applied to the green patterns and the blue patterns, and three sets of images representing intermediate images in CIE X components, CIE Y (luminance) components, and CIE Z components will be generated respectively, which is not repeated here. In the present disclosure, although images denoted as "CIE X-RED" and "CIE Z-RED" are used to represent imaging chroma components, they can be expressed as tristimulus values showing the intensity of the X and Z components of each image pixel. In some embodiments, the first plurality of images and the first plurality of intermediate images are represented by tristimulus values.
At sub-step S602, a color-scale value of each of the first plurality of test patterns and a tristimulus value of the corresponding image pixel in each image of the first plurality of intermediate images are determined for the objective display pixel.
Referring back to
In some embodiments, the mapping relationship can also be fitted with a mathematical expression of the form:
where x is the color-scale value of the objective display pixel corresponding to a test pattern, y is the tristimulus value of the corresponding image pixel in the intermediate image corresponding to the test pattern, and a_i (i = 1, . . . , n), c, and γ are the coefficients of the expression, each of which can be zero but not all of which can be zero at the same time. As can be appreciated, once the mapping relationship of the objective display pixel is determined, the tristimulus value of the corresponding image pixel in the intermediate image corresponding to a test pattern with a given color-scale value can be estimated according to the mapping relationship.
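The fitted expression itself is not reproduced above. Purely as an illustrative assumption consistent with the named coefficients (a gain, an offset c, and an exponent γ), a per-pixel fit could be sketched as follows; the single-term power-law model, function names, and initial guesses are assumptions rather than the disclosure's definitive expression.

```python
import numpy as np
from scipy.optimize import curve_fit

def response_model(x, a, c, gamma):
    """Assumed per-pixel response: tristimulus value as a power law of color-scale."""
    return a * np.power(x, gamma) + c

def fit_pixel_mapping(color_scales, tristimulus_values):
    """Fit the mapping relationship for one objective display pixel.

    `color_scales` are the drive values of the test patterns (e.g., 64..224);
    `tristimulus_values` are the values of the corresponding image pixel in
    each intermediate image.
    """
    popt, _ = curve_fit(response_model,
                        np.asarray(color_scales, dtype=float),
                        np.asarray(tristimulus_values, dtype=float),
                        p0=(0.01, 0.0, 2.2), maxfev=10000)
    return popt  # (a, c, gamma)

# Example with synthetic measurements for one pixel
scales = [64, 96, 128, 160, 192, 224]
measured = [0.02 * (s ** 2.1) + 1.0 for s in scales]
a, c, gamma = fit_pixel_mapping(scales, measured)
print(round(a, 3), round(c, 3), round(gamma, 3))
```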
As also can be appreciated,
Referring back to
At sub-step S702, an estimated tristimulus value for the corresponding image pixel is determined according to a predetermined color-scale value driving the objective display pixel and the mapping relationship of the objective display pixel. Referring to
At sub-step S704, a target tristimulus value for the corresponding image pixel is set for the objective display pixel driven by the predetermined color-scale value. The target tristimulus value can be set according to a distribution of the tristimulus values of the image pixels in an image of the first plurality of intermediate images. In some embodiments, the target tristimulus value is the average tristimulus value over the image pixels in the intermediate image. In some other embodiments, the target tristimulus value is the tristimulus value with the greatest probability in the distribution. The target value generally reflects a statistical property of the intermediate image and can be used to represent the intermediate image.
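A minimal sketch of sub-step S704 follows, assuming the target is taken either as the mean or as the histogram peak of one tristimulus component over the intermediate image; the function name and bin count are illustrative.

```python
import numpy as np

def target_tristimulus(intermediate_image, mode="mean"):
    """Pick the target tristimulus value for one intermediate image.

    `intermediate_image` holds one tristimulus component (e.g., CIE Y) per
    image pixel at the display resolution. The target is either the average
    value or the most probable value (histogram peak) over all image pixels.
    """
    values = np.asarray(intermediate_image, dtype=float).ravel()
    if mode == "mean":
        return values.mean()
    counts, edges = np.histogram(values, bins=256)
    peak = np.argmax(counts)
    return 0.5 * (edges[peak] + edges[peak + 1])  # center of most populated bin

# Example: target luminance for a synthetic 480x640 intermediate image
img = np.random.normal(loc=100.0, scale=5.0, size=(480, 640))
print(round(target_tristimulus(img), 2), round(target_tristimulus(img, "mode"), 2))
```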
Referring to
In this 3×3 matrix, XR represents the target tristimulus value of the chroma component X corresponding to a predetermined red pattern with a predetermined red-scale value, YG represents the target tristimulus value of the luminance component Y corresponding to a predetermined green pattern with a predetermined green-scale value, ZB represents the target tristimulus value of the chroma component Z corresponding to a predetermined blue pattern with a predetermined blue-scale value, and so on. As can be understood, this matrix implements both luminance correction and chromaticity correction. If only luminance correction needs to be applied, only the second row [YR YG YB]obj needs to be determined, and the matrix becomes a 1×3 matrix.
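As a small illustration, the target matrix could be assembled from the per-channel target tristimulus values as follows; the helper name and example numbers are assumptions, and the luminance-only case keeps just the second (Y) row.

```python
import numpy as np

def build_target_matrix(target_red_xyz, target_green_xyz, target_blue_xyz,
                        luminance_only=False):
    """Assemble the target matrix [M3x3]obj column by column.

    Each argument is the (X, Y, Z) target tristimulus value of the intermediate
    images for the predetermined red, green, and blue test patterns. With
    `luminance_only`, only the Y row [YR YG YB] is kept (a 1x3 matrix).
    """
    m_obj = np.column_stack([target_red_xyz, target_green_xyz, target_blue_xyz])
    return m_obj[1:2, :] if luminance_only else m_obj

# Example with illustrative (X, Y, Z) target values per channel
m_obj = build_target_matrix((41.0, 21.0, 1.9), (35.0, 71.0, 11.0), (18.0, 7.0, 95.0))
print(m_obj.shape)                                       # (3, 3)
print(build_target_matrix((41.0, 21.0, 1.9), (35.0, 71.0, 11.0), (18.0, 7.0, 95.0),
                          luminance_only=True))          # [[21. 71.  7.]]
```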
Referring back to
in which, [M3×3]px can be in the form of:
wherein XRe represents the estimated tristimulus value of the chroma component X corresponding to the predetermined red pattern with the predetermined red-scale value, YGe represents the estimated tristimulus value of the luminance component Y corresponding to the predetermined green pattern with the predetermined green-scale value, ZBe represents the estimated tristimulus value of the chroma component Z corresponding to the predetermined blue pattern with the predetermined blue-scale value, and so on.
In other words, the correction matrix can be determined through the following equation:
wherein, [M3×3]px′ denotes the inverse of matrix [M3×3]px, and [M3×3]px contains the estimated tristimulus values that can be determined at sub-step S702. As appreciated, [M3×3]corr can be generated for each of the 640×480 pixels.
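Since the equation itself is not reproduced above, the following is only a sketch of one construction consistent with the description: the per-pixel correction matrix is formed from the target matrix and the inverse of the estimated matrix, and is then applied to the pixel's RGB driving signal, optionally with a de-gamma/re-gamma step for driving systems that apply a gamma operation (discussed below). The product order, the gamma handling, and all names are assumptions.

```python
import numpy as np

def correction_matrix(m_obj, m_px):
    """Per-pixel correction matrix from the target matrix [M3x3]obj and the
    estimated matrix [M3x3]px (whose inverse is written [M3x3]px' above)."""
    return np.linalg.inv(m_px) @ m_obj  # assumed product order

def correct_drive_signal(rgb, m_corr, gamma=None):
    """Apply the correction matrix to one pixel's RGB driving signal.

    Without gamma the matrix is applied directly; with gamma the signal is
    linearized first, corrected, then re-encoded (a gamma-aware correction
    in the spirit of [M3x3]corr_2)."""
    rgb = np.asarray(rgb, dtype=float)
    if gamma is None:
        corrected = m_corr @ rgb
    else:
        linear = np.power(rgb / 255.0, gamma)
        corrected = 255.0 * np.power(m_corr @ linear, 1.0 / gamma)
    return np.clip(corrected, 0, 255)

# Example for one display pixel whose red output is measured slightly too high
m_obj = np.array([[41.0, 35.0, 18.0], [21.0, 71.0, 7.0], [1.9, 11.0, 95.0]])
m_px  = np.array([[45.0, 35.5, 18.2], [23.0, 70.0, 7.1], [2.0, 11.2, 94.0]])
m_corr = correction_matrix(m_obj, m_px)
print(np.round(correct_drive_signal([128, 128, 128], m_corr), 1))
print(np.round(correct_drive_signal([128, 128, 128], m_corr, gamma=2.2), 1))
```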
In some embodiments, gamma operation in a driving system (e.g., driving module 112 in
When [M3×3]corr_2 is applied for correction, the driving system (e.g., driving module 112 in
In some embodiments, the determined correction matrix can be saved for further processing. In some other embodiments, the determined correction matrix for each image pixel in the intermediate image can be used to update the driving system of the display. The driving system can then drive the display with the corresponding driver files. For example, when a display pixel of the display is driven by a signal
in RGB chroma space, the driver can correct this driving signal to
which is actually used to drive the display pixel. When considering gamma operation, the driver can instead correct this driving signal to
To assess the improvement achieved by the correction scheme, the uniformity of the display can be evaluated before and after correction. In some embodiments, the method for correcting nonuniformity further includes the following steps (not shown): displaying the test pattern on the display with the updated driver files; and verifying an updated uniformity of the display according to an updated image corresponding to the test pattern.
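The disclosure does not prescribe a particular uniformity metric; as one illustrative possibility, a simple relative-spread figure could be compared before and after the driver files are updated.

```python
import numpy as np

def nonuniformity(intermediate_image):
    """Relative luminance spread over an intermediate image: (max - min) / max."""
    values = np.asarray(intermediate_image, dtype=float).ravel()
    return float((values.max() - values.min()) / values.max())

# Example: the same test pattern captured before and after correction
before = np.clip(np.random.normal(100.0, 8.0, size=(480, 640)), 1.0, None)
after  = np.clip(np.random.normal(100.0, 2.0, size=(480, 640)), 1.0, None)
print(round(nonuniformity(before), 3), round(nonuniformity(after), 3))
```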
Some embodiments of the present disclosure further provide a non-transitory computer-readable storage medium storing a set of instructions that are executable by one or more processors of a device to cause the device to perform any of the above-mentioned methods for correcting nonuniformity of an NED.
It should be noted that relational terms herein such as “first” and “second” are used only to differentiate an entity or operation from another entity or operation, and do not require or imply any actual relationship or sequence between these entities or operations. Moreover, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.
As used herein, unless specifically stated otherwise, the term “or” encompasses all possible combinations, except where infeasible. For example, if it is stated that a database may include A or B, then, unless specifically stated otherwise or infeasible, the database may include A, or B, or A and B. As a second example, if it is stated that a database may include A, B, or C, then, unless specifically stated otherwise or infeasible, the database may include A, or B, or C, or A and B, or A and C, or B and C, or A and B and C.
In the foregoing specification, embodiments have been described with reference to numerous specific details that can vary from implementation to implementation. Certain adaptations and modifications of the described embodiments can be made. Other embodiments can be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims. It is also intended that the sequences of steps shown in figures are only for illustrative purposes and are not intended to be limited to any particular sequence of steps. As such, those skilled in the art can appreciate that these steps can be performed in a different order while implementing the same method.
In the drawings and specification, there have been disclosed exemplary embodiments. However, many variations and modifications can be made to these embodiments. Accordingly, although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation.
This disclosure claims the benefits of priority to PCT Application No. PCT/CN2024/071770, filed on Jan. 11, 2024, which is incorporated herein by reference in its entirety.