This application claims the benefit of priority from Korean Patent Application No. 10-2023-0106232, filed on Aug. 14, 2023, which is hereby incorporated by reference as fully set forth herein.
The present disclosure is applicable to any technology for displaying an image using a lens, and is also applicable to various devices that provide Virtual Reality (VR) services. For example, it is applicable to Head Mounted Displays (HMDs), smart glasses, mobile phones, PCs, and the like. Moreover, the present disclosure is also applicable to components that are not final products.
Various VR devices such as HMDs have recently appeared in the market. However, since a user wearing these devices views image data through a lens, chromatic aberration occurs.
Chromatic aberration refers to a phenomenon in which light passing through a lens forms an image at a different position for each color because the refractive index of light varies depending on the wavelength.
As for the refractive characteristics of light, the shorter the wavelength, the higher the refractive index; the longer the wavelength, the lower the refractive index. When chromatic aberration occurs due to this phenomenon, color is distorted mainly in the outer part of an image.
To solve this problem, according to the related art, a fixed-value lens distortion parameter was used for chromatic aberration correction.
However, because the degree of distortion varies with the distance from the center, simple distortion correction that merely matches the center still leaves a problem in that correction accuracy near the center differs from that in the outer area.
Accordingly, the present disclosure is directed to an apparatus for displaying an image using a lens and method for controlling the same that substantially obviate one or more problems due to limitations and disadvantages of the related art.
One object of the present disclosure is to provide a compensation algorithm for a distortion phenomenon caused by a lens distortion parameter in a device that provides an image to a user through a special lens.
For example, based on the center point of a lens distortion parameter, compensation is performed by reflecting the distortion parameter according to the distance between the position of a current pixel and the position of the center point.
To achieve these objects and other advantages and in accordance with the purpose of the disclosure, as embodied and broadly described herein, a method of controlling a display device displaying an image using a lens may include storing a lens distortion parameter for chromatic aberration correction, receiving image data, primarily calculating a distance between a center point and a current pixel, secondarily calculating a distortion correction weight in consideration of the distance, and outputting image data of the current pixel through calculation according to the calculated distortion correction weight without using the stored lens distortion parameter as it is.
The secondarily calculating may be designed so that the distortion correction weight increases as the distance increases.
A plurality of the distortion correction weights may be obtained by an interpolation technique or by using a Lookup Table (LUT). Furthermore, selecting, according to the structure of the lens attached to the HMD or the like, whichever of the LUT, the interpolation technique, and the like consumes the least power is also a feature of the present disclosure.
When the current pixel corresponds to green, the primarily calculating and the secondarily calculating may be designed to be disabled.
The center point may correspond to a center point of a distortion parameter for red or a center point of a distortion parameter for blue.
The steps described above may be performed in a semiconductor for display driving.
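Purely as an illustrative sketch (not the actual implementation), the steps described above may be expressed in Python as follows; the function name correct_pixel and the helper weight_from_distance are hypothetical names introduced only for this example:

    import math

    def correct_pixel(pixel_value, x, y, center, lens_param, weight_from_distance):
        # Primary calculation: the distance between the center point and the current pixel.
        distance = math.hypot(x - center[0], y - center[1])
        # Secondary calculation: the distortion correction weight in consideration of
        # the distance (designed to increase as the distance increases).
        weight = weight_from_distance(distance)
        # The stored lens distortion parameter is not used as it is; the output pixel
        # value is calculated according to the weighted parameter.
        return pixel_value * (lens_param * weight)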
In another aspect of the present disclosure, an apparatus for providing a Virtual Reality (VR) service according to one embodiment of the present disclosure may include a memory storing a lens distortion parameter for chromatic aberration correction and a controller.
The controller may receive image data, primarily calculate a distance between a center point and a current pixel, secondarily calculate a distortion correction weight in consideration of the distance, and output image data of the current pixel according to the calculated distortion correction weight.
Accordingly, the present disclosure provides the following effects and/or advantages.
According to one embodiment of the present disclosure, provided is a compensation algorithm for a distortion phenomenon caused by a lens distortion parameter in a device that provides an image to a user through a special lens.
For example, by compensating through reflecting the distortion parameter according to the distance between the position of a current pixel and the position of the center point of the lens distortion parameter, there is a technical effect of removing the chromatic aberration phenomenon more completely.
In addition to the above-described technical effects, matters obvious to those skilled in the art in consideration of the entire specification are also included in the additional technical effects of the present disclosure.
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the disclosure and together with the description serve to explain the principle of the disclosure.
Throughout the specification, like reference numerals are used to refer to substantially the same components. In the following description, detailed descriptions of components and features known in the art may be omitted if they are not relevant to the core configuration of the present disclosure. The meanings of terms used in this specification are to be understood as follows.
The advantages and features of the present disclosure, and methods of achieving them, will become apparent from the detailed description of the embodiments, together with the accompanying drawings. However, the present disclosure is not limited to the embodiments disclosed herein and will be implemented in many different forms. The embodiments are provided merely to make the disclosure of the present disclosure thorough and to fully inform one of those skilled in the art to which the present disclosure belongs of the scope of the disclosure. It is to be noted that the scope of the present disclosure is defined only by the claims.
The figures, dimensions, ratios, angles, numbers of elements given in the drawings are merely illustrative and are not limiting. Like reference numerals refer to like elements throughout the specification. Further, in describing the present disclosure, descriptions of well-known technologies may be omitted in order to avoid obscuring the gist of the present disclosure.
As used herein, the terms “includes,” “has,” “comprises,” and the like should not be construed as being restricted to the means listed thereafter unless specifically stated otherwise. Where an indefinite or definite article (e.g., “a,” “an,” or “the”) is used when referring to a singular noun, this includes a plural of that noun unless something else is specifically stated.
Elements are to be interpreted as including a margin of error, even if not explicitly stated otherwise.
In describing positional relationship, for example, if the positional relationship of two parts is described as ‘on ˜’, ‘over ˜’, ‘under ˜’, ‘next to ˜’, or the like, one or more other parts may be located between the two parts unless ‘right’ or ‘direct’ is used.
In describing temporal relationships, terms such as “after,” “subsequent to,” “next to,” “before,” and the like may include cases where any two events are not consecutive, unless the term “immediately” or “directly” is explicitly used.
While the terms first, second, and the like are used to describe various elements, the elements are not limited by these terms. These terms are used merely to distinguish one element from another. Accordingly, a first element referred to herein may be a second element within the technical idea of the present disclosure.
“X-axis direction”, “Y-axis direction”, and “Z-axis direction” should not be interpreted only as a geometrical relationship in which the relationship between each other is vertically formed, and may mean that the configuration of the present disclosure has a wider directionality within the range in which it may work functionally.
Virtual Reality (VR) refers to a technology that uses virtual images that are not real, for example, objects, backgrounds, and environments. A representative product using VR technology is a Head Mounted Display (HMD) device. When an HMD device is worn on the head, a small display positioned close to both eyes projects a 3D image using parallax. With a gyro sensor that tracks the user's movement, a rendering function that creates images according to that movement, and the like, the user can experience the sensation of being in a 3D space.
Augmented Reality (AR) refers to a technology that displays a single image by superimposing a 3D virtual image on a real image or background. A representative product using AR technology is AR glasses, an electronic device in the form of glasses that implements AR content on a transparent lens. While wearing them like regular glasses, a user may have a large-screen-level display projected in front of the eyes or use various augmented reality contents. The user can experience extended reality in which augmented reality contents are combined throughout the entire 360-degree space centered on the user.
Mixed Reality (MR) includes the meaning of Augmented Reality (AR) that adds virtual information based on reality and Augmented Virtuality (AV) that adds reality information to a virtual environment. In other words, by providing a smart environment in which reality and virtuality are naturally connected, users can have a rich experience.
Extended Reality (XR) collectively refers to ultra-realistic technologies and services encompassing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR).
Hereinafter, in
Reference number 220 denotes a display, which may be manufactured as two displays that provide a left-eye image and a right-eye image, respectively; manufacturing it as a single display is also within the scope of the present disclosure.
In addition, reference number 230 denotes an HMD device (other types of VR devices also fall within the scope of the present disclosure).
As shown in
This is attributed to human visual characteristics: images perceived by both eyes can be combined and recognized as one image in the brain only when the viewpoint of an object appears different depending on the positions of the two eyes. Hence, the device is designed so that images suitable for the left and right eyes are received, respectively.
A first image correction unit 320 included in the left-eye device 300 and a second image correction unit 340 included in the right-eye device 350 independently process the received images for each of R, G, and B, and then transmit the processed images to left-eye and right-eye panels included in an image output unit 330, respectively.
In addition, the images sent to the left-eye device 300 and the right-eye device 350 are generated by an Application Processor (AP).
Meanwhile, in
In addition,
Unlike display panels in other products according to the related art, the HMD device (including other types of VR devices) according to one embodiment of the present disclosure is designed so that a driving layer is put on a silicon semiconductor.
Furthermore, when an image is corrected, the resolution of red pixels increases and the resolution of blue pixels decreases. Therefore, before the video data is output, part of the red pixel data is removed and the blue pixels are filled with black, and the data is then transmitted to the driving layer, whereby the video data is output.
In particular, in the product to which the present disclosure is applied, since a user sees image data through the lens 210 and the like shown in
As described above, the present disclosure may provide a compensation algorithm for a distortion phenomenon caused by a lens distortion parameter in a device (e.g., a VR device such as an HMD, etc.) that provides an image to a user through a special lens. To implement this, it is designed to compensate by reflecting a distortion parameter depending on a distance between a position of a current pixel and a position of a center point based on the center point of the lens distortion parameter.
For example, as shown in
Meanwhile, a design in which the position calculation unit 510 between the current pixel and the center point, the distortion correction weight calculation unit 530 according to the distortion parameter and distance, the image distortion correction calculation unit 540, and the output image synthesis unit 550 (i.e., all components except the storage unit 520) are implemented by one controller, processor, AP, or the like also falls within the scope of the present disclosure.
On the other hand, a Display Driver IC (DDI) may be designed to operate the above-described reference numbers 510, 530, 540 and 550 instead of the AP, and such a design also has the additional effect of lowering power consumption to about 50 mW, nearly 50% of the approximately 100 mW consumed by the AP.
The lens distortion parameter storage unit 520 stores a lens distortion parameter for chromatic aberration correction. The storage unit 520 may be implemented with various types of memories.
The position calculation unit 510 between the current pixel and the center point receives image data and primarily calculates the distance between the center point and the current pixel.
The distortion correction weight calculation unit 530 according to the distortion parameter and the distance is designed to secondarily calculate a distortion correction weight in consideration of the above-described distance.
Unlike the related art, the image distortion correction calculation unit 540 is designed to perform image distortion correction according to the calculated distortion correction weight without using the lens distortion parameter stored in the storage unit 520 as it is.
More specifically, in the image distortion correction calculation unit 540, the lens distortion parameter P stored in the storage unit 520 is multiplied by the calculated correction weight W, and the pixel value of the image data is multiplied by the result P×W.
And, the output image synthesis unit 550 synthesizes the corrected image data and outputs it.
Here, the corrected image data means, for example, the result obtained by multiplying the pixel value of the initially received image data by the P×W described above.
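As a minimal sketch under assumed conventions (a NumPy array per color channel, and the hypothetical weight function weight_from_distance from the earlier example), the flow through the units 510, 530, and 540 described above might look as follows:

    import numpy as np

    def correct_channel(channel_img, center, P, weight_from_distance):
        # Unit 510: distance between each current pixel and the center point.
        h, w = channel_img.shape
        ys, xs = np.mgrid[0:h, 0:w]
        distance = np.hypot(xs - center[0], ys - center[1])
        # Unit 530: distortion correction weight W according to the distance
        # and the distortion parameter.
        W = weight_from_distance(distance)
        # Unit 540: the parameter P from the storage unit 520 is multiplied by W,
        # and the pixel values are multiplied by the result P*W.
        return channel_img * (P * W)

The output image synthesis unit 550 would then combine the corrected channels into the output image.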
Yet, while red and blue pixels are corrected, green pixels are designed not to be corrected, which is one feature of the present disclosure.
That is, according to one embodiment of the present disclosure, by using a distortion correction weight that changes according to the distance between the center point and the current pixel, instead of using the lens distortion parameter stored in the memory as it is, there is a technical effect of solving the chromatic aberration problem that otherwise remains in the outer portion of an image. Such a function may be performed by a controller such as an AP, etc.
In particular, the controller is designed to enable the distortion correction weight to increase as the distance between the center point and the current pixel increases, for example.
There is no particular limitation on the method of obtaining the plurality of distortion correction weights; they may be obtained, for example, by an interpolation technique or by using a Lookup Table (LUT).
On the other hand, if the current pixel corresponds to green, the additional technical effect of minimizing unnecessary power consumption is expected by designing only the modules 510 and 530 to be temporarily disabled.
On the other hand, if the current pixel corresponds to red or blue, the modules 510 and 530 are designed to operate normally.
Therefore, the aforementioned center point corresponds to, for example, the center point of the distortion parameter for red or the center point of the distortion parameter for blue.
As described above, while green pixels are designed to be bypassed without compensation, compensation is designed to be performed on red and blue pixels only, which is also one of the features of the present disclosure.
As shown in
Yet, when a distortion compensation algorithm is applied according to the related art, as shown in
On the other hand, when a distortion compensation algorithm is applied according to one embodiment of the present disclosure, as shown in
In this regard, bilinear interpolation may be modified and applied, which will be described in more detail below with reference to
First, in order to implement one embodiment of the present disclosure, for example, the following equation may be used: o_val = (p1×w_x1 + p2×w_x2)×w_y1 + (p3×w_x1 + p4×w_x2)×w_y2. In particular, a weight PosW according to a position difference (i.e., distance difference) between a center point and a current pixel is additionally used. The weight PosW according to the position (or distance) difference will be described in more detail below with reference to
Here, o_val is a pixel value of the output image, that is, a pixel value of the image after interpolation is completed, and corresponds to the reference number 700 shown in
p1 to p4 are pixel values of the input image, that is, the pixel values referred to during interpolation. Since bi-linear interpolation is used as the example in this description, calculation with the pixel values of four points is described.
w_x1 to w_y2 refer to weight values determined according to the pixel position in the output image relative to the pixel positions in the input image.
cal_h: “h” is an abbreviation for height; cal_h is the y-direction position value in the input image corresponding to j (the current y position in the output image), according to the ratio of h_input (the height of the input image) to h_output (the height of the output image).
cal_w: “w” is an abbreviation for width; cal_w is the x-direction position value in the input image corresponding to i (the current x position in the output image), according to the ratio of w_input (the width of the input image) to w_output (the width of the output image).
An interpolation process will be described in detail as follows.
First, it will be assumed that scaling is performed from 100×80 to 160×100.
In this case, to calculate the pixel value for (50, 36) of the output image, the pixel positions x1, x2, y1, and y2 to be referenced in the input image are found with w_input: 100, h_input: 80, w_output: 160, h_output: 100, i: 50, j: 36, as follows:
cal_h = 36 × 80/100 = 28.8
cal_w = 50 × 100/160 = 31.25
x1 = 31, x2 = 32, y1 = 28, y2 = 29
p1 = pixel value at position (31, 28) in the input image
p2 = pixel value at position (32, 28) in the input image
p3 = pixel value at position (31, 29) in the input image
p4 = pixel value at position (32, 29) in the input image
w_x1 = 1 − (31.25 − 31) = 0.75
w_x2 = 31.25 − 31 = 0.25
w_y1 = 1 − (28.8 − 28) = 0.2
w_y2 = 28.8 − 28 = 0.8
w_x1 to w_y2 are weights given by a first-order (linear) polynomial function of the distance, as used in bi-linear interpolation.
The result calculated as o_val = (p1×0.75 + p2×0.25)×0.2 + (p3×0.75 + p4×0.25)×0.8 becomes the pixel value corresponding to position (50, 36) of the output image.
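The worked example above may be reproduced with the following illustrative sketch; it is provided only to verify the arithmetic and assumes img is indexed as img[y][x]:

    def bilinear_sample(img, cal_w, cal_h):
        # Reference positions in the input image, e.g., x1 = 31, x2 = 32, y1 = 28, y2 = 29.
        x1, y1 = int(cal_w), int(cal_h)
        x2, y2 = x1 + 1, y1 + 1
        # Linear weights, e.g., w_x1 = 0.75, w_x2 = 0.25, w_y1 = 0.2, w_y2 = 0.8.
        w_x1, w_x2 = 1 - (cal_w - x1), cal_w - x1
        w_y1, w_y2 = 1 - (cal_h - y1), cal_h - y1
        p1, p2 = img[y1][x1], img[y1][x2]
        p3, p4 = img[y2][x1], img[y2][x2]
        return (p1 * w_x1 + p2 * w_x2) * w_y1 + (p3 * w_x1 + p4 * w_x2) * w_y2

    # For scaling from 100x80 to 160x100 and output pixel (i, j) = (50, 36):
    # cal_w = 50 * 100 / 160 = 31.25, cal_h = 36 * 80 / 100 = 28.8
    # o_val = bilinear_sample(input_image, 31.25, 28.8)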
Yet, when the present disclosure is applied, PosW is designed to be additionally calculated when calculating cal_h and cal_w, as follows.
As shown in
An example of an equation for obtaining the weight PosW is illustrated at an upper end of
In addition, an embodiment in which a plurality of weights are interpolated for each point and an embodiment in which a plurality of weights are stored in the form of a lookup table are both possible, and both pertain to the scope of the present disclosure.
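Since the equation for PosW itself is given in the drawing, the following sketch only illustrates the two options just mentioned, using an entirely hypothetical table of weights that increase with the distance from the center point:

    # Hypothetical LUT: (distance from center, PosW) pairs; the values are
    # illustrative only and do not come from the disclosure.
    POSW_LUT = [(0.0, 1.00), (100.0, 1.02), (200.0, 1.05), (300.0, 1.10)]

    def pos_w(distance):
        # Read the weight from the LUT, linearly interpolating between
        # neighboring entries; a denser precomputed table would avoid
        # interpolation at run time.
        if distance <= POSW_LUT[0][0]:
            return POSW_LUT[0][1]
        for (d0, w0), (d1, w1) in zip(POSW_LUT, POSW_LUT[1:]):
            if distance <= d1:
                t = (distance - d0) / (d1 - d0)
                return w0 + t * (w1 - w0)
        return POSW_LUT[-1][1]  # clamp beyond the last entry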
As shown in
On the other hand, as shown in
Input image data and a distortion parameter for each channel (e.g., R/G/B, etc.) are received (S1000). Yet, as described above, a green pixel is designed to be bypassed (S1040).
On the other hand, for a red pixel, it is designed to calculate a distance between a center point and a current pixel (S1010), calculate a weight according to a distance and a distortion parameter (S1020), and perform distortion compensation according to the calculated weight (S1030).
Meanwhile, for a blue pixel, it is designed to calculate a distance between a center point and a current pixel (S1050), calculate a weight according to a distance and a distortion parameter (S1060), and perform distortion compensation according to the calculated weight (S1070).
Then, it is designed to synthesize and output data of each compensated pixel (S1080).
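For illustration only, the flow of the steps S1000 to S1080 may be sketched as follows, reusing the hypothetical correct_channel function from the earlier example and assuming per-channel center points and parameters as inputs:

    def process_frame(r, g, b, red_center, blue_center, P_r, P_b, weight_fn):
        # S1040: green pixels are bypassed without compensation.
        g_out = g
        # S1010 to S1030: red uses the center point of the distortion parameter for red.
        r_out = correct_channel(r, red_center, P_r, weight_fn)
        # S1050 to S1070: blue uses the center point of the distortion parameter for blue.
        b_out = correct_channel(b, blue_center, P_b, weight_fn)
        # S1080: synthesize and output the data of each compensated pixel.
        return r_out, g_out, b_out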
In summary, from the perspective of a device that provides a Virtual Reality (VR) service, a lens distortion parameter for chromatic aberration correction is first stored.
After receiving image data, the device primarily calculates a distance between a center point and a current pixel, and secondarily calculates a distortion correction weight by considering the distance.
Unlike the related art, the device does not use lens distortion parameters stored in a memory and the like as they are, but outputs image data of the current pixel according to the calculated distortion correction weight.
In particular, the secondary calculation steps S1020 and S1060 are designed so that the distortion correction weight increases as the distance increases.
As shown in
Again, when the current pixel corresponds to green, the above-described steps S1010, S1020, S1030, S1050, S1060, S1070, and the like are designed to be disabled.
Therefore, the center point in the steps S1010 and S1050 corresponds to, for example, a center point of a distortion parameter for red or a center point of a distortion parameter for blue.
In addition, at least one of the steps shown in
Meanwhile, the present disclosure may also be applied to the Head Mounted Display (HMD) illustrated in
The HMD shown in
In particular, the controller receives image data, primarily calculates a distance between a center point and a current pixel, secondarily calculates a distortion correction weight by considering the distance, and outputs image data of the current pixel according to the calculated distortion correction weight.
It will be appreciated by those skilled in the art to which the present disclosure belongs that the disclosure described above may be practiced in other specific forms without altering its technical ideas or essential features.
For example, an image processing device according to the present disclosure may be implemented in the form of an IC for each component or a combination of two or more components, and the function of the image processing device may be implemented in the form of a program and installed on the IC. When the function of the image processing device according to the present disclosure is implemented as a program, the function of each component included in the image processing device may be implemented as a specific code, and codes for implementing a specific function may be implemented as one program or may be implemented by being divided into a plurality of programs.
It should therefore be understood that the embodiments described above are exemplary and non-limiting in all respects. The scope of the present disclosure is defined by the appended claims, rather than by the detailed description above, and should be construed to cover all modifications or variations derived from the meaning and scope of the appended claims and the equivalents thereof.