The present disclosure relates to an operating method of a tracking system, a HMD (Head Mounted Display) device, and a tracking system. More particularly, the present disclosure relates to an operating method of a tracking system, a HMD device, and a tracking system for generating a viewing image.
High resolution and high frame rate are essential to a good VR (virtual reality) experience. High-fidelity 3D scenes also improve the VR experience, but they introduce high GPU loading at the same time. Thus, a GPU that meets VR system requirements comes at a high price.
Reducing rendering resolution is a direct way to reduce GPU loading. However, it is important to maintain viewing quality while reducing rendering resolution.
One aspect of the present disclosure is related to an operating method of a tracking system. The operating method includes the following operations: obtaining, by a processor, a parameter of a lens of a HMD (Head Mounted Display) device; calculating, by the processor, data of a foveation area according to the parameter; generating, by the processor, a foveation image according to the foveation area; generating, by the processor, a peripheral image whose resolution is lower than a resolution of the foveation image; merging, by the processor, the foveation image and the peripheral image to generate a viewing image; and outputting, by the processor, the viewing image.
Another aspect of the present disclosure is related to a HMD device. The HMD device includes a display circuit with a lens and a processor. The processor is configured to obtain a parameter of the lens, to calculate data of a foveation area according to the parameter, to generate a foveation image according to the foveation area, to generate a peripheral image whose resolution is lower than a resolution of the foveation image, to merge the foveation image and the peripheral image to generate a viewing image, and to output the viewing image.
Another aspect of the present disclosure is related to a tracking system. The tracking system includes a client device with a lens and a host device. The host device includes a processor. The processor is configured to obtain a parameter of the lens, to calculate data of a foveation area according to the parameter, to generate a foveation image according to the foveation area, to generate a peripheral image whose resolution is lower than a resolution of the foveation image, to merge the foveation image and the peripheral image to generate a viewing image, and to output the viewing image.
Through the operations of one embodiment described above, the viewing quality is maintained while reducing the rendering resolution.
The invention can be more fully understood by reading the following detailed description of the embodiments, with reference made to the accompanying drawings as follows:
Reference will now be made in detail to the present embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
It will be understood that, in the description herein and throughout the claims that follow, when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present. Moreover, “electrically connect” or “connect” can further refer to the interoperation or interaction between two or more elements.
It will be understood that, in the description herein and throughout the claims that follow, although the terms “first,” “second,” etc. may be used to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the embodiments.
It will be understood that, in the description herein and throughout the claims that follow, the terms “comprise” or “comprising,” “include” or “including,” “have” or “having,” “contain” or “containing” and the like used herein are to be understood to be open-ended, i.e., to mean including but not limited to.
It will be understood that, in the description herein and throughout the claims that follow, the phrase “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that, in the description herein and throughout the claims that follow, words indicating direction used in the description of the following embodiments, such as “above,” “below,” “left,” “right,” “front” and “back,” are directions as they relate to the accompanying drawings. Therefore, such words indicating direction are used for illustration and do not limit the present disclosure.
It will be understood that, in the description herein and throughout the claims that follow, unless otherwise defined, all terms (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Any element in a claim that does not explicitly state “means for” performing a specified function, or “step for” performing a specific function, is not to be interpreted as a “means” or “step” clause as specified in 35 U.S.C. § 112(f). In particular, the use of “step of” in the claims herein is not intended to invoke the provisions of 35 U.S.C. § 112(f).
In some embodiments, the host device 107B includes a processor 150B. In some embodiments, the client device 105B further includes a processor 130B, an eye tracking circuit 170B and a display circuit 120B. The display circuit 120B includes a lens 110B. The display circuit 120B and the eye tracking circuit 170B are electronically coupled to the processor 130B.
Due to optical effects determined by parameters such as the focal length, the field of view, or other process issues, the pixel density per degree at the peripheral area is lower than that at the center area. Consequently, even though high GPU loading is introduced to render the peripheral area at full resolution, the pixel density per degree at the peripheral area remains low, so the computing resources are wasted.
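The falloff in angular pixel density can be illustrated with a small sketch. The model below is hypothetical and not taken from the disclosure: it assumes the lens maps panel radius r to a perceived angle theta = k1*r + k3*r^3 (a pincushion-style distortion typical of HMD optics), with made-up coefficients and pixel pitch.

```python
def pixels_per_degree(r_mm, pixel_pitch_mm, k1=1.2, k3=0.0008):
    """Pixels per perceived degree at radius r_mm on the panel, under a
    hypothetical polynomial lens-distortion model theta = k1*r + k3*r**3
    (degrees, with r in mm).  The cubic term makes each peripheral pixel
    span a larger perceived angle than a central pixel."""
    dtheta_dr = k1 + 3 * k3 * r_mm ** 2   # degrees per mm at radius r
    return 1.0 / (dtheta_dr * pixel_pitch_mm)

# With a 0.05 mm pixel pitch, density per degree drops toward the edge:
center_ppd = pixels_per_degree(0.0, 0.05)
edge_ppd = pixels_per_degree(30.0, 0.05)
assert edge_ppd < center_ppd
```

Under these example coefficients the density at 30 mm off-axis is only a fraction of the on-axis density, which is why rendering the periphery at full resolution wastes GPU work.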
Details of the present disclosure are described in the paragraphs below with reference to an image processing method in
Reference is made to
It should be noted that the method can be applied to a tracking system or a HMD device having a structure that is the same as or similar to the structure of the tracking system 100B shown in
It should be noted that, in some embodiments, the method may be implemented as a computer program. When the computer program is executed by a computer, an electronic device, or the one or more processors 150A and 150B in
In addition, it should be noted that in the operations of the following method, no particular sequence is required unless otherwise specified. Moreover, the following operations also may be performed simultaneously or the execution times thereof may at least partially overlap.
Furthermore, the operations of the following method may be added to, replaced, and/or eliminated as appropriate, in accordance with various embodiments of the present disclosure.
Reference is made to
In operation S210, obtaining a parameter of a lens of a HMD device. The lens 110A of the display circuit 120A or the lens 110B of the display circuit 120B functions to image the content on the display circuit 120A or 120B at close range for the user. In some embodiments, the operation S210 may be operated by the processor 150A in
In some other embodiments, the processor 150A or 150B may obtain the parameter of the lens 110A or 110B from a database. The database can be, for example, queried from the server of each manufacturer via the Internet, or stored at the HMD device 105A or the client device 105B and regularly updated.
In operation S220, calculating data of a foveation area according to the parameter described above. In some embodiments, the operation S220 may be operated by the processor 150A in
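One way operation S220 could be realized is sketched below. The function, its name, the distortion coefficients, and the density threshold are all assumptions for illustration, reusing the hypothetical theta = k1*r + k3*r^3 lens model: the foveation area is taken as the disc inside which the angular pixel density stays above a chosen fraction of its on-axis value.

```python
def foveation_radius_px(panel_half_width_mm, panel_width_px,
                        k1=1.2, k3=0.0008, ppd_ratio=0.5):
    """Radius (in pixels from the panel center) at which the angular
    pixel density falls to ppd_ratio of its on-axis value, under the
    hypothetical distortion model theta = k1*r + k3*r**3.  Pixels inside
    this radius belong to the full-resolution foveation area; the rest
    belong to the lower-resolution peripheral area."""
    px_per_mm = panel_width_px / (2.0 * panel_half_width_mm)
    # On-axis density scales as 1/k1; it falls to ppd_ratio of that
    # where k1 + 3*k3*r**2 == k1 / ppd_ratio.  Solve for r:
    r_mm = ((k1 / ppd_ratio - k1) / (3.0 * k3)) ** 0.5
    return min(r_mm, panel_half_width_mm) * px_per_mm

# Example: 90 mm wide panel at 1440 px across.
radius = foveation_radius_px(45.0, 1440)
```

With these example numbers the foveation disc covers roughly the central half of the panel width; a steeper distortion (larger k3) or a stricter threshold shrinks it further.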
Reference is made to
In some embodiments, the parameter of the lens 110A and the parameter of the lens 110B include focal lengths, fields of view, or other process issues.
It should be noted that in some embodiments, the display image 300 may include more than the foveation area 330 and the peripheral area 310. The display image 300 may include several concentric areas or gradient areas with different resolutions. The number of concentric areas or gradient areas into which the display image 300 is divided is determined according to the lens parameter.
In some embodiments, the processor 150A or the processor 150B is further configured to obtain eye tracking data and to refine the rendering area, such as the display image 300 and particularly the foveation area 330, according to the eye tracking data.
Reference is made to
Reference is made to
In operation S230, generating a foveation image according to the foveation area. In some embodiments, the operation S230 may be operated by the processor 150A in
For example, reference is made to
In operation S240, generating a peripheral image. In some embodiments, the operation S240 may be operated by the processor 150A in
In some embodiments, the processor 150A further includes a peripheral camera circuit 154A. In some embodiments, the processor 150B further includes a peripheral camera circuit 154B. In some embodiments, operation S240 may be operated by the peripheral camera circuit 154A as illustrated in
In some embodiments, the processor 150A or 150B is further configured to perform an anti-aliasing process while upscaling the peripheral image 310B. In some embodiments, even after upscaling, the resolution of the peripheral image 310B is lower than the resolution of the foveation image 330B. By applying the anti-aliasing process while upscaling the peripheral image 310B, edge flickering artifacts are reduced.
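The upscaling step can be sketched as follows. This is a minimal illustration, not the disclosure's implementation: it uses bilinear interpolation, which smooths the hard block edges that nearest-neighbor duplication would produce and which cause frame-to-frame edge flicker; the function name and the list-of-lists image representation are assumptions.

```python
def upscale_bilinear(img, factor):
    """Upscale a 2-D grayscale image (list of rows of floats) by an
    integer factor using bilinear interpolation.  Interpolating between
    source pixels smooths block edges, reducing the flickering
    artifacts a hard nearest-neighbor upscale would exhibit."""
    h, w = len(img), len(img[0])
    out_h, out_w = h * factor, w * factor
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        fy = min(y / factor, h - 1)          # source row coordinate
        y0, y1 = int(fy), min(int(fy) + 1, h - 1)
        wy = fy - y0
        for x in range(out_w):
            fx = min(x / factor, w - 1)      # source column coordinate
            x0, x1 = int(fx), min(int(fx) + 1, w - 1)
            wx = fx - x0
            top = img[y0][x0] * (1 - wx) + img[y0][x1] * wx
            bot = img[y1][x0] * (1 - wx) + img[y1][x1] * wx
            out[y][x] = top * (1 - wy) + bot * wy
    return out
```

A production renderer would do this filtering on the GPU, but the interpolation principle is the same.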
In operation S250, merging the foveation image and the peripheral image so as to generate a viewing image. In some embodiments, the operation S250 may be operated by the processor 150A in
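The merging step can be illustrated with a simple composite: paste the full-resolution foveation image over the (already upscaled) peripheral image at the foveation area's offset. This is a hypothetical sketch; the function name and offset convention are assumptions, and a real system would typically blend a transition band at the boundary rather than use a hard cut, as noted in the code.

```python
def merge_images(peripheral, foveation, top, left):
    """Composite the full-resolution foveation image over the upscaled
    peripheral image at row `top`, column `left`, producing the viewing
    image.  This sketch uses a hard cut at the boundary; a real system
    would usually blend a transition band there to hide the seam."""
    out = [row[:] for row in peripheral]   # copy; leave input untouched
    for dy, row in enumerate(foveation):
        for dx, px in enumerate(row):
            out[top + dy][left + dx] = px
    return out
```
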
In operation S260, outputting the viewing image. In some embodiments, the operation S260 may be operated by the processor 150A in
Through the operations of the embodiments described above, the tracking system 100B or the HMD device 105A in the present disclosure may optimize the viewing quality while reducing the rendering resolution. In detail, by taking the impact of the lens and the user's gaze into account, the computing resources required for image rendering can be reduced. Particularly, based on the characteristics of the lens, parts of the display image cannot be presented perfectly to the user via the lens on the display. Thus, the resolution of those parts of the display image can be reduced to lessen the computing burden when rendering.
Although the present invention has been described in considerable detail with reference to certain embodiments thereof, other embodiments are possible. Therefore, the scope of the appended claims should not be limited to the description of the embodiments contained herein.
This application claims priority to U.S. Provisional Application Ser. No. 62/674,016, filed May 20, 2018, which is herein incorporated by reference.