The present invention relates, generally, to head-mounted displays and, more particularly, to lens distortion correction for eye tracking systems used in connection with such displays.
Recent years have seen dramatic advances in the performance of virtual reality headsets and other such head-mounted displays (HMDs). Despite these improvements, many users find the long-term use of HMDs uncomfortable due to their overall size and weight. More particularly, as the overall lateral dimension or “depth” of an HMD increases, the rotational force (or moment) applied to the user's head also increases, which can result in significant neck strain. For these and other reasons, there have been significant efforts by HMD manufacturers to reduce the depth of the headset—i.e., to bring the headset closer to the face.
This reduction in HMD size has a number of undesirable consequences, however. For example, in smaller HMDs that employ eye-tracking systems (i.e., systems for determining a gaze point on the internal display screen of the HMD), the resulting distortion, reduced depth-of-field, and compact arrangement of optical components make it difficult to provide accurate eye-tracking results, particularly for users whose inter-pupillary distance (IPD) is significantly larger or smaller than that of the general population. This problem is exacerbated by the use of relatively large and thick VR lenses in such systems.
Systems and methods are therefore needed that overcome these and other limitations of the prior art.
Various embodiments of the present invention relate to systems and methods for, inter alia: i) providing eye-tracking in a compact head-mounted display through the use of an IR-reflecting convex mirror in conjunction with an off-axis image sensor; ii) correcting for lens distortion in a head-mounted display through the use of an IR-reflecting convex mirror; iii) providing eye-tracking support for a wider range of inter-pupillary distances (IPDs); and iv) performing slippage compensation to reduce errors in eye-tracking systems. Various other embodiments, aspects, and features are described in greater detail below.
The present invention will hereinafter be described in conjunction with the appended drawing figures, wherein like numerals denote like elements, and:
The present subject matter relates to improved, compact optical systems for performing eye tracking in head-mounted displays. The disclosed systems and methods minimize or eliminate lens distortion—even in systems with large, thick VR lenses—and are compatible with a wide range of inter-pupillary distances. In that regard, the following detailed description is merely exemplary in nature and is not intended to limit the inventions or the application and uses of the inventions described herein. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description. In the interest of brevity, conventional techniques and components related to lenses, mirrors, head-mounted displays, eye-tracking algorithms, and digital image processing may not be described in detail herein.
Referring first to
HMD 110 may be used in the context of virtual reality, augmented reality, or mixed reality applications. Accordingly, the term “virtual reality headset” is used herein without loss of generality. Furthermore, while the illustrated embodiments are presented in the context of binocular vision, the various optical systems and methods described herein may also be used in connection with monocular eye tracking.
Referring now to the schematic diagram of
One or more IR LEDs 261 and 262 (e.g., 850, 880, or 940 nm LEDs) are provided adjacent to the front surface 211 of VR lens 210 for performing eye tracking as described in further detail below. Thus, VR lens 210 may correspond to VR lens 122 of
With continued reference to
In the illustrated embodiment, hot mirror 220 is offset laterally (e.g., along the x-axis) a predetermined distance from central axis 203, and convex surface 221 is generally oriented at a predetermined angle such that hot mirror 220 reflects infrared light (e.g., light produced by IR LEDs 261 and 262) off-axis onto a second mirror 230.
Mirror 230 (which is also configured to reflect at least a portion of incident infrared light) is oriented such that surface 231 reflects the incident infrared light onto an image sensor or camera 240 (which may have an associated lens) that is configured to thereby acquire an infrared image of eye 201 to be used (e.g., by eye tracking module 242) to achieve the eye-tracking functionality described herein.
In this regard, as used herein the phrase “eye tracking system” refers to the components of optical system 200 that are used primarily to provide eye tracking functionality—i.e., IR LEDs 261 and 262, hot mirror 220, mirror 230, camera 240, eye-tracking module 242, and the various software code executed by eye-tracking module 242, which may be implemented using a variety of suitable software platforms and languages.
In that regard, the dotted lines in
The resulting image 301, as shown in
The sizes, shapes, relative positions, and materials of the components used to implement the optical system 200 illustrated in
The use of a convex hot mirror 220 results in a number of benefits. For example, the image of eye 201 as reflected from convex surface 221 is smaller than the image that would be reflected from a planar mirror. Because the eye occupies less area in the image, eye 201 can be observed by camera 240 over a wider range of inter-pupillary distances. In addition, by using a convex hot mirror 220, at least a portion of the distortion and magnification caused by the relatively large, thick VR lens 210 can be reversed or eliminated, providing a more accurate image of eye 201.
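The demagnification provided by a convex mirror can be illustrated with the standard mirror and magnification equations. The following sketch is purely illustrative; the focal length and eye-to-mirror distance used below are hypothetical values, not dimensions taken from the disclosed embodiments.

```python
# Illustrative calculation: lateral magnification of a convex mirror using
# 1/d_o + 1/d_i = 1/f and m = -d_i / d_o. Convex mirrors have a negative
# focal length, so the image is virtual, upright, and reduced (0 < m < 1).
# All numeric values below are hypothetical.

def convex_mirror_magnification(focal_length_mm: float, object_dist_mm: float) -> float:
    """Return the lateral magnification m = -d_i / d_o."""
    d_i = 1.0 / (1.0 / focal_length_mm - 1.0 / object_dist_mm)
    return -d_i / object_dist_mm

# A hypothetical convex mirror (f = -40 mm) viewing an eye ~50 mm away
# produces an image less than half the size of a planar-mirror reflection:
m = convex_mirror_magnification(-40.0, 50.0)
print(round(m, 3))
```

The reduced image is what allows the eye to remain within the camera's field of view over a wider range of inter-pupillary distances.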
More particularly, as shown in
HMD 410 will generally include various electronic components and software configured to accomplish the virtual reality imaging functions described herein (including, for example, eye tracking module 242 of
In some embodiments, eye tracking is accomplished by an eye tracking module that is remote from the actual HMD 110. That is, certain imaging data may be transferred over a network to a remote server, which then performs at least a portion of the computationally complex operations necessary to determine the CR, PC, or other gaze point data, which is then transmitted back over the network to HMD 110. In some embodiments, however, eye tracking is computed by an eye tracking module 242 residing within the housing of HMD 110 or tethered to HMD 110 via a high-speed data connection.
In accordance with various embodiments, HMD 110 incorporates various forms of slippage and/or position compensation. More particularly, the image produced by image sensor 240 is processed to determine the offsets of the positions of the user's pupils and glints—the corneal reflections produced by the IR illuminators. For each eye, these offsets serve as the input to one or more interpolation functions that determine the gaze point within a field of interest, typically a display screen, although in some cases it may be the field of view of a scene camera. The interpolation functions are determined from the data generated when the user performs a calibration. During calibration, the user is asked to focus on a number of targets arranged on the display screen while data such as pupil and glint locations, corneal distance, and pupil diameter are collected.
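One common way to realize such interpolation functions is a least-squares fit of a low-order polynomial mapping pupil-glint offsets to screen coordinates. The sketch below illustrates this general regression approach under stated assumptions; the quadratic feature set and the synthetic calibration data are illustrative choices, not the specific functions disclosed herein.

```python
# Sketch of a regression-based gaze interpolation function: fit a
# second-order polynomial mapping pupil-glint offsets (dx, dy) to screen
# points (sx, sy) from calibration samples. Synthetic data; illustrative only.
import numpy as np

def design_matrix(dx, dy):
    # Quadratic feature expansion commonly used in gaze mapping.
    return np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])

def fit_gaze_map(offsets, targets):
    """offsets: (N, 2) pupil-glint offsets; targets: (N, 2) screen points."""
    A = design_matrix(offsets[:, 0], offsets[:, 1])
    coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return coeffs  # shape (6, 2): one column of coefficients per screen axis

def predict_gaze(coeffs, offsets):
    A = design_matrix(offsets[:, 0], offsets[:, 1])
    return A @ coeffs

# Synthetic nine-target calibration generated by a known (linear) map:
rng = np.random.default_rng(0)
offsets = rng.uniform(-1, 1, size=(9, 2))
targets = np.column_stack([3 + 2 * offsets[:, 0], -1 + 4 * offsets[:, 1]])

coeffs = fit_gaze_map(offsets, targets)
pred = predict_gaze(coeffs, offsets)
print(np.allclose(pred, targets, atol=1e-8))
```

In practice the fit would be computed once per eye from the calibration samples, then evaluated on every frame's measured offset.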
It has been found by the present inventors that the resulting interpolation functions are most accurate (i.e., the gaze point they output is closest to what the user is actually looking at on the target display screen) when the user's eyes remain at the position where the calibration was performed. However, HMD 110 may shift on the user's head, i.e., to the left or right and/or up or down. This slippage changes the position of the eyes with respect to image sensor 240 and IR LEDs 261, 262. For a standalone tracker, the user is free to move his or her head or body, likewise changing the position of the eyes with respect to the image sensor and IR LEDs. The farther the user's eyes stray from the calibration position, the less accurate the gaze point determination becomes.
Slippage or position compensation is intended to minimize the effect of a change of eye position on the accuracy of gaze point determination. In accordance with the present invention, the position of the glints and CRs in the sensor image, along with the distance information calculated by the geometric models, may be used to normalize the pupil/glint offset data to make it less dependent on eye position.
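One simple form such normalization can take is to express the pupil position relative to the glint centroid and to scale by the inter-glint distance, which varies with eye-to-camera distance. The sketch below is an assumed illustration of this idea, not the specific normalization of the present invention; all coordinates are hypothetical.

```python
# Sketch of glint-based offset normalization: the pupil offset is measured
# from the midpoint of two glints and divided by the inter-glint distance
# (a proxy for eye-to-camera distance), so that a uniform image-scale change
# caused by slippage leaves the normalized offset unchanged. Illustrative only.
import math

def normalized_offset(pupil, glint_a, glint_b):
    """pupil, glint_a, glint_b: (x, y) image coordinates in pixels."""
    mid = ((glint_a[0] + glint_b[0]) / 2, (glint_a[1] + glint_b[1]) / 2)
    span = math.dist(glint_a, glint_b)
    return ((pupil[0] - mid[0]) / span, (pupil[1] - mid[1]) / span)

# The same eye pose imaged closer to the camera (all features scaled 1.25x)
# yields the same normalized offset:
near = normalized_offset((100, 80), (90, 75), (110, 75))
far = normalized_offset((125, 100), (112.5, 93.75), (137.5, 93.75))
print(near, far)
```

A fuller implementation would also fold in the distance information from the geometric models mentioned above, but the scale-invariance shown here captures the basic mechanism.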
It will be appreciated that the slippage compensation techniques described above are not limited to head-mounted displays, and may be used, for example, in conjunction with remote trackers—i.e., eye tracking systems that are fixed to the bottom portion of a desktop or laptop computer display.
In summary, what has been described herein are various systems and methods for providing eye-tracking in compact head-mounted displays. In accordance with one embodiment, an eye-tracking system includes at least one infrared LED configured to illuminate the user's eye and a first mirror positioned between the first lens and the display screen, wherein the first mirror has a convex face configured to substantially reflect infrared light received from the user's illuminated eye. The system includes an image sensor configured to receive infrared light reflected by the first mirror to thereby produce an image of the user's illuminated eye. An eye-tracking module communicatively coupled to the image sensor is configured to determine a gaze point on the display screen based on the image of the user's illuminated eye.
Embodiments of the present disclosure may be described herein in terms of functional and/or logical block components and various processing steps. It should be appreciated that such block components may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. For example, an embodiment of the present disclosure may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
In addition, those skilled in the art will appreciate that embodiments of the present disclosure may be practiced in conjunction with any number of systems, and that the systems described herein are merely exemplary embodiments of the present disclosure. Further, the connecting lines shown in the various figures contained herein are intended to represent example functional relationships and/or physical couplings between the various elements. It should be noted that many alternative or additional functional relationships or physical connections may be present in an embodiment of the present disclosure.
As used herein, the terms “module” or “controller” refer to any hardware, software, firmware, electronic control component, processing logic, and/or processor device, individually or in any combination, including without limitation: application specific integrated circuits (ASICs), field-programmable gate-arrays (FPGAs), dedicated neural network devices (e.g., Google Tensor Processing Units), electronic circuits, processors (shared, dedicated, or group) configured to execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations, nor is it intended to be construed as a model that must be literally duplicated.
While the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing various embodiments of the invention, it should be appreciated that the particular embodiments described above are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. To the contrary, various changes may be made in the function and arrangement of elements described without departing from the scope of the invention.
This application claims priority to U.S. Provisional Patent Application No. 62/747,322, filed Oct. 18, 2018, the entire contents of which are hereby incorporated by reference.