The present invention relates to eye tracking.
Optical arrangements for near eye display (NED), head mounted display (HMD) and head up display (HUD) require a large aperture to cover the area in which the observer's eye is located (commonly referred to as the eye motion box, or EMB). In order to implement a compact device, the image that is to be projected into the observer's eye is generated by a small optical image generator (projector) having a small aperture, which is then multiplied to generate a large aperture.
An approach to aperture multiplication in one dimension has been developed based on a parallel-faced slab of transparent material within which the image propagates by internal reflection. Part of the image wavefront is coupled out of the slab, either by use of obliquely angled partial reflectors or by use of a diffractive optical element on one surface of the slab. Such a slab is referred to herein as a light-guide optical element (LOE), light-transmitting substrate, or optical waveguide. The principles of such aperture multiplication are illustrated schematically in
The coupled-in image 18 propagates along the substrate 10 by repeated internal reflection from the faces 12, 14, impinging on an optical coupling-out configuration 28, illustrated here schematically as a sequence of partially reflective surfaces 28 at an oblique angle (αsur) to the parallel faces 12, 14. Part of the image intensity is reflected at these surfaces so as to be coupled out of the substrate as rays 30A and 30B toward the pupil 32 of an eye 34 of an observer located in the EMB 36 at an eye relief (ER) distance 38 from the surface 12. In order to minimize unwanted reflections which might give rise to ghost images, the partially reflective surfaces 28 are preferably coated so as to have low reflectance for a first range of incident angles, while having the desired partial reflectivity for a second range of incident angles. Specifically, a ray with a small inclination to the normal of a partially reflective surface 28 (represented here as angle βref) is split in order to generate a reflected ray for coupling out, while a ray with a high inclination to the normal is transmitted with negligible reflection.
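As a rough geometric sketch (illustrative only; the function name and sample values are not taken from the disclosure), the lateral distance a guided ray advances between successive reflections off the same face of the slab follows directly from the slab thickness and the in-substrate propagation angle, which determines the spacing at which the ray encounters the coupling-out surfaces:

```python
import math

def bounce_spacing_mm(thickness_mm, prop_angle_deg):
    """Lateral advance of a guided ray between successive reflections
    off the same major face of a parallel-faced slab.

    prop_angle_deg is the propagation angle measured from the surface
    normal; the ray crosses the slab thickness twice per full cycle.
    """
    return 2.0 * thickness_mm * math.tan(math.radians(prop_angle_deg))

# e.g., a 2 mm slab with rays guided at 60 degrees from the normal
spacing = bounce_spacing_mm(2.0, 60.0)
```

For instance, a steeper propagation angle (closer to grazing) spreads the reflections farther apart, which is one factor in choosing the pitch of the partially reflective surfaces.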
The projected image 18 is a collimated image, i.e., where each pixel is represented by a beam of parallel rays at a corresponding angle, equivalent to light from a scene far from the observer (the collimated image is referred to as being “collimated to infinity”). The image is represented here simplistically by rays corresponding to a single point in the image, typically a centroid of the image, but in fact includes a range of angles to each side of this central beam, which are coupled in to the substrate with a corresponding range of angles, and similarly coupled out at corresponding angles, thereby creating a field of view corresponding to parts of the image arriving in different directions to the eye 34 of the observer.
An optical function which could be useful for NED, HMD or HUD designs is eye tracking, or sensing the direction the eye of the observer is looking relative to the direction of the head (commonly referred to as the gaze direction). Various solutions for eye tracking have been proposed. In one set of solutions, the eye is imaged within the EMB via the LOE by coupling light, reflected from the eye, into the LOE such that the reflected light propagates through the LOE by internal reflection back to the image projector (i.e., in a reverse direction relative to the image light from the image projector). These solutions attempt to overcome the fundamental problem that the EMB is not located at infinity like the collimated image from the image projector, but rather at a relatively close distance to the LOE. In another set of solutions, the EMB is imaged using one or more cameras deployed in front of the eye at an off-axis position on peripheral portions of a mechanical body to which the LOE is mounted, such as an eye-glasses frame. However, the proximity between the peripheral portions of the mechanical body and the eye makes imaging the eye within the EMB difficult due to the relatively large keystone angle. Deploying a camera directly in front of the eye could enable high-quality EMB imaging and image processing; however, positioning cameras directly in front of the eye would obscure the viewer's natural view.
Aspects of the present invention provide an eye tracker and corresponding method for tracking the gaze direction of a human eye based on imaging the eye via a light-guide optical element, and are particularly suitable for integrating as part of a NED, HMD or HUD.
Aspects of the present invention provide an eye tracker and corresponding method for tracking the gaze direction of a human eye based on imaging the eye via a light redirecting optical arrangement, associated with a light-guide optical element, that redirects light reflected from the eye, in response to illumination of the eye, toward an optical sensor as unguided light, and are particularly suitable for integrating as part of a NED, HMD or HUD.
According to the teachings of an embodiment of the present invention, there is provided an optical system. The optical system comprises: a light-transmitting substrate having at least two major surfaces deployed with a first of the major surfaces in facing relation to an eye of a viewer; an optical sensor deployed for sensing light; a light redirecting arrangement associated with the light-transmitting substrate configured to deflect light from the eye toward the optical sensor such that the deflected light that reaches the optical sensor is unguided by the light-transmitting substrate, and the deflecting of light by the light redirecting arrangement occurs at the light-transmitting substrate; and at least one processor electrically coupled to the optical sensor and configured to process signals from the optical sensor to derive a current gaze direction of the eye.
Optionally, the optical system further comprises: an illumination arrangement deployed to illuminate the eye with light such that the eye reflects a proportion of the light from the illumination arrangement as reflected light, and the reflected light corresponds to the light from the eye that is deflected by the light redirecting arrangement.
Optionally, the illumination arrangement includes at least a first source of light and a second source of light, the first source of light is configured to produce light having wavelengths in a given first range of wavelengths, and the second source of light is configured to produce light having wavelengths in a given second range of wavelengths, the given first range of wavelengths and given second range of wavelengths being non-overlapping ranges.
Optionally, the light from the eye that is deflected by the light redirecting arrangement primarily includes light having wavelengths outside of the visible light region of the electromagnetic spectrum.
Optionally, the light from the eye that is deflected by the light redirecting arrangement primarily includes light having wavelengths in the visible light region of the electromagnetic spectrum.
Optionally, the light redirecting arrangement transmits light having wavelengths in the visible light region of the electromagnetic spectrum and reflects light having wavelengths outside of the visible light region of the electromagnetic spectrum.
Optionally, the light redirecting arrangement includes at least one partially reflective surface located within the light-transmitting substrate.
Optionally, the two major surfaces of the light-transmitting substrate are mutually parallel, and the at least one partially reflective surface is a flat surface that is at an oblique angle to the two major surfaces.
Optionally, the light-transmitting substrate is configured to guide light, corresponding to an image collimated to infinity, by internal reflection between the two major surfaces of the light-transmitting substrate, and the optical system further comprises: a second at least one partially reflective surface located within the light-transmitting substrate for coupling the light, guided by internal reflection between the two major surfaces, out of the light-transmitting substrate to the eye of the viewer.
Optionally, the second at least one partially reflective surface is a flat surface at an oblique angle to the two major surfaces.
Optionally, the at least one partially reflective surface and the second at least one partially reflective surface are parallel to each other.
Optionally, the at least one partially reflective surface and the second at least one partially reflective surface are non-parallel to each other.
Optionally, the at least one partially reflective surface is deployed in non-overlapping relation to the second at least one partially reflective surface.
Optionally, the at least one partially reflective surface is deployed in overlapping relation to the second at least one partially reflective surface.
Optionally, the light redirecting arrangement includes a diffractive element associated with at least a portion of one of the major surfaces of the light-transmitting substrate.
Optionally, the light redirecting arrangement includes a selectively reflective surface associated with at least a portion of one of the major surfaces of the light-transmitting substrate.
Optionally, the selectively reflective surface is formed from at least one of a dielectric coating or a dichroic coating applied to the at least the portion of the major surface.
Optionally, the light redirecting arrangement deflects a first set of light rays from the eye through an imaging lens toward the optical sensor so as to form a first image of at least a portion of the eye, and the optical system further comprises: a second light redirecting arrangement configured to deflect a second set of light rays from the eye through the imaging lens toward the optical sensor so as to form a second image of at least a portion of the eye.
Optionally, the at least one processor is further configured to process signals from the optical sensor that correspond to the first and second images so as to determine a distance between the eye and the first of the major surfaces.
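The distance determination from two images of the eye can be sketched, purely for illustration, as a triangulation: if the two light redirecting arrangements view the eye from laterally separated regions, the shift (disparity) of a pupil feature between the two images encodes depth. The function below is a hypothetical sketch under the assumption of a simple pinhole imaging model with a known baseline; it is not the processing method of the disclosure.

```python
def estimate_eye_distance(baseline_mm, focal_px, disparity_px):
    """Triangulate eye distance from the shift of a pupil feature
    between the first and second images (pinhole-model sketch).

    baseline_mm  -- assumed lateral separation of the two deflection regions
    focal_px     -- effective focal length of the imaging lens, in pixels
    disparity_px -- shift of the pupil center between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return baseline_mm * focal_px / disparity_px
```

A larger disparity corresponds to an eye closer to the substrate, which is the usual stereo relationship.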
Optionally, the light redirecting arrangement includes one of: at least one partially reflective surface located within a first portion of the light-transmitting substrate, a diffractive element associated with at least a first portion of one of the major surfaces of the light-transmitting substrate, or a selectively reflective surface associated with at least a first portion of one of the major surfaces of the light-transmitting substrate, and the second light redirecting arrangement includes one of: at least one partially reflective surface located within a second portion of the light-transmitting substrate, a diffractive element associated with at least a second portion of one of the major surfaces of the light-transmitting substrate, or a selectively reflective surface associated with at least a second portion of one of the major surfaces of the light-transmitting substrate.
Optionally, at least one of the major surfaces of the light-transmitting substrate is a curved surface.
Optionally, the light-transmitting substrate is configured to guide light, corresponding to an image collimated to infinity, by internal reflection between the two major surfaces of the light-transmitting substrate, and the optical system further comprises: an optical coupling-out configuration for coupling the light, guided by internal reflection between the two major surfaces, out of the light-transmitting substrate.
Optionally, the optical coupling-out configuration includes a diffractive element.
Optionally, the optical coupling-out configuration includes at least one partially reflective surface located within the light-transmitting substrate.
Optionally, the two major surfaces of the light-transmitting substrate are parallel to each other, and the at least one partially reflective surface is a flat surface at an oblique angle to the two major surfaces.
Optionally, the optical coupling-out configuration is deployed in non-overlapping relation to the light redirecting arrangement.
Optionally, the optical coupling-out configuration is deployed in overlapping relation to the light redirecting arrangement.
Optionally, the light-transmitting substrate is configured to guide light in one dimension.
Optionally, the light-transmitting substrate is configured to guide light in two dimensions.
Optionally, the light redirecting arrangement is deployed in a first set of parallel planes, and the optical coupling-out configuration is deployed in a second set of parallel planes.
Optionally, the first and second sets of planes are mutually parallel.
Optionally, the first and second sets of planes are mutually orthogonal.
Optionally, the first and second sets of planes are mutually oblique.
Optionally, the light-transmitting substrate is integrated as part of a near eye display.
Optionally, the light-transmitting substrate is integrated as part of a head up display.
Optionally, the optical sensor is deployed between the eye of the viewer and the first of the major surfaces.
Optionally, the light from the eye that is deflected by the light redirecting arrangement undergoes at most a single reflection within the light-transmitting substrate before reaching the optical sensor.
Optionally, the optical system further comprises: at least one imaging optical element deployed in an optical path from the light redirecting arrangement to the optical sensor for forming at least one image of at least a portion of the eye on the optical sensor.
There is also provided according to an embodiment of the teachings of the present invention an optical system. The optical system comprises: a light-transmitting substrate having two mutually parallel major external surfaces deployed with one of the major external surfaces in facing relation to an eye of a viewer; an optical coupling-in configuration for coupling light corresponding to a collimated image into the light-transmitting substrate, so as to propagate within the light-transmitting substrate by internal reflection between the major external surfaces; an optical coupling-out configuration for coupling light, propagating within the light-transmitting substrate by internal reflection, out of the light-transmitting substrate; an optical sensor deployed for sensing light; a light redirecting arrangement associated with the light-transmitting substrate configured to deflect light from the eye toward the optical sensor such that the deflected light that reaches the optical sensor is unguided by the light-transmitting substrate, and the deflecting of light by the light redirecting arrangement occurs at the light-transmitting substrate; and at least one processor electrically coupled to the optical sensor and configured to process signals from the optical sensor to derive a current gaze direction of the eye.
Optionally, the light redirecting arrangement includes at least one partially reflective surface located within the light-transmitting substrate, and the at least one partially reflective surface is a flat surface at an oblique angle to the two major external surfaces.
Optionally, the optical coupling-out configuration includes a second at least one partially reflective surface, and the second at least one partially reflective surface is a flat surface at an oblique angle to the two major external surfaces.
Optionally, the at least one partially reflective surface and the second at least one partially reflective surface are mutually parallel.
Optionally, the at least one partially reflective surface is non-parallel to the second at least one partially reflective surface.
Optionally, the optical coupling-out configuration is deployed in non-overlapping relation to the light redirecting arrangement.
Optionally, the optical coupling-out configuration is deployed in overlapping relation to the light redirecting arrangement.
Optionally, the light corresponding to the collimated image primarily includes light having wavelengths in the visible light region of the electromagnetic spectrum, and the light from the eye that is deflected by the light redirecting arrangement primarily includes light having wavelengths outside of the visible light region of the electromagnetic spectrum.
Optionally, the optical system further comprises: at least one imaging optical element deployed in an optical path from the light redirecting arrangement to the optical sensor for forming at least one image of at least a portion of the eye on the optical sensor.
There is also provided according to an embodiment of the teachings of the present invention an optical system. The optical system comprises: a light-transmitting substrate having two mutually parallel major external surfaces deployed with one of the major external surfaces in facing relation to an eye of a viewer; an optical coupling-in configuration for coupling light corresponding to a collimated image into the light-transmitting substrate, so as to propagate within the light-transmitting substrate by internal reflection between the major external surfaces; at least one first partially reflective surface located within the light-transmitting substrate for coupling light, propagating within the light-transmitting substrate by internal reflection, out of the light-transmitting substrate, the at least one first partially reflective surface is a flat surface that is non-parallel to the two major external surfaces and is at an oblique angle to the two major external surfaces; an optical sensor deployed for sensing light; a light redirecting arrangement including at least one second partially reflective surface located within the light-transmitting substrate configured to deflect light from the eye toward the optical sensor such that the deflected light that reaches the optical sensor is unguided by the light-transmitting substrate, and the deflecting of light rays by the light redirecting arrangement occurs at the light-transmitting substrate; and at least one processor electrically coupled to the optical sensor and configured to process signals from the optical sensor to derive a current gaze direction of the eye.
Optionally, the at least one second partially reflective surface transmits light having wavelengths in the visible light region of the electromagnetic spectrum and reflects light having wavelengths outside of the visible light region of the electromagnetic spectrum.
Optionally, the light corresponding to the collimated image primarily includes light having wavelengths in the visible light region of the electromagnetic spectrum, and the light from the eye that is deflected by the light redirecting arrangement primarily includes light having wavelengths outside of the visible light region of the electromagnetic spectrum.
Optionally, the at least one first partially reflective surface and the at least one second partially reflective surface are mutually parallel.
Optionally, the at least one first partially reflective surface is non-parallel to the at least one second partially reflective surface.
Optionally, the at least one first partially reflective surface is deployed in non-overlapping relation to the at least one second partially reflective surface.
Optionally, the at least one first partially reflective surface is deployed in overlapping relation to the at least one second partially reflective surface.
Optionally, the optical system further comprises: at least one imaging optical element deployed in an optical path from the light redirecting arrangement to the optical sensor for forming at least one image of at least a portion of the eye on the optical sensor.
There is also provided according to an embodiment of the teachings of the present invention an optical system. The optical system comprises: an optical sensor deployed for sensing light; a light redirecting arrangement associated with a light-transmitting substrate having at least two major surfaces deployed with a first of the major surfaces in facing relation to an eye of a viewer, the light redirecting arrangement configured to deflect light from the eye toward the optical sensor such that the deflected light that reaches the optical sensor is unguided by the light-transmitting substrate, and the deflecting of light by the light redirecting arrangement occurs at the light-transmitting substrate; and at least one processor electrically coupled to the optical sensor and configured to process signals from the optical sensor to derive a current gaze direction of the eye.
There is also provided according to an embodiment of the teachings of the present invention an optical system. The optical system comprises: a light-transmitting substrate having at least two major surfaces deployed with a first of the major surfaces in facing relation to an eye of a viewer, the location of the eye relative to the light-transmitting substrate defining an eye motion box; an illumination arrangement deployed to illuminate the eye motion box with eye-tracking light such that a proportion of intensity of the eye-tracking light is reflected by the eye as reflected light; an optical sensor deployed for sensing light; a light redirecting arrangement associated with the light-transmitting substrate configured to deflect the reflected light toward the optical sensor, the deflected light that reaches the optical sensor is unguided by the light-transmitting substrate, and the deflecting of light by the light redirecting arrangement occurs at the light-transmitting substrate; and at least one processor electrically coupled to the optical sensor and configured to process signals from the optical sensor to derive a current gaze direction of the eye.
There is also provided according to an embodiment of the teachings of the present invention an optical system. The optical system comprises: an illumination arrangement deployed to illuminate an eye motion box with eye-tracking light, the eye motion box defining a location of an eye of a viewer relative to a light-transmitting substrate having at least two major surfaces deployed with a first of the major surfaces in facing relation to the eye, a proportion of intensity of the eye-tracking light being reflected by the eye as reflected light; an optical sensor deployed for sensing light; a light redirecting arrangement associated with the light-transmitting substrate and configured to deflect the reflected light toward the optical sensor, the deflected light that reaches the optical sensor is unguided by the light-transmitting substrate, and the deflecting of light by the light redirecting arrangement occurs at the light-transmitting substrate; and at least one processor electrically coupled to the optical sensor and configured to process signals from the optical sensor to derive a current gaze direction of the eye.
Within the context of this document, the term “guided” generally refers to light that is trapped within a light-transmitting material (e.g., a substrate) by internal reflection at major external surfaces of the light-transmitting material, such that the light that is trapped within the light-transmitting material propagates in a propagation direction through the light-transmitting material. Light propagating within the light-transmitting substrate is trapped by internal reflection when the propagating light is incident to major external surfaces of the light-transmitting material at angles of incidence that are within a given angular range. The internal reflection of the trapped light may be in the form of total internal reflection, whereby propagating light that is incident to major external surfaces of the light-transmitting material at angles greater than a critical angle (defined in part by the refractive index of the light-transmitting material and the refractive index of the medium in which the light-transmitting material is deployed, e.g., air) undergoes internal reflection at the major external surfaces. Alternatively, the internal reflection of the trapped light may be effectuated by a coating, such as an angularly selective reflective coating, applied to the major external surfaces of the light-transmitting material to achieve reflection of light that is incident to the major external surfaces within the given angular range. Light that is guided through a light-transmitting material undergoes at least two reflections from the major external surfaces of the light-transmitting material.
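The critical angle mentioned above follows from Snell's law: light is totally internally reflected when its angle of incidence (from the surface normal) exceeds arcsin of the ratio of the refractive indices. A minimal sketch, with an illustrative substrate index (the value 1.5 is an assumption, not from the disclosure):

```python
import math

def critical_angle_deg(n_substrate, n_medium=1.0):
    """Critical angle for total internal reflection at the interface
    between the light-transmitting material and the surrounding medium.

    Rays incident on the major external surfaces at angles (from the
    normal) greater than this value remain trapped in the substrate.
    """
    if n_medium >= n_substrate:
        raise ValueError("TIR requires n_substrate > n_medium")
    return math.degrees(math.asin(n_medium / n_substrate))

# e.g., a glass-like substrate (n ~ 1.5) in air
theta_c = critical_angle_deg(1.5)
```

A higher-index substrate yields a smaller critical angle and therefore a wider range of guided propagation angles.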
The term “unguided”, as used within the context of the present disclosure, generally refers to light that is not guided. Light that is unguided traverses a light-transmitting material (e.g., a substrate) without being trapped within the light-transmitting material, i.e., without being trapped between external major surfaces of the light-transmitting material by internal reflection. The light redirecting arrangements of the present disclosure deflect light from the EMB/viewer's eye to an imaging/detection subsystem, and in particular to an optical sensor, such that the light is deflected at the light-transmitting material without being trapped within the light-transmitting material, so as to propagate to the optical sensor under free-space propagation. In other words, the light from the EMB/viewer's eye is deflected by the light redirecting arrangements such that the light that reaches the optical sensor is unguided by the light-transmitting material.
The eye tracker according to the various embodiments of the present invention relies on deflection, by a light redirecting arrangement, of light reflected from the eye toward an optical sensor. The light that is reflected from the eye is also referred to herein as eye-tracking light, and is referred to herein as being within the “eye-tracking spectrum”. The eye tracker according to the various embodiments of the present invention is particularly effective when the eye-tracking light is in the near infrared (NIR) region of the electromagnetic spectrum (i.e., when the eye-tracking spectrum is within the NIR region). However, the eye tracker may also be effective when the eye-tracking light is in the visible region of the electromagnetic spectrum (i.e., when the eye-tracking spectrum is within the visible light region). In addition, and as will be discussed in greater detail, embodiments of the present invention are also contemplated in which the eye-tracking spectrum is outside of both the visible light and NIR regions.
Within the context of this document, light in the NIR region of the electromagnetic spectrum generally refers to light having wavelengths in the range of 700-1400 nanometers (nm), and in certain instances 680-1400 nm. The wavelengths in the vicinity of 700 nm, i.e., in the range of 680-750 nm, may encroach on darker red visible light, but may be of particular advantage when used to illuminate the eye for eye tracking purposes. In the context of the present document, light that is described as primarily having wavelengths in the NIR region generally refers to light having wavelengths in the range of 700-1400 nm or 680-1400 nm unless explicitly stated otherwise. In the context of this document, light that is described as having wavelengths outside of the NIR region generally refers to light having wavelengths less than 700 nm (or less than 680 nm) or greater than 1400 nm unless explicitly stated otherwise.
Within the context of this document, light in the visible region of the electromagnetic spectrum generally refers to light having wavelengths in the range of 380-750 nm. Accordingly, there may be some overlap between the NIR region and the visible light region. In the context of this document, light that is described as primarily having wavelengths in the visible light region generally refers to light having wavelengths in the range of 380-700 nm or 380-680 nm unless explicitly stated otherwise. In the context of this document, light that is described as primarily having wavelengths outside of the visible light region generally refers to light having wavelengths less than 380 nm or greater than 700 nm (or greater than 680 nm) unless explicitly stated otherwise. The visible region is referred to interchangeably herein as the “visible light region”, “photopic region”, and “photopic spectrum”.
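The wavelength conventions above can be summarized, for illustration only, as a small classification helper. The function and its `nir_cutoff` parameter are hypothetical; the parameter captures the document's two alternative boundary definitions (700 nm or 680 nm) between the visible and NIR regions:

```python
def classify_wavelength(nm, nir_cutoff=700):
    """Classify a wavelength (in nm) per the ranges defined above.

    nir_cutoff -- boundary between visible and NIR; 700 by default,
                  or 680 for the alternative definition.
    """
    if 380 <= nm < nir_cutoff:
        return "visible"
    if nir_cutoff <= nm <= 1400:
        return "NIR"
    return "other"  # UV below 380 nm, or IR beyond 1400 nm
```

Note that the helper resolves the stated 700-750 nm overlap by assigning it according to the chosen cutoff, since "primarily visible" and "primarily NIR" are defined as non-overlapping ranges.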
Unless otherwise defined herein, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein may be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.
Some embodiments of the present invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.
Attention is now directed to the drawings, where like reference numerals or characters indicate corresponding or like components. In the drawings:
Embodiments of the present invention provide various apparatus and corresponding methods for tracking the gaze direction of a human eye based on imaging the eye via an optical arrangement associated with a light-guide optical element.
The principles and operation of the various eye tracking apparatus according to the present invention may be better understood with reference to the drawings accompanying the description.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.
Referring now to the drawings,
The light that is reflected from the eye 152 propagates toward the light redirecting arrangement 124 in a general reflection direction (represented schematically in the drawings by the thick arrow labeled “RD”), and the light that is deflected by the light redirecting arrangement 124 propagates towards the optical sensor 140 in a general deflection direction (represented schematically in the drawings by the thick arrow labeled “DD”).
The illustrated embodiment is particularly suitable for augmented reality (AR) and virtual reality (VR) applications, in which a projected image is coupled into the substrate 102, guided through the substrate 102 by internal reflection, and coupled out of the substrate for viewing by the eye 152. It is noted, however, that embodiments of the present invention may also be suitable in non-AR/VR applications, in which the substrate is not configured to guide light by internal reflection. Such embodiments will be discussed in subsequent sections of the present disclosure.
In the non-limiting embodiment illustrated in
In the illustrated embodiment, the LOE 102 is part of the apparatus for deriving the gaze direction, and the apparatus is further configured for displaying an image to the eye 152 (when located within EMB 154). Here, the optical system 100 further includes an image projection arrangement (referred to hereinafter as an “image projector”) 108 that generates a projected image 110, as represented here schematically by a beam of illumination 110 (in the visible light region of the electromagnetic spectrum) including sample rays 110A and 110B which span the beam. Although not shown in the drawings, the image projector 108 includes a microdisplay, such as a liquid-crystal on silicon (LCoS) or organic light-emitting diode (OLED) display, for generating image light, and corresponding collimating optics for collimating the image to infinity. When the microdisplay is implemented as a reflective or transmissive display, illumination components (such as one or more LEDs) and illumination optics (such as beamsplitters) are also included in the image projector 108 so as to direct light from the illumination components to the microdisplay, and to direct the image light to the collimating optics.
The image light 110 is coupled into the LOE 102 by an optical coupling-in configuration 112, shown here schematically as a wedge deployed at or near a slant edge 118 of the LOE 102, so as to generate reflected rays 114 (down-going rays) which are trapped by internal reflection within the substrate 102, as well as rays 116 (up-going rays). It is noted, however, that other optical coupling-in configurations are possible, including, for example, a reflecting surface deployed obliquely to the surfaces 104, 106.
The coupled-in image 110 propagates along the substrate 102 by repeated internal reflection from the surfaces 104, 106 (i.e., is guided by the substrate 102), impinging on an optical coupling-out configuration 120, as illustrated here schematically by a sequence of mutually parallel partially reflective surfaces 120 deployed within the substrate 102 at an oblique angle to the parallel faces 104, 106, where part of the image intensity is reflected by the optical coupling-out configuration 120 so as to be coupled out of the substrate 102 toward the eye 152. The coupled-out image light is represented schematically as rays 122A and 122B. The set of partially reflective surfaces 120 is merely illustrative of one non-limiting implementation of an optical coupling-out configuration, and other optical coupling configurations can be used to couple image light out of the LOE 102. The optical coupling-out configuration may be any optical coupling arrangement which deflects part of the image light already propagating within the LOE 102 by internal reflection to an angle at which the deflected part of the image light exits the LOE 102. Other examples of suitable optical coupling arrangements include, but are not limited to, one or more diffractive optical elements deployed on either of the faces 104, 106, and a beamsplitter arrangement.
An illumination arrangement 126 is deployed for illuminating the EMB 154 with light (referred to as “eye-tracking light”) 130, such that when the eye 152 is positioned within the EMB 154 the eye 152 is illuminated with the eye-tracking light 130. In certain preferred but non-limiting implementations, the illumination arrangement 126 is configured to illuminate the EMB 154 with light having wavelengths outside of the photopic region of the electromagnetic spectrum. In other words, the illumination arrangement 126 is preferably configured to illuminate the eye 152 with light that is not visible to the human eye so as not to interfere with human vision. In particularly preferred but non-limiting implementations, the illumination arrangement 126 is configured to illuminate the eye 152 with light having wavelengths in the near infrared (NIR) region of the electromagnetic spectrum, preferably wavelengths in the range of 700-1000 nanometers (nm) and in certain instances in the range of 680-1000 nm. However, other non-limiting implementations are considered in which the illumination arrangement 126 is configured to illuminate the eye 152 with light having wavelengths in the photopic spectrum (i.e., the visible light region of the electromagnetic spectrum) or regions of the electromagnetic spectrum outside of the visible, IR, and NIR regions.
The illumination arrangement 126 includes at least one light source, and preferably includes multiple light sources, each configured for illuminating the EMB 154 with eye-tracking light. The light source (or sources) of the illumination arrangement 126 can be implemented as one or more LEDs, or any other source that is configured to emit (produce) light in the eye-tracking spectrum. In certain non-limiting implementations, the light sources of the illumination arrangement 126 are isotropic (or near-isotropic) sources that emit light in all directions. The illumination, by the illumination arrangement 126, of the EMB 154 with eye-tracking light 130 is represented schematically in
A proportion of the intensity of the eye-tracking light 130 that illuminates the eye 152 is reflected by the eye 152 as reflected light in a general reflection direction toward the light redirecting arrangement 124. A proportion of the intensity of the reflected light, represented schematically as reflected light rays 132A, 132B, 132C, reaches the light redirecting arrangement 124 that is associated with the substrate 102, and is deflected by the light redirecting arrangement 124 as deflected light, represented schematically as deflected light rays 138A, 138B, 138C, so as to redirect the reflected light in a general deflection direction toward the optical sensor 140. The deflection of the eye-tracking light by the light redirecting arrangement 124 occurs at the substrate 102, meaning that the point of deflection of the eye-tracking light is within the substrate 102.
Parenthetically, for the purpose of simplicity of presentation, reference is made herein in the text and accompanying drawings to a sole “reflection direction” and to a sole “deflection direction”, which correspond to light waves emanating from the eye as they propagate toward the optical sensor via deflection by the light redirecting arrangement. These “directions” are used to represent the general propagation direction of the aforementioned light waves, and are not, strictly speaking, vector directions, but are rather indicative of the general path through free-space that the light waves travel from one point to another. Each of the individual reflected light rays (e.g., rays 132A, 132B, 132C) has its own vector propagation direction, which together span the general reflection direction discussed above. Likewise, each of the individual deflected light rays (e.g., rays 138A, 138B, 138C) has its own vector propagation direction, which together span the general deflection direction discussed above.
In the non-limiting embodiment illustrated in
The implementation of the light redirecting arrangement as a set of partially reflective surfaces 124 is merely one illustrative example of a light redirecting arrangement. Other example implementations of a light redirecting arrangement, including implementations based on diffractive technologies and on dielectric and/or dichroic coatings, will be described in detail in subsequent sections of the present disclosure.
The partially reflective surfaces 124 are configured to reflect light having wavelengths in the eye-tracking spectrum (i.e., to reflect the light reflected by the eye 152 in response to illumination by the illumination arrangement 126). In certain non-limiting implementations, the partially reflective surfaces 124 are further configured to transmit light having wavelengths in the photopic region (e.g., to transmit visible light).
In certain non-limiting implementations, the light redirecting arrangement 124 is deployed such that the eye-tracking light that reaches the light redirecting arrangement 124 is deflected by the light redirecting arrangement 124 at a slant angle relative to the incoming angle, meaning that the angle of the incoming light (rays 132A, 132B, 132C) to the light redirecting arrangement 124 measured relative to the surface 104 is different from the angle of the deflected light (rays 138A, 138B, 138C). In certain instances, the angle of the incoming light measured relative to the surface 106 is steeper than the angle of the deflected light (for example, as shown in
In the non-limiting example configuration illustrated in
As should be apparent, the reflected light from the eye 152 may strike different portions of the partially reflective surfaces 124, and not just the positions illustrated in
It is emphasized that in contrast to the image light 110 from the image projector 108 which is guided by the substrate 102, the light that is deflected by the light redirecting arrangement 124 is unguided by the substrate 102. In particular, and as defined above, the light that is deflected by the light redirecting arrangement 124 reaches the optical sensor 140 without being trapped by internal reflection within the substrate 102 (i.e., free-space propagation). When implemented as a set of partially reflective surfaces 124, the partially reflective surfaces 124 are deployed at an oblique angle (αsur124) relative to the major surfaces 104, 106 such that the eye-tracking light (rays 134A, 134B, 134C) that is incident to the partially reflective surfaces 124 is reflected by the partially reflective surfaces 124 so as to not be trapped by internal reflection, but rather so as to propagate under free-space propagation toward the optical sensor 140. In the illustrated embodiment, the reflected light from the eye 152 undergoes at most a single reflection within the substrate 102 (the single reflection being the reflection by the partially reflective surfaces 124) prior to reaching the optical sensor 140.
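The guided/unguided distinction above reduces to a critical-angle test at the major surfaces. A minimal sketch, assuming an illustrative substrate refractive index of 1.51 (the index is not specified in the text):

```python
import math

n = 1.51  # assumed refractive index of the substrate (illustrative only)
critical_deg = math.degrees(math.asin(1 / n))  # ~41.5 degrees

def is_guided(internal_angle_deg):
    """A ray striking the major surfaces beyond the critical angle is
    trapped by total internal reflection (guided, like the image light 110);
    below it, the ray escapes the substrate (unguided, like the light
    deflected by the partially reflective surfaces 124)."""
    return internal_angle_deg > critical_deg

print(is_guided(55.0), is_guided(30.0))  # True False
```

The deployment angle αsur124 thus has to turn the incoming rays to below this critical angle so that they exit toward the optical sensor 140.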
Focusing optics 142, represented schematically as a lens (but which may include a set of lenses), is deployed in an optical path between the light redirecting arrangement and the optical sensor 140. The focusing optics 142 receives the light 138A, 138B, 138C that is deflected by the light redirecting arrangement 124 and converts the received light 138A, 138B, 138C into converging beams of light (represented schematically as light rays 144A, 144B, 144C) that impinge on the optical sensor 140. In certain preferred implementations, the focusing optics 142 is an imaging optic which forms an image of the eye 152 on the optical sensor 140. The focusing optics 142 is preferably deployed to define a field of view corresponding to the region or portion associated with the substrate 102 from which the deflected eye-tracking light will arrive so as to enable the capture of the deflected eye-tracking light by the focusing optics 142. The region or portion associated with substrate 102 is generally the region or portion at which the light redirecting arrangement is deployed. The focusing optics 142 is preferably integrated with the optical sensor 140 as part of a camera system, deployed between the eye 152 and the surface 104, for imaging the eye 152 by capturing the light 138A, 138B, 138C deflected by the light redirecting arrangement 124.
The processing subsystem 146 is electrically associated with the optical sensor 140, and is configured to process signals from the optical sensor 140 to derive a current gaze direction of the eye 152. The processing subsystem 146 is preferably also electrically associated with the illumination arrangement 126 so as to control illumination timing of the EMB by the illumination arrangement 126. The processing subsystem 146 may be implemented using any suitable type of processing hardware and/or software, as is known in the art, including but not limited to any combination of various dedicated computerized processors operating under any suitable operating system and implementing suitable software or firmware modules. The processing subsystem 146 may further include various communications components for allowing wired or wireless communication with LAN and/or WAN devices for bidirectional transfer of information and graphic content. A simplified block diagram of the processing subsystem 146 according to a non-limiting example implementation is illustrated in
The following paragraphs describe various deployment options of the light redirecting arrangement 124 according to non-limiting embodiments of the present invention. In the non-limiting implementation in which the light redirecting arrangement 124 is implemented as a set of partially reflective surfaces 124, deployment angles (αsur124) of the partially reflective surfaces 124 in the range of 10-35 degrees have been found to be particularly suitable for deflecting light from the EMB 154 (eye 152) to the optical sensor 140 in an unguided fashion. In particular, such deployment angles are suitable for deflecting incoming light (rays 132A, 132B, 132C) that is incident to the major surface 104 at angles of incidence (AOI) in the range of 18-40 degrees to deflection angles in the range of 60-90 degrees (where the deflection angles are measured between the rays 138A, 138B, 138C and the normal to the major surface 104). The angular range of 18-40 degrees corresponds to the angular distribution of the eye-tracking light that is reflected from the eye toward the light redirecting arrangement 124.
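The quoted ranges are consistent with a simple planar-mirror model in which a surface tilted at αsur124 adds twice its tilt to the ray angle. A toy check, ignoring refraction at the major surfaces (a simplifying assumption for illustration):

```python
def deflection_angle(aoi_deg, tilt_deg):
    """Simplified 2D mirror model: a surface tilted tilt_deg from the
    major surfaces turns a ray at aoi_deg (measured from the surface
    normal) to aoi_deg + 2 * tilt_deg. Refraction at the surfaces
    104/106 is ignored in this sketch."""
    return aoi_deg + 2 * tilt_deg

# Extremes of the ranges quoted above (AOI 18-40 deg, tilt 10-35 deg):
print(deflection_angle(40, 10))  # 60 -> low end of the 60-90 deg range
print(deflection_angle(20, 35))  # 90 -> high end of the 60-90 deg range
```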
As previously mentioned, each of the partially reflective surfaces preferably (but not necessarily) has the same deployment angle (αsur124). In addition, the deployment angle (αsur124) of the partially reflective surfaces 124 may be the same or different from the deployment angle (αsur120) of the partially reflective surfaces 120. Generally speaking, a tradeoff between manufacturing complexity of the optical system 100 (and in particular the substrate 102 with sets of embedded partially reflective surfaces 120 and 124) and the position of the optical sensor 140 (i.e., camera) may be considered when deciding on the deployment angle of the partially reflective surfaces 124. For example, deploying the partially reflective surfaces 120, 124 at a common oblique angle to the major surfaces 104, 106 (i.e., such that the surfaces 120, 124 are mutually parallel) may provide certain advantages during manufacturing of the substrate 102 by reducing the number of manufacturing steps. Typically, the substrate 102 with embedded partially reflective surfaces is manufactured by deploying transparent plates coated with partially reflective coatings in a stack and then slicing the stack at an oblique angle (as described, for example, in U.S. Pat. No. 8,432,614) to form a slab (substrate) with embedded obliquely angled partial reflectors. The manufacturing process of the substrate 102 of the present invention is simplified when the partially reflective surfaces 120, 124 are to be deployed at a common oblique angle, since the stacking and slicing of the coated plates can be performed similar to as described in U.S. Pat. No. 8,432,614.
If, however, the partially reflective surfaces 120, 124 are to be deployed at different oblique angles (i.e., such that the partially reflective surfaces 124 and 120 are non-parallel to each other), the complexity of the manufacturing process may increase by requiring additional steps. Here, the manufacturing process could generally include steps such as: producing a first portion of the substrate having the mutually parallel partially reflective surfaces 120 at a first oblique angle, producing a second portion of the substrate having the mutually parallel partially reflective surfaces 124 at a second oblique angle, and attaching (e.g., via optical cement) the two substrate portions together to form a unitary substrate having the two sets of the partially reflective surfaces at two different deployment angles. The two different deployment angles, while having manufacturing drawbacks, could provide certain advantages in the arrangement of the components of the optical system 100. In particular, different deployment angles provide flexibility in the placement (i.e., spatial positioning) of the optical sensor 140 relative to the substrate 102. For example, the ability to deploy the partially reflective surfaces 124 at angles that are steeper or shallower than the deployment angle of the partially reflective surfaces 120 may allow the optical sensor 140 to be positioned further outside of the viewer's peripheral field of vision than would otherwise be achievable when using a common deployment angle for the partially reflective surfaces 120, 124.
In addition to having the same or different deployment angles, the partially reflective surfaces 124 and the partially reflective surfaces 120 may be deployed at the same or different planar orientations. Preferably, the partially reflective surfaces 124 are mutually parallel planar surfaces that lie in a first set of mutually parallel planes. Likewise, the partially reflective surfaces 120 are mutually parallel planar surfaces that lie in a second set of mutually parallel planes. In one non-limiting deployment configuration, the first and second sets of planes are parallel (i.e., all of the reflective surfaces 124 and 120 are parallel to each other), for example, as illustrated in
The partially reflective surfaces 124 preferably have dimensions, and are preferably deployed at an angle and orientation, such that both horizontal and vertical eye movement across a desired vertical and horizontal field is captured. It is noted that the range of human gaze directions is wider across the horizontal field than across the vertical field. Therefore, in certain preferred embodiments the partially reflective surfaces 124 are dimensioned and deployed at an orientation (plane and deployment angle) so as to deflect a wider angular range of light coming from the horizontal field than from the vertical field.
In the non-limiting embodiment illustrated in
In the non-limiting deployment configurations described above with reference to
Referring now to
Referring first to
In this non-limiting example configuration, the diffractive grating 224 is deployed on the major surface 104 at or near the distal end of the substrate 102 and is in non-overlapping relation with the optical coupling-out configuration 120 (similar to as described with reference to
Similar to the light redirecting arrangement 124 (implemented as at least one partially reflective surface), the light redirecting arrangement 224 in certain deployment configurations is such that the eye-tracking light (rays 132A, 132B, 132C) that reaches the light redirecting arrangement 224 is deflected by the light redirecting arrangement 224 at a slant angle, meaning that the angle of the incoming light (rays 132A, 132B, 132C) to the light redirecting arrangement 224 measured relative to the surface 106 is different from the angle of the deflected light (rays 138A, 138B, 138C). In certain instances, the angle of the incoming light measured relative to the surface 106 is steeper than the angle of the light deflected by the light redirecting arrangement 224.
It is noted that unlike the light redirecting arrangement 124, the eye-tracking light is not refracted by the major surface 104 when the diffractive optical element is deployed on the major surface 104. In the non-limiting implementation illustrated in
In the non-limiting example configuration illustrated in
Although a single continuous diffractive grating deployed along portions of the major surfaces 104 and 106 is illustrated in
Similar to the light redirecting arrangement 124, the light redirecting arrangement 224 is preferably configured to deflect incoming light (rays 132A, 132B, 132C) that is incident to the major surface 104 at AOIs in the range of 18-40 degrees to deflection angles in the range of 60-90 degrees (where the deflection angles are measured between the rays 138A, 138B, 138C and the normal to the major surface 104).
Referring now to
It is noted that in the present embodiment, similar to as in the embodiments described with reference to
The light redirecting arrangement 324 is preferably configured to deflect incoming light (rays 132A, 132B, 132C) that is incident to the major surface 104 at AOIs in the range of 18-40 degrees to deflection angles in the same angular range as the incident light (where the deflection angles are measured between the rays 138A, 138B, 138C and the normal to the major surface 104). The incident light angular range of 18-40 degrees corresponds to the angular distribution of the eye-tracking light that is reflected from the eye toward the light redirecting arrangement 324.
In certain embodiments, a chromatic selective reflective coating may also be applied to the region of the substrate 102 at which the light redirecting arrangement 324 is formed in order to support the above-mentioned angular distribution of incoming light and the angular range of deflected light.
In the non-limiting implementation illustrated in
The deflection of the eye-tracking light at the major surface 106 enables the optical sensor 140 (i.e., camera system) to be deployed closer to the major surface 104, thereby increasing the distance between the optical sensor 140 (i.e., camera system) and the eye 152 (and the EMB), providing a more comfortable viewing experience for the viewer.
In the non-limiting configuration illustrated in
Although
It is noted that the section of the major surface 104, 106 on which the light redirecting arrangement 324 is deployed may be a single continuous portion of the major surface 104, 106, or may be one or more discontinuous portions (i.e., separate segments) of the major surface 104, 106.
The non-limiting implementations of the light redirecting arrangement 324 described thus far have pertained to deployment of a reflective surface on a portion or portions of one of the surfaces 104, 106 at or near the proximal end of the substrate 102. The light redirecting arrangements 324 of these implementations are in non-overlapping relation with the optical coupling-out configuration 120 (similar to as described with reference to
The light redirecting arrangements 124, 224, 324 according to the various embodiments of the present invention may be deployed at or close to portions of the major surface 104 such that the eye-tracking light that is transmitted from the eye 152 to the light redirecting arrangements 124, 224, 324 is first incident on the portions of the major surface 104 at oblique angles (i.e., a relatively high AOI measured relative to the normal to the surface 104), and, equivalently, at shallow angles relative to the plane of the surface 104 (in embodiments in which the surface 104 is planar). This is most clearly illustrated in
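Where the light does cross the major surface 104 before reaching the redirecting arrangement, the in-substrate angle follows Snell's law; a short sketch, with the substrate index of 1.51 an illustrative assumption:

```python
import math

def internal_angle_deg(aoi_deg, n=1.51):
    """Snell's law at the major surface 104: external angle of incidence
    (from the surface normal) to the angle of the refracted ray inside
    the substrate (index n is an illustrative assumption)."""
    return math.degrees(math.asin(math.sin(math.radians(aoi_deg)) / n))

print(round(internal_angle_deg(70.0), 1))  # a steep 70 deg AOI becomes ~38.5 deg inside
```

This illustrates why even quite shallow external rays arrive at the redirecting arrangement at moderate internal angles.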
Such deployment configurations of the light redirecting arrangements 124, 224, 324 allow placement of the optical sensor 140 (and focusing optics 142) at or close to the image projector 108, and in certain preferred embodiments attached to a mechanical body or housing that houses the components of the image projector 108.
The following paragraphs describe several of the deployment options for the illumination arrangement 126. Generally speaking, various configurations of the illumination arrangement 126 are contemplated herein. In all of the illumination arrangement configurations, the illumination arrangement 126 includes at least one light source configured to illuminate the EMB 154 (and hence the eye 152 when positioned in the EMB 154) with eye-tracking light, which in preferred but non-limiting implementations is in the NIR region of the electromagnetic spectrum.
In one non-limiting deployment configuration, the eye-tracking illumination may be incorporated as part of the image projected by the image projector 108. The illumination may be during image projection or in a separate time slot. The timing control of the eye-tracking illumination and the image projection is preferably controlled by the processing subsystem 146. Where the eye-tracking illumination is in the IR region, there are various options for providing the IR illumination to the eye. Where a wavelength of NIR close to visible wavelengths is used, IR illumination may be combined as a fourth “color” in the conventional visible image projector, for example, using an LCoS modulator. If patterned illumination is desired for longer wavelengths of IR, a digital light processing (DLP) device is typically preferred. For non-patterned illumination, a dedicated illumination source is typically provided independent of the image projector. The eye-tracking illumination, generated by the image projector, is coupled into the substrate 102 in the same manner as the visible image illumination 110, and propagates by internal reflection. The optical coupling-out configuration 120 is preferably configured to transmit light in the eye-tracking spectrum, while the light redirecting arrangement 124, 224, 324 is preferably configured to deflect light in the eye-tracking spectrum and transmit in the photopic spectrum. The desired reflection and transmission characteristics of the optical coupling-out configuration 120 may be achieved by applying appropriate coatings to the partially reflective surfaces 120. Similarly, the desired reflection and transmission characteristics of the light redirecting arrangement 124 may be achieved by applying appropriate coatings to the partially reflective surfaces 124.
In a more preferable deployment configuration, the illumination arrangement 126 is separate from the image projector, and includes one or more light sources deployed in proximity to the optical sensor 140 and/or about the periphery of a mechanical body of the optical system 100 to which the substrate 102 is attached.
In the present non-limiting implementation, the illumination arrangement 126 includes three separate light sources 128A, 128B, 128C (implemented, for example, as three LEDs). Two of the sources 128A, 128B are deployed at or near the optical coupling-out configuration 120, on a peripheral portion of the eye-glasses frame 156. The third light source 128C is deployed near the side of the observer's head in proximity to the optical sensor 140 (in
In general, the illumination arrangement 126 may be configured to illuminate specific regions of the eye 152 or the entire eye 152 with eye-tracking light. As discussed above in detail, the eye-tracking light emitted by the illumination arrangement (rays 130A, 130B, 130C) is reflected by the eye 152 as reflected light 132A, 132B, 132C, and is deflected by the light redirecting arrangement 124, 224, 324 as deflected light (rays 138A, 138B, 138C) toward the optical sensor 140, where it is preferably focused (by the optics 142) onto the optical sensor 140. In certain non-limiting implementations, the deflected light (e.g., rays 138A, 138B, 138C) is focused by the focusing optics 142 onto different regions or portions of the optical sensor 140. The optical sensor 140 generates signals in response to sensing the focused light (e.g., corresponding to the image of the eye), and those signals are transferred to the processing subsystem 146 which is configured to process the signals to derive a current gaze direction of the eye 152. In certain non-limiting implementations, the optical system 100 obtains the gaze direction (the angular orientation of the eye, or line of sight of the eye) by imaging patterns that exist on specific regions of the eye 152. The position of such patterns and their motion are indicative of the current gaze direction and motion of the eye. The human eye includes various trackable features, including, for example, patterns generated by the nerves of the cornea (i.e., corneal nerve patterns) based on corneal reflections, the center of the eye pupil, and patterns generated by the blood vessels of the optic disc. These trackable features can be tracked using appropriate tracking algorithms implemented by suitable image processing instructions performed by the processing subsystem 146. In certain non-limiting implementations, the processing subsystem 146 computes the gaze direction based on the vector between the pupil center and the corneal reflections.
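The pupil-center/corneal-reflection approach mentioned above can be illustrated with a toy helper; the function names, the pixel coordinates, and the affine calibration mapping are hypothetical illustrations, not taken from the present disclosure:

```python
import numpy as np

def pupil_glint_vector(pupil_center_px, glint_px):
    """Vector from the corneal reflection (glint) to the pupil center,
    in image pixels; its change tracks the rotation of the eye."""
    return np.asarray(pupil_center_px, float) - np.asarray(glint_px, float)

def gaze_estimate(vec_px, gain_deg_per_px, offset_deg):
    """Map the pupil-glint vector to gaze angles via a per-user affine
    calibration (gain and offset assumed found during a setup step)."""
    return gain_deg_per_px * vec_px + offset_deg

v = pupil_glint_vector((312, 244), (309, 240))
g = gaze_estimate(v, 0.5, 0.0)  # hypothetical gain of 0.5 deg/px
print(v, g)  # [3. 4.] [1.5 2. ]
```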
In general, all background illumination causes noise that degrades the quality of the eye image. In order to reduce the effects of external illumination sources (e.g., ambient light, natural sunlight, etc.), the illumination arrangement 126 may be configured to generate a short pulse of light (preferably below 1 ms), and the optical sensor 140 is synchronized (by the processing subsystem 146) to integrate light only during this short illumination duration. In this manner, continuous background illumination can be greatly suppressed. Additionally, or alternatively, a passband spectral filter may be deployed in the optical path from the light redirecting arrangement to the optical sensor 140 to block light at wavelengths outside the range within which the eye-tracking illumination is generated. The spectral filter is ideally positioned between the focusing optics 142 and the optical sensor 140, but may alternatively be deployed before the focusing optics 142.
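The benefit of gating the sensor to a short pulse can be quantified with simple arithmetic; the 60 Hz frame rate assumed below is illustrative, not from the text:

```python
frame_period_ms = 1000 / 60   # assumed 60 Hz sensor frame rate (illustrative)
pulse_ms = 1.0                # ~1 ms illumination pulse per the text

# Fraction of continuous background light admitted when integration is
# limited to the pulse, versus integrating over the whole frame:
background_fraction = pulse_ms / frame_period_ms
print(f"{background_fraction:.1%}")  # 6.0%
```

That is, synchronizing a 1 ms integration window to the pulse admits only about 6% of the continuous background light per frame, a roughly 17x suppression.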
In a non-limiting process for deriving and tracking the gaze direction, the cornea pattern (optionally in combination with the optic disc pattern and/or pupil) is mapped and trackable features are determined during an initial setup process, and then a continuous tracking process is performed. For example, an image marker may be displayed to the observer for the observer to look at during an initialization. While the observer looks towards the marker, the illumination arrangement 126 fully illuminates the cornea and a full image of the cornea (and pupil) is obtained (via the optical sensor 140). This image is then processed by the processing subsystem 146 to identify trackable features (for example, the optic disc and the fovea). During the continuous tracking process, selected regions of interest (ROI) of the eye 152 are selectively illuminated by the illumination arrangement 126, and an image of the ROI (obtained by the optical sensor 140) is sampled and processed (by the processing subsystem 146) during the corresponding illumination pulse to determine the current gaze direction (line of sight), and this derived gaze direction is used to update the position of the ROI for the subsequent illumination cycle, and the continuous tracking process repeats by illuminating the updated ROI. Assuming that the frequency of the tracking measurements is high compared to the speed of motion of the eye, this update process is typically effective to maintain continuous tracking, optionally combined with tracking information from the other eye. As the gaze direction changes, so does the illumination area. Updating of the ROI may be performed according to the “current” gaze direction as determined from the last sampled image or, in certain cases, may use predictive extrapolation based on eye motion between the previous two or more measurements. In the event that tracking fails, the size of the illuminated region can be temporarily increased until the trackable features are recovered.
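The ROI-update and recovery steps above can be sketched in a few lines; the per-cycle step limit, the growth factor, and the size cap are illustrative assumptions, not values from the disclosure:

```python
def update_roi(roi_center, gaze_px, max_step=40):
    """Recenter the illumination/readout ROI on the latest gaze estimate,
    clamping the per-cycle move (assumes eye motion is slow relative to
    the tracking rate, as stated in the text)."""
    dx = max(-max_step, min(max_step, gaze_px[0] - roi_center[0]))
    dy = max(-max_step, min(max_step, gaze_px[1] - roi_center[1]))
    return (roi_center[0] + dx, roi_center[1] + dy)

def recover_roi_size(size_px, grow=1.5, cap=512):
    """On tracking failure, temporarily enlarge the illuminated region
    until the trackable features are reacquired."""
    return min(cap, int(size_px * grow))

print(update_roi((100, 100), (130, 90)))  # (130, 90)
print(recover_roi_size(200))              # 300
```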
The light sources of the illumination arrangement 126 may be configured to emit eye-tracking light at approximately the same or at different center wavelengths within the eye-tracking spectrum. Typically, in the NIR region the dispersion of the glass materials from which the substrate 102 can be constructed is sufficiently low so as to avoid distortion within the spectral width of a single eye-tracking light source (the spectral width typically being in the range of 20-50 nm for LEDs). However, employing light sources that emit eye-tracking light at two spectrally separated center wavelengths (while still being within the same region of the eye-tracking electromagnetic spectrum) may provide certain advantages when imaging the eye. For example, deploying the illumination arrangement 126 with a first and a second light source that emit light centered around approximately 700 nm and 950 nm, respectively, can result in two different images of the eye, shifted one relative to the other, formed on the optical sensor 140. By applying appropriate image processing algorithms, such as correlation algorithms, the processing subsystem 146 may achieve higher resolution in the gaze direction calculations.
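The relative shift between the two spectrally separated images can be recovered by cross-correlation. A 1-D toy stand-in for the correlation algorithms mentioned above (the profiles and the shift are fabricated for illustration):

```python
import numpy as np

def profile_shift(a, b):
    """Estimate the integer shift of profile a relative to profile b
    (e.g., matching rows of the ~700 nm and ~950 nm eye images) by
    cross-correlation of the mean-subtracted profiles."""
    a = a - a.mean()
    b = b - b.mean()
    corr = np.correlate(a, b, mode="full")
    return int(corr.argmax()) - (len(b) - 1)

base = np.zeros(64)
base[20:25] = 1.0                          # a fabricated eye-feature profile
print(profile_shift(np.roll(base, 3), base))  # 3
```

In practice the two images would be correlated in 2-D with sub-pixel interpolation, which is where the resolution gain comes from.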
It is noted that the processing subsystem 146 and the camera (optical sensor with focusing optics) are preferably deployed in close proximity to each other, for example as shown in
The general deflection direction associated with the light redirection arrangements can easily be changed from right-going to left-going (and vice versa) by changing the deployment orientation and/or optical parameters of the light redirecting arrangement. For example, looking at the general right-going deflection direction of the partially reflective surfaces 124 in
Returning now to
Eye-tracking illumination is reflected from the eye 152, represented schematically by light rays 132A, 132B, 132C, 132D, 132E, 132F. Some of the eye-tracking light reflected by the eye, represented schematically by a first set of light rays 132A, 132B, 132C, propagates in a first general reflection direction (represented schematically by the thick arrow labeled “RD1”) and is deflected by the light redirecting arrangement 124 as deflected light (represented schematically by light rays 138A, 138B, 138C) in a first general deflection direction (represented schematically by the thick arrow labeled “DD1”). Some of the eye-tracking light reflected by the eye, represented schematically by a second set of light rays 132D, 132E, 132F, propagates in a second general reflection direction (represented schematically by the thick arrow labeled “RD2”) and is deflected by the light redirecting arrangement 324 as deflected light (represented schematically by light rays 138D, 138E, 138F) in a second general deflection direction (represented schematically by the thick arrow labeled “DD2”). The deflected light, in the form of the two sets of deflected light rays, reaches the focusing optics 142. The focusing optics 142 converts the first set of deflected incoming light 138A, 138B, 138C into converging beams of light (represented schematically as light rays 144A, 144B, 144C) that impinge on a first portion or region, designated 141a, of the optical sensor 140 so as to form a first image of the eye 152 within the EMB. The focusing optics 142 also converts the second set of deflected incoming light 138D, 138E, 138F into converging beams of light (represented schematically as light rays 144D, 144E, 144F) that impinge on a second portion or region, designated 141b, of the optical sensor 140 so as to form a second image of the eye 152 within the EMB.
It is noted that the processing subsystem 146 may process the signals generated by the optical sensor 140, when generating two independent images of the eye, to calculate or estimate the eye relief distance using triangulation techniques. It is also noted that, in principle, more than two light redirecting arrangements can be used in combination to generate a set of more than two independent images of the eye.
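The triangulation mentioned above can be sketched with the standard stereo relation: the two independent images of the eye act as views separated by an effective baseline, and the disparity between corresponding pupil positions determines depth. The sketch below is illustrative only; the function name, the pinhole-camera model, and every numeric parameter (pixel pitch, focal length, baseline) are assumptions for illustration, not values from this disclosure.

```python
def estimate_eye_relief(pupil_x1_px, pupil_x2_px, pixel_pitch_mm,
                        focal_length_mm, baseline_mm):
    """Estimate the distance to the eye from the disparity between the
    pupil positions measured in the two sensor-region images.

    Uses the classic stereo-triangulation relation Z = f * B / d,
    where d is the disparity expressed in the same units as f and B.
    """
    disparity_mm = abs(pupil_x1_px - pupil_x2_px) * pixel_pitch_mm
    if disparity_mm == 0:
        raise ValueError("zero disparity: eye effectively at infinity")
    return focal_length_mm * baseline_mm / disparity_mm
```

For example, with an assumed 5 mm effective focal length, a 10 mm baseline, and a 5 micron pixel pitch, a 500-pixel disparity between the two pupil images corresponds to an estimated eye relief of 20 mm.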
Certain embodiments of the present disclosure may be of particular value when deployed as part of a head-up display (HUD) in a vehicle or an aircraft, whereby the display of images projected by the image projector 108 in automotive or aviation environments may be dependent on, or controlled, at least in part, by the eye gaze direction of the user of the HUD. In an automotive environment, a HUD employing the major components of the eye tracking apparatus according to the disclosed embodiments may be installed in front of the driver of the vehicle, for example integrated into the dashboard or front windshield of the vehicle. In aviation environments, the HUD may be installed in front of the pilot of the aircraft, for example installed as part of a pilot helmet in a front region of the helmet.
The embodiments of the optical system have thus far been described within the context of a substrate 102 being a light-guide optical element (LOE) configured to guide image light (injected from an image projector 108) by internal reflection. Such embodiments are of particular value when used in AR and/or VR applications, where the AR/VR image is produced by a compact image projector having a small aperture that is multiplied by the LOE to generate a large aperture. As discussed in the background section, aperture multiplication in one dimension has been developed based on a parallel-faced slab of transparent material within which the image propagates by internal reflection. It is noted that aperture multiplication in two dimensions has also been developed using various optical waveguide configurations. In one example configuration, two-dimensional aperture expansion is effectuated by way of first and second optical waveguides. The first optical waveguide has two pairs of parallel major external surfaces that form a rectangular cross-section. A first set of mutually parallel partially reflective surfaces traverses the first optical waveguide oblique to a direction of elongation of the optical waveguide. The second optical waveguide, optically coupled to the first optical waveguide, has a pair of parallel major external surfaces forming a slab-type waveguide. A second set of mutually parallel partially reflective surfaces traverses the second optical waveguide oblique to the major external surfaces of the second optical waveguide. In addition, the planes containing the first set of partially reflective surfaces are preferably oblique to the planes containing the second set of partially reflective surfaces.
The optical coupling between the two optical waveguides, and the deployment and configuration of the two sets of partially reflective surfaces are such that, when an image is coupled into the first optical waveguide with an initial direction of propagation at a coupling angle oblique to both pairs of parallel major external surfaces of the first optical waveguide, the image advances by four-fold internal reflection along the first optical waveguide (i.e., in two dimensions), with a proportion of intensity of the image reflected at the first set of partially reflective surfaces so as to be coupled out of the first optical waveguide and into the second optical waveguide, and then propagates through two-fold internal reflection within the second optical waveguide (i.e., in one dimension), with a proportion of intensity of the image reflected at the second set of partially reflective surfaces so as to be coupled out of the second optical waveguide as a visible image seen by the eye of an observer. When implementing the light redirecting arrangement as a set of parallel planar partially reflective surfaces 124, the planes of the partially reflective surfaces 124 may be oriented to be parallel, orthogonal or oblique to the planes of the first or second sets of mutually parallel partially reflective surfaces. Further details of two-dimensional aperture multipliers can be found in various patent documents, including, for example, U.S. Pat. No. 10,564,417, which is incorporated by reference in its entirety herein.
In another example configuration, two-dimensional aperture expansion is effectuated by way of first and second slab-type optical waveguides. The first optical waveguide has two pairs of parallel major external surfaces forming a slab-type waveguide. A first set of mutually parallel internal partially reflective surfaces traverses the first optical waveguide at an oblique angle to the two pairs of parallel major external surfaces. The second optical waveguide also has two pairs of parallel major external surfaces. A second set of mutually parallel internal partially reflective surfaces traverses the second optical waveguide at an oblique angle to the two pairs of parallel major external surfaces of the second optical waveguide. In addition, the planes containing the first set of partially reflective surfaces are oblique or perpendicular to the planes containing the second set of partially reflective surfaces. The optical coupling between the two optical waveguides, and the deployment and configuration of the two sets of partially reflective surfaces, are such that, when an image is coupled into the first optical waveguide, the image propagates through two-fold internal reflection within the first optical waveguide between the external surfaces of one of the pairs of external surfaces in a first guided direction, with a proportion of intensity of the image reflected at the first set of partially reflective surfaces so as to be coupled out of the first optical waveguide and into the second optical waveguide, and then propagates through two-fold internal reflection within the second optical waveguide between the external surfaces of one of the pairs of external surfaces of the second optical waveguide in a second guided direction (oblique to the first guided direction), with a proportion of intensity of the image reflected at the second set of partially reflective surfaces so as to be coupled out of the second optical waveguide as a visible image seen by the eye of an observer. When implementing the light redirecting arrangement as a set of parallel planar partially reflective surfaces 124, the planes of the partially reflective surfaces 124 may be oriented to be parallel, orthogonal or oblique to the planes of the first or second sets of mutually parallel partially reflective surfaces. Further details of two-dimensional aperture multipliers can be found in various patent documents, including, for example, U.S. Pat. No. 10,551,544, which is incorporated by reference in its entirety herein.
It is further noted that the apparatus for deriving eye gaze direction may also be of value when used in non-AR or non-VR applications. In one set of additional applications, the apparatus for deriving gaze direction can be used to advantage in combination with assemblies having any type of substrate formed from a material deployed in front of the eye of a viewer that can at least partially transmit light. The “substrate” in such assemblies may include, but is not limited to, sunglasses, eye-glasses, eye-glasses with optical diopter, eye-glasses with a diffractive grating or gratings, and eye protection gear having a shield or dome type structure formed from plastic or glass deployed to protect the eye from debris or particulates. Some or all of the major external surfaces of the aforementioned substrates may be curved surfaces, unlike the major external surfaces of the LOE, which are flat planar surfaces.
In another set of additional applications, the apparatus for deriving gaze direction can be used to advantage in combination with computer or mobile device related applications in which the gaze direction of a user's eye may be used to navigate a display screen, webpage, menu or the like, or may be used to interact with a computerized game played on a computer device (e.g., video game system, mobile device, laptop computer, tablet, etc.). In such applications, the “substrate” may include the display screen of the computer device, and the light redirecting arrangement can be appropriately associated with portions of the display screen so as to deflect eye-tracking light that is reflected from the eye toward the display screen.
As should be understood, the embodiment illustrated in
Although the embodiments of the present disclosure as described thus far have pertained to illuminating the eye 152 within the EMB with eye-tracking light that is preferably in the non-visible region of the electromagnetic spectrum (preferably the NIR region), other embodiments are possible in which the eye is illuminated with light having wavelengths outside of the NIR region, including, for example, visible light and ultraviolet (UV) light. In embodiments in which the illumination arrangement 126 illuminates the eye with visible light, it may be advantageous to deploy the illumination source(s) to concentrate illumination on regions of the eye that are less sensitive to visible light, such as the sclera, so as to refrain from bombarding the eye with non-image visible light. In embodiments in which the illumination arrangement 126 illuminates the eye with UV light, precautions should be taken to lessen or minimize the exposure of the eye to harmful UV radiation, for example by placing limits on the intensity/power of the UV beam received on a region of the eye of a given area over a given duration (for example, less than 1 milliwatt per square centimeter for periods greater than 1000 seconds, for UV light having a wavelength in the range of 315-400 nm).
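The exposure limit quoted above can be expressed as a simple budget check. The sketch below is illustrative only: the function name and the short-exposure energy budget (capping total radiant exposure at the equivalent 1 J/cm² for exposures under 1000 seconds) are assumptions for illustration, using the example figures from this paragraph rather than a lookup of any safety standard.

```python
def uv_exposure_within_limit(irradiance_mw_per_cm2, duration_s,
                             limit_mw_per_cm2=1.0, long_exposure_s=1000.0):
    """Check a UV-A (315-400 nm) illumination budget against the example
    limit above: for exposures longer than ~1000 s, keep irradiance below
    1 mW/cm^2; for shorter exposures, cap the total radiant exposure at
    the same energy budget (limit * long_exposure duration)."""
    if duration_s > long_exposure_s:
        return irradiance_mw_per_cm2 < limit_mw_per_cm2
    # Short exposures: total energy (mJ/cm^2) must stay within the budget.
    return irradiance_mw_per_cm2 * duration_s <= limit_mw_per_cm2 * long_exposure_s
```

For instance, a 0.5 mW/cm² beam held for 2000 seconds passes the check, while a 2 mW/cm² beam held for the same duration does not.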
It is also noted that the light sources of the illumination arrangement 126 may be configured to emit light in two or more different regions of the electromagnetic spectrum. For example, a first set (i.e., one or more) of light sources may emit light in the NIR region, while a second set of light sources may emit light in the visible region (preferably focused on the sclera).
According to certain non-limiting implementations, the various eye-tracking apparatus of the present disclosure may be duplicated for tracking both eyes of a subject simultaneously. In addition, when the eye-tracking apparatus is integrated as part of an optical system that also projects images to the eye, the optical system may be duplicated for projecting images to both eyes. By combining data from two eye trackers, it may be possible to achieve enhanced stability and continuity of tracking. For example, while the eyes are moving, the trackable portions of the eyes may be visible to the tracker in one eye and not the other. If a tracking algorithm is used which employs tracking of trackable features, simultaneous tracking for both eyes allows the tracking to be maintained continuously through periods in which only one eye-tracker can track the blind spot.
Where an optical system is binocular, each eye has its own image projection and eye tracking device, and various processing and power-supply components may optionally be shared between the two eye-tracking systems. The eye-tracking information gleaned by the binocular eye-tracking devices can be fused in order to provide enhanced stability and continuity of tracking, as mentioned above.
The descriptions of the various embodiments of the present disclosure have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
As used herein, the singular form, “a”, “an” and “the” include plural references unless the context clearly dictates otherwise.
The word “exemplary” is used herein to mean “serving as an example, instance or illustration”. Any embodiment described as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
To the extent that the appended claims have been drafted without multiple dependencies, this has been done only to accommodate formal requirements in jurisdictions which do not allow such multiple dependencies. It should be noted that all possible combinations of features which would be implied by rendering the claims multiply dependent are explicitly envisaged and should be considered part of the invention.
Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.
This application claims priority from U.S. Provisional Patent Application No. 62/953,557, filed Dec. 25, 2019, U.S. Provisional Patent Application No. 62/958,755, filed Jan. 9, 2020, and U.S. Provisional Patent Application No. 63/023,891, filed May 13, 2020, whose disclosures are incorporated by reference in their entireties herein.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/IL2020/051236 | 11/30/2020 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2021/130739 | 7/1/2021 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
3969023 | Brandt et al. | Jul 1976 | A |
4720189 | Heynen et al. | Jan 1988 | A |
6215460 | Mizoguchi et al. | Apr 2001 | B1 |
6239771 | Usuki et al. | May 2001 | B1 |
6268883 | Zehnder et al. | Jul 2001 | B1 |
6580529 | Amitai et al. | Jun 2003 | B1 |
6879443 | Spitzer et al. | Apr 2005 | B2 |
7612879 | Stumpe et al. | Nov 2009 | B2 |
7724409 | Lin et al. | May 2010 | B2 |
7872217 | Kasahara | Jan 2011 | B2 |
8016428 | Kasazumi et al. | Sep 2011 | B2 |
8405573 | Lapidot et al. | Mar 2013 | B2 |
9311883 | Cajigas et al. | Apr 2016 | B2 |
9509939 | Henion et al. | Nov 2016 | B2 |
9569897 | Miyawaki et al. | Feb 2017 | B2 |
9638920 | Bohn | May 2017 | B2 |
9804396 | Amitai | Oct 2017 | B2 |
11425829 | Zheng et al. | Aug 2022 | B2 |
20030165017 | Amitai | Sep 2003 | A1 |
20040032660 | Amitai | Feb 2004 | A1 |
20040033528 | Amitai | Feb 2004 | A1 |
20040080718 | Kojima | Apr 2004 | A1 |
20040212775 | Hunter et al. | Oct 2004 | A1 |
20050078388 | Amitai | Apr 2005 | A1 |
20050083592 | Amitai | Apr 2005 | A1 |
20050180687 | Amitai | Aug 2005 | A1 |
20050281515 | Togami | Dec 2005 | A1 |
20070091445 | Amitai | Apr 2007 | A1 |
20070097513 | Amitai | May 2007 | A1 |
20070155277 | Amitai | Jul 2007 | A1 |
20070159673 | Freeman et al. | Jul 2007 | A1 |
20070165192 | Prior | Jul 2007 | A1 |
20080025667 | Amitai | Jan 2008 | A1 |
20080106775 | Amitai et al. | May 2008 | A1 |
20080151379 | Amitai | Jun 2008 | A1 |
20080186604 | Amitai | Aug 2008 | A1 |
20080192239 | Otosaka | Aug 2008 | A1 |
20080198471 | Amitai | Aug 2008 | A1 |
20080278812 | Amitai | Nov 2008 | A1 |
20080285140 | Amitai | Nov 2008 | A1 |
20090052046 | Amitai | Feb 2009 | A1 |
20090052047 | Amitai | Feb 2009 | A1 |
20090097127 | Amitai | Apr 2009 | A1 |
20090122414 | Amitai | May 2009 | A1 |
20090153437 | Aharoni | Jun 2009 | A1 |
20100077049 | Appelman et al. | Mar 2010 | A1 |
20100110368 | Chaum | May 2010 | A1 |
20100134534 | Seesselberg et al. | Jun 2010 | A1 |
20100171680 | Lapidot et al. | Jul 2010 | A1 |
20100202048 | Amitai | Aug 2010 | A1 |
20100214635 | Sasaki et al. | Aug 2010 | A1 |
20110050547 | Mukawa | Mar 2011 | A1 |
20110176218 | Noui | Jul 2011 | A1 |
20110304828 | Khechana | Dec 2011 | A1 |
20120062445 | Haddick et al. | Mar 2012 | A1 |
20120179369 | Lapidot et al. | Jul 2012 | A1 |
20130022220 | Dong et al. | Jan 2013 | A1 |
20130128230 | Macnamara | May 2013 | A1 |
20130229717 | Amitai | Sep 2013 | A1 |
20130276960 | Amitai | Oct 2013 | A1 |
20130279017 | Amitai | Oct 2013 | A1 |
20140027232 | Mccollum | Jan 2014 | A1 |
20140014065 | Brown et al. | May 2014 | A1 |
20140118813 | Amitai et al. | May 2014 | A1 |
20140118836 | Amitai et al. | May 2014 | A1 |
20140118837 | Amitai et al. | May 2014 | A1 |
20140126051 | Amitai et al. | May 2014 | A1 |
20140126052 | Amitai et al. | May 2014 | A1 |
20140126056 | Amitai et al. | May 2014 | A1 |
20140126057 | Amitai et al. | May 2014 | A1 |
20140126175 | Amitai et al. | May 2014 | A1 |
20140198017 | Lamb | Jul 2014 | A1 |
20150070864 | Rainer et al. | Mar 2015 | A1 |
20150138451 | Amitai | May 2015 | A1 |
20150198805 | Mansharof et al. | Jul 2015 | A1 |
20150205140 | Mansharof et al. | Jul 2015 | A1 |
20150205141 | Mansharof et al. | Jul 2015 | A1 |
20150233824 | Richards et al. | Aug 2015 | A1 |
20150277127 | Amitai | Oct 2015 | A1 |
20150293360 | Amitai | Oct 2015 | A1 |
20150355481 | Hilkes et al. | Dec 2015 | A1 |
20160012643 | Kezele et al. | Jan 2016 | A1 |
20160018639 | Spitzer et al. | Jan 2016 | A1 |
20160018654 | Haddick et al. | Jan 2016 | A1 |
20160062119 | Fitch et al. | Mar 2016 | A1 |
20160116743 | Amitai | Apr 2016 | A1 |
20160125642 | Zhu et al. | May 2016 | A1 |
20160170212 | Amitai | Jun 2016 | A1 |
20160170213 | Amitai | Jun 2016 | A1 |
20160170214 | Amitai | Jun 2016 | A1 |
20160187656 | Amitai | Jun 2016 | A1 |
20160020965 | Popovich et al. | Jul 2016 | A1 |
20160247319 | Nowatzyk et al. | Aug 2016 | A1 |
20160278695 | Wang et al. | Sep 2016 | A1 |
20160282616 | Matsushita | Sep 2016 | A1 |
20160314564 | Jones | Oct 2016 | A1 |
20160341964 | Amitai | Nov 2016 | A1 |
20160349518 | Amitai et al. | Dec 2016 | A1 |
20170003504 | Vallius | Jan 2017 | A1 |
20170007351 | Yu | Jan 2017 | A1 |
20170045744 | Amitai | Feb 2017 | A1 |
20170052376 | Amitai | Feb 2017 | A1 |
20170052377 | Amitai | Feb 2017 | A1 |
20170122725 | Yeoh | May 2017 | A1 |
20170130224 | Oestergaard et al. | May 2017 | A1 |
20170132757 | Thiebaud et al. | May 2017 | A1 |
20170146802 | Pletenetskyy | May 2017 | A1 |
20170169747 | Richards et al. | Jun 2017 | A1 |
20170205277 | Ohtsuki | Jul 2017 | A1 |
20170248790 | Cheng | Aug 2017 | A1 |
20170307896 | Kovaluk et al. | Oct 2017 | A1 |
20170336636 | Amitai et al. | Nov 2017 | A1 |
20170353714 | Poulad et al. | Dec 2017 | A1 |
20170357095 | Amitai | Dec 2017 | A1 |
20170363799 | Ofir et al. | Dec 2017 | A1 |
20180039082 | Amitai | Feb 2018 | A1 |
20180046859 | Jarvenpaa | Feb 2018 | A1 |
20180067315 | Amitai et al. | Mar 2018 | A1 |
20180130391 | Bohn | May 2018 | A1 |
20180157057 | Gelberg et al. | Jun 2018 | A1 |
20180210202 | Danziger | Jul 2018 | A1 |
20180267317 | Amitai | Sep 2018 | A1 |
20180275384 | Danziger et al. | Sep 2018 | A1 |
20180275409 | Gao | Sep 2018 | A1 |
20180292592 | Danziger | Oct 2018 | A1 |
20180292599 | Ofir et al. | Oct 2018 | A1 |
20180373039 | Amitai | Dec 2018 | A1 |
20190011710 | Amitai | Jan 2019 | A1 |
20190037712 | Kim et al. | Jan 2019 | A1 |
20190037718 | Shapiro et al. | Jan 2019 | A1 |
20190041634 | Popovich et al. | Feb 2019 | A1 |
20190056600 | Danziger et al. | Feb 2019 | A1 |
20190064518 | Danziger | Feb 2019 | A1 |
20190068944 | Zhang et al. | Feb 2019 | A1 |
20190086674 | Sinay et al. | Mar 2019 | A1 |
20190101767 | Geng et al. | Apr 2019 | A1 |
20190155035 | Amitai | May 2019 | A1 |
20190159354 | Zheng | May 2019 | A1 |
20190170327 | Eisenfeld et al. | Jun 2019 | A1 |
20190187482 | Lanman | Jun 2019 | A1 |
20190208187 | Danziger | Jul 2019 | A1 |
20190212487 | Danziger et al. | Jul 2019 | A1 |
20190222830 | Edwin et al. | Jul 2019 | A1 |
20190227215 | Danziger et al. | Jul 2019 | A1 |
20190278086 | Ofir | Sep 2019 | A1 |
20190285900 | Amitai | Sep 2019 | A1 |
20190293856 | Danziger | Sep 2019 | A1 |
20190339530 | Amitai | Nov 2019 | A1 |
20190346609 | Eisenfeld | Nov 2019 | A1 |
20190361240 | Gelberg | Nov 2019 | A1 |
20190361241 | Amitai | Nov 2019 | A1 |
20190377187 | Rubin et al. | Dec 2019 | A1 |
20190391408 | Mansharof | Dec 2019 | A1 |
20200027855 | Lee et al. | Jan 2020 | A1 |
20200033572 | Danziger et al. | Jan 2020 | A1 |
20200041713 | Danziger | Feb 2020 | A1 |
20200089001 | Amitai et al. | Mar 2020 | A1 |
20200110211 | Danziger et al. | Apr 2020 | A1 |
20200120329 | Danziger | Apr 2020 | A1 |
20200133008 | Amitai | Apr 2020 | A1 |
20200150330 | Danziger et al. | May 2020 | A1 |
20200183159 | Danziger | Jun 2020 | A1 |
20200183170 | Amitai et al. | Jun 2020 | A1 |
20200200963 | Eisenfeld et al. | Jun 2020 | A1 |
20200209667 | Sharlin et al. | Jul 2020 | A1 |
20200241308 | Danziger et al. | Jul 2020 | A1 |
20200249481 | Danziger et al. | Aug 2020 | A1 |
20200261279 | Weidman et al. | Aug 2020 | A1 |
20200278557 | Greenstein et al. | Sep 2020 | A1 |
20200285060 | Amitai | Sep 2020 | A1 |
20200292417 | Lobachinsky et al. | Sep 2020 | A1 |
20200292744 | Danziger | Sep 2020 | A1 |
20200292819 | Danziger et al. | Sep 2020 | A1 |
20200310024 | Danziger et al. | Oct 2020 | A1 |
20200326545 | Amitai et al. | Oct 2020 | A1 |
20200371311 | Lobachinsky et al. | Nov 2020 | A1 |
20210003849 | Amitai et al. | Jan 2021 | A1 |
20210018755 | Amitai | Jan 2021 | A1 |
20210033773 | Danziger et al. | Feb 2021 | A1 |
20210033862 | Danziger et al. | Feb 2021 | A1 |
20210033872 | Rubin et al. | Feb 2021 | A1 |
20210055218 | Aldaag et al. | Feb 2021 | A1 |
20210055466 | Eisenfeld | Feb 2021 | A1 |
20210055561 | Danziger et al. | Feb 2021 | A1 |
20210063733 | Ronen | Mar 2021 | A1 |
20210072553 | Danziger et al. | Mar 2021 | A1 |
20210099691 | Danziger | Apr 2021 | A1 |
20210109351 | Danziger et al. | Apr 2021 | A1 |
20210116367 | Gelberg et al. | Apr 2021 | A1 |
20210141141 | Danziger et al. | May 2021 | A1 |
20210157150 | Amitai | May 2021 | A1 |
20210165231 | Gelberg et al. | Jun 2021 | A1 |
20210239898 | Danziger et al. | Aug 2021 | A1 |
20220018760 | Spartz et al. | Jan 2022 | A1 |
20220035758 | Hanna et al. | Feb 2022 | A1 |
20220252886 | Jung et al. | Aug 2022 | A1 |
Number | Date | Country |
---|---|---|
101311772 | Nov 2008 | CN |
104570353 | Apr 2015 | CN |
106054292 | Oct 2016 | CN |
207216158 | Apr 2018 | CN |
207424391 | May 2018 | CN |
H09101479 | Apr 1997 | JP |
2001021448 | Jan 2001 | JP |
2006145644 | Jun 2006 | JP |
2008035146 | Feb 2008 | JP |
2008134471 | Jun 2008 | JP |
2013231652 | Nov 2013 | JP |
2017135605 | Aug 2017 | JP |
2007037089 | Apr 2007 | WO |
2016203212 | Dec 2016 | WO |
WO2022107140 | May 2022 | WO |
Entry |
---|
C.B. Owen et al “Display-relative calibration for optical see-through head-mounted displays” Published in: Third IEEE and ACM International Symposium on Mixed and Augmented Reality—https://ieeexplore.ieee.org/document/1383044. |
Mukawa H “A full-color eyewear display using planar waveguides with reflection volume holograms”, Article in Information Display ⋅ Mar. 2009 DOI: 10.1002/j.2637-496X.2009.tb00066.x. |
Wei Chen et al; “An Image Quality Evaluation Method of Near-eye Display” in 1060 ⋅ SID Symposium Digest of Technical Papers, vol. 47, No. 1, May 1, 2016 , pp. 1060-1063, XP 0554411041, US. |
Number | Date | Country | |
---|---|---|---|
20220357583 A1 | Nov 2022 | US |
Number | Date | Country | |
---|---|---|---|
62953557 | Dec 2019 | US | |
62958755 | Jan 2020 | US | |
63023891 | May 2020 | US |