TECHNICAL FIELD
The present disclosure relates to tunable optical devices, and in particular to ranging and imaging systems usable in visual displays, as well as components, modules, and methods for such ranging and imaging.
BACKGROUND
Visual displays provide information to viewer(s) including still images, video, data, etc. Visual displays have applications in diverse fields including entertainment, education, engineering, science, professional training, and advertising, to name just a few examples. Some visual displays, such as TV sets, display images to several users, and some visual display systems, such as near-eye displays (NEDs), are intended for individual users.
An artificial reality system generally includes an NED (e.g., a headset or a pair of glasses) configured to present content to a user. The near-eye display may display virtual objects or combine images of real objects with virtual objects, as in virtual reality (VR), augmented reality (AR), or mixed reality (MR) applications. For example, in an AR system, a user may view images of virtual objects (e.g., computer-generated images (CGIs)) superimposed with the surrounding environment by seeing through a “combiner” component. The combiner component including its light routing optics may be transparent to external light.
An NED is usually worn on the head of a user. Consequently, a large, bulky, unbalanced, and heavy display device with a heavy battery would be cumbersome and uncomfortable for the user to wear. Head-mounted display devices can benefit from compact and efficient elements and modules, including efficient light sources and illuminators, high-throughput combiner components and ocular lenses, wide-field cameras, eye trackers, depth sensors, and other optical elements.
BRIEF DESCRIPTION OF THE DRAWINGS
Exemplary embodiments will now be described in conjunction with the drawings, in which:
FIG. 1 is a schematic cross-sectional view of an imaging device having a compound field of view;
FIGS. 2A and 2B are schematic cross-sectional views of a lightguide embodiment of the imaging device of FIG. 1 at different grating periods of in-coupling and/or out-coupling gratings of the lightguide;
FIG. 3 is a schematic cross-sectional view of a ranging device having a compound field of view;
FIG. 4 is a schematic cross-sectional view of an eye tracker having a compound field of view;
FIG. 5 is a schematic cross-sectional view of a ranging device having a compound ranging field of view;
FIG. 6 is a schematic cross-sectional view of a lightguide embodiment of the ranging device of FIG. 5;
FIG. 7 is a schematic cross-sectional view of a diffraction grating structure including a stack of switchable gratings of different grating pitch;
FIG. 8 shows side cross-sectional views of a tunable liquid crystal (LC) surface-relief grating of this disclosure;
FIG. 9A is a frontal view of an active Pancharatnam-Berry phase (PBP) liquid crystal (LC) grating usable in imaging and ranging devices of this disclosure;
FIG. 9B is a magnified schematic view of LC molecules in an LC layer of the active PBP LC grating of FIG. 9A;
FIGS. 10A and 10B are side schematic views of the active PBP LC grating of FIGS. 9A and 9B, showing light propagation in OFF (FIG. 10A) and ON (FIG. 10B) states of the active PBP LC grating;
FIG. 11A is a side cross-sectional view of a switchable polarization volume grating (PVH) usable in imaging and ranging devices of this disclosure;
FIG. 11B is a diagram illustrating optical performance of the PVH of FIG. 11A;
FIG. 12A is a side cross-sectional view of a fluidic grating usable in imaging and ranging devices of this disclosure, in an OFF state;
FIG. 12B is a side cross-sectional view of the fluidic grating of FIG. 12A in an ON state;
FIG. 13A is a side cross-sectional view of a lightguide including an acoustic actuator for creating a volume acoustic wave in the lightguide;
FIG. 13B is a side cross-sectional view of a lightguide including an acoustic actuator for creating a surface acoustic wave in the lightguide;
FIG. 14 is a flow chart of an imaging method in accordance with this disclosure;
FIG. 15 is a view of an augmented reality (AR) display having a form factor of a pair of eyeglasses; and
FIG. 16 is a three-dimensional view of a head-mounted display (HMD) of this disclosure.
DETAILED DESCRIPTION
While the present teachings are described in conjunction with various embodiments and examples, it is not intended that the present teachings be limited to such embodiments. On the contrary, the present teachings encompass various alternatives and equivalents, as will be appreciated by those of skill in the art. All statements herein reciting principles, aspects, and embodiments of this disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
As used herein, the terms “first”, “second”, and so forth are not intended to imply sequential ordering, but rather are intended to distinguish one element from another, unless explicitly stated. Similarly, sequential ordering of method steps does not imply a sequential order of their execution, unless explicitly stated. Throughout the specification, the terms “switchable” and “tunable” are used interchangeably. In FIGS. 1 to 7, similar reference numerals denote similar elements.
Near-eye displays use eye trackers to determine the position and orientation of a viewer's eyes. This information is used to adjust the optical system's accommodation distance to account for the eye pupil position, to redirect light towards the pupil for better optical throughput, to adjust the displayed content, etc. To determine eye position/orientation/gaze direction, eye trackers may employ a specialized imaging device to obtain real-time images of the user's eyes. Near-eye displays may also use various types of imagers of the outside environment, including a color camera, a depth sensing camera or a ranging device, and the like.
One characteristic of camera and ranging devices used in near-eye displays is a requirement to have a wide field of view (FOV). Such a requirement often represents a significant technical challenge, especially when combined with a requirement of low weight, small size, and high spatial resolution.
In accordance with this disclosure, switchable grating structures may be used to extend or multiply a field of view of a camera or a ranging device while preserving compactness, low weight, and spatial resolution. In some embodiments, a diffraction grating structure may be placed upstream of a camera. The diffraction grating structure may have a switchable grating period for switching the camera field of view between a plurality of overlapping portions of a compound field of view, making the compound field of view significantly greater than the camera's inherent field of view. A similar approach may be applied to a ranging device, a scanner, etc.
In accordance with the present disclosure, there is provided an imaging device comprising a camera for capturing an image within a camera field of view and a diffraction grating structure upstream of the camera. The diffraction grating structure has a switchable grating period for switching the camera field of view between a plurality of overlapping portions of a compound field of view, whereby the compound field of view is greater than the camera field of view, i.e. the switchable diffraction grating structure enables a broader field of view than an inherent field of view of the camera.
In some embodiments, the imaging device includes a lightguide for propagating a light beam therein by a series of internal reflections, the light beam carrying the image, and an in-coupling grating supported by the lightguide for in-coupling the light beam into the lightguide. In such embodiments, the diffraction grating structure may be supported by the lightguide for out-coupling, towards the camera, portions of the light beam propagating in the lightguide. In some embodiments, the imaging device includes an out-coupling grating supported by the lightguide for out-coupling portions of the light beam propagating in the lightguide towards the camera. In such embodiments, the diffraction grating structure may be supported by the lightguide for in-coupling the light beam into the lightguide.
In embodiments where the camera comprises a color camera for a near-eye display, the image may include a color image of outside environment of the near-eye display. In embodiments where the camera comprises a depth sensing camera for a near-eye display, the image may include depth information associated with pixels of the image. In embodiments where the camera comprises an eye tracking camera for a near-eye display, the image may include an image of an eye of a user of the near-eye display.
The diffraction grating structure may include a stack of switchable gratings of differing grating pitch, each switchable grating being switchable between a high-efficiency state wherein efficiency of the switchable grating is above a first threshold, and a low-efficiency state wherein the efficiency of the switchable grating is below a second threshold, wherein the second threshold is lower than the first threshold, e.g. 10 or 100 or 1000 times lower. The diffraction grating structure may include, by way of non-limiting examples, a switchable polarization volume hologram (PVH) grating, a switchable Pancharatnam-Berry phase (PBP) liquid crystal (LC) grating, a switchable surface relief liquid crystal (LC) grating, and/or a switchable fluidic grating. In some embodiments, the imaging device comprises an acoustic actuator coupled to a slab of transparent material, and the diffraction grating structure may be formed by an acoustic wave generated by the acoustic actuator in the slab.
The imaging device may include a controller operably coupled to the camera and the diffraction grating structure and configured to operate the diffraction grating structure to switch the camera field of view between the plurality of overlapping portions of the compound field of view, capture an image at each portion of the plurality of overlapping portions of the compound field of view, and optionally stitch the images captured at each portion of the plurality of overlapping portions to obtain a compound image corresponding to the compound field of view.
In accordance with the present disclosure, there is provided an imaging method comprising: switching a field of view of a camera between the plurality of overlapping portions of a compound field of view by switching a grating period of a diffraction grating structure upstream of the camera; at each portion of the plurality of overlapping portions of the compound field of view, capturing an image; and stitching the images captured at each portion of the plurality of overlapping portions to obtain a compound image corresponding to the compound field of view.
The switching may include switching at least one of a stack of switchable gratings of differing grating pitch to a high-efficiency state above a first threshold, while the remaining switchable gratings of the stack are in a low-efficiency state below a second threshold lower than the first threshold, e.g. 10, 100, or even 1000 times lower. The switching may include switching at least one of: a switchable polarization volume hologram (PVH) grating; a switchable Pancharatnam-Berry phase (PBP) liquid crystal (LC) grating; a switchable surface relief liquid crystal (LC) grating; or a fluidic grating.
In accordance with the present disclosure, there is further provided a ranging device comprising a transmitter for providing a ranging light beam within a ranging field of view, and a diffraction grating structure downstream of the transmitter, the diffraction grating structure having a switchable grating period for switching the ranging field of view between a plurality of overlapping portions of a compound field of view, whereby the compound field of view is greater than the ranging field of view.
The ranging device may include a lightguide for propagating the light beam therein by a series of internal reflections, and an in-coupling grating supported by the lightguide for in-coupling the light beam into the lightguide. The diffraction grating structure may be supported by the lightguide for out-coupling portions of the light beam propagating in the lightguide. The diffraction grating structure may include a stack of switchable gratings of differing grating pitch, each switchable grating being switchable between a high-efficiency state wherein efficiency of the switchable grating is above a first threshold, and a low-efficiency state wherein the efficiency of the switchable grating is below a second threshold, wherein the second threshold is lower than the first threshold, e.g. 10, 100, or 1000 times lower. The diffraction grating structure may include at least one of: a switchable polarization volume hologram (PVH) grating; a switchable Pancharatnam-Berry phase (PBP) liquid crystal (LC) grating; a switchable surface relief liquid crystal (LC) grating; or a fluidic grating.
Illustrative embodiments of using switchable gratings for field of view expansion will now be considered. Referring to FIG. 1, an imaging device 100 includes a camera 102, e.g. a monochrome or color camera having a lens 104 optically coupled to a monochrome or color photodetector array 106. The camera 102 can capture an image within a camera field of view, which is determined by the focal length of the lens 104 and linear size, i.e. width and height, of the photodetector array 106. A diffraction grating structure 108 is disposed upstream of the camera 102. The diffraction grating structure 108 has a switchable strength and/or grating period for switching the camera field of view between a plurality of overlapping FOV portions, in this example a left portion 111 shown with dashed lines, a middle portion 112 shown with solid lines, and a right portion 113 shown with dotted lines. There may be more than three portions. Together, the left 111, middle 112, and right 113 FOV portions form a compound field of view 110. The field of view is switched by redirecting, by diffraction, the impinging light. An angle of redirection is defined by the period of the diffraction grating structure 108. Examples of the diffraction grating structure 108 will be provided further below. The ability to switch the camera FOV enables the compound field of view 110 to be much larger than the camera FOV, thereby providing an effective FOV expansion without having to use a wide-angle lens coupled to a high pixel count photodetector array. Herein and throughout the rest of the specification, the term “field of view” or “FOV” is related to both a one-dimensional (i.e. in-plane) FOV and a two-dimensional (solid) FOV.
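By way of a non-limiting numerical illustration, the relation between the grating period of the diffraction grating structure 108 and the resulting redirection of the camera boresight follows the ordinary grating equation. The short sketch below (in Python, with an assumed wavelength, assumed grating periods, and a hypothetical function name; it is not part of the disclosed implementation) shows how switching between two grating periods tilts the camera field of view towards different portions of the compound field of view 110.

```python
# Illustrative sketch (not from the disclosure): how a switchable grating period
# shifts a camera's field of view. Names and values are assumptions for illustration.
import math

def deflection_angle_deg(grating_period_um: float, wavelength_um: float = 0.94,
                         incidence_deg: float = 0.0, order: int = 1) -> float:
    """First-order diffraction angle from the grating equation
    sin(theta_out) = sin(theta_in) + m * wavelength / period."""
    s = math.sin(math.radians(incidence_deg)) + order * wavelength_um / grating_period_um
    return math.degrees(math.asin(s))

# Switching the period redirects the camera boresight, so a camera with a modest
# native FOV can cover left, middle, and right portions of a larger compound FOV
# (cf. portions 111, 112, 113 in FIG. 1).
for period_um in (3.0, 6.0):          # two hypothetical switchable periods
    print(period_um, round(deflection_angle_deg(period_um), 1))
```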
The imaging device 100 may include a controller 150 operably coupled, e.g. electrically and/or mechanically connected, to the camera 102 and the diffraction grating structure 108. The controller 150 may be configured, e.g. programmed, wired, etc., to operate the diffraction grating structure 108 to switch the camera 102 field of view between the overlapping portions 111, 112, 113 of the compound field of view 110. The controller 150 may be configured to capture an image at each portion 111, 112, 113 of the compound field of view 110, e.g. by going sequentially through each portion and taking one or several images at each FOV portion. The controller 150 may then stitch the partial images captured at each portion 111, 112, 113 to obtain a compound image corresponding to the compound field of view 110. In this manner, a very wide field of view may be achieved.
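A minimal control-loop sketch is given below for illustration only; the `grating`, `camera`, and `stitch` objects are hypothetical stand-ins for the diffraction grating structure 108, the camera 102, and an image-stitching routine of the controller 150, and their interfaces are assumptions rather than disclosed APIs.

```python
# Minimal controller sketch (an assumption, not the disclosed implementation):
# cycle the switchable grating through the overlapping FOV portions, grab a frame
# at each setting, and stitch the frames into a compound image.

def capture_compound_image(grating, camera, stitch, periods):
    """periods: one grating-period setting per overlapping FOV portion."""
    frames = []
    for period in periods:
        grating.set_period(period)   # switch the camera FOV to this portion
        frames.append(camera.capture())
    return stitch(frames)            # compound image covering the compound FOV
```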
Referring to FIGS. 2A and 2B, an imaging device 200 is an embodiment of the imaging device 100 of FIG. 1, and includes similar elements. In the imaging device 200 of FIGS. 2A and 2B, the camera 102 is coupled to a lightguide 214. The lightguide 214, which may be e.g. a flat or curved plate or slab of a transparent material such as glass, plastic, oxide, crystal, etc., supports an in-coupling grating 216 and an out-coupling grating 218. A controller, similar to the controller 150 of the imaging device 100 of FIG. 1, operates in a similar manner, and is omitted from FIGS. 2A and 2B for brevity.
In operation, the in-coupling grating 216 in-couples a light beam 215 into the lightguide 214. The light beam 215 carries an image to be captured by the camera 102. The lightguide 214 propagates the light beam 215 by a series of internal reflections from its opposed surfaces. The light beam 215 propagates along a zigzag path 217 shown with dotted lines. The out-coupling grating 218 out-couples portions 219 of the light beam 215 propagating in the lightguide 214 towards the camera 102.
At least one of the in-coupling grating 216 or the out-coupling grating 218 includes the pitch-switchable diffraction grating structure 108 of FIG. 1. In embodiments where the out-coupling grating 218 includes the diffraction grating structure 108, the latter out-couples the light beam portions 219 towards the camera 102 at a switchable angle, effectively switching between different FOV portions. In embodiments where it is the in-coupling grating 216 that includes the diffraction grating structure 108, the latter in-couples the light beam 215 into the lightguide 214 at a switchable angle, thereby directly switching between different FOV portions. For instance, FIG. 2B shows that when the pitch (fringe period) of the in-coupling grating 216 is switched, the same out-coupling angle of the out-coupled light beam portions 219 is obtained for the tilted light beam 215, thus providing switching between input FOV portions.
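For illustration, the sketch below (assumed wavelength, grating pitch, and refractive index; not taken from the disclosure) estimates the angle at which an external ray travels inside a lightguide after in-coupling by a grating of a given pitch, and checks that the angle exceeds the critical angle for total internal reflection, which is the condition for the in-coupled beam to propagate along the zigzag path 217.

```python
# Hedged sketch of the in-coupling geometry: the in-coupling grating pitch sets the
# angle at which a given external ray travels inside the lightguide, and that angle
# must exceed the TIR critical angle. All values are assumptions for illustration.
import math

def in_coupled_angle_deg(aoi_deg, pitch_um, wavelength_um=0.94, n_glass=1.5, order=1):
    """Angle inside the lightguide for an external angle of incidence (AOI),
    from n * sin(theta_guided) = sin(aoi) + m * wavelength / pitch."""
    s = (math.sin(math.radians(aoi_deg)) + order * wavelength_um / pitch_um) / n_glass
    if abs(s) > 1.0:
        return None                          # evanescent: this ray is not coupled
    return math.degrees(math.asin(s))

critical = math.degrees(math.asin(1 / 1.5))  # ~41.8 deg: TIR threshold in the glass
theta = in_coupled_angle_deg(aoi_deg=0.0, pitch_um=0.8)
print(round(critical, 1), round(theta, 1), theta > critical)
```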
The imaging device 200 may be used in a near-eye display for imaging the outside environment of a wearer of the near-eye display. The camera 102 may be a monochrome or color camera operating in the visible part of the optical spectrum for capturing monochrome or color images of the outside environment of the near-eye display, an infrared camera facilitating night-time vision, etc. The images of the outside environment may be used by a controller of the near-eye display for determining the size and position of outside objects for various actions such as proximity warning, picking up objects, etc. In virtual reality or mixed reality applications, the images of the outside environment may be projected or combined with computer-generated imagery displayed to a user of the virtual reality system, for better situational awareness of the user.
Some implementations of the imaging device 100 of FIG. 1 usable in near-eye display applications may include depth sensing systems. Referring to FIG. 3 for a non-limiting illustrative example, a depth sensing imaging device 300 is similar to the imaging device 200 of FIGS. 2A and 2B, and includes similar elements. The depth sensing imaging device 300 of FIG. 3 includes a depth sensing camera 302 coupled to a lightguide 314, which may be similar to the lightguide 214 of FIG. 2. The lightguide 314 of FIG. 3 supports an in-coupling grating 316 and an out-coupling grating 318. One of these gratings, or both of these gratings, may include a switchable diffraction grating structure for switching FOV portions as explained above. A ranging light source 336 provides illuminating light 338 in the form of a succession of short pulses or another form of intensity-modulated light. The purpose of the ranging light source 336 is to illuminate the outside environment being imaged, e.g. an object 340 of the outside environment.
A controller 350 may be operably coupled to the ranging light source 336, the lightguide 314, or more specifically to the switchable diffraction grating structure of the lightguide 314, and the depth sensing camera 302. In operation, the controller 350 causes the ranging light source 336 to emit the illuminating light 338, which is modulated at a known frequency and phase. For definiteness, the illuminating light 338 may include a succession of short light pulses of known emission time. Light 315 reflected from the object 340 in the outside environment reaches the in-coupling grating 316, which in-couples the light 315 into the lightguide 314. The in-coupled light 315 propagates in the lightguide 314 along a light path 317 defined by reflections of the light 315 from opposed surfaces of the lightguide 314, as illustrated. Portions 319 of the light 315 are out-coupled by the out-coupling grating 318.
The depth sensing camera 302 may detect the light 315 in a time-selective manner. Photons arriving at different time intervals from the pulse of the illuminating light 338 may be counted separately, and the distance to elements of the imagery received, e.g. the distance to the object 340, may be determined from the time interval when the photons of the reflected light 315 are received relative to the moment of emission of the illuminating light 338 pulse. In other embodiments, the illuminating light 338 may be periodically e.g. sinusoidally modulated, and the light 315 may be detected by the depth sensing camera 302 in a phase-sensitive manner. The distance to the object 340 may be determined from the phase delay between the illuminating light 338 and the received light 315. In both of these examples, the image detected by the depth sensing camera 302 includes depth information associated with pixels of the image. It is further noted that similar concepts are applicable to the imaging device 100 of FIG. 1, i.e. the lightguide 314 may be replaced with the switchable diffraction grating structure 108, without a lightguide.
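As a hedged numerical illustration of the two range-recovery schemes described above (direct time of flight for pulsed illumination and phase comparison for sinusoidally modulated illumination), the sketch below converts a measured round-trip time or a measured phase delay into a distance; the numerical examples in the comments are assumptions.

```python
# Illustrative range recovery for a depth sensing camera; values are assumed.
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_pulse(round_trip_time_s: float) -> float:
    """Pulsed (direct) time of flight: the light travels to the object and back."""
    return C * round_trip_time_s / 2.0

def range_from_phase(phase_delay_rad: float, mod_freq_hz: float) -> float:
    """Sinusoidally modulated illumination: distance recovered from the phase delay,
    unambiguous within half a modulation wavelength."""
    return C * phase_delay_rad / (4.0 * math.pi * mod_freq_hz)

# e.g. a 10 ns round trip gives ~1.5 m; a pi/2 phase delay at 30 MHz gives ~1.25 m
print(range_from_pulse(10e-9), range_from_phase(math.pi / 2, 30e6))
```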
Turning to FIG. 4 as another example implementation of the imaging device 100 of FIG. 1, an eye tracking imaging device 400 includes an eye tracking camera 402 coupled to a lightguide 414. The lightguide 414 supports an in-coupling grating 416 and an out-coupling grating 418. One of these gratings, or both of these gratings, may include a switchable diffraction grating structure for switching FOV portions as explained above. The eye tracking imaging device 400 further includes a plurality of eye illuminators 438, which may be supported by the lightguide 414 on its proximal side, i.e. on the side facing a user's eye 440. The purpose of the eye illuminators 438 is to illuminate the eye 440. Light 415 reflected by the eye 440 is in-coupled by the in-coupling grating 416 into the lightguide 414. The in-coupled light 415 then propagates in the lightguide 414 by a series of internal reflections from opposed surfaces of the lightguide 414, as indicated with a zigzag lightpath 417. The eye tracking camera 402 detects the light 415 and obtains images of the eye 440. The expansion of the FOV afforded by the switchable diffraction grating structure in the in-coupling 416 and/or out-coupling 418 gratings enables the eye 440 to be detected at a plurality of locations, e.g. the first 440A and second 440B locations. Similar concepts are applicable to the imaging device 100 of FIG. 1, i.e. the lightguide 414 may be replaced with the switchable diffraction grating structure 108.
The principles of FOV expansion by using a diffraction grating structure to switch between several FOV portions can be extended not only to reception devices such as cameras but also to transmission devices. Referring to FIG. 5 for a non-limiting illustrative example, a ranging device 500 includes a transmitter 536 optically coupled to a diffraction grating structure 508 downstream of the transmitter 536. In operation, the transmitter 536 provides a ranging light beam 538 within a ranging field of view. The ranging light beam 538 may include, for example, a diverging (e.g. with a divergence of 40 degrees or more) modulated beam, a wide-angle pulsed beam, a collimated (e.g. with a divergence of 4 degrees or less) modulated beam, a collimated pulsed beam, etc. The time characteristic of the ranging light beam 538 may be used as a basis for the distance determination by calculating time of flight of ranging light beam 538 to an object (not shown for brevity) and time of flight of a reflected or scattered light beam back to a photodetector or a camera 502 of the ranging device 500. The purpose of the photodetector or camera 502 is to capture the reflected or scattered light beam.
The diffraction grating structure 508 is disposed downstream of the transmitter 536. The diffraction grating structure 508 has a switchable grating period for switching the ranging field of view between a plurality of overlapping portions 511, 512, 513 of a compound field of view 510. The compound field of view 510 may be obtained by stitching the portions 511, 512, 513, and may be far greater than the ranging field of view of the ranging device 500.
Turning to FIG. 6, a ranging device 600 is an embodiment of the ranging device 500 of FIG. 5, and includes similar elements. In the ranging device 600 of FIG. 6, the transmitter 536 is coupled to a lightguide 614. The lightguide 614 supports an in-coupling grating 616 and an out-coupling grating 618. At least one of the in-coupling grating 616 or the out-coupling grating 618 includes a pitch-switchable diffraction grating structure, similar to the diffraction grating structure 108 of FIG. 1 described above.
In operation, the in-coupling grating 616 in-couples the ranging light beam 538 emitted by the transmitter 536 into the lightguide 614. The lightguide 614 includes a slab of transparent material such as glass, plastic, crystal, metal oxide, etc., that propagates the ranging light beam 538 in the lightguide 614 by a series of internal reflections from opposed surfaces of the slab. The ranging light beam 538 propagates along a zigzag light path 617. The out-coupling grating 618 out-couples portions of the ranging light beam 538 towards an external object being imaged or ranged. Reflected light is detected by the photodetector or camera 502 (FIG. 6). The distance to the object may be determined from the arrival time or phase of optical power modulation of the received light.
Referring now to FIG. 7, a switchable diffraction grating structure 708 may be used in any of the imaging and ranging devices described herein. The switchable diffraction grating structure 708 includes a stack of switchable gratings of differing grating pitch or grating period, in this example first 771 and second 772 switchable gratings. More than two gratings may be provided. Each switchable grating 771 and 772 is switchable between a high-efficiency state when efficiency of the switchable grating is above a first threshold, and a low-efficiency state when the efficiency of the switchable grating is below a second threshold. The second threshold is lower than the first threshold, e.g. at least 10 times lower, at least 100 times lower, or at least 300 times lower.
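A minimal driver sketch for such a stack is shown below; the grating objects and their `set_state()` interface are assumptions used for illustration. The idea is simply that exactly one grating of the stack is held in its high-efficiency state at any given time, so that its pitch alone defines the diffraction angle.

```python
# Hypothetical driver for the grating stack of FIG. 7; the interface is assumed.
def select_grating(stack, active_index):
    """Put grating `active_index` into its high-efficiency state and all other
    gratings into their low-efficiency state."""
    for i, grating in enumerate(stack):
        # ON: efficiency above the first threshold; OFF: below the much lower second threshold
        grating.set_state("ON" if i == active_index else "OFF")
```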
Non-limiting examples of switchable/tunable grating structures usable in lightguides and imaging/ranging devices of this disclosure will now be presented. Referring first to FIG. 8, a tunable liquid crystal (LC) surface-relief grating 800 may be used e.g. in the diffraction grating structure 108 of the imaging device 100 of FIG. 1, the in-coupling 216 and/or out-coupling 218 gratings of the imaging device 200 of FIGS. 2A and 2B, the in-coupling 316 and/or out-coupling 318 gratings of the depth sensing imaging device 300 of FIG. 3, the in-coupling 416 and/or out-coupling 418 gratings of the eye tracking imaging device 400 of FIG. 4, the diffraction grating structure 508 of the ranging device 500 of FIG. 5, the in-coupling 616 and/or out-coupling 618 gratings of the ranging device 600 of FIG. 6, and the switchable diffraction grating structure 708 of FIG. 7. The tunable LC surface-relief grating 800 includes a first substrate 801 supporting a first conductive layer 811 and a surface-relief grating structure 804 having a plurality of ridges 806 extending from the first substrate 801 and/or the first conductive layer 811.
A second substrate 802 is spaced apart from the first substrate 801. The second substrate 802 supports a second conductive layer 812. A cell is formed by the first 811 and second 812 conductive layers. The cell is filled with a LC fluid, forming an LC layer 808. The LC layer 808 includes nematic LC molecules 810, which may be oriented by an electric field across the LC layer 808. The electric field may be provided by applying a voltage V to the first 811 and second 812 conductive layers.
The surface-relief grating structure 804 may be formed from a polymer with an isotropic refractive index np of about 1.5, for example. The LC fluid has an anisotropic refractive index. For light polarization parallel to a director of the LC fluid, i.e. to the direction of orientation of the nematic LC molecules 810, the LC fluid has an extraordinary refractive index ne, which may be higher than an ordinary refractive index no of the LC fluid for light polarization perpendicular to the director. For example, the extraordinary refractive index ne may be about 1.7, and the ordinary refractive index no may be about 1.5, i.e. matched to the refractive index np of the surface-relief grating structure 804.
When the voltage V is not applied (left side of FIG. 8), the LC molecules 810 are aligned approximately parallel to the grooves of the surface-relief grating structure 804. In this configuration, a linearly polarized light beam 821 with its e-vector oriented along the grooves of the surface-relief grating structure 804 will undergo diffraction, since the surface-relief grating structure 804 will have a non-zero refractive index contrast. When the voltage V is applied (right side of FIG. 8), the LC molecules 810 are aligned approximately perpendicular to the grooves of the surface-relief grating structure 804. In this configuration, a linearly polarized light beam 821 with its e-vector oriented along the grooves of the surface-relief grating structure 804 will not undergo diffraction because the surface-relief grating structure 804 will appear to be index-matched and, accordingly, will have a substantially zero refractive index contrast. For the linearly polarized light beam 821 with its e-vector oriented perpendicular to the grooves of the surface-relief grating structure 804, no diffraction will occur in either case (i.e. whether the voltage is applied or not), because at this polarization of the linearly polarized light beam 821, the surface-relief grating structure 804 is index-matched. Thus, the tunable LC surface-relief grating 800 can be switched on and off (for polarized light) by controlling the voltage across the LC layer 808. Several such gratings with differing pitch/slant angle/refractive index contrast may be used to switch between several grating configurations. In some embodiments, the surface-relief grating structure 804 may include a birefringent material as well.
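The on/off logic of the tunable LC surface-relief grating 800 can be summarized by the refractive index contrast seen by the light, as in the illustrative sketch below, which uses the example index values quoted above (np of about 1.5, no of about 1.5, ne of about 1.7); the function is a didactic simplification, not a disclosed model.

```python
# Minimal sketch of the index-contrast argument for the tunable LC surface-relief
# grating; index values are the example values quoted in the text.
def index_contrast(polarization_along_grooves: bool, voltage_on: bool,
                   n_polymer: float = 1.5, n_o: float = 1.5, n_e: float = 1.7) -> float:
    """Refractive index contrast seen by the light in the surface-relief grating."""
    if not polarization_along_grooves:
        return abs(n_o - n_polymer)        # always index-matched: no diffraction
    n_lc = n_o if voltage_on else n_e      # the applied field reorients the director
    return abs(n_lc - n_polymer)           # ~0.2 OFF (diffracts), ~0 ON (transparent)

print(index_contrast(True, False), index_contrast(True, True), index_contrast(False, False))
```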
Referring now to FIG. 9A, a Pancharatnam-Berry phase (PBP) LC switchable grating 900 may be used e.g. in the diffraction grating structure 108 of the imaging device 100 of FIG. 1, the in-coupling 216 and/or out-coupling 218 gratings of the imaging device 200 of FIGS. 2A and 2B, the in-coupling 316 and/or out-coupling 318 gratings of the depth sensing imaging device 300 of FIG. 3, the in-coupling 416 and/or out-coupling 418 gratings of the eye tracking imaging device 400 of FIG. 4, the diffraction grating structure 508 of the ranging device 500 of FIG. 5, the in-coupling 616 and/or out-coupling 618 gratings of the ranging device 600 of FIG. 6, and the switchable diffraction grating structure 708 of FIG. 7. The PBP LC switchable grating 900 of FIG. 9A includes LC molecules 902 in an LC layer 904. The LC molecules 902 are disposed in XY plane at a varying in-plane orientation depending on the X coordinate. The orientation angle ϕ(x) of the LC molecules 902 in the PBP LC switchable grating 900 is given by
ϕ(x)=πx/T=πx sin θ/λo (1)
where λo is the wavelength of impinging light, T is a pitch of the PBP LC switchable grating 900, and θ is a diffraction angle given by
θ = sin⁻¹(λo/T)  (2)
The azimuthal angle ϕ varies continuously across the surface of the LC layer 904 parallel to the XY plane, as illustrated in FIG. 9B. The variation has a constant period equal to T. The optical phase delay P in the PBP LC switchable grating 900 of FIG. 9A is due to the PBP effect, which manifests itself as P(x) = 2ϕ(x) when the optical retardation R of the LC layer 904 is equal to λo/2.
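As a worked example of equations (1) and (2), the sketch below computes the pitch T required for a target diffraction angle; the 940 nm wavelength and 15 degree angle are assumed values for illustration only.

```python
# Worked example for equations (1) and (2): the PBP pitch T needed for a target
# diffraction angle. The 940 nm wavelength and 15-degree angle are assumptions.
import math

wavelength_nm = 940.0
theta_deg = 15.0
T_nm = wavelength_nm / math.sin(math.radians(theta_deg))   # eq. (2): T = lambda / sin(theta)
phi_per_um = 180.0 / (T_nm / 1000.0)                       # eq. (1): director rotates by pi over one pitch T
print(round(T_nm), "nm pitch,", round(phi_per_um, 1), "deg of director rotation per micrometer")
```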
FIGS. 10A and 10B illustrate the operation of the PBP LC switchable grating 900 of FIG. 9A. The PBP LC switchable grating 900 includes the LC layer 904 (FIG. 9A) disposed between parallel substrates configured for applying an electric field across the LC layer 904. The LC molecules 902 are oriented substantially parallel to the substrates in absence of the electric field, and substantially perpendicular to the substrates in presence of the electric field.
In FIG. 10A, the PBP LC switchable grating 900 is in the OFF state, such that its LC molecules 902 (FIGS. 9A, 9B) are disposed predominantly parallel to the substrate plane, that is, parallel to the XY plane in FIG. 10A. When an incoming light beam 1015 is left-circular polarized (LCP), the PBP LC switchable grating 900 redirects the light beam 1015 upwards by a pre-determined non-zero angle, and the beam 1015 becomes right-circular polarized (RCP). The RCP deflected beam 1015 is shown with solid lines. When the incoming light beam 1015 is right-circular polarized (RCP), the PBP LC switchable grating 900 redirects the beam 1015 downwards by a pre-determined non-zero angle, and the beam 1015 becomes left-circular polarized (LCP). The LCP deflected beam 1015 is shown with dashed lines. Applying a voltage V to the PBP LC switchable grating 900 reorients the LC molecules along the Z-axis, i.e. perpendicular to the substrate plane, as shown in FIG. 10B. In this orientation of the LC molecules 902, the PBP structure is erased, and the light beam 1015 retains its original direction, whether it is LCP or RCP. Thus, the active PBP LC grating 900 is a tunable grating, i.e. it has a variable beam steering property. Furthermore, the operation of the active PBP LC grating 900 may be controlled by controlling the polarization state of the impinging light beam 1015.
Turning to FIG. 11A, a polarization volume hologram (PVH) grating 1100 may be used e.g. in the diffraction grating structure 108 of the imaging device 100 of FIG. 1, the in-coupling 216 and/or out-coupling 218 gratings of the imaging device 200 of FIGS. 2A and 2B, the in-coupling 316 and/or out-coupling 318 gratings of the depth sensing imaging device 300 of FIG. 3, the in-coupling 416 and/or out-coupling 418 gratings of the eye tracking imaging device 400 of FIG. 4, the diffraction grating structure 508 of the ranging device 500 of FIG. 5, the in-coupling 616 and/or out-coupling 618 gratings of the ranging device 600 of FIG. 6, and the switchable diffraction grating structure 708 of FIG. 7. The PVH grating 1100 of FIG. 11A includes an LC layer 1104 bound by opposed top 1105 and bottom 1106 parallel surfaces. The LC layer 1104 may include an LC fluid containing rod-like LC molecules 1107 with positive dielectric anisotropy, i.e. nematic LC molecules. A chiral dopant may be added to the LC fluid, causing the LC molecules in the LC fluid to self-organize into a periodic helical configuration including helical structures 1108 extending between the top 1105 and bottom 1106 parallel surfaces of the LC layer 1104. Such a configuration of the LC molecules 1107, termed herein a cholesteric configuration, includes a plurality of helical periods p, e.g. at least two, at least five, at least ten, at least twenty, or at least fifty helical periods p between the top 1105 and bottom 1106 parallel surfaces of the LC layer 1104.
Boundary LC molecules 1107b at the top surface 1105 of the LC layer 1104 may be oriented at an angle to the top surface 1105. The boundary LC molecules 1107b may have a spatially varying azimuthal angle, e.g. linearly varying along the X-axis parallel to the top surface 1105, as shown in FIG. 11A. To that end, an alignment layer 1112 may be provided at the top surface 1105 of the LC layer 1104. The alignment layer 1112 may be configured to provide the desired orientation pattern of the boundary LC molecules 1107b, such as the linear dependence of the azimuthal angle on the X-coordinate, e.g. by photopolymerizing the alignment layer 1112 with polarized ultraviolet (UV) light. A pattern of spatially varying polarization directions of the UV light may be selected to match a desired orientation pattern of the boundary LC molecules 1107b at the top surface 1105 and/or the bottom surface 1106 of the LC layer 1104. When the alignment layer 1112 is coated with the cholesteric LC fluid, the boundary LC molecules 1107b are oriented along the photopolymerized chains of the alignment layer 1112, thus adopting the desired surface orientation pattern. Adjacent LC molecules adopt helical patterns extending from the top 1105 to the bottom 1106 surfaces of the LC layer 1104, as shown.
The boundary LC molecules 1107b define relative phases of the helical structures 1108 having the helical period p. The helical structures 1108 form a volume grating comprising helical fringes 1114 tilted at an angle, as shown in FIG. 11A. The steepness of the tilt angle depends on the rate of variation of the azimuthal angle of the boundary LC molecules 1107b at the top surface 1105 and on the helical period p. Thus, the tilt angle is determined by the surface alignment pattern of the boundary LC molecules 1107b at the alignment layer 1112. The volume grating has a period Λx along the X-axis and a period Λy along the Y-axis. In some embodiments, the periodic helical structures 1108 of the LC molecules 1107 may be polymer-stabilized by mixing a stabilizing polymer into the LC fluid, and curing (polymerizing) the stabilizing polymer.
The helical nature of the fringes 1114 of the volume grating makes the PVH grating 1100 preferentially responsive to light of a polarization having one particular handedness, e.g. left- or right-circular polarization, while being substantially non-responsive to light of the opposite handedness of polarization. Thus, the helical fringes 1114 make the PVH grating 1100 polarization-selective, causing the PVH grating 1100 to diffract light of only one handedness of circular polarization. This is illustrated in FIG. 11B, which shows a light beam 1120 impinging onto the PVH grating 1100. The light beam 1120 includes a left circular polarized (LCP) beam component 1121 and a right circular polarized (RCP) beam component 1122. The LCP beam component 1121 propagates through the PVH grating 1100 substantially without diffraction. Herein, the term “substantially without diffraction” means that, even though an insignificant portion of the beam (the LCP beam component 1121 in this case) might diffract, the portion of the diffracted light energy is so small that it does not impact the intended performance of the PVH grating 1100. The RCP beam component 1122 of the light beam 1120 undergoes diffraction, producing a diffracted beam 1122′. The polarization selectivity of the PVH grating 1100 results from the effective refractive index of the grating being dependent on the relationship between the handedness, or chirality, of the impinging light beam and the handedness, or chirality, of the grating fringes 1114. Changing the handedness of the impinging light may be used to switch the performance of the PVH grating 1100. The PVH grating 1100 may also be made switchable or tunable (both terms are used interchangeably in this specification) by applying a voltage to the LC layer 1104, which distorts or erases the above-described helical structure. It is further noted that the sensitivity of the PVH grating 1100 to right circular polarized light in particular is only meant as an illustrative example. When the handedness of the helical fringes 1114 is reversed, the PVH grating 1100 may be made sensitive to left circular polarized light. Thus, the operation of the PVH grating 1100 may be controlled by controlling the polarization state of the impinging light beam 1120. Furthermore, in some embodiments the PVH grating 1100 may be made tunable by application of an electric field across the LC layer 1104, which erases the periodic helical structures 1108.
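For illustration, the polarization selectivity of the PVH grating 1100 can be modeled by decomposing the impinging beam into circular components and diffracting only one of them, as in the sketch below; the Jones-vector convention used for the circular basis is an assumption.

```python
# Illustrative decomposition (not from the disclosure) of an input beam into
# circular-polarization components, mirroring FIG. 11B where only one handedness
# diffracts. The (x + iy)/sqrt(2) convention for LCP is an assumption.
import numpy as np

def circular_components(jones_xy):
    """Return (power_LCP, power_RCP) of a Jones vector given in the x/y basis."""
    e_lcp = np.array([1, 1j]) / np.sqrt(2)
    e_rcp = np.array([1, -1j]) / np.sqrt(2)
    a_lcp = np.vdot(e_lcp, jones_xy)   # projection onto the LCP basis vector
    a_rcp = np.vdot(e_rcp, jones_xy)
    return abs(a_lcp) ** 2, abs(a_rcp) ** 2

# Linearly x-polarized light splits 50/50; for a right-handed PVH, only the RCP
# half would be diffracted into beam 1122', while the LCP half passes through.
print(circular_components(np.array([1.0, 0.0])))   # -> (0.5, 0.5)
```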
Referring now to FIGS. 12A and 12B, a fluidic grating 1200 may be used e.g. in the diffraction grating structure 108 of the imaging device 100 of FIG. 1, the in-coupling 216 and/or out-coupling 218 gratings of the imaging device 200 of FIGS. 2A and 2B, the in-coupling 316 and/or out-coupling 318 gratings of the depth sensing imaging device 300 of FIG. 3, the in-coupling 416 and/or out-coupling 418 gratings of the eye tracking imaging device 400 of FIG. 4, the diffraction grating structure 508 of the ranging device 500 of FIG. 5, the in-coupling 616 and/or out-coupling 618 gratings of the ranging device 600 of FIG. 6, and the switchable diffraction grating structure 708 of FIG. 7. The fluidic grating 1200 includes first 1201 and second 1202 immiscible fluids separated by an inter-fluid boundary 1203. One of the fluids may be a hydrophobic fluid such as oil, e.g. silicone oil, while the other fluid may be water-based. One of the first 1201 and second 1202 fluids may be a gas in some embodiments. The first 1201 and second 1202 fluids may be contained in a cell formed by first 1211 and second 1212 substrates supporting first 1221 and second 1222 electrode structures. The first 1221 and/or second 1222 electrode structures may be at least partially transparent, absorptive, and/or reflective.
At least one of the first 1221 and second 1222 electrode structures may be patterned for imposing a spatially variant electric field onto the first 1201 and second 1202 fluids. For example, in FIGS. 12A and 12B, the first electrode 1221 is patterned, and the second electrode 1222 is not patterned, i.e. the second electrode 1222 is a backplane electrode. In the embodiment shown, both the first 1221 and second 1222 electrodes are substantially transparent. For example, the first 1221 and second 1222 electrodes may be indium tin oxide (ITO) electrodes. The individual portions of a patterned electrode may be individually addressable. In some embodiments, the patterned electrode 1221 may be replaced with a continuous, non-patterned electrode coupled to a patterned dielectric layer for creating a spatially non-uniform electric field across the first 1201 and second 1202 fluids.
FIG. 12A shows the fluidic grating 1200 in a non-driven state when no electric field is applied across the inter-fluid boundary 1203. When no electric field is present, the inter-fluid boundary 1203 is straight and smooth; accordingly, a light beam 1205 impinging onto the fluidic grating 1200 does not diffract, propagating right through as illustrated. FIG. 12B shows the fluidic grating 1200 in a driven state when a voltage V is applied between the first 1221 and second 1222 electrodes, producing a spatially variant electric field across the first 1201 and second 1202 fluids separated by the inter-fluid boundary 1203. The application of the spatially variant electric field causes the inter-fluid boundary 1203 to distort as illustrated in FIG. 12B, forming a periodic variation of effective refractive index, i.e. a switchable or tunable surface-relief diffraction grating. The light beam 1205 impinging onto the fluidic grating 1200 will diffract, forming first 1231 and second 1232 diffracted sub-beams. By varying the amplitude of the applied voltage V, the strength of the fluidic grating 1200 may be varied. By applying different patterns of the electric field e.g. with individually addressable sub-electrodes or pixels of the first electrode 1221, the grating period and, accordingly, the diffraction angle, may be varied. More generally, varying the effective voltage between separate sub-electrodes or pixels of the first electrode 1221 may result in a three-dimensional conformal change of the fluidic interface i.e. the inter-fluid boundary 1203 inside the fluidic volume to impart a desired optical response to the fluidic grating 1200. The applied voltage pattern may be pre-biased to compensate or offset gravity effects, i.e. gravity-caused distortions of the inter-fluid boundary 1203.
The thickness of the first 1221 and second 1222 electrodes may be e.g. between 10 nm and 50 nm. The materials of the first 1221 and second 1222 electrodes besides ITO may be e.g. indium zinc oxide (IZO), zinc oxide (ZO), indium oxide (IO), tin oxide (TO), indium gallium zinc oxide (IGZO), etc. The first 1201 and second 1202 fluids may have a refractive index difference of at least 0.1, and the difference may be 0.2 or higher. One of the first 1201 or second 1202 fluids may include polyphenylether, 1,3-bis(phenylthio)benzene, etc. The first 1211 and/or second 1212 substrates may include e.g. fused silica, quartz, sapphire, etc. The first 1211 and/or second 1212 substrates may be straight or curved, and may include vias and other electrical interconnects. The applied voltage may be varied in amplitude and/or duty cycle when applied at a frequency of between 100 Hz and 100 kHz. The applied voltage can change polarity and/or be bipolar. Individual first 1201 and/or second 1202 fluid layers may have a thickness of between 0.5 and 5 micrometers, more preferably between 0.5 and 2 micrometers.
To separate the first 1201 and second 1202 fluids, surfactants containing one hydrophilic end functional group and one hydrophobic end functional group may be used. Examples of a hydrophilic end functional group include hydroxyl, carboxyl, carbonyl, amino, phosphate, and sulfhydryl groups. The hydrophilic functional groups may also be anionic groups such as sulfate, sulfonate, carboxylate, and phosphate groups, for example. Non-limiting examples of a hydrophobic end functional group are aliphatic groups, aromatic groups, and fluorinated groups. For example, when polyphenyl thioether and a fluorinated fluid are selected as the fluid pair, a surfactant containing an aromatic end group and a fluorinated end group may be used. When phenyl silicone oil and water are selected as the fluid pair, a surfactant containing an aromatic end group and a hydroxyl (or amino, or ionic) end group may be used. These are only non-limiting examples.
Referring to FIG. 13A, a pupil-replicating lightguide 1300A of this disclosure includes a body 1306A having two portions, a substrate 1328 for propagating image light 1304, and a volume-wave acoustic actuator 1330A mechanically coupled at a side of the substrate 1328 joining its top 1315 and bottom 1316 surfaces. In the embodiment shown, the volume-wave acoustic actuator 1330A includes an electrically responsive layer 1332A, e.g. a piezoelectric layer, disposed between electrodes 1307A,1308A. In operation, an electrical signal at a high frequency, typically in the range of 1 MHz to 100 MHz or higher, is applied to the electrodes 1307A,1308A causing the electrically responsive layer 1332A to oscillate, typically at a frequency of a mechanical resonance of the electrically responsive layer 1332A. The oscillating thickness of the electrically responsive layer 1332A creates a volume acoustic wave 1334A propagating in the substrate 1328 in a direction 1335, i.e. along the X-axis. The volume acoustic wave 1334A modulates the refractive index of the substrate 1328 due to the effect of photoelasticity. The modulated refractive index creates a diffraction grating that out-couples portions 1312, 1313 of the image light 1304 from the pupil-replicating lightguide 1300A. By changing the strength of the electric signal applied to the volume-wave acoustic actuator 1330A, the strength of the out-coupling grating may be changed. The out-coupling grating may be switched ON and OFF by switching ON and OFF the oscillating electric signal. The grating period may be changed by changing the frequency of the oscillating electric signal. In some embodiments, an acoustic wave terminator 1336A can be coupled to an opposite side of the substrate 1328 to absorb the volume acoustic wave 1334A and thus prevent a standing acoustic wave formation in the substrate 1328.
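The dependence of the acoustically induced grating period on the drive frequency can be illustrated as follows: the period equals the acoustic wavelength, i.e. the acoustic velocity in the substrate divided by the drive frequency. The sketch below uses an assumed, glass-like acoustic velocity and assumed drive frequencies, and is not part of the disclosed implementation.

```python
# Hedged sketch relating RF drive frequency to acoustic grating period:
# Lambda = v / f, so retuning the drive frequency retunes the grating pitch.
# The acoustic velocity is an assumed, glass-like value.
def acoustic_grating_period_um(drive_freq_mhz: float, velocity_m_s: float = 5900.0) -> float:
    return velocity_m_s / (drive_freq_mhz * 1e6) * 1e6   # period in micrometers

# e.g. 59 MHz -> ~100 um period; 100 MHz -> ~59 um (illustrative values only)
print(acoustic_grating_period_um(59.0), acoustic_grating_period_um(100.0))
```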
Turning to FIG. 13B, a pupil-replicating waveguide 1300B of the present disclosure includes a waveguide body 1306B having two portions, the substrate 1328 for propagating the beam of image light 1304, and a surface-wave acoustic actuator 1330B mechanically coupled at the top surface 1315. Alternatively, the surface-wave acoustic actuator 1330B may be coupled at the bottom surface 1316. In the embodiment shown, the surface-wave acoustic actuator 1330B includes an electrically responsive layer 1332B, e.g. a piezoelectric layer, disposed between electrodes 1307B, 1308B. In operation, an electrical signal at a high frequency, typically in the range of 1 MHz to 100 MHz or higher, is applied to the electrodes 1307B, 1308B, causing the electrically responsive layer 1332B to oscillate. The oscillation of the electrically responsive layer 1332B creates a surface acoustic wave 1334B propagating in the substrate 1328 in the direction 1335, i.e. along the X-axis. The surface acoustic wave 1334B forms a diffraction grating that out-couples the portions 1312, 1313 of the image light 1304 from the pupil-replicating lightguide 1300B. By changing the strength of the electric signal applied to the surface-wave acoustic actuator 1330B, the strength of the surface grating may be changed. The surface grating may be switched ON and OFF by switching ON and OFF the oscillating electric signal. The grating period may be changed by changing the frequency of the oscillating electric signal. In some embodiments, an acoustic wave terminator 1336B can be coupled to an opposite side of the substrate 1328 at the same surface, i.e. at the top surface 1315 in FIG. 13B, to absorb the surface acoustic wave 1334B and thus prevent a standing acoustic wave formation. The pupil-replicating lightguides 1300A and 1300B may be used e.g. in the diffraction grating structure 108 of the imaging device 100 of FIG. 1, the in-coupling 216 and/or out-coupling 218 gratings of the imaging device 200 of FIGS. 2A and 2B, the in-coupling 316 and/or out-coupling 318 gratings of the depth sensing imaging device 300 of FIG. 3, the in-coupling 416 and/or out-coupling 418 gratings of the eye tracking imaging device 400 of FIG. 4, the diffraction grating structure 508 of the ranging device 500 of FIG. 5, the in-coupling 616 and/or out-coupling 618 gratings of the ranging device 600 of FIG. 6, and the switchable diffraction grating structure 708 of FIG. 7.
Some switchable gratings include a material with tunable refractive index. By way of a non-limiting example, a holographic polymer-dispersed liquid crystal (H-PDLC) grating may be manufactured by causing interference between two coherent laser beams in a photosensitive monomer/liquid crystal (LC) mixture contained between two substrates coated with a conductive layer. Upon irradiation, a photoinitiator contained within the mixture initiates a free-radical reaction, causing the monomer to polymerize. As the polymer network grows, the mixture phase separates into polymer-rich and liquid-crystal rich regions. The refractive index modulation between the two phases causes light passing through the cell to be scattered in the case of traditional PDLC or diffracted in the case of H-PDLC. When an electric field is applied across the cell, the index modulation is removed and light passing through the cell is unaffected. This is described in an article entitled “Electrically Switchable Bragg Gratings from Liquid Crystal/Polymer Composites” by Pogue et al., Applied Spectroscopy, v. 54 No. 1, 2000, which is incorporated herein by reference in its entirety.
Tunable or switchable gratings with a variable grating period may be produced e.g. by using flexoelectric LC. For LCs with a non-zero flexoelectric coefficient difference (e1−e3) and low dielectric anisotropy, electric fields exceeding certain threshold values result in transitions from the homogeneous planar state to a spatially periodic one. The field-induced grating is characterized by rotation of the LC director about the alignment axis, with the wavevector of the grating oriented perpendicular to the initial alignment direction. The rotation sign is defined by both the electric field vector and the sign of the (e1−e3) difference. The wavenumber characterizing the field-induced periodicity increases linearly with the applied voltage, starting from a threshold value of about π/d, where d is the thickness of the layer. A description of flexoelectric LC gratings may be found e.g. in an article entitled “Dynamic and Photonic Properties of Field-Induced Gratings in Flexoelectric LC Layers” by Palto in Crystals 2021, 11, 894, which is incorporated herein by reference in its entirety.
Tunable gratings with a variable grating period or a slant angle may be provided e.g. by using helicoidal LC. Tunable gratings with helicoidal LCs have been described e.g. in an article entitled “Electrooptic Response of Chiral Nematic Liquid Crystals with Oblique Helicoidal Director” by Xiang et al. Phys. Rev. Lett. 112, 217801, 2014, which is incorporated herein by reference in its entirety.
For gratings exhibiting strong wavelength dependence of grating efficiency, several gratings, e.g. several volume Bragg gratings (VBGs), may be provided in the lightguide. The grating that diffracts light at any given moment of time may be selected by switching the VBG gratings on and off, and/or by switching the wavelength of the light propagating in the lightguide.
Turning to FIG. 14 with further reference to FIG. 1, an imaging method 1400 of the present disclosure may be implemented e.g. in the controller 150 of the imaging device 100 of FIG. 1, or in a controller of any other imaging/ranging device disclosed herein. The imaging method 1400 includes switching (FIG. 14; 1402) a field of view of a camera, e.g. the camera 102 of FIG. 1, between the plurality of overlapping FOV portions 111, 112, 113 of the compound field of view 110 by switching a grating period of a diffraction grating structure, e.g. the grating period of the diffraction grating structure 108. The diffraction grating structure is disposed upstream of the camera 102. An image is captured (1404) at each portion of the plurality of overlapping portions of the compound field of view. The images captured at each portion of the plurality of overlapping portions are stitched together (1406) to obtain a compound image corresponding to the compound field of view.
In some embodiments, the switching 1402 may include switching at least one of a stack of switchable gratings of differing grating pitch, e.g. a first grating, to a high-efficiency state above a first threshold. The remaining switchable gratings of the stack may be in a low-efficiency state below a second threshold lower than the first threshold, such that only the first grating is activated. Any of the switchable/tunable gratings considered above with reference to FIGS. 8-13 may be used in the method 1400, including e.g. the switchable surface relief LC grating 800 of FIG. 8, the switchable PBP LC grating 900 of FIG. 9, the switchable PVH grating 1100 of FIG. 11, the fluidic grating 1200 of FIG. 12, etc.
Referring now to FIG. 15, a near-eye display 1500 includes a frame 1503 supporting, for each eye: an image projector 1530 for providing an image light beam carrying an image in angular domain; a pupil-replicating lightguide 1506, including any of the waveguides disclosed herein, for providing multiple offset portions of the image light beam to spread the image in angular domain across an eyebox 1512; and an eye tracker 1504 that may include the eye tracking imaging device 400 of FIG. 4, for example. The purpose of the eye trackers 1504 is to determine position and/or orientation of both eyes of the user. The near-eye display 1500 may further include an imaging or ranging device 1580 for scanning or imaging the surrounding environment, which may include any of the imaging or ranging devices disclosed herein.
Turning to FIG. 16, an HMD 1600 is a non-limiting illustrative example of an AR/VR wearable display system which encloses the user's face, for a greater degree of immersion into the AR/VR environment. The HMD 1600 may include any of the imaging and ranging systems disclosed herein. The HMD 1600 may generate entirely virtual 3D imagery. The HMD 1600 may include a front body 1602 and a band 1604 that can be secured around the user's head. The front body 1602 is configured for placement in front of eyes of a user in a reliable and comfortable manner. A display system 1680 may be disposed in the front body 1602 for presenting AR/VR imagery to the user. The display system 1680 may include any of the display devices and waveguides disclosed herein. Sides 1606 of the front body 1602 may be opaque or transparent.
In some embodiments, the front body 1602 includes locators 1608 and an inertial measurement unit (IMU) 1610 for tracking acceleration of the HMD 1600, and position sensors 1612 for tracking position of the HMD 1600. The IMU 1610 is an electronic device that generates data indicating a position of the HMD 1600 based on measurement signals received from one or more of position sensors 1612, which generate one or more measurement signals in response to motion of the HMD 1600. Examples of position sensors 1612 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, a type of sensor used for error correction of the IMU 1610, or some combination thereof. The position sensors 1612 may be located external to the IMU 1610, internal to the IMU 1610, or some combination thereof.
The locators 1608 are tracked by an external imaging device of a virtual reality system, such that the virtual reality system can track the location and orientation of the entire HMD 1600. Information generated by the IMU 1610 and the position sensors 1612 may be compared with the position and orientation obtained by tracking the locators 1608, for improved tracking accuracy of position and orientation of the HMD 1600. Accurate position and orientation information is important for presenting appropriate virtual scenery to the user as the latter moves and turns in 3D space.
The HMD 1600 may further include a depth camera assembly (DCA) 1611, which may include e.g. the depth sensing imaging device 300 of FIG. 3, the ranging device 500 of FIG. 5, etc. The DCA 1611 captures data describing depth information of a local area surrounding some or all of the HMD 1600. The depth information may be compared with the information from the IMU 1610, for better accuracy of determination of position and orientation of the HMD 1600 in 3D space.
The HMD 1600 may further include an eye tracking system 1614 for determining orientation and position of user's eyes in real time. The eye tracking system 1614 may include the eye tracking imaging device 400 of FIG. 4, for example. The obtained position and orientation of the eyes also allows the HMD 1600 to determine the gaze direction of the user and to adjust the image generated by the display system 1680 accordingly. The determined gaze direction and vergence angle may be used to adjust the display system 1680 to reduce the vergence-accommodation conflict. The direction and vergence may also be used for displays' exit pupil steering as disclosed herein. Furthermore, the determined vergence and gaze angles may be used for interaction with the user, highlighting objects, bringing objects to the foreground, creating additional objects or pointers, etc. An audio system may also be provided including e.g. a set of small speakers built into the front body 1602.
Embodiments of the present disclosure may include, or be implemented in conjunction with, an artificial reality system. An artificial reality system adjusts sensory information about the outside world obtained through the senses, such as visual information, audio, touch (somatosensation) information, acceleration, balance, etc., in some manner before presentation to a user. By way of non-limiting examples, artificial reality may include virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include entirely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, somatic or haptic feedback, or some combination thereof. Any of this content may be presented in a single channel or in multiple channels, such as in a stereo video that produces a three-dimensional effect to the viewer. Furthermore, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in artificial reality and/or are otherwise used in (e.g., perform activities in) artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a wearable display such as an HMD connected to a host computer system, a standalone HMD, a near-eye display having a form factor of eyeglasses, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
The present disclosure is not to be limited in scope by the specific embodiments described herein. Indeed, other various embodiments and modifications, in addition to those described herein, will be apparent to those of ordinary skill in the art from the foregoing description and accompanying drawings. Thus, such other embodiments and modifications are intended to fall within the scope of the present disclosure. Further, although the present disclosure has been described herein in the context of a particular implementation in a particular environment for a particular purpose, those of ordinary skill in the art will recognize that its usefulness is not limited thereto and that the present disclosure may be beneficially implemented in any number of environments for any number of purposes. Accordingly, the claims set forth below should be construed in view of the full breadth and spirit of the present disclosure as described herein.