This relates generally to electronic devices, and, more particularly, to electronic devices such as head-mounted devices.
Electronic devices such as head-mounted devices may have components such as displays that are used in providing visual content to users.
A head-mounted device may have optical modules mounted in a head-mounted housing. Each optical module, which may sometimes be referred to as an optical assembly, may have a display configured to display an image in a respective eye box through a lens. The optical modules may be slidably coupled to guide rails. Left and right positioners may be used to adjust the locations of the optical modules along the guide rails.
Each optical module may have one or more cameras and/or other sensors. The optical module cameras may be used to capture eye images from the eye boxes to measure eye characteristics such as eye opening angle, eyelid opening size, cornea diameter, and interpupillary distance. The eye characteristics may be measured at different times during use of the head-mounted device (e.g., at a first time such as when a user registers with the device and a second time that is later than the first time).
Because the user's eye characteristics tend to remain constant over time, the user's eye characteristics can be used as reference points to detect misalignment in the optical modules. Measured eye characteristics may be used in evaluating whether a device has experienced changes in optical module position over time. For example, measured changes in the eye opening angle can be used in determining whether an optical module has become skewed relative to its original orientation.
If desired, the head-mounted device may have optical module position sensors based on electrode arrays that are contacted by optical module electrodes on the optical modules (e.g., when the optical modules are slid along the guide rails to the limits of their travel). Position measurements with these sensors can be used in determining whether optical module positions have shifted.
Control circuitry can perform image warping operations to ensure that displayed images are compensated for measured changes in optical module position (e.g., misalignment detected using captured images and/or misalignment detected using electrode array optical module position sensors).
Electronic devices such as head-mounted devices may include displays for presenting users with visual content. In an illustrative arrangement, a head-mounted device has displays and lenses mounted in left and right optical modules. The left and right optical modules provide left and right images to left and right eye boxes for viewing by a user's left and right eyes, respectively. The distance between the left and right optical modules may be adjusted to accommodate different user interpupillary distances.
There is a risk that the optical modules in a head-mounted device may become misaligned when the head-mounted device is exposed to excessive stress or abuse, such as when the head-mounted device experiences an undesired drop event. To ensure that images are provided satisfactorily to the eye boxes in which the user's eyes are located, the head-mounted device may gather eye images and/or other sensor data and may process this sensor data to detect changes over time. If changes are detected, images may be warped and/or otherwise adjusted to compensate for any detected changes. If, as an example, it is determined that a left eye image has become tilted by 1° in a clockwise direction due to optical module tilt induced by a drop event, control circuitry in the device may adjust image data being supplied to the left display so that the left image is digitally rotated by a compensating amount (e.g., 1° in the counterclockwise direction in this example). By digitally compensating for detected misalignment conditions in the optical modules, satisfactorily aligned images may be presented to the user.
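The equal-and-opposite rotation described above can be sketched numerically. The following is a minimal illustration (not from the source; the function names are hypothetical) showing that composing a detected tilt with its compensating digital rotation restores the original pixel coordinates.

```python
import math

def rotation(deg):
    """2x2 rotation matrix for an angle in degrees (counterclockwise positive)."""
    r = math.radians(deg)
    return [[math.cos(r), -math.sin(r)], [math.sin(r), math.cos(r)]]

def apply(m, p):
    """Apply a 2x2 matrix to a 2D point."""
    return (m[0][0] * p[0] + m[0][1] * p[1], m[1][0] * p[0] + m[1][1] * p[1])

# Suppose a drop event tilted the left optical module 1 degree clockwise.
tilt = rotation(-1.0)          # clockwise tilt experienced by the displayed image
compensation = rotation(+1.0)  # equal and opposite digital rotation of the image data

# A pixel location on the display, after the tilt and the compensating rotation:
p = (100.0, 50.0)
restored = apply(compensation, apply(tilt, p))
# restored is (to floating-point precision) the original point p
```

The same idea extends to the full geometrical transforms mentioned later (keystoning, shift, scaling): the compensating transform is the inverse of the measured distortion.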
A schematic diagram of an illustrative system that includes a head-mounted device is shown in
As shown in
During operation, the communications circuitry of the devices in system 8 (e.g., the communications circuitry of control circuitry 12 of device 10), may be used to support communication between the electronic devices. For example, one electronic device may transmit video data, audio data, and/or other data to another electronic device in system 8. Electronic devices in system 8 may use wired and/or wireless communications circuitry to communicate through one or more communications networks (e.g., the internet, local area networks, etc.). The communications circuitry may be used to allow data to be received by device 10 from external equipment (e.g., a tethered computer, a portable device such as a handheld device or laptop computer, online computing equipment such as a remote server or other remote computing equipment, or other electrical equipment) and/or to provide data to external equipment.
Device 10 may include input-output devices 22. Input-output devices 22 may be used to allow a user to provide device 10 with user input. Input-output devices 22 may also be used to gather information on the environment in which device 10 is operating. Output components in devices 22 may allow device 10 to provide a user with output and may be used to communicate with external electrical equipment.
Input-output devices 22 may include one or more displays. In some configurations, device 10 includes left and right display devices. These display devices may include scanning mirror display devices or other image projectors, liquid-crystal-on-silicon display devices, digital mirror devices, or other reflective display devices, left and right display panels based on light-emitting diode pixel arrays such as organic light-emitting display panels or display devices based on pixel arrays formed from crystalline semiconductor light-emitting diode dies, liquid crystal display panels, and/or other left and right display devices that provide images to left and right eye boxes for viewing by the user's left and right eyes, respectively.
During operation, control circuitry 12 uses displays to provide visual content for a user of device 10 (e.g., control circuitry 12 provides the displays with digital image data). The content that is presented on the displays may sometimes be referred to as display image content, display images, computer-generated content, computer-generated images, virtual content, virtual images, or virtual objects.
Display images may be displayed in the absence of real-world content or may be combined with real-world images. In some configurations, real-world content may be captured by a camera (e.g., a forward-facing camera, sometimes referred to as a front-facing camera) so that computer-generated content may be electronically overlaid on portions of the real-world image (e.g., when device 10 is a pair of virtual reality goggles with an opaque display). In other configurations, an optical combining system may be used to allow computer-generated content to be optically overlaid on top of a real-world image. With this approach, device 10 has an optical system that provides display images to a user through a waveguide having a holographic output coupler or other optical coupler while allowing the user to view real-world images through the waveguide and optical coupler. Illustrative arrangements for device 10 are sometimes described herein in which device 10 does not include such optical couplers (e.g., illustrative arrangements for device 10 are described in which left and right optical modules are used in displaying computer-generated content and/or, if desired, pass-through video from forward-facing cameras).
Input-output circuitry 22 may include sensors 16. Sensors 16 may include, for example, three-dimensional sensors (e.g., three-dimensional image sensors such as structured light sensors that emit beams of light and that use two-dimensional digital image sensors to gather image data for three-dimensional images from light spots that are produced when a target is illuminated by the beams of light, binocular three-dimensional image sensors that gather three-dimensional images using two or more cameras in a binocular imaging arrangement, three-dimensional lidar (light detection and ranging) sensors, three-dimensional radio-frequency sensors, or other sensors that gather three-dimensional image data), cameras (e.g., infrared and/or visible digital image sensors), gaze tracking sensors (e.g., a gaze tracking system based on an image sensor and, if desired, a light source that emits one or more beams of light that are tracked using the image sensor after reflecting from a user's eyes), touch sensors, capacitive proximity sensors, light-based (optical) proximity sensors, other proximity sensors, force sensors, sensors such as contact sensors based on switches, gas sensors, pressure sensors, moisture sensors, magnetic sensors, audio sensors (microphones), ambient light sensors, microphones for gathering voice commands and other audio input, sensors that are configured to gather information on motion, position, and/or orientation (e.g., accelerometers, gyroscopes, compasses, and/or inertial measurement units that include all of these sensors or a subset of one or two of these sensors), and/or other sensors. 
To help determine whether components such as optical modules in device 10 should be compensated for misalignment, sensors 16 may include eye sensors such as gaze tracking sensors, visible and/or infrared image sensors (cameras) that face the eyes of a user to capture eye images, optical module misalignment (tilt) sensors (sometimes referred to as optical module position sensors) based on electrode arrays that can be contacted by optical module electrodes, and/or other sensors that gather information indicative of whether the optical modules of device 10 have changed position. If changes in optical module position (and therefore display position) are detected, the control circuitry of device 10 can adjust image data being provided to the displays so that the left and right images produced by the left and right displays of device 10 are aligned with the user's left and right eyes, respectively (when the user's eyes are located in the left and right eye boxes).
To allow a user to control device 10, user input and other information may be gathered using sensors and other input devices in input-output devices 22. If desired, input-output devices 22 may include devices such as haptic output devices (e.g., vibrating components), light-emitting diodes and other light sources, speakers such as ear speakers for producing audio output, circuits for receiving wireless power, circuits for transmitting power wirelessly to other devices, batteries and other energy storage devices (e.g., capacitors), joysticks, buttons, and/or other components.
Electronic device 10 may have housing structures (e.g., housing walls, straps, etc.), as shown by illustrative head-mounted support structures 26 of
During operation of device 10, images are presented to a user's eyes in eye boxes 30. Eye boxes 30 include a left eye box that receives a left image and a right eye box that receives a right image. Device 10 may include a left display system with a left display 14 that presents the left image to the left eye box and a right display system with a right display 14 that presents the right image to the right eye box. In an illustrative configuration, each display (sometimes referred to as a pixel array) is mounted with an associated lens 24 in a respective optical module 28 (e.g., in a lens barrel formed from metal, polymer, and/or other materials or other suitable optical module housing, sometimes referred to as an optical assembly or support structure). Components such as sensors 16 (e.g., eye sensors such as visible light cameras and/or infrared cameras that capture images of a user's eyes when the user's eyes are located in eye boxes 30, gaze tracking sensors, and/or other eye sensing components) may also be mounted in optical modules (optical assemblies) 28 (e.g., in the same optical module housings as the displays at locations where these sensors can operate through lenses 24 and/or where these sensors can bypass lenses 24 when gathering data from eye boxes 30). If desired, device 10 may also contain sensors 16 mounted at other locations in device 10.
As shown in
The characteristics of a user's eyes tend not to change over time. For example, a user's interpupillary distance tends to remain constant. Similarly, a user's cornea diameter, eye opening shape, eye opening angle, and other eye attributes tend to remain constant. As a result, these features of a user's face can be used as reference points by device 10. During a registration process when device 10 is initially being associated with a new user or at another suitable time, device 10 may measure the user's eye characteristics and may store these measurements. At one or more later times during use of device 10, device 10 can remeasure the user's eye characteristics. If any changes are detected, it can be assumed that the positions of the optical modules have drifted (e.g., due to a drop event or other excessive stress) and compensating image processing techniques (e.g., compensating image warping) can be performed on the images being displayed by device 10 to compensate for this detected misalignment. For example, control circuitry 12 can apply a geometrical transform to the image data being supplied to displays 14 to compensate for image distortion (e.g., keystoning, tilt, image size shrinkage or enlargement, image location shift, etc.) due to shifts and/or rotations of displays 14 relative to their nominal positions. Eye measurements and updates to any compensating image transforms that are being used by device 10 can be made each time device 10 is powered up (and/or powered down), can be made in response to user input, can be made periodically (e.g., at regular predetermined intervals), can be made in response to detecting a drop event with an accelerometer in sensors 16 or in response to detecting other high-stress conditions, and/or can be made in response to detecting satisfaction of other suitable calibration criteria.
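The comparison of stored and remeasured eye characteristics can be sketched as follows. This is an illustrative sketch only; the characteristic names, reference values, and tolerance are assumptions, not values from the source.

```python
# Hypothetical reference measurements stored at registration (values assumed).
REGISTERED = {"interpupillary_mm": 63.0, "cornea_mm": 11.6,
              "eyelid_opening_mm": 10.2, "eye_opening_angle_deg": 5.0}

def measure_drift(current, registered=REGISTERED, tolerance=0.05):
    """Return the per-characteristic deltas that exceed a relative tolerance.

    Any characteristic that has apparently changed is attributed to optical
    module drift, since the user's eye characteristics themselves are assumed
    to remain constant over time.
    """
    drift = {}
    for key, reference in registered.items():
        delta = current[key] - reference
        if abs(delta) > tolerance * abs(reference):
            drift[key] = delta
    return drift

# A later remeasurement in which only the eye opening angle has shifted,
# suggesting the optical module has rotated relative to its original orientation:
current = {"interpupillary_mm": 63.0, "cornea_mm": 11.6,
           "eyelid_opening_mm": 10.2, "eye_opening_angle_deg": 7.0}
drift = measure_drift(current)
# drift == {"eye_opening_angle_deg": 2.0}
```

A nonempty result would trigger the compensating image warping described above; an empty result would leave the image data unaltered.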
In an illustrative configuration, image sensors 16 (cameras operating at infrared and/or visible wavelengths) may be used to capture eye images for control circuitry 12 to process. The image sensors may be mounted in respective left and right optical modules 28 and therefore may be used to gauge whether there has been any movement of modules 28 with respect to the user's eyes. The eye images may be captured and stored as image data and/or may be stored after image processing has been performed to extract eye characteristics. Examples of eye characteristics that may be measured using sensors 16 are shown in
Electrode arrays 42, which may sometimes be referred to as optical module position sensor electrode arrays, contain arrays of metal patches, concentric metal rings, and/or other arrays of electrodes 42E, as shown in
Using control circuitry to measure the resistances between each of electrodes 42E in a given array 42 and the associated optical module electrode 40 that has contacted that array 42, device 10 can determine which of the electrodes 42E has shorted to electrode 40. In this way, changes in position (e.g., misalignment) of modules 28 can be measured. Consider, as an example, a scenario in which there is one optical module electrode 40 on an optical module. When device 10 is initially set up and is properly aligned, optical module electrode 40 will contact an electrode 42E at position P1 of
Electrode array 42 may have electrodes 42E arranged in rows and columns and/or may have other suitable electrode layouts (e.g., configurations with ring-shaped electrodes, configurations with radially extending electrode patterns, etc.). The electrode pattern of illustrative position sensor electrode array 42 of
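The short-detection scheme can be sketched in a few lines. This is a hedged illustration under assumed values: the resistance threshold, electrode count, and readings are hypothetical, not taken from the source.

```python
SHORT_THRESHOLD_OHMS = 10.0  # assumed threshold separating a short from an open path

def contacted_electrode(resistances):
    """Given the measured resistance from each array electrode 42E to the
    optical module electrode 40, return the index of the shorted (contacted)
    electrode, or None if no electrode is in contact."""
    idx = min(range(len(resistances)), key=lambda i: resistances[i])
    return idx if resistances[idx] < SHORT_THRESHOLD_OHMS else None

# Contact position recorded at initial setup (P1) vs. a later measurement (P2).
# Non-contacted electrodes read as effectively open circuits (large resistance).
baseline = contacted_electrode([1e6, 0.5, 1e6, 1e6])  # electrode index 1 shorted
later = contacted_electrode([1e6, 1e6, 0.8, 1e6])     # electrode index 2 shorted
shifted = later is not None and later != baseline     # module position has changed
```

A detected shift of the contact point from P1 to P2 would then feed the same compensating image adjustments as the eye-image measurements.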
During the operations of block 60, a user may register with device 10. For example, a user may provide a username and password, biometric credentials, and/or other identifying information to device 10. During this process, the initial positions of the optical modules are assumed to be correct (e.g., modules 28 are assumed to have been manufactured within normal tolerances so that optical modules 28 are aligned satisfactorily). Accordingly, eye cameras and/or other eye sensors may be used to measure the user's eyes for later use as reference points in determining whether modules 28 have moved. Eye measurements may include user-specific characteristics such as cornea size, eyelid opening size, interpupillary distance, eye opening angle, etc. and may be stored as images and/or may be stored as processed data (see, e.g., the measured values of D1, D2, D3, and A1 of
At one or more later times, after device 10 has been used and potentially exposed to high-stress conditions and abuse such as drop events, device 10 can be recalibrated to compensate for any changes in optical module alignment. In particular, during the operations of block 62, the user may supply the user's credentials to identify the user to device 10 (e.g., the user may log into device 10). Based on the known identity of the user, device 10 can retrieve the user's specific eye information (corresponding to the measured characteristics of the user's eyes when optical modules 28 are properly aligned). The current characteristics of the user's eyes may then be measured during the operations of block 64 (e.g., using image sensors to capture eye images, etc.). If desired, optical module alignment can also be assessed by using optical module position sensors such as electrode arrays 42 and electrodes 40.
During the operations of block 66, the current measured alignment (position) of modules 28 is compared to the previously measured initial alignment (position) of modules 28. User eye characteristic measurements and/or optical module position sensor measurements with arrays 42 may be used. If no deviations are detected, image data may be provided to displays 14 of modules 28 without alteration. If, however, changes in alignment (position) are detected (e.g., if misalignment is detected), a compensating amount of image warping or other digital image processing may be applied to the image data for the left and right optical modules during the operations of block 68. In this way, changes in module position (e.g., shifts along the X, Y, and/or Z axes and/or rotations about the X, Y, and/or Z axes) can be compensated (e.g., by warping the images being displayed equally and oppositely from the image distortion experienced due to the measured changes in alignment). As just one example, if a rotation of angle A1 by 2° in the image of the user's left eye is measured, the image data for the left display can be correspondingly rotated by 2° to correct for this misalignment in the optical module. In this way, the images provided to the user's eyes will remain aligned, even if the positions of the optical modules change.
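The block 60 through block 68 flow above can be summarized in a short sketch. The function names and the sign convention for the correction are assumptions for illustration; the source specifies only that the compensation is equal and opposite to the measured change.

```python
def register_user(measured_angle_deg):
    """Block 60: store the eye opening angle measured at registration,
    when the optical modules are assumed to be properly aligned."""
    return {"eye_opening_angle_deg": measured_angle_deg}

def recalibrate(profile, current_angle_deg):
    """Blocks 62-68: compare the current measurement to the stored reference
    and return the rotation (in degrees) to apply to the image data,
    equal and opposite to the measured misalignment.  A zero return value
    means the image data is passed to the display without alteration."""
    misalignment = current_angle_deg - profile["eye_opening_angle_deg"]
    return -misalignment

profile = register_user(5.0)             # reference angle stored at registration
correction = recalibrate(profile, 7.0)   # later measurement shows a 2 degree change
# correction == -2.0: rotate the image data 2 degrees in the opposite direction
```

In a real device the correction would be one parameter of a fuller warping transform that also handles shifts along, and rotations about, the X, Y, and Z axes.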
In some embodiments, sensors may gather personal user information. To ensure that the privacy of users is preserved, all applicable privacy regulations should be met or exceeded and best practices for handling of personal user information should be followed. Users may be permitted to control the use of their personal information in accordance with their preferences.
In accordance with an embodiment, a head-mounted device is provided that includes a head-mounted support and optical assemblies each of which has a lens through which an image is visible from an associated eye box, a sensor configured to measure eye characteristics in the associated eye box, and a display that adjusts the image based on the measured eye characteristics.
In accordance with another embodiment, the sensor includes an image sensor configured to capture eye images, the measured eye characteristics are obtained from the captured eye images, and the display adjusts the image based on the eye characteristics from the eye images to compensate for changes in optical assembly alignment of the optical assemblies.
In accordance with another embodiment, the sensor includes an image sensor configured to capture eye images, the measured eye characteristics are obtained from the captured eye images and include eye opening angle and cornea diameter.
In accordance with another embodiment, the sensor includes an image sensor configured to measure changes in alignment of the optical assemblies by comparing eye information gathered when the optical assemblies are aligned correctly to eye information gathered when the optical assemblies are misaligned.
In accordance with another embodiment, the display is configured to warp the image to compensate for the measured changes in alignment.
In accordance with another embodiment, the measured changes in alignment include optical assembly rotation away from a desired orientation and the display is configured to warp the image to compensate for the optical assembly rotation.
In accordance with another embodiment, the display is configured to warp the image to rotate the image by an equal and opposite amount from the optical assembly rotation away from the desired orientation.
In accordance with another embodiment, the optical assemblies each include an optical assembly electrode configured to make electrical contact with an electrode in a respective optical assembly position sensor electrode array.
In accordance with another embodiment, the optical assemblies are slidably coupled to guide rails and the head-mounted device includes positioners configured to move the optical assemblies so that the optical assembly electrodes make contact with the position sensor electrode arrays.
In accordance with another embodiment, the display is configured to adjust the image based on measurements from the position sensor electrode arrays.
In accordance with another embodiment, the sensor includes an image sensor, the measured eye characteristics include eye opening angle, and the display is configured to use the eye opening angle in warping the image by comparing a currently measured version of the eye opening angle to a previously measured version of the eye opening angle.
In accordance with an embodiment, a head-mounted device is provided that includes a head-mounted support; optical assemblies mounted in the head-mounted support; and optical assembly position sensors having arrays of electrodes, the optical assembly position sensors are configured to measure changes in alignment of the optical assemblies.
In accordance with another embodiment, the optical assemblies have respective left and right displays and respective left and right lenses through which left and right images from the left and right displays are provided respectively to left and right eye boxes.
In accordance with another embodiment, the optical assembly position sensors include optical assembly electrodes configured to make contact with electrodes in the arrays of electrodes.
In accordance with another embodiment, the optical assemblies include a left optical assembly and a right optical assembly and the optical assembly electrodes include at least a left optical assembly electrode on the left optical assembly and a right optical assembly electrode on the right optical assembly, the head-mounted device includes a left positioner configured to move the left optical assembly so that the left optical assembly electrode contacts a first of the arrays of electrodes to make a left optical assembly position measurement and a right positioner configured to move the right optical assembly so that the right optical assembly electrode contacts a second of the arrays of electrodes to make a right optical assembly position measurement.
In accordance with another embodiment, the displays are configured to perform image warping operations.
In accordance with another embodiment, the displays are configured to perform image warping operations based on the measured changes in alignment.
In accordance with another embodiment, the head-mounted device includes a left camera in a first of the optical assemblies that is configured to capture a left eye image and a right camera in a second of the optical assemblies that is configured to capture a right eye image.
In accordance with another embodiment, the head-mounted device includes a left camera in a first of the optical assemblies that is configured to capture a left eye image and a right camera in a second of the optical assemblies that is configured to capture a right eye image, and the left and right displays are configured to align the left and right images based on the measured changes in alignment and based on eye characteristics obtained from the captured left and right eye images.
In accordance with an embodiment, a head-mounted device is provided that includes a head-mounted support and left and right optical assemblies in the head-mounted support, the left and right optical assemblies have respective left and right lenses through which respective left and right images are provided to left and right eye boxes, left and right cameras configured to respectively capture a left eye image from the left eye box and a right eye image from the right eye box to measure corresponding left and right eye characteristics, and left and right displays configured to adjust the left and right images based on a comparison between the left and right eye characteristics measured at a first time and the left and right eye characteristics measured at a second time.
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application is a continuation of international patent application No. PCT/US2022/038949, filed Jul. 29, 2022, which claims priority to U.S. provisional patent application No. 63/230,625, filed Aug. 6, 2021, which are hereby incorporated by reference herein in their entireties.
| Number | Date | Country |
|---|---|---|
| 63230625 | Aug 2021 | US |

| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/US22/38949 | Jul 2022 | US |
| Child | 18425266 | | US |