This relates generally to electronic devices, and, more particularly, to electronic devices such as head-mounted devices.
Electronic devices have components such as displays. The positions of these components may sometimes be adjusted.
A head-mounted device may include optical assemblies with displays for presenting images to a user. Motors may be used to adjust the spacing between the optical assemblies and thereby adjust the spacing between the displays to accommodate different interpupillary distances.
The optical assemblies may have gaze trackers. The gaze trackers may be used to make interpupillary distance measurements and eye relief measurements. Adjustments to the positions of the optical assemblies may be made by the motors based on the interpupillary distance measurements and eye relief measurements. For example, no position adjustments may be made unless the measured eye relief exceeds an eye relief threshold and the measured interpupillary distance is lower than an interpupillary distance threshold. If the measured eye relief is sufficiently large and the measured interpupillary distance is sufficiently small, the positions of the optical assemblies may be adjusted so that, for a given measured eye relief, a smaller measured interpupillary distance results in a larger adjustment to optical assembly position than a larger measured interpupillary distance.
As shown in the illustrative top view of device 10 of
Device 10 may have electrical and optical components that are used in displaying images to eye boxes 36 when device 10 is being worn. These components may include left and right optical assemblies 20 (sometimes referred to as optical modules). Each optical assembly 20 may have an optical assembly support 38 (sometimes referred to as a lens barrel or optical module support) and guide rails 22 along which optical assemblies 20 may slide to adjust optical-assembly-to-optical-assembly separation to accommodate different user interpupillary distances.
Each assembly 20 may have a display 32 that has an array of pixels for displaying images and a lens 34. Display 32 and lens 34 of each assembly 20 may be coupled to and supported by support 38. During operation, images displayed by displays 32 may be presented to eye boxes 36 through lenses 34 for viewing by a user. Each optical assembly 20 may also have a gaze tracker 50. Gaze trackers 50 may each include one or more light sources (e.g., infrared light-emitting diodes that provide flood illumination and glints for eye tracking) and an associated camera (e.g., an infrared camera). Using gaze trackers 50, which may sometimes be referred to as gaze tracking systems or gaze tracking sensors, device 10 can gather data on a user's eyes located in eye boxes 36. As an example, the direction in which a user's eyes are pointing (sometimes referred to as a user's point of gaze or direction of view) may be measured. Biometric information such as iris scan information may also be gathered. In addition, gaze trackers 50 may be used to measure the location of a user's eyes relative to device 10 and thereby measure the eye relief of the user's eyes (e.g., the distance between the lenses of device 10 and the eyes) and the separation between the user's left and right eyes (sometimes referred to as the user's interpupillary distance). If desired, gaze trackers 50 (e.g., the cameras of trackers 50) may capture images of the skin of the user's face surrounding the user's eyes (e.g., to measure whether this skin is loose or taut).
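For illustration only, the following minimal sketch shows one way the per-eye measurements described above might be represented and how an interpupillary distance could be derived from left and right eye positions. The type names, fields, and units are assumptions for the example and are not taken from the text above.

```python
from dataclasses import dataclass


@dataclass
class GazeTrackerSample:
    """Hypothetical per-eye measurement from a gaze tracker 50."""
    gaze_direction: tuple[float, float, float]      # unit vector toward the point of gaze
    eye_relief_mm: float                            # lens-to-eye distance (eye relief)
    pupil_position_mm: tuple[float, float, float]   # eye location relative to device 10


def interpupillary_distance_mm(left: GazeTrackerSample, right: GazeTrackerSample) -> float:
    """Distance between the measured left and right pupil positions."""
    return sum((l - r) ** 2
               for l, r in zip(left.pupil_position_mm, right.pupil_position_mm)) ** 0.5
```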
Each optical assembly may have magnets, clips, and/or other engagement features to allow removable vision correction lenses (sometimes referred to as prescription lenses) to be removably attached to assemblies 20 in alignment with lenses 34 (see, e.g., illustrative optional vision correction lenses 51). Lenses 51 may have magnets that are sensed by sensors 53 (e.g., magnetic sensors in assemblies 20) or sensors 53 may be optical sensors, switches, or other sensors configured to gather other information indicating when lenses 51 are present.
Housing 12 may have a flexible curtain (sometimes referred to as a flexible rear housing wall or fabric housing wall) such as curtain 12R on the rear of device 10 facing eye boxes 36. Curtain 12R has openings that receive assemblies 20. The edges of curtain 12R that surround each support 38 may be coupled to that support 38. The outer peripheral edge of curtain 12R may be attached to rigid housing walls forming an outer shell portion of main housing 12M.
The walls of housing 12 may separate interior region 28 within device 10 from exterior region 30 surrounding device 10.
Inner ends 24 of guide rails 22 may be attached to central housing portion 12C. Opposing outer ends 26 may, in an illustrative configuration, be unsupported (e.g., the outer end portions of rails 22 may not directly contact housing 12, so that these ends float in interior region 28 with respect to housing 12).
Device 10 may include control circuitry and other components such as component 40. The control circuitry may include storage and processing circuitry formed from one or more microprocessors and/or other circuits. To support communications between device 10 and external equipment, the control circuitry may include wireless communications circuitry. Components 40 may include sensors such as force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors such as microphones, touch and/or proximity sensors such as capacitive sensors, optical sensors such as optical sensors that emit and detect light, ultrasonic sensors, and/or other touch sensors and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or sensors such as inertial measurement units that contain some or all of these sensors), radio-frequency sensors, depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices), optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, visual inertial odometry sensors, gaze tracking sensors, and/or other sensors. In some arrangements, device 10 may use sensors to gather user input (e.g., button press input, touch input, etc.). Sensors may also be used in gathering environmental measurements (e.g., device motion measurements, temperature measurements, ambient light readings, etc.) and/or may be used in measuring user activities and/or attributes (e.g., point-of-gaze, eye relief, interpupillary distance, etc.). If desired, position sensors such as encoders (e.g., optical encoders, magnetic encoders, etc.) may measure the position and therefore the movement (e.g., the velocity, acceleration, etc.) of optical assemblies 20 along rails 22.
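As a purely illustrative sketch of how position sensor readings could be turned into the velocity information mentioned above, the function below differences two successive encoder samples. The function name, units, and sampling interface are assumptions for the example.

```python
def assembly_velocity_mm_s(prev_position_mm: float,
                           curr_position_mm: float,
                           sample_interval_s: float) -> float:
    """Estimate optical assembly velocity along the guide rails from two
    successive encoder position samples (hypothetical helper)."""
    if sample_interval_s <= 0:
        raise ValueError("sample interval must be positive")
    return (curr_position_mm - prev_position_mm) / sample_interval_s
```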
Each assembly 20 (e.g., each support 38 of
The position and therefore the movement of each optical assembly may be monitored using one or more sensors. In the illustrative configuration of
Optical assembly velocity information and/or other information from the optical assembly position sensors (e.g., linear position sensors formed from the magnetic encoders) may be used in monitoring whether the optical assemblies have slowed down their movement due to contact between the optical assemblies and the nose of a user. Consider, as an example, the arrangement of
When optical assemblies 20 contact the left and right sides of nose surface 62, motors 48 will encounter resistance to further lateral movement of the optical assemblies along the X axis. This will cause optical assemblies 20 to move more slowly. The optical assembly position sensors will sense the reduction in the velocity of the optical assemblies. In this way, device 10 is informed that optical assemblies 20 are contacting and pressing against nose surface 62. To ensure that device 10 is comfortable as device 10 is being worn on the head of the user, the position of the optical assemblies may, in response to this detected nose contact, be adjusted outward, away from nose surface 62 (e.g., by 1-3 mm or other suitable amount). This outward nudge in the positions of the optical assemblies may be made even if the final separation between the optical assemblies is slightly larger than the user's measured interpupillary distance.
Following outward adjustment of optical assemblies 20, a non-zero gap G may be created between optical assemblies 20 and corresponding side portions of the user's nose (e.g., adjacent portions of nose surface 62) and/or inward pressure imposed on the sides of the user's nose by optical assemblies 20 can be reduced to enhance comfort. By monitoring the optical assembly position sensors during optical assembly position adjustment with motors 48, device 10 can identify a location for assemblies 20 in which the left and right assemblies 20 are separated by a distance that matches the user's measured interpupillary distance as closely as possible while ensuring satisfactory comfort for the user.
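The following is a minimal sketch of the nose-contact response described above. The velocity-ratio test, the motor-control interface, the sign convention, and the 2 mm nudge value are assumptions for illustration (the text gives a 1-3 mm range and does not specify how a "slowed" assembly is detected).

```python
CONTACT_VELOCITY_RATIO = 0.5  # assumed: "slowed" means under half the commanded speed
OUTWARD_NUDGE_MM = 2.0        # within the 1-3 mm range mentioned above


def respond_to_nose_contact(commanded_speed_mm_s: float,
                            measured_speed_mm_s: float,
                            current_position_mm: float,
                            move_assembly_to) -> float:
    """Nudge an optical assembly outward if it has slowed against the nose.

    ``move_assembly_to`` stands in for the motor-control call; positive
    positions are taken to be farther from the nose (both are assumptions).
    """
    if measured_speed_mm_s < CONTACT_VELOCITY_RATIO * commanded_speed_mm_s:
        new_position_mm = current_position_mm + OUTWARD_NUDGE_MM
        move_assembly_to(new_position_mm)
        return new_position_mm
    return current_position_mm
```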
Another way in which the position of optical assemblies 20 can be adjusted satisfactorily involves the use of eye relief measurements from gaze trackers 50. When device 10 is placed onto the head of the user, gaze trackers 50 may measure the user's interpupillary distance and may measure the user's eye relief. Motors 48 may then adjust the positions of optical assemblies 20 based on the measured eye relief and measured interpupillary distance of the user. In an illustrative configuration, the positions of optical assemblies 20 may be offset (e.g., nudged outwards from the location where the spacing between assemblies 20 matches the measured interpupillary distance of the user) by an amount that varies depending on both measured interpupillary distance and measured eye relief.
This type of approach is illustrated in the graph of
As the graph of a
If, however, the measured eye relief of a user exceeds eye relief threshold ERTH, it may be beneficial for certain users to increase the spacing between optical assemblies 20 by nudging each optical assembly outwardly by an amount ΔX. As an example, consider users with measured interpupillary distances of IPDA. For this class of user, it may be beneficial to adjust the spacing of optical assemblies 20 outwardly by ΔX values that follow curve 64. As shown by curve 64, ΔX may be zero for measured eye relief values of less than ERTH, whereas for measured eye relief values exceeding ERTH, ΔX may rise progressively as a function of measured eye relief (e.g., up to a maximum ΔX value at a maximum measured eye relief value of ERMX, which may be, for example, 25 mm or other suitable value). Users with larger interpupillary distances, such as a measured interpupillary distance of IPDB, may benefit from a less aggressive outward increase in optical assembly spacing, as shown by curve 66, in which the value of ΔX rises more slowly than in curve 64 as measured eye relief increases above ERTH.
An interpupillary distance threshold may be present above which it may not be desirable to make any ΔX adjustments for a user, regardless of their measured eye relief. When, for example, a user has a large measured interpupillary distance (e.g., a measured interpupillary distance of IPDC, which exceeds the interpupillary distance threshold), there will generally not be a comfort benefit in increasing optical assembly spacing. As a result, the recommended outward adjustment in optical assembly position ΔX as a function of measured eye relief for users with these larger measured interpupillary distances follows curve 68 (e.g., ΔX remains at zero, even for a measured eye relief of ERMX). With this approach, only at measured interpupillary distances below the interpupillary distance threshold will outward adjustments in optical assembly position be used.
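As a purely illustrative sketch of the ΔX behavior described by curves 64, 66, and 68, the function below returns a larger outward offset for larger measured eye relief and smaller measured interpupillary distance, and returns zero below the eye relief threshold or above the interpupillary distance threshold. The numeric threshold values, the maximum offset, and the linear ramp are assumptions for the example; the text only specifies the qualitative behavior and the illustrative 25 mm ERMX value.

```python
ER_TH_MM = 18.0    # assumed eye relief threshold (ERTH)
ER_MAX_MM = 25.0   # maximum measured eye relief (ERMX) from the text
IPD_TH_MM = 68.0   # assumed interpupillary distance threshold
IPD_MIN_MM = 54.0  # assumed smallest interpupillary distance considered
DX_MAX_MM = 3.0    # assumed maximum outward offset per assembly


def outward_offset_mm(measured_er_mm: float, measured_ipd_mm: float) -> float:
    """Return the outward offset dX applied to each optical assembly."""
    if measured_er_mm <= ER_TH_MM or measured_ipd_mm >= IPD_TH_MM:
        return 0.0  # no adjustment below the ER threshold or above the IPD threshold
    # Fraction of the way from ERTH to ERMX (dX rises with measured eye relief).
    er_frac = min((measured_er_mm - ER_TH_MM) / (ER_MAX_MM - ER_TH_MM), 1.0)
    # Smaller IPDs follow a steeper curve (curve 64) than larger IPDs (curve 66).
    ipd_frac = (IPD_TH_MM - measured_ipd_mm) / (IPD_TH_MM - IPD_MIN_MM)
    ipd_frac = max(0.0, min(ipd_frac, 1.0))
    return DX_MAX_MM * er_frac * ipd_frac
```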
The graph of
During the operations of block 70, device 10 may gather data. The gathered data may include, for example, measurements obtained by gaze trackers 50. These measurements may include, for example, the measured interpupillary distance of a user wearing device 10, the measured eye relief of a user wearing device 10, the measured skin tautness around the eyes of a user wearing device 10, and/or other gaze tracker measurements. The measurements of block 70 may also include measurements with the position sensors (e.g., the magnetic encoders) of optical assemblies 20. For example, when a user dons device 10, motors 48 may automatically start to move optical assemblies 20 to positions associated with the user's measured interpupillary distance from gaze trackers 50. During this initial movement or during movement in response to a user input command or other movement, the position sensors may be used to monitor the velocities of assemblies 20. In response to detected slowing of the speed of inward movement of assemblies 20, device 10 can conclude that assemblies 20 are beginning to exert pressure on the sides of the user's nose (e.g., nose surface 62). The locations associated with the measured reduction in optical assembly velocity are another form of measurement data that may be gathered during block 70.
Further information that can be gathered during the operations of block 70 relates to the status of vision correction lenses 51 on device 10. Vision correction lenses 51 may contain magnets that produce a magnetic field. Device 10 (e.g., assemblies 20) may have vision correction lens sensors such as magnetic sensors 53 that determine whether or not lenses 51 are present by monitoring for the presence of the magnetic fields produced by lenses 51. In response to detection of the magnetic fields from lenses 51 with sensors 53, it can be concluded that lenses 51 are present.
During the operations of block 72, device 10 can take action based on the data gathered at block 70. As an example, the positions of optical assemblies 20 may be adjusted using motors 48. In some configurations, the positions of optical assemblies 20 may be nudged outwards by an amount ΔX determined from measured interpupillary distance and eye relief values, as described in connection with
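The sketch below ties the operations of blocks 70 and 72 together: gather gaze tracker and lens sensor data, then command the motors. The sensor and motor interfaces are placeholders introduced for the example; only the overall flow follows the text above, and `outward_offset_mm` refers to the earlier hypothetical sketch.

```python
def run_fit_adjustment(gaze_trackers, lens_sensors, motors):
    """Hypothetical end-to-end flow for blocks 70 and 72."""
    # Block 70: gather data.
    ipd_mm = gaze_trackers.measure_interpupillary_distance()
    eye_relief_mm = gaze_trackers.measure_eye_relief()
    lenses_present = lens_sensors.magnetic_field_detected()  # vision correction lenses 51

    # Block 72: take action based on the gathered data.
    dx_mm = outward_offset_mm(eye_relief_mm, ipd_mm)  # per-assembly outward nudge
    target_separation_mm = ipd_mm + 2 * dx_mm         # each assembly offset outward by dX
    motors.move_to_separation(target_separation_mm)   # position sensors may watch for nose contact
    return lenses_present, target_separation_mm
```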
To help protect the privacy of users, any personal user information that is gathered by sensors may be handled using best practices. These best practices include meeting or exceeding any applicable privacy regulations. Opt-in and opt-out options and/or other options may be provided that allow users to control usage of their personal data.
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of provisional patent application No. 63/406,902, filed Sep. 15, 2022, which is hereby incorporated by reference herein in its entirety.