This relates generally to electronic devices, and, more particularly, to electronic devices such as head-mounted devices.
Electronic devices have components such as displays and lenses. It can be challenging to customize such devices for different users.
A head-mounted device may include optical assemblies for presenting images to a user. Each optical assembly may have a display and a lens through which an image from the display may be presented to a respective eye box.
Motors may be used to adjust the spacing between the optical assemblies to accommodate different user interpupillary distances. Gaze trackers may be used to measure the eyes of a user to determine target positions for the optical assemblies.
The force with which the motors move the optical assemblies towards a central nose bridge portion of the device and therefore towards nose surfaces located at the nose bridge portion may be limited. The force may be limited using a clutch such as a magnetic clutch or a physical clutch based on structures that decouple from each other when a force threshold is exceeded. The force may also be limited by monitoring the force and halting the motors in response to detection of a given amount of force. Sensor measurements and electrical motor load measurements may be used in measuring the force. If desired, motor operation may be controlled by a user-operated button. The direction of permitted optical assembly movement when accommodating different interpupillary distances may also be controlled.
Electronic devices such as head-mounted devices may have displays for displaying images and lenses that are used in presenting the images to eye boxes for viewing by a user. Different users have different spacings between their eyes, which are sometimes referred to as interpupillary distances. To accommodate users with different interpupillary distances, a head-mounted device may be provided with movable optical assemblies.
As shown in the illustrative cross-sectional top view of device 10 of
Main portion 12M of housing 12 may be attached to head strap 12T. Head strap 12T may be used to help mount main portion 12M on the head and face of a user. Main portion 12M may have a rigid shell formed from housing walls of polymer, glass, metal, and/or other materials. When housing 12 is being worn on the head of a user, the front of housing 12 may face outwardly away from the user, while the rear of housing 12 (and rear portion 12R) faces towards the user. In this configuration, rear portion 12R may face the user's eyes located in eye boxes 36.
Device 10 may have electrical and optical components that are used in displaying images to eye boxes 36 when device 10 is being worn. These components may include left and right optical assemblies 20 (sometimes referred to as optical modules). Each optical assembly 20 may have an optical assembly support 38 (sometimes referred to as a lens barrel, optical module support, or support structure) and guide rails 22 along which optical assemblies 20 may slide to adjust optical-assembly-to-optical-assembly separation to accommodate different user interpupillary distances.
Each assembly 20 may have a display 32 that has an array of pixels for displaying images and a lens 34. Lens 34 may optionally have a removable vision correction lens for correcting user vision defects (e.g., refractive errors such as nearsightedness, farsightedness, and/or astigmatism). In each assembly 20, display 32 and lens 34 may be coupled to and supported by support 38. During operation, images displayed by displays 32 may be presented to eye boxes 36 through lenses 34 for viewing by the user.
Rear portion 12R may include flexible structures (e.g., a flexible polymer layer, a flexible fabric layer, etc.) so that portion 12R can stretch to accommodate movement of supports 38 toward and away from each other to accommodate different user interpupillary distances.
The walls of housing 12 may separate interior region 28 within device 10 from exterior region 30 surrounding device 10. In interior region 28, optical assemblies 20 may be mounted on guide rails 22. Guide rails 22 may be attached to central housing portion 12C. If desired, the outer ends of guide rails 22 may be unsupported (e.g., the outer end portions of rails 22 may not directly contact housing 12, so that these ends float in interior region 28 with respect to housing 12).
Device 10 may include control circuitry and other components such as components 40. The control circuitry may include storage and processing circuitry formed from one or more microprocessors and/or other circuits. To support communications between device 10 and external equipment, the control circuitry may include wireless communications circuitry. Components 40 may include sensors such as force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors such as microphones, touch and/or proximity sensors such as capacitive sensors, optical sensors such as optical sensors that emit and detect light, ultrasonic sensors, and/or other touch sensors and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or sensors such as inertial measurement units that contain some or all of these sensors), radio-frequency sensors, depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices), optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, visual inertial odometry sensors, current sensors, voltage sensors, and/or other sensors. In some arrangements, device 10 may use sensors to gather user input (e.g., button press input, touch input, etc.). Sensors may also be used in gathering environmental measurements (e.g., device motion measurements, temperature measurements, ambient light readings, etc.).
Optical assemblies 20 may have gaze trackers 62 (sometimes referred to as gaze tracker sensors). Gaze trackers 62, which may operate through lenses 34, may include one or more light sources such as infrared light-emitting diodes that emit infrared light to illuminate the eyes of a user in eye boxes 36. Gaze trackers 62 may also include infrared cameras for capturing images of the user's eyes and measuring reflections (glints) of infrared light from each of the infrared light sources. By processing these eye images, gaze trackers 62 may track the user's eyes and determine the point-of-gaze of the user. Gaze trackers 62 may also measure the locations of the user's eyes (e.g., the user's eye relief and the user's interpupillary distance).
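As a conceptual illustration of how an interpupillary-distance measurement may be derived from gaze tracker output, consider the following Python sketch. The locate_eye interface shown here is a hypothetical placeholder (returning an eye position in millimeters in a shared device coordinate frame), not an actual implementation of gaze trackers 62:

# Minimal sketch: estimating interpupillary distance (IPD) from the
# eye positions reported by the left and right gaze trackers.  The
# locate_eye() method is a hypothetical interface that returns an
# (x, y, z) eye position in millimeters in the device's frame.
def estimate_ipd_mm(left_tracker, right_tracker):
    left_x, _, _ = left_tracker.locate_eye()
    right_x, _, _ = right_tracker.locate_eye()
    # IPD is the lateral (X-axis) separation between the two pupils.
    return abs(right_x - left_x)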
To accommodate users with different interpupillary distances (eye-to-eye spacings), the spacing between the left and right optical assemblies 20 in device 10 can be adjusted (e.g., to match or nearly match the user's measured interpupillary distance). Device 10 may have left and right actuators (e.g., motors) such as motors 48. Each motor 48 may be used to rotate an elongated threaded shaft (screw) such as shaft 44. A nut 46 is provided on each shaft 44. The nut has threads that engage the threads on that shaft 44. When a shaft is rotated, the nut on the shaft is driven in the +X or −X direction (in accordance with whether the shaft is being rotated clockwise or counterclockwise). In turn, this moves the optical assembly 20 that is attached to the nut in the +X or −X direction along its optical assembly guide rail 22. Each assembly 20 (e.g., support 38) may have portions that receive one of guide rails 22 so that the assembly is guided along the guide rail. By controlling the activity of motors 48, the spacing between the left and right optical assemblies of device 10 can be adjusted to accommodate the interpupillary distance of different users. For example, if a user has closely spaced eyes, assemblies 20 may be moved inwardly (towards each other and towards nose bridge portion NB of housing 12) and if a user has widely spaced eyes, assemblies 20 may be moved outwardly (away from each other).
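The relationship between shaft rotation and optical assembly travel can be made concrete with a short sketch. The following Python example is illustrative only and assumes a thread pitch of 0.5 mm per revolution (the actual pitch would depend on the screw used); each assembly covers half of the total change in separation:

def per_assembly_travel_mm(current_separation_mm, target_separation_mm):
    # The left and right assemblies each cover half of the total
    # change in separation, moving in opposite directions on rails 22.
    return (target_separation_mm - current_separation_mm) / 2.0

def shaft_turns_for_travel(travel_mm, thread_pitch_mm=0.5):
    # Each full rotation of shaft 44 advances nut 46 by one thread
    # pitch; the sign of the result selects the +X or -X direction.
    # The 0.5 mm pitch is an arbitrary illustrative value.
    return travel_mm / thread_pitch_mm

# Example: widening the separation from 60 mm to 64 mm moves each
# assembly 2 mm outward, i.e., 4 shaft turns with a 0.5 mm pitch.
turns = shaft_turns_for_travel(per_assembly_travel_mm(60.0, 64.0))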
When device 10 is being worn by a user, the user's head is located in region 68. The presence of the user's head (and therefore a determination of whether device 10 is being worn or is unworn) may be made using one or more sensors (e.g., gaze trackers 62, which may detect the presence of the eyes of the user in eye boxes 36, rear-facing sensors such as sensor 66 on main housing 12M, head-facing sensors mounted on strap 12T such as sensor 64, and/or other head presence sensors). These sensors may include cameras, light sensors (e.g., visible light or infrared sensors that measure when ambient light levels have dropped due to shadowing by the head of a user), proximity sensors (e.g., sensors that emit light such as infrared light and that measure corresponding reflected light from a user's head with an infrared light sensor, capacitive proximity sensors, ultrasonic acoustic proximity sensors, etc.), switches and/or other force-sensing sensors that detect head pressure when a user's head is present, and/or other head presence sensors.
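One simple way to combine such sensors is to treat the head as present whenever any individual sensor reports a detection. The sketch below assumes each sensor object exposes a hypothetical detect() method; the logical-OR fusion rule is an illustrative assumption that errs on the side of caution:

def head_present(head_sensors):
    # head_sensors might include gaze trackers 62, sensor 64 on strap
    # 12T, and sensor 66 on housing 12M.  A logical OR is conservative:
    # any single detection is enough to conclude the head is present.
    return any(sensor.detect() for sensor in head_sensors)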
When device 10 is being worn and a user's head is present in region 68, the nose of the user will be present under nose bridge portion NB of housing 12. When optical assemblies 20 are moved towards each other so that assemblies 20 are spaced apart by an amount that matches or nearly matches the user's interpupillary distance, inner side surfaces 60 of support structures 38 in assemblies 20 will move toward opposing outer side surfaces 61 of the user's nose. With sufficient inward movement of assemblies 20, surfaces 60 may contact and press against nose surfaces 61. As a result, nose surfaces 61 exert an outward force on assemblies 20. To avoid discomfort that might arise if the user's nose is pressed with more than a desired amount of force, device 10 may be provided with features to limit inward nose pressure (e.g., to limit inward force by assemblies 20).
In an illustrative embodiment, whenever device 10 is mounted on the head of a user, motors 48 may only be permitted to move optical assemblies 20 away from each other and not towards each other. This ensures that surfaces 60 will never move towards each other while the user's nose is present, so that the user's nose will never be pressed excessively by moving surfaces 60.
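Expressed as control logic, this policy is a one-line guard on the motor command path. The following Python sketch uses hypothetical motor and head-sensor interfaces, with direction +1 denoting outward movement and -1 denoting inward movement:

def request_move(motor, direction, head_sensors):
    # direction: +1 for outward, -1 for inward (towards nose bridge
    # portion NB).  head_present() is the OR-combination sketched above.
    if direction < 0 and head_present(head_sensors):
        return False       # inward motion blocked while the device is worn
    motor.step(direction)  # hypothetical single-step motor command
    return True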
The operation of device 10 in this type of arrangement is illustrated in the flow chart of
During the operations of block 80, device 10 may be powered up. Device 10 may, for example, be powered up in response to detection of a user button press on button 70. During power-up operations, a power supply supplies power (e.g., a power supply voltage) to the control circuits, sensors, displays, and other components 40 of device 10.
During the operations of block 82, motors 48, in response to detection of the power-up condition (e.g., in response to detecting the presence of the power supply voltage), may move optical assemblies 20 away from each other. Gaze trackers 62 may, in response to detection of the power-up condition (e.g., in response to the power supply voltage), capture images of the user's eyes in eye boxes 36 and may use this information in determining a target separation between optical assemblies 20 (e.g., gaze trackers 62 may measure the user's interpupillary distance and/or other eye characteristics such as the user's eye relief and may use the user's measured interpupillary distance and/or other eye characteristics in establishing a target separation for optical assemblies 20). During the operations of block 82, motors 48 may move optical assemblies 20 apart until the target separation (target positions) for optical assemblies 20 is reached.
During the operations of block 84, after motors 48 have placed optical assemblies 20 into their desired positions, further movement of assemblies 20 may be halted and device 10 may be used to present images to eye boxes 36 for viewing by the user.
After the user has finished viewing content with device 10, device 10 may be powered down for storage. In an illustrative scenario, a button press on button 70 or other input is used to instruct device 10 to shut down. Optical assemblies 20 may be maintained in their current positions while device 10 is powered down or may be moved towards each other in preparation for subsequent outward movements (see, e.g., block 82). When moving optical assemblies 20 towards each other (at block 86 or another time such as during initial power-up operations), a head-presence sensor may be used to detect whether the user's head is present in region 68. The head-presence sensor may, as an example, be used to confirm that the user's head is not present whenever optical assemblies 20 are being moved towards each other. For example, during the operations of block 86, motors 48 may move optical assemblies 20 towards each other in response to the detected user power-down command (e.g., the button press input on button 70) provided that no head is being detected by the head-presence sensor.
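The overall sequence of blocks 80, 82, 84, and 86 may be summarized in pseudocode. The following Python sketch is illustrative only; the device methods shown (power_up, measure_target_separation, and so on) are hypothetical names for the operations described above, not an actual device interface:

def run_session(device):
    device.power_up()                          # block 80
    # Block 82: measure the user's eyes and spread the assemblies
    # outward until the target separation is reached.
    target = device.gaze_trackers.measure_target_separation()
    device.move_assemblies_apart_until(target)
    device.halt_motors()                       # block 84
    device.present_images()
    # Block 86: after a power-down command, draw the assemblies back
    # together only if no head is detected in region 68.
    device.wait_for_power_down_command()
    if not device.head_present():
        device.move_assemblies_together()
    device.power_down()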
In another illustrative embodiment, which is illustrated in the flow chart of
As shown in
During the operations of block 92, a user has an opportunity to press button 70. When no button press input is detected, motors 48 remain stationary, so that optical assemblies 20 do not move. When button press input is detected, motors 48 are allowed to move to adjust the separation between assemblies 20. As an example, motors 48 may move assemblies 20 inwardly towards their target positions. During movement of assemblies 20, motors 48 may be halted in response to any detected user release of button 70. In this way, the positions of assemblies 20 will be adjusted so long as button 70 is being pressed, but will stop in response to release of button 70 (e.g., when the user desires to prevent movement of assemblies 20 that could press against nose surfaces 61). The user in this scenario remains in continuous control of assemblies 20.
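This hold-to-move behavior can be sketched as a simple polling loop. The button, motor, and at_target interfaces below are hypothetical; the essential property is that motion occurs only while button 70 is held:

import time

def hold_to_move(button, motor, at_target):
    # Step the motor only while the button is pressed; halt the
    # instant it is released, leaving the user in continuous control.
    while not at_target():
        if button.is_pressed():
            motor.step_towards_target()
        else:
            motor.halt()
            time.sleep(0.01)  # idle until the button is pressed again
    motor.halt()              # target positions reached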
Once the desired positions of assemblies 20 have been reached, device 10 may be used to view images while being worn by the user. Motors 48 may stop automatically when the target positions measured by gaze trackers 62 are reached or motors 48 may stop when the user releases button 70. If desired, the direction of movement of assemblies 20 may be controlled by providing device 10 with two of buttons 70 (e.g., an inward movement button and an outward movement button) or by providing a first button 70 to control movement (e.g., a go/stop button) and a second button (e.g., a slider with two positions) that is used to choose between inward and outward movement settings. Arrangements in which device 10 has non-button user input devices such as microphones for gathering voice commands, touch screen displays, and/or other user input devices may also be used in controlling movement of motors 48.
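The two-control variant can be sketched in the same style, with a go/stop button gating motion and a two-position slider selecting its direction (both interfaces hypothetical):

def commanded_direction(go_button, direction_slider):
    # Returns +1 (outward), -1 (inward), or 0 (stopped).  Motion is
    # gated by the go/stop button; the slider chooses the direction.
    if not go_button.is_pressed():
        return 0
    return 1 if direction_slider.is_outward() else -1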
Clutches may be used to limit the amount of inward force that is applied by optical assemblies 20 when assemblies 20 are moved towards nose surfaces 61 by motors 48.
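For an ideal (frictionless) lead screw, the linear force delivered by the nut relates to shaft torque through the thread lead, so the slip torque of a clutch sets an upper bound on the inward force that can reach the nose. A sketch of this bound follows, with a purely illustrative 0.5 mm pitch; real screw friction reduces the delivered force below the ideal value:

import math

def max_inward_force_newtons(slip_torque_nm, thread_pitch_mm=0.5):
    # Ideal power-screw relation F = 2*pi*T / L (energy balance over
    # one revolution), where L is the lead in meters (equal to the
    # pitch for a single-start thread).  The 0.5 mm pitch is an
    # illustrative value, not a specification of shaft 44.
    return 2.0 * math.pi * slip_torque_nm / (thread_pitch_mm / 1000.0)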
In the example of
Illustrative magnetic clutches are shown in
In the example of
As shown in
In the example of
If desired, the load on motors 48 can be monitored electronically, so that motors 48 can be halted if more than a desired motor load is encountered. An illustrative motor control circuit is shown in
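As a rough illustration, motor load can be inferred from drive current, which rises with mechanical resistance. The sketch below uses a hypothetical read_current method and an arbitrary illustrative current limit; a real limit would be calibrated to the desired maximum nose force:

def halt_on_overload(motor, current_limit_amps=0.25):
    # Drive current grows with mechanical load, so a current threshold
    # acts as an indirect force limit.  The 0.25 A figure is purely
    # illustrative, not a calibrated value.
    if motor.read_current() > current_limit_amps:
        motor.halt()
        return True   # load limit reached; movement stopped
    return False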
If desired, motor 48 may be supplied with a rotary encoder, as shown in the example of
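With a rotary encoder, stalls can be detected by comparing commanded steps with measured shaft rotation, as in the following sketch (encoder and motor interfaces hypothetical):

def step_with_stall_check(motor, encoder, min_counts=1):
    # If shaft 44 stops turning while the motor is driven, the
    # assembly has likely met resistance (e.g., the user's nose).
    counts_before = encoder.read_counts()
    motor.step()
    if encoder.read_counts() - counts_before < min_counts:
        motor.halt()
        return False  # stall detected; motor halted
    return True       # step completed normally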
Another illustrative technique for electronically monitoring motor load involves the use of a linear encoder of the type shown in
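A linear encoder supports the same check but measures the assembly's actual travel along guide rail 22 rather than shaft rotation, so it can also catch slippage between the shaft and the nut. A sketch under the same hypothetical-interface assumptions:

def step_with_travel_check(motor, linear_encoder, min_travel_mm=0.01):
    # Missing linear travel indicates a stall or slip even when the
    # shaft itself is still turning.
    position_before = linear_encoder.read_position_mm()
    motor.step()
    moved = abs(linear_encoder.read_position_mm() - position_before)
    if moved < min_travel_mm:
        motor.halt()
        return False  # assembly did not move; motor halted
    return True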
The graph of
In the example of
The flow chart of
During the operations of block 200, device 10 may be powered up (e.g., in response to a detected button press or other activity).
Once device 10 has powered up, gaze trackers 62 may measure the separation between the user's eyes (user interpupillary distance) and other eye characteristics to determine target positions for optical assemblies 20. Motors 48 may then rotate shafts 44 to move optical assemblies 20 towards the target positions (e.g., by moving assemblies 20 towards nose bridge portion NB of housing 12). During motor operation, motor load may be electrically monitored (e.g., using back EMF measurements, using encoder output, using measurements of applied current, and/or using other measurements of the types described in connection with
If optical assemblies 20 contact nose surfaces 61, nose surfaces 61 will produce a force against optical assemblies 20 that tends to resist further movement. Motors 48 may be configured to halt operation in response to detection of more than a desired amount of motor load (see, e.g., block 204). At this point, device 10 may be operated normally and used in presenting images to the user's eyes in eye boxes 36.
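The closed-loop behavior of blocks 200 through 204 may be summarized as follows. This Python sketch reuses the hypothetical interfaces of the earlier examples and is not a literal firmware listing:

def adjust_assemblies(device):
    # Block 202: derive target positions from the gaze trackers and
    # drive the assemblies towards them while monitoring motor load.
    targets = device.gaze_trackers.measure_target_positions()
    for motor, target in zip(device.motors, targets):
        while not motor.at_position(target):
            motor.step_towards(target)
            # Block 204: halt on excess load (back-EMF, drive current,
            # or encoder-based estimate).
            if motor.load_exceeds_limit():
                motor.halt()
                break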
Following use of device 10, device 10 can be powered down (see, e.g., the operations of block 206). Device 10 may, as an example, be powered down in response to detection of a user button press or other activity.
To help protect the privacy of users, any personal user information that is gathered by sensors may be handled using best practices. These best practices include meeting or exceeding any applicable privacy regulations. Opt-in and opt-out options and/or other options may be provided that allow users to control usage of their personal data.
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of provisional patent application No. 63/431,395, filed Dec. 9, 2022, which is hereby incorporated by reference herein in its entirety.