The present disclosure generally relates to head-mounted displays (HMDs), and specifically to adjusting an inter-pupillary distance for a user wearing an HMD.
HMDs include an electronic display and optical elements that project the image from the electronic display to the eyes of a user wearing the HMD. The image is projected to an “eye box” for each eye of the user, which is a volume of space in which the user's eye must be located to view the image correctly. Variations in the shapes of human faces present a challenge for designing HMDs, so conventional HMDs are designed to accommodate a range of user anatomies at the cost of ideal eye box placement for all users. As a result, variations in inter-pupillary distance (i.e., the distance between the pupils of a person's eyes) may result in a user experiencing optical distortions caused by one or both eyes being outside the eye box.
A head mounted display (HMD) includes a first eyecup and a second eyecup, one for each eye of the user. Each eyecup includes an optical assembly and an electronic display. The electronic display presents image light, and the optical assembly guides the image light to an eye box of the user for viewing. The HMD includes an inter-pupillary distance (IPD) adjustment mechanism configured to change the distance between the first eyecup and the second eyecup. The IPD adjustment mechanism is coupled to a structural plate of the HMD and to the first eyecup and the second eyecup. The IPD adjustment mechanism includes an arm on each eyecup and a gear that interfaces with the arms. In one embodiment, each arm is a rack gear that interfaces with a pinion gear. The eyecups are each mounted on one or more rails along which the eyecups slide to adjust the distance between them. Accordingly, movement of the gear causes both eyecups to move either away from or toward each other as the eyecups slide along the one or more rails.
The figures depict embodiments of the present disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles, or benefits touted, of the disclosure described herein.
An inter-pupillary distance (IPD) adjustment mechanism for a head-mounted display (HMD) is disclosed. The IPD adjustment mechanism is coupled to a structural plate of the HMD and to an eyecup for each eye, each eyecup including a lens assembly and a display assembly. The IPD adjustment mechanism includes an arm for each eyecup and a gear that interfaces with the arms. In one embodiment, each arm is a rack gear that interfaces with a pinion gear. The eyecups are each mounted on one or more rails that allow the eyecups to slide along a single dimension to adjust the distance between them. Accordingly, movement of the gear causes both eyecups to move either away from or toward each other. In some embodiments, the IPD adjustment mechanism is driven by a motor. In some embodiments, the IPD adjustment mechanism is manually controlled by the user.
Binoculars and microscopes commonly incorporate IPD adjustment mechanisms. In both instances, the left and right eyecups are attached to the ends of arms, and each arm pivots around a fixed pivot point. This pivot point may be common to the two arms or separated by a small horizontal distance. The user rotates one of the eyecups clockwise or counter-clockwise around its pivot point, which increases or decreases the IPD. In most cases the eyecups are linked so that rotating one eyecup causes the other eyecup to rotate in the opposite direction. The disadvantage of this type of IPD adjustment is that the vertical distance from the optical axis of each eyecup to the pivot point(s) changes when the IPD is adjusted. This is not an issue with binoculars or microscopes, but it is a problem in HMDs. The change in the vertical distance between the optical axis of each eyecup and the pivot point requires the HMD to be taller in the vertical dimension in order to completely enclose the eyecups over the full IPD adjustment range. Provision must also be made so that the optical axis of each eyecup does not move vertically with respect to the user's eyes, ensuring that the eyecups remain in line with the user's eyes (i.e., the user is still looking into the optical eye box). An IPD adjustment mechanism that includes an opposing rack and pinion design maintains the vertical position of the optical axis of each eyecup with respect to the user's eyes. This mechanism requires very little space outside of the eyecup footprint, so it does not increase the size of the HMD. It also requires very few additional parts: the two racks are part of the eyecups, so only the pinion gear (with an integral shaft) and the guide rail need to be added.
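To make the contrast concrete, the following short calculation is an illustrative sketch only; all dimensions are hypothetical and not taken from the disclosure. It shows how a pivot-arm adjuster shifts each optical axis vertically as the IPD changes, while the opposing rack-and-pinion arrangement changes the IPD with no vertical shift.

```python
import math

def pivot_arm_vertical_offset(arm_length_mm: float, half_ipd_mm: float) -> float:
    """Vertical distance from an eyecup's optical axis to its pivot point
    in a binocular-style pivoting adjuster.

    With the eyecup at the end of an arm of length L and the half-IPD set
    to x, the optical axis sits sqrt(L^2 - x^2) from the pivot vertically,
    so changing the IPD also moves the axis up or down.
    """
    if half_ipd_mm > arm_length_mm:
        raise ValueError("half-IPD cannot exceed the arm length")
    return math.sqrt(arm_length_mm**2 - half_ipd_mm**2)

def rack_pinion_ipd_change(pinion_radius_mm: float, rotation_deg: float) -> float:
    """IPD change for the opposing rack-and-pinion adjuster.

    Each rack translates by r * theta along the rail; the racks engage
    opposite sides of the pinion, so the eyecups move equal distances in
    opposite directions and the optical axes keep the same height.
    """
    return 2.0 * pinion_radius_mm * math.radians(rotation_deg)

# Hypothetical numbers: adjusting a 40 mm pivot arm from a 58 mm to a 70 mm IPD
# moves each optical axis ~8 mm vertically relative to its pivot ...
print(round(pivot_arm_vertical_offset(40.0, 29.0), 1))  # ~27.6 mm
print(round(pivot_arm_vertical_offset(40.0, 35.0), 1))  # ~19.4 mm
# ... whereas a 6 mm pitch-radius pinion turned ~57 degrees makes the same
# 12 mm IPD change with zero vertical shift.
print(round(rack_pinion_ipd_change(6.0, 57.0), 1))       # ~11.9 mm
```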
Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect for the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, e.g., create content in an artificial reality and/or are otherwise used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
A head mounted display (HMD) includes a first eyecup and a second eyecup, one for each eye of the user. Each eyecup includes an optical assembly and an electronic display. The electronic display presents image light, and the optical assembly guides the image light to an eye box of the user for viewing. The HMD includes an inter-pupillary distance (IPD) adjustment mechanism configured to change the distance between the first eyecup and the second eyecup.
In other embodiments, the guide rail 115 is a threaded rod that has left-handed threads on one end and right-handed threads on the other end. Accordingly, one eyecup (e.g., eyecup 105) includes a nut (e.g., in place of guide mount 135) that mates with the left-handed threads, and the other eyecup (e.g., eyecup 110) includes a nut (e.g., in place of guide mount 140) that mates with the right-handed threads. Thus, when the guide rail 115 is rotated, the eyecups move either toward or away from each other depending on the direction of rotation. This approach provides mechanical advantage and can be self-locking.
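The lead-screw variant gives a simple linear relationship between rotation and IPD change. The sketch below is illustrative only; the thread pitch used is a hypothetical value, not a dimension from the disclosure.

```python
def ipd_change_lead_screw(thread_pitch_mm: float, revolutions: float) -> float:
    """IPD change for a guide rail threaded left-handed on one end and
    right-handed on the other.

    Each nut (and its eyecup) advances along the rod by pitch * revolutions;
    the opposite-handed threads drive the two eyecups in opposite directions,
    so the IPD changes by twice the per-eyecup travel. The direction of the
    change reverses with the direction of rotation.
    """
    return 2.0 * thread_pitch_mm * revolutions

# Example with a hypothetical 0.8 mm thread pitch: 3.75 turns give a 6 mm IPD change.
print(ipd_change_lead_screw(0.8, 3.75))  # 6.0
```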
In another embodiment, the IPD adjustment mechanism includes two pined linkages between the eyecups 105 and 110. Accordingly, moving a common pin (coupled to and between the two linkages) in the vertical direction causes the eyecups 105 and 110 to move horizontally either towards each other or away from each other, depending on whether the common pin is moved up or down. In one embodiment, moving the common pin upward causes the eyecups 105 and 110 to move horizontally toward each other. Thus, in this embodiment, moving the common pin downward causes the eyecups 105 and 110 to move horizontally away from each other. Accordingly, this IPD adjustment mechanism includes guide rail 115 to constrain the eyecups to move horizontally and the common pin is restrained to move vertically.
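The geometry of such a linkage can be illustrated with a short sketch. It assumes, purely for illustration, one rigid link of a hypothetical length from each eyecup to the shared pin; the disclosure does not specify link dimensions.

```python
import math

def ipd_from_pin_height(link_length_mm: float, pin_offset_mm: float) -> float:
    """IPD for a pinned-linkage adjuster.

    Assumes one rigid link of length L from each eyecup to a common pin,
    with the pin constrained to move vertically and the eyecups constrained
    to the horizontal guide rail. Each eyecup then sits sqrt(L^2 - y^2)
    from the centerline, so raising the pin (larger y) pulls the eyecups
    together and lowering it pushes them apart.
    """
    if pin_offset_mm >= link_length_mm:
        raise ValueError("pin offset must be smaller than the link length")
    half_separation = math.sqrt(link_length_mm**2 - pin_offset_mm**2)
    return 2.0 * half_separation

# Hypothetical dimensions: with 40 mm links, raising the pin from 20 mm to
# 25 mm above the eyecup axis narrows the IPD from ~69.3 mm to ~62.4 mm.
print(round(ipd_from_pin_height(40.0, 20.0), 1))  # 69.3
print(round(ipd_from_pin_height(40.0, 25.0), 1))  # 62.4
```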
In another embodiment, the IPD adjustment mechanism includes a belt with two pulleys. One eyecup is attached to one side of the belt and the other eyecup is attached to the other side of the belt. Accordingly, when one of the pulleys is rotated, the belt moves and the eyecups move in opposite directions.
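As with the rack-and-pinion arrangement, the belt drive yields a simple linear relationship between actuation and IPD change. The sketch below is illustrative only, with a hypothetical pulley radius.

```python
import math

def ipd_change_belt(pulley_radius_mm: float, rotation_deg: float) -> float:
    """IPD change for a belt-and-pulley adjuster.

    The two eyecups are attached to opposite runs of the belt, so they
    travel equal distances in opposite directions when a pulley rotates;
    each run moves r * theta of belt, and the IPD changes by twice that.
    """
    per_eyecup_mm = pulley_radius_mm * math.radians(rotation_deg)
    return 2.0 * per_eyecup_mm

# Example: a hypothetical 5 mm radius pulley turned 45 degrees changes the IPD by ~7.9 mm.
print(round(ipd_change_belt(5.0, 45.0), 1))
```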
The single diecast metal plate 605 operates as the main structural support of the HMD. The HMD uses inside-out tracking, in which multiple cameras and an IMU are used to determine the HMD's position and/or orientation. The cameras are located within the molded camera inlays 610, 615, 620, 625, looking outward to determine how the HMD's position changes in relation to the environment. Thus, as the HMD moves, the HMD readjusts its location and/or orientation in the environment based on readings from the cameras and the IMU, and the virtual scene presented to the user adjusts accordingly in real time. The IMU is used as a backup tracking mechanism, and its data is used as a ground truth comparison for the cameras. Since the reliability of the tracking data is critical for the HMD's performance and the user's experience, tight tolerances are required with respect to any movement of the cameras or the IMU over the lifetime of the HMD. For example, minor changes in the location (e.g., a bend in the frame, loosening of the mounts, etc.) of one or more cameras or of the IMU can cause the HMD tracking to become unreliable. Thus, structural rigidity is an important consideration, since unreliable tracking quickly degrades the user's experience. As a result, a single piece of metal is used, in one embodiment, to maintain a tight positional relationship between the cameras and the IMU. Durability in drop testing from various heights has been demonstrated using a single piece of metal for the backplate 600, such as magnesium, aluminum, and so forth. In one embodiment, the single diecast metal plate 605 is made from magnesium AZ91D.
The single diecast metal plate 605 also operates as a heat sink that includes both active and passive cooling elements. In some embodiments, all processing components are mounted to the single diecast metal plate 605; in other embodiments, a subset of the processing components are mounted to the single diecast metal plate 605. This makes heat a relevant consideration in choosing a material for the backplate 600. Accordingly, the passive cooling elements include features that can be diecast into the plate to increase thermal spreading. These passive cooling features include elevated mounting regions 650, 655 that elevate heat producing elements, such as the battery and the PCB. Elevating these components increases airflow around and under them, and the elevated mounting regions 650, 655 increase the surface area between the heat producing elements and adjacent elements, which allows these components to dissipate more heat than they otherwise would if they were not elevated. Accordingly, the elevated mounting region 650 corresponds to the PCB mounting region 630 and the elevated mounting region 655 corresponds to the battery mounting region 645. In other embodiments, the single diecast metal plate 605 can be formed with heat sink fin structures to increase radiant cooling.
Additionally, the positions of the components relative to each other can be chosen to minimize the heat exposure of the more heat-sensitive components (e.g., the IMU) to adjacent components. In one embodiment, the arrangement of components on the backplate 600 is chosen based on an optimization that maximizes the distance between the highest heat producing elements. In other embodiments, the optimization takes into account both how much heat a particular element produces and how sensitive a particular element is to heat for its operation when determining the layout.
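The disclosure does not specify the optimization procedure. One minimal, illustrative way to express the trade-off described above is a pairwise cost that weights each component's heat output against its neighbors' heat sensitivity and is minimized over candidate mounting sites; all component names, heat values, sensitivities, and coordinates below are hypothetical.

```python
from itertools import permutations

# Hypothetical component properties: (heat output in watts, 0-1 heat sensitivity).
COMPONENTS = {"SoC": (5.0, 0.2), "battery": (2.0, 0.3), "IMU": (0.1, 1.0)}

def layout_cost(positions):
    """Penalize placing heat producers close to heat-sensitive parts.

    For every pair of components, the penalty grows with the heat output of
    one, the sensitivity of the other, and the inverse of their separation,
    so minimizing the cost pushes hot parts away from sensitive ones.
    """
    cost = 0.0
    names = list(positions)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            dx = positions[a][0] - positions[b][0]
            dy = positions[a][1] - positions[b][1]
            dist = max((dx * dx + dy * dy) ** 0.5, 1e-6)
            heat_a, sens_a = COMPONENTS[a]
            heat_b, sens_b = COMPONENTS[b]
            cost += (heat_a * sens_b + heat_b * sens_a) / dist
    return cost

# Brute-force search over a few hypothetical mounting sites on the plate (mm).
sites = [(0.0, 0.0), (80.0, 0.0), (40.0, 60.0)]
best = min(
    (dict(zip(COMPONENTS, perm)) for perm in permutations(sites)),
    key=layout_cost,
)
print(best)
```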
The active cooling element of the HMD is a fan or blower.
Moreover, the HMD may include a heat pipe connected to one or more high heat producing elements (e.g., the PCB, the battery, etc.) at one end and to the single diecast metal plate 605 at the other. In one embodiment, the heat pipe transfers heat from the one or more high heat producing elements to a portion of the single diecast metal plate 605 where heat is less of a concern. Not only does the heat pipe itself operate as a heat sink that is air cooled by the fan, it is also configured to transfer heat to another portion of the single diecast metal plate 605.
The single diecast metal plate 605 also operates as an electrical and electromagnetic field (EMF) ground for components of the HMD. In one embodiment, the single diecast metal plate 605 includes a passivation layer to prevent corrosion. In order to make a better ground point, pads are laser etched into the single diecast metal plate 605 to remove the passivation layer. Accordingly, each component is electrically grounded to the single diecast metal plate 605 through a pad.
As discussed above with respect to
Accordingly, other components can also be mounted to the single diecast metal plate 605. For example, one or more antennas, a universal serial bus (USB) port, a power button, and speaker volume buttons can also mount directly to the single diecast metal plate 605. Additionally, the outside housing of the HMD is also mounted to the single diecast metal plate 605.
The front rigid body 1005 includes one or more electronic display elements (not shown in
The imaging assembly generates image information using images and/or audio information captured from a local area surrounding the HMD 1000. The local area is the environment that surrounds the HMD 1000. For example, the local area may be a room that the user wearing the HMD 1000 is inside, or the user may be outside and the local area is an outside area that is visible to the HMD 1000. The imaging assembly comprises the cameras 1015, 1020, 1025, 1030 positioned to capture a portion of the local area. Image information may include, e.g., one or more images, audio information (e.g., sounds captured by one or more microphones), video information, metadata, or some combination thereof. Image information may include depth information of one or more objects in the local area and/or the amount of light detected by the cameras 1015, 1020, 1025, 1030. In the embodiment of
The cameras 1015, 1020, 1025, 1030 are configured to capture images and/or video of different portions of the local area. Each camera 1015, 1020, 1025, 1030 includes a sensor (not shown), a lens, and a camera controller (not shown). The sensor is an electrical device that captures light using an array of photo-sensitive pixels (e.g., complementary metal-oxide-semiconductor (CMOS), charge-coupled device (CCD), etc.), wherein each pixel converts light into an electronic signal. Sensors can have varying features, such as resolution, pixel size and sensitivity, light sensitivity, type of shutter, and type of signal processing. The lens is one or more optical elements of a camera that facilitate focusing light onto the sensor. Lenses have features that can be fixed or variable (e.g., a focus and an aperture), may have varying focal lengths, and may be covered with an optical coating. In some embodiments, one or more of the cameras 1015, 1020, 1025, 1030 may have a microphone to capture audio information. The microphone can be located within the camera or may be located external to the camera.
Each camera 1015, 1020, 1025, 1030 has a field of view that represents a region within the local area viewable by the camera. In the embodiment of
In addition to the field of view of each camera, the position and orientation of each camera 1015, 1020, 1025, 1030 allows the type of coverage between the cameras to be controlled. The desired type of coverage may be based on the type of desired information to be gathered from the captured images. In the embodiment of
The HMD controller is configured to determine depth information for one or more objects in the local area based on one or more captured images from the imaging assembly. Depth information may be determined by measuring the distance to an object using received information about the object's position. In the embodiment of
The HMD controller is additionally configured to update a local area model for the HMD 1000. The local area model includes depth information, exposure settings, or some combination thereof of the environment surrounding the HMD 1000. In the embodiment of
The HMD 1000 is a head-mounted display that presents to a user content comprising virtual and/or augmented views of a physical, real-world environment with computer-generated elements (e.g., two-dimensional (2D) or three-dimensional (3D) images, 2D or 3D video, sound, etc.). In some embodiments, the presented content includes audio that is presented via an external device (e.g., speakers and/or headphones) that receives audio information from the HMD 1000 and presents audio data based on the audio information.
The HMD 1000 includes an imaging assembly 1120, an electronic display 1102, an optical assembly 1105, one or more position sensors 1115, an IMU 1110, an optional eye tracking system (not shown), an optional varifocal module (not shown), and an HMD controller 1145. Some embodiments of the HMD 1000 have different components than those described in conjunction with
The imaging assembly 1120 captures data describing depth information of a local area surrounding some or all of the HMD 1000. The imaging assembly 1120 includes one or more cameras located on the HMD 1000 that capture images and/or video information and/or audio information. In some embodiments, the imaging assembly 1120 can compute the depth information using the data (e.g., based on captured images having stereoscopic views of objects within the local area). In addition, the imaging assembly 1120 determines the amount of light detected by each camera in the imaging assembly 1120. The imaging assembly 1120 may send the depth information and amount of light detected to the HMD controller 1145 for further processing. The imaging assembly 1120 is an embodiment of the imaging assembly in
The electronic display 1102 displays 2D or 3D images to the user in accordance with data received from the imaging assembly controller. In various embodiments, the electronic display 1102 comprises a single electronic display or multiple electronic displays (e.g., a display for each eye of a user). Examples of the electronic display 1102 include: a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an inorganic light emitting diode (ILED) display, an active-matrix organic light-emitting diode (AMOLED) display, a transparent organic light emitting diode (TOLED) display, some other display, or some combination thereof.
The optical assembly 1105 magnifies image light received from the electronic display 1102, corrects optical errors associated with the image light, and presents the corrected image light to a user of the HMD 1000. The optical assembly 1105 includes a plurality of optical elements. Example optical elements included in the optical assembly 1105 include: an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, a reflecting surface, or any other suitable optical element that affects image light. Moreover, the optical assembly 1105 may include combinations of different optical elements. In some embodiments, one or more of the optical elements in the optical assembly 1105 may have one or more coatings, such as partially reflective or anti-reflective coatings.
The IMU 1110 is an electronic device that generates data indicating a position of the HMD 1000 based on measurement signals received from one or more of the position sensors 1115 and from depth information received from the imaging assembly 1120. A position sensor 1115 generates one or more measurement signals in response to motion of the HMD 1000. Examples of position sensors 1115 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, a type of sensor used for error correction of the IMU 1110, or some combination thereof. The position sensors 1115 may be located external to the IMU 1110, internal to the IMU 1110, or some combination thereof.
Based on the one or more measurement signals from one or more position sensors 1115, the IMU 1110 generates data indicating an estimated current position of the HMD 1000 relative to an initial position of the HMD 1000. For example, the position sensors 1115 include multiple accelerometers to measure translational motion (forward/back, up/down, left/right) and multiple gyroscopes to measure rotational motion (e.g., pitch, yaw, roll). In some embodiments, the IMU 1110 rapidly samples the measurement signals and calculates the estimated current position of the HMD 1000 from the sampled data. For example, the IMU 1110 integrates the measurement signals received from the accelerometers over time to estimate a velocity vector and integrates the velocity vector over time to determine an estimated current position of a reference point on the HMD 1000. The reference point is a point that may be used to describe the position of the HMD 1000. The reference point may generally be defined as a point in space or a position related to the HMD's orientation and position.
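A minimal sketch of that two-stage integration is shown below. It assumes, for illustration only, accelerometer samples already expressed in the world frame with gravity removed, and it omits the orientation, bias, and drift handling a real IMU pipeline requires.

```python
import numpy as np

def integrate_imu(accels_mps2, dt_s, p0, v0):
    """Dead-reckon a position estimate from sampled accelerations.

    Integrates acceleration once to update the velocity vector and again to
    update the position of the HMD's reference point, mirroring the
    two-stage integration described above.
    """
    p, v = np.array(p0, dtype=float), np.array(v0, dtype=float)
    for a in accels_mps2:
        v += a * dt_s   # first integration: acceleration -> velocity
        p += v * dt_s   # second integration: velocity -> position
    return p, v

# Example: 1 m/s^2 forward acceleration sampled at 1 kHz for 0.5 s.
samples = np.tile(np.array([1.0, 0.0, 0.0]), (500, 1))
print(integrate_imu(samples, 0.001, np.zeros(3), np.zeros(3)))
```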
The IMU 1110 receives one or more parameters from the HMD controller 1145. The one or more parameters are used to maintain tracking of the HMD 1000. Based on a received parameter, the IMU 1110 may adjust one or more IMU parameters (e.g., sample rate). In some embodiments, certain parameters cause the IMU 1110 to update an initial position of the reference point so it corresponds to a next position of the reference point. Updating the initial position of the reference point as the next calibrated position of the reference point helps reduce accumulated error associated with the current position estimated by the IMU 1110. The accumulated error, also referred to as drift error, causes the estimated position of the reference point to “drift” away from the actual position of the reference point over time. In some embodiments of the HMD 1000, the IMU 1110 may be a dedicated hardware component. In other embodiments, the IMU 1110 may be a software component implemented in one or more processors.
The HMD controller 1145 processes content for the HMD 1000 based on information received from the imaging assembly 1120. In the example shown in
The application store 1170 stores one or more applications for execution by the HMD controller 1145. An application is a group of instructions that, when executed by a processor, generates content for presentation to the user. Content generated by an application may be in response to inputs received from the user via movement of the HMD 1000 or a peripheral device. Examples of applications include: gaming applications, conferencing applications, video playback applications, or other suitable applications.
The tracking module 1165 calibrates the HMD system 1100 using one or more calibration parameters and may adjust one or more calibration parameters to reduce error in determination of the position of the HMD 1000. For example, the tracking module 1165 communicates a calibration parameter to the imaging assembly 1120 to adjust the focus of the imaging assembly 1120 to more accurately determine positions of objects captured by the imaging assembly 1120. Calibration performed by the tracking module 1165 also accounts for information received from the IMU 1110 in the HMD 1000.
The tracking module 1165 tracks movements of the HMD 1000 using information from the imaging assembly 1120, the one or more position sensors 1115, the IMU 1110, or some combination thereof. For example, the tracking module 1165 determines a position of a reference point of the HMD 1000 in a mapping of a local area based on information from the imaging assembly 1120. The tracking module 1165 may also determine positions of the reference point of the HMD 1000. Additionally, in some embodiments, the tracking module 1165 may use portions of data indicating a position of the HMD 1000 from the IMU 1110 as well as representations of the local area from the imaging assembly 1120 to predict a future location of the HMD 1000. The tracking module 1165 provides the estimated or predicted future position of the HMD 1000 to the engine 1160.
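The disclosure does not fix a particular prediction model. One simple illustrative possibility is a constant-velocity extrapolation from two recent position estimates, sketched below with hypothetical numbers.

```python
import numpy as np

def predict_position(p_now, p_prev, dt_s, horizon_s):
    """Predict a future HMD reference-point position.

    Estimates velocity from two recent position fixes (e.g., derived from
    the imaging assembly and/or IMU) and extrapolates it over the
    prediction horizon.
    """
    velocity = (np.asarray(p_now) - np.asarray(p_prev)) / dt_s
    return np.asarray(p_now) + velocity * horizon_s

# Example: moving 1 cm per 10 ms frame, predicted 50 ms ahead.
print(predict_position([0.01, 0.0, 0.0], [0.0, 0.0, 0.0], 0.01, 0.05))  # [0.06 0. 0.]
```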
The engine 1160 processes information (e.g., depth and/or exposure) received from the imaging assembly 1120. Using the received information, the engine 1160 synchronizes the exposure settings for the cameras of the imaging assembly 1120. As described with regards to
The engine 1160 may additionally use information from the tracking module 1165 in conjunction with the local area model. Using information from the tracking module 1165, the engine 1160 can predict a future location and orientation of the HMD 1000. Subsequently using the local area model, the engine 1160 determines the appropriate exposure settings for each camera in the imaging assembly 1120 for the predicted future location and orientation of the HMD 1000. The local area model allows the engine 1160 to efficiently adjust exposure settings of the cameras in the imaging assembly 1120 such that the engine 1160 does not have to analyze the depth information and amount of light detected by the imaging assembly 1120 at each new location and/or orientation of the HMD 1000. As the location and orientation of the HMD 1000 changes within the environment, the engine 1160 may update the local area model using depth information and amount of light detected from the imaging assembly 1120. Additionally, the imaging assembly 1120 may be configured to send depth information and amount of light detected to the engine 1160 at certain time intervals to account for any changes in the level of light that may have occurred within the environment and to ensure that the local area model is updated accordingly.
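The disclosure leaves the structure of the local area model open. One simplified, illustrative realization is a table of exposure settings keyed by a coarse spatial cell and a quantized view direction, so that a predicted pose maps directly to stored settings; all names, cell sizes, and values below are hypothetical.

```python
from typing import Dict, Tuple

# Hypothetical local area model: exposure settings keyed by a coarse spatial
# cell and a quantized view direction (45-degree yaw buckets).
LocalAreaModel = Dict[Tuple[int, int, int, int], Dict[str, float]]

def exposure_for_pose(model: LocalAreaModel,
                      position_m: Tuple[float, float, float],
                      yaw_deg: float,
                      cell_m: float = 0.5) -> Dict[str, float]:
    """Look up stored exposure settings for a predicted HMD pose.

    Quantizes the predicted position and yaw into the model's cells so the
    engine can reuse previously computed settings instead of re-analyzing
    depth and light levels at every new location and orientation.
    """
    key = (int(position_m[0] // cell_m),
           int(position_m[1] // cell_m),
           int(position_m[2] // cell_m),
           int(yaw_deg // 45) % 8)
    return model.get(key, {"exposure_ms": 8.0, "gain": 1.0})  # fallback defaults

# Example: the cell facing a bright window stores a shorter exposure.
model: LocalAreaModel = {(2, 0, 1, 0): {"exposure_ms": 2.0, "gain": 1.0}}
print(exposure_for_pose(model, (1.2, 0.1, 0.6), 10.0))  # {'exposure_ms': 2.0, 'gain': 1.0}
```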
The engine 1160 also executes applications within the HMD system 1100 and receives position information, acceleration information, velocity information, predicted future positions, or some combination thereof, of the HMD 1000 from the tracking module 1165. Based on the received information, the engine 1160 determines content to provide to the electronic display 1102 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the engine 1160 generates content for the electronic display 1102 that mirrors the user's movement in a virtual environment or in an environment augmenting the local area with additional content.

Additional Configuration Information
The foregoing description of the embodiments of the disclosure has been presented for the purpose of illustration; it is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure.
Some portions of this description describe the embodiments of the disclosure in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
Embodiments of the disclosure may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
Embodiments of the disclosure may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the disclosure be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the disclosure, which is set forth in the following claims.