Near-eye display devices are configured to present images to a user via a display that is positioned close to the user's eyes. For example, a head-mounted augmented reality display device may be worn on a user's head to position a near-eye display directly in front of a user's eyes. A near-eye display may be at least partially see-through to allow a user to view a real-world background in combination with displayed virtual objects. This may allow virtual objects to be displayed such that the virtual objects appear to exist within the real-world environment.
Embodiments are disclosed herein that relate to aligning a near-eye display with an eye of a user. For example, one disclosed embodiment provides, on a near-eye display device, a method comprising receiving an image of an eye from a camera via a reverse display optical path, detecting a location of the eye in the image, and determining a relative position of the eye to a target viewing position of the near-eye display. The method further comprises determining an adjustment to make to align the location of the eye with the target viewing position.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
A near-eye display device may use various optical systems to deliver an image to a user's eye, including but not limited to projection-based systems and waveguide-based systems. However, the optical systems of such near-eye displays may have relatively small exit pupils. Further, in some near-eye displays, optical performance may decay toward the edge of the exit pupil.
As such, a near-eye display device may include an adjustable fit system to allow a user to properly locate the exit pupil of the system. This may allow a user to adjust the system to avoid optical effects caused by misalignment. However, the proper adjustment of such a fit system may pose challenges for users. As a result, some users may perform sufficient fit adjustments to find a coarse fit that provides an acceptable level of performance, and then not perform additional adjustment to further optimize viewing. Thus, such viewers may not enjoy the full viewing experience offered by the device.
Accordingly, embodiments are disclosed herein that relate to assisting users in adjusting a near-eye display device. Briefly, the disclosed embodiments determine from image data a relative position between the location of an eye of a user and a target viewing position of the near-eye display, and determine an adjustment to make to the near-eye display device that aligns the eye with the target viewing position. The determined adjustment may be performed automatically and/or output as a recommendation for the user to perform manually. This may help to simplify adjusting the near-eye display system to more precisely align the near-eye display system with the user's eye or eyes. It will be understood that reference herein to a location of an eye may signify a location of the overall eye structure, the pupil of the eye, and/or any other anatomical feature of the eye.
As discussed above, misalignment of the display optics of the head-mounted display device with the user's eye may result in vignetting of the field of view and other optical effects. Thus, for proper viewing, a fit system and/or other mechanisms may be used to place the head-mounted display at a target viewing position relative to the user's eyes. The target viewing position may be defined, for example, by a region in space inside of which an eye may properly perceive displayed images.
Achieving a proper fit via a fit system may pose challenges. For example, some near-eye displays may be fit to a user via professional equipment that is used to determine anatomical measurements related to the eye. However, such methods may be too expensive and cumbersome for use with consumer devices.
Thus, as mentioned above, to facilitate proper alignment between the target viewing position and a user's eye, a near-eye display may be configured to detect a location of a user's eye from image data, and output a recommendation regarding an adjustment to make to the near-eye display to place the user's eye in a target viewing position relative to the near-eye display.
The head-mounted display device 100 may determine an adjustment to perform or recommend in any suitable manner. For example, the head-mounted display system may determine an offset of the user's eye (or of the pupil or another anatomical feature of the eye) from the target viewing position for that eye, and may output a recommendation based upon a known or determined relationship between operation of an adjustment mechanism and the resulting change in the location of the user's eye relative to the target viewing position.
Any suitable adjustment may be recommended and/or performed. For example, some devices may offer multiple adjustment mechanisms (horizontal, vertical, angular, etc.). In such devices, multiple recommendations may be output, or multiple adjustments performed, depending upon the adjustments to be made. Where multiple adjustments are recommended, the recommendations may be output together as a list, displayed sequentially (e.g. such that the system first displays a subset of one or more recommended adjustments and waits for the user to make those adjustments before displaying one or more further recommendations), or output in any other suitable manner.
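The sequential display of recommendations described above might be sketched as follows. This is an illustrative sketch only, not part of the disclosed device; the function name, the `display_fn` and `wait_for_adjust_fn` callbacks, and the batching scheme are all assumptions made for illustration.

```python
def output_recommendations(recommendations, display_fn, wait_for_adjust_fn,
                           batch_size=1):
    """Present recommended adjustments one batch at a time, waiting
    for the user to complete each batch before showing the next."""
    for i in range(0, len(recommendations), batch_size):
        batch = recommendations[i:i + batch_size]
        display_fn(batch)            # e.g. render the batch on the display
        wait_for_adjust_fn(batch)    # block until the user performs them
```

A batch size of one corresponds to displaying one recommendation at a time; a batch size equal to the list length corresponds to outputting all recommendations together as a list.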
Other devices may offer fewer adjustment mechanisms (e.g. an interpupillary distance adjustment but no vertical adjustment). Further, some devices, such as wearable devices (e.g. head-mounted display systems), may be offered in multiple sizes. In such embodiments, the recommendation may suggest a different sized device, as described in more detail below.
The depicted horizontal adjustment mechanism 202 allows the distance between a left eye display 208 and a right eye display 210 to be adjusted, for example, based upon an interpupillary distance of a user to position the left eye in a left eye target viewing position and the right eye in a right eye target viewing position. In some embodiments, other horizontal adjustment mechanisms may be provided. For example, a horizontal adjustment mechanism (not shown) may be provided that adjusts a distance between each earpiece 212 and associated left or right eye display. Such adjustment mechanisms may be configured to adjust the positions of the left eye display 208 and the right eye display 210 in a complementary or independent manner.
In addition to the horizontal adjustment mechanism 202,
In other embodiments, various optical components may be used to deliver an image of the user's eye to a camera not positioned to directly image the user's eye. For example, in a head-mounted display device, various optical components may be used to deliver display images to a user's eye. These components may be referred to herein as a display optical path. In such a device, a reverse display optical path may be used to deliver images of the eye to the camera.
The near-eye display 500 also includes an eye tracking system comprising an eye tracking camera 512 and one or more light sources 508 (e.g. infrared light sources) configured to produce light for reflection from the user's eye. As shown in
As the eye tracking camera 512 is configured to capture an image of the user's eye, the eye tracking camera 512 also may be used to acquire images of a user's eye during a fitting process for a head-mounted display. As mentioned above, when initially fitting a head-mounted display, a user may perform sufficient fit adjustments to find a coarse fit that provides an acceptable level of performance. Once the user performs these adjustments, at least a portion of the user's pupil will be in the view of the eye tracking system. Image data from the eye tracking camera may then be used to determine a location of the user's eye, and to determine an adjustment to make or recommend.
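As a hypothetical illustration of how a location of the eye might be detected from eye tracking camera image data, the following sketch estimates a pupil center via simple intensity thresholding, exploiting the fact that a pupil typically appears dark under infrared illumination. The function name, the threshold value, and the centroid approach are illustrative assumptions, not the disclosed method.

```python
import numpy as np

def detect_pupil_center(eye_image, threshold=40):
    """Estimate the pupil center in a grayscale IR eye image.

    The pupil usually appears as the darkest region under IR
    illumination, so an intensity threshold followed by a centroid
    computation gives a coarse (x, y) location estimate in pixels.
    """
    mask = eye_image < threshold        # dark pixels are pupil candidates
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                     # no pupil visible in the frame
    return float(xs.mean()), float(ys.mean())
```

A production eye tracking system would likely use glint detection, ellipse fitting, or a trained model instead, but the output in each case is a pupil location that can be compared against the target viewing position.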
Returning to
The relative position determined may depend upon a horizontal and/or vertical offset of the eye from the target viewing position in the image, and also upon a distance of the eye from the near-eye display device. Any suitable method may be used to determine this distance. For example, in some embodiments, a predetermined distance based upon the design of the near-eye display device (e.g. compared to an average anatomy of expected users) may be used.
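Under a simple pinhole-camera model, an image-space offset in pixels can be converted into a physical offset given the eye-to-camera distance and the camera's focal length in pixels. The following sketch illustrates this conversion; the function name and all parameter values are assumptions for illustration.

```python
def pixel_offset_to_mm(pixel_offset, eye_distance_mm, focal_length_px):
    """Convert an image-space offset (pixels) between the detected eye
    location and the target viewing position into a physical offset
    (mm), using a pinhole-camera model at an assumed eye distance."""
    return pixel_offset * eye_distance_mm / focal_length_px
```

For example, a 50 pixel offset observed at an assumed 30 mm eye distance with a 600 pixel focal length corresponds to a 2.5 mm physical offset.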
Method 400 further includes, at 408, determining an adjustment to make to the head-mounted display to align the location of the eye with the target viewing position. Method 400 additionally includes, at 410, outputting the recommendation and/or making the adjustment automatically. The recommendation may be determined in any suitable manner. For example, as mentioned above, the recommendation may be made based upon a detected offset of the user's eye (or each of the user's eyes) from the target viewing position (or each of two target viewing positions) in combination with information regarding the effect of an adjustment mechanism. As a non-limiting example, if it is determined that the separation of a left eye display and right eye display is to be increased by three millimeters and the increment of adjustment is one half millimeter, then it may be determined to recommend that the user increase a horizontal adjustment value by six increments of adjustment. It will be understood that, where multiple adjustments are to be made, the multiple adjustments may be made via any suitable combination of automatic and manual adjustments, depending upon the adjustment mechanisms provided.
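The three-millimeter example above reduces to a simple computation: the required physical change divided by the size of one increment of the adjustment mechanism, with the sign giving the direction. The following sketch and its rounding behavior are illustrative assumptions, not part of the disclosure.

```python
def recommend_increments(required_offset_mm, increment_mm):
    """Translate a required display adjustment (mm) into a whole
    number of increments of the adjustment mechanism, plus a
    direction for the user to turn or slide the mechanism."""
    increments = round(required_offset_mm / increment_mm)
    direction = "increase" if increments >= 0 else "decrease"
    return direction, abs(increments)
```

With the values from the example, a required increase of 3 mm and a 0.5 mm increment yield a recommendation to increase the horizontal adjustment value by six increments.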
As mentioned above, a recommendation of adjustments to make may take any suitable form.
Recommended adjustments also may be output via images, such as icons, symbols, etc., that direct the user how to perform the adjustment. For example, as shown in
In some embodiments, a near-eye display may include motors or other suitable electronic mechanisms for allowing determined adjustments to be performed automatically. In such embodiments, a user may be prompted for confirmation to perform the adjustment, or the adjustment may be automatically performed without user confirmation.
Further, as mentioned above, in some embodiments a near-eye display device may be available in a range of sizes configured to fit users having different anatomies (e.g. head sizes, interpupillary distances, etc.). Such near-eye displays may be configured to determine whether a user is wearing an appropriately sized near-eye display and, if not, to output a recommendation that directs the user to a different size of near-eye display. As an example,
To allow a determination to be made to recommend a different sized device, the near-eye display device may include a measuring system, such as an encoder, for each adjustment mechanism. The measuring system may detect a current absolute setting of the adjustment mechanism, and from the current setting determine whether an adjustment can be made within the remaining adjustment range available. The recommendation to select a different size then may be made if insufficient adjustment range is available. The use of such an encoder (or other measuring mechanism) may provide other capabilities as well. For example, absolute adjustment settings may allow for the absolute measurement of eye dimensional information, which may be used for user identification and/or other device features.
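The encoder-based feasibility check described above might be sketched as follows. The function name and the representation of the mechanism's travel limits are assumptions for illustration only.

```python
def check_adjustment_feasible(current_setting, required_change,
                              setting_min, setting_max):
    """Use an absolute encoder reading to decide whether a requested
    adjustment fits within the mechanism's remaining travel. If not,
    a differently sized device may be recommended instead."""
    target = current_setting + required_change
    if setting_min <= target <= setting_max:
        return "adjust", target
    return "recommend_other_size", None
```

For example, if the mechanism is already near the end of its travel and the required change would exceed the maximum setting, the check returns the size recommendation rather than an adjustment target.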
The use of cameras to determine a location of a user's eyes relative to a target viewing position may offer other advantages. For example, the interpupillary distance of a user decreases as a user views objects at closer and closer distances. Thus, in a near-eye display device configured to display stereoscopic images, the interpupillary distance may be determined via image data from the cameras along with information regarding how far apart the cameras are. The rendering of stereoscopic images then may be adjusted based upon changes in the interpupillary distance. This may help to accurately render stereoscopic images at close apparent distances.
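One possible sketch of estimating interpupillary distance from per-eye camera images and a known camera separation follows, assuming a pinhole model with one axis-aligned camera per eye. All names and parameters are illustrative assumptions.

```python
def estimate_ipd_mm(left_pupil_x_px, right_pupil_x_px,
                    camera_separation_mm, eye_distance_mm,
                    focal_length_px, image_center_x_px):
    """Estimate interpupillary distance from one eye camera per eye.

    Each pupil's horizontal offset from its camera's optical axis is
    converted to millimeters via a pinhole model, then combined with
    the known separation between the two cameras' optical axes."""
    left_off = ((left_pupil_x_px - image_center_x_px)
                * eye_distance_mm / focal_length_px)
    right_off = ((right_pupil_x_px - image_center_x_px)
                 * eye_distance_mm / focal_length_px)
    # IPD = distance between pupils = camera separation plus the
    # rightward offset of the right pupil minus that of the left pupil
    return camera_separation_mm + right_off - left_off
```

As both pupils rotate inward during convergence on a near object, the estimated value decreases, which is the change the stereoscopic rendering could then track.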
In some embodiments, the methods and processes described herein may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
Computing system 700 includes a logic machine 702 and a storage machine 704. Computing system 700 may optionally include a display subsystem 706, input subsystem 708, communication subsystem 710, and/or other components not shown in
Logic machine 702 includes one or more physical devices configured to execute instructions. For example, the logic machine may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
The logic machine may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic machine may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic machine may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic machine optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic machine may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
Storage machine 704 includes one or more physical devices configured to hold instructions executable by the logic machine to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage machine 704 may be transformed—e.g., to hold different data.
Storage machine 704 may include removable and/or built-in devices comprising computer-readable storage media. Storage machine 704 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage machine 704 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
It will be appreciated that storage machine 704 includes one or more physical devices and excludes a propagating signal per se. However, aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.), as opposed to being stored by a computer readable storage medium.
Aspects of logic machine 702 and storage machine 704 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.
The term “program” and the like may be used to describe an aspect of computing system 700 implemented to perform a particular function. In some cases, a program may be instantiated via logic machine 702 executing instructions held by storage machine 704. It will be understood that different programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The term “program” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
Display subsystem 706 may be used to present a visual representation of data held by storage machine 704. This visual representation may take the form of a graphical user interface (GUI) displayed, for example, on a near-eye display device. As the herein described methods and processes change the data held by the storage machine, and thus transform the state of the storage machine, the state of display subsystem 706 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 706 may include one or more display devices utilizing virtually any type of technology. For example, a near-eye display device may deliver an image to a user via one or more waveguides, via projection optics, and/or in any other suitable manner. Such display devices may be combined with logic machine 702 and/or storage machine 704 in a shared enclosure, or such display devices may be peripheral display devices.
When included, input subsystem 708 may comprise or interface with one or more user-input devices such as a keyboard, mouse, touch screen, or game controller. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
When included, communication subsystem 710 may be configured to communicatively couple computing system 700 with one or more other computing devices. Communication subsystem 710 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow computing system 700 to send and/or receive messages to and/or from other devices via a network such as the Internet.
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.