Recently, there has been a surge in interest in virtual reality (VR), mixed reality (MR), and augmented reality (AR) devices. Many of these devices make use of user-worn headsets that are able to project images onto a user's eyes to create two-dimensional or three-dimensional images displayed to a user. Often, these headsets are bulky and cumbersome to wear. In some devices, this is caused by the inefficient nature of the projectors and optical devices in the headsets. In particular, the inefficiencies of projectors and waveguides result in the need to use higher power for transmission, and the corresponding need to have bulky cooling systems to dissipate excess generated heat. Additionally, in some devices the projectors and waveguides themselves add a significant amount of bulk and weight. This can make such headsets difficult to wear and use for long periods of time.
Additionally, such devices often have a limited field of view. For example, some current VR, MR, and AR devices have a field of view somewhere between 30 and 40°.
The subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.
One embodiment illustrated herein includes a movement based display configured to display full images to a user by moving light emitters through a user's field of view. The movement based display includes a first movable member. The movement based display further includes a first light emitter array, comprising a plurality of light emitters, coupled to the first movable member. The first light emitter array is configured to output light from the light emitters dependent on a position of the first movable member. The movement based display further includes a first lens array. The first lens array is coupled to the first light emitter array. The first lens array comprises lenses configured to direct light into a first aperture, such as a user's eye.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Additional features and advantages will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the teachings herein. Features and advantages of the invention may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. Features of the present invention will become more fully apparent from the following description and appended claims, or may be learned by the practice of the invention as set forth hereinafter.
In order to describe the manner in which the above-recited and other advantages and features can be obtained, a more particular description of the subject matter briefly described above will be rendered by reference to specific embodiments which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments and are not therefore to be considered to be limiting in scope, embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Some embodiments illustrated herein are able to implement a movement based display on a headset usable for augmented reality, mixed reality, and/or virtual reality systems. In particular, the embodiments implement a device where an array of light emitters is caused to move while outputting light in a way that causes a user to perceive an image displayed by the light emitters, that is, to display one or more images that are persisted to a user based on the motion of the light emitters, the modulation of the light emitters, and the physical perception limitations of a user. In particular, the light emitters can be moved at a sufficient speed, and output from the light emitters modulated in such a way, that the motion of the light emitters is substantially imperceptible to a user, thus causing the motion of the light emitters combined with the output of the light emitters to appear as a persisted image to the user. Note that the static images displayed to a user can be changed over time at a rate sufficient for the static images to be used in animation. Alternatively, images can be displayed with devices which include head tracking such that virtual items can be displayed to a user in a static location in a real world environment as the user moves about the real world environment. Thus, for example, a one-dimensional array of light emitters can be used to create a two-dimensional image. Further, as will be illustrated in more detail below, a one-dimensional array of light emitters and/or a pair of one-dimensional arrays of light emitters can be used to create three-dimensional images by displaying stereoscopic images to a user.
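As a minimal illustrative sketch of this persistence-of-vision principle (written in Python, with hypothetical names and parameters that are not drawn from any particular embodiment), the following maps a stored two-dimensional frame onto a moving one-dimensional emitter column: at each position of the movable member, one column of the frame is selected and written to the emitters, so that over a full sweep the user perceives the complete image.

```python
import numpy as np

# Hypothetical parameters: a 1D column of emitters swept across the field of view.
NUM_EMITTERS = 1080          # emitters in the one-dimensional array (image rows)
NUM_COLUMNS = 1920           # distinct sweep positions (image columns)

# The 2D frame to be persisted to the user (grayscale values 0..255).
frame = np.zeros((NUM_EMITTERS, NUM_COLUMNS), dtype=np.uint8)

def column_for_position(sweep_fraction: float) -> np.ndarray:
    """Return emitter intensities for the current position of the movable member.

    sweep_fraction is the member's position normalized to [0, 1) across one sweep.
    Each position selects one column of the stored frame, so a full sweep of the
    one-dimensional emitter array reproduces the full two-dimensional image.
    """
    column_index = int(sweep_fraction * NUM_COLUMNS) % NUM_COLUMNS
    return frame[:, column_index]

# Example: as the member sweeps, a driver would repeatedly sample its position
# (e.g., from an encoder) and update the emitters with the matching column.
for step in range(NUM_COLUMNS):
    intensities = column_for_position(step / NUM_COLUMNS)
    # write `intensities` to the emitter drivers here
```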
Additionally, embodiments may be implemented where the light emitter arrays are configured for use in near vision devices. For example, the light emitter arrays may be implemented in a device in a way that allows output from the light emitters in the light emitter array to be focused into a user's eye where the user's eye is in close proximity to the light emitters. For example, the light emitters may be somewhere between 0 and 4 inches from the user's eye. In some embodiments, a device may be configured for near vision use by implementing lenses on the light emitters of the light emitter array where the lenses are configured to direct light into a desired aperture, such as the pupil of a user's eye.
Such devices can improve near vision devices dramatically over existing near vision devices. For example, some embodiments may be able to provide a practically unlimited field of view.
Additionally or alternatively, some such devices may be substantially lighter weight than previous devices. Note that in some embodiments this is possible because cooling can be accomplished by implementing a light emitter array on a movable member that is configured in size and shape to create convection currents which can cool light emitter elements, thus obviating the need for bulky and heavy cooling elements. Additionally or alternatively, embodiments may be implemented where the light emitter array is implemented with a large surface area to facilitate cooling. Additionally or alternatively, embodiments may be implemented where moving elements inside of a device can be used as a fan.
Additionally or alternatively, some such devices may be able to achieve a 50/50 product weight distribution between different hemispheres of the device. For example, there may be a desire to have even weight distribution between the front and the back of the device. In contrast, many current devices are notoriously front heavy, which leads to neck strain and discomfort. Embodiments illustrated herein can be implemented to allow for a more balanced weight distribution.
Referring now to
Note that the light emitter array can be moved by the movable member 106 and produce light output, in some embodiments, in a fashion which causes one or more 2D images to be displayed to the user 102. In other embodiments, the light emitter array can be moved by the movable member 106 and produce light output in a fashion which causes one or more 3D images to be displayed to the user 102. In particular, the light emitter array can direct light output into the eyes of the user 102 in a fashion such that stereoscopic images are displayed to the user 102, resulting in a 3D image being perceived by the user 102.
Referring now to
The movable member 106 is a substrate on which the light emitter array 108 can be mounted. In some embodiments, the movable member is made of one or more metals and is configured to transfer and dissipate heat away from the light emitter array 108. For example, the movable member 106 may be constructed of aluminum, copper, titanium, alloys thereof, combinations thereof, or of other heat transferring materials. However, the movable member 106 may alternatively or additionally be constructed of other materials, such as polymers or other rigid or semi-rigid materials.
In some embodiments, the movable member 106 may further include a semiconductor substrate on which the light emitter array 108 can be fabricated.
As the movable member 106 physically translates when the device 100 is in operation, the movable member 106 may be configured, in some embodiments, in size and shape to create convection air currents when the movable member 106 is in motion. In particular, the movable member 106 may be configured in size and shape to cause air to flow across and/or away from the light emitter array 108 carrying heat away from the light emitter array 108.
The light emitter array 108 may be composed of various light emitting sources. For example, the light emitting sources of the light emitter array may include light emitting diodes (LEDs), semiconductor lasers, and/or other light emitting sources. The light emitter array 108 may be configured in any one of a number of different configurations. Various example configurations are illustrated in
The light emitters on the light emitter array 108 may have certain characteristics to obtain a desired image resolution for images persisted to the user by moving the movable member 106. For example, if there is a desire to display a 1080P image to the user 102, the light emitter array 108 may have a pixel pitch of approximately 14 μm. Additionally or alternatively, to allow the light emitter array 108 to change light outputs sufficiently fast to create a persisted image displayed to the user 102, the light emitters in the light emitter array 108 may have a latency of about 1.5 μs or less.
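As a rough, hypothetical check on these figures, the following sketch estimates the time available to modulate each column during a sweep; the refresh rate and column count are assumptions chosen for illustration rather than values taken from any embodiment.

```python
# Hypothetical sweep parameters for estimating the required emitter latency.
refresh_rate_hz = 60          # assumed full-image refresh rate (sweeps per second)
columns_per_sweep = 1920      # assumed number of distinct columns in a 1080P-class image

time_per_sweep_s = 1.0 / refresh_rate_hz
time_per_column_s = time_per_sweep_s / columns_per_sweep

print(f"Time available per column: {time_per_column_s * 1e6:.1f} microseconds")
# -> roughly 8.7 microseconds per column under these assumptions, so an emitter
#    latency of about 1.5 microseconds or less leaves margin for modulating each column.
```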
The lens array 110 may be implemented in one or more of a number of different fashions. For example, in some embodiments, the lens array may include discrete lenses mechanically disposed on light emitters of the light emitter array.
Alternatively or additionally, lenses may be applied using semiconductor processing and lithographic techniques. For example, during manufacturing of the lens array 110, processes may include exposing individual light emitters by etching away portions of top layers of a semiconductor epitaxy to create rounded surfaces over each light emitter and adding a resin layer over the rounded surfaces to act as lenses for the light emitters.
Alternatively or additionally, the lens array may include diffraction gratings. For example, in a manufacturing process, fused silica may be deposited onto the light emitters of the light emitter array 108. Holographically patterned gratings can be etched into the fused silica in a fashion that causes light from the light emitters in the light emitter array 108 to be directed towards the eye of a user. Alternatively or additionally, embodiments may use digital planar holography to construct gratings in the lens array 110.
As illustrated in
Note that the lenses may be configured specifically to focus the emitted light toward the aperture 120. In particular, in near vision devices, there is a need to focus emitted light into a pupil rather than just allowing diffused light to enter the pupil. Thus, as used herein, directing light into an aperture requires the light to be focused or concentrated into the aperture. In some embodiments, directing light into an aperture may require that a certain percentage of the light be directed towards and enter the aperture. For example, some embodiments may require that at least 90% of the emitted light be directed towards and enter into the aperture. Alternatively or additionally, embodiments may require that at least 70% of the emitted light be directed towards and enter into the aperture. Alternatively or additionally, some embodiments may require that at least 50% of the emitted light be directed towards and enter into the aperture.
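As a purely illustrative calculation of why such focusing is needed (the emitter model and geometry below are assumptions, not measurements of any embodiment), the fraction of light from an unlensed, roughly Lambertian emitter that would reach a pupil-sized aperture can be estimated as follows:

```python
import math

# Hypothetical geometry: an unlensed Lambertian emitter and a 4 mm pupil 25 mm away.
pupil_radius_mm = 2.0
eye_distance_mm = 25.0

# For a Lambertian emitter, the fraction of emitted light inside a cone of
# half-angle theta is sin^2(theta).
half_angle = math.atan(pupil_radius_mm / eye_distance_mm)
fraction_into_pupil = math.sin(half_angle) ** 2
print(f"Unlensed light entering the pupil: {fraction_into_pupil * 100:.2f}%")
# -> well under 1% with these assumed dimensions, which is why per-emitter lenses
#    that concentrate light toward the aperture are needed to reach the 50-90%
#    figures discussed above.
```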
Note that in some embodiments, the lens array 110 may be used in conjunction with an additional lens or additional lenses, such as the lens 121 illustrated in
In some embodiments, the driver circuitry 124 may also include computing circuitry to perform various general-purpose computing tasks. For example, the driver circuitry 124 may include the ability to run word processing applications, photo editing applications, email applications, Internet browsing applications, or any one of a number of different applications. Alternatively or additionally, in some embodiments the driver circuitry 124 may be configured to connect to external computing circuitry to receive data used for creating the persisted images displayed to the user 102.
In some embodiments, the driver circuitry 124 may be configured to receive wireless data from sources external to the device 100. For example, consider a scenario where the user wearing the device 100 is able to move about a physical environment. Items in the physical environment may include, or be associated with, wireless transmitters that are able to transmit data to the driver circuitry 124 through wireless communication means. For example, consider a case where a user may be in a museum or other such facility. As the user moves from exhibit to exhibit, the user can view exhibit items at the location of the exhibit items. A wireless transmitter may transmit data describing various features of the exhibit item. As the user moves to a new exhibit item, information can be transmitted from a transmitter proximate that new item which provides details regarding the new item.
Note that embodiments may be implemented where the output from the light emitter array 108 is directed to an eye box having a particular field of view 109. Field of view is defined as the range of viewable angles of an aperture. Typically, when referring to field of view in virtual reality, augmented reality, and mixed reality devices, field of view refers to the range of viewable angles for a particular fixation of an eye. Due to the movable nature of the movable member 106 and associated light and optical components, embodiments are able to direct light in a way that is able to expand the field of view 109 over currently existing devices. In particular, some embodiments may be able to implement a field of view of about 80 to 90°, where previous systems have been limited to about 30 to 40°.
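As a purely geometric illustration (the dimensions below are assumptions rather than measurements of any embodiment), the field of view subtended at the eye by the extent swept by the emitters can be estimated from that extent and its distance from the pupil:

```python
import math

# Hypothetical geometry: lateral extent swept by the moving emitter array and its
# distance from the pupil of the eye.
swept_width_mm = 45.0    # assumed extent swept by the emitters
eye_distance_mm = 25.0   # assumed distance from the emitters to the pupil

# Field of view is the angular range subtended at the pupil by the swept extent.
fov_deg = 2.0 * math.degrees(math.atan((swept_width_mm / 2.0) / eye_distance_mm))
print(f"Approximate field of view: {fov_deg:.0f} degrees")
# -> about 84 degrees with these assumed dimensions, in the 80-90 degree range
#    discussed above; a narrower fixed display at the same distance subtends far less.
```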
Referring now to
Note that while in the example illustrated in
Note again that while a single movable member 106-B is illustrated, it should be appreciated that in other embodiments, multiple movable members may be implemented. For example, in some embodiments a movable member may be implemented for each eye. In some embodiments the motion of the movable members, when multiple movable members are implemented, is coordinated so as to reduce or eliminate any jarring effects that may be experienced by the user resulting from the motion of the movable members. Thus, for example, the movable members may be configured to move in opposite directions and to reach the end of their motions at about the same time so as to cause movements and/or abrupt stops to cancel each other.
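As an illustrative sketch of this coordination (the masses, stroke, and motion profile below are assumed values, not parameters of any embodiment), two members driven with mirrored motion profiles produce equal and opposite momenta, so the net reaction on the headset is approximately zero:

```python
import numpy as np

# Hypothetical motion profiles for two movable members driven in opposite directions.
t = np.linspace(0.0, 1.0, 1000)          # one sweep period, normalized time
stroke_mm = 10.0                          # assumed peak-to-peak travel of each member
member_a = (stroke_mm / 2.0) * np.sin(2.0 * np.pi * t)   # position of first member
member_b = -member_a                                       # second member mirrors the first

mass_g = 5.0                              # assumed mass of each moving assembly
velocity_a = np.gradient(member_a, t)
velocity_b = np.gradient(member_b, t)
net_momentum = mass_g * (velocity_a + velocity_b)

print(f"Peak net momentum: {np.abs(net_momentum).max():.2e}")
# -> effectively zero: equal and opposite motion means the reaction forces on the
#    headset cancel, reducing the jarring effects described above.
```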
Referring now to
Returning once again to
Note that while a screw is illustrated herein, other embodiments may use other functionality, such as sliding weights, removable (or addable) tabs, adjustable pivot points, or any one of a number of different methods for balancing the movable member 106.
The following discussion now refers to a number of methods and method acts that may be performed. Although the method acts may be discussed in a certain order or illustrated in a flow chart as occurring in a particular order, no particular ordering is required unless specifically stated, or required because an act is dependent on another act being completed prior to the act being performed.
Referring now to
The method 500 further includes outputting light from a first light emitter array coupled to the first movable member, the first light emitter array comprising a plurality of light emitters, wherein outputting light from a first light emitter array is dependent on a position of the first movable member (act 504). For example, as illustrated in
The method 500 further includes directing the light output from the first light emitter array into a first aperture using a first lens array, wherein the first lens array is coupled to the first light emitter array, and wherein the first lens array comprises lenses configured to direct light into the first aperture (act 506). For example, as illustrated in
The method 500 may be practiced where moving the first movable member comprises rotating the first movable member about an axis. An example of this is illustrated in
The method 500 may be practiced where directing the light output from the first light emitter array into a first aperture comprises directing the light output from the first light emitter array into an eye of a user.
The method 500 may be practiced where directing the light output from the first light emitter array into a first aperture comprises directing the light through a lens configured to work in conjunction with the first lens array to direct light into the aperture. For example, as illustrated in
The method 500 may further include using the movement of the first movable member to cool the first light emitter array.
The method 500 may be practiced where outputting light from a first light emitter array coupled to the first movable member is performed using driver circuitry coupled to the light emitter array, wherein the driver circuitry configures the light emitter array to output light from the light emitters dependent on a position of the first movable member. In some such embodiments, the method 500 may further include the driver circuitry wirelessly receiving data for configuring the light emitter array.
The method 500 may further include moving a second movable member, outputting light from a second light emitter array coupled to the second movable member, the second light emitter array comprising a plurality of light emitters, wherein outputting light from a second light emitter array is dependent on a position of the second movable member, and directing the light output from the second light emitter array into a second aperture using a second lens array, wherein the second lens array is coupled to the second light emitter array, and wherein the second lens array comprises lenses configured to direct light into the second aperture such that the first light emitter array and second light emitter array display a 3D image to a user. One example of this is illustrated in
It should also be appreciated, however, that in some embodiments a single movable member may be able to display 3D images by outputting different images of a stereoscopic pair of images to two different eyes of the user.
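As a minimal sketch of how a single moving emitter array might serve both eyes (all names, dimensions, and the eye-selection mechanism are hypothetical and only for illustration), a driver can select the column from the left or right image of a stereoscopic pair depending on which eye the output is currently directed into:

```python
import numpy as np

# Hypothetical stereoscopic pair of frames, one per eye.
NUM_EMITTERS = 1080
NUM_COLUMNS = 1920
left_frame = np.zeros((NUM_EMITTERS, NUM_COLUMNS), dtype=np.uint8)   # image for left eye
right_frame = np.zeros((NUM_EMITTERS, NUM_COLUMNS), dtype=np.uint8)  # image for right eye

def stereo_column(sweep_fraction: float, target_eye: str) -> np.ndarray:
    """Return emitter intensities for the current position and the eye being addressed.

    The column is taken from the left or right image of the stereoscopic pair based
    on which eye the output is currently directed into, so the two eyes receive
    different images and the user perceives a 3D image.
    """
    frame = left_frame if target_eye == "left" else right_frame
    column_index = int(sweep_fraction * NUM_COLUMNS) % NUM_COLUMNS
    return frame[:, column_index]

# Example: how the output is steered toward one eye or the other is device specific
# and not modeled here.
intensities = stereo_column(0.25, "left")
```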
Referring now to
The method 600 further includes configuring the first light emitter array to output light from the light emitters dependent on a position of the first movable member (act 604). For example, the light emitter array 108 may be coupled to driver circuitry 124 which configures the light emitter array 108 to modulate its output according to a position of the movable member 106. Thus, the light output from the light emitter array 108 will vary depending on the position of the movable member 106.
The method 600 further includes coupling a first lens array to the first light emitter array in a fashion that configures the first lens array to direct light into a first aperture (act 606). For example, as illustrated in
The method 600 may further include configuring the first movable member to rotate about an axis.
The method 600 may further include optically coupling a lens to the first lens array, the lens being configured to work in conjunction with the first lens array to direct light into the aperture. For example, as illustrated in
The method 600 may further include coupling a second movable member to a second light emitter array, wherein the second light emitter array comprises a second plurality of light emitters, configuring the second light emitter array to output light from the second plurality of light emitters dependent on a position of the second movable member, and coupling a second lens array to the second light emitter array in a fashion that configures the second lens array to direct light into a second aperture such that the first light emitter array and second light emitter array are configured to display a 3D image to a user. This can be used to create a device such as the one illustrated in
Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
The present invention may be embodied in other specific forms without departing from its spirit or characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.