Mixed reality devices allow users to view the real world while simultaneously viewing computer-generated graphics overlaid on real-world objects and scenery in the user's field of vision. These graphics may be used by the device to enhance the user's viewing experience in many ways, such as by displaying information about objects or locations viewed by the user.
Common designs for mixed reality devices utilize reflective beamsplitters and mirrors to direct both ambient light from the real world and light from an electronic display device toward a user's eye. In a typical design, ambient light enters the device through one beamsplitter while light from an electronic display enters through a second beamsplitter. Light from each source travels along separate light paths before being overlaid and directed out of the system toward the user's eye. In order to properly direct the light, however, the light paths within the system often require the light to pass through or reflect from the first or second beamsplitter one or more times.
Reflective beamsplitters are typically designed to transmit or reflect only a portion of incident light. Thus, mixed reality devices are severely limited by the amount of light intensity lost each time the light in the system reflects from or is transmitted through one of the beamsplitters. As a result, the brightness of the light in the system is diminished and the contrast between the ambient light entering the system and the light generated by the electronic display device cannot be properly controlled. Such an effect reduces the sharpness of graphics displayed on the device and negatively impacts the user's viewing experience. In addition, the loss of light from the electronic display requires the device to expend more power to produce visible graphics and thus reduces the overall battery life of the device.
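As a rough illustration of this loss, the following Python sketch estimates the fraction of display light that survives a conventional two-beamsplitter path. The ideal, lossless 50/50 split ratio and the number of interactions are assumptions chosen for illustration only.

```python
# Illustrative throughput estimate for a conventional two-beamsplitter
# combiner, assuming ideal lossless 50/50 splitters (an assumption of this
# sketch; real coatings and split ratios vary).

def throughput(split_ratio: float, interactions: int) -> float:
    """Fraction of intensity remaining after a number of beamsplitter
    interactions, each keeping `split_ratio` of the incident intensity."""
    return split_ratio ** interactions

# Display light that reflects off one 50/50 beamsplitter and then transmits
# through a second retains only 0.5 * 0.5 = 25% of its original intensity.
print(throughput(0.5, 2))  # 0.25
# Each additional pass through a splitter halves the remaining intensity.
print(throughput(0.5, 3))  # 0.125
```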
An optical display system configured to transmit light along a light path to a user's eye is provided. The display system may comprise a circular polarizing reflector configured to reflect light with a first polarization from an image source, a quarter wave plate downstream of the circular polarizing reflector in the light path and configured to rotate the polarization of the light to a second polarization, and a curved linear polarizing reflector downstream of the quarter wave plate and configured to reflect the light back through the quarter wave plate along the light path in the direction of the circular polarizing reflector. The quarter wave plate may be further configured to rotate the polarization of the light received from the curved linear polarizing reflector to a third polarization and the circular polarizing reflector may be further configured to receive said light from the quarter wave plate and transmit the light toward the user's eye.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
As described above, current designs for mixed reality display devices suffer from limitations due to the inefficiency of reflecting light multiple times from traditional beamsplitters. To address these issues, embodiments are disclosed herein that relate to an optical display device which may combine the advantages of highly efficient circular and linear polarizing reflectors with the use of a switchable wave plate to produce a mixed reality device with improved efficiency.
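As a simplified illustration of the folded light path such a device may use, the following sketch traces a symbolic polarization label through each element. The handedness labels "A"/"B" and the linear axis label "1" are assumptions of this sketch, not designations used in the embodiments described below.

```python
# A symbolic trace of the folded light path such a device may use. The
# handedness labels "A"/"B" and the linear axis label "1" are illustrative
# assumptions, not designations used in the embodiments described below.

PATH = (
    ("image source", "emits light with circular polarization A"),
    ("circular polarizing reflector", "reflects polarization A along the light path"),
    ("quarter wave plate (1st pass)", "converts polarization A to linear polarization 1"),
    ("curved linear polarizing reflector", "reflects linear polarization 1 back along the path"),
    ("quarter wave plate (2nd pass)", "converts linear polarization 1 to circular polarization B"),
    ("circular polarizing reflector", "transmits polarization B toward the user's eye"),
)

for element, action in PATH:
    print(f"{element:36s} -> {action}")
```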
The circular polarizing reflector 20 in optical display system 10 may be configured to transmit one of either right-hand polarized or left-hand polarized light and to reflect the other. Likewise, the curved linear polarizing reflector 16 may be configured to transmit one of S-polarized or P-polarized light and reflect the other. The quarter wave plate 18 may be configured to rotate the polarization of the light in a four-phase pattern alternating between linear and circular polarization. For example, the quarter wave plate 18 may be configured to rotate S-polarized light to right-hand circularly polarized light after a first pass, rotate the right-hand circularly polarized light to linear P-polarized light after a second pass, and rotate the P-polarized light to left-hand circularly polarized light after a third pass.
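The alternation between linear and circular states can be reproduced with a short Jones-calculus sketch. In the snippet below, the quarter wave plate is modeled with its fast axis at 45 degrees to the linear basis; the axis labels, handedness signs, and the 45-degree angle are assumptions made for illustration rather than details taken from the embodiments.

```python
# A minimal Jones-calculus sketch of the four-phase polarization cycle
# described above (linear -> circular -> orthogonal linear -> opposite
# circular). The 45-degree fast-axis angle, axis labels, and handedness
# signs are assumptions of this sketch.
import numpy as np

# Quarter wave plate with its fast axis at 45 degrees to the linear basis,
# written as a Jones matrix (a global phase factor is omitted).
QWP_45 = 0.5 * np.array([[1 + 1j, 1 - 1j],
                         [1 - 1j, 1 + 1j]])

def describe(v):
    """Rough classification of a Jones vector for printing."""
    v = v / np.linalg.norm(v)
    if abs(v[0]) < 1e-9 or abs(v[1]) < 1e-9:
        return "linear"
    phase = np.angle(v[1] / v[0])
    if np.isclose(abs(phase), np.pi / 2):
        return f"circular (handedness sign {int(np.sign(phase)):+d})"
    return "elliptical"

state = np.array([1.0, 0.0])  # start with linearly polarized light (e.g. "S")
for n in range(1, 4):
    state = QWP_45 @ state
    print(f"after pass {n}: {describe(state)}")
# Prints: circular, then the orthogonal linear state, then circular of the
# opposite handedness -- the alternating pattern described above.
```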
The display system 10 may be further configured such that the quarter wave plate 18 is a switchable quarter wave plate configured to rotate the polarization of the light if the quarter wave plate 18 is activated and to not rotate the polarization of the light if the quarter wave plate 18 is deactivated.
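A switchable plate of this kind can be sketched, in the same Jones-calculus terms as above, as an element that applies the quarter-wave retardation only when a control signal is asserted; the boolean control and the matrix form below are illustrative assumptions.

```python
# Sketch of the switchable quarter wave plate idea: when activated it applies
# the quarter-wave retardation, when deactivated it leaves the polarization
# state unchanged. The boolean control and matrix form are assumptions.
import numpy as np

QWP_45 = 0.5 * np.array([[1 + 1j, 1 - 1j],
                         [1 - 1j, 1 + 1j]])
IDENTITY = np.eye(2, dtype=complex)

def switchable_qwp(state: np.ndarray, activated: bool) -> np.ndarray:
    """Return the Jones vector after the switchable plate."""
    return (QWP_45 if activated else IDENTITY) @ state

linear_in = np.array([1.0, 0.0])
print(switchable_qwp(linear_in, activated=True))   # circular component appears
print(switchable_qwp(linear_in, activated=False))  # linear state unchanged
```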
The system 110 may be further configured such that an image source 114 is positioned at an angle 134 of less than ninety degrees with respect to an optical axis O of the curved linear polarizing reflector 116 so as to project side-addressed light toward the curved linear polarizing reflector 116. The prism 130 may be positioned such that the angle 134 between the optical axis O of the curved linear polarizing reflector 116 and the axis A of the image source 114 is less than ninety degrees. As a result, the overall size of the system 110 can be decreased. The system 110 may be further configured to receive ambient light through the curved linear polarizing reflector 116. In certain embodiments, a time-multiplexer 128 may be configured to control the image source 114 and the quarter wave plate 118 in the manner described previously and, in doing so, dynamically control the relative brightness of, and thus the contrast between, the ambient light and the light from the image source 114.
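One plausible reading of such time-multiplexed control, sketched below, is a duty-cycle trade-off between ambient and image brightness. The sketch assumes the eye integrates intensity over a frame and that each multiplexing state passes predominantly one light source; the function name, parameters, and the fully on/off behavior of each state are hypothetical.

```python
# A duty-cycle sketch of time-multiplexed contrast control, assuming the eye
# integrates intensity over a frame and that each multiplexing state passes
# predominantly one light source. The function name, parameters, and the
# fully on/off behavior of each state are hypothetical.

def perceived_brightness(ambient_intensity: float,
                         image_intensity: float,
                         image_duty_cycle: float) -> tuple:
    """Average brightness contributed by each source over many frames."""
    ambient = ambient_intensity * (1.0 - image_duty_cycle)
    image = image_intensity * image_duty_cycle
    return ambient, image

# Spending 30% of each frame period in the "image" state:
ambient, image = perceived_brightness(1.0, 2.0, image_duty_cycle=0.3)
print(f"ambient: {ambient:.2f}  image: {image:.2f}  ratio: {image / ambient:.2f}")
```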
It should be further noted that in certain embodiments of the method 400, the curved linear polarizing reflector may be a curved wire grid polarizing beamsplitter. Furthermore, the circular polarizing reflector may be a cholesteric liquid crystal reflective polarizer. In addition, the circular polarizing reflector may have either a flat or a curved shape.
In some embodiments, the methods and processes described herein may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product, e.g. to display an image via the disclosed display system embodiments.
Computing system 600 includes a logic machine 602 and a storage machine 604. Computing system 600 may also include a display subsystem 606, input subsystem 608, communication subsystem 610, and/or other components not shown.
Logic machine 602 includes one or more physical devices configured to execute instructions. For example, the logic machine may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
The logic machine may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic machine may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic machine may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic machine optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic machine may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
Storage machine 604 includes one or more physical devices configured to hold instructions executable by the logic machine to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage machine 604 may be transformed—e.g., to hold different data.
Storage machine 604 may include removable and/or built-in devices. Storage machine 604 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage machine 604 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
It will be appreciated that storage machine 604 includes one or more physical devices. However, aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration.
Aspects of logic machine 602 and storage machine 604 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.
The terms “module,” “program,” and “engine” may be used to describe an aspect of computing system 600 implemented to perform a particular function. In some cases, a module, program, or engine may be instantiated via logic machine 602 executing instructions held by storage machine 604. It will be understood that different modules, programs, and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module, program, and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms “module,” “program,” and “engine” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
When included, display subsystem 606 may be used to present a visual representation of data held by storage machine 604. This visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the storage machine, and thus transform the state of the storage machine, the state of display subsystem 606 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 606 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic machine 602 and/or storage machine 604 in a shared enclosure, or such display devices may be peripheral display devices.
When included, input subsystem 608 may comprise or interface with one or more user-input devices such as a keyboard, mouse, touch screen, or game controller. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
When included, communication subsystem 610 may be configured to communicatively couple computing system 600 with one or more other computing devices. Communication subsystem 610 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow computing system 600 to send and/or receive messages to and/or from other devices via a network such as the Internet.
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.