This invention relates to see-through computer display systems with vision correction and/or increased content density.
Head-mounted displays (HMDs), and particularly HMDs that provide a see-through view of the environment, are valuable instruments. Presenting content in a see-through display can be a complicated operation when attempting to optimize the user experience. Improved systems and methods for presenting content in the see-through display can improve the user experience.
Aspects of the present invention relate to methods and systems for see-through computer display systems with waveguides that include a vision corrective optic and provide increased content density by reducing scene light.
These and other systems, methods, objects, features, and advantages of the present invention will be apparent to those skilled in the art from the following detailed description of the preferred embodiment and the drawings. All documents mentioned herein are hereby incorporated in their entirety by reference.
Embodiments are described with reference to the following Figures. The same numbers may be used throughout to reference like features and components that are shown in the Figures:
While the invention has been described in connection with certain preferred embodiments, other embodiments would be understood by one of ordinary skill in the art and are encompassed herein.
Aspects of the present invention relate to head-worn computing (“HWC”) systems. HWC involves, in some instances, a system that mimics the appearance of head-worn glasses or sunglasses. The glasses may be a fully developed computing platform, such as including computer displays presented in each of the lenses of the glasses to the eyes of the user. In embodiments, the lenses and displays may be configured to allow a person wearing the glasses to see the environment through the lenses while also seeing, simultaneously, digital imagery, which forms an overlaid image that is perceived by the person as a digitally augmented image of the environment, or augmented reality (“AR”).
HWC involves more than just placing a computing system on a person's head. The system may need to be designed as a lightweight, compact and fully functional computer display, such as wherein the computer display includes a high-resolution digital display that provides a high level of immersion composed of the displayed digital content and the see-through view of the environmental surroundings. User interfaces and control systems suited to the HWC device may be required that are unlike those used for a more conventional computer such as a laptop. For the HWC and associated systems to be most effective, the glasses may be equipped with sensors to determine environmental conditions, geographic location, relative positioning to other points of interest, objects identified by imaging and movement by the user or other users in a connected group, and the like. The HWC may then change the mode of operation to match the conditions, location, positioning, movements, and the like, in a method generally referred to as a contextually aware HWC. The glasses also may need to be connected, wirelessly or otherwise, to other systems either locally or through a network. Controlling the glasses may be achieved through the use of an external device, automatically through contextually gathered information, through user gestures captured by the glasses' sensors, and the like. Each technique may be further refined depending on the software application being used in the glasses. The glasses may further be used to control or coordinate with external devices that are associated with the glasses.
Referring to
We will now describe each of the main elements depicted on
The HWC 102 is a computing platform intended to be worn on a person's head. The HWC 102 may take many different forms to fit many different functional requirements. In some situations, the HWC 102 will be designed in the form of conventional glasses. The glasses may or may not have active computer graphics displays. In situations where the HWC 102 has integrated computer displays, the displays may be configured as see-through displays such that the digital imagery can be overlaid with respect to the user's view of the environment 114. There are a number of see-through optical designs that may be used, including ones that have a reflective display (e.g. LCoS, DLP), emissive displays (e.g. OLED, micro-LED), holographic surfaces, TIR waveguides, and the like. In embodiments, lighting systems used in connection with the display optics may be solid state lighting systems, such as LED, OLED, quantum dot, quantum dot LED, etc. In addition, the optical configuration may be monocular or binocular. It may also include vision corrective optical components. In other embodiments, the HWC 102 may be in the form of a helmet with a see-through shield, sunglasses, safety glasses, goggles, a mask, fire helmet with see-through shield, police helmet with see-through shield, military helmet with see-through shield, utility form customized to a certain work task (e.g. inventory control, logistics, repair, maintenance, etc.), and the like.
The HWC 102 may also have a number of integrated computing facilities, such as an integrated processor, integrated power management, communication structures (e.g. cell net, WiFi, Bluetooth, local area connections, mesh connections, remote connections (e.g. client server, etc.)), and the like. The HWC 102 may also have a number of positional awareness sensors, such as GPS, electronic compass, altimeter, tilt sensor, IMU, and the like. It may also have other sensors such as a camera, rangefinder, hyper-spectral camera, Geiger counter, microphone, spectral illumination detector, temperature sensor, chemical sensor, biologic sensor, moisture sensor, ultrasonic sensor, and the like.
The HWC 102 may also have integrated control technologies. The integrated control technologies may be contextual based control, passive control, active control, user control, and the like. For example, the HWC 102 may have an integrated sensor (e.g. camera) that captures user hand or body gestures 116 such that the integrated processing system can interpret the gestures and generate control commands for the HWC 102. In another example, the HWC 102 may have sensors that detect movement (e.g. a nod, head shake, and the like) including accelerometers, gyros and other inertial measurements, where the integrated processor may interpret the movement and generate a control command in response. The HWC 102 may also automatically control itself based on measured or perceived environmental conditions. For example, if it is bright in the environment the HWC 102 may increase the brightness or contrast of the displayed image. In embodiments, the integrated control technologies may be mounted on the HWC 102 such that a user can interact with it directly. For example, the HWC 102 may have a button(s), touch capacitive interface, and the like.
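The automatic brightness adjustment described above can be sketched as a simple mapping from an ambient-light reading to a display level. This is a minimal illustration, not the disclosed implementation; the sensor range, logarithmic scaling, and function names are assumptions.

```python
import math

def adjust_display(ambient_lux, min_brightness=0.2, max_brightness=1.0):
    """Map an ambient-light reading (in lux) to a display brightness level.

    Brighter surroundings call for a brighter display so that the overlaid
    digital content remains visible against the see-through scene.
    """
    # Clamp the reading to a working range (dim indoors to full daylight).
    lux = max(10.0, min(float(ambient_lux), 10000.0))
    # Scale logarithmically, since perceived brightness is roughly log-linear:
    # 10 lux maps to 0.0 and 10,000 lux maps to 1.0 on the scale.
    fraction = (math.log10(lux) - 1.0) / 3.0
    return min_brightness + fraction * (max_brightness - min_brightness)
```

A contextually aware HWC might feed such a function from its spectral illumination detector and raise display contrast in the same control loop.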
As described herein, the HWC 102 may be in communication with external user interfaces 104. The external user interfaces may come in many different forms. For example, a cell phone screen may be adapted to take user input for control of an aspect of the HWC 102. The external user interface may be a dedicated UI, such as a keyboard, touch surface, button(s), joystick, and the like. In embodiments, the external controller may be integrated into another device such as a ring, watch, bike, car, and the like. In each case, the external user interface 104 may include sensors (e.g. IMU, accelerometers, compass, altimeter, and the like) to provide additional input for controlling the HWC 102.
As described herein, the HWC 102 may control or coordinate with other local devices 108. The external devices 108 may be an audio device, visual device, vehicle, cell phone, computer, and the like. For instance, the local external device 108 may be another HWC 102, where information may then be exchanged between the separate HWCs 102.
Similar to the way the HWC 102 may control or coordinate with local devices 108, the HWC 102 may control or coordinate with remote devices 112, such as the HWC 102 communicating with the remote devices 112 through a network 110. Again, the remote device 112 may take many forms. Included in these forms is another HWC 102. For example, each HWC 102 may communicate its GPS position such that all the HWCs 102 know where all of the HWCs 102 are located.
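The position-sharing arrangement above can be sketched as a small registry that each HWC updates over the network. The class and field names here are hypothetical, chosen only to illustrate the exchange, and the networking itself is abstracted away.

```python
from dataclasses import dataclass

@dataclass
class Position:
    device_id: str
    lat: float
    lon: float

class PositionRegistry:
    """Shared store that each HWC updates with its GPS fix so that every
    connected HWC can look up where the others are located."""

    def __init__(self):
        self._positions = {}

    def report(self, pos):
        # Each HWC periodically reports its own position over the network.
        self._positions[pos.device_id] = pos

    def peers(self, device_id):
        # The last reported position of every other known HWC.
        return [p for d, p in self._positions.items() if d != device_id]
```

In practice the registry could live on a server behind the network 110, or be replicated peer-to-peer among the HWCs.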
The waveguide 302 of
The optical stack of
In embodiments, the inner protective layer 402 may itself include a vision corrective portion. The inner protective layer could be formed out of polycarbonate, or other suitable material, and shaped into the corrective prescription for the user. Then the vision corrected inner protective layer could be attached to the waveguide 302 such that the air gap 412 is preserved. This would eliminate the need for a separate material to be applied to the inner protective layer 402. Of course, this configuration may require a more involved manufacturing or user process for installing the vision corrected inner protective layer 402.
As illustrated in
The inventors discovered that when applying electrochromic surfaces to glasses formats, there are significant difficulties. The electrochromic surface tends not to apply well to complicated shapes, including compound radii like a standard corrective glass lens or sunglass lens. It becomes somewhat easier to apply the surface to a single curve in a surface. It is easiest, and produces the best results, when applied to a flat planar surface. In embodiments, the air gap design described herein may be used in connection with any shaped electrochromic surface.
In embodiments, the outer protective layer 404 may include photochromic material(s). This would provide auto-dimming based on the intensity of the scene light. A photochromic layer may be provided in a separate layer on either side of the outer protective layer. Typically, the photochromic layer would be positioned further from the user's eye than the electrochromic layer such that the electrochromic layer does not affect the performance of the photochromic layer.
In embodiments, anti-reflective coatings may be applied to any or all of the optical stack's surfaces illustrated in connection with
In embodiments, the inner and outer protective layers 402 and 404 may be applied to the waveguide 302 without leaving the air gaps 412 by using a material for the protective layers that substantially matches the index of refraction of the material used for the waveguide 302. By using an index matching material, the total internal reflection of the waveguide may occur at the outer surfaces of the protective layers. In such a configuration, an air gap may be provided between the inner protective layer 402 and the corrective optic 410. Further, in such a configuration an air gap may be provided between the outer protective layer 404 and the electrochromic layer 408.
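The role of index matching can be illustrated numerically with Snell's law: when a protective layer matches the waveguide's refractive index, light crosses the bond without refraction, and total internal reflection simply moves to the layer's outer air surface at an unchanged critical angle. The index values below are illustrative only, not the materials of the disclosed stack.

```python
import math

def critical_angle_deg(n_inside, n_outside=1.0):
    """Critical angle for total internal reflection at a boundary, in degrees."""
    if n_outside >= n_inside:
        return None  # light always escapes; no TIR possible at this boundary
    return math.degrees(math.asin(n_outside / n_inside))

# Bare waveguide (illustrative glass, n ~ 1.52) against air:
bare = critical_angle_deg(1.52)

# An index-matched protective layer (also n ~ 1.52) adds no refraction at
# the bond, so TIR occurs at the layer's outer air surface at the same angle:
matched = critical_angle_deg(1.52)

# A mismatched layer (n ~ 1.40) bonded without an air gap would raise the
# critical angle at the bond, disturbing the waveguide's designed trapping:
mismatched = critical_angle_deg(1.52, 1.40)
```

This is why a mismatched protective layer needs the air gap 412, while an index-matched layer can be bonded directly.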
In embodiments, the waveguide 302, or portions thereof, may be made of chemically treated glass to increase the waveguide's strength (e.g. Gorilla Glass).
In an embodiment, a head-worn see-through computer display may include a glass waveguide having a first inner surface, the first inner surface having a planar area at least in a region where image light is projected from the glass waveguide towards an eye of a user, the glass waveguide further configured such that image light transmits from it at approximately 90 degrees as referenced to the first inner surface, a protective inner layer positioned between the glass waveguide and the eye of the user, wherein the protective inner layer is further positioned to provide a first air gap between the glass waveguide and the protective inner layer, and a vision corrective optic mounted on the protective inner layer and positioned between the protective inner layer and the eye of the user. The glass waveguide may include at least one holographic surface. The at least one holographic surface may include a plurality of holographic surfaces. The glass waveguide may be positioned vertically in front of the eye of the user. The protective inner layer may have an outer surface upon which the vision corrective optic is mounted and the outer surface may be positioned vertically in front of the eye of the user. The head-worn see-through computer display may further include a protective outer layer positioned on a waveguide side opposite the protective inner layer, wherein the protective outer layer may be further positioned to provide a second air gap between the protective outer layer and the glass waveguide. The head-worn see-through computer display may further include an electrochromic surface controlled by a processor to controllably block at least a portion of scene light from reaching the glass waveguide. The electrochromic surface may be positioned between the protective outer layer and the glass waveguide. The electrochromic surface may be applied to the protective outer layer and the second air gap may be between the electrochromic surface and the glass waveguide.
The protective outer layer may be photochromic. The vision corrective optic may include an elastomeric optic that attaches to the protective inner layer with surface tension.
Although embodiments of HWC have been described in language specific to features, systems, computer processes and/or methods, the appended claims are not necessarily limited to the specific features, systems, computer processes and/or methods described. Rather, the specific features, systems, computer processes and/or methods are disclosed as non-limiting example implementations of HWC. All documents referenced herein are hereby incorporated by reference.
This application is a continuation of U.S. application Ser. No. 16/393,851, filed on Apr. 24, 2019, which claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 62/661,720, filed Apr. 24, 2018, the contents of which are incorporated by reference herein in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
4852988 | Velez | Aug 1989 | A |
6170952 | La | Jan 2001 | B1 |
6433760 | Vaissie | Aug 2002 | B1 |
6491391 | Blum et al. | Dec 2002 | B1 |
6847336 | Lemelson | Jan 2005 | B1 |
6943754 | Aughey | Sep 2005 | B2 |
6977776 | Volkenandt et al. | Dec 2005 | B2 |
7347551 | Fergason et al. | Mar 2008 | B2 |
7488294 | Torch | Feb 2009 | B2 |
7758185 | Lewis | Jul 2010 | B2 |
8235529 | Raffle | Aug 2012 | B1 |
8353594 | Lewis | Jan 2013 | B2 |
8611015 | Wheeler | Dec 2013 | B2 |
8638498 | Bohn et al. | Jan 2014 | B2 |
8696113 | Lewis | Apr 2014 | B2 |
8733927 | Lewis | May 2014 | B1 |
8733928 | Lewis | May 2014 | B1 |
8929589 | Publicover et al. | Jan 2015 | B2 |
9010929 | Lewis | Apr 2015 | B2 |
9235064 | Lewis | Jan 2016 | B2 |
9239473 | Lewis | Jan 2016 | B2 |
9244293 | Lewis | Jan 2016 | B2 |
9274338 | Robbins et al. | Mar 2016 | B2 |
9292973 | Bar-zeev et al. | Mar 2016 | B2 |
9323325 | Perez et al. | Apr 2016 | B2 |
9658473 | Lewis | May 2017 | B2 |
9720505 | Gribetz et al. | Aug 2017 | B2 |
10013053 | Cederlund et al. | Jul 2018 | B2 |
10025379 | Drake et al. | Jul 2018 | B2 |
10151937 | Lewis | Dec 2018 | B2 |
10185147 | Lewis | Jan 2019 | B2 |
11204501 | Osterhout | Dec 2021 | B2 |
20030030597 | Geist | Feb 2003 | A1 |
20060023158 | Howell et al. | Feb 2006 | A1 |
20070008624 | Hirayama | Jan 2007 | A1 |
20110211056 | Publicover et al. | Sep 2011 | A1 |
20110213664 | Osterhout | Sep 2011 | A1 |
20110221656 | Haddick | Sep 2011 | A1 |
20120021806 | Maltz | Jan 2012 | A1 |
20130077147 | Efimov | Mar 2013 | A1 |
20130286053 | Fleck et al. | Oct 2013 | A1 |
20140011829 | Steele | Jan 2014 | A1 |
20140195918 | Friedlander | Jul 2014 | A1 |
20150168731 | Robbins | Jun 2015 | A1 |
20150277126 | Hirano et al. | Oct 2015 | A1 |
20160055822 | Bell | Feb 2016 | A1 |
20160170209 | Border | Jun 2016 | A1 |
20170192238 | Riedel et al. | Jul 2017 | A1 |
20170220865 | Osterhout | Aug 2017 | A1 |
20170344114 | Osterhout | Nov 2017 | A1 |
20180011324 | Popovich | Jan 2018 | A1 |
20190278086 | Ofir | Sep 2019 | A1 |
Number | Date | Country |
---|---|---|
2316473 | Jan 2001 | CA |
2362895 | Dec 2002 | CA |
2388766 | Dec 2003 | CA |
102906623 | Jan 2013 | CN |
106662747 | May 2017 | CN |
2009145513 | Jul 2009 | JP |
2015184560 | Oct 2015 | JP |
2017514172 | Jun 2017 | JP |
2005088384 | Sep 2005 | WO |
2012118573 | Sep 2012 | WO |
2017157807 | Sep 2017 | WO |
2017199232 | Nov 2017 | WO |
2017223121 | Dec 2017 | WO |
2019226269 | Nov 2019 | WO |
Entry |
---|
Japanese Office Action dated Mar. 10, 2023, for JP Application No. 2020-560406, with English translation, 8 pages. |
Extended European Search Report dated May 27, 2021, for EP Application No. 19807779.4, eight pages. |
International Preliminary Report on Patentability dated Oct. 27, 2020, for PCT Application No. PCT/US2019/28999, filed Apr. 24, 2019, six pages. |
International Search Report and Written Opinion, dated Dec. 27, 2019, for PCT Application No. PCT/US2019/28999, filed Apr. 24, 2019, eight pages. |
Non-Final Office Action dated Apr. 15, 2021, for U.S. Appl. No. 16/393,851, filed Apr. 24, 2019, 16 pages. |
Notice of Allowance dated Aug. 16, 2021, for U.S. Appl. No. 16/393,851, filed Apr. 24, 2019, nine pages. |
Chinese Office Action dated Jan. 6, 2023, for CN Application No. 201980028437.7, with English translation, 18 pages. |
Jacob, R. “Eye Tracking in Advanced Interface Design”, Virtual Environments and Advanced Interface Design, Oxford University Press, Inc. (Jun. 1995). |
Rolland, J. et al., “High-resolution inset head-mounted display”, Applied Optics, vol. 37, no. 19, Optical Society of America (Jul. 1, 1998). |
Tanriverdi, V. et al. (Apr. 2000). “Interacting With Eye Movements In Virtual Environments,” Department of Electrical Engineering and Computer Science, Tufts University, Medford, MA 02155, USA, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, eight pages. |
Yoshida, A. et al., “Design and Applications of a High Resolution Insert Head Mounted Display”, (Jun. 1994). |
Chinese Office Action dated May 9, 2022, for CN Application No. 201980028437.7, with English translation, 12 pages. |
European Office Action dated Sep. 15, 2023, for EP Application No. 19807779.4, seven pages. |
Japanese Final Office Action dated Sep. 29, 2023, for JP Application No. 2020-560406, with English translation, 15 pages. |
Number | Date | Country | |
---|---|---|---|
20220075197 A1 | Mar 2022 | US |
Number | Date | Country | |
---|---|---|---|
62661720 | Apr 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16393851 | Apr 2019 | US |
Child | 17528059 | US |