This invention relates to head worn computing. More particularly, this invention relates to stray light suppression systems used in head worn computing.
Wearable computing systems have been developed and are beginning to be commercialized. Many problems persist in the wearable computing field that need to be resolved before such systems can meet the demands of the market.
Aspects of the present invention relate to stray light control systems in head worn computing.
These and other systems, methods, objects, features, and advantages of the present invention will be apparent to those skilled in the art from the following detailed description of the preferred embodiment and the drawings. All documents mentioned herein are hereby incorporated in their entirety by reference.
Embodiments are described with reference to the following Figures. The same numbers may be used throughout to reference like features and components that are shown in the Figures:
While the invention has been described in connection with certain preferred embodiments, other embodiments would be understood by one of ordinary skill in the art and are encompassed herein.
Aspects of the present invention relate to head-worn computing (“HWC”) systems. HWC involves, in some instances, a system that mimics the appearance of head-worn glasses or sunglasses. The glasses may be a fully developed computing platform, such as including computer displays presented in each of the lenses of the glasses to the eyes of the user. In embodiments, the lenses and displays may be configured to allow a person wearing the glasses to see the environment through the lenses while also seeing, simultaneously, digital imagery, which forms an overlaid image that is perceived by the person as a digitally augmented image of the environment, or augmented reality (“AR”).
HWC involves more than just placing a computing system on a person's head. The system may need to be designed as a lightweight, compact and fully functional computer display, such as wherein the computer display includes a high resolution digital display that provides a high level of immersion comprised of the displayed digital content and the see-through view of the environmental surroundings. User interfaces and control systems suited to the HWC device may be required that are unlike those used for a more conventional computer such as a laptop. For the HWC and associated systems to be most effective, the glasses may be equipped with sensors to determine environmental conditions, geographic location, relative positioning to other points of interest, objects identified by imaging and movement by the user or other users in a connected group, and the like. The HWC may then change the mode of operation to match the conditions, location, positioning, movements, and the like, in a method generally referred to as a contextually aware HWC. The glasses also may need to be connected, wirelessly or otherwise, to other systems either locally or through a network. Controlling the glasses may be achieved through the use of an external device, automatically through contextually gathered information, through user gestures captured by the glasses sensors, and the like. Each technique may be further refined depending on the software application being used in the glasses. The glasses may further be used to control or coordinate with external devices that are associated with the glasses.
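For illustration only, the following is a minimal sketch of how sensed conditions might be mapped to an operating mode in a contextually aware HWC. The names (SensorContext, select_mode), mode labels, and thresholds are assumptions introduced here for the example and are not part of the described invention.

```python
# Illustrative sketch: combining sensor readings to pick an HWC operating mode.
# All names, modes and thresholds are assumed for this example.
from dataclasses import dataclass

@dataclass
class SensorContext:
    ambient_lux: float   # from an ambient light sensor
    speed_m_s: float     # from GPS / IMU fusion
    indoors: bool        # e.g. inferred from GPS signal quality

def select_mode(ctx: SensorContext) -> str:
    """Pick an operating mode from the current environmental context."""
    if ctx.speed_m_s > 3.0:
        return "navigation"   # moving quickly; favor sparse, glanceable content
    if ctx.ambient_lux < 10.0:
        return "low_light"    # dark environment; dim displays, limit stray light
    if ctx.indoors:
        return "productivity" # stationary and indoors; allow dense content
    return "outdoor"          # bright outdoor default

if __name__ == "__main__":
    print(select_mode(SensorContext(ambient_lux=5.0, speed_m_s=0.2, indoors=False)))  # -> low_light
```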
Referring to
We will now describe each of the main elements depicted on
The HWC 102 is a computing platform intended to be worn on a person's head. The HWC 102 may take many different forms to fit many different functional requirements. In some situations, the HWC 102 will be designed in the form of conventional glasses. The glasses may or may not have active computer graphics displays. In situations where the HWC 102 has integrated computer displays the displays may be configured as see-through displays such that the digital imagery can be overlaid with respect to the user's view of the environment 114. There are a number of see-through optical designs that may be used, including ones that have a reflective display (e.g. LCoS, DLP), emissive displays (e.g. OLED, LED), hologram, TIR waveguides, and the like. In addition, the optical configuration may be monocular or binocular. It may also include vision corrective optical components. In embodiments, the optics may be packaged as contact lenses. In other embodiments, the HWC 102 may be in the form of a helmet with a see-through shield, sunglasses, safety glasses, goggles, a mask, fire helmet with see-through shield, police helmet with see-through shield, military helmet with see-through shield, utility form customized to a certain work task (e.g. inventory control, logistics, repair, maintenance, etc.), and the like.
The HWC 102 may also have a number of integrated computing facilities, such as an integrated processor, integrated power management, communication structures (e.g. cell net, WiFi, Bluetooth, local area connections, mesh connections, remote connections (e.g. client server, etc.)), and the like. The HWC 102 may also have a number of positional awareness sensors, such as GPS, electronic compass, altimeter, tilt sensor, IMU, and the like. It may also have other sensors such as a camera, rangefinder, hyper-spectral camera, Geiger counter, microphone, spectral illumination detector, temperature sensor, chemical sensor, biologic sensor, moisture sensor, ultrasonic sensor, and the like.
The HWC 102 may also have integrated control technologies. The integrated control technologies may include contextually based control, passive control, active control, user control, and the like. For example, the HWC 102 may have an integrated sensor (e.g. camera) that captures user hand or body gestures 116 such that the integrated processing system can interpret the gestures and generate control commands for the HWC 102. In another example, the HWC 102 may have sensors that detect movement (e.g. a nod, head shake, and the like) including accelerometers, gyros and other inertial measurements, where the integrated processor may interpret the movement and generate a control command in response. The HWC 102 may also automatically control itself based on measured or perceived environmental conditions. For example, if it is bright in the environment the HWC 102 may increase the brightness or contrast of the displayed image. In embodiments, the integrated control technologies may be mounted on the HWC 102 such that a user can interact with it directly. For example, the HWC 102 may have a button(s), touch capacitive interface, and the like.
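As one hypothetical illustration of the automatic environmental control mentioned above (brightening the display in a bright environment), a simple clamped mapping from measured ambient light to display brightness could look like the following. The function name, constants and units are assumptions for the example.

```python
# Illustrative sketch: map ambient light level to a normalized display brightness.
# Constants are assumed values, not parameters from the patent.
def display_brightness(ambient_lux: float,
                       min_level: float = 0.05,
                       max_level: float = 1.0,
                       full_bright_lux: float = 10000.0) -> float:
    """Return a normalized display brightness in [min_level, max_level]."""
    level = ambient_lux / full_bright_lux          # simple linear mapping
    return max(min_level, min(max_level, level))   # clamp to the allowed range

print(display_brightness(200.0))     # dim indoor lighting -> low display brightness
print(display_brightness(30000.0))   # direct sunlight -> clamped to maximum
```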
As described herein, the HWC 102 may be in communication with external user interfaces 104. The external user interfaces may come in many different forms. For example, a cell phone screen may be adapted to take user input for control of an aspect of the HWC 102. The external user interface may be a dedicated UI, such as a keyboard, touch surface, button(s), joy stick, and the like. In embodiments, the external controller may be integrated into another device such as a ring, watch, bike, car, and the like. In each case, the external user interface 104 may include sensors (e.g. IMU, accelerometers, compass, altimeter, and the like) to provide additional input for controlling the HWC 102.
As described herein, the HWC 102 may control or coordinate with other local devices 108. The external devices 108 may be an audio device, visual device, vehicle, cell phone, computer, and the like. For instance, the local external device 108 may be another HWC 102, where information may then be exchanged between the separate HWCs.
Similar to the way the HWC 102 may control or coordinate with local devices 108, the HWC 102 may control or coordinate with remote devices 112, such as the HWC 102 communicating with the remote devices 112 through a network 110. Again, the remote device 112 may take many forms. Included in these forms is another HWC 102. For example, each HWC 102 may communicate its GPS position such that all the HWCs 102 know where all of the HWCs 102 are located.
The light that is provided by the polarized light source 302, which is subsequently reflected by the reflective polarizer 310 before it reflects from the DLP 304, will generally be referred to as illumination light. The light that is reflected by the “off” pixels of the DLP 304 is reflected at a different angle than the light reflected by the “on” pixels, so that the light from the “off” pixels is generally directed away from the optical axis of the field lens 312 and toward the side of the upper optical module 202 as shown in
The DLP 304 operates as a computer controlled display and is generally thought of as a MEMS device. The DLP pixels are comprised of small mirrors that can be directed. The mirrors generally flip from one angle to another angle. The two angles are generally referred to as states. When light is used to illuminate the DLP the mirrors will reflect the light in a direction depending on the state. In embodiments herein, we generally refer to the two states as “on” and “off,” which is intended to depict the condition of a display pixel. “On” pixels will be seen by a viewer of the display as emitting light because the light is directed along the optical axis and into the field lens and the associated remainder of the display system. “Off” pixels will be seen by a viewer of the display as not emitting light because the light from these pixels is directed to the side of the optical housing and into a light dump where the light is absorbed. The pattern of “on” and “off” pixels produces image light that is perceived by a viewer of the display as a computer generated image. Full color images can be presented to a user by sequentially providing illumination light with complementary colors such as red, green and blue. The sequence is presented in a recurring cycle that is faster than the user can perceive as separate images, and as a result the user perceives a full color image comprised of the sum of the sequential images. Bright pixels in the image are provided by pixels that remain in the “on” state for the entire time of the cycle, while dimmer pixels in the image are provided by pixels that switch between the “on” state and “off” state within the time of the cycle.
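As a simplified illustration of the brightness behavior described above, the perceived intensity of a DLP pixel follows from the fraction of a color field cycle that its mirror spends in the “on” state. The timing values and function name below are assumptions for the example, not figures from the patent.

```python
# Illustrative sketch: per-pixel brightness as "on"-state duty cycle within one color field.
def perceived_intensity(on_time_us: float, cycle_time_us: float) -> float:
    """Fraction of the cycle the mirror is 'on' equals the perceived relative intensity."""
    if not 0 <= on_time_us <= cycle_time_us:
        raise ValueError("on time must lie within the cycle")
    return on_time_us / cycle_time_us

# One full-color frame is built from sequential red, green and blue fields.
cycle_us = 2777.0  # assumed duration of one color field (e.g. a 120 Hz, 3-field frame)
print(perceived_intensity(cycle_us, cycle_us))         # 1.0  -> a fully bright pixel
print(perceived_intensity(0.25 * cycle_us, cycle_us))  # 0.25 -> a dimmer pixel
```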
The configuration illustrated in
The configuration illustrated in
Critical angle = arcsin(1/n)    (Eqn. 1)
Where the critical angle is the angle beyond which the illumination light is reflected from the internal surface when the internal surface comprises an interface from a solid with a higher refractive index to air with a refractive index of 1 (e.g. for an interface of acrylic, with a refractive index of 1.5, to air, the critical angle is 41.8 degrees; for an interface of polycarbonate, with a refractive index of 1.59, to air the critical angle is 38.9 degrees). Consequently, the TIR wedge 418 is associated with a thin air gap 408 along the internal surface to create an interface between a solid with a higher refractive index and air. By choosing the angle of the light source 404 relative to the DLP 402 in correspondence to the angle of the internal surface of the TIR wedge 418, illumination light is turned toward the DLP 402 at an angle suitable for providing image light as reflected from “on” pixels. The illumination light is provided to the DLP 402 at approximately twice the angle of the pixel mirrors in the DLP 402 that are in the “on” state, such that after reflecting from the pixel mirrors, the image light is directed generally along the optical axis of the field lens. Depending on the state of the DLP pixels, the illumination light from “on” pixels may be reflected as image light 414 which is directed towards a field lens and a lower optical module 204, while illumination light reflected from “off” pixels (dark state light) is directed in a separate direction 410, which may be trapped and not used for the image that is ultimately presented to the wearer's eye.
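The example values quoted above can be checked numerically against Eqn. 1; the short sketch below reproduces the approximately 41.8 degree (acrylic, n = 1.5) and approximately 38.9 degree (polycarbonate, n = 1.59) critical angles to within rounding.

```python
# Numerical check of Eqn. 1 for the refractive indices given in the text.
import math

def critical_angle_deg(n: float) -> float:
    """Critical angle for total internal reflection at a solid-to-air interface (Eqn. 1)."""
    return math.degrees(math.asin(1.0 / n))

print(f"acrylic (n=1.5): {critical_angle_deg(1.5):.1f} degrees")        # ~41.8
print(f"polycarbonate (n=1.59): {critical_angle_deg(1.59):.1f} degrees") # ~39, vs ~38.9 quoted above
```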
The light trap may be located along the optical axis defined by the direction 410 and in the side of the housing, with the function of absorbing the dark state light. To this end, the light trap may be comprised of an area outside of the cone of image light from the “on” pixels. The light trap is typically made up of materials that absorb light, including coatings of black paint or other light absorbing materials, to prevent light scattering from the dark state light from degrading the image perceived by the user. In addition, the light trap may be recessed into the wall of the housing or include masks or guards to block scattered light and prevent the light trap from being viewed adjacent to the displayed image.
The embodiment of
The embodiment illustrated in
The combiner 602 may include a holographic pattern, to form a holographic mirror. If a monochrome image is desired, there may be a single wavelength reflection design for the holographic pattern on the surface of the combiner 602. If the intention is to have multiple colors reflected from the surface of the combiner 602, a multiple wavelength holographic mirror may be included on the combiner surface. For example, in a three color embodiment, where red, green and blue pixels are generated in the image light, the holographic mirror may be reflective to wavelengths matching the wavelengths of the red, green and blue light provided by the light source. This configuration can be used as a wavelength specific mirror where pre-determined wavelengths of light from the image light are reflected to the user's eye. This configuration may also be made such that substantially all other wavelengths in the visible spectrum pass through the combiner element 602 so the user has a substantially clear view of the surroundings when looking through the combiner element 602. The transparency between the user's eye and the surroundings may be approximately 80% when using a combiner that is a holographic mirror. Holographic mirrors can be made using lasers to produce interference patterns in the holographic material of the combiner, where the wavelengths of the lasers correspond to the wavelengths of light that are subsequently reflected by the holographic mirror.
In another embodiment, the combiner element 602 may include a notch mirror comprised of a multilayer coated substrate wherein the coating is designed to substantially reflect the wavelengths of light provided by the light source and substantially transmit the remaining wavelengths in the visible spectrum. For example, in the case where red, green and blue light is provided by the light source to enable full color images to be provided to the user, the notch mirror is a tristimulus notch mirror wherein the multilayer coating is designed to reflect narrow bands of red, green and blue light that are matched to what is provided by the light source, and the remaining visible wavelengths are transmitted to enable a view of the environment through the combiner. In another example where monochrome images are provided to the user, the notch mirror is designed to reflect a narrow band of light that is matched to the wavelengths of light provided by the light source while transmitting the remaining visible wavelengths to enable a see-through view of the environment. The combiner 602 with the notch mirror would operate, from the user's perspective, in a manner similar to the combiner that includes a holographic pattern on the combiner element 602. The combiner, with the tristimulus notch mirror, would reflect the “on” pixels to the eye because of the match between the reflective wavelengths of the notch mirror and the color of the image light, and the wearer would be able to see the surroundings with high clarity. The transparency between the user's eye and the surroundings may be approximately 80% when using the tristimulus notch mirror. In addition, the image provided by the upper optical module 202 with the notch mirror combiner can provide higher contrast images than the holographic mirror combiner due to less scattering of the imaging light by the combiner.
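A schematic way to think about the tristimulus notch mirror behavior described above is a set of narrow reflection bands centered on the source's red, green and blue wavelengths, with all other visible wavelengths transmitted. The band centers and widths below are invented for illustration and do not represent an actual coating design.

```python
# Schematic model (not a real coating design) of a tristimulus notch mirror:
# three narrow reflection bands; all other visible light is treated as transmitted.
NOTCH_BANDS_NM = [(455, 20), (525, 20), (625, 20)]  # (center, full width) per band, assumed values

def reflects(wavelength_nm: float) -> bool:
    """True if the modeled notch mirror reflects this wavelength toward the eye."""
    return any(abs(wavelength_nm - center) <= width / 2 for center, width in NOTCH_BANDS_NM)

print(reflects(526))  # True: green image light is directed to the wearer's eye
print(reflects(580))  # False: yellow ambient light passes through (see-through view)
```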
Light can escape through the combiner 602 and may produce face glow as the light is generally directed downward onto the cheek of the user. When using a holographic mirror combiner or a tristimulus notch mirror combiner, the escaping light can be trapped to avoid face glow. In embodiments, if the image light is polarized before the combiner 602, a linear polarizer can be laminated, or otherwise associated, to the combiner 602 (for example, the polarizer can be laminated to the side of the combiner that is away from the user's eye), with the transmission axis of the polarizer oriented relative to the polarized image light so that any escaping image light is absorbed by the polarizer. In embodiments, the image light would be polarized to provide S polarized light to the combiner 602 for better reflection. As a result, the linear polarizer on the combiner 602 would be oriented to absorb S polarized light and pass P polarized light. This matches the preferred orientation of polarized sunglasses, as this orientation also absorbs light reflected from the surfaces of lakes and ponds. In a preferred embodiment, the polarizer is combined with a tristimulus notch mirror combiner.
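The suppression of escaping S polarized image light by a crossed linear polarizer can be illustrated with Malus's law, where the transmitted fraction goes as the cosine squared of the angle between the light's polarization and the polarizer's transmission axis. This is a general optics relationship used here only to illustrate the orientation choice described above.

```python
# Malus's law for an ideal linear polarizer: transmitted fraction = cos^2(angle).
import math

def transmitted_fraction(angle_between_deg: float) -> float:
    """Fraction of linearly polarized light passed by an ideal linear polarizer."""
    return math.cos(math.radians(angle_between_deg)) ** 2

print(transmitted_fraction(90.0))  # escaping S polarized image light, crossed axis: ~0 (absorbed)
print(transmitted_fraction(0.0))   # P polarized light, aligned axis: 1.0 (passed)
```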
If the image light is unpolarized, a microlouvered film such as a privacy filter (for example 3M ALCF: http://products3.3m.com/catalog/us/en001/electronics_mfg/vikuiti/node_PSG4KNNLC2be/root_GST1T4S9TCgv/vroot_S6Q2FD9X0Jge/gvel_ZF5G3RNK7Bgl/theme_us_vikuiti_3_0/command_AbcPageHandler/output_html) can be used to absorb the escaping image light while providing the user with a see-through view of the environment. In this case, the absorbance or transmittance of the microlouvered film is dependent on the angle of the light, where steep angle light is absorbed by the microlouvered film and light at less of an angle is transmitted by the microlouvered film. For this reason, in an embodiment, the combiner 602 with the microlouver film is angled at greater than 45 degrees, as shown in
Another aspect of the present invention relates to eye imaging. In embodiments, a camera is used in connection with an upper optical module 202 such that the wearer's eye can be imaged using pixels in the “off” state on the DLP.
In embodiments, the eye imaging camera may image the wearer's eye at a moment in time where there are enough “off” pixels to achieve the required eye image resolution. In another embodiment, the eye imaging camera collects eye image information from “off” pixels over time and forms a time lapsed image. In another embodiment, a modified image is presented to the user wherein enough “off” state pixels are included that the camera can obtain the desired resolution and brightness for imaging the wearer's eye and the eye image capture is synchronized with the presentation of the modified image.
The eye imaging system may be used for security systems. The HWC may not allow access to the HWC or other system if the eye is not recognized (e.g. through eye characteristics including retina or iris characteristics, etc.). The HWC may be used to provide constant security access in some embodiments. For example, the eye security confirmation may be a continuous, near-continuous, real-time, quasi real-time, periodic, etc. process so the wearer is effectively constantly being verified as known. In embodiments, the HWC may be worn and eye security tracked for access to other computer systems.
The eye imaging system may be used for control of the HWC. For example, a blink, wink, or particular eye movement may be used as a control mechanism for a software application operating on the HWC or associated device.
The eye imaging system may be used in a process that determines how or when the HWC 102 delivers digitally displayed content to the wearer. For example, the eye imaging system may determine that the user is looking in a direction and the HWC may then change the resolution in an area of the display or provide some content that is associated with something in the environment that the user may be looking at. Alternatively, the eye imaging system may identify different users and change the displayed content or enabled features provided to the user. Users may be identified from a database of users' eye characteristics either located on the HWC 102 or remotely located on the network 110 or on a server 112. In addition, the HWC may identify a primary user or a group of primary users from eye characteristics wherein the primary user(s) are provided with an enhanced set of features and all other users are provided with a different set of features. Thus in this use case, the HWC 102 uses identified eye characteristics to either enable features or not, and eye characteristics need only be analyzed in comparison to a relatively small database of individual eye characteristics.
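A minimal sketch of the feature-gating use case described above is shown below. The data structures, the signature strings, and the exact-match lookup are placeholders for illustration only; they stand in for, and do not represent, an actual iris or retina recognition algorithm or database format.

```python
# Illustrative sketch: an eye signature selects which feature set the HWC enables.
# Signatures, user records and feature names are invented placeholders.
PRIMARY_FEATURES = {"full_display", "peripheral_lighting", "remote_sync"}
GUEST_FEATURES = {"basic_display"}

EYE_DB = {
    "a3f9c1": ("alice", True),    # signature -> (user, is_primary); values are illustrative
    "77be02": ("guest1", False),
}

def features_for(eye_signature: str) -> set:
    record = EYE_DB.get(eye_signature)
    if record is None:
        return set()                        # unrecognized eye: no access granted
    _, is_primary = record
    return PRIMARY_FEATURES if is_primary else GUEST_FEATURES

print(features_for("a3f9c1"))  # primary user  -> enhanced feature set
print(features_for("000000"))  # unknown eye   -> empty set (access denied)
```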
Another aspect of the present invention relates to the generation of peripheral image lighting effects for a person wearing a HWC. In embodiments, a solid state lighting system (e.g. LED, OLED, etc.), or other lighting system, may be included inside the optical elements of a lower optical module 204. The solid state lighting system may be arranged such that lighting effects outside of a field of view (FOV) of the presented digital content are presented to create an immersive effect for the person wearing the HWC. To this end, the lighting effects may be presented to any portion of the HWC that is visible to the wearer. The solid state lighting system may be digitally controlled by an integrated processor on the HWC. In embodiments, the integrated processor will control the lighting effects in coordination with digital content that is presented within the FOV of the HWC. For example, a movie, picture, game, or other content, may be displayed or playing within the FOV of the HWC. The content may show a bomb blast on the right side of the FOV and at the same moment, the solid state lighting system inside of the upper module optics may flash quickly in concert with the FOV image effect. The effect need not be fast; it may be more persistent, indicating, for example, a general glow or color on one side of the user. The solid state lighting system may be color controlled, with red, green and blue LEDs, for example, such that color control can be coordinated with the digitally presented content within the field of view.
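A hypothetical sketch of coordinating the peripheral lighting with a content event (e.g. a blast on the right side of the FOV triggering a brief flash on the right-side LEDs) follows. The ContentEvent structure, the LED-driver callback, and the side/bank naming are assumptions made for the example; no specific hardware interface is implied by the description above.

```python
# Illustrative sketch: map a content event in the FOV to a peripheral lighting effect.
from dataclasses import dataclass

@dataclass
class ContentEvent:
    kind: str                     # e.g. "blast" or "glow"
    side: str                     # "left" or "right" side of the FOV
    rgb: tuple                    # (r, g, b) color, 0-255 each
    duration_ms: int              # flash duration; ignored for persistent effects

def drive_peripheral_leds(event: ContentEvent, set_led):
    """Translate a content event into a command for the peripheral LED bank outside the FOV."""
    bank = "right_bank" if event.side == "right" else "left_bank"
    if event.kind == "blast":
        set_led(bank, event.rgb, event.duration_ms)   # brief flash in concert with the image
    elif event.kind == "glow":
        set_led(bank, event.rgb, None)                # persistent glow on one side of the user

# Stand-in LED driver that just logs the command.
drive_peripheral_leds(ContentEvent("blast", "right", (255, 120, 0), 80),
                      lambda bank, rgb, duration_ms: print(bank, rgb, duration_ms))
```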
In the embodiment illustrated in
Another aspect of the present invention relates to the mitigation of light escaping from the space between the wearer's face and the HWC itself. Another aspect of the present invention relates to maintaining a controlled lighting environment in proximity to the wearer's eyes. In embodiments, both the maintenance of the lighting environment and the mitigation of light escape are accomplished by including a removable and replaceable flexible shield for the HWC. The removable and replaceable shield can be provided for one eye or both eyes in correspondence with the use of the displays for each eye. For example, in a night vision application, the display to only one eye could be used for night vision while the display to the other eye is turned off to provide good see-through when moving between areas where visible light is available and dark areas where night vision enhancement is needed.
In embodiments, an opaque front light shield 1412 may be included and the digital content may include images of the surrounding environment such that the wearer can visualize the surrounding environment. One eye may be presented with night vision environmental imagery and this eye's surrounding environment optical path may be covered using an opaque front light shield 1412. In other embodiments, this arrangement may be associated with both eyes.
Another aspect of the present invention relates to automatically configuring the lighting system(s) used in the HWC 102. In embodiments, the display lighting and/or effects lighting, as described herein, may be controlled in a manner suitable for when an eye cover 1408 is attached to or removed from the HWC 102. For example, at night, when the light in the environment is low, the lighting system(s) in the HWC may go into a low light mode to further control any amounts of stray light escaping from the HWC and the areas around the HWC. Covert operations at night, while using night vision or standard vision, may require a solution which prevents as much escaping light as possible, so a user may clip on the eye cover(s) 1408 and then the HWC may go into a low light mode. In some embodiments, the HWC may only go into the low light mode when the eye cover 1408 is attached if the HWC also identifies that the environment is in low light conditions (e.g. through environmental light level sensor detection). In embodiments, the low light level may be determined to be at an intermediate point between full and low light dependent on environmental conditions.
Another aspect of the present invention relates to automatically controlling the type of content displayed in the HWC when eye covers 1408 are attached or removed from the HWC. In embodiments, when the eye cover(s) 1408 is attached to the HWC, the displayed content may be restricted in amount or in color content. For example, the display(s) may go into a simple content delivery mode to restrict the amount of information displayed. This may be done to reduce the amount of light produced by the display(s). In an embodiment, the display(s) may change from color displays to monochrome displays to reduce the amount of light produced. In an embodiment, the monochrome lighting may be red to limit the impact on the wearer's eyes to maintain an ability to see better in the dark.
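Taken together with the preceding paragraph, the automatic behavior can be pictured as a simple configuration selection driven by eye cover attachment and the ambient light level. The function name, threshold, and mode labels in the sketch below are assumptions for illustration, not values from the description.

```python
# Illustrative sketch: when an eye cover is attached and the environment is dark,
# drop into a low light mode with simplified, red monochrome content.
def select_display_config(eye_cover_attached: bool, ambient_lux: float,
                          low_light_threshold_lux: float = 5.0) -> dict:
    low_light = eye_cover_attached and ambient_lux < low_light_threshold_lux
    return {
        "lighting_mode": "low" if low_light else "normal",
        "content_mode": "simple" if low_light else "full",          # restrict amount of content
        "color_mode": "monochrome_red" if low_light else "color",   # limit impact on dark adaptation
    }

print(select_display_config(eye_cover_attached=True, ambient_lux=0.3))    # covert night use
print(select_display_config(eye_cover_attached=True, ambient_lux=800.0))  # bright environment
```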
Although embodiments of HWC have been described in language specific to features, systems, computer processes and/or methods, the appended claims are not necessarily limited to the specific features, systems, computer processes and/or methods described. Rather, the specific features, systems, computer processes and/or methods are disclosed as non-limiting example implementations of HWC.
All documents referenced herein are hereby incorporated by reference.
This application claims the benefit of priority to and is a continuation of the following U.S. patent application, which is hereby incorporated by reference in its entirety: U.S. non-provisional Ser. No. 16/775,866 entitled STRAY LIGHT SUPPRESSION FOR HEAD WORN COMPUTING, filed Jan. 29, 2020, which is a continuation of U.S. non-provisional Ser. No. 15/904,487 entitled STRAY LIGHT SUPPRESSION FOR HEAD WORN COMPUTING, filed Feb. 26, 2018, now U.S. Pat. No. 10,578,874, which is a continuation of U.S. non-provisional Ser. No. 14/811,258 entitled STRAY LIGHT SUPPRESSION FOR HEAD WORN COMPUTING, filed Jul. 28, 2015, now U.S. Pat. No. 9,939,646, which is a continuation of U.S. non-provisional application Ser. No. 14/185,987, entitled STRAY LIGHT SUPPRESSION FOR HEAD WORN COMPUTING, filed Feb. 21, 2014, now U.S. Pat. No. 9,122,054, which is a continuation of U.S. non-provisional application Ser. No. 14/163,646, entitled PERIPHERAL LIGHTING FOR HEAD WORN COMPUTING, filed Jan. 24, 2014, now U.S. Pat. No. 9,400,390.