Adjustable nose bridge assembly for headworn computer

Information

  • Patent Grant
  • Patent Number: 10,690,936
  • Date Filed: Monday, August 29, 2016
  • Date Issued: Tuesday, June 23, 2020
Abstract
Aspects of the present invention relate to a head-worn computer, comprising a removable and replaceable adjustable nose bridge assembly, wherein the adjustable nose bridge assembly has at least three user adjustable features to adapt the adjustable nose bridge assembly to the user's nose, wherein a first adjustment of the at least three user adjustable features is adapted to move the adjustable nose bridge up and down relative to a lens of the head-worn computer, wherein a second adjustment of the at least three user adjustable features is adapted to rotate a nose pad of the adjustable nose bridge about an axis substantially perpendicular to a top frame of the head-worn computer, and wherein a third adjustment of the at least three user adjustable features is adapted to flare the nose pad to the side of the axis.
Description
BACKGROUND
Field of the Invention

This invention relates to head worn computing. More particularly, this invention relates to 3-way adjustable nose bridge assemblies for head-worn computers.


Description of Related Art

Wearable computing systems have been developed and are beginning to be commercialized. Many problems persist in the wearable computing field that need to be resolved before such systems can meet the demands of the market.


SUMMARY

Aspects of the present invention relate to 3-way adjustable nose bridge assemblies for head worn computers.


These and other systems, methods, objects, features, and advantages of the present invention will be apparent to those skilled in the art from the following detailed description of the preferred embodiment and the drawings. All documents mentioned herein are hereby incorporated in their entirety by reference.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments are described with reference to the following Figures. The same numbers may be used throughout to reference like features and components that are shown in the Figures:



FIG. 1 illustrates a head worn computing system in accordance with the principles of the present invention.



FIG. 2 illustrates a head worn computing system with optical system in accordance with the principles of the present invention.



FIG. 3 illustrates three views of a head worn computer in accordance with the principles of the present invention.



FIG. 4 illustrates a temple and ear horn in accordance with the principles of the present invention.



FIG. 5 illustrates a temple and ear horn assembly in various states in accordance with the principles of the present invention.



FIG. 6 illustrates an adjustable nose bridge assembly in accordance with the principles of the present invention.



FIG. 7 illustrates an adjustable nose bridge assembly in accordance with the principles of the present invention.



FIGS. 8-10 illustrate adjustable nose bridge assemblies in accordance with the principles of the present invention.



FIG. 11 illustrates a multiple adjustable nose pad assembly in accordance with the principles of the present invention.



FIG. 12 illustrates a malleable platform used in connection with an adjustable nose bridge assembly in accordance with the principles of the present invention.





While the invention has been described in connection with certain preferred embodiments, other embodiments would be understood by one of ordinary skill in the art and are encompassed herein.


DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT(S)

Aspects of the present invention relate to head-worn computing (“HWC”) systems. HWC involves, in some instances, a system that mimics the appearance of head-worn glasses or sunglasses. The glasses may be a fully developed computing platform, such as including computer displays presented in each of the lenses of the glasses to the eyes of the user. In embodiments, the lenses and displays may be configured to allow a person wearing the glasses to see the environment through the lenses while also seeing, simultaneously, digital imagery, which forms an overlaid image that is perceived by the person as a digitally augmented image of the environment, or augmented reality (“AR”).


HWC involves more than just placing a computing system on a person's head. The system may need to be designed as a lightweight, compact and fully functional computer display, such as wherein the computer display includes a high resolution digital display that provides a high level of immersion composed of the displayed digital content and the see-through view of the environmental surroundings. User interfaces and control systems suited to the HWC device may be required that are unlike those used for a more conventional computer such as a laptop. For the HWC and associated systems to be most effective, the glasses may be equipped with sensors to determine environmental conditions, geographic location, relative positioning to other points of interest, objects identified by imaging and movement by the user or other users in a connected group, and the like. The HWC may then change the mode of operation to match the conditions, location, positioning, movements, and the like, in a method generally referred to as a contextually aware HWC. The glasses also may need to be connected, wirelessly or otherwise, to other systems either locally or through a network. Controlling the glasses may be achieved through the use of an external device, automatically through contextually gathered information, through user gestures captured by the glasses sensors, and the like. Each technique may be further refined depending on the software application being used in the glasses. The glasses may further be used to control or coordinate with external devices that are associated with the glasses.
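
For illustration only, the contextually aware behavior described above can be thought of as a rule-based mode selector. The following sketch (all class names, thresholds, and mode labels are hypothetical and not part of the disclosure) shows one way sensed conditions might be mapped to an operating mode:

```python
from dataclasses import dataclass

@dataclass
class SensedContext:
    """Hypothetical snapshot of conditions gathered by the HWC sensors."""
    ambient_lux: float   # brightness of the surroundings
    speed_m_s: float     # wearer speed from GPS/IMU
    location_tag: str    # e.g. "indoor", "outdoor", "vehicle"

def select_mode(ctx: SensedContext) -> str:
    """Pick an operating mode from the sensed context (illustrative rules only)."""
    if ctx.location_tag == "vehicle" or ctx.speed_m_s > 8.0:
        return "driving"         # minimize overlaid content while moving fast
    if ctx.ambient_lux > 10_000:
        return "bright-outdoor"  # raise display brightness/contrast
    if ctx.speed_m_s > 1.5:
        return "walking"         # keep content near the display periphery
    return "stationary"

print(select_mode(SensedContext(ambient_lux=20_000, speed_m_s=0.2, location_tag="outdoor")))
```

In practice the rules would be driven by the particular sensor suite and the software application in use on the glasses.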


Referring to FIG. 1, an overview of the HWC system 100 is presented. As shown, the HWC system 100 comprises a HWC 102, which in this instance is configured as glasses to be worn on the head with sensors such that the HWC 102 is aware of the objects and conditions in the environment 114. In this instance, the HWC 102 also receives and interprets control inputs such as gestures and movements 116. The HWC 102 may communicate with external user interfaces 104. The external user interfaces 104 may provide a physical user interface to take control instructions from a user of the HWC 102 and the external user interfaces 104 and the HWC 102 may communicate bi-directionally to effect the user's command and provide feedback to the external device 108. The HWC 102 may also communicate bi-directionally with externally controlled or coordinated local devices 108. For example, an external user interface 104 may be used in connection with the HWC 102 to control an externally controlled or coordinated local device 108. The externally controlled or coordinated local device 108 may provide feedback to the HWC 102 and a customized GUI may be presented in the HWC 102 based on the type of device or specifically identified device 108. The HWC 102 may also interact with remote devices and information sources 112 through a network connection 110. Again, the external user interface 104 may be used in connection with the HWC 102 to control or otherwise interact with any of the remote devices and information sources 112 in a similar way as when the external user interfaces 104 are used to control or otherwise interact with the externally controlled or coordinated local devices 108. Similarly, HWC 102 may interpret gestures 116 (e.g. captured from forward, downward, upward, rearward facing sensors such as camera(s), range finders, IR sensors, etc.) or environmental conditions sensed in the environment 114 to control either local or remote devices 108 or 112.
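
As a rough, non-limiting illustration of the bi-directional relationships shown in FIG. 1, the sketch below models the HWC 102 as a hub that routes a control command to a local or remote device and returns the device's feedback; the class and method names are assumptions made for this example:

```python
class Device:
    """Minimal stand-in for an externally controlled or coordinated device (108/112)."""
    def __init__(self, name):
        self.name = name

    def apply(self, command):
        # A real device would act on the command; here it simply acknowledges it.
        return f"{self.name} executed '{command}'"


class HWC:
    """Hub that accepts control input (gestures 116 or external UI 104) and routes it."""
    def __init__(self):
        self.local_devices = {}    # externally controlled or coordinated local devices 108
        self.remote_devices = {}   # remote devices 112 reached through the network 110

    def register(self, device, remote=False):
        (self.remote_devices if remote else self.local_devices)[device.name] = device

    def handle_input(self, target, command):
        device = self.local_devices.get(target) or self.remote_devices.get(target)
        if device is None:
            return f"no device named '{target}'"
        # Bi-directional link: the device's feedback could drive a customized GUI in the HWC.
        return device.apply(command)


hwc = HWC()
hwc.register(Device("thermostat"))
hwc.register(Device("phone"), remote=True)
print(hwc.handle_input("thermostat", "set 21C"))
```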


We will now describe each of the main elements depicted on FIG. 1 in more detail; however, these descriptions are intended to provide general guidance and should not be construed as limiting. Additional detail on each element may also be provided elsewhere herein.


The HWC 102 is a computing platform intended to be worn on a person's head. The HWC 102 may take many different forms to fit many different functional requirements. In some situations, the HWC 102 will be designed in the form of conventional glasses. The glasses may or may not have active computer graphics displays. In situations where the HWC 102 has integrated computer displays the displays may be configured as see-through displays such that the digital imagery can be overlaid with respect to the user's view of the environment 114. There are a number of see-through optical designs that may be used, including ones that have a reflective display (e.g. LCoS, DLP), emissive displays (e.g. OLED, LED), hologram, TIR waveguides, and the like. In embodiments, lighting systems used in connection with the display optics may be solid state lighting systems, such as LED, OLED, quantum dot, quantum dot LED, etc. In addition, the optical configuration may be monocular or binocular. It may also include vision corrective optical components. In embodiments, the optics may be packaged as contact lenses. In other embodiments, the HWC 102 may be in the form of a helmet with a see-through shield, sunglasses, safety glasses, goggles, a mask, fire helmet with see-through shield, police helmet with see-through shield, military helmet with see-through shield, utility form customized to a certain work task (e.g. inventory control, logistics, repair, maintenance, etc.), and the like.


The HWC 102 may also have a number of integrated computing facilities, such as an integrated processor, integrated power management, communication structures (e.g. cell net, WiFi, Bluetooth, local area connections, mesh connections, remote connections (e.g. client server, etc.)), and the like. The HWC 102 may also have a number of positional awareness sensors, such as GPS, electronic compass, altimeter, tilt sensor, IMU, and the like. It may also have other sensors such as a camera, rangefinder, hyper-spectral camera, Geiger counter, microphone, spectral illumination detector, temperature sensor, chemical sensor, biologic sensor, moisture sensor, ultrasonic sensor, and the like.


The HWC 102 may also have integrated control technologies. The integrated control technologies may be contextual based control, passive control, active control, user control, and the like. For example, the HWC 102 may have an integrated sensor (e.g. camera) that captures user hand or body gestures 116 such that the integrated processing system can interpret the gestures and generate control commands for the HWC 102. In another example, the HWC 102 may have sensors that detect movement (e.g. a nod, head shake, and the like) including accelerometers, gyros and other inertial measurements, where the integrated processor may interpret the movement and generate a control command in response. The HWC 102 may also automatically control itself based on measured or perceived environmental conditions. For example, if it is bright in the environment the HWC 102 may increase the brightness or contrast of the displayed image. In embodiments, the integrated control technologies may be mounted on the HWC 102 such that a user can interact with it directly. For example, the HWC 102 may have a button(s), touch capacitive interface, and the like.
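
As a hedged illustration of the automatic environmental response mentioned above (raising display brightness when the environment is bright), a minimal control mapping might look like the following; the lux breakpoints and function name are illustrative assumptions, not values taken from the disclosure:

```python
def display_brightness(ambient_lux: float) -> float:
    """Map ambient light to a display brightness fraction (0.0-1.0).

    Piecewise-linear ramp: dim the display in the dark, full brightness in sunlight.
    The breakpoints (50 lux and 10,000 lux) are illustrative only.
    """
    low, high = 50.0, 10_000.0
    if ambient_lux <= low:
        return 0.2
    if ambient_lux >= high:
        return 1.0
    # Linear interpolation between the two breakpoints.
    return 0.2 + 0.8 * (ambient_lux - low) / (high - low)

for lux in (10, 500, 5_000, 20_000):
    print(lux, round(display_brightness(lux), 2))
```

A similar mapping could drive contrast, or could be inverted to dim the display in dark surroundings.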


As described herein, the HWC 102 may be in communication with external user interfaces 104. The external user interfaces may come in many different forms. For example, a cell phone screen may be adapted to take user input for control of an aspect of the HWC 102. The external user interface may be a dedicated UI, such as a keyboard, touch surface, button(s), joy stick, and the like. In embodiments, the external controller may be integrated into another device such as a ring, watch, bike, car, and the like. In each case, the external user interface 104 may include sensors (e.g. IMU, accelerometers, compass, altimeter, and the like) to provide additional input for controlling the HWC 102.


As described herein, the HWC 102 may control or coordinate with other local devices 108. The external devices 108 may be an audio device, visual device, vehicle, cell phone, computer, and the like. For instance, the local external device 108 may be another HWC 102, where information may then be exchanged between the separate HWCs 108.


Similar to the way the HWC 102 may control or coordinate with local devices 108, the HWC 102 may control or coordinate with remote devices 112, such as the HWC 102 communicating with the remote devices 112 through a network 110. Again, the remote device 112 may take many forms. Included in these forms is another HWC 102. For example, each HWC 102 may communicate its GPS position such that each HWC 102 knows where all of the other HWCs 102 are located.
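
The GPS-sharing example in the preceding paragraph amounts to each HWC 102 broadcasting its own position and caching the positions it receives from its peers. A minimal sketch, assuming a simple JSON message format that is not specified in the disclosure:

```python
import json, time

class PositionSharer:
    """Each HWC keeps a table of the last known position of its peers."""
    def __init__(self, hwc_id):
        self.hwc_id = hwc_id
        self.peers = {}

    def make_broadcast(self, lat, lon):
        # Serialized message this HWC would send over the network (110).
        return json.dumps({"id": self.hwc_id, "lat": lat, "lon": lon, "ts": time.time()})

    def receive(self, message):
        data = json.loads(message)
        if data["id"] != self.hwc_id:
            self.peers[data["id"]] = data   # remember where the other HWC is

a, b = PositionSharer("hwc-a"), PositionSharer("hwc-b")
b.receive(a.make_broadcast(37.7749, -122.4194))
print(b.peers["hwc-a"]["lat"], b.peers["hwc-a"]["lon"])
```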



FIG. 2 illustrates a HWC 102 with an optical system that includes an upper optical module 202 and a lower optical module 204. While the upper and lower optical modules 202 and 204 will generally be described as separate modules, it should be understood that this is illustrative only and the present invention includes other physical configurations, such as when the two modules are combined into a single module or where the elements making up the two modules are configured into more than two modules. In embodiments, the upper module 202 includes a computer controlled display (e.g. LCoS, DLP, OLED, etc.) and image light delivery optics. In embodiments, the lower module includes eye delivery optics that are configured to receive the upper module's image light and deliver the image light to the eye of a wearer of the HWC. In FIG. 2, it should be noted that while the upper and lower optical modules 202 and 204 are illustrated in one side of the HWC such that image light can be delivered to one eye of the wearer, it is envisioned by the present invention that embodiments will contain two image light delivery systems, one for each eye. It should also be noted that while many embodiments refer to the optical modules as “upper” and “lower” it should be understood that this convention is being used to make it easier for the reader and that the modules are not necessarily located in an upper-lower relationship. For example, the image generation module may be located above the eye delivery optics, below the eye delivery optics, on a side of the eye delivery optics, or otherwise positioned to satisfy the needs of the situation and/or the HWC 102 mechanical and optical requirements.


An aspect of the present invention relates to the mechanical and electrical construction of a side arm of a head worn computer. In general, when a head worn computer takes the form of glasses, sun-glasses, certain goggles, or other such forms, two side arms are included for mounting and securing the head worn computer on the ears of a person wearing the head worn computer. In embodiments, the side arms may also contain electronics, batteries, wires, antennas, computer processors, computer boards, etc. In embodiments, the side arm may include two or more sub assemblies. For example, as will be discussed in more detail below, the side arm may include a temple section and an ear horn section. The two sections may, for example, be mechanically arranged to allow an ear horn section to move such that both side arms can fold into a closed position.



FIG. 3 illustrates three separate views 102A, 102B and 102C of a head worn computer 102 according to the principles of the present invention. Turning to the head worn computer illustrated as 102A, one side arm of the HWC 102 is folded into its closed position. The ear horn section 308 of the side arm is rotated relative to its temple section 304 to create space relative to the other side arm 310 so when the other side arm is moved into its closed position it can fully close. In a situation where the ear horn did not rotate to create the space (not illustrated) the ear horn would physically interfere with the other side arm 310, when the side arm was in the closed position, and prevent the other side arm 310 from fully closing. The HWC 102B view illustrates the HWC 102B with both side arms folded into a fully closed position. Note that the ear horn 308 is in the rotated position with respect to its temple section 304 such that the other arm 310 closed without interfering with the ear horn 308. The HWC 102C view also illustrates both arms in closed positions with the ear horn 308 rotated to create the space for the other arm 310 to fully close. FIG. 3 also illustrates a portion of the HWC 102 where electronics may be housed in a top mount 312. The top mount may contain electronics, sensors, optics, processors, memory, radios, antennas, etc.



FIG. 4 illustrates a side arm configuration in accordance with the principles of the present invention. In this embodiment, the side arm includes two sub assemblies: the temple section 304 and the ear horn 308. FIG. 4 illustrates two views of the side arm assembly, one from an outer perspective and one from a sectioned perspective. The ear horn includes a pin 402 that is designed to fit into a hole 404 and to be secured by connector 408. The connector 408 is rotatable and in one position locks the pin 402 in place and in another position unsecures the pin 402 such that the ear horn 308 can be removed and re-attached to the temple section 304. This allows the detachment and re-attachment of the ear horn 308 from the temple section 304, and also allows replacement ear horns 308 to be offered in a variety of colors and patterns. In embodiments, the temple section 304 may include a battery compartment 410 and other electronics, wires, sensors, processors, etc.



FIG. 5 illustrates several views of a HWC side arm with temple 304 and ear horn 308 sections. The views include outer perspectives and cross sections as well as various states of the security of the ear horn 308 with the temple section 304. Figure set 502 illustrates the ear horn 308 and the temple section 304 in a secured un-rotated position. The same pin 402 and connector 408 system described in connection with FIG. 4 is illustrated in the cross sections of FIG. 5. In the secured un-rotated position the pin is pulled internally within the temple section firmly such that it stays in place. Figure set 504 illustrates a state where the ear horn 308 is separated from the temple section 304. This state is achieved when pressure is used to pull on the ear horn 308. In embodiments, the pressure is exerted by a user pulling on the ear horn 308, which compresses a spring 510B that is mechanically associated with the pin 402 in the ear horn 308. The mechanism uses the spring to maintain pressure on the pin 402 to maintain connection with the connector 408 when the connector 408 is in a position to lock the pin 402 in position. Figure set 508 illustrates a state where, after the ear horn 308 has been pulled into the state described in connection with state 504, the ear horn 308 is rotated about the pin 402. This puts the ear horn 308 in a rotated position as described herein such that the first arm, with this rotated ear horn 308, does not interfere with the closure of the other arm 310 when the two arms are folded into the closed position.


An aspect of the present invention relates to an adjustable nose bridge. An adjustable nose bridge may be important with head worn computers, especially those with computer displays, to ensure comfort and alignment of the displays and/or other portions of the head worn computer. FIG. 6 illustrates a HWC 102 with an adjustable nose bridge 602. The nose bridge is adjustable through a mechanism in the HWC 102. In embodiments, the mechanism includes a fixed notched attachment 604, a movable pin 608 adapted to fit into the notches of the notched attachment 604, and a selection device 610 that is attached to the movable pin 608. The movable pin 608 and nose bridge 602 are connected such that as the movable pin 608 shifts in position the nose bridge 602 moves in position as well. The selection device 610 causes the movable pin 608 to engage and disengage with the fixed notched attachment 604 when pressed and allowed to retract. As illustrated in FIG. 6, the selection device 610 is not in a pressed position so the movable pin 608 is engaged with the notched attachment 604 such that the nose bridge is securely attached in a stable position. FIG. 7 illustrates a scenario where the selection device is pressed, or activated, such that the movable pin 608 is no longer engaged with the fixed notched attachment 604. This allows the nose bridge 602 to move up and down with respect to the rest of the HWC 102. Once the movable pin 608 aligns with a notch of the notched attachment 604, the two parts may engage to re-secure the nose bridge in the HWC 102.


In embodiments, a side arm of the HWC 102 may include an audio jack (not shown) and the audio jack may be magnetically attachable to the side arm. For example, the temple section 304 or ear horn section 308 may have a magnetically attachable audio jack with audio signal wires associated with an audio system in the HWC 102. The magnetic attachment may include one or more magnets on one end (e.g. on the headphone end or the side arm end) and magnetically conductive material on the other end. In other embodiments, both ends of the attachment may have magnets, of opposite polarization, to create a stronger magnetic bond for the headphone. In embodiments, the audio signal wires or magnetic connection may include a sensor circuit to detect when the headphone is detached from the HWC 102. This may be useful in situations where the wearer is wearing the headphones during a period when there is not constant audio processing (e.g. listening for people to talk with periods of silence). In embodiments, the other side's headphone may play a tone, sound, signal, etc. in the event a headphone is detached. In embodiments, an indication of the detachment may be displayed in the computer display.
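
The detachment behavior described above (sensing that a magnetically attached headphone has come off, then alerting the wearer through the remaining earpiece and the display) could be organized as a simple event handler. The sketch below uses hypothetical callback names and does not reflect any particular sensor circuit:

```python
class HeadphoneMonitor:
    """Reacts when the sensor circuit reports that one earpiece has detached."""
    def __init__(self, play_tone, show_message):
        self.play_tone = play_tone        # callback: play a tone on the other side
        self.show_message = show_message  # callback: show an indication in the display

    def on_connection_change(self, side, attached):
        if not attached:
            other = "right" if side == "left" else "left"
            self.play_tone(other)                        # audible cue on the remaining earpiece
            self.show_message(f"{side} headphone detached")

monitor = HeadphoneMonitor(
    play_tone=lambda side: print(f"tone on {side} earpiece"),
    show_message=lambda text: print(f"display: {text}"),
)
monitor.on_connection_change("left", attached=False)
```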


In embodiments, the HWC 102 may have a vibration system that vibrates to alert the wearer of certain sensed conditions. In embodiments, the vibration system (e.g. an actuator that moves quickly to cause vibration in the HWC 102) may be mounted in a side arm (e.g. the temple portion 304, or ear horn 308), in the top mount 312, etc. In embodiments, the vibration system may be capable of causing different vibration modes that may be indicative of different conditions. For example, the vibration system may include a multi-mode vibration system, piezo-electric vibration system, variable motor, etc., that can be regulated through computer input and a processor in the HWC 102 may send control signals to the vibration system to generate an appropriate vibration mode. In embodiments, the HWC 102 may be associated with other devices (e.g. through Bluetooth, WiFi, etc.) and the vibratory control signals may be associated with sensors associated with the other device. For example, the HWC 102 may be connected to a car through Bluetooth such that sensor(s) in the car can cause activation of a vibration mode for the vibration system. The car, for example, may determine that a risk of accident is present (e.g. risk of the driver falling asleep, car going out of its lane, a car in front of the wearer is stopped or slowing, radar in the car indicates a risk, etc.) and the car's system may then send a command, via the Bluetooth connection, to the HWC 102 to cause a vibratory tone to be initiated in the HWC 102.
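
The car-alert scenario above reduces to mapping an incoming alert type to a vibration mode and driving the actuator accordingly. A minimal sketch, assuming a made-up alert vocabulary and actuator interface (neither is specified in the disclosure):

```python
# Hypothetical mapping from alert types (received over Bluetooth) to vibration modes.
VIBRATION_MODES = {
    "lane_departure":    {"pattern": [200, 100, 200], "intensity": 0.6},            # ms on/off
    "forward_collision": {"pattern": [500, 100, 500, 100, 500], "intensity": 1.0},
    "drowsiness":        {"pattern": [1000], "intensity": 0.8},
}

def handle_car_alert(alert_type, drive_actuator):
    """Translate an alert from the paired vehicle into a vibration command."""
    mode = VIBRATION_MODES.get(alert_type)
    if mode is None:
        return False                        # unknown alert: ignore rather than guess
    drive_actuator(mode["pattern"], mode["intensity"])
    return True

handle_car_alert("forward_collision",
                 lambda pattern, intensity: print("vibrate", pattern, "at", intensity))
```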


In embodiments, the connection between the speaker system and the HWC 102 may be positioned other than under the temple section. It may be positioned on a side, top, bottom, end of a section of the side arm, for example. It may be positioned on the front bridge, for example. In embodiments, the speaker system may be connected to a top or side portion and the speaker may be further positioned to face forward, away from the user's ear. This may be a useful configuration for providing sound to others. For example, such a configuration may be used when the user wants to provide translations to a person nearby. The user may speak in a language, have the language translated, and then spoken through the forward facing speakers.


The removable nature of the speaker systems may be desirable for breakaway situations so a snag does not tear the glasses from the user or pull hard on the user's ear. The removable nature may also be useful for modularity configurations where the user wants to interchange speaker types or attach other accessories. For example, the user may want ear buds at one point and an open ear speaker configuration at another point and the user may be able to make the swap with ease given this configuration. The port on the HWC 102 may also be adapted for other accessories that include lights or sensors for example. The accessory may have an ambient light sensor to assist with the control of the lighting and contrast systems used in the HWC 102 displays, for example. In embodiments, the speaker port may be used as a charging port for the HWC 102 or data port for the HWC 102.


Another aspect of the present invention relates to an adjustable nose bridge assembly of a head-worn computer. Positioning of a head-worn computer can be complicated by the nature of the computer displays, which are intended to be positioned in front of the user's eyes, along with the fact that people have differently shaped heads, noses, eye positions, etc. The inventors have appreciated the difficulties in such positioning and have developed an intuitive mechanism for a multi-axis adjustment system for the head-worn computer. In embodiments, the multi-axis adjustment system provides for vertical adjustment of the nose bridge, persistent rotational settings for the nose pads, and persistent outward/inward flex of the nose pads. Such a system is designed to be used on a wide variety of nose shapes and head sizes.



FIG. 8 illustrates a portion of a head-worn computer 102 with a mounting area 802 for an adjustable nose bridge assembly 804.



FIG. 9 illustrates an adjustable nose bridge assembly 804 in three different vertical positions 904, 908, and 910. In embodiments, the adjustable nose bridge 804 has a selection device 610 and nose pads 902. In embodiments, the selection device is a button, or other suitable user interface, and is mechanically arranged such that pushing the button releases the nose bridge such that it can be moved up and down. In this embodiment, the button engages with a tooth or other such feature to hold the nose bridge in place. In embodiments, the adjustment may be continuous or discrete and may be mechanically, electrically, or otherwise controlled.



FIG. 10 illustrates an engagement mechanism for removing and replacing the nose pads from and to the vertical adjustment portion of the adjustable nose bridge assembly. As can be seen in FIG. 10, the nose pads are attached to a clip style mechanism that is adapted to mate with the vertical nose bridge adjustment system. FIG. 10 also shows a clear version of one nose pad to illustrate how it is over-molded to a stiff (e.g. metal) member. The inventors appreciate that there are a number of ways to attach the nose pads to the vertical adjustment system, and this example is provided as a non-limiting illustration.



FIG. 11 illustrates a system providing two additional movable features for the nose pads. Together with the vertical adjustment portion, this configuration provides for a three-way adjustment system. Adjustment 1002 illustrates how the nose pads may be rotated or otherwise manipulated from a rear facing view. Adjustment 1004 illustrates how the nose pads may be rotated or otherwise manipulated from a top view. Once assembled on the head-worn computer, the vertical adjustment and two nose pad rotational adjustments provide for a system that accommodates many nose, face, and head shapes.



FIG. 12 illustrates a nose pad mount 1102. As previously described, the nose pads may be over-molded onto the ends of a mount. In this embodiment, the nose pads are over-molded on the ends of the nose pad mount 1102. The nose pad mount 1102 is designed to be malleable around the 2 mm dimension shown. This permits the user to twist, turn, bend, flare, or otherwise manipulate the nose pad mount 1102 to change the positions of the nose pads, which then can accommodate the user's facial structure. While the embodiment shown in FIG. 12 illustrates a single piece, the inventors have appreciated that this mount may be assembled in multiple pieces.


Although embodiments of HWC have been described in language specific to features, systems, computer processes and/or methods, the appended claims are not necessarily limited to the specific features, systems, computer processes and/or methods described. Rather, the specific features, systems, computer processes and/or methods are disclosed as non-limiting example implementations of HWC. All documents referenced herein are hereby incorporated by reference.

Claims
  • 1. A wearable head device, comprising: a display; and a nose bridge assembly coupled to the display, the nose bridge assembly comprising a selection device and a nose pad clip assembly, wherein: the nose bridge assembly is configured to operate in one of a plurality of states comprising a first state and a second state and is further configured to transition from the first state to the second state, the nose bridge assembly in the first state is configured to provide a first amount of support for the display, the nose bridge assembly in the second state is configured to provide a second amount of support for the display, the second amount of support different than the first amount of support, the selection device is configured to receive input from a user, and in response to the selection device receiving the input while the nose bridge assembly operates in the first state, the nose bridge assembly transitions to the second state, and wherein: the nose pad clip assembly is configured to be removably mated with the nose bridge assembly without affecting the connection between the nose bridge assembly and the display.
  • 2. The wearable head device of claim 1, wherein the selection device comprises a button and the input comprises a button press.
  • 3. The wearable head device of claim 1, wherein the input comprises an electrical signal.
  • 4. The wearable head device of claim 1, wherein the nose bridge assembly further comprises: a nose bridge; a notched member; and a movable pin configured to engage and disengage with the notched member, wherein: the first state is associated with the movable pin engaged with the notched member, and the second state is associated with the movable pin disengaged with the notched member.
  • 5. The wearable head device of claim 4, wherein: the nose bridge is configured to move vertically with respect to a nose of a user of the wearable head device while the nose bridge assembly operates in the second state, and the nose bridge is configured to remain stationary with respect to the nose of the user while the nose bridge assembly operates in the first state.
  • 6. The wearable head device of claim 5, wherein the nose bridge moving vertically comprises the nose bridge moving vertically in accordance with an electrical signal.
  • 7. The wearable head device of claim 5, further comprising a nose pad coupled to the nose bridge assembly, wherein: the nose pad is configured to rotate about two orthogonal axes in response to a second input from the user, and the nose bridge moving vertically comprises the nose pad moving vertically without rotation about the two orthogonal axes.
  • 8. The wearable head device of claim 1, wherein the nose bridge assembly is configured to detach from the wearable head device.
  • 9. The wearable head device of claim 1, wherein the nose pad clip assembly comprises: a clip portion; a nose pad mount coupled to a bottom side of the clip portion, wherein the nose pad mount includes at least a first and second tab; and a first nose pad and a second nose pad, wherein each of the first and second nose pads is configured to be mounted to the first and second tabs, respectively.
Related Publications (1)
Number Date Country
20180059434 A1 Mar 2018 US