Unmanned aerial vehicle launch system

Information

  • Patent Grant
  • Patent Number
    11,851,177
  • Date Filed
    Wednesday, January 17, 2018
  • Date Issued
    Tuesday, December 26, 2023
Abstract
Aspects of the present invention relate to unmanned aerial vehicle launch technologies and methods for receiving at a head-worn computer (HWC) a plurality of images captured by an unmanned aerial vehicle (UAV) launched from a grenade launcher, and displaying the plurality of captured images in a see-through display of the HWC.
Description
BACKGROUND
Field of the Invention

This invention relates to unmanned aerial vehicle (UAV) systems. More particularly, this invention relates to UAV launch technologies.


Description of Related Art

Unmanned aerial vehicles (UAVs) are used in connection with many different operations. UAVs come in many shapes and sizes. A need exists to improve UAVs and to improve UAV launch systems to make UAVs more useful and deployable.


SUMMARY

Aspects of the present invention relate to UAV launch technologies.


These and other systems, methods, objects, features, and advantages of the present invention will be apparent to those skilled in the art from the following detailed description of the preferred embodiment and the drawings. All documents mentioned herein are hereby incorporated in their entirety by reference.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments are described with reference to the following Figures. The same numbers may be used throughout to reference like features and components that are shown in the Figures:



FIG. 1 illustrates a head worn computing system in accordance with the principles of the present invention.



FIG. 2 illustrates a head worn computing system with optical system in accordance with the principles of the present invention.



FIG. 3 illustrates a UAV launch and communication system in accordance with the principles of the present invention.



FIG. 4 illustrates a launch system in accordance with the principles of the present invention.



FIG. 5 illustrates a communication system in accordance with the principles of the present invention.





While the invention has been described in connection with certain preferred embodiments, other embodiments would be understood by one of ordinary skill in the art and are encompassed herein.


DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT(S)

Aspects of the present invention relate to head-worn computing (“HWC”) systems. HWC involves, in some instances, a system that mimics the appearance of head-worn glasses or sunglasses. The glasses may be a fully developed computing platform, such as including computer displays presented in each of the lenses of the glasses to the eyes of the user. In embodiments, the lenses and displays may be configured to allow a person wearing the glasses to see the environment through the lenses while also seeing, simultaneously, digital imagery, which forms an overlaid image that is perceived by the person as a digitally augmented image of the environment, or augmented reality (“AR”).


HWC involves more than just placing a computing system on a person's head. The system may need to be designed as a lightweight, compact and fully functional computer display, such as wherein the computer display includes a high-resolution digital display that provides a high level of immersion comprising the displayed digital content and the see-through view of the environmental surroundings. The HWC device may require user interfaces and control systems unlike those used for a more conventional computer, such as a laptop. For the HWC and associated systems to be most effective, the glasses may be equipped with sensors to determine environmental conditions, geographic location, relative positioning to other points of interest, objects identified by imaging, movement by the user or other users in a connected group, and the like. The HWC may then change its mode of operation to match the conditions, location, positioning, movements, and the like, in a method generally referred to as a contextually aware HWC. The glasses may also need to be connected, wirelessly or otherwise, to other systems, either locally or through a network. Controlling the glasses may be achieved through the use of an external device, automatically through contextually gathered information, through user gestures captured by the glasses' sensors, and the like. Each technique may be further refined depending on the software application being used in the glasses. The glasses may further be used to control or coordinate with external devices that are associated with the glasses.


Referring to FIG. 1, an overview of the HWC system 100 is presented. As shown, the HWC system 100 comprises a HWC 102, which in this instance is configured as glasses to be worn on the head with sensors such that the HWC 102 is aware of the objects and conditions in the environment 114. In this instance, the HWC 102 also receives and interprets control inputs such as gestures and movements 116. The HWC 102 may communicate with external user interfaces 104. The external user interfaces 104 may provide a physical user interface to take control instructions from a user of the HWC 102, and the external user interfaces 104 and the HWC 102 may communicate bi-directionally to effect the user's command and provide feedback to the external device 108. The HWC 102 may also communicate bi-directionally with externally controlled or coordinated local devices 108. For example, an external user interface 104 may be used in connection with the HWC 102 to control an externally controlled or coordinated local device 108. The externally controlled or coordinated local device 108 may provide feedback to the HWC 102, and a customized GUI may be presented in the HWC 102 based on the type of device or specifically identified device 108. The HWC 102 may also interact with remote devices and information sources 112 through a network connection 110. Again, the external user interface 104 may be used in connection with the HWC 102 to control or otherwise interact with any of the remote devices 108 and information sources 112 in a similar way as when the external user interfaces 104 are used to control or otherwise interact with the externally controlled or coordinated local devices 108. Similarly, the HWC 102 may interpret gestures 116 (e.g. captured from forward, downward, upward, rearward facing sensors such as camera(s), range finders, IR sensors, etc.) or environmental conditions sensed in the environment 114 to control either local or remote devices 108 or 112.
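
The device-specific GUI behavior described above (and recited in claims 12 and 24) can be pictured as a simple lookup from an identified device type to a GUI layout. The following Python sketch is purely illustrative; the type names and registry are assumptions, not the patent's implementation.

```python
# Hypothetical sketch: when the HWC identifies a connected device 108,
# it selects a GUI layout matched to that device type.
from enum import Enum, auto


class DeviceType(Enum):
    UAV = auto()
    AUDIO_DEVICE = auto()
    VEHICLE = auto()
    UNKNOWN = auto()


# Map each identified device type to the GUI presented in the see-through display.
GUI_REGISTRY = {
    DeviceType.UAV: "uav_flight_controls",
    DeviceType.AUDIO_DEVICE: "volume_and_track_controls",
    DeviceType.VEHICLE: "vehicle_status_panel",
}


def gui_for_device(device_type: DeviceType) -> str:
    """Return the customized GUI layout for an identified external device 108."""
    return GUI_REGISTRY.get(device_type, "generic_device_panel")


print(gui_for_device(DeviceType.UAV))  # -> uav_flight_controls
```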


We will now describe each of the main elements depicted in FIG. 1 in more detail; however, these descriptions are intended to provide general guidance and should not be construed as limiting. Each element may also be described in further detail elsewhere herein.


The HWC 102 is a computing platform intended to be worn on a person's head. The HWC 102 may take many different forms to fit many different functional requirements. In some situations, the HWC 102 will be designed in the form of conventional glasses. The glasses may or may not have active computer graphics displays. In situations where the HWC 102 has integrated computer displays, the displays may be configured as see-through displays such that the digital imagery can be overlaid with respect to the user's view of the environment 114. There are a number of see-through optical designs that may be used, including ones that have reflective displays (e.g. LCoS, DLP), emissive displays (e.g. OLED, LED), holograms, TIR waveguides, and the like. In embodiments, lighting systems used in connection with the display optics may be solid state lighting systems, such as LED, OLED, quantum dot, quantum dot LED, etc. In addition, the optical configuration may be monocular or binocular. It may also include vision corrective optical components. In embodiments, the optics may be packaged as contact lenses. In other embodiments, the HWC 102 may be in the form of a helmet with a see-through shield, sunglasses, safety glasses, goggles, a mask, a fire helmet with see-through shield, a police helmet with see-through shield, a military helmet with see-through shield, a utility form customized to a certain work task (e.g. inventory control, logistics, repair, maintenance, etc.), and the like.
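
One property of the optical see-through overlay described above is that the display adds image light to the scene light passing through the lens, rather than replacing it. A minimal sketch, with assumed values and function names, makes the difference from ordinary video compositing concrete.

```python
# Illustrative model of optical see-through compositing: display light is
# added to attenuated scene light, so a dark pixel is transparent, not black.
def perceived_luminance(scene: float, display: float, transmission: float = 0.8) -> float:
    """Luminance reaching the eye: attenuated scene light plus display light."""
    return transmission * scene + display


# A black (0.0) display pixel leaves the environment fully visible;
# a bright pixel appears overlaid on top of it.
print(perceived_luminance(scene=0.5, display=0.0))  # environment only
print(perceived_luminance(scene=0.5, display=0.4))  # augmented overlay
```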


The HWC 102 may also have a number of integrated computing facilities, such as an integrated processor, integrated power management, communication structures (e.g. cell net, WiFi, Bluetooth, local area connections, mesh connections, remote connections (e.g. client server, etc.)), and the like. The HWC 102 may also have a number of positional awareness sensors, such as GPS, electronic compass, altimeter, tilt sensor, IMU, and the like. It may also have other sensors such as a camera, rangefinder, hyper-spectral camera, Geiger counter, microphone, spectral illumination detector, temperature sensor, chemical sensor, biologic sensor, moisture sensor, ultrasonic sensor, and the like.


The HWC 102 may also have integrated control technologies. The integrated control technologies may be contextual based control, passive control, active control, user control, and the like. For example, the HWC 102 may have an integrated sensor (e.g. camera) that captures user hand or body gestures 116 such that the integrated processing system can interpret the gestures and generate control commands for the HWC 102. In another example, the HWC 102 may have sensors that detect movement (e.g. a nod, head shake, and the like) including accelerometers, gyros and other inertial measurements, where the integrated processor may interpret the movement and generate a control command in response. The HWC 102 may also automatically control itself based on measured or perceived environmental conditions. For example, if it is bright in the environment the HWC 102 may increase the brightness or contrast of the displayed image. In embodiments, the integrated control technologies may be mounted on the HWC 102 such that a user can interact with it directly. For example, the HWC 102 may have a button(s), touch capacitive interface, and the like.
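
The brightness example above can be sketched as a mapping from an ambient light reading to a display setting. The anchor points and log-linear mapping below are assumptions for illustration only, not taken from the patent.

```python
# Hedged sketch of contextual control: raise display brightness as ambient
# light increases, so displayed content stays visible against the environment.
import math


def display_brightness(ambient_lux: float) -> float:
    """Map an ambient light reading to a display brightness in [0.1, 1.0]."""
    # Indoor (~100 lux) -> dim display; direct sun (~100,000 lux) -> full power.
    if ambient_lux <= 100:
        return 0.1
    if ambient_lux >= 100_000:
        return 1.0
    # Log-linear interpolation between the two anchors.
    t = (math.log10(ambient_lux) - 2) / 3  # 2 = log10(100), 5 = log10(100000)
    return 0.1 + t * 0.9


print(display_brightness(5_000))  # mid-range brightness outdoors in shade
```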


As described herein, the HWC 102 may be in communication with external user interfaces 104. The external user interfaces may come in many different forms. For example, a cell phone screen may be adapted to take user input for control of an aspect of the HWC 102. The external user interface may be a dedicated UI, such as a keyboard, touch surface, button(s), joystick, and the like. In embodiments, the external controller may be integrated into another device such as a ring, watch, bike, car, and the like. In each case, the external user interface 104 may include sensors (e.g. IMU, accelerometers, compass, altimeter, and the like) to provide additional input for controlling the HWC 102.


As described herein, the HWC 102 may control or coordinate with other local devices 108. The external devices 108 may be an audio device, visual device, vehicle, cell phone, computer, and the like. For instance, the local external device 108 may be another HWC 102, where information may then be exchanged between the separate HWCs.


Similar to the way the HWC 102 may control or coordinate with local devices 108, the HWC 102 may control or coordinate with remote devices 112, such as the HWC 102 communicating with the remote devices 112 through a network 110. Again, the remote device 112 may take many forms. Included in these forms is another HWC 102. For example, each HWC 102 may communicate its GPS position such that all the HWCs 102 know where all of the HWCs 102 are located.
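
The position-sharing behavior just described can be illustrated with a small publish/read sketch. This is an assumption-laden stand-in: a shared dictionary takes the place of the wireless network, and all class and field names are invented for illustration.

```python
# Illustrative sketch of HWCs 102 sharing GPS positions so each unit knows
# where its peers are located.
from dataclasses import dataclass


@dataclass
class GpsFix:
    lat: float
    lon: float
    alt_m: float


class PositionShare:
    """Each HWC publishes its own fix and can read every peer's last fix."""

    def __init__(self):
        self._positions: dict[str, GpsFix] = {}

    def publish(self, hwc_id: str, fix: GpsFix) -> None:
        self._positions[hwc_id] = fix

    def peers(self, own_id: str) -> dict[str, GpsFix]:
        return {k: v for k, v in self._positions.items() if k != own_id}


net = PositionShare()
net.publish("hwc-1", GpsFix(38.89, -77.03, 20.0))
net.publish("hwc-2", GpsFix(38.90, -77.04, 25.0))
print(net.peers("hwc-1"))  # hwc-1 sees hwc-2's position
```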



FIG. 2 illustrates a HWC 102 with an optical system that includes an upper optical module 202 and a lower optical module 204. While the upper and lower optical modules 202 and 204 will generally be described as separate modules, it should be understood that this is illustrative only and that the present invention includes other physical configurations, such as when the two modules are combined into a single module or when the elements making up the two modules are configured into more than two modules. In embodiments, the upper module 202 includes a computer controlled display (e.g. LCoS, DLP, OLED, etc.) and image light delivery optics. In embodiments, the lower module includes eye delivery optics that are configured to receive the upper module's image light and deliver the image light to the eye of a wearer of the HWC. In FIG. 2, while the upper and lower optical modules 202 and 204 are illustrated on one side of the HWC such that image light can be delivered to one eye of the wearer, it is envisioned by the present invention that embodiments will contain two image light delivery systems, one for each eye. It should also be noted that while many embodiments refer to the optical modules as "upper" and "lower", this convention is used to make it easier for the reader, and the modules are not necessarily located in an upper-lower relationship. For example, the image generation module may be located above the eye delivery optics, below the eye delivery optics, on a side of the eye delivery optics, or otherwise positioned to satisfy the needs of the situation and/or the HWC 102 mechanical and optical requirements.


An aspect of the present invention relates to launch technologies for unmanned aerial vehicles (UAVs). In embodiments, the launch system is adapted to use a grenade launcher (e.g. a 40 mm grenade launcher) for ease of field integration. In embodiments, the launch system is adapted to be relatively quiet to allow for covert operations. When a grenade launcher is normally used in the field, quietness is not a significant requirement because the grenade being launched is going to land at a relatively short distance from the launcher and a rather large noise is going to be made by the exploding grenade. The inventors of the present invention have realized that the use of a grenade launcher can ease field integration, and that the launch should be relatively quiet to allow for those instances where one wants to launch the UAV and remain unnoticed. The inventors have also discovered that the use of an explosive charge to send the UAV to its usable or near-usable height can significantly save on UAV battery life and hence increase the fly time of the UAV.
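
A back-of-the-envelope calculation shows where the battery saving comes from: a rotor-powered climb costs roughly hover-level power for the entire ascent, which a propellant launch avoids. Every number in the sketch below (hover power, climb rate, height) is an illustrative assumption; real gains depend on the airframe.

```python
# Rough estimate of the battery cost of a powered climb that the launch charge
# would otherwise replace. Values are assumed, not from the patent.
HOVER_POWER_W = 40.0   # assumed electrical power to keep a small UAV aloft
CLIMB_RATE_MS = 3.0    # assumed battery-powered vertical climb speed


def climb_cost(height_m: float) -> tuple[float, float]:
    """Return (energy in Wh, time in s) that a powered climb to height_m would cost."""
    seconds = height_m / CLIMB_RATE_MS
    return HOVER_POWER_W * seconds / 3600.0, seconds


energy_wh, seconds = climb_cost(100.0)
print(f"powered climb to 100 m: ~{energy_wh:.2f} Wh, ~{seconds:.0f} s of flight time")
# Launching the UAV to that height with the charge returns roughly this
# energy and time to the mission, extending useful fly time.
```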



FIG. 3 illustrates a UAV launch system according to the principles of the present invention. In this embodiment, the launcher 320 may be a grenade launcher, such as a standalone grenade launcher, a grenade launcher-rifle combination, etc. An explosive charge may then be used to launch the UAV 314 from the grenade launcher 320. In embodiments, the UAV 314 is held in a launch position inside of a separable case 318. The separable case 318 may be designed to encase the UAV 314 in a launch position, and the separable case 318, with the UAV inside, can be fired out of the launcher 320. The separable case 318 may then separate from the UAV in mid-flight. Once released from the separable case 318, the UAV may deploy flight sustaining elements, such as counter-rotating blades that can keep the UAV in controlled flight, as indicated by the unfolding UAV position 312. Once the UAV achieves its fully deployed position 304 it may fly autonomously or through the control of a remote operator. The fully deployed UAV 304 may take the form of a dual counter-rotating blade helicopter, and it may operate a self-contained camera with a field of view 310 to image the surroundings, sensors to sense environmental conditions of the surroundings, etc. The output from the camera and other sensors may be communicated to other systems through communication channel(s) 308. The UAV 304 may be controlled by a person on the ground in near proximity to the UAV. For example, a person wearing a HWC 102 may control the UAV as an externally controlled device 108 through the use of gestures 116, external control devices 104, etc. (as described herein elsewhere), with the HWC 102 communicating with the UAV either directly or indirectly.
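
The deployment sequence of FIG. 3 reads naturally as a small state machine: encased launch, case separation, blade deployment, controlled flight. The sketch below follows the figure's reference numerals in its state names, but the code itself is an illustrative assumption.

```python
# Deployment phases of the launched UAV, modeled as a linear state machine.
from enum import Enum, auto


class UavState(Enum):
    ENCASED_318 = auto()    # held in launch position inside separable case 318
    SEPARATED = auto()      # case has fallen away mid-flight
    UNFOLDING_312 = auto()  # counter-rotating blades deploying
    DEPLOYED_304 = auto()   # fully deployed; autonomous or operator control


TRANSITIONS = {
    UavState.ENCASED_318: UavState.SEPARATED,
    UavState.SEPARATED: UavState.UNFOLDING_312,
    UavState.UNFOLDING_312: UavState.DEPLOYED_304,
}


def advance(state: UavState) -> UavState:
    """Move to the next deployment phase; fully deployed is terminal."""
    return TRANSITIONS.get(state, state)


s = UavState.ENCASED_318
while s is not UavState.DEPLOYED_304:
    s = advance(s)
    print(s.name)
```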


As indicated in FIG. 3, the UAV may communicate with devices on the ground, such as HWC(s) 102, server(s), etc., to deliver files, streams, etc. from the UAV's camera and other sensor systems to persons in need of such information. The ground devices may further be used to control the flight and sensor systems of the UAV. For example, a person on the ground wearing a HWC 102 may receive files and streams of information from the UAV 304, and the person may also control the functions of the UAV by using control technologies described herein elsewhere.
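
The bidirectional exchange described here (and recited in claim 1) pairs a video downlink with a control uplink. In the minimal sketch below, queues stand in for the radio link, and the message shapes are assumptions for illustration.

```python
# Ground-side exchange: the HWC receives video frames from the UAV and
# returns control input to the UAV's flight system.
import queue

downlink: "queue.Queue[dict]" = queue.Queue()  # UAV -> HWC (video frames)
uplink: "queue.Queue[dict]" = queue.Queue()    # HWC -> UAV (control input)

# UAV side: publish a captured frame.
downlink.put({"seq": 1, "frame": b"...jpeg bytes..."})

# HWC side: display the frame, then send a gesture-derived command back.
frame = downlink.get()
print(f"displaying frame {frame['seq']} in see-through display")
uplink.put({"cmd": "climb", "rate_ms": 1.5})

# UAV side: apply the command to the flight system.
print("flight system received:", uplink.get())
```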



FIG. 4 illustrates a UAV launch system adapted to be loaded and fired from a grenade launcher (not shown). The UAV launch system illustrated in this embodiment includes a separable launch case 318, which houses the UAV copter 314 in a launch configuration. The separable launch case 318 is secured to a retained propellant case 418. Both the separable launch case 318 and the retained propellant case 418 are configured to be loaded into a grenade launcher for launching. The retained propellant case 418 houses a piston 410 that travels on a post 404. The piston 410 is illustrated in the pre-launch or down position. The retained propellant case 418 also includes a propellant container that contains a propellant 408. The grenade launcher (not shown) will trigger the propellant 408 to initiate the UAV launch. The propellant container has vents that allow the propellant gases to expand rapidly into the nearby vessel once triggered. The expanding gas pushes the piston 410 towards the UAV 314 and the separable launch case 318. The piston travels along the post 404 and is stopped by the stop 402 towards the top of the post 404. The piston 410 has a seal 412 that seals, at least temporarily, the vessel below the piston 410 to contain a substantial amount of the gas that expands into the vessel from the triggered propellant. The upper portion of the piston 410 may also be sealed to help prevent the expanding gas from escaping into the surrounding environment. By containing the expanding gas and the resulting pressure, the seal 412 also prevents a loud noise from being generated. If the gas were allowed to escape to the surrounding environment quickly, a loud noise would occur and the launch event would be marked by a loud audio signature. The seal(s) 412, or others, that contain the expanding gas may permit the gas to slowly escape following the launch; so long as the leak is slow, it will not cause a loud noise. The purpose of the seals is to contain the gas pressure to prevent a loud launch noise, so a substantial portion of the gas needs to be contained for at least a short period of time, after which the gas can slowly escape around the seals.
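
The quiet-launch idea is that the seal 412 prevents a sharp pressure release, the source of a loud report, and instead lets the trapped gas bleed off slowly. A rough sketch with assumed numbers (peak pressure, bleed time constant) illustrates the difference in time scales.

```python
# Illustrative model: gas pressure behind the piston decays slowly toward
# ambient as it leaks past the seals, instead of venting in milliseconds.
import math

P0_KPA = 2_000.0      # assumed peak gas pressure behind the piston after firing
AMBIENT_KPA = 101.0
TAU_S = 2.0           # assumed bleed time constant past the seals


def contained_pressure(t_s: float) -> float:
    """Pressure decays exponentially toward ambient as gas leaks past the seals."""
    return AMBIENT_KPA + (P0_KPA - AMBIENT_KPA) * math.exp(-t_s / TAU_S)


for t in (0.0, 1.0, 5.0, 10.0):
    print(f"t={t:4.1f}s  ~{contained_pressure(t):7.1f} kPa")
# An unsealed case would dump this pressure almost instantly, producing
# the loud audio signature the sealed design avoids.
```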



FIG. 5 illustrates a UAV communication network in accordance with the principles of the present invention. Once deployed, the UAV 304 may communicate with ground personnel via optional communications links 508. For example, the optional communications links 508 may include a communication protocol that involves the UAV 304 communicating with a server node 504 on the ground, with the server node then communicating the information to the HWC nodes 502. In other embodiments, the UAV 304 may communicate with an HWC node 502 directly. The HWC nodes 502 and the server node may be arranged in an ad-hoc self-healing network arrangement. In other embodiments, the network topology and protocols may be otherwise arranged (e.g. spoke and hub, IP, etc.).
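
One way to picture the self-healing arrangement is as a routing fallback: traffic from the UAV 304 prefers the server node 504 and fails over to direct HWC links 502 when the server is unreachable. The topology and function below are assumptions sketched for illustration, not the patent's protocol.

```python
# Illustrative self-healing route selection for one frame from the UAV.
def route_frame(server_up: bool, reachable_hwcs: list[str]) -> list[str]:
    """Return the delivery path for one frame from the UAV."""
    if server_up:
        return ["uav-304", "server-504"] + reachable_hwcs
    if reachable_hwcs:
        # Self-healing: bypass the failed server and go node-to-node.
        return ["uav-304", reachable_hwcs[0]] + reachable_hwcs[1:]
    return []  # no route; frame is buffered or dropped


print(route_frame(True, ["hwc-502a", "hwc-502b"]))
print(route_frame(False, ["hwc-502a", "hwc-502b"]))
```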


Although embodiments of HWC have been described in language specific to features, systems, computer processes and/or methods, the appended claims are not necessarily limited to the specific features, systems, computer processes and/or methods described. Rather, the specific features, systems, computer processes and/or methods are disclosed as non-limiting example implementations of HWC.


All documents referenced herein are hereby incorporated by reference.

Claims
  • 1. A method, comprising: receiving, at a first time, at a head-worn computer (HWC), a first video captured by a sensor of an unmanned aerial vehicle (UAV) launched via a launch system; receiving input at the HWC from a user of the HWC; communicating the input via the HWC to a flight system of the UAV; receiving, at a second time later than the first time, a second video captured by the sensor of the UAV; and displaying the first video and the second video to the user via a see-through display of the HWC, wherein: the communicating the input to the flight system causes the UAV to, in accordance with a determination by the HWC that the UAV is deployed from the launch system, move from a first position at the first time to a second position at the second time, and the flight system is configured to, in accordance with a determination by the HWC that the UAV is not deployed from the launch system, forgo causing the UAV to move from the first position to the second position.
  • 2. The method of claim 1, wherein the UAV comprises counter rotating blades and the input comprises control input for the counter rotating blades.
  • 3. The method of claim 1, wherein receiving the first video at the HWC comprises receiving the first video at the HWC from the UAV via a server node on the ground.
  • 4. The method of claim 3, wherein receiving the first video at the HWC from the UAV via the server node comprises receiving the first video at the HWC via an ad-hoc network.
  • 5. The method of claim 4, wherein the ad-hoc network comprises a plurality of HWCs.
  • 6. The method of claim 3, wherein receiving the first video at the HWC from the UAV via the server node comprises receiving the first video at the HWC via a spoke and hub network comprising the server node and a plurality of HWCs.
  • 7. The method of claim 1, wherein the input comprises control input for the sensor.
  • 8. The method of claim 1, wherein the first video comprises streaming video.
  • 9. The method of claim 1, wherein receiving the input at the HWC comprises receiving the input via a mobile phone.
  • 10. The method of claim 1, wherein receiving the input at the HWC comprises receiving the input via a wearable device that does not include the HWC.
  • 11. The method of claim 1, wherein the input comprises gesture input.
  • 12. The method of claim 1, wherein receiving the input at the HWC comprises: identifying a device in communication with the HWC; presenting a customized GUI on the HWC based on the identified device; and receiving the input via the device.
  • 13. A system comprising: a head-worn computer (HWC) comprising a see-through display; and one or more processors configured to perform a method comprising: receiving, at a first time, at the HWC, a first video captured by a sensor of an unmanned aerial vehicle (UAV) launched via a launch system; receiving input at the HWC from a user of the HWC; communicating the input via the HWC to a flight system of the UAV; receiving, at a second time later than the first time, a second video captured by the sensor of the UAV; and displaying the first video and the second video to the user via the see-through display, wherein: the communicating the input to the flight system causes the UAV to, in accordance with a determination by the HWC that the UAV is deployed from the launch system, move from a first position at the first time to a second position at the second time, and the flight system is configured to, in accordance with a determination by the HWC that the UAV is not deployed from the launch system, forgo causing the UAV to move from the first position to the second position.
  • 14. The system of claim 13, wherein the UAV comprises counter rotating blades and the input comprises control input for the counter rotating blades.
  • 15. The system of claim 13, wherein receiving the first video at the HWC comprises receiving the first video at the HWC from the UAV via a server node on the ground.
  • 16. The system of claim 15, wherein receiving the first video at the HWC from the UAV via the server node comprises receiving the first video at the HWC via an ad-hoc network.
  • 17. The system of claim 16, wherein the ad-hoc network comprises a plurality of HWCs.
  • 18. The system of claim 15, wherein receiving the first video at the HWC from the UAV via the server node comprises receiving the first video at the HWC via a spoke and hub network comprising the server node and a plurality of HWCs.
  • 19. The system of claim 13, wherein the input comprises control input for the sensor.
  • 20. The system of claim 13, wherein the first video comprises streaming video.
  • 21. The system of claim 13, wherein receiving the input at the HWC comprises receiving the input via a mobile phone.
  • 22. The system of claim 13, wherein receiving the input at the HWC comprises receiving the input via a wearable device that does not include the HWC.
  • 23. The system of claim 13, wherein the input comprises gesture input.
  • 24. The system of claim 13, wherein receiving the input at the HWC comprises: identifying a device in communication with the HWC; presenting a customized GUI on the HWC based on the identified device; and receiving the input via the device.
CLAIM TO PRIORITY

This application is a continuation of the following U.S. patent application, which is incorporated by reference in its entirety: U.S. Ser. No. 14/271,028, filed May 6, 2014.

US Referenced Citations (106)
Number Name Date Kind
1897833 Benway Feb 1933 A
4852988 Velez Aug 1989 A
5596451 Handschy Jan 1997 A
5717422 Fergason Feb 1998 A
5748264 Hegg May 1998 A
5808800 Handschy Sep 1998 A
6078427 Fontaine Jun 2000 A
6195136 Handschy Feb 2001 B1
6347764 Brandon Feb 2002 B1
6359723 Handschy Mar 2002 B1
6369952 Rallison Apr 2002 B1
6433760 Vaissie Aug 2002 B1
6456438 Lee Sep 2002 B1
6491391 Blum et al. Dec 2002 B1
6847336 Lemelson Jan 2005 B1
6943754 Aughey Sep 2005 B2
6977776 Volkenandt et al. Dec 2005 B2
6987787 Mick Jan 2006 B1
7088234 Naito Aug 2006 B2
7199934 Yamasaki Apr 2007 B2
7206134 Weissman Apr 2007 B2
7347551 Fergason et al. Mar 2008 B2
7488294 Torch Feb 2009 B2
7830370 Yamazaki Nov 2010 B2
7855743 Sako Dec 2010 B2
7928926 Yamamoto Apr 2011 B2
8004765 Amitai Aug 2011 B2
8190147 Kauffman May 2012 B2
8235529 Raffle Aug 2012 B1
8378924 Jacobsen Feb 2013 B2
8427396 Kim Apr 2013 B1
8494215 Kimchi Jul 2013 B2
8505430 Miralles Aug 2013 B2
8564883 Totani Oct 2013 B2
8576276 Bar-zeev Nov 2013 B2
8576491 Takagi Nov 2013 B2
8587869 Totani Nov 2013 B2
8594467 Lu Nov 2013 B2
8611015 Wheeler Dec 2013 B2
8638498 Bohn et al. Jan 2014 B2
8662686 Takagi Mar 2014 B2
8670183 Clavin Mar 2014 B2
8696113 Lewis Apr 2014 B2
8698157 Hanamura Apr 2014 B2
8711487 Takeda Apr 2014 B2
8733927 Lewis May 2014 B1
8733928 Lewis May 2014 B1
8745058 Garcia-barrio Jun 2014 B1
8750541 Dong Jun 2014 B1
8752963 Mcculloch Jun 2014 B2
8803867 Oikawa Aug 2014 B2
8823071 Oyamada Sep 2014 B2
8837880 Takeda Sep 2014 B2
8929589 Publicover et al. Jan 2015 B2
9010929 Lewis Apr 2015 B2
9274338 Robbins et al. Mar 2016 B2
9292973 Bar-zeev et al. Mar 2016 B2
9658473 Lewis May 2017 B2
9720505 Gribetz et al. Aug 2017 B2
10013053 Cederlund et al. Jul 2018 B2
10025379 Drake et al. Jul 2018 B2
10185147 Lewis Jan 2019 B2
20020021498 Ohtaka Feb 2002 A1
20030030597 Geist Feb 2003 A1
20060023158 Howell et al. Feb 2006 A1
20090279180 Amitai Nov 2009 A1
20110051627 El-Damhougy Mar 2011 A1
20110211056 Publicover et al. Sep 2011 A1
20110213664 Osterhout Sep 2011 A1
20120021806 Maltz Jan 2012 A1
20120050493 Ernst Mar 2012 A1
20120062850 Travis Mar 2012 A1
20120194551 Osterhout Aug 2012 A1
20120212593 Na Aug 2012 A1
20120223885 Perez Sep 2012 A1
20120250152 Larson Oct 2012 A1
20120264510 Wigdor Oct 2012 A1
20120306850 Balan Dec 2012 A1
20120327116 Liu Dec 2012 A1
20130009366 Hannegan Jan 2013 A1
20130044042 Olsson Feb 2013 A1
20130100259 Ramaswamy Apr 2013 A1
20130127980 Haddick May 2013 A1
20130154913 Genc Jun 2013 A1
20130196757 Latta Aug 2013 A1
20130257622 Davalos Oct 2013 A1
20140028704 Wu Jan 2014 A1
20140043682 Hussey Feb 2014 A1
20140062854 Cho Mar 2014 A1
20140129328 Mathew May 2014 A1
20140146394 Tout May 2014 A1
20140147829 Jerauld May 2014 A1
20140152530 Venkatesha Jun 2014 A1
20140152558 Salter Jun 2014 A1
20140152676 Rohn Jun 2014 A1
20140159995 Adams Jun 2014 A1
20140160055 Margolis Jun 2014 A1
20140160157 Poulos Jun 2014 A1
20140160170 Lyons Jun 2014 A1
20140168735 Yuan Jun 2014 A1
20140176603 Kumar Jun 2014 A1
20140177023 Gao Jun 2014 A1
20140195918 Friedlander Jul 2014 A1
20150142211 Shehata May 2015 A1
20150168731 Robbins Jun 2015 A1
20160293015 Bragin Oct 2016 A1
Foreign Referenced Citations (13)
Number Date Country
2316473 Jan 2001 CA
2362895 Dec 2002 CA
2388766 Dec 2003 CA
368898 May 1990 EP
777867 Jun 1997 EP
2486450 Aug 2012 EP
2502410 Sep 2012 EP
WO2011143655 Nov 2011 WO
WO2012058175 May 2012 WO
WO2013050650 Apr 2013 WO
WO2013103825 Jul 2013 WO
WO2013110846 Aug 2013 WO
WO2013170073 Nov 2013 WO
Non-Patent Literature Citations (12)
Entry
US 8,743,465 B2, 06/2014, Totani (withdrawn)
US 8,792,178 B2, 07/2014, Totani (withdrawn)
Jacob, R. “Eye Tracking in Advanced Interface Design”, Virtual Environments and Advanced Interface Design, Oxford University Press, Inc. (Jun. 1995).
Rolland, J. et al., “High-resolution inset head-mounted display”, Optical Society of America, vol. 37, No. 19, Applied Optics, (Jul. 1, 1998).
Tanriverdi, V. et al. (Apr. 2000). “Interacting With Eye Movements In Virtual Environments,” Department of Electrical Engineering and Computer Science, Tufts University, Medford, MA 02155, USA, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, eight pages.
Yoshida, A. et al., “Design and Applications of a High Resolution Insert Head Mounted Display”, (Jun. 1994).
Final Office Action dated Jan. 27, 2017, for U.S. Appl. No. 14/271,028, filed May 6, 2014, eleven pages.
Non-Final Office Action dated Apr. 18, 2016, for U.S. Appl. No. 14/271,028, filed May 6, 2014, eight pages.
Schedwill, "Bidirectional OLED Microdisplay", Fraunhofer Research Institution for Organics, Materials and Electronic Device COMEDD, Apr. 11, 2014, 2 pages.
Vogel, et al., “Data glasses controlled by eye movements”, Information and communication, Fraunhofer-Gesellschaft, Sep. 22, 2013, 2 pages.
Azuma, Ronald T. (Aug. 1997). “A Survey of Augmented Reality,” In Presence: Teleoperators and Virtual Environments 6, 4, Hughes Research Laboratories, Malibu, CA, located at: https://web.archive.org/web/20010604100006/http://www.cs.unc.edu/˜azuma/ARpresence.pdf , retrieved on Oct. 26, 2020.
Bimber, Oliver et al. (2005). “Spatial Augmented Reality: Merging Real and Virtual Worlds,” A. K. Peters, Ltd., Wellesley, MA.
Related Publications (1)
Number Date Country
20180155025 A1 Jun 2018 US
Continuations (1)
Number Date Country
Parent 14271028 May 2014 US
Child 15873428 US