Headset computer operation using vehicle sensor feedback for remote control vehicle

Abstract
A system performs stable control of moving devices (such as a helicopter or robot) with attached camera(s), providing live imagery back to a head-mounted computer (HMC). The HMC controls the moving device. The HMC user specifies a desired path or location for the moving device. Camera images enable the user-specified instructions to be followed accurately and the device's position to be maintained thereafter. A method of controlling a moving device with a headset computer includes analyzing, at the headset computer, at least one image received from the moving device to form an indication of change in position of the moving device. The method displays to a user of the headset computer the indication of change in position of the moving device. The method can additionally include enabling the user to control the moving device.
Description
BACKGROUND

Mobile computing devices, such as notebook PCs, smart phones, and tablet computing devices, are now common tools used for producing, analyzing, communicating, and consuming data in both business and personal life. Consumers continue to embrace a mobile digital lifestyle as the ease of access to digital information increases with high-speed wireless communications technologies becoming ubiquitous. Popular uses of mobile computing devices include displaying large amounts of high-resolution computer graphics information and video content, often wirelessly streamed to the device. While these devices typically include a display screen, the preferred visual experience of a high-resolution, large-format display cannot easily be replicated in such mobile devices because the physical size of such devices is limited to promote mobility. Another drawback of the aforementioned device types is that the user interface is hands-dependent, typically requiring a user to enter data or make selections using a keyboard (physical or virtual) or touch-screen display. As a result, consumers are now seeking a hands-free, high-quality, portable, color display solution to augment or replace their hands-dependent mobile devices.


SUMMARY

Recently developed micro-displays can provide large-format, high-resolution color pictures and streaming video in a very small form factor. One application of such displays is integration into a wireless headset computer worn on the head of the user, with the display positioned within the user's field of view, similar in format to eyeglasses, an audio headset, or video eyewear. A “wireless computing headset” device includes one or more small, high-resolution micro-displays and optics to magnify the image. The micro-displays can provide super video graphics array (SVGA) (800×600) resolution, extended graphics array (XGA) (1024×768) resolution, or even higher resolutions. A wireless computing headset contains one or more wireless computing and communication interfaces, enabling data and streaming video capability, and provides greater convenience and mobility than hands-dependent devices. For more information concerning such devices, see co-pending patent applications entitled “Mobile Wireless Display Software Platform for Controlling Other Systems and Devices,” U.S. application Ser. No. 12/348,648, filed Jan. 5, 2009, “Handheld Wireless Display Devices Having High Resolution Display Suitable For Use as a Mobile Internet Device,” PCT International Application No. PCT/US09/38601, filed Mar. 27, 2009, and “Improved Headset Computer,” U.S. Application No. 61/638,419, filed Apr. 25, 2012, each of which is incorporated herein by reference in its entirety.


Embodiments of the present invention provide a way to implement stable remote control of moving devices (such as a helicopter or robot) that have one or more cameras attached, providing live imagery back to the controlling device, a head-mounted or headset computer (HMC or HSC) or other computer device. The user specifies how he wants the moving device to move. The invention uses analysis of the camera images to enable those user-specified instructions to be followed accurately and the moving device's position to be maintained thereafter.


In one embodiment, a method of controlling a moving device with a headset computer includes analyzing, at the headset computer, at least one image received from the moving device to form an indication of change in position of the moving device. The method can further include displaying to a user of the headset computer the indication of change in position of the moving device. The method can additionally include enabling the user to control the moving device.


In another embodiment, the method can enable the user to control the moving device by user input. The user input can be head movement, hand gesture, voice command, or a digital command.


In another embodiment, the method can include transmitting at least one directional or angular command to the moving device.


In another embodiment, analyzing the at least one image can further include transmitting the at least one image to a host computer to perform the analysis and receiving the analysis from the host computer. The method can further include coupling the headset computer and the host computer for communication over a wireless transport. In another embodiment, the method can include coupling the headset computer and the moving device for communication over a wireless transport.
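For illustration only, the offloading step could be realized as in the following sketch; the length-prefixed socket framing, host address, port, and JSON reply format are assumptions for this example and are not part of the described embodiments.

```python
# Illustrative sketch only: send one camera frame from the headset computer
# to a host computer over a wireless transport and read back the analysis.
# The framing (4-byte length prefix, one JSON line in reply), host address,
# and port are assumptions.
import json
import socket
import struct

def analyze_on_host(jpeg_bytes: bytes, host: str = "192.168.1.10",
                    port: int = 5000) -> dict:
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack("!I", len(jpeg_bytes)))  # length prefix
        sock.sendall(jpeg_bytes)                          # the camera frame
        reply = sock.makefile("rb").readline()            # one JSON line back
    return json.loads(reply)  # e.g., {"dx_px": -3, "dy_px": 12}
```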


In another embodiment, the method can further include displaying, to the user, the at least one image from the moving device. The method can also include overlaying the indication of change of the moving device on the at least one image.
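As one possible way to overlay the indication of change on the displayed imagery, the following sketch draws an arrow for a computed pixel offset on a frame; OpenCV is assumed to be available, and the offset values are hypothetical.

```python
# Minimal sketch, assuming OpenCV is available: draw the indication of
# change in position as an arrow overlaid on the camera frame before it
# is shown on the microdisplay.
import cv2
import numpy as np

def overlay_indication(frame: np.ndarray, dx_px: int, dy_px: int) -> np.ndarray:
    h, w = frame.shape[:2]
    center = (w // 2, h // 2)
    tip = (center[0] + dx_px, center[1] + dy_px)
    annotated = frame.copy()
    cv2.arrowedLine(annotated, center, tip, color=(0, 255, 0), thickness=2)
    return annotated
```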


In another embodiment, enabling the user to control the device can further include analyzing user input, comparing the user input to a limit of the moving device, and reducing movement commands to the moving device to be within the limit.
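A minimal sketch of such limit checking follows; the specific limit values and field names are assumptions chosen for illustration.

```python
# Hypothetical sketch: reduce user-requested movement commands so they stay
# within the moving device's limits. The limit values are illustrative.
from dataclasses import dataclass

@dataclass
class MovementLimits:
    max_speed_m_s: float = 2.0        # assumed maximum translational speed
    max_yaw_rate_deg_s: float = 45.0  # assumed maximum rotation rate

def clamp(value: float, limit: float) -> float:
    """Reduce a command so its magnitude stays within the limit."""
    return max(-limit, min(limit, value))

def bound_command(speed: float, yaw_rate: float,
                  limits: MovementLimits = MovementLimits()):
    return (clamp(speed, limits.max_speed_m_s),
            clamp(yaw_rate, limits.max_yaw_rate_deg_s))

# Example: a gesture requesting 5 m/s forward and a 90 deg/s turn is
# reduced to (2.0, -45.0).
print(bound_command(5.0, -90.0))
```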


In another embodiment, enabling the user to control the device can further include sending commands to an autopilot of the moving device.


In another embodiment, a system for controlling a moving device with a headset computer can include an analysis module configured to analyze, at the headset computer, at least one image received from the moving device to form an indication of change in position of the moving device. The system can further include a display configured to display to a user of the headset computer the indication of change in position of the moving device. The system can additionally include a control module configured to enable the user to control the moving device.





BRIEF DESCRIPTION OF DRAWINGS

The foregoing will be apparent from the following more particular description of example embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments of the present invention.



FIGS. 1A-1B are schematic illustrations of a headset computer cooperating with a host computer (e.g., Smart Phone, laptop, etc.) according to principles of the present invention.



FIG. 2 is a block diagram of flow of data and control in the embodiment of FIGS. 1A-1B.



FIG. 3 is a diagram 3000 showing an example embodiment of a front-facing camera view of a motion device (e.g., a helicopter) hovering in front of a window.



FIG. 4 is a block diagram 4000 illustrating an example embodiment of a method employed by the headset computer.



FIG. 5 illustrates a computer network or similar digital processing environment in which the present invention may be implemented.



FIG. 6 is a diagram of the internal structure of a computer (e.g., client processor/device or server computers) in the computer system of FIG. 5.





DETAILED DESCRIPTION


FIGS. 1A and 1B show example embodiments of a wireless computing headset device 100 (also referred to herein as a headset computer (HSC) or head mounted computer (HMC)) that incorporates a high-resolution (VGA or better) microdisplay element 1010, an image analysis system (analyzer), and other features described below. HSC 100 can include audio input and/or output devices, including one or more microphones, input and output speakers, geo-positional sensors (GPS), three to nine axis degrees of freedom orientation sensors, atmospheric sensors, health condition sensors, digital compass, pressure sensors, environmental sensors, energy sensors, acceleration sensors, position, attitude, motion, velocity and/or optical sensors, cameras (visible light, infrared, etc.), multiple wireless radios, auxiliary lighting, rangefinders, or the like and/or an array of sensors embedded and/or integrated into the headset and/or attached to the device via one or more peripheral ports (not shown in detail in FIG. 1B). Typically located within the housing of headset computing device 100 are various electronic circuits including a microcomputer (single or multicore processors), one or more wired and/or wireless communications interfaces, memory or storage devices, various sensors, and a peripheral mount or mounts, such as a “hot shoe.”


Example embodiments of the HSC 100 can receive user input through sensing voice commands, head movements 110, 111, 112, and hand gestures 113, or any combination thereof. Microphone(s) operatively coupled to, or preferably integrated into, the HSC 100 can be used to capture speech commands, which are then digitized and processed using automatic speech recognition techniques. Gyroscopes, accelerometers, and other micro-electromechanical system sensors can be integrated into the HSC 100 and used to track the user's head movement to provide user input commands. Cameras or other motion tracking sensors can be used to monitor a user's hand gestures for user input commands. Such a user interface overcomes the hands-dependent formats of other mobile devices.


The headset computing device 100 can be used in various ways. It can be used as a remote display for streaming video signals received from a remote host computing device 200 (shown in FIG. 1A). The host 200 may be, for example, a notebook PC, smart phone, tablet device, or other computing device having less or greater computational complexity than the wireless computing headset device 100, such as cloud-based network resources. The host may be further connected to other networks 210, such as the Internet. The headset computing device 100 and host 200 can wirelessly communicate via one or more wireless protocols, such as Bluetooth®, Wi-Fi, WiMAX, 4G LTE or other wireless radio link 150. (Bluetooth is a registered trademark of Bluetooth SIG, Inc. of 5209 Lake Washington Boulevard, Kirkland, Wash. 98033.) In an example embodiment, the host 200 may be further connected to other networks, such as through a wireless connection to the Internet or other cloud-based network resources, so that the host 200 can act as a wireless relay. Alternatively, some example embodiments of the HSC 100 can wirelessly connect to the Internet and cloud-based network resources without the use of a host wireless relay.



FIG. 1B is a perspective view showing some details of an example embodiment of a headset computer 100. The example embodiment HSC 100 generally includes a frame 1000, strap 1002, rear housing 1004, speaker 1006, cantilever (alternatively referred to as an arm or boom) 1008 with a built-in microphone, and a micro-display subassembly 1010.


A head-worn frame 1000 and strap 1002 are generally configured so that a user can wear the headset computer device 100 on the user's head. A housing 1004 is generally a low-profile unit which houses the electronics, such as the microprocessor, memory or other storage device, along with other associated circuitry. The microprocessor is configured for image analysis used to control (e.g., remotely control) a moving vehicle or object according to principles of the present invention. Speakers 1006 provide audio output to the user so that the user can hear information. Microdisplay subassembly 1010 is used to render visual information to the user. It is coupled to the arm 1008. The arm 1008 generally provides physical support such that the microdisplay subassembly is able to be positioned within the user's field of view 300 (FIG. 1A), preferably in front of the eye of the user or within the user's peripheral vision, preferably slightly below or above the eye. Arm 1008 also provides the electrical or optical connections between the microdisplay subassembly 1010 and the control circuitry housed within housing unit 1004.


According to aspects that will be explained in more detail below, the HSC or HMC display device 100 allows a user to select a field of view 300 within a much larger area defined by a virtual display 400. The user can typically control the position, extent (e.g., X-Y or 3D range), and/or magnification of the field of view 300.



FIGS. 1A-1B show an example embodiment of a monocular microdisplay presenting a single fixed display element supported on the face of the user with a cantilevered boom. Other mechanical configurations for the remote control display device 100 are possible.



FIG. 2 is a block diagram 2000 showing additional details of an example embodiment of the HSC 100, host 200, and the data that travels between them. The HSC 100 receives vocal input from the user via the microphone, hand movements 2002 or body gestures via positional and orientation sensors, the camera or optical sensor(s), and head movement inputs 2004 via the head tracking circuitry, such as 3-axis to 9-axis degrees-of-freedom orientation sensing. These are translated by software in the HSC 100 into keyboard and/or mouse commands 2008 that are then sent over the Bluetooth or other wireless interface 150 to the host 200. The host 200 then interprets these translated commands in accordance with its operating system/application software to perform various functions. Among the commands is one to select a field of view 300 within the virtual display 400 and return that selected screen data to the HSC 100. Thus, it should be understood that a very large-format virtual display area might be associated with application software or an operating system running on the host 200. However, only a portion of that large virtual display area 400 within the field of view 300 is returned to and actually displayed by the micro display 1010 of the HSC 100.
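Purely as an illustration of this translation step, the sketch below converts incremental head-tracker angles into a field-of-view pan command; the command dictionary format and scale factor are assumptions, not part of the described protocol.

```python
# Hypothetical sketch of translating head movement 2004 into a command 2008
# that pans the field of view 300 within the virtual display 400. The
# command format and scale factor are illustrative assumptions.
def head_motion_to_pan(yaw_deg: float, pitch_deg: float,
                       pixels_per_degree: float = 20.0) -> dict:
    """Convert head rotation since the last sample into a pan command."""
    return {
        "type": "pan_field_of_view",
        "dx_px": int(yaw_deg * pixels_per_degree),     # look right -> pan right
        "dy_px": int(-pitch_deg * pixels_per_degree),  # look up -> pan up
    }

# The resulting command would be serialized and sent to the host 200 over
# the Bluetooth or other wireless interface 150.
print(head_motion_to_pan(yaw_deg=1.5, pitch_deg=-0.5))
```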


A moving object of interest (a vehicle, as a nonlimiting example) is coupled to the HMC 100 or other computer device 200 (the connection type is not important, provided camera images from the object can be streamed to the HMC in real time). The moving object sends images from at least one camera to the HMC 100 or other computer device 200. The camera may, for example, be that of another HMC or HSC 100 worn by a passenger of the moving vehicle, rather than the HMC remotely controlling the vehicle.


The HMC 100 or other computer device 200 runs software or otherwise implements an image analyzer that monitors changes in the position of objects in the camera images to calculate whether the moving object/vehicle is correctly following a desired course or maintaining position if no desired course is provided. Where discrepancies are detected between the desired course and the moving object's current position, the instructions to the moving object/vehicle may be adjusted to compensate and bring the moving object/vehicle into the desired position.



FIG. 3 is a diagram 3000 showing an example embodiment of front-facing camera views of a motion device (e.g., a helicopter) hovering in front of a window. The operator is instructing the helicopter to remain stationary. The HMC 100 or computer host 200 executing the invention software processes the received real-time images 304, 314 and 324 to detect horizontal and vertical edges, as shown in the analyzed images 306, 316, and 326. The HMC 100 selects clearly detectable edges and monitors them from frame to frame to determine movement of the helicopter.
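The specification does not name a particular edge-detection algorithm; the sketch below shows one plausible realization of this step using OpenCV's Canny detector and a probabilistic Hough transform, keeping only segments that are nearly horizontal or vertical.

```python
# One plausible realization of the edge-detection step, assuming OpenCV:
# find strong line segments in a frame and keep those close to horizontal
# or vertical, as in analyzed images 306, 316, and 326.
import cv2
import numpy as np

def detect_strong_edges(frame: np.ndarray, angle_tol_deg: float = 10.0):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=40, maxLineGap=5)
    kept = []
    for x1, y1, x2, y2 in ([] if lines is None else lines[:, 0]):
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1))) % 180
        if min(angle, 180 - angle) < angle_tol_deg or abs(angle - 90) < angle_tol_deg:
            kept.append((x1, y1, x2, y2))  # nearly horizontal or vertical segment
    return kept
```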


When the camera is in the correct position 302, the analyzed image 306, corresponding to the real-time image 304, shows the detected horizontal and vertical edges. The HSC 100 continually monitors the edges in subsequent frames to determine whether the helicopter has moved.


After the camera has dropped too low 312, the detected edges have moved upwards in the real-time image 314, as shown by the dotted line and arrow in analyzed image 316. In response, the analysis system (generally at 100, 200) causes the helicopter system (or operator) to increase rotor power for a short time to bring the helicopter back up to the required level.


After the camera has rotated to the right 322, the detected edges shift to the left in real-time image 324, as shown by the dotted line and arrow in analyzed image 326. In response, the system/analyzer 100, 200 enables the helicopter operator to adjust the tail fin/rotor to rotate the helicopter back to the correct position 302.
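The two corrective cases above can be summarized in a small sketch; the gains and command names below are illustrative assumptions rather than an actual autopilot interface.

```python
# Hypothetical mapping from measured edge drift to corrective commands.
# Image coordinates are assumed to grow rightward (x) and downward (y):
# edges drifting up (negative dy) mean the helicopter has dropped, and
# edges drifting left (negative dx) mean it has rotated to the right.
def drift_to_corrections(dx_px: int, dy_px: int,
                         lift_gain: float = 0.02,
                         yaw_gain: float = 0.1) -> dict:
    return {
        # Negative dy (edges moved up) -> positive power delta to climb back.
        "rotor_power_delta": -lift_gain * dy_px,
        # Negative dx (edges moved left) -> negative (leftward) yaw to rotate
        # the helicopter back toward the correct position 302.
        "yaw_delta_deg": yaw_gain * dx_px,
    }

print(drift_to_corrections(dx_px=-12, dy_px=-8))  # climb and yaw left
```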


If the operator wants to move the helicopter by a small amount, he can specify horizontal and vertical deltas, and in response, the autopilot software of the helicopter adjusts the desired edge positions accordingly. The operator can specify the horizontal and vertical deltas by manual entry (e.g., with a keyboard and mouse) or by hand gesture, head tracking, body gesture, etc. Then the auto-adjustment process (e.g., the image analyzer feeding to the vehicle autopilot system) causes the helicopter to move to the new position to match the desired edge positions.
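A short sketch of how the operator-specified deltas could shift the desired edge positions follows; the coordinate representation and example values are assumptions.

```python
# Illustrative sketch: an operator-specified horizontal/vertical delta
# shifts the desired edge positions; the auto-adjustment loop then moves
# the helicopter until the observed edges match the new targets.
def apply_operator_delta(target_edges, dx_px: int, dy_px: int):
    """Shift each desired edge position (x, y) by the requested delta."""
    return [(x + dx_px, y + dy_px) for (x, y) in target_edges]

targets = [(120, 80), (120, 240), (400, 80)]   # hypothetical edge positions
print(apply_operator_delta(targets, dx_px=10, dy_px=-5))
```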


If the operator wants to move the helicopter to a new location entirely, he can temporarily suspend the automatic position control (the analyzer of the present invention coupled to the helicopter autopilot) and fly the helicopter using direct control (e.g., a joystick, manual entry, body movement, head tracking, or hand gestures) until the helicopter reaches the desired position. The helicopter can then re-engage automatic position control to fine-tune the desired position. Further, automatic position control can remain enabled during direct control to maintain orientation and prevent stalling and/or crashing of the helicopter.
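The suspend/re-engage behavior could be tracked with a simple mode flag, as in the sketch below; the class and method names are hypothetical and stand in for the real autopilot interface, which is not detailed here.

```python
# Sketch of the mode switching described above: direct control temporarily
# suspends the automatic position hold, which re-engages at the new
# location. The class and method names are hypothetical.
class PositionHold:
    def __init__(self):
        self.auto_hold = True
        self.target_edges = None

    def begin_direct_control(self):
        self.auto_hold = False               # suspend automatic position hold

    def end_direct_control(self, observed_edges):
        self.target_edges = observed_edges   # adopt the new position as target
        self.auto_hold = True                # re-engage to fine-tune the hold
```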


In one embodiment, the HSC 100 may take the form of the HSC described in co-pending U.S. Patent Publication No. 2011/0187640, which is hereby incorporated by reference in its entirety.


In another embodiment, the invention relates to the concept of using a Head Mounted Display (HMD) 1010 in conjunction with an external ‘smart’ device 200 (such as a smartphone or tablet) to provide information and control to the user hands-free. The invention requires transmission of only small amounts of data, providing a more reliable data transfer method that runs in real time.


In this sense, therefore, the amount of data to be transmitted over the connection 150 is small and includes instructions on how to lay out a screen, which text to display, and other stylistic information such as drawing arrows, background color(s), or images to include.


Additional data can be streamed over the same connection 150 or another connection and displayed on screen 1010, such as a video stream, if required by the Controller 200.


The user can control the motion device by moving his or her head. The HSC tracks these movements using head tracking technology. For example, a user who turns his or her head to the right causes the motion device to turn to the right by a proportional number of degrees. The HSC moves the desired path/position/course in the analyzed image to the right and then allows the auto-course adjustment to take place as described above. Similarly, the HSC can adjust the course if the user looks up or down. Other movements, such as moving forward or backward, can accelerate or decelerate the motion device. As such, the HSC combines analysis of head tracking movements with images received from the motion device to provide intuitive, motion-based, hands-free control of the motion device.
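To illustrate the proportional head-to-vehicle mapping, a minimal sketch follows; all scale factors and command fields are assumptions chosen for illustration.

```python
# Hypothetical sketch of proportional head control: a head turn commands a
# proportional turn of the motion device, looking up or down adjusts the
# course vertically, and leaning forward or backward accelerates or
# decelerates. All scale factors are illustrative assumptions.
def head_to_vehicle_command(head_yaw_deg: float, head_pitch_deg: float,
                            head_lean_m: float,
                            yaw_scale: float = 1.0,
                            climb_scale: float = 0.5,
                            accel_scale: float = 2.0) -> dict:
    return {
        "yaw_deg": yaw_scale * head_yaw_deg,         # turn with the head
        "climb_rate": climb_scale * head_pitch_deg,  # look up to climb
        "acceleration": accel_scale * head_lean_m,   # lean forward to speed up
    }
```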



FIG. 4 is a block diagram 4000 illustrating an example embodiment of a method employed by the headset computer. The method first analyzes images received from the moving device to form an indication of the change of position of the moving device (402). The method then displays, to the user of the headset computer, the indication of the change (404). Then, the method enables the user to control the device by user input (406).
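The three steps of FIG. 4 could be tied together in a loop such as the following schematic; the callable parameters (get_frame, analyze, display_overlay, send_command) are placeholders for headset and vehicle interfaces that the specification does not define.

```python
# Schematic sketch of the FIG. 4 method as a loop. The four callables are
# hypothetical placeholders: get_frame returns a camera image from the
# moving device, analyze returns a pixel offset indicating the change in
# position, display_overlay shows that indication to the user, and
# send_command issues the resulting control to the moving device.
def control_loop(get_frame, analyze, display_overlay, send_command):
    while True:
        frame = get_frame()                    # image from the moving device
        dx_px, dy_px = analyze(frame)          # indication of change (402)
        display_overlay(frame, dx_px, dy_px)   # show the indication (404)
        send_command(dx_px, dy_px)             # enable control (406)
```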



FIG. 5 illustrates a computer network or similar digital processing environment in which the present invention may be implemented.


Client computer(s)/devices 50 and server computer(s) 60 provide processing, storage, and input/output devices executing application programs and the like. Client computer(s)/devices 50 can also be linked through communications network 70 to other computing devices, including other client devices/processes 50 and server computer(s) 60. Communications network 70 can be part of a remote access network, a global network (e.g., the Internet), a worldwide collection of computers, local area or wide area networks, and gateways that currently use respective protocols (TCP/IP, Bluetooth, etc.) to communicate with one another. Other electronic device/computer network architectures are suitable.



FIG. 6 is a diagram of the internal structure of a computer (e.g., client processor/device 50 or server computers 60) in the computer system of FIG. 5. Each computer 50, 60 contains system bus 79, where a bus is a set of hardware lines used for data transfer among the components of a computer or processing system. Bus 79 is essentially a shared conduit that connects different elements of a computer system (e.g., processor, disk storage, memory, input/output ports, network ports, etc.) that enables the transfer of information between the elements. Attached to system bus 79 is I/O device interface 82 for connecting various input and output devices (e.g., keyboard, mouse, displays, printers, speakers, etc.) to the computer 50, 60. Network interface 86 allows the computer to connect to various other devices attached to a network (e.g., network 70 of FIG. 5). Memory 90 provides volatile storage for computer software instructions 92 and data 94 used to implement an embodiment of the present invention (e.g., analyzer module, display module, and control module code detailed above). Disk storage 95 provides non-volatile storage for computer software instructions 92 and data 94 used to implement an embodiment of the present invention. Central processor unit 84 is also attached to system bus 79 and provides for the execution of computer instructions.


In one embodiment, the processor routines 92 and data 94 are a computer program product (generally referenced 92), including a computer-readable medium (e.g., a removable storage medium such as one or more DVD-ROMs, CD-ROMs, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the invention system. Computer program product 92 can be installed by any suitable software installation procedure, as is well known in the art. In another embodiment, at least a portion of the software instructions may also be downloaded over a cable, communication, and/or wireless connection. In other embodiments, the invention programs are a computer program propagated signal product 107 embodied on a propagated signal on a propagation medium (e.g., a radio wave, an infrared wave, a laser wave, a sound wave, or an electrical wave propagated over a global network such as the Internet, or other network(s)). Such carrier medium or signals provide at least a portion of the software instructions for the present invention routines/program 92.


In alternate embodiments, the propagated signal is an analog carrier wave or digital signal carried on the propagated medium. For example, the propagated signal may be a digitized signal propagated over a global network (e.g., the Internet), a telecommunications network, or other network. In one embodiment, the propagated signal is a signal that is transmitted over the propagation medium over a period of time, such as the instructions for a software application sent in packets over a network over a period of milliseconds, seconds, minutes, or longer. In another embodiment, the computer readable medium of computer program product 92 is a propagation medium that the computer system 50 may receive and read, such as by receiving the propagation medium and identifying a propagated signal embodied in the propagation medium, as described above for computer program propagated signal product.


Generally speaking, the term “carrier medium” or transient carrier encompasses the foregoing transient signals, propagated signals, propagated medium, storage medium and the like.


While this invention has been particularly shown and described with references to example embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.

Claims
  • 1. A method of controlling a moving device with a headset computer, the method comprising: analyzing, at the headset computer, at least one image received from the moving device to form an indication of change in position of the moving device based on changes in position of one or more objects in the at least one image; displaying to a user of the headset computer the indication of change in position of the moving device; and enabling the user to control the moving device by analyzing user input, comparing the user input to a limit of the moving device, and reducing movement commands to the movement device to be within the limit.
  • 2. The method of claim 1, further comprising enabling the user to control the moving device by user input, the user input being at least one of head movement, hand gesture, voice command, or digital command.
  • 3. The method of claim 1, further comprising transmitting at least one directional or angular command to the moving device.
  • 4. The method of claim 1, wherein analyzing the at least one image further includes transmitting the at least one image to a host computer to perform the analysis and receiving the analysis from the host computer.
  • 5. The method of claim 4, further comprising coupling the headset computer and the host computer for communication over a wireless transport.
  • 6. The method of claim 1, further comprising coupling the headset computer and the moving device for communication over a wireless transport.
  • 7. The method of claim 1, further comprising displaying, to the user, the at least one image from the moving device, and further overlaying the indication of change of the moving device on the at least one image.
  • 8. The method of claim 1, wherein enabling the user to control the device further includes sending commands to an autopilot of the motion device.
  • 9. A system for controlling a moving device with a headset computer, the system comprising: an analysis module configured to analyze, at the headset computer, at least one image received from the moving device to form an indication of change in position of the moving device based on changes in position of an object in the at least one image; a display configured to display to a user of the headset computer the indication of change in position of the moving device; and a control module configured to enable the user to control the moving device by analyzing user input, comparing the user input to a limit of the moving device, and reducing movement commands to the movement device to be within the limit.
  • 10. The system of claim 9, wherein the control module is further configured to enable the user to control the moving device by user input, the user input being at least one of head movement, hand gesture, voice command, or digital command.
  • 11. The system of claim 9, further comprising transmitting at least one directional or angular command to the moving device.
  • 12. The system of claim 9, wherein the analysis module is further configured to transmit the at least one image to a host computer to perform the analysis and receive the analysis from the host computer.
  • 13. The system of claim 12, wherein the headset computer and the host computer are coupled for communication over a wireless transport.
  • 14. The system of claim 9, wherein the headset computer and the moving device are coupled for communication over a wireless transport.
  • 15. The system of claim 9, wherein the display module is further configured to display, to the user, the at least one image from the moving device, and further configured to overlay the indication of change of the moving device on the at least one image.
  • 16. The system of claim 9, wherein the control module is further configured to send commands to an autopilot of the motion device.
RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application No. 61/749,196, filed on Jan. 4, 2013 and is a continuation-in-part of U.S. application Ser. No. 13/468,207, filed May 10, 2012. The entire teachings of the above applications are incorporated herein by reference.

US Referenced Citations (175)
Number Name Date Kind
5005213 Hanson et al. Apr 1991 A
5594469 Freeman et al. Jan 1997 A
5990793 Bieback Nov 1999 A
6010216 Jesiek Jan 2000 A
6084556 Zwern Jul 2000 A
6108197 Janik Aug 2000 A
6198462 Daily et al. Mar 2001 B1
6204974 Spitzer Mar 2001 B1
6313864 Tabata et al. Nov 2001 B1
6325507 Jannard Dec 2001 B1
6369952 Rallison et al. Apr 2002 B1
6778906 Hennings et al. Aug 2004 B1
6798391 Peterson, III Sep 2004 B2
6853293 Swartz et al. Feb 2005 B2
6900777 Hebert et al. May 2005 B1
6922184 Lawrence et al. Jul 2005 B2
6956614 Quintana et al. Oct 2005 B1
6966647 Jannard et al. Nov 2005 B2
7004582 Jannard et al. Feb 2006 B2
7013009 Warren Mar 2006 B2
7082393 Lahr Jul 2006 B2
7147324 Jannard et al. Dec 2006 B2
7150526 Jannard et al. Dec 2006 B2
7213917 Jannard et al. May 2007 B2
7216973 Jannard et al. May 2007 B2
7219994 Jannard et al. May 2007 B2
7231038 Warren Jun 2007 B2
7249846 Grand et al. Jul 2007 B2
7278734 Jannard et al. Oct 2007 B2
7331666 Swab et al. Feb 2008 B2
7445332 Jannard et al. Nov 2008 B2
7452073 Jannard et al. Nov 2008 B2
7458682 Lee Dec 2008 B1
7461936 Jannard Dec 2008 B2
7494216 Jannard et al. Feb 2009 B2
7501995 Morita et al. Mar 2009 B2
7512414 Jannard et al. Mar 2009 B2
7620432 Willins et al. Nov 2009 B2
7682018 Jannard Mar 2010 B2
7740353 Jannard Jun 2010 B2
7744213 Jannard et al. Jun 2010 B2
7753520 Fuziak, Jr. Jul 2010 B2
7760898 Howell et al. Jul 2010 B2
7798638 Fuziak, Jr. Sep 2010 B2
7806525 Howell et al. Oct 2010 B2
7918556 Lewis Apr 2011 B2
7959084 Wulff Jun 2011 B2
7966189 Le et al. Jun 2011 B2
7967433 Jannard et al. Jun 2011 B2
7969383 Eberl et al. Jun 2011 B2
7969657 Cakmakci et al. Jun 2011 B2
7976480 Grajales et al. Jul 2011 B2
7988283 Jannard Aug 2011 B2
7997723 Pienimaa et al. Aug 2011 B2
8010156 Warren Aug 2011 B2
8020989 Jannard et al. Sep 2011 B2
8025398 Jannard Sep 2011 B2
8072393 Riechel Dec 2011 B2
8092011 Sugihara et al. Jan 2012 B2
8098439 Amitai et al. Jan 2012 B2
8123352 Matsumoto et al. Feb 2012 B2
8140197 Lapidot et al. Mar 2012 B2
8184983 Ho et al. May 2012 B1
8212859 Tang et al. Jul 2012 B2
8855719 Jacobsen et al. Oct 2014 B2
8862186 Jacobsen et al. Oct 2014 B2
8929954 Jacobsen et al. Jan 2015 B2
20010003712 Roelofs Jun 2001 A1
20020015008 Kishida et al. Feb 2002 A1
20020030649 Zavracky et al. Mar 2002 A1
20020044152 Abbott, III et al. Apr 2002 A1
20020094845 Inasaka Jul 2002 A1
20020154070 Sato et al. Oct 2002 A1
20030016253 Aoki et al. Jan 2003 A1
20030067536 Boulanger et al. Apr 2003 A1
20030068057 Miller et al. Apr 2003 A1
20040113867 Tomine et al. Jun 2004 A1
20040193413 Wilson et al. Sep 2004 A1
20040210852 Balakrishnan et al. Oct 2004 A1
20040267527 Creamer et al. Dec 2004 A1
20050114140 Brackett et al. May 2005 A1
20050237296 Lee Oct 2005 A1
20050245292 Bennett et al. Nov 2005 A1
20050264527 Lin Dec 2005 A1
20060010368 Kashi Jan 2006 A1
20060028400 Lapstun et al. Feb 2006 A1
20060061551 Fateh Mar 2006 A1
20060109237 Morita et al. May 2006 A1
20060132382 Jannard Jun 2006 A1
20060178085 Sotereanos et al. Aug 2006 A1
20060221266 Kato et al. Oct 2006 A1
20060238877 Ashkenazi et al. Oct 2006 A1
20070009125 Frerking et al. Jan 2007 A1
20070030174 Randazzo et al. Feb 2007 A1
20070103388 Spitzer May 2007 A1
20070180979 Rosenberg Aug 2007 A1
20070220108 Whitaker Sep 2007 A1
20070265495 Vayser Nov 2007 A1
20080055194 Baudino et al. Mar 2008 A1
20080084992 Peddireddy et al. Apr 2008 A1
20080120141 Kariathungal et al. May 2008 A1
20080144854 Abreu Jun 2008 A1
20080180640 Ito Jul 2008 A1
20080198324 Fuziak Aug 2008 A1
20080200774 Luo Aug 2008 A1
20080239080 Moscato Oct 2008 A1
20090002640 Yang et al. Jan 2009 A1
20090079839 Fischer et al. Mar 2009 A1
20090093304 Ohta Apr 2009 A1
20090099836 Jacobsen et al. Apr 2009 A1
20090117890 Jacobsen et al. May 2009 A1
20090128448 Riechel May 2009 A1
20090154719 Wulff et al. Jun 2009 A1
20090180195 Cakmakci et al. Jul 2009 A1
20090182562 Caire et al. Jul 2009 A1
20090204410 Mozer et al. Aug 2009 A1
20090251409 Parkinson et al. Oct 2009 A1
20100020229 Hershey et al. Jan 2010 A1
20100033830 Yung Feb 2010 A1
20100053069 Tricoukes et al. Mar 2010 A1
20100073201 Holcomb et al. Mar 2010 A1
20100117930 Bacabara et al. May 2010 A1
20100119052 Kambli May 2010 A1
20100121480 Stelzer et al. May 2010 A1
20100128626 Anderson et al. May 2010 A1
20100141554 Devereaux et al. Jun 2010 A1
20100156812 Stallings et al. Jun 2010 A1
20100164990 Van Doorn Jul 2010 A1
20100171680 Lapidot et al. Jul 2010 A1
20100182137 Pryor Jul 2010 A1
20100238184 Janicki Sep 2010 A1
20100245585 Fisher et al. Sep 2010 A1
20100271587 Pavlopoulos Oct 2010 A1
20100277563 Gupta et al. Nov 2010 A1
20100289817 Meier et al. Nov 2010 A1
20100302137 Benko et al. Dec 2010 A1
20100309295 Chow Dec 2010 A1
20110001699 Jacobsen et al. Jan 2011 A1
20110089207 Tricoukes et al. Apr 2011 A1
20110090135 Tricoukes et al. Apr 2011 A1
20110092825 Gopinathan et al. Apr 2011 A1
20110134910 Chao-Suren et al. Jun 2011 A1
20110187640 Jacobsen et al. Aug 2011 A1
20110214082 Osterhout et al. Sep 2011 A1
20110221669 Shams et al. Sep 2011 A1
20110221671 King, III et al. Sep 2011 A1
20110227812 Haddick et al. Sep 2011 A1
20110227813 Haddick et al. Sep 2011 A1
20110254698 Eberl et al. Oct 2011 A1
20110255050 Jannard et al. Oct 2011 A1
20110273662 Hwang et al. Nov 2011 A1
20120013843 Jannard Jan 2012 A1
20120026071 Hamdani et al. Feb 2012 A1
20120056846 Zaliva Mar 2012 A1
20120062445 Haddick et al. Mar 2012 A1
20120068914 Jacobsen et al. Mar 2012 A1
20120075177 Jacobsen et al. Mar 2012 A1
20120089392 Larco et al. Apr 2012 A1
20120092208 LeMire et al. Apr 2012 A1
20120105740 Jannard et al. May 2012 A1
20120110456 Larco et al. May 2012 A1
20120114131 Tricoukes et al. May 2012 A1
20120173100 Ellis Jul 2012 A1
20120188245 Hyatt Jul 2012 A1
20120236025 Jacobsen et al. Sep 2012 A1
20120287284 Jacobsen et al. Nov 2012 A1
20120294549 Doepke Nov 2012 A1
20130174205 Jacobsen et al. Jul 2013 A1
20130231937 Woodall et al. Sep 2013 A1
20130274985 Lee et al. Oct 2013 A1
20130289971 Parkinson Oct 2013 A1
20130300649 Parkinson et al. Nov 2013 A1
20140235169 Parkinson et al. Aug 2014 A1
20140368412 Jacobsen et al. Dec 2014 A1
20150072672 Jacobsen et al. Mar 2015 A1
Foreign Referenced Citations (8)
Number Date Country
WO 9521408 Aug 1995 WO
WO 9523994 Sep 1995 WO
WO 0079327 Dec 2000 WO
WO 2009076016 Jun 2009 WO
WO 2011051660 May 2011 WO
WO 2011097226 Aug 2011 WO
WO 2011114149 Sep 2011 WO
WO 2012040386 Mar 2012 WO
Non-Patent Literature Citations (9)
Entry
European Search Report for EP 12782481.1 dated Sep. 29, 2014.
International Preliminary Report on Patentability and Written Opinion, PCT/US2011/023337, mailing date, “Wireless Hands-Free Computing Headset with Detachable Accessories Controllable by Motion, Body Gesture and/or Vocal Commands”, Aug. 16, 2012, 8 pages.
Notification of Transmittal of International Search Report and Written Opinion of PCT/US2012/068686, “Wireless Hands-Free Computing Head Mounted Video Eyewear for Local/Remote Diagnosis and Repair”, Date of Mailing: Mar. 25, 2013, 11 pages.
Notification of Transmittal of International Search Report and Written Opinion of PCT/US2012/037284, “Wireless Hands-Free Computing Head Mounted Video Eyewear for Local/Remote Diagnosis and Repair”, dated Oct. 1, 2012.
Notification Concerning Transmittal of International Preliminary Report on Patentability of PCT/US2012/037284, “Headset Computer That Uses Motion and Voices to Control Information Display and Remote Devices”, Date of Mailing: Nov. 21, 2013, 7 pages.
EP 12782481.1 Supplemental European Search Report, “Context Sensitive Overlays in Voice Controlled Headset Computer Displays,” dated Sep. 29, 2014.
Morphew, M.E., et al., “Helmet Mounted Displays for Unmanned Aerial Vehicle Control”, Proceedings of SPIE, vol. 5442, Oct. 20, 2004.
International Search Report and Written Opinion for PCT/US2013/065927 dated Mar. 21, 2014, entitled “Improved Headset Computer Operation Using Vehicle Sensor Feedback for Remote Control Vehicle”.
International Preliminary Report on Patentability for PCT/US2013/065927 dated Jul. 16, 2015 entitled “Improved Headset Computer Operation Using Vehicle Sensor Feedback for Remote Control Vehicle”.
Related Publications (1)
Number Date Country
20130300649 A1 Nov 2013 US
Provisional Applications (1)
Number Date Country
61749196 Jan 2013 US
Continuation in Parts (1)
Number Date Country
Parent 13468207 May 2012 US
Child 13830931 US