Mobile computing devices, such as notebook PCs, smart phones, and tablet computing devices, are now common tools used for producing, analyzing, communicating, and consuming data in both business and personal life. Consumers continue to embrace a mobile digital lifestyle as the ease of access to digital information increases with high-speed wireless communications technologies becoming ubiquitous. Popular uses of mobile computing devices include displaying large amounts of high-resolution computer graphics information and video content, often wirelessly streamed to the device. While these devices typically include a display screen, the preferred visual experience of a high-resolution, large-format display cannot be easily replicated in such mobile devices because the physical size of such devices is limited to promote mobility. Another drawback of the aforementioned device types is that the user interface is hands-dependent, typically requiring a user to enter data or make selections using a keyboard (physical or virtual) or touch-screen display. As a result, consumers are now seeking a hands-free, high-quality, portable, color display solution to augment or replace their hands-dependent mobile devices.
Recently developed micro-displays can provide large-format, high-resolution color pictures and streaming video in a very small form factor. One application of such displays is integration into a wireless headset computer worn on the head of the user, with the display positioned within the user's field of view, similar in format to eyeglasses, an audio headset, or video eyewear. A “wireless computing headset” device includes one or more small, high-resolution micro-displays and optics to magnify the image. The micro-displays can provide super video graphics array (SVGA) (800×600) resolution, extended graphics array (XGA) (1024×768) resolution, or even higher resolutions. A wireless computing headset contains one or more wireless computing and communication interfaces, enabling data and streaming video capability, and provides greater convenience and mobility than hands-dependent devices. For more information concerning such devices, see co-pending patent applications entitled “Mobile Wireless Display Software Platform for Controlling Other Systems and Devices,” U.S. application Ser. No. 12/348,648, filed Jan. 5, 2009, “Handheld Wireless Display Devices Having High Resolution Display Suitable For Use as a Mobile Internet Device,” PCT International Application No. PCT/US09/38601, filed Mar. 27, 2009, and “Improved Headset Computer,” U.S. Application No. 61/638,419, filed Apr. 25, 2012, each of which is incorporated herein by reference in its entirety.
In one embodiment, a method includes providing a user interface in a headset computer and, in response to user utterance of a cue toggle command, displaying at least one cue in the user interface. Each cue can correspond to a voice command associated with code to execute. In response to user utterance of the voice command, the method can also include executing the code associated with the voice command.
In another embodiment, the method can further include displaying the interface without the cue at least one of prior to the cue toggle command and after a subsequent cue toggle command. Displaying the cue can include displaying words that activate the voice command. Displaying the cue can also include displaying the cue in the user interface corresponding to the voice command associated with the control, where the control is displayed in the user interface. Displaying the cue can include displaying the cue in the user interface corresponding to the voice command associated with the control, where the control is hidden from the user interface. Displaying the cue can include displaying the cue in the user interface corresponding to the voice command associated with the control, where the control is a global headset control. The cue can be loaded from a control, the control indicating the cue and voice command.
In another embodiment, a system for displaying a user interface in a headset computer can include a display module configured to provide a user interface in a headset computer. The display module can be further configured to, in response to user utterance of a cue toggle command, display at least one cue in the user interface. Each cue can correspond to a voice command associated with code to execute. The system can further include a command module configured to, in response to user utterance of the voice command, execute code associated with the voice command.
In another embodiment, a method of developing a user interface in a headset computer includes embedding a cue and a voice command in a control for the user interface. The method also includes providing the control to the user interface, the user interface configured to display the cue responsive to a cue toggle command.
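By way of illustration only, the following sketch (hypothetical Java; the class and method names are assumptions rather than part of the described system) shows cues that are hidden by default, displayed in response to a cue toggle command, hidden again after a subsequent toggle, and dispatched to associated code when a voice command is recognized:

```java
// Hypothetical sketch only; names and structure are illustrative assumptions,
// not the actual HSC 100 implementation.
import java.util.ArrayList;
import java.util.List;

class CueToggleExample {

    /** A cue pairs the words shown in the overlay with the voice command they activate. */
    static class Cue {
        final String text;       // words displayed in the overlay
        final Runnable action;   // code executed when the voice command is spoken
        Cue(String text, Runnable action) { this.text = text; this.action = action; }
    }

    private final List<Cue> cues = new ArrayList<>();
    private boolean cuesVisible = false;

    void addCue(String text, Runnable action) { cues.add(new Cue(text, action)); }

    /** Invoked when speech recognition hears the cue toggle command, e.g. "show commands". */
    void onCueToggle() {
        cuesVisible = !cuesVisible;
        if (cuesVisible) {
            for (Cue cue : cues) {
                System.out.println("[overlay] say: \"" + cue.text + "\"");
            }
        } else {
            System.out.println("[overlay] hidden"); // subsequent toggle restores the plain view
        }
    }

    /** Invoked when speech recognition hears one of the displayed voice commands. */
    void onVoiceCommand(String spoken) {
        for (Cue cue : cues) {
            if (cue.text.equalsIgnoreCase(spoken)) {
                cue.action.run();   // execute the code associated with the voice command
                return;
            }
        }
    }

    public static void main(String[] args) {
        CueToggleExample ui = new CueToggleExample();
        ui.addCue("Draft email", () -> System.out.println("opening email composer"));
        ui.onCueToggle();                  // first toggle: cues appear
        ui.onVoiceCommand("Draft email");  // spoken command executes its code
        ui.onCueToggle();                  // subsequent toggle: cues hidden again
    }
}
```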
The foregoing will be apparent from the following more particular description of example embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments of the present invention.
Example embodiments of the HSC 100 can receive user input through sensing voice commands, head movements 110, 111, 112, and hand gestures 113, or any combination thereof. Microphone(s) operatively coupled to or preferably integrated into the HSC 100 can be used to capture speech commands, which are then digitized and processed using automatic speech recognition techniques. Gyroscopes, accelerometers, and other micro-electromechanical system sensors can be integrated into the HSC 100 and used to track the user's head movement to provide user input commands. Cameras or other motion tracking sensors can be used to monitor a user's hand gestures for user input commands. Such a user interface overcomes the hands-dependent formats of other mobile devices.
The HSC 100 can be used in various ways. It can be used as a remote display for streaming video signals received from a remote host computing device 200 (shown in
A head worn frame 1000 and strap 1002 are generally configured so that a user can wear the headset computer device 100 on the user's head. A housing 1004 is generally a low profile unit which houses the electronics, such as the microprocessor, memory or other storage device, along with other associated circuitry. Speakers 1006 provide audio output to the user so that the user can hear information. Microdisplay subassembly 1010 is used to render visual information to the user. It is coupled to the arm 1008. The arm 1008 generally provides physical support such that the microdisplay subassembly is able to be positioned within the user's field of view 300 (
According to aspects that will be explained in more detail below, the HSC display device 100 allows a user to select a field of view 300 within a much larger area defined by a virtual display 400. The user can typically control the position, extent (e.g., X-Y or 3D range), and/or magnification of the field of view 300.
While what is shown in
In one embodiment the HSC 100 may take the form of the HSC described in a co-pending U.S. Patent Publication No. 2011/0187640 which is hereby incorporated by reference in its entirety.
In another embodiment, the invention relates to the concept of using a Head Mounted Display (HMD) 1010 in conjunction with an external ‘smart’ device 200 (such as a smartphone or tablet) to provide information and control to the user hands-free. The invention requires transmission of only small amounts of data, providing a more reliable data transfer method running in real time.
In this sense, therefore, the amount of data to be transmitted over the connection 150 is small: simple instructions on how to lay out a screen, which text to display, and other stylistic information such as drawing arrows, background colors, images to include, etc.
Additional data, such as a video stream, could be streamed over the same connection 150 or another connection and displayed on the screen 1010 if required by the Host 200.
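By way of illustration only, the following hedged sketch shows the kind of compact layout instruction that might be sent over the connection 150; the message format and field names are assumptions for illustration and do not reflect an actual Host 200/HSC 100 protocol:

```java
// Hypothetical message format -- an assumption for illustration, not the actual
// Host 200 / HSC 100 protocol.
class LayoutMessageExample {
    static String buildLayoutMessage() {
        // A few hundred bytes describing what to draw, rather than the pixels themselves.
        return "{"
             + "\"title\":\"Inbox\","
             + "\"backgroundColor\":\"#202020\","
             + "\"arrows\":[\"left\",\"right\"],"
             + "\"images\":[\"mail_icon.png\"],"
             + "\"text\":[\"3 new messages\"]"
             + "}";
    }

    public static void main(String[] args) {
        System.out.println(buildLayoutMessage());  // sent over connection 150 to the HSC
    }
}
```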
This invention relates to the viewing of context-sensitive overlays within applications on voice-controlled HSCs 100.
The concept is the contextual, on-demand presentation of data overlaid on the current view. Overlays can be called up by the user with a voice command, typically “Show commands.” The voice command is standard across the system 100 and available at all times. This command causes the HSC 100 to display applicable voice commands and other information in a context-sensitive and intuitive way.
The applicable commands are shown on a semi-transparent overlay of the current screen view of the display unit 1010. This allows the user to retain the context of the screen for which the overlay was called up.
The overlay and displayed applicable commands fade away after a short period of time. This is accomplished by a timing mechanism that refreshes the screen view.
The applicable commands are displayed in order of relevance. The most relevant command is given more prominence in terms of placement than less relevant commands. The HSC 100 determines relevancy based on the current context of the display 1010 contents.
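By way of illustration only, the following sketch assumes a simple integer relevance score per command and uses a standard Java timer to stand in for the timing mechanism; the names and scoring shown are assumptions, not the actual HSC 100 implementation:

```java
// Hypothetical sketch; relevance scoring and the timing mechanism shown here are
// illustrative assumptions, not the actual HSC 100 implementation.
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Timer;
import java.util.TimerTask;

class OverlaySchedulerExample {
    static class CommandCue {
        final String text;
        final int relevance;   // higher = more relevant in the current screen context
        CommandCue(String text, int relevance) { this.text = text; this.relevance = relevance; }
    }

    /** Show the cues most-relevant-first, then refresh the screen view after a short delay. */
    static void showOverlay(List<CommandCue> cues, long fadeAfterMillis) {
        cues.sort(Comparator.comparingInt((CommandCue c) -> c.relevance).reversed());
        for (CommandCue cue : cues) {
            System.out.println("[overlay] " + cue.text);   // most prominent placement first
        }
        new Timer(true).schedule(new TimerTask() {
            @Override public void run() {
                System.out.println("[overlay] faded; screen view refreshed");
            }
        }, fadeAfterMillis);
    }

    public static void main(String[] args) throws InterruptedException {
        List<CommandCue> cues = new ArrayList<>(List.of(
                new CommandCue("Next page", 1),
                new CommandCue("Draft email", 3)));
        showOverlay(cues, 250);
        Thread.sleep(500);   // keep the demo alive long enough for the fade to fire
    }
}
```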
Each screen in the relevant system is made up of user-interface (UI) components, some of which are ‘controls’. A control is a UI component that provides information to the user or enables some form of functionality. Examples of controls are buttons, radio buttons, text boxes, check boxes, drop down menus, file menus, ribbon menus, live tiles, etc. Within the software developer's component library, these are available in their various forms, allowing customization of certain features. For example, one such control might be a ‘button,’ simply enabling the user to press it using a voice command available on the button. Controls, such as the ‘button’ control, are available to the developer, for example, as part of the developer's component library or other library. The developer can insert the pre-coded control and customize it to his or her liking, instead of manually coding the control from scratch.
A “Show Commands” function is built into the controls of the developer library. When the developer, for example, creates a button and specifies a text string to be written onto the button, the text string becomes the default voice command to activate the button, unless the developer (or user) overrides the voice command. The control (e.g., the button) is configured to react to a “show commands” voice command by overlaying the text string to activate the control over the control itself, or near the control.
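By way of illustration only, the following sketch shows a hypothetical ‘button’ control in which the text string written onto the button becomes the default voice command unless overridden, and which reacts to “show commands” by revealing its activating words near the control; the API shown is an assumption, not the actual developer component library:

```java
// Hypothetical developer-library control; field and method names are assumptions
// for illustration, not the actual component library API.
class ButtonControlExample {
    static class Button {
        private final String label;     // text written onto the button
        private String voiceCommand;    // defaults to the label unless overridden

        Button(String label) {
            this.label = label;
            this.voiceCommand = label;  // the label becomes the default voice command
        }

        /** Developer (or user) override of the default voice command. */
        void setVoiceCommand(String command) { this.voiceCommand = command; }

        /** Reaction to the "show commands" voice command: reveal the activating words. */
        void onShowCommands() {
            System.out.println("[overlay near '" + label + "'] say: \"" + voiceCommand + "\"");
        }
    }

    public static void main(String[] args) {
        Button send = new Button("Send");           // "Send" is the default voice command
        Button reply = new Button("Reply All");
        reply.setVoiceCommand("Reply to everyone"); // explicit override
        send.onShowCommands();
        reply.onShowCommands();
    }
}
```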
Every User Interface screen made available on the HSC 100 has the ability to receive a “Show Commands” voice command (e.g., a system command, available by default). Therefore, when a screen is constructed using controls from the UI library, show commands functionality is built in, providing guidance as to the voice commands available to the user. These voice commands are (by default) shown in context of the current displayed contents (screen view).
Other available voice commands that are not associated with a visible control can also be placed within the show commands overlay. These are placed in the overlay by the developer by adding a voice-command-only control or a hidden control, and they provide a visual cue for voice commands that are not associated with a button or other visible control.
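By way of illustration only, the following sketch shows a screen assembled from controls that each carry a cue, including a hidden, voice-command-only control whose cue still appears in the overlay; the ScreenControl API shown is an assumption for illustration, not the actual UI library:

```java
// Hypothetical screen assembly; the ScreenControl / hidden-control API shown here is
// an assumption for illustration, not the actual UI library.
import java.util.ArrayList;
import java.util.List;

class ShowCommandsScreenExample {
    static class ScreenControl {
        final String cueText;    // words shown in the show-commands overlay
        final boolean visible;   // false for voice-command-only (hidden) controls
        ScreenControl(String cueText, boolean visible) {
            this.cueText = cueText;
            this.visible = visible;
        }
    }

    private final List<ScreenControl> controls = new ArrayList<>();

    void add(ScreenControl control) { controls.add(control); }

    /** "Show Commands" collects cues from every control, visible or hidden. */
    void onShowCommands() {
        for (ScreenControl control : controls) {
            String tag = control.visible ? "control" : "voice-only";
            System.out.println("[overlay] (" + tag + ") say: \"" + control.cueText + "\"");
        }
    }

    public static void main(String[] args) {
        ShowCommandsScreenExample screen = new ShowCommandsScreenExample();
        screen.add(new ScreenControl("Draft email", true));   // button-backed command
        screen.add(new ScreenControl("Next page", false));    // hidden, voice-command only
        screen.onShowCommands();
    }
}
```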
The user interface 302 in show commands mode also shows a plurality of implicit voice commands 1-9 306a-i. The implicit voice commands 1-9 306a-i do not correspond to any particular visual control of the user interface; rather, they are voice commands that are available to the user. For example, the user can say implicit voice commands 1 and 2 306a-b to move to the previous and next page, respectively. The user can draft an email by saying implicit voice command 3 306c. The user can manage his or her email account by saying implicit voice command 4 306d. The user can see his or her accounts by saying implicit voice command 5 306e. The user can switch folders by saying implicit voice command 6 306f. The user can refresh the inbox by saying implicit voice command 7 306g.
Further, the user can go back to a previous screen by saying implicit voice command 8 306h. The user can return to a home screen by saying implicit voice command 9 306i. Implicit voice commands 8 and 9 can be universal to all screens on the HSC. Voice commands 1-6 304a-f and implicit voice commands 1-7 306a-g are local commands for this particular application. However, in other embodiments, implicit voice commands 1-2 306a-b can be global commands for moving to previous and next pages of applications.
The voice command overlay aids the user by de-cluttering the screen of options and buttons. The voice commands further help prompt the user as to how to use the system, which is especially useful while the user is learning how to use the device and voice commands.
The system then determines whether it has received a voice command (e.g., a voice command shown by one of the cues) (508). If not, it keeps listening for a voice command (508). If so, however, it executes the code associated with the voice command (510).
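By way of illustration only, the following sketch shows the dispatch step: a recognized utterance that matches a displayed cue executes its associated code (510), while anything else is ignored so that the system keeps listening (508); the listener signature and command map are assumptions, not the actual HSC 100 speech-recognition API:

```java
// Hypothetical recognizer callback; the listener signature and command map are
// assumptions for illustration, not the HSC 100 speech-recognition API.
import java.util.HashMap;
import java.util.Map;

class VoiceDispatchExample {
    /** Maps each cue's voice command to the code to execute (step 510). */
    static void onRecognitionResult(String utterance, Map<String, Runnable> commands) {
        Runnable action = commands.get(utterance.toLowerCase());
        if (action == null) {
            // Step 508: not a known command -- keep listening.
            return;
        }
        action.run();   // Step 510: execute the code associated with the voice command.
    }

    public static void main(String[] args) {
        Map<String, Runnable> commands = new HashMap<>();
        commands.put("draft email", () -> System.out.println("executing: draft email"));
        onRecognitionResult("Draft Email", commands);    // recognized -> executed
        onRecognitionResult("unknown phrase", commands); // not recognized -> keep listening
    }
}
```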
Client computer(s)/devices 50 and server computer(s) 60 provide processing, storage, and input/output devices executing application programs and the like. Client computer(s)/devices 50 can also be linked through communications network 70 to other computing devices, including other client devices/processes 50 and server computer(s) 60. Communications network 70 can be part of a remote access network, a global network (e.g., the Internet), a worldwide collection of computers, Local area or Wide area networks, and gateways that currently use respective protocols (TCP/IP, Bluetooth, etc.) to communicate with one another. Other electronic device/computer network architectures are suitable.
In one embodiment, the processor routines 92 and data 94 are a computer program product (generally referenced 92), including a computer readable medium (e.g., a removable storage medium such as one or more DVD-ROM's, CD-ROM's, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the invention system. Computer program product 92 can be installed by any suitable software installation procedure, as is well known in the art. In another embodiment, at least a portion of the software instructions may also be downloaded over a cable, communication and/or wireless connection. In other embodiments, the invention programs are a computer program propagated signal product 107 embodied on a propagated signal on a propagation medium (e.g., a radio wave, an infrared wave, a laser wave, a sound wave, or an electrical wave propagated over a global network such as the Internet, or other network(s)). Such carrier medium or signals provide at least a portion of the software instructions for the present invention routines/program 92.
In alternate embodiments, the propagated signal is an analog carrier wave or digital signal carried on the propagated medium. For example, the propagated signal may be a digitized signal propagated over a global network (e.g., the Internet), a telecommunications network, or other network. In one embodiment, the propagated signal is a signal that is transmitted over the propagation medium over a period of time, such as the instructions for a software application sent in packets over a network over a period of milliseconds, seconds, minutes, or longer. In another embodiment, the computer readable medium of computer program product 92 is a propagation medium that the computer system 50 may receive and read, such as by receiving the propagation medium and identifying a propagated signal embodied in the propagation medium, as described above for computer program propagated signal product.
Generally speaking, the term “carrier medium” or transient carrier encompasses the foregoing transient signals, propagated signals, propagated medium, storage medium and the like.
While this invention has been particularly shown and described with references to example embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.
This application claims the benefit of U.S. Application No. 61/749,240, filed Jan. 4, 2013, and is a continuation-in-part of U.S. application Ser. No. 13/234,916, filed Sep. 16, 2011, which claims the benefit of U.S. Application No. 61/384,586, filed Sep. 20, 2010. This application also claims priority to and is a continuation-in-part of U.S. application Ser. No. 13/799,888, filed Mar. 13, 2013, which claims the benefit of U.S. Application No. 61/653,127, filed May 30, 2012. The entire teachings of the above applications are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4567479 | Boyd | Jan 1986 | A |
5005213 | Hanson et al. | Apr 1991 | A |
5208449 | Eastman | May 1993 | A |
5594469 | Freeman et al. | Jan 1997 | A |
5689619 | Smyth | Nov 1997 | A |
5698834 | Worthington | Dec 1997 | A |
5742263 | Wang | Apr 1998 | A |
5818455 | Stone | Oct 1998 | A |
5990793 | Beiback | Nov 1999 | A |
6010216 | Jesiek | Jan 2000 | A |
6084556 | Zwern | Jul 2000 | A |
6108197 | Janik | Aug 2000 | A |
6192343 | Morgan | Feb 2001 | B1 |
6198462 | Daily et al. | Mar 2001 | B1 |
6204974 | Spitzer | Mar 2001 | B1 |
6313864 | Tabata et al. | Nov 2001 | B1 |
6325507 | Jannard et al. | Dec 2001 | B1 |
6369952 | Rallison et al. | Apr 2002 | B1 |
6408257 | Harrington et al. | Jun 2002 | B1 |
6532446 | King | Mar 2003 | B1 |
6538676 | Peters et al. | Mar 2003 | B1 |
6678906 | Hennings et al. | Aug 2004 | B1 |
6778906 | Hennings et al. | Aug 2004 | B1 |
6798391 | Petersen, III | Sep 2004 | B2 |
6853293 | Swartz et al. | Feb 2005 | B2 |
6900777 | Hebert et al. | May 2005 | B1 |
6922184 | Lawrence et al. | Jul 2005 | B2 |
6956614 | Quintana et al. | Oct 2005 | B1 |
6965862 | Schuller | Nov 2005 | B2 |
6966647 | Jannard et al. | Nov 2005 | B2 |
7004582 | Jannard et al. | Feb 2006 | B2 |
7013009 | Warren | Mar 2006 | B2 |
7082393 | Lahr | Jul 2006 | B2 |
7147324 | Jannard et al. | Dec 2006 | B2 |
7150526 | Jannard et al. | Dec 2006 | B2 |
7213917 | Jannard et al. | May 2007 | B2 |
7216973 | Jannard et al. | May 2007 | B2 |
7219994 | Jannard et al. | May 2007 | B2 |
7231038 | Warren | Jun 2007 | B2 |
7249846 | Grand et al. | Jul 2007 | B2 |
7278734 | Jannard et al. | Oct 2007 | B2 |
7331666 | Swab et al. | Feb 2008 | B2 |
7445332 | Jannard et al. | Nov 2008 | B2 |
7452073 | Jannard et al. | Nov 2008 | B2 |
7458682 | Lee | Dec 2008 | B1 |
7461936 | Jannard | Dec 2008 | B2 |
7494216 | Jannard et al. | Feb 2009 | B2 |
7501995 | Morita et al. | Mar 2009 | B2 |
7512414 | Jannard et al. | Mar 2009 | B2 |
7620432 | Williams et al. | Nov 2009 | B2 |
7620433 | Bodylet | Nov 2009 | B2 |
7682018 | Jannard | Mar 2010 | B2 |
7740353 | Jannard | Jun 2010 | B2 |
7744213 | Jannard et al. | Jun 2010 | B2 |
7753520 | Fuziak, Jr. | Jul 2010 | B2 |
7760898 | Howell et al. | Jul 2010 | B2 |
7798638 | Fuziak, Jr. | Sep 2010 | B2 |
7806525 | Howell et al. | Oct 2010 | B2 |
7918556 | Lewis | Apr 2011 | B2 |
7959084 | Wulff | Jun 2011 | B2 |
7966189 | Le et al. | Jun 2011 | B2 |
7967433 | Jannard et al. | Jun 2011 | B2 |
7969383 | Eberl et al. | Jun 2011 | B2 |
7969657 | Cakmakci et al. | Jun 2011 | B2 |
7976480 | Grajales et al. | Jul 2011 | B2 |
7988283 | Jannard et al. | Aug 2011 | B2 |
7997723 | Pienimaa et al. | Aug 2011 | B2 |
8010156 | Warren | Aug 2011 | B2 |
8020989 | Jannard et al. | Sep 2011 | B2 |
8025398 | Jannard | Sep 2011 | B2 |
8072393 | Riechel | Dec 2011 | B2 |
8092011 | Sugihara et al. | Jan 2012 | B2 |
8098439 | Amitai et al. | Jan 2012 | B2 |
8123352 | Matsumoto et al. | Feb 2012 | B2 |
8140197 | Lapidot et al. | Mar 2012 | B2 |
8108143 | Tester | May 2012 | B1 |
8170262 | Liu | May 2012 | B1 |
8184983 | Ho et al. | May 2012 | B1 |
8212859 | Tang et al. | Jul 2012 | B2 |
8327295 | Ikeda | Dec 2012 | B2 |
8577427 | Serota | Nov 2013 | B2 |
8838075 | Basir | Sep 2014 | B2 |
8855719 | Jacobsen et al. | Oct 2014 | B2 |
8862186 | Jacobsen et al. | Oct 2014 | B2 |
8885719 | Kondo et al. | Nov 2014 | B2 |
8929954 | Jacobsen et al. | Jan 2015 | B2 |
9118875 | Ida | Aug 2015 | B2 |
9122307 | Jacobsen et al. | Sep 2015 | B2 |
9235262 | Jacobsen et al. | Jan 2016 | B2 |
9294607 | Jacobsen et al. | Mar 2016 | B2 |
9301085 | Parkinson et al. | Mar 2016 | B2 |
9316827 | Lindley et al. | Apr 2016 | B2 |
9369760 | Jacobsen et al. | Jun 2016 | B2 |
9507772 | Parkinson et al. | Nov 2016 | B2 |
9817232 | Lindley et al. | Nov 2017 | B2 |
20010003712 | Roelofs | Jun 2001 | A1 |
20010035845 | Zwern | Nov 2001 | A1 |
20020015008 | Kishida | Feb 2002 | A1 |
20020030649 | Zavracky et al. | Mar 2002 | A1 |
20020044152 | Abbott, III et al. | Apr 2002 | A1 |
20020065115 | Lindholm | May 2002 | A1 |
20020094845 | Inasaka | Jul 2002 | A1 |
20020154070 | Sato et al. | Oct 2002 | A1 |
20020158815 | Zwern | Oct 2002 | A1 |
20030016253 | Aoki et al. | Jan 2003 | A1 |
20030046401 | Abbott | Mar 2003 | A1 |
20030065805 | Barnes, Jr. | Apr 2003 | A1 |
20030067536 | Boulanger et al. | Apr 2003 | A1 |
20030068057 | Miller et al. | Apr 2003 | A1 |
20030222917 | Trantow | Dec 2003 | A1 |
20040102967 | Levin | May 2004 | A1 |
20040113867 | Tomine et al. | Jun 2004 | A1 |
20040193413 | Wilson et al. | Sep 2004 | A1 |
20040210852 | Balakrishnan et al. | Oct 2004 | A1 |
20040267527 | Creamer et al. | Dec 2004 | A1 |
20050047629 | Farrell et al. | Mar 2005 | A1 |
20050108643 | Schybergson et al. | May 2005 | A1 |
20050114140 | Brackett et al. | May 2005 | A1 |
20050237296 | Lee | Oct 2005 | A1 |
20050245292 | Bennett et al. | Nov 2005 | A1 |
20050261890 | Robinson | Nov 2005 | A1 |
20050264527 | Lindholm | Dec 2005 | A1 |
20060010368 | Kashi | Jan 2006 | A1 |
20060028400 | Lapstun et al. | Feb 2006 | A1 |
20060061551 | Fateh | Mar 2006 | A1 |
20060074624 | Sahashi | Apr 2006 | A1 |
20060109237 | Morita et al. | May 2006 | A1 |
20060132382 | Jannard | Jun 2006 | A1 |
20060166705 | Seshadri et al. | Jul 2006 | A1 |
20060178085 | Sotereanos et al. | Aug 2006 | A1 |
20060221266 | Kato et al. | Oct 2006 | A1 |
20060238877 | Ashkenazi et al. | Oct 2006 | A1 |
20070009125 | Frerking et al. | Jan 2007 | A1 |
20070030174 | Randazzo et al. | Feb 2007 | A1 |
20070053544 | Jhao et al. | Mar 2007 | A1 |
20070093279 | Janik | Apr 2007 | A1 |
20070103388 | Spitzer | May 2007 | A1 |
20070180979 | Rosenberg | Aug 2007 | A1 |
20070220108 | Whitaker | Sep 2007 | A1 |
20070238475 | Goedken | Oct 2007 | A1 |
20070265495 | Vayser | Nov 2007 | A1 |
20080052643 | Ike et al. | Feb 2008 | A1 |
20080055194 | Baudino et al. | Mar 2008 | A1 |
20080084992 | Peddireddy et al. | Apr 2008 | A1 |
20080120141 | Kariathungal et al. | May 2008 | A1 |
20080144854 | Abreu | Jun 2008 | A1 |
20080180640 | Ito | Jul 2008 | A1 |
20080198324 | Fuziak | Aug 2008 | A1 |
20080200774 | Luo | Aug 2008 | A1 |
20080201634 | Gibb et al. | Aug 2008 | A1 |
20080211768 | Breen et al. | Sep 2008 | A1 |
20080239080 | Moscato | Oct 2008 | A1 |
20090002640 | Yang et al. | Jan 2009 | A1 |
20090079839 | Fischer et al. | Mar 2009 | A1 |
20090093304 | Ohta | Apr 2009 | A1 |
20090099836 | Jacobsen et al. | Apr 2009 | A1 |
20090117890 | Jacobsen et al. | May 2009 | A1 |
20090128448 | Riechel | May 2009 | A1 |
20090154719 | Wulff et al. | Jun 2009 | A1 |
20090180195 | Cakmakci et al. | Jul 2009 | A1 |
20090182562 | Caire | Jul 2009 | A1 |
20090204410 | Mozer et al. | Aug 2009 | A1 |
20090213071 | Wang et al. | Aug 2009 | A1 |
20090240488 | White | Sep 2009 | A1 |
20090251409 | Parkinson et al. | Oct 2009 | A1 |
20100001699 | Dragojevic | Jan 2010 | A1 |
20100020229 | Hershey et al. | Jan 2010 | A1 |
20100033830 | Yung | Feb 2010 | A1 |
20100041447 | Graylin | Feb 2010 | A1 |
20100053069 | Tricoukes et al. | Mar 2010 | A1 |
20100073201 | Holcomb et al. | Mar 2010 | A1 |
20100106497 | Phillips | Apr 2010 | A1 |
20100117930 | Bacabara | May 2010 | A1 |
20100119052 | Kambli | May 2010 | A1 |
20100121480 | Stelzer et al. | May 2010 | A1 |
20100128626 | Anderson et al. | May 2010 | A1 |
20100141554 | Devereaux et al. | Jun 2010 | A1 |
20100156812 | Stallings et al. | Jun 2010 | A1 |
20100164990 | Van Doorn | Jul 2010 | A1 |
20100169073 | Almagro | Jul 2010 | A1 |
20100171680 | Lapidot et al. | Jul 2010 | A1 |
20100182137 | Pryor | Jul 2010 | A1 |
20100204981 | Ribeiro | Aug 2010 | A1 |
20100225734 | Weller et al. | Sep 2010 | A1 |
20100235161 | Kim et al. | Sep 2010 | A1 |
20100238184 | Janicki | Sep 2010 | A1 |
20100245585 | Fisher et al. | Sep 2010 | A1 |
20100250231 | Almagro | Sep 2010 | A1 |
20100309295 | Chow | Sep 2010 | A1 |
20100271587 | Pavlopoulos | Oct 2010 | A1 |
20100277563 | Gupta et al. | Nov 2010 | A1 |
20100289817 | Meier et al. | Nov 2010 | A1 |
20100302137 | Benko et al. | Dec 2010 | A1 |
20100306711 | Kahn et al. | Dec 2010 | A1 |
20110001699 | Jacobsen et al. | Jan 2011 | A1 |
20110089207 | Tricoukes et al. | Apr 2011 | A1 |
20110090135 | Tricoukes et al. | Apr 2011 | A1 |
20110092825 | Gopinathan et al. | Apr 2011 | A1 |
20110134910 | Chao-Suren et al. | Jun 2011 | A1 |
20110187640 | Jacobsen et al. | Aug 2011 | A1 |
20110214082 | Osterhout et al. | Sep 2011 | A1 |
20110221656 | Haddick et al. | Sep 2011 | A1 |
20110221669 | Shams et al. | Sep 2011 | A1 |
20110221671 | King, III et al. | Sep 2011 | A1 |
20110227812 | Haddick et al. | Sep 2011 | A1 |
20110227813 | Haddick et al. | Sep 2011 | A1 |
20110238405 | Pedre | Sep 2011 | A1 |
20110248904 | Miyawaki | Oct 2011 | A1 |
20110254698 | Eberl et al. | Oct 2011 | A1 |
20110254865 | Yee et al. | Oct 2011 | A1 |
20110255050 | Jannard et al. | Oct 2011 | A1 |
20110273662 | Hwang et al. | Nov 2011 | A1 |
20120013843 | Jannard | Jan 2012 | A1 |
20120026071 | Hamdani et al. | Feb 2012 | A1 |
20120056846 | Zaliva | Mar 2012 | A1 |
20120062445 | Haddick et al. | Mar 2012 | A1 |
20120068914 | Jacobsen | Mar 2012 | A1 |
20120075177 | Jacobsen et al. | Mar 2012 | A1 |
20120089392 | Larco et al. | Apr 2012 | A1 |
20120105740 | Jannard et al. | May 2012 | A1 |
20120110456 | Larco et al. | May 2012 | A1 |
20120114131 | Tricoukes et al. | May 2012 | A1 |
20120166203 | Fuchs et al. | Jun 2012 | A1 |
20120088245 | Rotter et al. | Jul 2012 | A1 |
20120173100 | Ellis | Jul 2012 | A1 |
20120188245 | Hyatt | Jul 2012 | A1 |
20120236025 | Jacobsen et al. | Sep 2012 | A1 |
20120287284 | Jacobsen et al. | Nov 2012 | A1 |
20120302288 | Born et al. | Nov 2012 | A1 |
20130070930 | Johnson | Mar 2013 | A1 |
20130174205 | Jacobsen et al. | Jul 2013 | A1 |
20130231937 | Woodall et al. | Sep 2013 | A1 |
20130239000 | Parkinson | Sep 2013 | A1 |
20130274985 | Lee et al. | Oct 2013 | A1 |
20130288753 | Jacobsen et al. | Oct 2013 | A1 |
20130289971 | Parkinson | Oct 2013 | A1 |
20130300649 | Parkinson et al. | Nov 2013 | A1 |
20140003616 | Johnson et al. | Jan 2014 | A1 |
20140059263 | Rosenberg et al. | Feb 2014 | A1 |
20140093103 | Breece et al. | Apr 2014 | A1 |
20140111427 | Lindley et al. | Apr 2014 | A1 |
20140223299 | Han | Aug 2014 | A1 |
20140235169 | Parkinson et al. | Aug 2014 | A1 |
20140334644 | Selig | Nov 2014 | A1 |
20140368412 | Jacobsen et al. | Dec 2014 | A1 |
20150039311 | Clark et al. | Feb 2015 | A1 |
20150072672 | Jacobsen et al. | Mar 2015 | A1 |
20150279354 | Gruenstein | Oct 2015 | A1 |
20150346489 | Lindley et al. | Dec 2015 | A1 |
Number | Date | Country |
---|---|---|
1735019 | Feb 2006 | CN |
1797299 | Jul 2006 | CN |
101196793 | Jun 2008 | CN |
101243392 | Aug 2008 | CN |
101349944 | Jan 2009 | CN |
101444087 | May 2009 | CN |
101581969 | Nov 2009 | CN |
101599267 | Dec 2009 | CN |
101620511 | Jan 2010 | CN |
101755299 | Jun 2010 | CN |
102541438 | Jul 2012 | CN |
102812417 | Dec 2012 | CN |
103 44 062 | Apr 2005 | DE |
2 207 164 | Jul 2010 | EP |
09-034895 | Feb 1997 | JP |
10-020867 | Jan 1998 | JP |
2001-100878 | Apr 2001 | JP |
2001-506389 | May 2001 | JP |
2001-202175 | Jul 2001 | JP |
2001-216069 | Aug 2001 | JP |
2002-525769 | Aug 2002 | JP |
2003-241880 | Aug 2003 | JP |
2004-233117 | Aug 2004 | JP |
2005-012377 | Jan 2005 | JP |
2007-079978 | Mar 2007 | JP |
2007-213501 | Aug 2007 | JP |
2008-052590 | Mar 2008 | JP |
2008-278536 | Nov 2008 | JP |
2011-511935 | Jul 2009 | JP |
2009-179062 | Aug 2009 | JP |
2010-102163 | May 2010 | JP |
2011-511935 | Apr 2011 | JP |
2011-198150 | Oct 2011 | JP |
2012-002568 | Jan 2012 | JP |
2012-044429 | Mar 2012 | JP |
2012-056568 | Mar 2012 | JP |
2012-174149 | Sep 2012 | JP |
WO 1995021408 | Aug 1995 | WO |
WO 1995023994 | Sep 1995 | WO |
WO 9901838 | Jan 1999 | WO |
WO 0017848 | Mar 2000 | WO |
WO 2000079327 | Dec 2000 | WO |
WO 2005017729 | Feb 2005 | WO |
WO 2009076016 | Jun 2009 | WO |
WO 2009091639 | Jul 2009 | WO |
WO 2009120984 | Oct 2009 | WO |
WO 2010019634 | Feb 2010 | WO |
WO 2010129679 | Nov 2010 | WO |
WO 2011051660 | May 2011 | WO |
WO 2011097226 | Aug 2011 | WO |
WO 2011114149 | Sep 2011 | WO |
WO 2012040107 | Mar 2012 | WO |
WO 2012040386 | Mar 2012 | WO |
WO 2012154938 | Nov 2012 | WO |
WO 2013101438 | Jul 2013 | WO |
Entry |
---|
International Search Report and Written Opinion of PCT/US2011/052164 dated Jan. 17, 2012. |
International Preliminary Report on Patentability and Written Opinion, PCT/US2011/023337, dated Aug. 16, 2012, 8 pages. |
Notification of Transmittal of International Search Report and Written Opinion of PCT/US2012/037284 dated Oct. 1, 2012. |
Notification of Transmittal of International Search Report and Written Opinion of PCT/US2012/068686, dated Mar. 25, 2013, 11 pages. |
Notification of Transmittal of the International Search Report and Written Opinion for PCT/US2013/078051, “Lifeboard-Series Of Home Pages For Head Mounted Displays (HMD) That Respond To Head Tracking”, dated Apr. 22, 2014. |
European Search Report for EP 12782481.1 dated Sep. 29, 2014. |
Notification of Transmittal of the International Search Report and Written Opinion for PCT/US2013/041070 “Controlled Headset Computer Displays” dated Oct. 18, 2013. |
Notification of Transmittal of The International Search Report and Written Opinion for PCT/US/2013/041349 “Head-Worn Computer With Improved Virtual Display Function” dated Aug. 9, 2013. |
EP 12782481.1 Supplemental European Search Report, “Context Sensitive Overlays In Voice Controlled Headset Computer Displays,” dated Sep. 29, 2014. |
International Search Report and Written Opinion for PCT/US2013/065927 dated Mar. 21, 2014, entitled, “Improved Headset Computer Operation Using Vehicle Sensor Feedback for Remote Control Vehicle”. |
Morphew, M.E., et al., “Helmet Mounted Displays for Unmanned Aerial Vehicle Control”, Proceedings of SPIE, vol. 5442, Oct. 20, 2004. |
Notification Concerning Transmittal of International Preliminary Report on Patentability of PCT/US2012/037284, “Headset Computer That Uses Motion And Voices To Control Information Display And Remote Devices”, dated Nov. 21, 2013, 7 pages. |
International Preliminary Report on Patentability for PCT/US2013/041070 dated Jul. 16, 2015; entitled “Context Sensitive Overlays In Voice Controlled Headset Computer Displays”. |
International Preliminary Report on Patentability for PCT/US2011/052164 dated Apr. 4, 2013; entitled “Advanced Remote Control Of Host Application Using Motion And Voice Commands”. |
Number | Date | Country | |
---|---|---|---|
20130231937 A1 | Sep 2013 | US |
Number | Date | Country | |
---|---|---|---|
61749240 | Jan 2013 | US | |
61384586 | Sep 2010 | US | |
61653127 | May 2012 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13234916 | Sep 2011 | US |
Child | 13799790 | US | |
Parent | 13799888 | Mar 2013 | US |
Child | 13234916 | US |