Mobile computing devices, such as notebook personal computers (PCs), Smartphones, and tablet computing devices, are now common tools used for producing, analyzing, communicating, and consuming data in both business and personal life. Consumers continue to embrace a mobile digital lifestyle as the ease of access to digital information increases with high-speed wireless communications technologies becoming ubiquitous. Popular uses of mobile computing devices include displaying large amounts of high-resolution computer graphics information and video content, often wirelessly streamed to the device. While these devices typically include a display screen, the preferred visual experience of a high-resolution, large-format display cannot be easily replicated in such mobile devices because the physical size of such devices is limited to promote mobility. Another drawback of the aforementioned device types is that the user interface is hands-dependent, typically requiring a user to enter data or make selections using a keyboard (physical or virtual) or touch-screen display. As a result, consumers are now seeking a hands-free, high-quality, portable, color display solution to augment or replace their hands-dependent mobile devices.
Developers of software applications (or “apps”) have attempted to capitalize on the increased popularity of mobile computing devices by developing a single app that can be used across multiple devices and platforms, for example Smartphones, tablets, and PCs. Developing applications that can be used across multiple devices and platforms is economical for developers because it maximizes the return on their work product by enabling the app to be sold to as many consumers as possible. While some challenges exist in developing apps across multiple devices and platforms, currently all such devices (e.g., Smartphones, tablets, and PCs) use hands-dependent user interfaces, such as touchscreens and/or keyboards (physical or virtual) and/or pointing devices.
The present application relates to human/computer interfaces. More particularly, the present invention relates to a mobile wireless wearable headset computing device that employs a hands-free user interface, operating by voice commands and tracked head movement, and that, when docked at a docking station, operates using a user interface typically associated with a personal computer (PC), for example an external full-screen monitor for graphical output and a keyboard and mouse as input devices. Further, the headset computing device makes available, when in a docked mode, a different set of applications or application features more suited to keyboard and mouse operation than when used in a headset mode as a stand-alone headset computer. A common data set stored in the headset memory supports both sets of applications (the docked and undocked modes of operation of the headset).
Recently developed micro-displays can provide large-format, high-resolution color pictures and streaming video in a very small form factor. One application for such displays is integration with a wireless headset computer worn on the head of the user, with the display positioned within the field of view of the user, similar in format to eyeglasses, an audio headset, or video eyewear. A “wireless computing headset” device includes one or more small, high-resolution micro-displays and optics to magnify the image. The micro-displays can provide super video graphics array (SVGA) (800×600) resolution, quarter high-definition graphics array (qHD) (960×540), extended graphics array (XGA) (1024×768), or even higher resolutions. A wearable computer can use a qHD micro-display to provide a virtual 15-inch laptop-sized display. A wireless computing headset contains one or more wireless computing and communication interfaces, enabling data and streaming video capability, and provides greater convenience and mobility than hands-dependent devices.
Examples of a mobile wireless wearable headset computing device are Golden-i® Headsets available from Kopin Corporation of Taunton, Mass. For more information concerning such devices, see co-pending U.S. application Ser. No. 12/348,646 entitled “Mobile Wireless Display Software Platform for Controlling Other Systems and Devices,” by Parkinson et al., filed Jan. 5, 2009, PCT International Application No. PCT/US09/38601 entitled “Handheld Wireless Display Devices Having High Resolution Display Suitable For Use as a Mobile Internet Device,” by Jacobsen et al., filed Mar. 27, 2009, and U.S. Application No. 61/638,419 entitled “Improved Headset Computer,” by Jacobsen et al., filed Apr. 25, 2012, each of which is incorporated herein by reference in its entirety.
An example method of operating a headset computer includes providing a docking station for a headset computer, executing a hands-free first version of a subject application on the headset computer, and executing a different version of the subject application on the headset computer when the headset computer is operatively coupled to the docking station, wherein the hands-free version and the different version utilize a common data set stored in a memory of the headset computer.
Another example method of operating a headset computing device includes determining whether the headset computing device is in a headset state or a docked state communicatively coupled to a docking station, operating the headset computing device in a headset mode or a docked mode based on the determined state, executing an application on the headset computing device, enabling and disabling application features according to the headset mode or the docked mode, wherein the headset mode enables a hands-free user interface and disables a hands-dependent user interface, and the docked mode enables the hands-dependent user interface and disables the hands-free user interface, and accessing a common application data set stored in a memory module of the headset computer according to the executed application.
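By way of illustration only, the mode-determination steps of such a method might be sketched as follows in Python; the `Mode` enum, the function names, and the `app` and `data_store` objects are hypothetical placeholders introduced for this sketch and are not part of the described embodiments.

```python
from enum import Enum, auto

class Mode(Enum):
    HEADSET = auto()
    DOCKED = auto()

def determine_mode(dock_detected: bool) -> Mode:
    """Determine headset vs. docked state from a dock-detection input."""
    return Mode.DOCKED if dock_detected else Mode.HEADSET

def operate(app, mode: Mode, data_store) -> None:
    """Enable/disable application features per mode; both modes share one data set."""
    if mode is Mode.HEADSET:
        app.enable_hands_free_ui()        # speech recognition + head-tracking interface
        app.disable_hands_dependent_ui()
    else:
        app.enable_hands_dependent_ui()   # keyboard + pointing device interface
        app.disable_hands_free_ui()
    app.open(data_store)                  # common application data set in either mode
```

In this sketch the same `data_store` object is handed to the application regardless of mode, mirroring the common data set described above.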
Enabling the hands-free user interface can further include using automatic speech recognition (ASR) and head tracking (HT) user inputs to interface with the application according to the headset mode. The enabling and disabling of application features can further include enabling read-only and disabling write user permissions according to the headset mode. The hands-free user interface can further include rendering a headset version of a graphical user interface compatible with automatic speech recognition and head-tracking inputs through a micro-display of the headset computing device according to the headset mode.
The hands-dependent user interface, according to the docked mode, can further include using keyboard and pointing device user inputs to interface with the application. The enabling and disabling of application features can further include enabling write and disabling read-only user permissions. The hands-dependent user interface can further include rendering a graphical user interface through a monitor communicatively coupled to the docking station.
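For illustration only, such permission and rendering toggles might be sketched as follows; the attribute names and values are assumptions made for the sketch rather than part of the described embodiments.

```python
def apply_mode_features(app, docked: bool) -> None:
    """Enable write access only when docked; render the GUI on the matching display."""
    if docked:
        app.permissions = {"read": True, "write": True}     # write enabled, read-only disabled
        app.render_target = "external_monitor"              # keyboard/mouse GUI via the docking station
    else:
        app.permissions = {"read": True, "write": False}     # read-only enabled, write disabled
        app.render_target = "micro_display"                  # ASR/head-tracking GUI on the headset
```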
Example methods of operating the headset computing device can further include executing the application on the headset computing device wherein the application is a word processing application, a spreadsheet application, a presentation application, or an Internet browser application on the headset computing device. Example methods of operating the headset computing device in the docked mode can further include recharging a rechargeable battery of the headset computing device. The rechargeable battery can power operations of the headset computing device in the headset mode.
Further example embodiments include a dual-mode headset computing device, including a processor communicatively coupled to a micro-display and a memory module, operating in a headset mode or docked mode, a common data set stored in the memory module, a docking station, including a docking port enabling operational coupling to the headset computing device in a docked state, the docked mode being based on a determination of the docked state, an application executed by the processor including application features being enabled or disabled according to the headset mode or the docked mode, wherein the headset mode enables a hands-free user interface and disables a hands-dependent user interface, and the docked mode enables the hands-dependent user interface and disables the hands-free user interface, and the common data set being accessed according to the application.
Example embodiments of the hands-free user interface can further include an automatic speech recognition module and a head-tracking module for receiving user input to interface with the application, during operation in the headset mode. The application features can further include an enabled read-only user permission and a disabled write user permission in the headset mode. The micro-display can render a headset version of a graphical user interface in the headset mode.
The hands-dependent user interface can further include a keyboard and a pointing device for receiving user inputs to interface with the application. Application features can further include an enabled write user permission and a disabled read-only user permission in the docked mode. Further, the hands-dependent user interface can include a monitor, communicatively coupled to the docking station, for rendering a graphical user interface compatible with the keyboard and pointing device. The application can be a word processing, spreadsheet, presentation, or an Internet browsing application.
Example embodiments of the headset computing device can further include a rechargeable battery for supplying power to the headset computing device while operating in the headset mode and recharging the battery while in the docked mode.
Example embodiments can further include a non-transitory computer program product for operating a headset computing device, the computer program product comprising a computer readable medium having computer readable instructions stored thereon, which, when loaded and executed by a processor, cause the processor to determine whether the headset computing device is in a headset state or a docked state communicatively coupled to a docking station, operate the headset computing device in a headset mode or a docked mode based on the determined state, execute an application on the headset computing device, enable and disable application features according to the headset mode or the docked mode, wherein the headset mode enables a hands-free user interface and disables a hands-dependent user interface, and the docked mode enables the hands-dependent user interface and disables the hands-free user interface, and access a common application data set stored in a memory module of the headset computer according to the executed application.
The foregoing will be apparent from the following more particular description of example embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments of the present invention.
A description of example embodiments of the invention follows.
An example embodiment, according to principles of the present invention, includes a headset computing device, also referred to herein as a headset computer (HSC), having a port for docking. When worn on a user's head, the HSC can operate in a “headset” mode and behave as a hands-free computing device. When docked with a docking station, the HSC can operate in a “docked” mode and behave as a typical personal computer (PC) that uses a full-screen monitor as a display output and a keyboard and/or mouse as input devices.
Operating as a hands-free device in the headset mode, the HSC can use automatic speech recognition and head-tracking features to recognize verbal and head-motion commands. In the headset mode, the HSC presents to the user a specified set of hands-free applications or application features that use the micro-display, automatic speech recognition, and head-tracking features.
Operating as a typical PC in the docked mode, the HSC can use a traditional hands-dependent interface, including a graphical user interface (GUI) based on the user inputting commands using a keyboard and/or mouse (or other pointing device). In the docked mode, the HSC presents to the user a specified set of hands-dependent applications or application features that use the keyboard and/or mouse interface. The docking port integrated with the HSC enables the HSC to dock with a docking station. While docked, the docking station enables: recharging of the HSC battery; video output from the HSC to a conventional PC monitor; audio output from the HSC to standard audio speakers; audio input from a microphone; keyboard and mouse operation; and Internet connectivity, for example via Ethernet.
The applications, whether hands-free applications or hands-dependent applications (e.g., traditional PC applications) or application features, can share the same application data, since the data can be stored in the HSC's memory, such as a hard disk. Applications can include PC applications, such as Word, Excel, Internet Explorer, etc., all of which are operated in a traditional manner using a mouse and keyboard. While the HSC operates in either mode, i.e., headset mode or docked mode, the application can access the same data files so that both modes share the same data files. For example, the user can create new documents in docked mode (also referred to herein as “desktop” mode or PC mode), which can be viewed thereafter in headset mode (also referred to herein as “hands-free” mode). In another example, the hands-free user can take snapshot photographs via a camera operatively connected (preferably integrated) to the HSC, which can then be viewed and edited in desktop mode. Therefore, the HSC can operate as a dual-personality device that serves both an “Office User” and a “Mobile Worker”. The term “Application Feature”, as used herein, can refer to a number of computer-software-controlled elements including: GUI features, such as presenting available verbal and head-motion commands; file operations, including create, open, write, read, etc.; or any combination thereof.
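For illustration, the shared-data arrangement might be sketched as follows; the storage path, file names, and class are assumptions made for this sketch rather than the actual HSC storage implementation.

```python
class SharedDataSet:
    """A single data set held on the HSC's own storage, reachable from either mode."""
    def __init__(self, root: str = "/hsc/data"):   # hypothetical mount point on the headset's disk
        self.root = root

    def save(self, name: str, payload: bytes) -> None:
        with open(f"{self.root}/{name}", "wb") as f:
            f.write(payload)

    def load(self, name: str) -> bytes:
        with open(f"{self.root}/{name}", "rb") as f:
            return f.read()

# Docked ("desktop") mode: a document created with keyboard and mouse is saved here.
#     store.save("report.docx", document_bytes)
# Headset ("hands-free") mode: the same file is later loaded for hands-free viewing,
# and a snapshot taken by the headset camera can be saved here for later desktop editing.
```

Because both modes read and write the same files, a document created at the desk is immediately available for hands-free viewing, and vice versa.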
The automatic speech recognition feature of the HSC, which enables control of the device using voice commands, may be a useful feature. The automatic speech recognition feature can be enabled, for example, by using a speech recognition product, such as those available from Nuance Communications, Inc., 1 Wayside Road, Burlington, Mass. 01803. In addition, the head-tracking feature can be enabled using, for example, a six-axis or nine-axis sensor module tracker available from Hillcrest Laboratories, Inc., 15245 Shady Grove Road, Suite 400, Rockville, Md. 20850.
The operating systems and/or device drivers used for the HSC can be modified to take into account whether the HSC is in the docking station or operating away from the docking station (e.g., on a user's head). Selection of the operating mode can be made automatically, such as by using an input that detects when the HSC is docked.
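For illustration, one way the operating system or device drivers might respond to such a dock-detection input is sketched below; the driver names and the `drivers` manager object are assumptions made only for this sketch.

```python
HEADSET_DRIVERS = ("asr", "head_tracker", "micro_display")
DOCKED_DRIVERS = ("keyboard", "pointing_device", "external_display", "battery_charger")

def on_dock_state_change(docked: bool, drivers) -> None:
    """Swap between the headset-mode and docked-mode driver sets automatically."""
    if docked:
        for name in HEADSET_DRIVERS:
            drivers.unload(name)
        for name in DOCKED_DRIVERS:
            drivers.load(name)
    else:
        for name in DOCKED_DRIVERS:
            drivers.unload(name)
        for name in HEADSET_DRIVERS:
            drivers.load(name)
```

Keying the swap to the dock-detection input means no user action is needed to change operating modes.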
As will be understood, the operation, control, and visual displays generated by the hands-free applications are different from the corresponding elements of a regular desktop PC application. The visual presentation of the information on the micro-display and on the desktop display preferably differ.
In one embodiment the HSC may take the form of the HSC described in a co-pending U.S. patent application Ser. No. 13/018,999, entitled “Wireless Hands-Free Computing Headset With Detachable Accessories Controllable By Motion, Body Gesture And/Or Vocal Commands” by Jacobsen et al., filed Feb. 1, 2011, which is hereby incorporated by reference in its entirety.
Example embodiments of the HSC 100 can receive user input through sensing voice commands, head movements 110, 111, 112, and hand gestures 113, or any combination thereof, as illustrated in
A head worn frame 1000 and articulated supports 1002 are generally configured so that a user can wear the headset computer device 100 on the user's head. Housings 1004 are generally low profile units which house the electronics, such as the microprocessor, memory or other storage device, low power wireless communications device(s), along with other associated circuitry. Speakers 1006 provide audio output to the user so that the user can hear information, such as the audio portion of a multimedia presentation, or audio prompt, alert, or feedback signaling recognition of a user command.
A micro-display subassembly 1010 is used to render visual information, such as images and video, to the user. The micro-display 1010 is coupled to the arm 1008. The arm 1008 generally provides physical support such that the micro-display subassembly 1010 is able to be positioned within the user's field of view, preferably in front of the eye of the user, or within the user's peripheral vision, preferably slightly below or above the eye. The arm 1008 also provides the electrical or optical connections between the micro-display subassembly 1010 and the control circuitry housed within the housing unit 1004.
According to aspects that will be explained in more detail below, the HSC 100 with micro-display 1010 can enable an end-user to select a field of view 300 (
While the example embodiments of an HSC 100 shown in
A digital processor 1030 is operatively coupled to the micro-display 1010, memory module 1040, battery 1016, audio speakers 1006, and user input devices, including microphone(s) 1014 and motion sensors 1018. The processor 1030 uses an automatic speech recognition module 1032 and a head tracking module 1034 to convert signals received from the hands-free user interface sensors into control commands for executing application 1036. The application 1036 can access a data set 1042 stored in the memory module 1040. While operating in a headset mode, the processor 1030 outputs the user interface to the micro-display 1010 and audio speakers 1006. A battery 1016 powers the HSC 100 while in the headset mode. (It should be recognized by those of skill in the art that the memory 1040, while preferably located in the headset computing device 100, does not have to be located in the HSC 100 but merely accessible to the HSC 100, e.g., a cloud-based memory module.)
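For illustration, one hands-free input cycle handled by the processor 1030 might be sketched as follows; the method names on the `asr_module`, `head_tracker`, and `app` objects are assumptions for this sketch, not actual module interfaces.

```python
def hands_free_input_cycle(asr_module, head_tracker, app) -> None:
    """One headset-mode input cycle: speech and head motion become application commands."""
    phrase = asr_module.recognize()          # recognized utterance, or None if nothing was said
    if phrase is not None:
        app.handle_voice_command(phrase)     # spoken command dispatched as an application control command
    dx, dy = head_tracker.read_motion()      # angular deltas reported by the motion sensors
    app.pan_view(dx, dy)                     # head motion steers the view or selection
```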
A docking station unit 2050 (also referred to herein as a docking station) includes a docking port 2052. The docking port 2052 communicatively couples to the HSC 100, enabling a communications and power link between the docking station 2050 and the HSC 100. The docking station 2050 is further communicatively coupled to a display (monitor) 2010, for example a conventional computer monitor, audio speakers 2006, a keyboard 2062, a pointing device 2064, for example a mouse, and optionally a microphone 2066. The display monitor 2010 displays graphical user information to a user when the headset is operating in the docked mode. Further, while in the docked mode, the keyboard 2062 and pointing device 2064 are used to capture user input at the docking station 2050 and communicate such input to the processor 1030 so that the application 1036 can be controlled. The microphone 2066 can optionally be used as a speech input for the headset computing device 100 while in the docked mode and can be processed by the automatic speech recognition module 1032 in a manner similar to the speech control processes in the headset mode.
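For illustration, the docked-mode input/output exchange across the docking port 2052 might be sketched as follows; the `dock_port` and `processor` method names are assumptions made only for this sketch.

```python
def docked_io_cycle(dock_port, processor) -> None:
    """One docked-mode I/O cycle: dock input in, video and audio for the dock peripherals out."""
    for event in dock_port.read_input_events():      # keyboard 2062, pointing device 2064, microphone 2066
        processor.handle_input(event)
    dock_port.write_video(processor.render_gui())    # full-screen GUI destined for monitor 2010
    dock_port.write_audio(processor.render_audio())  # audio destined for speakers 2006
```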
The “My Life” display screens 301a-2 of the HSC 100 operating in the headset mode can provide the user visual information such as time, date, and calendar scheduling information, as well as notifications and the available voice commands associated with such features. For example, the voice command “open item 2” can open a voicemail and/or visual voicemail message.
The “My Social” display screens 301a-3 of the HSC 100 operating in the headset mode can provide the user with the latest news, social network updates (for example, Tweets®) and other need-to-know information. (Tweet is a registered trademark of Twitter, Inc. of 1355 Market Street, Suite 900, San Francisco, Calif. 94103.)
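For illustration, voice commands such as the “open item 2” command mentioned above might map to display-screen actions roughly as sketched below; the command strings and handler names are assumptions made only for this sketch.

```python
SCREEN_COMMANDS = {
    "open item 1": lambda screen: screen.open_notification(1),
    "open item 2": lambda screen: screen.open_notification(2),  # e.g., a voicemail or visual voicemail
    "my life": lambda screen: screen.show("my_life"),
    "my social": lambda screen: screen.show("my_social"),
}

def handle_screen_command(phrase: str, screen) -> None:
    """Dispatch a recognized phrase to the matching display-screen action, if any."""
    action = SCREEN_COMMANDS.get(phrase.lower())
    if action is not None:
        action(screen)
```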
A docked mode software stack 4200 is used during operation of the HSC 100 in a docked mode, and includes a rechargeable battery driver module 4201; a touch device driver 4203, which can control input from a user through a touch device such as a touch screen or other capacitive touch input device such as a trackpad; a pointing device driver 4205, for example for an optical mouse; a keyboard driver 4207; and a docked mode display driver 4209, which renders the graphical user interface, typically through a monitor 2010.
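For illustration, loading the docked mode software stack 4200 might be sketched as follows; the driver identifiers and the `driver_manager` object are assumptions made only for this sketch.

```python
DOCKED_MODE_STACK = (
    "rechargeable_battery_driver",   # 4201
    "touch_device_driver",           # 4203: touch screen / trackpad input
    "pointing_device_driver",        # 4205: e.g., an optical mouse
    "keyboard_driver",               # 4207
    "docked_mode_display_driver",    # 4209: GUI output to monitor 2010
)

def load_docked_mode_stack(driver_manager) -> None:
    """Bring up the docked-mode software stack in order (hypothetical driver-manager API)."""
    for name in DOCKED_MODE_STACK:
        driver_manager.load(name)
```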
Further example embodiments of the present invention may be configured using a computer program product; for example, controls may be programmed in software for implementing example embodiments of the present invention. Further example embodiments of the present invention may include a non-transitory computer readable medium containing instructions that may be executed by a processor, and, when executed, cause the processor to complete methods described herein. It should be understood that elements of the block and flow diagrams described herein may be implemented in software, hardware, firmware, or other similar implementation determined in the future. In addition, the elements of the block and flow diagrams described herein may be combined or divided in any manner in software, hardware, or firmware. If implemented in software, the software may be written in any language that can support the example embodiments disclosed herein. The software may be stored in any form of computer readable medium, such as random access memory (RAM), read only memory (ROM), compact disk read-only memory (CD-ROM), and so forth. In operation, a general purpose or application specific processor loads and executes the software in a manner well understood in the art. It should be understood further that the block and flow diagrams may include more or fewer elements, be arranged or oriented differently, or be represented differently. It should be understood that implementation may dictate the block, flow, and/or network diagrams and the number of block, flow, and network diagrams illustrating the execution of embodiments of the invention.
The teachings of all patents, published applications and references cited herein are incorporated by reference in their entirety.
While this invention has been particularly shown and described with references to example embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.
This application claims the benefit of U.S. Provisional Application No. 61/653,471, filed on May 31, 2012. The entire teachings of the above application are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
5948047 | Jenkins | Sep 1999 | A |
5990793 | Bieback | Nov 1999 | A |
6010216 | Jesiek | Jan 2000 | A |
6108197 | Janik | Aug 2000 | A |
6204974 | Spitzer | Mar 2001 | B1 |
6325507 | Jannard et al. | Dec 2001 | B1 |
6798391 | Peterson, III | Sep 2004 | B2 |
6853293 | Swartz et al. | Feb 2005 | B2 |
6900777 | Hebert et al. | May 2005 | B1 |
6922184 | Lawrence et al. | Jul 2005 | B2 |
6956614 | Quintana et al. | Oct 2005 | B1 |
6966647 | Jannard et al. | Nov 2005 | B2 |
7004582 | Jannard et al. | Feb 2006 | B2 |
7013009 | Warren | Mar 2006 | B2 |
7054965 | Bell et al. | May 2006 | B2 |
7082393 | Lahr | Jul 2006 | B2 |
7147324 | Jannard et al. | Dec 2006 | B2 |
7150526 | Jannard et al. | Dec 2006 | B2 |
7213917 | Jannard et al. | May 2007 | B2 |
7216973 | Jannard et al. | May 2007 | B2 |
7219994 | Jannard et al. | May 2007 | B2 |
7231038 | Warren | Jun 2007 | B2 |
7249846 | Grand et al. | Jul 2007 | B2 |
7278734 | Jannard et al. | Oct 2007 | B2 |
7331666 | Swab et al. | Feb 2008 | B2 |
7445332 | Jannard et al. | Nov 2008 | B2 |
7452073 | Jannard et al. | Nov 2008 | B2 |
7461936 | Jannard | Dec 2008 | B2 |
7494216 | Jannard et al. | Feb 2009 | B2 |
7512414 | Jannard et al. | Mar 2009 | B2 |
7574239 | Bjerrum-Niese | Aug 2009 | B2 |
7620432 | Willins et al. | Nov 2009 | B2 |
7682018 | Jannard | Mar 2010 | B2 |
7740353 | Jannard | Jun 2010 | B2 |
7744213 | Jannard et al. | Jun 2010 | B2 |
7753520 | Fuziak, Jr. | Jul 2010 | B2 |
7760898 | Howell et al. | Jul 2010 | B2 |
7798638 | Fuziak, Jr. | Sep 2010 | B2 |
7806525 | Howell et al. | Oct 2010 | B2 |
7918556 | Lewis | Apr 2011 | B2 |
7959084 | Wulff | Jun 2011 | B2 |
7966189 | Le et al. | Jun 2011 | B2 |
7967433 | Jannard et al. | Jun 2011 | B2 |
7969383 | Eberl et al. | Jun 2011 | B2 |
7969657 | Cakmakci et al. | Jun 2011 | B2 |
7976480 | Grajales et al. | Jul 2011 | B2 |
7988283 | Jannard | Aug 2011 | B2 |
7997723 | Pienimaa et al. | Aug 2011 | B2 |
8010156 | Warren | Aug 2011 | B2 |
8020989 | Jannard et al. | Sep 2011 | B2 |
8025398 | Jannard | Sep 2011 | B2 |
8072393 | Riechel | Dec 2011 | B2 |
8092011 | Sugihara et al. | Jan 2012 | B2 |
8098439 | Amitai et al. | Jan 2012 | B2 |
8123352 | Matsumoto et al. | Feb 2012 | B2 |
8140197 | Lapidot et al. | Mar 2012 | B2 |
8212859 | Tang et al. | Jul 2012 | B2 |
8814691 | Haddick et al. | Aug 2014 | B2 |
20020015008 | Kishida et al. | Feb 2002 | A1 |
20020094845 | Inasaka | Jul 2002 | A1 |
20030068057 | Miller et al. | Apr 2003 | A1 |
20050264527 | Lin | Dec 2005 | A1 |
20060132382 | Jannard | Jun 2006 | A1 |
20080198324 | Fuziak | Aug 2008 | A1 |
20080208593 | Ativanichayaphong | Aug 2008 | A1 |
20080304688 | Kumar | Dec 2008 | A1 |
20090128448 | Riechel | May 2009 | A1 |
20090154719 | Wulff et al. | Jun 2009 | A1 |
20090180195 | Cakmakci et al. | Jul 2009 | A1 |
20100020229 | Hershey et al. | Jan 2010 | A1 |
20100033830 | Yung | Feb 2010 | A1 |
20100053069 | Tricoukes et al. | Mar 2010 | A1 |
20100064228 | Tsern | Mar 2010 | A1 |
20100121480 | Stelzer et al. | May 2010 | A1 |
20100171680 | Lapidot et al. | Jul 2010 | A1 |
20100238184 | Janicki | Sep 2010 | A1 |
20100245585 | Fisher | Sep 2010 | A1 |
20100250817 | Collopy et al. | Sep 2010 | A1 |
20100271587 | Pavlopoulos | Oct 2010 | A1 |
20100277563 | Gupta et al. | Nov 2010 | A1 |
20100289817 | Meier et al. | Nov 2010 | A1 |
20110001699 | Jacobsen et al. | Jan 2011 | A1 |
20110089207 | Tricoukes et al. | Apr 2011 | A1 |
20110090135 | Tricoukes et al. | Apr 2011 | A1 |
20110162035 | King | Jun 2011 | A1 |
20110214082 | Osterhout et al. | Sep 2011 | A1 |
20110221669 | Shams et al. | Sep 2011 | A1 |
20110221671 | King, III et al. | Sep 2011 | A1 |
20110227812 | Haddick et al. | Sep 2011 | A1 |
20110227813 | Haddick et al. | Sep 2011 | A1 |
20110254698 | Eberl et al. | Oct 2011 | A1 |
20110255050 | Jannard et al. | Oct 2011 | A1 |
20110273662 | Hwang et al. | Nov 2011 | A1 |
20120013843 | Jannard | Jan 2012 | A1 |
20120026071 | Hamdani et al. | Feb 2012 | A1 |
20120056846 | Zaliva | Mar 2012 | A1 |
20120062445 | Haddick et al. | Mar 2012 | A1 |
20120068914 | Jacobsen | Mar 2012 | A1 |
20120105740 | Jannard et al. | May 2012 | A1 |
20120114131 | Tricoukes et al. | May 2012 | A1 |
20120188245 | Hyatt | Jul 2012 | A1 |
Number | Date | Country |
---|---|---|
WO 9512408 | Aug 1995 | WO |
WO 9523994 | Sep 1995 | WO |
WO 0079327 | Dec 2000 | WO |
WO 2009076016 | Jun 2009 | WO |
WO 2011051660 | May 2011 | WO |
WO 2012040386 | Mar 2012 | WO |
WO 2013180964 | Dec 2013 | WO |
Entry |
---|
Notification of Transmittal of the International Search Report and the Written Opinion of the International Search Authority, or the Declaration for PCT/US2013/041344, “Headset Computer (HSC) With Docking Station and Dual Personality”, dated Aug. 19, 2013. |
International Preliminary Report on Patentability dated Dec. 2, 2014 for PCT/US2013/041344. |
Number | Date | Country
---|---|---
20130326208 A1 | Dec 2013 | US |
Number | Date | Country
---|---|---
61653471 | May 2012 | US |