Earpiece with app environment

Abstract
An earpiece includes an earpiece housing, a processor disposed within the earpiece housing, a memory operatively connected to the processor and disposed within the earpiece housing, and a plurality of software applications stored within the memory. The earpiece is configured to allow a user of the earpiece to select one of the plurality of software applications to run using the processor as a foreground application and allows for receiving user input into the foreground application.
Description
FIELD OF THE INVENTION

The present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to earpieces which provide an app environment.


BACKGROUND

Earpieces are generally special-purpose devices with little or no onboard intelligence. What is needed is an intelligent earpiece with enhanced functionality and a wide range of processing capabilities. However, given the size constraints on earpieces (including constraints on battery space), there are limits on processing ability. What is needed is an earpiece or set of earpieces which provides a wide range of processing capabilities within those constraints.


SUMMARY

Therefore, it is a primary object, feature, or advantage of the present invention to improve over the state of the art.


It is a further object, feature, or advantage of the present invention to provide an earpiece with an app environment.


It is a still further object, feature, or advantage of the present invention to provide an earpiece that allows a user to select an app to run in the foreground or the background.


Another object, feature, or advantage is to allow a user to determine which app of a plurality of different apps on an earpiece is to receive user input.


One or more of these and/or other objects, features, or advantages of the present invention will become apparent from the specification and claims that follow. No single embodiment need provide each and every object, feature, or advantage. Different embodiments may have different objects, features, or advantages. Therefore, the present invention is not to be limited to or by any objects, features, or advantages stated herein.


According to one aspect, an earpiece includes an earpiece housing, a processor disposed within the earpiece housing, a memory operatively connected to the processor and disposed within the earpiece housing, and a plurality of software applications stored within the memory. The earpiece is configured to allow a user of the earpiece to select one of the plurality of software applications to run using the processor as a foreground application and allows for receiving user input into the foreground application.


According to another aspect, a method for controlling an earpiece is provided. The method includes installing a plurality of different software applications within an earpiece, receiving a selection of one of the plurality of different software applications from a user through a user interface of the earpiece, and executing the one of the plurality of different software applications based on the selection in a foreground mode of operation for the earpiece. The method may further include receiving user input from the user of the earpiece through the user interface and receiving the user input into the one of the plurality of different software applications in the foreground mode of operation.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates one embodiment of a system where one or more earpieces are configured with an app environment.



FIG. 2 is a block diagram of an earpiece which provides an app environment.



FIG. 3 is another block diagram of an earpiece which provides an app environment.



FIG. 4 illustrates an environment with multiple apps where one of the apps is considered a foreground app and receives user input.



FIG. 5 illustrates a mobile device in communication with a set of earpieces.



FIG. 6 illustrates one example of a methodology.





DETAILED DESCRIPTION


FIG. 1 illustrates one embodiment of a system 10 which includes a left earpiece 12A and a right earpiece 12B. The left earpiece 12A has a left earpiece housing 14A and the right earpiece 12B has a right earpiece housing 14B. An external microphone 70A is shown at the left earpiece 12A and another external microphone 70B is shown at the right earpiece 12B. A plurality of software applications or apps, which may be stored on a memory of one or more of the earpieces, is also shown.


The earpiece 12A allows a user to place one or more apps 61A, 61B, 61C, 61D on the device within storage or other memory 60 of the device and provides a user interface to allow a user to select one of the apps to run in the foreground or background or to select as the active app to receive user input. This app environment provides a number of advantages. First, not all functionality needs to be built into the operating system for the earpiece 12A (or set of earpieces 12A, 12B). Instead, functionality may be delivered as an app to the earpiece 12A, and a user need only store or execute those apps on the earpiece 12A which they wish to use. Second, issues regarding limitations on processing ability and battery use are reduced when only those apps which a user wishes to use need be present on the device.
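For illustration only, the following is a minimal sketch in C of one way such an app environment could be organized, assuming a simple in-memory registry; the names earpiece_app, app_registry, and select_foreground are hypothetical and not taken from the embodiments:

    #include <string.h>

    #define MAX_APPS 8

    /* Hypothetical app record: each app 61A-61D is a named set of
       instructions with an entry point executed by the processor 30. */
    typedef struct {
        const char *name;
        void (*run)(void);   /* entry point */
        int running;         /* nonzero if scheduled (foreground or background) */
    } earpiece_app;

    /* Apps stored within the memory or storage unit 60. */
    typedef struct {
        earpiece_app apps[MAX_APPS];
        int count;
        int foreground;      /* index of the foreground app, -1 if none */
    } app_registry;

    /* Select an installed app by name to run as the foreground app;
       any other running apps remain in the background. */
    int select_foreground(app_registry *reg, const char *name)
    {
        for (int i = 0; i < reg->count; i++) {
            if (strcmp(reg->apps[i].name, name) == 0) {
                reg->apps[i].running = 1;
                reg->foreground = i;
                return 0;
            }
        }
        return -1;           /* app not present on the device */
    }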


The app environment allows a user to determine which app or apps are run in the background. Because the earpiece may include any number of different sensors and combinations of sensors and perform any number of different functions, there are also many instances where it may be useful for an app to be running in the background without requiring any attention from the user. For example, an app may simply be monitoring physiological sensors associated with the user, either storing the data for later use or analysis or monitoring to determine when measured physiological parameters meet or exceed some threshold of interest. Or the app may simply be monitoring environmental sensors and either storing the data for later use or analysis or monitoring to determine when measured environmental parameters meet or exceed some threshold of interest. Or the app may simply be communicating information with other wearable devices, mobile devices, or other types of computing devices. There are any number of different functions that an app may be performing related to collecting, processing, or communicating data. It is also to be understood that some apps may have very specific purposes.
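A minimal sketch of such a background task, assuming a periodic scheduler hook; read_heart_rate_bpm and on_threshold_exceeded are hypothetical names standing in for a physiological sensor read and an alert path:

    #include <stdio.h>

    /* Stand-in for sampling a physiological sensor 78. */
    static int read_heart_rate_bpm(void)
    {
        return 112;   /* stub value for the sketch */
    }

    static void on_threshold_exceeded(int bpm)
    {
        printf("alert: heart rate %d bpm exceeds threshold\n", bpm);
    }

    /* Called periodically while the app runs in the background; it needs
       no attention from the user unless a threshold of interest is met. */
    void background_monitor_tick(int threshold_bpm)
    {
        int bpm = read_heart_rate_bpm();
        /* store the sample for later use or analysis, then test the threshold */
        if (bpm >= threshold_bpm)
            on_threshold_exceeded(bpm);
    }

    int main(void)
    {
        background_monitor_tick(100);
        return 0;
    }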


In operation, the device may consider an application to be a foreground app or a background app. As used herein, the term “foreground” refers to an application or task that is currently being used by the user. Such an application or task may be an interrupt-driven or real-time process. As used herein, the term “background” refers to an application or task that is running but not currently being directly used by the user or otherwise considered by the user to be in the background. The device may determine which app or apps are running, as well as which app is to be run in the foreground, either automatically based on context or alternatively based on user input.
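Continuing the same assumptions, a sketch of how user input might be dispatched only to the foreground app while background apps continue to run; dispatch_input, app_entry, and the event type are illustrative:

    #include <stddef.h>

    typedef enum { INPUT_TAP, INPUT_SWIPE, INPUT_VOICE, INPUT_HEAD_MOVEMENT } input_kind;

    /* Hypothetical handler table: each app may register an input handler. */
    typedef struct {
        const char *name;
        void (*on_input)(input_kind kind);  /* NULL if the app takes no input */
    } app_entry;

    /* Only the foreground app receives user input; background apps keep
       executing (e.g., monitoring sensors) but are not dispatched to. */
    void dispatch_input(app_entry *apps, int count, int foreground, input_kind kind)
    {
        if (foreground >= 0 && foreground < count && apps[foreground].on_input != NULL)
            apps[foreground].on_input(kind);
    }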



FIG. 2 illustrates one example of a block diagram of an earpiece 12. The earpiece 12 includes an earpiece housing 14. Disposed within the earpiece housing 14 is a processor 30. The term “processor” as used herein means a single processor or more than one processor in operative communication. The processor 30 may include a digital signal processor, a microprocessor, a microcontroller, and/or other types of processors or combinations thereof. One or more internal microphones or bone microphones 71 may be operatively connected to the processor 30. One or more external microphones 70 may be operatively connected to the processor 30. One or more wireless transceivers 34 may be operatively connected to the processor. One or more speakers 73 may be operatively connected to the processor 30. A memory or storage unit 60 may be provided which is operatively connected to the processor 30. The memory or storage unit 60 may store one or more software applications or apps 61. Each app may include a set of software instructions for execution by the processor 30.



FIG. 3 illustrates another example of a block diagram of an earpiece. The earpiece includes a sensor array 32 which includes one or more sensors. Examples of the types of sensors 32 which may be present include an air microphone 70, a bone microphone 71, a first inertial sensor 74, a second inertial sensor 76, and a biometric or physiological sensor 78. Of course, any number of other sensors may be present. The sensors 32 are operatively connected to an intelligent control system or processor 30. A gesture control interface 36 is also operatively connected to the intelligent control system 30. The gesture control interface 36 may be infrared-based, ultrasound-based, or capacitive-sensor-based, or may use other technologies. The gesture control interface 36 may include one or more emitters 82 and one or more detectors 84. In operation, a user may make gestures which are detected by the gesture control interface 36. Examples of gestures may include single taps, double taps, tap-and-hold, swipes in various directions, or other types of gestures. One or more transceivers may also be operatively connected to the intelligent control system or processor 30. For example, a radio transceiver 34 may be used for Bluetooth or Bluetooth Low Energy (BLE) communications. The radio transceiver 34 may be used to communicate with other wearable devices, mobile devices such as phones or tablets, or other types of computing devices. The transceiver 35 may be a near field magnetic induction (NFMI) transceiver or other type of transceiver such as may be used to communicate with another earpiece. The earpiece may further include one or more lighting elements such as LEDs 20 and may also include one or more speakers 73. It is to be noted that the earpiece may provide a number of different ways to receive user input from a user. This may include receiving user input through the gesture control interface 36, receiving user input from one or more inertial sensors 74, 76 (such as receiving a head nod to indicate yes or side-to-side head movement to indicate no), receiving voice input from one or more microphones 70, 71, or otherwise receiving input from a user, such as indirectly through another device in operative communication with the earpiece via the radio transceiver 34 or transceiver 35.
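As one illustration of input from the inertial sensors 74, 76, the following sketch distinguishes a head nod (yes) from side-to-side movement (no) using gyroscope samples; the sample format, axis convention, and thresholds are assumptions made for this sketch only:

    #include <math.h>

    typedef struct { float pitch_dps; float yaw_dps; } gyro_sample;

    typedef enum { ANSWER_NONE, ANSWER_YES, ANSWER_NO } head_answer;

    /* Accumulate angular-rate energy over a window of samples and decide
       whether the dominant motion was a nod (pitch axis, yes) or a
       side-to-side shake (yaw axis, no). */
    head_answer classify_head_motion(const gyro_sample *s, int n, float min_dps)
    {
        float pitch = 0.0f, yaw = 0.0f;
        for (int i = 0; i < n; i++) {
            pitch += fabsf(s[i].pitch_dps);
            yaw   += fabsf(s[i].yaw_dps);
        }
        if (pitch < min_dps && yaw < min_dps)
            return ANSWER_NONE;              /* no deliberate motion */
        return (pitch > yaw) ? ANSWER_YES : ANSWER_NO;
    }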



FIG. 4 illustrates a plurality of apps stored in a machine-readable storage medium such as a memory where one of the apps is selected as a foreground app. The rest of the apps may remain in the background. The app which is in the foreground may receive user input from a user. A user may select which app is the foreground app in a number of different ways. For example, where a gestural interface is present, a gesture may be associated with app selection. In one embodiment, a user may be prompted with the name of an app and may then either select that app to make it the foreground app or wait for, or move on to, the name of the next app in a list or circular list of available apps. In order to improve the efficiency of the user interface, it is contemplated that the list of available apps may be ordered such that the app the user most likely wants to select is the first app presented. The apps may be ordered based on which app was most recently used, which app is most frequently used by the user, which app is most frequently used by the user at the same time of day or at the same location, or based on other variables and relationships which may predict a user's preference. Other methodologies may include ordering the list based on pattern analyses performed on usage history or other parameters.
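A minimal sketch of one such ordering, assuming each app carries a last-used timestamp and a lifetime launch counter; a fuller implementation might also weigh time of day, location, or pattern analysis of usage history:

    #include <stdlib.h>

    typedef struct {
        const char *name;
        unsigned long last_used;   /* e.g., seconds since boot */
        unsigned use_count;        /* lifetime launches */
    } app_usage;

    /* Most recently used first; break ties by most frequently used. */
    static int by_predicted_preference(const void *a, const void *b)
    {
        const app_usage *x = a, *y = b;
        if (x->last_used != y->last_used)
            return (y->last_used > x->last_used) ? 1 : -1;
        return (int)y->use_count - (int)x->use_count;
    }

    /* Order the circular list so the most likely selection is presented first. */
    void order_app_menu(app_usage *apps, size_t n)
    {
        qsort(apps, n, sizeof *apps, by_predicted_preference);
    }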


Instead of using a gestural interface, other types of user interfaces may be used. For example, a user may provide voice input to indicate that the user wants to select a particular app, such as “Bragi, run [name]”, “Bragi, run the [name] app”, “Bragi, app menu”, or other voice input. Of course, other types of user interfaces may be used. For example, where the earpiece is in communication with a mobile device, the user interface of the mobile device may be used to select an app to execute. Similarly, where the earpiece is in communication with a vehicle, entertainment device, or other computing device with a user interface, the user interface of that device may be used to select an app to execute on the earpiece. In addition, the user may schedule in advance when the various apps are executed. For example, FIG. 5 illustrates that a mobile device 100 may be in operative communication with a set of earpieces 10. A user may use the mobile device 100 or a program executing on the mobile device 100 to select which app to run on the earpiece system 10 or to download one or more applications to the earpieces for execution. It is contemplated that an app need only be stored and executed on one of the earpieces and that data may be communicated to and from the other earpiece.
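By way of illustration, a sketch of matching a recognized utterance of the form “Bragi, run [name]” to an app name; this assumes a speech recognizer has already produced text, and parse_run_command is a hypothetical helper:

    #include <stdio.h>
    #include <string.h>

    /* Returns a pointer to the app name within cmd, or NULL if the
       utterance is not a run request; wake-word handling and the
       recognizer itself are outside this sketch. */
    const char *parse_run_command(const char *cmd)
    {
        static const char prefix[] = "Bragi, run ";
        if (strncmp(cmd, prefix, sizeof prefix - 1) != 0)
            return NULL;
        return cmd + sizeof prefix - 1;
    }

    int main(void)
    {
        const char *name = parse_run_command("Bragi, run heart monitor");
        if (name)
            printf("selected app: %s\n", name);
        return 0;
    }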


Any number of different apps may be present on the earpiece. This may include apps for business, apps for productivity, apps for health and wellness, apps for entertainment, and other types of apps. Apps may include apps for storing or analyzing sensor input, apps for interacting with other devices, apps for playing media or media streams, apps for augmenting environmental audio signals or other audio, or other types of apps.


It is also to be understood that one or more apps may be added to the earpiece by a user. This may occur by downloading the app to the earpiece either through direct connection or wirelessly. Thus, a user may decide which apps are present on their device and include only those apps of interest.



FIG. 6 illustrates one example of a methodology. In step 200, a plurality of different software applications are installed within a memory of an earpiece. These software applications may come pre-installed or loaded by a manufacturer or distributor and/or may include one or more software applications installed or loaded by a user or other entity. In step 202, a selection of one of the plurality of different applications is received from a user through a user interface of the earpiece. In step 204, the selected application may be executed in a foreground mode of operation for the earpiece. When the application is executed in the foreground mode of operation, the application may receive user input through the user interface.


Therefore, earpieces with an app environment have been shown and described. The present invention contemplates numerous variations in the apparatus, systems, and methodologies shown and described and it is not to be limited to the specific embodiments provided herein.

Claims
  • 1. An earpiece comprising: an earpiece housing; a processor disposed within the earpiece housing; a memory operatively connected to the processor and disposed within the earpiece housing; and a plurality of software applications stored within the memory; wherein the earpiece is configured to allow a user of the earpiece to select one of the plurality of software applications to run using the processor as a foreground application of the earpiece; and wherein the earpiece is further configured to receive user input from the user in the form of movement of the user detected with an inertial sensor operatively connected to the processor into the foreground application.
  • 2. The earpiece of claim 1 further comprising a microphone and wherein the user input comprises voice input received from the user using the microphone.
  • 3. The earpiece of claim 1 wherein the earpiece is further configured to download an additional software application to the memory using a wireless transceiver disposed within the earpiece housing.
  • 4. A method for controlling an earpiece comprising: installing a plurality of different software applications within an earpiece; receiving a selection of one of the plurality of different software applications from a user through a user interface of the earpiece; executing the one of the plurality of different software applications based on the selection in a foreground mode of operation for the earpiece; and receiving user input from the user of the earpiece through the user interface into the one of the plurality of different software applications in the foreground mode of operation, wherein the user input comprises movement of the user received at an inertial sensor of the earpiece.
  • 5. The method of claim 4 wherein the user input further comprises voice input received at a microphone of the earpiece.
  • 6. The method of claim 4 wherein the user input further comprises a gesture received at a gestural control interface of the earpiece.
  • 7. The earpiece of claim 1 wherein the earpiece is further configured to allow the user to select one of the plurality of software applications to run as a background application using the processor.
  • 8. The earpiece of claim 7 wherein the background application run by the processor is used for monitoring physiological parameters associated with the user.
  • 9. The method of claim 4 further comprising presenting the plurality of software applications through the user interface.
  • 10. The method of claim 9 wherein the plurality of software applications comprises business applications, productivity applications, health applications, entertainment applications, sensor analysis applications, streaming media applications, and audio amplification applications.
  • 11. The method of claim 9 wherein the plurality of software applications presented by the user interface are ordered by recent usage.
  • 12. The method of claim 11 wherein the plurality of software applications presented by the user interface are ordered based on a pattern analysis.
PRIORITY STATEMENT

This application claims priority to U.S. Provisional Patent Application No. 62/359,542, filed Jul. 7, 2016, and entitled “Earpiece with App Environment”, hereby incorporated by reference in its entirety.

Related Publications (1)
Number Date Country
20180014108 A1 Jan 2018 US
Provisional Applications (1)
Number Date Country
62359542 Jul 2016 US