The present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to wearable devices that may be controlled by facial gestures.
Gesture-based control systems have their limitations. Precise spatial location is essential for proper determination of a gestural command: if the measured segment of the body is not in an optimal location, errors may occur. Given these issues, what is needed are improved methods, apparatus, and systems for wireless control systems based on gestures.
According to one aspect, electromyogram (EMG) technology is used to measure the electrical activity of a user's facial muscles. Most people are able to control their facial muscles to such a degree as to permit monitoring by an electronic sensor in order to control a wearable device. In addition, the electrical activity of the muscles of the head and neck region may also be measured to provide additional levels of control to the wearable device. Data collected may be used to provide improved gesture control. Data collected may be combined with data from inertial sensors.
Therefore, it is a primary object, feature, or advantage to improve over the state of the art.
It is a further object, feature, or advantage to assist paralyzed individuals in participating in the activities of life through recognition of control patterns based on facial EMG presets.
It is a still further object, feature, or advantage to allow a user to select EMG control settings in lieu of, or in addition to, other control inputs, including gesture-based controls via accelerometer macros.
Another object, feature, or advantage is to provide greater precision in fine-tuning the control functions of a device.
Yet another object, feature, or advantage is the transmission of EMG functional data to receptor devices. This allows the receptor device or devices to better respond to the inputs/commands of the user.
A further object, feature, or advantage is to provide bio-medical monitoring of the user through the use of sensor array systems.
A still further object, feature, or advantage is to augment accelerometer based solutions for control of macros.
One or more of these and/or other objects, features, or advantages of the present invention will become apparent from the specification and claims that follow. No single embodiment need provide each and every object, feature, or advantage. Different embodiments may have different objects, features, or advantages. Therefore, the present invention is not to be limited to or by any object, features or advantage stated herein.
According to one aspect, a system includes at least one wearable device, wherein each wearable device includes a processor, a gesture control interface operatively connected to the processor, at least one sensor configured to detect electrical activity from a user's facial muscles, the at least one sensor operatively connected to the processor, and wherein the processor is configured to interpret the electrical activity from the user's facial muscles as a first command to perform an action according to a pre-determined user setting. The at least one sensor may be an electromyogram (EMG) sensor. Each wearable device may further include a transceiver operatively connected to the processor. The at least one wearable device may include a set of earpieces comprising a first earpiece and a second earpiece. The first earpiece may further include at least one microphone operatively connected to the processor and at least one speaker operatively connected to the processor. The first earpiece may be further configured not to interpret the electrical activity from the user's facial muscles as a command if the user is talking, as determined using the at least one microphone. The first earpiece may further include an inertial sensor operatively connected to the processor, and the processor may be configured to interpret the electrical activity from the user's facial muscles in combination with at least one of head orientation or movement as a second command. The at least one wearable device may include a set of earphones. The system may further include a software application executing on a mobile device configured to provide for modifying the pre-determined user setting.
According to another aspect, a method for using facial muscle electromyogram (EMG) potential as input may include providing at least one wearable device, wherein each wearable device includes a processor, a gesture control interface operatively connected to the processor, and at least one EMG sensor configured to detect electrical activity from a user's facial muscles, the at least one EMG sensor operatively connected to the processor. The processor is configured to interpret the electrical activity from the user's facial muscles as a first command to perform an action according to a pre-determined user setting. The method may further include receiving the facial muscle EMG potentials at the at least one EMG sensor and interpreting, at the processor, the facial muscle EMG potentials as a first command to perform the action according to the pre-determined user setting.
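The core of the method above is a lookup from a recognized EMG pattern to an action defined by a pre-determined user setting. The following is a minimal sketch of that flow; the pattern labels, the toy amplitude classifier, and the action names are illustrative assumptions, not part of the specification.

```python
# Pre-determined user settings: map a recognized EMG pattern label
# to an action the wearable device can perform. Labels and actions
# are invented for illustration.
USER_SETTINGS = {
    "double_blink": "answer_call",
    "jaw_clench": "pause_audio",
}

def classify_emg(samples, threshold=0.5):
    """Toy classifier: label an EMG pattern from mean amplitude."""
    mean = sum(abs(s) for s in samples) / len(samples)
    return "jaw_clench" if mean >= threshold else "double_blink"

def interpret(samples, settings=USER_SETTINGS):
    """Interpret EMG potentials as a first command per user settings."""
    label = classify_emg(samples)
    return settings.get(label)
```

A real classifier would operate on filtered multi-channel EMG features rather than raw mean amplitude; the lookup against user settings would remain the same.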
The EMG sensors 36 may be combined with additional forms of user input. This may include one or more inertial sensors 74, a gesture control interface 38, one or more air microphones 32, and one or more bone microphones 34. The one or more inertial sensors 74 may include a 9-axis inertial sensor which includes a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis compass.
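One way to picture how these inputs come together is as a single per-instant sensor frame that a downstream interpreter can fuse. This is a hypothetical container only; the field names and units are assumptions, not defined by the specification.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SensorFrame:
    """One combined reading from the EMG, inertial, and microphone
    inputs described above (illustrative field names)."""
    emg: List[float]        # one value per EMG electrode
    accel: List[float]      # 3-axis accelerometer, x/y/z
    gyro: List[float]       # 3-axis gyroscope, x/y/z
    compass: List[float]    # 3-axis compass, x/y/z
    air_mic_level: float = 0.0
    bone_mic_level: float = 0.0

    def inertial_axes(self) -> int:
        """Total inertial axes; 9 for the 9-axis sensor described."""
        return len(self.accel) + len(self.gyro) + len(self.compass)
```

Packaging the modalities together makes the later fusion steps (speech gating, combined-modality commands) simple functions over one frame.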
The wireless earpiece 12B may also include a radio transceiver 26 such as a BLE, BLUETOOTH, Wi-Fi, or other type of radio transceiver, one or more speakers 16, and one or more processors 18. The one or more processors 18 may be operatively connected to the other components including the one or more EMG sensors 36, the one or more air microphones 32, the inertial sensor 74, the one or more bone microphones 34, the gesture control interface 38, the one or more speakers 16, and the radio transceiver 26.
Where one or more bone conduction microphones 34 and/or one or more air microphones 32 are present, signals from the microphones 34, 32 may be used to determine when certain muscle movement detected with the EMG sensors 36 is associated with speech of the user and when it is not. For example, when speaking, a user moves their mouth, which requires engaging a number of different muscles. The wireless earpiece may associate the readings from the one or more EMG sensors 36 with speech and thus not consider input received through the EMG sensors 36 to be user input to perform particular actions.
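The speech-gating rule above can be sketched as follows: when either microphone indicates the user is talking, EMG activity is attributed to speech and discarded as a control input. The energy threshold and function names are assumptions for illustration.

```python
SPEECH_ENERGY_THRESHOLD = 0.3  # assumed normalized microphone energy

def is_speaking(air_mic_energy, bone_mic_energy,
                threshold=SPEECH_ENERGY_THRESHOLD):
    """Treat the user as speaking if either microphone shows
    energy above the threshold."""
    return max(air_mic_energy, bone_mic_energy) >= threshold

def accept_emg_input(emg_event, air_mic_energy, bone_mic_energy):
    """Pass an EMG event through only when it is not masked by
    speech; otherwise drop it (return None)."""
    if is_speaking(air_mic_energy, bone_mic_energy):
        return None  # muscle activity attributed to talking
    return emg_event
```

In practice the decision would likely use voice-activity detection over a window of microphone samples rather than a single energy value, but the gating structure is the same.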
It is also to be understood that in order to provide user input, a user may combine one or more gestures as determined by the gesture control interface 38 with one or more facial expressions as determined by the one or more EMG sensors 36, in addition to one or more head movements or head orientations as determined by the one or more inertial sensors 74. Thus, complex input from a user may be quickly communicated using a combination of modalities and in a manner that may be more private than providing voice commands.
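Combining modalities can be pictured as looking up a (gesture, facial expression, head movement) triple in a command table. The specific triples and command names below are invented for illustration.

```python
# Hypothetical multi-modal command table: each entry pairs a gesture
# (gesture control interface), a facial expression (EMG sensors), and
# a head movement (inertial sensors) with a command.
COMBO_COMMANDS = {
    ("tap", "smile", "nod"): "confirm_selection",
    ("swipe", "frown", "shake"): "cancel_selection",
}

def interpret_combo(gesture, expression, head_motion,
                    table=COMBO_COMMANDS):
    """Map a multi-modal input triple to a command, or None when
    the combination is not bound to anything."""
    return table.get((gesture, expression, head_motion))
```

Because all three inputs must match, the combination space is much larger than any single modality, which is what allows complex input to be expressed quickly and privately.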
It is also to be understood that instead of performing processing of EMG sensor data on one or more of the earpieces, this data may be communicated over a wireless communication link, such as a BLE or BLUETOOTH connection, to the mobile device 2. The mobile device 2 may then perform processing and return results to the set of wireless earpieces 10. Alternatively, the mobile device 2 may communicate the sensor data over a network to a remote location such as a cloud computing service, which may analyze the data and return the results of the analysis to the mobile device 2 and then in turn to the set of wireless earpieces 10 if so desired. It is also contemplated that the same data may be analyzed in multiple locations and that different types of analysis may occur depending on the location. For example, the most computationally intensive forms of analysis may be performed at a remote location with greater computational resources than present in the set of wireless earpieces 10.
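The tiered-processing idea above can be sketched as a routing decision: run cheap analysis on the earpiece, offload moderate analysis to the mobile device, and send the most intensive analysis to a cloud service. The tier names and the cost scale are illustrative assumptions.

```python
def choose_analysis_location(analysis_cost, link_to_mobile=True,
                             link_to_cloud=True):
    """Pick where to analyze EMG data given an assumed computational
    cost (0.0 = trivial, 1.0 = most intensive) and available links."""
    if analysis_cost < 0.3:
        return "earpiece"        # cheap enough to run locally
    if analysis_cost < 0.7 and link_to_mobile:
        return "mobile_device"   # offload over BLE/BLUETOOTH
    if link_to_cloud:
        return "cloud_service"   # greatest compute resources
    # Fall back to the best location still reachable.
    return "mobile_device" if link_to_mobile else "earpiece"
```

The fallback branch reflects the text's flexibility: the same data may be analyzed at whichever locations are reachable, with the heaviest work placed where the most resources exist.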
Because humans can control the muscles of facial expression with impressive precision, a user is able to convey nuances across the spectrum of human emotion. Even slight movement of the muscles of facial expression can be monitored and their activity harnessed. Additionally, larger muscles of the head and neck may be activated to provide other levels of biometric EMG control inputs to the device. These device inputs may also be preset, such as via a software application executing on a mobile device. Thus, user settings may be modified using the software application. Any number of actions may be performed as determined by a user. This may, for example, include actions to initiate a phone call to a particular person or place, listen to a particular song or other audio selection, begin an activity, or any number of other actions which the wearable device may perform.
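Preset modification via a companion application can be sketched as the app writing a new expression-to-action binding into the user settings that the wearable consults. The action names echo the examples in the paragraph above but are not a defined API.

```python
def update_preset(settings, expression, action):
    """App-side edit: bind a facial-expression label to an action,
    returning a new settings mapping (the old one is left intact)."""
    updated = dict(settings)
    updated[expression] = action
    return updated

# Illustrative defaults and an app-driven modification.
defaults = {"raise_brows": "begin_activity"}
modified = update_preset(defaults, "jaw_clench", "call_home")
```

Returning a fresh mapping rather than mutating in place makes it easy for the app to preview or roll back a change before syncing the settings to the wearable.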
Biometric data from the EMG sensors may also be relayed to receptor devices without requiring EMG controls. The array of biometric EMG sensors may be used to better understand the emotional and physiologic status of the user. Such data may be used predictively as well as to elicit a pre-programmed response. In particular, instead of relying upon pre-determined user settings to associate facial expressions with specific commands, the facial expressions may be used to predict user actions or user needs. For example, where voice feedback is being provided to a user of an earpiece presenting options, and the user winces in response to an option, the facial expression may be interpreted as a "no".
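The predictive use above can be sketched as context-sensitive interpretation: while voice feedback is presenting an option, a detected wince is read as "no" rather than matched against preset command bindings. The expression labels and the yes/no vocabulary are illustrative.

```python
# Illustrative expression groupings for predictive interpretation.
NEGATIVE_EXPRESSIONS = {"wince", "frown"}
POSITIVE_EXPRESSIONS = {"smile", "raise_brows"}

def interpret_response(expression, presenting_option):
    """While an option is being read out, predictively map a facial
    expression to an answer; otherwise defer to normal command
    handling (return None)."""
    if not presenting_option:
        return None
    if expression in NEGATIVE_EXPRESSIONS:
        return "no"
    if expression in POSITIVE_EXPRESSIONS:
        return "yes"
    return None
```

The key design point is that the same EMG reading means different things in different contexts: a command when idle, an answer during a voice prompt.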
Although various methods, systems, and apparatuses have been shown and described herein, the present invention contemplates any number of options, variations, and alternative embodiments. For example, it is contemplated that the wearable device may be of any number of types of devices and any number of different facial gestures may be recognized.
This application is a continuation of U.S. Non-Provisional patent application Ser. No. 17/700,248, filed on Mar. 21, 2022 which is a continuation of U.S. Non-Provisional patent application Ser. No. 17/102,864, filed on Nov. 24, 2020 now patented as U.S. Pat. No. 11,294,466 which is a continuation of U.S. Non-Provisional patent application Ser. No. 15/703,811, filed on Sep. 13, 2017 now patented as U.S. Pat. No. 10,852,829 and claims priority to U.S. Provisional patent application Ser. No. 62/393,926, filed on Sep. 13, 2016, and all entitled “Measurement of Facial Muscle EMG Potentials for Predictive Analysis Using a Smart Wearable System and Method”, hereby incorporated by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
2325590 | Carlisle et al. | Aug 1943 | A |
2430229 | Kelsey | Nov 1947 | A |
3047089 | Zwislocki | Jul 1962 | A |
D208784 | Sanzone | Oct 1967 | S |
3586794 | Michaelis | Jun 1971 | A |
3934100 | Harada | Jan 1976 | A |
3983336 | Malek et al. | Sep 1976 | A |
4069400 | Johanson et al. | Jan 1978 | A |
4150262 | Ono | Apr 1979 | A |
4334315 | Ono et al. | Jun 1982 | A |
D266271 | Johanson et al. | Sep 1982 | S |
4375016 | Harada | Feb 1983 | A |
4588867 | Konomi | May 1986 | A |
4617429 | Bellafiore | Oct 1986 | A |
4654883 | Iwata | Mar 1987 | A |
4682180 | Gans | Jul 1987 | A |
4791673 | Schreiber | Dec 1988 | A |
4852177 | Ambrose | Jul 1989 | A |
4865044 | Wallace et al. | Sep 1989 | A |
4934378 | Perry | Jun 1990 | A |
4984277 | Bisgaard et al. | Jan 1991 | A |
5008943 | Arndt et al. | Apr 1991 | A |
5185802 | Stanton | Feb 1993 | A |
5191602 | Regen et al. | Mar 1993 | A |
5201007 | Ward et al. | Apr 1993 | A |
5201008 | Arndt et al. | Apr 1993 | A |
D340286 | Seo | Oct 1993 | S |
5280524 | Norris | Jan 1994 | A |
5295193 | Ono | Mar 1994 | A |
5298692 | Ikeda et al. | Mar 1994 | A |
5343532 | Shugart | Aug 1994 | A |
5347584 | Narisawa | Sep 1994 | A |
5363444 | Norris | Nov 1994 | A |
D367113 | Weeks | Feb 1996 | S |
5497339 | Bernard | Mar 1996 | A |
5606621 | Reiter et al. | Feb 1997 | A |
5613222 | Guenther | Mar 1997 | A |
5654530 | Sauer et al. | Aug 1997 | A |
5692059 | Kruger | Nov 1997 | A |
5721783 | Anderson | Feb 1998 | A |
5748743 | Weeks | May 1998 | A |
5749072 | Mazurkiewicz et al. | May 1998 | A |
5771438 | Palermo et al. | Jun 1998 | A |
D397796 | Yabe et al. | Sep 1998 | S |
5802167 | Hong | Sep 1998 | A |
D410008 | Almqvist | May 1999 | S |
5929774 | Charlton | Jul 1999 | A |
5933506 | Aoki et al. | Aug 1999 | A |
5949896 | Nageno et al. | Sep 1999 | A |
5987146 | Pluvinage et al. | Nov 1999 | A |
6021207 | Puthuff et al. | Feb 2000 | A |
6054989 | Robertson et al. | Apr 2000 | A |
6081724 | Wilson | Jun 2000 | A |
6084526 | Blotky et al. | Jul 2000 | A |
6094492 | Boesen | Jul 2000 | A |
6111569 | Brusky et al. | Aug 2000 | A |
6112103 | Puthuff | Aug 2000 | A |
6157727 | Rueda | Dec 2000 | A |
6167039 | Karlsson et al. | Dec 2000 | A |
6181801 | Puthuff et al. | Jan 2001 | B1 |
6208372 | Barraclough | Mar 2001 | B1 |
6230029 | Yegiazaryan et al. | May 2001 | B1 |
6270466 | Weinstein et al. | Aug 2001 | B1 |
6275789 | Moser et al. | Aug 2001 | B1 |
6339754 | Flanagan et al. | Jan 2002 | B1 |
D455835 | Anderson et al. | Apr 2002 | S |
6408081 | Boesen | Jun 2002 | B1 |
6424820 | Burdick et al. | Jul 2002 | B1 |
D464039 | Boesen | Oct 2002 | S |
6470893 | Boesen | Oct 2002 | B1 |
D468299 | Boesen | Jan 2003 | S |
D468300 | Boesen | Jan 2003 | S |
6542721 | Boesen | Apr 2003 | B2 |
6560468 | Boesen | May 2003 | B1 |
6654721 | Handelman | Nov 2003 | B2 |
6664713 | Boesen | Dec 2003 | B2 |
6690807 | Meyer | Feb 2004 | B1 |
6694180 | Boesen | Feb 2004 | B1 |
6718043 | Boesen | Apr 2004 | B1 |
6738485 | Boesen | May 2004 | B1 |
6748095 | Goss | Jun 2004 | B1 |
6754358 | Boesen et al. | Jun 2004 | B1 |
6784873 | Boesen et al. | Aug 2004 | B1 |
6823195 | Boesen | Nov 2004 | B1 |
6852084 | Boesen | Feb 2005 | B1 |
6879698 | Boesen | Apr 2005 | B2 |
6892082 | Boesen | May 2005 | B2 |
6920229 | Boesen | Jul 2005 | B2 |
6952483 | Boesen et al. | Oct 2005 | B2 |
6987986 | Boesen | Jan 2006 | B2 |
7010137 | Leedom et al. | Mar 2006 | B1 |
7113611 | Leedom et al. | Sep 2006 | B2 |
D532520 | Kampmeier et al. | Nov 2006 | S |
7136282 | Rebeske | Nov 2006 | B1 |
7203331 | Boesen | Apr 2007 | B2 |
7209569 | Boesen | Apr 2007 | B2 |
7215790 | Boesen et al. | May 2007 | B2 |
D549222 | Huang | Aug 2007 | S |
D554756 | Sjursen et al. | Nov 2007 | S |
7403629 | Aceti et al. | Jul 2008 | B1 |
D579006 | Kim et al. | Oct 2008 | S |
7463902 | Boesen | Dec 2008 | B2 |
7508411 | Boesen | Mar 2009 | B2 |
D601134 | Elabidi et al. | Sep 2009 | S |
7825626 | Kozisek | Nov 2010 | B2 |
7965855 | Ham | Jun 2011 | B1 |
7979035 | Griffin et al. | Jul 2011 | B2 |
7983628 | Boesen | Jul 2011 | B2 |
D647491 | Chen et al. | Oct 2011 | S |
8095188 | Shi | Jan 2012 | B2 |
8108143 | Tester | Jan 2012 | B1 |
8140357 | Boesen | Mar 2012 | B1 |
D666581 | Perez | Sep 2012 | S |
8300864 | Müllenborn et al. | Oct 2012 | B2 |
8406448 | Lin et al. | Mar 2013 | B2 |
8436780 | Schantz et al. | May 2013 | B2 |
D687021 | Yuen | Jul 2013 | S |
8493286 | Agrama | Jul 2013 | B1 |
8719877 | VonDoenhoff et al. | May 2014 | B2 |
8774434 | Zhao et al. | Jul 2014 | B2 |
8831266 | Huang | Sep 2014 | B1 |
8891800 | Shaffer | Nov 2014 | B1 |
8994498 | Agrafioti et al. | Mar 2015 | B2 |
D728107 | Martin et al. | Apr 2015 | S |
9013145 | Castillo et al. | Apr 2015 | B2 |
9037125 | Kadous | May 2015 | B1 |
D733103 | Jeong et al. | Jun 2015 | S |
9081944 | Camacho et al. | Jul 2015 | B2 |
9372533 | Agrama | Jun 2016 | B1 |
9510159 | Cuddihy et al. | Nov 2016 | B1 |
D773439 | Walker | Dec 2016 | S |
D775158 | Dong | Dec 2016 | S |
D777710 | Palmborg | Jan 2017 | S |
D788079 | Son et al. | May 2017 | S |
10852829 | Rüdiger | Dec 2020 | B2 |
11294466 | Rüdiger | Apr 2022 | B2 |
20010005197 | Mishra et al. | Jun 2001 | A1 |
20010027121 | Boesen | Oct 2001 | A1 |
20010043707 | Leedom | Nov 2001 | A1 |
20010056350 | Calderone et al. | Dec 2001 | A1 |
20020002413 | Tokue | Jan 2002 | A1 |
20020007510 | Mann | Jan 2002 | A1 |
20020010590 | Lee | Jan 2002 | A1 |
20020030637 | Mann | Mar 2002 | A1 |
20020046035 | Kitahara et al. | Apr 2002 | A1 |
20020057810 | Boesen | May 2002 | A1 |
20020076073 | Taenzer et al. | Jun 2002 | A1 |
20020118852 | Boesen | Aug 2002 | A1 |
20030002705 | Boesen | Jan 2003 | A1 |
20030065504 | Kraemer et al. | Apr 2003 | A1 |
20030100331 | Dress et al. | May 2003 | A1 |
20030104806 | Ruef et al. | Jun 2003 | A1 |
20030115068 | Boesen | Jun 2003 | A1 |
20030125096 | Boesen | Jul 2003 | A1 |
20030218064 | Conner et al. | Nov 2003 | A1 |
20030220584 | Honeyager et al. | Nov 2003 | A1 |
20040070564 | Dawson et al. | Apr 2004 | A1 |
20040160511 | Boesen | Aug 2004 | A1 |
20050017842 | Dematteo | Jan 2005 | A1 |
20050043056 | Boesen | Feb 2005 | A1 |
20050094839 | Gwee | May 2005 | A1 |
20050125320 | Boesen | Jun 2005 | A1 |
20050148883 | Boesen | Jul 2005 | A1 |
20050165663 | Razumov | Jul 2005 | A1 |
20050196009 | Boesen | Sep 2005 | A1 |
20050251455 | Boesen | Nov 2005 | A1 |
20050266876 | Boesen | Dec 2005 | A1 |
20060029246 | Boesen | Feb 2006 | A1 |
20060073787 | Lair et al. | Apr 2006 | A1 |
20060074671 | Farmaner et al. | Apr 2006 | A1 |
20060074808 | Boesen | Apr 2006 | A1 |
20060166715 | Engelen et al. | Jul 2006 | A1 |
20060166716 | Seshadri et al. | Jul 2006 | A1 |
20060220915 | Bauer | Oct 2006 | A1 |
20060258412 | Liu | Nov 2006 | A1 |
20080076972 | Dorogusker et al. | Mar 2008 | A1 |
20080090622 | Kim et al. | Apr 2008 | A1 |
20080146890 | LeBoeuf | Jun 2008 | A1 |
20080166005 | Terlizzi et al. | Jul 2008 | A1 |
20080254780 | Kuhl et al. | Oct 2008 | A1 |
20080255430 | Alexandersson et al. | Oct 2008 | A1 |
20090003620 | McKillop et al. | Jan 2009 | A1 |
20090005700 | Joshi et al. | Jan 2009 | A1 |
20090008275 | Ferrari et al. | Jan 2009 | A1 |
20090017881 | Madrigal | Jan 2009 | A1 |
20090073070 | Rofougaran | Mar 2009 | A1 |
20090097689 | Prest et al. | Apr 2009 | A1 |
20090105548 | Bart | Apr 2009 | A1 |
20090191920 | Regen et al. | Jul 2009 | A1 |
20090245559 | Boltyenkov et al. | Oct 2009 | A1 |
20090261114 | McGuire et al. | Oct 2009 | A1 |
20090296968 | Wu et al. | Dec 2009 | A1 |
20100033313 | Keady et al. | Feb 2010 | A1 |
20100203831 | Muth | Aug 2010 | A1 |
20100210212 | Sato | Aug 2010 | A1 |
20100320961 | Castillo et al. | Dec 2010 | A1 |
20110140844 | McGuire et al. | Jun 2011 | A1 |
20110239497 | McGuire et al. | Oct 2011 | A1 |
20110286615 | Olodort et al. | Nov 2011 | A1 |
20120057740 | Rosal | Mar 2012 | A1 |
20120116537 | Liebetanz | May 2012 | A1 |
20120149467 | Heck | Jun 2012 | A1 |
20120229248 | Parshionikar et al. | Sep 2012 | A1 |
20130123656 | Heck | May 2013 | A1 |
20130274583 | Heck | Oct 2013 | A1 |
20130316642 | Newham | Nov 2013 | A1 |
20130346168 | Zhou et al. | Dec 2013 | A1 |
20140079257 | Ruwe et al. | Mar 2014 | A1 |
20140106677 | Altman | Apr 2014 | A1 |
20140122116 | Smythe | May 2014 | A1 |
20140153768 | Hagen et al. | Jun 2014 | A1 |
20140160248 | Pomerantz et al. | Jun 2014 | A1 |
20140163771 | Demeniuk | Jun 2014 | A1 |
20140185828 | Helbling | Jul 2014 | A1 |
20140219467 | Kurtz | Aug 2014 | A1 |
20140222462 | Shakil et al. | Aug 2014 | A1 |
20140235169 | Parkinson et al. | Aug 2014 | A1 |
20140270227 | Swanson | Sep 2014 | A1 |
20140270271 | Dehe et al. | Sep 2014 | A1 |
20140335908 | Krisch et al. | Nov 2014 | A1 |
20140348367 | Vavrus et al. | Nov 2014 | A1 |
20150028996 | Agrafioti et al. | Jan 2015 | A1 |
20150110587 | Hori | Apr 2015 | A1 |
20150148989 | Cooper et al. | May 2015 | A1 |
20150245127 | Shaffer | Aug 2015 | A1 |
20150261298 | Li | Sep 2015 | A1 |
20150324645 | Jang et al. | Nov 2015 | A1 |
20150360030 | Cartledge | Dec 2015 | A1 |
20150366471 | LeBoeuf et al. | Dec 2015 | A1 |
20160033280 | Moore | Feb 2016 | A1 |
20160072558 | Hirsch et al. | Mar 2016 | A1 |
20160073189 | Lindén et al. | Mar 2016 | A1 |
20160100676 | Sandanger | Apr 2016 | A1 |
20160109961 | Parshionikar | Apr 2016 | A1 |
20160125892 | Bowen et al. | May 2016 | A1 |
20160353195 | Lott | Dec 2016 | A1 |
20160360350 | Watson et al. | Dec 2016 | A1 |
20170041699 | Mackellar | Feb 2017 | A1 |
20170059152 | Hirsch et al. | Mar 2017 | A1 |
20170060256 | Heck et al. | Mar 2017 | A1 |
20170060262 | Hviid et al. | Mar 2017 | A1 |
20170060269 | Förstner et al. | Mar 2017 | A1 |
20170061751 | Loermann et al. | Mar 2017 | A1 |
20170062913 | Hirsch et al. | Mar 2017 | A1 |
20170064426 | Hviid | Mar 2017 | A1 |
20170064428 | Hirsch | Mar 2017 | A1 |
20170064432 | Hviid et al. | Mar 2017 | A1 |
20170064437 | Hviid et al. | Mar 2017 | A1 |
20170078780 | Qian et al. | Mar 2017 | A1 |
20170078785 | Qian et al. | Mar 2017 | A1 |
20170108918 | Boesen | Apr 2017 | A1 |
20170109131 | Boesen | Apr 2017 | A1 |
20170110124 | Boesen et al. | Apr 2017 | A1 |
20170110899 | Boesen | Apr 2017 | A1 |
20170111723 | Boesen | Apr 2017 | A1 |
20170111725 | Boesen et al. | Apr 2017 | A1 |
20170111726 | Martin et al. | Apr 2017 | A1 |
20170111740 | Hviid et al. | Apr 2017 | A1 |
20170151447 | Boesen | Jun 2017 | A1 |
20170151668 | Boesen | Jun 2017 | A1 |
20170151918 | Boesen | Jun 2017 | A1 |
20170151930 | Boesen | Jun 2017 | A1 |
20170151957 | Boesen | Jun 2017 | A1 |
20170151959 | Boesen | Jun 2017 | A1 |
20170153114 | Boesen | Jun 2017 | A1 |
20170153636 | Boesen | Jun 2017 | A1 |
20170154532 | Boesen | Jun 2017 | A1 |
20170155985 | Boesen | Jun 2017 | A1 |
20170155992 | Perianu et al. | Jun 2017 | A1 |
20170155993 | Boesen | Jun 2017 | A1 |
20170155997 | Boesen | Jun 2017 | A1 |
20170155998 | Boesen | Jun 2017 | A1 |
20170156000 | Boesen | Jun 2017 | A1 |
20170164089 | Lee et al. | Jun 2017 | A1 |
20170178631 | Boesen | Jun 2017 | A1 |
20170180842 | Boesen | Jun 2017 | A1 |
20170180843 | Perianu et al. | Jun 2017 | A1 |
20170180897 | Perianu | Jun 2017 | A1 |
20170188127 | Perianu et al. | Jun 2017 | A1 |
20170188132 | Hirsch et al. | Jun 2017 | A1 |
20170195829 | Belverato et al. | Jul 2017 | A1 |
20170208393 | Boesen | Jul 2017 | A1 |
20170214987 | Boesen | Jul 2017 | A1 |
20170215016 | Dohmen et al. | Jul 2017 | A1 |
20170230752 | Dohmen et al. | Aug 2017 | A1 |
20170257698 | Boesen et al. | Sep 2017 | A1 |
20170339484 | Kim | Nov 2017 | A1 |
20170347177 | Masaki et al. | Nov 2017 | A1 |
20180107275 | Chen et al. | Apr 2018 | A1 |
Number | Date | Country |
---|---|---|
204244472 | Apr 2015 | CN |
104683519 | Jun 2015 | CN |
104837094 | Aug 2015 | CN |
1469659 | Oct 2004 | EP |
1017252 | May 2006 | EP |
2903186 | Aug 2015 | EP |
2074817 | Nov 1981 | GB |
2508226 | May 2014 | GB |
2008103925 | Aug 2008 | WO |
2007034371 | Nov 2008 | WO |
2011001433 | Jan 2011 | WO |
2012071127 | May 2012 | WO |
2013134956 | Sep 2013 | WO |
2014046602 | Mar 2014 | WO |
2014043179 | Jul 2014 | WO |
2015061633 | Apr 2015 | WO |
2015110577 | Jul 2015 | WO |
2015110587 | Jul 2015 | WO |
2016032990 | Mar 2016 | WO |
Entry |
---|
Akkermans, “Acoustic Ear Recognition for Person Identification”, Automatic Identification Advanced Technologies, 2005 pp. 219-223. |
Announcing the $3,333,333 Stretch Goal (Feb. 24, 2014) pp. 1-14. |
Ben Coxworth: “Graphene-based ink could enable low-cost, foldable electronics”, “Journal of Physical Chemistry Letters”, Northwestern University, (May 22, 2013), pp. 1-7. |
Blain: “World's first graphene speaker already superior to Sennheiser MX400”, htt://www.gizmag.com/graphene-speaker-beats-sennheiser-mx400/31660, (Apr. 15, 2014). |
BMW, “BMW introduces BMW Connected—The personalized digital assistant”, “http://bmwblog.com/2016/01/05/bmw-introduces-bmw-connected-the-personalized-digital-assistant”, (Jan. 5, 2016). |
BRAGI Is On Facebook (2014), pp. 1-51. |
BRAGI Update—Arrival Of Prototype Chassis Parts—More People—Awesomeness (May 13, 2014), pp. 1-8. |
BRAGI Update—Chinese New Year, Design Verification, Charging Case, More People, Timeline(Mar. 6, 2015), pp. 1-18. |
BRAGI Update—First Sleeves From Prototype Tool—Software Development Kit (Jun. 5, 2014), pp. 1-8. |
BRAGI Update—Let's Get Ready To Rumble, A Lot To Be Done Over Christmas (Dec. 22, 2014), pp. 1-18. |
BRAGI Update—Memories From April—Update On Progress (Sep. 16, 2014), pp. 1-15. |
BRAGI Update—Memories from May—Update On Progress—Sweet (Oct. 13, 2014), pp. 1-16. |
BRAGI Update—Memories From One Month Before Kickstarter—Update On Progress (Jul. 10, 2014), pp. 1-17. |
BRAGI Update—Memories From The First Month of Kickstarter—Update on Progress (Aug. 1, 2014), pp. 1-16. |
BRAGI Update—Memories From The Second Month of Kickstarter—Update On Progress (Aug. 22, 2014), pp. 1-15. |
BRAGI Update—New People @BRAGI—Prototypes (Jun. 26, 2014), pp. 1-9. |
BRAGI Update—Office Tour, Tour To China, Tour to CES (Dec. 11, 2014), pp. 1-14. |
BRAGI Update—Status On Wireless, Bits and Pieces, Testing—Oh Yeah, Timeline(Apr. 24, 2015), pp. 1-18. |
BRAGI Update—The App Preview, The Charger, The SDK, BRAGI Funding and Chinese New Year (Feb. 11, 2015), pp. 1-19. |
BRAGI Update—What We Did Over Christmas, Las Vegas & CES (Jan. 19, 2014), pp. 1-21. |
BRAGI Update—Years of Development, Moments of Utter Joy and Finishing What We Started(Jun. 5, 2015), pp. 1-21. |
BRAGI Update—Alpha 5 and Back To China, Backer Day, On Track(May 16, 2015), pp. 1-15. |
BRAGI Update—Beta2 Production and Factory Line(Aug. 20, 2015), pp. 1-16. |
BRAGI Update—Certifications, Production, Ramping Up (Nov. 13, 2015), pp. 1-15. |
BRAGI Update—Developer Units Shipping and Status(Oct. 5, 2015), pp. 1-20. |
BRAGI Update—Developer Units Started Shipping and Status (Oct. 19, 2015), pp. 1-20. |
BRAGI Update—Developer Units, Investment, Story and Status(Nov. 2, 2015), pp. 1-14. |
BRAGI Update—Getting Close(Aug. 6, 2015), pp. 1-20. |
BRAGI Update—On Track, Design Verification, How It Works and What's Next(Jul. 15, 2015), pp. 1-17. |
BRAGI Update—On Track, On Track and Gems Overview (Jun. 24, 2015), pp. 1-19. |
BRAGI Update—Status On Wireless, Supply, Timeline and Open House@BRAGI(Apr. 1, 2015), pp. 1-17. |
BRAGI Update—Unpacking Video, Reviews On Audio Perform and Boy Are We Getting Close(Sep. 10, 2015), pp. 1-15. |
Healthcare Risk Management Review, “Nuance updates computer-assisted physician documentation solution” (Oct. 20, 2016), pp. 1-2. |
Hoyt et. al., “Lessons Learned from Implementation of Voice Recognition for Documentation in the Military Electronic Health Record System”, The American Health Information Management Association (2017), pp. 1-8. |
Hyundai Motor America, “Hyundai Motor Company Introduces A Health + Mobility Concept For Wellness In Mobility”, Fountain Valley, Californa (2017), pp. 1-3. |
International Search Report & Written Opinion, PCT/EP2016/070231 (Nov. 18, 2016) 12 pages. |
Last Push Before The Kickstarter Campaign Ends on Monday 4pm CET (Mar. 28, 2014), pp. 1-7. |
Nigel Whitfield: “Fake tape detectors, ‘from the stands’ footie and UGH? Internet of Things in my set-top box”; http://www.theregister.co.uk/2014/09/24/ibc_round_up_object_audio_dlna_iot/ (Sep. 24, 2014). |
Staab, Wayne J., et al., “A One-Size Disposable Hearing Aid is Introduced”, The Hearing Journal 53(4):36-41) Apr. 2000. |
Stretchgoal—It's Your Dash (Feb. 14, 2014), pp. 1-14. |
Stretchgoal—The Carrying Case for The Dash (Feb. 12, 2014), pp. 1-9. |
Stretchgoal—Windows Phone Support (Feb. 17, 2014), pp. 1-17. |
The Dash + The Charging Case & The BRAGI News (Feb. 21, 2014), pp. 1-12. |
The Dash—A Word From Our Software, Mechanical and Acoustics Team + An Update (Mar. 11, 2014), pp. 1-7. |
Update From BRAGI—$3,000,000—Yipee (Mar. 22, 2014), pp. 1-11. |
Wikipedia, “Gamebook”, https://en.wikipedia.org/wiki/Gamebook, Sep. 3, 2017, 5 pages. |
Wikipedia, “Kinect”, “https://en.wikipedia.org/wiki/Kinect”, 18 pages, (Sep. 9, 2017). |
Wikipedia, “Wii Balance Board”, “https://en.wikipedia.org/wiki/Wii_Balance_Board”, 3 pages, (Jul. 20, 2017). |
Number | Date | Country | |
---|---|---|---|
20230297169 A1 | Sep 2023 | US |
Number | Date | Country | |
---|---|---|---|
62393926 | Sep 2016 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17700248 | Mar 2022 | US |
Child | 18322993 | US | |
Parent | 17102864 | Nov 2020 | US |
Child | 17700248 | US | |
Parent | 15703811 | Sep 2017 | US |
Child | 17102864 | US |