The present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to earpieces.
Earpieces hold great promise as widely adopted wearable devices. One of the ongoing problems with earpieces continues to be the limited manner in which user input may be provided. What is needed are improved earpieces which allow user input to be received in an efficient and desirable manner.
Therefore, it is a primary object, feature, or advantage of the present invention to improve over the state of the art.
It is a further object, feature, or advantage of the present invention to provide for new ways of receiving user input for earpieces.
It is a still further object, feature, or advantage of the present invention to provide for new ways of receiving manual input from users.
Another object, feature, or advantage is to receive manual input from a user of an earpiece without needing a touch sensor.
Yet another object, feature, or advantage is to receive manual input from a user without needing manual buttons.
Another object, feature, or advantage of the present invention is to reduce or eliminate false positive indications that taps occurred.
Yet another object, feature, or advantage is to provide for a way for receiving manual input from a user which is easy for a user to use.
One or more of these and/or other objects, features, or advantages of the present invention will become apparent from the specification and claims that follow. No single embodiment need provide each and every object, feature, or advantage. Different embodiments may have different objects, features, or advantages. Therefore, the present invention is not to be limited to or by any objects, features, or advantages stated herein.
According to one aspect, an earpiece comprises an earpiece housing, a digital signal processor disposed within the earpiece housing, and at least one microphone operatively connected to the digital signal processor. The earpiece is configured to receive audio from the at least one microphone and process the audio with the digital signal processor to determine if a user has performed a tap on the earpiece. The earpiece may further include a wireless transceiver disposed within the earpiece, wherein the earpiece is configured to communicate data indicative of the occurrence of the tap using the wireless transceiver. The wireless transceiver may include a near field magnetic induction (NFMI) transceiver or a radio transceiver such as a Bluetooth, BLE, or other type of radio transceiver. Multiple transceivers may be present, such as one NFMI transceiver and one BLE transceiver. The earpiece may further include a processor disposed within the earpiece housing and a wireless transceiver disposed within the earpiece housing and operatively connected to the processor, wherein the processor is configured to receive data indicative of the tap on the earpiece from the digital signal processor and to receive data indicative of a tap on a different earpiece through the wireless transceiver. The processor may be further programmed to interpret one or more taps on the earpiece and/or one or more taps on the different earpiece as a user command and to perform an action based on the user command. The action may include communicating the user command to another device in operative communication with the earpiece. The earpiece may be configured to receive audio from the at least one microphone and process the audio with the digital signal processor to determine a location of the tap on the earpiece. The at least one microphone may be positioned to face outwards.
According to another aspect, an earpiece includes an earpiece housing, a processor disposed within the earpiece housing, at least one microphone operatively connected to the processor, and a wireless transceiver disposed within the earpiece housing and operatively connected to the processor. The earpiece is configured to receive audio from the at least one microphone and process the audio with the processor to determine if a user has performed a tap on the earpiece. The earpiece may be further configured to interpret user input comprising the tap and perform an action based on the user input. The user input may further include one or more taps on an additional earpiece in operative communication with the earpiece. The user input may include a plurality of taps including the tap. The wireless transceiver may be a radio transceiver.
According to another aspect, a system includes a set of earpieces including a left earpiece and a right earpiece, each of the earpieces comprising an earpiece housing, a digital signal processor disposed within the earpiece housing, and at least one microphone operatively connected to the digital signal processor, wherein each of the earpieces is configured to receive audio from the at least one microphone and process the audio with the digital signal processor to determine if a user has performed a tap on the earpiece.
According to another aspect, a method is provided for use in a wireless earpiece comprising an earpiece housing, a processor disposed within the earpiece housing, and at least one microphone operatively connected to the processor. The method includes receiving user input comprising a physical tap by the user on the earpiece, monitoring audio associated with the user input from the at least one microphone, and processing the audio associated with the user input to determine occurrence of the physical tap. The method may further include performing an action based on the user input.
According to another aspect, an earpiece includes an earpiece housing, a digital signal processor disposed within the earpiece housing, and at least one intelligent microphone operatively connected to the digital signal processor. The earpiece is configured to receive audio from the at least one intelligent microphone and process the audio with the digital signal processor.
An earpiece wearable device may be used to sense acoustic events using one or more microphones of the earpiece, where the acoustic event is created by a mechanical or physical interaction with the device. For example, a user may tap the earpiece housing and the microphone(s) may sense the audio and a processor such as a digital signal processor may then analyze the audio to determine that the acoustic event was a tap. Thus, user input from a user may be sensed as an acoustic event. The user input may be a single tap on one earpiece, multiple taps on the earpiece, or where two earpieces are used (one left earpiece and one right earpiece), the user input may include one or more taps on each of the earpieces. The earpiece may interpret the user input as a command and perform one or more actions based on the command.
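By way of illustration only, the following is a minimal sketch of how a processor might flag a candidate tap as a short, high-energy transient in the microphone signal. The sample rate, frame size, and thresholds are illustrative assumptions, not a description of the actual implementation.

```python
# Minimal, illustrative tap detection: a tap shows up as a brief burst of
# energy well above the ambient background. All constants are assumptions.
import numpy as np

SAMPLE_RATE = 16000          # assumed microphone sample rate (Hz)
FRAME_SIZE = 256             # samples per analysis frame (~16 ms)
ENERGY_RATIO_THRESHOLD = 8.0 # frame energy must exceed background by this factor
MAX_TAP_FRAMES = 3           # a tap should decay within a few frames

def detect_tap(samples: np.ndarray) -> bool:
    """Return True if the audio buffer contains a tap-like transient."""
    usable = len(samples) // FRAME_SIZE * FRAME_SIZE
    frames = samples[:usable].reshape(-1, FRAME_SIZE)
    energies = (frames.astype(np.float64) ** 2).mean(axis=1)
    background = np.median(energies) + 1e-12       # robust ambient estimate
    loud = energies > ENERGY_RATIO_THRESHOLD * background
    if not loud.any():
        return False
    # A tap is impulsive: the run of loud frames must be short.
    first = int(np.argmax(loud))
    burst_len = int(loud[first:].astype(int).cumprod().sum())
    return burst_len <= MAX_TAP_FRAMES

if __name__ == "__main__":
    audio = 0.01 * np.random.randn(SAMPLE_RATE)    # one second of ambient noise
    audio[8000:8064] += np.hanning(64) * 0.8       # synthetic impulsive "tap"
    print(detect_tap(audio))                       # True
```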
The microphone may be of any number of types. For example, the microphone may be a smart or intelligent microphone from Knowles Corporation which integrates an audio processing algorithm with acoustic detection into a multi-mode digital microphone. One of the benefits of such a microphone is that it can recognize when it should be in sleep mode and when it should be awakened, thereby reducing power usage relative to a device which is always on and drawing battery power.
It should be appreciated that user input in the form of taps may be used to perform any number of functions. These may include raising or lowering volume, such as by receiving a tap on one earpiece to raise volume and a tap on a second earpiece to lower volume. These may also include receiving a double tap to play or pause music. Note that the interpretation of taps or other user input may be context-driven. Thus, while music is playing, a double tap may pause the music; if the music is paused or stopped, the double tap may play the music. Similarly, a tap on one earpiece may be used to accept a phone call while a tap on the other earpiece may be used to reject the phone call.
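By way of illustration only, the following sketch shows one way such a context-driven mapping from tap patterns to commands might be organized. The contexts, sides, tap counts, and action names are illustrative assumptions rather than an authoritative command set.

```python
# Illustrative mapping: the same tap gesture maps to different actions
# depending on what the earpiece is currently doing.
from enum import Enum, auto

class Context(Enum):
    MUSIC_PLAYING = auto()
    MUSIC_PAUSED = auto()
    INCOMING_CALL = auto()

# (context, earpiece side, tap count) -> action name (all names are assumptions)
COMMAND_MAP = {
    (Context.MUSIC_PLAYING, "left",  1): "volume_down",
    (Context.MUSIC_PLAYING, "right", 1): "volume_up",
    (Context.MUSIC_PLAYING, "right", 2): "pause_music",
    (Context.MUSIC_PAUSED,  "right", 2): "play_music",
    (Context.INCOMING_CALL, "right", 1): "accept_call",
    (Context.INCOMING_CALL, "left",  1): "reject_call",
}

def interpret_taps(context: Context, side: str, tap_count: int) -> str | None:
    """Map a detected tap pattern to an action for the current context."""
    return COMMAND_MAP.get((context, side, tap_count))

print(interpret_taps(Context.MUSIC_PLAYING, "right", 2))  # pause_music
print(interpret_taps(Context.MUSIC_PAUSED,  "right", 2))  # play_music
```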
In one configuration where a digital signal processor 40 is used, the digital signal processor 40 may process an audio signal to analyze an acoustical event. The digital signal processor may be configured to detect, classify, and identify acoustical events as user input in the form of user interactions such as taps. In one implementation, training may be performed in which a user is instructed to perform different actions, including different physical events such as taps, in order to collect examples of acoustical events. It is to be understood that varying levels of complexity may be applied to the processing if greater discernment of a user's actions is required. For example, tapping in other areas of the ear or head, or on other items such as jewelry, instead of on a surface of the earpiece, may require more complexity or computing power to detect, classify, and identify the acoustical event.
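By way of illustration only, the following is a simplified sketch of the training idea: labeled audio snippets collected while the user performs taps (and other actions) are reduced to a few features and fit with a lightweight classifier so that new acoustical events can later be labeled. The feature choice and the nearest-centroid classifier are illustrative simplifications, not the actual processing.

```python
# Illustrative training/classification of acoustical events from user-supplied
# examples. Features and classifier are deliberately minimal.
import numpy as np

def features(snippet: np.ndarray) -> np.ndarray:
    """Compact features of an acoustic event: energy, peak, and decay shape."""
    x = snippet.astype(np.float64)
    energy = float(np.mean(x ** 2))
    peak = float(np.max(np.abs(x)))
    half = len(x) // 2
    decay = float(np.mean(x[half:] ** 2) / (np.mean(x[:half] ** 2) + 1e-12))
    return np.array([energy, peak, decay])

class NearestCentroid:
    def fit(self, snippets, labels):
        feats = np.array([features(s) for s in snippets])
        self.labels_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[[l == c for l in labels]].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, snippet):
        d = np.linalg.norm(self.centroids_ - features(snippet), axis=1)
        return self.labels_[int(np.argmin(d))]

# Illustrative training data: impulsive "tap" examples vs. sustained "speech".
rng = np.random.default_rng(0)
taps = [np.r_[rng.normal(0, 0.8, 64), np.zeros(448)] for _ in range(5)]
speech = [rng.normal(0, 0.2, 512) for _ in range(5)]
clf = NearestCentroid().fit(taps + speech, ["tap"] * 5 + ["speech"] * 5)
print(clf.predict(np.r_[rng.normal(0, 0.7, 64), np.zeros(448)]))  # tap
```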
One or more speakers 73 are operatively connected to the intelligent control system. In addition, one or more transceivers may be in operative communication with the intelligent control system 18. For example, the transceiver 35 may be a near field magnetic induction (NFMI) transceiver which may, for example, be used to communicate between the earpiece and a second earpiece or other wearable device. The radio transceiver 34 is operatively connected to the intelligent control system 18. The radio transceiver 34 may be a Bluetooth transceiver, a BLE transceiver, a cellular transceiver, a UWB transceiver, a Wi-Fi transceiver, or other type of radio transceiver. Storage 60 is shown which is operatively connected to the intelligent control system 18. The storage 60 may be in the form of flash memory or other memory which may be used for various purposes including storing audio files which may be stored by the device and played back. Thus, for example, music may be played by the device or audio may be recorded by the device and stored locally. Of course, the storage 60 may be used to store other information as well.
As shown in
For example, a determination may be made as to whether contextual data is indicative that a user is likely or more likely to communicate with a tap. For example, if the wireless earpiece has just prompted the user with a voice prompt, it may be more likely that a user will communicate with one or more taps. Similarly, if the user has just inserted the wireless earpiece into the ear, it may be more likely that the user will communicate with one or more taps. The determination as to whether a user has just inserted the earpiece may be made based on inertial data, contact sensors, optical sensors, or otherwise.
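By way of illustration only, one simple way contextual data could bias detection is to lower the detection threshold during windows in which a tap reply is more likely. The parameter names, time windows, and scaling factors below are assumptions made for illustration.

```python
# Illustrative context-dependent sensitivity: a tap is accepted more readily
# shortly after a voice prompt or shortly after the earpiece was inserted.
import time

BASE_THRESHOLD = 8.0  # energy ratio required to accept a tap (assumed)

def tap_threshold(last_voice_prompt: float | None,
                  last_insertion: float | None,
                  now: float | None = None) -> float:
    """Lower the detection threshold when context says a tap is more likely."""
    now = time.monotonic() if now is None else now
    threshold = BASE_THRESHOLD
    if last_voice_prompt is not None and now - last_voice_prompt < 5.0:
        threshold *= 0.6   # user was just asked a question; expect a tap reply
    if last_insertion is not None and now - last_insertion < 10.0:
        threshold *= 0.8   # earpiece just went in; setup gestures are likely
    return threshold

print(tap_threshold(last_voice_prompt=0.0, last_insertion=None, now=2.0))  # 4.8
```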
By way of further example, inertial sensor data may be further used to assist in verifying that a user has performed a tap on the wireless earpiece. For example, an inertial signal may be correlated with the audio signal at the time of the tap to confirm the occurrence of a tap.
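By way of illustration only, the following sketch shows a simple form of such a cross-check: an audio-detected tap is confirmed only if the inertial sensor also shows an impulse at approximately the same time. The time window and impulse threshold are illustrative assumptions.

```python
# Illustrative inertial confirmation of an audio-detected tap.
import numpy as np

def confirm_tap(audio_tap_time: float,
                imu_times: np.ndarray,
                imu_magnitude: np.ndarray,
                window_s: float = 0.05,
                impulse_g: float = 1.5) -> bool:
    """True if an inertial impulse coincides with the audio-detected tap."""
    near = np.abs(imu_times - audio_tap_time) <= window_s
    return bool(near.any() and imu_magnitude[near].max() >= impulse_g)

# Accelerometer magnitude (in g) sampled at 100 Hz with a spike near t = 1.00 s.
t = np.arange(0, 2, 0.01)
mag = np.ones_like(t)            # ~1 g at rest
mag[100] = 2.4                   # impulse from the tap
print(confirm_tap(1.0, t, mag))  # True
print(confirm_tap(0.3, t, mag))  # False
```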
It is further to be understood that multiple microphone signals may be used in determining whether a tap has occurred, including signals from multiple microphones present at the wireless earpiece. The use of multiple microphones, and their respective positions relative to a surface for tapping, may further be used to increase the likelihood of correctly determining that a tap has occurred while reducing the likelihood of false positive events.
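By way of illustration only, the following sketch requires that every microphone observe a near-simultaneous transient before a tap is accepted, which tends to reject events that excite only one microphone. The per-channel detector and the allowed inter-microphone lag are illustrative assumptions.

```python
# Illustrative multi-microphone coincidence check for reducing false positives.
import numpy as np

SAMPLE_RATE = 16000
MAX_LAG_SAMPLES = 32   # a tap on the housing reaches all microphones almost at once

def transient_index(channel: np.ndarray, ratio: float = 8.0) -> int | None:
    """Index of the first sample whose local energy jumps above background."""
    energy = np.convolve(channel.astype(np.float64) ** 2, np.ones(64) / 64, "same")
    background = np.median(energy) + 1e-12
    hits = np.flatnonzero(energy > ratio * background)
    return int(hits[0]) if hits.size else None

def tap_on_all_mics(channels: list[np.ndarray]) -> bool:
    """Require every microphone to see a transient within MAX_LAG_SAMPLES."""
    idx = [transient_index(c) for c in channels]
    if any(i is None for i in idx):
        return False
    return max(idx) - min(idx) <= MAX_LAG_SAMPLES

rng = np.random.default_rng(1)
mics = [0.01 * rng.standard_normal(SAMPLE_RATE) for _ in range(2)]
for m in mics:
    m[8000:8064] += np.hanning(64) * 0.8          # same tap seen on both mics
print(tap_on_all_mics(mics))                       # True
```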
Therefore, an earpiece, system of earpieces, and associated methods have been shown and described. Although specific embodiments and examples have been shown and described, the present invention is not to be limited to any specific embodiment. In particular, options, variations, and alternatives are contemplated, including in the specific structure, the components, the interactions between the components, the number of microphones, the types of microphones, the type of processor(s) including digital signal processors, microprocessors, and/or other types of processors, the shape or configuration of the earpiece housing, the algorithms for performing analysis, whether the earpieces are integrated into a headset, the type of physical interaction with the earpieces, and other options, variations, and alternatives.
This application claims priority to U.S. Provisional Patent Application No. 62/461,657, filed Feb. 21, 2017, hereby incorporated by reference in its entirety.