The present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to earpieces.
Real-time audio feedback is a concept that would be of great use in many industries today. However, such feedback is not feasible in many instances, either because managers or instructors are too busy with other tasks, or because electronic devices that could provide such feedback, such as smartphones, may be too cumbersome or inappropriate for certain tasks. What is needed is a system and method that provides audio feedback in response to user performance of manual or prescribed tasks.
Therefore, it is a primary object, feature, or advantage of the present invention to improve over the state of the art.
It is a further object, feature, or advantage of the present invention to provide audio feedback in response to user performance.
It is a still further object, feature, or advantage of the present invention to monitor user performance, such as through audio sensed with microphones, movement sensed with inertial sensors, or otherwise.
One or more of these and/or other objects, features, or advantages of the present invention will become apparent from the specification and claims that follow. No single embodiment need provide each and every object, feature, or advantage. Different embodiments may have different objects, features, or advantages. Therefore, the present invention is not to be limited to or by an object, feature, or advantage stated herein.
According to one aspect, a method of providing audio feedback in response to a user performance using an earpiece includes the steps of identifying a manual work operation to be performed by the user, wherein the identifying of the manual work operation is performed by the earpiece; monitoring performance of the manual work operation by the user, wherein the monitoring of the performance of the work operation is performed by the earpiece; generating 3D sound cues at the earpiece to assist in the performance of the manual work operation by the user; and outputting the 3D sound cues to the user at one or more speakers of the earpiece during the performance of the manual work operation by the user. The 3D sound cues may be generated to be perceived as coming from a spatial location having a contextual relationship with the manual work operation by the user. The monitoring of the performance of the manual work operation by the user may be performed by monitoring movement of the user during the manual work operation using one or more inertial sensors of the earpiece. The monitoring of the performance of the manual work operation by the user may be performed by monitoring audio of the user sensed by the earpiece during the manual work operation using one or more microphones of the earpiece. The 3D sound cues may include voice audio. The manual work operations may be associated with various purposes, including therapy such as occupational therapy or physical therapy.
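To make the identify-monitor-generate-output sequence above concrete, the following is a minimal, self-contained sketch in Python. The task model, sensor values, and function names are illustrative assumptions, not the earpiece's actual firmware interface.

```python
# A minimal sketch of the feedback loop described above. The sensor readings
# and task model here are hypothetical stand-ins; they only illustrate the
# identify -> monitor -> generate cue -> output sequence.

def identify_operation():
    # In practice this could come from a voice command, a paired device, or a schedule.
    return {"name": "tighten left fastener", "expected_hand": "left"}

def monitor_step(operation, imu_sample, mic_level):
    # Trivial performance check: motion and sound below threshold means the user has stalled.
    stalled = abs(imu_sample) < 0.05 and mic_level < 0.1
    return {"needs_guidance": stalled, "prompt": f"Use your {operation['expected_hand']} hand"}

def generate_cue(step, operation):
    # Contextual placement: a left-hand task is cued from the user's left (-90 degrees azimuth).
    azimuth = -90 if operation["expected_hand"] == "left" else 90
    return {"text": step["prompt"], "azimuth_deg": azimuth}

def output_cue(cue):
    print(f"Speaker: '{cue['text']}' rendered at azimuth {cue['azimuth_deg']} deg")

operation = identify_operation()
for imu_sample, mic_level in [(0.02, 0.05), (0.4, 0.3), (0.01, 0.02)]:
    step = monitor_step(operation, imu_sample, mic_level)
    if step["needs_guidance"]:
        output_cue(generate_cue(step, operation))
```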
According to another aspect, a method of providing audio feedback in response to a user performance using a set of earpieces including a left earpiece and a right earpiece is provided. The method includes identifying a manual work operation to be performed by the user, wherein the identifying of the manual work operation is performed by at least one of the left earpiece and the right earpiece; monitoring performance of the manual work operation by the user, wherein the monitoring of the performance of the work operation is performed by the left earpiece and the right earpiece; generating 3D sound cues to assist in the performance of the manual work operation by the user, wherein the generating of the 3D sound cues is performed by at least one of the left earpiece and the right earpiece; and outputting the 3D sound cues to the user at one or more speakers of the left earpiece and one or more speakers of the right earpiece during the performance of the manual work operation by the user. The 3D sound cues may be generated to be perceived as coming from a spatial location having a contextual relationship with the manual work operation by the user. The monitoring of the performance of the manual work operation by the user may be performed by monitoring movement of the user during the manual work operation using one or more inertial sensors of the left earpiece and one or more inertial sensors of the right earpiece. The monitoring of the performance of the manual work operation by the user may be performed by monitoring audio of the user sensed during the manual work operation using one or more microphones of the left earpiece and one or more microphones of the right earpiece. The monitoring of the performance of the manual work operation by the user may also combine both: monitoring movement of the user during the manual work operation using one or more inertial sensors of the left earpiece and one or more inertial sensors of the right earpiece, and monitoring audio of the user sensed during the manual work operation using one or more microphones of the left earpiece and one or more microphones of the right earpiece.
An earpiece or set of earpieces may be used to provide voice feedback to a user and also to monitor and track the user's performance of a manual operation or task. The earpiece may assist with the manual operations by providing context-sensitive instructions. For example, where a user is to perform a task with their left hand, audio may be reproduced that sounds as if it is coming from the left of the user, through audio processing in which the virtual source is appropriately placed. This may be used in any number of different contexts, including training for any number of tasks, physical or occupational therapy, sports performance training, or otherwise.
A processor 20 is disposed within the earpiece housing 14, and a gesture control interface 22 with at least one emitter 42 and at least one detector 44 is operatively connected to the one or more processors 20. A radio transceiver 26 disposed within the earpiece housing 14 is also operatively connected to the processor 20. The radio transceiver 26 may be a BLUETOOTH, BLE, Wi-Fi, or other type of radio transceiver. Another transceiver 28 is disposed within the earpiece housing 14 and may be operatively connected to the one or more processors 20. The transceiver 28 may be a magnetic induction transceiver such as a near field magnetic induction (NFMI) transceiver. A data storage device 29 may be disposed within the earpiece housing and operatively connected to the one or more processors 20. The data storage device 29 may store data to be used in analyzing performance of manual operations of the user or in providing audio feedback to the user. One or more LEDs 38 may be operatively connected to the one or more processors 20 to provide visual feedback.
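As one illustration of how the data storage device 29 might hold data used to analyze performance, the sketch below compares a recorded motion trace against a stored reference profile. The data layout, distance measure, and threshold are assumptions for illustration only.

```python
# Hypothetical sketch: comparing a recorded motion trace against a reference
# profile that could be kept in the data storage device 29. The distance
# measure and threshold are illustrative assumptions, not the actual firmware.

def mean_absolute_error(recorded, reference):
    n = min(len(recorded), len(reference))
    return sum(abs(r - t) for r, t in zip(recorded[:n], reference[:n])) / n

# Reference accelerometer magnitudes for a correctly performed operation.
reference_profile = [0.1, 0.4, 0.9, 0.7, 0.3, 0.1]
recorded_trace = [0.1, 0.2, 0.5, 0.4, 0.2, 0.1]

error = mean_absolute_error(recorded_trace, reference_profile)
if error > 0.15:
    print(f"Deviation {error:.2f} exceeds threshold: queue corrective audio feedback")
```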
The electromyography (EMG) sensor 34 may be present and configured to read myographic activity from a user to ascertain a facial or other physical gesture and to communicate a signal related to the myographic activity to the processor 20. For example, the EMG sensor 34 may sense the electrical activity of a user's facial muscles while the user struggles to accomplish a certain task or to solve a problem and may submit the resulting readings to the processor 20, which may subsequently produce audio feedback at the speaker 18 to assist the user in fixing or eliminating the problem or alleviating the struggle. An infrared sensor 36 may also be employed to ascertain the movement of third-party objects or entities. Each sensor 16 may be positioned at any location conducive to receiving information and need not necessarily be in direct contact with either the user or the external environment.
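One simple way the EMG readings described above could trigger assistive feedback is an amplitude threshold over a sliding window, as in the hypothetical sketch below; the window length, threshold, and spoken message are illustrative assumptions.

```python
# Hypothetical sketch: detect sustained facial-muscle activity from the EMG
# sensor 34 and trigger assistive audio. Window size and threshold are
# illustrative, not taken from the specification.

from collections import deque

WINDOW = 25           # samples in the sliding window
STRUGGLE_LEVEL = 0.6  # mean rectified EMG amplitude suggesting sustained effort

window = deque(maxlen=WINDOW)

def process_emg_sample(amplitude):
    window.append(abs(amplitude))
    if len(window) == WINDOW and sum(window) / WINDOW > STRUGGLE_LEVEL:
        return "It looks like this step is difficult. Try repositioning your grip."
    return None

for sample in [0.7] * 30:  # simulated sustained activity
    message = process_emg_sample(sample)
    if message:
        print("Speaker:", message)
        break
```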
One or more speakers 18 may be operatively connected to the earpiece housing 14 and may be configured to, in addition to producing audio feedback in response to a command from the processor 20, produce one or more ambient and/or non-ambient sounds from one or more microphones 24, 32 or produce one or more audio signals from the radio transceiver 26, the transceiver 28, or the data storage device 29. The produced sounds may consist of musical sounds, non-musical sounds, commentary, instructions, miscellaneous information, or anything else of interest or importance to the user. In addition, the audio feedback or sounds provided by the speaker 18 may be produced in a three-dimensional manner. For example, if the audio feedback relates to poor foot positioning, the audio feedback may be provided in such a manner that the user or third party will interpret the audio feedback as originating from the user's or third party's feet.
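As a rough illustration of such three-dimensional placement, the sketch below pans a mono cue using an interaural level difference and a small interaural time delay; real spatialization would typically use head-related transfer functions, and the constants here are assumptions for illustration.

```python
# Hypothetical sketch of crude spatial placement using interaural level and
# time differences. Real 3D rendering would normally use HRTFs; the numbers
# here are only illustrative.

import math

SAMPLE_RATE = 16000
MAX_ITD_SECONDS = 0.0007  # ~0.7 ms maximum interaural time difference

def spatialize(mono_samples, azimuth_deg):
    """Return (left, right) channels for a source at the given azimuth:
    -90 = fully left, 0 = straight ahead, +90 = fully right."""
    pan = math.sin(math.radians(azimuth_deg))  # -1 .. +1
    left_gain = math.sqrt((1 - pan) / 2)       # equal-power panning
    right_gain = math.sqrt((1 + pan) / 2)
    delay = int(abs(pan) * MAX_ITD_SECONDS * SAMPLE_RATE)
    left = [s * left_gain for s in mono_samples]
    right = [s * right_gain for s in mono_samples]
    if pan > 0:    # source on the right: delay the left (far) ear
        left = [0.0] * delay + left
    elif pan < 0:  # source on the left: delay the right (far) ear
        right = [0.0] * delay + right
    return left, right

tone = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE) for n in range(160)]
left, right = spatialize(tone, azimuth_deg=-90)  # cue perceived from the user's left
print(len(left), len(right))
```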
A processor 20 may be disposed within the earpiece housing 14 and operatively connected to components within the earpiece 12 and may be configured to, in addition to producing audio feedback, process signals from the radio transceiver 26, process signals from the transceiver 28, process signals originating from the data storage device 29, process signals from the bone conduction microphone 32, process signals from the EMG sensor 34, process signals from the infrared sensor 36, and process signals from the one or more inertial sensors 33.
A gesture control interface 22 having at least one emitter 42 and a detector 44 may be operatively connected to the one or more processors 20 and may be configured to allow the user or a third party to control one or more functions of the earpiece 12. For example, a menu may be prompted through the use of a gesture with the gesture control interface 22, which may allow the user or a third party to select one or more motions to sense, reprogram or reconfigure the audio feedback to be produced at the speaker 18, listen to a song either stored within the data storage device 29 or received through the radio transceiver 26, listen to a playlist, listen to a newscast or a podcast, listen to a weather report, obtain information on the user's current surroundings, or anything else that may be of interest to the user or a third party; the aforementioned list is non-exclusive. The selections may be chosen through the use of one or more additional gestures or through the use of one or more voice commands from the user and/or a third party. The types of gestures that may be used with the gesture control interface 22 to control the earpiece 12 include, without limitation, touching, tapping, swiping, use of an instrument, or any combination of the aforementioned gestures. Touching gestures used to control the earpiece 12 may be of any duration and may include the touching of areas that are not part of the gesture control interface 22. Tapping gestures used to control the earpiece 12 may include one or more taps and need not be brief. Swiping gestures used to control the earpiece 12 may include a single swipe, a swipe that changes direction at least once, a swipe with a time delay, a plurality of swipes, or any combination of the aforementioned.
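The hypothetical sketch below illustrates how gesture events from the gesture control interface 22 might be dispatched to earpiece functions; the gesture names and actions are placeholders, not an actual device API.

```python
# Hypothetical sketch of dispatching gestures from the gesture control
# interface 22 to earpiece functions. Gesture names and actions are placeholders.

def open_menu():          return "menu opened"
def next_track():         return "skipping to next track"
def toggle_monitoring():  return "performance monitoring toggled"

GESTURE_ACTIONS = {
    "single_tap":    open_menu,
    "swipe_forward": next_track,
    "double_tap":    toggle_monitoring,
}

def handle_gesture(gesture):
    action = GESTURE_ACTIONS.get(gesture)
    return action() if action else "unrecognized gesture ignored"

for g in ["single_tap", "swipe_forward", "long_press"]:
    print(g, "->", handle_gesture(g))
```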
The emitters and detectors may be optical emitters and optical detectors. Where optical emitters and detectors are used, additional information may be sensed in addition to gestures of a user. This may include the position of the user relative to other objects, the movement of other objects, or the movement of the user in performing a work operation.
One or more microphones 24 may be operatively connected to the earpiece housing 14 and may be configured to receive ambient sounds from one or more outside sources, which may originate from the user, a third party, a machine, an animal, another earpiece, another electronic device, or even nature itself. The ambient sounds received by a microphone 24 may include a word, a combination of words, a sound, a combination of sounds, or any combination of the aforementioned. The sounds may be of any frequency and need not necessarily be audible to the user. In addition, one or more microphones 24 may also be configured to receive one or more voice commands which may be used to cease, commence, change, or modify one or more functions of the earpiece 12. For example, a voice command to cease receiving ambient sounds may be provided by the user or a third party saying, “Cease reception of internal ear sounds,” or a voice command to play the fifth song in a playlist may be provided by the user or a third party saying, “Play song five in playlist,” or “Skip to song five.” Other commands may be used to cease, commence, change or modify other functions of the earpiece 12.
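To illustrate how transcribed voice commands such as those quoted above might be matched to functions, here is a trivial keyword-based dispatcher; the patterns and handlers are assumptions, and a real system would use a full speech-recognition and command-interpretation pipeline.

```python
# Hypothetical sketch of matching transcribed voice commands, such as the
# examples quoted above, to earpiece functions. Patterns and handlers are
# illustrative only.

import re

def cease_ambient_reception(_):  return "ambient sound reception stopped"
def play_playlist_song(match):   return f"playing song {match.group(1)} in playlist"

COMMANDS = [
    (re.compile(r"cease reception of internal ear sounds", re.I), cease_ambient_reception),
    (re.compile(r"(?:play song|skip to song) (\w+)", re.I),       play_playlist_song),
]

def handle_command(transcript):
    for pattern, handler in COMMANDS:
        match = pattern.search(transcript)
        if match:
            return handler(match)
    return "command not recognized"

print(handle_command("Play song five in playlist"))
print(handle_command("Cease reception of internal ear sounds"))
```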
A radio transceiver 26 may be disposed within the earpiece 12 and may be configured to receive signals from external electronic devices and to transmit those signals to the processor 20. The external electronic devices the radio transceiver 26 may be configured to receive signals from include Bluetooth devices, mobile devices, desktops, laptops, tablets, modems, routers, communications towers, cameras, watches, third-party earpieces, earpieces, other wearable devices, or other electronic devices capable of transmitting or receiving wireless signals. The radio transceiver 26 may receive or transmit more than one signal simultaneously.
A transceiver 28 may be disposed within the earpiece 12 and may be configured to receive signals from and to transmit signals to a second earpiece of the user if the user is using more than one earpiece. The transceiver 28 may receive or transmit more than one signal simultaneously. The transceiver 28 may be of any number of types including a near field magnetic induction (NFMI) transceiver.
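To suggest why such an inter-earpiece link matters for 3D sound cues, the sketch below shows one earpiece sharing cue timing and gain parameters with its partner so both can render a single spatial cue in sync; the message format and fields are illustrative assumptions, not the NFMI protocol.

```python
# Hypothetical sketch of the primary earpiece sharing cue parameters with its
# partner over the inter-earpiece link so both render one spatial cue in sync.
# The message format and fields are illustrative assumptions only.

import json

def build_cue_message(cue_id, start_time_us, far_ear_gain, far_ear_delay_samples):
    return json.dumps({
        "cue_id": cue_id,
        "start_time_us": start_time_us,           # shared clock reference
        "gain": far_ear_gain,                     # attenuation for the far ear
        "delay_samples": far_ear_delay_samples,   # interaural delay for the far ear
    })

def handle_cue_message(raw):
    msg = json.loads(raw)
    return (f"render cue {msg['cue_id']} at t={msg['start_time_us']} us, "
            f"gain={msg['gain']}, delay={msg['delay_samples']} samples")

# Left earpiece (primary) tells the right earpiece how to render its half of the cue.
wire_message = build_cue_message(cue_id=7, start_time_us=1_250_000,
                                 far_ear_gain=0.3, far_ear_delay_samples=11)
print(handle_cue_message(wire_message))
```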
One or more LEDs 38 may be operatively connected to the earpiece housing 14 and may be configured to emit light in order to convey information to a user concerning the earpiece 12. The LEDs 38 may be located in any area on the earpiece 12 suitable for viewing by the user or a third party and may consist of as few as one diode which may be provided in combination with a light guide.
It should be understood that either a single earpiece may be used or, alternatively, a set of wireless earpieces may be used. Although the earpieces shown are of an ear bud style configuration, other configurations may be used, including headsets.
Therefore, methods, apparatus, and systems for providing audio feedback in response to a user's performance of manual work operations have been shown and described. The present invention contemplates numerous variations, options, and alternatives.
This application claims priority to U.S. Provisional Patent Application No. 62/417,385, filed Nov. 4, 2016, and entitled “Manual Operation Assistance with Earpiece 3D Sound Cues”, hereby incorporated by reference in its entirety.