The present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to earpieces.
The earpiece holds great promise as a wearable device. One of the challenges of incorporating varied functionality into an earpiece, however, is the relatively small space available. What is needed is technology which provides improved functionality for an earpiece while reducing the amount of space required.
For example, two separate sensor assemblies are currently required for the proper determination of pulse oximetry and audio detection in an earpiece device. This means the user must wear a device large enough to accommodate both sensors, which causes problems with design and usability and increases the cost of the materials required to provide adequate audio and pulse oximetry signals for processing. In addition, further work is required to shield each sensor from other components of the device, given the limited space available relative to the number of electronic components needed to support each input modality. What is needed is a combined optical sensor for detecting both audio signals and pulse oximetry.
Therefore, it is a primary object, feature, or advantage of the present invention to improve over the state of the art.
It is a further object, feature, or advantage of the present invention to combine the reception of audio signals and oxygen saturation data onto a single sensor.
It is a still further object, feature, or advantage of the present invention to separate the audio signals and the oxygen saturation data using signal processing techniques.
It is another object, feature, or advantage to reduce the number of key components.
Another object, feature, or advantage is to enhance the ability to incorporate other sensor arrays due to the enhanced space availability.
Yet another object, feature, or advantage is to reduce power requirements due to the reduction in the number of power intensive components.
A further object, feature, or advantage is to allow signal processing to provide more information about blood flow in a studied area.
A still further object, feature, or advantage is to provide new opportunities to enhance the amount and variety of sensed data.
One or more of these and/or other objects, features, or advantages will become apparent from the specification and claims that follow. No single embodiment need include each and every object, feature, or advantage as different embodiments may achieve different objects, features, and advantages.
A new system uses a single optical sensor both for the detection of high quality audio signals and for the accurate detection of pulse oximetry from the earpiece. The system may include a high quality optical source that emits light directed at the skin, cartilage, and bone of the external ear. An optical sensor is utilized to detect the signals reflected from the skin, bone, and cartilage of the external ear. The received signal may include both low frequency pulse oximetry signals and higher frequency audio signals. In this fashion, the combined sensor is able to receive audio signals emanating from the body and transmitted through bone and soft tissues, as well as the pulse rate and oxygen saturation from light reflected from blood. Signal processing is then used to detect and segregate the two signals. Transmission of the processed information is then possible using various schemes, including wired or wireless transmission methods. This has the advantage of detecting multiple signals using a single sensor.
According to one aspect, a system includes at least one earpiece, wherein each earpiece comprises an earpiece housing, an optical source operatively connected to the earpiece housing, wherein the optical source is configured to emit light toward an ear surface, an optical sensor operatively connected to the earpiece housing, wherein the optical sensor is configured to receive reflected light from the ear surface, and at least one processor disposed within at least one earpiece and operatively connected to the optical source and the optical sensor, wherein the at least one processor is configured to separate the pulse oximetry signals from the audio signals in the reflected light detected by the optical sensor. The at least one processor may be further configured to determine an oxygen saturation level from the pulse oximetry signals. The at least one processor may apply a filter to separate the pulse oximetry signals from the audio signals, the audio signals centered at a frequency higher than the pulse oximetry signals. The at least one earpiece may include a set of earpieces. The optical source may be a laser. The ear surface may be an outer ear surface or an inner ear surface. The at least one processor may be further configured to determine bone vibration measurements from the audio signals. The at least one processor may be further configured to interpret the bone vibration measurements and to determine voice input from a user of the wireless earpiece. The at least one processor may be reconfigured using the voice input.
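The filtering step described above can be illustrated with a minimal sketch. This is only one possible approach, assuming an FFT-based split at an arbitrary 10 Hz cutoff (cardiac pulse components sit well below that; voice-band audio well above); the specification does not prescribe a particular filter.

```python
import numpy as np

def separate_signals(samples, fs, cutoff_hz=10.0):
    """Split a photodetector trace into a low-frequency pulse oximetry
    component and a higher-frequency audio component by zeroing the
    spectrum above cutoff_hz. Illustrative sketch only."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    low = np.where(freqs <= cutoff_hz, spectrum, 0.0)
    pulse_ox = np.fft.irfft(low, n=len(samples))
    audio = samples - pulse_ox  # the remainder carries the audio band
    return pulse_ox, audio
```

For example, a trace containing a 2 Hz pulse component plus a 400 Hz audio tone is cleanly split into its two constituents by this mask.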
According to another aspect, a method includes emitting, via an optical source of a wireless earpiece, light toward an ear surface, receiving, via an optical sensor of the wireless earpiece, a reflected light signal from the ear surface, separating, via at least one processor of the wireless earpiece, audio signals from pulse oximetry signals within the reflected light signal, and determining, via the at least one processor of the wireless earpiece, an oxygen saturation level from the reflected light.
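One common way to turn the separated pulse oximetry signals into a saturation estimate is the ratio-of-ratios method, sketched below. The linear calibration `110 - 25*R` is a textbook approximation, and the use of two wavelengths (red and infrared) is an assumption here; real devices use empirically calibrated curves, and the specification does not commit to this formula.

```python
import numpy as np

def estimate_spo2(red, ir):
    """Ratio-of-ratios SpO2 estimate from red and infrared photodetector
    traces: R = (AC_red/DC_red) / (AC_ir/DC_ir), mapped through a
    placeholder linear calibration. Not the claimed method."""
    r = (np.std(red) / np.mean(red)) / (np.std(ir) / np.mean(ir))
    return 110.0 - 25.0 * r
```

With synthetic traces whose red pulsatile amplitude is half the infrared one, R is 0.5 and the sketch returns 97.5 percent.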
An optical source 16 is operatively connected to the earpiece housing 14 and configured to emit light toward an ear surface. Other types of electromagnetic radiation capable of reflecting off of human tissue may be substituted for light. The light may be emitted continuously or intermittently in bursts, wherein the bursts may last for any reasonable length of time. If the optical source 16 is configured to measure bone vibrations, the light or electromagnetic source must be substantially coherent in order to accurately measure the bone vibrations. In addition, the distance between the optical source 16 and the ear surface may be fixed or known in order to obtain accurate bone vibration readings, and ideally the optical source 16 and the ear surface touch one another.
An optical sensor 18 is operatively connected to the earpiece housing 14 and configured to receive reflected light from the ear surface, which may include pulse oximetry data as well as audio signals. The audio signals may emanate from any source, including the user, a third party, an electronic apparatus, or an entity from nature itself, and may concern music, sounds, instructions, information, or a combination of the aforementioned. The list is non-exclusive and the audio signals do not have to relate to the user or the earpiece 12. The reflected light from the ear surface may be received continuously or discretely by the optical sensor 18 depending on the configuration of the optical source 16. The optical sensor 18, depending on the physical configuration of the earpiece 12, may receive refracted light, or light that has been bent due to travel through human tissue, from the optical source 16 in lieu of reflected light. If one or more processors are configured to calculate bone vibration data, then the optical sensor 18 may receive light directly from the optical source 16 to use as a reference beam in order to accurately measure the velocity of the bone vibrations from the frequency shift they cause in the reflected light. The reflected light combines with the reference beam to create a signal that can be transmitted by the optical sensor 18. The optical sensor 18 may be relatively close to the optical source 16, and the distance between the optical source 16 and the ear surface ideally should be a fixed quantity, or the optical source 16 should touch the ear surface. Also, the light or electromagnetic wave should be substantially coherent so that the bone vibration measurements may be accurately determined by one of the processors 20 present within the earpiece 12.
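The velocity recovery described above follows standard laser Doppler vibrometry: mixing the reflected beam with the reference beam produces a beat at the Doppler shift Δf, and the surface velocity is v = Δf·λ/2. A minimal sketch of that relation, assuming a hypothetical 850 nm source (the specification does not state a wavelength):

```python
def doppler_velocity(freq_shift_hz, wavelength_m=850e-9):
    """Surface (bone) velocity implied by a measured Doppler beat
    frequency, v = delta_f * lambda / 2. The 850 nm default is an
    illustrative assumption, not a value from the specification."""
    return freq_shift_hz * wavelength_m / 2.0

# e.g. a 1 MHz beat from an 850 nm source implies about 0.425 m/s
```

In practice the beat frequency itself would be extracted from the mixed photodetector signal by demodulation before applying this conversion.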
The LEDs 28 affixed to the earpiece housing 14 may be configured to emit light in order to convey information to a user concerning the earpiece 12. The LEDs 28 may be located in any area on the earpiece 12 suitable for viewing by the user or a third party and may consist of as few as one diode which may be provided in combination with a light guide. In addition, the LEDs 28 may be discernable by a human eye or an electronic device and need not have a minimum luminescence.
One or more microphones 32 may be operatively connected to the earpiece housing 14 and may be configured to receive, in addition to the optical sensor 18, sounds from one or more sources, including the user, a third party, a machine, an animal, another earpiece, another electronic device, or even nature itself. The sounds received by the one or more microphones 32 may include a word, a combination of words, a sound, combinations of sounds, or any combination of the aforementioned. The sounds may be of any frequency, need not be audible to the user, and may be used to reconfigure one or more components of the earpiece 12. One or more of the microphones 32 may be a bone microphone. It is to be understood, however, that the earpiece need not include a bone microphone. In addition, the earpiece need not include any microphone at all other than the optical sensor, because the optical sensor may be used to sense data including audio signals.
One or more output devices 34 operatively connected to the earpiece housing 14 may be configured to transmit sounds received from one or more microphones 32, the transceiver 56, the radio transceiver 58, or even a data storage device 60. One or more output devices 34 may transmit information related to the operations of the earpiece 12, or information queried by the user or a third party, to outside sources. For example, an output device 34 may transmit a signal related to oxygen saturation levels to an external electronic device. The oxygen saturation levels may be used by a medical professional for diagnostic purposes, a user for technical or personal purposes, or a third party for scientific, technical, or other purposes. An output device 34 may also transmit any audio signals received by the optical sensor 18, which may include music, sounds, instructions, information, commentary, auditory media, or a combination of the aforementioned.
One or more sensors 36 may be operatively connected to the earpiece housing 14 and may be configured to obtain additional oxygen saturation level data that neither the optical source 16 nor the optical sensor 18 is configured to provide. For example, the microphones 32 may be used to detect vibrations via pressure disturbances in the user's ear canal, or one or more inertial sensors 42 and 44 may be used to determine motion data related to the user's head and neck regions to be used to modify one or more readings of the optical sensor 18, or even to ascertain one or more variables of the oxygen saturation level determination. Each sensor 36 may be located anywhere on or within the earpiece housing 14 and need not be in direct contact with either the user or the external environment.
The gesture control interface 46 affixed to the earpiece housing 14 is configured to allow a user additional control over the earpiece 12. The gesture control interface 46 may include at least one emitter 52 and at least one detector 54 to detect gestures from either the user, a third party, an instrument, or a combination of the aforementioned and transmit one or more signals related to one or more gestures to one or more processors 20. The gesture control interface 46 may be implemented using optical emitters and detectors, may be implemented with capacitance sensing, or otherwise. The gestures that may be used with the gesture control interface 46 to control the earpiece 12 include, without limitation, touching, tapping, swiping, use of an instrument, or any combination of the aforementioned gestures. Touching gestures used to control the earpiece 12 may be of any duration and may include the touching of areas that are not part of the gesture control interface 46. Tapping gestures used to control the earpiece 12 may include any number of taps and need not be brief. Swiping gestures used to control the earpiece 12 may include a single swipe, a swipe that changes direction at least once, a swipe with a time delay, a plurality of swipes, or any combination of the aforementioned.
One or more processors 20 are operatively connected to each component within the earpiece 12 and may be configured, in addition to transmitting and receiving signals from either the optical source 16 or the optical sensor 18, to receive signals from one or more microphones 32, one or more sensors 36, the transceiver 56, or the radio transceiver 58. One or more processors may also be configured to use information received from one or more microphones 32, one or more sensors 36, the transceiver 56, or the radio transceiver 58, in addition to information from the optical sensor 18, to assist in the determination of any oxygen saturation level data that may be relevant. One or more processors 20 may be reconfigured by the user or a third party through the use of one or more microphones 32, the gesture control interface 46, or by an electronic signal received from the transceiver 56 or the radio transceiver 58. Reconfigurations may include whether to determine bone vibrations, whether to transmit the oxygen saturation data to an external device, or setting the frequency of optical sensor measurements. The aforementioned list is non-exclusive.
The transceiver 56 disposed within the earpiece 12 may be configured to receive signals from and to transmit signals to a second earpiece of the user if the user is using more than one earpiece. The transceiver 56 may receive or transmit more than one signal simultaneously. The transceiver 56 may be of any number of types including a near field magnetic induction (NFMI) transceiver.
The radio transceiver 58 disposed within the earpiece 12 may be configured to receive signals from external electronic devices and to transmit those signals to one or more processors 20. The external electronic devices the radio transceiver 58 may be configured to receive signals from include Bluetooth devices, mobile devices, desktops, laptops, tablets, modems, routers, communications towers, cameras, watches, third-party earpieces, earpieces, or other electronic devices capable of transmitting or receiving wireless signals. The radio transceiver 58 may receive or transmit more than one signal simultaneously.
The battery 30 may be operatively connected to components within an earpiece 12 to provide power. The battery 30 should provide enough power to operate an earpiece 12 for a reasonable duration of time. The battery 30 may be of any type suitable for powering an earpiece 12. However, the battery 30 need not be present in an earpiece 12. Alternative battery-less power sources, such as thermal harvesters that produce energy from differences between the user's or a third party's skin or internal body temperature and the ambient air, solar apparatuses which generate energy from the photovoltaic effect, or sensors configured to receive energy from radio waves (all of which are operatively connected to one or more earpieces 12) may be used to power the earpiece 12 in lieu of a battery 30.
Therefore, various methods, systems, and apparatus have been shown and described. Although specific embodiments are set forth herein, the present invention contemplates numerous options, variations, and alternatives, including variations in the type of optical source and optical sensor used.
This application claims priority to U.S. Provisional Patent Application 62/359,027, filed on Jul. 6, 2016, and entitled Combined Optical Sensor for Audio and Pulse Oximetry System and Method, hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
2325590 | Carlisle et al. | Aug 1943 | A |
2430229 | Kelsey | Nov 1947 | A |
3047089 | Zwislocki | Jul 1962 | A |
D208784 | Sanzone | Oct 1967 | S |
3586794 | Michaelis | Jun 1971 | A |
3934100 | Harada | Jan 1976 | A |
3983336 | Malek et al. | Sep 1976 | A |
4069400 | Johanson et al. | Jan 1978 | A |
4150262 | Ono | Apr 1979 | A |
4334315 | Ono et al. | Jun 1982 | A |
D266271 | Johanson et al. | Sep 1982 | S |
4375016 | Harada | Feb 1983 | A |
4588867 | Konomi | May 1986 | A |
4617429 | Bellafiore | Oct 1986 | A |
4654883 | Iwata | Mar 1987 | A |
4682180 | Gans | Jul 1987 | A |
4791673 | Schreiber | Dec 1988 | A |
4852177 | Ambrose | Jul 1989 | A |
4865044 | Wallace et al. | Sep 1989 | A |
4984277 | Bisgaard et al. | Jan 1991 | A |
5008943 | Arndt et al. | Apr 1991 | A |
5185802 | Stanton | Feb 1993 | A |
5191602 | Regen et al. | Mar 1993 | A |
5201007 | Ward et al. | Apr 1993 | A |
5201008 | Arndt et al. | Apr 1993 | A |
D340286 | Seo | Oct 1993 | S |
5280524 | Norris | Jan 1994 | A |
5295193 | Ono | Mar 1994 | A |
5298692 | Ikeda et al. | Mar 1994 | A |
5343532 | Shugart | Aug 1994 | A |
5347584 | Narisawa | Sep 1994 | A |
5363444 | Norris | Nov 1994 | A |
D367113 | Weeks | Feb 1996 | S |
5497339 | Bernard | Mar 1996 | A |
5606621 | Reiter et al. | Feb 1997 | A |
5613222 | Guenther | Mar 1997 | A |
5654530 | Sauer et al. | Aug 1997 | A |
5692059 | Kruger | Nov 1997 | A |
5721783 | Anderson | Feb 1998 | A |
5748743 | Weeks | May 1998 | A |
5749072 | Mazurkiewicz et al. | May 1998 | A |
5771438 | Palermo et al. | Jun 1998 | A |
D397796 | Yabe et al. | Sep 1998 | S |
5802167 | Hong | Sep 1998 | A |
D410008 | Almqvist | May 1999 | S |
5929774 | Charlton | Jul 1999 | A |
5933506 | Aoki et al. | Aug 1999 | A |
5949896 | Nageno et al. | Sep 1999 | A |
5987146 | Pluvinage et al. | Nov 1999 | A |
6021207 | Puthuff et al. | Feb 2000 | A |
6054989 | Robertson et al. | Apr 2000 | A |
6081724 | Wilson | Jun 2000 | A |
6094492 | Boesen | Jul 2000 | A |
6111569 | Brusky et al. | Aug 2000 | A |
6112103 | Puthuff | Aug 2000 | A |
6157727 | Rueda | Dec 2000 | A |
6167039 | Karlsson et al. | Dec 2000 | A |
6181801 | Puthuff et al. | Jan 2001 | B1 |
6208372 | Barraclough | Mar 2001 | B1 |
6230029 | Yegiazaryan et al. | May 2001 | B1 |
6275789 | Moser et al. | Aug 2001 | B1 |
6339754 | Flanagan et al. | Jan 2002 | B1 |
D455835 | Anderson et al. | Apr 2002 | S |
6408081 | Boesen | Jun 2002 | B1 |
6424820 | Burdick et al. | Jul 2002 | B1 |
D464039 | Boesen | Oct 2002 | S |
6470893 | Boesen | Oct 2002 | B1 |
D468299 | Boesen | Jan 2003 | S |
D468300 | Boesen | Jan 2003 | S |
6542721 | Boesen | Apr 2003 | B2 |
6560468 | Boesen | May 2003 | B1 |
6654721 | Handelman | Nov 2003 | B2 |
6664713 | Boesen | Dec 2003 | B2 |
6690807 | Meyer | Feb 2004 | B1 |
6694180 | Boesen | Feb 2004 | B1 |
6718043 | Boesen | Apr 2004 | B1 |
6738485 | Boesen | May 2004 | B1 |
6748095 | Goss | Jun 2004 | B1 |
6754358 | Boesen et al. | Jun 2004 | B1 |
6784873 | Boesen et al. | Aug 2004 | B1 |
6823195 | Boesen | Nov 2004 | B1 |
6852084 | Boesen | Feb 2005 | B1 |
6879698 | Boesen | Apr 2005 | B2 |
6892082 | Boesen | May 2005 | B2 |
6920229 | Boesen | Jul 2005 | B2 |
6952483 | Boesen et al. | Oct 2005 | B2 |
6987986 | Boesen | Jan 2006 | B2 |
7010137 | Leedom et al. | Mar 2006 | B1 |
7113611 | Leedom et al. | Sep 2006 | B2 |
D532520 | Kampmeier et al. | Nov 2006 | S |
7136282 | Rebeske | Nov 2006 | B1 |
7203331 | Boesen | Apr 2007 | B2 |
7209569 | Boesen | Apr 2007 | B2 |
7215790 | Boesen et al. | May 2007 | B2 |
D549222 | Huang | Aug 2007 | S |
D554756 | Sjursen et al. | Nov 2007 | S |
7403629 | Aceti et al. | Jul 2008 | B1 |
D579006 | Kim et al. | Oct 2008 | S |
7463902 | Boesen | Dec 2008 | B2 |
7508411 | Boesen | Mar 2009 | B2 |
D601134 | Elabidi et al. | Sep 2009 | S |
7825626 | Kozisek | Nov 2010 | B2 |
7965855 | Ham | Jun 2011 | B1 |
7979035 | Griffin et al. | Jul 2011 | B2 |
7983628 | Boesen | Jul 2011 | B2 |
D647491 | Chen et al. | Oct 2011 | S |
8095188 | Shi | Jan 2012 | B2 |
8108143 | Tester | Jan 2012 | B1 |
8140357 | Boesen | Mar 2012 | B1 |
D666581 | Perez | Sep 2012 | S |
8300864 | Müllenborn et al. | Oct 2012 | B2 |
8406448 | Lin et al. | Mar 2013 | B2 |
8436780 | Schantz et al. | May 2013 | B2 |
D687021 | Yuen | Jul 2013 | S |
8719877 | VonDoenhoff et al. | May 2014 | B2 |
8774434 | Zhao et al. | Jul 2014 | B2 |
8831266 | Huang | Sep 2014 | B1 |
8891800 | Shaffer | Nov 2014 | B1 |
8994498 | Agrafioti et al. | Mar 2015 | B2 |
D728107 | Martin et al. | Apr 2015 | S |
9013145 | Castillo et al. | Apr 2015 | B2 |
9037125 | Kadous | May 2015 | B1 |
D733103 | Jeong et al. | Jun 2015 | S |
9081944 | Camacho et al. | Jul 2015 | B2 |
9510159 | Cuddihy et al. | Nov 2016 | B1 |
D773439 | Walker | Dec 2016 | S |
D775158 | Dong et al. | Dec 2016 | S |
D777710 | Palmborg et al. | Jan 2017 | S |
D788079 | Son et al. | May 2017 | S |
20010005197 | Mishra et al. | Jun 2001 | A1 |
20010027121 | Boesen | Oct 2001 | A1 |
20010043707 | Leedom | Nov 2001 | A1 |
20010056350 | Calderone et al. | Dec 2001 | A1 |
20020002413 | Tokue | Jan 2002 | A1 |
20020007510 | Mann | Jan 2002 | A1 |
20020010590 | Lee | Jan 2002 | A1 |
20020030637 | Mann | Mar 2002 | A1 |
20020046035 | Kitahara et al. | Apr 2002 | A1 |
20020057810 | Boesen | May 2002 | A1 |
20020076073 | Taenzer et al. | Jun 2002 | A1 |
20020118852 | Boesen | Aug 2002 | A1 |
20030002705 | Boesen | Jan 2003 | A1 |
20030065504 | Kraemer et al. | Apr 2003 | A1 |
20030100331 | Dress et al. | May 2003 | A1 |
20030104806 | Ruef et al. | Jun 2003 | A1 |
20030115068 | Boesen | Jun 2003 | A1 |
20030125096 | Boesen | Jul 2003 | A1 |
20030218064 | Conner et al. | Nov 2003 | A1 |
20040070564 | Dawson et al. | Apr 2004 | A1 |
20040160511 | Boesen | Aug 2004 | A1 |
20050017842 | Dematteo | Jan 2005 | A1 |
20050043056 | Boesen | Feb 2005 | A1 |
20050125320 | Boesen | Jun 2005 | A1 |
20050148883 | Boesen | Jul 2005 | A1 |
20050165663 | Razumov | Jul 2005 | A1 |
20050196009 | Boesen | Sep 2005 | A1 |
20050251455 | Boesen | Nov 2005 | A1 |
20050266876 | Boesen | Dec 2005 | A1 |
20060029246 | Boesen | Feb 2006 | A1 |
20060074671 | Farmaner et al. | Apr 2006 | A1 |
20060074808 | Boesen | Apr 2006 | A1 |
20060166715 | Engelen et al. | Jul 2006 | A1 |
20060166716 | Seshadri et al. | Jul 2006 | A1 |
20060220915 | Bauer | Oct 2006 | A1 |
20060258412 | Liu | Nov 2006 | A1 |
20080076972 | Dorogusker et al. | Mar 2008 | A1 |
20080090622 | Kim et al. | Apr 2008 | A1 |
20080146890 | LeBoeuf et al. | Jun 2008 | A1 |
20080254780 | Kuhl et al. | Oct 2008 | A1 |
20080255430 | Alexandersson et al. | Oct 2008 | A1 |
20090003620 | McKillop et al. | Jan 2009 | A1 |
20090017881 | Madrigal | Jan 2009 | A1 |
20090073070 | Rofougaran | Mar 2009 | A1 |
20090097689 | Prest et al. | Apr 2009 | A1 |
20090105548 | Bart | Apr 2009 | A1 |
20090191920 | Regen et al. | Jul 2009 | A1 |
20090245559 | Boltyenkov et al. | Oct 2009 | A1 |
20090296968 | Wu et al. | Dec 2009 | A1 |
20100033313 | Keady et al. | Feb 2010 | A1 |
20100203831 | Muth | Aug 2010 | A1 |
20100210212 | Sato | Aug 2010 | A1 |
20100320961 | Castillo et al. | Dec 2010 | A1 |
20110286615 | Dlodort et al. | Nov 2011 | A1 |
20120057740 | Rosal | Mar 2012 | A1 |
20130316642 | Newham | Nov 2013 | A1 |
20130346168 | Zhou et al. | Dec 2013 | A1 |
20140058220 | LeBoeuf | Feb 2014 | A1 |
20140072146 | Itkin et al. | Mar 2014 | A1 |
20140079257 | Ruwe et al. | Mar 2014 | A1 |
20140106677 | Altman | Apr 2014 | A1 |
20140122116 | Smythe | May 2014 | A1 |
20140163771 | Demeniuk | Jun 2014 | A1 |
20140185828 | Helbling | Jul 2014 | A1 |
20140222462 | Shakil et al. | Aug 2014 | A1 |
20140235169 | Parkinson et al. | Aug 2014 | A1 |
20140270227 | Swanson | Sep 2014 | A1 |
20140270271 | Dehe et al. | Sep 2014 | A1 |
20140348367 | Vavrus et al. | Nov 2014 | A1 |
20150028996 | Agrafioti et al. | Jan 2015 | A1 |
20150110587 | Hori | Apr 2015 | A1 |
20150148989 | Cooper et al. | May 2015 | A1 |
20150245127 | Shaffer | Aug 2015 | A1 |
20160033280 | Moore et al. | Feb 2016 | A1 |
20160072558 | Hirsch et al. | Mar 2016 | A1 |
20160073189 | Lindén et al. | Mar 2016 | A1 |
20160125892 | Bowen et al. | May 2016 | A1 |
20160360350 | Watson et al. | Dec 2016 | A1 |
20170078780 | Qian et al. | Mar 2017 | A1 |
20170111726 | Martin et al. | Apr 2017 | A1 |
20170155992 | Perianu et al. | Jun 2017 | A1 |
Number | Date | Country |
---|---|---|
204244472 | Apr 2015 | CN |
104683519 | Jun 2015 | CN |
104837094 | Aug 2015 | CN |
1469659 | Oct 2004 | EP |
1017252 | May 2006 | EP |
2903186 | Aug 2015 | EP |
2074817 | Apr 1981 | GB |
2508226 | May 2014 | GB |
06292195 | Oct 1998 | JP |
2008103925 | Aug 2008 | WO |
2007034371 | Nov 2008 | WO |
2011001433 | Jan 2011 | WO |
2012071127 | May 2012 | WO |
2013134956 | Sep 2013 | WO |
2014043179 | Mar 2014 | WO |
2014046602 | Mar 2014 | WO |
2015061633 | Apr 2015 | WO |
2015110577 | Jul 2015 | WO |
2015110587 | Jul 2015 | WO |
2016032990 | Mar 2016 | WO |
Entry |
---|
Akkermans, “Acoustic Ear Recognition for Person Identification”, Automatic Identification Advanced Technologies, 2005 pp. 219-223. |
Announcing the $3,333,333 Stretch Goal (Feb. 24, 2014). |
Ben Coxworth: “Graphene-based ink could enable low-cost, foldable electronics”, “Journal of Physical Chemistry Letters”, Northwestern University, (May 22, 2013). |
Blain: “World's first graphene speaker already superior to Sennheiser MX400”, http://www.gizmag.com/graphene-speaker-beats-sennheiser-mx400/31660, (Apr. 15, 2014). |
BMW, “BMW introduces BMW Connected—The personalized digital assistant”, “http://bmwblog.com/2016/01/05/bmw-introduces-bmw-connected-the-personalized-digital-assistant”, (Jan. 5, 2016). |
BRAGI is on Facebook (2014). |
BRAGI Update—Arrival of Prototype Chassis Parts —More People—Awesomeness (May 13, 2014). |
BRAGI Update—Chinese New Year, Design Verification, Charging Case, More People, Timeline(Mar. 6, 2015). |
BRAGI Update—First Sleeves From Prototype Tool—Software Development Kit (Jun. 5, 2014). |
BRAGI Update—Let's Get Ready to Rumble, A Lot to be Done Over Christmas (Dec. 22, 2014). |
BRAGI Update—Memories From April—Update on Progress (Sep. 16, 2014). |
BRAGI Update—Memories from May—Update on Progress—Sweet (Oct. 13, 2014). |
BRAGI Update—Memories From One Month Before Kickstarter—Update on Progress (Jul. 10, 2014). |
BRAGI Update—Memories From the First Month of Kickstarter—Update on Progress (Aug. 1, 2014). |
BRAGI Update—Memories From the Second Month of Kickstarter—Update on Progress (Aug. 22, 2014). |
BRAGI Update—New People @BRAGI—Prototypes (Jun. 26, 2014). |
BRAGI Update—Office Tour, Tour to China, Tour to CES (Dec. 11, 2014). |
BRAGI Update—Status on Wireless, Bits and Pieces, Testing-Oh Yeah, Timeline(Apr. 24, 2015). |
BRAGI Update—The App Preview, The Charger, The SDK, BRAGI Funding and Chinese New Year (Feb. 11, 2015). |
BRAGI Update—What We Did Over Christmas, Las Vegas & CES (Jan. 19, 2014). |
BRAGI Update—Years of Development, Moments of Utter Joy and Finishing What We Started(Jun. 5, 2015). |
BRAGI Update—Alpha 5 and Back to China, Backer Day, on Track(May 16, 2015). |
BRAGI Update—Beta2 Production and Factory Line(Aug. 20, 2015). |
BRAGI Update—Developer Units Shipping and Status(Oct. 5, 2015). |
BRAGI Update—Developer Units Started Shipping and Status (Oct. 19, 2015). |
BRAGI Update—Developer Units, Investment, Story and Status(Nov. 2, 2015). |
BRAGI Update—Getting Close(Aug. 6, 2014). |
BRAGI Update—On Track, Design Verification, How It Works and What's Next(Jul. 15, 2015). |
BRAGI Update—Status on Wireless, Supply, Timeline and Open House@BRAGI(Apr. 1, 2015). |
BRAGI Update—Unpacking Video, Reviews on Audio Perform and Boy Are We Getting Close(Sep. 10, 2015). |
Healthcare Risk Management Review, “Nuance updates computer-assisted physician documentation solution” (Oct. 20, 2016). |
Hoyt et. al., “Lessons Learned from Implementation of Voice Recognition for Documentation in the Military Electronic Health Record System”, The American Health Information Management Association (2017). |
Hyundai Motor America, “Hyundai Motor Company Introduces A Health +0 Mobility Concept for Wellness in Mobility”, Fountain Valley, California (2017). |
Last Push Before the Kickstarter Campaign Ends on Monday 4pm CET (Mar. 28, 2014). |
Nigel Whitfield: “Fake tape detectors, ‘from the stands’ footie and UGH? Internet of Things in my set-top box”; http://www.theregister.co.uk/2014/09/24/ibc_round_up_object_audio_dlna_iot/ (Sep. 24, 2014). |
Staab, Wayne J., et al., “A One-Size Disposable Hearing Aid is Introduced”, The Hearing Journal 53(4):36-41) Apr. 2000. |
Stretchgoal—Its Your Dash (Feb. 14, 2014). |
Stretchgoal—The Carrying Case for the Dash (Feb. 12, 2014). |
Stretchgoal—Windows Phone Support (Feb. 17, 2014). |
The Dash + The Charging Case & The BRAGI News (Feb. 21, 2014). |
The Dash—A Word From Our Software, Mechanical and Acoustics Team + An Update (Mar. 11, 2014). |
Update From BRAGI—$3,000,000—Yipee (Mar. 22, 2014). |
Number | Date | Country | |
---|---|---|---|
20180008198 A1 | Jan 2018 | US |
Number | Date | Country | |
---|---|---|---|
62359027 | Jul 2016 | US |