Combined optical sensor for audio and pulse oximetry system and method

Information

  • Patent Grant
  • Patent Number
    10,555,700
  • Date Filed
    Thursday, June 29, 2017
  • Date Issued
    Tuesday, February 11, 2020
  • Inventors
  • Original Assignees
  • Examiners
    • Hindenburg; Max F
  • Agents
    • Goodhue, Coleman & Owens, P.C.
Abstract
A system includes at least one earpiece, wherein each earpiece comprises an earpiece housing, an optical source operatively connected to the earpiece housing, wherein the optical source is configured to emit light toward an ear surface, an optical sensor operatively connected to the earpiece housing, wherein the optical sensor is configured to receive reflected light from the ear surface, and at least one processor disposed within at least one earpiece and operatively connected to the optical source and the optical sensor, wherein the at least one processor is configured to separate the pulse oximetry signals from the audio signals in the reflected light detected by the optical sensor.
Description
FIELD OF THE INVENTION

The present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to earpieces.


BACKGROUND

The earpiece holds great promise as a wearable device. However, one of the challenges in incorporating varied functionality into an earpiece is the relatively small space available. What is needed is technology that improves the functionality of an earpiece while reducing the amount of space required.


For example, two separate sensor assemblies are currently required for the proper determination of pulse oximetry and audio detection in an earpiece device. The user must therefore wear a device large enough to accommodate both sensors, which complicates design and usability and increases the cost of the materials required to provide adequate audio and pulse oximetry signals for processing. In addition, further work is required to shield these sensors from other components of the device, given the limited space available relative to the number of electronic components needed to support each input modality. What is needed is a new combined optical sensor for detecting both audio signals and pulse oximetry.


SUMMARY

Therefore, it is a primary object, feature, or advantage of the present invention to improve over the state of the art.


It is a further object, feature, or advantage of the present invention to combine the reception of audio signals and oxygen saturation data onto a single sensor.


It is a still further object, feature, or advantage of the present invention to separate the audio signals and the oxygen saturation data using signal processing techniques.


It is another object, feature, or advantage to reduce the number of key components.


Another object, feature, or advantage is to enhance the ability to incorporate other sensor arrays due to the enhanced space availability.


Yet another object, feature, or advantage is to reduce power requirements due to the reduction in the number of power intensive components.


A further object, feature, or advantage is to allow signal processing to provide more information about blood flow in a studied area.


A still further object, feature, or advantage is to provide new opportunities to enhance the amount and variety of sensed data.


One or more of these and/or other objects, features, or advantages will become apparent from the specification and claims that follow. No single embodiment need include each and every object, feature, or advantage as different embodiments may achieve different objects, features, and advantages.


A new and novel system combines, in a single optical sensor, both the detection of high quality audio signals and the accurate detection of pulse oximetry from the earpiece. This new system may include a high quality optical source that emits light directed at the skin, cartilage, and bone of the external ear. An optical sensor is utilized to detect the signals reflected from the skin, bone, and cartilage of the external ear. The received signal may include both low frequency pulse oximetry signals and higher frequency audio signals. In this fashion, the combined sensor is able to receive audio signals emanating from the body and transmitted through bone and soft tissue, as well as the pulse rate and oxygen saturation from light reflected from blood. Signal processing is then used to detect and segregate the two signals. Transmission of the processed information is then possible using various schemes, including wired or wireless transmission methods. This has the advantage of detecting multiple signals using a single sensor.


According to one aspect, a system includes at least one earpiece, wherein each earpiece comprises an earpiece housing, an optical source operatively connected to the earpiece housing, wherein the optical source is configured to emit light toward an ear surface, an optical sensor operatively connected to the earpiece housing, wherein the optical sensor is configured to receive reflected light from the ear surface, and at least one processor disposed within at least one earpiece and operatively connected to the optical source and the optical sensor, wherein the at least one processor is configured to separate the pulse oximetry signals from the audio signals in the reflected light detected by the optical sensor. The at least one processor may be further configured to determine an oxygen saturation level from the pulse oximetry signals. The at least one processor may apply a filter to separate the pulse oximetry signals from the audio signals, the audio signals being centered at a frequency higher than the pulse oximetry signals. The at least one earpiece may include a set of earpieces. The optical source may be a laser. The ear surface may be an outer ear surface or an inner ear surface. The at least one processor may be further configured to determine bone vibration measurements from the audio signals. The at least one processor may be further configured to interpret the bone vibration measurements and to determine voice input from a user of the wireless earpiece. The at least one processor may further be reconfigured using the voice input.


According to another aspect, a method includes emitting, via an optical source of a wireless earpiece, light toward an ear surface, receiving, via an optical sensor of the wireless earpiece, a reflected light signal from the ear surface, separating, via at least one processor of the wireless earpiece, audio signals from pulse oximetry signals within the reflected light signal, and determining, via the at least one processor of the wireless earpiece, an oxygen saturation level from the reflected light.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of one embodiment of the system.



FIG. 2 illustrates a side view of a right earpiece and the functional relationship of the right earpiece's components to a user.



FIG. 3 is an enlarged view of a portion of FIG. 2 illustrating the light source and the light receiver.



FIG. 4 is a block diagram regarding another embodiment of the system.



FIG. 5 includes a flowchart of an implementation of a method.



FIG. 6 includes a flowchart of another implementation of the method.





DETAILED DESCRIPTION


FIG. 1 illustrates a block diagram of the system 10 including one or more earpieces 12, with each earpiece comprising an earpiece housing 14, an optical source 16 operatively connected to the earpiece housing 14, an optical sensor 18 operatively connected to the earpiece housing 14, and one or more processors 20 operatively connected to the optical source 16 and the optical sensor 18. The optical source 16 is configured to emit light toward an ear surface, which may be on the outside or the inside of a user's body. Instead of an optical source, other types of electromagnetic wave sources may be used which emit any type of electromagnetic wave capable of reflecting off of ear tissue. For example, a radio wave source may be used instead of a light source, as radio waves partially reflect off of living tissue. The optical sensor 18 operatively connected to the earpiece housing 14 is configured to receive reflected light from the ear surface. The reflected light may contain audio signals as well as pulse oximetry signals. The audio signals may also be used to modify an operational state of the earpiece 12. The reflected light may be received intermittently by the optical sensor 18 or may be received continuously depending on the operation of the optical source 16. One or more processors 20 are configured to separate the audio signals and the pulse oximetry signals from the reflected light. The audio may then be used for any number of purposes, including receiving voice input from a user, and the pulse oximetry signals may be used to determine an oxygen saturation level. The reflected light data may be received continuously or discretely, and the processors 20 do not need to use every bit of data in determining an oxygen saturation level. The separation of the signals may be performed in any number of ways, including through filtering of the reflected signal. The reflected signal may include both low frequency pulse oximetry signals and higher frequency audio signals; thus, high pass filters and/or low pass filters may be used.
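As a rough illustration of the filtering approach just described, the following sketch separates a digitized reflected-light trace into its low frequency pulse oximetry component and its higher frequency audio component. The sample rate, cutoff frequencies, and filter order are illustrative assumptions, not values taken from the patent.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def split_reflected_light(detector_signal, fs=8000):
    """Separate a digitized reflected-light trace into a low frequency
    pulse oximetry component and a higher frequency audio component."""
    # Illustrative cutoffs: pulse oximetry content below ~10 Hz,
    # audio content above ~20 Hz.
    low = butter(4, 10, btype="lowpass", fs=fs, output="sos")
    high = butter(4, 20, btype="highpass", fs=fs, output="sos")
    pulse_ox = sosfiltfilt(low, detector_signal)
    audio = sosfiltfilt(high, detector_signal)
    return pulse_ox, audio

# Example: a 1.2 Hz pulse waveform superimposed on a 440 Hz audio tone.
t = np.arange(0, 2.0, 1 / 8000)
simulated = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.sin(2 * np.pi * 440 * t)
pulse_ox, audio = split_reflected_light(simulated)
```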



FIG. 2 shows a right earpiece 12B inserted into a user's ear having an optical source 16 and an optical sensor 18. The right earpiece 12B may be an ear bud style device configured to fit comfortably within a user's ear canal 48 so as to both minimize the amount of external sound reaching the user's ear canal 48 and to facilitate the transmission of sounds to a user's tympanic membrane 50. Ideally, the ear surface will be the inside of a user's ear canal 48, but the light may be directed at any surface within the user's ear or proximate thereto. Positioning the optical source 16 and the optical sensor 18 inside the user's ear canal 48 has three distinct advantages. One, the inside of the user's ear canal 48 admits little if any external light, allowing easier and more accurate measurements by the optical sensor 18. Two, the inside of the user's ear canal 48 allows ready access to areas suitable for oxygen saturation measurements. Three, the distance between the optical source 16 and the ear surface in the user's ear canal 48 is approximately the same for each prospective user, and ideally both the optical source 16 and the optical sensor 18 touch the ear surface 26 directly, allowing for relatively accurate oxygen saturation calculations.



FIG. 3 shows an optical source 16 and an optical sensor 18 in close proximity to an ear surface 26. The optical source 16 and/or the optical sensor 18 may even touch the ear surface 26 in certain embodiments. The optical source 16 emits light 22 toward an ear surface 26. If the earpiece 12B is configured to read bone vibration data, then the optical source 16 transmits substantially coherent light 22 or another substantially coherent electromagnetic wave to the optical sensor 18 to use as a reference beam to merge with the reflected light. If the earpiece 12B is configured to determine the direction of the bone vibrations, then the reference beam may be passed through an acousto-optic modulator to add a small frequency shift to the reference beam, which can be used to help determine a direction of one or more bone vibrations. The light 22 travels through the ear surface 26 and periodically reflects off of bone and tissue within the ear, creating reflected light 24. The reflected light 24 is subsequently received by the optical sensor 18. If the earpiece 12B is configured to read bone vibration measurements, then the reflected light 24 will merge with the reference beam, and the velocity and displacement information of the bone vibrations can be determined from the frequency shift of the merged wave. The reflected light 24 is then transmitted to one or more processors 20 in order to determine oxygen saturation data and, depending on the configuration of one or more processors 20 within the earpiece 12B, bone vibration measurements.
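For the bone vibration path, the underlying laser-vibrometry relation is that a surface moving at velocity v imposes a Doppler shift f_D = 2v/λ on reflected light of wavelength λ. The sketch below, offered only as an illustration under stated assumptions, converts a demodulated Doppler-shift trace into velocity and displacement; the 850 nm wavelength and the demodulation stage itself are assumptions, since the patent specifies neither.

```python
import numpy as np

def vibration_from_doppler(doppler_shift_hz, fs, wavelength=850e-9):
    """Recover surface velocity and displacement from a demodulated
    Doppler frequency-shift trace, using f_D = 2*v/wavelength."""
    velocity = doppler_shift_hz * wavelength / 2.0   # m/s
    displacement = np.cumsum(velocity) / fs          # m, running integral
    return velocity, displacement
```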



FIG. 4 is a block diagram of a system 10 which comprises at least one earpiece 12 with each earpiece 12 further comprising an earpiece housing 14, an optical source 16 operatively connected to the earpiece housing 14, an optical sensor 18 operatively connected to the earpiece housing 14, at least one LED 28 operatively connected to the earpiece housing 14, one or more microphones 32 operatively connected to the earpiece housing 14, one or more output devices 34 such as speakers operatively connected to the earpiece housing 14, at least one sensor 36 operatively connected to the earpiece housing 14 such as physiological sensors 38 or inertial sensors 42, 44, a gesture control interface 46 with at least one emitter 52 and at least one detector 54 operatively connected to the earpiece housing 14, a transceiver 56 disposed within the earpiece 12, a radio transceiver 58 disposed within the earpiece 12, a battery 30 disposed within the earpiece 12, and one or more processors 20 disposed within the earpiece 12. One or more speakers 39 may also be operatively connected to the processor(s) 20.


An optical source 16 is operatively connected to the earpiece housing 14 and configured to emit light toward an ear surface. Other types of electromagnetic radiation capable of reflecting off of human tissue may be substituted for light. The light may be emitted continuously or intermittently in bursts, wherein the bursts may last for any reasonable length of time. If the optical source 16 is configured to measure bone vibrations, the light or electromagnetic source must be substantially coherent in order to accurately measure the bone vibrations. In addition, the distance between the optical source 16 and the ear surface may be fixed or known in order to obtain accurate bone vibration readings, and ideally the optical source 16 and the ear surface touch one another.


An optical sensor 18 is operatively connected to the earpiece housing 14 and configured to receive reflected light from the ear surface which may include pulse oximetry data as well as audio signals. The audio signals may emanate from any source, including the user, a third party, an electronic apparatus, or an entity from nature itself, and may concern music, sounds, instructions, information, or a combination of the aforementioned. The list is non-exclusive and the audio signals do not have to relate to the user or the earpiece 12. The reflected light from the ear surface may be received continuously or discretely by the optical sensor 18 depending on the configuration of the optical source 16. The optical sensor 18, depending on the physical configuration of the earpiece 12, may receive refracted light, or light that has been bent due to travel through human tissue, from the optical source 16 in lieu of reflected light. If one or more processors are configured to calculate bone vibration data, then the optical sensor 18 may receive light directly from the optical source 16 to use as a reference beam in order to accurately measure the velocity of the bone vibrations due to the frequency shift they cause to the reflected light. The reflected light combines with the reference beam in order to create a signal that can be transmitted by the optical sensor 18. The optical sensor 18 may be relatively close to the optical source 16 and the distance between the optical source 16 and the ear surface ideally should be a fixed quantity or the optical source 16 should touch the ear surface. Also, the light or electromagnetic wave should be substantially coherent so that the bone vibration measurements may be accurately determined by one of the processors 20 present within the earpiece 12.


The LEDs 28 affixed to the earpiece housing 14 may be configured to emit light in order to convey information to a user concerning the earpiece 12. The LEDs 28 may be located in any area on the earpiece 12 suitable for viewing by the user or a third party and may consist of as few as one diode which may be provided in combination with a light guide. In addition, the LEDs 28 may be discernable by a human eye or an electronic device and need not have a minimum luminescence.


One or more microphones 32 may be operatively connected to the earpiece housing 14 and may be configured to receive sounds from one or more sources in addition to the optical sensor 18, including the user, a third party, a machine, an animal, another earpiece, another electronic device, or even nature itself. The sounds received by one or more microphones 32 may include a word, a combination of words, a sound, a combination of sounds, or any combination of the aforementioned. The sounds may be of any frequency, need not be audible to the user, and may be used to reconfigure one or more components of the earpiece 12. One or more of the microphones 32 may be a bone microphone. It is to be understood, however, that the earpiece need not include a bone microphone. In addition, the earpiece need not include any microphone at all other than the optical sensor, because the optical sensor may be used to sense data including audio signals.


One or more output devices 34 operatively connected to the earpiece housing 14 may be configured to transmit sounds received from one or more microphones 32, the transceiver 56, or the radio transceiver 58 or even a data storage device 60. One or more output devices 34 may transmit information related to the operations of the earpiece 12 or information queried by the user or a third party to outside sources. For example, an output device 34 may transmit a signal related to oxygen saturation levels to an external electronic device. The oxygen saturation levels may be used by a medical professional for diagnostic purposes, a user for technical or personal purposes, or a third party for scientific, technical, or other purposes. An output device 34 may also transmit any audio signals received by the optical sensor 18, which may include music, sounds, instructions, information, commentary, auditory media, or a combination of the aforementioned.


One or more sensors 36 may be operatively connected to the earpiece housing 14 and may be configured to obtain additional oxygen saturation level data that neither the optical source 16 nor the optical sensor 18 is configured to provide. For example, the microphones 32 may be used to detect vibrations via pressure disturbances in the user's ear canal, or one or more inertial sensors 42 and 44 may be used to determine motion data related to the user's head and neck regions, which may be used to modify one or more readings of the optical sensor 18 or even to ascertain one or more variables of the oxygen saturation level determination. Each sensor 36 may be located anywhere on or within the earpiece housing 14 and need not be in direct contact with either the user or the external environment.


The gesture control interface 46 affixed to the earpiece housing 14 is configured to allow a user additional control over the earpiece 12. The gesture control interface 46 may include at least one emitter 52 and at least one detector 54 to detect gestures from either the user, a third party, an instrument, or a combination of the aforementioned and transmit one or more signals related to one or more gestures to one or more processors 20. The gesture control interface 46 may be implemented using optical emitters and detectors, may be implemented with capacitance sensing, or otherwise. The gestures that may be used with the gesture control interface 46 to control the earpiece 12 include, without limitation, touching, tapping, swiping, use of an instrument, or any combination of the aforementioned gestures. Touching gestures used to control the earpiece 12 may be of any duration and may include the touching of areas that are not part of the gesture control interface 46. Tapping gestures used to control the earpiece 12 may include any number of taps and need not be brief. Swiping gestures used to control the earpiece 12 may include a single swipe, a swipe that changes direction at least once, a swipe with a time delay, a plurality of swipes, or any combination of the aforementioned.
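As a hedged sketch of how an emitter 52 and detector 54 pair might distinguish some of the gestures listed above, the snippet below classifies a reflected-light trace by how long it stays above a proximity threshold. The threshold and timing values are invented for illustration; the patent does not describe a specific detection algorithm.

```python
import numpy as np

def classify_touch(detector_trace, fs, threshold=0.5):
    """Label a proximity trace from the detector as a tap or a hold
    based on how long reflected light stays above a threshold.
    threshold and the 0.3 s boundary are hypothetical values."""
    above = detector_trace > threshold
    duration_s = np.count_nonzero(above) / fs
    if duration_s == 0.0:
        return None                    # no gesture observed
    return "tap" if duration_s < 0.3 else "hold"
```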


One or more processors 20 are operatively connected to each component within the earpiece 12 and may be configured, in addition to transmitting and receiving signals from either the optical source 16 or the optical sensor 18, to receive signals from one or more microphones 32, one or more sensors 36, the transceiver 56, or the radio transceiver 58. One or more processors may also be configured to use information received from one or more microphones 32, one or more sensors 36, the transceiver 56, or the radio transceiver 58 in addition to information from the optical sensor 18 to assist in the determination of any oxygen saturation level data that may be relevant. One or more processors 20 may be reconfigured by the user or a third party through the use of one or more microphones 32, the gesture control interface 46, or by an electronic signal received from the transceiver 56 or the radio transceiver 58. Reconfigurations may include whether to determine bone vibrations, whether to transmit the oxygen saturation data to an external device, or setting the frequency of optical sensor measurements. The aforementioned list is non-exclusive.


The transceiver 56 disposed within the earpiece 12 may be configured to receive signals from and to transmit signals to a second earpiece of the user if the user is using more than one earpiece. The transceiver 56 may receive or transmit more than one signal simultaneously. The transceiver 56 may be of any number of types including a near field magnetic induction (NFMI) transceiver.


The radio transceiver 58 disposed within the earpiece 12 may be configured to receive signals from external electronic devices and to transmit those signals to one or more processors 20. The external electronic devices the radio transceiver 58 may be configured to receive signals from include Bluetooth devices, mobile devices, desktops, laptops, tablets, modems, routers, communications towers, cameras, watches, third-party earpieces, earpieces, or other electronic devices capable of transmitting or receiving wireless signals. The radio transceiver 58 may receive or transmit more than one signal simultaneously.


The battery 30 may be operatively connected to components within an earpiece 12 to provide power. The battery 30 should provide enough power to operate an earpiece 12 for a reasonable duration of time. The battery 30 may be of any type suitable for powering an earpiece 12. However, the battery 30 need not be present in an earpiece 12. Alternative battery-less power sources, such as thermal harvesters that produce energy from differences between the user's or a third party's skin or internal body temperature and the ambient air, solar apparatuses which generate energy from the photovoltaic effect, or sensors configured to receive energy from radio waves (all of which are operatively connected to one or more earpieces 12) may be used to power the earpiece 12 in lieu of a battery 30.



FIG. 5 illustrates one embodiment of the method 100. In step 104, an earpiece emits light toward an ear surface. Any type of electromagnetic wave may be substituted for light as long as it is capable of reflecting off of human tissue, and the ear surface may be any surface suitable for reflecting electromagnetic radiation and may be on the outside or the inside of the ear. The light may be emitted intermittently in pulses or continuously at the ear surface. In step 106, an optical sensor receives both audio signals and pulse oximetry signals within the reflected light from the ear surface. The audio signals may originate from the user, a third party, an electronic apparatus, an animal, or even nature itself, and the audio signals may include more than one type of audio signal. The audio signals also do not need to be audible to the user, and may be used to modify or reconfigure the earpiece or a component within the earpiece. The reflected light, like the emitted light, may be substituted with any type of electromagnetic wave capable of reflecting off of human tissue and does not need to be of the same intensity as the emitted light. In step 108, one or more processors separate the audio signals and the pulse oximetry signals from the reflected light. In step 110, one or more processors determine an oxygen saturation level from the pulse oximetry signals in the reflected light. A processor may use an algorithm stored within the processor itself or an algorithm stored within a data storage device operatively connected to a processor. The oxygen saturation level may also be provided to the user or a third party.
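Step 110 leaves the oxygen saturation algorithm unspecified. One common approach, shown here purely as an illustration, is a two-wavelength "ratio of ratios" estimate; it assumes red and infrared traces are available (for example, from time-multiplexed emission), and the linear calibration constants below are hypothetical placeholders for values a real device would obtain empirically.

```python
import numpy as np

def spo2_ratio_of_ratios(red, infrared, a=110.0, b=25.0):
    """Estimate oxygen saturation (%) from red and infrared
    photoplethysmography traces over the same window.
    a and b are hypothetical calibration constants."""
    def ac_dc(x):
        # Pulsatile (AC) amplitude and baseline (DC) level of the trace.
        return np.max(x) - np.min(x), np.mean(x)
    ac_r, dc_r = ac_dc(red)
    ac_ir, dc_ir = ac_dc(infrared)
    r = (ac_r / dc_r) / (ac_ir / dc_ir)
    return a - b * r  # classic linear calibration form
```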



FIG. 6 illustrates a second embodiment of the method 200. The first steps are identical to the first steps of FIG. 5, but after separating the audio signals and the pulse oximetry signals from the reflected light, one or more processors, in step 210, check whether they are configured to determine bone vibration measurements from the reflected light. If so, then, in step 212, one or more processors determine bone vibration measurements from the reflected light. One or more processors involved in determining bone vibration measurements may use an algorithm stored within one of the processors or an algorithm stored within one or more data storage devices operatively connected to one of the processors, and the processors involved in the determination do not need to consider all of the data related to the reflected light. Any number of algorithms for extracting voice input from bone vibrations may be used, including any number of speech recognition algorithms such as hidden Markov models, dynamic time warping, neural networks, genetic algorithms, or other methods. The voice input may contain keywords or voice commands of the user of the wireless earpiece. The reflected light data may be received continuously or intermittently in bursts of varying length. In step 214, the bone vibration measurements are used to modify the audio signals so as to maintain the audio signals' original fidelity. The audio signals may not need modification, but de minimis modifications will have no effect on the audio signals' fidelity. If none of the processors are configured to read bone vibration data, then in step 216, one or more processors determine an oxygen saturation level from the reflected light. Like the determination of the bone vibration measurements, a processor involved in the oxygen saturation level determination may use an algorithm stored within one of the processors or an algorithm stored within one or more data storage devices operatively connected to a processor, and the processor does not need to consider all of the reflected light data. Regardless of whether one or more of the processors was configured to determine bone vibration data in step 210, in step 218, the oxygen saturation levels and/or the audio signals are transmitted to the user or an external electronic device. The external electronic device may be a smartphone, tablet, watch, modem, router, communications tower, or a medical device, wherein the medical device is used for medical diagnosis or treatment.
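To make the voice-input branch concrete, the sketch below band-limits the recovered bone-conduction audio to a speech band before handing it to a recognizer. The band edges and sample rate are illustrative assumptions, and the recognizer is a placeholder callable rather than any specific speech-recognition API, since the patent names several algorithm families without committing to one.

```python
from scipy.signal import butter, sosfiltfilt

def voice_band(bone_audio, fs=8000):
    """Band-limit demodulated bone-vibration audio to a speech band
    (300-3400 Hz here, an assumed telephony-style band)."""
    sos = butter(4, [300, 3400], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, bone_audio)

def detect_voice_command(bone_audio, recognizer, fs=8000):
    """recognizer: any callable mapping an audio array to text; a
    placeholder for the speech recognition stage (HMM, DTW, neural
    network, etc.), not a specific library API."""
    return recognizer(voice_band(bone_audio, fs))
```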


Therefore, various methods, systems, and apparatus have been shown and described. Although specific embodiments are set forth herein, the present invention contemplates numerous options, variations, and alternatives, including variations in the type of optical source and optical sensor.

Claims
  • 1. A system comprising: at least one earpiece, wherein each earpiece comprises an earpiece housing; an optical source operatively connected to the earpiece housing, wherein the optical source is configured to emit light toward an ear surface; an optical sensor operatively connected to the earpiece housing, wherein the optical sensor is configured to receive reflected light from the ear surface; and at least one processor disposed within at least one earpiece and operatively connected to the optical source and the optical sensor, wherein the at least one processor is configured to separate the pulse oximetry signals from the audio signals in the reflected light detected by the optical sensor and use the audio signals to receive voice input from a user wearing the at least one earpiece.
  • 2. The system of claim 1 wherein the at least one processor is further configured to determine an oxygen saturation level from the pulse oximetry signals.
  • 3. The system of claim 1 wherein the at least one processor applies a filter to separate the pulse oximetry signals from the audio signals, the audio signals centered at a frequency higher than the pulse oximetry signals.
  • 4. The system of claim 1 wherein the at least one earpiece comprises a set of earpieces.
  • 5. The system of claim 1 wherein the optical source is a laser.
  • 6. The system of claim 1 wherein the ear surface comprises an outer ear surface.
  • 7. The system of claim 1 wherein the ear surface comprises an inner ear surface.
  • 8. A method comprising: emitting, via an optical source of a wireless earpiece, light toward an ear surface; receiving, via an optical sensor of the wireless earpiece, a reflected light signal from the ear surface; separating, via at least one processor of the wireless earpiece, audio signals from pulse oximetry signals within the reflected light signal; determining, via the at least one processor of the wireless earpiece, an oxygen saturation level from the reflected light; calculating bone vibration measurements via the at least one processor of the wireless earpiece using the audio signals; and interpreting the bone vibration measurements to determine voice input of a user of the wireless earpiece.
  • 9. The method of claim 8 wherein the optical source is a laser.
  • 10. The method of claim 8 wherein the ear surface comprises an outer ear surface.
  • 11. The method of claim 8 wherein the ear surface comprises an inner ear surface.
  • 12. The method of claim 8 further comprising reconfiguring the at least one processor using the voice input.
PRIORITY STATEMENT

This application claims priority to U.S. Provisional Patent Application 62/359,027, filed on Jul. 6, 2016, and entitled Combined Optical Sensor for Audio and Pulse Oximetry System and Method, hereby incorporated by reference in its entirety.

Related Publications (1)
Number Date Country
20180008198 A1 Jan 2018 US
Provisional Applications (1)
Number Date Country
62359027 Jul 2016 US