This application relates to hearing assistance systems, and more particularly, to hearing assistance systems with own voice detection.
Hearing assistance devices are electronic devices that amplify sounds above the audibility threshold for a hearing-impaired user. Undesired sounds such as noise, feedback, and the user's own voice may also be amplified, which can reduce sound quality and benefit for the user. First, it is undesirable for the user to hear his or her own voice amplified. Second, if the user wears an ear mold with little or no venting, he or she experiences an occlusion effect in which his or her own voice sounds hollow (“talking in a barrel”). Third, if the hearing aid has a noise reduction/environment classification algorithm, the user's own voice can be wrongly detected as desired speech.
One proposal to detect voice adds a bone-conduction microphone to the device. A bone-conduction microphone can only be used to detect the user's own voice, must make good contact with the skull to pick up that voice, and has a low signal-to-noise ratio. Another proposal adds a directional microphone to the hearing aid and orients it toward the mouth of the user to detect the user's voice. However, the effectiveness of the directional microphone depends on its directivity and on the presence of other sound sources, particularly sound sources in the same direction as the mouth. Another proposal places a microphone in the ear canal and uses it only to record an occluded signal. Yet another proposal attempts to use a filter to distinguish the user's voice from other sound. However, the filter is unable to self-correct to accommodate changes in the user's voice and in the user's environment.
The present subject matter provides apparatus and methods to use a hearing assistance device to detect a voice of the wearer of the hearing assistance device. Embodiments use an adaptive filter to provide a self-correcting voice detector, capable of automatically adjusting to accommodate changes in the wearer's voice and environment.
Examples are provided, such as an apparatus configured to be worn by a wearer having an ear and an ear canal. The apparatus includes a first microphone adapted to be worn about the ear of the wearer, a second microphone adapted to be worn about the ear canal of the wearer at a different location than the first microphone, a sound processor adapted to process signals from the first microphone to produce a processed sound signal, and a voice detector to detect the voice of the wearer. The voice detector includes an adaptive filter that receives signals from the first microphone and the second microphone.
Another example of an apparatus includes a housing configured to be worn behind the ear or over the ear, a first microphone in the housing, and an ear piece configured to be positioned in the ear canal, wherein the ear piece includes a second microphone that receives sound from outside the ear canal when the ear piece is positioned in or near the ear canal. Various voice detection systems employ an adaptive filter that receives signals from the first microphone and the second microphone and detects the voice of the wearer using a peak value for coefficients of the adaptive filter and an error signal from the adaptive filter.
The present subject matter also provides methods for detecting a voice of a wearer of a hearing assistance device where the hearing assistance device includes a first microphone and a second microphone. An example of the method is provided and includes using a first electrical signal representative of sound detected by the first microphone and a second electrical signal representative of sound detected by the second microphone as inputs to a system including an adaptive filter, and using the adaptive filter to detect the voice of the wearer of the hearing assistance device.
This Summary is an overview of some of the teachings of the present application and is not intended to be an exclusive or exhaustive treatment of the present subject matter. Further details about the present subject matter are found in the detailed description. The scope of the present invention is defined by the appended claims and their equivalents.
The following detailed description refers to subject matter in the accompanying drawings which show, by way of illustration, specific aspects and embodiments in which the present subject matter may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the present subject matter. References to “an”, “one”, or “various” embodiments in this disclosure are not necessarily to the same embodiment, and such references contemplate more than one embodiment. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope is defined only by the appended claims, along with the full scope of legal equivalents to which such claims are entitled.
Various embodiments disclosed herein provide a self-correcting voice detector, capable of reliably detecting the presence of the user's own voice through automatic adjustments that accommodate changes in the user's voice and environment. The detected voice can be used, among other things, to reduce the amplification of the user's voice, control an anti-occlusion process and control an environment classification process.
The present subject matter provides, among other things, an “own voice” detector using two microphones in a standard hearing assistance device. Examples of standard hearing aids include behind-the-ear (BTE), over-the-ear (OTE), and receiver-in-canal (RIC) devices. It is understood that RIC devices have a housing adapted to be worn behind the ear or over the ear; the RIC electronics housing is sometimes called a BTE housing or an OTE housing. According to various embodiments, one microphone is the microphone usually present in the standard hearing assistance device, and the other microphone is mounted in an ear bud or ear mold near the user's ear canal. This second microphone is directed to detecting acoustic signals outside, not inside, the ear canal. The two microphones can be used to create a directional signal.
Other embodiments may be used in which the first microphone (M1) is adapted to be worn about the ear of the person and the second microphone (M2) is adapted to be worn about the ear canal of the person. The first and second microphones are at different locations, which provides a time difference for sound from the user's voice to reach the two microphones.
A digital sound processing system 308 processes the acoustic signals received by the first and second microphones and provides a signal to the receiver 306 to produce an audible signal for the wearer of the device 305. The illustrated digital sound processing system 308 includes an interface 307, a sound processor 308, and a voice detector 309. The illustrated interface 307 converts the analog signals from the first and second microphones into digital signals for processing by the sound processor 308 and the voice detector 309. For example, the interface may include analog-to-digital converters and appropriate registers to hold the digital signals for processing by the sound processor and the voice detector. The illustrated sound processor 308 processes a signal representative of sound received by one or both of the first and second microphones into a processed output signal 310, which is provided to the receiver 306 to produce the audible signal. According to various embodiments, the sound processor 308 is capable of operating in a directional mode in which signals representative of sound received by the first microphone and sound received by the second microphone are processed to provide the output signal 310 to the receiver 306 with directionality.
The voice detector 309 receives signals representative of sound received by the first microphone and sound received by the second microphone. The voice detector 309 detects the user's own voice and provides an indication 311 to the sound processor 308 regarding whether the user's own voice is detected. Once the user's own voice is detected, any number of other actions can take place. For example, in various embodiments, when the user's voice is detected the sound processor 308 can perform one or more actions, including but not limited to reducing the amplification of the user's voice, controlling an anti-occlusion process, and controlling an environment classification process. Those skilled in the art will understand that other processes may take place without departing from the scope of the present subject matter.
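As one way to visualize this arrangement, the following sketch models one block of the signal chain in which the voice detector's indication gates the sound processor's behavior. This is a minimal illustration only: the class and function names, the block-based structure, and the specific gain values are assumptions and are not taken from the present disclosure; the gain reduction simply stands in for one of the example actions listed above.

```python
import numpy as np

class SoundProcessor:
    """Minimal stand-in for the sound processor: amplifies the
    first-microphone signal and, when the own-voice indication is set,
    reduces that amplification (one of the example actions above)."""

    def __init__(self, gain=4.0, own_voice_gain=1.0):
        self.gain = gain                      # assumed nominal amplification
        self.own_voice_gain = own_voice_gain  # assumed reduced amplification

    def process(self, mic1_block, own_voice_detected):
        g = self.own_voice_gain if own_voice_detected else self.gain
        return g * np.asarray(mic1_block)     # output provided to the receiver

def process_block(mic1_block, mic2_block, voice_detector, sound_processor):
    """One block of the chain: the (already digitized) microphone signals
    feed the voice detector, and the detector's indication controls the
    processing applied to the output signal."""
    indication = voice_detector(mic1_block, mic2_block)  # True if own voice detected
    return sound_processor.process(mic1_block, indication)
```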
In various embodiments, the voice detector 309 includes an adaptive filter. Examples of processes implemented by adaptive filters include recursive least squares (RLS), least mean squares (LMS), and normalized least mean squares (NLMS) adaptive filter processes. The desired signal for the adaptive filter is taken from the first microphone (e.g., a standard behind-the-ear or over-the-ear microphone), and the input signal to the adaptive filter is taken from the second microphone. If the hearing aid wearer is talking, the adaptive filter models the relative transfer function between the microphones. Voice detection can be performed by comparing the power of the error signal to the power of the signal from the standard microphone and/or examining the peak strength in the impulse response of the filter. The amplitude of the impulse response should fall within a certain range to be valid for the wearer's own voice. If the user's own voice is present, the power of the error signal will be much less than the power of the signal from the standard microphone, and the impulse response will have a strong peak with an amplitude above a threshold (e.g., above about 0.5 for normalized coefficients). In the presence of the user's own voice, the largest normalized coefficient of the filter is expected to be within the range of about 0.5 to about 0.9. Sound from other noise sources would result in a much smaller difference between the power of the error signal and the power of the signal from the standard microphone, and a small impulse response of the filter with no distinctive peak.
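As a concrete illustration of this arrangement, the sketch below runs an NLMS adaptive filter over one block of samples, with the desired signal taken from the first (standard) microphone and the filter input taken from the second microphone, as described above. The tap count, step size, and regularization constant are assumed example values, not values specified in the present disclosure.

```python
import numpy as np

def nlms_block(d, x, n_taps=32, mu=0.5, eps=1e-6, w=None):
    """Run an NLMS adaptive filter over one block of samples.

    d -- samples from the first (behind-the-ear/over-the-ear) microphone,
         used as the desired signal.
    x -- samples from the second microphone near the ear canal, used as the
         filter input.
    Returns the updated coefficients w, which model the relative transfer
    function between the microphones when the wearer talks, and the error
    signal e.
    """
    if w is None:
        w = np.zeros(n_taps)
    e = np.zeros(len(d))
    for n in range(n_taps, len(d)):
        x_vec = x[n - n_taps:n][::-1]                # most recent input samples
        y = np.dot(w, x_vec)                         # filter estimate of d[n]
        e[n] = d[n] - y                              # error signal
        w = w + mu * e[n] * x_vec / (np.dot(x_vec, x_vec) + eps)  # NLMS update
    return w, e
```

When the wearer is silent or only background noise is present, the error signal remains close to the first-microphone signal and the coefficients show no dominant peak, which is the behavior the detection criteria above rely on.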
The illustrated power analyzer 413 compares the power of the error signal 420 to the power of the signal representative of sound received from the first microphone. According to various embodiments, a voice will not be detected unless the power of the signal representative of sound received from the first microphone is much greater than the power of the error signal. For example, the power analyzer 413 compares the difference to a threshold, and will not detect voice if the difference is less than the threshold.
The illustrated coefficient analyzer 414 analyzes the filter coefficients from the adaptive filter process 415. According to various embodiments, a voice will not be detected unless a peak value for the coefficients is sufficiently high. For example, some embodiments will not detect voice unless the largest normalized coefficient is greater than a predetermined value (e.g., 0.5).
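The two checks described in the preceding paragraphs can be combined into a single decision, sketched below under stated assumptions: the 10 dB power margin is a hypothetical example of the threshold on the power difference, and normalizing the coefficient vector by its Euclidean norm is one assumed way to obtain normalized coefficients; the 0.5 peak threshold follows the example value given above.

```python
import numpy as np

def detect_own_voice(mic1_block, error_signal, w,
                     power_margin_db=10.0, peak_threshold=0.5):
    """Return True only if both the power-analyzer condition and the
    coefficient-analyzer condition are satisfied for this block."""
    p_mic1 = np.mean(np.asarray(mic1_block) ** 2) + 1e-12   # first-mic power
    p_err = np.mean(np.asarray(error_signal) ** 2) + 1e-12  # error-signal power
    power_ok = 10.0 * np.log10(p_mic1 / p_err) > power_margin_db

    # Normalize the coefficients (here by the vector's Euclidean norm, an
    # assumption) and require a single dominant peak.
    w_norm = np.abs(w) / (np.linalg.norm(w) + 1e-12)
    peak_ok = np.max(w_norm) > peak_threshold

    return bool(power_ok and peak_ok)
```

Combined with the earlier sketches, `w, e = nlms_block(d, x)` followed by `detect_own_voice(d, e, w)` could supply the indication consumed by the processing-chain sketch above.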
The present subject matter includes hearing assistance devices, and was demonstrated with respect to BTE, OTE, and RIC type devices, but it is understood that it may also be employed in cochlear implant type hearing devices. It is understood that other hearing assistance devices not expressly stated herein may fall within the scope of the present subject matter.
This application is intended to cover adaptations or variations of the present subject matter. It is to be understood that the above description is intended to be illustrative, and not restrictive. The scope of the present subject matter should be determined with reference to the appended claims, along with the full scope of legal equivalents to which such claims are entitled.
This application is a continuation of U.S. application Ser. No. 15/614,200, filed Jun. 5, 2017, now issued as U.S. Pat. No. 10,171,922, which is a continuation of U.S. application Ser. No. 14/809,729, filed Jul. 27, 2015, now issued as U.S. Pat. No. 9,699,573, which is a continuation of U.S. application Ser. No. 13/933,017, filed Jul. 1, 2013, now issued as U.S. Pat. No. 9,094,766, which is a continuation of U.S. application Ser. No. 12/749,702, filed Mar. 30, 2010, now issued as U.S. Pat. No. 8,477,973, which claims the benefit of priority under 35 U.S.C. § 119(e) to U.S. Provisional Application No. 61/165,512, filed Apr. 1, 2009, each of which is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4791672 | Nunley et al. | Dec 1988 | A |
5008954 | Oppendahl | Apr 1991 | A |
5208867 | Stites, III | May 1993 | A |
5327506 | Stites, III | Jul 1994 | A |
5426719 | Franks et al. | Jun 1995 | A |
5479522 | Lindemann et al. | Dec 1995 | A |
5550923 | Hotvet | Aug 1996 | A |
5553152 | Newton | Sep 1996 | A |
5659621 | Newton | Aug 1997 | A |
5701348 | Shennib et al. | Dec 1997 | A |
5721783 | Anderson | Feb 1998 | A |
5761319 | Dar et al. | Jun 1998 | A |
5917921 | Sasaki et al. | Jun 1999 | A |
5991419 | Brander | Nov 1999 | A |
6175633 | Morrill et al. | Jan 2001 | B1 |
6639990 | Astrin et al. | Oct 2003 | B1 |
6661901 | Svean et al. | Dec 2003 | B1 |
6671379 | Nemirovski | Dec 2003 | B2 |
6718043 | Boesen | Apr 2004 | B1 |
6728385 | Kvaløy et al. | Apr 2004 | B2 |
6738482 | Jaber | May 2004 | B1 |
6738485 | Boesen | May 2004 | B1 |
6801629 | Brimhall et al. | Oct 2004 | B2 |
7027603 | Taenzer | Apr 2006 | B2 |
7027607 | Pedersen et al. | Apr 2006 | B2 |
7072476 | White et al. | Jul 2006 | B2 |
7110562 | Feeley et al. | Sep 2006 | B1 |
7242924 | Xie | Jul 2007 | B2 |
7477754 | Rasmussen et al. | Jan 2009 | B2 |
7512245 | Rasmussen | Mar 2009 | B2 |
7536020 | Fukumoto | May 2009 | B2 |
7929713 | Victorian et al. | Apr 2011 | B2 |
7983907 | Visser et al. | Jul 2011 | B2 |
8031881 | Zhang | Oct 2011 | B2 |
8059847 | Nordahn | Nov 2011 | B2 |
8081780 | Goldstein et al. | Dec 2011 | B2 |
8111849 | Tateno et al. | Feb 2012 | B2 |
8116489 | Mejia et al. | Feb 2012 | B2 |
8130991 | Rasmussen et al. | Mar 2012 | B2 |
8331594 | Brimhall et al. | Dec 2012 | B2 |
8391522 | Biundo Lotito et al. | Mar 2013 | B2 |
8391523 | Biundo Lotito et al. | Mar 2013 | B2 |
8477973 | Merks | Jul 2013 | B2 |
9036833 | Victorian et al. | May 2015 | B2 |
9094766 | Merks | Jul 2015 | B2 |
9219964 | Merks | Dec 2015 | B2 |
9369814 | Victorian | Jun 2016 | B2 |
9699573 | Merks | Jul 2017 | B2 |
9712926 | Merks | Jul 2017 | B2 |
10171922 | Merks | Jan 2019 | B2 |
10225668 | Merks | Mar 2019 | B2 |
20010038699 | Hou | Nov 2001 | A1 |
20020034310 | Hou | Mar 2002 | A1 |
20020080979 | Brimhall et al. | Jun 2002 | A1 |
20020141602 | Nemirovski | Oct 2002 | A1 |
20030012391 | Armstrong et al. | Jan 2003 | A1 |
20030165246 | Kvaloy et al. | Sep 2003 | A1 |
20040081327 | Jensen | Apr 2004 | A1 |
20050058313 | Victorian et al. | Mar 2005 | A1 |
20070009122 | Debiasio et al. | Jan 2007 | A1 |
20070098192 | Sipkema | May 2007 | A1 |
20070195968 | Jaber | Aug 2007 | A1 |
20080192971 | Tateno et al. | Aug 2008 | A1 |
20080260191 | Victorian et al. | Oct 2008 | A1 |
20090016542 | Goldstein et al. | Jan 2009 | A1 |
20090034765 | Boillot et al. | Feb 2009 | A1 |
20090074201 | Zhang | Mar 2009 | A1 |
20090097681 | Puria et al. | Apr 2009 | A1 |
20090147966 | McIntosh et al. | Jun 2009 | A1 |
20090220096 | Usher et al. | Sep 2009 | A1 |
20090238387 | Arndt et al. | Sep 2009 | A1 |
20100061564 | Clemow et al. | Mar 2010 | A1 |
20100246845 | Burge et al. | Sep 2010 | A1 |
20100260364 | Merks | Oct 2010 | A1 |
20110195676 | Victorian et al. | Aug 2011 | A1 |
20110299692 | Rung et al. | Dec 2011 | A1 |
20120070024 | Anderson | Mar 2012 | A1 |
20120128187 | Yamada et al. | May 2012 | A1 |
20130195296 | Merks | Aug 2013 | A1 |
20140010397 | Merks | Jan 2014 | A1 |
20140270230 | Oishi et al. | Sep 2014 | A1 |
20150043765 | Merks | Feb 2015 | A1 |
20160021469 | Victorian et al. | Jan 2016 | A1 |
20160029131 | Merks | Jan 2016 | A1 |
20160192089 | Merks | Jun 2016 | A1 |
20170318398 | Merks | Nov 2017 | A1 |
20170339497 | Merks | Nov 2017 | A1 |
20190200142 | Merks | Jun 2019 | A1 |
Number | Date | Country |
---|---|---|
2242289 | Dec 2016 | EP |
WO-9845937 | Oct 1998 | WO |
WO-0207477 | Jan 2002 | WO |
WO-2006028587 | Mar 2003 | WO |
WO-03073790 | Sep 2003 | WO |
WO-2004021740 | Mar 2004 | WO |
WO-2004077090 | Sep 2004 | WO |
WO-2005004534 | Jan 2005 | WO |
WO-2005125269 | Dec 2005 | WO |
WO-2009034536 | Mar 2009 | WO |
Entry |
---|
“U.S. Appl. No. 15/651,459, Non Final Office Action dated Jun. 15, 2018”, 11 pgs. |
“U.S. Appl. No. 16/290,131, Preliminary Amendment Filed Mar. 8, 2019”, 7 pgs. |
“U.S. Appl. No. 15/651,459, Notice of Allowance dated Oct. 25, 2018”, 5 pgs. |
“U.S. Appl. No. 15/651,459, Response Filed Sep. 17, 2018 to Non Final Office Action dated Jun. 15, 2018”, 11 pgs. |
“U.S. Appl. No. 10/660,454, Advisory Action dated May 20, 2008”, 4 pgs. |
“U.S. Appl. No. 10/660,454, Final Office Action dated Dec. 27, 2007”, 18 pgs. |
“U.S. Appl. No. 10/660,454, Non-Final Office Action dated Jul. 27, 2007”, 16 pgs. |
“U.S. Appl. No. 10/660,454, Response filed Apr. 25, 2008 to Final Office Action dated Dec. 27, 2007”, 15 pgs. |
“U.S. Appl. No. 10/660,454, Response filed May 9, 2007 to Restriction Requirement dated Apr. 9, 2007”, 11 pgs. |
“U.S. Appl. No. 10/660,454, Response filed Oct. 15, 2007 to Non-Final Office Action dated Jul. 27, 2007”, 17 pgs. |
“U.S. Appl. No. 10/660,454, Restriction Requirement dated Apr. 9, 2007”, 5 pgs. |
“U.S. Appl. No. 12/163,665, Notice of Allowance dated Feb. 7, 2011”, 4 pgs. |
“U.S. Appl. No. 12/163,665, Notice of Allowance dated Sep. 28, 2010”, 9 pgs. |
“U.S. Appl. No. 12/749,702, Response filed Aug. 27, 2012 to Non-Final Office Action dated May 25, 2012”, 13 pgs. |
“U.S. Appl. No. 12/749,702, Final Office Action dated Oct. 12, 2012”, 7 pgs. |
“U.S. Appl. No. 12/749,702, Non-Final Office Action dated May 25, 2012”, 6 pgs. |
“U.S. Appl. No. 12/749,702, Notice of Allowance dated Mar. 4, 2013”, 7 pgs. |
“U.S. Appl. No. 12/749,702, Response filed Feb. 12, 2013 to Final Office Action dated Oct. 12, 2012”, 10 pgs. |
“U.S. Appl. No. 13/088,902, Advisory Action dated Nov. 28, 2014”, 3 pgs. |
“U.S. Appl. No. 13/088,902, Final Office Action dated Sep. 23, 2014”, 21 pgs. |
“U.S. Appl. No. 13/088,902, Final Office Action dated Nov. 29, 2013”, 16 pgs. |
“U.S. Appl. No. 13/088,902, Non-Final Office Action dated Mar. 27, 2014”, 15 pgs. |
“U.S. Appl. No. 13/088,902, Non-Final Office Action dated May 21, 2013”, 15 pgs. |
“U.S. Appl. No. 13/088,902, Notice of Allowance dated Jan. 20, 2015”, 5 pgs. |
“U.S. Appl. No. 13/088,902, Response filed Feb. 28, 2014 to Final Office Action dated Nov. 29, 2013”, 12 pgs. |
“U.S. Appl. No. 13/088,902, Response filed Jun. 27, 2014 to Non-Final Office Action dated Mar. 27, 2014”, 13 pgs. |
“U.S. Appl. No. 13/088,902, Response filed Aug. 21, 2013 to Non-Final Office Action dated May 21, 2013”, 10 pgs. |
“U.S. Appl. No. 13/088,902, Response filed Nov. 20, 2014 to Final Office Action dated Sep. 23, 2014”, 12 pgs. |
“U.S. Appl. No. 13/933,017, Non-Final Office Action dated Sep. 18, 2014”, 6 pgs. |
“U.S. Appl. No. 13/933,017, Notice of Allowance dated Mar. 20, 2015”, 7 pgs. |
“U.S. Appl. No. 13/933,017, Response filed Dec. 18, 2014 to Non-Final Office Action dated Sep. 18, 2014”, 6 pgs. |
“U.S. Appl. No. 14/464,149, Non-Final Office Action dated Apr. 29, 2015”, 4 pgs. |
“U.S. Appl. No. 14/464,149, Notice of Allowance dated Aug. 14, 2015”, 6 pgs. |
“U.S. Appl. No. 14/464,149, Response filed Jul. 29, 2015 to Non-Final Office Action dated Apr. 29, 2015”, 7 pgs. |
“U.S. Appl. No. 14/714,841, Notice of Allowance dated Feb. 12, 2016”, 12 pgs. |
“U.S. Appl. No. 14/714,841, Preliminary Amendment filed Oct. 13, 2015”, 7 pgs. |
“U.S. Appl. No. 14/809,729, Corrected Notice of Allowance dated Jun. 1, 2017”, 7 pgs. |
“U.S. Appl. No. 14/809,729, Non-Final Office Action dated Aug. 24, 2016”, 16 pgs. |
“U.S. Appl. No. 14/809,729, Notice of Allowance dated Feb. 3, 2017”, 10 pgs. |
“U.S. Appl. No. 14/809,729, Preliminary Amendment filed Oct. 12, 2015”, 6 pgs. |
“U.S. Appl. No. 14/809,729, Response filed Nov. 23, 2016 to Non-Final Office Action dated Aug. 24, 2016”, 7 pgs. |
“U.S. Appl. No. 14/976,711, Non-Final Office Action dated Aug. 26, 2016”, 5 pgs. |
“U.S. Appl. No. 14/976,711, Notice of Allowability dated May 12, 2017”, 9 pgs. |
“U.S. Appl. No. 14/976,711, Notice of Allowance dated Mar. 14, 2017”, 5 pgs. |
“U.S. Appl. No. 14/976,711, Preliminary Amendment filed Mar. 14, 2016”, 6 pgs. |
“U.S. Appl. No. 14/976,711, Response filed Nov. 23, 2016 to Non-Final Office Action dated Aug. 26, 2016”, 7 pgs. |
“U.S. Appl. No. 15/614,200, Non-Final Office Action dated Mar. 8, 2018”, 10 pgs. |
“U.S. Appl. No. 15/614,200, Notice of Allowance dated Aug. 31, 2018”, 11 pgs. |
“U.S. Appl. No. 15/614,200, Preliminary Amendment filed Aug. 14, 2017”, 6 pgs. |
“U.S. Appl. No. 15/614,200, Response Filed Jun. 1, 2018 to Non-Final Office Action dated Mar. 8, 2018”, 9 pgs. |
“Canadian Application Serial No. 2,481,397, Non-Final Office Action dated Dec. 5, 2007”, 6 pgs. |
“Canadian Application Serial No. 2,481,397, Response filed Jun. 5, 2008 to Office Action dated Dec. 5, 2007”, 15 pgs. |
“European Application Serial No. 04255520.1, European Search Report dated Nov. 6, 2006”, 3 pgs. |
“European Application Serial No. 04255520.1, Office Action dated Jun. 25, 2007”, 4 pgs. |
“European Application Serial No. 04255520.1, Response filed Jan. 7, 2008”, 21 pgs. |
“European Application Serial No. 10250710.0, Examination Notification Art. 94(3) dated Jun. 25, 2014”, 5 pgs. |
“European Application Serial No. 10250710.0, Response filed Oct. 13, 2014 to Examination Notification Art. 94(3) dated Jun. 25, 2014”, 21 pgs. |
“European Application Serial No. 10250710.0, Search Report dated Jul. 20, 2010”, 6 pgs. |
“European Application Serial No. 10250710.0, Search Report Response dated Apr. 18, 2011”, 16 pgs. |
“European Application Serial No. 10250710.0, Summons to Attend Oral Proceedings mailed May 12, 2016”, 3 pgs. |
“European Application Serial No. 15181620.4, Communication Pursuant to Article 94(3) EPC dated Dec. 12, 2016”, 6 pgs. |
“European Application Serial No. 15181620.4, Extended European Search Report dated Jan. 22, 2016”, 8 pgs. |
“European Application Serial No. 15181620.4, Response filed Apr. 21, 2017 to Communication Pursuant to Article 94(3) EPC dated Dec. 12, 2016”, 33 pgs. |
“European Application Serial No. 16206730.0, Extended European Search Report dated Apr. 20, 2017”, 8 pgs. |
“The New Jawbone: The Best Bluetooth Headset Just Got Better”, www.aliph.com, (2008), 3 pages. |
Evjen, Peder M., “Low-Power Transceiver Targets Wireless Headsets”, Microwaves & RF, (Oct. 2002), 68, 70, 72-73, 75-76, 78-80. |
Luo, Fa-Long, et al., “Recent Developments in Signal Processing for Digital Hearing Aids”, IEEE Signal Processing Magazine, (Sep. 2006), 103-106. |
“U.S. Appl. No. 16/290,131, Non Final Office Action dated Sep. 6, 2019”, 7 pgs. |
“U.S. Appl. No. 16/290,131, Response filed Dec. 5, 2019 to Non Final Office Action dated Sep. 6, 2019”, 8 pgs. |
“European Application Serial No. 15181620.4, Communication of a Notice of Opposition mailed Jun. 27, 2019”, 40 pgs. |
Number | Date | Country | |
---|---|---|
20190215619 A1 | Jul 2019 | US |
Number | Date | Country | |
---|---|---|
61165512 | Apr 2009 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 15614200 | Jun 2017 | US |
Child | 16235214 | | US |
Parent | 14809729 | Jul 2015 | US |
Child | 15614200 | | US |
Parent | 13933017 | Jul 2013 | US |
Child | 14809729 | | US |
Parent | 12749702 | Mar 2010 | US |
Child | 13933017 | | US |