Method and apparatus for a binaural hearing assistance system using monaural audio signals

Information

  • Patent Grant
  • Patent Number
    9,036,823
  • Date Filed
    Friday, May 4, 2012
  • Date Issued
    Tuesday, May 19, 2015
Abstract
The present application provides a method and apparatus for a binaural hearing assistance system using a monaural audio signal input. The system, in various examples, provides adjustable delay/phase adjustment and sound level adjustment. Different embodiments are provided for receiving the monaural signal and distributing it to a plurality of hearing assistance devices. Different relaying modes are provided. Special functions, such as telecoil functions, are supported. The system also has examples that account for a head-related transfer function in providing advanced sound processing for the wearer. Other examples are described in the detailed description.
Description
FIELD OF THE INVENTION

This application relates generally to a method and apparatus for a hearing assistance system, and more particularly to a method and apparatus for a binaural hearing assistance system using a monaural audio signal.


BACKGROUND

Modern wireless audio devices frequently apply a monaural signal to a single ear. For example, devices such as cell phones and cellular headsets receive monaural communications for presentation to a single ear. By this approach, many advantages of binaural hearing are lost. Because such devices apply sound to only one ear, hearing can be impaired by loud noise entering the other ear or by hearing limitations associated with the particular ear receiving the sound.


Thus, there is a need in the art for an improved hearing assistance system which provides the advantages of binaural hearing for listening to a monaural signal. The system should be controllable to provide better hearing, convenience, and an unobtrusive design. In certain variations, the system may also allow a user to customize his or her hearing experience by controlling the sounds received by the system.


SUMMARY

This application addresses the foregoing need in the art and other needs not discussed herein. The various embodiments described herein relate to a wireless system for binaural hearing assistance devices.


One embodiment includes an apparatus for a user having a first ear and a second ear, including a wireless device to transmit a signal containing monaural information; a first hearing assistance device including: a first radio receiver to receive the signal; an adjustable phase shifter adapted to apply a plurality of controllable, incremental phase shifts to the monaural information on the signal; and a first speaker to produce a first audio signal for the first ear; and a second hearing assistance device including a second radio receiver and a second speaker to produce a second audio signal for the second ear, wherein the first and second audio signals are produced with adjustable relative phase based on a setting of the adjustable phase shifter. Various embodiments provide adjustable level controls and microphones in combinations of first and/or second hearing assistance devices. Some applications include communications between cellular devices, such as cellular phones and hearing aids. Various embodiments provide applications using wireless audio controllers having packetized audio. Both manual and automatic adjustments are provided. In various embodiments, different combinations of receivers and sensors, such as magnetic field sensors, are provided. In various embodiments, processing adapted to account for head-related transfer functions, and control of the electronics using such processing, are provided.


In one embodiment, a system is provided for a user having a first ear and a second ear, including: a device comprising a controllable phase shifter adapted to receive a monaural information signal and convert it into a first monaural signal and a second monaural signal, the first and second monaural signals having an interaural phase shift; a first hearing assistance device including: a first receiver adapted to receive the first monaural signal; and a first speaker to produce a first audio signal for the first ear; and a second hearing assistance device including: a second receiver adapted to receive the second monaural signal; and a second speaker to produce a second audio signal for the second ear. Various embodiments provide adjustable level controls and microphones in combinations of first and/or second hearing assistance devices. Some applications include communications between cellular devices, such as cellular phones and hearing aids. Various embodiments provide applications using wireless audio controllers having packetized audio. Both manual and automatic adjustments are provided. In various embodiments, different combinations of receivers and sensors, such as magnetic field sensors, are provided. In various embodiments, processing adapted to account for head-related transfer functions, and control of the electronics using such processing, are provided.


Methods are also provided, including, for example, a method for providing sound to a first ear and a second ear of a wearer of first and second hearing assistance devices, including: receiving a monaural information signal; converting the monaural information signal into a first monaural signal and a second monaural signal, the first and second monaural signals differing in relative phase, which is controllable; and providing a first sound based on the first monaural signal to the first ear of the wearer and a second sound based on the second monaural signal to the second ear of the wearer to provide binaural sound to the wearer. Different applications, including different methods for lateralizing perceived sounds and levels of perceived sounds, are provided. Different embodiments for methods of use, including sensing telephone (telecoil) modes, are provided. Different embodiments for applications employing head-related transfer functions and relaying are also provided. A variety of different interaural delays and phase changes are provided. Other embodiments not expressly mentioned in this Summary are found in the detailed description.


This Summary is an overview of some of the teachings of the present application and not intended to be an exclusive or exhaustive treatment of the present subject matter. Further details about the present subject matter are found in the detailed description and appended claims.





BRIEF DESCRIPTION OF THE DRAWINGS

Various embodiments are illustrated by way of example in the figures of the accompanying drawings.



FIG. 1A shows one system using devices in a direct communication mode according to one embodiment of the present subject matter.



FIG. 1B shows a block diagram of signal flow in a hearing assistance device according to one embodiment of the present subject matter.



FIG. 1C shows detail of the signal processing block of FIG. 1B according to one embodiment of the present subject matter.



FIG. 2 shows one system of devices in a relaying communication mode according to one embodiment of the present subject matter.



FIG. 3 shows one system of devices in a relaying communication mode according to one embodiment of the present subject matter.



FIG. 4A shows one system providing multiple signals according to one embodiment of the present subject matter.



FIG. 4B shows a signal flow of a wireless audio controller according to one embodiment of the present subject matter.





DETAILED DESCRIPTION

In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the various embodiments. It will be apparent, however, to one skilled in the art that the various embodiments may be practiced without some of these specific details. The following description and drawings provide examples for illustration, and are not intended to provide an exhaustive treatment of all possible implementations.


It should be noted that references to “an”, “one”, or “various” embodiments in this disclosure are not necessarily to the same embodiment, and such references contemplate more than one embodiment.


The present subject matter presents sound derived from a single monaural signal to both ears of a user wearing wireless hearing assistance devices. Among other things, it allows for better control of the received sound and obtains the benefits of binaural hearing for listening to the monaural signal. In various embodiments, the sound presented to one ear is phase shifted relative to the sound presented to the other ear. In various embodiments, the phase shift arises from a constant time delay. In various embodiments, the phase shift arises from a constant phase shift at all frequencies. In various embodiments, the phase shift arises from a phase shift that varies as a function of frequency. In various embodiments, the sound presented to one ear is set to a different level relative to the sound presented to the other ear. In various embodiments, the sound presented to one ear is controllable in relative phase and in relative level with respect to the sound presented to the other ear. Various apparatus and methods set forth herein can be employed to accomplish these embodiments and their equivalents. Other variations not expressly set forth herein exist which are within the scope of the present subject matter. Thus, the examples provided herein demonstrate various aspects of the present subject matter and are not intended to be limiting or exclusive.



FIG. 1A shows one system using devices in a direct communication mode according to one embodiment of the present subject matter. In various embodiments, wireless device 102 supports one or more communication protocols. In various embodiments, communications of far field signals are supported. Some embodiments employ 2.4 GHz communications. In various embodiments the wireless communications can include standard or nonstandard communications. Some examples of standard wireless communications include, but are not limited to, FM, AM, SSB, BLUETOOTH™, IEEE 802.11 (wireless LAN, Wi-Fi), 802.15 (WPAN), 802.16 (WiMAX), 802.20, cellular protocols including, but not limited to, CDMA and GSM, ZigBee, and ultra-wideband (UWB) technologies. Such protocols support radio frequency communications, and some support infrared communications. It is possible that other forms of wireless communications can be used, such as ultrasonic and optical. It is understood that the standards which can be used include past and present standards. It is also contemplated that future versions of these standards and new future standards may be employed without departing from the scope of the present subject matter.


Such wireless devices 102 include, but are not limited to, cellular telephones, personal digital assistants, personal computers, streaming audio devices, wide area network devices, local area network devices, personal area network devices, and remote microphones. In various embodiments, the wireless device 102 includes one or more of the interface embodiments demonstrated in U.S. Provisional Patent application Ser. No. 60/687,707, filed Jun. 5, 2005, entitled: COMMUNICATION SYSTEM FOR WIRELESS AUDIO DEVICES, and U.S. patent application Ser. No. 11/447,617, filed Jun. 5, 2006, entitled: COMMUNICATION SYSTEM FOR WIRELESS AUDIO DEVICES which claims the benefit of the provisional application, the entire disclosures of which are hereby incorporated by reference. This is also applicable to wireless devices 202, 302, and 402 as described herein.


In the embodiment demonstrated by FIG. 1A, the listener has primary and secondary wireless hearing assistance devices R1 and R2. The wireless hearing assistance devices include, but are not limited to, various embodiments of hearing aids. In one embodiment, at least one wireless hearing assistance device is a behind-the-ear hearing aid. In one embodiment, at least one wireless hearing assistance device is an in-the-ear hearing aid. In one embodiment, at least one wireless hearing assistance device is a completely-in-the-canal hearing aid. In one embodiment, at least one wireless hearing assistance device is a wireless earpiece. In one embodiment, at least one wireless hearing assistance device is a behind-the-ear hearing aid with a wireless adaptor attached. Various examples of wireless adapters for some hearing assistance devices using a direct-audio input (DAI) interface are demonstrated in U.S. patent application Ser. No. 11/207,591, filed Aug. 18, 2005, entitled “WIRELESS COMMUNICATIONS ADAPTER FOR A HEARING ASSISTANCE DEVICE;” and PCT Patent Application No. PCT/US2005/029971, filed Aug. 18, 2005, entitled “WIRELESS COMMUNICATIONS ADAPTER FOR A HEARING ASSISTANCE DEVICE,” the entire disclosures of which are incorporated by reference.


In the system of FIG. 1A, the communication protocol of wireless device 102 is adapted to controllably provide wireless communications 105, 109 to both the primary wireless hearing assistance device R1 and the secondary wireless hearing assistance device R2. In various embodiments, the communications are unidirectional. In various embodiments, the communications are bidirectional. In various embodiments, the communications include at least one unidirectional communication and one bidirectional communication. Thus, the system is highly programmable to adapt to a number of communication requirements and applications. The system is adapted to provide binaural information to both R1 and R2 based on a monaural signal from wireless device 102.


In embodiments using BLUETOOTH as the communication protocol, it is noted that BLUETOOTH is normally directed for point-to-point communications using PINs (personal identification numbers), such that the wireless device 102 is typically paired with only one other device, such as primary device R1. Thus, to allow the wireless device 102 to also communicate with secondary device R2, a second pairing must be done, whether by standard or nonstandard means.



FIG. 1B shows a block diagram of signal flow in a hearing assistance device according to one embodiment of the present subject matter. For purposes of demonstration, this block diagram will be that of wireless audio device R1. However, it is understood that R2 or any other wireless audio device receiving the monaural signal from wireless device 102 could employ the subject matter of FIG. 1B without departing from the scope of the present subject matter.


The monaural signal 105 is received by receiver 122, which demodulates the signal and provides the audio signal 128 to signal processor 124. Signal processor 124 processes the signal to provide signal 130, which is then sent to speaker 126 to play the processed signal 130 to one ear of a wearer of R1. Various inputs from a user or from other external programming means may be employed to provide control of the signal processing performed by signal processor 124. These inputs can be accomplished with a variety of switches and/or programming ports, as needed to provide signal processing selections and/or parameters for the system.


In one embodiment, signal processor 124 is a digital signal processor. In one embodiment, signal processor 124 comprises hardware and software to accomplish the signal processing task. In one embodiment, signal processor 124 employs dedicated hardware in combination with other computational or digital signal processing hardware to perform the signal processing task. It is understood that a separate amplifier may be used for amplifying the signal 130 before sending it to speaker 126 as is known in the art. Thus, FIG. 1B is intended to demonstrate the basic operational blocks at one level and is not intended to be exclusive or exhaustive of the expressions of the present subject matter.



FIG. 1C shows detail of the signal processing block 124 of FIG. 1B according to one embodiment of the present subject matter. In this example, the monaural input signal 128 is processed by phase shifter 132 to provide a phase-shifted version of the input signal 128. In various embodiments, the phase shift arises from a constant time delay applied to input signal 128. In various embodiments, the phase shift arises from a constant phase shift at all frequencies applied to input signal 128. In various embodiments, the phase shift arises from a phase shift that varies as a function of frequency. Thus, control 138 provides some form of setting for adjusting the phase shift and/or for selecting the type of phase shift to be applied. In one embodiment, the signal 125 is provided by a source external to the hearing assistance device R1 to control the phase shift. Various means for supplying signal 125 include one or more of switches operable by the user, soft switches programmed by a programming device attached to the hearing assistance device, or any combination of such inputs. Furthermore, in various embodiments, signal 125 may be internally generated by systems within the hearing assistance device to provide phase shift control as a function of one or more of sound received, conditions detected, and other processes requiring a change of phase shift amount and/or mode. The signal 125 may also be transmitted to and received by the device to adjust its operation.
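
By way of a minimal sketch only, and not as the patent's implementation, the three phase-shift types described for phase shifter 132 could be realized digitally roughly as follows; the function name, sampling rate, and default values are assumptions for illustration:

```python
import numpy as np

def phase_shift(x, mode, fs=16000, delay_ms=1.0, phase_rad=np.pi, phase_fn=None):
    """Return a phase-shifted copy of the monaural signal x (1-D numpy array).

    mode 'delay':     constant time delay of delay_ms milliseconds
    mode 'constant':  constant phase shift of phase_rad at all frequencies
    mode 'frequency': frequency-varying phase given by phase_fn(freqs_hz)
    """
    if mode == 'delay':
        n = int(round(delay_ms * 1e-3 * fs))          # delay in whole samples
        return np.concatenate([np.zeros(n), x])[:len(x)]

    X = np.fft.rfft(x)                                # frequency-domain modes
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    if mode == 'constant':
        phase = np.full_like(freqs, phase_rad)
    elif mode == 'frequency':
        phase = np.asarray(phase_fn(freqs), dtype=float)
    else:
        raise ValueError(mode)
    phase[0] = 0.0                                    # leave DC untouched
    return np.fft.irfft(X * np.exp(1j * phase), n=len(x))
```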


For example, signal 125 could be generated as a result of a telephone device in proximity to the hearing assistance device, to lateralize received sounds to the ear proximal to the telephone. As another example, signal 125 can be generated to discontinue phase adjustment when the user receives a wireless signal indicating a ringing telephone. As another example, signal 125 can be generated to discontinue phase adjustment when an emergency vehicle or other siren is detected in proximity. Many other applications and operations of the system are possible without departing from the scope of the present subject matter. Those provided herein are intended to be demonstrative and not exhaustive or limiting of the present subject matter.



FIG. 1C also shows that the phase-shifted signal may optionally be processed for other effects by processor 134. The resulting signal is sent to amplifier circuit 136 to generate output 130 for speaker 126. Processor 134 allows further adjustment of the signal, including level adjustment. For example, the level and phase of the signal 130 can be programmably controlled, in one embodiment. If the hearing assistance device at the other ear (e.g., R2) does not adjust phase or level, then by controlling R1 a wearer of the hearing assistance devices R1 and R2 can experience both interaural level differences and interaural time/phase differences that are adjustable and controllable.


In applications where both R1 and R2 include the system of FIGS. 1A-1C, the settings of both devices can be adjusted to achieve desired interaural level and interaural time/phase differences. One way of communicating settings to both devices is to use signals embedded in the monaural information signals S1 that are received by R1 and R2. Thus, the monaural information is identical in such embodiments, but the embedded signals may be used to adjust R1 relative to R2. Such embodiments require processing on wireless device 102 to provide appropriate control of R1 with respect to R2. It is understood that in one embodiment, such systems may employ signaling that adjusts only R1, leaving R2 to operate without adjustment. In one embodiment, both R1 and R2 receive signals that adjust both devices to relatively provide the desired interaural level and/or interaural time/phase differences. In other embodiments, the signals for such interaural differences are generated within R1 and/or R2. For example, in a telephone sensing embodiment, the electronics of R1 may include a magnetic field sensor which programs R1 to shift to a telecoil mode (thereby turning off or diminishing the local microphone-received sound of the hearing assistance device R1) when a telephone is detected at or near R1. Many other embodiments and applications are possible without departing from the scope of the present subject matter.
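
As a purely hypothetical illustration (the patent does not define a frame format), per-device settings embedded alongside the identical monaural audio might be organized along the following lines; the field names and structure are assumptions:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MonauralFrame:                  # identical audio plus per-device side data
    audio: List[float] = field(default_factory=list)
    r1_phase_deg: float = 0.0         # adjustment settings for R1
    r1_level_db: float = 0.0
    r2_phase_deg: float = 0.0         # adjustment settings for R2
    r2_level_db: float = 0.0

def settings_for(frame: MonauralFrame, device_id: str):
    """Each device applies only its own adjustment fields."""
    if device_id == "R1":
        return frame.r1_phase_deg, frame.r1_level_db
    return frame.r2_phase_deg, frame.r2_level_db
```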


Other signaling and communications modes may be accomplished without departing from the scope of the present subject matter. For example, FIG. 2 shows one system of devices in a relaying communication mode according to one embodiment of the present subject matter. The relaying can be of control signals, audio signals, or a combination of both. The relaying can be used to adjust phase and amplitude at both R1 and R2, and provides the ability to control lateralization and volume of the monaural signal at both ears. For example, when one ear detects a telephone signal, the relayed signal could include instructions to shut off or diminish the locally received sound at the other ear to better hear the caller. The relayed signal could also lateralize the sound toward the device detecting the phone so that the wearer enjoys the enhanced benefits of binaural reception of the caller. Such embodiments can relay the caller's voice to the ear without the telephone against it, at the proper phase and level to properly lateralize the sound of the caller's voice.


New virtual communication modes are also possible. When used in conjunction with telecommunications equipment, the system could provide a virtual handheld phone function without the user ever picking up the phone. For example, with this system, the user may answer his/her telephone (signaled from a ringing telephone), engage in a wireless session with his/her phone (e.g., Bluetooth communications with a cellular phone), and the system will programmably and automatically lateralize sound to a desired ear for binaural reception of the caller. All these activities can be performed without ever having to pick the phone up or place it near the ear. Those of skill in the art will readily appreciate a number of other applications within the scope of the present subject matter.


In some embodiments, it is also possible to insert special audio information for playing to one or more ears based on events. For example, continuing the virtual phone example above, a voice prompt could play when caller identification identifies the caller, letting the wearer know who is calling and decide whether to answer the phone.
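
Purely as an illustrative sketch (the patent does not specify how the special audio is inserted), an event-triggered announcement could be mixed into the current audio block along these lines; the names and gain value are assumptions:

```python
import numpy as np

def mix_announcement(stream, announcement, gain=0.8):
    """Overlay a short announcement (e.g., a caller-ID prompt) on the stream."""
    out = np.array(stream, dtype=float, copy=True)
    n = min(len(out), len(announcement))
    out[:n] += gain * np.asarray(announcement[:n], dtype=float)
    return out
```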


Other applications too numerous to mention herein are possible without departing from the scope of the present subject matter.



FIG. 3 shows one system of devices in a relaying communication mode according to one embodiment of the present subject matter. In the embodiment of FIG. 3, one receiver (e.g., R1) may receive the monaural signal S1 and relay the audio and/or control information to a second receiver (R2) in a relaying mode. The information communicated from wireless device 302 to primary device R1 is retransmitted to secondary device R2. Such systems have an additional time delay for the relayed signal to reach secondary device R2 with the information. Thus, for synchronization of the information timing, the system may employ a delay in the primary device R1 to account for the extra time needed to relay the information to secondary device R2.
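
A minimal sketch, assuming a digital audio path and a known or measured relay latency, of how primary device R1 might delay its own playback to stay synchronized with the relayed signal; the names and the 4 ms figure are illustrative assumptions only:

```python
import numpy as np

def compensate_relay_delay(audio, fs, relay_latency_ms):
    """Delay R1's locally played audio by the time needed to relay it to R2."""
    n = int(round(relay_latency_ms * 1e-3 * fs))
    return np.concatenate([np.zeros(n), audio])[:len(audio)]

# Example: if relaying R1 -> R2 takes roughly 4 ms (an assumed figure), R1
# plays its copy 4 ms late so both ears receive the information together.
aligned = compensate_relay_delay(np.zeros(16000), fs=16000, relay_latency_ms=4.0)
```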


This additional relaying option demonstrates the flexibility of the system. Other relaying modes are possible without departing from the scope of the present subject matter.


In the various relaying modes provided herein, relaying may be performed in a variety of different embodiments. In one embodiment, the relaying is unidirectional. In one embodiment the relaying is bidirectional. In one embodiment, relaying of audio information is unidirectional and control information is bidirectional. Other embodiments of programmable relaying are possible involving combinations of unidirectional and bidirectional relaying. Thus, the system is highly programmable to adapt to a number of communication requirements and applications.



FIG. 4A shows one system providing multiple signals according to one embodiment of the present subject matter. This system demonstrates that phase and/or level adjustment may be performed at the wireless device 402 to provide a first signal S1 and a second signal S2 from a single monaural signal. In some embodiments, the signals S1 and S2 are adjusted to the desired interaural phase/time delay and interaural level differences by wireless device 402 and then played to the wearer of R1 and R2 without further adjustments to the phase and/or level. In some embodiments, further adjustment of the interaural phase/time delay and/or interaural level can be performed by R1, by R2, or by both in combination. The adjustments to interaural phase/time delay and/or interaural level are controllable by inputs to the wireless device 402, and many of the same applications can be performed as set forth herein.



FIG. 4B shows a signal flow of a wireless audio controller according to one embodiment of the present subject matter. In this example, the monaural input signal 405 is processed by phase shifter 432 to provide a phase-shifted version of the input signal 405. In various embodiments, the phase shift arises from a constant time delay applied to input signal 405. In various embodiments, the phase shift arises from a constant phase shift at all frequencies applied to input signal 405. In various embodiments, the phase shift arises from a phase shift that varies as a function of frequency. Thus, control 438 provides some form of setting for adjusting the phase shift and/or for selecting the type of phase shift to be applied. In one embodiment, the signal 425 is provided by a source external to the wireless device 402 to control the phase shift. Various means for supplying signal 425 include one or more of switches operable by a user, soft switches programmed by a programming device, or any combination of such inputs. Furthermore, in various embodiments, signal 425 may be internally generated by systems within the wireless device 402 to provide phase shift control as a function of one or more of sound received, conditions detected, and other processes requiring a change of phase shift amount and/or mode. The signal 425 may also be transmitted to and received by the device to adjust its operation.


The phase adjusted signal may also be further processed using processor 434. The resulting signal is sent to radio transmitter 440 to provide S1 and S2 with the desired interaural phase/time delay and interaural level adjustments. Thus, the phase shifter circuitry is located at the wireless device 402 in this embodiment. In various embodiments, the wireless device 402 includes one or more of the interface embodiments demonstrated in U.S. Provisional Patent Application Ser. No. 60/687,707, filed Jun. 5, 2005, entitled: COMMUNICATION SYSTEM FOR WIRELESS AUDIO DEVICES, and U.S. patent application Ser. No. 11/447,617, filed Jun. 5, 2006, entitled: COMMUNICATION SYSTEM FOR WIRELESS AUDIO DEVICES which claims the benefit of U.S. Provisional Application Ser. No. 60/687,707, the entire disclosures of which are hereby incorporated by reference. The functionalities of the wireless audio controller can be combined with the phase/time delay and level adjusting features described herein. Various different inputs may be used in combination to perform phase/time delay adjustment control and interaural level adjustment control.


The systems of FIGS. 4A-4B can perform many of the applications set forth above for the systems of FIGS. 1-3. Furthermore, the systems may work in conjunction to provide interaural phase/time delay and interaural level adjustment of the signals for a variety of applications. Various different inputs may be used in combination to perform phase/time delay adjustment control and interaural level adjustment control.


The following discussion applies to all of the embodiments set forth herein. For audio applications including speech, a number of modes exist for binaural presentation of speech to the primary device and secondary device. Binaural speech information can greatly enhance intelligibility of speech. This is especially so when speech has been distorted through a vocoder and when the wearer is attempting to listen in a noisy environment. The following modes also provide other advantages for speech information, such as loudness summation and a release from masking, making the speech more understandable in a noisy environment.


1) Coherent Signals: When signals are coherent, the signals provided to a wearer of, for example, hearing aids receiving signals via the DAI interfaces are identical, producing a perception of centered sound to the user. Such speech would be diotic.


2) Incoherent Signals: A phase shift is applied across the spectrum of the signal in either the primary or the secondary device. For example, the speech signal in the secondary device could be inverted, equivalent to a 180-degree phase shift at all frequencies. One approach to providing such a phase shift is to convert the time-domain signal processed by the system into a frequency-domain signal and then apply a predetermined phase to create the 180-degree shift for all frequencies of interest. The binaural speech will be perceived as diffuse and may be preferred by the wearer over the centered, diotic speech associated with coherent signals (above). Speech in the case of incoherent signals is dichotic. Those of skill in the art will know that many phase adjustments can be made to achieve a diffuse perception, including a constant phase change across frequency of a value other than 180 degrees and a frequency-varying phase change. Time-domain filters, such as all-pass filters, can also be used to adjust the phase of the signal without the use of a time-to-frequency conversion.
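
As an illustrative sketch only (one possible realization, not the implementation disclosed here), the frequency-domain 180-degree inversion and an all-pass alternative might look like this:

```python
import numpy as np
from scipy.signal import lfilter

def invert_via_fft(x):
    """180-degree phase shift at all frequencies (equivalent to negating x)."""
    X = np.fft.rfft(x)
    return np.fft.irfft(X * np.exp(1j * np.pi), n=len(x))

def allpass_decorrelate(x, a=0.5):
    """First-order all-pass H(z) = (a + z^-1)/(1 + a*z^-1): flat magnitude,
    frequency-dependent phase, no time-to-frequency conversion needed."""
    return lfilter([a, 1.0], [1.0, a], x)
```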


3) Lateralized Signals: A delay and/or attenuation is applied to the speech in either the primary or secondary device so that the speech is perceived as coming from the side that did not receive the delay and/or attenuation. Typical values include, but are not limited to, a one millisecond delay and a one decibel attenuation. Typical ranges of delay include, but are not limited to, 0.3 milliseconds to 10 milliseconds. One such other range is 0.2 milliseconds to 5 milliseconds. Typical attenuation ranges include, but are not limited to, 1 decibel to 6 decibels. One such other range is 1 decibel to 10 decibels. Other delays and attenuations may be used without departing from the scope of the present subject matter. A listener may prefer, for example, a one millisecond delay and a one decibel attenuation, since speech from, for example, a cell phone is normally heard in one ear; the perceived sound will then be in one ear, yet retain the benefits of a binaural signal to the listener. In various embodiments, the attenuations and delays are programmed by the dispensing professional using hearing aid fitting software, so different patients could have different parameters set according to their preference. Some patients may prefer diffuse sound, some may prefer sound to their left, some may prefer sound to their right, etc.
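
A minimal sketch, assuming sampled audio, of lateralizing by delaying and attenuating the secondary-side signal with the typical one millisecond and one decibel values mentioned above; the function and parameter names are assumptions:

```python
import numpy as np

def lateralize(mono, fs, delay_ms=1.0, atten_db=1.0):
    """Return (primary, secondary); the secondary copy is delayed and
    attenuated so the sound is perceived toward the primary side."""
    n = int(round(delay_ms * 1e-3 * fs))
    primary = np.asarray(mono, dtype=float)
    secondary = np.concatenate([np.zeros(n), primary])[:len(primary)]
    secondary = secondary * 10.0 ** (-atten_db / 20.0)   # decibel attenuation
    return primary, secondary
```

In a fitting-software setting, the delay_ms and atten_db values could simply be the per-patient parameters described above.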


The wearer's voice in various embodiments can be transmitted back to the wireless device. For example, in cases where the wireless device is a cell phone and the primary and secondary wireless hearing assistance devices are hearing aids, it is understood that the communications back to the cell phone by the aids include:


1) In one embodiment, the primary device (e.g., hearing aid) paired with the wireless device (e.g., cell phone) transmits the wearer's voice back to the wireless device (cell phone) and does not transmit this to the secondary device (e.g., other hearing aid). Thus, no voice pickup is used by the secondary device and no transmission of the wearer's voice is made from the secondary device to the primary device.


2) In one embodiment, the secondary device (e.g., other hearing aid) does transmit audio to the primary device (e.g., hearing aid paired with the cell phone).


In varying embodiments, the signals picked up by the primary device and secondary device can be processed in a variety of ways. One such way is to create a beamformed signal, with improved overall signal-to-noise ratio, that is transmitted back to the wireless device (e.g., cell phone). A delay would be added to the primary voice-pickup signal before effective combination with the secondary voice signal. Such a system can steer the beam to a location orthogonal to the axis formed by a line connecting the primary and secondary devices, i.e., the direction of maximum sensitivity of the beamformed signal can be set at the location of the wearer's mouth. In addition to beamforming, noise cancellation of uncorrelated noise sources can be accomplished. In one application, such cancellation can take place in the primary device prior to transmission to the wireless device. These techniques improve the signal-to-noise ratio and quality of the signal received by a person listening to the signals from the wireless device (e.g., a person at the other end of the communication, for example, at another telephone).
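
A minimal delay-and-sum sketch, offered as an assumption about one way to combine the two voice pickups rather than as the disclosed algorithm; the alignment delay is whatever amount time-aligns the primary pickup with the (for example, relayed) secondary pickup:

```python
import numpy as np

def delay_and_sum(primary_mic, secondary_mic, fs, align_delay_ms):
    """Time-align the primary voice pickup with the secondary pickup, then
    average them; the wearer's voice adds coherently while uncorrelated
    noise is reduced."""
    n = int(round(align_delay_ms * 1e-3 * fs))
    delayed = np.concatenate([np.zeros(n), primary_mic])[:len(primary_mic)]
    return 0.5 * (delayed + np.asarray(secondary_mic, dtype=float))
```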


It is understood that the present phase shifter could be replaced with a processor offering a head-related transfer function (HRTF), which performs phase and level changes as a function of frequency that are specific to the acoustic transfer function from a free-field source to the ear of the listener. Such processing could be accomplished using a digital signal processor or other dedicated processor.
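
As a hedged illustration, HRTF processing of the monaural signal could amount to filtering it with a measured head-related impulse response for each ear; the HRIR data and the names used here are assumptions:

```python
import numpy as np

def apply_hrtf(mono, hrir_left, hrir_right):
    """Filter the monaural signal with each ear's head-related impulse
    response, producing frequency-dependent interaural phase and level."""
    left = np.convolve(mono, hrir_left)[:len(mono)]
    right = np.convolve(mono, hrir_right)[:len(mono)]
    return left, right
```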


It is understood that the examples set forth herein can be applied to a variety of wireless devices and primary and secondary device combinations. Thus, the examples set forth herein are not limited to telephone applications. It is further understood that the wireless devices set forth herein can be applied to right and left hearing applications as desired by the user and are not limited to any one direction of operation.


This description has set forth numerous characteristics and advantages of various embodiments and details of structure and function of various embodiments, but it is intended to be illustrative and not exclusive or exhaustive. Changes in detail, materials and arrangement of parts, and order of process and design may occur without departing from the scope of the appended claims and their legal equivalents.

Claims
  • 1. A method for providing sound to first and second ears of a wearer of first and second hearing assistance devices, comprising: deriving a first sound and a second sound from a monaural information signal provided by a wireless device wirelessly coupled to the first and second hearing assistance devices using radio communication, the second sound having a controllable relative phase with respect to the first sound; receiving a control signal; controlling adjustment of the relative phase as a function of the control signal, including discontinuing the adjustment of the relative phase in response to the control signal indicating presence of a predetermined type sound or condition in proximity to the first and second hearing assistance devices; and presenting the first sound to the first ear using the first hearing assistance device and the second sound to the second ear using the second hearing assistance device to provide binaural sound to the wearer.
  • 2. The method of claim 1, wherein receiving the control signal comprises receiving the control signal from a device external to the first and second hearing assistance devices.
  • 3. The method of claim 1, wherein receiving the control signal comprises receiving a signal indicating a telephone in proximity to the first and second hearing assistance devices or ringing.
  • 4. The method of claim 3, comprising lateralizing the binaural sound perceived by the wearer in response to the control signal indicating the telephone in proximity to the first and second hearing assistance devices.
  • 5. The method of claim 3, comprising discontinuing the adjustment of the relative phase in response to the control signal indicating the telephone ringing.
  • 6. The method of claim 1, comprising discontinuing the adjustment of the relative phase in response to the control signal indicating detection of an emergency vehicle in proximity to the first and second hearing assistance devices.
  • 7. The method of claim 1, comprising discontinuing the adjustment of the relative phase in response to the control signal indicating detection of a siren in proximity to the first and second hearing assistance devices.
  • 8. The method of claim 1, wherein deriving the first sound and the second sound comprises: applying phase shift to the monaural information signal; and setting the first sound and the second sound to levels different from each other.
  • 9. The method of claim 8, comprising producing a first monaural signal and a second monaural signal by applying the phase shift to the monaural information signal using the wireless device, transmitting the first monaural signal from the wireless device to the first hearing assistance device, transmitting the second monaural signal from the wireless device to the second hearing assistance device, producing the first sound using the first monaural signal using the first hearing assistance device, and producing the second sound using the second monaural signal using the second hearing assistance device.
  • 10. The method of claim 8, comprising transmitting the monaural information signal from the wireless device to the first hearing assistance device, and wherein applying the phase shift to the monaural information signal includes applying the phase shift to the monaural information signal using the first hearing assistance device.
  • 11. An apparatus for providing sound to first and second ears of a wearer using a wireless transmitter, comprising: a pair of hearing assistance devices configured to receive radio signals including a monaural information signal from the wireless transmitter, apply phase shift to the monaural information signal to produce a first monaural signal and a second monaural signal with a controllable interaural phase difference between the first and second monaural signals, provide a first sound based on the first monaural signal to the first ear of the wearer and a second sound based on the second monaural signal to the second ear of the wearer to provide binaural sound to the wearer, receive a control signal, and control settings for adjustment of the phase shift using the control signal including lateralizing the binaural sound perceived by the wearer or discontinuing the adjustment of the phase shift in response to the control signal indicating presence of a predetermined type sound or condition in proximity to the pair of hearing assistance devices.
  • 12. The apparatus of claim 11, wherein the pair of hearing assistance devices is configured to lateralize the binaural sound perceived by the wearer in response to the control signal indicating the telephone in proximity to the pair of hearing assistance devices.
  • 13. The apparatus of claim 11, wherein the pair of hearing assistance devices is configured to discontinue the adjustment of the relative phase in response to the control signal indicating the telephone ringing.
  • 14. The apparatus of claim 11, wherein the pair of hearing assistance devices is configured to discontinue the adjustment of the relative phase in response to the control signal indicating detection of an emergency vehicle in proximity to the pair of hearing assistance devices.
  • 15. The apparatus of claim 11, wherein the pair of hearing assistance devices is configured to discontinue the adjustment of the relative phase in response to the control signal indicating detection of a siren in proximity to the first hearing assistance device.
  • 16. The apparatus of claim 11, wherein the pair of hearing assistance devices is further configured to apply level shift to at least the first monaural signal such that the first monaural signal and the second monaural signal are produced with a controllable interaural level difference between the first and second monaural signals.
  • 17. The apparatus of claim 16, wherein the pair of hearing assistance devices comprises a first hearing assistance to provide the first sound to the first ear of the wearer and a second hearing assistance to provide the second sound to the second ear of the wearer, the first hearing assistance device including a first radio receiver configured to receive first signals of the radio signals, a first signal processor configured to apply the phase shift, and a first speaker configured to provide the first sound to the first ear, the second hearing assistance device includes a second radio receiver to receive second signals, a second signal processor, and a second speaker to provide the second sound to the second ear.
  • 18. The apparatus of claim 17, wherein the second radio receiver is configured to receive the second signals from the wireless device.
  • 19. The apparatus of claim 18, wherein the second radio receiver is further configured to receive the second signals from the first hearing assistance device.
  • 20. The apparatus of claim 17, wherein the second radio receiver is configured to receive the second signals from the first hearing assistance device.
CLAIM OF PRIORITY

This application is a continuation of and claims the benefit of priority under 35 U.S.C. §120 to U.S. patent application Ser. No. 11/456,538, filed on Jul. 10, 2006, which is incorporated herein by reference in its entirety.

US Referenced Citations (230)
Number Name Date Kind
2530621 Lybarger Nov 1950 A
2554834 Lavery May 1951 A
2656421 Lybarger Oct 1953 A
3396245 Flygstad Aug 1968 A
3527901 Geib Sep 1970 A
3571514 Wruk Mar 1971 A
3660695 Schmitt May 1972 A
3742359 Behymer Jun 1973 A
3770911 Knowles et al. Nov 1973 A
3798390 Gage et al. Mar 1974 A
3836732 Johanson et al. Sep 1974 A
3875349 Ruegg Apr 1975 A
3894196 Briskey Jul 1975 A
3946168 Preves Mar 1976 A
3975599 Johanson Aug 1976 A
4051330 Cole Sep 1977 A
4142072 Berland Feb 1979 A
4187413 Moser Feb 1980 A
4366349 Adelman Dec 1982 A
4395601 Kopke et al. Jul 1983 A
4396806 Anderson Aug 1983 A
4419544 Adelman Dec 1983 A
4425481 Mansgold et al. Jan 1984 A
4449018 Stanton May 1984 A
4456795 Saito Jun 1984 A
4467145 Borstel Aug 1984 A
4471490 Bellafiore Sep 1984 A
4489330 Marutake et al. Dec 1984 A
4490585 Tanaka Dec 1984 A
4508940 Steeger Apr 1985 A
4596899 Wojcik et al. Jun 1986 A
4622440 Slavin Nov 1986 A
4631419 Sadamatsu et al. Dec 1986 A
4637402 Adelman Jan 1987 A
4638125 Buettner Jan 1987 A
4696032 Levy Sep 1987 A
4710961 Buttner Dec 1987 A
4712244 Zwicker et al. Dec 1987 A
4723293 Harless Feb 1988 A
4751738 Widrow et al. Jun 1988 A
4756312 Epley Jul 1988 A
4764957 Angelini et al. Aug 1988 A
4845755 Busch et al. Jul 1989 A
4862509 Towsend Aug 1989 A
4882762 Waldhauer Nov 1989 A
4887299 Cummins et al. Dec 1989 A
4926464 Schley-May May 1990 A
4930156 Norris May 1990 A
4995085 Kern et al. Feb 1991 A
5010575 Marutake et al. Apr 1991 A
5027410 Williamson et al. Jun 1991 A
5029215 Miller, II Jul 1991 A
5086464 Groppe Feb 1992 A
5091952 Williamson et al. Feb 1992 A
5157405 Wycoff et al. Oct 1992 A
5189704 Krauss Feb 1993 A
5204917 Arndt et al. Apr 1993 A
5212827 Meszko et al. May 1993 A
5214709 Ribic May 1993 A
5226087 Ono et al. Jul 1993 A
5280524 Norris Jan 1994 A
5289544 Franklin Feb 1994 A
5390254 Adelman Feb 1995 A
5404407 Weiss Apr 1995 A
5422628 Rodgers Jun 1995 A
5425104 Shennib Jun 1995 A
5426689 Griffith et al. Jun 1995 A
5434924 Jampolsky Jul 1995 A
5463692 Fackler Oct 1995 A
5479522 Lindemann et al. Dec 1995 A
5483599 Zagorski Jan 1996 A
5502769 Gilbertson Mar 1996 A
5524056 Killion et al. Jun 1996 A
5553152 Newton Sep 1996 A
5581747 Anderson Dec 1996 A
5600728 Satre Feb 1997 A
5629985 Thompson May 1997 A
5636285 Sauer Jun 1997 A
5640293 Dawes et al. Jun 1997 A
5640457 Gnecco et al. Jun 1997 A
5651071 Lindemann et al. Jul 1997 A
5659621 Newton Aug 1997 A
5687242 Iburg Nov 1997 A
5706351 Weinfurtner Jan 1998 A
5710820 Martin et al. Jan 1998 A
5721783 Anderson Feb 1998 A
5734976 Bartschi et al. Mar 1998 A
5737430 Widrow Apr 1998 A
5740257 Marcus Apr 1998 A
5751820 Taenzer May 1998 A
5757932 Lindemann et al. May 1998 A
5757933 Preves et al. May 1998 A
5761319 Dar et al. Jun 1998 A
5768397 Fazio Jun 1998 A
5793875 Lehr et al. Aug 1998 A
5796848 Martin Aug 1998 A
5809151 Husung Sep 1998 A
5822442 Agnew et al. Oct 1998 A
5823610 Ryan et al. Oct 1998 A
5825631 Prchal Oct 1998 A
5835610 Ishige et al. Nov 1998 A
5835611 Kaiser et al. Nov 1998 A
5852668 Ishige et al. Dec 1998 A
5862238 Agnew et al. Jan 1999 A
5966639 Goldberg et al. Oct 1999 A
5991419 Brander Nov 1999 A
5991420 Stern Nov 1999 A
6031922 Tibbetts Feb 2000 A
6031923 Gnecco et al. Feb 2000 A
6041129 Adelman Mar 2000 A
6078675 Bowen-Nielsen et al. Jun 2000 A
6078825 Hahn et al. Jun 2000 A
6088339 Meyer Jul 2000 A
6101258 Killion et al. Aug 2000 A
6104821 Husung Aug 2000 A
6115478 Schneider Sep 2000 A
6118877 Lindemann et al. Sep 2000 A
6144748 Kerns Nov 2000 A
6148087 Martin Nov 2000 A
6157727 Rueda Dec 2000 A
6157728 Tong et al. Dec 2000 A
6175633 Morrill et al. Jan 2001 B1
6216040 Harrison Apr 2001 B1
6230029 Hahn et al. May 2001 B1
6236731 Brennan et al. May 2001 B1
6240192 Brennan et al. May 2001 B1
6240194 De Koning May 2001 B1
6310556 Green et al. Oct 2001 B1
6311155 Vaudrey et al. Oct 2001 B1
6324291 Weidner Nov 2001 B1
6327370 Killion et al. Dec 2001 B1
6347148 Brennan et al. Feb 2002 B1
6356741 Bilotti et al. Mar 2002 B1
6366863 Bye et al. Apr 2002 B1
6381308 Cargo et al. Apr 2002 B1
6389142 Hagen et al. May 2002 B1
6449662 Armitage Sep 2002 B1
6459882 Palermo et al. Oct 2002 B1
6466679 Husung Oct 2002 B1
6522764 Bogeskov-Jensen Feb 2003 B1
6549633 Westermann Apr 2003 B1
6633645 Bren et al. Oct 2003 B2
6694034 Julstrom et al. Feb 2004 B2
6760457 Bren et al. Jul 2004 B1
7016511 Shennib Mar 2006 B1
7062223 Gerber et al. Jun 2006 B2
7075903 Solum Jul 2006 B1
7099486 Julstrom et al. Aug 2006 B2
7103191 Killion Sep 2006 B1
7116792 Taenzer et al. Oct 2006 B1
7139404 Feeley et al. Nov 2006 B2
7142814 Nassimi Nov 2006 B2
7149552 Lair Dec 2006 B2
7162381 Boor et al. Jan 2007 B2
7181032 Jakob et al. Feb 2007 B2
7248713 Bren et al. Jul 2007 B2
7257372 Kaltenbach et al. Aug 2007 B2
7317997 Boor et al. Jan 2008 B2
7369669 Hagen et al. May 2008 B2
7412294 Woolfork Aug 2008 B1
7447325 Bren et al. Nov 2008 B2
7450078 Knudsen et al. Nov 2008 B2
7529565 Hilpisch et al. May 2009 B2
7561707 Kornagel Jul 2009 B2
7590253 Killion Sep 2009 B2
7813762 Sanguino et al. Oct 2010 B2
7822217 Hagen et al. Oct 2010 B2
8041066 Solum Oct 2011 B2
8208642 Edwards Jun 2012 B2
8515114 Solum Aug 2013 B2
8737653 Woods May 2014 B2
20010007050 Adelman Jul 2001 A1
20010007335 Tuttle et al. Jul 2001 A1
20020006206 Scofield Jan 2002 A1
20020030871 Anderson et al. Mar 2002 A1
20020076073 Taenzer et al. Jun 2002 A1
20020090099 Hwang Jul 2002 A1
20020131614 Jakob et al. Sep 2002 A1
20020132585 Palermo et al. Sep 2002 A1
20020186857 Bren et al. Dec 2002 A1
20030045283 Hagedoorn Mar 2003 A1
20030059073 Bren et al. Mar 2003 A1
20030059076 Martin Mar 2003 A1
20030078071 Uchiyama Apr 2003 A1
20030133582 Niederdrank Jul 2003 A1
20030215106 Hagen et al. Nov 2003 A1
20030231783 Kah Dec 2003 A1
20040010181 Feeley et al. Jan 2004 A1
20040052391 Bren et al. Mar 2004 A1
20040052392 Sacha et al. Mar 2004 A1
20040077387 Sayag et al. Apr 2004 A1
20040136555 Enzmann Jul 2004 A1
20040141628 Villaverde et al. Jul 2004 A1
20040208333 Cheung et al. Oct 2004 A1
20050008178 Joergensen et al. Jan 2005 A1
20050078844 Von Ilberg Apr 2005 A1
20050099341 Zhang et al. May 2005 A1
20050100182 Sykes et al. May 2005 A1
20050160270 Golberg et al. Jul 2005 A1
20050249371 Vogt Nov 2005 A1
20060013420 Sacha Jan 2006 A1
20060018497 Kornagel Jan 2006 A1
20060039577 Sanguino et al. Feb 2006 A1
20060044140 Berg Mar 2006 A1
20060057973 Wikel et al. Mar 2006 A1
20060068842 Sanguino et al. Mar 2006 A1
20060093172 Ludvigsen et al. May 2006 A1
20060193273 Passier et al. Aug 2006 A1
20060193375 Lee Aug 2006 A1
20060198529 Kjems et al. Sep 2006 A1
20060205349 Passier et al. Sep 2006 A1
20060245611 Jorgensen et al. Nov 2006 A1
20060274747 Duchscher et al. Dec 2006 A1
20070004464 Lair et al. Jan 2007 A1
20070080889 Zhang Apr 2007 A1
20070121975 Sacha et al. May 2007 A1
20070149261 Huddart Jun 2007 A1
20070230727 Sanguino et al. Oct 2007 A1
20070248237 Bren et al. Oct 2007 A1
20080008341 Edwards Jan 2008 A1
20080013769 Sacha et al. Jan 2008 A1
20080159548 Solum Jul 2008 A1
20080232623 Solum et al. Sep 2008 A1
20080273727 Hagen et al. Nov 2008 A1
20080306745 Roy et al. Dec 2008 A1
20110090837 Duchscher et al. Apr 2011 A1
20110158442 Woods Jun 2011 A1
20120121094 Solum May 2012 A1
20140177885 Solum Jun 2014 A1
20140348359 Woods Nov 2014 A1
Foreign Referenced Citations (40)
Number Date Country
670349 May 1989 CH
673551 Mar 1990 CH
2510731 Sep 1976 DE
3036417 May 1982 DE
3443907 Jun 1985 DE
10146886 Apr 2003 DE
0789474 Aug 1997 EP
0941014 Sep 1999 EP
0989775 Mar 2000 EP
1185138 Mar 2002 EP
1196008 Apr 2002 EP
1365628 Nov 2003 EP
1398995 Mar 2004 EP
1174003 Jul 2004 EP
1484942 Dec 2004 EP
1519625 Mar 2005 EP
1531650 May 2005 EP
1670283 Jun 2006 EP
1715718 Oct 2006 EP
1365628 Dec 2011 EP
1879426 Aug 2013 EP
2714561 Jun 1995 FR
918998 Jan 1997 JP
10084209 Mar 1998 JP
WO-9641498 Dec 1996 WO
WO-0021332 Apr 2000 WO
WO-0158064 Aug 2001 WO
WO-0167433 Sep 2001 WO
WO-0203750 Jan 2002 WO
WO-0209363 Jan 2002 WO
WO-0223950 Mar 2002 WO
WO-2004034738 Apr 2004 WO
WO-2004100607 Nov 2004 WO
WO-2004110099 Dec 2004 WO
WO-2005009072 Jan 2005 WO
WO-2005101731 Oct 2005 WO
WO-2006023857 Mar 2006 WO
WO-2006023920 Mar 2006 WO
WO-2006078586 Jul 2006 WO
WO-2006133158 Dec 2006 WO
Non-Patent Literature Citations (189)
Entry
US 8,175,281, 05/2012, Edwards (withdrawn)
“U.S. Appl. No. 12/649,648 , Response filed Nov. 13, 2013 to Final Office Action mailed Sep. 13, 2013”, 9 pgs.
“U.S. Appl. No. 12/649,648, Final Office Action mailed Sep. 13, 2013”, 16 pgs.
“U.S. Appl. No. 12/649,648, Notice of Allowance mailed Nov. 22, 2013”, 7 pgs.
“U.S. Appl. No. 13/970,368, Preliminary Amendment mailed Mar. 6, 2014”, 6 pgs.
“U.S. Appl. No. 14/188,104, Non Final Office Action mailed Nov. 10, 2014”, 9 pgs.
“European Application Serial No. 03253052.9, Communication of Notice of Opposition mailed Sep. 24, 2012”, 22 pgs.
“European Application Serial No. 03253052.9, Communication of Notice of Opposition mailed Oct. 23, 2012”, 1 pgs.
“European Application Serial No. 03253052.9, EPO Brief Communication mailed Oct. 17, 2014”, 6 pgs.
“European Application Serial No. 03253052.9, Response filed May 2, 2013 to Notice of Opposition mailed Sep. 24, 2012”, (May 2, 2013), 36 pgs.
“European Application Serial No. 03253052.9, Summons to Attend Oral Proceedings Mailed Mar. 13, 2014”, 7 pgs.
“European Application Serial No. 03253052.9, Written Submission filed Oct. 13, 2014”, 12 pgs.
“European Application Serial No. 07254947.0, Summons to Attend Oral Proceedings mailed Nov. 7, 2014”, 3 pgs.
“European Application Serial No. 10252192.9, Response filed Jul. 18, 2013 to Extended European Search Report mailed Jan. 2, 2013”, (Jul. 18, 2013).
“U.S. Appl. No. 09/052,631, Final Office Action mailed Jul. 11, 2000”, 8 pgs.
“U.S. Appl. No. 09/052,631, Final Office Action mailed Jul. 30, 2001”, 5 pgs.
“U.S. Appl. No. 09/052,631, Non Final Office Action mailed Jan. 18, 2001”, 6 pgs.
“U.S. Appl. No. 09/052,631, Non Final Office Action mailed Dec. 28, 1999”, 10 pgs.
“U.S. Appl. No. 09/052,631, Notice of Allowance mailed Dec. 18, 2001”, 6 pgs.
“U.S. Appl. No. 09/052,631, Response filed May 18, 2001 to Non Final Office Action mailed Jan. 18, 2001”, 7 pgs.
“U.S. Appl. No. 09/052,631, Response filed Oct. 30, 2001 to Final Office Action mailed Jul. 30, 2001”, 5 pgs.
“U.S. Appl. No. 09/052,631, Response filed Nov. 10, 2000 to Final Office Action mailed Jul. 11, 2000”, 5 pgs.
“U.S. Appl. No. 09/659,214, Advisory Action mailed Jun. 2, 2003”, 3 pgs.
“U.S. Appl. No. 09/659,214, Final Office Action mailed Feb. 14, 2003”, 7 pgs.
“U.S. Appl. No. 09/659,214, Final Office Action mailed Mar. 19, 2003”, 7 pgs.
“U.S. Appl. No. 09/659,214, Non Final Office Action mailed Jul. 18, 2003”, 7 pgs.
“U.S. Appl. No. 09/659,214, Non Final Office Action mailed Sep. 6, 2002”, 7 pgs.
“U.S. Appl. No. 09/659,214, Notice of Allowance mailed Feb. 10, 2004”, 6 pgs.
“U.S. Appl. No. 09/659,214, Response filed May 19, 2003 to Final Office Action mailed Mar. 19, 2003”, 9 pgs.
“U.S. Appl. No. 09/659,214, Response filed Oct. 24, 2003 to Non Final Office Action mailed Jul. 18, 2003”, 9 pgs.
“U.S. Appl. No. 09/659,214, Response filed Nov. 12, 2002 to Non Final Office Action mailed Sep. 6, 2002”, 10 pgs.
“U.S. Appl. No. 10/146,536, Advisory Action mailed Oct. 16, 2007”, 5 pgs.
“U.S. Appl. No. 10/146,536, Final Office Action mailed May 18, 2007”, 28 pgs.
“U.S. Appl. No. 10/146,536, Non-Final Office Action mailed Sep. 19, 2006”, 26 pgs.
“U.S. Appl. No. 10/146,536, Non-Final Office Action mailed Dec. 16, 2005”, 25 pgs.
“U.S. Appl. No. 10/146,536, Notice of Allowance mailed Dec. 27, 2007”, 10 pgs.
“U.S. Appl. No. 10/146,536, Response filed Feb. 20, 2007 to Non-Final Office Action mailed Sep. 19, 2006”, 20 pgs.
“U.S. Appl. No. 10/146,536, Response filed Jun. 16, 2006 to Non-Final Office Action mailed Dec. 16, 2005”, 14 pgs.
“U.S. Appl. No. 10/146,536, Response filed Nov. 19, 2007 to Final Office Action mailed May 18, 2007”, 19 pgs.
“U.S. Appl. No. 10/146,536, Response filed Sep. 18, 2007 to Final Office Action dated Jun. 18, 2007”, 24 pgs.
“U.S. Appl. No. 10/214,045, 312 Amendment filed Jun. 12, 2003”, 6 pgs.
“U.S. Appl. No. 10/214,045, Non Final Office Action mailed Dec. 2, 2002”, 7 pgs.
“U.S. Appl. No. 10/214,045, Notice of Allowance mailed Apr. 8, 2003”, 17 pgs.
“U.S. Appl. No. 10/214,045, Response filed Apr. 2, 2002 to Non Final Office Action mailed Dec. 2, 2002”, 8 pgs.
“U.S. Appl. No. 10/243,412, Examiner Interview Summary mailed Mar. 9, 2006”, 7 pgs.
“U.S. Appl. No. 10/243,412, Final Office Action mailed Jan. 9, 2008”, 6 pgs.
“U.S. Appl. No. 10/243,412, Non Final Office Action mailed May 17, 2007”, 10 pgs.
“U.S. Appl. No. 10/243,412, Non Final Office Action mailed Jul. 28, 2006”, 10 pgs.
“U.S. Appl. No. 10/243,412, Notice of Allowance mailed Jun. 30, 2008”, 8 pgs.
“U.S. Appl. No. 10/243,412, Response filed Jan. 16, 2006 to Restriction Requirement mailed Dec. 16, 2005”, 12 pgs.
“U.S. Appl. No. 10/243,412, Response filed May 9, 2008 to Non-Final Office Action mailed Jan. 9, 2008”, 12 pgs.
“U.S. Appl. No. 10/243,412, Response filed Sep. 17, 2007 to Non Final Office Action mailed May 17, 2007”, 15 pgs.
“U.S. Appl. No. 10/243,412, Response filed Dec. 28, 2006 to Non Final Office Action mailed Jul. 28, 2006”, 16 pgs.
“U.S. Appl. No. 10/243,412, Restriction Requirement mailed Dec. 16, 2005”, 5 pgs.
“U.S. Appl. No. 10/244,295, Final Office Action mailed Aug. 11, 2006”, 9 pgs.
“U.S. Appl. No. 10/244,295, Final Office Action mailed May 24, 2007”, 11 pgs.
“U.S. Appl. No. 10/244,295, Non Final Office Action mailed Mar. 11, 2005”, 10 pgs.
“U.S. Appl. No. 10/244,295, Non Final Office Action mailed Nov. 29, 2006”, 12 pgs.
“U.S. Appl. No. 10/244,295, Non Final Office Action mailed Feb. 3, 2006”, 9 pgs.
“U.S. Appl. No. 10/244,295, Notice of Allowance mailed Aug. 7, 2007”, 7 pgs.
“U.S. Appl. No. 10/244,295, Response filed Feb. 28, 2007 to Non Final Office Action mailed Nov. 29, 2006”, 16 pgs.
“U.S. Appl. No. 10/244,295, Response filed May 3, 1920 to Non-Final Office Action mailed Feb. 3, 2006”, 17 pgs.
“U.S. Appl. No. 10/244,295, Response filed Jun. 13, 2005 to Non-Final Office Action mailed Mar. 11, 2005”, 20 pgs.
“U.S. Appl. No. 10/244,295, Response filed Jul. 24, 2007 to Final Office Action mailed May 24, 2007”, 12 pgs.
“U.S. Appl. No. 10/244,295, Response filed Oct. 11, 2006 Final Office Action mailed Aug. 11, 2006”, 17 pgs.
“U.S. Appl. No. 10/284,877, Final Office Action mailed Jun. 14, 2006”, 11 pgs.
“U.S. Appl. No. 10/284,877, Final Office Action mailed Nov. 14, 2006”, 11 pgs.
“U.S. Appl. No. 10/284,877, Non Final Office Action mailed Mar. 25, 2005”, 8 pgs.
“U.S. Appl. No. 10/284,877, Non Final Office Action mailed Dec. 1, 2005”, 10 pgs.
“U.S. Appl. No. 10/284,877, Notice of Allowance mailed Mar. 22, 2007”, 7 pgs.
“U.S. Appl. No. 10/284,877, Response filed Mar. 1, 2006 to Non Final Office Action mailed Dec. 1, 2005”, 17 pgs.
“U.S. Appl. No. 10/284,877, Response filed Mar. 14, 2007 to Final Office Action mailed Nov. 14, 2007”, 8 pgs.
“U.S. Appl. No. 10/284,877, Response filed Jun. 27, 2005 to Non Final Office Action mailed Mar. 25, 2005”, 15 pgs.
“U.S. Appl. No. 10/284,877, Response filed Oct. 16, 2006 to Final Office Action mailed Jun. 14, 2006”, 16 pgs.
“U.S. Appl. No. 11/207,555, Final Office Action mailed Jan. 22, 2009”, 15 pgs.
“U.S. Appl. No. 11/207,555, Final Office Action mailed Feb. 4, 2010”, 13 pgs.
“U.S. Appl. No. 11/207,555, Non-Final Office Action mailed Jun. 3, 2008”, 12 pgs.
“U.S. Appl. No. 11/207,555, Non-Final Office Action mailed Jul. 16, 2009”, 12 pgs.
“U.S. Appl. No. 11/207,555, Response filed Jun. 22, 2009 to Final Office Action mailed Jan. 22, 2009”, 9 pgs.
“U.S. Appl. No. 11/207,555, Response filed Nov. 3, 2008 to Non Final Office Action mailed Jun. 3, 2008”, 8 pgs.
“U.S. Appl. No. 11/207,555, Response filed Nov. 16, 2009 to Non-Final Office Action mailed Jul. 16, 2009”, 8 pgs.
“U.S. Appl. No. 11/207,591, Final Office Action mailed Jan. 6, 2009”, 13 pgs.
“U.S. Appl. No. 11/207,591, Final Office Action mailed Jan. 15, 2010”, 13 pgs.
“U.S. Appl. No. 11/207,591, Non-Final Office Action mailed Jul. 14, 2009”, 13 pgs.
“U.S. Appl. No. 11/207,591, Non-Final Office Action mailed Jul. 28, 2008”, 11 pgs.
“U.S. Appl. No. 11/207,591, Non-Final Office Action mailed Nov. 16, 2007”, 9 pgs.
“U.S. Appl. No. 11/207,591, Response filed May 6, 2008 to Non Final Office Action mailed Nov. 16, 2007”, 8 pgs.
“U.S. Appl. No. 11/207,591, Response filed May 6, 2009 to Final Office Action mailed Jan. 6, 2009”, 8 pgs.
“U.S. Appl. No. 11/207,591, Response filed Oct. 14, 2009 to Non Final Office Action mailed Jul. 14, 2009”, 10 pgs.
“U.S. Appl. No. 11/207,591, Response filed Oct. 28, 2008 to Non Final Office Action mailed Jul. 28, 2008”, 7 pgs.
“U.S. Appl. No. 11/207,591, Notice of Allowance mailed Jul. 1, 2010”, 7 pgs.
“U.S. Appl. No. 11/207,591, Response filed Jun. 15, 2010 to Final Office Action mailed Jan. 15, 2010”, 9 pgs.
“U.S. Appl. No. 11/447,617, Final Office Action mailed Mar. 3, 2010”, 31 pgs.
“U.S. Appl. No. 11/447,617, Non Final Office Action mailed Aug. 31, 2011”, 29 pgs.
“U.S. Appl. No. 11/447,617, Non-Final Office Action mailed Jun. 22, 2009”, 25 pgs.
“U.S. Appl. No. 11/447,617, Response filed May 26, 2009 to Restriction Requirement mailed Apr. 24, 2009”, 8 pgs.
“U.S. Appl. No. 11/447,617, Response filed Aug. 3, 2010 to Final Office Action mailed Mar. 3, 2010”, 14 pgs.
“U.S. Appl. No. 11/447,617, Response filed Nov. 23, 2009 to Non Final Office Action mailed Jun. 22, 2009”, 15 pgs.
“U.S. Appl. No. 11/447,617, Restriction Requirement mailed Apr. 24, 2009”, 6 pgs.
“U.S. Appl. No. 11/456,538, Final Office Action mailed Mar. 3, 2011”, 28 pgs.
“U.S. Appl. No. 11/456,538, Non-Final Office Action mailed Aug. 19, 2010”, 25 pgs.
“U.S. Appl. No. 11/456,538, Notice of Allowance mailed Apr. 5, 2012”, 10 pgs.
“U.S. Appl. No. 11/456,538, Notice of Allowance mailed May 16, 2012”, 10 pgs.
“U.S. Appl. No. 11/456,538, Notice of Allowance mailed Dec. 19, 2011”, 9 pgs.
“U.S. Appl. No. 11/456,538, Response filed Jan. 19, 2011 to Non Final Office Action mailed Aug. 19, 2010”, 16 pgs.
“U.S. Appl. No. 11/456,538, Response filed Aug. 5, 2011 to Final Office Action mailed Mar. 3, 2011”, 15 pgs.
“U.S. Appl. No. 11/619,541, Non Final Office Action mailed Dec. 21, 2010”, 7 pgs.
“U.S. Appl. No. 11/619,541, Notice of Allowance mailed Jul. 5, 2011”, 6 pgs.
“U.S. Appl. No. 11/619,541, Response filed May 23, 2011 to Non Final Office Action mailed Dec. 21, 2010”, 10 pgs.
“U.S. Appl. No. 11/692,763, Non-Final Office Action mailed Jan. 21, 2010”, 11 pgs.
“U.S. Appl. No. 11/692,763, Response filed Jun. 21, 2010 to Non Final Office Action mailed Jan. 21, 2010”, 9 pgs.
“U.S. Appl. No. 12/115,423, Notice of Allowance mailed Sep. 15, 2010”, 9 pgs.
“U.S. Appl. No. 12/649,648, Response filed Jun. 5, 2013 to Non Final Office Action mailed Mar. 5, 2013”, 9 pgs.
“U.S. Appl. No. 12/649,648, Non Final Office Action mailed Mar. 5, 2013”, 15 pgs.
“U.S. Appl. No. 12/980,696, Non Final Office Action mailed Apr. 20, 2011”, 7 pgs.
“U.S. Appl. No. 13/270,860, Non Final Office Action mailed Dec. 18, 2012”, 5 pgs.
“U.S. Appl. No. 13/270,860, Notice of Allowance mailed Apr. 17, 2013”, 10 pgs.
“U.S. Appl. No. 13/270,860, Preliminary Amendment filed Jan. 27, 2012”, 7 pgs.
“U.S. Appl. No. 13/270,860, Response filed Mar. 18, 2013 to Non Final Office Action mailed Dec. 18, 2012”, 7 pgs.
“Canadian Application Serial No. 2,428,908, Office action mailed Mar. 15, 2007”, 6 pgs.
“Canadian Application Serial No. 2,428,908, Office action mailed Nov. 4, 2008”, 9 pgs.
“Canadian Application Serial No. 2,428,908, Response filed Sep. 17, 2007 to Office Action mailed Mar. 15, 2007”, 25 pgs.
“Chinese Application Serial No. 200680028085.8, Office Action mailed Apr. 12, 2011”, 3 pgs.
“European Application Serial No. 05791651.2, Office Action mailed Mar. 15, 2011”, 5 pgs.
“European Application Serial No. 03253052.9, European Search Report mailed Nov. 24, 2005”, 2 pgs.
“European Application Serial No. 03253052.9, Office Action mailed Mar. 26, 2009”, 3 pgs.
“European Application Serial No. 03253052.9, Response filed Oct. 5, 2009 to Office Action mailed Mar. 26, 2009”, 25 pgs.
“European Application Serial No. 05790836.0, Office Action mailed Jun. 4, 2009”, 3 pgs.
“European Application Serial No. 05791651.2, Response filed Jul. 7, 2011 to Office Action mailed Mar. 15, 2011”, 11 pgs.
“European Application Serial No. 06772250.4, Office Action mailed Dec. 22, 2010”, 3 pgs.
“European Application Serial No. 06772250.4, Response filed Jun. 24, 2011 to Office Action mailed Dec. 22, 2010”, 18 pgs.
“European Application Serial No. 07252582.7, Extended European Search Report mailed Apr. 4, 2008”, 7 pgs.
“European Application Serial No. 07252582.7, Office Action mailed Feb. 6, 2009”, 2 pgs.
“European Application Serial No. 07252582.7, Office Action mailed Dec. 27, 2011”, 4 pgs.
“European Application Serial No. 07252582.7, Response filed Apr. 20, 2011 to Office Action mailed Oct. 15, 2010”, 4 pgs.
“European Application Serial No. 07252582.7, Response filed Apr. 27, 2012 to Office Action mailed Dec. 27, 2011”, 3 pgs.
“European Application Serial No. 07252582.7, Response filed Aug. 11, 2009 to Office Communication mailed Feb. 6, 2009”, 2 pgs.
“European Application Serial No. 07252582.7, Office Action mailed Oct. 15, 2010”, 4 pgs.
“European Application Serial No. 07254947.0, Extended European Search Report mailed Apr. 3, 2008”, 6 pgs.
“European Application Serial No. 07254947.0, Office Action mailed Aug. 25, 2008”, 1 pg.
“European Application Serial No. 07254947.0, Office Action mailed Jan. 19, 2012”, 5 pgs.
“European Application Serial No. 07254947.0, Office Action mailed Oct. 12, 2010”, 4 pgs.
“European Application Serial No. 07254947.0, Response filed Apr. 26, 2011 to Official Communication mailed Oct. 12, 2010”, 11 pgs.
“European Application Serial No. 07254947.0, Response filed Jul. 20, 2012 to Examination Notification Art. 94(3) mailed Jan. 19, 2012”, 9 pgs.
“European Application Serial No. 07254947.0, Response filed Feb. 28, 2009 to Official Communication mailed Aug. 25, 2008”, 2 pgs.
“European Application Serial No. 10252192.9, Extended European Search Report mailed Jan. 2, 2013”, 8 pgs.
“Hearing Aids—Part 12: Dimensions of electrical connector systems”, IEC 118-12, (1996), 24 pgs.
“Hearing Aids—Part 6: Characteristics of electrical input circuits for hearing aids”, IEC 60118-6, (1999), 12 pgs.
“International Application Serial No. PCT/US2005/029793, International Preliminary Report on Patentability mailed Mar. 1, 2007”, 5 pgs.
“International Application Serial No. PCT/US2005/029793, International Search Report mailed Jan. 5, 2006”, 7 pgs.
“International Application Serial No. PCT/US2005/029793, Written Opinion mailed Jan. 5, 2006”, 4 pgs.
“International Application Serial No. PCT/US2005/029971, International Preliminary Report on Patentability mailed Mar. 1, 2007”, 6 pgs.
“International Application Serial No. PCT/US2005/029971, International Search Report mailed Jan. 5, 2006”, 7 pgs.
“International Application Serial No. PCT/US2005/029971, Written Opinion mailed Jan. 5, 2006”, 4 pgs.
“International Application Serial No. PCT/US2006/021870, International Preliminary Report on Patentability mailed Nov. 3, 2006”, 13 pgs.
“International Application Serial No. PCT/US2006/021870, International Search Report mailed Nov. 3, 2006”, 4 pgs.
“Kleer Announces Reference Design for Wireless Earphones”, [Online]. Retrieved from the Internet: <URL:http://kleer.com/newsevents/press_releases/prjan2.php>, (Jan. 2, 2007), 2 pgs.
“Technical Data Sheet—Microphone Unit 6903”, Published by Microtronic, (Dec. 2000), 2 pgs.
Beck, L. B., “The 'T' Switch: Some Tips for Effective Use”, Shhh, (Jan.-Feb. 1989), 12-15.
Kollmeier, Birger, et al., “Real-time multiband dynamic compression and noise reduction for binaural hearing aids”, Journal of Rehabilitation Research and Development, vol. 30, No. 1, (Jan. 1, 1993), 82-94.
Davis, A., et al., “Magnitude of Diotic Summation in Speech-in-Noise Tasks:Performance Region and Appropriate Baseline”, British Journal of Audiology, 24, (1990), 11-16.
Gilmore, R., “Telecoils: past, present & future”, Hearing Instruments, 44 (2), (1993), pp. 22, 26-27, 40.
Greefkes, J. A, et al., “Code Modulation with Digitally Controlled Companding for Speech Transmission”, Philips Tech. Rev., 31(11/12), (1970), 335-353.
Griffing, Terry S, et al., “Acoustical Efficiency of Canal ITE Aids”, Audecibel, (Spring 1983), 30-31.
Griffing, Terry S, et al., “Custom canal and mini in-the-ear hearing aids”, Hearing Instruments, vol. 34, No. 2, (Feb. 1983), 31-32.
Griffing, Terry S, et al., “How to evaluate, sell, fit and modify canal aids”, Hearing Instruments, vol. 35, No. 2, (Feb. 1984), 3 pgs.
Haartsen, J., “Bluetooth—The Universal Radio Interface for Ad Hoc, Wireless Connectivity”, Ericsson Review, No. 3, (1998), 110-117.
Halverson, H. M., “Diotic Tonal Volumes as a Function of Difference of Phase”, The American Journal of Psychology, 33(4), (Oct. 1922), 526-534.
Hansaton Akustik GmbH, “48 K-AMP CONTACTMATIC”, (from Service Manual), (Apr. 1996), 8 pgs.
Lacanette, Kerry, “A Basic Introduction to Filters—Active, Passive, and Switched-Capacitor”, National Semiconductor Corporation, http://www.swarthmore.edu/NatSci/echeeve1/Ref/DataSheet/Inttofilters.pdf, (Apr. 1991), 1-22.
Lindemann, Eric, “Two Microphone Nonlinear Frequency Domain Beamformer for Hearing Aid Noise Reduction”, Proc. IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, (1995), 24-27.
Lybarger, S. F, “Development of a New Hearing Aid with Magnetic Microphone”, Electrical Manufacturing, (Nov. 1947), 11 pgs.
Mahon, William J, “Hearing Aids Get a Presidential Endorsement”, The Hearing Journal, (Oct. 1983), 7-8.
Roy, Olivier, “Distributed Signal Processing for Binaural Hearing Aid”, [Online]. Retrieved from the Internet: <http://infoscience.epfl.ch/record/126277/files/EPFL TH4220.pdf?version=1>, (Jan. 1, 2008), 1-143.
Roy, Olivier, et al., “Rate-Constrained Collaborative Noise Reduction for Wireless Hearing Aid”, IEEE Transactions on Signal Processing, IEEE Service Center, New York, NY, US, vol. 57, No. 2, (Feb. 1, 2009), 645-657.
Peissig, J., et al., “Directivity of binaural noise reduction in spatial multiple noise-source arrangements for normal and impaired listeners”, J Acoust Soc Am., 101(3), (Mar. 1997), 1660-70.
Preves, D. A., “A Look at the Telecoil—Its Development and Potential”, SHHH Journal, (Sep./Oct. 1994), 7-10.
Preves, David A., “Field Trial Evaluations of a Switched Directional/Omnidirectional In-the-Ear Hearing Instrument”, Journal of the American Academy of Audiology, 10(5), (May 1999), 273-283.
Schaefer, Conrad, “Letter referencing Micro Ear Patent”, (Aug. 22, 2002), 2 pgs.
Srinivasan, S., “Low-bandwidth binaural beamforming”, IEEE Electronics Letters, 44(22), (Oct. 23, 2008), 1292-1293.
Srinivasan, Sriram, et al., “Beamforming under Quantization Errors in Wireless Binaural Hearing Aids”, EURASIP Journal on Audio, Speech, and Music Processing, vol. 2008, Article ID 824797, (Jan. 28, 2008), 8 pgs.
Sullivan, Roy F, “Custom canal and concha hearing instruments: A real ear comparison Part I”, Hearing Instruments, vol. 40, No. 4, (Jul. 1989), 23-29.
Sullivan, Roy F, “Custom canal and concha hearing instruments: A real ear comparison Part II”, Hearing Instruments, vol. 40, No. 7, (Jul. 1989), 30-36.
Teder, Harry, “Something New in CROS”, Hearing Instruments, vol. 27, No. 9, Published by Harcourt Brace Jovanovich, (Sep. 1976), pp. 18-19.
Valente, Michael, et al., “Audiology: Treatment”, Thieme Medical Publishers, (Mar. 1, 2000), 594-599.
Goyal, Vivek K, “Theoretical Foundations of Transform Coding”, IEEE Signal Processing Magazine, IEEE Service Center, Piscataway, NJ, US, vol. 18, No. 5, (Sep. 1, 2001), 9-21.
Zelnick, E., “The Importance of Interaural Auditory Differences in Binaural Hearing”, In: Binaural Hearing and Amplification, vol. 1, Libby, E. R., Editor, Zenetron, Inc., Chicago, IL, (1980), 81-103.
Related Publications (1)
Number: 20120308019 A1, Date: Dec. 2012, Country: US
Continuations (1)
Parent: U.S. Appl. No. 11/456,538, filed Jul. 2006, US
Child: U.S. Appl. No. 13/464,419, US