The present invention contains subject matter related to Japanese Patent Application JP 2005-048542 filed in the Japanese Patent Office on Feb. 24, 2005, the entire contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to a microphone apparatus.
2. Description of the Related Art
In a videoconference, for example, the speech of the participants is generally picked up by a microphone placed on a table. The microphone may also pick up ambient noise, so the speech signal output from the microphone may be unclear. Several methods are available for picking up the speech of speakers with a microphone so as to obtain a clear speech signal.
A first method is to use a directional microphone and to emphasize speech while suppressing noise when the sound is input to the microphone. A second method is to adaptively process the speech signal output from a microphone so as to reduce its noise components. Both methods reduce the relative level of the noise components contained in the speech signal, thereby yielding a clearer speech signal.
A microphone apparatus employing the first method includes six microphones disposed around a reference microphone (microphone unit), in which the outputs of the microphones are combined using a Fourier transform so that the overall microphone apparatus provides unidirectional performance.
This microphone apparatus is disclosed in Japanese Unexamined Patent Application Publication No. 2002-271885.
The above-described microphone apparatus combines the outputs of the microphones by determining the value of the first-order approximation term in the Fourier transform under the assumption of a single sound source and by deriving the value of the third-order approximation term from the first-order approximation term. Although the microphone apparatus provides unidirectional performance, the directional range (i.e., angular range in which gain can be obtained) is as wide as about ±60° off the main axis.
However, such a wide directional range makes it difficult to achieve the desired effects of a directional microphone in an environment where a plurality of sound sources or a noise source exists.
It is therefore desirable to provide a unidirectional microphone apparatus with a narrow directional range in which the direction of the directivity is electrically variable.
According to an embodiment of the present invention, a microphone apparatus for processing and outputting an output signal of a microphone array including at least nine microphones includes a directivity function processing circuit that converts the output signal of the microphone array into a unidirectional signal and that outputs the unidirectional signal, wherein the directivity function processing circuit expands a directivity function whose variable is an incident angle of an acoustic wave into a Fourier series up to at least third order, and the variable in the expanded expression is produced from output signals of the microphones forming the microphone array.
Therefore, the microphone apparatus has sharp unidirectional characteristics, and the directional direction of the microphone apparatus can be varied.
Directivity Function
A microphone is a transducer that converts an acoustic wave output from a sound source into a speech signal (audio signal), and it has a predetermined transfer characteristic with respect to the direction, frequency, etc., of the input acoustic wave.
The characteristic of the microphone is given by Eq. (1) shown in
For example, a non-directional (omnidirectional) microphone has a directivity pattern shown in
D(θ, ω)=1
A bidirectional microphone has a directivity pattern shown in
D(θ, ω)=cos θ
Eq. (1) is satisfied when a single sound source exists. When N sound sources exist, Eq. (1) is satisfied for each of the sound sources, and the characteristic of the microphone is therefore given by Eq. (2) shown in
Analysis of Unidirectional Microphone
θ: direction (angle) of sound source with respect to microphone
θc: directional direction (direction of directional microphone)
θw: directional range (angular range in which predetermined gain can be obtained).
The illustrated characteristic is regarded as a directivity function with respect to the variable θ, and can be written in terms of a Fourier series as given by Eq. (3) shown in
In Eq. (4), by setting, for example, θw=60° and changing the directional direction θc, directional characteristics shown in
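As a concrete illustration of this expansion, the Fourier coefficients a0 to a3 and b1 to b3 can be obtained by integrating a chosen target directivity pattern against the cosine and sine basis. The Python sketch below is a minimal example; the rectangular target pattern (unity gain for |θ − θc| ≤ θw, zero elsewhere) and the numerical integration are illustrative assumptions, not the specific form prescribed by Eqs. (3) and (4).

```python
import numpy as np

def directivity_coefficients(theta_c, theta_w, order=3, n_samples=4096):
    """Fourier-series coefficients (a0..a_order, b1..b_order) of a target
    directivity pattern D(theta).  Assumes, for illustration only, a
    rectangular pattern: D = 1 for |theta - theta_c| <= theta_w, else 0."""
    theta = np.linspace(-np.pi, np.pi, n_samples, endpoint=False)
    # Wrap the angular distance to the steering direction into [-pi, pi].
    delta = np.angle(np.exp(1j * (theta - theta_c)))
    target = (np.abs(delta) <= theta_w).astype(float)

    a = [np.trapz(target, theta) / (2.0 * np.pi)]                 # a0
    b = []
    for n in range(1, order + 1):
        a.append(np.trapz(target * np.cos(n * theta), theta) / np.pi)
        b.append(np.trapz(target * np.sin(n * theta), theta) / np.pi)
    return np.array(a), np.array(b)

if __name__ == "__main__":
    # Steer to 30 degrees with a +/-60 degree directional range.
    a, b = directivity_coefficients(np.radians(30.0), np.radians(60.0))
    print("a0..a3:", np.round(a, 4))
    print("b1..b3:", np.round(b, 4))
```

Truncating the series at third order is what limits how narrow the synthesized beam can be made; steering to a new directional direction θc only requires re-evaluating the coefficients.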
Creation of Directivity Function
Referring to
A sound source (not shown) is located in a plane including the microphone array 10. The distance between the sound source and the reference microphone M4 is represented by R, and the incident angle of the acoustic wave with respect to the microphones M0 to M8, or the directional direction, is represented by θ. The distance R is greater than the distance d between the microphones M0 to M8. The incident angle θ may take any value. In
The acoustic wave output from the sound source is given by Eq. (5) shown in
In the microphone array 10, Eq. (1) is applied to the reference microphone M4. By substituting Eq. (3) in Eq. (1) and modifying the equation, Eq. (6) shown in
According to Eq. (6), the microphone array 10 has directivity, for example, as shown in
Method for Determining cos θ, cos 2θ, cos 3θ, sin θ, sin 2θ, and sin 3θ
The values of cos θ, cos 2θ, cos 3θ, sin θ, sin 2θ, and sin 3θ that are needed in Eq. (6) are determined from the output signals of the microphones M0 to M3 and M5 to M8, which will be described in detail below.
Case of cos θ
As shown in
The difference between the output signal of the microphone M3 and the output signal of the microphone M5 is given by Eq. (8) shown in
If the microphone M4 is assumed to be located at the center between the microphones M3 and M5, it is understood according to Eq. (10) that the output signal of the microphone M4 can be generated from the output signals of the microphones M3 and M5. Furthermore, Eq. (10) shows that the bidirectional characteristic shown in
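To make this relationship concrete, the following Python sketch simulates a single-frequency plane wave arriving at angle θ at microphones assumed to lie at ±d from M4 along the M3–M4–M5 line, forms the difference of the two outer outputs, and applies the factor c/(2jωd) (the coefficient quoted later for this term of Eq. (47)) in the frequency domain; the result approximates cos θ times the signal at the center position. The geometry, spacing, and test frequency are assumptions made only for this demonstration.

```python
import numpy as np

# Assumed geometry: M3 at (-d, 0), M4 at the origin, M5 at (+d, 0);
# a plane wave of frequency f arrives at angle theta within the array plane.
c = 343.0            # speed of sound [m/s]
d = 0.02             # microphone spacing [m] (assumed)
f = 750.0            # test frequency [Hz] (integer number of cycles per frame)
fs = 48000
theta = np.radians(40.0)
t = np.arange(2048) / fs

tau = d * np.cos(theta) / c                  # arrival-time offset of M3/M5 vs. M4
s_m4 = np.sin(2 * np.pi * f * t)             # reference (center) microphone
s_m3 = np.sin(2 * np.pi * f * (t + tau))
s_m5 = np.sin(2 * np.pi * f * (t - tau))

# Apply the factor c/(2*j*omega*d) to the difference in the frequency domain.
omega = 2 * np.pi * np.fft.rfftfreq(len(t), 1.0 / fs)
gain = np.zeros(omega.size, dtype=complex)
gain[1:] = c / (2j * omega[1:] * d)          # skip the DC bin
cos_term = np.fft.irfft(np.fft.rfft(s_m3 - s_m5) * gain, n=len(t))

# For small omega*d/c the recovered component approximates cos(theta) * s_m4(t).
err = np.max(np.abs(cos_term - np.cos(theta) * s_m4))
print(f"cos(theta) = {np.cos(theta):.4f}  max approximation error = {err:.2e}")
```

Applying the same arithmetic to the vertical pair M1 and M7 yields sin θ in the same way (cf. Eq. (14)).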
Case of sin θ
As shown in
The difference between the output signal of the microphone M1 and the output signal of the microphone M7 is given by Eq. (12) shown in
According to Eq. (14), the value of sin θ is obtained by performing arithmetic processing on the output signals of the microphones M1 and M7. Furthermore, Eq. (14) shows that the bidirectional characteristic in which the bidirectional characteristic shown in
Case of cos 2θ
Eq. (10) also shows that the output signal of the microphone M3 and the output signal of the microphone M5 are used to determine the output signal of the microphone M4 at the center therebetween.
As shown in
The output signals of the virtual microphones V3 and V5 are given by Eqs. (15) and (16) shown in
Substituting Eq. (18) in Eq. (17) and rearranging the terms lead to Eq. (19). By applying a double-angle identity, which is given by Eq. (20) shown in
According to Eq. (22), the value of cos 2θ is obtained by performing arithmetic processing on the output signals of the microphones M3 to M5.
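Since Eqs. (15) to (22) are not reproduced here, the following Python sketch shows one mathematically consistent way to realize the same idea for small microphone spacing: the second difference of the three collinear microphones M3, M4, and M5, compensated in the frequency domain, approximates cos²θ times the reference signal, and the double-angle identity cos 2θ = 2 cos²θ − 1 then yields cos 2θ. The geometry, spacing, and test frequency are assumptions; this is an illustration of the virtual-microphone idea, not necessarily the exact form of Eqs. (15) to (22).

```python
import numpy as np

# Assumed geometry as before: M3 at (-d, 0), M4 at the origin, M5 at (+d, 0).
c, d, f, fs = 343.0, 0.02, 750.0, 48000
theta = np.radians(40.0)
t = np.arange(2048) / fs
tau = d * np.cos(theta) / c

s_m4 = np.sin(2 * np.pi * f * t)
s_m3 = np.sin(2 * np.pi * f * (t + tau))
s_m5 = np.sin(2 * np.pi * f * (t - tau))

# Second difference along the M3-M4-M5 line; for small omega*d/c its spectrum
# is approximately -(omega * d * cos(theta) / c)**2 * S(omega).
omega = 2 * np.pi * np.fft.rfftfreq(len(t), 1.0 / fs)
gain = np.zeros(omega.size)
gain[1:] = -c**2 / (omega[1:]**2 * d**2)     # frequency-domain compensation, DC excluded
cos_sq_term = np.fft.irfft(np.fft.rfft(s_m3 + s_m5 - 2.0 * s_m4) * gain, n=len(t))

# Double-angle identity: cos(2*theta) * s(t) = 2 * cos(theta)**2 * s(t) - s(t).
cos_2theta_sig = 2.0 * cos_sq_term - s_m4
err = np.max(np.abs(cos_2theta_sig - np.cos(2 * theta) * s_m4))
print(f"cos(2*theta) = {np.cos(2 * theta):.4f}  max approximation error = {err:.2e}")
```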
Case of sin 2θ
A procedure similar to that for determining cos 2θ is used to determine sin 2θ. Specifically, as shown in
The output signals of the virtual microphones V3 and V5 are given by Eqs. (23) and (24) shown in
Substituting Eq. (26) in Eq. (25) and rearranging the terms lead to Eq. (28). By applying a double-angle identity, which is given by Eq. (27) shown in
According to Eq. (29), the value of sin 2θ is obtained by performing arithmetic processing on the output signals of the microphones M0, M2, M6, and M8.
Case of cos 3θ
As shown in
The output signals of the virtual microphones V0 and V6 are given by Eqs. (30) and (31) shown in
A virtual microphone V4 is provided at the position of the microphone M4, and the output signal of the virtual microphone V4 is determined from Eqs. (34) and (35), thereby obtaining Eq. (36) shown in
According to Eq. (38), the value of cos 3θ is obtained by performing arithmetic processing on the output signals of the microphones M0, M2, M3, M5, M6, and M8.
Case of sin 3θ
As shown in
The output signals of the virtual microphones V3, V4, and V5 are given by Eqs. (39), (40), and (41) shown in
Further, a virtual microphone Va is provided at the center between the virtual microphones V3 and V4, and a virtual microphone Vb is provided at the center between the virtual microphones V4 and V5. The output signals of the virtual microphones Va and Vb are given by Eqs. (42) and (43) shown in
Substituting Eqs. (44) and (14) in a triple-angle identity, which is given by Eq. (45) shown in
According to Eq. (46), the value of sin 3θ is obtained by performing arithmetic processing on the output signals of the microphones M0 to M3 and M5 to M8.
Synthesis of Microphone Outputs
By replacing cos θ, cos 2θ, cos 3θ, sin θ, sin 2θ, and sin 3θ in Eq. (6) with Eqs. (10), (22), (38), (14), (29), and (46), respectively, Eq. (47) shown in
In Eq. (47), some terms are multiplied by 1/(jω). This arithmetic operation is carried out by Fourier-transforming the corresponding signals into the frequency domain. Specifically, the multiplication by 1/j means that the phase of the speech signal component at each frequency is advanced by 90°. In the actual arithmetic operation, for the speech signal component in each band after the Fourier transform, the real part is replaced with the value of the imaginary part, and the imaginary part is replaced with the sign-inverted value of the real part.
The multiplication by 1/ω scales the amplitude (level) of the signal component according to the frequency (ω/2π), so the frequency-dependent amplitude is compensated at the same time.
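The equivalence between multiplication by 1/(jω) and the swap-and-negate operation described above can be checked directly on FFT data. The following is a minimal numpy sketch; the signal, sampling rate, and frame length are arbitrary choices for the check.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)                        # arbitrary one-frame signal
X = np.fft.rfft(x)
omega = 2 * np.pi * np.fft.rfftfreq(len(x), d=1.0 / 48000)

# Direct multiplication by 1/(j*omega), DC bin excluded.
direct = X[1:] / (1j * omega[1:])

# Equivalent explicit form: the factor 1/j swaps the parts
# (real <- imag, imag <- -real), and 1/omega rescales the amplitude.
swapped = (X.imag[1:] - 1j * X.real[1:]) / omega[1:]

print("max difference:", np.max(np.abs(direct - swapped)))   # ~1e-16
```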
The microphone apparatus includes a microphone array 10 having the structure shown in
The output signal y(t) is supplied to a digital-to-analog (D/A) converter circuit 14, and is D/A converted into an analog signal. The analog signal is transmitted to an output terminal 15 as a microphone output.
The directivity function processing circuit 13 is composed of, for example, a microcomputer, and is connected with an operation key 13C. When the directional direction θc and the directional range θw are specified through the operation key 13C, the Fourier coefficients a0 to a3 and b1 to b3 corresponding to the specified directional direction θc and directional range θw are generated and used in Eq. (47). In the processing circuit 13, therefore, the output signals of the microphones M0 to M8 provide a characteristic corresponding to the specified directional direction θc and directional range θw, and are combined into the signal given by Eq. (47).
The apparatus shown in
As can be seen from
Details of Operation of Directivity Function Processing Circuit
The directivity function processing circuit 13 executes a routine 100 shown in
The routine 100 starts from step 101. In step 102, the output signals of the microphones M0 to M8, that is, the speech data output from the A/D converter circuit 12, corresponding to nine channels of data for one sample, are input. In step 103, the sums and differences in the bracketed expressions in Eq. (47) are calculated. For example, for the term in the third line of Eq. (47) (i.e., the term corresponding to Eq. (10)), the expression {xM3(t)−xM5(t)} is calculated.
In step 111, it is determined whether or not the processing of steps 102 and 103 for the period of one frame has been performed, and, if not, the routine 100 returns to step 102.
If the processing of steps 102 and 103 for the period of one frame has been performed, the routine 100 proceeds from step 111 to step 112. In step 112, the calculation results determined in step 103 are converted into frequency-domain data by performing a fast Fourier transform (FFT). In step 113, coefficients of the bracketed expressions in Eq. (47) are phase-converted. For example, in the term in the third line of Eq. (47) (i.e., the term corresponding to Eq. (10)), the coefficient of the expression {xM3(t)−xM5(t)} is c/(2jωd), and the value c/(2ωd) is calculated, and is converted into the value of the imaginary part.
In step 114, the Fourier coefficients a0 to a3 and b1 to b3 corresponding to the desired directivity are multiplied by the values determined in steps 103 and 113, and the Fourier-series sum is calculated to determine the value given by Eq. (47). In step 115, the determined value is subjected to inverse fast Fourier transform (IFFT) processing, and is converted into time-domain data.
In step 121, the data converted in step 115 is supplied to the D/A converter circuit 14 for every period of one sample on a sample-by-sample basis. In step 122, it is determined whether or not the processing of step 121 for the period of one frame has been performed, and, if not, the routine 100 returns to step 121.
If the processing of step 121 for the period of one frame has been performed, the routine 100 proceeds from step 122 to step 123. In step 123, the process for the period of one frame ends.
According to the routine 100, the process given by Eq. (47) is performed. In the routine 100, the values in the bracketed expressions are calculated for each sample in step 103 before the FFT is performed in step 112. The process can therefore be properly and smoothly carried out.
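The frame-oriented flow of the routine 100 can be sketched as follows in Python. Only the cos θ term of Eq. (47), whose bracketed expression {xM3(t)−xM5(t)} and coefficient c/(2jωd) are stated above, is written out explicitly; the remaining terms, the handling of the Fourier coefficients, and the frame and spacing parameters are placeholders and assumptions, since the full Eq. (47) is not reproduced here.

```python
import numpy as np

C = 343.0            # speed of sound [m/s]
D = 0.02             # microphone spacing [m] (assumed)
FRAME = 1024         # samples per frame (assumed)
FS = 48000

def process_frame(frame, a):
    """One pass of the frame loop: steps 102/103 and 112-115 of routine 100.

    frame : (FRAME, 9) array of A/D samples for microphones M0..M8.
    a     : Fourier coefficients a0..a3 for the desired directivity
            (the sine terms b1..b3 are omitted in this sketch).
    """
    # Steps 102-103: per-sample bracketed sums/differences.
    diff_m3_m5 = frame[:, 3] - frame[:, 5]            # {xM3(t) - xM5(t)}

    # Step 112: transform the accumulated frame to the frequency domain.
    omega = 2 * np.pi * np.fft.rfftfreq(FRAME, 1.0 / FS)
    gain = np.zeros(omega.size, dtype=complex)
    # Step 113: phase-converted coefficient c/(2*j*omega*d), DC bin excluded.
    gain[1:] = C / (2j * omega[1:] * D)
    cos_theta_term = np.fft.rfft(diff_m3_m5) * gain

    # Step 114: weight by the Fourier coefficients and sum the series
    # (only the a0 reference term and the a1*cos(theta) term shown here).
    y_spec = a[0] * np.fft.rfft(frame[:, 4]) + a[1] * cos_theta_term

    # Step 115: back to the time domain; step 121 would hand the samples
    # to the D/A converter one by one.
    return np.fft.irfft(y_spec, n=FRAME)
```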
Another Method for Determining cos 2θ
As shown in
The relationship between the acoustic wave with the incident angle φ and the output signals of the virtual microphones V0, V2, V6, and V8 is equivalent to the relationship between the acoustic wave with the incident angle θ and the output signals of the microphones M0, M2, M6, and M8. Thus, the output signals of the virtual microphones V0, V2, V6, and V8 are processed by a similar procedure to that of Eq. (29) (which is also shown in
As shown in
Substituting Eq. (50) in Eq. (52) leads to Eq. (53) shown in
For example, in Eq. (10), the difference signal between the output signal of the microphone M3 and the output signal of the microphone M5 is obtained in the bracketed expression. When the distance d between the microphones M0 to M8 is small, if the frequency of the input acoustic wave is low, the difference between the acoustic wave input to the microphone M3 and the acoustic wave input to the microphone M5 is small and the level of the difference signal obtained in Eq. (10) becomes low.
When the distance d is large, if the frequency of the input acoustic wave is high, the path length difference between the acoustic wave input to the microphone M3 and the acoustic wave input to the microphone M5 is one wavelength or more, and the process given by Eq. (10) is not proper.
The same applies to the difference signal or sum signal of the output signals of the microphones M0 to M8, resulting in low arithmetic precision in Eq. (47). It can therefore be difficult to obtain the desired directivity.
In such a case, two microphone arrays 10 are used, in which the microphone spacing d differs between the two arrays and the reference microphone disposed at the center is shared. The low-frequency component of the speech signal is extracted from the array having the larger microphone spacing, and the high-frequency component is extracted from the array having the smaller spacing. The signal obtained by summing the extracted components is subjected to the process given by Eq. (47), thereby achieving high directivity over a wide band.
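One way to read this is as a two-band crossover placed in front of the directivity processing: the wide-spaced array supplies the band below a crossover frequency and the narrow-spaced array supplies the band above it. The Python sketch below assumes a simple FFT-domain split with a hypothetical crossover frequency; the actual band split and array spacings are not specified in the text.

```python
import numpy as np

def combine_arrays(x_wide, x_narrow, fs, f_cross=1500.0):
    """Combine the same channel picked up by a wide-spaced and a narrow-spaced
    array: low band from the wide array, high band from the narrow array.
    f_cross is a hypothetical crossover frequency."""
    n = len(x_wide)
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    low = np.where(freqs < f_cross, 1.0, 0.0)    # brick-wall split (illustrative)
    combined = np.fft.rfft(x_wide) * low + np.fft.rfft(x_narrow) * (1.0 - low)
    return np.fft.irfft(combined, n=n)
```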
In the above-described microphone apparatus, it is difficult to suppress noise arriving from the same direction as that of the target acoustic wave. In this case, for example, the output signal of the directivity function processing circuit 13 is adaptively processed to suppress the noise signal. In a case where noise is included in speech of speakers in a videoconference or the like, therefore, the noise can be suppressed to obtain a clear speech signal.
Further, the direction of a sound source can first be detected, and the directional direction θc and the directional range θw can then be set again according to the detected direction, thereby emphasizing a target signal or suppressing an unnecessary signal. That is, the directivity function can be set so that sound from a specific direction is, or is not, picked up. Alternatively, a plurality of microphone arrays 10 may be arranged on the same plane with their directional directions aimed at a specific point, thereby emphasizing sound from a sound source located at that point.
Furthermore, clearer target sound can be picked up by setting directional directions toward both the target sound and the noise and subtracting the signal obtained in the noise direction from the signal obtained in the target direction. It is also possible to predict and remove acoustic waves that are input irrespective of the directional direction, such as noise arriving from the vertical direction.
Moreover, the microphone array may be combined with a function such as an echo canceller. In this case, impulse responses of the echo canceller are separately learned as information for the array outputs with individual directivities in, for example, 5°-step directional directions, so that echo of the speech in the direction to which the microphone is directed can be removed quickly. Alternatively, impulse responses of the echo canceller may be separately learned for, for example, eight directions, and the impulse response for the direction closest to the direction to which the microphone is to be directed may be used as the initial value. In this case, the total amount of arithmetic operations can be reduced, and the residual echo can be reduced compared with starting the computation from a completely uninitialized value.
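The direction-dependent initialization can be pictured as a lookup of pre-learned filter taps followed by ordinary adaptive filtering. The Python sketch below uses an NLMS update as the adaptive algorithm and eight 45°-spaced directions purely as stand-ins; neither the adaptation rule nor the tap length is specified in the text.

```python
import numpy as np

TAPS = 256                                           # filter length (assumed)
# Hypothetical bank of impulse responses learned beforehand, one per 45 degrees.
learned = {deg: np.zeros(TAPS) for deg in range(0, 360, 45)}

def initial_taps(direction_deg):
    """Pick the pre-learned impulse response closest to the steering direction."""
    nearest = min(learned, key=lambda d: abs((d - direction_deg + 180) % 360 - 180))
    return learned[nearest].copy()

def nlms_step(w, x_buf, d_sample, mu=0.5, eps=1e-8):
    """One NLMS update: subtract the echo estimate w.x from the microphone sample."""
    y = w @ x_buf                                    # echo estimate
    e = d_sample - y                                 # residual after cancellation
    w += mu * e * x_buf / (x_buf @ x_buf + eps)
    return e

# Starting from the nearest learned response instead of all zeros shortens
# convergence and reduces residual echo after the beam is re-steered.
w = initial_taps(direction_deg=100)
x_buf = np.random.default_rng(0).standard_normal(TAPS)   # dummy far-end buffer
residual = nlms_step(w, x_buf, d_sample=0.1)
```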
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Number | Date | Country | Kind |
---|---|---|---|
2005-048542 | Feb 2005 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
4653606 | Flanagan | Mar 1987 | A |
5335011 | Addeo et al. | Aug 1994 | A |
Number | Date | Country |
---|---|---|
2001-231089 | Aug 2001 | JP |
2002-271885 | Sep 2002 | JP |
Number | Date | Country
---|---|---|
20060188111 A1 | Aug 2006 | US |