1. Field of the Invention
The present invention relates to a directivity control apparatus, a directivity control method and a directivity control system which control directivity of sound data.
2. Description of the Related Art
In a related art, in a monitoring system provided at a predetermined position (for example, on a ceiling surface) of a factory, a shop such as a retail shop and a bank, or a public place such as a library, one or more camera devices such as a PTZ camera device or an omnidirectional camera device are connected to the system via a network to achieve a wide angle of view of image data (including a still image and a moving image, the same applies hereinafter) of a video in a monitoring target range.
Further, since the amount of information obtained by monitoring with video alone is limited, there is a strong demand for a monitoring system in which sound data generated by a specific monitoring target, such as a person present within the angle of view of a camera device, is obtained using a microphone array device in addition to one or more camera devices. In such a monitoring system, in a case where an observer wants to listen to the sound data generated by a specific monitoring target, it is necessary to establish synchronization between the image data of the video captured by the camera device and the sound data of the sound captured by the microphone array device.
Here, as the related art for establishing synchronization between the image data of video captured by the camera device and the sound data of the sound captured by the microphone array device, a signal processing device disclosed in JP-A-2009-130767 is known.
The signal processing device disclosed in JP-A-2009-130767 calculates a distance to an object captured by an imaging unit according to a result of a zoom operation on the object by a photographer, and emphasizes the sound collected by a microphone unit according to the calculated distance. Further, the signal processing device delays either a video signal captured by the imaging unit or a sound signal collected by the microphone unit according to the distance from the photographer to the object. By doing this, even when the zoom operation is performed on the object by the photographer, the signal processing device delays either the video signal or the sound signal according to the distance to the object, so that synchronization between the video signal and the sound signal can be achieved.
In JP-A-2009-130767, emphasis processing of the sound signal collected by the microphone unit is performed in accordance with the zoom operation by the photographer. However, when the configuration of JP-A-2009-130767 is applied to the above-described monitoring system and the monitoring range selected by the observer is changed by the zoom operation, there is a possibility that the directivity of the sound from the microphone array device with respect to a specific object, such as a person in the monitoring range changed in accordance with the zoom operation, is not properly formed.
When the directivity of the sound data in the monitoring system is not properly formed, a sound generated by the specific object serving as a monitoring target is not transmitted to the observer even if the video and the sound are synchronized, and the efficiency of a monitoring task to be performed by the observer is deteriorated.
A non-limited object of the present invention is to provide a directivity control apparatus, a directivity control method and a directivity control system that form directivity of a sound with respect to an object serving as a changed monitoring target and suppress deterioration of efficiency of a monitoring task to be performed by an observer even when an object serving as a monitoring target is changed in accordance with the zoom processing with respect to the monitoring target.
An aspect of the present invention provides a directivity control apparatus for controlling a directivity of a sound collected by a sound collecting unit including a plurality of microphones, the directivity control apparatus including: a beam forming unit, configured to form a beam in a direction from the sound collecting unit toward a sound source corresponding to a position designated in an image on a display unit; and a magnification setting unit, configured to set a magnification for magnifying or demagnifying the image on the display unit according to an input, wherein the beam forming unit is configured to change a size of the formed beam in accordance with the magnification set by the magnification setting unit.
An aspect of the present invention provides a directivity control method in a directivity control apparatus for controlling a directivity of a sound collected by a sound collecting unit including a plurality of microphones, the directivity control method including: forming a beam in a direction from the sound collecting unit toward a sound source corresponding to a position designated in an image on a display unit; setting a magnification for magnifying or demagnifying the image on the display unit according to an input; and changing a size of the formed beam in accordance with the set magnification.
An aspect of the present invention provides a non-transitory storage medium in which a program is stored, the program causing a directivity control apparatus for controlling a directivity of a sound collected by a sound collecting unit including a plurality of microphones to execute the following steps: forming a beam in a direction from the sound collecting unit toward a sound source corresponding to a position designated in an image on a display unit; setting a magnification for magnifying or demagnifying the image on the display unit according to an input; and changing a size of the formed beam in accordance with the set magnification.
An aspect of the present invention provides a directivity control system, including: an imaging unit, configured to capture an image in a sound collection area; a first sound collecting unit including a plurality of microphones, configured to collect sound in the sound collection area; and a directivity control apparatus, configured to control a directivity of the sound collected by the first sound collecting unit, wherein the directivity control apparatus includes: a display unit on which the image in the sound collection area captured by the imaging unit is displayed; a beam forming unit, configured to form a beam in a direction from the first sound collecting unit toward a sound source corresponding to a position designated in an image on the display unit; and a magnification setting unit, configured to set a magnification for magnifying or demagnifying the image on the display unit according to an input, wherein the beam forming unit is configured to change a size of the formed beam in accordance with the magnification set by the magnification setting unit.
An aspect of the present invention provides a directivity control system, including: an imaging unit, configured to capture an image in a sound collection area; a first sound collecting unit including a plurality of microphones, configured to collect sound in the sound collection area; a second sound collecting unit disposed in a periphery of the first sound collecting unit; and a directivity control apparatus, configured to control a directivity of the sound collected by the first sound collecting unit and the second sound collecting unit, wherein the directivity control apparatus includes: a display unit on which the image in the sound collection area captured by the imaging unit is displayed; and a beam forming unit, configured to form a beam in a direction from the first sound collecting unit toward a sound source corresponding to a position designated in an image on the display unit according to a designation of the position.
According to the aspects of the present invention, the directivity of a sound with respect to an object serving as a changed monitoring target is appropriately formed, and deterioration of the efficiency of a monitoring task to be performed by an observer can be suppressed even when the object serving as the monitoring target is changed in accordance with the zoom processing with respect to the monitoring target.
In the accompanying drawings:
Hereinafter, respective embodiments of a directivity control apparatus, a directivity control method and a directivity control system will be described with reference to the accompanying drawings. The directivity control system of respective embodiments is used as a monitoring system including a manned monitoring system and an unmanned monitoring system disposed in, for example, a factory, a public facility such as a library or an event hall or a shop such as a retail shop or a bank.
In addition, the present invention can be realized as a program causing a computer serving as the directivity control apparatus to execute an operation prescribed by the directivity control method or a computer readable recording medium in which a program causing a computer to execute an operation prescribed by the directivity control method is recorded.
Hereinafter, respective devices constituting the directivity control system 10 will be described. For convenience of description hereinafter, a description will be made of a housing of the camera device 1 and a housing of the omnidirectional microphone array device 2 integrally attached to the same position (see
The camera device 1 as an example of an imaging unit is disposed by being fixed to a ceiling surface 8 of an event hall via, for example, a ceiling-mounted metal plate 7z described below (see
When an arbitrary position in the image data displayed on a display device 35 is designated by a finger 95 of the observer, the camera device 1 receives coordinate data of the designated position in the image data from the directivity control apparatus 3, calculates data of the distance and the direction (including a horizontal angle and a vertical angle; the same applies hereinafter) from the camera device 1 to a sound position in a real space corresponding to the designated position (hereinafter, simply referred to as a "sound position"), and transmits the calculated data to the directivity control apparatus 3. Further, since the calculation processing of the data of the distance and the direction in the camera device 1 is a known technique, a description thereof will be omitted.
In addition, the camera device 1 performs a zoom-in processing or a zoom-out processing of image data according to, for example, a periodic timing in the camera device 1 or an input operation of the finger 95 of the observer with respect to the image data displayed on the display device 35. The periodic timing is, for example, approximately once every hour or every ten minutes. The information related to the magnification of the zoom-in processing or the zoom-out processing may be designated in advance or be appropriately changed. The camera device 1 transmits the information related to the magnification of the zoom-in processing or the zoom-out processing to the directivity control apparatus 3 after the zoom-in processing or the zoom-out processing is performed.
The omnidirectional microphone array device 2 as an example of a sound collecting unit is fixed to the ceiling surface 8 of an event hall and disposed via, for example, a ceiling-mounted metal plate 7z described below (see
The omnidirectional microphone array device 2 performs a predetermined sound signal processing (for example, an amplification processing, a filter processing, or an addition processing) on sound data of the sound collected by a microphone element in the microphone unit when the power source is turned on, and transmits the sound data obtained by the predetermined sound signal processing to the directivity control apparatus 3 or the recorder 4 via the network NW.
Here, the appearance of the housing of the omnidirectional microphone array device 2 will be described with reference to
The omnidirectional microphone array device 2C shown in
The omnidirectional microphone array device 2A shown in
The omnidirectional microphone array device 2B shown in
The omnidirectional microphone array device 2 shown in
The omnidirectional microphone array device 2D shown in
The respective microphone units 22 and 23 of the omnidirectional microphone array device 2 may use nondirectional microphones, bidirectional microphones, unidirectional microphones, sharp-directivity microphones, superdirectional microphones (for example, shotgun microphones), or a combination of these microphones.
The directivity control apparatus 3 may be a stationary Personal Computer (PC) arranged in, for example, a monitoring and control room (not illustrated), or a data communication terminal such as a portable telephone which can be carried by an observer, a Personal Digital Assistant (PDA), a tablet terminal, or a smart phone.
The directivity control apparatus 3 includes at least a communication unit 31, an operation unit 32, an image processing unit 33, a signal processing unit 34, a display device 35, a speaker device 36, and a memory 37. The signal processing unit 34 includes at least a directivity direction calculation unit 34a, an output control unit 34b, and a zoom-coordination control unit 34c.
The communication unit 31 receives the image data transmitted from the camera device 1, the information related to the magnification of the zoom-in processing or the zoom-out processing, or the sound data transmitted from the omnidirectional microphone array device 2, and outputs the received data to the signal processing unit 34.
The operation unit 32 is a user interface (UI) for informing the signal processing unit 34 of an input operation by the observer, and is, for example, a pointing device such as a mouse, or a keyboard. Alternatively, the operation unit 32 may be formed with a touch panel that is arranged corresponding to the display screen of the display device 35 and that can detect an input operation by the finger 95 or a stylus pen of the observer.
The operation unit 32 outputs coordinate data of the designated position designated by the finger 95 of the observer among pieces of image data (that is, image data captured by the camera device 1) displayed on the display device 35 to the signal processing unit 34. Further, the operation unit 32 outputs instruction items of the zoom-in processing or the zoom-out processing to the signal processing unit 34 in a case where the zoom-in processing or the zoom-out processing is instructed to be performed by the input operation using the finger 95 in the image data displayed on the display device 35.
The image processing unit 33 performs a predetermined image processing (for example, face detection of a person or motion detection of a person) with respect to the image data displayed on the display device 35 according to the instruction of the signal processing unit 34 and outputs the results of the image processing to the signal processing unit 34.
The image processing unit 33 detects the contour of a face of a changed monitoring target (for example, a person) displayed on a display area of the display device 35 after the zoom-in processing according to the instruction of the signal processing unit 34 in a case where the zoom-in processing is performed by the camera device 1 and performs a masking processing on the face. Specifically, the image processing unit 33 calculates the rectangular area containing the contour of the detected face and performs a predetermined vignetting processing in the rectangular area. The image processing unit 33 outputs the image data generated by the vignetting processing to the signal processing unit 34.
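The masking processing described above (computing a rectangular area around a detected face and vignetting it) can be illustrated with a minimal sketch. The function name, the pure-Python box blur, and the test image below are hypothetical stand-ins for whatever face detector and blur the image processing unit 33 actually uses; they are not taken from the specification.

```python
# Illustrative sketch of the vignetting (masking) step, assuming a face
# detector has already produced a bounding rectangle. Pixels inside the
# rectangle are blurred with a simple box filter so the face is obscured;
# pixels outside it are left untouched. All names here are hypothetical.

def vignette_region(image, rect, radius=2):
    """Blur the pixels inside rect = (x, y, w, h) with a box filter."""
    x0, y0, w, h = rect
    height, width = len(image), len(image[0])
    out = [row[:] for row in image]          # copy; only rect is modified
    for y in range(y0, min(y0 + h, height)):
        for x in range(x0, min(x0 + w, width)):
            total, count = 0, 0
            # Average the (2*radius+1)^2 neighborhood, clamped to the border.
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < height and 0 <= nx < width:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

# A sharp-edged 8x8 grayscale test image: bright square on a dark background.
img = [[255 if 2 <= r <= 5 and 2 <= c <= 5 else 0 for c in range(8)]
       for r in range(8)]
masked = vignette_region(img, (2, 2, 4, 4))
```

A production implementation would instead use an image library's Gaussian blur or pixelation over the detected rectangle; the point of the sketch is only the shape of the operation: detect, bound, blur-in-place.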
The signal processing unit 34 is formed with, for example, a Central Processing Unit (CPU), a Micro Processing Unit (MPU), or a Digital Signal Processor (DSP), and performs a control processing for controlling the entire operations of respective units of the directivity control apparatus 3, an I/O processing of the data between other units, an arithmetic (calculation) processing of data, and a memory processing of data.
When coordinate data of the designated position of the image data designated by the finger 95 of the observer is acquired from the operation unit 32 at the time of calculation of the directivity direction coordinate (θMAh, θMAv), the directivity direction calculation unit 34a transmits the coordinate data from the communication unit 31 to the camera device 1. The directivity direction calculation unit 34a then acquires, from the communication unit 31, data of the distance from the installation position of the camera device 1 to the sound position (or the position of the sound source) in the real space corresponding to the designated position of the image data, and data of the direction.
The directivity direction calculation unit 34a calculates a directivity direction coordinate (θMAh, θMAv) in the directivity direction toward the sound position from the installation position of the omnidirectional microphone array device 2 using the data of the distance from the installation position of the camera device 1 to the sound position and the data of the direction. As in the present embodiment, in a case where the housing of the omnidirectional microphone array device 2 is integrally attached so as to surround the housing of the camera device 1, the direction (the horizontal angle or the vertical angle) from the camera device 1 to the sound position can be used as the directivity direction coordinate (θMAh, θMAv) from the omnidirectional microphone array device 2 to the sound position. In addition, in a case where the housing of the camera device 1 and the housing of the omnidirectional microphone array device 2 are separately attached, the directivity direction calculation unit 34a calculates the directivity direction coordinate (θMAh, θMAv) from the omnidirectional microphone array device 2 to the sound position using data of a calibration parameter calculated in advance and the data of the direction (the horizontal angle and the vertical angle) from the camera device 1 to the sound position. Further, the term "calibration" means an operation of calculating or acquiring a predetermined calibration parameter necessary for the directivity direction calculation unit 34a of the directivity control apparatus 3 to calculate the directivity direction coordinate (θMAh, θMAv).
In the directivity direction coordinate (θMAh, θMAv), θMAh indicates a horizontal angle in the directivity direction toward the sound position from the installation position of the omnidirectional microphone array device 2, and θMAv indicates a vertical angle in the directivity direction toward the sound position from the installation position of the omnidirectional microphone array device 2. In the description hereinafter, for convenience of description, the reference directions (directions at 0 degrees) of the respective horizontal angles of the camera device 1 and the omnidirectional microphone array device 2 are assumed to match.
The output control unit 34b as a beam forming unit controls the operations of the display device 35 and the speaker device 36, displays the image data transmitted from the camera device 1 to the display device 35, and outputs the sound data transmitted from the omnidirectional microphone array device 2 to the speaker device 36. Further, the output control unit 34b forms the directivity of the sound (or beam) collected by the omnidirectional microphone array device 2 in the directivity direction indicated by the directivity direction coordinate (θMAh, θMAv), which is calculated by the directivity direction calculation unit 34a using the sound data transmitted from the omnidirectional microphone array device 2.
In addition, in a case in which the zoom-in processing or the zoom-out processing of the image data is performed by the camera device 1, the output control unit 34b displays the image data after the zoom-in processing or the zoom-out processing on the display device 35, and re-forms the directivity of the sound data using the width (or size) of the beam in the directivity direction adjusted by a zoom-coordination control unit 34c described below. The “size” in this embodiment is not limited to the width of the beam representing the directivity, but may include a longitudinal length of the directivity patterns PT1, PT2 and PT3 as shown in
By doing this, the directivity control apparatus 3 can relatively amplify the volume level of the sound generated by the monitoring target which is present in the directivity direction in which the directivity is formed, and can relatively reduce the volume level by suppressing the sound in the directions in which no directivity is formed.
In a case where the zoom-in processing or the zoom-out processing of the image data is performed by the camera device 1, the zoom-coordination control unit 34c as a magnification setting unit adjusts either or both of the directivity formed by the output control unit 34b (that is, the width of the beam in the directivity direction) and the volume level of the sound data output from the speaker device 36 in coordination with the zoom-in processing or the zoom-out processing. In addition, the amounts by which the width of the beam and the volume level are adjusted may respectively be predetermined values or values according to the information related to the magnification of the zoom-in processing or the zoom-out processing.
Specifically, in a case where the zoom-in processing of the image data is performed by the camera device 1, the zoom-coordination control unit 34c adjusts the width of the beam in the directivity direction to be narrow using the predetermined value or the information related to the magnification of the zoom-in processing, and increases the volume level of the sound data (see
In contrast, in a case where the zoom-out processing of the image data is performed by the camera device 1, the zoom-coordination control unit 34c adjusts the width of the beam in the directivity direction to be wide using the predetermined value or the information related to the magnification of the zoom-out processing, and maintains the volume level of the sound data (see
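The zoom-coordinated adjustment just described can be sketched as a small function. The mapping chosen here (dividing the beam width by the magnification, scaling the volume gain by it on zoom-in) and all numeric defaults are illustrative assumptions, not values from the specification, which only states that the amounts may be predetermined values or may depend on the magnification.

```python
# Hypothetical sketch of the zoom-coordination rule: zoom-in narrows the
# beam and raises the volume; zoom-out widens the beam and keeps the
# volume. The default width, gain, and the exact scaling law are assumed.

DEFAULT_BEAM_WIDTH_DEG = 30.0   # assumed initial beam width [degrees]
DEFAULT_VOLUME_GAIN = 1.0       # assumed initial volume gain

def adjust_for_zoom(magnification, beam_width=DEFAULT_BEAM_WIDTH_DEG,
                    volume_gain=DEFAULT_VOLUME_GAIN):
    """Return (beam_width, volume_gain) after a zoom whose magnification
    is > 1.0 for zoom-in and < 1.0 for zoom-out."""
    if magnification > 1.0:       # zoom-in: narrow the beam, raise volume
        return beam_width / magnification, volume_gain * magnification
    elif magnification < 1.0:     # zoom-out: widen the beam, keep volume
        return beam_width / magnification, volume_gain
    return beam_width, volume_gain

width_in, gain_in = adjust_for_zoom(2.0)    # 2x zoom-in
width_out, gain_out = adjust_for_zoom(0.5)  # 2x zoom-out
```

Any monotonic mapping with the same direction of change would satisfy the behavior described above; the division by the magnification is only one convenient choice.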
Hereinafter, a description will be made of a case where the zoom-in processing or zoom-out processing is performed, but the same procedure can be applied to a case where magnifying or demagnifying operation of the image is performed instead of the zoom-in processing or the zoom-out processing. For example, the directivity of the sound may be changed when the image is magnified or demagnified while reproducing the recorded video.
The directivity pattern PT1 shown in
The directivity pattern PT3 shown in
Since the width of the beam in the directivity direction is adjusted to be narrow when the zoom-in processing is performed with respect to the image data of the display device 35 shown in
In contrast, since the width of the beam in the directivity direction is adjusted to be wide when the zoom-out processing is performed with respect to the image data of the display device 35 shown in
Further, in a case where the zoom-in processing is performed by the camera device 1, the zoom-coordination control unit 34c performs the voice change processing on the sound data collected by the omnidirectional microphone array device 2 and outputs the data to the output control unit 34b.
The display device 35 as an example of a display unit is formed with, for example, a Liquid Crystal Display (LCD) or an organic Electroluminescence (EL) display, and displays the image data captured by the camera device 1 under the control of the output control unit 34b.
The speaker device 36 as an example of a sound output unit outputs sound data of the sound collected by the omnidirectional microphone array device 2 or sound data in which the directivity is formed in the directivity direction indicated by the directivity direction coordinate (θMAh, θMAv). Further, the display device 35 and the speaker device 36 may have configurations of being separate from the directivity control apparatus 3.
The memory 37 as an example of a memory unit is formed with a Random Access Memory (RAM) and functions as a work memory when the respective units of the directivity control apparatus 3 operate. In addition, the memory 37 may be formed with a hard disk or a flash memory, and in this case also stores the image data and the sound data stored in the recorder 4.
The recorder 4 stores the image data captured by the camera device 1 and the sound data of the sound collected by the omnidirectional microphone array device 2 in an associated manner.
A sound wave generated from a sound source 80 is incident at a certain angle (incident angle = (90−θ) (degrees)) on the respective microphone elements 221, 222, 223, . . . , 22(n−1), and 22n incorporated in the microphone units 22 and 23 of the omnidirectional microphone array device 2.
The sound source 80 is, for example, a monitoring target (for example, two persons 91 and 92 shown in
The sound wave generated by the sound source 80 first arrives at the microphone element 221 to be collected and then arrives at the microphone element 222 to be collected. In this manner, the sound wave is collected by the subsequent microphone elements one after another, and finally arrives at the microphone element 22n to be collected.
Moreover, in a case where the sound sources 80 are the sounds of monitoring targets (for example, the two persons 91 and 92) at the time of a meeting, the direction from the positions of the respective microphone elements 221, 222, 223, . . . , 22(n−1), and 22n of the omnidirectional microphone array device 2 toward the sound sources 80 is the same as the direction from the respective microphones (microphone elements) of the omnidirectional microphone array device 2 toward the sound position corresponding to the designated position designated in the display device 35 by the observer.
Here, arrival time differences τ1, τ2, τ3, . . . , τ(n−1) are generated between the times at which the sound wave arrives at the microphone elements 221, 222, 223, . . . , 22(n−1), in this order, and the time at which the sound wave finally arrives at the microphone element 22n. For this reason, in a case where the sound data of the sound collected by the respective microphone elements 221, 222, 223, . . . , 22(n−1), and 22n are added as they are, the volume level of the sound wave attenuates as a whole because the data are added in a state in which the phases are shifted.
Moreover, τ1 is a time difference between the time at which the sound wave arrives at the microphone element 221 and the time at which the sound wave arrives at the microphone element 22n, τ2 is a time difference between the time at which the sound wave arrives at the microphone element 222 and the time at which the sound wave arrives at the microphone element 22n, and, in the same manner, τ(n−1) is a time difference between the time at which the sound wave arrives at the microphone element 22(n−1) and the time at which the sound wave arrives at the microphone element 22n.
In the present embodiment, the omnidirectional microphone array device 2 includes A/D converters 241, 242, 243, . . . , 24(n−1), and 24n corresponding to each of the microphone elements 221, 222, 223, . . . , 22(n−1), and 22n; delay units 251, 252, 253, . . . , 25(n−1), and 25n; and an adder 26 (see
In other words, the omnidirectional microphone array device 2 performs AD conversion of analog sound data collected by respective microphone elements 221, 222, 223, . . . , 22(n−1), and 22n to digital sound data in A/D converters 241, 242, 243, . . . , 24(n−1), and 24n.
Moreover, the omnidirectional microphone array device 2 aligns the phases of all the sound waves by applying, in the delay units 251, 252, 253, . . . , 25(n−1), and 25n, delay times corresponding to the arrival time differences at the respective microphone elements 221, 222, 223, . . . , 22(n−1), and 22n, and then adds the sound data after the delay processing in the adder 26. By doing this, the omnidirectional microphone array device 2 forms the directivity of the sound data of the respective microphone elements 221, 222, 223, . . . , 22(n−1), and 22n in the direction at the predetermined angle θ.
For example, in
L1 is a difference between the sound wave arrival distances of the microphone element 221 and the microphone element 22n. L2 is a difference of the sound wave arrival distance between the microphone element 222 and the microphone element 22n. L3 is a difference of the sound wave arrival distance between the microphone element 223 and the microphone element 22n, and, in the same manner, L(n−1) is a difference of the sound wave arrival distance between the microphone element 22(n−1) and the microphone element 22n. Vs is the velocity of the sound wave (sound velocity). L1, L2, L3, . . . , L(n−1) and Vs are known values. In
In this way, the omnidirectional microphone array device 2 can easily form the directivity of the sound data of the sound collected by respective microphone elements 221, 222, 223, . . . , 22(n−1), and 22n incorporated in the microphone units 22 and 23 by changing the delay times D1, D2, D3, . . . , Dn−1, and Dn set in the delay units 251, 252, 253, . . . , 25(n−1), and 25n.
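The delay-and-sum forming processing described above can be sketched compactly: each channel is delayed by D_k = L_k / Vs (expressed in samples) so that a wavefront arriving at angle θ is phase-aligned before the addition. The element spacing, sample rate, and impulse test signal below are illustrative values chosen for the sketch, not parameters of the omnidirectional microphone array device 2.

```python
import math

# Minimal delay-and-sum sketch of the delay units and the adder, assuming
# a uniform linear array. Spacing, sample rate, and the test signal are
# hypothetical; real arrays would use fractional delays, not rounding.

FS = 16000          # sample rate [Hz] (assumed)
VS = 343.0          # sound velocity Vs [m/s]
SPACING = 0.02      # distance between adjacent microphone elements [m]
N_MICS = 4

def steering_delays_samples(theta_deg):
    """Per-element delays (in samples) aligning a plane wave from theta."""
    d = SPACING * math.cos(math.radians(theta_deg))  # path difference L per step
    return [round(k * d / VS * FS) for k in range(N_MICS)]

def delay_and_sum(channels, delays):
    """Delay each channel by its delay (in samples), add, and normalize."""
    n = len(channels[0])
    out = [0.0] * n
    for ch, dly in zip(channels, delays):
        for i in range(n - dly):
            out[i + dly] += ch[i]
    return [v / len(channels) for v in out]

# Simulated plane wave from theta = 0 degrees: an impulse that reaches the
# later elements progressively earlier in these channel buffers.
delays = steering_delays_samples(0.0)
channels = [[0.0] * 8 for _ in range(N_MICS)]
for k, d in enumerate(delays):
    channels[k][3 - d] = 1.0          # per-element arrival offset
aligned = delay_and_sum(channels, delays)
```

After the per-channel delays, all four impulses land on the same sample and add coherently, which is exactly the amplification in the steered direction that the text describes; a wave from any other angle would add with shifted phases and attenuate.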
Moreover, the description of the forming processing of the directivity shown in
Next, detailed operation procedures of the directivity control system 10 of the present embodiment will be described with reference to
In contrast, when it is determined that the zoom-coordination flag is on (S1, Yes), the output control unit 34b forms the directivity of the sound data in the directivity direction from the omnidirectional microphone array device 2 toward the sound position in the real space corresponding to the designated position in the image data displayed on the display device 35 (S2).
Subsequent to Step S2, it is assumed that the zoom-in processing or the zoom-out processing of the image data displayed on the display device 35 is instructed to be performed at the periodic timing of the camera device 1 or by the input operation of the observer. The camera device 1 performs the zoom-in processing or the zoom-out processing of the image data displayed on the display device 35 according to the execution instruction of the zoom-in processing or the zoom-out processing. After the zoom-in processing or the zoom-out processing, the camera device 1 transmits the information related to the magnification of the zoom-in processing or the zoom-out processing and the image data after the zoom-in processing or the zoom-out processing to the directivity control apparatus 3 via the network NW. The zoom-coordination control unit 34c acquires the information (zoom information) related to the magnification of the zoom-in processing or the zoom-out processing and the image data after the zoom-in processing or the zoom-out processing from the communication unit 31 (S3).
The zoom-coordination control unit 34c instructs the image processing unit 33 to perform a predetermined image processing using the image data after the zoom-in processing or the zoom-out processing. The image processing unit 33 performs the predetermined image processing (for example, face detection of a person or motion detection of a person) with respect to the image data after the zoom-in processing or the zoom-out processing which is displayed on the display device 35, and outputs the results of the image processing to the zoom-coordination control unit 34c (S4).
In a case where a person is not detected in the image data after the zoom-in processing or the zoom-out processing, which is displayed on the display device 35, from the results of the image processing in Step S4 (S5, NO), the zoom-coordination control unit 34c determines that the directivity of the sound data is to be maintained without adjustment regardless of the zoom-in processing or the zoom-out processing and that the volume level of the sound data is to be maintained without adjustment. The output control unit 34b outputs the sound data collected by the omnidirectional microphone array device 2 in a state in which the directivity of the sound data before the zoom-in processing or the zoom-out processing is maintained (S6). Subsequent to Step S6, the operation of the directivity control apparatus 3 shown in
In contrast, in a case where a person is detected in the image data after the zoom-in processing or the zoom-out processing, which is displayed on the display device 35 from the results of the image processing in Step S4 (S5, YES), the zoom-coordination control unit 34c determines whether the zoom-in processing is performed by the camera device 1 based on the information related to the magnification of the zoom-in processing or the zoom-out processing acquired in Step S3 (S7).
In a case where it is determined that the zoom-in processing is performed by the camera device 1 (S7, YES), the zoom-coordination control unit 34c performs a predetermined privacy protection processing related to the image and the sound (S8). Here, the operation of the predetermined privacy protection processing will be described with reference to
In
In contrast, in a case where the zoom-coordination control unit 34c determines that the sound privacy protection setting is on (S8-1, YES), the zoom-coordination control unit 34c performs the voice change processing with respect to the sound data output from the speaker device 36 after the image data displayed on the display device 35 is subjected to the zoom-in processing (S8-2). Subsequent to Step S8-2, the sound privacy protection processing shown in
As an example of the voice change processing, the zoom-coordination control unit 34c increases or decreases the pitch of waveforms of the sound data of the sound collected by the omnidirectional microphone array device 2 or the sound data with the directivity formed therein by the output control unit 34b (for example, see
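The details of the voice change processing are not specified beyond raising or lowering the pitch of the waveform. As a minimal sketch under that description, a naive pitch change can be implemented by resampling the waveform with linear interpolation; the function name `voice_change` and the parameter `pitch_factor` are illustrative assumptions, and a practical implementation would typically use a duration-preserving algorithm such as a phase vocoder:

```python
def voice_change(samples, pitch_factor):
    """Crude pitch change by linear-interpolation resampling.

    pitch_factor > 1.0 raises the pitch (and shortens the signal);
    pitch_factor < 1.0 lowers the pitch (and lengthens the signal).
    """
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        # linearly interpolate between neighboring samples
        out.append(samples[i] * (1.0 - frac) + samples[i + 1] * frac)
        pos += pitch_factor
    return out
```

Because this sketch changes the playback duration along with the pitch, it only approximates the effect described; it is intended to show the direction of the processing, not the embodiment's actual algorithm.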
Further, in
In contrast, in a case where the zoom-coordination control unit 34c determines that the image privacy protection setting is on (S8-3, YES), the image processing unit 33 detects (extracts) a contour DTL of a face of a new monitoring target (for example, a person TRG) displayed on the display area of the display device 35 after the zoom-in processing according to the instruction of the zoom-coordination control unit 34c (S8-4), and performs the masking processing on the contour DTL of the face (S8-5). Specifically, the image processing unit 33 calculates a rectangular area including the contour DTL of the detected face and performs a predetermined vignetting processing in the rectangular area (see
In this way, the image processing unit 33 can effectively protect the privacy of the object on the image by making it difficult to recognize who the object (for example, a specific person) serving as a changed monitoring target after the zoom-in processing is. The output control unit 34b displays the image data after the zoom-in processing, with the masking processing applied, on the display device 35 (S8-6). Subsequent to Step S8-6, the image privacy protection processing shown in
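The masking (vignetting) processing applied to the rectangular area containing the detected face can be sketched as a blur confined to that rectangle. The function below is an illustrative stand-in for the processing of the image processing unit 33 (its name and interface are assumptions), operating on a grayscale image represented as a 2-D list of pixel values:

```python
def mask_region(image, x0, y0, x1, y1):
    """Apply a crude 3x3 box blur inside the rectangle [x0, x1) x [y0, y1),
    leaving the rest of the image untouched."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(y0, y1):
        for x in range(x0, x1):
            acc, n = 0, 0
            # average the pixel with its in-bounds neighbors
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += image[yy][xx]
                        n += 1
            out[y][x] = acc // n
    return out
```

Repeating the blur, or enlarging its kernel, strengthens the vignetting effect so that the face becomes harder to recognize while the surrounding image stays sharp.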
In
In contrast, in a case where the zoom-coordination control unit 34c determines that the zoom-out processing is performed by the camera device 1 (S10, YES), the zoom-coordination control unit 34c adjusts the width of the beam in the directivity direction to be great, and maintains the volume level of the sound data or decreases the volume level if the current volume level is sufficiently large, using the known values or the information related to the magnification of the zoom-out processing (S11). Further, the output control unit 34b re-forms the directivity of the sound data according to the width of the beam in the directivity direction after the adjustment by the zoom-coordination control unit 34c (S11). Subsequent to Step S11, the operation of the directivity control apparatus 3 advances to Step S6.
By doing this, in the directivity control system 10 of the present embodiment, when the object serving as a monitoring target is changed by the zoom processing of the camera device 1 with respect to the monitoring target (for example, a person) displayed on the display device 35, the directivity control apparatus 3 adjusts the strength of the directivity of the sound data (that is, the width of the beam in the directivity direction) according to the zoom processing and re-forms the directivity with the width of the beam after the adjustment. Accordingly, the directivity of the sound data with respect to the object serving as a changed monitoring target is appropriately formed, and the deterioration of efficiency of a monitoring task performed by an observer can be suppressed.
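The zoom-coordinated adjustment described above (Steps S5 through S11) can be summarized as a small decision function. The halving/doubling factors, the volume step, and the limits below are illustrative assumptions rather than values taken from the embodiment:

```python
def adjust_directivity(zoom_type, beam_width_deg, volume, person_detected,
                       min_width=5.0, max_width=90.0, max_volume=100):
    """Return the new (beam width, volume) after a zoom event.

    zoom_type       : "in" or "out"
    beam_width_deg  : current beam width in the directivity direction
    volume          : current volume level
    person_detected : result of the image processing (S5)
    """
    if not person_detected:
        # S6: keep directivity and volume unchanged
        return beam_width_deg, volume
    if zoom_type == "in":
        # S9: narrow the beam and raise the volume toward the target
        beam_width_deg = max(min_width, beam_width_deg / 2)
        volume = min(max_volume, volume + 10)
    elif zoom_type == "out":
        # S11: widen the beam; keep the volume, or lower it if already large
        beam_width_deg = min(max_width, beam_width_deg * 2)
        if volume > 80:
            volume -= 10
    return beam_width_deg, volume
```

The resulting width is then handed to the beam-forming step, which re-forms the directivity of the sound data accordingly.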
For example, since the directivity control apparatus 3 can adjust the width of the beam in the directivity direction to be narrow and can output the sound generated by the object (for example, a specific person) serving as a changed monitoring target such that the sound is distinguished from the surrounding sound of the object, the efficiency of the monitoring task performed by the observer can be improved.
Moreover, for example, since the directivity control apparatus 3 can adjust the width of the beam in the directivity direction to be great and can comprehensively output the sound generated by the object serving as a changed monitoring target (for example, plural persons) in a case where the zoom processing of the image data is the zoom-out processing, the efficiency of the monitoring task performed by the observer can be improved.
In addition, since the directivity control apparatus 3 determines whether to adjust the volume level of the sound data in a case where the object serving as a monitoring target is changed by the zoom processing with respect to the monitoring target, the sound can be output, according to the content of the zoom processing, without a sense of discomfort with respect to the size of the display area of the object serving as a changed monitoring target on the display unit.
For example, since the directivity control apparatus 3 can increase the volume level of the sound data and can output the sound generated by the object serving as a changed monitoring target (for example, a specific person) with a volume higher than the surrounding sound of the object in a case where the zoom processing of the image data is the zoom-in processing, the efficiency of the monitoring task performed by the observer can be improved.
Further, for example, since the directivity control apparatus 3 can maintain the volume level of the sound data even when the zoom processing of the image data is the zoom-out processing, the directivity control apparatus 3 can output the sound generated by the object (for example, plural persons) serving as a changed monitoring target such that the sound becomes equivalent to the surrounding sound of the object, and the observer can perform the monitoring task without a sense of discomfort even with the zoom-out processing.
In addition, since the directivity control apparatus 3 maintains the width of the beam in the directivity direction in a case where the image processing unit determines that a person is not detected in the image data, the strength of the directivity of the sound data is not adjusted when a person is not detected, and a sense of discomfort that the environmental sound in the periphery of the sound collection area fluctuates in a state in which no person appears in the image can be eliminated.
In the directivity control system 10 of the first embodiment, the directivity control apparatus 3 adjusts the width of the beam in the directivity direction to be narrow or wide according to the zoom-in processing or the zoom-out processing of the camera device 1, and increases the volume level of the sound data when the zoom-in processing is performed.
However, in the directivity control system 10 according to the first embodiment, since the number and the arrangement of the microphones incorporated in the omnidirectional microphone array device 2 are fixed, a case in which the strength of the sound data in the directivity direction is not sufficient can occur depending on the environment of the sound collection range, even when the width of the beam in the directivity direction or the volume level is adjusted.
Here, in a second embodiment, a directivity control system in which an expansion microphone unit is coupled onto the periphery of the omnidirectional microphone array device 2 will be described for the case where the width of the beam in the directivity direction or the volume level is adjusted according to the zoom-in processing or the zoom-out processing but the strength of the sound data in the directivity direction is not sufficient. In the system configuration of the directivity control system of the second embodiment, since the configurations other than the expansion microphone unit described below are the same as those of the directivity control system 10 according to the first embodiment, the description of the same content will be simplified or omitted, and the content different from that of the directivity control system 10 of the first embodiment will be described.
Next, operation procedures of a directivity control apparatus 3 according to the present embodiment will be described with reference to
In
Subsequent to Step S9, the zoom-coordination control unit 34c inquires of the observer whether the strength of the sound data output after the directivity is re-formed or the volume level is adjusted in Step S9 is sufficient (S21). For example, the zoom-coordination control unit 34c displays a pop-up screen for inquiring whether the sound strength is sufficient on the display device 35 and receives an input operation of an answer to the inquiry performed by the observer. In a case where an answer given by the observer that the sound strength is sufficient is input (S21, YES), the operation of the directivity control apparatus 3 advances to Step S6.
In contrast, in a case where an answer given by the observer that the sound strength is not sufficient is input (S21, NO), since the sound strength in the directivity direction is not sufficient in the current configuration of the directivity control system 10 provided with the omnidirectional microphone array device 2 alone, the expansion microphone unit is newly coupled onto the periphery of the omnidirectional microphone array device 2 (S23) according to the attaching method described below, after the power source of the omnidirectional microphone array device 2, or of the omnidirectional microphone array device 2 and the expansion microphone unit, is turned off (S22). In a case where the coupling of the expansion microphone unit to the periphery of the omnidirectional microphone array device 2 ends (S24, YES), the power source of the omnidirectional microphone array device 2, or of the omnidirectional microphone array device 2 and the expansion microphone unit, is turned on (S25). Subsequently, the zoom-coordination control unit 34c again inquires of the observer whether the strength of the sound data output after the directivity is re-formed or the volume level is adjusted in Step S9 is sufficient (S21).
Hereinafter, various expansion microphone units coupled onto the periphery of the omnidirectional microphone array device 2 as a first sound collecting unit according to the present embodiment will be described with reference to the drawings.
In
The coupling method is performed by releasing the omnidirectional microphone array device 2 and the camera device 1 from the ceiling surface 8, attaching the expansion microphone unit 2z1 to the ceiling surface 8 to be fixed by a screw 41 through screw holes 7eb1 and 7eb2, attaching the omnidirectional microphone array device 2 separately from the expansion microphone unit 2z1 in the height direction, and fixing the omnidirectional microphone array device 2 and the expansion microphone unit 2z1 by the screw 41 through the screw holes 7eb1 and 7eb2. Further, the omnidirectional microphone array device 2, the camera device 1, and the expansion microphone unit 2z1 may be attached to the ceiling surface 8 so as to be fixed thereto and may respectively be fixed by the screw 41 through the screw holes 7eb1 and 7eb2. It is preferable that the housing of the expansion microphone unit 2z1 is fixed on the ceiling surface 8 by use of a ceiling-mount metal fitting 7r. In addition, it is also preferable that the screw holes 7eb1 and 7eb2 provided in the housing of the omnidirectional microphone array device 2 are disposed at a position outside a margin line SPL indicated in
Accordingly, by the coupling of the expansion microphone unit 2z1 shown in
In
Accordingly, by the coupling of the expansion microphone unit 2z2 shown in
In
The coupling method is performed by releasing the omnidirectional microphone array device 2 and the camera device 1 from the ceiling surface 8, attaching the expansion microphone unit 2z3 to the ceiling surface 8 to be fixed by a screw 41 through screw holes 7ed1 and 7ed2, attaching the omnidirectional microphone array device 2 separately from the expansion microphone unit 2z3 in the height direction, and fixing the omnidirectional microphone array device 2 and the expansion microphone unit 2z3 by the screw 41 through the screw holes 7ed1 and 7ed2. Further, the omnidirectional microphone array device 2 and the expansion microphone unit 2z3 may be attached to the ceiling surface 8 so as to be fixed thereto and may be fixed by the screw 41 through the screw holes 7ed1 and 7ed2 respectively.
Accordingly, by the coupling of the expansion microphone unit 2z3 shown in
In
In addition, in
Accordingly, the directivity control system 10 of the present embodiment can further uniformly improve the sound collection properties of the sound with respect to all directions when compared to the sound collection properties of the sound when the omnidirectional microphone array device 2 is used alone, and can flexibly dispose the expansion microphone units 2z4 and 2z4a, and then can make a difference between sound collection properties according to the expansion direction of the expansion microphone units 2z4 and 2z4a by coupling of the expansion microphone units 2z4 and 2z4a shown in
In
The coupling method is performed by releasing the omnidirectional microphone array device 2 from the ceiling surface 8, fitting an end of the ceiling-mounted metal plate 7z, which is already provided, and ends of attaching metal plates 7z1 and 7z2 for expansion, for attaching the expansion microphone unit (for example, the expansion microphone units 2z5a and 2z5c), to be engaged with each other, and fixing the ends with the screw 41. Further, the omnidirectional microphone array device 2 and the camera device 1 are fixed by the screw 41 attached to the ceiling-mounted metal plate 7z, and then the expansion microphone unit (for example, the expansion microphone units 2z5a and 2z5c) is attached to the attaching metal plates 7z1 and 7z2 for expansion to be fixed by the screw 41.
Accordingly, by the coupling of the expansion microphone units 2z5a, 2z5b, 2z5c, and 2z5d shown in
Here, an attaching structure of the omnidirectional microphone array device 2 and the camera device 1 with respect to the ceiling-mounted metal plate 7z shown in
In
An engaging piece 7a, which projects to the same axis i direction, for attaching the camera device 1 to be fixed is formed in three sites on the concentric circle of the surface of the ceiling-mounted metal plate 7z toward the ceiling surface 8. Further, an engaging piece 7b, which projects to the same axis i direction, for attaching the omnidirectional microphone array device 2 to be fixed is formed in three sites on the concentric circle, the diameter of which is larger than that of the concentric circle on which the engaging piece 7a is formed, of the surface of the ceiling-mounted metal plate 7z.
An engaging hole 71 engaged with a fixing pin 43 which is provided on the bottom of the camera device 1 is formed on the engaging piece 7a in a gourd shape whose diameter of one end portion is larger than that of the other end portion. In the same manner, an engaging hole 73 engaged with a fixing pin 45 which is provided on the bottom of the omnidirectional microphone array device 2 is formed on the engaging piece 7b in a gourd shape whose diameter of one end portion is larger than that of the other end portion.
The fixing pins 43 and 45 respectively include a head portion having a thickness (diameter) between those of the one end portion and the other end portion of the engaging holes 71 and 73, and a body portion which is thinner than the head portion.
Fan-shaped holes 7c and 7d are formed in three sites respectively on the surface of the ceiling-mounted metal plate 7z such that the holes expand outward of the engaging pieces 7a and 7b. The shapes and the positions of these fan-shaped holes 7c and 7d are designed such that the reference directions of each horizontal angle of the omnidirectional microphone array device 2 and the camera device 1 are matched to each other in a case where the omnidirectional microphone array device 2 and the camera device 1 are attached to the ceiling-mounted metal plate 7z.
Screw holes 7e to which the screws 41 are inserted are formed in three sites on the central portion of the surface of the ceiling-mounted metal plate 7z. The ceiling-mounted metal plate 7z is fixed to the ceiling surface 8 by screwing the screw 41 to the ceiling surface 8 via the screw hole 7e.
When the omnidirectional microphone array device 2 and the camera device 1 are attached to the ceiling-mounted metal plate 7z, the camera device 1 is firstly attached to the ceiling-mounted metal plate 7z. In this case, the fixing pin 43 is engaged with the engaging hole 71 formed in the engaging piece 7a.
That is, the fixing pin 43 which projects from the bottom of the camera device 1 is inserted into the one end portion side of the engaging hole 71 whose diameter is large. Further, in a state in which the head portion of the fixing pin 43 projects from the engaging hole 71, the fixing pin 43 is allowed to move in the engaging hole 71 by rotating the camera device 1 clockwise or counterclockwise (rotary engaging system). Further, in a state in which the head portion of the fixing pin 43 has moved to the other end side of the engaging hole 71, the fixing pin 43 and the engaging hole 71 are engaged with each other, and the camera device 1 is fixed in the same axis i direction.
The omnidirectional microphone array device 2 is attached to the ceiling-mounted metal plate 7z such that the camera device 1 is exposed from the inside of the opening 21a of the omnidirectional microphone array device 2 after the camera device 1 is attached to the ceiling-mounted metal plate 7z. In this case, the fixing pin 45 is engaged with the engaging hole 73 formed on the engaging piece 7b. Further, the procedures of fixing the fixing pin 45 to the engaging hole 73 are the same as the procedures of fixing the fixing pin 43 to the engaging hole 71.
In
In
Firstly, the coupling method is performed by connecting the microphone line accommodating tubes n1, n2, n3, and n4 to the connectors c1, c2, c3, and c4 provided at four sites of the housing of the omnidirectional microphone array device 2. The microphone line accommodating tubes n1, n2, n3, and n4 are produced by, for example, resin molding, and signal lines for transmitting the sound data of the sound collected by the expansion microphone units m1, m2, m3, and m4 are accommodated therein. After the microphone line accommodating tubes n1, n2, n3, and n4 are connected to the connectors c1, c2, c3, and c4, the expansion microphone units m1, m2, m3, and m4 are connected to the microphone line accommodating tubes n1, n2, n3, and n4, and the coupling of the expansion microphone units m1, m2, m3, and m4 is completed.
Accordingly, by coupling the expansion microphone units m1, m2, m3, and m4 shown in
In
In the coupling method shown in
The coupling method shown in
The coupling method shown in
In the coupling method shown in
In the coupling method shown in
Accordingly, by the coupling of the expansion microphone unit 2zs1 shown in
Finally, a hardware configuration of the omnidirectional microphone array device 2 and the expansion microphone unit in a case where the expansion microphone unit of the present embodiment is coupled to the omnidirectional microphone array device 2 will be simply described with reference to
The expansion microphone unit 2z1 includes at least plural (for example, m) microphone elements 22(n+1) to 22(n+m) and the same number of ADCs 24(n+1) to 24(n+m) as the microphone elements. The expansion microphone unit 2z1 can be coupled to the omnidirectional microphone array device 2 via a coupling unit CN2. Analog sound signals collected by the microphone elements 22(n+1) to 22(n+m) of the expansion microphone unit 2z1 are converted to digital sound signals in the ADCs 24(n+1) to 24(n+m), and then input to an I/F unit 2if of the omnidirectional microphone array device 2. A CPU 2p transmits the sound signals collected by the microphone elements 221 to 22n of the omnidirectional microphone array device 2 and by the microphone elements 22(n+1) to 22(n+m) of the expansion microphone unit 2z1 to the directivity control apparatus 3 from a communication I/F unit (not illustrated).
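How the directivity control apparatus 3 forms a beam from the combined channels of the base array and the expansion unit is not detailed above; a common technique for microphone arrays is delay-and-sum beamforming. The sketch below assumes far-field sound, known 2-D microphone positions, and equal-length channels; all names are illustrative, not the embodiment's actual interface:

```python
import math

def delay_and_sum(channels, mic_positions, angle_deg, fs, c=343.0):
    """Steer a beam toward angle_deg by time-aligning and averaging the
    channels of all microphones (base array plus expansion unit).

    channels      : list of equal-length sample lists, one per microphone
    mic_positions : list of (x, y) microphone positions in meters
    fs            : sampling rate in Hz
    c             : speed of sound in m/s
    """
    rad = math.radians(angle_deg)
    ux, uy = math.cos(rad), math.sin(rad)
    # per-channel delay (in samples) that aligns a plane wave from angle_deg
    delays = [round(fs * (px * ux + py * uy) / c) for px, py in mic_positions]
    base = min(delays)
    n = min(len(ch) - (d - base) for ch, d in zip(channels, delays))
    return [sum(ch[t + d - base] for ch, d in zip(channels, delays)) / len(channels)
            for t in range(n)]
```

Adding the expansion unit's channels enlarges the array aperture, which is what allows a narrower beam and a higher sound strength in the directivity direction than the base array alone.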
Hereinafter, configurations, operations, and effects of the directivity control apparatus, the directivity control method, and the directivity control system according to aspects of the present invention will be described.
An aspect of the present invention provides a directivity control apparatus for controlling a directivity of a sound collected by a sound collecting unit including a plurality of microphones, the directivity control apparatus including: a beam forming unit, configured to form a beam in a direction from the sound collecting unit toward a sound source corresponding to a position designated in an image on a display unit; and a magnification setting unit, configured to set a magnification for magnifying or demagnifying the image on the display unit according to an input, wherein the beam forming unit is configured to change a size of the formed beam in accordance with the magnification set by the magnification setting unit.
The directivity control apparatus may be configured so that the beam forming unit is configured to decrease the size of the beam in a case where the magnification is set to magnify the image by the magnification setting unit.
The directivity control apparatus may be configured so that the beam forming unit is configured to increase the size of the beam in a case where the magnification is set to demagnify the image by the magnification setting unit.
The directivity control apparatus may be configured so that the beam forming unit is configured to determine whether to adjust a volume level of the sound according to the magnification set by the magnification setting unit.
The directivity control apparatus may be configured so that the beam forming unit is configured to increase the volume level of the sound in a case where the magnification is set to magnify the image by the magnification setting unit.
The directivity control apparatus may be configured so that the beam forming unit is configured to decrease the volume level of the sound in a case where the magnification is set to demagnify the image by the magnification setting unit.
The directivity control apparatus may be configured by further including an image processing unit, configured to process the image displayed on the display unit, wherein the beam forming unit is configured to maintain the size of the beam in a case where a person is undetected in the image by the image processing unit.
The directivity control apparatus may be configured so that the beam forming unit is configured to perform a voice change processing on the sound in a case where the magnification is set to magnify the image by the magnification setting unit.
The directivity control apparatus may be configured so that the image processing unit is configured to perform a masking processing on a part of the person in the image in a case where the magnification is set to magnify the image by the magnification setting unit.
An aspect of the present invention provides a directivity control method in a directivity control apparatus for controlling a directivity of a sound collected by a sound collecting unit including a plurality of microphones, the directivity control method including: forming a beam in a direction from the sound collecting unit toward a sound source corresponding to a position designated in an image on a display unit; setting a magnification for magnifying or demagnifying the image on the display unit according to an input; and changing a size of the formed beam in accordance with the magnification as set.
An aspect of the present invention provides a non-transitory storage medium, in which a program is stored, the program causing a directivity control apparatus for controlling a directivity of a sound collected by a sound collecting unit including a plurality of microphones to execute the following steps of: forming a beam in a direction from the sound collecting unit toward a sound source corresponding to a position designated in an image on a display unit; setting a magnification for magnifying or demagnifying the image on the display unit according to an input; and changing a size of the formed beam in accordance with the magnification as set.
An aspect of the present invention provides a directivity control system, including: an imaging unit, configured to capture an image in a sound collection area; a first sound collecting unit including a plurality of microphones, configured to collect sound in the sound collection area; and a directivity control apparatus, configured to control a directivity of the sound collected by the first sound collecting unit, wherein the directivity control apparatus includes: a display unit on which an image of the sound collection area captured by the imaging unit is displayed; a beam forming unit, configured to form a beam in a direction from the first sound collecting unit toward a sound source corresponding to a position designated in the image on the display unit; and a magnification setting unit, configured to set a magnification for magnifying or demagnifying the image on the display unit according to an input, wherein the beam forming unit is configured to change a size of the formed beam in accordance with the magnification set by the magnification setting unit.
The directivity control system may be configured by further including a second sound collecting unit which includes an opening surrounding a periphery of the first sound collecting unit, and a housing arranged concentrically with the first sound collecting unit.
The directivity control system may be configured by further including a second sound collecting unit which includes an opening surrounding a periphery of the first sound collecting unit, and an elliptic housing.
The directivity control system may be configured by further including a second sound collecting unit which includes an opening surrounding a periphery of the first sound collecting unit, and a rectangular housing.
The directivity control system may be configured by further including a second sound collecting unit which includes an opening surrounding a periphery of the first sound collecting unit, and a honeycomb-shaped housing.
The directivity control system may be configured so that the first sound collecting unit and the second sound collecting unit are disposed by being separated from each other in a height direction of the first sound collecting unit and the second sound collecting unit.
The directivity control system may be configured by further including a second sound collecting unit including at least one bar-shaped housing in a periphery of the first sound collecting unit.
The directivity control system may be configured by further including at least one second sound collecting unit disposed in a periphery of the first sound collecting unit, wherein the second sound collecting unit is connected to a connector provided in a periphery of the first sound collecting unit via a predetermined signal line accommodating tube.
The directivity control system may be configured by further including a second sound collecting unit including a rectangular housing which is the same as that of the first sound collecting unit, wherein each of the housings of the first sound collecting unit and the second sound collecting unit has an intermediate side portion provided with a connecting unit for connecting intermediate sides, along which a semicircular concave surface is formed, and opposite end portions provided with a connecting unit for connecting opposite ends, at which a quadrant concave surface is formed.
An aspect of the present invention provides a directivity control system, including: an imaging unit, configured to capture an image in a sound collection area; a first sound collecting unit including a plurality of microphones, configured to collect sound in the sound collection area; a second sound collecting unit disposed in a periphery of the first sound collecting unit; and a directivity control apparatus, configured to control a directivity of the sound collected by the first sound collecting unit and the second sound collecting unit, wherein the directivity control apparatus includes: a display unit on which an image of the sound collection area captured by the imaging unit is displayed; and a beam forming unit, configured to form a beam in a direction from the first sound collecting unit toward a sound source corresponding to a position designated in the image on the display unit according to a designation of the position.
The directivity control system may be configured so that the second sound collecting unit includes an opening surrounding a periphery of the first sound collecting unit, and a housing arranged concentrically with the first sound collecting unit.
The directivity control system may be configured so that the second sound collecting unit includes an opening surrounding a periphery of the first sound collecting unit, and an elliptic housing.
The directivity control system may be configured so that the second sound collecting unit includes an opening surrounding a periphery of the first sound collecting unit, and a rectangular housing.
The directivity control system may be configured so that the second sound collecting unit includes an opening surrounding a periphery of the first sound collecting unit, and a honeycomb-shaped housing.
The directivity control system may be configured so that the first sound collecting unit and the second sound collecting unit are disposed by being separated from each other in a height direction of the first sound collecting unit and the second sound collecting unit.
The directivity control system may be configured so that the second sound collecting unit includes at least one bar-shaped housing in a periphery of the first sound collecting unit.
The directivity control system may be configured so that the second sound collecting unit is disposed in a periphery of the first sound collecting unit, wherein the second sound collecting unit is connected to a connector provided in a periphery of the first sound collecting unit via a predetermined signal line accommodating tube.
The directivity control system may be configured so that the second sound collecting unit includes a rectangular housing which is the same as that of the first sound collecting unit, and each of the housings of the first sound collecting unit and the second sound collecting unit has an intermediate side portion provided with a connecting unit for connecting intermediate sides, along which a semicircular concave surface is formed, and opposite end portions provided with a connecting unit for connecting opposite ends, at which a quadrant concave surface is formed.
Hereinbefore, various embodiments have been described with reference to the accompanying drawings, but the present invention is not limited to the examples. It is obvious that various modifications or corrections can be made by those skilled in the art within the scope of the present invention and understood that those modifications and corrections belong to the technical range of the present invention.
In the above embodiments, a description is made as an example in which the width of the beam in the directivity direction is adjusted by using information on the magnification in the zoom-in processing or the zoom-out processing. However, the adjustment is not limited to the width of the beam; any dimension of the beam may be adjusted. For example, the height of the beam (i.e., the width of the beam in a direction orthogonal to the directivity direction) may be adjusted instead of the width of the beam in the directivity direction.
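As a minimal sketch of the magnification-based adjustment described above, the following hypothetical Python function narrows the beam width as the zoom magnification increases, so that the sound-pickup area tracks the shrinking angle of view. The function name and the inverse-proportional rule are assumptions for illustration only, not the patented method.

```python
# Hypothetical sketch: beam width shrinks as the camera zooms in.
# The inverse-proportional rule below is an illustrative assumption.

def beam_width_deg(base_width_deg: float, zoom_magnification: float) -> float:
    """Return a beam width (degrees) adjusted for the current zoom factor.

    base_width_deg: beam width at 1x magnification.
    zoom_magnification: > 1.0 for zoom-in, < 1.0 for zoom-out.
    """
    if zoom_magnification <= 0:
        raise ValueError("zoom magnification must be positive")
    # Doubling the magnification halves the beam width, so the pickup
    # area stays roughly matched to the visible area.
    return base_width_deg / zoom_magnification


print(beam_width_deg(30.0, 3.0))  # 10.0: 3x zoom-in narrows the beam
print(beam_width_deg(30.0, 0.5))  # 60.0: 0.5x zoom-out widens the beam
```

The same rule could be applied to the beam height by substituting the base height for the base width.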
The present invention can be effectively used as a directivity control apparatus, a directivity control method, and a directivity control system which appropriately form the directivity of a sound with respect to a changed object serving as a monitoring target, and which suppress deterioration of the efficiency of an observer's monitoring task even when the object of the monitoring target is changed by zoom processing.
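For context on what "forming the directivity of a sound" with a microphone array involves, the following is a minimal delay-and-sum sketch, using only the Python standard library. The array geometry, sample rate, and integer-sample delay rounding are illustrative assumptions; the patent does not prescribe this particular implementation.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, at roughly room temperature (assumption)
SAMPLE_RATE = 16000     # Hz (assumption)

def delay_and_sum(channels, mic_x, angle_deg):
    """Steer a linear microphone array toward angle_deg (0 = broadside).

    channels: equal-length sample lists, one per microphone.
    mic_x: microphone positions (metres) along the array axis.
    """
    out_len = len(channels[0])
    delays = []
    for x in mic_x:
        # Plane-wave arrival-time difference, rounded to whole samples.
        tau = x * math.sin(math.radians(angle_deg)) / SPEED_OF_SOUND
        delays.append(round(tau * SAMPLE_RATE))
    shift = min(delays)
    delays = [d - shift for d in delays]  # make all delays non-negative
    out = [0.0] * out_len
    for ch, d in zip(channels, delays):
        # Advance each channel by its delay so the target direction adds
        # in phase; sounds from other directions add incoherently.
        for i in range(out_len - d):
            out[i] += ch[i + d]
    return [v / len(channels) for v in out]
```

At broadside (0 degrees) the delays are zero, so identical channels simply average back to the original signal; steering off-axis introduces per-microphone sample shifts that emphasize the chosen direction.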
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
4559642 | Miyaji et al. | Dec 1985 | A
5523783 | Cho et al. | Jun 1996 | A
6157403 | Nagata | Dec 2000 | A
8411165 | Ozawa | Apr 2013 | B2
9210503 | Avendano | Dec 2015 | B2
20050140810 | Ozawa | Jun 2005 | A1
20050200736 | Ito | Sep 2005 | A1
20050237395 | Takenaka et al. | Oct 2005 | A1
20080247567 | Kjolerbakken | Oct 2008 | A1
20100026780 | Tico et al. | Feb 2010 | A1
20100123785 | Chen | May 2010 | A1
20100254543 | Kjolerbakken | Oct 2010 | A1
20110164760 | Horibe | Jul 2011 | A1
20110317041 | Zurek | Dec 2011 | A1
20120019689 | Zurek et al. | Jan 2012 | A1
20130029684 | Kawaguchi | Jan 2013 | A1
20130107028 | Gleißner | May 2013 | A1
20130259238 | Xiang et al. | Oct 2013 | A1
20130342731 | Lee | Dec 2013 | A1
Foreign Patent Documents

Number | Date | Country
---|---|---
05-260589 | Oct 1993 | JP
05-308553 | Nov 1993 | JP
6-133189 | May 1994 | JP
09-219858 | Aug 1997 | JP
10-051889 | Feb 1998 | JP
2000-004493 | Jan 2000 | JP
2002-223493 | Aug 2002 | JP
2005-124090 | May 2005 | JP
2005-311604 | Nov 2005 | JP
2009-130767 | Jun 2009 | JP
2009-130854 | Jun 2009 | JP
2011-024112 | Feb 2011 | JP
2013-157946 | Aug 2013 | JP
Other Publications

Entry
---
Search report from E.P.O., mail date is Aug. 28, 2014.
International Search Report from PCT in PCT/JP2014/001899, mail date is Apr. 28, 2014.
U.S. Appl. No. 14/272,695 to Shinichi Shigenaga et al., filed May 8, 2014.
U.S. Appl. No. 14/228,716 to Michinori Kishimoto et al., filed Mar. 28, 2014.
Prior Publication Data

Number | Date | Country
---|---|---
20150281833 A1 | Oct 2015 | US