System and method for acquiring acoustic information from a resonating body

Information

  • Patent Grant
  • Patent Number
    9,638,672
  • Date Filed
    Friday, March 6, 2015
  • Date Issued
    Tuesday, May 2, 2017
Abstract
The present invention provides for systems and methods for acquiring acoustic information from a resonating body. Specifically, a focused light beam is emitted from an emitter module onto an impedance matching reflector. The impedance matching reflector has similar wave propagation characteristics to the resonating body, so as to capture mechanical vibrations thereof. The impedance matching reflector reflects the focused light beam as it vibrates, and the vibrations are captured by a receiver module through the reflected light beam, which is converted to an input signal. The input signal may be further processed and converted to an audio output signal, as an accurate representation of the acoustic information emanating from within the resonating body.
Description
FIELD OF THE INVENTION

The present invention provides for a system and method for acquiring acoustic information from a resonating body. Specifically, a light beam is used to measure the vibration of a reflective surface attached to an acoustic analog having similar mechanical properties as those of the resonating body, in order to acquire accurate acoustic information from within the resonating body.


BACKGROUND OF THE INVENTION

Auscultation relates to listening to internal sounds within a body, and is performed for purposes of examining the circulatory system, respiratory system, and gastrointestinal system within a human or animal body. Auscultation may also be performed on a mechanical body, such as listening to the engine of a vehicle.


Auscultation is typically performed by using a stethoscope, which captures sound from a body through a diaphragm or bell. When the diaphragm or bell is placed directly against the body, it receives sound vibrations therefrom, which then create acoustic pressure waves that travel up the tubing to a listener's ears.


One problem with traditional acoustic stethoscopes, however, is low sound volume and quality due to a loss of signal when vibrations are transmitted between materials of different mechanical properties, i.e., the resonating human or animal body and the diaphragm or bell of the stethoscope. These different mechanical properties, such as density, impedance, vibration transmission speed, etc., affect the sound signal differently as it travels from the body to the stethoscope.


Another problem arises in high noise environments, for example, auscultation by medical personnel in an ambulance or helicopter. In these types of situations, external noise will interfere with the diaphragm or bell of a stethoscope, which may either drown out or otherwise affect the auscultation sounds.


Therefore, there is a need in the art to accurately capture or acquire acoustic information from within a resonating body, while minimizing the loss of signal from transference between the body and the stethoscope, as well as noise interference from the external environment.


SUMMARY OF THE INVENTION

The present invention meets the existing needs described above by providing for systems and methods for acquiring accurate acoustic information from a resonating body, regardless of the external environment.


Accordingly, in initially broad terms, a laser or other focused light source is used to measure the vibrations of a reflective surface embedded in or attached to an optically transparent material with similar mechanical properties to a resonating body that contains the acoustic signal. This matching of the mechanical properties, and the subsequent vibroscopy within the impedance-matched system, allows for the collection of an accurate sound signal.


As such, a system of the present invention comprises an emitter module, an impedance matching reflector, a receiver module, and may also comprise a housing. At least one system of the present invention may be directed to a medical stethoscope for listening to internal sounds of a human or animal body.


Emitter module is structured and configured to emit a focused light beam, such as a laser, onto the impedance matching reflector in order to create a reflected light beam.


The impedance matching reflector is structured to have wave propagation characteristics similar to the resonating body, so that it may vibrate at a similar frequency and share similar sound impedance and other acoustic transference characteristics as the resonating body. Impedance matching reflector may comprise an impedance matching lens and a reflective surface embedded or affixed therein. The impedance matching lens preferably comprises an optically transparent material, such as an acrylic lens or a ballistics gel. The reflective surface may comprise a mirror formed of appropriate materials so as to reflect the emitted focused light beam onto the receiver module.


The receiver module is structured to detect the light intensity of the reflected light beam and/or vibrations or other characteristics thereof, in order to create an input signal. The input signal may be modified by an audio processor for sound enhancement and/or noise reduction. The input signal is converted to an output audio signal through an audio transducer forming part of, or in communication with, the receiver module.


In other embodiments of the present invention directed to methods for acquiring acoustic information from a resonating body, an impedance matching reflector may first be positioned in vibrational transference relations to a resonating body. A focused light beam is then emitted from an emitter module onto the impedance matching reflector in order to create a reflected light beam. The reflected light beam is then received at a receiver module as the impedance matching reflector vibrates in accordance with the resonating body. The reflected light beam is further converted by the receiver module in order to create an input signal. The input signal may further be processed and/or converted into an audio output signal with an audio transducer.


These and other objects, features and advantages of the present invention will become clearer when the drawings as well as the detailed description are taken into consideration.





BRIEF DESCRIPTION OF THE DRAWINGS

For a fuller understanding of the nature of the present invention, reference should be had to the following detailed description taken in connection with the accompanying drawings in which:



FIG. 1 is a schematic representation of a system for acquiring acoustic information from a resonating body.



FIG. 2 is a schematic representation of a system for acquiring acoustic information from a resonating body comprising a different impedance matching reflector configuration.



FIG. 3 is a schematic representation of a system for acquiring acoustic information from a resonating body comprising another impedance matching reflector configuration.



FIG. 4 is a flowchart directed to a method for acquiring acoustic information from a resonating body.





Like reference numerals refer to like parts throughout the several views of the drawings.


DETAILED DESCRIPTION OF THE EMBODIMENT

As illustrated by the accompanying drawings, the present invention is directed to a system and method for acquiring acoustic information from a resonating body. Specifically, some embodiments of the present invention relate to the use of a focused light beam in order to measure the vibrations of an impedance matching reflector in vibrational transference relations to a resonating body.


Accordingly, as schematically represented in FIG. 1, one embodiment of the present invention may comprise a system 100 for acquiring acoustic information from a resonating body 150. System 100 may comprise an emitter module 101, an impedance matching reflector 110, a receiver module 102, and may further comprise a housing 140 structured to stabilize the internal components. Resonating body 150 may comprise any object or body capable of emitting sound from within. In at least one embodiment of the present invention, the system 100 is used as a stethoscope to acquire acoustic information from a human or animal body.


Emitter module 101 is structured and configured to emit a focused light beam onto the impedance matching reflector 110 in order to create a reflected light beam. In a preferred embodiment, the emitter module 101 will be structured to create a highly collimated light beam such as a laser. Accordingly, emitter module 101 may comprise at least one laser diode. The laser diode may comprise a double heterostructure laser diode, quantum well laser diode, distributed Bragg reflector laser diode, distributed feedback laser diode, vertical-cavity surface-emitting laser (VCSEL) diode, vertical-external-cavity surface-emitting laser (VECSEL) diode, and other laser diodes known to those skilled in the art. In other embodiments, the emitter module 101 may be structured to create other appropriately focused light beams which may also be measurable in intensity and/or movement.


Impedance matching reflector 110 is structured to have wave propagation characteristics substantially similar to the resonating body, so as to vibrate at a substantially similar frequency as the resonating body. For instance, the impedance matching reflector 110 may comprise a similar density or have similar acoustic impedance as the resonating body. At least a portion of the impedance matching reflector 110 is placed in vibrational transference relations to the resonating body, so as to receive mechanical vibrations and vibrate in a similar fashion or at a similar frequency as the resonating body.
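The benefit of impedance matching can be quantified: for a plane acoustic wave crossing an interface between media of acoustic impedances Z1 and Z2, the fraction of intensity reflected back (and thus lost to the listener) is R = ((Z2 − Z1)/(Z2 + Z1))². A brief sketch, using representative textbook impedance values that are assumptions for illustration and not taken from the specification:

```python
# Illustrative sketch: intensity reflection coefficient at an acoustic
# interface, R = ((Z2 - Z1) / (Z2 + Z1))**2.  The impedance values below
# (in Pa*s/m) are representative textbook figures, not from the patent.

def reflection_coefficient(z1: float, z2: float) -> float:
    """Fraction of incident acoustic intensity reflected at the interface."""
    return ((z2 - z1) / (z2 + z1)) ** 2

Z_TISSUE = 1.63e6   # soft tissue (approximate)
Z_GEL = 1.60e6      # ballistics gel matched to tissue (assumed value)
Z_STEEL = 45.7e6    # metal diaphragm of a conventional stethoscope

# A matched gel transmits nearly all of the signal energy into the lens;
# a steel diaphragm reflects most of it back into the body.
print(f"tissue -> gel:   R = {reflection_coefficient(Z_TISSUE, Z_GEL):.4f}")
print(f"tissue -> steel: R = {reflection_coefficient(Z_TISSUE, Z_STEEL):.4f}")
```

The closer the lens impedance is to that of the body, the smaller the reflected fraction, which is the loss-of-signal problem identified in the background section.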


In a preferred embodiment of the present invention, the impedance matching reflector 110 may comprise an impedance matching lens 111 and a reflective surface 112 embedded therein. The impedance matching lens 111 comprises an optically transparent material. In at least one embodiment where it may be desirable to measure a solid structure, the impedance matching lens 111 may comprise an acrylic lens. The acrylic lens may comprise polymethyl methacrylate or other appropriate and transparent acrylic polymers. In at least one embodiment of the present invention comprising housing 140 structured to enclose the emitter module 101, receiver module 102, and impedance matching reflector 110, at least a portion of the impedance matching reflector 110 may be exposed externally to the housing 140, such as to make direct physical contact with the resonating body 150. This exposed area of the impedance matching reflector 110 is the equivalent of the “diaphragm” and/or “bell” of a traditional stethoscope.


In at least one embodiment of the present invention directed to medical stethoscopes, the impedance matching lens 111 may comprise a ballistics gelatin or gel that simulates the density and viscosity of human or animal tissue. The ballistics gel preferably comprises a synthetic gel that may be reusable and may be reformed without affecting the properties of the gel. Different ballistics gels having various densities and viscosities may be formulated for different tissues, in order to provide for a closer simulation of the vibrational characteristics of the resonating body, which may in this embodiment comprise muscle tissue, epithelial tissue, connective tissue, and nervous tissue. In some embodiments of the present invention, a combination of optically clear acrylic and clear ballistics gel may be used to form the impedance matching lens 111. The impedance matching lens 111, 211, and 311 may comprise different profiles, such as a semi-dome or semi-spherical profile as shown in system 100 of FIG. 1, a prismatic profile as shown in system 200 of FIG. 2, a cuboid profile as shown in system 300 of FIG. 3, or other shapes or profiles depending on the positioning of the emitter module 101 and receiver module 102 and desired reflection angle of the light beam 103.


The reflective surface 112 is configured to reflect the focused light beam 103 created by the emitter module 101 to the receiver module 102 as a reflected light beam 104. Accordingly, reflective surface 112 may be embedded, suspended, or otherwise affixed within the impedance matching lens 111. Reflective surface 112 may comprise a mirror or other suitable materials having a reflective coating on an appropriate substrate. In a preferred embodiment, it may be desirable to use a thin and lightweight material so as to not affect the wave propagation characteristics of the impedance matching lens 111. The reflective surface 112 may further be optimized for desired use by altering its reflective properties. The reflective surface 112, 212 may comprise a substantially flat surface as illustrated in system 100 of FIG. 1 or system 200 of FIG. 2. The reflective surface 312 may also comprise a surface having two reflective segments connected at a right angle, so as to reflect the light beam 103 created by the emitter module 101 as reflected light beam 104 to receiver module 102 when the two modules are placed in closer proximity to each other. Of course, other numbers of reflective segments, flat and/or curved profiles, and various shapes of the reflective surfaces or segments may be used, depending on the desired positioning of the emitter module 101 and receiver module 102, as well as the desired light reflective and refractive characteristics.


Receiver module 102 is structured to detect the light intensity and/or vibrations in the reflected light beam 104 as it vibrates in accordance with the impedance matching reflector 110. The receiver module 102 is further configured to convert the reflected light beam into an input signal. Accordingly, receiver module 102 may comprise a light transducer that includes at least one photodetector or light sensor. The photodetector may comprise at least one photodiode, photoresistor, optical detector, photovoltaic cell, photomultiplier, phototube, phototransistor, and/or charge coupled device (CCD).
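The conversion performed by the receiver module can be pictured as follows: the photodetector reports a slowly varying beam brightness (a DC offset) with the audio-rate vibration superimposed on it. A minimal sketch of one way such samples might become an input signal; the processing shown is an assumption for illustration, not specified in the patent:

```python
# Sketch (assumed processing, not from the specification): converting raw
# photodetector intensity samples into an AC-coupled input signal.  The
# beam's average brightness appears as a DC offset; the audio-rate
# vibration of the reflector rides on top of it.

def to_input_signal(intensity_samples):
    """Remove the DC offset and scale the residual to the range [-1, 1]."""
    mean = sum(intensity_samples) / len(intensity_samples)
    ac = [s - mean for s in intensity_samples]          # vibration component
    peak = max(abs(s) for s in ac) or 1.0               # avoid divide-by-zero
    return [s / peak for s in ac]

# Example: a steady beam intensity of ~0.8 with a small vibration on top.
raw = [0.80, 0.82, 0.80, 0.78, 0.80, 0.82, 0.80, 0.78]
signal = to_input_signal(raw)
print(signal)  # zero-mean, normalized to unit peak
```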


Receiver module 102 may further comprise an audio processor for modifying the input signal. The audio processor may comprise a digital signal processor, amplifiers, filters, and volume controls. The audio processor may further comprise a processor and combination of circuits structured to enhance the audio quality of the input signal, such as but not limited to shelf filters, equalizers, and modulators. For example, in at least one embodiment the audio processor may comprise a processor that performs the steps for processing a signal as taught by the present inventor's U.S. Pat. No. 8,160,274. The audio processor may incorporate various acoustic profiles customized for a user and/or for an environment, such as those described in the present inventor's U.S. Pat. No. 8,565,449. The audio processor may additionally incorporate processing suitable for high noise environments, such as those described in the present inventor's U.S. Pat. No. 8,462,963. Parameters of the audio processor may be controlled and modified by a user via any means known to one skilled in the art, such as by a direct interface or a wireless communication interface.
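One simple example of such an enhancement stage is band-limiting the input signal to the frequency range where auscultation sounds concentrate. The sketch below is an assumption for illustration (the referenced patents describe more elaborate processing): a crude band-pass built from first-order high-pass and low-pass stages, with an assumed 20–400 Hz passband:

```python
# Sketch of one possible enhancement stage (an illustrative assumption, not
# the processing of the referenced patents): a crude band-pass made of a
# first-order high-pass followed by a first-order low-pass, keeping roughly
# the 20-400 Hz band where heart and lung sounds concentrate.
import math

def band_pass(samples, fs, low_hz=20.0, high_hz=400.0):
    """First-order high-pass (removes drift) then low-pass (removes hiss)."""
    dt = 1.0 / fs
    rc_hp = 1.0 / (2 * math.pi * low_hz)
    a_hp = rc_hp / (rc_hp + dt)          # high-pass smoothing factor
    rc_lp = 1.0 / (2 * math.pi * high_hz)
    a_lp = dt / (rc_lp + dt)             # low-pass smoothing factor
    out, hp_prev, x_prev, lp_prev = [], 0.0, samples[0], 0.0
    for x in samples:
        hp_prev = a_hp * (hp_prev + x - x_prev)   # high-pass stage
        x_prev = x
        lp_prev = lp_prev + a_lp * (hp_prev - lp_prev)  # low-pass stage
        out.append(lp_prev)
    return out

# A 100 Hz tone (in band) passes; a 5 Hz drift (out of band) is attenuated.
fs = 8000
tone = [math.sin(2 * math.pi * 100 * n / fs) for n in range(fs)]
drift = [math.sin(2 * math.pi * 5 * n / fs) for n in range(fs)]
print(max(abs(v) for v in band_pass(tone, fs)))   # near unity
print(max(abs(v) for v in band_pass(drift, fs)))  # substantially smaller
```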


The housing 140 is structured to substantially enclose the components of the present invention, including the emitter module 101, receiver module 102, impedance matching reflector 110 and the impedance matching lens 111 and reflective surface 112 thereof. Housing 140 is ideally structured to stabilize the internal components therein, so as to provide for a consistent and accurate measurement regardless of external environment vibrations. Housing 140 is also ideally formed from an opaque material, such as to block out potential light interference that may compromise the reflected light beam emitted by the emitter module 101.


The present invention may further comprise an acoustic transducer, not shown, in communicable relations to the receiver module 102 and/or the audio processor thereof. The acoustic transducer is structured to convert the input signal from the receiver module 102 into an audio signal. In at least one embodiment of the present invention, the acoustic transducer may be formed as part of a headset or speaker remotely connected to the receiver module 102 and/or the audio processor, such as via near-field communication or other wireless technologies. Of course, the acoustic transducer may also be connected via a wired connection to the receiver module 102 and/or the audio processor.


Drawing attention to FIG. 4, other embodiments of the present invention are drawn to methods for acquiring acoustic information from a resonating body. Accordingly, an impedance matching reflector is first positioned in vibrational transference relations to a resonating body, as in 401. The impedance matching reflector may be positioned or placed in direct confronting relations to the resonating body, such that a majority of an exposed surface of the impedance matching reflector is in direct physical contact with the resonating body. The impedance matching reflector will ideally vibrate at the same frequency such that sound waves from the resonating body are accurately transferred to the impedance matching reflector. The impedance matching reflector may comprise an impedance matching lens and reflective surface embedded therein as described above.


A focused light beam is then emitted from an emitter module, as in 402, onto the impedance matching reflector in order to create a reflected light beam. In a preferred embodiment, the focused light beam is reflected off a reflective surface embedded within an impedance matching lens, as described above, such that the vibrational characteristics of the resonating body and impedance matching reflector are captured through the reflected light beam, as the light beam vibrates and changes in reflection angle and intensity.


The reflected light beam is received, as in 403, at a receiver module as the impedance matching reflector vibrates in accordance with the resonating body. As discussed above, the reflected light beam is captured at the receiver module as it changes in reflection angle and intensity. The receiver module may comprise a light transducer and/or a photodetector in accordance with the system of the present invention.


The reflected light beam is converted by the receiver module, as in 404, in order to create an input signal. As such, the light transducer and/or photodetector of the receiver module, or forming part of the receiver module, are appropriately structured and configured to receive changes in the reflected light beam, and convert the same into an input signal or electrical signal.


The input signal may further be converted, as in 405, into an audio output signal with an audio transducer. As such, the input signal may be transmitted to an audio transducer forming part of the receiver module or communicably connected to the receiver module as described in the system embodiments above. The transducer may comprise a headset, a loudspeaker, or any surface capable of acoustic resonance. The input signal may be processed by at least one audio processor in order to enhance the input signal, such as by amplifying the desirable signals and/or filtering out extraneous noise.


It should be understood that the above steps may be conducted exclusively or nonexclusively and in any order. Further, the physical devices recited in the methods may comprise any apparatus and/or systems described within this document or known to those skilled in the art.
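The sequence of steps 402 through 405 can be pictured end to end as a short processing pipeline. The sketch below is purely illustrative; the function names, modulation model, and numeric values are assumptions for demonstration and do not appear in the specification:

```python
# Hypothetical end-to-end sketch of steps 402-405: simulate the reflected
# beam intensity being modulated by the reflector's vibration, convert it
# to an input signal, and scale it for an audio transducer stage.  All
# names and values are illustrative assumptions, not from the patent.
import math

def reflected_beam(vibration, beam_intensity=0.8, depth=0.05):
    """Steps 402/403: beam intensity modulated by reflector vibration."""
    return [beam_intensity * (1.0 + depth * v) for v in vibration]

def convert_to_input_signal(samples):
    """Step 404: photodetector output -> zero-mean electrical input signal."""
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

def to_audio_output(signal, gain=10.0):
    """Step 405: amplify for the audio transducer (headset or speaker)."""
    return [gain * s for s in signal]

fs = 8000
# One second of a 60 Hz vibration standing in for a body sound.
vibration = [math.sin(2 * math.pi * 60 * n / fs) for n in range(fs)]
audio = to_audio_output(convert_to_input_signal(reflected_beam(vibration)))
print(len(audio), max(audio))
```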


Since many modifications, variations and changes in detail can be made to the described preferred embodiment of the invention, it is intended that all matters in the foregoing description and shown in the accompanying drawings be interpreted as illustrative and not in a limiting sense. Thus, the scope of the invention should be determined by the appended claims and their legal equivalents.


Now that the invention has been described,

Claims
  • 1. A system for acquiring acoustic information from a resonating body comprising: an emitter module configured to emit a focused light beam onto a reflective surface in order to create a reflected light beam, said reflective surface at least partially disposed within an impedance matching lens, said impedance matching lens is structured to have at least one wave propagation characteristic that is substantially similar to the resonating body, so as to vibrate at a substantially similar frequency as the resonating body, said impedance matching lens further disposable in direct confronting relation to the resonating body, a receiver module structured to detect the vibrations in the reflected light beam as it vibrates in accordance to said impedance matching lens, said receiver module is further configured to convert the reflected light beam into an input signal.
  • 2. The system as recited in claim 1 further comprising an acoustic transducer structured to convert the input signal into an audio signal.
  • 3. The system as recited in claim 1 wherein said receiver module comprises a light transducer structured to convert the reflected light signal into the input signal.
  • 4. The system as recited in claim 3 wherein said emitter module comprises at least one laser diode.
  • 5. The system as recited in claim 3 wherein said receiver module comprises at least one photo diode.
  • 6. The system as recited in claim 3 wherein said receiver module comprises at least one photo resistor.
  • 7. The system as recited in claim 3 wherein said receiver module comprises at least one charge coupled device.
  • 8. The system as recited in claim 1 wherein said impedance matching lens comprises a substantially similar vibration transmission speed as the resonating body.
  • 9. The system as recited in claim 1 wherein said impedance matching lens comprises a substantially similar density as the resonating body.
  • 10. The system as recited in claim 1 wherein said impedance matching lens comprises a substantially similar acoustic impedance as the resonating body.
  • 11. The system as recited in claim 1 wherein said impedance matching lens comprises an optically transparent material.
  • 12. The system as recited in claim 11 wherein said impedance matching lens comprises a synthetic ballistics gel.
  • 13. The system as recited in claim 11 wherein said impedance matching lens comprises an acrylic lens.
  • 14. A method for acquiring acoustic information from a resonating body comprising: preselecting an impedance matching reflector to correspond with an impedance of the resonating body, positioning the impedance matching reflector in vibrational transference relations to the resonating body, emitting a focused light beam from an emitter module onto the impedance matching reflector in order to create a reflected light beam, receiving the reflected light beam at a receiver module as the impedance matching reflector vibrates in accordance with the resonating body, converting the reflected light beam at the receiver module in order to create an input signal, converting the input signal into an audio output signal with an audio transducer.
  • 15. The method as recited in claim 14 comprising positioning the impedance matching reflector in direct confronting relations to the resonating body.
  • 16. The method as recited in claim 14 comprising emitting a focused light beam from an emitter module onto a reflective surface within an impedance matching lens that together form the impedance matching reflector.
  • 17. The method as recited in claim 14 wherein the focused light beam comprises a laser and the reflected light beam comprises a reflected laser.
  • 18. The method as recited in claim 14 wherein the impedance matching reflector comprises at least one wave propagation characteristic that is substantially similar to the resonating body.
US Referenced Citations (224)
Number Name Date Kind
3795876 Takahashi et al. Mar 1974 A
3813687 Geil May 1974 A
4162462 Endoh et al. Jul 1979 A
4184047 Langford Jan 1980 A
4218950 Uetrecht Aug 1980 A
4226533 Snowman Oct 1980 A
4257325 Bertagni Mar 1981 A
4353035 Schröder Oct 1982 A
4356558 Owen et al. Oct 1982 A
4363007 Haramoto et al. Dec 1982 A
4412100 Orban Oct 1983 A
4517415 Laurence May 1985 A
4538297 Waller Aug 1985 A
4549289 Schwartz et al. Oct 1985 A
4584700 Scholz Apr 1986 A
4602381 Cugnini et al. Jul 1986 A
4612665 Inami et al. Sep 1986 A
4641361 Rosback Feb 1987 A
4677645 Kaniwa et al. Jun 1987 A
4696044 Waller, Jr. Sep 1987 A
4701953 White Oct 1987 A
4704726 Gibson Nov 1987 A
4715559 Fuller Dec 1987 A
4739514 Short et al. Apr 1988 A
4815142 Imreh Mar 1989 A
4856068 Quatieri, Jr. et al. Aug 1989 A
4887299 Cummins et al. Dec 1989 A
4997058 Bertagni Mar 1991 A
5007707 Bertagni Apr 1991 A
5073936 Gurike et al. Dec 1991 A
5133015 Scholz Jul 1992 A
5210806 Kihara et al. May 1993 A
5239997 Guarino et al. Aug 1993 A
5355417 Burdisso et al. Oct 1994 A
5361381 Short Nov 1994 A
5420929 Geddes et al. May 1995 A
5425107 Bertagni et al. Jun 1995 A
5463695 Werrbach Oct 1995 A
5465421 McCormick et al. Nov 1995 A
5467775 Callahan et al. Nov 1995 A
5473214 Hildebrand Dec 1995 A
5515444 Burdisso et al. May 1996 A
5539835 Bertagni et al. Jul 1996 A
5541866 Sato et al. Jul 1996 A
5572443 Emoto et al. Nov 1996 A
5615275 Bertagni Mar 1997 A
5617480 Ballard et al. Apr 1997 A
5638456 Conley et al. Jun 1997 A
5640685 Komoda Jun 1997 A
5671287 Gerzon Sep 1997 A
5693917 Bertagni et al. Dec 1997 A
5699438 Smith et al. Dec 1997 A
5727074 Hildebrand Mar 1998 A
5737432 Werrbach Apr 1998 A
5828768 Eatwell et al. Oct 1998 A
5832097 Armstrong et al. Nov 1998 A
5838805 Warnaka et al. Nov 1998 A
5848164 Levine Dec 1998 A
5872852 Dougherty Feb 1999 A
5901231 Parrella et al. May 1999 A
5990955 Koz Nov 1999 A
6058196 Heron May 2000 A
6078670 Beyer Jun 2000 A
6093144 Jaeger et al. Jul 2000 A
6108431 Bachler Aug 2000 A
6201873 Dal Farra Mar 2001 B1
6202601 Ouellette et al. Mar 2001 B1
6208237 Saiki et al. Mar 2001 B1
6263354 Gandhi Jul 2001 B1
6285767 Klayman Sep 2001 B1
6292511 Goldston et al. Sep 2001 B1
6317117 Goff Nov 2001 B1
6318797 Böhm et al. Nov 2001 B1
6332029 Azima et al. Dec 2001 B1
6518852 Derrick Feb 2003 B1
6535846 Shashoua Mar 2003 B1
6570993 Fukuyama May 2003 B1
6618487 Azima et al. Sep 2003 B1
6661897 Smith Dec 2003 B2
6661900 Allred et al. Dec 2003 B1
6772114 Sluijter et al. Aug 2004 B1
6847258 Ishida et al. Jan 2005 B2
6871525 Withnall et al. Mar 2005 B2
6907391 Bellora et al. Jun 2005 B2
6999826 Zhou et al. Feb 2006 B1
7006653 Guenther Feb 2006 B2
7016746 Wiser et al. Mar 2006 B2
7024001 Nakada Apr 2006 B1
7058463 Ruha et al. Jun 2006 B1
7123728 King et al. Oct 2006 B2
7254243 Bongiovi Aug 2007 B2
7266205 Miller Sep 2007 B2
7274795 Bongiovi Sep 2007 B2
7519189 Bongiovi Apr 2009 B2
7577263 Tourwe Aug 2009 B2
7613314 Camp, Jr. Nov 2009 B2
7676048 Tsutsui Mar 2010 B2
7711442 Ryle et al. May 2010 B2
7778718 Janke et al. Aug 2010 B2
7916876 Helsloot Mar 2011 B1
8068621 Okabayashi et al. Nov 2011 B2
8160274 Bongiovi Apr 2012 B2
8175287 Ueno et al. May 2012 B2
8229136 Bongiovi Jul 2012 B2
8284955 Bongiovi et al. Oct 2012 B2
8462963 Bongiovi Jun 2013 B2
8472642 Bongiovi Jun 2013 B2
8503701 Miles Aug 2013 B2
8565449 Bongiovi Oct 2013 B2
8705765 Bongiovi Apr 2014 B2
8879743 Mitra Nov 2014 B1
9195433 Bongiovi et al. Nov 2015 B2
9264004 Bongiovi et al. Feb 2016 B2
9276542 Bongiovi et al. Mar 2016 B2
9281794 Bongiovi et al. Mar 2016 B1
9344828 Bongiovi et al. May 2016 B2
9348904 Bongiovi et al. May 2016 B2
9350309 Bongiovi et al. May 2016 B2
9397629 Bongiovi et al. Jul 2016 B2
9398394 Bongiovi et al. Jul 2016 B2
20010008535 Lanigan Jul 2001 A1
20010043704 Schwartz Nov 2001 A1
20020057808 Goldstein May 2002 A1
20020094096 Paritsky Jul 2002 A1
20030016838 Paritsky Jan 2003 A1
20030023429 Claesson Jan 2003 A1
20030035555 King et al. Feb 2003 A1
20030043940 Janky et al. Mar 2003 A1
20030112088 Bizjak Jun 2003 A1
20030138117 Goff Jul 2003 A1
20030142841 Wiegand Jul 2003 A1
20030164546 Giger Sep 2003 A1
20030179891 Rabinowitz et al. Sep 2003 A1
20030216907 Thomas Nov 2003 A1
20040003805 Ono et al. Jan 2004 A1
20040022400 Magrath Feb 2004 A1
20040044804 MacFarlane Mar 2004 A1
20040086144 Kallen May 2004 A1
20040138769 Akiho Jul 2004 A1
20040146170 Zint Jul 2004 A1
20050090295 Ali et al. Apr 2005 A1
20050117771 Vosburgh et al. Jun 2005 A1
20050129248 Kraemer et al. Jun 2005 A1
20050175185 Korner Aug 2005 A1
20050201572 Lindahl et al. Sep 2005 A1
20050249272 Kirkeby et al. Nov 2005 A1
20050254564 Tsutsui Nov 2005 A1
20060034467 Sleboda et al. Feb 2006 A1
20060064301 Aguilar et al. Mar 2006 A1
20060098827 Paddock et al. May 2006 A1
20060126851 Yuen et al. Jun 2006 A1
20060126865 Blamey et al. Jun 2006 A1
20060138285 Oleski et al. Jun 2006 A1
20060140319 Eldredge et al. Jun 2006 A1
20060189841 Pluvinage Aug 2006 A1
20060291670 King et al. Dec 2006 A1
20070010132 Nelson Jan 2007 A1
20070119421 Lewis et al. May 2007 A1
20070173990 Smith et al. Jul 2007 A1
20070177459 Behn Aug 2007 A1
20070206643 Egan Sep 2007 A1
20070223713 Gunness Sep 2007 A1
20070223717 Boersma Sep 2007 A1
20070253577 Yen et al. Nov 2007 A1
20080031462 Walsh et al. Feb 2008 A1
20080040116 Cronin Feb 2008 A1
20080069385 Revit Mar 2008 A1
20080112576 Bongiovi May 2008 A1
20080123870 Stark May 2008 A1
20080123873 Bjorn-Josefsen et al. May 2008 A1
20080137881 Bongiovi Jun 2008 A1
20080165989 Seil et al. Jul 2008 A1
20080181424 Schulein et al. Jul 2008 A1
20080219459 Bongiovi et al. Sep 2008 A1
20080255855 Lee et al. Oct 2008 A1
20090022328 Neugebauer et al. Jan 2009 A1
20090054109 Hunt Feb 2009 A1
20090062946 Bongiovi et al. Mar 2009 A1
20090086996 Bongiovi et al. Apr 2009 A1
20090282810 Leone et al. Nov 2009 A1
20090290725 Huang Nov 2009 A1
20090296959 Bongiovi Dec 2009 A1
20100166222 Bongiovi Jul 2010 A1
20100256843 Bergstein et al. Oct 2010 A1
20100278364 Berg Nov 2010 A1
20100303278 Sahyoun Dec 2010 A1
20110013736 Tsukamoto et al. Jan 2011 A1
20110087346 Larsen et al. Apr 2011 A1
20110194712 Potard Aug 2011 A1
20110230137 Hicks et al. Sep 2011 A1
20110257833 Trush et al. Oct 2011 A1
20120014553 Bonanno Jan 2012 A1
20120099741 Gotoh et al. Apr 2012 A1
20120170759 Yuen et al. Jul 2012 A1
20120213034 Imran Aug 2012 A1
20120213375 Mahabub et al. Aug 2012 A1
20120302920 Bridger et al. Nov 2012 A1
20130121507 Bongiovi et al. May 2013 A1
20130162908 Son et al. Jun 2013 A1
20130163783 Burlingame Jun 2013 A1
20130169779 Pedersen Jul 2013 A1
20130220274 Deshpande et al. Aug 2013 A1
20130227631 Sharma et al. Aug 2013 A1
20130242191 Leyendecker Sep 2013 A1
20130288596 Suzuki et al. Oct 2013 A1
20130338504 Demos et al. Dec 2013 A1
20140067236 Henry et al. Mar 2014 A1
20140100682 Bongiovi Apr 2014 A1
20140112497 Bongiovi Apr 2014 A1
20140153765 Gan et al. Jun 2014 A1
20140185829 Bongiovi Jul 2014 A1
20140261301 Leone Sep 2014 A1
20140369504 Bongiovi Dec 2014 A1
20140379355 Hosokawsa Dec 2014 A1
20150215720 Carroll Jul 2015 A1
20150297169 Copt et al. Oct 2015 A1
20150297170 Copt et al. Oct 2015 A1
20160036402 Bongiovi et al. Feb 2016 A1
20160044436 Copt et al. Feb 2016 A1
20160240208 Bongiovi et al. Aug 2016 A1
20160258907 Butera, III et al. Sep 2016 A1
20160344361 Bongiovi et al. Nov 2016 A1
20170033755 Bongiovi et al. Feb 2017 A1
20170041732 Bongiovi et al. Feb 2017 A1
Foreign Referenced Citations (142)
Number Date Country
2005274099 Oct 2010 AU
20070325096 Apr 2012 AU
2012202127 Jul 2014 AU
96114177 Feb 1999 BR
96113723 Jul 1999 BR
2533221 Jun 1995 CA
2161412 Apr 2000 CA
2576829 Jul 2014 CA
1173268 Feb 1998 CN
12221528 Jun 1999 CN
101536541 Sep 2009 CN
101946526 Jan 2011 CN
102265641 Nov 2011 CN
102652337 Aug 2012 CN
103004237 Mar 2013 CN
0780050323 May 2013 CN
203057339 Jul 2013 CN
0206746 Aug 1992 EP
0541646 Jan 1995 EP
0580579 Jun 1998 EP
0698298 Feb 2000 EP
0932523 Jun 2000 EP
0666012 Nov 2002 EP
2814267 Oct 2016 EP
2218599 Oct 1998 ES
2249788 Oct 1998 ES
2219949 Aug 1999 ES
2003707 Mar 1979 GB
2320393 Dec 1996 GB
P0031074 Jun 2012 ID
260362 Apr 2014 IN
198914 Jul 2014 IS
3150910 Jun 1991 JP
7106876 Apr 1995 JP
1020040022442 Mar 2004 JP
2005500768 Jan 2005 JP
1020090101209 Sep 2009 JP
4787255 Jul 2011 JP
5048782 Jul 2012 JP
201543561 Mar 2015 JP
1020040022442 Mar 2004 KR
1020090101209 Sep 2009 KR
101503541 Mar 2015 KR
J001182 Oct 2013 MO
274143 Aug 2005 MX
301172 Nov 2006 MX
315197 Nov 2013 MX
553744 Jan 2009 NZ
574141 Apr 2010 NZ
557201 May 2012 NZ
12009501073 Nov 2014 PH
2407142 Dec 2010 RU
2483363 May 2013 RU
152762 Dec 2011 SG
155213 Feb 2013 SG
1319288 Jun 1987 SU
WO 9219080 Oct 1992 WO
WO 9311637 Jun 1993 WO
WO 9321743 Oct 1993 WO
WO 9427331 Nov 1994 WO
WO 9514296 May 1995 WO
WO 9531805 Nov 1995 WO
WO 9535628 Dec 1995 WO
WO 9601547 Jan 1996 WO
WO 9611465 Apr 1996 WO
WO 9708847 Mar 1997 WO
WO 9709698 Mar 1997 WO
WO 9709840 Mar 1997 WO
WO 9709841 Mar 1997 WO
WO 9709842 Mar 1997 WO
WO 9709843 Mar 1997 WO
WO 9709844 Mar 1997 WO
WO 9709845 Mar 1997 WO
WO 9709846 Mar 1997 WO
WO 9709848 Mar 1997 WO
WO 9709849 Mar 1997 WO
WO 9709852 Mar 1997 WO
WO 9709853 Mar 1997 WO
WO 9709854 Mar 1997 WO
WO 9709855 Mar 1997 WO
WO 9709856 Mar 1997 WO
WO 9709857 Mar 1997 WO
WO 9709858 Mar 1997 WO
WO 9709859 Mar 1997 WO
WO 9709861 Mar 1997 WO
WO 9709862 Mar 1997 WO
WO 9717818 May 1997 WO
WO 9717820 May 1997 WO
WO 9813942 Apr 1998 WO
WO 9816409 Apr 1998 WO
WO 9828942 Jul 1998 WO
WO 9831188 Jul 1998 WO
WO 9834320 Aug 1998 WO
WO 9839947 Sep 1998 WO
WO 9842536 Oct 1998 WO
WO 9843464 Oct 1998 WO
WO 9852381 Nov 1998 WO
WO 9852383 Nov 1998 WO
WO 9853638 Nov 1998 WO
WO 9902012 Jan 1999 WO
WO 9908479 Feb 1999 WO
WO 9911490 Mar 1999 WO
WO 9912387 Mar 1999 WO
WO 9913684 Mar 1999 WO
WO 9921397 Apr 1999 WO
WO 9935636 Jul 1999 WO
WO 9935883 Jul 1999 WO
WO 9937121 Jul 1999 WO
WO 9938155 Jul 1999 WO
WO 9941939 Aug 1999 WO
WO 9952322 Oct 1999 WO
WO 9952324 Oct 1999 WO
WO 9956497 Nov 1999 WO
WO 9962294 Dec 1999 WO
WO 9965274 Dec 1999 WO
WO 0001264 Jan 2000 WO
WO 0002417 Jan 2000 WO
WO 0007408 Feb 2000 WO
WO 0007409 Feb 2000 WO
WO 0013464 Mar 2000 WO
WO 0015003 Mar 2000 WO
WO 0033612 Jun 2000 WO
WO 0033613 Jun 2000 WO
WO 03104924 Dec 2003 WO
WO 2006020427 Feb 2006 WO
WO 2007092420 Aug 2007 WO
WO 2008067454 Jun 2008 WO
WO 2009070797 Jun 2009 WO
WO 2009114746 Sep 2009 WO
WO 2009155057 Dec 2009 WO
WO 2010027705 Mar 2010 WO
WO 2010051354 May 2010 WO
WO 2011081965 Jul 2011 WO
WO 2013055394 Apr 2013 WO
WO 2013076223 May 2013 WO
WO2014201103 Dec 2014 WO
WO 2015061393 Apr 2015 WO
WO 2015077681 May 2015 WO
WO 2015161034 Oct 2015 WO
WO 2016019263 Feb 2016 WO
WO 2016022422 Feb 2016 WO
WO2016144861 Sep 2016 WO
Non-Patent Literature Citations (1)
Entry
NovaSound Int., http://www.novasoundint.com/new_page_t.htm, 2004.
Related Publications (1)
Number Date Country
20160258907 A1 Sep 2016 US