The subject matter of this disclosure relates to a hearing enhancement device, and more specifically, to a hearing enhancement device capable of functioning together with a coprocessor device.
Historically, hearing aids assisted people with hearing loss by providing sound amplification. Typically, hearing aids include microphones to detect external sound, a processor to amplify the detected sound, a battery, and a speaker to present amplified sound to a user. Many hearing aids presently translate the detected sound into a digital signal and use a digital signal processor (DSP) to process the signal. The DSP can manipulate the signal by applying signal processing algorithms stored on the hearing aid to improve the quality of the amplified sound.
Wearers of hearing aids desire increasingly smaller sized devices to improve comfort and personal appearance. However, the small size of hearing aids limits functionality. This form-factor constraint is apparent in short battery life, low powered processors, and weak signal processing algorithms. Sound processing is limited due to the constraints imposed by the small size of hearing aids. For example, much of the processing power of current hearing aids is devoted to reducing feedback, and thus, remaining processing power is unable to run powerful signal processing algorithms.
It is desirable to maintain the hearing aid as a small device that is placed in or on the ear of a user. It is also desirable for hearing aid users to have a device that is portable, always present, and able to produce high quality amplified sound. Even with increases in processor power and component miniaturization, hearing aid users still have many complaints about the capabilities of current hearing aids. Therefore, methods and devices that provide improved signal processing and function within the existing form-factor constraints would have considerable utility.
Most of the form-factor limitations of conventional hearing aids can be overcome by coupling a hearing aid to an external coprocessor device. Since the coprocessor device is not required to be placed in or near the ear, it is possible for the coprocessor device to have a powerful processor with greater functionality than a stand-alone hearing assist device. By sending a signal detected at the hearing assist device out to a coprocessor for processing, it is possible to realize the benefits of a small hearing assist device without sacrificing signal processing power.
In one aspect, the hearing assist device has a processor and a memory to store signal processing algorithms. Thus, the hearing assist device is able to process signals (e.g., audio signals converted into electronic form) without a coprocessor device. In order to communicate with the coprocessor device, the hearing assist device may also include a communication interface and a handshaking module to receive information regarding a functionality of the coprocessor device via the communication interface. In some instances the coprocessor device may have different capabilities than the hearing assist device, so a functionality comparing module in the hearing assist device compares the functionality of the coprocessor device to a functionality of the hearing assist device. Since there may be instances in which the hearing assist device will provide better signal processing and other instances in which the coprocessor device would be a superior processor, a processor switching module in the hearing assist device may direct the signal for at least partial processing to a processor in either (or both) of the hearing assist device or the coprocessor device. The processed signal is then returned to the hearing assist device (if processed by a coprocessor device) and presented to a user by a means such as a speaker on the hearing assist device.
The detailed description is described with reference to the accompanying figures. In the figures, the use of the same reference numbers in different figures indicates similar or identical items. These drawings depict only illustrative embodiments of the invention and are not, therefore, to be considered to be limiting of its scope.
This disclosure describes techniques by which the form-factor constraints inherent in hearing aids are overcome by leveraging the processing power of an additional processor, such as a coprocessor, which does not suffer from the same form-factor constraints. Processing power superior to that provided by conventional hearing aids has become ubiquitous in modern societies in the form of mobile phones, personal digital assistants, electronic music players, desktop and laptop computers, game consoles, television set-top-boxes, automobile radios, navigation systems, and the like. Any of these devices may function as a coprocessor, while continuing to perform the primary functions of each respective device. The coprocessor may also be a device specially designed to function together with a hearing aid.
Permanent coupling to the coprocessor device, however, requires that a hearing aid user always bring a coprocessor device if he or she desires to benefit from the hearing aid. The bulk of a coprocessor device may be undesirable when, for example, engaged in sports. Operation of the coprocessor device may even be prohibited at times such as while on an airplane or near sensitive medical equipment. In such situations the hearing aid user may desire whatever benefit the hearing aid can provide even if enhanced processing of the coprocessor device is not available. Thus, it is desirable to have a hearing aid that will function as a stand-alone-device in the absence of a coprocessor device, and provide enhanced functionality if and when a coprocessor is available.
In some embodiments, the hearing aid provides sound enhancement to a user with diminished hearing capacity. However, in other embodiments, the methods and devices of the present disclosure enhance the hearing abilities of a user with or without impaired hearing. For example, appropriate signal processing algorithms used together with the subject of the present disclosure may allow a soldier to distinguish the snap of a twig from other sounds in a forest, or allow a mechanic to detect a grating of gears inside a noisy engine. Accordingly, devices of the present disclosure are referred to as hearing assist devices to encompass devices used to enhance sound for users with or without hearing impairment.
Each hearing assist device 102 may include a processor switching module 110 to manage routing of signals amongst the processors of the hearing assist device 102 and one or more of the coprocessor devices 104. The coprocessor devices may include a handshaking module 112 to facilitate communication between the hearing assist device 102 and the coprocessor device 104, including sending information describing a functionality of the coprocessor device 104 to the hearing assist device 102 as part of the handshaking.
Flexibility inherent in the system 100 of the present disclosure allows one hearing assist device 102 to communicate with zero to m coprocessor devices 104. Moreover, the hearing assist device 102 may dynamically add or drop coprocessor devices 104 on the fly. The hearing assist device 102 functions as a stand-alone device when zero coprocessor devices 104 are present. The hearing assist device 102(a) may, for example, communicate only with coprocessor device 104(a) via the wired communication interface 106. In other embodiments, hearing assist device 102(a) may communicate with a first coprocessor device 104(a) via the wired communication interface 106 and a second coprocessor device 104(m) via the wireless communication interface. Many other communication paths are covered within the scope of the present disclosure including a hearing assist device 102 communicating with more than two coprocessor devices 104 through any combination of wired and/or wireless communication interfaces.
It is also envisioned that, in some embodiments, more than one hearing assist device 102 may communicate with a coprocessor device. For example, hearing assist device 102(a) and hearing assist device 102(n) may both communicate with coprocessor device 104(m) via two wireless communication interfaces 108. The two hearing assist devices, 102(a) and 102(n), may represent devices placed in a right ear and a left ear of a single user. The two hearing assist devices 102(a) and 102(n) may alternatively represent devices worn by two different users. Many other communication paths are covered within the scope of the present disclosure, including multiple users each wearing one or two hearing assist devices 102 and all of the hearing assist devices 102 using a coprocessor device 104 through a plurality of wired and/or wireless communication interfaces.
Any combination of multiple hearing assist devices 102 in communication with single or multiple coprocessor devices 104 is also within the scope of the present disclosure. For example, hearing assist device 102(a) may be connected to coprocessor device 104(a) via a wired communication interface 106 and to coprocessor device 104(m) via a wireless communication interface 108, while at the same time hearing assist device 102(n) may also be connected to coprocessor device 104(m) via a wireless communication interface.
The hearing assist devices 102 may also be able to communicate with other hearing assist devices either directly (not shown) or via a coprocessor device 104, such as hearing assist device 102(a) communicating with hearing assist device 102(n) via coprocessor device 104(m). Thus, a given hearing assist device 102 may stand alone and communicate with no other devices, it may communicate with one or more coprocessor devices 104, it may communicate with one or more other hearing assist devices 102, or it may communicate with the one or more coprocessor devices 104 and one or more other hearing assist devices 102.
The coprocessor devices 104 may also be able to communicate with other coprocessor devices (not shown). The coprocessor devices 104 may also communicate with a server 110. In some embodiments, the server 110 may be a network server connected to a network such as the Internet. Communication between the coprocessor devices 104 and the server 110 may be wired or wireless. In some embodiments, not shown, a coprocessor device 104 may be a component of a larger computing device and the server may be another component of the same larger computing device. Thus, a given coprocessor device 104 may communicate with one or more hearing assist devices 102, and/or with one or more other coprocessor devices 104, and/or with one or more servers 110.
Hearing Assist Device
The handshaking module 214 of the hearing assist device 102 may be configured to receive information describing a functionality of the coprocessor device 104 via the communication interface 212. Examples of specific functionalities of the coprocessor device are described below. In some embodiments, the handshaking module 214 may also send information describing a functionality of the hearing assist device 102 to the coprocessor device 104. By using the handshaking module 214 to mediate initial communications between the hearing assist device 102 and the coprocessor device 104, the hearing assist device 102 is able to do more than merely open a communication channel to passively await a transfer of data. The handshaking module 214 allows for exchange of information describing the functionalities of the hearing assist device 102 and the coprocessor device 104, such that communicative connections will be made only if necessary and only to the extent necessary to provide an enhanced processing to the hearing assist device 102.
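The handshaking decision described above can be sketched in code. The following Python sketch is purely illustrative and not part of the disclosure; the `DeviceFunctionality` record and `handshake` function are hypothetical names, assuming each device exchanges a simple capability record and the hearing assist device keeps a connection open only when the remote device offers something beyond its own capabilities:

```python
from dataclasses import dataclass, field

@dataclass
class DeviceFunctionality:
    """Hypothetical capability record exchanged during handshaking."""
    algorithms: set = field(default_factory=set)  # available signal processing algorithms
    processor_speed_mhz: int = 0
    free_memory_kb: int = 0

def handshake(local: DeviceFunctionality, remote: DeviceFunctionality) -> bool:
    """Return True only if the remote device offers functionality beyond the
    local device, so the connection is worth maintaining (otherwise the
    hearing assist device may terminate it immediately after handshaking)."""
    offers_new_algorithm = bool(remote.algorithms - local.algorithms)
    offers_more_power = remote.processor_speed_mhz > local.processor_speed_mhz
    return offers_new_algorithm or offers_more_power
```

Under this sketch, a laptop offering a pitch-shifting algorithm absent from the hearing assist device would cause `handshake` to return `True`, while a coprocessor offering nothing new would be disconnected.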
Once the functionalities of the hearing assist device 102 and the coprocessor device 104 are known, then a functionality comparing module 216 may compare the functionality of the coprocessor device 104 to the functionality of the hearing assist device 102. If multiple coprocessor devices 104 are available, the functionality comparing module 216 may compare the functionality of each coprocessor device 104 to each other and/or to the functionality of the hearing assist device 102. In some embodiments, the functionality may be a signal processing algorithm. The functionality comparing module 216 may determine that a signal processing algorithm on one of the coprocessor devices 104 provides a signal processing functionality absent from the hearing assist device 102 and also absent from other coprocessor devices 104. For example, a laptop computer functioning as a coprocessor device may have a pitch-shifting signal processing algorithm which all other devices in the system lack. In such a situation it may be desirable to process the signal at the laptop computer to benefit from the pitch-shifting ability of the coprocessor 104.
Even if a same general type of signal processing functionality is present on both the hearing assist device 102 and the coprocessor device 104, that signal processing functionality may be enhanced on the coprocessor device 104. For example, both the hearing assist device 102 and the coprocessor device 104 may include versions of a signal processing algorithm for signal separation, but the specific algorithm on the coprocessor device 104 may, for example, provide greater signal separation than the algorithm on the hearing assist device 102. Thus, the signal processing functionality present on the coprocessor device 104 is enhanced as compared to the signal processing functionality on the hearing assist device 102 because of the enhanced signal processing algorithm (e.g., the greater signal separation algorithm) available on the coprocessor device 104.
Enhanced signal processing functionality may also be achieved when two devices have identical signal processing algorithms but one device provides an enhanced processing capability. For example, the processor power or memory available on the coprocessor device 104 may allow that coprocessor device 104 to provide enhanced processing capability as compared to the hearing assist device 102 even when both devices use the same signal processing algorithm.
When multiple coprocessor devices 104 are present, the functionality comparing module 216 may compare a signal processing functionality of one coprocessor device 104 to another coprocessor device 104 to determine if any of the coprocessor devices 104 have a signal processing functionality absent from the other devices. The functionality comparing module 216 may also determine if any of the plurality of coprocessor devices 104 has an enhanced signal processing functionality (either in terms of a superior algorithm or in terms of processing power or memory) as compared to the other coprocessor devices 104.
The hearing assist device 102 also includes a processor switching module 110 configured to direct the signal for at least partial processing to the processor 206 of the hearing assist device 102 and/or a processor of the coprocessor device. Given the coprocessor devices 104 available to the hearing assist device 102 at any point in time, the functionality comparing module 216 will determine which processor or combination of processors can provide desired signal processing functionality for the needs of the user of the hearing assist device 102. The desired signal processing functionality may be determined in advance by an audiologist or a manufacturer of the hearing assist device. In some embodiments the desired signal processing functionality may be determined by the user (e.g., manual selection) or by the coprocessor device (e.g., if the coprocessor device is a car radio, then the desired signal processing functionality includes correction for road and engine noise). The processor switching module 110 may then dynamically switch processing of the signal based on the comparisons performed by the functionality comparing module 216.
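The routing decision made by a processor switching module of this kind might be sketched as follows. This Python fragment is an illustrative assumption rather than the disclosed implementation; `choose_processors` and its arguments are hypothetical names, assuming each desired algorithm is routed to whichever device provides it:

```python
def choose_processors(desired, hearing_aid_algs, coprocessor_algs):
    """Route each desired signal processing algorithm to the device that
    provides it, preferring the hearing assist device when both devices do.
    Returns a mapping from algorithm name to the chosen processor."""
    routing = {}
    for alg in desired:
        if alg in hearing_aid_algs:
            routing[alg] = "hearing_assist"
        elif alg in coprocessor_algs:
            routing[alg] = "coprocessor"
        else:
            routing[alg] = None  # functionality unavailable on any device
    return routing
```

In practice the preference could also run the other way (favoring a coprocessor with an enhanced version of the same algorithm), which is simply a different branch ordering in the sketch above.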
In some embodiments, the signal may be processed in series by the processor switching module 110 directing the signal to the processor 206 of the hearing assist device 102 and/or processors of one or more coprocessor devices 104. For example, a sound detected by the sensor 202 and converted to a signal by the converter 204 may be initially processed at the processor 206, sent to a first coprocessor via the communication interface 212 for additional processing, sent from the first coprocessor to a second coprocessor for further processing, and finally received from the second coprocessor via the communication interface 212. One benefit of processing the signal in series is that the processing by the first coprocessor device can be taken into account by the second coprocessor. In other embodiments, the signal may be processed in parallel by the processor 206 of the hearing assist device 102 and/or processors of one or more coprocessor devices 104. When processed in parallel, the signal may be processed substantially simultaneously by a plurality of processors and then the respective processed signals may be integrated into one signal at the hearing assist device 102 by an integrator (not shown). One possible benefit of processing the signal in parallel is that latency of signal processing by the coprocessors is minimized.
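The series and parallel arrangements described above can be illustrated with a short sketch. The code below is hypothetical and not from the disclosure; each "stage" stands in for processing at the hearing assist device or at a coprocessor device, and `integrate` stands in for the integrator that merges parallel results at the hearing assist device:

```python
def process_in_series(signal, stages):
    """Series processing: each stage receives the output of the previous
    stage, so later coprocessors can take earlier processing into account."""
    for stage in stages:
        signal = stage(signal)
    return signal

def process_in_parallel(signal, stages, integrate):
    """Parallel processing: every stage receives the original signal
    substantially simultaneously; the respective processed signals are then
    merged into one signal by an integrator at the hearing assist device."""
    return integrate([stage(signal) for stage in stages])
```

For example, with one stage doubling amplitude and another halving it, series processing returns the original signal, while parallel processing averages the two independently processed results.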
Ultimately, the processed signal is presented to the user of the hearing assist device. The hearing assist device 102 includes a stimulator configured to stimulate an auditory nerve of a user. The stimulator may take any form which directly or indirectly induces the auditory nerve to generate an electrical signal that is perceived by the user as representing sound. In some embodiments the stimulator may be a speaker. In other embodiments the stimulator may be a device, such as a cochlear implant, that acts directly on the auditory nerve.
While the hearing assist device 102 is shown and described as having certain hardware and software modules, it should be understood that all modules may be implemented as appropriate in hardware, software, firmware, or combinations thereof. If implemented by software, the software may reside on memory associated with any component of the hearing assist device 102, standalone memory provided in connection with the hearing assist device 102, a remote memory storage device, removable/nonremovable memory, a combination of the foregoing, or any other combination of one or more processor-readable media. While the hearing assist device 102 is shown as having certain modules, it should be understood that in some embodiments, one or more of the modules could be combined or omitted entirely.
Coprocessor Device
Coprocessor device 104 includes a processor 306 configured to process a signal. In one embodiment the signal may be a signal received from a hearing assist device 102. In another embodiment the signal may be a signal received from another coprocessor 104. In yet another embodiment the signal may be a signal from the converter 304.
Coprocessor device 104 also includes a memory 308 configured to store signal processing algorithms. The signal processing algorithms 310 may include, but are not limited to, echo cancellation, noise reduction, directionality, pitch shifting, signal separation, audio compression, sub-band processing, language translation, user customized hearing profiles, and feedback reduction algorithms as well as audiologist customizations.
The coprocessor device 104 also includes a communication interface 312 similar to the communication interface 212 of the hearing assist device 102 described above.
In some situations the coprocessor device 104 may initially lack a signal processing functionality required by the user. In one embodiment, the communication interface 312 may be configured to send a signal from the coprocessor device 104 to a server 110 in order to access additional signal processing functionality available on the server. For example, the server 110 may function similarly to, or make use of, an ITUNES® server by receiving requests for signal processing algorithms (instead of songs) from one or many coprocessor devices 104 (instead of MP3 players). ITUNES® is available from Apple Inc. of Cupertino, Calif. The coprocessor device 104 may be preconfigured with the address and access information for server 110, or the address and/or access information may be provided to the coprocessor device 104 along with the signal from the communication interface 312 of the hearing assist device. If multiple servers 110 are available, the coprocessor device 104 may choose a server 110 from which to obtain the signal processing algorithm, or a server 110 may be designated by information received from the hearing assist device 102.
The handshaking module 112 of the coprocessor device 104 may be configured to send information describing a functionality of the coprocessor device 104 via the communication interface 312. The functionality of the coprocessor device 104 may include, but is not limited to, a processor speed, a processor load, a processor capability, a memory capacity, a memory capability, an available signal processing algorithm, an enhancement of a signal processing algorithm, a sensor capability, and a strength of a communication signal. Memory capacity may be any measure of a capacity to store information such as total capacity, available capacity, capacity dedicated to signal processing, and the like. Interactions between the handshaking module 214 of the hearing assist device 102 and the handshaking module 112 of the coprocessor device 104 allow the hearing assist device 102 to decide when and if to use the processing capabilities of available coprocessor devices 104. For example, the hearing assist device 102 may terminate a connection to the coprocessor device 104 immediately following handshaking if the coprocessor device 104 provides no signal processing functionality beyond that available on the hearing assist device 102. In other situations, the handshaking may continue even after a communication channel is established to inform the hearing assist device 102 of a change in the signal processing functionality of the coprocessor device 104. For example, the coprocessor device 104 may have a changed signal processing functionality due to installation of a new signal processing algorithm or change in a processor load due to changes in demands placed on the processor 306. If multiple servers 110 are available, the handshaking module 112 may decode information sent from the hearing assist device 102 in order to determine which server to utilize.
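The coprocessor-side behavior described above, announcing capabilities and re-announcing them when functionality changes, might be sketched as follows. This is an illustrative assumption; `CoprocessorHandshake`, `announce`, and `install_algorithm` are hypothetical names, and `send` stands in for delivery over the communication interface 312:

```python
class CoprocessorHandshake:
    """Hypothetical coprocessor-side handshaking module that announces the
    device's functionality and re-announces it whenever it changes."""

    def __init__(self, send):
        self.send = send  # callable that delivers a message over the communication interface
        self.state = {"processor_load": 0.2, "algorithms": {"noise_reduction"}}

    def announce(self):
        # Send the current capability description to the hearing assist device.
        self.send({"type": "capabilities", **self.state})

    def install_algorithm(self, name):
        """Installing a new signal processing algorithm changes the device's
        functionality, so the hearing assist device is informed and can
        re-run its functionality comparison."""
        self.state["algorithms"].add(name)
        self.announce()
```

A change in processor load due to other demands on the processor 306 could trigger the same re-announcement path.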
While the coprocessor device 104 is shown and described as having certain hardware and software modules, it should be understood that all modules may be implemented as appropriate in hardware, software, firmware, or combinations thereof. If implemented by software, the software may reside on memory associated with any component of the coprocessor device 104, standalone memory provided in connection with the coprocessor device 104, a remote memory storage device, removable/nonremovable memory, a combination of the foregoing, or any other combination of one or more processor-readable media. While the coprocessor device 104 is shown as having certain modules, it should be understood that in some embodiments, one or more of the modules could be combined or omitted entirely.
Signal Direction Process
At 408, the process 400 directs a signal to a processor of the hearing assist device and/or the coprocessor device. As discussed above, the signal may be processed by either or both devices. The directing may be performed by the processor switching module 110 of the hearing assist device 102 described above.
The directing of the signal at 408 may direct the signal to be processed, at 410, by the hearing assist device. For example, if no coprocessor devices are available, then the signal will be processed at the hearing assist device. The signal may also be directed to the hearing assist device if the functionality at the coprocessor device is the same as, or inferior to, the functionality at the hearing assist device.
The directing of the signal at 408 may also direct the signal to be processed at the coprocessor device. In order to process the signal at the coprocessor device, the signal is sent to the coprocessor device for processing (at 412). Following that processing, the hearing assist device will receive a processed signal from the coprocessor device (at 414). For example, the signal may be directed to the coprocessor device whenever the coprocessor device is available. Additionally or alternatively, the signal may be directed to the coprocessor device based on a user input. The user input may, in some embodiments, override other considerations regarding direction of a signal. Directing the signal to the coprocessor device and/or receiving a processed signal from the coprocessor device (at 412 and 414) also includes the directing and/or receiving with respect to a plurality of coprocessor devices. In another example, the signal may be directed to the coprocessor device based upon a determination that the coprocessor device has a necessary and/or superior functionality.
The direction of the signal at 408 may direct the signal to both the hearing assist device and to the coprocessor device. As discussed above, the signal may be split and processed by a plurality of processors in parallel or sent in series through a plurality of processors. Directing the signal to both devices may occur, for example, if the hearing assist device has some signal processing algorithms not available on the coprocessor device, and the coprocessor device has other signal processing algorithms not available on the hearing assist device. Parallel processing on the hearing assist device and coprocessor may also be used to speed overall processing of the signal by distributing the processing job between the devices.
FIGS. 5a and 5b show a flowchart of an illustrative process 500 for directing a signal for processing to a hearing assist device, a coprocessor device, and/or an additional coprocessor device. At 502, the signal is processed with a hearing assist device. The processing may include any of the signal processing algorithms discussed above or other processing. At 504, a coprocessor device may be detected. The detecting may be performed by the handshaking module 214 of the hearing assist device 102 described above.
If a coprocessor is detected at 504, the hearing assist device detects any additional coprocessor devices at 506. Any number of coprocessor devices (including additional coprocessor devices) may be detected by the hearing assist device. If more than two coprocessor devices are available, the detection at 506 may repeat until no additional coprocessor devices are detected. The detection of an additional coprocessor device may be via a direct connection (e.g., wired or wireless) to the communication interface 212 of the hearing assist device. In some embodiments the detection may be indirect. For example, the hearing assist device may detect the coprocessor device, but the hearing assist device may be unable to detect the additional coprocessor device. In such situations the coprocessor device may act as a bridge connecting the hearing assist device and the additional coprocessor device. In one embodiment the hearing assist device may have a wireless connection to a coprocessor device and the coprocessor device may be connected to a network, such as the Internet, thus connecting the coprocessor device—and indirectly the hearing assist device—to additional coprocessor devices. The coprocessor device may also be connected by a network to other devices such as servers, data stores, databases, or the like containing additional signal processing algorithms.
Following detection at 504 and/or at 506, the hearing assist device is connected, directly or indirectly, through wired or wireless connections to one or more coprocessor devices. Each of the coprocessor devices has a signal processing functionality that may be the same or different from the other coprocessor devices and from the hearing assist device. If no additional coprocessor is detected at 506, then at 508 a functionality of the coprocessor device is compared to a functionality of the hearing assist device. In some embodiments the comparing compares signal processing functionalities of both devices and may determine that one device has a functionality absent from the other device. For example, a pitch shifting functionality may be absent from the hearing assist device but available on the coprocessor device. In other embodiments the comparing compares signal processing functionalities, determines that a same functionality is present on both devices, but enhanced on one of the devices. The enhancement may be an enhanced signal processing algorithm. For example, both devices may have a noise reduction functionality, but the coprocessor device may have an enhanced algorithm that achieves greater noise reduction. The enhancement may also be an enhancement achieved through an enhanced processing capability. For example, the hearing assist device and the coprocessor device may both have a same noise reduction algorithm, but due to a faster processor in the coprocessor device the coprocessor device can achieve greater noise reduction and/or complete the processing in a shorter time, and thus, has an enhanced noise reduction functionality. Enhanced signal processing functionality is also possible due to a combination of an enhanced signal processing algorithm and an enhanced processing capability.
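The comparison at 508 distinguishes three cases: a functionality absent from one device, the same functionality enhanced by a better algorithm, and the same functionality enhanced by greater processing capability. A hypothetical sketch of such a comparison follows; `compare_functionality` and the `version`/`speed_mhz` fields are illustrative names, not part of the disclosure:

```python
def compare_functionality(hearing_aid, coprocessor, algorithm):
    """Decide where an algorithm should run. Each device is a mapping from
    algorithm name to a record with an algorithm 'version' and the
    'speed_mhz' of the processor that would run it."""
    ha = hearing_aid.get(algorithm)
    cp = coprocessor.get(algorithm)
    if cp is None:
        return "hearing_assist" if ha else "unavailable"
    if ha is None:
        return "coprocessor"  # functionality absent from the hearing assist device
    # Same functionality on both devices: prefer the enhanced signal
    # processing algorithm, breaking ties by processing capability.
    if cp["version"] != ha["version"]:
        return "coprocessor" if cp["version"] > ha["version"] else "hearing_assist"
    return "coprocessor" if cp["speed_mhz"] > ha["speed_mhz"] else "hearing_assist"
```

This mirrors the noise-reduction example above: an identical algorithm still lands on the coprocessor device when its faster processor provides the enhanced functionality.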
At 510, the signal may be directed to the hearing assist device for at least partial processing. In some embodiments the directing is based on the comparing at 508. For example, if the hearing assist device has a signal processing functionality absent from the coprocessor device or a signal processing functionality is enhanced on the hearing assist device then the signal will be directed to the hearing assist device. Alternatively, at 510, if the signal is not directed to the hearing assist device, it is directed to the coprocessor device. The signal is processed with the coprocessor device at 512. As discussed above, the signal may be processed in part by the hearing assist device and in part by the coprocessor device.
While 510 shows a yes/no split, it is to be understood that processing of the signal may be distributed between the hearing assist device and the coprocessor device based on the respective signal processing functionality present on each device or based on other factors. The signal may be processed in series (e.g., first at the hearing assist device and then at the coprocessor device or vice versa) or in parallel (e.g., substantially simultaneously at the hearing assist device and at the coprocessor device) and the resulting processed signals may be integrated at the hearing assist device before presentation to the user. If, at 514, the signal is processed in parallel with the hearing assist device and/or an additional coprocessor device the process 500 follows the "yes" path and the signals which were processed in parallel are integrated at the hearing assist device (at 516). If, at 514, the process 500 follows the "no" path then the signals are processed in series (at 518) and do not require integration.
If, at 506, the additional coprocessor device is detected, the respective functionalities of the hearing assist device, the coprocessor device, and the additional coprocessor device are compared at 520. The comparisons at 520 are analogous to the comparisons at 508, but at 520 three (or more) devices are compared each to the others. Connections between the hearing assist device and the coprocessor device and/or the additional coprocessor device may be dynamic. Wireless signals may be lost and wired connections may be unplugged. Presence of the coprocessor device and/or the additional coprocessor device may be confirmed by periodic pings sent from the hearing assist device or heartbeats sent from the coprocessor device or the additional coprocessor device. Absence of a previously available coprocessor device or additional coprocessor device may be detected by a failure to receive an expected signal from the coprocessor device or the additional coprocessor device. If the coprocessor device or the additional coprocessor device is no longer detected, then the comparing at 520 (or at 508) may repeat. The results of the comparing may change when available coprocessors change.
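The heartbeat-based presence tracking described above can be sketched briefly. The code below is a hypothetical illustration, not the disclosed implementation; `PresenceMonitor` and its timeout policy are assumed names, with an injectable clock so the behavior is testable:

```python
import time

class PresenceMonitor:
    """Track coprocessor liveness from periodic heartbeats. A device is
    considered absent, and dropped from the comparison, when no heartbeat
    arrives within the timeout window."""

    def __init__(self, timeout_s=5.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock
        self.last_seen = {}  # device id -> time of most recent heartbeat

    def heartbeat(self, device_id):
        # Record a heartbeat (or a reply to a ping from the hearing assist device).
        self.last_seen[device_id] = self.clock()

    def available(self):
        """Return the set of coprocessor devices still considered present."""
        now = self.clock()
        return {d for d, t in self.last_seen.items() if now - t <= self.timeout_s}
```

When the set returned by `available()` changes, the functionality comparison (at 508 or 520) would be repeated with the new set of coprocessor devices.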
At 522, the signal may be directed to the hearing assist device for at least partial processing.
The process 500 continues in
As discussed above, the processing may be in series or in parallel. If, at 528, the signal is processed in parallel at the hearing assist device and/or the coprocessor device, the process 500 follows the “yes” path and the signals that were processed in parallel are integrated at the hearing assist device (at 516). If, at 528, the process 500 follows the “no” path, then the signals are processed in series (at 518) and do not require integration. With three or more devices, the processing may also be a combination of series and parallel processing. For example, the signal may be processed in series with respect to the hearing assist device and the coprocessor devices as a group, and processed in parallel with respect to the coprocessor device and the additional coprocessor device.
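The combined series/parallel example above can be sketched as a three-stage pipeline: a series step at the hearing assist device, a parallel step across the two coprocessor devices, and integration back at the hearing assist device. The stage functions and their placeholder arithmetic are hypothetical, chosen only to make the routing concrete.

```python
def hearing_aid_stage(signal):
    # Placeholder processing on the hearing assist device (series step).
    return [s + 1.0 for s in signal]

def coprocessor_stage(signal):
    # Placeholder processing on the coprocessor device.
    return [s * 2.0 for s in signal]

def additional_coprocessor_stage(signal):
    # Placeholder processing on the additional coprocessor device.
    return [s * 4.0 for s in signal]

def combined_pipeline(signal):
    # Series with respect to the hearing assist device and the
    # coprocessor devices as a group...
    staged = hearing_aid_stage(signal)
    # ...parallel with respect to the coprocessor device and the
    # additional coprocessor device...
    a = coprocessor_stage(staged)
    b = additional_coprocessor_stage(staged)
    # ...and integrated at the hearing assist device (as at 516).
    return [(x + y) / 2 for x, y in zip(a, b)]
```

With more devices, the same pattern nests: any series segment may itself contain a parallel fan-out whose branches are integrated before the next series stage.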
Any of the acts of any of the methods described herein may be implemented at least partially by a processor or other electronic device based on instructions stored on one or more processor-readable media. By way of example, and not limitation, processor-readable media may comprise volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as processor-readable instructions, data structures, program modules or other data. Processor-readable media includes, but is not limited to, RAM, ROM, electrically erasable programmable ROM (EEPROM), flash memory or other memory technology, compact disk-ROM (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information. Combinations of any of the above should also be included within the scope of processor-readable media.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Based on the teachings of the present disclosure, a variety of alternate embodiments may be conceived, and the present disclosure is not limited to the particular embodiments described herein and shown in the accompanying figures. Rather, the specific features and acts are disclosed as illustrative examples.
This application claims the benefit of U.S. Provisional Application No. 61/188,840 filed Aug. 13, 2008.
Number | Name | Date | Kind |
---|---|---|---|
4918737 | Luethi | Apr 1990 | A |
5390254 | Adelman | Feb 1995 | A |
5479522 | Lindemann et al. | Dec 1995 | A |
5710819 | Tøpholm et al. | Jan 1998 | A |
5721783 | Anderson | Feb 1998 | A |
5824022 | Zilberman et al. | Oct 1998 | A |
5835610 | Ishige et al. | Nov 1998 | A |
6021207 | Puthuff et al. | Feb 2000 | A |
6035050 | Weinfurtner et al. | Mar 2000 | A |
6041129 | Adelman | Mar 2000 | A |
6058197 | Delage | May 2000 | A |
6157727 | Rueda | Dec 2000 | A |
6390971 | Adams et al. | May 2002 | B1 |
6424722 | Hagen et al. | Jul 2002 | B1 |
6449662 | Armitage | Sep 2002 | B1 |
6556686 | Weidner | Apr 2003 | B1 |
6684063 | Berger et al. | Jan 2004 | B2 |
6816600 | Jakob et al. | Nov 2004 | B1 |
6851048 | Armitage | Feb 2005 | B2 |
6888948 | Hagen et al. | May 2005 | B2 |
6895345 | Bye et al. | May 2005 | B2 |
6938124 | Rust et al. | Aug 2005 | B2 |
6954535 | Arndt et al. | Oct 2005 | B1 |
6975739 | Bogason et al. | Dec 2005 | B2 |
6978155 | Berg | Dec 2005 | B2 |
7054957 | Armitage | May 2006 | B2 |
7257372 | Kaltenbach et al. | Aug 2007 | B2 |
7283842 | Berg | Oct 2007 | B2 |
7292698 | Niederdrank et al. | Nov 2007 | B2 |
20010007050 | Adelman | Jul 2001 | A1 |
20020091337 | Adams et al. | Jul 2002 | A1 |
20050058313 | Victorian et al. | Mar 2005 | A1 |
20060039577 | Sanguino et al. | Feb 2006 | A1 |
20060177799 | Stuart et al. | Aug 2006 | A9 |
20070225050 | Kaltenbach et al. | Sep 2007 | A1 |
20070239294 | Brueckner et al. | Oct 2007 | A1 |
20070282394 | Segel et al. | Dec 2007 | A1 |
20080107278 | Roeck et al. | May 2008 | A1 |
Number | Date | Country | |
---|---|---|---|
20100040248 A1 | Feb 2010 | US |
Number | Date | Country | |
---|---|---|---|
61188840 | Aug 2008 | US |