This application is related to U.S. Provisional Patent Application Ser. No. 61/029,564, filed Feb. 19, 2008, which is incorporated herein by reference in its entirety. This application is also related to U.S. patent application Ser. No. 12/388,341, filed Feb. 18, 2009, which is incorporated herein by reference in its entirety.
This document relates generally to hearing assistance systems and more particularly to methods and apparatus for detection of special environments for hearing assistance devices.
Hearing assistance devices, such as hearing aids, can provide adjustable operational modes or characteristics that improve the performance of the hearing assistance device for a specific person or in a specific environment. Some of the operational characteristics include, but are not limited to volume control, tone control, directionality, and selective signal input. These and other operational characteristics can be programmed into a hearing aid. Advanced hearing assistance devices, such as digital hearing aids, may be programmed to change from one operational mode or characteristic to another depending on algorithms operating on the device. As the person wearing a hearing assistance device moves between different acoustic environments, it may be advantageous to change the operational modes or characteristics of the hearing assistance device to adjust the device to particular acoustic environments. Some devices may possess signal processing adapted to classify the acoustic environments in which the hearing assistance device operates. However, such signal processing may require a relatively large amount of signal processing power, be prone to error, and may not yield sufficient improvement in cases when processing power is available. Certain environments may be more difficult to classify than others and can result in misclassification of the environment or frequent switching of the adapted behavior to the detected environment, thereby resulting in reduced hearing benefits of the hearing assistance device. One problematic environment is that of a vehicle, such as an automobile. Wearers of digital hearing aids in moving vehicles are exposed to a variety of sounds coming from the vehicle, open windows, fans, and sounds from outside of the vehicle. Users may experience frequent mode switching from adaptive devices as they attempt to adjust rapidly to changing acoustic environmental inputs.
There is a need in the art for an improved system for determining acoustic environments in hearing assistance devices.
Disclosed herein, among other things, are systems and methods for detection of special environments for hearing assistance devices. One aspect of the present subject matter includes a method of operating a hearing assistance device for a user. A signal is received from a mobile device, such as a cellular telephone, representative of an environmental parameter sensed by the mobile device. In various embodiments, an acoustic environment about the mobile device is identified based on the received signal using a signal processor. An operational mode of the hearing assistance device is adjusted using the signal processor based on the identified acoustic environment, according to various embodiments.
One aspect of the present subject matter includes a hearing assistance system including a hearing assistance device for a user. The system includes a wireless receiver configured to receive a signal from a mobile device, such as a cellular telephone, including a representation of a sensed parameter related to an acoustic environment about the mobile device. According to various embodiments, the system also includes a processor configured to identify the acoustic environment using the received signal and to adjust a hearing assistance device parameter based on the identified environment.
This Summary is an overview of some of the teachings of the present application and not intended to be an exclusive or exhaustive treatment of the present subject matter. Further details about the present subject matter are found in the detailed description and appended claims. The scope of the present invention is defined by the appended claims and their legal equivalents.
The following detailed description of the present subject matter refers to subject matter in the accompanying drawings which show, by way of illustration, specific aspects and embodiments in which the present subject matter may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the present subject matter. References to “an”, “one”, or “various” embodiments in this disclosure are not necessarily to the same embodiment, and such references contemplate more than one embodiment. The following detailed description is demonstrative and not to be taken in a limiting sense. The scope of the present subject matter is defined by the appended claims, along with the full scope of legal equivalents to which such claims are entitled.
The present detailed description will discuss hearing assistance devices using the example of hearing aids. Hearing aids are only one type of hearing assistance device. Other hearing assistance devices include, but are not limited to, those discussed elsewhere in this document. It is understood that their use in the description is intended to demonstrate the present subject matter, but not in a limited, exclusive, or exhaustive sense.
As a person wearing a hearing assistance device moves between different acoustic environments, it may be advantageous to change the operational modes or characteristics of the hearing assistance device to adjust the device to particular acoustic environments. Certain environments may be more difficult to identify than others and can result in misidentification of the environment. One problematic environment is that of a vehicle, such as an automobile. Wearers of digital hearing aids in moving vehicles are exposed to a variety of sounds coming from the vehicle, open windows, fans, and sounds from outside of the vehicle.
Disclosed herein, among other things, are systems and methods for detection of special environments for hearing assistance devices. One aspect of the present subject matter includes a hearing assistance system including a hearing assistance device for a user. The system includes a wireless receiver configured to receive a signal from a mobile device, such as a cellular telephone, including a representation of a sensed parameter related to an acoustic environment about the mobile device. According to various embodiments, the system also includes a processor configured to identify the acoustic environment using the received signal and to adjust a hearing assistance device parameter based on the identified environment.
The present subject matter provides a system and method for identifying acoustic environments using a mobile device. Examples of mobile devices include cellular telephones such as iPhones, Android phones, and Blackberry phones. Other types of mobile devices used include, but are not limited to: car global positioning system (GPS) systems, iPods, personal digital assistants (PDAs), and beacon devices. One environment detected by the present system includes the inside of a car. Identifying the car environment is useful, since many hearing aid adaptive features should operate differently in a car. For example, if the car environment is identified, then directionality should be set to omni-directional rather than directional mode. In one embodiment, for an iPhone-enabled hearing aid, the accelerometer and the GPS system of the iPhone can be used to determine that the vehicle is moving. At greater than 5 mph (for example), the iPhone sends a signal to the hearing aid that it is now in a moving vehicle, in an embodiment. Other parameters can be sensed by the mobile device to assist in identifying the acoustic environment about the mobile device, without departing from the scope of the present subject matter. In various embodiments, the hearing aid assumes that this vehicle is a car, and activates or adjusts adaptive features for the car.
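The speed-threshold check described above can be sketched as follows. This is a minimal illustration of the mobile-device side of the embodiment; the function name, the speed units, and the 5 mph threshold are assumptions for illustration, not an actual phone API.

```python
# Hypothetical sketch: the mobile device decides whether to signal the
# hearing aid that it is in a moving vehicle. The 5 mph threshold is the
# example value from the text; the speed estimate could come from GPS
# fixes or integrated accelerometer data on the device.

VEHICLE_SPEED_THRESHOLD_MPH = 5.0

def classify_movement(speed_mph):
    """Classify the device's movement state from a sensed speed estimate."""
    if speed_mph > VEHICLE_SPEED_THRESHOLD_MPH:
        # The phone would transmit a "moving vehicle" signal to the aid,
        # which then assumes a car and adjusts its adaptive features.
        return "moving_vehicle"
    return "stationary_or_walking"
```

A highway speed would classify as a moving vehicle, while a walking pace would not, so the hearing aid's car-specific behavior is only triggered when the device is plausibly in a vehicle.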
Prior adjustment techniques did not reliably classify the car environment, leading to adaptive behavior that was not appropriate for the car. For example, directional switching was based on level and signal-to-noise ratio (SNR). In a car, this leads to frequent false switching, and switching to directional mode in a car is almost always wrong. The car is both a unique and common environment for hearing aid wearers. By correctly classifying the car environment using the present subject matter, the hearing aid can adapt appropriately to this unique environment, with its unique requirements (noisy, but with constant low-frequency noise; the wearer not facing the talker; etc.). The present subject matter classifies the car environment reliably and provides that information to the hearing aid signal processor. Using movement of a mobile device, such as a cellular phone, the present subject matter reliably differentiates the car environment. Other acoustic environments are also similarly classified: train, taxi, limo, bike, and airplane. In one embodiment, each of these similar environments is classified as a car, with the same or similar adaptive behavior. In other embodiments, the system can further differentiate between a car and a bike, for example. The present subject matter improves hearing aid performance in a car, which is a common acoustic environment.
In various embodiments, the beacon device includes one or more sensors. In one embodiment, the sensor is an accelerometer. In one embodiment, the sensor is a micro-electro-mechanical system (MEMS) accelerometer. In one embodiment, the sensor is a magnetic sensor. In one embodiment, the sensor is a giant magnetoresistive (GMR) sensor. In one embodiment, the sensor is an anisotropic magnetoresistive (AMR) sensor. In one embodiment, the sensor is a microphone. In various embodiments, a combination of sensors is employed, including, but not limited to, those stated in this disclosure. In various embodiments, signal processing circuits capable of processing the sensor outputs are included. In various embodiments, a processor is included which processes signals from the one or more sensors. In various embodiments, the processor is adapted to determine the acoustic environment based on data from at least one of the one or more sensors. In such embodiments, environment information is sent wirelessly to one or more hearing assistance devices. In various embodiments, the beacon device sends the sensor data wirelessly. In such embodiments, one or more hearing assistance devices can receive the data and process it to identify an acoustic environment. In various embodiments, the beacon may act as a remote sensor to the one or more hearing assistance devices. The information from the beacon can be used exclusively, selectively, or in combination with audio information from the hearing assistance device to determine an acoustic environment. Other sensors and applications are possible without departing from the scope of the present subject matter.
In various embodiments, memory 112 stores one or more acoustic environment codes that identify one or more particular acoustic environments. Transmitter 114 is configured to transmit the one or more acoustic environment codes stored in memory 112 at uniform intervals. In one embodiment, the transmitter 114 is adapted to detect the presence of a hearing assistance device and initiate transmission of one or more acoustic environment codes stored in memory 112. In various embodiments, memory 112 includes non-volatile flash memory. In various embodiments, memory 112 includes a DRAM (Dynamic Random Access Memory). In various embodiments, memory 112 includes an SRAM (Static Random Access Memory). In various embodiments, memory 112 stores sensor signal information from one or more sensors. In various embodiments, such sensor signal information is telemetered using transmitter 114. In various embodiments, such sensor signal information is processed before it is transmitted. Other techniques and apparatus may be employed to provide the memory. For example, in one embodiment, the code is hardwired to provide the memory used by transmitter 114.
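The storage and uniform-interval transmission described above can be sketched as follows. The code values, the class structure, and the logging of transmitted frames are all assumptions for illustration; a real transmitter 114 would radiate each stored code over the air.

```python
# Minimal sketch of a beacon device that stores one or more acoustic
# environment codes (standing in for memory 112) and broadcasts them at a
# uniform interval (standing in for transmitter 114). Code values are
# hypothetical; the `sent` list records frames that would be transmitted.

ENV_CODE_CAR = 0x01
ENV_CODE_TELEVISION = 0x02

class Beacon:
    def __init__(self, env_code, interval_s=1.0):
        self.memory = [env_code]       # stored acoustic environment code(s)
        self.interval_s = interval_s   # uniform transmit interval, seconds
        self.sent = []                 # frames the transmitter would radiate

    def transmit_once(self):
        """Broadcast every code in memory; called once per interval."""
        for code in self.memory:
            self.sent.append(code)

beacon = Beacon(ENV_CODE_TELEVISION)
beacon.transmit_once()
```

In the hardwired-memory variant mentioned above, `self.memory` would simply be a fixed constant rather than writable storage.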
In various embodiments, beacon device 110 is attached to devices to assist the hearing assistance device in determining the appropriate processing required by the hearing assistance device. For example, a beacon device 110 could be attached to a user's television, and the hearing assistance device would automatically switch to a "television" mode when the television is powered on (thus activating the TV beacon). In various embodiments, the hearing assistance device switches to a predetermined mode when it senses various coded beacon devices in range. In various embodiments, beacon devices could be attached to noisy consumer devices such as a vacuum cleaner, allowing the hearing assistance device to change noise reduction more accurately and quickly than when detecting such consumer devices solely based on their acoustic signature. In various embodiments, beacon devices could be configured to automatically terminate transmission of acoustic environment codes when the consumer device (such as a television, vacuum cleaner, etc.) is turned off.
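The predetermined-mode switching above can be sketched as a simple lookup from a received beacon code to a hearing-aid mode. The code names and mode names are hypothetical stand-ins for whatever coding the beacon and aid agree on.

```python
# Hypothetical mapping from coded beacons in range to predetermined
# hearing-aid modes, following the television and vacuum-cleaner
# examples in the text. Keys and values are illustrative only.

CODE_TO_MODE = {
    "TV_BEACON": "television",           # switch to a TV listening program
    "VACUUM_BEACON": "noise_reduction",  # aggressive noise reduction
}

def mode_for_beacon(code, default="normal"):
    """Return the predetermined mode for a coded beacon in range.

    An unrecognized code leaves the aid in its default mode, and when the
    consumer device powers off the beacon stops transmitting, so no code
    is received and the aid likewise falls back to the default.
    """
    return CODE_TO_MODE.get(code, default)
```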
In various embodiments, such as in behind-the-ear devices, hearing assistance electronics 205 is in communication with a speaker (or receiver, as it is commonly called in hearing aids) in communication with electronics in first housing 221. In such embodiments, a hollow sound tube is used to transmit sound from the receiver in the behind-the-ear or over-the-ear device to an earpiece 228 in the ear. Thus, in the BTE application, BTE housing 221 is connected to a sound tube 223 to provide sound from the receiver to a standard or custom earpiece 228. In such BTE designs, no receiver is found in the earpiece 228.
In various embodiments, beacon device 110 transmits an acoustic environment code identifying an acoustic environment. In various embodiments, the wireless receiver 206 in the hearing assistance device 210 receives the acoustic environment codes transmitted by the beacon device 110. In various embodiments, upon receiving the acoustic environment code, the wireless receiver 206 sends the received acoustic environment code to hearing assistance electronics 205. In various embodiments, sensor information is transmitted by the beacon device 110 to hearing assistance device 210 and the information is processed by the hearing assistance device. In various embodiments, the processing includes environment determination. In various embodiments, the information transmitted includes sensor based information. In various embodiments, the information transmitted includes statistical information associated with sensed information.
In various embodiments, the hearing assistance electronics 205 can be programmed to perform a variety of functions depending on a received code. Some examples include, but are not limited to, configuring the operational mode of the at least one microphone, adjusting operational parameters, adjusting operational modes, and/or combinations of one or more of the foregoing options. In various embodiments, the operating mode of the microphone is set to directional mode based on the received acoustic environment code that identifies a particular acoustic environment (e.g., an acoustic environment where the user is listening to a fixed speaker in a closed room), if the wearer would benefit from a directional mode setting for a better quality of hearing. In various embodiments, the operating mode of the microphone is set to an omni-directional mode based on the received acoustic environment code. For example, if the user is listening to natural sounds in an open field, the microphone setting can be set to omni-directional mode for providing further clarity of the acoustic waves received by the hearing assistance device 210. In various embodiments, where there is more than one microphone, the operating mode of a first microphone can be set to a directional mode and the operating mode of a second microphone can be set to an omni-directional mode based on the acoustic environment code received from the beacon device 110.
In various embodiments, where there is more than one microphone, the combination of microphones can be set to a directional mode or an omni-directional mode, or a combination of omni and directional modes, based on the acoustic environment code received from the beacon device 110.
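The per-microphone configuration described above can be sketched as follows. The environment-code names, mode names, and two-microphone default are assumptions for illustration, not the patent's actual coding scheme.

```python
# Illustrative sketch: choose an operating mode for each microphone from a
# received acoustic environment code. In a car, omni-directional pickup is
# preferred (constant low-frequency noise, talker not faced); a lecture
# hall favors a directional response; other codes get a mixed setup.

def configure_microphones(env_code, num_mics=2):
    """Return a list of per-microphone modes for the given environment code."""
    if env_code == "car":
        return ["omni"] * num_mics
    if env_code == "lecture":
        return ["directional"] * num_mics
    # Mixed configuration: one directional and one omni microphone, as in
    # the multi-microphone embodiments described above.
    return ["directional", "omni"][:num_mics]
```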
In various embodiments, the first housing 221 is a housing adapted to be worn on the ear of a user, such as, an on-the-ear (OTE) housing or a behind-the-ear (BTE) housing. In various embodiments, the second housing 228 includes an earmold. In various embodiments, the second housing 228 includes an in-the-ear (ITE) housing. In various embodiments, the second housing 228 includes an in-the-canal (ITC) housing. In various embodiments, the second housing 228 includes a completely-in-the-canal (CIC) housing. In various embodiments the second housing 228 includes an earbud. In various embodiments, the receiver 207 is placed in the ear canal of the wearer using a small nonocclusive housing. Other earpieces are possible without departing from the scope of the present subject matter.
In various embodiments, each of the acoustic environment codes stored in memory 112 is indicative of a different acoustic environment. In various embodiments, the transmitted wireless signals include data indicative of the acoustic environment of the location of beacon device 110. In various embodiments, the acoustic environments include, but are not limited to, the inside of a car, an empty room, a lecture hall, a room with furniture, open spaces such as in a countryside, a sidewalk of a typical city street, inside a plane, a factory work environment, etc. In various embodiments, the acoustic environment codes are stored in register locations within memory 112. In some embodiments, memory 112 includes non-volatile flash memory.
At block 630, method 600 includes receiving the one or more environment codes at a hearing assistance device. In various embodiments, receiving the one or more environment codes at a hearing assistance device comprises receiving an acoustic environment code when the hearing assistance device enters the particular acoustic environment identified by the acoustic environment code. In various embodiments, receiving the first acoustic environment code comprises receiving the first acoustic environment code when a user having the hearing assistance device enters an automobile, a plane, a railway car, or a ship. In various embodiments, the environment code is received when the automobile, plane, railway car, or ship begins moving. In various embodiments, acoustic environments can include the inside of a car, an empty room, a lecture hall, a room with furniture, open spaces such as in a countryside, a sidewalk of a typical city street, inside a plane, a factory work environment, in a room during vacuuming, watching a television, listening to the radio, etc.
At block 640, method 600 includes adjusting an operational mode of the hearing assistance device based on the received environment code. In various embodiments, adjusting the operational mode of the hearing assistance device comprises switching between a first microphone and a second microphone. In various embodiments, switching between a first microphone and a second microphone comprises switching between a directional microphone and an omni-directional microphone. In various embodiments, adjusting the operational mode of the device includes switching from a first omni-directional microphone configuration to a second multi-microphone directional configuration, such as in multi-microphone directional beamforming.
In various embodiments, information is telemetered relating to signals sensed by the one or more sensors on the wireless beacon device. In such designs the information telemetered includes, but is not limited to, sensed signals, and/or statistical information about the sensed signals. Hearing assistance devices receiving such information are programmed to process the received signals to determine an environmental status. In such embodiments, the received information may be used by the hearing assistance system to determine the acoustic environment and/or to at least partially control operation of the hearing assistance device for better listening by the wearer.
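The statistical telemetry described above can be sketched as follows. The choice of statistics (mean and standard deviation of accelerometer samples) and the vibration threshold are assumptions for illustration; the point is that the beacon can send compact statistics instead of raw samples, and the hearing assistance device derives the environment from them.

```python
# Illustrative sketch: a beacon summarizes sensed accelerometer samples
# into statistics, and the receiving hearing aid classifies the
# environment from those statistics. Threshold and statistic choices are
# hypothetical.

import statistics

def summarize_sensor(samples):
    """Compute the statistics a beacon might telemeter instead of raw data."""
    return {
        "mean": statistics.fmean(samples),
        "stdev": statistics.pstdev(samples),
    }

def classify_from_stats(stats, vibration_threshold=0.02):
    """Hearing-aid-side decision from telemetered motion statistics.

    Sustained vibration (high standard deviation of acceleration, in g)
    suggests a moving vehicle; near-zero variation suggests a quiet,
    stationary environment.
    """
    return "vehicle" if stats["stdev"] > vibration_threshold else "quiet"
```

As the text notes, such a decision may also be combined with audio information from the hearing assistance device itself rather than used exclusively.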
The present subject matter aids communication in challenging environments in intelligent ways. It improves the communication experience for hearing assistance users in challenging listening environments such as moving vehicles.
Various embodiments of the present subject matter support wireless communications with a hearing assistance device. In various embodiments the wireless communications can include standard or nonstandard communications. Some examples of standard wireless communications include link protocols including, but not limited to, Bluetooth™, IEEE 802.11 (wireless LANs), 802.15 (WPANs), 802.16 (WiMAX), cellular protocols including, but not limited to CDMA and GSM, ZigBee, and ultra-wideband (UWB) technologies. Such protocols support radio frequency communications and some support infrared communications. Although the present system is demonstrated as a radio system, it is possible that other forms of wireless communications can be used such as ultrasonic, optical, infrared, and others. It is understood that the standards which can be used include past and present standards. It is also contemplated that future versions of these standards and new future standards may be employed without departing from the scope of the present subject matter.
The wireless communications support a connection from other devices. Such connections include, but are not limited to, one or more mono or stereo connections or digital connections having link protocols including, but not limited to 802.3 (Ethernet), 802.4, 802.5, USB, SPI, PCM, ATM, Fibre-channel, Firewire or 1394, InfiniBand, or a native streaming interface. In various embodiments, such connections include all past and present link protocols. It is also contemplated that future versions of these protocols and new future standards may be employed without departing from the scope of the present subject matter.
It is understood that variations in communications protocols, antenna configurations, and combinations of components may be employed without departing from the scope of the present subject matter. Hearing assistance devices typically include an enclosure or housing, a microphone, hearing assistance device electronics including processing electronics, and a speaker or receiver. It is understood that in various embodiments the microphone is optional. It is understood that in various embodiments the receiver is optional. Antenna configurations may vary and may be included within an enclosure for the electronics or be external to an enclosure for the electronics. Thus, the examples set forth herein are intended to be demonstrative and not a limiting or exhaustive depiction of variations.
It is further understood that any hearing assistance device may be used without departing from the scope, and the devices depicted in the figures are intended to demonstrate the subject matter, but not in a limited, exhaustive, or exclusive sense. It is also understood that the present subject matter can be used with a device designed for use in the right ear or the left ear or both ears of the user.
It is understood that the hearing aids referenced in this patent application include a processor. The processor may be a digital signal processor (DSP), microprocessor, microcontroller, other digital logic, or combinations thereof. The processing of signals referenced in this application can be performed using the processor. Processing may be done in the digital domain, the analog domain, or combinations thereof. Processing may be done using subband processing techniques. Processing may be done with frequency domain or time domain approaches. Some processing may involve both frequency and time domain aspects. For brevity, in some examples drawings may omit certain blocks that perform frequency synthesis, frequency analysis, analog-to-digital conversion, digital-to-analog conversion, amplification, audio decoding, and certain types of filtering and processing. In various embodiments the processor is adapted to perform instructions stored in memory which may or may not be explicitly shown. Various types of memory may be used, including volatile and nonvolatile forms of memory. In various embodiments, instructions are performed by the processor to perform a number of signal processing tasks. In such embodiments, analog components are in communication with the processor to perform signal tasks, such as microphone reception or receiver sound transmission (i.e., in applications where such transducers are used). In various embodiments, different realizations of the block diagrams, circuits, and processes set forth herein may occur without departing from the scope of the present subject matter.
The present subject matter is demonstrated for hearing assistance devices, including hearing aids, including but not limited to, behind-the-ear (BTE), in-the-ear (ITE), in-the-canal (ITC), receiver-in-canal (RIC), or completely-in-the-canal (CIC) type hearing aids. It is understood that behind-the-ear type hearing aids may include devices that reside substantially behind the ear or over the ear. Such devices may include hearing aids with receivers associated with the electronics portion of the behind-the-ear device, or hearing aids of the type having receivers in the ear canal of the user, including but not limited to receiver-in-canal (RIC) or receiver-in-the-ear (RITE) designs. The present subject matter can also be used in hearing assistance devices generally, such as cochlear implant type hearing devices and such as deep insertion devices having a transducer, such as a receiver or microphone, whether custom fitted, standard, open fitted or occlusive fitted. It is understood that other hearing assistance devices not expressly stated herein may be used in conjunction with the present subject matter.
This application is intended to cover adaptations or variations of the present subject matter. It is to be understood that the above description is intended to be illustrative, and not restrictive. The scope of the present subject matter should be determined with reference to the appended claims, along with the full scope of legal equivalents to which such claims are entitled.
Number | Name | Date | Kind |
---|---|---|---|
4777474 | Clayton | Oct 1988 | A |
6195572 | Patterson et al. | Feb 2001 | B1 |
6870940 | Meyer et al. | Mar 2005 | B2 |
7853030 | Grasbon et al. | Dec 2010 | B2 |
8705782 | Woods et al. | Apr 2014 | B2 |
8867765 | Solum | Oct 2014 | B2 |
20030064746 | Rader et al. | Apr 2003 | A1 |
20030235319 | Rass | Dec 2003 | A1 |
20040138723 | Malick et al. | Jul 2004 | A1 |
20060222194 | Bramslow | Oct 2006 | A1 |
20070237335 | O'Sullivan | Oct 2007 | A1 |
20070249289 | Grafenberg et al. | Oct 2007 | A1 |
20080013769 | Sacha et al. | Jan 2008 | A1 |
20080199971 | Tondra | Aug 2008 | A1 |
20090097683 | Burns et al. | Apr 2009 | A1 |
20090184706 | Duric et al. | Jul 2009 | A1 |
20090196444 | Solum | Aug 2009 | A1 |
20090208043 | Woods et al. | Aug 2009 | A1 |
20100208631 | Zhang et al. | Aug 2010 | A1 |
20110293123 | Neumeyer et al. | Dec 2011 | A1 |
20120235633 | Kesler et al. | Sep 2012 | A1 |
20150003652 | Bisgaard | Jan 2015 | A1 |
20150023537 | Woods et al. | Jan 2015 | A1 |
Number | Date | Country |
---|---|---|
2104378 | Jul 2013 | DK |
2521377 | Nov 2012 | EP |
2104378 | Jul 2013 | EP |
WO-2007046748 | Apr 2007 | WO |
WO-2008055960 | May 2008 | WO |
Entry |
---|
“U.S. Appl. No. 12/388,341 , Response filed Nov. 13, 2013 to Final Office Action mailed Aug. 13, 2013 and Advisory Action mailed Oct. 30, 2013”, 11 pgs. |
“U.S. Appl. No. 12/388,341, Advisory Action mailed Feb. 22, 2012”, 2 pgs. |
“U.S. Appl. No. 12/388,341, Advisory Action mailed Oct. 30, 2013”, 3 pgs. |
“U.S. Appl. No. 12/388,341, Final Office Action mailed Aug. 13, 2013”, 18 pgs. |
“U.S. Appl. No. 12/388,341, Final Office Action mailed Sep. 19, 2012”, 18 pgs. |
“U.S. Appl. No. 12/388,341, Final Office Action mailed Dec. 9, 2011”, 14 pgs. |
“U.S. Appl. No. 12/388,341, Non Final Office Action mailed Feb. 14, 2013”, 16 pgs. |
“U.S. Appl. No. 12/388,341, Non Final Office Action mailed Apr. 10, 2012”, 17 pgs. |
“U.S. Appl. No. 12/388,341, Non Final Office Action mailed Jun. 17, 2011”, 11 pgs. |
“U.S. Appl. No. 12/388,341, Notice of Allowance mailed Nov. 22, 2013”, 9 pgs. |
“U.S. Appl. No. 12/388,341, Response filed Jan. 22, 2013 to Final Office Action mailed Sep. 19, 2012”, 10 pgs. |
“U.S. Appl. No. 12/388,341, Response filed Feb. 9, 2012 to Final Office Action mailed Dec. 9, 2011”, 9 pgs. |
“U.S. Appl. No. 12/388,341, Response filed Mar. 9, 2012 to Advisory Action mailed Feb. 22, 2012”, 11 pgs. |
“U.S. Appl. No. 12/388,341, Response filed May 14, 2013 to Non Final Office Action mailed Feb. 14, 2013”, 10 pgs. |
“U.S. Appl. No. 12/388,341, Response filed Sep. 10, 2012 to Non Final Office Action mailed Apr. 10, 2012”, 10 pgs. |
“U.S. Appl. No. 12/388,341, Response filed Sep. 19, 2011 to Non Final Office Action mailed Jun. 17, 2011”, 9 pgs. |
“U.S. Appl. No. 12/388,341, Response filed Oct. 14, 2013 to Final Office Action mailed Aug. 13, 2013”, 11 pgs. |
“European Application Serial No. 09250424.0, European Search Report mailed Jun. 4, 2009”, 8 pgs. |
“European Application Serial No. 09250424.0, Examination Notification mailed Feb. 8, 2011”, 8 pgs. |
“European Application Serial No. 09250424.0, Response filed Jun. 16, 2011 to Examination Notification mailed Feb. 8, 2011”, 12 pgs. |
“European Application Serial No. 09250424.0, Response filed Dec. 9, 2009 to European Search Report mailed Jun. 4, 2009”, 6 pgs. |
Tondra, Mark, “Flow Assay With Integrated Detector”, U.S. Appl. No. 60/887,609, filed Feb. 1, 2007, 28 pgs. |
“European Application Serial No. 14177458.8, Extended European Search Report mailed Jan. 9, 2015”, 7 pgs. |
“U.S. Appl. No. 14/186,754, Non Final Office Action mailed Jul. 23, 2015”, 6 pgs. |
“European Application Serial No. 14177458.8, Extended European Search Report mailed Jan. 22, 2015”. |
“U.S. Appl. No. 14/186,754, Advisory Action mailed Apr. 7, 2016”, 3 pgs. |
“U.S. Appl. No. 14/186,754, Final Office Action mailed Jan. 12, 2016”, 11 pgs. |
“U.S. Appl. No. 14/186,754, Response filed Mar. 14, 2016 to Final Office Action mailed Jan. 12, 2016”, 7 pgs. |
“U.S. Appl. No. 14/186,754, Response filed Oct. 22, 2015 to Non Final Office Action mailed Jul. 23, 2015”, 6 pgs. |
“European Application Serial No. 14177458.8, Communication pursuant to Rules 70(2) and 70a(2) mailed Feb. 12, 2015”, 4 pgs. |
“European Application Serial No. 14177458.8, Response filed Jul. 23, 2015 to Communication pursuant to Rules 70(2) and 70a(2) mailed Feb. 12, 2015”, 9 pgs. |
Number | Date | Country | |
---|---|---|---|
20150023536 A1 | Jan 2015 | US |