Traditionally, field workers such as firefighters and plant workers communicate using two-way radios, such as those supplied by Motorola, Inc. As mentioned in United States Patent Application Publication No. 20070142072, two-way radios allow users to communicate wirelessly with others on a small network. Most two-way radios use various channels or frequencies for communication. Monitoring more than one channel allows a user to communicate with a plurality of people for a variety of purposes. In a security environment, for instance, channel 1 may be used to communicate about and monitor emergency conditions. Channel 2 may be used to communicate about and monitor major security threats. Channel 3 may be used to communicate about and monitor minor security threats. A user may monitor all three channels by using a two-way radio having a scanning mode. However, the user is limited to transmitting on the most recently scanned channel. If the user fails to transmit within a short predetermined period of time, the two-way radio may scan to a new channel. The user then needs to manually select the desired channel, wasting valuable time and eliminating the ability to scan other channels during the selection process.
Other devices that can be used instead of two-way radios include cellular telephones. These devices have revolutionized personal communication by allowing telephone access from anywhere within reach of wireless network infrastructure (e.g., cellular networks, communication satellites, or other infrastructure of wireless networks adapted for voice communications). Inasmuch as the use of handheld wireless voice communication devices is not restricted to homes and offices, such devices will often be used in environments where there is considerable ambient noise. Examples of such environments include busy urban settings, the interiors of moving vehicles, and factory floors. Ambient noise in an environment can degrade the intelligibility of received voice audio and thereby interfere with users' ability to communicate.
United States Patent Application Publication No. 20060270467 discusses enhancing the intelligibility of speech emitted into a noisy environment by filtering ambient noise with a filter that simulates the physical blocking of noise by at least a part of a voice communication device. A frequency-dependent signal-to-noise ratio (SNR) of the received voice audio relative to the ambient noise is computed on a perceptual (e.g., Bark) frequency scale. Formants are identified, and the SNR in bands containing certain formants is modified with formant-enhancement gain factors in order to improve intelligibility. However, in certain industrial, emergency, government, and military applications, such noise filtering is insufficient to provide high-quality, hands-free, yet inconspicuous communication capability for field personnel.
Methods and apparatus that support communications and/or medical information monitoring for field personnel are disclosed. In one embodiment, a device provides an electronic and transducer device that can be attached, adhered, or otherwise embedded into or upon a removable oral appliance or other oral device to form a two-way communication assembly.
In another embodiment, the device provides an electronic and transducer device that can be attached, adhered, or otherwise embedded into or upon a removable oral appliance or other oral device to form a medical tag containing patient identifiable information. Such an oral appliance may be a custom-made device fabricated from a thermal forming process utilizing a replicate model of a dental structure obtained by conventional dental impression and/or imaging methods. The electronic and transducer assembly may receive incoming sounds either directly or through a receiver to process and amplify the signals and transmit the processed sounds via a vibrating transducer element coupled to a tooth or other bone structure, such as the maxillary, mandibular, or palatine bone structure.
Advantages of preferred embodiments may include one or more of the following. The system is a multi-purpose communication platform that is rugged, wireless, and secure. The system provides quality, hands-free, yet inconspicuous communication capability for field personnel. In embodiments that support voice and data from the field, the system automates the capture and transfer of field data to a central remote computer for further processing. By minimizing manual data collection, the system reduces paperwork, allows for the collection of more complete field information, eliminates redundant data entry, and increases responsiveness to the fluid situations present in industrial environments or in emergency, local-government, or military incidents.
For applications that require health monitoring, the system enables healthcare providers to make certain that all patient episodes are captured and recorded no matter the environment. Most importantly, the system provides a standard of care to field personnel by providing access to previously unavailable information. For military and emergency applications, the system provides commanders with real-time visibility of their readiness status and supports medical command and control, telemedicine, and medical informatics applications across the spectrum of military medical operations, especially for first responders and far-forward medical facilities. With soldiers deployed in many different parts of the world, the system allows medical professionals to capture patient episodes anywhere, anytime, and ensures that complete patient information is recorded and transferred to the soldier's medical records at home. Additionally, the system provides a dental identification capability that is retained on the individual and thus is less subject to destruction, loss, forgetfulness, or any of numerous other problems.
As shown in the embodiment of
A beacon 4 can be positioned on the user's body and communicate with the mouth wearable communicator 6 to indicate that the user is in need of assistance. The beacon 4 can provide an audible signal, a location signal, or a visible signal to request assistance. The beacon 4 can be affixed to a first responder's uniform, can be activated if the user's vital signs indicate he or she has been hurt, and can help to locate the user in a smoky environment.
The two-way communication device can use the body sensor 2 to sense a vital sign or a measured parameter such as environment temperature, among others. The vital sign can include body temperature, hydration, heart rate, EKG, EEG, pulse rate, oxygen saturation, respiratory cycle, air flow velocity, and potential of hydrogen (pH) level, for example. The measured parameter can include environment temperature. The sensor can be positioned in the linking unit 3 or in other devices attached to or close to the body. Other measured parameters that may be of value can also be picked up from the linking unit 3 or the mouthpiece 6, including location (using GPS, cell phone signals, or an equivalent method), speed, acceleration, shock, smoke, light, infrared, radiation, and wind speed. The measured parameters can help identify the location of a firefighter or miner and assess whether he or she is at risk, for example. The body sensor 2 can also include an activity monitor. For instance, accelerometers can monitor whether the subject is moving, walking, or running and relay the information to the remote station 5.
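Purely as an illustration (the disclosure does not define a data format), the vital signs and measured parameters listed above could be carried in a simple record passed from the body sensor 2 or linking unit 3 to the remote station 5. The field names and the screening thresholds below are hypothetical and not part of the description; a minimal Python sketch:

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple
import time

@dataclass
class SensorReading:
    """One sample of vital signs and measured parameters (hypothetical field names)."""
    user_id: str
    timestamp: float = field(default_factory=time.time)
    # Vital signs sensed by the body sensor 2 or the mouthpiece 6
    body_temp_c: Optional[float] = None
    heart_rate_bpm: Optional[float] = None
    spo2_pct: Optional[float] = None                 # oxygen saturation
    respiration_rate: Optional[float] = None
    ph_level: Optional[float] = None
    # Environmental / situational parameters from the linking unit 3
    ambient_temp_c: Optional[float] = None
    location: Optional[Tuple[float, float]] = None   # (latitude, longitude) via GPS or equivalent
    speed_mps: Optional[float] = None
    acceleration_g: Optional[float] = None
    smoke_detected: Optional[bool] = None
    radiation_usv_h: Optional[float] = None

def is_at_risk(r: SensorReading) -> bool:
    """Very rough screening rule; thresholds are illustrative only."""
    if r.heart_rate_bpm is not None and not 40 <= r.heart_rate_bpm <= 160:
        return True
    if r.spo2_pct is not None and r.spo2_pct < 90:
        return True
    if r.ambient_temp_c is not None and r.ambient_temp_c > 60:
        return True
    return False
```

A reading flagged by is_at_risk could, for example, be used to activate the beacon 4 described above.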
The two-way communication device can have a microphone to pick up sound. The microphone can be an intraoral microphone or an extraoral microphone. In one embodiment, the microphone cancels environmental noise and transmits the user's voice to the remote station. This embodiment provides the ability to cancel environmental noises while transmitting the subject's own voice to the remote station 5, such as a call center. Because the microphone is in a fixed location (compared to ordinary wireless communication devices) and very close to the source of the user's own voice, the system can perform the environmental noise reduction that is important when working in high-noise areas. As such, the two-way communication device can be used by workers in loud environments, such as a professional entertainer or athlete and/or support personnel, a soldier, a medic, a fireman, an emergency worker, among others.
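The description does not name a particular noise-reduction algorithm. As a hedged sketch only, a fixed, close-talking microphone lends itself to simple spectral subtraction, in which a noise spectrum estimated while the user is silent is subtracted from each captured frame; the function below illustrates that general technique and is not the claimed implementation:

```python
import numpy as np

def spectral_subtraction(frame: np.ndarray, noise_mag: np.ndarray, floor: float = 0.05) -> np.ndarray:
    """Suppress stationary background noise in one audio frame (illustrative only).

    frame     -- time-domain samples captured by the microphone
    noise_mag -- magnitude spectrum estimated during a silent interval (same FFT size)
    floor     -- spectral floor to limit musical-noise artifacts
    """
    spectrum = np.fft.rfft(frame)
    mag, phase = np.abs(spectrum), np.angle(spectrum)
    clean_mag = np.maximum(mag - noise_mag, floor * mag)   # subtract the noise estimate
    return np.fft.irfft(clean_mag * np.exp(1j * phase), n=len(frame))
```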
The two-way communication device can provide a wireless sensor network that communicates with the mouth wearable communicator 6, the one or more body sensors 2, and the linking unit 3. In other embodiments, the linking unit 3 wirelessly communicates with other linking units of other users to form a wireless local area network for detection, identification, and communication with a crew.
In one embodiment, the mouth wearable communicator 6 has a housing having a shape which is conformable to at least a portion of at least one tooth; an actuatable transducer disposed within or upon the housing and in vibratory communication with a surface of the at least one tooth; and a wireless communication transceiver coupled to the transducer to provide received sound to the user and to provide communication for the user. The two-way communication device can be an oral appliance having a shape which conforms to the at least one tooth. An electronic assembly can be disposed within or upon the housing and in communication with the transducer.
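As an illustrative sketch only of how the components named in this embodiment (a tooth-conformable housing holding a transducer and a wireless transceiver) might relate to one another in software terms, the stub classes below use hypothetical names and omit all hardware detail:

```python
class Transducer:
    """Converts received audio samples into tooth-borne vibration (stub)."""
    def vibrate(self, samples: bytes) -> None:
        pass  # drive the actuator against the tooth surface

class Transceiver:
    """Wireless link toward the linking unit 3 / remote station 5 (stub)."""
    def receive(self) -> bytes:
        return b""
    def transmit(self, payload: bytes) -> None:
        pass

class MouthWearableCommunicator:
    """Conformable housing composing a transducer and a transceiver, per the description above."""
    def __init__(self, transducer: Transducer, transceiver: Transceiver):
        self.transducer = transducer
        self.transceiver = transceiver

    def relay_incoming_audio(self) -> None:
        # Pass received sound to the user through bone conduction.
        samples = self.transceiver.receive()
        if samples:
            self.transducer.vibrate(samples)

    def send_outgoing_audio(self, samples: bytes) -> None:
        # Forward captured audio (e.g., from an intraoral microphone) to the remote station.
        self.transceiver.transmit(samples)
```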
In another aspect, a method for providing two-way communication includes intraorally wearing a mouth wearable communicator; sensing one or more body parameters or environmental parameters; and linking the mouth wearable communicator and the one or more body sensors to a remote station.
Implementations of the above method can include one or more of the following. The system can transmit sound using a bone conduction device. The mouth wearable communicator can be a custom oral device. The system can embed one or more sensors in the mouth wearable communicator. The system can store medical or environmental data in the mouth wearable communicator. For emergency or military users that may face danger, the system can store data in the mouth wearable communicator as a secured black box device for subsequent forensic analysis.
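The disclosure does not specify how the "secured black box" data would be stored. One hedged sketch, assuming an append-only log whose entries are hash-chained so that tampering can be detected during later forensic analysis (the class and method names are hypothetical):

```python
import hashlib, json, time

class BlackBoxLog:
    """Append-only, hash-chained log for later forensic analysis (illustrative only)."""
    def __init__(self):
        self._entries = []
        self._last_hash = "0" * 64

    def append(self, record: dict) -> None:
        """Store one medical or environmental record, chained to the previous entry."""
        entry = {"time": time.time(), "record": record, "prev": self._last_hash}
        digest = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = digest
        self._entries.append(entry)
        self._last_hash = digest

    def verify(self) -> bool:
        """Recompute the chain to confirm no entry was altered or removed."""
        prev = "0" * 64
        for e in self._entries:
            body = {k: e[k] for k in ("time", "record", "prev")}
            expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True
```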
Turning now to
In another embodiment, the device 1 provides an electronic and transducer device 4 that can be attached, adhered, or otherwise embedded into or upon a removable oral appliance or other oral device to form a medical tag containing patient identifiable information. Such an oral appliance may be a custom-made device fabricated from a thermal forming process utilizing a replicate model of a dental structure obtained by conventional dental impression methods. The electronic and transducer assembly may receive incoming sounds either directly or through a receiver to process and amplify the signals and transmit the processed sounds via a vibrating transducer element coupled to a tooth or other bone structure, such as the maxillary, mandibular, or palatine bone structure.
The device 1 can include sensors that detect chemicals present in the user's saliva and provide medical information on the user. The device 1 can also sense heart rate, EKG, and other bio-signals that can be picked up within the mouth. Additionally, the device 1 can communicate with a medical data collection module 2 that can collect vital signs such as temperature, heart rate, EKG, respiration rate, and other vital signs or medical information. The device 1 can communicate with the module 2 through various short range radios such as a Bluetooth radio, for example.
The device 1 can also communicate through a long range transceiver such as a short-wave transceiver, a cellular telephone transceiver, or a satellite transceiver 3. Such transceivers can be provided within the device 1, or alternatively, can be body worn. In the embodiment of
An exemplary process to collect medical information from the user (such as firefighting personnel) and to support bone-conduction two-way communication can be as follows:
In one embodiment, the medical data would include user identification, triage status, condition, and treatment. The data would be routed via a cellular transceiver or a satellite transceiver to a Command Post, where it is processed, stored, relayed to the Internet, and moved back to devices in the field. As a result, data on casualties would be accessible immediately for operational use by other users, medics, responders, incident commanders, and even receiving hospitals that can help the user. Real-time information regarding victims and their status is critical to the overall management of field medical care. Medical command can then coordinate timely information on the number of casualties and their needs with the known availability of resources, such as on-scene providers, ambulance locations, and area hospital capacities. Real-time information is also provided for determining the appropriate patient destination, depending on the type of injuries and the capabilities of the receiving facilities.
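As an illustration only, the routing described above could package the named fields (user identification, triage status, condition, and treatment) and prefer the cellular link when available, falling back to satellite otherwise. The message format and function names below are assumptions, not part of the disclosure:

```python
import json

def package_casualty_report(user_id: str, triage: str, condition: str, treatment: str) -> bytes:
    """Serialize the fields named above into a compact payload for uplink (illustrative)."""
    return json.dumps({
        "user_id": user_id,
        "triage": triage,          # e.g., "immediate", "delayed", "minimal", "expectant"
        "condition": condition,
        "treatment": treatment,
    }).encode()

def route_to_command_post(payload: bytes, cellular_available: bool,
                          send_cellular, send_satellite) -> str:
    """Prefer the cellular transceiver when in coverage; otherwise fall back to satellite."""
    if cellular_available:
        send_cellular(payload)
        return "cellular"
    send_satellite(payload)
    return "satellite"
```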
An exemplary process to collect work information from a plant or site and to support bone-conduction two-way communication with the worker can be as follows:
In another embodiment for military use, the remote computer can support a BATTLEFIELD MEDICAL INFORMATION SYSTEMS TACTICAL-JOINT (BMIST-J) for enabling military providers to record, store, retrieve, and transfer medical records to the DoD's Clinical Data Repository by synchronizing the received data. The system supports digital versions of the DD 1380 (field medical card) and SF 600 (chronological record of medical care). Diagnostic and treatment decision aids are provided by the system. The data captured by the device 1 is also Personal Information Carrier (PIC) compatible. The system provides secure, legible, electronic records of battlefield treatments, contributes to a comprehensive, life-long medical history, and facilitates medical surveillance. This device can also help medical personnel to triage soldiers more effectively by noting vital signs and ranking accordingly.
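No triage algorithm is specified in the disclosure; purely to illustrate "noting vital signs and ranking accordingly," a simple score-based ordering might look like the following. The thresholds and field names are hypothetical and are not taken from BMIST-J or the DD 1380:

```python
def triage_score(heart_rate_bpm: float, spo2_pct: float, resp_rate: float) -> int:
    """Higher score = higher priority; thresholds are illustrative only."""
    score = 0
    if heart_rate_bpm > 120 or heart_rate_bpm < 50:
        score += 2
    if spo2_pct < 92:
        score += 3
    if resp_rate > 30 or resp_rate < 8:
        score += 2
    return score

def rank_casualties(casualties):
    """Order casualty records (dicts with the fields above) so the most urgent come first."""
    return sorted(
        casualties,
        key=lambda c: triage_score(c["heart_rate_bpm"], c["spo2_pct"], c["resp_rate"]),
        reverse=True,
    )
```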
Turning now to more details on the device 1, as shown in
Generally, the volume of the electronics and/or transducer assembly 16 may be minimized so as to be unobtrusive and comfortable to the user when placed in the mouth. Although the size may be varied, the volume of assembly 16 may be less than 800 cubic millimeters. This volume is, of course, illustrative and not limiting, as the size and volume of assembly 16 may be varied accordingly between different users.
Moreover, removable oral appliance 18 may be fabricated from various polymeric or a combination of polymeric and metallic materials using any number of methods, such as computer-aided machining processes using computer numerical control (CNC) systems or three-dimensional printing processes, e.g., stereolithography apparatus (SLA), selective laser sintering (SLS), and/or other similar processes utilizing the three-dimensional geometry of the patient's dentition, which may be obtained via any number of techniques. Such techniques may include use of scanned dentition using intra-oral scanners such as laser, white light, ultrasound, or mechanical three-dimensional touch scanners, magnetic resonance imaging (MRI), computed tomography (CT), other optical methods, etc.
In forming the removable oral appliance 18, the appliance 18 may be optionally formed such that it is molded to fit over the dentition and at least a portion of the adjacent gingival tissue to inhibit the entry of food, fluids, and other debris into the oral appliance 18 and between the transducer assembly and tooth surface. Moreover, the greater surface area of the oral appliance 18 may facilitate the placement and configuration of the assembly 16 onto the appliance 18.
Additionally, the removable oral appliance 18 may be optionally fabricated to have a shrinkage factor such that when placed onto the dentition, oral appliance 18 may be configured to securely grab onto the tooth or teeth as the appliance 18 may have a resulting size slightly smaller than the scanned tooth or teeth upon which the appliance 18 was formed. The fitting may result in a secure interference fit between the appliance 18 and underlying dentition.
In one variation, with assembly 14 positioned upon the teeth, as shown in
The transmitter assembly 22, as described in further detail below, may contain a microphone assembly as well as a transmitter assembly and may be configured in any number of shapes and forms worn by the user, such as a watch, necklace, lapel, phone, belt-mounted device, etc.
With respect to microphone 30, a variety of microphone systems may be utilized. For instance, microphone 30 may be a digital, analog, and/or directional type microphone. Such various types of microphones may be interchangeably configured to be utilized with the assembly, if so desired.
Power supply 36 may be connected to each of the components in transmitter assembly 22 to provide power thereto. The transmitter signals 24 may be in any wireless form utilizing, e.g., radio frequency, ultrasound, microwave, Bluetooth® (BLUETOOTH SIG, INC., Bellevue, Wash.), etc. for transmission to assembly 16. Assembly 22 may also optionally include one or more input controls 28 that a user may manipulate to adjust various acoustic parameters of the electronics and/or transducer assembly 16, such as acoustic focusing, volume control, filtration, muting, frequency optimization, sound adjustments, and tone adjustments.
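As a sketch only (no control protocol is defined in the description), adjustments made on input controls 28 could be conveyed to assembly 16 as small keyword/value messages; the control names and encoding below are hypothetical:

```python
import json

VALID_CONTROLS = {"volume", "mute", "filtration", "frequency_optimization", "tone"}

def encode_control_message(control: str, value) -> bytes:
    """Pack one user adjustment from input controls 28 for wireless delivery to assembly 16."""
    if control not in VALID_CONTROLS:
        raise ValueError(f"unknown control: {control}")
    return json.dumps({"control": control, "value": value}).encode()

# Example: mute the transducer output
message = encode_control_message("mute", True)
```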
The signals 24 transmitted by transmitter 34 may be received by electronics and/or transducer assembly 16 via receiver 38, which may be connected to an internal processor for additional processing of the received signals. The received signals may be communicated to transducer 40, which may vibrate correspondingly against a surface of the tooth to conduct the vibratory signals through the tooth and bone and subsequently to the middle ear to facilitate hearing of the user. Transducer 40 may be configured as any number of different vibratory mechanisms. For instance, in one variation, transducer 40 may be an electromagnetically actuated transducer. In other variations, transducer 40 may be in the form of a piezoelectric crystal having a range of vibratory frequencies, e.g., between 250 and 4000 Hz.
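Because the piezoelectric variation of transducer 40 is described as vibrating roughly between 250 and 4000 Hz, received audio would in practice be limited to that band before driving the element. The FFT-based filter below is a minimal sketch under assumed parameters (sample rate, filtering method) and is not the disclosed implementation:

```python
import numpy as np

def bandlimit_for_transducer(samples: np.ndarray, sample_rate: int = 8000,
                             low_hz: float = 250.0, high_hz: float = 4000.0) -> np.ndarray:
    """Zero out spectral content outside the transducer's vibratory range (illustrative)."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    return np.fft.irfft(spectrum, n=len(samples))
```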
Power supply 42 may also be included with assembly 16 to provide power to the receiver, transducer, and/or processor, if also included. Although power supply 42 may be a simple battery, replaceable or permanent, other variations may include a power supply 42 which is charged by inductance via an external charger. Additionally, power supply 42 may alternatively be charged via direct coupling to an alternating current (AC) or direct current (DC) source. Other variations may include a power supply 42 which is charged via a mechanical mechanism, such as an internal pendulum or slidable electrical inductance charger as known in the art, which is actuated via, e.g., motions of the jaw and/or other movement for translating the mechanical motion into stored electrical energy for charging power supply 42.
In another variation of assembly 16, rather than utilizing an extra-buccal transmitter, two-way communication assembly 50 may be configured as an independent assembly contained entirely within the user's mouth, as shown in
In order to transmit the vibrations corresponding to the received auditory signals efficiently and with minimal loss to the tooth or teeth, secure mechanical contact between the transducer and the tooth is ideally maintained to ensure efficient vibratory communication. Accordingly, any number of mechanisms may be utilized to maintain this vibratory communication.
In one variation as shown in
An electronics and/or transducer assembly 64 may be simply placed, embedded, or encapsulated within housing 62 for contacting the tooth surface. In this variation, assembly 64 may be adhered against the tooth surface via an adhesive surface or film 66 such that contact is maintained between the two. As shown in
Aside from an adhesive film 66, another alternative may utilize an expandable or swellable member to ensure a secure mechanical contact of the transducer against the tooth. As shown in
Another variation is shown in
In yet another variation, the electronics may be contained as a separate assembly 90 which is encapsulated within housing 62 and the transducer 92 may be maintained separately from assembly 90 but also within housing 62. As shown in
In other variations as shown in
In yet another variation shown in
Another variation for a mechanical mechanism is illustrated in
In yet another variation, the electronics 150 and the transducer 152 may be separated from one another such that electronics 150 remain disposed within housing 62 but transducer 152, connected via wire 154, is located beneath dental oral appliance 60 along an occlusal surface of the tooth, as shown in
In the variation of
In yet another variation, an electronics and/or transducer assembly 170 may define a channel or groove 172 along a surface for engaging a corresponding dental anchor 174, as shown in
In yet another variation,
Similarly, as shown in
In yet other variations, vibrations may be transmitted directly into the underlying bone or tissue structures rather than transmitting directly through the tooth or teeth of the user. As shown in
In yet another variation, rather than utilizing a post or screw drilled into the underlying bone itself, a transducer may be attached, coupled, or otherwise adhered directly to the gingival tissue surface adjacent to the teeth. As shown in
For any of the variations described above, they may be utilized as a single device or in combination with any other variation herein, as practicable, to achieve the desired hearing level in the user. Moreover, more than one oral appliance device and electronics and/or transducer assemblies may be utilized at any one time. For example,
Moreover, each of the different transducers 270, 272, 274, 276 can also be programmed to vibrate in a manner which indicates the directionality of sound received by the microphone worn by the user. For example, different transducers positioned at different locations within the user's mouth can vibrate in a specified manner, providing sound or vibrational cues to inform the user of the direction from which a sound was detected relative to the orientation of the user. For instance, a first transducer located, e.g., on a user's left tooth, can be programmed to vibrate for sound detected as originating from the user's left side. Similarly, a second transducer located, e.g., on a user's right tooth, can be programmed to vibrate for sound detected as originating from the user's right side. Other variations and cues may be utilized, as these examples are intended to be illustrative of potential variations.
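The mapping from a detected sound direction to a particular transducer is left open; a hedged sketch of one possible assignment, extending the left/right example above (the angle convention and transducer names are hypothetical):

```python
def select_transducers(direction_deg: float) -> list:
    """Map the bearing of a detected sound (0 = straight ahead, positive = user's right)
    to the transducer(s) that should vibrate, per the left/right example above."""
    direction_deg = ((direction_deg + 180) % 360) - 180   # normalize to [-180, 180)
    if -15 <= direction_deg <= 15:
        return ["left_tooth", "right_tooth"]               # roughly ahead: cue both sides
    return ["right_tooth"] if direction_deg > 0 else ["left_tooth"]
```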
In variations where the one or more microphones are positioned in intra-buccal locations, the microphone may be integrated directly into the electronics and/or transducer assembly, as described above. However, in additional variations, the microphone unit may be positioned at a distance from the transducer assemblies to minimize feedback. In one example, similar to a variation shown above, microphone unit 282 may be separated from electronics and/or transducer assembly 280, as shown in
Although the variation illustrates the microphone unit 282 placed adjacent to the gingival tissue 268, unit 282 may be positioned upon another tooth or another location within the mouth. For instance,
In yet another variation for separating the microphone from the transducer assembly,
The applications of the devices and methods discussed above are not limited to the treatment of hearing loss but may include any number of further treatment applications. Moreover, such devices and methods may be applied to other treatment sites within the body. Modification of the above-described assemblies and methods for carrying out the invention, combinations between different variations as practicable, and variations of aspects of the invention that are obvious to those of skill in the art are intended to be within the scope of the claims.
Number | Name | Date | Kind |
---|---|---|---|
2045404 | Nicholides | Jun 1936 | A |
2161169 | Jefferis | Jun 1939 | A |
2318872 | Madiera | May 1943 | A |
2977425 | Cole | Mar 1961 | A |
2995633 | Puharich et al. | Aug 1961 | A |
3156787 | Puharich et al. | Nov 1964 | A |
3170993 | Puharich et al. | Feb 1965 | A |
3267931 | Puharich et al. | Aug 1966 | A |
3325743 | Blum | Jun 1967 | A |
3787641 | Santori | Jan 1974 | A |
3894196 | Briskey | Jul 1975 | A |
3985977 | Beaty et al. | Oct 1976 | A |
4025732 | Traunmuller | May 1977 | A |
4150262 | Ono | Apr 1979 | A |
4498461 | Hakansson | Feb 1985 | A |
4591668 | Iwata | May 1986 | A |
4612915 | Hough et al. | Sep 1986 | A |
4642769 | Petrofsky | Feb 1987 | A |
4738268 | Kipnis | Apr 1988 | A |
4817044 | Ogren | Mar 1989 | A |
4832033 | Maher et al. | May 1989 | A |
4920984 | Furumichi et al. | May 1990 | A |
4982434 | Lenhardt et al. | Jan 1991 | A |
5012520 | Steeger | Apr 1991 | A |
5033999 | Mersky | Jul 1991 | A |
5047994 | Lenhardt et al. | Sep 1991 | A |
5060526 | Barth et al. | Oct 1991 | A |
5082007 | Adell | Jan 1992 | A |
5233987 | Fabian et al. | Aug 1993 | A |
5323468 | Bottesch | Jun 1994 | A |
5325436 | Soli et al. | Jun 1994 | A |
5372142 | Madsen et al. | Dec 1994 | A |
5402496 | Soli et al. | Mar 1995 | A |
5403262 | Gooch | Apr 1995 | A |
5447489 | Issalene et al. | Sep 1995 | A |
5455842 | Mersky et al. | Oct 1995 | A |
5460593 | Mersky et al. | Oct 1995 | A |
5546459 | Sih et al. | Aug 1996 | A |
5558618 | Maniglia | Sep 1996 | A |
5565759 | Dunstan | Oct 1996 | A |
5616027 | Jacobs et al. | Apr 1997 | A |
5624376 | Ball et al. | Apr 1997 | A |
5661813 | Shimauchi et al. | Aug 1997 | A |
5706251 | May | Jan 1998 | A |
5730151 | Summer et al. | Mar 1998 | A |
5760692 | Block | Jun 1998 | A |
5800336 | Ball et al. | Sep 1998 | A |
5812496 | Peck | Sep 1998 | A |
5828765 | Gable | Oct 1998 | A |
5902167 | Filo et al. | May 1999 | A |
5914701 | Gersheneld et al. | Jun 1999 | A |
5961443 | Rastatter et al. | Oct 1999 | A |
5980246 | Ramsay et al. | Nov 1999 | A |
5984681 | Huang | Nov 1999 | A |
6029558 | Stevens et al. | Feb 2000 | A |
6047074 | Zoels et al. | Apr 2000 | A |
6068590 | Brisken | May 2000 | A |
6072884 | Kates | Jun 2000 | A |
6072885 | Stockham, Jr. et al. | Jun 2000 | A |
6075557 | Holliman et al. | Jun 2000 | A |
6115477 | Filo et al. | Sep 2000 | A |
6118882 | Haynes | Sep 2000 | A |
6171229 | Kroll et al. | Jan 2001 | B1 |
6223018 | Fukumoto et al. | Apr 2001 | B1 |
6239705 | Glen | May 2001 | B1 |
6333269 | Naito et al. | Dec 2001 | B2 |
6377693 | Lippa et al. | Apr 2002 | B1 |
6394969 | Lenhardt | May 2002 | B1 |
6504942 | Hong et al. | Jan 2003 | B1 |
6538558 | Sakazume et al. | Mar 2003 | B2 |
6585637 | Brillhart et al. | Jul 2003 | B2 |
6631197 | Taenzer | Oct 2003 | B1 |
6633747 | Reiss | Oct 2003 | B1 |
6682472 | Davis | Jan 2004 | B1 |
6754472 | Williams et al. | Jun 2004 | B1 |
6778674 | Panasik et al. | Aug 2004 | B1 |
6826284 | Benesty et al. | Nov 2004 | B1 |
6885753 | Bank | Apr 2005 | B2 |
6917688 | Yu et al. | Jul 2005 | B2 |
6941952 | Rush, III | Sep 2005 | B1 |
6954668 | Cuozzo | Oct 2005 | B1 |
6985599 | Asnes | Jan 2006 | B2 |
7003099 | Zhang et al. | Feb 2006 | B1 |
7033313 | Lupin et al. | Apr 2006 | B2 |
7035415 | Belt et al. | Apr 2006 | B2 |
7074222 | Westerkull | Jul 2006 | B2 |
7076077 | Atsumi et al. | Jul 2006 | B2 |
7099822 | Zangi | Aug 2006 | B2 |
7162420 | Zangi et al. | Jan 2007 | B2 |
7171003 | Venkatesh et al. | Jan 2007 | B1 |
7171008 | Elko | Jan 2007 | B2 |
7174022 | Zhang et al. | Feb 2007 | B1 |
7206423 | Feng et al. | Apr 2007 | B1 |
7246058 | Burnett | Jul 2007 | B2 |
7258533 | Tanner et al. | Aug 2007 | B2 |
7269266 | Anjanappa et al. | Sep 2007 | B2 |
7271569 | Oglesbee | Sep 2007 | B2 |
7310427 | Retchin et al. | Dec 2007 | B2 |
7329226 | Ni et al. | Feb 2008 | B1 |
7331349 | Brady et al. | Feb 2008 | B2 |
7333624 | Husung | Feb 2008 | B2 |
7361216 | Kangas et al. | Apr 2008 | B2 |
7409070 | Pitulia | Aug 2008 | B2 |
7486798 | Anjanappa et al. | Feb 2009 | B2 |
7520851 | Davis et al. | Apr 2009 | B2 |
7522738 | Miller, III | Apr 2009 | B2 |
7522740 | Julstrom et al. | Apr 2009 | B2 |
7610919 | Utley et al. | Nov 2009 | B2 |
20010003788 | Ball et al. | Jun 2001 | A1 |
20010051776 | Lenhardt | Dec 2001 | A1 |
20020026091 | Leysieffer | Feb 2002 | A1 |
20020071581 | Leysieffer et al. | Jun 2002 | A1 |
20020077831 | Numa | Jun 2002 | A1 |
20020122563 | Schumaier | Sep 2002 | A1 |
20020173697 | Lenhardt | Nov 2002 | A1 |
20030059078 | Downs et al. | Mar 2003 | A1 |
20030091200 | Pompei | May 2003 | A1 |
20030212319 | Magill | Nov 2003 | A1 |
20040057591 | Beck et al. | Mar 2004 | A1 |
20040131200 | Davis | Jul 2004 | A1 |
20040141624 | Davis et al. | Jul 2004 | A1 |
20040202339 | O'Brien, Jr. et al. | Oct 2004 | A1 |
20040202344 | Anjanappa et al. | Oct 2004 | A1 |
20040243481 | Bradbury et al. | Dec 2004 | A1 |
20040247143 | Lantrua et al. | Dec 2004 | A1 |
20050037312 | Uchida | Feb 2005 | A1 |
20050067816 | Buckman | Mar 2005 | A1 |
20050070782 | Brodkin | Mar 2005 | A1 |
20050129257 | Tamura | Jun 2005 | A1 |
20050196008 | Anjanappa et al. | Sep 2005 | A1 |
20050241646 | Sotos et al. | Nov 2005 | A1 |
20060008106 | Harper | Jan 2006 | A1 |
20060025648 | Lupin et al. | Feb 2006 | A1 |
20060064037 | Shalon et al. | Mar 2006 | A1 |
20060166157 | Rahman et al. | Jul 2006 | A1 |
20060167335 | Park et al. | Jul 2006 | A1 |
20060270467 | Song et al. | Nov 2006 | A1 |
20060275739 | Ray | Dec 2006 | A1 |
20070010704 | Pitulia | Jan 2007 | A1 |
20070036370 | Granovetter et al. | Feb 2007 | A1 |
20070041595 | Carazo et al. | Feb 2007 | A1 |
20070105072 | Koljonen | May 2007 | A1 |
20070142072 | Lassally | Jun 2007 | A1 |
20070230713 | Davis | Oct 2007 | A1 |
20070242835 | Davis | Oct 2007 | A1 |
20070265533 | Tran | Nov 2007 | A1 |
20070276270 | Tran | Nov 2007 | A1 |
20070280491 | Abolfathi | Dec 2007 | A1 |
20070280492 | Abolfathi | Dec 2007 | A1 |
20070280493 | Abolfathi | Dec 2007 | A1 |
20070280495 | Abolfathi | Dec 2007 | A1 |
20070286440 | Abolfathi et al. | Dec 2007 | A1 |
20070291972 | Abolfathi et al. | Dec 2007 | A1 |
20080019542 | Menzel et al. | Jan 2008 | A1 |
20080019557 | Bevirt et al. | Jan 2008 | A1 |
20080021327 | El-Bialy et al. | Jan 2008 | A1 |
20080064993 | Abolfathi et al. | Mar 2008 | A1 |
20080070181 | Abolfathi et al. | Mar 2008 | A1 |
20080227047 | Lowe et al. | Sep 2008 | A1 |
20080304677 | Abolfathi et al. | Dec 2008 | A1 |
20090028352 | Petroff | Jan 2009 | A1 |
20090052698 | Rader et al. | Feb 2009 | A1 |
20090088598 | Abolfathi | Apr 2009 | A1 |
20090097684 | Abolfathi et al. | Apr 2009 | A1 |
20090097685 | Menzel et al. | Apr 2009 | A1 |
20090099408 | Abolfathi et al. | Apr 2009 | A1 |
20090105523 | Kassayan et al. | Apr 2009 | A1 |
20090147976 | Abolfathi | Jun 2009 | A1 |
20090180652 | Davis et al. | Jul 2009 | A1 |
Number | Date | Country |
---|---|---|
0715838 | Jun 1996 | EP |
0741940 | Nov 1996 | EP |
0824889 | Feb 1998 | EP |
1299052 | Feb 2002 | EP |
1633284 | Dec 2004 | EP |
1691686 | Aug 2006 | EP |
1718255 | Nov 2006 | EP |
1783919 | May 2007 | EP |
2003-070752 | Mar 2003 | JP |
2003-310561 | Nov 2003 | JP |
2004-167120 | Jun 2004 | JP |
2005-278765 | Oct 2005 | JP |
2007028248 | Feb 2007 | JP |
2007028610 | Feb 2007 | JP |
2007044284 | Feb 2007 | JP |
2007049599 | Feb 2007 | JP |
2007049658 | Feb 2007 | JP |
WO 8302047 | Jun 1983 | WO |
WO 9102678 | Mar 1991 | WO |
WO 9519678 | Jul 1995 | WO |
WO 9621335 | Jul 1996 | WO |
WO 0209622 | Feb 2002 | WO |
WO 2004045242 | May 2004 | WO |
WO 2004105650 | Dec 2004 | WO |
WO 2005000391 | Jan 2005 | WO |
WO 2005037153 | Apr 2005 | WO |
WO 2005053533 | Jun 2005 | WO |
WO 2006088410 | Aug 2006 | WO |
WO 2006130909 | Dec 2006 | WO |
WO 2007043055 | Apr 2007 | WO |
WO 2007052251 | May 2007 | WO |
WO 2007059185 | May 2007 | WO |
WO 2007140367 | Dec 2007 | WO |
WO 2007140368 | Dec 2007 | WO |
WO 2007140373 | Dec 2007 | WO |
WO 2007143453 | Dec 2007 | WO |
WO 2008024794 | Feb 2008 | WO |
WO 2008030725 | Mar 2008 | WO |
WO 2009014812 | Jan 2009 | WO |
WO 2009025917 | Feb 2009 | WO |
WO 2009066296 | May 2009 | WO |
WO 2009073852 | Jun 2009 | WO |
Entry |
---|
“Special Forces Smart Noise Cancellation Ear Buds with Built-In GPS,” http://www.gizmag.com/special-forces-smart-noise-cancellation-ear-buds-with-built-in-gps/9428/, 2 pages, 2008. |
Altmann, et al. Foresighting the new technology waves—Expert Group. In: State of the Art Reviews and Related Papers—Center on Nanotechnology and Society. 2004 Conference. Published Jun. 14, 2004. pp. 1-291. Available at http://www.nano-and-society.org. |
Berard, G., “Hearing Equals Behavior” [summary], 1993, http://www.bixby.org/faq/tinnitus/treatment.html. |
Broyhill, D., “Battlefield Medical Information System—Telemedicine,” A research paper presented to the U.S. Army Command and General Staff College in partial Fulfillment of the requirement for A462 Combat Health Support Seminar, 12 pages, 2003. |
Dental Cements—Premarket Notification, U.S. Department of Health and Human Services Food and Drug Administration Center for Devices and Radiological Health, pp. 1-10, Aug. 18, 1998. |
Henry, et al., “Comparison of Custom Sounds for Achieving Tinnitus Relief,” J Am Acad Audiol, 15:585-598, 2004. |
Jastreboff, Pawel, J., “Phantom auditory perception (tinnitus): mechanisms of generation and perception,” Neuroscience Research, 221-254, 1990, Elsevier Scientific Publishers Ireland, Ltd. |
Robb, “Tinnitus Device Directory Part I,” Tinnitus Today, p. 22, Jun. 2003. |
Song, S. et al., “A 0.2-mW 2-Mb/s Digital Transceiver Based on Wideband Signaling for Human Body Communications,” IEEE J Solid-State Cir, 42(9), 2021-2033, Sep. 2007. |
Stuart, A., et al., “Investigations of the Impact of Altered Auditory Feedback In-The-Ear Devices on the Speech of People Who Stutter: Initial Fitting and 4-Month Follow-Up,” Int J Lang Commun Disord, 39(1), Jan. 2004, [abstract only]. |
U.S. Appl. No. 11/672,239, filed Feb. 7, 2007 in the name of Abolfathi, Non-final Office Action mailed Jun. 18, 2009. |
U.S. Appl. No. 11/672,239, filed Feb. 7, 2007 in the name of Abolfathi, Non-final Office Action mailed Nov. 13, 2008. |
U.S. Appl. No. 11/672,250, filed Feb. 7, 2007 in the name of Abolfathi, Non-final Office Action mailed Apr. 21, 2009. |
U.S. Appl. No. 11/672,250, filed Feb. 7, 2007 in the name of Abolfathi, Non-final Office Action mailed Aug. 8, 2008. |
U.S. Appl. No. 11/672,264, filed Feb. 7, 2007 in the name of Abolfathi, Non-Final Rejection mailed Apr. 28, 2009. |
U.S. Appl. No. 11/672,264, filed Feb. 7, 2007 in the name of Abolfathi, Non-Final Rejection mailed Aug. 6, 2008. |
U.S. Appl. No. 11/672,271, filed Feb. 7, 2007 in the name of Abolfathi, Final Office Action mailed May 18, 2009. |
U.S. Appl. No. 11/672,271, filed Feb. 7, 2007 in the name of Abolfathi, Non-final Office Action mailed Aug. 20, 2008. |
U.S. Appl. No. 11/741,648, filed Apr. 27, 2007 in the name of Menzel et al., Final Office Action mailed May 18, 2009. |
U.S. Appl. No. 11/741,648, filed Apr. 27, 2007 in the name of Menzel et al., Non-final Office Action mailed Sep. 4, 2008. |
U.S. Appl. No. 11/754,823, filed May 29, 2007 in the name of Abolfathi et al., Final Office Action mailed May 12, 2009. |
U.S. Appl. No. 11/754,823, filed May 29, 2007 in the name of Abolfathi et al., Non-final Office Action mailed Aug. 14, 2008. |
U.S. Appl. No. 11/754,833, filed May 29, 2007 in the name of Abolfathi et al., Final Office Action mailed May 14, 2009. |
U.S. Appl. No. 11/754,833, filed May 29, 2007 in the name of Abolfathi et al., Non-final Office Action mailed Aug. 6, 2008. |
U.S. Appl. No. 11/866,345, filed May 29, 2007 in the name of Abolfathi et al., Final Office Action mailed Apr. 15, 2009. |
U.S. Appl. No. 11/866,345, filed May 29, 2007 in the name of Abolfathi et al., Non-final Office Action mailed Mar. 19, 2008. |
Wen, Y. et al, “Online Prediction of Battery Lifetime for Embedded and Mobile Devices,” Special Issue on Embedded Systems: Springer-Verlag Heidelberg Lecture Notes in Computer Science, V3164/2004, 15 pages, Dec. 2004. |
Number | Date | Country | |
---|---|---|---|
20090149722 A1 | Jun 2009 | US |