This patent relates to sleep monitoring, and more particularly to the detection and analysis of sleep sounds.
Sleep is sometimes problematic for people. Snoring can be an issue for the snorer, as well as for anyone sleeping in the snorer's vicinity. There are numerous methods of attempting to cope with snoring.
One prior art method of determining snoring and its cause is the sleep study. A sleep study involves sleeping numerous nights in a laboratory that monitors the user's snoring, oxygen flow, and other physiological signals during sleep. This information can be used to determine when and why someone snores. However, sleep studies are time-consuming and expensive, and the change in environment sometimes disrupts the subject's sleep patterns.
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
A method or apparatus to track, and in one embodiment reduce, user snoring and/or other unhealthy sleep behaviors is described. Snoring may be an indication of health issues, such as sleep apnea, and generally is correlated with less restful sleep. Furthermore, snoring can disturb the sleep of others. There are many variations of snoring. While loud snorers are often aware of their snoring, intermittent snorers or those who sleep alone may not be aware that they are snoring. In addition to snoring, the sleep sound system may be able to detect, and recommend corrective action for, teeth grinding, restless leg syndrome, sleep apnea (disrupted breathing), and other conditions that may be identified based on sensor and sound data recorded during the user's sleep.
The present system uses a sleep monitoring mechanism, in conjunction with a microphone or other sound pick-up mechanism, to monitor a user's sleep sounds. The sleep sounds can then be analyzed, and the data may be used to inform the user and/or another appropriate party, such as a medical provider, to make recommendations, and/or for other purposes. In one embodiment, the audio data may also be used in determining the user's sleep state. In one embodiment, the audio data may be recorded, so that an appropriate professional may evaluate it.
The following detailed description of embodiments of the invention makes reference to the accompanying drawings, in which like references indicate similar elements, showing by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. One skilled in the art understands that other embodiments may be utilized and that logical, mechanical, electrical, functional, and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
In one embodiment, sleep monitor system 110 includes sensor system 105 and recording system 150. Sensor system 105 incorporates one or more sensors, to detect the user's sleep conditions, and recording system 150 utilizes a microphone or video recording to store the user's snoring or other sleep behavior for later analysis.
In one embodiment, sensor system 105's sensors may be incorporated into various formats such as a smart phone 120, wrist band 110, sleep mask 115, pillow 135, armband 135, or other sensor element 140. The system may include a plurality of such devices, each providing one or more sensors. In one embodiment, recording system 150 may also be incorporated into these devices. In another embodiment, the recording system 150 may be separate. If the recording system 150 is a separate system, it may communicate with sensor system 105 via a direct connection or a wireless connection.
The sensors 205, in one embodiment, include motion sensor 210. In one embodiment, the motion sensor 210 is an accelerometer used to detect user motion. User motion can be used to determine a user's current sleep phase, as well as other aspects of the user's quality of sleep. In one embodiment, sensors 205 may include additional sensors, which may also be used to assist in determining a current sleep phase, as well as to collect relevant data. For example, additional sensors may include one or more of: temperature sensor 212, which may include a body thermometer and/or an ambient temperature thermometer, humidity detector 219, blood pressure monitor 214, brain sensor 216 to detect brain waves which indicate sleep phase, and/or other sensors 218.
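By way of illustration only, the readings from such a sensor suite might be gathered into a record along the lines of the following Python sketch; the field names and types are hypothetical and do not appear in the figures:

```python
from __future__ import annotations
from dataclasses import dataclass, field
from time import time

@dataclass
class SensorSample:
    """One time-stamped reading from the sensor suite (hypothetical names)."""
    timestamp: float = field(default_factory=time)
    acceleration: tuple[float, float, float] = (0.0, 0.0, 0.0)  # motion sensor 210
    body_temp_c: float | None = None                  # temperature sensor 212
    humidity_pct: float | None = None                 # humidity detector 219
    blood_pressure: tuple[int, int] | None = None     # monitor 214 (systolic, diastolic)
    eeg_band_power: dict[str, float] | None = None    # brain sensor 216

sample = SensorSample(acceleration=(0.01, -0.02, 0.98), body_temp_c=36.5)
print(sample)
```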
The sleep sensing system 200 also includes recording system 225 in one embodiment. The recording system 225 includes a microphone 250, or other sound pick-up mechanism. The microphone 250 is used to pick up sounds, such as the sounds of snoring. In one embodiment, timer 245 controls when microphone 250 is turned on. In one embodiment, the microphone 250 is turned on periodically. In one embodiment, the periodicity is per sleep phase, ensuring that sounds are monitored in each sleep cycle. In one embodiment, the timer 245 is controlled by the sleep state logic 220, which determines the current sleep state based on the sensor data. For example, the recording may be initiated when the data indicates that a sleep state transition has occurred, and periodically thereafter. In one embodiment, a video camera 252 may be turned on when the sleep transition has occurred, or when snoring is detected, etc. In another embodiment, the recording (audio and/or video) may be continuous. The data from the recording system 225 is stored in data store 254. In one embodiment, data store 254 includes the recording data, and correlated sensor data and sleep state information.
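A minimal Python sketch of the timer behavior described above, assuming illustrative per-phase periods and a 30-second capture length (neither value is specified in this description):

```python
import time

PERIOD_BY_PHASE_S = {"light": 900, "deep": 1800, "rem": 600}  # assumed periods

class RecordingTimer:
    """Turns the microphone on when a sleep-state transition occurs,
    and periodically thereafter, as described for timer 245."""

    def __init__(self):
        self.last_phase = None
        self.next_record_at = 0.0

    def should_record(self, phase: str, now: float) -> bool:
        if phase != self.last_phase:       # sleep state transition detected
            self.last_phase = phase
            self.next_record_at = now      # record immediately on transition
        if now >= self.next_record_at:
            self.next_record_at = now + PERIOD_BY_PHASE_S.get(phase, 900)
            return True
        return False

timer = RecordingTimer()
print(timer.should_record("light", time.time()))   # True: first transition
```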
Timer 245 may also control when other sensors 205 are turned on.
The sensor data and recording data are stored in data store 254. In one embodiment, the data is analyzed by data analysis logic 260. In one embodiment, data analysis logic 260 determines whether the recorded sound is that of snoring. In one embodiment, data analysis logic 260 may also identify recorded sounds or video that may be indicators of a health concern, for example hiccups, choking sounds, thrashing, or any other sounds or movements indicative of a problem. In one embodiment, the data store 254 stores any such relevant data.
In one embodiment, the data store 254 may store continuously recorded sounds/video/sensor data, and may purge "uninteresting" data in a first in, first out (FIFO) type of system. For example, in one embodiment the data store 254 may allocate two hours' worth of audio/video data for a night. Every hour, the recorded data may be analyzed by data analysis logic 260, and when the data is not informative (e.g. no change is occurring, the sleeper continues to sleep in the same sleep phase, the snoring continues at the same volume and intensity, etc.) the data may be discarded.
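One possible shape for such a FIFO purge, as a Python sketch; the segment capacity and the `is_informative` predicate are assumptions standing in for data analysis logic 260:

```python
from collections import deque

class NightBuffer:
    """Holds at most `capacity` recorded segments, discarding oldest first."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.segments = deque()

    def add(self, segment):
        if len(self.segments) >= self.capacity:
            self.segments.popleft()                 # FIFO purge of oldest data
        self.segments.append(segment)

    def purge_uninteresting(self, is_informative):
        """Drop segments the analysis logic judged uninformative."""
        self.segments = deque(s for s in self.segments if is_informative(s))

buf = NightBuffer(capacity=120)   # e.g. two hours of one-minute segments
```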
In one embodiment, data analysis logic 260 can be used to assist in diagnosing and treating various conditions. Over time, the data analysis logic 260, or statistical analysis logic 292, monitors the data and builds up information about what variables and conditions affect the person's conditions for better or worse. This data can be used by the user, medical personnel, or another appropriate party to help understand what makes various conditions better or worse. In one embodiment, recommendation logic 230 can use this information to recommend adjustments in the user's behavior, environment, or conditions. This applies to snoring, choking, or any of the other monitored conditions. Recommendation logic 230 may output its recommendations via communication logic 235.
The user interface system 265 allows display of the recorded data. In one embodiment, the recorded data may be displayed on the user interface 298 of computer system 280, when the data from sleep monitor system 200 is sent to the computer system 280.
In one embodiment, for certain high-risk sounds detected, the data analysis logic 260 may send the data to alert system 270. Alert system 270 may deliver the alert via a local speaker or other output mechanism, a message to the user's mobile telephone, a notification to a doctor or other relevant contact, or a similar destination. In one embodiment, for choking or other sounds that may indicate immediate health distress, alert system 270 or UI system 265 may attempt to wake the user immediately.
In one embodiment, the data from sleep monitor system 200 may be sent to a computer system 280, either directly or via a network, such as a wireless network. The computer system 280 may be a local device or a server system. In one embodiment, the computer system 280 may perform the data analysis, using remote data analysis logic 262, and return data to the sleep monitor system 200, via communication system 282, to communication logic 235. In one embodiment, data analysis logic 260 and remote analysis logic 262 may share processing, such that some, or all, of the more processing-intensive analysis is done remotely.
In one embodiment, the computer system 280 may receive statistical data from the sleep monitoring system 200, and collect such abstracted data from a large number of users, as collected sleep data 290. This enables statistical analysis logic 292 to perform cumulative data analysis to determine risk factors for snoring or other adverse health events across anonymized data from many users. For example, the system may determine that certain users snore more, and have less restful sleep, when the temperature is above 76 degrees Fahrenheit. This may enable the system, via UI 265 or UI 298 to suggest to a user to reduce the bedroom temperature.
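The cumulative analysis might resemble the following Python sketch; the sample records, the 76-degree threshold, and the 1.5x significance margin are all illustrative assumptions:

```python
# Hypothetical (bedroom_temp_f, minutes_of_snoring) records across user-nights.
nights = [(72, 5), (78, 25), (75, 8), (80, 31), (68, 2), (77, 22)]

THRESHOLD_F = 76

def avg(xs):
    return sum(xs) / len(xs) if xs else 0.0

above = [m for t, m in nights if t > THRESHOLD_F]
below = [m for t, m in nights if t <= THRESHOLD_F]

if avg(above) > 1.5 * avg(below):    # assumed significance margin
    print(f"Users snore more above {THRESHOLD_F}F "
          f"({avg(above):.0f} min vs {avg(below):.0f} min); "
          "suggest reducing bedroom temperature.")
```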
In one embodiment, the computer system 280 may utilize additional data, such as weather data 284 or other data 286, in addition to sensor data received, to form a more complete picture of the environment. For example, local weather data 284, obtained from third party sources, may be added to the sensor data received from various users.
At block 320, the process tracks the user's sleep patterns. The sleep tracking may occur through motion tracking, as with an accelerometer, through cameras, and/or through other sensors; in one embodiment, data from a combination of sensors is used. Sleep tracking may categorize the user's current sleep by phase, e.g. deep sleep, light sleep, and awake, or optionally using the N1, N2, N3, and REM stages, or another classification. The user's sleep patterns, in one embodiment, are thus divided into sleep phases, determined from the combined sensor data, as sketched below.
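As an illustration, a toy rule-based classifier combining two sensor inputs might look like this Python sketch; the thresholds and the two-sensor choice are assumptions, not the actual classification method:

```python
def classify_sleep_phase(motion_rms: float, heart_rate_bpm: float) -> str:
    """Toy classifier combining motion and heart-rate data (thresholds assumed)."""
    if motion_rms > 0.5:                           # sustained movement
        return "awake"
    if motion_rms < 0.05 and heart_rate_bpm < 55:  # very still, low heart rate
        return "deep sleep"
    return "light sleep"

print(classify_sleep_phase(motion_rms=0.02, heart_rate_bpm=52))  # deep sleep
```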
At block 330, the process periodically turns on the microphone to record the sounds being made by the sleeper. In one embodiment, the periodicity is designed to ensure that the microphone is turned on for some time during all phases of sleep. In one embodiment, the system may also periodically turn on a video camera or take still images. In one embodiment, the recording may be turned on based on the user's sleep state, as determined by the sensors in the sleep monitor. In one embodiment, the recording may be turned on periodically. In another embodiment, the recording may be continuous. In another embodiment, when other sensor data indicates snoring, or distress, the recording may be turned on.
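A Python sketch of one possible monitoring loop that records at least once in every sleep phase encountered; the callables, the capture length, and the 15-minute default interval are assumptions:

```python
import time

def monitor_night(asleep, get_phase, record_audio, interval_s=900):
    """Toy loop: record at least once in each sleep phase encountered,
    and periodically otherwise (all callables are assumed interfaces)."""
    recorded_phases = set()
    next_due = 0.0
    while asleep():
        now, phase = time.time(), get_phase()
        if phase not in recorded_phases or now >= next_due:
            record_audio(seconds=30)        # brief capture; length assumed
            recorded_phases.add(phase)
            next_due = now + interval_s
        time.sleep(1)                       # poll roughly once per second
```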
Returning to FIG. 3, the process determines whether snoring was detected.
If snoring was detected, at block 350, the snoring is analyzed, and the snoring data is made available to the user. In one embodiment, the snoring data may be associated with the sleep quality data, which may be available to the user.
At block 360, the process determines whether, based on the monitored data, there are any recommendations for adjustments to user behavior and/or environment. For example, the system may determine that the user snores more when the user goes to sleep after midnight; the system may then recommend an earlier bedtime to reduce snoring. Conditions ranging from activity level, time to bed, light levels in the room, and temperature to other behaviors or environmental conditions may be evaluated in forming such recommendations.
If there are recommendations, at block 370 the recommendation is provided. The recommendation may be provided to the user and/or to an appropriate third party, such as a medical professional.
The process then ends at block 380. In one embodiment, if the snoring detected indicates a likely health problem, such as sleep apnea or similar, the user may be sent an alert in addition to making the snoring data available via a user interface. The alert may be a text message, an email, or another mechanism.
At block 515, the user's sleep indication is received. In one embodiment, this may be a manual switch or button used to indicate that the user is going to sleep. In one embodiment, this may be automatically detected by the sleep monitoring system.
At block 520, the process determines whether the user is awake. If the user is awake, e.g. ending the sleep period, the process ends at block 545. In one embodiment, this occurs when the user manually indicates that he or she has finished sleeping. In one embodiment, this may be automatically detected when the user rises from bed or otherwise takes an action that is incompatible with sleep.
If the user is not waking up, at block 525, the recording is turned on to record sleep sounds. In one embodiment, this occurs a set period after the user's sleep is initiated. In another embodiment, the timing of the sound recording may depend on user characteristics. In another embodiment, this may be continuous when the user is sleeping. The recording may be an audio recording, or an audio/visual recording.
At block 530, the process determines whether sleep sounds were detected. These sleep sounds include the sounds of snoring and other noises made by sleeping humans, as well as, in one embodiment, other noises in the environment. If sounds were detected, the process continues to block 535.
At block 535, the process determines whether the noises and/or other data indicate an urgent problem. Such noises may include choking, coughing, a fire alarm, or other noises which generally would need a prompt response. If the noises, in one embodiment in combination with other sensor data, are indicative of such an urgent problem, at block 540, a waking alarm is sent to the user. This is designed to wake the user and may be an auditory, visual, tactile, or other alarm format. In one embodiment, the alarm may be sounded directly by the sleep system, sent to a mobile telephone or landline telephone, or otherwise conveyed to the user. In one embodiment, the alarm may also be sent to a third party, when appropriate. The third party may be designated by the user, e.g. the user's partner, medical provider, alarm company, 911 provider, etc. The process then ends, when the user is awakened, at block 545.
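A Python sketch of how such escalation might be wired up; the label set, confidence threshold, and callback names are hypothetical:

```python
URGENT_LABELS = {"choking", "gasping", "fire_alarm"}   # assumed label set

def handle_sound(label, confidence, wake_user, notify_third_party):
    """Escalate sounds suggesting immediate distress (waking alarm, block 540)."""
    if label in URGENT_LABELS and confidence > 0.8:    # assumed threshold
        wake_user()                                    # auditory/visual/tactile
        notify_third_party(f"urgent sleep sound detected: {label}")
        return True
    return False

handle_sound("choking", 0.92,
             wake_user=lambda: print("ALARM: waking user"),
             notify_third_party=print)
```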
If the noises are not indicative of an urgent problem, the process continues to block 550, and the timer for recording sleep sounds is set to a period. For example, if snoring is detected, the sleep sounds may be set to record every 15 minutes. In one embodiment, the timing may range from continuous (e.g. a zero-second interval), to periodic (e.g. every 15 minutes), to once per sleep phase, once per sleep cycle, once per sleep period, or even less frequently. The process then continues to block 560.
If at block 530 no noises were detected, at block 555 the timer is set to record sleep sounds at a different period. In one embodiment, this period is longer than the period set when snoring or other relevant sounds were detected, so that testing for noise is less frequent when no noise has been detected. The process then continues to block 560.
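A minimal Python sketch of the two-branch timer setting of blocks 550 and 555; the 15-minute value comes from the example above, while the 45-minute quiet-night value is an assumption:

```python
INTERVAL_IF_SOUND_S = 15 * 60   # block 550: sounds detected, record every 15 min
INTERVAL_IF_QUIET_S = 45 * 60   # block 555: no sounds, test less often (assumed)

def next_recording_interval(sound_detected: bool) -> int:
    """Choose the timer period for the next recording, per blocks 550/555."""
    return INTERVAL_IF_SOUND_S if sound_detected else INTERVAL_IF_QUIET_S

print(next_recording_interval(True) // 60, "minutes")   # -> 15 minutes
```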
At block 560, the process determines whether the sleep phase of the user has changed. In one embodiment, the sleep phase is determined based on data from one or more sensors. In one embodiment, the sleep sounds may be included in making this determination. In general, snoring differs by sleep phase for most sleepers; therefore, in one embodiment, the frequency of recording is determined for each sleep phase. If there is no change in sleep phase, the process continues recording sleep noises with the periodicity indicated. In one embodiment, any time a sleep sound is detected, the process of verifying that the noise does not require an urgent response is applied.
If the sleep phase changes, at block 560, in one embodiment the process resets the recording timer, at block 565. The process then continues to block 520, to determine whether the user is awake.
In one embodiment, the above process is applicable when the system does not have a significant amount of data about the user. In one embodiment, once the user's sleep has been monitored over an extended period and no snoring has been detected, the system may set a testing rate for future sleep cycles, without evaluating the detection of noises. In one embodiment, however, if noises are detected, the timing of the testing is adjusted.
At block 620, the snoring is mapped to sensor data. As noted above, the sensor data may include motion data, temperature, video or image data, brainwave data, blood pressure data, heart rate data, etc. In one embodiment, the combination of sensor data is evaluated. In another embodiment, only the motion data is evaluated.
At block 630, the process determines whether the recorded sound likely did not come from the user. In one embodiment, snoring has a somewhat characteristic motion associated with it, e.g. there is a vibration; there may also be associated brainwave or other sensor data. If the sound is accompanied by no vibration at all, it is possible that the sound actually originates from another person snoring in the same room or in another room, or from a different source.
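One way to test this, sketched in Python, is to correlate the audio envelope against the motion sensor's vibration amplitude; the 0.5 correlation threshold is an assumption:

```python
def likely_user_snoring(audio_env: list, vib_env: list) -> bool:
    """Correlate the audio envelope with body-vibration amplitude.

    A snore produced by the user should co-occur with vibration from the
    motion sensor; strong sound with no vibration suggests another source.
    """
    n = min(len(audio_env), len(vib_env))
    a, v = audio_env[:n], vib_env[:n]
    mean_a, mean_v = sum(a) / n, sum(v) / n
    cov = sum((x - mean_a) * (y - mean_v) for x, y in zip(a, v)) / n
    var_a = sum((x - mean_a) ** 2 for x in a) / n
    var_v = sum((y - mean_v) ** 2 for y in v) / n
    if var_a == 0 or var_v == 0:
        return False                       # no co-variation at all
    corr = cov / (var_a ** 0.5 * var_v ** 0.5)
    return corr > 0.5                      # assumed correlation threshold

audio = [0.1, 0.9, 0.2, 0.8, 0.1, 0.7]
vibration = [0.0, 0.5, 0.1, 0.6, 0.0, 0.4]
print(likely_user_snoring(audio, vibration))   # True: sound tracks vibration
```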
If the sound is likely not from the user, at block 640 the user is informed of the issue, and information is requested about the potential presence of other snorers, or sources of similar sounds, in the household. If the user indicates, at block 650, that there are no other sleepers or other noise sources, the process continues to block 670. If there is another sleeper, the user is informed of the potential issue with the other person's snoring, at block 660. The process then ends at block 699.
If the mapping indicated, at block 630, that the user was the likely snorer, the process continues directly to block 670.
At block 670, the snoring data is mapped to the sleep phase in which it occurs. There may be a correlation between snoring timing and potential health issues.
At block 680, the process determines whether the snoring indicates a potential health problem. This may be based on the frequency, loudness, type, or timing of the snoring. If there is a potential health problem, at block 690 an alert is sent to the user, to suggest following up on this issue. At block 695, the results of the analysis are made available to the user. In one embodiment, in addition to alerting the user, a third party may be alerted as well, such as a medical professional or other relevant party indicated in the sleep system. In one embodiment, the user is prompted to enter additional information, when available. For example, if an alert is sent suggesting a visit to a doctor, a follow-up may ask whether the user did visit a doctor, and the results of that visit.
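A toy Python screen for the block 680 decision; the thresholds are illustrative placeholders, not clinical criteria:

```python
def snoring_health_flag(events_per_hour: float, peak_db: float,
                        apnea_pauses: int) -> bool:
    """Flag snoring that may merit medical follow-up (block 680 sketch)."""
    return events_per_hour > 30 or peak_db > 60 or apnea_pauses > 5

print(snoring_health_flag(events_per_hour=42, peak_db=55, apnea_pauses=2))  # True
```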
If there is no potential health problem indicated, at block 680, the process continues to block 695, to make the results of the analysis available to the user. An anonymized version of this information may be passed to the server, and used in future analysis of sleep issues in a cumulative manner, in one embodiment. For example, if the recommendation for a doctor visit results in the use of a sleep aid, a tool for preventing grinding of teeth, or medication, this information may be used to refine future recommendations regarding potential health problems. The process ends at block 699.
While the above process is described with respect to snoring, one of skill in the art would understand that the same combination of movement and periodic sound recording data may be used to analyze for teeth grinding, sleep walking, wheezing, or other potentially harmful health indicators which may be observed during sleep. In one embodiment, sleep sounds are useful as a feedback loop, to provide audio data that the user or a healthcare provider may listen to, and evaluate for problems. The sleep sound data may also be used as sensor data, to determine sleep phase, as well as sleep quality.
One of skill in the art would understand that although the above description is with respect to sleep sounds, a similar logic may be applied in recording video, brain data, imaging data, or other types of sensor data that could provide useful information about a user's sleep and/or health.
The data processing system illustrated in FIG. 7 includes a bus or other internal communication means 740 for communicating information, and a processing unit 710 coupled to the bus 740 for processing information.
The system further includes, in one embodiment, a random access memory (RAM) or other volatile storage device 720 (referred to as memory), coupled to bus 740 for storing information and instructions to be executed by processor 710. Main memory 720 may also be used for storing temporary variables or other intermediate information during execution of instructions by processing unit 710.
The system also comprises in one embodiment a read only memory (ROM) 750 and/or static storage device 750 coupled to bus 740 for storing static information and instructions for processor 710. In one embodiment, the system also includes data storage device 730 such as a magnetic disk or optical disk and its corresponding disk drive, or Flash memory or other storage, which is capable of storing data when no power is supplied to the system. Data storage device 730 in one embodiment is coupled to bus 740 for storing information and instructions.
The system may further be coupled to an output device 770, such as a cathode ray tube (CRT) or a liquid crystal display (LCD), coupled to bus 740 through bus 760 for outputting information. The output device 770 may be a visual output device, an audio output device, and/or a tactile output device (e.g. vibrations, etc.).
An input device 775 may be coupled to the bus 760. The input device 775 may be an alphanumeric input device, such as a keyboard including alphanumeric and other keys, for enabling a user to communicate information and command selections to processing unit 710. An additional user input device 780 may further be included. One such user input device 780 is a cursor control device, such as a mouse, a trackball, a stylus, cursor direction keys, or a touch screen, which may be coupled to bus 740 through bus 760 for communicating direction information and command selections to processing unit 710, and for controlling movement on display device 770.
Another device, which may optionally be coupled to computer system 700, is a network device 785 for accessing other nodes of a distributed system via a network. The communication device 785 may include any of a number of commercially available networking peripheral devices, such as those used for coupling to Ethernet, token ring, the Internet, a wide area network, a personal area network, or a wireless network, or another method of accessing other devices. The communication device 785 may further be a null-modem connection, or any other mechanism that provides connectivity between the computer system 700 and the outside world.
Note that any or all of the components of this system illustrated in FIG. 7, and the associated hardware, may be used in various embodiments of the present invention.
It will be appreciated by those of ordinary skill in the art that the particular machine which embodies the present invention may be configured in various ways according to the particular implementation. The control logic or software implementing the present invention can be stored in main memory 720, mass storage device 730, or other storage medium locally or remotely accessible to processor 710.
It will be apparent to those of ordinary skill in the art that the system, method, and process described herein can be implemented as software stored in main memory 720 or read only memory 750 and executed by processor 710. This control logic or software may also be resident on an article of manufacture comprising a computer readable medium having computer readable program code embodied therein and being readable by the mass storage device 730 and for causing the processor 710 to operate in accordance with the methods and teachings herein.
The present invention may also be embodied in a handheld or portable device containing a subset of the computer hardware components described above. For example, the handheld device may be configured to contain only the bus 740, the processor 710, and memory 750 and/or 720.
The handheld device may be configured to include a set of buttons or input signaling components with which a user may select from a set of available options. These could be considered input device #1 775 or input device #2 780. The handheld device may also be configured to include an output device 770 such as a liquid crystal display (LCD) or display element matrix for displaying information to a user of the handheld device. Conventional methods may be used to implement such a handheld device. The implementation of the present invention for such a device would be apparent to one of ordinary skill in the art given the disclosure of the present invention as provided herein.
The present invention may also be embodied in a special purpose appliance including a subset of the computer hardware components described above. For example, the appliance may include a processing unit 710, a data storage device 730, a bus 740, and memory 720, and no input/output mechanisms, or only rudimentary communications mechanisms, such as a small touch-screen that permits the user to communicate in a basic manner with the device. In general, the more special-purpose the device is, the fewer elements need be present for the device to function. In some devices, communications with the user may be through a touch-based screen, or similar mechanism. In one embodiment, the device may not provide any direct input/output signals, but may be configured and accessed through a website or other network-based connection through network device 785.
It will be appreciated by those of ordinary skill in the art that any configuration of the particular machine implemented as the computer system may be used according to the particular implementation. The control logic or software implementing the present invention can be stored on any machine-readable medium locally or remotely accessible to processor 710. A machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g. a computer). For example, a machine readable medium includes read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or other storage media which may be used for temporary or permanent data storage. In one embodiment, the control logic may be implemented as transmittable data, such as electrical, optical, acoustical, or other forms of propagated signals (e.g. carrier waves, infrared signals, digital signals, etc.).
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
This patent claims priority to U.S. Provisional Application No. 61/620,857, filed on Apr. 5, 2012, and incorporates that application by reference in its entirety.