The present invention relates to sleep analysis, and more particularly to providing sleep analysis through a sleep surface.
An average person spends about one-third of his or her life asleep. Sleep is the time our bodies undergo repair and detoxification. Research has shown that poor sleep patterns are an indication of, and are often directly correlated with, poor health. Proper, restful, and effective sleep has a profound effect on our mental, emotional, and physical well-being.
Every person has a unique circadian rhythm that, without manipulation, will cause the person to consistently go to sleep around a certain time and wake up around a certain time. For most people, a typical night's sleep consists of five sleep cycles, each lasting about 90 minutes. The first four stages of each cycle are often regarded as quiet sleep or non-rapid eye movement (NREM) sleep. The final stage is referred to as rapid eye movement (REM) sleep. REM sleep is thought to help consolidate memory and emotion. During REM sleep, blood flow declines in areas of the brain associated with complex reasoning and language, whereas blood flow rises sharply in areas of the brain linked to processing memories and emotional experiences.
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
An improved sleep surface based sleep analysis system is described. This system can also be referred to as an in-bed sensor system. The system includes one or more sensors arranged underneath a sleeping surface, such as a mattress, mattress topper, or other portion of the sleeping surface. The sensor portion of the system is designed to be placed far enough from the user that its shape cannot be felt by the user. In one embodiment, the sensor is sufficiently sensitive to pick up micro-motions when placed underneath or in a mattress, on a box spring, on or in slats, on or in an adjustable base, on or in a platform, or in another configuration that provides a solid base underneath a mattress. In one embodiment, the sensor system can be retrofitted into an existing bed. In one embodiment, the LDC1000, LDC1612, or LDC1614 inductive sensor system from TEXAS INSTRUMENTS® is used as the inductive sensor. A different inductive sensor may be used. Alternatively or additionally, other types of sensors, such as accelerometers, may be used, as long as they are sufficiently sensitive to pick up user micro-motions.
In one embodiment, the system may be incorporated into a box spring, foundation, base, mattress, or mattress topper. In one embodiment, the output of the sensor is coupled to the rest of the sensor system via a cable. In one embodiment, a cable provides power to the sensor, and is used to send data from the sensor to the other parts of the sensor system. In one embodiment, the sensor may be separately powered, and data may be transmitted using a network connection such as Bluetooth or Wi-Fi, or another format. In one embodiment, power and data may both be transmitted wirelessly. In one embodiment, the sensor is coupled to a processor, which is coupled to a mobile device and/or a network.
The following detailed description of embodiments of the invention makes reference to the accompanying drawings, in which like references indicate similar elements, and which show, by way of illustration, specific embodiments in which the invention may be practiced. Description of these embodiments is in sufficient detail to enable those skilled in the art to practice the invention. One skilled in the art understands that other embodiments may be utilized and that logical, mechanical, electrical, functional and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
In one embodiment, the analysis system 130 communicates with a user mobile device 140. The connection between analysis system 130 and user mobile device 140 may be via a network 150, such as via a WiFi connection or cellular network connection. The connection may be via a local area network or personal area network, such as Bluetooth. In one embodiment, the connection may be a physical wireline connection. The user mobile device 140 may be a smart phone, a tablet, or another device, which provides additional user interface features and controls. In one embodiment, the connection may be provided through a docking station, or similar connection element, which physically connects the sleep analysis system 115 and the user mobile device 140.
In one embodiment, the sleep analysis system 115 may additionally or alternatively be connected to a computer system 145 or a server 170, via network 150, such as Bluetooth, WiFi, or another type of connection. In one embodiment, the user mobile device 140 and/or the sleep analysis system 130 may provide controls to devices which are part of the Internet of Things 160. The Internet of Things 160 may include elements of a smart home or environment, and provide controllers for a smart bed, lights, thermostats, coffee makers, window treatments, speakers, alarm clocks, and other aspects of the user's environment that may be controlled via commands through a network. The IoT system 160 may control IoT enabled elements to assist in optimizing the user's sleep and health.
In one embodiment, some or all of the user data may further be transmitted to a sleep server 170, which can provide additional analysis. In one embodiment, user data is stored on the mobile device. In one embodiment, collective anonymized user data is stored by sleep statistics system/server 175. The sleep statistics system/server 175 utilizes the abstracted data to analyze sleep patterns across large populations, correlated by user characteristics. The system 175, in one embodiment, includes data from millions of nights of sleep, and uses that data to provide recommendations to users and adjustments to the system.
The sleep tracker processor 186 in one embodiment connects to an application on a mobile device 188 via a wireless connection. In one embodiment, the wireless connection is a Bluetooth Low Energy (BLE) connection. The connection may be continuous, periodic, or intermittent. In another embodiment, the connection is a Wi-Fi connection, or another type of wireless connection. In one embodiment, the mobile device 188 may be connected to the sleep tracker processor 186 via a wired connection.
Alternatively, a sensor array may include multiple sensors, as shown in
In one embodiment, these sensors may each be separate sensors with connections to the other sensors or the sleep tracker processor. In one embodiment, the sensors may be spaced in a rectangle, at head and mid-chest height of a user. Alternative configurations of a multi-sensor array may be used.
The sensors, in one embodiment, are connected in parallel, or in series, to a data collection mechanism, which provides power to the sensors and provides the sensor data to the sleep tracker processor. In one embodiment, the sensor itself may have a small buffer for collected data, so that the sensor data may be passed through a single cable, using sequential or parallel encoding.
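The following is a minimal sketch of how buffered per-sensor readings might be serialized sequentially over a single cable. The frame layout (a one-byte sensor identifier, a two-byte sample count, then four-byte samples) is an illustrative assumption, not a format specified in this disclosure.

```python
import struct
from typing import Dict, List

# Hypothetical frame layout: 1-byte sensor id, 2-byte sample count,
# then each sample as a 4-byte unsigned integer (illustrative only).
_HEADER = struct.Struct("<BH")
_SAMPLE = struct.Struct("<I")

def encode_frames(buffers: Dict[int, List[int]]) -> bytes:
    """Sequentially encode each sensor's buffered samples into one byte stream."""
    out = bytearray()
    for sensor_id, samples in buffers.items():
        out += _HEADER.pack(sensor_id, len(samples))
        for sample in samples:
            out += _SAMPLE.pack(sample)
    return bytes(out)

def decode_frames(stream: bytes) -> Dict[int, List[int]]:
    """Recover per-sensor sample buffers from the sequential byte stream."""
    buffers, offset = {}, 0
    while offset < len(stream):
        sensor_id, count = _HEADER.unpack_from(stream, offset)
        offset += _HEADER.size
        samples = [_SAMPLE.unpack_from(stream, offset + i * _SAMPLE.size)[0]
                   for i in range(count)]
        offset += count * _SAMPLE.size
        buffers[sensor_id] = samples
    return buffers

if __name__ == "__main__":
    raw = {1: [1012, 1015, 1013], 2: [998, 1001]}   # buffered inductance counts
    assert decode_frames(encode_frames(raw)) == raw
```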
The sensors may be located in various locations within the bed structure, as shown in
In one embodiment, the sensor may be located between the foundation and the mattress, as shown in
In one embodiment, in a configuration without a foundation or box spring, as shown in
In one embodiment, shown in
An alternative configuration, shown in
Another configuration, shown in
Sleep tracker processor 410 includes an AC/AC converter 412 to step down the AC voltage from the plug to the level needed by the sensor system 400. In one embodiment, an AC/DC converter 414 provides power for the processor and other elements of the system, potentially including other sensors 404.
The sensor system 400 includes inductive sensor 402 which may be built into the bed, as described above, and may receive low power AC power from AC/AC converter 412. The output of the inductive sensor 402, and other sensors 404 may be converted by signal converter 406 to a digital signal, prior to being sent to the sleep tracker processor via a cable. In one embodiment, sensor system 400 includes a local data store 408 to store sensor data. In one embodiment, the data store 408 may be used to buffer data.
In one embodiment, the in-bed portion of the system is the sensor system 400 including the inductive sensor 402, any other built-in sensors 404, and signal converter 406, powered through a cable from the sleep tracker processor 410. The converted signal is sent to the sleep tracker processor 410, which in one embodiment is an element plugged into the wall. In another embodiment, the raw sensor data may be sent to sleep tracker processor 410.
Sleep tracker processor 410 controls calibrator 416 which may be used to calibrate the inductive sensor 402. The calibration may be based on the user's characteristics, such as gender, weight, etc.
The data separator 420 separates the data from a plurality of sensors, if there are multiple sleepers. The data separator 420 in one embodiment further separates the breathing, micro-motion, and heart rate data. The data is then used by sleep position identifier 422 to identify the sleep position of each of the users in the bed.
ECG analyzer 424 analyzes the separated heart data. This data may be generally used to help identify sleep state. However, it can also be used to detect potential problems, such as arrhythmia.
Apnea detector 426 utilizes the breathing data to ensure that the user is breathing smoothly. Snore recorder 428 in one embodiment utilizes a microphone to record snoring. In one embodiment, the recorder 428 may include a detection mechanism and a trigger to turn on the microphone and recording when appropriate based on breathing data.
Multi-data sleep-state calculator 430 utilizes the heart rate, breathing, and micro-motion data, as well as any other data available, to identify the users' sleep states. In one embodiment, the output of sleep-state calculator 430 is compared to the output of sleep stage predictor/inference engine 434 by sleep stage verifier 432. The real data is used to validate the prediction, rather than generating the sleep state directly from the data. In one embodiment, the inference engine 434 may utilize primarily the user's own data. However, in one embodiment, data collected over many users may be used to initially populate the inference engine's data set. Sleep stage data store 436 stores the current and past sleep state data. This may be used by the inference engine 434, as well as communicated to server system 480 or mobile device 450.
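As a concrete illustration of this predict-then-verify flow, the sketch below classifies an epoch from heart rate, breathing rate, and micro-motion, and keeps a predicted stage only when the measurement-derived stage agrees. The stage names and thresholds are assumptions chosen for illustration; the disclosure does not specify the calculation.

```python
from dataclasses import dataclass

@dataclass
class Epoch:
    heart_rate: float      # beats per minute
    breath_rate: float     # breaths per minute
    micro_motion: float    # arbitrary movement index

def stage_from_measurements(e: Epoch) -> str:
    """Illustrative rule-of-thumb classifier; all thresholds are assumptions."""
    if e.micro_motion > 5.0:
        return "awake"
    if e.heart_rate < 55 and e.breath_rate < 13 and e.micro_motion < 1.0:
        return "deep"
    if e.heart_rate > 65 and e.micro_motion < 2.0:
        return "rem"
    return "light"

def verify_stage(predicted: str, measured: Epoch) -> str:
    """Keep the predicted stage when the measurements agree; otherwise
    fall back to the stage calculated from the sensor data."""
    calculated = stage_from_measurements(measured)
    return predicted if predicted == calculated else calculated

if __name__ == "__main__":
    epoch = Epoch(heart_rate=52, breath_rate=12, micro_motion=0.4)
    print(verify_stage("deep", epoch))   # prediction confirmed -> "deep"
    print(verify_stage("rem", epoch))    # prediction rejected  -> "deep"
```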
In one embodiment, connection between sleep tracker processor 410 and mobile device 450 through communication logic 440/452 is intermittent, and the sleep tracker processor does not rely on the mobile device 450 for calculations or processing. In another embodiment, one or more of the logic blocks described may either be on the mobile device 450, or may be shared with the mobile device 450 such that the combination of the sleep tracker processor 410 and the mobile device 450 make the described calculations.
In one embodiment sleep tracker processor 410 further includes mattress controls 442 to control the mattress based on sleep stage and any other detected issues. Additionally, sleep tracker processor 410 may in one embodiment include alert system 444. In another embodiment, these logics may be part of mobile device 450.
Mobile device 450 receives data from sleep tracker processor 410 via its own communication logic 452. In one embodiment, the mobile device 450 performs statistical analysis 454 on the data. Predictive system 468, in one embodiment, predicts future health issues. In one embodiment, the predictive system utilizes historical data and data from collective user statistics 482 provided by server system 480 to make smart predictions. In one embodiment, the mobile device includes Internet of Things (IoT) control 460. Alternatively or additionally, such controls may be within sleep tracker processor 410. This enables adjustment of the user's environment to optimize sleep and minimize health risks.
Feedback system 456 utilizes the graphic display capabilities of the mobile device 450 to provide detailed feedback to the user about their sleep, and other data, in one embodiment.
In one embodiment, sleep tracker processor 410 connects to server system 480 either directly or through mobile device 450. Server system 480 collects anonymized user statistics 482 and provides a high power big data analytics system 484, to make predictions, which may be provided to the user's system, and shown to the user. The predictions may be provided to the user as recommendations. For example, if the system determines that when users sleep less than 6 hours a day for more than 3 days in a row, they are much more likely to get ill, it may warn the user prior to that threshold being met that he or she needs to sleep more.
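To make the example concrete, the sketch below implements the six-hour, three-consecutive-day rule described above as a recommendation pushed to the user's device. The one-night warning margin is an assumption added for illustration.

```python
from typing import List

SHORT_SLEEP_HOURS = 6.0   # threshold from the example rule above
RISK_STREAK_DAYS = 3      # consecutive short nights associated with illness

def short_sleep_streak(recent_hours: List[float]) -> int:
    """Count the current streak of nights with less sleep than the threshold."""
    streak = 0
    for hours in reversed(recent_hours):          # most recent night last
        if hours < SHORT_SLEEP_HOURS:
            streak += 1
        else:
            break
    return streak

def should_warn(recent_hours: List[float]) -> bool:
    """Warn one night before the risk streak would be reached (assumed margin)."""
    return short_sleep_streak(recent_hours) >= RISK_STREAK_DAYS - 1

if __name__ == "__main__":
    print(should_warn([7.5, 5.8, 5.5]))   # True: two short nights, warn early
    print(should_warn([7.5, 8.0, 5.5]))   # False: only one short night
```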
In one embodiment, server system 480 also includes system updater 488, which may push updates to the inductive sensor and sleep tracker processor 410. This remote update capability means not only that the calibration and prediction remain accurate, but also that if a problem is discovered it can be remedied without requiring access to the mattress by IT personnel.
At block 520, the user gets on the bed and lies down. In one embodiment, the system detects the user getting on the bed, as well as identifying when the user lies down.
At block 530, the process determines whether there is another person on the bed already. If so, the second user is added to the monitoring, and multi-user adjustment is utilized, at block 540. Multi-user adjustment monitors the relative data between the users to accurately attribute data to the correct user. Because the sensor is very sensitive, a sensor on the right side of the bed is capable of picking up movement from the user lying on the left side of the bed.
If there is no other user on the bed, at block 540 the sensors are initialized and monitoring is initiated. In one embodiment, initializing the sensor includes calibration. In one embodiment, the sensor is calibrated each time a person gets onto the bed. In one embodiment, if the bedframe is adjustable, when the bedframe is adjusted the sensor is recalibrated. In one embodiment, the system can be utilized for a fully adjustable bed, such as the Simmons NuFlex Adjustable Bed Base™, which can be positioned at many angles with a moveable head and foot portion. Because the sensor system is built into the base, in one embodiment, and calibrates when the bed is reconfigured, it can be used regardless of the positioning of an adjustable bed.
At block 550, the process detects the user falling asleep. When the user falls asleep, the user's movements and heart rate change, and these changes are used to detect the user falling asleep.
At block 560, the system uses predictive logic to predict the user's sleep phase, and verifies the predicted state using sensor data. The use of the predictive logic reduces the processing time and complexity for correctly identifying the user's sleep state and status.
At block 570, the process determines whether it is near the alarm time. If it is near the alarm time, at block 575, the user is woken within the alarm window, at a light sleep stage, as detected by the system. The process continues to block 580. If it is not near the alarm time, the process continues directly to block 580.
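A minimal sketch of this wake-window decision is shown below, assuming a per-epoch sleep stage stream and a 30-minute window before the alarm. The window length and stage names are illustrative assumptions, not values from the disclosure.

```python
from datetime import datetime, timedelta

def should_wake(now: datetime, alarm: datetime, stage: str,
                window: timedelta = timedelta(minutes=30)) -> bool:
    """Wake the user during the window before the alarm if a light stage is
    detected; otherwise wait until the alarm time itself is reached."""
    if now >= alarm:
        return True                     # never sleep past the alarm itself
    in_window = alarm - now <= window
    return in_window and stage in ("light", "awake")

if __name__ == "__main__":
    alarm = datetime(2015, 3, 16, 7, 0)
    print(should_wake(datetime(2015, 3, 16, 6, 40), alarm, "light"))  # True
    print(should_wake(datetime(2015, 3, 16, 6, 40), alarm, "deep"))   # False
    print(should_wake(datetime(2015, 3, 16, 7, 0), alarm, "deep"))    # True
```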
At block 580, the process determines whether the user woke up and got out of bed. If so, at block 590, the sleep data is stored. In one embodiment, the predictive logic is updated, if needed. In one embodiment, the predictive logic is customized based on data about the user. Therefore, as additional data is acquired, the predictive logic is continuously updated to ensure that the system correctly predicts and responds to the user's sleep states. In one embodiment, the system can also monitor the user's health, and can respond to that data appropriately as well. The process then ends at block 595.
If the user has not woken up and gotten out of bed, at block 580, the process continues to use predictive logic and monitor the user's state. In one embodiment, this may continue even after an alarm. In one embodiment, if the user does not get out of bed with an alarm, the system may adjust the alarm settings to increase the likelihood that the user will wake up and get out of bed.
At block 620, the data collection unit is plugged into power, and the sensors are plugged into the data collection unit. In one embodiment, the sensors are connected to the data collection unit via a CAT5 cable. In another embodiment, a separate power cable and data cable are used.
At block 630, the sensitivity of the sensors is tested in situ. In one embodiment, once the sensor is in place (e.g. in its final location), its sensitivity is tested, and it is calibrated. Calibration ensures that the sensor is functional. In one embodiment, if there is a problem, a message may be sent so that any issues can be corrected. In one embodiment, the calibration is a self-calibration.
At block 640, the process determines whether the system has been associated with a mobile device, application, or computer. In one embodiment, the system is designed to be paired with a device that provides additional user interface features. If the system has not yet been associated, the process at block 645 pairs the sensor system with the user's mobile device, computer, or application. The process then continues to block 650. In one embodiment, the system uses a Bluetooth personal area network and utilizes standard Bluetooth discovery methods.
At block 650, the process determines whether the sensor system has user data. User data, in one embodiment, includes user characteristic data. In one embodiment, characteristic data may include one or more of the user's gender, age, height, health, and athletic ability. In one embodiment, this data may be available from the paired device. For example, if the user has an existing health application on the user's mobile device, this data may be available. The user may also have added this data to the application already.
At block 655, the basic user data is obtained. In one embodiment, basic user data may include user characteristics as well as usage characteristics. Usage characteristics indicate the number of sleepers on the bed, and in one embodiment the sleep configuration. Sleep configuration indicates where each user sleeps, and which position they sleep in. In one embodiment, the system requests this data from the user. The process then continues to block 660.
At block 660, the process determines whether the sensors need adjustment. In one embodiment, this determination is made based on the user data and sensor calibration and sensitivity data. If the sensors do not need adjustment, the process ends at block 670. If the sensors need adjustment, they are adjusted at block 665. In one embodiment, the sensor adjustment may be based on user weight and sleep position, as well as sleep configuration data. In one embodiment, the adjustment reduces sensitivity for a larger user, and increases sensitivity for a smaller user. Other types of adjustments available in the sensor system may be used. After a sensor adjustment, the sensor recalibrates itself. The process then ends at block 670.
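One simple way such an adjustment might be computed is sketched below, scaling a gain setting inversely with user weight so that heavier users get reduced sensitivity and lighter users get increased sensitivity. The reference weight and scaling model are assumptions for illustration, not values from the disclosure.

```python
def adjusted_sensitivity(base_gain: float, user_weight_kg: float,
                         reference_weight_kg: float = 70.0) -> float:
    """Scale sensitivity down for heavier users and up for lighter users,
    assuming a heavier body couples more strongly to the sensor."""
    scale = reference_weight_kg / max(user_weight_kg, 1.0)
    return base_gain * scale

if __name__ == "__main__":
    print(round(adjusted_sensitivity(1.0, 95.0), 2))   # ~0.74: reduced gain
    print(round(adjusted_sensitivity(1.0, 55.0), 2))   # ~1.27: increased gain
```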
At block 720, the system receives movement data. In one embodiment, movement data is received from the inductive sensor. In one embodiment, movement data may also be received from additional sensors, such as a wristband, or mobile device. In one embodiment, data from multiple devices may be combined to generate the movement data.
At block 730, the process determines whether there are multiple people on the bed. When there are multiple people on the bed, the system must isolate data for each user, to be able to evaluate the user's sleep.
If there are multiple people on the bed, at block 740, the data is isolated, based on the relative data strength received from the sensors. As a general rule, each sensor will detect movement by anyone on the bed. However, the sensor in closer proximity to the user will sense a stronger movement from the same heart beat, breath, or micro motion. Therefore, the relative signal strength detected by multiple sensors may be used to isolate the data associated with each user. The process then continues to block 750.
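The disclosure does not give the isolation math; the sketch below assumes one sensor per side of the bed and attributes each detected event to the side whose sensor reported the stronger amplitude, which is one simple way to apply the relative-strength rule described above.

```python
from typing import Dict, List, Tuple

Event = Dict[str, float]

def attribute_events(events: List[Event]) -> Tuple[List[Event], List[Event]]:
    """Split a stream of detected events between the left and right sleeper
    based on which side's sensor reported the stronger amplitude."""
    left, right = [], []
    for event in events:
        if event["left_amplitude"] >= event["right_amplitude"]:
            left.append(event)
        else:
            right.append(event)
    return left, right

if __name__ == "__main__":
    stream = [
        {"left_amplitude": 0.9, "right_amplitude": 0.2},   # left sleeper's heart beat
        {"left_amplitude": 0.1, "right_amplitude": 0.7},   # right sleeper's movement
    ]
    left_events, right_events = attribute_events(stream)
    print(len(left_events), len(right_events))             # 1 1
```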
At block 750, the heart beat data is isolated. Each heart beat moves the user's body slightly. The heart beat is the cardiac cycle, including diastole, systole, and intervening pause. Heart beats are generally at a particular frequency, called the heart rate. The heart rate slows during sleep but does not generally change quickly. Therefore, the heart beat data can be isolated based on the known heart rate, and the cardiac cycle information.
At block 760, the breathing data is separated out. Each breath inflates the user's lungs and thus causes motion in the mattress. Breathing is also rhythmic, and includes the contraction and flattening of the diaphragm, and the relaxation of the diaphragm. Breathing is generally slowed in deeper sleep, however the rhythmic pattern of breathing continues. This data is used to separate out breathing data.
Once breathing data and heart beat data are identified in the motion data, the remaining information is the micro-motions made by the user in sleep.
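One way to perform this separation, sketched below with SciPy band-pass filters, is to treat breathing (roughly 0.1 to 0.5 Hz) and the heart beat (roughly 0.8 to 2 Hz) as distinct frequency bands and take the residual as micro-motion. The band edges and sample rate are typical physiological values assumed for illustration, not taken from the disclosure.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def band(signal: np.ndarray, fs: float, lo: float, hi: float) -> np.ndarray:
    """Band-pass filter the motion signal between lo and hi Hz."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

def separate(motion: np.ndarray, fs: float):
    """Split the raw motion signal into breathing, heart beat, and the
    residual micro-motion component (assumed band edges)."""
    breathing = band(motion, fs, 0.1, 0.5)    # ~6-30 breaths per minute
    heart = band(motion, fs, 0.8, 2.0)        # ~48-120 beats per minute
    micro = motion - breathing - heart        # what remains after removal
    return breathing, heart, micro

if __name__ == "__main__":
    fs = 50.0                                  # assumed sample rate in Hz
    t = np.arange(0, 30, 1 / fs)
    synthetic = (np.sin(2 * np.pi * 0.25 * t)          # breathing at 15/min
                 + 0.3 * np.sin(2 * np.pi * 1.1 * t)   # heart beat at 66/min
                 + 0.05 * np.random.randn(t.size))     # micro-motion / noise
    breathing, heart, micro = separate(synthetic, fs)
    print(breathing.shape, heart.shape, micro.shape)
```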
At block 770, the combination of micro-motion data, heart beat data, and breathing data is used to determine the user's sleep phase, or verify the predicted sleep phase. At block 780, the appropriate sleep phase is identified. The process then stores this data, and returns to block 720 to continue monitoring. In one embodiment, all of these data elements are stored, and may be used in later analysis to identify health conditions, sleep patterns, and potentially other facts that may impact the user's life or health. In one embodiment, because the sensors are very sensitive, even issues such as hiccups, restless leg syndrome, or sleep paralysis may be detected. In one embodiment, in addition to making this information available to the user, the system may also utilize this information to optimize the user's sleep quality.
At block 820, the isolated data sets, inferences, and data from other sensors are analyzed, when available. The data sets, in one embodiment, include the heart beat data (which includes heart rate, and any arrhythmia or other deviations from the standard cardiac cycle), breathing data (which includes snoring, apnea, and other information derived from the movement data), motion data, and micro-movement data. The derived data includes sleep phase data, symmetry data which indicates the symmetry of movements, sleep cycle patterns, sleep timing, and other data derived from the above data sets. Additionally, the system may utilize environmental data in its analysis. This may include local data such as the angle of the bed, the light level, temperature, humidity, etc. The system further uses data over time, to analyze for changes in the user's sleeping patterns, and changes in breathing or heart measurements that may indicate a potential health condition. For example, if a user over time starts to develop a tremor, starts to have intermittent apnea, or starts to sleep badly, this may be detected by the system.
In one embodiment, at block 830 the process determines whether there is any potential issue indicated by the data. If no potential issue is indicated, the process ends at block 835. This process continuously analyzes the data sets available, to identify actual and potential problems. In one embodiment, anonymized data is shared with the system. This may be used to identify precursor data which precedes, and indicates, a later-developing problem. For example, the system may determine based on a large sample set that people who stop being able to sleep horizontally tend to develop detectable apnea after some time. Because the system learns from the patterns observed, and correlates them over many users and many sleep sessions, such data can be identified.
If there is a potential issue, at block 840 the process determines whether the issue is immediately dangerous. For example, if the user is choking, has severe enough apnea, or has a heart arrhythmia that may indicate a heart attack, the system identifies the issue as immediately dangerous, in one embodiment. If so, at block 845, an immediate action is taken. In one embodiment, an immediate action may be an attempt to rouse the user. In one embodiment, an immediate action may be to alert a third party, such as another sleeper or 911. In one embodiment, the immediate action may be to wake another person in the house who could check on the user. The process then continues to block 850. If the identified issue is not immediately dangerous, the process continues directly to block 850.
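A sketch of one immediate-danger check is given below, assuming breath timestamps and beat-to-beat intervals as inputs. The 30-second pause and interval-jump thresholds are illustrative assumptions, not clinical values from the disclosure.

```python
from typing import List

def breathing_pause_seconds(breath_times: List[float], now: float) -> float:
    """Seconds elapsed since the last detected breath."""
    return now - breath_times[-1] if breath_times else float("inf")

def is_immediately_dangerous(breath_times: List[float], now: float,
                             rr_intervals: List[float],
                             max_pause: float = 30.0,
                             max_rr_jump: float = 0.4) -> bool:
    """Flag a prolonged breathing pause or an abrupt beat-to-beat interval
    change that may indicate a dangerous arrhythmia (assumed thresholds)."""
    if breathing_pause_seconds(breath_times, now) > max_pause:
        return True
    for prev, cur in zip(rr_intervals, rr_intervals[1:]):
        if abs(cur - prev) > max_rr_jump:
            return True
    return False

if __name__ == "__main__":
    print(is_immediately_dangerous([0.0, 4.1, 8.3], now=45.0,
                                   rr_intervals=[0.80, 0.82, 0.81]))  # True: pause
    print(is_immediately_dangerous([40.0, 44.0], now=46.0,
                                   rr_intervals=[0.80, 0.81, 0.79]))  # False
```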
At block 850, the data for the issue is recorded, and monitored. The user is informed, in one embodiment upon waking. In one embodiment, the data is made available to the user so that the user can forward the information to his or her doctor.
At block 855, the process determines whether the issue can be addressed (cured or improved) by altering a sleep condition. For example, certain congestion issues can be fixed by using a thicker pillow or adjusting the adjustable bed to elevate the user's head. Similarly, some early apnea issues can be avoided if the sleeper does not sleep on his or her back. If the issue cannot be addressed, the process ends at block 835. In one embodiment, the user may be alerted to find another method of addressing the detected problem.
If the issue can be addressed, at block 860, the process determines whether the issue can be addressed in this sleep period. In one embodiment, using the smart phone and Internet of Things system, the system is capable of adjusting certain sleep conditions. Thus, in one embodiment, the system may be able to alleviate the user's issues by adjusting one or more aspects of the sleep environment. For example, the system may turn on an air filter, adjust the incline of the bed, turn on or off lights or sounds, open or close blinds or even windows or doors, etc. In one embodiment, the system may be able to use light, motion, and/or sound to guide the user to a different sleep state, which may address a condition as well. A sketch of how such adjustments might be dispatched is shown after the following paragraph.
If the condition can be addressed, the sleep environment may be adjusted, or the user may be triggered to change position, without waking the user, at block 870. In one embodiment, the process then continues to block 880. At block 880, changes are suggested to the user to address the identified issue(s). These suggestions may include changing the sleep environment, visiting a doctor, or altering behaviors, such as increasing exercise or changing eating habits. The process then ends, at block 835.
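The sketch below shows one way condition-specific environment adjustments might be dispatched to connected devices. The issue names, device names, and the `send` callback are hypothetical placeholders added for illustration, not an actual IoT API from the disclosure.

```python
from typing import Callable, Dict, List, Tuple

# Hypothetical mapping from detected issue to environment adjustments.
ADJUSTMENTS: Dict[str, List[Tuple[str, str]]] = {
    "congestion": [("adjustable_base", "raise_head"), ("air_filter", "on")],
    "early_apnea": [("adjustable_base", "raise_head")],
    "light_leak": [("blinds", "close"), ("bedroom_light", "off")],
}

def adjust_environment(issue: str, send: Callable[[str, str], None]) -> bool:
    """Dispatch each adjustment for the detected issue through the provided
    send callback; return False when no adjustment is known for the issue."""
    commands = ADJUSTMENTS.get(issue)
    if not commands:
        return False
    for device, command in commands:
        send(device, command)
    return True

if __name__ == "__main__":
    adjust_environment("congestion", lambda device, cmd: print(f"{device} <- {cmd}"))
```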
One of ordinary skill in the art will recognize that the processes described in the above flowcharts are conceptual representations of the operations used. The specific operations of the processes may not be performed in the order shown and described. For example and in one embodiment, the process is interrupt driven, rather than sequentially testing for various occurrences. In one embodiment, data is received or processed in a different order. The specific operations may not be performed in one continuous series of operations, and different specific operations may be performed in different embodiments. Additional operations may be performed, or some operations may be skipped. Furthermore, the processes could be implemented using several sub-processes, or as part of a larger macro process. For instance, in some embodiments, the processes shown in these flowcharts are performed by one or more software applications that execute on one or more computing devices.
The data processing system illustrated in
The system further includes, in one embodiment, a random access memory (RAM) or other volatile storage device 920 (referred to as memory), coupled to bus 940 for storing information and instructions to be executed by processor 910. Main memory 920 may also be used for storing temporary variables or other intermediate information during execution of instructions by processing unit 910.
The system also comprises in one embodiment a read only memory (ROM) 950 and/or static storage device 950 coupled to bus 940 for storing static information and instructions for processor 910. In one embodiment, the system also includes a data storage device 930 such as a magnetic disk or optical disk and its corresponding disk drive, or Flash memory or other storage which is capable of storing data when no power is supplied to the system. Data storage device 930 in one embodiment is coupled to bus 940 for storing information and instructions.
The system may further be coupled to an output device 970, such as a cathode ray tube (CRT) or a liquid crystal display (LCD) coupled to bus 940 through bus 960 for outputting information. The output device 970 may be a visual output device, an audio output device, and/or tactile output device (e.g. vibrations, etc.)
An input device 975 may be coupled to the bus 960. The input device 975 may be an alphanumeric input device, such as a keyboard including alphanumeric and other keys, for enabling a user to communicate information and command selections to processing unit 910. An additional user input device 980 may further be included. One such user input device 980 is a cursor control device, such as a mouse, a trackball, stylus, cursor direction keys, or touch screen, which may be coupled to bus 940 through bus 960 for communicating direction information and command selections to processing unit 910, and for controlling movement on display device 970.
Another device, which may optionally be coupled to computer system 900, is a network device 985 for accessing other nodes of a distributed system via a network. The communication device 985 may include any of a number of commercially available networking peripheral devices such as those used for coupling to an Ethernet, token ring, Internet, or wide area network, personal area network, wireless network or other method of accessing other devices. The communication device 985 may further be a null-modem connection, or any other mechanism that provides connectivity between the computer system 900 and the outside world.
Note that any or all of the components of this system illustrated in
It will be appreciated by those of ordinary skill in the art that the particular machine that embodies the present invention may be configured in various ways according to the particular implementation. The control logic or software implementing the present invention can be stored in main memory 920, mass storage device 930, or other storage medium locally or remotely accessible to processor 910.
It will be apparent to those of ordinary skill in the art that the system, method, and process described herein can be implemented as software stored in main memory 920 or read only memory 950 and executed by processor 910. This control logic or software may also be resident on an article of manufacture comprising a computer readable medium having computer readable program code embodied therein and being readable by the mass storage device 930 and for causing the processor 910 to operate in accordance with the methods and teachings herein.
The present invention may also be embodied in a handheld or portable device containing a subset of the computer hardware components described above. For example, the handheld device may be configured to contain only the bus 940, the processor 910, and memory 950 and/or 920.
The handheld device may be configured to include a set of buttons or input signaling components with which a user may select from a set of available options. These could be considered input device#1 975 or input device#2 980. The handheld device may also be configured to include an output device 970 such as a liquid crystal display (LCD) or display element matrix for displaying information to a user of the handheld device. Conventional methods may be used to implement such a handheld device. The implementation of the present invention for such a device would be apparent to one of ordinary skill in the art given the disclosure of the present invention as provided herein.
The present invention may also be embodied in a special purpose appliance including a subset of the computer hardware components described above, such as a kiosk or a vehicle. For example, the appliance may include a processing unit 910, a data storage device 930, a bus 940, and memory 920, and no input/output mechanisms, or only rudimentary communications mechanisms, such as a small touch-screen that permits the user to communicate in a basic manner with the device. In general, the more special-purpose the device is, the fewer of the elements need be present for the device to function. In some devices, communications with the user may be through a touch-based screen, or similar mechanism. In one embodiment, the device may not provide any direct input/output signals, but may be configured and accessed through a website or other network-based connection through network device 985.
It will be appreciated by those of ordinary skill in the art that any configuration of the particular machine implemented as the computer system may be used according to the particular implementation. The control logic or software implementing the present invention can be stored on any machine-readable medium locally or remotely accessible to processor 910. A machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g. a computer). For example, a machine readable medium includes read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or other storage media which may be used for temporary or permanent data storage. In one embodiment, the control logic may be implemented as transmittable data, such as electrical, optical, acoustical or other forms of propagated signals (e.g. carrier waves, infrared signals, digital signals, etc.).
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
The present application claims priority to U.S. Provisional Application No. 62/133,734, filed on Mar. 16, 2015, and incorporates that application by reference in its entirety.