Hardware sensor system for controlling sleep environment

Information

  • Patent Grant
  • Patent Number
    11,793,455
  • Date Filed
    Monday, October 14, 2019
  • Date Issued
    Tuesday, October 24, 2023
Abstract
A sleep sensor system comprises a sensor to collect movement data from a sleeper on an adjustable bed and a receiver to send the movement data from the sensor to a server. The receiver also receives command data from the server. The sleep sensor system further comprises a bed controller to transmit bed movement command data to the adjustable bed, to cause the adjustable bed to alter one or more settings, and an IoT controller to transmit IoT command data to an IoT device, to cause the IoT device to take an action.
Description
FIELD

The present invention relates to sleep sensors, and more particularly to a sensor system including Internet of Things controls.


BACKGROUND

An average person spends about one-third of his or her life asleep. Sleep is the time when our bodies undergo repair and detoxification. Research has shown that poor sleep patterns are an indication of, and often directly correlated with, poor health. Proper, restful and effective sleep has a profound effect on our mental, emotional and physical well-being.


Every person has a unique circadian rhythm that, without manipulation, will cause the person to consistently go to sleep around a certain time and wake up around a certain time. For most people, a typical night's sleep consists of five sleep cycles, each lasting about 90 minutes. The first four stages of each cycle are often regarded as quiet sleep or non-rapid eye movement (NREM) sleep. The final stage is referred to as rapid eye movement (REM) sleep. REM sleep is thought to help consolidate memory and emotion. During REM sleep, blood flow declines in areas of the brain associated with complex reasoning and language, whereas it rises sharply in areas linked to processing memories and emotional experiences.


Therefore, it is useful for everyone to know more about how well they sleep.





BRIEF DESCRIPTION OF THE FIGURES

The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:



FIG. 1 is a block diagram of one embodiment of a system in which the sensor system may be implemented.



FIG. 2A is a block diagram of one embodiment of the user site portion of the sensor system.



FIG. 2B is a block diagram of one embodiment of the server side portion of the sensor system.



FIG. 3 is a circuit diagram of one embodiment of the connector of the sensor system.



FIG. 4 is an overview flowchart of one embodiment of the sensor system sending and receiving signals.



FIG. 5 is a flowchart of one embodiment of controlling the bed via the signals from the receiver.



FIG. 6 is a flowchart of one embodiment of providing feedback regarding bed movement.



FIG. 7 is a block diagram of one embodiment of a computer system that may be used as the sensor.





DETAILED DESCRIPTION

A sleep monitoring system is described. The system includes user-side elements including a sensor element and a receiver. The receiver interfaces with a bed controller, which in one embodiment enables changing the position of the head and/or foot of the bed. In one embodiment, the bed controller also permits changing a temperature setting, heating or cooling the bed. In one embodiment, the bed controller also permits alteration of air flow. The receiver system, in one embodiment, also provides an interface to external Internet of Things (IoT) elements, enabling a user to control the bed via existing IoT infrastructure, such as AMAZON's Alexa, GOOGLE Home, APPLE Siri, or other such devices. The receiver system further enables the adjustment of the bed based on sleep analytics. In one embodiment, the receiver interfaces with the bed using a DIN connector, which provides power from the bed to the receiver, and provides a command interface from the receiver to the bed. In this way, a controlled bed can become a "smart bed" which is part of the IoT infrastructure.


The following detailed description of embodiments of the invention makes reference to the accompanying drawings in which like references indicate similar elements, showing by way of illustration specific embodiments of practicing the invention. Description of these embodiments is in sufficient detail to enable those skilled in the art to practice the invention. One skilled in the art understands that other embodiments may be utilized and that logical, mechanical, electrical, functional and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.



FIG. 1 is a block diagram of one embodiment of a system in which the sensor system may be implemented. The system includes a sleep analytics system 100 including sensors 120, receiver 130, server 160, and analytics engine 170. In one embodiment, the client portion of the sleep analytics system 100 is located in a user's home and includes the sensors 120 and receiver 130.


In one embodiment, the receiver 130 is coupled to sensors 120 via a cable. In another embodiment, the connection may be wireless, such as Bluetooth Low Energy (BLE), Wi-Fi, or another type of wireless connection. In one embodiment, receiver 130 also may be coupled to a controller 140, which controls bed 150. In one embodiment, this connection is a wired connection. Alternatively, it may be a wireless connection.


In one embodiment, the sensors 120 may include one or more sensors positioned in bed 150 which are used to measure the user's sleep. In one embodiment, sensors 120 may include sensors which are not in bed 150 but positioned in the room in which the bed 150 is located. In one embodiment, one or more of these additional sensors may be built into receiver 130. In one embodiment, there may be external sensors which may be coupled to receiver 130 either via wires or wirelessly. The receiver 130 collects data from the one or more sensors, for transmission to the server 160.


In one embodiment, the receiver 130 is coupled to the server 160 via a network 150. The server portion includes server 160 and analytics engine 170, which in one embodiment are located off-site, removed from the user. In another embodiment, the server may be a local system, such as a computer system running an application. The network 150 may be the Internet, and the receiver 130 may send data to the server via a wireless network, such as Wi-Fi or a cellular network. In one embodiment, server 160 and analytics engine 170 may be on the same physical device. In one embodiment, server 160 and/or analytics engine 170 may include a plurality of devices. In one embodiment, one or both of the server 160 and the analytics engine 170 may use cloud computing and may be implemented as a distributed system.


In one embodiment, the user may be provided information about their sleep experience and the ability to set preferences via a computer device 180. In one embodiment, the user computer device 180 may be a mobile telephone, tablet, laptop, or desktop computer running an application or providing access to a website. In one embodiment, the user computer device 180 may be an IoT device such as AMAZON’s Alexa or GOOGLE Home, or APPLE Siri. In one embodiment, the user computer device 180 may obtain data from the server 160 and/or analytics engine 170 via the network 150. In one embodiment, the user computer device 180 may connect to the server 160/analytics engine 170 via the receiver 130.



FIG. 2A is a block diagram of one embodiment of the user site portion of the sensor system. The user site portion includes a bed 150 or other sleeping surface. In one embodiment, the bed 150 includes a bed adjustor 236, and optionally an IoT interface 246. The insulated sensors 210 are placed in the bed 150 in one embodiment. The insulated sensors 210 may be built into the bedframe, placed under a mattress or mattress topper, or otherwise positioned such that they are underneath the sleeper. The data from sensors 210 goes to signal receiver 224 in receiver 220. In one embodiment, the receiver 220 may also include one or more additional sensors, such as air sensors 226. Air sensors 226, in one embodiment, may include temperature, humidity, and volatile organic compound (VOC) sensors. Other types of environmental sensors may also be included in receiver 220. In one embodiment, additional external sensors may also be coupled to receiver 220 wirelessly or via wired connections.


The signal data is digitized by digitizer 228. In one embodiment, an analog-to-digital converter is used. The data is then sent to server 250 via wireless transceiver 230. The server 250 processes the data, and returns data to the wireless transceiver 230.
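
The patent does not specify how the digitizer samples or buffers the signal; the following Python sketch illustrates one plausible arrangement, assuming a hypothetical read_adc() helper and an arbitrary 100 Hz sampling rate, in which each digitized reading is held in a buffer until the wireless transceiver sends it upstream.

import time
from collections import deque

SAMPLE_RATE_HZ = 100      # illustrative sampling rate; not specified in the patent
buffer = deque()          # temporary store for digitized samples awaiting upload

def read_adc(channel):
    """Hypothetical analog-to-digital read of one sensor channel."""
    return 2048           # placeholder: midpoint of a 12-bit ADC

def sample_once(timestamp):
    """Digitize one reading from the movement sensor and the air sensors."""
    return {
        "t": timestamp,
        "movement": read_adc(0),
        "temperature": read_adc(1),
        "humidity": read_adc(2),
        "voc": read_adc(3),
    }

def collect(seconds):
    """Accumulate digitized samples until the transceiver batches them upstream."""
    deadline = time.time() + seconds
    while time.time() < deadline:
        buffer.append(sample_once(time.time()))
        time.sleep(1.0 / SAMPLE_RATE_HZ)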


The data from the server 250 may be used to adjust the sensors 210, and the receiver 220's functionality. Furthermore, in one embodiment, the data from the server 250 may be used to control IoT devices 248 via IoT controller 244. In one embodiment, the IoT controller 244 may be a connection to an IoT device 248 such as a speaker system, light controls, television controls, or other such devices which may be remotely controlled.


In one embodiment, the data from the server 250 may also be used to control the bed 150. In one embodiment, command data may be transmitted through a connector 222 to the bed adjustor 236 within the bed 150. The bed adjustor 236 controls the bed 150, via one or more motors or other control systems.


In one embodiment, connector 222 provides a physical connection from the receiver 220 to the bed adjustor 236. In one embodiment, the connector 222 also provides power 240 to receiver 220. In one embodiment, the bed adjustor 236 is plugged into wall power. Since the bed adjustor controls the settings of a large physical device, the bed 150, it utilizes standard wall power, in one embodiment. In contrast, the systems of the receiver 220 consume very little power. The receiver 220 may have a single connection through connector 222 to the bed adjustor 236 which provides power, and also carries signals to and from the receiver 220. In one embodiment, the connection is a DIN plug. In one embodiment, the DIN connector used is a standard connector for audio signals which provides a circular shielding metal skirt protecting a number of straight round pins. This ensures that the plug is inserted with the correct orientation and ensures that the shielding is connected between socket and plug prior to any signal path connection being made. Power system 242 in receiver 220 provides a step-down voltage converter to ensure that the power available to the receiver 220 is appropriate. In one embodiment, the receiver 220 may utilize multiple power elements to step down power from the level provided from the power output 240 of connector 222.


In one embodiment, this system additionally provides the ability to interact with the bed 150 via IoT even if the bed itself does not include the optional native IoT interface 246, using the IoT controller 244 in the receiver 220. This may enable a user, for example, to utilize voice commands through a system such as AMAZON's Alexa, GOOGLE Home, or APPLE Siri to adjust the bed position, instead of utilizing controls associated with the bed.


Furthermore, in one embodiment, the system also allows the receiver 220 to control IoT devices 248 to adjust the user’s environment based on the user’s sleep state, as determined by server 250. For example, if the user has fallen asleep, the IoT systems 248 may lock the door, close the blinds, adjust the temperature, adjust the bed position, or otherwise adjust the environment to ensure that the user has a restful sleep. Similarly, if an alarm is set, the IoT controller 244 may control IoT devices 248 to transition the user smoothly to a waking state, to maximize comfort and ensure that the user wakes in the correct portion of their sleep cycle to minimize post-sleep tiredness.
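
As a rough illustration of this behavior, the sketch below maps a server-reported sleep state to a list of IoT actions. The device names, command strings, and the send_iot_command() transport are assumptions for illustration, not part of the described system.

# Hypothetical mapping from a detected sleep state to IoT actions; the device
# and command names are illustrative stand-ins for whatever the installed
# IoT devices accept.
SLEEP_ACTIONS = {
    "asleep": [
        ("door_lock", "lock"),
        ("blinds", "close"),
        ("thermostat", "set 67F"),
    ],
    "waking": [
        ("blinds", "open_slowly"),
        ("lights", "fade_in"),
        ("coffee_maker", "start"),
    ],
}

def send_iot_command(device, command):
    """Stand-in for the IoT controller's transport (e.g. a hub or cloud API)."""
    print(f"IoT -> {device}: {command}")

def on_sleep_state(state):
    """Apply the environment adjustments associated with the reported state."""
    for device, command in SLEEP_ACTIONS.get(state, []):
        send_iot_command(device, command)

on_sleep_state("asleep")   # e.g. lock the door, close the blinds, lower the temperature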


In one embodiment, air sensor 226 data may be used by server 250 to adjust the environment and/or provide alerts to the user, if needed. In one embodiment, such alerts may be provided through an application on a computer system such as a mobile phone, an IoT interface, or other device which provides data to the user about their sleep experience and environment. In one embodiment, data from air sensors 226 may also be used to automatically adjust the user's environment to ensure healthy air quality. For example, if the air is determined to be too dry, a humidifier may be turned on, via IoT controller 244. If the air has too many VOCs, a window may be opened, a fan may be turned on, etc. In one embodiment, the system determines whether it is possible to address the issue via existing IoT devices. If that is not possible, due to a problem which cannot be addressed by the existing IoT devices 248, the user may be alerted. In one embodiment, an alert may ask the user to take a particular corrective step, e.g. open windows, run a humidifier, run an air filter, etc. In one embodiment, the user is informed of changes to their environment via IoT devices 248, but alerts are sent only when the user's action is needed.
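
A minimal sketch of such an air-quality rule is shown below, assuming illustrative humidity and VOC thresholds (the patent gives no numeric limits) and a set of available IoT devices; when no suitable device exists, the issue becomes a user alert instead of an action.

HUMIDITY_MIN_PCT = 30     # illustrative thresholds; the patent gives no numbers
VOC_MAX_PPB = 500

def evaluate_air(reading, available_devices):
    """Return (iot_actions, user_alerts) for one air-sensor reading."""
    actions, alerts = [], []
    if reading["humidity"] < HUMIDITY_MIN_PCT:
        if "humidifier" in available_devices:
            actions.append(("humidifier", "on"))
        else:
            alerts.append("Air is dry: consider running a humidifier.")
    if reading["voc"] > VOC_MAX_PPB:
        if "fan" in available_devices:
            actions.append(("fan", "on"))
        elif "window_opener" in available_devices:
            actions.append(("window_opener", "open"))
        else:
            alerts.append("High VOC level: consider opening a window or running an air filter.")
    return actions, alerts

# No humidifier installed, so the dryness becomes an alert rather than an action.
print(evaluate_air({"humidity": 22, "voc": 650}, {"fan"}))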



FIG. 2B is a block diagram of one embodiment of the server side portion of the sensor system. The server side portion in one embodiment includes a server 250 and an analytics engine 280. The server 250 processes the user's data, in real time and using a look-back engine, as described in co-pending U.S. Pat. Application No. 16/601,567, entitled Sleep Detection and Analysis System, filed concurrently herewith, and incorporated herein by reference.


Server 250 receives data from receiver 220. The processing utilizes a signal identifier 252 which identifies the signal content. The signal in one embodiment may include one or more of: sleep sensor data, environmental sensor data, control data from the bed controller, and IoT data. In one embodiment, the identified signal is sent to the appropriate processor.


The sensor data processor 254 processes the sensor data, which is used by sleep state identifier 265 to identify the user's sleep state, and by respiration/heart rate calculator 268 to calculate the user's respiration rate and heart rate. In one embodiment, snore detector 270 also uses the sensor data to determine whether the user is snoring or having other respiratory or health issues which may be identified based on the sensor data. In one embodiment, all of the calculated data is stored in memory 276. In one embodiment, the data is stored associated with the environmental sensor data and is stored per user. That is, if there are two sleepers on the bed, in one embodiment, the data is evaluated and stored for each sleeper. In one embodiment, adjustment calculator 272 utilizes the data from sleep state identifier 265, respiration/HR calculator 268, and snore/health detector 270 to determine whether the user's bed and/or environment should be adjusted. If so, adjustment calculator 272 sends an instruction to IoT command generator 262 and/or bed command calculator 273 to make the appropriate adjustment. In one embodiment, if the IoT/bed controls cannot address an identified issue, the user may be alerted through a user interface module 251.
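
The following sketch suggests, under assumed data formats, how a server might route an identified signal to a processor and derive an adjustment, loosely mirroring signal identifier 252, sensor data processor 254, and adjustment calculator 272; the toy classifier and placeholder vitals are illustrative only.

def identify_signal(packet):
    """Minimal stand-in for signal identifier 252: classify an incoming packet."""
    return packet.get("kind", "unknown")   # "sleep", "environment", "bed", or "iot"

def process_sensor_data(packet):
    """Stand-in for sensor data processor 254: toy sleep-state and vitals estimate."""
    state = "asleep" if packet["movement"] < 10 else "awake"   # toy classifier
    vitals = {"respiration": 14, "heart_rate": 58}             # placeholder values
    return {"sleep_state": state, **vitals}

def calculate_adjustment(analysis):
    """Stand-in for adjustment calculator 272: decide on IoT and bed changes."""
    if analysis["sleep_state"] == "asleep":
        return {"iot": [("thermostat", "set 67F")], "bed": []}
    return {"iot": [], "bed": []}

def handle_packet(packet):
    """Route an identified signal to the appropriate processing path."""
    kind = identify_signal(packet)
    if kind == "sleep":
        return calculate_adjustment(process_sensor_data(packet))
    if kind == "bed":
        return {"bed": [packet["request"]], "iot": []}
    if kind == "iot":
        return {"iot": [packet["request"]], "bed": []}
    return {"iot": [], "bed": []}

print(handle_packet({"kind": "sleep", "movement": 3}))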


If the data is bed data, bed movement recorder 258 determines what occurred in the bed. In one embodiment, a database of bed positions 259 is used in this identification. In one embodiment, the database includes a ‘current bed configuration’ status, which enables the system to determine an updated bed position. If the bed control data is a request to alter the bed’s configuration, the bed command calculator 273 utilizes the command, the current bed configuration, and data about the bed controls to create a command which would correctly move the bed to the requested configuration. This data in one embodiment is stored in memory 276. The updated bed position is also stored in the database 259, and the current bed configuration is updated. In one embodiment, the current bed position is only updated once the receiver has reported the updated position of the bed.


If the signal is an IoT request, the IoT data processor 260 determines what changes should be made, and which IoT device should be instructed. The IoT command generator 262 generates the appropriate commands for the designated IoT device. This data is stored in memory 276 in one embodiment. The IoT command, bed command, or appropriate user feedback is returned to the receiver 220, in one embodiment. Additionally, the server 250 may make the data available to the user via a user interface. The user interface may be presented via a web interface, or an application on the user’s system. In one embodiment, the user interface module 251 makes this data available in a format that is comprehensible by the user. In one embodiment, the user interface module 251 provides access to historical data in memory 276 as well as current status data.


Recommendation engine 256 provides recommendations to the user, based on data from analytics engine 280.


Analytics engine 280 utilizes data from a large number of users and calculates sleep state and sleep quality analytics across users. The analytics engine 280 may be used to generate recommendations and feedback to users.


The analytics engine 280 in one embodiment includes a data accumulator 283 which obtains sleep data, environmental data, and user characteristic data. User characteristic data, in one embodiment, is data entered by the user which may impact the user's sleep. For example, user characteristic data may include age, gender, health conditions, use of medications, use of medical devices such as continuous positive airway pressure (CPAP) devices, etc.


In one embodiment, the data accumulated is anonymized by anonymizer 286. Anonymizer 286 strips identifying data and creates “cumulative” data for analytics. Rather than having data for “Bob, age 55” the system would have data for “users, ages 50-60.” This enables the sharing of results derived from this data.
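
A minimal sketch of this kind of anonymization is shown below; the field names and the ten-year bucket width are assumptions, chosen to match the "ages 50-60" example above.

def age_bucket(age, width=10):
    """Map an exact age to a decade-wide range, e.g. 55 -> 'ages 50-60'."""
    low = (age // width) * width
    return f"ages {low}-{low + width}"

def anonymize(record):
    """Strip identifying fields and keep only bucketed characteristics."""
    keep = {"gender", "health_conditions", "uses_cpap"}
    out = {k: v for k, v in record.items() if k in keep}
    out["age_range"] = age_bucket(record["age"])
    return out

print(anonymize({"name": "Bob", "age": 55, "gender": "M", "uses_cpap": False}))
# {'gender': 'M', 'uses_cpap': False, 'age_range': 'ages 50-60'}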


Bed movement analytics 290, sleep state analytics 292, and sleep quality analytics 294 are implemented by a processor and analyze the cumulative data to evaluate the impact of the bed positions, environment, and sleep states on sleep quality. In one embodiment, the analytics engine 280 is attempting to determine recommendations. Recommendations, for example, may be adjustments to the bed configuration, or to the sleep environment, to ensure optimal sleep quality. While recommendation engine 296 utilizes cumulative data from many users’ experiences, the server recommendation engine 256 is customized, in one embodiment, to the particular user’s personal data. In one embodiment, the recommendations from analytics engine 280 are used for new users (for whom insufficient personal data is available for recommendations) and for users whose situation changes (in which case the prior personal data is no longer relevant). For example, if a user starts utilizing a sleep aid, whether medication or a machine, the recommendation engine 296 may adjust the settings recommended for the user. Other uses of the cumulative data, and personal data, may be made.


In one embodiment, the feedback system 298 of the analytics engine 280 may also be used to provide feedback to a manufacturer of the smart bed. For example, because the receiver obtains bed position and bed movement data, the system can provide information to bed manufacturers regarding ranges of motion, sleep positions, and any real-world issues with the bed.



FIG. 3 is a circuit diagram of one embodiment of the connector 222 of the sensor system. In one embodiment, a USB plug 310 provides a USB connection from the connector to the receiver 220. The USB connection goes to microcontroller 320, which is coupled to a connector 340, for connection to the bed. In one embodiment, the connector is a DIN connector 340. In one embodiment, the connector 222 provides a male and female DIN connector, so that the number of DIN connectors available to the bed system remains the same after the connector 222 is plugged in. Of course, while the described embodiment uses a DIN connector 340, an alternative connector mechanism including USB plugs, proprietary plugs, etc. may be used.


The connector 222 also receives power from the DIN connectors 340. The power is coupled through power subsystem 330. In one embodiment, the power subsystem 330 utilizes a low dropout regulator (LDO) and a buck regulator to step down and clean the power signal and provide it via the USB to the receiver 220. In one embodiment, the system provides 3.3 volt DC power to the controller via plug 310.



FIG. 4 is an overview flowchart of one embodiment of the sensor system sending and receiving signals. The process starts at block 410. In one embodiment, this process starts when the user sets up the system initially. In one embodiment, the bed system is set up, and the sensor is correctly positioned under the mattress or within the bed frame.


At block 415, the receiver is connected to the bed, using the DIN connection, in one embodiment.


At block 420, the power and data connection are verified. In one embodiment, the data connection to and from the sensor, and to and from the server is verified. Establishing data connections is well known in the art. In one embodiment, if the receiver uses a wireless connection, the user may identify the wireless network, and password to establish the data connection.


At block 425, data from the sensor is received.


At block 430, an IoT or bed control request is received, in one embodiment. The request may be from the bed or a separate IoT device. The request in one embodiment may be a request to change a configuration of the bed or the environment.


At block 435, the data received from the sensors, and control request data when received, are converted to digital format. In one embodiment, the data is encoded for transmission. In one embodiment, a lossless encoding method is used. In one embodiment, the data is temporarily stored in a memory, such as a buffer.


At block 440 the process determines if it is time to send/receive data, in one embodiment. In one embodiment, data is sent at regular intervals in batches. In another embodiment, data is sent continuously. If it is not yet time to send data, the process returns to block 425 to continue receiving sensor data.


If it is time to send the data, the process sends batched data to the server. The data includes buffered sensor data and, when an IoT request has been received, IoT data, in one embodiment. At block 445, the process determines whether data was received at the server. If no data was received, which means that there was no sensor data or IoT data sent, the process returns to block 425, to continue awaiting data from sensors and/or IoT devices.
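
The buffering and batch-sending loop of blocks 435 through 445 might look like the following sketch, assuming a hypothetical post_to_server() upload and an arbitrary 30-second batch interval; the text only specifies that data is sent at regular intervals or continuously.

import json
import time
from typing import Optional

SEND_INTERVAL_S = 30      # illustrative batch interval; not specified in the text
_buffer = []
_last_send = time.monotonic()

def post_to_server(payload):
    """Stand-in for the wireless upload; a real receiver might POST over Wi-Fi."""
    return {"commands": []}          # the reply may carry bed or IoT commands

def queue(sample):
    """Block 435: hold encoded sensor/control data in a buffer until it is sent."""
    _buffer.append(sample)

def maybe_send():
    """Blocks 440-445: send the batch only when the interval has elapsed."""
    global _last_send
    if time.monotonic() - _last_send < SEND_INTERVAL_S:
        return None                  # not yet time; keep collecting (back to block 425)
    _last_send = time.monotonic()
    if not _buffer:
        return None                  # nothing was received, so nothing is sent
    payload = json.dumps(_buffer).encode()   # lossless encoding of the batch
    _buffer.clear()
    return post_to_server(payload)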


If data was received, the system at block 450 determines whether the data has been processed. If so, the system returns to awaiting data.


At block 455, the sensor data is processed. In one embodiment, this includes determining the user's current sleep state, and any other relevant data about the user. In one embodiment, sleep sensor data is also supplemented with other sensor data. In one embodiment, this processed sensor data is stored. In one embodiment, the system also determines whether, based on the analysis of the sleep sensor data and other sensor data, the user's environment should be altered. If so, in one embodiment, the process may generate IoT/bed control data. The process then continues to block 460.


At block 460 the system determines whether there is bed control data. If so, at block 465, the system prepares a control signal to be sent to the bed, via the DIN connection. In one embodiment, the control signal is translated into a format that the bed controller can receive. In one embodiment, if the command is absolute (e.g. raise the head of the bed to a 10 degree angle), the system further translates that command into an adjustment based on the known current position of the bed (e.g. move the head of the bed up by 4 degrees). For example, the translation may be as follows, as illustrated in the sketch after this list:

  • receive command: raise the head of the bed to a 10 degree angle
  • determine current position of bed: head raised to a 6 degree angle
  • translate command to change command: raise head by 4 degrees
  • translate command to bed controller command: turn on motor #2 for 30 seconds.
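
A minimal sketch of this translation is shown below, assuming a hypothetical motor calibration (motor #2 moving the head at 4 degrees per 30 seconds, to match the example); a real bed controller would supply its own motor identifiers and travel rates.

# Assumed motor characteristics for illustration only; a real bed controller
# would expose its own calibration (degrees of travel per second, motor IDs).
HEAD_MOTOR_ID = 2
HEAD_DEGREES_PER_SECOND = 4 / 30        # matches the example: 4 degrees in 30 seconds

def translate(target_head_angle, current_head_angle):
    """Turn an absolute position request into a bed-controller motor command."""
    delta = target_head_angle - current_head_angle          # e.g. 10 - 6 = +4 degrees
    run_seconds = abs(delta) / HEAD_DEGREES_PER_SECOND      # e.g. 4 / (4/30) = 30 s
    return {
        "motor": HEAD_MOTOR_ID,
        "direction": "up" if delta >= 0 else "down",
        "run_seconds": round(run_seconds, 1),
    }

print(translate(target_head_angle=10, current_head_angle=6))
# {'motor': 2, 'direction': 'up', 'run_seconds': 30.0}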


The process then returns to block 450, to determine whether all data has been processed.


If there is no bed control data, as determined at block 460, the process determines whether there is IoT control data, at block 470. IoT control data is used to adjust elements in the user's environment. This may include elements that directly impact sleep, such as turning on or off lights, fans, noise machines, etc., as well as other elements like locking doors, closing garage doors, turning on coffee machines, etc. If the data is IoT control data, at block 475 control signals are sent to the appropriate IoT device. In one embodiment, the control signals are sent through the bed's system, as described above.


If the data is neither bed control data nor IoT data, it is processed to alter receiver or sensor settings. The resultant data is sent to the receiver, if appropriate. This process runs when the system is active.



FIG. 5 is a flowchart of one embodiment of controlling the bed via the signals from the receiver. The process starts at block 510. In one embodiment, the process starts when a bed adjustment request is received. At block 520 the process determines whether the request was received via an Internet of Things interface. If so, at block 530 the request is sent to the server for processing.


At block 540, bed control information is received from the server. In one embodiment, the bed control information is customized for the particular smart bed associated with the user’s account and in a format which the bed can utilize. In one embodiment, the server translates control data to this format, prior to sending it to the receiver.


At block 550, the bed control information is directed to the bed controller. In one embodiment, this is done via a DIN connection cable.


At block 560, the process determines whether the bed configuration was changed. If so, at block 570, the data is sent to the server. This is used to confirm bed configuration changes, and also to track current and past bed configurations. In one embodiment, the verification occurs because the command may result in no change, either because the bed is already in that configuration, or because the bed cannot achieve the requested configuration. In one embodiment, when the server receives a configuration request, it determines whether the bed is already in that configuration and alerts the user. In one embodiment, the server does not send a configuration command if that is the case. The process ends at block 580.
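
One way such a check might look is sketched below, under an assumed configuration schema and bed limits (neither is defined in the patent); a command is suppressed when the bed is already in the requested configuration or cannot reach it, and the stored current configuration is updated only after the receiver reports the change.

def handle_config_request(requested, current, bed_limits):
    """Decide whether a configuration command should be sent at all.

    Returns (command_or_None, user_message_or_None). The field names and
    limits are illustrative assumptions.
    """
    if requested == current:
        return None, "The bed is already in the requested configuration."
    for key, value in requested.items():
        lo, hi = bed_limits.get(key, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            return None, f"The bed cannot reach {key}={value}."
    return {"set": requested}, None

def on_receiver_confirmation(reported, position_db):
    """Update the stored 'current configuration' only after the bed reports it."""
    position_db["current"] = reported
    position_db.setdefault("history", []).append(reported)

# Example: head angle limited to 0-45 degrees in this hypothetical bed.
print(handle_config_request({"head_deg": 10}, {"head_deg": 6}, {"head_deg": (0, 45)}))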



FIG. 6 is a flowchart of one embodiment of providing feedback regarding bed movement. The process starts at block 610. At block 615, data is received from the sensor system. This data is movement data, and as discussed above is used by the server to determine the user’s sleep state, and in one embodiment optimize the user’s sleep environment.


At block 620, the process determines whether an IoT bed control request has been received. An IoT bed control request is a request from the user, utilizing an IoT device such as APPLE’s Siri or GOOGLE’s Home device, requesting an adjustment in the position or configuration of the bed. In one embodiment, the bed may have adjustable positions for the head and foot of the bed. In one embodiment, the bed may also have adjustable heating/cooling, airflow, and/or other aspects. If an IoT bed control request is received, at block 625 the request is processed into a command which can be transmitted to the bed via the receiver, and sent to the receiver. The processing, which includes interpretation and analysis, occurs on the server side, in one embodiment. In one embodiment, the IoT device receives “natural language” commands verbally, which are initially analyzed by the IoT device, and then passed to the server. The commands generally are relative commands (e.g. raise the head of the bed), which are interpreted by the server based on the known current position data.


The process then continues to block 635. In one embodiment, at block 635, the movement data is added to the database of bed positions. The process then continues to block 640.


If the data did not include an IoT bed control request, the process at block 630 determines whether the data includes a user adjustment report. A user adjustment report provides data about manual adjustments made directly by the user, using a remote control or another method of controlling the bed configuration. If the data includes a user adjustment report, at block 635, the movement data is added to the database of bed positions. In one embodiment, the movement data (e.g. head moved up by 10 degrees) is added to the existing position data, to determine a "current position." In one embodiment, the database of bed positions includes a "current position" which is used as the basis for the next bed adjustment, whether automatic or via control requests. The database in one embodiment also includes historical movement data and position data.
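
The sketch below illustrates, under an assumed position schema, how a reported movement delta could be folded into a "current position" and appended to a history, which then serves as the basis for the next adjustment.

import datetime

position_db = {
    "current": {"head_deg": 0, "foot_deg": 0},   # illustrative starting configuration
    "history": [],
}

def record_adjustment(delta):
    """Apply a reported movement (e.g. {'head_deg': +10}) to the current position."""
    current = dict(position_db["current"])
    for axis, change in delta.items():
        current[axis] = current.get(axis, 0) + change
    position_db["history"].append({
        "time": datetime.datetime.now().isoformat(timespec="seconds"),
        "delta": delta,
        "position": current,
    })
    position_db["current"] = current              # new basis for the next adjustment
    return current

print(record_adjustment({"head_deg": 10}))        # {'head_deg': 10, 'foot_deg': 0}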


At block 640, the bed position is correlated with sleep data and sleep quality information. This data may be used in analyzing the impact of bed configuration on sleep quality. The results of such analysis may be used to provide recommendations to the user and/or automatic adjustments if the user authorizes such adjustments.


At block 650, the process determines whether there is enough data for a recommendation regarding bed configuration. If so, the recommendation is sent to the user at block 655. In one embodiment, the user may authorize the implementation of such recommendations. If that is the case, the system may automatically make adjustments based on the analysis. In one embodiment, at the time of set-up the user may authorize the system to automatically adjust the bed to improve sleep quality.


At block 680, abstracted, that is anonymized, user data is collected from multiple users. At block 670 the data is analyzed and the process determines whether there is enough data for reporting. If so, at block 675, the system reports the data. In one embodiment, the reporting may be to the bed manufacturer. In one embodiment, the report may be to users, as global recommendations or analysis data. This information may be useful for creating recommendations for users and for manufacturers. It may also be useful for adjusting recommendations based on user conditions. For example, the system may determine based on analysis of cumulative data that teenagers sleep better with their feet elevated, compared to adults or children, or that someone with a health condition sleeps better with a particular bed and/or environmental adjustment. The process then returns to block 615 to continue receiving data.
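
As an illustration of the kind of cumulative analysis described, the sketch below averages a sleep-quality score per age range and bed setting; the record fields and scores are invented for the example.

from statistics import mean
from collections import defaultdict

def compare_groups(records):
    """Average a sleep-quality score per (age range, feet elevated) combination.

    Each record is assumed to look like:
    {'age_range': 'ages 10-20', 'feet_elevated': True, 'sleep_quality': 0.82}
    """
    groups = defaultdict(list)
    for r in records:
        groups[(r["age_range"], r["feet_elevated"])].append(r["sleep_quality"])
    return {key: round(mean(scores), 2) for key, scores in groups.items()}

sample = [
    {"age_range": "ages 10-20", "feet_elevated": True, "sleep_quality": 0.84},
    {"age_range": "ages 10-20", "feet_elevated": False, "sleep_quality": 0.71},
    {"age_range": "ages 40-50", "feet_elevated": True, "sleep_quality": 0.69},
]
print(compare_groups(sample))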


Of course, though FIGS. 4 through 6 are shown as flowcharts, in one embodiment they may be implemented as interrupt-driven systems and the ordering of individual elements may be altered unless they are dependent on each other. For example, in FIG. 6, the reports, requests, and recommendations may be independently obtained and the appropriate analysis may be triggered whenever relevant data is received.



FIG. 7 is a block diagram of one embodiment of a computer system that may be used with the present invention. It will be apparent to those of ordinary skill in the art, however, that other alternative systems of various system architectures may also be used. Furthermore, while this is a particular computer system, the processors, servers, and other computer devices described may be implemented in a distributed cloud-based environment in which portions of a computer system's capabilities are used to provide the described features and processes.


The data processing system illustrated in FIG. 7 includes a bus or other internal communication means 740 for communicating information, and a processing unit 710 coupled to the bus 740 for processing information. The processing unit 710 may be a central processing unit (CPU), a digital signal processor (DSP), or another type of processing unit 710.


The system further includes, in one embodiment, a random access memory (RAM) or other volatile storage device 720 (referred to as memory), coupled to bus 740 for storing information and instructions to be executed by processor 710. Main memory 720 may also be used for storing temporary variables or other intermediate information during execution of instructions by processing unit 710.


The system also comprises in one embodiment a read only memory (ROM) 750 and/or static storage device 750 coupled to bus 740 for storing static information and instructions for processor 710. In one embodiment, the system also includes a data storage device 730 such as a magnetic disk or optical disk and its corresponding disk drive, or Flash memory or other storage which is capable of storing data when no power is supplied to the system. Data storage device 730 in one embodiment is coupled to bus 740 for storing information and instructions.


The system may further be coupled to an output device 770, such as a cathode ray tube (CRT) or a liquid crystal display (LCD) coupled to bus 740 through bus 760 for outputting information. The output device 770 may be a visual output device, an audio output device, and/or tactile output device (e.g. vibrations, etc.)


An input device 775 may be coupled to the bus 760. The input device 775 may be an alphanumeric input device, such as a keyboard including alphanumeric and other keys, for enabling a user to communicate information and command selections to processing unit 710. An additional user input device 780 may further be included. One such user input device 780 is a cursor control device, such as a mouse, a trackball, a stylus, cursor direction keys, or a touch screen, which may be coupled to bus 740 through bus 760 for communicating direction information and command selections to processing unit 710, and for controlling cursor movement on display device 770.


Another device, which may optionally be coupled to computer system 700, is a network device 785 for accessing other nodes of a distributed system via a network. The communication device 785 may include any of a number of commercially available networking peripheral devices such as those used for coupling to an Ethernet, token ring, Internet, or wide area network, personal area network, wireless network or other method of accessing other devices. The communication device 785 may further be a null-modem connection, or any other mechanism that provides connectivity between the computer system 700 and the outside world.


Note that any or all of the components of this system illustrated in FIG. 7 and associated hardware may be used in various embodiments of the present invention.


It will be appreciated by those of ordinary skill in the art that the particular machine that embodies the present invention may be configured in various ways according to the particular implementation. The control logic or software implementing the present invention can be stored in main memory 720, mass storage device 730, or other storage medium locally or remotely accessible to processor 710.


It will be apparent to those of ordinary skill in the art that the system, method, and process described herein can be implemented as software stored in main memory 720 or read only memory 750 and executed by processor 710. This control logic or software may also be resident on an article of manufacture comprising a computer readable medium having computer readable program code embodied therein and being readable by the mass storage device 730 and for causing the processor 710 to operate in accordance with the methods and teachings herein.


The present invention may also be embodied in a handheld or portable device containing a subset of the computer hardware components described above. For example, the handheld device may be configured to contain only the bus 740, the processor 710, and memory 750 and/or 720.


The handheld device may be configured to include a set of buttons or input signaling components with which a user may select from a set of available options. These could be considered input device #1 775 or input device #2 780. The handheld device may also be configured to include an output device 770 such as a liquid crystal display (LCD) or display element matrix for displaying information to a user of the handheld device. Conventional methods may be used to implement such a handheld device. The implementation of the present invention for such a device would be apparent to one of ordinary skill in the art given the disclosure of the present invention as provided herein.


The present invention may also be embodied in a special purpose appliance including a subset of the computer hardware components described above, such as a server system. For example, the appliance may include a processing unit 710, a data storage device 730, a bus 740, and memory 720, and no input/output mechanisms, or only rudimentary communications mechanisms, such as a small touch-screen that permits the user to communicate in a basic manner with the device. In general, the more special-purpose the device is, the fewer of the elements need be present for the device to function. In some devices, communications with the user may be through a touch-based screen, or similar mechanism. In one embodiment, the device may not provide any direct input/output signals, but may be configured and accessed through a website or other network-based connection through network device 785.


It will be appreciated by those of ordinary skill in the art that any configuration of the particular machine implemented as the computer system may be used according to the particular implementation. The control logic or software implementing the present invention can be stored on any machine-readable medium locally or remotely accessible to processor 710. A machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g. a computer). For example, a machine readable medium includes read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or other storage media which may be used for temporary or permanent data storage. In one embodiment, the control logic may be implemented as transmittable data, such as electrical, optical, acoustical or other forms of propagated signals (e.g. carrier waves, infrared signals, digital signals, etc.).


In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A sleep sensor system comprising: a sensor to collect movement data from a sleeper on an adjustable bed; a receiver to send the movement data from the sensor to a server, the receiver further to receive command data from the server; a bed controller to receive the command data from the receiver and to transmit bed movement command data based on the command data to the adjustable bed, to cause the adjustable bed to alter one or more of its settings in response to the bed movement command data; and an Internet of Things (IoT) controller to transmit IoT command data received from the server to an IoT device to cause the IoT device to take an action, wherein the command data is received from a bed command calculator of the server, the bed command calculator configured to calculate the command data to control the adjustable bed, based on a combination of a current position of the bed and a requested update to a configuration of the bed.
  • 2. The sleep sensor system of claim 1, further comprising: a connector to interface between a bed adjustor, associated with the adjustable bed, and the receiver.
  • 3. The sleep sensor system of claim 2, wherein the connector includes the bed controller.
  • 4. The sleep sensor system of claim 2, wherein the connector provides power to the receiver.
  • 5. The sleep sensor system of claim 4, wherein the connector receives power from the bed adjustor, the bed adjustor being incorporated into the adjustable bed.
  • 6. The sleep sensor system of claim 4, wherein the power from the connector is stepped down by a power system in the receiver.
  • 7. The sleep sensor system of claim 1, wherein the IoT controller is further to receive instructions from an IoT device to control the adjustable bed; and the receiver further to receive the command data from the server to make an adjustment, based on the instructions.
  • 8. The sleep sensor system of claim 7, wherein the bed movement command data is customized to the adjustable bed.
  • 9. The sleep sensor system of claim 1, wherein the IoT controller is used to adjust a sleeping environment of a user.
  • 10. A sleep sensor system of claim 1, wherein: the server receives the movement data from the sensor via the receiver, the server analyzes the movement data from the sensor, and the server sends control signals to control the adjustable bed to the receiver.
  • 11. A method of controlling a sleep environment including an adjustable bed comprising: sensing movement data using a sensor in the adjustable bed; collecting the movement data at a receiver; monitoring a sleep environment at the receiver to collect environmental sensor data; sending the movement data to a server; receiving command data from the server, the command data to adjust the adjustable bed or the sleep environment; transmitting, using a bed controller, bed movement command data to the adjustable bed, the bed movement command data based on the received command data, causing the adjustable bed to alter one or more of its settings; and transmitting, by an internet of things (IoT) controller, IoT command data, received from the server, to an IoT device to alter the sleep environment, wherein the command data is received from a bed command calculator in the server, the bed command calculator configured to calculate the command data to control the adjustable bed, based on a combination of a current position of the bed and a requested update to a bed configuration.
  • 12. The method of claim 11, further comprising: coupling the receiver to the adjustable bed via a connector to interface between a bed adjustor, associated with the adjustable bed and the receiver.
  • 13. The method of claim 12, wherein the connector includes the bed controller.
  • 14. The method of claim 12, wherein the connector provides power to the receiver.
  • 15. The method of claim 14, wherein the connector receives power from the bed adjustor, the bed adjustor being incorporated into the adjustable bed.
  • 16. The method of claim 14, further comprising: stepping down the power from the connector by a power system in the receiver.
  • 17. The method of claim 11, further comprising: receiving a voice instruction to alter settings of the adjustable bed via an IoT device; and generating the command data to adjust the bed based on the voice instruction.
  • 18. The method of claim 11, further comprising: analyzing the movement data to identify a sleep state, respiration, heart rate, snoring; and determining whether a change should be made to the adjustable bed or the sleep environment should be made based on the analyzing; and when the change is identified, generating command data to make the change.
  • 19. A system to retrofit an adjustable bed with internet of things (IoT) features, the system comprising: a sensor to obtain sleep data, for placement in the bed; a connector capable of coupling to controls for the adjustable bed; a receiver to interface with the adjustable bed through the connector, the receiver comprising: a signal receiver to receive the sleep data from the sensor; a digitizer to convert the received sleep data to digital sleep data; a transceiver to send the digital data to a server, and to receive IoT command data from the server; an Internet of Things (IoT) controller to receive instructions from an IoT device to control the adjustable bed, and further to send the IoT command data to an IoT device to alter a sleep environment; a server comprising: a signal identifier to identify a requested update to a bed configuration, based on one of: an instruction received from the IoT device, a manual change made by a user, and a change to be made based on analysis of the digital sleep data; a bed command calculator to calculate command data to control the adjustable bed, based on a combination of a current position of the bed and the identified requested update to the bed configuration, the command data sent to the receiver, and used by the receiver to trigger a bed adjustor in the adjustable bed.
  • 20. The sleep sensor system of claim 1, wherein the receiver is to interface with the adjustable bed through a connector, the receiver comprising: a signal receiver to receive the movement data from the sensor; a digitizer to convert the received movement data to digital data; and a transceiver to send the digital data to the server, and to receive the command data from the server.
RELATED APPLICATION

The present application claims priority to U.S. Provisional Pat. Application No. 62/745,978, filed on Oct. 15, 2018. The present application also claims priority to U.S. Provisional Pat. Application No. 62/745,976 (8689P231Z) and U.S. Provisional Application No. 62/745,984 (8689P233Z), both filed on Oct. 15, 2018, and incorporates all three of those applications by reference in their entirety.

Provisional Applications (3)
Number Date Country
62745978 Oct 2018 US
62745976 Oct 2018 US
62745984 Oct 2018 US