The present disclosure pertains to systems and methods for modeling at least a sleep parameter for a subject.
A general premise of sleep/wake regulation is that the longer one is awake, the shorter the time it takes to fall asleep. However, not all “wake” periods accumulate sleep-need equivalently. Considering the timing of sleep and wake alone is not sufficient to accurately model the dynamics of sleep-need. Numerous factors play a role in sleep/wake regulation, and not all of them can be included in a model. The present disclosure overcomes these deficiencies.
Accordingly, one or more aspects of the present disclosure relates to a system for modeling at least a sleep parameter for a subject. The system comprises a non-transitory computer readable medium having instructions thereon, the instructions when executed by a computer causing the computer to: obtain sleep and/or wake information related to a subject for a given period of time; estimate bedtime and/or wakeup time of the subject using a sleep model and the sleep and/or wake information; predict at least a sleep parameter of the subject for an upcoming time interval based upon the sleep model and the estimated bedtime and/or wakeup time; obtain physiological information of the subject; and adjust the predicted sleep parameter for the upcoming time interval based upon the sleep model and the physiological information.
Another aspect of the present disclosure relates to a method for modeling at least a sleep parameter for a subject. The method comprises: obtaining sleep and/or wake information related to a subject for a given period of time; estimating bedtime and/or wakeup time of the subject using a sleep model and the sleep and/or wake information; predicting at least a sleep parameter of the subject for an upcoming time interval based upon the sleep model and the estimated bedtime and/or wakeup time; obtaining physiological information of the subject; and adjusting the predicted sleep parameter for the upcoming time interval based upon the sleep model and the physiological information.
Still another aspect of the present disclosure relates to a system for modeling at least a sleep parameter for a subject. The system comprises one or more input devices configured to generate output signals conveying physiological information related to a subject, and one or more physical processors operatively connected with the one or more input devices, the one or more physical processors configured by machine-readable instructions to: obtain sleep and/or wake information related to a subject for a given period of time; estimate bedtime and/or wakeup time of the subject using a sleep model and the sleep and/or wake information; predict at least a sleep parameter of the subject for an upcoming time interval based upon the sleep model and the estimated bedtime and/or wakeup time; obtain physiological information of the subject; and adjust the predicted sleep parameter for the upcoming time interval based upon the sleep model and the physiological information.
Still another aspect of the present disclosure relates to a system for modeling at least a sleep parameter for a subject, the system comprising: obtaining means for obtaining sleep and/or wake information related to a subject for a given period of time; estimating means for estimating bedtime and/or wakeup time of the subject using a sleep model and the sleep and/or wake information; predicting means for predicting at least a sleep parameter of the subject for an upcoming time interval based upon the sleep model and the estimated bedtime and/or wakeup time; obtaining means for obtaining physiological information of the subject; and adjusting means for adjusting the predicted sleep parameter for the upcoming time interval based upon the sleep model and the physiological information.
These and other objects, features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the disclosure.
As used herein, the singular form of “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. As used herein, the term “or” means “and/or” unless the context clearly dictates otherwise. As used herein, the statement that two or more parts or components are “coupled” shall mean that the parts are joined or operate together either directly or indirectly, i.e., through one or more intermediate parts or components, so long as a link occurs. As used herein, “directly coupled” means that two elements are directly in contact with each other. As used herein, “fixedly coupled” or “fixed” means that two components are coupled so as to move as one while maintaining a constant orientation relative to each other.
As used herein, the word “unitary” means a component is created as a single piece or unit. That is, a component that includes pieces that are created separately and then coupled together as a unit is not a “unitary” component or body. As employed herein, the statement that two or more parts or components “engage” one another shall mean that the parts exert a force against one another either directly or through one or more intermediate parts or components. As employed herein, the term “number” shall mean one or an integer greater than one (i.e., a plurality).
Directional phrases used herein such as, for example and without limitation, top, bottom, left, right, upper, lower, front, back, and derivatives thereof, relate to the orientation of the elements shown in the drawings and are not limiting upon the claims unless expressly recited therein.
In some embodiments, system 10 comprises one or more of sensor(s) 18, a processor 20, external resources 14, electronic storage 22, client computing platform(s) 24, a network 26, and/or other components.
Sensor(s) 18 is configured to generate output signals conveying physiological information related to subject 12. In some embodiments, the physiological information of the subject may include one or more of heart rate, heart rate variability, microvascular blood volume, electrical function of the heart, brain activity, eye movement, physical activity, sleep/wake status, sleep duration, wake duration, sleep/wake onset, and/or other physiological information. In some embodiments, the one or more sensor(s) may include one or more of a heart rate sensor, an electrocardiogram (ECG), a photoplethysmograph (PPG), an electroencephalogram (EEG), an electrooculograph (EOG), and/or other sensors. In some embodiments, sensor(s) 18 may include a pulse oximeter, a movement sensor, an accelerometer, a blood pressure sensor, an actimetry sensor, a camera, a breathing sensor, and/or other sensors configured for monitoring the subject's state. Although sensor(s) 18 is illustrated at a single location near subject 12, this is not intended to be limiting. In some embodiments, sensor(s) 18 may include sensors disposed in a plurality of locations (e.g., sensors disposed on (or near) the chest, limb, head, ear, eye, and/or other body parts of subject 12). In some embodiments, sensor(s) 18 may include sensors coupled (in a removable manner) with clothing of subject 12. In some embodiments, sensor(s) 18 may include sensors disposed in a computing device (e.g., a mobile phone, a computer, etc.), and/or disposed in a medical device used by the user. In some embodiments, sensor(s) 18 may include sensors that are positioned to point at subject 12 (e.g., a camera).
In some embodiments, sensor(s) 18 may be included in a wearable device. The wearable device may be any device that is worn, or that is in full or partial contact with any body part of the subject. In some embodiments, the wearable device may be in the form of a wristband, an activity monitor, a smart watch, a headband, ear plugs, etc. In some embodiments, the wearable device may be configured to generate output signals conveying information related to heart rate, heart rate variability, microvascular blood volume, electrical function of the heart, brain activity, eye movement, physical activity, sleep/wake status, sleep duration, wake duration, sleep/wake onset, and/or other physiological parameters. The output signals may be transmitted to a computing device (within or outside of the wearable device) wirelessly and/or via wires. In some embodiments, some or all components of system 10 may be included in the wearable device.
Processor 20 is configured to provide information processing capabilities in system 10. As such, processor 20 may include one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. Although processor 20 is shown in
As shown in
It should be appreciated that although components 28, 30, 32, and 34 are illustrated in
Subject information component 28, in some embodiments, is configured to obtain information related to subject 12. In some embodiments, information related to subject 12 may include biographical information. For example, biographical information may include demographic information (e.g., gender, ethnicity, age, etc.), vital sign information (e.g., height, weight, BMI, blood pressure, pulse, temperature, respiration, etc.), medical/health condition information (e.g., a disease type, severity of the disease, stage of the disease, categorization of the disease, symptoms, behaviors, readmission, relapse, etc.), treatment history information (e.g., type of treatments, length of treatments, current and past medications, etc.), and/or other information.
In some embodiments, subject information component 28 may be configured to determine (and/or obtain) information related to other subjects, for example, subjects with similar sleep and/or wake information, demographic information, vital sign information, medical/health condition information, treatment history information, a similar desired outcome (e.g., from sensory stimulation), and/or other similarities with subject 12. It should be noted that the subject information described above is not intended to be limiting. A wide variety of subject-related information may exist and may be used with system 10 in accordance with some embodiments. For example, users may choose to customize system 10 and include any type of subject data they deem relevant.
In some embodiments, subject information component 28 may be configured to obtain/extract information from one or more databases (e.g., electronic storage 22 shown in
Physiological parameters component 30 may be configured to determine (and/or obtain) one or more physiological parameters related to subject 12. In some embodiments, the one or more physiological parameters may be determined based on output signals from sensor(s) 18. In some embodiments, the one or more physiological parameters may include heart rate, heart rate variability, microvascular blood volume, electrical function of the heart, brain activity, eye movement, physical activity, sleep/wake status, sleep duration, wake duration, sleep/wake onset, alpha power, theta power, and/or other physiological parameters. In some embodiments, the one or more physiological parameters may be determined before, during, and/or after a sleep session. For example, in some embodiments, the one or more physiological parameters may be determined between the time the subject wakes up and the time the subject goes to bed.
In some embodiments, the one or more physiological parameters related to subject 12 may include information related to one or more physical activities of the subject (e.g., standing, sitting, walking, running, exercising, relaxing, and/or other physical activities). In some cases, physical activity of the subject may influence their sleepiness, drowsiness, sleep need, sleep quality, and/or other sleep parameters of the subject. For example, the information about the physical activity may include a type of physical activity, intensity, duration, time of the day the activity was carried out, and/or other information about the physical activity. In some cases, the information about the physical activity may be that the user did not carry out any physical activity.
In some embodiments, the one or more physiological parameters related to subject 12 may include dietary information of the subject (e.g., food and/or drink). In some cases, the diet of the subject may influence their sleepiness, drowsiness, sleep need, sleep quality, and/or other sleep parameters of the subject (e.g., caffeine, alcohol, carbohydrates, fatty foods, spicy foods, and/or other foods). In some embodiments, the dietary information may include one or more types of drinks and/or foods, amount of drinks and/or foods, time of day the drinks and/or foods were consumed, and/or other dietary information.
In some embodiments, the one or more physiological parameters related to subject 12 may include information related to one or more medical treatments (e.g., drugs, medical interventions, therapy, and/or other medical treatments). In some cases, medical treatment of the subject may influence their sleepiness, drowsiness, sleep need, sleep quality, and/or other sleep parameters of the subject. For example, in some embodiments, the medical treatment information may include one or more types of medical treatment, dosage, time of day the medical treatment is taken, and/or other medical treatment information.
In some embodiments, the one or more physiological parameters related to subject 12 may include psychological information related to the subject (e.g., stress, anxiety, and/or other psychological information). In some cases, psychological information of the subject may influence their sleepiness, drowsiness, sleep need, sleep quality, and/or other sleep parameters of the subject. For example, the psychological information may include a type of psychological information, intensity level, time of the day it occurs, and/or other psychological information of the subject.
In some embodiments, the one or more physiological parameters related to subject 12 may include daytime nap information of the subject. In some cases, daytime napping may influence the subject's sleep need, sleep debt, sleep/wake quality, and/or other sleep parameters of the subject. For example, in some embodiments, the daytime nap information may include one or more of the length, quality, time of the nap, and/or other daytime nap information.
In some embodiments, physiological parameters component 30 is configured to obtain the physiological parameters from one or more databases within or outside system 10 (for example, electronic storage 22, external resources 14, and/or other databases). In some embodiments, the one or more physiological parameters may be obtained from one or more sensors outside of system 10. For example, in some embodiments, system 10 does not require sensor(s) 18; instead, it receives information related to the subject from outside sensors. The information may be in the form of output signals that physiological parameters component 30 uses to determine the physiological parameters, and/or the information obtained may be physiological parameters that do not require additional processing by system 10. In some embodiments, this information may be sent automatically to system 10 whenever it becomes available, or system 10 may be configured to request the information (e.g., in a continuous manner, and/or on a schedule). For example, in some embodiments, the outside sensors may be independent physiological sensors, sensors included in activity devices, medical devices, mobile phones, smart wearable devices, and/or other sensors. In some embodiments, the physiological parameters are obtained via an input device (e.g., client computing platform(s) 24). The user may use the input device to provide physiological parameters to system 10. For example, the input device may be a wearable device, a smart phone, a smart watch, a computer, and/or any other device that is able to communicate with system 10.
Sleep information component 32 may be configured to obtain sleep information of the subject. For example, sleep information may include bedtime and/or wakeup time of the subject, sleep/wake status, sleep duration, wake duration, sleep/wake onset, sleep latency, sleep need, sleep debt, and/or other sleep parameters of the subject. In some embodiments, sleep information may include historical sleep information. For example, in some embodiments, sleep information component 32 may be configured to obtain sleep information over a period of time (e.g., 24 hours, 48 hours, a few days, weeks, months, years, or any other period of time).
In some embodiments, sleep information component 32 is configured to obtain the sleep information from one or more databases within or outside system 10 (for example, electronic storage 22, external resources 14, and/or other databases). In some embodiments, the sleep information may be obtained from one or more sensors outside of system 10. For example, in some embodiments, system 10 does not require sensor(s) 18; instead, it receives information related to the subject from outside sensors. The information may be in the form of output signals that sleep information component 32 uses to determine the sleep information, and/or the information obtained may be sleep information that does not require additional processing by system 10. For example, in some embodiments, the outside sensors may be independent physiological sensors, sensors included in activity devices, medical devices, mobile phones, smart wearable devices, and/or other sensors. In some embodiments, the sleep information is obtained via an input device (e.g., client computing platform(s) 24). The user may use the input device to provide sleep information to system 10. For example, the input device may be a wearable device, a smart phone, a smart watch, a computer, and/or any other device that is able to communicate with system 10.
Sleep information component 32, in some embodiments, may be configured to estimate one or more sleep parameters of the subject. In some embodiments, the one or more sleep parameters of the subject may be estimated using one or more sleep models. In some embodiments, parameters of a sleep model of the one or more sleep models may be obtained using a two-process model. Generally, sleep and wake periods alternate throughout a 24-hour cycle and they are regulated by two factors: homeostatic and circadian. In some embodiments, an example sleep model of sleep/wake regulation is a two-process model where a homeostatic component “H” (see
We consider that at time “t” the signal s(t) is sampled and that a relation Sh exists such that an estimate Ĥ(t) of H at time t can be obtained from s(t): Ĥ(t) = Sh(s(t)). The function Sh depends on the specific type of signal, and specific instances thereof are detailed herein below. The homeostatic component H models the accumulation of sleep-need during wakefulness according to the following equation:
During wake: H(t) = μ + (H0 − μ) exp((t0 − t)/τw)
During sleep 214, the homeostatic component H models the dissipation of sleep-need according to the following equation:
During sleep: H(t) = He exp((t0 − t)/τs)
The circadian factor determines the timing of sleep and wake through two “parallel” 24-hour period sinusoidal curves (C(t)) 204 according to the following equations:
Cup(t) = H0+ + 0.1 sin(2πt)
Cdwn(t) = H0− + 0.1 sin(2πt)
where the sleep model parameters are the average (24-hour format) bedtime and wakeup time, the asymptotic value μ for H during wakefulness, the time constant τw which controls the rate at which sleep-need accumulates during wake, the time constant τs which controls the rate at which sleep-need dissipates during the sleep period, and the upper and lower circadian shift parameters H0+ and H0− respectively. In some embodiments, the sleep model parameters may be estimated from a sequence of sleep/wake history. For example, a sleep/wake history of a few days, weeks, months, or years (e.g., at least seven days). In some embodiments, the model parameters may be continuously updated as more data is collected. Using the data from several days may increase robustness against noisy data. For example, in some embodiments, a first step in the model is to estimate the average bedtime and wakeup time whose variance decreases as more data is taken into account.
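The two-process dynamics described above can be sketched in code. In the sketch below, time is measured in days (matching the sin(2πt) circadian term), and all numeric parameter values are hypothetical placeholders; the actual values are estimated from the subject's sleep/wake history as described in the text.

```python
import math

# Hypothetical parameter values, for illustration only.
MU = 1.2        # asymptotic value of H during wakefulness (mu)
TAU_W = 0.76    # time constant for sleep-need accumulation during wake (days)
TAU_S = 0.17    # time constant for sleep-need dissipation during sleep (days)
H0_UP = 0.6     # upper circadian shift parameter H0+
H0_DOWN = 0.17  # lower circadian shift parameter H0-

def homeostat_wake(t, t0, h_start):
    """Sleep-need accumulation during wake: H(t) = mu + (H_start - mu)*exp((t0 - t)/tau_w)."""
    return MU + (h_start - MU) * math.exp((t0 - t) / TAU_W)

def homeostat_sleep(t, t0, h_end):
    """Sleep-need dissipation during sleep: H(t) = He*exp((t0 - t)/tau_s)."""
    return h_end * math.exp((t0 - t) / TAU_S)

def circadian_up(t):
    """Upper circadian curve: Cup(t) = H0+ + 0.1*sin(2*pi*t), t in days."""
    return H0_UP + 0.1 * math.sin(2 * math.pi * t)

def circadian_down(t):
    """Lower circadian curve: Cdwn(t) = H0- + 0.1*sin(2*pi*t), t in days."""
    return H0_DOWN + 0.1 * math.sin(2 * math.pi * t)

def predict_sleep_onset(t_wake, h_wake, dt=0.001):
    """Step forward from wakeup until H crosses the upper circadian curve,
    i.e. the predicted wake-to-sleep transition."""
    t = t_wake
    while homeostat_wake(t, t_wake, h_wake) < circadian_up(t):
        t += dt
    return t
```

With these placeholder values, a wakeup at t = 0 with H = 0.17 yields a predicted sleep onset well under one day later, as H rises toward μ and meets the oscillating upper threshold.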
In some embodiments, in practice, the one or more sleep models may be built utilizing past sleep/wake information (as described above). For example, recorded bedtime/wakeup time data may be used to estimate the parameters of a sleep model. In this model, days are used as the time unit.
where Wi and Si are the duration of wakefulness and sleep associated with day “i” respectively, and H0 is the homeostatic threshold to transition from sleep to wake, He is the homeostatic threshold to transition from wake to sleep, τw=1/δ is the time constant controlling the accumulation of sleep-need during wake, and τs=1/σ is the time constant controlling the dissipation of sleep-need during sleep.
By matching the two expressions for He, the function F to minimize is obtained, which has to be solved to estimate the values of H0, σ, and δ. The iterative optimization procedure uses the partial derivatives of F.
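Although the referenced equations are not reproduced here, the matching condition can be reconstructed from the definitions above: for each day i, the sleep-need He reached at the end of wakefulness, μ + (H0 − μ)exp(−δWi), must equal the value H0 exp(σSi) implied by dissipation over the sleep period. A minimal sketch of the resulting least-squares objective, with a coarse one-dimensional search standing in for the iterative gradient-based procedure (all numeric values hypothetical):

```python
import math

def residuals(h0, sigma, delta, W, S, mu=1.2):
    """Per-day mismatch between the wake-side expression for He,
    mu + (h0 - mu)*exp(-delta*Wi), and the sleep-side expression h0*exp(sigma*Si)."""
    return [mu + (h0 - mu) * math.exp(-delta * w) - h0 * math.exp(sigma * s)
            for w, s in zip(W, S)]

def objective(h0, sigma, delta, W, S, mu=1.2):
    """Sum of squared residuals: the function F to be minimized."""
    return sum(r * r for r in residuals(h0, sigma, delta, W, S, mu))

def fit_sigma(W, S, h0, delta, lo=2.0, hi=6.0, step=0.01):
    """Coarse 1-D search over sigma with the other parameters held fixed;
    the disclosure instead uses an iterative procedure with partial derivatives."""
    candidates = [lo + step * i for i in range(int((hi - lo) / step))]
    return min(candidates, key=lambda s: objective(h0, s, delta, W, S))
```

For a self-consistent day (e.g., W = 0.67 days awake, S = 0.33 days asleep, with h0 = 0.2 and delta = 1.32), the search recovers the sigma that makes the two expressions for He agree.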
Returning to
In some embodiments, the example of sleep model shown in
As described above, in some embodiments, one or more sensor signals and/or behaviors may be used to estimate H. These are not intended to limit the scope of this invention which covers the overall concept of utilizing daytime signals to adjust sleep predictions. In some embodiments, additional factors may be taken into consideration for a more accurate prediction. The sleep prediction may be adjusted (e.g. by sleep prediction component shown in
In some embodiments, bedtime and/or wakeup time of the subject may be estimated using sleep model 424 and the sleep and/or wake information from sleep/wake history database 422. In some embodiments, sleep model 424 may be a two-process sleep model 420. In some embodiments, sleep model 420 is similar to sleep model 200 shown in
In some embodiments, physiological (and/or behavioral) information of the subject 12 may be obtained from one or more devices 418. In some embodiments, the physiological (and/or behavioral) information is similar to physiological information obtained by physiological parameter component 30 shown in
In some embodiments, the predicted sleep parameters for the upcoming time interval may be adjusted based upon sleep model 420 and the physiological (and/or behavioral) information 430 obtained from devices 418. The adjusted model 460 shows that the homeostasis curve is adjusted, and a new delayed sleep onset estimation is determined in response to the new information (physiological and/or behavioral) 430 becoming available (e.g., the user took a nap). As explained above, at time “t” the signal s(t) is sampled and a relation Sh exists such that an estimate Ĥ(t) of H at time t can be obtained from s(t): Ĥ(t) = Sh(s(t)). The function Sh depends on the specific type of signal and specific instances thereof. The corrected value of H at time “t” that is utilized in model 460 to produce an adjusted estimation of sleep onset is: λ×H(t) + (1−λ)×Ĥ(t), where H(t) is the value originally predicted from the model and 0 < λ < 1 controls the degree of correction due to Ĥ(t).
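The blending step above can be sketched as follows; the default weight is a hypothetical choice, not a value specified by the disclosure:

```python
def corrected_h(h_model, h_estimate, lam=0.7):
    """Corrected sleep-need: lam*H(t) + (1 - lam)*H_hat(t), where H(t) is the
    model prediction and H_hat(t) = Sh(s(t)) is the signal-derived estimate.
    lam in (0, 1) controls how strongly the observation corrects the model."""
    if not 0.0 < lam < 1.0:
        raise ValueError("lam must lie strictly between 0 and 1")
    return lam * h_model + (1.0 - lam) * h_estimate
```

The corrected value always lies between the model prediction and the observed estimate; with lam close to 1 the model is trusted more, and with lam close to 0 the daytime signal dominates.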
In some embodiments, the one or more physiological information obtained may include daytime sleep behavior information. In some embodiments, information about sleep behavior other than the night-time sleep of the user may be obtained. For example, in some cases, daytime napping is among behaviors that may influence night-time sleep.
As shown in
In some embodiments, the one or more physiological information obtained may include daytime eye movement information. Eye movement and/or eyelid movement (e.g., blinks) are linked to vigilance, attention, and sleepiness. In some embodiments, one or more eye and/or lid movements may indicate a drowsiness level. For example, these one or more eye and/or lid movements may include percentage or duration of eye closure, blink duration, blink rate or blink amplitude, pendular motions, slow eye movements, lid closing/reopening time, interval between blinks, changes in pupil size, saccadic velocities, amplitude-velocity ratios of the eye closure, and/or other eye and/or lid movements. In some embodiments, daytime eye/lid movements of the subject may be monitored using sensors within and/or outside system 10. For example, the eye movement may be monitored using one or more sensors that incorporate detection and tracking of changes in the ocular region (e.g., ocular sensors, cameras, electrooculography (EOG), infrared reflectance oculography, etc.). In some embodiments, the eye/lid movement may be continually measured.
In some embodiments, physiological parameters component 30 may be configured to obtain/determine one or more daytime eye movement features based on the daytime eye movements. In some embodiments, the daytime eye movement features may be used to estimate a sleepiness scale, for example, the Karolinska Sleepiness Scale (KSS). KSS is a measure of a subjective level of sleepiness at a particular time during the day. The KSS is a 9-point Likert scale often used when conducting studies involving self-reported, subjective assessment of an individual's level of drowsiness at the time. On this scale, subjects indicate which level best reflects the psycho-physical state experienced in the last 10 minutes. KSS scores range from 1 (extremely alert) to 9 (very sleepy, fighting sleep).
In general, KSS increases with longer periods of wakefulness and it correlates with the time of the day.
KSS can be estimated from features derived from daytime eye movements as follows:
KSS≈1+ε×|iris−pupil|.
KSS can in turn be used to estimate H as follows:
KSS(t)≈9.68−0.46×1/H(t).
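Because the relation KSS(t) ≈ 9.68 − 0.46 × 1/H(t) is invertible for KSS below 9.68, a KSS value derived from daytime signals can be mapped back to an estimate Ĥ of the homeostatic component. A minimal sketch of both directions:

```python
def kss_from_h(h):
    """Forward relation: KSS(t) ~ 9.68 - 0.46 * (1 / H(t))."""
    return 9.68 - 0.46 / h

def h_from_kss(kss):
    """Inverse relation: recover the sleep-need estimate H_hat from a KSS value."""
    if kss >= 9.68:
        raise ValueError("inversion requires KSS < 9.68")
    return 0.46 / (9.68 - kss)
```

For example, a KSS of 7 maps to H ≈ 0.17, and the two functions round-trip exactly for valid inputs.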
The type of adjustment on sleep onset prediction based on eye movement information is illustrated in
Other sleep parameters may be predicted using the sleep model. For example, in some embodiments, sleep duration may be predicted given bedtime “Tb” (measured from wakeup time). Indeed, the accumulated sleep-need up to time Tb is:
H(Tb)=μ+(H0−μ)exp((t0−Tb)/τw).
The duration of sleep is then calculated using the sleep-need dissipation formula:
Sleep duration=τs log(H(Tb)/Hs),
where Hs is the threshold determining the transition from sleep to wake (as shown in
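The two formulas above combine into a direct sleep-duration prediction. In the sketch below, all numeric values in the example are hypothetical placeholders in day units (τw ~ 0.76 days, τs ~ 0.17 days, and so on), not values given by the disclosure:

```python
import math

def predict_sleep_duration(t_b, t0, h0, mu, tau_w, tau_s, h_s):
    """Sleep duration = tau_s * log(H(Tb)/Hs), where
    H(Tb) = mu + (h0 - mu)*exp((t0 - t_b)/tau_w) is the sleep-need accumulated
    by bedtime Tb, and Hs is the threshold for the sleep-to-wake transition."""
    h_tb = mu + (h0 - mu) * math.exp((t0 - t_b) / tau_w)  # accumulated sleep-need
    return tau_s * math.log(h_tb / h_s)                   # dissipation time to Hs
```

With illustrative values (wakeup at t0 = 0 with h0 = 0.2, μ = 1.2, bedtime Tb = 0.67 days, Hs = 0.2), the predicted duration is about 0.23 days, roughly 5.6 hours, and a later bedtime yields a longer predicted sleep, as expected from the accumulation curve.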
In some embodiments, the one or more physiological information may include daytime brain activity measurements (e.g., electroencephalography (EEG)). In some embodiments, changes in the subject's EEG measurements may indicate drowsiness in the subject. In some embodiments, EEG measurements may be obtained from one or more sensors within or outside system 10. In some embodiments, EEG-based metrics include power in relevant frequency bands such as theta (4–8 Hz) and alpha (8–12 Hz).
To estimate the corresponding values of H, the relation KSS(t)≈9.68−0.46×1/H(t) may be leveraged.
In some embodiments, the one or more physiological information may include subjective feedback on sleepiness obtained from the subject. For example, in some embodiments, under certain circumstances or due to professional requirements, it is possible to obtain subjective feedback on sleepiness typically using a visual analog scale (VAS). VAS correlates with KSS (see
In some embodiments, as shown in
External resources 14 include sources of patient and/or other information, such as databases, websites, etc., external entities participating with system 10 (e.g., a medical records system of a healthcare provider that stores medical history information for populations of patients), one or more servers outside of system 10, a network (e.g., the internet), electronic storage, equipment related to Wi-Fi technology, equipment related to Bluetooth® technology, data entry devices, sensors, scanners, and/or other resources. In some embodiments, some or all of the functionality attributed herein to external resources 14 may be provided by resources included in system 10. External resources 14 may be configured to communicate with processor 20, computing devices 24, electronic storage 22, and/or other components of system 10 via wired and/or wireless connections, via a network (e.g., a local area network and/or the internet), via cellular technology, via Wi-Fi technology, and/or via other resources.
Electronic storage 22 includes electronic storage media that electronically stores information. The electronic storage media of electronic storage 22 may include one or both of system storage that is provided integrally (i.e., substantially non-removable) with system 10 and/or removable storage that is removably connectable to system 10 via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). Electronic storage 22 may be (in whole or in part) a separate component within system 10, or electronic storage 22 may be provided (in whole or in part) integrally with one or more other components of system 10 (e.g., computing devices 24, processor 20, etc.). In some embodiments, electronic storage 22 may be located in a server together with processor 20, in a server that is part of external resources 14, in a computing device 24, and/or in other locations. Electronic storage 22 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. Electronic storage 22 may store software algorithms, information determined by processor 20, information received via a computing device 24 and/or graphical user interface 40 and/or other external computing systems, information received from external resources 14, sensors 18, and/or other information that enables system 10 to function as described herein.
Client computing platform(s) 24 is configured to provide an interface between system 10 and subject 12, and/or other users through which subject 12 and/or other users may provide information to and receive information from system 10. For example, client computing platform(s) 24 may display a representation of the output signal from sensors 18 (e.g., an EEG, 2D/3D images, video, audio, text, etc.) to a user. This enables data, cues, results, instructions, and/or any other communicable items, collectively referred to as “information,” to be communicated between a user (e.g., subject 12, a doctor, a caregiver, and/or other users) and one or more of processor 20, electronic storage 22, and/or other components of system 10.
Examples of interface devices suitable for inclusion in client computing platform(s) 24 comprise a keypad, buttons, switches, a keyboard, knobs, levers, a display screen, a touch screen, speakers, a microphone, an indicator light, an audible alarm, a printer, a tactile feedback device, and/or other interface devices. In some embodiments, client computing platform(s) 24 comprise a plurality of separate interfaces. In some embodiments, client computing platform(s) 24 comprise at least one interface that is provided integrally with processor 20, sensor(s) 18, and/or other components of system 10.
Computing devices 24 are configured to provide interfaces between caregivers (e.g., doctors, nurses, friends, family members, etc.), patients, and/or other users, and system 10. In some embodiments, individual computing devices 24 are, and/or are included in, desktop computers, laptop computers, tablet computers, smartphones, and/or other computing devices associated with individual caregivers, patients, and/or other users. In some embodiments, individual computing devices 24 are, and/or are included in, equipment used in hospitals, doctors' offices, and/or other medical facilities to monitor patients; test equipment; equipment for treating patients; data entry equipment; and/or other devices. Computing devices 24 are configured to provide information to, and/or receive information from, the caregivers, patients, and/or other users. For example, computing devices 24 are configured to present a graphical user interface 40 to the caregivers to facilitate display of representations of the data analysis and/or other information. In some embodiments, graphical user interface 40 includes a plurality of separate interfaces associated with computing devices 24, processor 20, and/or other components of system 10; multiple views and/or fields configured to convey information to and/or receive information from caregivers, patients, and/or other users; and/or other interfaces.
In some embodiments, computing devices 24 are configured to provide graphical user interface 40, processing capabilities, databases, and/or electronic storage to system 10. As such, computing devices 24 may include processors 20, electronic storage 22, external resources 14, and/or other components of system 10. In some embodiments, computing devices 24 are connected to a network (e.g., the internet). In some embodiments, computing devices 24 do not include processors 20, electronic storage 22, external resources 14, and/or other components of system 10, but instead communicate with these components via the network. The connection to the network may be wireless or wired. For example, processor 20 may be located in a remote server and may wirelessly cause display of graphical user interface 40 to the caregivers on computing devices 24. As described above, in some embodiments, an individual computing device 24 is a laptop, a personal computer, a smartphone, a tablet computer, and/or other computing devices. Examples of interface devices suitable for inclusion in an individual computing device 24 include a touch screen, a keypad, touch-sensitive and/or physical buttons, switches, a keyboard, knobs, levers, a display, speakers, a microphone, an indicator light, an audible alarm, a printer, and/or other interface devices. The present disclosure also contemplates that an individual computing device 24 includes a removable storage interface. In this example, information may be loaded into a computing device 24 from removable storage (e.g., a smart card, a flash drive, a removable disk, etc.) that enables the caregivers, patients, and/or other users to customize the implementation of computing devices 24. Other exemplary input devices and techniques adapted for use with computing devices 24 include, but are not limited to, an RS-232 port, an RF link, an IR link, a modem (telephone, cable, etc.), and/or other devices.
The network 26 may include the Internet and/or other networks, such as local area networks, cellular networks, intranets, near field communication, a radio frequency (RF) link, Bluetooth™, Wi-Fi™, and/or any type(s) of wired or wireless network(s). Such examples are not intended to be limiting, and the scope of this disclosure includes embodiments in which external resources 14, sensor(s) 18, processor(s) 20, electronic storage 22, and/or client computing platform(s) 24 are operatively linked via some other communication media.
In some embodiments, method 1000 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 1000 in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 1000.
At operation 1002, sleep and/or wake information related to a subject for a given period of time is obtained. In some embodiments, operation 1002 is performed by one or more processors the same as or similar to processors 20 (shown in
At operation 1004, bedtime and/or wakeup time of the subject is estimated using a sleep model and the sleep and/or wake information. In some embodiments, operation 1004 is performed by a physical computer processor the same as or similar to processor(s) 20 (shown in
At operation 1006, at least a sleep parameter of the subject for an upcoming time interval is predicted based upon the sleep model and the estimated bedtime and/or wakeup time. In some embodiments, operation 1006 is performed by a physical computer processor the same as or similar to processor(s) 20 (shown in
At operation 1008, physiological information of the subject is obtained. In some embodiments, operation 1008 is performed by a physical computer processor the same as or similar to processor(s) 20 (shown in
At operation 1010, the predicted sleep parameter for the upcoming time interval is adjusted based upon the sleep model and the physiological information. In some embodiments, operation 1010 is performed by a physical computer processor the same as or similar to processor(s) 20 (shown in
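The sequence of operations 1002 through 1010 can be sketched in code. The disclosure does not specify the sleep model's equations, so the sketch below assumes a simple exponential homeostatic process (sleep pressure builds during wake and is discharged during sleep, consistent with the premise that longer wake shortens time to fall asleep); the constants, the heart-rate-based adjustment, and all function names are hypothetical illustrations, not the claimed model.

```python
import math
from typing import List, Tuple

# Assumed (hypothetical) model constants for the homeostatic process.
RISE_TAU_H = 18.2   # time constant for pressure build-up during wake (hours)
S_MAX, S_MIN = 1.0, 0.0

def estimate_bed_and_wake(epochs: List[Tuple[float, bool]]) -> Tuple[float, float]:
    """Operation 1004: estimate bedtime and wakeup time from
    (hour, is_asleep) epochs as the first and last hours scored asleep."""
    asleep_hours = [t for t, asleep in epochs if asleep]
    return min(asleep_hours), max(asleep_hours)

def predict_pressure(s0: float, wake_hours: float) -> float:
    """Operation 1006: predict a sleep parameter (here, homeostatic sleep
    pressure) after a wake interval, rising exponentially toward S_MAX."""
    return S_MAX + (s0 - S_MAX) * math.exp(-wake_hours / RISE_TAU_H)

def adjust_with_physiology(predicted: float, resting_hr: float,
                           baseline_hr: float = 60.0) -> float:
    """Operation 1010: adjust the predicted parameter using physiological
    information; an elevated resting heart rate nudges the estimate upward."""
    correction = 0.05 * (resting_hr - baseline_hr) / baseline_hr
    return min(S_MAX, max(S_MIN, predicted + correction))

# Operation 1002: sleep/wake epochs for one night (hour-of-day, is_asleep).
epochs = [(22.0, False), (23.0, True), (24.0, True), (30.0, True), (31.0, False)]
bedtime, wakeup = estimate_bed_and_wake(epochs)

# Operations 1006-1010: predict, then adjust with physiological information.
pressure = predict_pressure(s0=0.2, wake_hours=16.0)
adjusted = adjust_with_physiology(pressure, resting_hr=66.0)
```

In this sketch the prediction and the adjustment are deliberately separated, mirroring the claim structure: operation 1006 uses only the sleep model and timing estimates, while operation 1010 folds in physiological information obtained at operation 1008.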
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word “comprising” or “including” does not exclude the presence of elements or steps other than those listed in a claim. In a device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements. The mere fact that certain elements are recited in mutually different dependent claims does not indicate that these elements cannot be used in combination.
Although the description provided above provides detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the disclosure is not limited to the expressly disclosed embodiments, but on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.
This application claims the benefit of U.S. Provisional Application No. 62/990,110, filed on 16 Mar. 2020 and U.S. Provisional Application No. 63/046,391, filed on 30 Jun. 2020. These applications are hereby incorporated by reference herein.