One of the most important aspects of a healthy lifestyle is achieving a good night's sleep. Over the years, many people have tried to improve the quality of their sleep, but it is not easy to quantify the sleep one has achieved using the beds available today. Instead, a more common answer to the question, “How did you sleep last night?” is a general one, such as “Fine.” It would be beneficial to measure more accurately the quantity and quality of sleep one is achieving. The user of the bed, or others, could then use the data to measure improvements in sleep as different approaches to improving sleep are attempted.
It would also be beneficial to interact with the bed in a more meaningful way. Today's beds offer consumers only limited opportunities to customize the bed and have it interact with their environment in some way. Consumers are now accustomed to using technology in their lives. It would be beneficial to use technology to provide consumers a way to tie the bed into other aspects of their environment.
The present invention is described in detail below with reference to the attached drawing figures, wherein:
The rear cavity 14 is located directly behind the headboard 18. The headboard 18 is designed to hide the rear cavity 14. The rear cavity 14 is equipped with support racks 20 (
The frame 12 is also designed with a pair of integrated end table shelves 26 (
Preferably, a pair of adjustable bed units 30 are coupled to the frame 12. It should be understood that only one adjustable bed unit 30 could be used with the bed 10. In the preferred embodiment, a pair of twin adjustable beds 30 are provided. Each bed unit 30 is individually adjustable, to provide a “his” and “hers” style. The bed units 30 are adjustable to a number of different positions. For example, the head of the bed can be raised, as can the area of the bed adjacent the knee area of the user. These adjustable beds are known generally to those in the bedding field. In a preferred embodiment, as best seen in
Each bed unit 30 preferably has a heating and cooling pad 36 installed over the mattress of the bed. Each pad 36 is coupled to a control unit housed within the rear cavity 14. The control unit can be held by the support racks 20. This allows the surface temperature of each bed unit to be individually controlled. The pad 36 is installed directly over the mattress of the bed unit 30 and has a number of fluid chambers running through it. The control unit adjusts the temperature of the water flowing through the chambers to adjust the temperature of the mattress. As one example, a mattress pad known as the ChiliPad™ marketed and sold by T2 International of Mooresville, N.C. can be used as the pad 36. An integrated heating and cooling unit is also within the scope of the present invention. Such an integrated unit replaces the pad 36 and integrates it directly into the mattress of the bedding unit 30.
The mattress of each bed unit 30 is preferably made up of three layers. The first layer 38 (
Each bed unit 30 is also provided with a sensor unit 44 (
Along with the sensor unit 44, each bed is also preferably provided with a microphone (not shown). The microphone is preferably a standard electret microphone, 100 Hz high pass and 400 Hz low pass, first order filtering, full-wave rectified and averaged with a 200 msec low pass time constant sampled at 50 samples per second. It should be understood that other microphones could be used as well.
The signals from the sensor units 44 and the microphones are used to detect the respiration, motion, pulse and snoring of a person lying on the bed unit 30. The signal is filtered using active filtering through operational amplifiers, precision resistors, capacitors and inductors. These components are arranged to create low-pass filters, high-pass filters, and/or band-pass filters. Using this filtering, the single signal coming from the sensor unit 44 can be divided into separate channels. A separate channel can be filtered from the signal for each of the respiration, motion, pulse and snoring conditions of the user. Each condition has an electronic signature, and the filtering is used to separate and identify each specific signature. If the microphone is used, the snoring condition is detected by the microphone. Each of the bed units 30 is provided with the above detection assembly. To provide separate data for each bed unit, the bed units are isolated from one another. Further use of the signaling from these sensors is described in more detail below.
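Although the patent describes the channel separation as analog filtering in hardware, the same idea can be sketched in software using simple first-order digital filters. In the sketch below, the sample rate, cutoff frequencies and synthetic test signal are illustrative assumptions only, not values taken from the text.

```python
import math

def lowpass(samples, cutoff_hz, fs):
    """First-order IIR low-pass, analogous to a single RC filter stage."""
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / fs)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

def bandpass(samples, low_hz, high_hz, fs):
    """Band-pass built from two low-pass stages: keep what passes the
    high cutoff but not the low cutoff."""
    wide = lowpass(samples, high_hz, fs)
    narrow = lowpass(samples, low_hz, fs)
    return [w - n for w, n in zip(wide, narrow)]

# Split one raw sensor trace into hypothetical channels (cutoffs and
# component frequencies are illustrative, not from the patent):
fs = 100.0  # assumed sample rate in Hz
raw = [math.sin(2 * math.pi * 0.25 * t / fs)        # slow, respiration-like component
       + 0.3 * math.sin(2 * math.pi * 1.2 * t / fs)  # faster, pulse-like component
       for t in range(1000)]
respiration = lowpass(raw, 0.5, fs)       # slow channel
pulse = bandpass(raw, 0.5, 3.0, fs)       # faster channel
```

In the analog assembly the same separation is done with passive component values chosen for each band; the digital version simply makes the cutoff choices explicit.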
As best seen in
To compensate for the image bias built into a standard projector, the projection unit 52 is rotated about a vertical axis. A standard projector includes a built-in bias that compensates for projecting upward, as from a projector sitting on a conference table, or downward, as from a projector mounted in the ceiling. The bias projects the image in a keystone shape so that the image appears square on the projection surface. Because the bias needed to produce a square image on the ceiling differs from that needed for a wall, the projector is rotated 180 degrees about the vertical axis to switch between the two, which allows a standard video projector to be used. This rotation causes the projector to reverse the image, so the image must be electronically reversed prior to projection, a reversal process known to those of skill in the art.
The bed 10 is controlled through a computing device 60, which can also be located within the headboard 18 and specifically on the support racks 20. The computing device can be a robust personal computer, or a thin-client computer coupled to a more robust computer at another location. As an example, and without limitation, the computing device 60 can be a thin-client computer coupled over a personal network to a more powerful server type computer located elsewhere within the home. The computing device 60 is used to control the bed 10, to process the signals received from the sensing units 44 and microphone, and to provide the media experience in connection with the audio and video components described above. Therefore, the signals from the sensing units 44 and microphone are passed to the computing device 60 after filtering. The use of this data is further described below.
In addition to the sensing devices and microphones, the other components of the bed 10 are also coupled to the computing device 60. The audio and video components are therefore coupled to the computing device 60, as are the motors used to control the orientation of each bedding unit 30. Similarly, the control unit of each cooling pad 36 is coupled to the computing device 60. Other environmental room appliances are also preferably coupled to the computing device 60. These environmental room appliances are typically web services devices (WSD) and can include, for example, such things as alarm clocks, automatic window shades, room lighting, home security cameras, thermostats and phones. It should be understood that other electronic devices could also be coupled to the computing device 60, as will be better understood from the use scenarios described below.
Preferably, the computing device 60 is a media personal computer equipped to provide storage and retrieval of videos, music and images. The computing device 60 is also preferably equipped to receive cable or satellite television signals. Any of a number of computing devices 60 available today and running a media operating system such as the Windows Media Center® software available from the Microsoft Corporation of Redmond, Wash. are acceptable. Such an operating system utilizes a user interface that is remote friendly, and operable at a distance without the use of a keyboard. In the preferred embodiment, the user interface is operable using a radio-frequency (RF) remote. The software provides easy access to, for example, stored video, cable or satellite signals, stored images, and stored audio files. Using the computing device 60, and software modified to accommodate control of the bed positions, media and room conditions can be controlled using a single RF remote.
The computing device 60 is programmed to include a selectable icon to control settings for the bed 10 and the environment for the bed. The settings, for example, can be accessed through a “My Bed” icon programmed into the software. Using the software, preprogrammed settings can be provided to users. These settings are virtually limitless. An entry user interface can be displayed, such as that shown in
For example, a “Reading” setting can be programmed into the software. When the “Reading” setting is activated, the computing device 60 can be programmed to adjust the bed 10 and the room environment. This could include raising the head of the bedding unit 30 on the appropriate side (i.e. the appropriate one of the bedding units 30), turning on the lights to accommodate reading, adjusting the temperature of the bed if desired, and turning down/off the volume of any audio currently playing. Other settings are also preferably provided, and can include a “Sleep” setting, where the bed is adjusted to a flat position, the lights are turned off as is any currently playing audio and/or video. A “Video” or “TV” setting can also be programmed into the computing device 60. In such a setting, the user may be provided an option of a forward projection or upward projection of the image. The bed and projection will be adjusted accordingly. For example, if the user desires a forward projection, the image is projected forwardly and the bed is adjusted so that the person in the bed is in more of a seated position, looking forwardly. In addition, the computing device 60 will extend the audio speakers 22 and 24 with the “Video” or “TV” setting activated. Anytime a setting is selected requiring audio, the speakers are extended. The speakers 22 and 24 are retracted when a setting is selected, such as “Sleep” where audio is not desired.
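The preset behavior described above amounts to a lookup table mapping a setting name to a group of device commands. The sketch below illustrates one way the computing device 60 might represent such presets; all device names, angles and values are hypothetical, not specified in the text.

```python
# Hypothetical preset table; the keys and values are illustrative only.
PRESETS = {
    "Reading": {"head_angle": 40, "lights": "on",  "audio_volume": 0,  "speakers": "retracted"},
    "Sleep":   {"head_angle": 0,  "lights": "off", "audio_volume": 0,  "speakers": "retracted"},
    "TV":      {"head_angle": 55, "lights": "dim", "audio_volume": 30, "speakers": "extended"},
}

def apply_preset(name, side="left"):
    """Return the command set the computing device would issue for a
    preset, applied to one side ("his" or "hers") of the bed."""
    settings = PRESETS[name]
    return {"side": side, **settings}
```

A real implementation would dispatch each field to the corresponding motor, light or audio controller; the table form makes new presets easy to add.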
Preferably, all of the bedding controls and room environment controls are also individually accessible through the distance user interface of the computing device 60. Using a remote, a user of the bed can therefore individually control the position of the bed, as well as the temperature and other operational aspects of the bed 10, such as the massage feature. The user can also individually control the available media. This allows a user to turn on the TV or video available, for example, without adjusting the bed or other room conditions.
Diagnostic Monitoring
As described above, the bed 10 is able to detect a person's pulse, respiration, major movements and snoring using the sensing units 44 and the microphone. The signals from the sensing unit 44 and microphone are delivered to the computing device 60. The computing device 60 records this diagnostic information about the person. The diagnostic measurements can be initiated by the user or can be set to begin measurement at a certain time, or whenever the system determines the user is in the bed. For example, the system can determine a person is in the bed when pulse and respiration are detected for a certain length of time, or by using the load cell to detect presence. The system can then begin recording data for the sleep session of the user.
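The in-bed determination described above can be sketched as a simple run-length check over per-second detection samples. The 30-second confirmation window below is an assumed value; the text says only that pulse and respiration must be detected "for a certain length of time."

```python
def detect_presence(samples, required_seconds=30):
    """samples: list of (pulse_detected, respiration_detected) booleans
    taken once per second. Returns the sample index at which presence is
    confirmed, or None if it never is. The 30-second window is an
    assumption, not a value from the text."""
    run = 0
    for i, (pulse, resp) in enumerate(samples):
        run = run + 1 if (pulse and resp) else 0
        if run >= required_seconds:
            return i
    return None
```

Once presence is confirmed, the system would start recording the sleep session; a load cell reading could be substituted for, or combined with, the pulse/respiration test.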
The bed 10 can therefore provide data regarding the quality of sleep achieved during any sleep session. The sensing units 44 provide data to the computing device 60 which can then record and deliver the data to an interested person. For example, the computing device 60 can provide the data to the user of the bed, and can compare data from different time periods.
The bed 10, using the computing device 60, can be used to provide the sleep data to the user in the morning to provide a quick “sleep summary” to the user. This can be provided through the display using the video projector 52, or can be delivered through the network to any of a number of devices. For example, the summary data can be provided to the user's cell phone, personal digital assistant or to another computer, such as the user's work computer, through an available network, such as the Internet, a LAN or WAN. Moreover, should the user desire and authorize such activity, the data could be sent to another person, such as the user's physician.
In addition to the sleep summary data shown in
The data can be used to calculate the quality of the sleep achieved during any sleep session. This calculation can factor in the total time a person is in bed, the number of major movements during the sleep session, the number of times a user left the bed, any respiratory interruptions and any snoring activity. Basically, all or part of the data collected during a sleep session can be used to calculate the quality of sleep, or “rest factor,” for any given sleep session. This rest factor can then be compared with previously calculated rest factors to indicate whether the sleep quality achieved is improving or deteriorating. Adjustments can be made to the sleeping environment, the person's lifestyle (such as diet and exercise) and such things as medication. The effectiveness of these adjustments can then be determined by comparing the before and after rest factors.
For example, and without limitation, assume the sensing unit determines a person enters bed at 10:15 pm, and gets out of bed in the morning at 6:15 am (see
Another exemplary formula for indicating the quality of sleep obtained by a person, or rest factor, is represented by the formula:
Rest Factor=(A*SnoreFactor+B*ApneaFactor+C*MovementFactor+D*ExitFactor+E*SleepFactor)/(A+B+C+D+E).
In this calculation:
SnoreFactor=100−0.5*Number of Snore Events;
ApneaFactor=100−5.0*Number of Apnea Events;
MovementFactor=100−0.5*Number of Movement Events;
ExitFactor=100−5.0*Number of Exit Events; and
SleepFactor=100−|8−Number of hours in bed|.
Each of A, B, C, D and E is a constant. In the currently preferred embodiment, the constants are each equal to one, but each of the constants could be a different number. It should of course be understood that the formula and examples above are only examples, and that other formulas could be used, with different weights given to different factors.
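The rest factor formula above transcribes directly into code. The sketch below uses the stated sub-factor definitions, with the weights A through E defaulting to one as in the currently preferred embodiment.

```python
def rest_factor(snores, apneas, movements, exits, hours_in_bed,
                a=1.0, b=1.0, c=1.0, d=1.0, e=1.0):
    """Weighted rest factor per the formula in the text.
    Event counts come from the sleep-session data; the weights
    default to one per the preferred embodiment."""
    snore_factor = 100 - 0.5 * snores
    apnea_factor = 100 - 5.0 * apneas
    movement_factor = 100 - 0.5 * movements
    exit_factor = 100 - 5.0 * exits
    sleep_factor = 100 - abs(8 - hours_in_bed)
    return (a * snore_factor + b * apnea_factor + c * movement_factor
            + d * exit_factor + e * sleep_factor) / (a + b + c + d + e)
```

An event-free eight-hour night scores a perfect 100; each snore event costs half a point of its sub-factor, while the more serious apnea and bed-exit events cost five points each.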
Using this rest factor formula, the quality of sleep during the night can be calculated and presented to the user, as shown in exemplary
Also, as stated above, the diagnostic monitoring can be specifically activated by the user through the user interface, or the monitoring can be triggered by another event, such as a user entering the bed, a specific time, or a diagnostic event, such as snoring.
In addition to calculating the quality of rest, the signals generated by the sensing units 44 and microphones can be used as triggers to affect the sleeping environment of the person. As one example, if the sensing units 44 and/or the microphones detect a snoring event, the computing device 60 can slightly raise the head of the bedding unit 30 on which the person is sleeping. As an example, the head of the bed could be raised by seven degrees. The system continues to monitor for snoring, and if the snoring continues, the head of the bed can be raised further. This monitoring and raising can be programmed to occur automatically and can continue up to some predetermined maximum raised position, such as thirty-five degrees. Once the snoring has stopped for a set period of time, such as five minutes, the bed 10 can react by lowering the head of the bed to the horizontal, standard sleeping position. It should be understood that the amount of each head raise, and the length of time between each raise, can be customized to best accommodate each individual user, although it is preferable to set the system with a standard default response system.
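The escalation-and-reset behavior described above can be sketched as a small state machine. The step size, ceiling and quiet interval below are the example values given in the text (seven degrees per step, a thirty-five-degree maximum, five quiet minutes before lowering); the once-per-minute polling interval is an assumption.

```python
class SnoreResponder:
    """Sketch of the automatic snore response: raise the head of the bed
    in steps while snoring persists, and return it to flat after a
    sustained quiet period."""

    def __init__(self, step_deg=7, max_deg=35, quiet_minutes=5):
        self.step_deg = step_deg
        self.max_deg = max_deg
        self.quiet_minutes = quiet_minutes
        self.head_angle = 0
        self.quiet_elapsed = 0

    def on_minute(self, snoring):
        """Call once per minute with the current snore-detection result;
        returns the commanded head angle in degrees."""
        if snoring:
            self.quiet_elapsed = 0
            self.head_angle = min(self.head_angle + self.step_deg, self.max_deg)
        else:
            self.quiet_elapsed += 1
            if self.quiet_elapsed >= self.quiet_minutes:
                self.head_angle = 0   # return to the flat sleeping position
        return self.head_angle
```

The per-user customization the text mentions maps directly onto the constructor parameters.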
Other detected events can also be used as change triggers. Any respiratory interruptions, such as those common in people suffering from sleep apnea, can be used as a trigger to provide an appropriate response. Should a respiratory interruption occur, the head of the bed could be raised, the massage units activated, or some other responsive action taken in an attempt to halt the respiratory interruption. As another example, should the sensing units 44 detect the user leaving the bed, the computing device 60 can communicate with the coupled WSDs to assist the person in some way. More specifically, if the sensing units 44 detect the user leaving the bed, the computing device can adjust the lighting, such as by illuminating a path to the restroom.
The bed 10 can also be programmed to automatically change the bed orientation, condition and room environment as a function of events or conditions. As an example, and without limitation, the cooling pad 36 can be programmed to adjust the temperature of the bedding unit 30 as a function of time, either making the bed cooler or warmer as the sleep session progresses. Additionally, the cooling pad 36 can be coupled to the computing device 60 and can be controlled to automatically adjust the temperature of the cooling pad as changes in temperature of the bed environment are detected. In this example, a temperature sensing device is included and is used to provide feedback to the computing device 60. If the temperature of the sleeping environment increases above a predetermined point, the cooling pad 36 is activated to lower the temperature. Similarly, if the temperature of the sleeping environment drops below a predetermined point, the pad 36 is activated to raise the temperature.
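The temperature feedback loop described above can be sketched as a simple threshold controller. The text specifies only that predetermined set points are used; the particular temperatures below are illustrative assumptions.

```python
def pad_command(room_temp_c, cool_above=24.0, warm_below=18.0):
    """Decide whether the cooling pad should cool, warm, or idle based on
    the sensed sleeping-environment temperature. The 24 °C and 18 °C
    set points are hypothetical, not values from the text."""
    if room_temp_c > cool_above:
        return "cool"
    if room_temp_c < warm_below:
        return "warm"
    return "idle"
```

The computing device 60 would call this on each reading from the temperature sensing device and forward the result to the pad's control unit; a practical controller would also add hysteresis so the pad does not toggle rapidly near a set point.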
Using the computing device 60 coupled to the bed 10 also provides opportunities for different waking experiences. For example, the computing device 60 can be programmed to turn on the television at a certain time and/or to wake the person with a gentle massage. The user could also wake to a screen providing the sleep summary data.
All of the monitoring and responsive actions described above can be customized by the user of the bed. Additionally, the user can adjust or turn off any of the monitoring as desired, or can adjust the sensitivity of the system. This allows users to activate any responsive actions only upon more severe snoring events, for example.
This application claims priority to U.S. Provisional application No. 61/018,805, filed Jan. 3, 2008.
Published as US 2009/0177327 A1, Jul. 2009 (US).