Various embodiments relate generally to home automation devices and to the gathering and analysis of human biological signals.
According to current scientific research into sleep, there are two major types of sleep: rapid eye movement (“REM”) sleep and non-REM sleep. Non-REM sleep comes first, followed by a shorter period of REM sleep, and then the cycle starts over again.
There are three stages of non-REM sleep. Each stage can last from 5 to 15 minutes. A person goes through all three stages before reaching REM sleep.
In stage one, a person's eyes are closed, but the person is easily woken up. This stage may last for 5 to 10 minutes.
In stage two, a person is in light sleep. A person's heart rate slows and the person's body temperature drops. The person's body is getting ready for deep sleep.
Stage three is the deep sleep stage. A person is harder to rouse during this stage, and if the person were woken up, the person would feel disoriented for a few minutes. During the deep stages of non-REM sleep, the body repairs and regrows tissues, builds bone and muscle, and strengthens the immune system.
REM sleep typically begins about 90 minutes after a person falls asleep. Dreams typically happen during REM sleep. The first period of REM typically lasts about 10 minutes. Each later REM period gets longer, and the final one may last up to an hour. During REM sleep, a person's heart rate and breathing quicken. A person can have intense dreams during REM sleep, since the brain is more active. REM sleep also affects the learning of certain mental skills.
Even in today's technological age, supporting healthy sleep is relegated to technology of the past, such as an electric blanket, a heated pad, or a bed warmer. The most advanced of these technologies, the electric blanket, is a blanket with an integrated electrical heating device which can be placed above the top bed sheet or below the bottom bed sheet. The electric blanket may be used to pre-heat the bed before use or to keep the occupant warm while in bed. However, operating the electric blanket requires the user to remember to manually turn it on, and then to manually turn it off. Further, the electric blanket provides no additional functionality besides warming the bed.
Introduced are a bed device system and methods for: gathering human biological signals, such as heart rate, breathing rate, or temperature; analyzing the gathered human biological signals; and controlling the bed device system based on the analysis.
In one embodiment of the invention, one or more user sensors, associated with a piece of furniture, such as a bed, measure the bio signals associated with a user, such as the heart rate or the breathing rate associated with said user. One or more environment sensors measure an environment property, such as temperature, humidity, light, or sound. Based on the bio signals associated with said user and the environment properties received, the system determines the time at which to send an instruction to an appliance to turn on or to turn off. In one embodiment, the appliance is a bed device capable of heating or cooling the user's bed. In another embodiment, the appliance is a thermostat, a light, a coffee machine, or a humidifier.
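By way of illustration only, the following Python sketch shows one way the decision described above could be computed from a single reading of bio signals and environment properties. It is a minimal sketch under assumed conditions; the `Reading` structure, the thresholds, and the returned command strings are hypothetical and are not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    heart_rate_bpm: float      # from a user sensor in the bed
    breathing_rate_bpm: float  # breaths per minute
    bed_temp_c: float          # from an environment sensor
    ambient_light_lux: float   # from an environment sensor

def decide_bed_device_command(reading: Reading,
                              target_temp_c: float = 32.0,
                              light_threshold_lux: float = 5.0) -> str:
    """Return 'heat', 'cool', or 'off' for a hypothetical bed device.

    Assumes the user is treated as "in bed" when a heart rate is detected,
    and that low ambient light indicates night time.
    """
    user_in_bed = reading.heart_rate_bpm > 0
    lights_low = reading.ambient_light_lux < light_threshold_lux
    if not (user_in_bed and lights_low):
        return "off"
    if reading.bed_temp_c < target_temp_c - 1.0:
        return "heat"
    if reading.bed_temp_c > target_temp_c + 1.0:
        return "cool"
    return "off"

# Example: a cool bed at night while the user is in it -> "heat"
print(decide_bed_device_command(Reading(58, 14, 26.0, 1.0)))
```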
In another embodiment of the invention, based on the heart rate, temperature, and breathing rate associated with a user, the system determines the sleep phase associated with said user. Based on the sleep phase and a user-specified wake-up time, the system determines a time at which to wake up the user, so that the user does not feel tired or disoriented when woken up.
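As an illustrative sketch of how a wake-up time might be chosen from a predicted sleep-phase timeline and a user-specified alarm, the following Python example selects a light-sleep moment within a window before the alarm. The window length, the phase labels, and the timeline format are assumptions made for the example only.

```python
from datetime import datetime, timedelta

def choose_wake_time(phase_timeline, alarm, window_minutes=30):
    """Pick a wake-up time at which the user is expected to be in light sleep.

    phase_timeline: list of (datetime, phase) samples in time order, with
                    phase in {"light", "deep", "rem"}, assumed to be
                    predicted from heart rate, temperature, and breathing rate.
    alarm:          the user-specified wake-up time (datetime).
    Returns the earliest light-sleep moment inside [alarm - window, alarm],
    or the alarm time itself as a fallback.
    """
    window_start = alarm - timedelta(minutes=window_minutes)
    for t, phase in phase_timeline:
        if window_start <= t <= alarm and phase == "light":
            return t
    return alarm

# Hypothetical example: predicted phases leading up to a 7:00 alarm.
alarm = datetime(2016, 1, 1, 7, 0)
timeline = [(alarm - timedelta(minutes=m), p)
            for m, p in [(40, "deep"), (25, "deep"), (15, "light"), (5, "rem")]]
print(choose_wake_time(timeline, alarm))  # 06:45, during light sleep
```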
These and other objects, features and characteristics of the present embodiments will become more apparent to those skilled in the art from a study of the following detailed description in conjunction with the appended claims and drawings, all of which form a part of this specification. While the accompanying drawings include illustrations of various embodiments, the drawings are not intended to limit the claimed subject matter.
Examples of a method, apparatus, and computer program for automating the control of home appliances and improving the sleep environment are disclosed below. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. One skilled in the art will recognize that the embodiments of the invention may be practiced without these specific details or with an equivalent arrangement. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments of the invention.
Brief definitions of terms, abbreviations, and phrases used throughout this application are given below.
In this specification, the terms “biological signal” and “bio signal” are synonyms and are used interchangeably.
Reference in this specification to “sleep phase” means light sleep, deep sleep, or REM sleep. Light sleep comprises stage one and stage two of non-REM sleep.
Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described that may be exhibited by some embodiments and not by others. Similarly, various requirements are described that may be requirements for some embodiments but not others.
Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof, mean any connection or coupling, either direct or indirect, between two or more elements. The coupling or connection between the elements can be physical, logical, or a combination thereof. For example, two devices may be coupled directly, or via one or more intermediary channels or devices. As another example, devices may be coupled in such a way that information can be passed therebetween, while not sharing any physical connection with one another. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
If the specification states a component or feature “may,” “can,” “could,” or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
The term “module” refers broadly to software, hardware, or firmware components (or any combination thereof). Modules are typically functional components that can generate useful data or another output using specified input(s). A module may or may not be self-contained. An application program (also called an “application”) may include one or more modules, or a module may include one or more application programs.
The terminology used in the Detailed Description is intended to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with certain examples. The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. For convenience, certain terms may be highlighted, for example using capitalization, italics, and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that the same element can be described in more than one way.
Consequently, alternative language and synonyms may be used for any one or more of the terms discussed herein, but special significance is not to be placed upon whether or not a term is elaborated or discussed herein. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification, including examples of any terms discussed herein, is illustrative only and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.
Bed Device
The processor 230 is any type of microcontroller, or any processor in a mobile terminal, fixed terminal, or portable terminal including a mobile handset, station, unit, device, multimedia computer, multimedia tablet, Internet node, cloud computer, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system (PCS) device, personal navigation device, personal digital assistants (PDAs), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, game device, the accessories and peripherals of these devices, or any combination thereof.
According to one embodiment, even if the users switch sides of the bed, the system will correctly identify which user is sleeping in which zone by identifying the user based on any of the following signals alone, or in combination: heart rate, breathing rate, body motion, or body temperature associated with said user. The system can also identify the user by receiving from a user device associated with the user an identification (ID) associated with the user. For example, the user can specify the user ID of the person sleeping on the sensor strip. If there are multiple sensor strips and/or multiple sensors, the user can specify the ID of the person associated with each sensor strip and/or each sensor.
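One simple way to perform such identification is to compare the current measurements against stored per-user baselines and pick the closest match. The following Python sketch illustrates this with a nearest-profile comparison; the feature names, baseline values, and distance measure are illustrative assumptions, not a description of the actual identification algorithm.

```python
def identify_user(measurement, profiles):
    """Match a zone's current measurements to the closest stored user profile.

    measurement: dict with keys "heart_rate", "breathing_rate", "body_temp".
    profiles:    {user_id: dict with the same keys}, each user's typical values.
    Returns the user_id whose profile is closest (smallest squared distance).
    """
    def distance(profile):
        return sum((measurement[k] - profile[k]) ** 2 for k in measurement)

    return min(profiles, key=lambda user_id: distance(profiles[user_id]))

profiles = {
    "alice": {"heart_rate": 55, "breathing_rate": 13, "body_temp": 36.4},
    "bob":   {"heart_rate": 68, "breathing_rate": 16, "body_temp": 36.9},
}
# The left zone is currently reading values closest to Bob's baseline.
print(identify_user({"heart_rate": 66, "breathing_rate": 15, "body_temp": 36.8},
                    profiles))  # "bob"
```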
In another embodiment, the power supply associated with the heating coil 600 is divided into a plurality of zones, each power supply zone corresponding to a subzone 620, 630, 640, 650, 670, 680, 690, 695. The user can control the temperature of each subzone 620, 630, 640, 650, 670, 680, 690, 695 independently. Further, each user can independently specify the temperature preferences for each of the subzones. Even if the users switch sides of the bed, the system will correctly identify the user, retrieve the user identification (user ID), and the preferences associated with the user by identifying the user based on any of the following signals alone, or in combination: heart rate, breathing rate, body motion, or body temperature associated with said user. According to another embodiment, if the users switch sides of the bed, the system receives the user ID of the new user from a user device associated with the user, and retrieves the preferences associated with the user.
At block 720, the process determines the control signal and the time to send a control signal. At block 730, the process sends the control signal to the bed device. For example, if the user is in bed, the bed temperature is low, and the ambient light is low, the process sends a control signal to the bed device. The control signal comprises an instruction to heat the bed device to the average nightly temperature associated with the user. According to another embodiment, the control signal comprises an instruction to heat the bed device to a user-specified temperature. Similarly, if the user is in bed, the bed temperature is high, and the ambient light is low, the process sends a control signal to the bed device to cool the bed device to the average nightly temperature associated with the user. According to another embodiment, the control signal comprises an instruction to cool the bed device to a user-specified temperature.
In another embodiment, in addition to obtaining the biological signal associated with the user, and the environment property, the process obtains a history of biological signals associated with the user. The history of biological signals can be stored in a database associated with the bed device, or in a database associated with a user. The history of biological signals comprises the average time at which the user went to sleep for each day of the week; that is, the history of biological signals comprises the average bedtime associated with the user on Monday, the average bedtime associated with the user on Tuesday, etc. For a given day of the week, the process determines the average bedtime associated with the user for that day of the week, and sends the control signal to the bed device, allowing enough time for the bed to reach the desired temperature, before the average bedtime associated with the user. The control signal comprises an instruction to heat or cool the bed to a desired temperature. The desired temperature may be automatically determined, such as by averaging the historical nightly temperature associated with a user, or the desired temperature may be specified by the user.
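The scheduling described above can be illustrated with a short Python sketch that looks up the average bedtime for the current day of the week and subtracts a warm-up allowance. The per-weekday bedtimes and the 45-minute warm-up time are placeholder values for the example only.

```python
from datetime import datetime, timedelta

# Hypothetical per-weekday average bedtimes derived from the user's history
# (0 = Monday ... 6 = Sunday).
average_bedtime_by_weekday = {
    0: "23:15", 1: "23:00", 2: "23:10", 3: "23:30",
    4: "00:10", 5: "00:30", 6: "22:45",
}

def schedule_preheat(now: datetime, warmup_minutes: int = 45) -> datetime:
    """Return the time at which to send the heat command to the bed device,
    early enough for the bed to reach the desired temperature by the average
    bedtime for today's day of the week."""
    hour, minute = map(int, average_bedtime_by_weekday[now.weekday()].split(":"))
    bedtime = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if bedtime < now:  # a bedtime after midnight rolls over to the next day
        bedtime += timedelta(days=1)
    return bedtime - timedelta(minutes=warmup_minutes)

print(schedule_preheat(datetime(2016, 1, 4, 21, 0)))  # Monday -> send at 22:30
```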
Bio Signal Processing
The technology disclosed here categorizes the sleep phase associated with a user as light sleep, deep sleep, or REM sleep. Light sleep comprises stage one and stage two sleep. The technology performs the categorization based on the breathing rate associated with said user, heart rate associated with said user, motion associated with said user, and body temperature associated with said user. Generally, when said user is awake, the breathing is erratic. When the user is sleeping, the breathing becomes regular. The transition between being awake and sleeping is quick and lasts less than 1 minute.
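A highly simplified sketch of such a categorization is shown below in Python. The breathing-variability, heart-rate, and motion thresholds are illustrative placeholders only; the disclosure does not specify particular numeric values.

```python
import statistics

def classify_sleep_phase(breath_intervals_s, heart_rate_bpm, motion_level):
    """Very simplified sleep-phase categorization.

    breath_intervals_s: recent breath-to-breath intervals in seconds.
    heart_rate_bpm:     current heart rate.
    motion_level:       0.0 (still) .. 1.0 (restless).
    Returns "awake", "light", "deep", or "rem".
    """
    # Erratic breathing (high variability) or heavy motion suggests the user is awake.
    variability = (statistics.pstdev(breath_intervals_s)
                   / statistics.mean(breath_intervals_s))
    if variability > 0.25 or motion_level > 0.5:
        return "awake"
    # Regular breathing: distinguish phases by heart rate and motion.
    if heart_rate_bpm > 62:   # elevated heart rate while still ~ REM
        return "rem"
    if motion_level < 0.1 and heart_rate_bpm < 55:
        return "deep"
    return "light"

print(classify_sleep_phase([4.0, 4.1, 3.9, 4.0],
                           heart_rate_bpm=52, motion_level=0.05))  # "deep"
```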
In one embodiment, the process keeps track of a likelihood that the user is asleep. The process can take into account the environment property and/or the sleep state associated with the user to determine the likelihood that the user is asleep. If the likelihood that the user is asleep is above a specified threshold, such as above 0.5, the process determines that the user is asleep, and takes corresponding actions.
To calculate the likelihood that the user is asleep, the process takes into account the environment property such as the current time, and compares the current time to the average bedtime associated with the user. If the current time is greater than the average bedtime associated with the user, the process increases the likelihood that the user is asleep. If the light intensity is lower than an average light intensity associated with the space where the environment sensor is placed, the process also increases the likelihood that the user is asleep. If the sound intensity is lower than an average sound intensity associated with the space where the environment sensor is placed, the process increases the likelihood that the user is asleep. Similarly, if the light intensity is higher than the average light intensity, the process decreases the likelihood that the user is asleep, and if the sound intensity is higher than the average sound intensity, the process decreases the likelihood that the user is asleep. The average associated with the environment property, such as the average light intensity and the average sound intensity, can be stored and/or retrieved from a database associated with the environment property, which can be the same database as the database associated with the user.
The process can increase or decrease the likelihood that the user is asleep based on the sleep state associated with the user, and the confidence level associated with the sleep state determination. For example, if the sleep state associated with the user is awake, the process decreases the likelihood that the user is asleep by an amount corresponding to the confidence level associated with the awake sleep state. For example, the process multiplies the likelihood that the user is asleep by (1 - the confidence level). Similarly, if the sleep state associated with the user is asleep, the process increases the likelihood that the user is asleep by an amount corresponding to the confidence level associated with the sleep state. For example, the process multiplies the likelihood that the user is asleep by the confidence level.
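The following Python sketch combines the cues described above into a single likelihood value. The starting value, the step size, and the clamping to the range [0, 1] are illustrative choices; the disclosure only describes increasing or decreasing the likelihood and scaling it by the confidence level.

```python
def asleep_likelihood(current_time_min, avg_bedtime_min,
                      light, avg_light, sound, avg_sound,
                      sleep_state, confidence, step=0.15):
    """Combine environment cues and the sleep-state estimate into a single
    likelihood that the user is asleep, clamped to [0, 1]."""
    likelihood = 0.5  # neutral starting point

    # Environment cues: later than the usual bedtime, darker than usual,
    # and quieter than usual all make sleep more likely.
    likelihood += step if current_time_min > avg_bedtime_min else -step
    likelihood += step if light < avg_light else -step
    likelihood += step if sound < avg_sound else -step

    # Sleep-state estimate weighted by its confidence level.
    if sleep_state == "asleep":
        likelihood *= confidence
    else:  # "awake"
        likelihood *= (1.0 - confidence)

    return max(0.0, min(1.0, likelihood))

# 23:50 vs. a 23:00 average bedtime, dark and quiet room, sensor says asleep.
p = asleep_likelihood(23 * 60 + 50, 23 * 60, light=2, avg_light=40,
                      sound=20, avg_sound=35, sleep_state="asleep",
                      confidence=0.9)
print(p, p > 0.5)  # ~0.855 True -> treat the user as asleep
```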
Based on the likelihood that the user is asleep, the process, at block 1050, sends an appropriate control signal to an appliance. For example, if the user is asleep, the process sends the control signal to the thermostat to adjust the temperature to the average nightly temperature. Further, if the user is asleep and the lights are on, the process turns off the lights. Similarly, if the user is asleep and a media device is on, such as a television, a mobile device, a PlayStation, etc., the process turns off the media device. If the user is asleep, and the humidifier is off, the process sends a control signal to the humidifier to turn on. If the user is asleep, the process sends a control signal to the locks to engage. If the user is awake, the process sends the control signal to: the thermostat to adjust the temperature to the average temperature associated with the current time, the coffee maker to start making coffee, the humidifier to turn off, etc.
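As an illustration of such dispatching, the Python sketch below maps the asleep/awake determination and a few appliance states to a list of commands. The appliance names and command strings are placeholders for whatever control interface the appliances actually expose.

```python
def appliance_commands(user_asleep: bool, lights_on: bool, media_on: bool,
                       humidifier_on: bool, avg_night_temp_c: float = 19.0):
    """Return a list of (appliance, command) pairs for the current state."""
    commands = []
    if user_asleep:
        commands.append(("thermostat", f"set {avg_night_temp_c:.1f}C"))
        if lights_on:
            commands.append(("lights", "off"))
        if media_on:
            commands.append(("media", "off"))
        if not humidifier_on:
            commands.append(("humidifier", "on"))
        commands.append(("locks", "engage"))
    else:
        commands.append(("thermostat", "set daytime schedule"))
        commands.append(("coffee_maker", "brew"))
        if humidifier_on:
            commands.append(("humidifier", "off"))
    return commands

# The user just fell asleep with the lights still on.
for appliance, command in appliance_commands(True, lights_on=True,
                                             media_on=False, humidifier_on=False):
    print(appliance, command)
```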
Smart Home
The processor 1100 is any type of microcontroller, or any processor in a mobile terminal, fixed terminal, or portable terminal, including a mobile handset, station, unit, device, multimedia computer, multimedia tablet, Internet node, cloud computer, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system (PCS) device, personal navigation device, personal digital assistants (PDAs), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, game device, the accessories and peripherals of these devices, or any combination thereof.
The processor 1100 can be connected to the user sensor 1140, 1150, or the environment sensor 1160, 1170 by a computer bus, such as an I2C bus. Also, the processor 1100 can be connected to the user sensor 1140, 1150, or environment sensor 1160, 1170 by a communication network 1110. By way of example, the communication network 1110 connecting the processor 1100 to the user sensor 1140, 1150, or the environment sensor 1160, 1170, includes one or more networks such as a data network, a wireless network, a telephony network, or any combination thereof. The data network may be any local area network (LAN), metropolitan area network (MAN), wide area network (WAN), public data network (e.g., the Internet), short range wireless network, or any other suitable packet-switched network, such as a commercially owned, proprietary packet-switched network, e.g., a proprietary cable or fiber-optic network, and the like, or any combination thereof. In addition, the wireless network may be, for example, a cellular network and may employ various technologies, including enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., worldwide interoperability for microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (WiFi), wireless LAN (WLAN), Bluetooth®, Internet Protocol (IP) data casting, satellite, mobile ad-hoc network (MANET), and the like, or any combination thereof.
According to another embodiment, at block 1300, the process obtains a current biological signal associated with a user from a sensor associated with said user. At block 1310, the process also obtains environment data, such as the ambient light, from an environment sensor associated with a bed device. Based on the current biological signal, the process identifies whether the user is asleep. If the user is asleep and the lights are on, the process sends an instruction to turn off the lights. In another embodiment, if the user is asleep, the lights are off, and the ambient light is high, the process sends an instruction to the blinds to shut. In another embodiment, if the user is asleep, the process sends an instruction to the locks to engage.
In another embodiment, the process, at block 1300, obtains a history of biological signals, such as at what time the user goes to bed on a particular day of the week (e.g., the average bedtime associated with said user on Monday, the average bedtime associated with said user on Tuesday, etc.). The history of biological signals can be stored in a database associated with the bed device, or in a database associated with a user. Alternatively, the user may specify a bedtime for the user for each day of the week. Further, the process obtains the exercise data associated with said user, such as the number of hours the user spent exercising, or the heart rate associated with said user during exercising. According to one embodiment, the process obtains the exercise data from a user's phone, a wearable device, a Fitbit bracelet, or a database associated with said user. Based on the average bedtime for that day of the week and the exercise data during the day, the process, at block 1320, determines the expected bedtime associated with said user that night. The process then sends an instruction to the bed device to heat to a desired temperature, before the expected bedtime. The desired temperature can be specified by the user, or the desired temperature can be determined automatically, based on the average nightly temperature associated with said user.
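The following Python sketch illustrates one way the expected bedtime could be derived from the historical average and the day's exercise data, and when the heat instruction could be sent. The assumption that each hour of exercise shifts bedtime earlier by a fixed number of minutes is purely illustrative; any learned relationship between exercise data and bedtime could be substituted.

```python
from datetime import datetime, timedelta

def expected_bedtime(avg_bedtime: datetime, exercise_hours: float,
                     shift_minutes_per_hour: float = 10.0) -> datetime:
    """Estimate tonight's bedtime from the historical average for this day of
    the week and today's exercise data (illustrative linear adjustment)."""
    shift = timedelta(minutes=shift_minutes_per_hour * exercise_hours)
    return avg_bedtime - shift

avg_monday_bedtime = datetime(2016, 1, 4, 23, 15)   # from the user's history
tonight = expected_bedtime(avg_monday_bedtime, exercise_hours=1.5)
preheat_at = tonight - timedelta(minutes=45)        # allow time to reach temperature
print(tonight, preheat_at)  # 23:00 expected bedtime, heat command sent at 22:15
```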
Monitoring of Biological Signals
Biological signals associated with a person, such as a heart rate or a breathing rate, indicate said person's state of health. Changes in the biological signals can indicate an immediate onset of a disease, or a long-term trend that increases the risk of a disease associated with said person. Monitoring the biological signals for such changes can predict the onset of a disease, can enable calling for help when the onset of the disease is immediate, or can provide advice to the person if the person is exposed to a higher risk of the disease in the long-term.
The user device 1520 is any type of mobile terminal, fixed terminal, or portable terminal, including a mobile handset, station, unit, device, multimedia computer, multimedia tablet, Internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system (PCS) device, personal navigation device, personal digital assistants (PDAs), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, game device, the accessories and peripherals of these devices, or any combination thereof.
The processor 1500 is any type of microcontroller, or any processor in a mobile terminal, fixed terminal, or portable terminal, including a mobile handset, station, unit, device, multimedia computer, multimedia tablet, Internet node, cloud computer, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system (PCS) device, personal navigation device, personal digital assistants (PDAs), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, game device, the accessories and peripherals of these devices, or any combination thereof.
The processor 1500 can be connected to the user sensor 1530, 1540 by a computer bus, such as an I2C bus. Also, the processor 1500 can be connected to the user sensor 1530, 1540 by a communication network 1510. By way of example, the communication network 1510 connecting the processor 1500 to the user sensor 1530, 1540 includes one or more networks, such as a data network, a wireless network, a telephony network, or any combination thereof. The data network may be any local area network (LAN), metropolitan area network (MAN), wide area network (WAN), a public data network (e.g., the Internet), short range wireless network, or any other suitable packet-switched network, such as a commercially owned, proprietary packet-switched network, e.g., a proprietary cable or fiber-optic network, and the like, or any combination thereof. In addition, the wireless network may be, for example, a cellular network and may employ various technologies, including enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., worldwide interoperability for microwave access (WiMAX), Long Term Evolution (LTE) networks, code division multiple access (CDMA), wideband code division multiple access (WCDMA), wireless fidelity (WiFi), wireless LAN (WLAN), Bluetooth®, Internet Protocol (IP) data casting, satellite, mobile ad-hoc network (MANET), and the like, or any combination thereof.
According to one embodiment, the process of
According to another embodiment, the process of
According to one embodiment, the process of
In the example of
This disclosure contemplates the computer system 1900 taking any suitable physical form. As an example and not by way of limitation, computer system 1900 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, or a combination of two or more of these. Where appropriate, computer system 1900 may include one or more computer systems 1900; be unitary or distributed; span multiple locations; span multiple machines; or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 1900 may perform, without substantial spatial or temporal limitation, one or more steps of one or more methods described or illustrated herein. As an example and not by way of limitation, one or more computer systems 1900 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 1900 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
The processor may be, for example, a conventional microprocessor such as an Intel Pentium microprocessor or a Motorola PowerPC microprocessor. One of skill in the relevant art will recognize that the terms “machine-readable (storage) medium” or “computer-readable (storage) medium” include any type of device that is accessible by the processor.
The memory is coupled to the processor by, for example, a bus. The memory can include, by way of example but not limitation, random access memory (RAM), such as dynamic RAM (DRAM) and static RAM (SRAM). The memory can be local, remote, or distributed.
The bus also couples the processor to the non-volatile memory and drive unit. The non-volatile memory is often a magnetic floppy or hard disk, a magnetic-optical disk, an optical disk, a read-only memory (ROM), such as a CD-ROM, EPROM, or EEPROM, a magnetic or optical card, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory during execution of software in the computer 1900. The non-volatile storage can be local, remote, or distributed. The non-volatile memory is optional because systems can be created with all applicable data available in memory. A typical computer system will usually include at least a processor, memory, and a device (e.g., a bus) coupling the memory to the processor.
Software is typically stored in the non-volatile memory and/or the drive unit. Indeed, storing an entire large program in memory may not even be possible. Nevertheless, it should be understood that, for software to run, if necessary, it is moved to a computer readable location appropriate for processing, and for illustrative purposes, that location is referred to as the memory in this paper. Even when software is moved to the memory for execution, the processor will typically make use of hardware registers to store values associated with the software, and local cache that, ideally, serves to speed up execution. As used herein, a software program is assumed to be stored at any known or convenient location (from non-volatile storage to hardware registers) when the software program is referred to as “implemented in a computer-readable medium.” A processor is considered to be “configured to execute a program” when at least one value associated with the program is stored in a register readable by the processor.
The bus also couples the processor to the network interface device. The interface can include one or more of a modem or network interface. It will be appreciated that a modem or network interface can be considered to be part of the computer system 1900. The interface can include an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g., “direct PC”), or other interfaces for coupling a computer system to other computer systems. The interface can include one or more input and/or output devices. The I/O devices can include, by way of example but not limitation, a keyboard, a mouse or other pointing device, disk drives, printers, a scanner, and other input and/or output devices, including a display device. The display device can include, by way of example but not limitation, a cathode ray tube (CRT), liquid crystal display (LCD), or some other applicable known or convenient display device. For simplicity, it is assumed that controllers of any devices not depicted in the example of
In operation, the computer system 1900 can be controlled by operating system software that includes a file management system, such as a disk operating system. One example of operating system software with associated file management system software is the family of operating systems known as Windows® from Microsoft Corporation of Redmond, Wash., and their associated file management systems. Another example of operating system software with its associated file management system software is the Linux™ operating system and its associated file management system. The file management system is typically stored in the non-volatile memory and/or drive unit and causes the processor to execute the various acts required by the operating system to input and output data and to store data in the memory, including storing files on the non-volatile memory and/or drive unit.
Some portions of the detailed description may be presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, as apparent from the following discussion, it is appreciated that, throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or “generating” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the methods of some embodiments. The required structure for a variety of these systems will appear from the description below. In addition, the techniques are not described with reference to any particular programming language, and various embodiments may thus be implemented using a variety of programming languages.
In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a laptop computer, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, an iPhone, a Blackberry, a processor, a telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
While the machine-readable medium or machine-readable storage medium is shown in an exemplary embodiment to be a single medium, the terms “machine-readable medium” and “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The terms “machine-readable medium” and “machine-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies or modules of the presently disclosed technique and innovation.
In general, the routines executed to implement the embodiments of the disclosure may be implemented as part of an operating system or a specific application, component, program, object, module, or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more instructions, set at various times in various memory and storage devices in a computer, that, when read and executed by one or more processing units or processors in a computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.
Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms, and that the disclosure applies equally regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
Further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include, but are not limited to, recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disk Read-Only Memory (CD-ROMs), Digital Versatile Disks (DVDs), etc.), among others, and transmission type media, such as digital and analog communication links.
In some circumstances, operation of a memory device, such as a change in state from a binary one to a binary zero or vice-versa, for example, may comprise a transformation, such as a physical transformation. With particular types of memory devices, such a physical transformation may comprise a physical transformation of an article to a different state or thing. For example, but without limitation, for some types of memory devices, a change in state may involve an accumulation and storage of charge or a release of stored charge. Likewise, in other memory devices, a change of state may comprise a physical change or transformation in magnetic orientation or a physical change or transformation in molecular structure, such as from crystalline to amorphous or vice versa. The foregoing is not intended to be an exhaustive list of all examples in which a change in state for a binary one to a binary zero or vice-versa in a memory device may comprise a transformation, such as a physical transformation. Rather, the foregoing is intended as illustrative examples.
A storage medium typically may be non-transitory or comprise a non-transitory device. In this context, a non-transitory storage medium may include a device that is tangible, meaning that the device has a concrete physical form, although the device may change its physical state. Thus, for example, non-transitory refers to a device remaining tangible despite this change in state.
Remarks
In many of the embodiments disclosed in this application, the technology is capable of allowing multiple different users to use the same piece of furniture equipped with the presently disclosed technology. For example, different people can sleep in the same bed. In addition, two different users can switch the side of the bed that they sleep on, and the technology disclosed here will correctly identify which user is sleeping on which side of the bed. The technology identifies the users and obtains the user ID, based on any of the following signals alone or in combination: heart rate, breathing rate, body motion, or body temperature associated with each user. In another embodiment, the technology disclosed here identifies the user by receiving both the user ID and side of the bed associated with the user ID, from a device associated with the user.
The foregoing description of various embodiments of the claimed subject matter has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed. Many modifications and variations will be apparent to one skilled in the art. Embodiments were chosen and described in order to best describe the principles of the invention and its practical applications, thereby enabling others skilled in the relevant art to understand the claimed subject matter, the various embodiments, and the various modifications that are suited to the particular uses contemplated.
Although the above Detailed Description describes certain embodiments and the best mode contemplated, no matter how detailed the above appears in text, the embodiments can be practiced in many ways. Details of the systems and methods may vary considerably in their implementation details, while still being encompassed by the specification. As noted above, particular terminology used when describing certain features or aspects of various embodiments should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the invention with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification, unless those terms are explicitly defined herein. Accordingly, the actual scope of the invention encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the embodiments under the claims.
The language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this Detailed Description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of various embodiments is intended to be illustrative, but not limiting, of the scope of the embodiments, which is set forth in the following claims.
This application is a continuation-in-part of U.S. patent application Ser. No. 14/732,646, filed Jun. 5, 2015, which claims priority to the following U.S. provisional patent applications: U.S. Provisional Patent Application Ser. No. 62/008,480, filed Jun. 5, 2014; U.S. Provisional Patent Application Ser. No. 62/024,945, filed Jul. 15, 2014; U.S. Provisional Patent Application Ser. No. 62/159,177, filed May 8, 2015; and U.S. Provisional Patent Application Ser. No. 62/161,142, filed May 13, 2015. All of the above referenced applications are incorporated herein by reference in their entirety.