Systems and methods for improved assisted or independent living environments

Information

  • Patent Grant
  • Patent Number
    12,321,142
  • Date Filed
    Tuesday, July 5, 2022
  • Date Issued
    Tuesday, June 3, 2025
Abstract
The present embodiments relate to detecting instances of individuals being in peril within an independent or assisted living environment. According to certain aspects, with an individual's permission or affirmative consent, a hardware controller (such as a smart or interconnected home controller, or even a mobile device) may receive and analyze sensor data detected within the independent or assisted living environment to determine whether an individual may be in peril. In this circumstance, the hardware controller may generate a notification that indicates the situation and may communicate the notification to a proper individual, such as a family member or caregiver, who may be in a position to mitigate or alleviate any risks posed by the situation. The foregoing functionality also may be used by an insurance provider to generate, update, or adjust insurance policies, premiums, rates, or discounts, and/or make recommendations to an insured individual.
Description
FIELD OF THE DISCLOSURE

The present disclosure generally relates to managing a connected property. More particularly, the present disclosure relates to assessing sensor data from smart devices in a property to detect when individuals may be in peril, and facilitating actions to mitigate the situation.


BACKGROUND

With the proliferation of the “internet of things,” more household devices and items are gaining communication and network connectivity capabilities. These new capabilities enable easier data detection and more accurate information and metrics. However, the ability to detect certain conditions associated with devices and items may be limited. Additionally, the channels to control and maintain devices and items in response to certain conditions may also be limited.


BRIEF SUMMARY

The present embodiments may, inter alia, access certain device data to detect certain conditions and situations within a property and determine actions or commands to perform to address the conditions and situations. Further, the present embodiments may effectively and efficiently communicate relevant information associated with the conditions and enable users to facilitate the actions or commands. One particular functionality relates to analyzing sensor data to detect when one or more individuals may be in peril, such as in an independent or assisted living environment, and then notifying proper individuals of the situation.


Generally, the present embodiments may relate to (1) home control and/or automation, as well as (2) loss prevention, reduction, and/or mitigation through proactively identifying periled individuals, notifying an individual of detected situations, and enabling individuals to mitigate the detected situations. The foregoing functionality also may be used by an insurance provider to generate, update, or adjust insurance policies, premiums, rates, discounts, points, and/or rewards, and/or make recommendations to an insured individual.


According to one embodiment, a computer-implemented method of detecting periled individuals within an independent or assisted living environment may be provided. The independent or assisted living environment may be populated with a hardware controller in communication with a plurality of sensors. The method may include (1) receiving, by the hardware controller, sensor data from at least one sensor located within the independent or assisted living environment, the at least one sensor either (i) secured to an individual or (ii) configured to sense environmental data within a proximity of the individual; (2) analyzing the sensor data by one or more processors; (3) based upon the analyzing, determining that the individual is in peril; (4) responsive to determining that the individual is in peril, generating a notification indicating that the individual is in peril; and/or (5) communicating the notification to an electronic device of an additional individual to facilitate alleviating a risk associated with the individual being in peril. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


According to another embodiment, a hardware controller for detecting periled individuals within an independent or assisted living environment, where the hardware controller may be in communication with a set of sensors populated within the independent or assisted living environment, may be provided. The hardware controller may include a communication module adapted to interface with the set of sensors populated within the independent or assisted living environment; a memory adapted to store non-transitory computer executable instructions; and/or a processor adapted to interface with the communication module and the memory. The processor may be configured to execute the non-transitory computer executable instructions to cause the processor to receive, via the communication module, sensor data from at least one sensor of the set of sensors located within the independent or assisted living environment, the at least one sensor either (i) secured to an individual or (ii) configured to sense environmental data within a proximity of the individual, analyze the sensor data, based upon the analyzing, determine that the individual is in peril, responsive to determining that the individual is in peril, generate a notification indicating that the individual is in peril, and/or communicate, via the communication module, the notification to an electronic device of an additional individual to facilitate alleviating a risk associated with the individual being in peril. The hardware controller may include additional, less, or alternate functionality, including that discussed elsewhere herein.


Advantages will become more apparent to those skilled in the art from the following description of the preferred embodiments which have been shown and described by way of illustration. As will be realized, the present embodiments may be capable of other and different embodiments, and their details are capable of modification in various respects. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.





BRIEF DESCRIPTION OF THE DRAWINGS

The Figures described below depict various aspects of the system and methods disclosed herein. It should be understood that each Figure depicts an embodiment of a particular aspect of the disclosed system and methods, and that each of the figures is intended to accord with a possible embodiment thereof. Further, wherever possible, the following description refers to the reference numerals included in the following figures, in which features depicted in multiple figures are designated with consistent reference numerals.


There are shown in the drawings arrangements which are presently discussed, it being understood, however, that the present embodiments are not limited to the precise arrangements and instrumentalities shown, wherein:



FIG. 1 depicts an exemplary environment including components and entities associated with managing device operation and facilitating insurance policy processing, in accordance with some embodiments.



FIG. 2 is an exemplary signal diagram associated with assessing sensor data to detect individuals in peril and facilitating various actions to mitigate the situations, in accordance with some embodiments.



FIG. 3 is a flow diagram of an exemplary computer-implemented method of assessing sensor data to detect individuals in peril and facilitating various actions to mitigate the situations, in accordance with some embodiments.



FIG. 4 is a block diagram of an exemplary controller in accordance with some embodiments.



FIG. 5 is a block diagram of an exemplary processing server in accordance with some embodiments.



FIGS. 6A and 6B depict exemplary interfaces associated with notifying of periled individuals and facilitating various actions to mitigate the situations, in accordance with some embodiments.





The Figures depict preferred embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the systems and methods illustrated herein may be employed without departing from the principles of the invention described herein.


DETAILED DESCRIPTION

The present embodiments may relate to, inter alia, assessing operation of devices or personal property within a home or other type of property, such as household furniture, appliances, electronics, vehicles (e.g., cars, boats, motorcycles), and/or other personal belongings (e.g., clothing, jewelry, antiques). Generally, a home or property may have a “smart” central controller that may be wirelessly connected, or connected via hard-wire, with various household related items, devices, and/or sensors. The central controller may be associated with any type of property, such as homes, office buildings, restaurants, farms, and/or other types of properties.


The central controller, and/or one or more remote processors or servers associated with an insurance provider or other entity, may be in wireless or wired communication with various “smart” items or devices, such as smart appliances (e.g., clothes washer, dryer, dish washer, refrigerator, etc.); smart heating devices (e.g., furnace, space heater, etc.); smart cooling devices (e.g., air conditioning units, fans, ceiling fans, etc.); smart plumbing fixtures (e.g., toilets, showers, water heaters, piping, interior and yard sprinklers, etc.); smart cooking devices (e.g., stoves, ovens, grills, microwaves, etc.); smart wiring, lighting, and lamps; smart personal vehicles; smart thermostats; smart windows, doors, or garage doors; smart window blinds or shutters; wearable devices; and/or other smart devices and/or sensors capable of wireless or wired communication. Each smart device (or sensor associated therewith), as well as the central controller and/or insurance provider remote processor(s), may be equipped with a processor, memory unit, software applications, wireless transceivers, local power supply, various types of sensors, and/or other components.


The central controller, and/or insurance provider remote processor(s), may collect or retrieve various data from the devices or personal property, analyze the data, and/or identify various situations indicated by the data and/or actions to facilitate based upon the analysis. In particular, the central controller and/or insurance provider remote processor(s) may receive operation data from the smart devices, where the operation data may include various sensor data associated with the smart devices. The central controller and/or insurance provider remote processor(s) may analyze the operation data (e.g., by comparing the operation data to baseline sensor data) to detect that an individual may be in peril, or otherwise exposed to injury, loss, destruction, and/or the like. According to embodiments, the individual may be located within an independent or assisted living environment. In these situations, the central controller and/or the insurance provider may generate a notification that indicates the situation and may communicate the notification to a proper individual who may be in a position to help the individual in peril. The central controller and/or insurance provider may also determine to process an insurance policy that may be impacted by the situation.


The systems and methods discussed herein address a challenge that is particular to property management. In particular, the challenge relates to a difficulty in identifying when an individual located on a premises may be in peril or otherwise in need of help, as well as a difficulty in mitigating the situation. This is particularly apparent when the individual is not under constant care or connected to conventional monitoring machines. Existing environments rely on individuals to self-report situations and/or rely on caregivers to happen upon the situations. However, in these existing environments, numerous situations still go unaddressed because nobody notices them or because the individual is unable to call or signal for assistance. In contrast, the present systems and methods leverage sensor data from connected devices to detect and identify situations in which individuals may be in peril or otherwise in need of assistance, and dynamically generate notifications of the same and send the notifications to proper individuals in a position to offer assistance. Therefore, because the systems and methods employ the collection and analysis of sensor data associated with connected devices within the property, the systems and methods are necessarily rooted in computer technology in order to overcome the noted shortcomings that specifically arise in the realm of property management.


Similarly, the systems and methods provide improvements in a technical field, namely, property automation and safety. Instead of the systems and methods merely being performed by hardware components using basic functions, the systems and methods employ complex steps that go beyond the mere concept of simply retrieving and combining data using a computer. In particular, the hardware components receive data from connected devices, analyze the data to identify a potentially threatening situation for an individual, generate a notification that indicates the potentially threatening situation, and/or communicate the notification to a proper individual. Additionally, because a central controller in a property retrieves and analyzes sensor data from a plurality of connected devices in the property, the central controller and the connected devices are part of a “thin client” environment that improves data persistence and information processing. This combination of elements further imposes meaningful limits in that the operations are applied to improve property automation and safety by detecting potentially threatening situations, and facilitating mitigating actions in a meaningful and effective way.


According to implementations, the systems and methods may support a dynamic, real-time or near-real-time analysis of any received data. In particular, the central controller and/or insurance provider may retrieve and/or receive real-time sensor data from the sensors, analyze the sensor data in real-time, and dynamically determine that an individual is in peril. Additionally, the central controller and/or insurance provider may dynamically generate a notification of the situation in real-time, and communicate the notification to another individual in real-time. Accordingly, the real-time capability of the systems and methods provides individuals in peril with an assurance of efficient and effective treatment, and provides any caregivers with real-time notifications that individuals are in peril.


Generally, the systems and methods offer numerous benefits relating to the safety of individuals. In particular, the systems and methods may automatically detect situations in which individuals may be in peril, and may automatically facilitate actions to address the situations. As a result, the safety of individuals may improve, especially in independent or assisted living environments. Further, the systems and methods enable additional individuals to be notified of the situations so that the additional individuals are able to promptly address the situations.


The systems and methods may further offer a benefit to insurance providers and customers thereof. Particularly, the present embodiments may facilitate (a) providing and updating insurance policies; (b) the handling or adjusting of home insurance claims; (c) the disbursement of monies related to insurance claims; (d) modifying insurance coverage amounts; (e) updating and improving estimate models; and/or (f) other insurance-related activities. The systems and methods may further offer a benefit to customers by offering improved insurance claim processing. Further, an insurance provider may stand out as a cost-effective insurance provider, thereby retaining existing customers and attracting new customers. It should be appreciated that further benefits to the systems and methods are envisioned.


The method may also include adjusting an insurance policy, premium, or discount (such as a homeowners, renters, auto, home, health, or life insurance policy, premium, or discount) based upon the assisted living and/or other functionality discussed herein, and/or an insured having a home and/or mobile device with such functionality.


I. Exemplary Environment and Components for Assessing Device Operation and Functionalities Relating Thereto



FIG. 1 depicts an exemplary environment 100 including components and entities for managing devices associated with a property and processing insurance policies associated therewith. Although FIG. 1 depicts certain entities, components, and devices, it should be appreciated that additional, fewer, or alternate entities and components are envisioned.


As illustrated in FIG. 1, the environment 100 may include a property 105 that contains a controller 120 and a plurality of devices 110 that may be each connected to a local communication network 115. According to the present embodiments, the property 105 may be an independent or assisted living environment in which one or more individuals needing independent or assisted living care may reside. The independent or assisted living environment may employ caregivers who provide care to the residents as needed. However, it should be appreciated that the property 105 may be other types of properties, such as a private residence, an office, a hotel, or the like.


Each of the plurality of devices 110 may be a “smart” device that may be configured with one or more sensors capable of sensing and communicating operating data associated with the corresponding device 110. As shown in FIG. 1, the plurality of devices 110 may include a smart alarm system 110a, a smart stove 110b, and/or a smart washing machine 110c. Each of the plurality of devices 110 may be located within or proximate to the property 105 (generally, “on premises”). In one implementation, one or more of the plurality of devices 110 may be a device that is wearable by an individual, such as a heart rate monitor, a pedometer, a blood pressure monitor, or other types of wearable devices or monitors. Although FIG. 1 depicts only one property 105, it should be appreciated that multiple properties are envisioned, each with its own controller and devices. Further, it should be appreciated that additional, fewer, or alternate devices may be present in the property 105.


In some cases, the plurality of devices 110 may be purchased from a manufacturer with the “smart” functionality incorporated therein. In other cases, the plurality of devices 110 may have been purchased as “dumb” devices and subsequently modified to add the “smart” functionality to the device. For instance, a homeowner may purchase an alarm system that includes sensors installed on or near a door to detect when the door has been opened and/or unlocked.


In some embodiments, the plurality of devices 110 may monitor their own status or condition via the sensors to detect any issues or problems. In response to detecting issues or problems, the plurality of devices 110 may be able to indicate the issues or problems via display components, such as LED lights, display screens, or other visual indicators. In further embodiments, the controller 120 may be configured to monitor, via sensor data, whether the plurality of devices 110 and/or parts thereof have been installed correctly, whether replacement parts are new and/or otherwise in good condition, and/or other conditions associated with the plurality of devices 110 and/or parts thereof.


The plurality of devices 110 may be configured to communicate with a controller 120 via the local communication network 115. The local communication network 115 may facilitate any type of data communication between devices and controllers located on or proximate to the property 105 via any standard or technology (e.g., LAN, WLAN, any IEEE 802 standard including Ethernet, and/or others). The local communication network 115 may further support various short-range communication protocols, such as Bluetooth®, Bluetooth® Low Energy, near field communication (NFC), radio-frequency identification (RFID), and/or other types of short-range protocols.


According to aspects, the plurality of devices 110 may transmit, to the controller 120 via the local communication network 115 (and/or to the insurance provider 130 remote processing server 135 via the network 125), operational data gathered from sensors associated with the plurality of devices 110. The operational data may be audio data, image or video data, motion data, status data, usage amounts, vital sign data, and/or other data or information. For instance, the operational data may include imaging or audio data recorded within a room; a heart rate of an individual wearing one of the plurality of devices 110; and/or other information that may be pertinent to an operation state or status of the plurality of devices 110. For further instance, the operational data may include motion data that may indicate the presence of and movement of any individuals within the property 105 and/or located on the exterior of the property 105. Additionally, the operational data may include device usage data. The operational data may include a timestamp representing the time that the operational data was recorded.
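For illustration only, the following is a minimal sketch, in Python, of how one such operational data record with a timestamp might be represented; the field names and example values are hypothetical and are not prescribed by the present disclosure.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Any, Optional

@dataclass
class OperationalDataRecord:
    """One reading reported by a smart device 110 to the controller 120."""
    device_id: str          # e.g., "wearable-heart-rate-01" (hypothetical identifier)
    data_type: str          # e.g., "heart_rate", "acceleration", "audio_level"
    value: Any              # numeric reading, clip reference, status string, etc.
    room: Optional[str]     # room or subsection of the property, if known
    timestamp: datetime     # time the operational data was recorded

# Example reading from a wearable heart rate monitor
reading = OperationalDataRecord(
    device_id="wearable-heart-rate-01",
    data_type="heart_rate",
    value=128,                              # beats per minute
    room="Room 204",
    timestamp=datetime.now(timezone.utc),
)
```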


The controller 120 may be coupled to a database 112 that stores various operational data and information associated with the plurality of devices 110. Although FIG. 1 depicts the database 112 as coupled to the controller 120, it is envisioned that the database 112 may be maintained in the “cloud” such that any element of the environment 100 capable of communicating over either the local network 115 or one or more other networks 125 may directly interact with the database 112.


In some embodiments, the database 112 may organize the operational data according to which individual device 110 the data may be associated and/or the room or subsection of the property in which the data was recorded. Further, the database 112 may maintain an inventory list that includes the plurality of devices 110, as well as various data and information associated with the plurality of devices 110 (e.g., locations, replacement costs, etc.).


In one embodiment, the database 112 may maintain various operation states of the plurality of devices 110. In particular, the operation states may specify various settings of the plurality of devices 110 such that when the respective device is configured at the setting(s), the respective device will operate in the corresponding operation state. For instance, an operation state for a smart thermostat may be “heat conservation” whereby the corresponding setting is 64 degrees (as opposed to a more “normal” 70 degree setting). It should be appreciated that each operation state may specify settings for more than one of the devices 110.
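For illustration, a minimal sketch of how named operation states might map device settings follows; aside from the thermostat figures noted above, the state names, device names, and values are hypothetical.

```python
# Hypothetical operation states keyed by name; each maps a device to its settings.
OPERATION_STATES = {
    "heat_conservation": {
        "smart_thermostat": {"set_point_f": 64},   # vs. a more "normal" 70-degree setting
        "smart_window_blinds": {"position": "closed"},
    },
    "normal": {
        "smart_thermostat": {"set_point_f": 70},
        "smart_window_blinds": {"position": "open"},
    },
}

def settings_for(state_name: str, device: str) -> dict:
    """Return the settings a given device should apply for a named operation state."""
    return OPERATION_STATES[state_name].get(device, {})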


The controller 120 (and/or the plurality of devices 110) may be configured to communicate with other components and entities, such as an insurance provider 130 and various third party source(s) 138 via the network(s) 125. According to some embodiments, the network(s) 125 may facilitate any data communication between the controller 120 located on the property 105 and entities or individuals remote to the property 105 via any standard or technology (e.g., GSM, CDMA, TDMA, WCDMA, LTE, EDGE, OFDM, GPRS, EV-DO, UWB, IEEE 802 including Ethernet, WiMAX, Wi-Fi, and/or others). In some cases, both the local network 115 and the network(s) 125 may utilize the same technology.


Generally, the insurance provider 130 may be any individual, group of individuals, company, corporation, or other type of entity that may issue insurance policies for customers, such as a home insurance policy associated with the property 105. According to the present embodiments, the insurance provider 130 may include one or more processing server(s) 135 configured to facilitate the functionalities as discussed herein. Although FIG. 1 depicts the processing server 135 as a part of the insurance provider 130, it should be appreciated that the processing server 135 may be separate from (and connected to or accessible by) the insurance provider 130.


Further, although the present disclosure describes the systems and methods as being facilitated in part by the insurance provider 130, it should be appreciated that other non-insurance related entities may implement the systems and methods. For instance, a general contractor may aggregate the insurance-risk data across many properties to determine which appliances or products provide the best protection against specific causes of loss, and/or deploy the appliances or products based upon where causes of loss are most likely to occur. Accordingly, it may not be necessary for the property 105 to have an associated insurance policy for the property owners to enjoy the benefits of the systems and methods.


The third-party source(s) 138 may represent any entity or component that is configured to obtain, detect, and/or determine data or information that may be relevant to the devices 110 of the property 105. In some embodiments, the third-party source(s) 138 may be a manufacturer, supplier, servicer, or retailer of any of the devices 110, as well as of replacement devices for the devices 110. For instance, if one of the devices 110 is a refrigerator, the third-party source 138 may be a refrigerator manufacturer that sells refrigerators of the same or different types or models as the refrigerator device 110. The third-party source(s) 138 may store data associated with a replacement device (e.g., cost, retail location, general information, availability, or the like). Further, the third-party source(s) 138 may store baseline data associated with various types of situations in which individuals may be in peril. The third-party source(s) 138 may be configured to communicate various data or information to the controller 120 and/or to the insurance provider 130 via the network(s) 125, whereby the controller 120 and/or the insurance provider 130 may examine the data or information to facilitate various functionalities.


The controller 120, the insurance provider 130 and/or the processing server 135, and the third-party source(s) 138 may also be in communication, via the network(s) 125, with an electronic device 145 associated with an individual 140. In some embodiments, the individual 140 may have an insurance policy (e.g., a long-term care insurance policy) associated with the property 105, or may otherwise be associated with the property 105 (e.g., the individual 140 may live in the property 105). The individual 140 may also be associated with a resident of the property 105 (e.g., a family member of a person who resides in the property 105). The electronic device 145 may be a mobile device, such as a smartphone, a desktop computer, a laptop, a tablet, a phablet, a smart watch, smart glasses, wearable electronics, pager, personal digital assistant, or any other electronic device, including computing devices configured for wireless radio frequency (RF) communication and data transmission. In some implementations, the controller 120 (and/or insurance provider 130 remote processing server 135) may communicate, to the individual 140 via the electronic device 145, an indication of the operation of the plurality of devices 110, such as the commands transmitted to the plurality of devices 110. Further, the controller 120 (and/or insurance provider 130 remote processing server 135) may enable the individual 140 to remotely control various of the plurality of devices 110 via the electronic device 145.


According to some other implementations, the controller 120 (and/or insurance provider 130 remote processing server 135) may analyze sensor data from any of the plurality of devices 110 to determine if one or more individuals may be in peril or otherwise in need of help or assistance. The controller 120 (and/or insurance provider 130 remote processing server 135) may generate notifications or alerts that may indicate the situation, and communicate the notifications or alerts to the electronic device 145 via the network 125. Further, the controller 120 (and/or insurance provider 130 or remote processing server 135) may determine any changes to or processing associated with an insurance policy that may result from the situation, and may communicate with the remote processing server 135 to facilitate the processing.


The controller 120 (and/or insurance provider 130 remote processing server 135) may also transmit any proposed modifications to insurance policies, or proposed insurance claims, based upon detected data from the plurality of devices 110. In response, the individual 140 (e.g., a policyholder) may accept the proposed insurance claim or make modifications to the proposed insurance claim, and/or otherwise accept or reject any modifications to the insurance policy. The electronic device 145 may transmit, via the network 125, the accepted or modified insurance claim back to the controller 120 (and/or insurance provider 130 remote processing server 135).


The controller 120 may facilitate any processing of the insurance claim with the processing server 135 of the insurance provider 130. Additionally or alternatively, the processing server 135 may facilitate the proposed insurance claim communications and processing directly with the individual 140. In some implementations, the insurance provider 130 remote processing server 135 may provide the same functionality as that described herein with respect to the controller 120.


II. Exemplary Communication Flow for Detecting when Individuals are in Peril and Communicating Notifications Relating Thereto


Referring to FIG. 2, illustrated is an exemplary signal diagram 200 associated with detecting when individuals may be in peril and communicating notifications relating thereto. FIG. 2 includes a set of smart devices 210 (such as the smart devices 110 as discussed with respect to FIG. 1), a controller 220 (such as the controller 120 as discussed with respect to FIG. 1), a processing server 235 (such as the processing server 135 as discussed with respect to FIG. 1), and a user device 245 (such as the electronic device 145 as discussed with respect to FIG. 1).


The smart devices 210 and the controller 220 may be located within an independent or assisted living environment 205 (which generally may be the property 105 as discussed with respect to FIG. 1). The individual who may be in peril may be an individual who resides in the independent or assisted living environment 205 and who may receive care from employees or other workers of the independent or assisted living environment 205. According to embodiments, the smart devices 210 may include a set of sensors configured to generate and communicate various sensor data. Further, according to embodiments, the user device 245 may belong to an individual associated with the independent or assisted living environment 205, such as an employee or worker, a resident of the independent or assisted living environment 205, a caregiver, caretaker, and/or family member of an individual residing in the independent or assisted living environment 205, or an individual otherwise associated with an individual residing in the independent or assisted living environment 205.


The signal diagram 200 may begin when the controller 220 optionally requests (250) the smart devices 210 for sensor data. In some implementations, the controller 220 may periodically request the smart devices 210 for sensor data, or the controller 220 may request the smart devices 210 for sensor data in response to various triggers (e.g., at a certain time of the day or in response to receiving particular sensor data from a particular smart device 210). The controller 220 may also request sensor data from one or more specific smart devices 210. In an implementation, the smart device(s) 210 may be devices configured to be worn by an individual, such as a resident of the independent or assisted living environment 205.


The smart device(s) 210 may send (252) sensor data to the controller 220. For example, the sensor data may be audio data, imaging data (e.g., images and/or videos), motion/movement sensor data, location data, and/or vital sign data. It should be appreciated that other types of sensor data and combinations of sensor data are envisioned. The smart device(s) 210 may provide the sensor data automatically as the data is detected, in response to receiving a request from the controller 220, or in response to various triggers. For example, the smart device 210 may be a heart rate monitor that may send heart rate data of an individual to the controller 220 when the corresponding heart rate exceeds 120 beats/minute. For further example, the smart device 210 may be a band wearable by an individual that may send acceleration data to the controller 220 when the corresponding acceleration exceeds a certain threshold (which may be indicative of a fall).
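For illustration, the device-side triggering described in these examples might be sketched as follows; the 120 beats/minute figure is taken from the example above, while the acceleration threshold and function name are hypothetical placeholders.

```python
HEART_RATE_LIMIT_BPM = 120      # from the heart rate example above
ACCEL_LIMIT_G = 3.0             # hypothetical fall-indicative acceleration threshold

def should_report(data_type: str, value: float) -> bool:
    """Decide on the wearable whether a reading is worth sending to the controller 220."""
    if data_type == "heart_rate":
        return value > HEART_RATE_LIMIT_BPM
    if data_type == "acceleration":
        return value > ACCEL_LIMIT_G
    return True  # other sensor types may report automatically as detected

# e.g., should_report("heart_rate", 132) -> True; should_report("heart_rate", 70) -> False
```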


The controller 220 may optionally access (254) baseline sensor data that may correspond to the received sensor data. In particular, if the controller 220 receives sensor data of a particular type (e.g., acceleration data), the controller 220 may access baseline data of the same type (e.g., baseline acceleration data). The controller 220 may analyze (256) the received sensor data. In particular, the controller 220 may analyze the received sensor data to determine whether there are any abnormalities, causes for concern, and/or the like. In one implementation, the controller 220 may compare the received sensor data to the baseline sensor data to determine a level of similarity, where the level of similarity may meet a set threshold value. In another implementation, the controller 220 may compare the received sensor data to any corresponding threshold levels which may indicate any abnormalities, causes for concern, and/or the like.
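For illustration, one possible realization of the baseline comparison described above is sketched below, assuming numeric sensor samples, baseline data characteristic of a known peril situation (such as the baseline data the third-party source(s) 138 may store), and a simple similarity measure; the present disclosure does not prescribe a particular metric or threshold value.

```python
def similarity(received: list[float], baseline: list[float]) -> float:
    """Crude similarity score in [0, 1]; 1.0 means the readings match the baseline pattern."""
    if not received or not baseline:
        return 0.0
    n = min(len(received), len(baseline))
    diffs = [abs(r - b) / (abs(b) + 1e-9) for r, b in zip(received[:n], baseline[:n])]
    return max(0.0, 1.0 - sum(diffs) / n)

def matches_peril_baseline(received, peril_baseline, threshold=0.8) -> bool:
    """Deem the situation concerning when the received data is sufficiently similar
    to baseline data characteristic of a known peril situation (e.g., a fall)."""
    return similarity(received, peril_baseline) >= threshold
```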


Based upon the analysis (256), the controller 220 may determine (258) whether the individual is in peril. In particular, if the received sensor data meets or exceeds any threshold level (or differs from any threshold level by a certain amount or percentage), or if any calculated similarity level meets a threshold value, then the controller may deem that an individual is in peril. For example, if the received sensor data is audio data having a decibel reading and the controller 220 determines that the decibel reading exceeds a threshold decibel level, then the controller 220 may deem that the individual is in peril (such as if the individual is summoning help).


In one embodiment, the controller 220 may examine various combinations of sensor data (e.g., audio data, imaging data, motion/movement sensor data, location data, and/or vital sign data) to assess the situation. For example, the controller 220 may determine that the individual's heart rate is above a threshold amount, but may also determine that the individual is currently located in a fitness center, and therefore deem that the individual is not in peril. Conversely, the controller 220 may determine that the individual's heart rate is above a threshold value, and may determine that the individual is located in his or her room, and therefore deem that the individual may be in peril.
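For illustration, a combined rule of the kind described in this example might be sketched as follows; the threshold value and location names are hypothetical.

```python
ELEVATED_HEART_RATE_BPM = 120                       # hypothetical threshold
EXPECTED_EXERTION_LOCATIONS = {"fitness center", "gym"}

def in_peril(heart_rate_bpm: float, location: str) -> bool:
    """An elevated heart rate alone is not conclusive; context such as the individual's
    current location is considered before deeming the individual in peril."""
    if heart_rate_bpm <= ELEVATED_HEART_RATE_BPM:
        return False
    # Elevated heart rate in an expected-exertion location: likely not in peril.
    if location.lower() in EXPECTED_EXERTION_LOCATIONS:
        return False
    # Elevated heart rate elsewhere (e.g., the individual's room): possible peril.
    return True

# in_peril(140, "Fitness Center") -> False; in_peril(140, "Room 204") -> True
```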


If the controller 220 determines that the individual is not in peril (“NO”), processing may end or proceed to other functionality. If the controller 220 determines that the individual is in peril (“YES”), the controller may generate (260) a notification indicating that the individual is in peril. In embodiments, the notification may include any details of the situation and may also include various selections that enable a receiving individual (e.g., an individual associated with the user device 245) to initiate help or aid for the individual in peril. For example, the notification may include contact information for the independent or assisted living environment 205 and/or for an individual associated therewith.
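For illustration, a notification along the lines described above might be assembled as in the following sketch; the field names and contact details are hypothetical, and the action list mirrors the selections shown in FIG. 6A.

```python
from datetime import datetime, timezone

def build_peril_notification(individual: str, location: str, description: str) -> dict:
    """Assemble the notification payload sent to the user device 245."""
    return {
        "alert": f"{individual} may be in peril",
        "description": description,             # e.g., "data indicates a possible fall"
        "location": location,                    # e.g., "Room 204"
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "contact": {
            "facility": "+1-555-0100",           # hypothetical facility phone number
            "on_call_caregiver": "+1-555-0123",  # hypothetical caregiver contact
        },
        "actions": ["dismiss", "contact", "more_info"],
    }
```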


The controller 220 may send (262) the notification to the user device 245, so that the individual associated with the user device 245 may access and review the notification. As discussed above, the individual associated with the user device 245 may review the notification and take any appropriate action, in particular any action that may alleviate the situation resulting from the individual residing in the independent or assisted living environment 205 being in peril. For example, if the individual associated with the user device 245 is a caregiver employed by the independent or assisted living environment 205, the caregiver may access the room of the individual and provide any needed care, or may summon appropriate medical personnel.


In some embodiments, the controller 220 may also facilitate insurance processing associated with the situation of the individual. In particular, the controller may send (264) an indication of the event (i.e., the individual being in peril) to the processing server 235, via one or more standard channels. The processing server 235 may examine the indication of the event and access (266) an insurance policy that may belong to the individual in peril (i.e., the individual in peril may be a policyholder for the insurance policy). The processing server 235 may also process (268) the insurance policy accordingly. In particular, the processing server 235 may determine whether the individual being in peril may impact the insurance policy, and may adjust the insurance policy accordingly, such as by adjusting the insurance policy to insure a particular type of care that was previously not needed by the individual.


Although FIG. 2 depicts the controller 220 performing various steps and determinations, it should be appreciated that the processing server 235 may perform the same or similar steps or determinations. For example, the processing server 235 may receive the sensor data, compare the sensor data to baseline data, generate a notification, communicate a notification, determine a mitigating action, and/or facilitate the mitigating action.


III. Exemplary Method for Detecting Individuals in Peril


Referring to FIG. 3, depicted is a flow diagram of an exemplary computer-implemented method 300 of detecting individuals who may be in peril within an independent or assisted living environment (and/or abnormal conditions associated with individuals or premises). The method 300 may be facilitated by an electronic device within the property, such as the controller 120 that may be in direct or indirect communication with an insurance provider (such as the insurance provider 130 or a remote processor or server thereof).


The method 300 may begin when the controller receives (block 305) sensor data associated with an individual from at least one sensor. In some embodiments, the sensor may be secured to the individual (e.g., a wearable device including an accelerometer, a heart rate monitor, a vital signs monitor, and/or the like) or configured to sense environmental data within a proximity of the individual (e.g., an audio sensor, a device usage sensor, a video camera, and/or the like). In one implementation, the controller may request the sensor data from the at least one sensor.


The controller may optionally access (block 310) baseline sensor data corresponding to the retrieved sensor data. In some implementations, the retrieved sensor data may have a specific type (e.g., wearable device data, recorded video, recorded audio), where the controller may access baseline sensor data that corresponds to the type of retrieved sensor data. The controller may analyze (block 315) the received sensor data according to various calculations, techniques, or algorithms. The analysis may determine whether the retrieved sensor data is consistent with or reflects that the individual is in peril (and/or that an abnormal condition exists). In particular, the controller may determine whether the sensor data meets or exceeds any threshold level (or differs from any threshold level by a certain amount or percentage), or whether any calculated similarity level meets a threshold value.


For example, if a microphone detects a loud crash that exceeds a certain decibel level (which may be associated with the individual falling), then the controller may determine that the individual is in peril. For further example, if the sensor data indicates a heart rate that is dropping by a certain amount or percentage, then the controller may determine that the individual is in peril.
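For illustration, the "dropping by a certain amount or percentage" check from this example might be sketched as follows; the 20% figure is a hypothetical placeholder.

```python
def heart_rate_drop_exceeds(previous_bpm: float, current_bpm: float,
                            max_drop_fraction: float = 0.20) -> bool:
    """Return True when the heart rate has dropped by more than the allowed fraction
    relative to the previous reading (e.g., by more than 20%)."""
    if previous_bpm <= 0:
        return False
    drop = (previous_bpm - current_bpm) / previous_bpm
    return drop > max_drop_fraction

# heart_rate_drop_exceeds(80, 60) -> True (25% drop); heart_rate_drop_exceeds(80, 70) -> False
```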


If the controller determines that the individual is not in peril (“NO”), then processing may end or proceed to any other functionality. If the controller determines that the individual is in peril (“YES”), then the controller may generate (block 325) a notification indicating that the individual is in peril. In some implementations, the notification may include various information, such as an identification of the individual, a current location of the individual, a description of the situation and/or abnormal condition, contact information of relevant individuals, and/or other information.


The controller may communicate (block 330) the notification to an electronic device of an additional individual. The controller may store an identification of the electronic device. For example, the electronic device may be a smartphone belonging to a caregiver associated with the individual. In some scenarios, the controller may communicate the notification to an insurance provider.
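For illustration, the blocks of method 300 might be tied together as in the following sketch; the helper names are hypothetical, and the analysis step stands in for whichever calculations, techniques, or algorithms a given implementation employs.

```python
def run_method_300(controller, sensor, caregiver_device):
    """Sketch of blocks 305-330: receive, analyze, decide, notify."""
    sensor_data = controller.receive_sensor_data(sensor)            # block 305
    baseline = controller.access_baseline(sensor_data.data_type)    # block 310 (optional)
    in_peril = controller.analyze(sensor_data, baseline)            # block 315 / decision
    if not in_peril:
        return None                                                 # "NO": end or proceed to other functionality
    notification = controller.generate_notification(sensor_data)    # block 325
    controller.communicate(notification, caregiver_device)          # block 330
    return notification
```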


IV. Exemplary Controller



FIG. 4 illustrates a diagram of an exemplary controller 420 (such as the controller 120 discussed with respect to FIG. 1) in which the functionalities as discussed herein may be implemented. It should be appreciated that the controller 420 may be associated with a property, as discussed herein.


The controller 420 may include a processor 422 as well as a memory 478. The memory 478 may store an operating system 479 capable of facilitating the functionalities as discussed herein, as well as a set of applications 475 (i.e., machine readable instructions). For instance, one of the set of applications 475 may be a peril detection application 484 configured to analyze sensor data, detect when individuals may be in peril, and facilitate actions to mitigate the detected situations. The set of applications 475 may also include one or more other applications 484, such as an insurance processing application.


The processor 422 may interface with the memory 478 to execute the operating system 479 and the set of applications 475. According to some embodiments, the memory 478 may also include a data record storage 480 that stores various data, such as baseline data corresponding to various types of sensor data. The peril detection application 484 may interface with the data record storage 480 to retrieve relevant baseline data that the peril detection application 484 may use to determine whether individuals may be in peril. The memory 478 may include one or more forms of volatile and/or non-volatile, fixed and/or removable memory, such as read-only memory (ROM), erasable programmable read-only memory (EPROM), random access memory (RAM), electrically erasable programmable read-only memory (EEPROM), and/or other hard drives, flash memory, MicroSD cards, and others.


The controller 420 may further include a communication module 477 configured to communicate data via one or more networks 425. According to some embodiments, the communication module 477 may include one or more transceivers (e.g., WWAN, WLAN, and/or WPAN transceivers) functioning in accordance with IEEE standards, 3GPP standards, or other standards, and/or configured to receive and transmit data via one or more external ports 476. Further, the communication module 477 may include a short-range network component (e.g., an RFID reader) configured for short-range network communications. For instance, the communication module 477 may receive, via the network 425, sensor data from a plurality of devices populated within a property.


The controller 420 may further include a user interface 481 configured to present information to a user and/or receive inputs from the user. As shown in FIG. 4, the user interface 481 may include a display screen 482 and I/O components 483 (e.g., ports, capacitive or resistive touch sensitive input panels, keys, buttons, lights, LEDs, speakers, microphones). According to some embodiments, the user may access the controller 420 via the user interface 481 to assess sensor data, process insurance policies, and/or perform other functions. The controller 420 may be configured to perform insurance-related functions, such as generating proposed insurance claims and facilitating insurance claim processing. In some embodiments, the controller 420 may perform the functionalities as discussed herein as part of a “cloud” network or may otherwise communicate with other hardware or software components within the cloud to send, retrieve, and/or otherwise analyze data.


In general, a computer program product in accordance with an embodiment may include a computer usable storage medium (e.g., standard random access memory (RAM), an optical disc, a universal serial bus (USB) drive, or the like) having computer-readable program code embodied therein, wherein the computer-readable program code may be adapted to be executed by the processor 422 (e.g., working in connection with the operating system 479) to facilitate the functions as described herein. In this regard, the program code may be implemented in any desired language, and may be implemented as machine code, assembly code, byte code, interpretable source code or the like (e.g., via C, C++, Java, Actionscript, Objective-C, Javascript, CSS, XML). In some embodiments, the computer program product may be part of a cloud network of resources.


V. Exemplary Server



FIG. 5 illustrates a diagram of an exemplary processing server 535 (such as the processing server 135 discussed with respect to FIG. 1) in which the functionalities as discussed herein may be implemented. It should be appreciated that the processing server 535 may be associated with an insurance provider, as discussed herein. In one embodiment, the processing server may be configured with the same functionality as that of the controllers 120, 220 of FIGS. 1 and 2, respectively.


The processing server 535 may include a processor 522, as well as a memory 578. The memory 578 may store an operating system 579 capable of facilitating the functionalities as discussed herein as well as a set of applications 575 (i.e., machine readable instructions). For instance, one of the set of applications 575 may be a policy processing application 584 configured to manage customer insurance policies. It should be appreciated that other applications 590 are envisioned, such as a peril detection application configured to determine whether individuals may be in peril.


The processor 522 may interface with the memory 578 to execute the operating system 579 and the set of applications 575. According to some embodiments, the memory 578 may also include a data record storage 580 that stores various information associated with customer insurance policies, as well as baseline data corresponding to a set of default sensor data and thresholds relating thereto. The policy processing application 584 may interface with the data record storage 580 to retrieve relevant information that the policy processing application 584 may use to manage insurance policies, generate notifications, and/or perform other functionalities, such as determine whether individuals are in peril. Further, the device replacement application may interface with the data record storage 580 to retrieve device information. The memory 578 may include one or more forms of volatile and/or non-volatile, fixed and/or removable memory, such as read-only memory (ROM), erasable programmable read-only memory (EPROM), random access memory (RAM), electrically erasable programmable read-only memory (EEPROM), and/or other hard drives, flash memory, MicroSD cards, and others.


The processing server 535 may further include a communication module 577 configured to communicate data via one or more networks 525. According to some embodiments, the communication module 577 may include one or more transceivers (e.g., WWAN, WLAN, and/or WPAN transceivers) functioning in accordance with IEEE standards, 3GPP standards, or other standards, and configured to receive and transmit data via one or more external ports 576. For instance, the communication module 577 may transmit, via the network 525, baseline data corresponding to a set of default sensor data and thresholds relating thereto.


The processing server 535 may further include a user interface 581 configured to present information to a user and/or receive inputs from the user. As shown in FIG. 5, the user interface 581 may include a display screen 582 and I/O components 583 (e.g., ports, capacitive or resistive touch sensitive input panels, keys, buttons, lights, LEDs, speakers, microphones). According to some embodiments, the user may access the processing server 535 via the user interface 581 to process insurance policies and/or perform other functions. In some embodiments, the processing server 535 may perform the functionalities as discussed herein as part of a “cloud” network or may otherwise communicate with other hardware or software components within the cloud to send, retrieve, and/or otherwise analyze data.


In general, a computer program product in accordance with an embodiment may include a computer usable storage medium (e.g., standard random access memory (RAM), an optical disc, a universal serial bus (USB) drive, or the like) having computer-readable program code embodied therein, wherein the computer-readable program code may be adapted to be executed by the processor 522 (e.g., working in connection with the operating system 579) to facilitate the functions as described herein. In this regard, the program code may be implemented in any desired language, and may be implemented as machine code, assembly code, byte code, interpretable source code or the like (e.g., via C, C++, Java, Actionscript, Objective-C, Javascript, CSS, XML). In some embodiments, the computer program product may be part of a cloud network of resources.


VI. Exemplary User Interfaces



FIGS. 6A and 6B illustrate exemplary interfaces associated with example commands, displays, and actions for electronic devices. An electronic device (e.g., a mobile device, such as a smartphone) may be configured to display the interfaces and/or receive selections and inputs via the interfaces. For example, a dedicated application associated with an insurance provider (or with an independent or assisted living environment) and that is configured to operate on the electronic device may display the interfaces. It should be appreciated that the interfaces are merely examples and that alternative or additional content is envisioned.



FIG. 6A illustrates an interface 650 including details related to a situation in which an individual is deemed to be in peril. In particular, the interface 650 may include an alert that details the situation: that data has been detected indicating that John Doe may have experienced a fall. The interface 650 further enables a user of the electronic device to select an appropriate action to take. In particular, the interface 650 may include a “dismiss” selection 651 that, upon selection, may dismiss the interface 650, a “contact” selection 652 that, upon selection, may cause the electronic device to contact John Doe (e.g., via a phone call or text message) or another individual, and a “more info” selection 653 that, upon selection, may retrieve more information related to the situation.



FIG. 6B illustrates an additional interface 655 that may include more information relating to the situation indicated in FIG. 6A. In one implementation, the electronic device may display the interface 655 in response to the user selecting the “more info” selection 653 of the interface 650. The interface 655 may indicate the location of John Doe (as shown: Room 204). Thus, the user of the electronic device may know where to find John Doe within the independent or assisted living environment, and may be better equipped to handle the situation and/or may be afforded with the ability to reach John Doe in a shorter amount of time. The interface 655 may include an “okay” selection 656 that, upon selection, may dismiss the interface 655.


VII. Exemplary Method of Detecting Periled Individuals


In one aspect, a computer-implemented method of detecting periled individuals within an independent or assisted living environment may be provided. The independent or assisted living environment may be populated with a hardware controller in communication with a plurality of sensors. The method may include (1) receiving, by the hardware controller, sensor data from at least one sensor located within the independent or assisted living environment, the at least one sensor either (i) secured to an individual or (ii) configured to sense environmental data within a proximity of the individual; (2) analyzing the sensor data by one or more processors; (3) based upon the analyzing, determining that the individual is in peril; (4) responsive to determining that the individual is in peril, generating a notification indicating that the individual is in peril; and/or (5) communicating the notification to an electronic device of an additional individual to facilitate alleviating a risk associated with the individual being in peril. The method may include additional, less, or alternate actions, including those discussed elsewhere herein, and/or may be implemented via one or more local or remote processors, and/or via computer-executable instructions stored on non-transitory computer-readable media or medium.


In one implementation, receiving the sensor data may include receiving motion data from a wearable device that is removably secured to the individual. In another implementation, analyzing the sensor data may include analyzing the motion data from the wearable device to determine that the individual has experienced a rapid acceleration. In a further implementation, the sensor data may include vital sign data, and analyzing the sensor data may include analyzing the vital sign data to determine that the individual is in need of immediate care.


In a still further implementation, analyzing the sensor data may include (1) analyzing the sensor data to determine a current condition of the individual; (2) receiving updated sensor data from the at least one sensor; and (3) determining, from the updated sensor data, that the current condition is maintained for a threshold period of time.
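For illustration, a persistence check of this kind might be sketched as follows; the window length is a hypothetical placeholder.

```python
from datetime import datetime, timedelta

def condition_persists(observations: list[tuple[datetime, bool]],
                       threshold: timedelta = timedelta(minutes=5)) -> bool:
    """Return True when the current condition has been continuously observed for at
    least the threshold period; observations are (timestamp, condition_present) pairs,
    ordered oldest to newest."""
    start = None
    for timestamp, present in observations:
        if present:
            if start is None:
                start = timestamp
            if timestamp - start >= threshold:
                return True
        else:
            start = None
    return False
```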


Additionally, in one implementation, the sensor data may include audio data received from a microphone located within a room of the individual, and wherein analyzing the sensor data may include determining, from the audio data, that the individual has suffered a fall. In another implementation, analyzing the sensor data may include accessing baseline sensor data corresponding to the retrieved sensor data, and comparing the received sensor data to the baseline sensor data. In a further implementation, comparing the received sensor data to the baseline sensor data may include (1) determining a level of similarity between the retrieved sensor data and the baseline sensor data, and (2) determining that the level of similarity meets a threshold value.


In another implementation, communicating the notification to the electronic device may include (1) identifying a caregiver for the individual, and (2) communicating the notification to the electronic device of the caregiver. In an additional implementation, the method may further include identifying an insurance-related event associated with the individual being in peril.


VIII. Exemplary Hardware Controller


In a further aspect, a hardware controller for detecting periled individuals within an independent or assisted living environment, where the hardware controller may be in communication with a set of sensors populated within the independent or assisted living environment, may be provided. The hardware controller may include (1) a communication module adapted to interface with the set of sensors populated within the independent or assisted living environment; (2) a memory adapted to store non-transitory computer executable instructions; and/or (3) a processor adapted to interface with the communication module and the memory. The processor may be configured to execute the non-transitory computer executable instructions to cause the processor to (a) receive, via the communication module, sensor data from at least one sensor of the set of sensors located within the independent or assisted living environment, the at least one sensor either (i) secured to an individual or (ii) configured to sense environmental data within a proximity of the individual, (b) analyze the sensor data, (c) based upon the analyzing, determine that the individual is in peril, (d) responsive to determining that the individual is in peril, generate a notification indicating that the individual is in peril, and/or (e) communicate, via the communication module, the notification to an electronic device of an additional individual to facilitate alleviating a risk associated with the individual being in peril. The hardware controller may include additional, less, or alternate functionality, including that discussed elsewhere herein.


In one implementation, to receive the sensor data, the processor may be configured to receive motion data from a wearable device that is removably secured to the individual. In another implementation, to analyze the sensor data, the processor may be configured to analyze the motion data from the wearable device to determine that the individual has experienced a rapid acceleration. Further, in one implementation, the sensor data may include vital sign data, and, to analyze the sensor data, the processor may be configured to analyze the vital sign data to determine that the individual is in need of immediate care.


In an additional implementation, to analyze the sensor data, the processor may be configured to (1) analyze the sensor data to determine a current condition of the individual, (2) receive updated sensor data from the at least one sensor, and (3) determine, from the updated sensor data, that the current condition is maintained for a threshold period of time. In a further implementation, the sensor data may include audio data received from a microphone located within a room of the individual, and, to analyze the sensor data, the processor may be configured to determine, from the audio data, that the individual has suffered a fall.


Additionally, in one implementation, to analyze the sensor data, the processor may be configured to (1) access baseline sensor data corresponding to the received sensor data, and (2) compare the received sensor data to the baseline sensor data. In another implementation, to compare the received sensor data to the baseline sensor data, the processor may be configured to (1) determine a level of similarity between the received sensor data and the baseline sensor data, and (2) determine that the level of similarity meets a threshold value.


In one implementation, to communicate the notification to the electronic device, the processor may be configured to (1) identify a caregiver for the individual, and (2) communicate the notification to the electronic device of the caregiver. Moreover, in one implementation, the processor may be further configured to identify an insurance-related event associated with the individual being in peril.


IX. Exemplary Independent Living


The systems and methods may facilitate various functionalities associated with independent and/or assisted living environments. In some implementations, the home controller may analyze various sensor data (e.g., vibrations, sounds, pressure data, etc.) to determine whether an individual has fallen, or otherwise detect that the individual has suffered an injury or is immobile. For instance, the sensor data may be acceleration data from a wearable device that indicates a sudden acceleration, which may indicate a fall or other incident.
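

As a non-limiting sketch, one simple heuristic for flagging a fall from wearable acceleration data is shown below in Python; the thresholds, preprocessing assumptions, and sample values are hypothetical.

    def detect_fall(accel_samples_g, impact_threshold_g=2.5, stillness_threshold_g=0.2):
        """Flag a possible fall when a sample exceeds an impact threshold and the
        magnitude of subsequent samples stays near zero (the wearer is not moving).

        accel_samples_g: list of acceleration magnitudes, in units of g, with
        gravity already removed (an illustrative preprocessing assumption).
        """
        for i, magnitude in enumerate(accel_samples_g):
            if magnitude >= impact_threshold_g:
                after = accel_samples_g[i + 1:]
                if after and all(m <= stillness_threshold_g for m in after):
                    return True   # sharp impact followed by stillness
        return False

    # Illustrative stream: normal movement, a spike, then near-stillness.
    samples = [0.1, 0.3, 0.2, 3.1, 0.05, 0.04, 0.06]
    print(detect_fall(samples))   # True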


The controller, and/or the insurance provider remote processing server, may also monitor locations (e.g., via GPS coordinates) of individuals on the premises, as well as receive motion-activated, proximity, and/or connection data from sensors installed at various locations on the premises. The controller, and/or the insurance provider remote processing server, may monitor the temperature of individuals via thermal sensors associated with the individuals. For instance, if the controller, and/or the insurance provider remote processing server, determines that a particular individual's temperature is below a certain threshold for a certain amount of time, then the controller, and/or the insurance provider remote processing server, respectively, may determine that the individual is at risk (and/or that there is an abnormal condition) and may generate and send a notice to another individual of the situation.
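

A minimal Python sketch of such a temperature-duration check follows; the 95°F threshold, the 30-minute window, and the sample readings are illustrative assumptions only.

    from datetime import datetime, timedelta

    def low_temperature_alert(readings, threshold_f=95.0, duration=timedelta(minutes=30)):
        """readings: list of (timestamp, temperature_f) tuples in time order.
        Return True if every reading in the trailing `duration` window is below
        threshold_f, suggesting the individual may be at risk. Assumes the
        readings actually span the window."""
        if not readings:
            return False
        cutoff = readings[-1][0] - duration
        window = [temp for ts, temp in readings if ts >= cutoff]
        return bool(window) and all(temp < threshold_f for temp in window)

    now = datetime.now()
    readings = [(now - timedelta(minutes=m), 94.2) for m in range(45, -1, -5)]
    if low_temperature_alert(readings):
        print("Body temperature below threshold for 30+ minutes -- notifying caregiver.")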


The controller, and/or the insurance provider remote processing server, may also establish baseline or “normal” conditions for an individual and/or for a property or portion of the property, and may determine whether the individual and/or one or more properties or parameters deviate from the baseline or “normal” conditions.


As noted, the methods of smart home control and/or automation, or assisted living, detailed above may also include actions directed to independent living. For example, the wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include audio data. The smart home controller or remote processor may be configured to determine, from its analysis of the audio data (such as via voice recognition or vibrations contained within the audio data), that an insured has fallen, and/or to analyze the audio data to determine an estimated level of severity of an injury for the insured.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include GPS (Global Positioning System) data that the smart home controller or remote processor may use or analyze to identify a location of the insured, such as GPS data transmitted from a mobile device, smart watch, smart glasses, smart clothes, or a wearable electronics device.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include movement data of persons or animals within the insured home. The smart home controller or remote processor may be configured to determine, from analysis of the movement data by the smart home controller or remote processor, a likelihood of an abnormal condition (e.g., insured home occupant or animal injury, unexpected insured home vacancy, etc.) from a lack of movement within the insured home for a given amount of time or a pre-determined threshold of time, and then issue a message to a mobile device of the insured or a friend or neighbor.
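

For illustration, a minimal Python sketch of a lack-of-movement check is shown below; the 12-hour threshold and per-room motion timestamps are assumptions, not part of the disclosure.

    from datetime import datetime, timedelta

    def no_movement_detected(last_motion_events, now=None, threshold=timedelta(hours=12)):
        """last_motion_events: mapping of room name -> datetime of the most recent
        motion event in that room. Returns True if no room has reported motion
        within the threshold window, which may indicate an abnormal condition."""
        now = now or datetime.now()
        latest = max(last_motion_events.values(), default=None)
        return latest is None or (now - latest) > threshold

    events = {
        "kitchen":  datetime(2024, 1, 1, 7, 30),
        "bedroom":  datetime(2024, 1, 1, 6, 45),
        "bathroom": datetime(2024, 1, 1, 7, 10),
    }
    if no_movement_detected(events, now=datetime(2024, 1, 2, 9, 0)):
        print("No movement in over 12 hours -- sending message to designated contact.")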


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include movement data of persons or animals within the insured home. The smart home controller or remote processor may be configured to determine, from analysis of the movement data by the smart home controller or remote processor, a likelihood of an abnormal condition (e.g., unexpected insured home occupancy) from movement detected within the insured home when the insured home is not occupied by the insured, and then issue a message to a mobile device of the insured or a friend or neighbor.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include smart door data from a smart garage door or smart door indicating that the smart garage door or smart door has been opened or shut. The smart home controller or remote processor may be configured to compare the smart door data with other data (such as data associated with an amount of electricity usage, operating equipment, or thermal imaging data) to determine whether an occupant has entered or exited an insured home, and, if so, generate an alert to a family member that the occupant has either entered or exited the insured home, respectively.
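

The following Python sketch illustrates, under hypothetical thresholds, how smart door data might be combined with an electricity-usage signal to guess whether an occupant entered or exited; it is a simplification of the comparison described above.

    def occupant_entered_or_exited(door_opened, power_usage_before_w, power_usage_after_w,
                                   usage_change_threshold_w=150.0):
        """Combine smart-door data with electricity-usage data (an illustrative proxy
        for activity inside the home) to guess whether an occupant entered or exited.
        Returns 'entered', 'exited', 'unknown', or None if the door did not open."""
        if not door_opened:
            return None
        delta = power_usage_after_w - power_usage_before_w
        if delta >= usage_change_threshold_w:
            return "entered"       # usage rose after the door event
        if delta <= -usage_change_threshold_w:
            return "exited"        # usage dropped after the door event
        return "unknown"

    print(occupant_entered_or_exited(True, power_usage_before_w=120.0, power_usage_after_w=430.0))
    # 'entered' -> alert a family member accordingly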


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include data regarding the operation of a stove or oven (e.g., temperature, time on, etc.). The smart home controller or remote processor may analyze the data to determine that the stove or oven is on at too high a temperature or has been on for too long a time, and then automatically de-energize or turn off the stove or oven, and/or generate and transmit a corresponding wireless communication message to the insured indicating the abnormal stove or oven condition.
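

A minimal Python sketch of such a stove or oven check follows; the temperature and duration limits, as well as the action names, are illustrative assumptions.

    def check_stove(temperature_f, minutes_on, max_temp_f=450.0, max_minutes=90):
        """Return a list of actions the controller might take for a stove or oven
        that is too hot or has been on too long (thresholds are illustrative)."""
        actions = []
        if temperature_f > max_temp_f or minutes_on > max_minutes:
            actions.append("send_shutoff_command")         # de-energize the appliance
            actions.append("notify_insured_of_condition")  # abnormal-condition message
        return actions

    print(check_stove(temperature_f=500.0, minutes_on=20))   # too hot
    print(check_stove(temperature_f=350.0, minutes_on=240))  # on too long
    print(check_stove(temperature_f=350.0, minutes_on=20))   # normal -> []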


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include body heat data associated with a body temperature of a person or animal. The smart home controller or remote processor may analyze the body heat data to determine that the body temperature is abnormal (e.g., too cold or too hot), and then generate and transmit a wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the body temperature of the person or animal is abnormal.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include shower data. The smart home controller or remote processor may analyze the shower data to determine that a person has fallen while taking a shower. The shower data may include pressure data (such as from pressure sensing smart hand rails or pressure sensing smart floor or tub mats). The smart home controller or remote processor may then generate and transmit a wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the person has fallen, or has likely fallen, while taking a shower.


The shower data may be generated from pressure sensing matting. The smart home controller or remote processor may be configured to analyze the shower data to determine (a) whether the person that fell has gotten up, (b) a likely severity of a fall taken by the person, and/or (c) whether assistance is likely needed. The smart home controller or remote processor may then generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating the situation or that assistance is needed as determined by the smart home controller or remote processor.
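

One purely illustrative way to reason over pressure-mat samples is sketched below in Python; the sampling rate, pressure thresholds, and severity heuristic are assumptions rather than the disclosed method.

    def assess_shower_fall(pressure_samples, fall_threshold=60.0, stand_threshold=15.0):
        """pressure_samples: readings (e.g., in pounds) from a pressure-sensing
        shower mat taken once per second. A sudden, sustained high reading may
        indicate a fall; a return to low readings may indicate the person got up."""
        fall_index = next((i for i, p in enumerate(pressure_samples)
                           if p >= fall_threshold), None)
        if fall_index is None:
            return {"fall_detected": False}
        after = pressure_samples[fall_index:]
        got_up = any(p <= stand_threshold for p in after[5:])   # low pressure later on
        seconds_down = sum(1 for p in after if p >= fall_threshold)
        return {"fall_detected": True,
                "got_up": got_up,
                "assistance_likely_needed": not got_up and seconds_down >= 30}

    samples = [10, 12, 11, 95, 96, 94, 95] + [93] * 40   # sustained high pressure
    print(assess_shower_fall(samples))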


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include data indicating a normal routine of a person (e.g., time of day they usually wake, go to sleep, shower, cook, watch television, use a computer or other electronics, etc.). The smart home controller or remote processor may be configured to learn, from analysis of the data, the normal routine of the person over time, and to compare present data indicating a current time and/or an activity of the person with the learned normal routine of the person to detect potential abnormal conditions. Thereafter, the smart home controller or remote processor may generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating an abnormal condition detected by the smart home controller or remote processor.
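

By way of example only, the Python sketch below shows one simple way a routine could be learned and compared against current activity; the histogram approach, hour buckets, and count threshold are hypothetical.

    from collections import defaultdict

    def learn_routine(history):
        """history: list of (hour_of_day, activity) pairs observed over time.
        Returns {hour: {activity: count}} describing the person's usual routine."""
        routine = defaultdict(lambda: defaultdict(int))
        for hour, activity in history:
            routine[hour][activity] += 1
        return routine

    def is_abnormal(routine, hour, activity, min_count=2):
        """An activity is flagged as abnormal if it has rarely (or never) been
        observed at this hour in the learned routine."""
        return routine[hour][activity] < min_count

    history = [(7, "wake"), (7, "wake"), (7, "wake"),
               (8, "shower"), (8, "shower"),
               (22, "sleep"), (22, "sleep"), (22, "sleep")]
    routine = learn_routine(history)
    print(is_abnormal(routine, hour=3, activity="cooking"))  # True -> possible abnormal condition
    print(is_abnormal(routine, hour=7, activity="wake"))     # False -> matches routine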


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include data indicating vitals (e.g., blood pressure, heart rate, oxygen levels, etc.) of an occupant of the insured home. The smart home controller or remote processor may analyze the data indicating vitals to detect or determine an abnormal or unhealthy condition. Thereafter, the smart home controller or remote processor may generate and transmit a message to an insured or family members of the occupant when the vitals indicate an abnormal or unhealthy condition of the occupant, as determined or detected by the smart home controller or remote processor.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include data indicating that an infant or young child (1) is in a vicinity of, or in, a swimming pool located within a yard of the insured home, and (2) is not being supervised by an adult. In response to determining such from analysis of the data (and/or based upon the analysis by the smart home controller or remote processor of the wired or wireless communication or data transmission, and/or data received), the smart home controller or remote processor may generate and transmit an electronic warning or message to the insured or family members to facilitate appropriate supervision in the vicinity of the swimming pool.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include a body heat reading of an occupant of the insured home. The smart home controller or remote processor may determine, based upon the analysis by the smart home controller or remote processor of the wired or wireless communication or data transmission, and/or data received, that the body heat of the occupant is too cold or too hot (as compared to normal), and then remotely adjust a smart thermostat setting accordingly (e.g., (1) if the occupant's body temperature is too hot, then the smart home controller or remote processor may direct or control turning on the air conditioning or adjusting the air conditioning, or (2) if the occupant's body temperature is too cold, then the smart home controller or remote processor may direct or control turning on the furnace/heat or adjusting the furnace setting) to maintain an appropriate temperature within the insured home and/or health of the occupant.
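

A minimal, illustrative Python sketch of such a body-heat-driven thermostat suggestion follows; the normal temperature band, step size, and command names are assumptions.

    def thermostat_adjustment(body_temp_f, normal_low_f=97.0, normal_high_f=99.5, step_f=2.0):
        """Return a (command, delta_f) suggestion for a smart thermostat based on
        an occupant's body heat reading; None means no adjustment is needed."""
        if body_temp_f < normal_low_f:
            return ("raise_setpoint", +step_f)   # occupant too cold -> more heat
        if body_temp_f > normal_high_f:
            return ("lower_setpoint", -step_f)   # occupant too hot -> more cooling
        return None

    print(thermostat_adjustment(95.8))   # ('raise_setpoint', 2.0)
    print(thermostat_adjustment(100.4))  # ('lower_setpoint', -2.0)
    print(thermostat_adjustment(98.6))   # None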


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include bed data gathered from a bed sensor, smart bed, or bedroom camera indicating an abnormal condition or that a person has remained in bed for an abnormal period of time. Based upon the analysis of the bed data by the smart home controller or remote processor, the smart home controller or remote processor may determine that the abnormal condition exists or that the person has remained in bed for an abnormal period of time. Thereafter, the smart home controller or remote processor may generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the abnormal condition exists or that the person has remained in bed for an abnormal amount of time.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include gas data gathered from, or generated by, a gas sensor, detector, or meter indicating a gas or natural gas leak within the insured home. Based upon the analysis by the smart home controller or remote processor of the gas data, the smart home controller or remote processor may determine that an abnormal condition exists, such as the gas or natural gas leak exists, within the home. The smart home controller or remote processor may generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the abnormal condition, or gas or natural gas leak, exists.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include odor data gathered from, or generated by, an odor detector, sensor, or meter indicating an abnormal odor condition within the insured home. Based upon the analysis by the smart home controller or remote processor of the odor data, the smart home controller or remote processor may determine that the abnormal odor condition exists within the home. The smart home controller or remote processor may then generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the abnormal odor condition exists.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include smell data gathered from, or generated by, a smell detector, sensor, or meter indicating an abnormal smell condition within the insured home. Based upon the analysis by the smart home controller or remote processor of the smell data, the smart home controller or remote processor may determine that the abnormal smell condition exists within the home. The smart home controller or remote processor may then generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the abnormal smell condition exists.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include mold data gathered from, or generated by, a mold detector, sensor, or meter indicating an abnormal mold condition within the insured home. Based upon the analysis by the smart home controller or remote processor of the mold data, the smart home controller or remote processor may determine that the abnormal mold condition exists within the home. The smart home controller or remote processor may then generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the abnormal mold condition exists.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include temperature data gathered from, or generated by, a temperature detector, sensor, or meter indicating an abnormal temperature condition within the insured home. Based upon the analysis by the smart home controller or remote processor of the temperature data, the smart home controller or remote processor may determine that the abnormal temperature condition exists within the home. The smart home controller or remote processor may then generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the abnormal temperature condition exists.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include humidity data gathered from, or generated by, a humidity detector, sensor, or meter indicating an abnormal humidity condition within the insured home. Based upon the analysis by the smart home controller or remote processor of the humidity data, the smart home controller or remote processor may determine that the abnormal humidity condition exists within the home. The smart home controller or remote processor may then generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the abnormal humidity condition exists.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include moisture data gathered from, or generated by, a moisture detector, sensor, or meter indicating an abnormal moisture condition within the insured home. Based upon the analysis by the smart home controller or remote processor of the moisture data, the smart home controller or remote processor may determine that the abnormal moisture condition exists within the home. The smart home controller or remote processor may then generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the abnormal moisture condition exists.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include sound data gathered from, or generated by, a sound detector, sensor, or meter indicating an abnormal sound condition within the insured home. Based upon the analysis by the smart home controller or remote processor of the sound data, the smart home controller or remote processor may determine that the abnormal sound condition exists within the home. The smart home controller or remote processor may then generate and transmit a corresponding wireless communication message to an insured, family member, friend, neighbor, or caregiver indicating that the abnormal sound condition exists.


The gas, odor, smell, mold, temperature, humidity, moisture, or sound data may be analyzed at or via the smart home controller or remote processor. From the data analysis, the smart home controller or remote processor may determine a likely cause of an associated abnormal gas, odor, smell, mold, temperature, humidity, moisture, or sound condition, respectively. The methods discussed herein may include (1) directing or controlling, at or via the smart home controller or remote processor, various smart home equipment to mitigate the abnormal gas, odor, smell, mold, temperature, humidity, moisture, or sound condition, respectively (such as operating fans, or smart ventilation, air conditioning, furnace, heating, environmental, or other smart equipment); (2) generating an insurance claim, at or via the smart home controller or remote processor, associated with the abnormal gas, odor, smell, mold, temperature, humidity, moisture, or sound condition, respectively; (3) handling or processing the insurance claim, at or via the smart home controller or remote processor, associated with the abnormal gas, odor, smell, mold, temperature, humidity, moisture, or sound condition, respectively; and/or (4) updating, at or via the smart home controller or remote processor, a premium, rate, or discount for an insurance policy associated with, or covering, the insured home and/or insured or occupant of the insured home based upon the abnormal gas, odor, smell, mold, temperature, humidity, moisture, or sound condition, respectively.


The gas, odor, smell, mold, temperature, humidity, moisture, or sound data may be analyzed at or via the smart home controller or remote processor to determine a corresponding abnormal condition, or a likely cause of the abnormal condition. For instance, the smart home controller or remote processor may receive gas, odor, smell, mold, temperature, humidity, moisture, or sound data indicative of actual gas, odor, smell, mold, temperature, humidity, moisture, or sound conditions within the insured home. The smart home controller or remote processor may compare the gas, odor, smell, mold, temperature, humidity, moisture, or sound data received with expected gas, odor, smell, mold, temperature, humidity, moisture, or sound data or conditions stored in a memory unit, and if differences exist, the smart home controller or remote processor may determine that a corresponding abnormal condition exists and/or determine a cause (or potential cause) of the corresponding abnormal condition. Additionally or alternatively, the smart home controller or remote processor may compare the gas, odor, smell, mold, temperature, humidity, moisture, or sound data received with a baseline of “normal” gas, odor, smell, mold, temperature, humidity, moisture, or sound data or conditions of the insured home gathered over time and/or stored in a memory unit, and if differences exist, the smart home controller or remote processor may determine that a corresponding abnormal condition exists and/or determine a cause (or potential cause) of the corresponding abnormal condition.
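

For illustration, the Python sketch below compares received environmental readings against stored expected (or baseline) values and reports which conditions deviate; the relative-tolerance rule and sample values are assumptions.

    def abnormal_environment(readings, expected, tolerance=0.25):
        """readings/expected: mappings of condition name (e.g., 'humidity_pct',
        'gas_ppm') to numeric values. Returns the conditions whose readings differ
        from the expected values by more than the given relative tolerance."""
        abnormal = {}
        for name, value in readings.items():
            baseline = expected.get(name)
            if baseline is None:
                continue
            scale = abs(baseline) if baseline else 1.0
            if abs(value - baseline) / scale > tolerance:
                abnormal[name] = (value, baseline)
        return abnormal

    expected = {"humidity_pct": 40.0, "temperature_f": 70.0, "gas_ppm": 0.5}
    readings = {"humidity_pct": 78.0, "temperature_f": 71.0, "gas_ppm": 6.0}
    print(abnormal_environment(readings, expected))
    # {'humidity_pct': (78.0, 40.0), 'gas_ppm': (6.0, 0.5)}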


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include data regarding the operation of a stove or oven (e.g., temperature, time on, etc.). Based upon the analysis by the smart home controller or remote processor of the data, the smart home controller or remote processor may determine that an abnormal condition exists within the home. For instance, the smart home controller or remote processor may analyze the data to determine that the stove or oven has been left unattended for too long, and automatically turn off or de-energize the stove or oven, respectively, and/or generate and transmit a corresponding wireless communication message to the insured or a family member indicating that the abnormal condition exists and/or that the stove or oven has been automatically turned off.


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include data regarding running water (e.g., washer, tub, shower, etc.). Based upon the analysis by the smart home controller or remote processor of the data, the smart home controller or remote processor may determine that an abnormal condition exists within the home. For instance, the smart home controller or remote processor may analyze the data to determine that the running water has been flowing for too long or left unattended for too long, and automatically turn off or de-energize an electrical valve (e.g., solenoid valve) or stop a source of the flowing water, and/or generate and transmit a corresponding wireless communication message to the insured or a family member indicating that the abnormal condition exists and/or that the water has been automatically turned off.
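

The following minimal Python sketch illustrates one way to detect water left running too long from per-minute flow samples; the 45-minute limit and the sample data are hypothetical.

    def water_left_running(flow_events, max_continuous_minutes=45):
        """flow_events: list of (minute_mark, flowing) samples taken once per minute,
        where flowing is True while water is running. Returns True when water has
        been flowing continuously for longer than max_continuous_minutes."""
        run = 0
        for _, flowing in flow_events:
            run = run + 1 if flowing else 0
            if run > max_continuous_minutes:
                return True
        return False

    events = [(m, True) for m in range(50)]          # 50 minutes of continuous flow
    if water_left_running(events):
        print("Closing solenoid valve and notifying the insured of the abnormal condition.")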


The wired or wireless communication or data transmission, and/or data, received and/or analyzed by the smart home controller or remote processor may include fireplace data regarding fireplace operation (e.g., flue, duct, or damper position, chimney opening, gas, etc.). Based upon the analysis by the smart home controller or remote processor of the data, the smart home controller or remote processor may determine that an abnormal condition exists within the home. For instance, the smart home controller or remote processor may analyze the data to determine that an abnormal fireplace condition exists (e.g., flue or damper in the wrong position, gas on or off, chimney plugged with debris or a bird nest, smoke collecting in the interior of the insured home, etc.). In response, the smart home controller or remote processor may generate and transmit an associated alert or message, and/or automatically direct or control (such as via wireless communication) the operation of smart equipment to alleviate the impact of the abnormal fireplace condition (e.g., opening, closing, or changing the flue or damper position; opening, closing, or operating a gas valve; starting or operating ventilation equipment or fans to move smoke out of the interior of the insured home; preventing the damper(s) from closing until the temperature of the fireplace has cooled down to a predetermined set point after use, etc.).


X. Additional Considerations


As used herein, the term “smart” may refer to devices, sensors, or appliances located within or proximate to a property, and with the ability to communicate information about the status of the device, sensor, or appliance and/or receive instructions that control the operation of the device, sensor, or appliance. In one instance, a smart thermostat may be able to remotely communicate the current temperature of the home and receive instructions to adjust the temperature to a new level. In another instance, a smart water tank may be able to remotely communicate the water level contained therein and receive instructions to restrict the flow of water leaving the tank. In contrast, “dumb” devices, sensors, or appliances located within or proximate to a property require manual control. Referring again to the thermostat embodiment, to adjust the temperature on a “dumb” thermostat, a person would have to manually interact with the thermostat. As such, a person is unable to use a communication network to remotely adjust a “dumb” device, sensor, or appliance.


A “smart device” as used herein may refer to any of a smart device, sensor, appliance, and/or other smart equipment that may be located (or disposed) within or proximate to a property. In embodiments in which an appliance and a sensor external to the particular appliance are associated with each other, “smart device” may refer to both the external sensors and the appliance collectively. Some exemplary devices that may be “smart devices” are, without limitation, valves, piping, clothes washers/dryers, dish washers, refrigerators, sprinkler systems, toilets, showers, sinks, soil monitors, doors, locks, windows, shutters, ovens, grills, fire places, furnaces, lighting, sump pumps, security cameras, and alarm systems. Similarly, an individual associated with the property shall be referred to as the “homeowner,” “property owner,” or “policyholder,” but it is also envisioned that the individual is a family member of the homeowner, a person renting/subletting the property, a person living or working on the property, a neighbor of the property, or any other individual that may have an interest in preventing or mitigating damage to the property.


Further, any reference to “home” or “property” is meant to be exemplary and not limiting. The systems and methods described herein may be applied to any property, such as offices, farms, lots, parks, and/or other types of properties or buildings. Accordingly, “homeowner” may be used interchangeably with “property owner.”


With the foregoing, an insurance customer may opt-in to a rewards, insurance discount, or other type of program. After the insurance customer provides their affirmative consent, an insurance provider remote server may collect data from the customer's mobile device, smart home controller, or other smart devices—such as with the customer's permission. The data collected may be related to insured assets or individuals before (and/or after) an insurance-related event, including those events discussed elsewhere herein. In return, risk averse insureds, home owners, home or apartment occupants, or care givers may receive discounts or insurance cost savings related to life, home, renters, personal articles, auto, and other types of insurance from the insurance provider.


In one aspect, smart or interconnected home data, and/or other data, including the types of data discussed elsewhere herein, may be collected or received by an insurance provider remote server, such as via direct or indirect wireless communication or data transmission from a smart home controller, mobile device, or other customer computing device, after a customer affirmatively consents or otherwise opts-in to an insurance discount, reward, or other program. The insurance provider may then analyze the data received with the customer's permission to provide benefits to the customer. As a result, risk averse customers may receive insurance discounts or other insurance cost savings based upon data that reflects low risk behavior and/or technology that mitigates or prevents risk to home or apartment occupants, and/or insured assets, such as homes, personal belongings, or vehicles.


Although the following text sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the invention is defined by the words of the claims set forth at the end of this patent. The detailed description is to be construed as exemplary only and does not describe every possible embodiment, as describing every possible embodiment would be impractical, if not impossible. One could implement numerous alternate embodiments, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.


Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in exemplary configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.


Additionally, certain embodiments are described herein as including logic or a number of routines, subroutines, applications, or instructions. These may constitute either software (e.g., code embodied on a non-transitory, machine-readable medium) or hardware. In hardware, the routines, etc., are tangible units capable of performing certain operations and may be configured or arranged in a certain manner. In exemplary embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.


In various embodiments, a hardware module may be implemented mechanically or electronically. For instance, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For instance, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for instance, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.


Hardware modules may provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for instance, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For instance, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and may operate on a resource (e.g., a collection of information).


The various operations of exemplary methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some exemplary embodiments, comprise processor-implemented modules.


Similarly, the methods or routines described herein may be at least partially processor-implemented. For instance, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some exemplary embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or as a server farm), while in other embodiments the processors may be distributed across a number of locations.


The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some exemplary embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other exemplary embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.


Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.


As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.


The terms “insurer,” “insuring party,” and “insurance provider” are used interchangeably herein to generally refer to a party or entity (e.g., a business or other organizational entity) that provides insurance products, e.g., by offering and issuing insurance policies. Typically, but not necessarily, an insurance provider may be an insurance company.


Although the embodiments discussed herein relate to property or care-type insurance policies, it should be appreciated that an insurance provider may offer or provide one or more different types of insurance policies. Other types of insurance policies may include, for instance, condominium owner insurance, renter's insurance, life insurance (e.g., whole-life, universal, variable, term), health insurance, disability insurance, long-term care insurance, annuities, business insurance (e.g., property, liability, commercial auto, workers compensation, professional and specialty liability, inland marine and mobile property, surety and fidelity bonds), automobile insurance, boat insurance, insurance for catastrophic events such as flood, fire, volcano damage and the like, motorcycle insurance, farm and ranch insurance, personal liability insurance, personal umbrella insurance, community organization insurance (e.g., for associations, religious organizations, cooperatives), personal articles, and/or other types of insurance products. In embodiments as described herein, the insurance providers process claims related to insurance policies that cover one or more properties (e.g., homes, automobiles, personal property), although processing other insurance policies is also envisioned.


The terms “insured,” “insured party,” “policyholder,” “customer,” “claimant,” and “potential claimant” are used interchangeably herein to refer to a person, party, or entity (e.g., a business or other organizational entity) that is covered by the insurance policy, e.g., whose insured article or entity (e.g., property, life, health, auto, home, business) is covered by the policy. A “guarantor,” as used herein, generally refers to a person, party or entity that is responsible for payment of the insurance premiums. The guarantor may or may not be the same party as the insured, such as in situations when a guarantor has power of attorney for the insured. An “annuitant,” as referred to herein, generally refers to a person, party or entity that is entitled to receive benefits from an annuity insurance product offered by the insuring party. The annuitant may or may not be the same party as the guarantor.


As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For instance, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For instance, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).


In addition, use of the “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the description. This description, and the claims that follow, should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.


The patent claims at the end of this patent application are not intended to be construed under 35 U.S.C. § 112(f) unless traditional means-plus-function language is expressly recited, such as “means for” or “step for” language being explicitly recited in the claim(s).


This detailed description is to be construed as examples and does not describe every possible embodiment, as describing every possible embodiment would be impractical, if not impossible. One could implement numerous alternate embodiments, using either current technology or technology developed after the filing date of this application.

Claims
  • 1. A computer-implemented method of detecting periled individuals within a property, the computer-implemented method comprising:
    receiving, by a hardware controller, sensor data from at least one sensor located within the property, of a plurality of sensors installed at various locations on the property;
    analyzing the sensor data by one or more processors, wherein analyzing the sensor data includes analyzing sensor data associated with a room of the property, and wherein analyzing the sensor data is based upon comparing the sensor data associated with the room of the property to baseline sensor data associated with the room of the property to determine a level of similarity between the sensor data associated with the room of the property and the baseline sensor data associated with the room of the property;
    based upon whether the level of similarity meets a threshold value, determining that an individual located in the room of the property has experienced a fall;
    responsive to determining that the individual located in the room of the property has experienced the fall, generating a notification indicating that the individual located in the room of the property has experienced the fall; and
    communicating the notification to an electronic device.
  • 2. The computer-implemented method of claim 1, wherein the sensor data includes imaging data.
  • 3. The computer-implemented method of claim 1, wherein the sensor data includes motion data.
  • 4. The computer-implemented method of claim 1, wherein the electronic device is a personal digital assistant device.
  • 5. The computer-implemented method of claim 1, wherein analyzing the sensor data is further based upon accessing baseline sensor data associated with the individual.
  • 6. The computer-implemented method of claim 5, wherein analyzing the sensor data based upon accessing baseline sensor data associated with the individual includes learning a routine associated with the individual.
  • 7. The computer-implemented method of claim 1, wherein analyzing the sensor data comprises:
    analyzing the sensor data to determine a current condition of the individual;
    receiving updated sensor data from the at least one sensor; and
    determining, from the updated sensor data, that a current condition is maintained for a threshold period of time.
  • 8. A hardware controller for detecting periled individuals within a property, the hardware controller in communication with a set of sensors installed at various locations on the property, comprising:
    a communication module adapted to interface with a plurality of sensors installed at various locations on the property;
    a non-transitory memory adapted to store computer executable instructions; and
    a processor adapted to interface with the communication module and the non-transitory memory, wherein the processor is configured to execute the computer executable instructions to cause the processor to:
      receive, via the communication module, sensor data from at least one sensor of the plurality of sensors installed at various locations on the property,
      analyze the sensor data, wherein analyzing the sensor data includes analyzing sensor data associated with a room of the property, and wherein analyzing the sensor data is based upon comparing the sensor data associated with the room of the property to baseline sensor data associated with the room of the property to determine a level of similarity between the sensor data associated with the room of the property and the baseline sensor data associated with the room of the property,
      based upon whether the level of similarity meets a threshold value, determine that an individual located in the room of the property has experienced a fall,
      responsive to determining that the individual located in the room of the property has experienced the fall, generate a notification indicating that the individual located in the room of the property has experienced the fall, and
      communicate, via the communication module, the notification to an electronic device.
  • 9. The hardware controller of claim 8, wherein the sensor data includes imaging data.
  • 10. The hardware controller of claim 8, wherein the sensor data includes motion data.
  • 11. The hardware controller of claim 8, wherein the electronic device is a personal digital assistant device.
  • 12. The hardware controller of claim 8, wherein to analyze the sensor data, the processor is further configured to execute the computer executable instructions to cause the processor to access baseline sensor data associated with the individual.
  • 13. The hardware controller of claim 12, wherein analyzing the sensor data based upon accessing baseline sensor data associated with the individual includes learning a routine associated with the individual.
  • 14. A non-transitory computer-readable medium storing instructions for detecting periled individuals within a property, that, when executed by one or more processors, cause the one or more processors to:
    receive sensor data from at least one sensor located within the property, of a plurality of sensors installed at various locations on the property;
    analyze the sensor data, wherein analyzing the sensor data includes analyzing sensor data associated with a room of the property, and wherein analyzing the sensor data is based upon comparing the sensor data associated with the room of the property to baseline sensor data associated with the room of the property to determine a level of similarity between the sensor data associated with the room of the property and the baseline sensor data associated with the room of the property;
    based upon whether the level of similarity meets a threshold value, determine that an individual located in the room of the property has experienced a fall;
    responsive to determining that the individual located in the room of the property has experienced the fall, generate a notification indicating that the individual located in the room of the property has experienced the fall; and
    communicate the notification to an electronic device.
  • 15. The non-transitory computer-readable medium storing instructions of claim 14, wherein the sensor data includes imaging data.
  • 16. The non-transitory computer-readable medium storing instructions of claim 14, wherein the sensor data includes motion data.
  • 17. The non-transitory computer-readable medium storing instructions of claim 14, wherein the electronic device is a personal digital assistant device.
  • 18. The non-transitory computer-readable medium storing instructions of claim 14, wherein the instructions that cause the one or more processors to analyze the sensor data include instructions that cause the one or more processors to analyze the sensor data based upon accessing baseline sensor data associated with the individual.
  • 19. The non-transitory computer-readable medium storing instructions of claim 18, wherein analyzing the sensor data based upon accessing baseline sensor data associated with the individual includes learning a routine associated with the individual.
  • 20. The non-transitory computer-readable medium storing instructions of claim 14, wherein the instructions that cause the one or more processors to analyze the sensor data include instructions that cause the one or more processors to:
    analyze the sensor data to determine a current condition of the individual;
    receive updated sensor data from the at least one sensor of the plurality of sensors; and
    determine, from the updated sensor data, that a current condition is maintained for a threshold period of time.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. patent application Ser. No. 17/706,302 (filed Mar. 28, 2022, and entitled “SYSTEMS AND METHODS for IMPROVED ASSISTED OR INDEPENDENT LIVING ENVIRONMENTS”); which is a continuation of U.S. patent application Ser. No. 17/701,316 (filed Mar. 22, 2022, and entitled “SYSTEMS AND METHODS for IMPROVED ASSISTED OR INDEPENDENT LIVING ENVIRONMENTS”); which is a continuation of U.S. patent application Ser. No. 16/738,328 (filed Jan. 9, 2020, and entitled “SYSTEMS AND METHODS for IMPROVED ASSISTED OR INDEPENDENT LIVING ENVIRONMENTS”); which is a continuation of U.S. patent application Ser. No. 14/873,865 (filed Oct. 2, 2015, and entitled “SYSTEMS AND METHODS for IMPROVED ASSISTED OR INDEPENDENT LIVING ENVIRONMENTS”); which claims benefit of the filing date of U.S. Provisional Patent Application Nos. 62/060,962 (filed Oct. 7, 2014, and entitled “SYSTEMS AND METHODS FOR MANAGING DEVICES WITHIN A CONNECTED PROPERTY AND INSURANCE POLICIES ASSOCIATED THEREWITH”); 62/105,407 (filed Jan. 20, 2015, and entitled “SYSTEMS AND METHODS FOR MANAGING DEVICES WITHIN A CONNECTED PROPERTY AND INSURANCE POLICIES ASSOCIATED THEREWITH”); 62/187,624 (filed Jul. 1, 2015, and entitled “SYSTEMS AND METHODS FOR FACILITATING DEVICE REPLACEMENT WITHIN A CONNECTED PROPERTY”); 62/187,645 (filed Jul. 1, 2015, and entitled “SYSTEMS AND METHODS FOR MANAGING BUILDING CODE COMPLIANCE FOR A PROPERTY”); 62/187,651 (filed Jul. 1, 2015, and entitled “SYSTEMS AND METHODS FOR AUTOMATICALLY GENERATING AN ESCAPE ROUTE”); 62/187,642 (filed Jul. 1, 2015, and entitled “SYSTEMS AND METHODS FOR ANALYZING SENSOR DATA TO DETECT PROPERTY INTRUSION EVENTS”); 62/187,666 (filed Jul. 1, 2015, and entitled “SYSTEMS AND METHODS FOR IMPROVED ASSISTED OR INDEPENDENT LIVING ENVIRONMENTS”); 62/189,329 (filed Jul. 7, 2015, and entitled “SYSTEMS AND METHODS FOR MANAGING WARRANTY INFORMATION ASSOCIATED WITH DEVICES POPULATED WITHIN A PROPERTY”); 62/193,317 (filed Jul. 16, 2015, and entitled “SYSTEMS AND METHODS FOR MANAGING SMART DEVICES BASED UPON ELECTRICAL USAGE DATA”); 62/197,343 (filed Jul. 27, 2015, and entitled “SYSTEMS AND METHODS FOR CONTROLLING SMART DEVICES BASED UPON IMAGE DATA FROM IMAGE SENSORS”); 62/198,813 (filed Jul. 30, 2015, and entitled “SYSTEMS AND METHODS FOR MANAGING SERVICE LOG INFORMATION”); 62/200,375 (filed Aug. 3, 2015, and entitled “SYSTEMS AND METHODS FOR AUTOMATICALLY RESPONDING TO A FIRE”); 62/201,671 (filed Aug. 6, 2015, and entitled “SYSTEMS AND METHODS FOR AUTOMATICALLY MITIGATING RISK OF DAMAGE FROM BROKEN CIRCUITS”); 62/220,383 (filed Sep. 18, 2015, and entitled “METHODS AND SYSTEMS FOR RESPONDING TO A BROKEN CIRCUIT”)—which are all hereby incorporated by reference in their entireties.

10335059 Annegarn et al. Jul 2019 B2
10346811 Jordan et al. Jul 2019 B1
10353359 Jordan et al. Jul 2019 B1
10356303 Jordan et al. Jul 2019 B1
10360345 Ramsdell et al. Jul 2019 B2
10380692 Parker et al. Aug 2019 B1
10387966 Shah et al. Aug 2019 B1
10388135 Jordan et al. Aug 2019 B1
10412169 Madey et al. Sep 2019 B1
10446000 Friar et al. Oct 2019 B2
10446007 Kawazu et al. Oct 2019 B2
10467476 Cardona et al. Nov 2019 B1
10469282 Konrardy et al. Nov 2019 B1
10475141 McIntosh et al. Nov 2019 B2
10480825 Riblet et al. Nov 2019 B1
10482746 Moon et al. Nov 2019 B1
10506411 Jacob Dec 2019 B1
10506990 Lee et al. Dec 2019 B2
10514669 Call et al. Dec 2019 B1
10515372 Jordan et al. Dec 2019 B1
10522009 Jordan et al. Dec 2019 B1
10522021 Jason Dec 2019 B1
10546478 Moon et al. Jan 2020 B1
10547918 Moon et al. Jan 2020 B1
10548512 Hausdorff et al. Feb 2020 B2
10553096 Ashar et al. Feb 2020 B2
10565541 Payne et al. Feb 2020 B2
10573146 Jordan et al. Feb 2020 B1
10573149 Jordan et al. Feb 2020 B1
10579028 Jacob Mar 2020 B1
10586177 Choueiter et al. Mar 2020 B1
10607295 Hakimi-Boushehri et al. Mar 2020 B1
10621686 Mazar et al. Apr 2020 B2
10623790 Maddalena Apr 2020 B2
10634576 Schick et al. Apr 2020 B1
10664922 Madigan et al. May 2020 B1
10679292 Call et al. Jun 2020 B1
10685402 Bryant et al. Jun 2020 B1
10699346 Corder et al. Jun 2020 B1
10699348 Devereaux et al. Jun 2020 B1
10726494 Shah et al. Jul 2020 B1
10726500 Shah et al. Jul 2020 B1
10733671 Hakimi-Boushehri et al. Aug 2020 B1
10733868 Moon et al. Aug 2020 B2
10735829 Petri et al. Aug 2020 B2
10740691 Choueiter et al. Aug 2020 B2
10741033 Jordan et al. Aug 2020 B1
10750252 Petri et al. Aug 2020 B2
10795329 Jordan et al. Oct 2020 B1
10796557 Sundermeyer et al. Oct 2020 B2
10802477 Konrardy et al. Oct 2020 B1
10804700 Cohen et al. Oct 2020 B2
10818105 Konrardy et al. Oct 2020 B1
10823458 Riblet et al. Nov 2020 B1
10824971 Davis et al. Nov 2020 B1
10825316 Victor Nov 2020 B1
10825318 Williams et al. Nov 2020 B1
10825320 Moon et al. Nov 2020 B1
10825321 Moon et al. Nov 2020 B2
10832225 Davis et al. Nov 2020 B1
10846800 Bryant et al. Nov 2020 B1
10907844 Ribbich et al. Feb 2021 B2
10922756 Call et al. Feb 2021 B1
10922948 Moon et al. Feb 2021 B1
10943447 Jordan et al. Mar 2021 B1
10970990 Jacob Apr 2021 B1
10990069 Jacob Apr 2021 B1
11003334 Conway et al. May 2021 B1
11004320 Jordan et al. May 2021 B1
11015997 Schick et al. May 2021 B1
11017480 Shah et al. May 2021 B2
11024142 Tunnell et al. Jun 2021 B2
11042131 Strohmenger et al. Jun 2021 B2
11042137 Call et al. Jun 2021 B1
11042938 Robare Jun 2021 B1
11042942 Hakimi-Boushehri et al. Jun 2021 B1
11043098 Jordan et al. Jun 2021 B1
11049078 Jordan et al. Jun 2021 B1
11049189 Shah et al. Jun 2021 B2
11056235 Dunstan et al. Jul 2021 B2
11074659 Hakimi-Boushehri et al. Jul 2021 B1
11094180 Williams et al. Aug 2021 B1
11100594 West et al. Aug 2021 B1
11118812 Riblet et al. Sep 2021 B1
11126708 Reimer Sep 2021 B2
11138861 Blatt et al. Oct 2021 B2
11164257 Devereaux et al. Nov 2021 B1
11232873 Aspro et al. Jan 2022 B1
11277465 Chmielewski et al. Mar 2022 B2
11348193 Konrardy et al. May 2022 B1
11417212 Farooqui et al. Aug 2022 B1
11423758 Williams Aug 2022 B2
20020040306 Sugiyama et al. Apr 2002 A1
20020046047 Budd Apr 2002 A1
20030023459 Shipon Jan 2003 A1
20030144793 Melaku et al. Jul 2003 A1
20040030531 Miller et al. Feb 2004 A1
20040054789 Breh et al. Mar 2004 A1
20040153346 Grundel et al. Aug 2004 A1
20040153382 Boccuzzi et al. Aug 2004 A1
20040177032 Bradley et al. Sep 2004 A1
20040211228 Nishio et al. Oct 2004 A1
20040220538 Panopoulos Nov 2004 A1
20040249250 McGee et al. Dec 2004 A1
20050030175 Wolfe Feb 2005 A1
20050080520 Kline et al. Apr 2005 A1
20050137465 Cuddihy et al. Jun 2005 A1
20050139420 Spoltore et al. Jun 2005 A1
20050143956 Long et al. Jun 2005 A1
20050174242 Cohen Aug 2005 A1
20050228245 Quy Oct 2005 A1
20050251427 Dorai et al. Nov 2005 A1
20050275527 Kates Dec 2005 A1
20060001545 Wolf Jan 2006 A1
20060033625 Johnson et al. Feb 2006 A1
20060058612 Dave et al. Mar 2006 A1
20060100912 Kumar et al. May 2006 A1
20060154642 Scannell Jul 2006 A1
20060184379 Tan et al. Aug 2006 A1
20060205564 Peterson Sep 2006 A1
20060271456 Romain et al. Nov 2006 A1
20070186165 Maislos et al. Aug 2007 A1
20070214002 Smith et al. Sep 2007 A1
20080018474 Bergman et al. Jan 2008 A1
20080019392 Lee Jan 2008 A1
20080059351 Richey et al. Mar 2008 A1
20080101160 Besson May 2008 A1
20080154099 Aspel et al. Jun 2008 A1
20080184272 Brownewell Jul 2008 A1
20080201174 Ramasubramanian et al. Aug 2008 A1
20080235629 Porter et al. Sep 2008 A1
20080240379 Maislos et al. Oct 2008 A1
20080285797 Hammadou Nov 2008 A1
20080292151 Kurtz et al. Nov 2008 A1
20080294462 Nuhaan et al. Nov 2008 A1
20080301019 Monk Dec 2008 A1
20090001891 Patterson Jan 2009 A1
20090012373 Raij et al. Jan 2009 A1
20090024420 Winkler Jan 2009 A1
20090044595 Vokey Feb 2009 A1
20090094129 Rhodes et al. Apr 2009 A1
20090240170 Rowley Sep 2009 A1
20090243852 Haupt et al. Oct 2009 A1
20090259581 Horowitz et al. Oct 2009 A1
20090265193 Collins et al. Oct 2009 A1
20090281393 Smith Nov 2009 A1
20090326981 Karkanias et al. Dec 2009 A1
20100027777 Gupta et al. Feb 2010 A1
20100073840 Hennessey, Jr. Mar 2010 A1
20100131416 Means May 2010 A1
20100145164 Howell Jun 2010 A1
20100191824 Lindsay Jul 2010 A1
20100235285 Hoffberg Sep 2010 A1
20100241465 Amigo et al. Sep 2010 A1
20100286490 Koverzin Nov 2010 A1
20100299217 Hui Nov 2010 A1
20110003577 Rogalski et al. Jan 2011 A1
20110021140 Binier Jan 2011 A1
20110077875 Tran et al. Mar 2011 A1
20110112660 Bergmann et al. May 2011 A1
20110161117 Busque et al. Jun 2011 A1
20110173122 Singhal Jul 2011 A1
20110181422 Tran Jul 2011 A1
20110201901 Khanuja Aug 2011 A1
20110218827 Kenefick et al. Sep 2011 A1
20110224501 Hudsmith Sep 2011 A1
20110234406 Young et al. Sep 2011 A1
20110238564 Lim et al. Sep 2011 A1
20110246123 Dellostritto et al. Oct 2011 A1
20110251807 Rada et al. Oct 2011 A1
20110276489 Larkin Nov 2011 A1
20120016695 Bernard et al. Jan 2012 A1
20120046973 Eshleman et al. Feb 2012 A1
20120047072 Larkin Feb 2012 A1
20120095846 Leverant Apr 2012 A1
20120101855 Collins et al. Apr 2012 A1
20120116820 English et al. May 2012 A1
20120143619 Routt Jun 2012 A1
20120143754 Patel Jun 2012 A1
20120166115 Apostolakis Jun 2012 A1
20120188081 Van Katwijk Jul 2012 A1
20120232935 Voccola Sep 2012 A1
20120237908 Fitzgerald et al. Sep 2012 A1
20120265586 Mammone Oct 2012 A1
20120280811 McKalip et al. Nov 2012 A1
20120290333 Birchall Nov 2012 A1
20120290482 Atef et al. Nov 2012 A1
20130013513 Ledbetter et al. Jan 2013 A1
20130030974 Casey et al. Jan 2013 A1
20130049950 Wohlert Feb 2013 A1
20130060167 Dracup et al. Mar 2013 A1
20130073299 Warman et al. Mar 2013 A1
20130073306 Shlain et al. Mar 2013 A1
20130073321 Hofmann et al. Mar 2013 A1
20130082842 Balazs et al. Apr 2013 A1
20130096960 English et al. Apr 2013 A1
20130100268 Mihailidis et al. Apr 2013 A1
20130104022 Coon Apr 2013 A1
20130122849 Doezema May 2013 A1
20130143519 Doezema Jun 2013 A1
20130144486 Ricci Jun 2013 A1
20130147899 Labhard Jun 2013 A1
20130159021 Felsher Jun 2013 A1
20130166325 Ganapathy et al. Jun 2013 A1
20130214925 Weiss Aug 2013 A1
20130223405 Kim et al. Aug 2013 A1
20130226624 Blessman et al. Aug 2013 A1
20130234840 Trundle et al. Sep 2013 A1
20130257626 Masli et al. Oct 2013 A1
20130262155 Hinkamp Oct 2013 A1
20130267795 Cosentino et al. Oct 2013 A1
20130290013 Forrester Oct 2013 A1
20130290033 Reeser et al. Oct 2013 A1
20130304514 Hyde et al. Nov 2013 A1
20140006284 Faith et al. Jan 2014 A1
20140058854 Ranganath et al. Feb 2014 A1
20140108031 Ferrara Apr 2014 A1
20140122133 Weisberg et al. May 2014 A1
20140136242 Weekes et al. May 2014 A1
20140142729 Lobb et al. May 2014 A1
20140148733 Stone et al. May 2014 A1
20140180723 Cote et al. Jun 2014 A1
20140184408 Herbst et al. Jul 2014 A1
20140201315 Jacob et al. Jul 2014 A1
20140201844 Buck Jul 2014 A1
20140207486 Carty et al. Jul 2014 A1
20140222329 Frey Aug 2014 A1
20140222469 Stahl et al. Aug 2014 A1
20140229205 Gibson Aug 2014 A1
20140238511 Klicpera Aug 2014 A1
20140244997 Goel et al. Aug 2014 A1
20140257851 Walker et al. Sep 2014 A1
20140257871 Christensen et al. Sep 2014 A1
20140266669 Fadell et al. Sep 2014 A1
20140266717 Warren et al. Sep 2014 A1
20140267263 Beckwith Sep 2014 A1
20140268353 Fujimura Sep 2014 A1
20140276549 Osorio Sep 2014 A1
20140277939 Ren Sep 2014 A1
20140278571 Mullen et al. Sep 2014 A1
20140303801 Ahn et al. Oct 2014 A1
20140317710 Sager et al. Oct 2014 A1
20140340216 Puskarich Nov 2014 A1
20140340227 Reed, Jr. Nov 2014 A1
20140358592 Wedig et al. Dec 2014 A1
20140362213 Tseng Dec 2014 A1
20140379156 Kamel et al. Dec 2014 A1
20150002293 Nepo Jan 2015 A1
20150032480 Blackhurst et al. Jan 2015 A1
20150061859 Matsuoka et al. Mar 2015 A1
20150094830 Lipoma et al. Apr 2015 A1
20150116112 Flinsenberg et al. Apr 2015 A1
20150134343 Kluger et al. May 2015 A1
20150154712 Cook Jun 2015 A1
20150154880 Petito et al. Jun 2015 A1
20150160623 Holley Jun 2015 A1
20150160636 McCarthy et al. Jun 2015 A1
20150163412 Holley et al. Jun 2015 A1
20150170288 Harton et al. Jun 2015 A1
20150179040 Nishihara et al. Jun 2015 A1
20150187016 Adams Jul 2015 A1
20150187019 Fernandes et al. Jul 2015 A1
20150194032 Wright Jul 2015 A1
20150206249 Fini Jul 2015 A1
20150244855 Serra et al. Aug 2015 A1
20150269825 Tran Sep 2015 A1
20150285832 Thomas et al. Oct 2015 A1
20150287310 Deiiuliis et al. Oct 2015 A1
20150305690 Tan et al. Oct 2015 A1
20150332407 Wilson et al. Nov 2015 A1
20150347910 Fadell et al. Dec 2015 A1
20150350848 Eramian Dec 2015 A1
20150355649 Ovadia Dec 2015 A1
20150356701 Gandy et al. Dec 2015 A1
20150364028 Child et al. Dec 2015 A1
20160018226 Plocher et al. Jan 2016 A1
20160027278 McIntosh et al. Jan 2016 A1
20160042463 Gillespie Feb 2016 A1
20160078744 Gieck Mar 2016 A1
20160099934 Logue Apr 2016 A1
20160104250 Allen et al. Apr 2016 A1
20160119424 Kane et al. Apr 2016 A1
20160140834 Tran May 2016 A1
20160165387 Nhu Jun 2016 A1
20160171864 Ciaramelletti et al. Jun 2016 A1
20160174913 Somanath et al. Jun 2016 A1
20160188829 Southerland et al. Jun 2016 A1
20160225240 Voddhi et al. Aug 2016 A1
20160259902 Feldman et al. Sep 2016 A1
20160260310 Chuang Sep 2016 A1
20160337829 Fletcher et al. Nov 2016 A1
20160342767 Narasimhan et al. Nov 2016 A1
20160360965 Tran Dec 2016 A1
20160371620 Nascenzi et al. Dec 2016 A1
20170004695 Brasch et al. Jan 2017 A1
20170124276 Tee May 2017 A1
20170124277 Shlagman May 2017 A1
20170147722 Greenwood May 2017 A1
20170148297 Ross May 2017 A1
20170172465 Osorio Jun 2017 A1
20170193164 Simon et al. Jul 2017 A1
20170228109 Zhang et al. Aug 2017 A1
20170262604 Francois Sep 2017 A1
20170270260 Shetty et al. Sep 2017 A1
20170277834 Zipnick et al. Sep 2017 A1
20170304659 Chen et al. Oct 2017 A1
20180000346 Cronin Jan 2018 A1
20180000385 Heaton Jan 2018 A1
20180018864 Baker Jan 2018 A1
20180032696 Rome Feb 2018 A1
20180068081 Salem Mar 2018 A1
20180075204 Lee et al. Mar 2018 A1
20180153477 Nagale et al. Jun 2018 A1
20180160988 Miller et al. Jun 2018 A1
20180196919 Abou et al. Jul 2018 A1
20180211509 Ramaci Jul 2018 A1
20180211724 Wang Jul 2018 A1
20180228404 Bhunia Aug 2018 A1
20180228405 Burwinkle Aug 2018 A1
20180276710 Tietzen et al. Sep 2018 A1
20180280245 Khalid Oct 2018 A1
20180308569 Luellen Oct 2018 A1
20180322947 Potts et al. Nov 2018 A1
20180325470 Fountaine Nov 2018 A1
20180342329 Rufo et al. Nov 2018 A1
20180357386 Sanjay-Gopal Dec 2018 A1
20180357879 Negre Dec 2018 A1
20180365957 Wright et al. Dec 2018 A1
20190019379 Beller et al. Jan 2019 A1
20190046039 Ramesh et al. Feb 2019 A1
20190069154 Booth et al. Feb 2019 A1
20190080056 Das et al. Mar 2019 A1
20190083003 Lee et al. Mar 2019 A1
20190099114 Mouradian Apr 2019 A1
20190108841 Vergyri et al. Apr 2019 A1
20190122522 Stefanski et al. Apr 2019 A1
20190122760 Wang Apr 2019 A1
20190133445 Eteminan et al. May 2019 A1
20190156944 Eriksson et al. May 2019 A1
20190180879 Jain Jun 2019 A1
20190205675 McGill Jul 2019 A1
20190206533 Singh et al. Jul 2019 A1
20190209022 Sobol et al. Jul 2019 A1
20190228397 Madden Jul 2019 A1
20190251520 Bentley, III et al. Aug 2019 A1
20190279647 Jones et al. Sep 2019 A1
20190287376 Netscher et al. Sep 2019 A1
20190303760 Kumar et al. Oct 2019 A1
20190318283 Kelly et al. Oct 2019 A1
20190320900 Majmudar Oct 2019 A1
20190320945 Johnson Oct 2019 A1
20200019852 Yoon et al. Jan 2020 A1
20200121544 George et al. Apr 2020 A1
20200126670 Bender et al. Apr 2020 A1
20200143655 Gray et al. May 2020 A1
20200302549 Jordan et al. Sep 2020 A1
20200327791 Moon et al. Oct 2020 A1
20200334554 Takahashi et al. Oct 2020 A1
20200334967 Sharma et al. Oct 2020 A1
20200337651 Kwan Oct 2020 A1
20210035432 Moon et al. Feb 2021 A1
20210042843 Bryant et al. Feb 2021 A1
20210158671 Jordan et al. May 2021 A1
20210186329 Tran Jun 2021 A1
20210212576 Macneish et al. Jul 2021 A1
20210279811 Waltman et al. Sep 2021 A1
20210307621 Svenson et al. Oct 2021 A1
20210312789 Linn Oct 2021 A1
20220031239 Curtis Feb 2022 A1
20220101275 Aspro et al. Mar 2022 A1
Foreign Referenced Citations (22)
Number Date Country
2781251 Dec 2013 CA
202865924 Apr 2013 CN
111626536 Sep 2020 CN
113138558 Jul 2021 CN
201811043670 Dec 2018 IN
2002-092767 Mar 2002 JP
2003-157357 May 2003 JP
2006-048554 Feb 2006 JP
2013-179381 Sep 2013 JP
2014-056423 Mar 2014 JP
2014-142889 Aug 2014 JP
2017-116994 Jun 2017 JP
10-2015-0129845 Nov 2015 KR
2009061936 May 2009 WO
2011133628 Oct 2011 WO
2013076721 May 2013 WO
2014159131 Oct 2014 WO
2014207558 Dec 2014 WO
2016081511 May 2016 WO
2019086849 May 2019 WO
2020010217 Jan 2020 WO
2021087185 May 2021 WO
Non-Patent Literature Citations (234)
Entry
“Elderly Alexa helps families care for their loved ones via voice”, Perez, Sarah, techcrunch.com, May 14, 2017 (Year: 2017).
“How to use Alexa Care Hub to help monitor and contact older relatives or friends”, Dave Johnson, Business Insider, Jan. 14, 2021, https://www.businessinsider.com/how-to-use-alexa-care-hub.
"Amazon's Care Hub will see success due to swelling interest in aging at home and boosted smart speaker adoption", Zoe LaRock, Nov. 13, 2020, https://www.businessinsider.com/amazon-care-hub-will-succeed-amid-growing-smart-speaker-adoption-2020-11.
Final Office Action, U.S. Appl. No. 17/574,874, May 18, 2022.
Gurley, The Accuracy Of Self-Reported Data Of An Aging Population Using A Telehealth System In A Retirement Community Setting Based On The Users Age, Gender, Employment Status And Computer Experience, Dissertation, University of Maryland, Baltimore (2016).
Knutsen, Confusion about causation in insurance: solutions for catastrophic losses, Ala. L. Rev., 5:957-1023 (2010).
Michael E. Porter, “How Smart, Connected Products Are Transforming Competition”, Harvard Business Review, Nov. 2014 (Year: 2014).
Non-Final Office Action, U.S. Appl. No. 17/574,874, Apr. 8, 2022, 17 pages.
Núñez-Marcos et al., Vision-Based Fall Detection with Convolutional Neural Networks, Wir. Comm. Mob. Comp., 2017(9474806):16 (2017).
System for Loss Prevention, IP.com, published Nov. 8, 2008.
The Accuracy Of Self-Reported Data Of An Aging Population Using A Telehealth System In A Retirement Community Setting Based On The Users Age, Gender, Employment Status And Computer Experience, Gurley, Kelley Anne. University of Maryland, Baltimore.
U.S. Appl. No. 14/692,864, Final Office Action, dated Nov. 8, 2017.
U.S. Appl. No. 14/692,864, Nonfinal Office Action, dated May 24, 2018.
U.S. Appl. No. 14/692,943, Nonfinal Office Action, dated Sep. 12, 2017.
U.S. Appl. No. 14/692,943, Notice of Allowance, dated May 1, 2018.
U.S. Appl. No. 14/692,946, Final Office Action, dated Oct. 30, 2017.
U.S. Appl. No. 14/692,946, Nonfinal Office Action, dated Apr. 4, 2017.
U.S. Appl. No. 14/692,946, Nonfinal Office Action, dated Apr. 6, 2018.
U.S. Appl. No. 14/692,953, Final Office Action, dated Apr. 27, 2018.
U.S. Appl. No. 14/692,953, Nonfinal Office Action, dated Sep. 19, 2017.
U.S. Appl. No. 14/692,961, Final Office Action, dated Jun. 20, 2018.
U.S. Appl. No. 14/692,961, Final Office Action, dated Sep. 1, 2017.
U.S. Appl. No. 14/692,961, Nonfinal Office Action, dated Apr. 14, 2017.
U.S. Appl. No. 14/692,961, Nonfinal Office Action, dated Dec. 28, 2017.
U.S. Appl. No. 14/693,021, Final Office Action, dated Jan. 25, 2018.
U.S. Appl. No. 14/693,021, Nonfinal Office Action, dated Jun. 30, 2017.
U.S. Appl. No. 14/693,032, Final Office Action, dated Mar. 22, 2018.
U.S. Appl. No. 14/693,032, Nonfinal Office Action, dated Sep. 7, 2017.
U.S. Appl. No. 14/693,032, Notice of Allowance, dated Jun. 22, 2018.
U.S. Appl. No. 14/693,034, Nonfinal Office Action, dated May 17, 2017.
U.S. Appl. No. 14/693,034, Notice of Allowance, dated Oct. 25, 2017.
U.S. Appl. No. 14/693,039, Final Office Action, dated Dec. 15, 2017.
U.S. Appl. No. 14/693,039, Nonfinal Office Action, dated Jun. 5, 2017.
U.S. Appl. No. 14/693,039, Nonfinal Office Action, dated May 3, 2018.
U.S. Appl. No. 14/693,057, Final Office Action, dated Feb. 7, 2018.
U.S. Appl. No. 14/693,057, Nonfinal Office Action, dated Aug. 21, 2017.
U.S. Appl. No. 14/873,722, Final Office Action, dated Jun. 15, 2018.
U.S. Appl. No. 14/873,722, Nonfinal Office Action, dated Dec. 5, 2017.
U.S. Appl. No. 14/873,783, Final Office Action, dated May 23, 2018.
U.S. Appl. No. 14/873,783, Nonfinal Office Action, dated Dec. 8, 2017.
U.S. Appl. No. 14/873,823, Final Office Action, dated Jun. 29, 2018.
U.S. Appl. No. 14/873,823, Final Office Action, dated Mar. 15, 2017.
U.S. Appl. No. 14/873,823, Final Office Action, dated Nov. 3, 2017.
U.S. Appl. No. 14/873,823, Nonfinal Office Action, dated Feb. 23, 2018.
U.S. Appl. No. 14/873,823, Nonfinal Office Action, dated Jun. 21, 2017.
U.S. Appl. No. 14/873,823, Nonfinal Office Action, dated Nov. 30, 2016.
U.S. Appl. No. 14/873,864, Corrected Notice of Allowability, dated Jan. 18, 2018.
U.S. Appl. No. 14/873,864, Final Office Action, dated Dec. 2, 2016.
U.S. Appl. No. 14/873,864, Nonfinal Office Action, dated Apr. 5, 2017.
U.S. Appl. No. 14/873,864, Nonfinal Office Action, dated Jul. 14, 2016.
U.S. Appl. No. 14/873,864, Notice of Allowance, dated Aug. 28, 2017.
U.S. Appl. No. 14/873,864, Notice of Allowance, dated Dec. 21, 2017.
U.S. Appl. No. 14/873,914, Nonfinal Office Action, dated Dec. 26, 2017.
U.S. Appl. No. 14/873,942, Nonfinal Office Action, dated Nov. 22, 2017.
U.S. Appl. No. 14/873,942, Nonfinal Office Action, dated Mar. 16, 2018.
U.S. Appl. No. 15/409,248, filed Jan. 18, 2017, Konrardy et al., “Sensor Malfunction Detection”.
U.S. Appl. No. 15/409,271, filed Jan. 18, 2017, Konrardy et al., “Autonomous Vehicle Component Malfunction Impact Assessment”.
U.S. Appl. No. 15/409,305, filed Jan. 18, 2017, Konrardy et al., “Component Malfunction Impact Assessment”.
U.S. Appl. No. 15/409,318, filed Jan. 18, 2017, Konrardy et al., “Automatic Repair of Autonomous Vehicles”.
U.S. Appl. No. 15/409,336, filed Jan. 18, 2017, Konrardy et al., “Automatic Repair of Autonomous Components”.
U.S. Appl. No. 15/409,340, filed Jan. 18, 2017, Konrardy et al., “Autonomous Vehicle Damage and Salvage Assessment”.
U.S. Appl. No. 15/409,349, filed Jan. 18, 2017, Konrardy et al., “Component Damage and Salvage Assessment”.
U.S. Appl. No. 15/409,359, filed Jan. 18, 2017, Konrardy et al., “Detecting and Responding to Autonomous Vehicle Collisions”.
U.S. Appl. No. 15/409,371, filed Jan. 18, 2017, Konrardy et al., “Detecting and Responding to Autonomous Environment Incidents”.
U.S. Appl. No. 15/409,445, filed Jan. 18, 2017, Konrardy et al., “Virtual Testing of Autonomous Vehicle Control System”.
U.S. Appl. No. 15/409,473, filed Jan. 18, 2017, Konrardy et al., “Virtual Testing of Autonomous Environment Control System”.
U.S. Appl. No. 15/859,859, filed Jan. 2, 2018, Hakimi-Boushehri et al., “Systems and Methods for Community-Based Cause of Loss Determination”.
U.S. Appl. No. 15/895,149, filed Feb. 13, 2018, Jordan et al., “Systems and Methods for Automatically Generating an Escape Route”.
U.S. Appl. No. 17/077,785, Notice of Allowance, mailed Jul. 14, 2022.
U.S. Appl. No. 14/692,864, Nonfinal Office Action, dated May 16, 2017.
Yildirim et al., Fall detection using smartphone-based application, Int. J. Appl. Mathematics Electronics and Computers, 4(4):140-144 (2016).
Yu et al., A posture recognition-based fall detection system for monitoring an elderly person in a smart home environment, IEEE Tran. Infor. Tech. Biom., 16(6):1274-1286 (2012).
Woyke et al., The octogenarians who love Amazon's Alexa, MIT Technology Review, Jun. 9, 2017.
YouTube.com website, screen shot of Amazon Echo—SNL—posted on Saturday Night Live channel (2017).
Zanthion, Smart Communities are More Than Measurement, website (2018).
Zanthion, Smart Motion sensor, product webpage (2018).
Zechmann et al., Challenges in communicating user requirements: Lessons from a multi-national AAL project, IN: Gerschall et al. (eds.), International Reports on Socio-Informatics (IRSI), Proceedings of the COOP 2016—Symposium on challenges and experiences in designing for an ageing society, vol. 13, Issue 3, pp. 43-50 (2016).
Zechmann et al., Project No. 610658, eWall for Active Long Living, Deliverable No. D8.3, Report on the demonstration trial, delivered Oct. 31, 2016.
Amazon Echo Show—How to Setup (pub. Jul. 2, 2017), available at <https://www.youtube.com/watch?v=tFEQTAMmEEk>.
The Amazon Echo Show—Pretty Dang Good (the Most in-depth review on YT) (pub. Jul. 11, 2017), available at: <https://www.youtube.com/watch?v=7RrlR56_ako>.
Ma et al., Assistive adjustable smart shower system, 2017 IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies (Chase) pp. 253-254 (2017).
Ma et al., Vico VR-based wireless daily activity recognition and assessment system for stroke rehabilitation, 2018 IEEE International Conference on Bioinformatics and Biomedicine, 2018.
Mascarenhas, Bostinno Approved: The Week's Top Tech & Startup Events in Boston, downloaded from the Internet at: <https://www.bizjournals.com/boston/inno/stories/inno-events/2017/03/17/bostinno-approved-the-weeks-top-tech-startup.html>, May 17, 2017.
MavHome Website (2004).
Meet Alexa: Reminders, YouTube video (screenshot) on Amazon Alexa Channel (Feb. 2018).
Mihovska et al., Project No. 610658, eWall for Active Long Living, Deliverable No. D7.5.1, Standardization contributions, delivered Oct. 31, 2015.
Mihovska et al., Project No. 610658, eWall for Active Long Living, Deliverable No. D7.5.2, Standardization contributions, delivered Oct. 31, 2016.
Mihovska et al., Project No. 610658, eWall for Active Long Living, Deliverable No. D7.6.1, 1st Project Workshop, delivered Oct. 31, 2014.
Mihovska et al., Project No. 610658, eWall for Active Long Living, Deliverable No. D7.6.2, 2nd Project Workshop, delivered Oct. 30, 2015.
Mihovska et al., Project No. 610658, eWall for Active Long Living, Deliverable No. D7.6.3, 3rd Project Workshop, delivered Oct. 31, 2016.
Mozer, The Neural Network House: An Environment that Adapts to its Inhabitants, Proceedings of the American Association for Artificial Intelligence Spring Symposium on Intelligent Environments (1998).
Newland et al., Continuous in-home symptom and mobility measures for individuals with multiple sclerosis: a case presentation, Journal of Neuroscience Nursing, 49(4):241-6 (Aug. 2017).
Newman, How to use Alexa Routines to make your Amazon Echo even smarter, TechHive, Dec. 17, 2018.
Office Basic video from HoneyCo Homes on Vimeo (2018).
Op den Akker et al., Project No. 610658, eWALL for Active Long Living, deliverable No. D2.2, Initial Scenarios and Use-Cases, delivered Feb. 28, 2014.
Oppenauer-Meerskraut et al., Project No. 610658, eWALL for Active Long Living, deliverable No. D6.4, Small scale studies report, delivered Oct. 31, 2015.
Perez, ‘Elderly Alexa’ helps families care for their remote loved ones via voice, downloaded from the Internet at: <https://techcrunch.com/2017/05/14/elderly-alexa-helps-family-care-for-their-remote-loved-ones-via-voice/>, May 14, 2017.
Petersen et al., Time Out-of-Home and Cognitive, Physical, and Emotional Wellbeing of Older Adults: A Longitudinal Mixed Effects Model, PLoS One, 10(10):e0139643 (Oct. 2015).
Pneumatikakis et al., Project No. 610658, eWALL for Active Long Living, deliverable No. D3.3.1, eWALL configurable metadata streams, delivered Oct. 31, 2014.
Pneumatikakis et al., Project No. 610658, eWALL for Active Long Living, deliverable No. D3.3.2, eWALL configurable metadata streams, delivered Apr. 30, 2015.
Pocs et al., Project No. 610658, eWALL for Active Long Living, deliverable No. D2.4, Ethics, Privacy and Security, delivered Apr. 29, 2014.
Product Website for LifePod, <https://lifepod.com/> (2018).
Prospero, How to create an Alexa smart home routine, Tom's Guide, Mar. 1, 2019.
Pullen et al., This Amazon Echo tip is great for families and roommates, Fortune Magazine, Feb. 13, 2017.
Ralevic, How to build a custom Amazon Alexa skill, step-by-step: My favorite chess player, Crowdbotics, Jul. 24, 2018.
Rantz et al., Randomized Trial of Intelligent Sensor System for Early Illness Alerts in Senior Housing, J. Am. Med. Dir. Assoc., 18(10):860-70 (2017).
Riboni et al., Extended report: Fine-grained recognition of abnormal behaviors for early detection of mild cognitive impairment, arXiv:1501.05581v1 (Jan. 2015).
Robben et al., Delta Features From Ambient Sensor Data are Good Predictors of Change in Functional Health, IEEE Journal of Biomedical and Health Informatics, vol. 21, No. 4, pp. 986-993, Jul. 2017.
Robben et al., Expert knowledge for modeling the relation between functional health and data from ambient assisted living sensor systems, Abstract P314, Poster Presentation, European Geriatric Medicine, 2014.
Robben et al., How is grandma doing? Predicting functional health status from binary ambient sensor data, AAI Technical Report PS-12-01 (2012).
Schaarup et al., Cognitive Walkthrough: An element in system development and evaluation—experiences from the eWall telehealth system, Procedia Computer Science, 100:539-46 (2016).
Seelye et al., Passive assessment of routine driving with unobtrusive sensors: a new approach for identifying and monitoring functional level in normal aging and mild cognitive impairment, J. Alzheimers Dis., 59(4):1427-37 (2017).
Simunic et al., Project No. 610658, eWALL for Active Long Living, deliverable No. D6.3, Technical evaluation report, delivered Apr. 30, 2015.
Simunic et al., Project No. 610658, eWall for Active Long Living, Deliverable No. D7.7, Education material & training for professionals, delivered Oct. 31, 2016.
Smart Homes, article, posted online Aug. 15, 2002.
So Easy: How to Delete Alexa's History, Tom's Guide YouTube channel, screenshot of YouTube video (2017).
Solutions—CloudCare2U website (2017).
Sprint et al., Analyzing Sensor-Based Time Series Data to Track Changes in Physical Activity during Inpatient Rehabilitation, Sensors, 17, 2219 (2017).
Sprint et al., Using Smart Homes to Detect and Analyze Health Events, Computer, vol. 49, No. 11, pp. 29-37, Nov. 2016.
Su et al., Monitoring the relative blood pressure using a hydraulic bed sensor system, IEEE Transactions on Biomedical Engineering, 66(3):740-8 (Mar. 2019).
Su et al., Radar placement for fall detection: signature and performance, Journal of Ambient Intelligence and Smart Environments, 10:21-34 (2018).
The Oregon Center for Aging and Technology (ORCATECH) website, <https://www.ohsu.edu/oregon-center-for-aging-and-technology> (known as of Jul. 1, 2011).
Thomaz et al., Challenges and Opportunities in Automated Detection of Eating Activity, Mobile Health: Sensors, Analytic Methods, and Applications, 2017.
Trimble, UT-Arlington project envisions smarter homes, Fort Worth Star Telegram, Feb. 17, 2002.
TruSense—A New Way of Being There, with TruSense smart home technology, website <https://mytrusense.com/> (2018).
Twitter account for eWALL for active Long Living (2016).
U.S. Appl. No. 62/736,933, filed Sep. 26, 2018, Applicant: Verily Life Sciences Llc, USPTO Filing Receipt dated Oct. 15, 2018.
Wang et al., Performance-based physical function and future dementia in older people, Arch. Intern. Med., 166:1115-20 (2006).
Website listing of HoneyCo Homes videos on Vimeo (2016).
Website relating to corporate video for HoneyCo Connect, published Jul. 5, 2017.
Exhibit 10: U.S. Pat. No. 10,258,295 to Fountaine, (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 11: U.S. Patent Publication No. 2019/0122522 to Stefanski, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 12: Amazon Alexa and Amazon Echo Show (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 13: U.S. Patent Publication No. 2016/0027278 A1 to McIntosh, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 14: U.S. Pat. No. 10,258,295 to Fountaine, (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 15: U.S. Patent Publication No. 2019/0122522 to Stefanski, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 16: Amazon Alexa and Amazon Echo Show, (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 17: U.S. Pat. No. 9,375,142 B2 to Schultz, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 18: U.S. Patent Publication No. 2016/0027278 A1 to McIntosh, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 19: U.S. Patent Publication No. 2019/0122522 to Stefanski, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 2: U.S. Pat. No. 10,258,295 to Fountaine, (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 20: U.S. Patent Publication No. 2019/0080056 A1 to Das, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 21: U.S. Patent Publication No. 2016/0027278 A1 to McIntosh, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 22: Automated Clinical Assessment from Smart home-based Behavior Data to Dawadi et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 3: U.S. Patent Publication No. 2019/0122522 to Stefanski, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 4: Amazon Alexa and Amazon Echo Show, (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 5: U.S. Patent Publication No. 2016/0027278 A1 to McIntosh, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 6: U.S. Pat. No. 10,258,295 to Fountaine, (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 7: U.S. Patent Publication No. 2019/0122522 to Stefanski et al. (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 8: Amazon Alexa and Amazon Echo Show, (Defendants' Invalidity Contentions), Jul. 12, 2023.
Exhibit 9: U.S. Patent Publication No. 2016/0027278 A1 to McIntosh, et al., (Defendants' Invalidity Contentions), Jul. 12, 2023.
Facebook page for HoneyCo Homes, started in 2017.
Fadia, IoT for the Aging: You're Never Too Old to Innovate, IoT Evoluation, Feb. 22, 2018.
Fratu et al., Comparative study of Radio Mobile and ICS Telecom propagation prediction models for DVB-T, IEEE BMSB 2015 International Conference, Jun. 16-19, 2015, Ghent, Belgium.
Fritz et al., Identifying varying health states in smart home sensor data: an expert-guided approach, 2017.
Getting to Know Your Echo Show, product guide, available to the public on or before May 9, 2017.
Gonfalonieri et al., How Amazon Alexa works? Your guide to Natural Language Processing (AI), Medium, Nov. 21, 2018.
Goodfellow et al., Deep Learning, Chapters 6 and 7, An MIT Press Book (2016).
Grguric et al., Project No. 610658, eWall for Active Long Living, Deliverable No. D7.10, Socio-economic study, delivered Oct. 31, 2016.
Hajjar, ‘Elderly Alexa’ helps families care for their loved ones via voice, Northeastern Global News, downloaded from the Internet at: <https://news.northeastern.edu/in-the-media/elderly-alexa-helps-families-care-for-their-remote-loved-ones-via-voice/>, May 14, 2017.
Hangaard et al., Participatory heuristic evaluation of the second iteration of the eWALL interface application, pp. 599-603, IN: Hoerbst et al. (eds.), Exploring Complexity in Health: An Interdisciplinary Systems Approach, European Federation for Medical Informatics and IOS Press (2016).
Hangaard et al., Project No. 610658, eWALL for Active Long Living, deliverable No. D2.5, Clinical workflows and pathways, delivered Jul. 30, 2014.
Hellmers et al., Towards a Minimized Unsupervised Technical Assessment of Physical Performance in Domestic Environments, Proceedings of the 11th EAI International Conference on Pervasive Computing Technologies for Healthcare, May 2017.
HoneyCo Advanced video from HoneyCo Homes on Vimeo (2018).
HoneyCo Basic video from HoneyCo Homes on Vimeo (2018).
HoneyCo Connect video from HoneyCo Homes on Vimeo (2017).
How Does TruSense Work? Devices + Intelligence = True Protection, downloaded from the Internet at: <https://mytruesense.com/how-it-works/> (2017).
Ichkov et al., Hybrid access control with modified SINR association for future heterogeneous networks, arXiv:1507.04271, 2015.
Infarinato et al., Acceptance and Potential Impact of the eWall platform for health monitoring and promotion in persons with a chronic disease or age-related impairment, International Journal of Environmental Research and Public Health, 17:7893 (2020).
Jarvis, An Open Door to Technology, Fort Worth Star Telegram, Dec. 1, 2002.
Jarvis, Home of the Future, The Times, Muenster, Indiana, Dec. 29, 2001.
Jarvis, The House That Tech Built, Knight Ridder News Service Story (Jan. 11, 2002).
Jarvis, UTA research seeks to create “smart house”, Fort Worth Star Telegram, Nov. 20, 2001.
Kaye et al., Intelligent systems for assessing aging changes: home-based, unobtrusive, and continuous assessment of aging, The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences, 66B(S1):1180-90 (2011).
Kennedy, Why this entrepreneur moved from New York to Launch his startup in Nashville, Nashville Business Journal, Jun. 13, 2016.
Kyriazakos et al., eWALL: An open-source cloud-based eHealth platform for creating home caring environments for older adults living with chronic diseases or frailty, Journal Wireless Personal Communications, 97(2):1835-75 (2017).
Kyriazakos et al., Forecast—A cloud-based personalized intelligent virtual coaching platform for the well-being of cancer patients, Clinical and Translational Radiation Oncology, 8:50-59 (2018).
Kyriazakos, Project No. 610658, eWall for Active Long Living, Deliverable No. D7.2, Basic dissemination material, delivered Jan. 31, 2014.
Kyriazakos, Project No. 610658, eWall for Active Long Living, Deliverable No. D7.3, Dissemination plan, delivered Jan. 31, 2014.
Lumini et al., The Role of Educational Technology in Caregiving, Chapter 11, IN: Caregiving and Home Care, Intech, 2018.
Final Rejection Mailed on Oct. 7, 2024 for U.S. Appl. No. 17/961,338, 14 page(s).
Final Rejection Mailed on Sep. 11, 2024 for U.S. Appl. No. 17/706,302, 9 page(s).
Non-Final Rejection Mailed on Aug. 14, 2024 for U.S. Appl. No. 18/138,558, 15 page(s).
Notice of Allowance and Fees Due (PTOL-85) Mailed on Sep. 12, 2024 for U.S. Appl. No. 17/894,939, 8 page (s).
“4 Game Changes from the TechCrunch Disrupt Hackathon”, PubNub, May 15, 2017.
“A smart decision for independent living”, website, HoneyCo Homes (2017).
“Canary Care: How it Works”, website, <https://www.canarycare.co.uk/how-it-works/> (2019).
“Elderly-Alexa”, TechCrunch website, photos New York Disrupt Hackathon (2017).
Facilitating Elders Aging in Place: The 2017 Enterprise Management Hackathon, downloaded from the Internet at: <https://mitsloan.mit.edu/sites/default/files/inline-files/2017_EMTrack_Hackathon_article.pdf> (2017).
“HoneyCo Homes: Using Smart Technology to Help Seniors Age in Place”, Nashville Medical News, published Nov. 9, 2017.
“How Canary Care Helps”, downloaded from the Internet at: <https://web.archive.org/web/20190322142707/https://www.canarycare.co.uk/how-it-helps/> (2019).
Seniors have increasingly become more tech savvy, Nashville Post, Aug. 28, 2017.
2nd AHA Summit, downloaded from the Internet at: <https://cloudcare2u.com/> (2017).
Aicha et al., Continuous gait velocity analysis using ambient sensors in a smart home, European Conference on Ambient Intelligence (Nov. 2015).
Aicha et al., Continuous measuring of the indoor walking speed of older adults living alone, J. Ambient Intell. Human Comput., 9:589-99 (2018).
Akl et al., Unobtrusive Detection of Mild Cognitive Impairment in Older Adults Through Home Monitoring, IEEE J Biomed Health Inform., 21(2):339-48 (Mar. 2017).
Alpaydin, Introduction to Machine Learning, Third Edition, the MIT Press, Chapters 11 and 12 (2014).
Amazon Alexa Development 101 (full tutorial course—Jun. 2018 version), screenshot of YouTube video (Jun. 2018).
Amazon Echo Show Teardown, published Jun. 28, 2017.
Amazon Echo User Guides, available to the public on or before May 9, 2017.
Amazon.com Help website, Alexa and Alexa Device FAQs (2016).
Amazon.com Website for Alexa Adams, Alexa: 1001 Tips and Tricks How to Use Your Amazon Alexa devices (Amazon Echo, Second Generation Echo, Echo Show, Amazon Echo Look, Echo Plus, Echo Spot, Echo Dot, Echo Tap, Echo Connect), paperback, Dec. 24, 2017.
Amazon.com Website for Andrew Howard, Amazon Echo Show: 2018 Updated Advanced User Guide to Amazon Echo Show with Step-by-Step Instructions (alexa, dot, echo user guide, echo amazon, amazon dot, echo show, user manual), Paperback, Mar. 18, 2018.
Amazon.com website for Echo Show—1st Generation White (2017).
Amazon.com website for Echo Show—Alexa-enabled Bluetooth speaker with 7″ screen—black (2017).
Amazon.com website, Introducing Echo Show—Black (Jun. 2017).
Amazon.com website, Quick Start Guides for Alexa-Enabled Devices—Amazon Customer Service (available to the public on or before May 9, 2017).
An End-to-End Solution, CarePredict, website, <https://www.carepredict.com/how-it-works/> (2018).
Angeletou et al., Project No. 610658, eWALL for Active Long Living, deliverable No. D2.1, Preliminary User and System Requirements, delivered Feb. 28, 2014.
Angeletou et al., Project No. 610658, eWALL for Active Long Living, deliverable No. D2.6, Evaluation and validation methodology, delivered Oct. 31, 2014.
Austin et al., A Smart-Home System to Unobtrusively and Continuously Assess Loneliness in Older Adults, IEEE Journal of Translational Engineering in Health and Medicine, Jun. 2016.
Austin et al., Unobtrusive Monitoring of the Longitudinal Evolution of In-Home Gait Velocity Data with Applications to Elder Care, Conf. Proc. IEEE Eng. Med. Biol. Soc., 2011:6495-8 (2011).
Austin et al., Variability in medication taking is associated with cognitive performance in nondemented older adults, Alzheimer's & Dementia: Diagnosis, Assessment & Disease Monitoring, 6:210-3 (2017).
Banerjee et al., Exploratory analysis of older adults' sedentary behavior in the primary living area using kinect depth data, Journal of Ambient Intelligence and Smart Environments, 9:163-79 (2017).
Bennison, There's no place like (this) home, Fort Worth Business Press (2001).
Bloch et al., Project No. 610658, eWall for Active Long Living, Deliverable No. D7.1, Website, delivered Nov. 29, 2013.
Borisov et al., Measuring changes in gait and vehicle transfer ability during inpatient rehabilitation with wearable inertial sensors, Proc. IEEE Int. Conf. Pervasive Comput. Commun. Workshops, Mar. 2017.
Bouwer, Evaluating eWALL: Assessing and enhancing older adults' acceptance of a prototype smart home technology, University of Twente, Faculty of Behavioral, Management and Social Sciences, Jan. 2015.
Care@Home™ PERS Control Panel User Guide, Version 3.6, Essence Smart Care Ltd., Sep. 2014.
Care@Home™ Administrator User Guide, Essence Smart Care Ltd., Version 1.5, Jun. 2016.
Caregiver Platform Video from HoneyCo Homes on Vimeo (2018).
Choi et al., Doctor AI: Predicting Clinical Events via Recurrent Neural Networks, Proceedings of Machine Learning for Healthcare, 2016.
Chung et al., Feasibility testing of a home-based sensor system to monitor mobility and daily activities in Korean American older adults, Int. J. Older People Nurs., 12(1), (Mar. 2017).
Cook et al., MavHome: An agent-based smart home, Proc. of the First IEEE International Conference on Pervasive Computing and Communications, 2003.
Curci et al., Toward naturalistic self-monitoring of medicine intake, CHItaly '17, Sep. 18-20, 2017, Caligari, Italy, Association for Computing Machinery.
Daume, A Course in Machine Learning (Jan. 12, 2013).
Daume, A Course in Machine Learning (Jan. 30, 2017).
Dawadi et al., Automated Clinical Assessment from Smart home-based Behavior Data, IEEE J Biomed Health Inform. (online publication Aug. 17, 2015; physical publication Jul. 2016) (author manuscript available in PubMed Central Jul. 1, 2017).
Defendants' Initial Invalidity Contentions, State Farm Mutual Automobile Insurance Co. (Plaintiff) v. Amazon.com and Amazon.com Services LLC (Defendants), C.A. No. 22-1447 (CJB), filed in the United States District Court for the District of Delaware on Jul. 12, 2023.
Echo Show by Amazon, Amazon website (2017).
Essence Smart Care—Care@Home™, brochure (2016).
eWALL OSS, downloaded from the Internet at: <https://cloudcare2u.com/> (2017).
eWALL Project EU website (2016).
Exhibit 1: U.S. Patent Publication No. 2016/0027278 A1 to McIntosh (Defendants' Invalidity Contentions), Jul. 12, 2023.
Related Publications (1)
Number Date Country
20220334544 A1 Oct 2022 US
Provisional Applications (14)
Number Date Country
62220383 Sep 2015 US
62201671 Aug 2015 US
62200375 Aug 2015 US
62198813 Jul 2015 US
62197343 Jul 2015 US
62193317 Jul 2015 US
62189329 Jul 2015 US
62187642 Jul 2015 US
62187651 Jul 2015 US
62187624 Jul 2015 US
62187645 Jul 2015 US
62187666 Jul 2015 US
62105407 Jan 2015 US
62060962 Oct 2014 US
Continuations (4)
Number Date Country
Parent 17706302 Mar 2022 US
Child 17857880 US
Parent 17701316 Mar 2022 US
Child 17706302 US
Parent 16738328 Jan 2020 US
Child 17701316 US
Parent 14873865 Oct 2015 US
Child 16738328 US