Sensing peripheral heuristic evidence, reinforcement, and engagement system

Information

  • Patent Grant
  • Patent Number
    11,462,094
  • Date Filed
    Thursday, January 13, 2022
  • Date Issued
    Tuesday, October 4, 2022
Abstract
Systems and methods for identifying a condition associated with an individual in a home environment are provided. Sensors associated with the home environment detect data, which is captured and analyzed by a local or remote processor to identify the condition. In some instances, the sensors are configured to capture data indicative of electricity use by devices associated with the home environment, including, e.g., which devices are using electricity, what date/time electricity is used by each device, how long each device uses electricity, and/or the power source for the electricity used by each device. The processor analyzes the captured data to identify any abnormalities or anomalies, and, based upon any identified abnormalities or anomalies, the processor determines a condition (e.g., a medical condition) associated with an individual in the home environment. The processor generates and transmits a notification indicating the condition associated with the individual to a caregiver of the individual.
Description
FIELD OF THE DISCLOSURE

The present disclosure generally relates to identifying a condition associated with an individual in a home environment.


BACKGROUND

As individuals age, many develop cognitive conditions or health conditions making it difficult and/or unsafe for them to live independently in a home environment. However, because the signs of such cognitive conditions and/or health conditions may be subtle, or may develop slowly over time, it may be difficult for caregivers to determine whether an individual is capable of safely living independently.


SUMMARY

In one aspect, a computer-implemented method for identifying a condition associated with an individual in a home environment may be provided. The method may include, via one or more local or remote processors, servers, transceivers, and/or sensors: (1) capturing data detected by a plurality of sensors associated with a home environment; (2) analyzing, by a processor, the captured data to identify one or more abnormalities or anomalies; and/or (3) determining, by a processor, based upon the identified one or more abnormalities or anomalies, a condition associated with an individual in the home environment. The method may additionally include (4) generating, by a processor, to a caregiver of the individual, a notification indicating the condition associated with the individual. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


In another aspect, a computer system for identifying a condition associated with an individual in a home environment may be provided. The computer system may include one or more sensors associated with a home environment, one or more processors configured to interface with the one or more sensors, and/or one or more memories storing non-transitory computer executable instructions. The non-transitory computer executable instructions, when executed by the one or more processors, cause the computer system to (1) capture data detected by the one or more sensors; (2) analyze the captured data to identify one or more abnormalities or anomalies; (3) determine, based upon the identified one or more abnormalities or anomalies, a condition associated with an individual in the home environment; and/or (4) generate, to a caregiver of the individual, a notification indicating the condition associated with the individual. The system may include additional, less, or alternate functionality, including that discussed elsewhere herein.


In still another aspect, a computer-readable storage medium having stored thereon a set of non-transitory instructions, executable by a processor, for identifying a condition associated with an individual in a home environment may be provided. The instructions include instructions for (1) obtaining data detected by a plurality of sensors associated with a home environment; (2) analyzing the captured data to identify one or more abnormalities or anomalies; (3) determining, based upon the identified one or more abnormalities or anomalies, a condition associated with an individual in the home environment; and/or (4) generating, to a caregiver of the individual, a notification indicating the condition associated with the individual. The instructions may direct additional, less, or alternate functionality, including that discussed elsewhere herein.


In still another aspect, a computer-implemented method for training a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments may be provided. The computer-implemented method may include (1) receiving, by a processor, historical data detected by a plurality of sensors associated with a plurality of home environments; (2) receiving, by a processor, historical data indicating conditions associated with individuals in each of the plurality of home environments; (3) analyzing, by a processor, using a machine learning module, the historical data detected by the plurality of sensors associated with the plurality of home environments and the historical data indicating conditions associated with individuals in each of the plurality of home environments; and/or (4) identifying, by a processor, using the machine learning module, based upon the analysis, one or more abnormalities or anomalies in the historical data detected by the plurality of sensors corresponding to conditions associated with the individuals in the home environments. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


In still another aspect, a computer system for training a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments may be provided. The computer system may include one or more processors and one or more memories storing non-transitory computer executable instructions. When executed by the one or more processors, the non-transitory computer executable instructions may cause the computer system to: (1) receive historical data detected by a plurality of sensors associated with a plurality of home environments; (2) receive historical data indicating conditions associated with individuals in each of the plurality of home environments; (3) analyze, using a machine learning module, the historical data detected by the plurality of sensors associated with the plurality of home environments and the historical data indicating conditions associated with individuals in each of the plurality of home environments; and/or (4) identify, using the machine learning module, based upon the analysis, one or more abnormalities or anomalies in the historical data detected by the plurality of sensors corresponding to conditions associated with the individuals in the home environments. The system may include additional, less, or alternate functionality, including that discussed elsewhere herein.


In still another aspect, a computer-readable storage medium having stored thereon a set of non-transitory instructions, executable by a processor, for training a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments may be provided. The instructions may include instructions for: (1) receiving historical data detected by a plurality of sensors associated with a plurality of home environments; (2) receiving historical data indicating conditions associated with individuals in each of the plurality of home environments; (3) analyzing, using a machine learning module, the historical data detected by the plurality of sensors associated with the plurality of home environments and the historical data indicating conditions associated with individuals in each of the plurality of home environments; and/or (4) identifying, using the machine learning module, based upon the analysis, one or more abnormalities or anomalies in the historical data detected by the plurality of sensors corresponding to conditions associated with the individuals in the home environments. The instructions may direct additional, less, or alternate functionality, including that discussed elsewhere herein.


Advantages will become more apparent to those skilled in the art from the following description of the preferred embodiments which have been shown and described by way of illustration. As will be realized, the present embodiments may be capable of other and different embodiments, and their details are capable of modification in various respects. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.





BRIEF DESCRIPTION OF THE DRAWINGS

The Figures described below depict various aspects of the system and methods disclosed therein. It should be understood that each Figure depicts one embodiment of a particular aspect of the disclosed system and methods, and that each of the Figures is intended to accord with a possible embodiment thereof. Further, wherever possible, the following description refers to the reference numerals included in the following Figures, in which features depicted in multiple Figures are designated with consistent reference numerals.


There are shown in the drawings arrangements which are presently discussed, it being understood, however, that the present embodiments are not limited to the precise arrangements and instrumentalities shown, wherein:



FIG. 1 illustrates an exemplary computer system for identifying a condition associated with an individual in a home environment, in accordance with some embodiments.



FIG. 2 illustrates an exemplary home environment, in accordance with some embodiments.



FIG. 3 illustrates several exemplary user interface displays, in accordance with some embodiments.



FIG. 4 illustrates a flow diagram of an exemplary computer-implemented method for identifying a condition associated with an individual in a home environment, in accordance with some embodiments.



FIG. 5 illustrates a flow diagram of an exemplary computer-implemented method for training a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments, in accordance with some embodiments.





The Figures depict preferred embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the systems and methods illustrated herein may be employed without departing from the principles of the invention described herein.


DETAILED DESCRIPTION

As discussed above, individuals may develop cognitive conditions or health conditions making it difficult and/or unsafe for them to live independently in a home environment. However, because the signs of such cognitive conditions and/or health conditions may be subtle, or may develop slowly over time, it may be difficult for caregivers to determine whether an individual is capable of safely living independently.


Accordingly, systems and methods for identifying conditions associated with an individual in a home environment are provided herein. Sensors associated with the home environment may passively detect data, which may be captured and analyzed by a processor in order to identify potential conditions associated with an individual in the home environment. In some instances, the sensors may include sensors configured to capture data indicative of electricity use by devices associated with the home environment, including, e.g., which devices are using electricity, what date/time electricity is used by each device, how long each device uses electricity, and/or the power source for the electricity used by each device. The processor may analyze the captured data to identify any abnormalities or anomalies, and, based upon any identified abnormalities or anomalies, the processor may determine a condition (e.g., a medical condition, or a deviation from normal routines or activity) associated with an individual in the home environment. The processor may generate a notification indicating the condition associated with the individual to a caregiver of the individual. Advantageously, the caregiver may be informed of any conditions associated with the individual that may arise.


Moreover, in some embodiments, the systems and methods for identifying conditions associated with an individual in a home environment may include systems and methods for training a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments. For instance, historical sensor data associated with home environments and corresponding historically identified conditions associated with individuals in the home environments may be used as inputs for a machine learning module to develop a predictive model that identifies a potential condition associated with an individual in a home environment using abnormalities or anomalies in the sensor data associated with the home environment.


The systems and methods provided herein therefore offer numerous benefits. In particular, the systems and methods effectively, efficiently, and non-intrusively identify conditions associated with an individual in a home environment and provide a notification to a caregiver associated with the individual indicating any identified conditions, allowing a caregiver outside of the home environment to be informed of the status of the individual. In this way, the safety of the individual in the home environment may be improved. That is, because the caregiver will be alerted to conditions affecting the individual, which may include health conditions, cognitive conditions, etc., the caregiver may provide any assistance needed to the individual. Moreover, the caregiver outside the home environment may be able to use the information provided to continually assess whether the individual may continue to safely live independently in the home environment, and accordingly make changes in the individual's care when needed.


Furthermore, according to certain implementations, the systems and methods may support a dynamic, real-time or near-real-time analysis of any captured, received, and/or detected data. In particular, in some embodiments, a caregiver device may receive an indication of a condition associated with the individual in the home environment in real-time or near real-time, and may automatically and dynamically take actions such as requesting emergency medical assistance when needed. In this regard, any caregiver is afforded the benefit of accurate and relevant data, and may provide immediate care and assistance as needed.


Exemplary Computer System


Turning to FIG. 1, an exemplary computer system 100 for identifying a condition associated with an individual in a home environment is illustrated. The high-level architecture illustrated in FIG. 1 may include both hardware and software applications, as well as various data communications channels for communicating data between the various hardware and software components, as is described below.


The computer system 100 may include one or more sensors 102 associated with a home environment 104 such as, e.g., a house, apartment, condominium, or other living space associated with an individual. The computer system 100 further includes a server 106 and a caregiver device 108, each of which may communicate with one another (and/or with the sensors 102) using one or more networks 110, which may be a wireless network, or which may include a combination of wireless and wired networks.


The sensors 102 may include, for example, electrical use sensors configured to detect data indicating, e.g., which outlets within the home environment are using electricity, which devices within the home environment are using electricity, the amount of electricity used by each device, dates and/or times at which each device uses electricity, the duration of time for which each device uses electricity, the power source for each device's electricity use (e.g., a standard power source versus a generator in an emergency situation), etc. Devices within the home environment may include appliances (e.g., stoves (including ignition sources for gas stoves), ovens, dishwashers, washers, dryers, refrigerators, freezers, microwaves, etc.), electronic devices (e.g., televisions, telephones, computers, etc.), heating and cooling devices (e.g., HVAC unit, furnace, hot water heater, etc.), lighting devices (e.g., overhead lights, lamps, etc.), water pumps, garage door openers, alarm systems, exercise equipment (e.g., treadmills, stationary bikes, elliptical machines, etc.), etc.


Additionally, in some instances, the sensors 102 associated with the home environment may also include smart sensors, or sensors associated with smart phones or other “smart” devices (i.e., devices connected to cellular and/or internet networks, or otherwise able to communicate with other devices, e.g., using Bluetooth) within the home environment, such as, e.g., virtual assistant devices, smart home devices, smart thermostats, smart appliances (e.g., smart ovens, smart dishwashers, smart washers, smart dryers, smart refrigerators, smart freezers, smart microwaves, etc.), smart lighting systems, smart speaker systems, smart robotics (e.g., smart robotic vacuums), smart motion sensors, smart water sensors, smart gas and ignition monitors, smart contact sensors, smart air movement and/or draft sensors, smart HVAC systems, smart pet monitor devices, smart medication dispensers, smart wearable electronics, smart alarm systems or security systems, smart scales, or any other suitable smart devices.


Moreover, in some instances, additional data may be captured by other suitable sensors 102 or other devices associated with the home environment (not shown), e.g., geo-locator tags associated with the home environment, weather monitoring devices associated with the home environment, vehicle sensors associated with the home environment (or with an individual within the home environment), wireless internet usage monitors, 3D printers, nanobots, fine motor control measurement devices, or any other suitable sensors or devices.


The sensors 102 may be configured to detect many different types of data, including but not limited to video data, image data, audio and/or decibel data, activity and/or movement data, vibration data, light data, arm/disarm data (i.e., with respect to an alarm system), body temperature data associated with the individual, home environment temperature data, moisture data, odor data, heart rate data associated with the individual, breathing rate data associated with the individual, hydration data associated with the individual, weight data associated with the individual, glucose/ketones levels associated with the individual, medication adherence data, travel and/or location data associated with the individual, socialization data associated with the individual, medical/health monitor device use data associated with the individual, appliance use data associated with the individual, electronics use data associated with the individual, air quality data, sleep data associated with the individual, eye movement data associated with the individual, exercise data associated with the individual, body control data associated with the individual, fine motor control data associated with the individual, speech data associated with the individual, health and/or nutrition data associated with the individual, hygiene data associated with the individual, sight and/or hearing data associated with the individual, etc. Accordingly, this data may be communicated to the server 106 and/or caregiver device 108 via the network 110.


The server 106 may in some instances be a collection of multiple co-located or geographically distributed servers, etc. Additionally, although only one server 106 is shown in FIG. 1, there may be many servers 106. Furthermore, the server may include a processor 112 and a memory 114. The processor 112 may in some embodiments include multiple processors, and may be configured to execute any software applications residing on the memory 114. The software applications may be configured to analyze the data detected by the sensors 102 to determine information associated with individuals in the home environment 104.


For example, by analyzing the data detected by the sensors 102, the server 106 may determine indications of, e.g., presence (in particular locations or in the home environment generally), walking patterns, falls, hazards, imminent danger, evidence of atypical behavior (physical, mental, emotional, social), intruders in the home environment, theft in the home environment, fraud, abuse, position of furniture (e.g., for safety layout recommendations), trips, falls, and other hazards, moisture and/or puddles on the floor or other surfaces or ceiling, whether lights are on or off, typical and atypical voice patterns (e.g., representing stroke, Alzheimer's disease, hearing decline, or cognitive decline), socialization (such as decline in conversation or detecting other people in the house), falls, behavioral change (e.g., more aggression, more arguing, less conversation than usual), laughter, crying, other sounds relating to emotion, vehicles coming and going, movements of the individual from room to room, pace and speed of movement of the individual, duration of time spent in a space by the individual, time the individual spends outside of the house, atypical movements (e.g., seizures, movements before or after a fall or injury), walking patterns and/or gaits, movements and activities related to cooking, cleaning, exercise, entertainment, or socialization, movements associated with a trip or stumble, eye movement, falls and the impact level of a fall, items dropping or breaking, entrance and/or exit of the individual into or out of the home environment, tornados, earthquakes, or other disaster events, phone calls and/or texts, locations in the home environment where lights are turned on or off, time duration of lights turned on and/or off, date of activation, activity of automatic lights that activate when movement or other stimulus is detected, alarm activation and/or disarm, alarm activation with no disarm, the number of times an alarm is armed and disarmed over an amount of time (such as an hour or a single day), frequency or amount of accidental alarm activations, exit times when the alarm is armed, home environment temperature (e.g., highs, lows, averages), hot water temperature from the heater, faucets, and bathtub/showers, oven or stovetop temperatures, body temperature of inhabitants, differentiation of temperature between rooms of the house, opening and/or closing of vents, plumbing leaks, sump pump activation/issues, humidity averages and out-of-average ranges in the home environment, bed wetting or accidents, carbon monoxide, carbon dioxide, air quality, smoke, stagnant air, mold/mildew, ammonia, body odor, feces, pet, urine, natural gas, burning food, presence of certain foods, medical information associated with the individual (such as heart rate, blood pressure, cholesterol, glucose, ketones, weight, hydration, nutrition, medication adherence, medical device use or adherence, breathing rate), GPS location, travel itinerary, routine locations traveled to, mode of travel (e.g., plane, train, automobile, ride sharing services), travel services, duration of travel, travel delays, travel interruptions or difficulties, interaction routines, individuals with whom the individual frequently interacts, frequency of interactions, types of interactions, internet usage or activity, streaming television shows or movies, social media interaction or activity, social media postings, email usage, etc.


Moreover, the memory 114 may include multiple memories, which may be implemented as semiconductor memories, magnetically readable memories, optically readable memories, biologically readable memories, and/or any other suitable type(s) of non-transitory, computer-readable storage media. Additionally, the server may be configured to access a database 116 (e.g., via the network 110), which may store data related to, inter alia, behavioral patterns of individuals in the home environment 104, conditions associated with individuals in the home environment 104, or other suitable information associated with individuals in the home environment 104, etc. In some instances, blockchain encryption may be implemented to securely store this data. The data stored by the database 116 may be used by any software applications stored in the memory 114.


For example, various software applications stored in the memory 114 may be configured to analyze the information associated with the individual to identify abnormalities or anomalies associated with the individual (e.g., abnormalities or anomalies in the individual's behaviors). Based upon the identified abnormalities or anomalies, a condition associated with an individual in the home environment may be determined. Specifically, the condition (e.g., a medical condition, cognitive condition, etc.) may be determined based upon atypical behaviors of the individual indicated by the identified abnormalities or anomalies. Based upon the condition, the software applications may be configured to generate a notification for a caregiver of the individual, and/or transmit the notification to a caregiver device 108.


The caregiver device 108 may be, for instance, a personal computer, cellular phone, smart phone, tablet computer, smart watch, a wearable electronic device such as a fitness tracker, a dedicated caregiver device, or any other suitable mobile device associated with a caregiver of an individual in the home environment 104. Furthermore, the caregiver device 108 may include a processor 118 and a memory 120. The processor 118 may in some embodiments include multiple processors, and may be configured to execute any software applications residing on the memory 120. Moreover, the memory 120 may include multiple memories, which may be implemented as semiconductor memories, magnetically readable memories, optically readable memories, biologically readable memories, and/or any other suitable type(s) of non-transitory, computer-readable storage media. Additionally, the caregiver device 108 may include a user interface 122 (e.g., a display configured to display a user interface), upon which notifications, alerts, etc. may be displayed, as discussed in greater detail with respect to FIG. 3 below.


Exemplary Home Environment


Turning now to FIG. 2, an exemplary home environment 200 is illustrated, in accordance with some embodiments. An individual 202 in the home environment 200 may use devices within the home environment 200 as part of a daily routine. For example, the individual 202 may wake up every morning and switch on a lamp 204 before using a shower 206 (and/or sink 208, toilet 210, or bath 212). Next, the individual 202 may switch on a television 214 and/or a living room lamp 216 and check emails or social media postings, before making breakfast using a stove 218 or oven 220.


Data captured by electrical use sensors 102 (or other sensors in the home environment) indicating the typical usage of each of these devices within the home environment 200 may be utilized to establish patterns and/or routines associated with the individual 202. For example, data captured by electrical use sensors indicating that the lamp 204 uses electricity starting every morning at 6:00 a.m. may indicate that the individual 202 has a pattern or routine of waking every morning at 6:00 a.m. As another example, data captured by electrical use sensors indicating that the television 214 uses energy for 2-3 hours per day may indicate that the individual 202 has a pattern or routine of watching 2-3 hours of television per day.
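
For illustration, the following is a minimal Python sketch of this pattern-building step, deriving a device-level routine from hypothetical electrical-use events. The (device, timestamp) event format, the device names, and the use of a median are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch: deriving a typical daily routine from hypothetical
# electrical-use events. The (device, timestamp) format and the use of a
# median are assumptions for illustration only.
from collections import defaultdict
from datetime import datetime
from statistics import median

# Each event records the first time a device drew electricity on a given day.
events = [
    ("bedroom_lamp", datetime(2018, 6, 1, 6, 2)),
    ("bedroom_lamp", datetime(2018, 6, 2, 5, 58)),
    ("bedroom_lamp", datetime(2018, 6, 3, 6, 1)),
    ("television", datetime(2018, 6, 1, 18, 30)),
    ("television", datetime(2018, 6, 2, 18, 45)),
]

def typical_first_use(events):
    """Return each device's typical first-use time, in minutes after midnight."""
    by_device = defaultdict(list)
    for device, ts in events:
        by_device[device].append(ts.hour * 60 + ts.minute)
    return {device: median(times) for device, times in by_device.items()}

routine = typical_first_use(events)
# {'bedroom_lamp': 361, 'television': 1117.5} -> lamp on near 6:00 a.m.,
# consistent with a wake-up routine around that time.
```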


Of course, other sensors 102 may also be utilized to establish patterns and/or routines associated with the individual 202 in various embodiments. For instance, data stored by smart lighting may indicate usage times of various lamps within the home environment, data stored by a smart television may indicate the duration of use of the television, data stored by a laptop or other computer may indicate usage times of the internet or streaming television shows, internet usage may also indicate social media usage or time-of-day usage, etc.


By analyzing the captured data, information about the behaviors of the individual within the home environment may be determined. Specifically, the captured data may be analyzed to identify abnormalities or anomalies. Based upon the identified abnormalities or anomalies, a condition associated with an individual in the home environment may be determined. In particular, the condition (in some instances, a medical condition) may be determined based upon atypical behaviors of the individual indicated by the identified abnormalities, anomalies, or shifts in behavior patterns. Consequently, a notification indicating the condition associated with the individual 202 may be generated and/or displayed for a caregiver of the individual (such as a parent, child, spouse, doctor, nurse, or other medical caregiver, assisted living facility caregiver or other professional caregiver, etc.), e.g., via a caregiver device 108.


Exemplary User Interface



FIG. 3 illustrates several exemplary user interface displays 300, 302, 304 generated and displayed for a caregiver (e.g., via a caregiver device 108) of an individual in a home environment based upon the data captured by the sensors 102. For example, user interface display 300 displays an alert indicating that a medical emergency associated with the individual in the home environment has been detected. Consequently, the caregiver may be presented with options to call the individual, and/or to call an emergency service (e.g., medical services, fire services, police services, etc.).


As another example, user interface display 302 displays an update indicating that a snapshot report for June 2018 is available for caregiver review. User interface display 304 displays an exemplary June 2018 snapshot report associated with the individual in the home environment, generated for the caregiver of the individual.


In various embodiments, snapshot reports as shown in user interface display 304 may be generated periodically (e.g., daily, weekly, and/or yearly), showing the individual's progress or decline in various areas, as well as items of potential concern. In some instances, the snapshot report may further include an indication of, e.g., whether the individual is able to live independently, whether the individual may need assistance, whether a fall or injury may have occurred, whether the individual is bathing, eating, or sleeping, potential cognitive or medical issues, whether the individual is taking his or her medication and/or whether the medication is helping the individual, etc. For example, as shown in exemplary user interface display 304, the snapshot report includes a notification that early signs of Alzheimer's disease have been detected in the individual, and that a possible fall has been detected.


In various embodiments, the report may include one or more of customized time period reports, comparative timeframe reports, an indication of the number of notifications sent to the caregiver and the reasons for each, medication taken, for what purpose, and whether that medication is working, areas of concern, areas of decline, areas where intervention or improvement needs to take place, areas where the individual needs assistance or additional assistance, suggested and/or recommended actions to be taken by an individual or by a caregiver, recommended services, support, and resources within a certain proximity of the location, number ranges of optimal (based upon age/ability) levels of physical, mental, social, and emotional engagement, activity, and/or ability, goals and/or goal setting/meeting features, etc. In some instances, the report may include an independent living overall score (e.g., with 100 representing ideal independent living capability, 70 representing family/support services interaction needed for independent living capability, 60 representing significant family/support services interaction needed to live independently, 50 and below representing professional assistance needed, etc.). For example, as shown in exemplary user interface display 304, the snapshot report includes an independent living score of 50.
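
One concrete reading of these exemplary score bands is sketched below. The function name and labels are illustrative, and the treatment of scores that fall between the stated bands (e.g., 51-59) is an assumption; only the thresholds come from the text.

```python
# Illustrative sketch of the exemplary independent living score bands
# described above. Handling of scores between the stated bands is an
# assumption; only the thresholds come from the text.
def independent_living_label(score: int) -> str:
    if score >= 100:
        return "ideal independent living capability"
    if score >= 70:
        return "family/support services interaction needed"
    if score >= 60:
        return "significant family/support services interaction needed"
    return "professional assistance needed"  # 50 and below

print(independent_living_label(50))  # as in user interface display 304
```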


Exemplary Computer-Implemented Method for Identifying Condition Associated with Individual in Home Environment


Turning now to FIG. 4, a flow diagram of an exemplary computer-implemented method 400 for identifying a condition associated with an individual in a home environment is illustrated, in accordance with some embodiments. The method 400 can be implemented as a set of instructions stored on a computer-readable memory and executable on one or more processors.


At block 402, data detected by a plurality of sensors (e.g., sensors 102) associated with a home environment (e.g., home environment 104) is captured. For example, an electrical use sensor may detect that, at a given time, an oven in the home environment is currently using electricity. As another example, an electrical use sensor may detect data indicating that a lamp in the home environment uses power on weekdays starting at 6:00 a.m., and on weekends starting at 8:00 a.m. As still another example, an electrical use sensor may detect data indicating that a water pump in the home environment has been operating for a certain amount of time (e.g., 15 minutes, one hour, all day, etc.). As an additional example, an electrical use sensor may detect that a furnace in the home environment is using power from a power generator.


By analyzing the captured data, information about the behaviors of the individual within the home environment may be determined. In some instances, the information indicated by the captured data may relate to current behaviors of the individual. For example, data from an electrical use sensor indicating that an oven in the home environment is currently using electricity may indicate that an individual in the home environment is currently cooking a meal. As another example, data from an electrical use sensor indicating that a water pump in the home environment has been operating for a certain amount of time (e.g., 15 minutes, one hour, all day, etc.) may indicate that an individual within the home environment is currently taking a shower or a bath.


Additionally, over time, the information indicated by the captured data may be used to determine patterns in the behaviors of the individual. In other words, the captured data may be analyzed to identify data patterns indicative of behavior patterns. For instance, data from an electrical use sensor indicating that a lamp in the home environment uses power every day starting at 6:00 a.m. may indicate that an individual within the home environment wakes up around 6:00 a.m. each morning. Similarly, data from an electrical use sensor indicating that a water pump in the home environment typically operates for 15 minutes at 6:30 a.m. may indicate that the individual typically showers at 6:30 a.m. each morning.


At block 404, the captured data is analyzed to identify abnormalities or anomalies. The captured data may be compared against identified patterns in order to identify instances in which the data is inconsistent with the identified patterns, which may in turn indicate abnormalities or anomalies in the behaviors of the individual. For instance, in the example described above, an identified pattern in the electrical usage of a lamp in the home environment starting at 6:00 a.m. may indicate a typical wake-up time of 6:00 a.m. for an individual in the home environment. However, if the captured data indicates that one day the lamp does not use electricity until 8:00 a.m., an anomaly or abnormality may be identified for that day. This captured data may in turn indicate an anomaly or abnormality in the behavior of the individual, i.e., that the individual woke up later than usual on that day.


As another example, data from an electrical use sensor may indicate that a water pump in the home environment operates once per day for 15 minutes, which may in turn indicate that the user showers every day. However, if the captured data indicates that one day the water pump does not operate, or operates for a time duration shorter than a typical shower (e.g., only operates for 2 minutes), an abnormality or anomaly may be identified for that day. This captured data may in turn indicate an abnormality or anomaly in the behavior of the individual, i.e., that the individual did not shower on that day.
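
A minimal sketch of this comparison step follows, reusing the routine structure from the earlier sketch. The 60-minute tolerance is an illustrative assumption; the disclosure does not specify thresholds.

```python
# Illustrative sketch: flagging deviations from an established routine.
# The 60-minute tolerance is an assumption; the disclosure specifies no
# particular threshold.
def detect_anomalies(routine, today, tolerance_min=60):
    """Compare today's first-use times (minutes after midnight) against each
    device's typical time; flag missing usage and large deviations."""
    anomalies = []
    for device, typical in routine.items():
        observed = today.get(device)
        if observed is None:
            anomalies.append((device, "no usage observed"))
        elif abs(observed - typical) > tolerance_min:
            anomalies.append((device, f"first use off by {observed - typical:+.0f} min"))
    return anomalies

# Lamp typically on at 6:00 a.m. (360); today it first drew power at 8:00 a.m.
print(detect_anomalies({"bedroom_lamp": 360}, {"bedroom_lamp": 480}))
# [('bedroom_lamp', 'first use off by +120 min')]
```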


In some instances, analyzing the captured data to identify abnormalities or anomalies may include analyzing the captured data to identify the emergence of new behavior patterns, or shifts in behavior patterns of the individual. For instance, using the example described above, over several years, an identified pattern in the electrical usage of a lamp (or television or computer) in the home environment starting at 6:00 a.m. may indicate a typical wake-up time of 6:00 a.m. for an individual in the home environment over those several years. However, over the most recent month, a new pattern in the electrical usage of the lamp may emerge, indicating a new typical wake-up time of 8:00 a.m. for the individual in the home environment.


As another example, a pattern in the electrical usage of a stove or oven may indicate that an individual in the home environment uses the stove or oven (likely to cook a meal) five times per week over the course of six months. However, over the most recent six months, a new pattern in the electrical usage of the stove or oven may emerge, indicating that the individual in the home environment uses the stove or oven (likely to cook a meal) only one time per week.
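
The following sketch illustrates one way such a shift might be detected, by comparing a recent window of weekly usage counts against a longer-term baseline. The counts and the 50% change threshold are illustrative assumptions.

```python
# Illustrative sketch: detecting a shift in a behavior pattern by comparing
# recent weekly usage counts against a longer-term baseline. The threshold
# is an assumption for illustration.
from statistics import mean

def detect_shift(baseline_counts, recent_counts, ratio=0.5):
    """Flag a shift when the recent weekly average falls below `ratio` times
    the baseline average (or rises above 1/ratio times it)."""
    base, recent = mean(baseline_counts), mean(recent_counts)
    if recent < base * ratio or recent > base / ratio:
        return f"shift: baseline {base:.1f}/week vs. recent {recent:.1f}/week"
    return None

# Stove used ~5 times/week over six months, then ~1 time/week recently.
print(detect_shift([5, 4, 6, 5, 5], [1, 1, 0, 2]))
# shift: baseline 5.0/week vs. recent 1.0/week
```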


At block 406, a condition associated with an individual in the home environment is determined based upon the identified abnormalities or anomalies. Specifically, the condition (in some instances, a medical condition) may be determined based upon atypical behaviors of the individual indicated by the identified abnormalities, anomalies, or shifts in behavior patterns. For example, if the captured data indicates that an individual has left a stove or oven on for an atypically long time, a forgetfulness condition associated with the individual may be determined (i.e., because the individual likely forgot that the stove or oven was on).


As another example, if the captured data previously indicated that the individual likely showered once per day, and the captured data indicates that the individual currently showers only once per week, a hygiene condition associated with the individual may be determined (i.e., because the individual is likely bathing less frequently). As still another example, if the captured data indicates that the individual wakes up at a time that is atypically late, an insomnia or fatigue condition associated with the individual may be determined.
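
A minimal rule-table sketch of this determination step is shown below, using only the example conditions described above. The anomaly keys and condition labels are illustrative, not an exhaustive or clinical mapping.

```python
# Illustrative sketch: mapping identified anomaly types to candidate
# conditions, following the examples above. Keys and labels are
# illustrative only.
CONDITION_RULES = {
    "oven_on_atypically_long": "forgetfulness condition",
    "shower_frequency_dropped": "hygiene condition",
    "wake_time_atypically_late": "insomnia or fatigue condition",
}

def determine_conditions(anomaly_types):
    return [CONDITION_RULES[a] for a in anomaly_types if a in CONDITION_RULES]

print(determine_conditions(["wake_time_atypically_late"]))
# ['insomnia or fatigue condition']
```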


Other atypical behaviors that may be detected based upon identified abnormalities or anomalies may include, for instance, lights not being turned on/off, changes in laundry habits, a refrigerator not opening and closing, an alarm not being armed or disarmed, slurred speech when using a voice assistant or passive listening, a change in gait, water running too long (e.g., suggesting a shower fall or cognitive failure), or sensing that a large mass has quickly flattened (e.g., suggesting a fall has occurred).


At block 408, a notification indicating the condition associated with the individual is generated. The notification may be displayed to a caregiver of the individual, such as a parent, child, spouse, doctor, nurse, or other medical caregiver, assisted living facility caregiver or other professional caregiver, etc. In some instances, the notification may indicate a specific event, such as a fall, an entrance or an exit, a travel event, a medical event, or another critical event associated with the individual.


In some examples, when the condition associated with the individual is an emergency medical condition or other urgent condition, a request for emergency services to be provided to the individual may be generated. For instance, ambulance, police, or fire services may be requested.


In some instances, e.g., as shown in FIG. 3, the notification may be a digital “snapshot” report generated periodically (e.g., daily, weekly, and/or yearly), showing the individual's progress or decline in various areas, as well as items of potential concern. The report may further include an indication of, e.g., whether the individual is able to live independently, whether the individual may need assistance, whether a fall or injury may have occurred, whether the individual is bathing, eating, or sleeping, potential cognitive issues, whether the individual is taking his or her medication and/or whether the medication is helping the individual, etc. In some instances, the reports may be configured to be shared with medical professionals, family members, caregivers, etc. Blockchain encryption may be implemented in some instances to keep the report information secure.


Exemplary Computer-Implemented Method for Training Machine Learning Module


Turning now to FIG. 5, a flow diagram of an exemplary computer-implemented method 500 for training a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments is illustrated, in accordance with some embodiments. The method 500 can be implemented as a set of instructions stored on a computer-readable memory and executable on one or more processors.


At block 502, historical data detected by a plurality of sensors (e.g., sensors 102) associated with a plurality of home environments (e.g., home environment 104) may be received. In some instances, the historical sensor data may be received upon accessing a database (e.g., database 116). Similarly, at block 504, historical data indicating one or more conditions associated with the individuals in each home environment may be received. In some instances, the historical data may be received upon accessing a database (e.g., database 116).


At block 506, the historical data detected by the plurality of sensors associated with the plurality of home environments and the historical data indicating conditions associated with individuals in each home environment may be analyzed using a machine learning module (or artificial intelligence, or other machine learning models, programs, or algorithms). That is, the historical sensor data and/or the historical condition data may be used as input for the machine learning module, which may employ a neural network, which may be a convolutional neural network, a deep learning neural network, or a combined learning module or program that learns in two or more fields or areas of interest. Reinforcement learning techniques may also be used.


The machine learning module's analysis of the historical sensor data and the historical condition data may include identifying and recognizing patterns in the data, including the types of data and usage data discussed herein. In some embodiments, existing data, including usage, text, or voice/speech data, may be used in order to facilitate making predictions for subsequent data. Voice recognition and/or word recognition techniques may also be used. Models may be created based upon example inputs in order to make valid and reliable predictions for novel inputs.


Additionally or alternatively, the machine learning programs may be trained by inputting sample data sets or certain data into the programs, such as mobile device, smart home, smart home sensor, drone, autonomous or semi-autonomous drone, image, vehicle telematics, smart or autonomous vehicle, and/or intelligent home telematics data. In general, training the neural network model may include establishing a network architecture, or topology, and adding layers that may be associated with one or more activation functions (e.g., a rectified linear unit, softmax, etc.), loss functions and/or optimization functions. Data sets used to train the artificial neural network(s) may be divided into training, validation, and testing subsets; these subsets may be encoded in an N-dimensional tensor, array, matrix, or other suitable data structures.


Training may be performed by iteratively training the network using labeled training samples. Training of the artificial neural network may produce, as a byproduct, weights or parameters, which may be initialized to random values. The weights may be modified as the network is iteratively trained, using one of several gradient descent algorithms, to reduce loss and to cause the values output by the network to converge to expected, or “learned,” values. In an embodiment, a regression neural network lacking an activation function may be selected, wherein input data may be normalized by mean centering to determine loss and quantify the accuracy of outputs. Such normalization may use a mean squared error loss function and/or mean absolute error. The artificial neural network model may be validated and cross-validated using standard techniques such as hold-out, K-fold, etc. In some embodiments, multiple artificial neural networks may be separately trained and operated, and/or separately trained and operated in conjunction.
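
The following is a minimal training sketch consistent with the description above, assuming PyTorch (the disclosure names no framework) and synthetic stand-in data for the historical sensor/condition records. The 8-feature layout, layer sizes, and hyperparameters are illustrative assumptions.

```python
# Illustrative training sketch, assuming PyTorch and synthetic stand-in
# data for the historical sensor/condition records. Layer sizes, the
# 8-feature layout, and hyperparameters are assumptions.
import torch
from torch import nn

torch.manual_seed(0)

# Synthetic stand-in: 1,000 home-environment records, 8 anomaly features,
# 3 condition classes.
X = torch.randn(1000, 8)
y = torch.randint(0, 3, (1000,))

# Training, validation, and testing subsets, as described above.
train_X, val_X, test_X = X[:700], X[700:850], X[850:]
train_y, val_y, test_y = y[:700], y[700:850], y[850:]

# Feedforward topology with a ReLU activation; softmax is folded into
# CrossEntropyLoss. Weights are randomly initialized by default.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 3))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # gradient descent

for epoch in range(50):  # iterative training on labeled samples
    optimizer.zero_grad()
    loss = loss_fn(model(train_X), train_y)
    loss.backward()   # gradients used to modify the weights, reducing loss
    optimizer.step()

with torch.no_grad():  # hold-out validation
    val_acc = (model(val_X).argmax(dim=1) == val_y).float().mean().item()
print(f"validation accuracy: {val_acc:.2f}")
```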


Furthermore, the machine learning programs may utilize deep learning algorithms that may be primarily focused on pattern recognition, and may be trained after processing multiple examples. The machine learning programs may include Bayesian program learning (BPL), voice recognition and synthesis, image or object recognition, optical character recognition, and/or natural language processing—either individually or in combination. The machine learning programs may also include natural language processing, semantic analysis, automatic reasoning, and/or other machine learning techniques, including those discussed elsewhere herein.


In supervised machine learning, a processing element may be provided with example inputs and their associated outputs, and may seek to discover a general rule that maps inputs to outputs, so that when subsequent novel inputs are provided the processing element may, based upon the discovered rule, accurately predict the correct output. In unsupervised machine learning, the processing element may be required to find its own structure in unlabeled example inputs.


At block 508, one or more abnormalities or anomalies in the historical data detected by the plurality of sensors corresponding to the conditions associated with the individuals in the home environments may be identified using a machine learning module, based upon the analysis. The identified abnormalities or anomalies in the historical data and their corresponding conditions may comprise a predictive model to be used to analyze current sensor data. For example, the model may include a prediction of a condition associated with an individual in the home environment based upon certain abnormal or anomalous patterns in current sensor data.
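
Continuing the training sketch above, applying the trained model to current sensor-derived features might look like the following; the feature vector and condition labels are illustrative assumptions.

```python
# Illustrative sketch: predicting a condition from current sensor-derived
# features with the model trained in the previous sketch. The labels and
# feature values are assumptions for illustration.
CONDITION_LABELS = ["no condition detected", "forgetfulness", "hygiene condition"]

current_features = torch.tensor([[0.2, 1.8, -0.4, 0.0, 0.9, -1.1, 0.3, 2.0]])
with torch.no_grad():
    predicted_class = model(current_features).argmax(dim=1).item()
print(CONDITION_LABELS[predicted_class])
```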


In some instances, certain aspects of the exemplary computer-implemented method 500 may be combined with aspects of the exemplary computer-implemented method 400. For example, the identified abnormalities or anomalies in the historical data and their corresponding conditions discussed with respect to block 508 may be utilized in the analysis discussed with respect to block 404 of the method 400, e.g., by comparing the one or more abnormalities or anomalies in current sensor data to the identified abnormalities or anomalies in the historical data and their corresponding conditions. Of course, additional or alternative combinations of these methods may be envisioned in various embodiments.


Exemplary Systems & Methods for Anomaly Detection


In one aspect, a computer-implemented method for identifying a condition associated with an individual in a home environment may be provided. The method may include (1) capturing or receiving data detected by a plurality of sensors associated with a home environment, such as at one or more transceivers associated with one or more local or remote processors via wireless transmission or data transmission over one or more radio frequency links; (2) analyzing, by the one or more local or remote processors, the captured or received sensor data to identify one or more abnormalities or anomalies; (3) determining, by the one or more local or remote processors, based upon the identified one or more abnormalities or anomalies, a condition associated with an individual in the home environment; (4) generating, by the one or more local or remote processors, a notification indicating the condition associated with the individual; and/or (5) transmitting, by the one or more local or remote processors and/or associated transceivers, the notification to a caregiver mobile device, the caregiver being associated with the individual. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


In another aspect, a computer system for identifying a condition associated with an individual in a home environment may be provided. The system may include one or more sensors associated with a home environment; one or more local or remote processors and/or associated transceivers configured to interface with the one or more sensors; and one or more memories storing non-transitory computer executable instructions that, when executed by the one or more processors, cause the computer system to: (1) capture data detected by the one or more sensors, or receive data generated by the one or more sensors, such as data wirelessly communicated over one or more radio frequency links; (2) analyze the captured data to identify one or more abnormalities or anomalies; (3) determine, based upon the identified one or more abnormalities or anomalies, a condition associated with an individual in the home environment; (4) generate an electronic notification indicating the condition associated with the individual; and/or (5) transmit the electronic notification, such as via wireless communication or data transmission over one or more radio frequency links, to a mobile device or other computing device associated with a caregiver associated with the individual. The system may include additional, less, or alternate functionality, including that discussed elsewhere herein.


In another aspect, a computer-implemented method for training a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments may be provided. The method may include (1) receiving, by one or more local or remote processors, historical data detected by a plurality of sensors associated with a plurality of home environments; (2) receiving, by the one or more local or remote processors, historical data indicating conditions associated with individuals in each of the plurality of home environments; (3) analyzing, by the one or more local or remote processors, using a machine learning module, (i) the historical data detected by the plurality of sensors associated with the plurality of home environments, and/or (ii) the historical data indicating conditions associated with individuals in each of the plurality of home environments; and/or (4) identifying, by the one or more local or remote processors, using the machine learning module, based upon the analysis, one or more abnormalities or anomalies in the historical data detected by the plurality of sensors corresponding to conditions associated with the individuals in the home environments. The method may also include (5) capturing or receiving current data detected by a plurality of sensors associated with a home environment; (6) analyzing, by the one or more local or remote processors and/or the trained machine learning module, the captured current data to identify one or more abnormalities or anomalies in the current data; (7) comparing, by the one or more local or remote processors and/or the trained machine learning module, the one or more abnormalities or anomalies in the current data to the abnormalities or anomalies in the historical data detected by the plurality of sensors corresponding to conditions associated with the individuals in the home environments; and/or (8) determining, by the one or more local or remote processors and/or the trained machine learning module, based upon the comparison, a current condition associated with an individual in the home environment. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


In one aspect, a computer system for training a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments may be provided. The system may include one or more local or remote processors and/or associated transceivers; and one or more memories storing non-transitory computer executable instructions that, when executed by the one or more processors and/or associated transceivers, cause the computer system to: (1) receive historical data detected by a plurality of sensors associated with a plurality of home environments; (2) receive historical data indicating conditions associated with individuals in each of the plurality of home environments; (3) analyze, using a machine learning module, (i) the historical data detected by the plurality of sensors associated with the plurality of home environments and/or (ii) the historical data indicating conditions associated with individuals in each of the plurality of home environments; and/or (4) identify, using the machine learning module, based upon the analysis, one or more abnormalities or anomalies in the historical data detected by the plurality of sensors corresponding to conditions associated with the individuals in the home environments. The system may include additional, less, or alternate functionality, including that discussed elsewhere herein.


In one aspect, a computer-implemented method for training a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments may be provided. The method may include (1) receiving, by one or more local or remote processors and/or associated transceivers (such as via wireless communication or data transmission over one or more radio frequency links or communication channels), historical data detected by a plurality of sensors associated with a plurality of home environments; (2) receiving, by the one or more local or remote processors and/or associated transceivers, historical data indicating conditions associated with individuals in each of the plurality of home environments; (3) inputting, by the one or more local or remote processors, into a machine learning module (i) the historical data detected by the plurality of sensors associated with the plurality of home environments and/or (ii) the historical data indicating conditions associated with individuals in each of the plurality of home environments to train the machine learning module to identify one or more abnormalities or anomalies in the historical data detected by the plurality of sensors corresponding to conditions associated with the individuals in the home environments; (4) capturing current data detected by a plurality of sensors associated with a home environment, or receiving current data detected by the plurality of sensors associated with the home environment, via the one or more local or remote processors and/or associated transceivers; (5) inputting, by the one or more local or remote processors, the current data into the trained machine learning module to identify one or more abnormalities or anomalies in the current data and/or an individual in the current data; (6) generating, by the one or more local or remote processors, a notification regarding the one or more abnormalities or anomalies and/or the individual; and/or (7) transmitting, by the one or more local or remote processors and/or transceivers, the notification to a mobile or other computing device of a caregiver for the individual. The method may include additional, less, or alternate actions, including those discussed elsewhere herein.


In another aspect, a computer system configured to train a machine learning module to identify abnormalities or anomalies in sensor data corresponding to conditions associated with individuals in home environments may be provided. The system may include one or more local or remote processors, memories, sensors, transceivers, and/or servers configured to: (1) receive, such as via wireless communication or data transmission over one or more radio frequency links or communication channels, historical data detected by a plurality of sensors associated with a plurality of home environments; (2) receive, such as via wireless communication or data transmission over one or more radio frequency links or communication channels, historical data indicating conditions associated with individuals in each of the plurality of home environments; (3) input, into a machine learning module, (i) the historical data detected by the plurality of sensors associated with the plurality of home environments and/or (ii) the historical data indicating conditions associated with individuals in each of the plurality of home environments to train the machine learning module to identify one or more abnormalities or anomalies (and/or conditions associated with an individual) in the historical data detected by the plurality of sensors corresponding to conditions associated with the individuals in the home environments; (4) capture current data detected by a plurality of sensors associated with a home environment, or receive, such as via wireless communication or data transmission over one or more radio frequency links or communication channels, current data detected by the plurality of sensors associated with the home environment; (5) input the current data into the trained machine learning module to identify one or more abnormalities or anomalies in the current data and/or an individual in the current data; (6) generate an electronic notification regarding the one or more abnormalities or anomalies and/or the individual; and/or (7) transmit the electronic notification, such as via wireless communication or data transmission over one or more radio frequency links or communication channels, to a mobile or other computing device of a caregiver for the individual. The system may include additional, less, or alternate functionality, including that discussed elsewhere herein.
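
Continuing the illustration, steps (4) through (7) of the aspects above (scoring current data and notifying a caregiver device) might look like the minimal sketch below. The feature layout mirrors the earlier training sketch, and send_notification is a hypothetical stand-in for transmission over a radio frequency link or push-notification service.

```python
# A minimal sketch of steps (4)-(7): score current sensor data with a trained
# module and notify a caregiver device when an anomaly is flagged.
import numpy as np
from sklearn.ensemble import IsolationForest

historical = np.array([[11.2, 14, 3.5, 7.9],
                       [10.8, 12, 4.0, 8.1],
                       [11.5, 15, 3.0, 7.6],
                       [10.9, 13, 3.8, 8.0]])
model = IsolationForest(contamination=0.05, random_state=0).fit(historical)

def send_notification(device_id: str, message: str) -> None:
    # Hypothetical stand-in for the wireless transmission described in step (7).
    print(f"[to {device_id}] {message}")

current_day = np.array([[1.8, 0, 0.0, 15.5]])  # step (4): today's captured data
if model.predict(current_day)[0] == -1:        # step (5): -1 means anomalous
    send_notification(                         # steps (6)-(7)
        "caregiver-phone-01",
        "Abnormality detected in today's home-sensor routine; please check in.",
    )
```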


Other Matters


The computer-implemented methods discussed herein may include additional, less, or alternate actions, including those discussed elsewhere herein. The methods may be implemented via one or more local or remote processors, transceivers, servers, and/or sensors (such as processors, transceivers, servers, and/or sensors mounted on homes, mobile devices, vehicles, computers, televisions, drones, or associated with smart infrastructure or remote servers), and/or via computer-executable instructions stored on non-transitory computer-readable media or medium.


Additionally, the computer systems discussed herein may include additional, less, or alternate functionality, including that discussed elsewhere herein. The computer systems discussed herein may include or be implemented via computer-executable instructions stored on non-transitory computer-readable media or medium.


In some embodiments, a processor or a processing element may be trained using supervised or unsupervised machine learning, and the machine learning program may employ a neural network, which may be a convolutional neural network, a deep learning neural network, or a combined learning module or program that learns in two or more fields or areas of interest.
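
For concreteness, one possible shape for the convolutional neural network mentioned above is sketched below in PyTorch. The architecture, the 64x64 grayscale input, and the two-class output ("typical" vs. "atypical") are illustrative assumptions; the disclosure does not prescribe a particular network.

```python
# A minimal convolutional neural network sketch, sized for small grayscale
# camera frames (assumed 64x64). Class labels and sizes are illustrative.
import torch
import torch.nn as nn

class TinySensorCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 64x64 -> 32x32
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(16 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

logits = TinySensorCNN()(torch.randn(1, 1, 64, 64))  # one synthetic frame
print(logits.shape)  # torch.Size([1, 2])
```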


Exemplary SPHERES Embodiments


In one embodiment, a SPHERES system may be provided. SPHERES—or Sensing Peripheral Heuristic Evidence, Reinforcement, and Engagement System—may be a network of passive smart systems (e.g., electricity-use monitoring sensors, connected home assistants, connected smart home systems, connected video security, and more) combined with AI/software to form a solution that autonomously and passively monitors and measures spheres of information, behaviors, and data points, looking for routines and changes to those routines. In one embodiment, the sensor or other data collected may provide actionable insights to caregivers based upon discrete data points, and may be used to monitor routines of certain individuals at risk of medical or other issues.


From passive data collection, SPHERES enables live tracking and notification of routine norms and of abnormalities/anomalies, impacts, and dangers within the data, and provides instant digital notification when the system senses a change that could cause harm, injury, or loss; that is an emergency; or that needs immediate additional human interaction and follow-up.


The system also produces on-demand, daily, monthly, yearly, and custom period-over-period digital snapshots of collected data to make changes (and red flags) in norms, routines, and behaviors easy to understand. SPHERES reports include recommendations based upon the collected data points to improve user experience and engagement, increase safety or efficiency, decrease time to intervention/action, save money, or otherwise decrease negative aspects or improve positive impacts. To keep this personal data safe, it may be stored using blockchain-based encryption.
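
One minimal way to realize a period-over-period snapshot is sketched below, assuming home events have already been reduced to (category, timestamp) pairs. The 50% change threshold and the category names are illustrative assumptions.

```python
# A minimal period-over-period snapshot: count events per category in two
# windows and flag large relative changes as red flags.
from collections import Counter

def snapshot(events):
    # Count events per category for one time period.
    return Counter(category for category, _ in events)

def red_flags(previous, current, threshold=0.5):
    # Flag categories whose event count changed by at least `threshold`
    # relative to the earlier period.
    flags = {}
    for category in set(previous) | set(current):
        before, after = previous.get(category, 0), current.get(category, 0)
        change = (after - before) / max(before, 1)
        if abs(change) >= threshold:
            flags[category] = round(change, 2)
    return flags

last_month = snapshot([("fridge_open", t) for t in range(60)] +
                      [("front_door", t) for t in range(30)])
this_month = snapshot([("fridge_open", t) for t in range(20)] +
                      [("front_door", t) for t in range(28)])
print(red_flags(last_month, this_month))  # {'fridge_open': -0.67}
```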


One exemplary embodiment may be related to independent/senior living. Here, using SPHERES with electric monitoring sensors, connected smart home devices and assistants (such as Nest, Google Home, or Amazon Alexa), security devices (such as Canary and ADT), and a PERS device (such as Life Alert), a family caregiver could understand the daily routine of a semi-independent loved one; know instantly, or within a short period of time, whether there are red flags or emergencies that need interaction; and, over time, have the information they need to help the care recipient make informed independent living or enhanced caregiving environment choices.


In this example, SPHERES would monitor energy use within the home, such as when lights are turned on or off, when computers are activated, when water pumps are running, when large appliances, TVs, or other devices are running, when a smart light activates or senses presence, and/or when a garage or other door opens. SPHERES may also monitor security features such as cameras and alarm systems, and/or monitor home assistants, such as a voice-activated assistant.


SPHERES, using passive monitoring, keeps track of an independent living consumer's daily routine. Because humans are creatures of habit, trends/routines—such as sleep and wake times, breakfast, lunch, and dinner routines, entertainment themes, enter/exit themes, patterns of movement around the house or property, personal hygiene themes, etc.—may be formed. Additionally, everyone has their own walking pattern, speech pattern, and other patterns that are unique to them.


By using a connected network of smart appliances, devices, and systems, sensor technology, and AI/software data sorting and analysis, SPHERES can understand when something is typical or atypical, such as: lights not being turned on/off; changes in laundry habits; a refrigerator not opening and closing; an alarm not being armed or disarmed; speech that is slurred when using a voice assistant or passive listening; a camera sensing that a person's gait has changed, perhaps due to injury; water running too long, which could suggest a shower fall or cognitive failure; sensing that a large mass similar to that of the resident has quickly flattened, suggesting a fall has occurred; and/or many other data points and abnormalities in routine (including those mentioned elsewhere herein) suggesting that action and instant caregiver notification are needed.
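
As a concrete illustration of distinguishing typical from atypical routine values, the sketch below flags a day whose wake time deviates from its history by more than a z-score cutoff. The metric (wake time in minutes after midnight), the sample values, and the cutoff are illustrative assumptions; any of the routine metrics listed above could be scored the same way.

```python
# A sketch of one way to decide "typical vs. atypical": compare today's value
# for a routine metric against its history using a z-score.
from statistics import mean, stdev

def is_atypical(history, today, cutoff=3.0):
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > cutoff

wake_times = [395, 402, 388, 410, 399, 405, 392]  # roughly 6:30-6:50 am
print(is_atypical(wake_times, 401))  # False: within the routine
print(is_atypical(wake_times, 660))  # True: woke near 11 am, worth a check-in
```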


SPHERES may also include reporting, recommendations, and action plan functionality. Using multiple data points over time, digital reports may be generated showing progress or decline and items of concern. These snapshot reports may be used to understand the whole picture of a loved one's independent living situation. Are they able to live independently? Do they need assistance? Has a fall or injury happened? Are they depressed and staying in bed longer or not eating? Are there cognitive issues? Is medication being taken? Is the medication beneficial or effective? Based upon passive monitoring of routines and behaviors, these and other questions are ones SPHERES can help family and professional caregivers answer about an independent living environment, while also supporting comparisons of progress and decline and informing discussions and decisions for or with a loved one.


SPHERES may also include equipment and devices used to passively monitor a home and/or occupants. A first and primary source monitored may be an electrical use monitoring system (such as one that may be tied into, or a part of, a home's main breaker box), which may be used to monitor time used, duration, what device is asking for power (e.g., appliance, furnace, hot water heater, devices, garage door, overhead lights/lamp, alarm, electrical outlet, water pump, ignition source), and/or what is providing external power (e.g., generator for emergencies). Other sources monitored may include smart phones, mobile devices, connected devices, connected assistants, connected appliances, connected home controls, connected safety systems, connected lighting or speaker systems, connected robotics or sensors, connected motion sensors, connected water sensors, connected gas and ignition monitors, connected contact sensors, connected air movement/draft sensors, connected pet monitoring, geo locator tags, weather monitor, connected vehicle sensors, Wi-Fi activity, medication dispensers or medical dispenser sensors, 3D printers, nano-bots, fine motor control measurements, and/or smart toilets/drains.
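
A per-device electrical-use log of the kind such a monitoring system might produce could be modeled as in the sketch below, together with a simple per-device aggregation. The field names and example devices are illustrative assumptions, not taken from the disclosure.

```python
# A sketch of a per-device electrical-use log recorded at the breaker box.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class PowerEvent:
    device: str           # e.g., "water_pump", "oven", "garage_door"
    start_hour: float     # hour of day the power draw began
    duration_min: float   # how long the device drew power
    source: str = "grid"  # or "generator" during an outage

def minutes_by_device(events):
    # Aggregate total draw time per device for a reporting period.
    totals = defaultdict(float)
    for event in events:
        totals[event.device] += event.duration_min
    return dict(totals)

log = [PowerEvent("water_pump", 7.5, 12.0),
       PowerEvent("oven", 17.8, 45.0),
       PowerEvent("water_pump", 21.0, 95.0)]  # an unusually long run
print(minutes_by_device(log))  # {'water_pump': 107.0, 'oven': 45.0}
```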


SPHERES may include sensors that may detect and measure video and still pictures, audio and decibel levels, activity and movement, vibration, light, arm/disarm functionality, temperature (body and house), moisture, odor, heart rate, breathing rate, hydration, weight, glucose/ketones levels, medical adherence, travel and location, socialization, medical/health monitor use, appliance and electronics use, air quality, sleep, eye movement, exercise, body control, fine motor control, speech, health, nutrition, hygiene, and/or sight and hearing.


The data collected by SPHERES may be analyzed by one or more artificial intelligence or machine learning models, modules, algorithms, or programs, including the types of machine learning techniques discussed elsewhere herein. For instance, the artificial intelligence or machine learning models, modules, algorithms, or programs may be trained using sample image, audio, home telematics, or other types of data, including video and still images. Once trained, current or near real-time data, including video, image, audio, home telematics, or other data, may be input into the trained models, modules, algorithms, or programs to identify abnormal or normal conditions or events, such as presence, walking patterns, falls, hazards, imminent danger, evidence of atypical behavior (physical, mental, emotional, social), intruders, theft, fraud, abuse, the position of furniture (for safety layout recommendations), and trip and/or other hazards. Also detected may be moisture/puddles on the floor, other surfaces, or the ceiling, and/or when lights are on and off.


Audio and decibel conditions or events may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms, such as typical and atypical voice patterns (representing, e.g., stroke or Alzheimer's detection, hearing decline such as increased TV volume, or cognitive decline such as repeating oneself), social patterns (such as a decline in conversation, or detecting other people in the house), fall detection, behavioral change (such as more aggression, arguing, or less conversation than normal), laughter, crying, other sounds relating to emotion, and/or a vehicle coming and going.


Activity and movement may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms, such as moving from room to room; pace and speed of movement; time spent in a space; time spent outside of the house; atypical movement (such as a seizure, or movement before or after a fall or injury); patterned walking style; movements and activities related to cooking, cleaning, exercise, entertainment, or socializing; movements associated with a trip or stumble; and/or eye movement.


Activity and movement may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms, such as detecting a fall and the impact level of the fall; detecting an item dropping or breaking; detecting entry/exit; detecting a tornado, earthquake, or other disaster event; a phone text/call; and/or a vehicle coming or going.


Light switch/light activation may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms, such as detecting when lights turn on and off at individual locations, for what amount of time, and on what date. The sensor or other data collected may also be used to detect activity of automatic lights that activate when movement or other stimulus is detected.


Arming and disarming functionality may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms. For instance, conditions that may be identified or detected include entry/exit; alarm activation/disarm; alarm activation with no disarm; and the number of times, over an amount of time such as an hour or a single day, that a home security alarm is armed and disarmed. Also detected may be the number of accidental alarm activations, and/or the number of times a home is exited with arming.


Home, home systems, and body temperature may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms. For instance, conditions that may be identified or detected include home temperature highs, lows, and averages; hot water temperature from the heater, faucets, and bathtub/showers; oven or stovetop temperatures; body temperature of inhabitants, to understand whether a notification is needed due to illness, accident, or death; differences in temperature between rooms of the house; and/or whether vents or windows are open or closed.


Moisture may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms. For instance, conditions that may be identified or detected include plumbing or other leaks; sump pump activation or issues; humidity averages, and humidity outside the average range, in the home; and/or bed wetting or accidents.


Odor may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms. For instance, conditions that may be identified or detected may include carbon monoxide/dioxide, air quality, smoke, stagnant air, mold/mildew, ammonia, body odor, feces, pet, urine, natural gas, burning food, and/or the presence of certain foods which cause allergic reactions.


Medical/bio data may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms. For instance, conditions that may be identified or detected may include medical information or characteristics, such as heart rate, BP, cholesterol, glucose, ketones, weight, hydration, nutrition, medication adherence or non-adherence, medical/health monitor device use/adherence at home, and/or breathing rate.


Travel and location may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms. For instance, conditions that may be identified or detected may include GPS location, travel itinerary, routine locations, mode or service of travel, purchases of travel (such as airline, train, or Uber), length and time of travel, and/or travel delays, interruptions, and/or difficulties.


Socialization may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms. For instance, conditions that may be identified or detected may include interaction routines, who interaction is with, how many times a day/week/month, and/or types/categories of conversations.


Appliance and electronics use may also be identified or detected by artificial intelligence programs or trained machine learning models, modules, programs, or algorithms.


In the event that a condition, event, or abnormal condition is detected or identified, a caregiver may receive a digital notification. For instance, digital messages may be received, such as via a mobile device, for a fall, medical, emergency, or critical event; entry/exit; travel; or a list of caregiver-selected events covering anything outside of the norms or recommended norms discussed herein.
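
A caregiver-selected event list of the kind described above might be realized as a simple subscription filter, as in the following sketch; the event names and the print-based transport are illustrative assumptions.

```python
# A sketch of caregiver-selected notifications: only event types the caregiver
# has subscribed to produce a digital message.
SUBSCRIBED = {"fall", "medical", "emergency", "entry_exit"}

def notify_if_subscribed(event_type: str, detail: str) -> bool:
    if event_type not in SUBSCRIBED:
        return False  # outside the caregiver's selected list; no message sent
    print(f"Push to caregiver device: [{event_type.upper()}] {detail}")
    return True

notify_if_subscribed("fall", "High-impact fall detected in bathroom at 9:14 pm")
notify_if_subscribed("tv_on", "Television switched on")  # filtered out
```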


Digital reports may be generated and transmitted to a caregiver's mobile or other computing device. The digital reports may be easy to scan, read, print, share, and analyze, and may include daily, weekly, and yearly snapshot reports; customized time period reports; and/or comparative timeframe reports. The reports may detail the number of notifications sent to the caregiver and the reasons for them, and/or the medication taken, for what condition, and whether that medication is working. The reports may highlight areas of concern, areas of decline, and areas where intervention or improvement needs to take place or where assistance or additional assistance is needed. The reports may suggest or recommend actions to be taken, and/or may feature recommended services, support, and resources within a certain proximity of the location. The reports may feature number ranges of optimal (based upon age/ability) levels of physical, mental, social, and emotional engagement, activity, and ability, and where recipients of care rank within those ranges.


The reports may further suggest goals and goal setting/meeting features, and/or provide an overall Independent Living score—as an example for demonstration: 100 representing ideal independent living capability; 70 representing family/support services interaction needed for independent living capability; 60 representing significant family/support services interaction needed to live independently; and 50 and below representing professional assistance needed. The reports may be shareable with doctors and selected family members, but secured with a rotating generated password provided by the designated primary family caregiver or Power of Attorney. Additionally or alternatively, the reports may be encrypted using blockchain or similar technology.
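
The demonstration score bands above could be mapped to support levels as in the sketch below. How the composite score itself is computed is not specified here, and treating scores between the listed values as belonging to the nearest lower band is an assumption.

```python
# A sketch mapping the demonstration Independent Living score bands to their
# recommended level of support, mirroring the example values in the text.
def support_level(score: int) -> str:
    if score >= 100:
        return "ideal independent living capability"
    if score >= 70:
        return "family/support services interaction needed"
    if score >= 60:
        return "significant family/support services interaction needed"
    return "professional assistance needed"  # 50 and below per the example

for s in (100, 85, 62, 48):
    print(s, "->", support_level(s))
```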


Additional Considerations


With the foregoing, an insurance customer may opt in to a rewards, insurance discount, or other type of program. After the insurance customer provides their affirmative consent, an insurance provider remote server may collect data from the customer's mobile device, smart home controller, smart vehicles, computers, televisions, or other smart devices—such as with the customer's permission or affirmative consent. The data collected may be related to insured assets before (and/or after) an insurance-related event, including those events discussed elsewhere herein. In return, risk averse insureds may receive discounts or insurance cost savings related to home, renters, personal articles, auto, and other types of insurance from the insurance provider.


In one aspect, data, including the types of data discussed elsewhere herein, may be collected or received by an insurance provider remote server, such as via direct or indirect wireless communication or data transmission from a smart home controller, mobile device, or other customer computing device, after a customer affirmatively consents or otherwise opts-in to an insurance discount, reward, or other program. The insurance provider may then analyze the data received with the customer's permission to provide benefits to the customer. As a result, risk averse customers may receive insurance discounts or other insurance cost savings based upon data that reflects low risk behavior and/or technology that mitigates or prevents risk to (i) insured assets, such as homes, personal belongings, or vehicles, and/or (ii) home or apartment occupants.


Although the foregoing text sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the invention may be defined by the words of the claims set forth at the end of this patent. The detailed description is to be construed as exemplary only and does not describe every possible embodiment, as describing every possible embodiment would be impractical, if not impossible. One could implement numerous alternate embodiments, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.


Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.


Additionally, certain embodiments are described herein as including logic or a number of routines, subroutines, applications, or instructions. These may constitute either software (e.g., code embodied on a non-transitory, machine-readable medium) or hardware. In hardware, the routines, etc., are tangible units capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.


In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that may be permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that may be temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.


Hardware modules may provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it may be communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and may operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.


Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within an office environment, or as a server farm), while in other embodiments the processors may be distributed across a number of locations.


Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.


As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.


As used herein, the terms “comprises,” “comprising,” “may include,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).


In addition, the articles "a" and "an" are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the description. This description, and the claims that follow, should be read to include one or at least one, and the singular also may include the plural unless it is obvious that it is meant otherwise.


This detailed description is to be construed as examples and does not describe every possible embodiment, as describing every possible embodiment would be impractical, if not impossible. One could implement numerous alternate embodiments, using either current technology or technology developed after the filing date of this application.


Unless a claim element is defined by reciting the word “means” and a function without the recital of any structure, it is not intended that the scope of any claim element be interpreted based upon the application of 35 U.S.C. § 112(f). The systems and methods described herein are directed to an improvement to computer functionality, and improve the functioning of conventional computers.

Claims
  • 1. A computer-implemented method for identifying abnormal conditions, the method comprising: capturing sensor data by one or more home-mounted sensors associated with a home environment; analyzing the captured sensor data using a trained machine learning model, the trained machine learning model being trained using historical sensor data collected at a plurality of home environments and historical condition data, the historical condition data indicating conditions associated with individuals in the plurality of home environments, wherein the trained machine learning model comprises a neural network comprising one or more layers, wherein at least one layer of the one or more layers is associated with at least one selected from a group consisting of an activation function, a loss function, and an optimization function; determining an abnormal condition of an individual in the home environment based upon the analysis using the trained machine learning model, the abnormal condition of the individual being a fall, wherein the determining an abnormal condition of an individual in the home environment comprises determining an impact level of the fall using the trained machine learning model based on the captured sensor data; generating an electronic notification indicating the abnormal condition of the individual via one or more processors; transmitting, via the one or more processors, the electronic notification indicating the abnormal condition of the individual to a caregiver device to be presented with one or more options; and in response to receiving a selection of the one or more options, requesting an emergency service to be provided to the individual.
  • 2. The computer-implemented method of claim 1, wherein the captured sensor data comprises image data captured by one or more home-mounted cameras.
  • 3. The computer-implemented method of claim 1, wherein the neural network model is trained using the historical sensor data, wherein the historical sensor data comprises historical image data.
  • 4. The computer-implemented method of claim 1, wherein the trained machine learning model comprises an image recognition model.
  • 5. The computer-implemented method of claim 1, wherein the transmitting the electronic notification indicating the abnormal condition of the individual comprises requesting an emergency service to be provided to the individual.
  • 6. A computer-implemented method for identifying abnormal conditions, comprising: receiving, via one or more processors of a caregiver device, a notification indicating an abnormal condition of an individual in a home environment from a device, wherein the device is configured to capture sensor data by one or more home-mounted sensors associated with a home environment and analyze the captured sensor data using a trained machine learning model to determine the abnormal condition of the individual, wherein (i) the abnormal condition of the individual is a fall, wherein the determining an abnormal condition of an individual in the home environment comprises determining an impact level of the fall using the trained machine learning model based on the captured sensor data, and (ii) the trained machine learning model is trained using historical sensor data collected at a plurality of home environments and historical condition data, the historical condition data indicating conditions associated with individuals in the plurality of home environments, wherein the trained machine learning model comprises a neural network comprising one or more layers, wherein at least one layer of the one or more layers is associated with at least one selected from a group consisting of an activation function, a loss function, and an optimization function; in response to receiving the notification indicating the abnormal condition of the individual, presenting one or more options via the one or more processors of the caregiver device; and in response to receiving a selection of the one or more options, requesting an emergency service to be provided to the individual via the one or more processors of the caregiver device.
  • 7. The computer-implemented method of claim 6, wherein the captured sensor data comprises image data captured by one or more home-mounted cameras.
  • 8. The computer-implemented method of claim 6, wherein the neural network model is trained using the historical sensor data, wherein the historical sensor data comprises historical image data.
  • 9. The computer-implemented method of claim 6, wherein the trained machine learning model comprises an image recognition model.
  • 10. A computer system comprising: a first device comprising: one or more home-mounted sensors configured to capture sensor data associated with a home environment; and one or more processors coupled to the one or more sensors and configured to: analyze the captured sensor data using a trained machine learning model, the trained machine learning model being trained using historical sensor data collected at a plurality of home environments and historical condition data, the historical condition data indicating conditions associated with individuals in the plurality of home environments, wherein the trained machine learning model comprises a neural network comprising one or more layers, wherein at least one layer of the one or more layers is associated with at least one selected from a group consisting of an activation function, a loss function, and an optimization function; determine an abnormal condition of an individual in the home environment based upon the analysis using the trained machine learning model, the abnormal condition of the individual being a fall, wherein the determining an abnormal condition of an individual in the home environment comprises determining an impact level of the fall using the trained machine learning model based on the captured sensor data; generate a notification indicating the abnormal condition of the individual; and transmit the notification indicating the abnormal condition of the individual to a caregiver device; and the caregiver device comprising at least one processor and configured to: receive the notification indicating the abnormal condition of the individual; in response to receiving the notification indicating the abnormal condition of the individual, present one or more options; and in response to receiving a selection of the one or more options, request an emergency service to be provided to the individual.
  • 11. The computer system of claim 10, wherein the captured sensor data comprises image data captured by one or more home-mounted cameras.
  • 12. The computer system of claim 10, wherein the neural network model is trained using the historical sensor data, wherein the historical sensor data comprises historical image data.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 17/077,785, filed Oct. 22, 2020, which is a continuation of U.S. application Ser. No. 16/169,544, filed Oct. 24, 2018, now U.S. Pat. No. 10,825,318, the entire disclosure of both applications being hereby incorporated herein, and which claim priority to and the benefit of: U.S. Application No. 62/654,975, filed Apr. 9, 2018, and entitled “SENSING PERIPHERAL HEURISTIC EVIDENCE, REINFORCEMENT, AND ENGAGEMENT SYSTEM,” the entire disclosure of which is hereby incorporated herein in its entirety; and U.S. Application No. 62/658,682, filed Apr. 17, 2018, and entitled “SENSING PERIPHERAL HEURISTIC EVIDENCE, REINFORCEMENT, AND ENGAGEMENT SYSTEM,” the entire disclosure of which is hereby incorporated herein in its entirety.

US Referenced Citations (487)
Number Name Date Kind
1446000 Davis Feb 1923 A
3740739 Griffin et al. Jun 1973 A
3817161 Koplon Jun 1974 A
3875612 Poitras Apr 1975 A
4066072 Cummins Jan 1978 A
5005125 Farrar et al. Apr 1991 A
5099751 Newman et al. Mar 1992 A
5128859 Carbone et al. Jul 1992 A
5553609 Chen et al. Sep 1996 A
5554433 Perrone et al. Sep 1996 A
5576952 Stutman et al. Nov 1996 A
5684710 Ehlers et al. Nov 1997 A
5884289 Anderson et al. Mar 1999 A
5903426 Ehling May 1999 A
5967975 Ridgeway Oct 1999 A
6023762 Dean et al. Feb 2000 A
6026166 Lebourgeois Feb 2000 A
6155324 Elliott et al. Dec 2000 A
6222455 Kaiser Apr 2001 B1
6286682 D Arbelles Sep 2001 B1
6324516 Shults et al. Nov 2001 B1
6428475 Shen Aug 2002 B1
6466921 Cordery et al. Oct 2002 B1
6526807 Doumit et al. Mar 2003 B1
6535855 Cahill et al. Mar 2003 B1
6554183 Sticha et al. Apr 2003 B1
6611206 Eshelman et al. Aug 2003 B2
6812848 Candela Nov 2004 B2
6826536 Forman Nov 2004 B1
6847892 Zhou et al. Jan 2005 B2
6886139 Liu Apr 2005 B2
6934692 Duncan Aug 2005 B1
6954758 O'Flaherty Oct 2005 B1
7030767 Candela Apr 2006 B2
7091865 Cuddihy et al. Aug 2006 B2
7154399 Cuddihy et al. Dec 2006 B2
7194416 Provost et al. Mar 2007 B1
7242305 Cuddihy et al. Jul 2007 B2
7309216 Spadola et al. Dec 2007 B1
7319990 Henty Jan 2008 B1
7340401 Koenig et al. Mar 2008 B1
7348882 Adamczyk et al. Mar 2008 B2
7356516 Richey et al. Apr 2008 B2
7395219 Strech Jul 2008 B2
7397346 Helal et al. Jul 2008 B2
7411510 Nixon Aug 2008 B1
7467094 Rosenfeld et al. Dec 2008 B2
7502498 Wen et al. Mar 2009 B2
7562121 Berisford et al. Jul 2009 B2
7586418 Cuddihy et al. Sep 2009 B2
7598856 Nick et al. Oct 2009 B1
7657441 Richey et al. Feb 2010 B2
7715036 Silverbrook et al. May 2010 B2
7733224 Tran Jun 2010 B2
7809587 Dorai et al. Oct 2010 B2
7813822 Hoffberg Oct 2010 B1
7966378 Berisford et al. Jun 2011 B2
8019622 Kaboff et al. Sep 2011 B2
8031079 Kates Oct 2011 B2
8041636 Hunter et al. Oct 2011 B1
8050665 Orbach Nov 2011 B1
8106769 Maroney et al. Jan 2012 B1
8108271 Duncan et al. Jan 2012 B1
8140418 Casey et al. Mar 2012 B1
8214082 Tsai et al. Jul 2012 B2
8229861 Trandal et al. Jul 2012 B1
8280633 Eldering et al. Oct 2012 B1
8289160 Billman Oct 2012 B1
8311941 Grant Nov 2012 B2
8316237 Felsher et al. Nov 2012 B1
8400299 Maroney et al. Mar 2013 B1
8490006 Reeser et al. Jul 2013 B1
8510196 Brandmaier et al. Aug 2013 B1
8527306 Reeser et al. Sep 2013 B1
8529456 Cobain Sep 2013 B2
8533144 Reeser et al. Sep 2013 B1
8571993 Kocher et al. Oct 2013 B2
8595034 Bauer et al. Nov 2013 B2
8596293 Mous et al. Dec 2013 B2
8605209 Becker Dec 2013 B2
8620841 Filson et al. Dec 2013 B1
8621097 Venkatakrishnan et al. Dec 2013 B2
8640038 Reeser et al. Jan 2014 B1
8650048 Hopkins et al. Feb 2014 B1
8665084 Shapiro et al. Mar 2014 B2
8669864 Tedesco et al. Mar 2014 B1
8670998 Bertha et al. Mar 2014 B2
8675920 Hanson et al. Mar 2014 B2
8682952 Kutzik et al. Mar 2014 B2
8694501 Trandal et al. Apr 2014 B1
8712893 Brandmaier et al. Apr 2014 B1
8730039 Billman May 2014 B1
8731975 English et al. May 2014 B2
8749381 Maroney et al. Jun 2014 B1
8803690 Junqua et al. Aug 2014 B2
8856383 Beninato et al. Oct 2014 B2
8868616 Otto et al. Oct 2014 B1
8882666 Goldberg et al. Nov 2014 B1
8890680 Reeser et al. Nov 2014 B2
8917186 Grant Dec 2014 B1
8929853 Butler Jan 2015 B2
8965327 Davis et al. Feb 2015 B2
8976937 Shapiro et al. Mar 2015 B2
9049168 Jacob et al. Jun 2015 B2
9057746 Houlette et al. Jun 2015 B1
9111349 Szeliski et al. Aug 2015 B2
9117349 Shapiro et al. Aug 2015 B2
9142119 Grant Sep 2015 B1
9152737 Micali et al. Oct 2015 B1
9165334 Simon Oct 2015 B2
9183578 Reeser et al. Nov 2015 B1
9202363 Grant Dec 2015 B1
9208661 Junqua et al. Dec 2015 B2
9262909 Grant Feb 2016 B1
9280252 Brandmaier et al. Mar 2016 B1
9286772 Shapiro et al. Mar 2016 B2
9297150 Klicpera Mar 2016 B2
9344330 Jacob et al. May 2016 B2
D764461 Romanoff et al. Aug 2016 S
9408561 Stone et al. Aug 2016 B2
9424606 Wilson et al. Aug 2016 B2
9424737 Bailey et al. Aug 2016 B2
9429925 Wait Aug 2016 B2
9443195 Micali et al. Sep 2016 B2
9472092 Grant Oct 2016 B1
9491277 Vincent Nov 2016 B2
9536052 Amarasingham et al. Jan 2017 B2
9589441 Shapiro et al. Mar 2017 B2
9609003 Chmielewski et al. Mar 2017 B1
9652976 Bruck et al. May 2017 B2
9654434 Sone et al. May 2017 B2
9665892 Reeser et al. May 2017 B1
9666060 Reeser et al. May 2017 B2
9699529 Petri et al. Jul 2017 B1
9739813 Houlette et al. Aug 2017 B2
9767680 Trundle Sep 2017 B1
9786158 Beaver et al. Oct 2017 B2
9798979 Fadell et al. Oct 2017 B2
9798993 Payne et al. Oct 2017 B2
9800570 Bleisch Oct 2017 B1
9800958 Petri et al. Oct 2017 B1
9812001 Grant Nov 2017 B1
9824397 Patel et al. Nov 2017 B1
9866507 Frenkel et al. Jan 2018 B2
9888371 Jacob Feb 2018 B1
9892463 Hakim et al. Feb 2018 B1
9898168 Shapiro et al. Feb 2018 B2
9898912 Jordan et al. Feb 2018 B1
9911042 Cardona et al. Mar 2018 B1
9922524 Devdas et al. Mar 2018 B2
9923971 Madey et al. Mar 2018 B2
9942630 Petri et al. Apr 2018 B1
9947202 Moon et al. Apr 2018 B1
9978033 Payne et al. May 2018 B1
9997056 Bleisch Jun 2018 B2
10002295 Cardona et al. Jun 2018 B1
10022084 Nonaka et al. Jul 2018 B2
10042341 Jacob Aug 2018 B1
10043369 Hopkins et al. Aug 2018 B2
10047974 Riblet et al. Aug 2018 B1
10055793 Call et al. Aug 2018 B1
10055803 Orduna et al. Aug 2018 B2
10057664 Moon et al. Aug 2018 B1
10073929 Vaynriber et al. Sep 2018 B2
10102584 Devereaux et al. Oct 2018 B1
10102585 Bryant et al. Oct 2018 B1
10107708 Schick et al. Oct 2018 B1
10136294 Mehta et al. Nov 2018 B2
10142394 Chmielewski et al. Nov 2018 B2
10147296 Gregg Dec 2018 B2
10176705 Grant Jan 2019 B1
10181160 Hakimi-Boushehri et al. Jan 2019 B1
10181246 Jackson Jan 2019 B1
10186134 Moon et al. Jan 2019 B1
10198771 Madigan et al. Feb 2019 B1
10204500 Cullin et al. Feb 2019 B2
10206630 Stone et al. Feb 2019 B2
10217068 Davis et al. Feb 2019 B1
10226187 Al-Ali et al. Mar 2019 B2
10226204 Heaton et al. Mar 2019 B2
10229394 Davis et al. Mar 2019 B1
10244294 Moon et al. Mar 2019 B1
10249158 Jordan et al. Apr 2019 B1
10258295 Fountaine Apr 2019 B2
10282787 Hakimi-Boushehri et al. May 2019 B1
10282788 Jordan et al. May 2019 B1
10282961 Jordan et al. May 2019 B1
10295431 Schick et al. May 2019 B1
10297138 Reeser et al. May 2019 B2
10298735 Preston et al. May 2019 B2
10304311 Clark et al. May 2019 B2
10304313 Moon et al. May 2019 B1
10319209 Carlton-Foss Jun 2019 B2
10323860 Riblet et al. Jun 2019 B1
10325471 Victor Jun 2019 B1
10325473 Moon et al. Jun 2019 B1
10332059 Matsuoka et al. Jun 2019 B2
10335059 Annegarn et al. Jul 2019 B2
10346811 Jordan et al. Jul 2019 B1
10353359 Jordan et al. Jul 2019 B1
10356303 Jordan et al. Jul 2019 B1
10380692 Parker et al. Aug 2019 B1
10387966 Shah et al. Aug 2019 B1
10388135 Jordan et al. Aug 2019 B1
10412169 Madey et al. Sep 2019 B1
10446000 Friar et al. Oct 2019 B2
10446007 Kawazu et al. Oct 2019 B2
10467476 Cardona et al. Nov 2019 B1
10469282 Konrardy et al. Nov 2019 B1
10475141 McIntosh et al. Nov 2019 B2
10480825 Riblet et al. Nov 2019 B1
10482746 Moon et al. Nov 2019 B1
10506411 Jacob Dec 2019 B1
10506990 Lee et al. Dec 2019 B2
10514669 Call et al. Dec 2019 B1
10515372 Jordan et al. Dec 2019 B1
10522009 Jordan et al. Dec 2019 B1
10522021 Victor Dec 2019 B1
10546478 Moon et al. Jan 2020 B1
10547918 Moon et al. Jan 2020 B1
10548512 Hausdorff et al. Feb 2020 B2
10565541 Payne et al. Feb 2020 B2
10573146 Jordan et al. Feb 2020 B1
10573149 Jordan et al. Feb 2020 B1
10579028 Jacob Mar 2020 B1
10586177 Choueiter et al. Mar 2020 B1
10607295 Hakimi-Boushehri et al. Mar 2020 B1
10621686 Mazar et al. Apr 2020 B2
10623790 Maddalena Apr 2020 B2
10634576 Schick et al. Apr 2020 B1
10679292 Call et al. Jun 2020 B1
10685402 Bryant et al. Jun 2020 B1
10726494 Shah et al. Jul 2020 B1
10726500 Shah et al. Jul 2020 B1
10733671 Hakimi-Boushehri et al. Aug 2020 B1
10733868 Moon et al. Aug 2020 B2
10735829 Petri et al. Aug 2020 B2
10740691 Choueiter et al. Aug 2020 B2
10741033 Jordan et al. Aug 2020 B1
10750252 Petri et al. Aug 2020 B2
10795329 Jordan et al. Oct 2020 B1
10796557 Sundermeyer et al. Oct 2020 B2
10802477 Konrardy et al. Oct 2020 B1
10818105 Konrardy et al. Oct 2020 B1
10823458 Riblet et al. Nov 2020 B1
10824971 Davis et al. Nov 2020 B1
10825316 Victor Nov 2020 B1
10825318 Williams et al. Nov 2020 B1
10825320 Moon et al. Nov 2020 B1
10825321 Moon et al. Nov 2020 B2
10832225 Davis et al. Nov 2020 B1
10846800 Bryant et al. Nov 2020 B1
10922756 Call et al. Feb 2021 B1
10922948 Moon et al. Feb 2021 B1
10943447 Jordan et al. Mar 2021 B1
10970990 Jacob Apr 2021 B1
10990069 Jacob Apr 2021 B1
11004320 Jordan et al. May 2021 B1
11015997 Schick et al. May 2021 B1
11017480 Shah et al. May 2021 B2
11042137 Call et al. Jun 2021 B1
11042938 Robare Jun 2021 B1
11042942 Hakimi-Boushehri et al. Jun 2021 B1
11043098 Jordan et al. Jun 2021 B1
11049078 Jordan et al. Jun 2021 B1
11049189 Shah et al. Jun 2021 B2
11074659 Hakimi-Boushehri et al. Jul 2021 B1
11118812 Riblet et al. Sep 2021 B1
11126708 Reimer Sep 2021 B2
11138861 Blatt Oct 2021 B2
20020040306 Sugiyama et al. Apr 2002 A1
20020046047 Budd Apr 2002 A1
20030023459 Shipon Jan 2003 A1
20040054789 Breh et al. Mar 2004 A1
20040153346 Grundel et al. Aug 2004 A1
20040153382 Boccuzzi et al. Aug 2004 A1
20040177032 Bradley et al. Sep 2004 A1
20040211228 Nishio et al. Oct 2004 A1
20040220538 Panopoulos Nov 2004 A1
20040249250 McGee et al. Dec 2004 A1
20050030175 Wolfe Feb 2005 A1
20050080520 Kline et al. Apr 2005 A1
20050137465 Cuddihy et al. Jun 2005 A1
20050139420 Spoltore et al. Jun 2005 A1
20050143956 Long et al. Jun 2005 A1
20050228245 Quy Oct 2005 A1
20050251427 Dorai et al. Nov 2005 A1
20050275527 Kates Dec 2005 A1
20060033625 Johnson et al. Feb 2006 A1
20060058612 Dave et al. Mar 2006 A1
20060100912 Kumar et al. May 2006 A1
20060154642 Scannell, Jr. Jul 2006 A1
20060184379 Tan et al. Aug 2006 A1
20060205564 Peterson Sep 2006 A1
20060271456 Romain et al. Nov 2006 A1
20070186165 Maislos et al. Aug 2007 A1
20080018474 Bergman et al. Jan 2008 A1
20080019392 Lee Jan 2008 A1
20080059351 Richey et al. Mar 2008 A1
20080101160 Besson May 2008 A1
20080154099 Aspel et al. Jun 2008 A1
20080184272 Brownewell Jul 2008 A1
20080201174 Ramasubramanian et al. Aug 2008 A1
20080240379 Maislos et al. Oct 2008 A1
20080285797 Hammadou Nov 2008 A1
20080292151 Kurtz et al. Nov 2008 A1
20080294462 Nuhaan et al. Nov 2008 A1
20080301019 Monk Dec 2008 A1
20090001891 Patterson Jan 2009 A1
20090012373 Raij et al. Jan 2009 A1
20090024420 Winkler Jan 2009 A1
20090044595 Vokey Feb 2009 A1
20090094129 Rhodes et al. Apr 2009 A1
20090243852 Haupt et al. Oct 2009 A1
20090259581 Horowitz et al. Oct 2009 A1
20090265193 Collins et al. Oct 2009 A1
20090281393 Smith Nov 2009 A1
20090326981 Karkanias et al. Dec 2009 A1
20100027777 Gupta et al. Feb 2010 A1
20100073840 Hennessey, Jr. Mar 2010 A1
20100131416 Means May 2010 A1
20100145164 Howell Jun 2010 A1
20100191824 Lindsay Jul 2010 A1
20100235285 Hoffberg Sep 2010 A1
20100241465 Amigo et al. Sep 2010 A1
20100286490 Koverzin Nov 2010 A1
20100299217 Hui Nov 2010 A1
20110003577 Rogalski et al. Jan 2011 A1
20110021140 Binier Jan 2011 A1
20110077875 Tran et al. Mar 2011 A1
20110112660 Bergmann et al. May 2011 A1
20110161117 Busque et al. Jun 2011 A1
20110173122 Singhal Jul 2011 A1
20110181422 Tran Jul 2011 A1
20110201901 Khanuja Aug 2011 A1
20110218827 Kenefick et al. Sep 2011 A1
20110224501 Hudsmith Sep 2011 A1
20110238564 Lim et al. Sep 2011 A1
20110246123 Dellostritto et al. Oct 2011 A1
20110251807 Rada et al. Oct 2011 A1
20110276489 Larkin Nov 2011 A1
20120016695 Bernard et al. Jan 2012 A1
20120046973 Eshleman et al. Feb 2012 A1
20120047072 Larkin Feb 2012 A1
20120095846 Leverant Apr 2012 A1
20120101855 Collins et al. Apr 2012 A1
20120116820 English et al. May 2012 A1
20120143754 Patel Jun 2012 A1
20120166115 Apostolakis Jun 2012 A1
20120188081 Van Katwijk Jul 2012 A1
20120232935 Voccola Sep 2012 A1
20120237908 Fitzgerald et al. Sep 2012 A1
20120265586 Mammone Oct 2012 A1
20120290333 Birchall Nov 2012 A1
20120290482 Atef et al. Nov 2012 A1
20130013513 Ledbetter et al. Jan 2013 A1
20130030974 Casey et al. Jan 2013 A1
20130049950 Wohlert Feb 2013 A1
20130073299 Warman et al. Mar 2013 A1
20130073306 Shlain et al. Mar 2013 A1
20130073321 Hofmann et al. Mar 2013 A1
20130082842 Balazs et al. Apr 2013 A1
20130096960 English et al. Apr 2013 A1
20130100268 Mihailidis et al. Apr 2013 A1
20130104022 Coon Apr 2013 A1
20130144486 Ricci Jun 2013 A1
20130147899 Labhard Jun 2013 A1
20130159021 Felsher Jun 2013 A1
20130166325 Ganapathy et al. Jun 2013 A1
20130223405 Kim et al. Aug 2013 A1
20130226624 Blessman et al. Aug 2013 A1
20130234840 Trundle et al. Sep 2013 A1
20130257626 Masli et al. Oct 2013 A1
20130262155 Hinkamp Oct 2013 A1
20130267795 Cosentino et al. Oct 2013 A1
20130290013 Forrester Oct 2013 A1
20130290033 Reeser et al. Oct 2013 A1
20130304514 Hyde et al. Nov 2013 A1
20140006284 Faith et al. Jan 2014 A1
20140058854 Ranganath et al. Feb 2014 A1
20140108031 Ferrara Apr 2014 A1
20140122133 Weisberg et al. May 2014 A1
20140136242 Weekes et al. May 2014 A1
20140148733 Stone et al. May 2014 A1
20140180723 Cote et al. Jun 2014 A1
20140184408 Herbst Jul 2014 A1
20140201315 Jacob et al. Jul 2014 A1
20140201844 Buck Jul 2014 A1
20140207486 Carty et al. Jul 2014 A1
20140222329 Frey Aug 2014 A1
20140222469 Stahl et al. Aug 2014 A1
20140229205 Gibson Aug 2014 A1
20140238511 Klicpera Aug 2014 A1
20140244997 Goel et al. Aug 2014 A1
20140257851 Walker et al. Sep 2014 A1
20140257871 Christensen et al. Sep 2014 A1
20140266669 Fadell et al. Sep 2014 A1
20140266717 Warren et al. Sep 2014 A1
20140278571 Mullen et al. Sep 2014 A1
20140303801 Ahn et al. Oct 2014 A1
20140340216 Puskarich Nov 2014 A1
20140358592 Wedig et al. Dec 2014 A1
20140362213 Tseng Dec 2014 A1
20140379156 Kamel et al. Dec 2014 A1
20150002293 Nepo Jan 2015 A1
20150032480 Blackhurst et al. Jan 2015 A1
20150061859 Matsuoka et al. Mar 2015 A1
20150094830 Lipoma et al. Apr 2015 A1
20150116112 Flinsenberg et al. Apr 2015 A1
20150134343 Kluger et al. May 2015 A1
20150154712 Cook Jun 2015 A1
20150154880 Petito et al. Jun 2015 A1
20150160623 Holley Jun 2015 A1
20150160636 McCarthy et al. Jun 2015 A1
20150163412 Holley et al. Jun 2015 A1
20150170288 Harton et al. Jun 2015 A1
20150187019 Fernandes et al. Jul 2015 A1
20150206249 Fini Jul 2015 A1
20150244855 Serra Aug 2015 A1
20150287310 Deiiuliis et al. Oct 2015 A1
20150305690 Tan et al. Oct 2015 A1
20150332407 Wilson et al. Nov 2015 A1
20150347910 Fadell et al. Dec 2015 A1
20150356701 Gandy et al. Dec 2015 A1
20150364028 Child et al. Dec 2015 A1
20160018226 Plocher et al. Jan 2016 A1
20160042463 Gillespie Feb 2016 A1
20160078744 Gieck Mar 2016 A1
20160104250 Allen et al. Apr 2016 A1
20160119424 Kane et al. Apr 2016 A1
20160171864 Ciaramelletti et al. Jun 2016 A1
20160174913 Somanath et al. Jun 2016 A1
20160188829 Southerland et al. Jun 2016 A1
20160225240 Voddhi et al. Aug 2016 A1
20160259902 Feldman et al. Sep 2016 A1
20160337829 Fletcher et al. Nov 2016 A1
20160342767 Narasimhan et al. Nov 2016 A1
20160360965 Tran Dec 2016 A1
20160371620 Nascenzi et al. Dec 2016 A1
20170124276 Tee May 2017 A1
20170124277 Shlagman May 2017 A1
20170147722 Greenwood May 2017 A1
20170172465 Osorio Jun 2017 A1
20170193164 Simon et al. Jul 2017 A1
20170262604 Francois Sep 2017 A1
20170270260 Shetty et al. Sep 2017 A1
20170304659 Chen et al. Oct 2017 A1
20180000346 Cronin Jan 2018 A1
20180032696 Rome Feb 2018 A1
20180068081 Salem Mar 2018 A1
20180075204 Lee et al. Mar 2018 A1
20180153477 Nagale et al. Jun 2018 A1
20180160988 Miller et al. Jun 2018 A1
20180211509 Ramaci Jul 2018 A1
20180211724 Wang Jul 2018 A1
20180276710 Tietzen et al. Sep 2018 A1
20180280245 Khalid Oct 2018 A1
20180308569 Luellen Oct 2018 A1
20180322947 Potts et al. Nov 2018 A1
20180325470 Fountaine Nov 2018 A1
20180342329 Rufo et al. Nov 2018 A1
20180357386 Sanjay-Gopal Dec 2018 A1
20180365957 Wright et al. Dec 2018 A1
20190019379 Beller et al. Jan 2019 A1
20190046039 Ramesh et al. Feb 2019 A1
20190069154 Booth et al. Feb 2019 A1
20190083003 Lee et al. Mar 2019 A1
20190108841 Vergyri et al. Apr 2019 A1
20190122760 Wang Apr 2019 A1
20190133445 Eteminan et al. May 2019 A1
20190206533 Singh et al. Jul 2019 A1
20190228397 Madden Jul 2019 A1
20190279647 Jones et al. Sep 2019 A1
20190287376 Netscher et al. Sep 2019 A1
20190303760 Kumar et al. Oct 2019 A1
20190320900 Majmudar Oct 2019 A1
20200019852 Yoon et al. Jan 2020 A1
20200121544 George et al. Apr 2020 A1
20200126670 Bender et al. Apr 2020 A1
20200143655 Gray et al. May 2020 A1
20200302549 Jordan et al. Sep 2020 A1
20200327791 Moon et al. Oct 2020 A1
20200334554 Takahashi et al. Oct 2020 A1
20200337651 Kwan Oct 2020 A1
20210035432 Moon et al. Feb 2021 A1
20210042843 Bryant et al. Feb 2021 A1
20210158671 Jordan et al. May 2021 A1
Foreign Referenced Citations (10)
Number Date Country
202865924 Apr 2013 CN
201811043670 Jul 2018 IN
201811043670 Dec 2018 IN
2014-142889 Aug 2014 JP
2009061936 May 2009 WO
2011133628 Oct 2011 WO
2013076721 May 2013 WO
2014207558 Dec 2014 WO
2019086849 May 2019 WO
2020010217 Jan 2020 WO
Non-Patent Literature Citations (73)
Entry
Knutsen, Confusion about causation in insurance: solutions for catastrophic losses, Ala. L. Rev., 5:957-1023 (2010).
Michael E. Porter, “How Smart, Connected Products Are Transforming Competition”, Harvard Business Review, Nov. 2014 (Year: 2014).
System for Loss Prevention, IP.com, published Nov. 8, 2008.
U.S. Appl. No. 14/692,864, Final Office Action, dated Nov. 8, 2017.
U.S. Appl. No. 14/692,864, Nonfinal Office Action, dated May 24, 2018.
U.S. Appl. No. 14/692,943, Nonfinal Office Action, dated Sep. 12, 2017.
U.S. Appl. No. 14/692,943, Notice of Allowance, dated May 1, 2018.
U.S. Appl. No. 14/692,946, Final Office Action, dated Oct. 30, 2017.
U.S. Appl. No. 14/692,946, Nonfinal Office Action, dated Apr. 4, 2017.
U.S. Appl. No. 14/692,946, Nonfinal Office Action, dated Apr. 6, 2018.
U.S. Appl. No. 14/692,953, Final Office Action, dated Apr. 27, 2018.
U.S. Appl. No. 14/692,953, Nonfinal Office Action, dated Sep. 19, 2017.
U.S. Appl. No. 14/692,961, Final Office Action, dated Jun. 20, 2018.
U.S. Appl. No. 14/692,961, Final Office Action, dated Sep. 1, 2017.
U.S. Appl. No. 14/692,961, Nonfinal Office Action, dated Apr. 14, 2017.
U.S. Appl. No. 14/692,961, Nonfinal Office Action, dated Dec. 28, 2017.
U.S. Appl. No. 14/693,021, Final Office Action, dated Jan. 25, 2018.
U.S. Appl. No. 14/693,021, Nonfinal Office Action, dated Jun. 30, 2017.
U.S. Appl. No. 14/693,032, Final Office Action, dated Mar. 22, 2018.
U.S. Appl. No. 14/693,032, Nonfinal Office Action, dated Sep. 7, 2017.
U.S. Appl. No. 14/693,032, Notice of Allowance, dated Jun. 22, 2018.
U.S. Appl. No. 14/693,034, Nonfinal Office Action, dated May 17, 2017.
U.S. Appl. No. 14/693,034, Notice of Allowance, dated Oct. 25, 2017.
U.S. Appl. No. 14/693,039, Final Office Action, dated Dec. 15, 2017.
U.S. Appl. No. 14/693,039, Nonfinal Office Action, dated Jun. 5, 2017.
U.S. Appl. No. 14/693,039, Nonfinal Office Action, dated May 3, 2018.
U.S. Appl. No. 14/693,057, Final Office Action, dated Feb. 7, 2018.
U.S. Appl. No. 14/693,057, Nonfinal Office Action, dated Aug. 21, 2017.
U.S. Appl. No. 14/873,722, Final Office Action, dated Jun. 15, 2018.
U.S. Appl. No. 14/873,722, Nonfinal Office Action, dated Dec. 5, 2017.
U.S. Appl. No. 14/873,783, Final Office Action, dated May 23, 2018.
U.S. Appl. No. 14/873,783, Nonfinal Office Action, dated Dec. 8, 2017.
U.S. Appl. No. 14/873,823, Final Office Action, dated Jun. 29, 2018.
U.S. Appl. No. 14/873,823, Final Office Action, dated Mar. 15, 2017.
U.S. Appl. No. 14/873,823, Final Office Action, dated Nov. 3, 2017.
U.S. Appl. No. 14/873,823, Nonfinal Office Action, dated Feb. 23, 2018.
U.S. Appl. No. 14/873,823, Nonfinal Office Action, dated Jun. 21, 2017.
U.S. Appl. No. 14/873,823, Nonfinal Office Action, dated Nov. 30, 2016.
U.S. Appl. No. 14/873,864, Corrected Notice of Allowability, dated Jan. 18, 2018.
U.S. Appl. No. 14/873,864, Final Office Action, dated Dec. 2, 2016.
U.S. Appl. No. 14/873,864, Nonfinal Office Action, dated Apr. 5, 2017.
U.S. Appl. No. 14/873,864, Nonfinal Office Action, dated Jul. 14, 2016.
U.S. Appl. No. 14/873,864, Notice of Allowance, dated Aug. 28, 2017.
U.S. Appl. No. 14/873,864, Notice of Allowance, dated Dec. 21, 2017.
U.S. Appl. No. 14/873,914, Nonfinal Office Action, dated Dec. 26, 2017.
U.S. Appl. No. 14/873,942, Nonfinal Office Action, dated Mar. 16, 2018.
U.S. Appl. No. 14/873,942, Nonfinal Office Action, dated Nov. 22, 2017.
U.S. Appl. No. 15/409,248, filed Jan. 18, 2017, Konrardy et al., “Sensor Malfunction Detection”.
U.S. Appl. No. 15/409,271, filed Jan. 18, 2017, Konrardy et al., “Autonomous Vehicle Component Malfunction Impact Assessment”.
U.S. Appl. No. 15/409,305, filed Jan. 18, 2017, Konrardy et al., “Component Malfunction Impact Assessment”.
“Elderly Alexa helps families care for their loved ones via voice”, Perez, Sarah, techcrunch.com, May 14, 2017 (Year: 2017).
“How to use Alexa Care Hub to help monitor and contact older relatives or friends”, Dave Johnson, Business Insider, Jan. 14, 2021, https://www.businessinsider.com/how-to-use-alexa-care-hub.
"Amazon's Care Hub will see success due to swelling interest in aging at home and boosted smart speaker adoption", Zoe LaRock, Business Insider, Nov. 13, 2020, https://www.businessinsider.com/amazon-care-hub-will-succeed-amid-growing-smart-speaker-adoption-2020-11.
Gurley, The Accuracy Of Self-Reported Data Of An Aging Population Using A Telehealth System In A Retirement Community Setting Based On The User's Age, Gender, Employment Status And Computer Experience, Dissertation, University of Maryland, Baltimore (2016).
U.S. Appl. No. 15/409,318, filed Jan. 18, 2017, Konrardy et al., “Automatic Repair of Autonomous Vehicles”.
U.S. Appl. No. 15/409,336, filed Jan. 18, 2017, Konrardy et al., “Automatic Repair of Autonomous Components”.
U.S. Appl. No. 15/409,340, filed Jan. 18, 2017, Konrardy et al., “Autonomous Vehicle Damage and Salvage Assessment”.
U.S. Appl. No. 15/409,349, filed Jan. 18, 2017, Konrardy et al., “Component Damage and Salvage Assessment”.
U.S. Appl. No. 15/409,359, filed Jan. 18, 2017, Konrardy et al., “Detecting and Responding to Autonomous Vehicle Collisions”.
U.S. Appl. No. 15/409,371, filed Jan. 18, 2017, Konrardy et al., “Detecting and Responding to Autonomous Environment Incidents”.
U.S. Appl. No. 15/409,445, filed Jan. 18, 2017, Konrardy et al., “Virtual Testing of Autonomous Vehicle Control System”.
U.S. Appl. No. 15/409,473, filed Jan. 18, 2017, Konrardy et al., “Virtual Testing of Autonomous Environment Control System”.
U.S. Appl. No. 15/859,859, filed Jan. 2, 2018, Hakmi-Boushehri et al., “Systems and Methods for Community-Based Cause of Loss Determination”.
U.S. Appl. No. 15/895,149, filed Feb. 13, 2018, Jordan et al., "Systems and Methods for Automatically Generating an Escape Route".
U.S. Appl. No. 14/692,864, Nonfinal Office Action, dated May 16, 2017.
Núñez-Marcos et al., Vision-Based Fall Detection with Convolutional Neural Networks, Wir. Comm. Mob. Comp., 2017(9474806):16 (2017).
Yildirim et al., Fall detection using smartphone-based application, Int. J. Appl. Mathematics Electronics and Computers, 4(4):140-144 (2016).
Yu et al., A posture recognition-based fall detection system for monitoring an elderly person in a smart home environment, IEEE Tran. Infor. Tech. Biom., 16(6):1274-1286 (2012).
Final Office Action, U.S. Appl. No. 17/574,874, dated May 18, 2022.
U.S. Appl. No. 17/077,785, Notice of Allowance, dated Jul. 14, 2022.
Related Publications (1)
Number Date Country
20220139190 A1 May 2022 US
Provisional Applications (2)
Number Date Country
62658682 Apr 2018 US
62654975 Apr 2018 US
Continuations (2)
Number Date Country
Parent 17077785 US
Child 17574874 US
Parent 16169544 Oct 2018 US
Child 17077785 US