The present invention relates to the field of wearable electronic devices, and more specifically relates to a device and method for stress identification, emotion recognition and management.
Coping with stress and recognizing and controlling one's negative emotions is something everyone does on a daily basis, but for some people it is harder than for others. This is a particularly pressing issue for people affected by certain mental health conditions, such as autism, attention deficit hyperactivity disorder (ADHD), post-traumatic stress disorder (PTSD) or bipolar disorder, where emotional manifestations are externalized, but also for people affected by other conditions, such as anxiety and panic attacks, where the emotional manifestations are internalized.
These conditions are often marked by ongoing patterns of hyperactivity and/or impulsivity, triggered by daily stress, that interfere with social functioning, at school or in the workplace, and with the person's overall development. The most common symptoms of conditions with externalized manifestations are connected with hyperactivity and impulsivity, meaning that the affected person seems to move about constantly, including in situations in which it is not appropriate. Occasionally, the affected person may exhibit emotional flares, during which he/she takes hasty actions, including violent actions with a high potential for harm, without prior consideration.
Current solutions to such conditions include medication, which may have significant side effects and age limitations (not advisable below the ages of 7-10, depending on the active compound), and psychological therapy, which may provide gradual benefits only over a long-term horizon and may have limited capability for immediate and direct behavioral influence during social interactions. This is especially true for children, who exhibit the least self-control and for whom specialized counseling may be necessary. For example, trained observers may use simple attention distraction or meditation to calm down children before the onset of an emotional flare.
Another solution for children and teenagers, which greatly improves the efficiency of therapy, is classroom intervention, usually provided in schools. Classroom intervention may help children identify stressors and deal with them at the exact moment of need during classes, but it comes with high costs and low penetration, as it requires trained personnel and one-to-one interaction with the individuals. As a result, classroom intervention is far from being as widely available as needed. A Lehigh University study focused on ADHD (DuPaul, 2019), published in March 2019, found that out of 2,495 children with ADHD, one in three received no school-based interventions and two out of five received no classroom management. At least one in five students experiencing significant academic and social impairment, those most in need of services, received no school intervention whatsoever.
To date, academic research has focused mostly on studying the effects and correlation of stress with galvanic skin response (skin conductivity) and heart rate. Example papers describing this aspect include:
It is a logical conclusion that modern technology, especially sensor technology, could be used to build better solutions; however, there have been limited inventions in this field. Currently, there is only one type of device used for alleviating the effects of stress and anxiety, which applies random patterns of vibrations to counter external stressors, but without any personalization to the patient's specific condition and mental state. There are also several wearable devices aimed at emotion tracking, such as the emotional sensors developed by mPath™ of Broomfield, CO, the Upmood watch developed by Upmood Design Lab™ of San Francisco, CA, and the Feel Emotion Sensor developed by Sentio Solutions™ of San Francisco, CA. The amount of data collected and the types of sensors used differ, and so do the use cases.
Described herein are techniques that improve upon the prior techniques and devices for stress detection, emotion recognition and emotion management.
Concepts discussed herein relate to a wearable device or apparatus for monitoring biometric data of a user and providing biofeedback indications in response to the biometric data received, in order to serve as an early warning system for the user and/or to guide the user through coping with the identified stress or emotions. In one particular embodiment, the wearable device includes at least some of: a set of sensors (e.g., used for measuring and calculating parameters including heart rate (also called pulse rate), heart rate variability, blood oxygenation, galvanic skin response (GSR), skin temperature, blood pressure, position and movement), a button, at least one digital signal processor having a memory unit and coupled to at least one of the sensors, and a feedback mechanism. The feedback mechanism can include a vibration actuator and/or a set of light emitting diodes (LEDs). Further, the memory stores computer-executable instructions for controlling the at least one processor to cause the sensors to collect data continuously (or in response to an activation signal from the button) and to process the collected data. The concepts further include the algorithm used for detecting the emergence of an emotional event and for launching a warning and/or an intervention process. Furthermore, the device is configured to provide user feedback, through the feedback mechanism, with reference to the collected data. In one embodiment, the feedback mechanism comprises a vibration actuator as a biofeedback indicator, and the feedback is provided to the user as haptic vibration. In another embodiment, the feedback mechanism comprises a set of LEDs, and the feedback is provided to the user as patterns of light.
These and other embodiments of the invention are more fully described in association with the drawings below.
Various embodiments and aspects of the inventions will be described with reference to details discussed below, and the accompanying drawings will illustrate the various embodiments. The following description and drawings are illustrative of the invention and are not to be construed as limiting the invention. Numerous specific details are described to provide a thorough understanding of various embodiments of the present invention. However, in certain instances, well-known or conventional details are not described in order to provide a concise discussion of embodiments of the present inventions. Reference in the specification to “one embodiment” or “an embodiment” or “another embodiment” means that a particular feature, structure, or characteristic described in conjunction with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification do not necessarily all refer to the same embodiment.
In one embodiment, the process begins with the user in a calm emotional state, and the user wearing the device 10 around his/her wrist (step 31). The device 10 constantly monitors (step 32) a plurality of biomarkers and, based on them, evaluates (step 33) the user's emotional score and compares it with a baseline-derived threshold for intervention. The mental stress and intellectual challenge from the current activity gradually affect the user and increase his/her overall stress to levels where, without an intervention, an emotional event (step 34) would manifest, in which the user could lose partial or complete control of their behavior and perform potentially harmful and/or violent actions. The wearable device 10 detects this emotional escalation and launches an alert signal (step 35) through the haptic vibration and/or light pattern feedback mechanism of the device 10, alerting the user about his/her emotional state and starting the intervention process (step 36). The device 10 continues the biofeedback based intervention until the user returns to a calm emotional state (step 31), and the process continues in a similar manner as described above until the session is terminated by the user.
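The session loop of steps 31-36 can be summarized in a short sketch. This is a minimal illustrative Python sketch under stated assumptions, not the device firmware: the callback names (read_biomarkers, emotional_score, alert, intervene, session_active) and the single-threshold comparison are hypothetical stand-ins for the embodiments described above.

```python
def monitoring_session(read_biomarkers, emotional_score, alert, intervene,
                       t0, session_active):
    """Run one monitoring session until the user terminates it (steps 31-36).

    All arguments are hypothetical callbacks standing in for device internals;
    t0 is the baseline-derived alert threshold.
    """
    while session_active():                          # step 31: device worn, user calm
        score = emotional_score(read_biomarkers())   # steps 32-33: monitor and score
        if score > t0:                               # escalation toward an event (step 34)
            alert()                                  # step 35: haptic/light alert signal
            while session_active() and score > t0:
                intervene()                          # step 36: biofeedback intervention
                score = emotional_score(read_biomarkers())
```

In this sketch the inner loop models the intervention continuing until the score returns below the threshold, i.e., until the user returns to the calm state of step 31.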
At step 440, the emotional score may be compared with the initial threshold T0. If the emotional score is below the initial threshold T0 (no branch of step 440), the control module 22 may determine whether or not the monitoring should continue (step 450). If the monitoring should continue (yes branch of step 450), the process returns to step 431. If the monitoring should not continue (no branch of step 450; e.g., a stop signal is received from the button 141), the process continues to step 460.
If the emotional score is above the alert threshold (yes branch of step 440), the device 10 determines whether an alert signal has already been transmitted in a specified period of time before the current time (step 441). If no alert signal has been issued (no branch of step 441), an alert signal is transmitted (step 442). If an alert signal was already transmitted in a specified period of time before the current time (yes branch of step 441), the device 10 compares the emotional score with one or more of the previously estimated intermediary thresholds T1-Tn (steps 443, 445), and transmits the corresponding feedback signal to the user (steps 444, 446).
A different number of intermediary thresholds T1-Tn can be configured depending on the characteristics of the intervention process. Biofeedback signals can be delivered through vibrations and/or light in different increments of time, duration, magnitude or patterns, as desired and as a function of the emotional score level calculated for the user. For example, a short, sharp and abrupt vibration is emitted at step 442 as an alert signal, and subsequently at step 444, a longer, smoother and gentler vibration is emitted to indicate that the emotional score is decreasing. Concurrently, the visual indicators 20 may display a pattern of light (i) with various attributes of the LED (e.g., intensity, color, ON/OFF, etc.) changing based on the emotional score and (ii) to prompt the user to perform a specific action in the intervention process 36.
The intervention process 36 can be tailored to the user based on the user's specific conditions and characteristics. For example, if the intervention is a breathing exercise, a pattern of light can be used to guide the breathing of the user. In other instances, if the intervention is a meditation routine, different vibrations can be used to guide the user through the meditation routine without the need for the user to look at the device 10. In some embodiments, the intervention can be dynamic, for example, gradually increasing or gradually decreasing in intensity as desired. In one embodiment, the intervention process that is directed by the device 10 can be selected by the user via a system (e.g., a phone) that is communicatively coupled to the device 10.
The process of monitoring the emotional score of the user and providing biofeedback based intervention continues at least until the emotional score drops below the alert threshold, T0. After that, if at step 450, the signal to stop monitoring is received from the button 141, the method proceeds to step 460, in which the control module 22 determines whether an emotional event has occurred during the session. If so (yes branch of step 460), the control module 22 may update the values of the alert threshold, T0, and intermediary thresholds, T1-Tn, based on information and data derived from the monitoring session. In one embodiment, machine learning can be used to update these thresholds, while in an alternative embodiment, the update may be based on pre-calculated parameters. After the thresholds have been updated or in the case that no event was detected (no branch of step 460), the process may conclude (step 470).
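For the pre-calculated-parameter variant of the step-460 update, the thresholds might be recomputed from session statistics as in the following sketch. The mean-plus-k-standard-deviations rule and the evenly spaced intermediaries are illustrative assumptions, not the claimed algorithm.

```python
import statistics

def update_thresholds(session_scores, n_intermediary=3, k0=1.0, step=0.5):
    """Recompute the alert threshold T0 and intermediaries T1..Tn from one
    session's emotional-score trace (pre-calculated-parameter variant; the
    constants here are assumptions for illustration)."""
    mu = statistics.fmean(session_scores)
    sigma = statistics.pstdev(session_scores)
    t0 = mu + k0 * sigma                      # e.g., one std dev above the mean
    intermediaries = [t0 + (i + 1) * step * sigma for i in range(n_intermediary)]
    return t0, intermediaries
```

A machine-learning variant, as described above, would instead fit these values from labeled session data.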
At step 2440, the current emotional score, ES, may be compared with the baseline emotional score, B. If the emotional score, ES, is below the baseline, B (no branch of step 2440), the control module 22 may determine whether or not the monitoring should continue (step 2450). If the monitoring should continue (yes branch of step 2450), the process returns to step 2431 where additional biomarkers are collected. If the monitoring should not continue (no branch of step 2450; e.g., a stop signal is received from the button 141), the process continues to step 2460.
If the emotional score is above the baseline, B (yes branch of step 2440), the control module 22 determines whether the meltdown likelihood, L, is greater than the nth emotional event threshold, Tn. If so (yes branch of step 2441), an alert signal is transmitted (step 2442). If not (no branch of step 2441), the control module 22 determines whether the meltdown likelihood, L, is greater than the n-1th emotional event threshold, Tn-1. If so (yes branch of step 2443), a specific feedback signal is transmitted (step 2444). If not (no branch of step 2443), the process continues in a similar manner for other thresholds. If the process reaches step 2445, the control module 22 determines whether the meltdown likelihood, L, is greater than the first threshold T1. If so (yes branch of step 2445), a specific feedback signal is transmitted (step 2446). If not (no branch of step 2445), the process returns to step 2431 where additional biomarkers are collected.
A different number of emotional event thresholds, T1-Tn, can be configured depending on the characteristics of the intervention process. Biofeedback signals can be delivered through vibrations and/or light in different increments of time, duration, magnitude or patterns, as desired and as a function of the emotional score level calculated for the user. For example, a short, sharp and abrupt vibration is emitted at step 2442 as an alert signal, and subsequently at step 2444, a longer, smoother and gentler vibration is emitted to indicate that the emotional score is decreasing. Concurrently, the visual indicators 20 may display a pattern of light (i) with various attributes of the LED (e.g., intensity, color, ON/OFF, etc.) changing based on the emotional score and (ii) to prompt the user to perform a specific action in the intervention process 36.
The process of monitoring the emotional score of the user and providing biofeedback based intervention continues at least until the emotional score drops below the baseline, B. After that, if at step 2450, the signal to stop monitoring is received from the button 141, the method proceeds to step 2460, in which the control module 22 determines whether an emotional event has occurred during the session. If so (yes branch of step 2460), once the monitoring has been completed, the AI algorithm analyzes the data collected and updates the baseline, B, and emotional event thresholds, T1-Tn, as needed, based on a multidimensional analysis of the recorded biological signals. In one embodiment, machine learning can be used to update these thresholds, while in an alternative embodiment, the update may be based on pre-calculated parameters. After the thresholds have been updated or in the case that no event was detected (no branch of step 2460), the process may conclude (step 2470).
The device 10 was worn by the user in order to determine the user's emotional score. More specifically, the emotional score 51 was calculated based on the biomarkers collected by the plurality of sensors of the device 10, using a process of statistical analysis adapted from a previously collected set of data. From the depiction, it can be observed that the peaks 52 in the agitation level 50, which were at or above line 502 (i.e., one standard deviation above the mean), were preceded by peaks of the emotional score 53, indicating that the emotional score 51 could be a valuable predictor for moments of high agitation in the user, and suggesting the usefulness in the monitoring of the emotional score 51 to trigger timely intervention processes to ward off potential flare ups in the user's emotions.
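The precedence relationship described above, with emotional-score peaks preceding agitation peaks, can be checked with a simple sketch. The one-standard-deviation-above-the-mean lines follow the depiction; the lead window and helper names are illustrative assumptions.

```python
import statistics

def peaks_above(series, line):
    """Indices where the series is at or above the given line."""
    return [i for i, v in enumerate(series) if v >= line]

def score_precedes_agitation(score, agitation, lead=5):
    """Check whether every agitation peak at or above its mean + 1 std dev
    line (cf. line 502) is preceded, within `lead` samples, by an
    emotional-score peak above the score's own mean + 1 std dev line.
    Illustrative sketch only."""
    ag_line = statistics.fmean(agitation) + statistics.pstdev(agitation)
    sc_line = statistics.fmean(score) + statistics.pstdev(score)
    sc_peaks = peaks_above(score, sc_line)
    return all(any(i - lead <= p < i for p in sc_peaks)
               for i in peaks_above(agitation, ag_line))
```

A True result on recorded traces would correspond to the predictive behavior observed in the depiction.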
The emotional score 610 of the first individual (diagnosed with ADHD) exhibited numerous peaks 615 above line 612 (i.e., one standard deviation above the mean), which was selected as the alert threshold, T0. In accordance with the algorithm depicted in
The emotional score 620 of the second individual (not diagnosed with any mental health conditions) presented much lower amplitude peaks 625, all of them below line 622 (i.e., one standard deviation above the mean). In order to evaluate the effect of the device 10, the alert threshold, T0, was manually modified to be the mean of the emotional score during the calibration period. Each time the emotional score exceeded the alert threshold, T0, an alert signal was transmitted and the intervention process was launched, similarly to the first individual. It can be observed for the second individual that the interventions also caused a decrease in the emotional score, but the decrease occurred over a longer time period.
The hardware 906 may include various devices/components, including an embedded processor 924 (i.e., a microcontroller) that has a floating point unit (FPU) to help with digital signal processing tasks and is low-power to enable longer battery life. The hardware 906 may also include a wireless communication system 932 to communicatively couple the wearable device to an external device via a wireless protocol, such as Bluetooth, Thread, ZigBee and WiFi. The hardware 906 may also include a GSR sensor 936 that is implemented using a low voltage technique utilizing precision operational amplifiers. This method avoids needing to boost the skin electrode voltage to high levels. The hardware 906 may also include an embedded flash memory (not depicted) that allows the wearable device to store large amounts of data without being paired to a smartphone or PC and also provides local storage for the ML model data, enabling AI/ML data processing to be performed on the device itself, independently of an external device such as a smartphone or PC. The hardware 906 may also include a battery management system 926 for managing the usage of the battery 802, a 3-axis accelerometer for sensing movement of the individual wearing the device, a multi-wavelength PPG sensor 930, and a skin temperature sensor 934 for measuring the skin temperature of the individual.
Photoplethysmography (PPG) is a technique employing one or more optical sensors to make measurements at the surface of the skin in order to detect volumetric changes in blood circulation. More specifically, PPG uses low-intensity infrared (IR), red and green light. Since light is more strongly absorbed by blood than by the surrounding tissues, changes in blood flow can be detected by PPG sensors as changes in the intensity of light. A voltage signal from a PPG sensor is thus proportional to the volume of blood flowing through the blood vessels. Volumetric changes in blood flow are associated with cardiac activity, hence changes in a PPG signal can be indicative of changes in cardiac activity. The PPG signal itself is typified by a waveform that includes an alternating current (AC) component superimposed on a direct current (DC) component. The AC component corresponds to variations in blood volume in synchronization with a heartbeat, while the DC component is determined by tissue structures surrounding the blood vessels where measurements are taken, among other factors, and may vary with respiration. By analyzing the PPG signal, various physiological biomarkers may be extracted, including blood oxygen saturation, blood pressure, heart rate, heart rate variability, and other cardiovascular parameters. The user's emotional score is derived as a function of the various measured parameters obtained through analysis of the PPG signal and, optionally, additional parameters that may be measured using other sensors. In one embodiment, the radial basis function kernel is used to compute the emotional score from the various measured parameters (also called extracted features).
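A radial basis function (RBF) kernel score over the extracted features might be computed as in the following sketch. The reference (support) vectors, weights, bias and gamma are stand-ins for a trained model and are assumptions for illustration only.

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def emotional_score(features, support_vectors, weights, bias=0.0, gamma=0.5):
    """Score the current feature vector (e.g., heart rate, HRV, SpO2, GSR,
    skin temperature) against reference vectors via an RBF expansion.
    The support vectors and weights stand in for a trained model."""
    return bias + sum(w * rbf_kernel(features, sv, gamma)
                      for w, sv in zip(weights, support_vectors))
```

In a deployed model the support vectors and weights would come from training (e.g., an SVM fit on labeled sessions), rather than being hand-supplied as here.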
At step 1114, the time-domain features 1110 may be compared to the frequency-domain features 1112 to determine which samples of the signals to discard and which samples of the signals to further analyze. This “validation” step is further described below in
The process depicted in
At step 1302, the microcontroller 924 may perform a wavelet decomposition of the PPG signals. The wavelet decomposition may utilize a tailor-made wavelet that closely approximates the heart-beat waveform (i.e., a waveform in the time domain that has the systolic and diastolic peaks), and results in very good accuracy in the extraction of features. At step 1304, the PPG signals may be processed in an operation known as “de-trending,” in which the detail coefficients on the ultra-low frequency band of the PPG signal are zeroed, thereby eliminating the drift and temperature variations of the PPG signal. At step 1306, the microcontroller 924 may extract the low frequency (i.e., 0.04-0.15 Hz) and high frequency (i.e., 0.16-0.4 Hz) sympathetic and parasympathetic autonomous neural system (ANS) signal power from the PPG signal. At step 1308, the microcontroller 924 may extract a heart-beat signal from the 0.5-3 Hz decomposition bin of the wavelet decomposition. Finally, at step 1310, the microcontroller 924 may recompose the decomposed signals to form a much cleaner signal (e.g., without the unwanted components). More specifically, the recomposition may use the approximation and detail coefficients (e.g., detrended and denoised) that are split using filter banks.
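The de-trending of step 1304 and the band extraction of step 1306 can be approximated in a short sketch. Here, de-trending is approximated by subtracting a moving average (standing in for zeroing the ultra-low-frequency wavelet coefficients), and band power is computed with a naive DFT rather than a wavelet filter bank; both substitutions are illustrative assumptions, not the claimed wavelet method.

```python
import math

def detrend(signal, window=50):
    """Approximate step 1304: remove ultra-low-frequency drift by
    subtracting a moving average (a stand-in for zeroing the coarsest
    wavelet coefficients)."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        out.append(signal[i] - sum(signal[lo:hi]) / (hi - lo))
    return out

def band_power(signal, fs, f_lo, f_hi):
    """Power in the [f_lo, f_hi] Hz band via a naive DFT; step 1306
    extracts the LF 0.04-0.15 Hz and HF 0.16-0.4 Hz ANS bands in this
    spirit (the wavelet filter bank is replaced here for illustration)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power
```

For example, a pure 1 Hz sinusoid sampled at 10 Hz yields its power entirely inside the 0.5-3 Hz heart-beat band and essentially none in the LF/HF ANS bands.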
At step 1502, a support vector machine (SVM) kernel is used to predict stressful events. At step 1504, haptic feedback is started in response to the prediction of a stressful event. At step 1506, an envelope detector process is started, which indicates the beginning of a stressful event. More specifically, the envelope detector process begins when the output of a radial basis function kernel exceeds a threshold value. At step 1508, the envelope of the stress event is closed, which indicates the conclusion of the stressful event. More specifically, in step 1508, the output of the radial basis function kernel falls below the threshold value.
At step 1510, the microcontroller 924 determines whether the user transmitted a false positive signal (e.g., using the physical button 808 on the device). If so (yes branch of step 1510), the microcontroller 924 marks the features extracted within the envelope as a false positive, appends all the extracted features (e.g., heartbeat, GSR, etc.) within the envelope (i.e., within the window of time from when the stress event was detected to when it ended) to the ML model, and then recalculates the model parameters based on the appended data (step 1514). If not (no branch of step 1510), the microcontroller 924 marks the features outside of the envelope (i.e., outside the window of time from when the stress event was detected to when it ended) as normal, and then appends all the extracted features that were recorded before and after the stress event to the ML model and recalculates the model parameters based on the appended data (step 1512). After the completion of either step 1512 or step 1514, the microcontroller 924 determines whether the storage space storing the ML model is running low (step 1516). If not (no branch of step 1516), the process concludes. If so (yes branch of step 1516), the microcontroller 924 performs a model database deduplication and cleanup in order to free up memory. Further, at step 1518, the parameters of the ML model may be recalculated.
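The envelope detector of steps 1506-1508 and the labeling of steps 1510-1514 can be sketched as follows. The data shapes, label strings and function names are illustrative assumptions, not the firmware's actual interfaces.

```python
def detect_envelopes(kernel_outputs, threshold):
    """Steps 1506-1508: an envelope opens when the RBF kernel output
    exceeds the threshold and closes when it falls back below it.
    Returns (start, end) index pairs, end exclusive. Illustrative sketch."""
    envelopes, start = [], None
    for i, v in enumerate(kernel_outputs):
        if start is None and v > threshold:
            start = i                        # envelope opens (step 1506)
        elif start is not None and v <= threshold:
            envelopes.append((start, i))     # envelope closes (step 1508)
            start = None
    if start is not None:                    # session ended mid-event
        envelopes.append((start, len(kernel_outputs)))
    return envelopes

def label_features(n_samples, envelopes, false_positive):
    """Steps 1510-1514: if the user pressed the false-positive button,
    inside-envelope samples are labeled 'false_positive'; otherwise
    outside-envelope samples are labeled 'normal'. Labels are assumed."""
    inside = set()
    for s, e in envelopes:
        inside.update(range(s, e))
    if false_positive:
        return {i: "false_positive" for i in inside}
    return {i: "normal" for i in range(n_samples) if i not in inside}
```

The labeled samples would then be appended to the ML model's database before the parameters are recalculated.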
The user may be provided with an action plan in hardcopy form from the medical professional that describes each of the alert signals and associates each alert signal with an action to be performed by the user. Alternatively or in addition, the action plan may be provided electronically by the medical professional and may be accessed by the user via a mobile application (e.g., first logging into his/her account and then viewing the instructional material). If the user is not currently being treated by a medical professional, one or more action plans may be provided on the user's account, and the user can select one of the action plans based on his/her specific condition/conditions and emotional management goals.
When the user receives an alert signal, if he/she knows the associated action to perform by heart (yes branch of step 1704), the action may be performed by the user (step 1708). If the user does not know the associated action by heart (no branch of step 1704), he/she can consult the action plan (step 1706) to determine and perform the associated action (step 1708). At step 1710, the system determines whether the monitoring of the user should continue (e.g., determines whether the user is still wearing the device). If so (yes branch of step 1710), the monitoring is continued (step 1712); subsequently, a new alert signal is generated (step 1714) and the process continues from step 1702.
If the monitoring has concluded (no branch of step 1710), the AI algorithm instantiated on the microcontroller 924 may evaluate the efficiency or the effectiveness of the action plan (step 1716) by evaluating an impact of the performed action on the emotional score of the user. Specifically, the AI algorithm may compare the measured characteristics of the emotional event (e.g., length, intensity and evolution of the emotional event; amplitude and slope of the emotional score) with preset and/or forecasted models. Such information (including the efficiency or the effectiveness of the action plan, the measured characteristics of the emotional event, and the comparison of the emotional event to preset and/or forecasted models) can be transmitted to the medical professional, who can use it to further the diagnosis of the user/patient and evaluate the efficiency/effectiveness of the treatment plan (which may include the action plan as well as other treatments such as medication). The AI algorithm can also suggest changes to the action plan (step 1718) based on the effectiveness of the action plan, for example, changes to the action corresponding to an alert signal, changes to the intensity of an alert signal, changes to the duration of an alert signal, etc. These recommendations can either be immediately incorporated into the user's action plan or, alternatively, can be first transmitted to the medical professional for review prior to being incorporated into the user's action plan (step 1720). In addition, the medical professional can determine changes to the action plan in addition to those suggested by the AI algorithm and request that those changes be incorporated into the action plan. Subsequently, those requested changes from the medical professional can be incorporated into the action plan. Through frequent repetition of the actions, the user learns the recommended actions.
In time, it is possible for the user to learn to associate various of his/her own biological signals (e.g., faster heartbeat or respiration, increased temperature, minute perspiration, etc.) with the actions prescribed by the medical professional or the AI algorithm.
As is apparent from the foregoing discussion, aspects of the present invention involve the use of various computer systems and computer readable storage media having computer-readable instructions stored thereon.
Computer system 1800 includes a bus 1802 or other communication mechanism for communicating information, and a processor 1804 coupled with the bus 1802 for processing information. Computer system 1800 also includes a main memory 1806, such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 1802 for storing information and instructions to be executed by processor 1804. Main memory 1806 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1804. Computer system 1800 further includes a read only memory (ROM) 1808 or other static storage device coupled to the bus 1802 for storing static information and instructions for the processor 1804. A storage device 1810, for example a hard disk, flash memory-based storage medium, or other storage medium from which processor 1804 can read, is provided and coupled to the bus 1802 for storing information and instructions (e.g., operating systems, applications programs and the like).
Computer system 1800 may be coupled via the bus 1802 to a display 1812, such as a flat panel display, for displaying information to a computer user. An input device 1814, such as a keyboard including alphanumeric and other keys, may be coupled to the bus 1802 for communicating information and command selections to the processor 1804. Another type of user input device is cursor control device 1816, such as a mouse, a trackpad, or similar input device for communicating direction information and command selections to processor 1804 and for controlling cursor movement on the display 1812. Other user interface devices, such as microphones, speakers, etc. are not shown in detail but may be involved with the receipt of user input and/or presentation of output.
The processes referred to herein may be implemented by processor 1804 executing appropriate sequences of non-transitory computer-readable instructions (or non-transitory machine-readable instructions) contained in main memory 1806. Such instructions may be read into main memory 1806 from another computer-readable medium, such as storage device 1810, and execution of the sequences of instructions contained in the main memory 1806 causes the processor 1804 to perform the associated actions. In alternative embodiments, hard-wired circuitry or firmware-controlled processing units may be used in place of or in combination with processor 1804 and its associated computer software instructions to implement the invention. The computer-readable instructions may be rendered in any computer language.
In general, all of the above process descriptions are meant to encompass any series of logical steps performed in a sequence to accomplish a given purpose, which is the hallmark of any computer-executable application. Unless specifically stated otherwise, it should be appreciated that throughout the description of the present invention, use of terms such as “processing”, “computing”, “calculating”, “determining”, “displaying”, “receiving”, “transmitting” or the like, refer to the action and processes of an appropriately programmed computer system, such as computer system 1800 or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within its registers and memories into other data similarly represented as physical quantities within its memories or registers or other such information storage, transmission or display devices.
Computer system 1800 also includes a communication interface 1818 coupled to the bus 1802. Communication interface 1818 may provide a two-way data communication channel with a computer network, which provides connectivity to and among the various computer systems discussed above. For example, communication interface 1818 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN, which itself is communicatively coupled to the Internet through one or more Internet service provider networks. The precise details of such communication paths are not critical to the present invention. What is important is that computer system 1800 can send and receive messages and data through the communication interface 1818 and in that way communicate with hosts accessible via the Internet. It is noted that the components of system 1800 may be located in a single device or located in a plurality of physically and/or geographically distributed devices.
Thus, a wearable device and method for stress detection, emotion recognition and emotion management have been described. It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
This application is a National Stage under 35 USC 371 of and claims priority to International Application No. PCT/IB2021/062448, filed 30 Dec. 2021, which claims the priority benefit of U.S. Provisional Application No. 63/131,933, filed 30 Dec. 2020.