Atopic dermatitis is a chronic relapsing and remitting skin disease that affects approximately 10% of adults and 12% of children in the United States. It is characterized by red, excoriated lesions on the skin with pruritus (itch). Individuals experiencing pruritus typically scratch the affected skin, which exacerbates the inflammation causing the pruritus and perpetuates an itch-scratch cycle. For many individuals with atopic dermatitis, pruritus peaks in the nighttime, resulting in sleep disturbance.
Assessments of a disease associated with pruritus, such as atopic dermatitis, are traditionally subjective, episodic, and provide poor measurements of the impact of atopic dermatitis. For example, one traditional tool is a clinical outcome assessment (COA) that involves a clinician assessing total body surface area of a lesion and lesion severity. COAs are subjective in that their assessments vary across different clinicians and are episodic in nature, as they can only be performed when an individual is seen by a clinician. Another traditional tool is a patient reported outcome (PRO), which is a qualitative and subjective report from the patient as to the severity of the pruritus. PROs may lack accuracy due to lack of compliance, recall bias, and diary fatigue.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used in isolation as an aid in determining the scope of the claimed subject matter.
Embodiments of the present disclosure enable improved computer decision support tools for detecting scratch and, in some aspects, predicting flare events in the future. As used herein, the term “flare event” may refer to an acute or particularly severe phase of pruritus. Embodiments may include utilizing data acquired by a sensor device, which may be a wearable device, to automatically detect scratch events. In this way, scratch events may be detected based on a continuous stream of data input into one or more machine learning classifiers to provide an objective assessment of scratching. The scratch behavior detected, in accordance with some embodiments herein, is nighttime scratching or scratching during a period in which the user is intending to sleep. This detection helps track scratching during peak pruritus time or even when a user is unaware of the scratching. As such, scratch events detected, in accordance with embodiments of this disclosure, may provide more accurate measures of the user's current pruritus and atopic dermatitis. Further, embodiments may utilize patterns of detected scratching to predict a likely itch level in a future interval, which may indicate a future flare event. The detected scratch events and, in some embodiments, predicted future itch level and/or flare event, may be utilized in computerized decision support tools to more accurately and timely track atopic dermatitis symptoms and initiate intervening and/or therapeutic treatments to alleviate or prevent symptoms.
Scratch may be detected utilizing accelerometer data acquired by a sensor that is worn by a monitored individual (which may also be referred to herein as a patient or a user). Using the sensor data, a hand motion event may be detected, and it may be determined whether that hand motion event is a likely scratch event. In some aspects, prior to detecting hand motion events, context is determined to limit the potential sensor data utilized for detecting hand motion events. In some aspects, the context includes detecting whether the sensor is configured for proper data acquisition, such as detecting that the sensor is being worn by the user, which is more likely to result in accurate detection of hand motion events and, in turn, scratch events. Additionally, a user sleep opportunity may be detected to determine a period of time during which the user intends to sleep, and hand motion events and scratch events may be detected using sensor data acquired during this user sleep opportunity. In this way, scratches occurring at nighttime (when pruritus peaks) and/or while a user is sleeping and less likely to be aware of the scratching may be detected.
A detected likely scratch event may be recorded, and an action may be initiated based on one or more detected scratch events. For instance, an alert or a notification may be issued to a user to notify that user of the detected scratch event(s). Additionally, data related to the detected scratch event may be processed for computer-implemented decision making. For example, scratch event data may be aggregated to identify a total number of detected scratch events over a period of time, such as a 24-hour time period. In some embodiments, a total scratch duration may also be determined by adding the durations of all detected scratch events within the defined period of time. The total scratch events and/or total scratch duration may be utilized to initiate recommendations to seek medical treatment or consultation with a clinician, or to issue a notification to a user device associated with a clinician of the monitored individual. Additionally, or alternatively, the total scratch events and/or total scratch duration may be added to a user's electronic calendar for the period of time during which the scratch data was detected. Additionally, embodiments may determine total scratch events and/or total scratch duration for multiple periods of time to identify scratching severity over time and/or changes in scratching behavior, either of which may be utilized to initiate an action.
Detection of a scratch event may be achieved by applying one or more machine learning models to feature values extracted from sensor data for a detected hand motion event. In some aspects, the machine learning model is an ensemble of models, such as gradient boosting or a random forest classifier. Aspects of the present disclosure may, therefore, include training machine learning model(s) to detect whether a hand motion is a scratch event or not.
Some embodiments of the present disclosure may further utilize detected scratch events to predict a likelihood of a user having itch in a future time interval. Scratch patterns may be determined based on the detected scratch events over a period of time. In some embodiments, the period of time may be 24 hours, but it is contemplated that other periods of time, such as 3 days or 5 days, may be utilized. Additional contextual information may be determined, such as the temperature and/or humidity levels at a location of the user for the time period during which the scratch events were detected. Additionally, the temperature and/or humidity level forecast for the future time interval may be determined. Based on the scratch pattern and contextual information, a likely itch level for the future time interval may be determined. Further, some embodiments may predict a likely flare event for the user by determining whether a predicted itch level is of sufficient severity to rise to a level of a flare event. Determining a likelihood of a future flare event may be determined by comparing the predicted itch level to one or more threshold itch levels.
Some embodiments may initiate an action based on the predicted itch level and, in some instances, a flare event, during a future time interval. Initiating an action may include generating an itch or flare notification to a patient or a clinician treating the monitored patient, adding the predicted itch level and/or flare event to an electronic calendar for the future time interval, and/or making one or more recommendations. A recommendation may be to start treatment, continue treatment, or modify treatment of the monitored patient. Additionally, a recommendation may be for the monitored patient to schedule a consultation with a clinician.
Further aspects of this disclosure include detecting whether the monitored user is asleep utilizing the sensor data. Similar to some embodiments of detecting scratch events, sensor data, acquired during times in which a configuration for proper data acquisition is detected and/or the user's sleep opportunity, may be utilized to determine whether the user is asleep or not. The sensor data may be utilized to determine activity index values for windows of time, and a combination of the activity index values, such as a weighted sum, may be compared to a sleep threshold to detect whether the user is asleep or not. Determinations of periods of time during a user's sleep opportunity when the user is awake versus asleep may be utilized to determine an overall sleep score, which provides a measure of the user's quality of sleep for a period of time, such as one night. In some aspects, the sleep score may further be determined based on detected scratch events as more scratch events during the user's sleep opportunity may indicate a lower quality of sleep.
Aspects of the disclosure are described in detail below with reference to the attached drawing figures, wherein:
The subject matter of the present disclosure is described herein with specificity with the help of different aspects to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. The claimed subject matter might be embodied in other ways, to include different steps or combinations of steps similar to the ones described in the present disclosure, in conjunction with other present or future technologies. Moreover, although the terms “step” and/or “block” may be used herein to connote different elements of methods employed, the terms should not be interpreted as implying any particular order among or between various steps disclosed herein, unless and except when the order of individual steps is explicitly stated. Each method described herein may comprise a computing process that may be performed using any combination of hardware, firmware, and/or software. For instance, various functions may be carried out by a processor executing instructions stored in memory. The methods may also be embodied as computer-useable instructions stored on computer storage media. The methods may be provided by a stand-alone application, a service or a hosted service (stand-alone or in combination with another hosted service), or a plug-in to another product, to name a few.
Aspects of the present disclosure relate to computerized decision support tools for predicting scratch and flare events. Affecting approximately 10% of adults and 12% of children in the United States, atopic dermatitis is a chronic relapsing and remitting skin disease that is characterized by red, excoriated lesions on the skin with pruritus (itch). Individuals experiencing pruritus typically scratch the affected skin, which exacerbates the inflammation causing the pruritus and perpetuates an itch-scratch cycle. For many individuals with atopic dermatitis, pruritus peaks in the nighttime, resulting in sleep disturbance. Not only does the physical act of scratching disrupt sleep, but scratching has also been found to trigger cognitive and behavioral changes that lead to and reinforce insomnia and sleep disruptions. Additionally, scratch-mediated epidermal damage may result in inflammatory responses that disrupt circadian rhythm.
Conventional assessments of a disease associated with pruritus, such as atopic dermatitis, are subjective, episodic, and provide poor measurements of the impact of atopic dermatitis. For example, one traditional tool is a clinical outcome assessment (COA) that involves a clinician assessing total body surface area of a lesion and lesion severity. COAs are subjective in that their assessments vary among different clinicians and are episodic in nature, as they can only be assessed when the individual is seen by a clinician. Another traditional tool is a patient reported outcome (PRO), which is a qualitative and subjective report from a patient about the severity of pruritus. Such PROs may include Patient Global Impression Severity (PGIS), Peak Pruritus Numerical Rating Scale (ppNRS), Severity of Pruritus Scale (SPS), Dermatology Life Quality Index (DLQI), Family or Children DLQI (FDLQI/CDLQI), Medical Outcome Study (MOS) Sleep Scale, Patient Oriented Eczema Measure (POEM), PROMIS Pain Interference, and PROMIS-Anxiety. PROs may lack accuracy due to lack of compliance, recall bias, and diary fatigue.
Attempts to provide an objective assessment have been made by utilizing recurrent neural networks to detect scratching from sensor data. However, these current tools require two sensors (one on each wrist of a user or patient), thus increasing the burden on the patient, the likelihood of user noncompliance, and the risk of inaccurate results due to challenges associated with aligning time between the two sensors and the possibility that one of the sensors is not properly configured. Further, current machine learning attempts to detect scratch do not focus on detecting scratching during sleep opportunities. As explained above, pruritus peaks at nighttime and can disrupt sleep, and, therefore, conventional solutions that do not detect scratch events within the context of sleep opportunity fail to provide an accurate assessment of the current state of pruritus. Further, conventional tools do not predict future itch or flare events and, therefore, have a limited ability to enable preventative therapeutic measures.
To improve accuracy and reliability, embodiments of the present disclosure result in improved computer decision support tools by detecting scratch and, in some aspects, predicting flare events that are likely to occur in the future from continuous sensor data, unobtrusively acquired by a sensor device worn by a user. As such, the information utilized to detect scratching is not episodic in nature. Additionally, some embodiments of the sensor device, such as a wrist-worn device, are less invasive than conventional techniques requiring the user to sleep in a controlled, monitored environment, which results in a greater likelihood of user compliance and makes these embodiments particularly well adapted for use by populations that are traditionally not very compliant, such as children. In some aspects, only one sensor device is worn by a monitored user (interchangeably referred to herein as a patient) to further reduce potential user burden. Additionally, feature values extracted from the sensor data may be utilized to detect scratching using one or more machine learning classifiers, thereby removing subjectivity. Embodiments may detect scratching from sensor data obtained during nighttime or during a user sleep opportunity, facilitating tracking of scratch during peak pruritus time or when a user is unaware of the scratching. Further, a likelihood of the user experiencing an itch level or a flare event in the future may be predicted from patterns of detected scratch events. The detected scratch events and/or predicted future itch level and/or flare event may be fed into computerized decision support tools to accurately and timely track atopic dermatitis symptoms and initiate intervening and/or therapeutic treatments to alleviate or prevent worsening symptoms.
At a high level, a sensor device worn by a user may acquire sensor data to detect scratch. In exemplary aspects, the sensor data is accelerometer data captured by a wearable sensor located on or around the user's wrist. From the sensor data, a two-tier approach may be utilized to detect scratch. In some embodiments, a hand movement event may be detected, and sensor data detected within the hand movement event may then be classified as a scratch event.
In some aspects, prior to detecting hand movement, context is determined to narrow the scope of the sensor data for hand movement analysis. In some aspects, the context includes detecting whether the sensor device is configured for proper data acquisition, which is more likely to result in accurate detection of hand movement and scratch events. For instance, detecting whether the sensor device is configured for proper data acquisition may include determining that a wearable sensor device, such as a wrist-worn device, is being worn by a user or not. In some implementations, this step includes determining not only whether the sensor device is worn but whether the manner in which the device is worn facilitates capturing the intended data. As described herein, the determination that the sensor device is properly configured for data acquisition may include utilizing sensed temperature information (e.g., a user's near-body temperature) and comparing the sensed temperature information to a predetermined threshold to determine whether the device is being worn or not. In other implementations, this determination is made by applying a set of heuristic rules to statistical features of motion data, such as standard deviations and/or ranges of x, y, and z variables in accelerometer data. In some embodiments, a combination of variables, such as temperature and motion data, may be utilized to detect that the device is not worn.
Additionally, in some aspects, the scope of the data utilized for hand movement detection may further be narrowed to data captured within a sleep opportunity or an interval in which the user intends to sleep. As such, embodiments of this disclosure may determine a sleep opportunity. The sleep opportunity may be identified by comparing changes in arm angles, as derived from motion sensed data, to a sleep opportunity threshold to detect candidate sleep opportunity periods. In some embodiments, a longest group of candidate sleep opportunity periods (which may exclude periods of non-wear) within a relevant time frame, such as a 24-hour period, may be selected as the sleep opportunity.
After determining a sleep opportunity, motion data captured during the determined sleep opportunity may be utilized for detecting hand movement and scratch events. In this way, embodiments may determine scratching at nighttime (when pruritus peaks) and/or when a user is sleeping and less likely to be aware of the scratching.
In some embodiments, detecting hand motion includes segmenting the sensor data within the user sleep opportunity into windows of time and applying a heuristic algorithm to each window to determine the presence of hand movement within each window. In some embodiments, the heuristic algorithm for hand motion detection includes computing a rolling coefficient of variation and determining whether that value satisfies a motion threshold.
Various embodiments of the disclosure may determine whether the hand movement corresponds to a scratch event. To detect a scratch event, feature values may be extracted from sensor data within the windows determined to represent hand movement. In exemplary aspects, the features are time domain features or frequency domain features. The extracted feature values may be run through a scratch detector that determines whether the detected hand motion was a scratch event or not. In exemplary aspects, the scratch detector comprises an ensemble of machine learning models, such as a random forest classifier. Aspects of the disclosure may include building the scratch detector, which may include feature selection and engineering, and training one or more machine learning models. In some aspects, the machine learning models are trained by utilizing a leave-one-subject-out (LOSO) validation process.
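For illustration only, the following Python sketch shows one way a LOSO validation loop around a random forest scratch classifier could be structured. The feature matrix, binary labels, subject identifiers, scikit-learn usage, and scoring metric are assumptions for the sake of the example, not the disclosure's implementation.

```python
# Minimal sketch: leave-one-subject-out (LOSO) validation of a random forest
# scratch classifier. X (features), y (scratch vs. non-scratch labels), and
# subject_ids are assumed to have been prepared by a feature extractor.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import f1_score

def loso_validate(X, y, subject_ids, n_estimators=200, random_state=0):
    """Return a per-subject F1 score from leave-one-subject-out validation."""
    logo = LeaveOneGroupOut()
    scores = {}
    for train_idx, test_idx in logo.split(X, y, groups=subject_ids):
        clf = RandomForestClassifier(n_estimators=n_estimators,
                                     random_state=random_state)
        clf.fit(X[train_idx], y[train_idx])
        held_out_subject = subject_ids[test_idx][0]
        scores[held_out_subject] = f1_score(y[test_idx], clf.predict(X[test_idx]))
    return scores
```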
In some aspects, a detected scratch event may be recorded, and an action may be initiated based on one or more detected scratch events. For instance, an alert or a notification may be issued to a user, via a user interface on a user device, to notify the user of the scratch event(s). Additionally, the detected scratch event data may be processed for computer-implemented decision making. In one embodiment, scratch endpoint data may be determined from detected scratch events. For example, a total number of detected scratch events over a period of time, such as a 24-hour period of time, and/or a total scratch duration within that period may be determined. The total scratch events and/or total scratch duration may be utilized to initiate recommendations to a monitored individual to seek medical treatment or consultation with a clinician. Additionally, or alternatively, total scratch events and/or total scratch duration may be utilized to issue a notification to a user device associated with a clinician of the monitored individual. The total scratch events and/or total scratch duration may be added to a tracking application or a service to present the scratch endpoints as associated with the period of time for which they were detected. A scratch score may further be computed based on the detected scratch events and/or scratch endpoints and may be presented to the monitored user or clinician. Additionally, embodiments may determine total scratch events and/or total scratch duration for multiple periods of time to identify scratching severity over time and/or changes in patterns, either of which may be utilized to initiate an action. Scratch endpoints disclosed herein represent novel digital endpoints that are useful in quantitatively and objectively measuring pruritus or, more specifically, atopic dermatitis. This new type of data may be created utilizing the disclosed technology for monitoring scratch, which may be done using one or more wearable devices for continuous monitoring. In this way, the disclosed method of gathering data for measuring scratch results in new scratch endpoint data that is more accurate and usable than that produced by conventional technologies for monitoring and treating a user because it provides a quantitative, accurate, and objective measure. As stated above, this method of obtaining the data used in creating the scratch endpoints is particularly useful in populations with typically lower compliance rates, such as children.
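As a purely illustrative sketch, scratch endpoints such as a total event count and a total scratch duration per 24-hour period could be aggregated from detected scratch events as shown below. The event record structure and field names are hypothetical and are not part of the disclosure.

```python
# Illustrative aggregation of detected scratch events into scratch endpoints
# (total event count and total scratch duration per 24-hour period).
from dataclasses import dataclass
from datetime import datetime, timedelta
from collections import defaultdict

@dataclass
class ScratchEvent:
    start: datetime  # hypothetical event record; field names are assumptions
    end: datetime

def scratch_endpoints(events, period_start, period_hours=24):
    """Return {period_index: {"count": ..., "duration_s": ...}} per period."""
    endpoints = defaultdict(lambda: {"count": 0, "duration_s": 0.0})
    for ev in events:
        # Assign each event to the period containing its start time.
        idx = int((ev.start - period_start) // timedelta(hours=period_hours))
        endpoints[idx]["count"] += 1
        endpoints[idx]["duration_s"] += (ev.end - ev.start).total_seconds()
    return dict(endpoints)
```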
Some embodiments of the disclosure may include detecting whether the monitored user is asleep and/or awake during the sleep opportunity. As such, similar to some embodiments of detecting scratch, sleep may be detected by utilizing sensor data acquired during times in which a sensor configuration for proper data acquisition is detected (e.g., when the sensor is worn) and within the determined sleep opportunity. Detecting sleep may include determining activity index values for windows of time based on motion sensed data (e.g., accelerometer data), and a combination of multiple activity index values, such as a weighted sum, may be compared to a sleep threshold to detect whether the user is asleep or highly likely to be asleep. Determination of periods in which the user is awake or asleep within the user's sleep opportunity may be utilized to determine an overall sleep score that provides one or more measures of the user's sleep for a period of time, such as one night. In some aspects, the sleep score may further be determined based on a number of detected scratch events, as more scratch events during the user's sleep opportunity may indicate a lower quality of sleep.
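The following is a minimal sketch of such a sleep/wake decision: an activity index is computed per window of motion data, a weighted sum over neighboring windows is formed, and the result is compared to a sleep threshold. The window length, weights, and threshold shown are illustrative assumptions, not values taken from the disclosure.

```python
# Sketch of sleep/wake detection from activity index values. The specific
# activity index definition, weights, and sleep threshold are assumptions.
import numpy as np

def activity_index(accel_xyz, fs=20, window_s=30):
    """Mean per-axis variance per window of (N, 3) accelerometer data."""
    n = int(fs * window_s)
    n_windows = len(accel_xyz) // n
    windows = accel_xyz[: n_windows * n].reshape(n_windows, n, 3)
    return windows.var(axis=1).mean(axis=1)

def detect_sleep(activity, weights=(0.2, 0.6, 0.2), sleep_threshold=0.01):
    """Label a window asleep (True) when the weighted sum of neighboring
    activity index values falls below the sleep threshold."""
    padded = np.pad(activity, 1, mode="edge")
    weighted = (weights[0] * padded[:-2]
                + weights[1] * padded[1:-1]
                + weights[2] * padded[2:])
    return weighted < sleep_threshold
```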
Further embodiments of the present disclosure utilize detected scratch events to predict a likelihood of the user having itch in a future time interval. Scratch patterns may be assembled based on historical scratch events over a period of time. Additional contextual information may be determined and utilized for this prediction, such as atmospheric temperature and/or humidity levels at a location of the user. This contextual information may include historical contextual information, which provides insight from which an itch or flare predictor may learn, as well as current or forecasted contextual information that may be input into that predictor. Based on the scratch pattern and contextual information, a likely itch level for the future time interval may be determined. Further, some embodiments may predict a likely flare event for the user by determining whether the predicted itch level is of sufficient severity to rise to the level of a flare event. Determining a likelihood of a future flare event may include comparing the predicted itch level with one or more threshold itch levels, which may be based on reference population or user-specific threshold(s) defined based on historical user information and/or user or clinician settings or preferences.
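As an illustrative sketch only, a predicted itch level could be compared against a flare threshold as follows. The trained `itch_model`, the chosen features, and the threshold value are hypothetical placeholders and are not the disclosure's predictor.

```python
# Sketch: combine a scratch pattern with forecasted contextual features,
# obtain a predicted itch level from a previously trained regressor, and
# compare it to a flare threshold. All names and values here are placeholders.
import numpy as np

def predict_flare(itch_model, scratch_counts_per_period, forecast_temp_c,
                  forecast_humidity_pct, flare_threshold=7.0):
    """Return (predicted_itch_level, flare_likely) for a future interval."""
    features = np.array([[sum(scratch_counts_per_period),       # total scratch events
                          np.mean(scratch_counts_per_period),   # mean events per period
                          forecast_temp_c,
                          forecast_humidity_pct]])
    predicted_itch = float(itch_model.predict(features)[0])
    return predicted_itch, predicted_itch >= flare_threshold
```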
Embodiments may initiate an action based on the predicted user itch level and, in some instances, a flare event, within a future time interval. Initiating an action may include generating an itch or flare notification to a user or a clinician who is treating or expected to treat the user.
In addition, or alternatively, initiating an action may include adding the detected itch level and/or flare event to an electronic calendar for a future time interval, thereby allowing a user to track predicted itch levels and future flare events. Further, an action may include making one or more recommendations. A recommendation may be to start treatment, continue treatment, and/or modify existing treatment. For instance, in operation, a user may receive a recommendation to purchase or refill a treatment to reduce or mitigate a predicted flare risk. Additionally, a recommendation may be for the user to schedule a consultation with a clinician.
Among others, a benefit of embodiments of the disclosure includes providing an assessment of pruritus (based on the resulting scratch) with greater accuracy and reliability (as compared to conventional solutions) based on continuous (or semi-continuous, periodic, as needed, or as-it-becomes-available) data acquired in a way that reduces burden on the user and increases user compliance. For instance, studies have shown that itch, as measured subjectively, does not have a high correlation with nighttime scratching, and itch has a lower correlation with severity of atopic dermatitis than objective scratch measures determined in accordance with embodiments herein. As such, embodiments may be used to more effectively treat and manage pruritus or atopic dermatitis compared to conventional subjective measures. Further, applying machine learning classifiers to the sensor data to detect scratch events removes bias and subjectivity, further improving accuracy and reliability. These classifiers help to provide reliable computer decision support tools that are based on detected scratch data, thereby improving recommendations for treatment and/or responses to scratching. Compared to other scratch detection approaches utilizing a recurrent neural network, some embodiments of this disclosure utilize gradient boosting or a random forest classifier and yield results that are more interpretable and, therefore, better capable of being modified or refined for particular contexts. These embodiments may further be performed faster and are less computationally burdensome on computing systems. Additionally, embodiments enable prediction of itch and, to some extent, flare events in the future to better help a monitored user make informed decisions about treatment and/or to help the user's clinician manage care of the condition by proactively treating the skin to reduce the risk of itch or a flare. Further advantages may result from embodiments determining a user's sleep opportunity and measuring scratching within the determined sleep opportunity. As previously stated, scratching may be particularly disruptive to a user's sleep and, as such, monitoring scratching during a sleep opportunity may more reliably lead to effective measures to improve a user's sleep.
As can be appreciated, embodiments of this disclosure may comprise a tracking application or service that tracks scratch events per night in an accurate manner with limited burden on the user. Such tracking, including alerts, notifications, and recommendations, may promote better treatment compliance on the user's part. Accurate and non-sporadic tracking over time may also enable a clinician to make informed decisions with respect to the monitored individual's treatment. In this way, embodiments of this disclosure may be desirable for both the monitored individual and treating clinician in the form of a tracking service. Also, utilizing the tracking service may be part of a clinician's prescription and/or treatment plan for an individual suffering from pruritus or who was prescribed a medication that lists pruritus as a known potential side effect. For example, a clinician may prescribe a cream to a patient suffering from pruritus with directions to apply the cream every other day and to utilize an embodiment of the disclosed tracking application or service. Based on the scratch event data acquired for the patient over the next few weeks, it may be determined that the scratching is not improving, and the clinician may decide to alter the prescribed course of treatment.
Turning now to
Among other components not shown, example operating environment 100 includes a number of user devices, such as user computer devices 102a, 102b, 102c through 102n and a clinician user device 108; one or more decision support applications, such as decision support applications 105a and 105b; an electronic health record (EHR) 104; one or more data sources, such as a data store 150; a server 106; one or more sensors, such as a sensor(s) 103; and a network 110. It should be understood that operating environment 100 shown in
It should be understood that any number of user devices, servers, decision support applications, data sources, and EHRs may be employed within operating environment 100 within the scope of the present disclosure. Each element may comprise a single device or component, or multiple devices or components cooperating in a distributed environment. For instance, server 106 may be provided via multiple devices arranged in a distributed environment that collectively provide the functionality described herein. Additionally, other components not shown herein may also be included within the distributed environment.
User devices 102a, 102b, 102c through 102n and clinician user device 108 can be client user devices on a client-side of operating environment 100, while server 106 can be on a server-side of operating environment 100. Server 106 can comprise server-side software designed to work in conjunction with client-side software on user devices 102a, 102b, 102c through 102n and 108 so as to implement any combination of the features and functionalities discussed in the present disclosure. This division of operating environment 100 is provided to illustrate one example of a suitable environment, and there is no requirement that any combination of server 106 and user devices 102a, 102b, 102c through 102n and 108 remain as separate entities.
User devices 102a, 102b, 102c through 102n and 108 may comprise any type of computing device capable of use by a user. For example, in one embodiment, user devices 102a, 102b, 102c through 102n and 108 may be the type of computing devices described in relation to
Some user devices, such as user devices 102a, 102b, 102c through 102n may be intended to be used by a user who is being monitored via one or more sensors, such as sensor(s) 103. In some embodiments, a user device may include an integrated sensor (similar to sensor 103) or operate in conjunction with external sensor 103. In other exemplary aspects, sensor 103 may be positioned on or near the monitored user's wrist. It is contemplated that sensor 103 may alternatively be positioned on or near an appendage (e.g., on or near the user's head, attached to the subject's clothing, worn around the subject's head, neck, leg, arm, ankle, finger, etc.). In other aspects, sensor 103 may be a skin-patch sensor adhered to the subject's skin; ingestible or sub-dermal sensor; sensor components integrated into the subject's living environment (including the bed, pillow, or bathroom); and sensors operable with or through a smartphone carried by the subject, for example. In one embodiment, a user device comprises a wearable wrist computing device with an integrated sensor, such as a smart watch, or a tablet that is communicatively coupled to a source of sensor data.
In exemplary embodiments, sensor 103, such as a gyroscopic or an accelerometer sensor, senses motion information. For example, sensor 103 may comprise a wearable accelerometer sensor, which may be implemented on a fitness tracker wristband device, a smartwatch, and/or a smart mobile device. Other types of sensors may also be integrated into or work in conjunction with user devices, such as sensors configured to detect ambient light (e.g., a photodetector); sensors configured to detect user location (e.g., an indoor positioning system (IPS) or a global positioning system (GPS)); sensors configured to detect atmospheric information (e.g., a thermometer, a hygrometer or a barometer); and physiological sensors (e.g., sensors detecting heart rate, blood pressure, core body temperature, near body temperature, or galvanic skin response (GSR)). Some embodiments include multiple sensors 103, such as three sensors, to obtain accelerometer data, ambient light data, and temperature (e.g., near-body temperature) data. Some embodiments of sensors 103 may include sensors measuring information to be used to monitor fine finger movement, such as electromyography (EMG) for measuring activation of muscles, acoustic surveillance, and/or vibration transducers. It is contemplated, however, that physiological information about the monitored individual, according to embodiments of the disclosure, may also be received from the monitored individual's historical data in EHR 104, or from human measurements or human observations.
Data may be acquired by sensor 103 continuously, periodically, as needed, or as it becomes available. Further, data acquired by sensor 103 may be associated with time and date information and may be represented as one or more time series of measured variables. In an embodiment, sensor 103 collects raw sensor information and performs signal processing, forming variable decision statistics, cumulative summing, trending, wavelet processing, thresholding, computational processing of decision statistics, logical processing of decision statistics, pre-processing, and/or signal conditioning. Alternatively, one or more of these functions may be performed by a user device, such as user device 102c or clinician user device 108, server 106, and/or decision support applications (apps) 105a or 105b.
Some user devices, such as clinician user device 108, may be intended to be used by a clinician who is treating or otherwise monitoring a user associated with sensor 103. Clinician user device 108 is communicatively coupled through network 110 to EHR 104. Operating environment 100 depicts an indirect communicative coupling between clinician user device 108 and EHR 104 through network 110. However, it is contemplated that an embodiment of clinician user device 108 may be communicatively coupled to EHR 104 directly. An embodiment of clinician user device 108 includes a user interface operated by a software application or a set of applications on clinician user device 108. In an embodiment, the application is a Web-based application or applet. In accordance with embodiments presented herein, a healthcare provider (clinician) application may facilitate accessing and receiving information from a clinician about a specific patient or a set of patients for which the scratch events, future itch levels, and/or sleep detection are determined. Embodiments of clinician user device 108 also facilitate accessing and receiving information from a clinician about a specific patient or population of patients including patient history; healthcare resource data; physiological variables (e.g., vital signs), measurements, time series, predictions (including plotting or displaying the determined outcome and/or issuing an alert) described herein; or other health-related information. The clinician user device 108 further facilitates display of results, recommendations, or orders, for example. In an embodiment, clinician user device 108 facilitates receiving orders for the patient based on the results of monitoring and predictions. Clinician user device 108 may also be used for providing diagnostic services or evaluation of the performance of the technology described herein in conjunction with various embodiments.
Embodiments of decision support applications 105a and 105b comprise a software application or a set of applications (which may include programs, routines, functions, or computer-performed services) residing on a client computing device, one or more servers in the cloud, distributed in the cloud environment, or on a client computing device such as a personal computer, a laptop, a smartphone, a tablet, a mobile computing device, or front-end terminals in communication with back-end computing systems. In an embodiment, decision support applications 105a and 105b include Web-based applications or a set of applications usable to manage user services provided by an embodiment of the invention. For example, in an embodiment, each of the decision support applications 105a and 105b facilitates processing, interpreting, accessing, storing, retrieving, and communicating information acquired from user devices 102a-n and 108, sensor 103, EHR 104, or data store 150, including predictions and evaluations determined by embodiments of the invention.
Accessing and/or utilizing information through decision support applications 105a and 105b or utilizing associated functionality may require a user, such as a patient or a clinician, to login with credentials. Further, decision support applications 105a and 105b may store and transmit data in accordance with privacy settings defined by a clinician, a patient, an associated healthcare facility or system, and/or applicable local and federal rules and regulations regarding protecting health information, such as Health Insurance Portability and Accountability Act (HIPAA) rules and regulations.
In an embodiment, decision support applications 105a and 105b can send a notification (such as an alarm or other indication) directly to clinician user device 108 or user devices 102a-n through network 110. Decision support applications 105a and 105b may also send maintenance indications to clinician user device 108 or user devices 102a-n. Further, an interface component may be used in decision support applications 105a and 105b to facilitate access by a user (including a clinician/caregiver or patient) to functions or information on sensor 103, such as operational settings or parameters, user identification, user data stored on sensor 103, and diagnostic services or firmware updates for sensor 103, for example.
Further, embodiments of decision support applications 105a and 105b may collect sensor data directly or indirectly from sensor 103 and utilize the sensor data to detect scratch events, predict future itch levels and flare events, and/or detect sleep, as described further with respect to
As mentioned above, operating environment 100 includes one or more EHRs 104, which may be associated with a monitored individual. EHR 104 may be directly or indirectly communicatively coupled to user devices 102a-n and 108, via network 110. In some embodiments, EHR 104 represents health information from different sources and may be embodied as distinct records systems, such as separate EHR systems for different clinician user devices (such as 108). As a result, the clinician user devices may be for clinicians of different provider networks or care facilities.
Embodiments of EHR 104 include one or more data stores of health records, which may be stored on data store 150, and may further include one or more computers or servers that facilitate storing and retrieving health records. In some embodiments, EHR 104 may be implemented as a cloud-based platform or may be distributed across multiple physical locations. EHR 104 may further include record systems that store real-time or near real-time patient (or user) information, such as information from wearable, bedside, or in-home patient monitors, for example.
Data store 150 represents one or more data sources and/or data systems, which are configured to make data available to any of the various components of operating environment 100, or system 200 described in connection with
Operating environment 100 can be utilized to implement one or more of the components of system 200 (described in
Referring now to
Example system 200 includes network 110, which is described in connection with
In one embodiment, the functions performed by components of system 200 are associated with one or more decision support applications, services, or routines (such as decision support applications 105a-b of
Continuing with
Data utilized in embodiments of the present disclosure may be received from a variety of sources and may be available in a variety of formats. For example, in some embodiments, user data received via data collection component 210 may be determined via one or more sensors (such as sensor 103 of
In some aspects, data collection component 210 may provide data collected in form of data streams or signals. A “signal” can be a feed or stream of data from a corresponding data source. For example, a user signal could be user data from a wearable device, a smartphone, a home-sensor device, a GPS device (e.g., for location coordinates), a vehicle-sensor device, a user device, a gyroscope sensor, an accelerometer sensor, a calendar service, an email account, a credit card account, or other data sources. In some embodiments, data collection component 210 receives or accesses data continuously, periodically, or on as needed basis. Data collection component 210 may obtain data at a predetermined sampling rate. In one example, data collection component 210 utilizes a sampling rate of 100 Hz for one or more data signals, such as accelerometer signal, ambient light signal, and a body temperature signal.
Sensor monitor 280 may be generally responsible for monitoring collected data for information that may be used for detecting scratch, predicting flare (including predicting itch), and/or detecting sleep, which may include identifying and/or tracking features (sometimes referred to herein as “variables”), such as motion or accelerometer data or other related contextual information. In an embodiment, sensor monitor 280 comprises one or more applications or services that analyze information detected via one or more sensors integrated into or communicatively coupled to user devices used by the user and/or cloud-based services associated with the user, to determine motion information and related contextual information. For instance, sensor monitor 280 may comprise a service of a decision support application, such as any of decision support applications 105a-b of
Additionally, sensor monitor 280 may determine current or near real-time information, such as motion information and, in some embodiments, may also determine historical motion information, which may be determined based on individual record 240. Further, in some embodiments, sensor monitor 280 may determine motion information, detected scratch data, predicted itch/flare events, and detected sleep/wake periods (which may include historical activity) from other similar users (i.e., crowdsourcing), as described previously.
In some embodiments, information determined by sensor monitor 280 may be provided to scratch detector 260, flare predictor 290, and sleep/wake detector 230, including motion information acquired from a sensor (such as sensor 103 in
Some embodiments of sensor monitor 280, or its subcomponents, may determine a device name or identification (device ID) for each device associated with a user. This information about the identified user device(s) associated with a user may be stored in a user profile associated with the user, such as in user account(s)/device(s) 248 of individual record 240. In an embodiment, the user devices may be polled, interrogated, or otherwise analyzed to determine information about the devices. This information may be used for determining a label or an identification of the device (e.g., a device ID) so that user interaction with the device may be recognized from user data by sensor monitor 280. In some embodiments, users may declare or register a device, such as by logging into an account via the device, installing an application on the device, connecting to an online service that interrogates the device, or otherwise providing information about the device to an application or a service. In some embodiments, devices that sign into an account associated with the user, such as an email account, social network, or the like, are identified and determined to be associated with the user.
Continuing with system 200 of
At a high level, an embodiment of scratch detector 260 may utilize sensor data of a monitored individual to detect the individual's hand movement and classify that hand movement as a scratch event or not. In some implementations, the sensor data considered for detecting hand movement is data acquired during a period in which the sensor 103 is properly worn. Further, an embodiment of scratch detector 260 detects nighttime scratch by detecting scratch events within a user's sleep opportunity, which is a period of time when the user intends to sleep.
As shown in
In example embodiments, sensor wear determiner 261 may automatically determine when the sensor 103 is being worn, utilizing data received from the sensor 103 or another sensor (not shown). For instance, sensor wear determiner 261 may automatically determine when sensor 103 capturing motion data is being worn utilizing motion data and/or physiological data (such as human body temperature, heart rate, blood pressure, pulse, or galvanic skin response) received from a sensor on the device acquiring the motion data. Alternatively, sensor wear determiner 261 may determine when a device is being worn based on a manual indication by the wearer. For instance, the wearer may enter an indication when the device is being worn and when it is taken off. In another instance, the wearer may enter times corresponding to these events.
As such, in one embodiment, sensor wear determiner 261 determines a period of non-wear configuration by comparing statistical measurements of motion data over windows of time to a non-wear threshold. For example, accelerometer data, which may comprise x, y, and z measurements, may be divided into windows of time, and statistical measurements may be computed and utilized with one or more heuristic rules to determine a wear configuration or a non-wear configuration. In an exemplary embodiment, the accelerometer data may be divided into multiple one-hour windows with a 15-minute overlap. A non-wear determination may be a vector of binary values representing wear/non-wear configuration for each window of the motion data. A window where a period of non-wear is not detected may be considered a period of wear.
In an example embodiment, sensor wear determiner 261 may determine whether sensor 103 is in a worn configuration or not during a window of time by comparing statistical features of motion data in the window to a predefined threshold value. For example, in an embodiment, sensor wear determiner 261 determines whether the standard deviation of any of the three axes (x-axis, y-axis, or z-axis) signals of accelerometer data within a window satisfies a non-wear motion threshold value, and if so, that window is determined to be non-wear. In an exemplary embodiment, the non-wear motion threshold is 0.001 g, and sensor wear determiner 261 determines that a window is non-wear if the standard deviation of values of any axis is less than the non-wear motion threshold.
In another example embodiment, sensor wear determiner 261 may determine whether the sensor 103 is in the worn configuration or not by comparing a temperature during a window (or interval) of time to a non-wear temperature threshold. In one exemplary embodiment, the non-wear temperature threshold is 25 degrees Celsius (i.e., 77 degrees Fahrenheit), and sensor wear determiner 261 determines that a window is non-wear if the temperature during that window is less than the non-wear temperature threshold.
Further, in exemplary embodiments, sensor wear determiner 261 considers both motion data and temperature data to determine whether to classify a window of time as wear or non-wear. In one exemplary embodiment, sensor wear determiner 261 determines that a window is non-wear if the temperature is less than the non-wear temperature threshold (e.g., 25 degrees Celsius) or if the standard deviation of values of any axis of motion data within the window is less than the non-wear motion threshold (e.g., 0.001 g).
In some embodiments, multiple statistical features may be computed for motion data and compared to thresholds to determine whether the window is a period of non-wear or not. In one exemplary embodiment, if any two axes have a standard deviation that satisfies (i.e., is less than) a non-wear standard deviation motion threshold, the period is determined as a non-wear window, or if any two axes have a range that satisfies (i.e., is less than) a non-wear range motion threshold, the period is detected as a non-wear window. In an example, a non-wear standard deviation motion threshold is approximately 0.013 g, and an example non-wear range motion threshold is 0.15 g.
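For illustration, the wear/non-wear heuristics described above could be combined as in the following sketch, which uses the example thresholds cited in the text (25 degrees Celsius, 0.001 g single-axis standard deviation, and the two-axis 0.013 g / 0.15 g variants). The data layout and the decision to merge the single-axis and two-axis rules into one function are assumptions.

```python
# Sketch of per-window wear/non-wear classification from accelerometer
# statistics and near-body temperature, using the example thresholds above.
import numpy as np

def is_non_wear(accel_window, temp_c=None,
                temp_threshold=25.0,            # degrees Celsius
                std_threshold=0.001,            # g, single-axis rule
                two_axis_std_threshold=0.013,   # g, two-axis rule
                two_axis_range_threshold=0.15): # g, two-axis rule
    """Return True if a window of (N, 3) accelerometer data looks like non-wear."""
    if temp_c is not None and temp_c < temp_threshold:
        return True
    stds = accel_window.std(axis=0)
    ranges = accel_window.max(axis=0) - accel_window.min(axis=0)
    if (stds < std_threshold).any():
        return True
    # Two-axis variants: flag non-wear if at least two axes are nearly static.
    if (stds < two_axis_std_threshold).sum() >= 2:
        return True
    if (ranges < two_axis_range_threshold).sum() >= 2:
        return True
    return False
```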
In further aspects, the above processes may provide an initial wear/non-wear determination, and sensor wear determiner 261 may apply heuristic rule(s) to rescore one or more windows. Rescoring may help identify times where interruptions in the data indicate that the device is not worn, but contextual information, such as the length of time of this interruption and the accelerometer data occurring before or after, may indicate otherwise (i.e., may indicate that the device is being worn).
In an example embodiment, the heuristic rules consider the lengths of time of the wear and non-wear blocks to determine whether to switch a wear/non-wear determination for any of the blocks of time. As used herein with respect to rescoring by sensor wear determiner 261, blocks of time may be successive windows with the same wear or non-wear classification. For instance, three successive one-hour windows, initially determined to be “non-wear”, form a three-hour block of non-wear. In an example embodiment, sensor wear determiner 261 applies the following for rescoring one or more windows:
Further details of an embodiment of sensor wear determiner 261 are described below in conjunction with
Additionally, prior to sensor wear determiner 261 determining a wear configuration, motion data may be preprocessed and filtered. For example, motion data may be first down-sampled, such as from 100 Hertz (Hz) to 20 Hz. Additionally, data may be segmented into relevant periods of time for which a scratch analysis is performed. For example, data may be separated into 24-hour segments (12:00 pm today to 12:00 pm the following day). Further, in some embodiments, any 24-hour period that does not have a minimum amount of recording time, such as 6 hours, may be discarded and not analyzed further by scratch detector 260.
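A minimal sketch of this preprocessing (down-sampling from 100 Hz to 20 Hz, segmenting into noon-to-noon 24-hour periods, and discarding segments with fewer than 6 hours of recording) is shown below; the pandas/scipy usage and the exact segmentation logic are illustrative assumptions.

```python
# Sketch: down-sample accelerometer data, split into noon-to-noon 24-hour
# segments, and drop segments with too little recorded data.
import pandas as pd
from scipy.signal import decimate

def preprocess(accel_df, in_fs=100, out_fs=20, min_hours=6):
    """accel_df: DataFrame indexed by timestamp with columns x, y, z at in_fs Hz."""
    factor = in_fs // out_fs                                   # 100 Hz -> 20 Hz (factor of 5)
    down = decimate(accel_df[["x", "y", "z"]].to_numpy(), factor, axis=0, zero_phase=True)
    downsampled = pd.DataFrame(down, columns=["x", "y", "z"],
                               index=accel_df.index[::factor][: len(down)])
    # Shift timestamps by 12 hours so each noon-to-noon period maps to one calendar date.
    segment_key = (downsampled.index - pd.Timedelta(hours=12)).date
    segments = []
    for _, seg in downsampled.groupby(segment_key):
        if len(seg) / out_fs / 3600 >= min_hours:              # keep segments with >= 6 h of data
            segments.append(seg)
    return segments
```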
As part of scratch detector 260, sleep opportunity determiner 262 may be generally responsible for determining a user's sleep opportunity. As used herein, sleep opportunity refers to an interval of time in which an individual intends to sleep, which may or may not be consistent with when the individual actually sleeps. As such, in some embodiments, the sleep opportunity is the time between when an individual lays down to rest and gets up from rest. A user's sleep opportunity within a predefined period may also be referred to as a total sleep opportunity (TSO). For instance, for a 24-hour period, individuals typically intend to go to sleep only once (e.g., at nighttime), and sleep opportunity determiner 262 may determine the total sleep opportunity to be the longest interval during that 24-hour period in which a user intends to rest.
The determination of a user's sleep opportunity may be utilized to focus sensor data within the context of nighttime or sleep scratching for further processing by scratch detector 260. For instance, scratch detector 260 may detect nighttime scratching by specifically detecting scratch events based on motion data captured during the period of time determined to represent the user's sleep opportunity by sleep opportunity determiner 262. The term “nighttime” is used herein to represent a typical period in which an individual takes the longest rest; however, it is contemplated that embodiments of this disclosure are not limited to detecting scratch at night. For instance, some individuals, such as individuals who work evenings or overnight, may take their longest rest or sleep during the day, and the sleep opportunity for such individuals may be a daytime interval.
Sleep opportunity determiner 262 may determine user's sleep opportunity for motion data captured over a predefined period, such as a 24-hour period. Example implementations of sleep opportunity determiner 262 may apply a heuristic approach based on a change in arm angle determined from motion data to determine candidate sleep opportunity periods. A largest consecutive group of candidate rest periods within the predefined period (e.g., 24-hour) may be selected as the user's sleep opportunity. In exemplary aspects, sleep opportunity determiner 262 may determine the sleep opportunity utilizing only motion data within the predefined period in which sensor wear is detected by sensor wear determiner 261, while non-wear periods are excluded by sleep opportunity determiner 262 when identifying the largest group of candidate rest periods.
In some aspects, an arm angle is computed from accelerometer signals (x-axis, y-axis, and z-axis measurements), and an absolute difference between successive arm angle values (i.e., a change in arm angle over time) may be compared to a rest threshold. In an example embodiment, a rolling median of raw signal values (x-axis, y-axis, and z-axis measurements) is computed over an interval (e.g., 5 seconds), and the rolling median of raw signal values are utilized to calculate arm angle in accordance with the following formula, where ax, ay, and az refer to accelerometer values along the x-axis, y-axis, and z-axis respectively and:
An average arm angle may be computed for an interval (e.g., consecutive 5 seconds), and the absolute difference between successive average arm angle values may be computed. A rolling median of the difference between successive average arm angle values may be computed for an interval (e.g., 5 minutes), and the rolling median of the difference between successive average arm angle values may be compared to a rest threshold. The rest threshold may be defined by arm angle values measured for the monitored individual. For example, in one embodiment, a candidate rest period is determined when the median difference between successive average arm angle values is less than or equal to the rest threshold, which may be defined as 0.15 multiplied by the 10th percentile value of all differences in arm angle values within the 24-hour period.
Sleep opportunity determiner 262 may determine the sleep opportunity based on the intervals identified as candidate rest periods. In an example, candidate periods with periods of detected non-wear are removed. The remaining candidate rest periods may be compared to a threshold length. In one implementation, the threshold length is 30 minutes, and candidate rest periods are kept if they are greater than 30 minutes. Additionally, candidate periods may be grouped together if the gaps between the periods satisfy a maximum length of time. For instance, candidate periods with a gap less than 15 minutes may be grouped together. In one example, sleep opportunity determiner 262 may determine the user's sleep opportunity to be the longest group of candidate periods within the 24-hour period. Further details of an embodiment of sleep opportunity determiner 262 are discussed further below in conjunction with
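The following sketch illustrates the candidate-rest-period logic described above. Because the disclosure's exact arm-angle expression is not reproduced here, the sketch assumes a commonly used formulation (the arctangent of the z-axis component over the magnitude of the x- and y-axis components, expressed in degrees); the 5-second averaging, 5-minute rolling median, and 0.15 × 10th-percentile rest threshold follow the example values in the text.

```python
# Sketch of candidate rest period detection from arm-angle changes.
# The arm-angle formula below is an assumed, commonly used formulation.
import numpy as np
import pandas as pd

def candidate_rest_periods(accel_df, fs=20):
    """accel_df: DataFrame with columns x, y, z at fs Hz. Returns a boolean
    Series marking 5-second intervals that qualify as candidate rest."""
    med = accel_df.rolling(5 * fs, min_periods=1).median()       # 5-second rolling median
    angle = np.degrees(np.arctan2(med["z"],
                                  np.sqrt(med["x"] ** 2 + med["y"] ** 2)))
    # Average arm angle per consecutive 5-second interval.
    interval = np.arange(len(angle)) // (5 * fs)
    avg_angle = angle.groupby(interval).mean()
    diff = avg_angle.diff().abs()
    rolling_med = diff.rolling(60, min_periods=1).median()        # ~5 minutes of 5-s intervals
    rest_threshold = 0.15 * np.nanpercentile(diff, 10)            # 0.15 x 10th percentile
    return rolling_med <= rest_threshold
```

Per the example values above, candidate intervals longer than 30 minutes could then be grouped across gaps shorter than 15 minutes, with the longest resulting group within the 24-hour period taken as the sleep opportunity.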
Reliably detecting sleep opportunity within which to measure scratch helps effectively determine how an individual's sleep and nighttime scratch vary on a day-to-day basis. Embodiments of this disclosure may utilize a sleep opportunity that captures difficulties falling asleep by not limiting the sleep opportunity to times when the user is actually asleep.
Other implementations of sleep opportunity determiner 262 may determine the sleep opportunity from other sensor data. For example, in one embodiment, sleep opportunity determiner 262 may determine the sleep opportunity utilizing light information from a photodetector, and the sleep opportunity may be determined as a period of time in which the amount of light remains below a threshold level for a minimum time period. Alternatively, physiological data, such as heart rate, core body temperature, near body temperature, blood pressure, and/or respiration rate, captured from the monitored individual may be utilized to determine the sleep opportunity. Further, in some aspects, sleep opportunity determiner 262 may determine the sleep opportunity from user-entered data. For example, a user may input times corresponding to when the user intends to go to sleep and wake up or times corresponding to when the user did go to sleep and wake up.
As previously stated, embodiments of scratch detector 260 utilize a two-tier approach to detect scratch events. In some embodiments, hand movements may be detected, and each detected hand movement may be classified as a scratch event or a non-scratch event. Hand movement detector 264 is generally responsible for detecting hand movement using motion sensor information. Example embodiments of hand movement detector 264 may receive (from sensor 103) motion sensor information, such as accelerometer data and/or gyroscopic data. In one embodiment, hand movement detector 264 may output an indication of hand motion for the received data.
In exemplary aspects, hand movement detector 264 may apply a heuristic algorithm to motion sensor data captured during a sleep opportunity, which may be determined by sleep opportunity determiner 262. The motion sensor data, such as accelerometer data, may be segmented into windows of pre-determined length, and motion sensor data for each window may be passed through a heuristic hand movement detection algorithm to determine the presence of hand movement. An example embodiment utilizes three-second non-overlapping windows within the sleep opportunity for a given 24-hour period. It is contemplated that other windows may be utilized, such as a one-second window or a two-second window for instance.
In exemplary aspects, the hand movement detection algorithm includes computing the vector magnitude of the motion sensor signal (e.g., √(x² + y² + z²)). A low-pass filter may be applied to the vector magnitude signal, in accordance with some embodiments. In an example embodiment, the low-pass filter has a 6 Hz cutoff. The hand movement detection algorithm may further include calculating a rolling coefficient of variation (CoV) and applying a threshold to the calculated CoV values. As used herein, CoV refers to a relative standard deviation, or a ratio of the standard deviation to the mean. Any values that satisfy the threshold (e.g., are greater than or equal to the threshold) may be determined to indicate a hand movement. In some embodiments, the threshold utilized is the 25th percentile of all calculated CoV values from training data. In an example embodiment, the CoV threshold is 0.023.
The rolling CoV may be computed over a 1-second rolling window within each non-overlapping 3-second window. For instance, for accelerometer data sampled at 20 Hz, or 20 samples per second, hand movement detector 264 may make 60 hand movement classifications for each non-overlapping 3-second window.
In an embodiment, hand movement may be detected for a given window if it is present for each second within that window utilizing the CoV threshold. For instance, hand movement detector 264 may detect hand movement for a three-second window if movement is detected for each of the three seconds within that window.
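A minimal Python sketch of this hand movement heuristic is shown below. The Butterworth filter design, the assumed 20 Hz sampling rate, and the simplification of evaluating one CoV value per non-overlapping second (rather than a per-sample rolling CoV) are assumptions made for illustration only.

import numpy as np
from scipy.signal import butter, filtfilt

def detect_hand_movement(acc_window, fs=20, cov_threshold=0.023):
    # Illustrative sketch: classify one 3-second accelerometer window as hand movement.
    # acc_window is an (N, 3) array of x/y/z samples; fs is assumed to be 20 Hz.
    vm = np.sqrt((np.asarray(acc_window, dtype=float) ** 2).sum(axis=1))
    # Low-pass filter with a 6 Hz cutoff (a Butterworth design is one possible choice).
    b, a = butter(4, 6.0 / (fs / 2.0), btype="low")
    vm_filt = filtfilt(b, a, vm)
    # Coefficient of variation (std / mean) for each 1-second segment of the window.
    segments = vm_filt[: (len(vm_filt) // fs) * fs].reshape(-1, fs)
    cov = segments.std(axis=1) / segments.mean(axis=1)
    # Movement is detected for the window only if every second satisfies the threshold.
    return bool(np.all(cov >= cov_threshold))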
Further details of an embodiment of hand movement detector are described in conjunction with
Once hand movement detector 264 identifies a hand movement event, the motion sensor information corresponding to the detected hand movement event may be considered as a potential scratch event. In some embodiments, determining whether the hand movement event is a scratch event may include analyzing features within motion sensor data. In one such embodiment, features extractor 266 may generally be responsible for extracting feature information that may be indicative of a scratch motion. Features may be extracted from motion sensor data corresponding to the hand movement detected by hand movement detector 264. In extracting features, feature values may be computed for each window (e.g., 3-second window) for which hand motion is detected.
Features may be extracted from one or more components of motion sensor data in the form of a motion signal. For example, in some embodiments, a vector magnitude, a first principal component, and a second principal component of the accelerometer signal are each utilized for feature extraction. Additionally, in some embodiments, a filter is applied to the motion sensor data prior to feature extraction. In one instance, a high-pass filter with a 0.25 Hz cutoff may be applied prior to feature extraction, which may help to remove drift and the contribution of gravity. Alternatively, in another instance, a band-pass filter may be applied.
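As one hedged illustration of this preprocessing, the sketch below derives the three signals described above from a single accelerometer window; the first-order Butterworth high-pass design and the scikit-learn PCA call are assumptions of this example, not the only possible implementation.

import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import PCA

def preprocess_for_features(acc_window, fs=20):
    # Illustrative sketch: derive the three signals used for feature extraction from an
    # (N, 3) tri-axial window sampled at an assumed 20 Hz.
    acc = np.asarray(acc_window, dtype=float)
    # First-order high-pass filter with a 0.25 Hz cutoff removes drift and gravity.
    b, a = butter(1, 0.25 / (fs / 2.0), btype="high")
    filt = filtfilt(b, a, acc, axis=0)
    # Vector magnitude reduces dependence on device orientation.
    vm = np.sqrt((filt ** 2).sum(axis=1))
    # First and second principal components of the filtered tri-axial signal.
    pcs = PCA(n_components=2).fit_transform(filt)
    return vm, pcs[:, 0], pcs[:, 1]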
In exemplary embodiments, the features fall within the time domain or frequency domain. Example embodiments of features extractor 266 may extract, or compute, one or more of the following features:
In example embodiments, each of the above 36 time and frequency domain features (where features of the vector magnitude, first principal component, and second principal component are counted as separate features) may be extracted during training of scratch event classifier(s) 268, and a subset of the features is selected to be extracted by features extractor 266 during runtime. For instance, one embodiment of features extractor 266 extracts the following 26 time and frequency domain features: RMS (vector magnitude); signal entropy (vector magnitude, first principal component, and second principal component); IQR of auto-covariance (vector magnitude, first principal component, and second principal component); skewness (first principal component and second principal component); dominant frequency value (first principal component); dominant frequency magnitude (first principal component and second principal component); mean cross rate (second principal component); jerk ratio (vector magnitude and second principal component); log dimensionless jerk (first principal component); SPARC (vector magnitude, first principal component, and second principal component); permutation entropy (vector magnitude, first principal component, and second principal component); spectral flatness (first principal component and second principal component); spectral entropy (second principal component); and signal range (vector magnitude). Alternative embodiments of features extractor 266 may extract values for different combinations of the above and/or other features. The particular features for extraction by features extractor 266 may be determined from feature selection and feature engineering. An example process for feature selection is described in connection with
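For illustration, a short sketch computing a handful of the listed features from a single processed signal follows; the exact feature definitions used in practice may differ, and the FFT-based spectral estimates and the small epsilon added for numerical stability are assumptions of this example.

import numpy as np
from scipy.stats import skew

def example_features(signal, fs=20):
    # Illustrative sketch: a few of the time- and frequency-domain features listed above.
    # signal is a 1-D processed signal (e.g., vector magnitude or a principal component).
    x = np.asarray(signal, dtype=float)
    feats = {}
    feats["rms"] = float(np.sqrt(np.mean(x ** 2)))                # root mean square
    feats["signal_range"] = float(np.max(x) - np.min(x))
    feats["skewness"] = float(skew(x))
    # Power spectrum via FFT for the frequency-domain features.
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    feats["dominant_freq_value"] = float(freqs[np.argmax(spectrum[1:]) + 1])
    feats["dominant_freq_magnitude"] = float(np.max(spectrum[1:]))
    # Spectral entropy: Shannon entropy of the normalized power spectrum.
    p = spectrum / np.sum(spectrum)
    feats["spectral_entropy"] = float(-np.sum(p * np.log2(p + 1e-12)))
    return feats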
Continuing with scratch detector 260, scratch event classifier(s) 268 is generally responsible for determining whether to classify a motion signal as a scratch event. Embodiments of scratch event classifier 268 may utilize at least the extracted features of the motion signal (as determined by features extractor 266) to output a classification of the motion signal as a scratch event or not a scratch event (i.e., non-scratch event). As discussed earlier, the extracted features may be extracted from windows (e.g., 3-second windows) of motion signal corresponding to a detected hand movement such that the classification may determine whether the hand motion represents a scratch event or not.
In some embodiments, scratch event classifier 268 may utilize scratch-event detection logic 256 in storage 250 to determine whether motion signal is a scratch event or not. Scratch-event detection logic 256 may include rules, conditions, associations, machine learning models, or other criteria for inferring or detecting a likelihood of a scratch event based on motion sensor data. For example, scratch-event detection logic 256 may determine, from the accelerometer data, a probability that the detected movement was caused by a user scratching his or her body. Scratch-event detection logic 256 may take different forms depending on the mechanism(s) used to detect scratching. In some embodiments, scratch-event detection logic 256 may comprise fuzzy logic, a neural network(s), a finite state machine, a support vector machine, a logistic regression, clustering, other machine-learning techniques, similar statistical classification processes, or combinations of these to identify likely scratch events. Specifically, some exemplary embodiments of scratch-event detection logic 256 may include one or more binary machine learning classifiers. Scratch-event detection logic 256 may comprise an ensemble of machine learning models. In one embodiment, scratch-event detection logic 256 may be a random forest classifier. In another embodiment, gradient boosting may be utilized.
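As a hedged example of a random forest embodiment of scratch-event detection logic 256, the scikit-learn sketch below trains a binary classifier and applies it to a newly detected hand movement. The synthetic placeholder feature vectors, the labels, and the 0.5 decision threshold are assumptions for illustration only; the 26-feature dimensionality and the 50 estimators follow the embodiments described herein.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: one row of extracted feature values per detected hand
# movement window, with 1 for annotated scratch and 0 for non-scratch.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 26))            # 26 features per 3-second window
y_train = rng.integers(0, 2, size=200)

clf = RandomForestClassifier(n_estimators=50, random_state=0)   # 50 estimators, per one embodiment
clf.fit(X_train, y_train)

X_new = rng.normal(size=(1, 26))                # features for a newly detected hand movement
scratch_probability = clf.predict_proba(X_new)[0, 1]
is_scratch_event = scratch_probability >= 0.5   # simple scratch-event threshold (assumed)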
Model(s) forming the scratch-event detection logic 256 may be trained in accordance with embodiments of this disclosure. In one embodiment, scratch event classifier 268 is trained on annotated training data and validated using a leave-one-subject-out (LOSO) process. Further details of training are disclosed with reference to embodiments described in connection with
Scratch event classifier 268 outputs an indication of whether a scratch event has occurred utilizing the scratch-event detection logic 256. In some embodiments, the output of scratch event classifier 268 is binary, i.e., either a scratch event or not a scratch event. Additionally, or alternatively, the output may have a corresponding quantitative or qualitative measure, such as a degree, a magnitude, or a level, associated with the detected scratch event. The output of scratch event classifier 268 may also be a "scratch event number," where a number above a scratch-event threshold is considered a scratch event and a number at or below the threshold is not. In some embodiments, the output of scratch event classifier 268 is stored in individual record 240 of the monitored individual. Specifically, this information may be stored as historical scratch events 244 (in individual record 240), as shown in
Based on detected scratch events, a number of scratch endpoints may be determined for each period of time (e.g., 24-hour period) for use by other components of system 200, such as by flare predictor 290 and/or decision support tool(s) 270, as described further herein. As used herein, the term "scratch endpoint" refers to a quantifiable measure of scratching behavior, which may be derived from raw sensor data. In one exemplary embodiment, a total scratch event count may be determined by summing the number of detected scratch events within the sleep opportunity determined for the period of time. Additionally, in some embodiments, a total scratch duration may be determined by summing the lengths of time of the detected scratch events, which may be provided in minutes. Further, a duration between different scratch events may be determined by summing the time between scratch events within the sleep opportunity. A ratio of the duration between scratch events and the number of scratch events may also be computed. Transformations, such as a log transformation, may be applied to one or more of the scratch endpoints. For example, a total scratch count and a total scratch duration may each be log transformed. In one example, the log transformation that is applied is log(x+1) so as to accommodate possible zero values. In some aspects, scratch endpoints for each period are stored and provided to other components in the form of, for example, comma separated values (CSV) spreadsheets.
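As one illustrative sketch of deriving these scratch endpoints from detected events, the function below assumes each detected scratch event has already been merged into a bout represented as a (start_minute, end_minute) tuple within the sleep opportunity; the function name and dictionary keys are assumptions of this example.

import numpy as np

def scratch_endpoints(scratch_bouts):
    # Illustrative sketch: derive per-night scratch endpoints from detected scratch events.
    bouts = sorted(scratch_bouts)
    total_count = len(bouts)
    total_duration = sum(end - start for start, end in bouts)           # minutes
    gaps = [bouts[i + 1][0] - bouts[i][1] for i in range(len(bouts) - 1)]
    duration_between = sum(gaps)
    ratio = duration_between / total_count if total_count else 0.0
    return {
        "total_scratch_count_log": float(np.log(total_count + 1)),      # log(x + 1) transform
        "total_scratch_duration_log": float(np.log(total_duration + 1)),
        "duration_between_events_min": float(duration_between),
        "gap_to_count_ratio": float(ratio),
    }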
Continuing with
In some embodiments, sleep classification logic 253 may determine periods of sleep or wake based on motion sensor data. In one exemplary embodiment, activity values may be determined from motion sensor data within a sleep opportunity segmented into windows of time, and the activity values for those windows of time may be utilized to classify periods within the sleep opportunity as asleep or awake. As depicted in
Activity index determiner 232 may generally be responsible for determining activity index levels, which may be a metric for summarizing tri-axial motion data. In an exemplary embodiment, motion sensor data captured during a user's sleep opportunity may be utilized to determine activity index levels. Sleep opportunity determiner 262 may determine the sleep opportunity, which may include determining sensor wear as described earlier with respect to sensor wear determiner 261. Additionally, any preprocessing steps discussed with respect to sensor wear determiner 261 and/or sleep opportunity determiner 262 may be applied to motion sensor data for determining activity index levels (by activity index determiner 232). For instance, a high-pass filter may be applied to the motion sensor data, which may be accelerometer data, and the cutoff may be 0.25 Hz.
The sleep opportunity may be segmented into windows of a predetermined length, and activity index determiner 232 may compute an activity index level for each window. In exemplary aspects, the predetermined length may be one minute, such that an activity index level is determined for each minute within the sleep opportunity. In an example embodiment, activity index determiner 232 determines the activity index level in accordance with the following algorithm, in which A_i(t) is the activity index level at time t for patient i and m indexes the accelerometer axes:
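The formula itself is not reproduced in this text. As a minimal illustrative sketch, assuming a variance-based activity index in which each second's value is the square root of the mean of the per-axis variances for that second (the actual formulation and any noise-adjustment term may differ), activity index determiner 232 might compute a per-minute level as follows; the aggregation by summation over seconds is likewise an assumption of this example.

import numpy as np

def activity_index(acc_minute, fs=20):
    # Illustrative sketch of a variance-based activity index for one one-minute window.
    # acc_minute is a (60 * fs, 3) array of high-pass-filtered x/y/z samples.
    acc = np.asarray(acc_minute, dtype=float)
    samples = acc[: (len(acc) // fs) * fs]
    # Per-axis variance within each one-second epoch of the minute.
    epochs = samples.reshape(-1, fs, 3)                  # (seconds, samples, axes)
    per_axis_var = epochs.var(axis=1)                    # (seconds, 3)
    # Activity index per second: square root of the mean variance across the three axes.
    ai_per_second = np.sqrt(np.clip(per_axis_var.mean(axis=1), 0.0, None))
    # Summarize the minute as the sum of the per-second activity index values.
    return float(ai_per_second.sum())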
Embodiments of sleep/wake classifier 234 may apply heuristic rules to the activity index levels (or values) to classify the windows as asleep or awake. Some embodiments of sleep/wake classifier 234 may compute a statistical feature of activity index values and apply a sleep threshold. An embodiment may determine a weighted sum of activity index values within a particular time period. For instance, the weighted sum for a one-minute window may be computed using activity index values over a span of 7 minutes, such as from time instances t−4 to t+2. An example algorithm for determining the weighted sum of activity index values is provided below:
D₀ = 0.243 × (W₋₄A₋₄ + W₋₃A₋₃ + W₋₂A₋₂ + W₋₁A₋₁ + W₀A₀ + W₊₁A₊₁ + W₊₂A₊₂)
In some embodiments, sleep/wake classifier 234 may determine whether the weighted sum satisfies a sleep threshold. For example, the sleep threshold may be 0.5 and a window may be classified as a sleep period if the weighted sum for that period is less than 0.5.
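A hedged Python sketch of this weighted-sum classification follows. The placeholder weight values for W₋₄ through W₊₂, the edge padding at the boundaries of the sleep opportunity, and the function name are assumptions of this example; only the 0.243 scale factor and the comparison against a 0.5 sleep threshold come from the description above.

import numpy as np

# Hypothetical placeholder weights W(-4) ... W(+2); the actual weight values would be
# tuned or taken from the rescoring literature and are not specified here.
DEFAULT_WEIGHTS = np.array([0.04, 0.04, 0.20, 0.20, 2.0, 0.04, 0.04])

def classify_sleep_wake(activity_index, weights=DEFAULT_WEIGHTS, sleep_threshold=0.5):
    # Illustrative sketch: classify each one-minute window as asleep (True) or awake (False).
    ai = np.asarray(activity_index, dtype=float)
    padded = np.pad(ai, (4, 2), mode="edge")            # cover A(t-4) ... A(t+2) at the edges
    asleep = np.empty(len(ai), dtype=bool)
    for t in range(len(ai)):
        window = padded[t : t + 7]                       # A(t-4) ... A(t+2)
        d0 = 0.243 * float(np.dot(weights, window))      # weighted sum D0
        asleep[t] = d0 < sleep_threshold                 # sleep when below the sleep threshold
    return asleep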
Further embodiments of sleep/wake classifier 234 may apply one or more rescoring rules for improved specificity. For example, in one embodiment, rescoring rules similar to Webster's rescoring rules may be applied, such as those described in Roger J. Cole, Daniel F. Kripke, William Gruen, Daniel J. Mullaney, and J. Christian Gillin, Automatic Sleep/Wake Identification From Wrist Activity, Sleep, Volume 15, Issue 5, September 1992, Pages 461-469 (https://doi.org/10.1093/sleep/15.5.461).
Sleep/wake detector 230 may utilize other algorithms for detecting whether the user is sleeping, such as algorithms processing physiological variables. For instance, sleep/wake detector 230 may determine when a user is awake or asleep based on heart rate, blood pressure, core body temperature, near body temperature, and/or galvanic skin response data.
Based on detected sleep intervals, a number of sleep endpoints may be determined for each period of time (e.g., 24-hour period) for use by other components of system 200, such as by flare predictor 290 and/or decision support tool(s) 270, as described further herein. As used herein, the term "sleep endpoint" refers to a quantifiable measure of sleep behavior, which may be derived from raw sensor data. For example, total sleep time (TST) and, in some embodiments, percentage time asleep within the sleep opportunity may be computed. The number of arousals, which may also be referred to as wake bouts or periods of wake between periods of sleep, may be determined. Additionally, wake after sleep onset (WASO) and sleep onset latency (SOL) may be determined. As used herein, WASO refers to an amount of time (e.g., in minutes) that a user is awake after initially falling asleep, while SOL refers to an amount of time (e.g., in minutes) at the beginning of the sleep opportunity before the first period of sleep. In some aspects, sleep endpoints for each period are stored and provided to other components of system 200 in the form of CSV spreadsheets. In some aspects, a user's sleep opportunity or, more specifically, the total sleep opportunity (TSO), as previously determined, may also be saved as a sleep endpoint.
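As one illustrative sketch of deriving these endpoints from per-minute sleep/wake labels, the function below computes TST, percent time asleep, SOL, WASO, and the number of wake bouts; the exact clinical conventions (for example, whether trailing wake counts toward WASO) are simplified assumptions of this example.

import numpy as np

def sleep_endpoints(asleep):
    # Illustrative sketch: aggregate sleep endpoints from per-minute sleep/wake labels.
    # asleep is a boolean array with one value per minute of the sleep opportunity.
    asleep = np.asarray(asleep, dtype=bool)
    tso = len(asleep)                                    # total sleep opportunity, minutes
    tst = int(asleep.sum())                              # total sleep time, minutes
    first_sleep = int(np.argmax(asleep)) if asleep.any() else tso
    sol = first_sleep                                    # sleep onset latency, minutes
    waso = int((~asleep[first_sleep:]).sum())            # wake after sleep onset, minutes
    # Number of wake bouts: transitions from sleep to wake after sleep onset.
    transitions = np.diff(asleep[first_sleep:].astype(int))
    num_wake_bouts = int((transitions == -1).sum())
    return {
        "TSO_min": tso,
        "TST_min": tst,
        "PTA_percent": 100.0 * tst / tso if tso else 0.0,
        "SOL_min": sol,
        "WASO_min": waso,
        "num_wake_bouts": num_wake_bouts,
    }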
These end points may be utilized to generate a sleep score, in accordance with some embodiments. The sleep score may indicate one or more characteristics or qualities of a user's sleep for a particular evening or over a period of time. In some embodiments, scratch end points, as described with respect to scratch detector 260, may further be utilized with sleep end points to generate a sleep score. In this way, the impact of scratching during an individual's sleep may be measured. An example embodiment of output of sleep/wake detector 230, including a sleep score, is discussed below with respect to
Continuing with system 200 of
Scratch patterns assembler 292 may assemble historic scratch information for a user, in accordance with some embodiments. The historic scratch information may include historical scratch events determined by scratch detector 260 and stored in individual record 240 of the monitored user, as shown by historical scratch events 244. In some embodiments, the historic scratch information includes scratch endpoints determined from detected scratch events such as count of total scratch episodes (or events), total scratch duration, duration between scratch events, and/or a ratio of duration between scratch events and number of scratch events. Further, some embodiments of scratch patterns assembler 292 may also consider historic sleep-related data, including sleep endpoints discussed above with respect to sleep/wake detector 230.
Contextual data determiner 294 may be generally responsible for determining context information for historic scratch events and assembled scratch patterns as well as contextual information for a future time interval, in accordance with some embodiments. This contextual data may provide insight into potential causes, signs, or symptoms of future itch or flare. For instance, some embodiments of contextual data determiner 294 may determine weather information, such as atmospheric temperature and/or humidity, which may have an impact on a user's itch level. In some embodiments, weather information is determined by a location, which may be entered by a user or may be determined based on location information, such as GPS data, obtained from a user device associated with the user. Weather information may also come from one or more smart devices associated with the user, such as a smart thermostat. Other contextual data may include user's health data, which may be determined from profile/health record (e.g., EHR) 241 in the individual record 240. This health data may include, but is not limited to, user's age, weight, diagnosed conditions, past prescriptions, and/or current prescriptions.
In addition, contextual data determiner 294 may determine context from user-input data. For example, a user may input a user-defined itch rating, notes, and/or photographs of the user's skin, including skin lesions. In some aspects, contextual information may include user input regarding past treatment details including date, etc. For instance, a user may input whether the user applied prescribed ointment on a particular day. This information may have been input by the user into a tracking or monitoring application. Additional sources of contextual information may come from workout tracking applications, food logs, and/or water consumption logs.
In some embodiments, contextual data determiner 294 may append or associate the contextual information with pattern information determined from scratch patterns assembler 292. In one exemplary embodiment, the association may be based on common date and/or time. For example, an increase in scratch events over a particular week, detected by scratch patterns assembler 292, may be correlated to a high humidity level detected by contextual data determiner 294 for that same week. In this way, pattern data from scratch patterns assembler 292 may be enriched through contextual information.
Contextual data determiner 294 may also determine current and/or future context data. For instance, contextual data determiner 294 may determine a weather forecast, such as predicted temperature and/or humidity, for the future time interval. Additionally, current health information, such as whether a user has a current prescription for atopic dermatitis and the user's current weight, may be determined.
Itch predictor 296 may generally be responsible for predicting the user's itch level within a future time interval. As used herein, a predicted itch level may be represented as a scratch level, indicating an amount of scratching a user may do at a future time interval, which may be due to itch. Itch predictor 296 may use the scratch patterns of the user, as described with reference to scratch patterns assembler 292 and contextual data determiner 294, to predict the user's itch level at a future time interval. A future time interval may be the next one day, next few days, next week, same or next month, and the like.
Itch predictor 296 may apply itch prediction logic 259 to determine a future (or predicted) itch level. Itch prediction logic 259 includes rules, conditions, thresholds, associations, machine learning models, or other criteria for inferring or detecting a likelihood of a particular itch occurring in the future. Itch prediction logic 259 may take different forms depending on the mechanism(s) used to predict itch. In some embodiments, itch prediction logic 259 may comprise fuzzy logic, a neural network(s), a finite state machine, a support vector machine, a logistic regression, clustering, other machine-learning techniques, similar statistical classification processes, or combinations of these to determine a likelihood of itch at a future time interval. Itch prediction logic 259 may be applied to scratch patterns, historical context, current context (including user-specific data such as age, demographics, prior conditions, etc.), and, in some embodiments, sleep-related data, to determine the likelihood of itch.
In some embodiments, itch prediction logic 259 may be generalized logic based on reference data. In one exemplary embodiment, historical scratch patterns for a reference population may be assembled, contextual information for the reference population may be determined, and this reference information may be utilized to determine itch prediction logic 259, such as one or more heuristic rules or thresholds. In some embodiments, this logic may be based on crowdsourced data, or historic data of similar users (e.g., users with the same diagnosed condition, in the same or nearly the same geographic location, or with the same or similar demographics). Any such crowdsourced data may be de-identified prior to use by embodiments of flare predictor 290.
Further, in some aspects, itch prediction logic 259 is based on the specific user's historical scratch patterns and, in some embodiments, sleep-related data, as well as historical contextual information. For example, one or more rules, thresholds, or machine learning models may be built utilizing the monitored user's information. In this way, prior conditions, such as the level and rate of increase of scratch events, weather, or whether the user was taking treatment, may be considered in determining the logic to apply to determine a particular itch level and/or flare event. Further, this information may also be used to predict likely future itch levels or flare events when similar patterns are observed again.
Although itch predictor 296 has been described as predicting a level or a degree of itch, a severe and/or persistent itch may accompany a flare. In this way, a predicted itch level may, by itself, be a predicted flare risk in accordance with some embodiments. Further, in some aspects, a predicted itch level may be utilized to determine a likelihood of a future flare. In some embodiments, predicting a flare risk utilizes itch predictions for multiple future time periods.
A predicted itch level may be compared to one or more flare detection thresholds to determine whether the predicted itch level is of sufficient severity to be a flare risk. A flare detection threshold may be predetermined based on a reference population, such that this threshold may be utilized for the larger population. In other embodiments, a flare detection threshold may be determined for a particular monitored individual. For instance, the flare detection threshold may be set based on the user's historical information, including health data such as condition or age. The flare detection threshold may be set by a doctor/caregiver of the user and/or adjusted by the user, which may be stored in settings 249 in individual record 240. In this way, the determination of a flare prediction by applying the flare detection threshold may be customized for a specific user.
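As a minimal sketch of this threshold comparison (assuming a hypothetical numeric itch scale, a placeholder population threshold of 7.0, and a user-specific override that might come from settings 249):

def flare_risk(predicted_itch_level, user_threshold=None, population_threshold=7.0):
    # Illustrative sketch: compare a predicted itch level to a flare detection threshold.
    # The numeric scale and the 7.0 population threshold are assumptions for illustration;
    # a user-specific threshold takes precedence when one is available.
    threshold = user_threshold if user_threshold is not None else population_threshold
    return "flare risk" if predicted_itch_level >= threshold else "no flare risk"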
In some aspects, output of itch predictor 296 may be an itch level or a risk score for a future time interval. The itch level or risk score may be a numerical level or score, or a categorical level or score, such as indicating low, medium, high, and/or severe risk levels. Additionally, in some embodiments, two predictions may be made for each future time interval, including one prediction based on an assumption of treatment of pruritus or an underlying condition causing pruritus, and another prediction based on an assumption of no treatment for pruritus or an underlying condition. A prediction based on an assumption of treatment may be based on a determination of a current use of a correct treatment determined by contextual data determiner 294. Additionally, or alternatively, this prediction may be based on the determination of information indicating a potential treatment, which may be identified from reference information in storage 250. A prediction based on an assumption of no treatment may be based on contextual data determiner 294 determining that the user is not taking treatment or failing to determine current treatment information. Additionally, even where contextual data determiner 294 determines that a user is currently taking treatment, a prediction based on no treatment may be based on a presumption that the user may stop taking the treatment.
In some embodiments, flare notification generator 298 of flare predictor 290 may generally be responsible for generating a notification or an alert indicating the user's itch and/or flare risk. For example, where an itch level satisfies a flare detection threshold, flare notification generator 298 may issue a notification presenting that risk to a user device (such as any of 102a-n) of the monitored user and/or to a clinician user device 108 for a clinician treating the monitored user or recommended to treat the monitored user. Unless otherwise indicated, the term "flare notification" is used herein to include a notification about an itch level even if the itch level does not indicate that a flare event is likely.
Example embodiments of a flare notification generated in accordance with embodiments of flare notification generator 298 are described below with respect to
Some embodiments of flare notification generator 298 may determine a time instance or a time interval, which can be used to decide when to provide the flare notification. This determination may be based on user preferences, such as those stored in settings 249. Alternatively, or additionally, this determination may be based on location information and/or time of day in a way that increases the likelihood of the user taking necessary action to mitigate the flare risk. For instance, in one embodiment, a flare notification is issued either in the morning or at night, which may correspond to times when an individual is more likely to apply an at-home treatment and/or plan a trip to a store for treatment. For one such instance, flare notification generator 298 may determine whether a location of the user is at or near a store, such as a drug store, and may issue a notification with a recommendation for an over-the-counter treatment or to refill a prescription.
Further, some embodiments of flare notification generator 298 may securely transmit a flare risk and associated data, such as recent scratch data, to the user's caregiver. This flare notification may be sent directly to a user device associated with the user's caregiver, such as clinician user device 108. In addition, or alternatively, a flare notification may be logged at regular intervals in a data source accessible by the user's caregiver, such as the user's EHR 241.
Decision support tool(s) 270 (as shown in
Some embodiments of the decision support tool(s) 270 may determine a daily/nightly scratch score and/or a sleep score for the monitored user and/or, in some aspects, other related metrics. An example user interface of decision support tool(s) 270 providing nightly scratch score, sleep score, and related information is shown in
One example decision support tool 272 may comprise a scratch tracker application or service. In some embodiments, decision support tool 272 may associate scratch event data with periods of time, such as days, and present the scratch event data in association with the relevant period. Decision support tool 272 may include a calendar in which each day of the calendar provides scratch event data for the monitored user. This data may include historical scratch event data, which may include determined scratch endpoints, such as total scratch event count and total scratch duration, as described with respect to scratch detector 260. Decision support tool 272 may further allow a user to log additional information for each date, such as user-defined scratching or itch levels, notes/narratives, other symptoms, and/or photographs. An example scratch tracker application or service is further described in connection with
Another decision support tool 274 may comprise a flare risk predictor service and/or itch forecaster, which may be determined as described earlier with respect to flare predictor 290. Decision support tool 274 may provide the flare risk prediction or itch level prediction as a notification. Additionally, or alternatively, the flare risk or itch level prediction may be associated with future time intervals (e.g., future dates and times) and presented in association with those dates, such as on a calendar. An example flare risk predictor service and/or itch forecaster is further described in connection with
Another exemplary decision support tool 276 shown in
Some embodiments of decision support tool 276 include aspects for treating a user's pruritus, which may be presented as atopic dermatitis, based on scratching detected from a wearable device with a sensor, such as sensor 103. Treatment may be targeted to reduce the severity of a user's pruritus. Treatment determined based on the detected scratching may be intended to prevent the user's pruritus from worsening. Treating a user's pruritus based on the detected scratching may include determining a new treatment protocol, which may include a new therapeutic agent(s), a dosage of a new agent or a new dosage of an existing agent being taken by the user, and/or a manner of administering a new agent or a new manner of administration of an existing agent taken by the user. A recommendation for the new treatment protocol may be provided to the user or a caregiver for the user. In some embodiments, a prescription may be sent to the user, the user's caregiver, or a user's pharmacy. In some instances, treatment may include refilling an existing prescription without making changes. Further embodiments may include administering the recommended therapeutic agent(s) to the user in accordance with the recommended treatment protocol and/or tracking the application or use of the recommended therapeutic agent(s). In this way, embodiments of the disclosure may better enable controlling, monitoring, and/or managing the use or application of therapeutic agents for treating pruritus, which would not only be beneficial to a user's condition but could also help healthcare providers and drug manufacturers, as well as others within the supply chain, better comply with regulations and recommendations set by the Food and Drug Administration and other governing bodies. In example aspects, treatment includes one or more therapeutic agents from the following:
Some embodiments include treatment being one or more therapeutic agents from the following, which may be in addition to or alternative to the agents listed above:
These example decision support tools 272, 274, and 276 may be utilized independently or in conjunction with each other. For example, one application may employ all three decision support tools. Additional details of decision support tools are discussed in conjunction with
Presentation component 220 of system 200 may generally be responsible for presenting detected scratch event information, detected sleep/wake information, itch/flare predictions, and/or related information. Presentation component 220 may comprise one or more applications or services on a user device, across multiple user devices, or in the cloud environment. For example, in one embodiment, presentation component 220 may manage the presentation of information, such as notifications and alerts, to a user across multiple user devices associated with that user. Based on presentation logic, context, and/or other user data, presentation component 220 may determine on which user device(s) content is presented, as well as the context of the presentation, such as how (e.g., in what format and how much content, which can be dependent on a user device or context) it is presented, when it is presented, or other such aspects of presentation.
In some embodiments, presentation component 220 may generate user interface features associated with or used to facilitate presenting aspects of other components of system 200, such as scratch detector 260, sleep/wake detector 230, flare predictor 290, and decision support tool(s) 270, to the user. Such features can include interface elements (such as icons or indicators, graphics buttons, sliders, menus, audio prompts, alerts, alarms, vibrations, pop-up windows, notification bar or status bar items, in-app notifications, or other similar features for interfacing with a user), queries, and prompts. Examples of graphic user interfaces (GUIs) that may be generated and provided to a user by presentation component 220 are described in connection with
Storage 250 of example system 200 may generally store information including data, computer instructions (e.g., software program instructions, routines, or services), logic, profiles, and/or models used in embodiments described herein. In an embodiment, storage 250 may comprise a data store (or computer data memory), such as data store 150. Further, although depicted as a single data store component, storage 250 may be embodied as one or more data stores or in the cloud environment.
As shown in example system 200, storage 250 includes sleep classification logic 253, scratch-event detection logic 256, and itch prediction logic 259, all of which are previously described. Further, storage 250 may include one or more individual records 240, as shown in
Profile/health data (EHR) 241 may provide information relating to a monitored individual's health. Embodiments of profile/health data (EHR) 241 may include a portion or all of an individual's EHR or only some health data that is related to scratch or sleep. For instance, profile/health data (EHR) 241 may indicate past or currently diagnosed conditions, such as atopic dermatitis, eczema, psoriasis, or similar conditions; medications associated with treating pruritus-related conditions or with potential side effects of scratching/itching; weight; or age.
Sensor data 242 may include raw and/or processed sensor data, such as from sensor 103 (shown in
Further, historical scratch events 244 may comprise scratch events determined by scratch event classifier 268. In some embodiments, historical scratch events 244 also include scratch endpoints, such as count of total scratch episodes, total scratch duration, duration between scratch events, and/or a ratio of duration between scratch events and scratch episodes. Embodiments of historical scratch events 244 may also include itch or flare predictions determined by flare predictor 290. Further, in some embodiments, historical scratch events 244 may also include information about the detected scratch events and/or previously predicted itch or flares, such as the date-time of a scratch event or prediction. In some aspects, other contextual data, such as weather, location, or the like, may be stored as historical scratch events 244. Additionally, or alternatively, other contextual information extracted from user-provided observational data, such as user-defined itch ratings, notes and photographs may be stored as historical scratch events 244.
In some embodiments, logs 246 may include observation logs and/or response logs. An observation log may include user notes, photographs, or other observations that the user may provide, via a scratch monitor app, in accordance with one exemplary embodiment. These observations may relate to itching, scratching, flares, sleeping and other contextual information described herein, such as weather, temperature, or the like. As previously disclosed, observation logs may be examined by contextual data determiner 294 to gain additional insights for future predictions.
Further, in some embodiments, logs 246 may also include response logs indicating how a user reacted to a detected scratch event, detected sleep/wake period, itch or flare prediction, and/or resulting notification. For instance, a response log may indicate that a monitored user scheduled a tele-appointment with a clinician in response to a predicted future flare. In another instance, a user may add a recommended ointment to an electronic shopping list in response to detected scratch events. Additionally, response log may indicate if a monitored user did not take affirmative action or selected an “ignore” feature in response to a notification or an alert generated based on detected scratch events or an itch or flare prediction. Some embodiments of this disclosure may utilize response logs for calibration, improving scratch detection, sleep/wake detection, flare or itch prediction, and/or improving decision support recommendations or actions initiated.
Also, in some embodiments, user account(s)/device(s) 248 may generally include information about user devices accessed, used, or otherwise associated with a user. Examples of such user devices may include user devices 102a-n of
In one embodiment, user account(s)/device(s) 248 may include information related to accounts associated with a user, for example, online or cloud-based accounts (e.g., online health record portals, network/health provider, network websites, decision support applications, social media, email, phone, e-commerce websites, or the like). For example, user account(s)/device(s) 248 may include a monitored individual's account for a decision support application, such as decision support tool(s) 270; an account for a care provider site (which may be utilized to enable electronic scheduling of appointments, for example); and online e-commerce accounts, such as Amazon.com® or a drugstore (which may be utilized to enable online ordering of treatments, for example).
Additionally, user account(s)/device(s) 248 may also include a user's calendar, appointments, application data, other user accounts, or the like. Some embodiments of user account(s)/device(s) 248 may store information across one or more databases, knowledge graphs, or data structures. As described previously, the information stored in user account(s)/device(s) 248 may be determined from data collection component 210.
Furthermore, in some embodiments, settings 249 may generally include user settings or preferences associated with one or more steps for scratch detection, sleep/wake detection, or itch/flare prediction or with one or more decision support applications, such as decision support tool(s) 270. By way of example and not limitation, such settings may include user notification tolerance thresholds, which may define when and how a user would like to be notified of a predicted flare. In some aspects, settings 249 may include user preferences for applications, such as notifications, preferred caregivers, preferred pharmacy or other stores, and over-the-counter medications. In one embodiment, calibration, initialization and settings of sensor(s) may also be stored in settings 249.
At step 410, sensor data is received. Sensor data may include motion sensor data associated with a monitored user (or patient), such as raw accelerometer data captured by a wrist worn sensor or device. Other sensed or determined data, such as user-entered data, near body temperature data, weather-related data, and the like, may also be received as sensor data. Embodiments of step 410 may include pre-processing operations, such as applying frequency filters, segmenting data into relevant windows, such as 3-second windows, and deriving transformed signals, such as a vector magnitude, a first principal component, and a second principal component. Step 410 may be performed by sensor 103 of
Further, at step 420, it is determined if the sensor(s) is configured for proper data acquisition. This step may include detecting whether the sensor (such as sensor 103) is being worn or not by the monitored user, or being worn in a manner to capture the intended information. Step 420 may be performed by an embodiment of sensor wear determiner 261 of
At step 430, a user sleep opportunity is determined. A user sleep opportunity may be an interval of time during which the monitored user intends to sleep or is more likely to sleep compared to outside of that interval. This determination may be made utilizing motion sensed information, such as accelerometer data. Step 430 may be performed by an embodiment of sleep opportunity determiner 262 of
Additionally, method 400 (more specifically, step 430) may further include determining periods of actual sleep (and/or periods of wake) during the user sleep opportunity. This aspect of step 430 may be carried out by sleep/wake detector 230 or its subcomponents activity index determiner 232 and/or sleep/wake classifier 234 in
Sleep periods may be determined by computing activity index values from accelerometer data captured within a determined total sleep opportunity (TSO). In this way, sleep/wake detection may include applying a sequence of three algorithms. Firstly, a total sleep opportunity may be detected. Secondly, activity index values may be computed from accelerometer data captured during the determined TSO. Thirdly, periods of time within the determined TSO may be classified as sleep/wake periods based on the activity index values.
Other techniques for determining sleep in accordance with an embodiment of method 400 may be based on physiological parameters that may be sensed, such as brain activity determined by a head-worn sensor, or based on a combination of a plurality of physiological parameters and motion data. For instance, step 430 may detect sleep during a period of less motion indicated in the motion data coupled with heart rate and/or respiration rate changes that are consistent with sleep. Output of a sleep (or wake) detection may be endpoints shown in the example user interface depicted in
Continuing with method 400, at step 440, a user hand motion event (which may also be referred to generally as hand movement) may be detected. Example embodiments of step 440 may detect hand motion events based on the sensor data, such as accelerometer data, acquired from a wearable device, such as a wrist-worn or finger-worn device. Step 440 may be carried out by an embodiment of hand movement detector 264 of
Further, at step 450, a likely scratch event may be detected. The determination at step 450 may be made from sensor data corresponding to the detected hand movement. In this way, embodiments of step 450 determine whether the detected hand movement is a scratch event or not. Specifically, feature values, such as time and frequency domain feature values, may be extracted from sensor data corresponding to the detected hand movement event, and the feature values may be input into one or more machine-learning classifiers, such as a random forest classifier, to determine whether or not the detected hand movement is likely a scratch event. Step 450 may be carried out by embodiments of features extractor 266 and scratch event classifier 268.
At step 460, a detected scratch event may be recorded. This step may include storing the classification of the scratch event and related contextual information. The scratch event data may be stored in individual record 240 and accessed for decision support, such as by decision support tool(s) 270. The scratch event data may further be provided to a user and/or a clinician, as described with respect to presentation component 220 of
At step 470, an action may be initiated based on the detected scratch event. Example actions may include actions, recommendations, and/or directives for alleviating itch and reducing scratch events. Step 470 may be performed by embodiments of decision support tool(s) 270 and/or presentation component in
The action may include sending or otherwise electronically communicating an alert or a notification to a user via a user device, such as user devices 102a-n in
In some embodiments, an action may further include processing the scratch event data for further decision making, which may include providing a recommendation for treatment and support based on the detected scratch events. Such a recommendation may include a recommendation to consult with a healthcare provider, continue an existing prescription or over-the-counter medicine, start using an over-the-counter medicine (which may additionally include adding the medicine to an electronic shopping list and/or e-commerce cart), adjust thermostat settings, and/or continue monitoring scratch events. One or more of these actions may be performed automatically in response to the detected scratch events and, in some embodiments, detected sleep/wake periods.
At step 4250, a set of rescoring rules may be applied to determine whether or not to change the initial determination of wear or non-wear for a given window or block of windows. Further details of heuristic rules to apply for rescoring at step 4250 are discussed in conjunction with sensor wear determiner 261 of
Process 4300 may generally include determining a user's total sleep opportunity based on the change in arm angle measured from motion data. At step 4310, rolling medians of raw tri-axial motion signal measurements are determined. For example, 5-second rolling medians of x-axis, y-axis, and z-axis measurements are determined at step 4310, and the median measurements are utilized to determine arm angles at step 4320.
At step 4330, average arm angle values may be computed for intervals (e.g., consecutive 5 seconds), and absolute differences between successive average arm angle values may be computed at step 4340. At step 4350, rolling medians of the difference between successive average arm angle values may be computed for an interval (e.g., 5 minutes). At step 4360, candidate rest periods may be determined by comparing the rolling median of the difference between successive average arm angle values to a rest threshold. For example, a candidate rest period may be detected when the median difference between successive average arm angle values is less than or equal to 0.15 multiplied by the 10th percentile value of all differences in arm angle values within the 24-hour period.
At step 4370, candidate rest periods identified as non-wear (which may be determined as described in conjunction with
Process 4800 may detect a user's sleep/wake periods utilizing activity index values calculated from motion data. At step 4810, a filter may be applied to the motion sensor data. For instance, a high-pass filter with a cutoff of 0.25 Hz may be applied to the motion sensor data. The sleep opportunity may be segmented into windows of a predetermined length, such as one minute, and, at step 4820, an activity index level may be computed for each window. Activity index values may be computed as illustrated at step 4820 in
At step 4830, a weighted sum of activity index values within a particular time period may be determined. For instance, the weighted sum for a one-minute window may be computed using activity index values over a span of 7 minutes, such as from time instances t−4 to t+2.
At step 4840, each weighted sum may be compared to a sleep threshold to determine whether to initially categorize the period as a sleep period. For example, the sleep threshold may be 0.5, and a window may be classified as a sleep period if the weighted sum for that period is less than 0.5. At step 4850, one or more rescoring rules may be applied to reclassify a period from sleep to awake and/or from awake to asleep. The rescoring rules may be as described in conjunction with sleep/wake classifier 234 of
At step 4860, aggregate sleep endpoints may be determined for the total sleep opportunity. These sleep endpoints may include total sleep time (TST), percent time asleep (PTA), wake after sleep onset (WASO), sleep onset latency (SOL), and number of wake bouts (NWB). These sleep endpoints may be utilized as described with respect to decision support tool(s) 270 in
Initially, at block 4010, sensor data may be received, which may include preformatting or preprocessing raw accelerometer data. In some embodiments, raw data can be in the form of an example signal 6410, as depicted in
The rest of process 4001 may include generating predictions of scratch via a two-tier approach. First, the presence of hand movement is determined (see block 4040), and then those periods of hand movement are classified as either scratch events or non-scratch events (see block 4050). At block 4040, each 3-second window is passed through a heuristic hand movement detection algorithm to determine the presence of hand movement. Steps 4042 and 4044 within block 4040 may be performed by an embodiment of hand movement detector 264 of
The hand movement detection algorithm includes computing a rolling (1-second) coefficient of variation (CoV), as shown at step 4042. These computed CoV values may be compared to a hand movement threshold, at step 4044. A parameter of the hand movement detection algorithm (the threshold on the calculated rolling coefficient of variation) may be tuned empirically based on a training dataset. For example, it may be determined that the 25th percentile of all calculated coefficient of variation values in the training dataset provides accurate results. In one embodiment, this threshold CoV value may be 0.023. In some embodiments, the hand movement detection algorithm may produce an example hand movement prediction signal 6440, as depicted in
Scratch classification is represented by block 4050. Steps within block 4050 may be performed by features extractor 266 and scratch event classifier 268 of
An example pipeline for predicting scratch at block 4050 includes preprocessing step 4052, feature extraction 4054, classification 4056, and computing endpoints 4058. The preprocessing step 4052 may generate three processed signals by applying filtering and dimensionality reduction to raw accelerometer data. First, the raw accelerometer data may be filtered using a high-pass filter, such as a first order Butterworth Infinite Impulse Response (IIR) high-pass filter with a cutoff frequency of 0.25 Hz. Next, in order to reduce dependency on device orientation, vector magnitude and first and second principal components of the filtered signal may be computed.
At step 4054, time and frequency domain features may be computed from the processed accelerometer data. An embodiment of step 4054 may utilize 26 features as identified above with respect to features extractor 266 of
At step 4056, the computed features may be run through the trained scratch classifier. In one embodiment, the scratch classifier is a random forest classifier. Further, the random forest classifier may include 50 estimators. The scratch classifier may determine, utilizing the computed features, whether the detected hand movement is likely a scratch event or not. Further details of step 4056 may be described with respect to scratch event classifier 268 in
At step 4058, digital endpoints of nighttime scratch (also referred to as scratch endpoints) may be derived by processing the scratch predictions during the determined sleep opportunity for each 24-hour period. The scratch endpoints may include total scratch events and total scratch duration. The sleep opportunity, such as TSO, may also be included as a digital endpoint as it is used for scratch detection. The table below summarizes some digital endpoints derived in an embodiment of step 4058.
Implementations of process 4001 may be performed with only one sensor, such as a wrist-worn sensor device. Some embodiments, however, may also function with two sensors, such as when a user is wearing a device on each wrist. When there are two sensors, total scratch counts may be computed by taking the sum of contiguous 3-second bouts of predicted scratch detected from both wrists, and total scratch duration may be computed by taking the sum of the durations of all predicted scratch bouts from both wrists.
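As a small illustrative sketch of this two-sensor aggregation, assuming each wrist's predictions have already been merged into contiguous scratch bouts represented as (start_second, end_second) tuples:

def combined_scratch_totals(left_bouts, right_bouts):
    # Illustrative sketch: combine predicted scratch bouts from devices on both wrists.
    all_bouts = list(left_bouts) + list(right_bouts)
    total_scratch_count = len(all_bouts)                                    # bouts from both wrists
    total_scratch_duration = sum(end - start for start, end in all_bouts)   # seconds
    return total_scratch_count, total_scratch_duration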
At step 4510, accelerometer data is received. The accelerometer data may be captured by a wearable device associated with an individual (e.g., a monitored subject or patient) and located at an appendage of the individual. For example, the wearable device may be located at the individual's wrist, arm, and/or finger. Other sensed or determined data, such as user-entered data, near-body temperature data, weather-related data, and the like, may also be received as sensor data. The wearable device may include a plurality of sensors for capturing different types of data, such as accelerometer data and at least one of near-body temperature data and light data. Step 4510 may be performed by sensor 103 of
At step 4520, a hand movement is detected utilizing the accelerometer data. Step 4520 may be carried out by an embodiment of hand movement detector 264 of
At step 4530, a computerized classification model is utilized to determine that the hand movement indicates a scratch event. This determination may be based on the accelerometer data corresponding to the hand movement. In some embodiments, step 4530 includes generating a multidimensional timeseries from the accelerometer data corresponding to the hand movement and determining feature values from the multidimensional timeseries. The feature values may include at least one time-domain feature value and at least one frequency-domain feature value. The determination that the hand movement indicates the scratch event may be based on the feature values. Step 4530 may be carried out by embodiments described in connection with scratch detector 260, and more specifically embodiments described in connection with features extractor 266 and scratch event classifier 268, of
At step 4540, one or more response actions are initiated based on the determination that the hand movement indicates the scratch event. Example actions may include actions, recommendations, and/or directives for alleviating itch and reducing scratch events. Step 4540 may be performed by embodiments of decision support tool(s) 270 and/or presentation component in
In some embodiments, the response action includes generating a graphic user interface element provided on a display of a user device, such as user computer device 102a-c, patient user device 102n, or clinician user device 108 of
Some embodiments of method 4500 may include determining a total sleep opportunity based on the accelerometer data. The total sleep opportunity may be a period of time between when the individual lies down for a rest and when the individual gets up from the rest. The hand movement detected at step 4520 may be detected utilizing only accelerometer data corresponding to the total sleep opportunity. Some embodiments of this process may be similar to step 430 in method 400 and/or may be performed by an embodiment of sleep opportunity determiner 262 of
At least one of near-body temperature and light data captured by a wearable device may be used, in addition to the accelerometer data, to determine the total sleep opportunity. Additionally, this determination of the total sleep opportunity may further include determining periods of actual sleep (and/or periods of wake) during the total sleep opportunity, which may be carried out by sleep/wake detector 230 or its subcomponents, activity index determiner 232 and/or sleep/wake classifier 234, in
At step 4610, accelerometer data collected from a motion sensing device is received. The accelerometer data may be captured by a wearable device associated with a subject and located at the subject's appendage (e.g., at the subject's wrist, arm, and/or finger). Other sensed or determined data, such as user-entered data, near-body temperature data, light data, weather-related data, and the like, may also be received from the motion sensing device or another device having a sensor(s). The wearable device may include a plurality of sensors for capturing different types of data, such as accelerometer data and at least one of near-body temperature data and light data. Step 4610 may be performed by sensor 103 of
At step 4620, a hand movement is detected utilizing the accelerometer data. Step 4620 may be carried out by an embodiment of hand movement detector 264 of
At step 4630, a computerized classification model is utilized to determine that the hand movement indicates a scratch event. This determination may be based on the accelerometer data corresponding to the hand movement. In some embodiments, step 4630 includes generating a multidimensional timeseries from the accelerometer data corresponding to the hand movement and determining feature values from the multidimensional timeseries. The feature values may include at least one time-domain feature value and at least one frequency-domain feature value. The determination that the hand movement indicates the scratch event may be based on the feature values. Some embodiments of step 4630 may be carried out by embodiments described in connection with scratch detector 260, and more specifically embodiments described in connection with features extractor 266 and scratch event classifier 268. Additionally, some embodiments of step 4630 may be similar to embodiments of step 450 of method 400. Some embodiments of method 4600 include recording the determination of the scratch event as further described with respect to step 460 of method 400.
At step 4640, a treatment protocol for the subject to treat pruritus may be initiated based on at least a first determination that the hand movement indicates the scratch event. Step 4640 may be performed by embodiments of decision support tool(s) 270 (e.g., tool 476) and/or presentation component 220 in
In some embodiments, the treatment protocol is further based on a plurality of determinations that a plurality of hand movements each indicate a scratch event. For example, the treatment protocol may be based on a pattern of scratching determined for the subject.
Some embodiments of step 4640 include determining at least one of a therapeutic agent, a dosage, and a method of administration of a therapeutic agent for determining the treatment protocol. In some aspects, the therapeutic agent is selected from the group consisting of: infliximab, adalimumab, belimumab, tanezumab, ranibizumab, bevacizumab, mepolizumab, certolizumab, natalizumab, ustekinumab, vedolizumab, 6-mercaptopurine, hydroxychloroquine, obeticholic acid, mofetil, sodium mycophenolate, leflunomide, rituxan, solumedrol, depomedrol, betamethasone, prednisone, cyclosporin, tacrolimus, pimecrolimus, dupilumab, omalizumab, tralokinumab, etokimab, nemolizumab, tezepelumab, lebrikizumab, fezakinumab, anti-OX40, efalizumab, etanercept, crisaborole, fluocinonide, mapracorat, hydrocortisone, desonide, alclometasone, triamcinolone, desoximetasone, loratidine, fexofenadine, desloratidine, levocetirizine, methapyrilene, cetirizine, budesonide, fluticasone, mometasone, dexamethasone, prednisolone, ciclesonide, beclomethasone, methotrexate, azathioprine, aspirin, ibuprofen, celecoxib, valdecoxib, WBI-1001 and/or MRX-6, abrocitinib, baricitinib, brepocitinib, cerdulatinib, decernotinib, delgocitinib, fedratinib, filgotinib, gandotinib, ilginatinib, itacitinib, lestaurtinib, momelotinib, oclacitinib, pacritinib, peficitinib, ritlecitinib, ruxolitinib, tofacitinib, upadacitinib, THRX-212401, PF-07055087, PF-06471658, PF-07055090, ATI-502, BMS-986165, JTE052, PF-06826647, SNA 152, SHR-0302, tapinarof, and/or alitretinoin. In a preferred embodiment, the therapeutic agent is crisaborole and/or abrocitinib.
In some embodiments, initiating administration of the treatment protocol includes generating a graphic user interface element provided for display on a user device. The graphic user interface element may indicate a recommendation of the treatment protocol that is based on the first determination that the hand movement represents the scratch event. In one example, the user device is separate from the motion sensing device. For example, the motion sensing device may be an example of the user computer device 102a-c or patient user device 102n of
At step 4710, accelerometer data is received for a subject. The accelerometer data may be captured by a motion sensing device, which may be a wearable device associated with the subject and located at the subject's appendage (e.g., at the subject's wrist, arm, and/or finger). Other sensed or determined data, such as user-entered data, near-body temperature data, light data, weather-related data, and the like, may also be received from the motion sensing device or another device having sensor(s). The wearable device may include a plurality of sensors for capturing different types of data, such as accelerometer data and at least one of near-body temperature data and light data. Step 4710 may be performed by sensor 103 of
The graphic user interface element may be provided for display on a user device that is communicatively coupled to a wearable device with sensors capturing the accelerometer data. For example, the user device may be a smart phone that is connected to a wearable device that captures the accelerometer data. Example embodiments of the user device and wearable device include user computer device 102a-c, patient user device 102n, and clinician user device 108 of
Some embodiments of method 4700 include providing for display, on the user device, a treatment protocol for the subject for treating atopic dermatitis. The treatment protocol may include a therapeutic agent, a dosage, and/or a method of administration, and may be based on the one or more scratch endpoints. Example therapeutic agents that may be included in method 4700 include the therapeutic agents described at step 4640 of method 4600.
At step 510, user scratch patterns may be determined. Step 510 may be performed by an embodiment of scratch patterns assembler 292 of
At step 520, contextual information may be determined. Step 520 may be performed by an embodiment of contextual data determiner 294. The determined contextual information may include weather information, such as atmospheric temperature and/or humidity; user health data, such as a user's age, weight, diagnosed conditions, past prescriptions or therapies, and current medications; and user-input data, such as a user-defined itch rating, notes, photographs of the user's skin, and/or treatment logs. In some embodiments, user health data may be determined from a user's profile/health record (EHR) 241 stored in the individual record 240 of
At step 530, a user's itch may be determined for a future time interval. Step 530 may be performed by an embodiment of itch predictor 296. The determined future itch is a likelihood of future itching within a future time frame, such as tomorrow, the next day, or in five days. The determined future itch may include a level or magnitude, which may represent the severity level of a predicted or future itch.
A future itch may be determined at step 530 utilizing the user's scratch patterns and contextual information determined at steps 510 and 520, respectively. Various types of logic may be employed at step 530 to determine the user's itch in the future. As described with respect to itch prediction logic 259 of
As may be appreciated, a user's itch may be determined for multiple future time frames, and the itch level predicted may vary within different time frames. For example, at step 530, a user may be determined to have a “low” itch level in two days, but may be determined to have a “high” itch level in five days.
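The disclosure leaves the specific prediction logic open (it may be rule-based, threshold-based, or a trained model), so the following Python sketch is only one hypothetical, rule-based way to combine a recent scratch trend with forecast humidity into per-horizon itch levels; every weight and threshold here is an assumption for illustration.

```python
def predict_itch_levels(nightly_scratch_counts, humidity_forecast):
    """Map a recent scratch trend plus forecast humidity to a coarse itch
    level ("low"/"medium"/"high") for each future day. Illustrative only."""
    recent = nightly_scratch_counts[-3:]
    trend = (recent[-1] - recent[0]) / max(len(recent) - 1, 1)  # events/night slope
    baseline = sum(recent) / len(recent)
    levels = {}
    for day, humidity in enumerate(humidity_forecast, start=1):
        # Dry air (assumed to aggravate itch) adds a small penalty.
        score = baseline + trend * day + (0.2 * (30 - humidity) if humidity < 30 else 0)
        levels[f"+{day} day(s)"] = "high" if score >= 15 else "medium" if score >= 8 else "low"
    return levels

# Rising scratch counts plus a drying forecast push later days to "high".
print(predict_itch_levels([5, 7, 9], humidity_forecast=[40, 35, 28, 22, 18]))
```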
At step 540, a likelihood of a flare event within a future time interval may be determined. Step 540 may be performed by an embodiment of itch predictor 296 or, more generally, flare detector 290. Determining a likelihood of a future flare event may include comparing a predicted itch level to one or more flare detection thresholds to determine whether the predicted itch level is of sufficient severity to be a flare risk. In some embodiments, the flare detection threshold(s) may be predetermined based on a reference population such that the flare detection threshold may be utilized for the population at large. In other embodiments, a flare detection threshold(s) is determined for each monitored individual. For instance, the flare detection threshold may be set based on the user's historical information, including health data such as condition and age. Further, the flare detection threshold(s) may be set by a clinician/caregiver of the user and/or adjusted by the user. This set threshold may be stored in settings 249 of individual record 240, as described in
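A minimal sketch of the threshold comparison described above follows; the 0-10 itch scale, the default population threshold, and the per-user override are assumed values for illustration.

```python
# Hypothetical itch scale: predicted level on 0-10, per future day.
POPULATION_FLARE_THRESHOLD = 7.0   # assumed reference-population default

def flare_risk(predicted_itch_by_day, user_threshold=None):
    """Flag future days whose predicted itch level meets or exceeds the
    flare detection threshold. The threshold may be a population default
    or a per-user value (e.g., set by a clinician or stored in settings)."""
    threshold = user_threshold if user_threshold is not None else POPULATION_FLARE_THRESHOLD
    return {day: level >= threshold for day, level in predicted_itch_by_day.items()}

forecast = {"+1 day": 4.5, "+2 days": 6.8, "+3 days": 7.9}
print(flare_risk(forecast))                      # {'+1 day': False, '+2 days': False, '+3 days': True}
print(flare_risk(forecast, user_threshold=6.5))  # personalized threshold also flags day 2
```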
At step 550, an action may be initiated based on the determined likelihood of a flare event and/or user itch. As such, step 550 may be performed by an embodiment of flare notification generator 298 and/or decision support tool 270, such as tool(s) 272, 274, or 276. In some embodiments, a flare notification or an alert indicating a user's itch and/or flare risk may be generated. In one exemplary embodiment, where an itch level satisfies a flare detection threshold, a flare notification indicating the risk may be sent to a user device of the monitored user. In another exemplary embodiment, the flare notification is sent to a clinician's user device for the clinician to accordingly treat the monitored user. Example embodiments of a flare notification generated in accordance with embodiments of step 550 are described below with respect to
Initiating an action at step 550 may also include generating recommendations or directives or initiating actions based on the itch level or flare risk. As an example, a recommendation to schedule an appointment with a caregiver, refill a prescription, and/or add an over-the-counter therapy to a user's shopping list may be generated and presented to the user. Further, in some embodiments, initiating an action may include adding the prediction to a user's electronic calendar, such as in a monitoring or tracking application, or modifying a user interface element in the user's device to indicate the predicted risk within an electronic calendar. Some embodiments of step 550 include initiating steps to treat a user's pruritus (or, more specifically, atopic dermatitis) using one or more therapeutic agents, e.g., crisaborole and/or abrocitinib, based on a flare prediction generated utilizing data obtained using a sensor on a wearable device, as described with respect to decision support tool 276. Method 500 may include tracking and/or monitoring the application and use of a therapeutic agent according to a recommended or directed treatment protocol provided at step 470.
Further, some embodiments of step 550 may include utilizing a response log, such as logs 246 in
Data preprocessing at block 610 includes, at step 612, alignment of video annotations to accelerometer data. To generate labels for training the scratch classifier, annotations of nighttime scratch and restless (non-scratch) movements may be created by human annotators who view thermal videos of in-clinic subject visits. Annotations may be performed by two human annotators and reviewed by an arbitrator for accuracy. Each annotation may include metadata indicating which hand was moving (right, left, or both) in embodiments in which sensors are worn on both hands; the affected body location; as well as the severity (mild, moderate, severe) of the scratch. To accurately make use of the reference video-based annotations, all annotations may be time-aligned with the accelerometer data at step 612. Alignment of the video annotations and the accelerometer data may be performed manually based on a prescribed clap event (i.e., subjects may be instructed to clap in front of a camera while wearing accelerometer devices) during each in-clinic visit.
Data preprocessing further includes, at step 614, downsampling the accelerometer data to 20 Hertz (Hz), which may help maximize battery life. Data preprocessing further includes filtering the annotations, at step 616. In exemplary aspects, annotations of three seconds or longer may be used in training a binary classifier. If an annotation is greater than three seconds, it may be segmented into three-second windows, at step 616. In some embodiments, the windows may be overlapping, such as with a 50% overlap between the three-second windows. Step 616 may also include determining whether hand movement is present throughout the annotated three-second window and filtering out data that does not have hand movement throughout.
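For illustration, the following sketch downsamples a raw accelerometer stream to 20 Hz and segments it into three-second windows with 50% overlap, assuming the data arrives as an N x 3 array at a known original sampling rate; scipy's resample stands in here for whatever downsampling the actual pipeline performs.

```python
import numpy as np
from scipy import signal

TARGET_FS = 20          # Hz, per the preprocessing described above
WINDOW_SECONDS = 3
OVERLAP = 0.5           # 50% overlap between consecutive windows

def downsample(acc, original_fs):
    """Resample an (N x 3) accelerometer array to 20 Hz."""
    n_out = int(round(acc.shape[0] * TARGET_FS / original_fs))
    return signal.resample(acc, n_out)

def segment_windows(acc):
    """Split a 20 Hz (N x 3) signal into 3-second windows with 50% overlap."""
    win = WINDOW_SECONDS * TARGET_FS          # 60 samples per window
    step = int(win * (1 - OVERLAP))           # 30-sample hop
    return [acc[start:start + win]
            for start in range(0, acc.shape[0] - win + 1, step)]

raw = np.random.randn(50 * 60, 3)             # one minute of synthetic 50 Hz data
windows = segment_windows(downsample(raw, original_fs=50))
print(len(windows))                           # 39 overlapping 3-second windows
```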
Preprocessed data from block 610 may then be passed to block 620 for signal preprocessing. Signal preprocessing steps in block 620 may be similar to preprocessing step 4052 described in connection with
The transformed signals may then be passed to block 630 for feature engineering. At step 632, a total of 36 time- and frequency-domain features are extracted from the transformed signals for each window. These 36 features may include, but are not limited to, the following:
At step 634, principal component analysis (PCA) is utilized to determine feature importance in indicating whether a movement is a scratch event or not, and the 36 features may be ranked according to their relative importance. In one embodiment, data from a random subset of 15 subjects may be selected to analyze feature importance in a scratch classifier. Feature importance may be determined from SHapley Additive exPlanations (SHAP) summary values that order the top 20 features based on their importance for detecting scratch. In an example embodiment, it was determined that signal periodicity, smoothness, and dominant frequency may be predominant features of a scratch classifier. Specifically, in one embodiment, the mean cross rate of the second principal component signal may be determined to be the most influential feature for an example classifier. Moreover, higher values of this feature may result in higher SHAP values, which in turn indicate a higher probability that the model would predict scratch for the given window. Measures of smoothness (spectral arc length measure (SPARC)) and dominant frequency may also be influential features for distinguishing scratch movements, as higher SPARC values (i.e., a smoother signal) and lower dominant frequency values tend to result in a lower probability of scratch prediction by the classifier.
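The following sketch shows one way a SHAP-based ranking such as the one described above might be produced, assuming the shap package, an already-trained random forest scratch classifier, and a held-out feature matrix with named columns; it is illustrative only and does not reproduce the study's analysis.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

def rank_features_by_shap(rf: RandomForestClassifier, X_val, feature_names, top_k=20):
    """Rank features by mean absolute SHAP value for the scratch class."""
    explainer = shap.TreeExplainer(rf)
    shap_values = explainer.shap_values(X_val)
    # Depending on the shap version, binary classifiers return either a list
    # of per-class arrays or a 3-D array; keep the positive ("scratch") class.
    if isinstance(shap_values, list):
        shap_values = shap_values[1]
    elif shap_values.ndim == 3:
        shap_values = shap_values[..., 1]
    importance = np.abs(shap_values).mean(axis=0)   # mean |SHAP| per feature
    order = np.argsort(importance)[::-1][:top_k]
    return [(feature_names[i], float(importance[i])) for i in order]
```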
After determining feature importance, feature selection and training of the machine learning model may be done in accordance with a leave-one-subject-out (LOSO) validation process, as depicted by block 640. At step 642, observations may be randomly sampled to balance the positive and negative classes prior to feature selection. At step 644, feature selection may be performed utilizing recursive feature elimination with cross-validation (RFECV) using a decision tree estimator. In one embodiment, a subset of the following 26 features may be selected during step 644: RMS (vector magnitude); signal entropy (vector magnitude, first principal component, and second principal component); IQR of auto-covariance (vector magnitude, first principal component, and second principal component); skewness (first principal component and second principal component); dominant frequency value (first principal component); dominant frequency magnitude (first principal component and second principal component); mean cross rate (second principal component); jerk ratio (vector magnitude and second principal component); log dimensionless jerk (first principal component); SPARC (vector magnitude, first principal component, and second principal component); permutation entropy (vector magnitude, first principal component, and second principal component); spectral flatness (first principal component and second principal component); spectral entropy (second principal component); and signal range (vector magnitude).
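A sketch of the class balancing, RFECV feature selection, and LOSO evaluation described for block 640 is shown below; the random forest as the final classifier and F1 as the reported score are assumptions added to make the example complete and runnable.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFECV
from sklearn.metrics import f1_score
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.tree import DecisionTreeClassifier

def balance_classes(X, y, rng=np.random.default_rng(0)):
    """Randomly undersample the majority class to match the minority class."""
    pos, neg = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
    n = min(len(pos), len(neg))
    keep = np.concatenate([rng.choice(pos, n, replace=False),
                           rng.choice(neg, n, replace=False)])
    return X[keep], y[keep]

def loso_evaluate(X, y, subject_ids):
    """Leave-one-subject-out loop: balance the classes, select features with
    RFECV (decision tree estimator), train a classifier, score the held-out subject."""
    scores = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subject_ids):
        X_tr, y_tr = balance_classes(X[train_idx], y[train_idx])
        selector = RFECV(DecisionTreeClassifier(random_state=0), step=1, cv=5)
        X_tr_sel = selector.fit_transform(X_tr, y_tr)
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr_sel, y_tr)
        preds = clf.predict(selector.transform(X[test_idx]))
        scores.append(f1_score(y[test_idx], preds))
    return float(np.mean(scores))
```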
Continuing with
Aspects of the performance of an embodiment of a trained model are illustrated in
Further, in
Further, in
In this performance validation 6501, determinations of rest utilizing a total sleep opportunity (TSO) algorithm disclosed herein are compared against determinations of rest utilizing polysomnography (PSG), which is represented as PSG TSO. The PSG determinations represent the base or reference that is compared with the TSO as determined by embodiments of the present disclosure, such as TSO detected by the process 4300 of
Graphs 6502 and 6504 indicate performance of the disclosed TSO algorithm determined from sensor data from the left wrist and right wrist, respectively. Graph 6506 shows the agreement between left-wrist and right-wrist based determinations of TSO. Specifically, graph 6506 indicates that the two determinations are strongly correlated, which means that the TSO algorithm disclosed herein may be sufficiently accurate for single-wrist operation, that is, either left-wrist based or right-wrist based (dominant or non-dominant) detection of TSO. Accordingly, embodiments of the present disclosure may be operated accurately with a single wrist-worn sensor, which represents an improvement over conventional technologies that required dual-wrist operation. Additionally, because the algorithm for detecting TSO may also be utilized in scratch detection, as described with respect to
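As a simple illustration of quantifying left/right agreement, the sketch below computes a correlation and mean bias between nightly TSO estimates from the two wrists; the agreement metrics used in the actual validation may differ, and the nightly values are hypothetical.

```python
import numpy as np

def wrist_agreement(left_tso_minutes, right_tso_minutes):
    """Pearson correlation and mean bias between nightly TSO durations
    estimated from the left and right wrists; a correlation near 1 with a
    small bias supports single-wrist operation."""
    r = np.corrcoef(left_tso_minutes, right_tso_minutes)[0, 1]
    bias = float(np.mean(np.array(left_tso_minutes) - np.array(right_tso_minutes)))
    return r, bias

left = [480, 455, 500, 470, 465]    # hypothetical nightly TSO (minutes)
right = [478, 452, 505, 468, 470]
print(wrist_agreement(left, right))
```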
With reference to
In some embodiments, it is contemplated that a prescribed or recommended standard of care for a patient diagnosed with atopic dermatitis (or similar condition) may comprise utilizing an embodiment of the scratch monitor app 8101, which may operate on the user/patient's own computing device, such as a smartwatch, a mobile device, or other user device 102a-102n, or may be provided to the user/patient via the patient's healthcare provider or pharmacy.
In particular, as described herein, conventional solutions for monitoring and tracking user scratching, such as requiring users to monitor and report scratching, may be subjective, non-uniform, less accurate, and inconsistently captured, among other deficiencies. However, embodiments of the technologies described herein may provide objective and/or uniform, consistent, and more accurate means of monitoring, detecting, and tracking scratch (and sleep) related data for a user. As a result, these embodiments enable reliable use of these technologies for patients who are prescribed certain medicines. In this way, a doctor or a healthcare provider may issue an order that includes a patient taking a medicine and using a computer decision support app (e.g., scratch monitor app 8101) to, among other things, track and determine the precise efficacy of the prescribed treatment. Moreover, the use of the computer decision support app (e.g., scratch monitor app 8101), as part of the standard of care for a patient who is administered or prescribed a particular medicine, supports the effective treatment of the patient. The effective treatment, in some embodiments, is achieved by enabling the healthcare provider to better understand the efficacy of a prescribed medicine, modify a dosage, change a particular prescribed medicine, or instruct the patient to cease using it because it is no longer needed due to the patient's condition having improved.
Further, continuing with
Selecting log icon 8112 can navigate the user to a scratch log tool (which may be indicated by a descriptor for a scratch log 8201) that comprises functionality to facilitate scratch or sleep related detection, tracking, and/or monitoring. In an embodiment, scratch log 8201 comprises calendar view 8105 or an alternative calendar view 8505 depicted in
The example scratch monitor app 8101 depicted in GUI 8100 includes a header region 8109 located near the top of GUI 8100. In particular, this example header region 8109 includes a hamburger icon 8103, descriptor 8201 showing “Scratch Log”, a share icon 8104, a stethoscope icon 8106, and a cycle icon 8108. Selecting hamburger icon 8103 may provide the user access to a menu of other services, features, or functionalities of scratch monitor app 8101, and may further include access to help, app version information, and access to secure user-account sign-in/sign-off functionality. Descriptor 8201 showing “Scratch Log” indicates to the user a mode, a feature set or an aspect of scratch monitor app 8101 to which the user has navigated. Here the descriptor 8201 indicates that the user is in the scratch log functionality of scratch monitor app 8101, which may have been accessed by selecting the log icon 8112. Share icon 8104 may be selected for sharing various data, reports, user-provided annotations or observations (e.g., notes or photos). For example, share icon 8104 may facilitate enabling the user to email a report of recent nights' scratch events to a caregiver of the user. In some embodiments, share icon 8104 may facilitate sharing aspects of the various data captured, displayed, or accessed via scratch monitor app 8101 on social media or with other similar users. Selecting stethoscope icon 8106 can provide the user with various communication or connection options to the user's healthcare provider. For example, selecting stethoscope icon 8106 may initiate functionality to facilitate scheduling a tele-appointment, sharing or uploading data to a medical record (e.g., profile/health data (EHR) 241) of the user for access by the user's healthcare provider, or accessing a healthcare provider's online portal for additional services. In some embodiments, selecting stethoscope icon 8106 may initiate functionality for the user to communicate specific data, such as the data that the user is currently viewing, to the user's healthcare provider, or may ping the user's healthcare provider to request them to look at the user's data. Finally, selecting cycle icon 8108 may cause a refresh or update to the views and/or data displayed via scratch monitor app 8101 so that the view is current with regards to the available data. In some embodiments, selecting cycle icon 8108 may refresh data pulled from a sensor (or from a computer application associated with data collection from a sensor, such as sensor 103 in
Scratch monitor app 8101 depicted in GUI 8100 may also include calendar view 8105. Embodiments of calendar view 8105 can facilitate accessing or displaying the detected and interpreted sleep and/or scratch related data for the user. For example, by selecting a particular date of the calendar view 8105, the user may be presented with a daily (or nightly) summary of the data for that date, such as provided by a GUI 8200, described in connection with
Turning now to
As shown in this example GUI 8200 of scratch monitor app 8101, the log functionality includes five selectable tabs: scores 8210, charts 8230, photo 8240, notes 8250, and treatment 8260. As per GUI 8200, as shown in
In some embodiments, scratching score 8212 may be displayed with various scratch-related analytics data 8213. By way of example and without limitation, data 8213 may include: a scratch trend, which indicates whether the user's scratching is increasing, decreasing, or remaining unchanged over recent nights (e.g., past 3 nights, 5 nights, or a week); a number of nightly or daily scratch events detected (e.g., 12 scratch events); total scratch time, which represents a cumulative total of the time of detected overnight scratch events (e.g., 84 seconds); the average duration of the detected scratch events (e.g., 7 seconds); and the duration of the longest detected scratch (e.g., 12 seconds). Similarly, sleep score 8216 may be displayed with various sleep-related analytics data 8217. By way of example and without limitation, data 8217 may include: a sleep percentage, which represents a ratio of the user's detected sleep time over their sleep opportunity (e.g., TSO) time interval (here shown as 86%); total sleep time (TST); sleep onset latency (SOL, measured in minutes); wake after sleep onset (WASO, measured in minutes); and a number of wake bouts (NWB). Other sleep-related metrics may also be presented, and in some embodiments, a user may customize the information that is displayed, including scores, metrics, and visual summary 8218, by configuring the settings (e.g., via settings icon 8115). Similarly, in some embodiments, other related data such as temperature or humidity data may be displayed alongside the score(s).
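The analytics shown in data 8213 and 8217 can be derived directly from the detected scratch bouts and the sleep/wake output; the sketch below mirrors the illustrative values above (12 events, 84 seconds total, 7-second average, 12-second longest, 86% sleep), and its input values are assumed for illustration.

```python
def scratch_analytics(bout_durations_s):
    """Summarize one night's detected scratch bouts (durations in seconds)."""
    if not bout_durations_s:
        return {"events": 0, "total_s": 0, "avg_s": 0.0, "longest_s": 0}
    return {
        "events": len(bout_durations_s),                        # e.g., 12 scratch events
        "total_s": sum(bout_durations_s),                       # e.g., 84 seconds
        "avg_s": sum(bout_durations_s) / len(bout_durations_s), # e.g., 7 seconds
        "longest_s": max(bout_durations_s),                     # e.g., 12 seconds
    }

def sleep_percentage(total_sleep_minutes, tso_minutes):
    """Sleep percentage = detected sleep time over the sleep opportunity interval."""
    return round(100 * total_sleep_minutes / tso_minutes)

print(scratch_analytics([12, 7, 5, 7, 7, 7, 7, 7, 7, 7, 7, 4]))
print(sleep_percentage(total_sleep_minutes=430, tso_minutes=500))  # 86
```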
Continuing with GUI 8200 shown in
Continuing with
In some embodiments, a user may enter other contextual information into notes 8250, such as their location, the weather, and any physical activity they engaged in during the day. In some instances, data such as user location and weather may be determined automatically, such as by using location sensors on the user computing device 8102a and looking up the weather information for the user device location. In some embodiments, as described in connection with contextual data determiner 294 (
In some embodiments, photo 8240 can comprise a UI for receiving photo(s) or video(s) from the user. Photo 8240 may also comprise functionality for snapping photos or videos on the user computing device 8102a on which scratch monitor app 8101 operates. For example, for a given day, the user may select notes 8250 to add a note indicating the user did not sleep well and scratched all night. The user also, or alternatively, may snap a photo on user computing device 8102a to be logged for this data, after selecting the tab for photo 8240. The photo may be of a lesion or an otherwise-affected area of the user's skin.
Selecting the tab indicating treatment 8260 on GUI 8200 may navigate the user to a UI within scratch monitor app 8101 with functionality for the user to specify details such as whether the user applied (or took) treatment for that date. For example, the user may specify that their prescription topical medication is applied on the affected area of the user's body. It is also contemplated that, in some embodiments, smart pillboxes or smart containers, which may include so-called internet-of-things (IoT) functionality, may automatically detect that a user has accessed medicine stored within a container and may communicate an indication to scratch monitor app 8101 indicating that the user has applied treatment on that date. In some embodiments, the tab for treatment 8260 may comprise a UI, enabling the user to specify their treatment, for instance, by selecting check-boxes indicating the kind of treatment the user followed on that date (e.g., applied OC lotion, took a bath, avoided exposure to sun, applied topical (or ingested oral) prescription medication, and so on).
Turning now to
Example GUI 8300 includes a descriptor 8303 indicating the current date on which the user is accessing the forecast functionality of scratch monitor app 8101 (e.g., Today, Tuesday, Mar. 17, 2020) and the user's itch forecast 8301. As shown in
In some embodiments, and in the example embodiment depicted in GUI 8300, itch forecast 8301 further includes a user recommendation 8330. Here, the recommendation advises the user to “use your topical treatment every day, as directed.” User recommendation 8330 may include recommendations and/or directives for treating pruritus using one or more therapeutic agents, such as the agents discussed with respect to decision support tool 276. In some instances, the user may select or click on user recommendation 8330 to view the recommendation or additional details about the recommendation. The recommendation displayed or accessed via user recommendation 8330 may correspond to the specific itch forecast for the user and/or information available about the user's behavior or treatment regimen. This information may be provided by the user, the user's caregiver or a healthcare provider, or received as observational or treatment-related data, such as described in connection with
In some embodiments and in the example embodiment depicted in GUI 8300, itch forecast 8301 further includes a viewing functionality 8340 for viewing alternative forecasts (with or without treatment). For example, by selecting a treatment button 8341, daily itch forecast 8310 may be determined and presented to the user based on the user using treatment over the future time interval. Similarly, by selecting a no-treatment button 8343, daily itch forecast 8310 may be determined and presented to the user based on the user not using treatment over the future time interval. In particular, the user's treatment may be determined as part of contextual data (such as by contextual data determiner 294, discussed in connection with system 200 of
Turning now to
Example GUI 8400 includes a descriptor 8403 indicating the current date (e.g., Today, Monday May 4) and flare notification 8401 alerting the user to a likely future flare event. In the example embodiment depicted in GUI 8400, additional information may be presented in addition to flare notification 8401, such as a recommendation (not shown, e.g., avoid exposure to sunlight) and/or flare notification details 8410. In particular, in this example, flare notification details 8410 indicate when the flare is likely to occur (e.g., the future time interval between the next day and Thursday), the likelihood of the flare event occurring (e.g., 74% likelihood, which may be determined as described in connection with flare predictor 290 of
In the example embodiment depicted in GUI 8400, flare notification 8401 further includes one or more response options 8420 to facilitate a user's response to the flare notification. For example, response options 8420 may include an option 8422 to check/refill the user's prescription, an option 8424 to schedule a tele-appointment (or in-person appointment) with the user's healthcare provider, or an option 8426 to automatically add an over-the-counter (OC) therapy (e.g., cortisone cream, calamine lotion, etc.) to the user's electronic shopping list. In embodiments where purchasing or store-account information is specified in user account(s)/device(s), selecting option 8426 may automatically purchase the item for the user and deliver it to the user's address or make it available for pickup. In some embodiments, the particular OC therapy may be specified by the user or healthcare provider. For example, OC therapy may be defined via treatment tab 8260, settings 8115, user's profile/health data (EHR) 241 (
With respect to response option 8424 for scheduling a tele-appointment, it is contemplated that in many instances a user may not have time to schedule a physical (in-person) appointment between receiving flare notification 8401 and the onset of the flare event. Therefore, a tele-appointment, which may include initiating a video conference with the user's healthcare provider using a camera on user computing device 8102a, provides a more timely solution for the user. Some embodiments of flare prediction, however, may forecast flares weeks in advance, in which case a physical appointment may be a suitable alternative.
Turning now to
Accordingly, various aspects of technology directed to systems and methods for detecting scratch and predicting flares are provided. It is understood that various features, sub-combinations, and modifications of the embodiments described herein are of utility and may be employed in other embodiments without reference to other features or sub-combinations. Moreover, the order and sequences of steps shown in the example methods or process are not meant to limit the scope of the present disclosure in any way, and in fact, the steps may occur in a variety of different sequences within embodiments hereof. Such variations and combinations thereof are also contemplated to be within the scope of embodiments of this disclosure.
Having described various implementations, an exemplary computing environment suitable for implementing embodiments of the disclosure is now described. With reference to
Embodiments of the disclosure may be described in the general context of computer code or machine-useable instructions, including computer-useable or computer-executable instructions, such as program modules, being executed by a computer or other machine, such as a personal data assistant, a smartphone, a tablet PC, or other handheld or wearable device, such as a smart watch. Generally, program modules, including routines, programs, objects, components, data structures, and the like, refer to code that performs particular tasks or implements particular abstract data types. Embodiments of the disclosure may be practiced in a variety of system configurations, including handheld devices, consumer electronics, general-purpose computers, or specialty computing devices. Embodiments of the disclosure may also be practiced in distributed computing environments, where tasks are performed by remote-processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
With reference to
Computing device 1200 typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by computing device 1200 and includes both volatile and nonvolatile, and removable and non-removable media. By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, Random-access memory (RAM), Read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), digital versatile disks (DVDs) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium, which can be used to store the desired information and can be accessed by computing device 1200. Computer storage media does not comprise signals per se. Communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media, such as a wired network or a direct-wired connection, and wireless media, such as acoustic, radio frequency (RF), infrared, and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
Memory 1212 includes computer storage media in the form of volatile and/or nonvolatile memory. The memory may be removable, non-removable, or a combination thereof. Exemplary hardware devices include, for example, solid-state memory, hard drives, and optical-disc drives. Computing device 1200 includes one or more processor(s) 1214 that read data from various devices such as memory 1212 or I/O components 1220. Presentation component(s) 1216 presents data indications to a user or other device. Exemplary presentation component(s) 1216 may include a display device, a speaker, a printing component, a vibrating component, and the like.
The I/O port(s) 1218 allow computing device 1200 to be logically coupled to other devices, including I/O components 1220, some of which may be built in. Illustrative components include a microphone, a joystick, a game pad, a satellite dish, a scanner, a printer, or a wireless device. The I/O components 1220 may provide a natural user interface (NUI) that processes air gestures, voice, or other physiological inputs generated by a user. In some instances, inputs may be transmitted to an appropriate network element for further processing. An NUI may implement any combination of speech recognition, touch and stylus recognition, facial recognition, biometric recognition, gesture recognition (both on screen and adjacent to the screen), air gestures, head and eye tracking, and touch recognition associated with displays on the computing device 1200. The computing device 1200 may be equipped with depth cameras, such as stereoscopic camera systems, infrared camera systems, RGB camera systems, and combinations of these, for gesture detection and recognition. Additionally, the computing device 1200 may be equipped with accelerometers or gyroscopes that enable detection of motion. The output of the accelerometers or gyroscopes may be provided to the display of the computing device 1200 to render immersive augmented reality or virtual reality.
Some embodiments of computing device 1200 may include one or more radio(s) 1224 (or similar wireless communication components). The radio(s) 1224 transmits and receives radio or wireless communications. The computing device 1200 may be a wireless terminal adapted to receive communications and media over various wireless networks. Computing device 1200 may communicate via wireless protocols, such as code division multiple access (“CDMA”), global system for mobiles (“GSM”), time division multiple access (“TDMA”), or other wireless means, to communicate with other devices. The radio communications may be a short-range connection, a long-range connection, or a combination of both. Herein, “short” and “long” types of connections do not refer to the spatial relation between two devices. Instead, these connection types are generally referring to short range and long range as different categories, or types, of connections (i.e., a primary connection and a secondary connection). A short-range connection may include, by way of example and not limitation, a Wi-Fi® connection to a device (e.g., mobile hotspot) that provides access to a wireless communications network, such as a Wireless Local Area Network (WLAN) connection using the 802.11 protocol; a Bluetooth connection to another computing device is another example of a short-range connection; or a near-field communication. A long-range connection may include a connection using, by way of example and not limitation, one or more of CDMA, General Packet Radio Service (GPRS), GSM, TDMA, and 802.16 protocols.
The following embodiments represent example aspects of concepts contemplated by the disclosure herein. Any one of the following embodiments may be combined in a multiple dependent manner to depend from one or more other embodiments. Further, any combination of embodiments that explicitly depend from a previous embodiment may be combined while staying within the scope of aspects contemplated herein. The following embodiments are illustrative in nature and are not limiting.
In some embodiments, a system for providing decision support based on scratch events, such as the systems described in any of embodiments disclosed herein, comprises: a processor; and a computer memory having computer executable instructions stored thereon for performing operations when executed by the processor. The operations comprise: receiving accelerometer data for an individual; detecting a hand movement utilizing the accelerometer data; utilizing a computerized classification model to determine, based on the accelerometer data corresponding to the hand movement, that the hand movement indicates a scratch event; and initiating one or more response actions based at least on a determination that the hand movement indicates the scratch event. Among other benefits, these embodiments may provide an assessment of pruritus with greater accuracy and reliability (as compared to conventional solutions) based on accelerometer data acquired in a way to reduce burden on the user and increase user compliance. Using computerized classification models with the accelerometer data to detect scratch events helps remove bias and subjectivity, further improving accuracy and reliability. These classifiers help to provide reliable computer decision support tools that are based on detected scratch data, thereby improving recommendations for treatment and/or responses to scratching.
In the above embodiment of the system, the operations performed by the processor executing the computer executable instructions further comprise: generating a multidimensional timeseries from the accelerometer data corresponding to the hand movement; and determining a plurality of feature values from the multidimensional timeseries. The plurality of feature values include at least one time-domain feature value and at least one frequency-domain feature value. The determination that the hand movement is the scratch event is based on the plurality of feature values.
In any combination of the above embodiments of the system, the accelerometer data is captured by a wearable device located at an appendage of the individual. For example, the wearable device may be located on a wrist, finger, and/or arm. Using a wearable device may enable continuous (or semi-continuous, periodic, as-needed, or as-it-becomes-available) data capturing that is less intrusive than other types of monitoring, which may be beneficial in monitoring individuals in populations with typically lower compliance rates, such as children.
In any combination of the above embodiments of the system, the operations performed by the processor executing the computer executable instructions further comprise determining a total sleep opportunity based on the accelerometer data. The total sleep opportunity comprises a period of time between when the individual lays down for a rest and when the individual gets up from the rest. The hand movement is detected utilizing accelerometer data corresponding to the total sleep opportunity. In this way, the scratch event detected may be considered nighttime scratching or scratching during a period in which the individual intends to sleep. This detection helps track scratching during peak pruritus time or even when an individual is unaware of the scratching. As such, scratch events detected, in accordance with embodiments of this disclosure, may provide more accurate measures of the individual's current condition (e.g., pruritus and atopic dermatitis).
In any combination of the above embodiments of the system, the accelerometer data is captured by a wearable device having a plurality of sensors, wherein the wearable device further captures at least one of near-body temperature data and light data. The total sleep opportunity is determined further based on the at least one of near-body temperature data and light data.
In any combination of the above embodiments of the system, the computerized classification model utilized to determine that the hand movement indicates the scratch event comprises at least one of an ensemble of machine learning models and a random forest classifier. For example, the computerized classification model may be an ensemble of machine learning models in which at least one model is a random forest classifier. Compared to other scratch detection approaches, such as recurrent neural network approaches, these embodiments yield results that are more interpretable and, therefore, better capable of being modified or refined for particular contexts. Additionally, these embodiments may be quicker and less computationally burdensome than other approaches.
In any combination of the above embodiments of the system, the one or more response actions comprises generating a graphic user interface element provided for display on a user device. The graphic user interface element includes at least one of an indicator of one or more scratch endpoints comprising a total number of scratch events and a total scratch duration; and an indicator recommending that the individual seek clinical consultation based on the determination that the hand movement indicates the scratch event. Scratch endpoints may represent novel digital endpoints that are useful in quantitatively and objectively measuring pruritus or, more specifically, atopic dermatitis. Further, generating the graphic user interface element to provide for display on a user device, with the scratch endpoint indicator(s) and/or the recommendation for clinical consultation, promotes better treatment compliance for the individual being monitored and enables clinicians to make informed decisions with respect to treatment.
In any combination of the above embodiments of the system, the total number of scratch events and the total scratch duration are each determined for a total sleep opportunity that is determined based on the accelerometer data received for the individual. The total sleep opportunity comprises a period of time between when the individual lays down for a rest and when the individual gets up from the rest. In this way, the scratch event detected may be considered nighttime scratching or scratching during a period in which the individual intends to sleep. This detection helps track scratching during peak pruritus time or even when an individual is unaware of the scratching. As such, scratch events detected, in accordance with embodiments of this disclosure, may provide more accurate measures of the individual's current condition (e.g., pruritus and atopic dermatitis).
In some embodiments, a method for treating pruritus utilizing a motion sensing device associated with a subject is provided. The subject may comprise a human subject for which treatment of pruritus is sought. The method may comprise: receiving accelerometer data collected from the motion sensing device; detecting a hand movement utilizing the accelerometer data; utilizing a computerized classification model to determine, based on the accelerometer data corresponding to the hand movement, that the hand movement indicates a scratch event; and, based on at least a first determination that the hand movement indicates the scratch event, initiating a treatment protocol for the subject to treat pruritus. Among other benefits, these embodiments may provide an assessment of pruritus with greater accuracy and reliability (as compared to conventional solutions) based on accelerometer data acquired in a way to reduce burden on the user and increase user compliance. Using computerized classification models with the accelerometer data to detect scratch events helps remove bias and subjectivity, further improving accuracy and reliability. These classifiers help to provide reliable computer decision support tools that are based on detected scratch data, thereby improving recommendations for treatment and/or responses to scratching. As such, these embodiments may more effectively treat and manage pruritus (including in the form of atopic dermatitis) than conventional measures.
In the above embodiment of the method, initiating the treatment protocol is further based on a plurality of determinations that a plurality of hand movements each indicate a scratch event. Initiating the treatment protocol includes determining at least one of a therapeutic agent, a dosage, and a method of administration of the therapeutic agent.
In any combination of the above embodiments of the method, the therapeutic agent is selected from the group consisting of: infliximab, adalimumab, belimumab, tanezumab, ranibizumab, bevacizumab, mepolizumab, certolizumab, natalizumab, ustekinumab, vedolizumab, 6-mercaptopurine, hydroxychloroquine, obeticholic acid, mofetil, sodium mycophenolate, leflunomide, rituxan, solumedrol, depomedrol, betamethasone, prednisone, cyclosporin, tacrolimus, pimecrolimus, dupilumab, omalizumab, tralokinumab, etokimab, nemolizumab, tezepelumab, lebrikizumab, fezakinumab, anti-OX40, efalizumab, etanercept, crisaborole, fluocinonide, mapracorat, hydrocortisone, desonide, alclometasone, triamcinolone, desoximetasone, loratidine, fexofenadine, desloratidine, levocetirizine, methapyrilene, cetirizine, budesonide, fluticasone, mometasone, dexamethasone, prednisolone, ciclesonide, beclomethasone, methotrexate, azathioprine, aspirin, ibuprofen, celecoxib, valdecoxib, WBI-1001 and/or MRX-6, abrocitinib, baricitinib, brepocitinib, cerdulatinib, decernotinib, delgocitinib, fedratinib, filgotinib, gandotinib, ilginatinib, itacitinib, lestaurtinib, momelotinib, oclacitinib, pacritinib, peficitinib, ritlecitinib, ruxolitinib, tofacitinib, upadacitinib, THRX-212401, PF-07055087, PF-06471658, PF-07055090, ATI-502, BMS-986165, JTE052, PF-06826647, SNA 152, SHR-0302, tapinarof, and/or alitretinoin.
In a preferred embodiment of any combination of the above embodiments, the therapeutic agent is selected from the group consisting of: crisaborole and abrocitinib.
In any combination of the above embodiments of the method, initiating administration of the treatment protocol includes generating a graphic user interface element provided for display on a user device. The graphic user interface element indicates a recommendation of the treatment protocol that is based on the first determination that the hand movement represents the scratch event. This embodiment helps promote better treatment compliance for the subject and enables clinicians to make informed decisions with respect to a treatment protocol for the subject.
In any combination of the above embodiments of the method, the user device is separate from the motion sensing device. For example, the user device may be a user computing device that is separate from the motion sensing device. One advantage of this embodiment is that it allows the motion sensing device to be more portable and less bulky, as it may be desirable for the display of the user device to be larger than what a wearable device permits. Additionally, in some aspects, the user device may be a clinician user device, and having it separate from the motion sensing device allows the data to be collected outside of the clinical setting, thereby improving the quality of the data and subject compliance.
In any combination of the above embodiments of the method, the method further comprises applying the treatment protocol to the subject based on the recommendation.
In any combination of the above embodiments of the method, the motion sensing device comprises a wearable device worn at an appendage of the subject. For example, the motion sensing device may be a wearable device worn at the subject's finger, wrist, or arm. Using a wearable device may enable continuous (or semi-continuous, periodic, as-needed, or as-it-becomes-available) data capturing that is less intrusive than other types of monitoring, which may be beneficial in monitoring individuals in populations with typically lower compliance rates, such as children.
In any combination of the above embodiments of the method, the subject is diagnosed with atopic dermatitis based on the determination that the hand movement indicates a scratch event, and the treatment protocol is to treat atopic dermatitis.
In some embodiments, one or more computer storage media are provided having computer-executable instructions embodied thereon that, when executed by one or more processors, cause the one or more processors to perform operations. The operations comprise: receiving accelerometer data for a subject; and causing for display, on a user device, one or more scratch endpoints for the subject based on a determination that one or more hand movements detected from the accelerometer data indicate scratch events. The subject may comprise a human subject for which treatment of pruritus is sought. Among other benefits, these embodiments may provide an assessment of pruritus with greater accuracy and reliability (as compared to conventional solutions) based on accelerometer data acquired in a way to reduce burden on the user and increase user compliance. Using computerized classification models with the accelerometer data to detect scratch events helps remove bias and subjectivity, further improving accuracy and reliability. These classifiers help to provide reliable computer decision support tools that are based on detected scratch data, thereby improving recommendations for treatment and/or responses to scratching. Further, scratch endpoints may represent novel digital endpoints that are useful in quantitatively and objectively measuring pruritus or, more specifically, atopic dermatitis. The graphic user interface element provided for display on a user device with the scratch endpoint indicator(s) promotes better treatment compliance for the individual being monitored and enables clinicians to make informed decisions with respect to treatment.
In the above embodiment of the computer storage media, accelerometer data is received from one or more sensors integrated into a wearable device that is communicatively coupled to the user device. Using a wearable device may enable continuous (or semi-continuous, periodic, as-needed, or as-it-becomes-available) data capturing that is less intrusive than other types of monitoring, which may be beneficial in monitoring individuals in populations with typically lower compliance rates, such as children.
In any combination of the above embodiments of the computer storage media, the accelerometer data is captured by sensors integrated into a first wearable device and a second wearable device worn contemporaneously by the subject.
In any combination of the above embodiments of the computer storage media, the operations further comprise causing to display, on the user device, a treatment protocol for the subject for treating atopic dermatitis, the treatment protocol being based on the one or more scratch endpoints.
Many different arrangements of the various components depicted, as well as components not shown, are possible without departing from the scope of the claims below. Embodiments of the disclosure have been described with the intent to be illustrative rather than restrictive. Alternative embodiments will become apparent to readers of this disclosure after and because of reading it. Alternative means of implementing the aforementioned can be completed without departing from the scope of the claims below. Certain features and sub-combinations are of utility and may be employed without reference to other features and sub-combinations and are contemplated within the scope of the claims.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2021/038699 | 6/23/2021 | WO |

Number | Date | Country
---|---|---
63043108 | Jun 2020 | US
63213592 | Jun 2021 | US