The present invention relates to personalized activity classification and more specifically to generating personalized health treatment strategies for diabetes management using activity tracking based on automatic activity classification.
Most new cars today have tens to hundreds of sensors that monitor virtually every physical aspect of a vehicle. This includes sensors for measuring tire pressure, fuel line pressure, lane departure, air quality, restraint engagement, stability control, proximity to other vehicles/objects, etc. Altogether, a typical new car generates more than 25 gigabytes of data an hour. Most of these new cars are connected wirelessly to networks that enable a significant portion of the 25 gigabytes of data to be analyzed by manufacturers. The data is often analyzed to identify vehicle defects, maintenance issues, or to notify an owner that service may be needed in the future. Some of the data may be provided to third-parties for population or demographic travel studies.
In comparison, very little health information is collected from individual persons. Generally, health information is only collected when someone visits a medical facility or wears a medical device. However, the collection of this medical information is limited. For example, the information is oftentimes limited in duration to a person's stay at a medical facility or limited only to certain physiological parameters measured by a medical device to facilitate its operation.
More recently, some people have begun to wear devices that monitor their activity level, commonly known as fitness trackers. These devices typically use Global Positioning System (“GPS”) data with physiological and/or force-based detection of a person's movements to determine the person's activity level. However, service providers of these devices typically only identify relatively low-level lifelog information (e.g., step count, location, or intensity of physical activity) from this collected activity information. The known service providers of fitness trackers, and application providers operable with fitness trackers, do not resolve the lower level lifelog information into higher contextual behavioral patterns, which makes analyzing and predicting people's lifestyles virtually impossible. As a result, diseases, such as diabetes, which can be controlled through lifestyle management, are often treated less effectively than possible. Ironically, vehicle manufacturers know more about a $30,000 car and how it is used than clinicians know about their priceless patients.
Diabetes affects an estimated 415 million people, or nearly one in eleven adults, globally. Experts expect this number to increase by more than 50% in the next twenty years, leading to in excess of 640 million cases worldwide. Diabetes treatments cost the U.S. alone hundreds of billions of dollars annually. About 90-95% of these people are afflicted with Type 2 diabetes, which arises due to the body's inability to produce and/or use insulin. Though researchers believe some genetic factors may contribute to its development, other known risk factors for Type 2 diabetes include excess weight and physical inactivity. If left untreated, Type 2 diabetes can lead to glaucoma, nerve damage (particularly in the extremities), renal damage, and heart failure. Despite its severity, there is currently no cure for Type 2 diabetes. Instead, it is often mitigated through control of lifestyle factors such as weight management, nutrition, adequate sleep, and exercise.
Lifestyle accordingly plays an important role in managing diabetes. Lifestyle factors (diet, exercise, stress, sleep pattern, and compliance) can impact the prevention and treatment of diabetes and its related metabolic disorders and complications. However, even patients who strictly adhere to their treatment plans often experience inexplicable blood sugar spikes throughout the day. The prevalence of diabetes and pre-diabetes is rising globally without adequate treatment systems to provide adaptive control and management for patients.
The example system, apparatus, and method disclosed herein are configured to provide for activity tracking and classification for diabetes management. The example system, apparatus, and method use a patient's lifelog data to automatically determine activity-level classifications. The activity-level classifications are displayed in a timeline and are used for comparisons to one or more recommended activities provided by medical guidelines. The system, apparatus, and method determine deviations from the guidelines to provide near real-time context-rich messages to patients to help them adhere to the medical guidelines or specified health management plans.
In an embodiment, the system, apparatus, and method disclosed herein create a multi-modal personal diabetes chronicle of a patient's daily activities by automatically integrating relevant heterogeneous sensory data from wearable internet of things (“IoT”) devices together with contextual, social, and/or environmental information to create a chronicle of events containing activities, biomarkers and/or other significant signals, and/or other information relevant to life events for people sensitive to diabetes. The system, apparatus, and method perform activity mining and/or machine learning to determine causal relationships among activities and learn relationships between them to build personal patient models. In the context of diabetes, the personal diabetes chronicle approach described herein enables root cause analysis and treatment recommendations based on the interrelations between daily activities, stress, food intake, activity level, and glucose level in diabetic patients.
As disclosed herein, the system, apparatus, and method create a personalized health management strategy for diabetes treatment. The system, apparatus, and method record lifelog data measured from a patient's activity. The lifelog data includes sensor data from smartphones, wearable devices like smart watches, and/or other IoT devices. The system, apparatus, and method holistically resolve the measured data into higher-level life events and contextual information using a multi-modal approach within a personal diabetes chronicle to build a cybernetic loop. As such, the system, apparatus, and method provide context-rich feedback and analysis of a patient's activities rather than simply evaluating only sensor data (which may not have significant meaning for a patient). The system, apparatus, and method accordingly improve a patient's compliance with diabetes or other health management plans by providing recommendations and feedback based on a patient's own activities rather than context-less sensor data or general recommendations/guidelines.
Aspects of the subject matter described herein may be useful alone or in combination with one or more other aspects described herein. Without limiting the foregoing description, in a first aspect of the present disclosure, an activity tracking and classification apparatus includes an interface configured to receive, from an application operating on a user device, lifelog data, and a memory device storing a plurality of common daily activity models for respective patients. The apparatus also includes a processor communicatively coupled to the interface and the memory device and configured to assign and synchronize the lifelog data to atomic intervals based on a time the lifelog data occurred or was recorded by the application operating on the user device. The processor is also configured to segment the atomic intervals into daily activity intervals of first-level activities by determining if consecutive atomic intervals have a similar pattern of physical activity using at least a portion of the lifelog data. The processor is further configured to select a common daily activity model from the memory device that corresponds to a patient of the user device and perform second-level activity recognition for each of the daily activity intervals using the selected common daily activity model. Additionally, the processor is configured to generate, for display at the user device or a clinician device, a personal chronicle of the recognized second-level activities.
In accordance with a second aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is configured to compare the recognized second-level activities to recommended activities for diabetes management, determine a recommendation based on the comparison, and transmit a message indicative of the recommendation to the user device to cause the patient to modify at least one of their future second-level activities for diabetes management or compliance to a prescribed routine.
In accordance with a third aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is configured to generate the personal chronicle by chronologically ordering the recognized second-level activities.
In accordance with a fourth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the first-level activities are daily activities that are determined from at least a portion of the lifelog data and include at least one of walking, being still, running, cycling, driving, direct communication, indirect communication, and using the user device.
In accordance with a fifth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the lifelog data includes location data, force data, activity data, and application data.
In accordance with a sixth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the location data includes at least one of a latitude, a longitude, a venue name, a venue type, a venue likelihood, or a point-of-interest, the force data includes at least one of acceleration data or angular acceleration data, the activity data includes at least one of an activity type, a duration, or an activity level, and the application data includes at least one of an application name, an application type, a usage duration, an indication of direct communication, an indication of remote communication, an indication of a photo or video recording, a media type, a sound setting, or calendar event information.
In accordance with a seventh aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the application is configured to record the lifelog data from at least one of application usage on the user device, GPS data on the user device, force data on the user device, a camera on the user device, a microphone on the user device, an activity tracking device that is communicatively coupled to the user device, or a sensor device that is communicatively coupled to the user device.
In accordance with an eighth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the atomic intervals have non-overlapping durations between 30 seconds, 60 seconds, 2 minutes, 5 minutes, or 10 minutes, and the daily activity intervals are non-overlapping.
In accordance with a ninth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is configured to use a binary interval growing (“BIG”) algorithm to determine whether consecutive atomic intervals have the similar pattern of physical activity, and the processor is configured to classify each atomic interval as corresponding to a physical activity of moving or non-moving in conjunction with determining the first-level activity of whether consecutive atomic intervals have the similar pattern of physical activity.
In accordance with a tenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is configured to perform the second-level activity recognition for each of the daily activity intervals by creating hierarchical groupings that organize first and second-level activities based on attributes and interrelationships, and the attributes and interrelationships include at least one of a temporal aspect, a spatial aspect, an experiential aspect, a causal aspect, a structural aspect, or an informational aspect.
In accordance with an eleventh aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the patient has diabetes or is at risk of developing diabetes, and the processor is configured to at least one of provide glucose control for the patient to help prevent hyperglycemia and hypoglycemia, optimize the patient's metabolism and body weight, enhance the patient's health for metabolic syndrome, optimize medical care delivery for the patient, and improve the patient's well-being.
In accordance with a twelfth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, a memory device storing instructions, which when executed by a processor, cause the processor to receive lifelog data associated with a patient, synchronize the lifelog data to atomic intervals based on a time the lifelog data occurred or was recorded, segment the atomic intervals into daily activity intervals of first-level activities by determining if consecutive atomic intervals have a similar pattern of physical activity using at least a portion of the lifelog data, select a common daily activity model that corresponds to a patient that is associated with the lifelog data, perform second-level activity recognition for each of the daily activity intervals using the selected common daily activity model, and generate, for display at a user device or a clinician device, a personal chronicle of the recognized second-level activities.
In accordance with a thirteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the first-level activities have a direct correspondence with at least some of the lifelog data and the second-level activities provide a greater context to an activity of the patient compared to the first-level activities.
In accordance with a fourteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the instructions, which when executed by the processor, cause the processor to use the common daily activity model of the patient to perform at least one of a Formal Concept Analysis (“FCA”) or a Bagging Formal Concept Analysis (“BFCA”) of the daily activity intervals of the first-level activities using at least a portion of the lifelog data for performing second-level activity recognition.
In accordance with a fifteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the instructions, which when executed by the processor, cause the processor to cause an application operating on the user device to display at least one user interface that prompts the patient to provide at least some of the lifelog data.
In accordance with a sixteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the instructions, which when executed by the processor, cause the processor to receive the lifelog data from at least one of an application operating on the user device, a sensor device associated with the user, or a third-party server.
In accordance with a seventeenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the instructions, which when executed by the processor, cause the processor to associate the recognized second-level activities with the respective daily activity intervals.
In accordance with an eighteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the instructions, which when executed by the processor, cause the processor to compare the recognized second-level activities to recommended activities for diabetes management, determine a recommendation based on the comparison, transmit a first message indicative of the recommendation to the clinician device, receive from the clinician device a response message indicative that the recommendation is approved, and transmit a second message indicative of the recommendation to the user device.
In accordance with a nineteenth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the instructions, which when executed by the processor, cause the processor to compare the recognized second-level activities to recommended activities for diabetes management, determine a recommendation based on the comparison, and transmit a message indicative of the recommendation to the user device.
In accordance with a twentieth aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the instructions, which when executed by the processor, cause the processor to use the recognized second-level activities to determine a time/day to transmit the message to the user device such that the recommendation relates to causing the patient to change at least one of the recognized second-level activities in the future at a time/day the patient normally performs the recognized second-level activity.
In accordance with a twenty-first aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the instructions, which when executed by the processor, cause the processor to calculate derivative lifelog data from at least a portion of the lifelog data, the derivative lifelog data comprising a mathematical combination of the portion of the lifelog data, segment the atomic intervals into daily activity intervals using additionally at least a portion of the derivative lifelog data, and perform the second-level activity recognition for each of the daily activity intervals using the selected common daily activity model in conjunction with interrelations among at least a portion of the lifelog data and at least a portion of the derivative lifelog data.
In accordance with a twenty-second aspect of the present disclosure, which may be used in combination with any other aspect listed herein unless stated otherwise, the processor is located on at least one of a user device or a server.
In a twenty-third aspect of the present disclosure, any of the structure and functionality disclosed in connection with
In light of the present disclosure and the above aspects, it is therefore an advantage of the present disclosure to provide an improved system, method, and apparatus for determining context-rich classifications of a patient's activities based on data measured by one or more sensors.
It is another advantage of the present disclosure to determine in near real-time if a patient is complying with a health treatment plan using daily activity intervals specified by context-rich classifications of a patient's activities.
It is yet another advantage of the present disclosure to provide near real-time health recommendations to patients that are timely based on their current circumstances for better adherence to a health treatment plan.
The advantages discussed herein may be found in one, or some, and perhaps not all of the embodiments disclosed herein. Additional features and advantages are described herein, and will be apparent from, the following Detailed Description and the figures.
The system, method, and apparatus disclosed herein are directed to activity tracking and classification for diabetes management. The example system, method, and apparatus are configured to use a patient's log of daily activities and/or measured sensor data to determine the patient's compliance with activity recommendations or plans for managing diabetes, such as the recommendations provided by the American Diabetes Association. The recommendations are configured to be provided in real-time or near-real time to correspond to activities that a patient is currently performing or about to perform. The timeliness of the recommendations is configured to improve a patient's adherence to diabetes management.
In an example, the system, method, and apparatus are configured to determine or predict that a patient is currently watching television. The system, method, and apparatus may make the determination or prediction based on received lifelog data and/or a previous analysis of the patient's lifelog data from previous days/weeks/months. After determining the patient is watching television, the system, method, and apparatus determine that the patient could improve compliance with diabetes management by instead performing a physical activity, such as walking. The system, method, and apparatus may determine that the patient has had relatively little physical activity for the day, thereby warranting a recommendation. Further, the system, method, and apparatus may determine that the patient's blood sugar level is appropriate for a walk and that the weather at the patient's location is conducive for walking (e.g., it is not raining/snowing or too cold). Accordingly, as discussed herein, the system, method, and apparatus are configured to transmit a message to a device of the patient with a prompt indicating that their diabetes management could be improved by going outside for a short walk rather than watching television. Since the recommendation reflects the actual situation of the patient, the patient is more likely to follow the advice, thereby improving their diabetes management.
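For illustration only, the context checks described in this example can be sketched as a simple rule. Every threshold, field name, and message below is an assumption for the sketch, not a value taken from the disclosure:

```python
def walk_recommendation(current_activity, steps_today, glucose_mg_dl, weather):
    """Return a context-rich recommendation message, or None if no
    recommendation is warranted. Thresholds are illustrative assumptions."""
    if (current_activity == "watching_television"
            and steps_today < 3000            # relatively little activity today
            and 80 <= glucose_mg_dl <= 180    # assumed safe range for a light walk
            and weather == "clear"):          # conducive weather
        return ("Your activity has been low today and conditions are good: "
                "a short walk now would help your diabetes management.")
    return None
```

A production system would draw these inputs from the recognized second-level activities, the patient's sensor data, and a weather service rather than from function arguments.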
To provide recommendations, the example system, method, and apparatus are trained or otherwise configured to classify a patient's daily activities based on lifelog data. As disclosed herein, lifelog data does not provide a direct correlation to a patient's higher-context activities. Instead, as disclosed herein, lifelog data corresponds to data measured by a sensor in proximity of a patient and/or collected by a background logging application of a smartphone that is indicative of a patient's physical location, movement, and/or device usage. The lifelog data may include, for example, smartphone usage data indicative of which applications are being used on the patient's smartphone. The lifelog data may also include location data, activity data, and force data that are detected by one or more sensors in the patient's phone or an activity tracking device/band. The lifelog data may also include information entered by a patient into an application, such as food consumption information and/or a patient's blood glucose levels. The lifelog data accordingly includes multimodal data streams that provide information indicative of an activity level state of a patient.
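A minimal sketch of one multimodal lifelog sample is shown below; the field names are illustrative assumptions chosen to mirror the data categories listed above, not a schema from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LifelogSample:
    """One timestamped multimodal lifelog record (field names are assumptions)."""
    timestamp: float                          # Unix seconds when recorded
    latitude: Optional[float] = None          # location data
    longitude: Optional[float] = None
    accel_magnitude: Optional[float] = None   # force data (m/s^2)
    activity_type: Optional[str] = None       # e.g. "walking", "still"
    app_name: Optional[str] = None            # foreground application, if any
    glucose_mg_dl: Optional[float] = None     # patient-entered biomarker

sample = LifelogSample(timestamp=1_700_000_000.0,
                       activity_type="walking",
                       accel_magnitude=10.2)
```

Optional fields reflect that each stream (location, force, application usage, patient entries) is recorded independently, so any given sample carries only a subset of the modalities.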
The example system, method, and apparatus are configured to partition and synchronize the lifelog data into atomic time intervals, such as overlapping or non-overlapping time durations of 30 seconds, 60 seconds, 2 minutes, 5 minutes, or 10 minutes. The system, method, and apparatus then provide first-level activity classification of the lifelog data by determining which of the sequential atomic intervals correspond to a same activity. To do this, the system, method, and apparatus segment the atomic intervals into daily activity intervals that correspond to first-level activities. The system, method, and apparatus execute one or more models, routines, and/or algorithms that use relationships between the first-level activities and/or the lifelog data for the corresponding atomic interval(s) to determine second-level activities and/or third-level activities. In some instances, the method, system, and apparatus disclosed herein analyze first-level activities and their temporal, causal, spatial, experiential, informational, and/or structural aspects between the first-level activities, which may be determined, at least in part, via analysis and/or correlation of the lifelog data for common atomic intervals. The example system, method, and apparatus use the second-level activities and/or third-level activities for determining a patient's compliance for diabetes management and/or determining which recommendations are to be transmitted to a patient at certain days/times.
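The partitioning and segmentation steps above can be sketched as follows. The 60-second atomic interval and the majority-label merge rule are illustrative assumptions; this sketch is not the disclosed binary interval growing (“BIG”) algorithm:

```python
from collections import Counter

ATOMIC_SECONDS = 60  # assumed atomic-interval duration

def to_atomic_intervals(samples):
    """Bucket (timestamp, label) samples into non-overlapping atomic
    intervals, labeling each interval with its majority activity label."""
    buckets = {}
    for ts, label in samples:
        buckets.setdefault(int(ts // ATOMIC_SECONDS), []).append(label)
    return [(idx, Counter(labels).most_common(1)[0][0])
            for idx, labels in sorted(buckets.items())]

def segment(intervals):
    """Merge consecutive atomic intervals sharing a label into daily
    activity intervals, returned as (start_index, end_index, label)."""
    segments = []
    for idx, label in intervals:
        if segments and segments[-1][2] == label and idx == segments[-1][1] + 1:
            segments[-1] = (segments[-1][0], idx, label)  # grow current interval
        else:
            segments.append((idx, idx, label))            # start a new interval
    return segments
```

For example, samples labeled "still" for the first minute and "walking" for the next two minutes would yield one "still" daily activity interval followed by one "walking" interval.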
The example system, method, and apparatus may use the second-level activities and/or third-level activities for generating a chronological order of a patient's daily activities. The system, method, and apparatus may compare the chronological order to recommended activities to determine patient compliance with a diabetes management plan. Further, the system, method, and apparatus may provide a graphical representation of the chronological order for viewing by a patient or clinician to help improve adherence to a diabetes management plan.
As described herein, first-level activities are daily patient activities that can be automatically recognized or have a strong correlation with the collected lifelog data. The first-level activities may include, for example, being still, walking, running, cycling, driving, direct communication, remote communication, or using a smartphone. In an example, activity tracking lifelog data provides a direct indication as to whether a patient is being still, walking, or running. In another example, smartphone usage information provides a direct indication of direct communication, remote communication, and/or using a smartphone. The second-level activities correspond to a patient's daily activities provided at a medium or high level of context relative to a first-level activity. The second-level activities may include working, commuting, exercising, religious event, shopping, eating, using a toilet, and a home event. It should be appreciated that the second-level activities can correspond to a single first-level activity or a combination of first-level activities that have high correlations that are specific to that patient. For example, a first patient may ride their bike to work, where first-level activities of ‘being still’ and ‘cycling’ correspond to the second-level activity of ‘commuting’. In this example, a second patient may instead drive their car to work. Here, for this second patient, the first-level activity of ‘driving’ corresponds to the second-level activity of ‘commuting’.
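The two-patient commuting example above can be sketched as a patient-specific lookup from first-level activity sequences to a second-level activity. The mapping table is an illustrative assumption standing in for a learned common daily activity model:

```python
# Hypothetical per-patient models mapping an observed sequence of
# first-level activities to a second-level activity.
PATIENT_MODELS = {
    "patient_1": {("still", "cycling"): "commuting"},   # rides a bike to work
    "patient_2": {("driving",): "commuting"},           # drives a car to work
}

def second_level(patient_id, first_level_sequence):
    """Resolve a first-level activity sequence into a second-level activity
    using the patient's model; unknown sequences remain unclassified."""
    model = PATIENT_MODELS.get(patient_id, {})
    return model.get(tuple(first_level_sequence), "unclassified")
```

The key point the sketch illustrates is that the same second-level activity (‘commuting’) is reached from different first-level evidence for different patients.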
The third-level activities correspond to a patient's daily activities provided at a high level of context relative to a first-level activity and a second-level activity. Third-level activities for the second-level activity of ‘home event’ may include watching television, preparing food, socializing, housework, intimate relations, relaxing, taking a break, and sleeping. The first, second, and third-level activities described herein were originally proposed by Kahneman et al. in an article titled “A survey method for characterizing daily life experience: The day reconstruction method”, Science 306, 5702 (2004), pages 1776-1780. Table 1 below shows an example of the activities for each of the first, second and third-level activities. In other examples, the method, apparatus, and system described herein may use similar or different first, second, and third-level activities.
For the second-level activities of Table 1, ‘working’ corresponds to a job performed at a workplace by a patient, ‘commuting’ corresponds to an activity of traveling regularly between work and home, ‘exercising’ corresponds to the activity of performing physical actions to make or keep a body healthy, ‘religious event’ corresponds to an activity occurring at a religious place or place of reflection, ‘shopping’ corresponds to an activity of browsing and purchasing goods or services at a store or online, ‘eating’ corresponds to an activity of consuming food, ‘using toilet’ corresponds to an activity of going to the bathroom, and ‘home event’ corresponds to an activity occurring in a structure in which the patient resides, and does not include ‘using toilet’, ‘eating’, ‘shopping’, ‘exercising’ or ‘working’. In other embodiments, additional, fewer, or different second-level activities may be classified from a patient's lifelog data.
The system, method, and apparatus described herein are configured to model a patient to provide personalized health management, particularly for diabetes. The modeling of a patient provides an objective correlation between high-level data abstractions that relate to a patient's life experiences, behavioral patterns, and/or their feelings. This enables quantification of information for a patient's time usage and frequency for daily activities to determine a patient's stress level, pleasure responses, and other affective states, which may be reflected numerically for correlation with other relational identifiers.
It should be appreciated that to automatically quantify the daily activity of a patient, the recognition method provided by the system, method, and apparatus disclosed herein is unobtrusive and effortless by providing tracking via common devices such as smartphones, tablet computers, fitness trackers, glucose monitors, etc. As such, the system, method, and apparatus disclosed herein are configured to refrain from intervening in a patient's life patterns or activities by pushing them to do something or putting them in specific situations in order to recognize their daily activity. However, this creates a potential problem of not always capturing a patient's activities, especially subjective activities that do not have a high correlation to lifelog data recorded by a smartphone or fitness tracker.
As described above, the example system, method, and apparatus overcome these potential problems by building patient models using daily activity intervals for classifying every atomic interval into a daily activity. The system, method, and apparatus model a patient's daily activities in a timeline or other chronological graph as objects in two-dimensional pixel space, where objects correspond to daily activities that are determined by a correlation between the times/pixels. The system, method, and apparatus may use one or more interval growing techniques for determining daily activity intervals and related attributes/aspects (e.g., parameters of lifelog data). The system, method, and apparatus are configured to label the activity intervals as the daily activities using, for example, a Bagging Formal Concept Analysis (“BFCA”). The system, method, and apparatus may then build a personal chronicle represented as events of the daily activities for viewing by the patient and/or a clinician.
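BFCA builds concept lattices over activity attributes, which is beyond the scope of a short sketch. The sketch below instead substitutes a simple bagging majority vote over weak nearest-neighbour classifiers to illustrate the ensemble-labeling idea only; every name and parameter in it is an assumption, and it is not the disclosed BFCA procedure:

```python
import random

def bagged_label(interval_features, train_set, n_bags=5, seed=0):
    """Label an activity interval by majority vote over n_bags classifiers,
    each fit to a bootstrap resample of (feature_vector, label) examples.
    A stand-in illustration for ensemble labeling, not BFCA itself."""
    rng = random.Random(seed)
    votes = []
    for _ in range(n_bags):
        # Bootstrap resample of the training examples.
        bag = [rng.choice(train_set) for _ in train_set]
        # Weak learner: 1-nearest-neighbour by squared Euclidean distance.
        nearest = min(bag, key=lambda ex: sum(
            (a - b) ** 2 for a, b in zip(ex[0], interval_features)))
        votes.append(nearest[1])
    return max(set(votes), key=votes.count)
```

Each "bag" sees a slightly different training sample, so the final label is more robust to noisy lifelog features than any single weak classifier.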
It should be appreciated that the use of atomic intervals for resolving first-level daily activities enables second-level and/or third-level daily activities to be automatically and appropriately classified. While there are known activity tracking applications and service providers, their lifelog data and analysis are based only on first-level activity data due to the direct correlation. These known activity tracking applications and service providers do not provide second-level activity classification. Further, these known activity tracking applications and service providers do not provide daily activity recognition based on analyzing physical activity patterns by using non-visual smartphone lifelog data and determining daily activity intervals for recognizing daily activity. Accordingly, known activity tracking applications and service providers do not identify atomic-level daily activities for recognizing higher cognitive and more abstract levels of activities.
The example user device 102 may include a smartphone, cellular phone, tablet computer, laptop computer, personal computer, workstation, smartwatch, smart-eyewear, etc. The application server 104 may include a processor, a group of processors, a controller, a microcontroller, a database, etc. for receiving/storing data, performing computations, and outputting data. The network 106 may include any wired and/or wireless network including the Internet, an Ethernet, a cellular network, or combinations thereof.
In the illustrated example, the user device 102 is communicatively coupled to an external sensor device 108, which may be included in, for example, a fitness tracking device or bracelet. For example, the sensor device 108 may include a fitness tracker from Fitbit, Inc. The external sensor device 108 may include one or more physiological sensors and be configured to measure physiological and/or physical lifelog data of a patient including heartbeat lifelog data, weight lifelog data, blood pressure lifelog data, a number of steps taken lifelog data, a pace of steps taken lifelog data, breathing lifelog data, GPS lifelog data, glucose level lifelog data, sleep state lifelog data, etc. The user device 102 may be wired or wirelessly coupled to the sensor device 108 via, for example, a USB® connection, a Bluetooth® connection, a Lightning® connection, an NFC connection, etc.
While
In some embodiments, the activity tracking and classification system 100 of
In addition to user devices 102 and 120, the example activity tracking and classification system 100 of
The processor 202 is configured to execute instructions stored in the memory 210, including instructions for the application 110, which is configured to record and/or compile lifelog data 212. The application 110 performs operations based on execution of the one or more instructions by the processor 202 of the user device 102. The application 110 is configured to request or otherwise receive physiological and/or physical lifelog data 212 from data processed on the user device 102 (and/or other applications operating on the user device 102) and/or the sensor device 108.
In some examples, the application 110 is configured to use the network interface 204 for connecting to one or more interfaces (e.g., application programming interfaces (“APIs”)) at the application server 104 for transmitting collected lifelog data 212. The network interface 204 may include a transceiver and/or port for transmitting and receiving data via the Internet, an Ethernet, a cellular network, etc. In some instances, the application 110 may transmit the lifelog data 212 in data streams as the data is collected/received. In other instances, the application 110 may transmit the lifelog data 212 at periodic intervals, which may correspond to the atomic intervals disclosed herein. In yet other instances, the application 110 may be configured to transmit the lifelog data 212 at designated times, such as at the end of a day or upon request by the application server 104.
The example user device 102 may include one or more sensors 206 for measuring at least some of the lifelog data 212. For instance, the user device 102 may include a GPS sensor 206 for determining a latitude and longitude of a patient. The user device 102 may also include a six degree-of-freedom force sensor 206 to detect linear and angular acceleration. The user device 102 may further include a temperature sensor, a moisture sensor, a camera, a microphone, etc. It should be appreciated that the user device 102 and/or the sensor device 108 may include virtually any sensor to measure a parameter or characteristic related to a patient for generating lifelog data.
The application 110 may communicate with registers in the memory 210 and/or processing routines operating on the processor 202 of the user device 102 that store at least a portion of data that can be used as the lifelog data 212. For example, the application 110 may obtain acceleration data from registers configured to store data from a six degree-of-freedom sensor and obtain GPS data from a register configured to store GPS data. The application 110 may also communicate with the sensor device interface 208 to receive the corresponding lifelog data. The sensor device interface 208 may include a transceiver for communicatively coupling with the sensor device 108. The sensor device interface 208 may include, for example, a Bluetooth® interface, an RF interface, an NFC interface, a USB® interface, or a Lightning® interface.
In addition to obtaining sensor data, the example application 110 is configured to acquire, as the lifelog data 212, device application data of the user device 102. The device application data may include, for example, an application name/type used by a patient, a usage duration, an indication of direct communication, an indication of remote communication, an indication of a photo or video recording, a media type, a sound setting, or calendar event information. In some instances, the application 110 is configured to operate in a background of the user device 102 to record how a patient uses the device 102. In other examples, the application 110 accesses a task manager to obtain information about application, process, or service usage. Regarding direct/remote communication monitoring, the application 110 may poll a microphone to detect instances in which a patient directly communicates with others or communicates via a phone or text messaging application (without making any recording of the patient).
In some embodiments, the application 110 may communicate with third-party applications on the user device 102. For example, the application 110 may communicate with a mapping application to determine location information that corresponds to GPS coordinates. The application 110 may supplement location information with dead-reckoning information from a six degree-of-freedom sensor to estimate a patient location when a GPS signal is not available, such as when a patient travels indoors. In another example, the application 110 may communicate with a health monitor application on the user device 102 to obtain raw and/or calculated health information provided by the application. In this manner, the application 110 takes advantage of the presence of third-party health monitors or tracking applications to provide a more comprehensive set of the lifelog data 212. For example, the user device 102 may include a step counter application that interfaces with a six degree-of-freedom sensor and/or GPS sensor to estimate a number of steps and a pacing of a patient. Instead of collecting this information separately, the application 110 may be configured to interface with the third-party health monitoring application for collecting the lifelog data for storage in the memory 210 as lifelog data 212.
In addition to the automatic collection of lifelog data, the example application 110 may include one or more user interfaces configured to enable a patient to enter certain lifelog data. The application 110 is configured to store the information entered by the patient into the memory 210 as the lifelog data 212.
In some examples, the application 110 is configured to cause the user interfaces of
In addition to providing user interfaces for the collection of lifelog data 212, the example application 110 is configured to provide a display of a patient's lifelog data.
The example user interface 900 also includes a section 902 to enable a patient to view recommendations. The recommendations may be determined by the application 110 using the lifelog data 212 in conjunction with the classification analysis disclosed herein. Additionally or alternatively, the recommendations may be received via electronic messages from the application server 104. In these instances, the application server 104 uses the lifelog data 212 to determine a classification of activities over a timeline for determining which recommendations are to be generated. As shown in
The example application server 104 is configured to provide automatic activity tracking and classification and overcome limitations of known activity tracking systems by automatically characterizing a patient's daily activities to provide a useful personal chronicle or timeline. The example application server 104 includes a data interface 1002 for receiving the lifelog data 212 from the user devices 102 and 120 and/or third-party systems. The data interface 1002 includes one or more APIs for receiving the lifelog data 212 that is stored in the memory 210 of the user device 102 or provided via the web browser application 122 of the user device 120. The data interface 1002 determines a type of lifelog data based on which API received the data and/or using labels and/or metadata transmitted with the data. The data interface 1002 may, in some embodiments, normalize or convert the data into a common format for processing. The data interface 1002 may store the lifelog data to a memory device 1004. In some embodiments, the data interface 1002 is configured to receive the lifelog data at periodic and/or specified intervals. In other instances, the data interface 1002 may transmit a request to the application 110 on the user device 102 to receive the lifelog data 212 that has been stored since the previous transmission of lifelog data.
The example data interface 1002 may also receive lifelog data 212 from third-party applications 1106. The third-party applications 1106 may be configured to collect lifelog data from a related device, such as a fitness tracker, a glucose meter, or a location tracker. The third-party applications 1106 may also determine first-level activity data from the lifelog data 212. A patient may register with the application server 104 to request that the lifelog data 212 be transmitted from the third-party application 1106. Additionally or alternatively, the patient may register with the third-party application 1106 for transmission of the respective lifelog data 212 to the application server 104.
As shown in
In some embodiments, the application 110 and/or the application server 104 is configured to resolve GPS data into a particular venue type or name. If multiple venues are located in the same area, the application 110 and/or the application server 104 is configured to determine a probability of venue likelihood based, for example, on the lifelog data and past locations of the patient. Regarding activity level, the example application 110 and/or the application server 104 is configured to calculate an activity level for each atomic interval (described below), which may be based on a ratio of time active versus time sitting or standing. It should be appreciated from
Returning to
The example synchronization engine 1006 is configured to read timestamps associated with the lifelog data 212 for assigning the data to atomic intervals. The synchronization engine 1006 then creates a data structure for storage in the memory device 1004 for each of the atomic intervals. Table 2 below shows an example of a data structure of atomic intervals created by the synchronization engine 1006. In the illustrated example, the activity type corresponds to a first-level activity, and may be written to the data structure after classification by a first-level activity processor 1008. In the illustrated example, activity type ‘a1’ corresponds to ‘being still’, ‘a2’ corresponds to ‘walking’, ‘a3’ corresponds to ‘running’, ‘a4’ corresponds to ‘bicycling’, and ‘a5’ corresponds to ‘driving’.
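For illustration only, the atomic-interval data structure of Table 2 can be sketched as follows; the field names, the five-minute default duration, and the example lifelog keys are assumptions rather than the exact schema created by the synchronization engine 1006.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical first-level activity codes mirroring the example above.
ACTIVITY_TYPES = {
    "a1": "being still",
    "a2": "walking",
    "a3": "running",
    "a4": "bicycling",
    "a5": "driving",
}

@dataclass
class AtomicInterval:
    """One fixed-length (e.g., five-minute) slice of synchronized lifelog data."""
    start_ts: int                                # timestamp of interval start
    duration_s: int = 300                        # atomic interval length in seconds
    activity_type: Optional[str] = None          # e.g., "a2"; written after classification
    lifelog: dict = field(default_factory=dict)  # e.g., {"steps": 412, "venue": "work"}

    def label(self) -> Optional[str]:
        """Resolve the stored activity code to its human-readable name."""
        return ACTIVITY_TYPES.get(self.activity_type)

# Example: an interval later labeled by the first-level activity processor.
iv = AtomicInterval(start_ts=1700000000, lifelog={"steps": 412})
iv.activity_type = "a2"
```

Keeping the activity type as a nullable field reflects that the synchronization engine stores the interval first and the first-level classification is written afterward.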
After the lifelog data 212 is synchronized in one or more atomic intervals, the first-level activity processor 1008 is configured to analyze the atomic intervals for classification into first-level activities. The first-level activity processor 1008 performs this classification by segmenting the atomic intervals into daily activity intervals of first-level activities (shown in
In some examples, the first-level activity processor 1008 is configured to search for indications of pattern changes of physical activities that correspond to changes in other attributes, which are considered as ending one daily activity and starting another. For example, a patient may have been working in their office and sitting in a chair. After ten minutes, the patient moves towards the cafeteria for lunch. The first-level activity processor 1008 is configured to detect this change of physical activity, segment the movement, and make a daily activity interval by segmenting between the time spent sitting at the chair, the time spent walking to lunch, and the time spent eating lunch.
In the example shown in
In some embodiments, the first-level activity processor 1008 is configured with a binary interval growing (“BIG”) algorithm for resolving or classifying atomic intervals into larger activity intervals. The example BIG algorithm is configured to analyze sequential atomic intervals and group similar atomic intervals together to form a daily activity interval. For example, several sequential atomic intervals consisting of ‘walking’ can be grouped together into a single, longer daily activity interval, where any changes in physical activity can be determined as a change in daily activity.
The BIG algorithm, shown below as Algorithm 1, is configured to classify each atomic interval into a moving or non-moving type of interval. The BIG algorithm then processes each interval as a moving or non-moving interval. The BIG algorithm shown below starts with a seed or first atomic interval S1 and continues calculating δ(i) for each atomic interval (e.g., five minutes) to determine a similarity between the sequential atomic intervals. δ(i) may be represented by Equation (1) below:
δ(i)=∥f(S′j)−f(I′i)∥₂² (1)
where S′j is {lj, tj}, I′i is {li, ti}, f(x) is a classification algorithm for classifying the non-moving (‘0’) or moving (‘1’) type of atomic interval, and δ(i) is a distance between S′j and I′i. In this embodiment, the first-level activity processor 1008 is configured to segment atomic intervals when δ(i) is equal to a value of ‘1’, and create a daily activity interval by segmenting from the seed interval to the current interval. For example, if the type of the seed atomic interval is non-moving, then f(S′j) is equal to a value of ‘0’. After a duration of the atomic interval, if the type of the current atomic interval is also non-moving, f(I′i) is also equal to a value of ‘0’, and thus δ(i) is also equal to a value of ‘0’. However, after a duration of another atomic interval, if the current atomic interval is moving, f(I′i) is equal to a value of ‘1’, and δ(i) will have a value of ‘1’. The first-level activity processor 1008 is configured to segment at this moment, create a daily activity interval by segmenting from the seed interval to the current interval, and repeat this process for each subsequent atomic interval.
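The segmentation behavior described by Equation (1) can be sketched as follows; the step-count threshold classifier stands in for f(x) and is purely an assumption for illustration, not the disclosed classifier.

```python
from typing import Callable, List, Tuple

def binary_interval_growing(
    intervals: List[dict],
    classify: Callable[[dict], int],  # f(x): 0 = non-moving, 1 = moving
) -> List[Tuple[int, int]]:
    """Sketch of the BIG algorithm: grow a daily activity interval from a seed
    atomic interval while sequential intervals share the same moving/non-moving
    type; segment when delta(i) = |f(seed) - f(current)| equals 1.
    Returns (start_index, end_index) pairs over the input list."""
    segments: List[Tuple[int, int]] = []
    if not intervals:
        return segments
    seed_idx = 0
    seed_type = classify(intervals[0])
    for i in range(1, len(intervals)):
        # For binary labels, |a - b| equals the squared L2 distance of Eq. (1).
        delta = abs(seed_type - classify(intervals[i]))
        if delta == 1:
            segments.append((seed_idx, i - 1))               # close the grown interval
            seed_idx, seed_type = i, classify(intervals[i])  # start a new seed
    segments.append((seed_idx, len(intervals) - 1))
    return segments

# Example with a hypothetical threshold-on-steps classifier.
data = [{"steps": 0}, {"steps": 3}, {"steps": 410}, {"steps": 388}, {"steps": 1}]
segs = binary_interval_growing(data, lambda iv: 1 if iv["steps"] > 50 else 0)
# segs -> [(0, 1), (2, 3), (4, 4)]
```

The grouped index ranges correspond to the daily activity intervals that the first-level activity processor 1008 subsequently labels.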
The example first-level activity processor 1008 analyzes the daily activity intervals to determine a first-level classification. As discussed above, there is a relatively strong correlation between a first-level activity and the lifelog data 212 associated with a particular daily activity interval. The first-level activity processor 1008 is configured to determine one or more first-level activities based on the corresponding lifelog data 212, as shown above in Table 2. For example, pacing data or step counter data indicative of no movement is determined by the first-level activity processor 1008 to correspond to a first-level activity of ‘being still’ while pacing data, step counter data, or heart rate data indicative of moving is determined by the first-level activity processor 1008 to correspond to a first-level activity of ‘running’ or ‘walking’ based, for example, on a pacing of the steps. Moreover, the first-level activity processor 1008 may use application usage lifelog data 212 to determine a first-level activity of ‘direct communication’, ‘indirect communication’, and/or ‘on a smartphone’.
After daily activity intervals are determined, a second-level activity processor 1012 of the application server 104 is configured to determine a second-level classification. The second-level activity processor 1012 is configured to use a common daily activity model that is configured for the specific patient. The model may be stored, for example, in a memory device 1014 and have correlations between daily activities and temporal, causal, spatial, experiential, informational, and/or structural aspects between the first-level activities based on the lifelog data 212 that is specific for a patient. A patient model engine 1016 may create each model for a patient by identifying trends and correlations among months of patient lifelog data 212. The patient model engine 1016 bases each patient model on a global common daily activity model that is trained using data from a number of patients that provide second-level activity feedback during a training session, described in more detail below.
Each patient common daily activity model identifies and/or weighs unique properties or attributes of each first-level activity and/or lifelog data 212. The patient model incorporates several fundamental aspects of activities, such as temporal, spatial, experiential, causal, structural, and informational aspects. In other words, the patient model takes into account physical (e.g., activity occurrence timestamp and interval), logical (e.g., temporal domain), and relative relationships (e.g., temporal relationships to other activities) between aspects of lifelog data and corresponding first and second-level activities. In some examples, the patient model specifies which portion of the lifelog data 212 is to be used for each of the temporal, causal, spatial, experiential, informational, and/or structural aspects or attributes.
Each of the patient common daily activity models may be generated by the patient model engine 1016, from a global common daily activity model, using, for example, a Formal Concept Analysis (“FCA”) that identifies and determines correlations among the temporal, causal, spatial, experiential, informational, and/or structural aspects or attributes. It should be appreciated that FCA is a powerful technique when data sources are limited or uncertain as a result of FCA's ability to discover implicit information based on pre-defined binary relationships between lifelog data and aspects/attributes. As such, FCA provides for models that have hierarchical groupings that organize activities based on their attributes and interrelationships. Additionally or alternatively, the patient model engine 1016 may use Bagging Formal Concept Analysis (“BFCA”) for creating patient common daily activity models that classify second-level activities from first-level daily activity intervals.
The second-level activity processor 1012 is configured to use patient common daily activity models based on FCA and/or BFCA for performing second-level activity recognition. This includes taking segmented groups of first-level daily activities, such as ‘walking’ or ‘being still’, and identifying their second-level higher order meanings, such as ‘walking to work’ or ‘working’. Under this approach, the second-level activity processor 1012 is configured to represent each daily activity D as a triplet T=(D, A, R), where A is a set of aspects/attributes, and R is the binary relationships between D and A, where R⊆D×A. Once each daily activity is defined by a triplet, the second-level activity processor 1012 is configured to convert the triplet T into a cross table (e.g., Table 3 below).
The second-level activity processor 1012 is then configured to extract all possible formal concepts (Xi, Yi), where Xi⊆Di, and Yi⊆Ai, from the cross table. The second-level activity processor 1012 is then configured to set up the possible formal concepts as nodes in a concept lattice, which is a graphical representation of the partially ordered knowledge. The hierarchy of formal concepts can be constructed by the following relations shown in Equation (2) below:
(X1,Y1)≤(X2,Y2), if X1⊆X2↔Y1⊇Y2 (2)
Where Xi and Yi satisfy the following relations in Equations (3) and (4):
X′i={ai∈Ai|∀di∈Xi,(di,ai)∈Ri} (3)
Y′i={di∈Di|∀ai∈Yi,(di,ai)∈Ri} (4)
Table 3 shows simplified relationships between first and second-level daily activity and their aspects/attributes. In the example, the second-level activity processor 1012 uses the patient common daily activity model to derive cross-table relationships, such as (Working, {Medium time-duration, Work}), (Using Toilet, {Walking, Work}), (Commuting, {Walking, Medium time-duration}), ({Working, Using Toilet}, Work), ({Working, Commuting}, Medium time-duration), and ({Using Toilet, Commuting}, Walking). The second-level activity processor 1012 uses these formal activity pairs as each node in a concept lattice, where their hierarchy is determined using Equation (2) above.
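A minimal sketch of formal-concept extraction from the simplified Table 3 cross table, applying the derivation operators of Equations (3) and (4); the activity and attribute names follow the example above, and the brute-force enumeration is illustrative rather than the disclosed implementation.

```python
from itertools import combinations

# Cross table from the simplified Table 3 example: activity -> attribute set.
cross = {
    "Working":      {"Medium time-duration", "Work"},
    "Using Toilet": {"Walking", "Work"},
    "Commuting":    {"Walking", "Medium time-duration"},
}
activities = set(cross)
attributes = set().union(*cross.values())

def up(X):
    """Equation (3): attributes shared by every activity in X."""
    return set.intersection(*(cross[d] for d in X)) if X else set(attributes)

def down(Y):
    """Equation (4): activities possessing every attribute in Y."""
    return {d for d in activities if Y <= cross[d]}

def formal_concepts():
    """Enumerate all (extent, intent) pairs that are closed, i.e. formal
    concepts; these become the nodes of the concept lattice, ordered by
    Equation (2)."""
    found = set()
    for r in range(len(activities) + 1):
        for X in combinations(sorted(activities), r):
            intent = up(set(X))
            extent = down(intent)     # closure of X
            found.add((frozenset(extent), frozenset(intent)))
    return found

concepts = formal_concepts()
# Includes, e.g., ({'Working'}, {'Medium time-duration', 'Work'}) and
# ({'Working', 'Commuting'}, {'Medium time-duration'}).
```

These pairs match the cross-table relationships listed above, such as ({Working, Commuting}, Medium time-duration).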
As provided above, the second-level activity processor 1012 uses FCA in conjunction with a patient common daily activity model to determine an expected second-level activity result depending on a structural similarity between an input attribute set and pre-defined attribute sets for first-level activities. Thus, different kinds of input attributes can significantly affect the structural similarities. Because of these effects, the example second-level activity processor 1012 may estimate which attributes are key to separating each different daily second-level activity, and locate all unique daily activity structures composed of those attributes.
It should be appreciated that FCA generally does not provide for an embedded statistical analysis, and instead depends on a structural similarity between activities without taking into account various probabilities of each activity. For example, a patient whose user device 102 records lifelog data 212 indicative of a first-level activity of ‘being still’ at home for eight hours straight (i.e., 12 am-8 am) is likely sleeping. However, the second-level activity processor 1012 using FCA may also classify the first-level activity as other second-level activities such as ‘eating’ or ‘relaxing’ since the processor 1012 has no way of weighting these activities by their respective probabilities. To combat this potential ambiguity, the second-level activity processor 1012 may also use BFCA, which applies an ensemble approach to FCA. When also using BFCA, the second-level activity processor 1012 uses FCA with second-level classifications weighted by their likelihood of occurring, based on previously collected trends and typical activity aspects/attributes. For example, second-level activities such as ‘sleep’ and ‘working’ comprise the majority of a patient's day, and so are weighted more heavily than other second-level activities such as ‘eating.’
The second-level activity processor 1012 may use BFCA to categorize the labeled first-level daily activity intervals and create a number n of classifiers, where n is the number of recognizable second-level activities. In addition, using BFCA, m corresponds to a number of bags per classifier, where, for each bag, p/3 randomly selected attributes are provided, where p is the total number of attributes. The second-level activity processor 1012 then extracts or determines unique relationships between the labeled first-level daily activities and their randomly picked attributes. The second-level activity processor 1012 then builds a cross-table, similar to Table 3 above, for each bag using unique relationships. The second-level activity processor 1012 then generates a concept lattice, similar to the concept lattice 1200 of
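The bagging step can be sketched as follows; the prior weights, the fixed random seed, and the example vote counts are assumptions for illustration, standing in for the previously collected trends described above.

```python
import random

def make_bags(attributes, m, seed=0):
    """For one per-activity classifier, draw m bags of about p/3 random
    attributes (p = total attribute count), as in the BFCA description above."""
    rng = random.Random(seed)  # fixed seed is an illustration-only assumption
    p = len(attributes)
    k = max(1, p // 3)
    return [rng.sample(sorted(attributes), k) for _ in range(m)]

def bfca_vote(bag_predictions, prior_weights):
    """Combine per-bag predictions, weighting each candidate second-level
    activity by a prior likelihood (e.g., 'sleeping' and 'working' dominate a
    typical day), then return the highest-scoring activity."""
    scores = {}
    for activity in bag_predictions:
        scores[activity] = scores.get(activity, 0.0) + prior_weights.get(activity, 0.1)
    return max(scores, key=scores.get)

# Example: three bags vote on an eight-hour 'being still at home' interval.
votes = ["sleeping", "sleeping", "eating"]
best = bfca_vote(votes, {"sleeping": 0.35, "working": 0.30, "eating": 0.05})
# best -> "sleeping"
```

The prior weighting is what lets the ensemble break the structural tie that plain FCA cannot resolve, as described above.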
Returning to
In some embodiments, the application server 104 includes a recommendation processor 1022 configured to analyze the second-level activities and related timeline or chronology of a patient. The recommendation processor 1022 is configured to determine if one or more recommendations are to be provided to a patient to improve their adherence to a health plan, such as a diabetes management plan. In other embodiments, the application 110 is configured to enable a patient to request guidance regarding food consumption or activity level.
To provide a recommendation, the recommendation processor 1022 is configured to compare at least the classified second-level activities to daily guidance from professional medical associations, such as the American Diabetes Association. The guidelines specify certain lengths of time that a patient is to engage in a certain activity over the course of a day or week. The memory device 1004 or 1014 is configured to store a data structure that records the activities and durations of the guidelines. The recommendation processor 1022 is configured to compare the second-level activities and related timeline or chronology to the guidelines in the memory device 1004 or 1014 to determine if the patient has met at least a threshold for each of the different types of activities. If a patient's level of activity is below a specified threshold, the recommendation processor 1022 is configured to determine that a recommendation is to be made. The recommendation processor 1022 may analyze the patient's lifelog data 212 to identify activities that the patient has previously performed. The recommendation processor 1022 also determines, from the second-level activities and related timeline or chronology, instances where a patient could perform the additional activity, such as during ‘home event’. The recommendation processor 1022 then creates a message that identifies the activity to be performed at a predetermined transmission time that corresponds to the next ‘home event’. After detecting that the predetermined transmission time has arrived, the recommendation processor 1022 transmits the message to the application 110 on the user device 102. In some instances, the recommendation processor 1022 may check the patient's activities for the day to determine if the patient has already achieved a sufficient activity level, as specified by the appropriate guideline, and/or may check current weather conditions.
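The guideline comparison can be sketched as follows; the activity names and minute thresholds are hypothetical placeholders, not values taken from the American Diabetes Association guidelines.

```python
# Hypothetical weekly guideline thresholds (minutes); real values would come
# from the guideline data structure stored in the memory device 1004 or 1014.
GUIDELINES = {"moderate exercise": 150, "walking": 210}

def find_shortfalls(weekly_minutes, guidelines=GUIDELINES):
    """Compare classified second-level activity durations against guideline
    thresholds and return, per under-threshold activity, the minutes still
    needed; an empty result means no recommendation is required."""
    return {
        activity: threshold - weekly_minutes.get(activity, 0)
        for activity, threshold in guidelines.items()
        if weekly_minutes.get(activity, 0) < threshold
    }

# Example: a week with 90 minutes of moderate exercise and 260 of walking.
shortfalls = find_shortfalls({"moderate exercise": 90, "walking": 260})
# shortfalls -> {"moderate exercise": 60}
```

Each returned deficit would then drive creation of a message scheduled for the patient's next ‘home event’, as described above.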
If the recommendation processor 1022 determines that the patient should engage in additional physical activity to meet a threshold specified by the guideline, and the weather is acceptable for performing the activity, the recommendation processor 1022 causes the message to be transmitted to the application 110. At this time, while, for example, the patient is watching television during ‘home event’, the application 110 on the user device 102 displays a notification that indicates “Consider taking a walk at this time to improve your activity level”. The patient is more likely to follow this recommendation since it provides real-time, in-the-moment advice that a patient is easily able to follow by simply turning off the television and going for a short walk, rather than trying to adhere to broad recommendations or goals.
In some embodiments, the recommendation processor 1022 is configured to use the patient's lifelog data 212 in conjunction with the second-level activities for determining recommendations and/or scores for comparison to guidelines. For example, the recommendation processor 1022 may analyze a patient's previous second-level activities, food consumption, blood glucose level, stress management, sleep pattern, and other bio-markers for making a recommendation. The personalized diabetes treatment strategy can be used to provide individualized health improvement for the patient.
In some instances, the recommendation processor 1022 may create a baseline of second-level activities and related lifelog data 212, such as glucose level and blood pressure. The recommendation processor 1022 is configured to create trend graphs for storage in the memory device 1004 for the second-level activity levels and/or lifelog data. The recommendation processor 1022 may compare the trends over time to absolute or change thresholds. The recommendation processor 1022 may determine that a change to a patient's health occurred if the patient's lifelog data and/or second-level activities have a significant deviation. For example, an increase of sleep time and decreased social interactions or time at work may be indicative of an onset of depression. In another example, an increase in a frequency of using the toilet may be indicative of an onset of diabetes. In another example, if a patient is particularly susceptible to blood glucose fluctuations when stressed, the recommendation processor 1022 is configured to inform the patient, via the application 110, when it senses an increase in stress levels (or senses lifelog data or attributes/aspects that are correlated to higher stress levels). Furthermore, in this example, the application 110 operating with the recommendation processor 1022 enables patients to directly track their stress levels as a function of their second-level activities to better avoid or manage such situations in the future.
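The baseline-deviation check can be sketched as a simple z-score test; the two-standard-deviation threshold and the example sleep values are assumptions for illustration, not thresholds specified by the system.

```python
from statistics import mean, stdev

def deviates_from_baseline(history, current, z_threshold=2.0):
    """Flag a significant deviation of a current measure (e.g., nightly sleep
    minutes or toilet-use frequency) from its baseline trend using a z-score;
    the threshold choice is an illustration-only assumption."""
    if len(history) < 2:
        return False                      # not enough data for a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu              # flat baseline: any change deviates
    return abs(current - mu) / sigma > z_threshold

# Example: a jump in nightly sleep minutes relative to a stable baseline,
# which might correspond to the depression-onset indicator described above.
baseline = [420, 430, 425, 415, 435]
flag = deviates_from_baseline(baseline, 560)
```

A flagged deviation would prompt the recommendation processor 1022 to notify the patient or clinician via the application 110.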
The example procedure 1300 begins when the application server 104 receives lifelog data 212 from an application 110 on a user device 102 and/or from third-party sources as described in connection with
After the lifelog data 212 is synchronized to atomic intervals, the example application server 104 is configured to grow the time intervals using, for example, the BIG algorithm (block 1306). As discussed above in connection with
The example application server 104 is configured to perform daily activity segmentation for first-level activities (block 1308). As discussed above in connection with
Returning to
The example application server 104 then creates a timeline or chronology data that is illustrative of the second-level events (block 1312).
Returning to
The example application server 104 may also determine if a recommendation is needed for the patient (block 1316). As discussed above in connection with
The example application 110 and/or the application server 104 is configured to synchronize the derivative lifelog data 1502, 1504, and 1506 (with at least some of the lifelog data 212) into atomic intervals. The application 110 and/or the application server 104 then resolve the derivative lifelog data 1502, 1504, and 1506 into segmented daily activity intervals 1508 for first-level activities based, for example, on changes between the derivative lifelog data 1502, 1504, and 1506 between the atomic intervals. The application 110 and/or application server 104 next use a patient common daily activity model that incorporates at least one of FCA or BFCA to classify second-level activities from the first-level activities using one or more aspects/attributes. The interrelated aspects may be based on temporal components, spatial components, experiential components, causal components, structural components, and/or informational components. The application 110 and/or application server 104 then assign the classified second-level activities to the daily activity intervals 1508. The application 110 and/or application server 104 may then make the daily activity intervals 1508 available for display as a timeline or other chronological graph/data.
As discussed above, the example application server 104 of
To create the global common daily activity model, the example patient model engine 1016 is configured to process a training set of data. The training set of data corresponds to lifelog data in which test-patients provide feedback regarding their second-level activities. The test-patients may enter their second-level activities via a user interface of the application 110 of the user device 102 or a separate application that is configured for acquiring training lifelog data. During the course of one or more days, the training application or the application 110 on the user device 102 records lifelog data in a manner similar to the collection of lifelog data 212 discussed above. This includes, for example, collecting lifelog data from a sensor device 108 and/or having a test-patient enter lifelog data into one or more user interfaces. In addition, the application prompts test-patients to enter at least one of a first-level activity and/or a second-level activity that corresponds to the collected lifelog data. The user interface may prompt the test-patient to enter the activity as the data is collected and/or display a timeline showing at least some of the lifelog data partitioned into atomic intervals and prompt the test-patient to enter an activity for one or more of the intervals. In some instances, the user interface of the application may include a list of available first and/or second-level activities for a test-patient to select.
In an example, twenty-three test-patients were selected to provide training lifelog data for the patient model engine 1016. The test-patients provided a log of first and/or second-level activities correlated with lifelog data over a number of weeks. After receiving the data, the patient model engine 1016 removed any incomplete lifelog data that did not have an activity identified. In total, the patient model engine 1016 created a global common daily activity model from 15,087 daily activity intervals of the twenty-three test-patients. The patient model engine 1016 split the intervals into 30% for a training dataset and 70% for a test dataset to demonstrate the robustness of individual patient-specific common daily activity models generated from the global common daily activity model.
The example patient model engine 1016 is configured to maximize activity recognition performance by being programmed to assume that each daily activity has a specific combination of common aspect or attribute sets that represent the second-level activity. This means that all aspects or attributes (e.g., temporal, spatial, experiential, structural, informational, and causal aspects/attributes) of the global common daily activity model are not vital for each and every second-level activity classification. For example, the second-level activity of ‘commuting’ is related to or classified by using only spatial (e.g., work or home), structural (e.g., first-level daily activity such as ‘driving’, ‘walking’, or ‘being still’), and causal (e.g., relations between current and previous daily second-level activities) aspects/attributes.
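The idea of restricting classification to an activity-specific aspect set can be illustrated with a minimal sketch. The mapping and function names here are hypothetical, and the aspect sets shown are illustrative examples following the ‘commuting’ discussion above, not the patent's actual Table 4 combinations:

```python
# Hypothetical mapping of second-level activities to their common
# aspect/attribute sets (aspect names follow the text; the specific
# sets are only examples)
ACTIVITY_ASPECTS = {
    "commuting": {"spatial", "structural", "causal"},
    "eating": {"temporal", "spatial", "experiential"},
}

def relevant_features(observation, activity):
    """Drop every aspect/attribute that is not part of the common
    aspect set for the given second-level activity."""
    aspects = ACTIVITY_ASPECTS[activity]
    return {k: v for k, v in observation.items() if k in aspects}

observation = {"temporal": "08:00", "spatial": "road",
               "structural": "driving", "causal": "after:home event",
               "experiential": "low heart rate"}
# Only spatial, structural, and causal values are kept for 'commuting'
print(relevant_features(observation, "commuting"))
```

Filtering out the non-vital aspects before classification reflects the point above: unnecessary lifelog data for a given activity only adds noise.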
The example patient model engine 1016 verifies the interrelationship between the attributes/aspects and second-level activities by calculating an accuracy of different combinations of the attributes/aspects for each of the second-level activities. To determine an accuracy, the patient model engine 1016 uses ten bags of concept lattices, similar to the concept lattice 1200 of
Table 4 below shows a comparison of attribute/aspect set combinations for each of the second-level activities. In the illustrated example, D1 corresponds to the second-level activity of ‘commuting’, D2 corresponds to ‘eating’, D3 corresponds to ‘exercising’, D4 corresponds to ‘home event’, D5 corresponds to ‘religious event’, D6 corresponds to ‘shopping’, D7 corresponds to ‘using toilet’, and D8 corresponds to ‘working’. Further, S1 corresponds to an attribute/aspect combination of temporal+experiential, S2 corresponds to temporal+spatial, S3 corresponds to spatial+experiential, S4 corresponds to S1+spatial, S5 corresponds to S4+causal, and S6 corresponds to S5+structural. The results in Table 4 show that some combinations of attributes/aspects produce better results for certain second-level activities than other combinations. For example, for second-level activity D4 (i.e., ‘home event’), there is a strong correlation with the S5 combination (i.e., temporal+experiential+spatial+causal) and the S6 combination (i.e., temporal+experiential+spatial+causal+structural). Table 4 also shows that the use of additional or unnecessary lifelog data of certain attributes/aspects in the classification of a second-level activity introduces confusion or errors into the global common daily activity model, regardless of the activities of a specific patient.
After determining correlations between aspects/attributes and second-level activities, the example patient model engine 1016 is configured to determine a best or near-optimal number of concept lattice bags of the global common daily activity model for improving classification performance. In an example, the patient model engine 1016 trains separate BFCA models on different numbers of bags, ranging from 1 to 1000, using the selected attribute sets discussed above in connection with Table 4. The example patient model engine 1016 then executes processes for performing trials of second-level activity classification on each of those trained models using the same lifelog test dataset. The example patient model engine 1016 then calculates an f-measure for each of the bag counts to determine which numbers of bags return the best recognition accuracy. The results of the test demonstrated that classification accuracy is roughly the same for models with fewer than 700 bags. However, the accuracy decreases when over 800 bags are used in the model, because a higher number of bags in a BFCA model can confuse the voting process, given that the additional bags can make all of the classifiers similarly robust. In the illustrated example, 200 bags were selected for the global common daily activity model, which provides an accuracy of 91.47%.
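The f-measure used to compare bag counts can be sketched as below. The per-bag-count trial counts are invented solely to illustrate the selection step and are not results from the patent:

```python
def f_measure(tp, fp, fn):
    """F1 score: harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical (true positive, false positive, false negative) counts
# from classification trials at three different bag counts
trial_results = {1: (70, 30, 30), 200: (92, 8, 9), 1000: (80, 20, 21)}

# Score each bag count and pick the one with the highest f-measure
scores = {bags: f_measure(*counts) for bags, counts in trial_results.items()}
best = max(scores, key=scores.get)
print(best)  # 200 under these invented counts
```

Sweeping the bag count this way and selecting the peak f-measure mirrors the procedure described above for arriving at 200 bags.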
To further validate the model, the patient model engine 1016 creates a confusion matrix to determine specific results of each second-level daily activity recognition. For the matrix, patient-specific common daily activity models were created from the global common daily activity model using the training dataset. The test dataset was then applied to the appropriate patient models. The results of the confusion matrix are shown below in Table 5, which illustrates a comparison between a predicted second-level activity and a targeted or actual second-level activity reported by the test-patients. Table 5 shows that the activity predictions are 90% to 100% accurate for most of the second-level activities. However, as shown in Table 5, the five-minute atomic intervals result in some ambiguous segmentation (4.2%) between the ‘commuting’ (i.e., D1) and ‘home event’ (i.e., D4) activities.
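A row-normalized confusion matrix of the kind described can be computed as in this minimal sketch; the function name and the tiny actual/predicted lists are hypothetical placeholders, not the patent's data:

```python
from collections import Counter

def confusion_matrix(actual, predicted, labels):
    """Row-normalized confusion matrix: rows are actual second-level
    activities, columns are predicted ones (fractions per row)."""
    counts = Counter(zip(actual, predicted))
    matrix = {}
    for a in labels:
        row_total = sum(counts[(a, p)] for p in labels)
        matrix[a] = {p: counts[(a, p)] / row_total for p in labels}
    return matrix

# Toy labels: three actual 'commuting' (D1) intervals, two 'home event' (D4)
actual    = ["D1", "D1", "D1", "D4", "D4"]
predicted = ["D1", "D1", "D4", "D4", "D4"]
m = confusion_matrix(actual, predicted, ["D1", "D4"])
print(round(m["D1"]["D4"], 2))  # fraction of D1 intervals predicted as D4
```

Off-diagonal entries such as `m["D1"]["D4"]` expose exactly the kind of commuting/home-event ambiguity reported in Table 5.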
Table 5 also shows that the randomly picked p/3 aspects/attributes can cause confusion in the patient models. For example, the ‘home event’ (i.e., D4) activity can be classified as an ‘eating’ (i.e., D2) activity in 4.3% of the daily activity intervals. In another example, the ‘shopping’ (i.e., D6) activity can be classified as an ‘eating’ (i.e., D2) activity or a ‘using toilet’ (i.e., D7) activity, each in 16.7% of the daily activity intervals, if spatial aspects are missing. However, the overall accuracy of all the second-level daily activity classifications (>90%) shows that using the randomly picked aspects/attributes and a certain number of concept lattice bags can minimize the misclassification of a patient's daily activities.
It should be appreciated that the example patient model engine 1016 uses BFCA instead of and/or in conjunction with FCA to improve second-level activity classification. FCA depends only on structural similarities between an input attribute set and pre-defined relations. Thus, a patient model may sometimes recognize multiple daily activities if the second-level activities have similar structures to the pre-defined relations. This issue can lower performance, given that FCA does not incorporate a statistical method to choose the most probable result. In contrast, BFCA applies a statistical method, such as the ensemble approach, to FCA, which provides a near-optimal solution to this classification problem, especially when lifelog data is imbalanced towards second-level activities of ‘sleeping’, ‘work-event’, and ‘working’.
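The contrast above can be illustrated with a toy ensemble: a single FCA-style lattice may match several activities on structure alone, while BFCA-style voting over multiple bagged lattices selects the most probable one. This is a hypothetical sketch under simplified assumptions (lattices represented as attribute-set dictionaries), not the patent's implementation:

```python
from collections import Counter

def fca_match(lattice, attributes):
    """FCA-style structural match against one lattice: return every
    activity whose pre-defined attribute set is contained in the
    observed attributes (may return more than one activity)."""
    return [act for act, required in lattice.items() if required <= attributes]

def bfca_classify(lattices, attributes):
    """BFCA-style ensemble: each bagged lattice votes with its
    structural matches; the most-voted second-level activity wins."""
    votes = Counter()
    for lattice in lattices:
        votes.update(fca_match(lattice, attributes))
    return votes.most_common(1)[0][0] if votes else None

# Three hypothetical bags, each built from a different subset of aspects
bags = [
    {"commuting": {"spatial:road", "structural:driving"},
     "working": {"spatial:office"}},
    {"commuting": {"structural:driving"}},
    {"working": {"spatial:office", "temporal:morning"}},
]
obs = {"spatial:road", "structural:driving", "temporal:morning"}
print(bfca_classify(bags, obs))  # 'commuting' receives the most votes
```

Where a single structural match could be ambiguous, the vote tally across bags supplies the statistical tiebreak that plain FCA lacks.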
It should be understood that various changes and modifications to the presently preferred embodiments described herein will be apparent to those skilled in the art. Such changes and modifications can be made without departing from the spirit and scope of the present subject matter and without diminishing its intended advantages. It is therefore intended that such changes and modifications be covered by the appended claims.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2019/038711 | 6/29/2019 | WO | 00

Number | Date | Country
---|---|---
62689537 | Jun 2018 | US