One or more embodiments relate generally to activity recognition systems, and in particular to a two-phase power-efficient activity recognition system for mobile devices.
Mobile devices include sensors such as accelerometers for capturing a user's physical context. An activity recognition system of a mobile device identifies and classifies a user's activity (e.g., running, biking, driving, etc.) based on the user's physical context. Information relating to the user's activity may be provided to a context-driven mobile application running on the mobile device.
One embodiment provides an activity recognition system for an electronic device comprising at least one sensor and a two-phase activity recognition module. The sensors capture data relating to user activity. The two-phase activity recognition module identifies a user activity based on data captured by the sensors, and dynamically controls power consumption of the activity recognition module based on the user activity identified.
One embodiment provides a method for facilitating activity recognition in an electronic device. The method comprises capturing data relating to user activity using at least one sensor device, identifying a user activity based on data captured by the sensor devices, and dynamically controlling power consumption for activity recognition based on the user activity identified.
One embodiment provides a non-transitory computer-readable medium having instructions which, when executed on a computer, perform a method comprising capturing data relating to user activity using at least one sensor, identifying a user activity based on data captured by the sensors, and dynamically controlling power consumption of an activity recognition module based on the user activity identified.
One embodiment provides an electronic device comprising at least one sensor and a two-phase activity recognition module. The sensors capture data relating to user activity. The two-phase activity recognition module identifies a user activity based on data captured by the sensors, and dynamically controls power consumption of the activity recognition module based on the user activity identified.
These and other aspects and advantages of one or more embodiments will become apparent from the following detailed description, which, when taken in conjunction with the drawings, illustrate by way of example the principles of one or more embodiments.
For a fuller understanding of the nature and advantages of one or more embodiments, as well as a preferred mode of use, reference should be made to the following detailed description read in conjunction with the accompanying drawings, in which:
The following description is made for the purpose of illustrating the general principles of one or more embodiments and is not meant to limit the inventive concepts claimed herein. Further, particular features described herein can be used in combination with other described features in each of the various possible combinations and permutations. Unless otherwise specifically defined herein, all terms are to be given their broadest possible interpretation including meanings implied from the specification as well as meanings understood by those skilled in the art and/or as defined in dictionaries, treatises, etc.
One or more embodiments relate generally to activity recognition systems, and in particular to a two-phase power-efficient activity recognition system for mobile devices. One embodiment provides an activity recognition system for an electronic device comprising at least one sensor and a two-phase activity recognition module. The sensors capture data relating to user activity. The two-phase activity recognition module identifies a user activity based on data captured by the sensors, and dynamically controls power consumption of the activity recognition module based on the user activity identified.
One embodiment provides a method for facilitating activity recognition in an electronic device. The method comprises capturing data relating to user activity using at least one sensor device, identifying a user activity based on data captured by the sensor devices, and dynamically controlling power consumption for activity recognition based on the user activity identified.
One embodiment provides a non-transitory computer-readable medium having instructions which, when executed on a computer, perform a method comprising capturing data relating to user activity using at least one sensor, identifying a user activity based on data captured by the sensors, and dynamically controlling power consumption of an activity recognition module based on the user activity identified.
One embodiment provides an electronic device comprising at least one sensor and a two-phase activity recognition module. The sensors capture data relating to user activity. The two-phase activity recognition module identifies a user activity based on data captured by the sensors, and controls power consumption of the activity recognition module based on the user activity identified.
Activity recognition in a mobile device consumes significant power. Much of this power consumption arises from keeping the mobile device awake to perform activity recognition. One or more embodiments provide a two-phase activity recognition system that reduces the amount of time the mobile device is kept awake while still accurately responding to a query for user activity.
The mobile device 100 may include other sensors, such as an image capture device 520.
The mobile device 100 further comprises a two-phase activity recognition module 300 for activity recognition. Specifically, the activity recognition module 300 determines context information based on the sensor data captured by the sensors of the mobile device 100. The context information includes a current user activity of a user utilizing the mobile device 100. The context information is communicated to one or more context-driven applications 190 running on the mobile device, such as a fitness and health tracking application, or a context-based media playback application.
The mobile device 100 further comprises a user interface module 140 for generating a user interface through which a user may control the mobile device 100, such as controlling the playback of content on the mobile device 100.
The mobile device 100 further comprises a network interface module 170 for receiving data from, and sending data to, a content distributor or another mobile device 100 via a network (e.g., cellular network, IP network).
The mobile device 100 further comprises a re-chargeable battery unit 160 that supplies power for operating the mobile device 100.
The mobile device 100 further comprises a memory unit 150 for maintaining data, such as the context information.
In one embodiment, the mobile device 100 has at least two operating modes, such as an awake mode and a low-power sleep mode. In the awake mode, the mobile device 100 is fully operational and performs activity recognition. In the sleep mode, the mobile device 100 is in a low power mode to conserve the power supplied by the battery unit 160, and does not perform any activity recognition. As described in detail later herein, the mobile device 100 further comprises a timer unit 130 configured for switching the mobile device 100 between the sleep mode and the awake mode. In one embodiment, the mobile device 100 further comprises an adaptive sleep scheduling module 180 configured for dynamically adjusting the duration of a sleep mode.
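The description does not specify the policy by which the adaptive sleep scheduling module 180 adjusts the sleep duration. One plausible policy, sketched below under that assumption, is exponential backoff: lengthen the sleep duration while the user remains idle, and reset it when activity is detected. The class name, durations, and growth factor are all illustrative, not part of the disclosure:

```python
class AdaptiveSleepScheduler:
    """Illustrative exponential-backoff sleep policy (an assumption;
    the actual policy of module 180 is not specified)."""

    def __init__(self, base_s=10.0, max_s=300.0, factor=2.0):
        self.base_s = base_s      # sleep duration after a non-idle reading
        self.max_s = max_s        # cap on the sleep duration
        self.factor = factor      # growth per consecutive idle reading
        self._current = base_s

    def next_sleep(self, activity):
        """Return the next sleep duration given the latest activity label."""
        if activity == "idle":
            # User is idle: back off, sleeping longer each time.
            self._current = min(self._current * self.factor, self.max_s)
        else:
            # Activity detected: return to the short base sleep.
            self._current = self.base_s
        return self._current
```

Under this policy, a long stretch of idle readings quickly drives the device toward the maximum sleep duration, while any detected activity restores responsive, frequent sampling.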
At points A and E, the timer unit 130 acquires a wake lock. The wake lock ensures that the two-phase activity recognition module 300 is not interrupted while performing activity recognition. When the two-phase activity recognition module 300 completes activity recognition, the timer unit 130 releases the wake lock and switches the mobile device 100 to the sleep mode. The timer unit 130 then acquires a wake lock after a duration of time representing sleep time has elapsed. As stated above, in one embodiment, an adaptive sleep scheduling module 180 dynamically determines the duration of sleep time to conserve power.
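The acquire/recognize/release cycle above can be sketched as follows. The `DutyCycleTimer` class, its method names, and the use of a plain thread lock in place of an OS wake lock are illustrative assumptions, not the actual timer unit 130:

```python
import threading

class DutyCycleTimer:
    """Minimal sketch of the timer unit's duty cycle: hold a wake lock
    while one recognition pass runs uninterrupted, release it, then
    wait out the sleep duration before the next awake period."""

    def __init__(self, recognize, sleep_s):
        self.recognize = recognize          # one activity-recognition pass
        self.sleep_s = sleep_s              # sleep duration between passes
        self.wake_lock = threading.Lock()   # stand-in for an OS wake lock
        self._stop = threading.Event()

    def run_once(self):
        # Device stays awake while the lock is held; recognition
        # is not interrupted mid-pass.
        with self.wake_lock:
            return self.recognize()

    def run(self, cycles):
        results = []
        for _ in range(cycles):
            results.append(self.run_once())
            # Lock released: device may sleep until the timer fires.
            self._stop.wait(self.sleep_s)
        return results
```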
The two-phase activity recognition module 300 is configured to dynamically vary the duration of awake time B based on user activity, thereby reducing power consumption of the two-phase activity recognition module 300 compared to using a fixed duration of awake time.
The sampling unit 210 is configured to obtain samples of sensor data relating to current user activity from the sensors of the mobile device 100. In one embodiment, the sampling unit 210 obtains tri-axial accelerometer sample data from the accelerometer 110.
The tri-axis normalization unit 220 is configured to transform the tri-axial accelerometer sample data into three orientation-independent time series: (i) the Cartesian magnitude, (ii) the global vertical component of acceleration in the direction of gravity, and (iii) the component of acceleration on the global horizontal plane perpendicular to gravity.
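A minimal sketch of this transformation is shown below. Estimating the gravity direction as the mean acceleration over the window is a simplifying assumption for illustration; the description does not specify the estimator:

```python
import math

def normalize_triaxial(samples):
    """Transform tri-axial accelerometer samples (a list of (x, y, z)
    tuples) into three orientation-independent time series: Cartesian
    magnitude, vertical component along gravity, and horizontal
    component perpendicular to gravity."""
    n = len(samples)
    # Estimate the gravity vector as the mean acceleration (assumption).
    gx = sum(s[0] for s in samples) / n
    gy = sum(s[1] for s in samples) / n
    gz = sum(s[2] for s in samples) / n
    g_norm = math.sqrt(gx * gx + gy * gy + gz * gz) or 1.0
    ux, uy, uz = gx / g_norm, gy / g_norm, gz / g_norm  # unit gravity vector

    magnitude, vertical, horizontal = [], [], []
    for x, y, z in samples:
        m = math.sqrt(x * x + y * y + z * z)
        v = x * ux + y * uy + z * uz              # projection onto gravity
        h = math.sqrt(max(m * m - v * v, 0.0))    # residual in horizontal plane
        magnitude.append(m)
        vertical.append(v)
        horizontal.append(h)
    return magnitude, vertical, horizontal
```

Because all three series are derived from the magnitude and the gravity direction, they do not depend on how the device is oriented in the user's pocket or hand.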
The window partitioning unit 230 is configured to segment each time series into finite sampling windows, wherein the duration of each sampling window is determined by the two-phase activity recognition module 300.
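Window partitioning can be sketched as below. Whether adjacent windows overlap is not specified in the description; this sketch assumes non-overlapping windows and drops any incomplete tail window:

```python
def partition(series, window_len):
    """Segment a time series into non-overlapping windows of
    `window_len` samples, discarding an incomplete final window."""
    return [series[i:i + window_len]
            for i in range(0, len(series) - window_len + 1, window_len)]
```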
The feature extraction unit 240 is configured to transform each sampling window into a feature vector including time-domain features and frequency-domain features. Examples of time-domain features include real-valued power and entropy. Examples of frequency-domain features include the highest magnitude frequency, the magnitude of the highest magnitude frequency, the weighted mean of the top five highest magnitude frequencies weighted by magnitude, and the weighted variance of the top five highest magnitude frequencies weighted by magnitude.
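A sketch of such a feature vector is shown below. The source names the features but not their exact definitions, so the spectral-entropy formula and the naive DFT here are illustrative assumptions:

```python
import math, cmath

def extract_features(window, sample_rate):
    """Map one sampling window to a feature vector: time-domain power,
    spectral entropy, and frequency-domain features from the top-5
    highest-magnitude DFT bins."""
    n = len(window)
    # Time-domain power: mean squared value of the signal.
    power = sum(v * v for v in window) / n

    # Magnitudes of the positive-frequency DFT bins (naive O(n^2) DFT).
    mags = []
    for k in range(1, n // 2 + 1):
        s = sum(window[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append((abs(s), k * sample_rate / n))  # (magnitude, freq in Hz)

    # Spectral entropy of the normalized magnitude distribution.
    total = sum(m for m, _ in mags) or 1.0
    entropy = -sum((m / total) * math.log(m / total)
                   for m, _ in mags if m > 0)

    # Top-5 bins by magnitude; weighted mean/variance of their frequencies.
    top = sorted(mags, reverse=True)[:5]
    peak_mag, peak_freq = top[0]
    w = sum(m for m, _ in top) or 1.0
    mean_f = sum(m * f for m, f in top) / w
    var_f = sum(m * (f - mean_f) ** 2 for m, f in top) / w

    return [power, entropy, peak_freq, peak_mag, mean_f, var_f]
```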
The decision tree classification unit 250 identifies user activity for each sampling window based on features extracted for the sampling window and the decision tree model 260. In one embodiment, the decision tree model 260 maintained in the mobile device 100 is generated offline in a training phase using training data including activity-labeled accelerometer data from multiple users.
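Classifying a feature vector against a decision tree model can be sketched as below. The nested-tuple tree representation and the toy thresholds are illustrative assumptions; the actual decision tree model 260 is produced by the offline training phase:

```python
def classify(tree, features):
    """Walk a decision tree stored as nested tuples
    (feature_index, threshold, left_subtree, right_subtree);
    leaves are activity labels (strings)."""
    while not isinstance(tree, str):       # internal node
        idx, threshold, left, right = tree
        tree = left if features[idx] <= threshold else right
    return tree                            # leaf: activity label

# Hypothetical toy model: split on power (index 0), then on the
# peak frequency (index 2) to separate walking from running.
toy_tree = (0, 0.2, "idle",
            (2, 2.5, "walking", "running"))
```

At runtime this evaluation is just a handful of comparisons per window, which is part of why decision trees are attractive for on-device, power-constrained classification.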
In alternative embodiments, the steps involved in the activity classifier module 200 may be substituted with alternative techniques. For example, a different set of orientation-independent time series may be provided as input to the window partitioning unit 230. As another example, the feature extraction unit 240 extracts alternative time-domain features and frequency-domain features from each sampling window. As yet another example, alternative classification techniques such as support vector machines or neural networks may be used after creating classifier models using offline training.
In one embodiment, the two-phase activity recognition module 300 maintains two instances of the activity classifier module 200, wherein each instance has its own corresponding decision tree model 260. Specifically, the two-phase activity recognition module 300 maintains a first instance of the activity classifier module 200 representing an idle classifier unit 310, and a second instance representing an activity classifier unit 320.
The idle classifier unit 310 utilizes a small window of sensor sample data to determine whether an end user of the mobile device 100 is engaging in an idle activity or a non-idle physical activity. Idle activity encompasses user activity with little or no user movement, such as reading. The activity classifier unit 320 utilizes a larger window of sensor sample data to identify an actual physical activity that the end user is engaged in. Physical activity encompasses user activity involving at least moderate user movement, such as walking, biking, running, driving, etc.
In one embodiment, the two-phase activity recognition module 300 performs activity recognition in two phases. In a first phase Phase I, the two-phase activity recognition module 300 obtains a smaller-sized sampling window (e.g., a sampling window with a duration between 0.50 second and 1 second). The idle classifier unit 310 analyzes the smaller-sized sampling window to determine whether the user activity captured is idle activity or non-idle physical activity. If the user activity captured is idle activity, the two-phase activity recognition module 300 concludes that the user is idle, and notifies the timer unit 130 to switch the operational state of the mobile device 100 to the sleep mode. Therefore, the mobile device 100 immediately transitions to the sleep mode to conserve power when the two-phase activity recognition module 300 determines that the user is idle.
If the user activity captured in Phase I is non-idle physical activity, the two-phase activity recognition module 300 enters a second phase Phase II. In the second phase Phase II, the duration of awake time is extended so that the two-phase activity recognition module 300 may obtain additional sensor sample data for a larger-sized sampling window (e.g., a sampling window with a duration between 4 seconds and 8 seconds). The activity classifier unit 320 analyzes the larger-sized sampling window to determine the end user's fine-grained physical activity (e.g., walking, biking, idle etc.). Upon determining the actual physical activity that the user activity should be classified as, the two-phase activity recognition module 300 notifies the timer unit 130 to switch the operational state of the mobile device 100 to the low-power sleep mode.
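One awake period of the two-phase scheme can be sketched as the control flow below. The four callables are assumed interfaces standing in for the units in the description: `read_window(seconds)` returns sensor samples, the two classifiers map a window to a label, and `enter_sleep()` asks the timer unit to switch the device to the low-power sleep mode. The window durations are drawn from the ranges given above:

```python
PHASE1_WINDOW_S = 1.0   # short Phase I window (0.5-1 s in the description)
PHASE2_WINDOW_S = 6.0   # longer Phase II window (4-8 s in the description)

def recognize_activity(read_window, idle_classifier, activity_classifier,
                       enter_sleep):
    """One awake-period pass of the two-phase recognition scheme."""
    # Phase I: cheap idle check on a small sampling window.
    short_win = read_window(PHASE1_WINDOW_S)
    if idle_classifier(short_win) == "idle":
        enter_sleep()          # user is idle: sleep immediately to save power
        return "idle"

    # Phase II: extend awake time, gather a larger window, and
    # classify the fine-grained physical activity.
    long_win = read_window(PHASE2_WINDOW_S)
    activity = activity_classifier(long_win)
    enter_sleep()              # done classifying: end the awake period
    return activity
```

The asymmetry is the point of the design: the common idle case pays only for the short window, and the longer window is sampled only when Phase I detects movement.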
In one embodiment, performing activity recognition in two phases reduces: (a) the percentage of time that the mobile device 100 is kept awake to perform activity recognition (“the wake time percentage”), and (b) the power consumption attributable to the two-phase activity recognition module 300 alone assuming that no other application is running on the mobile device 100. For example, based on a naturalistic accelerometer data collection effort totaling over 36 days from 8 subjects, an embodiment of a two-phase activity recognition module 300 that uses fixed durations of sleep time achieves 90% of the accuracy of an always-on activity recognition module. Further, compared to the always-on activity recognition module, the embodiment of the two-phase activity recognition module 300 that uses fixed durations of sleep reduces the wake time percentage and power consumption by 93.7% and 63.8%, respectively.
As another example, based on a naturalistic accelerometer data collection effort totaling over 36 days from 8 subjects, another embodiment of the two-phase activity recognition module 300 that uses adaptive durations of sleep time (e.g., set by the adaptive sleep scheduling module 180) achieves 90% of the accuracy of an always-on activity recognition module. Further, compared to the always-on activity recognition module, the embodiment of the two-phase activity recognition module 300 that uses adaptive durations of sleep time reduces the wake time percentage and power consumption by 96.7% and 81.9%, respectively.
If the user activity is non-idle physical activity, proceed to process block 303 where the duration of awake time is increased and additional sensor samples are obtained. In process block 304, the additional sensor samples are provided to a physical activity classifier unit to classify the user activity as a fine-grained physical activity. After the user activity is classified, output the physical activity, and proceed to process block 305 where the mobile device is placed in the low-power sleep mode, thereby ending the duration of awake time for the mobile device.
The information transferred via communications interface 517 may be in the form of signals such as electronic, electromagnetic, optical, or other signals capable of being received by communications interface 517, via a communication link that carries signals and may be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, a radio frequency (RF) link, and/or other communication channels.
The system 500 further includes an image capture device 520 such as a camera, an audio capture device 531 such as a microphone, a magnetometer module 535, an accelerometer module 532, a gyroscope module 533, and a light sensor module 534. The system 500 may further include application modules such as an MMS module 521, an SMS module 522, an email module 523, a social network interface (SNI) module 524, an audio/video (AV) player 525, a web browser 526, an image capture module 527, etc.
The system 500 further includes an activity recognition module 530 as described herein, according to an embodiment. In one embodiment, the activity recognition module 530 along with an operating system 529 may be implemented as executable code residing in a memory of the system 500. In another embodiment, the activity recognition module 530 along with the operating system 529 may be implemented in firmware.
As is known to those skilled in the art, the aforementioned example architectures can be implemented in many ways, such as program instructions for execution by a processor, software modules, microcode, a computer program product on computer-readable media, analog/logic circuits, application-specific integrated circuits, firmware, consumer electronic devices, AV devices, wireless/wired transmitters, wireless/wired receivers, networks, multi-media devices, etc. Further, embodiments of said architectures can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment containing both hardware and software elements.
One or more embodiments have been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to one or more embodiments. Each block of such illustrations/diagrams, or combinations thereof, can be implemented by computer program instructions. The computer program instructions when provided to a processor produce a machine, such that the instructions, which execute via the processor create means for implementing the functions/operations specified in the flowchart and/or block diagram. Each block in the flowchart/block diagrams may represent a hardware and/or software module or logic, implementing one or more embodiments. In alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures, concurrently, etc.
The terms “computer program medium,” “computer usable medium,” “computer readable medium,” and “computer program product” are used to generally refer to media such as main memory, secondary memory, a removable storage drive, and a hard disk installed in a hard disk drive. These computer program products are means for providing software to the computer system. The computer readable medium allows the computer system to read data, instructions, messages or message packets, and other computer readable information from the computer readable medium. The computer readable medium, for example, may include non-volatile memory, such as a floppy disk, ROM, flash memory, disk drive memory, a CD-ROM, and other permanent storage. It is useful, for example, for transporting information, such as data and computer instructions, between computer systems. Computer program instructions may be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
Computer program instructions representing the block diagram and/or flowcharts herein may be loaded onto a computer, programmable data processing apparatus, or processing devices to cause a series of operations performed thereon to produce a computer implemented process. Computer programs (i.e., computer control logic) are stored in main memory and/or secondary memory. Computer programs may also be received via a communications interface. Such computer programs, when executed, enable the computer system to perform the features of one or more embodiments as discussed herein. In particular, the computer programs, when executed, enable the processor and/or multi-core processor to perform the features of the computer system. Such computer programs represent controllers of the computer system. A computer program product comprises a tangible storage medium readable by a computer system and storing instructions for execution by the computer system for performing a method of one or more embodiments.
Though the one or more embodiments have been described with reference to certain versions thereof, other versions are possible. Therefore, the spirit and scope of the appended claims should not be limited to the description of the preferred versions contained herein.
This application claims priority to U.S. Provisional Patent Application Ser. No. 61/678,481, filed on Aug. 1, 2012, which is incorporated herein by reference.