The field relates generally to computing devices and, more particularly, to employing a mechanism for outsourcing context-aware application-related functionalities to a sensor hub.
Context-aware software applications are becoming popular in handheld and mobile computing devices. Context-aware applications represent a new compute paradigm because they decouple computing from device usage; thus, the existing approach of "turning devices off" when the user is not interacting with them to improve battery life does not work. Because a user's context relates to the user's daily life phases (e.g., user activity, user location, user social interaction, user emotional state, etc.), a mobile device having context-aware applications has to continuously capture the user's context (even when the user "turns the device off"), which keeps the computing device working and consuming power.
For example, a pedometer application is designed to measure the steps a user takes throughout the day irrespective of how the mobile device having the pedometer application is used. To meet the pedometer application's requirements, various device sensors (e.g., accelerometer, gyroscope, compass, etc.) would have to sense the user's movement (e.g., steps) for extended periods of time (thus continuously consuming power) even when the mobile device is supposedly "turned off" and resting in the user's pocket. Unlike a typical mobile phone, which turns off when the user is not interacting with it, mobile computing devices having context-aware applications have to stay on and remain constantly in use in order to continuously capture sensor data throughout the day. Context-aware applications consume a great deal of power, which requires computing device batteries to be charged multiple times a day.
Embodiments of the present invention are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
Embodiments of the invention provide a mechanism for outsourcing context-aware application-related activities to a sensor hub. A method of embodiments of the invention includes outsourcing a plurality of functionalities from an application processor to a sensor hub processor of a sensor hub by configuring the sensor hub processor, and performing one or more context-aware applications using one or more sensors coupled to the sensor hub processor.
In one embodiment, a sensor hub is provided that includes a low power sensing subsystem (e.g., a processor, sensors, a combination of hardware and software, etc.) that operates when the application processor of a computing device is asleep and is responsible for offloading multiple functionalities from the application processor to a sensor hub processor at a much lower power than if these functions were performed on the application processor. Further, in one embodiment, the sensor hub supports a wide range of context-aware applications by exposing a set of power-efficient primitives that provide the necessary flexibility to configure the sensor hub for a wide range of context capabilities while maintaining a low power requirement. The novel sensor hub overcomes the power-related problems associated with conventional systems, which require context-aware application-related sensors to be directly connected to application processors, consuming a great deal of power and significantly lowering battery life.
In one embodiment, the computing device 100 further includes a sensor hub 110 having hardware (architecture) 112 and software (architecture) 114 in communication with the application processor 108 to perform various context-aware application-related functionalities, such as sensor data capturing, triggering, processing, filtering, streaming, storing, forwarding, calibrating, etc., which are provided as primitives. These functionalities are outsourced (e.g., offloaded) from the application processor 108 to the sensor hub 110 and are performed in a way that is flexible enough to allow for the changing needs of context-aware applications. In other words, in one embodiment, the primitives may run at the sensor hub 110 but are configured from and by the application processor 108 through a protocol, such as a sensor hub application programming interface (API).
Consider a real-life context-aware application example in which a context-aware application is triggered when it requests "gesture recognition". This request gets routed to the middleware running on the interactive application (IA) application processor 108. The middleware then configures the sensor hub 110 to capture data from, for example, the accelerometer, trigger the gyroscope if movement is detected from the accelerometer, perform gesture spotting on the sensor hub 110, and send data to the middleware if these conditions are satisfied. The middleware on the application processor then runs an algorithm (e.g., a hidden Markov model (HMM) algorithm) any time it receives the data, performs the final gesture recognition, and decides that the user just performed a "shake" gesture. In other words, in one embodiment, the primitives (e.g., trigger, capture, and processing) are configured by the middleware running on the IA application processor 108, but they are implemented in the sensor hub software 114 of the sensor hub 110, exposed through a sensor hub API at the application processor 108, and requested, triggered, and configured through the application processor 108, as will be described with reference to the subsequent figures.
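The workload split described above can be sketched roughly as follows. This is an illustrative sketch only, not the actual sensor hub API: the function names, the movement threshold, and the stand-in recognizer are assumptions made for illustration.

```python
# Illustrative sketch of the "gesture recognition" split: the sensor hub spots a
# possible gesture cheaply, and the application processor runs the expensive
# recognition step only when forwarded data arrives. All names are hypothetical.

def hub_gesture_spotting(accel_samples, threshold=1.2):
    """Runs on the sensor hub: forward samples only if movement is detected."""
    energy = max(sum(v * v for v in s) ** 0.5 for s in accel_samples)
    return accel_samples if energy > threshold else None

def middleware_recognize(samples):
    """Runs on the application processor: stand-in for an HMM-based recognizer."""
    # A real implementation would score the sample window against trained models.
    return "shake" if len(samples) >= 3 else "unknown"

window = [(1.5, 0.2, 0.9), (-1.4, 0.1, 1.0), (1.6, -0.2, 0.8)]
spotted = hub_gesture_spotting(window)
if spotted is not None:
    print(middleware_recognize(spotted))   # -> shake
```

The point of the split is that `hub_gesture_spotting` is cheap enough to run continuously on the hub, so the application processor sleeps until a candidate window is forwarded.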
As will be explained subsequently in this document, the sensor hub hardware 112 may include a general-purpose low power processor (e.g., STMicro Cortex, etc.) that is coupled to a number of sensors (e.g., 3D accelerometer, gyroscope, compass, barometer, etc.) working in concert with the sensor hub software 114 to perform functionalities relating to various context-aware applications to lower the power requirement of the computing device 100.
In one embodiment, the sensor hub processor 202 serves as an intermediate level processing agent within a processing hierarchy, such as between the sensors 214-228 and the application processor 108. This intermediate level agent mitigates the need for the application processor 108 to keep polling and processing sensor data (e.g., collecting sensor data and comparing it to a threshold) by allowing the application processor 108 to outsource the aforementioned tasks relating to sensor data to the sensor hub processor 202. Further, the sensor hub processor 202 provides flexibility and programmability beyond what is typically offered by the sensors 214-228 when they are configured to work directly with the application processor 108 without the sensor hub processor 202.
It is contemplated that the sensor hub processor 202 may be employed to work with any number and type of sensors 214-228 depending on the nature and function of a context-aware application. For example, one context-aware application, like a pedometer application, may require certain sensors (such as a 3D accelerometer 222, 3D gyroscope 220, etc.), while another context-aware application, like a camera application, may not require the exact same sensors that the pedometer application requires, and vice versa. Some examples include an ambient microphone 216 (e.g., a Knowles ambient microphone) that is associated with a CODEC 214 (e.g., a Maxim CODEC) used for data conversion between the ambient microphone 216 and the sensor hub processor 202, a 3D compass 218 (e.g., a Honeywell compass), a 3D gyroscope 220 (e.g., an InvenSense gyroscope), a 3D accelerometer 222 (e.g., an STMicro accelerometer), a light sensor 224 (e.g., an Intersil light sensor), a barometer 226, and a flash 228, etc. In addition to the aforementioned physical sensors 214-228, various virtual sensors may be supported by the sensor hub processor 202. These virtual sensors (e.g., orientation_xy, orientation_z, heading from inertial measurement, noise level, etc.) may be calculated or obtained using sensor data obtained from the physical sensors 214-228.
As will be further described with reference to
In one embodiment, certain default primitives may be initially provided as part of the sensor hub 110, such as at the time of manufacturing of the computing device having the sensor hub 110 and the application processor 108. However, over time, certain primitives (corresponding to these functionalities or capabilities) may be added to (or removed from) the sensor hub software 114. If, for example, a new primitive (e.g., a new component) representing a new functionality (e.g., a capability to add) is added to the sensor hub software 114, the sensor hub software 114 then provides the application processor 108 with the ability to dynamically or on-demand (re)configure the sensor hub processor 202 to adopt this new functionality to be used in future transactions relating to context-aware applications. Similarly, the sensor hub processor 202 may be dynamically or on-demand (re)configured (by the application processor 108, as facilitated by the sensor hub software 114) to be free of a particular functionality (e.g., calibration) if the corresponding primitive (e.g., calibrator) is removed from the list of primitives offered by the sensor hub software 114. In one embodiment, dynamic configuration refers to dynamic running and stopping of any number or combination of primitives. For example, capture of data may be started using the accelerometer 222 and stopped anytime thereafter; however, the capture primitive (e.g., the capture module 322 of
In one embodiment, this configuration or reconfiguration of the sensor hub processor 202 by the application processor 108 using the primitives may be performed dynamically (e.g., a primitive may be automatically added, edited, or deleted each time a context-aware application or a user gesture triggers a change) or on-demand (e.g., allowing the user to make changes to the primitives by changing settings on the computing device employing the sensor hub 110).
In addition to the primitives (further described with reference to
Further, middleware 370 may provide high-level context storage, retrieval, notification, processing of data, etc., and implementation of other applications, services, and components 316, such as an inference algorithm implementation, storage of raw sensor data (e.g., high data rate), etc. Similarly, various components, such as the drivers 312, the parsers 314, etc., can be used to abstract sensor hub details, support multiple consumers, exercise conflict resolution, etc. Referring back to the "gesture recognition" example described above with reference to
Referring now to
In one embodiment, the sensor hub primitives 322-338 include a capture module 322 to allow selection of which of the several sensors 214-228 to capture data from, in addition to configuring the range and desired sampling rate. The capture module 322 further allows the sensor hub processor 202 to place any of the unneeded or inactive sensors 214-228 into a low-power mode to conserve power. Another primitive includes a data delivery module 324 that is used to facilitate configuration of the sensor hub processor 202 to stream data to the application processor 108 to optimize for latency while still maintaining transport efficiency. This streaming mode may be used when the application processor 108 is awake or active. Alternatively, if a user (e.g., an end-user of a mobile computing device) is not interacting with the (mobile) computing device, the application processor 108 may go to sleep and configure the sensor hub processor 202 to collect the relevant data in the background and aggregate or persist the data at a storage medium. During this data delivery mode, the application processor 108 may periodically wake up, retrieve the stored data from the storage medium, and perform the necessary tasks, such as context recognition.
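The two delivery modes described above can be sketched roughly as follows; the class name, flag, and method names are illustrative assumptions, not the actual data delivery module interface.

```python
# Illustrative sketch of the data delivery primitive's two modes: stream to an
# awake application processor, or aggregate samples while it sleeps and hand
# them over in a batch when it periodically wakes. Names are hypothetical.

class DataDelivery:
    def __init__(self):
        self.ap_awake = True
        self.streamed, self.stored = [], []

    def deliver(self, sample):
        if self.ap_awake:
            self.streamed.append(sample)   # low-latency streaming mode
        else:
            self.stored.append(sample)     # background aggregation mode

    def ap_wakeup(self):
        # Application processor periodically wakes and drains the stored samples.
        self.ap_awake = True
        batch, self.stored = self.stored, []
        return batch

d = DataDelivery()
d.deliver(1)               # AP awake: streamed directly
d.ap_awake = False         # user stops interacting; AP sleeps
d.deliver(2); d.deliver(3) # collected in the background on the hub
print(d.ap_wakeup())       # -> [2, 3]
```

Batching like this trades latency for power: the application processor can stay asleep between drains instead of being interrupted per sample.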
Another primitive, in one embodiment, includes a processing module 326 that facilitates configuration of the sensor hub processor 202, triggered by the application processor 108, to apply certain data processing functions to sensor data obtained from the sensors 214-228. These processing functions can be configurable via a set of parameters to enhance flexibility (e.g., number of samples, sliding windows, etc.). Further, these processing functions can be relatively easily expanded, as necessitated or desired, by either expanding the existing processing module 326 or adding a new module using the primitive adjustor 338. The primitive adjustor 338 includes a capability expansion module that can be used to expand the functionalities of an existing module, such as the processing module 326, or to add a new module through software programming.
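A minimal sketch of a processing function parameterized the way the paragraph above suggests (sample count, sliding window); the factory name and window semantics are assumptions for illustration.

```python
# Hedged sketch of a configurable processing function: a sliding-window mean
# whose window size is a configuration parameter supplied by the application
# processor. Illustrative only, not the actual processing module 326 interface.

from collections import deque

def make_sliding_mean(window_size):
    """Returns a processing function parameterized by window size."""
    window = deque(maxlen=window_size)   # oldest sample drops out automatically
    def process(sample):
        window.append(sample)
        return sum(window) / len(window)  # mean over the current window
    return process

smooth = make_sliding_mean(window_size=3)
print([round(smooth(x), 2) for x in (3, 6, 9, 12)])   # -> [3.0, 4.5, 6.0, 9.0]
```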
In one embodiment, the primitives 322-338 further include a condition evaluator 328 that helps facilitate configuration of the sensor hub processor 202 to perform data processing functions that combine sensor data from any one or more sensors 214-228 and evaluate the sensor data for the occurrence of certain conditions. These conditions, when triggered, can result in one or more of the following actions: (1) trigger capture from new sensors; (2) data reduction; and (3) event detection. Triggering capture from new sensors refers to using one sensor 214-228 to trigger capture, using the capture module 322, from a different sensor, which furthers power efficiency, since some sensors consume more power than others. For example, in the case of gesture recognition by a context-aware application, particular sensors, like the accelerometer, gyroscope, etc., are needed to perform gesture recognition-related tasks. For example, data obtained from an accelerometer is used to detect movement of a user, which then results in starting capture from a gyroscope (which typically consumes ten times (10×) more power). In one embodiment, this capability of the application processor 108 is offloaded or outsourced to the sensor hub processor 202 to allow for low latency actions, which would not be possible if the capability remained with the application processor 108, because that would require the application processor 108 to be awakened each time the capture is to be triggered.
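The accelerometer-gates-gyroscope pattern above can be sketched as follows; the class name and the movement threshold are illustrative assumptions, not the actual condition evaluator 328.

```python
# Hedged sketch of the condition evaluator: a cheap accelerometer check gates
# capture from the (roughly 10x more power-hungry) gyroscope, all on the hub,
# without waking the application processor. Names and threshold are hypothetical.

class ConditionEvaluator:
    def __init__(self, threshold=1.2):
        self.threshold = threshold
        self.gyro_active = False   # gyroscope starts in low-power mode

    def evaluate(self, accel_sample):
        magnitude = sum(v * v for v in accel_sample) ** 0.5
        if magnitude > self.threshold and not self.gyro_active:
            self.gyro_active = True    # condition met: trigger gyroscope capture
        return self.gyro_active

ev = ConditionEvaluator()
print(ev.evaluate((0.0, 0.1, 1.0)))   # device at rest -> False
print(ev.evaluate((1.5, 0.8, 1.1)))   # movement detected -> True
```

Because the evaluation loop lives on the hub, the gyroscope can be triggered with low latency even while the application processor sleeps.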
Regarding data reduction and event detection, due to continuous sensing, a certain amount of data is captured using the capture module 322, but much of that data may not contain any meaningful information. For example, considering gesture or speech recognition, sensors like an accelerometer or a microphone may often collect useless data that should not require the application processor 108 to wake up or receive it. At some point, the user may perform a gesture or speak words, and when the sensor hub 110 is able to detect that a possible gesture (or speech) has been performed (without necessarily being able to understand the gesture or speech), it wakes up the application processor 108 and sends the data over. In other words, in one embodiment, the application processor 108 is only awakened and receives data if the data collected has some significance; otherwise, the duty is outsourced to and performed by the sensor hub 110 to reduce the activity load on the application processor 108 and thus lower the computing device's power consumption.
For example, in the case of gesture recognition, the first two stages of the gesture recognition pipeline are not computationally intensive and can be offloaded to the sensor hub 110. These stages enable detection of whether a movement was performed that resembles a gesture, without knowing the type of the gesture. Doing so can result in dropping more than 95% of the original data when a gesture is not being performed and thus waking up the IA application processor 108 much less often. For the remaining 5% of the data, the application processor 108 may be awakened, and the highly compute-intensive stage that performs the gesture recognition is performed on the IA. This workload partitioning approach can be generalized across several interface pipelines, including gesture detection, speech recognition, speaker identification, and the like.
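The data-reduction effect can be illustrated with a toy example; the spotting heuristic, threshold, and data are assumptions made purely to show the filtering ratio, not the actual pipeline stages.

```python
# Illustrative sketch of data reduction: cheap spotting stages on the sensor hub
# drop windows with no gesture-like movement, so the application processor is
# woken for only a small fraction of the captured data. All values hypothetical.

def spot(window, threshold=1.0):
    # Stage 1-2 stand-in: keep a window only if it shows gesture-like energy.
    return max(abs(v) for v in window) > threshold

# 100 windows of continuous sensing; only 4 contain gesture-like movement.
windows = [[0.1, 0.2, 0.1]] * 96 + [[1.5, 2.0, 1.2]] * 4
forwarded = [w for w in windows if spot(w)]
print(len(forwarded))   # -> 4 (96% of the data dropped on the hub)
```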
Continuing with the primitives 304, virtual sensors 330 serve as a primitive that can be used to provide high-level data (e.g., whether the computing device is face up, etc.) as opposed to the raw sensor data (obtained from or calculated by the sensors 214-228). Some virtual sensors 330 (e.g., orientation, heading, etc.) can be calculated efficiently in the sensor hub 110 and result in major data reduction if the original sensor data is not needed. Further, these virtual sensors 330 can trigger certain events and wake up the application processor 108 accordingly.
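A virtual sensor of the kind mentioned above (the "face up" example) might be derived from raw accelerometer data roughly as follows; the function name and tolerance are illustrative assumptions.

```python
# Hedged sketch of a virtual sensor: a high-level "face up" boolean computed on
# the hub from a raw accelerometer sample, so the raw stream need not be sent
# to the application processor. Tolerance value is hypothetical.

def face_up(accel_sample, tolerance=0.3):
    """Virtual sensor: True when gravity is mostly along +z (device face up)."""
    x, y, z = accel_sample
    return z > 0 and abs(x) < tolerance and abs(y) < tolerance

print(face_up((0.05, -0.1, 0.98)))   # resting face up -> True
print(face_up((0.9, 0.1, 0.2)))      # tilted on its side -> False
```

Reducing a three-axis sample stream to a single boolean is exactly the kind of data reduction the virtual sensor primitive enables when the raw data is not otherwise needed.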
Other primitives 304 include a calibrator 332, a time-stamping module 334, a power manager 336, and a primitive adjustor 338. Since some sensors (e.g., the compass) may require frequent calibration, the calibrator 332 may be used to apply its calibration functions to perform various calibration tasks and deliver the calibrated data to the application processor 108. Since it may be important to maintain accurate time-stamps of sensor samples to enable accurate context recognition, the time-stamping module 334 may be used with the sensor hub's own built-in clock, which is synchronized with the application processor 108 periodically, to perform and register time-stamping of various activities relevant to a context-aware application. The time data may be shared with the application processor 108 as time-stamps sent along with the data samples. The power manager 336 facilitates management of power at the sensor hub 110 so that it is done autonomously and independently of the application processor 108. For example, the power manager 336 may switch the sensor hub 110 to a low power state (e.g., in between successive data sample acquisitions) while the application processor 108 may be in a high power state. The primitive adjustor 338 allows for programming new primitives into the list of sensor hub primitives 304 and/or (re)programming the existing primitives 304 to add or delete certain capabilities.
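The time-stamping idea above can be sketched as a simple clock-offset translation; the class, method names, and tick values are illustrative assumptions, not the actual time-stamping module 334.

```python
# Hedged sketch of the time-stamping primitive: samples are stamped with the
# hub's own clock, and an offset to the application processor's clock is
# refreshed by periodic synchronization so delivered stamps stay accurate.

class TimeStamper:
    def __init__(self):
        self.offset = 0   # hub clock -> application processor clock

    def sync(self, hub_now, ap_now):
        # Periodic synchronization with the application processor's clock.
        self.offset = ap_now - hub_now

    def stamp(self, sample, hub_now):
        # Deliver the sample with a time-stamp in the AP's time base.
        return (hub_now + self.offset, sample)

ts = TimeStamper()
ts.sync(hub_now=100, ap_now=5100)              # clocks differ by 5000 ticks
print(ts.stamp("accel_sample", hub_now=130))   # -> (5130, 'accel_sample')
```

Keeping the stamping on the hub lets samples acquired while the application processor sleeps still carry timestamps the application processor can interpret after it wakes.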
Method 400 starts at block 405 with associating a sensor hub with an application processor of a computing device (e.g., a mobile or handheld computing device). The computing device hosts one or more context-aware applications. In one embodiment, at block 410, a plurality of sensors relating to the context-aware applications is associated with a sensor hub processor of the hardware architecture of the sensor hub. In one embodiment, the sensor hub may be placed on the same core of a chipset where the application processor resides, or discretely on another chipset. In one embodiment, a set of software modules is programmed into the software architecture of the sensor hub as primitives and supplied to the application processor of the computing system at block 415. As aforementioned, the primitives (e.g., trigger, capture, and processing) are configured by the middleware running on the application processor, but they are implemented in the sensor hub software, exposed through a sensor hub API of the application processor, and requested, triggered, and configured through the application processor. In one embodiment, subsequent updates may be made to the primitives using a primitive adjustor or adjustment module as provided by the sensor hub software modules 304 as described with reference to
At block 420, these primitives are provided to the application processor to give the application processor a novel capability to configure the sensor hub processor to perform various functionalities and tasks relating to activities associated with the context-aware applications of the computing system. This way, in one embodiment, the functionalities or activities that are typically performed by the application processor are outsourced to the sensor hub processor at block 425. For example, the sensors that are typically managed by the application processor directly are, in one embodiment, managed by the sensor hub processor, thus relieving the application processor of many of its tasks relating to the context-aware applications. This allows the application processor to sleep, consequently reducing overall power consumption.
At block 430, a determination is made as to whether any of the existing primitives are to be updated (e.g., expanded or reduced) and/or any new primitives are to be added. If yes, using a primitive adjustor, the update and/or addition is performed at block 435 and the process continues with configuration of the sensor hub processor by the application processor at block 420. If not, the process ends at block 440.
The one or more processors 501 execute instructions in order to perform whatever software routines the computing system implements. The instructions frequently involve some sort of operation performed upon data. Both data and instructions are stored in system memory 503 and cache 504. Cache 504 is typically designed to have shorter latency times than system memory 503. For example, cache 504 might be integrated onto the same silicon chip(s) as the processor(s) and/or constructed with faster static RAM (SRAM) cells whilst system memory 503 might be constructed with slower dynamic RAM (DRAM) cells. By tending to store more frequently used instructions and data in the cache 504 as opposed to the system memory 503, the overall performance efficiency of the computing system improves.
System memory 503 is deliberately made available to other components within the computing system. For example, the data received from various interfaces to the computing system (e.g., keyboard and mouse, printer port, Local Area Network (LAN) port, modem port, etc.) or retrieved from an internal storage element of the computer system (e.g., hard disk drive) are often temporarily queued into system memory 503 prior to their being operated upon by the one or more processor(s) 501 in the implementation of a software program. Similarly, data that a software program determines should be sent from the computing system to an outside entity through one of the computing system interfaces, or stored into an internal storage element, is often temporarily queued in system memory 503 prior to its being transmitted or stored.
The ICH 505 is responsible for ensuring that such data is properly passed between the system memory 503 and its appropriate corresponding computing system interface (and internal storage device if the computing system is so designed). The MCH 502 is responsible for managing the various contending requests for system memory 503 access amongst the processor(s) 501, interfaces and internal storage elements that may proximately arise in time with respect to one another.
One or more I/O devices 508 are also implemented in a typical computing system. I/O devices generally are responsible for transferring data to and/or from the computing system (e.g., a networking adapter); or, for large scale non-volatile storage within the computing system (e.g., hard disk drive). ICH 505 has bi-directional point-to-point links between itself and the observed I/O devices 508.
Portions of various embodiments of the present invention may be provided as a computer program product, which may include a machine-readable medium having stored thereon computer program instructions, which may be used to program a computer (or other electronic devices) to perform a process according to the embodiments of the present invention. The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, compact disk read-only memory (CD-ROM), and magneto-optical disks, ROM, RAM, erasable programmable read-only memory (EPROM), electrically EPROM (EEPROM), magnetic or optical cards, flash memory, or another type of media/machine-readable medium suitable for storing electronic instructions.
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.