The present invention relates to sensor devices, and more particularly, to sensor devices utilized for activity detection and analytics.
Sensors, sensor devices, and wearable devices are utilized in a variety of applications, including the detection and identification of a user's activities (e.g. walking, running). Conventional sensor devices and activity classification devices suffer from various inaccuracies, including false positive detection of the user's activities. In certain situations, such as riding in a vehicle or biking, conventional pedometers over-count because repetitive acceleration data signatures are mistaken for steps even though the user is not taking any. Therefore, there is a strong need for a solution that overcomes the aforementioned issues. The present invention addresses such a need.
A method and system for activity detection and analytics are disclosed. In a first aspect, the method comprises determining a context and providing the determined context and one or more outputs from at least one sensor to an analytics engine to provide analytics results.
In a second aspect, the system includes at least one sensor and a processing system coupled to the at least one sensor, wherein the processing system includes an analytics engine that is configured to receive a determined context and one or more outputs from at least one sensor to provide analytics results.
The accompanying figures illustrate several embodiments of the invention and, together with the description, serve to explain the principles of the invention. One of ordinary skill in the art readily recognizes that the embodiments illustrated in the figures are merely exemplary, and are not intended to limit the scope of the present invention.
The present invention relates to sensor devices, and more particularly, to sensor devices utilized for activity detection and analytics. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the preferred embodiment and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the present invention is not intended to be limited to the embodiments shown but is to be accorded the widest scope consistent with the principles and features described herein.
A method and system in accordance with the present invention provide a wearable device and platform that detect and classify a user's activity based upon determined contexts, using an algorithm that guards against erroneous activity classifications and provides accurate activity classifications when activity is present. By integrating sensors, an activity recognition engine that includes the algorithm, and an analytics engine into the wearable device, activities are detected with a certain confidence, which adaptively changes the decision parameters of the analytics engine that estimates the analytics.
In the described embodiments, “raw data” refers to measurement outputs from sensors which are not yet processed. “Motion data” refers to processed sensor data. Processing of the data by the wearable device may be accomplished by applying a sensor fusion algorithm or any other algorithm. In the case of the sensor fusion algorithm, data from one or more sensors are combined to provide an orientation of the device, including but not limited to a heading angle and/or confidence value. The predefined reference in world coordinates refers to a coordinate system in which one axis aligns with the earth's gravity, a second axis points toward magnetic north, and the third axis is orthogonal to the first two.
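The world-coordinate reference described above can be sketched as follows; the function names, vector layout, and sample values are illustrative assumptions, not part of the invention. Gravity (from an accelerometer) fixes the first axis, and the magnetometer reading fixes the north direction via two cross products.

```python
import math

def cross(a, b):
    # Standard 3-vector cross product.
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def world_frame(gravity, magnetic):
    """Build the world coordinate frame described above: one axis aligned
    with the earth's gravity, a second pointing toward magnetic north in
    the horizontal plane, and a third orthogonal to both. Inputs are
    3-vectors in device coordinates (accelerometer and magnetometer
    readings)."""
    up = normalize(gravity)                  # axis 1: along gravity
    east = normalize(cross(magnetic, up))    # axis 3: orthogonal to both
    north = cross(up, east)                  # axis 2: horizontal north
    return north, east, up
```

For example, a device lying flat (gravity along its z axis) with the magnetic field tilted downward toward its y axis yields the identity frame: north (0, 1, 0), east (1, 0, 0), up (0, 0, 1).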
To describe the features of the present invention in more detail, refer now to the following description in conjunction with the accompanying Figures.
In one embodiment, an integrated system of the present invention includes a motion tracking device (also referred to as a Motion Processing Unit (MPU)) that includes sensors and electronic circuits.
In one embodiment, the sensors 160 and external sensors 130 each provide measurements along three axes that are orthogonal relative to each other, referred to as a 9-axis device. In other embodiments, the sensors 160 and/or external sensors 130 may not provide measurements along one or more axes. In one embodiment, the electronic circuits receive and process measured outputs (e.g. sensor data) from one or more sensors. In another embodiment, the sensor data is processed on a processor on a different substrate/chip.
In one embodiment, the sensors 160 are formed on a first substrate (e.g. a first silicon substrate) and the electronic circuits are formed on a second substrate (e.g. a second silicon substrate). In one embodiment, the first substrate is vertically stacked, attached and electrically connected to the second substrate in a single semiconductor chip. In another embodiment, the first substrate is vertically stacked, attached and electrically connected to the second substrate in a single package. In yet another embodiment, the first and second substrates are electrically connected and placed next to each other in a single package.
In some embodiments, the processor 140, the memory 150, and the sensors 160 are formed on different chips, and in other embodiments, the processor 140, the memory 150, and the sensors 160 are formed and reside on the same chip. In yet other embodiments, the sensor fusion algorithm employed in calculating the orientation, the activity detection algorithm, and the analytics are performed externally to the processor 140 and the MPU 190. In still other embodiments, the sensor fusion is determined by the MPU 190. In yet another embodiment, the sensor fusion, the activity detection algorithm, and the analytics are determined by both the processor 140 and the application processor 110.
In one embodiment, the processor 140 executes code, according to an algorithm in the memory 150, to process data that is stored in the memory 150 and that is detected by the sensors 160. In another embodiment, the application processor 110 sends data to or retrieves data from the application memory 120 and is coupled to the processor 140. In one embodiment, the application in the processor 140 can be one of a variety of application types including but not limited to a navigation system, compass accuracy application, remote control application, 3-dimensional camera application, industrial automation application, and any other motion tracking type application. For the 3-dimensional camera application, a bias error or sensitivity error is estimated by the processor 140. It is understood that this is not an exhaustive list of applications and that other applications are contemplated. It will be appreciated that these and other embodiments of the present invention are readily understood as a result of the present invention wherein the system 100 of
In one embodiment, the determined context is utilized to classify an activity of the user to provide further analytics results. Examples of activities and analytics are listed below.
In one embodiment, the analytics engine and the activity recognition engine receive the one or more outputs from the same sensor. In another embodiment, the analytics engine and the activity recognition engine receive the one or more outputs from different sensors.
In one embodiment, the analytics engine utilizes a threshold to provide the analytics results. In one embodiment, the threshold is a predetermined and preset value that is based upon previous machine learning data sets and/or other sampling techniques. In another embodiment, the threshold is dynamically and continuously adjusted based upon the classified activity that is outputted from the activity recognition engine and that serves as an input into the analytics engine.
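The two threshold strategies above (a preset, machine-learned value versus a value dynamically adjusted from the classified activity) can be sketched as follows. The threshold values and activity names are hypothetical assumptions for illustration, not values from the source.

```python
# Hypothetical preset threshold, e.g. learned offline from labeled
# training data sets as the text suggests (units: g of peak acceleration).
PRESET_THRESHOLD = 1.2

# Hypothetical per-activity thresholds used for dynamic adjustment.
ACTIVITY_THRESHOLDS = {
    "biking": 2.0,   # higher: suppress false step peaks while pedaling
    "walking": 1.0,  # lower: count genuine steps more sensitively
    "running": 1.4,
}

def analytics_threshold(classified_activity=None):
    """Return the threshold the analytics engine should use: the preset
    default when no classification is available, or a dynamically
    adjusted value keyed on the activity recognition engine's output."""
    if classified_activity is None:
        return PRESET_THRESHOLD
    return ACTIVITY_THRESHOLDS.get(classified_activity, PRESET_THRESHOLD)
```

Unrecognized activities fall back to the preset value, so the analytics engine always has a usable threshold.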
In one embodiment, the classified activity that is determined by the activity recognition engine includes a variety of activities including but not limited to biking, running, walking, driving, and sleeping. In one embodiment, the analytics results that are determined by the analytics engine include a variety of results including but not limited to steps per minute (SPM), number of steps, distance, speed, stride length, energy, calories, heart rate, and exercise counts. In one embodiment, the analytics results that are outputted by the analytics engine are utilized by the activity recognition engine to determine the classified activity. In another embodiment, the classified activity is determined in accordance with a context that has been established by the wearable device, wherein the context includes but is not limited to particular situations, environments, and controlling activities of the user such as gesturing.
In one embodiment, the gesturing of the user that establishes the context includes but is not limited to any of a touch, button, tap, signature, audio, command operation, image, bio signal, heart rate monitor, and movement. In this embodiment, the wearable device includes a gesture detector that is utilized to detect contact of a user with a touch sensor of the wearable device. The gesture detector may also comprise an accelerometer to detect acceleration data of the system/user and a gyroscope to detect rotation of the system/user, wherein the accelerometer and the gyroscope are utilized in combination to generate motion data.
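A minimal sketch of such a gesture detector follows, assuming a tap-style gesture: the accelerometer and gyroscope outputs are combined, and a tap is flagged when a sharp acceleration impulse occurs with little simultaneous rotation. The sample format and thresholds are illustrative assumptions, not from the source.

```python
def detect_taps(samples, accel_thresh=2.5, gyro_thresh=1.0):
    """Flag tap gestures in a stream of combined motion data.
    samples is a list of (accel_magnitude_g, gyro_magnitude_rad_s)
    tuples; a tap is a sample whose acceleration magnitude exceeds
    accel_thresh while rotation stays below gyro_thresh. Returns the
    sample indices at which taps were detected."""
    taps = []
    for i, (accel, gyro) in enumerate(samples):
        if accel > accel_thresh and gyro < gyro_thresh:
            taps.append(i)  # sharp, non-rotational impulse -> tap
    return taps
```

Requiring low gyroscope output is what separates a deliberate tap from ordinary arm swings, which produce both acceleration and rotation.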
In one embodiment, the activity recognition engine and the analytics engine are capable of receiving user input and instructions. In one embodiment, the user input and instructions include but are not limited to threshold values, activity classification categories, and time periods for analysis.
In one embodiment, the wearable device includes at least one sensor and a processing system (e.g. processor) coupled to the at least one sensor. In one embodiment, the processing system includes an activity recognition engine and an analytics engine. In another embodiment, the wearable device further includes a power management unit that is controlled by any of the activity recognition engine, the determined context, the classified activity, the analytics engine, and the analytics results. In one embodiment, the at least one sensor is dynamically selected based on any of the activity recognition engine, the determined context, the classified activity, the analytics engine, and the analytics results.
The classified activity 330 includes but is not limited to activities such as biking, running, walking, driving, sleeping, and gestures. The classified activity 330 and data from the sensors 310 are both input into the analytics algorithm module 340 that determines a plurality of analytics 350 outputs including but not limited to steps per minute (SPM), number of steps, distance, speed, stride length, energy, calories, heart rate, and exercise counts. In another embodiment, only one of the classified activity 330 and data from the sensors 310 are input into the analytics algorithm module 340 for processing. In one embodiment, the activity recognition module 320 that detects the user's activity and the analytics algorithm module 340 that determines various analytics related to classified activity 330 can be executed on the same processor 140 or the same application processor 110 of the system 100 of
The classified activity 430 includes but is not limited to activities such as biking, running, walking, driving, sleeping, and gestures. The classified activity 430 and data from the sensors 410 are both inputted into the pedometer algorithm module 440 that determines a step count 450 output. In another embodiment, only one of the classified activity 430 and data from the sensors 410 are inputted into the pedometer algorithm module 440 for processing. In one embodiment, the activity recognition module 420 that detects the user's activity and the pedometer algorithm module 440 that determines a step count related to the classified activity 430 can be executed on the same processor 140 or the same application processor 110 of the system 100 of
In one embodiment, the activity/pedometer analytics modules 340/440 are dynamically and automatically adjusted based upon the received classified activity 330/430. For example, if the classified activity 330/430 is determined to be biking, the activity/pedometer analytics module 340/440 is adjusted to increase a peak threshold and increase the cadence count so that the activity/pedometer analytics module 340/440 is immune or less sensitive to various analytics such as counting steps. If the classified activity 330/430 is determined to be persistent walking, the activity/pedometer analytics module 340/440 is adjusted to decrease the peak threshold and decrease the cadence count so that the activity/pedometer analytics module 340/440 is more accurate in the determination of various analytics such as counting steps.
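The adaptive adjustment described above can be sketched as a simple peak-detection step counter whose peak threshold and minimum peak spacing depend on the classified activity; all numeric values are illustrative assumptions, not parameters from the source.

```python
def count_steps(accel_mags, classified_activity, sample_hz=50):
    """Count steps in a series of acceleration magnitudes (in g).
    For biking, the peak threshold and minimum peak spacing are raised
    so pedaling signatures are ignored; for persistent walking they are
    lowered so genuine steps are counted more accurately."""
    if classified_activity == "biking":
        peak_thresh, min_gap_s = 3.0, 0.6   # immune to pedaling peaks
    elif classified_activity == "walking":
        peak_thresh, min_gap_s = 1.1, 0.25  # sensitive to genuine steps
    else:
        peak_thresh, min_gap_s = 1.5, 0.3   # generic default
    min_gap = int(min_gap_s * sample_hz)    # spacing in samples

    steps, last_peak = 0, -min_gap
    for i in range(1, len(accel_mags) - 1):
        # A step candidate is a local maximum above the peak threshold
        # that is far enough from the previously accepted peak.
        is_peak = accel_mags[i - 1] < accel_mags[i] > accel_mags[i + 1]
        if is_peak and accel_mags[i] > peak_thresh and i - last_peak >= min_gap:
            steps += 1
            last_peak = i
    return steps
```

With this sketch, a repetitive 1.3 g signature is counted as steps under the walking parameters but rejected outright under the biking parameters, which is the over-counting guard the text describes.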
In one embodiment, the classified activity and the plurality of analytics outputs are stored internally by the wearable device system in the memory. In another embodiment, the classified activity and the plurality of analytics outputs are transmitted by the wearable device to a different device or a cloud computer system and network for storage and display on a screen. The different device or the cloud computer system and network utilizes the compiled and stored data received from the wearable device to communicate with the wearable device and to update and improve upon the activity recognition engine/module and the analytics algorithm engine/module.
As described above, a method and system in accordance with the present invention utilize a wearable platform and a Motion Processing Unit (MPU) to detect various data signals via sensors and to analyze the detected data signals for automatic and continuous classification into various activities. By integrating various sensors, an activity recognition engine, and an analytics engine that includes an algorithm into the wearable device, various user activities are classified (e.g. walking, running, biking, etc.) and various metrics (e.g. step count, speed, distance, etc.) associated with that activity classification are determined. The MPU updates the algorithm utilized by the analytics engine based upon the outputted activity classifications and previous analytics metric determinations. It will be appreciated that the present invention has many implementations and uses not expressly stated herein.
A method and system for activity classification and analytics by a wearable device have been disclosed. Embodiments described herein can take the form of an entirely hardware implementation, an entirely software implementation, or an implementation containing both hardware and software elements. Embodiments may be implemented in software, which includes, but is not limited to, application software, firmware, resident software, microcode, etc. Embodiments described herein may also take the form where the entirety of the wearable device, sensors, and one or more remote devices or servers are co-located or integrated into the same or proximate device. In such an embodiment, the entirety of the present invention is integrated into one device. A method and system in accordance with the present invention are not so limited, however.
The steps described herein may be implemented using any suitable controller or processor, and software application, which may be stored on any suitable storage location or computer-readable medium. The software application provides instructions that enable the processor to perform the functions described herein.
Furthermore, embodiments may take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The medium may be an electronic, magnetic, optical, electromagnetic, infrared, semiconductor system (or apparatus or device), or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), non-volatile read/write flash memory, a rigid magnetic disk, and an optical disk. Current examples of optical disks include DVD, compact disk-read-only memory (CD-ROM), and compact disk-read/write (CD-R/W).
Although the present invention has been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations to the embodiments and those variations would be within the spirit and scope of the present invention. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.
This application claims benefit under 35 USC 119(e) of U.S. Provisional Patent Application No. 61/899,794, filed on Nov. 4, 2013, entitled “METHOD TO IMPROVE ACTIVITY DETECTION AND ANALYTICS,” which is incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
61/899,794 | Nov. 4, 2013 | US