Sensor based context management

Information

  • Patent Grant
  • Patent Number
    11,607,144
  • Date Filed
    Monday, December 19, 2016
  • Date Issued
    Tuesday, March 21, 2023
Abstract
According to an example aspect of the present invention, there is provided an apparatus comprising a memory configured to store first-type sensor data, at least one processing core configured to compile a message based at least partly on the first-type sensor data, to cause the message to be transmitted from the apparatus, to cause receiving in the apparatus of a machine readable instruction, and to derive an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
Description
FIELD

The present invention relates to identification of user activity based on sensor information.


BACKGROUND

User sessions, such as training sessions, may be recorded, for example in notebooks, spreadsheets or other suitable media. Recorded training sessions enable more systematic training, and progress toward set goals can be assessed and tracked from the records so produced. Such records may be stored for future reference, for example to assess progress an individual is making as a result of the training. An activity session may comprise a training session or another kind of session.


Personal devices, such as, for example, smart watches, smartphones or smart jewellery, may be configured to produce recorded sessions of user activity. Such recorded sessions may be useful in managing physical training, child safety or in professional uses. Recorded sessions, or more generally sensor-based activity management, may be of varying type, such as, for example, running, walking, skiing, canoeing, wandering, or assisting the elderly.


Recorded sessions may be viewed using a personal computer, for example, wherein recordings may be copied from a personal device to the personal computer. Files on a personal computer may be protected using passwords and/or encryption, for example.


Personal devices may be furnished with sensors, which may be used, for example, in determining a location of the personal device. For example, a satellite positioning sensor may receive positioning information from a satellite constellation, and deduce therefrom where the personal device is located. A recorded training session may comprise a route determined by repeatedly determining the location of the personal device during the training session. Such a route may be later observed using a personal computer, for example.


Alternatively to a satellite positioning sensor, a personal device may be configured to determine its location using, for example, a cellular network based location determining method, wherein a cellular network is used to assist determination of the location. For example, a cell that maintains attachment of the personal device to the cellular network may have a known location, providing a location estimate of the personal device owing to the attachment and a finite geographic extent of the cell.


SUMMARY OF THE INVENTION

The invention is defined by the features of the independent claims. Some specific embodiments are defined in the dependent claims.


According to a first aspect of the present invention, there is provided an apparatus comprising a memory configured to store first-type sensor data, at least one processing core configured to compile a message based at least partly on the first-type sensor data, to cause the message to be transmitted from the apparatus, to cause receiving in the apparatus of a machine readable instruction, and to derive an estimated activity type, using the machine readable instruction, based at least partly on sensor data.


Various embodiments of the first aspect may comprise at least one feature from the following bulleted list:

    • the machine readable instruction comprises at least one of the following: an executable program, an executable script and a set of at least two machine-readable characteristics, wherein each of the characteristics characterizes sensor data produced during a predefined activity type
    • the at least one processing core is configured to derive the estimated activity type at least in part by comparing, using the machine readable instruction, the first-type sensor data, or a processed form of the first-type sensor data, to reference data
    • the first-type sensor data comprises acceleration sensor data
    • the memory is further configured to store second-type sensor data, and wherein the at least one processing core is configured to derive the estimated activity type, using the machine readable instruction, based at least in part on the second-type sensor data
    • the second-type sensor data is of a different type than the first-type sensor data
    • the second-type sensor data comprises at least one of: sound sensor data, microphone-derived data and vibration sensor data
    • the at least one processing core is configured to derive the estimated activity type at least in part by comparing the second-type sensor data, or a processed form of the second-type sensor data, to reference data, the reference data comprising reference data of a first type and a second type
    • the at least one processing core is configured to present the estimated activity type to a user for verification
    • the at least one processing core is configured to cause the memory to store, in a sequence of estimated activity types, the estimated activity type and a second estimated activity type
    • the at least one processing core is configured to cause the memory to delete the machine readable instruction responsive to a determination that an activity session has ended.


According to a second aspect of the present invention, there is provided an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to receive a message from a user device, the message comprising information characterizing first-type sensor data, determine, based at least partly on the first-type sensor data, an activity context, and transmit to the user device a machine-readable instruction configured to cause activity type determination in the activity context.


According to a third aspect of the present invention, there is provided a method, comprising storing first-type sensor data in an apparatus, compiling a message based at least partly on the first-type sensor data, causing the message to be transmitted from the apparatus, causing receiving in the apparatus of a machine readable instruction, and deriving an estimated activity type, using the machine readable instruction, based at least partly on sensor data.


Various embodiments of the third aspect may comprise at least one feature corresponding to a feature in the preceding bulleted list laid out in connection with the first aspect.


According to a fourth aspect of the present invention, there is provided a method, comprising receiving a message from a user device, the message comprising information characterizing first-type sensor data, determining, based at least partly on the first-type sensor data, an activity context, and transmitting to the user device a machine-readable instruction configured to cause activity type determination in the activity context.


According to a fifth aspect of the present invention, there is provided an apparatus comprising means for storing first-type sensor data in an apparatus, means for compiling a message based at least partly on the first-type sensor data, means for causing the message to be transmitted from the apparatus, means for causing receiving in the apparatus of a machine readable instruction, and means for deriving an estimated activity type, using the machine readable instruction, based at least partly on sensor data.


According to a sixth aspect of the present invention, there is provided a non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause an apparatus to at least store first-type sensor data, compile a message based at least partly on the first-type sensor data, cause the message to be transmitted from the apparatus, cause receiving in the apparatus of a machine readable instruction, and derive an estimated activity type, using the machine readable instruction, based at least partly on sensor data.


According to a seventh aspect of the present invention, there is provided a computer program configured to cause a method in accordance with at least one of the third and fourth aspects to be performed.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an example system in accordance with at least some embodiments of the present invention;



FIG. 2 illustrates an example multisensorial time series;



FIG. 2B illustrates a second example multisensorial time series;



FIG. 3 illustrates an example apparatus capable of supporting at least some embodiments of the present invention;



FIG. 4 illustrates signalling in accordance with at least some embodiments of the present invention, and



FIG. 5 is a flow graph of a method in accordance with at least some embodiments of the present invention.





EMBODIMENTS

Employing sensor data to determine an activity type enhances usability of personal devices. For example, a user may be partly disabled, making unassisted use of a device more difficult. Employing sensor data from more than one sensor may further enhance the accuracy of activity type estimation. To empower a reduced-capability user device to determine an activity type, a machine-readable instruction may be selected for, and provided to, the user device by a back-end server. The selecting may be based on sensor data collected by the user device, such that the machine-readable instruction enables the user device to derive an estimated activity type in the user context where the user device is. Thus the derivation is performed partly in the back-end server, enabling the user device to provide a good estimate using only limited processor, energy and/or memory resources.



FIG. 1 illustrates an example system in accordance with at least some embodiments of the present invention. The system comprises device 110, which may comprise, for example, a smart watch, digital watch, smartphone, phablet device, tablet device, or another type of suitable device. Device 110 may comprise a display, which may comprise a touchscreen display, for example. The display may be limited in size. Device 110 may be powered, for example, by a rechargeable battery. An example of a limited-size display is a display worn on a wrist.


Device 110 may be communicatively coupled with a communications network. For example, in FIG. 1 device 110 is coupled, via wireless link 112, with base station 120. Base station 120 may comprise a cellular or non-cellular base station, wherein a non-cellular base station may be referred to as an access point. Examples of cellular technologies include wideband code division multiple access, WCDMA, and long term evolution, LTE, while examples of non-cellular technologies include wireless local area network, WLAN, and worldwide interoperability for microwave access, WiMAX. Base station 120 may be coupled with network node 130 via connection 123. Connection 123 may be a wire-line connection, for example. Network node 130 may comprise, for example, a controller or gateway device. Network node 130 may interface, via connection 134, with network 140, which may comprise, for example, the Internet or a corporate network. Network 140 may be coupled with further networks via connection 141. In some embodiments, device 110 is not configured to couple with base station 120. Network 140 may comprise, or be communicatively coupled with, a back-end server, for example.


Device 110 may be configured to receive, from satellite constellation 150, satellite positioning information via satellite link 151. The satellite constellation may comprise, for example, the global positioning system, GPS, or the Galileo constellation. Satellite constellation 150 may comprise more than one satellite, although only one satellite is illustrated in FIG. 1 for the sake of clarity. Likewise, receiving the positioning information over satellite link 151 may comprise receiving data from more than one satellite.


Alternatively or additionally to receiving data from a satellite constellation, device 110 may obtain positioning information by interacting with a network in which base station 120 is comprised. For example, cellular networks may employ various ways to position a device, such as trilateration, multilateration or positioning based on an identity of a base station with which attachment is possible or ongoing. Likewise a non-cellular base station, or access point, may know its own location and provide it to device 110, enabling device 110 to position itself within communication range of this access point.


Device 110 may be configured to obtain a current time from satellite constellation 150, base station 120 or by requesting it from a user, for example. Once device 110 has the current time and an estimate of its location, device 110 may consult a look-up table, for example, to determine a time remaining until sunset or sunrise, for example. Device 110 may likewise gain knowledge of the time of year.


Device 110 may comprise, or be coupled with, at least one sensor, such as, for example, an acceleration sensor, moisture sensor, temperature sensor, heart rate sensor or a blood oxygen level sensor. Device 110 may be configured to produce and store, using the at least one sensor, sensor data, for example in a time series that comprises a plurality of samples taken in a time sequence.


Device 110 may be configured to provide an activity session. An activity session may be associated with an activity type. Examples of activity types include rowing, paddling, cycling, jogging, walking, hunting, swimming and paragliding. In a simple form, an activity session may comprise device 110 storing sensor data produced with sensors comprised in device 110, or in another device with which device 110 is associated or paired. An activity session may be determined to have started and ended at certain points in time, such that the determination takes place afterward or concurrently with the starting and/or ending. In other words, device 110 may store sensor data to enable subsequent identification of activity sessions based at least partly on the stored sensor data.


An activity session in device 110 may enhance the utility a user can obtain from the activity. For example, where the activity involves movement outdoors, the activity session may provide a recording of that movement. Device 110 may, in some embodiments, provide the user with contextual information. Such contextual information may comprise, for example, locally relevant weather information, received via base station 120, for example. Such contextual information may comprise at least one of the following: a rain warning, a temperature warning, an indication of time remaining before sunset, an indication of a nearby service that is relevant to the activity, a security warning, an indication of nearby users and an indication of a nearby location where several other users have taken photographs. Contextual information may be presented during an activity session.


A recording of an activity session may comprise information on at least one of the following: a route taken during the activity session, a metabolic rate or metabolic effect of the activity session, a time the activity session lasted, a quantity of energy consumed during the activity session, a sound recording obtained during the activity session and an elevation map along the length of the route taken during the activity session. A route may be determined based on positioning information, for example. Metabolic effect and consumed energy may be determined, at least partly, based on information concerning the user that device 110 has access to. A recording may be stored in device 110, an auxiliary device, or in a server or data cloud storage service. A recording stored in a server or cloud may be encrypted prior to transmission to the server or cloud, to protect privacy of the user. A recording may be produced even if the user has not indicated an activity session has started, since a beginning and ending of an activity session may be determined after the session has ended, for example based, at least partly, on sensor data.


Device 110 may have access to a backhaul communications link to provide indications relating to ongoing activity. For example, search and rescue services may be given access to information on joggers in a certain area of a forest, to enable their rescue if a chemical leak, for example, makes the forest unsafe for humans. Alternatively or additionally, further users may be enabled to receive information on ongoing activity sessions. Such further users may be pre-configured as users who are cleared to receive such information, with non-cleared users not being provided such information, for example. As a specific example, users on a friend list may be able to obtain information on an ongoing activity session. The friend list may be maintained in a social media service, for example. The information on the ongoing activity session may be provided from device 110 as periodic updates, for example.


After an activity has ended, device 110 may have stored therein, or in a memory to which device 110 has access, sensor data. The stored sensor data may be stored as a time series that spans the activity session as well as time preceding and/or succeeding the activity session. The beginning and ending points in time of the activity session may be selected from the time series by the user. The beginning and ending points may be pre-selected by device 110, or a personal computer, for a user to accept or reject, the pre-selecting being based on changes in the time series. For example, where, in the time series, acceleration sensor data begins to indicate more active movements of device 110, a beginning point of an activity session may be pre-selected. Such a change may correspond to a time in the time series when the user stopped driving a car and began jogging, for example. Likewise, a phase in the time series where the more active movements end may be pre-selected as an ending point of the activity session.
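
As a concrete illustration of such pre-selection, the sketch below flags the span where a rolling mean of the acceleration magnitude rises clearly above its background level. The window length, the ratio of 2.0 and the helper name preselect_boundaries are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def preselect_boundaries(magnitudes, window=50, ratio=2.0):
    """Pre-select beginning and ending sample indices of an activity
    session as the span where a rolling mean of the acceleration
    magnitude exceeds `ratio` times its overall median."""
    kernel = np.ones(window) / window
    rolling = np.convolve(np.abs(magnitudes), kernel, mode="same")
    active = np.flatnonzero(rolling > ratio * np.median(rolling))
    return (int(active[0]), int(active[-1])) if active.size else None

# Quiet driving, then vigorous jogging, then quiet again:
trace = np.concatenate([0.1 * np.ones(200), np.ones(300), 0.1 * np.ones(200)])
print(preselect_boundaries(trace))  # approximately the jogging span
```

The user would then accept or reject the proposed beginning and ending points, as described above.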


Sensor data may comprise information from more than one sensor, wherein the more than one sensor may comprise sensors of at least two distinct types. For example, sensor data may comprise acceleration sensor data and air pressure sensor data. Further examples are sound volume sensor data, moisture sensor data and electromagnetic sensor data. In general sensor data from a sensor of a first type may be referred to as first-type sensor data, and sensor data from a sensor of a second type may be referred to as second-type sensor data. A type of a sensor may be defined by a physical property the sensor is configured to measure. For example, all temperature sensors may be considered temperature-type sensors, regardless of the physical principle used to measure temperature in the temperature sensor.


Pre-selecting the beginning and/or ending points in the time series may comprise detecting that sensor data characteristics of more than one sensor type change at approximately the same phase of the time series. Using more than one type of sensor data may enhance the accuracy of beginning and/or ending time point pre-selection, in case the sensors in question are affected by the activity performed during the activity session.


An activity type may be determined based, at least partly, on the sensor data. This determining may take place when the activity is occurring, or afterwards, when analysing the sensor data. The activity type may be determined by device 110 or by a personal computer that has access to the sensor data, for example, or a server that is provided access to the sensor data. Where a server is given access to the sensor data, the sensor data may be anonymized. The determination of the activity type may comprise comparing the sensor data to reference data. The reference data may comprise reference datasets, each reference dataset being associated with an activity type. The determination may comprise determining the reference dataset that most resembles the sensor data, for example in a least-squares sense. Alternatively to the sensor data itself, a processed form of the sensor data may be compared to the reference data. The processed form may comprise, for example, a frequency spectrum obtained from the sensor data. Alternatively, the processed form may comprise a set of local minima and/or maxima from the sensor data time series. The determined activity type may be selected as the activity type associated with the reference dataset that most resembles the processed or original sensor data.
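
A minimal sketch of the least-squares comparison described above, assuming equal-length traces; the reference values and the name REFERENCE_DATASETS are hypothetical:

```python
import numpy as np

# Hypothetical reference datasets: one short acceleration-magnitude
# trace per activity type (values are illustrative only).
REFERENCE_DATASETS = {
    "walking": np.array([1.0, 1.2, 1.0, 1.2, 1.0, 1.2]),
    "jogging": np.array([1.0, 2.0, 1.0, 2.0, 1.0, 2.0]),
}

def estimate_activity_type(sensor_data):
    """Return the activity type whose reference dataset most resembles
    the sensor data in a least-squares sense."""
    errors = {
        activity: float(np.sum((sensor_data - reference) ** 2))
        for activity, reference in REFERENCE_DATASETS.items()
    }
    return min(errors, key=errors.get)

print(estimate_activity_type(np.array([1.1, 1.9, 1.0, 2.1, 0.9, 2.0])))  # jogging
```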


Different activity types may be associated with different characteristic frequencies. For example, acceleration sensor data may reflect a higher characteristic frequency when the user has been running, as opposed to walking. Thus the determination of activity type may be based, in some embodiments, at least partly, on deciding which reference dataset has a characteristic frequency that most closely matches a characteristic frequency of a section of the sensor-derived information time series under investigation. Alternatively or in addition, acceleration sensor data may be employed to determine a characteristic movement amplitude.
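
A characteristic frequency could, for instance, be taken as the dominant non-DC component of the spectrum. The following sketch assumes uniformly sampled acceleration magnitudes and is illustrative only:

```python
import numpy as np

def characteristic_frequency(samples, sampling_rate_hz):
    """Return the dominant non-DC frequency (Hz) of a sensor trace;
    running would typically yield a higher value than walking."""
    spectrum = np.abs(np.fft.rfft(samples - np.mean(samples)))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sampling_rate_hz)
    return float(freqs[np.argmax(spectrum)])

# A 2 Hz "running-like" signal sampled at 50 Hz:
t = np.arange(0, 5, 1 / 50)
print(characteristic_frequency(np.sin(2 * np.pi * 2.0 * t), 50.0))  # ~2.0
```

The reference dataset whose characteristic frequency lies closest to this value would then be selected.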


Where device 110 is configured to store a time series of more than one type of sensor data, plural sensor data types may be employed in determining the activity type. The reference data may comprise reference datasets that are multi-sensorial in nature in such a way that each reference dataset comprises data that may be compared to each sensor data type that is available. For example, where device 110 is configured to compile a time series of acceleration and sound sensor data types, the reference data may comprise reference datasets, each reference dataset corresponding to an activity type, wherein each reference dataset comprises data that may be compared with the acceleration data and data that may be compared with the sound data. The determined activity type may be the activity type associated with the multi-sensorial reference dataset that most closely matches the sensor data stored by device 110. Again, original or processed sensor data may be compared to the reference datasets. Where device 110 comprises, for example, a smartphone, it may comprise plural sensors to accomplish the smartphone function. Examples of such sensors comprise microphones to enable voice calls and cameras to enable video calls. Furthermore, a radio receiver may, in some cases, be configurable to measure electric or magnetic field properties. Device 110 may comprise a radio receiver, in general, where device 110 is furnished with a wireless communication capability.


A first example of multi-sensorial activity type determination is hunting, wherein device 110 stores first-type sensor data that comprises acceleration sensor data and second-type sensor data that comprises sound data. The reference data would comprise a hunting reference dataset, which would comprise acceleration reference data and sound reference data, to enable comparison with sensor data stored by device 110. Hunting may involve periods of low sound and low acceleration and intermittent combinations of loud, short sound and a low-amplitude high-frequency acceleration corresponding to a gunshot sound and kick.


A second example of multi-sensorial activity type determination is swimming, wherein device 110 stores first-type sensor data that comprises moisture sensor data and second-type sensor data that comprises magnetic field data from a compass sensor. The reference data would comprise a swimming reference dataset, which would comprise moisture reference data and magnetic field reference data, to enable comparison with sensor data stored by device 110. Swimming may involve high moisture due to being immersed in water, and elliptical movements of an arm, to which device 110 may be attached, which may be detectable as periodically varying magnetic field data. In other words, the direction of the Earth's magnetic field may vary from the point of view of the magnetic compass sensor in a periodic way in the time series.


Overall, a determined, or derived, activity type may be considered an estimated activity type until the user has confirmed the determination is correct. In some embodiments, a few, for example two or three, most likely activity types may be presented to the user as estimated activity types for the user to choose the correct activity type from. Using two or more types of sensor data increases the likelihood that the estimated activity type is correct.


A context process may be employed in deriving an estimated activity type based on sensor data. A context process may comprise first determining a context in which the sensor data has been produced. For example, the context process may comprise using the sensor data to determine the context, such as a user context, and then deriving an activity type within that context. For example, a context may comprise outdoor activity, and deriving an estimated activity type may comprise first determining, based on the sensor data, the user is in an outdoor context, selecting an outdoor-context machine readable instruction, and using the machine-readable instruction to differentiate between different outdoor-context activity types, such as jogging and orienteering. As another example, a context may comprise indoor activity, and deriving an estimated activity type may comprise first determining, based on the sensor data, the user is in an indoor context, selecting an indoor-context machine readable instruction, and using this machine-readable instruction to differentiate between different indoor activity types, such as 100 meter runs and wrestling.
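
In code, the two-phase context process might look like the sketch below; the contexts, thresholds and the per-context rules standing in for machine readable instructions are all invented for illustration:

```python
def determine_context(features):
    """Phase 1: coarse context from sensor-derived features."""
    return "outdoor" if features["position_range_m"] > 200.0 else "indoor"

# One context-specific rule per context, standing in for the
# machine readable instruction selected for that context.
CONTEXT_INSTRUCTIONS = {
    "outdoor": lambda f: "jogging" if f["mean_speed_mps"] > 2.0 else "orienteering",
    "indoor": lambda f: "100m_run" if f["mean_speed_mps"] > 5.0 else "wrestling",
}

def derive_activity_type(features):
    """Phase 2: apply the instruction selected for the context."""
    return CONTEXT_INSTRUCTIONS[determine_context(features)](features)

print(derive_activity_type({"position_range_m": 1500.0, "mean_speed_mps": 2.8}))
# -> jogging
```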


The machine readable instruction may comprise, for example, a script, such as an executable or compilable script, an executable computer program, a software plugin or a non-executable computer-readable descriptor that enables device 110 to differentiate between at least two activity types within the determined context. The machine readable instruction may comprise indications as to which type or types of sensor data, and in which format, are to be used in deriving the activity type using the machine readable instruction.


Determining an outdoor context may comprise determining the sensor data indicates a wide range of geographic movement, indicating the user has roamed outdoors. Determining an indoor context may comprise determining the sensor data indicates a narrower range of geographic movement, indicating the user has remained within a small range during the activity session. Where temperature-type sensor data is available, a lower temperature may be associated with an outdoor activity and a higher temperature may be associated with an indoor activity. The temperature may be indicative of this in particular where the user is in a geographic area where winter, autumn or spring conditions cause an outdoor temperature to be lower than an indoor temperature. The geographic area may be available in positioning data.
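
One possible measure of the range of geographic movement is sketched below, using an equirectangular approximation over (latitude, longitude) fixes; the 200-metre cut-off and the coordinates are illustrative assumptions:

```python
import math

def geographic_range_m(positions):
    """Approximate extent (metres) of the area covered by a list of
    (latitude, longitude) fixes; a wide range suggests an outdoor context."""
    lats = [lat for lat, _ in positions]
    lons = [lon for _, lon in positions]
    mean_lat = math.radians(sum(lats) / len(lats))
    # Equirectangular approximation, adequate for session-sized areas.
    dlat_m = (max(lats) - min(lats)) * 111_320.0
    dlon_m = (max(lons) - min(lons)) * 111_320.0 * math.cos(mean_lat)
    return math.hypot(dlat_m, dlon_m)

track = [(60.1699, 24.9384), (60.1721, 24.9411), (60.1745, 24.9456)]
print(geographic_range_m(track) > 200.0)  # True -> treat as outdoor
```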


Therefore, in some embodiments, deriving an estimated activity type is a two-phase process first comprising determining a context based on the sensor data, and then deriving an estimated activity type within that context, using a machine-readable instruction specific to that context. Selecting the context and/or the activity type within the context may comprise comparing sensor data, or processed sensor data, to reference data. The two-phase process may employ two types of reference data, context-type reference data and activity-type reference data, respectively.


The context process may adaptively learn, based on previous activity sessions recorded by a plurality of users, how to more accurately determine contexts and/or activity types. The determining of context may be based on the context-type reference data, for example, the context-type reference data being adaptively updated in dependence of the previous sessions recorded by the plurality of users. Adapting the context-type reference data may take place in a server, for example, the server being configured to provide updated context-type reference data to devices such as device 110, or a personal computer associated therewith. A server may have access to information from the plurality of users, and high processing capability, and thus be more advantageously placed to update the context-type reference data than device 110, for example.


The two-phase process described above may be performed in a distributed fashion, wherein a user device, such as device 110, initially obtains sensor data of at least one, and in some embodiments at least two, types. This sensor data is used to compile a message, the message comprising the sensor data, at least in part, in raw or processed format, the message being transmitted from the user device to a back-end server. The server may use the sensor data in the message to determine a context in which device 110 seems to be in. This determining may be based on reference data, for example. The server may then provide a machine-readable instruction to the user device, to thereby enable the user device to derive an activity type within the context. This deriving may also be based, at least partly, on reference data. The reference data used in the server need not be the same as the reference data used in the user device.


Selection of the machine readable instruction in the server may be based, in addition to the sensor data, on capabilities of device 110. In particular, the machine readable instruction may be selected so that sensor data device 110 is capable of producing may be accepted as input to the machine readable instruction. To enable this selecting, the message transmitted from device 110 to the server may comprise an indication concerning device 110. Examples of such an indication include a model and make of device 110, a serial number of device 110 and an indication of sensor types disposed in device 110. Alternatively to being comprised in the same message, device 110 may provide the indication to the server in another message.
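
The patent does not fix a message format, but a hedged sketch of what device 110 might compile is shown below; the JSON layout and every field name are assumptions:

```python
import json

def compile_message(device_model, sensor_types, sensor_summary):
    """Compile the message for the back-end server: it characterizes the
    first-type sensor data and indicates the device, so the server can
    select a machine readable instruction the device can actually run."""
    payload = {
        "device": device_model,        # model/make or serial number
        "sensor_types": sensor_types,  # sensor types disposed in the device
        "sensors": sensor_summary,     # raw or processed first-type data
    }
    return json.dumps(payload).encode("utf-8")

msg = compile_message("watch-x1", ["acceleration", "moisture"],
                      {"accel_rms": 1.4, "position_range_m": 1500.0})
```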


An advantage of the distributed two-phase process is that the user device need not be enabled to detect a great range of potential activity types with different characteristics. For example, the size of reference data used in the user device may be reduced by performing context detection in a server. The machine readable instruction may comprise, at least in part, the reference data used in the user device. As the size of reference data may be thus reduced, the user device may be built with less memory and/or the activity type derivation may consume less memory in the user device.


The machine readable instruction may enable detecting events within the context. An example is detecting a number of shots fired when hunting. Further examples include a number of times a down-hill slope has been skied down, a number of laps run on a track, a number of golf strokes played and, where applicable, initial velocities of golf balls immediately following a stroke. Further examples of detecting events are detecting a number of steps during running or strokes during swimming.
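
Event detection within a context could be as simple as counting rising threshold crossings, a crude stand-in for counting steps, strokes or shots; the threshold and the simulated trace are illustrative:

```python
import numpy as np

def count_events(samples, threshold):
    """Count rising crossings of `threshold` in a sensor trace."""
    above = samples > threshold
    return int(np.count_nonzero(~above[:-1] & above[1:]))

# Six simulated footfalls in an acceleration-magnitude trace:
t = np.linspace(0, 3, 300)
trace = 1.0 + 0.8 * np.maximum(0.0, np.sin(2 * np.pi * 2.0 * t))
print(count_events(trace, 1.5))  # 6
```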


While an activity type is in general selectable within a context, an activity type may itself be seen as a context within which a further selection may be possible. For example, an initial context may comprise indoor activity, wherein an activity type may be identified as water sports. Within that activity type, swimming may be derived as an activity type. Further, breaststroke may be derived as an activity type where swimming is seen as a context. Therefore, while the terms “context” and “activity type” are employed herein for convenience, what is relevant is the hierarchical relationship between the two.


A user interface of device 110 may be modified based on the context or activity type. For example, where hunting is derived as an activity type, a user interface may display the number of detected shots. As another example, when swimming the number of laps, or the distance covered, may be displayed. In some embodiments, user interface adaptations have a hierarchical behaviour, such that initially, for example, an outdoor user interface is activated as a response to a determination that the current context is an outdoor context. When an activity type is identified within the context, the user interface may be further adapted to serve that specific activity type. In some embodiments, device 110 is configured to receive from a server user interface information to enable device 110 to adapt to a great number of situations, without a need to store all the user interface adaptations in device 110 beforehand.


In some embodiments, a server may transmit a request message to device 110, the request message being configured to cause device 110 to perform at least one measurement, such as a capturing of sensor data, for example, and to return a result of the measurement to the server. This way, the server may be enabled to detect what is occurring in the surroundings of device 110.


The machine readable instructions may be adapted by the server. For example, a user who first obtains a device 110 may initially be provided, responsive to the messages sent from device 110, with machine readable instructions that reflect an average user population. Thereafter, as the user engages in activity sessions, the machine readable instructions may be adapted to more accurately reflect use by this particular user. For example, limb length may affect periodical properties of sensor data captured while the user is swimming. To enable the adapting, the server may request sensor data from device 110, for example periodically, and compare sensor data so obtained to the machine readable instructions, to hone the instructions for future use with this particular user.



FIG. 2 illustrates an example multisensorial time series. On the upper axis, 201, is illustrated a moisture sensor time series 210 while the lower axis, 202, illustrates a time series 220 of deviation of magnetic north from an axis of device 110.


Moisture time series 210 displays an initial portion of low moisture, followed by a rapid increase of moisture that then remains at a relatively constant, elevated, level before beginning to decline, at a lower rate than the increase, as device 110 dries.


Magnetic deviation time series 220 displays an initial, erratic sequence of deviation changes owing to movement of the user as he operates a locker room lock, for example, followed by a period of approximately periodic movements, before an erratic sequence begins once more. The wavelength of the periodically repeating motion has been exaggerated in FIG. 2 to render the illustration clearer.


A swimming activity type may be determined as an estimated activity type, beginning from point 203 and ending in point 205 of the time series, based on a comparison with a reference dataset comprised in a machine readable instruction associated with a water sports context, for example. Via the reference dataset, swimming as an activity type may be associated with simultaneous high moisture and periodic movements.
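
The FIG. 2 logic could be approximated as below: flag swimming when moisture is elevated and the deviation trace has a clearly dominant periodic component. Both thresholds are invented for illustration:

```python
import numpy as np

def looks_like_swimming(moisture, deviation):
    """True when moisture is high (normalised 0..1) and the magnetic
    deviation spectrum is strongly peaked, i.e. movement is periodic."""
    if np.mean(moisture) < 0.8:
        return False
    spectrum = np.abs(np.fft.rfft(deviation - np.mean(deviation)))[1:]
    return bool(np.max(spectrum) > 3.0 * np.mean(spectrum))

t = np.arange(0.0, 60.0, 0.1)              # 10 Hz samples
wet = np.ones_like(t)                      # immersed throughout
strokes = np.sin(2 * np.pi * 0.5 * t)      # periodic arm movement
print(looks_like_swimming(wet, strokes))   # True
```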



FIG. 2B illustrates a second example multisensorial time series. In FIG. 2B, like numbering denotes like elements as in FIG. 2. Unlike in FIG. 2, not one but two activity sessions are determined in the time series of FIG. 2B. Namely, a cycling session is determined to start at beginning point 207 and to end at point 203, when the swimming session begins. Thus the compound activity session may relate to triathlon, for example. In cycling, moisture remains low, and magnetic deviation changes only slowly, for example as the user cycles in a velodrome.



FIG. 3 illustrates an example apparatus capable of supporting at least some embodiments of the present invention. Illustrated is device 300, which may comprise, for example, a mobile communication device such as device 110 of FIG. 1 or FIG. 2. Comprised in device 300 is processor 310, which may comprise, for example, a single- or multi-core processor wherein a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core. Processor 310 may comprise more than one processor. A processing core may comprise, for example, a Cortex-A8 processing core manufactured by ARM Holdings or a Steamroller processing core produced by Advanced Micro Devices Corporation. Processor 310 may comprise at least one Qualcomm Snapdragon and/or Intel Atom processor. Processor 310 may comprise at least one application-specific integrated circuit, ASIC. Processor 310 may comprise at least one field-programmable gate array, FPGA. Processor 310 may be means for performing method steps in device 300. Processor 310 may be configured, at least in part by computer instructions, to perform actions.


Device 300 may comprise memory 320. Memory 320 may comprise random-access memory and/or permanent memory. Memory 320 may comprise at least one RAM chip. Memory 320 may comprise solid-state, magnetic, optical and/or holographic memory, for example. Memory 320 may be at least in part accessible to processor 310. Memory 320 may be at least in part comprised in processor 310. Memory 320 may be means for storing information. Memory 320 may comprise computer instructions that processor 310 is configured to execute. When computer instructions configured to cause processor 310 to perform certain actions are stored in memory 320, and device 300 overall is configured to run under the direction of processor 310 using computer instructions from memory 320, processor 310 and/or its at least one processing core may be considered to be configured to perform said certain actions. Memory 320 may be at least in part external to device 300 but accessible to device 300.


Device 300 may comprise a transmitter 330. Device 300 may comprise a receiver 340. Transmitter 330 and receiver 340 may be configured to transmit and receive, respectively, information in accordance with at least one cellular or non-cellular standard. Transmitter 330 may comprise more than one transmitter. Receiver 340 may comprise more than one receiver. Transmitter 330 and/or receiver 340 may be configured to operate in accordance with global system for mobile communication, GSM, wideband code division multiple access, WCDMA, long term evolution, LTE, IS-95, wireless local area network, WLAN, Ethernet and/or worldwide interoperability for microwave access, WiMAX, standards, for example.


Device 300 may comprise a near-field communication, NFC, transceiver 350. NFC transceiver 350 may support at least one NFC technology, such as NFC, Bluetooth, Wibree or similar technologies.


Device 300 may comprise user interface, UI, 360. UI 360 may comprise at least one of a display, a keyboard, a touchscreen, a vibrator arranged to signal to a user by causing device 300 to vibrate, a speaker and a microphone. A user may be able to operate device 300 via UI 360, for example to manage activity sessions.


Device 300 may comprise or be arranged to accept a user identity module 370. User identity module 370 may comprise, for example, a subscriber identity module, SIM, card installable in device 300. A user identity module 370 may comprise information identifying a subscription of a user of device 300. A user identity module 370 may comprise cryptographic information usable to verify the identity of a user of device 300 and/or to facilitate encryption of communicated information and billing of the user of device 300 for communication effected via device 300.


Processor 310 may be furnished with a transmitter arranged to output information from processor 310, via electrical leads internal to device 300, to other devices comprised in device 300. Such a transmitter may comprise a serial bus transmitter arranged to, for example, output information via at least one electrical lead to memory 320 for storage therein. Alternatively to a serial bus, the transmitter may comprise a parallel bus transmitter. Likewise processor 310 may comprise a receiver arranged to receive information in processor 310, via electrical leads internal to device 300, from other devices comprised in device 300. Such a receiver may comprise a serial bus receiver arranged to, for example, receive information via at least one electrical lead from receiver 340 for processing in processor 310. Alternatively to a serial bus, the receiver may comprise a parallel bus receiver.


Device 300 may comprise further devices not illustrated in FIG. 3. For example, where device 300 comprises a smartphone, it may comprise at least one digital camera. Some devices 300 may comprise a back-facing camera and a front-facing camera, wherein the back-facing camera may be intended for digital photography and the front-facing camera for video telephony. Device 300 may comprise a fingerprint sensor arranged to authenticate, at least in part, a user of device 300. In some embodiments, device 300 lacks at least one device described above. For example, some devices 300 may lack an NFC transceiver 350 and/or user identity module 370.


Processor 310, memory 320, transmitter 330, receiver 340, NFC transceiver 350, UI 360 and/or user identity module 370 may be interconnected by electrical leads internal to device 300 in a multitude of different ways. For example, each of the aforementioned devices may be separately connected to a master bus internal to device 300, to allow for the devices to exchange information. However, as the skilled person will appreciate, this is only one example and depending on the embodiment various ways of interconnecting at least two of the aforementioned devices may be selected without departing from the scope of the present invention.



FIG. 4 illustrates signalling in accordance with at least some embodiments of the present invention. On the vertical axes are disposed, on the left, device 110 of FIG. 1, and on the right, a server SRV. Time advances from the top toward the bottom. Initially, in phase 410, device 110 obtains sensor data from at least one, and in some embodiments from at least two, sensors. The sensor data may comprise first-type sensor data and, in some embodiments, also second-type sensor data. The sensor or sensors may be comprised in device 110, for example. The sensor data may be stored in a time series, for example at a sampling frequency of 1 Hz, 10 Hz or 1 kHz, or indeed another sampling frequency. The sampling frequency need not be the same for first-type sensor data and second-type sensor data.


Phase 410 may comprise one or more activity sessions of at least one activity type. Where multiple activity sessions are present, they may be of the same activity type or different activity types. The user need not, in at least some embodiments, indicate to device 110 that activity sessions are ongoing. During phase 410, device 110 may, but in some embodiments need not, identify activity types or sessions. The time series compiled during phase 410 may last 10 or 24 hours, for example. As a specific example, the time series may last from the previous time sensor data was downloaded from device 110 to another device, such as, for example, personal computer PC1.


In phase 420, the sensor data is provided, at least partly, in raw or processed format to server SRV. This phase may further comprise providing to server SRV optional activity and/or event reference data. The providing may proceed via base station 120, for example. The time series may be encrypted during downloading to protect the user's privacy.


In phase 430, server SRV may determine, based at least partly on the sensor data in the message of phase 420, a context and an associated machine readable instruction. Where activity and/or event reference data is provided in phase 420, that data may be employed in phase 430.


In phase 440 the machine readable instruction determined in phase 430 is provided to device 110, enabling, in phase 450, a derivation of an estimated activity type within that context, based on sensor data. The derivation of phase 450 may be based on sensor data that was included in the message of phase 420, or device 110 may capture new sensor data and use it, with the machine readable instruction, in the derivation of phase 450.



FIG. 5 is a flow graph of a method in accordance with at least some embodiments of the present invention. The phases of the illustrated method may be performed in device 110, an auxiliary device or a personal computer, for example, or in a control device configured to control the functioning thereof, when implemented therein.


Phase 510 comprises storing first-type sensor data in an apparatus. Phase 520 comprises compiling a message based at least partly on the first-type sensor data. Phase 530 comprises causing the message to be transmitted from the apparatus. Phase 540 comprises causing receiving in the apparatus of a machine readable instruction. Finally, phase 550 comprises deriving an estimated activity type, using the machine readable instruction, based at least partly on sensor data. The message of phase 520 may comprise activity and/or event reference data.


It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting.


Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Where reference is made to a numerical value using a term such as, for example, about or substantially, the exact numerical value is also disclosed.


As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no individual member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, various embodiments and examples of the present invention may be referred to herein along with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations of the present invention.


Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the preceding description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.


While the foregoing examples are illustrative of the principles of the present invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details of implementation can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, it is not intended that the invention be limited, except as by the claims set forth below.


The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of also un-recited features. The features recited in dependent claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of “a” or “an”, that is, a singular form, throughout this document does not exclude a plurality.


INDUSTRIAL APPLICABILITY

At least some embodiments of the present invention find industrial application in facilitating analysis of sensor data.


ACRONYMS LIST



  • GPS Global Positioning System

  • LTE Long Term Evolution

  • NFC Near-Field Communication

  • WCDMA Wideband Code Division Multiple Access

  • WiMAX Worldwide Interoperability for Microwave Access

  • WLAN Wireless Local Area Network



REFERENCE SIGNS LIST

  • 110 Device
  • 120 Base Station
  • 130 Network Node
  • 140 Network
  • 150 Satellite Constellation
  • 201, 202 Axes in FIG. 2
  • 203, 205, 207 Activity session endpoints in FIG. 2 and FIG. 2B
  • 210, 220 Sensor data time series in FIGS. 2 and 2B
  • 310-370 Structure illustrated in FIG. 3
  • 410-450 Phases of the method of FIG. 4
  • 510-550 Phases of the method of FIG. 5


Claims
  • 1. A mobile user device comprising: a memory configured to store first-type sensor data obtained during an activity session; and at least one processing core configured to compile a message based at least partly on the first-type sensor data, to cause the message to be transmitted from the mobile user device to a server external to the mobile user device, to cause receiving, responsive to the message, in the mobile user device, from the server, of a machine readable instruction configured to differentiate an activity type in either an indoor or an outdoor activity context and comprising an executable program or an executable script, and to derive an estimated activity type, using the executable program or the executable script, based at least partly on sensor data, for the activity session which is ongoing or ended, wherein the activity type is selected from: rowing, paddling, cycling, jogging, walking, hunting, swimming, paragliding, orienteering, and running.
  • 2. The mobile user device according to claim 1, wherein the machine readable instruction comprises a set of at least two machine-readable characteristics, wherein each of the machine-readable characteristics characterizes sensor data produced during a predefined activity type.
  • 3. The mobile user device according to claim 1, wherein the at least one processing core is configured to derive the estimated activity type at least in part by comparing, using the executable program or the executable script, the first-type sensor data, or a processed form of the first-type sensor data, to reference data.
  • 4. The mobile user device according to claim 1, wherein the first-type sensor data comprises acceleration sensor data.
  • 5. The mobile user device according to claim 1, wherein the memory is further configured to store second-type sensor data, and wherein the at least one processing core is configured to derive the estimated activity type, using the executable program or the executable script, based at least in part on the second-type sensor data.
  • 6. The mobile user device according to claim 4, wherein the second-type sensor data is of a different type than the first-type sensor data.
  • 7. The mobile user device according to claim 5, wherein the second-type sensor data comprises at least one of: sound sensor data, microphone-derived data and vibration sensor data.
  • 8. The mobile user device according to claim 5, wherein the at least one processing core is configured to derive the estimated activity type at least in part by comparing the second-type sensor data, or a processed form of the second-type sensor data, to reference data, the reference data comprising reference data of a first type and a second type.
  • 9. The mobile user device according to claim 1, wherein the at least one processing core is configured to present the estimated activity type to a user for verification.
  • 10. The mobile user device according to claim 1, wherein the at least one processing core is configured to cause the memory to store, in a sequence of estimated activity types, the estimated activity type and a second estimated activity type.
  • 11. The mobile user device according to claim 1, wherein the at least one processing core is configured to cause the memory to delete the machine readable instruction responsive to a determination that an activity session has ended.
  • 12. A server apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the server apparatus at least to: receive a message from a mobile user device external to the server apparatus, the message comprising information characterizing first-type sensor data obtained during an activity session; determine, based at least partly on the first-type sensor data, an indoor or outdoor activity context, and transmit, responsive to the message, to the mobile user device a machine-readable instruction configured to cause activity type determination in the determined indoor or outdoor activity context, wherein the machine readable instruction comprises an executable program or an executable script, wherein the activity type is selected from: rowing, paddling, cycling, jogging, walking, hunting, swimming, paragliding, orienteering, and running.
  • 13. A method, comprising: storing first-type sensor data obtained during an activity session in a mobile user device; compiling a message based at least partly on the first-type sensor data; causing the message to be transmitted from the mobile user device to a server external to the mobile user device; causing receiving, responsive to the message, in the mobile user device, from the server, of a machine readable instruction configured to differentiate activity type in either an indoor or an outdoor activity context and comprising an executable program or an executable script, and deriving an estimated activity type, using the executable program or the executable script, based at least partly on sensor data, for the activity session which is ongoing or ended, wherein the activity type is selected from: rowing, paddling, cycling, jogging, walking, hunting, swimming, paragliding, orienteering, and running.
  • 14. The method according to claim 13, wherein the machine readable instruction comprises a set of at least two machine-readable characteristics, wherein each of the machine-readable characteristics characterizes sensor data produced during a predefined activity type.
  • 15. The method according to claim 13, wherein the estimated activity type is derived at least in part by comparing, using the executable program or the executable script, the first-type sensor data, or a processed form of the first-type sensor data, to reference data.
  • 16. The method according to claim 13, wherein the first-type sensor data comprises acceleration sensor data.
  • 17. The method according to claim 13, further comprising storing second-type sensor data and wherein the estimated activity type is derived, using the executable program or the executable script, based at least in part on the second-type sensor data.
  • 18. The method according to claim 17, wherein the second-type sensor data is of a different type than the first-type sensor data.
  • 19. The method according to claim 17, wherein the second-type sensor data comprises at least one of: sound sensor data, microphone-derived data and vibration sensor data.
  • 20. The method according to claim 17, wherein the estimated activity type is derived at least in part by comparing the second-type sensor data, or a processed form of the second-type sensor data, to reference data, the reference data comprising reference data of a first type and a second type.
  • 21. The method according to claim 13, further comprising presenting the estimated activity type to a user for verification.
  • 22. The method according to claim 13, further comprising storing, in a sequence of estimated activity types, the estimated activity type and a second estimated activity type.
  • 23. A non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause a mobile user device to at least: store first-type sensor data obtained during an activity session; compile a message based at least partly on the first-type sensor data; cause the message to be transmitted from the mobile user device to a server external to the mobile user device; cause receiving, responsive to the message, in the mobile user device, from the server, of a machine readable instruction configured to differentiate activity type in either an indoor or an outdoor activity context and comprising an executable program or an executable script, and derive an estimated activity type, using the executable program or the executable script, based at least partly on sensor data, for the activity session which is ongoing or ended, wherein the activity type is selected from: rowing, paddling, cycling, jogging, walking, hunting, swimming, paragliding, orienteering, and running.
  • 24. A mobile user device for identification of user activity comprising: a memory configured to store first-type sensor data relating to an activity session; at least one processing core configured to: compile a message based at least partly on the first-type sensor data, cause the message to be transmitted from the mobile user device to a server external to the mobile user device, cause receiving in the mobile user device from the server a response to the message as a machine readable instruction configured to differentiate activity type in either an indoor or an outdoor activity context and comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the mobile user device is operating, and derive an estimated activity type, using the reference data specific to the context, based at least partly on sensor data by comparing the sensor data to the reference data specific to the context, for the activity session which is ongoing or ended, wherein the activity type is selected from: rowing, paddling, cycling, jogging, walking, hunting, swimming, paragliding, orienteering, and running.
  • 25. A non-transitory computer readable medium having stored thereon a set of computer readable instructions that, when executed by at least one processor, cause a mobile user device to at least: store first-type sensor data relating to an activity session; compile a message based at least partly on the first-type sensor data, cause the message to be transmitted from the mobile user device to a server external to the mobile user device, cause receiving in the mobile user device from the server a response to the message as a machine readable instruction configured to differentiate activity type in either an indoor or an outdoor activity context and comprising at least two machine-readable characteristics, wherein each of the at least two machine-readable characteristics characterizes sensor data produced during a predefined activity type, the at least two machine-readable characteristics comprising reference data specific to a context where the mobile user device is operating, and derive an estimated activity type, using the reference data specific to the context, based at least partly on sensor data by comparing the sensor data to the reference data specific to the context, for the activity session which is ongoing or ended, wherein the activity type is selected from: rowing, paddling, cycling, jogging, walking, hunting, swimming, paragliding, orienteering, and running.
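By way of illustration only, and not as a limitation of the claims above, the following Python sketch shows one possible realization of the claimed exchange: the mobile user device compiles a message characterizing first-type (acceleration) sensor data, the server determines an indoor or outdoor activity context and responds with context-specific reference characteristics serving as the machine readable instruction, and the device derives an estimated activity type by comparing its sensor data to that reference data. All identifiers, thresholds and reference values here (REFERENCE_SETS, compile_message, and so on) are hypothetical and do not appear in the patent text.

# Illustrative sketch only; all names, thresholds and reference values are
# hypothetical assumptions, not taken from the patent text.
import statistics

# Server side (cf. claim 12): reference characteristics keyed first by
# activity context, then by activity type. Each value is a fictitious
# (low, high) band of mean acceleration magnitude in g.
REFERENCE_SETS = {
    "outdoor": {"running": (2.5, 4.0), "cycling": (1.0, 2.4), "walking": (0.3, 0.9)},
    "indoor": {"rowing": (1.5, 2.8), "walking": (0.3, 0.9)},
}

def server_handle_message(message):
    """Determine an indoor or outdoor activity context from the message and
    return reference data playing the role of the machine readable instruction."""
    context = "outdoor" if message["gps_fix"] else "indoor"
    return {"context": context, "references": REFERENCE_SETS[context]}

# Device side (cf. claims 13 and 24).
def compile_message(accel_samples, gps_fix):
    """Compile a message characterizing stored first-type (acceleration) data."""
    return {"accel_mean": statistics.fmean(accel_samples), "gps_fix": gps_fix}

def derive_activity_type(accel_samples, instruction):
    """Derive an estimated activity type by comparing the sensor data to the
    context-specific reference data received from the server."""
    mean_g = statistics.fmean(accel_samples)
    for activity, (low, high) in instruction["references"].items():
        if low <= mean_g <= high:
            return activity
    return "unknown"

samples = [2.7, 3.1, 2.9]                      # fictitious acceleration magnitudes
message = compile_message(samples, gps_fix=True)
instruction = server_handle_message(message)   # carried over a network in practice
print(derive_activity_type(samples, instruction))  # prints "running"

Downloading only the context-specific subset of reference data mirrors the division of labour described in the claims: the device need not hold characteristics for every activity type, only for the context in which it is operating.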
US Referenced Citations (217)
Number Name Date Kind
5457284 Ferguson Oct 1995 A
5503145 Clough Apr 1996 A
5924980 Coetzee Jul 1999 A
6882955 Ohlenbusch et al. Apr 2005 B1
7627423 Brooks Dec 2009 B2
7706973 McBride et al. Apr 2010 B2
7721118 Tamasi et al. May 2010 B1
7917198 Ahola et al. Mar 2011 B2
7938752 Wang May 2011 B1
8052580 Saalasti et al. Nov 2011 B2
8323188 Tran Dec 2012 B2
8328718 Tran Dec 2012 B2
8538693 McBride et al. Sep 2013 B2
8612142 Zhang Dec 2013 B2
8655591 Van Hende Feb 2014 B2
8781730 Downey et al. Jul 2014 B2
8949022 Fahrner et al. Feb 2015 B1
9008967 McBride et al. Apr 2015 B2
9079090 Hohteri Jul 2015 B2
9107586 Tran Aug 2015 B2
9222787 Blumenberg et al. Dec 2015 B2
9317660 Burich et al. Apr 2016 B2
9595187 Kotz Mar 2017 B2
9648108 Granqvist et al. May 2017 B2
9665873 Ackland et al. May 2017 B2
9829331 McBride et al. Nov 2017 B2
9830516 Biswas et al. Nov 2017 B1
9907473 Tran Mar 2018 B2
9923973 Granqvist et al. Mar 2018 B2
10234290 Lush et al. Mar 2019 B2
10244948 Pham et al. Apr 2019 B2
10295556 Paczkowski et al. May 2019 B1
10327673 Eriksson et al. Jun 2019 B2
10415990 Cho et al. Sep 2019 B2
10433768 Eriksson et al. Oct 2019 B2
10515990 Hung et al. Dec 2019 B2
10634511 McBride et al. Apr 2020 B2
10816671 Graham et al. Oct 2020 B2
20030038831 Engelfriet Feb 2003 A1
20030109287 Villaret Jun 2003 A1
20050070809 Acres Mar 2005 A1
20050086405 Kobayashi et al. Apr 2005 A1
20060068812 Carro et al. Mar 2006 A1
20060136173 Case, Jr. et al. Jun 2006 A1
20060255963 Thompson et al. Nov 2006 A1
20070156335 McBride et al. Jul 2007 A1
20070208544 Kulach et al. Sep 2007 A1
20070276200 Ahola et al. Nov 2007 A1
20080052493 Chang Feb 2008 A1
20080109158 Huhtala et al. May 2008 A1
20080136620 Lee et al. Jun 2008 A1
20080158117 Wong et al. Jul 2008 A1
20080214360 Stirling et al. Sep 2008 A1
20080294663 Heinley et al. Nov 2008 A1
20080318598 Fry Dec 2008 A1
20090047645 DiBenedetto et al. Feb 2009 A1
20090048070 Vincent et al. Feb 2009 A1
20090094557 Howard Apr 2009 A1
20090100332 Kanjilal et al. Apr 2009 A1
20090265623 Kho et al. Oct 2009 A1
20100099539 Haataja Apr 2010 A1
20100167712 Stallings et al. Jul 2010 A1
20100187074 Manni Jul 2010 A1
20100257014 Roberts et al. Oct 2010 A1
20100313042 Shuster Dec 2010 A1
20110010704 Jeon et al. Jan 2011 A1
20110152695 Granqvist et al. Jun 2011 A1
20110218385 Bolyard et al. Sep 2011 A1
20110251822 Darley et al. Oct 2011 A1
20110252351 Sikora et al. Oct 2011 A1
20110281687 Gilley et al. Nov 2011 A1
20110283224 Ramsey et al. Nov 2011 A1
20110288381 Bartholomew et al. Nov 2011 A1
20110296312 Boyer et al. Dec 2011 A1
20110307723 Cupps et al. Dec 2011 A1
20120022336 Teixeira Jan 2012 A1
20120100895 Priyantha et al. Apr 2012 A1
20120109518 Huang May 2012 A1
20120116548 Goree et al. May 2012 A1
20120123806 Schumann et al. May 2012 A1
20120158289 Bernheim Brush et al. Jun 2012 A1
20120185268 Wiesner et al. Jul 2012 A1
20120219186 Wang et al. Aug 2012 A1
20120239173 Laikari et al. Sep 2012 A1
20120283855 Hoffman et al. Nov 2012 A1
20120289791 Jain et al. Nov 2012 A1
20120317520 Lee Dec 2012 A1
20130053990 Ackland et al. Feb 2013 A1
20130060167 Dracup Mar 2013 A1
20130095459 Tran Apr 2013 A1
20130127636 Aryanpur et al. May 2013 A1
20130151874 Parks et al. Jun 2013 A1
20130166888 Branson et al. Jun 2013 A1
20130178334 Brammer Jul 2013 A1
20130187789 Lowe et al. Jul 2013 A1
20130190903 Balakrishnan et al. Jul 2013 A1
20130217979 Blackadar et al. Aug 2013 A1
20130225370 Flynt et al. Aug 2013 A1
20130234924 Janefalkar et al. Sep 2013 A1
20130250845 Greene et al. Sep 2013 A1
20130289932 Baechler Oct 2013 A1
20130304377 Van Hende Nov 2013 A1
20130312043 Stone et al. Nov 2013 A1
20130332286 Medelius et al. Dec 2013 A1
20130345978 Lush et al. Dec 2013 A1
20140018686 Medelius et al. Jan 2014 A1
20140046223 Kahn Feb 2014 A1
20140094200 Schatzberg et al. Apr 2014 A1
20140135593 Jayalth et al. May 2014 A1
20140142732 Karvonen May 2014 A1
20140149754 Silva et al. May 2014 A1
20140159915 Hong et al. Jun 2014 A1
20140163927 Molettiere et al. Jun 2014 A1
20140208333 Beals et al. Jul 2014 A1
20140218281 Amayeh et al. Aug 2014 A1
20140235166 Molettiere et al. Aug 2014 A1
20140237028 Messenger et al. Aug 2014 A1
20140257533 Morris et al. Sep 2014 A1
20140275821 Beckman Sep 2014 A1
20140288680 Hoffman et al. Sep 2014 A1
20140300490 Kotz Oct 2014 A1
20140336796 Agnew Nov 2014 A1
20140337036 Haiut et al. Nov 2014 A1
20140337450 Choudhary et al. Nov 2014 A1
20140343380 Carter et al. Nov 2014 A1
20140350883 Carter et al. Nov 2014 A1
20140365107 Dutta et al. Dec 2014 A1
20140372064 Darley et al. Dec 2014 A1
20150006617 Yoo Jan 2015 A1
20150037771 Kaleal, III et al. Feb 2015 A1
20150042468 White et al. Feb 2015 A1
20150057945 White et al. Feb 2015 A1
20150113417 Yuen et al. Apr 2015 A1
20150119198 Wisbey et al. Apr 2015 A1
20150119728 Blackadar et al. Apr 2015 A1
20150127966 Ma et al. May 2015 A1
20150141873 Fei May 2015 A1
20150160026 Kitchel Jun 2015 A1
20150180842 Panther Jun 2015 A1
20150185815 Debates et al. Jul 2015 A1
20150209615 Edwards Jul 2015 A1
20150233595 Fadell et al. Aug 2015 A1
20150272483 Etemad et al. Oct 2015 A1
20150312857 Kim et al. Oct 2015 A1
20150317801 Bentley et al. Nov 2015 A1
20150326709 Pennanen et al. Nov 2015 A1
20150334772 Wong et al. Nov 2015 A1
20150335978 Syed et al. Nov 2015 A1
20150342533 Kelner Dec 2015 A1
20150347983 Jon et al. Dec 2015 A1
20150350822 Xiao et al. Dec 2015 A1
20150362519 Balakrishnan et al. Dec 2015 A1
20150374279 Takakura et al. Dec 2015 A1
20150382150 Ansermet et al. Dec 2015 A1
20160007288 Samardzija et al. Jan 2016 A1
20160007934 Arnold Jan 2016 A1
20160012294 Bouck Jan 2016 A1
20160018899 Tu et al. Jan 2016 A1
20160023043 Grundy Jan 2016 A1
20160026236 Vasistha et al. Jan 2016 A1
20160034043 Le Grand et al. Feb 2016 A1
20160034133 Wilson et al. Feb 2016 A1
20160041593 Dharawat Feb 2016 A1
20160058367 Raghuram et al. Mar 2016 A1
20160058372 Raghuram et al. Mar 2016 A1
20160059079 Watterson Mar 2016 A1
20160072557 Ahola Mar 2016 A1
20160081028 Chang et al. Mar 2016 A1
20160081625 Kim et al. Mar 2016 A1
20160084869 Yuen et al. Mar 2016 A1
20160091980 Baranski et al. Mar 2016 A1
20160104377 French et al. Apr 2016 A1
20160105852 Papakipos et al. Apr 2016 A1
20160135698 Baxi et al. May 2016 A1
20160143579 Martikka et al. May 2016 A1
20160144236 Ko et al. May 2016 A1
20160148396 Bayne et al. May 2016 A1
20160148615 Lee et al. May 2016 A1
20160184686 Sampathkumaran Jun 2016 A1
20160209907 Han et al. Jul 2016 A1
20160226945 Granqvist et al. Aug 2016 A1
20160259495 Butcher et al. Sep 2016 A1
20160317097 Adams et al. Nov 2016 A1
20160327915 Katzer et al. Nov 2016 A1
20160328991 Simpson et al. Nov 2016 A1
20160346611 Rowley et al. Dec 2016 A1
20160357202 Carter et al. Dec 2016 A1
20160374566 Fung et al. Dec 2016 A1
20160379547 Okada Dec 2016 A1
20170010677 Roh et al. Jan 2017 A1
20170011089 Bermudez et al. Jan 2017 A1
20170011210 Cheong et al. Jan 2017 A1
20170032256 Otto et al. Feb 2017 A1
20170038740 Knappe et al. Feb 2017 A1
20170043212 Wong Feb 2017 A1
20170063475 Feng Mar 2017 A1
20170065230 Sinha et al. Mar 2017 A1
20170087431 Syed et al. Mar 2017 A1
20170124517 Martin May 2017 A1
20170153119 Nieminen et al. Jun 2017 A1
20170153693 Duale et al. Jun 2017 A1
20170154270 Lindman et al. Jun 2017 A1
20170168555 Munoz et al. Jun 2017 A1
20170173391 Wiebe et al. Jun 2017 A1
20170232294 Kruger et al. Aug 2017 A1
20170262699 White et al. Sep 2017 A1
20170266494 Crankson et al. Sep 2017 A1
20170316182 Blackadar et al. Nov 2017 A1
20170340221 Cronin Nov 2017 A1
20180015329 Burich et al. Jan 2018 A1
20180108323 Lindman et al. Apr 2018 A1
20180193695 Lee Jul 2018 A1
20180345077 Blahnik et al. Dec 2018 A1
20190025928 Pantelopoulos et al. Jan 2019 A1
20190056777 Munoz et al. Feb 2019 A1
20190069244 Jeon et al. Feb 2019 A1
20190367143 Sinclair et al. Dec 2019 A1
Foreign Referenced Citations (63)
Number Date Country
2007216704 Apr 2008 AU
1877340 Dec 2006 CN
102495756 Jun 2012 CN
103309428 Sep 2013 CN
103631359 Mar 2014 CN
103970271 Aug 2014 CN
204121706 Jan 2015 CN
104680046 Jun 2015 CN
105242779 Jan 2016 CN
106062661 Oct 2016 CN
106604369 Apr 2017 CN
106999106 Aug 2017 CN
108052272 May 2018 CN
103154954 Jun 2018 CN
108377264 Aug 2018 CN
108983873 Dec 2018 CN
1755098 Feb 2007 EP
2096820 Sep 2009 EP
2107837 Oct 2009 EP
2172249 Apr 2010 EP
2770454 Aug 2014 EP
2703945 Mar 2015 EP
2849473 Mar 2015 EP
2910901 Aug 2015 EP
2996409 Mar 2016 EP
3018582 May 2016 EP
3023859 May 2016 EP
3361370 Aug 2018 EP
126911 Feb 2017 FI
2404593 Feb 2005 GB
2425180 Oct 2006 GB
2513585 Nov 2014 GB
2530196 Mar 2016 GB
2537423 Oct 2016 GB
2541234 Feb 2017 GB
2555107 Apr 2018 GB
20110070049 Jun 2011 KR
101500662 Mar 2015 KR
528295 Oct 2006 SE
201706840 Feb 2017 TW
I598076 Sep 2018 TW
WO02054157 Jul 2002 WO
WO2010083562 Jul 2010 WO
WO2010144720 Dec 2010 WO
WO2011061412 May 2011 WO
WO2011123932 Oct 2011 WO
WO2012037637 Mar 2012 WO
WO2012115943 Aug 2012 WO
WO2012141827 Oct 2012 WO
WO2013091135 Jun 2013 WO
WO2013121325 Aug 2013 WO
WO2014118767 Aug 2014 WO
WO2014144258 Sep 2014 WO
WO2014193672 Dec 2014 WO
WO2014209697 Dec 2014 WO
WO2015021407 Feb 2015 WO
WO2014182162 Jun 2015 WO
WO2015087164 Jun 2015 WO
WO2015131065 Sep 2015 WO
WO2016022203 Feb 2016 WO
WO2017011818 Jan 2017 WO
WO2018217348 Nov 2018 WO
WO2018222936 Dec 2018 WO
Non-Patent Literature Citations (8)
Entry
ARM big.LITTLE. Wikipedia, The free encyclopedia, Oct. 11, 2018, Retrieved on May 28, 2020 from: <https://en.wikipedia.org/w/index.php?title=ARM_bit.LITTLE&oldid=863559211>, foreword on p. 1, section “Run-state migration” on pp. 1-2.
Qualcomm Snapdragon Wear 3100 Platform Supports New Ultra-Low Power System Architecture For Next Generation Smartwatches. Qualcomm Technologies, Inc., Sep. 10, 2018, Retrieved on May 28, 2018 from: <https://www.qualcomm.com/news/releases/2018/09/10/qualcomm-snapdragon-wear-3100-platform-supports-...>, sections “Snapdragon Wear 3100 Based Smartwatches Aim to Enrich the User Experience” on pp. 3-4.
Sheta et al: Packet scheduling in LTE mobile network. International Journal of Scientific & Engineering Research, Jun. 2013, vol. 4, Issue 6.
CNET: Dec. 11, 2017, “Apple watch can now sync with a treadmill”, youtube.com, [online], Available from: https://www.youtube.com/watch?v=7RvMC3wFDME [Accessed Nov. 19, 2020].
CASH: A guide to GPS and route plotting for cyclists. 2018. www.cyclinguk.org/article/guide-gps-and-route-plotting-cyclists.
Sieber et al.: Embedded systems in the Poseidon MK6 rebreather. Intelligent Solutions in Embedded Systems, 2009, pp. 37-42.
Ainsworth et al: Parallel Error Detection Using Heterogeneous Cores. 48th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), IEEE, 2018.
DAVIS: The Best Technical Diving Computers 2019. Feb. 7, 2019.
Related Publications (1)
Number Date Country
20170176213 A1 Jun 2017 US