Engaging exercising devices with a mobile device

Information

  • Patent Grant
  • Patent Number
    11,284,807
  • Date Filed
    Friday, May 31, 2019
  • Date Issued
    Tuesday, March 29, 2022
Abstract
According to an example aspect of the present invention, there is provided an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to receive a first sensor data stream from an exercising device, receive a second sensor data stream from a mobile device, correlate the first data stream with the second data stream, determine that the exercising device is engaged with the mobile device, and provide indication to the exercising device and the mobile device that the devices are engaged with each other.
Description
FIELD

The present invention relates to an apparatus comprising at least one processing core and at least one memory including computer program code.


Further, the present invention relates to a use of an apparatus in connection with an exercising device and a mobile device.


Furthermore, the present invention relates to an arrangement comprising an apparatus, an exercising device and a mobile device.


BACKGROUND

Different exercising devices such as ergometers, gym exercising devices, fitness devices, sports devices, weight lifting devices, exercise bikes, treadmills, rowing machines, cross trainers, etc. are known. Many of the known exercising devices, for example exercise bikes or treadmills, are equipped with an internal display, a user interface and/or a memory. Further, various mobile devices are known by means of which information can be stored, processed and/or displayed. The mobile devices typically comprise a user interface such as a touchscreen, a keyboard or at least one button. Examples of such devices are smartphones, tablets, smartwatches and wrist-watches.


User sessions, such as training sessions, may be recorded, for example in notebooks, spreadsheets or other suitable media. Recorded training sessions enable more systematic training, and progress toward set goals can be assessed and tracked from the records so produced. Such records may be stored for future reference, for example to assess progress an individual is making as a result of the training. An activity session may comprise a training session or another kind of session.


Personal devices, such as, for example, smart watches, smartphones or smart jewelry, may be configured to produce recorded sessions of user activity. Such recorded sessions may be useful in managing physical training, in ensuring child safety or in professional uses. Recorded sessions, or more generally sensor-based activity management, may be of varying type, such as, for example, running, walking, skiing, canoeing, hiking, or assisting the elderly.


Recorded sessions may be viewed using a personal computer, for example, wherein recordings may be copied from a personal device to the personal computer. Files on a personal computer may be protected using passwords and/or encryption, for example.


Personal devices may be furnished with sensors, which may be used, for example, in determining a heart-beat rate of a user during a user session. A recorded heart-beat rate of a user may be later observed using a personal computer, for example.


Document FI 20155989 discloses an apparatus comprising a memory configured to store first-type sensor data, at least one processing core configured to compile a message based at least partly on the first-type sensor data, to cause the message to be transmitted from the apparatus, to cause receiving in the apparatus of a machine readable instruction, and to derive an estimated activity type, using the machine readable instruction, based at least partly on sensor data.


SUMMARY OF THE INVENTION

The invention is defined by the features of the independent claims. Some specific embodiments are defined in the dependent claims.


According to a first aspect of the present invention, there is provided an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to receive a first sensor data stream from an exercising device, receive a second sensor data stream from a mobile device, correlate the first data stream with the second data stream, determine that the exercising device is engaged with the mobile device, and provide indication to the exercising device and the mobile device that the devices are engaged with each other.


Various embodiments of the first aspect may comprise at least one feature from the following bulleted list:

    • the apparatus is configured to store and process sensor data received from the exercising device
    • the apparatus is configured to store and process sensor data received from the mobile device
    • the first sensor data stream is based on a first acceleration sensor and the second data stream is based on a second acceleration sensor
    • the first sensor data stream is based on a first angular velocity sensor and the second data stream is based on a second angular velocity sensor
    • the apparatus is a server or a server infrastructure
    • the apparatus is configured to receive the first sensor data stream from an ergometer, a gym exercising device, a fitness device, a sports device, a weight lifting device, an exercise bike, a treadmill, a rowing machine, or a cross trainer
    • the apparatus is configured to receive the second sensor data stream from a wrist-watch, a tablet, a smartwatch, a smartphone, a wearable sensor or any other external sensor
    • the apparatus is configured to receive a third sensor data stream from a wearable sensor or an external sensor
    • the apparatus is capable of storing at least a part of the data streams in a memory
    • time stamps associated with sensor data are contained in each of the data streams
    • the processing core is capable of arranging sensor data in an order depending on the time stamps associated with the sensor data
    • the apparatus is capable of estimating a start and an end of an activity type based on sensor data and the associated time stamps
    • the apparatus is configured to allow a user to remotely read out at least a part of at least one of the first sensor data stream, the second sensor data stream and the third sensor data stream
    • the apparatus is remotely accessible by a person in order to analyse a user session
    • the apparatus is configured to estimate an activity type of a user based on the correlated data streams


According to a second aspect of the present invention, there is provided a use of the apparatus according to at least one of claims 1-13 in connection with an exercising device and a mobile device. According to an embodiment, the use takes place in a gym, in connection with a training session, in connection with a sports session or in connection with a fitness session.


According to a third aspect of the present invention, there is provided an arrangement comprising an apparatus according to at least one of claims 1-13, an exercising device and a mobile device.


Various embodiments of the third aspect may comprise at least one feature from the following bulleted list:

    • the exercising device is enabled to act as a server having control over the user interface of the mobile device
    • the exercising device is configured to participate in pairing with the mobile device
    • the exercising device is configured to participate in pairing with the mobile device during a session with the exercising device
    • the exercising device is configured to transmit program code to be stored and processed by the mobile device after a pairing process
    • the exercising device is capable of transmitting parameters and/or logics to the mobile device after a pairing process
    • the exercising device is capable of transmitting instructions to the mobile device after a pairing process
    • the exercising device is capable of receiving data which has been input via a user interface of the mobile device after a pairing process
    • the exercising device is capable of transmitting a recipe or an instruction to the mobile device on how to analyse movements of a user
    • the recipe can be retrieved from a server by at least one of the mobile device, the apparatus, or the exercising device
    • the exercising device is configured to transmit data to the mobile device after a pairing process, which data is to be displayed on a display of the mobile device
    • the exercising device is configured to transmit sensor data to the mobile device and to receive in response input parameters from the mobile device after a pairing process
    • the exercising device is an ergometer, a gym exercising device, a fitness device, a sports device, a weight lifting device, an exercise bike, a treadmill, a rowing machine, or a cross trainer
    • the exercising device is configured to transmit data to a server
    • the exercising device is capable of transmitting and receiving signals wirelessly
    • the exercising device is configured to transmit sensor data and associated time stamps to at least one of the apparatus and the mobile device
    • the mobile device is enabled to act as a client whose content is fully or at least partially controlled by the exercising device
    • the mobile device is configured to participate in pairing with the exercising device
    • the mobile device is configured to participate in pairing with the exercising device during a session with the exercising device
    • the mobile device is configured to store and process program code received from the exercising device after a pairing process
    • the mobile device is capable of receiving parameters and/or logics from the exercising device after a pairing process
    • the mobile device is capable of processing instructions received from the exercising device after a pairing process
    • the mobile device is capable of starting calculations based on the received instructions after a pairing process
    • the mobile device is capable of starting user interface methods based on the received instructions after a pairing process
    • the mobile device is capable of receiving a recipe or an instruction from the exercising device on how to analyse movements of a user after a pairing process
    • the mobile device is configured to control parameters or functions of the exercising device
    • the mobile device is capable of controlling a music program or a music playlist stored in the memory of the exercising device or in the memory of a second mobile device such as a tablet or a smartphone
    • the mobile device is configured to serve as a display of the exercising device after a pairing process
    • the mobile device is configured to serve as an additional display of the exercising device after a pairing process
    • the mobile device is configured to serve as a user interface of the exercising device after a pairing process
    • the mobile device is configured to serve as an additional user interface of the exercising device after a pairing process
    • the mobile device is configured to serve as a memory of the exercising device after a pairing process
    • the mobile device is configured to serve as an additional memory of the exercising device after a pairing process
    • the mobile device is configured to participate in the pairing process during a session with the exercising device
    • the session is based on sensors of the mobile device and the exercising device
    • the mobile device is configured to receive sensor data from the exercising device and to transmit in response input parameters to the exercising device after a pairing process
    • the mobile device is capable of transmitting instructions to the exercising device, for example to change a speed of a part of the exercising device, to change a resistance of the exercising device, or to change a weight of the exercising device after a pairing process
    • the mobile device is configured to transmit data to a server
    • the mobile device is capable of transmitting and receiving signals wirelessly
    • the mobile device is configured to transmit sensor data and associated time stamps to at least one of the apparatus and the exercising device
    • the mobile device is a wrist-watch, a tablet, a smartwatch or a smartphone


Considerable advantages are obtained by means of certain embodiments of the present invention. An apparatus comprising at least one processing core and at least one memory including computer program code is provided. According to certain embodiments of the present invention, it can be determined that an exercising device and a mobile device are engaged with each other based on correlated sensor data streams received from the exercising device and the mobile device. An activity type of a user can be estimated by the apparatus based on the received sensor data streams of the exercising device and the mobile device. The apparatus can be remotely accessible by a person in order to analyse a user session in real time or at a later stage. For example, the apparatus may be connected with the internet. A client connected to the apparatus may receive the sensor data streams and may be configured to cause pairing of the mobile device and the exercising device based on the correlated data streams or based on instructions input by a person.


After providing indication to the exercising device and the mobile device that the devices are engaged with each other, the exercising device can participate in pairing with the mobile device and the mobile device can participate in pairing with the exercising device according to certain embodiments of the present invention. The mobile device and the exercising device can at least temporarily form a unit in which the mobile device serves as a display, a user interface and/or a memory of the exercising device. According to certain embodiments of the present invention, the mobile device serves as a display of the exercising device. The mobile device may serve as the only display or as an additional display of the exercising device during the time period of a user session. According to certain other embodiments of the present invention, the mobile device serves as a user interface of the exercising device. The mobile device may serve as the only user interface or as an additional user interface of the exercising device during the time period of a user session.


A mobile device and the exercising device can temporarily form a unit during the time period of a user session, a training session or a sports session. According to an embodiment, an app which is related to a specific exercising device can be stored in the memory of the mobile device of a user. Settings of the app can be personalized. For example, when the user brings his/her mobile device to a gym, the user can start a user session using his/her own personalized settings. Personalized settings may include information about age, weight, height, gender, body mass index, maximum performance capacity, activity parameter, previous energy expenditure and maximum heart rate, for instance. Also personalized exercise-guidance parameters such as an energy expenditure target, heart rate zones, activity zones, anaerobic threshold, fitness classification identifier and/or dehydration warning limits may be stored on the mobile device. Personalized data determined by sensors of the exercising device can further be stored in the memory of the mobile device and/or in the internet. Further, personalized data determined by sensors of the mobile device can be stored in the memory of the mobile device and/or in the internet. Determined data of the user, for example movement data or heart-beat rate data, can then be analysed at a later stage subsequent to the user session, training session or sports session. When another user is using the exercising device in the gym, his/her mobile device and the exercising device temporarily form a unit during the time period of another user session. Personalized settings can be used by this other user and personalized data can be stored in the memory of the respective mobile device and/or in the internet for further analysis. Consequently, it is not necessary to store any personalized data in the memory of the exercising device according to certain embodiments of the present invention.
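
The personalized settings described above can be thought of as a small per-user record that travels with the mobile device rather than with the exercising device. The following minimal Python sketch illustrates one possible way to group and serialize such settings; the field names and values are illustrative assumptions, not part of the claimed invention.

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class PersonalSettings:
    """Illustrative per-user profile stored on the mobile device (field names are assumptions)."""
    age: int
    weight_kg: float
    height_cm: float
    gender: str
    max_heart_rate: int
    energy_expenditure_target_kcal: int = 500
    heart_rate_zones: list = field(default_factory=lambda: [100, 120, 140, 160, 180])

    def to_json(self) -> str:
        # Serialised form that could be kept in the memory of the mobile device or in the cloud,
        # so that no personal data needs to remain on the exercising device.
        return json.dumps(asdict(self))

profile = PersonalSettings(age=35, weight_kg=72.0, height_cm=178.0, gender="f", max_heart_rate=185)
print(profile.to_json())
```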


According to another embodiment, no program code, or only a minimum amount thereof, needs to be installed on the mobile device such as a wrist watch. The mobile device serves as a display and/or user interface for the exercising device. The procedure is controlled by the exercising device or the apparatus. Only minimal system requirements are placed on the mobile device. According to this embodiment, the input data is processed by the exercising device or the apparatus. The bidirectional communication link between the mobile device and the exercising device or apparatus may be used to enable the exercising device or apparatus to act as a server having control over the user interface and the mobile device to act as a client whose content is fully or at least partially controlled by the exercising device or apparatus.


Receiving a first sensor data stream from the exercising device, receiving a second sensor data stream from the mobile device, correlating the first data stream with the second data stream, determining that the exercising device is engaged with the mobile device, and providing indication to the exercising device and the mobile device that the devices are engaged with each other can take place automatically, thus providing the user with a comfortable user experience.


According to certain embodiments, a temporarily combined unit can share a classification task. That is, the exercising device has its own classification task to produce semantic events such as ‘lift’, ‘release’ or ‘step’. Similarly, the mobile device has its own classification task to produce semantic events. The system thereby creates a more comprehensive analysis of the user's actions, providing, for example, a deeper understanding of the user's biomechanical accuracy.
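
As a rough illustration of how a temporarily combined unit might share a classification task, the sketch below merges two independently produced, time-stamped semantic event streams into one chronological record; the event names, tuple layout and merging rule are assumptions made only for this example.

```python
import heapq

# Each device runs its own classifier and emits (timestamp_s, source, event) tuples,
# already sorted by time stamp.
exercising_device_events = [(10.0, "exercising_device", "lift"),
                            (12.5, "exercising_device", "release"),
                            (15.0, "exercising_device", "lift")]
mobile_device_events = [(10.1, "mobile_device", "wrist_rotation"),
                        (12.4, "mobile_device", "wrist_rotation"),
                        (15.3, "mobile_device", "step")]

# Merge the two event streams by time stamp to obtain a combined, more comprehensive
# view of the user's actions (e.g. for assessing biomechanical accuracy).
combined = list(heapq.merge(exercising_device_events, mobile_device_events))
for timestamp, source, event in combined:
    print(f"{timestamp:6.1f}s  {source:17s}  {event}")
```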


Certain embodiments of the present invention are applicable in health care, in industry, in working environments, in sports, and the like.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a schematic top view of an apparatus in accordance with at least some embodiments of the present invention,



FIG. 2 illustrates a schematic top view of a mobile device comprised by an arrangement in accordance with at least some embodiments of the present invention,



FIG. 3 illustrates a schematic side view of a mobile device comprised by another arrangement in accordance with at least some embodiments of the present invention,



FIG. 4 illustrates a schematic side view of an exercising device comprised by a further arrangement in accordance with at least some embodiments of the present invention,



FIG. 5 illustrates a schematic top view of a mobile device comprised by a yet further arrangement in accordance with at least some embodiments of the present invention,



FIG. 6 illustrates a schematic side view of an exercising device comprised by a yet further arrangement in accordance with at least some embodiments of the present invention, and



FIG. 7 illustrates a schematic view of an arrangement in accordance with at least some embodiments of the present invention.





EMBODIMENTS

In FIG. 1 a schematic top view of an apparatus 18 in accordance with at least some embodiments of the present invention is illustrated. The apparatus 18 comprises at least one processing core and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the at least one processing core, cause the apparatus 18 at least to receive a first sensor data stream from an exercising device 2, receive a second sensor data stream from a mobile device 1, correlate the first data stream with the second data stream, determine that the exercising device 2 is engaged with the mobile device 1, and provide indication to the exercising device 2 and the mobile device 1 that the devices 1, 2 are engaged with each other. In other words, the apparatus 18 is capable of estimating that a user 6 is using a specific exercising device 2 by correlating sensor data received from the mobile device 1 and sensor data received from the exercising device 2. The apparatus 18 is capable of estimating an activity type of a user 6, for example running, based on the correlated data streams.


The mobile device 1 may be a wrist-watch, for instance. The exercising device 2 may be a treadmill, for instance. The apparatus 18 may be, for example, a server or server infrastructure. The apparatus 18 is configured to store and process sensor data received from the exercising device 2. Further, the apparatus 18 is configured to store and process sensor data received from the mobile device 1. Of course, the apparatus may also be configured to store and process sensor data received from a wearable sensor or any other external sensor, for example a MOVESENSE sensor. Such sensor data may be wirelessly transferred to the apparatus 18 directly or to at least one of the mobile device 1 and the exercising device 2 first and then to the apparatus 18.


According to a certain embodiment, an external sensor (not shown), for example a MOVESENSE sensor, is attached to a user and connected to the mobile device 1, for example a wrist watch 1. When the user comes to an exercising device 2, the mobile device 1 automatically displays information. Simultaneously, the mobile device 1 receives instructions from the exercising device 2 and/or the apparatus 18. However, the exercising device 2 and/or the apparatus 18 may also receive data from the mobile device 1 and/or the external sensor. The data may, for example, include personal data, sensor data and/or external sensor data. The data is typically processed by the exercising device 2 and/or the apparatus 18. This kind of user experience is created automatically. When the user changes the exercising device 2, for example in a gym, the information displayed on the display of the mobile device 1 also changes automatically. The exercising device 2 can additionally receive further data from a server or via the internet. External sensor data can be analysed by the exercising device 2 and/or the apparatus 18, and content, for example information derived from the sensor data, can be automatically displayed on the mobile device 1. In such a situation, the bidirectional communication link may be used to enable the exercising device 2 to act as a server having control over the user interface and the mobile device 1 to act as a client whose content is fully or at least partially controlled by the exercising device 2.


The first sensor data stream may be, for example, based on a first acceleration sensor and the second data stream may be, for example, based on a second acceleration sensor. That is, the mobile device 1 comprises an acceleration sensor, the exercising device 2 comprises an acceleration sensor, and the acceleration data of both sensors is received by the server 18. The mobile device 1 and the exercising device 2 both measure the same parameter, i.e. acceleration. The sensor of the mobile device 1 may, for example, measure acceleration of a cyclical motion of the right arm of the user as indicated by arrow 7. The sensor of the exercising device 2 may, for example, measure acceleration of the treadmill belt as indicated by arrow 8. According to other embodiments, the sensor of the mobile device 1 and the sensor of the exercising device 2 both measure another parameter, for example angular velocity.
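
One simple way to check that the two sensors are indeed observing the same motion is to compare the dominant frequency (the cadence) of the arm-swing acceleration with that of the belt acceleration. The following non-authoritative Python sketch shows such a comparison on synthetic signals; the sampling rate, signal shapes and the 0.2 Hz tolerance are assumptions for illustration only.

```python
import numpy as np

FS = 50.0  # assumed sampling rate in Hz
t = np.arange(0, 20, 1.0 / FS)

# Synthetic signals: both reflect the same running cadence (~2.7 steps per second),
# but with different amplitudes, phases and noise, as two independent sensors would see it.
arm_acc = 1.5 * np.sin(2 * np.pi * 2.7 * t + 0.4) + 0.3 * np.random.randn(t.size)
belt_acc = 0.8 * np.sin(2 * np.pi * 2.7 * t) + 0.2 * np.random.randn(t.size)

def dominant_frequency(signal: np.ndarray, fs: float) -> float:
    """Return the strongest non-DC frequency component of a signal."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

f_mobile = dominant_frequency(arm_acc, FS)
f_exercise = dominant_frequency(belt_acc, FS)
print(f"mobile device cadence: {f_mobile:.2f} Hz, exercising device cadence: {f_exercise:.2f} Hz")
print("same motion observed" if abs(f_mobile - f_exercise) < 0.2 else "different motions")
```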


After indication that the devices 1, 2 are engaged with each other has been provided to the exercising device 2 and the mobile device 1, the exercising device 2 may participate in pairing with the mobile device 1 and the mobile device 1 may participate in pairing with the exercising device 2. The mobile device 1 in the form of a wrist-watch comprises at least one processing core and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the at least one processing core, cause the mobile device 1 at least to receive a first signal 3 from an exercising device 2, process the received signal, respond to the received signal by transmitting a second signal 4 to the exercising device 2, and participate in a pairing process 5 with the exercising device 2. Further examples are described below in connection with FIG. 2 to FIG. 6.


According to certain embodiments, the apparatus 18 may be further configured to allow a user to remotely read out at least a part of at least one of the first sensor data stream and the second sensor data stream. In other words, the apparatus 18 is remotely accessible by a person in order to analyse a user session.


The first sensor data stream may be received by the apparatus 18 from the exercising device 2 via a wired connection or a wireless technology, for example a low power wireless communication technology such as Bluetooth, Bluetooth Low Energy, or Wibree. The second sensor data stream is typically received by the apparatus 18 from the mobile device 1 via a wireless technology, for example a low power wireless communication technology such as Bluetooth, Bluetooth Low Energy, or Wibree.


In FIG. 2 a schematic top view of a mobile device 1 comprised by an arrangement in accordance with at least some embodiments of the present invention is illustrated. The shown mobile device 1 is a wrist-watch. The shown exercising device 2 is a treadmill. The mobile device 1 comprises at least one processing core and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the at least one processing core, cause the mobile device 1 at least to receive a first signal 3 from an exercising device 2, process the received signal, respond to the received signal by transmitting a second signal 4 to the exercising device 2, and participate in a pairing process 5 with the exercising device 2. Pairing in accordance with the present invention takes place after indication that the mobile device 1 and the exercising device 2 are engaged with each other has been provided to the exercising device 2 and the mobile device 1, as described in connection with FIG. 1.


Sensor data contained in the data streams is typically associated with time stamps. For example, one acceleration value is measured every second and for each measured acceleration value the respective time is transmitted to the apparatus 18 in the form of a time stamp. Sensor data from the mobile device 1 having a first time stamp is compared with sensor data from the exercising device 2 having an identical time stamp. This procedure can be repeated for a plurality of second, third, etc. time stamps. When the sensor data from the mobile device and the sensor data from the exercising device correlate with each other, the apparatus is capable of determining that the exercising device 2 is engaged with the mobile device 1. When a user changes an exercising device, for example in a gym, pairing between the new exercising device and the mobile device can be automatically initiated based on the sensor data from the new exercising device correlating with the sensor data from the mobile device 1.
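
The time-stamp based matching described above can be pictured with the following Python sketch. It is only an illustrative assumption of how such a correlation could be computed: samples sharing identical time stamps are compared with a Pearson correlation, and a hypothetical threshold decides engagement.

```python
import numpy as np

def engaged(mobile_stream, exercising_stream, threshold=0.8):
    """Decide whether the two devices appear to observe the same activity.

    Each stream is a dict mapping a time stamp (seconds) to a sensor value,
    e.g. one acceleration sample per second.  Only samples with identical
    time stamps are compared, as described above.
    """
    common = sorted(set(mobile_stream) & set(exercising_stream))
    if len(common) < 5:               # not enough overlapping samples yet
        return False
    a = np.array([mobile_stream[ts] for ts in common])
    b = np.array([exercising_stream[ts] for ts in common])
    corr = np.corrcoef(a, b)[0, 1]    # Pearson correlation of the aligned samples
    return corr >= threshold

# Toy example: the wrist-watch and the treadmill report samples for the same seconds.
mobile = {t: np.sin(0.5 * t) for t in range(30)}
treadmill = {t: 0.9 * np.sin(0.5 * t) + 0.05 for t in range(30)}
print("devices engaged:", engaged(mobile, treadmill))
```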


In other words, when a user 6 wearing the wrist-watch 1 starts to use the exercising device 2, sensor data streams are transmitted from the mobile device 1 and the exercising device 2 to the apparatus 18 in the form of a server. The server determines that the exercising device 2 is engaged with the mobile device 1 and estimates an activity type of the user, for example running. Subsequent to providing indication to the exercising device 2 and the mobile device 1 that the devices 1, 2 are engaged with each other, the wrist-watch 1 and the exercising device 2 start to communicate with each other. A first signal as indicated by arrow 3 is transmitted from the exercising device 2 to the wrist-watch 1. Then the received signal 3 is processed by the processing core of the wrist-watch 1. Subsequently, a second signal as indicated by arrow 4 is transmitted from the wrist-watch 1 to the exercising device 2. This process is called pairing 5. Data can now be transferred between the wrist-watch 1 and the exercising device 2. Data is typically transferred using a low power wireless communication technology such as Bluetooth, Bluetooth Low Energy, or Wibree.
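
The exchange of the first signal 3 and the second signal 4 can be pictured as a very small handshake. The sketch below is a hedged, transport-agnostic illustration only; no real Bluetooth API is used, and the message field names are assumptions.

```python
# Minimal, transport-agnostic sketch of the pairing exchange (assumed message names).
def exercising_device_first_signal(device_id: str) -> dict:
    # Signal 3: the exercising device announces itself after engagement has been indicated.
    return {"type": "pairing_request", "from": device_id}

def mobile_device_response(first_signal: dict, watch_id: str) -> dict:
    # The wrist-watch processes the received signal and answers with signal 4.
    assert first_signal["type"] == "pairing_request"
    return {"type": "pairing_accept", "from": watch_id, "to": first_signal["from"]}

def pair(first_signal: dict, second_signal: dict) -> bool:
    # Pairing 5 succeeds once both signals refer to each other.
    return second_signal["type"] == "pairing_accept" and second_signal["to"] == first_signal["from"]

sig3 = exercising_device_first_signal("treadmill-01")
sig4 = mobile_device_response(sig3, "wristwatch-42")
print("paired:", pair(sig3, sig4))
```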


The treadmill belt of the exercising device 2 may be moving with a specific speed as indicated by arrow 8 as a user 6 is running on the belt. At the same time, the arms of the runner 6 move cyclically as indicated by arrow 7. Data may be determined by the sensors of the wrist-watch 1. Examples of such determined data are a heart-beat rate, a number of steps during a certain period of time, or acceleration data. Data may also be determined by sensors of the exercising device 2 and transmitted to the mobile device 1. An example of such data is the speed of the moving treadmill belt of the exercising device 2. The information about the speed of the treadmill belt may be transmitted from the exercising device 2 to the wrist-watch 1. The information about the speed of the treadmill belt may then be displayed on the wrist-watch. In other words, the wrist-watch 1 is configured to serve as a display of the exercising device 2. Of course, also data determined by at least one sensor of the wrist-watch 1 may be displayed on the display of the wrist-watch. The user 6 may further choose which data is displayed.


According to certain embodiments, the exercising device 2 may also comprise an additional display and data may be transmitted from the wrist-watch 1 to the exercising device 2. A user 6 may choose which information is shown on the display of the exercising device 2 and which information is at the same time displayed on the display of the wrist-watch 1. In other words, the user 6 may choose which sensor data is displayed on the display of the wrist-watch 1 and which sensor data is displayed on the display of the exercising device 2.


According to certain other embodiments, the mobile device 1 is configured to control parameters or functions of the exercising device 2 after the pairing process 5. In the shown example, a user 6 may control the speed of the treadmill belt of the exercising device 2 as indicated by arrow 8 via a user interface of the wrist-watch 1. A user interface of the wrist watch 1 may be a touchscreen or at least one button, for instance. User instructions to change the speed of the treadmill belt may be transmitted from the wrist-watch 1 to the exercising device 2 and processed by the exercising device 2, thus causing the exercising device 2 to change the speed of the treadmill belt. According to this embodiment, the procedure is typically controlled by the exercising device 2 such that no program code, or only a minimum amount thereof, needs to be installed on the mobile device 1. The mobile device 1 serves as a user interface for the exercising device 2. In other words, a computer program comprising program instructions which, when loaded into the exercising device 2, cause e.g. graphical user interface data to be determined for the mobile device 1 is provided. The graphical user interface data is wirelessly transmitted to the mobile device 1 from the exercising device 2 to provide at least one user interface functionality on the mobile device 1. Then data corresponding to user input is received and wirelessly transmitted to the exercising device 2. Only minimal system requirements, such as processing capacity and memory capacity, are placed on the mobile device 1. According to this embodiment, the input data is completely or at least partially processed by the exercising device 2. The bidirectional communication link between the mobile device 1 and the exercising device 2 may be used to enable the exercising device 2 to act as a server having control over the user interface and the mobile device 1 to act as a client whose content is fully or at least partially controlled by the exercising device 2.
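
The thin-client arrangement described above, in which the exercising device determines the graphical user interface data and the wrist-watch merely renders it and returns user input, might be sketched as follows. The message format, widget names and the speed-control handler are purely illustrative assumptions, not the patented protocol.

```python
# Hypothetical sketch of the exercising device acting as a UI server for the wrist-watch.
class ExercisingDeviceServer:
    def __init__(self):
        self.belt_speed_kmh = 8.0

    def build_ui(self) -> dict:
        # Graphical user interface data sent to the mobile device for rendering.
        return {"title": "Treadmill",
                "widgets": [{"id": "speed_label", "text": f"{self.belt_speed_kmh:.1f} km/h"},
                            {"id": "speed_up", "kind": "button", "label": "+"},
                            {"id": "speed_down", "kind": "button", "label": "-"}]}

    def handle_input(self, event: dict) -> None:
        # User input received back from the mobile device is processed here,
        # so almost no program code is needed on the watch itself.
        if event["widget"] == "speed_up":
            self.belt_speed_kmh += 0.5
        elif event["widget"] == "speed_down":
            self.belt_speed_kmh -= 0.5

class WristWatchClient:
    """Thin client: renders whatever UI description it receives and reports taps."""
    def render(self, ui: dict) -> None:
        print(ui["title"], "|", " ".join(w.get("text", w.get("label", "")) for w in ui["widgets"]))

    def tap(self, widget_id: str) -> dict:
        return {"widget": widget_id}

server, watch = ExercisingDeviceServer(), WristWatchClient()
watch.render(server.build_ui())
server.handle_input(watch.tap("speed_up"))   # user presses "+" on the watch
watch.render(server.build_ui())              # updated speed is shown on the watch
```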


In FIG. 3 a schematic side view of a mobile device 1 comprised by another arrangement in accordance with at least some embodiments of the present invention is illustrated. The shown mobile device 1 may be a tablet or other mobile device. The shown exercising device 2 is an ergometer or indoor exercise bike. Prior to pairing, indication that the mobile device 1 and the exercising device 2 are engaged with each other has been provided to the exercising device 2 and the mobile device 1. After the pairing process as described above in connection with FIG. 2, parameters and/or logics such as an app are transmitted from the exercising device 2 to the mobile device 1. The mobile device 1 is configured to store and process program code received from the exercising device 2. The mobile device 1 is configured to serve as a display of the exercising device 2.


For example, a video simulation of a cycling track may be displayed on the display of the mobile device 1. Thus, the user 6 can cycle along the simulated track. Sensors of the exercising device 2 may determine the cycling speed of the user 6, for example. The sensor data of the exercising device 2 is then transmitted to the mobile device 1. The sensor data can be used as input data for the video simulation displayed on the mobile device 1. In other words, the user 6 can cycle along the virtual track with varying speeds. The visualization of the virtual cycling simulation is calculated based on the speed data obtained from the sensor data of the exercising device 2. Conversely, data may be transmitted from the mobile device 1 to the exercising device 2, thus causing the exercising device to change a parameter. Altitude data along the virtual track stored in the app may be provided, for instance. The altitude data can be used as input data for the parameters of the exercising device 2 as a function of time. When the data is received by the exercising device 2, it causes the exercising device 2 to change the resistance of the exercise bike during cycling along the virtual track. In other words, cycling upwards or downwards along the virtual track can be simulated. The exercising device 2 is configured to transmit sensor data to the mobile device 1 and to receive in response input parameters from the mobile device 1. Consequently, cycling along a virtual track, for example a passage of the Tour de France, can be simulated.
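
The use of altitude data along the virtual track as input for the resistance of the exercise bike can be illustrated with the following hedged sketch. The gradient-to-resistance mapping, the track profile and all numeric values are assumptions made for the example, not parameters taken from the patent.

```python
# Altitude profile of a virtual track (distance in km -> altitude in m); values are illustrative.
track_profile = [(0.0, 120.0), (2.0, 150.0), (4.0, 230.0), (6.0, 210.0), (8.0, 180.0)]

def resistance_level(distance_km: float, base_level: int = 5, levels_per_percent: float = 1.0) -> int:
    """Map the local gradient of the virtual track to a resistance level of the exercise bike."""
    # Find the track segment the rider is currently on.
    for (d0, a0), (d1, a1) in zip(track_profile, track_profile[1:]):
        if d0 <= distance_km <= d1:
            gradient_percent = (a1 - a0) / ((d1 - d0) * 1000.0) * 100.0
            # Uphill raises the resistance, downhill lowers it (clamped to a plausible range).
            return max(1, min(20, round(base_level + levels_per_percent * gradient_percent)))
    return base_level

for km in (1.0, 3.0, 5.0, 7.0):
    print(f"at {km:.0f} km -> resistance level {resistance_level(km)}")
```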


The exercising device 2 may be, for example, located in a gym and different users may subsequently cycle along the virtual track. When each user brings his/her own mobile device 1 to the gym, the app may determine, for each user, the period of time needed for cycling from the beginning of the virtual track to the end of the virtual track. The period of time for each user may then be transmitted from the respective mobile device 1 to the exercising device 2 and stored in a memory of the exercising device 2. The different periods of time may be ranked and listed so that a user can see his/her results in comparison to the results of other users. Thus, it is possible to simulate a cycling competition, for instance.
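
The ranking of completion times can be as simple as the following sketch; the user names and times are illustrative assumptions.

```python
# Completion times (seconds) for the virtual track, received from the mobile devices of different users.
results = {"user_a": 1425.0, "user_b": 1390.5, "user_c": 1502.3}

# Rank the stored periods of time so that a user can compare his/her result with others.
for rank, (user, seconds) in enumerate(sorted(results.items(), key=lambda item: item[1]), start=1):
    minutes, secs = divmod(seconds, 60)
    print(f"{rank}. {user}: {int(minutes)} min {secs:04.1f} s")
```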


Of course, the mobile device 1 may also be used only for displaying information such as the cycling speed or the length of a cycling session period, or for selecting a cycling resistance of the exercising device 2.


Data determined by sensors of the exercising device 2 may be received by and stored in the mobile device 1. Alternatively, data determined by sensors of the exercising device 2 may be received by the mobile device and stored in the cloud. Thus, the user 6 can analyse the stored data at a later stage by reading out the memory of the mobile device 1 or viewing a webpage in the internet.


In FIG. 4 a schematic side view of an exercising device 2 comprised by a further arrangement in accordance with at least some embodiments of the present invention is illustrated. In the shown embodiment, the exercising device 2 is a rowing machine. The mobile device 1 may be a tablet computer, for instance. The exercising device 2 comprises at least one processing core 13 and at least one memory 14 including computer program code. The at least one memory 14 and the computer program code are configured to, with the at least one processing core 13, cause the exercising device 2 at least to transmit a first signal 3 to a mobile device 1, receive a second signal 4 from the mobile device 1, and participate in pairing 5 with the mobile device 1. Prior to pairing, indication that the mobile device 1 and the exercising device 2 are engaged with each other has been provided to the exercising device 2 and the mobile device 1.


Subsequent to the pairing process 5, program code to be stored and processed by the mobile device 1 can be transmitted from the exercising device 2. Parameters and/or logics such as a rule engine, an app, a classification recipe or an HTML web page can be transmitted to the mobile device, for instance.


For example, an app may be transmitted to the mobile device 1. A user can select a training program with the help of the app. During the training session, the exercising device 2 may, for example, transmit a recipe or an instruction to the mobile device 1 on how to analyse movements of a user 6. The movements of the user may be determined or recorded using sensors of the exercising device 2. Examples of such sensors of the exercising device are force sensors and acceleration sensors. Data determined by the sensors of the exercising device 2 may be shown on a display 15 of the mobile device 1.


A user can further input data using a user interface 17 of the mobile device 1. A user interface 17 may be, for example, a touchscreen, a button, a keyboard or an optical system analysing gestures of the user. The exercising device 2 is capable of receiving the data which has been input via the user interface 17 of the mobile device 1. The exercising device 2 is capable of receiving instructions from the mobile device 1. For example, another training program may be selected.


According to certain embodiments, a first exercising device 2 and a first mobile device 1 in accordance with at least some embodiments form a first unit and a second exercising device 2 and a second mobile device 1 in accordance with at least some embodiments form a second unit. The first unit and the second unit are capable of communicating with each other. For example, rowing of a rowing boat having two seats can be simulated. Subsequent to the start of a specific training program, two users simultaneously using the respective exercising devices have to synchronize their movements in order to row a virtual rowing boat. A first user is then virtually in the position of the person sitting in front of the other person. Thus, a sports team can train rowing of a rowing boat, for example in winter time when training with a real rowing boat is not possible due to weather conditions.
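
The two-seat rowing simulation rewards the users for synchronizing their stroke movements. A rough, illustrative way to score such synchrony from the stroke time stamps of the two units is sketched below; the scoring rule, tolerance and stroke times are assumptions.

```python
# Stroke time stamps (seconds) detected by the two exercising devices of a virtual two-seat boat.
strokes_unit_1 = [1.0, 3.1, 5.0, 7.2, 9.1]
strokes_unit_2 = [1.1, 3.0, 5.4, 7.1, 9.6]

def synchrony_score(a, b, tolerance_s: float = 0.3) -> float:
    """Fraction of strokes in which both rowers pulled within the given tolerance of each other."""
    in_sync = sum(1 for t1, t2 in zip(a, b) if abs(t1 - t2) <= tolerance_s)
    return in_sync / max(len(a), len(b))

score = synchrony_score(strokes_unit_1, strokes_unit_2)
print(f"synchrony: {score:.0%}")  # the virtual boat could be made to move faster when the score is high
```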


In FIG. 5 a schematic top view of a mobile device 1 comprised by a yet further arrangement in accordance with at least some embodiments of the present invention is illustrated. The exercising device 2 includes an audio system 9. Prior to pairing, indication that the mobile device 1 and the exercising device 2 are engaged with each other has been provided to the exercising device 2 and the mobile device 1. After the pairing process 5 as described above in connection with FIG. 1, a music program can be started or stopped, a volume of music can be controlled and/or a title can be selected using the mobile device 1 in the form of a wrist-watch 1. According to this embodiment, the procedure is typically fully or at least partially controlled by the exercising device 2 such that no program code, or only a minimum amount thereof, needs to be installed on the mobile device 1. The mobile device 1 serves as a user interface for the exercising device 2. In other words, a computer program comprising program instructions which, when loaded into the exercising device 2, cause e.g. graphical user interface data to be determined for the mobile device 1 is provided. The graphical user interface data is wirelessly transmitted to the mobile device 1 from the exercising device 2 to provide at least one user interface functionality on the mobile device 1. Then data corresponding to user input is received and wirelessly transmitted to the exercising device 2. Only minimal system requirements, such as processing capacity and memory capacity, are placed on the mobile device 1. According to this embodiment, the input data is completely or at least partially processed by the exercising device 2. The bidirectional communication link between the mobile device 1 and the exercising device 2 may be used to enable the exercising device 2 to act as a server having control over the user interface and the mobile device 1 to act as a client whose content is fully or at least partially controlled by the exercising device 2.


Alternatively, a further, second mobile device (not shown), for example a smartphone, may be provided which includes an audio system. In such a case, a music program can be started or stopped, a volume of music can be controlled and/or a title can be selected using the wrist-watch 1 after the pairing process 5 between the mobile device 1 and the exercising device 2. In other words, the mobile device 1 may be used to additionally control functions of a further, second mobile device. According to this embodiment, the procedure is typically fully or at least partially controlled by the second mobile device such that no program code, or only a minimum amount thereof, needs to be installed on the mobile device 1 such as a wrist-watch. The mobile device 1 serves as a user interface for the second mobile device. In other words, a computer program comprising program instructions which, when loaded into the second mobile device, cause e.g. graphical user interface data to be determined for the mobile device 1 is provided. The graphical user interface data is wirelessly transmitted to the mobile device 1 from the second mobile device to provide at least one user interface functionality on the mobile device 1. Then data corresponding to user input is received and wirelessly transmitted to the second mobile device. Only minimal system requirements, such as processing capacity and memory capacity, are placed on the mobile device 1. According to this embodiment, the input data is completely or at least partially processed by the second mobile device. The bidirectional communication link between the mobile device 1 and the second mobile device may be used to enable the second mobile device to act as a server having control over the user interface and the mobile device 1 to act as a client whose content is fully or at least partially controlled by the second mobile device.


In FIG. 6 a schematic side view of an exercising device 2 comprised by a yet further arrangement in accordance with at least some embodiments of the present invention is illustrated. The shown exercising device 2 is an ergometer in the form of an indoor exercise bike. The exercising device 2 comprises at least one processing core and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the at least one processing core, cause the exercising device at least to transmit a first signal to a mobile device 1, receive a second signal from the mobile device 1, and participate in pairing with the mobile device 1. Prior to pairing, indication that the mobile device 1 and the exercising device 2 are engaged with each other has been provided to the exercising device 2 and the mobile device 1. The exercising device 2 further comprises a video system 10. The video system 10 may be, for example, a TV, a tablet, or a PC. The mobile device 1 may be used as a remote control of the video system 10.


A wrist-watch is shown as the mobile device 1. Typically, the exercising device 2 is configured to transmit the first signal and to receive the second signal when a distance between the mobile device 1 and the exercising device 2 is about 0 m to 10 m. In other words, the pairing process is activated when a user 6 with the wrist-watch 1 is moving closer to the exercising device 2. After the pairing process between the mobile device 1 and the exercising device 2, data can be transmitted between the mobile device 1 and the exercising device 2. For example, the user 6 may select a TV channel to be shown on the video system 10 by means of the wrist-watch 1. The wrist-watch 1 can therefore serve as a remote control when cycling. Alternatively, data obtained by sensors of the mobile device 1, for example heart beat data, and data obtained by sensors of the exercising device 2, for example speed data, may be displayed on the video system 10. The exercising device 2 is configured to participate in the pairing process during a session with the mobile device 1. The session is based on sensors of the mobile device 1 and the exercising device 2.


In FIG. 7 a schematic view of an arrangement in accordance with at least some embodiments of the present invention is illustrated. FIG. 7 represents a situation in a gym with different exercising devices 2, for instance. The arrangement comprises an apparatus 18 comprising at least one processing core and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the at least one processing core, cause the apparatus 18 at least to receive a first sensor data stream from an exercising device 2, receive a second sensor data stream from a mobile device 1, correlate the first data stream with the second data stream, determine that the exercising device 2 is engaged with the mobile device 1, and provide indication to the exercising device 2 and the mobile device 1 that the devices 1, 2 are engaged with each other. In the shown example, four mobile devices 1 and four exercising devices 2 are illustrated. One mobile device 1 and one exercising device 2 form a single unit. In other words, four separate units are shown in FIG. 7. The apparatus 18 in the form of a server receives from three units a first sensor data stream from a sensor of the respective exercising device 2 and a second data stream from a sensor of a respective mobile device 1. For each unit, indication to the respective exercising device 2 and the respective mobile device 1 that the devices 1, 2 are engaged with each other is being provided and pairing between the devices 1, 2 of each unit can start. For each unit an activity type of a user, for example running, cycling and rowing, can be estimated. From the fourth unit no sensor data streams are received by the apparatus 18, i.e. the fourth unit is not in use and therefore no activity type can be estimated.


The server 18 is configured to estimate an activity type of different users based on the correlated data streams of a unit. In other words, the server 18 is capable of estimating if a person is cycling, running or rowing based on the correlated data. Data determined by sensors of each mobile device 1 and each respective exercising device 2 may be stored on the server. The server 18 is remotely accessible in order to analyse a user session at a later stage for each unit.


A user may use different exercising devices 2 in a gym, for example a treadmill first and then a cross trainer. Based on the correlated data streams and the time stamps associated with the sensor data contained in the data streams, which indicate, for example, a start time and an end time of an activity type, pairing between a first exercising device and the mobile device can be automatically initiated first, and pairing between a different, second exercising device and the mobile device can be automatically initiated at a later stage.
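
When a user moves between exercising devices in a gym, the server can repeat the correlation over a recent window of samples for every exercising device and pair the mobile device with the device that correlates best. The sketch below is an illustrative assumption of such a selection step, reusing the Pearson-correlation idea from the earlier sketch; the threshold and the example signals are invented for the illustration.

```python
import numpy as np

def best_matching_device(mobile_window, device_windows, threshold=0.8):
    """Return the exercising device whose recent samples correlate best with the mobile device.

    mobile_window:  sensor values from the mobile device for the latest common time stamps.
    device_windows: dict mapping an exercising-device id to its values for the same time stamps.
    """
    best_id, best_corr = None, threshold
    for device_id, values in device_windows.items():
        corr = np.corrcoef(mobile_window, values)[0, 1]
        if corr > best_corr:
            best_id, best_corr = device_id, corr
    return best_id  # None means no exercising device is currently engaged

t = np.arange(60)
mobile = np.sin(0.4 * t)                                   # wrist-watch samples
devices = {"treadmill": 0.9 * np.sin(0.4 * t),             # same motion -> high correlation
           "cross_trainer": np.cos(1.3 * t),               # unrelated motion
           "rowing_machine": np.random.randn(60)}          # idle / noise
print("pair with:", best_matching_device(mobile, devices))
```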


It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting.


Reference throughout this specification to one embodiment or an embodiment means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Where reference is made to a numerical value using a term such as, for example, about or substantially, the exact numerical value is also disclosed.


As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no individual member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, various embodiments and examples of the present invention may be referred to herein along with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations of the present invention.


Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.


While the foregoing examples are illustrative of the principles of the present invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details of implementation can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, it is not intended that the invention be limited, except as by the claims set forth below.


The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of also un-recited features. The features recited in depending claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of “a” or “an”, that is, a singular form, throughout this document does not exclude a plurality.


INDUSTRIAL APPLICABILITY

At least some embodiments of the present invention find industrial application in displaying sensor data determined by at least one sensor of an exercising device and at least one sensor of a mobile device. Certain embodiments of the present invention are applicable in health care, in industry, in working environments, in sports, etc.


REFERENCE SIGNS LIST




  • 1 mobile device


  • 2 exercising device


  • 3 first signal


  • 4 second signal


  • 5 pairing


  • 6 user


  • 7 arrow


  • 8 arrow


  • 9 audio system


  • 10 video system


  • 11 processing core of mobile device


  • 12 memory of mobile device


  • 13 processing core of exercising device


  • 14 memory of exercising device


  • 15 display


  • 16 sensor


  • 17 user interface


  • 18 apparatus


Claims
  • 1. A server comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause at least: receiving by the server a first sensor data stream from a first exercising device, wherein the first sensor data stream comprises time stamps, receiving by the server a second sensor data stream from a mobile device, wherein the second sensor data stream comprises time stamps, correlating by the server the first sensor data stream with the second sensor data stream based on the time stamps comprised by the first sensor data stream and the time stamps comprised by the second sensor data stream, determining by the server that the first exercising device is engaged with the mobile device, and providing from the server indication to the first exercising device and the mobile device that the first exercising device and the mobile device are engaged with each other, and subsequently receiving by the server a third sensor data stream from a second exercising device, wherein the third sensor data stream comprises time stamps, receiving by the server a fourth sensor data stream from the mobile device, wherein the fourth sensor data stream comprises time stamps, correlating by the server the third sensor data stream with the fourth sensor data stream based on the time stamps comprised by the third sensor data stream and the time stamps comprised by the fourth sensor data stream, determining by the server that the second exercising device is engaged with the mobile device, and providing from the server indication to the second exercising device and the mobile device that the second exercising device and the mobile device are engaged with each other, and changing information displayed on a display of the mobile device automatically when the second exercising device is engaged with the mobile device, wherein the correlating comprises identifying identical time stamps, and
  • 2. The server according to claim 1, wherein the server is configured to store and process sensor data received from the first exercising device and the second exercising device.
  • 3. The server according to claim 1, wherein the server is configured to store and process sensor data received from the mobile device.
  • 4. The server according to claim 1, wherein the first sensor data stream or the third sensor data stream is based on a first acceleration sensor and the second data stream or the fourth sensor data stream is based on a second acceleration sensor.
  • 5. The server according to claim 1, wherein the first sensor data stream or the third sensor data stream is based on a first angular velocity sensor and the second data stream or the fourth sensor data stream is based on a second angular velocity sensor.
  • 6. The server according to claim 1, wherein the server is configured to receive the first sensor data stream or the third sensor data stream from an ergometer, a gym exercising device, a fitness device, a sports device, a weight lifting device, an exercise bike, a treadmill, a rowing machine, or a cross trainer.
  • 7. The server according to claim 1, wherein the server is configured to receive the second sensor data stream or the fourth sensor data stream from a tablet, a smartphone, a wearable sensor, or an external sensor.
  • 8. The server according to claim 1, wherein the server is configured to receive a fifth sensor data stream from a wearable sensor or an external sensor.
  • 9. The server according to claim 1, wherein the server is configured to allow a user to remotely read-out at least a part of at least one of the first sensor data stream, the second sensor data stream, the third sensor data stream and the fourth sensor data stream.
  • 10. The server according to claim 1, wherein the server is remotely accessible by a person in order to analyse a user session.
  • 11. The server according to claim 1, wherein the server is configured to estimate an activity type of a user based on the correlated data streams.
  • 12. The server according to claim 1, wherein the server is configured to receive the second sensor data stream or the fourth sensor data stream from a wrist-watch.
  • 13. The server according to claim 1, wherein the server is configured to receive the second sensor data stream or the fourth sensor data stream from a smartwatch.
  • 14. The server according to claim 1, wherein the first exercising device, the second exercising device and the mobile device each have a classification task to produce semantic events.
  • 15. The server according to claim 1, wherein the server is in the form of a mobile device.
  • 16. The server according to claim 8, wherein any of the first exercising device and the second exercising device is enabled to act as a server having control over the user interface of the mobile device.
  • 17. Use of a server in connection with a first exercising device, a second exercising device and a mobile device, wherein the server comprises at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause at least:
    receiving by the server a first sensor data stream from a first exercising device, wherein the first sensor data stream comprises time stamps,
    receiving by the server a second sensor data stream from a mobile device, wherein the second sensor data stream comprises time stamps,
    correlating by the server the first sensor data stream with the second sensor data stream based on the time stamps comprised by the first sensor data stream and the time stamps comprised by the second sensor data stream,
    determining by the server that the first exercising device is engaged with the mobile device, and
    providing from the server indication to the first exercising device and the mobile device that the first exercising device and the mobile device are engaged with each other, and subsequently
    receiving by the server a third sensor data stream from a different second exercising device, wherein the third sensor data stream comprises time stamps,
    receiving by the server a fourth sensor data stream from the mobile device, wherein the fourth sensor data stream comprises time stamps,
    correlating by the server the third sensor data stream with the fourth sensor data stream based on the time stamps comprised by the third sensor data stream and the time stamps comprised by the fourth sensor data stream,
    determining by the server that the second exercising device is engaged with the mobile device, and
    providing from the server indication to the second exercising device and the mobile device that the second exercising device and the mobile device are engaged with each other, and
    changing information displayed on a display of the mobile device automatically when the second exercising device is engaged with the mobile device,
  • 18. The use according to claim 17, wherein the use takes place in a gym, in connection with a training session, in connection with a sports session or in connection with a fitness session.
  • 19. An arrangement comprising a server, a first exercising device, a second exercising device and a mobile device, wherein the server comprises at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause at least:
    receiving by the server a first sensor data stream from a first exercising device, wherein the first sensor data stream comprises time stamps,
    receiving by the server a second sensor data stream from a mobile device, wherein the second sensor data stream comprises time stamps,
    correlating by the server the first sensor data stream with the second sensor data stream based on the time stamps comprised by the first sensor data stream and the time stamps comprised by the second sensor data stream,
    determining by the server that the first exercising device is engaged with the mobile device, and
    providing from the server indication to the first exercising device and the mobile device that the first exercising device and the mobile device are engaged with each other, and subsequently
    receiving by the server a third sensor data stream from a different second exercising device, wherein the third sensor data stream comprises time stamps,
    receiving by the server a fourth sensor data stream from the mobile device, wherein the fourth sensor data stream comprises time stamps,
    correlating by the server the third sensor data stream with the fourth sensor data stream based on the time stamps comprised by the third sensor data stream and the time stamps comprised by the fourth sensor data stream,
    determining by the server that the second exercising device is engaged with the mobile device, and
    providing from the server indication to the second exercising device and the mobile device that the second exercising device and the mobile device are engaged with each other, and
    changing information displayed on a display of the mobile device automatically when the second exercising device is engaged with the mobile device,
    wherein the correlating comprises identifying identical time stamps, and
  • 20. The arrangement according to claim 19, wherein the first exercising device and the second exercising device are configured to participate in pairing with the mobile device.
  • 21. The arrangement according to claim 19, wherein the mobile device is configured to participate in pairing with the first exercising device and the second exercising device during a session with the first and second exercising devices.
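The correlation and engagement steps recited in claims 1, 17 and 19 can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical illustration only: the names, the overlap threshold and the notification placeholder are assumptions and not part of the claims, which merely require identifying identical time stamps in the two streams.

```python
# Minimal, hypothetical sketch of time-stamp-based correlation and
# engagement determination. Names and the min_matches threshold are
# assumptions made for this illustration, not claim language.
from dataclasses import dataclass
from typing import Iterable


@dataclass
class Sample:
    timestamp: int   # e.g. milliseconds since a shared epoch
    value: float     # e.g. acceleration magnitude


def streams_correlate(stream_a: Iterable[Sample],
                      stream_b: Iterable[Sample],
                      min_matches: int = 10) -> bool:
    """Correlate two sensor data streams by identifying identical time
    stamps; a sufficient number of identical stamps is taken here to mean
    the streams were produced during the same user session."""
    common = {s.timestamp for s in stream_a} & {s.timestamp for s in stream_b}
    return len(common) >= min_matches


def handle_streams(exercising_stream, mobile_stream, notify) -> bool:
    """Determine engagement and, if engaged, indicate it to both devices.

    `notify` stands in for whatever transport the server would use to
    reach the exercising device and the mobile device."""
    if streams_correlate(exercising_stream, mobile_stream):
        notify("exercising_device", "engaged")
        notify("mobile_device", "engaged")
        return True
    return False


# Example run with synthetic data sharing ten identical time stamps.
a = [Sample(t, 0.0) for t in range(0, 1000, 100)]
b = [Sample(t, 1.0) for t in range(0, 1000, 100)]
print(handle_streams(a, b, lambda device, msg: print(device, msg)))  # True
```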
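The final step of claims 1, 17 and 19, changing the information displayed on the mobile device automatically when the second exercising device becomes engaged, could look roughly as follows. The MobileDisplay class and the device names are invented for the illustration; the claims do not prescribe any particular display logic.

```python
# Hypothetical sketch of the automatic display change on engagement.
class MobileDisplay:
    def __init__(self):
        self.source = None

    def show(self, source: str) -> None:
        if source != self.source:      # change only when the engaged
            self.source = source       # exercising device changes
            print(f"Now displaying data from: {source}")


display = MobileDisplay()
display.show("treadmill")        # first exercising device engaged
display.show("rowing machine")   # second device engaged -> display switches
```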
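Claim 11 recites estimating an activity type of a user based on the correlated data streams, without fixing a particular estimation method. The sketch below assumes the correlated samples are acceleration magnitudes and uses a deliberately simple threshold heuristic purely for illustration; the thresholds and activity labels are invented for the example.

```python
# Hypothetical activity-type estimate from correlated acceleration samples.
# Thresholds and labels are assumptions made for this illustration only.
import statistics


def estimate_activity_type(correlated_samples):
    """correlated_samples: list of (timestamp, acceleration_magnitude)
    pairs taken from the portions of the streams with identical stamps."""
    magnitudes = [m for _, m in correlated_samples]
    mean_m = statistics.mean(magnitudes)
    if mean_m < 1.5:
        return "exercise bike"   # little upper-body movement
    if mean_m < 3.0:
        return "cross trainer"
    return "treadmill"           # strong periodic impacts


print(estimate_activity_type([(1000, 0.9), (1040, 1.1), (1080, 1.0)]))
# -> exercise bike
```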
Priority Claims (5)
Number Date Country Kind
20155989 Dec 2015 FI national
1522525 Dec 2015 GB national
20165707 Sep 2016 FI national
20165709 Sep 2016 FI national
20165710 Sep 2016 FI national
RELATED APPLICATIONS

This application is a Continuation-in-Part of U.S. patent application Ser. No. 15/382,763, filed on Dec. 19, 2016, which claims priority to Finnish Patent Application No. 20155989, filed on Dec. 21, 2015, the subject matter of each of which is incorporated by reference in its entirety. Additionally, the subject matter of U.S. patent application Ser. No. 14/945,914 is incorporated herein by reference in its entirety.

US Referenced Citations (208)
Number Name Date Kind
5457284 Ferguson Oct 1995 A
5503145 Clough Apr 1996 A
5924980 Coetzee Jul 1999 A
6882955 Ohlenbusch et al. Apr 2005 B1
7627423 Brooks Dec 2009 B2
7706973 McBride et al. Apr 2010 B2
7721118 Tamasi et al. May 2010 B1
7917198 Ahola et al. Mar 2011 B2
7938752 Wang May 2011 B1
8052580 Saalasti et al. Nov 2011 B2
8323188 Tran Dec 2012 B2
8328718 Tran Dec 2012 B2
8538693 McBride et al. Sep 2013 B2
8612142 Zhang Dec 2013 B2
8655591 Van Hende Feb 2014 B2
8781730 Downey et al. Jul 2014 B2
8949022 Fahrner Feb 2015 B1
9008967 McBride et al. Apr 2015 B2
9107586 Tran Aug 2015 B2
9222787 Blumenberg et al. Dec 2015 B2
9317660 Burich et al. Apr 2016 B2
9648108 Granqvist et al. May 2017 B2
9665873 Ackland et al. May 2017 B2
9829331 McBride et al. Nov 2017 B2
9830516 Biswas et al. Nov 2017 B1
9907473 Tran Mar 2018 B2
9923973 Granqvist et al. Mar 2018 B2
10234290 Lush et al. Mar 2019 B2
10244948 Pham et al. Apr 2019 B2
10327673 Eriksson et al. Jun 2019 B2
10415990 Cho et al. Sep 2019 B2
10433768 Eriksson et al. Oct 2019 B2
10515990 Hung et al. Dec 2019 B2
10634511 McBride et al. Apr 2020 B2
10816671 Graham Oct 2020 B2
20030038831 Engelfriet Feb 2003 A1
20030109287 Villaret Jun 2003 A1
20050070809 Acres Mar 2005 A1
20050086405 Kobayashi et al. Apr 2005 A1
20060068812 Carro et al. Mar 2006 A1
20060136173 Case, Jr. et al. Jun 2006 A1
20060255963 Thompson et al. Nov 2006 A1
20070156335 McBride et al. Jul 2007 A1
20070208544 Kulach et al. Sep 2007 A1
20070276200 Ahola et al. Nov 2007 A1
20080052493 Chang Feb 2008 A1
20080109158 Huhtala et al. May 2008 A1
20080136620 Lee et al. Jun 2008 A1
20080158117 Wong et al. Jul 2008 A1
20080214360 Stirling et al. Sep 2008 A1
20080294663 Heinley et al. Nov 2008 A1
20080318598 Fry Dec 2008 A1
20090047645 Dibenedetto et al. Feb 2009 A1
20090048070 Vincent et al. Feb 2009 A1
20090094557 Howard Apr 2009 A1
20090100332 Kanjilal et al. Apr 2009 A1
20090265623 Kho et al. Oct 2009 A1
20100099539 Haataja et al. Apr 2010 A1
20100167712 Stallings et al. Jul 2010 A1
20100187074 Manni Jul 2010 A1
20100257014 Roberts et al. Oct 2010 A1
20100313042 Shuster Dec 2010 A1
20110010704 Jeon et al. Jan 2011 A1
20110152695 Granqvist et al. Jun 2011 A1
20110218385 Bolyard et al. Sep 2011 A1
20110251822 Darley et al. Oct 2011 A1
20110252351 Sikora et al. Oct 2011 A1
20110281687 Gilley Nov 2011 A1
20110283224 Ramsey et al. Nov 2011 A1
20110288381 Bartholomew Nov 2011 A1
20110296312 Boyer et al. Dec 2011 A1
20110307723 Cupps et al. Dec 2011 A1
20120022336 Teixeira Jan 2012 A1
20120100895 Priyantha et al. Apr 2012 A1
20120109518 Huang May 2012 A1
20120116548 Goree et al. May 2012 A1
20120123806 Schumann et al. May 2012 A1
20120158289 Bernheim Brush et al. Jun 2012 A1
20120185268 Wiesner et al. Jul 2012 A1
20120219186 Wang et al. Aug 2012 A1
20120239173 Laikari et al. Sep 2012 A1
20120283855 Hoffman et al. Nov 2012 A1
20120289791 Jain Nov 2012 A1
20120317520 Lee Dec 2012 A1
20130053990 Ackland et al. Feb 2013 A1
20130060167 Dracup et al. Mar 2013 A1
20130095459 Tran Apr 2013 A1
20130127636 Aryanpur et al. May 2013 A1
20130151874 Parks et al. Jun 2013 A1
20130178334 Brammer Jul 2013 A1
20130187789 Lowe et al. Jul 2013 A1
20130190903 Balakrishnan et al. Jul 2013 A1
20130217979 Blackadar et al. Aug 2013 A1
20130225370 Flynt et al. Aug 2013 A1
20130234924 Janefalkar et al. Sep 2013 A1
20130250845 Greene et al. Sep 2013 A1
20130289932 Baechler et al. Oct 2013 A1
20130304377 Van Hende Nov 2013 A1
20130312043 Stone et al. Nov 2013 A1
20130332286 Medelius et al. Dec 2013 A1
20130345978 Lush et al. Dec 2013 A1
20140018686 Medelius et al. Jan 2014 A1
20140046223 Kahn et al. Feb 2014 A1
20140094200 Schatzberg et al. Apr 2014 A1
20140135593 Jayalth May 2014 A1
20140142732 Karvonen May 2014 A1
20140149754 Silva et al. May 2014 A1
20140159915 Hong et al. Jun 2014 A1
20140163927 Molettiere et al. Jun 2014 A1
20140208333 Beals et al. Jul 2014 A1
20140218281 Amayeh et al. Aug 2014 A1
20140235166 Molettiere et al. Aug 2014 A1
20140237028 Messenger et al. Aug 2014 A1
20140257533 Morris et al. Sep 2014 A1
20140275821 Beckman Sep 2014 A1
20140288680 Hoffman et al. Sep 2014 A1
20140336796 Agnew Nov 2014 A1
20140337036 Haiut et al. Nov 2014 A1
20140337450 Fitbit Nov 2014 A1
20140343380 Carter et al. Nov 2014 A1
20140350883 Carter et al. Nov 2014 A1
20140365107 Dutta et al. Dec 2014 A1
20140372064 Darley et al. Dec 2014 A1
20150006617 Yoo et al. Jan 2015 A1
20150037771 Kaleal, III et al. Feb 2015 A1
20150042468 White et al. Feb 2015 A1
20150057945 White et al. Feb 2015 A1
20150113417 Yuen et al. Apr 2015 A1
20150119198 Wisbey et al. Apr 2015 A1
20150127966 Ma et al. May 2015 A1
20150141873 Fei May 2015 A1
20150160026 Kitchel Jun 2015 A1
20150180842 Panther Jun 2015 A1
20150185815 Debates et al. Jul 2015 A1
20150209615 Edwards Jul 2015 A1
20150233595 Fadell et al. Aug 2015 A1
20150272483 Etemad et al. Oct 2015 A1
20150312857 Kim et al. Oct 2015 A1
20150317801 Bentley Nov 2015 A1
20150326709 Pennanen et al. Nov 2015 A1
20150334772 Wong et al. Nov 2015 A1
20150335978 Syed et al. Nov 2015 A1
20150342533 Kelner Dec 2015 A1
20150347983 Jon et al. Dec 2015 A1
20150350822 Xiao et al. Dec 2015 A1
20150362519 Balakrishnan et al. Dec 2015 A1
20150374279 Takakura et al. Dec 2015 A1
20150382150 Ansermet Dec 2015 A1
20160007288 Samardzija et al. Jan 2016 A1
20160007934 Arnold et al. Jan 2016 A1
20160012294 Bouck Jan 2016 A1
20160023043 Grundy Jan 2016 A1
20160026236 Vasistha et al. Jan 2016 A1
20160034043 Le Grand et al. Feb 2016 A1
20160034133 Wilson et al. Feb 2016 A1
20160041593 Dharawat Feb 2016 A1
20160058367 Raghuram et al. Mar 2016 A1
20160058372 Raghuram et al. Mar 2016 A1
20160059079 Watterson Mar 2016 A1
20160072557 Ahola Mar 2016 A1
20160081028 Chang et al. Mar 2016 A1
20160081625 Kim et al. Mar 2016 A1
20160084869 Yuen et al. Mar 2016 A1
20160091980 Baranski et al. Mar 2016 A1
20160104377 French et al. Apr 2016 A1
20160135698 Baxi et al. May 2016 A1
20160143579 Martikka May 2016 A1
20160144236 Ko May 2016 A1
20160148396 Bayne et al. May 2016 A1
20160148615 Lee et al. May 2016 A1
20160184686 Sampathkumaran Jun 2016 A1
20160209907 Han et al. Jul 2016 A1
20160226945 Granqvist et al. Aug 2016 A1
20160259495 Butcher et al. Sep 2016 A1
20160317097 Adams et al. Nov 2016 A1
20160327915 Katzer et al. Nov 2016 A1
20160328991 Simpson et al. Nov 2016 A1
20160346611 Rowley et al. Dec 2016 A1
20160367202 Carter Dec 2016 A1
20160374566 Fung et al. Dec 2016 A1
20160379547 Okada Dec 2016 A1
20170010677 Roh et al. Jan 2017 A1
20170011089 Bermudez et al. Jan 2017 A1
20170011210 Cheong et al. Jan 2017 A1
20170032256 Otto et al. Feb 2017 A1
20170038740 Knappe et al. Feb 2017 A1
20170063475 Feng Mar 2017 A1
20170065230 Sinha et al. Mar 2017 A1
20170087431 Syed et al. Mar 2017 A1
20170124517 Martin May 2017 A1
20170153119 Nieminen et al. Jun 2017 A1
20170153693 Duale et al. Jun 2017 A1
20170154270 Lindman et al. Jun 2017 A1
20170168555 Munoz et al. Jun 2017 A1
20170173391 Wiebe et al. Jun 2017 A1
20170232294 Kruger et al. Aug 2017 A1
20170262699 White et al. Sep 2017 A1
20170266494 Crankson et al. Sep 2017 A1
20170316182 Blackadar et al. Nov 2017 A1
20170340221 Cronin et al. Nov 2017 A1
20180015329 Burich et al. Jan 2018 A1
20180108323 Lindman et al. Apr 2018 A1
20180193695 Lee Jul 2018 A1
20180345077 Blahnik et al. Dec 2018 A1
20190025928 Pantelopoulos et al. Jan 2019 A1
20190056777 Munoz et al. Feb 2019 A1
20190069244 Jeon et al. Feb 2019 A1
20190367143 Sinclair et al. Dec 2019 A1
Foreign Referenced Citations (61)
Number Date Country
2007216704 Apr 2008 AU
1877340 Dec 2006 CN
102495756 Jun 2012 CN
103309428 Sep 2013 CN
103631359 Mar 2014 CN
204121706 Jan 2015 CN
104680046 Jun 2015 CN
105242779 Jan 2016 CN
106062661 Oct 2016 CN
106604369 Apr 2017 CN
108052272 May 2018 CN
103154954 Jun 2018 CN
108377264 Aug 2018 CN
108983873 Dec 2018 CN
1755098 Feb 2007 EP
2107837 Oct 2009 EP
2172249 Apr 2010 EP
2770454 Aug 2014 EP
2703945 Mar 2015 EP
2849473 Mar 2015 EP
2910901 Aug 2015 EP
2996409 Mar 2016 EP
3018582 May 2016 EP
3023859 May 2016 EP
3361370 Aug 2018 EP
2096820 Sep 2009 EP
126911 Feb 2017 FI
2404593 Feb 2005 GB
2425180 Oct 2006 GB
2513585 Nov 2014 GB
2530196 Mar 2016 GB
2537423 Oct 2016 GB
2541234 Feb 2017 GB
2555107 Apr 2018 GB
20110070049 Jun 2011 KR
101500662 Mar 2015 KR
528295 Oct 2006 SE
201706840 Feb 2017 TW
I598076 Sep 2018 TW
WO02054157 Jul 2002 WO
WO2010083562 Jul 2010 WO
WO2010144720 Dec 2010 WO
WO2011061412 May 2011 WO
WO2011123932 Oct 2011 WO
WO2012037637 Mar 2012 WO
WO2012115943 Aug 2012 WO
WO2012141827 Oct 2012 WO
WO2013091135 Jun 2013 WO
WO2013121325 Aug 2013 WO
WO2014118767 Aug 2014 WO
WO2014144258 Sep 2014 WO
WO2014193672 Dec 2014 WO
WO2014209697 Dec 2014 WO
WO2015021407 Feb 2015 WO
WO2014182162 Jun 2015 WO
WO2015087164 Jun 2015 WO
WO2015131065 Sep 2015 WO
WO2016022203 Feb 2016 WO
WO2017011818 Jan 2017 WO
WO2018217348 Nov 2018 WO
WO2018222936 Dec 2018 WO
Non-Patent Literature Citations (6)
Entry
ARM big.LITTLE. Wikipedia, The Free Encyclopedia, Oct. 11, 2018, Retrieved on May 28, 2020 from: <https://en.wikipedia.org/w/index.php?title=ARM_bit.LITTLE&oldid=863559211>, foreword on p. 1, section “Run-state migration” on pp. 1-2.
Qualcomm Snapdragon Wear 3100 Platform Supports New Ultra-Low Power System Architecture For Next Generation Smartwatches. Qualcomm Technologies, Inc., Sep. 10, 2018, Retrieved on May 28, 2020 from: <https://www.qualcomm.com/news/releases/2018/09/10/qualcomm-snapdragon-wear-3100-platform-supports sections “Snapdragon Wear 3100 Based Smartwatches Aim to Enrich the User Experience” on pp. 3-4.
CNET: Dec. 11, 2017, “Apple watch can now sync with a treadmill”, youtube.com, [online], Available from: https://www.youtube.com/watch?v=7RvMC3wFDME [ Accessed Nov. 19, 2020].
Sieber et al: Embedded systems in the Poseidon MK6 rebreather. Intelligent Solutions in Embedded Systems, 2009, pp. 37-42.
CASH: A guide to GPS and route plotting for cyclists. 2018. www.cyclinguk.org/article/guide-gps-and-route-plotting-cyclists.
Sheta et al: Packet scheduling in LTE mobile network. International Journal of Scientific & Engineering Research, Jun. 2013, vol. 4, Issue 6.
Related Publications (1)
Number Date Country
20190282103 A1 Sep 2019 US
Continuation in Parts (4)
Number Date Country
Parent 15382763 Dec 2016 US
Child 16427401 US
Parent 15386050 Dec 2016 US
Child 15382763 US
Parent 15386062 Dec 2016 US
Child 15386050 US
Parent 15386074 Dec 2016 US
Child 15386062 US