The present invention relates to an apparatus comprising at least one processing core and at least one memory including computer program code.
Further, the present invention relates to a use of an apparatus in connection with an exercising device and a mobile device.
Furthermore, the present invention relates to an arrangement comprising an apparatus, an exercising device and a mobile device.
Different exercising devices such as ergometers, gym exercising devices, fitness devices, sports devices, weight lifting devices, exercise bikes, treadmills, rowing machines, cross trainers, etc. are known. Some known exercising devices, for example exercise bikes or treadmills, are equipped with an internal display, a user interface and/or a memory. Further, various mobile devices are known by means of which information can be stored, processed and/or displayed. The mobile devices typically comprise a user interface such as a touchscreen, a keyboard or at least one button. Examples of such devices are smartphones, tablets, smartwatches and wrist-watches.
User sessions, such as training sessions, may be recorded, for example in notebooks, spreadsheets or other suitable media. Recorded training sessions enable more systematic training, and progress toward set goals can be assessed and tracked from the records so produced. Such records may be stored for future reference, for example to assess progress an individual is making as a result of the training. An activity session may comprise a training session or another kind of session.
Personal devices, such as, for example, smart watches, smartphones or smart jewelry, may be configured to produce recorded sessions of user activity. Such recorded sessions may be useful in managing physical training, child safety or in professional uses. Recorded sessions, or more generally sensor-based activity management, may be of varying type, such as, for example, running, walking, skiing, canoeing, wandering, or assisting the elderly.
Recorded sessions may be viewed using a personal computer, for example, wherein recordings may be copied from a personal device to the personal computer. Files on a personal computer may be protected using passwords and/or encryption, for example.
Personal devices may be furnished with sensors, which may be used, for example, in determining a heart-beat rate of a user during a user session. A recorded heart-beat rate of a user may be later observed using a personal computer, for example.
Document FI 20155989 discloses an apparatus comprising a memory configured to store first-type sensor data, at least one processing core configured to compile a message based at least partly on the first-type sensor data, to cause the message to be transmitted from the apparatus, to cause receiving in the apparatus of a machine readable instruction, and to derive an estimated activity type, using the machine readable instruction, based at least partly on sensor data.
The invention is defined by the features of the independent claims. Some specific embodiments are defined in the dependent claims.
According to a first aspect of the present invention, there is provided an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to receive a first sensor data stream from an exercising device, receive a second sensor data stream from a mobile device, correlate the first data stream with the second data stream, determine that the exercising device is engaged with the mobile device, and provide indication to the exercising device and the mobile device that the devices are engaged with each other.
Various embodiments of the first aspect may comprise at least one feature from the following bulleted list:
According to a second aspect of the present invention, there is provided a use of the apparatus according to at least one of claims 1-13 in connection with an exercising device and a mobile device. According to an embodiment, the use takes place in a gym, in connection with a training session, in connection with a sports session or in connection with a fitness session.
According to a third aspect of the present invention, there is provided an arrangement comprising an apparatus according to at least one of claims 1-13, an exercising device and a mobile device.
Various embodiments of the third aspect may comprise at least one feature from the following bulleted list:
Considerable advantages are obtained by means of certain embodiments of the present invention. An apparatus comprising at least one processing core and at least one memory including computer program code is provided. According to certain embodiments of the present invention, it can be determined that an exercising device and a mobile device are engaged with each other based on correlated sensor data streams received from the exercising device and the mobile device. An activity type of a user can be estimated by the apparatus based on the received sensor data streams of the exercising device and the mobile device. The apparatus can be remotely accessed by a person in order to analyse a user session in real time or at a later stage. For example, the apparatus may be connected to the internet. A client connected to the apparatus may receive the sensor data streams and may be configured to cause pairing of the mobile device and the exercising device based on the correlated data streams or based on instructions input by a person.
After providing indication to the exercising device and the mobile device that the devices are engaged with each other, the exercising device can participate in pairing with the mobile device and the mobile device can participate in pairing with the exercising device according to certain embodiments of the present invention. The mobile device and the exercising device can at least temporarily form a unit in which the mobile device serves as a display, a user interface and/or a memory of the exercising device. According to certain embodiments of the present invention, the mobile device serves as a display of the exercising device. The mobile device may serve as the only display or as an additional display of the exercising device during the time period of a user session. According to certain other embodiments of the present invention, the mobile device serves as a user interface of the exercising device. The mobile device may serve as the only user interface or as an additional user interface of the exercising device during the time period of a user session.
A mobile device and the exercising device can temporarily form a unit during the time period of a user session, a training session or a sports session. According to an embodiment, an app which is related to a specific exercising device can be stored in the memory of the mobile device of a user. Settings of the app can be personalized. For example, when the user brings his/her mobile device to a gym, the user can start a user session using his/her own personalized settings. Personalized settings may include information about age, weight, height, gender, body mass index, maximum performance capacity, activity parameter, previous energy expenditure and maximum heart rate, for instance. Also personalized exercise-guidance parameters such as an energy expenditure target, heart rate zones, activity zones, anaerobic threshold, fitness classification identifier and/or dehydration warning limits may be stored on the mobile device. Personalized data determined by sensors of the exercising device can further be stored in the memory of the mobile device and/or on the internet. Further, personalized data determined by sensors of the mobile device can be stored in the memory of the mobile device and/or on the internet. Determined data of the user, for example movement data or heart-beat rate data, can then be analysed at a later stage subsequent to the user session, training session or sports session. When another user is using the exercising device in the gym, his/her mobile device and the exercising device temporarily form a unit during the time period of another user session. Personalized settings can be used by this other user and personalized data can be stored in the memory of the respective mobile device and/or on the internet for further analysis. Consequently, it is not necessary to store any personalized data on the memory of the exercising device according to certain embodiments of the present invention.
According to another embodiment, no program code, or only a minimum amount thereof, needs to be installed on the mobile device, such as a wrist watch. The mobile device serves as a display and/or user interface for the exercising device. The procedure is controlled by the exercising device or the apparatus. Only minimal system requirements are imposed on the mobile device. According to this embodiment, the input data is processed by the exercising device or the apparatus. The bidirectional communication link between the mobile device and the exercising device or apparatus may be used to enable the exercising device or apparatus to act as a server having control over the user interface and the mobile device to act as a client whose content is fully or at least partially controlled by the exercising device or apparatus.
Receiving a first sensor data stream from the exercising device, receiving a second sensor data stream from the mobile device, correlating the first data stream with the second data stream, determining that the exercising device is engaged with the mobile device, and providing indication to the exercising device and the mobile device that the devices are engaged with each other can take place automatically, thus providing to the user a comfortable user experience.
According to certain embodiments, a temporarily combined unit can share a classification task. That is, the exercising device has its own classification task to produce semantic events such as ‘lift’, ‘release’ or ‘step’. Similarly, the mobile device has its own classification task to produce semantic events. The system thus creates a more comprehensive analysis of the user's actions, providing, for example, a deeper understanding of the user's biomechanical accuracy.
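The shared classification task described above may be sketched, purely for illustration, as follows; the event names, record formats and the merging strategy are assumptions made for this sketch and are not features of any claim:

```python
# Illustrative sketch: each device classifies its own sensor data into
# semantic events ('lift', 'release', 'step', ...), and the temporarily
# combined unit merges both event streams by time stamp so that the
# user's actions can be analysed as a whole.

def merge_events(device_events, mobile_events):
    """Merge two lists of (time_stamp, event) pairs into one time-ordered list.

    Each resulting entry records which side produced the event, so a
    combined analysis can reason over both streams together.
    """
    merged = [(t, "exercising_device", e) for t, e in device_events]
    merged += [(t, "mobile_device", e) for t, e in mobile_events]
    return sorted(merged)
```

A combined analyser could then, for example, check whether an arm-swing event from the mobile device falls between two step events from the exercising device, as one simple measure of biomechanical accuracy.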
Certain embodiments of the present invention are applicable in health care, in industry, in working environments, in sports, and the like.
In
The mobile device 1 may be a wrist-watch, for instance. The exercising device 2 may be a treadmill, for instance. The apparatus 18 may be, for example, a server or server infrastructure. The apparatus 18 is configured to store and process sensor data received from the exercising device 2. Further, the apparatus 18 is configured to store and process sensor data received from the mobile device 1. Of course, the apparatus may be also configured to store and process sensor data received from a wearable sensor or any other external sensor, for example a MOVESENSE sensor. Such sensor data may be wirelessly transferred to the apparatus 18 directly or to at least one of the mobile device 1 and the exercising device 2 first and then to the apparatus 18.
According to a certain embodiment, an external sensor (not shown), for example a MOVESENSE sensor, is attached to a user and connected to the mobile device 1, for example a wrist watch 1. When the user comes to an exercising device 2, the mobile device 1 automatically displays information. Simultaneously, the mobile device 1 receives instructions from the exercising device 2 and/or apparatus 18. However, also the exercising device 2 and/or apparatus 18 may receive data from the mobile device 1 and/or the external sensor. The data may, for example, include personal data, sensor data and/or external sensor data. The data is typically processed by the exercising device 2 and/or apparatus 18. This kind of user experience is automatically created. When the user changes the exercising device 2, for example in a gym, the displayed information on the display of the mobile device 1 also automatically changes. The exercising device 2 can additionally receive further data from a server or via the internet. External sensor data can be analysed by the exercising device 2 and/or apparatus 18 and content, for example information derived from the sensor data, can be automatically displayed on the mobile device 1. In such a situation, the bidirectional communication link may be used to enable the exercising device 2 to act as a server having control over the user interface and the mobile device 1 to act as a client whose content is fully or at least partially controlled by the exercising device 2.
The first sensor data stream may be, for example, based on a first acceleration sensor and the second data stream may be, for example, based on a second acceleration sensor. That is, the mobile device 1 and the exercising device 2 each comprise an acceleration sensor, and the acceleration data of both sensors is received by the server 18. The mobile device 1 and the exercising device 2 both measure the same parameter, i.e. acceleration. The sensor of the mobile device 1 may, for example, measure acceleration of a cyclical motion of the right arm of the user as indicated by arrow 7. The sensor of the exercising device 2 may, for example, measure acceleration of the treadmill belt as indicated by arrow 8. According to other embodiments, the sensor of the mobile device 1 and the sensor of the exercising device 2 both measure another parameter, for example angular velocity.
After indication that the devices 1, 2 are engaged with each other has been provided to the exercising device 2 and the mobile device 1, the exercising device 2 may participate in pairing with the mobile device 1 and the mobile device 1 may participate in pairing with the exercising device 2. The mobile device 1 in the form of a wrist-watch comprises at least one processing core and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the at least one processing core, cause the mobile device 1 at least to receive a first signal 3 from an exercising device 2, process the received signal, respond to the received signal by transmitting a second signal 4 to the exercising device 2, and participate in a pairing process 5 with the exercise device 2. Further examples are described below in connection with
According to certain embodiments, the apparatus 18 may be further configured to allow a user to remotely read-out at least a part of at least one of the first sensor data stream and the second sensor data stream. In other words, the apparatus 18 is remotely accessible by a person in order to analyse a user session.
The first sensor data stream may be received by the apparatus 18 from the exercising device 2 via wire or a wireless technology, for example a low power wireless communication technology such as Bluetooth, Bluetooth Low Energy, or Wibree. The second sensor data stream may be typically received by the apparatus 18 from the mobile device 1 via a wireless technology, for example a low power wireless communication technology such as Bluetooth, Bluetooth Low Energy, or Wibree.
In
Sensor data contained in the data streams is typically associated with time stamps. For example, one acceleration value is measured every second and for each measured acceleration value the respective time is transmitted to the apparatus 18 in the form of a time stamp. Sensor data from the mobile device 1 having a first time stamp is compared with sensor data from the exercising device 2 having an identical time stamp. This procedure can be repeated for a plurality of second, third, etc. time stamps. When the sensor data from the mobile device and the sensor data from the exercising device correlate with each other, the apparatus is capable of determining that the exercising device 2 is engaged with the mobile device 1. When a user changes an exercising device, for example in a gym, pairing between the new exercising device and the mobile device can be automatically initiated based on the sensor data from the new exercising device correlating with the sensor data from the mobile device 1.
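The time-stamp matching and correlation check described above may be sketched, purely for illustration, as follows; the stream representation, the Pearson correlation measure and the 0.8 threshold are assumptions made for this sketch, not features of any claim:

```python
# Illustrative sketch: sensor values sharing identical time stamps are
# compared across the two streams, and engagement is determined when the
# streams correlate strongly enough.

def correlate_streams(stream_a, stream_b):
    """Pearson correlation over samples with identical time stamps.

    Each stream is a dict mapping time stamp (seconds) -> sensor value,
    e.g. one acceleration value measured every second.
    """
    common = sorted(set(stream_a) & set(stream_b))
    if len(common) < 2:
        return 0.0
    xs = [stream_a[t] for t in common]
    ys = [stream_b[t] for t in common]
    n = len(common)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    if var_x == 0 or var_y == 0:
        return 0.0
    return cov / (var_x * var_y) ** 0.5

def devices_engaged(stream_a, stream_b, threshold=0.8):
    """Determine that the devices are engaged when the streams correlate."""
    return correlate_streams(stream_a, stream_b) >= threshold
```

When a user changes to a new exercising device, the same check run against the new device's stream would begin to succeed, allowing pairing with the new device to be initiated automatically.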
In other words, when a user 6 wearing the wrist-watch 1 starts to use the exercising device 2, sensor data streams are transmitted from the mobile device 1 and the exercising device 2 to the apparatus 18 in the form of a server. The server determines that the exercising device 2 is engaged with the mobile device 1 and estimates an activity type of the user, for example running. Subsequent to providing indication to the exercising device 2 and the mobile device 1 that the devices 1, 2 are engaged with each other, the wrist-watch 1 and the exercising device start to communicate with each other. A first signal as indicated by arrow 3 from the exercising device 2 is transmitted to the wrist-watch 1. Then the received signal 3 is processed by the processing core of the wrist-watch 1. Subsequently, a second signal as indicated by arrow 4 is transmitted from the wrist-watch 1 to the exercising device 2. This process is called pairing 5. Data can now be transferred between the wrist-watch 1 and the exercising device 2. Data is typically transferred using low power wireless communication technology such as Bluetooth, Bluetooth Low Energy, or Wibree.
The treadmill belt of the exercising device 2 may be moving with a specific speed as indicated by arrow 8 as a user 6 is running on the belt. At the same time, the arms of the runner 6 move cyclically as indicated by arrow 7. Data may be determined by the sensors of the wrist-watch 1. Examples of such determined data are a heart-beat rate, number of steps during a certain period of time, or acceleration data. Data may also be determined by sensors of the exercising device 2 and transmitted to the mobile device 1. An example of such data is the speed of the moving treadmill belt of the exercising device 2. The information about the speed of the treadmill belt may be transmitted from the exercising device 2 to the wrist-watch 1. The information about the speed of the treadmill belt may then be displayed on the wrist-watch. In other words, the wrist-watch 1 is configured to serve as a display of the exercising device 2. Of course, also data determined by at least one sensor of the wrist-watch 1 may be displayed on the display of the wrist-watch. The user 6 may further choose which data is displayed.
According to certain embodiments, the exercising device 2 may also comprise an additional display and data may be transmitted from the wrist-watch 1 to the exercising device 2. A user 6 may choose which information is shown on the display of the exercising device 2 and which information is at the same time displayed on the display of the wrist-watch 1. In other words, the user 6 may choose which sensor data is displayed on the display of the wrist-watch 1 and which sensor data is displayed on the display of the exercising device 2.
According to certain other embodiments, the mobile device 1 is configured to control parameters or functions of the exercising device 2 after the pairing process 5. In the shown example, a user 6 may control the speed of the treadmill belt of the exercising device 2 as indicated by arrow 8 via a user interface of the wrist-watch 1. A user interface of the wrist watch 1 may be a touchscreen or at least one button, for instance. User instructions to change the speed of the treadmill belt may be transmitted from the wrist-watch 1 to the exercising device 2 and processed by the exercising device 2, thus causing the exercising device 2 to change the speed of the treadmill belt. According to this embodiment, the procedure is typically controlled by the exercising device 2 such that no program code or a minimum amount thereof needs to be installed on the mobile device 1. The mobile device 1 serves as a user interface for the exercising device 2. In other words, a computer program comprising program instructions which, when loaded into the exercising device 2, cause e.g. graphical user interface data to be determined for the mobile device 1 is provided. The graphical user interface data is wirelessly transmitted to the mobile device 1 from the exercising device 2 to provide at least one user interface functionality on the mobile device 1. Then data corresponding to user input is received and wirelessly transmitted to the exercising device 2. Only minimal system requirements, such as processing capacity and memory capacity, are imposed on the mobile device 1. According to this embodiment, the input data is completely or at least partially processed by the exercising device 2.
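The server/client split described above may be sketched, purely for illustration, as follows; the message fields, the belt-speed step of 0.5 km/h and the class names are assumptions made for this sketch, since the embodiment prescribes no particular wire format:

```python
# Illustrative sketch: the exercising device composes the graphical user
# interface data and processes all input; the mobile device merely renders
# whatever it is sent and forwards user input back.

class ExercisingDeviceServer:
    """Acts as a server having control over the user interface."""

    def __init__(self):
        self.belt_speed = 8.0  # km/h, illustrative starting value

    def build_ui(self):
        """Compose GUI data to be wirelessly transmitted to the client."""
        return {"label": "Belt speed", "value": self.belt_speed,
                "buttons": ["faster", "slower"]}

    def handle_input(self, button):
        """Process user input received back from the mobile device."""
        if button == "faster":
            self.belt_speed += 0.5
        elif button == "slower":
            self.belt_speed -= 0.5
        return self.build_ui()

class MobileDeviceClient:
    """Thin client: displays what the server sends, forwards user input."""

    def __init__(self, server):
        self.server = server
        self.screen = server.build_ui()

    def press(self, button):
        self.screen = self.server.handle_input(button)
```

Because all state and logic stay on the exercising-device side, the client needs no installed program code beyond rendering and input forwarding, matching the minimal-requirements embodiment.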
The bidirectional communication link between the mobile device 1 and the exercising device 2 may be used to enable the exercising device 2 to act as a server having control over the user interface and the mobile device 1 to act as a client whose content is fully or at least partially controlled by the exercising device 2.
In
For example, a video simulation of a cycling track may be displayed on the display of the mobile device 1. Thus, the user 6 can cycle along the simulated track. Sensors of the exercising device 2 may determine the cycling speed of the user 6, for example. The sensor data of the exercising device 2 is then transmitted to the mobile device 1. The sensor data can be used as input data for the video simulation displayed on the mobile device 1. In other words, the user 6 can cycle along the virtual track with varying speeds. The visualization of the virtual cycling simulation is calculated based on the speed data obtained from the sensor data of the exercising device 2. In the other direction, data may be transmitted from the mobile device 1 to the exercising device 2, thus causing the exercising device to change a parameter. Altitude data along the virtual track stored in the app may be provided, for instance. The altitude data can be used as input data for the parameters of the exercising device 2 as a function of time. When the data is received by the exercising device 2, it causes the exercising device 2 to change the resistance of the exercise bike during cycling along the virtual track. In other words, cycling upwards or downwards along the virtual track can be simulated. The exercising device 2 is configured to transmit sensor data to the mobile device 1 and to receive in response input parameters from the mobile device 1. Consequently, cycling along a virtual track, for example a passage of the Tour de France, can be simulated.
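The altitude-to-resistance loop described above may be sketched, purely for illustration, as follows; the gradient-to-resistance mapping, its coefficients and the sample spacing are assumptions made for this sketch:

```python
# Illustrative sketch: altitude data stored with the app is converted into
# a resistance parameter for the exercise bike as a function of the user's
# position along the virtual track, so that cycling uphill or downhill can
# be simulated.

def track_gradient(altitudes, position):
    """Gradient between consecutive altitude samples (metres per sample)."""
    if position <= 0 or position >= len(altitudes):
        return 0.0
    return altitudes[position] - altitudes[position - 1]

def resistance_for(altitudes, position, base=1.0, gain=0.5):
    """Resistance setting: heavier uphill, lighter downhill, never negative."""
    return max(0.0, base + gain * track_gradient(altitudes, position))
```

The mobile device would transmit the resistance value for the current position, and the exercising device would apply it, closing the loop with the speed data flowing the other way.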
The exercising device 2 may be, for example, located in a gym and different users may subsequently cycle along the virtual track. As each user brings his/her own mobile device 1 to the gym, for each user a period of time may be determined by the app for cycling from the beginning of the virtual track to the end of the virtual track. The period of time for each user may then be transmitted from the respective mobile device 1 to the exercising device 2 and stored in a memory of the exercising device 2. The different periods of time may be ranked and listed so that a user can see his/her results in comparison to the results of other users. Thus, it is possible to simulate a cycling competition, for instance.
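The ranking described above may be sketched, purely for illustration, as follows; the (user, seconds) record format is an assumption made for this sketch:

```python
# Illustrative sketch: each user's period of time for the virtual track is
# stored in the memory of the exercising device and listed fastest first,
# so a user can compare his/her result with those of other users.

def ranked_results(results):
    """Sort (user, seconds) records fastest first and number the ranks."""
    ordered = sorted(results, key=lambda record: record[1])
    return [(rank, user, seconds)
            for rank, (user, seconds) in enumerate(ordered, start=1)]
```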
Of course, the mobile device 1 may also be used for displaying only information such as cycling speed, length of cycling session period or for selecting a cycling resistance of the exercising device 2.
Data determined by sensors of the exercising device 2 may be received by and stored in the mobile device 1. Alternatively, data determined by sensors of the exercising device 2 may be received by the mobile device and stored in the cloud. Thus, the user 6 can analyse the stored data at a later stage by reading out the memory of the mobile device 1 or viewing a webpage on the internet.
In
Subsequent to the pairing process 5, program code to be stored and processed by the mobile device 1 can be transmitted from the exercising device 2. Parameters and/or logics such as a rule engine, an app, a classification recipe or an HTML web page can be transmitted to the mobile device, for instance.
For example, an app may be transmitted to the mobile device 1. A user can select a training program with the help of the app. During the training session, the exercising device 2 may, for example, transmit a recipe or an instruction to the mobile device 1 on how to analyse movements of a user 6. The movements of the user may be determined or recorded using sensors of the exercising device 2. Examples of such sensors of the exercising device are force sensors and acceleration sensors. Data determined by the sensors of the exercising device 2 may be shown on a display 15 of the mobile device 1.
A user can further input data using a user interface 17 of the mobile device 1. A user interface 17 may be, for example, a touchscreen, a button, a keyboard or an optical system analysing gestures of the user. The exercising device 2 is capable of receiving the data which has been input via the user interface 17 of the mobile device 1. The exercising device 2 is capable of receiving instructions from the mobile device 1. For example, another training program may be selected.
According to certain embodiments, a first exercising device 2 and a first mobile device 1 in accordance with at least some embodiments form a first unit and a second exercising device 2 and a second mobile device 1 in accordance with at least some embodiments form a second unit. The first unit and the second unit are capable of communicating with each other. For example, rowing of a rowing boat having two seats can be simulated. Subsequent to starting of a specific training program, two users simultaneously using respective exercising devices have to synchronize their movements in order to row a virtual rowing boat. A first user is then virtually in the position of the person sitting in front of the other person. Thus, a sports team can train rowing of a rowing boat, for example in winter time when training with a real rowing boat is not possible due to weather conditions.
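The two-unit rowing simulation described above may be sketched, purely for illustration, as follows; the stroke-event representation and the 0.3 second tolerance are assumptions made for this sketch:

```python
# Illustrative sketch: two units (each an exercising device paired with a
# mobile device) exchange stroke events, and the virtual rowing boat only
# advances properly when the two users' strokes are synchronized.

def strokes_synchronized(strokes_a, strokes_b, tolerance=0.3):
    """True when each stroke time of user A matches a stroke time of user B
    within the tolerance (in seconds), assuming equal stroke counts."""
    if len(strokes_a) != len(strokes_b):
        return False
    return all(abs(a - b) <= tolerance
               for a, b in zip(sorted(strokes_a), sorted(strokes_b)))
```

A training program could score how large a fraction of strokes stay synchronized over a session, giving the team feedback comparable to rowing a real two-seat boat.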
In
Alternatively, a further, second mobile device (not shown), for example a smartphone including an audio system, may be provided. In such a case, a music program can be started or stopped, a volume of music can be controlled and/or a title can be selected using the wrist-watch 1 after the pairing process 5 between the mobile device 1 and the exercising device 2. In other words, the mobile device 1 may be used to additionally control functions of a further second mobile device. According to this embodiment, the procedure is typically fully or at least partially controlled by the second mobile device such that no program code or a minimum amount thereof needs to be installed on the mobile device 1 such as a wrist-watch. The mobile device 1 serves as a user interface for the second mobile device. In other words, a computer program comprising program instructions which, when loaded into the second mobile device, cause e.g. graphical user interface data to be determined for the mobile device 1 is provided. The graphical user interface data is wirelessly transmitted to the mobile device 1 from the second mobile device to provide at least one user interface functionality on the mobile device 1. Then data corresponding to user input is received and wirelessly transmitted to the second mobile device. Only minimal system requirements, such as processing capacity and memory capacity, are imposed on the mobile device 1. According to this embodiment, the input data is completely or at least partially processed by the second mobile device. The bidirectional communication link between the mobile device 1 and the second mobile device may be used to enable the second mobile device to act as a server having control over the user interface and the mobile device 1 to act as a client whose content is fully or at least partially controlled by the second mobile device.
In
A wrist-watch is shown as the mobile device 1. Typically, the exercising device 2 is configured to transmit the first signal and to receive the second signal when a distance between the mobile device 1 and the exercising device 2 is about 0 m to 10 m. In other words, the pairing process is activated when a user 6 with the wrist-watch 1 is moving closer to the exercising device 2. After the pairing process between the mobile device 1 and the exercising device 2, data can be transmitted between the mobile device 1 and the exercising device 2. For example, the user 6 may select a TV channel to be shown on the video system 10 by the wrist-watch 1. The wrist-watch 1 can therefore serve as a remote control when cycling. Alternatively, data obtained by sensors of the mobile device 1, for example heart beat data, and data obtained by sensors of the exercising device 2, for example speed data, may be displayed on the video system 10. The exercising device 2 is configured to participate in the pairing process during a session with the mobile device 1. The session is based on sensors of the mobile device 1 and the exercising device 2.
In
The server 18 is configured to estimate an activity type of different users based on the correlated data streams of a unit. In other words, the server 18 is capable of estimating if a person is cycling, running or rowing based on the correlated data. Data determined by sensors of each mobile device 1 and each respective exercising device 2 may be stored on the server. The server 18 is remotely accessible in order to analyse a user session at a later stage for each unit.
A user may use different exercising devices 2 in a gym, for example a treadmill first and then a cross trainer. Due to the correlated data streams and the time stamps associated with the sensor data contained in the data streams, thus e.g. indicating a start time and an end time of an activity type, pairing between a first exercising device and the mobile device can be automatically initiated first and pairing between a different second exercising device and the mobile device can be automatically initiated at a later stage.
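The sequential pairing described above may be sketched, purely for illustration, as follows; the segment records with start and end times are an assumption made for this sketch:

```python
# Illustrative sketch: the time stamps bounding each correlated segment
# indicate which exercising device the mobile device is engaged with at any
# given moment, so pairing can move from a treadmill to a cross trainer
# automatically as the user changes devices in a gym.

def device_at(segments, t):
    """Return the exercising device whose correlated segment covers time t.

    segments: list of (device_id, start_time, end_time), non-overlapping.
    Returns None between activities, when no device correlates.
    """
    for device_id, start, end in segments:
        if start <= t <= end:
            return device_id
    return None
```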
It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting.
Reference throughout this specification to one embodiment or an embodiment means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Where reference is made to a numerical value using a term such as, for example, about or substantially, the exact numerical value is also disclosed.
As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no individual member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, various embodiments and examples of the present invention may be referred to herein along with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations of the present invention.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
While the foregoing examples are illustrative of the principles of the present invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details of implementation can be made without the exercise of inventive faculty, and without departing from the principles and concepts of the invention. Accordingly, it is not intended that the invention be limited, except as by the claims set forth below.
The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of unrecited features. The features recited in dependent claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of “a” or “an”, that is, a singular form, throughout this document does not exclude a plurality.
At least some embodiments of the present invention find industrial application in the displaying of sensor data determined by at least one sensor of an exercising device and at least one sensor of a mobile device. Certain embodiments of the present invention are applicable in health care, industry, working environments, sports, etc.
Number | Date | Country | Kind |
---|---|---|---|
20155989 | Dec 2015 | FI | national |
1522525 | Dec 2015 | GB | national |
20165707 | Sep 2016 | FI | national |
20165709 | Sep 2016 | FI | national |
20165710 | Sep 2016 | FI | national |
This application is a Continuation-in-Part of U.S. patent application Ser. No. 15/382,763, filed on Dec. 19, 2016, which claims priority to Finnish Patent Application No. 20155989, filed on Dec. 21, 2015, the subject matter of each of which is incorporated herein by reference in its entirety. Additionally, the subject matter of U.S. patent application Ser. No. 14/945,914 is herein incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5457284 | Ferguson | Oct 1995 | A |
5503145 | Clough | Apr 1996 | A |
5924980 | Coetzee | Jul 1999 | A |
6882955 | Ohlenbusch et al. | Apr 2005 | B1 |
7627423 | Brooks | Dec 2009 | B2 |
7706973 | McBride et al. | Apr 2010 | B2 |
7721118 | Tamasi et al. | May 2010 | B1 |
7917198 | Ahola et al. | Mar 2011 | B2 |
7938752 | Wang | May 2011 | B1 |
8052580 | Saalasti et al. | Nov 2011 | B2 |
8323188 | Tran | Dec 2012 | B2 |
8328718 | Tran | Dec 2012 | B2 |
8538693 | McBride et al. | Sep 2013 | B2 |
8612142 | Zhang | Dec 2013 | B2 |
8655591 | Van Hende | Feb 2014 | B2 |
8781730 | Downey et al. | Jul 2014 | B2 |
8949022 | Fahrner | Feb 2015 | B1 |
9008967 | McBride et al. | Apr 2015 | B2 |
9107586 | Tran | Aug 2015 | B2 |
9222787 | Blumenberg et al. | Dec 2015 | B2 |
9317660 | Burich et al. | Apr 2016 | B2 |
9648108 | Granqvist et al. | May 2017 | B2 |
9665873 | Ackland et al. | May 2017 | B2 |
9829331 | McBride et al. | Nov 2017 | B2 |
9830516 | Biswas et al. | Nov 2017 | B1 |
9907473 | Tran | Mar 2018 | B2 |
9923973 | Granqvist et al. | Mar 2018 | B2 |
10234290 | Lush et al. | Mar 2019 | B2 |
10244948 | Pham et al. | Apr 2019 | B2 |
10327673 | Eriksson et al. | Jun 2019 | B2 |
10415990 | Cho et al. | Sep 2019 | B2 |
10433768 | Eriksson et al. | Oct 2019 | B2 |
10515990 | Hung et al. | Dec 2019 | B2 |
10634511 | McBride et al. | Apr 2020 | B2 |
10816671 | Graham | Oct 2020 | B2 |
20030038831 | Engelfriet | Feb 2003 | A1 |
20030109287 | Villaret | Jun 2003 | A1 |
20050070809 | Acres | Mar 2005 | A1 |
20050086405 | Kobayashi et al. | Apr 2005 | A1 |
20060068812 | Carro et al. | Mar 2006 | A1 |
20060136173 | Case, Jr. et al. | Jun 2006 | A1 |
20060255963 | Thompson et al. | Nov 2006 | A1 |
20070156335 | McBride et al. | Jul 2007 | A1 |
20070208544 | Kulach et al. | Sep 2007 | A1 |
20070276200 | Ahola et al. | Nov 2007 | A1 |
20080052493 | Chang | Feb 2008 | A1 |
20080109158 | Huhtala et al. | May 2008 | A1 |
20080136620 | Lee et al. | Jun 2008 | A1 |
20080158117 | Wong et al. | Jul 2008 | A1 |
20080214360 | Stirling et al. | Sep 2008 | A1 |
20080294663 | Heinley et al. | Nov 2008 | A1 |
20080318598 | Fry | Dec 2008 | A1 |
20090047645 | Dibenedetto et al. | Feb 2009 | A1 |
20090048070 | Vincent et al. | Feb 2009 | A1 |
20090094557 | Howard | Apr 2009 | A1 |
20090100332 | Kanjilal et al. | Apr 2009 | A1 |
20090265623 | Kho et al. | Oct 2009 | A1 |
20100099539 | Haataja et al. | Apr 2010 | A1 |
20100167712 | Stallings et al. | Jul 2010 | A1 |
20100187074 | Manni | Jul 2010 | A1 |
20100257014 | Roberts et al. | Oct 2010 | A1 |
20100313042 | Shuster | Dec 2010 | A1 |
20110010704 | Jeon et al. | Jan 2011 | A1 |
20110152695 | Granqvist et al. | Jun 2011 | A1 |
20110218385 | Bolyard et al. | Sep 2011 | A1 |
20110251822 | Darley et al. | Oct 2011 | A1 |
20110252351 | Sikora et al. | Oct 2011 | A1 |
20110281687 | Gilley | Nov 2011 | A1 |
20110283224 | Ramsey et al. | Nov 2011 | A1 |
20110288381 | Bartholomew | Nov 2011 | A1 |
20110296312 | Boyer et al. | Dec 2011 | A1 |
20110307723 | Cupps et al. | Dec 2011 | A1 |
20120022336 | Teixeira | Jan 2012 | A1 |
20120100895 | Priyantha et al. | Apr 2012 | A1 |
20120109518 | Huang | May 2012 | A1 |
20120116548 | Goree et al. | May 2012 | A1 |
20120123806 | Schumann et al. | May 2012 | A1 |
20120158289 | Bernheim Brush et al. | Jun 2012 | A1 |
20120185268 | Wiesner et al. | Jul 2012 | A1 |
20120219186 | Wang et al. | Aug 2012 | A1 |
20120239173 | Laikari et al. | Sep 2012 | A1 |
20120283855 | Hoffman et al. | Nov 2012 | A1 |
20120289791 | Jain | Nov 2012 | A1 |
20120317520 | Lee | Dec 2012 | A1 |
20130053990 | Ackland et al. | Feb 2013 | A1 |
20130060167 | Dracup et al. | Mar 2013 | A1 |
20130095459 | Tran | Apr 2013 | A1 |
20130127636 | Aryanpur et al. | May 2013 | A1 |
20130151874 | Parks et al. | Jun 2013 | A1 |
20130178334 | Brammer | Jul 2013 | A1 |
20130187789 | Lowe et al. | Jul 2013 | A1 |
20130190903 | Balakrishnan et al. | Jul 2013 | A1 |
20130217979 | Blackadar et al. | Aug 2013 | A1 |
20130225370 | Flynt et al. | Aug 2013 | A1 |
20130234924 | Janefalkar et al. | Sep 2013 | A1 |
20130250845 | Greene et al. | Sep 2013 | A1 |
20130289932 | Baechler et al. | Oct 2013 | A1 |
20130304377 | Van Hende | Nov 2013 | A1 |
20130312043 | Stone et al. | Nov 2013 | A1 |
20130332286 | Medelius et al. | Dec 2013 | A1 |
20130345978 | Lush et al. | Dec 2013 | A1 |
20140018686 | Medelius et al. | Jan 2014 | A1 |
20140046223 | Kahn et al. | Feb 2014 | A1 |
20140094200 | Schatzberg et al. | Apr 2014 | A1 |
20140135593 | Jayalth | May 2014 | A1 |
20140142732 | Karvonen | May 2014 | A1 |
20140149754 | Silva et al. | May 2014 | A1 |
20140159915 | Hong et al. | Jun 2014 | A1 |
20140163927 | Molettiere et al. | Jun 2014 | A1 |
20140208333 | Beals et al. | Jul 2014 | A1 |
20140218281 | Amayeh et al. | Aug 2014 | A1 |
20140235166 | Molettiere et al. | Aug 2014 | A1 |
20140237028 | Messenger et al. | Aug 2014 | A1 |
20140257533 | Morris et al. | Sep 2014 | A1 |
20140275821 | Beckman | Sep 2014 | A1 |
20140288680 | Hoffman et al. | Sep 2014 | A1 |
20140336796 | Agnew | Nov 2014 | A1 |
20140337036 | Haiut et al. | Nov 2014 | A1 |
20140337450 | Fitbit | Nov 2014 | A1 |
20140343380 | Carter et al. | Nov 2014 | A1 |
20140350883 | Carter et al. | Nov 2014 | A1 |
20140365107 | Dutta et al. | Dec 2014 | A1 |
20140372064 | Darley et al. | Dec 2014 | A1 |
20150006617 | Yoo et al. | Jan 2015 | A1 |
20150037771 | Kaleal, III et al. | Feb 2015 | A1 |
20150042468 | White et al. | Feb 2015 | A1 |
20150057945 | White et al. | Feb 2015 | A1 |
20150113417 | Yuen et al. | Apr 2015 | A1 |
20150119198 | Wisbey et al. | Apr 2015 | A1 |
20150127966 | Ma et al. | May 2015 | A1 |
20150141873 | Fei | May 2015 | A1 |
20150160026 | Kitchel | Jun 2015 | A1 |
20150180842 | Panther | Jun 2015 | A1 |
20150185815 | Debates et al. | Jul 2015 | A1 |
20150209615 | Edwards | Jul 2015 | A1 |
20150233595 | Fadell et al. | Aug 2015 | A1 |
20150272483 | Etemad et al. | Oct 2015 | A1 |
20150312857 | Kim et al. | Oct 2015 | A1 |
20150317801 | Bentley | Nov 2015 | A1 |
20150326709 | Pennanen et al. | Nov 2015 | A1 |
20150334772 | Wong et al. | Nov 2015 | A1 |
20150335978 | Syed et al. | Nov 2015 | A1 |
20150342533 | Kelner | Dec 2015 | A1 |
20150347983 | Jon et al. | Dec 2015 | A1 |
20150350822 | Xiao et al. | Dec 2015 | A1 |
20150362519 | Balakrishnan et al. | Dec 2015 | A1 |
20150374279 | Takakura et al. | Dec 2015 | A1 |
20150382150 | Ansermet | Dec 2015 | A1 |
20160007288 | Samardzija et al. | Jan 2016 | A1 |
20160007934 | Arnold et al. | Jan 2016 | A1 |
20160012294 | Bouck | Jan 2016 | A1 |
20160023043 | Grundy | Jan 2016 | A1 |
20160026236 | Vasistha et al. | Jan 2016 | A1 |
20160034043 | Le Grand et al. | Feb 2016 | A1 |
20160034133 | Wilson et al. | Feb 2016 | A1 |
20160041593 | Dharawat | Feb 2016 | A1 |
20160058367 | Raghuram et al. | Mar 2016 | A1 |
20160058372 | Raghuram et al. | Mar 2016 | A1 |
20160059079 | Watterson | Mar 2016 | A1 |
20160072557 | Ahola | Mar 2016 | A1 |
20160081028 | Chang et al. | Mar 2016 | A1 |
20160081625 | Kim et al. | Mar 2016 | A1 |
20160084869 | Yuen et al. | Mar 2016 | A1 |
20160091980 | Baranski et al. | Mar 2016 | A1 |
20160104377 | French et al. | Apr 2016 | A1 |
20160135698 | Baxi et al. | May 2016 | A1 |
20160143579 | Martikka | May 2016 | A1 |
20160144236 | Ko | May 2016 | A1 |
20160148396 | Bayne et al. | May 2016 | A1 |
20160148615 | Lee et al. | May 2016 | A1 |
20160184686 | Sampathkumaran | Jun 2016 | A1 |
20160209907 | Han et al. | Jul 2016 | A1 |
20160226945 | Granqvist et al. | Aug 2016 | A1 |
20160259495 | Butcher et al. | Sep 2016 | A1 |
20160317097 | Adams et al. | Nov 2016 | A1 |
20160327915 | Katzer et al. | Nov 2016 | A1 |
20160328991 | Simpson et al. | Nov 2016 | A1 |
20160346611 | Rowley et al. | Dec 2016 | A1 |
20160367202 | Carter | Dec 2016 | A1 |
20160374566 | Fung et al. | Dec 2016 | A1 |
20160379547 | Okada | Dec 2016 | A1 |
20170010677 | Roh et al. | Jan 2017 | A1 |
20170011089 | Bermudez et al. | Jan 2017 | A1 |
20170011210 | Cheong et al. | Jan 2017 | A1 |
20170032256 | Otto et al. | Feb 2017 | A1 |
20170038740 | Knappe et al. | Feb 2017 | A1 |
20170063475 | Feng | Mar 2017 | A1 |
20170065230 | Sinha et al. | Mar 2017 | A1 |
20170087431 | Syed et al. | Mar 2017 | A1 |
20170124517 | Martin | May 2017 | A1 |
20170153119 | Nieminen et al. | Jun 2017 | A1 |
20170153693 | Duale et al. | Jun 2017 | A1 |
20170154270 | Lindman et al. | Jun 2017 | A1 |
20170168555 | Munoz et al. | Jun 2017 | A1 |
20170173391 | Wiebe et al. | Jun 2017 | A1 |
20170232294 | Kruger et al. | Aug 2017 | A1 |
20170262699 | White et al. | Sep 2017 | A1 |
20170266494 | Crankson et al. | Sep 2017 | A1 |
20170316182 | Blackadar et al. | Nov 2017 | A1 |
20170340221 | Cronin et al. | Nov 2017 | A1 |
20180015329 | Burich et al. | Jan 2018 | A1 |
20180108323 | Lindman et al. | Apr 2018 | A1 |
20180193695 | Lee | Jul 2018 | A1 |
20180345077 | Blahnik et al. | Dec 2018 | A1 |
20190025928 | Pantelopoulos et al. | Jan 2019 | A1 |
20190056777 | Munoz et al. | Feb 2019 | A1 |
20190069244 | Jeon et al. | Feb 2019 | A1 |
20190367143 | Sinclair et al. | Dec 2019 | A1 |
Number | Date | Country |
---|---|---|
2007216704 | Apr 2008 | AU |
1877340 | Dec 2006 | CN |
102495756 | Jun 2012 | CN |
103309428 | Sep 2013 | CN |
103631359 | Mar 2014 | CN |
204121706 | Jan 2015 | CN |
104680046 | Jun 2015 | CN |
105242779 | Jan 2016 | CN |
106062661 | Oct 2016 | CN |
106604369 | Apr 2017 | CN |
108052272 | May 2018 | CN |
103154954 | Jun 2018 | CN |
108377264 | Aug 2018 | CN |
108983873 | Dec 2018 | CN |
1755098 | Feb 2007 | EP |
2107837 | Oct 2009 | EP |
2172249 | Apr 2010 | EP |
2770454 | Aug 2014 | EP |
2703945 | Mar 2015 | EP |
2849473 | Mar 2015 | EP |
2910901 | Aug 2015 | EP |
2996409 | Mar 2016 | EP |
3018582 | May 2016 | EP |
3023859 | May 2016 | EP |
3361370 | Aug 2018 | EP |
2096820 | Sep 2009 | EP |
126911 | Feb 2017 | FI |
2404593 | Feb 2005 | GB |
2425180 | Oct 2006 | GB |
2513585 | Nov 2014 | GB |
2530196 | Mar 2016 | GB |
2537423 | Oct 2016 | GB |
2541234 | Feb 2017 | GB |
2555107 | Apr 2018 | GB |
20110070049 | Jun 2011 | KR |
101500662 | Mar 2015 | KR |
528295 | Oct 2006 | SE |
201706840 | Feb 2017 | TW |
I598076 | Sep 2018 | TW |
WO02054157 | Jul 2002 | WO |
WO2010083562 | Jul 2010 | WO |
WO2010144720 | Dec 2010 | WO |
WO2011061412 | May 2011 | WO |
WO2011123932 | Oct 2011 | WO |
WO2012037637 | Mar 2012 | WO |
WO2012115943 | Aug 2012 | WO |
WO2012141827 | Oct 2012 | WO |
WO2013091135 | Jun 2013 | WO |
WO2013121325 | Aug 2013 | WO |
WO2014118767 | Aug 2014 | WO |
WO2014144258 | Sep 2014 | WO |
WO2014193672 | Dec 2014 | WO |
WO2014209697 | Dec 2014 | WO |
WO2015021407 | Feb 2015 | WO |
WO2014182162 | Jun 2015 | WO |
WO2015087164 | Jun 2015 | WO |
WO2015131065 | Sep 2015 | WO |
WO2016022203 | Feb 2016 | WO |
WO2017011818 | Jan 2017 | WO |
WO2018217348 | Nov 2018 | WO |
WO2018222936 | Dec 2018 | WO |
Entry |
---|
ARM big.LITTLE. Wikipedia, The Free Encyclopedia, Oct. 11, 2018. Retrieved on May 28, 2020 from: <https://en.wikipedia.org/w/index.php?title=ARM_big.LITTLE&oldid=863559211>, foreword on p. 1, section “Run-state migration” on pp. 1-2. |
Qualcomm Snapdragon Wear 3100 Platform Supports New Ultra-Low Power System Architecture For Next Generation Smartwatches. Qualcomm Technologies, Inc., Sep. 10, 2018. Retrieved on May 28, 2020 from: <https://www.qualcomm.com/news/releases/2018/09/10/qualcomm-snapdragon-wear-3100-platform-supports>, section “Snapdragon Wear 3100 Based Smartwatches Aim to Enrich the User Experience” on pp. 3-4. |
CNET: Dec. 11, 2017, “Apple watch can now sync with a treadmill”, youtube.com, [online], Available from: https://www.youtube.com/watch?v=7RvMC3wFDME [ Accessed Nov. 19, 2020]. |
Sieber et al: Embedded systems in the Poseidon MK6 rebreather. Intelligent Solutions in Embedded Systems, 2009, pp. 37-42. |
CASH: A guide to GPS and route plotting for cyclists. 2018. www.cyclinguk.org/article/guide-gps-and-route-plotting-cyclists. |
Sheta et al: Packet scheduling in LTE mobile network. International Journal of Scientific & Engineering Research, Jun. 2013, vol. 4, Issue 6. |
Number | Date | Country | |
---|---|---|---|
20190282103 A1 | Sep 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15382763 | Dec 2016 | US |
Child | 16427401 | US | |
Parent | 15386050 | Dec 2016 | US |
Child | 15382763 | US | |
Parent | 15386062 | Dec 2016 | US |
Child | 15386050 | US | |
Parent | 15386074 | Dec 2016 | US |
Child | 15386062 | US |