The present disclosure relates to systems and methods for capturing activity data over a period of time and methods and systems for configuring alarm settings in activity tracking devices.
In recent years, interest in health and fitness has grown tremendously. This growth stems from a better understanding of the benefits of good fitness to overall health and wellness. Unfortunately, although modern culture has brought about many new technologies, such as the Internet, connected devices, and computers, people have become less active. Additionally, many office jobs require people to sit in front of computer screens for long periods of time, which further reduces a person's activity level. Furthermore, much of today's entertainment involves viewing multimedia content, computer social networking, and other types of computer-based interaction. Although such computer activity can be very productive as well as entertaining, it tends to reduce a person's overall physical activity.
To provide users concerned with health and fitness a way of measuring or accounting for their activity or lack thereof, fitness trackers are often used. Fitness trackers are used to measure activity, such as walking, motion, running, sleeping, being inactive, bicycling, exercising on an elliptical trainer, and the like. Usually, the data collected by such fitness trackers can be transferred and viewed on a computing device. However, such data is often provided as a basic accumulation of activity data with complicated or confusing interfaces.
It is in this context that embodiments described herein arise.
Embodiments described in the present disclosure provide systems, apparatus, computer readable media, and methods for configuring alarm settings for activity tracking devices using remote computing devices and transferring the configured alarm settings to the activity tracking devices. Some embodiments are directed toward the use of contact gestures either to transition an alarm of the activity tracking device into a snooze mode or to turn the alarm off.
In one embodiment, a method, which is executed by a processor, is provided. The method includes receiving an alarm setting that defines a time of day for triggering an alarm on a device for tracking activity data of a user, and activating the alarm upon reaching the time of day defined by the alarm setting. The alarm produces a vibration of the device. The method further includes determining a level of activity of the user detected by the device for tracking activity data of the user when the alarm is activated, and automatically deactivating the alarm if the level of activity qualifies for deactivating the alarm. The deactivating of the alarm causes the vibration of the device to be suspended.
In one embodiment, the level of activity that qualifies for deactivating the alarm is a predetermined number of steps. In one embodiment, the predetermined number of steps is detected after the alarm is activated. In one embodiment, the predetermined number of steps is detected during a period of time that includes a time period before the alarm is activated.
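The activity-qualified deactivation described above can be sketched as follows. This is a minimal illustration, not a disclosed implementation; the function name and the 50-step threshold are assumptions.

```python
# Sketch: automatically deactivate a vibrating alarm once the wearer's
# detected step count reaches a qualifying threshold.
# The 50-step default is an assumed, illustrative value.

QUALIFYING_STEPS = 50  # predetermined number of steps (assumed)

def should_deactivate(steps_detected: int,
                      threshold: int = QUALIFYING_STEPS) -> bool:
    """Return True when the detected level of activity qualifies
    for deactivating the alarm (suspending the vibration)."""
    return steps_detected >= threshold
```

Depending on the embodiment, `steps_detected` would count only steps taken after the alarm is activated, or steps over a period that also includes time before activation.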
In another embodiment, a device configured for capture of activity data for a user is provided. The device includes a housing, a sensor, a motor, a memory, and a processor. The sensor is disposed in the housing to track activity data of the user. The motor causes vibration of the housing of the device. The memory stores an alarm setting that defines a time of day for triggering an alarm on the device. The processor activates the alarm upon reaching the time of day defined by the alarm setting, with the alarm causing the motor to produce the vibration of the housing. The sensor, which is interfaced with the processor, is configured to detect a current level of activity of the user. The processor is configured to automatically deactivate the alarm if the current level of activity qualifies for deactivating the alarm. The deactivating of the alarm causes the vibration of the device to be suspended.
In yet another embodiment, one or more non-transitory computer readable media are provided. The one or more computer readable media include instructions which, when executed by a processor, perform the following operations: receive an alarm setting that defines a time of day for triggering an alarm on a device for tracking activity data of a user; activate the alarm upon reaching the time of day defined by the alarm setting, the alarm producing a vibration of the device; determine a level of activity of the user detected by the device for tracking activity data of the user when the alarm is activated; and automatically deactivate the alarm if the level of activity qualifies for deactivating the alarm, the deactivating of the alarm causing the vibration of the device to be suspended. In one embodiment, the vibration being suspended transitions the alarm into an off mode.
In one embodiment, a method, which is executed by a processor, is provided. The method includes receiving an alarm setting that defines a time of day for triggering an alarm on a device for tracking activity data of a user, and activating the alarm upon reaching the time of day defined by the alarm setting. The alarm produces a vibration of the activity tracking device. The method further includes using a sensor to detect a physical contact upon the device, and deactivating the alarm if the physical contact qualifies as an input to deactivate the alarm. The deactivating of the alarm causes the vibration of the activity tracking device to be suspended.
In one embodiment, the suspension of the vibration of the activity tracking device transitions the alarm into either a snooze mode or an off mode. In one embodiment, the snooze mode continues for a predetermined period of time before reactivating the alarm. In one embodiment, the method further includes transitioning into the snooze mode one or more times until entering the off mode or until the vibration has been produced for a threshold period of time.
In one embodiment, the physical contact is a result of one or more taps on a surface of the activity tracking device. In one embodiment, a snooze mode, which causes the vibration to be suspended, is entered when the physical contact is represented by a single tap onto a surface of the activity tracking device, or a double tap onto the surface of the device, or three taps onto the surface of the device, or four taps onto the surface of the device, or a predetermined set of repeated taps onto the surface of the device. In one embodiment, two or more of the taps are received within a predetermined period of time to qualify as an input.
In one embodiment, the method further includes transitioning from the snooze mode to an off mode when an additional physical contact is sensed by the sensor. The additional physical contact can be represented by a single tap onto a surface of the activity tracking device, or a double tap onto the surface of the device, or three taps onto the surface of the device, or four taps onto the surface of the device, or a predetermined set of repeated taps onto the surface of the device. In one embodiment, two or more of the taps are received within a predetermined period of time to qualify as an input.
In one embodiment, the method further includes transitioning from the snooze mode to an off mode when the processor of the activity tracking device determines that a button of the device is pressed.
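The gesture-to-transition mappings described above can be sketched as a simple decision function. The particular mapping below (double tap enters snooze; three or four taps, or a button press, turns the alarm off) is only one of the configurations the disclosure contemplates, and the function and state names are illustrative assumptions.

```python
# Sketch: map a qualifying physical contact (tap count or button press)
# to an alarm transition. One example configuration; other embodiments
# use a single tap, four taps, or other predetermined tap sets.

RINGING, SNOOZE, OFF = "ringing", "snooze", "off"

def next_alarm_state(tap_count: int = 0,
                     button_pressed: bool = False,
                     snooze_taps: int = 2) -> str:
    """Decide the alarm's next state from a detected contact gesture."""
    if button_pressed:
        return OFF                 # a button press turns the alarm off
    if tap_count == snooze_taps:
        return SNOOZE              # e.g., a double tap enters snooze mode
    if tap_count in (3, 4):
        return OFF                 # three or four taps turn the alarm off
    return RINGING                 # unqualified contact: keep vibrating
```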
In one embodiment, the alarm setting is received wirelessly from a computing device. In one embodiment, the computing device has access to the Internet. In one embodiment, the alarm setting is programmable at a website managed by a server, and the website is managed by the server to allow access to user accounts, with each user account having associated therewith one or more of the activity tracking devices, such that the alarm setting is custom set in a user account.
In one embodiment, the alarm setting is transferred from the server to the computing device over the Internet and from the computing device to the device via a wireless Bluetooth connection. In one embodiment, the activity data of the user includes metrics associated with one or more of step count metrics, or stair count metrics, or distance traveled metrics, or active time metrics, or calories burned metrics, or sleep metrics.
In another embodiment, a device configured for capture of activity data for a user is provided. The device includes a housing, a sensor, a motor, a memory, and a processor. The sensor is disposed in the housing to capture physical contact upon the housing. The motor causes vibration of the housing of the device. The memory stores an alarm setting that defines a time of day for triggering an alarm on the device. The processor activates the alarm upon reaching the time of day defined by the alarm setting, with the alarm causing the motor to produce the vibration of the housing. The sensor, which is interfaced with the processor, is configured to detect a physical contact upon the housing of the device. The processor is configured to deactivate the alarm if the physical contact qualifies as an input to deactivate the alarm. The deactivating of the alarm causes the vibration of the device to be suspended.
In one embodiment, the housing is part of a wearable wrist attachable structure, or an attachable structure that can be carried or worn by the user. In one embodiment, the wearable wrist attachable structure is defined at least partially from a plastic material. In one embodiment, the physical contact captured by the sensor is from one or more taps upon the housing by a finger or hand. In one embodiment, the housing includes a button, and the physical contact upon the housing of the device is not from a button press.
In one embodiment, the housing further includes wireless communication logic. In one embodiment, the wireless communication logic includes one of WiFi processing logic, or Bluetooth (BT) processing logic, or radio processing logic. In one embodiment, the wireless communication logic is configured to pair with a portable computing device or a computer, and the portable computing device or the computer is configured for communication over the Internet with a server, the server having processing instructions for configuring the alarm settings.
In one embodiment, the processor examines predefined motion profiles captured by the sensor to qualify the physical contact as the input, such that motion profiles outside of the predefined motion profiles do not qualify as the input. In one embodiment, suspending the vibration of the device transitions the alarm into one of a snooze mode or an off mode. In one embodiment, the processor configures the snooze mode to continue for a predetermined period of time before reactivating the alarm, and the processor transitions into the snooze mode one or more times until entering the off mode or until the vibration has been produced for a threshold period of time.
In one embodiment, the physical contact is the result of one or more taps on a surface of the device. In one embodiment, two or more of the taps are received within a predetermined period of time to qualify as the input. In one embodiment, the processor causes a snooze mode, which causes the vibration to be suspended, to be entered when the physical contact is represented by a single tap onto a surface of the device, or a double tap onto the surface of the device, or three taps onto the surface of the device, or four taps onto the surface of the device, or a predetermined set of repeated taps onto the surface of the device.
In yet another embodiment, one or more non-transitory computer readable media are provided. The one or more computer readable media include instructions which, when executed by a processor, perform the following operations: receiving an alarm setting that defines a time of day for triggering an alarm on a device for tracking activity data of a user; activating the alarm upon reaching the time of day defined by the alarm setting, the alarm producing a vibration of the device; using a sensor to detect a physical contact upon the device; and deactivating the alarm if the physical contact qualifies as an input to deactivate the alarm, the deactivating causing the vibration of the device to be suspended.
Other aspects will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of embodiments described in the present disclosure.
Various embodiments described in the present disclosure may best be understood by reference to the following description taken in conjunction with the accompanying drawings in which:
Embodiments described in the present disclosure provide systems, apparatus, computer readable media, and methods for configuring alarm settings for activity tracking devices using remote computing devices and transferring the configured alarm settings to the activity tracking devices. Some embodiments are directed toward the use of contact gestures either to transition an alarm of the activity tracking device into a snooze mode or to turn the alarm off.
It should be noted that there are many inventions described and illustrated herein. The present inventions are neither limited to any single aspect nor embodiment thereof, nor to any combinations and/or permutations of such aspects and/or embodiments. Moreover, each of the aspects of the present inventions, and/or embodiments thereof, may be employed alone or in combination with one or more of the other aspects of the present inventions and/or embodiments thereof. For the sake of brevity, many of those permutations and combinations will not be discussed separately herein.
Further, in the course of describing and illustrating the present inventions, various circuitry, architectures, structures, components, functions and/or elements, as well as combinations and/or permutations thereof, are set forth. It should be understood that circuitry, architectures, structures, components, functions and/or elements other than those specifically described and illustrated, are contemplated and are within the scope of the present inventions, as well as combinations and/or permutations thereof.
The environmental sensors 118 may be in the form of motion detecting sensors. In some embodiments, a motion sensor can be one or more of an accelerometer, or a gyroscope, or a rotary encoder, or a calorie measurement sensor, or a heat measurement sensor, or a moisture measurement sensor, or a displacement sensor, or an ultrasonic sensor, or a pedometer, or an altimeter, or a linear motion sensor, or an angular motion sensor, or a multi-axis motion sensor, or a combination thereof. The biometric sensors 116 can be defined to measure physiological characteristics of the user that is using the activity tracking device 100. The user interface 114 provides a way for communicating with the activity tracking device 100, in response to user interaction 104. The user interaction 104 can be in the form of physical contact (e.g., without limitation, tapping, sliding, rubbing, multiple taps, gestures, etc.).
In some embodiments, the user interface 114 is configured to receive user interaction 104 that is in the form of noncontact input. The noncontact input can be by way of proximity sensors, button presses, touch sensitive screen inputs, graphical user interface inputs, voice inputs, sound inputs, etc. The activity tracking device 100 can communicate with a client and/or server 112 using the wireless transceiver 110. The wireless transceiver 110 will allow the activity tracking device 100 to communicate using a wireless connection, which is enabled by wireless communication logic. The wireless communication logic can be in the form of a circuit having radio communication capabilities. The radio communication capabilities can be in the form of a Wi-Fi connection, a Bluetooth connection, a low-energy Bluetooth connection, or any other form of wireless tethering or near field communication. In still other embodiments, the activity tracking device 100 can communicate with other computing devices using a wired connection (not shown). As mentioned, the environmental sensors 118 can detect motion of the activity tracking device 100.
The motion can be activity of the user, such as walking, running, stair climbing, etc. The motion can also be in the form of physical contact received on any surface of the activity tracking device 100, so long as the environmental sensors 118 can detect such motion from the physical contact. As will be explained in more detail below, the physical contact may be in the form of a tap or multiple taps by a finger upon the housing of the activity tracking device 100.
In other embodiments, the device components 102 are positioned substantially in a central position of the wrist attachable device, such as under or proximate to a location where a display screen 122 is located. In the illustrated example, the housing 130 also includes a button 126. The button 126 can be pressed to activate the display screen 122, navigate to various metrics displayed on the screen 122, or turn off the screen 122.
As shown in
Some motions will produce and quantify various types of metrics, such as step count, stairs climbed, distance traveled, very active minutes, calories burned, etc. The physical contact logic 142 can include logic that calculates or determines when particular physical contact can qualify as an input. To qualify as an input, the physical contact detected by sensors 156 should have a particular pattern that is identifiable as input. For example, the input may be predefined to be a double tap input, and the physical contact logic 142 can analyze the motion to determine if a double tap indeed occurred in response to analyzing the sensor data produced by sensors 156.
In other embodiments, the physical contact logic can be programmed to determine when particular physical contacts occurred, the time in between the physical contacts, and whether the one or more physical contacts will qualify within predefined motion profiles that would indicate that an input is desired. If physical contact occurs that is not within some predefined profile or pattern, the physical contact logic will not indicate or qualify that physical contact as an input.
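The kind of qualification the physical contact logic 142 performs can be sketched as a timing check on detected tap events: two taps qualify as a double-tap input only if they fall within a predetermined window. The 0.5-second window and all names below are illustrative assumptions, not values from the disclosure.

```python
# Sketch: qualify raw tap timestamps as a double-tap input only when the
# two taps arrive within a predetermined period of time, so that other
# contact (outside the predefined profile) does not qualify as an input.

DOUBLE_TAP_WINDOW_S = 0.5  # assumed maximum spacing between the two taps

def is_double_tap(tap_times: list[float],
                  window: float = DOUBLE_TAP_WINDOW_S) -> bool:
    """Return True if the last two tap timestamps (in seconds) fall
    within `window` seconds of each other."""
    if len(tap_times) < 2:
        return False
    return (tap_times[-1] - tap_times[-2]) <= window
```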
The display interface logic 144 is configured to interface with the processor and the physical contact logic to determine when specific metric data will be displayed on the display screen 122 of the activity tracking device 100. The display interface logic 144 can act to turn on the screen, display metric information, display characters or alphanumeric information, display graphical user interface graphics, or combinations thereof. Alarm management logic 146 can function to provide a user interface and settings for managing and receiving input from a user to set an alarm. The alarm management logic can interface with a timekeeping module (e.g., clock, calendar, time zone, etc.), and can trigger the activation of an alarm. The alarm can be in the form of an audible alarm or a non-audible alarm.
A non-audible alarm can provide such alarm by way of a vibration. The vibration can be produced by a motor integrated in the activity tracking device 100. The vibration can be defined to include various vibration patterns, intensities, and custom set patterns. The vibration produced by the motor or motors of the activity tracking device 100 can be managed by the alarm management logic 146 in conjunction with processing by the processor 106. The wireless communication logic 148 is configured for communication of the activity tracking device with another computing device by way of a wireless signal. The wireless signal can be in the form of a radio signal. As noted above, the radio signal can be in the form of a Wi-Fi signal, a Bluetooth signal, a low energy Bluetooth signal, or combinations thereof. The wireless communication logic can interface with the processor 106, storage 108 and battery 154 of device 100, for transferring activity data, which may be in the form of motion data or processed motion data, stored in the storage 108 to the computing device.
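One way to represent the vibration patterns, intensities, and custom set patterns the alarm management logic 146 might drive is as a sequence of motor on/off segments. The mode names and durations below are illustrative assumptions.

```python
# Sketch: expand a vibration mode into (motor_on_s, motor_off_s) segments
# that a motor driver could play back. Durations are assumed values.

def vibration_segments(mode: str, duration: float = 6.0):
    """Return a list of (on_seconds, off_seconds) motor segments."""
    if mode == "continuous":
        return [(duration, 0.0)]
    if mode == "intermittent":
        cycles = int(duration // 2.0)
        return [(1.0, 1.0)] * cycles          # 1 s on, 1 s off
    if mode == "pattern":
        return [(0.2, 0.1), (0.2, 0.5)] * 2   # an example custom pattern
    raise ValueError(f"unknown vibration mode: {mode}")
```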
In one embodiment, processor 106 functions in conjunction with the various logic components 140, 142, 144, 146, and 148. The processor 106 can, in one embodiment, provide the functionality of any one or all of the logic components. In other embodiments, multiple chips can be used to separate the processing performed by any one of the logic components and the processor 106. Sensors 156 can communicate via a bus with the processor 106 and/or the logic components. The storage 108 is also in communication with the bus for providing storage of the motion data processed or tracked by the activity tracking device 100. A battery 154 provides power to the activity tracking device 100.
In one embodiment, remote device 200 communicates with activity tracking device 100 over a Bluetooth connection. In one embodiment, the Bluetooth connection is a low energy Bluetooth connection (e.g., Bluetooth LE, BLE, or Bluetooth Smart). Low energy Bluetooth is configured for providing low power consumption relative to standard Bluetooth circuitry. Low energy Bluetooth uses, in one embodiment, a 2.4 GHz radio frequency, which allows for dual mode devices to share a single radio antenna. In one embodiment, low energy Bluetooth connections can function at distances up to 50 meters, with over-the-air data rates ranging from 1 to 3 megabits per second (Mbit/s). In one embodiment, a proximity distance for communication can be defined by the particular wireless link, and is not tied to any specific standard. It should be understood that the proximity distance limitation will change in accordance with changes to existing standards and in view of future standards and/or circuitry and capabilities.
Remote device 200 can also communicate with the Internet 160 using an Internet connection. The Internet connection of the remote device 200 can include cellular connections, wireless connections such as Wi-Fi, and combinations thereof (such as connections to switches between different types of connection links). The remote device, as mentioned above, can be a smartphone or tablet computer, or any other type of computing device having access to the Internet and with capabilities for communicating with the activity tracking device 100.
A server 220 is also provided, which is interfaced with the Internet 160. The server 220 can include a number of applications that service the activity tracking device 100, and the associated users of the activity tracking device 100 by way of user accounts. For example, the server 220 can include an activity management application 224. The activity management application 224 can include logic for providing access to various devices 100, which are associated with user accounts managed by server 220. Server 220 can include storage 226 that includes various user profiles associated with the various user accounts. The user account 228a for user A and the user account 228n for user N are shown to include various information.
The information in a user account can include, without limitation, data associated with alarm settings 230, user data, etc. As will be described in more detail below, the alarm settings 230 include information regarding a user's preferences, settings, and configurations which are settable by the user or set by default at the server 220 when accessing a respective user account. The storage 226 will include any number of user profiles, depending on the number of registered users having user accounts for their respective activity tracking devices. It should also be noted that a single user account can have various or multiple devices associated therewith, and the multiple devices can be individually customized, managed, and accessed by a user. In one embodiment, the server 220 provides access to a user to view the user data 232 associated with an activity tracking device.
To enable a user to configure the alarm settings for an activity tracking device 100 using remote device 200, activity tracking application 202 provides a number of interfaces that allow the user to configure the alarm settings. In one embodiment, the activity tracking application 202 displays a view 202a that shows the activity tracking devices associated with the user's account. As shown in view 202a, only “Device A” is associated with User A's account. It should be appreciated, however, that additional activity tracking devices, e.g., Device B, Device C, etc., also could be associated with a user's account. A suitable GUI control, e.g., a graphic icon that can be activated by a finger touch or other user input, is used to identify each device associated with the user's account, e.g., Device A, Device B, etc., so that the user can select the device for which the alarm settings are to be configured. In the example shown in view 202a, the user would touch the “Device A” GUI control to select that device for configuration.
Once the particular device to be configured has been selected, the activity tracking application 202 displays a view 202b that shows the settings available to be selected for configuration. As shown in view 202b, only “Alarm Settings” are available to be selected for configuration. It should be appreciated, however, that other settings also could be displayed. In the example shown in view 202b, the user would touch the “Alarm Settings” GUI control to select those settings for configuration. As shown in view 202c, the activity tracking application 202 then provides GUI controls that allow the user to proceed to set an alarm time (“Set Time”) and to select the days on which the alarm is to be active (“Set Days”). In the event the user touches the “Set Time” GUI control, the activity tracking application 202 displays a further view (not shown) that allows the user to set an alarm time, e.g., 6:30 am, 7:30 am, etc. In the event the user touches the “Set Days” GUI control, the activity tracking application 202 displays a further view (not shown) that allows the user to set the days on which the alarm is to be active, e.g., Monday, Tuesday, Monday through Friday (weekdays), etc. It should be appreciated that more than one alarm, e.g., Alarm #1, Alarm #2, Alarm #3, etc., can be configured in this manner.
Once the time and days for an alarm have been set, the activity tracking application 202 provides GUI controls that allow a user to select either an audible alarm or a non-audible alarm. The non-audible alarm can be produced by tactile feedback, e.g., vibration, generated by a motor for causing vibration of the housing of the activity tracking device. The vibration can be a default vibration set by the system or, optionally, a vibration that is configured by the user. In the example shown in view 202d, the activity tracking application 202 displays a GUI control that allows the user to configure the tactile feedback (e.g., vibration) used to produce a non-audible alarm. The activity tracking application 202 then displays GUI controls that allow the user to select the nature of the vibration. In the example shown in view 202e, the GUI controls include “Vibrate Continuously,” “Vibrate Intermittently,” and “Vibrate in Pattern.” In one embodiment, the vibration pattern is configurable, as will be described in more detail below.
As shown in view 202f, the activity tracking application 202 also displays GUI controls that allow a user to configure the contact gestures that can be applied to the activity tracking device to either turn the alarm off or transition the alarm into a snooze mode. In the example shown in view 202f, the alarm will be placed into a snooze mode when the activity tracking device detects that a surface of the activity tracking device has received a double tap (two (2) taps). Further, the alarm will be turned off when the activity tracking device detects one of the following: a button press; three (3) taps to a surface of the activity tracking device; or four (4) taps to a surface of the activity tracking device. It will be appreciated that the configuration shown in view 202f is an example and that this configuration can be varied to suit the needs of the user. Once the alarm settings have been configured, as shown in view 202g, the alarm settings are saved and the saved alarm settings 230 can be accessed by the server 220, as described above.
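One way the alarm settings 230 configured through views 202a-202g might be assembled before being saved and uploaded is sketched below. The field names and values are illustrative assumptions, not a disclosed schema; the gesture mapping mirrors the example configuration of view 202f.

```python
# Sketch: an illustrative serialization of configured alarm settings
# (time, active days, vibration mode, and contact-gesture mapping).
# All field names and values are assumptions for illustration only.

alarm_settings = {
    "device": "Device A",
    "alarms": [
        {
            "time": "06:30",
            "days": ["Mon", "Tue", "Wed", "Thu", "Fri"],  # weekdays
            "audible": False,                  # non-audible (vibration) alarm
            "vibration": "intermittent",       # or "continuous" / "pattern"
            "gestures": {
                "snooze": {"taps": 2},                    # double tap
                "off": {"taps": [3, 4], "button": True},  # 3/4 taps or button
            },
        }
    ],
}
```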
The alarm settings configured using activity tracking application 202 of remote device 200 are saved to a memory of the remote device. When an Internet connection is available to remote device 200, the alarm settings are uploaded to server 220 via the Internet 160. As described above, the server 220 stores user profiles in storage 226. The alarm settings for a particular user are stored in that user's account (see, e.g., alarm settings 230 in user account 228a for User A). These alarm settings can be transferred to an activity tracking device 100 in several ways, as described below.
Each time the alarm settings are configured using the activity tracking application 202 of a remote device 200, the alarm settings are stored to the server 220. It is possible for a user to configure the alarm settings using multiple remote devices 200. As such, it is also possible that the alarm settings stored on a given remote device 200 might not have the most recent configuration because, e.g., the user changed the configuration of the alarm settings using a different remote device. To make sure that each remote device 200 of a user has the current alarm settings, the activity tracking application 202 on each remote device periodically synchronizes the configuration of alarm settings stored in the memory of the remote device with the configuration of the alarm settings stored on the server 220.
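The disclosure states only that each remote device periodically synchronizes its stored configuration with the server's copy; a last-modified-wins policy is one plausible way to do that, sketched below under that assumption.

```python
# Sketch: last-modified-wins synchronization between a remote device's
# locally stored alarm settings and the server's copy, so every remote
# device converges on the most recently configured settings.
# The timestamp-based policy and field names are assumptions.

def synchronize(local: dict, remote: dict) -> dict:
    """Return the newer of two alarm-setting records, each carrying a
    'modified' timestamp (e.g., seconds since epoch)."""
    return local if local["modified"] >= remote["modified"] else remote
```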
The alarm settings stored in the memory of a remote device 200 can be transferred to an activity tracking device 100 over a Bluetooth connection, as described above. In one embodiment, the Bluetooth connection is a low energy Bluetooth connection (e.g., Bluetooth LE, BLE, or Bluetooth Smart). Whenever an activity tracking device 100, e.g., Device A shown in
Once the alarm has been triggered, a determination is made in operation 306 as to whether a button of the activity tracking device has been pressed. If it is determined that a button of the activity tracking device has been pressed, then the alarm is turned off and the method returns to operation 300 in which the alarm is maintained in the “off” state. If it is determined that a button of the activity tracking device has not been pressed, then the method proceeds to operation 308. In operation 308, a determination is made as to whether the activity tracking device has detected physical contact with a surface thereof. In one embodiment, it is determined whether the activity tracking device has detected that a surface thereof has received a contact gesture in the form of a double tap (two (2) taps). It should be appreciated, however, that other predefined numbers of contact gestures can be detected, e.g., three (3) taps or four (4) taps. If it is determined that the activity tracking device has detected a double tap on a surface thereof, then the method proceeds to operation 310. On the other hand, if it is determined that the activity tracking device has not detected any physical contact with a surface thereof, then the method proceeds to operation 312.
In operation 310, in response to the detected double tap on a surface of the activity tracking device, the alarm is transitioned into a snooze mode and vibration is suspended. The method then proceeds to operation 314 in which a determination is made as to whether the snooze period has elapsed. In one embodiment, the snooze period is preset by the system and lasts for a predefined period of time, e.g., 2 minutes, 3 minutes, 5 minutes, etc. In another embodiment, the snooze period is set by a user and lasts for the period of time selected by the user, e.g., 5 minutes, 10 minutes, 15 minutes, etc. If it is determined in operation 314 that the snooze period has not yet elapsed, then the method returns to operation 310 and the alarm remains in snooze mode. On the other hand, if it is determined that the snooze period has elapsed, then the method returns to operation 304 and the alarm is triggered again.
As noted above, when it is determined in operation 308 that the activity tracking device has not detected any physical contact with a surface thereof, the method proceeds to operation 312. In operation 312, a determination is made as to whether the threshold vibration time, which is the maximum allowable vibration time, has been reached. In one embodiment, the maximum allowable vibration time is set by the system and lasts for a predefined period of time, e.g., 5 minutes, 10 minutes, 15 minutes, etc. The alarm will continue to vibrate until it is determined that the maximum allowable vibration time has been reached. Once it is determined that the maximum allowable vibration time has been reached, the alarm is turned off and the method returns to operation 300 in which the alarm is maintained in the “off” state.
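By way of illustration only, the alarm flow described above (operations 300 through 314) can be sketched as a simple state machine. The class and method names, the one-tick-per-second simulation, and the default durations below are assumptions for illustration, not the disclosed implementation.

```python
class AlarmStateMachine:
    """Illustrative sketch of the alarm flow: vibrate on trigger, snooze on a
    double tap, turn off on a button press or when the maximum allowable
    vibration time is reached. Times are counted in one-second ticks."""

    def __init__(self, snooze_period=120, max_vibration_time=300):
        self.snooze_period = snooze_period            # e.g., 2 minutes
        self.max_vibration_time = max_vibration_time  # e.g., 5 minutes
        self.state = "off"
        self._timer = 0

    def trigger(self):
        # Alarm time reached: begin vibrating (operation 304).
        self.state = "vibrating"
        self._timer = 0

    def tick(self, button_pressed=False, double_tap=False):
        if self.state == "vibrating":
            if button_pressed:          # button press turns the alarm off (operation 306)
                self.state = "off"
            elif double_tap:            # double tap transitions to snooze (operations 308, 310)
                self.state = "snoozing"
                self._timer = 0
            else:                       # enforce the maximum allowable vibration time (operation 312)
                self._timer += 1
                if self._timer >= self.max_vibration_time:
                    self.state = "off"
        elif self.state == "snoozing":  # wait out the snooze period (operation 314)
            self._timer += 1
            if self._timer >= self.snooze_period:
                self.trigger()          # snooze elapsed: trigger the alarm again
        return self.state
```

In this sketch, a double tap during vibration suspends vibration until the snooze period elapses, after which the alarm is triggered again, matching the loop between operations 310, 314, and 304.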
In operation 408, in response to the detected double tap, the alarm of the activity tracking device is transitioned into a snooze mode. In this snooze mode, the vibration of the activity tracking device is paused or suspended. When a predefined snooze period elapses, the method returns to operation 404 in which the activity tracking device is vibrated.
As noted above, when no double tap is detected in operation 406, the method proceeds to operation 410. In operation 410, a determination is made as to whether the activity tracking device has detected that a button of the device has been pressed. If it is determined that a button press has been detected, then the method proceeds to operation 412 in which the alarm is turned off. On the other hand, if no button press is detected, then the activity tracking device continues to vibrate and the method returns to operation 404 for further processing.
In the method illustrated in
It should be further appreciated that, in some instances, it might not be convenient to press a button of the activity tracking device to turn an alarm off (or to place the alarm in a snooze mode). For example, a user could be engaged in an activity, e.g., running, climbing stairs, etc., and might not want to stop the activity to turn off the alarm. To address such instances, the alarm can be configured to turn off automatically based on activity detected by the activity tracking device. In one embodiment, when the alarm goes off, the activity tracking device monitors the user's current activity level. When the activity tracking device detects that the user has taken a predefined number of steps, e.g., 40 steps, 50 steps, 60 steps, etc., since the alarm went off, the alarm is turned off automatically without requiring any physical contact with the device on the part of the user. In another embodiment, when the alarm goes off, the activity tracking device not only monitors the user's current activity level but also takes into account the user's activity level during a predefined period of time before the alarm went off, e.g., 1 minute, 2 minutes, etc. For example, if a runner had taken 90 steps in the minute before the alarm went off and the alarm was configured to turn off after the user had taken 100 steps, then the alarm would be automatically turned off after the activity tracking device detected that the user had taken an additional 10 steps (for a total of 100 steps) since the alarm went off.
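The activity-based dismissal described above can be sketched as a simple predicate. The function name, parameter names, and default target below are illustrative assumptions.

```python
def should_auto_dismiss(steps_since_alarm, steps_before_alarm=0, target_steps=100):
    """Illustrative sketch: the alarm turns off automatically once the steps
    taken since the alarm went off (optionally credited with steps taken in a
    predefined window before the alarm) reach the configured target."""
    return steps_since_alarm + steps_before_alarm >= target_steps
```

For the runner in the example above, 90 steps taken in the minute before the alarm plus 10 steps after it reach the 100-step target, so the alarm is dismissed without any physical contact.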
In operation 508, in response to the detected double tap, the alarm of the activity tracking device is transitioned into a snooze mode. In this snooze mode, the vibration of the activity tracking device is paused or suspended. When a predefined snooze period elapses, the method returns to operation 504 in which the activity tracking device is vibrated.
As noted above, when no double tap is detected in operation 506, the method proceeds to operation 510. In operation 510, a determination is made as to whether the activity tracking device has detected four (4) taps (“a four tap”) on a surface of the activity tracking device. If it is determined that a four tap has been detected, then the alarm is turned off and the method returns to operation 500 in which the alarm is maintained in the “off” state. On the other hand, if no four tap is detected, then the activity tracking device continues to vibrate and the method returns to operation 504 for further processing.
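Distinguishing a double tap (snooze) from four taps (alarm off), as described above, requires grouping individual contacts into one gesture. The following sketch groups taps whose spacing falls within a time window and dispatches on the count; the 0.5-second gap and function names are illustrative assumptions, not the disclosed detection algorithm.

```python
def count_taps(timestamps, max_gap=0.5):
    """Illustrative sketch: consecutive taps spaced within `max_gap` seconds
    are grouped into a single contact gesture. Returns the tap count of the
    most recent gesture in the sequence."""
    if not timestamps:
        return 0
    count = 1
    for prev, cur in zip(timestamps, timestamps[1:]):
        # A large gap starts a new gesture; otherwise extend the current one.
        count = count + 1 if cur - prev <= max_gap else 1
    return count

def handle_gesture(tap_count):
    """Dispatch per the embodiment above: double tap snoozes, four taps
    turn the alarm off, other counts are ignored."""
    if tap_count == 2:
        return "snooze"
    if tap_count == 4:
        return "off"
    return "ignore"
```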
In one embodiment, the device collects one or more types of physiological and/or environmental data from embedded sensors and/or external devices and communicates or relays such metric information to other devices, including devices capable of serving as Internet-accessible data sources, thus permitting the collected data to be viewed, for example, using a web browser or network-based application. For example, while the user is wearing an activity tracking device, the device may calculate and store the user's step count using one or more sensors. The device then transmits data representative of the user's step count to an account on a web service, computer, mobile phone, or health station where the data may be stored, processed, and visualized by the user. Indeed, the device may measure or calculate a plurality of other physiological metrics in addition to, or in place of, the user's step count.
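The relay of a tracked metric to a web service, as described above, might be sketched as follows. The endpoint URL, payload shape, and field names are hypothetical; the disclosure does not specify a wire format.

```python
import json
from urllib import request

def sync_step_count(step_count, user_id,
                    endpoint="https://example.com/api/activity"):
    """Illustrative sketch: package a user's step count as JSON for upload
    to an Internet-accessible data source. Returns the prepared request;
    a caller would submit it with request.urlopen(req)."""
    payload = json.dumps({"user": user_id,
                          "metric": "steps",
                          "value": step_count}).encode("utf-8")
    req = request.Request(endpoint, data=payload,
                          headers={"Content-Type": "application/json"})
    return req
```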
Some physiological metrics include, but are not limited to, energy expenditure (for example, calorie burn), floors climbed and/or descended, heart rate, heart rate variability, heart rate recovery, location and/or heading (for example, through GPS), elevation, ambulatory speed and/or distance traveled, swimming lap count, bicycle distance and/or speed, blood pressure, blood glucose, skin conduction, skin and/or body temperature, electromyography, electroencephalography, weight, body fat, caloric intake, nutritional intake from food, medication intake, sleep periods (i.e., clock time), sleep phases, sleep quality and/or duration, pH levels, hydration levels, and respiration rate. The device may also measure or calculate metrics related to the environment around the user such as barometric pressure, weather conditions (for example, temperature, humidity, pollen count, air quality, rain/snow conditions, wind speed), light exposure (for example, ambient light, UV light exposure, time and/or duration spent in darkness), noise exposure, radiation exposure, and magnetic field.
Still further, other metrics can include, without limitation, calories burned by a user, weight gained by a user, weight lost by a user, stairs ascended, e.g., climbed, etc., by a user, stairs descended by a user, steps taken by a user during walking or running, a number of rotations of a bicycle pedal rotated by a user, sedentary activity data, driving a vehicle, a number of golf swings taken by a user, a number of forehands of a sport played by a user, a number of backhands of a sport played by a user, or a combination thereof. In some embodiments, sedentary activity data is referred to herein as inactive activity data or as passive activity data. In some embodiments, when a user is not sedentary and is not sleeping, the user is active. In some embodiments, a user may stand on a monitoring device that determines a physiological parameter of the user. For example, a user stands on a scale that measures a weight, a body fat percentage, a biomass index, or a combination thereof, of the user.
Furthermore, the device or the system collating the data streams may calculate metrics derived from this data. For example, the device or system may calculate the user's stress and/or relaxation levels through a combination of heart rate variability, skin conduction, noise pollution, and sleep quality. In another example, the device or system may determine the efficacy of a medical intervention (for example, medication) through the combination of medication intake, sleep and/or activity data. In yet another example, the device or system may determine the efficacy of an allergy medication through the combination of pollen data, medication intake, sleep and/or activity data. These examples are provided for illustration only and are not intended to be limiting or exhaustive.
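A derived metric of the kind described above, such as a stress level combining several data streams, could be sketched as a weighted combination of normalized inputs. The weights, normalization ranges, and formula below are purely illustrative assumptions and not the disclosed method.

```python
def stress_score(hrv_ms, skin_conductance_us, noise_db, sleep_quality):
    """Purely illustrative sketch of deriving a composite stress metric from
    heart rate variability, skin conduction, noise exposure, and sleep
    quality. Each input is normalized to [0, 1], where higher means more
    stress; the result is a weighted sum in [0, 1]."""
    hrv = 1.0 - min(hrv_ms, 100.0) / 100.0        # low HRV -> higher stress
    eda = min(skin_conductance_us, 20.0) / 20.0   # high conductance -> higher stress
    noise = min(noise_db, 100.0) / 100.0          # louder environment -> higher stress
    rest = 1.0 - max(0.0, min(sleep_quality, 1.0))  # poor sleep -> higher stress
    return 0.4 * hrv + 0.3 * eda + 0.2 * noise + 0.1 * rest
```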
This information can be associated with the user's account, which can be managed by an activity management application on the server. The activity management application can provide access to the user's account and data saved thereon. The activity management application running on the server can be in the form of a web application. The web application can provide access to a number of website screens and pages that illustrate information regarding the metrics in various formats. This information can be viewed by the user, and synchronized with a computing device of the user, such as a smartphone.
In one embodiment, the data captured by the activity tracking device 100 is received by the computing device, and the data is synchronized with the activity management application on the server. In this example, data viewable on the computing device (e.g., smartphone) using an activity tracking application (app) can be synchronized with the data present on the server, and associated with the user's account. In this way, information entered into the activity tracking application on the computing device can be synchronized with the information illustrated in the various screens of the activity management application provided by the server on the website.
The user can therefore access the data associated with the user account using any device having access to the Internet. Data received by the network 176 can then be synchronized with the user's various devices, and analytics on the server can provide data analysis to provide recommendations for additional activity and/or improvements in physical health. The process therefore continues where data is captured, analyzed, synchronized, and recommendations are produced. In some embodiments, the captured data can be itemized and partitioned based on the type of activity being performed, and such information can be provided to the user on the website via graphical user interfaces, or by way of the application executed on the user's smartphone (by way of graphical user interfaces).
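Itemizing and partitioning captured data by activity type, as described above, can be sketched as a simple aggregation. The record shape (activity-type/value pairs) is an illustrative assumption.

```python
from collections import defaultdict

def partition_by_activity(records):
    """Illustrative sketch: group captured data by the type of activity
    being performed, totaling the recorded values per activity so each
    partition can be presented in its own graphical user interface."""
    totals = defaultdict(float)
    for activity, value in records:
        totals[activity] += value
    return dict(totals)
```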
In an embodiment, the sensor or sensors of a device 100 can determine or capture data to determine an amount of movement of the monitoring device over a period of time. The sensors can include, for example, an accelerometer, a magnetometer, a gyroscope, or combinations thereof. Broadly speaking, these sensors are inertial sensors, which capture some movement data in response to the device 100 being moved. The amount of movement (e.g., motion sensed) may occur when the user is performing an activity over the time period, such as climbing stairs, walking, running, etc. The monitoring device may be worn on a wrist, carried by a user, worn on clothing (using a clip, or placed in a pocket), attached to a leg or foot, attached to the user's chest or waist, or integrated in an article of clothing such as a shirt, hat, pants, blouse, glasses, and the like. These examples do not limit the ways in which the sensors of the device can be associated with a user or thing being monitored.
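One minimal way to infer movement from such inertial data is to count upward crossings of an acceleration-magnitude threshold. The threshold value and method below are illustrative assumptions, not the device's actual step-detection algorithm.

```python
import math

def count_steps(samples, threshold=1.2):
    """Illustrative sketch: estimate steps from accelerometer samples
    (ax, ay, az in g) by counting upward crossings of a magnitude
    threshold. Each crossing is treated as one step."""
    steps = 0
    above = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1        # rising edge: magnitude just crossed the threshold
            above = True
        elif mag <= threshold:
            above = False     # fell back below: ready to detect the next step
    return steps
```

A practical detector would also filter out gravity and vibration noise, but the sketch conveys how raw inertial samples become an activity metric such as a step count.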
In other embodiments, a biological sensor can determine any number of physiological characteristics of a user. For example, the biological sensor may determine heart rate, a hydration level, body fat, bone density, fingerprint data, sweat rate, and/or a bioimpedance of the user. Examples of the biological sensors include, without limitation, a biometric sensor, a physiological parameter sensor, a pedometer, or a combination thereof.
In some embodiments, data associated with the user's activity can be monitored by the applications on the server and the user's device, and activity associated with the user's friends, acquaintances, or social network peers can also be shared, based on the user's authorization. This enables friends to compete regarding their fitness, achieve goals, receive badges for achieving goals, get reminders for achieving such goals, and earn rewards or discounts for achieving certain goals, etc.
As noted, an activity tracking device 100 can communicate with a computing device (e.g., a smartphone, a tablet computer, a desktop computer, or computer device having wireless communication access and/or access to the Internet). The computing device, in turn, can communicate over a network, such as the Internet or an Intranet to provide data synchronization. The network may be a wide area network, a local area network, or a combination thereof. The network may be coupled to one or more servers, one or more virtual machines, or a combination thereof. A server, a virtual machine, a controller of a monitoring device, or a controller of a computing device is sometimes referred to herein as a computing resource. Examples of a controller include a processor and a memory device.
In one embodiment, the processor may be a general purpose processor. In another embodiment, the processor can be a customized processor configured to run specific algorithms or operations. Such processors can include digital signal processors (DSPs), which are designed to execute or interact with specific chips, signals, wires, and perform certain algorithms, processes, state diagrams, feedback, detection, execution, or the like. In some embodiments, a processor can include or be interfaced with an application specific integrated circuit (ASIC), a programmable logic device (PLD), a central processing unit (CPU), or a combination thereof, etc.
In some embodiments, one or more chips, modules, devices, or logic can be defined to execute instructions or logic, which collectively can be viewed or characterized to be a processor. Therefore, it should be understood that a processor does not necessarily have to be one single chip or module, but can be defined from a collection of electronic or connecting components, logic, firmware, code, and combinations thereof.
Examples of a memory device include a random access memory (RAM) and a read-only memory (ROM). A memory device may be a Flash memory, a redundant array of disks (RAID), a hard disk, or a combination thereof.
Embodiments described in the present disclosure may be practiced with various computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. Several embodiments described in the present disclosure can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.
With the above embodiments in mind, it should be understood that a number of embodiments described in the present disclosure can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of various embodiments described in the present disclosure are useful machine operations. Several embodiments described in the present disclosure also relate to a device or an apparatus for performing these operations. The apparatus can be specially constructed for a purpose, or the apparatus can be a computer selectively activated or configured by a computer program stored in the computer. In particular, various machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
Various embodiments described in the present disclosure can also be embodied as computer-readable code on a non-transitory computer-readable medium. The computer-readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of the computer-readable medium include hard drives, network attached storage (NAS), ROM, RAM, compact disc-ROMs (CD-ROMs), CD-recordables (CD-Rs), CD-rewritables (CD-RWs), magnetic tapes and other optical and non-optical data storage devices. The computer-readable medium can include computer-readable tangible medium distributed over a network-coupled computer system so that the computer-readable code is stored and executed in a distributed fashion.
Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed in between operations, or operations may be performed in an order other than that shown, or operations may be adjusted so that they occur at slightly different times, or may be distributed in a system which allows the occurrence of the processing operations at various intervals associated with the processing.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the various embodiments described in the present disclosure are not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
This application is a continuation of U.S. patent application Ser. No. 14/329,735, filed on Jul. 11, 2014, entitled “Alarm Setting and Interfacing with Gesture Contact Interfacing Controls,” which is a continuation of U.S. patent application Ser. No. 14/050,305, now U.S. Pat. No. 8,812,259, entitled “Alarm Setting and Interfacing with Gesture Contact Interfacing Controls,” filed on Oct. 9, 2013, which claims priority to U.S. Provisional Application No. 61/886,000, entitled “Alarm Setting and Interfacing with Gesture Contact Interfacing Controls,” filed on Oct. 2, 2013, the disclosures of all of which are incorporated by reference herein for all purposes. This application is related to U.S. application Ser. No. 14/050,292, now U.S. Pat. No. 8,744,803, filed on Oct. 9, 2013, entitled “Methods, Systems, and Devices for Activity Tracking Device Data Synchronization with Computing Devices,” which claims priority to U.S. Provisional Application No. 61/885,962, filed on Oct. 2, 2013, both of which are incorporated herein by reference for all purposes.
20110230729 | Shirasaki et al. | Sep 2011 | A1 |
20110258689 | Cohen et al. | Oct 2011 | A1 |
20110275940 | Nims et al. | Nov 2011 | A1 |
20120015778 | Lee et al. | Jan 2012 | A1 |
20120035487 | Werner et al. | Feb 2012 | A1 |
20120046113 | Ballas | Feb 2012 | A1 |
20120072165 | Jallon | Mar 2012 | A1 |
20120083705 | Yuen et al. | Apr 2012 | A1 |
20120083714 | Yuen et al. | Apr 2012 | A1 |
20120083715 | Yuen et al. | Apr 2012 | A1 |
20120083716 | Yuen et al. | Apr 2012 | A1 |
20120084053 | Yuen et al. | Apr 2012 | A1 |
20120084054 | Yuen et al. | Apr 2012 | A1 |
20120092157 | Tran | Apr 2012 | A1 |
20120094649 | Porrati et al. | Apr 2012 | A1 |
20120102008 | Kääriäinen et al. | Apr 2012 | A1 |
20120116684 | Ingrassia, Jr. et al. | May 2012 | A1 |
20120119911 | Jeon et al. | May 2012 | A1 |
20120150483 | Vock et al. | Jun 2012 | A1 |
20120165684 | Sholder | Jun 2012 | A1 |
20120166257 | Shiragami et al. | Jun 2012 | A1 |
20120179278 | Riley et al. | Jul 2012 | A1 |
20120183939 | Aragones et al. | Jul 2012 | A1 |
20120215328 | Schmelzer | Aug 2012 | A1 |
20120221634 | Treu et al. | Aug 2012 | A1 |
20120226471 | Yuen et al. | Sep 2012 | A1 |
20120226472 | Yuen et al. | Sep 2012 | A1 |
20120227737 | Mastrototaro et al. | Sep 2012 | A1 |
20120245716 | Srinivasan et al. | Sep 2012 | A1 |
20120254987 | Ge et al. | Oct 2012 | A1 |
20120265477 | Vock et al. | Oct 2012 | A1 |
20120265480 | Oshima | Oct 2012 | A1 |
20120274508 | Brown et al. | Nov 2012 | A1 |
20120283855 | Hoffman et al. | Nov 2012 | A1 |
20120290109 | Engelberg et al. | Nov 2012 | A1 |
20120296400 | Bierman et al. | Nov 2012 | A1 |
20120297229 | Desai et al. | Nov 2012 | A1 |
20120297440 | Reams et al. | Nov 2012 | A1 |
20120316456 | Rahman et al. | Dec 2012 | A1 |
20120324226 | Bichsel et al. | Dec 2012 | A1 |
20120330109 | Tran | Dec 2012 | A1 |
20130006718 | Nielsen et al. | Jan 2013 | A1 |
20130041590 | Burich et al. | Feb 2013 | A1 |
20130072169 | Ross et al. | Mar 2013 | A1 |
20130073254 | Yuen et al. | Mar 2013 | A1 |
20130073255 | Yuen et al. | Mar 2013 | A1 |
20130080113 | Yuen et al. | Mar 2013 | A1 |
20130094600 | Beziat et al. | Apr 2013 | A1 |
20130095459 | Tran | Apr 2013 | A1 |
20130096843 | Yuen et al. | Apr 2013 | A1 |
20130102251 | Linde et al. | Apr 2013 | A1 |
20130103847 | Brown et al. | Apr 2013 | A1 |
20130106684 | Weast et al. | May 2013 | A1 |
20130132501 | Vandwalle et al. | May 2013 | A1 |
20130151193 | Kulach et al. | Jun 2013 | A1 |
20130151196 | Yuen et al. | Jun 2013 | A1 |
20130158369 | Yuen et al. | Jun 2013 | A1 |
20130166048 | Werner et al. | Jun 2013 | A1 |
20130187789 | Lowe | Jul 2013 | A1 |
20130190008 | Vathsangam et al. | Jul 2013 | A1 |
20130190903 | Balakrishnan et al. | Jul 2013 | A1 |
20130191034 | Weast et al. | Jul 2013 | A1 |
20130203475 | Kil et al. | Aug 2013 | A1 |
20130209972 | Carter et al. | Aug 2013 | A1 |
20130225117 | Giacoletto et al. | Aug 2013 | A1 |
20130228063 | Turner | Sep 2013 | A1 |
20130231574 | Tran | Sep 2013 | A1 |
20130238287 | Hoffman et al. | Sep 2013 | A1 |
20130261475 | Mochizuki | Oct 2013 | A1 |
20130267249 | Rosenberg | Oct 2013 | A1 |
20130268199 | Nielsen et al. | Oct 2013 | A1 |
20130268236 | Yuen et al. | Oct 2013 | A1 |
20130268687 | Schrecker | Oct 2013 | A1 |
20130268767 | Schrecker | Oct 2013 | A1 |
20130274904 | Coza et al. | Oct 2013 | A1 |
20130281110 | Zelinka | Oct 2013 | A1 |
20130289366 | Chua et al. | Oct 2013 | A1 |
20130296666 | Kumar et al. | Nov 2013 | A1 |
20130296672 | O'Neil et al. | Nov 2013 | A1 |
20130296673 | Thaveeprungsriporn et al. | Nov 2013 | A1 |
20130297220 | Yuen et al. | Nov 2013 | A1 |
20130310896 | Mass | Nov 2013 | A1 |
20130325396 | Yuen et al. | Dec 2013 | A1 |
20130331058 | Harvey | Dec 2013 | A1 |
20130337974 | Yanev et al. | Dec 2013 | A1 |
20130345978 | Lush et al. | Dec 2013 | A1 |
20140035761 | Burton et al. | Feb 2014 | A1 |
20140035764 | Burton et al. | Feb 2014 | A1 |
20140039804 | Park et al. | Feb 2014 | A1 |
20140039840 | Yuen et al. | Feb 2014 | A1 |
20140039841 | Yuen et al. | Feb 2014 | A1 |
20140052280 | Yuen et al. | Feb 2014 | A1 |
20140067278 | Yuen et al. | Mar 2014 | A1 |
20140077673 | Garg et al. | Mar 2014 | A1 |
20140085077 | Luna et al. | Mar 2014 | A1 |
20140094941 | Ellis et al. | Apr 2014 | A1 |
20140099614 | Hu et al. | Apr 2014 | A1 |
20140121471 | Walker | May 2014 | A1 |
20140125618 | Panther et al. | May 2014 | A1 |
20140156228 | Yuen et al. | Jun 2014 | A1 |
20140164611 | Molettiere et al. | Jun 2014 | A1 |
20140180022 | Stivoric et al. | Jun 2014 | A1 |
20140191866 | Yuen et al. | Jul 2014 | A1 |
20140200691 | Lee et al. | Jul 2014 | A1 |
20140207264 | Quy | Jul 2014 | A1 |
20140213858 | Presura et al. | Jul 2014 | A1 |
20140275885 | Isaacson et al. | Sep 2014 | A1 |
20140278229 | Hong et al. | Sep 2014 | A1 |
20140316305 | Venkatraman et al. | Oct 2014 | A1 |
20140337451 | Choudhary et al. | Nov 2014 | A1 |
20140337621 | Nakhimov | Nov 2014 | A1 |
20140343867 | Yuen et al. | Nov 2014 | A1 |
20150026647 | Park et al. | Jan 2015 | A1 |
20150057967 | Albinali | Feb 2015 | A1 |
20150088457 | Yuen et al. | Mar 2015 | A1 |
20150120186 | Heikes | Apr 2015 | A1 |
20150127268 | Park et al. | May 2015 | A1 |
20150137994 | Rahman et al. | May 2015 | A1 |
20150220883 | B'Far et al. | Aug 2015 | A1 |
20150289802 | Thomas et al. | Oct 2015 | A1 |
20150324541 | Cheung et al. | Nov 2015 | A1 |
20150374267 | Laughlin | Dec 2015 | A1 |
20160058372 | Raghuram et al. | Mar 2016 | A1 |
20160061626 | Burton et al. | Mar 2016 | A1 |
20160063888 | McCallum et al. | Mar 2016 | A1 |
20160089572 | Liu et al. | Mar 2016 | A1 |
20160107646 | Kolisetty et al. | Apr 2016 | A1 |
20160259426 | Yuen et al. | Sep 2016 | A1 |
20160278669 | Messenger et al. | Sep 2016 | A1 |
20160285985 | Molettiere et al. | Sep 2016 | A1 |
20160323401 | Messenger et al. | Nov 2016 | A1 |
Number | Date | Country |
---|---|---|
101978374 | Feb 2011 | CN |
102111434 | Jun 2011 | CN |
102377815 | Mar 2012 | CN |
102740933 | Oct 2012 | CN |
102983890 | Mar 2013 | CN |
103226647 | Jul 2013 | CN |
1 721 237 | Nov 2006 | EP |
11347021 | Dec 1999 | JP |
2178588 | Jan 2002 | RU |
WO 0211019 | Feb 2002 | WO |
WO 2006055125 | May 2006 | WO |
WO 2006090197 | Aug 2006 | WO |
WO 2008038141 | Apr 2008 | WO |
WO 2009042965 | Apr 2009 | WO |
WO 2012061438 | May 2012 | WO |
WO 12170586 | Dec 2012 | WO |
WO 12170924 | Dec 2012 | WO |
WO 12171032 | Dec 2012 | WO |
WO 15127067 | Aug 2015 | WO |
WO 16003269 | Jan 2016 | WO |
Entry |
---|
Chandrasekar et al., “Plug-and-Play, Single-Chip Photoplethysmography”, 34th Annual International Conference of the IEEE EMBS, San Diego, California USA, Aug. 28-Sep. 1, 2012, 4 pages. |
Clifford et al., “Altimeter and Barometer System”, Freescale Semiconductor Application Note AN1979, Rev. 3, Nov. 2006, 10 pages. |
Fang et al, “Design of a Wireless Assisted Pedestrian Dead Reckoning System—The NavMote Experience”, IEEE Transactions on Instrumentation and Measurement, vol. 54, No. 6, Dec. 2005, pp. 2342-2358. |
Fitbit Inc., “Fitbit Automatically Tracks Your Fitness and Sleep” published online at web.archive.org/web/20080910224820/http://www.fitbit.com, copyright Sep. 10, 2008, 1 page. |
Godfrey et al., “Direct Measurement of Human Movement by Accelerometry”, Medical Engineering & Physics, vol. 30, 2008, pp. 1364-1386 (22 pages). |
Godha et al., “Foot Mounted Inertial System for Pedestrian Navigation”, Measurement Science and Technology, vol. 19, No. 7, May 2008, pp. 1-9 (10 pages). |
Intersema, “Using MS5534 for altimeters and barometers”, Application Note AN501, Jan. 2006, 12 pages. |
Ladetto et al, “On Foot Navigation: When GPS alone is not Enough”, Journal of Navigation, vol. 53, No. 2, Sep. 2000, pp. 279-285 (6 pages). |
Lammel et al., “Indoor Navigation with MEMS Sensors”, Proceedings of the Eurosensors XIII conference, vol. 1, No. 1, Sep. 2009, pp. 532-535 (4 pages). |
Lester et al, “Validated caloric expenditure estimation using a single body-worn sensor”, Proc. of the Int'l Conf. on Ubiquitous Computing, 2009, pp. 225-234 (10 pages). |
Lester et al., “A Hybrid Discriminative/Generative Approach for Modeling Human Activities”, Proc. of the Int'l Joint Conf. Artificial Intelligence, 2005, pp. 766-772 (7 pages). |
Ohtaki et al, “Automatic classification of ambulatory movements and evaluation of energy consumptions utilizing accelerometers and barometer”, Microsystem Technologies, vol. 11, No. 8-10, Aug. 2005, pp. 1034-1040 (7 pages). |
Parkka et al., “Activity Classification Using Realistic Data From Wearable Sensors”, IEEE Transactions on Information Technology in Biomedicine, vol. 10, No. 1, Jan. 2006, pp. 119-128 (10 pages). |
PCT/IB07/03617 International Search Report issued on Aug. 15, 2008, 3 pages. |
Perrin et al, “Improvement of Walking Speed Prediction by Accelerometry and Altimetry, Validated by Satellite Positioning”, Medical & Biological Engineering & Computing, vol. 38, 2000, pp. 164-168 (5 pages). |
Retscher, “An Intelligent Multi-Sensor system for Pedestrian Navigation”, Journal of Global Positioning Systems, vol. 5, No. 1, 2006, pp. 110-118 (9 pages). |
Sagawa et al, “Classification of Human Moving Patterns Using Air Pressure and Acceleration”, Proceedings of the 24th Annual Conference of the IEEE Industrial Electronics Society, vol. 2, Aug.-Sep. 1998, pp. 1214-1219 (6 pages). |
Sagawa et al, “Non-restricted measurement of walking distance”, IEEE Int'l Conf. on Systems, Man, and Cybernetics, vol. 3, Oct. 2000, pp. 1847-1852 (6 pages). |
Specification of the Bluetooth® System, Core Package, version 4.1, Dec. 2013, vols. 0 & 1, 282 pages. |
Stirling et al., “Evaluation of a New Method of Heading Estimation of Pedestrian Dead Reckoning Using Shoe Mounted Sensors”, Journal of Navigation, vol. 58, 2005, pp. 31-45 (15 pages). |
Suunto LUMI, “User Guide”, Copyright Jun. and Sep. 2007, 49 pages. |
Tanigawa et al, “Drift-Free Dynamic Height Sensor Using MEMS IMU Aided by MEMS Pressure Sensor”, Workshop on Positioning, Navigation and Communication, Mar. 2008, pp. 191-196 (6 pages). |
VTI Technologies, “SCP 1000-D01/D11 Pressure Sensor as Barometer and Altimeter”, Application Note 33, Jun. 2006, 3 pages. |
“Activator is One of the Best Cydia iPhone Hacks | Control your iPhone with Gestures,” IPHONE-TIPS-AND-ADVICE.COM, [retrieved on Jul. 9, 2013 at http://www.iphone-tips-and-advice.com/activator.html], 10 pp. |
“Parts of Your Band,” (Product Release Date Unknown, downloaded Jul. 22, 2013) Jawbone UP Band, 1 page. |
Chudnow, Alan (Dec. 3, 2012) “Basis Wristband Make Its Debut,” The Wired Self, Living in a Wired World, published in Health [retrieved on Jul. 22, 2013 at http://thewiredself.com/health/basis-wrist-band-make-its-debut/], 3 pp. |
Definition of “Graphic” from Merriam-Webster Dictionary, downloaded from merriam-webster.com on Oct. 4, 2014, 3 pp. |
Definition of “Graphical user interface” from Merriam-Webster Dictionary, downloaded from merriam-webster.com on Oct. 4, 2014, 2 pp. |
DesMarais, Christina (posted on Sep. 3, 2013) “Which New Activity Tracker is Best for You?” Health and Home, Health & Fitness, Guides & Reviews, [Retrieved on Sep. 23, 2013 at http://www.techlicious.com/guide/which-new-activity-tracker-is-right-for-you/] 4 pp. |
Empson, Rip, (Sep. 22, 2011) “Basis Reveals an Awesome New Affordable Heart and Health Tracker You Can Wear on Your Wrist,” [retrieved on Sep. 23, 2013 at http://techcrunch.com/2011/09/22/basis-reveals-an-awesome-new . . . ], 3 pp. |
Fitbit User's Manual, Last Updated Oct. 22, 2009, 15 pages. |
Forerunner® 10 Owner's Manual (Aug. 2012), Garmin Ltd., 10 pp. |
Forerunner® 110 Owner's Manual, (2010) “GPS-Enabled Sport Watch,” Garmin Ltd., 16 pp. |
Forerunner® 201 personal trainer owner's manual, (Feb. 2006) Garmin Ltd., 48 pp. |
Forerunner® 205/305 Owner's Manual, GPS-enabled trainer for runners, (2006-2008), Garmin Ltd., 80 pp. |
Forerunner® 210 Owner's Manual, (2010) “GPS-Enabled Sport Watch,” Garmin Ltd., 28 pp. |
Forerunner® 301 personal trainer owner's manual, (Feb. 2006) Garmin Ltd., 66 pp. |
Forerunner® 310XT Owner's Manual, Multisport GPS Training Device, (2009-2013), Garmin Ltd., 56 pp. |
Forerunner® 405 Owner's Manual, (Mar. 2011) “GPS-Enabled Sport Watch With Wireless Sync,” Garmin Ltd., 56 pp. |
Forerunner® 405CX Owner's Manual, “GPS-Enabled Sports Watch With Wireless Sync,” (Mar. 2009), Garmin Ltd., 56 pp. |
Forerunner® 410 Owner's Manual, (Jul. 2012) “GPS-Enabled Sport Watch With Wireless Sync,” Garmin Ltd., 52 pp. |
Forerunner® 50 with ANT+Sport™ wireless technology, Owner's Manual, (Nov. 2007) Garmin Ltd., 44 pp. |
Forerunner® 910XT Owner's Manual, (Jan. 2013) Garmin Ltd., 56 pp. |
Garmin Swim™ Owner's Manual (Jun. 2012), 12 pp. |
Lark/Larkpro, User Manual, (2012) “What's in the box,” Lark Technologies, 7 pp. |
Larklife, User Manual, (2012) Lark Technologies, 7 pp. |
Minetti et al., “Energy cost of walking and running at extreme uphill and downhill slopes,” J Appl Physiol, 2002, 93:1039-1046. |
Nike+ FuelBand GPS Manual, User's Guide (Product Release Date Unknown, downloaded Jul. 22, 2013), 26 pp. |
Nike+SportBand User's Guide, (Product Release Date Unknown, downloaded Jul. 22, 2013), 36 pages. |
Nike+SportWatch GPS Manual, User's Guide, Powered by TOMTOM, (Product Release Date Unknown, downloaded Jul. 22, 2013), 42 pages. |
O'Donovan et al., 2009, A context aware wireless body area network (BAN), Proc. 3rd Intl. Conf. Pervasive Computing Technologies for Healthcare, pp. 1-8. |
Polar WearLink® + Coded Transmitter 31 Coded Transmitter W.I.N.D. User Manual, Polar® Listen to Your Body, Manufactured by Polar Electro Oy, 11 pages. |
Rainmaker, (Jun. 25, 2012, updated Feb. 16, 2013) “Garmin Swim watch In-Depth Review,” [retrieved on Sep. 9, 2013 at http://www.dcrainmaker.com/2012/06/garmin-swim-in-depth-review.html], 38 pp. |
Thompson et al., (Jan. 1996) “Predicted and measured resting metabolic rate of male and female endurance athletes,” Journal of the American Dietetic Association 96(1):30-34. |
Number | Date | Country | |
---|---|---|---|
20150102923 A1 | Apr 2015 | US |
Number | Date | Country | |
---|---|---|---|
61886000 | Oct 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14329735 | Jul 2014 | US |
Child | 14579982 | US | |
Parent | 14050305 | Oct 2013 | US |
Child | 14329735 | US |