The present disclosure relates to systems and methods for motion-activated display of messages on an activity monitoring device.
In recent years, interest in health and fitness has grown tremendously, driven by a better understanding of the benefits of good fitness to overall health and wellness. Unfortunately, although modern culture has brought about many new technologies, such as the Internet, connected devices, and computers, people have become less active. Additionally, many office jobs require people to sit in front of computer screens for long periods of time, which further reduces a person's activity levels. Furthermore, much of today's entertainment involves viewing multimedia content, computer social networking, and other types of computer-based interaction. Although such computer activity can be very productive as well as entertaining, it tends to reduce a person's overall physical activity.
To provide users concerned with health and fitness a way of measuring or accounting for their activity or lack thereof, fitness activity trackers have recently grown in popularity. Fitness activity trackers are used to measure activity, such as walking, motion, running, sleeping, being inactive, bicycling, exercising on an elliptical trainer, and the like. Typically, the data collected by such devices can be transferred and viewed on a computing device. However, while fitness activity trackers enable many data-intensive features relating to fitness activity, interfacing with a fitness activity tracker can feel sterile and robotic to a user.
It is in this context that embodiments of the invention arise.
Embodiments described in the present disclosure provide systems, apparatus, computer readable media, and methods for displaying motion-activated messages on a portable activity monitoring device.
In one embodiment, a method for presenting a message on an activity monitoring device is provided, including: storing a plurality of messages to the device; detecting a non-user interactive state of the device; detecting a change of the device from the non-user interactive state to a user-interactive state; in response to detecting the change from the non-user interactive state to the user-interactive state, selecting one of the plurality of messages, and displaying the selected message on the device; wherein selecting one of the plurality of messages is based on one or more of a length of time of the non-user interactive state, a current time of day, a current location of the device, or an activity history associated with a user of the device; wherein the method is executed by at least one processor.
In one embodiment, detecting the non-user interactive state includes detecting a stationary state that is defined by non-movement of the device for a predefined time period; and detecting the change of the device from the non-user interactive state to the user-interactive state includes detecting a movement of the device from the stationary state.
In one embodiment, detecting the change from the non-user interactive state to the user-interactive state includes detecting a movement of the device from a first orientation to a second orientation.
In one embodiment, detecting the non-user interactive state includes detecting a charging state of the device; and detecting the change from the non-user interactive state to the user-interactive state includes detecting a change of the device from the charging state to a non-charging state.
In one embodiment, detecting the change from the non-user interactive state to the user-interactive state includes detecting one or more of a button press or a touchscreen input.
In one embodiment, the device is configured to detect one or more of the following activities by a user: steps taken, energy consumed, elevation gained, active minutes.
In one embodiment, storing the plurality of messages includes identifying the device to a server and downloading the plurality of messages from the server, the server being configured to access a user account associated with the device, the plurality of messages being selected by the server based on the user account.
In one embodiment, at least one of the plurality of messages is defined based on one or more of a current date, a location of the device, a current season, or a current weather.
In one embodiment, at least one of the plurality of messages is defined based on an activity history of a user associated with the device.
In one embodiment, the activity history is defined by levels of activity associated to specific time periods.
In one embodiment, at least one of the plurality of messages is defined by input received from a secondary user, the secondary user being a member of a social graph of a primary user associated with the device.
In one embodiment, the operations of selecting and displaying are not performed during a time period for which an event is scheduled in a calendar of a user of the device.
In another embodiment, a method for presenting a message on an activity monitoring device is provided, comprising: storing a plurality of messages to the device; detecting a stationary state of the device; detecting a movement of the device from the stationary state; in response to detecting the movement from the stationary state, selecting one of the plurality of messages, and displaying the selected message on the device; wherein at least one of the plurality of messages is defined based on one or more of a current date, a location of the device, a current season, or a current weather; wherein the method is executed by at least one processor.
In one embodiment, detecting the stationary state includes detecting non-movement of the device for a predefined time period.
In one embodiment, storing the plurality of messages includes identifying the device to a server and downloading the plurality of messages from the server, the server being configured to access a user account associated with the device, the plurality of messages being selected by the server based on the user account.
In one embodiment, at least one of the plurality of messages is defined based on an activity history of a user associated with the device.
In one embodiment, the activity history is defined by levels of activity associated to specific time periods.
In one embodiment, at least one of the plurality of messages is defined by input received from a secondary user, the secondary user being a member of a social graph of a primary user associated with the device.
In one embodiment, the operations of selecting and displaying are not performed during a time period for which an event is scheduled in a calendar of a user of the device.
In another embodiment, an activity monitoring device is provided, comprising: a message storage device configured to store a plurality of messages; a motion sensor; a display; logic configured to detect, based on output of the motion sensor, a stationary state of the device and a subsequent movement of the device from the stationary state, and, in response to detecting the movement from the stationary state, select one of the plurality of messages, and display the selected message on the device; wherein at least one of the plurality of messages is defined based on an activity history of a user associated with the device; wherein the logic is executed by at least one processor.
In one embodiment, detecting the stationary state includes detecting non-movement of the device for a predefined time period.
In one embodiment, selecting one of the plurality of messages is based on one or more of a length of time of the stationary state, a current time of day, a current location of the device, or an activity history associated with a user of the device.
In one embodiment, at least one of the plurality of messages is defined based on one or more of a current date, a location of the device, a current season, or a current weather.
In one embodiment, the activity history is defined by levels of activity associated to specific time periods.
In one embodiment, at least one of the plurality of messages is defined by input received from a secondary user, the secondary user being a member of a social graph of a primary user associated with the device.
In one embodiment, the activity monitoring device further comprises: logic configured to download the plurality of messages to the message storage, wherein the downloading includes identifying the device to a server, the server being configured to access a user account associated with the device, the plurality of messages being selected by the server based on the user account.
In another embodiment, an activity monitoring device is provided, comprising: a message storage device configured to store a plurality of messages; a motion sensor; a display; logic configured to detect, based on output of the motion sensor, a stationary state of the device and a subsequent movement of the device from the stationary state, and, in response to detecting the movement from the stationary state, select one of the plurality of messages, and display the selected message on the device; wherein at least one of the plurality of messages is defined by input received from a secondary user, the secondary user being a member of a social graph of a primary user associated with the device; wherein the logic is executed by at least one processor.
In one embodiment, detecting the stationary state includes detecting non-movement of the device for a predefined time period.
In one embodiment, selecting one of the plurality of messages is based on one or more of a length of time of the stationary state, a current time of day, a current location of the device, or an activity history associated with a user of the device.
In one embodiment, at least one of the plurality of messages is defined based on one or more of a current date, a location of the device, a current season, or a current weather.
In one embodiment, at least one of the plurality of messages is defined based on an activity history of a user associated with the device.
In one embodiment, the activity history is defined by levels of activity associated to specific time periods.
In one embodiment, the activity monitoring device further comprises: logic configured to download the plurality of messages to the message storage, wherein the downloading includes identifying the device to a server, the server being configured to access a user account associated with the device, the plurality of messages being selected by the server based on the user account.
In another embodiment, a method for presenting a message on an activity monitoring device is provided, comprising: storing a plurality of messages to the device; detecting a stationary state of the device; detecting a movement of the device from the stationary state; in response to detecting the movement from the stationary state, selecting one of the plurality of messages, and displaying the selected message on the device; wherein the operations of selecting and displaying are not performed during a time period for which an event is scheduled in a calendar of a user of the device; wherein the method is executed by at least one processor.
In another embodiment, a method for presenting messages on an activity monitoring device is provided, including the following method operations: receiving a request to synchronize the device with a user account to which the device is associated; retrieving profile data from the user account; selecting one or more messages from a message storage based on the profile data; sending the selected messages to the device, the device configured to display at least one of the selected messages in response to detecting movement of the device from a stationary state.
In one embodiment, the profile data defines one or more of an age, a gender, a location, or a historical activity profile.
In one embodiment, selecting one or more messages is based on one or more of a current date, a current day of the week, a current location of the device, an activity history associated with a user of the device, a current season, a current weather, or social network activity.
In one embodiment, one of the selected messages is defined by a secondary user, the secondary user being a member of a social graph of a primary user associated with the device.
Other aspects will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of embodiments described in the present disclosure.
Various embodiments described in the present disclosure may best be understood by reference to the following description taken in conjunction with the accompanying drawings.
Embodiments described in the present disclosure provide systems, apparatus, computer readable media, and methods for displaying motion-activated messages on a portable activity monitoring device.
It should be noted that there are many inventions described and illustrated herein. The present inventions are neither limited to any single aspect nor embodiment thereof, nor to any combinations and/or permutations of such aspects and/or embodiments. Moreover, each of the aspects of the present inventions, and/or embodiments thereof, may be employed alone or in combination with one or more of the other aspects of the present inventions and/or embodiments thereof. For the sake of brevity, many of those permutations and combinations will not be discussed separately herein.
Further, in the course of describing and illustrating the present inventions, various circuitry, architectures, structures, components, functions and/or elements, as well as combinations and/or permutations thereof, are set forth. It should be understood that circuitry, architectures, structures, components, functions and/or elements other than those specifically described and illustrated, are contemplated and are within the scope of the present inventions, as well as combinations and/or permutations thereof.
The environmental sensors 118 may be in the form of motion detecting sensors. In some embodiments, a motion sensor can be one or more of an accelerometer, a gyroscope, a rotary encoder, a calorie measurement sensor, a heat measurement sensor, a moisture measurement sensor, a displacement sensor, an ultrasonic sensor, a pedometer, an altimeter, a linear motion sensor, an angular motion sensor, a multi-axis motion sensor, or a combination thereof. The biometric sensors 116 can be defined to measure physiological characteristics of the user of the activity tracking device 100. The user interface 114 provides a way of communicating with the activity tracking device 100 in response to user interaction 104. The user interaction 104 can be in the form of physical contact (e.g., without limitation, tapping, sliding, rubbing, multiple taps, gestures, etc.).
In some embodiments, the user interface 114 is configured to receive user interaction 104 by way of proximity sensors, button presses, touch sensitive screen inputs, graphical user interface inputs, voice inputs, sound inputs, etc. The activity tracking device 100 can communicate with a client and/or server 112 using the wireless transceiver 110. The wireless transceiver 110 will allow the activity tracking device 100 to communicate using a wireless connection, which is enabled by wireless communication logic. The wireless communication logic can be in the form of a circuit having radio communication capabilities. The radio communication capabilities can be in the form of a Wi-Fi connection, a Bluetooth connection, a low-energy Bluetooth connection, or any other form of wireless tethering or near field communication. In still other embodiments, the activity tracking device 100 can communicate with other computing devices using a wired connection (not shown). As mentioned, the environmental sensors 118 can detect motion of the activity tracking device 100.
The motion can be activity of the user, such as walking, running, stair climbing, etc. The motion can also be in the form of physical contact received on any surface of the activity tracking device 100, so long as the environmental sensors 118 can detect such motion from the physical contact. Such physical contact may be in the form of a tap or multiple taps by a finger upon the housing of the activity tracking device 100.
As shown in
Some motions, when analyzed, yield various types of metrics, such as step count, stairs climbed, distance traveled, very active minutes, calories burned, etc. The physical contact logic 142 can include logic that calculates or determines when particular physical contact qualifies as an input. To qualify as an input, the physical contact detected by sensors 156 should have a particular pattern that is identifiable as input. For example, the input may be predefined to be a double tap input, and the physical contact logic 142 can analyze the sensor data produced by sensors 156 to determine whether a double tap indeed occurred.
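Purely by way of illustration, the following sketch shows one way such a double tap pattern might be identified from a stream of accelerometer samples; the function name, thresholds, and sample format are hypothetical assumptions, not prescribed by this disclosure:

```python
def detect_double_tap(samples, spike_threshold=2.5, max_gap=0.4):
    """Return True when two acceleration spikes occur between 50 ms and
    max_gap seconds apart.

    samples: iterable of (timestamp_seconds, acceleration_magnitude_g) tuples.
    Illustrative only; a production detector would also debounce spikes and
    reject sustained motion.
    """
    spike_times = [t for t, a in samples if a > spike_threshold]
    return any(0.05 < (t2 - t1) <= max_gap
               for t1, t2 in zip(spike_times, spike_times[1:]))
```

For example, detect_double_tap([(0.0, 3.0), (0.2, 3.1)]) returns True, since the two spikes are 200 ms apart.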
The display interface logic 144 is configured to interface with the processor and the motion-activated messaging logic to determine when specific messages will be displayed on the display screen 122 of the activity tracking device 100. The display interface logic 144 can act to turn on the screen, display metric information, display characters or alphanumeric information, display graphical user interface graphics, or combinations thereof. Alarm management logic 146 can function to provide a user interface and settings for managing and receiving input from a user to set an alarm. The alarm management logic can interface with a timekeeping module (e.g., clock, calendar, time zone, etc.), and can trigger the activation of an alarm. The alarm can be in the form of an audible alarm or a non-audible alarm.
A non-audible alarm can provide such alarm by way of a vibration. The vibration can be produced by a motor integrated in the activity tracking device 100. The vibration can be defined to include various vibration patterns, intensities, and custom set patterns. The vibration produced by the motor or motors of the activity tracking device 100 can be managed by the alarm management logic 146 in conjunction with processing by the processor 106. The wireless communication logic 148 is configured for communication of the activity tracking device with another computing device by way of a wireless signal. The wireless signal can be in the form of a radio signal. As noted above, the radio signal can be in the form of a Wi-Fi signal, a Bluetooth signal, a low energy Bluetooth signal, or combinations thereof. The wireless communication logic can interface with the processor 106, storage 108 and battery 154 of device 100, for transferring activity data, which may be in the form of motion data or processed motion data, stored in the storage 108 to the computing device.
In one embodiment, processor 106 functions in conjunction with the various logic components 140, 142, 144, 146, and 148. The processor 106 can, in one embodiment, provide the functionality of any one or all of the logic components. In other embodiments, multiple chips can be used to separate the processing performed by any one of the logic components and the processor 106. Sensors 156 can communicate via a bus with the processor 106 and/or the logic components. The storage 108 is also in communication with the bus for providing storage of the motion data processed or tracked by the activity tracking device 100. Battery 154 provides power to the activity tracking device 100.
In one embodiment, remote device 200 communicates with activity tracking device 100 over a Bluetooth connection. In one embodiment, the Bluetooth connection is a low energy Bluetooth connection (e.g., Bluetooth LE, BLE, or Bluetooth Smart). Low energy Bluetooth is configured for providing low power consumption relative to standard Bluetooth circuitry. Low energy Bluetooth uses, in one embodiment, a 2.4 GHz radio frequency, which allows for dual mode devices to share a single radio antenna. In one embodiment, low energy Bluetooth connections can function at distances up to 50 meters, with over-the-air data rates of approximately 1 to 3 megabits per second (Mbps). In one embodiment, a proximity distance for communication can be defined by the particular wireless link, and is not tied to any specific standard. It should be understood that the proximity distance limitation will change in accordance with changes to existing standards and in view of future standards and/or circuitry and capabilities.
Remote device 200 can also communicate with the Internet 160 using an Internet connection. The Internet connection of the remote device 200 can include cellular connections, wireless connections such as Wi-Fi, and combinations thereof (such as connections to switches between different types of connection links). The remote device, as mentioned above, can be a smartphone or tablet computer, or any other type of computing device having access to the Internet and with capabilities for communicating with the activity tracking device 100.
A server 220 is also provided, which is interfaced with the Internet 160. The server 220 can include a number of applications that service the activity tracking device 100, and the associated users of the activity tracking device 100 by way of user accounts. For example, the server 220 can include an activity management application 224. The activity management application 224 can include logic for providing access to various devices 100, which are associated with user accounts managed by server 220. Server 220 can include storage 226 that includes various user profiles associated with the various user accounts. The user account 228a for user A and the user account 228n for user N are shown to include various information.
The information can include, without limitation, data associated with motion-activated messaging 230, user data, etc. As will be described in greater detail below, the motion-activated messaging data 230 includes information regarding a user's preferences, settings, and configurations which are settable by the user or set by default at the server 220 when accessing a respective user account. The storage 226 will include any number of user profiles, depending on the number of registered users having user accounts for their respective activity tracking devices. It should also be noted that a single user account can have various or multiple devices associated therewith, and the multiple devices can be individually customized, managed and accessed by a user. In one embodiment, the server 220 provides a user with access to view the user data 232 associated with the activity tracking device.
The data viewable by the user includes the tracked motion data, which is processed to identify a plurality of metrics associated with the motion data. The metrics are shown in various graphical user interfaces of a website enabled by the server 220. The website can include various pages with graphical user interfaces for rendering and displaying the various metrics for view by the user associated with the user account. In one embodiment, the website can also include interfaces that allow for data entry and configuration by the user.
In some embodiments, additional information that may be of use to a user may be accessed by pressing the button 304, such as a current date or time, a battery charge level, a date/time of the device's last data sync (i.e. transfer of data from the activity monitoring device 300 to an external device), a pairing mode for pairing the activity monitoring device to an external device, an option to reset a fitness activity counter, an option to turn the device off, etc. Additionally, it should be appreciated that the button may be pressed in various ways to facilitate access to various features. By way of example, the button may be pressed once, pressed and held, pressed twice in rapid succession, etc. For example, pressing and holding the button 304 may turn the device 300 on or off.
The stationary position can be defined to require the device to be maintained in a non-moving or stationary state for a predefined length of time. For example, in one embodiment, the stationary position requires the device to be not moving for approximately two to three seconds. In other embodiments, the stationary position is defined to require the device to be not moving for any specified length of time.
Furthermore, it will be appreciated that the stationary state may be defined by the absence of movements exceeding a predefined threshold for a specified length of time. It should be appreciated that the specific types of movements and the predefined threshold can be defined in various ways. For example, in one embodiment, the movement is defined by the sensor output of motion-sensitive hardware included in the activity monitoring device, such as accelerometers, gravitometers, gyroscopes, etc., and the predefined threshold may be defined by a specific magnitude of a given sensor output (e.g. detected acceleration of the device exceeds an acceleration threshold). It will be appreciated that a combination of sensor outputs and corresponding thresholds can be considered. In one embodiment, a weighted combination (e.g., a weighted sum) of motion sensor outputs is defined and compared against a predefined threshold. In this manner, certain types of movements may be prioritized over others for purposes of identifying movement from a stationary state. For example, in one embodiment, translational movement of the device is prioritized over rotational movement, such that sensor outputs which are indicative of translational movement are more highly weighted than sensor outputs which are indicative of rotational movement.
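As an illustrative sketch of such a weighted combination, accelerometer output indicative of translation may be weighted above gyroscope output indicative of rotation; the weights, threshold, and names below are example assumptions rather than values prescribed herein:

```python
# Sensor magnitudes are assumed to be normalized to comparable scales.
TRANSLATION_WEIGHT = 0.8   # accelerometer output, indicative of translation
ROTATION_WEIGHT = 0.2      # gyroscope output, indicative of rotation
MOVEMENT_THRESHOLD = 0.15  # example predefined threshold, tuned per device

def exceeds_movement_threshold(accel_magnitude: float,
                               gyro_magnitude: float) -> bool:
    """Return True if the weighted sum of motion outputs indicates movement."""
    weighted = (TRANSLATION_WEIGHT * accel_magnitude
                + ROTATION_WEIGHT * gyro_magnitude)
    return weighted > MOVEMENT_THRESHOLD
```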
In other embodiments, the detected movement and corresponding predefined threshold can be defined based on particular types of movements which are determined or derived from motion sensor output. For example, in one embodiment, a distance moved is determined for the device, and the device is defined to be in a stationary state so long as the distance moved by the device does not exceed a predefined distance threshold. In one embodiment, the detection of the distance moved is reset following a period of zero movement.
In another embodiment, the detection of the distance moved is cumulative. By defining the stationary state based on the absence of movement exceeding a predefined threshold, false positive movements can be avoided, so that motion-activated messages are not displayed when the movements are of low significance or unlikely to be the result of intentional movement of the device warranting display of a motion-activated message.
Additionally, the stationary position can be defined to require not only a lack of movement of the activity monitoring device 300, but also a specific orientation of the activity monitoring device 300 in three-dimensional space. For example, in one embodiment, the device 300 is required to be in a substantially horizontal orientation, that is, the orientation that the device 300 has when it is lying on a substantially horizontal flat surface, with the top side of the device 300 (the side on which the display 302 may be viewed by a user) facing upward.
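One illustrative way such an orientation requirement might be tested, assuming a 3-axis accelerometer reporting in units of g and a +z axis pointing out of the display face (both assumptions for this sketch only), is:

```python
def is_face_up_and_flat(ax: float, ay: float, az: float,
                        tolerance: float = 0.15) -> bool:
    """True when gravity is aligned with +z, i.e. the device lies flat on
    a substantially horizontal surface with the display facing upward."""
    return (abs(az - 1.0) < tolerance
            and abs(ax) < tolerance
            and abs(ay) < tolerance)
```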
Thus, in accordance with the foregoing, a motion-activated message can be displayed when the following events are detected: non-movement of the device; orientation of the activity monitoring device in a specified orientation; maintenance of the non-movement and specified orientation for a minimum specified length of time; and, following the maintenance of the non-movement and specified orientation, movement of the activity monitoring device (e.g. translational and/or rotational movement). The completion of the preceding events can be configured to trigger presentation of a motion-activated message on the display 302 of the activity monitoring device 300.
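The foregoing sequence of events can be viewed as a small arming state machine. The following sketch illustrates the idea; the helper callables wrapping the motion sensors, and the example maintenance period, are hypothetical:

```python
import time

STATIONARY_SECONDS = 3.0  # example minimum maintenance period

def wait_for_motion_trigger(is_stationary, is_in_target_orientation):
    """Block until the trigger sequence completes: the device rests in the
    target orientation for STATIONARY_SECONDS and is then moved.

    is_stationary and is_in_target_orientation are assumed callables
    wrapping the motion sensors; this is a sketch of the event sequence,
    not device firmware.
    """
    settled_at = None
    while True:
        if is_stationary() and is_in_target_orientation():
            if settled_at is None:
                settled_at = time.monotonic()   # begin timing the rest period
            elif time.monotonic() - settled_at >= STATIONARY_SECONDS:
                # Armed: any subsequent movement triggers the message.
                while is_stationary():
                    time.sleep(0.05)
                return
        else:
            settled_at = None                   # sequence broken; start over
        time.sleep(0.05)
```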
As shown with continued reference to
It should be appreciated that the orientation of the device 500 may be substantially horizontal, or may have a specific tilt depending upon the specific configuration of the case 502, which determines how the device 500 will be oriented when the case 502 is resting on the surface 510 as shown. For example, assuming that the activity monitoring device 500 has a shape that is elongated in a (lengthwise) direction that is substantially orthogonal to the cross-sectional plane shown in
It will be appreciated that the configuration of the case 502 in combination with its particular orientation on the surface 510 will determine the orientation of the activity monitoring device 500. In one embodiment, the specific orientation of the activity monitoring device 500 when positioned as shown in
As such, the orientation of the device in the illustrated embodiment can be detected. When the orientation is detected for a minimum predefined time, subsequent movement of the device from such an orientation can be configured to trigger presentation of a motion-activated message on the display of the device 500.
Though reference has been made to an activity monitoring device having a generally elongated shape, it should be appreciated that this form factor for an activity monitoring device is provided by way of example, and not by way of limitation. For example,
In one embodiment, the device 600 can be configured to display a motion-activated message on the display 602 when the device 600 is moved from a stable stationary position in which the device 600 lies on its side on a substantially horizontal surface. In the stationary position, an outer edge of the band of the wrist attachable device 600 will contact the surface on which the device 600 rests, and the display 602 will have a sideways orientation. It will be appreciated that the device 600 has two such possible stable stationary positions, one in which the device rests on the right side of the display, and another in which the device rests on the left side of the display. Movement from these stable stationary positions may trigger presentation of a motion-activated message on the display 602.
In other embodiments, a recognizable motion can be detected and may be configured to trigger display of a motion-activated message. For example, with continued reference to the activity monitoring device 600, a detected movement of the device 600 indicating that a user wearing the device has raised their arm so as to view the display 602 may be configured to trigger display of a motion-activated message. Such a motion of the device 600 may be that resulting from simultaneous lifting and pronation of the user's forearm, by way of example. It should be appreciated that a similar concept can be extended to other types of motions and other form factors for the activity monitoring device, wherein an identifiable motion of the activity monitoring device triggers display of a motion-activated message.
To determine which one of the messages 700 to display, the messages 700 may be ranked (ref. 702) based on various factors, including without limitation, time/date, prior display, activity/inactivity of the user as detected by the device 706, etc. Additional exemplary factors which may be utilized to rank, or otherwise determine selection from, a plurality of messages are discussed in further detail below. As indicated at ref. 704, one of the messages is selected for display, based at least in part on the determined ranking. The selected message is displayed on the display 708 of the activity monitoring device 706. It will be appreciated that the selected message may be scrolled across the display 708 if it is too long to be displayed in its entirety at once.
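As a simplified sketch of such ranking (the message fields and scoring weights below are illustrative assumptions), messages may be scored by fit to the current time of day and by how recently they were last displayed:

```python
import time

def rank_messages(messages, now=None):
    """Rank candidate messages so the best-scoring one is first.

    Each message is assumed to be a dict with optional 'hour_range'
    (start_hour, end_hour) and 'last_shown' (epoch seconds) fields; the
    scoring is a simplified stand-in for the factors discussed herein.
    """
    now = now or time.time()
    hour = time.localtime(now).tm_hour

    def score(msg):
        s = 0.0
        lo, hi = msg.get("hour_range", (0, 24))
        if lo <= hour < hi:
            s += 1.0                      # message fits the current time of day
        days_since = (now - msg.get("last_shown", 0)) / 86400.0
        s += min(days_since, 1.0)         # favor messages not shown recently
        return s

    return sorted(messages, key=score, reverse=True)
```

The top-ranked entry of the returned list would then be presented on the display 708.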
The device 800 further includes a motion activated message data storage 812 which contains message data defining various messages that can be presented based on motion of the device following a detected stationary state. Motion activated message display logic 814 is configured to determine when to display a motion activated message as well as the particular message that is displayed. The motion activated message display logic 814 includes activation logic 816. In one embodiment, the activation logic 816 is configured to identify, based on sensor data received from the motion sensors 806, when the device 800 is resting in a predefined orientation. As has been discussed, the predefined orientation may correspond to various possible resting orientations of the device 800 when it is placed on a substantially horizontal flat surface.
The activation logic 816 determines when the device 800 is continuously oriented in the predefined orientation for a specified minimum period of time. By way of example, the specified minimum period of time may be defined in the range of 2 to 5 seconds, 5 to 10 seconds, 10 seconds to 1 minute, 1 to 10 minutes, or any other defined period of time. In one embodiment, the specified minimum period of time may be determined based on a user defined setting. In yet another embodiment, the specified minimum period of time may vary depending upon various factors, such as time of day, the amount of activity or inactivity recently associated with the device 800, etc.
When the activation logic 816 determines that the device 800 has continuously maintained the predefined orientation for the requisite minimum period of time, then the activation logic 816 is configured to detect a subsequent movement of the device 800 from the predefined orientation, based on sensor data from the motion sensors 806. Upon such detection of movement, the activation logic is configured to trigger selection and display of a motion activated message on the device 800.
In various embodiments, it will be appreciated that the activation logic 816 can be configured to consider other factors for purposes of determining when to trigger selection and display of a motion activated message. For example, a user interacting with the device 800 may place the device 800 on a flat surface while its display is still active. A timer can be configured to automatically turn off the display after a given amount of time in which interactivity with the display is not detected (e.g. a button on the device 800 which controls the operation of the display is not pressed). Thus, in one embodiment, the activation logic 816 is configured to determine a current active or inactive state of the display 830, and does not commence its procedure for determining when to trigger the motion activated message unless the display is currently inactive or turned off.
In another embodiment, the activation logic 816 will not commence its activation procedure unless ambient light levels detected by the device 800 are above a predefined threshold. This may prevent the unnecessary display of a motion activated message when the device 800 is in a dark location, such as a pocket of a user, or a purse or bag, by way of example.
In another embodiment, the activation logic 816 will not trigger a motion-activated message during a time period for which an event is scheduled in a calendar of the user, such as a meeting or other type of event. This will prevent the display of motion-activated messages when the user is scheduled, and therefore expected, to be preoccupied with some activity. In one embodiment, this is facilitated via synchronization operations between the device 800 and another computing device which includes a calendar for the user, such as a smart phone or computer. During synchronization operations, the calendar of the user (on the additional device to which the device 800 syncs) is accessed to determine event scheduling information. This information defines time periods for events which have been scheduled on the calendar. At least the start and end times for the scheduled events can be transmitted to the device 800 and utilized to define time periods during which motion-activated messages will not be displayed.
In another embodiment, the activation logic 816 is configured to not trigger a motion-activated message when it is determined that the user is engaged in an activity during which it is expected that the user will be unavailable to view a motion-activated message. For example, in one embodiment, if it is determined that the device is currently moving in a motor vehicle such as a car (e.g. based on GPS data and/or motion data), the activation logic 816 may be configured to not trigger a motion-activated message.
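Gathering the gating conditions discussed above into one place, a sketch of the checks might look as follows; all parameter names are illustrative assumptions, and blocked_periods stands in for the calendar-derived time periods obtained during synchronization:

```python
def may_trigger_message(display_active, ambient_light, light_threshold,
                        now, blocked_periods, moving_in_vehicle):
    """Sketch of the gating checks; blocked_periods is a list of
    (start, end) timestamps derived from the user's synced calendar."""
    if display_active:
        return False                 # user is already interacting
    if ambient_light < light_threshold:
        return False                 # device likely in a pocket, purse, or bag
    if any(start <= now <= end for start, end in blocked_periods):
        return False                 # a calendar event is scheduled
    if moving_in_vehicle:
        return False                 # user unlikely to view the display
    return True
```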
When the activation logic 816 determines that a motion activated message is to be displayed, as discussed above, then selection logic 818 is engaged to select a message from the message data storage 812 for presentation. The selection logic 818 can be configured to select the message based on a variety of factors as discussed elsewhere herein. By way of example, the device 800 includes a clock 822, which provides a current date and time, which may be utilized by the selection logic 818 to determine which message to present. In one embodiment, the selection logic 818 determines a ranked order for a plurality of messages stored in the message data storage 812, and identifies a specific message for presentation based on the ranked order. The motion activated message display logic 814 is configured to render the selected message on the display 830 of the device 800 via a display controller 828.
The motion activated message display logic 814 further includes deactivation logic 820 which is configured to deactivate the operation of the display logic 814 under certain conditions. For example, when a user presses a button 824 to interact with the device 800, this may trigger deactivation logic 820 to deactivate the message display logic 814. As discussed herein, input received from the button 824 may trigger the display of various activity metrics by metrics display logic 826. Thus, pressing the button 824 will cause any motion activated message that is currently displayed on the display 830 to be replaced by various metrics or other information, the display of which is triggered by a button press.
In other embodiments, the display of a motion activated message can be terminated in response to other types of interaction or the lack thereof. For example, a motion activated message may be configured to be displayed for a limited amount of time, whereupon if no additional interaction with the device 800 is detected, then rendering of the motion activated message is terminated, and the display is turned off. In such an embodiment, the deactivation logic 820 can be configured to include a timer that is activated when a motion activated message is rendered by the display logic 814. Upon expiration of the timer, the display of the motion activated message is stopped. In a related embodiment, upon the expiration of a limited amount of time, the display of the motion activated message is ended and replaced with display of other information automatically, such as a current time, activity metric, or any other information which the device 800 may be configured to display.
In another embodiment, the display of a motion activated message is terminated upon the detection of a specific kind of physical interaction with the device 800, e.g. shaking the device, tapping the device, touching or swiping a touchscreen on the device, etc. Additionally, these types of physical interactions with the device 800 can be configured to trigger display of other information replacing the previously displayed motion activated message on the display 830.
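A minimal sketch of this timeout-and-interaction behavior, assuming hypothetical display and interaction-detection hooks, follows:

```python
import time

MESSAGE_TIMEOUT = 8.0  # seconds; example display duration

def show_message_with_timeout(message, display, interaction_detected):
    """Render a message, then replace or clear it when the user interacts
    (button, tap, shake, swipe) or when the timeout expires.

    display and interaction_detected are assumed firmware hooks; this is
    a sketch only.
    """
    display.show(message)
    deadline = time.monotonic() + MESSAGE_TIMEOUT
    while time.monotonic() < deadline:
        if interaction_detected():
            display.show_metrics()   # replace message with activity metrics
            return
        time.sleep(0.05)
    display.off()                    # no interaction: stop rendering
```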
The device 800 also includes synchronization logic 804 which is configured to handle synchronization of data with another device or with a remote server or cloud-based service. The activity monitoring device 800 is configured to wirelessly transmit and receive data, with wireless communications being handled by a wireless controller 802. Synchronization logic 804 is configured to upload activity data from the activity data storage 810. The synchronization logic 804 is also configured to download motion activated message data from a remote storage location to the motion activated message data storage 812. In this manner, the synchronization logic 804 updates the messages which are stored in the message data storage 812. It should be appreciated that the synchronization logic 804 may also be configured to effect deletion of messages from the motion activated message data storage 812.
In one embodiment, a message may be selected based on a current time of day. For example, a message such as “good morning” may be selected when the current time is in the morning. A message may also be selected based on the current date or day of the week. For example, a message such as “TGIF” may be selected on a Friday. In another embodiment, a message may be selected based on a current location of the device. For example, if the user travels to a new location such as the city of San Francisco, then a message may be configured to welcome the user to the new location, such as “welcome to San Francisco.”
In one embodiment, a message may be selected based on the activity of a user as determined via the activity monitoring device. For example, if the user has recently been determined to have taken a certain number of steps, then a message may be selected congratulating the user on having taken that many steps. A message may also be selected based on the inactivity of the user as determined by the activity monitoring device. For example, if the device has been resting without movement for an extended period of time, then a message may be selected that is configured to encourage the user to engage in fitness activity or otherwise engage with the device, such as “walk me.”
In one embodiment, a message may be selected at random, or based on a random number. In another embodiment, a message may be selected based on prior message selection, so as to avoid displaying the same message to the user in a relatively short time span.
In some embodiments, a message may be selected for display based on various sensed conditions. For example, a message may be selected based on an environmental condition which the activity monitoring device is capable of detecting, such as ambient light, temperature, ambient pressure, altitude, humidity, ambient sound, orientation, etc.
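Combining several of the preceding examples, a simple rule-based selection might be sketched as follows; the rules, thresholds, and city comparison are illustrative assumptions rather than prescribed behavior:

```python
import random

def choose_contextual_message(hour, weekday, city, idle_hours, recent=()):
    """Map sensed context to the example messages discussed above."""
    candidates = []
    if hour < 12:
        candidates.append("good morning")
    if weekday == 4:                          # Monday == 0, so 4 is Friday
        candidates.append("TGIF")
    if city == "San Francisco":               # stand-in for "new location" check
        candidates.append("welcome to San Francisco")
    if idle_hours >= 2:
        candidates.append("walk me")
    fresh = [m for m in candidates if m not in recent]  # avoid recent repeats
    pool = fresh or candidates
    return random.choice(pool) if pool else None
```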
The activity monitoring device 1000 communicates with a user device 1020, which in turn communicates with a remote server 1050 via a network 1040. The user device 1020 may be a mobile device or any other type of computing device capable of performing the functionality described herein. In the illustrated embodiment, the user device 1020 includes a calendar module 1022 that is configured to maintain a personal calendar of the user. The calendar may be synchronized to a cloud-based calendar service 1076, which may be accessed via a calendar API 1078. Additionally, the user device 1020 includes a GPS module 1024 that is configured to determine a geo-location of the user device 1020.
The user device 1020 includes an application 1026, which may be a browser or a dedicated application that is configured to interface with the activity monitoring device 1000 as well as the server 1050. The application 1026 defines a graphical user interface 1028 through which the user may control the operation of the application 1026. The application 1026 further defines an activity analyzer 1030 which is configured to analyze activity data received from the activity monitoring device 1000. A synchronization handler 1032 is configured to handle synchronization operations between the activity monitoring device, the user device 1020, and cloud-based data storage accessed via the server 1050. For example, the sync handler 1032 may communicate with the activity data update module 1006 defined by the sync logic 1002 of the activity monitoring device 1000 in order to facilitate uploading of activity data from the activity monitoring device 1000 to the user device 1020. The uploaded activity data may be further processed by the activity analyzer 1030, and/or may be transmitted via network 1040 to an activity data manager 1052 of the server 1050, for storage in a cloud-based activity data storage 1054. An activity analyzer 1056 of the cloud-based system is configured to analyze the activity data stored in the activity data storage 1054, and may generate additional activity data that are also stored in the activity data storage 1054.
The server includes a motion-activated message synchronization module 1058 that is configured to select messages and download them to the activity monitoring device 1000. The downloading of selected messages is mediated by the sync handler 1032 of the user device 1020, with which a message update module 1004 of the activity monitoring device 1000 communicates to receive message data for storage in the message data storage 1008. In other words, the message data is transferred from the server 1050 to the user device 1020, and the user device 1020 in turn transfers the message data to the activity monitoring device for storage. In this manner, the motion activated message data on the activity monitoring device is updated by the remote server 1050. It should be appreciated that the transfers of the message data may occur in immediate succession when the user device 1020 is simultaneously connected to both the server 1050 and the activity monitoring device 1000. However, when the user device 1020 is not connected to the activity monitoring device 1000, message data may be transferred by the server 1050 to the user device 1020 and temporarily stored at the user device 1020 until the activity monitoring device is connected to the user device 1020, at which time the message data may then be transferred to the activity monitoring device 1000.
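The store-and-forward behavior described above might be sketched as follows, with hypothetical class and method names; message data received from the server is queued at the user device until the activity monitoring device connects:

```python
from collections import deque

class MessageRelay:
    """Sketch of store-and-forward relaying at the user device."""

    def __init__(self):
        self.pending = deque()

    def receive_from_server(self, message_data):
        self.pending.append(message_data)    # temporarily stored at user device

    def on_tracker_connected(self, send_to_tracker):
        while self.pending:                  # forward in arrival order
            send_to_tracker(self.pending.popleft())
```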
In addition to transfers of message data defining specific motion activated messages to the activity monitoring device 1000, the motion activated message synchronization module 1058 of the server 1050 may additionally be configured to effect other changes to the message data stored at the activity monitoring device 1000. For example, commands or updates may be sent to the activity monitoring device 1000 to manage the message data stored in the message data storage 1008. Examples of such commands or updates include, without limitation, deletion of messages, modification of messages, changes to metadata associated with messages, etc.
In the illustrated embodiment, the message synchronization module 1058 includes a message selection engine 1060 that is configured to select one or more messages to be transferred to the activity monitoring device 1000. A plurality of messages that are available for selection are stored in a message data storage 1062. A message manager 1064 is provided for managing the messages stored in the message data storage 1062. In one embodiment, the message manager 1064 provides an interface whereby an editor may create new messages, or edit or delete existing messages.
The selection engine 1060 is configured to select messages based on a variety of factors. By way of example, selection engine 1060 can be configured to identify selected messages based on activity data that is associated with the activity monitoring device 1000, as stored in the activity data storage 1054. Such activity data can include various fitness metrics and other types of data which are determined based on the monitored activity of the activity monitoring device 1000. The selection engine 1060 may also select messages based on the user profile associated with a user of the activity monitoring device 1000. In the illustrated embodiment, the user profile may be defined in a profile data storage 1066. By way of example and not limitation, a user profile may define various pieces of information about a given user, such as the user's age, gender, residence, height, weight, preferences, etc. The user profile can include a historical activity profile of the user based on the user's activity data and fitness metrics. In this manner, different messages may be selected based on, for example, whether the user is historically more sedentary or more active.
It will be appreciated that activity data may be defined by values, levels, metrics, etc. of particular activities which are associated to specific times or time periods. The activity data which are recorded over time can therefore define an activity history for a given user. It is noted that the granularity of such time associations may vary in accordance with the specific activity being tracked or other considerations such as a predefined goal or a predefined threshold for defining or triggering a motion-activated message. As one example of an activity whose levels may be defined with varying time-associated granularity, consider that a user's step count might be determined on a per-minute, per-hour, per-day, per-week, or per-month basis. Furthermore, a predefined threshold might be defined so that a motion-activated message is defined or triggered (i.e. selected or cued for motion-activated display) when the user achieves a given number of steps in a given time period (e.g. x number of steps in a day). As another example, a user's heart rate may be monitored over the course of a given time period, and a corresponding motion-activated message can be defined or triggered based on the user's heart rate data. It should be understood that similar concepts may be applied for any other activity discussed herein.
Thus, in some embodiments, an activity history for a given user can define levels of activity that are associated to specific time periods, as determined from data recorded by a given activity tracking device to which the user is associated. Motion-activated messages can be defined and/or triggered based on the activity history of the user. In this manner, the motion-activated messages that are presented to the user are customized to the user's activity history, thereby providing a personalized experience to the user.
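As one concrete sketch of defining or triggering a message from time-associated activity levels (the goal value, lookback window, and wording are illustrative assumptions), per-day step counts might be evaluated as follows:

```python
STEP_GOAL_PER_DAY = 10000  # example predefined threshold

def message_for_step_history(steps_by_day):
    """Derive a motion-activated message from per-day step counts,
    ordered oldest to newest. Sketch only."""
    if not steps_by_day:
        return None
    today = steps_by_day[-1]
    if today >= STEP_GOAL_PER_DAY:
        return f"congrats: {today} steps today"
    if sum(steps_by_day[-3:]) < STEP_GOAL_PER_DAY:
        return "walk me"                 # sustained low activity
    return None                          # no history-based message cued

# Example: message_for_step_history([9500, 4000, 12000])
# returns a congratulation, since today's count exceeds the goal.
```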
In one embodiment, the message selection engine 1060 can be configured to select messages based on nutrition data which is stored in a nutrition data storage 1080. The nutrition data for a given user can include information about the user's nutrition, such as meal information including foods/drinks that the user has consumed and the times they were ingested, diet plan information, etc.
In one embodiment, the selection engine 1060 is configured to identify selected messages for transfer to the activity monitoring device 1000 based on current or predicted weather information. The weather information can be obtained from a weather service 1068, via an API 1070 according to which weather information is made available.
In one embodiment, the message selection engine 1060 is configured to identify selected messages based on social network data that is associated with the user of the activity monitoring device 1000. In the illustrated embodiment, the social network data can be obtained from a social network service provider 1072 via an API 1074. By way of example, social network data can include activity of the user on the social network, such as posts or comments, as well as information relating to the social graph of the user on the social network, such as identified friends of the user from the user social graph and their activity on the social network.
In one embodiment, the selection engine 1060 is configured to select messages based on calendar events that are associated with the user of the activity monitoring device 1000. In order to determine calendar events, a calendar service 1076 may be accessed via an API 1078. The calendar service 1076 is configured to maintain a calendar associated with the user that defines various events and their dates/times.
Though in the foregoing description, the activity monitoring device 1000 is shown to communicate with a user device 1020, which in turn communicates with the server 1050 over a network 1040, in other embodiments, some or all of the functionality defined by the user device 1020 may be included in or otherwise performed by the activity monitoring device 1000. Thus, in such embodiments, the activity monitoring device itself may communicate directly with the server 1050 over the network 1040 in order to perform data synchronization and other operations such as downloading selected messages to the activity monitoring device 1000, as has been described.
In one embodiment, the selection engine 1100 is configured to identify messages for selection based on demographic data associated with a user of the activity monitoring device, such as age, gender, ethnicity, height, weight, medical conditions, etc. For example, age-appropriate messages may be selected based on the user's age. In one embodiment, the selection engine 1100 is configured to select messages based on an identified location of the user. The location can be defined to varying degrees of specificity, such as by identifying a country, state, city, address, landmark, business, GPS coordinates, or other information which defines a location of the user. The location can be a current location of the user, or another location associated with the user, such as a residence or work address. By way of example, messages in the appropriate language may be provided based on location information. Messages may also reflect aspects of the locality of the user. For example, if the user is determined to be located near a park, then a message may be selected which encourages the user to go for a walk or run in the park.
In one embodiment, the selection engine 1100 is configured to select messages based on preferences or settings which are associated to the user. These may include, without limitation, fitness metric preferences such as which fitness metrics the user prefers, activity interests associated with the user, etc.
In one embodiment, the selection engine 1100 is configured to select messages based on fitness-related goals which may be user-defined goals or system-identified milestones. For example, a message may congratulate the user on achieving a goal, or encourage a user to perform activity in furtherance of a goal/milestone. Examples of goals include, without limitation, walking a certain number of steps, burning a certain number of calories, accruing a certain number of active minutes, climbing a certain height (possibly represented by an equivalent number of stairs/floors), etc.
In one embodiment, the selection engine 1100 is configured to select messages based on events stored in a calendar associated with the user. For example, such a message may be configured to remind the user about an upcoming calendar event, ask the user about a current or prior event, etc.
In one embodiment, the selection engine 1100 is configured to select messages based on activity or inactivity of the user, as detected by the activity monitoring device. For example, if the user has been very active recently, then a selected message may congratulate the user on the activity, encourage the user to get rest/sleep, or encourage the user to eat appropriately. If the user has been rather inactive recently, then a selected message may encourage the user to engage in physical fitness activity.
In one embodiment, the selection engine 1100 is configured to select messages based on the current or predicted weather. For example, a message may recommend clothing or accessories which are appropriate for the day's weather (e.g. hat/sunglasses for sunshine, umbrella for rain, gloves/scarf for cold weather), recommend activities based on the weather (e.g. “it's a nice day for a walk”), etc.
In one embodiment, the selection engine 1100 is configured to select messages based on the current season. For example, during particular holidays or seasons, messages may be selected which are indicative of those holidays or seasons (e.g. “happy labor day”; “spring is in the air”).
In one embodiment, the selection engine 1100 is configured to select messages based on occurring or upcoming events. For example, a message may ask or inform a user about an upcoming fitness-related event (e.g. a 10K run), or other types of events (e.g. concerts, sporting events, festivals, shows, etc.).
In one embodiment, the selection engine 1100 is configured to select messages based on the current date or time, day of the week, month, or other indicator of time.
In one embodiment, the selection engine 1100 is configured to select messages based on previously selected messages which have been transferred to the activity monitoring device. For example, messages which have been recently transferred may not be selected so as to avoid duplication.
In one embodiment, the selection engine 1100 is configured to select messages based on nutrition data associated with the user.
In one embodiment, the selection engine 1100 is configured to select messages based on data or activity of the user on a social network, or that of members of the user's social graph. For example, social activity of the user may indicate an interest in basketball, and a message relating to basketball may be selected.
In one embodiment, a message may be generated by a secondary user, to be presented as a motion-activated message on the primary user's activity monitoring device, as discussed in further detail below. Such a message may be prioritized for transmission to the primary user's activity monitoring device.
It will be appreciated that the foregoing examples of factors according to which messages may be selected for transmission to an activity monitoring device, are provided by way of example and not by way of limitation. In other embodiments, additional factors may be considered by a message selection engine in accordance with the principles described herein.
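To make the foregoing concrete, the following is a minimal sketch of one way such a selection engine might operate: a simple rule-based scorer that matches message tags against the user's current context and skips recently transferred messages to avoid duplication. The disclosure does not prescribe a specific algorithm; the `Message` and `UserContext` structures and the tag vocabulary here are hypothetical.

```python
# Hypothetical rule-based message selection: score candidates by how many
# context tags they match, skip duplicates, and pick among the best.
import random
from dataclasses import dataclass, field

@dataclass
class Message:
    text: str
    tags: set = field(default_factory=set)   # e.g. {"location:park", "goal:met"}

@dataclass
class UserContext:
    active_tags: set      # tags describing the user's current context
    recently_sent: set    # texts already transferred, to avoid duplication

def select_message(candidates, ctx):
    scored = []
    for m in candidates:
        if m.text in ctx.recently_sent:
            continue                                  # skip recent duplicates
        scored.append((len(m.tags & ctx.active_tags), m))
    if not scored:
        return None
    best = max(score for score, _ in scored)
    return random.choice([m for score, m in scored if score == best])

msgs = [Message("Nice day for a walk in the park!", {"location:park", "weather:sunny"}),
        Message("Congratulations on reaching your step goal!", {"goal:met"})]
ctx = UserContext(active_tags={"location:park", "weather:sunny"}, recently_sent=set())
print(select_message(msgs, ctx).text)
```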
The browser 1204 communicates over a network 1208 to a Web server 1210. The Web server 1210 includes an authentication module 1212 that is configured to authenticate the user 1200 to their account. A request handler 1214 is provided for servicing requests from the browser 1204. An activity module 1216 is provided for accessing and managing activity data associated with the user 1200 that is stored in the activity data storage 1220. Additionally, a social module 1218 is provided for accessing and managing social data associated with the user 1200 that is stored in a social data storage 1222. It should be appreciated that the user 1200 may have access to activity data as well as social data associated with members of the user's social graph.
The browser 1204 communicates with the Web server 1210 to provide an interface on the user device 1202 whereby the user 1200 can access their activity data and social data, and perhaps view related activity of their friends on their social network. In one embodiment, the browser may access and present a message generation interface 1206 which may be utilized by the user 1200 to generate a custom message for another user 1232. The message generation interface 1206 can be configured to allow the user 1200 to enter text or other types of messaging data to define a custom user-generated message 1234. The user-generated message 1234 is provided to an activity monitoring device 1230 associated with the user 1232 in accordance with embodiments previously described herein. For example, the activity monitoring device 1230 may communicate with a user device 1226 having a synchronization module 1228, which in turn communicates with a synchronization server 1224 to effect downloading of the user-generated message 1234 to the activity monitoring device 1230. Then, after the activity monitoring device 1230 enters a predefined stationary state, the user-generated message 1234 may be displayed when the device is subsequently moved from that state.
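One plausible server-side hand-off for such user-generated messages, consistent with the prioritization noted earlier, is sketched below. The queue structure and function names are hypothetical; the synchronization server is assumed to call `sync` when the recipient's device connects.

```python
# Hypothetical sketch: queue a user-generated message for delivery at the
# recipient device's next synchronization, prioritized ahead of
# system-selected messages.
from collections import defaultdict

pending = defaultdict(list)   # device_id -> list of (priority, message_text)

def queue_message(device_id, text, user_generated=True):
    priority = 0 if user_generated else 1   # lower value sorts first
    pending[device_id].append((priority, text))

def sync(device_id, limit=5):
    """Assumed to be invoked by the synchronization server when the device
    connects; returns up to `limit` messages, highest priority first."""
    queue = sorted(pending[device_id])
    taken, pending[device_id] = queue[:limit], queue[limit:]
    return [text for _, text in taken]

queue_message("device-1230", "Go get 'em today!")   # from user 1200
print(sync("device-1230"))                          # ["Go get 'em today!"]
```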
A recent activity module 1300 provides a graph illustrating recent activity recorded by the activity monitoring device of the user. As shown, various selectable options enable the user to view the number of steps they have recently taken, calories burned, or floors climbed. A steps module 1302 is configured to present a number of steps which have been taken by the user. It will be appreciated that the number of steps may be counted for a limited time period, such as the number of steps taken during the current day or current week. The steps module 1302 can also be configured to graphically illustrate the number of steps taken in relation to a goal or some predefined milestone or threshold, or a number of steps remaining to reach the goal or milestone.
A floors climbed module 1304 is configured to present a number of floors climbed within a given time period. It should be appreciated that the number of floors climbed is a conversion indicative of altitude changes that have been detected by the activity monitoring device. For example, if the activity monitoring device detects that the user has ascended a total of 100 feet, and each floor is defined as the equivalent of 10 feet, then the 100 feet of ascent will be represented as a climb of 10 floors. The floors module 1304 can also be configured to indicate the relationship between the number of floors climbed and a goal, as well as the number of floors remaining to reach the goal.
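The ascent-to-floors conversion described above is simple enough to express directly. The sketch below uses the 10-feet-per-floor equivalence from the worked example; that constant is an assumption of the example, not a fixed property of the device.

```python
FEET_PER_FLOOR = 10  # per the worked example above: one floor ≈ 10 feet

def floors_climbed(total_ascent_feet: float) -> int:
    """Convert cumulative detected ascent into an equivalent floor count."""
    return int(total_ascent_feet // FEET_PER_FLOOR)

assert floors_climbed(100) == 10   # the example in the text: 100 ft -> 10 floors
```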
A friends module 1306 is configured to display friends of the user from the user's social graph. Activity data associated with the user's friends can also be shown. In one embodiment, the user may access a message generation interface from the friends module 1306 to generate a custom user-defined message for display at the friend's activity monitoring device in response to movement from a stationary state, in accordance with the principles described herein. By way of example, clicking on or otherwise selecting a specific friend presented in the friends module 1306 may provide access to options, including an option to generate the custom motion-activated message.
A weight module 1308 is configured to present a weight of the user, a weight remaining to be lost by the user, or other weight-related information. A distance module 1310 is configured to present a distance traveled by the user, as measured based on data obtained from the user's activity monitoring device. A sleep module 1312 is configured to display an amount of sleep which the user has attained. A calories module 1314 is configured to display a number of calories burned by the user. A very active minutes module 1316 is configured to display the number of minutes of very active or vigorous activity by the user. A top badges module 1318 is configured to display badges, goals, or milestones which the user has achieved. It will be appreciated that any of the foregoing modules may be configured to display specific metrics in relation to predefined goals, or amounts remaining to achieve such goals.
Though various embodiments have been described with reference to motion-activated messages which consist of text, it should be appreciated that the concepts relating to motion-activated messages may also apply to other forms of information which may be displayed or otherwise rendered in response to detected movement of an activity monitoring device from a stable stationary position. For example, a motion-activated “message” may be defined by an image, video, audio, animation, haptic event, or any other type of information or communicative event that may be presented through an activity monitoring device in accordance with embodiments of the invention.
In the foregoing embodiments, messages have been displayed on an activity monitoring device in response to detected movement of the device from a stationary position defined by a predefined orientation. However, in other embodiments, a message may be configured to be displayed on the activity monitoring device in response to other types of changes which are detectable by the activity monitoring device.
Broadly speaking, the activity monitoring device can be configured to display a message in response to detecting a change from a non-user interactive state to a user interactive state. In embodiments presented above, the non-user interactive state can be defined by a stationary state in which non-movement of the device for a predefined period of time is detected. A change to a user-interactive state is detected when a movement of the device from the stationary state is detected.
In another embodiment, a change from a non-user interactive state to a user interactive state can include detecting a movement of the device from a first orientation to a second orientation, or detecting a predefined motion or movement of the device. For example, in an embodiment where the activity monitoring device is worn on a user's wrist, when the user maneuvers his/her arm to view the display on the activity monitoring device, such a movement or change in orientation can be detected, and in response, a message may be displayed. In another embodiment, the motion of the user flicking their wrist may be detected, and a message can be displayed in response.
The concept can be further extended to encompass different detected activity states, wherein the non-user interactive state might be defined by a particular detected physical activity state of the user (e.g. running, cycling, etc.). If the user moves or re-orients the device so as to be able to view its display, then such a movement or re-orientation may be detected, and a message displayed in response.
In another embodiment, the non-user interactive state can include detecting a charging state of the device, wherein the device is connected to a charger. The change to the user-interactive state can be defined by detecting a change of the device from the charging state to a non-charging state, wherein the device is disconnected from the charger.
In still other embodiments, the change from the non-user interactive state to the user-interactive state can be defined by detected actions, such as a button press or an interaction with a touchscreen (e.g. touch, swipe, gesture, etc.).
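A minimal sketch consolidating the state-change triggers enumerated in the preceding paragraphs is shown below. The dictionary keys are hypothetical stand-ins for sensor readings; how each value is derived (accelerometer, charging circuitry, button or touchscreen hardware) is left abstract.

```python
def is_user_interactive(prev: dict, now: dict) -> bool:
    """Return True when any detectable change from a non-user-interactive
    state to a user-interactive state has occurred (per the cases above)."""
    return (
        (prev["stationary"] and now["moved"])             # movement from rest
        or prev["orientation"] != now["orientation"]      # re-orientation to view display
        or (prev["charging"] and not now["charging"])     # disconnected from charger
        or now["button_pressed"]                          # button press
        or now["touch_input"]                             # touchscreen touch/swipe/gesture
    )

prev = {"stationary": True, "moved": False, "orientation": "face-down",
        "charging": False, "button_pressed": False, "touch_input": False}
now = dict(prev, stationary=False, moved=True, orientation="face-up")
assert is_user_interactive(prev, now)   # wrist raised -> display a message
```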
In another embodiment, a message may be displayed in response to a change in temperature detected at the activity monitoring device. Such a message may be displayed in response to a change in the temperature having a magnitude greater than or equal to a predefined threshold, either an increase or a decrease in temperature. In a related embodiment, a message may be displayed in response to detection by the activity monitoring device of a specific ambient temperature. In this manner, message display is triggered by reaching specific ambient temperatures.
In another embodiment, a message can be configured to be displayed in response to changes in ambient light levels. For example, a sudden change from very low ambient light levels to comparatively high ambient light levels may indicate that the activity monitoring device has been removed from the user's pocket or otherwise taken out of a dark location for viewing by the user. Hence, the activity monitoring device may be configured to display a message in response to detection of such a change in ambient light.
In another embodiment, the activity monitoring device may be configured to display a message in response to changes in ambient sound levels.
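The environmental triggers just described (a temperature change of at least a threshold magnitude, and a sudden dark-to-bright light change) can be sketched as simple threshold tests. The numeric values below are illustrative assumptions; the disclosure specifies only that a predefined threshold exists.

```python
TEMP_DELTA_THRESHOLD_C = 5.0       # assumed predefined magnitude (increase or decrease)
DARK_LUX = 10.0                    # assumed "very low ambient light" level
LIGHT_JUMP_THRESHOLD_LUX = 200.0   # assumed comparatively high light level

def environment_triggers_display(prev_temp_c, temp_c, prev_lux, lux):
    """True if a temperature change meets the threshold magnitude, or if a
    sudden change from very low to comparatively high ambient light suggests
    the device was taken out of a pocket or other dark location."""
    temp_changed = abs(temp_c - prev_temp_c) >= TEMP_DELTA_THRESHOLD_C
    light_jumped = prev_lux < DARK_LUX and lux >= LIGHT_JUMP_THRESHOLD_LUX
    return temp_changed or light_jumped
```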
The aforementioned methods and systems for displaying motion-activated messages on an activity monitoring device serve to improve the user experience of interacting with the activity monitoring device. By displaying messages in response to motion from a stationary state, the activity monitoring device can react to the user's intent to interact with or otherwise utilize the device, as indicated by the motion. Furthermore, by intelligently selecting messages for display in a manner that is customized for the individual user, the experience can be highly personalized, and thus may appear to imbue the activity monitoring device with an apparent voice or persona.
In one embodiment, the device collects one or more types of physiological and/or environmental data from embedded sensors and/or external devices and communicates or relays such metric information to other devices, including devices capable of serving as Internet-accessible data sources, thus permitting the collected data to be viewed, for example, using a web browser or network-based application. For example, while the user is wearing an activity tracking device, the device may calculate and store the user's step count using one or more sensors. The device then transmits data representative of the user's step count to an account on a web service, computer, mobile phone, or health station where the data may be stored, processed, and visualized by the user. Indeed, the device may measure or calculate a plurality of other physiological metrics in addition to, or in place of, the user's step count.
Some physiological metrics include, but are not limited to, energy expenditure (for example, calorie burn), floors climbed and/or descended, heart rate, heart rate variability, heart rate recovery, location and/or heading (for example, through GPS), elevation, ambulatory speed and/or distance traveled, swimming lap count, bicycle distance and/or speed, blood pressure, blood glucose, skin conduction, skin and/or body temperature, electromyography, electroencephalography, weight, body fat, caloric intake, nutritional intake from food, medication intake, sleep periods (i.e., clock time), sleep phases, sleep quality and/or duration, pH levels, hydration levels, and respiration rate. The device may also measure or calculate metrics related to the environment around the user such as barometric pressure, weather conditions (for example, temperature, humidity, pollen count, air quality, rain/snow conditions, wind speed), light exposure (for example, ambient light, UV light exposure, time and/or duration spent in darkness), noise exposure, radiation exposure, and magnetic field.
Still further, other metrics can include, without limitation, calories burned by a user (or energy expended), weight gained by a user, weight lost by a user, stairs ascended (e.g., climbed) by a user, stairs descended by a user, steps taken by a user during walking or running, a number of rotations of a bicycle pedal rotated by a user, sedentary activity data, driving a vehicle, a number of golf swings taken by a user, a number of forehands of a sport played by a user, a number of backhands of a sport played by a user, or a combination thereof. In some embodiments, sedentary activity data is referred to herein as inactive activity data or as passive activity data. In some embodiments, when a user is not sedentary and is not sleeping, the user is active. In some embodiments, a user may stand on a monitoring device that determines a physiological parameter of the user. For example, a user stands on a scale that measures a weight, a body fat percentage, a body mass index, or a combination thereof, of the user.
Furthermore, the device or the system collating the data streams may calculate metrics derived from this data. For example, the device or system may calculate the user's stress and/or relaxation levels through a combination of heart rate variability, skin conduction, noise pollution, and sleep quality. In another example, the device or system may determine the efficacy of a medical intervention (for example, medication) through the combination of medication intake, sleep and/or activity data. In yet another example, the device or system may determine the efficacy of an allergy medication through the combination of pollen data, medication intake, sleep and/or activity data. These examples are provided for illustration only and are not intended to be limiting or exhaustive.
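The disclosure names the inputs to such derived metrics but not a formula, so the following is an illustrative sketch only: a weighted combination of normalized inputs for the stress-level example, with entirely assumed weights and normalization ranges.

```python
def stress_score(hrv_ms, skin_conductance_us, noise_db, sleep_quality):
    """Combine normalized inputs into a 0..1 stress estimate (assumed model:
    lower HRV and sleep quality, higher conductance and noise -> more stress)."""
    hrv_term = max(0.0, 1.0 - hrv_ms / 100.0)             # HRV normalized to ~100 ms
    eda_term = min(1.0, skin_conductance_us / 20.0)       # conductance in microsiemens
    noise_term = min(1.0, max(0.0, (noise_db - 40.0) / 50.0))
    sleep_term = 1.0 - sleep_quality                      # sleep_quality in 0..1
    return 0.3 * hrv_term + 0.3 * eda_term + 0.2 * noise_term + 0.2 * sleep_term

print(round(stress_score(hrv_ms=60, skin_conductance_us=8,
                         noise_db=65, sleep_quality=0.7), 2))   # 0.4
```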
This information can be associated with the user's account, which can be managed by an activity management application on the server. The activity management application can provide access to the user's account and data saved thereon. The activity management application running on the server can be in the form of a web application. The web application can provide access to a number of website screens and pages that illustrate information regarding the metrics in various formats. This information can be viewed by the user, and synchronized with a computing device of the user, such as a smart phone.
In one embodiment, the data captured by the activity tracking device 100 is received by the computing device, and the data is synchronized with the activity management application on the server. In this example, data viewable on the computing device (e.g. smart phone) using an activity tracking application (app) can be synchronized with the data present on the server, and associated with the user's account. In this way, information entered into the activity tracking application on the computing device can be synchronized with the information illustrated in the various screens of the activity management application provided by the server on the website.
The user can therefore access the data associated with the user account using any device having access to the Internet. Data received by the network 176 can then be synchronized with the user's various devices, and analytics on the server can provide data analysis to produce recommendations for additional activity and/or improvements in physical health. The process therefore continues where data is captured, analyzed, synchronized, and recommendations are produced. In some embodiments, the captured data can be itemized and partitioned based on the type of activity being performed, and such information can be provided to the user on the website via graphical user interfaces, or by way of the application executed on the user's smart phone (by way of graphical user interfaces).
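The capture/synchronize/analyze/recommend cycle described above can be sketched as a simple loop. All names here are hypothetical stand-ins; real capture and analysis would be driven by the device's sensors and the server's analytics.

```python
def capture():
    """Stand-in for sensor-driven data capture on the tracking device."""
    return {"steps": 512, "floors": 3}

def synchronize(samples, account):
    """Stand-in for device -> computing device -> server synchronization."""
    account.setdefault("history", []).append(samples)

def recommend(account):
    """Stand-in for server-side analytics producing a recommendation."""
    total_steps = sum(s["steps"] for s in account["history"])
    if total_steps < 10_000:
        return "Take a walk to close the gap to your daily step goal."
    return "Goal reached - consider some recovery time."

account = {}
synchronize(capture(), account)   # one iteration of the ongoing cycle
print(recommend(account))
```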
In an embodiment, the sensor or sensors of a device 100 can determine or capture data to determine an amount of movement of the monitoring device over a period of time. The sensors can include, for example, an accelerometer, a magnetometer, a gyroscope, or combinations thereof. Broadly speaking, these are inertial sensors, which capture movement data in response to the device 100 being moved. The amount of movement (e.g., motion sensed) may occur when the user is performing an activity over the time period, such as climbing stairs, walking, running, etc. The monitoring device may be worn on a wrist, carried by a user, worn on clothing (using a clip) or placed in a pocket, attached to a leg, foot, chest, or waist, or integrated in an article of clothing such as a shirt, hat, pants, blouse, glasses, and the like. These examples do not limit the possible ways the sensors of the device can be associated with a user or thing being monitored.
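A common simplification for estimating an "amount of movement" from an accelerometer is to accumulate how far the acceleration magnitude deviates from 1 g over a time window. The sketch below uses that heuristic, which is an assumption rather than the specific method of the disclosure.

```python
import math

def movement_amount(accel_samples):
    """Sum the deviation of acceleration magnitude from 1 g (gravity) over a
    window of (x, y, z) samples, in g units: ~0 at rest, larger when moving."""
    total = 0.0
    for x, y, z in accel_samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        total += abs(magnitude - 1.0)
    return total

print(movement_amount([(0.0, 0.0, 1.0), (0.3, 0.1, 1.2)]))  # rest, then a jolt
```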
In other embodiments, a biological sensor can determine any number of physiological characteristics of a user. For example, the biological sensor may determine heart rate, a hydration level, body fat, bone density, fingerprint data, sweat rate, and/or a bioimpedance of the user. Examples of the biological sensors include, without limitation, a biometric sensor, a physiological parameter sensor, a pedometer, or a combination thereof.
In some embodiments, data associated with the user's activity can be monitored by the applications on the server and the user's device, and activity associated with the user's friends, acquaintances, or social network peers can also be shared, based on the user's authorization. This enables friends to compete regarding their fitness, achieve goals, receive badges for achieving goals, get reminders about such goals, earn rewards or discounts for achieving certain goals, etc.
As noted, an activity tracking device 100 can communicate with a computing device (e.g., a smartphone, a tablet computer, a desktop computer, or computer device having wireless communication access and/or access to the Internet). The computing device, in turn, can communicate over a network, such as the Internet or an Intranet to provide data synchronization. The network may be a wide area network, a local area network, or a combination thereof. The network may be coupled to one or more servers, one or more virtual machines, or a combination thereof. A server, a virtual machine, a controller of a monitoring device, or a controller of a computing device is sometimes referred to herein as a computing resource. Examples of a controller include a processor and a memory device.
In one embodiment, the processor may be a general purpose processor. In another embodiment, the processor can be a customized processor configured to run specific algorithms or operations. Such processors can include digital signal processors (DSPs), which are designed to interact with specific chips, signals, and wires, and to perform certain algorithms, processes, state diagrams, feedback, detection, execution, or the like. In some embodiments, a processor can include or be interfaced with an application specific integrated circuit (ASIC), a programmable logic device (PLD), a central processing unit (CPU), or a combination thereof.
In some embodiments, one or more chips, modules, devices, or logic can be defined to execute instructions or logic, which collectively can be viewed or characterized to be a processor. Therefore, it should be understood that a processor does not necessarily have to be one single chip or module, but can be defined from a collection of electronic or connecting components, logic, firmware, code, and combinations thereof.
Examples of a memory device include a random access memory (RAM) and a read-only memory (ROM). A memory device may be a Flash memory, a redundant array of disks (RAID), a hard disk, or a combination thereof.
Embodiments described in the present disclosure may be practiced with various computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. Several embodiments described in the present disclosure can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.
With the above embodiments in mind, it should be understood that a number of embodiments described in the present disclosure can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of various embodiments described in the present disclosure are useful machine operations. Several embodiments described in the present disclosure also relate to a device or an apparatus for performing these operations. The apparatus can be specially constructed for a purpose, or the apparatus can be a computer selectively activated or configured by a computer program stored in the computer. In particular, various machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
Various embodiments described in the present disclosure can also be embodied as computer-readable code on a non-transitory computer-readable medium. The computer-readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of the computer-readable medium include hard drives, network attached storage (NAS), ROM, RAM, compact disc-ROMs (CD-ROMs), CD-recordables (CD-Rs), CD-rewritables (CD-RWs), magnetic tapes, and other optical and non-optical data storage devices. The computer-readable medium can include computer-readable tangible media distributed over a network-coupled computer system so that the computer-readable code is stored and executed in a distributed fashion.
Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed in between operations, or operations may be performed in an order other than that shown, or operations may be adjusted so that they occur at slightly different times, or may be distributed in a system which allows the occurrence of the processing operations at various intervals associated with the processing.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the various embodiments described in the present disclosure are not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
This application is a continuation of U.S. patent application Ser. No. 15/583,959, filed May 1, 2017, titled “MOTION-ACTIVATED DISPLAY OF MESSAGES ON AN ACTIVITY MONITORING DEVICE,” which is a continuation of U.S. patent application Ser. No. 15/156,103, filed May 16, 2016, titled “MOTION-ACTIVATED DISPLAY OF MESSAGES ON AN ACTIVITY MONITORING DEVICE,” which is a continuation of U.S. patent application Ser. No. 14/580,808, filed Dec. 23, 2014, titled “MOTION-ACTIVATED DISPLAY OF MESSAGES ON AN ACTIVITY MONITORING DEVICE,” which is a continuation of U.S. patent application Ser. No. 14/271,389, filed on May 6, 2014, titled “MOTION-ACTIVATED DISPLAY OF MESSAGES ON AN ACTIVITY MONITORING DEVICE,” which is a continuation-in-part of U.S. patent application Ser. No. 13/959,714, filed on Aug. 5, 2013, titled “METHODS AND SYSTEMS FOR IDENTIFICATION OF EVENT DATA HAVING COMBINED ACTIVITY AND LOCATION INFORMATION OF PORTABLE MONITORING DEVICES,” which is a continuation-in-part of U.S. patent application Ser. No. 13/693,334, filed on Dec. 4, 2012, titled “PORTABLE MONITORING DEVICES AND METHODS OF OPERATING SAME,” and a continuation-in-part of U.S. patent application Ser. No. 13/759,485, filed on Feb. 5, 2013, titled “PORTABLE MONITORING DEVICES AND METHODS OF OPERATING SAME.” Each of U.S. patent application Ser. Nos. 13/693,334 and 13/759,485 is a divisional of U.S. patent application Ser. No. 13/667,229, filed on Nov. 2, 2012, titled “PORTABLE MONITORING DEVICES AND METHODS OF OPERATING SAME,” which is a divisional of U.S. patent application Ser. No. 13/469,027, filed on May 10, 2012, titled “PORTABLE MONITORING DEVICES AND METHODS OF OPERATING SAME,” which is a divisional of U.S. patent application Ser. No. 13/246,843, filed on Sep. 27, 2011, titled “PORTABLE MONITORING DEVICES AND METHODS OF OPERATING SAME,” which is a divisional of U.S. patent application Ser. No. 13/156,304, filed on Jun. 8, 2011, titled “Portable Monitoring Devices and Methods of Operating Same,” which claims the benefit of and priority to U.S. Provisional Patent Application No. 61/388,595, filed on Sep. 30, 2010, titled “Portable Monitoring Devices and Methods of Operating Same,” and the benefit of and priority to U.S. Provisional Patent Application No. 61/390,811, filed on Oct. 7, 2010, titled “Portable Monitoring Devices and Methods of Operating Same.” Each of the above-referenced applications is hereby incorporated by reference in its entirety.