The present disclosure relates to systems and methods for capturing activity data over a period of time, and to methods and systems for navigating and displaying metric data on a display.
In recent years, interest in health and fitness has grown tremendously. The growth has occurred due to a better understanding of the benefits of good fitness to overall health and wellness. Unfortunately, although modern culture has brought about many new technologies, such as the Internet, connected devices, and computers, people have become less active. Additionally, many office jobs require people to sit in front of computer screens for long periods of time, which further reduces a person's activity levels. Furthermore, many of today's entertainment options involve viewing multimedia content, computer social networking, and other types of computer involved interfacing. Although such computer activity can be very productive as well as entertaining, such activity tends to reduce a person's overall physical activity.
To provide users concerned with health and fitness a way of measuring or accounting for their activity, or lack thereof, fitness trackers are often used. Fitness trackers measure activity such as walking, motion, running, sleeping, being inactive, bicycling, exercising on an elliptical trainer, and the like. Usually, the data collected by such fitness trackers can be transferred to and viewed on a computing device. However, such data is often provided as a basic accumulation of activity data, presented through complicated or confusing interfaces.
It is in this context that embodiments described herein arise.
Embodiments described in the present disclosure provide systems, apparatus, computer readable media, and methods for analyzing tracked activity data and providing navigation screens and interfaces on a device used by a user. The activity tracking device includes one or more sensors for detecting when physical contact occurs on the activity tracking device and logic for providing a display action to the screen of the activity tracking device. The physical contact, in one embodiment, can be qualified as an input when the physical contact has a particular characteristic or pattern that is predefined. The characteristic can be that the contact is the result of one or more taps, e.g., physical contact imparted to the activity tracking device by a finger or hand of the user, or by an object held by the user.
In one embodiment, a method is provided. The method includes detecting a physical contact by a sensor of a device that is configured to display a plurality of metrics on a screen of the device, and examining the physical contact to determine if the physical contact qualifies as an input for the device. The method maintains the screen of the device in an off state for physical contact that does not qualify as the input, and activates the screen of the device to display a first metric when the examining determines that the physical contact qualifies as the input. The method is executed by a processor.
In another embodiment, a device configured for capture of activity data for a user is provided. The device includes a housing and a screen disposed on the housing to display a plurality of metrics which include metrics that characterize the activity captured over time. The device further includes a sensor disposed in the housing to capture physical contact upon the housing. A processor is included to process the physical contact to determine if the physical contact qualifies as an input. The processor enables the screen from an off state when the physical contact qualifies as the input. The screen is configured to display one or more of the plurality of metrics in accordance with a scroll order, and a first metric of the plurality of metrics is displayed in accordance with user configuration identifying that the first metric is to be displayed in response to the physical contact that qualifies as the input, as determined by the processor.
In another embodiment, a computer readable medium for storing program instructions executable by a processor is provided. The computer readable medium includes (a) program instructions for detecting a physical contact by a sensor of a device that is configured to display a plurality of metrics on a screen of the device; (b) program instructions for examining the physical contact to determine if the physical contact qualifies as an input for the device; (c) program instructions for maintaining the screen of the device in an off state for physical contact that does not qualify as the input; (d) program instructions for activating the screen of the device to display a first metric when the examining determines that the physical contact qualifies as the input; and (e) program instructions for detecting user input to transition from the first metric to a next metric in a scroll order, or to an off state when no user input is detected. If user input is received within a predetermined time after the off state, a process turns the screen on and displays the last displayed metric; if the user input is received after the predetermined time after the off state, a process turns the screen on and displays the first metric. The plurality of metrics includes a time of day metric and metrics representing activity data captured by the device when associated with a user, the activity data being of the user.
In some embodiments, the input is associated with qualified physical contact, and the user input is associated with one of qualified physical contact, or non-touch proximity input, or voice input, or button press input.
In still another embodiment, an activity tracking device is configured for capturing data or activity data for a user. The device includes a housing configured as a wearable wrist attachable structure or a structure that can accompany the user to capture the activity data. The device includes a screen disposed on the housing to display a plurality of metrics which include metrics that characterize the activity captured over time. The device has an accelerometer sensor disposed in the housing to capture physical contact upon the housing. The device further includes a processor to process the physical contact to determine if the physical contact qualifies as an input. The processor enables the screen from an off state when the physical contact qualifies as the input, and the screen is configured to display one or more of the plurality of metrics in accordance with a scroll order. A first metric of the plurality of metrics is displayed in accordance with user configuration identifying that the first metric is to be displayed in response to the physical contact that qualifies as the input, as determined by the processor. The physical contact captured by the accelerometer sensor is from sensing or detecting one or more taps having a predefined tap profile upon the housing by a finger, hand, or object. The user configuration identifying the first metric enables setting of a shortcut to any metric in the scroll order to be the first metric.
Other aspects will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of embodiments described in the present disclosure.
Various embodiments described in the present disclosure may best be understood by reference to the following description taken in conjunction with the accompanying drawings.
Embodiments described in the present disclosure provide systems, apparatus, computer readable media, and methods for analyzing tracked activity data and providing navigation screens and interfaces. Some embodiments are directed to providing navigation interfaces for an activity tracking device. The activity tracking device includes sensors for detecting when physical contact occurs on the activity tracking device and logic for providing a display action to the screen of the activity tracking device. The physical contact, in one embodiment, can be qualified as an input when the physical contact has a particular characteristic that is predefined. The characteristic can be that the contact is the result of one or more taps, e.g., physical contact imparted to the activity tracking device by a finger or hand of the user, or by an object held by the user.
In other embodiments, the input can be non-physical, such as proximity sensing input. The proximity sensing input can be processed by an infrared proximity sensor, a thermal sensor, etc. The input can also be by way of a button, voice input, gaze detected input, input processed in response to motion or motion profiles, etc.
It should be noted that there are many inventions described and illustrated herein. The present inventions are neither limited to any single aspect nor embodiment thereof, nor to any combinations and/or permutations of such aspects and/or embodiments. Moreover, each of the aspects of the present inventions, and/or embodiments thereof, may be employed alone or in combination with one or more of the other aspects of the present inventions and/or embodiments thereof. For the sake of brevity, many of those permutations and combinations will not be discussed separately herein.
Further, in the course of describing and illustrating the present inventions, various circuitry, architectures, structures, components, functions and/or elements, as well as combinations and/or permutations thereof, are set forth. It should be understood that circuitry, architectures, structures, components, functions and/or elements other than those specifically described and illustrated, are contemplated and are within the scope of the present inventions, as well as combinations and/or permutations thereof.
The environmental sensors 118 may be in the form of motion detecting sensors. In some embodiments, a motion sensor can be one or more of an accelerometer, or a gyroscope, or a rotary encoder, or a calorie measurement sensor, or a heat measurement sensor, or a moisture measurement sensor, or a displacement sensor, or an ultrasonic sensor, or a pedometer, or an altimeter, or a linear motion sensor, or an angular motion sensor, or a multi-axis motion sensor, or a combination thereof. The biometric sensors 116 can be defined to measure physiological characteristics of the user that is using the activity tracking device 100. The user interface 114 provides a way for communicating with the activity tracking device 100, in response to user interaction 104. The user interaction 104 can be in the form of physical contact (e.g., without limitation, tapping, sliding, rubbing, multiple taps, gestures, etc.).
In some embodiments, the user interface 114 is configured to receive user interaction 104 that is in the form of noncontact input. The noncontact input can be by way of proximity sensors, button presses, touch sensitive screen inputs, graphical user interface inputs, voice inputs, sound inputs, etc. The activity tracking device 100 can communicate with a client and/or server 112 using the wireless transceiver 110. The wireless transceiver 110 will allow the activity tracking device 100 to communicate using a wireless connection, which is enabled by wireless communication logic. The wireless communication logic can be in the form of a circuit having radio communication capabilities. The radio communication capabilities can be in the form of a Wi-Fi connection, a Bluetooth connection, a low-energy Bluetooth connection, or any other form of wireless tethering or near field communication. In still other embodiments, the activity tracking device 100 can communicate with other computing devices using a wired connection (not shown). As mentioned, the environmental sensors 118 can detect motion of the activity tracking device 100.
The motion can be activity of the user, such as walking, running, stair climbing, etc. The motion can also be in the form of physical contact received on any surface of the activity tracking device 100, so long as the environmental sensors 118 can detect such motion from the physical contact. As will be explained in more detail below, the physical contact may be in the form of a tap or multiple taps by a finger upon the housing of the activity tracking device 100.
In other embodiments, the device components 102 are positioned substantially in a central position of the wrist attachable device, such as under or proximate to a location where a display screen 122 is located. In the illustrated example, the housing 130 also includes a button 126. The button 126 can be pressed to activate the display screen 122, navigate to various metrics displayed on the screen 122, or turn off the screen 122.
As shown in
Some motions will produce and quantify various types of metrics, such as step count, stairs climbed, distance traveled, very active minutes, calories burned, etc. The physical contact logic 142 can include logic that calculates or determines when particular physical contact can qualify as an input. To qualify as an input, the physical contact detected by sensors 156 should have a particular pattern that is identifiable as input. For example, the input may be predefined to be a double tap input, and the physical contact logic 142 can analyze the motion to determine if a double tap indeed occurred in response to analyzing the sensor data produced by sensors 156.
In other embodiments, the physical contact logic can be programmed to determine when particular physical contacts occurred, the time in between the physical contacts, and whether the one or more physical contacts will qualify within predefined motion profiles that would indicate that an input is desired. If physical contact occurs that is not within some predefined profile or pattern, the physical contact logic will not indicate or qualify that physical contact as an input.
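As a concrete illustration of such qualification logic, the following is a minimal Python sketch, assuming timestamped accelerometer spike events; the Spike type, thresholds, and timing windows are illustrative assumptions rather than the disclosed implementation.

```python
# Illustrative sketch only (not the disclosed firmware): qualifying a
# double tap from timestamped accelerometer "spike" events. Thresholds
# and window lengths are assumptions chosen for readability.
from dataclasses import dataclass

@dataclass
class Spike:
    timestamp_ms: int   # when the acceleration spike was sensed
    magnitude_g: float  # peak acceleration of the spike, in g

def qualifies_as_double_tap(spikes: list[Spike],
                            min_magnitude_g: float = 1.5,
                            min_gap_ms: int = 40,
                            max_gap_ms: int = 400) -> bool:
    """Return True if two strong spikes occur within a predefined
    inter-tap window, i.e., the contact matches the tap profile."""
    strong = [s for s in spikes if s.magnitude_g >= min_magnitude_g]
    for first, second in zip(strong, strong[1:]):
        gap = second.timestamp_ms - first.timestamp_ms
        if min_gap_ms <= gap <= max_gap_ms:
            return True   # contact qualifies as an input
    return False          # incidental contact: screen stays off

# Example: two taps 180 ms apart qualify; a single bump does not.
assert qualifies_as_double_tap([Spike(1000, 2.1), Spike(1180, 1.9)])
assert not qualifies_as_double_tap([Spike(1000, 2.1)])
```

Any contact sequence that falls outside the configured magnitude or timing windows is simply ignored, which is what keeps incidental bumps from waking the screen.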
The display interface logic 144 is configured to interface with the processor and the physical contact logic to determine when specific metric data will be displayed on the display screen 122 of the activity tracking device 100. The display interface logic 144 can act to turn on the screen, display metric information, display characters or alphanumeric information, display graphical user interface graphics, or combinations thereof. Alarm management logic 146 can function to provide a user interface and settings for managing and receiving input from a user to set an alarm. The alarm management logic can interface with a timekeeping module (e.g., clock, calendar, time zone, etc.), and can trigger the activation of an alarm. The alarm can be in the form of an audible alarm or a non-audible alarm.
A non-audible alarm can provide such alarm by way of a vibration. The vibration can be produced by a motor integrated in the activity tracking device 100. The vibration can be defined to include various vibration patterns, intensities, and custom set patterns. The vibration produced by the motor or motors of the activity tracking device 100 can be managed by the alarm management logic 146 in conjunction with processing by the processor 106. The wireless communication logic 148 is configured for communication of the activity tracking device with another computing device by way of a wireless signal. The wireless signal can be in the form of a radio signal. As noted above, the radio signal can be in the form of a Wi-Fi signal, a Bluetooth signal, a low energy Bluetooth signal, or combinations thereof. The wireless communication logic can interface with the processor 106, storage 108 and battery 154 of device 100, for transferring activity data, which may be in the form of motion data or processed motion data, stored in the storage 108 to the computing device.
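As an illustration of how such a non-audible alarm pattern might be expressed, the following is a short Python sketch; the Motor stub and the on/off pattern encoding are assumptions for illustration, not the device's actual motor driver.

```python
# Illustrative sketch: a vibration pattern expressed as alternating
# on/off durations (ms) driven to a motor. Motor is a hypothetical
# stub standing in for the device's actual motor interface.
import time

class Motor:
    def on(self) -> None:
        print("motor on")

    def off(self) -> None:
        print("motor off")

def play_vibration(motor: Motor, pattern_ms: list[int]) -> None:
    """Play [on, off, on, off, ...] durations in milliseconds."""
    for i, duration in enumerate(pattern_ms):
        (motor.on if i % 2 == 0 else motor.off)()
        time.sleep(duration / 1000.0)
    motor.off()  # always leave the motor off

# A custom "short-short-long" wake-up pattern.
play_vibration(Motor(), [100, 100, 100, 100, 400])
```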
In one embodiment, processor 106 functions in conjunction with the various logic components 140, 142, 144, 146, and 148. The processor 106 can, in one embodiment, provide the functionality of any one or all of the logic components. In other embodiments, multiple chips can be used to separate the processing performed by any one of the logic components and the processor 106. Sensors 156 can communicate via a bus with the processor 106 and/or the logic components. The storage 108 is also in communication with the bus for providing storage of the motion data processed or tracked by the activity tracking device 100. Battery 154 provides power to the activity tracking device 100.
In one embodiment, remote device 200 communicates with activity tracking device 100 over a Bluetooth connection. In one embodiment, the Bluetooth connection is a low energy Bluetooth connection (e.g., Bluetooth LE, BLE, or Bluetooth Smart). Low energy Bluetooth is configured for providing low power consumption relative to standard Bluetooth circuitry. Low energy Bluetooth uses, in one embodiment, a 2.4 GHz radio frequency, which allows for dual mode devices to share a single radio antenna. In one embodiment, low energy Bluetooth connections can function at distances up to 50 meters, with over the air data rates ranging between 1-3 megabits (Mb) per second. In one embodiment, a proximity distance for communication can be defined by the particular wireless link, and is not tied to any specific standard. It should be understood that the proximity distance limitation will change in accordance with changes to existing standards and in view of future standards and/or circuitry and capabilities.
Remote device 200 can also communicate with the Internet 160 using an Internet connection. The Internet connection of the remote device 200 can include cellular connections, wireless connections such as Wi-Fi, and combinations thereof (such as connections to switches between different types of connection links). The remote device, as mentioned above, can be a smartphone or tablet computer, or any other type of computing device having access to the Internet and with capabilities for communicating with the activity tracking device 100.
A server 220 is also provided, which is interfaced with the Internet 160. The server 220 can include a number of applications that service the activity tracking device 100, and the associated users of the activity tracking device 100 by way of user accounts. For example, the server 220 can include an activity management application 224. The activity management application 224 can include logic for providing access to various devices 100, which are associated with user accounts managed by server 220. Server 220 can include storage 226 that includes various user profiles associated with the various user accounts. The user account 228a for user A and the user account 228n for user N are shown to include various information.
The information can include, without limitation, data associated with a display scroll order 230, user data, etc. As will be described in greater detail below, the display scroll order 230 includes information regarding a user's preferences, settings, and configurations, which are settable by the user or set by default at the server 220 when accessing a respective user account. The storage 226 will include any number of user profiles, depending on the number of registered users having user accounts for their respective activity tracking devices. It should also be noted that a single user account can have various or multiple devices associated therewith, and the multiple devices can be individually customized, managed and accessed by a user. In one embodiment, the server 220 provides access to a user to view the user data 232 associated with the activity tracking device.
The data viewable by the user includes the tracked motion data, which is processed to identify a plurality of metrics associated with the motion data. The metrics are shown in various graphical user interfaces of a website enabled by the server 220. The website can include various pages with graphical user interfaces for rendering and displaying the various metrics for view by the user associated with the user account. In one embodiment, the website can also include interfaces that allow for data entry and configuration by the user.
The configurations can include defining which metrics will be displayed on the activity tracking device 100. In addition, the configurations can include identification of which metric will be a first metric to be displayed on the activity tracking device. The first metric to be displayed by the activity tracking device can be in response to a user input at the activity tracking device 100. As noted above, the user input can be by way of physical contact. The physical contact is qualified by the processor and/or logic of the activity tracking device 100 to determine if the physical contact should be treated as an input. The input can trigger or cause the display screen of the activity tracking device 100 to be turned on to display a specific metric that is selected by the user as the first metric to display. In another embodiment, the first metric displayed in response to the input can be predefined by the system as a default.
The configuration provided by the user by way of the server 220 and the activity management application 224 can also be provided by way of the activity tracking application 202 of the computing device 200. For example, the activity tracking application 202 can include a plurality of screens that also display metrics associated with the captured motion data of the activity tracking device 100. The activity tracking application 202 can also allow for user input and configuration at various graphical user interface screens to set and define which input will produce display of the first metric. In other embodiments, in addition to identifying the first metric to be displayed in response to the input, which may be physical contact, the configuration can allow an ordering of which metrics will be displayed in a specific scroll order.
In another embodiment, the scroll order of the metrics is predefined. In some embodiments, the input provided by the user by way of the physical contact can be pre-assigned to a specific metric in the scroll order. For example, the scroll order can remain the same, while the input can allow the screen to jump to a specific entry in the scroll order. Jumping to a specific entry can be viewed as a shortcut to a specific entry that is desired to be seen first by the user upon providing physical contact or input to the device 100.
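For illustration, the following is a minimal Python sketch of this shortcut behavior under the assumption of a fixed scroll order; the metric names and the MetricScroller class are hypothetical examples, not the disclosed firmware.

```python
# Illustrative sketch: a fixed scroll order in which a qualified tap
# acts as a shortcut to one user-chosen entry, while button presses
# advance through the order. Metric names are examples only.
SCROLL_ORDER = ["clock", "steps", "distance", "calories",
                "floors", "active_minutes", "alarm"]

class MetricScroller:
    def __init__(self, scroll_order: list[str], shortcut_metric: str):
        self.order = scroll_order
        self.index = 0
        # User configuration: which metric the shortcut input jumps to.
        self.shortcut_index = scroll_order.index(shortcut_metric)

    def on_shortcut_input(self) -> str:
        """A qualified tap jumps straight to the configured metric."""
        self.index = self.shortcut_index
        return self.order[self.index]

    def on_next_input(self) -> str:
        """A button press advances through the scroll order, wrapping."""
        self.index = (self.index + 1) % len(self.order)
        return self.order[self.index]

s = MetricScroller(SCROLL_ORDER, shortcut_metric="steps")
assert s.on_shortcut_input() == "steps"   # tap: jump to main goal
assert s.on_next_input() == "distance"    # button: next in order
```

The point of the design is that the shortcut changes only the entry point; the order itself, and therefore subsequent button navigation, is unaffected.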
In mode 1, the display screen 122 is off 300 and upon a button press (e.g., of button 126 in
In mode 2, if it is determined that a button of the activity tracking device 100 had been pressed and held for a predetermined period of time (e.g., 1 second), then the transition would be directly to a timer metric 308. The timer metric 308 operates a stopwatch function, which first shows a graphic of a stopwatch and then automatically transitions to the time kept by the stopwatch function. If the user desires to transition from the timer metric 308 to one of the other metrics in the scroll order shown in
In mode 3, if it is determined that a double tap was detected on the surface of the activity tracking device 100 by a sensor, the display screen 122 will go from being on 302 to displaying a predetermined first metric. In this example, the predetermined first metric is a main goal 330 of the user, and is shown to be a step count metric 312. As shown, the display screen 122 will transition 302b in a direction 303a, which exposes an icon or graphic associated with the main goal 330, which in this case is steps. The steps are shown as a feet icon. The display screen 122, in one embodiment, also transitions from the feet icon to the numerical value of the steps taken by the user utilizing the activity tracking device 100. If the user wishes to transition and view the other metrics in the scroll order, such as the distance metric, calories burned metric, floors metric, very active minutes metric, and alarm metric, the user can transition by pressing button 126 on the device 100. Again, transitioning downward (or through a list in any direction) is shown by the downward facing arrows, which are activated in response to a button press (or other types of inputs). These example transitions allow for display of other screens/data concerning metrics 308, 310, 312, 314, 316, 318, 320, 322, etc. It should be understood that additional metrics can be added to the scroll order, certain metrics can be deleted from the scroll order, the scroll order can be re-arranged, and these customizations can be made in response to user configurations or system configurations or default configurations.
After a predetermined period of time in which no input is received by the activity tracking device 100, the display screen 122 will transition to an off state 300. The transition, in one embodiment, allows for the display screen 122 to transition off 302c in direction 303b. Therefore, the display screen 122 will move to the off state 300, where battery consumption is reduced. In one embodiment, the transition to the off state 300 will occur after about 6 seconds. It should be understood that this predefined period of time can be modified for the specific configuration and should not be limited to the specific example. As will be described below with reference to
If additional input is received in operation 394, the display screen will scroll to the next metric. This will continue as the user is allowed to select the next metric in the list or scroll order. In one embodiment, the scroll order can wrap around and continue to display metrics. If no input is received for a period of time, the display screen will turn off in operation 395. In operation 396, it is determined whether input is received within a predetermined amount of time after the screen was turned off. For example, if input, such as a button press, is received within 3 seconds of the screen turning off, the screen will turn back on and display the last metric that had been displayed.
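The display timeout behavior described above can be summarized as a small state machine. The following Python sketch is illustrative only, using the roughly 6-second idle timeout and 3-second resume window given as examples in this description; the class and attribute names are hypothetical.

```python
# Illustrative sketch of the timeout behavior described above: the
# display sleeps after an idle period; input arriving soon after sleep
# resumes the last metric, while a later input restarts at the first
# metric. Timing constants follow the examples in the text.
IDLE_TIMEOUT_S = 6.0    # screen turns off after ~6 s without input
RESUME_WINDOW_S = 3.0   # input within 3 s of sleep resumes last metric

class ScreenStateMachine:
    def __init__(self, first_metric: str):
        self.first_metric = first_metric
        self.last_metric = first_metric
        self.screen_on = False
        self.off_since: float | None = None

    def tick(self, now: float, last_input: float) -> None:
        """Turn the screen off once the idle timeout elapses."""
        if self.screen_on and now - last_input >= IDLE_TIMEOUT_S:
            self.screen_on = False
            self.off_since = now

    def on_input(self, now: float) -> str:
        """Wake the screen; pick last vs. first metric by elapsed time."""
        if not self.screen_on:
            recent = (self.off_since is not None
                      and now - self.off_since <= RESUME_WINDOW_S)
            self.screen_on = True
            if not recent:
                self.last_metric = self.first_metric
        return self.last_metric

sm = ScreenStateMachine("clock")
sm.screen_on, sm.last_metric = True, "steps"
sm.tick(now=10.0, last_input=3.0)        # idle > 6 s: screen off
assert sm.on_input(now=11.0) == "steps"  # within 3 s: last metric
```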
For instance, in
The settings being configured in this example include settings associated with the scroll order and the definition of the first metric. In this example, the activity tracking application 202 will allow the user to log in to his or her user account and access and identify device A, which is the activity tracking device 100. Using screens and menus provided by the activity tracking application, the user is able to identify the display scroll order settings in the screen 200a of the device 100. In this example, the user has decided to set the home screen metric (e.g., the first metric) as time or clock. Optionally, the user may then select the scroll order of the various screens or GUIs (e.g., screens 200a-200h) to be traversed. In this example, the user has selected the step count metric to follow, then the stairs metric, then the calories burned metric, and then the distance metric. The user may also be prompted, or can elect, to edit, remove or add additional metrics to the scroll order setting. Once the user approves of the settings, the user can save the settings to the user profile. Saving the settings to the user profile can act to update the settings to the user display scroll order configuration 230 in the user account (user A). This configuration setting is synchronized with the server 220 and then transferred to the device 100 by way of a wireless connection over a predefined proximity distance 404. As noted above, in one embodiment, communication between the device 100 and the remote device 200 (computing device) is by way of a wireless link. The wireless link may be, for example, Bluetooth radio communication, and in one embodiment, low-energy Bluetooth radio communication.
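For illustration, the configuration chosen in this example could be represented by a payload such as the following Python sketch before synchronization with the server; all field names are hypothetical, as the disclosure does not specify a wire format.

```python
# Illustrative sketch: the display scroll order configuration from this
# example, expressed as a payload that could be saved to the user
# profile and synchronized to the device. Field names are hypothetical.
import json

display_scroll_order_config = {
    "device_id": "device-a",           # the user's tracking device
    "home_screen_metric": "clock",     # first metric shown on input
    "scroll_order": ["clock", "steps", "stairs", "calories", "distance"],
}

payload = json.dumps(display_scroll_order_config)
print(payload)  # saved to the server, then relayed to the device
```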
In some embodiments, a device is provided. The device is defined in the form of a wearable wrist attachable structure. In one embodiment, the device has a housing that is at least partially constructed or formed from a plastic material. In one embodiment, the housing of the device includes an altimeter. The device can further include a transiently visible display, a dead-front display, a touch screen display, a monochrome display, a digital display, a color display, or combinations thereof. In yet another embodiment, the device can include one or more accelerometers. In one specific example, the device can include a 3-axis accelerometer. In still another embodiment, a 3-axis accelerometer can be replaced with, or replicated by, use of separate accelerometers (e.g., 3 accelerometers) positioned orthogonally to each other.
In one embodiment, the device collects one or more types of physiological and/or environmental data from embedded sensors and/or external devices and communicates or relays such metric information to other devices, including devices capable of serving as Internet-accessible data sources, thus permitting the collected data to be viewed, for example, using a web browser or network-based application. For example, while the user is wearing an activity tracking device, the device may calculate and store the user's step count using one or more sensors. The device then transmits data representative of the user's step count to an account on a web service, computer, mobile phone, or health station where the data may be stored, processed, and visualized by the user. Indeed, the device may measure or calculate a plurality of other physiological metrics in addition to, or in place of, the user's step count.
Some physiological metrics include, but are not limited to, energy expenditure (for example, calorie burn), floors climbed and/or descended, heart rate, heart rate variability, heart rate recovery, location and/or heading (for example, through GPS), elevation, ambulatory speed and/or distance traveled, swimming lap count, bicycle distance and/or speed, blood pressure, blood glucose, skin conduction, skin and/or body temperature, electromyography, electroencephalography, weight, body fat, caloric intake, nutritional intake from food, medication intake, sleep periods (i.e., clock time), sleep phases, sleep quality and/or duration, pH levels, hydration levels, and respiration rate. The device may also measure or calculate metrics related to the environment around the user such as barometric pressure, weather conditions (for example, temperature, humidity, pollen count, air quality, rain/snow conditions, wind speed), light exposure (for example, ambient light, UV light exposure, time and/or duration spent in darkness), noise exposure, radiation exposure, and magnetic field.
Still further, other metrics can include, without limitation, calories burned by a user, weight gained by a user, weight lost by a user, stairs ascended, e.g., climbed, etc., by a user, stairs descended by a user, steps taken by a user during walking or running, a number of rotations of a bicycle pedal rotated by a user, sedentary activity data, driving a vehicle, a number of golf swings taken by a user, a number of forehands of a sport played by a user, a number of backhands of a sport played by a user, or a combination thereof. In some embodiments, sedentary activity data is referred to herein as inactive activity data or as passive activity data. In some embodiments, when a user is not sedentary and is not sleeping, the user is active. In some embodiments, a user may stand on a monitoring device that determines a physiological parameter of the user. For example, a user stands on a scale that measures a weight, a body fat percentage, a body mass index, or a combination thereof, of the user.
Furthermore, the device or the system collating the data streams may calculate metrics derived from this data. For example, the device or system may calculate the user's stress and/or relaxation levels through a combination of heart rate variability, skin conduction, noise pollution, and sleep quality. In another example, the device or system may determine the efficacy of a medical intervention (for example, medication) through the combination of medication intake, sleep and/or activity data. In yet another example, the device or system may determine the efficacy of an allergy medication through the combination of pollen data, medication intake, sleep and/or activity data. These examples are provided for illustration only and are not intended to be limiting or exhaustive.
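Purely as a toy illustration of such a derived metric, the following sketch linearly combines normalized inputs into a "stress" score; the weights and the linear form are arbitrary assumptions for illustration, not a disclosed or validated model.

```python
# Toy sketch only: combining several normalized inputs into a derived
# "stress" score, as the passage suggests. Weights are illustrative.
def stress_score(hrv_norm: float, skin_conduct_norm: float,
                 noise_norm: float, sleep_quality_norm: float) -> float:
    """All inputs normalized to [0, 1]; higher result = more stress.
    Lower HRV and lower sleep quality both push the score up."""
    return (0.4 * (1.0 - hrv_norm)
            + 0.25 * skin_conduct_norm
            + 0.15 * noise_norm
            + 0.2 * (1.0 - sleep_quality_norm))

print(round(stress_score(0.3, 0.6, 0.4, 0.5), 3))  # 0.59
```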
This information can be associated with the user's account, which can be managed by an activity management application on the server. The activity management application can provide access to the user's account and data saved thereon. The activity management application running on the server can be in the form of a web application. The web application can provide access to a number of website screens and pages that illustrate information regarding the metrics in various formats. This information can be viewed by the user, and synchronized with a computing device of the user, such as a smart phone.
In one embodiment, the data captured by the activity tracking device 100 is received by the computing device, and the data is synchronized with the activity management application on the server. In this example, data viewable on the computing device (e.g., smart phone) using an activity tracking application (app) can be synchronized with the data present on the server, and associated with the user's account. In this way, information entered into the activity tracking application on the computing device can be synchronized with the data illustrated in the various screens of the activity management application provided by the server on the website.
The user can therefore access the data associated with the user account using any device having access to the Internet. Data received by the network 176 can then be synchronized with the user's various devices, and analytics on the server can provide data analysis to provide recommendations for additional activity and/or improvements in physical health. The process therefore continues where data is captured, analyzed, synchronized, and recommendations are produced. In some embodiments, the captured data can be itemized and partitioned based on the type of activity being performed, and such information can be provided to the user on the website via graphical user interfaces, or by way of the application executed on the user's smart phone (by way of graphical user interfaces).
In an embodiment, the sensor or sensors of a device 100 can capture data used to determine an amount of movement of the monitoring device over a period of time. The sensors can include, for example, an accelerometer, a magnetometer, a gyroscope, or combinations thereof. Broadly speaking, these sensors are inertial sensors, which capture some movement data in response to the device 100 being moved. The amount of movement (e.g., motion sensed) may occur when the user is performing an activity such as climbing stairs over the time period, walking, running, etc. The monitoring device may be worn on a wrist, carried by a user, worn on clothing (using a clip, or placed in a pocket), attached to a leg or foot, attached to the user's chest or waist, or integrated in an article of clothing such as a shirt, hat, pants, blouse, glasses, and the like. These examples are not limiting as to all the possible ways the sensors of the device can be associated with a user or thing being monitored.
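As a simple illustration of quantifying an amount of movement from inertial data, the following Python sketch accumulates the deviation of the 3-axis acceleration magnitude from gravity over a window of samples; this heuristic is an assumption for illustration, not the device's disclosed algorithm.

```python
# Illustrative sketch: estimating "amount of movement" over a time
# window from 3-axis accelerometer samples (in units of g) by summing
# how far each sample's magnitude deviates from 1 g at rest.
import math

def movement_amount(samples: list[tuple[float, float, float]]) -> float:
    """Sum of |magnitude - 1 g| across samples (units of g)."""
    return sum(abs(math.sqrt(x * x + y * y + z * z) - 1.0)
               for x, y, z in samples)

resting = [(0.0, 0.0, 1.0)] * 10            # at rest: ~1 g on z-axis
walking = [(0.2, 0.1, 1.3), (0.3, -0.2, 0.7)] * 5
assert movement_amount(resting) < movement_amount(walking)
```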
In other embodiments, a biological sensor can determine any number of physiological characteristics of a user. As another example, the biological sensor may determine heart rate, a hydration level, body fat, bone density, fingerprint data, sweat rate, and/or a bio impedance of the user. Examples of the biological sensors include, without limitation, a biometric sensor, a physiological parameter sensor, a pedometer, or a combination thereof.
In some embodiments, data associated with the user's activity can be monitored by the applications on the server and the user's device, and activity associated with the user's friends, acquaintances, or social network peers can also be shared, based on the user's authorization. This provides the ability for friends to compete regarding their fitness, achieve goals, receive badges for achieving goals, get reminders for achieving such goals, receive rewards or discounts for achieving certain goals, etc.
As noted, an activity tracking device 100 can communicate with a computing device (e.g., a smartphone, a tablet computer, a desktop computer, or computer device having wireless communication access and/or access to the Internet). The computing device, in turn, can communicate over a network, such as the Internet or an Intranet to provide data synchronization. The network may be a wide area network, a local area network, or a combination thereof. The network may be coupled to one or more servers, one or more virtual machines, or a combination thereof. A server, a virtual machine, a controller of a monitoring device, or a controller of a computing device is sometimes referred to herein as a computing resource. Examples of a controller include a processor and a memory device.
In one embodiment, the processor may be a general purpose processor. In another embodiment, the processor can be a customized processor configured to run specific algorithms or operations. Such processors can include digital signal processors (DSPs), which are designed to execute or interact with specific chips, signals, wires, and perform certain algorithms, processes, state diagrams, feedback, detection, execution, or the like. In some embodiments, a processor can include or be interfaced with an application specific integrated circuit (ASIC), a programmable logic device (PLD), a central processing unit (CPU), or a combination thereof, etc.
In some embodiments, one or more chips, modules, devices, or logic can be defined to execute instructions or logic, which collectively can be viewed or characterized to be a processor. Therefore, it should be understood that a processor does not necessarily have to be one single chip or module, but can be defined from a collection of electronic or connecting components, logic, firmware, code, and combinations thereof.
Examples of a memory device include a random access memory (RAM) and a read-only memory (ROM). A memory device may be a Flash memory, a redundant array of disks (RAID), a hard disk, or a combination thereof.
Embodiments described in the present disclosure may be practiced with various computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. Several embodiments described in the present disclosure can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.
With the above embodiments in mind, it should be understood that a number of embodiments described in the present disclosure can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of various embodiments described in the present disclosure are useful machine operations. Several embodiments described in the present disclosure also relate to a device or an apparatus for performing these operations. The apparatus can be specially constructed for a purpose, or the apparatus can be a computer selectively activated or configured by a computer program stored in the computer. In particular, various machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
Various embodiments described in the present disclosure can also be embodied as computer-readable code on a non-transitory computer-readable medium. The computer-readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of the computer-readable medium include hard drives, network attached storage (NAS), ROM, RAM, compact disc-ROMs (CD-ROMs), CD-recordables (CD-Rs), CD-rewritables (CD-RWs), magnetic tapes, and other optical and non-optical data storage devices. The computer-readable medium can include computer-readable tangible media distributed over a network-coupled computer system so that the computer-readable code is stored and executed in a distributed fashion.
Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed in between operations, or operations may be performed in an order other than that shown, or operations may be adjusted so that they occur at slightly different times, or may be distributed in a system which allows the occurrence of the processing operations at various intervals associated with the processing.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the various embodiments described in the present disclosure are not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
This application is a continuation of U.S. patent application Ser. No. 15/096,240, filed on Apr. 11, 2016, titled “METHODS, SYSTEMS AND DEVICES FOR PHYSICAL CONTACT ACTIVATED DISPLAY AND NAVIGATION,” which is a continuation of U.S. patent application Ser. No. 14/192,282, entitled “Methods, Systems and Devices for Physical Contact Activated Display and Navigation,” filed on Feb. 27, 2014, which claims priority from U.S. patent application Ser. No. 14/050,270, entitled “Methods, Systems and Devices for Physical Contact Activated Display and Navigation”, filed on Oct. 9, 2013, which claims priority to U.S. Provisional Application No. 61/885,959, entitled “Methods, Systems and Devices for Physical Contact Activated Display and Navigation”, filed on Oct. 2, 2013, all of which are incorporated herein by reference. U.S. patent application Ser. No. 14/192,282, entitled “Methods, Systems and Devices for Physical Contact Activated Display and Navigation,” filed on Feb. 27, 2014, is a continuation-in-part of U.S. patent application Ser. No. 13/959,714 (now U.S. Pat. No. 8,762,101, issued on Jun. 24, 2014), filed on Aug. 5, 2013, titled “Methods and Systems for Identification of Event Data Having Combined Activity and Location Information of Portable Monitoring Devices”, which is a continuation-in-part of U.S. patent application Ser. No. 13/693,334 (now U.S. Pat. No. 8,548,770, issued on Oct. 1, 2013), filed on Dec. 4, 2012, titled “Portable Monitoring Devices and Methods for Operating Same”, which is a divisional of U.S. patent application Ser. No. 13/667,229 (now U.S. Pat. No. 8,437,980, issued on May 7, 2013), filed on Nov. 2, 2012, titled “Portable Monitoring Devices and Methods for Operating Same”, which is a divisional of U.S. patent application Ser. No. 13/469,027 (now U.S. Pat. No. 8,311,769, issued on Nov. 13, 2012), filed on May 10, 2012, titled “Portable Monitoring Devices and Methods for Operating Same”, which is a divisional of U.S. patent application Ser. No. 13/246,843 (now U.S. Pat. No. 8,180,591, issued on May 15, 2012), filed on Sep. 27, 2011, which is a divisional of U.S. patent application Ser. No. 13/156,304 (now U.S. Pat. No. 9,167,991, issued on Oct. 27, 2015), filed on Jun. 8, 2011, titled “Portable Monitoring Devices and Methods for Operating Same”, which claims the benefit of and priority, under 35 U.S.C. § 119(e), to U.S. Provisional Patent Application No. 61/388,595, filed on Sep. 30, 2010, and titled “Portable Monitoring Devices and Methods for Operating Same”, and to U.S. Provisional Patent Application No. 61/390,811, filed on Oct. 7, 2010, and titled “Portable Monitoring Devices and Methods for Operating Same”, all of which are hereby incorporated by reference in their entirety. U.S. patent application Ser. No. 13/959,714 (now U.S. Pat. No. 8,762,101), filed on Aug. 5, 2013, titled “Methods and Systems for Identification of Event Data Having Combined Activity and Location Information of Portable Monitoring Devices”, is a continuation-in-part of U.S. patent application Ser. No. 13/759,485 (now U.S. Pat. No. 8,543,351, issued on Sep. 24, 2013), filed on Feb. 5, 2013, titled “Portable Monitoring Devices and Methods for Operating Same”, which is a divisional of U.S. patent application Ser. No. 13/667,229 (now U.S. Pat. No. 8,437,980, issued on May 7, 2013), filed on Nov. 2, 2012, titled “Portable Monitoring Devices and Methods for Operating Same”, which is a divisional of U.S. patent application Ser. No. 13/469,027 (now U.S. Pat. No. 8,311,769, issued on Nov. 13, 2012), filed on May 10, 2012, titled “Portable Monitoring Devices and Methods for Operating Same”, which is a divisional of U.S. patent application Ser. No. 13/246,843 (now U.S. Pat. No. 8,180,591, issued on May 15, 2012), filed on Sep. 27, 2011, which is a divisional of U.S. patent application Ser. No. 13/156,304 (now U.S. Pat. No. 9,167,991, issued on Oct. 27, 2015), filed on Jun. 8, 2011, titled “Portable Monitoring Devices and Methods for Operating Same”, which claims the benefit of and priority, under 35 U.S.C. § 119(e), to U.S. Provisional Patent Application No. 61/388,595, filed on Sep. 30, 2010, and titled “Portable Monitoring Devices and Methods for Operating Same”, and to U.S. Provisional Patent Application No. 61/390,811, filed on Oct. 7, 2010, and titled “Portable Monitoring Devices and Methods for Operating Same”, all of which are hereby incorporated by reference in their entirety. U.S. patent application Ser. No. 14/192,282, entitled “Methods, Systems and Devices for Physical Contact Activated Display and Navigation,” filed on Feb. 27, 2014, is also a continuation-in-part of U.S. patent application Ser. No. 13/913,726 (now U.S. Pat. No. 8,670,953, issued on Mar. 11, 2014), filed on Jun. 10, 2013, titled “Portable Monitoring Devices and Methods for Operating Same”, which is incorporated herein by reference in its entirety.
20120274508 | Brown et al. | Nov 2012 | A1 |
20120277891 | Aragones | Nov 2012 | A1 |
20120283855 | Hoffman et al. | Nov 2012 | A1 |
20120290109 | Engelberg | Nov 2012 | A1 |
20120295676 | Ackerson | Nov 2012 | A1 |
20120316456 | Rahman et al. | Dec 2012 | A1 |
20120316896 | Rahman et al. | Dec 2012 | A1 |
20120324226 | Bichsel et al. | Dec 2012 | A1 |
20120330109 | Tran | Dec 2012 | A1 |
20130006718 | Nielsen et al. | Jan 2013 | A1 |
20130007665 | Chaudhri et al. | Jan 2013 | A1 |
20130041590 | Burich et al. | Feb 2013 | A1 |
20130072169 | Ross et al. | Mar 2013 | A1 |
20130073254 | Yuen et al. | Mar 2013 | A1 |
20130073255 | Yuen et al. | Mar 2013 | A1 |
20130080113 | Yuen et al. | Mar 2013 | A1 |
20130080811 | Low et al. | Mar 2013 | A1 |
20130095459 | Tran | Apr 2013 | A1 |
20130096843 | Yuen et al. | Apr 2013 | A1 |
20130106684 | Weast et al. | May 2013 | A1 |
20130119255 | Dickinson et al. | May 2013 | A1 |
20130151196 | Yuen et al. | Jun 2013 | A1 |
20130158369 | Yuen et al. | Jun 2013 | A1 |
20130166048 | Werner et al. | Jun 2013 | A1 |
20130190903 | Balakrishnan et al. | Jul 2013 | A1 |
20130191034 | Weast et al. | Jul 2013 | A1 |
20130197681 | Alberth, Jr. et al. | Aug 2013 | A1 |
20130198685 | Bernini et al. | Aug 2013 | A1 |
20130203475 | Kil | Aug 2013 | A1 |
20130228063 | Turner | Sep 2013 | A1 |
20130231574 | Tran | Sep 2013 | A1 |
20130234924 | Janefalkar et al. | Sep 2013 | A1 |
20130238287 | Hoffman et al. | Sep 2013 | A1 |
20130254525 | Johnson et al. | Sep 2013 | A1 |
20130261475 | Mochizuki | Oct 2013 | A1 |
20130267249 | Rosenberg | Oct 2013 | A1 |
20130268199 | Nielsen et al. | Oct 2013 | A1 |
20130268236 | Yuen et al. | Oct 2013 | A1 |
20130268687 | Schrecker | Oct 2013 | A1 |
20130274904 | Coza et al. | Oct 2013 | A1 |
20130289366 | Chua et al. | Oct 2013 | A1 |
20130290879 | Greisson | Oct 2013 | A1 |
20130296666 | Kumar et al. | Nov 2013 | A1 |
20130296672 | O'Neil et al. | Nov 2013 | A1 |
20130296673 | Thaveeprungsriporn et al. | Nov 2013 | A1 |
20130310896 | Mass | Nov 2013 | A1 |
20130324368 | Aragones et al. | Dec 2013 | A1 |
20130325396 | Yuen et al. | Dec 2013 | A1 |
20130337974 | Yanev et al. | Dec 2013 | A1 |
20140035761 | Burton et al. | Feb 2014 | A1 |
20140039804 | Park et al. | Feb 2014 | A1 |
20140039840 | Yuen et al. | Feb 2014 | A1 |
20140039841 | Yuen et al. | Feb 2014 | A1 |
20140052280 | Yuen et al. | Feb 2014 | A1 |
20140067278 | Yuen et al. | Mar 2014 | A1 |
20140070957 | Longinotti-Buitoni et al. | Mar 2014 | A1 |
20140077673 | Garg et al. | Mar 2014 | A1 |
20140081667 | Joao | Mar 2014 | A1 |
20140094941 | Ellis et al. | Apr 2014 | A1 |
20140099614 | Hu et al. | Apr 2014 | A1 |
20140125618 | Panther et al. | May 2014 | A1 |
20140143737 | Mistry et al. | May 2014 | A1 |
20140164611 | Molettiere et al. | Jun 2014 | A1 |
20140169675 | King et al. | Jun 2014 | A1 |
20140176335 | Brumback et al. | Jun 2014 | A1 |
20140176346 | Brumback et al. | Jun 2014 | A1 |
20140180022 | Stivoric et al. | Jun 2014 | A1 |
20140180595 | Brumback et al. | Jun 2014 | A1 |
20140213858 | Presura et al. | Jul 2014 | A1 |
20140249393 | Proud | Sep 2014 | A1 |
20140275885 | Isaacson et al. | Sep 2014 | A1 |
20140278229 | Hong et al. | Sep 2014 | A1 |
20140316305 | Venkatraman et al. | Oct 2014 | A1 |
20150026647 | Park et al. | Jan 2015 | A1 |
20150081465 | Dyment | Mar 2015 | A1 |
20150181314 | Swanson | Jun 2015 | A1 |
20150262499 | Wicka | Sep 2015 | A1 |
20170237694 | Choudhary | Aug 2017 | A1 |
20170270765 | Roberts | Sep 2017 | A1 |
20190057593 | Park et al. | Feb 2019 | A1 |
Number | Date | Country |
---|---|---|
100998921 | Jul 2007 | CN |
101380252 | Mar 2009 | CN |
101918950 | Dec 2010 | CN |
202282004 | Jun 2012 | CN |
102598086 | Jul 2012 | CN |
1 721 237 | Aug 2012 | EP |
11-347021 | Dec 1999 | JP |
2178588 | Jan 2002 | RU |
WO 2002/011019 | Feb 2002 | WO
WO 2006/055125 | May 2006 | WO
WO 2006/090197 | Aug 2006 | WO
WO 2007/143535 | Dec 2007 | WO
WO 2008/038141 | Apr 2008 | WO
WO 2009/042965 | Apr 2009 | WO
WO 2011/028383 | Mar 2011 | WO
WO 2012/170586 | Dec 2012 | WO
WO 2012/170924 | Dec 2012 | WO
WO 2012/171032 | Dec 2012 | WO
WO 2015/127067 | Aug 2015 | WO
WO 2016/003269 | Jan 2016 | WO
Entry |
---|
John Brumm, Crunching the Numbers, Scuba Diving (Oct. 2006), available at https://www.scubadiving.com/gear/accessories/crunching-numbers. |
The Wordsworth Concise English Dictionary 606 (G.W. Davidson et al. eds. 1994). |
Steven M. Kaplan, Wiley Electrical & Electronics Engineering Dictionary 508-09, 708 (2004). |
Microsoft Computer Dictionary 484 (5th ed. 2002). |
Chandrasekar et al., Aug. 28-Sep. 1, 2012, Plug-and-Play, Single-Chip Photoplethysmography, 34th Annual International Conference of the IEEE EMBS, San Diego, California USA, 4 pages. |
Fang et al., Dec. 2005, Design of a Wireless Assisted Pedestrian Dead Reckoning System—The NavMote Experience, IEEE Transactions on Instrumentation and Measurement, 54(6):2342-2358. |
Godfrey et al., 2008, Direct Measurement of Human Movement by Accelerometry, Medical Engineering & Physics, 30:1364-1386. |
Godha et al., May 2008, Foot Mounted Inertial System for Pedestrian Navigation, Measurement Science and Technology, 19(7):1-9. |
Intersema, Using the MS5534 for altimeters and barometers, Application Note AN501, Jan. 2006. |
Ladetto et al., Sep. 2000, On Foot Navigation: When GPS alone is not Enough, Journal of Navigation, 53(2):279-285. |
Lammel et al., Sep. 2009, Indoor Navigation with MEMS Sensors, Proceedings of the Eurosensors XXIII conference, 1(1):532-535. |
Lester et al., 2005, A Hybrid Discriminative/Generative Approach for Modeling Human Activities, Proc. of the Int'l Joint Conf. on Artificial Intelligence, pp. 766-772. |
Lester et al., 2009, Validated caloric expenditure estimation using a single body-worn sensor, Proc. of the Int'l Conf. on Ubiquitous Computing, pp. 225-234. |
Ohtaki et al., Aug. 2005, Automatic classification of ambulatory movements and evaluation of energy consumptions utilizing accelerometers and barometer, Microsystem Technologies, 11(8-10):1034-1040. |
Parkka et al., Jan. 2006, Activity Classification Using Realistic Data From Wearable Sensors, IEEE Transactions on Information Technology in Biomedicine, 10(1):119-128. |
Perrin et al., 2000, Improvement of Walking Speed Prediction by Accelerometry and Altimetry, Validated by Satellite Positioning, Medical & Biological Engineering & Computing, 38:164-168. |
Retscher, 2006, An Intelligent Multi-Sensor System for Pedestrian Navigation, Journal of Global Positioning Systems, 5(1):110-118. |
Sagawa et al., Aug.-Sep. 1998, Classification of Human Moving Patterns Using Air Pressure and Acceleration, Proceedings of the 24th Annual Conference of the IEEE Industrial Electronics Society, 2:1214-1219. |
Sagawa et al., Oct. 2000, Non-restricted measurement of walking distance, IEEE Int'l Conf. on Systems, Man, and Cybernetics, 3:1847-1852. |
SCP 1000-D01/D11 Pressure Sensor as Barometer and Altimeter, VTI Technologies Application Note 33, Jun. 2006. |
Specification of the Bluetooth® System, Core Package, version 4.1, Dec. 2013, vols. 0 & 1, 282 pages. |
Stirling et al., 2005, Evaluation of a New Method of Heading Estimation of Pedestrian Dead Reckoning Using Shoe Mounted Sensors, Journal of Navigation, 58:31-45. |
Suunto LUMI User Guide, Jun. and Sep. 1997. |
Tanigawa et al., Mar. 2008, Drift-free dynamic height sensor using MEMS IMU aided by MEMS pressure sensor, Workshop on Positioning, Navigation and Communication, pp. 191-196. |
International Search Report dated Aug. 15, 2008, in related application PCT/IB07/03617. |
U.S. Office Action, dated Feb. 12, 2015, issued in U.S. Appl. No. 14/029,763. |
U.S. Final Office Action, dated Jul. 8, 2015, issued in U.S. Appl. No. 14/029,763. |
U.S. Examiner's Answer to Appeal Brief, dated Nov. 3, 2016, issued in U.S. Appl. No. 14/029,763. |
U.S. Patent Board Decision on Appeal—Examiner Affirmed [USPTO Before the Patent Trial and Appeal Board], dated Jul. 26, 2017, issued in U.S. Appl. No. 14/029,763. |
U.S. Office Action, dated Feb. 5, 2014, issued in U.S. Appl. No. 14/045,563. |
U.S. Final Office Action, dated Jul. 28, 2014, issued in U.S. Appl. No. 14/045,563. |
U.S. Office Action, dated Jul. 17, 2015, issued in U.S. Appl. No. 14/045,563. |
U.S. Final Office Action, dated Jan. 4, 2016, issued in U.S. Appl. No. 14/045,563. |
U.S. Office Action, dated Sep. 20, 2016, issued in U.S. Appl. No. 14/045,563. |
U.S. Examiner's Answer to Appeal Brief, dated Feb. 16, 2017, issued in U.S. Appl. No. 14/045,563. |
U.S. Patent Board Decision on Appeal—Examiner Affirmed [USPTO Before the Patent Trial and Appeal Board], dated Dec. 20, 2017, issued in U.S. Appl. No. 14/045,563. |
U.S. Office Action, dated Apr. 4, 2014, issued in U.S. Appl. No. 14/045,574. |
U.S. Final Office Action, dated Jul. 31, 2014, issued in U.S. Appl. No. 14/045,574. |
U.S. Advisory Action, dated Nov. 17, 2014, issued in U.S. Appl. No. 14/045,574. |
U.S. Notice of Allowance, dated Jan. 14, 2015, issued in U.S. Appl. No. 14/045,574. |
U.S. Notice of Allowance, dated Apr. 8, 2015, issued in U.S. Appl. No. 14/045,574. |
U.S. Office Action, dated Jan. 8, 2014, issued in U.S. Appl. No. 14/045,592. |
U.S. Notice of Allowance, dated Apr. 25, 2014, issued in U.S. Appl. No. 14/045,592. |
Chinese First Office Action dated May 3, 2016 issued in CN 201310741076.5. |
Chinese Second Office Action dated Dec. 1, 2016 issued in CN 201310741076.5. |
Chinese Third Office Action dated May 4, 2017 issued in CN 201310741076.5. |
Number | Date | Country | |
---|---|---|---|
20180329518 A1 | Nov 2018 | US |
Number | Date | Country | |
---|---|---|---|
61388595 | Sep 2010 | US | |
61885959 | Oct 2013 | US | |
61390811 | Oct 2010 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13667229 | Nov 2012 | US |
Child | 13693334 | US | |
Parent | 13469027 | May 2012 | US |
Child | 13667229 | US | |
Parent | 13246843 | Sep 2011 | US |
Child | 13469027 | US | |
Parent | 13156304 | Jun 2011 | US |
Child | 13246843 | US | |
Parent | 13667229 | Nov 2012 | US |
Child | 13759485 | US | |
Parent | 13469027 | May 2012 | US |
Child | 13667229 | US | |
Parent | 13246843 | Sep 2011 | US |
Child | 13469027 | US | |
Parent | 13156304 | Jun 2011 | US |
Child | 13246843 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15096240 | Apr 2016 | US |
Child | 15973475 | US | |
Parent | 14192282 | Feb 2014 | US |
Child | 15096240 | US | |
Parent | 14050270 | Oct 2013 | US |
Child | 14192282 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13959714 | Aug 2013 | US |
Child | 14192282 | Feb 2014 | US |
Parent | 13693334 | Dec 2012 | US |
Child | 13959714 | US | |
Parent | 13759485 | Feb 2013 | US |
Child | 13959714 | Aug 2013 | US |
Parent | 13913726 | Jun 2013 | US |
Child | 14192282 | Feb 2014 | US |