The disclosure herein relates to activity monitoring and more particularly to acquiring and displaying activity-related information.
Though useful for tracking absolute location and elevation, modern satellite-based positioning devices tend to be bulky and thus unsuited to being worn in view (e.g., as wrist-wear, eye-wear or other visible-at-a-glance accessory) during physically demanding activities such as running or hiking. Also, such devices often fail to provide other types of information desired during physical activity, such as step count, stair-climb count, calorie burn, heart rate, etc.
The disclosure herein is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
In various embodiments disclosed herein, a portable activity-tracking device receives activity-related information from a discrete data-collection device throughout an activity tracking or monitoring interval and combines the activity-related information with information obtained from one or more local sensors of the activity-tracking device to produce activity metrics for real-time display. In a number of embodiments, for example, the portable activity-tracking device receives positional information from a discrete global-positioning device throughout the activity tracking or monitoring interval (e.g., while a user is engaged in a physical activity) and combines the positional information with information obtained from one or more local sensors of the activity-tracking device for real-time display. By this arrangement, a relatively bulky global-positioning device (e.g., smartphone having global-positioning sensor/circuitry, stand-alone global-positioning device or any other electronics device having global or local positioning capability) may be carried in-pocket or otherwise borne in a not-readily visible location, while a relatively small and lightweight activity-tracking device presents data collaboratively collected by both devices in a real-time, at-a-glance display.
Conceptually, portable activity-tracking device 101 may be viewed as having processing circuitry (processor, memory, etc.) to implement an operating system (OS), sensor circuitry (Snsr) and wireless communication circuitry (WC), together with a user interface (UI) having a display and tactile input (e.g., one or more buttons, touch-screen, etc.), though one or more sensors within the sensor circuitry may be used to detect gesture input from the user as discussed below and thus constitute part of the user interface. Portable activity-tracking device 101 may also have an interface to enable wired communication (e.g., a universal serial bus port or other wire interface port). In general, the operating system executes actions in response to input received from the user interface (i.e., user input), sensor circuitry and/or communication circuitry, and presents activity metrics (i.e., information relating to a user activity such as distance traveled, step count, pace of travel, elevation, stairs climbed, calories burned, heart rate, position/location, activity duration and so forth).
As shown, smartphone 103 (or other positioning device) is assumed to have a resident operating system (OS) and activity-tracking/logging app (app), both instantiated by program code execution within a processor (i.e., though not specifically shown, the smartphone includes one or more processors and memory interconnected to form a computing device). Smartphone 103 also includes wireless communication circuitry (WC) for communicating information to the portable activity-tracking device, and position-sensing circuitry (Snsr) such as receiver circuitry for receiving distance and timestamp information from orbiting satellites (e.g., GPS space vehicles within line of sight at any given time).
In initiate-tracking phase 112, user input to portable activity-tracking device 101 such as a button push, touch-screen tap or predetermined user motion (i.e., a “gesture” detected by one or more sensors within the activity-tracking device) triggers wireless transmission of a data-feed request from activity-tracking device 101 to smartphone 103. In one embodiment, the data-feed request is received via the smartphone's wireless circuitry and forwarded to the smartphone OS which in turn launches an activity-tracking app. The activity-tracking app then interacts with the smartphone OS to request GPS-based position tracking (i.e., regular or intermittent logging of “GPS fix” data, with each such positional fix including, for example and without limitation, a GPS location and time stamp) and wireless transmission of acquired positioning data to activity-tracking device 101. Thereafter, in collaborative data collection phase 114, positioning data captured within smartphone 103 at successive points in time (Ti, . . . , Tj, . . . ) are transmitted to activity-tracking device 101 to be fused and/or aggregated with data collected from the activity-tracking device's on-board sensors. As shown, the collaboratively collected data (which may be fused and/or aggregated) is displayed in real-time continuously or in response to view-gestures (discussed below) on a display 121 of activity-tracking device 101. At the conclusion of the activity being tracked (e.g., walking, running, climbing, riding, swimming, rowing, etc.), which may be explicitly signaled by user input or detected by the collective sensors of the collaborating devices, activity-tracking device 101 issues a stop request to smartphone 103, indicating that positioning-data collection and transmission may be stopped.
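The start/feed/stop exchange just described can be sketched as follows. This is a minimal illustration only: the message names, record layout and class structure are assumptions for exposition, not a wire format given in the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PositioningDevice:
    """Stands in for smartphone 103 and its activity-tracking app."""
    logging: bool = False

    def handle(self, msg):
        # Hypothetical message names; the disclosure does not specify a protocol.
        if msg == "DATA_FEED_REQUEST":
            self.logging = True    # OS launches app; app begins GPS-fix logging
        elif msg == "STOP_REQUEST":
            self.logging = False   # positioning-data collection may be stopped

    def on_gps_fix(self, fix, tracker):
        if self.logging:           # forward each (lat, lon, timestamp) fix
            tracker.receive_fix(fix)

@dataclass
class ActivityTracker:
    """Stands in for portable activity-tracking device 101."""
    remote_fixes: List[Tuple[float, float, float]] = field(default_factory=list)

    def start(self, phone):        # triggered by button push, tap or gesture
        phone.handle("DATA_FEED_REQUEST")

    def stop(self, phone):
        phone.handle("STOP_REQUEST")

    def receive_fix(self, fix):    # later fused with on-board sensor data
        self.remote_fixes.append(fix)
```

Fixes arriving before the data-feed request or after the stop request are simply dropped by the positioning device in this sketch.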
At some later point in time, collaboratively collected data may be uploaded from activity-tracking device 101 and/or smartphone 103 to a general-purpose computing device (e.g., desktop, laptop, or tablet computer) or server computer 123 for archival and more full-featured rendering (e.g., comparative data display to show differences between activities carried out at different times or by different individuals, depicting route traveled during activity on a geographical map and so forth). For example, activity-tracking device 101 may upload collaboratively collected data to smartphone 103 (e.g., to be logged or archived within activity-tracking/logging app discussed above) and then communicated from smartphone 103 to computing device/server 123.
While the initiate-tracking phase and terminate-tracking phase are depicted in
Continuing with the bonding/pairing example shown in
For its part, the portable activity-tracking device accumulates “local” step-count and elevation data from on-board sensors (e.g., accelerometer and altimeter, respectively) as shown at 279 and fuses and/or aggregates the local data with incoming GPS data from the smartphone (“remote” data) as shown at 285, logging the fused/aggregated data for continuous or event-triggered presentation on a real-time display (281). As shown, the portable activity-tracking device iteratively fuses and displays collaboratively collected data during each GPS collection interval, ceasing upon detecting activity-stop input 290 from the user (or detecting an event that indicates activity cessation). Thereafter, the portable activity-tracking device processes the collaboratively collected data to produce a finalized, post-activity data set at 291 (e.g., generating or refining tallies) and transmits the finalized data set to the smartphone, optionally archiving the data set within on-board flash memory or other storage. Upon receiving the finalized data set from the portable activity-tracking device, the smartphone may post-process and/or archive the data set within its own storage as shown at 293 (e.g., for on-demand user viewing) and may also relay the finalized data set to a server or other computing device for more full-featured rendering as discussed above.
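One simple way to fuse the local and remote streams described above is to pair each local sensor sample with the most recent GPS fix received at or before that sample's timestamp. The sketch below assumes timestamped dictionary records and illustrative field names; the disclosure does not prescribe a record layout or fusion algorithm.

```python
def fuse(local_samples, gps_fixes):
    """Pair each local sample (e.g., step count, elevation) with the most
    recent remote GPS fix at or before the sample's timestamp "t"."""
    fixes = sorted(gps_fixes, key=lambda f: f["t"])
    fused, i, last = [], 0, None
    for s in sorted(local_samples, key=lambda s: s["t"]):
        while i < len(fixes) and fixes[i]["t"] <= s["t"]:
            last = fixes[i]        # advance to the newest fix not after s
            i += 1
        fused.append({**s, "fix": last})   # None until the first fix arrives
    return fused
```

Each fused record can then be logged for continuous or event-triggered presentation on the real-time display.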
Continuing with the various events that may be detected, if the portable activity-tracking device determines that the activity has paused at 429 (e.g., based on local and/or remote sensor data and/or user-input), the device may update an internally maintained operating mode to reflect the pause event as shown at 431 and may also display an activity-pause message to the user. Similarly, if in an activity-pause mode, sensor input and/or user-input indicating that the activity being tracked has re-started may trigger a converse update to the operating mode (switching back from activity-pause mode to activity-tracking mode) and corresponding re-start message display.
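The pause/re-start behavior at 429/431 amounts to a small mode table. The sketch below is illustrative only; mode names and display messages are assumptions, not terms from the disclosure.

```python
# (current mode, event) -> (new mode, message to display); names are assumed.
TRANSITIONS = {
    ("tracking", "pause"):  ("paused",   "Activity paused"),
    ("paused",   "resume"): ("tracking", "Activity re-started"),
}

def update_mode(mode, event):
    """Return (new_mode, display_message); unrecognized events leave the
    internally maintained operating mode unchanged."""
    return TRANSITIONS.get((mode, event), (mode, None))
```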
In one embodiment, the portable activity-tracking device is capable of operating in a low-power “display-on-demand” mode in which the display is disabled pending user-input indicating a desire to view activity-tracking metrics. For example, when operating in such a mode, a forearm-raising “view gesture” (e.g., raising a wrist-worn device to viewable height and orientation) may be sensed by an accelerometer and/or other local sensors of the portable activity-tracking device as shown at 433 and used to trigger power-up of the device display, thereby displaying logged activity-tracking metrics to the user in response to the view gesture (435). Though not specifically shown, a countervailing gesture (e.g., lowering the forearm) may be used to disable the display and thus preserve battery power.
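The view-gesture/lowering-gesture pair naturally forms a hysteresis loop on an orientation signal. In the sketch below, the input is assumed to be a normalized gravity component indicating how squarely the display faces the wearer (0..1); both thresholds are illustrative values, not figures from the disclosure.

```python
def display_states(gravity_toward_face, raise_thresh=0.6, lower_thresh=0.2):
    """Return the display power state after each accelerometer sample,
    using hysteresis so the display does not flicker near one threshold."""
    on, states = False, []
    for g in gravity_toward_face:
        if not on and g >= raise_thresh:
            on = True      # forearm-raising "view gesture": power up display
        elif on and g <= lower_thresh:
            on = False     # countervailing lowering gesture: power down
        states.append(on)
    return states
```

Intermediate readings between the two thresholds leave the display in its current state, which is why the sample at 0.5 below keeps the display on.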
In a continuous display mode, the displayed metrics may be updated as shown at 439 (i.e., based on logged data) in response to detecting expiration of an interval timer at 437. For example, the display may be updated every 60 milliseconds or so, and thus appears to update continuously from a user's perspective.
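The timer-driven refresh can be sketched as a redraw schedule; the ~60 ms period below is the example figure from the text, and the function itself is purely illustrative.

```python
def redraw_times(duration_s, interval_s=0.060):
    """Timestamps (seconds) at which a continuous-mode display would redraw
    over `duration_s` seconds at the ~60 ms example period."""
    n = int(duration_s / interval_s) + 1
    return [round(k * interval_s, 3) for k in range(n)]
```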
Still referring to
Upon detecting user input at 445, the portable activity-tracking device processes the user input and takes responsive action at 447, including stopping activity tracking at 470, changing tracking mode, display mode, etc. Where the user input indicates that activity tracking is to continue without further collaborative data collection, the portable activity-tracking device may send a message to the collaborating device instructing that further data collection may be suspended.
If the power level of the portable activity-tracking device itself is determined to fall below a programmed or otherwise predetermined threshold (affirmative determination at 449), the local device may transmit a message to the collaborating device instructing that device to proceed with data logging as shown at 451 (including logging data that had been omitted in view of counterpart data logging within the portable activity-tracking device) and then display a shutdown message to the user.
If the portable activity-tracking device receives an abort message from the collaborating device indicating that activity-tracking is to be aborted (453), the portable activity-tracking device may transmit an acknowledgment to the collaborating device as shown at 455 and display an activity-stop message to the user.
Similarly, if the portable activity-tracking device detects (e.g., via sensor input or otherwise) that the activity being tracked has ceased as shown at 457 (i.e., paused for a threshold period of time), the portable activity-tracking device may transmit a message to the collaborating device to disable data collection and/or transmission therein and may also display an activity-stop message (459).
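The four event checks above (user input at 445, low power at 449, remote abort at 453, activity cessation at 457) can be summarized as a single dispatch. Message names, display strings and the battery floor in this sketch are illustrative assumptions, not values from the disclosure.

```python
def handle_event(event, battery_level, battery_floor=0.05):
    """Return (messages to collaborating device, message to display) for
    one pass through the event checks at 445/449/453/457."""
    outgoing, display = [], None
    if event == "user_stop":                  # user input (445): stop tracking (470)
        outgoing.append("STOP_COLLECTION")
        display = "Activity stopped"
    elif battery_level < battery_floor:       # low local power (449)
        outgoing.append("TAKE_OVER_LOGGING")  # collaborator resumes full logging (451)
        display = "Shutting down"
    elif event == "remote_abort":             # abort from collaborating device (453)
        outgoing.append("ABORT_ACK")          # acknowledgment (455)
        display = "Activity stopped"
    elif event == "activity_ceased":          # cessation sensed (457)
        outgoing.append("DISABLE_COLLECTION") # disable remote collection (459)
        display = "Activity stopped"
    return outgoing, display
```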
Still referring to
The various activity-tracking data structures, methods and techniques disclosed herein may be implemented through execution of one or more sequences of instructions (i.e., software program(s)) within a computer system, or by a custom-built hardware ASIC (application-specific integrated circuit), or programmed on a programmable hardware device such as an FPGA (field-programmable gate array), or any combination thereof within or external to the computer system.
When received within a computer system via one or more computer-readable media, such data and/or instruction-based expressions of the above described circuits and functional blocks can be processed by a processing entity (e.g., one or more processors) within the computer system in conjunction with execution of one or more other computer programs including, without limitation, net-list generation programs, place and route programs and the like, to generate a representation or image of a physical manifestation of such circuits. Such representation or image can thereafter be used in device fabrication, for example, by enabling generation of one or more masks that are used to form various components of the circuits in a device fabrication process.
In the foregoing description and in the accompanying drawings, specific terminology and drawing symbols have been set forth to provide a thorough understanding of the disclosed embodiments. In some instances, the terminology and symbols may imply specific details that are not required to practice those embodiments. For example, any of the specific dimensions, form factors, signal path widths, signaling or operating frequencies, component circuits or devices and the like can be different from those described above in alternative embodiments. Additionally, links or other interconnection between system components or internal circuit elements or blocks may be shown as buses or as single signal lines. Each of the buses can alternatively be a single signal line, and each of the single signal lines can alternatively be buses. Signals and signaling links, however shown or described, can be single-ended or differential. A signal driving circuit is said to “output” a signal to a signal receiving circuit when the signal driving circuit asserts (or de-asserts, if explicitly stated or indicated by context) the signal on a signal line coupled between the signal driving and signal receiving circuits. The term “coupled” is used herein to express a direct connection as well as a connection through one or more intervening circuits or structures. 
Device “programming” can include, for example and without limitation, loading a control value into a register or other storage circuit within the integrated circuit device in response to a host instruction (and thus controlling an operational aspect of the device and/or establishing a device configuration) or through a one-time programming operation (e.g., blowing fuses within a configuration circuit during device production), and/or connecting one or more selected pins or other contact structures of the device to reference voltage lines (also referred to as strapping) to establish a particular device configuration or operational aspect of the device. The terms “exemplary” and “embodiment” are used to express an example, not a preference or requirement. Also, the terms “may” and “can” are used interchangeably to denote optional (permissible) subject matter. The absence of either term should not be construed as meaning that a given feature or technique is required.
Various modifications and changes can be made to the embodiments presented herein without departing from the broader spirit and scope of the disclosure. For example, features or aspects of any of the embodiments can be applied in combination with any other of the embodiments or in place of counterpart features or aspects thereof. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
Number | Name | Date | Kind |
---|---|---|---|
4312358 | Barney | Jan 1982 | A |
4367752 | Jimenez et al. | Jan 1983 | A |
4578769 | Frederick | Mar 1986 | A |
4977509 | Pitchford et al. | Dec 1990 | A |
5058427 | Brandt | Oct 1991 | A |
5224059 | Nitta et al. | Jun 1993 | A |
5295085 | Hoffacker | Mar 1994 | A |
5323650 | Fullen et al. | Jun 1994 | A |
5583776 | Levi et al. | Dec 1996 | A |
5724265 | Hutchings | Mar 1998 | A |
5891042 | Sham et al. | Apr 1999 | A |
5899963 | Hutchings | May 1999 | A |
5947868 | Dugan | Sep 1999 | A |
5955667 | Fyfe | Sep 1999 | A |
5976083 | Richardson et al. | Nov 1999 | A |
6018705 | Gaudet et al. | Jan 2000 | A |
6078874 | Piety et al. | Jun 2000 | A |
6145389 | Ebeling et al. | Nov 2000 | A |
6183425 | Whalen et al. | Feb 2001 | B1 |
6204807 | Odagiri | Mar 2001 | B1 |
6287262 | Amano et al. | Sep 2001 | B1 |
6301964 | Fyfe et al. | Oct 2001 | B1 |
6305221 | Hutchings | Oct 2001 | B1 |
6309360 | Mault | Oct 2001 | B1 |
6478736 | Mault | Nov 2002 | B1 |
6513381 | Fyfe et al. | Feb 2003 | B2 |
6513532 | Mault et al. | Feb 2003 | B2 |
6529827 | Beason et al. | Mar 2003 | B1 |
6571200 | Mault | May 2003 | B1 |
6678629 | Tsuji | Jan 2004 | B2 |
6761064 | Tsuji | Jul 2004 | B2 |
6790178 | Mault et al. | Sep 2004 | B1 |
6813582 | Levi et al. | Nov 2004 | B2 |
7062225 | White | Jun 2006 | B2 |
7162368 | Levi et al. | Jan 2007 | B2 |
7200517 | Darley et al. | Apr 2007 | B2 |
7261690 | Teller et al. | Aug 2007 | B2 |
7457724 | Vock et al. | Nov 2008 | B2 |
7505865 | Ohkubo et al. | Mar 2009 | B2 |
7608050 | Shugg | Oct 2009 | B2 |
7690556 | Kahn et al. | Apr 2010 | B1 |
7774156 | Niva et al. | Aug 2010 | B2 |
7789802 | Lee et al. | Sep 2010 | B2 |
7927253 | Vincent et al. | Apr 2011 | B2 |
7983876 | Vock et al. | Jul 2011 | B2 |
8180591 | Yuen et al. | May 2012 | B2 |
8391888 | Hanada et al. | Mar 2013 | B2 |
8684900 | Tran | Apr 2014 | B2 |
8764651 | Tran | Jul 2014 | B2 |
9063164 | Yuen et al. | Jun 2015 | B1 |
9098991 | Park | Aug 2015 | B2 |
10132645 | Yuen et al. | Nov 2018 | B1 |
20030018430 | Ladetto et al. | Jan 2003 | A1 |
20050054938 | Wehman et al. | Mar 2005 | A1 |
20050107723 | Wehman et al. | May 2005 | A1 |
20050195830 | Chitrapu et al. | Sep 2005 | A1 |
20060039348 | Racz et al. | Feb 2006 | A1 |
20060047208 | Yoon | Mar 2006 | A1 |
20070050715 | Behar | Mar 2007 | A1 |
20070051369 | Choi et al. | Mar 2007 | A1 |
20080140338 | No et al. | Jun 2008 | A1 |
20080287751 | Stivoric et al. | Nov 2008 | A1 |
20090018797 | Kasama et al. | Jan 2009 | A1 |
20090043531 | Kahn et al. | Feb 2009 | A1 |
20090047645 | Dibenedetto et al. | Feb 2009 | A1 |
20090048044 | Oleson et al. | Feb 2009 | A1 |
20100010774 | Ma et al. | Jan 2010 | A1 |
20100292050 | DiBenedetto | Nov 2010 | A1 |
20110032105 | Hoffman | Feb 2011 | A1 |
20120083714 | Yuen et al. | Apr 2012 | A1 |
20120116684 | Ingrassia et al. | May 2012 | A1 |
20120179278 | Riley et al. | Jul 2012 | A1 |
20120254934 | McBrearty et al. | Oct 2012 | A1 |
20120283855 | Hoffman et al. | Nov 2012 | A1 |
20130094600 | Beziat et al. | Apr 2013 | A1 |
20130102251 | Linde et al. | Apr 2013 | A1 |
20130274587 | Coza | Oct 2013 | A1 |
20140180022 | Stivoric et al. | Jun 2014 | A1 |
20150026647 | Park et al. | Jan 2015 | A1 |
Number | Date | Country |
---|---|---|
11347021 | Dec 1999 | JP |
WO 2012170586 | Dec 2012 | WO |
WO 2012170924 | Dec 2012 | WO |
WO 2012171032 | Dec 2012 | WO |
WO 2015127067 | Aug 2015 | WO |
WO 2016003269 | Jan 2016 | WO |
Entry |
---|
U.S. Office Action dated Apr. 3, 2014, in U.S. Appl. No. 14/050,228. |
U.S. Office Action dated Aug. 11, 2014, in U.S. Appl. No. 14/050,228. |
U.S. Final Office Action dated Dec. 31, 2014, in U.S. Appl. No. 14/050,228. |
U.S. Notice of Allowance dated Mar. 17, 2015, in U.S. Appl. No. 14/050,228. |
U.S. Office Action dated Dec. 13, 2017, in U.S. Appl. No. 14/713,288. |
U.S. Notice of Allowance dated Jul. 13, 2018, in U.S. Appl. No. 14/713,288. |
“Activity Classification Using Realistic Data From Wearable Sensors”, Parkka, et al, IEEE Transactions on Information Technology in Biomedicine, vol. 10, No. 1, Jan. 2006, pp. 119-128. |
“Altimeter and Barometer System”, Clifford, et al., Freescale Semiconductor Application Note AN1979, Rev. 3, Nov. 2006. |
“Automatic classification of ambulatory movements and evaluation of energy consumptions utilizing accelerometers and barometer”, Ohtaki, et al, Microsystem Technologies, vol. 11, No. 8-10, Aug. 2005, pp. 1034-1040. |
“Classification of Human Moving Patterns Using Air Pressure and Acceleration”, Sagawa, et al, Proceedings of the 24th Annual Conference of the IEEE Industrial Electronics Society, vol. 2, Aug.-Sep. 1998, pp. 1214-1219. |
“Design of a Wireless Assisted Pedestrian Dead Reckoning System—The NavMote Experience”, Fang, et al, IEEE Transactions on Instrumentation and Measurement, vol. 54, No. 6, Dec. 2005, pp. 2342-2358. |
“Direct Measurement of Human Movement by Accelerometry”, Godfrey, et al., Medical Engineering & Physics, vol. 30, 2008, pp. 1364-1386. |
“Drift-free dynamic height sensor using MEMS IMU aided by MEMS pressure sensor”, Tanigawa, et al, Workshop on Positioning, Navigation and Communication, Mar. 2008, pp. 191-196. |
“Evaluation of a New Method of Heading Estimation of Pedestrian Dead Reckoning Using Shoe Mounted Sensors”, Stirling et al., Journal of Navigation, vol. 58, 2005, pp. 31-45. |
“Fitbit Automatically Tracks Your Fitness and Sleep” published online at web.archive.org/web/20080910224820/http://www.fitbit.com, downloaded Sep. 10, 2008, 1 page. |
“Foot Mounted Inertial System for Pedestrian Navigation”, Godha et al., Measurement Science and Technology, vol. 19, No. 7, May 2008, pp. 1-9. |
“A Hybrid Discriminative/Generative Approach for Modeling Human Activities”, Lester, et al., Proc. of the Int'l Joint Conf. Artificial Intelligence, 2005, pp. 766-772. |
“Improvement of Walking Speed Prediction by Accelerometry and Altimetry, Validated by Satellite Positioning”, Perrin, et al, Medical & Biological Engineering & Computing, vol. 38, 2000, pp. 164-168. |
“Indoor Navigation with MEMS Sensors”, Lammel, et al., Proceedings of the Eurosensors XIII conference, vol. 1, No. 1, Sep. 2009, pp. 532-535. |
“An Intelligent Multi-Sensor system for Pedestrian Navigation”, Retscher, Journal of Global Positioning Systems, vol. 5, No. 1, 2006, pp. 110-118. |
“Non-restricted measurement of walking distance”, Sagawa, et al, IEEE Int'l Conf. on Systems, Man, Cybernetics, vol. 3, Oct. 2000, pp. 1847-1852. |
“On Foot Navigation: When GPS alone is not Enough”, Ladetto, et al, Journal of Navigation, vol. 53, No. 2, Sep. 2000, pp. 279-285. |
“SCP 1000-D01/D11 Pressure Sensor as Barometer and Altimeter”, VTI Technologies Application, Jun. 2006, Note 33. |
“Suunto LUMI User Guide”, Jun. and Sep. 1997. |
“Using MS5534 for altimeters and barometers”, Intersema App., Note AN501, Jan. 2006. |
“Validated caloric expenditure estimation using a single body-worn sensor”, Lester, et al, Proc. of the Int'l Conf. on Ubiquitous Computing, 2009, pp. 225-234. |
Number | Date | Country | |
---|---|---|---|
20190186949 A1 | Jun 2019 | US |
Number | Date | Country | |
---|---|---|---|
61886035 | Oct 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14713288 | May 2015 | US |
Child | 16195094 | US | |
Parent | 14050228 | Oct 2013 | US |
Child | 14713288 | US |