The present disclosure relates generally to detecting outdoor walking workouts using a wearable device.
A wearable device may be worn on the hand, wrist, or arm of a person when walking. It may be desirable to track outdoor walking workouts on a wearable device to promote exercise and for other health related reasons. Detecting the start of a walking workout and distinguishing walking workouts from other walking activities are essential components of accurately tracking outdoor walking workouts.
The present disclosure is related to other methods of detecting activities on a wearable device, specifically U.S. patent application Ser. No. 17/015,965, filed on Sep. 9, 2020, and entitled “DETECTING SWIMMING ACTIVITIES ON A WEARABLE DEVICE,” which patent application is incorporated herein by reference in its entirety.
In one aspect, disclosed herein are computer implemented methods for improving performance of a wearable device while recording a walking activity comprising: receiving motion data of a user from a motion sensing module of the wearable device; counting, by one or more processor circuits of the wearable device, user steps based on the motion data; detecting, by the one or more processor circuits, a bout based on the user steps, the bout comprising a collection of continuous steps; estimating, by the one or more processor circuits, user mechanical work performed during the bout; detecting, by the one or more processor circuits, the start of a walking activity, the detecting the start of the walking activity comprising: comparing the user mechanical work to a mechanical work threshold; in response to detecting a value for the user mechanical work below the mechanical work threshold, classifying movement performed by the user based on motion data; and in response to classifying the movement performed by the user as a patterned movement, starting a walking activity; and sending, by the one or more processor circuits, a notification to the user requesting confirmation of the start of the walking activity.
In one aspect, the method comprises: receiving pressure data from a pressure sensor of the wearable device; receiving location data from a GPS module of the wearable device; calculating, by the one or more processor circuits, a grade for each step included in the user steps based on the pressure data, the grade measuring steepness of terrain stepped on during the bout; calculating, by the one or more processor circuits, a step distance for each step included in the bout based on the motion data and the location data; calculating, by the one or more processor circuits, a bout time describing the bout's duration; and estimating, by the one or more processor circuits, the user mechanical work performed during the bout based on the step distance, the grade, and the bout time.
In one aspect, the method comprises: estimating, by the one or more processor circuits, a device heading at every step included in the bout based on the motion data, the device heading describing the orientation of the wearable device relative to a frame of reference; determining, by the one or more processor circuits, a user direction of travel based on the device heading and the step count; and classifying movement performed during the bout based on the user direction of travel.
In one aspect, motion data comprises acceleration data obtained from an accelerometer and gyroscope data obtained from a gyroscope. In one aspect, the patterned movement is a straight movement pattern or a repetitive movement pattern, wherein the straight movement pattern has few changes in user direction and the repetitive movement pattern has changes in user direction that repeat at regular intervals during the walking activity. In one aspect, the notification is a notification UI displayed on a display screen of the wearable device.
In one aspect, the method comprises: distinguishing between a walking workout and a casual walking activity based on the comparing the user mechanical work to the user mechanical work threshold and the classifying the movement performed by the user. In one aspect, the method comprises: in response to starting the walking activity, calculating user performance information during the walking activity; and detecting the end of the walking workout based on the user performance information.
In one aspect, the mechanical work threshold is specific to the user. In one aspect, the method comprises calculating, by the one or more processor circuits, load based on the grade and the user steps, the load estimating force required to perform the user steps at the grade; and improving the user mechanical work accuracy by estimating the mechanical work using the load.
In one aspect, the method comprises: receiving magnetic field data from a magnetic field sensor; the estimating the device heading comprising: determining a yaw component of rotational data generated from the acceleration data and the gyroscope data; and improving accuracy of the device heading by combining the yaw component with a second yaw component of second rotational data generated from the magnetic field data. In one aspect, the yaw component of the rotational data is the rotational angle relative to a horizontal frame of reference.
In one aspect, disclosed herein are computer implemented methods for improving performance of a wearable device while recording a walking activity comprising: receiving motion data of a user from a motion sensing module of the wearable device; counting, by one or more processor circuits of the wearable device, user steps based on the motion data; detecting, by the one or more processor circuits, a bout based on the user steps, the bout comprising a collection of continuous steps; estimating, by the one or more processor circuits, user mechanical work performed during the bout; detecting, by the one or more processor circuits, the start of a walking activity, the detecting the start of the walking activity comprising: comparing the user mechanical work to a mechanical work threshold; in response to detecting a value for the user mechanical work that exceeds the mechanical work threshold, starting a walking activity; and sending, by the one or more processor circuits, a notification to the user requesting confirmation of the start of the walking activity.
In one aspect, the method comprises: receiving pressure data from a pressure sensor of the wearable device; receiving location data from a GPS module of the wearable device; calculating, by the one or more processor circuits, a grade for each step included in the user steps based on the pressure data, the grade measuring steepness of terrain stepped on during the bout; calculating, by the one or more processor circuits, a step distance for each step included in the bout based on the motion data and the location data; calculating, by the one or more processor circuits, a bout time describing the bout's duration; and estimating, by the one or more processor circuits, the user mechanical work performed during the bout based on the step distance, the grade, and the bout time.
In one aspect, the method comprises: calculating, by the one or more processor circuits, load based on the grade and the user steps, the load estimating force required to perform the user steps at the grade; and improving the user mechanical work accuracy by estimating the mechanical work using the load. In one aspect, the method comprises: estimating, by the one or more processor circuits, a device heading at every step included in the bout based on the motion data, the device heading describing the orientation of the wearable device relative to a frame of reference; determining, by the one or more processor circuits, a user direction of travel based on the device heading and the step count; and classifying movement performed during the bout based on the user direction of travel.
In one aspect, the method comprises: receiving magnetic field data from a magnetic field sensor; the estimating the device heading comprising: determining a yaw component of rotational data generated from the acceleration data and the gyroscope data; and improving accuracy of the device heading by combining the yaw component with a second yaw component of second rotational data generated from the magnetic field data. In one aspect, the yaw component of the rotational data is the rotational angle relative to a horizontal frame of reference. In one aspect, the method comprises: distinguishing between a walking workout and a casual walking activity based on the comparing the user mechanical work to the user mechanical work threshold.
In one aspect, disclosed herein are systems for improving performance of a wearable device while recording a walking activity comprising: a motion sensing module configured to collect a user's motion data; one or more processor circuits in communication with the motion sensing module and configured to execute instructions causing the processor circuits to: count user steps based on the motion data; detect a bout based on the user steps, the bout comprising a collection of continuous steps; estimate user mechanical work performed during the bout; detect the start of a walking activity, the detecting the start of the walking activity comprising: comparing the user mechanical work to a mechanical work threshold; in response to detecting a value for the user mechanical work below the mechanical work threshold, classifying movement performed by the user based on motion data; and in response to classifying the movement performed by the user as a patterned movement, starting a walking activity; and send a notification to the user requesting confirmation of the start of the walking activity.
Various objectives, features, and advantages of the disclosed subject matter can be more fully appreciated with reference to the following detailed description of the disclosed subject matter when considered in connection with the following drawings, in which like reference numerals identify like elements.
The present disclosure describes systems and methods of detecting the start of an outdoor walking workout. Separating walking workouts from casual walking activities is challenging because the walking motion performed in each type of walking activity is highly similar. Additionally, walking workouts may have lower levels of exertion than other workout activities. To distinguish between walking workouts and casual walking activities, methods of detecting the start of an outdoor walking workout infer the intent of the user based on one or more types of sensor data collected by the wearable device.
In various embodiments, user intent may be inferred based on the mechanical work performed by the user during the walking activity and/or the user's pattern of movement. For example, if a user performs sustained mechanical work or has a predictable direction of travel, workout intent may be inferred by the wearable device. If the user does not perform sustained mechanical work and has a random direction of travel, it is likely the user is out for a casual walk and is performing a walking motion without workout intent. Referring now to the figures, methods of detecting the start of an outdoor walking workout are described below and exemplary embodiments are shown in the figures. However, the disclosure is not limited to the embodiments shown in the figures and described below since not every variation of using sensor data to infer workout intent during a walking activity may be described in detail.
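The intent inference described above can be sketched as simple decision logic. This is an illustrative sketch only: the threshold value and the pattern labels (`"straight"`, `"repetitive"`) are assumptions for demonstration, not values from the disclosure.

```python
def start_walking_activity(work_j, work_threshold_j, movement_pattern):
    """Sketch of workout-intent inference: sustained mechanical work above a
    threshold suggests workout intent; below the threshold, a patterned
    movement (straight or repetitive) can still indicate a workout, while a
    random direction of travel suggests a casual walk."""
    if work_j >= work_threshold_j:
        return True
    return movement_pattern in ("straight", "repetitive")
```

In this sketch, a user performing high sustained work starts an activity regardless of path shape, while a low-work user only starts one if their direction of travel is predictable.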
As described in more detail below, the wearable device 100 may be configured to detect walking workouts performed by the user, record walking workouts, calculate performance information of the user during the walking workout, detect the type of walking activity (e.g., casual stroll, walking workout, and the like) performed by the user, and provide additional functionality related to walking activities to the user. In particular, the wearable device 100 may use motion data obtained from one or more motion sensors, heart rate data obtained from a heart rate sensing module, pressure data from one or more pressure sensors, and/or location data obtained from a GPS module to detect a walking movement and infer when the walking movement is a walking workout.
In some embodiments, main processor 210 can include one or more cores and can accommodate one or more threads to run various applications and modules. Software can run on main processor 210 capable of executing computer instructions or computer code. The main processor 210 can also be implemented in hardware using an application specific integrated circuit (ASIC), programmable logic array (PLA), field programmable gate array (FPGA), or any other integrated circuit.
In some embodiments, wearable device 100 can also include an always on processor 212 which may draw less power than the main processor 210. Whereas the main processor 210 may be configured for general purpose computations and communications, the always on processor 212 may be configured to perform a relatively limited set of tasks, such as receiving and processing data from motion sensor 230, heart rate sensor 244, pressure sensor 246, and other modules within the wearable device 100. In many embodiments, the main processor 210 may be powered down at certain times to conserve battery charge, while the always on processor 212 remains powered on. Always on processor 212 may control when the main processor 210 is powered on or off.
Memory 220 can be a non-transitory computer readable medium, flash memory, a magnetic disk drive, an optical drive, a programmable read-only memory (PROM), a read-only memory (ROM), or any other memory or combination of memories. Memory 220 can include one or more modules 222-228.
The main processor 210 and/or always on processor 212 can be configured to run one or more modules 222-228 stored in memory 220 that are configured to cause main processor 210 or always on processor 212 to perform various steps that are discussed throughout the present disclosure.
In some embodiments, the wearable device 100 can include one or more motion sensors 230. For example, motion sensors 230 can include a gyroscope 232 and an accelerometer 234. In some embodiments, accelerometer 234 may be a three-axis accelerometer that measures linear acceleration in up to three dimensions (for example, x-axis, y-axis, and z-axis). In some embodiments, gyroscope 232 may be a three-axis gyroscope that measures rotational data, such as rotational movement, angular acceleration, and/or angular velocity, in up to three dimensions (for example, yaw, pitch, and roll). In some embodiments, accelerometer 234 may be a microelectromechanical system (MEMS) accelerometer, and gyroscope 232 may be a MEMS gyroscope. Main processor 210 or always on processor 212 of wearable device 100 may receive motion information from one or more motion sensors 230 to track acceleration, rotation, position, and/or orientation information of wearable device 100 in six degrees of freedom through three-dimensional space.
In some embodiments, the wearable device 100 may include other types of sensors in addition to accelerometer 234 and gyroscope 232. For example, the wearable device 100 may include a pressure sensor 246 (e.g., an altimeter, barometer, and the like), a magnetic field sensor 248 (e.g., a magnetometer, compass, and the like), and/or a location sensor (e.g., a Global Positioning System (GPS) sensor).
The wearable device 100 may also include a display 240. The display 240 may be a screen, such as a crystalline (e.g., sapphire) or glass touchscreen, configured to provide output to the user as well as receive input from the user via touch. For example, the display 240 may be configured to display a current heart rate or daily average energy expenditure. The display 240 may receive input from the user to select, for example, which information should be displayed, or whether the user is beginning a physical activity (e.g., starting a session) or ending a physical activity (e.g., ending a session), such as a walking session, a swimming session, a running session, or a cycling session. In some embodiments, wearable device 100 may present output to the user in other ways, such as by producing sound with a speaker, and wearable device 100 may receive input from the user in other ways, such as by receiving voice commands via a microphone.
In various embodiments, wearable device 100 may communicate with external devices via an interface 242, including a configuration to present output to a user or receive input from a user. The interface 242 may be a wireless interface. The wireless interface may be a standard Bluetooth® (IEEE 802.15) interface, such as Bluetooth® v4.0, also known as “Bluetooth low energy.” In various embodiments, the interface may operate according to a cellphone network protocol such as Long Term Evolution (LTE™) or a Wi-Fi (IEEE 802.11) protocol. In various embodiments, the interface 242 may include wired interfaces, such as a headphone jack or bus connector (e.g., Lightning®, Thunderbolt™, USB, etc.).
Wearable device 100 can measure an individual's current heart rate from a heart rate sensor 244. The heart rate sensor 244 may also be configured to determine a confidence level indicating a relative likelihood of an accuracy of a given heart rate measurement. In various embodiments, a traditional heart rate monitor may be used and may communicate with wearable device 100 through a short-range wireless communication method (e.g., Bluetooth).
In various embodiments, the wearable device 100 can include a photoplethysmogram (PPG) sensor. PPG is a technique for measuring a person's heart rate by optically measuring changes in the person's blood flow at a specific location. PPG can be implemented in many different types of devices in various forms and shapes. For example, a PPG sensor can be implemented in a wearable device 100 in the form of a wrist strap, which a user can wear around the wrist. A PPG sensor may also be implemented on the underside of a wearable device 100. The PPG sensor can optically measure the blood flow at the wrist. Based on the blood flow information, the wearable device 100 can derive the person's heart rate.
The wearable device 100 may be configured to communicate with a companion device, such as a smartphone. In various embodiments, wearable device 100 may be configured to communicate with other external devices, such as a notebook or desktop computer, tablet, headphones, Bluetooth headset, etc.
The modules described above are examples, and embodiments of wearable device 100 may include other modules not shown. For example, some embodiments of wearable device 100 may include a rechargeable battery (e.g., a lithium-ion battery), a microphone array, one or more cameras, two or more speakers, a watchband, water-resistant casing or coating, etc. In some embodiments, all modules within wearable device 100 can be electrically and/or mechanically coupled together. In some embodiments, main processor 210 and/or always on processor 212 can coordinate the communication among each module.
In various embodiments, the wearable device 100 may use sensed and collected motion information to predict a user's activity. Examples of activities may include, but are not limited to walking, running, cycling, swimming, skiing, cardio machine activities, and the like. Wearable device 100 may also be able to predict or otherwise detect when a user is stationary (e.g., sleeping, sitting, standing still, driving or otherwise controlling a vehicle, etc.). Wearable device 100 may use a variety of motion data, rotational data, and/or orientation data to predict a user's activity.
Wearable device 100 may use a variety of heuristics, algorithms, or other techniques to predict the user's activity and/or detect activity start and end points. In various embodiments, one or more machine learning techniques and/or predictive models may be trained to predict the user's activity and/or detect activity start and end points. Training the one or more predictive models may include surveying a plurality of datasets including motion data, rotational data, heart rate data, and the like collected during activities having a known activity type (e.g., walking workouts, casual walking activities, running activities, and the like). Wearable device 100 may also estimate a confidence level (e.g., percentage likelihood, degree of accuracy, etc.) associated with a particular prediction (e.g., 90% likelihood that the user is cycling) or predictions (e.g., 60% likelihood that the user is cycling and 40% likelihood that the user is performing some other activity).
In various embodiments, step rate may be used to estimate the user's mechanical work 316. Step rate provides an indicator of how hard a user is working. For example, as step rate increases, the user may need to expend additional energy to maintain the higher step rate and therefore may do more work. Step rate may be a component of the mechanical work calculation performed by the pedestrian work model 314. Other factors used by the pedestrian work model 314 to estimate mechanical work performed by the user may include bout duration, step distance, and load (i.e., resistance).
Step distance may be determined based on GPS data 304, for example, latitude and longitude fixes. GPS data 304 may be provided by a GPS module that may triangulate three or more satellite signals to generate the fixes. In various embodiments, step distance may describe the distance traveled in each step counted by the step counter 308. Step distance may be estimated based on the step count and the total distance traveled during the walking activity. For example, dividing the total distance by the step count may generate an estimate for the average step distance.
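The average step distance estimate described above can be sketched as follows. The haversine distance between successive GPS fixes and the helper names are illustrative assumptions; the disclosure does not specify a particular distance formula.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS latitude/longitude fixes."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def average_step_distance(fixes, step_count):
    """Sum segment distances along the GPS track, then divide the total
    distance traveled by the step count to get meters per step."""
    total = sum(haversine_m(*fixes[i], *fixes[i + 1])
                for i in range(len(fixes) - 1))
    return total / step_count if step_count else 0.0
```

For example, a track covering roughly 111 m walked in 140 counted steps yields an average step distance of about 0.79 m.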
The load or resistance may be based on the grade of the terrain traveled over during the walking activity. The grade may measure the steepness of terrain traversed during a bout (i.e., a collection of continuous steps) or other segment of the walking activity. In various embodiments, altimeter functionality such as estimating terrain grade or steepness, calculating elevation change, and/or determining relative altitude information may be implemented or otherwise determined from pressure data 306 measured by a pressure sensor of the wearable device. Load may indicate a user's mechanical work at each step. For example, walking over a steep grade requires more mechanical work than walking on flat ground. The load estimator 310 may estimate load by applying a load estimation model, e.g., a regression analysis or a decision tree analysis, to terrain grade and step rate information. For example, a relatively high terrain grade for a given step rate may indicate a relatively high load. To estimate load using terrain grade, the load estimator 310 may use a grade factor calculated by the wearable device. The grade factor may describe the resistance at a particular terrain grade and reflect the effort (e.g., work) required by the user to travel over a terrain having a particular grade.
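A minimal sketch of the grade and load estimation described above. The `grade_factor` coefficients and the way grade factor combines with step rate are hypothetical placeholders, not disclosed model parameters.

```python
def terrain_grade(elevation_change_m, horizontal_distance_m):
    """Grade as rise over run (e.g., 0.10 for a 10% incline), with elevation
    change derived from pressure-sensor altitude estimates."""
    if horizontal_distance_m == 0:
        return 0.0
    return elevation_change_m / horizontal_distance_m

def grade_factor(grade):
    """Hypothetical grade factor: flat ground maps to 1.0, uphill adds
    resistance, and downhill reduces it (bounded so load stays positive)."""
    return max(0.25, 1.0 + 4.0 * grade)

def estimate_load(grade, step_rate_hz):
    """Toy load estimate: resistance at the current grade scaled by step rate,
    so a high grade at a given step rate indicates a relatively high load."""
    return grade_factor(grade) * step_rate_hz
```

Walking a 10% grade at 2 steps per second under these assumptions yields a higher load than the same cadence on flat ground, matching the intuition that steep terrain requires more mechanical work per step.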
In some embodiments, the load estimation model may account for a specific activity context. For example, in some activities, load or resistance may be a scaling factor for another variable such as speed. In some embodiments, the estimated load may be filtered using historical estimated load values (e.g., hysteresis) to smooth load estimation. For example, a load filter may limit the amount by which an estimated load may change from one time period to the next.
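The load filter described above, which limits how much the estimated load may change between updates, might be sketched like this; the `max_delta` value is illustrative.

```python
def filter_load(previous_load, new_load, max_delta=0.5):
    """Hysteresis-style smoothing: clamp the per-update change in estimated
    load to +/- max_delta so transient spikes do not jerk the estimate."""
    delta = new_load - previous_load
    if delta > max_delta:
        return previous_load + max_delta
    if delta < -max_delta:
        return previous_load - max_delta
    return new_load
```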
The pedestrian work model 314 can calculate the mechanical work 316 of a user during the walking activity. The pedestrian work model 314 may be specific to walking activities and the wearable device may include one or more alternative work models for calculating mechanical work of a user during a different activity type. For walking activities, the mechanical work can be calculated from the bout duration, load, step rate, and/or step distance. In various embodiments, mechanical work may approximate the amount of force required to move the load over the step distance within the bout duration. Mechanical work 316 generated by the pedestrian work model 314 may be output in Joules (J). The mechanical work of a user during the walking activity may be compared to a work threshold to infer user intent. In various embodiments, the mechanical work threshold may be specific to one or more user characteristics (e.g., age, gender, weight, body mass index (BMI), fitness level, and the like). For example, the mechanical work threshold for a fit person may be higher than the mechanical work threshold for an unfit person for the purposes of determining sustained work to indicate a walking activity. The mechanical work threshold may be determined by surveying a plurality of datasets including mechanical work calculated during walking activities having known periods of casual walking (i.e., walking with no workout intent) and known periods of workout walking (i.e., walking with workout intent). A user specific mechanical work threshold may be determined by limiting the walking activities included in the plurality of datasets to walking activities performed by a particular user and/or a group of users having one or more characteristics in common with the particular user. Mechanical work 316 may also be used for the purposes of estimating caloric expenditure (in METs).
For example, for a walking activity the mechanical work (WR) may be expressed as WR = f(step rate) * g(grade), where f(·) and g(·) denote functions of the parameter(s) within parentheses.
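A minimal sketch of the WR = f(step rate) * g(grade) form with illustrative choices of f and g; the coefficients below are hypothetical, not disclosed model parameters.

```python
def mechanical_work_rate(step_rate_hz, grade):
    """WR = f(step rate) * g(grade): a hypothetical linear step-rate term
    scaled by a hypothetical grade multiplier, yielding work rate in J/s."""
    f = 50.0 * step_rate_hz            # assumed step-rate term (J/s per Hz)
    g = 1.0 + 4.0 * max(grade, 0.0)    # assumed grade multiplier, floor at flat
    return f * g

def bout_mechanical_work(step_rate_hz, grade, bout_duration_s):
    """Total mechanical work in joules over a bout, which can then be
    compared to a (possibly user-specific) mechanical work threshold."""
    return mechanical_work_rate(step_rate_hz, grade) * bout_duration_s
```

Under these assumed terms, a 60-second bout at 2 steps per second on a 10% grade produces more work than the same bout on flat ground, so sustained uphill walking is more likely to cross the workout threshold.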
To determine the movement pattern 416, the wearable device may use one or more types of sensor data. For example, the wearable device may use magnetic field data 402 received from a magnetic field sensor, gyroscope data 404 obtained from a gyroscope, and/or accelerometer data 302 obtained from an accelerometer to determine a user's movement pattern 416. Magnetic field data 402, gyroscope data 404, and acceleration data 302 may be input into a heading model 408. The heading model 408 may estimate a device heading using the input data. In various embodiments, the device heading may be determined by algorithmically combining accelerometer measurements 302 with gyroscope measurements 404 to provide a smoothed estimate of the orientation of the wearable device in three dimensional (3D) space relative to a frame of reference (e.g., a fixed body frame of reference, an inertial frame of reference, and the like). In various embodiments, magnetic field data 402 may also be used to determine the orientation of the wearable device in 3D space relative to a frame of reference (e.g., a fixed body frame of reference or an inertial frame) and/or improve the accuracy of orientation estimates generated based on accelerometer measurements 302 and/or gyroscope measurements.
To determine the orientation of the wearable device in 3D space, rotational data may be determined from motion data measured by the accelerometer, gyroscope, and/or magnetic field sensor. Rotational data may include one or more angles relative to an axis of rotation (e.g., roll, pitch, and yaw) that measure the angular displacement of the wearable device at a plurality of positions. Accelerometer data 302, gyroscope data 404, and/or magnetic field data 402 may be used to generate rotational data including three rotational angles for each sensor. The position of the wearable device may then be calculated by aggregating (e.g., averaging, normalizing, or otherwise combining) the rotational data generated by each sensor using 6-axis and/or 9-axis sensor fusion. In various embodiments, device heading may then be calculated from the yaw component (e.g., rotational angle relative to the x-axis or horizontal plane) of the rotational data. The yaw component of the rotational data may be a rotational angle in a frame of reference (e.g., an inertial frame of reference or a fixed body frame of reference). For example, in a fixed body frame of reference, the rotational angle included in the yaw component may describe the angular motion of the wearable device relative to an axis of rotation that is parallel to the display screen of the wearable device.
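Aggregating per-sensor yaw estimates as described above must respect the 0/360-degree wrap. A circular mean is one standard way to combine angle estimates; it is a sketch of the aggregation idea, not the disclosed fusion algorithm.

```python
import math

def circular_mean_deg(yaws_deg):
    """Combine yaw-angle estimates (e.g., one per sensor) with a circular
    mean, so angles near the wrap (359 and 1 degrees) average to ~0 degrees
    rather than the naive arithmetic mean of 180 degrees."""
    s = sum(math.sin(math.radians(a)) for a in yaws_deg)
    c = sum(math.cos(math.radians(a)) for a in yaws_deg)
    return math.degrees(math.atan2(s, c)) % 360.0
```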
In various embodiments, magnetic field data 402 may be used to improve the accuracy of device headings by providing additional rotational data that describes the position of the wearable device in 3D space. For example, a horizontal angle of rotation (i.e., the yaw component of rotation: the angle of rotation relative to the x-axis or horizontal plane in an inertial frame of reference, or the angle of rotation relative to the axis parallel with the display screen of the wearable device in a fixed body frame of reference) may be calculated using rotational data generated from the magnetic field data 402. The horizontal angle of rotation generated from magnetic field data 402 may then be algorithmically combined (e.g., averaged, scaled, and the like) with the horizontal angles of rotation generated from gyroscope data 404 and accelerometer data 302 respectively to improve accuracy of device headings.
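One common way to algorithmically combine a gyroscope-derived yaw with a magnetometer-derived yaw is a complementary-filter style blend: trust the smooth, fast gyroscope estimate short-term while pulling slowly toward the drift-free magnetometer estimate. The weight below is an illustrative assumption.

```python
def fuse_yaw(gyro_yaw_deg, mag_yaw_deg, mag_weight=0.02):
    """Blend a gyroscope-derived yaw with a magnetometer-derived yaw.
    The difference is taken along the shortest angular path, so 350 and 10
    degrees blend through 0 rather than through 180."""
    diff = (mag_yaw_deg - gyro_yaw_deg + 180.0) % 360.0 - 180.0
    return (gyro_yaw_deg + mag_weight * diff) % 360.0
```

With a small weight, each update nudges the heading a little toward the magnetometer, correcting accumulated gyroscope drift without inheriting the magnetometer's short-term noise.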
In various embodiments, the additional rotational datapoints based on magnetic field data can be used to improve the accuracy of rotational data generated based on the accelerometer data 302 and gyroscope data 404. Rotational datapoints based on magnetic field data may normalize for integration drift and other errors that are commonly included in rotational data generated based on motion data. For example, the angular position or the angular velocity of the wearable device may be obtained based on the angular acceleration component of motion data by integrating the angular acceleration over time. Similarly, the angular position of the wearable device can be obtained based on the angular velocity by integrating the angular velocity over time. Therefore, generating rotational data based on motion data (i.e., angular acceleration and angular velocity) may require double integration of angular acceleration values and single integration of angular velocity values. Due to integration drift, rotational data based on angular acceleration and/or angular velocity may be accurate for only relatively short time intervals (e.g., 30 seconds). The device orientation may be continuously tracked throughout a user's entire workout session (i.e., several minutes or hours). Therefore, integration drift may diminish the accuracy of device orientation, device headings, relative heading, and other device position tracking estimates made throughout the duration of a full workout activity. By including datapoints based on magnetic field data in rotational data used to generate the device position tracking estimates, the device position tracking estimates used to determine user heading may be more accurate and consistent throughout the full workout activity.
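The integration drift described above can be illustrated directly: integrating an angular velocity signal that carries even a small constant sensor bias accumulates a linearly growing angle error. The bias value and sample rate below are illustrative.

```python
def integrate_angular_velocity(omega_deg_s, dt_s, bias_deg_s=0.0):
    """Single-integrate gyroscope angular velocity samples into an angle.
    A constant bias in each sample accumulates linearly with time, which is
    the integration drift that integration-free magnetometer yaw can correct."""
    angle = 0.0
    for w in omega_deg_s:
        angle += (w + bias_deg_s) * dt_s
    return angle
```

For example, ten minutes of samples at 1 Hz from a stationary device with a 0.1 deg/s bias integrates to roughly 60 degrees of heading error, even though the true rotation was zero.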
The performance of the motion sensors included in the wearable device may also not be uniform across all device instances. Motion sensor calibration may be disturbed by significant shock events causing some motion sensors to have better calibration than others and some motion sensors to exhibit more drift in motion data. Rotational data generated based on magnetic field data may compensate for some of these common errors in rotational data derived from motion data. Magnetic field data describes the position of the wearable device relative to a steady state magnetic field near the device. The steady state magnetic field may be, for example, the earth's geomagnetic field, an ambient magnetic field generated by an electronic device or other aspects of the environment local to the wearable device, and the like. Determining rotational data based on magnetic field data does not require an integration operation. Thus, including datapoints derived from magnetic field data in rotational data can reduce the impact of integration drift. Accordingly, noise in rotational data based on motion data attributable to integration drift and inconsistent performance of motion sensors may be normalized by including rotational datapoints based on magnetic field data.
Similarly, noise in rotational data based on magnetic field data caused by, for example, a transient magnetic field generated by a mobile device passing in close proximity to the wearable device, may be normalized by using rotational data derived from motion data to determine the device position. Therefore, using motion data and magnetic field data to determine device position, device orientation, heading estimates, and other wearable device position tracking estimates, improves the precision, accuracy, and reliability of the device position tracking estimates generated by the wearable device.
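The normalization described above can be illustrated with a complementary filter, a common sensor-fusion technique for blending a drift-prone integrated signal with a drift-free but noisy absolute reference. This is a minimal sketch, not the disclosed implementation; the function name, sampling interval, and blend weight `alpha` are illustrative assumptions:

```python
def fuse_heading(gyro_rates_dps, mag_headings_deg, dt=0.01, alpha=0.98):
    """Blend integrated gyroscope yaw (subject to integration drift) with
    magnetometer headings (drift-free but noisy) via a complementary filter.

    gyro_rates_dps: yaw angular velocity samples in degrees/second
    mag_headings_deg: absolute heading samples in degrees (same length)
    alpha: weight on the integrated gyro estimate (illustrative value)
    """
    heading = mag_headings_deg[0]          # initialize from the magnetometer
    for rate, mag in zip(gyro_rates_dps, mag_headings_deg):
        integrated = heading + rate * dt   # single integration of angular velocity
        # The magnetometer term continuously pulls the estimate back toward
        # the absolute heading, bounding the error that integration drift
        # would otherwise accumulate over a multi-hour workout.
        heading = alpha * integrated + (1 - alpha) * mag
    return heading % 360.0
```

With a constant 1 deg/s gyroscope bias, pure integration drifts 10 degrees over 1000 samples, while the fused estimate stays within about half a degree of the magnetometer's 90-degree reference.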
In various embodiments, device headings generated by the heading model 408 may be combined with step count and/or step rate (“Steps”) generated by a step counter 308. The step counter 308 may calculate steps based on accelerometer data 302 as described above in
In various embodiments, the relative headings are used to determine changes in the user's direction of travel. For example, a change in user direction may be indicated by the difference between the device headings at two different time points during the walking activity. To accurately identify changes in user direction throughout the walking activity, the wearable device may continuously determine relative device headings at a predetermined rate. Large relative heading values (e.g., relative heading values above a heading threshold, such as 140 degrees) may indicate a change in direction because sustained, significant changes in the direction of travel of the user device (i.e., the device heading) correspond to changes in the direction of travel of the user. For example, if a user is walking north and turns around and starts walking south, the yaw component of the rotational data of the wearable device worn by the user may change 180 degrees, causing a 180-degree change in device heading. Relative heading values having little change over time may indicate a constant direction of travel. Relative heading values may be combined with steps to associate the direction of travel with the walking pace of the user.
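The relative-heading comparison above can be sketched as follows. This is an illustrative assumption of how consecutive heading samples might be compared against the heading threshold; the function name and sampling scheme are not from the disclosure:

```python
def count_direction_changes(headings_deg, heading_threshold_deg=140.0):
    """Count changes of direction from device headings sampled at a
    predetermined rate during a walking activity.

    A relative heading is the smallest angular difference between two
    consecutive heading samples; values above the threshold (e.g., the
    140-degree example above) are counted as changes of direction.
    """
    changes = 0
    for prev, curr in zip(headings_deg, headings_deg[1:]):
        diff = abs(curr - prev) % 360.0
        relative = min(diff, 360.0 - diff)   # wrap to the 0..180 degree range
        if relative > heading_threshold_deg:
            changes += 1
    return changes
```

A north-to-south turn (0 degrees to 180 degrees) yields a 180-degree relative heading and registers as one change of direction, while gradual drift of a few degrees per sample registers none.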
The movement pattern classifier 414 may determine a movement pattern 416 based on the direction of travel and the steps during the walking activity. For example, walking movements having many changes in direction and frequent changes of walking pace may have a random movement pattern. Walking movements having fewer changes in direction and a more consistent pace may have a patterned movement pattern. For example, walking movements having few changes in direction, changes of direction that do not repeat at regular time and/or distance intervals, and/or a constant walking pace may have a straight movement pattern. Walking movements having changes in direction that repeat at regular time and/or distance intervals during the walking activity and a constant walking pace may have a repetitive movement pattern. To determine the movement pattern 416, the movement pattern classifier 414 may classify a walking movement performed by the user.
To classify the walking movement, the movement pattern classifier 414 may compare the number of changes in direction and/or the walking pace (i.e., walking speed) calculated for a segment of a walking activity to a change of direction threshold and/or an expected walking speed. The number of changes in direction may be determined for a segment of a walking activity having any duration of time (i.e., the entire walking activity, 1 minute of the walking activity, 5 minutes of the walking activity, or any other time period) or distance (i.e., 500 meters, 0.1 miles, 0.5 miles, 1 kilometer, or any other distance). If the number of changes in direction that occur during the segment of the walking activity exceeds the change of direction threshold (e.g., 9 changes in direction), the movement pattern classifier 414 may classify the walking movement performed during the segment as a random movement pattern. If the number of changes in direction that occur during the segment of the walking activity does not exceed the change of direction threshold, the movement pattern classifier 414 may classify the walking movement performed during the segment as a patterned movement (i.e., a straight movement pattern or a repetitive movement pattern). The number of changes of direction and the time and/or distance when the changes of direction occur may be used to differentiate a straight movement pattern from a repetitive movement pattern. For example, if the changes of direction during the segment repeat at regular time and/or distance intervals during the walking activity, the movement pattern classifier 414 may classify the walking movement as a repetitive movement pattern. If the number of changes of direction during the segment is well below the change of direction threshold (e.g., 80% to 90% below the threshold) and/or the changes of direction do not repeat at regular time and/or distance intervals, the movement pattern classifier 414 may classify the walking movement as a straight movement pattern.
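The classification logic above can be sketched as a simple decision function. This is a minimal sketch under stated assumptions: the "well below" cutoff, the regularity tolerance, and all names are illustrative, not the disclosed movement pattern classifier 414:

```python
def classify_movement_pattern(direction_changes, change_times_s,
                              change_threshold=9, straight_fraction=0.15,
                              interval_tolerance_s=5.0):
    """Classify a walking-activity segment as 'random', 'repetitive', or
    'straight' from its direction-change count and timing.

    direction_changes: number of direction changes in the segment
    change_times_s: timestamps (seconds) at which the changes occurred
    change_threshold: change of direction threshold (e.g., 9 changes)
    straight_fraction: 'well below' cutoff as a fraction of the threshold
    interval_tolerance_s: spread allowed for intervals to count as regular
    """
    if direction_changes > change_threshold:
        return "random"            # too many changes of direction
    if direction_changes <= change_threshold * straight_fraction:
        return "straight"          # well below the threshold
    # Otherwise, check whether changes repeat at regular time intervals.
    intervals = [b - a for a, b in zip(change_times_s, change_times_s[1:])]
    if intervals and max(intervals) - min(intervals) <= interval_tolerance_s:
        return "repetitive"        # evenly spaced changes (e.g., laps)
    return "straight"              # few, irregular changes
```

Both "straight" and "repetitive" results correspond to the patterned movement that indicates workout intent; only "random" does not.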
The number of changes of direction included in the change of direction threshold used to detect a walking activity with workout intent (i.e., a walking activity having a straight and/or repetitive movement pattern) and the magnitude of the relative heading included in the heading threshold used to detect a change of direction may be determined by surveying a plurality of datasets including motion data, rotational data, location data, pressure data, and other sensor data measured during walking activities having known periods of casual walking and known periods of workout walking. The plurality of datasets may also include device position measurements, device headings, relative headings, changes of direction, and other metrics derived from the sensor data. A user-specific change of direction threshold and heading threshold may be determined by limiting the plurality of datasets to data collected during walking activities performed by the user and/or a group of users having one or more characteristics in common with the user.
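One way the survey of labeled datasets described above could yield a threshold is a simple sweep that maximizes separation between the known casual and known workout segments. This is an illustrative sketch, not the disclosed derivation procedure; the function name and selection criterion are assumptions:

```python
def derive_threshold(casual_counts, workout_counts):
    """Derive a change of direction threshold from surveyed datasets with
    known casual-walking and known workout-walking segments by sweeping
    candidate thresholds and keeping the most accurate separator.

    casual_counts: direction-change counts from known casual segments
    workout_counts: direction-change counts from known workout segments
    (Workout walking is expected to show fewer changes of direction.)
    """
    candidates = sorted(set(casual_counts) | set(workout_counts))
    best_threshold, best_accuracy = None, -1.0
    total = len(casual_counts) + len(workout_counts)
    for t in candidates:
        # Counts above the threshold are labeled casual; at or below, workout.
        correct = (sum(c > t for c in casual_counts)
                   + sum(w <= t for w in workout_counts))
        accuracy = correct / total
        if accuracy > best_accuracy:
            best_threshold, best_accuracy = t, accuracy
    return best_threshold
```

A user-specific threshold would follow from running the same sweep over only that user's (or a similar cohort's) segments.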
At step 506, the wearable device may estimate mechanical work of the user during the bout as described above in
At step 508, the mechanical work may then be compared to a mechanical work threshold to determine if the walking performed by the user is part of a walking workout or if the walking is casual walking. In various embodiments, if the mechanical work exceeds the mechanical work threshold for a predefined period of time (e.g., 30 seconds, 1 minute, 5 minutes, 15 minutes, or any other length of time), the wearable device may determine the user's work is sustained during the walking activity. The mechanical work threshold may correspond to any mechanical work value (e.g., 7.3 Joules per kilogram or any other value for mechanical work) and may be determined by surveying a plurality of datasets including motion data, pressure data, load data, steps, step rate, and/or mechanical work measured during walking activities having known periods of casual walking and known periods of workout walking. A mechanical work threshold that is specific to a particular user may be determined by limiting the walking activities included in the plurality of datasets to walking activities performed by the particular user and/or a group of users having one or more characteristics in common with the particular user.
In response to detecting a sustained level of work during the walking activity based on the user's mechanical work, the wearable device may start a walking workout at step 514. If the mechanical work does not exceed the mechanical work threshold, the wearable device may determine the user's work is not sustained and may perform additional analysis to determine if the user is performing a walking workout. For example, the wearable device may classify the movement during the walking activity at step 510.
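The start-detection branch described above (steps 508 through 514) can be sketched as follows. The function name, sampling scheme, and the 7.3 J/kg example value are illustrative assumptions under the description above, not the disclosed implementation:

```python
def detect_workout_start(mechanical_work_jpkg, work_threshold_jpkg=7.3,
                         movement_pattern=None):
    """Decide whether to start a walking workout.

    mechanical_work_jpkg: per-interval mechanical work samples (J/kg)
        covering the predefined period (e.g., one sample per minute)
    movement_pattern: movement classifier output, consulted only when the
        user's work is not sustained
    """
    # Step 508: work is 'sustained' if it exceeds the threshold for the
    # whole predefined period, modeled here as every sample exceeding it.
    sustained = all(w > work_threshold_jpkg for w in mechanical_work_jpkg)
    if sustained:
        return True                # step 514: start the walking workout
    # Step 510: otherwise classify the movement; a patterned movement
    # (straight or repetitive) also starts the workout.
    return movement_pattern in ("straight", "repetitive")
```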
The wearable device may classify the user's walking movement by generating a movement pattern as described above in
In response to detecting a walking workout by determining the user's work is sustained during the walking activity and/or the user has a predictable direction of travel, the wearable device may send a start notification to the user at step 516. In various embodiments, the start notification may be a UI displayed on a display screen of the wearable device. The UI may include a selectable option for confirming the start of the walking workout. An exemplary start notification UI is shown below in
At step 604, steps are calculated based on accelerometer data and/or other motion data measured during the walking workout. The steps may be calculated for the entire walking workout and/or a predetermined portion of the walking workout (e.g., 30 seconds, 5 minutes, or any other time period). The steps calculated from motion data are then compared to the expected steps observed over the same time period. As described above, the expected steps may be determined by surveying a plurality of datasets including motion data and/or calculated steps collected during known walking workouts. If the steps are consistent with the expected steps included in the walking workout profile (i.e., the steps measured during the walking workout are within a threshold percent difference of the expected steps), the walking workout may be maintained. If the steps are inconsistent with the expected steps included in the walking workout profile (i.e., the steps measured during the walking workout are not within a threshold percent difference of the expected steps), additional analysis may be performed on sensor data collected during the walking workout to detect the end of the walking workout. For example, the wearable device may perform analysis on the heart rate data at step 606 and/or walking speed data at step 608.
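The threshold-percent-difference comparison used here (and again for heart rate at step 606 and walking speed at step 608) can be sketched as a single helper. The function name and the 20% default are illustrative assumptions; the disclosure derives the actual threshold from surveyed datasets:

```python
def consistent_with_profile(measured, expected, threshold_percent=20.0):
    """Check whether a measured value (steps, heart rate, or walking speed)
    is within a threshold percent difference of the expected value from the
    walking workout profile.
    """
    if expected == 0:
        return measured == 0       # avoid division by zero on an empty profile
    percent_difference = abs(measured - expected) / abs(expected) * 100.0
    return percent_difference <= threshold_percent
```

For example, 105 measured steps against 100 expected (5% difference) is consistent, while 130 against 100 (30% difference) is not and triggers the additional analysis.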
The threshold percent difference between calculated steps and expected steps for a walking workout may be determined by surveying a plurality of datasets including motion data and/or calculated steps collected during walking workouts performed by one or more users and walking workout profiles including the expected steps for walking workouts for the same one or more users. A threshold percent difference between the calculated and expected steps that is specific to a particular user may be determined by limiting the walking workouts and walking workout profiles included in the plurality of datasets to those performed by, and associated with, the particular user and/or a group of users having one or more characteristics in common with the particular user.
At step 606, user heart rate is calculated from heart rate data measured by a heart rate sensor. The user heart rate may be calculated for the entire walking workout and/or a predetermined portion of the walking workout. The user heart rate is then compared to an expected heart rate included in the walking workout profile. As described above, the expected heart rate may be determined by surveying a plurality of datasets including heart rate data collected during walking workouts. If the calculated user heart rate is consistent with the expected heart rate (i.e., the calculated user heart rate is within a threshold percent difference of the expected heart rate), the walking workout may be maintained. If the calculated user heart rate is inconsistent with the expected heart rate included in the walking workout profile (i.e., the calculated user heart rate is not within a threshold percent difference of the expected heart rate), additional analysis may be performed to confirm the end of the walking workout. For example, the wearable device may perform analysis on walking speed data at step 608 to confirm the end of the walking workout.
The threshold percent difference between the calculated user heart rate and the expected user heart rate for a walking workout may be determined by surveying a plurality of datasets including heart rate data collected during walking workouts performed by one or more users and walking workout profiles including the expected heart rates for walking workouts for the same one or more users. A threshold percent difference for the calculated and expected heart rates that is specific to a particular user may be determined by limiting the walking workouts and walking workout profiles included in the plurality of datasets to those performed by, and associated with, the particular user and/or a group of users having one or more characteristics in common with the particular user.
At step 608, user walking speed is calculated using location data generated by a GPS module. For example, user walking speed may be determined based on the amount of time required for the user to travel from a first location measured by the GPS module to a second location measured by the GPS module. The user walking speed may be calculated for the entire walking workout and/or a predetermined period of time and/or distance (e.g., a distance of 500 m, 500 ft, 0.5 mi, and the like, and/or a time of 10 seconds, 1 minute, 15 minutes, and the like). As described above, the expected walking speed may be determined by surveying a plurality of datasets including speed data collected during walking workouts. The wearable device may determine an expected walking speed for a particular user by limiting the walking workouts included in the plurality of datasets to walking workouts performed by the particular user and/or a group of users having one or more characteristics in common with the particular user. The walking speed is then compared to the expected walking speed included in the walking workout profile. If the walking speed is consistent with the expected walking speed (i.e., the calculated walking speed is within a threshold percent difference of the expected walking speed), the walking workout may be maintained. If the walking speed is inconsistent with the expected walking speed included in the walking workout profile (i.e., the calculated walking speed is not within a threshold percent difference of the expected walking speed), the wearable device may wait for an activity timeout at step 610.
The threshold percent difference between calculated walking speed and expected walking speed may be determined by surveying a plurality of datasets including walking speed data collected during walking workouts performed by one or more users and walking workout profiles including the expected walking speed for walking workouts for the same one or more users. A threshold percent difference of the calculated and expected walking speed that is specific to a particular user may be determined by limiting the walking workouts and walking workout profiles included in the plurality of datasets to those performed by, and associated with, the particular user and/or a group of users having one or more characteristics in common with the particular user.
If an activity timeout is detected, the wearable device may end the walking workout at step 612. If the user resumes walking before an activity timeout, the walking workout may be maintained and steps 604-610 may be repeated until the end of a walking workout is detected. At step 614, in response to detecting the end of a walking workout, the wearable device may send an end notification to the user. In various embodiments, the end notification may be a UI displayed on a display screen of the wearable device. The UI may include a selectable option for confirming the end of the walking workout. An exemplary end notification UI is shown below in
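The end-detection cascade of steps 604 through 612 can be summarized in a short decision function. This is an illustrative sketch of the control flow only; the function name and boolean inputs (each standing for one profile-consistency check) are assumptions:

```python
def check_end_of_workout(steps_ok, heart_rate_ok, speed_ok, timed_out):
    """Decide whether to maintain or end a walking workout.

    Each check stands in for a profile-consistency comparison from the
    description above; a later check matters only when every earlier one
    was inconsistent with the walking workout profile.
    Returns 'maintain' or 'end'.
    """
    if steps_ok:                   # step 604: steps consistent with profile
        return "maintain"
    if heart_rate_ok:              # step 606: heart rate consistent
        return "maintain"
    if speed_ok:                   # step 608: walking speed consistent
        return "maintain"
    # Step 610: all checks inconsistent; wait for an activity timeout.
    return "end" if timed_out else "maintain"   # step 612 ends on timeout
```

Note that a single consistent signal keeps the workout alive, so the workout ends (and the step 614 end notification is sent) only after every check fails and the timeout elapses.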
The foregoing description is intended to convey a thorough understanding of the embodiments described by providing a number of specific exemplary embodiments and details involving activity detection, workout performance tracking, walking activity monitoring, and motion pattern classification. It should be appreciated, however, that the present disclosure is not limited to these specific embodiments and details, which are examples only. It is further understood that one possessing ordinary skill in the art, in light of known systems and methods, would appreciate the use of the invention for its intended purposes and benefits in any number of alternative embodiments, depending on specific design and other needs.
It is to be understood that the disclosed subject matter is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The disclosed subject matter is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting. As such, those skilled in the art will appreciate that the conception, upon which this disclosure is based, may readily be utilized as a basis for the designing of other structures, methods, and systems for carrying out the several purposes of the disclosed subject matter. Therefore, the claims should be regarded as including such equivalent constructions insofar as they do not depart from the spirit and scope of the disclosed subject matter.
As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “includes” and/or “including”, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the terms “and/or” and “at least one of” include any and all combinations of one or more of the associated listed items.
Certain details are set forth in the foregoing description and in
Although the disclosed subject matter has been described and illustrated in the foregoing exemplary embodiments, it is understood that the present disclosure has been made only by way of example, and that numerous changes in the details of implementation of the disclosed subject matter may be made without departing from the spirit and scope of the disclosed subject matter.
This application claims the benefit of U.S. Provisional Application Ser. No. 62/907,543 filed Sep. 27, 2019, the entire contents of which is hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
4566461 | Lubell et al. | Jan 1986 | A |
4740009 | Hoelzl | Apr 1988 | A |
5158093 | Shvartz et al. | Oct 1992 | A |
5663897 | Geiser | Sep 1997 | A |
5664499 | Kingsmill | Sep 1997 | A |
6013008 | Fukushima | Jan 2000 | A |
6059724 | Campbell et al. | May 2000 | A |
6582380 | Kazlausky et al. | Jun 2003 | B2 |
6687535 | Hautala et al. | Feb 2004 | B2 |
6837827 | Lee et al. | Jan 2005 | B1 |
6862525 | Beason et al. | Mar 2005 | B1 |
6868338 | Elliott | Mar 2005 | B1 |
6876947 | Darley | Apr 2005 | B1 |
7254516 | Case et al. | Aug 2007 | B2 |
7311675 | Peifer et al. | Dec 2007 | B2 |
7377180 | Cunningham | May 2008 | B2 |
7387029 | Cunningham | Jun 2008 | B2 |
7467060 | Kulach et al. | Dec 2008 | B2 |
7534206 | Lovitt et al. | May 2009 | B1 |
7647196 | Kahn et al. | Jan 2010 | B2 |
7690556 | Kahn et al. | Apr 2010 | B1 |
7771320 | Riley et al. | Aug 2010 | B2 |
7805149 | Werner et al. | Sep 2010 | B2 |
7841967 | Kahn et al. | Nov 2010 | B1 |
8290480 | Abramson et al. | Oct 2012 | B2 |
8483775 | Buck et al. | Jul 2013 | B2 |
8531180 | Piemonte et al. | Sep 2013 | B2 |
8589174 | Nelson et al. | Nov 2013 | B2 |
8638320 | Harley et al. | Jan 2014 | B2 |
8653956 | Berkobin et al. | Feb 2014 | B2 |
8784271 | Brumback et al. | Jul 2014 | B2 |
8890854 | Tenuta et al. | Nov 2014 | B2 |
8892391 | Tu et al. | Nov 2014 | B2 |
8894576 | Alwan et al. | Nov 2014 | B2 |
8911329 | Lin et al. | Dec 2014 | B2 |
8928635 | Harley et al. | Jan 2015 | B2 |
9195305 | Markovic et al. | Nov 2015 | B2 |
9264862 | Tu et al. | Feb 2016 | B2 |
9413871 | Nixon et al. | Aug 2016 | B2 |
9448250 | Pham et al. | Sep 2016 | B2 |
9526430 | Srinivas et al. | Dec 2016 | B2 |
9704412 | Wells et al. | Jul 2017 | B2 |
9737761 | Sivaraj | Aug 2017 | B1 |
9788794 | Leboeuf et al. | Oct 2017 | B2 |
9817948 | Swank et al. | Nov 2017 | B2 |
9918646 | Alvarado et al. | Mar 2018 | B2 |
9998864 | Kumar et al. | Jun 2018 | B2 |
10098549 | Tan et al. | Oct 2018 | B2 |
10154789 | Raghuram et al. | Dec 2018 | B2 |
10188347 | Self et al. | Jan 2019 | B2 |
10206627 | Leboeuf et al. | Feb 2019 | B2 |
10219708 | Altini | Mar 2019 | B2 |
10244948 | Pham et al. | Apr 2019 | B2 |
10290260 | Wu et al. | May 2019 | B2 |
10292606 | Wisbey et al. | May 2019 | B2 |
10512406 | Martinez et al. | Dec 2019 | B2 |
10524670 | Raghuram et al. | Jan 2020 | B2 |
10620232 | Tu et al. | Apr 2020 | B2 |
10687707 | Tan et al. | Jun 2020 | B2 |
10687752 | Pham et al. | Jun 2020 | B2 |
10694994 | Alvarado et al. | Jun 2020 | B2 |
10699594 | Mermel et al. | Jun 2020 | B2 |
10617912 | Narasimha Rao et al. | Jul 2020 | B2 |
10709933 | Tan et al. | Jul 2020 | B2 |
11051720 | Perry et al. | Jul 2021 | B2 |
11103749 | Mermel et al. | Aug 2021 | B2 |
11278765 | Mohrman | Mar 2022 | B2 |
11517789 | Xie et al. | Dec 2022 | B2 |
20010022828 | Pyles | Sep 2001 | A1 |
20020019585 | Dickinson | Feb 2002 | A1 |
20030032460 | Cannon et al. | Feb 2003 | A1 |
20030138763 | Roncalez et al. | Jul 2003 | A1 |
20040064061 | Nissila | Apr 2004 | A1 |
20050065443 | Ternes | Mar 2005 | A1 |
20050107723 | Wehman et al. | May 2005 | A1 |
20050124906 | Childre et al. | Jun 2005 | A1 |
20050212701 | Nimmo | Sep 2005 | A1 |
20060046898 | Harvey | Mar 2006 | A1 |
20060064277 | Jung et al. | Mar 2006 | A1 |
20060136173 | Case et al. | Jun 2006 | A1 |
20060190217 | Lee et al. | Aug 2006 | A1 |
20060217231 | Parks et al. | Sep 2006 | A1 |
20070100666 | Stivoric et al. | May 2007 | A1 |
20070150229 | Fujiwara | Jun 2007 | A1 |
20070219059 | Schwartz et al. | Sep 2007 | A1 |
20070275825 | O'Brien | Nov 2007 | A1 |
20070276271 | Chan | Nov 2007 | A1 |
20080096726 | Riley et al. | Apr 2008 | A1 |
20080214360 | Stirling et al. | Sep 2008 | A1 |
20090009320 | O'Connor et al. | Jan 2009 | A1 |
20090024332 | Karlov et al. | Jan 2009 | A1 |
20090043531 | Kahn et al. | Feb 2009 | A1 |
20090063099 | Counts et al. | Mar 2009 | A1 |
20090143199 | Nishibayashi | Jun 2009 | A1 |
20090240461 | Makino et al. | Sep 2009 | A1 |
20090319221 | Kahn et al. | Dec 2009 | A1 |
20100030350 | House et al. | Feb 2010 | A1 |
20100030482 | Li | Feb 2010 | A1 |
20100130890 | Matsumura et al. | May 2010 | A1 |
20100184564 | Molyneux et al. | Jul 2010 | A1 |
20100204952 | Irlam et al. | Aug 2010 | A1 |
20100210953 | Sholder et al. | Aug 2010 | A1 |
20100210975 | Anthony et al. | Aug 2010 | A1 |
20100217099 | Leboeuf et al. | Aug 2010 | A1 |
20100274102 | Teixeira | Oct 2010 | A1 |
20100298656 | McCombie et al. | Nov 2010 | A1 |
20110040193 | Seppanen et al. | Feb 2011 | A1 |
20110054359 | Sazonov et al. | Mar 2011 | A1 |
20110082008 | Cheung et al. | Apr 2011 | A1 |
20110131012 | Czaja et al. | Jun 2011 | A1 |
20110152695 | Granqvist et al. | Jun 2011 | A1 |
20110195707 | Faerber et al. | Aug 2011 | A1 |
20110238485 | Haumont et al. | Sep 2011 | A1 |
20110301436 | Teixeira | Dec 2011 | A1 |
20120006112 | Lee et al. | Jan 2012 | A1 |
20120083715 | Yuen et al. | Apr 2012 | A1 |
20120172677 | Logan et al. | Jul 2012 | A1 |
20120238832 | Jang et al. | Sep 2012 | A1 |
20120245714 | Mueller et al. | Sep 2012 | A1 |
20120296455 | Ohnemus et al. | Nov 2012 | A1 |
20120322621 | Bingham et al. | Dec 2012 | A1 |
20130006515 | Vellaikal et al. | Jan 2013 | A1 |
20130006522 | Vellaikal et al. | Jan 2013 | A1 |
20130023739 | Russell | Jan 2013 | A1 |
20130041590 | Burich et al. | Feb 2013 | A1 |
20130053990 | Ackland | Feb 2013 | A1 |
20130073255 | Yuen et al. | Mar 2013 | A1 |
20130085861 | Dunlap | Apr 2013 | A1 |
20130096943 | Carey et al. | Apr 2013 | A1 |
20130135097 | Doezema | May 2013 | A1 |
20130158686 | Zhang et al. | Jun 2013 | A1 |
20130178335 | Lin et al. | Jul 2013 | A1 |
20130197377 | Kishi et al. | Aug 2013 | A1 |
20130218053 | Kaiser et al. | Aug 2013 | A1 |
20130267794 | Fernstrom et al. | Oct 2013 | A1 |
20130326137 | Bilange et al. | Dec 2013 | A1 |
20130340287 | Stewart | Dec 2013 | A1 |
20140071082 | Singh et al. | Mar 2014 | A1 |
20140073486 | Ahmed et al. | Mar 2014 | A1 |
20140087708 | Kalita et al. | Mar 2014 | A1 |
20140088444 | Saalasti et al. | Mar 2014 | A1 |
20140107932 | Luna | Apr 2014 | A1 |
20140109390 | Manning | Apr 2014 | A1 |
20140121471 | Walker | May 2014 | A1 |
20140167973 | Letchner et al. | Jun 2014 | A1 |
20140172238 | Craine | Jun 2014 | A1 |
20140172361 | Chiang et al. | Jun 2014 | A1 |
20140197946 | Park et al. | Jul 2014 | A1 |
20140200906 | Bentley et al. | Jul 2014 | A1 |
20140207264 | Quy | Jul 2014 | A1 |
20140213920 | Lee et al. | Jul 2014 | A1 |
20140221854 | Wai | Aug 2014 | A1 |
20140228649 | Rayner et al. | Aug 2014 | A1 |
20140244071 | Czaja et al. | Aug 2014 | A1 |
20140266160 | Coza | Sep 2014 | A1 |
20140266789 | Matus | Sep 2014 | A1 |
20140276127 | Ferdosi et al. | Sep 2014 | A1 |
20140277628 | Nieminen | Sep 2014 | A1 |
20140278139 | Hong et al. | Sep 2014 | A1 |
20140278229 | Hong et al. | Sep 2014 | A1 |
20140279123 | Harkey et al. | Sep 2014 | A1 |
20140316305 | Venkatraman et al. | Oct 2014 | A1 |
20140348367 | Vavrus et al. | Nov 2014 | A1 |
20150066526 | Cheng et al. | Mar 2015 | A1 |
20150072712 | Huang et al. | Mar 2015 | A1 |
20150087929 | Rapoport et al. | Mar 2015 | A1 |
20150088006 | Rapoport et al. | Mar 2015 | A1 |
20150100141 | Hughes | Apr 2015 | A1 |
20150105096 | Chowdhury et al. | Apr 2015 | A1 |
20150119728 | Blackadar et al. | Apr 2015 | A1 |
20150147734 | Flores et al. | May 2015 | A1 |
20150148632 | Benaron | May 2015 | A1 |
20150173631 | Richards et al. | Jun 2015 | A1 |
20150182149 | Rapoport et al. | Jul 2015 | A1 |
20150250417 | Cheng et al. | Sep 2015 | A1 |
20150256689 | Erkkila et al. | Sep 2015 | A1 |
20150260514 | Menelas et al. | Sep 2015 | A1 |
20150294440 | Roberts | Oct 2015 | A1 |
20150327804 | Lefever et al. | Nov 2015 | A1 |
20150328523 | Heling et al. | Nov 2015 | A1 |
20150338926 | Park et al. | Nov 2015 | A1 |
20150345985 | Fung et al. | Dec 2015 | A1 |
20150357948 | Goldstein | Dec 2015 | A1 |
20150374240 | Lee | Dec 2015 | A1 |
20160021238 | Abramson et al. | Jan 2016 | A1 |
20160038083 | Ding et al. | Feb 2016 | A1 |
20160054449 | Pekonen et al. | Feb 2016 | A1 |
20160057372 | Iwane et al. | Feb 2016 | A1 |
20160058302 | Raghuram et al. | Mar 2016 | A1 |
20160058329 | Srinivas et al. | Mar 2016 | A1 |
20160058332 | Tan et al. | Mar 2016 | A1 |
20160058333 | Arnold et al. | Mar 2016 | A1 |
20160058356 | Raghuram et al. | Mar 2016 | A1 |
20160058370 | Raghuram et al. | Mar 2016 | A1 |
20160058371 | Singh et al. | Mar 2016 | A1 |
20160058372 | Raghuram et al. | Mar 2016 | A1 |
20160059079 | Watterson | Mar 2016 | A1 |
20160066859 | Crawford et al. | Mar 2016 | A1 |
20160069679 | Jackson et al. | Mar 2016 | A1 |
20160084869 | Yuen et al. | Mar 2016 | A1 |
20160143579 | Martikka | May 2016 | A1 |
20160147319 | Agarwal et al. | May 2016 | A1 |
20160166178 | Fuss et al. | Jun 2016 | A1 |
20160170998 | Frank et al. | Jun 2016 | A1 |
20160206248 | Sartor et al. | Jul 2016 | A1 |
20160223578 | Klosinski, Jr. et al. | Aug 2016 | A1 |
20160242646 | Obma | Aug 2016 | A1 |
20160256058 | Pham et al. | Sep 2016 | A1 |
20160263435 | Venkatraman | Sep 2016 | A1 |
20160269572 | Erkkila et al. | Sep 2016 | A1 |
20160287177 | Huppert et al. | Oct 2016 | A1 |
20160301581 | Carter et al. | Oct 2016 | A1 |
20160314633 | Bonanni et al. | Oct 2016 | A1 |
20160361020 | Leboeuf et al. | Dec 2016 | A1 |
20160363449 | Metzler et al. | Dec 2016 | A1 |
20160374614 | Cavallaro et al. | Dec 2016 | A1 |
20170007166 | Roovers et al. | Jan 2017 | A1 |
20170061817 | Mettler May | Mar 2017 | A1 |
20170074897 | Mermel et al. | Mar 2017 | A1 |
20170082649 | Tu et al. | Mar 2017 | A1 |
20170094450 | Tu et al. | Mar 2017 | A1 |
20170111768 | Smith et al. | Apr 2017 | A1 |
20170181644 | Meer et al. | Jun 2017 | A1 |
20170182360 | Chang et al. | Jun 2017 | A1 |
20170188893 | Venkatraman et al. | Jul 2017 | A1 |
20170202486 | Martikka et al. | Jul 2017 | A1 |
20170211936 | Howell et al. | Jul 2017 | A1 |
20170242499 | Shah et al. | Aug 2017 | A1 |
20170242500 | Shah et al. | Aug 2017 | A1 |
20170251972 | Jayaraman et al. | Sep 2017 | A1 |
20170259116 | Mestas | Sep 2017 | A1 |
20170269734 | Graff | Sep 2017 | A1 |
20170269785 | Abdollahian et al. | Sep 2017 | A1 |
20170273619 | Alvarado et al. | Sep 2017 | A1 |
20170347885 | Tan et al. | Dec 2017 | A1 |
20170357007 | Miller et al. | Dec 2017 | A1 |
20170367658 | LeBoeuf et al. | Dec 2017 | A1 |
20170368413 | Shavit | Dec 2017 | A1 |
20180028863 | Matsuda | Feb 2018 | A1 |
20180043210 | Niehaus et al. | Feb 2018 | A1 |
20180049694 | Singh Alvarado et al. | Feb 2018 | A1 |
20180050235 | Tan et al. | Feb 2018 | A1 |
20180055375 | Martinez et al. | Mar 2018 | A1 |
20180055439 | Pham et al. | Mar 2018 | A1 |
20180056123 | Narasimha Rao et al. | Mar 2018 | A1 |
20180056128 | Narasimha Rao et al. | Mar 2018 | A1 |
20180056129 | Narasimha Rao et al. | Mar 2018 | A1 |
20180249908 | Anthony et al. | Sep 2018 | A1 |
20180279914 | Patek et al. | Oct 2018 | A1 |
20180303381 | Todd et al. | Oct 2018 | A1 |
20180344217 | Perry et al. | Dec 2018 | A1 |
20190038938 | Nagasaka et al. | Feb 2019 | A1 |
20190076063 | Kent et al. | Mar 2019 | A1 |
20190090087 | Taylor | Mar 2019 | A1 |
20190184230 | Lee et al. | Jun 2019 | A1 |
20190184233 | Xie et al. | Jun 2019 | A1 |
20190360813 | Zhao | Nov 2019 | A1 |
20200232796 | Lee | Jul 2020 | A1 |
20210068689 | Ochs et al. | Mar 2021 | A1 |
20210068712 | Humblet et al. | Mar 2021 | A1 |
20210068713 | Dervisoglu et al. | Mar 2021 | A1 |
20210093918 | Dervisoglu et al. | Apr 2021 | A1 |
20220114873 | Williams | Apr 2022 | A1 |
20220241641 | Mermel et al. | Aug 2022 | A1 |
Number | Date | Country |
---|---|---|
2008100295 | May 2008 | AU |
102481479 | May 2012 | CN |
104218976 | Dec 2014 | CN |
105031905 | Nov 2015 | CN |
105068656 | Nov 2015 | CN |
2465824 | Jun 2010 | GB |
259KOL2015 | Dec 2015 | IN |
2004089317 | Mar 2004 | JP |
2010-051333 | Mar 2010 | JP |
2013-039316 | Feb 2013 | JP |
2014-042757 | Mar 2014 | JP |
2016-150018 | Aug 2016 | JP |
2018-000543 | Jan 2018 | JP |
2018-015187 | Feb 2018 | JP |
2019028796 | Feb 2019 | JP |
2020148558 | Sep 2020 | JP |
122807 | Feb 2010 | RO |
0361779 | Jul 2003 | WO |
2010090867 | Aug 2010 | WO |
2011105914 | Sep 2011 | WO |
2015126182 | Aug 2015 | WO |
2015200900 | Dec 2015 | WO |
2016044831 | Mar 2016 | WO |
2016073620 | May 2016 | WO |
WO-2016142246 | Sep 2016 | WO |
WO-2018117914 | Jun 2018 | WO |
Entry |
---|
Mattfield, R., Jesch, E., & Hoover, A. (n.d.). A New Dataset for Evaluating Pedometer Performance. IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Clemson, South Carolina, United States of America. https://10.1109/BIBM.2017.8217769 (Year: 2017). |
“Your Fitness FAQ, Why is it important to warm up and cool down in a workout?”, 2012, Web, Retrieved from: http://www.yourfitnessfaq.com/whyisitimportanttowarmupandcooldowninaworkout.html. |
Bo et al, “TEXIVE: Detecting Drivers Using Personal Smart Phones by Leveraging Inertial Sensors,” Department of ComputerScience, Illinois Institute of Technology, Chicago IL, Dec. 7, 2014, pp. 1-12. |
Brooks, G.A. et al., “Exercise Physiology: Human Bioenergetics and Its Applications,” Fourth Edition, McGraw Hill, ISBN 0-07-255642-0, Chapter 2: Bioenergetics, Chapter 10: Metabolic Response to Exercise: Lactate Metabolism During Exercise and Recovery, Excess Postexercise 02 Consumption (EPOC), O2 Deficit, O2 Debt, and the Anaerobic Threshold, Chapter 16: Cardiovascular Dynamics During Exercise, Chapter 21: Principles of Endurance Conditioning, Chapter 27: Exercise Testing and Prescription, 141 pages (2004). |
Bruce, RA et al., “Exercising testing in adult normal subjects and cardiac patients,” Pediatrics, vol. 32, No. Suppl., pp. 742-756 (Oct. 1963). |
Bruce, RA et al., “Maximal oxygen intake and nomographic assessment of functional aerobic impairment in cardiovascular disease,” American Heart Journal, vol. 85, Issue 4, pp. 546-562 (Apr. 1973). |
Burke, Edmund R., “High-Tech Cycling,” Second Edition, Human Kinetics, Chapter 4: Optimizing the Crank Cycle and Pedaling Cadence, Chapter 5: Cycling Biomechanics, Chapter 6: Cycling Power, Chapter 10: Physiology of Professional Road Cycling, Chapter 11: Physiology of Mountain Biking, 131 pages (2003). |
Cavanagh, P.R. et al., “The effect of stride length variation on oxygen uptake during distance running,” Medicine and Science in Sports and Exercise, vol. 14, No. 1, pp. 30-35 (1982). |
Chu, “In-Vehicle Driver Detection Using Mobile Phone Sensors”, Submitted for Graduation with departmental Distinction in Electrical and Computer Engineering, Apr. 20, 2011, pp. 1-21. |
Earnest, C.P. et al., “Cross-sectional association between maximal estimated cardiorespiratory fitness, cardiometabolic risk factors and metabolic syndrome for men and women in the Aerobics Center Longitudinal Study,” Mayo Clin Proceedings, vol. 88, No. 3, pp. 259-270, 20 pages (Mar. 2013). |
Fox, S.M. et al., “Physical Activity and the Prevention of Coronary Heart Disease,” Bull. N.Y. Acad. Med., vol. 44, No. 8, pp. 950-967 (Aug. 1968). |
Frankenfield et al., "Comparison of Predictive Equations for Resting Metabolic Rate in Healthy Nonobese and Obese Adults: A Systematic Review," Journal of the American Dietetic Association, May 2005, vol. 105, No. 5, pp. 775-789. |
Gao et al., “Evaluation of accelerometer based multi-sensor versus single-sensor activity recognition systems”, Medical engineering & physics 36.6 (2014): 779-785. |
Glass et al., “ACSM's Metabolic Calculations Handbook,” Lippincott Williams & Wilkins, 124 pages (2007). |
Hasson et al., “Accuracy of four resting metabolic rate production equations: Effects of sex, body mass index, age, and race/ethnicity”, Journal of Science and Medicine in Sport, 2011, vol. 14, p. 344-351. |
Human Kinetics, Aerobic Workout Components, 2011, Web, Retrieved from: http://www.humankinetics.com/excerpts/excerpts/aerobicworkoutcomponentsexcerpt. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2018/047290, mailed on Nov. 8, 2018, 14 pages. |
Isaacs et al., "Modeling energy expenditure and oxygen consumption in human exposure models: accounting for fatigue and EPOC," Journal of Exposure Science and Environmental Epidemiology, 2008, 18: 289-298. |
Jackson et al., "Prediction of functional aerobic capacity without exercise testing," Medicine and Science in Sports and Exercise, 22(6), 863-870, 1990. |
Keytel et al., "Prediction of energy expenditure from heart rate monitoring during submaximal exercise," Journal of Sports Sciences, 23(3), 2005: 289-297. |
KINprof, 2011, Predictive VO2max tests, Web Video, Retrieved from: https://www.youtube.com/watch?v=_9e3HcYIsm8. |
Kunze et al., "Where am i: Recognizing on-body positions of wearable sensors," Location- and Context-Awareness, Springer Berlin Heidelberg, 2005, pp. 264-275. |
Kyle, Chester R., “Reduction of Wind Resistance and Power Output of Racing Cyclists and Runners Travelling in Groups”, Ergonomics, vol. 22, No. 4, 1979, pp. 387-397. |
Lavie et al., "Impact of cardiorespiratory fitness on the obesity paradox in patients with heart failure," Mayo Clinic Proceedings, vol. 88, No. 3, pp. 251-258 (Mar. 2013). |
Le, et al., “Sensor-based Training Optimization of a Cyclist Group”, Seventh International Conference on Hybrid Intelligent Systems, IEEE 2007, pp. 265-270. |
Lucas et al., "Mechanisms of orthostatic intolerance following very prolonged exercise," 2008, J. Appl. Physiol., 105: pp. 213-225. |
Margaria, R. et al., “Energy cost of running,” Journal of Applied Physiology, vol. 18, No. 2, pp. 367-370 (Mar. 1, 1963). |
McArdle, W.D. et al., "Exercise Physiology: Nutrition, Energy and Human Performance," Seventh Edition, Lippincott Williams & Wilkins, Chapter 5: Introduction to Energy Transfer, Chapter 6: Energy Transfer in the Body, Chapter 7: Energy Transfer During Exercise, Chapter 8: Measurement of Human Energy Expenditure, Chapter 9: Human Energy Expenditure During Rest and Physical Activity, Chapter 10: Energy Expenditure During Walking, Jogging, Running and Swimming, Chapter 11: Individual Differences and Measurement of Energy Capacities, Chapter 21: Training for Anaerobic and Aerobic Power. |
Myers et al., "Exercise Capacity and Mortality Among Men Referred for Exercise Testing," The New England Journal of Medicine, vol. 346, No. 11, pp. 793-801 (Mar. 14, 2002). |
Noakes, Timothy D., “Lore of Running,” Fourth Edition, Human Kinetics, Chapter 2: Oxygen Transport and Running Economy, Chapter 3: Energy Systems and Running Performance, 157 pages (2002). |
Novatel, "IMU Errors and Their Effects," Novatel Application Notes APN-064 Rev A, pp. 1-6, Feb. 21, 2014. |
PCT International Application No. PCT/US2017/049693, International Search Report and Written Opinion dated Aug. 12, 2017. |
Rapoport, Benjamin I., “Metabolic Factors Limiting Performance in Marathon Runners,” PloS Computational Biology, vol. 6, Issue 10, 13 pages (Oct. 2010). |
Rowlands et al. “Assessing Sedentary Behavior with the GENEActiv: Introducing the Sedentary Sphere”, Medicine and science in sports and exercise, 46.6 (2014), pp. 1235-1247. |
Sabatini, “Kalman-filter-based orientation determination using inertial/magnetic sensors: observability analysis and performance evaluation”, Sep. 27, 2011, Sensors 2011, 11, 9182-9206. |
Song et al., “Training Activity Recognition Systems Online Using Real-Time Crowdsourcing”, University of Rochester Computer Science, UbiCom' 12, Sep. 5-8, 2012 (2 pages). |
Tanaka, H. et al., “Age-predicted maximal heart rate revisited,” Journal of the American College of Cardiology, vol. 37, Issue 1, pp. 153-156 (Jan. 2001). |
Vella et al., Exercise After-Burn: Research Update, 2005, Web, Retrieved from: http://www.unm.edu/˜lkravitz/Article%20folder/epocarticle.html. |
Wang et al., "Time constant of heart rate recovery after low level exercise as a useful measure of cardiovascular fitness," Conf. Proc. IEEE Eng. Med. Biol. Soc., vol. 1, pp. 1799-1802 (2006). |
Yamaji et al., "Relationship Between Heart Rate and Relative Oxygen Intake in Male Subjects Aged 10 to 27 Years," J. Human Ergol., 1978, 7:29-39 (Year: 1978). |
U.S. Appl. No. 17/015,912, filed Sep. 9, 2020, Humblet et al. |
U.S. Appl. No. 17/015,965, filed Sep. 9, 2020, Dervisoglu et al. |
U.S. Appl. No. 17/016,020, filed Sep. 9, 2020, Ochs et al. |
Alexander, “Energetics and Optimization of Human Walking and Running,” Am J Human Biology, Mar. 20, 2002, 14:641-648. |
Lasecki, “Real-Time Crowd Labeling for Deployable Activity Recognition,” University of Rochester Computer Science, Feb. 23, 2013, 10 pages. |
Latt et al., “Walking speed, cadence and step length are selected to optimize the stability of head and pelvis accelerations,” Experimental Brain Research, Aug. 24, 2007, 184: 201-209. |
Morgan et al., “Effect of step length optimization on the aerobic demand of running,” Journal of Applied Physiology, 1994, 245-251. |
PCT International Preliminary Report on Patentability in International Appln. No. PCT/US2017/049693, dated Mar. 5, 2019, 8 pages. |
PCT International Preliminary Report on Patentability in International Appln. No. PCT/US2018/047290, dated Mar. 17, 2020, 9 pages. |
Pfitzinger.com “Optimal Marathon Training Sessions, Distance Coach.com, Intelligent Training for Distance Runners,” archived May 15, 2012, <https://web.archive.org/web/20120515081237/http://www.pfitzinger.com/marathontraining.shtml>, printed Jan. 20, 2017, 3 pages. |
Romijn et al., “Regulation of endogenous fat and carbohydrate metabolism in relation to exercise intensity and duration,” Am. J. Physiol., 1993, 6:1-13. |
Triendurance.com "Running with a Higher Cadence, Triendurance," Oct. 23, 2021, retrieved from <https://web.archive.org/web/20080228162904/http://www.triendurance.com/Related.asp?PageID=14&NavID=7>, 2 pages. |
Zhao, “New Developments of the Typical MEMS and Wearable Sensor Technologies,” Micronanoelectronic Technology, Jan. 2015, 52(1):1-13 (with English abstract). |
Zhou et al., “Development of the Technical Monitoring System for Swimming Competition,” China Sport Science and Technology, Aug. 2008, 44(4):84-86 (with English abstract). |
Shen et al., “MiLift: Efficient Smartwatch-Based Workout Tracking Using Automatic Segmentation,” IEEE Transactions on Mobile Computing, Jul. 2018, 17(7):1609-1622. |
Unuma et al., JP2007093433, published on Apr. 12, 2007, 27 pages (machine translated English version). |
Kodama et al., Examination of Inertial Sensor-Based Estimation Methods of Lower Limb Joint Moments and Ground Reaction Force: Results for Squat and Sit-to-Stand Movements in the Sagittal Plane, Aug. 1, 2016, Sensors 2016, pp. 1-19 (Year: 2016). |
Number | Date | Country | |
---|---|---|---|
20210093917 A1 | Apr 2021 | US |
Number | Date | Country | |
---|---|---|---|
62907543 | Sep 2019 | US |