The present disclosure relates generally to detecting swim activity using inertial sensors.
When a user is exercising or otherwise moving, it is often useful to detect when the user changes direction. Keeping track of the user's turns or changes in direction can be useful in many applications. As an example, when the user is swimming, detecting a turn made by the user may indicate that the user has completed a lap. As another example, when the user is walking and/or running, knowing that the user has made a turn or changed direction can be useful in tracking the user's location. It is sometimes, however, not easy or practical for the user to keep track of his or her own changes in direction. For example, when the user is swimming, he or she may not want to mentally keep track of the number of turns made. Accordingly, it is desirable to provide methods and systems of detecting turns while swimming.
When a user is swimming, there is often a need to detect when and how frequently the user takes a breath. For example, this information can be used to assess exertion, level of effort, general fitness, and/or swimming ability. It is generally, however, not practical for the user to keep track of the breaths he or she takes. Accordingly, it is desirable to provide methods and systems of detecting a user's breaths while swimming.
When a user is doing activities that include multiple types of motion, there is often a need to classify the types of motion. As an example, when a user is swimming laps, the user can switch between two types of motion: swimming and turning. As another example, when a user is running, the user can switch between running and walking. Knowing which type of motion a user is performing is useful in many applications, including estimating the energy expenditure of the user. Accordingly, it is desirable to provide methods and systems of determining a user's types of motion, including how many strokes the user has taken while swimming.
When a user is swimming, there is often a need to determine the number of laps a user swims during a swimming session. Keeping track of the number of laps the user swims can be useful in many applications, such as to calculate the total distance a user swims and/or the energy expenditure associated with a swimming session. Accordingly, it is desirable to provide methods and systems of determining the number of laps a user swims during a swimming session.
When a user is performing a swimming session, the user may transition from periods of swimming to periods of rest. While it is reasonable to expect that a user tracking his/her swim metrics (e.g., lap count, lap speed, strokes per lap, time splits, distance, calories, etc.) via a wearable device will indicate the start and end of the workout through interaction with the device, it is not always practical to do so. In a typical swim workout, periods of continuous swimming are interspersed with varying durations of rest. Accordingly, it is desirable to detect periods of lap swimming for the purpose of accurate swim metric evaluation.
When a user is swimming in a pool, there is often a need to know the length of the swimming pool. Information of the length of a swimming pool can be used to calculate the total distance a user swims and the energy expenditure associated with a swimming session. The pool length information, however, is not always readily available to users. Additionally, users may not be able to accurately estimate the pool length. Accordingly, it is desirable to provide methods and systems of determining a length of a swimming pool.
The present disclosure relates to a method for improving an accuracy of a wearable device while determining swimming metrics of a user during a swimming session. In some embodiments, the method can include: receiving, by a processor circuit of the wearable device, motion data from one or more motion sensors of the wearable device; determining, by the processor circuit using the motion data, a first set of rotational data of the wearable device, wherein the first set of rotational data is expressed in a first frame of reference; converting, by the processor circuit, the first set of rotational data into a second set of rotational data, wherein the second set of rotational data is expressed in a second frame of reference; determining, by the processor circuit, one or more swimming metrics of the user based on the second set of rotational data, wherein the one or more swimming metrics comprise at least one of turns, breaths, laps, swimming styles, and swimming strokes; and outputting, by the processor circuit, the one or more swimming metrics of the user.
In some embodiments, the first frame of reference can include a body-fixed frame of reference with respect to the wearable device. In some embodiments, the second frame of reference can include an inertial frame of reference.
In some embodiments, the method can include: determining, by the processor circuit, yaw rotational data from the second set of rotational data; determining, by the processor circuit, one or more turns of the user based on the yaw rotational data. In some embodiments, the method can include: determining unfiltered yaw rotational data, wherein the unfiltered yaw rotational data are part of the second set of rotational data; and filtering the unfiltered yaw rotational data. In some embodiments, the method can include: determining a time constant proportional to a period which the user needs to complete a stroke; and filtering the unfiltered yaw rotational data based on the time constant.
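The yaw-filtering step described above can be sketched as a first-order low-pass filter whose time constant is proportional to the user's stroke period. This is an illustrative implementation only; the function name, the proportionality constant `k`, and the specific filter form are assumptions, not taken from the disclosure.

```python
def smooth_yaw(yaw_samples, dt, stroke_period, k=1.0):
    """Low-pass filter yaw angles (radians) sampled every dt seconds,
    using a time constant proportional to the user's stroke period.
    k is an assumed proportionality constant (illustrative)."""
    tau = k * stroke_period          # time constant ~ one stroke period
    alpha = dt / (tau + dt)          # first-order IIR filter coefficient
    filtered = [yaw_samples[0]]
    for y in yaw_samples[1:]:
        # each output moves a fraction alpha toward the new sample
        filtered.append(filtered[-1] + alpha * (y - filtered[-1]))
    return filtered
```

A longer stroke period yields a larger time constant, so stroke-to-stroke yaw oscillations are smoothed away while slower, sustained yaw changes (such as an actual turn) pass through.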
In some embodiments, the method can include: determining a pitch angle from the second set of rotational data; comparing the pitch angle with a threshold angle; and determining one or more breaths of the user based upon comparing the pitch angle with the threshold angle. In some embodiments, the threshold angle can be associated with a swimming style of the user, wherein the swimming style is at least one of freestyle, butterfly, or breast stroke. In some embodiments, the threshold can be associated with a swimming skill level of the user.
In some embodiments, the method can include: converting the second set of rotational data to a set of two-dimensional rotational data; adding one or more constraints to the set of two-dimensional rotational data; and counting, by the processor circuit, one or more swimming strokes from the constrained two-dimensional rotational data. In some embodiments, the method can include: determining a primary axis of rotation based on the second set of rotational data; projecting the second set of three-dimensional rotational data to a two-dimensional space based on the primary axis of rotation; and determining the set of two-dimensional rotational data based on the projection. In some embodiments, the one or more constraints comprise at least one of accelerometer energy, moment arm calculations, or rotational direction. In some embodiments, the method can include counting revolutions of circles in the constrained two-dimensional rotational data. In some embodiments, the method can include counting revolutions of semi-circles in the constrained two-dimensional rotational data.
In some embodiments, the method can include: detecting a number of turns and a number of strokes of the user during the swimming session based on the received motion data; determining a stroke range per lap based on the number of turns and the number of strokes; determining whether a turn is not detected; inserting a missing turn in response to a determination that a turn is not detected; determining a variance of strokes per lap; adjusting the detected number of turns to reduce the variance of strokes per lap; and determining a lap count of the user based on the adjusted number of turns.
In some embodiments, the method can include: determining whether the number of strokes of the user converges in the swimming session; or determining whether the number of strokes made by the user converges in a historical swimming session. In some embodiments, the method can include: determining a standard deviation of the number of strokes among consecutive turns; and comparing the standard deviation with a threshold. In some embodiments, the threshold can be 1.5 strokes.
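The convergence check above can be sketched as a comparison of the per-lap stroke-count standard deviation against the 1.5-stroke threshold mentioned in the disclosure. The function name and the handling of short sessions are assumptions for illustration.

```python
import statistics

def strokes_converged(strokes_per_lap, threshold=1.5):
    """Return True when per-lap stroke counts are consistent, i.e., the
    sample standard deviation is below the threshold (e.g., 1.5 strokes).
    Fewer than two laps is treated as not yet converged (an assumption)."""
    if len(strokes_per_lap) < 2:
        return False
    return statistics.stdev(strokes_per_lap) < threshold
```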
In some embodiments, the method can include: comparing the number of strokes between two consecutive turns with the determined stroke range per lap. In some embodiments, the method can include: comparing the number of strokes between two consecutive turns with a threshold, wherein the threshold is determined by multiplying a mean value of the number of strokes per lap with a ratio.
In some embodiments, the method can include: determining a stroke rate of the user; classifying a stroke style for the user based on the motion data; determining a confidence value based on the stroke rate and the stroke style; determining a motion signature of the user, wherein the motion signature is swimming; and determining the user is swimming based on the confidence value and the motion signature. In some embodiments, the stroke style comprises at least one of freestyle, backstroke, breaststroke, or butterfly.
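The swim-detection logic in this paragraph can be illustrated with a simple fusion rule combining stroke rate, classifier confidence, and motion signature. All threshold values, the plausible stroke-rate band, and the function signature here are hypothetical, not from the disclosure.

```python
def is_swimming(stroke_rate_hz, style_confidence, signature_is_swimming,
                rate_range=(0.3, 1.5), conf_threshold=0.6):
    """Hypothetical fusion rule: report swimming when the stroke rate
    falls in a plausible band, the stroke-style classifier is confident,
    and the motion signature matches swimming. Thresholds illustrative."""
    rate_ok = rate_range[0] <= stroke_rate_hz <= rate_range[1]
    return rate_ok and style_confidence >= conf_threshold and signature_is_swimming
```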
In some embodiments, the method can include: receiving an input from the user whether to calibrate a length of the swimming pool; if the input indicates that the user selects to calibrate the length of the swimming pool: prompting the user to perform an activity along an edge of the swimming pool, wherein the edge is parallel with a direction the user swims, receiving distance data associated with the activity, calculating the length of the swimming pool based on the distance data, and determining the one or more swimming metrics based on the calculated length of the swimming pool; and if the input indicates that the user does not select to calibrate the length of the swimming pool: counting a number of swimming strokes per lap, and calculating the length of the swimming pool based on the number of strokes per lap and a default stroke length, and determining the one or more swimming metrics based on the calculated length of the swimming pool.
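The two pool-length paths above (calibration by walking/running along the pool edge, or estimation from strokes per lap) can be sketched as follows. The default stroke length, the step length, and the function names are illustrative assumptions; the disclosure does not specify these values.

```python
def pool_length_from_steps(step_count, step_length_m=0.7):
    """Calibration path: the user walks or runs along the pool edge and a
    pedometer reports steps; steps times an assumed step length (0.7 m is
    illustrative) gives the pool length in meters."""
    return step_count * step_length_m

def pool_length_from_strokes(strokes_per_lap, default_stroke_length_m=2.0):
    """Fallback path when the user skips calibration: counted strokes per
    lap times a default per-stroke travel distance (2.0 m is illustrative)."""
    return strokes_per_lap * default_stroke_length_m
```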
In some embodiments, the method can include: receiving, by the processor circuit, a number of steps associated with the activity from a pedometer of the wearable device, wherein the activity comprises at least one of walking or running. In some embodiments, the method can include: receiving, by the processor circuit, location data from a GPS sensor of the wearable device.
The present disclosure also relates to a system for improving an accuracy of a wearable device while determining one or more swimming metrics of a user during a swimming session. In some embodiments, the system can include: one or more motion sensors configured to collect motion data; and a processor circuit coupled to the one or more motion sensors and configured to execute instructions causing the processor to: determine a first set of rotational data, wherein the first set of rotational data is expressed in a first frame of reference; convert the first set of rotational data into a second set of rotational data, wherein the second set of rotational data is expressed in a second frame of reference; determine one or more swimming metrics of the user based on the second set of rotational data; and output the one or more swimming metrics of the user.
Other features and advantages will become apparent from the following detailed description and drawings.
Various objects, features, and advantages of the present disclosure can be more fully appreciated with reference to the following detailed description of the present disclosure when considered in connection with the following drawings, in which like reference numerals identify like elements.
The present disclosure relates to a method and system for detecting swim activity based on motion sensor signals obtained from a wearable device. Generally, user arm movement when swimming has distinct periodic signatures, unlike periods of rest which are typified by random user behavior.
The wearable device can include one or more motion sensors to collect data about the wearable device's position and orientation in space and to track changes to the wearable device's position and orientation over time.
In some embodiments, processor 210 can include one or more cores and can accommodate one or more threads to run various applications and modules. Software can run on processor 210 capable of executing computer instructions or computer code. Processor 210 can also be implemented in hardware using an application specific integrated circuit (ASIC), programmable logic array (PLA), field programmable gate array (FPGA), or any other integrated circuit.
Memory 220 can be a non-transitory computer readable medium, flash memory, a magnetic disk drive, an optical drive, a programmable read-only memory (PROM), a read-only memory (ROM), or any other memory or combination of memories. Memory 220 can include one or more modules 230.
Processor 210 can be configured to run module 230 stored in memory 220 that is configured to cause processor 210 to perform various steps that are discussed throughout the present disclosure, such as, for example, the methods described in connection with
The motion information received from one or more motion sensors 240 may be expressed in a body-fixed frame of reference with respect to wearable device 100. In some embodiments, the motion information can be converted from the body fixed frame of reference to the inertial frame of reference. Conversion of sensor data to the inertial frame of reference is a necessary process prior to stroke detection, as well as stroke counting, turn detection, stroke phase classification and in some aspects of stroke classification, as described in the respective applications referenced above, and incorporated by reference herein in their entirety.
In some embodiments, wearable device 100 may include other types of sensors in addition to accelerometer 260 and gyroscope 250. For example, wearable device 100 may include an altimeter or barometer, or other types of location sensors, such as a GPS sensor. Wearable device 100 may also include display 270. Display 270 may be a screen, such as a crystalline (e.g., sapphire) or glass touchscreen, configured to provide output to the user as well as receive input from the user via touch. For example, display 270 may be configured to display a current heart rate or daily average energy expenditure. Display 270 may receive input from the user to select, for example, which information should be displayed, or whether the user is beginning a physical activity (e.g., starting a session) or ending a physical activity (e.g., ending a session), such as a swimming session, a running session, a weight lifting session, a walking session or a cycling session. In some embodiments, wearable device 100 may present output to the user in other ways, such as by producing sound with a speaker (not shown), and wearable device 100 may receive input from the user in other ways, such as by receiving voice commands via a microphone (not shown).
In some embodiments, wearable device 100 may communicate with external devices via interface 280, including a configuration to present output to a user or receive input from a user. Interface 280 may be a wireless interface. The wireless interface may be a standard Bluetooth (IEEE 802.15.1) interface, such as Bluetooth v4.0, also known as “Bluetooth low energy.” In other embodiments, the interface may operate according to a cellphone network protocol such as Long Term Evolution (LTE) or a Wi-Fi (IEEE 802.11) protocol. In other embodiments, interface 280 may include wired interfaces, such as a headphone jack or bus connector (e.g., Lightning, Thunderbolt, USB, etc.).
Wearable device 100 can measure an individual's current heart rate from heart rate sensor 290. Heart rate sensor 290 may also be configured to determine a confidence level indicating a relative likelihood of an accuracy of a given heart rate measurement. In other embodiments, a traditional heart rate monitor may be used and may communicate with wearable device 100 through a short-range wireless communication method (e.g., Bluetooth).
Wearable device 100 may be configured to communicate with a companion device 300 (
The modules described above are examples, and embodiments of wearable device 100 may include other modules not shown. For example, some embodiments of wearable device 100 may include a rechargeable battery (e.g., a lithium-ion battery), a microphone or a microphone array, one or more cameras, one or more speakers, a watchband, water-resistant casing or coating, etc. In some embodiments, all modules within wearable device 100 can be electrically and/or mechanically coupled together. In some embodiments, processor 210 can coordinate the communication among each module.
In another example, wearable device 100 may not include an altimeter or barometer, whereas an alternative embodiment may include one. In the case where wearable device 100 does not include an altimeter or barometer, an altimeter or barometer of companion device 300 may collect altitude or relative altitude information, and wearable device 100 may receive the altitude or relative altitude information via interface 280 (
In another example, wearable device 100 may receive motion information from companion device 300. Wearable device 100 may compare the motion information from companion device 300 with motion information from one or more motion sensors 240 of wearable device 100. Motion information such as data from accelerometer 260 and/or gyroscope 250 may be filtered (e.g. by a high-pass, low-pass, band-pass, or band-stop filter) in order to improve the quality of motion information. For example, a low-pass filter may be used to remove some ambient noise.
Wearable device 100 may use sensed and collected motion information to predict a user's activity. Examples of activities may include, but are not limited to, swimming, walking, running, cycling, weight lifting, etc. Wearable device 100 may also be able to predict or otherwise detect when a user is sedentary (e.g., sleeping, sitting, standing still, driving or otherwise controlling a vehicle, etc.). Wearable device 100 may use a variety of motion information, including, in some embodiments, motion information from a companion device. In some embodiments, information from one or more of accelerometers, gyroscopes, global positioning (GPS) devices, and heart rate sensors can be used to determine whether a user is engaging in swimming.
In
In
It is noted that the expression of direction 450 is the same in
In
It is noted that the expression of gravity direction 440 is the same in
At step 1010, motion information may be received from the one or more motion sensors 240 on a wearable device (e.g., wearable device 100) of a user. In some embodiments, motion information may include three-dimensional rotational information from one or more sensors 240 such as gyroscope 250 and three-dimensional acceleration information from one or more sensors 240 such as accelerometer 260. In some embodiments, motion information may be filtered, such as by a low-pass filter, to remove unwanted ambient noise.
At step 1020, the angular velocity of wearable device 100 may be determined with respect to a frame of reference such as a body-fixed frame of reference or an inertial frame of reference.
At step 1030, the gravity determination method 1000 may determine whether the angular velocity of wearable device 100 determined at step 1020 is below a threshold. For example, the threshold may be approximately 0.05 radians per second, 0.2 radians per second, or 0.5 radians per second, etc. If the angular velocity exceeds the threshold (e.g., when the user is doing exercise), the gravity determination method 1000 may return to step 1010. In some embodiments, the gravity determination method 1000 may pause or wait for a period of time (e.g., 1 second, 5 seconds, 1 minute, etc.) before proceeding at step 1010.
If the angular velocity is below the threshold (e.g., when the user is relatively still), the gravity determination method 1000 may proceed to step 1040. In some embodiments, at step 1030 wearable device 100 also determines whether the magnitude of the forces acting on wearable device 100 is approximately equal to the normal force of gravity (1 g) before proceeding to step 1040. If the magnitude is not approximately the normal magnitude, the gravity determination method 1000 may also return to step 1010. Estimating the direction of gravity when the angular velocity is below the threshold (e.g., when the user is relatively still) is important because wearable device 100 will then not be confused by acceleration due to other movements. Hypothetically, if wearable device 100 were undergoing a 1 g acceleration along its x-axis, wearable device 100 might mistake that acceleration for the direction of gravity.
At step 1040, the direction of gravity relative to wearable device 100 may be estimated. For example, in some embodiments, when wearable device 100 is held relatively still, accelerometer 260 within wearable device 100 may provide data about the direction of forces acting on wearable device 100, which may be attributable primarily to gravity. In some embodiments, gravity determination method 1000 may also determine whether the user wearing wearable device 100 is accelerating (e.g., speeding up or slowing down) or traveling at an approximately constant velocity so as to further improve the estimate of the direction of gravity.
In some embodiments, gravity determination method 1000 may end after outputting the estimated direction of gravity. In other embodiments, the gravity determination method 1000 may return to step 1010 to refine or otherwise repeat the method of estimating the direction of gravity relative to the wearable device.
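Steps 1010 through 1040 above can be sketched as follows: skip samples while the device is rotating, check that the net force is approximately 1 g, and then take the accelerometer direction as the gravity estimate. The function name, the sample format (acceleration in units of g paired with a gyroscope magnitude in rad/s), and the tolerance values are illustrative assumptions.

```python
import math

def estimate_gravity(samples, gyro_thresh=0.2, tol=0.1):
    """Scan (accel_xyz, gyro_norm) samples; when the angular velocity is
    below gyro_thresh (step 1030) and the acceleration magnitude is within
    tol of 1 g, return the unit vector of the accelerometer reading as the
    estimated direction of gravity (step 1040). Thresholds illustrative."""
    for accel, gyro_norm in samples:
        if gyro_norm >= gyro_thresh:
            continue                          # device rotating: skip sample
        mag = math.sqrt(sum(a * a for a in accel))
        if abs(mag - 1.0) > tol:              # net force not ~1 g: skip
            continue
        return tuple(a / mag for a in accel)  # unit vector toward gravity
    return None                               # no quiescent sample found
```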
At step 1110, gravity determination method 1100 may periodically or continuously check for the presence of a companion device (e.g., companion device 300). For example, in some embodiments, wearable device 100 may determine whether a connection (e.g., Bluetooth, IEEE 802.11 Wi-Fi, or other wireless or wired communication channel) has been established or may be established with companion device 300. If the companion device 300 is present, gravity determination method 1100 may proceed to step 1120.
At step 1120, the direction of gravity relative to companion device 300 may be estimated. In some embodiments, in contrast to the gravity determination method 1000, it may not be necessary to check whether the angular velocity of companion device 300 is below a threshold because most or all of the angular velocity of companion device 300 may be orthogonal to the direction of gravity.
At step 1130, the direction of gravity relative to companion device 300 may be outputted. In some embodiments, the direction of gravity relative to companion device 300 may be combined or otherwise compared with the direction of gravity relative to wearable device 100. In some embodiments, companion device 300 may further determine a rotation rate around the direction of gravity relative to the companion device and output the rotation rate instead of or in addition to the direction of gravity relative to companion device 300.
In some embodiments, gravity determination method 1100 may end after outputting the estimated direction of gravity. In other embodiments, gravity determination method 1100 may return to step 1110 to refine or otherwise repeat the method of estimating the direction of gravity relative to the wearable device.
Detecting Turns
At step 1210, motion information may be received from one or more motion sensors 240 on wearable device 100. In some embodiments, motion information may include three-dimensional rotational data of wearable device 100 from gyroscope 250. In some embodiments, motion information may include three-dimensional accelerations of wearable device 100 from accelerometer 260.
At step 1220, wearable device 100 determines a first set of rotational data of wearable device 100 based on the motion information received from one or more motion sensors 240. In some embodiments, the rotational data of wearable device 100 include how wearable device 100 rotates, such as angular velocities of wearable device 100, with respect to a frame of reference. In some embodiments, the first set of rotational data is received from gyroscope 250 and is expressed in a body-fixed frame of reference with respect to wearable device 100.
At step 1230, wearable device 100 converts the first set of rotational data into a second set of rotational data. As described above, rotational data in the body-fixed frame of reference cannot readily indicate whether or not wearable device 100 undergoes movements with respect to external references. To address this issue, wearable device 100 converts the rotational data in the body-fixed frame of reference into rotational data in an inertial frame of reference using techniques appreciated by people skilled in the art such as the one discussed in “Kalman-filter-based orientation determination using inertial/magnetic sensors: observability analysis and performance evaluation,” Angelo Maria Sabatini, published Sep. 27, 2011, Sensors 2011, 11, 9182-9206.
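The conversion at step 1230 can be sketched as applying a body-to-inertial rotation matrix to each body-frame sample. In practice the orientation matrix would be maintained over time by an attitude filter (e.g., the Kalman-filter approach in the Sabatini reference); this sketch assumes the matrix is already available, and the function name and convention are illustrative.

```python
def body_to_inertial(omega_body, R):
    """Rotate a body-frame angular-velocity vector into the inertial frame
    using a 3x3 orientation matrix R (assumed convention: R maps body
    coordinates to inertial coordinates). R would normally come from an
    attitude filter; here it is supplied by the caller."""
    return [sum(R[i][j] * omega_body[j] for j in range(3)) for i in range(3)]
```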
At step 1240, wearable device 100 determines that the user wearing wearable device 100 is making a turn based on the set of rotational data expressed in the inertial frame of reference.
Detecting Breaths
At step 1410, motion information may be received from one or more motion sensors 240 on wearable device 100. In some embodiments, motion information may include three-dimensional rotational data of wearable device 100 from gyroscope 250. In some embodiments, motion information may include three-dimensional accelerations of wearable device 100 from accelerometer 260.
At step 1420, wearable device 100 determines a first set of rotational data of wearable device 100 based on the motion information received from one or more motion sensors 240. In some embodiments, the rotational data of wearable device 100 include how wearable device 100 rotates, such as angular position, angular velocity, and/or angular acceleration of wearable device 100, with respect to a frame of reference. In some embodiments, if the rotational data of wearable device 100 is angular acceleration, then angular velocity and/or angular position can be obtained by integrating the angular acceleration over time. Likewise, if the rotational data of wearable device 100 is angular velocity, then angular position can be obtained by integrating the angular velocity over time. In some embodiments, the first set of rotational data is received from gyroscope 250 and is expressed in a body-fixed frame of reference with respect to wearable device 100.
At step 1430, wearable device 100 converts the first set of rotational data into a second set of rotational data. As described above, rotational data in the body-fixed frame of reference cannot readily indicate whether or not wearable device 100 undergoes movements with respect to external references. To address this issue, wearable device 100 converts the rotational data in the body-fixed frame of reference into rotational data in an inertial frame of reference using techniques appreciated by people skilled in the art such as the one discussed in “Kalman-filter-based orientation determination using inertial/magnetic sensors: observability analysis and performance evaluation,” Angelo Maria Sabatini, published Sep. 27, 2011, Sensors 2011, 11, 9182-9206.
At step 1440, wearable device 100 determines that the user wearing wearable device 100 is taking a breath based on the second set of rotational data, by monitoring when the pitch rotational data exceed a threshold. When the user is swimming freestyle, breast stroke, or butterfly, the user's breaths are often associated with upward movements of the arm/wrist. Accordingly, wearable device 100 can determine that the user takes a breath when the user's arm/wrist is moving upward.
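The pitch-threshold monitoring at step 1440 can be sketched as counting rising crossings of the pitch angle above a threshold, with each crossing treated as one breath. The 30-degree default is an illustrative assumption; per the disclosure, the actual threshold would vary with stroke style and skill level.

```python
def count_breaths(pitch_angles, threshold_deg=30.0):
    """Count rising crossings of the pitch angle (degrees) above the
    threshold; each crossing counts as one breath. Hysteresis via the
    'above' flag prevents double-counting while pitch stays elevated."""
    breaths = 0
    above = False
    for p in pitch_angles:
        if p >= threshold_deg and not above:
            breaths += 1          # pitch just rose above threshold
            above = True
        elif p < threshold_deg:
            above = False         # re-arm once pitch drops back down
    return breaths
```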
Counting Swim Strokes
At step 1610, wearable device 100 receives three dimensional motion information from a motion sensor 240.
At step 1620, the wearable device 100 determines a first set of three dimensional rotational data of the wearable device 100.
At step 1630, wearable device 100 converts the first set of three dimensional rotational data into a second set of three dimensional rotational data. As described above, the three dimensional rotational data in the body-fixed frame of reference cannot readily indicate whether or not wearable device 100 undergoes movements with respect to external references. To address this issue, wearable device 100 converts the three dimensional rotational data in the body-fixed frame of reference into three dimensional rotational data in an inertial frame of reference using techniques appreciated by people skilled in the art such as the one discussed in “Kalman-filter-based orientation determination using inertial/magnetic sensors: observability analysis and performance evaluation,” Angelo Maria Sabatini, published Sep. 27, 2011, Sensors 2011, 11, 9182-9206.
When the motion data is transformed to the inertial frame, the data appears as repetitive motion orbits as the user swims. Once it has been determined that the user is swimming, the 3D orbits can be examined in the inertial reference frame using a principal component analysis. For example, the plane that includes the most data points (corresponding to the plane in which the swimmer demonstrates the greatest stroke energy) defines the first and second principal component vectors. The third vector, perpendicular to the first two, defines the primary axis of rotation (i.e., the third principal component vector). Once the plane and axis of rotation are determined, the second set of three dimensional rotational data can be projected onto a 2D space (step 1640). The projection can be expressed as: R^T(I − ww^T) (equation 1), where R is a rotation matrix, superscript T denotes the matrix transpose, I is the 3×3 identity matrix, and w is the third principal component vector of the PCA.
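The PCA step at 1640 can be sketched as follows: find the principal directions of the centered orbit data, take the smallest-variance direction as the primary rotation axis w, and express each sample in the 2D coordinates of the orbit plane (equivalent to applying (I − ww^T) and rotating into the plane, as in equation (1)). The function name and the SVD-based formulation are illustrative choices.

```python
import numpy as np

def primary_axis_and_projection(samples):
    """Find the primary rotation axis of 3D rotational samples via PCA and
    return (axis, 2D coordinates). The orbit plane is spanned by the first
    two principal components; the third (smallest-variance) component is
    the primary axis of rotation w, as described in the disclosure."""
    X = np.asarray(samples, dtype=float)
    Xc = X - X.mean(axis=0)                # center the orbit data
    # rows of Vt are principal directions, ordered by decreasing variance
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    w = Vt[2]                              # primary axis of rotation
    plane = Vt[:2]                         # basis of the orbit plane
    coords2d = Xc @ plane.T                # 2D coordinates in that plane
    return w, coords2d
```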
At step 1645, additional constraints can be added to the 2D rotational data to project rotational data with cleaner orbits and facilitate more reliable stroke counting. In some embodiments, these constraints can be accelerometer energy, moment arm calculations, and/or rotational direction (e.g., clockwise and counterclockwise).
At step 1650, the revolutions of the circles or semi-circles shown in the constrained 2D rotational data can be counted to determine a stroke number.
In some embodiments, data captured for arm motions that do not show sufficient energy (as measured by the accelerometer) or sufficient extension (as measured by the moment arm calculations) to be considered true swim strokes can be eliminated from the 2D projection.
In some embodiments, clockwise and counterclockwise rotational direction data can be used as constraints to eliminate certain data points from the projection. For example, only the clockwise rotational direction data (e.g., for freestyle and butterfly strokes) or the counterclockwise rotational direction data (e.g., for backstroke) can be considered when counting strokes, to eliminate any unintentional gyroscope drift when executing a stroke. For example,
It is from the representation of the data shown in 2D projection 1708 that the number of revolutions can be counted and can be equated with a number of strokes. In order to determine the number of revolutions, a threshold or a number of thresholds can be established along the path of revolution. For example, if the stroke style yields a full rotation (e.g., backstroke, butterfly and freestyle), then a single threshold can be established along the rotational path. Each time the threshold is crossed, another revolution (which represents a stroke) is counted. A threshold is crossed when there is a data point before and after the threshold line.
In some embodiments, when a stroke style only exhibits a partial revolution, instead of a full revolution (e.g., breaststroke), multiple thresholds can be established along a rotational path (e.g., at 45° intervals) to capture the stroke somewhere along its semi-rotational path. When the motion that exhibits a partial rotation crosses one of the established thresholds, the partial rotation can be counted.
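The threshold-crossing count described above can be sketched as follows. This is an illustrative sketch, assuming the 2D orbit is represented as an N×2 array of points; the function name and the angle-unwrapping approach are assumptions, not from the disclosure.

```python
import numpy as np

def count_revolutions(points_2d, thresholds_deg=(0.0,)):
    """Count threshold crossings along a 2D stroke orbit.  For styles
    with a full rotation, a single threshold suffices; for styles with a
    partial rotation, several thresholds (e.g., at 45-degree intervals)
    can be supplied to catch the stroke along its semi-rotational path."""
    angles = np.unwrap(np.arctan2(points_2d[:, 1], points_2d[:, 0]))
    crossings = 0
    for th in np.deg2rad(thresholds_deg):
        # Number of times the unwrapped angle sweeps past the threshold,
        # i.e., a data point lies before and after the threshold line.
        k_start = np.floor((angles[0] - th) / (2 * np.pi))
        k_end = np.floor((angles[-1] - th) / (2 * np.pi))
        crossings += int(abs(k_end - k_start))
    return crossings
```

For a full-rotation style, each crossing of the single threshold counts one stroke; for a partial-rotation style, a stroke can be counted whenever any of the thresholds is crossed.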
In some embodiments, the system and method can include a spurious stroke check module to eliminate arm motions that are not true strokes. For example, the system and method of the present disclosure can include a voting mechanism module that only counts strokes when they are not spread too far apart in time, before committing these strokes as real strokes.
Counting Laps
The present disclosure describes ways to determine the number of laps a swimmer swims in a swimming session. Generally, when a swimmer reaches an end of a swimming lap, he or she will turn to continue to swim. Therefore, finishing a lap is typically associated with a turn made by the swimmer.
In reality, however, a swimmer may be detected to make a turn sometimes even if he or she has not finished a lap. For example, the swimmer may take a goggle break, take an out-of-pool break, make a turn in the middle of the lap, and/or any other activities that may cause the swimmer to intentionally or unintentionally make a turn without reaching the end of a lap. Therefore, a swimmer may be detected to make more than one turn per lap.
When the swimmer makes a turn without finishing a lap, the turn can be referred to as a false or premature turn in that the turn does not correspond to the finish of a lap. One way to identify false turns is to look at the number of strokes between two consecutive turns. For example, if it generally takes a swimmer 15 to 20 strokes to finish a lap, and there are only eight strokes between two turns, then at least one of the turns is a false turn.
At step 2302, motion information may be received from one or more motion sensors 240 on wearable device 100. In some embodiments, motion information may include three-dimensional rotational data of wearable device 100 from gyroscope 250. In some embodiments, motion information may include three-dimensional accelerations of wearable device 100 from accelerometer 260.
At step 2304, wearable device 100 determines rotational data of wearable device 100 expressed in an inertial frame of reference as described above. In some embodiments, wearable device 100 can, additionally or alternatively, include rotational data expressed in a body-fixed frame of reference.
At step 2306, wearable device 100 detects each time the user makes a turn during the swimming session based on the rotational data, as described above.
At step 2308, wearable device 100 detects each time the user makes a stroke and the swimming style associated with the stroke. In some embodiments, wearable device 100 can detect whether or not the user makes a stroke. In some embodiments, wearable device 100 can only detect strokes of the arm wearing wearable device 100. Therefore, in these embodiments, throughout the application, the strokes refer to strokes detected by wearable device 100, which can be approximately half of the true strokes made by the user. For example, in these embodiments, the number of strokes between turns is the number of strokes detected by wearable device 100 between turns.
At step 2310, wearable device 100 rejects certain turns that are detected at step 2306 based on one or more criteria. One purpose of this step is to reject turns made by the user while the user is not swimming. In some embodiments, wearable device 100 evaluates a turn based on the strokes, turns, and swimming style detected at steps 2306 and 2308. For example, in some embodiments, when wearable device 100 detects a turn, it will reject the turn unless both of the following two criteria are met. One criterion is that the stroke rate between two consecutive turns needs to be greater than a minimum stroke rate, where the stroke rate is defined as the number of strokes per minute. In some embodiments, the minimum stroke rate can be eight strokes per minute; other suitable values can also be used. The other criterion is that the number of strokes with a known style between two consecutive turns needs to be greater than a minimum stroke count. In some embodiments, the minimum stroke count can be three strokes; other suitable values can also be used. In some embodiments, wearable device 100 can reject a detected turn based on fewer, other, or additional criteria. In some embodiments, even if a turn is rejected, the detected turn will still be stored for later adjustment.
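The two acceptance criteria above can be sketched as a simple predicate. This is an illustrative sketch; the function name and parameter names are assumptions, and the default values are the example values from the text.

```python
def accept_turn(stroke_count, known_style_count, minutes_between_turns,
                min_stroke_rate=8.0, min_stroke_count=3):
    """Accept a detected turn only if both example criteria hold: the
    stroke rate between consecutive turns exceeds a minimum rate, and
    the number of known-style strokes exceeds a minimum count."""
    if minutes_between_turns <= 0:
        return False
    stroke_rate = stroke_count / minutes_between_turns  # strokes per minute
    return stroke_rate > min_stroke_rate and known_style_count > min_stroke_count
```

A turn failing either criterion (too low a stroke rate, or too few known-style strokes) would be rejected but retained in storage for later adjustment.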
At step 2312, wearable device 100 determines a stroke range per lap. In some embodiments, the range is determined based on whether or not the number of the user's strokes converges in the current swimming session or in a historical session. Generally, the number of the user's strokes converges when the variation among the number of strokes per lap is less than a threshold. The details of step 2312 are further described below in connection with process 2400.
At step 2410, wearable device 100 determines whether the number of the user's strokes converges in the current swimming session. In some embodiments, wearable device 100 checks six consecutive turns and evaluates the number of strokes between them. For example, the first number of strokes, S1, is the stroke count between when the user starts to swim and when wearable device 100 detects the first turn; the second number of strokes, S2, is the stroke count between the first and second turns detected by wearable device 100; and S3-S6 can be calculated in a similar way. In some embodiments, if the standard deviation of the numbers of strokes among the six consecutive turns (e.g., S1-S6) is less than a threshold, such as 1.5, wearable device 100 can determine that the number of the user's strokes converges in the current swimming session. In some embodiments, other numbers of consecutive turns and/or other standard deviation thresholds can be used to determine whether or not the number of the user's strokes converges in the current swimming session. If the number of the user's strokes converges in the current swimming session, the process 2400 proceeds to step 2420. If not, the process 2400 proceeds to step 2430.
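The convergence check above can be sketched as follows, using the example window of six turns and the example standard deviation threshold of 1.5. The function name is illustrative.

```python
import statistics

def strokes_converged(per_lap_counts, window=6, max_stdev=1.5):
    """Check whether the per-lap stroke counts for the most recent
    `window` turns vary by less than `max_stdev` (standard deviation)."""
    if len(per_lap_counts) < window:
        return False
    return statistics.stdev(per_lap_counts[-window:]) < max_stdev
```

With fewer than six inter-turn counts available, the sketch conservatively reports non-convergence.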
At step 2420, wearable device 100 determines a stroke range given that the number of the user's strokes converges in the current swimming session. In some embodiments, the stroke range can be determined based on the mean value of the number of strokes per lap in the current converged swimming session. For example, if the mean value of the number of strokes per lap in the current converged swimming session is 20 strokes per lap, the range can be +/−6 from the mean, i.e., from 14 to 26 strokes per lap. In some embodiments, other suitable ranges can be used.
At step 2430, wearable device 100 determines whether the number of the user's strokes converges in a historical swimming session. In some embodiments, the factor(s) to determine whether or not the number of the user's strokes converges in a historical swimming session can be the same factor(s) used at step 2410. If the number of the user's strokes converges in the historical swimming session, the process 2400 proceeds to step 2440. If the number of the user's strokes does not converge in the historical swimming session, the process 2400 proceeds to step 2450.
At step 2440, wearable device 100 determines a stroke range given that the number of the user's strokes converges in the historical swimming session. In some embodiments, the stroke range can be determined based on the mean value of the number of strokes per lap in the historical converged swimming session. For example, if the mean value of the number of strokes per lap in the historical converged swimming session is 16 strokes per lap, the range can be +/−6 from the mean, i.e., from 10 to 22 strokes per lap. In some embodiments, the range can be increased for a larger mean value and/or decreased for a smaller mean value. For example, if the mean value of the number of strokes per lap in the historical converged swimming session is 24 strokes per lap, the range can be +/−8 from the mean, i.e., from 16 to 32 strokes per lap. In some embodiments, the lower bound of the range can be set at other numbers to take into account the possibility that the current session and the historical session are not associated with the same pool length. For example, in some embodiments, the lower bound can be set at 3 strokes while the upper bound remains at 6 strokes over the mean value. For example, if the mean value of the number of strokes per lap in the historical converged swimming session is 16 strokes per lap, the range can be from 3 to 22 strokes per lap. In some embodiments, other suitable ranges can be used.
At step 2450, wearable device 100 determines a stroke range given that the number of the user's strokes converges in neither the current swimming session nor the historical swimming session. In some embodiments, the stroke range can be wider than the ranges determined at step 2420 or 2440 because the number of the user's strokes has not converged yet. In addition, the range can be varied based on the user's swimming style. For example, in some embodiments, if the user is detected to swim breaststroke, then the range can be from 3 to 72 strokes per lap. In some embodiments, if the user is detected to swim other styles, then the range can be from 3 to 40 strokes per lap.
As mentioned above, the parameters used in process 2400 can be changed to other suitable values. In addition, in some embodiments, if wearable device 100 first determines that the number of the user's strokes does not converge and later determines that it eventually converges, then wearable device 100 can adjust the stroke range accordingly.
At step 2510, wearable device 100 accepts the turns that meet all criteria as laps. By this point, wearable device 100 has already rejected certain turns at step 2310. At step 2510, wearable device 100 further rejects a turn if the number of strokes between the current detected turn and the previous detected turn is outside the stroke range determined at step 2312. In some embodiments, all detected turns, both accepted and rejected, are kept in storage for potential adjustment at step 2530.
At step 2520, wearable device 100 inserts a turn if it determines that a turn was not detected. For example, sometimes the user makes a turn with a relatively small change in yaw angle, and wearable device 100 may miss detecting this turn. As another example, if the user reaches the end of the pool using a style other than backstroke and then switches to backstroke to continue, wearable device 100 may not detect a turn because there may not be much change in yaw angle.
In some embodiments, wearable device 100 can determine there is a missed turn if the number of strokes between two consecutive turns is too large. For example, if in previous laps the average number of strokes between two consecutive turns is 20, and the number of strokes between the current detected turn and the previous detected turn is 40, it is likely that a turn was missed. In some embodiments, wearable device 100 can determine there is a missed turn if the number of strokes between two turns is greater than a threshold. For example, the threshold can be 1.8 times the average number of strokes between consecutive turns. In some embodiments, wearable device 100 can also require that the threshold be greater than the upper bound of the stroke range. For example, if the average stroke count is 15, and the stroke range is between 9 and 21, then if there are 30 strokes between the current detected turn and the previous turn, wearable device 100 will determine there is a missed turn because 30 is greater than both the upper bound of the stroke range (21) and 1.8 times the mean value (1.8*15=27). In some embodiments, other suitable parameters can be used to determine whether a turn was missed. In some embodiments, step 2520 is limited to inserting only one lap. In some embodiments, step 2520 can insert more than one lap.
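The missed-turn rule above can be sketched as follows, using the example factor of 1.8 and requiring the threshold to exceed the upper bound of the stroke range. The function name is illustrative.

```python
def missed_turn(stroke_count, average_strokes_per_lap, range_upper_bound,
                factor=1.8):
    """Flag a missed turn when the stroke count since the last detected
    turn exceeds both `factor` times the average strokes per lap and the
    upper bound of the stroke range."""
    threshold = max(factor * average_strokes_per_lap, range_upper_bound)
    return stroke_count > threshold
```

With the example numbers from the text (average 15, range upper bound 21), 30 strokes triggers a missed-turn determination while 25 strokes does not, since 25 exceeds the range bound but not 1.8 times the average.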
At step 2530, wearable device 100 adjusts the detected turns to reduce the variance of strokes per lap counted. For example, when wearable device 100 rejects a turn, it will consider whether accepting the current turn and rejecting the previously accepted turn would reduce the variance of strokes per lap counted. Table I shows an example of turns detected by wearable device 100 and the number of strokes associated with two consecutive detected turns. Wearable device 100 would normally reject the fourth turn since it is only associated with 3 strokes. In that case, six turns (turns #1, #2, #3, #5, #6, and #7) will be accepted, and the numbers of strokes between two consecutive accepted turns will be 15, 15, 12, 18, 14, and 15. The mean value of the stroke count will be 14.83, and the standard deviation will be 1.94. As discussed above, at step 2530, when wearable device 100 rejects a turn, it will also compare against the previously accepted turn and determine whether accepting the current turn and rejecting the previously accepted turn would reduce the variance of strokes per lap counted. For example, if wearable device 100 rejected the previously accepted turn #3 and accepted turn #4 instead, then the numbers of strokes between two consecutive accepted turns will be 15, 15, 15, 15, 14, and 15. The mean value of the stroke count will be around 14.83, and the standard deviation will be around 0.41. Since the variance of the number of strokes per lap counted will be smaller after adjustment, wearable device 100 will accept turn #4 and reject turn #3.
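The swap decision above can be sketched by comparing the standard deviations of the two candidate sequences of per-lap stroke counts. This is an illustrative sketch; the function name is an assumption, and the test values reproduce the worked example from the text.

```python
import statistics

def prefer_swap(counts_keeping_previous_turn, counts_after_swap):
    """Return True when rejecting the previously accepted turn and
    accepting the current one lowers the variance of strokes per lap."""
    return (statistics.stdev(counts_after_swap)
            < statistics.stdev(counts_keeping_previous_turn))
```

Applied to the example, keeping turn #3 yields counts of 15, 15, 12, 18, 14, 15 (standard deviation about 1.94), while swapping in turn #4 yields 15, 15, 15, 15, 14, 15 (about 0.41), so the swap is preferred.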
Detecting Swim Activity
For example, stroke rate can be used to detect swimming. While a user is swimming, the user will generally have a regular, periodic stroke rate/count. For example, the stroke period (e.g., the time between strokes) will have a low variance when averaged over a reasonable time window, for example, ten seconds. Conversely, when the user is not swimming, but is instead taking a break, the stroke period will be sporadic. Accordingly, the motion sensors detecting a consistent stroke period over a period of time would indicate swimming. In some embodiments, the default stroke rate for detecting swimming can be eight strokes per minute or above, corresponding to a beginner/very unskilled swimmer. If a user's stroke rate falls below the default stroke rate, then the stroke rate counter will not detect swimming.
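The stroke-rate and stroke-period-regularity test above can be sketched as follows. This is an illustrative sketch: the function name is an assumption, the default minimum rate is the eight-strokes-per-minute example from the text, and the period-variance threshold is an assumed tunable, not a value from the disclosure.

```python
import statistics

def is_swimming(stroke_timestamps, min_stroke_rate=8.0, max_period_stdev=1.0):
    """Classify a window of stroke timestamps (seconds) as swimming when
    the stroke rate is at least `min_stroke_rate` strokes per minute and
    the stroke period is regular (low variance).  `max_period_stdev` is
    an assumed tunable."""
    if len(stroke_timestamps) < 3:
        return False
    periods = [b - a for a, b in zip(stroke_timestamps, stroke_timestamps[1:])]
    minutes = (stroke_timestamps[-1] - stroke_timestamps[0]) / 60.0
    rate = len(periods) / minutes if minutes > 0 else 0.0
    return rate >= min_stroke_rate and statistics.stdev(periods) < max_period_stdev
```

A steady stroke every two seconds passes both checks, while sporadic strokes separated by breaks fail the regularity check even when the average rate is sufficient.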
In some embodiments, wearable device 100 can receive training data based on a user's observed stroke rate when the user wears wearable device 100. The default stroke rate used to detect swimming can then be personalized based on the user's observed stroke rate. For example, the default stroke rate can be set to the inverse of the user's median stroke period, observed over at least three consecutive stroke periods.
In some embodiments, the motion sensor data can also be used for style classification, which can also provide an indication as to whether a user is swimming. Each stroke is expected to be classified as one of the four common styles: freestyle, backstroke, breaststroke, and butterfly. With popular classifiers used in pattern recognition, such as support vector machines and logistic regression, an additional notion of confidence in a classification determination can be obtained. Stroke classification uses a two-tiered decision tree classifier, with a logistic regression classifier at each level. In some embodiments, a value in the range [0,1] can be assigned to each stroke classification determination, which indicates the confidence level in the classification decision, e.g., a value closer to 0 implies low confidence in the classification output, and a value closer to 1 implies high confidence, as is natural for logistic regression outputs. Alternatively, the confidence level can be inferred from a correlated metric such as the number of strokes with known style (i.e., not classified as Unknown) in a pre-defined time window. Low confidence in a classification style can be a useful indicator of non-steady-state swim behavior, whereas high confidence in a classification style can be a useful indicator of swimming activity. For example, low confidence can be interpreted as any value in the range [0, 0.2], while the corresponding high confidence range can be [0.2, 1.0].
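The confidence banding above can be sketched as a simple mapping from a classifier's [0, 1] confidence value to a low/high label, using the example bands from the text. The function name is illustrative.

```python
def confidence_label(confidence, low_band=(0.0, 0.2)):
    """Map a stroke-classification confidence value in [0, 1] to a
    low/high label using the example bands from the text."""
    lo, hi = low_band
    return "low" if lo <= confidence <= hi else "high"
```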
Stroke rate and style classification can capture differences between steady and non-steady state swimming. However, sometimes, user arm movement while not swimming can trigger a stroke count and/or a valid style classification. For these scenarios, accelerometer energy and gyroscope signal variance can be investigated to correctly determine a motion signature that indicates swim activity.
Discriminative information in accelerometer and gyroscope energy (based on the accelerometer and gyroscope signals from the device sensor fusion) can be mined for swim activity detection. As discussed above, in some embodiments, the accelerometer and gyroscope signals are converted from the body-fixed frame of reference to the inertial frame of reference.
The above considerations can be combined using a decision tree classifier with appropriate thresholds to decide whether a given epoch corresponds to swim activity. Decisions on successive epochs, for example spanning ten seconds or more, can be chained together to improve the accuracy of detection. The specific choice of threshold values is usually tied to the discriminating feature used for classification and the hierarchical order in the decision tree. A typical threshold in logistic regression is 0.5. For example, feature values greater than or equal to 0.5 represent one style, while feature values less than 0.5 represent the others. This value can be adjusted to bias the accuracy in terms of either improving true positive detection or false positive rejection.
In some embodiments, additional sensory input, for example, heart rate measurements using, for example, a PPG sensor, can be used to improve the accuracy of the swimming determination. Further, speed estimation from GPS 410 also can be used to verify the swimming determination.
The swimming determination can be used by other features of wearable device 100. For example, if wearable device 100 knows that the user is swimming, the wearable device 100 can accurately count laps, strokes, calories, and distance, as discussed in the respective applications referred to above, and incorporated by reference herein in their entirety. If the device knows that the user is not swimming, the device can disregard data from those time periods where the user is not swimming. Accordingly, the final workout data for a particular user will be more accurate, because it will reflect only those instances and periods where the user is actually swimming and will not include spurious strokes or laps where the user was not swimming, but was instead resting, or walking around the pool.
Determining Swimming Pool Length
When a user is swimming in a pool, there is often a need to know the length of the swimming pool. Information of the swimming pool length can be used to calculate the total distance a user swims and the energy expenditure associated with a swimming session. The pool length information, however, is not always readily available to users. Additionally, users may not be able to accurately estimate the pool length.
The process 3200 starts at step 3205. At step 3205, a user can use wearable device 100 to indicate that he or she is about to start a swimming workout session at a swimming pool. The process 3200 then proceeds to step 3210.
At step 3210, wearable device 100 provides the user with one or more options to indicate the length of the swimming pool. In some embodiments, the user can choose among three options: standard length, custom length, and calibrate. In some embodiments, other suitable options can also be used. If the user chooses the standard length option, wearable device 100 can provide one or more lengths of a standard pool. Non-limiting examples of standard lengths include 25 yards, 25 meters, 33⅓ meters, and 50 meters. Other suitable standard lengths can also be used. In some embodiments, the user can choose to enter a standard length of the swimming pool. If the user chooses the custom length option, then the user can enter a length of the swimming pool. The length can be based on information the user possesses or on the user's estimation. If the user chooses the calibration option, then wearable device 100 can prompt more options/instructions as described in the following steps. The process 3200 then proceeds to step 3215.
At step 3215, wearable device 100 determines whether or not the user chooses the calibration option. If the user chooses the calibration option, the process 3200 proceeds to step 3220. If the user does not choose the calibration option, the process 3200 proceeds to step 3235.
At step 3220, wearable device 100 encourages the user to take a short walk along the edge of the pool, where the edge of the pool is parallel with the swimming direction of the pool. In some embodiments, the user can instead choose to run, jump, or perform any other suitable activity along the edge of the pool. The process 3200 then proceeds to step 3225.
At step 3225, wearable device 100 can use pedometer 265 and GPS sensor 295 to estimate the length associated with the user's activity at step 3220, and further estimate the length of the swimming pool. In some embodiments, pedometer 265 can count the number of steps the user takes. The total distance of the user's activity can be calculated by multiplying the number of steps by a step length. In some embodiments, the step length can be a default value, a value previously customized for the user, or a value determined by GPS sensor 295. In some embodiments, the total distance of the user's activity can be directly estimated by GPS sensor 295. Once the total distance of the user's activity is estimated, the length of the swimming pool can be estimated based on a ratio between the length of the swimming pool and the length associated with the user's activity. For example, if the length associated with the user's activity at step 3220 is estimated to be 10 yards, and GPS sensor 295 further estimates that the length of the swimming pool is 2.5 times the length associated with the user's activity at step 3220, then the length of the swimming pool can be estimated to be 25 yards. The process 3200 then proceeds to step 3230.
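The calibration-walk arithmetic above can be sketched as follows. This is an illustrative sketch; the function and parameter names are assumptions.

```python
def estimate_pool_length(step_count, step_length, pool_to_walk_ratio):
    """Estimate pool length from a calibration walk: the total walk
    distance (steps times step length) scaled by the estimated ratio of
    pool length to walk length."""
    walk_distance = step_count * step_length
    return walk_distance * pool_to_walk_ratio
```

Using the example from the text, a 10-yard walk with an estimated pool-to-walk ratio of 2.5 yields a 25-yard pool length.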
At step 3230, once a length of the swimming pool is identified, the user can proceed with the swimming workout session.
If the user does not choose to calibrate the length of the swimming pool, the user can start to swim, and wearable device 100 can still passively estimate the length of the swimming pool. At step 3235, wearable device 100 can use a default stroke length to estimate the length of the swimming pool. When the user is swimming, wearable device 100 can count the number of strokes the user has made within a short period. In some embodiments, wearable device 100 can only detect the strokes made by the arm wearing wearable device 100. In these embodiments, for every stroke detected, the user may make two strokes, and the stroke length can sometimes be adjusted accordingly. The total distance of the user's swimming activity within a short period can be calculated by multiplying the number of strokes by a stroke length. In some embodiments, the stroke length can be a default value, a value previously customized for the user, or a value determined by GPS sensor 295. Once the total distance of the user's swimming activity within the short period is estimated, the length of the swimming pool can be estimated based on a ratio between the length of the swimming pool and the length associated with the user's swimming activity within the short period. For example, if the length associated with the user's swimming activity within the short period is estimated to be 10 yards, and GPS sensor 295 further estimates that the length of the swimming pool is 2.5 times that length, then the length of the swimming pool can be estimated to be 25 yards. The process 3200 then proceeds to step 3240.
At step 3240, wearable device 100 determines whether or not the length estimated at step 3235 is close to a standard length of a swimming pool. In some embodiments, wearable device 100 can determine that an estimated length is close to a standard length if the difference is within 10% of the standard length. In some embodiments, other suitable thresholds can be used. For example, if the pool length estimated at step 3235 is 20 yards, and the standard pool length is 25 yards, then wearable device 100 can determine whether or not the estimated length is close to the standard length based on the threshold. If the threshold is selected to be within 10% of the standard length, then an estimated length between 22.5 yards and 27.5 yards would be considered close enough, and the 20-yard estimate would not be considered close enough. If the threshold is selected to be within 20% of the standard length, then an estimated length between 20 yards and 30 yards would be considered close enough, and the 20-yard estimate would be considered close enough. If the estimated length is close to the standard length, the process 3200 proceeds to step 3245. If not, the process 3200 proceeds to step 3250.
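The snapping check above can be sketched as follows, using the example 10% tolerance. This is an illustrative sketch; the function name and the particular set of standard lengths are assumptions.

```python
def closest_standard_length(estimated, standards=(25.0, 50.0), tolerance=0.10):
    """Return the nearest standard pool length when the estimate falls
    within `tolerance` (as a fraction of the standard) of it, else None."""
    nearest = min(standards, key=lambda s: abs(estimated - s))
    if abs(estimated - nearest) <= tolerance * nearest:
        return nearest
    return None
```

With the example numbers from the text, a 24.3-yard estimate snaps to 25 yards, while a 20-yard estimate is rejected at a 10% tolerance but accepted at a 20% tolerance.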
At step 3245, wearable device 100 suggests that the user use the standard length identified as being close to the length of the swimming pool. For example, if the estimated length is 24.3 yards, then wearable device 100 may suggest that the pool length is actually 25 yards, which is a standard pool length. The process 3200 then proceeds to step 3230.
At step 3250, because the estimated length of the swimming pool is not close to that of a standard swimming pool, wearable device 100 suggests that the user use calibration to obtain an estimate of the length of the swimming pool. If the user chooses to calibrate, then the process 3200 proceeds to step 3220 to start the calibration process.
The process 3300 starts at step 3305. At step 3305, a user can use wearable device 100 to indicate that he or she is about to start a swimming workout session at a swimming pool. The process 3300 then proceeds to step 3310.
At step 3310, wearable device 100 determines whether or not there is recent location information related to the swimming pool. In some embodiments, the location information includes a length of the swimming pool identified by other users. In some embodiments, wearable device 100 searches for the recent location information from storage media local at wearable device 100. In some embodiments, wearable device 100 searches for the recent location information from a remote storage media. If there is recent location information available, the process 3300 proceeds to step 3315. If there is no recent location information available, the process 3300 proceeds to step 3335.
At step 3315, wearable device 100 determines if the location of the swimming pool is known. If the location is known, the process 3300 proceeds to step 3320.
At step 3320, wearable device 100 determines whether or not there is sufficient history of the identified location of the swimming pool. In some embodiments, the history can be pool lengths identified by other users. The threshold to determine the sufficiency of the history can be any suitable number. For example, if the threshold is set at 5, then if the length of the swimming pool has been identified by 5 or more users, there would be sufficient history of the identified location of the swimming pool; if the length has been identified by fewer than 5 users, there would not be sufficient history. If there is sufficient history of the identified location of the swimming pool, the process 3300 proceeds to step 3325. If there is insufficient history, the process 3300 proceeds to step 3350.
At step 3350, wearable device 100 prompts the user to provide an estimate of the swimming pool length, and the user's estimate will be added to the pool length table.
At step 3325, since there is sufficient history of the identified location of the swimming pool, wearable device 100 looks up previously identified pool lengths associated with the swimming pool. The process 3300 then proceeds to step 3330.
At step 3330, wearable device 100 prompts the user with one or more choices for the length of the swimming pool. For example, if the swimming pool has been identified by 10 users as 25 yards and by 3 users as 50 yards, then wearable device 100 can provide both choices to the user. In some embodiments, wearable device 100 can provide the length option identified by the most users. In some embodiments, for each length option, wearable device 100 can also provide the number of users who identified that length.
At step 3335, wearable device 100 attempts to obtain location information of the swimming pool. In some embodiments, wearable device 100 obtains location information through GPS sensor 295.
At step 3340, wearable device 100 determines whether or not location information of the swimming pool is available. If the information is available, the process 3300 proceeds to step 3315. If the information is not available, the process 3300 proceeds to step 3345.
At step 3345, wearable device 100 determines whether or not the user's swimming workout session has ended. If the user's swimming session has ended, the process 3300 proceeds to step 3355. If the user's swimming session has not ended, the process 3300 proceeds to step 3335 to continue to obtain location information.
At step 3355, since the user has already ended the swimming workout, wearable device 100 can adjust the timeout period and/or GPS frequency to save power.
Although the present disclosure has been described and illustrated in the foregoing exemplary embodiments, it is understood that the present disclosure has been made only by way of example, and that numerous changes in the details of implementation of the present disclosure may be made without departing from the spirit and scope of the present disclosure, which is limited only by the claims which follow.
This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/381,640, titled “Systems and Methods for Detecting Turns,” which is filed on Aug. 31, 2016 and is incorporated by reference herein in its entirety. This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/381,641, titled “Systems and Methods for Detecting Breaths While Swimming,” which is filed on Aug. 31, 2016 and is incorporated by reference herein in its entirety. This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/381,988, titled “Systems and Methods for Counting Laps,” which is filed on Aug. 31, 2016 and is incorporated by reference herein in its entirety. This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/381,989, titled “Systems and Methods for Determining Swimming Pool Length,” which is filed on Aug. 31, 2016 and is incorporated by reference herein in its entirety. This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/381,846, titled “Systems and Methods of Counting Swim Strokes,” which is filed on Aug. 31, 2016 and is incorporated by reference herein in its entirety. This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/381,843, titled “Systems and Methods for Detecting Swim Activity Using Inertial Sensors”, which is filed on Aug. 31, 2016 and is incorporated by reference herein in its entirety. This application relates to U.S. patent application Ser. No. 15/692,726, titled “Systems and Methods of Swimming Analysis,” which is filed on Aug. 31, 2017 and is incorporated by reference herein in its entirety. This application relates to U.S. patent application Ser. No. 15/692,237, titled “Systems and Methods of Swimming Calorimetry,” which is filed on Aug. 31, 2017 and is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4566461 | Lubell et al. | Jan 1986 | A |
4740009 | Hoelzl | Apr 1988 | A |
5158093 | Shvartz et al. | Oct 1992 | A |
5663897 | Geiser | Sep 1997 | A |
5664499 | Kingsmill | Sep 1997 | A |
6013008 | Fukushima | Jan 2000 | A |
6059724 | Campell et al. | May 2000 | A |
6582380 | Kazlausky et al. | Jun 2003 | B2 |
6687535 | Hautala et al. | Feb 2004 | B2 |
6837827 | Lee et al. | Jan 2005 | B1 |
6862525 | Beason et al. | Mar 2005 | B1 |
6868338 | Elliott | Mar 2005 | B1 |
6876947 | Darley et al. | Apr 2005 | B1 |
7254516 | Case, Jr. et al. | Aug 2007 | B2 |
7311675 | Peifer et al. | Dec 2007 | B2 |
7377180 | Cunningham | May 2008 | B2 |
7387029 | Cunningham | Jun 2008 | B2 |
7467060 | Kulach et al. | Dec 2008 | B2 |
7534206 | Lovitt et al. | May 2009 | B1 |
7647196 | Kahn et al. | Jan 2010 | B2 |
7690556 | Kahn et al. | Apr 2010 | B1 |
7771320 | Riley et al. | Aug 2010 | B2 |
7805149 | Werner et al. | Sep 2010 | B2 |
7841967 | Kahn et al. | Nov 2010 | B1 |
8290480 | Abramson et al. | Oct 2012 | B2 |
8483775 | Buck et al. | Jul 2013 | B2 |
8531180 | Piemonte et al. | Sep 2013 | B2 |
8589174 | Nelson et al. | Nov 2013 | B2 |
8638320 | Harley et al. | Jan 2014 | B2 |
8653956 | Berkobin et al. | Feb 2014 | B2 |
8784271 | Brumback et al. | Jul 2014 | B2 |
8890854 | Tenuta et al. | Nov 2014 | B2 |
8892391 | Tu et al. | Nov 2014 | B2 |
8894576 | Alwan et al. | Nov 2014 | B2 |
8911329 | Lin et al. | Dec 2014 | B2 |
8928635 | Harley et al. | Jan 2015 | B2 |
9195305 | Markovic et al. | Nov 2015 | B2 |
9264862 | Tu et al. | Feb 2016 | B2 |
9413871 | Nixon et al. | Aug 2016 | B2 |
9448250 | Pham et al. | Sep 2016 | B2 |
9526430 | Srinivas et al. | Dec 2016 | B2 |
9704412 | Wells et al. | Jul 2017 | B2 |
9737761 | Sivaraj | Aug 2017 | B1 |
9788794 | Le Boeuf et al. | Oct 2017 | B2 |
9817948 | Swank et al. | Nov 2017 | B2 |
9918646 | Alvarado et al. | Mar 2018 | B2 |
9998864 | Kumar et al. | Jun 2018 | B2 |
10098549 | Tan et al. | Oct 2018 | B2 |
10154789 | Raghuram et al. | Dec 2018 | B2 |
10188347 | Self et al. | Jan 2019 | B2 |
10206627 | Le Boeuf et al. | Feb 2019 | B2 |
10219708 | Altini | Mar 2019 | B2 |
10244948 | Pham et al. | Apr 2019 | B2 |
10290260 | Wu et al. | May 2019 | B2 |
10292606 | Wisbey et al. | May 2019 | B2 |
10512406 | Martinez et al. | Dec 2019 | B2 |
10524670 | Raghuram et al. | Jan 2020 | B2 |
10620232 | Tu et al. | Apr 2020 | B2 |
10687707 | Tan et al. | Jun 2020 | B2 |
10687752 | Pham et al. | Jun 2020 | B2 |
10694994 | Alvarado et al. | Jun 2020 | B2 |
10699594 | Mermel et al. | Jun 2020 | B2 |
10617912 | Narasimha Rao et al. | Jul 2020 | B2 |
10709933 | Tan et al. | Jul 2020 | B2 |
11051720 | Perry et al. | Jul 2021 | B2 |
11103749 | Mermel et al. | Aug 2021 | B2 |
11278765 | Mohrman et al. | Mar 2022 | B2 |
11517789 | Xie | Dec 2022 | B2 |
20010022828 | Pyles | Sep 2001 | A1 |
20020019585 | Dickinson | Feb 2002 | A1 |
20030032460 | Cannon et al. | Feb 2003 | A1 |
20030138763 | Roncalez et al. | Jul 2003 | A1 |
20040064061 | Nissila | Apr 2004 | A1 |
20050065443 | Ternes | Mar 2005 | A1 |
20050107723 | Wehman et al. | May 2005 | A1 |
20050124906 | Childre et al. | Jun 2005 | A1 |
20050212701 | Nimmo | Sep 2005 | A1 |
20060064277 | Jung | Mar 2006 | A1 |
20060136173 | Case et al. | Jun 2006 | A1 |
20060190217 | Lee et al. | Aug 2006 | A1 |
20060217231 | Parks et al. | Sep 2006 | A1 |
20070100666 | Stivoric et al. | May 2007 | A1 |
20070150229 | Fujiwara | Jun 2007 | A1 |
20070219059 | Schwartz et al. | Sep 2007 | A1 |
20070275825 | O'Brien | Nov 2007 | A1 |
20070276271 | Chan | Nov 2007 | A1 |
20080096726 | Riley et al. | Apr 2008 | A1 |
20080214360 | Stirling et al. | Sep 2008 | A1 |
20090009320 | O'Connor et al. | Jan 2009 | A1 |
20090024332 | Karlov et al. | Jan 2009 | A1 |
20090043531 | Kahn et al. | Feb 2009 | A1 |
20090063099 | Counts et al. | Mar 2009 | A1 |
20090143199 | Nishibayashi | Jun 2009 | A1 |
20090240461 | Makino et al. | Sep 2009 | A1 |
20090319221 | Kahn et al. | Dec 2009 | A1 |
20100030350 | House et al. | Feb 2010 | A1 |
20100030482 | Li | Feb 2010 | A1 |
20100130890 | Matsumura et al. | May 2010 | A1 |
20100184564 | Molyneux et al. | Jul 2010 | A1 |
20100204952 | Irlam et al. | Aug 2010 | A1 |
20100210953 | Sholder et al. | Aug 2010 | A1 |
20100210975 | Anthony, III et al. | Aug 2010 | A1 |
20100217099 | Leboeuf et al. | Aug 2010 | A1 |
20100274102 | Teixeira | Oct 2010 | A1 |
20100298656 | McCombie et al. | Nov 2010 | A1 |
20110040193 | Seppanen et al. | Feb 2011 | A1 |
20110054359 | Sazonov et al. | Mar 2011 | A1 |
20110082008 | Cheung et al. | Apr 2011 | A1 |
20110131012 | Czaja et al. | Jun 2011 | A1 |
20110152695 | Granqvist et al. | Jun 2011 | A1 |
20110195707 | Faerber et al. | Aug 2011 | A1 |
20110238485 | Haumont et al. | Sep 2011 | A1 |
20110301436 | Teixeira | Dec 2011 | A1 |
20120006112 | Lee et al. | Jan 2012 | A1 |
20120083715 | Friedman | Apr 2012 | A1 |
20120172677 | Beith | Jul 2012 | A1 |
20120238832 | Hwang | Sep 2012 | A1 |
20120296455 | Ohnemus et al. | Nov 2012 | A1 |
20120322621 | Bingham et al. | Dec 2012 | A1 |
20130006522 | Vellaikal et al. | Jan 2013 | A1 |
20130023739 | Russel | Jan 2013 | A1 |
20130041590 | Burich et al. | Feb 2013 | A1 |
20130053990 | Ackland | Feb 2013 | A1 |
20130073255 | Yuen | Mar 2013 | A1 |
20130085861 | Dunlap | Apr 2013 | A1 |
20130096943 | Carey et al. | Apr 2013 | A1 |
20130135097 | Doezema | May 2013 | A1 |
20130158686 | Zhang et al. | Jun 2013 | A1 |
20130178335 | Lin et al. | Jul 2013 | A1 |
20130197377 | Takahiko et al. | Aug 2013 | A1 |
20130218053 | Kaiser et al. | Aug 2013 | A1 |
20130267794 | Fernstrom et al. | Oct 2013 | A1 |
20130326137 | Bilange et al. | Dec 2013 | A1 |
20130340287 | Stewart | Dec 2013 | A1 |
20140071082 | Singh et al. | Mar 2014 | A1 |
20140073486 | Ahmed et al. | Mar 2014 | A1 |
20140087708 | Kalita et al. | Mar 2014 | A1 |
20140088444 | Saalasti et al. | Mar 2014 | A1 |
20140107932 | Luna | Apr 2014 | A1 |
20140109390 | Manning | Apr 2014 | A1 |
20140121471 | Walker | May 2014 | A1 |
20140167973 | Letchner et al. | Jun 2014 | A1 |
20140172238 | Craine | Jun 2014 | A1 |
20140172361 | Chiang et al. | Jun 2014 | A1 |
20140197946 | Park et al. | Jul 2014 | A1 |
20140200906 | Bentley et al. | Jul 2014 | A1 |
20140207264 | Quy | Jul 2014 | A1 |
20140213920 | Lee et al. | Jul 2014 | A1 |
20140221854 | Wai | Aug 2014 | A1 |
20140228649 | Rayner et al. | Aug 2014 | A1 |
20140244071 | Czaja et al. | Aug 2014 | A1 |
20140266160 | Coza | Sep 2014 | A1 |
20140266789 | Matus | Sep 2014 | A1 |
20140276127 | Ferdosi et al. | Sep 2014 | A1 |
20140278139 | Hong et al. | Sep 2014 | A1 |
20140278229 | Hong et al. | Sep 2014 | A1 |
20140279123 | Harkey et al. | Sep 2014 | A1 |
20140316305 | Venkatraman et al. | Oct 2014 | A1 |
20140348367 | Vavrus et al. | Nov 2014 | A1 |
20150066526 | Cheng et al. | Mar 2015 | A1 |
20150072712 | Huang et al. | Mar 2015 | A1 |
20150087929 | Rapoport et al. | Mar 2015 | A1 |
20150088006 | Rapoport et al. | Mar 2015 | A1 |
20150100141 | Hughes | Apr 2015 | A1 |
20150105096 | Chowdhury et al. | Apr 2015 | A1 |
20150119728 | Blackadar et al. | Apr 2015 | A1 |
20150147734 | Flores et al. | May 2015 | A1 |
20150148632 | Benaron | May 2015 | A1 |
20150173631 | Richards | Jun 2015 | A1 |
20150182149 | Rapoport et al. | Jul 2015 | A1 |
20150250417 | Cheng et al. | Sep 2015 | A1 |
20150256689 | Erkkila et al. | Sep 2015 | A1 |
20150260514 | Menelas et al. | Sep 2015 | A1 |
20150294440 | Roberts | Oct 2015 | A1 |
20150327804 | Lefever et al. | Nov 2015 | A1 |
20150328523 | Heling et al. | Nov 2015 | A1 |
20150338926 | Park et al. | Nov 2015 | A1 |
20150345985 | Fung et al. | Dec 2015 | A1 |
20150357048 | Goldstein | Dec 2015 | A1 |
20150374240 | Lee | Dec 2015 | A1 |
20160021238 | Abramson et al. | Jan 2016 | A1 |
20160038083 | Ding et al. | Feb 2016 | A1 |
20160054449 | Pekonen et al. | Feb 2016 | A1 |
20160057372 | Raghuram et al. | Mar 2016 | A1 |
20160058302 | Raghuram et al. | Mar 2016 | A1 |
20160058329 | Srinivas et al. | Mar 2016 | A1 |
20160058332 | Tan et al. | Mar 2016 | A1 |
20160058333 | Arnold et al. | Mar 2016 | A1 |
20160058356 | Raghuram et al. | Mar 2016 | A1 |
20160058370 | Raghuram et al. | Mar 2016 | A1 |
20160058371 | Singh Alvarado et al. | Mar 2016 | A1 |
20160058372 | Raghuram et al. | Mar 2016 | A1 |
20160059079 | Watterson | Mar 2016 | A1 |
20160066859 | Crawford et al. | Mar 2016 | A1 |
20160069679 | Jackson et al. | Mar 2016 | A1 |
20160084869 | Yuen et al. | Mar 2016 | A1 |
20160143579 | Martikka | May 2016 | A1 |
20160147319 | Agarwal et al. | May 2016 | A1 |
20160166178 | Fuss et al. | Jun 2016 | A1 |
20160170998 | Frank et al. | Jun 2016 | A1 |
20160206248 | Sartor et al. | Jul 2016 | A1 |
20160223578 | Klosinski, Jr. et al. | Aug 2016 | A1 |
20160242646 | Obma | Aug 2016 | A1 |
20160256058 | Pham et al. | Sep 2016 | A1 |
20160263435 | Venkatraman | Sep 2016 | A1 |
20160269572 | Erkkila et al. | Sep 2016 | A1 |
20160287177 | Huppert et al. | Oct 2016 | A1 |
20160301581 | Carter et al. | Oct 2016 | A1 |
20160314633 | Bonanni et al. | Oct 2016 | A1 |
20160361020 | LeBoeuf et al. | Dec 2016 | A1 |
20160363449 | Metzler et al. | Dec 2016 | A1 |
20160374614 | Cavallaro et al. | Dec 2016 | A1 |
20170007166 | Roover et al. | Jan 2017 | A1 |
20170061817 | Mettler May | Mar 2017 | A1 |
20170074897 | Mermel et al. | Mar 2017 | A1 |
20170082649 | Tu et al. | Mar 2017 | A1 |
20170094450 | Tu et al. | Mar 2017 | A1 |
20170111768 | Smith et al. | Apr 2017 | A1 |
20170181644 | Meer et al. | Jun 2017 | A1 |
20170188893 | Venkatraman et al. | Jul 2017 | A1 |
20170202486 | Martikka et al. | Jul 2017 | A1 |
20170211936 | Howell et al. | Jul 2017 | A1 |
20170242499 | Shah et al. | Aug 2017 | A1 |
20170242500 | Shah et al. | Aug 2017 | A1 |
20170251972 | Jayaraman et al. | Sep 2017 | A1 |
20170259116 | Mestas | Sep 2017 | A1 |
20170269734 | Graff | Sep 2017 | A1 |
20170269785 | Abdollahian et al. | Sep 2017 | A1 |
20170273619 | Alvarado et al. | Sep 2017 | A1 |
20170347885 | Tan et al. | Dec 2017 | A1 |
20170367658 | LeBoeuf et al. | Dec 2017 | A1 |
20170368413 | Shavit | Dec 2017 | A1 |
20180028863 | Matsuda | Feb 2018 | A1 |
20180043210 | Niehaus et al. | Feb 2018 | A1 |
20180049694 | Singh Alvarado et al. | Feb 2018 | A1 |
20180050235 | Tan et al. | Feb 2018 | A1 |
20180055375 | Martinez et al. | Mar 2018 | A1 |
20180055439 | Pham et al. | Mar 2018 | A1 |
20180056123 | Narasimha Rao et al. | Mar 2018 | A1 |
20180056128 | Narasimha Rao et al. | Mar 2018 | A1 |
20180056129 | Narasimha Rao et al. | Mar 2018 | A1 |
20180279914 | Patek et al. | Oct 2018 | A1 |
20180303381 | Todd et al. | Oct 2018 | A1 |
20180344217 | Perry et al. | Dec 2018 | A1 |
20190038938 | Nagasaka et al. | Feb 2019 | A1 |
20190076063 | Kent et al. | Mar 2019 | A1 |
20190090087 | Taylor et al. | Mar 2019 | A1 |
20190184230 | Lee et al. | Jun 2019 | A1 |
20190360813 | Zhao et al. | Nov 2019 | A1 |
20200232796 | Lee et al. | Jul 2020 | A1 |
20210068689 | Ochs et al. | Mar 2021 | A1 |
20210068712 | Humblet et al. | Mar 2021 | A1 |
20210068713 | Dervisoglu et al. | Mar 2021 | A1 |
20210093917 | Dervisoglu et al. | Apr 2021 | A1 |
20210093918 | Dervisoglu et al. | Apr 2021 | A1 |
20220241641 | Mermel et al. | Aug 2022 | A1 |
20230232861 | McClements | Jul 2023 | A1 |
Number | Date | Country |
---|---|---|
2008100295 | May 2008 | AU |
102481479 | May 2012 | CN |
104218976 | Dec 2014 | CN |
105031905 | Nov 2015 | CN |
105068656 | Nov 2015 | CN |
2465824 | Jun 2010 | GB |
259KOL2015 | Dec 2015 | IN |
2004089317 | Mar 2004 | JP |
2010-051333 | Mar 2010 | JP |
2013-039316 | Feb 2013 | JP |
2014-042757 | Mar 2014 | JP |
2016-150018 | Aug 2016 | JP |
2018-000543 | Jan 2018 | JP |
2018-015187 | Feb 2018 | JP |
2019028796 | Feb 2019 | JP |
2020148558 | Sep 2020 | JP |
122807 | Feb 2010 | RO |
0361779 | Jul 2003 | WO |
2010090867 | Aug 2010 | WO |
2011105914 | Sep 2011 | WO |
2015126182 | Aug 2015 | WO |
2015200900 | Dec 2015 | WO |
2016044831 | Mar 2016 | WO |
2016073620 | May 2016 | WO |
2016142246 | Sep 2016 | WO
2018117914 | Jun 2018 | WO
Entry |
---|
Novatel, “IMU Errors and Their Effects”, Novatel Application Notes APN-064 Rev A, pp. 1-6, Feb. 21, 2014. |
Le, et al., “Sensor-based Training Optimization of a Cyclist Group”, Seventh International Conference on Hybrid Intelligent Systems, IEEE 2007, pp. 265-270. |
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2018/047290, dated Nov. 8, 2018, 14 pages. |
Kyle, Chester R., “Reduction of Wind Resistance and Power Output of Racing Cyclists and Runners Travelling in Groups”, Ergonomics, vol. 22, No. 4, 1979, pp. 387-397. |
KINprof, May 31, 2011, Predictive VO2max tests, Web Video, Retrieved from: https://www.youtube.com/watch?v=_9e3HcY1sm8. |
PCT International Application No. PCT/US2017/049693, International Search Report dated Aug. 12, 2017, 3 pages. |
Yamaji, et al., “Relationship Between Heart Rate and Relative Oxygen Intake in Male Subjects Aged 10 to 27 Years”, J. Human Ergol., 7:29-39, Jan. 27, 1978. |
Your Fitness FAQ, Why is it important to warm up and cool down in a workout?, 2012, Web, Retrieved from: http://www.yourfitnessfaq.com/whyisitimportanttowarmupandcooldowninaworkout.html. |
Vella et al., Exercise After-Burn: Research Update, 2005, Web, Retrieved from: http://www.unm.edu/˜lkravitz/Article%20folder/epocarticle.html. |
Song et al., “Training Activity Recognition Systems Online Using Real-Time Crowdsourcing”, University of Rochester Computer Science, UbiCom' 12, Sep. 5-8, 2012 (2 pages). |
Rowlands et al., “Assessing Sedentary Behavior with the GENEActiv: Introducing the Sedentary Sphere”. Medicine and science in sports and exercise 46.6 (2014): 1235-1247. |
Hasson et al., “Accuracy of four resting metabolic rate production equations: Effects of sex, body mass index, age, and race/ethnicity”, Journal of Science and Medicine in Sport, 2011, vol. 14, p. 344-351. |
Lucas et al., “Mechanisms of orthostatic intolerance following very prolonged exercise”, 2008, J Appl Physiol, 105: 213-225. |
Kunze et al., “Where am I: Recognizing on-body positions of wearable sensors.” Location- and context-awareness. Springer Berlin Heidelberg, 2005. 264-275. |
Keytel et al., “Prediction of energy expenditure from heart rate monitoring during submaximal exercise”, 2005, Journal of Sports Sciences, 23(3):289-97. |
Sabatini, Kalman-filter-based orientation determination using inertial/magnetic sensors: observability analysis and performance evaluation, Sep. 27, 2011, Sensors 2011, 11, 9182-9206. |
Jackson et al., “Prediction of functional aerobic capacity without exercise testing”, Medicine and Science in Sports and Exercise, 22(6), 863-870, 1990. |
Isaacs et al., “Modeling energy expenditure and oxygen consumption in human exposure models: accounting for fatigue and EPOC”, 2008, Journal of Exposure Science and Environmental Epidemiology, 18: 289-298. |
Human Kinetics, Aerobic Workout Components, 2011, Web, Retrieved from: http://www.humankinetics.com/excerpts/excerpts/aerobicworkoutcomponentsexcerpt. |
Gao et al., “Evaluation of accelerometer based multi-sensor versus single-sensor activity recognition systems.” Medical engineering & physics 36.6 (2014): 779-785. |
Frankenfield et al., “Comparison of Predictive Equations for Resting Metabolic Rate in Healthy Nonobese and Obese adults: A systematic review”. Journal of the American Dietetic Association. May 2005, vol. 105, No. 5, p. 775-789. |
Chu, “In-Vehicle Driver Detection Using Mobile Phone Sensors”, Submitted for Graduation with Departmental Distinction in Electrical and Computer Engineering, April 20, 2011, pp. 1-21. |
Bo et al., “TEXIVE: Detecting Drivers Using Personal Smart Phones by Leveraging Inertial Sensors”, Department of Computer Science, Illinois Institute of Technology, Chicago IL, Dec. 7, 2014, pp. 1-12. |
Brooks, G.A. et al., “Exercise Physiology: Human Bioenergetics and Its Applications,” Fourth Edition, McGraw Hill, ISBN 0-07-255642-0, Chapter 2: Bioenergetics, Chapter 10: Metabolic Response to Exercise: Lactate Metabolism During Exercise and Recovery, Excess Postexercise O2 Consumption (EPOC), O2 Deficit, O2 Debt, and the Anaerobic Threshold, Chapter 16: Cardiovascular Dynamics During Exercise, Chapter 21: Principles of Endurance Conditioning, Chapter 27: Exercise Testing and Prescription, 141 pages (2004). |
Bruce, R.A. et al., “Exercising testing in adult normal subjects and cardiac patients,” Pediatrics, vol. 32, No. Suppl., pp. 742-756 (Oct. 1963). |
Bruce, R.A. et al., “Maximal oxygen intake and nomographic assessment of functional aerobic impairment in cardiovascular disease,” American Heart Journal, vol. 85, Issue 4, pp. 546-562 (Apr. 1973). |
Burke, Edmund R., “High-Tech Cycling,” Second Edition, Human Kinetics, Chapter 4: Optimizing the Crank Cycle and Pedaling Cadence, Chapter 5: Cycling Biomechanics, Chapter 6: Cycling Power, Chapter 10: Physiology of Professional Road Cycling, Chapter 11: Physiology of Mountain Biking, 131 pages (2003). |
Cavanagh, P.R. et al., “The effect of stride length variation on oxygen uptake during distance running,” Medicine and Science in Sports and Exercise, vol. 14, No. 1, pp. 30-35 (1982). |
Earnest, C.P. et al., “Cross-sectional association between maximal estimated cardiorespiratory fitness, cardiometabolic risk factors and metabolic syndrome for men and women in the Aerobics Center Longitudinal Study,” Mayo Clin Proceedings, vol. 88, No. 3, pp. 259-270, 20 pages (Mar. 2013). |
Fox, S.M. et al., “Physical Activity and the Prevention of Coronary Heart Disease,” Bull. N.Y. Acad. Med., vol. 44, No. 8, pp. 950-967 (Aug. 1968). |
Glass, S., et al., “ACSM's Metabolic Calculations Handbook,” Lippincott Williams & Wilkins, 124 pages (2007). |
Lavie, C.J. et al., “Impact of cardiorespiratory fitness on the obesity paradox in patients with heart failure,” Mayo Clinic Proceedings, vol. 88, No. 3, pp. 251-258 (Mar. 2013). |
Margaria, R. et al., “Energy cost of running,” Journal of Applied Physiology, vol. 18, No. 2, pp. 367-370 (Mar. 1, 1963). |
McArdle, W.D. et al., “Exercise Physiology: Nutrition, Energy and Human Performance,” Seventh Edition, Lippincott Williams & Wilkins, Chapter 5: Introduction to Energy Transfer, Chapter 6: Energy Transfer in the Body, Chapter 7: Energy Transfer During Exercise, Chapter 8: Measurement of Human Energy Expenditure, Chapter 9: Human Energy Expenditure During Rest and Physical Activity, Chapter 10: Energy Expenditure During Walking, Jogging, Running and Swimming, Chapter 11: Individual Differences and Measurement of Energy Capacities, Chapter 21: Training for. |
Myers, J. et al., “Exercise Capacity and Mortality Among Men Referred for Exercise Testing,” The New England Journal of Medicine, vol. 346, No. 11, pp. 793-801 (Mar. 14, 2002). |
Noakes, Timothy D., “Lore of Running,” Fourth Edition, Human Kinetics, Chapter 2: Oxygen Transport and Running Economy, Chapter 3: Energy Systems and Running Performance, 157 pages (2002). |
Rapoport, Benjamin I., “Metabolic Factors Limiting Performance in Marathon Runners,” PLoS Computational Biology, vol. 6, Issue 10, 13 pages (Oct. 2010). |
Tanaka, H. et al., “Age-predicted maximal heart rate revisited,” Journal of the American College of Cardiology, vol. 37, Issue 1, pp. 153-156 (Jan. 2001). |
Wang, L. et al., “Time constant of heart rate recovery after low level exercise as a useful measure of cardiovascular fitness,” Conf. Proc. IEEE Eng. Med. Biol. Soc., vol. 1, pp. 1799-1802 (2006). |
U.S. Appl. No. 17/015,912, filed Sep. 9, 2020, Humblet et al. |
U.S. Appl. No. 17/015,965, filed Sep. 9, 2020, Dervisoglu et al. |
U.S. Appl. No. 17/016,020, filed Sep. 9, 2020, Ochs et al. |
Alexander, “Energetics and Optimization of Human Walking and Running,” Am J Human Biology, Mar. 20, 2002, 14:641-648. |
Lasecki, “Real-Time Crowd Labeling for Deployable Activity Recognition,” University of Rochester Computer Science, Feb. 23, 2013, 10 pages. |
Latt et al., “Walking speed, cadence and step length are selected to optimize the stability of head and pelvis accelerations,” Experimental Brain Research, Aug. 24, 2007, 184: 201-209. |
Morgan et al., “Effect of step length optimization on the aerobic demand of running,” Journal of Applied Physiology, 1994, 245-251. |
PCT International Preliminary Report on Patentability in International Appln. No. PCT/US2017/049693, dated Mar. 5, 2019, 8 pages. |
PCT International Preliminary Report on Patentability in International Appln. No. PCT/US2018/047290, dated Mar. 17, 2020, 9 pages. |
Pfitzinger.com “Optimal Marathon Training Sessions, Distance Coach.com, Intelligent Training for Distance Runners,” archived May 15, 2012, <https://web.archive.org/web/20120515081237/http://www.pfitzinger.com/marathontraining.shtml>, printed Jan. 20, 2017, 3 pages. |
Romijn et al., “Regulation of endogenous fat and carbohydrate metabolism in relation to exercise intensity and duration,” Am. J. Physiol., 1993, 6:1-13. |
Triendurance.com “Running with a Higher Cadence, Triendurance,” Oct. 23, 2021, retrieved from <https://web.archive.org/web/20080228162904/http://www.triendurance.com/Related.asp?PageID=14&NavID=7>, 2 pages. |
Zhao, “New Developments of the Typical MEMS and Wearable Sensor Technologies,” Micronanoelectronic Technology, Jan. 2015, 52(1):1-13 (with English abstract). |
Zhou et al., “Development of the Technical Monitoring System for Swimming Competition,” China Sport Science and Technology, Aug. 2008, 44(4):84-86 (with English abstract). |
Mattfeld et al., “A New Dataset for Evaluating Pedometer Performance,” IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Nov. 2017, pp. 865-869. |
Shen et al., “MiLift: Efficient Smartwatch-Based Workout Tracking Using Automatic Segmentation,” Jul. 2018, 17(7):1609-1622. |
Unuma et al., JP2007093433, published on Apr. 12, 2007, 27 pages (machine translated English version). |
Number | Date | Country | |
---|---|---|---|
20180056128 A1 | Mar 2018 | US |
Number | Date | Country | |
---|---|---|---|
62381843 | Aug 2016 | US | |
62381846 | Aug 2016 | US | |
62381988 | Aug 2016 | US | |
62381989 | Aug 2016 | US | |
62381641 | Aug 2016 | US | |
62381640 | Aug 2016 | US |