Obesity is a growing problem, with more than one-third of American adults classified as obese. Obesity increases the risk of certain chronic diseases such as Type II diabetes. Exercise has been shown to improve the health of individuals and lower the risk of obesity-related diseases. Despite these health benefits, many individuals remain inactive. This may be due to a lack of motivation stemming from the physical effort or monotony sometimes perceived as being associated with exercise.
Exergaming (a portmanteau of “exercise” and “gaming”) has emerged as a solution to this problem. Exergaming is a class of video games that requires participants to be physically active in order to play, thereby turning tedious exercise into a fun and interactive experience for users. Game genres may vary from action based to health and fitness focused. These technologies, however, have several limitations. First, a television is often required to display these games. Second, the user is often required to hold a game controller in order to capture their physical movements. This limits the types of exercises the user can perform and is often an added physical burden.
In light of the above, there remains a need for improved systems and methods for immersive and interactive exercise.
The present disclosure generally relates to virtual reality and communicative sensing devices as applied to immersive and interactive exercise. More specifically, the present disclosure is directed to systems and methods that capture a user's movements during exercise and use captured movement information to update sensory stimuli presented to the user via a head mounted display to create an illusion of being immersed in a virtual environment in which the user can interact. In one embodiment, this movement information may be captured using an internet of things (IoT) sensor attached to, in communication with, or integrated into exercise equipment being operated by the user. The IoT sensor may identify exercise type, count the number of repetitions, and assess the quality of the exercise being performed by the user.
In an example embodiment, a method may include, with a sensor device, detecting motion and generating motion data based on the detected motion, with an electronic device, receiving the motion data from the sensor device, with a processor in the electronic device, analyzing the motion data to produce analytics information, with the processor, generating a virtual reality (VR) environment in which analytics information is provided, and with an electronic display in the electronic device, displaying the VR environment.
In some embodiments, analyzing the motion data to produce the analytics information may include segmenting the motion data into repetition segments, each corresponding to a single repetition of an exercise, generating a repetition count corresponding to a quantity of the repetition segments, generating a motion progress status corresponding to a percentage of a given repetition that has been completed in real-time, determining an exercise type based on the motion data, and determining exercise quality based on the motion data.
In some embodiments, the motion data may include acceleration data, and segmenting the motion data to produce the repetition segments includes performing principal component analysis on the acceleration data to generate a first principal component signal, and identifying a repetition segment of the motion data corresponding to a first repetition of the exercise based on the first principal component signal and the acceleration data.
In some embodiments, generating the motion progress status may include generating the motion progress status based on a comparison of the first principal component signal to a historical first principal component signal.
In some embodiments, the motion data may also include gyroscope data, and determining the exercise type based on the motion data includes generating an acceleration magnitude signal for the acceleration data, generating a rotational magnitude signal for the gyroscope data, extracting features from the acceleration magnitude signal and the rotational magnitude signal to generate a feature vector, and analyzing the feature vector to determine the exercise type by applying a majority voting scheme to the feature vector for multiple repetitions of the exercise.
In some embodiments, determining exercise quality based on the motion data may include comparing the motion data to a trainer model stored in a non-transitory memory of the electronic device.
In some embodiments, comparing the motion data to the trainer model may include dividing the repetition segments into smaller fixed-length windows, generating a first motion trajectory for the motion data by extracting features from each window of the windows to generate a sequence of local feature vectors, and performing trajectory comparison on the first motion trajectory and a second motion trajectory of the trainer model.
In some embodiments, the trajectory comparison may include multidimensional dynamic time warping.
In some embodiments, generating the VR environment may include animating an avatar that moves in real-time corresponding to the motion data, highlighting muscle groups on the avatar that correspond to muscles activated by the determined exercise type, and generating a heads-up display (HUD) that includes the repetition count, the motion progress status, the exercise type, and the exercise quality.
In an example embodiment, a system may include a sensor device that captures motion data corresponding to motion of an exercise machine, and an electronic device that receives the motion data from the sensor device. The electronic device may include a processor connected to a memory having instructions stored thereon which, when executed by the processor, cause the processor to analyze the motion data to produce analytics information, and generate a virtual reality (VR) environment in which the analytics information is provided, and an electronic display electrically coupled to the processor to display the VR environment generated by the processor.
In some embodiments, the sensor device may include a magnet that attaches the sensor device to the exercise machine.
In some embodiments, the sensor device may include wireless communications circuitry that provides the motion data to the electronic device via Bluetooth Low Energy.
In some embodiments, the processor may further execute instructions for segmenting the motion data into repetition segments, each corresponding to a single repetition of an exercise, generating a repetition count corresponding to a quantity of the repetition segments, generating a motion progress status corresponding to a percentage of a given repetition that has been completed in real-time, determining an exercise type based on the motion data, and determining exercise quality based on the motion data.
In some embodiments, the processor may further execute instructions for dividing the repetition segments into smaller fixed-length windows, generating a first motion trajectory for the motion data by extracting features from each window of the windows to generate a sequence of local feature vectors, and performing trajectory comparison on the first motion trajectory and a second motion trajectory of a trainer model to determine the exercise quality. The trainer model may be stored in a trainer reference database in a non-transitory memory of the electronic device.
In some embodiments, the sensor device may also include an accelerometer that generates acceleration data and a gyroscope that generates gyroscope data. The captured motion data may include the acceleration data and the gyroscope data.
In an example embodiment, a head-mounted display (HMD) device may include a processor connected to a memory having instructions stored thereon which, when executed by the processor, cause the processor to analyze captured motion data to produce analytics information, and generate a virtual reality (VR) environment in which the analytics information is provided, and an electronic display electrically coupled to the processor to display the VR environment generated by the processor.
In some embodiments, the memory contains further instructions which, when executed by the processor, cause the processor to segment the motion data into repetition segments, each corresponding to a single repetition of an exercise, generate a repetition count corresponding to a quantity of the repetition segments, generate a motion progress status corresponding to a percentage of a given repetition that has been completed in real-time, determine an exercise type based on the motion data, and determine exercise quality based on the motion data.
In some embodiments the memory contains further instructions which, when executed by the processor, cause the processor to divide the repetition segments into smaller fixed-length windows, generate a first motion trajectory for the motion data by extracting features from each window of the windows to generate a sequence of local feature vectors, and perform trajectory comparison on the first motion trajectory and a second motion trajectory of a trainer model to determine the exercise quality.
In some embodiments, the HMD device may also include a non-transitory computer readable storage medium. The trainer model may be stored in a trainer reference database in the non-transitory computer readable storage medium.
In some embodiments, the VR environment may include an animated avatar that moves in real-time corresponding to the motion data to perform the exercise, and a heads-up display (HUD) that includes the repetition count, the motion progress status, the exercise type, and the exercise quality in real-time. The animated avatar may include highlighted muscle groups corresponding to muscles activated by the determined exercise type.
The present invention will hereafter be described with reference to the accompanying drawings, wherein like reference numerals denote like elements.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The present disclosure relates to systems and methods for immersive and interactive machine-based exercise training using VR.
Exercising in the gym has become an important part of modern life for many people. However, without the guidance of professional trainers, novice exercisers may be unaware if the quality of the speed and motion of an exercise they perform is adequate or what they should focus on during a workout. This lack of awareness often prevents exercisers from making steady progress, and may eventually cause exercisers to lose interest and motivation for going to the gym.
In order to enhance an individual's interest and motivation for exercise, as well as to improve the quality of exercise, an immersive and interactive VR exercising experience is provided, through which controllable 3D stimulus environments may be created. As part of this VR exercising experience, an engaged virtual exercise assistant may guide exercisers in a highly interactive and precise way, which may not be achievable through traditional exercise training paradigms.
Immersive and interactive machine-based exercise training may be enabled through the use of miniature IoT sensing devices communicatively coupled to a mobile head mounted display (HMD) device. By attaching (directly or indirectly) an IoT sensing device on any piece of gym equipment, exercise progress may be continuously tracked, and exercise quality may be assessed in real-time. By providing captured exercise progress information and quality information as inputs of a VR environment implemented on the HMD device, an immersive exercise experience is created in which a user may be guided through the process of exercising by a virtual exercise assistant using real-time feedback. Additionally, by highlighting, on an avatar shown in the VR environment, required muscle groups corresponding to the exercise being performed, the virtual exercise assistant may enable a user to more easily focus on these muscle groups while performing the exercise.
Turning now to
System 110 includes a communications interface 112, processing circuitry 114, an electronic display 116, a memory 118, and an antenna 122. Some or all of these components may communicate over a bus 120. Although bus 120 is illustrated here as a single bus, it may instead be implemented as one or more busses, bridges, or other communication paths used to interconnect components of system 110. Memory 118 may be a non-transitory computer readable storage medium (e.g., read-only memory (ROM), random access memory (RAM), flash memory, etc.). Processing circuitry 114 may include one or more hardware processors, which may execute instructions stored in memory 118. These instructions may, for example, include instructions for determining exercise type, tracking exercise progress, assessing exercise quality, generating a VR scene, visualizing exercise information, and animating a virtual body (e.g., an avatar). Electronic display 116 may display a VR scene, exercise information, and an animated virtual body (e.g., all generated by processing circuitry 114) all corresponding to an exercise being performed by a user in real-time.
Communications interface 112 may include one or more communications interfaces, each configured to operate according to a different wireless communications protocol (e.g., Bluetooth, Bluetooth Low Energy (LE), WiFi, WiMAX, LTE, LTE-Advanced, GSM/EDGE). Antenna 122 may wirelessly transmit and receive data between communications interface 112 and external devices. It should be noted that while antenna 122 is shown here as being a single antenna that is external to system 110, in some embodiments, antenna 122 may instead include multiple antennas located in or at various locations of system 110, and/or may be disposed within or formed as part of a housing of system 110.
Turning now to
Sensor device 130 includes a communications interface 132, processing circuitry 134, motion sensor circuitry 136, a memory 142, and an antenna 146. Some or all of these components may communicate over a bus 144. Although bus 144 is illustrated here as a single bus, it may instead be implemented as one or more busses, bridges, or other communication paths used to interconnect components of sensor device 130. Memory 142 may be a non-transitory computer readable storage medium (e.g., read-only memory (ROM), random access memory (RAM), flash memory, etc.). Processing circuitry 134 may include one or more hardware processors, which may execute instructions stored in memory 142. These instructions may, for example, include instructions for controlling motion sensor circuitry 136 and communications interface 132. In some embodiments, processing circuitry 134 may be a microcontroller unit.
Motion sensor circuitry 136 may include an accelerometer 138, a gyroscope 140, and/or other sensors capable of discerning relative movement. Accelerometer 138 may, for example, be a 3-axis accelerometer, which may measure linear acceleration undergone by sensor device 130 in one or more directions. Gyroscope 140 may, for example, be a 3-axis gyroscope, which may measure the angular rate of rotational movement about one or more axes of sensor device 130 accurately in multiple dimensions. Motion sensor circuitry 136 may generate motion data at a given sampling rate (e.g., 10 Hz). A moving average filter (e.g., with length 10) may be applied (e.g., by processing circuitry 134) to the generated motion data in order to suppress high frequency noise that may be present in the motion data. Motion data generated by motion sensor circuitry 136 may be provided to an external VR HMD system (e.g., system 110 of
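By way of illustration, the moving average filtering described above may be sketched as follows; the length-10 window matches the example above, while the per-axis zero-padded edge handling is an assumption of this sketch rather than a requirement of the disclosure:

```python
import numpy as np

def smooth_motion_data(samples, window_len=10):
    """Apply a moving average filter (e.g., length 10) to suppress
    high-frequency noise in raw motion samples, such as 3-axis
    acceleration data sampled at 10 Hz."""
    samples = np.asarray(samples, dtype=float)
    kernel = np.ones(window_len) / window_len
    # Filter each axis (column) independently; edges are zero-padded.
    return np.column_stack(
        [np.convolve(samples[:, axis], kernel, mode="same")
         for axis in range(samples.shape[1])]
    )
```

In a real implementation this filtering would run on streaming samples (e.g., in processing circuitry 134) rather than on a complete array.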
Communications interface 132 may include one or more communications interfaces, each configured to operate according to a different wireless communications protocol, such as Bluetooth LE. Antenna 146 may wirelessly transmit and receive data between communications interface 132 and external devices (e.g., system 110 of
While not shown here, sensor device 130 may include a plastic or otherwise dielectric housing in which a magnet is embedded. In this way, sensor device 130 may be easily attached to, for example, ferromagnetic exercise equipment.
Turning now to
As shown in
Turning now to
Turning now to
First, for lateral raise machine 601, a sensor tag 602 may be attached to a first portion of lateral raise machine 601. While lateral raise machine 601 is in use, the motion of this first portion corresponds to the primary motion associated with the performance of a lateral raise. Additionally, a sensor tag 604 may be attached to a second portion of lateral raise machine 601. While lateral raise machine 601 is in use, the motion of the second portion corresponds to a different angle/trajectory compared to the motion of the first portion of lateral raise machine 601.
Second, for seated abs machine 605, a sensor tag 608 may be attached to a first portion of seated abs machine 605 in a first orientation. Additionally, a sensor tag 606 may be attached to a second portion of seated abs machine 605 in a second orientation (e.g., arranged along a plane that is perpendicular to the plane along which sensor tag 608 is arranged).
By attaching a second sensor tag to an exercise machine with a slightly different angle/trajectory of motion and/or orientation compared to that of a first sensor tag attached to the exercise machine, motion data from both sensor tags may be analyzed to determine the optimal sensor tag placement on the exercise machine for accurate identification of exercise type (e.g., by exercise type recognizer 714 of
For example, for embodiments in which “N” exercise types corresponding to “N” different exercise machines are identifiable (e.g., by system 110 of
Different sensor placement combinations (e.g., arranged in motion data arrays) may then be analyzed (e.g., by exercise type recognizer 714 of
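The placement-combination search described above may be sketched as follows. Here `accuracy_fn` is a hypothetical callable that scores a given combination of placements by the exercise-type classification accuracy it yields on recorded motion data; its existence and signature are assumptions of this sketch:

```python
from itertools import product

def best_placement(accuracy_fn, candidate_placements):
    """For N machines, each with a list of candidate sensor placements,
    evaluate every placement combination and keep the one that gives
    the best exercise classification accuracy."""
    best_combo, best_acc = None, -1.0
    # Exhaustively enumerate one placement choice per machine.
    for combo in product(*candidate_placements):
        acc = accuracy_fn(combo)
        if acc > best_acc:
            best_combo, best_acc = combo, acc
    return best_combo, best_acc
```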
Turning now to
Repetition segmentor 708 may distinguish between separate repetitions of an exercise by segmenting captured motion data. The goal of repetition segmentation performed by repetition segmentor 708 is to segment streaming sensor data (e.g., captured motion data) so that each data segment contains one complete repetition of the performed machine exercise. Examples of how repetition segmentor 708 may operate will be described herein in the context of
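One way repetition segmentor 708 might operate, consistent with the principal-component-based segmentation summarized earlier, is sketched below; the peak-detection heuristic and the `min_gap` guard are illustrative assumptions, not limitations of the disclosure:

```python
import numpy as np

def segment_repetitions(accel, min_gap=5):
    """Project 3-axis acceleration onto its first principal component
    and split the stream at that signal's dominant peaks, so that each
    segment spans one repetition. min_gap (in samples) guards against
    counting jitter as separate repetitions."""
    accel = np.asarray(accel, dtype=float)
    centered = accel - accel.mean(axis=0)
    # First principal component = eigenvector of the covariance matrix
    # with the largest eigenvalue.
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    first_pc = eigvecs[:, np.argmax(eigvals)]
    pc_signal = centered @ first_pc
    # Candidate boundaries: local maxima of the first-PC signal.
    peaks = [i for i in range(1, len(pc_signal) - 1)
             if pc_signal[i] > pc_signal[i - 1]
             and pc_signal[i] >= pc_signal[i + 1]]
    # Enforce a minimum spacing between boundaries.
    boundaries = []
    for p in peaks:
        if not boundaries or p - boundaries[-1] >= min_gap:
            boundaries.append(p)
    # Each adjacent boundary pair delimits one repetition segment.
    return [(boundaries[k], boundaries[k + 1])
            for k in range(len(boundaries) - 1)]
```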
Repetition counter 710 may increment a counter value each time a new repetition is identified by repetition segmentor 708, where the counter value represents the total number of exercise repetitions that have been performed. A user may have the option (e.g., via a user interface) to reset this counter between different sets of the exercise being performed.
It is challenging to provide a user with precise progress information corresponding to the percentage of a repetition that has been completed in real-time, as the exact progress status within each repetition can only be determined after a repetition has been completed. This is due in part to the fact that the amount of time it takes a user to complete a repetition and the speed with which each repetition is performed may vary from user to user and may vary between two repetitions performed by the same user. As an alternative to determining the exact progress status of a repetition, motion progress status may be estimated using motion progress detector 712. For example, motion progress detector 712 may use the values of the first principal component (PC) signal for the 3-axis acceleration data to determine a motion progress status for a partially completed repetition of a given exercise in real time. Motion progress status may begin at 0% at the start of a repetition, and may increase to 100%, marking the end of the repetition, with successive steps of, for example, 10%. The correlation between the first PC signal and the motion progress status for a given exercise may be determined based on previously determined relationships between a historical first PC signal and motion progress status for the given exercise (e.g., determined based on trainer motion data stored in trainer reference database 718). For example, the first PC signal may be compared to the historical first PC signal when determining the motion progress status. A real-time first PC value for the first PC signal may be determined along with an indicator that specifies whether the first PC signal is presently increasing or decreasing. A historical PC value corresponding to the real-time first PC value may be identified in a region of the historical first PC signal that is either increasing or decreasing, according to the value of the indicator.
A historical motion progress status may be determined for the historical first PC value (e.g., by determining the percentage of the historical repetition that was completed at the time the historical first PC value was sampled), and may be used as an estimate for the motion progress status of the exercise presently being performed.
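The progress-estimation scheme described above may be sketched as follows; matching by nearest value within the rising or falling region of the historical curve, and rounding to 10% steps, reflect the example values above, while the specific matching rule is an assumption:

```python
import numpy as np

def estimate_progress(pc_value, rising, hist_pc):
    """Estimate motion progress (0-100%) by finding the sample in a
    historical first-PC signal (one complete repetition) that best
    matches the live PC value, restricted to the region of the
    historical curve whose trend (rising/falling) matches the live
    signal's direction indicator."""
    hist_pc = np.asarray(hist_pc, dtype=float)
    slope = np.diff(hist_pc)
    # Candidate samples whose local trend matches the indicator.
    candidates = np.where(slope > 0)[0] if rising else np.where(slope < 0)[0]
    if len(candidates) == 0:
        return 0.0
    best = candidates[np.argmin(np.abs(hist_pc[candidates] - pc_value))]
    # Progress = position of the matched sample within the repetition,
    # rounded to successive 10% steps as described above.
    progress = 100.0 * best / (len(hist_pc) - 1)
    return round(progress / 10.0) * 10.0
```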
After segmenting captured motion data into repetition segments, the type of exercise being performed may be determined by exercise type recognizer 714. Due to the different mechanical constraints of exercise machines, each type of machine exercise has a certain form, which may be used to distinguish a given machine exercise from other machine exercises. Therefore, identifying a type of machine exercise may be considered a problem of classification. As explained above, a user could place a sensor tag on different exercise machines in different ways, leading to different orientations of the sensor tag. In order to perform orientation-independent classification to determine the type of exercise being performed, an acceleration magnitude signal for the 3-axis acceleration data (e.g., generated by accelerometer 138 of
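The orientation-independent recognition pipeline above (magnitude signals, per-repetition feature vectors, and a majority vote across repetitions) may be sketched as follows; the simple statistical features and the per-repetition classifier outputs passed to the vote are illustrative assumptions:

```python
import numpy as np
from collections import Counter

def magnitude(xyz):
    """Orientation-independent magnitude of a 3-axis signal."""
    return np.linalg.norm(np.asarray(xyz, dtype=float), axis=1)

def repetition_features(accel, gyro):
    """Build a feature vector for one repetition segment from the
    acceleration-magnitude and rotational-magnitude signals. The exact
    feature set is an assumption; simple statistics are shown."""
    feats = []
    for sig in (magnitude(accel), magnitude(gyro)):
        feats += [sig.mean(), sig.std(), sig.max(), sig.min()]
    return np.array(feats)

def classify_exercise(rep_labels):
    """Majority voting scheme: the exercise type assigned to the most
    repetitions wins."""
    return Counter(rep_labels).most_common(1)[0][0]
```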
The final stage of real-time exercise analytics engine 702 provides assessment of the quality of machine exercises performed by users. Exercise quality assessor 716 includes a trainer reference database 718. Trainer reference database 718 may store trainer models corresponding to each machine exercise with which the VR system may be used. Each trainer model may include motion data corresponding to one or more professional trainers' performances of a given exercise (referred to herein as “trainer motion data”). Comparison between this trainer motion data and captured motion data corresponding to a user's performance of an exercise (referred to herein as “user motion data”) is used by exercise quality assessor 716 as a basis for determining the quality of the user's performance of the exercise. In some embodiments, at least two trainer models may be stored in trainer reference database 718 for each machine exercise, one corresponding to a female trainer performing the machine exercise, and the other corresponding to a male trainer performing the machine exercise, so that a female user may choose to use a female trainer model, and a male user may choose to use a male trainer model. It should be noted that a trainer model may be an aggregate model compiled from the trainer motion data of multiple trainers, which may improve the quality and accuracy of the trainer models over trainer models that are generated based on only a single trainer's motion data.
In order to determine similarities between a trainer model and user motion data, a motion trajectory based approach may be used. For example, user motion data corresponding to a user's performance of a given exercise may be divided into repetition segments (e.g., by repetition segmentor 708) and each repetition segment may further be divided into a sequence of small fixed-length windows, each having a period that is smaller than the duration of the repetition segment itself (e.g., the duration of a repetition segment may be between 3-5 seconds depending on the machine exercise, while the window duration may be 0.5 seconds). Then, a number of features may be extracted from each window in order to capture intrinsic characteristics of each repetition. These extracted features may be stored in a local feature vector for each respective window and, thereby, a sequence of local feature vectors may be formed, which forms a motion trajectory in the feature space. A trajectory comparison algorithm may be applied to this motion trajectory in order to quantify the similarity between two motion trajectories. In this way, quality assessment performed by exercise quality assessor 716 may provide fine-grained descriptions about where a user's exercise repetition differs from the trainer model, and the user may be provided with concrete feedback on how their exercise quality may be improved.
For example, the extracted features used to form a local feature vector for a window may include average of movement intensity (AI), variation of movement intensity (VI), smoothness of movement intensity (SI), average acceleration energy (AAE), and average rotation energy (ARE). AI may be computed as the average of motion intensity (MI), defined as the Euclidean norm of the acceleration vector. AI measures the average strength level of the exercise repetition. VI is computed as the variation of MI. VI measures the strength variation of the exercise repetition. SI is computed from the derivative values of MI. SI measures the smoothness of the exercise repetition. AAE calculates the mean value of energy over the three accelerometer axes. AAE measures the total exercise acceleration energy. ARE calculates the mean value of energy over the three gyroscope axes. ARE measures the total exercise rotation energy.
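The local feature vector described above may be sketched as follows. The interpretations here are assumptions: VI is taken as the standard deviation of MI, SI as the mean absolute derivative of MI, and "energy" as the mean squared value per axis:

```python
import numpy as np

def window_features(accel, gyro):
    """Local feature vector [AI, VI, SI, AAE, ARE] for one
    fixed-length window of a repetition segment."""
    accel = np.asarray(accel, dtype=float)
    gyro = np.asarray(gyro, dtype=float)
    mi = np.linalg.norm(accel, axis=1)   # motion intensity per sample
    ai = mi.mean()                       # average movement intensity
    vi = mi.std()                        # variation of movement intensity
    si = np.abs(np.diff(mi)).mean()      # smoothness of movement intensity
    aae = (accel ** 2).mean()            # average acceleration energy
    are = (gyro ** 2).mean()             # average rotation energy
    return np.array([ai, vi, si, aae, are])
```

The sequence of such vectors over the windows of a repetition forms the motion trajectory compared against the trainer model.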
Examples of sampled AI and VI values that may be used as a basis for exercise quality assessment are depicted in the graphs of
Graph 902 shows the AI value over time for user motion data corresponding to a user's performance of one repetition of a leg extension machine exercise compared to the AI value over time for a corresponding trainer model for the leg extension machine exercise.
Graph 904 shows the VI value over time for user motion data corresponding to a user's performance of one repetition of a leg extension machine exercise compared to the VI value over time for a corresponding trainer model for the leg extension machine exercise.
As shown, the user motion data in both graph 902 and graph 904 appear to closely match the trainer model, indicating that the user's repetition of the leg extension machine exercise should be considered a good or high quality repetition.
Graph 906 shows the AI value over time for user motion data corresponding to a user's performance of one repetition of a bicep curl machine exercise compared to the AI value over time for a corresponding trainer model for the bicep curl machine exercise.
Graph 908 shows the VI value over time for user motion data corresponding to a user's performance of one repetition of a bicep curl machine exercise compared to the VI value over time for a corresponding trainer model for the bicep curl machine exercise.
As shown, the user motion data for both graph 906 and graph 908 appear to be mismatched with the trainer model, indicating that the user's repetition of the bicep curl machine exercise should be considered a bad or low quality repetition.
Returning now to
X=x1, x2, . . . , xi, . . . , xM
Y=y1, y2, . . . , yj, . . . , yN
where xi and yj represent the ith and jth local feature vectors in X and Y, respectively, and where M and N represent the lengths of X and Y, respectively. DTW compensates for the length difference between X and Y by solving the following dynamic programming (DP) problem:
D(i, j)=min{D(i−1, j−1), D(i−1, j), D(i, j−1)}+d(i, j)
where d(i, j) represents the distance function which measures the local difference between local feature vectors xi and yj in the feature space, and D(i, j) represents the cumulative global distance between sub-trajectories {x1, x2, . . . , xi} and {y1, y2, . . . , yj}. The solution of the DP problem is the cumulative distance between the two motion trajectories X and Y, which is located within D(M, N), and a warp path W of length K defined as:
W=w1, w2, . . . , wk, . . . , wK
which traces the mapping between X and Y. Since the cumulative distance D(M, N) is dependent on the length of the warp path W, D(M, N) may be normalized by dividing it by the warp path length K, and this averaged cumulative distance may be used as the metric for measuring the distance between motion trajectories X and Y:
Dist(X, Y)=[D(M, N)]/K
The cosine distance may be used as the local distance function defined as:
d(i, j)=1−[(xiT*yj)/(∥xi∥*∥yj∥)]
Compared to other distance functions, the cosine distance may provide an advantage of having an intrinsic range of [0, 1], which in turn should cause the averaged cumulative distance Dist(X, Y) to be in the range [0, 1], which may therefore be interpreted as the percentage dissimilarity between X and Y. Therefore, the similarity score between X and Y, Sim(X, Y), may be defined as:
Sim(X, Y)=1−Dist(X, Y)
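The multidimensional DTW computation and similarity score defined above may be sketched as follows; this is an illustrative O(MN) implementation in which the warp path length K is recovered by backtracking:

```python
import numpy as np

def cosine_distance(x, y):
    """Local distance d(i, j) = 1 - cos(x, y)."""
    return 1.0 - float(x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))

def dtw_similarity(X, Y):
    """Multidimensional DTW between two motion trajectories (sequences
    of local feature vectors), returning Sim(X, Y) = 1 - D(M, N)/K."""
    M, N = len(X), len(Y)
    D = np.full((M + 1, N + 1), np.inf)
    D[0, 0] = 0.0
    # Fill the cumulative distance table via the DP recurrence.
    for i in range(1, M + 1):
        for j in range(1, N + 1):
            d = cosine_distance(np.asarray(X[i - 1]), np.asarray(Y[j - 1]))
            D[i, j] = d + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    # Recover the warp path length K by backtracking from (M, N).
    i, j, K = M, N, 1
    while (i, j) != (1, 1):
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
        K += 1
    return 1.0 - D[M, N] / K
```

Identical trajectories yield a similarity near 1, while orthogonal feature vectors yield a similarity near 0, consistent with the percentile interpretation above.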
This similarity score, as applied to a comparison between the motion trajectories of user motion data and a training model, is indicative of the quality of the user's performance of a repetition of a machine exercise and, thus applied, acts as the quantification of exercise quality. For example, the similarity score may be presented to the user (e.g., as part of a HUD of a VR environment) as a percentage, indicating the quality of a repetition of the exercise being performed. Alternatively, a running average of consecutive similarity scores may be displayed to the user in order to indicate the quality of the user's performance of the exercise across multiple repetitions of the exercise.
VR synthesis engine 704 includes a VR scene manager 720, an exercise information visualizer 726, and a virtual body animator 728. VR scene manager 720 includes an exercise type database 722 and a personal configuration database 724. Once a user begins exercising, VR scene manager 720 may automatically initiate a virtual coaching scene based on the exercise type determined by exercise type recognizer 714. For example, VR scene manager 720 may retrieve a virtual coaching scene corresponding to the determined exercise type from exercise type database 722. Additionally, user-defined preferences related to the virtual coaching scene may be retrieved by VR scene manager 720 from personal configuration database 724. For example, these user-defined preferences may correspond to user customization of the avatar that is displayed, the information that is displayed, or the background of the virtual coaching scene.
An example of a virtual coaching scene that may be generated and displayed by VR scene manager 720 is shown in
The present invention has been described in terms of one or more preferred embodiments, and it should be appreciated that many equivalents, alternatives, variations, and modifications, aside from those expressly stated, are possible and within the scope of the invention.
This application claims priority to U.S. Provisional Application No. 62/592,236 filed Nov. 29, 2017, which is incorporated by reference in its entirety for all purposes.