Market research and subject matter experts indicate that fatigue can make an athlete more susceptible to injury and may, in fact, be one of the leading causes of injuries. Thus, there is a need to detect the onset of fatigue while an athlete is actively training, conducting practice, or participating in a live game. When a trainer (e.g., an athletic trainer or coach) detects signs of fatigue, the trainer can intervene to reduce the likelihood of fatigue-related injury. For example, when a trainer detects fatigue, the trainer may instruct the athlete to slow down or focus on technique. Additionally or alternatively, the trainer may pull the athlete from a game or a practice session for rest and recovery.
One of the challenges with detecting fatigue is that traditional methods of monitoring athletic performance, such as real-time heart rate monitoring, do not in and of themselves necessarily indicate when an athlete is fatigued. For example, an athlete experiencing a lack of sleep may still exhibit a normal or expected heart rate during a practice session but may experience an earlier onset of fatigue due to the lack of sleep. As a result, the athlete may have a heightened susceptibility to fatigue-induced injury that the trainer is unaware of because the athlete's heart rate appears normal.
Precise control of stance, posture, and movement improves the effectiveness of exercise routines and prevents injury. Typically, an expert, such as a coach, a trainer, or a doctor, directly observes a subject, such as an athlete or a patient, during the exercise and makes real-time corrections based on a number of intuitive factors. This approach is limited in that the expert can only work in real time while the subject is under observation. In most team and clinic environments, an expert must observe a number of subjects simultaneously, making fine adjustments to stance and posture based only on information collected during the short period of time that each subject is under observation.
Accordingly, there remains a need for efficient and reliable injury prediction systems and methods that aim to address one or more problems of prior art systems.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Analytics systems configured in accordance with various embodiments of the present technology can address at least some limitations of traditional methods of detecting fatigue and/or monitoring athletic performance. As described below, the system can provide analytics in an augmented reality environment that are real-time, comparative, and predictive in nature, and can detect the early onset of fatigue, which may not be readily detectable by visual monitoring alone. This, in turn, provides the opportunity for improved training outcomes and earlier intervention and corrective action to reduce the risk of fatigue-related injuries.
Various embodiments of the present technology include an augmented reality system (such as a real-time analytics system) that incorporates data collected from wearable sensor technology, also referred to as a performance monitor, into an interactive user interface having a receiver (such as a wireless receiver) for the data. In some embodiments, the interactive user interface provides an augmented reality display of health- and/or performance-analytics data integrated into a video image of a subject. The interactive user interface may present a subject under observation during an athletic performance as viewed through a camera of a mobile device. The user interface may also present data collected over a period of time preceding the athletic performance, thereby providing in-depth information on athletic development, the therapeutic efficacy of an exercise routine, recovery after a sports injury, etc. In other embodiments, the inventive technology may be used for other purposes. For example, the inventive technology may be used for military training or in conjunction with consumer devices.
In some embodiments, the user interface may communicate with a data storage system including a processor implementing machine learning analytics. The interactive user interface may be implemented on a digital platform that analyzes real-time data collected from the wearable sensor technology as the subject exercises or rests, and may compare the collected data with aggregated data collected from additional subjects and subsequently analyzed by a machine learning system. The machine learning analytics may implement predictive models to adjust the augmented reality display, indicating training information, such as likelihood of injury, asymmetric exertion, motion or posture irregularities, etc.
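As an illustration of the kind of comparison such machine learning analytics might perform, the following Python sketch checks a window of real-time readings against an aggregated baseline and produces a flag that an augmented reality overlay could render. The function name, the z-score heuristic, and the threshold are illustrative assumptions rather than the disclosed predictive model.

```python
from statistics import mean, stdev

def fatigue_flag(realtime_window, baseline_samples, z_threshold=2.0):
    """Compare a window of real-time readings (e.g., per-stride EMG amplitude)
    against aggregated baseline samples and return a flag that an augmented
    reality overlay could render as a warning indicator.

    The z-score heuristic is a stand-in for the predictive models described
    in the text; any trained model could be substituted here.
    """
    base_mean = mean(baseline_samples)
    base_sd = stdev(baseline_samples)
    current = mean(realtime_window)
    z = (current - base_mean) / base_sd if base_sd else 0.0
    return {
        "current": current,
        "z_score": z,
        "elevated_risk": abs(z) > z_threshold,  # drives the warning overlay
    }

# Example: baseline from prior sessions, window from the live sensor feed.
baseline = [0.82, 0.85, 0.80, 0.84, 0.83, 0.81, 0.86]
print(fatigue_flag([0.95, 0.97, 0.99], baseline))
```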
As understood by one of ordinary skill in the art, a “data storage system” as described herein may be a device configured to store data for access by a computing device. An example of a data storage system is a high-speed relational database management system (DBMS) executing on one or more computing devices and being accessible over a high-speed network. However, other suitable storage techniques and/or devices capable of quickly and reliably providing the stored data in response to queries may be used, and the computing device may be accessible locally instead of over a network, or may be provided as a cloud-based service. The data storage system may also include data stored in an organized manner on a computer-readable storage medium.
In some embodiments, the user interface is implemented in a mobile device. As such, the mobile device may be repositioned to observe multiple subjects during a given period of time. The user interface may employ various techniques to identify the subject presently under observation. For example, the user interface may implement facial recognition routines to identify a subject. Alternatively or additionally, the user interface may communicate via radio-frequency identification (RFID) and/or Bluetooth with the performance monitor worn by the subject. For example, the performance monitor may include an RFID tag or other unique identifier that allows the analytics system to attribute newly collected data to the subject and to request previously collected performance data.
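A minimal sketch of how the platform might resolve which subject is currently under observation, assuming a hypothetical registry keyed by the performance monitor's RFID tag and a placeholder facial-recognition hook; the names and tag values below are not taken from the disclosure.

```python
from typing import Callable, Optional

# Hypothetical registry mapping performance-monitor RFID tags to subject IDs.
RFID_REGISTRY = {
    "E200-3412-0001": "subject-140",
    "E200-3412-0002": "subject-141",
}

def identify_subject(
    rfid_tag: Optional[str],
    camera_frame: bytes,
    recognize_face: Callable[[bytes], Optional[str]],
) -> Optional[str]:
    """Resolve the subject under observation.

    Prefer the unique identifier broadcast by the worn performance monitor;
    fall back to a facial-recognition routine applied to the camera frame.
    """
    if rfid_tag and rfid_tag in RFID_REGISTRY:
        return RFID_REGISTRY[rfid_tag]
    return recognize_face(camera_frame)

# Usage with a stub recognizer standing in for a real facial-recognition engine.
print(identify_subject("E200-3412-0001", b"", lambda frame: None))  # -> "subject-140"
```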
In some embodiments, the inventive technology includes an augmented reality system for real-time assessment of an athletic performance. In some embodiments, the system includes a digital platform. In an aspect, the digital platform includes a display, at least one camera, and a communications module. In some embodiments, the system includes a performance monitor carried by a garment and configured to communicate wirelessly with the digital platform, a logic engine, and an interactive user interface. In an aspect, the interactive user interface presents real-time data and images of the athletic performance in an augmented reality environment. In an aspect, the real-time data and images include images obtained by the at least one camera and athletic performance data received from the performance monitor.
In an aspect, the interactive user interface further presents historical performance data and aggregated performance data.
In an aspect, historical performance data includes real-time data collected from an identified individual over a period of time.
In an aspect, aggregated performance data includes real-time data collected from a plurality of anonymized individuals.
In an aspect, the augmented reality system includes at least one predictive model indicating a likelihood of one or more of fatigue or injury in correlation to real-time data.
In an aspect, the performance monitor includes a plurality of sensors positioned in one or more regions of the garment and configured to measure signals generated during the athletic performance, and a performance monitor controller. In an aspect, the performance monitor controller includes an onboard analytics module configured to receive and process signals from the plurality of sensors and an onboard communications module in wireless communication with the digital platform.
In an aspect, the performance monitor includes sensors to measure orientation, acceleration, heart response, and muscle response.
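The performance monitor aspects above can be summarized, for illustration only, as a packet of sensor readings plus a controller that batches readings for wireless transmission; the field names, batch size, and transmit callback in this Python sketch are assumptions.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class SensorSample:
    """One reading from the garment-mounted sensors (illustrative fields)."""
    timestamp_ms: int
    orientation: Tuple[float, float, float, float]  # quaternion from the orientation sensor
    acceleration: Tuple[float, float, float]        # m/s^2 from the acceleration sensor
    ecg_mv: float                                    # heart response
    emg_mv: Dict[str, float]                         # muscle response keyed by muscle group

@dataclass
class PerformanceMonitorController:
    """Onboard analytics plus a communications hook to the digital platform."""
    transmit: Callable[[List[SensorSample]], None]   # stands in for the onboard comms module
    buffer: List[SensorSample] = field(default_factory=list)
    batch_size: int = 50

    def on_sample(self, sample: SensorSample) -> None:
        # Lightweight onboard processing (filtering, averaging) could happen here.
        self.buffer.append(sample)
        if len(self.buffer) >= self.batch_size:
            self.transmit(self.buffer)
            self.buffer = []

# Usage: print batches instead of sending them over a wireless link.
controller = PerformanceMonitorController(transmit=print, batch_size=2)
controller.on_sample(SensorSample(0, (1, 0, 0, 0), (0.0, 0.0, 9.8), 1.1, {"left_hamstring": 0.4}))
controller.on_sample(SensorSample(20, (1, 0, 0, 0), (0.1, 0.0, 9.7), 1.0, {"left_hamstring": 0.5}))
```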
In an aspect, the logic engine includes an implementation of machine learning.
In an aspect, the augmented reality system further includes a data storage system in communication with the digital platform and the performance monitor, the data storage system including an implementation of machine learning.
In some embodiments, the inventive technology includes a method of assessing athletic performance in real time through an augmented reality environment. In some embodiments, the method includes selecting a subject of observation, identifying the subject of observation using a digital platform, and presenting an augmented reality environment including an interactive user interface and data. In an aspect, the interactive user interface and data include images of the subject of observation collected via a camera and real-time data collected via a performance monitor. In some embodiments, the method includes receiving commands from a user via the interactive user interface, wherein the commands modify one or more of the interactive user interface, the operation of the performance monitor, the selection of the subject of observation, and the presentation of data.
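Read as pseudocode, the method might run as a control loop like the following sketch; the object methods are placeholders for the operations named above, not an API defined by the disclosure.

```python
def run_assessment_session(platform, performance_monitor, ui):
    """Illustrative control flow only; the arguments are placeholders for the
    digital platform, the performance monitor, and the interactive user interface."""
    subject = platform.identify_subject()        # select and identify the subject
    while ui.session_active():
        frame = platform.capture_frame()         # images from the camera
        readings = performance_monitor.poll()    # real-time sensor data
        ui.render(subject, frame, readings)      # augmented reality presentation
        for command in ui.pending_commands():    # user commands may modify the UI,
            platform.apply(command)              # the monitor, the subject selection,
                                                 # or the data presentation
```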
In an aspect, the data further includes historical performance data collected from the subject of observation and aggregated performance data collected from multiple anonymized subjects.
In an aspect, the method further includes accessing real-time analytics provided by an external data storage system and processing the real-time data using model-predictions of athletic performance.
In an aspect, the method further includes identifying multiple subjects engaging in simultaneous athletic performances, presenting one or more available subjects via the interactive user interface, and prompting a selection of one or more of the available subjects for observation in real-time.
In an aspect, the method further includes indicating, via a visual or auditory signal, when the subject of observation has a high likelihood of adverse outcome from athletic performance.
In some embodiments, the inventive technology includes an augmented reality system for real-time assessment of a physical rehabilitation treatment. The augmented reality system may include a digital platform including a display, at least one camera, and a communications module, a performance monitor carried by a garment and configured to communicate wirelessly with the digital platform, a logic engine, and an interactive user interface. The interactive user interface may present real-time data and images of the physical rehabilitation treatment in an augmented reality environment. In an aspect, the real-time data and images include images obtained by the at least one camera and physical rehabilitation treatment data received from the performance monitor.
In an aspect, the interactive user interface further presents historical performance data and aggregated performance data. In an aspect, historical performance data includes real-time data collected from an identified individual over a period of time. In an aspect, aggregated performance data includes real-time data collected from a plurality of anonymized individuals.
In an aspect, the augmented reality system further includes at least one predictive model indicating a likelihood of one or more of fatigue or injury in correlation to real-time data.
In an aspect, the performance monitor includes a plurality of sensors positioned in one or more regions of the garment and configured to measure signals generated during the physical rehabilitation treatment, and a performance monitor controller. The performance monitor controller may include an onboard analytics module configured to receive and process signals from the plurality of sensors and an onboard communications module in wireless communication with the digital platform.
In an aspect, the performance monitor includes sensors to measure orientation, acceleration, heart response, and muscle response.
In an aspect, the logic engine includes an implementation of machine learning.
In an aspect, the augmented reality system further includes a data storage system in communication with the digital platform and the performance monitor, the data storage system including an implementation of machine learning.
The foregoing aspects and attendant advantages of the inventive technology will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings.
The following disclosure describes various embodiments of systems and associated methods for real-time assessment of athletic performance in an augmented reality environment. A person skilled in the art will also understand that the technology may have additional embodiments, and that the technology may be practiced without several of the details of the embodiments described below.
In some embodiments, the interactive user interface 112 includes an augmented reality display including real-time data 118a of the subject 140 while the subject 140 is engaging in physical activity. The real-time data 118a may include vertical position, lateral position, acceleration, orientation, etc., as well as bioelectrical information. The bioelectrical information may include muscle activity signals, heart-rate signals, etc., as described further below.
In addition to the real-time data 118a, the interactive user interface 112 may present selected athletic performance data 118b, such as a personal-best metric or a record-setting metric, to compare the subject 140 with an external measure of activity. The selected performance data 118b may also include a range of values within which the subject 140 is less likely to sustain an injury while engaging in physical activity. In some embodiments, an implementation of machine learning determines the range of data values, as described in more detail below.
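The disclosure states that a machine learning implementation determines the injury-relevant range of values; as a simplified stand-in for illustration, the sketch below derives a band from percentiles of the subject's historical readings and checks a live value against it. The percentile heuristic is an assumption, not the disclosed model.

```python
from statistics import quantiles

def safe_range(historical_values):
    """Illustrative 'lower-injury-risk' band: roughly the 10th-90th percentile
    of the subject's historical readings (a stand-in for the machine learning
    model described in the text)."""
    deciles = quantiles(historical_values, n=10)   # nine cut points
    return deciles[0], deciles[-1]

def within_safe_range(live_value, historical_values):
    lo, hi = safe_range(historical_values)
    return lo <= live_value <= hi

history = [7.1, 7.4, 7.0, 7.6, 7.2, 7.5, 7.3, 7.8, 7.2, 7.4]  # e.g., sprint times in seconds
print(safe_range(history))              # band the UI could show next to the live value
print(within_safe_range(8.4, history))  # False -> candidate for a warning indicator
```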
In some embodiments, the data 118 includes historical performance data 118c, collected from the subject 140 over a given period of time, such as during a period of peak condition, or during a period preceding an injury. The interactive user interface 112 may display the historical performance data 118c alongside other data 118. The user 102 may select and modify data 118 as desired.
While the real-time data 118a is collected directly from the subject 140, the data 118 may also include aggregated performance data 118d collected from a number of anonymized subjects and processed to provide useful indicators for the subject 140. For example, the aggregated performance data 118d may provide correlations between various measured parameters of the real-time data 118a and the likelihood of injury, such as an asymmetric load on one hamstring 144, uneven exertion between two legs 142, etc.
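For instance, uneven exertion between two legs could be summarized with a limb-symmetry index like the one sketched below; the 15% threshold and the specific formula are common illustrative choices rather than values from the disclosure.

```python
def symmetry_index(left_load: float, right_load: float) -> float:
    """Percent difference between limbs relative to their average
    (a commonly used limb-symmetry formulation)."""
    avg = (left_load + right_load) / 2
    if avg == 0:
        return 0.0
    return abs(left_load - right_load) / avg * 100

def asymmetric_exertion(left_load: float, right_load: float, threshold_pct: float = 15.0) -> bool:
    """Flag uneven exertion between two legs, e.g., from hamstring EMG amplitudes."""
    return symmetry_index(left_load, right_load) > threshold_pct

# Example: integrated EMG amplitude (arbitrary units) over the last interval.
print(symmetry_index(0.92, 0.71))       # ~25.8 -> pronounced asymmetry
print(asymmetric_exertion(0.92, 0.71))  # True -> could be highlighted over the hamstring 144
```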
To pair a subject 140 with the data 118a-d, the subject's face 148 may be recognized by the digital platform 110 through facial recognition 160.
In some embodiments, the interactive user interface 112 allows a user 102 to manipulate the augmented reality environment by selecting the type of data 118 to be presented and the manner of its presentation in a way most favorable for the user 102.
In some embodiments, the system can employ cloud learning that enables the subject 140, the user 102, and others to evaluate performance and compare it to that of other subjects, including anonymous subjects. The digital platform 110 may communicate with the controller 305 wirelessly via the wireless transceiver 335, which may include Bluetooth and RFID capabilities.
In general, the word “engine,” as used herein, refers to logic and algorithms embodied in hardware or software instructions, which can be written in a programming language such as C, C++, COBOL, JAVA™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™, PYTHON, and/or the like. An engine may be compiled into an executable program or written in an interpreted programming language. Software engines may be callable from other engines or from themselves. Generally, the engines described herein refer to logical modules that can be merged with other engines or divided into sub-engines. The engines can be stored in any type of computer-readable medium or computer storage device and can be stored on and executed by one or more general-purpose computers, thus creating a special-purpose computer configured to provide the engine or the functionality thereof.
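As a small illustration of engines as callable, composable logical modules, the following sketch defines two hypothetical Python engines, one of which delegates to the other; the class names and the trivial analytics are illustrative only.

```python
class FilterEngine:
    """An engine that smooths raw sensor values (callable from other engines)."""
    def run(self, values, window=3):
        smoothed = []
        for i in range(len(values)):
            chunk = values[max(0, i - window + 1): i + 1]
            smoothed.append(sum(chunk) / len(chunk))
        return smoothed

class AnalyticsEngine:
    """An engine that delegates pre-processing to another engine, then scores it."""
    def __init__(self, filter_engine):
        self.filter_engine = filter_engine

    def run(self, values):
        smoothed = self.filter_engine.run(values)   # one engine calling another
        return max(smoothed) - min(smoothed)        # a trivial 'variability' score

print(AnalyticsEngine(FilterEngine()).run([1.0, 1.2, 0.9, 3.5, 1.1]))
```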
The ECG and EMG sensors 323a and 323b may include dry-surface electrodes distributed throughout the subject's clothing 345 and positioned to make the necessary skin contact beneath the clothing at predetermined locations of the body. In some embodiments, the sensors can include an optical detector, such as an optical sensor for measuring heart rate. The fit of the clothing can be selected to be sufficiently tight to provide continuous skin contact with the individual sensors 323a and 323b, allowing for accurate readings, while still maintaining a high level of comfort comparable to that of traditional compression-fit shirts, pants, and similar clothing. In various embodiments, the clothing 345 can be made from compressive-fit materials, such as polyester and other materials (e.g., elastane), for increased comfort and functionality. In some embodiments, the controller 305 and the sensors 323 can have sufficient durability and water resistance so that they can be washed with the clothing 345 in a washing machine without being damaged. In these and other embodiments, the presence of the controller 305 and/or the sensors 323 within the clothing 345 may be virtually unnoticeable to the subject. In one aspect of the technology, the sensors 323 can be positioned on the subject's body without the use of tight, awkward-fitting sensor bands. Traditional sensor bands are typically uncomfortable, and subjects can be reluctant to wear them.
The ECG sensors 323a can include right arm RA, left arm LA, and right leg RL (floating ground) sensors positioned on the subject's chest and waist. The EMG sensors 323b can be positioned adjacent to targeted muscle groups, such as the large muscle groups of the pectoralis major, rectus abdominis, quadriceps femoris, biceps, triceps, deltoids, gastrocnemius, hamstring, and latissimus dorsi. The EMG sensors 323b can also be coupled to floating ground near the subject's waist or hip.
The orientation and acceleration sensors 323c and 323d can be disposed at a central position 349 located between the athlete's shoulders in the upper back region. In some embodiments, the central upper back region can be an optimal location for placement of the orientation and acceleration sensors 323c and 323d because of the relatively small amount of muscle tissue in this region of the body, which prevents muscle movement from interfering with the accuracy of the orientation and acceleration readings. In other embodiments, the orientation sensor 323c and/or the acceleration sensor 323d can be positioned centrally on the user's chest, tailbone, or other suitable locations of the body. In various embodiments, the orientation and acceleration sensors 323c and 323d can be positioned adjacent to the controller 305 or integrated into the same packaging (e.g., housing) 322 as the controller 305.
In one aspect of this embodiment, the use of a single orientation sensor and a single acceleration sensor can reduce the computational complexity of the various analytics performed by the digital platform 110.
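One standard way a single orientation reading and a single acceleration reading can be combined is to rotate the body-frame acceleration into the world frame and subtract gravity, as sketched below; treating this as the platform's analytics step is an assumption for illustration.

```python
def rotate_by_quaternion(q, v):
    """Rotate vector v (body frame) by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # q * (0, v) * q_conjugate, expanded: t = 2 * (q_vec x v); v' = v + w*t + q_vec x t
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

def linear_acceleration(orientation_q, accel_body, g=9.81):
    """World-frame acceleration with gravity removed, from one IMU packet."""
    ax, ay, az = rotate_by_quaternion(orientation_q, accel_body)
    return ax, ay, az - g

# Example: sensor level and at rest -> near-zero linear acceleration.
identity = (1.0, 0.0, 0.0, 0.0)
print(linear_acceleration(identity, (0.0, 0.0, 9.81)))  # ~ (0, 0, 0)
```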
In some embodiments, the method 500 includes the digital platform 110 referring to a data storage system 540 that receives the information gathered in block 504. The data storage system 540, as previously described, can provide aggregated performance data 118d from multiple anonymous subjects, historical performance data 118c from the subject 140, as well as analytics and model-predictive adjustments of indicators of fatigue. In such cases, the method 500 displays data 118 in the augmented reality environment as part of the interactive user interface 112.
In some embodiments, the interactive user interface 112 presents visual or auditory feedback to the user 450 when real-time data 118a indicates a high likelihood of an adverse outcome, as shown in block 516. For example, the performance monitor 300 may detect a bioelectric signal indicating a high likelihood of hamstring injury, based on model predictions, and the interactive user interface 112 may provide a blinking indicator over the relevant muscle group on the subject 440. For example, the digital platform 410 may provide feedback when the performance monitor 300 detects that a left hamstring is bearing an excess load, based on models of healthy and effective load balancing. In some embodiments, the user 450 designates values of real-time data 118a for which feedback will be provided. In some embodiments, the values for which feedback will be provided are designated automatically via a model-prediction, as previously described.
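A minimal sketch of the feedback step of block 516, assuming per-metric limits designated by the user 450 or produced by a model prediction; the metric names and limits below are hypothetical.

```python
# Hypothetical per-metric limits, either designated by the user 450 or
# produced automatically by a model prediction.
FEEDBACK_LIMITS = {
    "left_hamstring_load": (0.0, 0.80),   # arbitrary units
    "right_hamstring_load": (0.0, 0.80),
    "heart_rate_bpm": (40, 185),
}

def feedback_events(realtime_data):
    """Yield (metric, value) pairs that should trigger a visual or
    auditory indicator in the interactive user interface."""
    for metric, value in realtime_data.items():
        limits = FEEDBACK_LIMITS.get(metric)
        if limits and not (limits[0] <= value <= limits[1]):
            yield metric, value

sample = {"left_hamstring_load": 0.91, "right_hamstring_load": 0.62, "heart_rate_bpm": 172}
for metric, value in feedback_events(sample):
    print(f"ALERT: {metric} = {value}")  # e.g., blink an overlay over the relevant muscle group
```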
Many embodiments of the technology described above may take the form of computer- or controller-executable instructions, including routines executed by a programmable computer or controller. Those skilled in the relevant art will appreciate that the technology can be practiced on computer/controller systems other than those shown and described above. The technology can be embodied in a special-purpose computer, application specific integrated circuit (ASIC), controller or data processor that is specifically programmed, configured or constructed to perform one or more of the computer-executable instructions described above. Of course, any logic or algorithm described herein can be implemented in software or hardware, or a combination of software and hardware.
From the foregoing, it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration, but that various modifications may be made without deviating from the disclosure. Moreover, while various advantages and features associated with certain embodiments have been described above in the context of those embodiments, other embodiments may also exhibit such advantages and/or features, and not all embodiments need necessarily exhibit such advantages and/or features to fall within the scope of the technology. Accordingly, the disclosure can encompass other embodiments not expressly shown or described herein.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US19/47487 | 8/21/2019 | WO | 00

Number | Date | Country
---|---|---
62722763 | Aug 2018 | US