FIELD OF THE DISCLOSURE
The present disclosure is generally related to collecting player data and more specifically relates to a system for evaluating a player's injury risk based on the collected player data.
BACKGROUND
Wearable sensors are used to acquire a wide range of information about the user wearing the sensors. Examples of wearable sensors include accelerometers, gyroscopes, compasses, global positioning systems (GPS), heart rate monitors, and electromyography sensors. Wearable location and health-based sensors can acquire a range of different information about the wearer, which can be used to monitor the user's condition.
Wearable sensors can be affixed to a player participating in a sport to acquire information about the player. This information collection is particularly useful in sports where physical contact may be experienced by a player, such as in American football. Wearable sensors can be used to determine the intensity, direction, and location of an impact received by the player. These sensors can also identify non-impact events, such as a player tripping or falling or otherwise moving in a manner that may result in injury. The information can be compiled into graphs for viewing or analysis to determine the player's risk of developing an injury both in the short and long term.
Sports with physical contact, such as American football, can be taxing on players' bodies. Some methods of monitoring a player include affixing accelerometers to the player, such as in a helmet, and using data collected during practice or gameplay to determine the severity and frequency of impacts. When correlated with reported injuries, this impact data may be used to predict future chronic injuries, which may inform decisions to manage player health or changes to protective equipment or game rules to improve player health and safety.
Additional methods of monitoring a player comprise using video data to infer the forces imparted during impacts experienced by the player. While less costly to acquire and more readily available, this data may be less reliable as it may be unable to accurately account for players' body mass or correctly identify the exact location of impact. A system is desired which can improve the accuracy of a video-based player monitoring system.
A method is desired for alerting a player's coach to an event that may result in an injury to a player, including the location or severity of a probable injury, of which even the player may be unaware. Players may sustain an injury but not immediately feel the pain, or may become weakened and more prone to injury without awareness of their increased risk. A method of assessing such an increase in risk, notifying staff, and allowing for a decision to be made to prevent injury to the player is desired.
DESCRIPTION OF THE DRAWINGS
FIG. 1: Illustrates an injury prediction system, according to an embodiment.
FIG. 2: Illustrates a sensor database, according to an embodiment.
FIG. 3: Illustrates a camera database, according to an embodiment.
FIG. 4: Illustrates a correlation database, according to an embodiment.
FIG. 5: Illustrates a base module, according to an embodiment.
FIG. 6: Illustrates a sensor prediction module, according to an embodiment.
FIG. 7: Illustrates a camera prediction module, according to an embodiment.
FIG. 8: Illustrates a correlation module, according to an embodiment.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described more fully hereinafter with reference to the accompanying drawings in which like numerals represent like elements throughout the several figures and in which example embodiments are shown. Embodiments of the claims may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. The examples set forth herein are non-limiting examples and are merely examples among other possible examples.
FIG. 1 illustrates an injury prediction system. This system may be comprised of a player monitoring system 102, which is a system comprised of at least one camera 104, at least one sensor 106, a processor 108, a memory 110, and a communication interface 112. The player monitoring system 102 may be in communication with a cloud 114 via the communication interface 112. The player monitoring system 102 may operate a base module 122, which may invoke several further modules, to collect data from the at least one camera 104 and the at least one sensor 106 and to predict, based upon the collected data, whether at least one player has sustained an injury or is at increased risk of injury resulting from physical activity such as a sport where physical contact occurs.
A camera 104 is an imaging device for recording visual images. A camera may record single images or a series of images that may comprise a video. Each image comprises an array of pixels, each of which may be represented by a single value, resulting in a monochrome or grayscale image, or by a color model representing colors. The color model may be any of an RGB or red-green-blue color model, HSL or hue-saturation-lightness model, HSV or hue-saturation-value model, HSI or hue-saturation-intensity model, etc.
A wearable sensor 106 is a sensor that may be affixed to a person either directly, such as by using an adhesive, strap, or band, or may be integrated into an article worn by the person, such as a shirt, socks, undergarments, pants, hat, helmet, etc. A wearable sensor 106 may include any of an accelerometer, force transducer, gyroscope, compass, global positioning system (GPS), heart rate monitor, electromyography sensor, etc. A wearable sensor 106 may further be any device that is worn by a person and collects or transmits data about the person wearing the sensor or the environment surrounding the person. Wearable sensors 106 may be configured to correspond with the part of the wearer's body that they are nearest, including a knee, ankle, hip, toe, face, shoulder, etc., including whether the sensor is located on the right, left, front, or back side of the body. For example, a wearable sensor 106 may be an accelerometer located near the right knee of a player.
A processor 108 or controller is a computing device that executes the instructions that comprise a program. The processor 108 may perform operations including basic arithmetic, logic, controlling, and input/output operations based on the provided instructions. The processor 108 may further receive data from a memory 110 or database and perform calculations using the received data. The resulting data may further be saved to the memory 110 or a database. The processor 108 may further use a communication interface 112 to communicate with external devices, including a camera 104, a wearable sensor 106, or a cloud 114. A memory 110 is a data storage device. The data storage device may be comprised of temporary, volatile storage, long-term persistent storage, or a combination of both volatile and non-volatile storage. An example of temporary storage is random access memory (RAM), which is used by the processor 108 to store temporary data while performing the instructions provided by a program. Examples of persistent, non-volatile memory include hard disk drives (HDD), solid-state drives (SSD), flash drives, memory cards, etc. Memory 110 may be located locally to a processor 108 or may be located externally, such as in a cloud 114. Memory 110 may additionally store programs to be accessed and executed by a processor 108.
A communication interface 112 may be comprised of a physical connection or a wireless terminal which allows data to be sent and received between devices. An example of a physical communication interface 112 is an ethernet port, which may be connected to a network switch, router, modem, or directly to another computing device to allow communication between the local device and the devices to which the local device is connected via the communication interface 112. Wireless communication interfaces 112 include Wi-Fi, Bluetooth, Zigbee, 4G, 5G, LTE, etc., and comprise a transceiver and an antenna for sending and receiving data wirelessly between devices. A communication interface 112 may allow a device to connect to and communicate with a cloud 114. A cloud 114 is a distributed network comprised of computing and data storage resources. A cloud 114 may be a public cloud, where resources are shared among a large number of unrelated entities via the internet, or a private cloud, where resources are owned by the organization utilizing them and are not shared with other entities. Data in a private cloud may be isolated from the internet or may be encrypted. A cloud 114 is characterized by redundancy and scalability, where there is no single point of failure, and computing and data storage resources may be requisitioned when needed and repurposed when no longer necessary.
A sensor database 116 stores data collected from one or more wearable sensors 106. Examples of the data stored may be position and orientation data from an accelerometer, GPS, compass, etc., or force data from transducers or strain gages. The data may include the player wearing the sensor, the sensor type, the sensor location, and the measurement from the sensor at a specific time. The sensor database 116 may additionally include the type of event and the players involved.
A camera database 118 stores data collected from one or more cameras 104 oriented toward the field of play, with at least one camera 104 capturing at least one player. The camera database 118 comprises images or video segments, each having a timestamp of when the image or video was acquired. Each image or video segment may be stored as a file.
The correlation database 120 stores correlation data calculated by the correlation module 128. The correlation data comprises the relationship between predictions made by the sensor prediction module 124 using data from at least one wearable sensor 106 and the camera prediction module 126 using data from at least one camera 104.
The base module 122 initializes one or more wearable sensors 106 and one or more cameras 104, which are used by the sensor prediction module 124 and the camera prediction module 126 to monitor at least one player for trigger events and predict the likelihood of the player sustaining an injury during a trigger event. The trigger event may be a trip, a fall, or an impact with another player or object. The base module 122 further uses the correlation module 128 to compare the predictions made by the sensor prediction module 124 and the camera prediction module 126 with previous data from the correlation database 120 to determine a correlated probability of injury. The base module 122 repeats until the event ends.
The sensor prediction module 124 polls at least one wearable sensor 106 for movement data related to a player. The sensor prediction module 124 checks whether a trigger event is detected and determines an injury prediction or risk score when one is detected. The sensor data is saved to the sensor database 116, and the prediction data is sent to the base module 122.
The camera prediction module 126 polls at least one camera 104 for movement of a player within the field of view of the camera 104. The camera prediction module 126 checks whether a trigger event is detected and determines an injury prediction or risk score when one is detected. The camera data is used to identify at least three points on a player's body, which facilitates the assessment of a part of the player's body for injury by monitoring the relationship of the points during the trigger event. The camera data is saved to the camera database 118, and the prediction data is sent to the base module 122.
The correlation module 128 receives sensor and camera data from the base module 122. The sensor and camera data comprise prediction data for at least one trigger event detected by the sensor prediction module 124 and/or the camera prediction module 126. The correlation module 128 synchronizes the data received from the sensor prediction module 124 and the camera prediction module 126 using time codes associated with the collected data and calculates correlation coefficients. If a correlation coefficient is greater than a predetermined threshold value, the correlation module 128 determines that there is a high probability of injury. If the correlation coefficient is lower than the predetermined threshold value, the correlation module 128 determines that there is a low probability of injury. The data is saved to the correlation database 120, and the prediction data is sent to the base module 122.
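By way of a non-limiting illustration, this threshold comparison may be sketched in Python as follows; the function name and the 0.75 default are illustrative assumptions (the 0.75 value mirrors a threshold example used elsewhere in this disclosure), not required values:

    def classify_correlation(coefficient, threshold=0.75):
        # Map a correlation coefficient to an injury-probability label;
        # the threshold is illustrative and may be predetermined or
        # learned, as described elsewhere in this disclosure.
        if coefficient > threshold:
            return "high probability of injury"
        return "low probability of injury"

    print(classify_correlation(0.86))  # -> high probability of injury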
FIG. 2 illustrates the sensor database 116. The sensor database 116 comprises data collected from one or more wearable sensors 106. The data stored may include position and orientation data from sensors such as an accelerometer, GPS, compass, etc. The data may additionally comprise force data from transducers or strain gages. The data may additionally include player data such as the player wearing the sensor, the type of sensor, the location of the sensor on the player, and measurements from the sensor at a specific time indicated by a timestamp. The data may also include the type of event and other players participating in the event. The sensor database 116 is populated by the sensor prediction module 124 and the correlation module 128. It is used by the sensor prediction module 124 to predict the likelihood that a player has sustained an injury or the risk that the player will sustain an injury in the near future, such as during continued gameplay in the current event. The sensor database 116 additionally stores the probability of injury determined by the sensor prediction module 124, the correlated or combined probability determined by the correlation module 128, and any reported injury. The probability may be represented by a number returned from an algorithm or may be represented by a classification, such as high or low, determined by identifying the classification into which a numerical prediction fits based upon one or more threshold values. For example, a threshold of 0.75 may be used such that any probability equal to or greater than 0.75 is classified as a high probability, and any probability lower than 0.75 is classified as a low probability. The data may additionally be used by the correlation module 128.
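A minimal sketch of one possible sensor record layout and the threshold classification described above follows, in Python; the field names, example values, and record structure are illustrative assumptions and not the disclosed schema:

    from dataclasses import dataclass

    @dataclass
    class SensorRecord:
        player: str         # player wearing the sensor
        sensor_type: str    # e.g., "accelerometer"
        location: str       # e.g., "left knee"
        timestamp: str      # time of the measurement
        value: float        # measurement at the timestamp (e.g., Gs)
        probability: float  # predicted probability of injury

    def classify_probability(probability, threshold=0.75):
        # Probabilities at or above the threshold are classified as high.
        return "high" if probability >= threshold else "low"

    record = SensorRecord("John Smith", "accelerometer", "left knee",
                          "2022-01-03 21:23:16.00", 80.0, 0.82)
    print(classify_probability(record.probability))  # -> high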
FIG. 3 illustrates the camera database 118. The camera database 118 comprises data collected from one or more cameras 104 oriented towards at least one player, or more generally, a field of play at a sporting event. A sporting event may comprise a competitive game or match or may comprise a practice session involving at least one player. The stored data may include position and orientation data from the player identified using edge detection and object recognition utilizing machine learning and artificial intelligence. The data may additionally comprise point tracking data corresponding with parts of a player's body. The points may be relative to a common point, such as the player's head or center of mass. The data is populated by the camera prediction module 126 and is used by the camera prediction module 126 to predict the likelihood that a player has sustained an injury or the risk that the player will sustain an injury in the near future, such as during continued gameplay in the current event. The camera database 118 additionally stores the probability of injury determined by the camera prediction module 126, the correlated or combined probability determined by the correlation module 128, and any reported injury. The probability may be represented by a number returned from an algorithm or may be represented by a classification, such as high or low, determined by identifying the classification into which a numerical prediction fits based upon one or more threshold values. For example, a threshold of 0.75 may be used such that any probability equal to or greater than 0.75 is classified as a high probability, and any probability lower than 0.75 is classified as a low probability. The data may additionally be used by the correlation module 128.
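As a non-limiting sketch of a camera record with body points tracked relative to a common reference point, in Python; the field names, coordinate convention, and point offsets are illustrative assumptions (the file name is taken from an example later in this disclosure):

    from dataclasses import dataclass, field

    @dataclass
    class CameraRecord:
        camera_id: str
        timestamp: str
        file_name: str   # stored image or video segment
        # Tracked body points, each an (x, y) offset in inches from a
        # common reference point such as the player's center of mass.
        points: dict = field(default_factory=dict)

    frame = CameraRecord(
        camera_id="02",
        timestamp="2022-01-03 21:23:16.00",
        file_name="PAT-086_02_1-3-2022_21:23:15.00.mp4",
        points={"left_hip": (0.0, -2.0), "left_knee": (1.0, -20.0),
                "left_ankle": (2.0, -38.0)},
    )
    print(frame.points["left_knee"])  # -> (1.0, -20.0)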
FIG. 4 illustrates the correlation database 120. The correlation database 120 stores correlations of camera data with the risk of injury as determined by the camera prediction module 126 and of sensor data with the risk of injury as determined by the sensor prediction module 124. In an embodiment, the sensor and camera data are comprised of the magnitude of deceleration in Gs, or multiples of the acceleration due to gravity. The sensor data is measured from one or more sensors affixed to a player, such as via an accelerometer affixed to the player's helmet. In further embodiments, the sensors may be affixed to a player's uniform, such as near the hip, knee, toe, ankle, shoulders, etc. The sensors may be used to track their relative positions and, by association, track the positions and movements of the corresponding body parts. The camera data is calculated from the movement of players captured over time by one or more cameras. The camera data may further comprise relative tracking information for parts of a player's body, such as their hips, knees, the toes of their shoes, ankles, shoulders, head, etc.
A correlation represents the degree to which a first parameter, in this example the measured or calculated deceleration, is related to or influences a second parameter, in this case the predicted risk of injury. The predicted risk of injury is determined by the sensor prediction module 124 and the camera prediction module 126. In Fig. A, the correlation coefficient R equals 0.74 for deceleration calculated from camera data and the risk of injury determined by the camera prediction module 126. In Fig. B, the correlation coefficient R equals 0.86 for deceleration measured from sensor data and the risk of injury determined by the sensor prediction module 124. This correlation indicates that the sensor data and the sensor prediction module 124 are a more reliable method of predicting injury than the camera prediction module 126. In alternate embodiments, correlations may be assessed for the relative position of body parts during a trigger event, such as an angle formed by at least three points corresponding with parts of the player's body.
Additionally, each correlation coefficient may be compared to a threshold value, which may be predefined or identified using machine learning or artificial intelligence to determine whether the predictions are reliable. In an example, the threshold is 0.75, in which case the sensor data is determined to be reliable while the camera data is determined to be unreliable. In this example, the camera data may be unreliable due to a technical issue, such as a camera requiring calibration or poor placement of the cameras.
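A minimal sketch of computing such a correlation coefficient and applying the reliability threshold follows, in Python; the paired observations are hypothetical values assumed for illustration:

    import math

    def pearson_r(xs, ys):
        # Pearson correlation coefficient between two equal-length series.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical paired observations: deceleration (Gs) versus the
    # predicted risk of injury for four trigger events.
    decelerations = [20.0, 35.0, 60.0, 80.0]
    predicted_risk = [0.20, 0.45, 0.70, 0.90]
    r = pearson_r(decelerations, predicted_risk)
    print(r >= 0.75)  # reliability threshold from the example above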
FIG. 5 illustrates the base module 122. The process begins with initializing, at step 502, the wearable sensors 106 attached to at least one player. Initialization may comprise powering on the wearable sensors 106 and may also include establishing communication between the wearable sensors 106 and a player monitoring system 102 via a communication interface 112. The communication may comprise a handshake, where a message is transmitted to a wearable sensor and the wearable sensor, upon receiving the message, transmits a response confirming that the message was received. If both messages are received, then communication is established. In an embodiment, a wearable sensor 106 is an accelerometer affixed to the helmet of an American football player, John Smith. The wearable sensor 106 is turned on, and confirmation is received by a terminal in the possession of John Smith's trainer that the accelerometer is transmitting position and orientation data to the terminal. The terminal may be a mobile device such as a tablet.
Initializing, at step 504, the cameras 104 oriented towards at least one player or, more generally, oriented toward the field of play. Initialization may comprise powering on the cameras 104 and may additionally include calibration to verify that each camera 104 is recording data as expected. Calibration may require a standardized card held in front of the camera 104 to verify color and brightness accuracy. Alternatively, the collected data from the powered-on camera 104 may be compared to previous data from the camera database 118 to determine whether the current data matches the baseline data from the camera database 118. In an embodiment, the cameras 104 are installed at an American football stadium and oriented towards the field of play. The cameras 104 are powered on, images are acquired and compared to images retrieved from the camera database 118 that were previously acquired by the cameras 104, and a correlation coefficient for each camera is calculated. If the correlation coefficient is above a threshold value, such as 0.90, the camera is sufficiently calibrated and completes initialization. The initialization process may also comprise establishing communication with a player monitoring system 102.
Triggering, at step 506, the sensor prediction module 124. The sensor prediction module 124 polls at least one wearable sensor 106 and evaluates whether a trigger event is detected. The sensor prediction module 124 determines an injury prediction or risk score if a trigger event is detected. The acquired and calculated data is saved to the sensor database 116.
Triggering, at step 508, the camera prediction module 126. The camera prediction module 126 polls at least one camera 104 and evaluates whether a trigger event is detected. The camera prediction module 126 determines an injury prediction or risk score if a trigger event is detected. The acquired and calculated data is saved to the camera database 118.
Receiving, at step 510, an injury prediction from the sensor prediction module 124. The injury prediction comprises a deceleration of 20 Gs detected at the left hip of player John Smith, identified as a trip or fall and predicted to carry a high probability of injury to his knee, as the accelerometers located at his left hip and left knee showed a displacement of two inches, consistent with a knee injury.
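A minimal sketch of the displacement check underlying this step-510 prediction follows, in Python; the function name and the 19-inch baseline are illustrative assumptions, and the 0.5-inch threshold mirrors the normal-displacement example given later in this disclosure:

    def displacement_risk(baseline_in, measured_in, threshold_in=0.5):
        # Flag a high injury risk when the hip-to-knee sensor distance
        # deviates from its baseline by more than the threshold (inches).
        deviation = abs(measured_in - baseline_in)
        return ("high" if deviation > threshold_in else "low"), deviation

    # John Smith's example: sensors measured 21 inches apart, a two-inch
    # deviation (a 19-inch baseline is assumed for illustration).
    risk, deviation = displacement_risk(19.0, 21.0)
    print(risk, deviation)  # -> high 2.0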
Receiving, at step 512, an injury prediction from the camera prediction module 126. The injury prediction comprises an apparent fall of John Smith in the images acquired by at least one camera 104, indicated by the points corresponding with his hip and shoulders contacting the ground. The camera prediction module 126 determines that John Smith has a high probability of having a knee injury, as the angle formed by the points defining his hip, knee, and ankle was determined to have extended to 185°, which is greater than a threshold of 170°.
Triggering, at step 514, the correlation module 128. The correlation module 128 receives sensor data collected by the sensor prediction module 124 and camera data from the camera prediction module 126. It compares time-synchronized data to determine whether the player has a high probability of injury or a low probability of injury based on the correlated data. The data is further saved to the sensor database 116 and the camera database 118.
Receiving, at step 516, correlated prediction data from the correlation module 128. The correlated prediction comprises a high risk of injury, a moderate risk of injury, or a low risk of injury. In an embodiment, John Smith is determined to have a high risk of injury as both the probability of injury from the sensor prediction module 124 and the probability of injury from the camera prediction module 126 indicated a high risk of injury.
Determining, at step 518, whether the player is at a high risk of injury. The player is at a high risk of injury if the correlated prediction data received from the correlation module 128 indicates a high or moderate risk of having sustained an injury. In an embodiment, John Smith is determined to be at a high risk of having sustained a knee injury.
Generating, at step 520, a warning notification sent to the player, a coach, a trainer, or a medical provider indicating that the player has experienced an impact that has a likelihood of having resulted in an injury. The warning notification may be sent to a mobile device, such as a tablet, and may comprise audio, visual, or haptic components. In an embodiment, a tablet held by the trainer for John Smith receives a notification comprising a text box on the tablet's screen, an audio notification playing an alarm tone, and a vibration to acquire the trainer's attention.
Displaying, at step 522, injury probability data relating to the impact experienced by the player. The injury probability data may include any of the data collected by the wearable sensors 106, camera footage and calculated data acquired from one or more cameras 104, and probability data determined by any of the sensor prediction module 124, the camera prediction module 126, or the correlation module 128. In an embodiment, the tablet held by the trainer for the player John Smith displays accelerometer data from the sensor prediction module 124 indicating that the distance between the sensor defining the knee and the sensor defining the hip extended by 2 inches, further indicating a high risk of injury. The data further comprises a visual representation of the data from the camera prediction module 126 indicating that the angle formed by the points representing the left hip, knee, and ankle extended to 185°, exceeding a threshold for injury of 170° and similarly indicating a high probability of injury. The data further comprises a 10-second video file comprising footage of the detected trigger event involving John Smith from at least one camera 104.
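As a non-limiting aside, the joint-angle computation referenced in the displayed camera data may be sketched as follows, in Python; the signed-angle convention, point ordering, and example coordinates are illustrative assumptions:

    import math

    def joint_angle(a, b, c):
        # Signed angle at point b, in degrees, formed by points a-b-c.
        # With a consistent ordering of the points (e.g., hip, knee,
        # ankle viewed from the player's left), hyperextension of the
        # joint appears as an angle greater than 180 degrees.
        v1 = (a[0] - b[0], a[1] - b[1])
        v2 = (c[0] - b[0], c[1] - b[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        cross = v1[0] * v2[1] - v1[1] * v2[0]
        return math.degrees(math.atan2(cross, dot)) % 360.0

    # Hypothetical image coordinates (inches) for hip, knee, and ankle.
    angle = joint_angle((0.0, 0.0), (0.0, -20.0), (1.7, -40.0))
    print(round(angle))  # -> 185, exceeding the 170-degree threshold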
The injury probability data may additionally comprise an error message indicating that there may be a fault with the system, such as when the player's injury risk is determined to be moderate because the sensor prediction and the camera prediction did not agree, or when a specific system fault was identified.
Receiving, at step 524, injury information related to a trigger event. The injury information may be provided manually from injury reports filed by the player's trainer, doctor, coach, or any other relevant personnel. The injury information may alternatively be acquired by monitoring the player's performance and comparing the player's performance prior to the trigger event with the player's performance after the trigger event. The injury is associated with either a specific trigger event, such as the most recent trigger event prior to the injury report, or a series of trigger events, such as the trigger events occurring during a game prior to the injury report. In an embodiment, the trainer for player John Smith reported that John Smith sustained a dislocated knee related to the trigger event occurring at 21:23:16.00 on Jan. 3, 2022.
Determining, at step 526, whether the event has ended by checking whether any playing time remains on the game clock. In a practice event, the end of the event may be determined manually. Alternative embodiments include the end of the event being indicated by the player of interest or all players leaving the field of play, or by manually ending a monitoring application. In an embodiment, an American football game in which John Smith is playing is in the 4th quarter with 5:34 remaining on the clock. Therefore, the event has not ended. In an alternate embodiment, the game clock has expired, and the score is not tied. Therefore, the event has ended.
Ending, at step 528, monitoring the players after the event has ended.
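The overall flow of FIG. 5 may be sketched as a simple monitoring loop, in Python; the function names and classification strings are hypothetical stand-ins for the modules described above, not the disclosed implementation:

    import time

    # Hypothetical stand-ins for the modules of FIG. 1; in this sketch
    # each prediction is simply a classification such as "high" or "low".
    def sensor_prediction():              # FIG. 6, triggered at step 506
        return "high"

    def camera_prediction():              # FIG. 7, triggered at step 508
        return "high"

    def correlate(s_pred, c_pred):        # FIG. 8, triggered at step 514
        if s_pred == c_pred == "high":
            return "high"
        if s_pred == c_pred == "low":
            return "low"
        return "moderate"                 # disagreement: possible fault

    def base_module(clock_remaining_s=2.0, poll_interval_s=0.5):
        # Repeat until the event ends (step 526: game clock expires).
        while clock_remaining_s > 0.0:
            combined = correlate(sensor_prediction(), camera_prediction())
            if combined in ("high", "moderate"):      # step 518
                print("Warning: possible injury, risk =", combined)
            clock_remaining_s -= poll_interval_s
            time.sleep(poll_interval_s)
        # Step 528: monitoring ends with the event.

    base_module()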
FIG. 6 illustrates the sensor prediction module 124. The process begins with receiving, at step 602, a prompt from the base module 122. The sensor prediction module 124 is configured to monitor data from at least one wearable sensor 106, determine whether a trigger event has occurred based upon the data from the at least one wearable sensor 106, and determine a probability of injury if a trigger event was detected.
Polling, at step 604, the at least one wearable sensor 106 attached to a player. The wearable sensor 106 may be polled at a regular interval, such as every 0.5 seconds, or may transmit data as available in real time. The sensor data may comprise a single measurement or multiple measurements taken over a time period, such as a measurement every five seconds for one minute. Alternatively, the data may be measured in response to a trigger event, such as when movement is detected. In an embodiment, sensor data is measured by an accelerometer affixed at the left hip of the uniform of the player, John Smith, and transmitted to a player monitoring system 102 via a Wi-Fi connection. The accelerometer data comprises the player's location, translating to 10 feet from the right boundary at the 50-yard line, the player's velocity of 16 mph in a northerly direction, and the player's acceleration of 0.5 mph per second on Jan. 3, 2022, at 21:23:15.50.
Determining, at step 606, whether a trigger event is detected from the data collected from the at least one wearable sensor 106. A trigger event is detected if the player's velocity or acceleration falls abruptly or their direction changes at a rate exceeding a threshold, which may indicate an impact from the side by another player, the player tripping and/or falling, etc. In an embodiment, the player, John Smith, is moving at 16 mph in a northerly direction on Jan. 3, 2022, at 21:23:15.50. A half-second later, at 21:23:16.00, John Smith is moving at 5 mph with a downward movement detected by the accelerometer worn on his hip. Additionally, monitoring a player may detect a trigger event over time. Suppose the player's performance, including variables such as average and maximum speed and the relative movement of the parts of the player's body as indicated by sensors distributed about the player's uniform, changes between a first period of time and a second period of time during an event or monitoring session. In that case, the trigger event may be defined as the period of time between the first time period and the second time period, where the player's performance has degraded. This time period may occur over a period of seconds or minutes.
Determining, at step 608, a prediction that the player has sustained an injury during the trigger event. The prediction may be based upon a predetermined threshold value; for example, a head impact exceeding 75 Gs may indicate a high likelihood of a concussion. Alternatively, the threshold may be an angle of motion formed by at least three points on a player's body or a displacement between two points on a player's body. For example, a displacement of 0.5 inches between a player's hip and knee may normally be observed during gameplay. However, a displacement exceeding that threshold may indicate a dislocated joint or a broken bone. These threshold values may be static and comprise a set of reference threshold values depending on the location and magnitude of the movement or forces experienced by the player during the trigger event.
In alternate embodiments, a machine learning algorithm or artificial intelligence may be utilized to predict the likelihood of injury. Artificial intelligence may use a model created using past wearable sensor 106 data from the sensor database 116 to predict the likelihood of injury. Further, the type of injury may also be predicted. In an embodiment, player John Smith is determined to have a high risk of concussion from the impact as the deceleration of magnitude 80 Gs is greater than the threshold value of 75 Gs. In a further embodiment, the angle formed by the hip, knee, and ankle may be measured at 185°, exceeding a threshold value of 170° and indicating a possible dislocation of the knee. In alternate embodiments, the likelihood of injury may refer to the player's susceptibility to future injury. For example, the wearable sensor 106 data for an impact with a magnitude of 60 Gs may indicate a low likelihood of having sustained a concussion, as the force is less than the threshold value of 75 Gs. However, the player may be determined to have a high likelihood of injury within the next hour if they continue to play based upon the data collected by the wearable sensors 106. Similarly, the angle formed by the hip, knee, and ankle may be measured at 165°, which is less than the threshold value of 170° and therefore does not indicate a high probability of a current injury. However, it may indicate a stretching of the knee's tendons and ligaments, resulting in a higher susceptibility to future injury.
Saving, at step 610, the data collected by the wearable sensors 106 and the prediction of the likelihood of injury to the sensor database 116. In an example, on Jan. 3, 2022, at 21:23:15.00, the sensors on John Smith's left hip and knee are measured to be 21 inches apart, which is a deviation of two inches from the baseline distance. The determined high risk of injury is further saved.
Sending, at step 612, the injury prediction data to the base module 122. The injury prediction data comprises that player John Smith experienced a deviation of 2 inches in the distance between his left hip and left knee and has a high risk of injury. The prediction may additionally include that John Smith may have dislocated his knee.
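A minimal sketch of the abrupt-speed-drop trigger detection of step 606 follows, in Python; the deceleration threshold is an illustrative assumption, while the speeds and interval come from the John Smith example above:

    def trigger_detected(speed_before_mph, speed_after_mph, interval_s,
                         decel_threshold_mph_per_s=10.0):
        # A trigger event is detected when speed falls abruptly; the
        # deceleration threshold here is an illustrative assumption.
        decel = (speed_before_mph - speed_after_mph) / interval_s
        return decel > decel_threshold_mph_per_s

    # John Smith's example: 16 mph at 21:23:15.50 and 5 mph at
    # 21:23:16.00, a drop of 22 mph per second.
    print(trigger_detected(16.0, 5.0, 0.5))  # -> True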
FIG. 7 illustrates the camera prediction module 126. The process begins with receiving, at step 702, a prompt from the base module 122. The camera prediction module 126 is configured to monitor data from at least one camera 104, determine whether a trigger event has occurred involving the player, and determine a probability of injury if a trigger event was detected.
Polling, at step 704, the at least one camera 104 oriented towards a player or a field of play. The camera 104 may be polled at a regular interval, such as every 0.1 seconds, or at the camera's maximum frame rate in real time. The camera data may be comprised of a single image frame or may comprise a video taken over a time period, such as in 10-second intervals. In an embodiment, a camera oriented toward an American football field captures the movement of John Smith and a multitude of other players during a football game.
Identifying, at step 706, players on the field of play within the field of view of the at least one camera 104. The players may be identified using facial recognition, object detection, edge detection, etc. Alternatively, the cameras 104 may operate in conjunction with other sensors or systems to track the positions of players on the field of play. In an embodiment, edge detection is used to create a shape representing the outline of each player, and object detection is used to identify each player based on their uniform, indicating the team they play on, and their player number. This identifying information may be stored in the camera database 118 to identify players. In an embodiment, the player John Smith is identified as a player for the New England Patriots with the number 86. In addition to identifying players, the camera prediction module 126 may identify objects with which one or more players may collide, using edge detection and object recognition. Identification of players may further comprise identifying points of interest on a player. For example, the previously mentioned image analysis methods may identify parts of a player's body, including their head, shoulders, elbows, wrists, hands, fingers, hips, knees, ankles, and toes. Each of these body parts may be defined as a point. For joints, the point may be determined as the point of rotation representing the joint. For example, a knee may be defined as the point about which the upper leg and lower leg rotate or, alternatively, as the point where lines through the centers of mass of the upper leg and lower leg intersect. In some embodiments, these points may be identified by visual indicia integrated into the player's uniform. These points may further be used to create a wire diagram connecting the points of interest and facilitating further analysis.
Determining, at step 708, whether a trigger event has been detected. A trigger event may be detected when two or more players, or at least one player and an object on the field of play, make contact within the field of view of at least one camera 104. Alternatively, a trigger event may comprise a player's change in speed or direction, which may indicate a trip, fall, or other condition that may result in an injury. The trigger event detection may utilize point data corresponding with points of interest on the player, such as points identified and corresponding with a player's head, shoulders, elbows, wrists, hands, fingers, hips, knees, ankles, and toes. In an embodiment, the shape identified for player John Smith contacts the shape of another player, indicating an impact. Therefore, a trigger event is determined to have occurred.
In some embodiments, a second image from a second camera 104 may be required to determine whether an impact has occurred in three-dimensional space. Determining that a trigger event has been detected may further comprise the calculation of velocities for each player involved and then determining an acceleration or force. A player's velocity can be determined by taking the player's distance traveled between image frames and dividing by the time elapsed between the capture of the image frames. A trigger event may be defined as an impact or contact between two or more players where at least one player was moving at a speed greater than a threshold value, such as 5 mph, or where an acceleration experienced by at least one player exceeds 1 G. In an embodiment, player John Smith is detected to have experienced an impact with a deceleration of magnitude 30 Gs measured at the hip. A trigger event may alternatively be identified as a series of two or more points deviating from an expected threshold value. The threshold values are defined by physiological limitations of the human body, such as the inability of a knee to extend beyond 180°. In some embodiments, these thresholds may be customized to an individual or may represent an aggregate of multiple individuals. A trigger event may additionally be determined when a player's performance changes abruptly, decreasing from an earlier observed level. For example, if a player's average speed was 14 mph for a period of 5 minutes, and then the player's average speed drops sharply to 8 mph for a second period of 5 minutes, it may be determined that a trigger event occurred between the two periods, or at a time towards the end of the first time period or the beginning of the second time period. Similarly, a change in the player's gait may indicate that a trigger event has occurred, with the trigger event comprising the time between when the player's gait was normal and when the player's gait changed to an abnormal state.
Determining, at step 710, a prediction that the player has sustained an injury from the detected trigger event. The prediction may be based upon a predetermined threshold value; for example, a head impact exceeding 75 Gs may indicate a high likelihood of a concussion. Alternatively, the threshold may be an angle of motion formed by at least three points on a player's body or a displacement between two points on a player's body. For example, a displacement of 0.5 inches between a player's hip and knee may normally be observed during gameplay. However, a displacement exceeding that threshold may indicate a dislocated joint or a broken bone. These threshold values may be static and comprise a set of reference threshold values depending on the location and magnitude of the movement or forces experienced by the player during the trigger event. In alternate embodiments, a machine learning algorithm or artificial intelligence may be utilized to predict the likelihood of injury. Artificial intelligence may use a model created using past camera 104 data from the camera database 118 to predict the likelihood of injury. Further, the type of injury may also be predicted. In an embodiment, player John Smith is determined to have a low risk of concussion from an impact as a calculated deceleration of magnitude 60 Gs is less than the threshold value of 75 Gs. In a further embodiment, the angle formed by the hip, knee, and ankle may be measured at 185°, exceeding a threshold value of 170° and indicating a possible dislocation of the knee.
In an alternate embodiment, the likelihood of injury may refer to the player's susceptibility to future injury. For example, John Smith's impact with a magnitude of 60 Gs at the head may indicate a low likelihood of having sustained a concussion, as the force is less than the threshold value of 75 Gs. However, he may be determined to have a high likelihood of injury within the next hour if he continues to play based upon the data collected by the camera 104. Similarly, the angle formed by the hip, knee, and ankle may be measured at 165°, which is less than the threshold value of 170° and therefore does not indicate a high probability of a current injury. However, it may indicate a stretching of the knee's tendons and ligaments, resulting in a higher susceptibility to future injury. In other embodiments, assessment of an ankle for injury may comprise the relative position of the knee, ankle, and toes. The toes may be indicated as the furthest point of the player's shoe. An injury of the ankle may be defined as the angle formed by a line between the points representing the knee and the ankle and a line between the points representing the ankle and the toes dropping below a threshold value, such as 40°, such that an angle that acute would indicate excessive stretching of the tendons and ligaments in the ankle, resulting in injury. In another embodiment, the head of a player, defined by a point on the faceplate of the player's helmet, may be related to points on the player's shoulders. A high likelihood of a neck injury may be identified by the point on the player's head deflecting beyond a threshold amount, indicating that the player's neck has exceeded a normal physiological range.
Saving, at step 712, the data collected by the cameras 104 and the prediction of the likelihood of injury to the camera database 118. In an example, on Jan. 3, 2022, at 21:23:16.00, the maximum angle formed by John Smith's left hip, knee, and ankle was measured to be 185° according to the video in files PAT-086_02_1-3-2022_21:23:15.00.mp4 and PAT-086_03_1-3-2022_21:23:15.00.mp4. Additionally, the determined high risk of injury is saved.
Sending, at step 714, the injury prediction data to the base module 122. The injury prediction data comprises that player John Smith experienced an extension of the knee of 185° and has a high risk of injury. The prediction may additionally include that John Smith may have dislocated his knee.
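A minimal sketch of the frame-to-frame velocity calculation of step 708 follows, in Python; the field coordinates, frame interval, and contact flag are illustrative assumptions, and the mapping from image pixels to field coordinates is assumed to be performed elsewhere:

    import math

    def speed_between_frames(p1, p2, frame_interval_s):
        # Speed in feet per second of a tracked point between two frames,
        # where p1 and p2 are (x, y) positions in feet on the field.
        distance_ft = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
        return distance_ft / frame_interval_s

    def contact_trigger(speed_fps, contact_detected, threshold_fps=7.33):
        # 5 mph (the example threshold above) is roughly 7.33 feet/second.
        return contact_detected and speed_fps > threshold_fps

    # Two frames 0.1 s apart; the tracked point moved 2.4 feet,
    # giving 24 feet/second (about 16 mph).
    speed = speed_between_frames((50.0, 10.0), (50.0, 12.4), 0.1)
    print(contact_trigger(speed, contact_detected=True))  # -> True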
FIG. 8 illustrates the correlation module 128. The process begins with receiving, at step 802, sensor and camera data from the base module 122. The sensor and camera data comprise data collected by the sensor prediction module 124 and the camera prediction module 126, including prediction data. In an embodiment, the sensor data comprises that, at 21:23:15.00, the sensors on John Smith's left hip and knee were measured to be 21 inches apart, which is a deviation of two inches from the baseline distance. The sensor data further comprised a high risk of injury to his knee. The camera data comprises that, at 21:23:15.00, the maximum angle formed by John Smith's left hip, knee, and ankle was measured to be 185°. The camera data further comprised a high risk of injury to his knee.
Synchronizing, at step 804, the sensor and camera data such that the time stamps for the sensor data and the camera data match or are the closest available. In an embodiment, the time stamp for the sensor data is 21:23:15.00, and the time stamp for the camera data is also 21:23:15.00. Therefore, the data is synchronized. In an alternate embodiment, the sensor data detected an impact at 21:23:16.00; however, the camera data detected an impact at 21:23:17.00. The data is synchronized by retrieving the camera data for the time of the impact detected by the at least one wearable sensor 106 at the time 21:23:16.00. Alternatively, the sensor data is retrieved for the time of the impact detected by the at least one camera 104 at the time 21:23:17.00. In an embodiment, the impact may be evaluated at the impact times detected by both the sensor prediction module 124 and the camera prediction module 126 if they are different, or the impact may be evaluated over a period of time defined by the earliest detection of impact through the latest detection of impact.
Comparing, at step 806, the sensor predictions and the camera predictions for a synchronized impact event. The predictions may be comprised of a probability that the player received an injury from the detected impact. Alternatively, the predictions may comprise a risk score, which may be determined by multiplying the prediction as a percentage by a confidence value to get a weighted score. In an embodiment, the confidence value is the correlation coefficient stored in the correlation database 120. In alternate embodiments, the predictions may be classified based on a threshold value into classifications such as high or low. In an embodiment, the sensor prediction of injury is high, and the camera prediction is also high.
Determining, at step 808, whether both the sensor predictions and the camera predictions indicate a high probability of injury. A high probability of injury may comprise risk scores above a threshold value. Alternatively, a high probability may be determined by the sensor prediction module 124 and the camera prediction module 126. In an embodiment, the sensor prediction of injury is high, and the camera prediction of injury is high.
Determining, at step 810, that there is a high correlated probability of injury if both the sensor predictions and the camera predictions indicate a high likelihood of injury. In an embodiment, the sensor prediction of injury is high, and the camera prediction of injury is high. Therefore, there is a high correlated probability of injury.
Determining, at step 812, whether both the sensor predictions and the camera predictions indicate a low probability of injury. A low probability of injury may comprise risk scores below a threshold value.
Alternatively, a low probability may be determined by the sensor prediction module 124 and the camera prediction module 126. In an embodiment, the sensor prediction of injury is high, and the camera prediction is low.
Determining, at step 814, that there is a low correlated probability of injury if both the sensor predictions and the camera predictions indicate a low likelihood of injury. In an embodiment, the sensor prediction of injury is high, and the camera prediction of injury is low. Therefore, there is not a low probability of injury. In an alternate embodiment, the sensor prediction of injury is low, and the camera prediction of injury is also low. Therefore, there is a low probability of injury.
Determining, at step 816, that there is a moderate correlated probability of injury if only one of the sensor predictions or the camera predictions indicates a high probability of injury and the other indicates a low probability of injury. In this case, where the predictions made by the sensor prediction module 124 and the camera prediction module 126 do not agree, the probability of injury is determined to be moderate, as the data does not support a high or low correlated probability of injury. This disagreement is likely indicative of a system fault comprising anomalous data, a poor camera angle, a faulty sensor, a sensor or a camera requiring calibration, etc.
Saving, at step 818, the correlated probability of injury to the sensor database 116 and the camera database 118. The correlated probability of injury may comprise a composite risk score, such as an average risk score combining the sensor risk score and the camera risk score. Alternatively, the correlated probability of injury may be a classification, such as high, moderate, or low. The correlated probability of injury provides for higher confidence in a determined low or high probability of injury or identifies possible failures within the system, flagging possible false positive or false negative determinations.
Sending, at step 820, the correlated injury prediction to the base module 122. The correlated injury prediction comprises a high probability that the player, John Smith, has a knee injury, as both sensor and camera data indicate a probability of injury above a threshold value. In an alternate embodiment, where a player has a moderate probability of injury, the correlated injury prediction may also comprise an error message indicating that the system may require maintenance, as the sensor prediction and the camera prediction are not in agreement.
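A minimal sketch of the synchronization and agreement logic of FIG. 8 follows, in Python; the event representation, nearest-timestamp pairing rule, and tolerance are illustrative assumptions, while the example timestamps and disagreeing predictions come from the embodiments above:

    def synchronize(sensor_events, camera_events, tolerance_s=1.0):
        # Pair each sensor event with the nearest-in-time camera event
        # (step 804). Events are (timestamp_in_seconds, prediction)
        # tuples; the pairing rule and tolerance are assumptions.
        pairs = []
        for t_s, p_s in sensor_events:
            t_c, p_c = min(camera_events, key=lambda e: abs(e[0] - t_s))
            if abs(t_c - t_s) <= tolerance_s:
                pairs.append((p_s, p_c))
        return pairs

    def correlated_probability(sensor_pred, camera_pred):
        # Steps 808-816: agreement yields high or low; disagreement
        # yields moderate, which may indicate a system fault.
        if sensor_pred == camera_pred == "high":
            return "high"
        if sensor_pred == camera_pred == "low":
            return "low"
        return "moderate"

    # Sensor impact at 21:23:16 and camera impact at 21:23:17, expressed
    # as seconds since midnight; the predictions disagree.
    pairs = synchronize([(76996.0, "high")], [(76997.0, "low")])
    print([correlated_probability(s, c) for s, c in pairs])  # ['moderate']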
The functions performed in the processes and methods may be implemented in differing order. Furthermore, the outlined steps and operations are only provided as examples, and some of the steps and operations may be optional, combined into fewer steps and operations, or expanded into additional steps and operations without detracting from the essence of the disclosed embodiments.