Current technologies for assessing helmet impacts or collision severity utilize one of two common approaches. In the first approach, the helmet is instrumented with sensors, typically accelerometers, that measure the resulting motion and correlate that motion to a potential for head injury. A more recent approach uses sensors worn directly on the player's head. One example of this type of approach is the Reebok CHECKLIGHT™, an elastic cap containing motion sensors that is worn on the player's head under the helmet. However, outfitting a team with these sensors can be quite expensive, and both approaches require reliable portable power. Furthermore, to be effective at identifying real-time events, the sensors require a means of communication, and wireless communication may not be reliable. Updating the system and replenishing the power source are also time-consuming requirements because each sensor unit must be treated separately.
Accordingly, an improved system and method for detecting and assessing helmet collision impacts is needed.
Various implementations provide systems and methods for remotely detecting and assessing collision impacts between two objects. For example, some implementations include systems and methods that remotely detect and assess helmet collision impacts on an athletic playing field. In such implementations, at least one acoustical sensor is disposed adjacent an athletic playing field. The acoustical sensor is remotely located from the one or more players on the athletic playing field. A processor of a computing device in communication with and remotely located from the acoustical sensor is configured for: (1) receiving an acoustical signal from the acoustical sensor; and (2) identifying whether the acoustical signal indicates a collision event occurred between a helmet and another object. In a further implementation, the processor may be configured for identifying a location at which the collision event occurred in response to identifying that the collision event occurred; and in response to the location identified being within a boundary of the playing field, storing data associated with the collision event in a memory of the computing device. The processor may also be configured for generating a message related to the collision event for communicating to a display device on the computing device or to a remotely located computing device. In addition, the processor may also be configured for identifying one or more characteristics of the acoustical signal to determine the amount of force, the duration, the speed, the acceleration, and/or the location of the collision event on the helmet.
According to certain implementations, this system eliminates the need for sensors to be mounted in each helmet or on each player's head. Therefore, it may be an affordable option for teams that cannot afford to outfit each player. In addition, updates to the system may be implemented more quickly and less expensively and processes may be improved more rapidly because, according to various implementations, the system provides one master system for acquiring and processing data. Furthermore, because the system is installed at an athletic field or arena, multiple teams may share the benefits of the system. And, for implementations that use wires for communicating power and data between the acoustical sensors and the computing device, the system avoids the power sourcing costs associated with wireless communication and improves the reliability of the system.
In some implementations, the processor is further configured for identifying one or more characteristics of the acoustical signal indicative of the collision event. The one or more acoustical signal characteristics define the acoustic signature of the acoustical signal. For example, the one or more characteristics of the acoustical signal may be selected from the group consisting of: acoustic pressure, acoustic energy, maximum acoustic pressure, signal duration, acoustic pressure at one or more discrete frequencies, acoustic pressure at frequencies corresponding to the free-vibration modes of the helmet, acoustic energy at one or more discrete frequencies, wavelets, acoustic radiation mode maximum amplitude(s), acoustic radiation mode response amplitude versus time, energy in acoustic radiation mode(s), envelope of acoustic radiation mode(s) amplitude, and acoustic radiation mode amplitude decay. In a further implementation, the processor is further configured for calculating an adjustment for the acoustic energy of the received acoustical signal based on a location on the playing field of the collision event. The adjustment is associated with an amount of spreading expected for the acoustic energy as the acoustical signal propagates from the collision event.
In certain implementations, the processor is further configured for converting at least a portion of the acoustical signal to a numerical value associated with an amount of force associated with the collision event, and storing the numerical value in the memory. In addition, in some implementations, the processor is further configured for identifying an energy level of the collision event and storing the identified energy level in the memory. And, in some implementations, the processor is further configured for identifying an amount of force or severity associated with the collision event, storing the identified amount of force, and generating a message comprising the identified amount of force. The processor may also be configured for identifying and storing in the memory a location and direction of impact of the force on the helmet, and the message further comprises the location and direction of impact on the helmet, according to certain implementations. Identifying the amount of force associated with the collision event may also include comparing a maximum acoustic pressure associated with the received signal to a range of expected acoustic pressures associated with each of one or more force amounts and identifying the amount of force associated with the range of expected acoustic pressures that includes the maximum acoustic pressure of the received signal. According to some implementations, identifying whether the acoustical signal indicates the collision event may include comparing a value associated with the acoustical signal to a range of expected values indicating the occurrence of the collision event.
The processor may be further configured for identifying a duration of the collision event, storing the duration in the memory, and generating a message comprising the duration of the collision event, according to some implementations. And, according to certain implementations, the processor is further configured for identifying a speed or acceleration of the collision event, storing the speed or acceleration in the memory, and generating a message comprising the speed or acceleration.
In some implementations, the at least one acoustical sensor comprises a first acoustical sensor, a second acoustical sensor, and a third acoustical sensor. The first, second, and third acoustical sensors are remotely located from each other.
According to various other implementations, a system for correlating a helmet collision event with an acoustical signal signature may include a helmet and an object for colliding with the helmet; at least one acoustical sensor disposed remotely from the helmet and the object; and a computing device comprising a processor and a memory. The processor may be configured for: receiving an acoustical signal from the acoustical sensor at a certain time; receiving collision characteristic data associated with the collision of the helmet and the object at the certain time; and associating the acoustical signal at the certain time with the collision characteristic data.
In some implementations, the collision characteristic data comprises an amount of force at which the object is collided with the helmet, and the processor is further configured for associating the amount of force with an energy level of the acoustical signal associated with the collision event. As another example, the collision characteristic data may comprise a duration for which the object is collided with the helmet, and the processor is further configured for associating the duration with the total acoustic energy of the collision. Alternatively or additionally, the processor may be further configured for associating the duration with the duration of a vibration or acoustic mode signal associated with the helmet characteristics under collision. In some implementations, the collision characteristic data further comprises an impact location on the helmet at which the object is collided with the helmet, and the processor is further configured for associating the impact location with the amplitude of a helmet free-vibration mode or acoustic radiation mode of the acoustical signal associated with the collision event.
According to various other implementations, a system for remotely detecting a collision of at least two objects includes at least one acoustical sensor remotely located from a first object and a second object; and a computing device comprising a processor and a memory. The computing device is remotely located from the first and second objects, and the processor is configured for: receiving an acoustical signal from the acoustical sensor; and identifying whether the acoustical signal indicates a collision event of the first object with the second object.
In addition, various implementations include a system for correlating a collision event between two or more objects with an acoustical signal signature. The system includes at least two objects for colliding with each other; at least one acoustical sensor disposed remotely from the objects; and a computing device comprising a processor and a memory. The processor may be configured for: receiving an acoustical signal from the acoustical sensor at a certain time; receiving collision characteristic data associated with the collision of the objects at the certain time; and associating the acoustical signal at the certain time with the collision characteristic data.
The systems and methods are explained in detail in the following exemplary drawings. The drawings merely illustrate the structure of exemplary systems and methods and certain features that may be used singularly or in combination with other features. The invention should not be limited to the implementations shown.
Various implementations provide systems and methods for remotely detecting and assessing collision impacts between two objects. For example, some implementations include systems and methods that remotely detect and assess helmet collision impacts on an athletic playing field. In such implementations, at least one acoustical sensor is disposed adjacent an athletic playing field. The acoustical sensor is remotely located from the one or more players on the athletic playing field. A processor of a computing device in communication with the acoustical sensor is configured for: (1) receiving an acoustical signal from the acoustical sensor; and (2) identifying whether the acoustical signal indicates a collision event occurred between a helmet and another object. In a further implementation, the processor may be configured for identifying a location at which the collision event occurred in response to identifying that the collision event occurred; and in response to the location identified being within a boundary of the playing field, storing data associated with the collision event in a memory of the computing device. The processor may also be configured for generating a message related to the collision event for communicating to a display device on the computing device or to a remotely located computing device. In addition, the processor may also be configured for identifying one or more characteristics of the acoustical signal to determine the amount of force, the duration, the speed, the acceleration, and/or the location of the collision event on the helmet.
Various implementations use acoustic measurement(s) to remotely assess the impact severity of two colliding objects, such as two helmets; a helmet colliding with another object (e.g., a ball, a puck, a portion of sports gear or athletic equipment worn by another player, a fixed piece of equipment disposed within the boundaries of the playing field, or another object that may injure the player); or two other types of objects. For example, when a helmet collides with another object, such as when two football players' helmets collide on the playing field, an impact force is generated. This impact force causes the helmets to vibrate and subsequently radiate acoustic energy or sound. Most sports fans and television viewers have heard this acoustic signature, which mimics a short cracking or popping sound. This radiated impact sound, referred to as the acoustic signature, can be measured remotely using one or more acoustical sensors, such as microphones. By appropriately processing the measured acoustic signature, the severity of the helmet collision (e.g., magnitude of the impact force) associated with that signature can be determined. Because the location of a helmet collision on the field can vary during play, certain implementations may include multiple microphones around the athletic field to determine the location of the collision on the athletic field. With multiple microphones and/or the use of processing algorithms, like wavelets, to isolate the collision signature from the total noise measured, the negative influence of extraneous noise may be reduced. Furthermore, helmet impacts occurring outside of the boundaries of the athletic field, such as from a player throwing a helmet onto a bench, can be identified and excluded to reduce false alarms. By detecting a collision event and assessing its severity, the potential for head injury can be determined.
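The isolation of a short collision "crack" from steady background noise can be sketched as a simple short-time energy detector. This is a minimal stand-in for the wavelet-based processing described above, not the specification's actual algorithm; the window size and threshold ratio are illustrative assumptions:

```python
import math

def detect_impulses(samples, window=64, threshold_ratio=5.0):
    """Flag sample offsets whose window RMS energy exceeds
    `threshold_ratio` times the median window energy.

    The median acts as a crude estimate of the steady noise floor
    (e.g., crowd noise), so only short, impulsive events stand out.
    """
    energies = []
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        energies.append(math.sqrt(sum(s * s for s in chunk) / window))
    baseline = sorted(energies)[len(energies) // 2]  # median as noise floor
    return [i * window for i, e in enumerate(energies)
            if baseline > 0 and e > threshold_ratio * baseline]
```

A wavelet transform would localize the transient in both time and frequency; this energy detector captures only the time-localization aspect, which is sufficient to illustrate the detection step.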
To process the signals received from the acoustical sensors 12a-12f, a computer system, such as the central server 500 shown in
In addition, the central server 500 may include at least one storage device 515, such as a hard disk drive, a floppy disk drive, a CD-ROM drive, or optical disk drive, for storing information on various computer-readable media, such as a hard disk, a removable magnetic disk, or a CD-ROM disk. As will be appreciated by one of ordinary skill in the art, each of these storage devices 515 may be connected to the system bus 545 by an appropriate interface. The storage devices 515 and their associated computer-readable media may provide nonvolatile storage for a central server. It is important to note that the computer-readable media described above could be replaced by any other type of computer-readable media known in the art. Such media include, for example, magnetic cassettes, flash memory cards and digital video disks. In addition, the server 500 may include a network interface 525 configured for communicating data with other computing devices.
A number of program modules may be stored by the various storage devices and within RAM 530. Such program modules may include an operating system 550 and a plurality of one or more modules, such as a signal processing module 560, a correlation module 570, and a communication module 590. The modules 560, 570, 590 may control certain aspects of the operation of the central server 500, with the assistance of the processor 510 and the operating system 550. For example, the modules 560, 570, 590 may perform the functions described and illustrated by the figures and other materials disclosed herein.
The functions described herein and the flowchart and block diagrams in
As shown in
The location of the collision event may be determined by using triangulation, the known locations of the sensors 12a-12f, and the various times at which the collision event is detected by each sensor 12a-12f. In one implementation, for example, multiple microphones are disposed at known locations around the playing field. These locations may be recorded as the x,y coordinates on the playing field, such as the field shown in
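The triangulation described above can be sketched as a time-difference-of-arrival (TDOA) search over candidate field positions. The sensor coordinates, field dimensions, and speed of sound below are illustrative assumptions, not values from the specification:

```python
import math

# Hypothetical sensor layout: (x, y) coordinates in meters around a field.
SENSORS = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0), (0.0, 50.0)]
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def locate_collision(arrival_times, sensors=SENSORS, c=SPEED_OF_SOUND):
    """Estimate the (x, y) collision location from per-sensor arrival times.

    Minimizes the mismatch between measured and predicted time differences
    of arrival relative to sensor 0, so the unknown emission time cancels.
    Uses a coarse-to-fine grid search for simplicity.
    """
    ref = arrival_times[0]
    measured = [t - ref for t in arrival_times]  # TDOA relative to sensor 0

    def residual(x, y):
        dists = [math.hypot(x - sx, y - sy) for sx, sy in sensors]
        predicted = [(d - dists[0]) / c for d in dists]
        return sum((m - p) ** 2 for m, p in zip(measured, predicted))

    best = (0.0, 0.0)
    step = 10.0
    for _ in range(3):  # refine the grid around the best point in three passes
        bx, by = best
        candidates = [(bx + i * step, by + j * step)
                      for i in range(-10, 11) for j in range(-10, 11)]
        best = min(candidates, key=lambda p: residual(*p))
        step /= 10.0
    return best
```

Once the estimated location is returned, the bounds check described above reduces to testing whether the (x, y) pair falls inside the field rectangle.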
The use of multiple microphones also decreases the negative influence of background noise. Football stadiums, for example, are notoriously noisy environments. A single microphone is not expected to effectively measure a helmet collision acoustic signature occurring at the opposite end of the playing field. By having multiple microphones around the perimeter of the field, or even permanently mounted at various locations within the arena, the likelihood of a collision occurring in the vicinity of multiple microphones increases. In addition, identifying the collision location on the field is important for identifying which players may be involved in the collision.
As noted above, the module 560 may identify one or more characteristics of the collision event by processing the received acoustical signal and identifying one or more characteristics of the received acoustical signal. The signal characteristics of the acoustical signals define the acoustic signature of each acoustical signal and may include one or more of the following: acoustic pressure, acoustic energy, maximum acoustic pressure, signal duration, acoustic pressure at one or more discrete frequencies, acoustic pressure at one or more discrete natural frequencies corresponding to free vibration modes of the helmet, acoustic energy at one or more discrete frequencies, acoustic radiation mode maximum amplitude(s), acoustic radiation mode response amplitude versus time, energy in acoustic radiation mode(s), envelope of acoustic radiation mode(s) amplitude, and acoustic radiation mode amplitude decay. For example, the module 560 may identify the maximum acoustic pressure of the received acoustical signal and compare the identified maximum acoustic pressure to a range of expected acoustic pressures associated with each of one or more helmet collision force amounts. In response to the received maximum acoustic pressure being within the range of expected acoustic pressures associated with a particular force amount, the module 560 associates the particular force amount with the received acoustical signal. By knowing the position of the collision on the field from using the techniques noted above, the acoustic amplitudes can also be adjusted to account for spreading of acoustic energy as it propagates.
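The pressure-to-force comparison and spreading adjustment can be sketched as follows. The force bands, reference distance, and the assumption of spherical (1/r) pressure decay are hypothetical calibration choices for illustration, not values from the specification:

```python
# Hypothetical calibration: ranges of distance-adjusted peak acoustic
# pressure (Pa) mapped to collision severity bands. Values are illustrative.
FORCE_BANDS = [
    (0.0, 2.0, "mild"),
    (2.0, 10.0, "moderate"),
    (10.0, float("inf"), "severe"),
]

REFERENCE_DISTANCE = 1.0  # m, distance at which the calibration applies

def adjust_for_spreading(peak_pressure, distance, ref=REFERENCE_DISTANCE):
    """Scale a measured peak pressure back to the reference distance,
    assuming spherical spreading (pressure falls off as 1/r)."""
    return peak_pressure * (distance / ref)

def classify_force(peak_pressure, distance):
    """Map a distance-adjusted peak pressure onto a calibrated band,
    mirroring the range comparison described in the text."""
    adjusted = adjust_for_spreading(peak_pressure, distance)
    for low, high, label in FORCE_BANDS:
        if low <= adjusted < high:
            return label
    return "unknown"
```

The distance input would come from the triangulated collision location and the known sensor position; any calibration of bands to actual force amounts would be established experimentally.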
As noted above, the data gathered using the process described in
The communication module 590 may receive the message generated by the signal processing module 560 and communicate the message to one or more display, audible, or haptic feedback devices, such as a display, audible, or haptic feedback device that is part of the server 500 or a display, audible, or haptic feedback device that is part of another computing device remotely located from and in wired or wireless communication with the server 500. For example, the computing device in communication with the server 500 may be statically disposed within a communication range of the acoustical sensors (e.g., a desktop computer or an alert monitor in the press box of the athletic field that alerts personnel when a collision event occurs) or portable (e.g., a smartphone or other portable feedback device held by personnel). This message may include a general indication that the collision event occurred (e.g., “severe collision” or “mild collision”) and/or an indication related to the severity of the impact (e.g., a force estimate, speed of the impact, resulting acceleration of the impact, scaled level of severity, a color related to severity), the location on the field, and the duration of the impact.
For example, the type of collision likely to be detected on a football field, such as the field shown in
The relationship between the severity of the collision event and the acoustic signature is also demonstrated in
Another test was conducted in which the acoustic signature resulting from the collision of two helmets was measured. In this experiment, two helmets were suspended and the acoustic signature was measured as the helmets collided at various collision speeds.
Although the above described experiments consider the acoustic energy level of the acoustic signature in the processing method, other characteristics of the acoustic signature may be used to determine the characteristics of the collision event in other implementations. For example, the peak acoustic pressure, which is associated with an energy level of the signal in the time domain, may be compared with energy levels associated with known levels of force to identify the force associated with the collision event. In another implementation, the signal or a portion thereof may be transformed from the time domain to the frequency domain. For example, the processor may use a Fourier transform to transform at least a portion of the signal from the time domain to the frequency domain. One characteristic of that frequency-domain signal is the acoustic pressure at frequencies of interest, and those pressures may be correlated with the force associated with the collision event. In another implementation, those frequencies of interest may correlate with the vibration mode frequencies of the helmet. In another implementation, the acoustic signal is compared to a set of wavelets that correlate with some characteristic of the helmet signature.
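The frequency-domain step can be sketched with a single-bin discrete Fourier transform that evaluates the signal amplitude at assumed helmet mode frequencies. The function names and the mode frequencies used below are illustrative, not taken from the specification:

```python
import cmath
import math

def pressure_at_frequency(samples, sample_rate, freq):
    """Single-bin discrete Fourier transform: returns the amplitude of
    the sinusoidal component of `samples` at `freq` Hz.

    The 2/n scaling recovers the peak amplitude for a tone that falls
    exactly on a DFT bin frequency.
    """
    n = len(samples)
    acc = sum(samples[k] * cmath.exp(-2j * math.pi * freq * k / sample_rate)
              for k in range(n))
    return abs(acc) * 2.0 / n

def modal_pressures(samples, sample_rate, mode_freqs):
    """Evaluate the acoustic pressure at each frequency of interest,
    e.g., assumed free-vibration mode frequencies of the helmet."""
    return {f: pressure_at_frequency(samples, sample_rate, f)
            for f in mode_freqs}
```

In practice a fast Fourier transform over the full spectrum would be used; the single-bin form is shown only because it makes the "pressure at frequencies of interest" idea explicit.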
Referring back to
Thus, various implementations allow for the use of collision event characteristics to evaluate the potential for head injury. This evaluation may be accomplished by correlating the results from the acoustic processing with results from experiments or real-time use in games or practice of instrumented helmets and players.
The correlation module 570 may store the known collision event data with the received acoustical signals (and/or one or more acoustical signal characteristics thereof) that are associated with the collision event data in a look up table or library according to one implementation. However, in other implementations, the relationship between the known collision event data and the received acoustical signals (and/or one or more acoustical signal characteristics thereof) may be “learned” by the system using a neural network or other suitable computer implemented learning algorithm. In addition, the methods described in relation to
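The look-up-table approach can be sketched as linear interpolation over calibration data pairing acoustic signal characteristics with known collision data. The calibration entries below are hypothetical values for illustration only:

```python
# Hypothetical calibration library, e.g., built from instrumented-helmet
# tests: each entry pairs a measured acoustic energy with the known
# impact force. Values are illustrative, not experimental data.
CALIBRATION = [
    (0.5, 100.0),   # (acoustic energy, force in newtons)
    (2.0, 400.0),
    (8.0, 1600.0),
]

def estimate_force(energy, table=CALIBRATION):
    """Linearly interpolate the force for a measured acoustic energy
    using the calibration look-up table; clamp outside the calibrated
    range rather than extrapolate."""
    table = sorted(table)
    if energy <= table[0][0]:
        return table[0][1]
    if energy >= table[-1][0]:
        return table[-1][1]
    for (e0, f0), (e1, f1) in zip(table, table[1:]):
        if e0 <= energy <= e1:
            frac = (energy - e0) / (e1 - e0)
            return f0 + frac * (f1 - f0)
```

A learned model (e.g., a neural network, as the text suggests) would replace this table with a fitted function but serve the same role: mapping measured signal characteristics to collision event data.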
The implementations described above include systems and methods of detecting the collision of a helmet with another object. However, the systems and methods described above are not limited to use with helmet collisions and could be used to remotely detect a collision of two or more other objects using acoustical sensors. And, the systems and methods described above for correlating a collision event of a helmet and another object with an acoustical signal could be used to correlate a collision event of two or more other types of objects with an acoustical signal. For example, as illustrated in
The systems and methods recited in the appended claims are not limited in scope by the specific systems and methods of using the same described herein, which are intended as illustrations of a few aspects of the claims. Any systems or methods that are functionally equivalent are intended to fall within the scope of the claims. Various modifications of the systems and methods in addition to those shown and described herein are intended to fall within the scope of the appended claims. Further, while only certain representative systems and method steps disclosed herein are specifically described, other combinations of the systems and method steps are intended to fall within the scope of the appended claims, even if not specifically recited. Thus, a combination of steps, elements, components, or constituents may be explicitly mentioned herein; however, other combinations of steps, elements, components, and constituents are included, even though not explicitly stated. The term “comprising” and variations thereof as used herein is used synonymously with the term “including” and variations thereof and are open, non-limiting terms.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The implementation was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various implementations with various modifications as are suited to the particular use contemplated.
Any combination of one or more computer readable medium(s) may be used to implement the systems and methods described hereinabove. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN), such as Bluetooth or 802.11, or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to implementations of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
This application claims priority to U.S. Patent Application No. 62/016,777, filed Jun. 25, 2014, entitled “Systems and Methods for Remotely Sensing and Assessing Helmet Collision Impacts,” the content of which is incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
62016777 | Jun 2014 | US