Systems and methods for monitoring athletic performance

Information

  • Patent Grant
  • Patent Number
    9,642,415
  • Date Filed
    Tuesday, February 7, 2012
  • Date Issued
    Tuesday, May 9, 2017
Abstract
The invention relates to devices and methods for monitoring one or more athletic performance characteristic of a user. An example apparatus includes a sensing unit adapted to be attachable to a shoe of a user, the sensing unit including a first sensor adapted to monitor a movement of a foot of the user while the user is in motion, the first sensor comprising a gyroscopic sensor, processing means for determining a first performance characteristic of the user based upon an output from the first sensor, the first performance characteristic comprising a foot strike location of a foot of the user upon striking a ground surface, and transmitting means for transmitting a data package representative of the performance characteristic to a remote receiver.
Description
FIELD OF THE INVENTION

The present invention relates generally to the field of athletic equipment, and more particularly to systems and methods for providing training information to a runner.


BACKGROUND OF THE INVENTION

A number of devices exist for providing a runner with basic training information. For example, systems for measuring and recording the heart rate, speed, distance, and/or stride rate of a runner using a sensor that may be clipped, for example, to workout apparel have been manufactured by adidas AG of Herzogenaurach, Germany. Similarly, Nike Inc., of Beaverton, Oreg., has produced devices that measure and record the distance and pace of a walk or run. Such devices often consist of small accelerometers attached to or embedded in a shoe, which communicate with a receiving device (e.g., a sportband or a receiver plugged into or embedded within a mobile phone). The device and receiver allow a user to track the distance, time, and pace of a training run, with the information provided to a user through audio feedback during the training run and/or recorded for later analysis. Systems also exist for measuring and recording training information such as distance, time, and pace of a training run through utilization of a mobile phone or GPS (“Global Positioning System”) device without the need for a sensor in a shoe.


However, these systems only provide basic training information related, for example, to the speed and distance travelled by a runner, and cannot provide any detailed biofeedback information that may be used to improve the actual running form and technique of the runner. As a result, there still exists a need for a system and method capable of providing detailed training information to an athlete during and after a training session to assist in improving their running form.


SUMMARY OF THE INVENTION

The present invention is directed towards novel systems, methods and devices for monitoring one or more athletic performance characteristic of a user and/or providing biofeedback information to the user to assist in training the user to run with better form and, for example, with an improved foot strike.


One aspect of the invention includes an apparatus for monitoring one or more athletic performance characteristic of a user, the apparatus including a sensing unit adapted to be attachable to a shoe of a user. The sensing unit includes a first sensor, such as a gyroscopic sensor, that is adapted to monitor a movement of a foot of the user while the user is in motion, processing means for determining a first performance characteristic of the user, such as a foot strike location of a foot of the user upon striking a ground surface, and transmitting means for transmitting a data package representative of the performance characteristic to a remote receiver. The gyroscopic sensor may be adapted to measure an angular velocity of the foot of the user. The means for determining a performance characteristic of the user may include a microprocessor.


In one embodiment the apparatus also includes receiving means for receiving the data package transmitted from the sensing unit and communicating information representative of the performance characteristic to the user. The means for communicating information to the user may include, or consist essentially of, at least one of a visual signal, an auditory signal, and/or a tactile signal (e.g., a vibration). The information may be communicated to the user in substantially real-time and/or be stored, for example within the sensing unit and/or the receiving means, for communicating to the user at a later time and/or for further analysis.


The receiving means may include one or more remote user feedback devices, such as, but not limited to, a watch, a detachable strap, a mobile phone, an earpiece, a hand-held feedback device, a laptop computer, a head-mounted feedback device (e.g., a visor or hat), and/or a desktop personal computer. Alternatively, or in addition, the receiving means may include a software application and/or hardware (e.g., a dongle) for controlling at least one function of a remote user feedback device, such as a mobile phone.


In one embodiment, the sensing unit includes a housing unit adapted to house the first sensor, the processing means, and the transmitting means. The housing unit may be adapted to be releasably attachable to a sole and/or upper of the shoe of the user, or be fixedly attached to, or embedded within, an upper and/or sole of the shoe. In one embodiment the housing unit is releasably attachable to at least one of a fastening portion (e.g., the lacing portion) of a shoe or a heel portion of the shoe.


The apparatus may include means for determining at least one second performance characteristic of the user, which can be based upon an output from the first sensor and/or be based upon an output from one or more second sensor(s). The second sensor(s) may include, or consist essentially of, one or more accelerometers, pressure sensors, force sensors, temperature sensors, chemical sensors, global positioning systems, piezoelectric sensors, rotary position sensors, gyroscopic sensors, heart-rate sensors, and/or goniometers. Other sensors, such as, but not limited to, electrocardiograph sensors, electrodermograph sensors, electroencephalograph sensors, electromyography sensors, feedback thermometer sensors, photoplethysmograph sensors, and/or pneumograph sensors may also be utilized in various embodiments of the invention. The at least one second performance characteristic may include, or consist essentially of, at least one of a cadence, a posture, a lean, a speed, a distance travelled, and/or a heart rate of the user.


In one embodiment, the processing means includes a comparison of a localized maximum angular velocity measurement and a localized minimum angular velocity measurement during a foot strike event (e.g., during a brief period immediately before, at, and/or after initial contact between a foot and the ground). For example, processing the measured data may include, or consist essentially of, dividing the localized minimum angular velocity measurement during a foot strike event by the localized maximum angular velocity measurement during a foot strike event, and comparing the resulting calculated value with at least one predetermined comparison value to determine whether a heel strike, a midfoot strike, or a forefoot strike has occurred. Alternatively, or in addition, the processing means may include integration of measured angular velocity data during a foot strike event and comparison of the integrated positive angular velocity results and the integrated negative angular velocity results during a foot strike event.
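As an illustration of this processing step, a minimal Python sketch of the ratio-based comparison is given below. The cutoff values are hypothetical placeholders chosen purely for illustration and are not taken from this disclosure; in practice the comparison values would be calibrated to the particular sensor and its mounting.

```python
def classify_foot_strike(min_angular_velocity: float,
                         max_angular_velocity: float,
                         heel_cutoff: float = -0.1,
                         forefoot_cutoff: float = -1.0) -> str:
    """Classify a foot strike from the localized minimum and maximum z-axis
    angular velocity measured during a foot strike event.

    The minimum is divided by the maximum and the result is compared with
    predetermined comparison values. The cutoff values here are illustrative
    placeholders, not values taken from the disclosure.
    """
    if max_angular_velocity <= 0:
        return "unknown"           # no positive spike detected for this event
    ratio = min_angular_velocity / max_angular_velocity
    if ratio > heel_cutoff:
        return "heel strike"       # little or no backward rotation after impact
    if ratio > forefoot_cutoff:
        return "midfoot strike"    # moderate negative spike follows the positive spike
    return "forefoot strike"       # negative spike dominates the positive spike
```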


Another aspect of the invention includes a method for monitoring one or more athletic performance characteristic of a user, the method including providing a sensing unit adapted to be attachable to a shoe of a user. The sensing unit may include a first sensor, such as a gyroscopic sensor, that is adapted to monitor a movement of a foot of the user while the user is in motion, processing means for determining a first performance characteristic of the user, such as a foot strike location of a foot of the user upon striking a ground surface, and transmitting means for transmitting a data package representative of the performance characteristic to a remote receiver. The method further includes providing receiving means for receiving a data package transmitted from the sensing unit and communicating information representative of the performance characteristic to the user. In one embodiment, the method allows for the communication of the information to the user in substantially real-time.


These and other objects, along with advantages and features of the present invention herein disclosed, will become more apparent through reference to the following description, the accompanying drawings, and the claims. Furthermore, it is to be understood that the features of the various embodiments described herein are not mutually exclusive and can exist in various combinations and permutations.





BRIEF DESCRIPTION OF THE DRAWINGS

In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the following description, various embodiments of the present invention are described with reference to the following drawings, in which:



FIG. 1 is a schematic perspective view of a biofeedback system as worn by a runner, in accordance with one embodiment of the invention;



FIG. 2 is a schematic side view of a shoe with forefoot and heel sensors embedded therein, in accordance with one embodiment of the invention;



FIG. 3 is a schematic side view of a shoe with a midfoot sensor embedded therein, in accordance with one embodiment of the invention;



FIG. 4 is a schematic side view of a shoe with a plurality of sensors embedded therein, in accordance with one embodiment of the invention;



FIG. 5 is a schematic plan view of a shoe with an array of sensors embedded therein, in accordance with one embodiment of the invention;



FIG. 6 is a schematic plan view of another shoe with an array of sensors embedded therein, in accordance with one embodiment of the invention;



FIG. 7 is a schematic plan view of a shoe with a sensor-holding insert inserted within the sole, in accordance with one embodiment of the invention;



FIG. 8 is a schematic plan view of a shoe sole having midfoot and heel sensor pads, in accordance with one embodiment of the invention;



FIG. 9 is a schematic plan view of a shoe sole having forefoot, midfoot, and heel sensor pads, in accordance with one embodiment of the invention;



FIG. 10 is a schematic side view of a shoe with a sensor holding insert coupled to the shoe at a lacing portion, in accordance with one embodiment of the invention;



FIG. 11 is a schematic side view of a shoe with a sensor coupled to the upper, in accordance with one embodiment of the invention;



FIG. 12 is a schematic view of a system for providing biofeedback information for an athlete, in accordance with one embodiment of the invention;



FIG. 13 is a schematic view of another system for providing biofeedback information for an athlete, in accordance with one embodiment of the invention;



FIG. 14 is a schematic view of yet another system for providing biofeedback information for an athlete, in accordance with one embodiment of the invention;



FIGS. 15 to 19 are schematic views of various biofeedback systems as worn by a runner, in accordance with one embodiment of the invention;



FIG. 20 is a schematic perspective view of a hand held feedback device for a biofeedback system, in accordance with one embodiment of the invention;



FIG. 21 is a schematic side view of a sensor pod for a biofeedback system positioned on a lacing portion of a shoe, in accordance with one embodiment of the invention;



FIG. 22 is a perspective view of the pod of FIG. 21;



FIG. 23 is a schematic side view of a sensor pod for a biofeedback system positioned on a heel portion of a shoe, in accordance with one embodiment of the invention;



FIG. 24 is a perspective view of the pod of FIG. 23;



FIG. 25 is a schematic perspective view of axes of orientation for a gyroscopic sensor for a biofeedback system, in accordance with one embodiment of the invention;



FIG. 26 is a graph of angular velocity data from a gyroscopic sensor for a heel striking running style, in accordance with one embodiment of the invention;



FIG. 27 is a graph of angular velocity data from a gyroscopic sensor for a midfoot striking running style, in accordance with one embodiment of the invention;



FIG. 28 is a graph of data for a reset trigger for a gyroscopic sensor, in accordance with one embodiment of the invention;



FIG. 29 is a schematic representation of various data presentation means for foot strike location of a runner, in accordance with one embodiment of the invention; and



FIGS. 30a to 33c are schematic views of various attachment mechanisms for a sensor pod for a biofeedback system, in accordance with one embodiment of the invention.





DETAILED DESCRIPTION

The invention described herein relates generally to improved biofeedback systems, and related methods, for use in training users (e.g., runners or other athletes) to run with an improved running form or technique. The invention may be utilized by runners or other athletes of all levels of skill from professional athletes through to beginners and occasional joggers. By placing one or more sensors on the body of a runner (e.g., on or in one or more shoe and/or piece of apparel), the systems and methods described herein may be used as a coaching tool to provide substantially instantaneous feedback and coaching during athletic activity, and also store information for evaluation and further processing after the run.


Promoting better running form may be beneficial to a runner for a number of reasons such as, but not limited to, improving running efficiency (thereby increasing performance) and reducing the risk of injury. In general, coaching can be an important way to promote proper running form and keep runners injury free. However, the majority of runners (including many collegiate and even some elite runners) have received little or no training on how to run with proper form. As a result, many runners are unaware of problems with their running form (e.g., an improper foot strike position or a running style wherein a runner's right foot contacts the ground differently than their left foot) that may significantly affect their running efficiency and leave them more prone to injury.


The utilization of high-speed cameras during coaching may provide a runner with some feedback to assist in improving running form. However, the majority of athletes do not have sufficient access to professional coaching utilizing such technology to receive any substantive guidance on running with an improved form, and such coaching, even where available, can be expensive and time-consuming. In addition, watching video of an athlete running does not provide instantaneous feedback that can be used by the athlete during a run. While technology has been utilized to provide some instantaneous feedback to a runner, such as the speed, distance travelled, heart rate, and calories burned during a run, the information provided by these systems does not produce biofeedback information that may be used to give a runner substantive training on proper running form. The inventions described herein address this issue by providing improved systems, and related methods, for measuring, transmitting, storing, analyzing, and/or communicating substantive biofeedback data that may be utilized instantaneously, or substantially instantaneously, to promote good running form in an athlete during and/or after a run.


Biofeedback information of use in training an athlete to run with proper running form includes, but is not limited to, foot strike position, cadence, posture, and lean information. Such information may be used to analyze a runner's technique and running traits, and identify parameters that can be adjusted by a runner during training to improve one or more performance characteristic. Good running form for an athlete may include elements such as, but not limited to, quick strides, a midfoot foot strike location, and good posture. These elements may increase the efficiency and ease of running while reducing stresses on the runner that could result in strains and other injuries. In contrast, poor running form, which is common in untrained athletes, may include elements such as overstriding, aggressive heel-striking, and bad posture. These poor running elements may, for example, place excessive stresses on the knee, potentially resulting in Runner's Knee/Patellofemoral Pain Syndrome or other injuries.


The posture of a runner relates to the carriage of the body of the runner during running. Good posture (generally an upright posture) may be achieved, for example, by standing tall and running with the head up and the gaze directed straight ahead.


The cadence of a runner (i.e., the number of foot strikes per minute) may be important in ensuring good running form. In one embodiment, a cadence of about 180 foot strikes per minute may be optimal to prevent over-striding and to ensure proper running form regardless of the pace of the runner. In alternative embodiments higher or lower cadences may be used depending, for example, on the specific physiology, age, and/or goals of the user.


The lean of a runner may be utilized to reduce the need for excessive muscle force by advantageously utilizing gravity to assist in forward motion. In one embodiment, improved lean may be achieved by utilizing a running style including a forward lean over the whole length of the body without bending at the waist and by flexing at the ankle to reduce unnecessary muscle strain caused by toeing-off.


The foot strike location (i.e., the location, on the sole of the foot, of initial impact with a ground surface during each step) can be extremely important in promoting a good running form. Runners with a midfoot striking gait distribute pressure across the foot during a running gait cycle differently than runners employing a heel striking gait. In addition, the mechanical work performed by the lower extremity of a runner using a midfoot striking gait is distributed across the joints differently than a runner employing a heel striking gait. Runners with a midfoot striking gait primarily have pressure distributed in the lateral midfoot and forefoot region of the foot at initial impact and exhibit more ankle flexion (dorsiflexion) subsequent to the initial impact. Runners with a heel striking gait primarily have pressure distributed in the lateral heel at initial impact and generally do not exhibit as much ankle flexion after impact. As a result, heel strikers tend to have larger stresses placed on their knee, which can lead to injuries such as Runner's Knee/Patellofemoral Pain Syndrome. Consequently, rearfoot/heel strikers potentially have a less efficient running gait than midfoot strikers, with heel striking and overstriding often causing braking. An example shoe conducive to a midfoot striking gait is described in U.S. Patent Publication No. 2009-0145005, the disclosure of which is incorporated herein by reference in its entirety. In addition, a midfoot striking gait may provide a superior running form to a pronounced forefoot running gait (which may, for example, cause calf strain and Achilles strain). One embodiment of the invention may include the use of one or more sensors to determine the location, on the sole of the foot, of initial impact with a ground surface during each step, and/or determine the angle of the foot with respect to the ground surface at initial impact (which may be used to determine foot strike location).


One embodiment of the invention includes a system 100 for providing biofeedback information to a runner 115 for use in improving running form. The system 100, as shown in FIG. 1, includes one or more sensors 105 attached to (e.g., embedded within, fixedly coupled to, or releasably coupled to) a portion of a shoe 110 of a runner 115 to measure one or more data conditions/performance characteristics during athletic activity (e.g., a run). The system 100 also includes one or more remote receiving systems 120 for receiving data from the sensor(s) 105 and communicating information to the runner based on an analysis of the gathered data. The analysis of the gathered data may be carried out in a processor located in the shoe 110, the remote receiving system 120, and/or a separate analyzing unit (e.g., a personal computer). One or more sensors 105 can be placed in each shoe 110 of the runner 115, or in only a single shoe 110 of the runner 115.


The sensor(s) may be integrally embedded within the shoe and, for example, within one or more portions of a sole (e.g., an outsole, midsole, or insole) of a shoe. One or more sensors may also be integrally embedded within one or more portions of an upper of a shoe. In another embodiment, one or more sensors may be releasably attachable to a portion of the sole and/or upper of a shoe. For example, a sensor unit may be adapted to clip to a portion of an upper of a shoe (e.g., an outer mesh layer of the shoe or a lacing section of the shoe), and/or be releasably attached to a portion of a sole of the shoe. The sensor(s) may be releasably attached through any appropriate attaching elements including, but not limited to, a hook and loop fastening (e.g., Velcro®), a clip, a pin, lacing, magnetic elements, and/or an adhesive.


Various sensors may be utilized to measure one or more data conditions during athletic activity. Example sensors include, but are not limited to, mechanical feedback devices (e.g., retractable pins that retract upon contact with the ground to measure and indicate a ground contact and/or a force associated therewith, or pins or other structures that provide a tactile sensation to a user during foot strike), accelerometers, piezoelectric sensors, rotary position sensors, gyroscopic sensors, temperature sensors, chemical sensors (e.g., sensors for measuring oxygen levels), GPS devices, pressure sensors (e.g., pressure transducers), force sensors (e.g., load cells, force transducers, or stress/strain sensors), and/or goniometers. Example pressure/force sensors include, but are not limited to, resistive, capacitive, impedance based, and/or piezoelectric sensors. The sensors may measure data conditions at a localized position or be strips or pads adapted to measure data conditions (e.g., pressure and/or force) over an extended area. In various embodiments other electromagnetic, mechanical, and/or optical sensors may be used in addition to, or in place of, the sensors listed above.


One or more sensors may be placed at any appropriate location on the shoe and, for example, within a forefoot portion, a midfoot portion, and/or a heel portion of a shoe sole and/or upper. In one embodiment, as shown in FIG. 2, a shoe 110 includes a forefoot sensor 105 located in a forefoot portion 117 of the shoe 110, and a heel sensor 105 located within a heel portion 122 of the shoe 110. Sensors may be placed at other locations on the shoe 110 in addition to, or in place of, the forefoot portion 117 and heel portion 122. For example, one or more midfoot sensors 105 may be located at a midfoot portion 125 of a shoe 110, as shown in FIG. 3. The various sensors 105 may be positioned within or above a sole 130 of the shoe 110 (e.g., within a cavity in the midsole of the shoe or in an insole placed within the shoe) and/or be positioned within or on an upper 135 of a shoe 110.


In one embodiment, a plurality of sensors 105 (i.e., a sensor array) are positioned at various locations along a length of the shoe 110, or a portion thereof, as shown in FIG. 4. These sensors 105 may be positioned at a number of locations substantially along a central axis 140 of the sole 130, as shown in FIG. 5, or at a number of locations along a medial side 145 and/or a lateral side 150 of the sole 130, as shown in FIG. 6. Any appropriate number of sensors 105 may be positioned at any appropriate locations over the length and width of the sole 130 of the shoe 110, with the sensors 105 embedded within, or releasably attached to, an outsole, midsole, and/or insole of the sole 130, depending upon the specific data and running traits being measured. In one embodiment, one or more of the sensors 105 may be exposed on an outer surface of the sole 130. Alternatively, or in addition, one or more of the sensors 105 may be embedded within the sole 130.


In one embodiment, one or more sensors 105 may be placed in a removable insert that may be positioned inside a shoe, for example as a removable insole or as an insert adapted to fit within a cavity or pocket formed within a portion of the shoe sole or upper (e.g., in a heel pocket or tongue pocket). For example, a shoe 110 may be formed with a sole portion 130 having a cavity 170 adapted to releasably receive an insert 175 holding one or more sensors 105 therein, as shown in FIG. 7. The cavity may include a covering portion adapted to cover and protect the insert during operation. The cavity 170, or cavities, may be placed at any location within the forefoot portion 117, midfoot portion 125, and/or heel portion 122 of the shoe 110. In various embodiments the cavity 170 may be accessed from the interior of the shoe, as shown in FIG. 7, or through an opening in an outer surface of the outsole 130 of the shoe 110.


In one embodiment, one or more sensor pads or strips may be affixed to, or embedded in, a sole 130 of a shoe 110. For example, FIG. 8 shows a sole 130 having a heel sensor pad 180 located at a heel portion 122 along a lateral side 150 of the sole 130, with a midfoot sensor pad 185 located at a midfoot portion 125 along the lateral side 150 of the sole 130. Sensor pads or strips can be positioned along a medial side, lateral side, and/or central portion of the sole, or span across a width of the shoe, or a portion thereof. For example, FIG. 9 shows a sole 130 having a midfoot sensor pad 185 located at a midfoot portion 125 along the lateral side 150 of the sole 130, but also having a heel sensor pad 190 spanning across the width of the heel portion 122 and a forefoot sensor pad 195 spanning across the width of forefoot portion 117.


In various embodiments sensor pads and/or strips may be positioned on any portion of the shoe sole. The sensor pads or strips can be embedded within an outsole, midsole, and/or insole, or be positioned between adjoining layers of the sole. Alternatively, the sensor pads or strips can be located in a removable insert (e.g., a removable insole) that can be placed into the shoe, or attached to an exterior, ground contacting, surface of the sole.


Alternatively, one or more sensors 105 may be placed within an insert 160 that may be releasably attached to the shoe 110 at a lacing portion 165, as shown in FIG. 10, or on one or more portions of an upper 135 of the shoe 110, as shown in FIG. 11.


The sensors 105 may be used to measure the location and distribution of each foot strike of each foot on the ground during running and/or the force and/or pressure applied to various portions of the foot during running. The measured data may be processed to produce biofeedback information that may be used to train a runner to run with a more efficient and safer foot strike location, such as with a midfoot strike. The data may be processed and communicated to a runner instantaneously, or substantially instantaneously, to give the runner immediate feedback during a run. The data may also be stored and used to generate both mean and time dependent results after the run is completed, thereby providing a runner and/or a coach with a full analysis of the runner's performance over the course of the run.


The sensors 105 may also be used to measure the cadence of the runner during a run, in addition to, or instead of, the foot strike information, by recording the time between each foot strike. Again, the measured data may be processed and communicated to a runner instantaneously, or substantially instantaneously, to give the runner immediate feedback during a run and/or be stored and used to generate both mean and time dependent results after the run is completed.


In one example embodiment, a shoe 110 may include a sensor 105 comprising a mechanical feedback device (e.g., a pin) located in a sole of a shoe 110 and, for example, at the heel portion. Data measured and transmitted from the sensor 105 can be used to determine when a runner's heel is in contact with the ground, thereby producing information that can be used to provide the runner with a better awareness of their gait.


One embodiment of the invention includes one or more sensors 205 positioned either in or on the upper 135 or sole 130 of a shoe 110 (as described hereinabove for the sensors 105) to measure data that can be utilized to determine a runner's posture and/or lean during a run. For example, one or more sensors 205 (e.g., goniometers) can be fixedly embedded in or releasably attached to an upper 135 of a shoe 110 to measure data that can be processed to provide biofeedback information related to a runner's posture and/or lean, as shown in FIG. 11. The sensors 205 may operate independently from, or in concert with, the sensors 105 for measuring foot strike and/or cadence. In one embodiment, the sensors 205 can communicate data to a remote receiver using the same transmitter as utilized by the sensors 105. Alternatively, the sensors 205 may utilize a separate transmitter. In an alternative embodiment, sensors for measuring the posture and/or lean of the user may be positioned at other locations on a body of a user (e.g., on an ankle, leg, waist, arm, or chest of the user).


In one embodiment, one or more sensors can be embedded within, or releasably attachable to, an item of apparel wearable by a runner. Alternatively, or in addition, one or more sensors can be mounted on a strap that may be worn by a runner, or removably affixed to a portion of a runner by a skin sensitive adhesive or tape. These sensors can be used in addition to, or in place of, sensors on a shoe to provide biofeedback information related to a performance characteristic of a runner.


In addition to providing biofeedback information related to a runner's proper running form (e.g., foot strike, cadence, posture, and/or lean information), the systems described herein may include sensors for measuring other parameters related to a runner's performance including, but not limited to, distance, pace, time, calories burned, heart rate, breaths per minute, blood lactate, and/or muscle activity (EMG). For example, measuring blood lactate levels may be of use in determining lactate threshold data in long-distance runners and other athletes.


In various embodiments, the sensors 105, 205 may be powered by one or more battery elements coupled to the sensors 105, 205. The batteries may be single use, replaceable batteries or be rechargeable batteries. The rechargeable batteries may be recharged by any appropriate means. Alternatively, the sensors 105, 205 may utilize the biomechanical action of a runner for power.


The sensors 105, 205 may be coupled to one or more transmitters for transmitting measured data to a remote receiver. The transmitter may include, or consist essentially of, a wireless transmitter and, more particularly, a radio frequency transmitter and/or an infrared transmitter. For example, the transmitter may be a radio transmitter adapted to transmit short wavelength radio transmissions via Bluetooth®, Bluetooth® Low Energy, and/or ANT or ANT+ protocols. In one embodiment the system may include a transmitting system capable of transmitting over a plurality of transmission protocols, thereby allowing the device to communicate with multiple different receiving systems. The transmitter may also, in one embodiment, be capable of receiving information transmitted from a remote source. This information may be utilized, for example, to turn on/off the sensors, calibrate the sensors, and/or control one or more function of the sensing system.


The data measured by the sensors in or on the shoe(s) can be transmitted to a remote receiving system for recording and/or analysis. An example system 300 for providing biofeedback information including both a sensing unit 305 and a receiving/analyzing unit 310 is shown in FIG. 12. The sensing unit 305 can include elements such as, but not limited to, one or more sensors 315, a power source 320, and a transmitting/receiving element 325. The sensing unit 305 can be positioned in or on a shoe and/or piece of apparel (as described herein). The remote receiving unit 310 can include elements such as, but not limited to, a transmitting/receiving element 330 for receiving the transmitted data from the sensing unit 305, a remote user feedback element 335 for receiving the data, a storage unit 340 for storing raw and/or analyzed data, a communication element 345 (e.g., a visual display such as a graphical user interface (GUI), an auditory communication element, and/or a tactile user interface) for communicating biofeedback information determined from the analyzed data to an athlete, and a power source 350. The transmitting/receiving element 330 can also be used to communicate with a remote database, such as an online database for an online running/coaching community, thereby allowing biofeedback information to be transmitted to the remote database and allowing biofeedback information, training instructions, software updates, or other digital information to be transmitted from the remote database to the receiving/analyzing unit 310.


Another example system 300 for providing biofeedback information is shown in FIG. 13. In this embodiment, the sensing unit 305 additionally includes an analyzing element 355 and a storage unit 360. Including an analyzing element 355 and a storage unit 360 in the sensing unit 305 allows for initial processing of the raw data from the one or more sensors 315 to be carried out in the sensing unit 305, with the raw and/or analyzed data stored in the storage unit 360 within the sensing unit 305. As a result, only the processed data, or a small package of information representative of the processed data, need be transmitted from the sensing unit 305 to the remote user feedback element 335, thereby reducing the quantity of information that needs to be transmitted between devices in order to provide real-time feedback to a user. This in turn reduces the drain on the power sources 320, 350, thereby extending the run-time and efficiency of the system 300. In addition, storing raw and/or processed data within the sensing unit 305 allows the data to be downloaded into an analyzing device (e.g., a computer) for further processing after the run is completed, regardless of whether the data was received by a remote user feedback element 335 during the run.
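By way of illustration only (this disclosure does not define a packet format), the following sketch shows how a per-stride result might be packed into a six-byte payload rather than transmitting a window of raw sensor samples; the field layout and strike codes are hypothetical.

```python
import struct

# Hypothetical compact payload: one byte for the strike type, one byte for a
# severity rating, and four bytes for a millisecond timestamp. Six bytes per
# stride is far smaller than a window of raw gyroscope samples.
STRIKE_CODES = {"heel strike": 0, "midfoot strike": 1, "forefoot strike": 2}
STRIKE_NAMES = {code: name for name, code in STRIKE_CODES.items()}

def pack_stride(strike_type: str, severity: int, timestamp_ms: int) -> bytes:
    """Pack one processed stride result into a small fixed-size payload."""
    return struct.pack("<BBI", STRIKE_CODES[strike_type], severity,
                       timestamp_ms & 0xFFFFFFFF)

def unpack_stride(payload: bytes):
    """Recover the stride result from a received payload."""
    code, severity, timestamp_ms = struct.unpack("<BBI", payload)
    return STRIKE_NAMES[code], severity, timestamp_ms
```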


The remote receiving unit 310 may, for example, be a watch, a portable media player (such as, but not limited to, an Apple Inc. iPod®), a customized receiving unit adapted to be worn by the user (e.g., attached to a detachable strap or adapted to fit in a pocket of the user's garments), a mobile phone or smart phone (such as, but not limited to, an Apple Inc. iPhone® or a Research In Motion Ltd. Blackberry®), a portable GPS device, an earpiece, and/or an item of headgear (e.g., a hat, visor, sunglasses, etc). Alternatively, or in addition, the remote receiver may be a laptop computer, a tablet computer, a desktop personal computer, and/or an athletic training system (e.g., a treadmill). In one embodiment the transmitting/receiving element 330 can be a separate unit (for example, a dongle—i.e., a hardware or software “key” allowing two remote devices to communicate) that is adapted to plug into a smart phone or computer to allow communication between the sensing unit 305 and receiving/analyzing unit 310.


The biofeedback information could be communicated to the runner through an auditory, optical, and/or tactile (e.g., vibratory) signal. Auditory signals can, for example, be communicated through a speaker (e.g., a small speaker within the shoe, the sensing unit, and/or receiving device) and/or in an earpiece or headphones worn by the athlete. Optical signals can, for example, be communicated through a visual display on a receiving device (e.g., a visual display on a smart phone screen) or through one or more optical transmitters such as, but not limited to, a light-emitting diode (LED) light source, coupled to the runner's shoe and/or to an optical transmitter attached to a piece of apparel or strap worn by the runner. Vibratory signals may be communicated to the runner through a vibration inducing element within the receiver and/or within or attached to the shoe.


The biofeedback information generated through analysis of the measured data from the sensor(s) can be relayed back to the athlete in any appropriate auditory form such as, but not limited to, a voice command and/or a warning signal. For example, the information may be communicated via a software generated spoken communication providing information and/or instructions to a runner (e.g., “You are now running on your heel”; “Your cadence is too low”; etc). Alternatively, or in addition, the auditory signal may include a click, beep, or other simple signal that can provide a runner with a warning if their running form does not meet a certain requirement and/or provide a positive signal if their running form does meet the required parameters. Such simple auditory signals can also be used to provide timing information (similar to a metronome) to give a runner a target cadence during a run. In addition, or alternatively, the auditory signal may include a change in the pitch or speed of a musical composition being played to a user, depending upon the actual cadence of the user with respect to a target cadence. By providing this practically real-time feedback, the athlete is able to make quick adjustments to their gait during the run.
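As one hypothetical illustration of the cadence-driven music adjustment, the sketch below maps the ratio of actual to target cadence onto a clamped playback-rate factor; the 180 steps-per-minute target and the clamp values are assumptions for illustration, not values specified in this disclosure.

```python
def playback_rate_for_cadence(actual_spm: float, target_spm: float = 180.0,
                              min_rate: float = 0.85, max_rate: float = 1.15) -> float:
    """Map the runner's actual cadence (steps per minute) to a music playback
    rate: running slower than the target slows the music, running faster
    speeds it up. The mapping and clamp values are illustrative assumptions."""
    if actual_spm <= 0 or target_spm <= 0:
        return 1.0                      # no valid cadence yet; play normally
    rate = actual_spm / target_spm      # proportional adjustment towards target
    return max(min_rate, min(max_rate, rate))
```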


In various embodiments the biofeedback information may be automatically communicated to the athlete, be communicated upon prompting from the runner (e.g., by initiating a communication command in the receiver), and/or be communicated as a summarized report at set periods (e.g., at the end of a set distance or period, the athlete could receive summarized biofeedback information relating to his/her performance over the last mile covered and/or the entire distance covered, such as “You spent 20% of your time on your heel over the last mile”).


One embodiment of the invention can include a system having one or more sensors that are placed inside a shoe of a runner to record foot strike. This information is converted to an auditory signal that is made through a small speaker embedded within, or attached to, the shoe of the runner. As a result, in this embodiment a separate receiving/communicating unit would not be needed to provide feedback to the runner.


In addition to, or instead of, providing real-time biofeedback information, the information may be stored by the system for later analysis by the athlete and/or a coach. This information may be used, for example, to provide an athlete with a history of their runs and their performance during each run. In one embodiment the biofeedback system merely stores data from the sensor(s) as simple raw data, with the processing and analysis of the data being performed by a processing unit upon completion of the run by downloading the raw data to the processing unit. This may be advantageous, for example, in minimizing the size, weight, and/or cost of the actual biofeedback system being carried by the athlete during a run, while still allowing for detailed analysis of the data to be communicated to the athlete and/or coach upon completion of the run.


The processed data may also be uploaded to a shared computer drive or a storage drive (e.g., a cloud computing system) for an online community, thereby allowing the information, or portions thereof, to be reviewed remotely by a coach and/or fellow athlete. In one embodiment, processing and analysis of the data transmitted from the sensors can be carried out by one or more application software programs (“Apps”) that may be downloaded onto a smart phone or other electronic device. Such “Apps” can be programmed to analyze data and present biofeedback information in any appropriate way, depending upon the specific training requirements of an athlete.


An example system 400 for monitoring one or more athletic performance characteristic of a user, providing real-time biofeedback information relating to the performance characteristic(s), and/or downloading data associated with the performance characteristic to a computer 455 is shown in FIG. 14. In this embodiment, a sensing unit 405 is adapted to be releasably or fixedly attached to a body portion of a user and, for example, a shoe of a user. The sensing unit 405 can include one or more sensing elements (e.g., a gyroscopic sensor, an accelerometer, etc) for monitoring one or more athletic performance characteristics of the user during an athletic activity such as running. The sensing unit 405 can also include elements such as, but not limited to, one or more power sources (e.g., a rechargeable or replaceable battery), a processing/analyzing unit (e.g., a microprocessor), a memory, an RF transmitting and/or receiving unit, and/or one or more dock contacts 460 for allowing the sensing unit 405 to dock with another device. Docking with another device may, for example, allow for the transferring of measured data (raw and/or processed) from the sensing unit 405 to an analyzing device (e.g., a computer), the transmission of software applications, instructions, upgrades, or settings (e.g., firmware pushes) to the sensing unit 405 from the analyzing device, and/or the recharging of the power source of the sensing unit 405. The dock contacts 460 may include a USB port or other appropriate port for connecting the sensing unit 405 to a receiving/analyzing device.


Data related to the one or more performance characteristics can be transmitted 430 from the sensing unit 405 to one or more remote real-time feedback (RTF) receiving devices 410 and/or to a remote analyzing and storing device 455 (e.g., a computer) for later analysis and long-term storage. The RTF unit 410 may be any of the devices described herein. FIG. 14 shows a system wherein a smart phone 420 and/or a watch 415 can be used to provide real-time feedback to the user. The data associated with the performance characteristic(s) of the user can be transmitted through any appropriate RF signal such as, but not limited to, Bluetooth®, Bluetooth® Low Energy, and/or ANT or ANT+ protocols. In one embodiment the RTF 410 (e.g., the smart phone 420) can communicate directly with the sensing unit 405 without the need for any additional hardware or software. In an alternative embodiment, a dongle 450 may be used to facilitate communication between the RTF 410 and the sensing unit 405.


In one embodiment, the RTF 410 can receive and analyze information from one or more sensing unit 405 and receive and analyze data from other sensors or sources in addition to the sensing unit 405 (e.g., one or more sensors embedded within or directly attached to the RTF 410 and/or one or more additional remote sensor units communicating wirelessly with the RTF 410), thereby allowing for the simultaneous processing of information associated with multiple performance characteristics of the user. For example, the RTF 410 can include, or can communicate with, performance measuring devices such as heart-rate monitors, GPS monitors, speed/distance/time monitors, oxygen level monitors, breathing rate monitors, energy usage monitors, etc.


In various embodiments the RTF 410 can communicate remotely with a computer 455 or other processing/storage device (e.g., a cloud computing system) through a wireless connection, and/or dock with a docking port 460 on the processing/storage device to allow for direct communication therebetween after a run is complete. The RTF 410 may, for example, be adapted to allow for either a one-way or two-way wireless or direct “docked” connection between the RTF 410 and the sensing unit 405, thereby allowing the RTF 410 to download software applications, instructions, upgrades, or settings to the sensing unit 405 and/or upload raw and/or processed data from the sensing unit 405 after a run is complete. For example, the sensing unit 405 can send small data packages representative of the performance characteristic (e.g., a foot strike location of a user) to the RTF 410 wirelessly during a run, with the raw data being stored on the sensing unit 405 and downloaded to the RTF 410 and/or to a remote analyzing receiving device 455 through a physical wired connection for further processing after the user's run is complete. The raw and/or processed data can then be stored in the RTF 410 and/or communicated 425 to an analyzing/storage device 455 or facility from the RTF 410.


In one embodiment, a software application (“App”) can be provided to an RTF 410 (e.g., a smart phone 420) to control various functions of the RTF 410 and facilitate the receiving of performance characteristic information from the sensing unit 405 and the communication of the associated performance information to the user.


The RTF 410 device (e.g., the smart phone 420 or the watch 415) can communicate information to the user in a variety of ways including, but not limited to, through a visual display screen, through audio signals emitted directly from the RTF 410 or communicated to the user through a wired or wireless (e.g., Bluetooth®) headphone connection, and/or through a vibration emitted by the RTF 410. Such information can be communicated constantly in real-time or at selected intervals. In one embodiment foot strike location information (e.g., a heel strike, a midfoot strike, or a forefoot strike) is communicated to the user, while additional performance characteristics information (e.g., cadence, heart rate, speed, distance travelled, time, etc) can be communicated at the same time, or substantially at the same time, as the foot strike information, or be communicated independently from the foot strike information.


The sensing unit 405 and RTF 410 may be attached to the user in various manners, depending upon the particular performance characteristics being monitored, the particular RTF 410 being used, and the particular real-time feedback being provided to the user. For example, one or more RTFs 410 can be mounted to a wrist, arm, waist, hip, shoulder, head, chest, or leg of a user 435, or be placed in a pocket of a garment of the user 435 or in a bag carried by the user 435. Example configurations of sensing unit 405 and RTF 410 can be seen in FIGS. 15-19. In FIGS. 15-19 the sensing unit 405 is mounted on a fastening portion of the shoe (e.g., releasably attached to the laces of a laced shoe). The biofeedback system 400 may utilize a single sensing unit 405 attached to only one foot of a user, as shown in FIGS. 15-16 and 18-19, or utilize sensing units 405 attached to both feet of the user, as shown in FIG. 17.


Using a sensing unit 405 on both feet allows the system 400 to accurately monitor performance characteristics such as foot strike location on both feet during a run. However, as running gait is often reasonably symmetric (i.e., it is rare for a runner to heel-strike with one foot while midfoot striking with the other foot), valuable training information and biofeedback can be obtained from the use of only one sensing unit pod 405, with the user able to switch the foot on which the sensing unit 405 is mounted between runs (or even during a break in a run) to ensure that feedback relating to the foot strike location on both feet can be provided over time. In addition, additional performance characteristics such as cadence can be obtained through use of only a single sensing unit 405 (as the cadence will be related to twice the number of foot strikes recorded by the sensing unit 405 on only one foot).


In the embodiment of FIG. 15, the RTF 410 is a wrist-based device 465 (e.g., a watch 415 or other appropriate RTF device such as a custom device specifically adapted to receiving information from the sensing unit 405 and communicating that information to a user) strapped to a wrist of the user 435 by a releasable strap or band 440. The wrist-based device 465 can include a visual display to provide a visual indication of the information, include a tactile device to provide a tactile sensation (e.g., a vibration) to the wrist if an event (e.g., a heel strike) occurs, and/or include a speaker to provide an auditory signal if an event (e.g., a heel strike) occurs.


The embodiment of FIG. 16 uses a smart phone 420 including a software application for controlling functionality of the smart phone 420 to allow it to act as an RTF 410. The smart phone 420 is releasably attached to the upper arm of the user 435 by a strap or band 440. As with the watch 415, the smart phone 420 can communicate information to the user 435 through a visual display, a vibration, and/or an auditory signal. For example, the smart phone 420 may output an auditory signal that may be communicated to the user 435 through a pair of headphones 445, as shown in FIG. 17. Utilizing an RTF 410 having two-way communication functionality (such as, but not limited to, a smart phone 420) allows the RTF 410 to both receive information from the sensing unit 405 and transmit information to another remote device (e.g., a computer 455) and/or back to the sensing unit 405.


In various embodiments other devices or garments may be used as an RTF 410 for receiving information from the sensing unit 405 and communicating information to the user 435. In the embodiment of FIG. 18, for example, a user 435 may wear a visor 470 that has a receiving element embedded therein or attached thereto. The receiving element may then process the data, if necessary, and communicate information to the user 435 through a visual display element 475 mounted to the visor 470. The visor can also include an auditory feedback element (e.g., a speaker or a headphone connection) and/or a vibratory feedback element in addition to, or instead of, the visual display 475.


In the embodiment of FIG. 19, the RTF 410 may include, or consist essentially of, a hand-held device 480, which can be held in a hand of a user 435 during a run. An example hand-held device 480 is shown in FIG. 20. As with other RTFs 410, the hand-held device 480 can include a variety of features such as, but not limited to, user communication elements, controls, and communication ports. As shown in FIG. 20, the hand-held device 480 can include a visual display screen 485 to provide visual information to the user 435, a headphone jack 490 to allow for auditory signals to be sent to the user 435, and a handle 495 having a vibration element held therein to provide tactile feedback to the user 435. Alternatively, or in addition, the hand-held device 480 can include one or more speakers to communicate an auditory signal to the user 435 without the need for headphones. The hand-held device 480 can also include control buttons 500 to allow the user to control one or more function of the hand-held device 480 and one or more communication ports 505 (e.g., a USB port) to allow for the downloading and uploading of information between the hand-held device 480 and another device (e.g., a computer 455), and to provide a charging port for the hand-held device 480.


In one embodiment the sensing unit 405 includes a gyroscopic sensor that is adapted to measure an angular velocity of the shoe during athletic activity. Analysis of this angular velocity data can then be used to determine foot strike information relating to the user's running style, thereby allowing the user to be informed of whether a particular foot strike was a heel strike, a midfoot strike, or a forefoot strike. This information can be communicated to the user in real time, at set intervals, and/or be stored in the sensing unit 405 for later analysis and processing. In addition, data from a gyroscopic sensor can be used to determine when a foot strike occurs, thereby allowing for the measurement of cadence (i.e., number of foot strikes per minute) information for the user. For example, foot strike location information (e.g., a heel strike, a midfoot strike, or a forefoot strike), cadence information, and/or other performance characteristic information can be communicated to the user upon every foot strike. Alternatively, or in addition, compiled and/or averaged performance characteristic information can be communicated to the user upon the completion of a preset or user selected number of foot strikes, upon the completion of a preset or user selected distance travelled or time travelled, upon request from the user, and/or at the end of a run.


In one embodiment, a sensing unit including a gyroscopic sensor can be releasably mounted to an upper of a shoe at a fastening portion (such as a lacing portion or hook-and-loop fastening portion) located centrally, or substantially centrally, on the top of the shoe. An example gyroscopic sensor unit 510 mounted to the lacing portion 520 of a shoe 515 is shown in FIGS. 21 and 22. Alternatively, the gyroscopic sensor unit 510 can be mounted to a heel portion 525 of a shoe 515, as shown in FIGS. 23 and 24, or to any other appropriate location on the upper or sole of the shoe 515, depending upon the shoe, the specific mounting arrangement required, and/or the measurement and calibration requirements of the sensor.


The gyroscopic sensor may be a one-axis sensor, a two-axis sensor, or even a three-axis sensor, allowing for the measurement of angular velocity (x′, y′, z′) about any axis (X,Y,Z), as shown in FIG. 25, and thereby allowing for the measurement of a variety of performance parameters associated with a foot strike. For example, careful analysis of an angular velocity measurement z′ about the Z-axis allows for the identification of a foot strike location (i.e., a heel strike, a midfoot strike, or a forefoot strike) of a user during a stride, and thereby also provides information as to whether a user's foot is dorsiflexed (heel strike), plantarflexed (forefoot strike), or neutral (midfoot strike) upon impact with a ground surface. Similarly, careful analysis of an angular velocity measurement x′ about the X-axis allows the system to identify whether a user's foot is pronating (wherein the ankle caves inwards towards the other foot) or supinating (wherein the ankle caves outwards away from the other foot) at impact with a ground surface. In addition, analysis of an angular velocity measurement y′ about the Y-axis allows the system to identify whether a user's foot is abducting (wherein the forefoot rotates outwards away from the other foot) or adducting (wherein the forefoot rotates inwards towards the other foot) at impact with a ground surface.
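A minimal sketch of how the x′ and y′ readings at impact might be interpreted is shown below. The sign conventions and the dead-band value are assumptions made purely for illustration (they would depend on how the sensor is mounted and calibrated); foot strike location itself would be determined from the z′ trace as described elsewhere herein.

```python
def interpret_roll_and_yaw(x_rate: float, y_rate: float, dead_band: float = 0.1):
    """Interpret x' and y' angular velocity at initial impact.

    Sign conventions and the dead_band value are illustrative assumptions that
    depend on sensor mounting; z'-based foot strike classification is handled
    separately (see the ratio-based sketch earlier in this description).
    """
    roll = "neutral"
    if x_rate > dead_band:
        roll = "pronating"     # ankle caving inwards towards the other foot
    elif x_rate < -dead_band:
        roll = "supinating"    # ankle caving outwards away from the other foot

    yaw = "neutral"
    if y_rate > dead_band:
        yaw = "abducting"      # forefoot rotating outwards away from the other foot
    elif y_rate < -dead_band:
        yaw = "adducting"      # forefoot rotating inwards towards the other foot

    return roll, yaw
```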


In general, the goal of an algorithm processing raw z′ angular velocity data from the gyroscopic sensor(s) is to accurately detect the direction and speed of the foot's rotation at the moment of impact with a ground surface, and to determine from this the type of foot strike (i.e., forefoot, midfoot, or heel strike) and the severity of the foot strike (e.g., severe heel strike, intermediate heel strike, mild heel strike, midfoot strike at rear of midfoot, midfoot strike at center of midfoot, midfoot strike at front of midfoot, mild forefoot strike, intermediate forefoot strike, or severe forefoot strike). An example method of calculating foot strike location information from a gyroscopic sensor reading is discussed below. In this embodiment, data from only a single axis of a gyroscopic sensor is required to measure foot strike location and cadence.


The data recorded from the gyroscopic sensor may be considered to be periodic, with each gait cycle being one period. During a normal gait cycle, the foot often has a positive rotation just before the foot strike (see, for example, FIGS. 26 and 27). After the foot strike, during the stance phase of the gait, the foot is flat on the ground and is relatively motionless. The foot strike occurs during a short time period (e.g., between about 10 ms and 50 ms and, for example, about 30 ms) between the end of the swing phase and the start of the stance phase, with the angular velocity data (z′) about the Z-axis during that period providing information that can be analyzed to determine the type of foot strike.


During a heel striking event the heel touches the ground first, followed by a rapid rotation of the foot forwards until the sole is flat against the ground. As a result, a large positive rotational velocity during a foot strike event, accompanied by little or no negative rotational velocity, is measured for a heel strike. In contrast, midfoot striking is indicated by a smaller positive rotational velocity followed by a clearly defined negative rotational velocity, while a forefoot strike shows a larger negative velocity than that observed for a midfoot strike. Thus, there is a clear qualitative difference in the gyroscopic sensor data about the Z-axis between heel striking and forefoot/midfoot striking. It is often found that the maximum localized rotational velocity occurs close to, but slightly after, the initial impact during a foot strike event.


In one embodiment, raw data from the gyroscopic sensor can be analyzed to determine foot strike location for a foot strike event. In an alternative embodiment, pre-processing of the data (for example through amplification and/or low-pass, high-pass, band-pass, band-stop, and/or Butterworth filtering) may be carried out prior to analyzing the data to determine foot strike information. The sample rate of the gyroscopic sensor can also be set at any appropriate rate to ensure that the sample rate is sufficient to obtain accurate information without sampling at a rate that would draw too much power and/or take up too much storage space. The sample rate may be set, for example, at between 500 and 5000 Hz or between 500 and 2000 Hz or, more particularly, between 800 and 1500 Hz and, for example, at about 900 Hz. In one embodiment, a sample rate of about 900 Hz may be set to ensure that at least about 300 samples are taken per stride.
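
By way of a non-limiting illustration, the following Python sketch shows one way such pre-processing might be implemented. The 900 Hz sample rate reflects the example given above, while the 50 Hz cut-off frequency and the 4th-order filter are illustrative assumptions rather than values specified herein.

```python
# Illustrative pre-processing sketch: zero-phase low-pass Butterworth filtering
# of raw z' angular velocity samples prior to foot strike analysis.
import numpy as np
from scipy.signal import butter, filtfilt

SAMPLE_RATE_HZ = 900.0  # example rate from the ~800-1500 Hz range discussed above

def preprocess_gyro(z_prime, cutoff_hz=50.0, order=4):
    """Filter raw gyro samples; cutoff_hz and order are assumed example values."""
    nyquist = SAMPLE_RATE_HZ / 2.0
    b, a = butter(order, cutoff_hz / nyquist, btype="low")
    return filtfilt(b, a, np.asarray(z_prime, dtype=float))
```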


Example gyroscopic sensor data for a heel strike is shown in FIG. 26, while example raw gyroscopic sensor data for a midfoot strike is shown in FIG. 27. Three foot strike events are shown in each graph. As shown, the midfoot strike exhibits a positive spike 550 followed by a negative spike 555, with the negative component showing that the foot is rotating “backwards” (i.e., the heel is coming down relative to the toe) after initial impact. Conversely, the heel strike has a positive spike 550 but does not have a negative component (i.e., the local minimum 575 is never less than zero), showing that the foot is rotating “forwards” (i.e., the toe is moving down relative to the heel). A forefoot strike is characterized by having a small positive peak 550 relative to the negative peak 555, showing that the foot is rotating backwards more severely than for a midfoot strike. Cadence can be determined simply by counting the number of foot strike events over a specified time.
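
As a non-limiting illustration of the cadence calculation, the sketch below counts foot strike events over a recorded interval. The 2.0 rad/s spike threshold and 0.3 s refractory period are assumptions introduced only for the example; the disclosure specifies only that cadence can be obtained by counting foot strike events over time.

```python
# Hypothetical cadence estimate: count positive z' spikes separated by a
# minimum refractory time, then convert to strides per minute.
def estimate_cadence(z_prime, sample_rate_hz=900.0, threshold=2.0, refractory_s=0.3):
    """Estimate cadence (strides per minute) from a list of z' samples."""
    refractory_samples = int(refractory_s * sample_rate_hz)
    strikes, last_strike = 0, -refractory_samples
    for i, w in enumerate(z_prime):
        if w > threshold and (i - last_strike) >= refractory_samples:
            strikes += 1
            last_strike = i
    minutes = len(z_prime) / sample_rate_hz / 60.0
    return strikes / minutes if minutes > 0 else 0.0
```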


In one embodiment, a microprocessor analyzes each angular velocity sample from the gyroscopic sensor and determines whether or not a foot strike has occurred. To find the moment of foot strike, the microprocessor first needs to distinguish one stride from another with a periodic trigger, as shown in FIG. 28. When the foot is swinging forward, the toe is moving up and the rotational velocity is negative. When the rotational velocity is negative, the microprocessor adds each sample to a sum. When the sum reaches a preset level, a flag is set and the sum resets 560. This happens once for each stride, just before the moment of the foot strike. After the flag is set, the microprocessor starts looking for a maximum value 550. Once found, it finds the minimum value 555 within a set window. It is the ratio of this maximum and minimum value that determines the rating for the foot strike.
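
The following Python sketch illustrates the per-sample logic described above. The trigger level and window sizes are illustrative assumptions; only the sequence of steps (accumulate negative velocity during the swing, trigger once per stride, find the local maximum, then find the local minimum within a set window) follows the description.

```python
# Illustrative stride detection: the constants below are assumed example values.
SUM_TRIGGER = 50.0       # preset level for the accumulated negative velocity
SEARCH_SAMPLES = 200     # samples searched forward for the positive strike peak
WINDOW_SAMPLES = 30      # samples searched after the peak for the local minimum

def detect_foot_strikes(z_prime):
    """Return one (local_max, local_min) pair per detected foot strike."""
    results = []
    neg_sum, i, n = 0.0, 0, len(z_prime)
    while i < n:
        if z_prime[i] < 0:
            neg_sum += -z_prime[i]                 # accumulate during the swing phase
        if neg_sum >= SUM_TRIGGER:                 # periodic trigger: once per stride
            neg_sum = 0.0
            search = z_prime[i:i + SEARCH_SAMPLES]
            if len(search):
                peak = max(range(len(search)), key=lambda k: search[k])
                local_max = search[peak]
                window = z_prime[i + peak:i + peak + WINDOW_SAMPLES]
                local_min = min(window) if len(window) else local_max
                results.append((local_max, local_min))
                i += peak + WINDOW_SAMPLES         # skip past this strike event
        i += 1
    return results
```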


The window during which the maximum and minimum angular velocity measurements are taken for the foot strike location analysis may be set to any appropriate size and starting point to ensure that the required maximum and minimum peaks are captured without additional, non-relevant, data contaminating the calculation. For example, the window may be set to capture and analyze data starting at, and/or ending at, a particular identifying event during the periodic data cycle or a specific time period or specific number of samples prior to or after a particular identifying event during the periodic data cycle, and/or be set to cover a specific time period or a specific number of samples from a set starting point, as appropriate.


In one embodiment, the window may begin from the moment of initial impact, or the moment of the first local maximum angular velocity after the initial impact, and continue until a point at or after a first local minimum angular velocity measurement is observed. Alternatively, the window may be set to review data captured within a specific time window (e.g., between 10 ms and 50 ms and, for example, about 30 ms) or sample size starting at the moment of initial impact, or at the moment of the first maximum rotational value peak after initial impact, or at any other appropriate time prior to, at, or after the instant of initial impact. In another embodiment, the window for analysis may open at the time at which the angular velocity is zero prior to initial impact, and close at the time at which the angular velocity transitions from negative to positive after the local minimum angular velocity is observed.


In one embodiment, the foot strike location can be determined by comparing the maximum value with the minimum value within the sample window in accordance with the following equation:

Foot Strike Location Rating (F) = |(Local Minimum Value × 100) / Local Maximum Value|

The results of this calculation produce a non-dimensional value from Fmin to Fmax (for example, a value of between Fmin=0 and Fmax=100), with Fmin indicating a severe heel strike, Fmax indicating a severe forefoot strike, and values in the middle of the range indicating a midfoot strike. The specific values identifying each range may, for one example user, be:

    • X1 to X2=heel strike (with “X1” indicating a severe heel strike and values closer to X2 indicating a milder heel strike)
    • X2 to X3=midfoot strike
    • X3 to X4=forefoot strike (with “X4” indicating a severe forefoot strike and values closer to X3 indicating a milder forefoot strike)


The values for X2 and X3 may be either preset or be calibrated and/or selected to ensure accuracy for a specific user. Example ranges for the values of X2 may be about 15-40 or, more particularly, about 20-30 or 20-25 or, more particularly, about 20. Example ranges for the values of X3 may be about 60-80 or, more particularly, about 65-75 or 65-70 or, more particularly, about 70.
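
As a non-limiting illustration, the rating equation and classification thresholds may be applied as in the sketch below. The thresholds X2 = 20 and X3 = 70 are the example values discussed above and would in practice be preset or calibrated for a specific user.

```python
# Illustrative foot strike rating and classification using the example thresholds.
X2, X3 = 20.0, 70.0          # example boundary values from the description above
F_MIN, F_MAX = 0.0, 100.0    # example rating range

def foot_strike_rating(local_max, local_min):
    """F = |(local minimum value * 100) / local maximum value|."""
    return abs((local_min * 100.0) / local_max)

def classify_strike(rating):
    if rating < X2:
        return "heel strike"
    if rating < X3:
        return "midfoot strike"
    return "forefoot strike"
```

For example, a strike with a local maximum of 8.0 and a local minimum of 0.5 yields a rating of about 6 (a heel strike under these thresholds), whereas a local maximum of 4.0 with a local minimum of -1.6 yields a rating of 40 (a midfoot strike).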


Additional processing may be carried out to provide accurate foot strike position information, if appropriate. For example, a Gaussian multiplier or other appropriate 2nd order multiplier may be applied to assist in analyzing the results. Additionally, or in the alternative, results outside a set results window (e.g., outside the ranges Fmin to Fmax) may be either discarded or shifted to a preset default value. For example, any result greater than Fmax can either be shifted to indicate a value “Fmax” or be discarded entirely.
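
The out-of-range handling described above may be sketched, for illustration only, as follows; whether a value is shifted or discarded would be an implementation choice.

```python
# Illustrative post-processing of a single rating: values outside Fmin..Fmax
# are either shifted to the nearest bound or discarded entirely.
def clamp_or_discard(rating, f_min=0.0, f_max=100.0, discard=False):
    if f_min <= rating <= f_max:
        return rating
    return None if discard else min(max(rating, f_min), f_max)
```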


In an alternative embodiment, a ratio of the integral of the negative portion of the curve 565 against the integral of the positive portion of the curve 570 (i.e., from calculations of the area under the negative portion of the curve 565 and the positive portion of the curve 570) may be utilized to produce the foot strike location rating, rather than ratios of the absolute values of the minimum and maximum local peaks. The data to be integrated may be taken over any appropriate window size. Utilization of integrated results may be useful, for example, in producing accurate readings in situations where the sample rate of the gyroscopic sensor is insufficient to ensure that accurate local maximum and minimum peak values can be obtained. In addition, the integral of rotational velocity relates to a change in angle of the foot, which may produce valuable performance characteristic information relating to the foot strike of a user. Where beneficial, various other embodiments may utilize other appropriate algorithms for processing the data to determine the foot strike information.
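
A minimal sketch of this alternative, assuming trapezoidal integration over the analysis window, is shown below; the scaling by 100 mirrors the peak-based rating above and is an assumption rather than a specified formula.

```python
# Illustrative integral-based rating: compare the area under the negative and
# positive portions of the z' curve within the analysis window.
import numpy as np

def integral_rating(z_prime_window, sample_rate_hz=900.0):
    w = np.asarray(z_prime_window, dtype=float)
    dt = 1.0 / sample_rate_hz
    pos_area = np.trapz(np.clip(w, 0.0, None), dx=dt)    # area under positive portion
    neg_area = -np.trapz(np.clip(w, None, 0.0), dx=dt)   # magnitude of negative area
    return (neg_area * 100.0) / pos_area if pos_area > 0 else None
```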


In one embodiment, the sensor unit may be set to go into a “sleep-mode” (i.e., a reduced power consumption mode) during periods of inactivity. This sleep-mode may be set to turn on (when a sensor remains at rest for a given period) and off (when the sensor is moved) automatically. Alternatively, or in addition, the sensor unit may include a user interface (e.g., a hardware switch or a software “switch” initiated by a wireless signal from an RTF) allowing a user to turn the power to the sensor on and off manually.


In one embodiment, the gyroscopic sensor takes data continuously throughout the entire gait cycle. However, in an alternative embodiment, the gyroscopic sensor can be turned on and off during a gait cycle so that it only takes data during the foot strike event phase of the gait cycle. This may, for example, reduce power consumption by the sensor unit and/or reduce the amount of raw data being stored and/or processed by the sensor unit. For example, a separate sensor (e.g., an accelerometer) may be used in the sensor unit in addition to the gyroscopic sensor, with the accelerometer being used to indicate when a foot strike event is taking place and the gyroscopic sensor only capturing angular velocity data upon triggering by the system when the accelerometer data indicates that a foot strike event is occurring. Additionally, or alternatively, the accelerometer data may be used to calculate the cadence information (with the gyroscopic sensor data only being used for foot strike location calculations), and/or the accelerometer may be used to determine whether a user is standing still, walking, or running, with the gyroscopic sensor only being activated when the user is running. The accelerometer can, for example, determine whether a foot strike event is a walking foot strike or a running foot strike based on the magnitude of the acceleration measured at impact (with, for example, a running foot strike indicated by a higher magnitude acceleration measurement).
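
A minimal sketch of such accelerometer-triggered duty cycling is shown below; the 3 g running threshold and the read_gyro_window() callback are assumptions introduced for illustration and are not part of this disclosure.

```python
# Hypothetical duty-cycling sketch: the gyroscope is sampled only when the
# accelerometer indicates a running foot strike.
RUN_ACCEL_THRESHOLD_G = 3.0   # assumed impact magnitude separating running from walking

def capture_running_strike(accel_magnitude_g, read_gyro_window):
    """Return a window of z' samples for a running foot strike, else None."""
    if accel_magnitude_g < RUN_ACCEL_THRESHOLD_G:
        return None               # standing, walking, or no impact: gyro stays off
    return read_gyro_window()     # caller supplies the platform-specific gyro read
```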


The performance characteristic information (e.g., the foot strike location information) can be processed and communicated to the user in a number of ways. For example, in one embodiment, as shown in FIG. 29(A), the raw data can be processed to produce a foot strike location rating value of between Fmin and Fmax (e.g., between 0 and 100), with that value being stored and communicated to the user. In this example, the user would be provided with an accurate calculation of exactly where on the foot the center of the initial foot strike impact occurred, with a further indication of whether this value represented a heel strike, a midfoot strike, or a forefoot strike being calculated based on the set values for X2 and X3.


In one embodiment, as shown in FIG. 29(B), the foot strike location rating value can be placed into one of a number of “bins”, with the system communicating to the user which bin a particular foot strike (or an average foot strike) fell into. In the embodiment of FIG. 29(B) the foot strike information is divided into nine bins (severe heel strike 605, intermediate heel strike 610, mild heel strike 615, midfoot strike at rear of midfoot 620, midfoot strike at center of midfoot 625, midfoot strike at front of midfoot 630, mild forefoot strike 635, intermediate forefoot strike 640, or severe forefoot strike 645), although a greater or lesser number of bins may be used where appropriate. In a further alternative embodiment, the foot strike location rating value can be used to merely indicate whether a heel strike 650, a midfoot strike 655, or a forefoot strike 660 has occurred, as shown in FIG. 29(C).
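
A non-limiting sketch of the nine-bin mapping of FIG. 29(B) is shown below. Spreading the bins evenly across the rating range is an assumption made for simplicity; the bin boundaries could equally be aligned with the X2 and X3 thresholds or calibrated per user.

```python
# Illustrative nine-bin mapping of a foot strike location rating value.
BINS = [
    "severe heel strike", "intermediate heel strike", "mild heel strike",
    "midfoot strike at rear of midfoot", "midfoot strike at center of midfoot",
    "midfoot strike at front of midfoot",
    "mild forefoot strike", "intermediate forefoot strike", "severe forefoot strike",
]

def rating_to_bin(rating, f_min=0.0, f_max=100.0):
    """Map a rating onto one of the nine bins (evenly spaced for illustration)."""
    rating = min(max(rating, f_min), f_max)
    index = min(int((rating - f_min) / (f_max - f_min) * len(BINS)), len(BINS) - 1)
    return BINS[index]
```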


By performing the analysis described hereinabove with a microprocessor within the sensor unit, the performance characteristic data transmitted to a receiving unit can be limited to a small package of data representing the foot strike location rating value or the appropriate “bin” into which a given value falls. The cadence information can also be transmitted along with the foot strike information, or may be determined inherently from the time at which each foot strike value is transmitted. Minimizing the data that needs to be sent from the sensing unit to the RTF can reduce the power required by the system and thereby extend the running time of the system.
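
For illustration only, such a data package might be as small as a timestamp plus a one-byte bin index, as in the sketch below; the field layout shown is an assumption and not the wire format of this disclosure. Cadence can then be recovered at the receiver from the timestamps alone.

```python
# Hypothetical compact data package: 8-byte timestamp plus 1-byte bin index.
import struct
import time

def pack_foot_strike(bin_index, timestamp_ms=None):
    if timestamp_ms is None:
        timestamp_ms = int(time.time() * 1000)
    return struct.pack("<QB", timestamp_ms, bin_index)   # 9 bytes per foot strike
```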


In various embodiments this information may be communicated to the user through a visual signal, an auditory signal, and/or a tactile signal. For example, an RTF could generate a sound (e.g., a buzz), cause a vibration, and/or flash a visual indicator only when a certain event occurs (e.g., a heel strike). Alternatively, different sounds, visual signals, and/or vibrations could be assigned to different events, thereby providing a different indicator to the user depending upon the event occurring. The severity of an event (e.g., a severe, intermediate, or mild heel strike) may be indicated, for example, by different sounds or by changes in the volume, pitch, tone, or other acoustic parameter of a given sound. Similarly, variations in the intensity or color of a visual signal and/or variations in the intensity of a vibration can be used to differentiate between different severities of a given foot strike event. In one embodiment, the auditory, visual, and/or tactile signals associated with a specific foot strike position may be preset, or be selected by the user depending upon his/her particular preferences and the functionality of the RTF being used.
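
A minimal sketch of such an event-to-feedback mapping is given below; the specific tones and vibration intensities are assumptions and would in practice be preset or user-selected as noted above.

```python
# Illustrative feedback mapping from strike events to alert parameters.
FEEDBACK = {
    "severe heel strike":       {"tone_hz": 880, "vibration": 1.0},
    "intermediate heel strike": {"tone_hz": 660, "vibration": 0.6},
    "mild heel strike":         {"tone_hz": 440, "vibration": 0.3},
    "midfoot strike":           None,   # no alert for the desired strike type
}

def feedback_for(event):
    return FEEDBACK.get(event)
```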


The sensing unit can, in one embodiment, include a housing unit adapted to house the sensor, a power source, a processor, and/or a transmitter. The housing unit may, for example, provide a protective casing for the sensor and other electronics and/or provide a mounting means by which the sensor can be mounted to the shoe of the user. The housing unit may be adapted to be releasably attachable to a sole and/or upper of the shoe of the user, or be fixedly attached to, or embedded within an upper and/or sole of the shoe. In one embodiment the housing unit is releasably attachable to at least one of a fastening portion (e.g., the lacing portion or a hook-and-loop fastening portion) of a shoe or a heel portion of the shoe. The housing unit may have an integrated mounting element for mounting the device to a shoe. Alternatively, the housing unit and mounting unit may be separate, attachable, elements that can be coupled together to mount the sensor package to the shoe. Example housing units 700 and mounting elements 705 are shown in FIGS. 30a to 33c.


The embodiment shown in FIG. 30a includes a housing unit 700 that is either fixedly or detachably attached to an elastic or inelastic mounting element 705 that includes hooked ends 710 for engaging the laces of a shoe. FIG. 30b includes a housing unit 700 having a plurality of elastic or inelastic hooks 715 extending from the housing unit 700 for engaging the laces or the eyelets of a shoe. FIG. 30c includes a housing unit 700 having a flexible or inflexible base unit 720 that can be placed under the laces of a shoe and be fixedly or detachably connected to the housing unit 700.



FIGS. 30d and 30e include a flexible or inflexible base unit 725 that can be slid over the heel of a shoe or slid into a pocket on the heel of the shoe (as shown in FIG. 30d), or slid under the laces of a shoe (as shown in FIG. 30e). The base unit 725 may include pins for pinning the base unit 725 into one or more layers of the upper of the shoe. Again, the base unit 725 can be fixedly or detachably attached to the housing unit 700. FIGS. 30f and 30g show a similar configuration but with a latching element 730 allowing the end of the base unit 725 to latch onto the housing unit 700 to lock it in place when mounted to the shoe.


In one embodiment, as shown in FIG. 31a, magnetic elements 730 can be used to releasably attach a housing unit 700 to the upper of a shoe 735, while in FIG. 31b pins 740 may be inserted through the upper of the shoe 735 to releasably or fixedly hold the housing unit 700 thereto. In another embodiment a detachable mounting unit 745 having a plurality of latching elements 750 can be placed below the laces 755 of a shoe, as shown in FIGS. 32a and 32b, thereby allowing the housing unit 700 to be detachably mounted to the lacing portion of the shoe.


In one embodiment, as shown in FIG. 33a, a housing unit 700 can be slideably connectable with a mounting unit 760. The mounting unit 760 may include flexible arms 765, that can be pinched in to allow the ends 770 of the arms 765 to be hooked under the laces of the shoe or the sides of the lacing portion of the upper of the shoe. The arms 765 may alternatively be fixedly mounted to the housing unit 700. In another embodiment, as shown in FIG. 33b, the housing unit 700 may be formed from a flexible elongate length of material with the sensor, a power source, a processor, and/or a transmitter held within. This flexible housing could, for example, be slid below the laces and held in place by the closing force applied by the laces to the shoe. In yet another embodiment a housing unit 700 may be releasably or fixedly mounted to a clip 775 that can be clipped onto the upper of a shoe at a heel portion (as shown in FIG. 33c) or over a lacing portion of a shoe, or be attached to the upper of a shoe through an adhesive attachment, a hook-and-loop attachment, a pinned attachment, or any other appropriate attachment means.


In one embodiment data from a gyroscopic sensor coupled, for example, to an upper body of a user can be used to provide posture and/or lean information during a run. For example, a gyroscopic sensor in a smart phone or other RTF strapped to the upper body of the user (e.g., on the torso or upper arm of the user) can be used to measure posture and/or lean information that can be coupled with the foot strike and cadence information to provide the runner with a more complete analysis of their running form. The smart phone may then act as both a sensor unit and an RTF. In one embodiment, the user can calibrate the gyroscopic sensor within the smart phone or other RTF prior to use, for example by simply strapping the smart phone to their person and then standing upright (e.g., against a wall) to “zero” the sensor prior to the run.
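
The “zeroing” step described above may be sketched, for illustration, as averaging a few seconds of samples while the user stands still and subtracting that offset from later readings; the function names below are hypothetical.

```python
# Hypothetical calibration sketch for an upper-body gyroscopic sensor.
import numpy as np

def compute_zero_offset(still_samples):
    """Average angular-velocity samples captured while the user stands upright."""
    return np.mean(np.asarray(still_samples, dtype=float), axis=0)

def zeroed(sample, offset):
    """Remove the standing-still offset from a subsequent reading."""
    return np.asarray(sample, dtype=float) - offset
```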


In one embodiment, the foot strike position information may be supported by GPS data, or other appropriate data, that can be used to determine the altitude, and more particularly the change in altitude, of the user for a particular foot strike. More particularly, the altitude data can be used to determine whether a runner was running on a flat or substantially flat stretch of ground, or was running uphill or downhill, for each foot strike. This may be beneficial in providing context for each foot strike to ensure that accurate data and instructions are communicated to the runner, as even runners with good running form will tend to heel strike when running down a substantial incline and forefoot strike when running up a substantial incline.
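
A minimal sketch of this incline context, assuming a simple grade threshold (the 3% value is an assumption, not a value specified herein), is given below.

```python
# Illustrative incline classification from GPS altitude change over distance run.
GRADE_THRESHOLD = 0.03   # assumed grade above which a stretch is treated as an incline

def incline_context(altitude_change_m, distance_m):
    if distance_m <= 0:
        return "unknown"
    grade = altitude_change_m / distance_m
    if grade > GRADE_THRESHOLD:
        return "uphill"
    if grade < -GRADE_THRESHOLD:
        return "downhill"
    return "flat"
```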


In one embodiment, a shoe may have one or more control elements embedded therein to adjust an element of the shoe. For example, control elements may be embedded in the sole of a shoe to adjust a stiffness, flexibility, and/or thickness of the sole in response to a signal received from a remote source. As such, a biofeedback system can be adapted to send biofeedback information measured by one or more sensors in a shoe to a remote receiver, analyze the data to determine one or more performance characteristics of the runner during a run, and send a control signal to control elements in the shoe to adjust a characteristic of the shoe to compensate for shortcomings in the runner's technique and/or to assist in training the runner to run with proper form. In an alternative embodiment the analyzing and controlling steps may be carried out by a control element embedded within, or attached to, the shoe, without the need for a remote receiver.


While the discussions hereinabove relate to the use of sensors to measure and provide feedback relating to a runner's form during jogging or running, in alternative embodiments the systems and methods described herein may be used to measure one or more performance characteristics related to athletic activities other than running. For example, the systems and methods described herein may be useful for training purposes in sports that require an athlete to make rapid changes of direction (e.g., soccer, football, basketball, baseball, softball, tennis, squash, badminton, volleyball, lacrosse, field hockey, ultimate Frisbee, etc.), where analysis of the athlete's foot striking during a turn or “cut” may provide valuable performance information. Similarly, sports that require good jumping form (e.g., hurdling, basketball, volleyball, etc.) may also benefit from devices and methods that provide accurate information relating to the position of the athlete's foot during push off and/or landing. In addition, devices that provide accurate information about the position and rotation of the feet of a user may provide valuable training information for a golfer during a swing, for a bowler during a throwing action, for a baseball pitcher during a throw, or for other athletic motions requiring a high degree of repeatability.


It should be understood that alternative embodiments, and/or materials used in the construction of embodiments or alternative embodiments, are applicable to all other embodiments described herein.


The invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The foregoing embodiments, therefore, are to be considered in all respects illustrative rather than limiting of the invention described herein. The scope of the invention is thus indicated by the appended claims, rather than by the foregoing description, and all changes that come within the meaning and range of equivalency of the claims are intended to be embraced therein.

Claims
  • 1. An apparatus for monitoring one or more athletic performance characteristics of a user, the apparatus comprising: a sensing unit adapted to be attachable to a shoe of a user, the sensing unit comprising: (i) a first sensor adapted to monitor a movement of a foot of the user while the user is in motion, the first sensor comprising a gyroscopic sensor for measuring angular velocity data for the foot of the user;(ii) processing means for (a) processing the angular velocity data from the first sensor to determine a processed value indicative of a first athletic performance characteristic comprising a foot strike location of a foot of the user upon striking a ground surface, and (b) comparing the processed value to a range of predetermined comparison values to determine the athletic performance characteristic relating to the processed value, the range of comparison values comprising a first range of values indicative of a heel strike, a second range of values indicative of a midfoot strike, and a third range of values indicative of a forefoot strike;(iii) a periodic trigger adapted to distinguish one stride from another for determining a point in time in each gait cycle before a foot strike event takes place to trigger the processing means to process angular velocity data from the first sensor to identify a maximum value and a minimum value during a window of time, wherein the point in time is triggered when, during a foot swing portion of each gait cycle, a sum of negative angular velocities, measured by the first sensor, reaches a preset level; and(iv) transmitting means for transmitting a data package representative of the performance characteristic to a remote receiver, the data package comprising an indication of whether a heel strike, a midfoot strike, or a forefoot strike has occurred.
  • 2. The apparatus of claim 1, further comprising receiving means for: (i) receiving the data package transmitted from the sensing unit, and (ii) communicating information representative of the performance characteristic to the user.
  • 3. The apparatus of claim 2, wherein the receiving means comprises at least one of a visual signal, an auditory signal, and a tactile signal.
  • 4. The apparatus of claim 2, wherein the information is communicated to the user in substantially real-time.
  • 5. The apparatus of claim 2, wherein the receiving means comprises at least one remote user feedback device.
  • 6. The apparatus of claim 5, wherein the remote user feedback device is selected from the group consisting of: a watch, a detachable strap, a mobile phone, an earpiece, a hand-held feedback device, a laptop computer, a head-mounted feedback device, and a desktop personal computer.
  • 7. The apparatus of claim 2, wherein the receiving means comprises means for controlling at least one function of a remote user feedback device.
  • 8. The apparatus of claim 2, wherein at least one of the sensing unit and the receiving means comprises means for storing data related to the first athletic performance characteristic.
  • 9. The apparatus of claim 1, wherein the processing means comprises a microprocessor.
  • 10. The apparatus of claim 1, wherein the sensing unit comprises a housing unit adapted to house the first sensor, the processing means, and the transmitting means.
  • 11. The apparatus of claim 10, wherein the housing unit is adapted to be releasably attachable to the shoe of the user.
  • 12. The apparatus of claim 11, wherein the housing unit is adapted to be releasably attachable to at least one of a fastening portion and a heel portion of the shoe.
  • 13. The apparatus of claim 1, further comprising means for determining at least one second athletic performance characteristic of the user.
  • 14. The apparatus of claim 13, wherein the determination of the at least one second athletic performance characteristic is based upon an output from the first sensor.
  • 15. The apparatus of claim 13, wherein the determination of the at least one second athletic performance characteristic is based upon an output from at least one second sensor.
  • 16. The apparatus of claim 15, wherein the at least one second sensor is selected from the group consisting of: an accelerometer, a pressure sensor, a force sensor, a temperature sensor, a chemical sensor, a global positioning system, a piezoelectric sensor, a rotary position sensor, a gyroscopic sensor, a heart-rate sensor, and a goniometer.
  • 17. The apparatus of claim 13, wherein the at least one second athletic performance characteristic comprises at least one of a cadence, a posture, a lean, a speed, a distance travelled, or a heart rate of the user.
  • 18. The apparatus of claim 1, wherein the processing means performs a comparison of a localized maximum angular velocity measurement and a localized minimum angular velocity measurement during a foot strike event to determine the processed value.
  • 19. The apparatus of claim 18, wherein the processing means divides the localized minimum angular velocity measurement during a foot strike event by the localized maximum angular velocity measurement during a foot strike event to determine the processed value.
  • 20. The apparatus of claim 1, wherein the processing means integrates measured angular velocity data during a foot strike event to determine the processed value.
  • 21. A method for monitoring one or more athletic performance characteristics of a user, the method comprising the steps of: providing a sensing unit adapted to be attachable to a shoe of a user, the sensing unit comprising: (i) a first sensor adapted to monitor a movement of a foot of the user while the user is in motion, the first sensor comprising a gyroscopic sensor for measuring angular velocity data for the foot of the user,(ii) processing means for (a) processing the angular velocity data from the first sensor to determine a processed value indicative of a first athletic performance characteristic comprising a foot strike location of a foot of the user upon striking a ground surface, and (b) comparing the processed value to a range of predetermined comparison values to determine the athletic performance characteristic relating to the processed value, the range of comparison values comprising a first range of values indicative of a heel strike, a second range of values indicative of a midfoot strike, and a third range of values indicative of a forefoot strike,(iii) a periodic trigger adapted to distinguish one stride from another for determining a point in time in each gait cycle before a foot strike event takes place to trigger the processing means to process angular velocity data from the first sensor to identify a maximum value and a minimum value during a window of time, wherein the point in time is triggered when, during a foot swing portion of each gait cycle, a sum of negative angular velocities, measured by the first sensor, reaches a preset level, and(iv) transmitting means for transmitting a data package representative of the performance characteristic to a remote receiver, the data package comprising an indication of whether a heel strike, a midfoot strike, or a forefoot strike has occurred; andproviding receiving means for receiving the data package transmitted from the sensing unit and communicating information representative of the performance characteristic to the user.
  • 22. The method of claim 21, wherein the information is communicated to the user in substantially real-time.
  • 23. The apparatus of claim 1, wherein the periodic trigger comprises a second sensor.
  • 24. The apparatus of claim 1, wherein the periodic trigger triggers at a pre-set angular velocity measurement from the first sensor.
  • 25. The apparatus of claim 1, wherein the window of time comprises a starting point associated with initiation of the foot strike event.
  • 26. The apparatus of claim 25, wherein the starting point comprises a pre-set angular velocity measurement.
  • 27. The apparatus of claim 25, wherein the starting point comprises a moment before the initiation of the foot strike event.
  • 28. The apparatus of claim 25, wherein the starting point comprises a moment of initial impact of the foot strike event.
  • 29. The apparatus of claim 25, wherein the starting point comprises a moment of a first local maximum angular velocity measurement after an initial impact of the foot strike event.
  • 30. The apparatus of claim 25, wherein the starting point comprises a moment of an angular velocity measurement of about zero prior to an initial impact of the foot strike event.
  • 31. The apparatus of claim 25, wherein the window of time comprises a temporal length from the starting point to an ending point.
  • 32. The apparatus of claim 31, wherein the ending point comprises a moment of a transition from a negative angular velocity measurement to a positive angular velocity measurement after a local minimum angular velocity measurement.
  • 33. The apparatus of claim 31, wherein the ending point comprises a moment of a first local minimum angular velocity measurement after the foot strike event.
  • 34. The apparatus of claim 31, wherein the temporal length comprises between about 10 msec and about 50 msec.
  • 35. The apparatus of claim 31, wherein the temporal length comprises a period between a moment of a first local maximum angular velocity measurement after an initial impact of the foot strike event and a moment of a first local minimum angular velocity measurement.
  • 36. The apparatus of claim 21, wherein the gyroscopic sensor is activated only when the user is running.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to and the benefit of U.S. Provisional Patent Application Ser. No. 61/440,243, filed Feb. 7, 2011, the disclosure of which is incorporated herein by reference in its entirety.

Related Publications (1)
Number Date Country
20130041617 A1 Feb 2013 US
Provisional Applications (1)
Number Date Country
61440243 Feb 2011 US