Systems and methods for time-based athletic activity measurement and display

Information

  • Patent Grant
  • Patent Number
    12,170,138
  • Date Filed
    June 16, 2023
  • Date Issued
    December 17, 2024
Abstract
An athletic parameter measurement device worn by an athlete during an athletic activity session includes a housing which attaches to the athlete, a display, a processor associated with the display, and an athletic parameter measurement sensor. During the athletic activity, the device detects, using the sensor, a vertical jump height of the athlete, and displays, during the performance of the athletic activity session, a representation of the vertical jump height on the display.
Description
TECHNICAL FIELD

The invention relates generally to recordation and visualization of athletic activity. In particular, aspects described herein relate to time-based recordation and review of athletic activity and time-specific metrics associated therewith.


BACKGROUND

Exercise and fitness have become increasingly popular and the benefits from such activities are well known. Various types of technology have been incorporated into fitness and other athletic activities. For example, a wide variety of portable electronic devices are available for use during fitness activities, such as MP3 or other audio players, radios, portable televisions, DVD players or other video playing devices, watches, GPS systems, pedometers, mobile telephones, pagers, beepers, etc. Many fitness enthusiasts or athletes use one or more of these devices when exercising or training to keep themselves entertained, to provide performance data, or to keep in contact with others. Such users have also demonstrated an interest in recording their athletic activities and metrics associated therewith. Accordingly, various sensors may be used to detect, store and/or transmit athletic performance information. Oftentimes, however, athletic performance information is presented in a vacuum or based only on the overall athletic activity. Athletic performance data might not be readily available for a particular period or instance of time during the athletic activity session. As such, users might not be able to identify the specific times or time periods within their workout or other athletic activity at which certain metrics or performance statistics were achieved.


A full discussion of the features and advantages of the present invention is referred to in the following detailed description, which proceeds with reference to the accompanying drawings.


SUMMARY

The following presents a general summary of aspects of the invention in order to provide a basic understanding of at least some of its aspects. This summary is not an extensive overview of the invention. It is not intended to identify key or critical elements of the invention or to delineate the scope of the invention. The following summary merely presents some concepts of the invention in a general form as a prelude to the more detailed description provided below.


One or more aspects describe systems and methods for tracking athletic activity metrics based on a timeline. Metrics may be recorded continuously or based on a predefined schedule. In either case, multiple values may be recorded for the same metric and associated with the particular time period or specific time at which the values were detected. For example, athletic performance data may be detected and recorded for every minimum time unit. The minimum time unit may correspond to 1 second, 2 seconds, a millisecond, 10 seconds and the like. Using such time-based recordings, the user may review instantaneous and specific metric values to determine how they were performing at particular points during their athletic activity performance.


According to another aspect, users may display the multiple metrics simultaneously in an interface during review of the athletic activity session. For example, a user may display a video in a primary visualization area with overlays of one or more desired metrics. Additionally or alternatively, a toolbar may be displayed to provide other metrics not currently displayed in the primary visualization area.


According to yet another aspect, the multiple metrics may be recorded using multiple different applications or widgets. A user may select which metrics and/or widgets to use prior to the athletic activity session or prior to initiation of recordation. The user may also modify the selected metrics or applications during session recordation.


According to still another aspect, a user may edit the collected data before or after the metrics and other data are compiled into a single athletic activity session file or electronic content item (e.g., an enhanced video). For example, the user may exclude metrics from the athletic activity session file even if the metrics have already been recorded. Additionally or alternatively, the user may crop a video or other metrics to a desired period of time (e.g., smaller than the overall duration of the athletic activity session).


According to yet another aspect, a plurality of video segments of an athletic activity session of a user may be captured by a plurality of video sources. A processing system may determine that each of the plurality of video segments corresponds to the athletic activity session of the user, and, accordingly, the processing system may generate a video replay of the athletic activity session of the user by piecing together the plurality of video segments captured by the plurality of video sources. A first portion of the video replay includes a first video segment captured by a first video source of the plurality of video sources, and a second portion of the video replay includes a second video segment captured by a second video source of the plurality of video sources different from the first video source.


According to yet another aspect, an athletic parameter measurement device configured to be worn by an athlete during an athletic activity session detects and displays one or more metrics during the athletic activity session. For example, the athletic parameter measurement device may include a housing with an attachment mechanism configured to be attached to the athlete during the athletic activity session, a display, a processor associated with the display, and at least one athletic parameter measurement sensor. The device may be configured to detect, by the at least one athletic parameter measurement sensor, at least one metric of the athlete during the athletic activity session while the housing is worn by the athlete, wherein the at least one metric of the athlete includes a vertical jump height of the athlete, transmit, by the at least one athletic parameter measurement sensor to the processor, the at least one metric, and display, by the processor on the display, during performance of the athletic activity session, a representation of the at least one metric.


Other aspects and features are described throughout the disclosure.





BRIEF DESCRIPTION OF THE DRAWINGS

To understand the present invention, it will now be described by way of example, with reference to the accompanying drawings in which:



FIG. 1 illustrates an example computing environment in which one or more aspects described herein may be used;



FIG. 2 illustrates an example computing device that may be used according to one or more aspects described herein;



FIGS. 3A and 3B illustrate example sensor and monitoring device communication environments according to one or more aspects described herein;



FIG. 3C illustrates an example shoe sensor system having force sensing capabilities according to one or more aspects described herein;



FIG. 4 is a flow diagram illustrating the features of a time-based athletic performance monitoring system according to one or more aspects described herein;



FIG. 5 illustrates an example metric/application selection interface according to one or more aspects described herein;



FIG. 6 illustrates an example activity selection interface according to one or more aspects described herein;



FIG. 7 illustrates an example recording initiation interface according to one or more aspects described herein;



FIG. 8 illustrates an example interface displaying a user's recorded activity path and additional metrics according to one or more aspects described herein;



FIG. 9 illustrates another example activity selection interface according to one or more aspects described herein;



FIG. 10 illustrates an example interface displaying video of a user's recorded activity, a timeline and other metrics in a metric toolbar according to one or more aspects described herein;



FIG. 11 illustrates another example interface displaying user activity metrics according to one or more aspects described herein;



FIG. 12 illustrates an example landscape display of a user's activity metrics with overlaid metric information according to one or more aspects described herein;



FIG. 13 illustrates an example interface in which a user may crop a recorded activity session according to one or more aspects described herein;



FIG. 14 illustrates an example interface through which a recorded activity session may be shared according to one or more aspects described herein;



FIG. 15 illustrates an example community website through which recorded activity metrics may be shared according to one or more aspects described herein;



FIGS. 16A and 16B illustrate example display overlays for conveying activity metrics according to one or more aspects described herein;



FIGS. 17A-17D illustrate example interfaces configured to display a comparison between two activity sessions and/or athletes according to one or more aspects described herein;



FIGS. 18A and 18B illustrate example interfaces that may be adjusted using an intersection point between display regions according to one or more aspects described herein;



FIG. 19 illustrates the editing of metric data upon compiling an activity session file according to one or more aspects described herein; and



FIGS. 20 and 21 illustrate example environments in which multiple video or data capture sources may be used according to one or more aspects described herein.





DETAILED DESCRIPTION

In the following description of various example embodiments of the invention, reference is made to the accompanying drawings, which form a part hereof, and in which are shown by way of illustration various example devices, systems, and environments in which aspects of the invention may be practiced. It is to be understood that other specific arrangements of parts, example devices, systems, and environments may be utilized and structural and functional modifications may be made without departing from the scope of the present invention. Also, while the terms “top,” “bottom,” “front,” “back,” “side,” and the like may be used in this specification to describe various example features and elements of the invention, these terms are used herein as a matter of convenience, e.g., based on the example orientations shown in the figures. Nothing in this specification should be construed as requiring a specific three dimensional orientation of structures in order to fall within the scope of this invention.


Various examples of the invention may be implemented using electronic circuitry configured to perform one or more functions. For example, with some embodiments of the invention, the athletic information monitoring device, the collection device, the display device or any combination thereof may be implemented using one or more application-specific integrated circuits (ASICs). More typically, however, components of various examples of the invention will be implemented using a programmable computing device executing firmware or software instructions, or by some combination of purpose-specific electronic circuitry and firmware or software instructions executing on a programmable computing device.


Example Hardware Devices


FIG. 1 shows one illustrative example of a computer 101 that can be used to implement various embodiments of the invention. As seen in this figure, the computer 101 has a computing unit 103. The computing unit 103 typically includes a processing unit 105 and a system memory 107. The processing unit 105 may be any type of processing device for executing software instructions, but will conventionally be a microprocessor device. The system memory 107 may include both a read-only memory (ROM) 109 and a random access memory (RAM) 111. As will be appreciated by those of ordinary skill in the art, both the read-only memory (ROM) 109 and the random access memory (RAM) 111 may store software instructions for execution by the processing unit 105.


The processing unit 105 and the system memory 107 are connected, either directly or indirectly, through a bus 113 or alternate communication structure to one or more peripheral devices. For example, the processing unit 105 or the system memory 107 may be directly or indirectly connected to additional memory storage, such as the hard disk drive 115, the removable magnetic disk drive 117, the optical disk drive 119, and the flash memory card 121. The processing unit 105 and the system memory 107 also may be directly or indirectly connected to one or more input devices 123 and one or more output devices 125. The input devices 123 may include, for example, a keyboard, touch screen, a remote control pad, a pointing device (such as a mouse, touchpad, stylus, trackball, or joystick), a scanner, a camera or a microphone. The output devices 125 may include, for example, a monitor display, television, printer, stereo, or speakers.


Still further, the computing unit 103 will be directly or indirectly connected to one or more network interfaces 127 for communicating with a network. This type of network interface 127, also sometimes referred to as a network adapter or network interface card (NIC), translates data and control signals from the computing unit 103 into network messages according to one or more communication protocols, such as the Transmission Control Protocol (TCP), the Internet Protocol (IP), and the User Datagram Protocol (UDP). These protocols are well known in the art, and thus will not be discussed here in more detail. An interface 127 may employ any suitable connection agent for connecting to a network, including, for example, a wireless transceiver, a power line adapter, a modem, or an Ethernet connection.


It should be appreciated that, in addition to the input, output and storage peripheral devices specifically listed above, the computing device may be connected to a variety of other peripheral devices, including some that may perform input, output and storage functions, or some combination thereof. For example, the computer 101 may be connected to a digital music player, such as an IPOD® brand digital music player available from Apple, Inc. of Cupertino, California. As known in the art, this type of digital music player can serve as both an output device for a computer (e.g., outputting music from a sound file or pictures from an image file) and a storage device. In addition, this type of digital music player also can serve as an input device for inputting recorded athletic information, as will be discussed in more detail below.


In addition to a digital music player, the computer 101 may be connected to or otherwise include one or more other peripheral devices, such as a telephone. The telephone may be, for example, a wireless “smart phone.” As known in the art, this type of telephone communicates through a wireless network using radio frequency transmissions. In addition to simple communication functionality, a “smart phone” may also provide a user with one or more data management functions, such as sending, receiving and viewing electronic messages (e.g., electronic mail messages, SMS text messages, etc.), recording or playing back sound files, recording or playing back image files (e.g., still picture or moving video image files), viewing and editing files with text (e.g., Microsoft Word or Excel files, or Adobe Acrobat files), etc. Because of the data management capability of this type of telephone, a user may connect the telephone with the computer 101 so that the data maintained on each may be synchronized.


Of course, still other peripheral devices may be included with or otherwise connected to a computer 101 of the type illustrated in FIG. 1, as is well known in the art. In some cases, a peripheral device may be permanently or semi-permanently connected to the computing unit 103. For example, with many computers, the computing unit 103, the hard disk drive 115, the removable optical disk drive 119 and a display are semi-permanently encased in a single housing. Still other peripheral devices may be removably connected to the computer 101, however. The computer 101 may include, for example, one or more communication ports through which a peripheral device can be connected to the computing unit 103 (either directly or indirectly through the bus 113). These communication ports may thus include a parallel bus port or a serial bus port, such as a serial bus port using the Universal Serial Bus (USB) standard or the IEEE 1394 High Speed Serial Bus standard (e.g., a Firewire port). Alternately or additionally, the computer 101 may include a wireless data “port,” such as a Bluetooth interface, a Wi-Fi interface, an infrared data port, or the like.


It should be appreciated that a computing device employed according to various examples of the invention may include more components than the computer 101 illustrated in FIG. 1, fewer components than the computer 101, or a different combination of components than the computer 101. Some implementations of the invention, for example, may employ one or more computing devices that are intended to have a very specific functionality, such as a digital music player or server computer. These computing devices may thus omit unnecessary peripherals, such as the network interface 127, removable optical disk drive 119, printers, scanners, external hard drives, etc. Some implementations of the invention may alternately or additionally employ computing devices that are intended to be capable of a wide variety of functions, such as a desktop or laptop personal computer. These computing devices may have any combination of peripheral devices or additional components as desired.



FIG. 2 illustrates one example of an athletic information monitoring device 201 that may be employed according to various examples of the invention to measure athletic information corresponding to a user's athletic activity. As shown in this figure, the athletic information monitoring device 201 includes a digital music player 203, an electronic interface device 205, and an athletic parameter measurement device 207. As will be described in more detail, in one embodiment, the digital music player 203 may be (releasably) connected to the electronic interface device 205, and the combination is worn or otherwise carried by the user while he or she is performing an athletic activity, such as running or walking. The athletic parameter measurement device 207 also is worn or carried by the user while he or she is performing an athletic activity, and measures one or more athletic parameters relating to the athletic performance being performed by the user. The athletic parameter measurement device 207 transmits signals to the electronic interface device 205 that correspond to the measured athletic parameter. The electronic interface device 205 receives the signals from the athletic parameter measurement device 207, and provides the received information to the digital music player 203. In some arrangements, the electronic interface device 205 might not be used if the digital music player 203 or other electronic device is capable of interfacing with the measurement device 207 directly. For example, the athletic parameter measurement device 207 may be configured to communicate using the Bluetooth wireless communication protocol, so that it can be employed with Bluetooth-capable mobile telephones, personal digital assistants, watches or personal computers.


As shown in more detail in FIG. 3A, the athletic parameter measurement device 207 includes one or more sensors 301 for measuring an athletic parameter associated with a person wearing or otherwise using the athletic parameter measurement device 207. With the illustrated implementations, for example, the sensors 301A and 301B may be accelerometers (such as piezoelectric accelerometers) for measuring the acceleration of the athletic parameter measurement device 207 in two orthogonal directions. The athletic parameter measurement device 207 is carried or otherwise worn by a user to measure the desired athletic parameter while the user exercises. For example, as shown in FIG. 3B, the athletic parameter measurement device 207 may be located in the sole of a user's shoe 401 while the user walks or runs. With this arrangement, the sensors 301 will produce electrical signals corresponding to the movement of the user's foot. As known in the art, these signals can then be used to generate athletic data representative of the athletic activity performed by the user.


The athletic parameter measurement device 207 also includes a processor 303 for processing the electrical signals output by the sensors 301. With some implementations of the invention, the processor 303 may be a programmable microprocessor. For still other implementations of the invention, however, the processor 303 may be a purpose-specific circuit device, such as an ASIC. The processor 303 may perform any desired operation on the signals output from the sensors 301, such as curve smoothing, noise filtering, outlier removal, amplification, summation, integration, or the like. The processor 303 provides the processed signals to a transmitter 305. The athletic parameter measurement device 207 also includes a power supply 307 for providing power to the sensors 301, the processor 303, and the transmitter 305 as needed. The power supply 307 may be, for example, a battery.


The athletic parameter measurement device 207 transmits the processed signals to the electronic interface device 205, as seen in FIG. 3B. Returning now to FIG. 3A, the electronic interface device 205 includes a receiver 309 which receives the processed signals transmitted by the transmitter 305 in the athletic parameter measurement device 207. The receiver 309 relays the processed signals to a second processor 311, which processes the signals further. Like the processor 303, the processor 311 may perform any desired operation on the processed signals, such as curve smoothing, noise filtering, outlier removal, amplification, summation, integration, or the like.


The processor 311 provides the processed signals to the digital music player 203. Referring back now to FIG. 2, the electronic interface device 205 includes a connector system 209 that physically plugs into and connects with a conventional input port 211 provided on the digital music player 203. The input port 211 into which the connector system 209 of the electronic interface device 205 connects may be any desired type of input port for transferring data, such as a parallel data port, a serial data port, an earphone or microphone jack, etc. The connector system 209 may include any suitable connecting devices, such as wires, pins, electrical connectors, and the like, so as to make an electrical connection or other suitable connection with corresponding elements provided in the input port 211 of the digital music player 203 (e.g., to allow electronic and/or data communications between the interface device 205 and the digital music player 203). If necessary or desired, additional securing elements may be provided to securely connect the interface device 205 to the digital music player 203, such as straps, hooks, buckles, clips, clamps, clasps, retaining elements, mechanical connectors, and the like.


Returning now to FIG. 3A, the processor 311 provides the processed signals to the computing unit 313. The computing unit 313 may initially store the processed signals in the memory 315. Further, with some implementations of the invention, the computing unit 313 may operate on the processed signals provided by the athletic information monitoring device 201 to generate a set of athletic data corresponding to the athletic activity performed by the user. For example, if the athletic information monitoring device 201 includes accelerometers for measuring the movement of the user's foot, the computing unit 313 may analyze the processed signals from the athletic information monitoring device 201 to generate a set of athletic data describing the user's speed at specific instances during the user's athletic activity and the total distance traveled by the user at each of those specific instances. Various techniques for determining a user's speed from accelerometer signals are described in, for example, U.S. Pat. No. 6,898,550 to Blackadar et al., entitled “Monitoring Activity Of A User In Locomotion On Foot,” and issued on May 24, 2005, U.S. Pat. No. 6,882,955 to Ohlenbusch et al., entitled “Monitoring Activity Of A User In Locomotion On Foot,” and issued on Apr. 19, 2005, U.S. Pat. No. 6,876,947 to Darley et al., entitled “Monitoring Activity Of A User In Locomotion On Foot,” and issued on Apr. 5, 2005, U.S. Pat. No. 6,493,652 to Ohlenbusch et al., entitled “Monitoring Activity Of A User In Locomotion On Foot,” and issued on Dec. 10, 2002, U.S. Pat. No. 6,298,314 to Blackadar et al., entitled “Detecting The Starting And Stopping Of Movement Of A Person On Foot,” and issued on Oct. 2, 2001, U.S. Pat. No. 6,052,654 to Gaudet et al., entitled “Measuring Foot Contact Time And Foot Loft Time Of A Person In Locomotion,” and issued on Apr. 18, 2000, and U.S. Pat. No. 6,018,705 to Gaudet et al., entitled “Measuring Foot Contact Time And Foot Loft Time Of A Person In Locomotion,” and issued on Jan. 25, 2000, each of which is incorporated entirely herein by reference.
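
By way of a non-authoritative illustration (the speed-derivation techniques themselves are those of the patents cited above), the following sketch shows how per-instant speed values, however derived, might be paired with a running total distance to form the kind of athletic data set described. All names are hypothetical:

```python
# Minimal sketch (not the patented algorithms): given per-instant speed
# samples already derived from the accelerometer signals, build an
# athletic data set of (time, speed, total distance) records.

def build_athletic_data(speed_samples, dt=1.0):
    """speed_samples: speed in meters/second at each dt-spaced instant."""
    athletic_data = []
    total_distance = 0.0
    for i, speed in enumerate(speed_samples):
        total_distance += speed * dt  # accumulate distance over the interval
        athletic_data.append({"t": i * dt, "speed": speed,
                              "distance": total_distance})
    return athletic_data

print(build_athletic_data([2.5, 2.8, 3.0], dt=1.0))
```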


The athletic data set may also include a time value associated with each speed value and/or each distance value. If the athletic information monitoring device 201 can be employed to collect athletic information from different users, then the athletic data computing unit 313 may additionally prompt the user to identify himself or herself in some way. This identification information may then be included with the athletic data set generated from the information provided by the athletic information monitoring device 201. Once the computing unit 313 has generated a set of athletic data from the information provided by the athletic information monitoring device 201, the computing unit 313 may store the athletic data set in the memory 315. As will be discussed in more detail below, when the digital music player 203 subsequently is connected to a computing device implementing an athletic information collection tool, the computing unit 313 will download the athletic data to a display configuration tool hosted on a remote computing device.


While wireless communication between the athletic parameter measurement device 207 and the interface device 205 is described for the embodiments illustrated in FIGS. 2-3B, any desired manner of communicating between the athletic parameter measurement device 207 and the interface device 205 may be used without departing from the invention, including wired connections. Also, any desired way of placing data derived from the physical or physiological data from the athletic parameter measurement device 207 in the proper form or format for display on or output from the electronic device (e.g., the digital music player 203) may be provided without departing from the invention.


If desired, in accordance with at least some examples of this invention, the electronic interface device 205 may further include a display 220 and/or a user input system 222, such as one or more rotary input devices, switches, buttons (as shown in the illustrated example in FIG. 2), mouse or trackball elements, touch screens, or the like, or some combination thereof. The display 220 may be employed to show, for example, information relating to music being played by the digital music player 203, information relating to the athletic information signals being received by the digital music player 203, athletic data being generated by the digital music player 203 from the received athletic information signals, etc. The user input system 222 may be employed, for example: to control one or more aspects of the processing of the input data received via interface device 205, to control input data receipt (e.g., timing, types of information received, on-demand data requests, etc.), to control data output to or by the electronic device 203, to control the athletic parameter measurement device 207, etc. Alternatively or additionally, if desired, the input system on the digital music player 203 (e.g., buttons 222, a touch screen, a digitizer/stylus based input, a rotary input device, a trackball or roller ball, a mouse, etc.), may be used to provide user input data to the interface device 205 and/or to the athletic parameter measurement device 207. As still another example, if desired, a voice input system may be provided with the interface device 205 and/or the digital music player 203, e.g., to enable user input via voice commands. Any other desired type of user input system, for control of any system elements and/or for any purpose, may be provided without departing from the invention.


The digital music player 203 may include additional input and/or output elements, e.g., such as ports 224 and 226 shown in FIG. 2, e.g., for headphones (or other audio output), power supplies, wireless communications, infrared input, microphone input, or other devices. If desired, and if these ports 224 and/or 226 would be covered when the interface device 205 is attached to the electronic device 203, the interface device 205 may be equipped with similar external ports to ports 224 and/or 226, and internal circuitry may be provided in the interface device 205 to enable the user to plug the same additional devices into the interface device 205 as they might plug into the digital music player 203 and still take advantage of the same functions (e.g., to thereby allow the necessary data, signals, power, and/or information to pass through the interface device 205 to the user, to another output, and/or to the digital music player 203).


It should be appreciated that, while some specific embodiments of the invention described above relate to a digital music player 203, alternate examples of the invention may be implemented using any portable electronic device. For example, with some implementations of the invention, the athletic parameter measurement device 207 may be used in conjunction with a mobile telephone, a watch, a personal digital assistant, another type of music player (such as a compact disc or satellite radio music player), a portable computer, or any other desired electronic device.


It also should be appreciated that, while a specific example of an athletic parameter measurement device 207 has been described above for ease of understanding, any type of desired athletic parameter measurement device 207 can be employed with various embodiments of the invention. For example, with some implementations of the invention, the athletic parameter measurement device 207 may be a heart rate monitor, a blood oxygen monitor, a satellite positioning device (e.g., a Global Positioning Satellite (GPS) navigation device), a device for measuring the electrical activity of the user (e.g., an EKG monitor), or any other device that measures one or more physical parameters of the user. Still further, the athletic parameter measurement device 207 may measure one or more operational parameters of some device being manipulated by the user, such as the speed and/or distance of a bicycle, the speed and/or work performed by a treadmill, rowing machine, elliptical machine, or stationary bicycle, the speed and/or distance traveled by skis (water or snow), skates (roller or ice), or snowshoes or the like worn by the user, etc. Other types of sensors may include strain gages, temperature sensors, heart-rate monitors and the like. In one or more arrangements, a user may equip multiple sensors and, in some instances, the same type of sensor in multiple locations. For example, users may wear shoes that are each equipped with an accelerometer, weight sensor or the like, in order to allow a system to determine the individual movement and metrics of each foot or other body part (e.g., leg, hand, arm, individual fingers or toes, regions of a person's foot or leg, hips, chest, shoulders, head, eyes). Examples of multi-sensor apparel and the use of multiple sensors in athletic activity monitoring are described in U.S. application Ser. No. 12/483,824, entitled “FOOTWEAR HAVING SENSOR SYSTEM,” and published as U.S. Publication No. 2010/0063778 A1, and U.S. application Ser. No. 12/483,828, entitled “FOOTWEAR HAVING SENSOR SYSTEM,” and published as U.S. Publication No. 2010/0063779 A1. The contents of the above-referenced applications are incorporated herein by reference in their entirety. In a particular example, an athlete may wear a shoe having one or more force sensing systems, e.g., systems that utilize force-sensitive resistor (FSR) sensors. The shoe may include multiple FSR sensors that detect forces at different regions of the user's foot (e.g., a heel, mid-sole, toes, etc.). This may help determine the balance of a user's foot or between a user's two feet. In one exemplary embodiment, an FSR sensor array may take a form such as that shown in FIG. 3C.


Also, while the athletic parameter measurement device 207 has been described as being separate from the digital music player 203 or other portable electronic device that receives the signals from the athletic parameter measurement device 207, with some implementations of the invention the athletic parameter measurement device 207 may be incorporated into or integrated with the digital music player 203 or other portable electronic device. For example, some implementations of the invention may employ a music player, mobile telephone, watch or personal digital assistant that incorporates accelerometers, a satellite positioning device, or any other desired device for measuring athletic activity. Still further, it should be appreciated that various implementations of the invention may employ a plurality of athletic parameter measurement devices 207, incorporated into the digital music player 203 or other portable electronic device, separate from the digital music player 203 or other portable electronic device, or some combination thereof.


Time-Based Data Collection


Athletic performance monitoring systems such as digital music player 203 or interface 205 of FIG. 2 may be used to collect, edit, store and share athletic performance data as measured by one or more external or internal sensors. This athletic performance data may be collected over a period of time that the user is performing an activity. To provide data specificity and flexibility in the use of the data, the monitoring system may collect data several times during the course of the athletic activity. In one example, the monitoring system may collect and store athletic data at every minimum time unit. For example, the minimum time unit may correspond to every second that the user is engaged in the athletic activity. In another example, the monitoring system may collect and store athletic data for every 0.5 seconds, 5 seconds, 10 seconds, 30 seconds, minute or the like. The data collected may then be mapped, associated and/or otherwise stored with the corresponding instant in time or time period in which the data was captured. The minimum time unit may be defined by the user or the system or may be defined based on a minimum time unit that is used to record video or audio of the activity session. For example, if a video provides playback granularity at the half second level, the system may record performance data at every half second. In another example, if a video is recorded at 30 frames per second, the system may record performance data (e.g., metrics) every 1/30th of a second to match each frame of video.
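
As a minimal sketch of the frame-matched sampling idea, assuming a simple callback-based recorder and hypothetical names throughout, metric capture might be keyed to the video frame rate as follows:

```python
# Hypothetical sketch: derive the minimum time unit from the video frame
# rate (e.g., 1/30 s for 30 fps video) and key each metric sample to its
# capture time so samples line up with individual video frames.

def record_session(read_metrics, duration_s, fps=30):
    min_time_unit = 1.0 / fps
    samples = {}
    n_samples = int(duration_s * fps) + 1
    for i in range(n_samples):
        t = i * min_time_unit               # capture time for this sample
        samples[round(t, 4)] = read_metrics()
    return samples

# e.g., record a 1-second session of a constant dummy metric:
print(len(record_session(lambda: {"pace": 7.8}, duration_s=1.0)))  # 31
```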



FIG. 4 illustrates a general process by which a user may collect and use athletic performance data. The user may initially capture desired metric data. For example, a user may select or otherwise specify the type of metric that he or she wishes to record during an athletic activity session. In one example, the user may select metrics by selecting or deselecting individual types of metrics from a user interface. In another example, the user may select metrics by identifying a previously recorded set of athletic performance data and indicating that he or she wishes to record the same metrics as the previous athletic performance data set. Metric data may include video, audio, speeds, paces, reaction times, jump height, locations (e.g., using a GPS sensor or cellular triangulation), sweat level, body temperature, reach distance, weight lifted, strength and the like. Once the data is captured, the user may edit the data, share the data and motivate himself or herself and/or others (e.g., by attempting to beat one or more metrics of a previously recorded activity session).


Many different types of metrics may be measured and recorded in association with a time at which the metric was detected. FIG. 5 illustrates an example user interface through which a user may select various time-specific metrics to record. Other metrics may still be recorded for an athletic activity even if not selected; however, those other metrics might only be recorded as an average over the entire workout (e.g., rather than storing the metric information at the same level of granularity (e.g., 1 second, 2 seconds) as the selected metrics). Accordingly, selected metrics may be detected, recorded and/or stored at a first level of granularity (e.g., a first rate such as every second, every 2 seconds, every 30 seconds, every millisecond, etc.) while non-selected metrics may be detected, recorded and/or stored at a second level of granularity (e.g., every 2 minutes, every 10 minutes, every 15 minutes), where the first level of granularity is greater than the second level of granularity. For some metrics that correspond to a period of time (e.g., pace), the metric may be recorded for a specified period of time (e.g., 2 seconds) and associated with every time unit of that period (e.g., a pace of 7.8 mi/hour over 2 seconds is recorded for and associated with each second of those 2 seconds). As such, the other metrics might not be specific to (or recorded as being specific to) any particular time or time period (e.g., a time period smaller than the entire workout/activity duration, a minimum time unit, etc.) during the workout. Each of the selectable metrics displayed in FIG. 5 may correspond to and be recorded by an application or applet (e.g., a widget). In one arrangement, each metric widget or application may be configured to measure and record a particular set of one or more metrics along a timeline. For example, each metric widget or application may be specific to the corresponding metric that the widget or application is configured to record. The timelines for multiple metric widgets or applications may then be merged to consolidate the metric data into a single activity session based on their timelines. Generally, the timelines of the various widgets or applications will match one another since the recording is likely to be initiated at the same time.
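
A sketch of the two-tier recording and timeline merge described above, under assumed data shapes (each widget keeps its own {timestamp: value} timeline):

```python
# Sketch, not the patented implementation: selected metrics are sampled at
# a finer granularity than non-selected metrics, and each widget's
# timeline is merged into one consolidated session keyed by timestamp.

SELECTED_PERIOD_S = 1        # first (finer) level of granularity
NON_SELECTED_PERIOD_S = 120  # second (coarser) level of granularity

def merge_timelines(widget_timelines):
    """widget_timelines: {metric_name: {timestamp_s: value}}."""
    session = {}
    for metric, timeline in widget_timelines.items():
        for ts, value in timeline.items():
            session.setdefault(ts, {})[metric] = value
    return session

merged = merge_timelines({"pace": {0: 7.8, 1: 7.7},
                          "heart_rate": {0: 142}})
print(merged[0])  # {'pace': 7.8, 'heart_rate': 142}
```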


Metric applications or widgets may be created by athletes or other users. In one example, the metric applications may be available through a marketplace or community where users may download new and/or updated metric applications. An application may include algorithms, instructions, visual features and functional elements that are specific to the particular metric and application. Thus, in one or more arrangements, a metric selection interface may include multiple different applications or applets for the same type of metric. In one example, celebrations, messages and interface functionalities may be defined by users for various types of metrics. In a particular example, a vertical (e.g., jump height) widget may include a celebration once the user reaches a 2 foot jump height while a pace widget may include a celebration that is provided once the user achieves a 7 minute mile pace.
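
For illustration only, a widget's celebration trigger could be expressed as a per-metric threshold check like the following (thresholds taken from the examples above; the names are ours):

```python
# Hypothetical sketch: each widget defines when its celebration fires,
# e.g., a 2-foot (24-inch) vertical or a 7-minute-mile pace.

CELEBRATIONS = {
    "vertical_in": lambda v: v >= 24.0,       # jump height in inches
    "pace_min_per_mile": lambda p: p <= 7.0,  # 7:00/mile or faster
}

def celebration_fires(metric, value):
    trigger = CELEBRATIONS.get(metric)
    return bool(trigger and trigger(value))

print(celebration_fires("vertical_in", 25.5))       # True
print(celebration_fires("pace_min_per_mile", 7.5))  # False
```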


As illustrated in FIG. 5, metrics may include recovery time, pace, vertical jump, court map, gait line, run line, heart rate, balance, distance, calories, reaction time, hustle points, pedometer, flight time, trials, ollie, impact and balance center. For example, recovery time may be a measure of how long a user is motionless or exhibits a level of activity or motion below a certain threshold. This time may be considered time the user is spending to recover or rest. A court map, on the other hand, may plot the user's position against an athletic activity court or field or other predefined space. For example, if a user is playing basketball, a virtual representation of a basketball court may be generated and displayed along with a user's movement around the virtual court. In another example, football players may be graphed around a virtual football field. Reaction time, on the other hand, may measure the amount of time between two events such as a ball bouncing on a rim and the user jumping up to grab the ball (e.g., a rebound reaction time). In another example, a basketball player's reaction time to a pass may be measured between a time at which the ball is released from another player's hands and the instant the user makes a move toward the ball (e.g., as measured by hip movements or directional movement of hands or body). Hustle points may be awarded in a variety of manners including based on a speed of an athlete in completing objectives, reaching an object (e.g., a ball), moving a predefined amount of distance, moving from one specified point to another and the like. In one example, hustle points may be awarded for each second a user is moving at a speed above a threshold speed (e.g., 0.5 points per second above 10 mph).
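
As a concrete reading of the hustle-point example above (0.5 points for each second above 10 mph), a minimal sketch:

```python
# Sketch of the hustle-point rule given in the text: 0.5 points for each
# second the athlete moves faster than 10 mph.

def hustle_points(speed_samples_mph, points_per_s=0.5, threshold_mph=10.0):
    # speed_samples_mph: one speed reading per second
    return sum(points_per_s for s in speed_samples_mph if s > threshold_mph)

print(hustle_points([8.0, 11.2, 12.5, 9.9, 10.4]))  # -> 1.5
```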


An athletic monitoring system may determine flight time or air time by measuring the time between a user's feet leaving a floor and a time at which the user's feet touch the ground. Flight time or air time may also be measured based on other body parts or devices including skateboards (e.g., skateboard flight time) or between hands and feet (e.g., for a back flip). Flight or airtime for certain activities may have their own metric such as number of ollies for skateboarding. The ollie metric may use different parameters to measure the airtime of the skateboard trick. In yet another example, air time for the ring exercise in gymnastics may be measured based on when a user's hands leave the rings and when the user's hands return to the rings or the user's feet land on the ground. Various other flight time or air times may be defined based on various body and device sensors.
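
A minimal sketch of the flight-time computation, assuming the sensors report timestamped takeoff and landing events:

```python
# Sketch: flight (air) time as the interval between the feet (or other
# body part or device) leaving the ground and touching back down.

def flight_time(takeoff_ts, landing_ts):
    if landing_ts < takeoff_ts:
        raise ValueError("landing must follow takeoff")
    return landing_ts - takeoff_ts

print(flight_time(12.40, 13.05))  # 0.65 s in the air
```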


Impact may represent an amount of force that a user exerts. In one example, a boxing activity may be measured for the impact of a user's fist when punching. In another example, a user's impact upon landing may be measured in basketball. In yet another example, a user's impact upon hitting or tackling another user may be measured to determine an effectiveness or strength in football. Gait line and run line may measure a user's direction or pattern of foot movement during walking or running, respectively. In other examples, the pattern or direction of movement for other body parts may also be measured and analyzed. According to one or more arrangements, a run line metric may identify a path a user takes during a run. The path may be determined using a location determination system such as global positioning satellites and the like.


Balance and balance center both relate to the amount of weight being placed on each foot. In one example, balance may indicate a difference in weight being placed on each foot while balance center may provide an indicator that shows where the user's center of balance exists relative to the position of his feet.
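
A sketch of how the balance split might be computed from left- and right-foot force readings (the 46%/54% figures echo FIG. 8; the sensor interface is assumed):

```python
# Sketch: per-foot balance percentages from left/right force readings.

def balance(left_force, right_force):
    total = left_force + right_force
    if total == 0:
        return 50.0, 50.0  # no load detected; treat as even
    left_pct = 100.0 * left_force / total
    return left_pct, 100.0 - left_pct

print(balance(460, 540))  # -> (46.0, 54.0)
```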


Additionally or alternatively, the system may provide a trials metric configured to measure a user's performance for time. In a trial, the user is typically racing against the clock, trying to achieve the fastest time possible. Accordingly, the system may measure the user's trials and provide time information associated therewith.


To simplify the use of the performance monitoring system and selection of metrics, one or more sets of metrics may be predefined. For example, a first set of one or more metrics may be pre-selected and/or defined for each of running, basketball, training and skateboarding as illustrated in FIG. 6. Accordingly, upon a user selecting one of the activity options or types, the corresponding set of metrics may automatically be chosen. In some arrangements, the corresponding set of metrics may be automatically chosen along with an activity-type specific widget or application configured to record the selected metrics and the activity of that type.


The user may be provided with an opportunity to customize the automatic selection after selecting the activity. Alternatively or additionally, the user may also choose to create a custom predefined set or to manually select the metrics that he or she wishes to use for a current activity (e.g., using the Create Your Own option). As noted above, a user may select a previously performed workout and ask to record the same metrics as the previously performed workout. Accordingly, the system may automatically extract the metrics recorded for the previously performed workout from an athletic performance data set associated with the previously performed workout. If a user customizes his or her own set of metrics, the user may choose to store and label the customized set. The customized set may then appear in a menu of predefined activities (e.g., as shown in FIG. 6) when the user next begins an activity session. While only basketball, running, training and skateboarding are listed as activities in FIG. 6, numerous other activities may also have predefined metric or widget sets and may similarly be displayed in such an interface. In fact, any type of motion may be tracked according to the features described herein including dancing, swimming, skipping rope, wrestling, public speaking (e.g., to track the amount of user hand motion or eye contact), traveling (e.g., number of steps taken during a trip, elevation change during the trip) and the like.


Furthermore, users may share customized metric or widget sets with other users. For example, a marketplace or share space may be created where users may exchange, purchase and/or download metric and widget sets from other users, services, coaches and the like. In one or more arrangements, metric and widget sets may be shared among users with specific interests (e.g., skateboarding or swimming) or may be shared more generally. Other privacy and security parameters may also be defined, including specifying whether the general public is allowed to download and view the metric set or if only a specified group (e.g., friends, community group, etc.) is allowed to view and download. In one or more aspects, a user may define their own metrics. In one example, a user may define a metric called “one-leg vertical height” for recording a height that a user is able to jump on one leg or foot. The user may define the metric by specifying the types of sensors to use, conditions for activating or deactivating the sensors and the sensor output that is to be detected. Accordingly, the above user may indicate that the metric is only measured when sensors for one shoe are contacting a surface and that the metric corresponds to half of the amount of time between detecting loss of contact between the one shoe and a surface and detecting subsequent contact of the same shoe with a surface.
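
To make the user-defined “one-leg vertical height” rule concrete, one possible (non-authoritative) interpretation treats the half-interval as the rise time and converts it to a height using projectile motion, h = ½ g t_rise²; the unit conversion and function names are ours:

```python
# Sketch: estimate jump height from hang time. Rise time is half of the
# interval between loss of contact and subsequent contact, per the rule
# above; height = 1/2 * g * t_rise^2 under simple projectile motion.

G = 9.81  # m/s^2

def one_leg_vertical_height(hang_time_s):
    rise_time = hang_time_s / 2.0   # half of the loss-of-contact interval
    height_m = 0.5 * G * rise_time ** 2
    return height_m * 39.37         # meters -> inches (conversion is ours)

print(round(one_leg_vertical_height(0.6), 1))  # ~17.4 inches
```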



FIG. 7 illustrates an example interface through which a user may initiate recording of an activity session upon selecting a set of desired metrics to be tracked. The interface may include a timeline at the bottom of the screen to indicate an amount of elapsed time since the start of the activity. A user may select the start your run/record option displayed in the middle of the screen to begin recording metrics and/or video for the activity. In one example, video may be recorded by the metric recording device (e.g., a video camera on a mobile telecommunication device or a laptop). As the data is recorded, the data (video and metrics) may be stored in association with the particular instant or time period during the activity session at which the data was captured. As noted previously, data may be collected substantially continuously (e.g., every 0.1 or 0.5 seconds, or every second). Other recording intervals may be defined as well (e.g., every 2 seconds, 5 seconds, 10 seconds, 15 seconds or 5 minutes). The interface may further display a currently selected primary metric. The primary metric may be displayed in a visualization space of the interface. For example, in FIG. 7, a run line is displayed in the visualization space (partially covered by the recording option). The user may pause or stop the recording using the corresponding options displayed in the header bar (e.g., on either side of the primary metric name).



FIG. 8 illustrates an athletic performance monitoring interface that may be displayed upon a user beginning an athletic activity session. An icon in the header bar may indicate that the current activity and metrics thereof are being recorded. A primary visualization space may display a particular metric such as the run path or run line in the illustrated example. In one arrangement, a user's current position on the map may be identified by an indicator. Other metrics may be displayed in a metrics bar. The metrics may update continuously or based on a specified schedule as the activity is being performed. For example, the user's pace (e.g., 7:46 per mile) may be updated in real-time as the user gets faster or slows down. Similarly, the user's balance (e.g., currently showing 46% of the user's weight on the left foot and 54% on the right foot) may be updated in real-time. The data shown may be the instantaneous data or the data may comprise an average of a previous amount of data (e.g., all previous data recorded for the session or a proper subset of data recorded for the session).


Other metrics may be displayed upon selecting one of the directional arrow options along the metrics bar (as will be described and illustrated in further detail below). Upon selecting one of the metrics in the metric bar, the primary visualization space may change to display the selected metric. The previously displayed metric may be returned to the metric bar (e.g., replacing the newly selected metric/widget). Furthermore, a current elapsed time may be displayed against the timeline. Additionally, an amount of elapsed time may be represented in the timeline by a different color or appearance (e.g., red, polka-dots, stripes, blue, green, etc.).



FIG. 9 illustrates another example activity selection interface having a basketball activity highlighted or in the process of being selected. As noted herein, different activities may correspond to different sets of metrics. In one example, a basketball metric may include video and/or audio recording. Accordingly, selection of the basketball activity may activate a video recording function.



FIG. 10 illustrates an example metric monitoring interface in which video is recorded for a basketball activity. A basketball activity may include activities related to training for or improving skills related to basketball and is not necessarily limited to a basketball game. Similar training or evaluation type activities might also be monitored for other sports and activities as well. As the video is being recorded, other metrics as shown in the metrics bar may also be recorded at the same time and associated with a time at which the data was captured. The metric or metric widget being displayed in the primary visualization space may be modified by selecting a different metric from the metric toolbar. The video may continue to be recorded and displayed in the metric widget on the metric toolbar. The metrics toolbar may include metric widgets such as a pace metric widget, a vertical jump widget and an impact widget. The vertical jump widget may measure a user's vertical ground clearance at the particular point in time while the impact widget may measure the amount of force exerted by a user's feet (e.g., upon landing) or hands (e.g., for blocking a shot) or an amount of force with which a ball was shot or thrown. The pace metric for a basketball activity may measure acceleration in a vertical or horizontal direction (e.g., instead of measuring mile pace). The metrics shown may be specific to the instant identified in the timeline, i.e., 4 minutes and 58 seconds into the athletic activity session. Alternatively or additionally, one or more of the metrics shown may be an average of the metrics up to the instant identified in the timeline (e.g., an average over the first 4 minutes and 58 seconds).
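
A small sketch of the two display modes just mentioned, the instantaneous value at the selected timeline instant versus a running average up to that instant, with assumed data shapes:

```python
# Sketch: metrics stored as {timestamp_s: value}; show either the value at
# the selected instant or the average of everything recorded up to it.

def metric_at(samples, t):
    return samples.get(t)

def average_up_to(samples, t):
    values = [v for ts, v in samples.items() if ts <= t]
    return sum(values) / len(values) if values else None

jumps = {60: 40.0, 120: 44.0, 298: 49.5}  # 298 s = 4 min 58 s
print(metric_at(jumps, 298))      # 49.5 (instantaneous)
print(average_up_to(jumps, 298))  # 44.5 (average up to the instant)
```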


Data collection may also be facilitated by including identifiers in one or more sensors or other wearable devices (e.g., shoes, gloves, headgear, etc.). A metric capturing device such as a video camera or speed camera may automatically adjust direction and focus based on finding the appropriate subject using the identifier and determining its location. In other arrangements, identifiers or wearable identifier tags might not be necessary. A camera or other sensor device may automatically determine the location of a desired subject based on image recognition (e.g., facial recognition or body recognition). Using identifiers, data may also be displayed as an overlay near or proximate to the corresponding sensor, person or person's body part. For example, step information may be displayed near a user's foot during video playback by detecting the position of the user's foot through the identifier. In another example, multiple sets of metrics may be displayed for multiple individuals displayed in a video. Accordingly, the identifiers may be used to place the metrics close to the appropriate individuals.


Video collection may also be facilitated by combining videos from multiple different video sources as shown in FIG. 21. For example, multiple individuals may use their video cameras to record the same event (e.g., a soccer game or a dance competition). A processing system may detect that each of the multiple videos corresponds to the same event and piece the videos together to fill in gaps between the individual videos. Additionally, the system may ensure that the video maximizes images of a desired subject. Again, body or facial recognition may be used to identify particular subjects and to assemble portions having the desired subject or subjects together into a single video. Each portion may have a duration that corresponds to a sub-time period of the duration or overall time period of the source video or content item. The use of multiple cameras or video streams may also allow an individual to view a subject (e.g., himself or herself, a child, a player, etc.) from multiple angles and, in some instances, in 3-D.
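
A simplified, assumption-laden sketch of piecing segments together: segments already confirmed to show the same session are ordered by start time, keeping those that extend coverage beyond what has been assembled so far:

```python
# Sketch under assumed structures: each segment carries its source camera,
# start time, and end time on a shared session clock.

def assemble_replay(segments):
    """segments: list of dicts like {"source": "cam2", "start": 0.0,
    "end": 45.0}, all known to show the same activity session."""
    replay = []
    last_end = float("-inf")
    for seg in sorted(segments, key=lambda s: s["start"]):
        if seg["end"] > last_end:  # adds coverage beyond what we have
            replay.append(seg)
            last_end = seg["end"]
    return replay

print(assemble_replay([{"source": "cam2", "start": 30.0, "end": 70.0},
                       {"source": "cam1", "start": 0.0, "end": 45.0}]))
```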


According to another aspect, video recording by location-specific cameras and other recording equipment may be automatically triggered based on detection of an identifier or other electronic tag. FIG. 21, for instance, illustrates a location such as a gym or park with a camera 2 that belongs to or is otherwise associated with the location. As such, if the camera or a system connected thereto detects a player or individual in the location, the camera may begin automatically recording. In one example, detection of athletes and other individuals may be based on shoes having an RFID tag. Upon detection of the RFID tag, a video camera may be automatically triggered to begin recording the event or athletic activity session. These cameras may be stationary or moveable cameras located at a public, semi-private or private athletic facility (e.g., gym, field, pool, courts, etc.). The location-specific camera data may then be combined with data collected by a user's personal recording device (e.g., a mobile phone with a camera or handheld video camera) during compilation of an athletic activity session file. In some arrangements, the data from different sources may automatically be synchronized upon uploading to a server or other monitoring system.
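
A sketch of the tag-triggered recording flow, with a hypothetical camera object standing in for the location-specific equipment:

```python
# Hypothetical sketch: start the location's camera when a registered RFID
# tag (e.g., embedded in a shoe) is detected in range.

class Camera:
    def __init__(self):
        self.recording = False

    def start_recording(self):
        self.recording = True

def on_tag_detected(tag_id, registered_tags, camera):
    # Only record for known athletes' tags, and only start once.
    if tag_id in registered_tags and not camera.recording:
        camera.start_recording()

cam = Camera()
on_tag_detected("shoe-123", {"shoe-123"}, cam)
print(cam.recording)  # True
```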


Data Visualization and Modification



FIG. 11 illustrates an interface displaying a recorded activity session and expanded metric toolbar and display of a vertical jump metric in the primary visualization area. As discussed, by selecting one of the directional arrows, the metric toolbar may be scrolled to display other metric widgets. An arrow might not be displayed if no additional metric widgets exist in that direction. Scrolling may also be performed using gestures such as swiping to the left or to the right.


Upon selecting a new metric, such as vertical jump, to view in the primary visualization area, the previous metric or interface displayed in the primary visualization area may be reduced to widget or toolbar size and placed in the toolbar. For example, the video that was previously displayed in the primary visualization area may be reduced to a smaller size suitable for the metric widget toolbar and displayed therein. When enlarged or placed into the primary visualization area, the metric widget may display additional or more extensive information as compared to what is displayed in the metric widget toolbar. For example, in the primary visualization area, the vertical jump metric widget displays a current vertical jump value as well as historical vertical jump values for the activity session in a graph. This may allow the user to better understand his or her progress and improvement (if any) during the activity session. The current value of 49.5 inches (i.e., the value associated with the selected or current time in the timeline) may be displayed as well. Each of the metric widgets may include animations when new metric data is received and as time progresses. For example, with respect to vertical jump, the line may extend slightly to the next vertical jump value detected once the timeline advances to a subsequent point in time. In another example, the line may retract if the user decides to rewind and go back to a previous point in time. Metric widgets may display live and animated information in the primary visualization area as well as in the metric toolbar.
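The extend-and-retract behavior of the graph can be sketched as a simple filter over the session's samples, where the visible line is whatever falls at or before the current timeline position (the names below are illustrative only):

```python
def visible_graph_points(samples, timeline_pos):
    """Portion of a metric's history to draw at the current timeline position.

    samples: time-ordered (elapsed_s, value) pairs for the whole session.
    As the timeline advances the returned line grows; on rewind it retracts.
    """
    return [(t, v) for t, v in samples if t <= timeline_pos]

jumps = [(30, 38.0), (95, 42.5), (180, 47.0), (298, 49.5)]
print(visible_graph_points(jumps, 200))   # line drawn through the first three points
print(visible_graph_points(jumps, 100))   # rewinding retracts the line to two points
```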



FIG. 12 illustrates another example interface displaying video of an athletic activity session along with a timeline representing the duration of the session. In this example interface, the video may be displayed in landscape format and a metric widget toolbar may be hidden or otherwise not displayed to conserve space. However, the timeline may still be displayed to allow a user to jump back and forth in time or to fast forward or rewind as desired. The timeline and/or the metric widget toolbar may be revealed and/or hidden at will based on user interactions with the device on which the interface is displayed. For example, a user may make a first gesture along a touch screen interface to reveal the metric widget toolbar and a second gesture to hide the toolbar. Hiding and displaying of the timeline may be similarly controlled.


According to one aspect, various metrics may be displayed as overlays on the primary visualization area. That is, the information being displayed in the primary visualization area may still be visible beneath the metric overlays. The user may select the desired metrics to be overlain on the primary visualization area. Overlaid metrics may also be hidden if desired. The user may also customize the number of metrics that are displayed over the primary visualization area as well as their appearance including color, font, size, representative symbol, unit of measure and the like. In one example, the best or optimum metric may be called out using highlighting, color, flashing, patterns and the like. In other arrangements, the overlaid information may be displayed with information about personal bests to show how far a user is from matching or exceeding their personal best. Additionally or alternatively, comments, words of encouragement and the like may also be displayed as overlays, in the toolbar or in an information bar of the interface.



FIGS. 16A and 16B illustrate example metric overlays. In FIG. 16A, for example, a user's speed is displayed as a semi-transparent odometer overlaid on top of the user's skateboard activity session video.


In FIG. 16B, the user's impact is displayed as an arrow with an indicator of the impact magnitude over a video of the user jumping.


Overlaid metric information may include videos of other portions of the activity session or other activity sessions (e.g., of the present user or of other users). In one example, the user may overlay video from an activity session of a pro athlete to compare performance.



FIG. 17A illustrates an example side by side video comparison between a user and a celebrity athlete or other user. In particular, the comparison illustrates the difference in air or flight time. The videos may be cued to a similar point in time such as when a user leaves the ground to perform a dunk. This time may be identified based on indicators stored in association with each respective video or based on pre-processed or on-the-fly image analysis. Comparison between two users may include synchronizing the timelines of athletic activity performances of the two users. For example, if two users performed a 20-minute run, the system may synchronize the two timelines temporally or otherwise to compare the paces at different points during the 20-minute run. Synchronization may include aligning the two timelines to match up the elapsed time. The system may further allow the users to view video, hear audio or view animated data as the timeline is traversed.
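A minimal sketch of this elapsed-time alignment, assuming each session is stored as time-ordered (elapsed seconds, pace) samples, might look like the following; the function names and data shapes are hypothetical:

```python
def compare_at(elapsed_s, session_a, session_b):
    """Align two sessions on elapsed time and compare a metric at one instant.

    session_a / session_b: time-ordered (elapsed_s, pace) samples for two
    athletes who performed the same 20-minute run.
    """
    def value_at(samples, t):
        current = None
        for ts, v in samples:
            if ts > t:
                break
            current = v          # most recent sample at or before time t
        return current

    return value_at(session_a, elapsed_s), value_at(session_b, elapsed_s)

user = [(0, 9.0), (600, 8.5), (1200, 8.2)]      # pace in min/mile
rival = [(0, 8.8), (600, 8.9), (1200, 8.0)]
print(compare_at(600, user, rival))              # (8.5, 8.9): user leads at minute 10
```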



FIG. 17B illustrates another example video comparison between a skateboarder user and a pro or competitor skateboarder. In the illustrated example, the comparison may display a metric using a representative symbol such as a skateboard. That is, the skateboard may represent a number of ollies performed.



FIGS. 17C and 17D illustrate wireframe representations of interfaces that may also be used to compare the performance of different users. For example, FIG. 17C illustrates a comparison of the performance of an athlete with the performance of the athlete's coach. The widget applications displayed in the toolbar may be displayed with both the athlete's performance metric and the coach's performance metric in a split screen style. Similarly, video of the athlete and the coach may be displayed as a split screen in the primary visualization area.



FIG. 17D illustrates a comparison interface that may be displayed in landscape format. Instead of displaying a widget toolbar, the interface may display X number of metrics. In this example, the number of metrics may be 4 (video, jump height, impact and balance). The video may be displayed in split screen and two columns may be displayed adjacent the video, one representing the metrics of the athlete and the other column displaying the metrics of the coach. The interface configurations of FIGS. 17C and 17D may be used to compare the athletic performance of any number of athletes (e.g., 2, 3, 5, 10, etc.) and the athletes may have any type of relationship (e.g., friends, competitors, coach-player/trainee, etc.).


Video overlays may automatically be triggered based on detection of various events such as releasing a pitch, executing a slam dunk and/or throwing a football. For example, video of a professional pitcher's pitch may be overlaid on top of a video of a user's pitch to facilitate visual comparison between the two. Additionally or alternatively, metrics for the overlaid video and the user's video may be displayed in conjunction with one another for comparison purposes. The two sets of metrics may be visually distinguished from one another. For example, a first set of metrics may be displayed in one color while the other set of metrics may be displayed in a second color. Font size, font, font style (e.g., bold, italic, underline), pattern and the like may also be used for visual distinction. Videos that are displayed simultaneously (e.g., one overlaying the other) might also be scaled such that the subjects of the videos are displayed at a sufficiently large size. For example, if a user is far in the distance in one video, the video may zoom in or enlarge the portion containing the user.
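The zoom-to-subject behavior could be approximated by computing a crop window around the subject's bounding box, as in the hypothetical sketch below; the box coordinates are assumed to come from the image-recognition step, and the target size is an arbitrary illustrative choice:

```python
def zoom_crop(frame_size, subject_box, min_fraction=0.25):
    """Crop window that enlarges a distant subject to a target share of the frame.

    subject_box: (x, y, w, h) bounding box for the subject in the frame.
    If the subject's height is below min_fraction of the frame height, return
    a centered crop that, once scaled back up, makes it at least that large.
    """
    fw, fh = frame_size
    x, y, w, h = subject_box
    if h >= min_fraction * fh:
        return (0, 0, fw, fh)                      # subject already large enough
    scale = h / (min_fraction * fh)                # shrink the crop to enlarge subject
    cw, ch = int(fw * scale), int(fh * scale)
    cx = min(max(x + w // 2 - cw // 2, 0), fw - cw)
    cy = min(max(y + h // 2 - ch // 2, 0), fh - ch)
    return (cx, cy, cw, ch)

# A runner far in the distance occupies 60 px of a 720 px-tall frame.
print(zoom_crop((1280, 720), (900, 300, 30, 60)))  # (702, 210, 426, 240)
```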


Additionally or alternatively, the user may customize appearance of the timeline and/or metric overlays using thresholds. For example, upon a timeline reaching 75% completion, the timeline may change in appearance (e.g., color, pattern, shape, size, etc.). In another example, the metric information may change in color or other appearance if the metric goes above or below a defined threshold (e.g., red colored lettering when pace goes below a 6 minute mile).
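Such threshold-driven appearance changes reduce to simple comparisons. In the sketch below, the color names and the 75% cutoff simply mirror the examples above and are otherwise arbitrary:

```python
def pace_color(pace_min_per_mile, threshold=6.0):
    """Red lettering once pace dips below a 6-minute mile, per the example above."""
    return "red" if pace_min_per_mile < threshold else "white"

def timeline_color(elapsed_s, total_s):
    """Change the timeline's appearance once 75% of the session has elapsed."""
    return "green" if elapsed_s / total_s >= 0.75 else "gray"

print(pace_color(5.8), timeline_color(1500, 1800))  # red green
```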



FIG. 13 illustrates an interface through which a user may crop or select a portion (e.g., less than all) of the overall duration of the recorded activity session. The selected portion may thus have a time period or duration representative of a sub-time period of the overall duration. The user may then separate out the selected portion for further processing, analysis and/or sharing with others. For example, the cropped portion or the entire recorded session, along with the metrics associated therewith, may be uploaded to a social networking site, saved to an athletic performance monitoring service site or emailed to friends. In one or more arrangements, if average metrics are provided for the entirety of the activity session, the user's selection or cropping of a portion of the activity session may automatically cause the system to modify the average to reflect an average of just the selected or cropped portion. A save option allows the user to save the selected portion. Additionally or alternatively, the monitoring system may automatically save the cropped portion and the remaining portion as separate files or data items. This may prevent a user from accidentally deleting a portion of the activity session.
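A sketch of the average recomputation on cropping, assuming metrics are stored as (elapsed seconds, value) samples, might be:

```python
def crop_session(samples, start_s, end_s):
    """Split time-stamped samples into a cropped portion and the remainder,
    recomputing each portion's average so the displayed average reflects
    only the selected sub-time period."""
    selected = [(t, v) for t, v in samples if start_s <= t <= end_s]
    remainder = [(t, v) for t, v in samples if not (start_s <= t <= end_s)]
    avg = lambda part: sum(v for _, v in part) / len(part) if part else None
    return {"selected": selected, "selected_avg": avg(selected),
            "remainder": remainder, "remainder_avg": avg(remainder)}

pace = [(60, 9.2), (120, 8.8), (180, 7.9), (240, 8.1), (300, 9.0)]
result = crop_session(pace, 150, 260)
print(result["selected_avg"], result["remainder_avg"])  # 8.0, 9.0
```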


A user may further be allowed to select a particular metric value, and the system may automatically identify and display a portion of a content file (e.g., a video or audio file) corresponding to a time of the athletic activity session at which the particular metric value was recorded. Alternatively or additionally, the user may select a portion (e.g., a range or a specific time) of the content file and one or more metric values specific to the selected portion may be displayed.
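Mapping a selected metric value back to a playback time can be as simple as a lookup over the time-stamped samples; the following helper is illustrative, not the system's actual search:

```python
def seek_time_for_value(samples, target_value, tolerance=0.0):
    """Find the session time at which a selected metric value was recorded,
    so playback can jump to the matching moment in the content file."""
    for t, v in samples:
        if abs(v - target_value) <= tolerance:
            return t
    return None

jumps = [(30, 38.0), (95, 42.5), (123, 49.5), (298, 47.0)]
print(seek_time_for_value(jumps, 49.5))  # 123: cue the video to 2:03
```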


The timeline may further include one or more indicators identifying certain events of the user's activity session. For example, times corresponding to the user's highest or best values with respect to one or more metrics may be marked within the timeline. Thus, if a user achieves his or her highest vertical jump at time 2:03, an indicator may be placed at the 2:03 point within the timeline. Indicators may be color coded and/or labeled to provide some information about what is being marked. According to one aspect, if a user selects a portion of the timeline (rather than the entire timeline), the indicators may be modified to reflect and identify the best (e.g., highest or lowest) metric values measured for the user during the selected portion. For example, the system may automatically determine the best metric values for the selected portion of the activity session. Alternatively, the indicators might not be modified so that the user is aware of his or her metrics throughout the entire activity session. According to yet another alternative, further indicators may be added in addition to the already existing indicators. For example, the additional indicators may identify the best times and/or other metrics for the selected portion of the activity session. The user may further name the cropped portion upon saving. The saved name may be displayed in the title bar. Indicators may also be used to identify other events that do not correspond to a best metric. For example, the indicators may identify substantial changes in pace (e.g., going from a 12 minute mile pace to a 7 minute mile pace within a predefined amount of time like 1 minute), slam dunks, tennis aces, dancing moves, tackles, football passes of greater than 20 yards and the like. Indicators may also specify the lowest metrics or points in the activity session where a user may need improvement (e.g., coaching or improvement tips).
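One possible way to compute such indicators, assuming for simplicity that higher values are better for every metric, is sketched below; a real system would also handle lowest-is-best metrics and the event-type indicators described above:

```python
def best_metric_indicators(metrics, start_s=None, end_s=None):
    """Place an indicator at the time of the best value for each metric,
    optionally restricted to a selected portion of the timeline.

    metrics: {name: [(elapsed_s, value), ...]} with higher values treated
    as better (a real system would know per-metric which direction wins).
    """
    indicators = {}
    for name, samples in metrics.items():
        window = [(t, v) for t, v in samples
                  if (start_s is None or t >= start_s)
                  and (end_s is None or t <= end_s)]
        if window:
            t_best, v_best = max(window, key=lambda s: s[1])
            indicators[name] = {"time": t_best, "value": v_best}
    return indicators

session = {"vertical_jump": [(30, 38.0), (123, 49.5), (298, 47.0)]}
print(best_metric_indicators(session))             # marker at 2:03 (123 s)
print(best_metric_indicators(session, 150, 300))   # best within the selected portion
```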


Selection or cropping of a portion of the video may be performed by a user sliding his or her finger along the timeline (e.g., using a touch screen interface) to desired start and end times for a desired portion. With the above described indicators, the user may more easily select a portion or multiple portions of the video containing his or her highlights (e.g., best performance times). Alternatively, the user may use a cursor or time entry fields to specify the start and end times. In one or more arrangements, the user may ask the monitoring system to automatically select the portion. For example, the user may request that the monitoring system crop the video such that only a portion containing a time or period of time at which a user's best dunk (e.g., most air-time, highest rating by the user and/or other users) or highest value of a particular metric was achieved is retained. In a particular example, the system may automatically retain the event along with a predefined amount of time (e.g., 2 minutes, 1 minute, 30 seconds, 15 seconds) around the event. Metric data might only be retained for this retained portion of the activity session. Additionally, new averages may be calculated for the retained portion upon cropping the non-selected portions. Metric data for a non-retained portion of the activity session, on the other hand, may be discarded or saved to a different file (e.g., a file of the cropped portion). Alternatively or additionally, an average for the non-retained portion may also be automatically generated for comparison purposes in one or more examples.
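The retain-a-window-around-the-event behavior is a small clamping computation; the sketch below assumes the event time and session length are known in seconds:

```python
def auto_crop_around_event(event_time_s, session_length_s, pad_s=30):
    """Retain the highlight event plus a predefined amount of time around it,
    clamped to the bounds of the recorded session."""
    start = max(0, event_time_s - pad_s)
    end = min(session_length_s, event_time_s + pad_s)
    return start, end

# Keep 30 seconds on either side of the best dunk at 2:03 of a 10-minute session.
print(auto_crop_around_event(123, 600, pad_s=30))  # (93, 153)
```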


The cropped video may also be automatically stored as a discrete content file that may be rendered (e.g., viewed, audibly played, visually played) independently of other content files or sets of athletic data. The discrete content file may also correspond to an audio file (e.g., with or without video), or an application file that animates the sequence of recorded athletic data. A user may also select multiple portions of the recorded athletic activity session and each of the selected portions may be stored as a discrete content file. Thus, a user may create multiple content files at one time by selecting multiple portions of the activity session.



FIG. 14 illustrates a video/metric sharing interface that may be displayed after a user has selected a save option (e.g., as shown in FIG. 13). The sharing interface may include one or more predefined sharing options (e.g., for YOUTUBE, FACEBOOK and the like). The interface may further allow a user to customize or define their own sharing sites (e.g., by entering a website or network address).



FIG. 15 illustrates a community website displaying a shared video that includes metrics associated with the user showcased in the shared video. On the community website, a variety of individuals may submit comments about the video and/or the user's athletic performance. In one example, coaches may submit comments to help the user improve or to further encourage the user. Comments may be associated with specific times similarly to metric information. Accordingly, comments might only be triggered or displayed upon reaching a particular time of the video. The user may specify permissions for who may comment on the video and/or view the video. For example, the user may indicate that only a certain group of people, specific individuals, or individuals satisfying user-defined criteria (e.g., age, location, affiliation, etc.) are allowed to submit comments or to rate the video/metrics. The user may also specify separate permissions for the video and the metrics. Thus, some users may be able to view both the video and the metrics while other users might only be privy to the video or only the metrics.
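Time-triggered comments can reuse the same time-keyed storage as metrics; the sketch below adds a simple viewer-permission check (the data shapes and names are hypothetical):

```python
def comments_due(comments, playback_s, viewer, allowed_viewers=None):
    """Comments are time-stamped like metrics and surface only once playback
    reaches their time, and only for permitted viewers.

    allowed_viewers: set of viewer IDs, or None to allow everyone.
    """
    if allowed_viewers is not None and viewer not in allowed_viewers:
        return []
    return [c for t, c in comments if t <= playback_s]

feed = [(75, "Keep your elbow in on the release."), (130, "Great lift-off!")]
print(comments_due(feed, 120, "coach_kim", {"coach_kim", "dad"}))
# ['Keep your elbow in on the release.']
```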


Other videos may be suggested for viewing to individuals that are accessing the present video. The other videos may be selected based on type of athletic activity, a subject's age, gender or location, a subject's school or gym, similarity between the subject's performance and the performance of the subjects in the other videos, and/or other criteria.



FIGS. 18A and 18B illustrate a series of interfaces in which metrics may be displayed in different regions of a performance visualization area. The user may then be able to adjust the size of the regions by moving an intersection between the different regions. The size of each region may then automatically adjust according to the location of the intersection. For example, in FIG. 18A, the intersection is displayed in the middle of the visualization area, thereby providing each metric with equal display space. The user may then decide to move the intersection (e.g., by selecting and dragging the intersection using a touch screen) to another location as shown in FIG. 18B. Upon the user moving the intersection to the location shown in FIG. 18B, the sizes of the various regions may automatically change to compensate for the new intersection. For example, the width of the impact and balance metric display regions may decrease while the width of the vision/video and jump height metric display regions may be increased. Additionally, the heights of the vision/video metric and impact metric regions may increase while the height of the balance and jump height regions may decrease.
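Because the four regions share a single intersection point, their sizes follow directly from its coordinates. The sketch below returns (x, y, width, height) rectangles for an assumed four-region layout; the region names are illustrative:

```python
def regions_from_intersection(area_w, area_h, ix, iy):
    """Four display regions sized by the draggable intersection at (ix, iy).

    Moving the intersection resizes all four regions at once: widths follow
    ix and widths of the opposite column shrink accordingly; likewise for iy.
    """
    return {
        "top_left":     (0,  0,  ix,          iy),
        "top_right":    (ix, 0,  area_w - ix, iy),
        "bottom_left":  (0,  iy, ix,          area_h - iy),
        "bottom_right": (ix, iy, area_w - ix, area_h - iy),
    }

# Centered intersection gives each metric equal space (as in FIG. 18A)...
print(regions_from_intersection(800, 600, 400, 300))
# ...dragging it right and down enlarges some regions at the expense of others (FIG. 18B).
print(regions_from_intersection(800, 600, 550, 380))
```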


In one or more configurations, a user may be allowed to edit parameters or aspects of a recorded activity session before all recorded metrics are combined into a single activity session file or content item. Additionally, cropping may be performed before the session is compiled into the single file or content item. FIG. 19, for example, illustrates a process whereby a user may select or deselect metrics that are to be combined and stored into a file corresponding to an athletic activity session. Accordingly, even if the various widget applications recorded 8 different metrics, the user may select only 5 of the 8 metrics to be compiled into the activity session file. Alternatively or additionally, the user may define the placement of the various metrics and widget applications in a display area so that the system may assemble the video and other data in a desired manner. Still further, a user may add comments, audio (e.g., a soundtrack, narration, sound effects, etc.), interactive buttons (e.g., to send the athlete an email, download the video and the like) and the like.


Video, audio or other athletic performance content data may further be associated with location information. For example, location may be used as a metric as noted herein. Additionally, information about a particular location may be displayed, stored and/or associated with the athletic performance or portion thereof in a granular manner. For example, location information for a user's location at each minimum time unit (e.g., 1 second, 1 minute, 5 minutes, 30 minutes, etc.) may be retrieved and stored. Thus, if a user is in a park at minute 1 and later runs to a bridge at minute 8, information about the park may be associated with the athletic performance at minute 1 and information about the bridge may be associated with the athletic performance at minute 8. The location description information may be descriptive of a type of location, history of the location, events occurring at the location and the like. The location description information may then be displayed while the user views a progression of the athletic performance data (e.g., video or audio or animated data).
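Associating location descriptions with each minimum time unit could be sketched as follows, where the reverse lookup from coordinates to a description is an assumed external service (mocked here with a dictionary):

```python
def annotate_locations(samples, places):
    """Attach location description info to each minimum time unit's sample.

    samples: [(elapsed_s, (lat, lon))] positions recorded during the session
    places: callable mapping (lat, lon) -> description text (e.g., a reverse
    geocoding lookup; assumed here and supplied by the caller)
    """
    return [{"t": t, "position": pos, "place": places(pos)} for t, pos in samples]

# Minute 1 in the park, minute 8 at the bridge.
lookup = {(40.78, -73.96): "Central Park: 843-acre urban park",
          (40.71, -74.00): "Brooklyn Bridge: opened 1883"}.get
track = [(60, (40.78, -73.96)), (480, (40.71, -74.00))]
for row in annotate_locations(track, lookup):
    print(row["t"], row["place"])
```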


CONCLUSION

While the invention has been described with respect to specific examples including presently preferred modes of carrying out the invention, those skilled in the art will appreciate that there are numerous variations and permutations of the above described systems and methods. For example, various aspects of the invention may be used in different combinations and various different subcombinations of aspects of the invention may be used together in a single system or method without departing from the invention. In one example, software and applications described herein may be embodied as computer readable instructions stored in computer readable media. Also, various elements, components, and/or steps described above may be changed, changed in order, omitted, and/or additional elements, components, and/or steps may be added without departing from this invention. Thus, the invention should be construed broadly as set forth in the appended claims.

Claims
  • 1. A method comprising: receiving, by an athletic parameter analysis device, video data associated with an athlete during an athletic activity; determining, by the athletic parameter analysis device and using the video data, at least one performance metric of the athlete; and based on detecting an event in the video data, presenting a display indicating the at least one performance metric, wherein the at least one performance metric is an overlaid graphic that is displayed in a toolbar of a primary visualization area.
  • 2. The method of claim 1, further comprising receiving, by an athletic parameter measurement device worn on the athlete, activity data associated with the athletic activity, wherein determining the at least one performance metric of the athlete includes processing and synchronizing the activity data with the video data.
  • 3. The method of claim 1, wherein presenting the display includes playing video data of the athlete during the athletic activity in the primary visualization area.
  • 4. The method of claim 1, wherein presenting the display includes presenting an indication relating to at least one earlier athletic activity performed by the athlete.
  • 5. The method of claim 1, wherein presenting the display includes presenting an indication relating to at least one other athlete performing an athletic activity.
  • 6. The method of claim 1, further comprising: associating an identification of the athlete with the athletic parameter analysis device; and transmitting, to a computing device, the at least one performance metric and the identification.
  • 7. The method of claim 1, wherein the overlaid graphic is animated during playback of the video data.
  • 8. The method of claim 1, wherein presenting the display includes displaying a portion providing metrics of the athlete performing the athletic activity and a portion with coaching information for the athlete related to the athletic activity.
  • 9. The method of claim 1, wherein determining the at least one performance metric of the athlete includes processing and synchronizing video data from a plurality of video sources and calculating the at least one performance metric based on analyzing the processed and synchronized video data.
  • 10. The method of claim 1, wherein presenting the display includes overlaying one or more indicators in the primary visualization area to identify an aspect of the at least one performance metric for improvement and coaching information for the at least one performance metric for improvement.
  • 11. The method of claim 1, wherein presenting the display includes providing a user selectable metric widget that is configured to change a displayed type of performance metric in response to a user selection.
  • 12. An athletic parameter analysis device comprising: a processor; at least one measurement sensor; and memory storing computer executable instructions that, when executed by the processor, cause the athletic parameter analysis device to: receive video data associated with an athlete during an athletic activity; process the video data; determine, using the processed video data, at least one performance metric of the athlete; and trigger a display indicating the at least one metric, wherein the at least one metric is an overlaid graphic that is displayed during playback of the video data.
  • 13. The athletic parameter analysis device of claim 12, wherein determining the at least one performance metric of the athlete includes processing and synchronizing video data from a plurality of video sources and calculating the at least one performance metric based on analyzing the processed and synchronized video data.
  • 14. The athletic parameter analysis device of claim 12, wherein presenting the display includes playing video data of the athlete during the athletic activity in a primary visualization area of the display.
  • 15. The athletic parameter analysis device of claim 12, wherein presenting the display includes presenting an indication relating to at least one earlier athletic activity performed by the athlete.
  • 16. The athletic parameter analysis device of claim 12, wherein the instructions, when executed by the processor, further cause the athletic parameter analysis device to: receive, by an athletic parameter measurement device worn on the athlete, activity data associated with the athletic activity, wherein determining the at least one performance metric of the athlete includes processing and synchronizing the activity data with the video data.
  • 17. The athletic parameter analysis device of claim 12, wherein presenting the display includes presenting an indication relating to at least one other athlete performing an athletic activity.
  • 18. The athletic parameter analysis device of claim 17, wherein the instructions, when executed by the processor, further cause the athletic parameter analysis device to transmit the at least one performance metric and the identification-indication to a computing device in real-time.
  • 19. The athletic parameter analysis device of claim 12, wherein the overlaid graphic is animated during playback of the video data.
  • 20. One or more non-transitory computer readable media storing instructions that, when executed, cause: receiving video data associated with an athlete during an athletic activity, wherein the video data is received from a plurality of video sources; determining, using the video data, at least one performance metric of the athlete; and based on detecting an event in the video data, triggering a display indicating the at least one performance metric, wherein the at least one performance metric is an overlaid graphic that is displayed during playback of the video data.
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of U.S. patent application Ser. No. 18/161,544, filed Jan. 30, 2023, now U.S. Pat. No. 11,935,640, which is a continuation of U.S. patent application Ser. No. 16/853,937, filed Apr. 21, 2020, now U.S. Pat. No. 11,600,371, which is a continuation of U.S. patent application Ser. No. 16/376,123, filed Apr. 5, 2019, now U.S. Pat. No. 10,632,343, which is a continuation of U.S. patent application Ser. No. 15/994,517, filed May 31, 2018, now U.S. Pat. No. 10,293,209, which is a continuation of U.S. patent application Ser. No. 15/693,753, filed Sep. 1, 2017, now U.S. Pat. No. 10,010,752, which is a divisional of U.S. patent application Ser. No. 15/223,188, filed Jul. 29, 2016, now U.S. Pat. No. 9,757,619, which is a divisional of U.S. patent application Ser. No. 14/722,695, filed May 27, 2015, now U.S. Pat. No. 9,429,411, which is a continuation of U.S. patent application Ser. No. 14/478,203, filed Sep. 5, 2014, now U.S. Pat. No. 9,389,057, which is a continuation of U.S. patent application Ser. No. 13/293,653, filed Nov. 10, 2011, now U.S. Pat. No. 8,831,407, which claims the benefit of U.S. Provisional Application No. 61/412,285, filed Nov. 10, 2010, the contents of which are incorporated herein in their entirety for any and all non-limiting purposes.

Related Publications (1)
Number Date Country
20230326572 A1 Oct 2023 US
Provisional Applications (1)
Number Date Country
61412285 Nov 2010 US
Divisions (2)
Number Date Country
Parent 15223188 Jul 2016 US
Child 15693753 US
Parent 14722695 May 2015 US
Child 15223188 US
Continuations (7)
Number Date Country
Parent 18161544 Jan 2023 US
Child 18210972 US
Parent 16853937 Apr 2020 US
Child 18161544 US
Parent 16376123 Apr 2019 US
Child 16853937 US
Parent 15994517 May 2018 US
Child 16376123 US
Parent 15693753 Sep 2017 US
Child 15994517 US
Parent 14478203 Sep 2014 US
Child 14722695 US
Parent 13293653 Nov 2011 US
Child 14478203 US