Field of the Invention
One or more embodiments setting forth the ideas described throughout this disclosure pertain to the field of sporting equipment fitting. One or more embodiments present information associated with an optimally fitting piece of sporting equipment, for example the best performing piece of equipment associated with a group of second users within a range or correlation of size, range of motion or speed, or any combination thereof, with respect to the user. For example, embodiments may present information related to a particular make, model, dimension, weight, length, stiffness, or other parameter associated with a piece of sporting equipment through use of a motion capture sensor to measure a user's various dimensions or sizes, range of motion, and speed and/or acceleration for example. Embodiments, for example, prompt for and accept movement to determine distance and/or speed between two locations and/or through a rotation. For example, embodiments may be utilized to determine height, arm length, wrist to floor distance, hand size, longest finger size, leg length, torso length, range of motion, such as but not limited to flexion, extension, abduction, adduction, outward rotation, inward rotation, pronation, supination, inversion and eversion, and speed through any motion or rotation. The distance, range of motion and speed may be obtained for any limb or through motion of any joint or portion of the human body for example. Embodiments may further utilize the same sensor, for example after coupling the sensor to the piece of equipment, to obtain motion capture data from the piece of equipment, such as speed of the equipment when moved through a typical type of motion for the piece of equipment, for example to further optimize the fit. The fit may be optimized by data mining or otherwise through calculation of a correlation of dimensions, range of motion, for example static-active, static-passive and/or dynamic/kinetic range of motion, speed/acceleration, etc., with various other users, whether living or historical, as calculated through visual or other methods. Embodiments thus determine the best performing equipment for that particular type of user, i.e., within a range of size, range of motion and speed, for example the make/model of the longest hitting, most accurate, maximum or minimum scoring, etc., as previously obtained and/or determined from or based on other users having the closest dimensions, range of motion and speed. Embodiments also enable purchasing of the equipment via the mobile device, whether the piece of equipment is shown on television or other broadcast, or based on the user's previous or current performance data. Embodiments may further be configured to predict a first derivative or other derivative based on age or growth rates to determine the best fitting equipment for children, i.e., equipment that will fit for the longest time or otherwise minimize costs and maximize usage. Other embodiments of the invention may suggest exercises and/or stretches that would improve performance to a predicted performance level based on other users' performance data, and suggest equipment that would be appropriate for increased strength or flexibility so that users can "grow into" or "improve into" equipment. In addition, other embodiments of the invention may be utilized over time to detect tight areas, or areas that may be indicative of injury for example, and alert the user.
One or more embodiments of the invention may be utilized for gait analysis for fitting of shoes.
Description of the Related Art
There are no known systems that use a given motion capture sensor to measure a user's size, range of motion and speed, and then utilize that same sensor to capture motion data from a piece of sporting equipment, for example to further optimize the fit of a particular piece of sporting equipment or to gather performance data over time from the same sensor. Existing sporting equipment fitting systems are generally based on size measurements of a user. These systems generally do not take into account the range of motion, or direct measurements of speed through the range of motion of various joints of a user, to optimize a fit for a piece of sporting equipment. There are no known fitting systems based on motion capture data obtained from high resolution sensors, for example that include use of previously stored high resolution motion data from the user, other users or the piece of equipment, or motion capture data obtained through the analysis of historical videos for example. Known systems do not contemplate data mining of motion data and the size, range of motion, speed and age of other users to maximize the performance of the user.
In addition, known systems do not provide a sensor and “app” that may be inexpensively obtained and utilized on a ubiquitous mobile device such as a mobile telephone to prompt for and obtain distance, dimensions, range of motion, speed or other measurement data and suggest optimal equipment and enable the user to immediately purchase the optimally fitting equipment from the same mobile device.
Specifically, most motion capture systems are generally utilized to observe and/or teach effective body mechanics and utilize video recording of an athlete and analysis of the recorded video. This technique has various limitations, including inaccurate and inconsistent subjective analysis based on video for example. Another technique includes motion analysis, for example using at least two cameras to capture three-dimensional points of movement associated with an athlete. Known implementations utilize a stationary multi-camera system that is not portable and thus cannot be utilized outside of the environment where the system is installed, for example during an athletic event such as a golf tournament. These fixed installations are extremely expensive as well. Such prior techniques are summarized in U.S. Pat. No. 7,264,554, filed 26 Jan. 2006, which claims the benefit of U.S. Provisional Patent Application Ser. No. 60/647,751 filed 26 Jan. 2005, the specifications of which are both hereby incorporated herein by reference. Both disclosures are to the same inventor of the subject matter of the instant application. Regardless of how the motion capture data is obtained, the data is generally analyzed on a per-user or per-swing basis and does not contemplate processing on a mobile phone, whereby a user would only need to buy a motion capture sensor and an “app” for a pre-existing mobile phone. In addition, existing solutions do not contemplate mobile use, analysis and messaging and/or comparison to or use of previously stored motion capture data from the user or other users, or data mining of large data sets of motion capture data, for example to obtain or create motion capture data associated with a group of users, for example professional golfers, tennis players, baseball players or players of any other sport, to provide a “professional level” average or exceptional virtual reality opponent. To summarize, motion capture data is generally used for immediate monitoring or sports performance feedback and generally has had limited and/or primitive use in other fields. Any uses for the data with respect to fitting are limited, and generally based on the size of the user, and do not utilize a given sensor to measure the user's size, range of motion and speed as well as the motion of the piece of equipment, for example after coupling the motion capture sensor to the piece of equipment once the uncoupled sensor has been utilized to measure physical parameters of the user without the piece of equipment.
Known motion capture systems generally utilize several passive or active markers or several sensors. There are no known systems that utilize as few as one visual marker or sensor, and an app that for example executes on a mobile device that a user already owns, to analyze and display motion capture data associated with a user and/or piece of equipment. The data is generally analyzed in a laboratory on a per-user or per-swing basis, is not used for any other purpose besides motion analysis or representation of motion of that particular user, and is generally not subjected to data mining. This also makes fitting for sporting equipment more difficult for the user, since the user must travel to a particular installation for custom fitting for example.
There are no known systems that allow for motion capture elements such as wireless sensors to seamlessly integrate or otherwise couple with a user or shoes, gloves, shirts, pants, belts, or other equipment, such as a baseball bat, tennis racquet or golf club for local analysis or later analysis in such a small format that the user is not aware that the sensors are located in or on these items. There are no known systems that provide seamless mounts, for example in the weight port of a golf club or at the end shaft near the handle so as to provide a wireless golf club, configured to capture motion data. Data derived from existing sensors is not saved in a database for a large number of events and is not used relative to anything but the performance at which the motion capture data was acquired. In addition, known motion capture sensors are specifically designed to mount to a piece of sporting equipment in a particular manner and are not intended to measure the user's size, range of motion or speed for example without being mounted on the piece of sporting equipment.
In addition, for sports that utilize a piece of equipment and a ball, there are no known portable systems that allow the user to obtain immediate visual feedback regarding ball flight distance, swing speed, swing efficiency of the piece of equipment or how centered an impact of the ball is, i.e., where on the piece of equipment the collision with the ball has taken place. These systems do not allow users to play games with the motion capture data acquired from other users, or historical players, or from their own previous performances. Known systems do not allow for data mining motion capture data from a large number of swings to suggest or allow the searching for better or optimal equipment to match a user's motion capture data, and do not enable original equipment manufacturers (OEMs) to make business decisions, e.g., improve their products, compare their products to other manufacturers, up-sell products or contact users that may purchase different or more profitable products.
In addition, there are no known systems that utilize motion capture data mining for equipment fitting and subsequent point-of-sale decision making for instantaneous purchasing of equipment that fits an athlete. Furthermore, no known systems allow for custom order fulfillment such as assemble-to-order (ATO) for custom order fulfillment of sporting equipment, for example equipment that is built to customer specifications based on motion capture data mining, and shipped to the customer to complete the point of sales process, for example during play or virtual reality play or for example during a television broadcast.
There are no known systems that enable data mining for a large number of users related to their motion or motion of associated equipment to find patterns in the data that allow for business strategies to be determined based on heretofore undiscovered patterns related to motion. There are no known systems that enable obtaining payment from OEMs, medical professionals, gaming companies or other end users to allow data mining of motion data. For at least the limitations described above there is a need for a fitting system for sporting equipment that utilizes a motion capture sensor, for example uncoupled from the piece of sporting equipment, to measure a user's size, range of motion and speed, and that optimizes a fit for a piece of sporting equipment after coupling the motion capture sensor to the piece of sporting equipment, deriving an optimized fit based on current and/or previously stored or calculated motion data from the same user or from other users that maximally correlate with the user's size, range of motion, speed or any other parameters such as age.
Embodiments of the invention enable a fitting system for sporting equipment using an application that executes on a mobile device, for example a mobile phone, to prompt for and accept motion inputs from a given motion capture sensor to measure a user's size, range of motion, speed and/or acceleration, and then in one or more embodiments utilize that same sensor to capture motion data from a piece of equipment, for example to further optimize the fit and/or to collect further motion capture data. Embodiments may provide information related to the optimal fit or otherwise suggest purchase of a particular piece of sporting equipment. Embodiments may utilize correlation or other algorithms, or data mining of motion data for size, range of motion and speed of other users, to maximize the fit of a piece of equipment for the user based on other users' performance with particular equipment. For example, this enables a user of a similar size, range of motion and speed to data mine for the best performing equipment, e.g., longest drive, lowest putt scores, highest winning percentage, etc., associated with other users having similar characteristics.
Specifically, one or more embodiments of the fitting system for sporting equipment include at least one motion capture element that includes a wireless communication interface configured to transmit motion capture data from the at least one motion capture element, and an application configured to execute on a computer within a mobile device that is configured to wirelessly communicate with the motion capture sensor, and optionally configured to telephonically communicate. In one or more embodiments the application is configured to prompt a first user to move the motion capture sensor to a first location, accept a first motion capture data from the motion capture sensor at the first location via the wireless communication interface, prompt the first user to move the motion capture sensor to a second location, accept a second motion capture data or rotation from the motion capture sensor at the second location via the wireless communication interface, and calculate a distance or rotation between the first and second location or rotation based on the first and second motion capture data. The distance may include a height or an arm length, or a torso length, or a leg length, or a wrist to floor measurement, or a hand size or longest finger size or both the hand size and longest finger size of the first user, or any combination thereof, or any other dimension or length associated with the first user. For example, embodiments of the invention may prompt the user to hold the motion capture sensor in the user's hand and hold the hand on top of the user's head, and then prompt the user to place the sensor on the ground, to calculate the distance therebetween, i.e., the height of the user. In another example, the system may prompt the user to hold the sensor in the hand, for example after decoupling the sensor from a golf club, and then prompt the user to place the sensor on the ground. The system then calculates the distance as the “wrist to floor measurement”, which is commonly used in sizing golf clubs for example. Embodiments of the system may also prompt the user to move the sensor from the side of the user to various positions or rotational values, for example to rotate the sensor while at or through various positions, to calculate the range of motion, for example through flexion, extension, abduction, adduction, lateral rotation, medial rotation, etc. The range of motion may be detected for different types of stretching or movement such as static-active, static-passive and/or dynamic/kinetic stretching, or rotation of any desired joint or body part. In one or more embodiments, the application is further configured to prompt the first user to couple the motion capture sensor to a piece of equipment and prompt the first user to move the piece of equipment through a movement. The application is further configured to accept a third motion capture data from the motion capture sensor for the movement via the wireless communication interface and calculate a speed for the movement based on the third motion capture data. In one or more embodiments, the application is configured to calculate a correlation between the distance and the speed for the first user with respect to a plurality of other users, and present information associated with an optimally fit or sized piece of equipment associated with a second user having a maximum value correlation with at least the distance and the speed of the first user.
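By way of illustration and not limitation, the following Python sketch shows one way the prompt-measure-correlate flow described above might be realized. The position samples, the similarity measure standing in for the "maximum value correlation", and the equipment labels are all illustrative assumptions; a real implementation would derive positions by integrating accelerometer and gyroscope data and may use any correlation formula.

```python
# Illustrative sketch only; names and values are assumptions, not part of
# the specification. Positions are assumed already derived from the sensor.
import math
from dataclasses import dataclass

@dataclass
class SensorSample:
    x: float  # position in meters
    y: float
    z: float

def measured_distance(first: SensorSample, second: SensorSample) -> float:
    """Euclidean distance between two prompted sensor locations, e.g., top
    of head to floor (height) or wrist to floor."""
    return math.dist((first.x, first.y, first.z), (second.x, second.y, second.z))

def best_fit_equipment(user, others):
    """Return equipment of the other user whose (distance, speed) profile is
    closest to this user's -- a simple stand-in for the maximum value
    correlation described above."""
    def similarity(other):
        return -(abs(other["distance"] - user["distance"]) / user["distance"]
                 + abs(other["speed"] - user["speed"]) / user["speed"])
    return max(others, key=similarity)["equipment"]

# Height from two prompted sensor positions:
head = SensorSample(0.0, 0.0, 1.78)    # sensor held on top of the head
floor = SensorSample(0.0, 0.0, 0.0)    # sensor placed on the ground
height = measured_distance(head, floor)  # -> 1.78 m

# Match a user (0.84 m wrist-to-floor, 40 m/s swing) to stored users:
user = {"distance": 0.84, "speed": 40.0}
others = [
    {"distance": 0.80, "speed": 35.0, "equipment": "Driver A, regular flex"},
    {"distance": 0.85, "speed": 41.0, "equipment": "Driver B, stiff flex"},
]
print(height, best_fit_equipment(user, others))  # -> 1.78 Driver B, stiff flex
```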
One such algorithm may for example provide a list of make and model of the lowest scoring golf shaft, or longest hitting baseball bat, associated with a user of similar size, range of motion and speed. Embodiments of the invention may use the speed of the user through motions, or the speed of the equipment through motions, or both, in correlation calculations for example. Embodiments may further be configured to predict a first derivative or other derivative based on age or growth rates to determine the best fitting equipment for children, i.e., equipment that will fit for the longest time or otherwise minimize costs and maximize usage. Other embodiments of the invention may suggest exercises and/or stretches that would improve performance to a predicted performance level based on other users' performance data, and suggest equipment that would be appropriate for increased strength or flexibility so that users can “grow into” or “improve into” equipment. In addition, other embodiments of the invention may be utilized over time to detect tight areas, or areas that may be indicative of injury for example, and alert the user. One or more embodiments of the invention may be utilized for gait analysis for fitting of shoes.
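As a non-limiting sketch of the growth-rate prediction, the following assumes the first derivative is estimated as a least-squares slope of height against age; the measurement history, the linear-growth assumption, and the equipment height ratings are hypothetical.

```python
# Illustrative sketch; history, ratings and the linear-growth assumption are
# hypothetical, not part of the specification.
def growth_rate(history):
    """Least-squares slope of height vs. age, in meters per year."""
    n = len(history)
    mean_age = sum(a for a, _ in history) / n
    mean_h = sum(h for _, h in history) / n
    num = sum((a - mean_age) * (h - mean_h) for a, h in history)
    den = sum((a - mean_age) ** 2 for a, _ in history)
    return num / den

def years_of_fit(height_now, rate, fits_up_to):
    """Years until the child outgrows equipment rated up to 'fits_up_to' m."""
    return max(0.0, (fits_up_to - height_now) / rate)

history = [(8.0, 1.28), (9.0, 1.34), (10.0, 1.40)]  # (age, height in meters)
rate = growth_rate(history)                          # 0.06 m/year here
for club, max_height in [("junior set S", 1.45), ("junior set M", 1.55)]:
    print(club, f"fits ~{years_of_fit(1.40, rate, max_height):.1f} more years")
```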
Other embodiments may display one or more images to enable the first user to view a sporting event. Embodiments may accept an input from the first user to purchase the piece of equipment based on the distance, or range of motion, or the speed previously stored with respect to the first user, or any combination thereof. For example, the piece of equipment may be shown in the sporting event, but sized to fit the user based on the user's previously stored or currently accepted or calculated parameters. Embodiments may also prompt the first user for their age and utilize the age when the correlation is calculated. Embodiments may present information associated with a grip or length of the optimally sized piece of equipment, or stiffness, or model or manufacturer, or any combination thereof.
Embodiments of the application may also be configured to recognize when the at least one motion capture element is removed from the piece of equipment based on the motion capture data. The application may for example accept gestures, or analyze the motion to determine that it could not have been output from a particular piece of equipment. Alternatively, or in combination, embodiments of the invention may recognize when the at least one motion capture element is coupled with the piece of equipment based on the motion capture data. For example, if the motion data is analyzed and is determined to have a path of motion indicative of a baseball bat swing or golf swing, then the system may indicate that the motion capture sensor is currently coupled to the piece of equipment. Furthermore, since different pieces of equipment may utilize the same sensor, for example after decoupling the sensor from one piece and placing it in another, particular types of equipment, for example a skateboard versus a tennis racquet, may be automatically determined based on a barrel roll of the skateboard or a serve of the racquet, each of which indicates a path of motion that is unique to, or at least indicative of, that type of equipment. This enables automatic sensing of the piece of equipment currently coupled with the sensor.
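One non-limiting way such recognition might be sketched is a coarse rule-based classifier over motion features; the thresholds and equipment labels below are assumptions for illustration only, not values from this disclosure.

```python
# Illustrative rules only; thresholds and labels are assumptions.
def classify_motion(peak_speed_mps, total_rotation_deg):
    """Coarse inference of coupling state and equipment type from the
    motion signature reported by the sensor."""
    if peak_speed_mps < 15:
        return "sensor likely uncoupled (body movement only)"
    if total_rotation_deg >= 360:
        return "skateboard (barrel-roll signature)"
    if peak_speed_mps > 30:
        return "golf club or baseball bat (high tip speed)"
    return "tennis racquet (serve-range speed, partial rotation)"

print(classify_motion(peak_speed_mps=45.0, total_rotation_deg=90.0))
```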
Embodiments of the invention may utilize data mining on the motion capture data to obtain patterns for users or equipment, or use the motion capture data of a given user or other user in particular embodiments of the invention. Data mining relates to discovering new patterns in large databases wherein the patterns are previously unknown. Many methods may be applied to the data to discover new patterns including statistical analysis, neural networks and artificial intelligence for example. Due to the large amount of data, automated data mining may be performed by one or more computers to find unknown patterns in the data. Unknown patterns may include groups of related data, anomalies in the data, dependencies between elements of the data, classifications and functions that model the data with minimal error, or any other type of unknown pattern. Displays of data mining results may include displays that summarize newly discovered patterns in a way that is easier for a user to understand than large amounts of pure raw data. Results of the data mining process include improved market research reports, product improvement, lead generation and targeted sales. Generally, any data that will be subjected to data mining must first be cleansed; the data is then mined and the results are generally validated. Businesses may increase profits using data mining. Examples of benefits of embodiments of the invention include customer relationship management, to highly target individuals based on patterns discovered in the data. In addition, market basket analysis data mining enables identifying products that are purchased or owned by the same individuals, which can be utilized to offer products to users that own one product but do not own another product that is typically owned by other users. Other areas of data mining include analyzing large sets of motion data from different users to suggest exercises to improve performance based on performance data from other users. For example, if one user has less rotation of the hips during a swing versus the average user, then exercises to improve flexibility or strength may be suggested by the system. In a golf course embodiment, golf course planners may determine, over a large number of users on a golf course, which holes should be adjusted in length or difficulty to obtain more discrete values for the average number of shots per hole, or for determining the amount of time between golfers, for example at a certain time of day or for golfers of a certain age. In addition, sports and medical applications of data mining include determining morphological changes in user performance over time, for example versus diet or exercise changes, to determine what improves performance the most.
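As a minimal, non-limiting illustration of the hip-rotation example above, the sketch below flags a user who falls more than one standard deviation below a population mean mined from the database; the one-standard-deviation threshold and all values are assumptions.

```python
# Illustrative sketch; the one-standard-deviation threshold is an assumption.
import statistics

def suggest_exercise(user_hip_rotation_deg, population_rotations_deg):
    mean = statistics.mean(population_rotations_deg)
    stdev = statistics.stdev(population_rotations_deg)
    if user_hip_rotation_deg < mean - stdev:
        return (f"Hip rotation {user_hip_rotation_deg:.0f} deg is below the "
                f"population mean of {mean:.0f} deg; suggest hip flexibility "
                f"and core strength exercises.")
    return "Hip rotation within the normal range."

print(suggest_exercise(38.0, [45.0, 50.0, 48.0, 52.0, 47.0]))
```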
For example, embodiments that utilize motion capture elements allow for analyzing the data obtained from the apparatus and enable the presentation of unique displays associated with the user, such as 3D overlays onto images of the body of the user to visually depict the captured motion data. In addition, these embodiments may also utilize active wireless technology such as BLUETOOTH® Low Energy for a range of up to 50 meters to communicate with a golfer's mobile computer. Embodiments of the invention also allow for display of queries for counting a stroke for example as a result of receiving a golf club ID, for example via an RFID reader or alternatively via wireless communication using BLUETOOTH® or IEEE 802.11 for example. Use of BLUETOOTH® Low Energy chips allows for a club to be in sleep mode for up to 3 years with a standard coin cell battery, thus reducing required maintenance. One or more embodiments of the invention may utilize more than one radio, of more than one technology for example. This allows for a level of redundancy that increases robustness of the system. For example, if one radio no longer functions, e.g., the BLUETOOTH® radio for example, then the IEEE 802.11 radio may be utilized to transfer data and warn the golfer that one of the radios is not functioning, while still allowing the golfer to record motion data and count shots associated with the particular club. For embodiments of the invention that utilize a mobile device (or more than one mobile device) without camera(s), sensor data may be utilized to generate displays of the captured motion data, while the mobile device may optionally obtain images from other cameras or other mobile devices with cameras. For example, display types that may or may not utilize images of the user may include ratings, calculated data and time line data. Ratings associated with the captured motion can also be displayed to the user in the form of numerical or graphical data with or without a user image, for example an “efficiency” rating. Calculated data, such as a predicted ball flight path data can be calculated and displayed on the mobile device with or without utilizing images of the user's body. Data depicted on a time line can also be displayed with or without images of the user to show the relative peaks of velocity for various parts of the equipment or user's body for example. Any of these types of measurements that are for example associated with speed are in keeping with the fitting aspects of the invention, and the use of speed herein may include any derived quantity associated with motion for example when used in conjunction with fitting of equipment with a particular user.
In one or more embodiments of the invention, fixed cameras such as at a tennis tournament, football game, baseball game, car or motorcycle race, golf tournament or other sporting event can be utilized with a wireless interface located near the player/equipment having motion capture elements so as to obtain, analyze and display motion capture data. In this embodiment, real-time or near real-time motion data can be displayed on the video for augmented video replays. An increase in the entertainment level is thus created by visually displaying how fast equipment is moving during a shot, for example with rings drawn around a player's hips and shoulders. Embodiments of the invention also allow images or videos from other players having mobile devices to be utilized on a mobile device related to another user, so that users don't have to switch mobile phones for example. In one embodiment, a video of a first user's piece of sporting equipment in motion, obtained by a second user having a video-camera-equipped mobile phone, may automatically transfer to the first user for display with motion capture data associated with the first user. Video and images may be uploaded into the database and data mined through image analysis to determine the types/colors of clothing or shoes, for example, that users are wearing. The equipment thus analyzed or otherwise input into the system may be broadcast so that other embodiments of the invention may be utilized to purchase the equipment, for example as sized to the user, or as sized to maximize performance as correlated with other users for example.
Based on the display of data, the user can determine the equipment that fits the best and immediately purchase the equipment via the mobile device. For example, when deciding between two sets of skis, a user may try out both pairs that are instrumented with motion capture elements, wherein the motion capture data is analyzed to determine which pair of skis enables more efficient movement. For golf embodiments, when deciding between two golf clubs, a user can take swings with different clubs and, based on the analysis of the captured motion data, quantitatively determine which club performs better, for example in conjunction with size, range of motion or speed of the user or any combination thereof, as determined using a motion capture sensor alone and/or in combination with a piece of equipment. Custom equipment may be ordered through an interface on the mobile device from a vendor that can assemble-to-order custom-built equipment and ship the equipment to the user for example. Putter shafts, for example, that are normally a standard length can be custom made for a particular user based on captured motion data as the user putts with an adjustable length shaft for example. Data mining of the motion capture data, shot count data and distances, for example, allows users having similar swing characteristics to be compared against a current user, wherein equipment that delivers longer shots for a given swing velocity, for a user of a particular size and age for example, may be suggested or searched for by the user to improve performance. OEMs may also determine which make and model of club delivers the best overall performance for given swing speeds. One skilled in the art will recognize that this applies to all activities involving motion, not just golf.
Embodiments of the system may utilize a variety of sensor types. In one or more embodiments of the invention, active sensors may integrate with a system that permits passive or active visual markers to be utilized to capture motion of particular points on a user's body or equipment. This may be performed in a simple two-dimensional manner, or in a three-dimensional manner if the mobile device is configured with two or more cameras, or if multiple cameras or mobile devices are utilized to capture images such as video and share the images in order to create triangulated three-dimensional motion data from a set of two-dimensional images obtained from each camera. Another embodiment of the invention may utilize inertial measurement units (IMU) or any other sensors that can produce any combination of orientation, position, velocity and/or acceleration information for the mobile device. The sensors may thus obtain data that may include any combination of one or more values associated with orientation (vertical or North/South or both), position (either via the Global Positioning System, i.e., “GPS”, or through triangulation), velocity (in all three axes) and/or acceleration (in all three axes). All motion capture data obtained from the various sensor types may be saved in a database for analysis, monitoring, compliance, game playing or other use and/or data mining, regardless of the sensor type.
In one or more embodiments of the invention, a sensor may be utilized that includes a passive marker or active marker on an outside surface of the sensor, so that the sensor may also be utilized for visual tracking (either two-dimensional or three-dimensional) and for orientation, position, velocity, acceleration or any other physical quantity produced by the sensor. Visual marker embodiments of the motion capture element(s) may be passive or active, meaning that they may either have a visual portion that is visually trackable or may include a light-emitting element such as a light emitting diode (LED) that allows for image tracking in low light conditions. This for example may be implemented with a graphical symbol or colored marker at the end of the shaft near the handle or at the opposing end of the golf club at the head of the club. Images or videos of the markers may be analyzed locally or saved in the database and analyzed and then utilized in data mining.
Embodiments of the motion capture sensors may be generally mounted on or near one or more ends, or opposing ends, of sporting equipment, for example such as a golf club, and/or anywhere in between (for EI measurements), and may integrate with other sensors coupled to equipment, such as weapons, medical equipment, wristbands, shoes, pants, shirts, gloves, clubs, bats, racquets, balls, etc., and/or may be attached to a user in any possible manner. For example, a sensor may be coupled with a rifle to determine where the rifle was pointing when recoil was detected by the motion capture sensor. This data may be transmitted to a central server, for example using a mobile computer such as a mobile phone or other device, and analyzed for war games practice for example. In addition, one or more embodiments of the sensor can fit into a weight port of a golf club, and/or in the handle end of the golf club. Other embodiments may fit into the handle of, or end of, a tennis racquet or baseball bat for example. One or more embodiments of the invention may also operate with balls that have integrated sensors as well. One or more embodiments of the mobile device may include a small mountable computer such as an IPOD® SHUFFLE® or IPOD® NANO® that may or may not have an integrated display, and which is small enough to mount on a shaft of a piece of sporting equipment and not affect a user's swing. Alternatively, the system may calculate the virtual flight path of a ball that has come in contact with equipment moved by a player. For example, with a baseball bat or tennis racquet or golf club having a sensor integrated into a weight port or other portion of the end of the club striking the golf ball, and having a second sensor located in the tip of the handle of the golf club, or in one or more gloves worn by the player, an angle of impact can be calculated for the club. By knowing the loft of the face of the club, an angle of flight may be calculated for the golf ball. In addition, by sampling the sensor at the end of the club at a high enough speed to determine oscillations indicative of where on the face of the club the golf ball was struck, a quality of impact may be determined. These types of measurements and the analysis thereof help an athlete improve, and for fitting purposes, allow an athlete to immediately purchase equipment that fits correctly. Centering data may be uploaded to the database and data mined for patterns related to the bats, racquets or clubs with the best centering on average, or the lowest torsion values for example, on a per-manufacturer basis for product improvement. Any other unknown patterns in the data that are discovered may also be presented or suggested to users, searched on by users, or paid for, for example by manufacturers or users.
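By way of a deliberately simplified, non-limiting sketch of these impact calculations: the following approximates launch angle from static loft and forward shaft lean, and scores impact quality from post-impact oscillation amplitude. The physics is reduced (a real model would include attack angle and gear effect) and the scaling constants are illustrative assumptions.

```python
# Deliberately simplified physics; constants are illustrative assumptions.
def launch_angle(static_loft_deg, shaft_lean_deg):
    """Dynamic loft approximated as static loft minus forward shaft lean;
    a real model would also include attack angle and gear effect."""
    return static_loft_deg - shaft_lean_deg

def impact_quality(oscillation_samples_g):
    """Smaller post-impact oscillation swing suggests a more centered
    strike; returns a 0..1 score (1 = perfectly centered)."""
    amplitude = max(oscillation_samples_g) - min(oscillation_samples_g)
    full_scale = 2.0  # assumed worst-case amplitude for a toe/heel strike
    return max(0.0, 1.0 - amplitude / full_scale)

print(launch_angle(10.5, 2.0))             # ~8.5 degrees
print(impact_quality([0.1, -0.15, 0.12]))  # near 1: close-to-center hit
```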
One or more embodiments of the motion capture sensor may be removed from one piece of sporting equipment and placed on another type of equipment or article of clothing so that the user does not have to purchase motion capture sensors for all equipment and clothes associated with the user. This is possible since embodiments of the sensor may couple with any enclosure sized to fit the sensor. In one or more embodiments, a cap is removed, then the sensor is removed and inserted into another piece of equipment or article of clothing for example.
One or more embodiments of the sensor may contain charging features such as a mechanical eccentric weight, as utilized in some watches known as “automatic” or “self-winding” watches, optionally including a small generator, or inductive charging coils for indirect electromechanical charging of the sensor power supply. Other embodiments may utilize plugs for direct charging of the sensor power supply or electromechanical or microelectromechanical (MEMS) based charging elements. Any other type of power micro-harvesting technology may be utilized in one or more embodiments of the invention. One or more embodiments of the sensor may utilize power saving features including gestures that power the sensor on or off. Such gestures may include motion, physical switches, contact with the sensor, or wireless commands to the sensor, for example from a mobile device that is associated with the particular sensors. Other elements that may couple with the sensor include a battery, low power microcontroller, antenna and radio, heat sink, recharger and overcharge sensor for example. In addition, embodiments of the invention allow for power down of some or all of the components of the system until an electronic signal from accelerometers or a mechanical switch determines that the club has moved for example.
One or more embodiments of the invention enable Elasticity Inertia or EI measurement of sporting equipment and even body parts for example. Placement of embodiments of the sensor along the shaft of a golf club, tennis racquet, baseball bat, hockey stick, shoe, human arm or any other item that is not perfectly stiff enables measurement of the amount of flex at points where sensors are located, or between sensors. The angular differences in each sensor over time allow for calculation of not only a flex profile, but also a flex profile that is dependent on time or force. For example, known EI machines use static weights between two support points to determine an EI profile. These machines therefore cannot detect whether the EI profile is dependent upon the force applied or upon the time at which the force is applied; for example, EI profiles may be non-linear with respect to force or time. Example materials that are known to have different physical properties with respect to time include Maxwell materials and non-Newtonian fluids.
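A worked, non-limiting sketch of the segment calculation under small-deflection beam theory: the angular difference between two shaft-mounted sensors divided by their spacing approximates the segment curvature, and EI follows from EI = M/κ for a known applied bending moment M. The sensor spacing, angles and test moment below are assumed values.

```python
# Illustrative small-deflection sketch; sensor spacing, angles and the test
# moment are assumed values.
import math

def curvature(theta1_deg, theta2_deg, spacing_m):
    """Approximate curvature (1/m) of the shaft segment between two
    sensors, from their difference in bend angle."""
    return math.radians(theta2_deg - theta1_deg) / spacing_m

def flexural_rigidity(moment_nm, kappa):
    """EI in N*m^2 for the segment, via EI = M / curvature."""
    return moment_nm / kappa

# Sensors 0.5 m apart read a 2-degree angular difference under a 10 N*m load:
kappa = curvature(0.0, 2.0, 0.5)
print(f"EI ~ {flexural_rigidity(10.0, kappa):.0f} N*m^2")
# Repeating the measurement at several loads and loading rates yields the
# force- and time-dependent flex profile described above.
```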
A user may also view the captured motion data in a graphical form on the display of the mobile device, or for example on a set of glasses that contains a video display. The captured motion data obtained from embodiments of the motion capture element may also be utilized to augment a virtual reality display of the user in a virtual environment. Virtual reality or augmented reality views of patterns that are found in the database via data mining are also in keeping with the spirit of the invention. Users may also see augmented information such as an aim assist or aim guide that shows, for example, where a shot should be placed, for example based on existing wind conditions, or to account for hazards, e.g., trees that are in the way of a desired destination for a ball, i.e., the golf hole for example.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee. The above and other aspects, features and advantages of the ideas conveyed through this disclosure will be more apparent from the following more particular description thereof, presented in conjunction with the following drawings wherein:
A fitting system for sporting equipment will now be described. In the following exemplary description numerous specific details are set forth in order to provide a more thorough understanding of the ideas described throughout this specification. It will be apparent, however, to an artisan of ordinary skill that embodiments of ideas described herein may be practiced without incorporating all aspects of the specific details described herein. In other instances, specific aspects well known to those of ordinary skill in the art have not been described in detail so as not to obscure the disclosure. Readers should note that although examples of the innovative concepts are set forth throughout this disclosure, the claims, and the full scope of any equivalents, are what define the invention.
Specifically, one or more embodiments of the fitting system for sporting equipment include at least one motion capture element 111 that includes a wireless communication interface configured to transmit motion capture data from the at least one motion capture element and an application configured to execute on computer 160 within a mobile device, e.g., 101, 102, 102a, 102b, 103 or 105 that is configured to wirelessly communicate with the motion capture sensor, and optionally configured to telephonically communicate.
Other embodiments may display one or more images to enable the first user to view a sporting event, for example via TV 143 or via any of the mobile devices, wireless devices capable of displaying an image 103, or other computers 101, 102, 102a, 102b or 105. Embodiments may accept an input from the first user to purchase the piece of equipment based on the distance, or range of motion, or the speed previously stored with respect to the first user, or any combination thereof. For example, the piece of equipment may be shown in the sporting event, but sized to fit the user based on the user's previously stored or currently accepted or calculated parameters. Embodiments may also prompt the first user for their age and utilize the age when the correlation is calculated. Embodiments may present information associated with a grip or length of the optimally sized piece of equipment, or stiffness, or model or manufacturer, or any combination thereof.
Embodiments of the application may also be configured to recognize when the at least one motion capture element is removed from the piece of equipment based on the motion capture data. The application may for example accept gestures, or analyze the motion to determine that it could not have been output from a particular piece of equipment. Alternatively, or in combination, embodiments of the invention may recognize when the at least one motion capture element is coupled with the piece of equipment based on the motion capture data. For example, if the motion data is analyzed and is determined to have a path of motion indicative of a baseball bat swing or golf swing, then the system may indicate that the motion capture sensor is currently coupled to the piece of equipment. Furthermore, since different pieces of equipment may utilize the same sensor, for example after decoupling the sensor from one piece and placing it in another, particular types of equipment, for example a skateboard versus a tennis racquet, may be automatically determined based on a barrel roll of the skateboard or a serve of the racquet, each of which indicates a path of motion that is unique to, or at least indicative of, that type of equipment. This enables automatic sensing of the piece of equipment currently coupled with the sensor.
Each mobile device 101, 102, 102a, 102b may optionally include an internal identifier reader 190, for example an RFID reader, or may couple with an identifier reader or RFID reader (see mobile device 102) to obtain identifier 191. Alternatively, embodiments of the invention may utilize any wireless technology in any of the devices to communicate an identifier that identifies equipment 110 to the system. The system generally may be utilized to fit any type of piece of equipment 110. The motion capture sensor(s) may couple with the user or piece of equipment via mount 192, for example to a golf club, or baseball bat, tennis racquet, hockey stick, weapon, stick, sword, or any other piece of equipment for any sport, or other sporting equipment such as a shoe, belt, gloves, glasses, hat, or any other item. The at least one motion capture element 111 may be placed at one end, both ends, or anywhere between both ends of piece of equipment 110, or anywhere on user 150, and may be utilized for EI measurements of any item. The motion capture element may optionally include a visual marker, either passive or active, and/or may include a wireless sensor, for example any sensor capable of providing any combination of one or more values associated with an orientation (North/South and/or up/down), position, velocity and/or acceleration of the motion capture element. The computer may be configured to obtain data associated with an identifier unique to each piece of equipment 110, e.g., clothing, bat, etc., for example from an RFID coupled with club 110, i.e., identifier 191, and optionally associated with the at least one motion capture element, either visually or wirelessly, analyze the data to form motion analysis data and display the motion analysis data on display 120 of mobile device 101. Motion capture element 111 may be mounted on or near the equipment or on or near the user via motion capture mount 192. The motion capture data from motion capture element 111, any data associated with the piece of equipment 110, such as identifier 191, and any data associated with user 150, or any number of such users 150, such as second user 152, may be stored locally in memory, or in a database local to the computer, or in a remote database, for example database 172. Data may be stored in database 172 from each user 150, 152, for example when a network or telephonic network link is available from motion capture element 111 to mobile device 101 and from mobile device 101 to network 170 or Internet 171 and to database 172. Data mining is then performed on a large data set associated with any number of users and their specific characteristics and performance parameters. For example, in a golf embodiment of the invention, a club ID is obtained from the golf club and a shot is detected by the motion capture element. Mobile computer 101 stores images/video of the user, receives the motion capture data for the events/hits/shots/motion and the location of the event on the course and subsequent shots, determines any parameters for each event, such as distance or speed at the time of the event, and then performs any local analysis and displays performance data on the mobile device. When a network connection from the mobile device to network 170 or Internet 171 is available, or for example after a round of golf, the images/video, motion capture data and performance data are uploaded to database 172 for later analysis and/or display and/or data mining.
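A minimal, non-limiting sketch of this store-and-forward flow follows; the record layout, class names and values are illustrative assumptions rather than the specification's data model.

```python
# Illustrative store-and-forward sketch; record layout and names are
# assumptions, not the specification's data model.
from dataclasses import dataclass, field

@dataclass
class ShotEvent:
    club_id: str       # e.g., read from RFID identifier 191
    t: float           # timestamp in seconds
    lat: float
    lon: float
    speed_mps: float   # parameter derived at event time

@dataclass
class EventStore:
    pending: list = field(default_factory=list)

    def record(self, event: ShotEvent):
        self.pending.append(event)         # stored locally in memory

    def flush(self, network_available: bool, upload):
        """Upload queued events, e.g., to database 172, when a link exists."""
        if network_available:
            for event in self.pending:
                upload(event)
            self.pending.clear()

store = EventStore()
store.record(ShotEvent("club-7", 12.3, 37.77, -122.42, 44.0))
store.flush(network_available=True, upload=print)
```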
In one or more embodiments, users 151, such as original equipment manufacturers, pay for access to the database, for example via a computer such as computer 105 or mobile computer 101, or from any other computer capable of communicating with database 172, for example via network 170, Internet 171 or via website 173 or a server that forms part of or is coupled with database 172. Data mining may execute on database 172, for example on a local server computer, or may be run on computer 105 or mobile device 101, 102, 102a or 102b and access a standalone embodiment of database 172 for example. Data mining results may be displayed on mobile device 101, computer 105, television broadcast or web video originating from camera 130, 130a and 130b, or 104, or accessed via website 173, or any combination thereof.
One or more embodiments of the system may utilize a mobile device that includes at least one camera 130, for example coupled to the computer within the mobile device. This allows for the computer within mobile device 101 to command the camera 130 to obtain an image or images, for example of the user during an athletic movement. The image(s) of the user may be overlaid with displays and ratings to make the motion analysis data more understandable to a human for example. Alternatively, detailed data displays without images of the user may also be displayed on display 120, or for example on the display of computer 105. In this manner, two-dimensional images and subsequent display thereof are enabled. If mobile device 101 contains two cameras, as shown in mobile device 102, i.e., cameras 130a and 130b, then the cameras may be utilized to create a three-dimensional data set through image analysis of the visual markers for example. This allows for distances and positions of visual markers to be ascertained and analyzed. Images and/or video from any camera in any embodiments of the invention may be stored on database 172, for example associated with user 150, for data mining purposes. In one or more embodiments of the invention, image analysis on the images and/or video may be performed to determine the make/models of equipment, clothes, shoes, etc., that are utilized, for example per age of user 150 or time of day of play, or to discover any other pattern in the data.
Alternatively, for embodiments of mobile devices that have only one camera, multiple mobile devices may be utilized to obtain two-dimensional data in the form of images that are triangulated to determine the positions of visual markers. In one or more embodiments of the system, mobile device 101 and mobile device 102a share image data of user 150 to create three-dimensional motion analysis data. By determining the positions of mobile devices 101 and 102 (via position determination elements such as GPS chips in the devices, as is common, or via cell tower triangulation, which are not shown for brevity but are generally located internally in mobile devices just as computer 160 is), and by obtaining data from motion capture element 111, for example the locations of pixels in each image where the visual markers appear, distances and hence speeds are readily obtained, as one skilled in the art will recognize.
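The triangulation step might be sketched, by way of illustration and not limitation, as intersecting bearing rays from two cameras at known positions, in two dimensions for brevity (a third axis follows the same pattern). The camera positions, bearings derived from pixel locations, and frame rate are illustrative assumptions.

```python
# Illustrative 2D triangulation sketch; positions, bearings and frame rate
# are assumed values.
import math

def triangulate(cam1, bearing1_deg, cam2, bearing2_deg):
    """Intersect two bearing rays; each cam is an (x, y) position in meters."""
    t1, t2 = math.radians(bearing1_deg), math.radians(bearing2_deg)
    # Ray i: point = cam_i + s * (cos(t_i), sin(t_i)); solve for intersection.
    d1, d2 = (math.cos(t1), math.sin(t1)), (math.cos(t2), math.sin(t2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    s = ((cam2[0] - cam1[0]) * d2[1] - (cam2[1] - cam1[1]) * d2[0]) / denom
    return (cam1[0] + s * d1[0], cam1[1] + s * d1[1])

p1 = triangulate((0, 0), 45.0, (10, 0), 135.0)   # marker in frame k
p2 = triangulate((0, 0), 46.0, (10, 0), 134.0)   # marker in frame k+1
frame_dt = 1 / 30                                 # assumed 30 fps
speed = math.dist(p1, p2) / frame_dt
print(f"marker at {p1}, speed ~ {speed:.1f} m/s")
```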
Camera 103 may also be utilized either for still images or as is now common, for video. In embodiments of the system that utilize external cameras, any method of obtaining data from the external camera is in keeping with the spirit of the system including wireless communication of the data, or via wired communication as when camera 103 is docked with computer 105 for example, which then may transfer the data to mobile device 101.
In one or more embodiments of the system, the mobile device on which the motion analysis data is displayed is not required to have a camera, i.e., mobile device 102b may display data even though it is not configured with a camera. As such, mobile device 102b may obtain images from any combination of cameras on mobile device 101, 102, 102a, camera 103 and/or television camera 104 so long as any external camera may communicate images to mobile device 102b. Alternatively, no camera is required at all to utilize the system.
In one or more embodiments, as also shown in
In one or more embodiments, the computer is further configured to broadcast the images to enable a multiplicity of viewers to purchase the piece of equipment based on the images, and may also be configured to broadcast an advertisement with information related to purchasing the piece of equipment based on the images. For example, the player, or piece of equipment of interest, may have a new maximum power factor for a given swing, or compared to the average power factor of average users or professionals, suggesting that the piece of equipment used by the player may improve performance, which may be of interest to a potential buyer. Furthermore, for example, an advertisement may be displayed at the bottom of a display screen, or anywhere else on a display screen, showing the new maximum power factor along with a URL or other information related to the equipment, allowing the viewer to purchase the equipment. Other information related to the equipment may comprise phone numbers, addresses, names, vendors, events or any other data helpful to the viewer in purchasing the equipment. In addition, or alternatively, the computer may be further configured to broadcast the images to enable a multiplicity of viewers to order a custom fitted piece of equipment over a network, for example by specifying their height or other dimensions, for example alone or in combination with previously stored motion capture data or physical parameters as measured or derived therefrom.
At least one of the previously disclosed embodiments may also be configured to intermittently receive the motion capture data and synchronize the images in time with the motion capture data, for example from motion capture sensor(s) 111. This enables video capture and motion capture data to be combined at a later time, as opposed to real-time combination of video and data, and enables intelligent low power usage on the motion capture element since the transmitter is not required to transmit continuously. In one or more embodiments, the computer may be configured to intermittently receive the motion capture data and synchronize the images in time with the motion capture data based on location information associated with the images, and location information, an identifier and time associated with the motion capture element. The computer of one or more embodiments may also be configured to intermittently receive the motion capture data and synchronize the images in time with the motion capture data based on time and an identifier associated with the images, and time and an identifier associated with the motion capture data element. Also, in one or more embodiments, the computer may be configured to intermittently receive the motion capture data and synchronize the images in time with the motion capture data based on time, location information and a motion event associated with the images, and time, location information and a motion event associated with the motion capture data element. Configuring the computer as such allows the system to identify and locate the user associated with the images and motion capture data received, by either using time and an identifier of the images and motion capture element, or time, location and an event associated with the images or motion capture data element. Using an identifier, for example, allows the system to accurately identify a specific motion capture element associated with the user or piece of equipment, especially when the motion capture data is obtained from a previously stored or recorded event, rather than in real time. Also, using an event associated with the images and motion capture data element, in addition to the location and time, for example, allows the system to accurately identify a specific motion associated with the user or piece of equipment. A specific motion may include a swing, a strike, a hit, or any other motion-related data associated with the user or piece of equipment. For example, if multiple players are located on a golf course, or if a player is advancing from one hole to the next on a golf course, then using the location, time and event (or identifier in some instances) associated with the player or pieces of equipment, the system is able to identify which player performed which event at which location and at what time. Furthermore, the system is able to correlate the data received to the correct player, based on the location, time and event (or in some instances, identifier) information available. This enables the system to broadcast images with augmented motion data at a later time and still accurately associate the data and information obtained with a specific user or piece of equipment and with the images thereof.
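By way of a non-limiting sketch, such synchronization might pair each motion event with the nearest video frame that agrees in identifier, time and location; the record layouts and tolerance values below are illustrative assumptions.

```python
# Illustrative sketch; record layouts and tolerances are assumptions.
def match_events_to_frames(events, frames, time_tol_s=0.05, loc_tol_m=5.0):
    """Pair each motion event with the nearest video frame that shares the
    sensor identifier and agrees in time and location within tolerance."""
    pairs = []
    for ev in events:
        candidates = [
            f for f in frames
            if f["sensor_id"] == ev["sensor_id"]
            and abs(f["t"] - ev["t"]) <= time_tol_s
            and abs(f["x"] - ev["x"]) <= loc_tol_m
        ]
        if candidates:
            pairs.append((ev, min(candidates,
                                  key=lambda f: abs(f["t"] - ev["t"]))))
    return pairs

events = [{"sensor_id": "club-7", "t": 12.34, "x": 100.0, "kind": "swing"}]
frames = [{"sensor_id": "club-7", "t": 12.36, "x": 101.0, "frame": 371}]
print(match_events_to_frames(events, frames))
```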
For television broadcasts, motion capture element 111 wirelessly transmits data that is received by antenna 106. The wireless sensor data thus obtained from motion capture element 111 is combined with the images obtained from television camera 104 to produce displays with augmented motion analysis data that can be broadcast to televisions, computers such as computer 105, mobile devices 101, 102, 102a, 102b or any other device configured to display images, for example as broadcast using television broadcast equipment 141. The motion analysis data can be positioned on display 120, or television 143, or a computer screen on computer 105, for example by knowing the location of a camera (for example via GPS information) and the direction and/or orientation that the camera is pointing, so long as the sensor data includes location data (for example GPS information). In other embodiments, visual markers or image processing may be utilized to lock the motion analysis data to the image, e.g., the golf club head can be tracked in the images and the corresponding high, middle and low positions of the club can be utilized to determine the orientation of user 150 to camera 130 or 104 or 103, for example to correctly plot the augmented data onto the image of user 150. By time stamping images and time stamping motion capture data, for example after synchronizing the timer in the microcontroller with the timer on the mobile device, and then scanning the images for visual markers or sporting equipment at various positions, simplified motion capture data may be overlaid onto the images. Any other method of combining images from a camera and motion capture data may be utilized in one or more embodiments of the invention. Any other algorithm for properly positioning the motion analysis data on display 120 with respect to a user (or any other display such as on computer 105) may be utilized in keeping with the spirit of the system. In one or more embodiments, the zero-velocity point in a swing, for example at the maximum of a backswing, may be utilized to pinpoint a club head in an image, wherein the maximum rearmost position in the image may be matched with the horizontal orientation obtained from the motion capture data, while the strike point in the image may be matched with the impact point where impact oscillations begin to occur in the motion capture data. A line may then be drawn, for example tracing the path of the contrast or color of the club head, as directed or accepted as inputs into computer 140. The points that are connected may be further modified on computer 140, and the drawing may thus be completed and broadcast out to the Internet and over the television broadcast equipment for example.
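The two synchronization anchors described above, the zero-velocity point at the top of the backswing and the onset of impact oscillations, might be located as in the following non-limiting sketch; the sample values and impact threshold are illustrative assumptions.

```python
# Illustrative sketch; sample values and the impact threshold are assumptions.
def find_sync_points(velocities, accelerations, impact_threshold=30.0):
    """Return (index of zero-velocity crossing, index of impact onset),
    for matching to the rearmost and strike frames of the video."""
    top = next(i for i in range(1, len(velocities))
               if velocities[i - 1] > 0 >= velocities[i])
    impact = next(i for i, a in enumerate(accelerations)
                  if abs(a) >= impact_threshold)
    return top, impact

v = [2.0, 1.0, 0.2, -0.1, -3.0, -8.0]   # club-head velocity samples
a = [1.0, 1.2, 0.8, 1.5, 4.0, 55.0]     # acceleration magnitude in g
print(find_sync_points(v, a))           # -> (3, 5)
```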
One display that may be generated and shown on mobile device 101 is a BULLET TIME® view, using two or more cameras selected from mobile devices 101, 102, 102a, camera 103 and/or television camera 104, or any other external camera. In this embodiment of the system, the computer is configured to obtain two or more images of user 150 and data associated with the at least one motion capture element (whether a visual marker or wireless sensor), wherein the two or more images are obtained from two or more cameras, and the computer is configured to generate a display that shows user 150 in slow motion, viewed from around the user at various angles while the viewpoint changes at normal speed. Such an embodiment, for example, allows a group of fans to create their own BULLET TIME® shot of a golf pro at a tournament. The shots may be sent to computer 105, where any required image processing may be performed, and broadcast to a television audience, for example. In other embodiments of the system, the users of the various mobile devices share their own sets of images and/or upload their shots to a website for later viewing, for example. Embodiments of the invention also allow images or videos captured by other players' mobile devices to be utilized on the mobile device of another user, so that users do not have to exchange mobile phones. In one embodiment, a video of a piece of equipment in motion that is associated with a first user, but captured by a second user having the video camera mobile phone, may be automatically transferred to the first user for display with the motion capture data associated with the first user.
There are myriad applications that benefit from, and are enabled by, embodiments of the system that provide for viewing and analyzing motion capture data on the mobile computer or server/database, for example by users 151 data mining database 172. Users 151 may include compliance monitors, for example parents, children or the elderly, managers, doctors, insurance companies, police, military, or any other entity, such as equipment manufacturers that may data mine for product improvement. For example, in a tennis embodiment, database 172 may be mined by searching for top service speeds in table 183 for users of a particular size, range of motion or speed per table 180a, or age per table 180; in a golf embodiment, it may be mined by searching for distances, i.e., differences in sequential locations in table 183, against swing speed in the sensor data field of table 183, to determine which make or model would be the optimal scoring or fitting piece of equipment for a particular user, based on the data associated with other similar users. Other embodiments related to compliance enable messages to be generated from mobile computer 101 or from the server/database when thresholds for G-forces (high, zero, or any other levels) are crossed, and to be sent to compliance monitors, managers, doctors, insurance companies, etc., as previously described. Users 151 may include marketing personnel that determine which pieces of equipment certain users own, and which related items other similar users may own, in order to target sales at particular users. Users 151 may include medical personnel that may determine how much a sensor coupled with a shoe, i.e., a type of equipment, of a diabetic child has moved, and how this movement compares to that of the average non-diabetic child, wherein suggestions per table 185 may include giving incentives to the diabetic child to exercise more, etc., to bring the child in line with healthy children. Sports physicians, physiologists or physical therapists may utilize the data per user, or search over a large number of users, and compare a particular movement or range of motion of a user to other users, to determine in what areas a given user can improve through stretching or exercise, which range of motion areas change over time per user or per population, and, for example, what type of equipment a user may utilize to account for changes over time, even before those changes take place. Data mining motion capture data and image data related to motion provides unique advantages to users 151. Data mining may also be performed on flex parameters measured by the sensors to determine whether sporting equipment, shoes, human body parts or any other items change in flexibility over time, or differ between equipment manufacturers, or any combination thereof.
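A minimal sketch of the golf-embodiment mining described above follows. The rows stand in for records mined from table 183-style data; the field names `swing_speed`, `make`, `model` and `distance`, and the similarity tolerance, are hypothetical and purely illustrative.

```python
def best_equipment(records, user, tol=0.10):
    """Return the (make, model) with the best average shot distance among
    records from users whose swing speed is within a fractional tolerance
    of this user's swing speed; None if no similar users are found.

    records -- iterable of dicts, e.g.
               {"swing_speed": 95.0, "make": "X", "model": "Y", "distance": 240.0}
    user    -- dict with at least {"swing_speed": ...}
    """
    lo = user["swing_speed"] * (1 - tol)
    hi = user["swing_speed"] * (1 + tol)
    totals = {}
    for r in records:
        if lo <= r["swing_speed"] <= hi:
            totals.setdefault((r["make"], r["model"]), []).append(r["distance"])
    if not totals:
        return None
    # Rank candidate equipment by mean distance over the similar users.
    return max(totals, key=lambda k: sum(totals[k]) / len(totals[k]))
```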
To ensure that analysis of user 150 during a motion capture includes images that are level with respect to the horizon, i.e., not tilted, the system may include an orientation module that executes on computer 160 within mobile device 101, for example. The computer is configured to prompt the user to align the camera along a horizontal plane based on orientation data obtained from orientation hardware within mobile device 101; such orientation hardware is common on mobile devices, as one skilled in the art will appreciate. This allows the captured image to remain relatively level with respect to the horizontal plane. The orientation module may also prompt the user to move the camera toward or away from the user, or to zoom in or out, to place the user within a graphical “fit box” that somewhat normalizes the size of the user to be captured. Images may also be utilized by users to prove that they have complied with doctors' orders, for example to meet certain motion requirements.
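A sketch of such a leveling prompt is below, assuming gravity components (ax, ay, az) are available from the device's orientation hardware and that the y axis runs along the long edge of the device; axis conventions vary by platform, so the signs here are an assumption, as is the tilt limit.

```python
import math

def level_prompt(ax, ay, az, max_tilt_deg=2.0):
    """Return a prompt string telling the user how to level the camera,
    or None if the camera is already level with the horizontal plane.
    (ax, ay, az) are accelerometer readings dominated by gravity."""
    # Roll of the image plane: zero when gravity lies along the y axis.
    roll = math.degrees(math.atan2(ax, ay))
    if abs(roll) <= max_tilt_deg:
        return None
    direction = "clockwise" if roll > 0 else "counterclockwise"
    return "Rotate the camera %s by %.1f degrees" % (direction, abs(roll))
```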
Embodiments of the system are further configured to recognize the at least one motion capture element associated with user 150 or piece of equipment 110, and to associate the at least one motion capture element 111 with assigned locations on user 150 or piece of equipment 110. For example, the user can shake a particular motion capture element when prompted by the computer within mobile device 101 to acknowledge which motion capture element the computer is requesting an identity for. Alternatively, motion sensor data may be analyzed for position and/or speed and/or acceleration while a known activity is performed, and the mounting location of the motion capture element may then be classified automatically, or by prompting the user to acknowledge the assumed positions.
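The “shake to identify” acknowledgment could be detected along the lines of the following sketch, which assumes accelerometer magnitudes in g sampled at a fixed interval; the threshold, crossing count and window are illustrative values, not from the specification.

```python
def detect_shake(accel_mags, dt, threshold=3.0, min_crossings=6, window=1.0):
    """Return True if the recent acceleration-magnitude trace looks like a
    deliberate shake: several rapid excursions above `threshold` g within
    the last `window` seconds. Used so the user can single out one motion
    capture element among several when prompted for its identity."""
    n = max(1, int(window / dt))
    crossings = 0
    above = False
    for m in accel_mags[-n:]:
        if m > threshold and not above:
            crossings += 1   # count each fresh excursion above threshold
            above = True
        elif m <= threshold:
            above = False
    return crossings >= min_crossings
```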
One or more embodiments of the computer in mobile device 101 are configured to obtain at least one image of user 150 and display a three-dimensional overlay onto the at least one image of user 150, wherein the three-dimensional overlay is associated with the motion analysis data. Various displays may be shown on display 120. The display of motion analysis data may include a rating associated with the motion analysis data, and/or a display of a calculated ball flight path associated with the motion analysis data, and/or a display of a time line showing points in time, along a time axis, where peak values associated with the motion analysis data occur, and/or a suggested training regimen to aid the user in improving the mechanics of the user. These filtered or analyzed sensor data results may be stored in database 172, for example in table 183, or the raw data may be analyzed on the database (or on a server associated with the database, or in any other computer or combination thereof in the system shown in
Embodiments of the system may also present an interface to enable user 150 to purchase piece of equipment 110 over the wireless interface of mobile device 101, for example via the Internet, or via computer 105, which may be implemented as a server of a vendor. In addition, for custom fitting of equipment, such as putter shaft lengths, or any other custom sizing of any type of equipment, embodiments of the system may present an interface to enable user 150 to order a custom fitted piece of equipment over the wireless interface of mobile device 101. Embodiments of the invention also enable mobile device 101 to suggest better performing equipment to user 150, or to allow user 150 to search for better performing equipment, as determined by data mining database 172 for distances of golf shots per club for users with swing velocities within a predefined range of that of user 150. This allows real-life performance data to be mined and utilized, for example by users 151 such as OEMs, to suggest equipment to user 150 and to be charged for doing so, for example by paying for access to data mining results as displayed in any computer shown in
Embodiments of the system are configured to analyze the data obtained from the at least one motion capture element and determine how centered a collision between a ball and the piece of equipment is, based on oscillations of the at least one motion capture element coupled with the piece of equipment, and to display an impact location based on the motion analysis data. This performance data may also be stored in database 172 and used by OEMs or coaches, for example, to suggest clubs with a higher probability of a centered hit, as data mined over a large number of collisions.
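One way such a centeredness measure could be sketched is below, assuming that off-center strikes twist the club head and therefore produce larger post-impact oscillations about the shaft axis. The gyroscope field, the sample window and the full-scale constant are illustrative assumptions, not values from the specification.

```python
def impact_centeredness(gyro_shaft, impact_index, window=50, full_scale=2000.0):
    """Crude centeredness score in [0, 1] (1 = perfectly centered),
    derived from the peak angular rate about the shaft axis in the
    samples immediately following impact.

    gyro_shaft   -- angular rate (e.g. deg/s) about the shaft axis per sample
    impact_index -- sample index where impact oscillations begin
    """
    post = gyro_shaft[impact_index:impact_index + window]
    if not post:
        return None
    peak = max(abs(w) for w in post)
    # full_scale is the peak rate at which we call the hit fully off-center.
    return max(0.0, 1.0 - peak / full_scale)
```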
While
In any of the embodiments detailed herein, efficiency may be calculated in a variety of ways and displayed. For embodiments of the invention that utilize one motion capture element, the motion capture element associated with the club head may be utilized to calculate the efficiency. In one or more embodiments of the invention, efficiency may be calculated as:
Efficiency=(90−angle of club face with respect to direction of travel)*Vc/Vmax
As more sensors are added further from the piece of equipment, in this case a club, the efficiency calculation may be further refined, for example as:
Efficiency=(90−angle of club face with respect to direction of travel)*Vc/Vmax*Wa/We*1.2
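The two formulas above translate directly into code. A hedged sketch follows: the specification does not define Wa and We, so this sketch assumes, purely for illustration, that they are angular-speed readings from the added sensor (e.g., on the arms) and from the equipment sensor, respectively; the 1.2 constant is taken verbatim from the refined formula.

```python
def efficiency_one_sensor(face_angle_deg, vc, vmax):
    """Single-sensor form: Efficiency = (90 - club face angle with respect
    to the direction of travel, in degrees) * Vc / Vmax."""
    return (90.0 - face_angle_deg) * vc / vmax

def efficiency_two_sensors(face_angle_deg, vc, vmax, wa, we):
    """Refined form with an additional sensor further from the club head.
    Wa and We are assumed here to be angular speeds from the added sensor
    and the equipment sensor (an interpretation, not defined in the text)."""
    return (90.0 - face_angle_deg) * vc / vmax * (wa / we) * 1.2
```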
One or more embodiments of the system may analyze the peaks and/or the timing of the peaks in order to determine a list of exercises to provide to a user to improve the user's mechanics. For example, if the arms are rotating too late or with insufficient speed, a list can be provided to the user such as:
The list of exercises, determined for example by the peak-timing analysis sketched below, may include any exercises for any body part and may be displayed on display 120. For example, by asserting the “Training” button on the displays shown in
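A sketch of the peak-timing analysis follows, assuming the time of each body segment's peak angular velocity has already been extracted from the motion capture data. The segment names, the 0.12 s lag thresholds and the exercise names are all illustrative placeholders.

```python
def suggest_exercises(peak_times):
    """peak_times: dict mapping body segment to the time (s) of its peak
    angular velocity, e.g. {"hips": 0.10, "torso": 0.14,
                            "arms": 0.30, "club": 0.32}.
    An efficient kinematic sequence peaks roughly in that order; if a
    segment peaks too long after the one before it, suggest speed work
    for that segment."""
    suggestions = []
    if peak_times["arms"] - peak_times["torso"] > 0.12:
        suggestions.append("weighted-club arm rotation drills")
    if peak_times["torso"] - peak_times["hips"] > 0.12:
        suggestions.append("torso rotation / core speed work")
    if peak_times["club"] - peak_times["arms"] > 0.12:
        suggestions.append("wrist hinge and release drills")
    return suggestions
```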
x=laterally sideways (right is positive, left is negative)
y=down the fairway (always positive)
z=vertically upwards (up is positive, down is negative)
B=a constant dependent on the conditions of the air; an appropriate value is 0.00512
u=vector of relative velocity between the ball and the air (i.e. wind), u=v−vw
Cd=coefficient of drag which depends on the speed and spin of the ball
Cl=coefficient of lift which depends on the speed and spin of the ball
a=the angle between the vertical and the axis of rotation of the spinning ball
g=the acceleration due to gravity=32.16 ft/s^2
A numerical form of the equations may be utilized to calculate the flight path over small increments of time; assuming no wind and a spin axis of 0.1 radians (about 5.72 degrees), the accelerations are as follows:
x acceleration=−0.00512*(vx^2+vy^2+vz^2)^(1/2)*((46.0/(vx^2+vy^2+vz^2)^(1/2))*(vx)+(33.4/(vx^2+vy^2+vz^2)^(1/2))*(vy)*sin(0.1))
y acceleration=−0.00512*(vx^2+vy^2+vz^2)^(1/2)*((46.0/(vx^2+vy^2+vz^2)^(1/2))*(vy)−(33.4/(vx^2+vy^2+vz^2)^(1/2))*((vx)*sin(0.1)−(vz)*cos(0.1)))
z acceleration=−32.16−0.00512*(vx^2+vy^2+vz^2)^(1/2)*((46.0/(vx^2+vy^2+vz^2)^(1/2))*(vz)−(33.4/(vx^2+vy^2+vz^2)^(1/2))*(vy)*cos(0.1))
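These accelerations can be integrated numerically with simple Euler steps, as in the Python sketch below. The function name, the 0.01 s step and the example launch vector are illustrative; a guard is added for near-zero speed, since the 46.0/|v| and 33.4/|v| coefficient terms above are singular there.

```python
import math

def flight_path(v0, dt=0.01, spin_axis=0.1, b=0.00512, g=32.16):
    """Euler-integrate the three acceleration equations above (no wind,
    spin axis 0.1 rad). v0 = (vx, vy, vz) in ft/s; returns the (x, y, z)
    trajectory in feet until the ball returns to the ground."""
    vx, vy, vz = v0
    x = y = z = 0.0
    path = [(x, y, z)]
    sin_a, cos_a = math.sin(spin_axis), math.cos(spin_axis)
    while True:
        v = math.sqrt(vx * vx + vy * vy + vz * vz)
        if v < 1e-6:               # 46.0/v and 33.4/v are singular at v = 0
            break
        cd, cl = 46.0 / v, 33.4 / v   # speed-dependent Cd, Cl from the text
        ax = -b * v * (cd * vx + cl * vy * sin_a)
        ay = -b * v * (cd * vy - cl * (vx * sin_a - vz * cos_a))
        az = -g - b * v * (cd * vz - cl * vy * cos_a)
        vx += ax * dt
        vy += ay * dt
        vz += az * dt
        x += vx * dt
        y += vy * dt
        z += vz * dt
        path.append((x, y, z))
        if z <= 0.0:               # ball has come back down to the ground
            break
    return path

# e.g. flight_path((10.0, 220.0, 90.0)); launch velocities are illustrative
```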
In one scenario, a first user buys an instrumented piece of equipment and decides to play a virtual game as is illustrated in
While the ideas herein disclosed have been described by means of specific embodiments and applications thereof, numerous modifications and variations could be made thereto by those skilled in the art without departing from the scope of the invention set forth in the claims.
This application is a continuation of U.S. Utility patent application Ser. No. 15/044,036 filed 15 Feb. 2016, issued as U.S. Pat. No. 9,814,935, which is a continuation of U.S. Utility patent application Ser. No. 13/757,029 filed 1 Feb. 2013, issued as U.S. Pat. No. 9,261,526, which is a continuation-in-part of U.S. Utility patent application Ser. No. 13/737,956 filed 10 Jan. 2013, issued as U.S. Pat. No. 8,827,824, which is a continuation-in-part of U.S. Utility patent application Ser. No. 13/679,879 filed 16 Nov. 2012, issued as U.S. Pat. No. 8,944,928, which is a continuation-in-part of U.S. Utility patent application Ser. No. 13/298,158 filed 16 Nov. 2011, issued as U.S. Pat. No. 8,905,855, which is a continuation-in-part of U.S. Utility patent application Ser. No. 13/267,784 filed 6 Oct. 2011, issued as U.S. Pat. No. 9,604,142, which is a continuation-in-part of U.S. Utility patent application Ser. No. 13/219,525 filed 26 Aug. 2011, issued as U.S. Pat. No. 8,941,723, which is a continuation-in-part of U.S. Utility patent application Ser. No. 13/191,309 filed 26 Jul. 2011, issued as U.S. Pat. No. 9,033,810, which is a continuation-in-part of U.S. Utility patent application Ser. No. 13/048,850 filed 15 Mar. 2011, issued as U.S. Pat. No. 8,465,376, which is a continuation-in-part of U.S. Utility patent application Ser. No. 12/901,806 filed 11 Oct. 2010, issued as U.S. Pat. No. 9,320,957, which is a continuation-in-part of U.S. Utility patent application Ser. No. 12/868,882 filed 26 Aug. 2010, issued as U.S. Pat. No. 8,944,826, the specifications of which are hereby incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
1712537 | White | May 1929 | A |
3182508 | Varju | May 1965 | A |
3226704 | Petrash | Dec 1965 | A |
3270564 | Evans | Sep 1966 | A |
3776556 | McLaughlin | Dec 1973 | A |
3788647 | Evans | Jan 1974 | A |
3792863 | Evans | Feb 1974 | A |
3806131 | Evans | Apr 1974 | A |
3945646 | Hammond | Mar 1976 | A |
4759219 | Cobb et al. | Jul 1988 | A |
4898389 | Plutt | Feb 1990 | A |
4902014 | Bontomase et al. | Feb 1990 | A |
4910677 | Remedio et al. | Mar 1990 | A |
4940236 | Allen | Jul 1990 | A |
4991850 | Wilhlem | Feb 1991 | A |
5056783 | Matcovich et al. | Oct 1991 | A |
5086390 | Matthews | Feb 1992 | A |
5111410 | Nakayama et al. | May 1992 | A |
5127044 | Bonito et al. | Jun 1992 | A |
5184295 | Mann | Feb 1993 | A |
5230512 | Tattershall | Jul 1993 | A |
5233544 | Kobayashi | Aug 1993 | A |
5249967 | O'Leary et al. | Oct 1993 | A |
5259620 | Marocco | Nov 1993 | A |
5283733 | Colley | Feb 1994 | A |
5298904 | Olich | Mar 1994 | A |
5332225 | Ura | Jul 1994 | A |
5333061 | Nakashima et al. | Jul 1994 | A |
5364093 | Huston et al. | Nov 1994 | A |
5372365 | McTeigue et al. | Dec 1994 | A |
5441256 | Hackman | Aug 1995 | A |
5441269 | Henwood | Aug 1995 | A |
5443260 | Stewart et al. | Aug 1995 | A |
5486001 | Baker | Jan 1996 | A |
5524081 | Paul | Jun 1996 | A |
5542676 | Howe et al. | Aug 1996 | A |
5592401 | Kramer | Jan 1997 | A |
5610590 | Johnson et al. | Mar 1997 | A |
5638300 | Johnson | Jun 1997 | A |
5665006 | Pellegrini | Sep 1997 | A |
5688183 | Sabatino et al. | Nov 1997 | A |
5694340 | Kim | Dec 1997 | A |
5707299 | McKenna | Jan 1998 | A |
5772522 | Nesbit | Jun 1998 | A |
5779555 | Nomura et al. | Jul 1998 | A |
5792001 | Henwood | Aug 1998 | A |
5819206 | Horton | Oct 1998 | A |
5826578 | Curchod | Oct 1998 | A |
5868578 | Baum | Feb 1999 | A |
5904484 | Burns | May 1999 | A |
5941779 | Zeiner-Gundersen | Aug 1999 | A |
5973596 | French et al. | Oct 1999 | A |
5993333 | Heckaman | Nov 1999 | A |
5998968 | Pittman et al. | Dec 1999 | A |
6012995 | Martin | Jan 2000 | A |
6030109 | Lobsenz | Feb 2000 | A |
6044704 | Sacher | Apr 2000 | A |
6073086 | Marinelli | Jun 2000 | A |
6224493 | Lee et al. | May 2001 | B1 |
6248021 | Ognjanovic | Jun 2001 | B1 |
6253159 | Bett et al. | Jun 2001 | B1 |
6254492 | Taggett | Jul 2001 | B1 |
6266623 | Vock et al. | Jul 2001 | B1 |
6293802 | Ahlgren | Sep 2001 | B1 |
6366205 | Sutphen | Apr 2002 | B1 |
6441745 | Gates | Aug 2002 | B1 |
6456938 | Barnard | Sep 2002 | B1 |
6537076 | McNitt | Mar 2003 | B2 |
6540620 | Consiglio | Apr 2003 | B1 |
6567536 | McNitt | May 2003 | B2 |
6582328 | Kuta et al. | Jun 2003 | B2 |
6611141 | Schulz | Aug 2003 | B1 |
6697820 | Tarlie | Feb 2004 | B1 |
6705942 | Crook et al. | Mar 2004 | B1 |
6746336 | Brant et al. | Jun 2004 | B1 |
6757572 | Forest | Jun 2004 | B1 |
6774932 | Ewing et al. | Aug 2004 | B1 |
6802772 | Kunzle et al. | Oct 2004 | B1 |
6868338 | Elliott | Mar 2005 | B1 |
6900759 | Katayama | May 2005 | B1 |
6908404 | Gard | Jun 2005 | B1 |
6923729 | McGinty et al. | Aug 2005 | B2 |
7004848 | Konow | Feb 2006 | B2 |
7021140 | Perkins | Apr 2006 | B2 |
7034694 | Yamaguchi et al. | Apr 2006 | B2 |
7037198 | Hameen-Anttila | May 2006 | B2 |
7092846 | Vock et al. | Aug 2006 | B2 |
7118498 | Meadows et al. | Oct 2006 | B2 |
7121962 | Reeves | Oct 2006 | B2 |
7143639 | Gobush | Dec 2006 | B2 |
7160200 | Grober | Jan 2007 | B2 |
7175177 | Meifu et al. | Feb 2007 | B2 |
7205894 | Savage | Apr 2007 | B1 |
7212943 | Aoshima et al. | May 2007 | B2 |
7219033 | Kolen | May 2007 | B2 |
7234351 | Perkins | Jun 2007 | B2 |
7264554 | Bentley | Sep 2007 | B2 |
7283647 | McNitt | Oct 2007 | B2 |
7421369 | Clarkson | Sep 2008 | B2 |
7433805 | Vock et al. | Oct 2008 | B2 |
7457439 | Madsen | Nov 2008 | B1 |
7457724 | Vock et al. | Nov 2008 | B2 |
7492367 | Mahajan et al. | Feb 2009 | B2 |
7494236 | Lim | Feb 2009 | B2 |
7499828 | Barton | Mar 2009 | B2 |
7561989 | Banks | Jul 2009 | B2 |
7623987 | Vock et al. | Nov 2009 | B2 |
7627451 | Vock et al. | Dec 2009 | B2 |
7689378 | Kolen | Mar 2010 | B2 |
7713148 | Sweeney | May 2010 | B2 |
7731598 | Kim et al. | Jun 2010 | B1 |
7736242 | Stites et al. | Jun 2010 | B2 |
7771263 | Telford | Aug 2010 | B2 |
7780450 | Tarry | Aug 2010 | B2 |
7800480 | Joseph et al. | Sep 2010 | B1 |
7813887 | Vock et al. | Oct 2010 | B2 |
7831212 | Balardeta et al. | Nov 2010 | B1 |
7871333 | Davenport | Jan 2011 | B1 |
7966154 | Vock et al. | Jun 2011 | B2 |
7983876 | Vock et al. | Jul 2011 | B2 |
8036826 | MacIntosh et al. | Oct 2011 | B2 |
8117888 | Chan et al. | Feb 2012 | B2 |
8172722 | Molyneux et al. | May 2012 | B2 |
8231506 | Molyneux et al. | Jul 2012 | B2 |
8249831 | Vock et al. | Aug 2012 | B2 |
8257191 | Stites et al. | Sep 2012 | B2 |
8282487 | Wilson et al. | Oct 2012 | B2 |
8314840 | Funk | Nov 2012 | B1 |
8352211 | Vock et al. | Jan 2013 | B2 |
8400548 | Bilbrey et al. | Mar 2013 | B2 |
8425292 | Lui et al. | Apr 2013 | B2 |
8477027 | Givens | Jul 2013 | B2 |
8527228 | Panagas | Sep 2013 | B2 |
8565483 | Nakaoka | Oct 2013 | B2 |
8589114 | Papadourakis | Nov 2013 | B2 |
8696482 | Pedenko et al. | Apr 2014 | B1 |
8723986 | Merrill | May 2014 | B1 |
8725452 | Han | May 2014 | B2 |
8764576 | Takasugi | Jul 2014 | B2 |
8781610 | Han | Jul 2014 | B2 |
8831905 | Papadourakis | Sep 2014 | B2 |
8876621 | Shibuya | Nov 2014 | B2 |
8888603 | Sato et al. | Nov 2014 | B2 |
8905856 | Parke et al. | Dec 2014 | B2 |
8929709 | Lokshin | Jan 2015 | B2 |
8944932 | Sato et al. | Feb 2015 | B2 |
8944939 | Clark et al. | Feb 2015 | B2 |
8956238 | Boyd et al. | Feb 2015 | B2 |
8988341 | Lin et al. | Mar 2015 | B2 |
8989441 | Han et al. | Mar 2015 | B2 |
9032794 | Perkins et al. | May 2015 | B2 |
9060682 | Lokshin | Jun 2015 | B2 |
9146134 | Lokshin et al. | Sep 2015 | B2 |
9656122 | Papadourakis | May 2017 | B2 |
9694267 | Thornbrue et al. | Jul 2017 | B1 |
20010029207 | Cameron et al. | Oct 2001 | A1 |
20010035880 | Musatov et al. | Nov 2001 | A1 |
20010045904 | Silzer, Jr. | Nov 2001 | A1 |
20010049636 | Hudda et al. | Dec 2001 | A1 |
20020004723 | Meifu et al. | Jan 2002 | A1 |
20020019677 | Lee | Feb 2002 | A1 |
20020049507 | Hameen-Anttila | Apr 2002 | A1 |
20020052750 | Hirooka | May 2002 | A1 |
20020064764 | Fishman | May 2002 | A1 |
20020072815 | McDonough et al. | Jun 2002 | A1 |
20020077189 | Tuer et al. | Jun 2002 | A1 |
20020082775 | Meadows et al. | Jun 2002 | A1 |
20020115046 | McNitt et al. | Aug 2002 | A1 |
20020126157 | Farago et al. | Sep 2002 | A1 |
20020151994 | Sisco | Oct 2002 | A1 |
20020173364 | Boscha | Nov 2002 | A1 |
20020177490 | Yong et al. | Nov 2002 | A1 |
20020188359 | Morse | Dec 2002 | A1 |
20030008722 | Konow | Jan 2003 | A1 |
20030073518 | Marty | Apr 2003 | A1 |
20030074659 | Louzoun | Apr 2003 | A1 |
20030109322 | Funk et al. | Jun 2003 | A1 |
20030163287 | Vock et al. | Aug 2003 | A1 |
20030191547 | Morse | Oct 2003 | A1 |
20030208830 | Marmaropoulos | Nov 2003 | A1 |
20040028258 | Naimark et al. | Feb 2004 | A1 |
20040033843 | Miller | Feb 2004 | A1 |
20040044493 | Coulthard | Mar 2004 | A1 |
20040147329 | Meadows et al. | Jul 2004 | A1 |
20040227676 | Kim et al. | Nov 2004 | A1 |
20040248676 | Taylor | Dec 2004 | A1 |
20050021292 | Vock et al. | Jan 2005 | A1 |
20050023763 | Richardson | Feb 2005 | A1 |
20050032582 | Mahajan et al. | Feb 2005 | A1 |
20050054457 | Eyestone et al. | Mar 2005 | A1 |
20050156068 | Ivans | Jul 2005 | A1 |
20050203430 | Williams et al. | Sep 2005 | A1 |
20050213076 | Saegusa | Sep 2005 | A1 |
20050215340 | Stites et al. | Sep 2005 | A1 |
20050227775 | Cassady et al. | Oct 2005 | A1 |
20050261073 | Farrington, Jr. et al. | Nov 2005 | A1 |
20050268704 | Bissonnette et al. | Dec 2005 | A1 |
20050272516 | Gobush | Dec 2005 | A1 |
20050282650 | Miettinen et al. | Dec 2005 | A1 |
20050288119 | Wang et al. | Dec 2005 | A1 |
20060020177 | Seo et al. | Jan 2006 | A1 |
20060025229 | Mahajan et al. | Feb 2006 | A1 |
20060038657 | Denison et al. | Feb 2006 | A1 |
20060063600 | Grober | Mar 2006 | A1 |
20060068928 | Nagy | Mar 2006 | A1 |
20060084516 | Eyestone et al. | Apr 2006 | A1 |
20060109116 | Keays | May 2006 | A1 |
20060122002 | Konow | Jun 2006 | A1 |
20060166738 | Eyestone et al. | Jul 2006 | A1 |
20060189389 | Hunter et al. | Aug 2006 | A1 |
20060199659 | Caldwell | Sep 2006 | A1 |
20060247070 | Funk et al. | Nov 2006 | A1 |
20060250745 | Butler et al. | Nov 2006 | A1 |
20060270450 | Garratt et al. | Nov 2006 | A1 |
20060276256 | Storek | Dec 2006 | A1 |
20060284979 | Clarkson | Dec 2006 | A1 |
20060293112 | Yi | Dec 2006 | A1 |
20070052807 | Zhou et al. | Mar 2007 | A1 |
20070062284 | Machida | Mar 2007 | A1 |
20070081695 | Foxlin et al. | Apr 2007 | A1 |
20070087866 | Meadows et al. | Apr 2007 | A1 |
20070099715 | Jones et al. | May 2007 | A1 |
20070111811 | Grober | May 2007 | A1 |
20070129178 | Reeves | Jun 2007 | A1 |
20070135225 | Nieminen | Jun 2007 | A1 |
20070135237 | Reeves | Jun 2007 | A1 |
20070219744 | Kolen | Sep 2007 | A1 |
20070265105 | Barton | Nov 2007 | A1 |
20070270214 | Bentley | Nov 2007 | A1 |
20070298896 | Nusbaum | Dec 2007 | A1 |
20080027502 | Ransom | Jan 2008 | A1 |
20080085778 | Dugan | Apr 2008 | A1 |
20080090703 | Rosenberg | Apr 2008 | A1 |
20080108456 | Bonito | May 2008 | A1 |
20080164999 | Otto | Jul 2008 | A1 |
20080182685 | Marty et al. | Jul 2008 | A1 |
20080190202 | Kulach et al. | Aug 2008 | A1 |
20080234935 | Wolf et al. | Sep 2008 | A1 |
20080280642 | Coxhill et al. | Nov 2008 | A1 |
20080284979 | Yee et al. | Nov 2008 | A1 |
20080285805 | Luinge et al. | Nov 2008 | A1 |
20090002316 | Rofougaran | Jan 2009 | A1 |
20090017944 | Savarese et al. | Jan 2009 | A1 |
20090029754 | Slocum et al. | Jan 2009 | A1 |
20090033741 | Oh et al. | Feb 2009 | A1 |
20090036237 | Nipper et al. | Feb 2009 | A1 |
20090048044 | Oleson et al. | Feb 2009 | A1 |
20090055820 | Huang | Feb 2009 | A1 |
20090088276 | Solheim et al. | Apr 2009 | A1 |
20090111602 | Savarese et al. | Apr 2009 | A1 |
20090131190 | Kimber | May 2009 | A1 |
20090137333 | Lin et al. | May 2009 | A1 |
20090174676 | Westerman | Jul 2009 | A1 |
20090177097 | Ma et al. | Jul 2009 | A1 |
20090191846 | Shi | Jul 2009 | A1 |
20090209343 | Foxlin et al. | Aug 2009 | A1 |
20090209358 | Niegowski | Aug 2009 | A1 |
20090213134 | Stephanick et al. | Aug 2009 | A1 |
20090222163 | Plante | Sep 2009 | A1 |
20090233735 | Savarese et al. | Sep 2009 | A1 |
20090254276 | Faulkner et al. | Oct 2009 | A1 |
20090254971 | Herz et al. | Oct 2009 | A1 |
20090299232 | Lanfermann et al. | Dec 2009 | A1 |
20100049468 | Papadourakis | Feb 2010 | A1 |
20100062869 | Chung et al. | Mar 2010 | A1 |
20100063778 | Schrock et al. | Mar 2010 | A1 |
20100063779 | Schrock et al. | Mar 2010 | A1 |
20100091112 | Veeser et al. | Apr 2010 | A1 |
20100093458 | Davenport et al. | Apr 2010 | A1 |
20100099509 | Ahern et al. | Apr 2010 | A1 |
20100103269 | Wilson et al. | Apr 2010 | A1 |
20100113174 | Ahern | May 2010 | A1 |
20100121227 | Stirling et al. | May 2010 | A1 |
20100121228 | Stirling et al. | May 2010 | A1 |
20100130298 | Dugan et al. | May 2010 | A1 |
20100144414 | Edis et al. | Jun 2010 | A1 |
20100144456 | Ahern | Jun 2010 | A1 |
20100144457 | Kim | Jun 2010 | A1 |
20100178994 | Do et al. | Jul 2010 | A1 |
20100201512 | Stirling et al. | Aug 2010 | A1 |
20100204616 | Shears et al. | Aug 2010 | A1 |
20100216564 | Stites et al. | Aug 2010 | A1 |
20100222152 | Jaekel et al. | Sep 2010 | A1 |
20100308105 | Savarese et al. | Dec 2010 | A1 |
20100309097 | Raviv et al. | Dec 2010 | A1 |
20100323794 | Su | Dec 2010 | A1 |
20110004871 | Liu | Jan 2011 | A1 |
20110029235 | Berry | Feb 2011 | A1 |
20110037778 | Deng et al. | Feb 2011 | A1 |
20110050864 | Bond | Mar 2011 | A1 |
20110052005 | Selner | Mar 2011 | A1 |
20110053688 | Crawford | Mar 2011 | A1 |
20110075341 | Lau et al. | Mar 2011 | A1 |
20110081981 | Okamoto | Apr 2011 | A1 |
20110126184 | Lisboa | May 2011 | A1 |
20110165998 | Lau et al. | Jul 2011 | A1 |
20110195780 | Lu | Aug 2011 | A1 |
20110230273 | Niegowski et al. | Sep 2011 | A1 |
20110230274 | Lafortune et al. | Sep 2011 | A1 |
20110230985 | Niegowski et al. | Sep 2011 | A1 |
20110230986 | Lafortune | Sep 2011 | A1 |
20110238308 | Miller et al. | Sep 2011 | A1 |
20110305369 | Bentley | Dec 2011 | A1 |
20120023354 | Chino | Jan 2012 | A1 |
20120052972 | Bentley | Mar 2012 | A1 |
20120115626 | Davenport | May 2012 | A1 |
20120115682 | Homsi | May 2012 | A1 |
20120116548 | Goree et al. | May 2012 | A1 |
20120120572 | Bentley | May 2012 | A1 |
20120157241 | Nomura et al. | Jun 2012 | A1 |
20120179418 | Takasugi et al. | Jul 2012 | A1 |
20120179742 | Acharya et al. | Jul 2012 | A1 |
20120191405 | Molyneux et al. | Jul 2012 | A1 |
20120295726 | Cherbini | Nov 2012 | A1 |
20120316004 | Shibuya | Dec 2012 | A1 |
20130029791 | Rose et al. | Jan 2013 | A1 |
20130095941 | Bentley et al. | Apr 2013 | A1 |
20130110415 | Davis et al. | May 2013 | A1 |
20130128022 | Bose et al. | May 2013 | A1 |
20130173212 | Saiki et al. | Jul 2013 | A1 |
20130178304 | Chan | Jul 2013 | A1 |
20130191063 | Nomura | Jul 2013 | A1 |
20130225309 | Bentley et al. | Aug 2013 | A1 |
20130267335 | Boyd et al. | Oct 2013 | A1 |
20130271602 | Bentley et al. | Oct 2013 | A1 |
20130298668 | Sato | Nov 2013 | A1 |
20130319113 | Mizuta | Dec 2013 | A1 |
20130330054 | Lokshin | Dec 2013 | A1 |
20130332004 | Gompert et al. | Dec 2013 | A1 |
20130346013 | Lokshin | Dec 2013 | A1 |
20140019083 | Nakaoka | Jan 2014 | A1 |
20140100048 | Ota et al. | Apr 2014 | A1 |
20140100049 | Ota et al. | Apr 2014 | A1 |
20140100050 | Ota et al. | Apr 2014 | A1 |
20140135139 | Shibuya et al. | May 2014 | A1 |
20140156214 | Nomura | Jun 2014 | A1 |
20140172873 | Varoglu et al. | Jun 2014 | A1 |
20140200092 | Parke et al. | Jul 2014 | A1 |
20140200094 | Parke et al. | Jul 2014 | A1 |
20140229135 | Nomura | Aug 2014 | A1 |
20140229138 | Goree et al. | Aug 2014 | A1 |
20140257743 | Lokshin et al. | Sep 2014 | A1 |
20140257744 | Lokshin et al. | Sep 2014 | A1 |
20140295982 | Shibuya | Oct 2014 | A1 |
20140376876 | Bentley et al. | Dec 2014 | A1 |
20140378239 | Sato et al. | Dec 2014 | A1 |
20140379293 | Sato | Dec 2014 | A1 |
20140379294 | Shibuya et al. | Dec 2014 | A1 |
20140379295 | Sato et al. | Dec 2014 | A1 |
20150007658 | Ishikawa et al. | Jan 2015 | A1 |
20150012240 | Sato | Jan 2015 | A1 |
20150042481 | Nomura | Feb 2015 | A1 |
20150098688 | Lokshin | Apr 2015 | A1 |
20150124048 | King | May 2015 | A1 |
20150131845 | Forouhar et al. | May 2015 | A1 |
20150154452 | Bentley et al. | Jun 2015 | A1 |
20150256689 | Erkkila et al. | Sep 2015 | A1 |
20150348591 | Kaps et al. | Dec 2015 | A1 |
20170061817 | Mettler | Mar 2017 | A1 |
Number | Date | Country |
---|---|---|
2025369 | Feb 2009 | EP |
2002210055 | Jul 2002 | JP |
2004207985 | Jul 2004 | JP |
2011000367 | Jan 2011 | JP |
2012196241 | Oct 2012 | JP |
10-20030085275 | Nov 2003 | KR |
10-20060041060 | May 2006 | KR |
10-20070119018 | Dec 2007 | KR |
10-20100074068 | Jul 2010 | KR |
101079319 | Jun 2011 | KR |
10-20100020131 | Sep 2011 | KR |
1994027683 | Dec 1994 | WO |
2007130057 | Nov 2007 | WO |
2009056688 | May 2009 | WO |
2011057194 | May 2011 | WO |
2014085744 | Jun 2014 | WO |
Entry |
---|
Supplementary Extended European Search Report received in 15782595.1 dated Nov. 27, 2017, 5 pages. |
International Search Report received in PCT/US2016/042668, dated Oct. 4, 2016, 21 pages. |
International Search Report received in PCT/US2016/042671, dated Oct. 13, 2016, 17 pages. |
International Search Report and Written Opinion received in PCT/US2016/042676, dated Oct. 24, 2016 (12 pages). |
International Preliminary Report on Patentability received in PCT/US2015/026917, dated Nov. 3, 2016 (5 pages). |
International Search Report received for PCT Application No. PCT/US2012/065716, dated Jan. 3, 2013, 10 pages. |
MyCaddie, 2009, retrieved on Sep. 26, 2012 from http://www.iMakePars.com, 4 pages. |
Swing it See it Fix it, Improve Golf Swing, SwingSmart Golf Analyzer, retrieved on Sep. 26, 2012 from http://www.SwingSmart.com, 2 pages. |
Learn how Swingbyte can improve your game, retrieved on Sep. 26, 2012 from http://www.swingbyte.com, 2 pages. |
International Search Report received for PCT Application No. PCT/US2011/055173, dated Mar. 6, 2012, 8 pages. |
International Search Report received for PCT Application No. PCT/US2011/049461, dated Feb. 23, 2012, 14 pages, 2012. |
PCT Search Report, PCT/US2012/029310, dated Sep. 28, 2012, 3 pages. |
IPRP, PCT/US2011/049461, dated Mar. 7, 2013, 6 pages. |
IPRP, PCT/US2011/058182, dated Apr. 30, 2013, 5 pages. |
IPER, PCT/US2011/055173, dated Apr. 25, 2013, 5 pages, (2013). |
IPRP, PCT/US2012/065716, dated May 20, 2014, 6 pages. |
International Search Report for PCT Application No. PCT/US2013/021999, dated Apr. 30, 2013, 8 pages. |
International Search Report for PCT Application No. PCT/US2012/066915, dated Mar. 29, 2013, 10 pages. |
International Search Report for PCT Application No. PCT/US2015/26896, dated Jul. 28, 2015, 15 pages. |
International Search Report for PCT Application No. PCTUS2015/26917, dated Jul. 30, 2015, 16 pages. |
The Nike+FuelBand User's Guide, rev 14, 26 pages, 2012. |
UP by Jawbone Extended User Guide, 10 pages, 2012. |
Armour39, Under Armour Guarantee, Getting Started, retrieved from the Internet on Jul. 12, 2013, 7 pages. |
Armour39 Module & Chest Strap, retrieved from the Internet on Jul. 12, 2013, 6 pages. |
MiCoach Pacer User Manual, 31 pages, (2009). |
Foreman et al. “A Comparative Analysis for the Measurement of Head Accelerations in Ice Hockey Helmets using Non-Accelerometer Based Systems,” Nov. 19, 2012, 13 pages. |
Reebok-CCM and MC10 to Launch Revolutionary Sports Impact Indicator, MC10 News (http://www.mc10inc.com/news/), Oct. 24, 2012, 3 pages. |
CheckLight MC10 Overview, Reebok International Limited, Nov. 20, 2012, 7 pages. |
Reebok and MC10 Team Up to Build CheckLight, a Head Impact Indicator (Hands-on), MC10 News (http://www.mc10inc.com/news/), Jan. 11, 2013, 1 pg. |
TRACE—The Most Advanced Activity Monitor for Action Sports, webpage, retrieved on Aug. 6, 2013, 22 pages. |
CheckLight, Sports/Activity Impact Indicator, User Manual, 13 pages, 2013, Reebok International Limited. |
King, The Design and Application of Wireless Mems Inertial Measurement Units for the Measurement and Analysis of Golf Swings, 2008. |
Grober, An Accelerometer Based Instrumentation of the Golf Club: Comparative Analysis of Golf Swings, 2009. |
Gehrig et al, Visual Golf Club Tracking for Enhanced Swing Analysis, Computer Vision Lab, Lausanne, Switzerland, 2003. |
Pocketpro Golf Designs, PocketPro Full Swing Analysis in Your Pocket, www.PocketPro.org, (2011). |
Clemson University, Golf Shot Tutorial, http://www.webnucleo.org/home/online_tools/newton/0.4/html/about_this_tool/tutorials/golf_1.shp.cgi, retrieved on Nov. 10, 2011. |
MiCoach Speed_Cell TM, User Manual, 23 pages, (2011). |
Nike+iPod, User Guide, 32 pages (2010). |
SureShotGPS SS9000X, Intelligent Touch, Instruction Manual, 25 page, 2011. |
ActiveReplay, “TRACE—The Most Advanced Activity Monitor for Action Sports”, http://www.kickstarter.com/projects/activereplay/trace-the-most-advanced-activity-monitor-for-actio, 13 pages, Oct. 1, 2013. |
ZEPP Golfsense@Launch2011, https://www.youtube.com/watch?v=VnOcu8szjIk (video), Mar. 14, 2011. |
Epson US Newsroom, “Epson America Enters Sports Wearables Market with Introduction of M-Tracer MT500GII Golf Swing Analyzer”, www.news.epson.com, Jan. 5, 2015, 4 pages. |
International Search Report and Written Opinion dated Dec. 22, 2015 received in PCTUS1561695, 7 pages. |
Search Report Received in PCT2013021999 dated Jan. 21, 2016. |
Patent Examination Report received in Australia Application No. 2011313952, dated Mar. 15, 2016, 5 pages. |
“About Banjo” webpages retrieved from internet, dated 2015. |
International Search Report and Written Opinion mailed in PCTUS1642674 dated Aug. 12, 2016, 9 pages. |
International Preliminary Report on Patentability in PCTUS2015061695, dated Jun. 1, 2017, 5 pages. |
European Search Report received in PCTUS2015026896 dated May 11, 2017, 13 pages. |
International Search Report and Written Opinion received in PCT/US2017/52114, dated Oct. 3, 9 pages. |
International Search Report and Written Opinion Received in PCT/US2017/37987, dated Nov. 9, 2017, 12 pages. |
Supplementary Extended European Search Report received in 11820763.8 dated Nov. 13, 2017, 16 pages. |
Supplementary Extended European Search Report received in 11833159.4 dated Nov. 6, 2017, 14 pages. |
Supplementary Partial European Search Report received from EP Application Serial No. 11820763.8, dated Aug. 8, 2017, 15 pages. |
Supplementary Partial European Search Report received from EP Application Serial No. 11833159.4, dated Aug. 8, 2017, 15 pages. |
David E. Culler, Et al., “Smart Sensors to Network the World”, published in Scientific American Magazine, No. Jun. 2004, dated Jun. 1, 2004, pp. 85-91. |
International Search Report and Written Opinion received in PCT/US2017/039209, dated Aug. 24, 2017, 7 pages. |
Zepp Labs, Inc. v. Blast Motion, Inc. Petition for Inter Partes Review of U.S. Pat. No. 8,903,521 filed on Feb. 24, 2016, as IPR2016-00672, and accompanying Declaration of Dr. Steven M. Nesbit. |
Zepp Labs, Inc. v. Blast Motion, Inc. Petition for Inter Partes Review of U.S. Pat. No. 9,039,527 filed on Feb. 24, 2016, as IPR2016-00674, and accompanying Declaration of Dr. Steven M. Nesbit. |
Zepp Labs, Inc. v. Blast Motion, Inc. Petition for Inter Partes Review of U.S. Pat. No. 8,941,723 filed on Feb. 24, 2016, as IPR2016-00675, and accompanying Declaration of Dr. Steven M. Nesbit. |
Zepp Labs, Inc. v. Blast Motion, Inc. Petition for Inter Partes Review of U.S. Pat. No. 8,905,855 filed on Feb. 24, 2016, as IPR2016-00676, and accompanying Declaration of Dr. Steven M. Nesbit. |
Zepp Labs, Inc. v. Blast Motion, Inc. Petition for Inter Partes Review of U.S. Pat. No. 8,944,928 filed on Feb. 24, 2016, as IPR2016-00677, and accompanying Declaration of Dr. Steven M. Nesbit. |
Chris Otto, et al, “System Architecture of a Wireless Body Area Sensor Network for Ubiquitous Health Monitoring”, Journal of Mobile Multimedia, vol. 1, No. 4, Jan. 10, 2006, University of Alabama in Huntsville, 20 Pages. |
Linx Technologies “High Performance RF Module: Hp3 Series Transmitter Module Data Guide Description”, Jul. 27, 2011, 13 pages. |
Roger Allan, “Wireless Sensor Architectures Uses Bluetooth Standard”, www.electronicdesign.com/communications/wireless-sensor-architecture-uses-bluetooth-standard, Aug. 7, 2000, 5 pages. |
Don Tuite, “Motion-Sensing MEMS Gyros and Accelerometers are Everywhere”, www.electronicdesign.com/print/analog/motion-sensing-mems-gyros-and-accelerometers-are-everywhere, Jul. 9, 2009, 6 pages. |
InvenSense News Release, “InvenSense Unveils World's 1st IMU Solution for Consumer Applications”, ir.invensense.com, 2016, 2 Pages. |
Dean Takahashi, “Facebook, Twitter, Last.fm coming to Xbox Live this Fall”, Jun. 1, 2009, Webpage printout, 5 pages. |
The iClub System, Products pages, www.iclub.net, 2001-2005, 5 pages. |
Websters New College Dictionary, Definition of “Virtual Reality”, Third Edition, 2005, 3 Pages. |
SmartSwing, “SmartSwing Introduces Affordable Intelligent Golf Club”, www.smartswinggolf.com, Jan. 2006, 2 pages. |
Henrick Arfwedson, et al., “Ericsson's Bluetooth modules”, Ericsson Review No. 4, 1999, 8 pages. |
ZigBees, “Zigbee information”, www.zigbees.com , 2015, 4 pages. |
SolidState Technology, “MEMS enable smart golf clubs”, www.electroiq.com , 2005, 3 pages. |
IGN, “Japanese WII Price Release Date Revealed”, www.ign.com, 2006, 1 page. |
First Annual Better Golf Through Technology Conference 2006 webpage, www.bettergolfthroughtechnology.com , Massachusetts Institute of Technology, Cambridge Massachusetts, Feb. 2006, 1 page. |
Concept2Rowing, “Training” web content, www.concept2.com , 2009, 1 page. |
Expresso, Products pages, www.expresso.com/products , 2009, 2 pages. |
Manish Kalia, et al., “Efficient Policies for Increasing Capacity in Bluetooth: An Indoor Pico-Cellular Wireless System”, IBM India Research Laboratory, Indian Institute of Technology, 2000, 5 pages. |
R. Rao, et al., “Demand-Based Bluetooth Scheduling”, Pennsylvania State University, 2001, 13 pages. |
Number | Date | Country | |
---|---|---|---|
20180071582 A1 | Mar 2018 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 15044036 | Feb 2016 | US |
Child | 15812926 | US | |
Parent | 13757029 | Feb 2013 | US |
Child | 15044036 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 13737956 | Jan 2013 | US |
Child | 13757029 | US | |
Parent | 13679879 | Nov 2012 | US |
Child | 13737956 | US | |
Parent | 13298158 | Nov 2011 | US |
Child | 13679879 | US | |
Parent | 13267784 | Oct 2011 | US |
Child | 13298158 | US | |
Parent | 13219525 | Aug 2011 | US |
Child | 13267784 | US | |
Parent | 13191309 | Jul 2011 | US |
Child | 13219525 | US | |
Parent | 13048850 | Mar 2011 | US |
Child | 13191309 | US | |
Parent | 12901806 | Oct 2010 | US |
Child | 13048850 | US | |
Parent | 12868882 | Aug 2010 | US |
Child | 12901806 | US |