One or more embodiments setting forth the ideas described throughout this disclosure pertain to the field of motion capture sensors and the display of motion data in a virtual reality environment. More particularly, but not by way of limitation, one or more aspects of the invention enable a system that mirrors the motion of a real object with the motion of a virtual object in a virtual environment, and that applies constraints defined in the virtual environment to the motion of the virtual object.
Motion capture sensors and systems for analyzing motion data are known in the art, but these systems typically provide delayed analysis or playback after events occur. For example, there are systems that analyze the swing of a golf club using motion sensors attached to the club; these systems wait until a swing signature is detected, and then analyze the sensor data to reconstruct and diagnose the swing. Existing motion analysis systems do not provide real-time mirroring of the motion of an object such as a golf club with a display of a virtual club in a virtual environment. Such a mirroring system may provide valuable feedback to a user, who can observe the motion of the object from various angles while it is occurring. Observing the motion in real time may also be valuable for coaching and teaching. Real-time mirroring of the motion of an object may also allow the object to be used to control a virtual reality game; for example, a user may swing a real golf club to play a virtual round of golf on a virtual golf course. While there are virtual reality systems that provide gaming experiences, these systems typically require specialized game controllers. There are no known systems that attach motion sensors to real sporting equipment so that the equipment can serve as a game controller for a virtual game of the associated sport.
Real-time, continuous mirroring of the motion of an object in a virtual environment presents additional challenges, since sensor data inaccuracies can accumulate over time. These challenges are less acute for systems that perform after-the-fact analysis of events, but they are critical for long-term motion mirroring. There are no known systems that address these accumulating errors by using combinations of redundant sensor data and constraints on the motion of virtual objects in a virtual environment.
For at least the limitations described above, there is a need for a motion mirroring system that incorporates virtual environment constraints.
Embodiments of the invention enable a motion mirroring system that generates and animates a virtual object in response to motion of a physical object equipped with motion sensors. The virtual object motion may take into account constraints defined for the virtual environment, such as for example regions the virtual object should remain in or remain near. The mirroring of physical object motion on a virtual environment display may be used for example for coaching or training, for playing virtual reality games, or for continuous feedback to a user.
One or more embodiments of the system may include a motion capture element (or several such elements) that may be coupled to a moveable, physical object. The motion capture element may include one or more sensors, a microprocessor to collect and transmit sensor data, and a communication interface for transmission of the data. The communication interface may be wireless, wired, or a combination thereof. Sensors may for example capture data related to any or all of the moveable object's position, orientation, linear velocity, linear acceleration, angular velocity, or angular acceleration. Sensors may capture additional or alternate data such as for example pressure, temperature, stress, strain, or shock.
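For illustration only, a sample from such a motion capture element might be represented as a simple record combining the sensor readings described above. The field names, types, and units in this sketch are assumptions of the illustration, not definitions from this disclosure.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MotionSample:
    """One reading from a motion capture element (illustrative only)."""
    timestamp: float                   # seconds, from the microprocessor clock
    accel: Tuple[float, float, float]  # 3-axis accelerometer, m/s^2, sensor frame
    gyro: Tuple[float, float, float]   # 3-axis rate gyroscope, rad/s, sensor frame
    mag: Tuple[float, float, float]    # 3-axis magnetometer reading, sensor frame
```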
The system may also include a computer that receives sensor data over another communication interface. The computer may be any device or combination of devices that can receive and process data, including for example, without limitation, a desktop computer, a laptop computer, a notebook computer, a tablet computer, a server, a mobile phone, a smart phone, a smart watch, smart glasses, a virtual reality headset, a microprocessor, or a network of any of these devices. In one or more embodiments the computer and the microprocessor of the motion capture element may coincide. The computer may access a memory (which may be local or remote, or a combination thereof) that contains a virtual environment state. The virtual environment state may define a virtual environment that includes a virtual object that represents the physical moveable object. The memory may also include one or more constraints on the position, orientation, or other characteristics of the virtual object in the virtual environment. For example, without limitation, constraints may specify regions of the virtual environment space that the virtual object must remain in or near, or regions that it may not be in.
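As a sketch of how the memory contents described above might be organized, the virtual environment state could pair each virtual object's pose with a list of constraints. Modeling each constraint as a function that maps a proposed pose to a corrected pose is an assumption of this sketch, not a requirement of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List
import numpy as np

@dataclass
class Pose:
    position: np.ndarray      # 3-vector in virtual-environment coordinates
    orientation: np.ndarray   # 3x3 rotation matrix

@dataclass
class VirtualEnvironmentState:
    objects: Dict[str, Pose] = field(default_factory=dict)           # e.g. {"club": ...}
    constraints: List[Callable[[Pose], Pose]] = field(default_factory=list)

    def enforce(self, pose: Pose) -> Pose:
        # Apply each constraint in turn to the proposed pose.
        for constraint in self.constraints:
            pose = constraint(pose)
        return pose
```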
The computer may receive sensor data from the motion capture element, and may then calculate from the data the position and orientation of the moveable object in the real environment. Because these calculations may result in errors, including potentially accumulating errors over time, the computer may apply one or more corrections to the position and orientation, for example using redundancies in the sensor data. The computer may then transform the position and orientation of the moveable object into a position and orientation of the virtual object in the virtual environment. Rules and algorithms for this transformation may depend on the nature and purpose of the virtual environment; for example, in a virtual game, the transformations may place the virtual object in an appropriate location based on the state of the game. The computer may check whether the transformed position and orientation of the virtual object satisfy the constraints associated with the virtual environment. If they do not, the computer may apply corrections or additional transformations to enforce the constraints. The computer may then generate one or more images of the virtual environment and the virtual object, and transmit these images to a display for viewing. The calculations, corrections, transformation, and image generation may occur in real time or almost real time, so that motions of the physical object result in immediate or almost immediate corresponding motions of the virtual object on the display. For example, delays between motion of the physical object and corresponding mirrored motion of the virtual object on a display may be on the order of less than a half a second, or in some cases on the order of tens of milliseconds or less.
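A minimal sketch of one update cycle of this pipeline appears below. It assumes first-order inertial integration, a fixed offset-and-scale mapping into the virtual environment, and a single ground-plane constraint; all of these specifics, and the dictionary-based inputs, are assumptions of the sketch rather than the disclosed implementation, and the rendering step is left as a placeholder.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # world frame, m/s^2 (assumed convention)

def mirror_step(pos, vel, R, sample, dt, env):
    """One cycle: calculate, correct, transform, constrain, render."""
    # 1. Calculate: integrate angular velocity into the orientation matrix.
    wx, wy, wz = sample["gyro"]
    Omega = np.array([[0.0, -wz, wy], [wz, 0.0, -wx], [-wy, wx, 0.0]])
    R = R @ (np.eye(3) + Omega * dt)          # first-order gyro integration
    # Rotate the body-frame specific force to the world frame and add gravity.
    a_world = R @ np.asarray(sample["accel"]) + GRAVITY
    vel = vel + a_world * dt
    pos = pos + vel * dt
    # 2. Correct: redundant-sensor corrections would be applied here (see below).
    # 3. Transform: map the real pose into the virtual environment.
    v_pos = env["origin"] + env["scale"] * pos
    # 4. Constrain: e.g., keep the virtual object above the virtual ground.
    v_pos[2] = max(v_pos[2], env.get("ground_z", 0.0))
    # 5. Render: image generation and transmission to the display would go here.
    return pos, vel, R, v_pos
```

A loop of this form is computationally cheap enough to run at sensor rate, which is consistent with the near-real-time latency figures mentioned above.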
A moveable object tracked by a motion capture element may for example be a piece of equipment, an article of clothing, or a body part of a person. In one or more embodiments a piece of equipment may be a piece of sporting equipment used in a sports activity, such as for example, without limitation, equipment used in golf, tennis, badminton, racquetball, table tennis, squash, baseball, softball, cricket, hockey, field hockey, croquet, football, rugby, Australian rules football, soccer, volleyball, water polo, polo, basketball, lacrosse, billiards, horseshoes, shuffleboard, handball, bocce, bowling, dodgeball, kick ball, track and field events, curling, martial arts, boxing, archery, pistol shooting, rifle shooting, ice skating, gymnastics, surfing, skateboarding, snowboarding, skiing, windsurfing, roller blading, bicycling, or racing. The virtual environment in one or more embodiments may be a virtual game for an associated sports activity, with the virtual object representing a piece of equipment for that virtual game. For example, in a golf application, the moveable object may be a golf club, and the virtual environment may represent a virtual golf course in which the user plays a virtual round of golf; as the user moves the physical golf club, corresponding mirrored motions may be displayed by the system for the virtual golf club in the virtual golf course.
In one or more embodiments, sensors in a motion capture element may include one or more of accelerometers, rate gyroscopes, or magnetometers. These sensors may have any number of axes; for example, without limitation, 3-axis sensors may be used for applications that track motion in all directions. These sensors are illustrative; one or more embodiments may use any type or types of sensors to track any aspect of an object's position, orientation, motion, or other characteristics.
In one or more embodiments, the sensor data may include redundant information that may be used to improve or correct calculations of the moveable object's position or orientation. For example, one or more embodiments may obtain redundant orientation information using three techniques: integration of angular velocity data from a gyroscope, measurement of the object's orientation relative to the Earth's magnetic field from a magnetometer, and measurement of the object's orientation relative to the Earth's gravitational field from an accelerometer (during periods of time when the object is substantially stationary, for example). One or more embodiments may for example use the first calculation (integration of angular velocity) to obtain an initial estimate of an object's orientation, and may then apply corrections based on differences between predicted gravity and magnetic field vectors and values measured from an accelerometer and a magnetometer, respectively.
For example, without limitation, one or more embodiments may apply corrections by calculating a magnetic rotational error between the predicted magnetic field vector based on angular velocity integration and the measured magnetic field vector, and by calculating a gravitational rotational error between the predicted gravitational field vector based on angular velocity integration and the measured gravitational field vector. One or more embodiments may then apply a fraction of either or both of the magnetic rotational error and the gravitational rotational error to the calculated orientation, to form a corrected orientation.
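Written symbolically, the correction described in this paragraph might take the following form; the notation, and the frame conventions it implies, are chosen here for illustration, since the disclosure does not specify any.

```latex
% Notation chosen for this sketch (the disclosure fixes none of it):
% R(u,\theta) is the rotation by angle \theta about unit axis u, and the
% error rotations take the measured vectors into the predicted ones.
R(u_m,\theta_m)\,\hat{m}_{\mathrm{meas}} = \hat{m}_{\mathrm{pred}}, \qquad
R(u_g,\theta_g)\,\hat{g}_{\mathrm{meas}} = \hat{g}_{\mathrm{pred}}
% A fraction of each error angle, with its axis preserved, is applied to the
% calculated orientation Q (shown for sensor-frame vectors and a
% sensor-to-world Q; the order of the two factors is a further assumption):
Q_{\mathrm{corrected}} = Q\,R(u_g,\lambda_g\theta_g)\,R(u_m,\lambda_m\theta_m),
\qquad 0 \le \lambda_m, \lambda_g \le 1
```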
In one or more embodiments an image of the virtual object in the virtual environment may be formed by a virtual camera that can be positioned and oriented in the virtual environment. Users or the system may be able to modify or configure the position and orientation of this virtual camera, for example to show the motion of the object from different perspectives.
In one or more embodiments, constraints on the position, orientation, motion, or other characteristics of the virtual object in the virtual environment may for example describe or define regions of the virtual environment that the virtual object must remain in or remain near. Constraints may describe regions of the virtual environment that the virtual object must not be in or must not be near. Constraints may describe or define virtual barriers that the virtual object may not pass through. For example, without limitation, the virtual environment may define a ground surface that a virtual object must remain above and must not pass through. Constraints may for example describe or define maximum or minimum values for any function of an object's motion, position, or orientation; for example, constraints may limit the virtual object's maximum or minimum speed, acceleration, angular velocity, or angular acceleration. Constraints may describe or define limits on allowable orientations of a virtual object, for example by requiring that the virtual object must be facing in a particular direction or must not face in certain directions.
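Two of the constraint forms named above, a ground surface the object must stay above and a cap on speed, might look as follows in code; the specific limits (ground at z = 0, 50 m/s) are arbitrary illustrative values.

```python
import numpy as np

def constrain_above_ground(position, ground_z=0.0):
    """Region constraint: project the virtual object back above the ground."""
    position = np.asarray(position, dtype=float).copy()
    position[2] = max(position[2], ground_z)
    return position

def constrain_max_speed(velocity, max_speed=50.0):
    """Motion constraint: clamp the virtual object's speed to a maximum."""
    velocity = np.asarray(velocity, dtype=float)
    speed = np.linalg.norm(velocity)
    if speed > max_speed:
        velocity = velocity * (max_speed / speed)
    return velocity
```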
In one or more embodiments the motion mirroring system may be used for example to play a virtual game, where a user moves a physical object to control movement of a corresponding virtual piece of equipment in the virtual game. For example, the system may mirror motion of a physical golf club to move a virtual golf club that is used to play a game that simulates a round of golf on a virtual golf course. One or more embodiments that execute games may include one or more virtual game pieces in the virtual environment, such as for example a golf ball in a virtual golf game. In some games a virtual object (such as for example a virtual golf club) may be used to strike a virtual game piece (such as for example a virtual golf ball). The system may calculate the initial velocity of a virtual game piece (as well as other characteristics such as spin) by simulating the impact of the virtual object with the virtual game piece. For example, the initial velocity of a virtual golf ball may be calculated based on the velocity of a virtual golf club when it impacts the virtual golf ball. The velocity of the virtual object at impact may in turn be calculated based on the motion of the physical object corresponding to the virtual object.
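As a toy model of the impact simulation described here (an assumption of this sketch; the disclosure does not specify an impact model), the virtual ball could be launched along the clubface normal at the club head's normal-direction speed scaled by an efficiency factor.

```python
import numpy as np

def ball_launch_velocity(club_head_velocity, face_normal, smash_factor=1.45):
    """Return an initial velocity for the virtual ball at impact (toy model).
    smash_factor is an assumed efficiency ratio of ball speed to club speed."""
    n = np.asarray(face_normal, dtype=float)
    n = n / np.linalg.norm(n)
    normal_speed = max(float(np.dot(club_head_velocity, n)), 0.0)
    return smash_factor * normal_speed * n

# Example: a 40 m/s club head moving nearly squarely into the face normal.
v_ball = ball_launch_velocity(np.array([40.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.1]))
```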
In one or more embodiments the system may relocate the virtual object to a new location in response to events that occur in a virtual game or a virtual simulation. This relocation may occur automatically or in response to user input. For example, in an embodiment that mirrors the motion of a golf club to play a virtual golf game, the system may move the virtual golf club to the new location of the virtual golf ball after a shot, or to the start of a new hole once a previous hole is complete. In one or more embodiments the system may move the virtual object gradually from one position to a new position to avoid discontinuous jumps in the displayed position of the virtual object. In one or more embodiments the system may update the orientation of a virtual object to a new orientation in response to events that occur in a virtual game or a virtual simulation. For example, in a virtual golf game the system may automatically update the aim direction of a club to aim at the hole when the position of the virtual club changes to a new location, such as the spot where the ball lands or the start of a new hole. As with changes in position, the system may change the orientation of the virtual object gradually to avoid discontinuous jumps in the displayed orientation of the virtual object.
The above and other aspects, features and advantages of the ideas conveyed through this disclosure will be more apparent from the following more particular description thereof, presented in conjunction with the accompanying drawings.
A motion mirroring system that incorporates virtual environment constraints will now be described. In the following exemplary description numerous specific details are set forth in order to provide a more thorough understanding of the ideas described throughout this specification. It will be apparent, however, to an artisan of ordinary skill that embodiments of the ideas described herein may be practiced without incorporating all aspects of the specific details described herein. In other instances, specific aspects well known to those of ordinary skill in the art have not been described in detail so as not to obscure the disclosure. Readers should note that although examples of the innovative concepts are set forth throughout this disclosure, the claims and the full scope of any equivalents are what define the invention.
Motion capture element 110 measures one or more aspects of the position or orientation (or both) of object 102, or of changes thereto such as linear velocity, linear acceleration, angular velocity, or angular acceleration. Motion capture elements may use any sensing or measuring technologies to measure any physical quantity or quantities; these technologies may include but are not limited to inertial sensing technologies such as accelerometers and gyroscopes. Motion capture elements may include additional devices that may be physically separate from the motion capture element itself, such as transmitters or receivers (such as for example a GPS satellite) or cameras that observe the motion capture element. Any device or combination of devices that provides data that can be used to determine the motion of an object such as golf club 102 is in keeping with the spirit of the invention.
Motion capture element 110 collects motion data reflecting the motion 103 of the golf club, and transmits this data using communication interface 111 to a receiving computer 120 for analysis and display. In this illustrative example, communication interface 111 is a wireless transmitter. For example, transmitter 111 may send data over wireless channel 112 which may be for example, without limitation, a Bluetooth or Bluetooth Low Energy channel, an 802.11 channel, a cellular network, or any other wireless channel or network. One or more embodiments may use wired connections between motion capture element 110 and computer 120 instead of or in addition to wireless connections such as 112. One or more embodiments may use any desired media, networks, and protocols to transmit data.
Computer 120 receives sensor data from motion capture element 110 using a corresponding communication interface integrated into or accessible to the computer. Computer 120 may be any device or combination of devices that can receive and process the sensor data. For example, without limitation, computer 120 may include a desktop computer, a laptop computer, a notebook computer, a tablet computer, a server, a mobile phone, a smart phone, a smart watch, smart glasses, a virtual reality headset, a microprocessor, or a network of any of these devices. In one or more embodiments the computer may be part of or collocated with the motion capture element 110. The computer processes the sensor data received over channel 112 to analyze the motion of the object 102. The computer 120 accesses one or more memory or storage devices that contain a description 122 of a virtual environment. This memory or storage may be local, remote, or a combination thereof. The virtual environment 122 may for example include a description of terrain or surroundings such as a golf course, including a virtual golf hole 123. The virtual environment may also include a description of one or more virtual objects in the environment, such as virtual golf club 124 and virtual golf ball 125. In particular, one of these virtual objects may correspond to the real object 102 that is being moved; in this embodiment, virtual golf club 124 corresponds to the real golf club 102.
In mirroring the motion such as 103 of a real object 102 in a virtual environment, one or more embodiments may apply one or more constraints on the motion. For example, an embodiment may constrain the virtual golf club 124 to remain above the virtual ground surface of the golf course.
Motion capture element 110 may have an optional storage device or devices 206. For example, in one or more embodiments sensor data may be buffered or recorded before transmission to a computer. Use of storage for buffering or recording is optional; one or more embodiments may transmit sensor data directly as soon as it is received, without buffering or recording the data. In one or more embodiments the microprocessor 205 may process data from sensors such as 201, 202, 203, and 204 prior to transmitting (or storing) the data. For example, without limitation, data may be compressed, filtered, integrated, rescaled, resampled, or transformed in any desired manner by the microprocessor 205. Sensor data 210 (or a transformed version of this data) is transmitted via communications interface 111 to computer 120.
Computer 120 may include a processor (or multiple processors) 215. The processor may access sensor data 210 received on communications interface 211. It may also access memory device or devices 216, which may contain for example the description and state of a virtual environment 122, including a virtual object 124 that may represent a real object being moved. The memory device may also contain constraints 126 on the position, orientation, or motion of the virtual object, or more generally any constraints on the state of the virtual environment. The memory device or devices may be local to computer 120, remote to the computer and accessed for example via a network connection, or any combination thereof. The processor 215 may update the state of the virtual environment 122 using sensor data 210, apply constraints 126, and generate images that are transmitted to display 121. Display 121 may be local to computer 120 or remote. One or more embodiments may include multiple displays, potentially showing for example different views of virtual environment 122. In one or more embodiments display 121 may be a stereographic display or any type of 3D display technology. In one or more embodiments the display or displays 121 may be integrated into a headset or into glasses or goggles.
In one or more embodiments, an initial calculation 301 of object position and orientation 312 may be subject to various errors. For example, as is known in the art, in inertial navigation using accelerometer and gyro data, calculated position and orientation may drift over time from their true values. One or more embodiments may therefore apply a step 302 to correct position and orientation using one or more redundancies in the sensor data 210. The specific corrections depend on the type of sensor data 210 and on the specific redundancies in the data. For example, continuing the inertial navigation example, an accelerometer and a gyroscope contain redundant information about the orientation of an object, since the gyroscope angular velocity can be integrated to form orientation, and the accelerometer can also provide a tilt reading when the object is stationary. One embodiment of such a correction process is described in the paragraphs below.
After corrections 302, step 303 may transform the position and orientation of the real object into the virtual environment. This transformation may depend for example on the state of the virtual environment. For example, in the golf example above, the transformation may place virtual club 124 at the current location of play, such as at or near the tee of the first hole.
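One plausible form for this transformation (a sketch under assumptions, not the disclosed rule) is a rigid mapping that anchors a real-world reference point, such as where the user addresses the ball, to a virtual anchor such as the tee.

```python
import numpy as np

def to_virtual(real_pos, real_R, real_anchor, virtual_anchor, R_align=None):
    """Map a real pose into the virtual environment by anchoring a real
    reference point to a virtual one. R_align is an optional fixed rotation
    between the real and virtual frames (identity by default)."""
    if R_align is None:
        R_align = np.eye(3)
    offset = np.asarray(real_pos, dtype=float) - np.asarray(real_anchor, dtype=float)
    v_pos = np.asarray(virtual_anchor, dtype=float) + R_align @ offset
    v_R = R_align @ real_R
    return v_pos, v_R
```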
After transformation 303 to the virtual environment, one or more embodiments may perform step 304 to apply one or more constraints to the position, orientation, motion, or other characteristics of the virtual object. For example, constraint 332 may require that the virtual club 124 be located at or near the tee of the first hole (or more generally located at or near the ball's current position). Because of accumulated errors such as inertial drift (which may not be fully corrected by step 302), the position of the virtual club may need to be adjusted with a shift 334 to place it near the ball position 333, for example. Other constraints, such as those described above, may be applied in a similar manner.
(The orientation 411 may for example be calculated by integrating the differential equation dQ/dt = Qω×, where Q is the orientation matrix and ω× is the cross product matrix formed from the angular velocity ω.) Given the calculated orientation 411, predicted values can be estimated for the magnetic field vector 412 and the gravity vector 413. This prediction presumes that the orientation of the magnetic vector 412 in the fixed reference frame 311 is the same as its initial orientation 402, and that therefore the change in the magnetic vector measured in the sensor reference frame 411 is due only to the change in orientation. Similarly, the prediction presumes that the orientation of the gravity vector 413 in the fixed reference frame 311 is the same as its initial orientation 403, and that therefore the change in the gravity vector measured in the sensor reference frame 411 is due only to the change in orientation.
Calculations 420 then use redundant sensor data to compare the predicted magnetic vector 412 and predicted gravity vector 413 to measured quantities. The measured magnetic vector 422 may for example be obtained from magnetometers 203. The measured gravity vector 423 may for example be obtained from accelerometers 201, provided that the system is stationary or substantially stationary (or not accelerating) when the measurement is made. In one or more embodiments the system may for example determine whether the system is sufficiently stationary to use the gravity vector measurement by testing the magnitude of the angular velocity; a low angular velocity may suggest that the system may be stationary. In one or more embodiments the system may for example also or alternatively determine whether the system is sufficiently stationary by testing the magnitude of the acceleration vector; an acceleration magnitude approximately equal to g (gravity acceleration) may suggest that the system may be stationary. By comparing the predicted and measured vectors, rotations 432 and 433 can be calculated that rotate the measured values into the predicted values. (For example, an axis for each of these rotations can be determined by taking the cross product of the predicted and measured vectors.) These rotations 432 and 433 represent the errors between the calculated orientation and the actual orientation.
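The two stationarity tests mentioned here might be combined as below; the thresholds are illustrative guesses, not values from the disclosure.

```python
import numpy as np

G = 9.81  # nominal gravitational acceleration, m/s^2

def is_substantially_stationary(gyro, accel, w_tol=0.05, a_tol=0.5):
    """Heuristic: a low angular rate AND an acceleration magnitude near g
    suggest the sensor is stationary enough to trust the measured gravity."""
    low_rotation = np.linalg.norm(gyro) < w_tol            # rad/s
    near_gravity = abs(np.linalg.norm(accel) - G) < a_tol  # m/s^2
    return bool(low_rotation and near_gravity)
```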
In one or more embodiments one or both of the error rotations 432 and 433 may be applied to the current orientation with proportional scaling factors to shift the calculated orientation gradually towards the measured orientation. Applying only a portion of the error rotations to the orientation at each update cycle (or at selected update cycles) may for example provide a more robust solution when the measured gravity and magnetic vectors are also subject to possible errors. Fractions 442 and 443, respectively, are applied to rotations 432 and 433, yielding proportional error rotations 452 and 453 respectively. In this illustrative example the proportionality factors 442 and 443 are applied to the rotation angles (θm and θg), and the rotation axes (um and ug) of rotations 432 and 433 are preserved. The rotations 452 and 453 are then applied in step 460 to the orientation 411, yielding a corrected orientation 461. In one or more embodiments one or both of the corrections 452 and 453 may be applied at every sensor sample. In one or more embodiments one or both of the corrections may be applied periodically but not at every sensor sample, for example at every tenth sample. In one or more embodiments one or both of the corrections may be applied when the angular magnitude of error rotation 432 or 433 exceeds a threshold.
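A sketch of one such fractional correction follows, assuming the predicted and measured vectors are expressed in the sensor frame and that the orientation Q is a sensor-to-world rotation matrix (conventions the disclosure leaves open). The error axis is taken from the cross product of the measured and predicted vectors, as noted above, and only a fraction of the error angle is applied.

```python
import numpy as np

def axis_angle_matrix(u, theta):
    """Rotation by angle theta about unit axis u (Rodrigues' formula)."""
    ux = np.array([[0.0, -u[2], u[1]], [u[2], 0.0, -u[0]], [-u[1], u[0], 0.0]])
    return np.eye(3) + np.sin(theta) * ux + (1.0 - np.cos(theta)) * (ux @ ux)

def fractional_correction(Q, v_pred, v_meas, fraction):
    """Rotate Q by a fraction of the error between predicted and measured
    sensor-frame vectors (e.g. the gravity or magnetic field vector)."""
    p = v_pred / np.linalg.norm(v_pred)
    m = v_meas / np.linalg.norm(v_meas)
    axis = np.cross(m, p)                       # rotates measured toward predicted
    s = np.linalg.norm(axis)
    if s < 1e-9:
        return Q                                # already aligned; nothing to do
    theta = np.arctan2(s, float(np.dot(m, p)))  # full error angle
    return Q @ axis_angle_matrix(axis / s, fraction * theta)
```

Consistent with the paragraph above, such a routine might be invoked once with the magnetic pair and fraction λm and once with the gravity pair and fraction λg, at every sample, periodically, or only when the error angle exceeds a threshold.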
After possible correction using redundant sensor data, the position and orientation of a real object may be transformed to the virtual environment to form an image of a virtual object. In one or more embodiments the user or the system may be able to configure or modify the position and orientation in the virtual environment of a virtual camera that generates this image.
Virtual environment constraints may also ensure that the position and orientation of a virtual object do not violate physical laws or the required configurations imposed by the virtual environment.
In one or more embodiments the position or orientation of a virtual object may be determined fully or partially by the state of the virtual environment at a point in time. For example, in embodiments that mirror motion to play a virtual game, the state of the game may affect the placement of the virtual object in the virtual environment.
After the virtual environment updates the position of the virtual ball 125a to its new position 812, the system may automatically update the position and orientation of the virtual golf club to reflect the new ball position. For example, the system executes update process 820 to move the virtual club position from 801 to 812, where the virtual ball landed after the shot. In addition, the system may automatically update the virtual club orientation to aim in the direction facing the hole from the new position, for example updating the aim vector of the club from 802 to 822. These updates to the position and orientation of the virtual club may occur automatically, even if the user 101 does not execute a specific motion of the real club 102 to make the changes. Subsequent motion of the club 102 may then be interpreted relative to the updated position 812 and updated orientation 822.
In one or more embodiments the system may perform a gradual, continuous update of the position and orientation of a virtual object, so that the viewer of the display does not observe a discontinuous jump. For example, in the golf embodiment above, the system may move the virtual club smoothly from its previous position and aim direction to the new ball position and the new aim direction, rather than jumping instantaneously.
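The gradual update might be implemented as a per-frame rate limit on position and on aim angle, along the following lines; the step sizes are arbitrary caller-chosen values in this sketch.

```python
import numpy as np

def step_toward(current_pos, target_pos, max_step):
    """Move at most max_step toward the target so the object glides, not jumps."""
    cur = np.asarray(current_pos, dtype=float)
    tgt = np.asarray(target_pos, dtype=float)
    delta = tgt - cur
    dist = np.linalg.norm(delta)
    if dist <= max_step:
        return tgt
    return cur + delta * (max_step / dist)

def step_aim(current_angle, target_angle, max_turn):
    """Turn the aim direction by at most max_turn radians per frame."""
    diff = (target_angle - current_angle + np.pi) % (2.0 * np.pi) - np.pi
    return current_angle + float(np.clip(diff, -max_turn, max_turn))
```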
While the ideas herein disclosed have been described by means of specific embodiments and applications thereof, numerous modifications and variations could be made thereto by those skilled in the art without departing from the scope of the invention set forth in the claims.
This application is a continuation of U.S. Utility patent application Ser. No. 15/602,853, filed on 23 May 2017, issued as U.S. Pat. No. 10,786,728, the specification of which is hereby incorporated herein by reference.