Not Applicable.
The invention generally relates to a swing analysis system for improving athletic performance. More particularly, the invention relates to a swing analysis system for improving the athletic performance of an athlete who engages in a swinging motion during the execution of a sport, such as the swinging of a baseball bat or a golf club.
Training for a sporting activity usually requires going through the same motion repeatedly. Typically, a coach or trainer first tells the athlete what to do, and then observes the motion and corrects mistakes. In particular, for movements that are performed quickly, the coach or trainer explains the mistakes only after the trainee performs the activity. This may take the form of showing a video of the trainee performing the activity, and then pointing out the errors. Seeing the mistakes after the fact is not as effective as getting quantitative feedback while performing the activity. This type of feedback is particularly important for sports that involve the swinging of a particular implement, such as a baseball bat or golf club.
What is needed, therefore, is a swing analysis system that is capable of determining swing performance metrics from output data of a force measurement assembly. Moreover, a swing analysis system is needed that is capable of autodetecting one or more swing phases of a user. Furthermore, a need exists for a swing analysis system that is capable of generating a swing analysis report that includes one or more swing performance metrics.
Accordingly, the present invention is directed to a swing analysis system that substantially obviates one or more problems resulting from the limitations and deficiencies of the related art.
In accordance with one or more embodiments of the present invention, there is provided a swing analysis system that comprises a motion capture system comprising at least one motion capture device configured to detect the motion of one or more body segments of a user and generate first output data, and the at least one motion capture device further configured to detect the motion of at least one of: (i) a head and/or face of the user, (ii) a hand and/or fingers of the user, and (iii) an object being manipulated by the user, and generate second output data; and at least one data processing device operatively coupled to the motion capture system, the at least one data processing device configured to determine first positional information for the one or more body segments of the user from the first output data of the at least one motion capture device, the at least one data processing device further configured to determine second positional information for the at least one of: (i) the head and/or face of the user, (ii) the hand and/or fingers of the user, and (iii) the object being manipulated by the user, from the second output data of the at least one motion capture device, and the at least one data processing device additionally configured to determine one or more swing performance parameters for the user using at least one of: (a) the first positional information of the one or more body segments of the user from the at least one motion capture device, and (b) the second positional information for the at least one of: (i) the head and/or face of the user, (ii) the hand and/or fingers of the user, and (iii) the object being manipulated by the user.
In a further embodiment of the present invention, the first positional information of the one or more body segments of the user determined by the at least one data processing device comprises keypoints for the one or more body segments of the user generated using a trained neural network.
In yet a further embodiment, the one or more swing performance parameters determined by the at least one data processing device comprise at least one of: (i) one or more body segment angles for the one or more body segments of the user determined using the keypoints generated from the trained neural network, (ii) one or more body joint angles for the user determined using the one or more body segment angles for the one or more body segments of the user, (iii) one or more body joint angular velocities for the user determined using the one or more body joint angles of the user, (iv) one or more body joint angular accelerations for the user determined using the one or more body joint angular velocities of the user, (v) one or more body segment angular velocities for the one or more body segments of the user determined using the one or more body segment angles for the one or more body segments of the user, and (vi) one or more body segment angular accelerations for the one or more body segments of the user determined using the one or more body segment angular velocities for the one or more body segments of the user.
In still a further embodiment, the at least one motion capture device is further configured to detect the motion of the head and/or face of the user, and the at least one data processing device is further configured to determine the second positional information for the head and/or face of the user; and the second positional information for the head and/or face of the user determined by the at least one data processing device comprises keypoints for the head and/or face of the user generated using a trained neural network.
In yet a further embodiment, the one or more swing performance parameters determined by the at least one data processing device comprise a head position assessment metric and/or a gaze direction assessment metric while the user is manipulating the object during a swing activity.
In still a further embodiment, the at least one motion capture device is further configured to detect the motion of the hand and/or fingers of the user, and the at least one data processing device is further configured to determine the second positional information for the hand and/or fingers of the user; and the second positional information for the hand and/or fingers of the user determined by the at least one data processing device comprises keypoints for the hand and/or fingers of the user generated using a trained neural network.
In yet a further embodiment, the one or more swing performance parameters determined by the at least one data processing device comprise a grip assessment metric while the user is manipulating the object during a swing activity.
In still a further embodiment, the at least one motion capture device is further configured to detect the motion of the object being manipulated by the user, and the at least one data processing device is further configured to determine the second positional information for the object being manipulated by the user; and the second positional information for the object being manipulated by the user determined by the at least one data processing device comprises keypoints for the object being manipulated by the user generated using a trained neural network.
In yet a further embodiment, the one or more swing performance parameters determined by the at least one data processing device comprise an object displacement path assessment metric while the user is manipulating the object during a swing activity.
In still a further embodiment, the swing analysis system further comprises a force measurement assembly configured to receive the user, the force measurement assembly including a top component for receiving at least a portion of the body of the user; and at least one force transducer, the at least one force transducer configured to sense one or more measured quantities and output one or more signals that are representative of forces and/or moments being applied to the top component of the force measurement assembly by the user. In this further embodiment, the force measurement assembly is operatively coupled to the at least one data processing device, and the at least one data processing device is further configured to receive the one or more signals that are representative of the forces and/or moments being applied to the top component of the force measurement assembly by the user, and to convert the one or more signals into output forces and/or moments. Also, in this further embodiment, the at least one data processing device is further configured to determine one or more swing performance parameters for the user using the output forces and/or moments from the force measurement assembly.
In yet a further embodiment, the output forces and/or moments determined by the at least one data processing device include shear force (Fx) values and vertical force (Fz) values; and the one or more swing performance parameters determined by the at least one data processing device are selected from the group consisting of: (i) a maximum Fz drive force, (ii) a maximum Fz load force, (iii) a maximum Fx acceleration force, (iv) a maximum Fx braking or deceleration force, (v) a rate of force development along the x-axis, (vi) a rate of force development along the z-axis, (vii) a backswing torque, (viii) a downswing torque, (ix) a peak swing torque, (x) load quality, (xi) load variability, (xii) a drive impulse, (xiii) a load impulse, (xiv) an acceleration impulse, (xv) a braking impulse, and (xvi) combinations thereof.
In still a further embodiment, the force measurement assembly is in the form of an instrumented treadmill, force plate, or a balance plate.
In yet a further embodiment, the force measurement assembly comprises a front force plate and a rear force plate.
In still a further embodiment, the at least one data processing device is further configured to characterize a swing quality of the user by utilizing the one or more swing performance parameters and a trained neural network.
In yet a further embodiment, the swing analysis system further comprises a head position sensing device operatively coupled to the at least one data processing device, the head position sensing device further comprising attachment means for attaching the head position sensing device to the head of the user; and the at least one data processing device is further configured to receive one or more head position signals that are representative of the detected position of the head of the user from the head position sensing device, and to determine the head position information for the user from the one or more head position signals output by the head position sensing device.
In still a further embodiment, the swing analysis system further comprises a hand grip sensing device operatively coupled to the at least one data processing device, the hand grip sensing device being configured to detect a hand grip pressure of the user and to output one or more hand grip signals; and the at least one data processing device is further configured to receive the one or more hand grip signals that are representative of pressure applied to the object by the hand of the user, and to determine the hand grip pressure for the user from the one or more hand grip signals output by the hand grip sensing device.
In yet a further embodiment, the swing analysis system further comprises an eye movement tracking device operatively coupled to the at least one data processing device, the eye movement tracking device configured to track eye movement and/or eye position of the user, and output one or more eye tracking signals based upon the tracked eye movement and/or eye position of the user; and the at least one data processing device is further configured to receive the one or more eye tracking signals that are representative of the tracked eye movement and/or eye position of the user, and to determine one or more eye tracking metrics from the one or more eye tracking signals output by the eye movement tracking device.
In still a further embodiment, the object being manipulated by the user comprises a sports implement, and the swing analysis system further comprises a sports implement sensing device attached to the sports implement, the sports implement sensing device operatively coupled to the at least one data processing device; and the at least one data processing device is further configured to receive one or more sports implement signals that are representative of the detected position of the sports implement and/or a gripping pressure applied to the sports implement by the one or more hands of the user, and to determine the position of the sports implement and/or the gripping pressure applied to the sports implement from the one or more sports implement signals output by the sports implement sensing device.
In yet a further embodiment, the sports implement manipulated by the user is selected from the group consisting of: (i) a bat used in one or more sports, (ii) a club used in one or more sports, and (iii) a racquet used in one or more sports.
It is to be understood that the foregoing summary and the following detailed description of the present invention are merely exemplary and explanatory in nature. As such, the foregoing summary and the following detailed description of the invention should not be construed to limit the scope of the appended claims in any sense.
The invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Throughout the figures, the same parts are always denoted using the same reference characters so that, as a general rule, they will only be described once.
The present invention is described herein, in an exemplary manner, with reference to computer system architecture and exemplary processes carried out by the computer system. In one or more embodiments, the functionality described herein can be implemented by computer system instructions. These computer program instructions may be loaded directly onto an internal data storage device of a computing device (e.g., an internal data storage device of a laptop computing device). Alternatively, these computer program instructions could be stored on a portable computer-readable medium (e.g., a flash drive, etc.), and then subsequently loaded onto a computing device such that the instructions can be executed thereby. In other embodiments, these computer program instructions could be embodied in the hardware of the computing device, rather than in the software thereof. It is also possible for the computer program instructions to be embodied in a combination of both the hardware and the software.
This description describes in general form the computer program(s) required to carry out the swing analysis for a user. Any competent programmer in the field of information technology could develop a system using the description set forth herein.
For the sake of brevity, conventional computer system components, conventional data networking, and conventional software coding will not be described in detail herein. Also, it is to be understood that the connecting lines shown in the block diagram(s) included herein are intended to represent functional relationships and/or operational couplings between the various components. In addition to that which is explicitly depicted, it is to be understood that many alternative or additional functional relationships and/or physical connections may be incorporated in a practical application of the system.
An illustrative embodiment of a swing analysis system is seen generally at 100 in
As shown in the illustrative block diagram of
Now, turning again to
Referring again to
In the illustrative embodiment, the force measurement assembly 22 is operatively coupled to the data processing device 14 by virtue of an electrical cable. In one embodiment, the electrical cable is used for data transmission, as well as for providing power to the force measurement assembly 22. Various types of data transmission cables can be used for the cable. For example, the cable can be a Universal Serial Bus (USB) cable or an Ethernet cable. Preferably, the electrical cable contains a plurality of electrical wires bundled together, with at least one wire being used for power and at least another wire being used for transmitting data. The bundling of the power and data transmission wires into a single electrical cable advantageously creates a simpler and more efficient design. In addition, it enhances the safety of the training environment for the user. However, it is to be understood that the force measurement assembly 22 can be operatively coupled to the data processing device 14 using other signal transmission means, such as a wireless data transmission system. If a wireless data transmission system is employed, it is preferable to provide the force measurement assembly 22 with a separate power supply in the form of an internal power supply or a dedicated external power supply.
Now, the acquisition and processing of the load data carried out by the illustrative embodiment of the swing analysis system 100 will be described. Initially, a load is applied to the force measurement assembly 22 by the user disposed thereon. The load is transmitted from the front and rear plate components of the force measurement assembly 22 to its force transducer beams. In the illustrative embodiment, each plate component of the force measurement assembly 22 is supported on a plurality of force transducer beams disposed thereunder. Also, in the illustrative embodiment, each of the force transducer beams includes a plurality of strain gages wired in one or more Wheatstone bridge configurations, wherein the electrical resistance of each strain gage is altered when the associated portion of the associated beam-type force transducer undergoes deformation (i.e., a measured quantity) resulting from the load (i.e., forces and/or moments) acting on the front and rear plate components. For each plurality of strain gages disposed on the force transducer beams, the change in the electrical resistance of the strain gages brings about a consequential change in the output voltage of the Wheatstone bridge (i.e., a quantity representative of the load being applied to the measurement surface). Thus, in the illustrative embodiment, the pair of force transducer beams disposed under the plate components output a plurality of analog output voltages (signals). In the illustrative embodiment, the plurality of output voltages from the front and rear force plates are then transmitted to a preamplifier board (not shown) for preconditioning. The preamplifier board is used to increase the magnitudes of the transducer analog voltages, and preferably, to convert the analog voltage signals into digital voltage signals as well. Thereafter, the force measurement assembly 22 transmits the force plate output signals to a main signal amplifier/converter.
Depending on whether the preamplifier board also includes an analog-to-digital (A/D) converter, the force plate output signals could be either in the form of analog signals or digital signals. The main signal amplifier/converter further magnifies the force plate output signals, and if the signals are of the analog-type (for a case where the preamplifier board did not include an analog-to-digital (A/D) converter), it may also convert the analog signals to digital signals. In the illustrative embodiment, the force plate output signals may also be transformed into output forces and/or moments by the firmware of the front and rear force plates by multiplying the voltage signals by a calibration matrix prior to the force plate output data being transmitted to the data processing device 14. Alternatively, the data acquisition/data processing device 14 may receive the voltage signals, and then transform the signals into output forces and/or moments by multiplying the voltage signals by a calibration matrix.
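The calibration-matrix step above can be sketched as follows. This is a minimal illustration rather than the actual firmware: the 6×6 calibration values, the channel count, and the sample voltages are all assumptions made for the example.

```python
# Hedged sketch: converting digitized force-plate channel voltages into
# output forces/moments by multiplying by a calibration matrix, as the
# passage describes. All numeric values below are illustrative only.

def apply_calibration(calibration, voltages):
    """Return [Fx, Fy, Fz, Mx, My, Mz] for one sample of channel voltages."""
    return [sum(row[j] * voltages[j] for j in range(len(voltages)))
            for row in calibration]

# Illustrative diagonal calibration matrix (one sensitivity per channel).
CAL = [[0.0] * 6 for _ in range(6)]
for i, gain in enumerate([500.0, 500.0, 1000.0, 80.0, 80.0, 40.0]):
    CAL[i][i] = gain

sample_voltages = [0.10, -0.05, 2.00, 0.25, -0.50, 0.10]
forces_moments = apply_calibration(CAL, sample_voltages)
print(forces_moments)  # [Fx, Fy, Fz, Mx, My, Mz]
```

In practice the calibration matrix generally has off-diagonal terms that compensate for cross-talk between channels; the diagonal form here is only the simplest case.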
After the voltage signals are transformed into output forces and/or moments, the center of pressure for each foot of the user (i.e., the x and y coordinates of the point of application of the force applied to the measurement surface by each foot) may be determined by the data acquisition/data processing device 14.
In the illustrative embodiment, the data processing device 14 determines all three (3) orthogonal components of the resultant forces acting on the front and rear force plates (i.e., FFx, FFy, FFz, FRx, FRy, FRz) and all three (3) orthogonal components of the moments acting on the front and rear force plates (i.e., MFx, MFy, MFz, MRx, MRy, MRz), while in another embodiment, a subset of these force and moment components may be determined.
In the illustrative embodiment, where a single set of overall center of pressure coordinates (xp, yp) are determined for the force measurement assembly 22, the center of pressure of the force vector F applied by the user to the measurement surface of the force plate 22 is computed as follows:
xp=−My/Fz (1)

yp=Mx/Fz (2)
where:
xp, yp: coordinates of the point of application for the force (i.e., center of pressure) on the force plate assembly 22;
Fz: z-component of the resultant force acting on the force plate assembly 22;
Mx: x-component of the resultant moment acting on the force plate assembly 22; and
My: y-component of the resultant moment acting on the force plate assembly 22.
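A minimal sketch of the center-of-pressure computation defined above, assuming the common force-plate convention in which xp=−My/Fz and yp=Mx/Fz (the document's exact sign convention may differ):

```python
# Hedged sketch of the center-of-pressure computation, assuming the
# common convention x_p = -My/Fz, y_p = Mx/Fz for a force plate with
# its origin at the center of the measurement surface.

def center_of_pressure(fz, mx, my):
    """Return (xp, yp), the point of application of the resultant force."""
    if fz == 0:
        raise ValueError("vertical force is zero; COP is undefined")
    return (-my / fz, mx / fz)

xp, yp = center_of_pressure(fz=800.0, mx=40.0, my=-24.0)
print(xp, yp)  # 0.03 0.05
```

In the alternative embodiment mentioned below, the same function would simply be applied twice, once per force plate.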
In an alternative embodiment, the center of pressure coordinates (xp, yp) may be determined separately for the front and rear force plates of the force measurement assembly 22.
In the illustrative embodiment, the data processing device 14 of the swing analysis system 100 is programmed to determine a plurality of different outputs from the force plate output data, which may include: (i) autodetection of movements (e.g., during golf, vertical jump, baseball swing phases), (ii) peak forces (Fx, Fy, and Fz) and torques, (iii) impulses, (iv) timing metrics, (v) timestamps of important events, and (vi) rate of force development. For example, as illustrated in the graph of
Also, as illustrated in the graphs of
68216 N·s/−120448 N·s=−0.57 (3)
As another example, considering the Fx force curve depicted in
331 N/−206 N=−1.60 (4)
Advantageously, these efficiency ratios give insight into transfer of energy and force from the acceleration phase to the braking phase. Also, the time from the peak acceleration force to the peak braking force in the graph of
2.99 sec−2.75 sec=0.24 sec (5)
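The impulse-based efficiency ratios described above can be sketched as follows. The trapezoid-rule integration, the 100 Hz sampling rate, and the sampled Fx values are illustrative assumptions, not the document's data:

```python
# Hedged sketch: impulse (area under a force-time curve) via the trapezoid
# rule, and the acceleration/braking efficiency ratio the text describes.
# The sampled force values below are illustrative, not the document's data.

def impulse(forces, dt):
    """Trapezoidal integral of a sampled force signal (N*s)."""
    return sum((forces[i] + forces[i + 1]) * 0.5 * dt
               for i in range(len(forces) - 1))

dt = 0.01  # 100 Hz sampling (assumption)
accel_phase = [0.0, 50.0, 100.0, 50.0, 0.0]    # positive Fx: acceleration
brake_phase = [0.0, -40.0, -80.0, -40.0, 0.0]  # negative Fx: braking

accel_impulse = impulse(accel_phase, dt)
brake_impulse = impulse(brake_phase, dt)
efficiency_ratio = accel_impulse / brake_impulse
print(accel_impulse, brake_impulse, efficiency_ratio)
```

The same `impulse` helper would also serve for the vertical/horizontal brake ratio discussed below, which divides a vertical positive impulse by a horizontal negative impulse.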
Additional x-axis metrics determined by the data processing device 14 in the illustrative embodiment will be discussed with reference to
75 N/200 N=37.5% (6)
In addition, as illustrated in the graph of
Further, with reference to
In the illustrative embodiment, the data processing device 14 also may be programmed to determine the vertical/horizontal brake ratio for the baseball player. The vertical/horizontal brake ratio is the ratio of the vertical positive impulse and the horizontal negative impulse, and the vertical/horizontal brake ratio gives insight into whether more braking is happening horizontally or vertically. For example, considering the Fx and Fz force curves depicted in
68216 N·s/61060 N·s=1.12 (7)
Turning to
Load Quality Z:100−6.24=93.76 (8)
Secondly, using the rear force plate Fx plot in
Load Variability X:100−7.38=92.62 (9)
The data processing device 14 may calculate the load quality as the average between Load Quality Z and Load Variability X as follows:
Load Quality=(93.76+92.62)/2=93.19 (10)
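The load-quality calculation of equations (8)-(10) can be sketched as follows, assuming the variability term is the coefficient of variation (CV%) of the force signal during the load phase; the document does not define the variability term at this point, so that choice is an assumption:

```python
import statistics

# Hedged sketch of the load-quality calculation in equations (8)-(10),
# assuming the subtracted variability term is the coefficient of
# variation (CV%) of the force during the load phase. The load-phase
# samples below are illustrative, not the document's data.

def variability_pct(forces):
    """CV% = population standard deviation / |mean| * 100 (assumed)."""
    mean = statistics.fmean(forces)
    return statistics.pstdev(forces) / abs(mean) * 100.0

def load_quality(fz_load, fx_load):
    quality_z = 100.0 - variability_pct(fz_load)      # cf. equation (8)
    variability_x = 100.0 - variability_pct(fx_load)  # cf. equation (9)
    return (quality_z + variability_x) / 2.0          # cf. equation (10)

fz = [500.0, 520.0, 480.0, 510.0, 490.0]      # rear-plate Fz, load phase
fx = [-100.0, -95.0, -105.0, -98.0, -102.0]   # rear-plate Fx, load phase
print(round(load_quality(fz, fx), 2))
```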
In the illustrative embodiment, the data processing device 14 additionally may be programmed to determine baseball swing phases for the baseball player. For example, the data processing device 14 may be programmed to determine the following baseball swing phases for the baseball player: (i) stance (i.e., ready position to lead leg off), (ii) stride (i.e., lead leg off to BW 10%), (iii) coiling, (iv) swing initiation (i.e., lead leg 10% to peak force), (v) swing acceleration (i.e., peak force to contact), and (vi) follow through. As part of the determination of the baseball swing phases, the data processing device 14 may be programmed to determine the on and off positions of the front foot (refer to
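The bodyweight-threshold phase boundaries above suggest a simple event detector. A minimal sketch, assuming a 10%-of-bodyweight threshold on the front plate's vertical force; the threshold fraction and the sample data are assumptions for illustration:

```python
# Hedged sketch of threshold-based detection of front-foot "off" and "on"
# events from the front force plate's vertical force, in the spirit of the
# bodyweight-percentage phase boundaries listed above.

def detect_foot_events(fz_front, body_weight, threshold_frac=0.10):
    """Return (off_index, on_index): where Fz first drops below, and later
    rises back above, threshold_frac * body_weight; None if not found."""
    threshold = threshold_frac * body_weight
    off_index = on_index = None
    for i, fz in enumerate(fz_front):
        if off_index is None and fz < threshold:
            off_index = i            # lead leg off: start of stride
        elif off_index is not None and fz >= threshold:
            on_index = i             # front foot back on: end of stride
            break
    return off_index, on_index

fz_front = [400.0, 300.0, 60.0, 20.0, 10.0, 150.0, 600.0]  # illustrative
print(detect_foot_events(fz_front, body_weight=800.0))  # (2, 5)
```

The detected on/off indices are exactly the event points that bound the stride and swing-initiation phases listed above.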
In the illustrative embodiment, the data processing device 14 of the swing analysis system 100 is programmed to output the swing performance metrics described above for the front force plate of the force measurement assembly 22, the rear force plate of the force measurement assembly 22, or both the front and rear force plates of the force measurement assembly 22. Also, in the illustrative embodiment, the data processing device 14 may be programmed to compute impulses, peak forces and/or torques, a rate of force development, and other performance metrics for the front force plate and/or rear force plate of the force measurement assembly 22. In addition, the swing performance metrics described above may be determined using one or two force plates of the swing analysis system 100 (i.e., either the front force plate or the rear force plate, or both the front and rear force plates).
In the illustrative embodiment, the data processing device 14 further may be programmed to generate a baseball swing report with various swing performance metrics determined from the force plate output data. For example, as shown in
As yet another example, turning to
As yet another example, turning to
As still another example, turning to
In one or more other illustrative embodiments, the baseball swing report may include any combination of the following swing performance metrics: (i) momentum impulse, (ii) load, (iii) drive, (iv) acceleration, (v) deceleration, (vi) load variability, (vii) rate of force development, and (viii) peak force.
As yet another example, turning to
As still another example, turning to
Backswing RTD=(Mz(nBmz)−Mz(nAmz))/(nBmz−nAmz) (11)
nAmz=first moment Mz is above 0 in the backswing (12)
nBmz=nBz (13)
In equation (13) above, nBz is the index of min Fz load.
As yet another example, turning to
Downswing RTD=(Mz(nCmz)−Mz(nBmz))/(nCmz−nBmz) (14)
nBmz=nBz (15)
nCmz=max(Mz) (16)
In equation (15) above, nBz is the index of min Fz load.
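The rate-of-torque-development calculations around equations (11)-(16) amount to the change in the Mz moment between two event indices divided by the elapsed interval. A minimal sketch, where the conversion from sample indices to seconds via an assumed sampling rate is an addition for illustration:

```python
# Hedged sketch of the backswing/downswing rate-of-torque-development
# calculations: the change in Mz between two event indices divided by
# the elapsed time. The 100 Hz sampling rate and Mz samples are assumed.

def rate_of_torque_development(mz, start_index, end_index, sample_rate_hz):
    """(Mz[end] - Mz[start]) / elapsed time, in N*m per second."""
    dt = (end_index - start_index) / sample_rate_hz
    return (mz[end_index] - mz[start_index]) / dt

mz = [0.0, 2.0, 5.0, 9.0, 14.0, 20.0]  # illustrative Mz samples
n_a, n_b = 1, 4                        # e.g. the nAmz and nBmz event indices
print(rate_of_torque_development(mz, n_a, n_b, sample_rate_hz=100.0))
```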
As still another example, the data processing device 14 also may be programmed to determine the time to contact during a baseball swing. The data processing device 14 determines the time to contact during the baseball swing by subtracting a first time instance when a foot of the user is put back down on the ground at the end of a stride phase from an estimated time to ball contact. The data processing device 14 may calculate the time to contact during a baseball swing as follows:
TimeToContact=TimeBallContact−TimeFrontFootOn (17)
In equation (17) above, “TimeFrontFootOn” is the event point (i.e., time location) of when the foot of the baseball player is put back down on the ground, marking the end of the stride phase.
In a further illustrative embodiment of the swing analysis system 100, the system 100 uses a combination of the force measurement assembly 22′ (e.g., a force plate) and a motion capture system (see
In one or more embodiments, a remote server may be used to process the camera data collected on the local computing device, which is operatively coupled to the cameras 40. The remote server may be connected to the local computing device via an internet connection so as to enable cloud processing of the camera data. Advantageously, cloud processing enables users to obtain output data without having a powerful graphics processing unit (GPU) on the local computing device to analyze the markerless motion capture data using the one or more trained neural networks.
In this further illustrative embodiment, the center of mass of the body is obtained using computer vision and processing algorithms. First, the body center of mass (COM) and the location of the force plate relative to the COM are obtained. Then, the moment about the center of mass is calculated using the COM position data and the global ground reaction forces from the force plate. Also, because computer vision results are obtained, enhanced phase detection and kinematic processing can also be performed. In this further illustrative embodiment, phase detection of the following is additionally performed: (i) start of the swing (swing initiation), (ii) top of backswing, and (iii) ball contact.
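The moment-about-COM step described above can be sketched with a cross product; treating the force's point of application as the plate's center of pressure is an assumption, as are the sample values:

```python
# Hedged sketch of "the moment about the center of mass ... calculated
# using COM position data and the global ground reaction forces": the
# cross product M = r x F, with r running from the COM to the point of
# force application (assumed here to be the plate's center of pressure).

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def moment_about_com(com, force_point, force):
    """Moment of the ground reaction force about the body COM (N*m)."""
    r = tuple(p - c for p, c in zip(force_point, com))
    return cross(r, force)

com = (0.10, 0.05, 1.00)       # body COM in metres (illustrative)
cop = (0.15, 0.00, 0.00)       # force applied at the plate surface
grf = (30.0, -10.0, 750.0)     # global ground reaction force (N)
print(moment_about_com(com, cop, grf))
```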
In this further illustrative embodiment, an input of 2-4 RGB video cameras 40 may be used. Also, the swing analysis system 100 uses a computer vision algorithm to obtain 17 or more keypoint locations on the human subject during a swinging motion. The keypoint locations for each frame create a time-series file containing the locations of each keypoint in three-dimensional (3D) space. These keypoints are then processed to output the location of the center of mass in 3D space. For example, the three-dimensional (3D) pose estimation system described in U.S. Pat. No. 10,853,970 may be used to determine the keypoint locations, the entire disclosure of which is incorporated herein by reference.
In this further illustrative embodiment, with reference again to
Additionally, in this further illustrative embodiment, the keypoint time series information may be used by the data processing device 14 to do an algorithmic analysis of the kinematic data of the human subject. The data processing device 14 may calculate the angular position, velocity, and acceleration of the body segments for each frame.
Now, the details of this further illustrative embodiment will be described in more detail with reference to
headneck(x/y/z)=c7(x/y/z)+(0.5002)*(head(x/y/z)−c7(x/y/z));
trunk(x/y/z)=(rshoulder(x/y/z)+lshoulder(x/y/z))/2+(0.413)*(hip(x/y/z)−((rshoulder(x/y/z)+lshoulder(x/y/z))/2));
rupperarm(x/y/z)=rshoulder(x/y/z)+(0.5772)*(relbow(x/y/z)−rshoulder(x/y/z));
lupperarm(x/y/z)=lshoulder(x/y/z)+(0.5772)*(lelbow(x/y/z)−lshoulder(x/y/z));
rforearm(x/y/z)=relbow(x/y/z)+(0.4574)*(rwrist(x/y/z)−relbow(x/y/z));
lforearm(x/y/z)=lelbow(x/y/z)+(0.4574)*(lwrist(x/y/z)−lelbow(x/y/z));
rthigh(x/y/z)=rhip(x/y/z)+(0.4095)*(rknee(x/y/z)−rhip(x/y/z));
lthigh(x/y/z)=lhip(x/y/z)+(0.4095)*(lknee(x/y/z)−lhip(x/y/z));
rshank(x/y/z)=rknee(x/y/z)+(0.4395)*(rankle(x/y/z)−rknee(x/y/z));
lshank(x/y/z)=lknee(x/y/z)+(0.4395)*(lankle(x/y/z)−lknee(x/y/z));
As one example, in the above lines of code, the head-neck segment center of mass location is determined as a function of the c7 keypoint, the head keypoint, and the head-neck segment length percentage. Then, in the illustrative embodiment, the data processing device 14 executes the following lines of code in order to determine the overall body center of mass location:
CM_tot(x/y/z)=headneck(x/y/z)*0.0694+trunk(x/y/z)*0.4346+rupperarm(x/y/z)*0.0271+lupperarm(x/y/z)*0.0271+rforearm(x/y/z)*0.0162+lforearm(x/y/z)*0.0162+rthigh(x/y/z)*0.1416+lthigh(x/y/z)*0.1416+rshank(x/y/z)*0.0433+lshank(x/y/z)*0.0433;
In the above lines of code, the overall body center of mass location is determined as a function of the individual body segment center of mass locations and the segment mass percentages. An exemplary output of the data processing device 14 for the overall body center of mass location (i.e., x, y, z coordinate locations of the overall COM) in millimeters over a period of time is depicted in
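The segment-COM pseudocode above can be reduced to a runnable one-axis sketch; the 0.5002 head-neck length fraction is the one the passage uses, while the keypoint coordinates are illustrative values:

```python
# Runnable one-axis sketch of the segment center-of-mass computation
# shown above. The 0.5002 length fraction is from the passage; the
# keypoint coordinates are illustrative, not measured data.

def segment_com(proximal, distal, length_frac):
    """COM of a segment, length_frac of the way from proximal to distal."""
    return proximal + length_frac * (distal - proximal)

# Illustrative vertical (z) keypoint coordinates in millimetres.
c7, head = 1450.0, 1650.0
headneck_z = segment_com(c7, head, 0.5002)
print(headneck_z)
```

Repeating this for every segment and summing the results weighted by the segment mass percentages (0.0694 for the head-neck, 0.4346 for the trunk, and so on) reproduces the overall-COM line above.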
In the illustrative embodiment, the data processing device 14 determines the global position coordinates (i.e., x, y, z coordinates) of each keypoint over time by processing the output data from the cameras 40 using one or more trained neural networks (e.g., by using the trained neural networks described in U.S. Pat. No. 10,853,970). In the illustrative embodiment, at least the following keypoints are determined by the data processing device 14: (i) head keypoint, (ii) C7 keypoint, (iii) right shoulder keypoint, (iv) right elbow keypoint, (v) right wrist keypoint, (vi) left shoulder keypoint, (vii) left elbow keypoint, (viii) left wrist keypoint, (ix) sacrum keypoint, (x) right hip keypoint, (xi) right knee keypoint, (xii) right ankle keypoint, (xiii) left hip keypoint, (xiv) left knee keypoint, and (xv) left ankle keypoint. An exemplary output of the data processing device 14 for the left knee keypoint location (i.e., x, y, z coordinate locations of the left knee keypoint) in millimeters over a period of time is depicted in
Also, in the illustrative embodiment, the data processing device 14 determines the global angle of each body segment in the x, y, and z planes. More specifically, in the illustrative embodiment the data processing device 14 determines the x, y, and z angles for the following body segments: (i) the right forearm, using the right elbow keypoint and the right wrist keypoint, (ii) the left forearm, using the left elbow keypoint and the left wrist keypoint, (iii) the right upper arm, using the right shoulder keypoint and the right elbow keypoint, (iv) the left upper arm, using the left shoulder keypoint and the left elbow keypoint, (v) the right thigh, using the right hip keypoint and the right knee keypoint, (vi) the left thigh, using the left hip keypoint and the left knee keypoint, (vii) the right shank, using the right knee keypoint and the right ankle keypoint, (viii) the left shank, using the left knee keypoint and the left ankle keypoint, (ix) the right foot, using the right ankle keypoint and the right toe keypoint, (x) the left foot, using the left ankle keypoint and the left toe keypoint, (xi) the pelvis, using the right hip keypoint and the left hip keypoint, (xii) the upper torso, using the right shoulder keypoint and the left shoulder keypoint, (xiii) right pelvis, using the sacrum keypoint and right hip keypoint, (xiv) left pelvis, using the sacrum keypoint and left hip keypoint, (xv) right upper trunk, using the C7 keypoint and right shoulder keypoint, (xvi) left upper trunk, using the C7 keypoint and left shoulder keypoint, (xvii) neck, using the C7 keypoint and head keypoint, (xviii) trunk, using the C7 keypoint and sacrum keypoint, (xix) right trunk, using the C7 keypoint and right hip keypoint, and (xx) left trunk, using the C7 keypoint and left hip keypoint.
In the illustrative embodiment, the data processing device 14 determines the limb segment angles for body segments in all three directions using the following equations:
Equation (18) may be more generally written as:
θ=(180/π)*arctan((yi−yj)/(xi−xj)) (19)
In equation (19) above, the x, y variables represent the x, y coordinates of the two (“i” and “j”) keypoints that surround the segment (keypoints that are used to determine each segment angle in the list above). This can be changed to (y, z) or (x, z) to get all three directions of segment angles.
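For illustration purposes only, the per-plane segment angle computation described above may be sketched as follows. Python is used here for illustration rather than the system's own code, and the keypoint coordinates are hypothetical.

```python
import math

def segment_angle_deg(p_i, p_j, axes=(0, 1)):
    """Angle (degrees) of the segment joining keypoints p_i and p_j,
    projected onto the plane given by two axis indices (0=x, 1=y, 2=z).
    Uses the arctangent of the coordinate-difference ratio, as in
    equation (19); swapping the axis pair yields the other planes."""
    a, b = axes
    return math.degrees(math.atan((p_i[b] - p_j[b]) / (p_i[a] - p_j[a])))

# Hypothetical right elbow and right wrist keypoints (x, y, z) in mm:
right_elbow = (250.0, 300.0, 1100.0)
right_wrist = (350.0, 300.0, 1000.0)
forearm_angle_xz = segment_angle_deg(right_elbow, right_wrist, axes=(0, 2))
```

As the next paragraph explains, a plain arctangent is limited to the range from −90° to 90°, which is why the system applies quadrant corrections to the raw value.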
In the calculations performed above by the data processing device 14, the use of the arctan function can cause some difficulties because arctan only ranges from −90° to 90°, and if the keypoints cross over each other (change angle “quadrants”), the plots will sometimes jump by a value of 360 degrees. For example, instead of an angle going from 180 to 181, it goes from 180 to −179. To avoid this, the algorithms detect which angle quadrant the keypoints in consideration are oriented in, and add either ±180, ±360, ±540, or ±720 in order to avoid the jumps. This makes the angles continuous for up to two revolutions around a “circle”, which is a minor limitation, but it is needed to avoid the 90-degree or 180-degree “jumps” in the plots. For example, in the illustrative embodiment, the data processing device 14 executes the following lines of code in order to determine the limb segment angles:
for (int i = 1; i < Length; i++)
{
    if ((pointOne(x/y/z)[i] (<=/>=) pointTwo(x/y/z)[i]) && (pointOne(x/y/z)[i] (<=/>=) pointTwo(x/y/z)[i]))
    {
        if ((pointOne(x/y/z)[i-1] (<=/>=) pointTwo(x/y/z)[i-1]) && (pointOne(x/y/z)[i-1] (<=/>=) pointTwo(x/y/z)[i-1]))
        {
            if (angles(x/y/z)[i-1] (<=/>=) 0 && angles(x/y/z)[i-1] (<=/>=) -359)
            {
                angles(x/y/z)[i] = (±180, 360, 540, 720) + (180/(float)System.Math.PI)*(float)System.Math.Atan((pointOne(x/y/z)[i] - pointTwo(x/y/z)[i])/(pointOne(x/y/z)[i] - pointTwo(x/y/z)[i]));
            }
            else if (angles(x/y/z)[i-1] (<=/>=) -359)
            {
                angles(x/y/z)[i] = (±180, 360, 540, 720) + (180/(float)System.Math.PI)*(float)System.Math.Atan((pointOne(x/y/z)[i] - pointTwo(x/y/z)[i])/(pointOne(x/y/z)[i] - pointTwo(x/y/z)[i]));
            }
            else
            {
                angles(x/y/z)[i] = (±180, 360, 540, 720) + (180/(float)System.Math.PI)*(float)System.Math.Atan((pointOne(x/y/z)[i] - pointTwo(x/y/z)[i])/(pointOne(x/y/z)[i] - pointTwo(x/y/z)[i]));
            }
        }
    }
}
In the above lines of code, the keypoints are the input and the limb segment angle is the output. For example, for the computation of the right or left forearm angle, the inputs are the right or left elbow keypoint and the right or left wrist keypoint. The limb segment angles describe how a particular body segment is oriented. An exemplary output of the data processing device 14 for the right thigh segment angles (i.e., the angles in the x, y, z directions for the right thigh) in degrees over a period of time is depicted in
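For illustration purposes only, the quadrant bookkeeping described above can equivalently be expressed with a two-argument arctangent followed by unwrapping of the angle series. The following is a minimal sketch in Python (not the system's own C#-style code), shown under the assumption that keeping successive samples within 180 degrees of each other is an acceptable continuity criterion:

```python
import math

def continuous_angles(points_one, points_two):
    """Per-frame segment angles (degrees) that remain continuous even
    when the keypoints cross angle quadrants.

    points_one / points_two: lists of (a, b) coordinate pairs for the
    two keypoints bounding the segment, one pair per frame."""
    angles = []
    prev = None
    for (a1, b1), (a2, b2) in zip(points_one, points_two):
        # atan2 resolves the quadrant directly, unlike plain atan.
        ang = math.degrees(math.atan2(b1 - b2, a1 - a2))
        if prev is not None:
            # Unwrap: shift by multiples of 360 so each step stays < 180 deg.
            while ang - prev > 180.0:
                ang -= 360.0
            while ang - prev < -180.0:
                ang += 360.0
        angles.append(ang)
        prev = ang
    return angles
```

The unwrapping step plays the same role as the ±180/±360/±540/±720 offsets in the code above: it removes the artificial jumps from the plotted angle series.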
Further, in the illustrative embodiment, the data processing device 14 determines the global joint angles of each body joint in the x, y, and z planes. More specifically, in the illustrative embodiment, the data processing device 14 determines the following joint angles in all three directions (x, y, z): (i) right knee, (ii) left knee, (iii) right elbow, (iv) left elbow, (v) right shoulder, (vi) left shoulder, (vii) right hip, (viii) left hip, (ix) right shoulder rotation, (x) left shoulder rotation, (xi) right hip rotation, and (xii) left hip rotation. The first eight listed joint angles (i.e., right knee, left knee, right elbow, left elbow, right shoulder, left shoulder, right hip, and left hip) are calculated by the data processing device 14 as described hereinafter. In general, each of the body joints has a convention for describing its magnitude and polarity. For example, when the knee of a person is fully extended, the knee angle is described as 0° flexion, and when the leg moves in a posterior direction relative to the thigh, the knee is said to be in flexion. In terms of absolute angles, the knee angle may be calculated as follows by the data processing device 14:
knee angle=θk=θ21−θ43 (20)
In the above equation (20), if θ21>θ43, the knee is flexed; if θ21<θ43, the knee is extended. For the ankle joint, the convention is slightly different in that 90° between the leg and the foot is the boundary between plantarflexion and dorsiflexion. As such, the ankle angle may be calculated as follows by the data processing device 14:
ankle angle=θα=θ43−θ65+90° (21)
In the above equation (21), if θα is positive, the foot is plantarflexed; if θα is negative, the foot is dorsiflexed. These two examples are described for the knee and ankle angles, but the method can be applied to any joint where the limb-segment angles are available around the joint. In equations (20) and (21) above, θ21 is the thigh segment angle, θ43 is the shank segment angle, and θ65 is the foot segment angle. For shoulder abduction, the upper arm and trunk segment angles could be calculated in a similar manner to the knee and ankle angles.
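For illustration purposes only, equations (20) and (21) may be sketched as follows. Python is used here for illustration rather than the system's own code, and the segment-angle values are hypothetical.

```python
def knee_angle(theta_thigh, theta_shank):
    """Equation (20): knee angle from the thigh (theta_21) and
    shank (theta_43) segment angles, in degrees.
    Positive => flexed, negative => extended."""
    return theta_thigh - theta_shank

def ankle_angle(theta_shank, theta_foot):
    """Equation (21): ankle angle from the shank (theta_43) and
    foot (theta_65) segment angles, in degrees.
    Positive => plantarflexed, negative => dorsiflexed."""
    return theta_shank - theta_foot + 90.0

# Hypothetical segment angles (degrees):
knee = knee_angle(80.0, 60.0)     # 20.0 => knee flexed 20 degrees
ankle = ankle_angle(60.0, 160.0)  # -10.0 => foot dorsiflexed 10 degrees
```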
The other four joint angles listed above (i.e., right shoulder rotation, left shoulder rotation, right hip rotation, left hip rotation) are calculated by the data processing device 14 in different ways, as only two keypoints are available to find the rotation of these joints. For shoulder external rotation, a 0° angle may correspond to the arm of the person pointing straight forward, a 90° angle may correspond to the arm of the person pointing straight up, and a 180° angle may correspond to the arm of the person pointing straight backward. The right and left shoulder rotation angles may be calculated as follows by the data processing device 14:
Then, to output the angle in the orientation as described above, the data processing device 14 adds either 0, 180, or −180 to the value based on the orientation and quadrants of the keypoints in order to avoid “jumps” in the plots and to report the angles according to the desired output. For hip external rotation, the same tangent method is used with the ankle and knee keypoints to find the desired hip angle. Then, to output the angle in the orientation as described above, the data processing device 14 adds either 0, 90, or −90 to the value based on the orientation and quadrants of the keypoints in order to avoid “jumps” in the plots and to report the angles according to the desired output.
In the illustrative embodiment, a plurality of joint angles are then normalized/adjusted by the data processing device 14. For example, in the illustrative embodiment, the following joint angles are normalized/adjusted: (i) shoulder abduction angle (shoulder joint angle Y), (ii) shoulder horizontal abduction angle (shoulder joint angle Z), (iii) hip flexion angle (hip joint angle X), and (iv) elbow flexion angle. For the determination of the shoulder abduction angle in the y-direction, the neutral position is when the arm of the person is extending straight down, while the 90 degree position of the arm is when the arm is extending outwardly from the side of the person in a horizontal direction. In order to obtain the shoulder abduction angle in the desired form, the data processing device 14 utilizes the following equations:
Left Shoulder Abduction=Left Shoulder Joint Angle Y−90 (25)
Right Shoulder Abduction=90−Right Shoulder Joint Angle Y (26)
For the determination of the shoulder abduction angle in the z-direction, the 90 degree horizontal flexion position is when the arm of the person extends straight out from the person in an anterior direction, the 0 degree horizontal flexion position is when the arm of the person extends straight out from the person in a lateral direction, and the 90 degree horizontal extension position is when the arm of the person extends straight out from the person in a posterior direction. In order to obtain the shoulder horizontal abduction angle in the desired form, the data processing device 14 utilizes the following equations:
Left Shoulder Horizontal Abduction=Left Shoulder Joint Angle Z−180 (27)
Right Shoulder Horizontal Abduction=−1*Right Shoulder Joint Angle Z (28)
For the determination of the hip flexion angle (hip joint angle X), the 0 degree flexion position is when the leg of the person extends straight out from the person in an inferior direction, and the 90 degree flexion position is when the leg of the person extends outwardly from the person in an anterior direction (i.e., the leg is bent 90 degrees). In order to obtain the hip flexion angle in the desired form, the data processing device 14 utilizes the following equations:
Left Hip Flexion=180−Left Hip Joint Angle X (29)
Right Hip Flexion=180−Right Hip Joint Angle X (30)
For the determination of the elbow flexion angle, the 0 degree flexion position is when the forearm of the person extends straight out from the upper arm of the person, the 90 degree flexion position is when the forearm of the person forms a 90 degree angle with the upper arm of the person, and the 180 degree flexion position is when the forearm of the person is bent back against the upper arm of the person so that the forearm and upper arm are generally parallel to one another. In order to obtain the elbow flexion angle in the desired form, the data processing device 14 utilizes the following equations:
Left Elbow Flexion=Left Elbow Joint Angle Y (no normalization) (31)
Right Elbow Flexion=−1*Right Elbow Joint Angle Y (32)
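For illustration purposes only, the normalization equations (25)-(32) may be collected into a single mapping, sketched below in Python rather than the system's own code. The raw joint-angle values are hypothetical, and the sketch assumes the horizontal abduction equations operate on the z-direction shoulder joint angle, consistent with the surrounding description of the z-direction convention.

```python
def normalize_joint_angles(raw):
    """Apply equations (25)-(32) to raw joint angles (degrees).
    `raw` maps names like 'left_shoulder_y' to raw joint-angle values."""
    return {
        "left_shoulder_abduction": raw["left_shoulder_y"] - 90.0,         # (25)
        "right_shoulder_abduction": 90.0 - raw["right_shoulder_y"],       # (26)
        "left_shoulder_horiz_abduction": raw["left_shoulder_z"] - 180.0,  # (27)
        "right_shoulder_horiz_abduction": -raw["right_shoulder_z"],       # (28)
        "left_hip_flexion": 180.0 - raw["left_hip_x"],                    # (29)
        "right_hip_flexion": 180.0 - raw["right_hip_x"],                  # (30)
        "left_elbow_flexion": raw["left_elbow_y"],                        # (31)
        "right_elbow_flexion": -raw["right_elbow_y"],                     # (32)
    }
```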
In the illustrative embodiment, once these joint and segment angles have been calculated by the data processing device 14, and there is a value at each time point for each angle, the derivative of the angle time series data can be calculated to find both body joint angular velocities and body segment angular velocities. For example, the data processing device 14 may use the following equation on both the body joint angle data and body segment angle data to then find body joint angular velocities and body segment angular velocities in all three directions (x, y, z) for each angle at time point “i”:
In the illustrative embodiment, similar to the body joint and segment angular velocities, the body joint and segment angular accelerations may be calculated at each time point by finding the derivative of the body joint and segment angular velocity. An equation similar to equation (33) may be used for angular acceleration, except that the angles will be replaced with velocities such that the derivative of angular velocity is now being taken:
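Since equations (33) and (34) are simple time derivatives of the sampled angle and velocity series, they may be sketched with a finite difference as follows. Python is used here for illustration rather than the system's own code; the use of a central difference for interior points (with one-sided differences at the endpoints) and the 100 Hz sample data are assumptions made for this sketch.

```python
def central_difference(series, dt):
    """Numerically differentiate a uniformly sampled series.
    Interior points use the central difference
    (series[i+1] - series[i-1]) / (2*dt); the endpoints use
    one-sided differences so the output keeps the same length."""
    n = len(series)
    out = [0.0] * n
    out[0] = (series[1] - series[0]) / dt
    out[-1] = (series[-1] - series[-2]) / dt
    for i in range(1, n - 1):
        out[i] = (series[i + 1] - series[i - 1]) / (2.0 * dt)
    return out

# Angular velocity from joint angles, then angular acceleration from
# angular velocity (hypothetical 100 Hz angle samples, in degrees):
dt = 0.01
angles = [0.0, 1.0, 4.0, 9.0, 16.0]              # theta(t) = (100*t)^2
velocity = central_difference(angles, dt)        # deg/s
acceleration = central_difference(velocity, dt)  # deg/s^2
```

The same routine applies unchanged to the body segment angle series, yielding the body segment angular velocities and accelerations.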
In the illustrative embodiment, the data processing device 14 further determines the moment around the center of mass (COM) in the x, y, z planes using the coordinates of the center of mass and the forces in all three planes. Standard torque calculation laws are applied about the center of mass, which is calculated using the body segment percentages and the sum of the body segment torques.
The moments around the center of mass (COM) are calculated using equations (35)-(37) below in the illustrative embodiment. If the front plate COM moment is being determined, these values are all front plate values; if the rear plate COM moment is being determined, then these values are all rear plate values; and if the total/single plate COM moment is being determined, then these values are generated from aggregate/single plate data. In order to obtain the moments around the center of mass (COM), the data processing device 14 utilizes the following equations:
Mx=(Fy*COMz)+(Fz*(COPy−COMy))+Mx (35)
My=(Fx*COMz)+(Fz*(COPx−COMx))+My (36)
Mz=(Fx*(COPy−COMy))+(Fy*(COPx−COMx))+Mz (37)
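For illustration purposes only, equations (35)-(37) may be sketched directly as follows. Python is used here for illustration rather than the system's own code; the force, center of pressure, center of mass, and plate-moment values are hypothetical, and the trailing Mx/My/Mz terms in the equations are read here as the moments measured by the force plate.

```python
def com_moments(F, COP, COM, M_plate):
    """Moments about the body center of mass per equations (35)-(37).
    F, COP, COM, M_plate are (x, y, z) tuples: ground reaction force,
    center of pressure, center of mass, and plate-measured moments."""
    Fx, Fy, Fz = F
    COPx, COPy, _ = COP
    COMx, COMy, COMz = COM
    Mx_p, My_p, Mz_p = M_plate
    Mx = (Fy * COMz) + (Fz * (COPy - COMy)) + Mx_p           # (35)
    My = (Fx * COMz) + (Fz * (COPx - COMx)) + My_p           # (36)
    Mz = (Fx * (COPy - COMy)) + (Fy * (COPx - COMx)) + Mz_p  # (37)
    return Mx, My, Mz
```

The same routine serves for the front plate, rear plate, or total/single plate case; only the input values change, as described above.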
The total moment around the COM, also called the golfer ground interaction, is calculated by the data processing device 14 using the following equation:
M=Σi(r⃗i×F⃗i+τ⃗i) (38)
In yet a further illustrative embodiment, other kinetic metrics are used to assess the baseball swing or the golf swing. These metrics include: (i) weighting-impact time and landing-impact time, (ii) “front-foot” versus “reverse” style of golf swing, (iii) weight transfer range, (iv) rate of weight transfer, and (v) single foot metrics. Each of these additional metrics will be explained hereinafter.
First of all, weighting-impact time and landing-impact time are metrics that are used in conjunction with a dual force plate system, such as the dual plate system illustrated in
Secondly, similar to the weighting-impact time and landing-impact time, the “front-foot” versus “reverse” style of golf swing are metrics that are used in conjunction with a dual force plate system, such as the dual plate system illustrated in
Thirdly, the weight transfer range is a golf metric that may be used in conjunction with a dual force plate system, such as the dual plate system illustrated in
Fourthly, similar to the weight transfer range, the rate of weight transfer is a golf metric that may be used in conjunction with a dual force plate system, such as the dual plate system illustrated in
Finally, there are other single foot metrics that may be used in conjunction with a dual force plate system, such as the dual plate system illustrated in
In yet a further illustrative embodiment, the data processing device 14 of the swing analysis system 100 may be further configured to characterize a swing quality of the user by utilizing the one or more swing performance parameters and one or more trained neural networks (e.g., by using the trained neural networks described in U.S. Pat. No. 10,853,970). For example, the data processing device 14 may characterize the swing of the user as a good swing if the one or more swing performance parameters of the user fall within a predetermined acceptable range. Conversely, the data processing device 14 may characterize the swing of the user as a bad swing if the one or more swing performance parameters of the user fall outside a predetermined acceptable range. Also, after the swing analysis system 100 collects data for a sufficient quantity of swings, the data processing device 14 is then able to characterize the swing as good or bad based on a machine learning comparison to the other swings that have been evaluated and characterized. Further, the data processing device 14 may be configured to make recommendations on how to improve a bad swing based on previously acquired swing data. In addition to characterizing the swing of the user, the data processing device 14 may further be configured to characterize a quality of other activities of the user as well.
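As a minimal sketch of the range-based characterization described above (Python is used for illustration; the parameter names and acceptable ranges are hypothetical, and the neural-network comparison is not shown):

```python
def characterize_swing(params, acceptable_ranges):
    """Label a swing 'good' if every performance parameter falls within
    its predetermined acceptable range, else 'bad' plus the offenders."""
    out_of_range = [
        name for name, value in params.items()
        if not (acceptable_ranges[name][0] <= value <= acceptable_ranges[name][1])
    ]
    return ("good", []) if not out_of_range else ("bad", out_of_range)

# Hypothetical swing performance parameters and acceptable ranges:
ranges = {"peak_bat_speed_mph": (60.0, 90.0), "hip_rotation_deg": (30.0, 60.0)}
label, offenders = characterize_swing(
    {"peak_bat_speed_mph": 72.0, "hip_rotation_deg": 25.0}, ranges)
```

Reporting which parameters fell outside their ranges is one natural basis for the improvement recommendations mentioned above.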
In yet a further illustrative embodiment of the swing analysis system 100, the system 100 includes a motion capture system that includes a plurality of motion capture devices (e.g., video cameras 40—see
In this further illustrative embodiment of the swing analysis system 100, the system 100 may further include a force measurement assembly 22′ (e.g., a force plate—see
In this further illustrative embodiment, the output forces and/or moments determined by the at least one data processing device 14 include shear force (Fx) values and vertical force (Fz) values; and the one or more swing performance metrics determined by the at least one data processing device 14 are selected from the group consisting of: (i) a maximum Fz drive force, (ii) a maximum Fz load force, (iii) a maximum Fx acceleration force, (iv) a maximum Fx braking or deceleration force, (v) a rate of force development along the x-axis, (vi) a rate of force development along the z-axis, (vii) a backswing torque, (viii) a downswing torque, (ix) a peak swing torque, (x) load quality, (xi) load variability, (xii) a drive impulse, (xiii) a load impulse, (xiv) an acceleration impulse, (xv) a braking impulse, and (xvi) combinations thereof.
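Several of the listed metrics reduce to peaks and slopes of the force-plate time series. For illustration purposes only, a few of them may be sketched as follows (Python rather than the system's own code; the sampled force values are hypothetical, and defining the rate of force development as the maximum single-sample slope is an assumption made for this sketch):

```python
def force_metrics(fx, fz, dt):
    """Peak shear/vertical forces and a simple rate-of-force-development
    estimate from uniformly sampled Fx and Fz series (newtons)."""
    return {
        "max_fz": max(fz),
        "max_fx_acceleration": max(fx),  # largest positive shear force
        "max_fx_braking": min(fx),       # largest negative (braking) shear force
        "rfd_z": max((b - a) / dt for a, b in zip(fz, fz[1:])),  # N/s
    }

# Hypothetical 100 Hz force-plate samples during a downswing:
metrics = force_metrics(
    fx=[5.0, 40.0, 90.0, -30.0, -120.0],
    fz=[700.0, 850.0, 1100.0, 950.0, 800.0],
    dt=0.01,
)
```

Impulse-type metrics (drive, load, acceleration, braking) would similarly follow by integrating the relevant force component over the corresponding swing phase.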
In this further illustrative embodiment, the first positional information of the one or more body segments of the user determined by the at least one data processing device 14 comprises keypoints for the one or more body segments of the user generated using a trained neural network. More specifically, in this further illustrative embodiment, the data processing device 14 and/or a cloud server is configured to determine body keypoint data from the camera output data (e.g., markered or markerless).
In this further illustrative embodiment, the one or more swing performance parameters determined by the at least one data processing device 14 comprise at least one of: (i) one or more body segment angles for the one or more body segments of the user determined using the keypoints generated from the trained neural network, (ii) one or more body joint angles for the user determined using the one or more body segment angles for the one or more body segments of the user, (iii) one or more body joint angular velocities for the user determined using the one or more body joint angles of the user, (iv) one or more body joint angular accelerations for the user determined using the one or more body joint angular velocities of the user, (v) one or more body segment angular velocities for the one or more body segments of the user determined using the one or more body segment angles for the one or more body segments of the user, and (vi) one or more body segment angular accelerations for the one or more body segments of the user determined using the one or more body segment angular velocities for the one or more body segments of the user.
Machine learning-based tracking of the body segments of the user via keypoint tracking (e.g., a pose model or biomechanical model) during the different phases of a swing is very important in understanding each swing. In this further illustrative embodiment, a pose-based kinetic core model of a baseball swing or golf swing, both algorithmic and machine learning-based, may be utilized to extract key metrics, such as backswing torque and peak swing torque, similar to the force-based ground reaction metrics. Additionally, trained machine learning models for aggregate metrics may be utilized to compare a subject's performance over time and to compare the subject against other subjects.
In this further illustrative embodiment, the second positional information for the head and/or face of the user determined by the at least one data processing device 14 comprises keypoints for the head and/or face of the user generated using a trained neural network. More specifically, in this further illustrative embodiment, the data processing device 14 and/or a cloud server is configured to determine facial keypoint data from the camera output data (e.g., markered or markerless).
In this further illustrative embodiment, the one or more swing performance parameters determined by the at least one data processing device 14 comprise a head position assessment metric and/or a gaze direction assessment metric while the user is manipulating the object during a swing activity.
During any swing activities, it is very beneficial to know what the head of the user is doing (e.g., while the user is swinging a baseball bat or golf club). Training the user (e.g., athlete) based on the head-tracked data can increase energy efficiencies and reduce injuries. With camera-based tracking, the orientation of the head with respect to the body can help deduce additional insights. In this further illustrative embodiment, traditional computer vision-based tracking or marker-less motion-based tracking may be used with a suitable human keypoint model.
In this further illustrative embodiment, the second positional information for the hand and/or fingers of the user determined by the at least one data processing device 14 comprises keypoints for the hand and/or fingers of the user generated using a trained neural network. More specifically, in this further illustrative embodiment, the data processing device 14 and/or a cloud server is configured to determine hand and finger keypoint data from the camera output data (e.g., markered or markerless).
In the table above, the following abbreviations are used: (i) CMC, the carpometacarpal joint, (ii) MCP, the metacarpophalangeal joint, (iii) PIP, the proximal interphalangeal joint, (iv) DIP, the distal interphalangeal joint, and (v) IP, the interphalangeal joint.
In this further illustrative embodiment, the one or more swing performance parameters determined by the at least one data processing device 14 comprise a grip assessment metric while the user is manipulating the object during a swing activity.
Machine learning-based hand model keypoint tracking for the hand (or hand landmark model) can be utilized for grip analysis during the swinging activity. Grip analysis can give insights into accuracy and precision of handling of a baseball bat or golf club. Analysis is carried out over all phases of a swing. As described hereinafter, for more precise grip placement, instrumented gloves, instrumented golf clubs, and instrumented baseball bats may be used.
In this further illustrative embodiment, the second positional information for the object being manipulated by the user determined by the at least one data processing device comprises keypoints for the object being manipulated by the user generated using a trained neural network. For example, if the object being manipulated by the user is a baseball bat or a golf club, the keypoints for the object may comprise a series of keypoints disposed along the length of the baseball bat or the golf club.
In this further illustrative embodiment, the one or more swing performance parameters determined by the at least one data processing device 14 comprise an object displacement path assessment metric while the user is manipulating the object during a swing activity.
Referring again to
Also, in this further illustrative embodiment, the swing analysis system 100 may further include a hand grip sensing device 48 (e.g., an instrumented glove) operatively coupled to the at least one data processing device 14 (see
In addition, in this further illustrative embodiment, the swing analysis system 100 may further include an eye movement tracking device 44 operatively coupled to the at least one data processing device 14 (see
Knowing where the user is looking (gaze) during a certain action, like swinging a bat, is important for performance analysis. Gaze tracking over the different phases of a swing can give a lot of insight into the precision with which the user tracks the baseball or golf ball over time. Eye saccade and eye blink rate data can be used to identify where reflex training may be required.
Additional metrics, such as blinking, blink rate, and pupil dilation and constriction during an action, are essential for determining the probability of tracking a pitched ball, tracking the position of the bat, and timing the contact of the bat with the ball. Goggle- or glasses-based eye-tracking hardware may be used, such as the eye movement tracking devices described in U.S. Pat. No. 11,337,606. Also, metrics focused on cognitive load analysis can give insights into distraction levels and stress levels prior to and during contact with the ball.
In this further illustrative embodiment, the object being manipulated by the user comprises a sports implement, and the swing analysis system 100 may further comprise one or more sports implement sensing devices 42 (e.g., inertial measurement units (IMUs), each with an accelerometer, gyroscope, and/or magnetometer) attached to the sports implement. The one or more sports implement sensing devices 42 are operatively coupled to the at least one data processing device 14 (see
In this further illustrative embodiment, the sports implement manipulated by the user is selected from the group consisting of: (i) a bat used in one or more sports, (ii) a club used in one or more sports, and (iii) a racquet used in one or more sports. Advantageously, having positional, velocity, and trajectory data on a bat/club/racquet, in conjunction with the position of the body segments and data from the force plate, provides more insight in performance analysis. Bats/clubs/racquets can be tracked using IMU(s). Bats/clubs/racquets also can be tracked using vision cameras and/or machine learning techniques.
In this further illustrative embodiment, other measured data points may comprise tracking the pressure of the hand on bats/clubs/racquets. Measuring this pressure is essential for proper gripping techniques. In order to measure grip pressure, one or more of the following may be used: (i) a sleeve-based hand grip pressure sensor, (ii) a glove-based hand grip pressure sensor, and (iii) camera-based hand grip tracking using machine learning (e.g., tracking of individual fingers).
Now, with reference to diagrams in
Turning to
It is readily apparent that the swing analysis system 100 described above offers numerous advantages and benefits for training athletes. First, the swing analysis system 100 is capable of determining swing performance metrics from output data of a force measurement assembly. Moreover, the swing analysis system 100 is capable of autodetecting one or more swing phases of a user. Furthermore, the swing analysis system 100 is capable of generating a swing analysis report that includes one or more swing performance metrics.
While reference is made throughout this disclosure to, for example, “an illustrative embodiment”, “one embodiment”, or a “further embodiment”, it is to be understood that some or all aspects of these various embodiments may be combined with one another as part of an overall embodiment of the invention. That is, any of the features or attributes of the aforedescribed embodiments may be used in combination with any of the other features and attributes of the aforedescribed embodiments as desired.
Although the invention has been shown and described with respect to a certain embodiment or embodiments, it is apparent that this invention can be embodied in many different forms and that many other modifications and variations are possible without departing from the spirit and scope of this invention. For example, while the embodiments presented above focus on the analysis of a baseball swing, it is to be understood that the swing analysis principles described above may be applied to the swing analysis of any implement or object swung by a user, such as a baseball bat, cricket bat, golf club, tennis racket, squash racket, etc.
Moreover, while exemplary embodiments have been described herein, one of ordinary skill in the art will readily appreciate that the exemplary embodiments set forth above are merely illustrative in nature and should not be construed as to limit the claims in any manner. Rather, the scope of the invention is defined only by the appended claims and their equivalents, and not by the preceding description.
This is a continuation-in-part of U.S. Nonprovisional patent application Ser. No. 17/409,701, entitled “Swing Analysis System”, filed on Aug. 23, 2021; which is a continuation-in-part of U.S. Nonprovisional patent application Ser. No. 17/067,745 entitled “Swing Analysis System”, filed on Oct. 11, 2020, now U.S. Pat. No. 11,097,154; which claims the benefit of U.S. Provisional Patent Application No. 62/913,995, entitled “Swing Analysis System”, filed on Oct. 11, 2019, the disclosure of each of which is hereby incorporated by reference as if set forth in their entirety herein.
Number | Name | Date | Kind |
---|---|---|---|
6038488 | Barnes et al. | Mar 2000 | A |
6113237 | Ober et al. | Sep 2000 | A |
6152564 | Ober et al. | Nov 2000 | A |
6295878 | Berme | Oct 2001 | B1 |
6354155 | Berme | Mar 2002 | B1 |
6389883 | Berme et al. | May 2002 | B1 |
6936016 | Berme et al. | Aug 2005 | B2 |
8181541 | Berme | May 2012 | B2 |
8246354 | Chu et al. | Aug 2012 | B2 |
8315822 | Berme et al. | Nov 2012 | B2 |
8315823 | Berme et al. | Nov 2012 | B2 |
D689388 | Berme | Sep 2013 | S |
D689389 | Berme | Sep 2013 | S |
8543540 | Wilson et al. | Sep 2013 | B1 |
8544347 | Berme | Oct 2013 | B1 |
8643669 | Wilson et al. | Feb 2014 | B1 |
8700569 | Wilson et al. | Apr 2014 | B1 |
8704855 | Berme et al. | Apr 2014 | B1 |
8764532 | Berme | Jul 2014 | B1 |
8847989 | Berme et al. | Sep 2014 | B1 |
D715669 | Berme | Oct 2014 | S |
8902249 | Wilson et al. | Dec 2014 | B1 |
8915149 | Berme | Dec 2014 | B1 |
9032817 | Berme et al. | May 2015 | B2 |
9043278 | Wilson et al. | May 2015 | B1 |
9066667 | Berme et al. | Jun 2015 | B1 |
9081436 | Berme et al. | Jul 2015 | B1 |
9168420 | Berme et al. | Oct 2015 | B1 |
9173596 | Berme et al. | Nov 2015 | B1 |
9200897 | Wilson et al. | Dec 2015 | B1 |
9277857 | Berme et al. | Mar 2016 | B1 |
D755067 | Berme et al. | May 2016 | S |
9404823 | Berme et al. | Aug 2016 | B1 |
9414784 | Berme et al. | Aug 2016 | B1 |
9468370 | Shearer | Oct 2016 | B1 |
9517008 | Berme et al. | Dec 2016 | B1 |
9526443 | Berme et al. | Dec 2016 | B1 |
9526451 | Berme | Dec 2016 | B1 |
9558399 | Jeka et al. | Jan 2017 | B1 |
9568382 | Berme et al. | Feb 2017 | B1 |
9622686 | Berme et al. | Apr 2017 | B1 |
9763604 | Berme et al. | Sep 2017 | B1 |
9770203 | Berme et al. | Sep 2017 | B1 |
9778119 | Berme et al. | Oct 2017 | B2 |
9814430 | Berme et al. | Nov 2017 | B1 |
9829311 | Wilson | Nov 2017 | B1 |
9854997 | Berme et al. | Jan 2018 | B1 |
9916011 | Berme et al. | Mar 2018 | B1 |
9927312 | Berme et al. | Mar 2018 | B1 |
10010248 | Shearer | Jul 2018 | B1 |
10010286 | Berme et al. | Jul 2018 | B1 |
10085676 | Berme et al. | Oct 2018 | B1 |
10117602 | Berme et al. | Nov 2018 | B1 |
10126186 | Berme et al. | Nov 2018 | B2 |
10216262 | Berme et al. | Feb 2019 | B1 |
10231662 | Berme et al. | Mar 2019 | B1 |
10264964 | Berme et al. | Apr 2019 | B1 |
10331324 | Wilson et al. | Jun 2019 | B1 |
10342473 | Berme et al. | Jul 2019 | B1 |
10390736 | Berme et al. | Aug 2019 | B1 |
10413230 | Berme et al. | Sep 2019 | B1 |
10463250 | Berme et al. | Nov 2019 | B1 |
10527508 | Berme et al. | Jan 2020 | B2 |
10555688 | Berme et al. | Feb 2020 | B1 |
10646153 | Berme et al. | May 2020 | B1 |
10722114 | Berme et al. | Jul 2020 | B1 |
10736545 | Berme et al. | Aug 2020 | B1 |
10765936 | Berme et al. | Sep 2020 | B2 |
10803990 | Wilson et al. | Oct 2020 | B1 |
10853970 | Akbas et al. | Dec 2020 | B1 |
10856796 | Berme et al. | Dec 2020 | B1 |
10860843 | Berme et al. | Dec 2020 | B1 |
10945599 | Berme et al. | Mar 2021 | B1 |
10966606 | Berme | Apr 2021 | B1 |
11033453 | Berme et al. | Jun 2021 | B1 |
11052288 | Berme et al. | Jul 2021 | B1 |
11054325 | Berme et al. | Jul 2021 | B2 |
11074711 | Akbas et al. | Jul 2021 | B1 |
11097154 | Berme et al. | Aug 2021 | B1 |
11158422 | Wilson et al. | Oct 2021 | B1 |
11182924 | Akbas et al. | Nov 2021 | B1 |
11262231 | Berme et al. | Mar 2022 | B1 |
11262258 | Berme et al. | Mar 2022 | B2 |
11301045 | Berme et al. | Apr 2022 | B1 |
11311209 | Berme et al. | Apr 2022 | B1 |
11321868 | Akbas et al. | May 2022 | B1 |
11337606 | Berme et al. | May 2022 | B1 |
11348279 | Akbas et al. | May 2022 | B1 |
11458362 | Berme et al. | Oct 2022 | B1 |
11521373 | Akbas et al. | Dec 2022 | B1 |
11540744 | Berme | Jan 2023 | B1 |
20030216656 | Berme et al. | Nov 2003 | A1 |
20040172213 | Kainulainen | Sep 2004 | A1 |
20080221487 | Zohar et al. | Sep 2008 | A1 |
20080228110 | Berme | Sep 2008 | A1 |
20090029793 | Cage | Jan 2009 | A1 |
20100210974 | Brett et al. | Aug 2010 | A1 |
20110184225 | Whitall et al. | Jul 2011 | A1 |
20110277562 | Berme | Nov 2011 | A1 |
20120051597 | Fogt | Mar 2012 | A1 |
20120183940 | Aragones et al. | Jul 2012 | A1 |
20120240691 | Wettels et al. | Sep 2012 | A1 |
20120266648 | Berme et al. | Oct 2012 | A1 |
20120271565 | Berme et al. | Oct 2012 | A1 |
20130268254 | Sen | Oct 2013 | A1 |
20140342844 | Mooney | Nov 2014 | A1 |
20150096387 | Berme et al. | Apr 2015 | A1 |
20160084869 | Yuen et al. | Mar 2016 | A1 |
20160245711 | Berme et al. | Aug 2016 | A1 |
20160307335 | Perry | Oct 2016 | A1 |
20160334288 | Berme et al. | Nov 2016 | A1 |
20180024015 | Berme et al. | Jan 2018 | A1 |
20180071600 | Horner | Mar 2018 | A1 |
20180200605 | Syed | Jul 2018 | A1 |
20180361223 | Cherryhomes et al. | Dec 2018 | A1 |
20190078951 | Berme et al. | Mar 2019 | A1 |
20190209909 | Thornbrue | Jul 2019 | A1 |
20190282131 | Chang et al. | Sep 2019 | A1 |
20200139229 | Berme et al. | May 2020 | A1 |
20200408625 | Berme et al. | Dec 2020 | A1 |
20210333163 | Berme et al. | Oct 2021 | A1 |
20220178775 | Berme et al. | Jun 2022 | A1 |
Entry |
---|
First office action on the merits (Non-Final Rejection) in U.S. Appl. No. 17/067,745, dated Feb. 5, 2021. |
Notice of Allowance in U.S. Appl. No. 17/067,745, dated Apr. 19, 2021. |
First office action on the merits (Non-Final Rejection) in U.S. Appl. No. 17/409,701, dated Nov. 5, 2021. |
Second office action on the merits (Final Rejection) in U.S. Appl. No. 17/409,701, dated Mar. 8, 2022. |
Notice of Allowance in U.S. Appl. No. 17/409,701, dated May 26, 2022. |
Number | Date | Country |
---|---|---|
62913995 | Oct 2019 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 17409701 | Aug 2021 | US |
Child | 17959246 | | US |
Parent | 17067745 | Oct 2020 | US |
Child | 17409701 | | US |