MEASURING 3D BASKETBALL TRAJECTORY, SPIN RATE AND SPIN AXIS

Information

  • Patent Application
  • Publication Number: 20250229129
  • Date Filed: January 10, 2025
  • Date Published: July 17, 2025
Abstract
Embodiments are disclosed for determining a three-dimensional (3D) trajectory, spin rate and spin axis of a basketball in flight. In some embodiments, a method comprises: capturing, using a first camera, a first set of images of a basketball in motion after the basketball is released by a player; capturing, using a second camera, a second set of images of the basketball when it contacts a rim of a basketball hoop; measuring, using a radar, radar data associated with the basketball; and generating, using the first and second sets of images, an observed three-dimensional trajectory of the basketball, based on two-dimensional position data determined from the first and second sets of images, intrinsic parameters of the first and second cameras, extrinsic parameters of the first and second cameras and the radar data.
Description
TECHNICAL FIELD

This disclosure relates generally to sports technologies and data analytics, and in particular to measuring the trajectory and parameters of a basketball in flight.


BACKGROUND

Data-driven sports technologies and data analytics help players and coaches better understand performance through reliable data. One such technology uses sensors to measure the trajectory and various parameters of a ball in flight. Two examples of ball parameters are spin rate and spin axis. A spin rate is the speed at which a ball spins about its spin axis. A high spin rate gives a ball in flight more height and a steeper landing angle, and a low spin rate gives the ball less height and a shallower landing angle.


SUMMARY

Embodiments are disclosed for determining a three-dimensional (3D) trajectory, spin rate and spin axis of a basketball in flight.


In some embodiments, a method comprises: capturing, using a first camera, a first set of images of a basketball in motion after the basketball is released by a player; capturing, using a second camera, a second set of images of the basketball when it contacts a rim of a basketball hoop; measuring, using a radar, radar data associated with the basketball; and generating, using the first and second sets of images, an observed three-dimensional (3D) trajectory of the basketball, based on two-dimensional (2D) position data determined from the first and second sets of images, intrinsic parameters of the first and second cameras, extrinsic parameters of the first and second cameras and the radar data.


In some embodiments, the first camera has a first frame rate and a first field of view (FOV) directed towards the player, and the second camera has a second frame rate and a second FOV directed toward the rim of the basketball hoop.


In some embodiments, the first frame rate is faster than the second frame rate and the first FOV is wider than the second FOV, wherein the first camera captures the release of the basketball by the player, and the second camera captures the basketball before it contacts the rim of the basketball hoop.


In some embodiments, the radar data includes a perceived radial speed of the basketball while the basketball is in motion.


In some embodiments, generating the observed 3D trajectory of the basketball further comprises: iterating through a number of flight trajectories of the basketball constructed from the 2D position data and the radar data; reprojecting each flight trajectory into camera coordinates using the intrinsic camera parameters; calculating a perceived 2D pixel error by comparing the observed 3D trajectory with 3D flight path data generated by a flight model; minimizing the 2D pixel error to obtain the observed 3D basketball trajectory; and converting the observed 3D basketball trajectory from camera coordinates to real-world coordinates using the extrinsic parameters.


In some embodiments, the 3D flight path data is randomly generated.


In some embodiments, the method comprises: analyzing the second set of images using computer vision or deep learning to determine at least one of an impact time when the basketball contacts the rim, impact position of the basketball at the impact time, descent angle, spin rate, spin axis or contact speed; generating performance data based on the analyzing and observed 3D trajectory; and providing feedback on the performance data through one or more visual or audio devices.


In some embodiments, the method further comprises measuring the spin rate and the spin axis of the basketball by tracking a logo, seam line or other elements on the basketball.


In some embodiments, the method further comprises determining the impact position of the basketball based on the observed 3D trajectory and a circumference of the rim.


In some embodiments, the vibration of the rim captured by the second camera is used to determine the impact time.


In some embodiments, the method further comprises: determining kinematics data from the first set of camera images; and determining the performance data by combining the kinematics data with the impact position.


Other embodiments are directed to a system, apparatus and computer-readable medium.


Particular embodiments described herein provide one or more of the following advantages. The disclosed monitoring system uses calibrated intrinsic camera parameters and Doppler information (e.g., velocity) to compute 3D basketball trajectories. Such 3D trajectories may be represented as a second order or higher polynomial function with time, acceleration, speed and position as parameters (e.g., using kinematic equations). The monitoring system provides advantages over existing systems that require stereo imaging, time of flight sensing, structured lighting, frequency modulated continuous wave (FMCW) radar or frequency-shift keying (FSK) radar to generate an observed 3D basketball trajectory.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a side view of a basketball monitoring device mounted to a pole, according to one or more embodiments.



FIG. 2 is a block diagram showing various components of the monitoring device of FIG. 1, according to one or more embodiments.



FIG. 3 is a flow diagram of a process of determining a 3D trajectory, spin rate and spin axis of a basketball in flight, according to one or more embodiments.





DETAILED DESCRIPTION


FIG. 1 is a side view of a basketball monitoring device 100 mounted to basketball pole 101, according to one or more embodiments. Monitoring device 100 may be attached to basketball pole 101 or at a similar position. In some embodiments, monitoring device 100 can be mounted on a wall, ceiling or other support such as a tripod or the like, allowing device 100 to be placed at a desired location relative to the basketball hoop (e.g., correct height and orientation). In practice, there can be two monitoring devices 100, one mounted to each of the two basketball poles 101 on opposite ends of the basketball court.



FIG. 2 is a block diagram showing various components of the monitoring device 100 of FIG. 1, according to one or more embodiments. In some embodiments, monitoring device 100 comprises a housing (not shown) that encloses rim camera 201, player camera 202, battery 204, processor 200 and wireless transceiver 205; in some embodiments, as can be seen from FIG. 2, the housing also encloses radar 203. Rim camera 201 has a first frame rate and its field of view (FOV) is directed towards rim 102 of the basketball hoop, as shown in FIG. 1. As such, rim camera 201 may capture one or more events occurring at rim 102 of the basketball hoop. Player camera 202 has a second frame rate that is slower than the first frame rate of rim camera 201, a wider FOV than rim camera 201, and its FOV is directed towards the basketball court (e.g., towards the player shooting a basket). Player camera 202 captures the release of basketball 103 by the player and most of the trajectory of basketball 103, as shown in FIG. 1. Rim camera 201 captures basketball 103 before it contacts rim 102, also shown in FIG. 1. It is understood that the first frame rate and the second frame rate can be modified as necessary; hence, the first frame rate may be the same as the second frame rate. Similarly, the width or scope of the FOV for the rim camera and the player camera can be modified accordingly. For example, the FOV of the rim camera may be configured to be narrower than the FOV of the player camera. In some embodiments, the FOV of the rim camera and the FOV of the player camera partially overlap. In some embodiments, the FOV of the rim camera is within the FOV of the player camera.


A basketball trajectory may be reconstructed using rim camera 201 data, player camera 202 data and optionally radar 203 data. Monitoring device 100 may be designed such that video is captured in a time-synchronized fashion (e.g., if the player camera is set at 30 frames per second (fps), the rim camera may be set to 120 fps). The cameras 201, 202 are configured/controlled such that one out of every 4 images of rim camera 201 is taken at exactly the same time as an image of player camera 202, as sketched below. In some embodiments, the video may be captured in a time-asynchronized fashion. In addition, cameras 201, 202 may be calibrated during manufacturing so that it is possible to convert one camera coordinate system to the other and vice-versa. In some other embodiments, a higher-to-lower frame rate conversion (frame rate down-sampling) may be adopted, such as dropping every alternate frame of the rim camera, followed by blending the frames of the cameras into a single output frame. In some embodiments, frame averaging is used to interpolate the motion between frames. In some embodiments, deep-learning based techniques, such as Depth-Aware video frame Interpolation (DAIN), are used to interpolate the motion between frames.
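The 4:1 frame pairing can be expressed compactly in code. Below is a minimal sketch, assuming both streams start on a shared trigger and using the illustrative 120/30 fps rates above; it is not the device's actual firmware logic:

```python
# Pair time-synchronized frames from a 120 fps rim camera and a 30 fps
# player camera that share a common start trigger (illustrative ratio).

RIM_FPS = 120
PLAYER_FPS = 30
STEP = RIM_FPS // PLAYER_FPS  # every 4th rim frame aligns with a player frame

def synchronized_pairs(rim_frames, player_frames):
    """Yield (player_frame, rim_frame) pairs captured at the same instant."""
    for i, player_frame in enumerate(player_frames):
        rim_index = i * STEP
        if rim_index >= len(rim_frames):
            break
        yield player_frame, rim_frames[rim_index]
```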


During manufacturing or prior to use, calibrated intrinsic camera parameters are determined for both cameras 201, 202. Because cameras 201, 202 are calibrated, it is possible to deduce several parameters from images taken by cameras 201, 202, such as the size of basketball 103 and the approximate position of basketball 103 in the 3D world frame, and then use this information to generate an accurate 3D trajectory of basketball 103.


In some embodiments, radar 203 (e.g., a Doppler radar) is used to capture the perceived radial speed of basketball 103 while in motion, which can be combined with the perceived 3D motion of the ball determined from the camera images to reconstruct an accurate 3D trajectory of basketball 103.



FIG. 3 is a flow diagram of process 300 of determining a 3D trajectory, spin rate and spin axis of a basketball in flight, according to one or more embodiments. A ball flight radar track is calculated 301 from Doppler information (e.g., velocity) output by radar 203. The two-dimensional (2D) positions of basketball 103 are calculated 302 from images captured by player camera 202 at release time. The 2D positions of basketball 103 at contact time are calculated 303 from images captured by rim camera 201.


In some embodiments, an optimizer 304 estimates 3D ball positions from the camera and radar data, calibrated intrinsic camera parameters, and extrinsic camera parameters. In some embodiments, the camera data include one or more 2D images. In some embodiments, the extrinsic camera parameters indicate the orientation of the cameras (e.g., roll, pitch, yaw) in the 3D world frame (e.g., represented as external rotation and translation matrices or quaternions as inputs).
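As an illustration, extrinsics delivered as a quaternion and a translation vector can be assembled into the [R t] matrix used in Equation [1] below (a sketch using SciPy; the function name and the x, y, z, w quaternion order are assumptions for illustration):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def extrinsic_matrix(quat_xyzw, translation):
    """Build the 3x4 extrinsic matrix [R | t] from a unit quaternion
    (x, y, z, w order, as SciPy expects) and a 3-vector translation."""
    R = Rotation.from_quat(quat_xyzw).as_matrix()   # 3x3 rotation matrix
    t = np.asarray(translation).reshape(3, 1)       # 3x1 translation
    return np.hstack([R, t])                        # 3x4 [R | t]
```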


Using a pinhole model, a point (u, v) in a 2D image plane can be mapped to a 3D point (x, y, z) in the world frame by solving Equation [1]:











$$
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= K \begin{bmatrix} R & t \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix},
\qquad [1]
$$







where s is the scaling factor (usually equal to 1), [R t] contains the extrinsic camera parameters (R is the rotation matrix and t is the translation of the camera frame to the world frame) and K corresponds to the intrinsic camera parameters, as defined in Equation [2]:










$$
K = \begin{bmatrix}
f \cdot m_x & \gamma & u_0 \\
0 & f \cdot m_y & v_0 \\
0 & 0 & 1
\end{bmatrix},
\qquad [2]
$$







where f is the focal length, m_x and m_y are the scaling factors in x and y (usually 1), γ is the axis skew and (u_0, v_0) is the principal point.
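For illustration, the forward projection of Equation [1] takes only a few lines of NumPy (a sketch; the intrinsic values below are placeholders, not calibrated parameters of cameras 201, 202):

```python
import numpy as np

def project_point(K, R, t, point_world):
    """Project a 3D world point to pixel coordinates with the pinhole
    model of Equation [1]: s*[u, v, 1]^T = K [R | t] [x, y, z, 1]^T."""
    p = np.append(point_world, 1.0)          # homogeneous [x, y, z, 1]
    Rt = np.hstack([R, t.reshape(3, 1)])     # 3x4 extrinsic matrix
    uvs = K @ Rt @ p                         # unnormalized [s*u, s*v, s]
    return uvs[:2] / uvs[2]                  # divide by s to get (u, v)

# Placeholder intrinsics: f*m = 800 px, no skew, principal point (640, 360).
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
u, v = project_point(K, np.eye(3), np.zeros(3), np.array([0.5, 0.2, 5.0]))
```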


In some embodiments, the intrinsic camera parameters can be calibrated using Zhang's method (Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330-1334, 2000) by capturing several images of a particular pattern (e.g., a checkerboard pattern, 2D or 3D pattern) from different positions, extracting features from the images (e.g., corners or dots), and then solving Equation [1] to find the camera parameters.
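OpenCV provides an implementation of Zhang's method; the following is a minimal sketch of checkerboard calibration (the 9x6 board size and the image file names are assumptions for illustration):

```python
import cv2
import numpy as np

BOARD = (9, 6)  # inner corners per row/column of the assumed checkerboard
# World coordinates of the corners on the flat board (z = 0 plane).
obj = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
obj[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in ["calib_01.png", "calib_02.png", "calib_03.png"]:  # placeholders
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD, None)
    if found:
        obj_points.append(obj)
        img_points.append(corners)

# Solves for K (intrinsics) and distortion coefficients via Zhang's method.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
```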


In some embodiments, optimizer 304 iterates 305 through a number of 3D ball flight trajectories constructed from the 2D position data and radar data, and reprojects each trajectory back into camera coordinates using the calibrated intrinsic camera parameters and Equation [1] to calculate a perceived 2D pixel error, which is then minimized to obtain an optimum 3D basketball trajectory. In some embodiments, the optimum 3D basketball trajectory can be obtained without using additional inputs. The optimizer iteration process analyzes both the camera data (e.g., the 2D position of the ball captured by the camera) and the radar data (e.g., ball velocity). This combined data is referred to herein as the "observed trajectory." The observed trajectory is input into optimizer 304. Optimizer 304 compares the observed trajectory with a set of 3D flight path data generated by a flight model and calculates the difference, referred to herein as the deviation, which is the pixel difference between the observed trajectory and the 3D flight path. The comparison process continues until it finds a match, indicating that the shape of the generated 3D flight path matches the observed trajectory. In some embodiments, if the deviation is large (e.g., above a threshold), optimizer 304 further refines the comparison by generating and evaluating different 3D flight path data until the deviation is at a minimum. In some embodiments, the 3D flight path data is randomly generated by optimizer 304, allowing it to create multiple potential flight paths through an iterative process. The process described above resembles a curve-fitting method in which the observed trajectory is fitted or matched to the 3D flight path data.
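A schematic of this optimization loop, assuming the quadratic flight model of Equations [4]-[6] introduced below and a generic least-squares solver (the residual structure and the SciPy choice are illustrative; the disclosure does not specify a particular solver):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(params, times, observed_uv, K, R, t):
    """Pixel residuals between observed ball detections and the reprojection
    of a candidate quadratic flight path x(t), y(t), z(t)."""
    a, b, c = params.reshape(3, 3)   # per-axis quadratic, linear, constant terms
    Rt = np.hstack([R, t.reshape(3, 1)])
    errors = []
    for ti, uv in zip(times, observed_uv):
        world = a * ti**2 + b * ti + c            # candidate 3D position at ti
        proj = K @ Rt @ np.append(world, 1.0)     # Equation [1] reprojection
        errors.extend(proj[:2] / proj[2] - uv)    # pixel error in (u, v)
    return np.asarray(errors)

# times: frame timestamps; observed_uv: 2D ball centers from both cameras.
# The solution's nine coefficients define the optimum 3D trajectory.
# result = least_squares(residuals, x0=np.zeros(9),
#                        args=(times, observed_uv, K, R, t))
```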


In some embodiments, the system utilizes radar signals and 2D pixel positions of the ball (X1, Y1, X2, Y2, . . . Xn, Yn), together with radar-measured radial speed vector information, to estimate the ball's trajectory. Additionally, the system employs flight model data, including but not limited to ball speed and spin vector information, to predict the trajectory accurately within a limited duration (e.g., a few seconds). By integrating the ball's position data, absolute velocity, flight model, and camera parameters (intrinsic and extrinsic) into the optimizer, the system can calculate potential 3D trajectories. The trajectories are projected back into 2D using the camera parameters and compared to the observed data. The optimizer continuously refines the flight model iteration by comparing the observed and predicted flights to estimate the flight parameters that match the observed trajectory. The optimizer's task is to deduce potential ball flights that could result in the observed flight data. By comparing the observed flight data with the expected behavior of a ball in flight, the optimizer determines the most probable 3D space coordinates of the ball, including its initial speed and direction.


Previously, a system for monitoring a basketball trajectory would require stereo imaging, time of flight sensing, structured lighting, frequency modulated continuous wave (FMCW) radar, frequency-shift keying (FSK) radar or ultrasound (e.g., 3D ultrasound). By contrast, monitoring device 100 of the present disclosure uses calibrated intrinsic camera parameters and Doppler information (e.g., velocity) to compute 3D basketball trajectories. Such 3D trajectories may be represented as a second order or higher polynomial function with time, acceleration, speed and position as parameters (e.g., using kinematic equations).


In some embodiments, the trajectory generator process is a 3D flight model. This 3D flight model is randomly generated by optimizer 304, where optimizer 304 is capable of generating hundreds or thousands of flight models with different parameters (e.g., speed, launch angle, spin rate and spin axis). While other polynomial functions may be applied, in some embodiments a second-order polynomial function is used to represent the basketball's 3D trajectory, expressing its position in three dimensions (x, y, z) as a function of time (t). The general form of a second-order polynomial function is:










$$
f(t) = at^2 + bt + c
\qquad [3]
$$







The position in the x, y and z dimensions can be denoted as x(t), y(t) and z(t).











$$
x(t) = a_x t^2 + b_x t + c_x,
\qquad [4]
$$

$$
y(t) = a_y t^2 + b_y t + c_y,
\qquad [5]
$$

$$
z(t) = a_z t^2 + b_z t + c_z,
\qquad [6]
$$








where (a_x, a_y, a_z) are the constant acceleration terms in the x, y and z directions, respectively, (b_x, b_y, b_z) are the initial velocities in the x, y and z directions, respectively, and (c_x, c_y, c_z) are the initial positions in the x, y and z directions, respectively.


The above second-order polynomial functions are associated with the ball's trajectory in 3D space, which can be represented in matrix format as follows:







$$
\begin{bmatrix}
a_x & b_x & c_x \\
a_y & b_y & c_y \\
a_z & b_z & c_z
\end{bmatrix}.
$$
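For instance, given the coefficient matrix above, the ball's position at any time follows directly, and the per-axis coefficients can be recovered from observed 3D positions with an ordinary quadratic fit (a sketch; NumPy's polyfit is one of several suitable routines):

```python
import numpy as np

def position_at(coeffs, t):
    """Evaluate Equations [4]-[6]: rows of `coeffs` are (a, b, c) for x, y, z."""
    a, b, c = coeffs[:, 0], coeffs[:, 1], coeffs[:, 2]
    return a * t**2 + b * t + c   # (x(t), y(t), z(t))

def fit_trajectory(times, positions):
    """Least-squares quadratic fit per axis from N observed 3D positions
    (N x 3 array); returns the 3 x 3 coefficient matrix [[a, b, c], ...]."""
    return np.array([np.polyfit(times, positions[:, axis], 2)
                     for axis in range(3)])
```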




In some embodiments, rim camera 201 captures high frame rate images of basketball 103 in and around rim 102. These images can be analyzed using various computer vision and deep learning techniques (e.g., neural networks) to determine various parameters, such as contact position or positions (if the ball makes more than one contact with the rim), descent angle, spin rate, spin axis and contact speed. Such data could be used together with the actual 3D trajectory of basketball 103 to determine performance data, such as, e.g., the quality of a shot, which could be provided as feedback to the user through one or more visual and/or audio devices.


In some embodiments, the spin rate and spin axis of basketball 103 may be measured by tracking logos, lines or other elements (e.g., seam lines) on basketball 103. Impact position may be determined from the combined motion of the basketball and the rim circumference. In some embodiments, the vibration of rim 102 may be used to determine precise impact timing from the high frame rate rim camera 201. As the actual size of basketball 103 and the size of rim 102 are predetermined (e.g., following standard sizes), relative 3D positions can be determined with high accuracy.
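As a rough illustration of the tracking idea, the spin rate follows from the change in angular position of a tracked logo or seam feature between frames (a sketch under the simplifying assumptions that the feature stays visible and the spin axis points roughly toward the camera):

```python
import numpy as np

def spin_rate_rpm(marker_angles_rad, fps):
    """Estimate spin rate from the angle of a tracked logo/seam feature in
    successive frames. Angles are measured about the ball center; assumes
    the feature stays visible and the spin axis points toward the camera."""
    # Unwrap so a 359->1 degree step counts as +2 degrees, not -358.
    unwrapped = np.unwrap(marker_angles_rad)
    rad_per_frame = np.mean(np.diff(unwrapped))
    rad_per_sec = rad_per_frame * fps
    return rad_per_sec * 60.0 / (2.0 * np.pi)   # revolutions per minute

# Example: a feature advancing ~0.157 rad/frame at 120 fps is ~180 rpm.
```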


For example, a camera with 200 frames per second (fps) is configured to capture images of the rim. From the captured images, background subtraction can then be performed from frame to frame. The moment the ball contacts the rim, the rim will move, and the move time can be determined with the precision of the camera frame rate. For example, with a 200 fps camera, the rim move time can be determined with an accuracy of 1/200 s = 5 ms, and that provides the time of the impact. Other similar and suitable background subtraction techniques can be used, such as morphological-operation based or artificial intelligence (AI) based techniques (e.g., Mask R-CNN).
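A minimal sketch of this impact-time detection using OpenCV's MOG2 background subtractor (the motion threshold and rim region of interest are tuning assumptions, not disclosed values):

```python
import cv2

def find_impact_frame(frames, rim_roi, motion_thresh=500):
    """Return the index of the first frame in which the rim region moves.
    `rim_roi` = (x, y, w, h) around the rim; the threshold is an assumed
    tuning value."""
    subtractor = cv2.createBackgroundSubtractorMOG2(history=50)
    x, y, w, h = rim_roi
    for i, frame in enumerate(frames):
        mask = subtractor.apply(frame)
        # Count foreground (changed) pixels inside the rim region only.
        if cv2.countNonZero(mask[y:y + h, x:x + w]) > motion_thresh:
            return i   # impact time = i / fps, e.g., 5 ms resolution at 200 fps
    return None
```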


In some embodiments, an overlap of the rim and ball in the image may help to determine the contact position. This occurs when both the rim and ball are visible within the camera's frame. Using techniques like background subtraction, an image of the ball can be extracted. Additional computer vision methods, such as deep learning or other forms of background subtraction, can also be employed. The precise impact point of the ball on the rim can then be calculated by considering the impact time and the known sizes of both the ball and the rim. In some embodiments, the impact position of the ball may be determined without a complete set of 3D parameters. In some embodiments, artifacts generated by the vibrating motion resulting from the impact may not need to be removed or eliminated.


Determining Relative 3D Positions.

In some embodiments, the camera is calibrated to obtain intrinsic and extrinsic camera parameters. In some embodiments, the camera is calibrated to obtain updated (or calibrated) intrinsic and/or extrinsic camera parameters. In some embodiments, the camera is calibrated via a factory calibration followed by a field calibration. While the factory calibration is typically performed to obtain intrinsic camera parameters, the field calibration is a further calibration to obtain updated extrinsic camera parameters. The factory calibration typically involves finding the camera's focal length, principal point and distortion coefficients. Computer vision techniques (e.g., deep learning-based object detection models like YOLO, SSD or Faster R-CNN) can be used to detect the basketball and rim within the image frame. Once the bounding box coordinates of the basketball and rim are determined, their sizes are calculated in pixels.


In some embodiments, the known physical sizes of the basketball and rim may be used to determine conversion ratios between pixels and real-world measurements. In some embodiments, the known sizes of the basketball and rim, along with the calculated conversion ratios, may be used to estimate the relative 3D position of the basketball from the camera using triangulation and geometric calculations. Triangulation utilizes the perspective projection equations to estimate the distance of the basketball from the camera using the detected bounding box size and the known physical size of the basketball. Geometric calculations utilize information about the positions and sizes of the detected basketball and rim to estimate the 3D coordinates of the basketball relative to the rim or any other reference point. Finally, the 3D position obtained from the image coordinates is converted to the real-world coordinate system using, e.g., the camera's extrinsic parameters obtained during calibration.
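A sketch of the size-based distance estimate under the pinhole model (the ball diameter constant and helper names are illustrative; focal length and principal point come from the intrinsic matrix K):

```python
BALL_DIAMETER_M = 0.24   # approx. standard basketball (29.5 in circumference)

def ball_distance_m(focal_px, bbox_width_px):
    """Estimate camera-to-ball distance from the detected bounding box width,
    using similar triangles: Z = f * D_real / d_pixels."""
    return focal_px * BALL_DIAMETER_M / bbox_width_px

def ball_camera_xyz(K, u, v, bbox_width_px):
    """Back-project the ball center (u, v) to camera-frame 3D coordinates."""
    fx, fy = K[0][0], K[1][1]
    cx, cy = K[0][2], K[1][2]
    z = ball_distance_m(fx, bbox_width_px)
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)
```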


Monitoring device 100 may be constructed such that it can be easily removed from its mounted position and placed back. For example, the housing of monitoring device 100 may be affixed to basketball pole 101 or the wall, and monitoring device 100 could be attached and removed easily. Such an approach would allow the user to charge monitoring device 100 when not in use.


Since player camera 202 sees the shooter from the front, various kinematics data may also be determined. These kinematics data may include motion of the player's torso, knees, feet, head, hands, elbows, shoulders and other body parts. Such kinematics data combined with the shot information (e.g., 3D trajectory, contact position) may be used to determine the quality of the shot. A correlation model between kinematic form and the 3D trajectory, including contact position, can be developed and used to build player-specific insights to assess the quality of each shot. A statistical analysis can provide insights into the common mistakes the player makes during his/her shot. Various drills, exercises or remote coach guidance can be provided to a player/coach using the mentioned correlation data. All data captured by monitoring device 100 may be transferred to a mobile device or other device (e.g., desktop computer, network computer) using wireless communication hardware, such as wireless transceiver 205 (e.g., WiFi) shown in FIG. 2.


In some embodiments, the correlation model may be presented in a table containing various outcomes related to a player's shots and kinematic information. The kinematic information may include kinematic form, such as estimations of 3D limbs and keypoints. Outcomes can be categorized as ball "in", ball "out", "swish", bounce, good, poor or any other quality of the shot. Qualitative measures defining shot quality could involve release height, shot stability and entry angle. By capturing thousands or millions of shots taken by a player, a comprehensive assessment of shot quality and its associated outcomes can be constructed. For example, a statistical model can be used that establishes connections between the kinematic information of the shots and their resulting outcomes. These outcomes are correlated with the player's kinematic motion or any other types of motion. In some embodiments, a probability score is assigned to the player based on the executed motion, thereby offering valuable insights into the player's kinematic form and how it influences the quality of a shot.
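One simple realization of such a statistical model is a logistic regression from kinematic features to a made/missed outcome, with the predicted probability serving as the per-shot score (a sketch; the feature names and the scikit-learn choice are assumptions, not the disclosed implementation):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: illustrative kinematic features for one shot, e.g.,
# [release_height_m, elbow_angle_deg, knee_bend_deg, entry_angle_deg].
X = np.array([[2.1, 88.0, 120.0, 44.0],
              [1.9, 72.0, 135.0, 38.0],
              [2.2, 90.0, 118.0, 46.0],
              [1.8, 70.0, 140.0, 35.0]])
y = np.array([1, 0, 1, 0])   # outcome labels: 1 = made, 0 = missed

model = LogisticRegression().fit(X, y)

# Probability score for a new shot's kinematic form.
new_shot = np.array([[2.0, 85.0, 125.0, 42.0]])
score = model.predict_proba(new_shot)[0, 1]   # P(made | kinematics)
```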


While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.


Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

Claims
  • 1. A method comprising: capturing, using a first camera, a first set of images of a basketball in motion after the basketball is released by a player; capturing, using a second camera, a second set of images of the basketball when it contacts a rim of a basketball hoop; measuring, using a radar, radar data associated with the basketball; and generating, using the first and second sets of images, an observed three-dimensional (3D) trajectory of the basketball, based on two-dimensional (2D) position data determined from the first and second sets of images, intrinsic parameters of the first and second cameras, extrinsic parameters of the first and second cameras and the radar data.
  • 2. The method of claim 1, wherein the first camera has a first frame rate and a first field of view (FOV) directed towards the player, and the second camera has a second frame rate and a second FOV directed toward the rim of the basketball hoop.
  • 3. The method of claim 2, wherein the first frame rate is faster than the second frame rate and the first FOV is wider than the second FOV, and wherein the first camera captures the release of the basketball by the player, and the second camera captures the basketball before it contacts the rim of the basketball hoop.
  • 4. The method of claim 1, wherein the radar data includes a perceived radial speed of the basketball while the basketball is in motion.
  • 5. The method of claim 4, wherein generating the observed 3D trajectory of the basketball further comprises: iterating through a number of flight trajectories of the basketball constructed from the 2D position data and the radar data; reprojecting each flight trajectory into camera coordinates using the intrinsic camera parameters; calculating a perceived 2D pixel error by comparing the observed 3D trajectory with 3D flight path data generated by a flight model; minimizing the 2D pixel error to obtain the observed 3D basketball trajectory; and converting the observed 3D basketball trajectory from camera coordinates to real-world coordinates using the extrinsic parameters.
  • 6. The method of claim 5, wherein the 3D flight path data is randomly generated.
  • 7. The method of claim 1, further comprising: analyzing the second set of images using computer vision or deep learning to determine at least one of an impact time when the basketball contacts the rim, impact position of the basketball at the impact time, descent angle, spin rate, spin axis or contact speed; generating performance data based on the analyzing and observed 3D trajectory; and providing feedback on the performance data through one or more visual or audio devices.
  • 8. The method of claim 7, further comprising: measuring the spin rate and the spin axis of the basketball by tracking a logo, seam line or other elements on the basketball.
  • 9. The method of claim 7, further comprising: determining the impact position of the basketball based on the observed 3D trajectory and a circumference of the rim.
  • 10. The method of claim 7, wherein vibration of the rim captured by the second camera is used to determine the impact time.
  • 11. The method of claim 7, further comprising: determining kinematics data from the first set of camera images; and determining the performance data by combining the kinematics data with the impact position.
  • 12. A system comprising: a first camera; a second camera; a radar; at least one processor; memory storing instructions that, when executed by the at least one processor, cause the at least one processor to perform operations comprising: capturing, using the first camera, a first set of images of a basketball in motion after the basketball is released by a player; capturing, using the second camera, a second set of images of the basketball when it contacts a rim of a basketball hoop; measuring, using the radar, radar data associated with the basketball; and generating, using the first and second sets of images, an observed three-dimensional (3D) trajectory of the basketball, based on two-dimensional (2D) position data determined from the first and second sets of images, intrinsic parameters of the first and second cameras, extrinsic parameters of the first and second cameras and the radar data.
  • 13. The system of claim 12, wherein the first camera has a first frame rate and a first field of view (FOV) directed towards the player, and the second camera has a second frame rate and a second FOV directed toward the rim of the basketball hoop.
  • 14. The system of claim 13, wherein the first frame rate is faster than the second frame rate and the first FOV is wider than the second FOV, and wherein the first camera captures the release of the basketball by the player, and the second camera captures the basketball before it contacts the rim of the basketball hoop.
  • 15. The system of claim 12, wherein the radar data includes a perceived radial speed of the basketball while the basketball is in motion.
  • 16. The system of claim 15, wherein generating the observed 3D trajectory of the basketball further comprises: iterating through a number of flight trajectories of the basketball constructed from the 2D position data and the radar data; reprojecting each flight trajectory into camera coordinates using the calibrated intrinsic camera parameters; calculating a perceived 2D pixel error by comparing the observed 3D trajectory with 3D flight path data generated by a flight model; minimizing the 2D pixel error to obtain the observed 3D basketball trajectory; and converting the observed 3D basketball trajectory from camera coordinates to real-world coordinates using the extrinsic parameters.
  • 17. The system of claim 16, wherein the 3D flight path data is randomly generated.
  • 18. The system of claim 12, wherein the operations further comprise: analyzing the second set of images using computer vision or deep learning to determine at least one of impact time when the basketball contacts the rim, impact position of the basketball at the impact time, descent angle, spin rate, spin axis or contact speed; generating performance data based on the analyzing and observed 3D trajectory; and providing feedback on the performance data through one or more visual or audio devices.
  • 19. The system of claim 18, wherein the operations further comprise: measuring the spin rate and the spin axis of the basketball by tracking a logo, seam line or other elements on the basketball.
  • 20. The system of claim 18, wherein the operations further comprise: determining the impact position of the basketball based on the observed 3D trajectory of the basketball and a circumference of the rim.
  • 21. The system of claim 18, wherein the operations further comprise: determining kinematics data from the first set of camera images; and determining the performance data from the kinematics data and the impact position of the basketball.
  • 22. The system of claim 18, wherein vibration of the rim captured by the second camera is used to determine the impact time.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims the benefit of priority of U.S. Provisional Application No. 63/620,443, for “Measuring 3D Basketball Trajectory, Spin rate and Spin axis,” filed on Jan. 12, 2024, which provisional application is incorporated by reference herein in its entirety.
