The specification relates generally to ball sports, and specifically to an automated ball launching system for launching balls or other projectiles.
Participants in various sports, including ball sports such as basketball, can improve their performance by training skills such as shooting (e.g. at a net). Without any assistance from other participants, such training can be rendered inefficient by the frequent need to collect balls for further training shots. Current systems intended to aid the collection of balls and their return to the training participant, however, require extensive configuration to operate, and remain inefficient, at least in part due to their lack of operational flexibility.
According to an aspect of the specification, an automated ball launching system is provided. The system includes a ball support configured to releasably support a ball in a launch position; a positioning subsystem configured to set a launch direction for the ball; a launcher configured to project the ball in the launch direction from the launch position; and a control subsystem connected to the positioning subsystem and the launcher. The control subsystem is configured to: determine a position of a player relative to the ball launching system; control the positioning subsystem to set the launch direction based on the position of the player; and control the launcher to project the ball in the launch direction.
Embodiments are described with reference to the following figures, in which:
Referring to
In other embodiments, a wide variety of other ball supports can be employed. For example, curved guides can be placed on the planar surfaces shown in
System 100 also includes a positioning subsystem configured to set a launch direction for ball 104, and a launcher configured to project ball 104 in the launch direction from the launch position shown in
In the present embodiment, the positioning subsystem is provided by a set of actuators, and the launcher is provided by motorized wheels, as will be discussed below.
System 100 includes a rotatable base 108, such as a flat plate (e.g. made from steel or any other suitable material, and having sufficient strength to support the weight of the components of system 100 described herein). Base 108 is rotatably mounted on a fixed support, such as a ring 110. Ring 110 can rest directly on the ground when system 100 is in use, or can be supported on the ground by, for example, suction cups 112 or any other suitable support mechanism (e.g. wheels, pegs, or any suitable combination thereof). Base 108 is mounted on ring 110 to allow rotation of base 108 relative to ring 110. Thus, base 108 can include wheels, bearings or the like to move along the surface of ring 110. In other embodiments, base 108 can simply rest directly on the upper surface of ring 110, without being rotationally fixed to ring 110.
System 100 also includes a launch assembly 114 supported by base 108. In particular, launch assembly 114 is supported on base 108 by a frame 116 and an axle 120. Thus, launch assembly 114 is rotatable relative to base 108 about an elevation axis “E”. Meanwhile, as mentioned above base 108 is rotatable relative to ring 110 (and by extension, to the ground on which system 100 is installed) about a yaw axis “Y”.
In the present embodiment, the above-mentioned positioning subsystem includes a first actuator for rotating base 108 about axis Y, and a second actuator for rotating launch assembly 114 about axis E. The nature of the first and second actuators is not particularly limited. In the present example, the first actuator is a motor 122 (e.g. a high torque DC electrical motor). Motor 122 can include a shaft protruding through an opening in base 108 and fixed to a central portion of ring 110, e.g. by an arm or other attachment. Motor 122 can also include a casing (visible in
The second actuator can include a linear actuator 124, such as a hydraulic or pneumatic cylinder, or an electric actuator, coupled to base 108 at one end and to launch assembly 114 at an opposite end. Thus, the lengthening or shortening of linear actuator 124 causes launch assembly 114 to pivot about axis E. In other embodiments, linear actuator 124 can be replaced, or supplemented, with a rotational actuator mounted adjacent to axle 120.
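For purposes of illustration only, the following Python sketch estimates the length to which linear actuator 124 would be set to achieve a desired elevation angle about axle 120, assuming a simple two-link geometry; the mounting radii and offset angle are illustrative assumptions and do not form part of the specification.

```python
import math

def actuator_length(elevation_deg, r_base=0.30, r_assembly=0.25, offset_deg=90.0):
    """Illustrative geometry only: if linear actuator 124 spans from a point
    r_base meters from axle 120 on base 108 to a point r_assembly meters from
    the axle on launch assembly 114, the actuator length needed for a given
    elevation angle follows from the law of cosines.  All dimensions here are
    assumptions made for the sketch."""
    included_angle = math.radians(offset_deg - elevation_deg)
    return math.sqrt(r_base ** 2 + r_assembly ** 2
                     - 2.0 * r_base * r_assembly * math.cos(included_angle))
```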
As seen in
The positions of wheels 126 can be adjusted in at least one direction. For example, shafts 130 can be slideably mounted on rails 132, such that shafts 130 can be placed closer together or further apart. In addition, where multiple wheels 126 are provided on each shaft 130, the position of the wheels along each shaft 130 can be adjusted. More generally, the positions of wheels 126 at the outlet of launch assembly 114 can be adjusted to accommodate a variety of sizes of ball 104.
As will now be apparent to those skilled in the art, when ball 104 is in the launch position shown in
System 100 can also include a ball dispenser 138 supported by base 108 and disposed above launch assembly 114. In the present example, dispenser 138 is supported above launch assembly 114 by a frame 140, although other support mechanisms are also contemplated. Dispenser 138 can be removable from base 108, or can be omitted entirely in some embodiments.
In general, dispenser 138 defines a channel 142 configured to store a plurality of balls, and is configured to release a ball from channel 142 into launch assembly 114. Channel 142 has a lower end adjacent to launch assembly 114, and an opposite upper end. Balls may be loaded into channel 142 via the upper end, and released from the lower end to fall into launch assembly 114 via an opening 144 in an upper surface of launch assembly 114.
Dispenser 138 can include a dispensing actuator (not shown in
Dispenser 138 also includes a dispenser actuator 312 in the form of a linear actuator connected to dispenser 138. Actuator 312 includes a projection 316 extending into channel 142. In a first position, shown in
As seen in
To release the ball 300 from dispenser 138 and into launch assembly 114, actuator 312 returns to the raised position, as seen in
A wide variety of other dispensers can be employed in other embodiments. For example, although channel 142 is shown as being substantially straight, in other embodiments channel 142 can have a variety of shapes, such as a helical shape, a zigzag shape, and the like. In further embodiments, channel 142 can be replaced with a hopper-like structure in which a plurality of balls can rest (not necessarily stacked one above the other as shown in
As seen in
Although not shown in
Referring now to
Generally, computing device 400 includes a central processing unit (CPU) 404, also referred to herein as processor 404, interconnected with a memory 408. Memory 408 stores computer readable instructions executable by processor 404, and processor 404 is configured to perform various actions (discussed below) via execution of those instructions. Processor 404 and memory 408 generally comprise one or more integrated circuits (ICs), and can have a wide variety of structures, as will now occur to those skilled in the art. For example, processor 404 can include more than one CPU; processor 404 can also include one or more CPUs as well as one or more graphics processing units (GPUs), or any other suitable processing components.
As mentioned above, processor 404 executes the instructions stored in memory 408 to perform, in conjunction with the other components of computing device 400, various functions related to the control of system 100. In the discussion below of those functions, computing device 400 is said to be configured to perform those functions—it will be understood that computing device 400 is so configured via the processing of the instructions in memory 408 by the hardware components of computing device 400 (including processor 404 and memory 408).
Computing device 400 also includes a communications interface 412 interconnected with processor 404, which allows computing device 400 to communicate with other devices, such as the actuators of system 100 described above, as well as other computing devices. Interface 412 thus includes the necessary hardware, such as interface controllers and the like, to communicate with such other devices. As will be seen further below, interface 412 may enable computing device 400 to connect to one or more additional computing devices, which in turn communicate with the actuators of system 100 (rather than computing device 400 itself communicating directly with the actuators of system 100).
Computing device 400 can also include one or more input devices 416 interconnected with processor 404. Example input devices include keyboards, mice, touch screens, touch-sensitive wearable devices such as armbands, and the like. Input device 416 can be connected to processor 404 via interface 412 by any of a variety of connection types, including universal serial bus (USB), Bluetooth™ and the like. Input device 416 can also be connected to a separate computing device, and commands can be relayed to processor 404 via such other computing device rather than received at processor 404 directly from input device 416.
Computing device 400 can also include one or more output devices 420 interconnected with processor 404. Example output devices include a display, speaker, and the like. As shown in
Turning now to
At block 505, computing device 400 is configured to determine a position of a target relative to system 100. In the present example, the target is assumed to be a human player (e.g. employing system 100 as a basketball training tool). In other examples, however, the target need not be a human player. A variety of mechanisms can be employed to determine the position of the target.
In the present embodiment, computing device 400 is configured to perform block 505 via a method shown in
Turning to
Returning to
The features detected at block 610 are not particularly limited. In general, features are selected that permit computing device 400 to detect target 700 within images 704 and 708 and to determine a position of target 700. When target 700 is a human player, for instance, the features detected at block 610 can include faces, bodies and the like. Any suitable detection algorithms can be implemented. For example, the Haar cascade algorithm (e.g. as implemented in the OpenCV library) can be applied to images 704 and 708 for face detection. Other suitable algorithms will also occur to those skilled in the art.
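As a non-limiting illustration of the face-detection example above, the following Python sketch applies OpenCV's bundled frontal-face Haar cascade to a single frame; the parameter values are illustrative assumptions.

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade (shipped with the
# opencv-python distribution).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(image):
    """Return an (x, y, w, h) bounding box for the largest face detected
    in a BGR frame, or None when no face is found.  scaleFactor and
    minNeighbors are typical, illustrative values."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    # Assume the player is the most prominent (largest) face in the frame.
    return max(faces, key=lambda f: f[2] * f[3])
```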
Having detected the above-mentioned features of target 700 in each image, computing device 400 is configured to determine the position, in three dimensions, of target 700 relative to system 100. For example, computing device 400 can transform the locations of the detected features within images 704 and 708 into rays in three-dimensional space, based on the known relative positions of cameras 148. By determining the intersection of the two rays thus generated, computing device 400 can determine the position (including depth) of target 700.
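A minimal sketch of this triangulation step is shown below, assuming the 3x4 projection matrices of cameras 148 are known from a prior stereo calibration; OpenCV's triangulatePoints performs an equivalent least-squares ray-intersection computation.

```python
import cv2
import numpy as np

def triangulate(pt_left, pt_right, P_left, P_right):
    """Estimate the three-dimensional position of a feature observed at
    pixel pt_left in one image and pt_right in the other.  P_left and
    P_right are the 3x4 projection matrices of cameras 148, assumed to be
    known from a prior stereo calibration."""
    pl = np.asarray(pt_left, dtype=float).reshape(2, 1)
    pr = np.asarray(pt_right, dtype=float).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_left, P_right, pl, pr)  # homogeneous 4x1
    return (X_h[:3] / X_h[3]).ravel()  # (x, y, z) in the calibration units
```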
Having determined the position of target 700, computing device 400 can be configured, at block 615, to define a region of interest to be applied to the next frames received from cameras 148. For example, computing device 400 can generate an ROI based on a predicted maximum distance that target 700 will travel before the next images are received. Referring to
Having defined an ROI, computing device 400 returns to the performance of method 500 at block 510. When, on the other hand, the determination at block 605 is affirmative (e.g. when ROIs were already defined for images 704 and 708), computing device 400 proceeds to block 620 rather than block 610. At block 620, computing device 400 is configured to determine whether the above-mentioned features can be detected within the ROIs, rather than within the entire images received from cameras 148. When the determination at block 620 is affirmative, the position of target 700 is determined based on the detected features at block 630 (as described above in connection with block 610), and the ROIs can be updated based on the position determined at block 630 at block 635.
When the determination at block 620 is negative, this indicates that the target 700 has moved further than expected, and that the entire images are to be searched for detectable features, rather than only the ROIs. Thus, computing device 400 is configured to discard the previous ROIs at block 625, and perform block 610 as discussed above. In some embodiments, the use of ROIs can be omitted entirely (e.g. where computational performance is great enough to continuously support full image feature detection).
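The ROI-based tracking logic of blocks 605 to 635 can be summarized by the following Python sketch; the detector callback and the pixel margin are illustrative assumptions.

```python
def track_target(frame, detect, roi=None, margin=80):
    """One iteration of the tracking loop of blocks 605 to 635.  `detect`
    is any detector returning an (x, y, w, h) box or None; `margin` is an
    assumed bound, in pixels, on target motion between frames."""
    box = None
    if roi is not None:
        x0, y0, x1, y1 = roi
        box = detect(frame[y0:y1, x0:x1])  # block 620: search the ROI only
        if box is not None:
            bx, by, bw, bh = box
            box = (bx + x0, by + y0, bw, bh)  # back to full-frame coordinates
        else:
            roi = None  # block 625: target moved further than expected
    if roi is None:
        box = detect(frame)  # block 610: search the entire image
    if box is None:
        return None, None
    # Blocks 615 and 635: grow the detection by the expected maximum travel
    # to form the ROI applied to the next frame.
    x, y, w, h = box
    height, width = frame.shape[:2]
    new_roi = (max(0, x - margin), max(0, y - margin),
               min(width, x + w + margin), min(height, y + h + margin))
    return box, new_roi
```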
A variety of other position-determination processes can be employed by computing device 400. For example, rather than relying on stereoscopic cameras 148, computing device 400 can receive a single image (from a single camera), and accompanying depth data, e.g. from a depth scanner such as a LIDAR device, an ultrasonic sensor, or the like. In other embodiments, the depth scanner can be omitted, and computing device 400 can perform a height calibration process on an image received from a monocular camera to determine the depth of target 700 in the image. In still other embodiments, a motion tracking system, for example based on infrared or near-infrared reflectors on target 700 and corresponding cameras connected to computing device 400, can be implemented. Additional motion tracking systems, such as active infrared or near-infrared (in which target 700 is equipped with IR-emitters rather than reflectors) can also be employed. In still other embodiments, location sensors such as GPS receivers worn by target 700 may be employed to determine the position of target 700, and supply the position to computing device 400.
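As a non-limiting illustration of the height-calibration approach, under a pinhole camera model an object of known real height H that appears h pixels tall lies at a depth of approximately d = f * H / h, where f is the focal length in pixels; the values below are illustrative assumptions.

```python
def depth_from_height(pixel_height, real_height_m=1.8, focal_length_px=700.0):
    """Rough monocular depth estimate from a height calibration: under a
    pinhole model, an object of real height H that appears h pixels tall
    lies at depth d = f * H / h, with f the focal length in pixels.  The
    default values are illustrative assumptions."""
    return focal_length_px * real_height_m / pixel_height

# Example: a 1.8 m tall player imaged 300 pixels tall by a camera with a
# 700-pixel focal length is roughly 4.2 m from system 100.
print(depth_from_height(300))  # 4.2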
Returning to
In connection with system 100 as discussed above, the determination at block 510 involves determining a yaw angle (that is, an angle about axis Y) and an elevation angle (that is, an angle about axis E), as well as a speed for wheels 126 required to propel ball 104 a sufficient distance to reach target 700. In some embodiments, the launch direction and speed can be determined based on known relationships between the speed of wheels 126 and the speed of ball 104 when projected by wheels 126. In other embodiments, launch directions and speeds required to project ball 104 by predetermined distances can be measured or calculated in advance and stored in memory 408, for example in a lookup table. In the present embodiment, memory 408 stores a lookup table containing the required elevation angles and corresponding wheel speeds to achieve projection distances of between two and six meters, at half-meter intervals. Thus, at block 510 computing device 400 is configured to set the yaw angle to point the outlet of launch assembly 114 at target 700, and to retrieve the elevation angle and speed from the above-mentioned lookup table based on the distance between system 100 and target 700 (interpolating between lookup table intervals if necessary).
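The lookup-table retrieval and interpolation at block 510 can be illustrated by the following Python sketch; the table values and the reference direction for the yaw angle are placeholders rather than measured data.

```python
import math

# Illustrative lookup table: projection distance in meters mapped to
# (elevation angle in degrees about axis E, wheel speed in RPM for
# wheels 126).  The values are placeholders, not measured data.
LAUNCH_TABLE = {
    2.0: (35.0, 900),  2.5: (37.0, 1000), 3.0: (39.0, 1100),
    3.5: (41.0, 1200), 4.0: (43.0, 1300), 4.5: (45.0, 1400),
    5.0: (47.0, 1500), 5.5: (49.0, 1600), 6.0: (51.0, 1700),
}

def launch_parameters(system_xy, target_xy):
    """Return (yaw_deg, elevation_deg, wheel_rpm) for a target position,
    interpolating linearly between the half-meter table entries.  The yaw
    angle is measured from an assumed reference direction of axis Y."""
    dx, dy = target_xy[0] - system_xy[0], target_xy[1] - system_xy[1]
    yaw = math.degrees(math.atan2(dy, dx))
    dist = min(max(math.hypot(dx, dy), 2.0), 6.0)  # clamp to the table range
    lo = math.floor(dist * 2) / 2.0
    hi = min(lo + 0.5, 6.0)
    (e0, s0), (e1, s1) = LAUNCH_TABLE[lo], LAUNCH_TABLE[hi]
    t = 0.0 if hi == lo else (dist - lo) / (hi - lo)
    return yaw, e0 + t * (e1 - e0), s0 + t * (s1 - s0)
```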
The angles and speed determined at block 510 can be determined and provided to system 100 in any suitable format (e.g. angles, voltages, and the like).
At block 510, computing device 400 can also be configured to send the launch direction to the positioning subsystem, for controlling the positioning subsystem to place launch assembly 114 in the launch direction while the remainder of method 500 is performed. In some embodiments, this can be omitted, and system 100 can instead remain at rest until a launch is required.
At block 515, computing device 400 is configured to determine whether a launch command has been received. The launch command can be received, for example from input device 416. A variety of launch commands are contemplated. For example, an input provided by target 700 to a touch screen, a wearable armband with a touch-sensitive input, and the like, can be employed. In other embodiments, input device 416 can include a microphone, and the launch command can be an audible command (e.g. the word “go” spoken by target 700). In still other embodiments, the launch command can be generated by computing device 400 itself, for example upon determining that the location of target 700 has remained stationary (or has shifted by less than a threshold amount) for a predetermined period of time. The launch command can also include one or more gestures performed by target 700 and identified by computing device 400 from series of images received from cameras 148.
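As a non-limiting illustration of a launch command generated by computing device 400 itself, the following Python sketch raises a launch trigger when the tracked position has shifted by less than a threshold for a predetermined period; the threshold and hold time are illustrative assumptions.

```python
import time

class StationaryTrigger:
    """Generates a launch command when the tracked position of target 700
    shifts by less than threshold_m for hold_s seconds; both values are
    illustrative assumptions."""

    def __init__(self, threshold_m=0.25, hold_s=1.5):
        self.threshold_m = threshold_m
        self.hold_s = hold_s
        self.anchor = None        # position at which the target settled
        self.anchor_time = None   # time at which it settled there

    def update(self, position, now=None):
        """Feed the latest (x, y) position; returns True when a launch
        command should be generated."""
        now = time.monotonic() if now is None else now
        if self.anchor is None or self._distance(position, self.anchor) > self.threshold_m:
            self.anchor, self.anchor_time = position, now
            return False
        return (now - self.anchor_time) >= self.hold_s

    @staticmethod
    def _distance(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```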
When no launch command is received, the performance of method 500 returns to block 505, and system 100 continues to track the current position of target 700. When a launch command has been received, however, the performance of method 500 proceeds from block 515 to block 520. At block 520, computing device 400 is configured to determine whether various safety conditions are met. For example, computing device 400 can be configured to determine whether target 700 is at or beyond a minimum distance (e.g. two meters) from system 100, and whether the current launch direction is within a predetermined margin of the current location of target 700 (e.g. ten degrees). In some embodiments, the performance of block 520 can be omitted.
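The safety conditions of block 520 can be sketched as follows, using the example thresholds mentioned above (two meters, ten degrees).

```python
def safety_check(distance_m, launch_yaw_deg, target_bearing_deg,
                 min_distance_m=2.0, max_yaw_error_deg=10.0):
    """Sketch of block 520: permit the launch only if target 700 is at
    least the minimum distance from system 100 and the current launch
    direction is within the allowed margin of the target's bearing."""
    yaw_error = abs((launch_yaw_deg - target_bearing_deg + 180.0) % 360.0 - 180.0)
    return distance_m >= min_distance_m and yaw_error <= max_yaw_error_deg
```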
When the determination at block 520 is negative, the performance of method 500 returns to block 505, and system 100 continues to track the position of target 700 (noting that, since a pending launch command has not yet been executed, the determination at block 515 will be affirmative until a launch has been performed). In other embodiments, an error may be generated. For example, the launch command received at block 515 may be discarded and a warning indication (e.g. a flashing light) may be generated, before returning to block 505.
When the determination at block 520 is affirmative (or when block 520 is not implemented), computing device 400 proceeds to block 525. At block 525, computing device 400 is configured to send a launch instruction to system 100. The launch instruction can include the speed determined at block 510, for controlling motors 128 to spin wheels 126 up to the required speed. In some embodiments, the launch instruction can also include an instruction to feeder actuator 134 to extend to push ball 104 into engagement with wheels 126.
In embodiments including dispenser 138 or a similar dispenser, at block 525, computing device 400 can also send an instruction to dispenser 138 to supply the next ball to launch assembly 114. Although not required, the instruction to dispenser 138 can be delayed until confirmation of a successful launch is received (e.g. a speed decrease measured at wheels 126 due to contact with ball 104, or the detection of ball 104 in subsequent images from cameras 148).
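The launch sequence of block 525 can be summarized by the following Python sketch; the driver object and its methods are hypothetical stand-ins for the commands sent over interface 412 and are not part of the specification.

```python
def execute_launch(launcher, wheel_rpm, has_dispenser=True):
    """Sketch of block 525.  `launcher` is a hypothetical driver object;
    its methods stand in for commands sent to the actuators of system 100."""
    launcher.set_wheel_speed(wheel_rpm)   # spin motors 128 up to the required speed
    launcher.wait_until_at_speed()
    launcher.extend_feeder()              # push ball 104 into engagement with wheels 126
    if has_dispenser and launcher.launch_confirmed():
        # e.g. confirmed by a momentary speed drop measured at wheels 126
        launcher.dispense_next_ball()     # release the next ball from dispenser 138
```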
Following the performance of block 525, at block 530 computing device 400 can record various performance data.
For example, computing device 400 can store, in memory 408, the location of target 700 at the time of the launch. Computing device 400 can also, subsequent to the launch, receive input data indicating whether a shot has been taken by target 700 (e.g. whether ball 104 has been returned towards system 100 since the launch of ball 104 at block 525). The time elapsed between the launch and the shot can be stored in memory 408. The input data can be received from input device 416, or can be generated by computing device 400, for example by tracking the position of ball 104 in images from cameras 148. Further, computing device 400 can receive input data indicating whether the above-mentioned shot was a hit or a miss (e.g. whether ball 104 was successfully returned to dispenser 138 via a net and backboard). Such input data can be received from input device 416, or from sensors (not shown) mounted in or near dispenser 138.
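A minimal sketch of the performance data recorded at block 530 follows; the field names are illustrative and not taken from the specification.

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class LaunchRecord:
    """One entry of the performance data recorded at block 530; the field
    names are illustrative rather than taken from the specification."""
    position: Tuple[float, float]      # location of target 700 at launch
    launch_time: float
    shot_time: Optional[float] = None  # when the return shot was taken
    hit: Optional[bool] = None         # whether the shot was a hit or a miss

@dataclass
class SessionLog:
    records: List[LaunchRecord] = field(default_factory=list)

    def record_launch(self, position):
        self.records.append(LaunchRecord(position, time.monotonic()))

    def record_shot(self, hit):
        last = self.records[-1]
        last.shot_time, last.hit = time.monotonic(), hit
```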
Computing device 400 can be configured to present some or all of the data stored at block 530 on output device 420 (e.g. a display). For example, referring to
Computing device 400 can also be configured to generate a variety of additional feedback for target 700 (when target 700 is a player). For example, computing device 400 can be configured to identify locations or groups of locations from which the player is more or less likely to miss shots. Computing device 400 can also, based on successive positions determined at block 505, compute a total distance traveled by target 700 and display that distance (in some instances, along with estimated calorie burn information and the like).
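For example, the total distance traveled can be approximated by summing straight-line segments between successive positions determined at block 505, as in the following sketch.

```python
import math

def total_distance(positions):
    """Approximate the total distance traveled by target 700 by summing
    straight-line segments between successive (x, y) positions from block 505."""
    return sum(math.hypot(b[0] - a[0], b[1] - a[1])
               for a, b in zip(positions, positions[1:]))
```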
Various other embodiments are contemplated for the control subsystem. For example, in some embodiments the control activities performed by computing device 400 above can be divided among more than one device, or can be supplemented by functions performed by other devices.
Turning to
In the embodiment of
The transmission of a launch instruction at block 525 can therefore include sending the speed determined at block 510 from computing device 400 to controller 1000 rather than directly to the actuators of system 100. Controller 1000 can then be configured to implement the launch instruction, for example by controlling motors 128, and instructing feeder actuator 134 to push ball 104 into engagement with wheels 126, as well as instructing dispenser 138 to release the next ball into launch assembly 114.
As will now be apparent to those skilled in the art, since launch commands are passed through auxiliary device 1004, auxiliary device 1004 can perform block 530, rather than computing device 400.
Further variations to the above are contemplated. For example, motors 128 can be controlled to spin each pair of wheels 126 shown in
The scope of the claims should not be limited by the embodiments set forth in the above examples, but should be given the broadest interpretation consistent with the description as a whole.