The present invention relates, in general, to the field of training individuals to improve their performance in their field of work or play. More particularly, present embodiments relate to a system and method for projecting an object toward a target and a trainee (or individual) interacting with the object.
A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions. One general aspect includes a method for training that can include projecting, via a delivery device, one or more objects along one or more pre-determined trajectories toward a target zone, where each of the one or more objects has a diameter that is less than a diameter of a regulation table tennis ball; a trainee attempting to prevent the one or more objects from entering the target zone; and scoring an ability of the trainee to prevent the one or more objects from entering the target zone. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
One general aspect includes a method for training. The method also includes projecting, via a delivery device, an object along a pre-determined trajectory toward a target zone proximate a trainee, where the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring, via a controller, an ability of the trainee to interact with the object at the target zone. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
One general aspect includes a method for training that can include operating of a delivery device in response to a first command from a trainee to perform a training; receiving, by the trainee, a response to the first command from the delivery device, where the response is requesting input from the trainee; providing, via a second command from the trainee, the requested input to the delivery device; adjusting, via a controller, one or more parameters of the delivery device based on the second command; projecting, via the delivery device, an object along a pre-determined trajectory toward a target zone proximate the trainee, where the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring an ability of the trainee to interact with the object at the target zone. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
One general aspect includes a method for training that can include projecting, via a delivery device, a plurality of first objects along a pre-determined trajectory toward a first target zone in a first training field, where each of the plurality of first objects has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a delivery device to deliver the plurality of first objects at a pre-determined location at the first target zone; adjusting one or more parameters of the delivery device based on the first score; projecting, via the delivery device, a plurality of second objects along a pre-determined trajectory toward the first target zone, where each of the plurality of second objects has a diameter less than the diameter of a regulation table tennis ball; and determining, via the controller and the imaging sensor, a second score that indicates an ability of a delivery device to deliver the plurality of second objects at the pre-determined location at the first target zone, where the second score indicates an improved performance of the delivery device compared to the first score. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
One general aspect includes a method for training that can include projecting, via a delivery device, a first object along a pre-determined trajectory toward a first target zone in a first training field, where the first object has a diameter less than the diameter of a regulation table tennis ball; determining, via a controller and an imaging sensor, a first score that indicates an ability of a trainee to interact with the first object at the first target zone; moving the delivery device to a second training field, with a second target zone; projecting, via the delivery device, a second object along the pre-determined trajectory toward the second target zone, where the second object has a diameter less than the diameter of the regulation table tennis ball; determining, via the controller and the imaging sensor, a second score that indicates an ability of the delivery device to project the second object along the pre-determined trajectory to the second target zone; adjusting one or more parameters of the delivery device based on the second score; projecting another object toward the second target zone; determining, via the controller and the imaging sensor, the second score and comparing the second score to a desired score; and repeating the adjusting the one or more parameters based on the second score, projecting the another object, and determining the second score until the second score is substantially equal to the desired score. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
Features, aspects, and advantages of present embodiments will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
The following description in combination with the figures is provided to assist in understanding the teachings disclosed herein. The following discussion will focus on specific implementations and embodiments of the teachings. This focus is provided to assist in describing the teachings and should not be interpreted as a limitation on the scope or applicability of the teachings.
As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having,” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of features is not necessarily limited only to those features but may include other features not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive-or and not to an exclusive-or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present); A is false (or not present) and B is true (or present); or both A and B are true (or present).
The use of “a” or “an” is employed to describe elements and components described herein. This is done merely for convenience and to give a general sense of the scope of the invention. This description should be read to include one or at least one and the singular also includes the plural, or vice versa, unless it is clear that it is meant otherwise.
The use of the words “about”, “approximately”, or “substantially” is intended to mean that a value of a parameter is close to a stated value or position. However, minor differences may prevent the values or positions from being exactly as stated. Thus, differences of up to ten percent (10%) from the stated value are reasonable differences from the ideal goal of exactly as described. A difference greater than ten percent (10%) can be considered a significant difference.
Military, first responders, and tactical officers often need to make quick but accurate decisions under stress. By improving time to recognize aspects of the field around them, they can more quickly determine risks and identify threats. Search and Rescue personnel can work in difficult, stressful, or poor operating environments. Enhanced visual skills can help reduce the time to recognize dangers, individuals, and risks of the situation. Visual skills that can be improved by the training systems in this disclosure include, but are not limited to:
In one embodiment, the projection of the object 30 along the trajectory (40, 42, 44) may be controlled by one or more controllers 28, 29 (also referred to as “controller 28, 29”) capable of controlling various aspects of the process of projection of the object 30, such that the projection is conducted along a predetermined trajectory 40, 42, or 44. The one or more controllers 28, 29 can include only one controller (28 or 29) that can control the aspects of the delivery device 20 and communicate with internal and external data sources for setting parameters of the delivery device 20 to desired values. The one or more controllers 28, 29 can include an internal controller(s) 28 and an external controller(s) 29 that can control the aspects of the delivery device 20 and communicate with each of the controllers and with internal and external data sources for setting parameters of the delivery device 20 to desired values.
A predetermined trajectory can include a trajectory that is estimated (or determined) prior to projection of the object 30. The predetermined trajectory can be selected by the controller 28, 29, which can control one or more components of the delivery device 20 to control the trajectory of the object. The delivery device 20 can include or be communicatively coupled (wired or wirelessly) to the controller 28, 29, which can be configured to control one or more delivery variables associated with delivering the object along a predetermined trajectory 40, 42, or 44. In a non-limiting embodiment, the delivery variables can include position of the device in 3D space (i.e., position in space according to X, Y, and Z planes), angle of the device relative to an intended target or trainee, distance from a target or trainee, intended velocity of the object along the intended trajectory between the device and the target or trainee, spin of the object along the intended trajectory between the device and the target or trainee, weight of the object (by selecting a particular object), surface features of the object (by selecting a particular object), as well as others. Additional delivery variables (or parameters) are defined in the following description at least in regard to
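For illustration, the delivery variables enumerated above could be grouped into a single configuration record that the controller stores per predetermined trajectory. The following is a minimal Python sketch; the class and field names are hypothetical illustrations, not terminology from this disclosure:

```python
from dataclasses import dataclass

@dataclass
class DeliveryVariables:
    """Hypothetical record of the delivery variables a controller might set."""
    position_xyz: tuple   # device position in 3D space (X, Y, Z)
    angle_deg: float      # device angle relative to the target or trainee
    distance_ft: float    # distance from the target or trainee
    velocity_fps: float   # intended object velocity along the trajectory
    spin_rpm: float       # intended spin of the object along the trajectory
    object_id: str        # selects a particular object (weight, surface features)

# Example: one configuration for a fast, flat delivery
fastball = DeliveryVariables(
    position_xyz=(0.0, 0.0, 4.0),
    angle_deg=0.0,
    distance_ft=20.0,
    velocity_fps=60.0,
    spin_rpm=1200.0,
    object_id="smooth-0.5in",
)
```

A controller could keep one such record per predetermined trajectory (e.g., trajectories 40, 42, 44) and apply it before each projection.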
The delivery device 20 can be moved horizontally shown by arrows 60, 62, or vertically shown by arrows 64. The height L1 of the object exiting the delivery device 20 can be adjusted by moving the chassis 22 of the delivery device 20 up or down (arrows 64) a desired distance. This 3D movement of the delivery device 20 can allow users (e.g., coach 4, trainer 4, individual 8, trainee 8, or others) to adjust the position that an object 30 exits the delivery device 20. This can allow the exiting object 30 to be positioned so as to emulate a human or other real-life source for delivery of a regulation object (e.g., a regulation baseball, a regulation softball, a regulation hockey puck, a regulation tennis ball, a regulation table tennis ball, a regulation lacrosse ball, a regulation cricket ball, a regulation football, and a regulation soccer ball) such as by a pitcher for baseball or softball, a quarterback for football, a skeet delivery device for shooting sports, a soccer player making shots on goal, a hockey player making shots on goal, etc. As used herein, a “real-life” event refers to a game, practice session, or tactical situation for which the trainee is training to improve performance. The real-life event would be those events that use regulation equipment to perform the sport or tactical operations or situations.
Additionally, the object 30 trajectory can be projected from the delivery device 20 at an appropriate angle relative to a surface 6. A guide 24 can be used to cause the object to exit the delivery device 20 at an angle and cause the object to experience varied resistance when it is ejected from the guide 24. The guide 24 can include a barrel and a friction device for imparting spin and deflection to the object to project the object 30 along a predetermined trajectory. A controller 28, 29 can control the angle and position of the guide 24, as well as select the predetermined (or desired, or expected) trajectory from a plurality of trajectories or define the predetermined trajectory based on collected data from data sources. In a non-limiting embodiment, each predetermined trajectory (e.g., trajectories 40, 42, 44) can include any parameters needed to setup the delivery device 20 to deliver the object 30 along that particular predetermined trajectory (e.g., trajectories 40, 42, 44). In a non-limiting embodiment, the parameters can include an azimuthal direction of the guide 24 to produce a desired azimuthal direction of an object 30 exiting the delivery device 20. The parameters can also include the amount and location of resistance to be applied to the object as the object is propelled toward the exit of the delivery device 20. These will be described in more detail below with regard to the delivery device 20.
In a non-limiting embodiment, the parameters can also include the force to be applied to the object 30 that will propel the object 30 from the delivery device 20 and cause the object to travel along the predetermined trajectory (e.g., trajectories 40, 42, 44). In a non-limiting embodiment, the force can be applied to the object 30 via pneumatic, hydraulic, electrical, electro-mechanical, or mechanical power sources that can selectively vary the amount of force applied to the object 30. The parameters can also include which one of a plurality of objects 30 and which one of a plurality of barrels 360 should be chosen to provide the desired trajectory. The plurality of objects 30 can have many different features which are described in more detail below. The controller 28, 29 can select the object 30 that is needed to produce the desired trajectory. The controller 28, 29 can control an alert feature 26 (such as turn ON or OFF a light, turn ON or OFF an audible signal, play a synchronized video of a real-life delivery source, etc.) to indicate that an object 30 is about to be projected from the delivery device 20 toward the target zone 50. The alert feature 26 can be any device that can alert the trainee 8 to be ready for the object 30 to exit the delivery device 20.
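The alert-then-project sequence described above can be sketched as a short control routine. This is a hedged illustration only; the function and callback names are assumptions, not part of the disclosure:

```python
import time

def launch_with_alert(fire, alert_on, alert_off, warning_s=1.0):
    """Hypothetical launch sequence: signal the trainee, wait, then project.

    fire, alert_on, and alert_off are callables supplied by the delivery
    device integration; their names are illustrative.
    """
    alert_on()             # e.g., turn ON a light or audible signal
    time.sleep(warning_s)  # give the trainee time to get ready
    alert_off()
    fire()                 # propel the object along the configured trajectory

# Usage with stub callables that record the order of events
events = []
launch_with_alert(
    fire=lambda: events.append("fire"),
    alert_on=lambda: events.append("alert_on"),
    alert_off=lambda: events.append("alert_off"),
    warning_s=0.0,
)
# events is ["alert_on", "alert_off", "fire"]
```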
In a non-limiting embodiment, the object 30 can be a spherical or substantially spherical object used for training purposes. The object 30 may be shaped to represent a desired sport. In a non-limiting embodiment, the object 30 can come in different colors such as white, yellow, orange, red, blue, tan, grey, black, or a luminescent color. The color of the object 30 can be selected for the sport for which the trainee 8 is being trained or for the type of training being used. In a non-limiting embodiment, a colored pattern (e.g., red, yellow, white, green, blue, orange, or black pattern) can be applied on the object 30 to differentiate it from other objects 30. The colored pattern can be used to assist the trainee 8 in focusing intently on the object 30 so that they may pick up and track a particular sports ball quicker. The object may have one or more surface features (e.g., smooth, dimples, bumps, recesses, ridges, grainy texture, etc.) that facilitate delivery along various trajectories. In a non-limiting embodiment, the object 30 can be made from a material such as acrylonitrile butadiene styrene, polylactic acid, calcium carbonate, recycled paper, cotton, foam, plastics, calcites, rubber, a metal such as steel, lead, copper, aluminum, or metal alloys, a plant-based material, or a fungus-based material.
In at least one embodiment, the device can include a magazine that may contain a plurality of objects. The objects 30 in the magazine can be substantially the same, or at least a portion of the objects 30 can have varied characteristics relative to the other objects 30. Object characteristics can include, but are not limited to, shape, size (e.g., longest dimension or length of the object, which in the case of a sphere is the diameter and in the case of a disk is the diameter along a major surface), color, surface features, density, material (e.g., inorganic, organic, metal, polymer, ceramic, or any combination thereof), or any combination thereof. In one embodiment, the delivery device 20 can include a first magazine with a first portion of objects having a first object characteristic, and a second magazine with a second portion of objects having a second object characteristic different from the first object characteristic. In one embodiment, the device is capable of selecting a single object from the first portion or the second portion. Various parameters may be used to select different objects, which may include, but are not limited to, a method of training (e.g., a preselected training protocol), a measured or scored capability of a trainee, a selection by the trainee, or an instruction from one or more devices (e.g., data input from a sensor, such as a sensor associated with an impact device) communicatively coupled to the controller 28, 29.
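As a hedged sketch of the magazine selection just described, a controller might keep a pool of object IDs per magazine and dispense from the magazine that matches the active training protocol. The data layout and function name below are illustrative assumptions:

```python
def select_object(magazines, training_protocol):
    """Hypothetical selection of the next object from one of several magazines.

    magazines: dict mapping a magazine label (e.g., an object characteristic
    used by a training protocol) to a list of object IDs, in dispense order.
    """
    pool = magazines.get(training_protocol)
    if not pool:
        raise ValueError(f"no objects available for {training_protocol!r}")
    return pool.pop(0)  # dispense the next object from that magazine

# Two magazines holding objects with different characteristics
magazines = {"high-contrast": ["orange-1", "orange-2"], "standard": ["white-1"]}
first = select_object(magazines, "high-contrast")
# first == "orange-1"; one object remains in the high-contrast magazine
```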
In a non-limiting embodiment, it can be desirable for the object 30 to be sized such that it is significantly smaller than a corresponding regulation object. A corresponding regulation object is determined based upon the intended sport for which the trainee is training. For example, when training for baseball, the corresponding regulation object would be the regulation size of a baseball. It should be noted that there can be multiple regulation sizes in a particular sport. For example, the size of a soccer ball for professional soccer can be different from the size of a soccer ball for youth soccer, yet both soccer balls are regulation size. Likewise, the size of a football for professional football can be different from the size of a football for youth football, yet both footballs are regulation size. The object 30 of the current disclosure is significantly smaller than any of the regulation sizes for footballs or any other regulation objects. The delivery device of the current disclosure does not project objects of the same size as a regulation object for the intended sport or activity for which the trainee is training.
In one non-limiting embodiment, the difference in size between the object 30 and a corresponding regulation object can be expressed as a value of Lo/Lr, wherein Lo is the largest dimension (i.e., length) of the object 30 and Lr is the largest dimension (i.e., length) of the regulation object. In at least one embodiment, the difference in size (or ratio Lo/Lr) can be less than 0.9 or less than 0.8 or less than 0.7 or less than 0.6 or less than 0.5 or less than 0.4 or less than 0.3 or less than 0.2 or less than 0.1. Still, in another non-limiting embodiment, the difference in size can be at least 0.001 or at least 0.002 or at least 0.004 or at least 0.006 or at least 0.008 or at least 0.01 or at least 0.02 or at least 0.03 or at least 0.05 or at least 0.07 or at least 0.1 or at least 0.15 or at least 0.2 or at least 0.25 or at least 0.3. It will be appreciated that the difference in size between the object 30 and a corresponding regulation object (Lo/Lr) can be within a range including any of the minimum and maximum values noted above, including, for example, but not limited to at least 0.001 and less than 0.9 or within a range of at least 0.001 and less than 0.5 or within a range of at least 0.002 and less than 0.006.
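The ratio Lo/Lr and its range check can be expressed directly in code. The sketch below follows the definition above; the example lengths (a 0.5-inch object against a roughly 2.9-inch regulation baseball) are illustrative values, not figures from the disclosure:

```python
def size_ratio(object_length, regulation_length):
    """Lo/Lr as defined above: largest dimension of the training object
    divided by the largest dimension of the regulation object."""
    return object_length / regulation_length

def within_disclosed_range(ratio, lo=0.001, hi=0.9):
    """Check Lo/Lr against one of the example ranges (at least 0.001
    and less than 0.9); lo and hi can be swapped for the other ranges."""
    return lo <= ratio < hi

# Example: a 0.5-inch object versus an approximately 2.9-inch baseball
r = size_ratio(0.5, 2.9)
assert within_disclosed_range(r)  # r is about 0.17, inside 0.001..0.9
```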
In a non-limiting embodiment, the diameter D1 (see
In another non-limiting embodiment, the diameter D1 of the object 30 can be less than 2.0 inches, less than 1.90 inches, less than 1.80 inches, less than 1.70 inches, less than 1.60 inches, less than 1.50 inches, less than 1.40 inches, less than 1.30 inches, less than 1.20 inches, less than 1.10 inches, less than 1.00 inches, less than 0.90 inches, less than 0.85 inches, less than 0.80 inches, less than 0.75 inches, less than 0.70 inches, less than 0.65 inches, less than 0.60 inches, less than 0.59 inches, less than 0.55 inches, less than 0.50 inches, less than 0.45 inches, less than 0.40 inches.
It will be appreciated that the diameter of the object 30 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 0.05 inches and less than 2.0 inches, or within a range of at least 0.05 inches and less than 1.10 inches, or within a range of at least 0.07 inches and less than 1.00 inch.
In a non-limiting embodiment, the size of the object 30 can be at least 120 times smaller than a baseball, at least 220 times smaller than a softball, at least 400 times smaller than a soccer ball, at least 25 times smaller than a table tennis ball, at least 90 times smaller than a lacrosse ball, at least 40 times smaller than a hockey puck, at least 70 times smaller than a clay pigeon (for shooting sports), or at least 110 times smaller than a cricket ball.
In a non-limiting embodiment, the size of the object 30 can be smaller than a regulation table tennis ball, where the regulation table tennis ball is spherical, and its diameter is 1.57 inches (40 mm), where the diameter D1 of the object 30 can be less than 1.57 inches (40 mm).
In a non-limiting embodiment, the weight of the object 30 can be at least 0.001 ounces, at least 0.002 ounces, at least 0.003 ounces, at least 0.004 ounces, at least 0.005 ounces, at least 0.006 ounces, at least 0.007 ounces, at least 0.008 ounces, at least 0.009 ounces, at least 0.010 ounces, at least 0.011 ounces, at least 0.012 ounces, at least 0.013 ounces, at least 0.014 ounces, at least 0.015 ounces, at least 0.20 ounces, at least 0.25 ounces, at least 0.30 ounces, at least 0.35 ounces, at least 0.40 ounces, at least 0.45 ounces, at least 0.50 ounces, at least 0.55 ounces, or at least 0.60 ounces.
In another non-limiting embodiment, the weight of the object 30 can be less than 10 ounces, less than 9 ounces, less than 8 ounces, less than 7 ounces, less than 6 ounces, less than 5 ounces, less than 4 ounces, less than 3 ounces, less than 2 ounces, less than 1.5 ounces, less than 1 ounce, less than 0.9 ounces, less than 0.8 ounces, less than 0.7 ounces, less than 0.6 ounces, less than 0.5 ounces, less than 0.4 ounces, less than 0.3 ounces, less than 0.2 ounces, less than 0.1 ounces, less than 0.09 ounces, less than 0.08 ounces, or less than 0.05 ounces.
It will be appreciated that the weight of the object 30 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 0.001 ounces and less than 10 ounces, or within a range of at least 0.07 ounces and less than 0.9 ounces, or within a range of at least 0.002 ounces and less than 5 ounces, or within a range of at least 0.002 ounces and less than 1.5 ounces. In a non-limiting embodiment, other sizes and weights of the object 30 can be used with the delivery device 20 to project the object 30 toward the target zone 50.
The weight of the object 30 can be adjusted for different training purposes and achieving various predetermined trajectories (e.g., 40, 42, 44). The weight can depend on the size and materials used for the specific object 30 that support different training processes. The variation of weight can result in speed changes of the object 30.
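One way to see how weight variation results in speed changes is to assume the delivery device imparts a fixed launch impulse, so exit speed scales inversely with mass. That fixed-impulse assumption is the author of this sketch's simplification, not a statement from the disclosure:

```python
def exit_speed(impulse_lb_s, weight_oz):
    """Exit speed (ft/s) for a given launch impulse, assuming the full
    impulse transfers to the object: v = J / m. Weight in ounces is
    converted to mass in slugs (1 lb = 16 oz; m = W / g, g = 32.174 ft/s^2).
    """
    mass_slug = (weight_oz / 16.0) / 32.174
    return impulse_lb_s / mass_slug

# Same impulse, two object weights
v_light = exit_speed(0.05, 0.25)  # 0.25-ounce object
v_heavy = exit_speed(0.05, 0.50)  # 0.50-ounce object
# halving the weight doubles the exit speed for the same impulse
```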
In a non-limiting embodiment, the shape of the object 30 can be substantially spherical. In another non-limiting embodiment, the object can be non-spherical, such as spheroidal. In another non-limiting embodiment, the object 30 can also have surface features (e.g., dimples, divots, holes, recesses, ridges, bumps, grainy textures, etc.) for trajectory modification. The shape of the object 30 can be tailored to emulate certain predetermined trajectories such as knuckle ball throws, kicks from a soccer ball, etc.
In a non-limiting embodiment, the materials that make up the object 30 can be acrylonitrile butadiene styrene, polylactic acid, calcium carbonate, paper, cotton, foam, poly-based plastics or plastics in general, calcites, a metal such as steel, lead, copper, or aluminum, rubber, a plant-based material, or a fungus-based material. In a non-limiting embodiment, the object 30 can be coated with glow-in-the-dark colors. This can be used in various training methods for vision training, such as segmenting training and strike zone training (described later).
In a non-limiting embodiment, the object 30 can be illuminated by ultraviolet lights such as black lights for isolated training processes for vision tracking. Being smaller than the regulation objects, the object 30 can be safer than regulation objects. A user may only need to wear safety glasses or a mask.
The delivery device 20 can be positioned at a distance L2 from a target zone 50 or trainee 8. In a non-limiting embodiment, the distance L2 can be at least 3 feet, at least 4 feet, at least 5 feet, at least 6 feet, at least 7 feet, at least 8 feet, at least 9 feet, at least 10 feet, at least 11 feet, at least 12 feet, at least 13 feet, at least 14 feet, at least 15 feet, at least 16 feet, at least 17 feet, at least 18 feet, at least 19 feet, at least 20 feet, at least 25 feet, at least 30 feet, at least 35 feet, or at least 40 feet.
In another non-limiting embodiment, the distance L2 can be less than 210 feet, less than 205 feet, less than 200 feet, less than 190 feet, less than 180 feet, less than 170 feet, less than 160 feet, less than 150 feet, less than 140 feet, less than 130 feet, less than 120 feet, less than 110 feet, less than 100 feet, less than 90 feet, less than 80 feet, less than 70 feet, less than 60 feet, less than 55 feet, less than 50 feet, less than 45 feet, less than 40 feet, less than 35 feet, less than 30 feet, less than 25 feet, or less than 20 feet.
It will be appreciated that the distance L2 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 5 feet and less than 200 feet, or within a range of at least 5 feet and less than 55 feet, or within a range of at least 15 feet and less than 50 feet, or within a range of at least 15 feet and less than 40 feet, or within a range of at least 5 feet and less than 15 feet, or within a range of at least 10 feet and less than 25 feet.
However, farther distances are achievable with increased power projecting the object 30 toward the target zone 50. In a non-limiting embodiment, the target zone 50 can be a rectangle defined by a height L5 and a width L4 and can represent a relative position in space, or the target zone 50 can be a physical collection device that captures the objects 30 that enter individual target segments 76. The target can be moved up or down (arrows 68,
The target zone 50 can be divided into a plurality of target segments 76 and the controller 28, 29 can initiate the projecting of the object 30 through a predetermined trajectory (e.g., trajectories 40, 42, 44) toward a specific target segment 76 or toward an area outside of the target zone 50 for various training methods. For example, as in baseball or softball training, in the beginning of a training session, the controller 28, 29 (e.g., via selections from a coach/trainer 4, the trainee 8 or another user) can deliver fast balls along the trajectory 42 that can arrive at the target zone 50 in the center target segment 76 (or any other appropriate segment 76). This can be used to help train the trainee 8 to recognize the object 30 and track it through the trajectory 42 through consistent training using the trajectory 42.
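The division of the target zone 50 into target segments 76 can be illustrated with a small mapping from an impact point to a segment index. This is a minimal sketch assuming a rectangular zone divided into a uniform grid; the function, coordinate convention, and 3x3 default are hypothetical:

```python
def segment_index(x, y, zone_w, zone_h, cols=3, rows=3):
    """Map an impact point (x, y), measured from the zone's lower-left
    corner, to a segment index in a cols x rows grid (0 is lower-left,
    counting left to right, then bottom to top). Returns None when the
    point falls outside the target zone."""
    if not (0 <= x < zone_w and 0 <= y < zone_h):
        return None
    col = int(x / (zone_w / cols))
    row = int(y / (zone_h / rows))
    return row * cols + col

# A 3x3 zone 17 inches wide and 24 inches tall: the exact center
# of the zone lands in the center segment (index 4)
assert segment_index(8.5, 12.0, 17.0, 24.0) == 4
```

A controller could use the same mapping in reverse to aim a predetermined trajectory at a chosen segment.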
When scoring of this activity indicates that the trainee 8 has mastered tracking the object 30 through at least a portion of the trajectory 42, then other trajectories can be selected for additional training. These other trajectories can be designed by the trainee 8, the coach 4, other individual, or controller 28, 29 for the particular training method. These other trajectories can also be designed to mimic at least a portion of the trajectories of a sports object that was projected through one or more game trajectories in a real-life event by a real-life athlete. In this type of training, the trainee 8 can train like they are facing the real-life athlete that projected the sports object along the one or more game trajectories. The scoring can be determined via imagery captured by one or more imaging sensors or by a coach/trainer 4 visually observing the interaction of the trainee 8 with the object 30. The controller 28, 29 can analyze the imagery to determine the performance of the trainee 8 to the training goals or criteria for the training method being performed. The controller 28, 29 can then establish a score for the trainee 8, which can be used to provide feedback to the trainee 8, coach/trainer 4, or other user for improving the trainee's performance. The score can be compared to previous scores to identify trends in the trainee's performance.
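As a hedged illustration of how a controller might compute such a score and compare it to previous scores to identify trends, the sketch below uses a simple hit fraction as the metric; the metric and function names are this sketch's assumptions, not the disclosure's scoring method:

```python
def score_session(hits, attempts):
    """Illustrative score: fraction of projected objects the trainee
    successfully interacted with during a session."""
    return hits / attempts if attempts else 0.0

def trend(scores):
    """Compare the latest score to the mean of earlier scores to label
    the trainee's trend as improving, declining, or flat."""
    if len(scores) < 2:
        return "flat"
    baseline = sum(scores[:-1]) / len(scores[:-1])
    latest = scores[-1]
    if latest > baseline:
        return "improving"
    if latest < baseline:
        return "declining"
    return "flat"

# Three sessions with rising hit fractions
history = [score_session(3, 10), score_session(5, 10), score_session(7, 10)]
# trend(history) reports "improving"
```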
For a fast ball simulation, the object 30 can be projected by the delivery device 20 along the trajectory 42. The object 30 can be seen traveling along the trajectory 42 as indicated by the object position 30″. For other trajectories, such as 40, 44 (which can be more complex trajectories), the object 30 can be seen traveling along the trajectory 40, 44 as indicated by positions 30′ and 30″.
In a non-limiting embodiment, the spin 94 can be zero (“0”) RPM, at least 1 RPM, at least 2 RPMs, at least 3 RPMs, at least 4 RPMs, at least 5 RPMs, at least 10 RPMs, at least 20 RPMs, at least 50 RPMs, at least 100 RPMs, at least 200 RPMs, or at least 300 RPMs.
In a non-limiting embodiment, the spin 94 can be less than 120,000 RPMs, less than 116,000 RPMs, less than 115,000 RPMs, less than 110,000 RPMs, less than 105,000 RPMs, less than 100,000 RPMs, less than 90,000 RPMs, less than 80,000 RPMs, less than 70,000 RPMs, less than 60,000 RPMs, less than 50,000 RPMs, less than 40,000 RPMs, less than 30,000 RPMs, less than 20,000 RPMs, less than 15,000 RPMs, less than 14,000 RPMs, less than 13,000 RPMs, less than 12,000 RPMs, less than 11,000 RPMs, less than 10,000 RPMs, less than 9,000 RPMs, less than 8,000 RPMs, less than 7,000 RPMs, less than 6,000 RPMs, or less than 5,000 RPMs.
It will be appreciated that the spin 94 of the object 30 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least zero (“0”) RPMs and less than 11,000 RPMs, or within a range of at least 1 RPM and less than 116,000 RPMs, or within a range of at least 1 RPM and less than 115,000 RPMs, or within a range of at least 100 RPMs and less than 10,000 RPMs.
One or more imaging sensors 32 can be used to capture and record the travel of the object 30 along a predetermined trajectory (e.g., 40, 42, 44). The imaging sensors 32 can be placed at any position around the system 10 with two possible positions indicated in
In a non-limiting embodiment, the imaging sensors 32 can capture physical characteristics of the trainee 8 as well as attributes of a training field 100. Imagery from the imaging sensors 32 can be used by the controller 28, 29 to perform facial recognition of the trainee 8, voice recognition (via audio sensors in the imaging sensors 32), detection and recognition of body movements of the trainee 8, detection and recognition of objects in the training field 100, detection and recognition of body movements of the trainee 8 for controlling the delivery device 20 (e.g., “visual servoing”), detection and recognition of light and color for controlling the delivery device 20, creation of a virtual grid, and combinations thereof to control operation of the delivery device 20.
The detection and recognition of body movements of the trainee 8, and facial recognition of the trainee 8 can be used by the controller 28, 29 to adjust one or more of the parameters of the delivery device 20 to deliver objects to a target zone, where the target zone 50 has been adjusted based on the identified trainee 8. The physical characteristics of the trainee 8 can be retrieved from a database (e.g., trainee database 344 in
The physical characteristics of the trainee 8 can include one or more of the following:
The voice recognition can be used by the controller 28, 29 to identify the trainee 8 or the coach 4 and adjust one or more of the parameters of the delivery device 20 to deliver objects to a target zone, where the target zone 50 has been adjusted based on the identified trainee 8. The voice recognition can also be used to identify whether the trainee 8 or coach 4 is approved for use of the delivery device 20 and enable operation if approved or disable operation if not approved. If approved, then the trainee 8 or coach 4 can issue voice commands to the delivery device 20 to control operation of the delivery device 20, such as pause, resume, start session, select mode, end session, select training session, provide score, provide training statistics, etc.
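The approval gating and command set described above can be sketched as a small dispatch routine. This is an illustrative sketch only; the user identifiers, the command vocabulary, and the function name are assumptions, not the actual control interface of the delivery device 20.

```python
# Hypothetical sketch of voice-command gating: the controller checks whether
# a recognized speaker is approved before acting on any spoken command.
# APPROVED_USERS, COMMANDS, and dispatch_voice_command are illustrative names.

APPROVED_USERS = {"trainee_8", "coach_4"}

COMMANDS = {"pause", "resume", "start session", "select mode", "end session",
            "select training session", "provide score",
            "provide training statistics"}

def dispatch_voice_command(speaker_id: str, utterance: str) -> str:
    """Return the action the delivery device should take for an utterance."""
    if speaker_id not in APPROVED_USERS:
        return "operation disabled: speaker not approved"
    command = utterance.strip().lower()
    if command in COMMANDS:
        return f"execute: {command}"
    return "ignored: unrecognized command"
```

An unapproved speaker is refused before any command is parsed, which mirrors the enable/disable behavior described above.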
The detection and recognition of objects in the training field 100 can be used by the controller 28, 29 to determine the type of activity for which training is to be administered to the trainee 8, where the type of activity can be for a sport (e.g., baseball, softball, cricket, hockey, tennis, table tennis, football, soccer, lacrosse, handball, racket ball, basketball, shooting sports, etc.), tactical training, trainee rehabilitation training, or any other activity that can benefit from improving eye-body coordination of the trainee 8. For example, if the controller 28, 29 detects a home plate near the target zone area in the imagery from the imaging sensor(s) 32, then baseball or softball may be the type of activity to be trained. As another example, if the controller 28, 29 detects a hockey goal near the target zone area in the imagery from the imaging sensor(s) 32, then hockey can be the training activity.
The controller 28, 29 can use the imagery from the imaging sensor(s) 32 to determine the type of sport tool 12 to be used, the characteristics of the training field around the target zone 50, gear worn by the trainee 8, equipment held by the trainee 8, etc. to determine the type of activity for which the trainee 8 is training. The various methods of training described in this disclosure can be used for any or all of the types of training activities on which the trainee 8 can be trained. When the type of activity is determined, then the associated parameters for that type of activity can be retrieved from a database (e.g., activity database 346 in
The detection and recognition of body movements of the trainee 8 can be used by the controller 28, 29 to provide interactive control of the delivery device 20 by the trainee 8. This can be referred to as “visual servoing,” a term used to indicate controlling a robot's movements or actions based on visual actions of the trainee 8. For example, when the type of activity is baseball, the trainee 8 can raise their hand to indicate to the delivery device 20 to pause projection of the object 30 until the trainee 8 is ready. The trainee 8 can then indicate their readiness by lowering their hand, which can indicate to the delivery device 20 to begin the sequence to deliver the next object 30 toward the target zone 50. Other inputs, such as another hand gesture, a voice command, a head nod, etc., can also be used by the trainee 8 to interact with the delivery device 20 to control or adjust projection of the object 30 along the pre-determined trajectory. For example, pointing left, right, up, or down can indicate the area of the target zone to which the trainee 8 wishes the delivery device 20 to deliver the next object 30; waving can indicate for the delivery device 20 to halt delivering objects 30, increase a speed of the next object 30, or decrease a speed of the next object 30, etc.
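The gesture-to-action mapping above can be sketched as a lookup table that a vision pipeline would feed with recognized gesture labels. The gesture names and (command, argument) pairs are assumptions for illustration; a real system would derive them from the imaging sensors 32.

```python
# Illustrative "visual servoing" mapping from recognized trainee gestures
# to delivery-device actions. Gesture labels and actions are assumptions.

GESTURE_ACTIONS = {
    "raise_hand":  ("pause", None),          # hold the next object
    "lower_hand":  ("deliver_next", None),   # trainee signals readiness
    "point_left":  ("aim", "left"),
    "point_right": ("aim", "right"),
    "point_up":    ("aim", "high"),
    "point_down":  ("aim", "low"),
    "wave":        ("halt", None),
}

def interpret_gesture(gesture: str):
    """Translate a detected gesture into a (command, argument) pair,
    or None if the gesture is not a control gesture."""
    return GESTURE_ACTIONS.get(gesture)
```

Returning None for unrecognized gestures lets ordinary trainee movements (a practice swing, for instance) pass through without triggering device commands.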
The detection and recognition of light and color can be used by the controller 28, 29 to control the delivery device 20. For example, the delivery device 20 can be configured to deliver an object 30 to a specific location in the target zone 50, where the location is determined by where a light illuminates a portion of the target zone 50. If multiple objects 30 are to be sequentially projected to the target zone 50, then causing the light to illuminate various locations in the target zone 50 can cause the delivery device 20 to track to deliver the object 30 to the illuminated location.
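The light-tracking idea above, in which the device delivers to whichever portion of the target zone is illuminated, can be sketched by locating the brightest region of an image of the target zone. The 3x3 zone layout and grayscale list-of-lists image format are assumptions made for the sketch.

```python
# Minimal sketch: split a grayscale image of the target zone into a grid of
# zones and return the zone with the highest mean brightness, which the
# controller could then use as the next delivery location. The 3x3 layout
# is an assumption, not the device's actual zone scheme.

def brightest_zone(image: list[list[int]], rows: int = 3, cols: int = 3):
    """Return the (row, col) of the zone with the highest mean brightness."""
    h, w = len(image), len(image[0])
    best, best_zone = -1.0, (0, 0)
    for r in range(rows):
        for c in range(cols):
            ys = range(r * h // rows, (r + 1) * h // rows)
            xs = range(c * w // cols, (c + 1) * w // cols)
            mean = sum(image[y][x] for y in ys for x in xs) / (len(ys) * len(xs))
            if mean > best:
                best, best_zone = mean, (r, c)
    return best_zone
```

Sequentially moving the illuminating light across the target zone 50 would cause successive calls to return different zones, letting the device track the light as described.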
Additionally, the controller 28, 29 can detect color or color patterns to control the delivery device 20. The color or color patterns can indicate a type of activity for which the delivery device 20 is to be used for training. The color or color patterns can indicate a skill level required for the training, or a skill level of the trainee 8. The color or color patterns can indicate the areas of the target zone 50 at which the current trainee 8 performs best, performs worst, or somewhere in between. This can be referred to as a “heat map” where the colors in the heat map indicate performance levels of the trainee 8. The delivery device 20 can be controlled by the controller 28, 29 to deliver objects to areas of the target zone 50 indicated by the heat map to be trouble spots for the trainee 8. These heat maps can be generated from previous training sessions and updated after each training session.
The “heat map” is a type of graph, generally with the same dimensions as a target zone. The graph can be used to portray where a specific batter has a greater percentage of hits within the target zone 50. Although there are a variety of types of heat maps, locations within the strike zone where hits occur can generally be represented by red, orange, and yellow “hot” colors. Specific areas within the target zone 50 represented by blue can portray locations where the trainee 8 is having the least success. These areas can also be known as “soft contact areas.”
These blue or weak areas in a trainee's heat map may indicate a specific vision deficiency for the trainee, such as depth perception, anticipation timing, speed of visual processing, visual reaction, and response timing. An opponent would prefer to throw into the blue areas to have greater probability of success against the trainee 8.
Heat maps can be used in multiple sports where a target zone 50 is used, such as for a goalie in soccer, hockey, or lacrosse. The heat map can represent where in the target zone the goalie is most vulnerable and gives up the most goals. Heat maps can be used in tennis to determine where a player has the least success in returning serves, volleys, etc., on the court or from which side (e.g., backhand or forehand). Heat maps can also be created and utilized in tactical training to determine where (location) a trainee 8 has the slowest recognition and reaction times.
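The heat-map logic above can be sketched as a per-zone tally of attempts and successes, from which the controller could select the weakest ("blue") zone as the next delivery target. The zone keys and class structure are assumptions for illustration, not the system's actual data model.

```python
# Sketch of a trainee heat map: accumulate per-zone success rates from past
# sessions, then pick the lowest-rate zone as the trouble spot to target.

from collections import defaultdict

class HeatMap:
    def __init__(self):
        self.attempts = defaultdict(int)
        self.successes = defaultdict(int)

    def record(self, zone: tuple, success: bool):
        """Update the map after each delivery (e.g., after each session)."""
        self.attempts[zone] += 1
        self.successes[zone] += int(success)

    def success_rate(self, zone: tuple) -> float:
        a = self.attempts[zone]
        return self.successes[zone] / a if a else 0.0

    def weakest_zone(self):
        """Zone with the lowest success rate among attempted zones -- the
        'soft contact area' an opponent or trainer would target."""
        attempted = [z for z in self.attempts if self.attempts[z] > 0]
        return min(attempted, key=self.success_rate) if attempted else None
```

Updating the tallies after each training session gives the session-over-session refresh of the heat map described above.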
The controller 28, 29 can also create a virtual grid 260 that can be displayed to the trainee 8 via a pair of augmented reality goggles to indicate the portion of the grid to which the next object 30 is going to be projected. The grid can represent a target zone 50 in baseball, softball, soccer, hockey, tennis, etc. By providing the anticipated destination of the next object 30, the trainee 8 can focus on interacting with the object 30 without as much emphasis being required for determining its destination and then reacting to that location.
The virtual grid 260 can also be for a horizontal playing surface, such as in tennis, table tennis, cricket, etc. Each serve, groundstroke, volley, etc., in a game or practice can be captured via imaging sensors 32 and the controller 28, 29 and stored in a database (e.g., Game statistics database 36 in
This particular training method can incorporate various colors where an individual section in the grid turns a specific color depicting the type of stroke that was used, for example, red for serves, yellow for volleys, blue for lobs, etc. The trainee 8 can use augmented reality glasses to see the same grid that the delivery device 20 is using and to see where the next stroke will be delivered. The training allows for changing the velocity of strokes to match the skill level of the trainee 8. For example, when a real-time event between world-class professionals has been captured, the locations in the grid can remain the same, but the velocity of strokes can be reduced.
An interactive training method 110 can be used to instruct a trainee 8 throughout the training session. A display 340 can be configured to mediate interaction between the controller 28, 29 and the trainee 8 during training sessions with the delivery device 20. The interactive training method 110 can use interactive videos displayed on the display 340 to provide instructions during the session, provide reminders (such as reminding the trainee of previous training aspects or current expectations of performance or actions), and offer encouragement during the training session. The interactive videos can be tailored to a trainee's current skill level. For example, videos for lower-skilled trainees can focus on more basic skills training to lay a foundation that can be built upon in further training sessions, and videos for higher-skilled trainees can focus on specialized aspects of the training to provide improvements in increasingly specialized skills.
The interactive training method 110 can include pre-recorded video files that can be retrieved from a database that is coupled to the controller 28, 29 and played back on the display 340 at appropriate times during the training session. At the beginning of the training session, a video can instruct the trainee on how to set up the delivery device 20 and how to generally interact with the delivery device 20 during the session. The interactive video can be used to describe how to set up the training field 100 to perform segmenting training, where one or more screens are positioned between the delivery device 20 and the trainee 8 to hide a portion of the trajectory of the object 30 so the amount of time the trainee 8 has to track the object 30 is varied (such as by moving the screens toward or away from the trainee 8). The interactive video can be used to describe how to set up the training field 100 to perform other training methods using the delivery device 20.
The interactive video can describe attributes of the training session prior to projecting an object 30 from the delivery device 20 toward the trainee 8 or a target zone 50. After the delivery device 20 projects one or more objects 30 along predetermined trajectories, then the interactive video on the display 340 can display scoring for the trainee related to their interaction with the objects 30. Based on the scoring, the controller 28, 29 via the interactive video can highlight areas of needed improvement as well as instruct the trainee 8 on how to improve. The interactive video can also suggest other training methods for the trainee 8 that may be more focused on those areas of needed improvement.
If the trainee 8 scoring is above a pre-determined value or score, then the interactive video can recommend additional training methods or sessions that can build on the strengths of the trainee 8 or identify other areas where the trainee 8 may be weak. Based on these recommendations, the trainee 8 can set up the delivery device 20 to perform other training methods. The controller 28, 29 can then recall interactive videos for the new training and display the new interactive videos during the new training session. The controller 28, 29 can use imaging sensors 32 to capture imagery of the trainee's performance and, from analysis of the imagery, determine which instructional interactive video to output to the display 340.
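The score-threshold branching above can be sketched as a small recommendation routine. The threshold value, session names, and function signature are all assumptions made for illustration; the actual pre-determined values would come from the training curriculum.

```python
# Hedged sketch of score-based recommendations: above the pre-determined
# threshold, suggest sessions that build on strengths; below it, suggest
# instructional videos for the weakest areas. Names/values are illustrative.

def recommend_next(score: float, threshold: float = 80.0,
                   weak_areas: tuple = ()) -> list[str]:
    if score >= threshold:
        recs = ["advanced drill building on strengths"]
        recs += [f"exploratory drill: {a}" for a in weak_areas]
    else:
        recs = [f"instructional video: {a}" for a in weak_areas]
        recs.append("repeat current training method")
    return recs
```

The controller could feed the weak-area list from the trainee's heat map, so recommendations track the same trouble spots the scoring identified.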
For example, if a batter drops their shoulder at the incorrect time during their swing to impact the object 30, then the controller 28, 29 can play a video on the display 340 to illustrate to the trainee 8 the desired body movements during swinging at the object 30 prior to the delivery device 20 projecting the next object 30 toward the trainee 8 or target zone 50. For another example, if the controller 28, 29 detects incorrect positioning of a goalie in front of a hockey net or incorrect defensive movements, then the controller 28, 29 can play a video on the display 340 to illustrate the correct positioning or correct defensive movements prior to the delivery device 20 projecting the next object 30 toward the trainee 8 or target zone 50.
In a non-limiting embodiment, the interactive videos may include a demonstration as to why a specific drill is important for vision training, instructions on how to set up the delivery device 20, safety instructions, as well as instructions on how to interact with the delivery device 20 via voice, motion, or other commands (such as via an input device 342). A training curriculum may be several training sessions in length for teaching the trainee 8 full concepts of a training activity or sport, with one or more interactive videos initiated throughout the training sessions to instruct or remind the trainee 8. For example, the training curriculum may be used for baseball/softball trainees learning the “Approach” to hitting, plate or strike zone discipline, 15 seconds of excellence on defense, setting up segmenting screens, etc. For example, the training curriculum may be used for quick reaction drills, where the interactive video can be used to explain how to perform and set up quick reaction drills, how to score the drills, and how to divide into teams. For example, the training curriculum may be used for tennis, where the interactive video can direct a trainee through various ground stroke drills, such as telling the trainee 8 when to switch to backhand.
The interactive videos can be stored in a software format in non-transitory memory in the controller 28, 29 and recalled from a database to be displayed on the display 340 for viewing by the trainee 8. The controller 28, 29 can control when the interactive videos are displayed on the display 340 during the training session. For example, the controller 28, 29 can periodically pause operations of the delivery device 20 to allow an interactive video to be played on the display 340. After the video is played, the controller 28, 29 can then resume operations of the delivery device 20 to perform the desired training method.
The interactive videos can be recorded by a company that manufactures the delivery device 20 or provides training support, and they can be delivered along with the delivery device 20 or as a separate delivery to the customer. Additionally, the interactive videos can be recorded by a coach, a trainee, a parent or guardian of the trainee, or other individual 4 to insert their own video commands as part of the training sessions. For example, a parent overseeing the training can record an instructional video to be played during the training sessions to encourage or instruct the trainee.
For example, a coach can develop their own set of instructional videos to be paired with the various training curriculums and in this way, they can teach multiple players the same instructions without being physically present.
The interactive videos can also be used by the controller 28, 29 to interact with the trainee 8 during training sessions, by requiring inputs from the trainee 8 to progress through the training session. The trainee 8 can command the delivery device 20 to set up for a particular training session by instructing the controller 28, 29, via voice commands, to set up the delivery device 20 for the training session. The delivery device 20 can indicate reception of the voice command by an indicator light or a movement of the delivery device 20. The trainee 8 can then get in position to receive and react to an object 30 projected from the delivery device 20. The trainee 8 can then command the delivery device 20 to begin projecting a sequence of objects from the delivery device 20 by providing a voice command or body movement command to the delivery device 20 (e.g., the controller 28, 29). The trainee 8 can interact with the delivery device 20 throughout the training session via voice commands or body movement commands (e.g., hand movement, head nod, foot movement, sport tool movement, etc.).
The trainee 8 can also use voice commands to tailor a training session to project objects 30 to the trainee 8 based on personalized parameters. For example, prior to the start of the training session, the trainee through voice command can ask the delivery device 20 for a “Personal Option.” The delivery device 20 can receive the command and respond to the command by speaking or displaying “What is your personal option?” The trainee 8 may then respond with the personal options for setting up the delivery device 20. For example, the trainee 8 may request that the delivery device 20 project one or more objects 30 as a “right-hand slider pitch” at a speed of “80 miles per hour (MPH)”, and to a target zone 50 area “7 (low and away).”
The delivery device 20 can respond audibly or visually by asking “Do you have Mask and Safety Glasses on?” The trainee 8 can respond with “Yes” or “No.” If “No,” then the delivery device 20 can pause operation until the trainee 8 responds with “Yes.” The delivery device 20, after recognizing the “Yes” response from the trainee 8, can request the trainee 8 to say “Start” when ready to begin. The “Start” can also be communicated to the delivery device 20 via body movements instead of voice commands. The delivery device 20 can then deliver a set of objects 30 to the trainee 8 based on the personalized parameters provided by the trainee 8 (or coach 4). At the end of the set of objects 30, the delivery device 20 can pause operations, display the trainee's scores for the set of objects 30, and recommend additional training sessions and parameters. The trainee 8 can command the delivery device 20 to continue with the current parameters and project another set of objects 30 one after another toward the trainee 8.
Alternatively, or in addition, the trainee 8 can adjust parameters of the delivery device 20 by again asking the delivery device 20 for a “Personal Option” and repeating the process for commanding the delivery device 20 to set up for delivery of additional objects 30 per the new parameters. Then the trainee 8 can interact with the delivery device 20 via voice commands or body movement commands to progress through the training session.
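The “Personal Option” dialog walked through above follows a simple sequence: request options, confirm safety gear, then wait for “Start.” A state-machine sketch is given below; the state names, prompts, and class interface are assumptions made for illustration, not the device's actual voice protocol.

```python
# Illustrative state machine for the "Personal Option" voice dialog:
# idle -> awaiting_options -> safety_check -> ready -> delivering.

class PersonalOptionDialog:
    def __init__(self):
        self.state = "idle"
        self.options = ""

    def hear(self, utterance: str) -> str:
        u = utterance.strip().lower()
        if self.state == "idle" and u == "personal option":
            self.state = "awaiting_options"
            return "What is your personal option?"
        if self.state == "awaiting_options":
            self.options = u                      # e.g., pitch type, speed, zone
            self.state = "safety_check"
            return "Do you have Mask and Safety Glasses on?"
        if self.state == "safety_check":
            if u == "yes":
                self.state = "ready"
                return "Say 'Start' when ready to begin."
            return "Operation paused until safety gear is confirmed."
        if self.state == "ready" and u == "start":
            self.state = "delivering"
            return f"Delivering set: {self.options}"
        return "Command not recognized in current state."
```

Pausing in the safety-check state until a “yes” is heard mirrors the device pausing operation until the trainee confirms mask and safety glasses.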
The one or more remote controllers 29 (referred to as controller 29) can be communicatively coupled to the local controller 28 via a network 33a that communicatively couples the external network 34 to the internal network 35 (with network 33b not connected). In this configuration, the remote controller 29 can command and control the delivery device 20 components directly without direct intervention of the local controller 28. However, in a preferred embodiment, the controller 29 can be communicatively coupled to the controller 28 via the network 33b, which is not directly coupled to the network 34 (with network 33a not connected). In this configuration, the controller 29 can communicate configuration changes (or other commands and data) for the delivery device 20 to the controller 28, which can then carry out these changes to the components of the delivery device 20. It should be understood that, in another configuration, the networks 33a, 33b, 34, 35 can all be connected with the controllers 28, 29 managing the communications over the networks.
In a non-limiting embodiment, the delivery device 20 can include a guide 24 that can modify the trajectory and spin of the object 30 as the object 30 is projected toward the target zone 50 or trainee 8. The guide 24 can include a barrel 360 with a center axis 90 through which the object 30 can be projected toward a friction device 200. The friction device 200 can have a center axis 92 and can be rotated about the center axis 92 to alter the engagement of the object 30 when it impacts the friction device 200 at position 30″. An object 30 can be received from the object storage area 120 and located at position 30′ in a first end of the barrel 360. A pressurized air source 152 can be fluidically coupled to the first end of the barrel 360 via conduit 158, with delivery of a volume of pressurized air controlled by a valve 154. The valve 154 and the air source 152 can be controlled by the controller 28, 29 to adjust the air pressure applied to the object 30 at position 30′ as well as the volume of air applied. It should be understood that pressurized air is only one possible option for delivering a desired force to the object 30 to project the object 30 through the barrel 360. Pneumatics other than pressurized air can be used as well as hydraulics, electrical, electro-mechanical, or mechanical power sources that can supply the desired force to the object 30 to project the object 30 through the barrel 360.
In a non-limiting embodiment, an air pressure can be at least 3 PSI (i.e., pounds per square inch), at least 4 PSI, at least 5 PSI, at least 6 PSI, at least 7 PSI, at least 8 PSI, at least 9 PSI, at least 10 PSI, at least 20 PSI, at least 30 PSI, at least 40 PSI, at least 50 PSI, at least 60 PSI, at least 70 PSI, at least 80 PSI, at least 90 PSI, or at least 100 PSI.
In another non-limiting embodiment, the air pressure can be less than 220 PSI, less than 210 PSI, less than 200 PSI, less than 190 PSI, less than 180 PSI, less than 170 PSI, less than 160 PSI, less than 150 PSI, less than 140 PSI, less than 130 PSI, less than 120 PSI, less than 110 PSI, less than 100 PSI, or less than 90 PSI.
It will be appreciated that the air pressure may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 5 PSI and less than 220 PSI, or within a range of at least 5 PSI and less than 200 PSI, or within a range of at least 10 PSI and less than 200 PSI, or within a range of at least 5 PSI and less than 180 PSI.
In a non-limiting embodiment, a length of the barrel 360 can be at least 2 inches, at least 3 inches, at least 4 inches, at least 4.5 inches, at least 5 inches, at least 5.5 inches, at least 6 inches, at least 7 inches, at least 8 inches, at least 9 inches, at least 10 inches, at least 11 inches, or at least 12 inches.
In another non-limiting embodiment, the length of the barrel 360 can be less than 48 inches, less than 36 inches, less than 24 inches, less than 23 inches, less than 22 inches, less than 21 inches, less than 20 inches, less than 19 inches, less than 18 inches, less than 17 inches, less than 16 inches, less than 15 inches, less than 14 inches, less than 13 inches, less than 12 inches, less than 11 inches, less than 10 inches, less than 9 inches, less than 8 inches, less than 7 inches, less than 6 inches, or less than 5.5 inches.
It will be appreciated that the length of the barrel 360 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 2 inches and less than 48 inches, or within a range of at least 4.5 inches and less than 24 inches, or within a range of at least 4.5 inches and less than 5.5 inches, or within a range of at least 3 inches and less than 12 inches.
When the valve 154 is actuated, a controlled volume of pressurized air (or other pressurized gas) can be delivered to the first end of the barrel 360 for a predetermined length of time to force the object 30 to be propelled through the barrel 360 at a predetermined velocity, such that at position 30″ the object 30 achieves a desired velocity vector 174. The velocity vector 174 can range from 25 miles per hour to 135 miles per hour. If the friction device 200 is not in a position to interfere with the trajectory 46 of the object 30 as it is propelled from a second end of the barrel 360, then the object 30 may continue along trajectory 46 and exit the delivery device 20 without having additional spin or deflection imparted to the object 30 by the friction device 200. This may be used for delivering “fast balls” along the predetermined trajectory 42 since the object does not engage the friction device 200 before it exits the delivery device 20.
However, if the friction device 200 is positioned to interfere with the object 30 as it is propelled from the second end of the barrel 360, then the object 30 can engage (or impact) the friction device 200 at position 30′″, thereby deflecting the object 30 from the axis 90 of the barrel 360 at an angle and imparting a spin 94 to the object. Impacting the friction device 200 can cause the object 30 to begin traveling along a predetermined trajectory 48 with an altered velocity vector 176 at position 30″″. The amount of spin 94 and the amount of deflection from trajectory 46 to trajectory 48 can be determined by the velocity vector 174 of the object 30 at position 30″, the spin of the object 30 at position 30″, the azimuthal position of the friction device 200 about its center axis 92, the azimuthal position of the friction device 200 about the center axis 90 of the barrel 360, the incline (arrows 89) of the friction device 200 relative to the center axis 90, the length (arrows 88) of the friction device 200, and the surface material on the friction device 200. The object 30 can then continue along the predetermined trajectory 48 to the target zone 50 or toward the trainee 8.
If another trajectory is desired, then the controller 28, 29 can modify the parameters of the delivery device 20 (such as changing the velocity vector 174 and spin of the object 30 at position 30″, changing the azimuthal position of the friction device 200 about its center axis 92, changing the azimuthal position of the friction device 200 about the center axis 90 of the barrel 360, changing the incline (arrows 89) of the friction device 200 relative to the center axis 90, changing the length (arrows 88) of the friction device 200, or changing the surface material on the friction device 200) to deliver a subsequent object 30 along a new predetermined trajectory 48.
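The tunable parameters enumerated above can be collected into one record that the controller adjusts between deliveries. The field names, units, and function below are assumptions drawn from the description (launch velocity and spin, friction-device azimuths about axes 92 and 90, incline, length, and surface material), not the device's actual data model.

```python
# Sketch of the per-delivery parameter set the controller 28, 29 could
# modify to produce a new predetermined trajectory. Names/units assumed.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DeliveryParams:
    launch_speed_mph: float
    launch_spin_rpm: float
    friction_azimuth_92_deg: float   # rotation about friction device axis 92
    friction_azimuth_90_deg: float   # rotation about barrel center axis 90
    friction_incline_deg: float      # incline relative to axis 90
    friction_length_in: float
    surface_material: str

def adjust_for_new_trajectory(p: DeliveryParams, **changes) -> DeliveryParams:
    """Return a new parameter set with the requested changes; the frozen
    dataclass leaves the previous delivery's settings intact for logging."""
    return replace(p, **changes)
```

Keeping the record immutable means each delivered object's exact settings can be stored alongside the imaging-sensor record of its flight.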
In a non-limiting embodiment, in addition to these parameters mentioned above, there are also parameters of the barrel position and delivery device 20 chassis 22 position that can be used to alter a trajectory of an object 30 to travel along a predetermined trajectory (e.g., 40, 42, 44) to a target zone (or trainee 8). Some of these parameters affect the orientation of the barrel 360 within the delivery device 20, while others can affect the orientation and position of the chassis 22 of the delivery device 20 relative to a surface 6, while others affect selecting an object 30 to be propelled from the barrel 360. In a non-limiting embodiment, all these parameters can have an impact on the trajectory of the object 30 as it is projected from the delivery device 20 toward the target zone 50 or trainee 8.
The barrel 360 can be rotated (arrows 86) about its center axis 90. This can be beneficial if the barrel 360 includes a non-smooth inner surface, such as an internal bore of the barrel 360 with rifling grooves (i.e., a surface with helically oriented ridges or grooves along the internal bore of the barrel 360) that can impart a spin (clockwise or counterclockwise) to the object 30 as the object 30 travels through the internal bore of the barrel 360. Other surface features can also be used on the internal bore of the barrel 360 to affect the spin of the object 30 as it travels through the barrel 360.
The barrel 360 can be rotated (arrows 84) about the axis 91 to adjust the direction of the object 30 as it exits the barrel 360. The barrel 360 can also be moved (arrows 85) to adjust a distance between the exit end of the barrel 360 and the friction device 200.
The friction device 200 can be coupled to a structure (e.g., structure 210 via support 202) that can be used to rotate the friction device 200 about the center axis 90 of the barrel 360. This can be used to change the deflection angle imparted to the object 30 when it impacts the friction device 200 at position 30′″.
The chassis 22 can be rotationally mounted to a base 18 at pivot point 148. Actuators 144 can be used to rotate the chassis 22 about the X-axis (arrows 81) or the Y-axis (arrows 82) relative to the surface 6 by extending/retracting. There can be four actuators 144 positioned circumferentially about the center axis 93. The base 18 can rotate the chassis 22 about the Z-axis (arrows 80) relative to the surface 6. The support 142 can be used to raise or lower (arrows 83) the chassis 22 relative to the surface 6. Supports 146 can be used to stabilize the support 142 to the support structure 160. The support structure 160 can have multiple wheels 164 with multiple axles 162 to facilitate moving the support structure 160 along the surface 6 in the X and Y directions (arrows 166, 168). The support structure 160 can house an optional controller 169 for controlling the articulations of the base 18 to orient the chassis 22 in the desired orientation. This controller 169 can be positioned at any location in or on the base 18 as well as in or on the chassis 22. It is not required that the controller 169 be disposed in the support structure 160.
In a non-limiting embodiment, the delivery device 20 can include one or more storage bins 150 for storing objects 30 and delivering an object 30 to the barrel 360 at position 30′. In the example shown in
However, the conduit 156 can be a collection conduit that receives each object 30a or 30b and holds them in chronological order in the conduit 156 as to when they were received at the conduit 156 from the storage bins 150a, 150b. A mechanism 155 can be used to release the next object (30a or 30b) into the barrel 360 at position 30′, thereby delivering the objects 30a, 30b to the barrel 360 in the order they were received at the conduit 156. Even if only one object 30a, 30b is released to the conduit 156, the mechanism 155 can still be used to prevent the escape of pressurized gas into the conduit 156. However, the mechanism 155 is not required. Other means can be provided to prevent loss of pressurized gas through any path other than the barrel 360.
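The collection conduit 156 described above behaves as a first-in-first-out queue: objects arrive from the storage bins in some order, and the release mechanism feeds the oldest one into the barrel. A minimal sketch, with illustrative names:

```python
# FIFO sketch of conduit 156: objects are held in arrival order, and the
# release mechanism (155 in the description) feeds the oldest into the
# barrel. Class and method names are assumptions for the sketch.

from collections import deque

class Conduit:
    def __init__(self):
        self._queue = deque()

    def receive(self, obj_id: str):
        """An object drops in from a storage bin (e.g., 150a or 150b)."""
        self._queue.append(obj_id)

    def release_next(self):
        """Release the oldest object into the barrel, or None if empty."""
        return self._queue.popleft() if self._queue else None
```

A deque gives constant-time append and pop from opposite ends, matching the one-way flow of objects from bins to barrel.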
In a non-limiting embodiment, the velocity vector 174, 176, 178 can be a velocity directed in any 3D direction, with the velocity of the object 30 being at least 4 MPH (i.e., miles per hour), at least 5 MPH, at least 6 MPH, at least 7 MPH, at least 8 MPH, at least 9 MPH, at least 10 MPH, at least 15 MPH, at least 20 MPH, at least 25 MPH, at least 30 MPH, at least 35 MPH, at least 40 MPH, at least 45 MPH, at least 50 MPH, at least 55 MPH, at least 60 MPH, at least 65 MPH, at least 70 MPH, at least 75 MPH, at least 80 MPH, at least 90 MPH, or at least 100 MPH.
In another non-limiting embodiment, the velocity vector 174, 176, 178 can be a velocity directed in any 3D direction, with the velocity of the object 30 being less than 220 MPH, less than 210 MPH, less than 200 MPH, less than 190 MPH, less than 180 MPH, less than 170 MPH, less than 160 MPH, less than 150 MPH, less than 145 MPH, less than 140 MPH, less than 135 MPH, less than 130 MPH, less than 125 MPH, less than 120 MPH, less than 115 MPH, less than 110 MPH, less than 105 MPH, less than 100 MPH, less than 95 MPH, less than 90 MPH, less than 85 MPH, less than 80 MPH, less than 75 MPH, less than 70 MPH, less than 65 MPH, less than 60 MPH, less than 55 MPH, less than 50 MPH, less than 45 MPH, or less than 40 MPH.
It will be appreciated that the velocity of the object 30 at the velocity vector 174, 176, 178 may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 5 MPH and less than 75 MPH, or within a range of at least 15 MPH and less than 100 MPH, or within a range of at least 15 MPH and less than 220 MPH.
In a non-limiting embodiment, the friction device 200 can include a ramp 206 with one or more surface materials attached to it. The surface material controls a friction applied to the object 30 when the object 30 impacts the friction device 200. Therefore, it can be beneficial to allow the delivery device 20 to automatically select between various surface materials (e.g., 204, 205, 208). One side of the ramp 206 can have multiple surface materials 204, 205 attached thereto. Moving the friction device 200 axially (arrows 88) can cause the object to impact either the surface material 204 or 205. If the surface materials 204, 205 have different textures or friction coefficients, then impacting one or the other can alter the spin 94 or trajectory 48 of the object 30 when it impacts the friction device 200. The ramp 206 can also have one or more surface materials (e.g., 208) attached to an opposite side of the ramp 206. The ramp 206 can be configured to rotate about the axis 92 such that the surface material 208 is positioned to impact the object 30 at position 30′″. The surface materials 204, 205, 208 can be various wool fibrous materials, plastics, cottons, foam rubbers, metals such as steel, lead, copper, aluminum, or metal alloys, plant-based material, or fungus-based material.
In a non-limiting embodiment, the surface material 204, 205, 208 can have a friction coefficient that is at least 0.010, at least 0.015, at least 0.020, at least 0.025, at least 0.030, at least 0.035, at least 0.040, at least 0.045, at least 0.050, at least 0.055, at least 0.060, at least 0.065, at least 0.070, at least 0.075, at least 0.080, at least 0.085, at least 0.090, at least 0.095, at least 0.10, at least 0.15, at least 0.20, at least 0.30, at least 0.40, at least 0.50, at least 0.60, at least 0.70, at least 0.80, at least 0.90, or at least 1.00.
In another non-limiting embodiment, the surface material 204, 205, 208 can have a friction coefficient that is less than 1.50, less than 1.45, less than 1.40, less than 1.35, less than 1.30, less than 1.25, less than 1.20, less than 1.15, less than 1.10, less than 1.05, less than 1.00, less than 0.95, less than 0.90.
It will be appreciated that the friction coefficient may be within a range including any one of the minimum and maximum values noted above, including, for example, but not limited to at least 0.20 and less than 1.35, or within a range of at least 0.01 and less than 1.50, or within a range of at least 0.25 and less than 1.35.
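As an illustrative sketch only, the automatic selection between surface materials (e.g., 204, 205, 208) could be modeled as choosing the attached material whose friction coefficient is closest to a desired value; the material names and coefficient values below are hypothetical, not taken from the disclosure:

```python
# Hypothetical materials and friction coefficients for illustration.
SURFACE_MATERIALS = {
    "wool": 0.40,
    "foam_rubber": 0.80,
    "steel": 0.15,
}

def select_surface_material(desired_mu: float) -> str:
    """Pick the attached material whose friction coefficient best
    matches the desired coefficient, emulating the axial movement
    (arrows 88) or rotation of the friction device."""
    return min(SURFACE_MATERIALS, key=lambda m: abs(SURFACE_MATERIALS[m] - desired_mu))
```

Under these assumed values, a desired coefficient near 0.1 would select the lowest-friction material, altering the spin 94 or trajectory 48 accordingly.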
In a non-limiting embodiment, the display 340 can be used to display performance scores to a user (i.e., trainee 8, coach 4, other individual, etc.), GUI interface windows, training trajectory (single or multiple), emulated game trajectory, and player 14 associated with the game trajectory, video of game trajectory, video of training trajectory while or after object is projected to target zone, training statistics and trends, selection criteria for objects 30, selection criteria for training trajectories 40, delivery device 20 parameters and parameters selected by the input device. For example, in baseball or softball training, the display 340 can be used to display a type of pitch, the speed of delivery of object 30 at the target zone 50, a location of delivery of the object 30 at the target zone 50, text messages about the delivered object 30, animations, videos, photos, or alerts about the delivered object 30. The display is intended to provide the trainee 8 or coach 4 immediate feedback about the delivered object 30. The input device 342 and display 340 are shown separate, but they can be integrated together in a device, such as a smartphone, smart tablet, laptop, touchscreen, etc.
The network interface 332 can manage network protocols for communicating with external systems (e.g., controller 29, database 36, imagery sensors 32, tracking device 190, etc.) to facilitate communication between the processor(s) 330 and the external systems. These external systems are shown connected to the network 34, but they can also be disconnected and reconnected as needed. For example, the tracking device 190 may not be connected to the network until it is positioned on a docking station for downloading its acquired data. Additionally, the delivery device 20 may not always be connected to an external network. When it is reconnected to an appropriate external network, the communication between the external systems can again be enabled.
In a non-limiting embodiment, the processor(s) 330 can be communicatively coupled to a non-transitory memory storage 37 which can be used to store program instructions 334 and information in databases 38, 336, 338, 344, 346. The processor(s) 330 can store and read instructions 334 from the memory 37 and execute these instructions to perform any of the methods and operations described in this disclosure for the delivery device 20. The delivery device parameters (see parameters described above) for each training trajectory 40 can be stored in the delivery device parameter database 38 in the memory 37. This database 38 can be organized such that each training trajectory 40 that has been defined by a set of delivery device parameters can have a trajectory entry in the database 38. When this trajectory entry is accessed, the set of delivery device parameters can be transferred to the processor(s) 330, which can use the parameters to adjust the delivery device 20 components to deliver the predetermined trajectory defined by the trajectory entry.
If a user wishes to define a canned sequence of trajectories, then the processor(s) 330 (based on inputs from the input device) can assemble the sequence of trajectories including their associated delivery device parameters and store the sequence in the sequential trajectory database 336 as a retrievable set of predetermined trajectories. When accessed by the processor(s) 330, the sequential trajectory database 336 can deliver the set of predetermined trajectories to the processor(s) 330 including the delivery device parameters. The processor(s) 330 can then sequentially setup the delivery device 20 to sequentially project objects one after another to produce the desired set of predetermined trajectories in the desired order. The memory 37 may also contain a game trajectory database 338 which stores the game parameters of the game trajectories that have been received from other sources (such as the tracking device 190, the game statistics database 36, or user inputs) and can save them for later emulation by the delivery device 20.
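In a non-limiting illustration, the organization of the trajectory entries and the sequential trajectory database 336 could be sketched as follows; the field names and class structure are assumptions made for the example, not the disclosed data layout:

```python
from dataclasses import dataclass

@dataclass
class TrajectoryEntry:
    """One trajectory entry: a set of delivery device parameters that
    defines a predetermined trajectory (hypothetical fields)."""
    name: str
    speed_mph: float
    azimuth_deg: float
    elevation_deg: float
    spin_rpm: float = 0.0

class SequentialTrajectoryDatabase:
    """Sketch of database 336: stores named sequences of predetermined
    trajectories for retrieval in their stored order."""
    def __init__(self):
        self._sequences = {}

    def store_sequence(self, name, entries):
        self._sequences[name] = list(entries)

    def retrieve_sequence(self, name):
        # Entries come back in the stored order so the delivery device
        # can project objects one after another.
        return list(self._sequences[name])
```

A retrieved sequence would then be walked entry by entry, with each entry's parameters used to set up the delivery device before the next projection.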
For example, as the object 30 is projected along the trajectory 40, the trainee 8 can attempt to track the object 30 with their eyes. As the object 30 continues along the trajectory 40 the trainee 8 can continue to move their eyes to track the object 30. The imaging sensors 32 can be used to capture imagery that contains the trajectory 40 of the object and the movements of the eye (or eyes) and a time stamp of the movements. The imagery can be transmitted to the controller 28, 29 which can be configured to analyze the trajectory 40 to determine the parameters of the trajectory 40, such as the 3D position of the object 30 in space along the trajectory 40 and the velocity vectors (e.g., 176) of the object 30 as it traveled along the trajectory 40.
The controller 28, 29 can also be configured to analyze the recorded eye movements of the trainee's eye to determine the direction from the eye of the center line of sight 250 of the eye. At any position along the trajectory 40 (such as position 30′), the controller 28, 29 can correlate the object position along the trajectory 40 with the eye movements based on syncing the time of the position of the object 30 along the trajectory 40 (e.g., position 30′) with the time of the eye movements of the trainee 8. With the center line of sight 250 correlated to the object position (e.g., 30′), then the controller 28, 29 can calculate a deviation L9 between the object 30 and the center line of sight 250. Calculating a deviation L9 for multiple positions along the trajectory 40 can be used to score the ability of the trainee 8 to track the object 30 along the trajectory 40. The larger the deviation L9, the lower the score. The deviations L9 can be plotted vs. time to display to the user (trainee 8, coach 4, another individual, etc.) for understanding areas of strength or weakness of the trainee 8 in tracking the object 30 along the trajectory 40.
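A minimal sketch of the deviation calculation described above follows: the deviation L9 is taken as the perpendicular distance from the object's time-synced 3D position to the ray defined by the eye position and the center line of sight 250, and a mean deviation is mapped to a score (larger deviation, lower score). The scoring thresholds are assumed values, as the disclosure does not specify a scale:

```python
import math

def deviation_from_sight_line(eye_pos, sight_dir, object_pos):
    """Perpendicular distance from object_pos to the ray starting at
    eye_pos with unit direction sight_dir (center line of sight 250)."""
    rel = [o - e for o, e in zip(object_pos, eye_pos)]
    dot = sum(r * d for r, d in zip(rel, sight_dir))
    closest = [e + dot * d for e, d in zip(eye_pos, sight_dir)]
    return math.dist(object_pos, closest)

def tracking_score(deviations, full_marks_at=0.05, zero_at=0.5):
    """Map a mean deviation (e.g., meters) to a 0-100 score; the
    threshold values are illustrative assumptions."""
    mean_dev = sum(deviations) / len(deviations)
    if mean_dev <= full_marks_at:
        return 100.0
    if mean_dev >= zero_at:
        return 0.0
    return 100.0 * (zero_at - mean_dev) / (zero_at - full_marks_at)
```

Deviations computed at multiple positions along the trajectory 40 can then be averaged, scored, and plotted versus time as described above.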
The method for tracking movement of a trainee's eye (or eyes) and correlating the eye movement to the positions of the object 30 along the trajectory (e.g., 40) can be used with any of the training systems 10 described in this disclosure. For example, during segmenting training 118, the correlation between the trajectory 40 and the eye movement can be used to score the trainee's ability to track the object 30 through the end portion of the trajectory 40 that can be reduced (e.g., barrier 220 and possibly the light source 230 moved to respective positions 220′, 230′) as the score is improved, or increased (e.g., barrier 220 and possibly the light source 230 moved to original positions) as the score is unchanged or worse.
It should also be understood that a coach 4 or another individual can score the ability of the trainee 8 to track the object 30 along the trajectory 40 by visually observing the trainee 8 as they attempt to track the object 30. This can be seen as being somewhat less precise than the method of correlating the eye movements to the object positions along the trajectory 40 using the controller 28, 29. However, this manual correlation can also be used to improve the trainee's ability to track the object 30 along the trajectory 40.
The training system 10 in
The strike zone training 119 can occur when the delivery device 20 sequentially projects objects 30 along a predetermined trajectory (e.g., trajectory 40) and the trainee 8 indicates when they believe the object 30 arrives within the target zone 50 by providing a user input to the controller 28, 29 via an HMI device 170. The target zone 50 can include sensors 51 as previously described. These sensors 51 can detect a location in the target zone at which the object 30 arrives. The trainee 8 can actuate or interact with the HMI device 170 to indicate if they think the object 30 arrived in the target zone 50, and the HMI device can transmit the indication to the controller 28, 29, which can compare the indication with actual arrival location of the object 30. The controller 28, 29 can also determine if the object did not arrive within the target zone 50, either due to a lack of indication from the sensors 51 that the object 30 arrived at the target zone 50 or possibly sensors (not shown) that are positioned outside the target zone 50.
A high score can be when the indication is received from the HMI device 170 and the object 30 arrives within the target zone 50, or when the object 30 does not arrive in the target zone 50 and no indication is received from the HMI device 170. A low score can be when the indication is not received from the HMI device 170 and the object 30 arrives within the target zone 50, or when the object 30 does not arrive in the target zone 50 and an indication is received from the HMI device 170.
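The high/low scoring rule above reduces to a simple truth table: the score is high exactly when the trainee's indication agrees with whether the object actually arrived in the target zone 50. A hedged sketch, with the numeric score values being assumptions:

```python
def strike_call_score(indicated: bool, arrived_in_zone: bool) -> int:
    """Return 1 for a correct call (indication matches the actual
    arrival in the target zone) and 0 for an incorrect call."""
    return 1 if indicated == arrived_in_zone else 0
```

Both "indicated and arrived" and "not indicated and did not arrive" score high; the two mismatched cases score low.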
The controller 28, 29 can average the individual scores over a period of time or over multiple objects 30 delivered toward the target zone 50. This average score (as well as the individual scores) can be used to provide feedback to the trainee 8 (or the coach 4, other individual, or the controller 28, 29) for improving the trainee's performance in recognizing which objects 30 arrive in the target zone 50 and which do not. Training with the smaller object 30 can allow the trainee 8 to recognize regulation game objects more easily during a real-life event and thereby more easily recognize balls and strikes in the real-life event. This training 119 can be well suited for baseball, softball, cricket, soccer, or any sport with a target area for receiving a game object. However, the strike zone training 119 can also be used for less well-suited sports or tactical situations to improve eye-body coordination of a trainee 8.
The strike zone training 119 can also be used to improve the trainee's ability to recognize when the object 30 arrives at the target zone 50. The trainee 8 can send an indication via the HMI device 170 to the controller 28, 29 when they believe the object 30 arrives at the target zone 50. The controller 28, 29, via comparison to the sensor data received from the sensors 51, can determine a score based on the comparison of the time of arrival of the object 30 at the target zone 50 and the time at which the indication is initiated at the HMI device 170 by the trainee 8.
The indication from the HMI device 170 can be initiated by:
Additionally, a good performance of the trainee 8 regarding the object 30 can be when the actual arrival position is inside the target zone and the indication is received from the HMI device 170 within a pre-determined amount of time before the actual arrival time, or when the actual arrival position is outside the target zone and no indication is received from the HMI device within a pre-determined amount of time before the actual arrival time. A bad performance of the trainee 8 regarding the object 30 can be when the actual arrival position is outside the target zone and the indication is received from the HMI device 170, or when the actual arrival position is inside the target zone and no indication is received from the HMI device 170, or when the actual arrival position is inside the target zone and the indication is received from the HMI device 170 past a pre-determined amount of time prior to the actual arrival time.
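The timing rule above can be sketched as follows: an indication counts as good performance only when the object actually arrives inside the target zone and the indication is received within a pre-determined window before the actual arrival time. Times are in seconds, and the default window length is an assumed value, not one specified in the disclosure:

```python
def performance_good(in_zone: bool, indication_time, arrival_time, window: float = 0.25) -> bool:
    """Return True for a good performance per the rule above.
    indication_time is None when no indication was received."""
    if indication_time is None:
        # Correctly withholding the indication is good only when the
        # object arrives outside the target zone.
        return not in_zone
    # The indication must precede arrival by no more than the window.
    timely = 0.0 <= arrival_time - indication_time <= window
    return in_zone and timely
```

An indication sent well before the pre-determined window (or any indication for an object arriving outside the zone) is treated as bad performance.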
This vision training for the trainee 8 using various methods of training with the delivery device 20 can be used to focus on improving specific visual abilities of the trainee 8, such as depth perception, visual reaction time and response timing, speed of visual processing, and eye-body-coordination.
For example, in a target zone training method (e.g., baseball strike zone, soccer goal area, etc.), the delivery device 20 projects the object 30 to specific areas of the target zone 50 to score the trainee's ability to determine if the object is inside or outside the perimeter of the target zone 50, or to score the ability of a trainee 8 to prevent the object 30 from entering the target zone 50 for a goal at a variety of trajectories and speeds. The visual abilities of the trainee 8 are required to perform at higher and higher levels of efficiency to correctly interact with the object 30 at the target zone 50.
For example, a quick reaction training method can also be used to further enhance and improve the visual abilities of the trainee 8. In the quick reaction training, the delivery device 20 is configured to deliver the object 30 toward the trainee 8 in a variety of challenging speeds and trajectories. The object 30 is much smaller than a sport regulation object (e.g., less than the size of a table tennis ball). A goal of the trainee 8 is to prevent the object 30 from entering the target zone 50. The delivery device 20 can continue projecting an object 30 toward the trainee 8 or target zone 50, while the trainee 8 must visually acquire the object 30, track the object 30 along at least a portion of its trajectory to the target zone 50, and cause the body to react in a way (e.g., hand movement, foot movement, head movement, sport tool movement, etc.) as to prevent entry of the object 30 in the target zone 50.
The quick reaction training method can also incorporate segmenting training using one or more screens (described previously) to make the reaction and response of the trainee 8 more challenging. When the one or more segmenting screens are in place, the trainee 8 cannot see the delivery device 20. Therefore, the trainee has a shorter amount of time to visually acquire the object 30, since the trainee 8 cannot see the object 30 until it has passed around, under, over, or through the one or more segmenting screens. As the one or more segmenting screens are progressively placed in closer proximity to the trainee 8, the quick reaction training becomes increasingly more challenging, and quicker visual and physical reactions are needed. The quick reaction training method can score the trainee's visual reaction time, the trainee's response timing, and the trainee's eye-body coordination. The scoring can indicate if the trainee 8 is having difficulty in specific areas of the target zone 50.
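The effect of moving a segmenting screen closer to the trainee can be illustrated with simple arithmetic: the visible flight time available to the trainee is the screen-to-target distance divided by the object's speed. The units and example values below are illustrative, not from the disclosure:

```python
# Miles per hour to feet per second.
MPH_TO_FPS = 5280.0 / 3600.0

def visible_reaction_time(screen_to_target_ft: float, speed_mph: float) -> float:
    """Seconds the trainee has to react after the object clears the
    segmenting screen."""
    return screen_to_target_ft / (speed_mph * MPH_TO_FPS)
```

For example, with a screen 22 feet from the target zone and an object traveling 60 MPH (88 feet per second), the trainee has 0.25 seconds of visible flight; halving the screen distance halves that time.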
The cognitive recognition training method can use the delivery device 20 to project objects 30 of different colors toward the trainee 8. The trainee 8 can then attempt to identify the color as soon as the object 30 is delivered from the delivery device 20 or as it passes around, under, over, or through the one or more segmenting screens. The trainee 8 can be scored on their ability to quickly identify the object 30 and its color. The cognitive recognition training method works to improve the trainee's speed of visual processing.
Scoring for these visual ability training methods can be stored in a database for later retrieval and analysis. The analysis of the scores can indicate trends in performance of the trainee to the various training methods. The scores can also be used to produce or update an individual “heat map” for the trainee 8 to indicate areas of the target zone that the trainee 8 is strongest, weakest, and average. The heat map can be used to control the delivery device 20 to cause the delivery device 20 to project objects 30 to the weaker areas indicated by the heat map.
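A minimal sketch of the heat map concept follows: per-cell average scores are accumulated over a grid laid over the target zone 50, and the weakest cell can be returned so the delivery device 20 can be directed to project objects 30 there. The grid-cell representation is an assumption made for illustration:

```python
from collections import defaultdict

class TargetZoneHeatMap:
    """Accumulates per-cell scores over a hypothetical grid covering
    the target zone and reports the trainee's weakest cell."""
    def __init__(self):
        self._scores = defaultdict(list)  # (row, col) -> list of scores

    def record(self, cell, score):
        self._scores[cell].append(score)

    def average(self, cell):
        vals = self._scores[cell]
        return sum(vals) / len(vals)

    def weakest_cell(self):
        # The cell with the lowest average score is where the delivery
        # device should concentrate its projections.
        return min(self._scores, key=self.average)
```

The same per-cell averages can also be rendered as strongest/average/weakest areas for display to the trainee 8 or coach 4.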
Referring again to
The delivery device 20 can be removably attached to a tripod base via a quick release tripod plate. The delivery device 20 can be detached from the tripod base, moved to the new location and reattached to another tripod base, an elevator system, a gantry system, a platform, or a mounting surface via the quick release tripod plate.
The trainee 8 or other individual 8 can then power up the delivery device 20 by connecting the delivery device 20 to a power source and energizing the delivery device 20 components. After receiving power, the controller 28, 29 can begin a power-up procedure that can be referred to as an awareness protocol. The controller 28, 29 can check to see if the imaging sensors 32 are connected and in communication with the controller 28, 29 as well as checking communication to the other delivery device 20 components. The controller 28, 29 can then perform any required initialization protocols for the delivery device 20 components. Various sensors, including the imaging sensors 32, can be used to sense the environment of the training field 100 as well as the internal conditions, such as reading the accelerometer to check the orientation of the delivery device 20, motors can be moved to a pre-determined location to resync the motor positions with the controller 28, 29. The controller 28, 29 can then establish connections with peripheral devices, such as a phone, the internet, Augmented Reality glasses, smart TVs, light sources, sensors at the target zone 50, and impact sensors 56. The controller 28, 29 can determine if there are objects 30 available in the storage bin 120, and if the propulsion device to propel the object has power and is ready to project an object 30 from the delivery device 20.
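In a non-limiting illustration, the awareness protocol could be organized as a set of named checks that each report success or failure, with the controller collecting the failures before training begins; the check names and structure below are assumptions for the example:

```python
def run_awareness_protocol(checks: dict) -> list:
    """Run each named power-up check (a callable returning True on
    success) and return the names of the checks that failed."""
    return [name for name, check in checks.items() if not check()]
```

For example, checks named "imaging_sensors", "orientation", "objects_in_bin", and "propulsion_ready" could each wrap the corresponding hardware query, and a non-empty failure list would block projection until resolved.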
Instructional videos can be played on the display 340 to instruct individuals 4 (which can include the trainee 8) to construct a backdrop at the target zone 50, construct a netting tunnel from pipes and drapes, and position the target zone 50. The individuals 4 can set up these items based on the training method to be performed, or the delivery device 20 can sense the training to be performed and then instruct the individuals 4 to construct the training field 100 accordingly.
Before projecting one or more objects 30 from the delivery device 20 to the target zone 50, the delivery device 20 can be calibrated. A laser pointer on the delivery device 20 can be used to adjust the delivery device 20 to be aimed in the general area where the desired target zone 50 will be. The delivery device 20 can then begin projecting an object 30 to the target zone 50 and capturing the trajectory of the object via the imaging sensors 32. Each captured trajectory can be compared to a desired pre-determined trajectory to determine a score of how well or poorly the delivery device 20 is projecting the object along the desired pre-determined trajectory. The controller 28, 29 scores the performance of the delivery device 20 to project the object 30 along the pre-determined trajectory, and based on the scoring, can adjust one or more parameters of the delivery device 20 to improve the score. This process can be repeated as often as needed to verify that the delivery device 20 is operating properly. After the delivery device 20 has been calibrated and has an acceptable score for delivering the object 30 along the pre-determined trajectory, the trainee 8 can commence their training sessions.
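The calibration procedure above is a feedback loop: project, score the captured trajectory against the desired pre-determined trajectory, adjust parameters, and repeat until the score is acceptable. A hedged sketch follows, where `project_and_score` and `adjust` are stand-ins for the delivery device 20 and controller 28, 29 behavior rather than disclosed interfaces:

```python
def calibrate(project_and_score, adjust, acceptable: float, max_rounds: int = 10):
    """Repeat the project/score/adjust cycle until the score reaches
    the acceptable threshold or the round limit is hit.
    Returns (final_score, rounds_used)."""
    score = project_and_score()
    rounds = 1
    while score < acceptable and rounds < max_rounds:
        adjust(score)  # tune delivery device parameters based on score
        score = project_and_score()
        rounds += 1
    return score, rounds
```

Once the returned score meets the acceptable threshold, the delivery device is considered calibrated and training sessions can commence.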
Embodiment 1. A method for training comprising:
projecting, via a delivery device, one or more objects along one or more pre-determined trajectories toward a target zone, wherein each of the one or more objects has a diameter that is less than a diameter of a regulation table tennis ball;
a trainee attempting to prevent the one or more objects from entering the target zone; and
scoring an ability of the trainee to prevent the one or more objects from entering the target zone.
Embodiment 2. The method of embodiment 1, wherein the diameter of each of the one or more objects is at least 0.05 inches (1.27 mm) and less than 1.4 inches (35.56 mm).
Embodiment 3. The method of embodiment 1, wherein the target zone is a baseball strike zone and the trainee is training for baseball.
Embodiment 4. The method of embodiment 1, wherein the target zone is a soccer goal and the trainee is training for soccer.
Embodiment 5. The method of embodiment 1, wherein the trainee attempts to prevent the one or more objects from entering the target zone by swinging a sport tool at the one or more objects to impact the one or more objects.
Embodiment 6. The method of embodiment 1, wherein the trainee attempts to prevent the one or more objects from entering the target zone by intersecting the one or more pre-determined trajectories with a hand, a foot, a head, an arm, or a leg of the trainee.
Embodiment 7. The method of embodiment 1, wherein the one or more objects are colored objects, and the method further comprises training the trainee to recognize and identify a color of each of the one or more objects before the one or more objects reach the target zone.
Embodiment 8. The method of embodiment 7, further comprising:
positioning one or more screens between the delivery device and the trainee, wherein the one or more screens block the trainee from viewing the one or more objects along at least a proximal portion of the one or more pre-determined trajectories;
projecting the one or more objects around, under, over, or through the one or more screens; and
recognizing and identifying the color of each of the one or more objects as the one or more objects travel along a distal end portion of the one or more pre-determined trajectories.
Embodiment 9. The method of embodiment 8, further comprising moving the one or more screens toward the target zone to reduce an amount of time the trainee has to recognize and identify a color along the distal end portion of the one or more pre-determined trajectories.
Embodiment 10. The method of embodiment 1, further comprising scoring depth perception, anticipation timing, speed of visual processing, visual reaction timing, response timing, or combinations thereof of the trainee.
Embodiment 11. A method for training comprising projecting, via a delivery device, an object along a pre-determined trajectory toward a target zone proximate a trainee, wherein the object has a diameter that is less than a diameter of a regulation table tennis ball; and scoring, via a controller, an ability of the trainee to interact with the object at the target zone.
Embodiment 12. The method of embodiment 11, further comprising:
prior to projecting the object, displaying, via a display, a video to a trainee, wherein the video explains setup for the delivery device and a training field; and
setting up the delivery device and the training field based on the video.
Embodiment 13. The method of embodiment 11, further comprising:
displaying the scoring of the trainee on the display; and
displaying, via the display, a video to the trainee, wherein the video explains how to improve the ability of the trainee to interact with the object at the target zone.
Embodiment 14. The method of embodiment 13, further comprising:
projecting, via the delivery device, one or more objects toward the target zone proximate the trainee;
scoring an ability of the trainee to interact with the one or more objects at the target zone; and
displaying the scoring of the trainee on the display.
Embodiment 15. The method of embodiment 13, further comprising:
adjusting one or more parameters of the delivery device based on the video;
projecting a second object along a second pre-determined trajectory toward a target zone; and
scoring the ability of the trainee to interact with the second object at the target zone.
Embodiment 16. The method of embodiment 11, further comprising:
selecting, via the controller, a training method based on the scoring;
selecting a video based on the training method, wherein the video instructs the trainee on how to perform the training method, how to score in the training method, what skills are being targeted by the training method, or combinations thereof;
displaying the video to the trainee via the display;
executing, via the delivery device, the training method for the trainee; and
scoring performance of the trainee.
Embodiment 17. A method for training comprising:
operating of a delivery device in response to a first command from a trainee to perform a training;
receiving, by the trainee, a response to the first command from the delivery device, wherein the response is requesting input from the trainee;
providing, via a second command from the trainee, the requested input to the delivery device;
adjusting, via a controller, one or more parameters of the delivery device based on the second command;
projecting, via the delivery device, an object along a pre-determined trajectory toward a target zone proximate the trainee, wherein the object has a diameter that is less than a diameter of a regulation table tennis ball; and
scoring an ability of the trainee to interact with the object at the target zone.
Embodiment 18. The method of embodiment 17, wherein the first command is a voice command, a hand gesture, a head movement, or a body movement of the trainee.
Embodiment 19. The method of embodiment 17, wherein the response to the first command is displayed to the trainee via a display, which is communicatively coupled to the controller.
Embodiment 20. The method of embodiment 17, further comprising:
projecting, via the delivery device, a plurality of objects along one or more pre-determined trajectories toward a target zone proximate the trainee;
scoring the ability of the trainee to interact with the plurality of objects at the target zone; and
displaying one or more videos to the trainee during the projecting of the plurality of objects toward the target zone.
Embodiment 21. The method of embodiment 20, wherein the one or more videos are instructional videos to remind the trainee about training goals, to reinforce proper performance techniques for interacting with the plurality of objects at the target zone, to encourage the trainee during the training, to share scores with the trainee during the training, or to compare trainee performance to performance of a professional athlete.
Embodiment 22. A method for training comprising:
projecting, via a delivery device, a plurality of first objects along a pre-determined trajectory toward a first target zone in a first training field, wherein each of the plurality of first objects has a diameter less than the diameter of a regulation table tennis ball;
determining, via a controller and an imaging sensor, a first score that indicates an ability of a delivery device to deliver the plurality of first objects at a pre-determined location at the first target zone;
adjusting one or more parameters of the delivery device based on the first score;
projecting, via the delivery device, a plurality of second objects along a pre-determined trajectory toward the first target zone, wherein each of the plurality of second objects has a diameter less than the diameter of a regulation table tennis ball; and
determining, via the controller and the imaging sensor, a second score that indicates an ability of a delivery device to deliver the plurality of second objects at the pre-determined location at the first target zone, wherein the second score indicates an improved performance of the delivery device compared to the first score.
Embodiment 23. The method of embodiment 22, further comprising:
moving the delivery device to a second training field, with a second target zone; and
repeating the projecting of the first objects, determining the first score for the delivery device, adjusting the one or more parameters, the projecting of the second objects, and determining the second score for the delivery device for the second target zone instead of the first target zone, wherein the second score for the second target zone indicates an improved performance of the delivery device compared to the first score for the second target zone.
Embodiment 24. A method for training comprising:
projecting, via a delivery device, a first object along a pre-determined trajectory toward a first target zone in a first training field, wherein the first object has a diameter less than the diameter of a regulation table tennis ball;
determining, via a controller and an imaging sensor, a first score that indicates an ability of a trainee to interact with the first object at the first target zone;
moving the delivery device to a second training field, with a second target zone;
projecting, via the delivery device, a second object along the pre-determined trajectory toward the second target zone, wherein the second object has a diameter less than the diameter of the regulation table tennis ball;
determining, via the controller and the imaging sensor, a second score that indicates an ability of the delivery device to project the second object along the pre-determined trajectory to the second target zone;
adjusting one or more parameters of the delivery device based on the second score;
projecting another object toward the second target zone;
determining, via the controller and the imaging sensor, the second score and comparing the second score to a desired score; and
repeating the adjusting the one or more parameters based on the second score, projecting the another object, and determining the second score until the second score is substantially equal to the desired score.
Embodiment 25. A system for training configured to perform any of the methods described in this disclosure.
While the present disclosure may be susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and tables and have been described in detail herein. However, it should be understood that the embodiments are not intended to be limited to the particular forms disclosed. Rather, the disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure as defined by the following appended claims. Further, although individual embodiments are discussed herein, the disclosure is intended to cover all combinations of these embodiments.
Number | Date | Country
---|---|---
63267373 | Jan 2022 | US