The present invention relates to the field of training persons to aim firearms at moving targets.
The ability to hit a moving target is an important skill for many types of shooters (e.g. military, law-enforcement, hunters). Hitting even a static target at a distance requires a shooter to aim above the target, because the bullet drops under gravity throughout its flight. A more distant target requires a higher point of aim.
Scopes often have reticle patterns (markings inside the scope) which help a shooter to judge the amount by which they should aim high. Referring to
In addition, and referring to
The required lead can be expressed as a distance (e.g. metres) or an angle (degrees or, more commonly, Minutes Of Angle (MOA), where 1 MOA is 1/60 of a degree).
One known approach to teaching shooters the art of hitting a moving target is to ask trainee shooters to memorise a table of leads (see
Trainee shooters generally practise shooting very few specific distance/speed/direction combinations: e.g. they may only practise against a target at 100 m moving at 1 m/s, left-to-right at 90 deg to the shooter. As a result, trainee shooters generally memorise the required leads for the specific speed/distance/direction combinations that they practise, and fail to memorise the table.
Even if they could memorise the entire table, there is a big difference between knowing the table and being able to recall it and apply it in the heat of the moment.
One needs to:
There remains a need to provide improved methods of training for trainee shooters.
In a first aspect the present invention provides a method of training a person in the aiming of firearms at moving targets including the steps of: providing at least one moving target; calculating the correct lead for the at least one target; displaying a visualisation of the correct lead to the person.
The steps of calculating and displaying may be carried out repeatedly to provide an ongoing near real-time visualisation of the correct lead.
The step of calculating the correct lead may be based on any of the location of the target, the velocity of the target, the acceleration of the target or the direction of travel of the target.
The location of the target may include the elevation of the target.
Information relating to the location of the target, the velocity of the target or the direction of travel of the target may be derived from the target.
Information relating to the location of the target, the velocity of the target or the direction of travel of the target may be derived from a target control system.
The step of calculating the correct lead may be based on the location of the person.
The location of the person may include the elevation of the person.
The location of the person may be determined from a sensor such as a GPS sensor placed on or near the person.
The location of the person may be obtained from a previously configured control system.
The calculation of the correct lead may be based on wind speed.
The visualisation may include a representation of the target.
The representation of the target may include a visual indication of the direction of travel of the target relative to the person's line of sight.
The representation of the target may include a visual indication of the distance to the target.
The distance to the target may be indicated by the height of the representation of the target.
The visualisation may include a visual indication of the velocity of the target.
The visual indication of the velocity of the target may include a moving background image which moves to indicate the component of the velocity of the target in a direction orthogonal to the direction from the person to the target.
The visual indication of the velocity of the target may include visual cues in the form of arm or leg movements or leaning of the representation of the target.
The method may further include the step of providing an indication to the person of the accuracy of at least one shot which they fired.
The indication may include an indication of whether the shot was leading or lagging the target.
The indication may include an indication of by how much the shot was leading or lagging the target.
The indication of the accuracy of the shot may be calculated based on the output of acoustic sensors.
The visual indication may be displayed by overlaying it in a weapon-sight.
In a second aspect, the present invention provides a system for training persons in the aiming of firearms including: at least one target which is arranged to move about an area; calculation means for calculating the correct lead for the at least one target; display means for displaying a visualisation of the correct lead.
The system may be arranged to repeatedly carry out the steps of calculating and displaying to provide an ongoing near real-time visualisation of the correct lead.
The system may be arranged to calculate the correct lead based on any of the location of the target, the velocity of the target, the acceleration of the target or the direction of travel of the target.
The location of the target may include the elevation of the target.
Information relating to the location of the target, the velocity of the target, the acceleration of the target or the direction of travel of the target may be derived from the target.
Information relating to the location of the target, the velocity of the target, the acceleration of the target or the direction of travel of the target may be derived from a target control system.
The system may be arranged to calculate the correct lead based on the location of the person.
The location of the person may include the elevation of the person.
The calculation of the correct lead may be based on wind speed.
The visualisation may include a representation of the target.
The representation of the target may include a visual indication of the direction of travel of the target relative to the person's line of sight.
The representation of the target may include a visual indication of the distance to the target.
The distance to the target may be indicated by the height of the representation of the target.
The visualisation may include a visual indication of the velocity of the target.
The visual indication of the velocity of the target may include a moving background image which moves to indicate the component of the velocity of the target in a direction orthogonal to the direction from the person to the target.
The visual indication of the velocity of the target may include visual cues in the form of arm or leg movements or leaning of the representation of the target.
The system may be further arranged to provide an indication to the person of the accuracy of at least one shot which they fired.
The indication may include an indication of whether the shot was leading or lagging the target.
The indication may include an indication of by how much the shot was leading or lagging the target.
The target may include acoustic sensors, and the indication of the accuracy of the shot may be calculated based on the output of the acoustic sensors.
The visual indication may be displayed by overlaying it in a weapon-sight.
Embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Embodiments of systems and methods according to the invention will now be described. The embodiments are based on modified versions of robotic target systems produced by the applicant and as described, for instance, in applicant's published patent publications WO2011/035363, WO2016/134413 & WO2017/083906, which are incorporated herein by reference.
Referring to
The targets 31, 32 operate under the command of the control system 11 to move about the training area. The trainees must attempt to hit the targets. The training system helps the trainees learn by calculating and displaying the correct lead. There are two approaches to determine the relative position and velocity between the shooter and the target needed for calculating the correct lead.
The first approach relies on knowing the absolute position and velocity of both the target and the shooter and calculating the relative values. The speed and direction of travel of the targets are known to the control system 11 at all times. The location of each shooter 21, 22 is determined by the control system either by placing a sensor such as a GPS sensor on or near the shooters, or by configuring shooter locations in the software of control system 11. Once configured, the shooter locations may be updated as needed.
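By way of illustration only, the following sketch (in Python) shows how the relative geometry used in the lead calculation could be derived under the first approach from absolute positions and velocities; the function and variable names, the coordinate frame and the numeric values are illustrative assumptions and not part of any particular embodiment.

import math

def relative_geometry(shooter_pos, target_pos, target_vel):
    """Derive the quantities needed for the lead calculation from absolute
    horizontal positions (metres, in a common local frame) and the target
    velocity (m/s)."""
    # Vector from the target back towards the shooter (the reversed line of sight)
    to_shooter = (shooter_pos[0] - target_pos[0], shooter_pos[1] - target_pos[1])
    distance = math.hypot(*to_shooter)
    speed = math.hypot(*target_vel)
    if distance == 0 or speed == 0:
        return distance, speed, 0.0
    # Heading angle at the target between its direction of travel and the
    # target-to-shooter line (0 = head-on, 90 degrees = fully crossing)
    cos_heading = (to_shooter[0] * target_vel[0] + to_shooter[1] * target_vel[1]) / (distance * speed)
    heading = math.acos(max(-1.0, min(1.0, cos_heading)))
    return distance, speed, heading

# Example: shooter at the origin, target 100 m away, crossing left-to-right at 3 m/s
print(relative_geometry((0.0, 0.0), (100.0, 0.0), (0.0, 3.0)))  # (100.0, 3.0, ~1.571 rad)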
The second approach relies on estimating the relative position and velocity between the shooter and the target directly. The distance to the target could be measured with a laser range finder, estimated using a camera if the dimensions of the target are known, or configured statically in software. The speed of the target and its direction of motion can be estimated using computer vision techniques. This approach requires detecting the target object in the camera frame, separating the object from the background, and estimating the object's orientation and motion in 3D space. Some targeting systems today attempt to perform these functions in an operational environment. Performing all these steps on a battlefield is very challenging because the appearance of the target and the environment is extremely diverse and unknown a priori. The complexity is reduced in the context of training: the tasks of target detection, tracking, and range and pose estimation are simplified because the target appearance is known. Machine learning can be applied effectively if a large dataset of target images is collected. Further simplifications can be achieved by placing distinctive markers (fiducials) on the moving targets.
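As a hedged sketch of the camera-based range estimate mentioned above, a simple pinhole-camera model gives the range from the target's known physical height and its apparent height in the image; the focal length (expressed in pixels) and the detected pixel height are assumed inputs from camera calibration and a detector, and the numbers below are illustrative only.

def estimate_range(target_height_m, pixel_height, focal_length_px):
    """Pinhole-camera range estimate: range = focal_length * real_height / pixel_height."""
    return focal_length_px * target_height_m / pixel_height

# Example: a 1.8 m target mannequin appearing 35 pixels tall through a lens
# whose focal length is 2000 pixels
print(round(estimate_range(1.8, 35, 2000), 1))  # ~102.9 m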
Combinations of both approaches are possible. For example the speed and direction of motion of the targets could be supplied by the target system while the distance from the shooter to the target is measured by a laser range finder at the firing line.
Control system 11 determines factors affecting the shooters' bullet velocity-profile. The control system can be configured with the weapon and ammunition type being used by the shooters. The control system may measure or be configured with the current temperature, altitude, and elevation differences between shooters 21, 22 and targets 31, 32. The correct lead is calculated by the control system 11. The lead is then displayed on display screen 12, and updated in real time.
Bullets decelerate as they fly through the air. Bullet velocity profiles for a specific combination of bullet, cartridge, and weapon are well known or can be obtained experimentally. Further refinements are possible to account for the effect of specific atmospheric conditions.
The distance between the shooter 21 and the target 31 is known due to continuous localisation of the target. The time it takes the decelerating bullet to cover that distance can be looked up from the velocity profile table. The average bullet speed can then be calculated by dividing the shooter-to-target distance by the bullet flight time.
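The table-lookup step can be sketched as follows (illustrative only); the time-of-flight figures below are invented for a hypothetical cartridge and would, in practice, come from published ballistic data or measurement.

import bisect

# Illustrative (range in metres, time of flight in seconds) pairs for a
# hypothetical cartridge; real values come from ballistic tables or testing.
TOF_TABLE = [(0, 0.000), (100, 0.125), (200, 0.263), (300, 0.415), (400, 0.584)]

def time_of_flight(distance_m):
    """Linearly interpolate the bullet flight time for the given range."""
    ranges = [r for r, _ in TOF_TABLE]
    i = min(max(bisect.bisect_left(ranges, distance_m), 1), len(TOF_TABLE) - 1)
    (r0, t0), (r1, t1) = TOF_TABLE[i - 1], TOF_TABLE[i]
    return t0 + (t1 - t0) * (distance_m - r0) / (r1 - r0)

def average_bullet_speed(distance_m):
    """Average speed over the whole flight = distance / time of flight."""
    return distance_m / time_of_flight(distance_m)

print(round(average_bullet_speed(103), 0))  # ~798 m/s with the table above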
The goal is to find the angle between the line of sight to the target and the line of fire which ensures that the bullet intercepts the moving target. This angle is known as the “lead angle”. The geometry is defined by a triangle formed by a) the current distance between the shooter and the target, b) the distance covered by the bullet to the intercept point, and c) the distance covered by the target to the intercept point. The law of cosines relates the lengths of the triangle's sides to the heading angle of the target relative to the shooter. The resulting equation can be solved for the time to intercept, which in turn gives the intercept coordinates and the lead angle.
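The intercept calculation described above can be written out explicitly. With d the current shooter-to-target distance, v_t the target speed, v_b the average bullet speed and phi the heading angle at the target, the law of cosines gives (v_b^2 - v_t^2) t^2 + 2 d v_t cos(phi) t - d^2 = 0, which is solved for the time to intercept t; the lead angle then follows from the law of sines. The sketch below is illustrative only, with an assumed average bullet speed of about 800 m/s, which yields roughly 14 MOA at 103 m and 3.3 m/s, consistent with the display example described later.

import math

def lead_solution(distance_m, target_speed, heading_rad, bullet_speed):
    """Solve the intercept triangle described above.

    distance_m   -- current shooter-to-target distance (m)
    target_speed -- target speed (m/s)
    heading_rad  -- angle at the target between its direction of travel and
                    the target-to-shooter line (rad)
    bullet_speed -- average bullet speed over the flight (m/s)
    Returns (time_to_intercept_s, lead_angle_moa).
    """
    # Law of cosines rearranged into a quadratic in the time to intercept t:
    # (v_b^2 - v_t^2) t^2 + 2 d v_t cos(phi) t - d^2 = 0
    a = bullet_speed ** 2 - target_speed ** 2
    b = 2.0 * distance_m * target_speed * math.cos(heading_rad)
    c = -distance_m ** 2
    t = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)  # positive root
    # Law of sines: sin(lead) / (v_t * t) = sin(phi) / (v_b * t)
    lead_rad = math.asin(target_speed * math.sin(heading_rad) / bullet_speed)
    return t, math.degrees(lead_rad) * 60.0  # lead in minutes of angle

# Example: 103 m crossing target at 3.3 m/s, 90 degree heading, ~800 m/s average bullet speed
print(lead_solution(103.0, 3.3, math.radians(90.0), 800.0))  # ~(0.13 s, 14.2 MOA)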
The system has information about all targets and the graphical user interface can be configured to operate in any of several modes as follows:
Referring to
A snapshot of the display shown on display screen 12 is shown in a box labelled 62. Display 62 visualises what the trainee shooter 21 should see through his/her scope. The target 31 is visualised by a to-scale target mannequin 64 which includes 3D and leaning effects. The background 63 (abstract ‘bushes’) moves at an appropriate speed, to give the viewer a sense of the target's speed. The ground is represented by horizontal line 66.
The correct lead is indicated numerically at number field 67. In this example the correct lead is indicated in minutes of angle (14 MOA). The actual distance to the target (103 m) and the velocity of the target (3.3 m/s) are also shown.
The trainee uses the display screen 12 as a guide to assist them in correctly aiming their own weapon at target 31. What they see in their own gun-sight should correspond with the representation 62.
The target mannequin 64 is displayed at its correct size relative to the reticle 65. The reticle 65 size is fixed (because its distance from the shooter's eye does not change) but the target mannequin 64 takes up fewer pixels at longer ranges.
If the target is in fact accelerating or decelerating, its speed will change while the bullet is in flight, so the correct lead is slightly different. The targets 31, 32 and/or control system 11 know the planned acceleration profile of the targets, so this can also be taken into account in the calculation of the correct lead.
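If the planned acceleration profile is available, the intercept time can be refined iteratively rather than solved in closed form. The sketch below is illustrative only, assumes a constant signed acceleration along the target's current heading, and re-uses the triangle relationships above; the function names are assumptions rather than part of any embodiment.

import math

def lead_with_acceleration(distance_m, target_speed, target_accel,
                           heading_rad, bullet_speed, iterations=5):
    """Fixed-point refinement of the lead when the target accelerates.

    Guess a flight time, advance the target by s = v*t + 0.5*a*t^2 along its
    heading, recompute the bullet flight time to that predicted point from the
    law of cosines, and repeat; return the lead angle in minutes of angle.
    """
    t = distance_m / bullet_speed  # initial guess: flight time to the current position
    for _ in range(iterations):
        travel = target_speed * t + 0.5 * target_accel * t * t
        intercept_dist = math.sqrt(distance_m ** 2 + travel ** 2
                                   - 2.0 * distance_m * travel * math.cos(heading_rad))
        t = intercept_dist / bullet_speed
    lead_rad = math.asin(travel * math.sin(heading_rad) / intercept_dist)
    return math.degrees(lead_rad) * 60.0

# Example: the same crossing target as before, now accelerating at 1 m/s^2
print(round(lead_with_acceleration(103.0, 3.3, 1.0, math.radians(90.0), 800.0), 1))  # ~14.5 MOA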
Referring to
Referring to
In windy conditions the shooter 21 must add an offset for wind. This could also be visualised, based on wind-speed that is either configured or measured.
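One simple way to visualise a wind offset (a hedged approximation only, not a full ballistic solution) is the classic 'lag time' rule, under which the crosswind drift is roughly the crosswind speed multiplied by the difference between the actual flight time and the flight time the bullet would have had in a vacuum; the muzzle velocity and flight time below are assumed inputs.

import math

def wind_offset_moa(distance_m, crosswind_ms, muzzle_velocity, flight_time_s):
    """Approximate crosswind offset, returned in minutes of angle so that it
    can be displayed alongside the lead."""
    lag = flight_time_s - distance_m / muzzle_velocity  # 'lag time' behind a vacuum bullet
    drift_m = crosswind_ms * lag
    return math.degrees(math.atan2(drift_m, distance_m)) * 60.0

# Example: 3 m/s full-value crosswind, 300 m range, 850 m/s muzzle velocity, 0.415 s flight time
print(round(wind_offset_moa(300.0, 3.0, 850.0, 0.415), 1))  # ~2.1 MOA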
The system could be used in several ways including the following:
1. Before the range practice: the instructor 41 briefs the trainees 21, 22. He or she causes a target to move down-range, and shows the class screen 12 (“OK people, look at the target moving down-range. Now look at the screen: for that target, this is the lead you should apply.”).
2. For coaching students who are not progressing: the instructor has access to screen 12, if a student is struggling then the instructor can show them the lead they should be applying.
3. As a real-time tool: each student has access to screen 12 while they're shooting. Screen 12 could take the form of a tablet computer located next to the trainee, so that they can shift their eyes between the target and screen 12. Alternatively, screen 12 could be in the form of a mobile phone or similar which is mounted on the weapon being used by the trainee.
The trainee shooters 21, 22 will not have access to screen 12 in combat, so they should not become reliant on it; the objective is for them to internalise it.
By placing sensors on the target 31, 32, it becomes possible to estimate the path of the bullet as it passes the target. This can be used to close the feedback loop, which one would expect should accelerate the learning cycle. The desired aim is displayed on a tablet 12 before a shot is fired on a moving target. After a shot is fired, the tablet displays whether the trainee was leading or lagging the target and by how much.
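By way of a sketch only, the leading/lagging feedback could be produced from the estimated point at which the bullet crossed the plane of the target: projecting the miss vector onto the target's direction of travel gives the sign (leading or lagging) and the magnitude of the error. The names and the 5 cm dead-band below are illustrative assumptions.

def shot_feedback(miss_vector_m, target_direction):
    """Classify a shot from the estimated bullet crossing point.

    miss_vector_m    -- (x, y) offset of the crossing point from the target
                        centre, in the target plane (m)
    target_direction -- unit vector of the target's motion in the same plane
    """
    along_track = (miss_vector_m[0] * target_direction[0]
                   + miss_vector_m[1] * target_direction[1])
    if abs(along_track) < 0.05:  # within ~5 cm along-track: treat as centred
        return "on target (along track)"
    sense = "leading" if along_track > 0 else "lagging"
    return f"{sense} the target by {abs(along_track):.2f} m"

# Example: the bullet passed 0.4 m behind a target moving in the +x direction
print(shot_feedback((-0.4, 0.1), (1.0, 0.0)))  # "lagging the target by 0.40 m"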
Referring now to
For the bullet sensors, LOMAH (Location of Miss and Hit) sensor arrays could be used in some situations (particularly static/2D targets). Operationally, systems exist to estimate the 3D direction of incoming fire. E.g. a system called “Boomerang” uses a tetrahedral array of microphones and is designed to be mounted on vehicles. When the vehicle is shot at, the Boomerang system estimates the trajectory of the bullet and therefore the direction to the enemy shooter. A system similar to this could be mounted on the targets 31, 32.
In some cases, the weapon-mounted sight 13 can be used as a screen, with information for the trainee shooter overlaid on the usual weapon sight display. In
Referring to
Referring to sight view 100, aim hints 82, 83 are overlaid on the sight. The correct lead is indicated by overlaid vertical line 82 while the correct BDC is indicated by overlaid horizontal line 83. The locations of aim hints 82 and 83 are continuously adjusted based on the information 53 collected by the system. To achieve correct aim, the shooter must move the weapon so that the intersection of lines 82 and 83 overlays the intended target, typically the centre of mass of the moving figure.
Referring to sight view 101, the correct lead and BDC are indicated by a different style of aim hint in the form of a cross mark 84. The location of aim hint 84 is continuously adjusted based on the information 53 collected by the system. To achieve correct aim the shooter must place the mark 84 over the intended target.
In both views the shooter thereby compensates correctly for the motion of the moving target and for the distance from the shooter to the target.
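As an illustration only, placing the overlaid aim hints amounts to converting the computed corrections from minutes of angle into display offsets; the pixels-per-MOA scale below is an assumed calibration constant of the sight, not a feature of any particular embodiment.

def aim_hint_offsets(lead_moa, bdc_moa, pixels_per_moa):
    """Convert the computed corrections into pixel offsets from the sight centre.

    A positive lead shifts vertical line 82 into the target's direction of travel;
    a positive bullet drop compensation shifts horizontal line 83 upwards.
    """
    return lead_moa * pixels_per_moa, bdc_moa * pixels_per_moa

# Example: 14 MOA of lead and 3 MOA of drop compensation at 2.5 pixels per MOA
print(aim_hint_offsets(14.0, 3.0, 2.5))  # (35.0, 7.5)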
Depending on training objectives and the trainee's proficiency, the instructor can disable some or all of the overlay information. For example, referring to
The embodiments described above utilised mobile robotic targets, but the system and methods could also be applied to rail-based targets.
In the embodiments described above, the target is visualised in very abstract form. In other embodiments it may be helpful to visualise it in a more anthropomorphic form. It could even be more human-like than the target (e.g. with moving arms/legs), because shooters are trained to use cues like arm-movement to estimate target speed.
In the embodiments utilising display in a weapon mounted sight, some or all of the lead calculations may be carried out in the sight itself, and not in the target control system.
It can be seen that embodiments of the invention provide at least one of the following advantages:
Any reference to prior art contained herein is not to be taken as an admission that the information is common general knowledge, unless otherwise indicated.
Finally, it is to be appreciated that various alterations or additions may be made to the parts previously described without departing from the spirit or ambit of the present invention.