The present invention relates to devices for teaching marksmen how to properly lead a moving target with a weapon. More particularly, the invention relates to optical projection systems to monitor and simulate trap, skeet, and sporting clay shooting.
Marksmen typically train and hone their shooting skills by engaging in skeet, trap or sporting clay shooting at a shooting range. The objective for a marksman is to successfully hit a moving target by tracking it at various distances and angles and anticipating the delay time between the shot and the impact. In order to hit the moving target, the marksman must aim the weapon ahead of and above the moving target by a distance sufficient to allow a projectile fired from the weapon sufficient time to reach the moving target. The process of aiming the weapon ahead of the moving target is known in the art as “leading the target”. “Lead” is defined as the distance between the moving target and the aiming point. The correct lead distance is critical to successfully hit the moving target. Further, the correct lead distance is increasingly important as the distance of the marksman to the moving target increases, the speed of the moving target increases, and the direction of movement becomes more oblique.
Target flight path 116 extends from high house 101 to marker 117. Marker 117 is positioned about 130 feet from high house 101 along target flight path 116. Target flight path 115 extends from low house 102 to marker 118. Marker 118 is about 130 feet from low house 102 along target flight path 115. Target flight paths 115 and 116 intersect at target crossing point 119. Target crossing point 119 is positioned distance 120 from station 110 and is 15 feet above the ground. Distance 120 is 18 feet. Clay targets are launched from high house 101 and low house 102 along target flight paths 116 and 115, respectively. Marksman 128 positioned at any of stations 103, 104, 105, 106, 107, 108, 109, and 110 attempts to shoot and break the launched clay targets.
Referring to
x = xo + vxot + ½axt² + Cx   (1)
y = yo + vyot + ½ayt² + Cy   (2)
where x is the clay position along the x-axis, xo is the initial position of the clay target along the x-axis, vxo is the initial velocity along the x-axis, ax is the acceleration along the x-axis, t is time, and Cx is the drag and lift variable along the x-axis, y is the clay position along the y-axis, yo is the initial position of the clay target along the y-axis, vyo is the initial velocity along the y-axis, ay is the acceleration along the y-axis, t is time, and Cy is the drag and lift variable along the y-axis. Upper limit 405 is a maximum distance along the x-axis with Cx at a maximum and a maximum along the y-axis with Cy at a maximum. Lower limit 406 is a minimum distance along the x-axis with Cx at a minimum and a minimum along the y-axis with Cy at a minimum. Drag and lift are given by:
Fdrag = ½ρv²CDA   (3)
where Fdrag is the drag force, ρ is the density of the air, v is vo, A is the cross-sectional area, and CD is the drag coefficient;
Flift = ½ρv²CLA   (4)
where Flift is the lift force, ρ is the density of the air, v is vo, A is the planform area, and CL is the lift coefficient.
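For illustration only, the following is a minimal sketch of equations (1) through (4) in Python; the launch values, drag and lift variables, and gravitational constant used in the example are assumptions, not values taken from this specification.

import math

def clay_position(t, x0, y0, vx0, vy0, ax, ay, cx, cy):
    # Equations (1) and (2): position of the clay target at time t.
    x = x0 + vx0 * t + 0.5 * ax * t ** 2 + cx
    y = y0 + vy0 * t + 0.5 * ay * t ** 2 + cy
    return x, y

def drag_force(rho, v0, cd, area):
    # Equation (3): Fdrag = 1/2 * rho * v^2 * CD * A (A is the cross-sectional area).
    return 0.5 * rho * v0 ** 2 * cd * area

def lift_force(rho, v0, cl, area):
    # Equation (4): Flift = 1/2 * rho * v^2 * CL * A (A is the planform area).
    return 0.5 * rho * v0 ** 2 * cl * area

# Assumed example: a clay launched at 30 mph (44 ft/s) at a 20-degree elevation,
# with gravity of 32.2 ft/s^2 acting along the y-axis.
v0 = 44.0
gamma = math.radians(20.0)
vx0, vy0 = v0 * math.cos(gamma), v0 * math.sin(gamma)

# Upper limit 405 uses Cx and Cy at their maxima; lower limit 406 uses the minima.
upper = clay_position(1.0, 0.0, 3.0, vx0, vy0, 0.0, -32.2, cx=2.0, cy=1.0)
lower = clay_position(1.0, 0.0, 3.0, vx0, vy0, 0.0, -32.2, cx=-2.0, cy=-1.0)
print(upper, lower)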
Referring to
Clay target 503 has initial trajectory angles γ and β, positional coordinates x1, y1 and a velocity v1. Aim point 505 has coordinates x2, y2. Lead distance 506 has x-component 507 and y-component 508. X-component 507 and y-component 508 are calculated by:
Δx = x2 − x1   (5)
Δy = y2 − y1   (6)
where Δx is x component 507 and Δy is y component 508. As γ increases, Δy must increase. As γ increases, Δx must increase. As β increases, Δy must increase.
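As a worked illustration of equations (5) and (6), a short sketch follows; the coordinates are purely illustrative and are not taken from the figures.

def lead_components(x1, y1, x2, y2):
    # Equations (5) and (6): lead distance components between the clay target
    # (x1, y1) and the aim point (x2, y2).
    dx = x2 - x1   # x-component 507
    dy = y2 - y1   # y-component 508
    return dx, dy

# Assumed example coordinates in feet: clay target at (40, 12), aim point at (46, 15).
print(lead_components(40.0, 12.0, 46.0, 15.0))   # (6.0, 3.0)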
The prior art has attempted to address the problems of teaching proper lead distance with limited success. For example, U.S. Pat. No. 3,748,751 to Breglia et al. discloses a laser, automatic fire weapon simulator. The simulator includes a display screen, a projector for projecting a motion picture on the display screen. A housing attaches to the barrel of the weapon. A camera with a narrow band-pass filter positioned to view the display screen detects and records the laser light and the target shown on the display screen. However, the simulator requires the marksman to aim at an invisible object, thereby making the learning process of leading a target difficult and time-consuming.
U.S. Pat. No. 3,940,204 to Yokoi discloses a clay shooting simulation system. The system includes a screen, a first projector providing a visible mark on the screen, a second projector providing an infrared mark on the screen, a mirror adapted to reflect the visible mark and the infrared mark to the screen, and a mechanical apparatus for moving the mirror in three dimensions to move the two marks on the screen such that the infrared mark leads the visible mark to simulate a lead-sighting point in actual clay shooting. A light receiver receives the reflected infrared light. However, the system in Yokoi requires a complex mechanical device to project and move the target on the screen, which leads to frequent failure and increased maintenance.
U.S. Pat. No. 3,945,133 to Mohon et al. discloses a weapons training simulator utilizing polarized light. The simulator includes a screen and a projector projecting a two-layer film. The two-layer film is formed of a normal film and a polarized film. The normal film shows a background scene with a target with non-polarized light. The polarized film shows a leading target with polarized light. The polarized film is layered on top of the normal non-polarized film. A polarized light sensor is mounted on the barrel of a gun. However, the weapons training simulator requires two cameras and two types of film to produce the two-layered film making the simulator expensive and time-consuming to build and operate.
U.S. Pat. No. 5,194,006 to Zaenglein, Jr. discloses a shooting simulator. The simulator includes a screen, a projector for displaying a moving target image on the screen, and a weapon connected to the projector. When a marksman pulls the trigger a beam of infrared light is emitted from the weapon. A delay is introduced between the time the trigger is pulled and the beam is emitted. An infrared light sensor detects the beam of infrared light. However, the training device in Zaenglein, Jr. requires the marksman to aim at an invisible object, thereby making the learning process of leading a target difficult and time-consuming.
U.S. Patent Application Publication No. 2010/0201620 to Sargent discloses a firearm training system for moving targets. The system includes a firearm, two cameras mounted on the firearm, a processor, and a display. The two cameras capture a set of stereo images of the moving target along the moving target's path when the trigger is pulled. However, the system requires the marksman to aim at an invisible object, thereby making the learning process of leading a target difficult and time-consuming. Further, the system requires two cameras mounted on the firearm making the firearm heavy and difficult to manipulate leading to inaccurate aiming and firing by the marksman when firing live ammunition without the mounted cameras.
The prior art fails to disclose or suggest a system and method for simulating a lead for a moving target using recorded video images of clay targets projected at the same scale as viewed in the field and a phantom target positioned ahead of the clay targets having a variable contrast. Therefore, there is a need in the art for a shooting simulator that recreates moving targets at the same visual scale as seen in the field with a phantom target to teach proper lead of a moving target.
In a preferred embodiment, a system and methods for marksmanship training are disclosed. In one embodiment, the system includes a recording system for capturing and recording a set of video images at a shooting range and a simulation system for displaying a set of modified video images.
In one embodiment, the recording system includes a set of cameras connected to a recorder. The set of cameras are positioned at a shooting range to capture and record a set of video images of a set of shot sequences. A “shot sequence,” as used in this application, is a recorded launch of a clay target that lands. The set of video images is modified by overlaying a phantom clay target at a lead distance and a drop distance from the recorded clay target.
In another embodiment, a set of background videos is captured and recorded by the recording system. In this embodiment, the set of background videos is the set of shot sequences without the launch of the clay target. The set of background videos is recorded for the same amount of time as the set of shot sequences. In a preferred embodiment, each shot sequence has a corresponding background video. In this embodiment, the set of video images is further modified by overlaying a selectable hotspot onto the phantom clay target.
In one embodiment, the set of modified video images are loaded into the simulation system and projected onto a screen with a set of projectors at the same magnification level as perceived by a marksman at the shooting range. A weapon is provided which includes a mounted laser. The marksman aims the weapon at the phantom clay target on the screen. When the marksman pulls the trigger, a laser beam is emitted from the weapon. If the laser beam overlaps the image of the phantom target, then the shot attempt is a hit. A camera simultaneously records the shot attempts of the marksman for later analysis.
In another embodiment, the weapon includes a mounted infrared laser and the phantom clay target includes the selectable hotspot. When the marksman pulls the trigger, an infrared beam is emitted from the weapon and an infrared camera which is included in the simulation system detects the infrared beam. If the infrared beam overlaps the hotspot, then the shot attempt is a hit.
In one embodiment, a method for producing, running, and analyzing a simulation is disclosed. In this embodiment, the method includes the steps of recording a set of shot sequences, modifying the set of shot sequences by adding a phantom clay target to the set of shot sequences along an extrapolated path, at a variable contrast level, at a lead distance and at a drop distance, to create a set of modified shot sequences. The method further includes the steps of projecting the set of modified shot sequences onto a screen in a predetermined order related to the variable contrast level to train a marksman.
In another embodiment, a method for training a marksman is disclosed. In this embodiment, the method includes the steps of recording the set of shot sequences and the set of background videos, modifying the set of shot sequences by adding a phantom clay target and a hotspot to the phantom clay target, synchronously running the set of modified shot sequences and the set of background videos, projecting the set of modified shot sequences as a video source onto a screen, determining a selection of the hotspot, switching the video source to the set of background videos if the hotspot is selected, and projecting the set of background videos as the video source onto the screen.
The disclosed embodiments will be described with reference to the accompanying drawings.
It will be appreciated by those skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or context including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Therefore, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or combining software and hardware implementation that may all generally be referred to herein as a “circuit,” “module,” “component,” or “system.” Further, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
Any combination of one or more computer readable media may be utilized. The computer readable media may be a computer readable signal medium or a computer readable storage medium. For example, a computer readable storage medium may be, but not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the computer readable storage medium would include, but are not limited to: a portable computer diskette, a hard disk, a random access memory (“RAM”), a read-only memory (“ROM”), an erasable programmable read-only memory (“EPROM” or Flash memory), an appropriate optical fiber with a repeater, a portable compact disc read-only memory (“CD-ROM”), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. Thus, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. The propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, or any suitable combination thereof.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the “C” programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages.
Aspects of the present disclosure are described with reference to flowchart illustrations and/or block diagrams of methods, systems and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable instruction execution apparatus, create a mechanism for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that when executed can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions when stored in the computer readable medium produce an article of manufacture including instructions which when executed, cause a computer to implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatuses or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Referring to
In a preferred embodiment, shooting range 601 is a skeet shooting range. In another embodiment, shooting range 601 is a trap shooting range. In another embodiment, shooting range 601 is a sporting clays range.
In this example, shooting range 601 has high house 602 and low house 603. Target flight path 604 extends from high house 602 to out of bounds marker 605. Target flight path 606 extends from low house 603 to out of bounds marker 607. Field 608 of shooting range 601 is defined by boundary lines 609, 610, 611, and 612.
Recording system 600 has cameras 613 and 614, each connected to recorder 615. Camera 613 has lens 616 and field of view 617. Camera 614 has lens 618 and field of view 619. Cameras 613 and 614 are positioned at distance “d1” from boundary line 609. Cameras 613 and 614 capture a set of video images of the set of shot sequences in field 608 at a predetermined magnification level. Any shot sequence in field 608 is captured in focus by cameras 613 and 614.
In a preferred embodiment, the number of shot sequences in the set of shot sequences is determined by the type of shooting range used and the number of target flight path variations to be recorded. For example, the representative number of shot sequences for a skeet shooting range is at least eight, one shot sequence recorded per station. More than one shot per station may be utilized.
In other embodiments, any number of shot sequences may be recorded.
In one embodiment, a set of background videos is captured and recorded. In this embodiment, the set of background videos is the set of shot sequences without the launch of the clay target. The set of background videos is recorded for the same amount of time as the set of shot sequences. In a preferred embodiment, each shot sequence has a corresponding background video.
In a preferred embodiment, the predetermined magnification level is the one which is perceived by a marksman at shooting range 601 observing the set of shot sequences. In other embodiments, other magnification levels may be employed.
In a preferred embodiment, two cameras, cameras 613 and 614, are used to record the set of shot sequences throughout field 608. In this embodiment, recorder 615 synchronizes the video images of the set of shot sequences recorded by cameras 613 and 614.
In another embodiment, a plurality of cameras is used to record the set of shot sequences.
In another embodiment, a single camera, having a wide field of view, is used to record the set of shot sequences. In this embodiment, recorder 615 records the set of video images.
In a preferred embodiment, each of cameras 613 and 614 is a Sony F23 444 multi-rate high definition camera. Other suitable high definition cameras known in the art may be employed.
In a preferred embodiment, each of lenses 616 and 618 is a C-Series Zoom lens model no. Hac18x7.6-F manufactured by Fujifilm Holdings of America Corporation and having a focal length range of 7.6 mm to 137 mm.
In a preferred embodiment, recorder 615 is a Panavision SSR-1 digital recorder. Other suitable recorders known in the art may be employed.
Referring
In one embodiment, distance “d2” is half of distance “d1”. Recorded shot sequence 707 displays the shot sequence at approximately half the size of the original. However, because of the closer viewing distance “d2”, marksman 701 perceives recorded shot sequence 707 as the same size as the original shot sequence.
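As a rough check of this scale-preservation reasoning, the angular size of an object of height h viewed at distance d is 2·atan(h/(2d)), so an image at half size viewed from half the distance subtends the same angle. A minimal sketch follows, with assumed example heights and distances rather than values from the figures.

import math

def apparent_angle(height, distance):
    # Angular size (radians) of an object of the given height at the given distance.
    return 2.0 * math.atan(height / (2.0 * distance))

# Assumed example: a 20 ft feature viewed at d1 = 100 ft subtends the same angle
# as its 10 ft recorded image viewed at d2 = 50 ft.
print(apparent_angle(20.0, 100.0), apparent_angle(10.0, 50.0))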
Referring to
Simulation system 800 has screen 801, projectors 802 and 803, camera 804, and computer 805. Projectors 802 and 803 are connected to computer 805. Computer 805 retrieves the set of modified video images and sends them to projectors 802 and 803 which display them on screen 801. Projectors 802 and 803 are positioned at about distance “d2” from screen 801. Camera 804 is connected to computer 805. Marksman 806 is positioned between projectors 802 and 803 and between camera 804 and screen 801 to view screen 801. Camera 804 and computer 805 record marksman 806 using simulation system 800 for analysis as will be further described below.
Projector 802 has throw 807. Throw 807 covers screen portion 809 of screen 801. Projector 803 has throw 808. Throw 808 covers screen portion 810 of screen 801. Screen portion 809 has width portion “d3”. Screen portion 810 has width portion “d4”. Screen 801 has width “d5”. Marksman 806 has view 811. View 811 covers width “d5” of screen 801. Camera 804 has field of view 812. Field of view 812 covers width “d3” of screen 801 and marksman 806. Computer 805 dithers the video in the overlap between screen portion 809 and screen portion 810 to eliminate multiple images.
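The dithering of the projector overlap is not detailed here; the following is a minimal sketch assuming a simple linear cross-fade (edge blend) across the shared overlap of screen portions 809 and 810, implemented with NumPy. The frame sizes and overlap width are assumptions.

import numpy as np

def blend_overlap(left_frame, right_frame, overlap_px):
    # Cross-fade the two projector frames over the columns they both cover.
    # left_frame and right_frame are HxWx3 arrays for screen portions 809 and 810;
    # a linear ramp attenuates each frame so the summed brightness in the overlap
    # stays roughly constant and no double image appears.
    ramp = np.linspace(1.0, 0.0, overlap_px)
    left = left_frame.astype(float)
    right = right_frame.astype(float)
    left[:, -overlap_px:, :] *= ramp[None, :, None]
    right[:, :overlap_px, :] *= (1.0 - ramp)[None, :, None]
    return left.astype(np.uint8), right.astype(np.uint8)

# Assumed example: 1080x1920 frames with a 200-pixel overlap.
left_in = np.full((1080, 1920, 3), 200, dtype=np.uint8)
right_in = np.full((1080, 1920, 3), 200, dtype=np.uint8)
left_out, right_out = blend_overlap(left_in, right_in, overlap_px=200)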
In a preferred embodiment, screen 801 is a GrayMatte 70 projection screen available from Stewart Filmscreen Corporation of Torrance, Calif. Other suitable projection screens known in the art may be employed.
In other embodiments, any reflective surface may be utilized. For example, a wall may be employed as the reflective surface.
In a preferred embodiment, each of projectors 802 and 803 is a Christie Matrix WU14K-J projector available from Christie Digital Systems USA, Inc. of Cypress, Calif. Other suitable projectors known in the art may be employed.
In a preferred embodiment, camera 804 is a Canon XF100 High Definition Camcorder. Other suitable video cameras known in the art may be employed.
In a preferred embodiment, computer 805 is a personal computer having a processor and a memory connected to the processor running Windows 8 operating system. Other suitable personal computers known in the art may be employed.
Referring to
Projector 802 has throw 807. Throw 807 covers screen portion 809 of screen 801. Projector 803 has throw 808. Throw 808 covers screen portion 810 of screen 801. Screen portion 809 has width portion “d3”. Screen portion 810 has width portion “d4”. Screen 801 has width “d5”. Marksman 806 has view 811. View 811 covers width “d5” of screen 801. Infrared camera 813 has field of view 814. Field of view 814 covers width “d5” of screen 801. Computer 805 dithers the video in the overlap between screen portion 809 and screen portion 810 to eliminate multiple images.
In a preferred embodiment, infrared camera 813 is a Wii Remote available from Nintendo of America, Inc. In another embodiment, infrared camera 813 is a CMOS image sensor available from PixArt Imaging Inc. of Taiwan. Other suitable infrared optical sensors known in the art may be employed.
Referring to
In one embodiment, laser 902 is an infrared laser diode. In this embodiment, simulated shot string 907 is infrared light.
Referring to
In step 1002, the set of recorded video images are modified. In step 1003, a simulation is run using the modified video images. In step 1004, the results of the simulation are analyzed.
Referring to
In step 1102, a set of clay target flight data in the set of video images is measured. In a preferred embodiment, the set of clay target flight data comprises a launch angle of the clay target, an initial velocity of the clay target, a mass of the clay target, a clay target flight time, a wind velocity, a drag force, a lift force, an air temperature, an altitude, a relative air humidity, an outdoor illuminance, a shape of the clay target, a color of the clay target, and a clay target brightness level.
In step 1103, a relative location of a marksman in the set of video images with respect to a clay target launch point is determined.
In step 1104, a set of weapon data is determined. In a preferred embodiment, the set of weapon data comprises a weapon type (e.g., a shotgun, a rifle, or a handgun), a weapon caliber or gauge, a shot type (further comprising a load, a caliber, a pellet size, and a shot mass), a barrel length, a choke type, and a muzzle velocity.
In step 1105, a phantom path is extrapolated. Referring to
Referring to
where DP
where DLead is lead distance 1116, ΔDS is the difference between the distances of shot paths 1120 and 1121, Δφ is the difference between angles φ2 and φ1, θ is the launch angle between target path 1113 and distance 1119, A is a variable multiplier for shot size, gauge, and shot mass, B is a variable multiplier for θ including vibration of a clay target thrower and a misaligned clay target in the clay target thrower, and C is a variable multiplier for drag, lift, and wind.
For example, the approximate times it takes for a 7½ shot size shell with an initial muzzle velocity of approximately 1,225 feet per second to travel various distances are shown in Table 1.
Various lead distances between clay target 1112 and phantom clay target 1114 for clay target 1112 having an initial velocity of approximately 30 mph are shown in Table 2.
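Because the lead-distance formula itself is not reproduced above, the following sketch only illustrates the relationship implied by Tables 1 and 2: the lead is approximately the target speed multiplied by the shot's time of flight to the target's range, taken from a lookup table. The lookup values below are placeholders, not the contents of Table 1 or Table 2.

# Hypothetical lookup: shot time of flight (seconds) at a given range (yards) for a
# 7-1/2 shot load at roughly 1,225 ft/s muzzle velocity.  Placeholder values only.
TIME_OF_FLIGHT = {20: 0.06, 30: 0.10, 40: 0.15}

def lead_distance(target_speed_fps, range_yd, lookup=TIME_OF_FLIGHT):
    # Approximate lead: the distance the clay travels while the shot is in flight.
    t_impact = lookup[range_yd]
    return target_speed_fps * t_impact

# A clay at roughly 30 mph travels about 44 ft/s; at 30 yards this gives ~4.4 ft of lead.
print(lead_distance(44.0, 30))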
Referring to
The “drop of a shot” is the effect of gravity on the shot during the distance traveled by the shot. The shot trajectory has a near parabolic shape. Due to the near parabolic shape of the shot trajectory, the line of sight or horizontal sighting plane will cross the shot trajectory at two points called the near zero and far zero in the case where the shot has a trajectory with an initial angle inclined upward with respect to the sighting device horizontal plane, thereby causing a portion of the shot trajectory to appear to “rise” above the horizontal sighting plane. The distance at which the weapon is zeroed, and the vertical distance between the sighting device axis and barrel bore axis, determine the amount of the “rise” in both the X and Y axes, i.e., how far above the horizontal sighting plane the rise goes, and over what distance it lasts.
Drop distance 1122 is calculated by:
where DDrop is drop distance 1122 and timpact is the time required for a shot string fired by marksman 1118 to impact clay target 1114. The value of timpact is determined by a set of lookup tables having various impact times at predetermined distances for various shot strings.
where vt is the terminal velocity of clay target 1114, m is the mass of clay target 1114, g is the vertical acceleration due to gravity, C is the drag coefficient for clay target 1114, ρ is the density of the air, A is the planform area of clay target 1114, and τ is the characteristic time.
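The drop-distance equations themselves are not reproduced above. A minimal sketch follows that assumes a standard terminal-velocity model for the clay target, vt = sqrt(2mg/(ρCA)) with characteristic time τ = vt/g, and a gravity-only approximation, ½g·timpact², for the drop of the shot string. These models and the example numbers are assumptions, not the patented formulas.

import math

G = 32.2           # gravitational acceleration, ft/s^2
RHO = 0.0023769    # air density at sea level, slug/ft^3

def terminal_velocity(m, c, area, rho=RHO, g=G):
    # Assumed model: vt = sqrt(2mg / (rho * C * A)) for the clay target.
    return math.sqrt(2.0 * m * g / (rho * c * area))

def characteristic_time(vt, g=G):
    # Assumed definition: tau = vt / g.
    return vt / g

def shot_drop(t_impact, g=G):
    # Assumed gravity-only drop of the shot string over its flight time.
    return 0.5 * g * t_impact ** 2

# Assumed example: a 0.0072 slug (~105 g) clay, drag coefficient 0.5, planform
# area 0.097 ft^2, and a 0.10 s shot flight time from the lookup tables.
vt = terminal_velocity(0.0072, 0.5, 0.097)
print(vt, characteristic_time(vt), shot_drop(0.10))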
Returning to
In step 1107, a color and a contrast level of a phantom clay target is determined.
In a preferred embodiment, the phantom clay target comprises a set of pixels set at a predetermined contrast level. The predetermined contrast level is determined by the difference of the color between the phantom clay target and the clay target and the difference of the brightness between the phantom clay target and the clay target. In this embodiment, the predetermined contrast level is a range from a fully opaque image to a fully transparent image with respect to the image of the clay target and the image of the background.
In a preferred embodiment, the set of pixels is set at a predetermined color. For example, blaze orange has a pixel equivalent setting of R 232, G 110, B 0.
In step 1108, a modified video image is created. In a preferred embodiment, a phantom clay target is overlaid onto the loaded video image. In this embodiment, the phantom clay target is a copy of the clay target located at lead distance 1116 and drop distance 1122 ahead of the clay target with the color and contrast level determined in step 1107.
In one embodiment, a screen hotspot is overlaid onto the phantom clay target to create a phantom hotspot. The phantom hotspot enables the phantom clay target to be “selected” with a mouse or any other suitable pointing device known in the art and defines an action to be taken by the computer when “selected,” as will be further described below. In this embodiment, the phantom hotspot is transparent. In this embodiment, a background video is copied to create the set of background videos.
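A minimal sketch of steps 1107 and 1108 follows, assuming the phantom clay target is drawn as a blaze-orange disk alpha-blended into the frame at the chosen contrast level and that the phantom hotspot is the disk's center and radius; the frame size, disk radius, and pixel offsets are illustrative only.

import numpy as np

BLAZE_ORANGE = (232, 110, 0)   # R 232, G 110, B 0, as noted above

def overlay_phantom(frame, target_xy, lead_px, drop_px, radius, contrast):
    # Overlay a phantom clay target at a pixel offset ahead of the recorded clay.
    # frame is an HxWx3 uint8 video frame; contrast runs from 0.0 (fully
    # transparent) to 1.0 (fully opaque).  Returns the modified frame and the
    # phantom hotspot as (center_x, center_y, radius).
    h, w, _ = frame.shape
    px = int(target_xy[0] + lead_px)
    py = int(target_xy[1] + drop_px)
    yy, xx = np.ogrid[:h, :w]
    mask = (xx - px) ** 2 + (yy - py) ** 2 <= radius ** 2
    out = frame.astype(float)
    out[mask] = (1.0 - contrast) * out[mask] + contrast * np.array(BLAZE_ORANGE, dtype=float)
    return out.astype(np.uint8), (px, py, radius)

# Assumed example on a blank 1080x1920 frame.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
modified, hotspot = overlay_phantom(frame, (900, 500), lead_px=120, drop_px=-30,
                                    radius=18, contrast=0.6)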
In step 1109, the modified video image is stored in memory. In step 1110, a sequence number is compared to a predetermined number of shot sequences. The predetermined number of shot sequences is the number of modified video images shown during the simulation. If the sequence number is less than the predetermined number of shot sequences, then method 1100 returns to step 1107. If the sequence number equals the predetermined number of shot sequences, then method 1100 proceeds to step 1111. In step 1111, a set of modified video images for a shot sequence is stored in memory.
Referring to
In step 1203, a shot attempt by a marksman is recorded by a camera of the simulation system. In a preferred embodiment, the camera simultaneously records the position of the marksman and the modified video image being projected on the screen.
In step 1204, whether the simulation is complete is determined. In a preferred embodiment, the simulation is complete after each modified video image of the set of modified video images has been projected and a shot attempt by the marksman has been recorded. If the simulation is not done, then method 1200 returns to step 1201 and runs the video of the next modified video image of the set of modified video images. If the simulation is complete, then the simulation stops in step 1205.
Referring to
In step 1209, whether the phantom hotspot has been “selected” is determined. An infrared camera detects the position of an infrared shot string. The infrared shot string is calculated by:
Ashot string = πRstring²   (14)
Rstring = Rinitial + vspreadt   (15)
where Ashot string is the area of the infrared shot string, Rstring is the radius of the infrared shot string, Rinitial is the radius of the shot as it leaves the weapon, vspread is the rate at which the shot spreads, and t is the time it takes for the shot to travel from the weapon to the clay target.
If the position of the infrared shot string overlaps the phantom hotspot, then the phantom hotspot is “selected”. If the position of the infrared shot string does not overlap the phantom hotspot, then the phantom hotspot is not “selected”.
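A minimal sketch of the hit test of step 1209 follows, using equations (14) and (15): the shot-string radius grows with flight time, and the hotspot is "selected" when the spread circle overlaps the hotspot circle. The coordinates, spread rate, and radii are assumptions.

import math

def shot_string_radius(r_initial, v_spread, t):
    # Equation (15): Rstring = Rinitial + vspread * t.
    return r_initial + v_spread * t

def shot_string_area(r_string):
    # Equation (14): Ashot string = pi * Rstring^2.
    return math.pi * r_string ** 2

def hotspot_selected(shot_xy, r_string, hotspot_xy, hotspot_radius):
    # The hotspot is "selected" when the shot-string circle overlaps the hotspot circle.
    distance = math.hypot(shot_xy[0] - hotspot_xy[0], shot_xy[1] - hotspot_xy[1])
    return distance <= r_string + hotspot_radius

# Assumed example in pixels and seconds: the infrared camera reports the shot string
# centered at (905, 480); the phantom hotspot is at (910, 475) with radius 18.
r = shot_string_radius(r_initial=2.0, v_spread=40.0, t=0.10)
print(shot_string_area(r), hotspot_selected((905, 480), r, (910, 475), 18))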
In step 1210, if the phantom hotspot is selected, then the simulation system switches a video source projected onto the screen from the first of the set of modified video images to the first of the set of background videos and the first of the set of background videos is projected onto the screen until completion. The first of the set of modified video images runs in the background until completion. In step 1211, the simulation system records a “hit” in a database.
In step 1212, if the phantom hotspot is not selected, then the first of the set of modified video images continues to be projected onto the screen by the simulation system until completion and the first of the set of background videos runs in the background until completion. In step 1213, the simulation system records a “miss” in the database.
In step 1214, whether the simulation is complete is determined. In a preferred embodiment, the simulation is complete after each modified video image of the set of modified video images has been projected, each background video of the set of background videos has run, and a “hit” or a “miss” has been recorded. If the simulation is not done, then method 1206 returns to step 1207 and synchronously runs the video of the next modified video image of the set of modified video images and the video of the next background video of the set of background videos. If the simulation is complete, then a trend of shot attempts is analyzed in step 1215 by retrieving a number of “hits” in the set of shot sequences and a number of “misses” in the set of shot sequences from the database. In step 1216, a shot improvement is determined by evaluating the number of hits in the set of shot sequences and the number of misses in the set of shot sequences.
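A minimal sketch of the record-keeping and trend analysis of steps 1211 through 1216 follows, assuming a simple in-memory store rather than any particular database; the "trend" here is reduced to an overall hit rate that can be compared across sessions.

from collections import defaultdict

class ShotLog:
    # Record "hit" or "miss" per shot sequence and report a simple trend.

    def __init__(self):
        self.results = defaultdict(list)   # sequence number -> list of True/False

    def record(self, sequence, hit):
        self.results[sequence].append(bool(hit))

    def hit_rate(self):
        shots = [r for seq in self.results.values() for r in seq]
        return sum(shots) / len(shots) if shots else 0.0

# Assumed example: three shot sequences, two hits and one miss.
log = ShotLog()
log.record(1, True)
log.record(2, False)
log.record(3, True)
print(f"hit rate: {log.hit_rate():.0%}")   # compare across sessions to gauge improvement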
Referring to
In another embodiment, laser spot 1302 does not appear on the screen when marksman 1306 pulls the trigger of weapon 1307 and shot string 1303 is an infrared shot string.
Referring to
If the shot string overlaps the phantom clay target, then the recorded shot is a “hit.” If the measured difference between the shot string and the phantom clay target is greater than zero (0), then the recorded shot is a “miss.” In step 1403, whether the simulation is complete is determined. If the simulation is not complete, then method 1400 advances to the subsequent recorded shot in the set of shot sequences in step 1404. If the simulation is complete, then a trend of the recorded shots is analyzed in step 1405. In step 1406, a shot improvement is determined by evaluating a number of hits in the set of shot sequences and a number of misses in the set of shot sequences.
It will be appreciated by those skilled in the art that modifications can be made to the embodiments disclosed and remain within the inventive concept. Therefore, this invention is not limited to the specific embodiments disclosed, but is intended to cover changes within the scope and spirit of the claims.