METHODS AND SYSTEMS FOR SIMULATED TARGET PRACTICE

Information

  • Patent Application
  • Publication Number
    20250237483
  • Date Filed
    January 21, 2025
  • Date Published
    July 24, 2025
  • Inventors
    • Metropoulos; Peter (Manhattan Beach, CA, US)
    • O’Shaughnessy; James G. (Solvang, CA, US)
    • Way; Christopher R. (Broomfield, CO, US)
    • Galli; Frank L. (Wheat Ridge, CO, US)
  • Original Assignees
    • PRIME Reaction, LLC (Solvang, CA, US)
Abstract
A shooting training apparatus. The shooting training apparatus includes a screen configured to display a virtual range. The shooting training apparatus further includes a footing surface configured to dynamically reorient during a shooting session, wherein the footing surface dynamically reorients independent of what is displayed in the virtual range.
Description
BACKGROUND
Background and Relevant Art

Firearm training is performed to improve a shooter's skills. Such skills may include accuracy, reaction time, breath control, target acquisition, consistency, pre-trigger break force control, follow-through, etc. At the most basic level, firearm training may be performed in a controlled shooting range where a shooter stands at a bench and shoots at targets at a known distance from the shooter. More complex training involves the shooter physically traversing a course with various targets deployed along the course.


Modernly, a firearms training simulator allows shooters to improve skills using virtual training tools. With these tools, virtual ranges can be displayed to a shooter, typically by projecting the virtual range onto a screen.


The shooter uses an inert, replica weapon to shoot at targets in the virtual range. When the trigger of the replica weapon is pulled, the weapon emits laser pulses (either visible or IR) that strike the virtual range. In some embodiments, the replica weapon may be implemented simply by using ammunition cartridges that include a “primer” momentary selection switch coupled to a laser emitting device in an otherwise functional weapon. In this way, when the trigger of the weapon is pulled, the firing pin strikes the momentary switch, causing laser light to be emitted from the weapon and projected onto the virtual range. In some embodiments, the replica weapon is implemented by removing the barrel or portions of the action of an actual firearm and replacing them with components that reset the trigger and/or actuate the bolt when the trigger is pulled to simulate semi-automatic weapons.


The training simulator further includes a detector configured to identify where the laser is emitted on the projected virtual range. The training simulator can then determine various shooting characteristics of a range session.


Range training is very common and useful, but not without its drawbacks. In particular, range training can only approximate aspects of situations outside of a range in which firearms are used. Range training typically is not able to accurately simulate terrain differences that might occur outside of the range. Further, range training is idealized, resulting in less stressful shooting. Thus, shooting performance on a range is typically much higher than performance outside of a range.


The subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.


BRIEF SUMMARY

One embodiment illustrated herein includes a shooting training apparatus. The shooting training apparatus includes a screen configured to display a virtual range. The shooting training apparatus further includes a footing surface configured to dynamically reorient during a shooting session, wherein the footing surface dynamically reorients independent of what is displayed in the virtual range.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


Additional features and advantages will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the teachings herein. Features and advantages of the invention may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. Features of the present invention will become more fully apparent from the following description and appended claims, or may be learned by the practice of the invention as set forth hereinafter.





BRIEF DESCRIPTION OF THE DRAWINGS

In order to describe the manner in which the above-recited and other advantages and features can be obtained, a more particular description of the subject matter briefly described above will be rendered by reference to specific embodiments which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments and are not therefore to be considered to be limiting in scope, embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:



FIG. 1 illustrates a shooting training apparatus having a projected virtual range;



FIG. 2 illustrates a shooting training apparatus having a virtual range displayed using lighting elements in a screen and having detectors interspersed with the lighting elements;



FIG. 3 illustrates modeling bullet drop in a shooting training apparatus;



FIG. 4 illustrates modeling bullet drop and delay in a shooting training apparatus;



FIG. 5 illustrates a prompt array in a shooting training apparatus;



FIG. 6 illustrates a shooting training apparatus having a portion of a footing surface comprising a reorientable treadmill that is capable of being reoriented along at least two center line axes of the portion of the footing surface.





DETAILED DESCRIPTION

Embodiments illustrated herein are directed to systems and methods for introducing interference into range training. In particular, the interference introduced into the range training relates to footing surfaces. For example, embodiments may introduce dynamic orientation changes of footing surfaces while a shooter is shooting. In some embodiments, such interference may be multi-faceted. For example, in some embodiments, different portions of the footing surface may be dynamically reoriented in different directions with respect to each other. Alternatively, or additionally, different portions of the footing surface may be reoriented in different directions at different rates. Alternatively, or additionally, portions of the footing surface may be dynamically reoriented while also moving laterally in one or more directions. Alternatively, or additionally, portions of the footing surface may include visual, tactile, audio, or other cues directing the shooter to perform certain movements on the footing surface. Alternatively, or additionally, portions of the footing surface may include visual, tactile, audio, or other cues directing the shooter to perform certain tasks as part of the shooting training.


Some embodiments herein may be implemented using various virtual training tools. For example, some embodiments may use one or more different systems for displaying a virtual range. In some embodiments, the training tools may include a projector to display the virtual range and a detector to detect laser hits from a training weapon. In some embodiments, a virtual range may be displayed on a screen having light sources in the screen to display the virtual range and detectors in the screen to detect laser hits. In some embodiments, the training tools may be configured to have a detection resolution of 1/16 of an inch at 10 feet. Some embodiments may be configured to have a detection resolution of 1/32 of an inch at 10 feet. This may be particularly useful for simulating rifle training. Further, some embodiments may implement a training weapon with multiple laser pulses per a given primer strike, which in some embodiments, may correspond to a single trigger pull. This too can be useful for rifle simulations when simulating lock and dwell time.


Additional details are now illustrated.


Referring now to FIG. 1, an example embodiment is illustrated. In this example a shooter 102 is performing shooting drills in a training environment 100. The training environment 100, in this example, includes virtual training tools, including a virtual range 104. While virtual training tools are illustrated in FIG. 1, it should be appreciated that in other embodiments, training may be performed at an actual live-fire range. In other embodiments, training may be performed using virtual reality glasses, augmented reality glasses or other platforms. The virtual range 104 is displayed on a screen 106 by projecting the virtual range 104 from a projector 108.


The shooter 102 has a weapon 110. The weapon 110 projects a laser beam 112 onto the screen 106, and specifically onto the portion of the screen having the virtual range 104 projected onto it.


The training environment further includes a detector 114. The detector can identify where the laser beam 112 strikes the virtual range 104. The detector is coupled to a computing processor 120 that executes computer executable instructions to use information about where the laser beam 112 strikes the virtual range 104 and to determine what bullets from a live-fire weapon would have struck.


In the example illustrated in FIG. 1, the training environment 100 includes a number of useful characteristics. For example, in some embodiments, the detector 114 is configured to have a resolution of at least 1/16 of an inch when the shooter 102 is at a distance of 10 feet from the screen 106. In other embodiments, the detector 114 is configured to have a resolution of at least 1/32 of an inch when the shooter 102 is at a distance of 10 feet from the screen 106. This allows the training environment 100 to be used for rifle training simulating a shooting distance of several hundred yards. With this resolution, accuracy of detecting laser hits on the screen 106 of about 1 MOA at a simulated distance of 300 yards can be achieved. In some embodiments, the detector 114 itself is able to detect signals at a resolution of at least 1/16 or 1/32 of an inch, depending on the embodiment, no matter where the weapon 110 is fired from. In some embodiments, the laser from the weapon 110 is configured to emit a half power spot size at the screen that is less than 1/16 of an inch or 1/32 of an inch, depending on the embodiment, when fired from a distance of 10 feet.
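To make the relationship between linear detection resolution and angular accuracy concrete, the following is a minimal sketch of the geometry, assuming only small-angle trigonometry. The function names and example numbers are illustrative and not part of the apparatus; the sketch simply converts a detector resolution (such as 1/16 or 1/32 of an inch at a 10 foot shooter distance) into minutes of angle (MOA) and into the equivalent spread at a simulated downrange distance.

```python
import math

MOA_IN_RADIANS = (1 / 60) * math.pi / 180  # one minute of angle, in radians


def resolution_in_moa(linear_resolution_in: float, shooter_distance_ft: float) -> float:
    """Angular size, in MOA, of a linear detection resolution at the screen."""
    angle_rad = math.atan2(linear_resolution_in, shooter_distance_ft * 12)
    return angle_rad / MOA_IN_RADIANS


def spread_at_simulated_range(linear_resolution_in: float,
                              shooter_distance_ft: float,
                              simulated_range_yd: float) -> float:
    """Equivalent linear spread, in inches, at the simulated target distance."""
    angle_rad = math.atan2(linear_resolution_in, shooter_distance_ft * 12)
    return math.tan(angle_rad) * simulated_range_yd * 36


# 1/32 of an inch resolved at 10 feet corresponds to roughly 0.9 MOA,
# or about 2.8 inches of spread at a simulated 300 yards.
print(round(resolution_in_moa(1 / 32, 10), 2))               # ~0.9
print(round(spread_at_simulated_range(1 / 32, 10, 300), 1))  # ~2.8
```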


In some embodiments, the computing processor 120 is configured to model various range and/or weapon characteristics. For example, in some embodiments, the computing processor 120 may be configured to model long-distance rifle shots by introducing delays and bullet drop. Illustratively, if the computing processor 120 is configured to model shooting at a long range target, when the detector 114 detects a laser strike on the screen 106, the computing processor 120 may determine that a modeled bullet strike occurs at a different location and/or target than the location and/or target where the laser strikes the screen 106.


For example, as illustrated in FIG. 3, while the laser trajectory 122 may strike a point 124 on the screen 106, the computing processor 120 may determine that the actual point that a bullet would strike in a real environment at several hundred yards is point 126. This is because the bullet would follow the trajectory 128 rather than the trajectory 122. Similarly, in FIG. 4, assuming that targets in the virtual range 104 are moving on the screen 106, the delay in time for the bullet to strike a target may be modeled such that a hit may be detected by factoring in movement of the target along with the delay that an actual bullet would experience when being fired. Thus, in FIG. 4, the shooter 102 can be detected as having hit the target 130 even though the laser strike point 124 is both above the target 130 and to the left of the target 130 when the shot is fired, assuming that the target 130 is moving to the left.
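A highly simplified version of this mapping can be sketched as follows. The ballistics are deliberately coarse (a constant muzzle velocity, a pure gravity drop, no drag), all coordinates are treated in virtual-range units, and the names and numbers are assumptions rather than the model the computing processor 120 necessarily uses; the sketch only illustrates shifting a laser strike point by bullet drop and advancing a moving target through the bullet's time of flight, as in FIGS. 3 and 4.

```python
from dataclasses import dataclass

G_FTPS2 = 32.174  # gravitational acceleration, ft/s^2


@dataclass
class MovingTarget:
    x_in: float     # horizontal position in the virtual range, inches
    y_in: float     # vertical position in the virtual range, inches
    vx_inps: float  # horizontal speed, inches per second (negative = moving left)


def modeled_impact(laser_x_in: float, laser_y_in: float,
                   simulated_range_yd: float, muzzle_velocity_fps: float) -> tuple[float, float]:
    """Shift the laser strike point downward by the drop accumulated over the time of flight."""
    time_of_flight_s = (simulated_range_yd * 3) / muzzle_velocity_fps
    drop_in = 0.5 * G_FTPS2 * time_of_flight_s ** 2 * 12
    return laser_x_in, laser_y_in - drop_in


def target_hit(laser_x_in: float, laser_y_in: float, target: MovingTarget,
               simulated_range_yd: float, muzzle_velocity_fps: float,
               hit_radius_in: float = 6.0) -> bool:
    """Score the shot against where the target will be when the modeled bullet arrives."""
    impact_x, impact_y = modeled_impact(laser_x_in, laser_y_in,
                                        simulated_range_yd, muzzle_velocity_fps)
    time_of_flight_s = (simulated_range_yd * 3) / muzzle_velocity_fps
    future_x = target.x_in + target.vx_inps * time_of_flight_s
    return (impact_x - future_x) ** 2 + (impact_y - target.y_in) ** 2 <= hit_radius_in ** 2
```

Under these assumptions, a strike point that is above and ahead of a left-moving target, as in FIG. 4, can still score as a hit once drop and time of flight are applied.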


With respect to the weapon 110, various characteristics are implemented in some embodiments. These characteristics can be implemented by virtue of characteristics of the weapon 110 itself and/or processing performed by the computing processor 120. As described previously, in some embodiments, the computing processor 120 may be configured to model delays and bullet drop. In some embodiments, the weapon 110 is configured to emit two laser pulses, as opposed to just a single laser pulse, per primer strike. In particular, the weapon 110 may be configured to emit a first laser pulse immediately when an actual or simulated primer of a bullet in the weapon is struck, which typically occurs a very short time (lock time) after the trigger of the weapon 110 is pulled. A second laser pulse is emitted at a time (dwell time) corresponding with when a bullet would leave the barrel of an actual weapon corresponding to the weapon 110. The dwell time, in other embodiments, may be the time from when the primer is struck to when a bolt unlocks and extracts a spent cartridge in a gun being simulated by the weapon 110. Thus, the weapon 110 may include delay circuitry configured to trigger the second laser pulse (and, in some embodiments, the first laser pulse). The use of multiple laser pulses may be used to model bullet trajectory based on lock and dwell time, which is related to when the primer of a bullet is struck and when the bullet leaves the barrel of an actual weapon or when a bolt opens and a cartridge is extracted. In this example, the detector 114 will detect the two strikes, and this information is provided to the computing processor 120, where the computing processor 120 can determine what an actual bullet trajectory would look like relative to the shooter's performance in the training environment 100. For example, a shooter's follow-through can be evaluated based on the two laser strikes on the screen 106 for a single trigger pull and/or a single shot fired.
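The two detected strikes also give a direct way to quantify follow-through. The sketch below is a hypothetical evaluation only; the function names and the grading threshold are assumptions. It simply measures the angular movement of the point of aim between the lock-time pulse and the dwell-time pulse from the two strike locations on the screen 106.

```python
import math

MOA_IN_RADIANS = (1 / 60) * math.pi / 180


def follow_through_moa(first_strike_xy: tuple[float, float],
                       second_strike_xy: tuple[float, float],
                       shooter_distance_ft: float) -> float:
    """Angular movement, in MOA, between the two laser strikes of a single shot."""
    dx = second_strike_xy[0] - first_strike_xy[0]
    dy = second_strike_xy[1] - first_strike_xy[1]
    separation_in = math.hypot(dx, dy)
    return math.atan2(separation_in, shooter_distance_ft * 12) / MOA_IN_RADIANS


def grade_follow_through(first_strike_xy: tuple[float, float],
                         second_strike_xy: tuple[float, float],
                         shooter_distance_ft: float,
                         threshold_moa: float = 2.0) -> str:
    """Hypothetical grading: the point of aim should barely move between the two pulses."""
    movement = follow_through_moa(first_strike_xy, second_strike_xy, shooter_distance_ft)
    return "good follow-through" if movement <= threshold_moa else "muzzle moved during dwell time"
```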


In the example illustrated in FIG. 1, the projector 108 and the detector 114 are slightly offset from each other, creating parallax. Thus, in some embodiments, a calibration process may be implemented whereby the shooter 102 shoots one or more shots from the weapon 110 from different locations on the footing surface 116 to attempt to identify parallax conditions and corrective actions that can be taken in view of the parallax conditions. For example, the computing processor may be placed in a calibration mode, and the shooter 102 may fire calibration shots from the extreme edges of the footing surface 116 and the center of the footing surface 116. In some embodiments, this will occur while the footing surface 116 is not reorienting, so as to allow the shooter 102 to be more accurate in their shooting. Parallax can be determined by the computing processor 120. Later, during a training session, the computing processor 120 can factor in any parallax based on the shooter's location on the footing surface 116 to correct for parallax that occurs as the shooter 102 moves on the footing surface 116 during the training session.
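One plausible way to apply such a calibration is sketched below. It assumes, hypothetically, that each calibration shot yields a measured offset between the intended aim point and the detected strike at a known shooter position on the footing surface 116, and it linearly interpolates a correction for wherever the shooter stands during a session; the actual correction performed by the computing processor 120 may differ.

```python
import bisect


class ParallaxCorrection:
    """Interpolate a detector offset from calibration shots taken at known floor positions."""

    def __init__(self) -> None:
        # (shooter position across the footing surface, feet) -> (dx, dy) offset, inches
        self._samples: list[tuple[float, tuple[float, float]]] = []

    def add_calibration_shot(self, shooter_x_ft: float, offset_in: tuple[float, float]) -> None:
        self._samples.append((shooter_x_ft, offset_in))
        self._samples.sort(key=lambda sample: sample[0])

    def correction_at(self, shooter_x_ft: float) -> tuple[float, float]:
        """Linearly interpolate between the two nearest calibration positions."""
        positions = [x for x, _ in self._samples]
        i = bisect.bisect_left(positions, shooter_x_ft)
        if i == 0:
            return self._samples[0][1]
        if i >= len(self._samples):
            return self._samples[-1][1]
        (x0, (dx0, dy0)), (x1, (dx1, dy1)) = self._samples[i - 1], self._samples[i]
        t = (shooter_x_ft - x0) / (x1 - x0)
        return dx0 + t * (dx1 - dx0), dy0 + t * (dy1 - dy0)


# During a session, a corrected strike is the detected point minus the interpolated
# offset for the shooter's current position on the footing surface.
```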


Referring now to FIG. 2, an alternative embodiment is illustrated. In this embodiment, the virtual range 204 is projected using an active screen 206 having light emitting elements embedded into the screen 206. For example, such light emitting elements may be light emitting diodes or other semiconductor devices. The screen 206 further includes light detecting elements embedded into the screen 206. For example, the light detecting elements may include photodiodes, or other similar semiconductor devices. Note that functionality described above with respect to FIG. 1 can also be implemented in the embodiment illustrated in FIG. 2, if applicable.



FIG. 2 illustrates a detailed portion 232 of the screen 206. In this example, light emitters 234 are interspersed with detectors 236. Similar to the example illustrated in FIG. 1, the light detectors 236 are of a sufficient size, quantity, and location to achieve a resolution of 1/16 of an inch or 1/32 of an inch when the shooter 202 is at a distance of 10 feet from the screen 206. By using emitters and detectors as illustrated in FIG. 2, the parallax issues present in the embodiment illustrated in FIG. 1 can be minimized or eliminated.


Returning once again to FIG. 1, a footing surface 116 is illustrated. The footing surface 116, in this example, is dynamically reorientable, meaning that at least a portion of the footing surface 116 can be dynamically oriented in various directions. In particular, the footing surface 116 will reorient during a training session. FIG. 1 illustrates that a first portion 116-1 of the footing surface 116 is oriented in a first direction as indicated by the normal vector 118-1. A second portion 116-2 of the footing surface 116 is oriented in a second direction as indicated by the normal vector 118-2. Both of these portions will re-orient into different directions during a training exercise. In some embodiments, different portions of the footing surface may be dynamically reoriented in different directions with respect to each other. Alternatively, or additionally, different portions of the footing surface may be reoriented in different directions at different rates.


While the illustrated examples show that the footing surface 116 reorients, it should also be appreciated that, in various embodiments, the footing surface may have lateral movement. That is, portions of the footing surface may be dynamically reoriented while also moving laterally in one or more directions. Such lateral movement can be multi-directional. This can be envisioned as a multi-directional treadmill configuration. Although, as illustrated in FIG. 6, a single-direction treadmill may be implemented in some embodiments.


In some embodiments, the orientation of the various portions of the footing surface 116 is unrelated to what is displayed on the screen 106 in the virtual range 104. Thus, in these embodiments, the shooter 102 receives stimulation from the footing surface 116 that is unrelated to what the shooter 102 sees in the virtual range 104. This may be useful in training in that the shooter 102 learns to deal with unexpected perturbances while performing shooting actions. The illustrated embodiments can have a number of different benefits including improving the shooter's balance and building body tissues in the shooter. In some embodiments, portions of the footing surface 116 comprise one or more Reax Board floors or one or more Reax Run Treadmills available from Reaxing S.P.A of Milano, Italy.
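The independence of the footing-surface motion from the range display can be pictured as a separate control loop that never consults what is shown on the screen 106. The sketch below is purely illustrative; the class, tilt limits, and cadence are invented for the example and do not describe any particular commercial floor or treadmill hardware.

```python
import random
import time


class FootingPortion:
    """One independently reorientable portion of the footing surface (hypothetical interface)."""

    def __init__(self, name: str, max_tilt_deg: float = 12.0) -> None:
        self.name = name
        self.max_tilt_deg = max_tilt_deg
        self.pitch_deg = 0.0
        self.roll_deg = 0.0

    def command_random_orientation(self) -> None:
        # The new orientation is drawn without any reference to the virtual range.
        self.pitch_deg = random.uniform(-self.max_tilt_deg, self.max_tilt_deg)
        self.roll_deg = random.uniform(-self.max_tilt_deg, self.max_tilt_deg)


def run_session(portions: list[FootingPortion], duration_s: float, interval_s: float = 2.0) -> None:
    """Reorient every portion of the footing surface on a fixed cadence for a whole session."""
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        for portion in portions:
            portion.command_random_orientation()  # each portion gets its own new direction
        time.sleep(interval_s)


# Example: two portions, such as 116-1 and 116-2, reorienting during a 60 second session.
# run_session([FootingPortion("116-1"), FootingPortion("116-2")], duration_s=60)
```

A per-portion cadence, rather than the single interval used here, would produce the different reorientation rates described above.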


Referring now to FIG. 5, some embodiments may implement a prompt array 138 on the footing surface 116. The prompt array 138 provides various indicators to the shooter 102. In some embodiments, the prompt array 138 may provide indicators to the shooter indicating that the shooter should move about the footing surface 116 during a training session. In some examples, this can be accomplished by causing lights to be emitted in a desired traversal pattern by the prompt array 138. For example, when a portion of the prompt array 138 is illuminated, the shooter 102 should move toward the illuminated portion before firing and/or while firing the weapon 110. Similar prompts may prompt the shooter 102 to perform other actions such as touching illuminated portions with their hands, crouching, or other actions. Note that the illumination may be controlled by the computing processor 120.


While illuminated tiles are illustrated herein, it should be appreciated that in other embodiments, other cues may be provided to the user. For example, cues may be displayed on the screen 106. Alternatively, the system may include audio systems coupled to the computing processor 120 such that the computing processor can cause audio cues to be provided to the shooter 102. In still other embodiments, the computing processor 120 may control tactile elements, such as solenoids, vibration generators, or other tactile generators. Such tactile elements may be implemented in the footing surface 116, in the weapon 110, in other wearable devices worn by the shooter, etc.


In some embodiments, the prompt array 138 may have the ability to display different colored lights. In some embodiments, the color of the light displayed may correspond to targets in the virtual range 104 at which the shooter 102 should fire the weapon 110. For example, if a tile in the prompt array 138 is illuminated blue, this portion of the training session may indicate that the shooter is to move to the blue tile in the prompt array 138 and fire the weapon 110 at blue targets in the virtual range 104.


An evaluation of the shooter 102 in a training session will include whether and/or how well the shooter 102 was able to move to the illuminated tile and select targets corresponding with the illuminated color of the tile. In some embodiments, this can be accomplished by evaluations computed by the computing processor 120. In some embodiments, the prompt array 138 may be formed using Reax Lights Pro available from Reaxing S.P.A of Milano, Italy.
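A drill of this kind could be sequenced and scored roughly as follows. The tile indices, color names, and scoring fields are illustrative assumptions rather than the evaluation the computing processor 120 necessarily performs; the sketch only pairs an illuminated tile color with targets of the same color and records whether the shooter reached the tile and engaged matching targets.

```python
import random
from dataclasses import dataclass, field

COLORS = ["blue", "red", "green"]  # assumed palette for the prompt array


@dataclass
class DrillStep:
    tile_index: int
    color: str
    reached_tile: bool = False
    matching_hits: int = 0
    mismatched_hits: int = 0


@dataclass
class Drill:
    tile_count: int
    steps: list[DrillStep] = field(default_factory=list)

    def next_prompt(self) -> DrillStep:
        """Illuminate a random tile in a random color and begin scoring against it."""
        step = DrillStep(tile_index=random.randrange(self.tile_count),
                         color=random.choice(COLORS))
        self.steps.append(step)
        return step

    def record_shot(self, target_color: str) -> None:
        """Credit hits on targets whose color matches the currently illuminated tile."""
        step = self.steps[-1]
        if target_color == step.color:
            step.matching_hits += 1
        else:
            step.mismatched_hits += 1

    def record_position(self, shooter_tile_index: int) -> None:
        """Note whether the shooter has reached the prompted tile at any point in the step."""
        step = self.steps[-1]
        step.reached_tile = step.reached_tile or shooter_tile_index == step.tile_index
```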


Referring now to FIG. 6, an alternative or additional embodiment may be implemented. FIG. 6 illustrates that at least a portion 116-1 of the footing surface 116 is implemented using a treadmill which can be reoriented in various directions and along various center lines of the treadmill. In particular, positive and negative inclinations can be performed along a frontal axis 140 and a lateral axis 142. Thus, the treadmill portion 116-1 can be tilted side to side, as well as front and back (incline/decline), in a predetermined fashion. In some embodiments, predetermined programming is implemented to vary the incline/decline along the axes 140 and 142. From the perspective of the shooter, the movements may be sudden and unpredictable inclinations in all directions while running or walking on the treadmill portion 116-1, even though the movements may be performed based on predetermined programming. In some embodiments, the programming may include random or pseudo random aspects as it concerns treadmill orientation. In some embodiments, the treadmill portion 116-1 of the footing surface 116 comprises one or more Reax Run Treadmills available from Reaxing S.P.A of Milano, Italy.
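The predetermined yet unpredictable programming described above can be sketched as a pseudo-random schedule of incline changes about the two axes. The angle limits, step interval, and seed handling below are assumptions made for the example; fixing the seed makes the sequence reproducible (predetermined) even though it appears random to the shooter on the treadmill portion 116-1.

```python
import random
from dataclasses import dataclass


@dataclass
class InclineCommand:
    time_s: float       # when to apply the command within the session
    frontal_deg: float  # inclination along the frontal axis 140
    lateral_deg: float  # inclination along the lateral axis 142


def build_incline_program(duration_s: float, step_s: float = 5.0,
                          max_deg: float = 10.0, seed: int = 7) -> list[InclineCommand]:
    """Build a reproducible, pseudo-random incline program for the treadmill portion."""
    rng = random.Random(seed)  # fixed seed: predetermined sequence, unpredictable to the shooter
    program: list[InclineCommand] = []
    t = 0.0
    while t < duration_s:
        program.append(InclineCommand(time_s=t,
                                      frontal_deg=rng.uniform(-max_deg, max_deg),
                                      lateral_deg=rng.uniform(-max_deg, max_deg)))
        t += step_s
    return program
```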


One embodiment illustrated herein includes a method of performing shooting training. The method includes dynamically reorienting a footing surface during a shooting session. The footing surface dynamically reorients independent of what is occurring downrange from the footing surface. That is, the footing surface reorientation is not connected to things that may be occurring in a virtual or live-fire range.


Embodiments of the method may further include displaying a virtual range downrange from the footing surface.


Embodiments of the method may further include detecting laser strikes from a weapon fired by a shooter. In some embodiments, detecting laser strikes comprises detecting laser strikes at a resolution of 1/16th of an inch or smaller on a virtual range when a shooter is shooting 10 feet from the virtual range. In an alternative embodiment, detecting laser strikes comprises detecting laser strikes at a resolution of 1/32nd of an inch or smaller on a virtual range when a shooter is shooting 10 feet from the virtual range.



Embodiments of the method may further include modeling long-distance rifle shots by introducing delay and bullet drop.


Embodiments of the method may further include performing a calibration operation for a virtual range.


Embodiments of the method may further include providing at least one of visual, tactile, or audio prompts to a shooter.


Further, the methods may be practiced by a computer system including one or more processors and computer-readable media such as computer memory. In particular, the computer memory may store computer-executable instructions that when executed by one or more processors cause various functions to be performed, such as the acts recited in the embodiments.


Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: physical computer-readable storage media and transmission computer-readable media.


Physical computer-readable storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage (such as CDs, DVDs, etc.), magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.


A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above are also included within the scope of computer-readable media.


Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission computer-readable media to physical computer-readable storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer-readable physical storage media at a computer system. Thus, computer-readable physical storage media can be included in computer system components that also (or even primarily) utilize transmission media.


Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.


Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.


Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.


The present invention may be embodied in other specific forms without departing from its characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. A shooting training apparatus comprising: a screen configured to display a virtual range; and a footing surface configured to dynamically reorient during a shooting session, wherein the footing surface dynamically reorients independent of what is displayed in the virtual range.
  • 2. The shooting training apparatus of claim 1, further comprising a projector configured to project the virtual range onto the screen.
  • 3. The shooting training apparatus of claim 1, further comprising a detector configured to detect laser strikes from a weapon used by a shooter using the training apparatus.
  • 4. The shooting training apparatus of claim 1, wherein the screen comprises light emitters in the screen to display the virtual range, and detectors interspersed with the light emitters, the detectors configured to detect laser strikes from a weapon used by a shooter using the training apparatus.
  • 5. The shooting training apparatus of claim 1, further comprising a computing processor configured to model at least one of range or weapon characteristics.
  • 6. The shooting training apparatus of claim 5, wherein the computing processor is configured to model long-distance rifle shots by introducing delay and bullet drop.
  • 7. The shooting training apparatus of claim 5, wherein the computing processor is configured to determine what targets bullets from a live-fire weapon would have struck.
  • 8. The shooting training apparatus of claim 5, wherein the computing processor is configured to calibrate the shooting training apparatus by observing where laser strikes occur when a shooter shoots calibration shots from different locations on the footing surface.
  • 9. The shooting training apparatus of claim 5, wherein the computing processor is configured to control at least one of visual, tactile, or audio prompts for the shooting training apparatus.
  • 10. The shooting training apparatus of claim 1, further comprising a detector configured to detect laser strikes from a weapon used by a shooter using the training apparatus, the detector having a resolution of at least 1/16th of an inch or smaller when a shooter shoots 10 feet from the virtual range.
  • 11. The shooting training apparatus of claim 1, further comprising a detector configured to detect laser strikes from a weapon used by a shooter using the training apparatus, the detector having a resolution of at least 1/32nd of an inch or smaller when a shooter shoots 10 feet from the virtual range.
  • 12. The shooting training apparatus of claim 1, further comprising a prompt array coupled to the footing surface, the prompt array configured to direct a shooter's movements on the footing surface.
  • 13. The shooting training apparatus of claim 1, further comprising a weapon, the weapon being configured to emit a plurality of laser pulses per primer strike.
  • 14. The shooting training apparatus of claim 1, wherein at least a portion of the footing surface comprises a reorientable treadmill that can reorient along at least two center line axes of the portion of the footing surface.
  • 15. A method of performing shooting training, the method comprising: dynamically reorienting a footing surface during a shooting session, wherein the footing surface dynamically reorients independent of what is occurring downrange from the footing surface.
  • 16. The method of claim 14, further comprising displaying a virtual range downrange from the footing surface.
  • 17. The method of claim 14, further comprising detecting laser strikes from a weapon fired by a shooter.
  • 18. The method of claim 16, wherein detecting laser strikes comprises detecting laser strikes at a resolution of 1/16th of an inch or less on a virtual range when a shooter is shooting 10 feet from the virtual range.
  • 19. The method of claim 16, wherein detecting laser strikes comprises detecting laser strikes at a resolution of 1/32nd of an inch or less on a virtual range when a shooter is shooting 10 feet from the virtual range.
  • 20. The method of claim 14, further comprising detecting laser strikes from a weapon fired by a shooter.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 63/623,730 filed on Jan. 22, 2024, and entitled “METHODS AND SYSTEMS FOR SIMULATED TARGET PRACTICE,” and which application is expressly incorporated herein by reference in its entirety.

Provisional Applications (1)
Number Date Country
63623730 Jan 2024 US