1. Technical Field
The present invention pertains to firearm training systems, such as those disclosed in U.S. Pat. No. 6,322,365 (Shechter et al) and U.S. patent application Ser. No. 09/761,102, entitled “Firearm Simulation and Gaming System and Method for Operatively Interconnecting a Firearm Peripheral to a Computer System” and filed Jan. 16, 2001; Ser. No. 09/760,610, entitled “Laser Transmitter Assembly Configured For Placement Within a Firing Chamber and Method of Simulating Firearm Operation” and filed Jan. 16, 2001; Ser. No. 09/760,611, entitled “Firearm Laser Training System and Method Employing Modified Blank Cartridges for Simulating Operation of a Firearm” and filed Jan. 16, 2001; Ser. No. 09/761,170, entitled “Firearm Laser Training System and Kit Including a Target Structure Having Sections of Varying Reflectivity for Visually Indicating Simulated Projectile Impact Locations” and filed Jan. 16, 2001; Ser. No. 09/862,187, entitled “Firearm Laser Training System and Method Employing an Actuable Target Assembly” and filed May 21, 2001; and Ser. No. 09/878,786, entitled “Firearm Laser Training System and Method Facilitating Firearm Training With Various Targets and Visual Feedback of Simulated Projectile Impact Locations” and filed Jun. 11, 2001. The disclosures of the above-mentioned patent and patent applications are incorporated herein by reference in their entireties. In particular, the present invention pertains to a firearm laser training system that simulates conditions of extended range targets to facilitate firearm training for these types of targets.
2. Discussion of the Related Art
Firearms are utilized for a variety of purposes, such as hunting, sporting competition, law enforcement and military operations. The inherent danger associated with firearms necessitates training and practice in order to minimize the risk of injury. However, special facilities are required to facilitate practice of handling and shooting the firearm. These special facilities tend to provide a sufficiently sized area for firearm training, where the area required for training may become quite large, especially for sniper type or other firearm training with extended range targets. The facilities further confine projectiles propelled from the firearm within a prescribed space, thereby preventing harm to the surrounding environment. Accordingly, firearm trainees are required to travel to the special facilities in order to participate in a training session, while the training sessions themselves may become quite expensive since each session requires new ammunition for practicing handling and shooting of the firearm.
The related art has attempted to overcome the above-mentioned problems by utilizing laser or light energy with firearms to simulate firearm operation and indicate simulated projectile impact locations on targets. For example, U.S. Pat. No. 4,164,081 (Berke) discloses a marksman training system including a translucent diffuser target screen adapted for producing a bright spot on the rear surface of the target screen in response to receiving a laser light beam from a laser rifle on the target screen front surface. A television camera scans the rear side of the target screen and provides a composite signal representing the position of the light spot on the target screen rear surface. The composite signal is decomposed into X and Y Cartesian component signals and a video signal by a conventional television signal processor. The X and Y signals are processed and converted to a pair of proportional analog voltage signals. A target recorder reads out the pair of analog voltage signals as a point, the location of which is comparable to the location on the target screen that was hit by the laser beam.
U.S. Pat. No. 5,281,142 (Zaenglein, Jr.) discloses a shooting simulation training device including a target projector for projecting a target image in motion across a screen, a weapon having a light projector for projecting a spot of light on the screen, a television camera and a microprocessor. An internal device lens projects the spot onto a small internal device screen that is scanned by the camera. The microprocessor receives various information to determine the location of the spot of light with respect to the target image. In addition, when longer ranges are simulated, a lookup table can include information concerning the trajectory of a projectile fired by any simulated cartridge. This provides information to enable display of the amount the projectile falls, and, thereby, the amount the weapon muzzle should be held above the target at any given simulated distance as well as the amount of lead required for the moving target at such a distance.
U.S. Pat. No. 5,366,229 (Suzuki) discloses a shooting game machine including a projector for projecting a video image that includes a target onto a screen. A player may fire a laser gun to emit a light beam toward the target on the screen. A video camera photographs the screen and provides a picture signal to coordinate computing means for computing the X and Y coordinates of the beam point on the screen.
International Publication No. WO 92/08093 (Kunnecke et al.) discloses a small arms target practice monitoring system including a weapon, a target, a light-beam projector mounted on the weapon and sighted to point at the target, a camera directed toward the target and a processor. An evaluating unit is connected to the camera to determine the coordinates of the spot of light on the target. The processor is connected to the evaluating unit and receives the coordinate information. The processor further displays the spot on a target image on a display screen.
The systems described above suffer from several disadvantages. In particular, the Berke, Zaenglein, Jr. and Suzuki systems employ particular targets or target scenarios, thereby limiting the types of firearm training activities and simulated conditions provided by those systems. Further, the Berke system utilizes both front and rear target surfaces during operation. This restricts placement of the target to areas having sufficient space for exposure of those surfaces to a user and the system. The Berke and Kunnecke et al. systems merely display impact locations to a user, thereby requiring a user to interpret the display to assess user performance during an activity. The assessment is typically limited to the information provided on the display, thereby restricting feedback of valuable training information to the user and limiting the training potential of the system. In addition, the Berke, Suzuki and Kunnecke et al. systems generally do not simulate training for extended range targets, thereby requiring trainees to travel to special facilities and/or utilize a large area to conduct such training as described above. The Zaenglein, Jr. system may simulate targets at longer ranges. However, this system does not account for actual environmental conditions (e.g., temperature, wind, weather, etc.) within the simulation that affect projectile trajectory. Thus, the realism of the simulation is limited, thereby substantially reducing the system training potential.
Accordingly, it is an object of the present invention to conduct firearm training with extended range targets in a confined area having dimensions substantially less than the extended range of the targets.
It is another object of the present invention to conduct firearm training with extended range targets via a firearm laser training system simulating actual environmental conditions and the projectile trajectory resulting from those conditions.
Yet another object of the present invention is to employ various targets scaled to varying ranges within a firearm laser training system to conduct desired training procedures for extended range targets.
Still another object of the present invention is to employ a target in the form of a display screen with a firearm laser training system to present various targets and/or scenarios during training.
A further object of the present invention is to assess user performance within a firearm laser training system by determining scoring and/or other performance information based on detected impact locations of simulated projectiles on a target.
Yet another object of the present invention is to employ an electronic laser filter within a firearm laser training system to minimize false detections of simulated projectile impact locations on a target.
The aforesaid objects may be achieved individually and/or in combination, and it is not intended that the present invention be construed as requiring two or more of the objects to be combined unless expressly required by the claims attached hereto.
According to the present invention, a firearm laser training system includes a target assembly, a laser transmitter assembly that attaches to a firearm, a detection device configured to scan the target and detect beam impact locations thereon, and a processor in communication with the detection device. The system simulates targets at extended ranges and accounts for various environmental and other conditions (e.g., wind, temperature, etc.) affecting projectile trajectory that may be encountered during actual firing. The training may be conducted within a confined area, typically having dimensions substantially less than the extended range of the targets. The target assembly may include a target in the form of a target image, or in the form of a display screen displaying a target, a target scenario and/or environmental conditions (e.g., wind, weather, etc.). The detection device captures images of the target for processing by the processor to determine beam impact locations. The processor applies various offsets to the beam impact locations to account for the various conditions and determine the impact locations relative to the target. The processor displays an image of the target including the determined impact locations and further evaluates user performance by providing scoring and/or other information that is based on those impact locations. An electronic laser filter may be employed by the system to minimize false detections of beam impact locations on the target. In addition, the system may be compact and portable to facilitate ease of use in a variety of different environments.
The above and still further objects, features and advantages of the present invention will become apparent upon consideration of the following detailed description of specific embodiments thereof, particularly when taken in conjunction with the accompanying drawings wherein like reference numerals in the various figures are utilized to designate like components.
A firearm laser training system for extended range targets according to the present invention is illustrated in
Computer system 18 is typically implemented by a conventional IBM-compatible or other type of personal computer (e.g., laptop, notebook, desktop, mini-tower, Apple Macintosh, palm pilot, etc.) preferably equipped with a base 52 (e.g., including the processor, memories, and internal or external communication devices or modems), a display or monitor 54, a keyboard 56 and an optional mouse (not shown). The computer system preferably utilizes a Windows 95/98/NT/2000 platform; however, any of the major platforms (e.g., Linux, Macintosh, Unix or OS2) may be employed. Further, the system includes components (e.g., a processor, disk storage or hard drive, etc.) having sufficient processing and storage capabilities to effectively execute the software for the training system. The software is typically in the form of a Windows 95/98/NT/2000 application.
The laser transmitter assembly utilized in the present invention is typically similar to the laser transmitter assembly described in U.S. patent application Ser. No. 09/760,611. An exemplary laser transmitter assembly employed by the training system firearm is illustrated in
Laser module 4 includes a housing 25 including receptacles or other engagement members defined therein (not shown) for attaching the laser module to the base member bottom surface. The laser module components are disposed within the housing and include a power source 27, typically in the form of batteries, a mechanical wave sensor 29 and an optics package 31 including a laser (not shown) and a lens 33. These components may be arranged within the housing in any suitable fashion. The optics package emits laser beam 11 through lens 33 toward target assembly 10 or other intended target in response to detection of trigger actuation by mechanical wave sensor 29. Specifically, when trigger 7 is actuated, the firearm hammer impacts the firearm and generates a mechanical wave that travels distally along barrel 8 toward bracket 3. As used herein, the term “mechanical wave” or “shock wave” refers to an impulse traveling through the firearm barrel. Mechanical wave sensor 29 within the laser module senses the mechanical wave from the hammer impact and generates a trigger signal. The mechanical wave sensor may include a piezoelectric element, an accelerometer or a solid state sensor, such as a strain gauge. Optics package 31 within the laser module generates and projects laser beam 11 from firearm 6 in response to the trigger signal. The optics package laser is generally enabled for a predetermined time interval sufficient for the target assembly to detect the beam. The beam may be coded, modulated or pulsed in any desired fashion. Alternatively, the laser module may include an acoustic sensor to sense actuation of the trigger and enable the optics package. The laser module is similar in function to the laser devices disclosed in the aforementioned patent and patent applications. The laser assembly may be constructed of any suitable materials and may be fastened to firearm 6 at any suitable locations by any conventional or other fastening techniques.
The target assembly for detecting laser beam impact locations is illustrated in
Base unit 42 includes a detection device 60, an optional barcode reader 61 (
The inside area of the cover unit is made rigid and covered with a plastic material to make a smooth, visually appealing surface. A target display area 70 is located on the left half of the inside of the cover unit (e.g., as viewed in
In addition, any quantity of imagery components (e.g., shrubs, backgrounds, rocks, buildings, etc.) may be added to the target scenario by simply adding them to the target display area. These imagery components are typically smaller in dimension than the larger target, and may be trimmed around their border and stacked on top of the current target. This essentially allows the end-user to customize a particular training scenario by simply sticking these scenery components on an existing target (e.g., partially obscure an engageable enemy by placing a boulder imagery component over the lower part of the enemy's body, etc.). Alternatively, background overlays may be integrated into the printed targets themselves. The overlays may be in the form of illustrations or digital images captured from actual mission sites via a standard or digital camera. Atmospheric conditions may also be indicated by the addition of indicators using the same stacking method (e.g., providing flags to indicate wind direction and speed, etc.).
Base unit 42 includes foam insulation 48 within the case. The foam insulation may be arranged within the base unit to form pockets or open compartments for containing various system accessories (e.g., software documentation, etc.). Moreover, the base unit typically includes a compartment 43 to contain computer system 18 in the form of a laptop computer configured with system software. The case is typically positioned in a horizontal position during system operation, with longer dimension sides of the base unit contacting a support surface (e.g., table, ground, floor, etc.) and the cover unit being in a vertical open and locked position substantially perpendicular to the base unit, thereby exposing the target area to the user.
Barcode reader 61 is typically disposed within a compartment formed by the foam insulation in the base unit (
Detection device 60 is housed within base unit 42 and includes a mounting unit and a USB cable. The detection device is pointed at the target display area and positioned such that laser beam hits on the target display area may be detected and processed by the detection device. By way of example, the detection device is a CCD or CMOS image sensor utilizing a USB interface and employed as a digital camera. Base unit 42 includes foam insulation support member 49 that substantially covers the bar code reader and supports detection device 60 in a position overlying the barcode reader within the base unit. The mounting unit for the detection device is typically a multidirectionally adjustable unit that allows for alignment of the detection device in multiple planes and rotations. For example, the mounting unit may contain a multi-axis geared tripod head with ball joints at both ends to allow for horizontal, vertical, rotational and angular adjustments of the detection device with respect to support member 49. The detection device detects laser beam hits on the target area and generates appropriate detection signals in the form of captured images which are transmitted to the computer system via the USB interface (e.g., the USB hub, USB cable and/or USB extension devices). The computer system analyzes the detection signals received from the detection device and provides feedback information via display monitor 54 and/or a printer (not shown). The detection device and computer system operate to capture and process images and detect beam impact locations on the target within these images in substantially the same manner disclosed in U.S. patent application Ser. No. 09/878,786. Computer system 18 may be selected to include enhanced processing power, thereby enabling processing of higher resolution images (e.g., including greater quantities of pixels or bits) for enhanced accuracy.
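By way of illustration only, the following C++ sketch shows one simple manner in which a captured frame could be processed to locate a beam impact, namely by taking the centroid of pixels brighter than a threshold. The frame layout and threshold value are assumptions made for this example; the actual detection processing is that described in the incorporated application Ser. No. 09/878,786.

#include <cstddef>
#include <cstdint>
#include <optional>
#include <vector>

struct PixelLocation { double x; double y; };

// Locate a beam impact in an 8-bit grayscale frame (row-major) by computing
// the centroid of pixels whose intensity exceeds a threshold.
std::optional<PixelLocation> findBeamImpact(const std::vector<std::uint8_t>& frame,
                                            int width, int height,
                                            std::uint8_t threshold = 240)
{
    double sumX = 0.0, sumY = 0.0;
    long count = 0;
    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; ++col) {
            if (frame[static_cast<std::size_t>(row) * width + col] >= threshold) {
                sumX += col;
                sumY += row;
                ++count;
            }
        }
    }
    if (count == 0) return std::nullopt;   // no laser spot found in this frame
    return PixelLocation{ sumX / count, sumY / count };
}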
Target images are scaled in order to simulate ranges from approximately twenty-five meters to approximately one-thousand meters. A target image may be available in an image set having images scaled for particular simulated ranges which may be further expanded by modifying user training distances. The scaling of targets is a linear function of perspective. Accordingly, the combination of modifying the printed scale of the target with the distance the user is from the target (i.e., the “training distance”) reduces the number of printed targets required to achieve a variety of simulated distances. The system performs appropriate calculations to simulate any desired range, while a user projects a beam from the firearm at a distance corresponding to the selected scaled target.
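By way of illustration only, the linear perspective relationship described above may be expressed as follows: a target printed at a reduced scale and viewed from the training distance subtends the same angle as a full-size target at a proportionally greater range. The function name and the 1/40-scale figure are illustrative.

#include <iostream>

// A target printed at a reduced scale subtends the same angle as a full-size
// target at a proportionally greater range (linear perspective).
double simulatedRangeMeters(double trainingDistanceMeters, double printedScale)
{
    // printedScale is the ratio of printed target size to full target size,
    // e.g. 0.025 for a 1/40-scale print.
    return trainingDistanceMeters / printedScale;
}

int main()
{
    // A 1/40-scale target engaged from 25 meters simulates a 1000 meter target.
    std::cout << simulatedRangeMeters(25.0, 1.0 / 40.0) << " m\n";
    return 0;
}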
In order to enable a user to be positioned a proper distance from a scaled target, the system may further include a conventional laser range finder. This device determines distance between objects based on transmission and reception of a laser beam. Basically, the device is transported to a location and directed toward the target to enable the device to determine the location distance from the target. Thus, the device rapidly determines a user or shooter position appropriately distanced from the target for a training session. Further, the simulated target distances may be easily modified, while the range device provides the appropriate location sufficiently distanced from the target for the modified target distance. In other words, the range finder basically automates the process of manually determining a position located an appropriate distance from the target to conduct a training session. The range finder may be disposed with the system in case 40 for storage.
In order to account for and simulate various conditions (e.g., distance, environmental conditions and any other appropriate factors), the computer system calculates cumulative offsets of the beam impact location for both the “x” and “y” location coordinates on the target display area. The offsets are applied using the proper scale for the displayed image on the computer system. The offsets are further calculated such that they produce the same effects as would be present if the user fired live ammunition in a real or “live” scenario. Thus, the system of the present invention is capable of selectively replicating conditions that affect “live” exercises and requires the user to utilize the same skill sets and procedures that would be required during such “live” exercises.
A user adjusts scope 16 to account for varying ranges and atmospheric conditions. In order to simulate targets at extended ranges in a confined area, computer system 18 determines a target offset based on target range and conditions entered by the user or other operator (e.g., instructor, training administrator, etc.). The computer system determines a target impact location by applying the offset to the impact locations determined from the images captured by the detection device. In response to a user adjusting scope 16 for specified conditions, the point of aim of the firearm for the target image is offset and the emitted laser beam effectively impacts the target display area offset from the intended site on the target image. The computer system determines the impact location with respect to the target image in accordance with the offset and beam impact locations derived from the captured images, and provides a display indicating the determined impact location with respect to the target as described below. The determined target impact locations are generally displayed by the computer system to the user, while the actual beam impact locations on the target are typically not residually visible on the target display area since a short pulse is emitted by the laser transmitter assembly.
The system may be utilized with various types of target images. Target characteristics are contained in files that are stored on computer system 18. In particular, a desired target image is photographed and/or scanned prior to system utilization to produce target files and target information. The target files include a parameter file, a display and print image file and a scoring image file. The parameter file includes information to enable the computer system to control system operation. By way of example only, the parameter file may include the filenames of the display and scoring files, a scoring factor, simulated range and cursor information (e.g., for indicating determined target impact locations). Indicia, preferably in the form of substantially circular icons, are overlaid on these images to indicate determined target impact locations, and typically include an identifier to indicate the particular shot (e.g., the position number of the shot within a shot sequence). The scoring image is a scaled image of the target having sections or zones shaded with different colors. The colors are each associated with a corresponding value to determine a user score and the target priorities. When impact location information or captured images are received from the detection device, computer system 18 determines the target impact locations (e.g., the impact locations derived from the captured images with appropriate offsets applied thereto) and translates that information to coordinates within the scoring image. The color associated with the image location identified by the translated coordinates indicates a corresponding scoring value. In effect, the color scoring image functions as a look-up table to provide a scoring value based on coordinates within the image pertaining to a particular determined target impact location. The value of a determined target impact location may be multiplied by the scoring factor within the parameter file to provide scores compatible with various organizations and/or scoring schemes. Thus, the scoring of the system may be adjusted by modifying the scoring factor within the parameter file.
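By way of illustration only, the color look-up described above may be sketched as follows, where the mapping of colors to base values and the scoring image layout are assumptions made for this example.

#include <cstddef>
#include <cstdint>
#include <map>
#include <tuple>
#include <vector>

// Packed 24-bit RGB color as stored in the scoring image.
struct Rgb {
    std::uint8_t r, g, b;
    bool operator<(const Rgb& other) const {
        return std::tie(r, g, b) < std::tie(other.r, other.g, other.b);
    }
};

// The scoring image acts as a look-up table: the color of the pixel at the
// translated impact coordinates yields a base value, which is multiplied by
// the scoring factor from the parameter file.
double scoreForImpact(const std::vector<Rgb>& scoringImage, int imageWidth,
                      int xPixel, int yPixel,
                      const std::map<Rgb, double>& colorToValue,
                      double scoringFactor)
{
    const Rgb color = scoringImage[static_cast<std::size_t>(yPixel) * imageWidth + xPixel];
    const auto entry = colorToValue.find(color);
    const double baseValue = (entry != colorToValue.end()) ? entry->second : 0.0;  // unknown color scores zero (miss)
    return baseValue * scoringFactor;
}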
The produced files along with scaling and other information (e.g., produced based on user information, such as range) are stored on computer system 18 for use during system operation. In addition, target files may be downloaded from a network, such as the Internet, and loaded into the computer system to enable the system to access and be utilized with additional targets.
Computer system 18 includes software to control system operation and provide a graphical user interface for displaying user performance. The software is preferably implemented in the Delphi Pascal computer language, but may be developed in any suitable computer language, such as ‘C++’. The manner in which the computer system monitors beam impact locations and provides information to a user is illustrated in
When properly aligned and of correct size, the center horizontal alignment guide should coincide with the horizontal line intersecting the center of the target and be equal in width to the detectable target area in that position. Essentially, the user will typically see a trapezoidal image of the target on the display, with the larger end at the bottom being consistent with standard perspective. A slight curvature may occur at the edges of the target display due to the shape of any lenses on the detection device. Upon proper alignment of the detection device with the detectable area, suitable targets may be used for normal operation of the system. The calibration is typically performed at system initialization, but may be initiated by a user via computer system 18. Subsequently, the particular range, atmospheric and other conditions are entered into the computer system at step 102. The computer system may display a set-up or other screen in response to the entered conditions. An exemplary graphical user screen for facilitating the entry of atmospheric and other conditions is illustrated in
Once the target is positioned, a user may commence projecting the laser beam from the firearm toward the target assembly. The user adjusts scope 16 in accordance with the entered conditions and actuates the firearm to project a laser beam at target image 80 (
The computer system determines the impact location with respect to the target image at step 108 and applies the calibration offset and a trajectory offset at step 110 determined from the entered conditions as well as any system or user defined offsets. In other words, the computer system determines an overall offset between the point of aim and point of impact and applies the offset to the impact locations derived from the captured images (e.g., overall X and Y offsets are respectively applied to the X and Y coordinates of the impact locations) to simulate impact on the target image. In particular, computer system 18 stores various tables each having information relating to the particular firearm, ballistics and conditions employed for the training activity. The computer system may also store and utilize additional offsets derived from user input, target definition field, or any other source. Computer system 18 utilizes this information to determine the calculated trajectory offset of an actual projectile propelled from the firearm and seeks to replicate the offset between the point of aim and the point of impact. The trajectory and calibration offsets are applied to the derived impact locations to determine the point of impact with respect to the target image. The computer system may utilize a ballistic modeling program or module independent of the system software, such as a user defined input (e.g., a shooter's data card derived from a “live fire” experience) or any other method that provides information for the tables pertaining to a particular scenario. In an exemplary embodiment, the computer system includes a ballistic software interface that intercepts ballistic data written to a window display of the computer system by a conventional ballistic calculation or other program running simultaneously with other system software. The interface copies the intercepted data and stores the copied data within an appropriate database or other file in the computer system so that the data can be utilized to calculate adjusted impact positions on targets due to ballistic effect and other conditions. The stored data may be retrieved from within the system and utilized for virtually any bullet type or caliber. The ballistics program and interface are typically executed prior to a session to generate the tables.
The conditions are entered into the system (e.g., by a user, an appropriate interface, etc.) and provided to the ballistics module in order to produce a table having trajectory offsets for X and Y coordinates due to the conditions. The offsets are combined with the derived impact locations to determine impact locations relative to the target image. Alternatively, the ballistics module may be incorporated into the system software and automatically produce tables having trajectory offsets. When similar conditions are entered, the system searches the tables for those criteria to ascertain the appropriate trajectory offsets. The computer system may further include pull-down menus or other user interfaces to enable users to select various condition parameters (e.g., wind velocity, wind direction, temperature, altitude, barometric pressure, humidity, slope, etc.), while the ballistic module utilizes this information to provide information for the tables to determine trajectory offsets. The ballistic module may initially utilize a commercially available software package and may further be adapted to accommodate data supplied by the user. The ballistic module may also use calculations or formulas to determine offsets, with or without the production of tables (e.g., Ingalls-Mayevski ballistic calculation formula, standard published or unpublished formulas, custom developed calculations or any other source).
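By way of illustration only, the following sketch shows one manner in which such a table could be searched for entered conditions and the resulting trajectory offsets combined with a derived impact location. The condition fields, the nearest-match search and the unit choices are assumptions made for this example; the actual system may interpolate or compute offsets directly from a ballistic formula as described above.

#include <cmath>
#include <vector>

// Shooting conditions used as the table key (fields are illustrative; the
// selectable parameters described above include wind velocity and direction,
// temperature, altitude, barometric pressure, humidity and slope).
struct Conditions {
    double rangeMeters;
    double windSpeedMps;
    double windDirectionDeg;
    double temperatureC;
};

struct TrajectoryOffset { double dx; double dy; };   // offsets in target-plane units

struct TableEntry { Conditions conditions; TrajectoryOffset offset; };

// Return the offsets of the table entry whose conditions most closely match
// the entered conditions (a simple nearest-neighbor search).
TrajectoryOffset lookupOffset(const std::vector<TableEntry>& table, const Conditions& c)
{
    TrajectoryOffset best{0.0, 0.0};
    double bestDistance = 1e300;
    for (const TableEntry& entry : table) {
        const double d = std::hypot(entry.conditions.rangeMeters - c.rangeMeters,
                                    entry.conditions.windSpeedMps - c.windSpeedMps)
                       + std::fabs(entry.conditions.windDirectionDeg - c.windDirectionDeg)
                       + std::fabs(entry.conditions.temperatureC - c.temperatureC);
        if (d < bestDistance) { bestDistance = d; best = entry.offset; }
    }
    return best;
}

// The overall X and Y offsets (trajectory plus calibration) are applied to the
// impact location derived from the captured image to obtain the target impact.
struct Point { double x; double y; };

Point applyOffsets(Point derivedImpact, TrajectoryOffset trajectory, Point calibration)
{
    return Point{ derivedImpact.x + trajectory.dx + calibration.x,
                  derivedImpact.y + trajectory.dy + calibration.y };
}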
In addition, the trajectory information may be supplied from a user and include data measured from live fire at specified distances or ranges. This information is typically maintained for the firearm in a shooter's data card. The computer system may generate the data card for an individual weapon and may utilize this information to determine trajectory offsets, to produce training scenarios and/or scoring in accordance with actual firearm performance. Further, the user may selectively modify trajectory offsets generated by the computer system to correspond with information maintained in the firearm data card.
The computer system includes target files including target information and scaled images as described above. Since the scaling of the scoring/zoning and display images is predetermined, the computer system translates the target impact location (e.g., derived impact location with applied offset) into the respective scoring/zoning and display image coordinate spaces at step 112. Basically, the scoring/zoning and display images each utilize a particular quantity of pixels for a given measurement unit (e.g., millimeter, centimeter, etc.). The pixel quantities of each of the scoring and display images are applied to the target location to produce translated coordinates within each of those coordinate spaces, and optionally an offset may be applied to the coordinates to accommodate target scale, positioning, etc.
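By way of illustration only, the translation into an image coordinate space may be sketched as follows, with millimeters assumed as the measurement unit. The same translation is performed once with the scoring/zoning image scale and once with the display image scale.

struct ImageCoordinates { int x; int y; };

// Translate a target impact location expressed in physical units (assumed here
// to be millimeters from the target origin) into a given image's pixel space
// using that image's pixels-per-unit quantity, with an optional pixel offset
// for target scale and positioning. Coordinates are assumed non-negative.
ImageCoordinates toImageSpace(double targetXmm, double targetYmm,
                              double pixelsPerMm,
                              int offsetXpixels = 0, int offsetYpixels = 0)
{
    return ImageCoordinates{
        static_cast<int>(targetXmm * pixelsPerMm + 0.5) + offsetXpixels,   // round to nearest pixel
        static_cast<int>(targetYmm * pixelsPerMm + 0.5) + offsetYpixels
    };
}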
Computer system 18 determines appropriate offsets and beam impact locations relative to a target positioned at any location on the target display area. Thus, this configuration may determine beam impact locations without requiring precise placement of the target image. In addition, the target assembly may facilitate use of multiple target images, thereby enabling a greater range of training activities, assignment of priority to each target, and classification as enemy, friendly, non-engageable or any other category.
The translated coordinates for the scoring/zoning image are utilized to determine the results for the target impact at step 114. Specifically, the translated coordinates identify a particular location within the scoring/zoning image. Various sections of the scoring/zoning image are color coded to indicate a value or classification associated with that section as described above. The color of the location within the scoring image identified by the translated coordinates is ascertained to indicate the classification of the target impact to determine hit/miss, appropriateness of individual target selection (when more than one object of interest exists in a given scenario) and evaluation of sequence in which the targets are engaged (fired upon). The zoning factor within the parameter file is applied as specified in the associated parameter file for each target to determine a score or other evaluation for the target impact. The score and other impact information is determined and stored in a database or other storage structure, while a computer system display showing the target is updated to illustrate the target impact location and other information at step 116. Types of information that may be displayed include, without limitation, shot group size, center of mass, time interval between shots, natural dispersion, mean point of impact, offset of impact from center of target (e.g., quantity of units above, below, left or right of target, specific to individual targets when more than one object of interest exists), impact score, cumulative score, etc. The display image is displayed, while the target impact location is identified by indicia that are overlaid with the display image and placed in an area encompassing the translated display image coordinates. Further, the display may include a graphic overlay having a scaled minute of angle grid (
If a round or session of firearm activity is not complete as determined at step 118, the user continues actuation of the firearm and the system detects target impact locations and determines information as described above. However, when a round or session is determined to be complete at step 118, the computer system retrieves information from the database and determines information pertaining to the session at step 120. The computer system may further determine grouping circles. These are generally utilized on shooting ranges where projectile impacts through a target must all be within a circle of a particular diameter. The computer system may analyze the target impact information and provide groupings and other information on the display that is typically obtained during activities performed on firing ranges (e.g., dispersion, etc.). The grouping circle and target impact location indicia are typically overlaid with the display image and placed in areas encompassing the appropriate coordinates of the display image space in substantially the same manner described above.
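By way of illustration only, the following sketch computes a mean point of impact and a group size (extreme spread) for a set of target impacts. Comparing the spread to a required diameter is used here as a simple stand-in for the grouping-circle check described above; the actual system may instead compute a true minimum enclosing circle.

#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

struct Impact { double x; double y; };   // target impact location in target units

// Mean point of impact of a group (average of the impact coordinates).
Impact meanPointOfImpact(const std::vector<Impact>& group)
{
    if (group.empty()) return Impact{0.0, 0.0};
    double sx = 0.0, sy = 0.0;
    for (const Impact& p : group) { sx += p.x; sy += p.y; }
    const double n = static_cast<double>(group.size());
    return Impact{ sx / n, sy / n };
}

// Group size as extreme spread: the largest center-to-center distance between
// any two impacts in the group.
double groupSize(const std::vector<Impact>& group)
{
    double maxDistance = 0.0;
    for (std::size_t i = 0; i < group.size(); ++i)
        for (std::size_t j = i + 1; j < group.size(); ++j)
            maxDistance = std::max(maxDistance,
                                   std::hypot(group[i].x - group[j].x,
                                              group[i].y - group[j].y));
    return maxDistance;
}

// Approximate grouping-circle check: the spread must not exceed the diameter.
bool withinGroupingCircle(const std::vector<Impact>& group, double circleDiameter)
{
    return groupSize(group) <= circleDiameter;
}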
When a report is desired as determined at step 122, the computer system retrieves the appropriate information from the database and generates a report for printing at step 124. The report includes the print image, while target impact location coordinates are retrieved from the database and translated to the print image coordinate space. The translation is accomplished utilizing the pixel quantity for a given measurement unit of the print image in substantially the same manner described above. The target impact locations are identified by indicia that are overlaid with the print image and placed in an area encompassing the translated print image coordinates as described above for the display. The size of impact identifying indicia displayed on the target image may be selected to correspond with a shot size representative of a round of ammunition for a particular firearm utilized in a training scenario. The report further includes various information pertaining to user performance (e.g., score, dispersion, mean point of impact, offset from center, etc.). When another session is desired, and a calibration is requested at step 128, the computer system performs the calibration at step 100 and the above process of system operation is repeated. Similarly, the above process of system operation is repeated from step 104 when another session is desired without performing a calibration. System operation terminates upon completion of the training or qualification activity as determined at step 126.
Operation of the system is described with reference to
The firearm laser training system described above may alternatively include a target assembly with a display screen to present various targets during a training session as illustrated in
Target controller 168 may be implemented by any processor or computer system (e.g., the type of system described above for computer system 18) and is typically controlled by computer system 18 to facilitate display of targets. Target controller 168 and computer system 18 each typically include a wireless communications device (e.g., employing radio frequency (RF) signals) to enable communications between these devices via a network 172 (e.g., LAN, WAN, Internet, Intranet, etc.). Alternatively, target controller 168 and computer system 18 may access the network and/or directly communicate with each other via any suitable communications medium (e.g., wireless, wired, LAN, WAN, Internet, etc.). The wireless communication enables placement of computer system 18 near a user without utilization of the cables and USB extension devices described above for
The target controller controls display screen 170 to display a target in accordance with control signals from computer system 18. Basically, the user selects the desired target or target scenario on computer system 18 and the computer system instructs the target controller to display the selected targets on display screen 170 for the training session. The system may display targets in the form of target images, or videos showing moving targets or various scenarios (e.g., objects in a particular environment, etc.). Further, the videos may show actual shooting conditions (e.g., flags indicating wind, temperature, weather, etc.) to enable a user to identify those conditions to adjust the firearm accordingly for a training session. The images or video may be stored on the target controller or computer system 18, or be retrieved from a network site (e.g., a server system residing on the Internet). Moreover, the target controller may adjust or re-size a target image or video (e.g., zoom in or zoom out) to accommodate training at various ranges. In other words, the system may be utilized to simulate various ranges by adjusting the size of the target image or video on the display screen.
In operation, a user initially prepares the target assembly and calibrates the system as described above (e.g., the calibration target may be placed over the display screen, or the display screen may display an image of the calibration target). The desired targets for display are subsequently selected via computer system 18, and the user moves to a position an appropriate distance from the target for the training session. The user may enter the desired conditions or determine the conditions from the scenario displayed on the display screen. The user adjusts the firearm in accordance with the particular conditions and actuates the trigger to project a laser beam toward the displayed target and onto the screen. The detection device captures target images and transmits the captured images to computer system 18 for processing in substantially the same manner described above to determine target impact locations. The computer system displays the target image with target impact locations indicated thereon and additional information concerning the session to the user as described above.
In order to enable an instructor to control a training session, the system may further include an instructor computer system 180. The instructor computer system is substantially similar to computer system 18 and includes a wireless communication device to communicate with controller 168 via network 172. Thus, the instructor system may be local to or remote from the training location. The instructor system enables an instructor to enter the shooting conditions (e.g., via a screen similar to
The various conditions and other parameters for a training session may be entered at computer system 18 and/or instructor system 180, while these systems may display any desired information. For example, computer system 18 may display the target and impact locations, while the instructor system displays this information with additional information derived from the session (e.g., score, dispersion, etc.). The processing of captured images from the detection device may be distributed among target controller 168, computer system 18 and/or instructor system 180 in any manner, while these systems may distribute the processed information among each other in any fashion. The training system may further include a spectator system 182 that accesses the network or otherwise communicates with target controller 168, computer system 18 and/or instructor system 180 to display information concerning a training session to a third party. The spectator system may be implemented by any computer or processing system (e.g., systems substantially similar to computer system 18 and/or instructor system 180) and may be local to or remote from the training location. The spectator system may display any desired information (e.g., target image with beam impact locations and/or statistics concerning user shooting (e.g., via screens similar to
The firearm laser training systems described above may include an electronic laser filter to reduce false detections of beam impacts on the target as illustrated in
The laser transmitter assembly of the system typically receives power from the LIB, but may optionally include a power source or battery as described above. The laser assembly accommodates a plurality of signals including a positive power signal, a negative or reference power signal and a signal ground from a processing board (e.g., processor ground after a 1.5V signal is converted to a 5V signal for use by a processing board processor) within the laser module that interfaces laser module components to control laser operation. The positive and negative power signals provide power to the laser assembly from the LIB and allow extended ‘constant on’ operation without the decrease in power or voltage typically encountered with battery operation. When the laser is pulsed or the mechanical wave sensor (e.g., piezoelectric element) detects the mechanical wave as described above, a slight deviation occurs between the signal ground and the negative power signal. This occurs since the laser processor board pulls additional current when the mechanical wave sensor is activated, thereby altering the signal ground. The LIB detects the deviation and produces an actuation signal to indicate trigger actuation.
The LIB is typically disposed within local USB extension unit 67 as described above to conserve components (e.g., power supply, housing, etc.), but may be integrated with or external of the system components. The LIB basically generates the positive and negative power signals for the laser assembly and receives the signal ground from the laser processing board. The LIB detects the deviation between the negative power and signal ground signals to determine trigger actuation. The LIB subsequently converts and buffers an actuation signal for transmission to a parallel port of computer system 18 that is configured to receive a digital signal. This technique enables a maximum of eight individual lasers to transmit signals to a single parallel port, each using a corresponding LIB.
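By way of illustration only, and assuming one parallel port bit per LIB (a mapping not specified above), decoding a sampled port byte could be sketched as follows. Reading the parallel port itself is platform-specific and is not shown.

#include <bitset>
#include <cstdint>

// Assumed mapping (illustrative): each of up to eight LIBs drives one bit of
// the byte read from the parallel port, so a set bit indicates that the
// corresponding laser transmitter assembly was actuated.
bool laserActuated(std::uint8_t portByte, int laserIndex /* 0..7 */)
{
    return (portByte >> laserIndex) & 0x01;
}

// Decode all eight channels from one sampled byte.
std::bitset<8> actuatedLasers(std::uint8_t portByte)
{
    return std::bitset<8>(portByte);
}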
The circuitry of the LIB is illustrated in
In operation, the user initially prepares the target assembly, selects a firearm activity, performs a system calibration, and selects atmospheric and other conditions to allow the computer system to apply appropriate offsets to detected beam impact locations in order to determine target impact locations as described above. The user adjusts the firearm in accordance with the conditions and moves an appropriate distance from the target for the training session. In response to firearm actuation by the user, the computer system detects a beam impact location on the target via the detection device in the same manner described above. Simultaneously, the computer system also receives the actuation signal from the LIB via the parallel port. The actuation signal provides confirmation that the detection device detected a beam impact location in response to trigger actuation and emission by the laser transmitter assembly, rather than a false hit detection caused by another light source appearing on the target. Conversely, if the detection device detects a beam impact location on the target due to a light source other than the laser transmitter assembly, the computer system will recognize the detection as a false hit when the actuation signal transmitted by the LIB does not indicate firearm actuation. Thus, utilizing the electronic laser filter enhances system performance by preventing the processing of false hit detections on the target as actual beam impact locations by the computer system. The computer system processes the images from the detection device in response to the actuation signal to determine and display the target impact locations as described above.
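By way of illustration only, the filtering decision may be sketched as follows; the use of a time window, and its width, are assumptions made for this example.

#include <cmath>

// A beam impact detected in a captured image is treated as valid only if the
// LIB reported a trigger actuation at approximately the same time; otherwise
// it is discarded as a false hit caused by another light source on the target.
struct DetectedImpact {
    double x;
    double y;
    double timestampSeconds;   // when the frame containing the impact was captured
};

bool isValidImpact(const DetectedImpact& impact,
                   bool actuationSignalReceived,
                   double actuationTimestampSeconds,
                   double windowSeconds = 0.25)
{
    if (!actuationSignalReceived)
        return false;   // no trigger actuation reported: treat as false hit
    return std::fabs(impact.timestampSeconds - actuationTimestampSeconds) <= windowSeconds;
}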
The electronic laser filter may similarly be utilized with the system of
The systems described above may also reference previous impact location information in a particular training session to assist in verifying the validity of a detected beam impact location, particularly for constant on or trace mode described below. Basically, the systems determine whether the most recent detected beam impact location lies within a predetermined range associated with a grouping of verified impact locations for that training session. For example, if a particular session already includes several verified impact locations all grouped near the target center, a detected impact location disposed near a target corner may be determined as falling outside an established grouping range and thus considered a false hit detection.
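By way of illustration only, such a grouping check may be sketched as follows; the use of the group centroid and a fixed distance threshold are assumptions made for this example.

#include <cmath>
#include <vector>

struct Location { double x; double y; };

// A newly detected impact is accepted only if it falls within a predetermined
// distance of the center of the already-verified impacts for the session.
bool withinEstablishedGrouping(const Location& candidate,
                               const std::vector<Location>& verifiedImpacts,
                               double maxDistanceFromCenter)
{
    if (verifiedImpacts.empty())
        return true;   // no grouping established yet; accept the impact
    double cx = 0.0, cy = 0.0;
    for (const Location& p : verifiedImpacts) { cx += p.x; cy += p.y; }
    cx /= verifiedImpacts.size();
    cy /= verifiedImpacts.size();
    return std::hypot(candidate.x - cx, candidate.y - cy) <= maxDistanceFromCenter;
}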
The systems described above may perform a fine zeroing adjustment for the laser transmitter assembly. In particular, this feature may be invoked by a user from a button on a system graphical user screen (e.g.,
The system described above employing the electronic laser filter may further include a trace mode that allows computer system 18 to trace the aiming position of the firearm or laser transmitter assembly and report graphically the horizontal and vertical deviations of the firearm for a selected time period. In the trace mode, the laser transmitter assembly is configured to continuously project a laser beam from the firearm (e.g., ‘constant on’ mode), rather than projecting a laser beam pulse in response to actuation of the firearm trigger. The continuous laser beam projection allows the detection device to trace any movement of the firearm, which, in turn, allows the computer system to provide feedback to the user relating to fluctuation in firearm aim before, during and/or after trigger actuation. In an exemplary embodiment, the computer system continuously receives detection information (e.g., target images including beam impact locations) from the detection device over a selected time period. Since the laser transmitter assembly is in a continuous mode (i.e., continuously projecting a laser beam onto the target), the detection device traces the aim of the firearm on the target and continuously relays detection information to the computer system. The computer system determines the target impact locations as described above and the time at which trigger actuation occurs based upon actuation signals received from the LIB. This enables the system to provide information for any selected intervals prior to or subsequent to trigger actuation. A trace report is then compiled and displayed by the computer system to provide an indication to the user of the horizontal and vertical fluctuations of the firearm with respect to an actual and/or desired hit location on the target before and/or after trigger actuation. An exemplary graphical user screen displaying trace mode information is illustrated in
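By way of illustration only, the following sketch summarizes the horizontal and vertical aim fluctuation within a window around the trigger actuation time; the peak-to-peak summary is merely one possible presentation of the trace data.

#include <algorithm>
#include <vector>

struct AimSample {
    double timestampSeconds;
    double x;   // horizontal aim position on the target
    double y;   // vertical aim position on the target
};

struct TraceReport {
    double horizontalDeviation;   // peak-to-peak horizontal fluctuation in the window
    double verticalDeviation;     // peak-to-peak vertical fluctuation in the window
};

// Summarize aim fluctuation for the interval surrounding trigger actuation,
// using the continuously captured aim samples from the detection device.
TraceReport summarizeTrace(const std::vector<AimSample>& samples,
                           double actuationTimeSeconds,
                           double secondsBefore, double secondsAfter)
{
    double minX = 1e300, maxX = -1e300, minY = 1e300, maxY = -1e300;
    bool any = false;
    for (const AimSample& s : samples) {
        if (s.timestampSeconds < actuationTimeSeconds - secondsBefore) continue;
        if (s.timestampSeconds > actuationTimeSeconds + secondsAfter) continue;
        minX = std::min(minX, s.x); maxX = std::max(maxX, s.x);
        minY = std::min(minY, s.y); maxY = std::max(maxY, s.y);
        any = true;
    }
    if (!any) return TraceReport{0.0, 0.0};
    return TraceReport{ maxX - minX, maxY - minY };
}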
Computer system 18 of the above-described systems may be in communication with other systems via any communications medium (e.g., network, wires, cables, LAN, WAN, Internet, etc.) to facilitate sessions with plural users at the same or different locations, or enable remote monitoring of user performance by instructors. Further, the system case and components may be constructed or adapted for any weather conditions and for indoor/outdoor use. In addition, the present invention is not limited to the targets disclosed herein, but may be utilized with any type of target. For example, the present invention may be utilized with the actuable target assemblies disclosed in U.S. patent application Ser. No. 09/862,187. Briefly, these target assemblies each raise a target (e.g., including a target image and a detection device to determine impact locations) in accordance with a timed scenario and lower the target in response to a hit or an expired scenario interval. The present invention may utilize such target assemblies where the target image is offset with respect to the target assembly detection device to account for various conditions. The computer system receives beam impact locations from the target detection device and applies trajectory and any calibration offsets in the manner described above to determine impact locations relative to the target image. A record of the firing exercise may be displayed, stored or printed as described above.
The present invention is versatile and provides training in various exercises including: visual feedback on marksmanship fundamentals; shot grouping; target detection; target identification; range estimation and elevation adjustment; wind estimation and windage adjustment; ballistic correction for weather conditions; slant range correction; fleeting target engagement; multiple target engagement; and observation and recording. For example, shot grouping may be accomplished by users firing at the computerized target from a predetermined range of approximately twenty-five meters. The default target presentation and display is the bulls-eye target. Shot groups are observed by the instructor, who determines whether or not the group complies with the standard and may recommend remediation of errors that are apparent in the shot group configuration. Shot groups having a dispersion within a particular quantity of MOA, as measured and displayed by the system, are considered to comply with the minimum standards.
Target detection may be accomplished by a user team detecting a target presentation which may be camouflaged or hidden among other objects or elements serving as visual distractions in a background image. The target presentation is positioned to scale with displayed background imagery. The actuable targets described above fitted with appropriately scaled masks may be utilized to provide timed and partially obscured target presentations. In addition, the user team may identify the target by a cue on the target or by the type of target (e.g., radioman, rifleman, dog team handler, etc.) for target identification.
With respect to the range estimation and elevation adjustment exercise, one method of range estimation of precisely scaled target presentations is made using the MilDot reticle of the rifle scope, M19 or M22 binoculars or other MilDot devices. Once the range to target has been established, the user adjusts the rifle scope or employs hold off appropriate for the range. If the proper adjustment is made, subsequent shots strike the target on the computer display. Ballistics software (or an offset point of aim mask for the above-described actuable targets) may be employed to adjust the point of impact at all simulated ranges. In addition, a graphical overlay scaled for distance may be utilized on the target image displayed by the computer system to replicate the image viewed through a conventional MilDot scope. In other words, the system reproduces the scope view of the target area. MilDot is basically an industry standard high precision tool superimposed into a scope viewing area that allows shooters to estimate size of objects and thereby estimate range to a target. The system replicates this situation, allowing a user to train, evaluate or be evaluated with or without the weapon. In the absence of a weapon, the MilDot graphical overlay may be manipulated by the user, via an input device (e.g., mouse), to any location on the displayed target image to determine the simulated size of an object displayed on the target and thus a simulated target range between the user and the object. Further, the overlay may be manipulated in response to movement of the firearm and detection of the laser in a constant on mode to enable viewing of the manner in which the user adjusts the scope to determine the size and range. This is similar to the trace mode with the position of the overlay being manipulated in response to movement of the firearm. An exemplary graphical user screen providing a MilDot overlay for use with the systems described above is illustrated in
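By way of illustration only, the standard mil-relation underlying such MilDot range estimation may be expressed as follows.

#include <iostream>

// Standard mil-relation used with a MilDot reticle: an object of known size
// that spans a measured number of milliradians lies at
//   range (m) = size (m) x 1000 / mils.
double milDotRangeMeters(double targetSizeMeters, double measuredMils)
{
    return targetSizeMeters * 1000.0 / measuredMils;
}

int main()
{
    // A 1.8 m tall silhouette spanning 2 mils is approximately 900 m away.
    std::cout << milDotRangeMeters(1.8, 2.0) << " m\n";
    return 0;
}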
Wind estimation and windage adjustment exercises may be accomplished by an instructor informing a user of the simulated wind conditions (e.g., three o'clock, 5 MPH) or providing a visual indicator such as a miniature wind flag from which to determine the wind velocity and direction. The instructor enters the wind information into the computer ballistics software, while the user makes the appropriate adjustments prior to firing. If the adjustment is correct, subsequent shots strike the target on the computer display. The user may also configure and control the scenario.
Exercises with respect to ballistic corrections for weather conditions may be performed by an instructor entering several variables into the ballistics software that affect the point of impact of the bullet. The user is informed of these variables and determines the adjustments. These weather conditions may include temperature, elevation, barometric pressure and humidity. Basically, the temperature, elevation above sea level (ASL), barometric pressure and humidity each affect the ballistic coefficient of the bullet, resulting in more or less drag. If the user makes the appropriate adjustments, subsequent shots strike the target on the computer display. Exercises with respect to slant range correction may be conducted in a similar manner. Basically, the instructor enters the uphill/downhill angle of the shot into the ballistics software to enable the computer system to calculate the slant range. The user may enter the correction as the angle (in degrees) given by the instructor or by estimating the slant range to the target. If the user makes the appropriate adjustment, subsequent shots strike the target on the computer display.
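By way of illustration only, the common first-order slant range correction (the cosine or "rifleman's" rule) may be expressed as follows; a full ballistic solution of the type described above may refine this value.

#include <cmath>

// For an uphill or downhill shot, the gravity-affected horizontal distance is
// approximately the line-of-sight range multiplied by the cosine of the slope
// angle, so elevation is set for that shorter equivalent range.
double equivalentHorizontalRange(double lineOfSightRangeMeters, double slopeAngleDegrees)
{
    const double pi = 3.14159265358979323846;
    const double radians = slopeAngleDegrees * pi / 180.0;
    return lineOfSightRangeMeters * std::cos(radians);
}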
Fleeting target engagement exercises may be accomplished by a user team engaging electronic targets mounted on the above described actuable target assemblies. The target assemblies are positioned at selected distances (e.g., approximately 25 meters) from the users. The targets are fitted with appropriate offset point of aim masks while target exposures are set by the instructor and require quick target detection, target ID and shot release. In addition, non-combatant target presentations may be mixed into the exercise. Multiple target engagement exercises may be performed in a similar manner where a user engages multiple electronic targets mounted on the actuable target assemblies and positioned at selected distances (e.g., approximately 25 meters) from the user. The targets are fitted with appropriate offset point of aim masks. Single and multiple target exposures may be set by the instructor where target presentations include targets of varying priority and non-combatant targets. The user engages targets in order of priority or threat level.
Observation and recording exercises may be accomplished by a user team moving into a position overlooking a simulated range containing several camouflaged electronic targets mounted on the above-described actuable target assemblies and positioned at selected distances (e.g., approximately 25 meters) from the user. The user prepares a range card and observes the area for a period of time (as determined by the instructor). The instructor randomly and occasionally exposes an electronic target fitted with an appropriate offset point of aim mask or scale presentation of a small object. The user team engages permitted targets and records all observations on the observation log.
In addition, the present invention provides several advantages, including: training with the actual weapon and weapon sights; firearm simulation by a weapon-mounted eye-safe or other training laser; computerized target feedback, including an internal ballistics software module to adjust the bullet point of impact (e.g., instructors may enter real-world variables that affect trajectory); and adjustment of the weapon sight(s) using skill-based standards (e.g., adjusting a specified number of clicks on a MilDot scope for range, windage, etc.) to achieve a target hit. Target presentations may be of various types to facilitate target identification, target priority and range estimation of various silhouettes and non-human objects; target presentations and backgrounds can be derived from user-acquired imagery incorporated into the trainer to enhance realism and relevancy; and each target presentation corresponds to the display on the computer screen in scale, color and wind references. The computer system display may also be overlaid with a minute of angle (MOA) grid to reference impacts (e.g., miss and hit) with sight corrections applied in one MOA and one-half MOA increments. MOA units are used primarily to estimate distance. An MOA grid allows users to estimate and adjust points of aim using visual comparisons between MOA units and items in the target area, thereby avoiding reliance upon time-consuming and complex calculations. The MOA grid is displayed as an overlay by the computer system to assist the user in enhancing various skills (e.g., determining distance, adjusting point of aim, etc.). An exemplary graphical user screen displayed by the above-described systems and illustrating an MOA overlay on a target display is illustrated in the drawings.
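For illustration, the arithmetic behind such an MOA grid overlay might be sketched as follows; the pixels-per-inch scaling parameter is an assumption of this sketch, not a value from the disclosure.

```python
# Illustrative sketch of the arithmetic behind an MOA grid overlay: one minute
# of angle subtends about 1.047 inches per 100 yards, so the grid spacing on a
# target image scaled for a given simulated range follows directly.

INCHES_PER_MOA_PER_100YD = 1.047  # more precisely, 3600 * tan(1/60 degree)

def moa_subtension_inches(range_yards: float, moa: float = 1.0) -> float:
    """Linear size in inches covered by the given number of MOA at range."""
    return moa * INCHES_PER_MOA_PER_100YD * (range_yards / 100.0)

def grid_spacing_pixels(range_yards: float, pixels_per_inch: float,
                        moa_step: float = 1.0) -> float:
    """Pixel spacing of grid lines drawn moa_step MOA apart on the scaled image."""
    return moa_subtension_inches(range_yards, moa_step) * pixels_per_inch

if __name__ == "__main__":
    # At a simulated 600 yards, one MOA covers roughly 6.3 inches of target,
    # and half-MOA grid lines on a 10 pixel/inch image are about 31 pixels apart.
    print(round(moa_subtension_inches(600), 1))      # 6.3
    print(round(grid_spacing_pixels(600, 10, 0.5)))  # 31
```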
It will be appreciated that the embodiments described above and illustrated in the drawings represent only a few of the many ways of implementing a firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control.
The systems may include any quantity of any type of target placed in any desired locations. The computer system may be in communication with other training systems via any type of communications medium (e.g., direct line, telephone line/modem, network, etc.) to facilitate group training or competitions. The systems may be configured to simulate any types of training, qualification or competition scenarios. The printer may be implemented by any conventional or other type of printer.
The systems may include any quantity of computer systems, target controllers, instructor systems and/or spectator systems. These processing systems may be implemented by any conventional or other computer or processing system (e.g., PC, laptop, palm pilot, PDA, etc.). The components of the systems (e.g., computer system, USB extenders, hub, barcode reader, detection device, etc.) may include and communicate via any communications devices (e.g., cables, wireless, network, etc.) in any desired fashion, and may utilize any type of conventional or other interface scheme or protocol. The network may be implemented by any communications medium (e.g., LAN, WAN, Internet, Intranet, wired, wireless, etc.), while the devices may alternatively directly communicate with each other.
The firearm laser training systems may be utilized with any type of firearm or other device (e.g., hand-gun, rifle, shotgun, machine gun, missile or other weapon system, etc.), while the laser module may be fastened to the firearm at any suitable locations via any conventional or other fastening techniques (e.g., frictional engagement with the barrel, brackets attaching the device to the firearm, etc.). Further, the system may include a dummy firearm projecting a laser beam, or replaceable firearm components (e.g., a barrel) including a laser device disposed therein for firearm training. The replaceable components (e.g., barrel) may further enable the laser module to be operative with a firearm utilizing any type of blank cartridges.
The laser assembly may include the laser module and bracket or any other fastening device. The laser module may emit any type of laser beam, preferably within suitable safety tolerances. The laser module housing may be of any shape or size, and may be constructed of any suitable materials. The receptacles may be defined in the module housing at any suitable locations to engage the bracket. Alternatively, the housing and bracket may include any conventional or other fastening devices (e.g., integrally formed, threaded attachment, hook and fastener, frictional engagement, etc.) to attach the module to the bracket. In another exemplary embodiment, the laser module may be attached without a bracket (e.g., by frictional engagement with the inside surface of the barrel via a rod or a similar device that engages the inside surface of the barrel). The bracket base and cover members may be of any size or shape and may be constructed of any suitable materials. The laser module may be fastened to the base and/or cover members at any locations via any suitable fastening mechanisms. The openings within the base and cover members may be of any quantity, shape or size and may be defined at any suitable locations. The bolts may be implemented by any securing or fastening devices (e.g., clamps, screws, posts, etc.).
The optics package may include any suitable lens for projecting the beam. The laser beam may be enabled for any desired duration sufficient to enable the detection device to detect the beam. The laser module may be fastened to a firearm or other similar structure (e.g., a dummy, toy or simulated firearm) at any suitable locations (e.g., external or internal of a barrel) and be actuated by a trigger or any other device (e.g., power switch, firing pin, relay, etc.). Moreover, the laser module may be configured in the form of ammunition for insertion into a firearm firing or similar chamber and project a laser beam in response to trigger actuation. Alternatively, the laser module may be configured for direct insertion into the barrel without the need for the bracket. The laser module may include any type of sensor or detector (e.g., acoustic sensor, piezoelectric element, accelerometer, solid state sensors, strain gauge, etc.) to detect mechanical or acoustical waves or other conditions signifying trigger actuation. The laser module components may be arranged within the housing in any fashion, while the module power source may be implemented by any type of batteries. Alternatively, the module may include an adapter for receiving power from a common wall outlet jack or other power source. The laser beam may be visible or invisible (e.g., infrared), may be of any color and may be modulated in any fashion (e.g., at any desired frequency or unmodulated) or encoded to provide any desired information, while the transmitter may project the beam continuously or include a “constant on” mode.
The target may be implemented by any type of target having any desired configuration and indicia forming any desired target site. The target may be of any shape or size, and may be constructed of any suitable materials. The target may include any conventional or other fastening devices to attach to any supporting structure. Similarly, the supporting structure may include any conventional or other fastening devices to secure the target to that structure. Alternatively, any type of adhesive or magnetic material may be utilized to secure the target to the structure. The support structure may be implemented by any structure suitable to support or suspend the target. The target may include any quantity of sections or zones of any shape or size and associated with any desired values or information (e.g., hit/miss, vital area, etc.). The target may include any quantity of individual targets or target sites. The systems may utilize any type of coding scheme to associate values with target sections (e.g., table lookup, target location identifiers as keys into a database or other storage structure, etc.). Further, the sections may be identified by any type of codes, such as alphanumeric characters, numerals, etc., that indicate a score or zone. The score values may be set to any desired values. Zones may be identified in any manner (e.g., enemy, friendly, non-engageable, priority, etc.).
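One possible form of the table lookup coding scheme mentioned above is sketched below; the section codes, score values and rectangular bounds are invented for illustration and do not reflect any particular target.

```python
# Hedged sketch of a table-lookup scoring scheme: target section codes key a
# small table of score values and zone labels, and a beam impact is resolved
# to a section before being scored.

from dataclasses import dataclass

@dataclass(frozen=True)
class Section:
    code: str      # alphanumeric zone code associated with the target section
    score: int     # value awarded for an impact in this section
    zone: str      # e.g. "vital", "hit", "non-engageable"
    bounds: tuple  # (x_min, y_min, x_max, y_max) in target coordinates

SECTIONS = [
    Section("A", 10, "vital", (40, 40, 60, 60)),
    Section("B", 5, "hit", (20, 20, 80, 80)),
]

def score_impact(x: float, y: float) -> tuple:
    """Return (code, score, zone) for an impact, or a miss record."""
    for s in SECTIONS:  # ordered from highest to lowest priority section
        x0, y0, x1, y1 = s.bounds
        if x0 <= x <= x1 and y0 <= y <= y1:
            return (s.code, s.score, s.zone)
    return ("-", 0, "miss")

if __name__ == "__main__":
    print(score_impact(50, 50))  # ('A', 10, 'vital')
    print(score_impact(25, 75))  # ('B', 5, 'hit')
    print(score_impact(5, 5))    # ('-', 0, 'miss')
```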
The display screen may be of any shape, size or type (e.g., LCD, plasma, monitor, etc.) and may be disposed at any desired location. The display screen may display any type of target scaled for any desired range or unscaled. The display screen may alternatively show movies or video illustrating a stationary or moving target, a target scenario or environmental or other conditions. The images and/or video may be stored locally on the computer system or target controller, or may be retrieved from a network or other processing system.
The target characteristics and images may be contained in any quantity of any types of files. The target images may be scaled in any desired fashion. The coordinate translations may be accomplished via any conventional or other techniques, and may be performed within the detection device. The translations for the various files (e.g., print, scoring, display, etc.) may be determined with respect to impact locations with or without the offsets applied, while the corresponding files may be configured accordingly. For example, the files may be generated to incorporate the offsets, thereby reducing processing during system operation (e.g., by enabling beam impact locations without offsets to be used). The target files may contain any information pertaining to the target (e.g., filenames, images, scaling information, indicia size, etc.). The target files may be produced by the computer system or other processing system and placed on the computer system for operation. Alternatively, the target files may reside on another processing system accessible to the computer system via any conventional or other communications medium (e.g., network, modem/telephone line, etc.).
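A hypothetical sketch of the kind of information such a target file could carry is shown below; the field names and values are assumptions of this example rather than the disclosed file format.

```python
# Illustrative (not the patent's) structure for a target file: the image, how
# it was scaled for the simulated range, and offsets pre-applied so that beam
# impact locations require minimal processing during a session.

from dataclasses import dataclass, field

@dataclass
class TargetFile:
    image_filename: str       # scaled target image shown on the display
    simulated_range_m: float  # range the image was scaled for
    scale_px_per_m: float     # pixels per meter at that simulated range
    indicia_size_m: float     # real-world size of the printed indicia
    offsets_px: dict = field(default_factory=dict)  # condition -> (dx, dy)

    def offset_for(self, condition: str):
        """Pre-computed offset for a named condition, defaulting to no shift."""
        return self.offsets_px.get(condition, (0, 0))

if __name__ == "__main__":
    tf = TargetFile("silhouette_600m.png", 600.0, 14.2, 0.5,
                    offsets_px={"wind_5mph_3oclock": (4, 0)})
    print(tf.offset_for("wind_5mph_3oclock"))  # (4, 0)
```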
The barcode reader may be of any type and configuration and may be connected or in communication with the computer system in any suitable manner. Alternatively, the computer system may utilize any suitable device or interface to receive information regarding the type of target being utilized in a particular training session. The target serial number may include any quantity of any alphanumeric character or other symbol. The range finder may be implemented by any conventional or other device that can measure distance (e.g., ultrasound device, radio device, etc.).
The detection device may be implemented by any conventional or other sensing device (e.g., camera, CCD, CMOS, matrix or array of light sensing elements, etc.) suitable for detecting the laser beam and/or capturing a target image. The filter may be implemented by any conventional or other filter having filtering properties for any particular frequency or range of frequencies. The detection device may employ any type of light sensing elements. The detection device may be of any shape or size, and may be constructed of any suitable materials. The detection device may be positioned at any suitable locations providing access to the target. The calibration may utilize any type of target and user interface to calibrate the systems. The calibration target may be an image or displayed by the display screen. The calibration target and user interface may include any quantity of alignment guides and/or lines to calibrate the system. Further, the user may adjust the detection device, target and/or interface in any manner to calibrate the system. The zeroing adjustment may be performed at any time prior to, during or subsequent to a session. The zeroing may utilize any quantity of shots and any type of calculation to determine an offset. The offset may be determined based on any characteristics of the shot grouping and relative to any desired target site. The offset may alternatively be adjusted or entered by a user.
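As one example of the "any type of calculation" permitted for zeroing, the sketch below derives the offset from the mean point of impact of a shot group relative to the aim point; the coordinates and group data are illustrative.

```python
# Minimal zeroing sketch: take the mean point of impact of a small detected
# shot group and subtract the intended aim point; the resulting offset would
# then be removed from subsequent detected impact locations.

def zero_offset(shots, aim_point):
    """Return (dx, dy) from the aim point to the mean point of impact of the
    detected shot group."""
    n = len(shots)
    mean_x = sum(x for x, _ in shots) / n
    mean_y = sum(y for _, y in shots) / n
    return (mean_x - aim_point[0], mean_y - aim_point[1])

if __name__ == "__main__":
    group = [(102.0, 98.5), (103.5, 99.0), (101.5, 97.5)]
    print(zero_offset(group, aim_point=(100.0, 100.0)))  # approx (2.33, -1.67)
```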
The detection device may be coupled to any computer system port via any conventional or other cable. The detection device may be configured to detect any energy medium having any modulation, pulse or frequency. Similarly, the laser may be implemented by a transmitter emitting any suitable energy wave. The detection device may transmit any type of information to the computer system to indicate beam impact locations, while the computer system may process any type of information from the detection device to display and provide feedback information to the user.
It is to be understood that the software for the computer system, target controller, instructor system and spectator system may be implemented in any desired computer language and could be developed by one of ordinary skill in the computer arts based on the functional descriptions contained in the specification and flow chart illustrated in the drawings. These processing systems may alternatively be implemented by hardware or other processing circuitry. The various functions of these systems may be distributed in any manner among any quantity of processing systems, circuitry and hardware and/or software modules or units. The software and/or algorithms described above and illustrated in the flow chart may be modified in any manner that accomplishes the functions described herein. The database may be implemented by any conventional or other database or storage structure (e.g., file, data structure, etc.).
The graphical user screens and reports may be arranged in any fashion and contain any type of information. The indicia indicating target impact locations and other information may be of any quantity, shape, size or color and may include any type of information. The indicia may be placed at any locations and be incorporated into or overlaid with the target images. The systems may produce any desired type of display or report having any desired information. The computer system may determine scores based on any desired criteria. The computer system may poll the detection device or the detection device may transmit images at any desired intervals for the tracing mode. The indicia for the tracing mode may be of any quantity, shape, size or color and may include any type of information. The tracing indicia may be placed at any locations and be incorporated into or overlaid with the target images.
The systems may utilize optical and/or electronic filters to reduce false detections. The laser and LIB may be coupled to each other and the computer system in any fashion or desired arrangement. For example, the laser and LIB may be coupled to a parallel port connector of the computer system and transfer signals therethrough. Alternatively, the laser may be coupled to the LIB which, in turn, is coupled to the computer system parallel port. The LIB may be housed within any system components or be external of those components. The LIB may include any conventional circuitry or components (e.g., regulator, comparator, pulse condition timer, buffer, etc.) arranged in any desired fashion to perform the functions described herein. The trace mode may track and display firearm movement for any desired time interval commencing prior to, during or after trigger actuation. Alternatively, the trace mode may be utilized without the electronic laser filter by the systems detecting a continuous laser beam for a predetermined time interval and processing captured images as described above. The trace mode may display the information in any desired manner (e.g., plot, chart, graph, etc.). The computer system may utilize any desired overlays to emulate any views through the scope or of the target (e.g., MOA, MilDot, etc.). The MilDot or other overlays may be manipulated on the image via any input devices (e.g., mouse, keyboard, firearm laser movement, voice recognition, etc.).
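A minimal sketch of the trace mode bookkeeping described above is given below, assuming time-stamped beam positions and a configurable window around trigger actuation; the class name and window values are illustrative.

```python
# Hedged sketch of trace-mode bookkeeping: detected beam positions are
# time-stamped as they arrive, and the displayed trace is the subset falling
# within a window around trigger actuation.

from collections import deque

class TraceBuffer:
    def __init__(self, pre_seconds=2.0, post_seconds=1.0):
        self.pre = pre_seconds
        self.post = post_seconds
        self.samples = deque()  # (timestamp, x, y) tuples

    def add_sample(self, t, x, y):
        """Record a beam position detected while the laser is in constant-on mode."""
        self.samples.append((t, x, y))

    def trace_around(self, trigger_time):
        """Samples within the window around trigger actuation, oldest first."""
        lo, hi = trigger_time - self.pre, trigger_time + self.post
        return [(t, x, y) for (t, x, y) in self.samples if lo <= t <= hi]

if __name__ == "__main__":
    buf = TraceBuffer()
    for i in range(6):                  # simulated samples at 0.5 s spacing
        buf.add_sample(i * 0.5, 100 + i, 200 - i)
    # All six samples fall inside the 2 s pre / 1 s post window around t = 2.0.
    print(buf.trace_around(trigger_time=2.0))
```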
Ballistic information from the ballistic program may be retrieved or intercepted in any desired fashion (e.g., intercept window writes, write program output to a readable file or data structure, direct interaction via dynamic data exchange (DDE), etc.). The targets utilized with the systems of the present invention may be produced utilizing any suitable procedure. The offsets may be determined prior to a session and stored by the system in any manner (e.g., tables, data structures, etc.), or particular offsets may be generated and applied during processing of images.
The systems may utilize any quantity of any types of devices (e.g., extenders, cables, etc.) to facilitate communication between the detection device, bar code reader and computer system. The carrying case may be of any shape or size and may be constructed of any suitable materials. The case may include any quantity of compartments of any shape or size to accommodate any system components. The system components may be arranged in the case in any desired fashion. The computer system may communicate with any quantity of training systems via any communications medium (e.g., network, cables, wireless, etc.) to facilitate group training. Further, the instructor and spectator systems may similarly be coupled to plural training systems via any communications medium (e.g., network, cables, wireless, etc.) to control and monitor group training. The systems may include and process any quantity of targets (e.g., plural images or display screens) via any quantity of detection devices in substantially the same manner described above for plural target sessions. The detection device may handle plural targets, where the computer system processes the captured images to determine target impact locations as described above.
The present invention is not limited to the applications disclosed herein, but may be utilized for any type of firearm training, qualification or competition. Further, the present invention may utilize offsets to simulate any types of conditions (e.g., wind, precipitation, elevation, humidity, type of projectile, etc.) for targets at any desired ranges.
From the foregoing description, it will be appreciated that the invention makes available a novel firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control, wherein the system scans a simulated extended range target to determine laser beam impact locations and applies an offset to those locations to simulate various conditions (e.g., range, wind, etc.) affecting projectile trajectory and determine an impact location relative to the target resulting from those conditions.
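The core correction summarized above can be illustrated by the following sketch, in which a detected beam impact location is shifted by an offset standing in for the output of the ballistics software; all values are illustrative.

```python
# Simplified sketch of the overall correction step: the detected laser impact
# location is shifted by an offset representing the simulated conditions
# (range drop, wind drift, etc.) before being scored and displayed.

def apply_condition_offset(detected_xy, offset_xy):
    """Shift a detected beam impact (target-image pixels) by the simulated
    ballistic offset and return the location used for display and scoring."""
    x, y = detected_xy
    dx, dy = offset_xy
    return (x + dx, y + dy)

if __name__ == "__main__":
    # e.g. a 12-pixel drop and a 4-pixel wind drift at the simulated range
    print(apply_condition_offset((240, 160), offset_xy=(4, 12)))  # (244, 172)
```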
Having described preferred embodiments of a new and improved firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control, it is believed that other modifications, variations and changes will be suggested to those skilled in the art in view of the teachings set forth herein. It is therefore to be understood that all such variations, modifications and changes are believed to fall within the scope of the present invention as defined by the appended claims.
This application claims priority from provisional U.S. Patent Application Ser. No. 60/297,209, entitled “Firearm Laser Training System and Method Facilitating Firearm Training for Extended Range Targets” and filed Jun. 8, 2001; and No. 60/341,148, entitled “Firearm Laser Training System and Method Facilitating Firearm Training for Extended Range Targets with Feedback of Firearm Control” and filed Dec. 17, 2001. The disclosures of the above-mentioned provisional applications are incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
2023497 | Trammell | Dec 1935 | A |
2934634 | Hellberg | Apr 1960 | A |
3452453 | Ohlund | Jul 1969 | A |
3510965 | Rhea | May 1970 | A |
3526972 | Sumpf | Sep 1970 | A |
3590225 | Murphy | Jun 1971 | A |
3633285 | Sensney | Jan 1972 | A |
3782832 | Hacskaylo | Jan 1974 | A |
3792535 | Marshall et al. | Feb 1974 | A |
3888022 | Pardes et al. | Jun 1975 | A |
3938262 | Dye et al. | Feb 1976 | A |
3995376 | Kimble et al. | Dec 1976 | A |
3996674 | Pardes et al. | Dec 1976 | A |
4048489 | Giannetti | Sep 1977 | A |
4068393 | Tararine et al. | Jan 1978 | A |
4102059 | Kimble et al. | Jul 1978 | A |
4164081 | Berke | Aug 1979 | A |
4177580 | Marshall et al. | Dec 1979 | A |
4195422 | Budmiger | Apr 1980 | A |
4218834 | Robertsson | Aug 1980 | A |
4222564 | Allen et al. | Sep 1980 | A |
4256013 | Quitadama | Mar 1981 | A |
4269415 | Thorne-Booth | May 1981 | A |
4281993 | Shaw | Aug 1981 | A |
4290757 | Marshall et al. | Sep 1981 | A |
4313272 | Matthews | Feb 1982 | A |
4313273 | Matthews et al. | Feb 1982 | A |
4336018 | Marshall et al. | Jun 1982 | A |
4340370 | Marshall et al. | Jul 1982 | A |
4352665 | Kimble et al. | Oct 1982 | A |
4367516 | Jacob | Jan 1983 | A |
4439156 | Marshall et al. | Mar 1984 | A |
4452458 | Timander et al. | Jun 1984 | A |
4553943 | Ahola et al. | Nov 1985 | A |
4561849 | Eichweber | Dec 1985 | A |
4572509 | Sitrick | Feb 1986 | A |
4583950 | Schroeder | Apr 1986 | A |
4592554 | Gilbertson | Jun 1986 | A |
4619615 | Kratzenberg | Oct 1986 | A |
4619616 | Clarke | Oct 1986 | A |
4640514 | Myllyla et al. | Feb 1987 | A |
4657511 | Allard et al. | Apr 1987 | A |
4662845 | Gallagher et al. | May 1987 | A |
4678437 | Scott et al. | Jul 1987 | A |
4680012 | Morley et al. | Jul 1987 | A |
4695256 | Eichweber | Sep 1987 | A |
4737106 | Laciny | Apr 1988 | A |
4761907 | De Bernardini | Aug 1988 | A |
4786058 | Baughman | Nov 1988 | A |
4788441 | Laskowski | Nov 1988 | A |
4789339 | Bagnall-Wild et al. | Dec 1988 | A |
4804325 | Willits et al. | Feb 1989 | A |
4811955 | De Bernardini | Mar 1989 | A |
4830617 | Hancox et al. | May 1989 | A |
4864515 | Deck | Sep 1989 | A |
4898391 | Kelly et al. | Feb 1990 | A |
4922401 | Lipman | May 1990 | A |
4923402 | Marshall et al. | May 1990 | A |
4934937 | Judd | Jun 1990 | A |
4947859 | Brewer et al. | Aug 1990 | A |
4948371 | Hall | Aug 1990 | A |
4955812 | Hill | Sep 1990 | A |
4983123 | Scott et al. | Jan 1991 | A |
4988111 | Gerlizt et al. | Jan 1991 | A |
5004423 | Bertrams | Apr 1991 | A |
5026158 | Golubic | Jun 1991 | A |
5035622 | Marshall et al. | Jul 1991 | A |
5064988 | E'nama et al. | Nov 1991 | A |
5090708 | Gerlitz et al. | Feb 1992 | A |
5092071 | Moore | Mar 1992 | A |
5095433 | Botarelli et al. | Mar 1992 | A |
5119576 | Erning | Jun 1992 | A |
5140893 | Leiter | Aug 1992 | A |
5153375 | Eguizabal | Oct 1992 | A |
5179235 | Toole | Jan 1993 | A |
5181015 | Marshall et al. | Jan 1993 | A |
5194006 | Zaenglein, Jr. | Mar 1993 | A |
5194007 | Marshall et al. | Mar 1993 | A |
5194008 | Mohan et al. | Mar 1993 | A |
5208418 | Toth et al. | May 1993 | A |
5213503 | Marshall et al. | May 1993 | A |
5215465 | Marshall et al. | Jun 1993 | A |
5237773 | Claridge | Aug 1993 | A |
5281142 | Zaenglein, Jr. | Jan 1994 | A |
5328190 | Dart et al. | Jul 1994 | A |
5344320 | Inbar et al. | Sep 1994 | A |
5365669 | Rustick et al. | Nov 1994 | A |
5366229 | Suzuki | Nov 1994 | A |
5400095 | Minich et al. | Mar 1995 | A |
5413357 | Schulze et al. | May 1995 | A |
5433134 | Leiter | Jul 1995 | A |
5474452 | Campagnuolo | Dec 1995 | A |
5486001 | Baker | Jan 1996 | A |
5488795 | Sweat | Feb 1996 | A |
5489923 | Marshall et al. | Feb 1996 | A |
5502459 | Marshall et al. | Mar 1996 | A |
5504501 | Hauck et al. | Apr 1996 | A |
5515079 | Hauck | May 1996 | A |
5529310 | Hazard et al. | Jun 1996 | A |
5551876 | Koresawa et al. | Sep 1996 | A |
5577733 | Downing | Nov 1996 | A |
5585589 | Leiter | Dec 1996 | A |
5591032 | Powell et al. | Jan 1997 | A |
5594468 | Marshall et al. | Jan 1997 | A |
5605461 | Seeton | Feb 1997 | A |
5613913 | Ikematsu et al. | Mar 1997 | A |
5641288 | Zaenglein, Jr. | Jun 1997 | A |
5672108 | Lam et al. | Sep 1997 | A |
5685636 | German | Nov 1997 | A |
5716216 | O'Loughlin et al. | Feb 1998 | A |
5738522 | Sussholz et al. | Apr 1998 | A |
5740626 | Schuetz et al. | Apr 1998 | A |
5788500 | Gerber | Aug 1998 | A |
5842300 | Cheshelski et al. | Dec 1998 | A |
5890906 | Macri et al. | Apr 1999 | A |
5933132 | Marshall et al. | Aug 1999 | A |
5947738 | Muehle et al. | Sep 1999 | A |
5999210 | Nemiroff et al. | Dec 1999 | A |
6012980 | Yoshida et al. | Jan 2000 | A |
6028593 | Rosenberg et al. | Feb 2000 | A |
6106297 | Pollak et al. | Aug 2000 | A |
6252706 | Kaladgew | Jun 2001 | B1 |
6296486 | Cardaillac et al. | Oct 2001 | B1 |
6315568 | Hull et al. | Nov 2001 | B1 |
6322365 | Shechter et al. | Nov 2001 | B1 |
6551189 | Chen et al. | Apr 2003 | B1 |
6572375 | Shechter et al. | Jun 2003 | B2 |
6575753 | Rosa et al. | Jun 2003 | B2 |
6579098 | Shechter | Jun 2003 | B2 |
6604064 | Wolff et al. | Aug 2003 | B1 |
6616452 | Clark et al. | Sep 2003 | B2 |
6709272 | Siddle | Mar 2004 | B2 |
6739873 | Rod et al. | May 2004 | B1 |
20020009694 | Rosa | Jan 2002 | A1 |
20020012898 | Shechter et al. | Jan 2002 | A1 |
20030136900 | Shechter et al. | Jul 2003 | A1 |
20030199324 | Wang | Oct 2003 | A1 |
Number | Date | Country |
---|---|---|
30 45 509 | Jul 1982 | DE |
3537323 | Apr 1987 | DE |
3631081 | Mar 1988 | DE |
39 25 640 | Feb 1991 | DE |
4005940 | Aug 1991 | DE |
4029877 | Mar 1992 | DE |
42 07 933 | Sep 1993 | DE |
42 33 945 | Apr 1994 | DE |
195 19 503 | Dec 1995 | DE |
0 072 004 | Feb 1983 | EP |
0 285-586 | Oct 1988 | EP |
0 401 731 | Jun 1990 | EP |
0467090 | Jan 1992 | EP |
0806621 | Nov 1997 | EP |
2 726 639 | May 1996 | FR |
2141810 | Jan 1985 | GB |
2254403 | Oct 1992 | GB |
2260188 | Apr 1993 | GB |
2284 253 | May 1995 | GB |
49-103499 | Sep 1974 | JP |
50-22497 | Mar 1975 | JP |
54-40000 | Mar 1979 | JP |
56-500666 | May 1981 | JP |
59-191100 | Dec 1984 | JP |
63-502211 | Aug 1988 | JP |
2-101398 | Apr 1990 | JP |
5-223500 | Aug 1993 | JP |
8-299605 | Nov 1996 | JP |
1817825 | May 1993 | RU |
2 089 832 | Sep 1997 | RU |
WO 9109266 | Jun 1991 | WO |
WO 9208093 | May 1992 | WO |
WO 9403770 | Feb 1994 | WO |
WO 9415165 | Jul 1994 | WO |
WO 9615420 | May 1996 | WO |
WO 9849514 | Nov 1998 | WO |
WO 9910700 | Mar 1999 | WO |
Number | Date | Country | |
---|---|---|---|
20020197584 A1 | Dec 2002 | US |
Number | Date | Country | |
---|---|---|---|
60341148 | Dec 2001 | US | |
60297209 | Jun 2001 | US |