1. Field of the Invention
The present invention relates to simulated weapons and, more particularly, to an optical sight system including a simulated weapon which allows the incorporation and use of an actual weapon sight combined with a micro display in a simulated weapon environment.
2. Description of the Prior Art
Firearms training simulators are used to train police and military personnel in the proper use and handling of weapons without having to use real firearms and ammunition. The firearms training simulator is designed for indoor training in a safe environment. An effective firearms simulator duplicates the actual environment as much as possible, including the use of simulated weapons that “look and feel” like the actual weapons. To improve the “look and feel” of the simulated weapon, the user is able to employ a firearm optical sight on the simulated weapon (either unmodified or adapted for use on the simulator). The primary objective is to immerse the student in a training scenario so that his/her responses will be the same in the training scenario as in real life. If this is achieved, the instructor can effectively train the student on the correct responses, actions and behaviors.
To facilitate this, the student should be immersed in the training environment as much as possible, and the instructor should have as much visibility as possible of the way a student handles the weapon, including the student's aiming techniques. One desired improvement of conventional firearms simulator systems is to replicate real weapons that employ either actual firearm optical sights with great magnification or electro-optical devices such as night vision devices or thermal sights. With such weapon simulation systems, there have been various ways to incorporate the use of both an actual optical sight and a simulated optical sight with a simulated weapon to provide the desired scenario.
One option is to build a completely new weapon sight or weapon scope simulator device without using an actual weapon sight with the simulated weapon. Such a device would provide for the simulation of a weapon sight rather than using an actual weapon sight, and would include a display, optical system, reticle, and elevation adjustment mechanisms. However, this option lacks flexibility for the user. For example, to simulate different scopes, different simulators need to be built, and the student or user cannot select a desired scope if a simulator for that scope has not been built.
A second option is to use an actual optical sight in conjunction with the simulated weapon, such that the user would examine the generated display image of the scenario with the actual firearm optical sight. Although optical sights with magnification greater than about two times could be used with such a weapon simulation system, the image would be negatively altered due to pixelization. That is, when the digital image is enlarged through the magnification of the scope, the user will see the various pixels that compose the digital picture. The picture will therefore appear so pixelated that the image is not realistic to the user, and hence not usable for the realistic training necessary for effective training. This approach also does not allow for mixed use of iron or optical sights together with electro-optical sights such as night vision and thermal sights.
The present invention is an optical sight system that is used with weapon simulation systems to improve the realism of the weapon simulation system for the user or student. In particular, the present optical sight system is used with an actual weapon scope or weapon sight in a weapon simulation system so that the user is able to view a correct version of the image broadcast on a primary image display with the scope and maintain the realism of the simulation. More specifically, the weapon simulation system includes a primary image display and a simulated weapon that are both in electrical communication with a central processing unit having an image generator to produce the desired target or interactive scenario, which is sent to the primary image display to immerse the student or user in the desired situation.
The optical sight system of the present invention is used in conjunction with such a weapon simulation system to further immerse the student or user in the interactive simulation. Specifically, the present invention employs an actual weapon sight or a weapon scope with the simulated weapon, and includes a secondary image display or display panel that is electrically connected to an image generator to receive a target or interactive scenario with an image corresponding to a magnified version of the scenario displayed on the primary image display. To view the image on the secondary image display with the weapon sight, the optical sight system additionally includes a lens or multiple lenses to correct for the long focal distance of the scope and enable it to focus on the micro display positioned only inches away. Through the use of a laser on the simulated weapon, the system is able to generate the desired magnified view on the secondary image display. Furthermore, using a system of interpolation and extrapolation, the optical sight system is able to create a clear magnified view of the primary image display. Using an angle sensor, the optical sight system further provides a method for correcting the angle displayed on the secondary image display to compensate for rotation of the weapon simulator as handled by the user.
An apparatus embodying the features of the present invention is depicted in the accompanying drawings, which form a portion of this disclosure, wherein:
a is an illustration of the connection of weapon sight with a secondary image display via a housing;
b is a diagram illustrating the optical lens positioned between the weapon sight and the secondary image display lens;
Looking to
Continuing to view
The optical sight system 10 is designed to be capable of attachment to a simulated weapon 20 with a weapon sight 26 that is used on actual weapons, not just on simulated weapons 20. As a result, the student or user may use his or her own preexisting weapon sight 26 with the optical sight system 10 to perform the training tasks, which maximizes the realism of the training scenario for the student. That is, the student can be trained to operate his or her own actual weapon sight 26 on the simulated weapon 20.
The optical sight system 10 of the present invention is designed to be used with the actual weapon sight 26. To work properly with the actual weapon sight 26, the optical sight system 10 includes a secondary image display 22 and an optical lens 24. Referring to
It should further be noted that the secondary image display panel 22 may be any type of microdisplay that may be mounted either to the weapon housing 20 or the weapon sight 26. Although the size of the secondary display panel 22 will vary in view of the size of the weapon housing 20, in one useful embodiment, the display area of the microdisplay 22 as viewed by the user has dimensions that are less than 17 millimeters by less than 13 millimeters.
The optical sight system 10 is attached to the front of the real weapon sight 26 or to the simulated weapon 20. The electronic image generated on the secondary image display 22 is seen through the weapon sight 26 so as to utilize the real weapon sight 26. The reticle and the elevation adjustment mechanisms of the weapon sight 26, as originally built into the weapon sight 26, can be used with the optical sight system 10 just as in actual use of the weapon sight 26. The student can use his own weapon sight 26 and attach the optical sight system 10 to the frame of the simulated weapon 20 prior to starting the training.
In operation, the secondary image display panel 22 in the optical sight system 10 displays the portion of the primary image that is in the center of the student's aim with the simulated weapon 20. Although various tracking methods could be used utilizing inertial, mechanical, magnetic or optical sensors, in the embodiment presently described, the center of the student's aim is determined through the use of a laser. More specifically, in order to detect the aiming point of the simulated weapon 20 and transmit the corresponding image to the secondary image display 22 of the optical sight system 10, a tracking position device 29 (such as a laser tracking camera) is used to monitor the primary image display 14 and locate the laser spot position, which is projected from simulated weapon 20 held by the student. This tracking position device 29 transmits the detected laser spot position to the software application run by CPU 16 as a reference point to calculate the aiming point of the simulated weapon 20. Based on the aiming point, the secondary image display 22 can display the correct zoom image of the scene produced by the image generator 18. The application generates the zoom image corresponding to the aiming point and displays it on the secondary image display 22. The image in the secondary image display 22 will give the student the same look and feeling of a real weapon scope in an actual setting. Through the use of a laser beam combined with a laser detector 29 for tracking the position of the laser, and thus the user's aim, the optical sight system 10 can provide high accuracy and fast response time position information. The embodiment uses the information of the laser spot location detector 29 as the only resource for determining the aiming point and then generates the corresponding image in the secondary image display 22.
In an example of this system, a laser LED is installed in the barrel of the simulated weapon 20. At the same time, the optical sight system 10 is installed on the simulated weapon 20. While the system is operating, a laser beam from the laser LED is projected onto the primary image display 14, which shows the training scenario scene image produced by the image generator 18. The laser spot location changes to follow the location of the student's aim corresponding to the position of the simulated weapon 20. The laser spot location is detected in real time by a tracking position device 29 connected to the CPU 16 and processed to generate aiming point information for the simulated weapon 20; specifically, the coordinates of the aiming point with respect to the primary image display 14. From the aiming point information, the software application of the CPU 16 can determine where the gunner is aiming on the scenario or scene image of the primary image display 14. The relative image is processed according to the scope's field of view, magnification, and the position of the weapon in the virtual world. The proper image is then displayed on the optical sight system 10, where it can be seen by the student through the weapon scope 26. The image on the primary image display 14 is exactly the same as what the student would see in the real world without a scope.
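The mapping from a detected laser spot to the magnified view can be sketched as follows. This is an illustrative simplification, not the patent's implementation: the function name, the clamping behavior, and the assumption that the crop size scales inversely with magnification are all assumptions for illustration.

```python
# Hypothetical sketch: map a detected laser-spot position on the primary
# display to the cropped, magnified region shown on the secondary display.
# Names and the 1/magnification crop rule are illustrative assumptions.

def zoom_region(spot_x, spot_y, primary_w, primary_h, magnification):
    """Return the (left, top, width, height) crop of the primary image
    that a scope of the given magnification would show, centered on the
    detected laser spot (the student's aiming point)."""
    crop_w = primary_w / magnification
    crop_h = primary_h / magnification
    # Center the crop on the aiming point, clamped to the image bounds.
    left = min(max(spot_x - crop_w / 2, 0), primary_w - crop_w)
    top = min(max(spot_y - crop_h / 2, 0), primary_h - crop_h)
    return left, top, crop_w, crop_h

# A 4x scope aimed at the center of a 1024x768 primary display:
region = zoom_region(512, 384, 1024, 768, 4)
```

The cropped region would then be rescaled to fill the secondary image display, giving the student the magnified view through the weapon sight.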
The present invention provides a simulation system with a training range from several hundred meters to several thousand meters. When targets are at several thousand meters, the resolution of most simulation screens, such as the primary image display 14 of the present invention, is too low to display the target in enough detail for the student to see it. In a real-world situation, the user employs the weapon scope 26 on the firearm to find and engage a target that is a great distance away. The fundamental problem is determining the correct image reference point for the secondary image display 22; that is, the center point of the image on the primary image display 14 being targeted. Therefore, the present invention uses laser spot detection to determine the center of the image generated on the secondary display.
Microsoft Windows 98 and newer operating systems provide support for multiple graphics displays. This functionality is helpful for engineers to implement and test the concept at a low cost. In the initial stage, the engineers use the second graphics card output as the optical sight system 10 and, without a physical tracking position device 29, use a mouse to simulate the movement of the aiming point. With the mouse pointing at a 1:1 scale image, a software timer regularly collects the mouse position, and the program calculates the zoom image position and displays the zoom image on the second monitor 17. In the present invention, the mouse position is replaced by the laser position, and the second monitor 17 is replaced by the secondary image display 22.
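The prototyping loop described above can be sketched roughly as below. The function and callback names are assumptions for illustration; the point is only that the aiming-point source is pluggable, so a mouse can stand in for the laser tracker during development.

```python
# Illustrative sketch of the prototyping approach: a timer periodically
# samples an aiming-point source (a mouse during development, the laser
# tracker in the final system) and redraws the zoom view. The callable
# names are assumptions, not names from the patent.

import time

def run_tracking_loop(get_aim_point, draw_zoom_view, hz=30, frames=3):
    """Poll the aiming point `hz` times per second and redraw the
    magnified view on the secondary output each time."""
    interval = 1.0 / hz
    for _ in range(frames):
        x, y = get_aim_point()      # mouse position or laser spot
        draw_zoom_view(x, y)        # render the zoomed image there
        time.sleep(interval)

# During early testing the aim point can simply be the mouse cursor;
# here a fixed point stands in for it:
drawn = []
run_tracking_loop(lambda: (512, 384), lambda x, y: drawn.append((x, y)))
```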
Looking to
Therefore, the student can see the precise image through the weapon sight 26 aimed at the primary image display 14, or the user can see the broad image by simply looking at the primary image display 14. That is, the student not only needs to view the image clearly, but he/she also needs to see the maximum of the image display panel area so that he/she can fully utilize the resolution of the secondary image display 22. For example, if the optical lens 24 is not properly selected/designed, then the student will not see the full picture in the secondary image display 22. If the magnification of the weapon sight 26 is too great, the student may only see a 400×400 area on a 1024×768 display panel 22, such that the student will not be able to view the complete display area of the secondary image display panel 22. Moreover, in this case, the resolution on the panel of the secondary image display 22 will be poor, and the student will see the grainy pixels of the panel because the magnification is too large. As a result, the quality of the image will be substantially lowered. On the other hand, if the magnification of the weapon sight 26 is too low, the student may see the edges of the secondary image display panel 22, and there is no room for the elevation adjustment.
One added benefit of using the optical sight systems 10 of the present invention is that the instructor can see the student's actual aim point by viewing the image generated for the student's electro-optical device on a separate monitor 17 connected to the CPU 16. This allows the instructor to see the same image that is displayed on the secondary image display 22.
In order to overcome issues such as image pixelization when viewing a primary image display 14 with optical magnification devices, the secondary image display panel 22 may be used to display an image for that particular optical device. Since some users (snipers and others) are particularly sensitive to having modifications made to their weapon sight 26 and are hesitant to train with equipment other than their own, a small device that attaches to the user's weapon sight 26 and allows the student to use all the adjustments of the weapon sight 26 is ideal for the firearms training simulation market. The image injected into the weapon sight 26 is specific to that optical device, and is provided based on a tracking algorithm used to determine the user's point of aim.
When a laser is used to track a moving target, the simulated weapon 20 will fire laser beams periodically. In order to reduce the load of the weapon simulation system 12 on the CPU 16, the period cannot be very short, particularly if the system is to track multiple targets on the primary image display 14. The rate of firing laser beams will be controlled to less than 15 times per second, ideally 10 times per second. But the rate for updating the image according to the coordinates of the laser spots must be at least 30 times per second. As a result, a tracking algorithm must be used for determining the image transmitted to the secondary display 22 during each frame.
The tracking algorithm used in the present invention is referred to as “intrapolation”, which is a combination of extrapolation and interpolation. Extrapolation is an estimation of a value obtained by extending a known sequence of values beyond the range that is certainly known, whereas interpolation is an estimation of a value between two known values in a sequence. Using interpolation makes the movement smoother, but increases the delay of the transmitted picture. By contrast, extrapolation has a shorter delay, but it causes excessive movement of the transmitted image. The excessive movement arises when the target suddenly stops or changes direction, which changes the student's aim point; the optical sight system 10 does not know this until the next laser coordinates have been obtained, and during this interim time it still updates the image according to the prior coordinates. This causes the target to “oscillate” several times before it stops. In the present design, interpolation and extrapolation are both used to obtain smooth tracking, shorter delay and less overshoot movement.
The tracking algorithm used in the present invention follows the process of “intrapolation” for generating the desired display. For purposes of the present invention, intrapolation is a combination of interpolation and extrapolation. The formula of intrapolation for the present invention is:
In the formula above:
Using these formulas, the change between the times to be intrapolated is determined. Because Δ is calculated from t_(n-1) and t_(n-2), the invention ignores the intrapolation of the first two laser traces. If t_u ≤ Δ, the values for the coordinates of the location of the center of the secondary image are calculated using interpolation; otherwise, extrapolation is used.
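Since the intrapolation formula itself does not survive in this text, the sketch below is a plausible reconstruction consistent with the surrounding description, not the patent's exact equation. It blends the two regimes exactly as described: interpolate while t_u ≤ Δ, extrapolate afterwards.

```python
# Assumed sketch of "intrapolation": p_prev and p_last are the two most
# recent laser-spot positions (one coordinate), sampled delta seconds
# apart, and t_u is the time elapsed since the last sample. The exact
# patent formula is not reproduced here; this blend is an assumption.

def intrapolate(p_prev, p_last, delta, t_u):
    """Estimate the current aiming point for a display frame.

    If t_u <= delta, interpolate between the two known samples
    (smooth, but lags one sample); otherwise extrapolate past the
    last sample using the implied velocity (responsive, but can
    overshoot when the target stops or turns)."""
    velocity = (p_last - p_prev) / delta
    if t_u <= delta:
        # Interpolation: move from the older sample toward the newer one.
        return p_prev + velocity * t_u
    # Extrapolation: continue past the newest sample.
    return p_last + velocity * (t_u - delta)

# Laser samples 0.1 s apart (10 Hz), display updated at 30 Hz:
mid = intrapolate(100.0, 110.0, 0.1, 0.05)    # between samples
ahead = intrapolate(100.0, 110.0, 0.1, 0.15)  # past the last sample
```

Each 30 Hz display frame calls this with the latest pair of 10 Hz laser samples, so the displayed aim point moves smoothly between laser updates.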
The test system is a sniper rifle equipped with a through-sight. The laser rate is 10 pulses per second. Using the tracking algorithm stated above, the tracking is smooth, fast and without much overshoot. When the rifle is at rest, such as being laid on the ground, the cross hair of the telescope does not move.
Furthermore, the noise in the image may be reduced by incorporating a Kalman filter. Using a laser to track a moving target introduces another problem: random noise. If the user requires high tracking accuracy, the noise cannot be ignored. A Kalman filter, which estimates the state of a system from measurements that contain random errors, is used to reduce this noise. When tracking a still target with the laser, the movement of the cross hair of the weapon sight 26 can then be controlled to within two screen pixels.
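A minimal one-dimensional Kalman filter of the kind mentioned above is sketched here. The noise variances are illustrative assumptions, not values from the patent; each laser coordinate axis would be filtered this way before the zoom image is positioned.

```python
# Minimal scalar Kalman filter smoothing noisy laser-spot coordinates.
# The process noise q and measurement noise r are illustrative values.

def kalman_smooth(measurements, q=1e-3, r=4.0):
    """Estimate a (nearly) still position from noisy measurements.
    q: process noise variance, r: measurement noise variance."""
    x, p = measurements[0], 1.0   # initial state estimate and variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                    # predict: state unchanged, variance grows
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update toward the measurement
        p *= (1 - k)
        estimates.append(x)
    return estimates

# Noisy readings of a still target near pixel 500:
smoothed = kalman_smooth([500, 503, 498, 501, 499, 502])
```

With a small process noise the filter trusts its accumulated estimate over any single noisy reading, so the cross hair settles instead of jittering with each laser pulse.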
In one embodiment of the invention, the optical sight system 10 is coupled with radio frequency (RF) technology and battery power (not illustrated) to provide a wireless version allowing unrestricted freedom of movement of the user.
The use of microdisplays 22 with image generators 18 makes this approach feasible for applications such as:
The purpose of the optical sight system 10 is to make the displayed image clearly seen through the weapon sight 26 without degrading the optical specification of the weapon sight 26. To achieve that, the image must be projected away from the weapon sight 26. The distance of the projection from the weapon sight 26 depends on the parallax-free distance of the weapon sight 26, or the distance at which there is no apparent displacement, or difference of position, of an object, as seen from two different stations, or points of view.
More specifically, if the parallax-free distance of a weapon sight 26 is 200 meters, then the image should be projected at 200 meters away from the weapon sight 26. Once the image is projected at 200 meters, the optical system 10 and the human eye can focus and produce a clear image on the human retina.
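The placement of the microdisplay relative to the lens can be worked out from the standard thin-lens equation; this is textbook optics offered as a hedged sketch, not a formula quoted from the patent. Placing the display just inside the focal length of a converging lens produces a virtual image of it at the sight's parallax-free distance.

```python
# Sketch using the thin-lens relation 1/f = 1/u - 1/v, with u the
# lens-to-microdisplay distance and v the (positive) virtual-image
# distance; rearranged, u = 1 / (1/f + 1/v). Standard optics, assumed
# here to be the relationship the patent relies on.

def display_distance(f_mm, parallax_free_mm):
    """Distance u from lens to microdisplay so the virtual image of the
    display appears at the weapon sight's parallax-free distance."""
    return 1.0 / (1.0 / f_mm + 1.0 / parallax_free_mm)

# A 120 mm lens projecting the display image out to 200 m:
u = display_distance(120.0, 200_000.0)   # just under 120 mm
```

Because the parallax-free distance is enormous compared with the focal length, u comes out only fractions of a millimeter inside the focal length, which is why the later examples simply place the display at the focal distance.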
Referring to
where
The use of the optical sight system 10 has been verified through testing with a tactical scope, a single convex lens, and a microdisplay unit 22 mounted on a micro-optical rail 27 attached to a rifle. The effect of the invention on the parameters of the optical sight system 10 was determined to be as follows. With respect to the magnification, since the image displayed on the secondary image display 22 is controlled by an image generator 18, if the image displayed on secondary image display 22 is properly scaled, then the image seen by the student using the optical sight system 10 has the equivalent magnification to the image that would be seen through the weapon sight 26. The eye relief, or the distance that the weapon sight 26 can be held away from the user's eye and still present the full field of view, did not change for the weapon sight 26 when in use with the optical sight system 10. The exit pupil, or the size of the column of light that leaves the eyepiece of a weapon sight 26, may be affected if the diameter of the added optical lens 24 is smaller than the objective lens diameter of the weapon sight 26. In particular, the larger the exit pupil, the brighter the image. The field of view, which is the side-to-side measurement of the circular viewing field or subject area, does not change if the generated image is scaled down correctly by the image generator 18. Parallax error can be adjusted by adjusting the distance between the secondary display image panel 22 and the optical lens 24, defined as “u” in the equation above. Parallax error is the condition that occurs when the image of the target is not focused precisely on the reticle plane. Parallax is visible as an apparent movement between the reticle and the target when the shooter moves his head or, in extreme cases, as an out-of-focus image.
The portion of the displayed image that can be seen through the weapon sight 26 depends on the focal length of the optical lens 24 and the field-of-view of the weapon sight 26. The following equation explains the relationship between the limiting dimension W (width or height) of the secondary image display panel 22, the field of view of the weapon sight 26, and the focal length f of the optical lens 24:
where
As a result, in order to minimize the pixelization of the image of the secondary image display panel 22 seen by the user through the weapon sight 26, the focal length of the optical lens 24 has to be selected so that the largest area of the display 22 is seen by the user. The remaining variables of the equation are fixed in view of the equipment used in the optical sight system 10.
Selection of the focal length of the optical lens 24 depends on the magnification of the weapon sight 26 and the size of the secondary image display panel 22. The magnification is the optical sight parameter that the user is most concerned with, and it is related to the FOV and other parameters of the optical sight. For the purpose of the present embodiment, the FOV used is the FOV published by the sight manufacturer (or determined experimentally if it is not known), and the last formula is used to solve for the focal length of the lens given the limiting dimension (W) of the secondary image display 22 being used.
For example, for a display panel 22 having 12 mm×9 mm dimensions, a 4× scope needs a 120 mm focal length lens and the secondary image display panel 22 should be put 120 mm away from the optical lens 24. In comparison, a 12× scope needs a 400 mm focal length lens and the image panel should be put 400 mm away from the optical lens 24. This means that a housing of the optical sight system 10 should be at least 400 mm long and should be attached to the 12× weapon sight 26 so that the image can be seen clearly and the maximum display area can be seen through the scope 26. If a 120 mm lens is used with the 12× weapon sight 26, the image area seen through the weapon sight 26 is only 1/9 of the display area that the 4× weapon sight 26 can see.
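The equation referenced above does not survive in this text; the form below, f = W / (2·tan(FOV/2)), is a standard geometric relationship between the limiting display dimension W, the sight's field of view, and the focal length, and it is consistent with the worked example (a 12 mm display and a roughly 5.7-degree FOV give about 120 mm). The FOV value used here is an assumption for illustration.

```python
import math

# Assumed geometric form of the focal-length selection rule:
# the limiting display dimension W should just fill the weapon
# sight's angular field of view at the focal distance f.

def focal_length(w_mm, fov_deg):
    """Focal length at which the limiting display dimension W
    subtends the weapon sight's full field of view."""
    return w_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

# Hypothetical ~5.7-degree FOV for the 4x-scope example above:
f4 = focal_length(12.0, 5.72)   # roughly 120 mm
```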
For illustrative purposes, consider an isosceles triangle formed by the limiting dimension (W) of the secondary image display 22 and the focal point 400 mm away. If the secondary image display 22 is moved closer to the focal point, so that it is only 120 mm away, the portion of the secondary image display 22 that lies between the legs of that triangle would be only about ⅓ as wide (actually 120/400, based on similar triangles). Since the FOV of a weapon scope 26 is generally conical, the area of the display that would be seen is essentially a circle. If the radius of the circle when the right focal length lens (400 mm) is selected is R, and the radius of the circle when the wrong lens (120 mm) is used is r, then the relationship between the two is approximately r = R/3. Since the area of a circle is equal to πR², the area of the small circle would be approximately 1/9 of the area of the large circle.
The single convex lens structure requires different optical sight systems 10 for different weapon sights 26, because it uses different optical lenses and different distances between the optical lens 24 and the secondary image display panel 22. For example, if a user has two weapon sights 26, one being 4× and one being 12×, then the user will need two optical sight systems 10 for the weapon sights 26: one shorter optical sight system 10 for the 4× scope and one longer optical sight system 10 (more than 400 mm long) for the 12× scope.
As a result of these limitations when the optical lens 24 is a single convex lens, another embodiment of the optical sight system 10 is provided as illustrated in
In the embodiment illustrated in
The optical sight system 10 of this embodiment increases the flexibility, reduces the production process and reduces the setup process for the different weapon sights 26. Furthermore, the optical sight system 10 provides the desired resolution of the image to the user through the weapon sight 26, such that the image retains clarity through the weapon sight 26, and the user is able to distinguish fine detail.
Referring to
One problem, however, is that when the barrel of the simulated weapon 20 is rotated, the secondary image display 22, which is physically affixed to the simulated weapon 20, should be physically rotated as well. Without detecting and compensating for this effect on the secondary image display 22, there is a visual discrepancy between the primary image display 14 and the secondary image display 22. Specifically, the image transmitted on the secondary image display 22 will remain at the same non-rotated position. Comparing
This visual discrepancy of
Hardware sensors 21 physically attached to the simulated weapon 20 detect the cant angle of the simulated weapon 20. Using firmware and low-level application program interface (API) code, the signal transmitted by the sensor 21 is used to compensate the image displayed by the secondary image display 22. More specifically, the software application of the CPU 16 creates a temporary display surface upon which it renders a part of the background image, as well as any targets 13 that should appear in the viewing area of the through-sight. This display surface of the secondary image display 22 is then counter-rotated using 3D techniques to texture map the display to a simple quadrangular polygon whose vertices are rotated. The rotation angle is equal and opposite to the cant angle reported by the low-level API; the API is used simply to read the cant angle sensor 21 and pass the value to the software application so that the software application can rotate the image by the same angle.
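The counter-rotation step above can be sketched as a plain 2D rotation of the quadrangle's vertices. The vertex layout and rotation center are assumptions for illustration; in the actual system the rotated quad would be handed to the 3D texture-mapping pipeline.

```python
import math

# Illustrative sketch: rotate the texture-mapped quad's vertices by the
# negative of the sensed cant angle so the image on the physically
# canted display appears upright. Layout and centering are assumptions.

def counter_rotate(vertices, cant_deg, cx=0.0, cy=0.0):
    """Rotate quad vertices by -cant_deg about (cx, cy)."""
    a = math.radians(-cant_deg)
    c, s = math.cos(a), math.sin(a)
    return [(cx + (x - cx) * c - (y - cy) * s,
             cy + (x - cx) * s + (y - cy) * c) for x, y in vertices]

# A unit quad counter-rotated for a 90-degree weapon cant:
quad = [(1, 0), (0, 1), (-1, 0), (0, -1)]
rotated = counter_rotate(quad, 90.0)
```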
Following the rotation operation, a sight mark overlay is applied which gives the effect of crosshairs, reduces the visually displayed area to a circle (to simulate actual weapon sights 26), and may display other information, such as the Field of View illustrated in
The initial testing occurred under simulated conditions, with weapon cant simulated by keyboard input to generate weapon cant sensor data packets to within 0.4 degrees accuracy, and was later verified by testing with a simulated weapon fitted with a cant angle sensor. The image on the secondary image display 22 can be observed to rotate as the simulated cant changes, and the relation between target positioning and the background image is preserved despite rotation and magnification of the image broadcast on the secondary image display 22.
The sequence diagram for the rotated through-sight is shown in
While this invention has been described with reference to preferred embodiments thereof, it is to be understood that variations and modifications can be effected within the spirit and scope of the invention as described herein and as described in the appended claims.
This application claims the benefit of U.S. Provisional Patent Application No. 60/514,815, filed Oct. 27, 2003, which is herein incorporated by reference in its entirety.