The present invention relates generally to devices that display targeting information to a shooter, and more particularly, to such devices that may be mountable to a firearm and display such information through a weapon sighting scope.
Weapon-mounted targeting display devices cooperate with a firearm scope to present certain targeting information within a sight picture (or field of view (FOV)) containing an image of a target. Such targeting display devices thus enable a shooter to view the target and the targeting information simultaneously without breaking cheek-to-stock weld, or otherwise losing target awareness. Such display devices may communicate with a ballistic computer (also referred to as a “calculator” or “solver”) that calculates a ballistic solution using known equations and variables including bullet type, range to target, cant angle, angle of incline/decline to target, wind speed and direction, elevation, temperature, humidity, and barometric pressure, for example, measured by various sensors. The ballistic solution generated by the ballistic computer may then be displayed in various forms by a display unit to the shooter.
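The role of the ballistic computer can be illustrated with a deliberately simplified, hypothetical sketch. The function below ignores aerodynamic drag, wind, spin drift, and atmospheric effects that a real solver of the kind described here would account for; it shows only how range, muzzle velocity, and incline angle combine into an angular hold. The function name and units are assumptions for illustration.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def ballistic_hold_mrad(range_m, muzzle_velocity_mps, incline_deg=0.0):
    """Vacuum-trajectory elevation hold in milliradians.

    A simplified illustration only: real solvers add drag models,
    wind, spin drift, and atmospheric corrections.
    """
    # Only the horizontal component of slant range drives gravity drop
    # relative to the line of sight ("rifleman's rule").
    horizontal_m = range_m * math.cos(math.radians(incline_deg))
    tof_s = range_m / muzzle_velocity_mps   # crude time of flight
    drop_m = 0.5 * G * tof_s ** 2           # gravity drop over the flight
    return drop_m / horizontal_m * 1000.0   # small-angle approximation

print(ballistic_hold_mrad(500, 800))  # ~3.83 mrad for 500 m at 800 m/s
```

Even this toy version makes clear why the sensor inputs listed above matter: each additional measured variable (incline, wind, pressure) perturbs the hold the shooter must apply.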
Various known targeting display devices generate a digitally reproduced image of a target, and overlay targeting information on the digitally reproduced image. A notable disadvantage of this design is that upon an unintended power failure of the display device, the digitally reproduced image generated by the display device disappears, leaving the shooter unable to view the target directly with the scope. Others have attempted to integrate an entire system into a single optical device or to display limited information. Accordingly, there is a need for improvement to address these and other shortcomings of known firearm-mounted targeting display devices.
Data input or information displayed to the operator of a given optical system may be provided within the field of view using an optical projection system of the invention, such that the data is provided via a superimposed projected image emanating from an internal illumination engine housed within the given optical system. The given optical system (such as a rifle scope, binoculars, spotting scope, etc.) containing the projection system superimposes data pertaining to the required information such that the operator can accurately define the relationship between said operator and a selected target without loss of continuous visual contact with the target. This data could include measurements of range to target, angle of incline/decline to target, ambient temperature, wind velocity, operator position, or any other variables required to define the relationship between operator and target.
The defined relationship between operator and target can then be used for calculating a ballistic solution, logistics solution, aiming point, or any number of other applications where remote target data is required. The projection system allows for selectable on-off visibility that removes all aspects of the data, and data display region, from the FOV, providing superior image clarity and a fully unobstructed FOV for the operator as desired.
Use of the disclosed projection system allows for the data image to be composed of one or more micrometer-sized pixels, providing the smallest possible obscuration within the FOV, which can also be freely positioned across a large portion of the FOV. This is a substantial departure from other systems such as fixed-location, large-obscuration 7-segment digital displays. The projection system is capable of displaying data information in the form of any combination of one or more pixels such as alphanumeric characters, shapes, or symbols, for example. The projection system allows for the use of a single micro-display panel to produce images in any combination of red, green, or blue within the visible spectrum, as well as the potential to produce images in infrared (IR) or ultraviolet (UV) wavelengths, allowing for use with night vision goggles. The projection system may also enable manual or automatic adjustments to display intensity.
Other aspects, features, benefits, and advantages of the present invention will become apparent to a person of skill in the art from the detailed description of various embodiments with reference to the accompanying drawing figures, all of which comprise part of the disclosure.
Like reference numerals are used to indicate like parts throughout the various drawing figures, wherein:
With reference to the drawing figures, this section describes particular embodiments and their detailed construction and operation. Throughout the specification, reference to “one embodiment,” “an embodiment,” or “some embodiments” means that a particular described feature, structure, or characteristic may be included in at least one embodiment. Thus appearances of the phrases “in one embodiment,” “in an embodiment,” or “in some embodiments” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the described features, structures, and characteristics may be combined in any suitable manner in one or more embodiments. In view of the disclosure herein, those skilled in the art will recognize that the various embodiments can be practiced without one or more of the specific details or with other methods, components, materials, or the like. In some instances, well-known structures, materials, or operations are not shown or not described in detail to avoid obscuring aspects of the embodiments.
Referring now to the various figures of the drawing, and beginning with
The scope 10 has a body that includes a scope tube 14 having a length extending along a longitudinal scope axis (optical axis), an objective bell 16 extending distally from the scope tube 14 and defining a first (“objective”) end 18 of the scope 10, and an eyepiece 20 extending proximally from the scope tube 14 and defining a second (“eyepiece” or “ocular”) end 22 of the scope 10. As shown in
Referring to
As shown schematically in
As shown in
As shown schematically in
Advantageously, projection of the digital data image 38 generated by the optical projection system 12 may be selectively activated and deactivated by the shooter/spotter as desired. For example, the projection system 12 may be placed in an ON mode in which the digital data image 38 is generated and projected into the field of view (or sight picture) visible to the shooter through the eyepiece 20. Alternatively, the projection system 12 may be placed in an OFF mode in which generation and/or projection of the digital data image 38 into the sight picture is deactivated, such that the sight picture viewed by the shooter through the eyepiece 20 presents only the optical image 30 of the distant target object, optionally in combination with a physical reticle image. Accordingly, a total failure of the projection system 12, such as by loss of electrical power, will default to a fully functional, traditional riflescope with a manually adjustable, physical reticle.
Referring to
As shown in
In addition to providing a mounting base for certain imaging components of the optical projection system 12, the circuit board assembly 40 functions electronically to assimilate input data and to output graphics to a micro-display 60, described below, for generation of a digital data image. In exemplary embodiments, the circuit board assembly 40 may include direct user input elements, in the form of one or more buttons 61, for example, and may include one or more integrated positional sensors (not separately labeled). Such positional sensors may include a compass and/or a three-axis accelerometer, for example, of well-known design. As described below in connection with
Referring to
In brief summary, and as described in greater detail below in connection with
Referring to
The primary beamsplitter 62 may be oriented about the scope axis such that an upper face 72 extends generally horizontally. As shown, the corrective optics assembly 54, linear polarizer element 56, wave plate 58, micro-display 60, and secondary beamsplitter 64 are supported generally at or above the upper face 72. For example, the secondary beamsplitter 64 may be supported directly on the upper face 72, as described below.
The illumination source 52 is shown in the form of a light emitting diode (LED) light source 52a, which may utilize an operational wavelength (or a combination of multiple wavelengths) within known visual bands (e.g., 350-700 nm). In one embodiment, the illumination source 52 may be in the form of a red LED light source. In alternative embodiments, the illumination source 52 may be of various other types known in the art, and may utilize light wavelengths within known UV or IR bands, for example. As such, the data image generated by the optical projection system 12 may be viewable with a night vision imaging device, such as night vision goggles worn by a shooter/spotter, or with various thermal imaging devices, for example. In the illustrated embodiment, the illumination source 52 is supported with the corrective optics assembly 54 generally above the primary beamsplitter 62, and both are laterally offset from the upper face 72 thereof.
The corrective optics assembly 54 of the illustrated embodiment includes a total internal reflection (TIR) lens 74, a field lens 76, and a reflective element 78. In an exemplary embodiment, the field lens 76 may be in the form of a plano-convex (PCX) lens. Additionally, while the reflective element 78 is shown in the form of a mirror plate, in alternative embodiments the reflective element 78 may be of various other suitable types and shapes.
In the embodiment shown, the TIR lens 74 is positioned adjacent to the illumination source 52 and faces transverse to the longitudinal axis of the optical projection system 12. The field lens 76 is positioned generally perpendicular to the TIR lens 74, and the mirror plate 78 is positioned angularly between the TIR lens 74 and the field lens 76 so as to redirect light from the TIR lens 74 to the field lens 76. The mirror plate 78 may be oriented at any suitable angle relative to the TIR lens 74, such as approximately 45 degrees, for example. The TIR lens 74, field lens 76, and mirror plate 78 of the corrective optics assembly 54 cooperate to shape light emitted by the LED light source 52 into a corrected beam profile that generally matches a beam profile of the optical image to be directed through the scope 10 from the objective lens assembly 24. In that regard, the TIR lens 74 captures and collimates and/or focuses light emitted by the LED light source 52, and redirects the focused light toward the mirror plate 78, which in turn redirects the light through the field lens 76.
The use of corrective optics within the optical projection system 12 advantageously enables precise control of display focus and parallax within a field of view presented to a shooter through the scope eyepiece 20. By properly matching the beam profile of light emitted by the illumination source 52 with a beam profile of light directed through the scope 10 from the distant target object, the projection system 12 maximizes its performance efficiency, thereby maximizing the life of a power source (e.g., battery) for the projection system 12, while providing a data image of optimum visual quality. Further, to a shooter/spotter looking through the scope eyepiece 20, the data image generated by the projection system 12 appears to be flat and parallax free, similar to the reticle. Additionally, the projection illumination technique employed by the projection system 12, in combination with the corrective optics assembly 54, allows for delivery of a majority of the light rays generated by the illumination source 52 to the shooter/spotter's eye, thereby providing an enhanced data image relative to conventional targeting display devices in which the shooter directly views a backlit type display, for example.
The secondary beamsplitter 64 is spaced from the field lens 76 by a linear polarizer element 56, which may be in the form of a wire grid style linear polarizer, for example. In combination, the secondary beamsplitter 64, the linear polarizer element 56, and a wave plate 58 (described below) define a polarization-control optics assembly that improves efficiency and contrast of the digital data image produced by the optical projection system 12.
Referring to
The secondary beamsplitter 64 may be oriented such that a lower face thereof abuts the upper face 72 of the primary beamsplitter 62, and an upper face of the secondary beamsplitter 64 is directed toward the micro-display 60. As shown in
After light generated by the LED light source 52 (reflected, in this case, by the mirror plate 78) passes through the field lens 76, it continues through the linear polarizer element 56, which restricts a polarization state of the light. Upon exiting the linear polarizer element 56, the light enters the secondary beamsplitter 64 through a side face thereof, and is reflected at least in part by the secondary beamsplitter 64 in a direction toward the micro-display 60. Before reaching the micro-display 60, the linearly polarized light passes through the wave plate 58, which may be in the form of a quarter wave plate, for example. The wave plate 58 functions as an optical retarder (or “compensator”) to alter the polarization state of the light and thereby improve image contrast of the data image ultimately produced using the micro-display 60. In alternative embodiments, the wave plate 58 may be omitted and a compensator may be integrated within the micro-display 60, for example.
Polarization control provided by the secondary beamsplitter 64, the linear polarizer element 56, and the wave plate 58 advantageously enhances the data image generated by the micro-display, described below. In particular, polarization control provides the data image with optimal image contrast and brightness, and provides optimum control of stray light noise, thereby maximizing image quality of the data image.
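The double pass through the quarter-wave plate can be checked with elementary Jones calculus. The sketch below is illustrative only: it idealizes the reflective LCOS panel as a plain mirror and ignores the beamsplitter coatings, but it shows why two passes through a quarter-wave plate with its fast axis at 45 degrees rotate linear polarization by 90 degrees, so the returning light takes the other branch at the polarizing interface.

```python
import math

def matmul(a, b):
    """Product of two 2x2 complex matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(m, v):
    """Apply a 2x2 Jones matrix to a Jones vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def waveplate(retardance_rad, axis_rad):
    """Jones matrix of a wave plate with its fast axis at axis_rad."""
    c, s = math.cos(axis_rad), math.sin(axis_rad)
    rot, rot_inv = [[c, s], [-s, c]], [[c, -s], [s, c]]
    core = [[1.0, 0.0],
            [0.0, complex(math.cos(retardance_rad), math.sin(retardance_rad))]]
    return matmul(rot_inv, matmul(core, rot))

# Quarter-wave plate (90-degree retardance), fast axis at 45 degrees.
qwp = waveplate(math.pi / 2, math.pi / 4)

# Horizontally polarized light from the linear polarizer passes the plate,
# reflects from the LCOS panel (idealized here as a plain mirror), and
# passes the plate again; the double pass acts as a half-wave plate.
h_polarized = [1.0, 0.0]
out = apply(qwp, apply(qwp, h_polarized))
# 'out' is vertically polarized (up to an overall phase), so the
# polarizing interface now routes the light toward the eyepiece
# instead of back toward the illumination source.
```

This is the standard mechanism behind LCOS polarization control; the actual retardance and axis angles used in the disclosed system are not specified in the text.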
As best shown in
In exemplary embodiments, the micro-display 60 may be in the form of a reflective liquid crystal on silicon (LCOS) display panel. Advantageously, an LCOS display panel provides benefits including: superior display brightness (intensity) control for use in a variety of lighting conditions; superior display contrast for improved image quality (e.g., darker blacks and whiter whites) for producing a more realistic data image; even pixel illumination for consistent image color and intensity; accurate pixel-by-pixel control without screen mesh image degradation, as may be exhibited by non-LCOS micro-displays such as organic light emitting diode (OLED) displays; and superior pixel density (expressed as pixels per inch or PPI) for superior definition display, such as at least 720p and up to 4K, for example. In exemplary embodiments, an LCOS micro-display may exhibit a pixel size of 9.4 micrometers (μm) or less, such as approximately 6.4 μm, for example. In one embodiment, the micro-display 60 may be in the form of an LCOS display panel of a full-color HD type, having a 1024×600 high-brightness color resolution. In alternative embodiments, the micro-display 60 may be in the form of various other display types, such as an OLED or a liquid crystal display (LCD), for example.
Referring to
The reflected light 86 illuminates the display screen 80 of the micro-display 60, which displays targeting information in the form of one or more data characters (see, e.g.,
Still referring to
As shown in
Though not depicted in
The inner hypotenuse face 92 of the primary beamsplitter 62, and optionally also the inner hypotenuse face 84 of the secondary beamsplitter 64, may include one or more coatings for controlling light reflection and transmission. Exemplary coatings may include various dielectric coatings configured to provide a selected light reflection/transmission ratio, and which may be adapted for polarizing or non-polarizing use. Exemplary coatings may also include various dichroic coatings configured to provide a selected light reflection/transmission ratio specific to certain wavelengths. Advantageously, a dichroic coating may provide fine control over polarization components across a broad range of light wavelengths. Moreover, use of a dichroic coating may facilitate maximum light transfer through the optical projection system 12 and maximum brightness of the data image 88 generated with the micro-display 60, while reflecting only the light wavelengths projected from the micro-display 60 as data image light 88.
Exemplary coatings for one or both of the inner hypotenuse faces 84, 92 may also include various metallic coatings, which can include silver or aluminum, for example, configured for use across a wide range of light wavelengths and providing a selected light reflection/transmission ratio. In an exemplary embodiment, the inner hypotenuse face 92 of the primary beamsplitter 62 may include a coating composed of dielectric and silver layers configured to provide approximately 70-80% transmission and approximately 20-30% reflection of the data image light 88 received from the micro-display 60.
Referring to
The exemplary data image 88 shown in
A ballistic solution may be presented by the data image 88 in the form of one or more values indicating a respective number of turret adjustment clicks, minutes of angle (MOA), or milliradians (MRAD), for example, by which shooters must correct their aim, for example by adjusting the elevation and/or windage adjustment turrets 34, 36, to hit the target object 96. In the illustrated example (
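The conversion from an angular correction to turret clicks is simple arithmetic; the click values below (1/4 MOA and 0.1 MRAD) are common commercial turret graduations used as assumed examples, and the corrections are hypothetical figures, not values taken from the illustrated example.

```python
def turret_clicks(correction, click_value):
    """Turret clicks (nearest whole click) for an angular correction.

    Both arguments must be in the same angular unit (MOA or MRAD).
    """
    return round(correction / click_value)

# Hypothetical corrections with common turret graduations:
elevation = turret_clicks(2.5, 0.25)  # 2.5 MOA up, 1/4-MOA clicks -> 10
windage = turret_clicks(0.8, 0.1)     # 0.8 MRAD, 0.1-MRAD clicks -> 8
```

Displaying the result directly in clicks, as the text contemplates, spares the shooter this unit conversion under time pressure.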
In other embodiments, the ballistic solution presented by the data image 88 within the sight picture 94 may include an image or a video of a corrected aim point, displayed in the form of a projected point or, for example, an auxiliary aiming reticle 101 (shown in the low magnification view of
In other exemplary embodiments of the invention, the optical projection system 12 may be suitably configured so that the sight picture 94 presented within the scope eyepiece 20 includes thermal and/or night vision overlays.
Referring now to
The optical projection system 12 communicates with the environmental sensing devices 120, the rangefinder 122, the spotting scope 124, and the mobile device 126 over a network 128. The network 128 may include one or more networks, which may be in the form of a wide area network (WAN) or a local area network (LAN), for example. The network 128 may provide wireless or wired communication between the optical projection system 12 and the other devices 120, 122, 124, 126. In exemplary embodiments, the network 128 may employ various forms of short-range wireless technologies, such as Bluetooth® or near-field communication (NFC), for example. In other embodiments, the network 128 may utilize one or more network technologies such as Ethernet, Fast Ethernet, Gigabit Ethernet, virtual private network (VPN), remote VPN access, or a variant of IEEE 802.11 standard such as Wi-Fi and the like, for example. In an exemplary embodiment, the optical projection system 12 may communicate with the environmental sensing devices 120 and/or the rangefinder 122 directly using USB and/or RS-232, for example. The network 128 may act as a data and/or power bus. Communication over the network 128 may take place using one or more network communication protocols, including reliable streaming protocols such as transmission control protocol (TCP). It will be understood that these examples are merely illustrative and not exhaustive.
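The text leaves the wire format unspecified. As one plausible framing for a reliable stream such as TCP, each sensor reading could be sent as a length-prefixed JSON message; the function names and field names below are assumptions for illustration.

```python
import json
import struct

def encode_reading(reading):
    """Frame one sensor reading as length-prefixed JSON: a big-endian
    4-byte length followed by the UTF-8 payload."""
    payload = json.dumps(reading).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_reading(frame):
    """Recover the reading from one complete frame."""
    (length,) = struct.unpack(">I", frame[:4])
    return json.loads(frame[4:4 + length].decode("utf-8"))

reading = {"sensor": "rangefinder", "range_m": 850.0}
assert decode_reading(encode_reading(reading)) == reading
```

Length prefixing matters on a stream protocol because TCP delivers bytes, not messages; without it the receiver cannot tell where one reading ends and the next begins.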
As shown in
The optical projection system 12 may also communicate with additional sensors including a compass 132, an accelerometer 134, and a light sensor 136 that detect and inform the optical projection system 12 (and/or processor 130) of certain respective conditions. For example, the compass 132 may detect and inform the optical projection system 12 of a direction in which the scope 10 is pointed (e.g., North, South, East, West, and compass azimuth intervals thereof). The accelerometer 134, which may be a 3-axis accelerometer, for example, may detect and inform the optical projection system 12 of an orientation in which the scope 10 is supported (e.g., cant angle and/or inclination angle). The light sensor 136 may detect and inform the optical projection system 12 of an ambient lighting condition to adjust the intensity of the displayed image and/or other variables. As schematically represented in
Generally, devices like an electronic compass 132, a three-axis accelerometer 134 for sensing orientation, and a light sensor 136 are items that presently have been miniaturized, fully provide the required level of accuracy and precision, and are commercially available at very low cost. Thus, it is beneficial and cost effective to integrate such features into an optical projection system 12 that is in or on the housing of the scope 10. In contrast, devices like laser rangefinders 122 and some environmental sensing devices 120 can vary greatly in cost, depending on size and quality, and are still the subject of developing technological advancements. Thus, having a laser rangefinder 122, for example, that is separate from the scope 10 and/or optical projection system 12 allows the user to choose a device of the appropriate level of quality and cost for their specific needs or budget. Likewise, wind sensing and other environmental measurement devices are available in a variety of types, quality, and price points. The technology of wind sensing, for example, is rapidly developing from rotary vane anemometers, available now in a compact size and at relatively low cost, to thermal and athermalized infrared laser wind sensing devices that are currently significantly larger, not widely available commercially, and significantly higher in cost. Thus, having certain environmental sensing devices that are separate from the scope 10 and/or optical projection system 12 allows the user to choose a device of the appropriate level of quality and cost for their specific needs or budget, and to upgrade the device to one that includes later-developed technologies without having to replace the optical components 10, 12 of the ballistic solution calculating and aiming system.
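The quantities these integrated sensors supply can be sketched in a few lines. The axis convention and the brightness policy below are assumptions chosen for illustration, not details taken from the disclosure.

```python
import math

def cant_and_incline_deg(ax, ay, az):
    """Cant (roll about the bore axis) and inclination (pitch of the
    bore) from a static 3-axis accelerometer reading, assuming
    x = right, y = forward along the bore, z = up (axis convention
    is an assumption)."""
    cant = math.degrees(math.atan2(ax, az))
    incline = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    return cant, incline

def display_intensity(ambient_lux, lo=0.05, hi=1.0):
    """Map ambient light to a display drive level on a log scale:
    ~1 lux (night) -> lo, ~100,000 lux (direct sun) -> hi.
    One plausible automatic-brightness policy, not the patented one."""
    t = math.log10(max(ambient_lux, 1.0)) / 5.0
    return lo + (hi - lo) * min(t, 1.0)

# A rifle canted 10 degrees right on level ground (readings in g):
tilt = math.radians(10.0)
cant, incline = cant_and_incline_deg(math.sin(tilt), 0.0, math.cos(tilt))
```

The logarithmic mapping reflects the roughly logarithmic response of the human eye, which is why a linear lux-to-brightness mapping tends to look too dim indoors or too bright at night.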
The rangefinder 122 may detect and communicate to the optical projection system 12 (and processor 130) a distance (or “range”) measured from the scope 10 to the target object 96, and may be of various suitable types known in the art. The environmental sensing devices 120 may detect and communicate to the optical projection system 12 various environmental conditions including temperature, pressure, humidity, wind speed, wind direction, elevation, and global positioning system (GPS) location, for example, corresponding to a region in the immediate vicinity of the scope 10. Alternatively, GPS location technology is commonly included in many mobile devices 126 (smartphones). In one embodiment, the environmental sensing devices 120 may include a handheld weather meter of various suitable and commercially available types known in the art, such as a Kestrel® (made by Nielsen-Kellerman Co. of Minneapolis, Minn.) or WeatherFlow™ (made by WeatherFlow, Inc. of Scotts Valley, Calif.) brand weather meter or sensing component attachable to a smartphone or other hand-held personal digital device, for example. The environmental sensing devices 120 may also include a GPS tracking device that detects and communicates to the optical projection system GPS coordinates corresponding to an area in which the firearm scope 10, and firearm, is located. In another embodiment, the GPS tracking device may be integrated within the structure of the optical projection system 12, it may be mounted to the scope 10 or firearm and coupled directly to the optical projection system 12, or it may be integrated into the mobile device 126 (smartphone).
As described above, the environmental sensing devices 120 and the rangefinder 122 may communicate wirelessly with the firearm optical projection system 12. Accordingly, and advantageously, these devices 120, 122 may be arranged separately from the scope 10 and may be selectively paired with the optical projection system 12 as desired. For example, in some embodiments one or more of the devices 120, 122 may be mounted directly to the firearm, separately from the scope 10. In other embodiments, one or more of the devices 120, 122 may be arranged separately from the firearm but within close enough proximity to the scope 10 to enable wired or wireless data communication, for example via Bluetooth®, between the devices 120, 122 and the optical projection system 12.
The environmental measurements collected by the rangefinder 122 and the one or more environmental sensing devices 120 are communicated to the optical projection system 12 via signals sent over the network 128. In response to receiving the signals and, optionally, an instruction provided by the shooter/spotter, the optical projection system 12 may calculate, with its processor 130, an appropriate ballistic solution based on the environmental measurement values and using known mathematical formulae. The ballistic solution may then be presented to the shooter/spotter via the data image 88, as generally described above. The shooter may then make appropriate aiming corrections while maintaining sight of the target object 96 through the scope 10. The GPS coordinates provided by the GPS tracking device may be displayed with the ballistic solution in the data image 88 (not shown).
Alternatively, a second or alternate ballistic solution calculation processor (not shown) can be located separate from the scope 10 and optical projection system 12, such as part of an environmental sensing device 120 or as a software application on a mobile device 126. For example, if the user prefers a ballistic solution calculator that uses a different algorithm and/or one that uses different stored and/or sensed data, it can connect with the optical projection system 12 via the network 128 to use it as a passive or “dumb” display device to overlay an alternate data image 88 with the optical target image 90. Although there are technological benefits to having the processor 130 integral with the optical projection system 12, such as so that a graphic data image does not have to be communicated through a data bus or network 128, this feature can allow the user flexibility and future adaptability without having to modify or replace the projection system 12 and/or optical scope 10.
A “mobile device” 126 may be in the form of a cell phone (smartphone), tablet computer, or laptop personal computer, for example, and may communicate with one or more devices of the targeting information system 118, over the network 128, to send and/or receive targeting information as desired. In one embodiment, the mobile device 126 may communicate with the optical projection system 12, or with the rangefinder 122 and environmental sensing devices 120, to receive signals providing the measurements collected by the rangefinder 122 and sensing devices 120. In another embodiment, the mobile device 126 may communicate with the optical projection system 12 to receive signals providing the ballistic solution displayed via the data image 88. In either embodiment, the mobile device 126 may then display the information that it receives, via the signals, on its own display. A similar network connection can be made to a desktop computer (not shown) when not in the field to upload or download ballistic data or to make other software or firmware changes.
In other exemplary embodiments, the mobile device 126 may be utilized to send information to the optical projection system 12. For example, in one embodiment, the mobile device 126 may run a software application that can be used by the shooter, or another user, to input information or instructions that the mobile device 126 communicates to the optical projection system 12 via the network 128. Such information may include ballistic information or instructions specifying when a ballistic solution is to be calculated and how the ballistic solution is to be displayed via the data image 88. In another embodiment, the mobile device 126 may serve as a GPS tracking device, and may communicate GPS coordinates to the optical projection system 12 to be displayed within the data image. In yet another embodiment, the mobile device 126 may receive measurements collected by the rangefinder 122, environmental sensing devices 120, compass 132, and accelerometer 134, and may then calculate the ballistic solution, for example using the software application. The mobile device 126 may then communicate the ballistic solution to the optical projection system 12 for presentation to the shooter/spotter via the data image 88.
A spotting scope 124 may be positioned remotely from the firearm scope 10 having the optical projection system 12, but within close enough proximity to the firearm scope 10 to enable wired or wireless communication, for example via Bluetooth®, between the spotting scope 124 and the optical projection system 12 of the firearm scope 10. Likewise, the spotting scope 124 may include a second optical projection system 138, having a processor 140, similar in construction and function to the optical projection system 12 integrated within the firearm scope 10. The spotting scope optical projection system 138 may communicate over the network 128 with one or more of the devices of the targeting information system 118, such as the firearm scope optical projection system 12, rangefinder 122, or environmental sensing devices 120, so as to generate and display the same data image 88 displayed by the optical projection system 12 of the firearm scope 10. In this manner, a spotter looking through an eyepiece of the spotting scope 124 may advantageously view the same targeting information, including a ballistic solution, for example, viewed by the shooter looking through the eyepiece 20 of the firearm scope 10.
The optical projection system 12 of the firearm scope 10 may be powered by any suitable power source. In one embodiment, the projection system 12 may include an integrated power source, such as a battery (not shown). In other embodiments, the projection system 12 may be directly coupled to and powered by an external power source through a detachable connector 45. For example, the external power source may be housed within or otherwise mounted to any suitable portion of the firearm, such as a mounting device for the scope 10 or the firearm stock/chassis. In another embodiment, the external power source may be a power source of the rangefinder 122 or any of the environmental sensing devices 120. The optical projection system 12 may include a power regulator and a converter (not shown) for modifying power input as needed.
In another exemplary embodiment, the targeting information system 118 may include two or more optical projection systems 12, each corresponding to a respective firearm scope 10. Each optical projection system 12 may receive the same environmental measurements from a common or separate rangefinder 122 and/or common environmental sensing devices 120, and may reference its own respective compass 132 and accelerometer 134, for example, to calculate its own respective ballistic solution to be displayed to its respective shooter. In an exemplary embodiment, each optical projection system 12 may communicate with its own respective GPS tracking device for detecting a GPS location of the firearm scope 10 and the respective shooter. The GPS location detected by each GPS tracking device may be communicated over the network 128 to each of the other optical projection systems 12, 138. Accordingly, each of the optical projection systems 12, 138 may display, via its respective data image, the GPS location of each of the multiple shooters to a common spotter or observer. Inclusion of a camera (not shown) in the firearm scope 10 and/or spotting scope 124 may allow one shooter or spotter to communicate to another an image of the target area, temporarily overlaid on the optical image, so as to show the other some visual aspect of the target area without either of them having to disrupt their viewing positions.
While the present invention has been illustrated by the description of specific embodiments thereof, and while the embodiments have been described in considerable detail, it is not intended to restrict or in any way limit the scope of the appended claims to such detail. The various features discussed herein may be used alone or in any combination. Additional advantages and modifications will readily appear to those skilled in the art. The invention in its broader aspects is therefore not limited to the specific details, representative apparatus and methods and illustrative examples shown and described. Accordingly, departures may be made from such details without departing from the scope of the general inventive concept.
This application is a national phase application of, and claims priority to, International Patent Application No. PCT/US2017/052930, filed on Sep. 22, 2017, which claims priority to U.S. Provisional Patent Application No. 62/398,296, filed on Sep. 22, 2016, the disclosures of which are incorporated herein by reference.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2017/052930 | 9/22/2017 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2018/057872 | 3/29/2018 | WO | A |
Entry |
---|
International Bureau of WIPO, International Preliminary Report on Patentability issued in corresponding PCT Application No. PCT/US2017/052930, dated Mar. 26, 2019, 7 pages. |
Brashear LP, Integrated Ballistic Reticle System (IBRS), Data Sheet, Oct. 17, 2013, Brashear LP (Cleared by the US Army), www.L-3com.com/Brashear, Pittsburgh, PA. |
European Patent Office, International Search Report & Written Opinion issued in related international application No. PCT/US2013/067755, dated Sep. 22, 2015, 9 pages. |
Number | Date | Country |
---|---|---|
20190219813 A1 | Jul 2019 | US |
Number | Date | Country |
---|---|---|
62398296 | Sep 2016 | US |