The present disclosure relates to imagers and optics for weapon systems, and more particularly to imager systems used or integrated with optical sights, such as on weapons or weapon systems.
Firearms and other hand-held weapons, such as bows, are commonly provided with optical sights that provide a reticle in the field of view to aid in aiming the weapon. Conventional sporting/combat optical sights often use holographic optics that, when viewed through a glass optical window, superimpose a holographic image of a reticle at a distance in the field of view. The hologram image of the reticle is illumined by a laser diode in the holographic sight and is projected parallel to and a relatively short vertical distance from the barrel or aiming axis of the firearm upon which the sight is mounted.
It is known to position a low-light digital camera or a digital night vision optic in front of the optical sight, such that the projected reticle image is superimposed in the field of view imaged by the camera or digital optical device. The digital optical device may be mounted to the weapon with fixtures and mounting devices that are mechanically adjusted with shims, risers, and the like to roughly align with the optical window of the optical sight. These mounting techniques and mechanical adjustments can be unreliable and create instability on the weapon.
The present disclosure provides a weapon system that includes an optical sight and an optoelectronic device that is digitally aligned with the focal plane of the optical sight to display a desired and accurate viewing frame to the operator of the weapon. In one aspect of the disclosure, the optical sight includes a base that is configured to mount to a weapon and a frame that is coupled to the base. The frame of the optical sight has a sight window that is configured to superimpose a reticle that is visible through the sight window in a first focal plane. The optoelectronic device has a mounting feature configured to mount to the weapon and an imager with a sensor array configured to receive light from an objective end of the optoelectronic device, where the objective end is configured to face the optical sight when the optoelectronic device is attached to the weapon. An image processor is configured to receive image data captured by the sensor array and process the image data to generate a subset image that is received from a select region of the sensor array. The select region of the sensor array defines a second focal plane. A controller is configured to receive an input from an operator, and in response to the input, to select the select region of the sensor array for aligning the second focal plane with the first focal plane. A display device is configured to display the subset image to the operator of the weapon.
Implementations of the disclosure may include one or more of the following optional features. In some implementations, the sensor array of the imager includes a plurality of photosensitive pixels disposed in a grid, such that the select region of the sensor array includes a grouped subset of the plurality of photosensitive pixels in the grid. In some examples, the input may indicate a directional adjustment that is configured to move the select region to an adjacent grouped subset of the plurality of photosensitive pixels in the grid. In other examples, the input may indicate a size adjustment that is configured to increase or decrease the area of the select region to a corresponding larger or smaller grouped subset of the plurality of photosensitive pixels in the grid. A border of the subset image may be framed at an edge of the sight window when the second focal plane is aligned with the first focal plane.
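By way of illustration only, the select region described above can be modeled as a rectangular grouped subset of pixels on the sensor grid, with directional and size adjustments moving or resizing that subset. The following minimal Python sketch is not from the disclosure; the class and function names, and the assumed sensor dimensions, are hypothetical.

```python
from dataclasses import dataclass

# Assumed sensor array dimensions (illustrative only).
SENSOR_W, SENSOR_H = 1920, 1080

@dataclass
class SelectRegion:
    x: int  # left edge (column index on the grid)
    y: int  # top edge (row index on the grid)
    w: int  # width in pixels
    h: int  # height in pixels

    def clamp(self) -> "SelectRegion":
        # Keep the grouped subset inside the sensor grid.
        x = max(0, min(self.x, SENSOR_W - self.w))
        y = max(0, min(self.y, SENSOR_H - self.h))
        return SelectRegion(x, y, self.w, self.h)

def move(region: SelectRegion, dx: int, dy: int) -> SelectRegion:
    # Directional adjustment: shift to an adjacent grouped subset.
    return SelectRegion(region.x + dx, region.y + dy, region.w, region.h).clamp()

def resize(region: SelectRegion, dw: int, dh: int) -> SelectRegion:
    # Size adjustment: grow or shrink the grouped subset.
    w = max(1, min(region.w + dw, SENSOR_W))
    h = max(1, min(region.h + dh, SENSOR_H))
    return SelectRegion(region.x, region.y, w, h).clamp()
```

For example, `move(region, 10, 0)` would shift the select region ten pixel columns to the right, while `resize(region, -40, -80)` would shrink it; clamping keeps the region within the physical sensor array.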
In additional implementations, the optoelectronic device includes at least one of a low-light digital camera or a thermal imager. In some examples, the imager includes a CMOS sensor or CCD sensor. Also, implementations of the optical sight include a holographic optic that has a light source disposed at the base and an optical element configured to project the reticle illumined by the light source through the sight window in the first focal plane.
In further implementations, the display device is disposed at an eye piece end of the optoelectronic device opposite the objective end. In some examples, an optical magnifier may be disposed at the eye piece end of the optoelectronic device to magnify the display subset image to the operator.
In other implementations, the weapon system includes a remote device that is wirelessly connected to the optoelectronic device and is configured to provide the input to the controller to select the select region of the sensor array. In some examples, the remote device includes a display that is configured to display a stream of the subset image.
Another aspect of the disclosure provides a method that involves generating a holographic reticle in a first focal plane of an optical sight that is mounted to a weapon. An optoelectronic device is mounted to the weapon with an objective end of the optoelectronic device facing the optical sight. An imager of the optoelectronic device captures image data and transmits the image data to an image processor that generates a subset image from a select region of the imager that defines a second focal plane. The subset image is displayed to the operator of the weapon at an eye piece of the optoelectronic device. The select region of the imager is altered in response to inputs from the operator to align the second focal plane with the first focal plane.
Each of the above independent aspects of the present disclosure, and those aspects described in the detailed description below, may include any of the features, options, and possibilities set out in the present disclosure and figures, including those under the other independent aspects, and may also include any combination of any of the features, options, and possibilities set out in the present disclosure and figures.
The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, advantages, purposes, and features will be apparent upon review of the following specification in conjunction with the drawings.
Like reference numerals indicate like parts throughout the drawings.
Referring now to the drawings and the illustrative examples depicted therein, a weapon system 10, such as partially shown in
The weapon system 10, such as shown in
As shown for example in
As further shown in
The optoelectronic device 36 includes an imager 40 with a sensor array 42 (
The sensor array 42 (
Referring now to
The image processor 44 may process the image data to represent less than the entirety of data generated by the imager 40 on the display device, referred to as a subset image. The subset image simply refers to an image generated from a subset, or less than all, of the image data generated by the imager 40. The subset image corresponds to and is received from a select region 54 of the sensor array 42, such as shown in
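As an illustrative sketch of this step (not taken from the disclosure), generating the subset image amounts to reading out only the rows and columns of the full frame that fall within the select region. The function name and the `(x, y, w, h)` region format below are hypothetical.

```python
def subset_image(frame, region):
    """Return the subset image read out from the select region.

    `frame` is the full sensor readout as a list of rows of pixel
    values; `region` is a hypothetical (x, y, w, h) tuple giving the
    left edge, top edge, width, and height of the select region.
    """
    x, y, w, h = region
    # Keep only the rows y..y+h and, within each, the columns x..x+w.
    return [row[x:x + w] for row in frame[y:y + h]]
```

A display device fed with `subset_image(frame, region)` thus shows only the portion of the sensor output inside the select region 54, rather than the full frame.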
The select region 54 of the sensor array 42 defines a second focal plane F2. A display device 48 is configured to display the subset image to the operator 50 of the weapon 12. The display device 48 is disposed at an eye piece end of the optoelectronic device 36 opposite the objective end. The display device 48 may be viewed through a viewfinder window.
In some examples, an optical magnifier may be disposed at the eye piece end of the optoelectronic device to magnify the displayed subset image to the operator. In other examples, the display device 48 may be disposed at a remote location that is detached from the weapon 12, such as at an operator's head-mounted device or at a portable electronic device, such as a computer or smart phone. The remote device may be wirelessly connected to the optoelectronic device 36 and configured to provide an input to a controller 52 to select the select region 54 of the sensor array 42. The optoelectronic device 36, in the controller 52 or with another component, may integrate wireless communication technologies, such as Wi-Fi, Bluetooth, cellular, or other conventional protocols. For example, the remote device may include a display 48 that is configured to display a substantially live stream of the subset image.
With further reference to
As shown in
As shown in
When the regioning operation is executing at the image processor 44, the display device 48 displays the subset image that corresponds to the select region 54. The operator of the weapon may view the display device 48, either at the optoelectronic device or at a remote device, and provide an input to adjust the select region 54. It is contemplated that a configuration routine may also or alternatively be actuated that selectively allows or restricts the received inputs to adjust the select region 54, such as to prevent accidental adjustments when carrying or operating the weapon. The input may indicate a directional adjustment or a sizing adjustment of the select region. In additional examples, it is contemplated that the input could also adjust the perceived inclination angle of the subset image (e.g., yaw, pitch, and roll).
As shown in
Also, as shown in
Other inputs are also contemplated. For example, combinations of directional and sizing adjustments may allow the operator to selectively adjust each border of the select region 54 independently to more precisely align the exact region desired within the range of the imager 40. Specifically, the operator 50 may be provided input options to individually adjust the top border and bottom border of the select region 54 upward or downward, and to individually adjust the left border and right border to the left and right. In other alternatives, the top and bottom borders may be adjusted together in combination and the left and right borders may be adjusted together in combination.
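A minimal sketch of such per-border adjustment follows, assuming the select region is tracked as four edge coordinates (top, bottom, left, right); the function name and region format are hypothetical, not from the disclosure.

```python
def adjust_borders(region, top=0, bottom=0, left=0, right=0):
    """Nudge each border of a select region independently.

    `region` is a hypothetical (top, bottom, left, right) tuple of
    edge coordinates on the sensor grid. Each keyword argument shifts
    its border by that many pixels; pairs may be passed together to
    adjust top-bottom or left-right in combination.
    """
    t, b, l, r = region
    t, b, l, r = t + top, b + bottom, l + left, r + right
    if t >= b or l >= r:
        # Reject inputs that would collapse or invert the region.
        raise ValueError("borders crossed")
    return (t, b, l, r)
```

Adjusting the top and bottom together, for example, is then just `adjust_borders(region, top=d, bottom=d)`.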
The skew, or relative rotation of the imager 40 to the sight window 30, may also be adjustable. Variances in manufacturing, wear, or mounting may result in a mismatched rotation between the imager 40 and the sight window 30 such that, for example, the borders of the select region 54 do not appear parallel with the borders of the sight window 30. It may therefore be desirable to adjust the subset image output to the display 48 by adjusting the skew of the select region 54. The operator may provide manual input to perform this adjustment. In other alternatives, the operator may provide input to selectively adjust the rotation of each border individually, or in top-bottom and left-right pairs to provide keystone correction.
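One way a skew adjustment could be sketched, purely as an illustration under assumed conventions, is to rotate the corners of the select region about its center by the operator-supplied angle; the resulting corner coordinates could then drive a resampling of the subset image. The function and region format below are hypothetical.

```python
import math

def rotate_region_corners(region, angle_deg):
    """Rotate the four corners of a select region about its center.

    `region` is a hypothetical (x, y, w, h) tuple. Returns the rotated
    corner coordinates in order: top-left, top-right, bottom-right,
    bottom-left. Illustrative first step toward a skew correction.
    """
    x, y, w, h = region
    cx, cy = x + w / 2, y + h / 2
    a = math.radians(angle_deg)
    corners = [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]
    rotated = []
    for px, py in corners:
        dx, dy = px - cx, py - cy
        # Standard 2-D rotation about the region center.
        rotated.append((cx + dx * math.cos(a) - dy * math.sin(a),
                        cy + dx * math.sin(a) + dy * math.cos(a)))
    return rotated
```

Rotating each border pair by a slightly different angle, rather than all four corners uniformly, would correspond to the keystone-style correction mentioned above.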
When operating to select a new select region of the sensor array 42, the second focal plane F2 is desirably aligned with the first focal plane F1. For example, the border of the subset image may be framed at an edge of the sight window when the second focal plane is aligned with the first focal plane.
With reference to the method of digital focal plane alignment for imager and sights for weapon systems, such as shown for example in
The step 64 of the method illustrated in
The image processor may determine a select region to be defined by the edges of the sight window, or the frame. Alternatively, the image processor may determine a select region that includes the edges of the sight window, or the frame, plus an additional margin of peripheral sensor elements. The additional margin of peripheral sensor elements may be defined as a percentage of the subset image. For example, the select region may be determined by the image processor based on the edges of the sight window plus a margin such that the margin does not take up more than 5% of the total subset image. The image processor may store the select region in a memory of the optoelectronic device as a matrix of sensor elements, defined by the address coordinates of the range of sensor elements comprising the select region.
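To make the 5% margin bound concrete, the sketch below grows a uniform pixel margin around a detected sight-window bounding box until the margin would exceed the stated fraction of the subset image area. This is a hypothetical illustration; the disclosure states only the bound, not this particular search.

```python
def select_region_with_margin(window, max_margin_frac=0.05):
    """Expand a detected sight-window bounding box by a uniform margin.

    `window` is a hypothetical (x, y, w, h) bounding box of the sight
    window on the sensor grid. The margin is grown one pixel at a time
    while its share of the total subset image area stays at or below
    `max_margin_frac` (5% per the example in the text).
    """
    x, y, w, h = window
    m = 0
    while True:
        W, H = w + 2 * (m + 1), h + 2 * (m + 1)
        # Fraction of the enlarged region occupied by the margin.
        if 1 - (w * h) / (W * H) > max_margin_frac:
            break
        m += 1
    return (x - m, y - m, w + 2 * m, h + 2 * m)
```

The returned `(x, y, w, h)` tuple could then be stored as the address-coordinate range of sensor elements making up the select region.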
The articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements in the preceding descriptions. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, it should be understood that references to “one embodiment” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional implementations that also incorporate the recited features. Numbers, percentages, ratios, or other values stated herein are intended to include that value, and also other values that are “about” or “approximately” the stated value, as would be appreciated by one of ordinary skill in the art encompassed by implementations of the present disclosure. A stated value should therefore be interpreted broadly enough to encompass values that are at least close enough to the stated value to perform a desired function or achieve a desired result. The stated values include at least the variation to be expected in a suitable manufacturing or production process, and may include values that are within 5%, within 1%, within 0.1%, or within 0.01% of a stated value.
Also for purposes of this disclosure, the terms “approximately,” “about,” and “substantially” as used herein represent an amount close to the stated amount that still performs a desired function or achieves a desired result. For example, the terms “approximately,” “about,” and “substantially” may refer to an amount that is within less than 5% of, within less than 1% of, within less than 0.1% of, and within less than 0.01% of a stated amount. Further, it should be understood that any directions or reference frames in the preceding description are merely relative directions or movements. For example, the terms “upper,” “lower,” “right,” “left,” “rear,” “front,” “vertical,” “horizontal,” “distal,” “proximal” and derivatives thereof shall relate to the orientation shown in
Changes and modifications in the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law. The disclosure has been described in an illustrative manner, and it is to be understood that the terminology which has been used is intended to be in the nature of words of description rather than of limitation. Many modifications and variations of the present disclosure are possible in light of the above teachings, and the disclosure may be practiced otherwise than as specifically described.
This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 63/211,758, filed Jun. 17, 2021; the disclosure of this prior application is considered part of this application and is hereby incorporated by reference in its entirety.