The invention is generally related to night vision devices and, more particularly, to systems and methods for improving viewability.
Night vision systems include image intensification, thermal imaging, and fusion monoculars, binoculars, and goggles, whether hand-held, weapon mounted, or helmet mounted. Standard night vision systems are typically equipped with one or more image intensifier tubes that allow an operator to see radiation in a range of wavelengths (approximately 400 nm to approximately 900 nm). They work by collecting the small amounts of light, including the lower portion of the infrared spectrum, that are present but may be imperceptible to the human eye, and amplifying that light to the point that an operator can easily observe the image. These devices have been used by soldiers and law enforcement personnel to see in low light conditions, for example at night or in caves and darkened buildings. These devices take ambient light, amplify it up to and in excess of 50,000 times, and display the image for viewing through an eyepiece. A drawback of night vision goggles is that they cannot see through smoke and heavy sandstorms and cannot detect a person hidden under camouflage.
Infrared thermal sensors allow an operator to see people and objects because those people and objects emit thermal energy. These devices operate by capturing the upper portion of the infrared spectrum, which is emitted as heat by objects rather than simply reflected as light. Hotter objects, such as warm bodies, emit more of this radiation than cooler objects like trees or buildings. Since the primary source of infrared radiation is heat or thermal radiation, any object that has a temperature radiates in the infrared. One advantage of infrared sensors is that they are less attenuated by smoke and dust; a drawback is that they typically do not have sufficient resolution and sensitivity to provide acceptable imagery of the scene.
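As an illustrative aside (not part of the original disclosure; the constant and body temperature below are rounded assumptions), Wien's displacement law connects the observation that warmer objects radiate more strongly to the long-wave infrared band captured by thermal detectors:

```latex
\lambda_{\max} = \frac{b}{T}, \qquad b \approx 2898\ \mu\mathrm{m}\cdot\mathrm{K},
\qquad T \approx 310\ \mathrm{K} \;\Rightarrow\; \lambda_{\max} \approx 9.3\ \mu\mathrm{m}
```

A warm body therefore emits most strongly near 9,300 nm, squarely within the 7,000-14,000 nm detector band discussed later in this description.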
Fusion systems have been developed that combine image intensification with thermal sensing. The image intensification information and the infrared information are fused together to provide a fused image that provides benefits over just image intensification or just thermal sensing. Whereas typical night vision devices with image intensification can only see visible wavelengths of radiation, the fused system provides additional information by providing heat information to the operator.
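Purely as a non-limiting sketch of the idea of fusing the two channels (the array names, resolution, and blend weight below are illustrative assumptions, not the patented implementation), a fused frame can be formed as a per-pixel weighted blend of a registered image-intensified frame and a thermal frame:

```python
import numpy as np

def fuse_frames(i2_frame: np.ndarray, thermal_frame: np.ndarray, mix: float = 0.6) -> np.ndarray:
    """Blend an image-intensified frame with a co-registered thermal frame.

    Both inputs are assumed to be grayscale arrays of the same shape, already
    registered to a common field of view; `mix` is the fraction of
    image-intensification content in the fused output.
    """
    i2 = i2_frame.astype(np.float32)
    th = thermal_frame.astype(np.float32)
    fused = mix * i2 + (1.0 - mix) * th
    return np.clip(fused, 0, 255).astype(np.uint8)

# Example with synthetic frames of equal size
i2 = np.random.randint(0, 256, (480, 640), dtype=np.uint8)   # intensified scene
th = np.random.randint(0, 256, (480, 640), dtype=np.uint8)   # thermal scene
fused = fuse_frames(i2, th, mix=0.6)
```

In practice the blend weight corresponds to an operator-adjustable mix of thermal and image intensification information of the kind described later in this description.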
According to one aspect of the invention, there is provided a fusion night vision system having a first housing having a first imager for processing information in a first range of wavelengths and a detector for processing information in a second range of wavelengths; a second housing having a second imager for processing information in the first range of wavelengths; and a third housing, the first housing coupled to the third housing through a first coupler, the first coupler having a first hinged joint rotatable about a first axis and a second hinged joint rotatable about a second axis, the first axis spaced a first fixed distance from the second axis.
According to another aspect of the invention, there is provided a vision system having a first housing having a first optical axis, a display, an image combiner, and a first eye piece; and a second housing having a second optical axis and a second eye piece, the first housing coupled to the second housing through a first coupler, the first coupler having a first hinged joint rotatable about a first axis and a second hinged joint rotatable about a second axis, the first axis spaced a first fixed distance from the second axis, the first housing and the second housing coupled through the first coupler such that a row of pixels in the display, viewable through the first eye piece, can be maintained parallel with an imaginary line going through the first optical axis and the second optical axis as the distance between the first optical axis and the second optical axis is varied.
For a better understanding of the invention, together with other objects, features and advantages, reference should be made to the following detailed description which should be read in conjunction with the following figures wherein like numerals represent like parts:
The first and second imagers may be image intensifier tubes for processing information in a first range of wavelengths, for example 450 nm to 1000 nm, and the detector may be an Infrared Focal Plane Array (IRFPA) or another detector, for example a SWIR, MWIR, or EBAPS detector, for processing information in a second range of wavelengths, for example 7,000-14,000 nm. Alternatively, the imagers may be CCD, CMOS, or other imagers whose output may be coupled, together with information from the detector, to a display. The first imager has a first optical axis, the second imager has a second optical axis, and the detector has a third optical axis. The first housing may be coupled to the third housing through a first coupler having a first hinged joint rotatable about a first hinged axis and a second hinged joint rotatable about a second hinged axis, the first hinged axis spaced a first fixed distance from the second hinged axis. The second housing may be coupled to the third housing through a second coupler having a third hinged joint rotatable about a third hinged axis and a fourth hinged joint rotatable about a fourth hinged axis, the third hinged axis spaced a second fixed distance from the fourth hinged axis. The first and second housings may each have an appropriate objective lens or lenses and a respective first or second eye piece. The eye pieces may have one or more ocular lenses for magnifying and/or focusing the image.
The first imager may be positioned in the first housing to direct an enhanced image towards the first eyepiece, which may be disposed in front of an operator's eye, for example the operator's right eye. The display in the first housing may be aligned with an optic, for example a corner cube, positioned in front of the first eyepiece for projection of information from the detector into the first optical axis containing enhanced scene information from the first imager. The display may have a series of orthogonal rows and columns of pixels. The image formed by the pixels may be viewable through the right eye piece. The second imager may be positioned in the second housing to direct an enhanced image towards the second eyepiece, which may be disposed in front of an operator's eye, for example the operator's left eye. The enhanced scene information may be directed towards the second eye piece along a second optical axis. Display information may include target graphics or icons, incoming video from an overhead UAV, or heading information. The information may be used to augment existing information (augmented reality) from the imagers and may be useful for Remote Target Acquisition (RTA). The hinged joints may have a clutch or friction disk to generate drag.
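As a purely illustrative sketch of how such display information might augment the intensified scene (the buffer layout, function, and icon below are assumptions, not the disclosed implementation), symbology can be written into specific rows and columns of the display buffer before the display is projected into the eyepiece optical path:

```python
import numpy as np

def overlay_symbology(display: np.ndarray, row: int, col: int,
                      icon: np.ndarray) -> np.ndarray:
    """Write an icon (e.g., a target marker or heading readout rendered as a
    small grayscale bitmap) into the display buffer at the given row/column.

    `display` is the H x W pixel buffer projected into the optical path;
    pixels left at 0 let the intensified scene show through unobstructed.
    """
    out = display.copy()
    h, w = icon.shape
    out[row:row + h, col:col + w] = np.maximum(out[row:row + h, col:col + w], icon)
    return out

display = np.zeros((480, 640), dtype=np.uint8)       # blank display buffer
marker = np.full((9, 9), 255, dtype=np.uint8)        # placeholder target icon
display = overlay_symbology(display, row=200, col=300, icon=marker)
```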
The third housing may be coupled to the second housing through a second coupler having a third hinged joint having a third hinged axis and a fourth hinged joint having a fourth hinged axis. The first hinged joint may be positionable independently of the second hinged joint, and the third hinged joint may be positionable independently of the fourth hinged joint. The first hinged joint may be parallel with the second hinged joint, and the third hinged joint may be parallel with the fourth hinged joint.
The second housing may be coupled to the third housing such that when the first optical axis is generally parallel with the first and second hinged joints, the second optical axis is generally parallel with the third and fourth hinged joints and the third optical axis is generally parallel with the first and second hinged joints. The first housing may be positionable to space the first optical axis an adjustable distance from the second optical axis, this distance being associated with an interpupillary distance. The first housing and the second housing may be positionable through the first and second couplers to accommodate an interpupillary distance in the range of 60-70 mm, or an even greater range, for example 57-74 mm.
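As an illustrative geometric model only (the link length and swing angles below are assumed values, not dimensions from the disclosure), each coupler can be treated as a rigid link of fixed length between its two parallel hinge axes: swinging the link about the first hinge displaces the housing's optical axis in the plane perpendicular to the line of sight, changing the interpupillary distance, while an equal counter-rotation about the second hinge keeps the display's pixel rows level:

```python
import math

def housing_axis_offset(link_len_mm: float, swing_deg: float) -> tuple[float, float]:
    """Lateral and vertical displacement of a housing's optical axis when the
    coupler link of length link_len_mm is swung by swing_deg about the first
    hinge (hinge axes assumed parallel to the optical axes)."""
    theta = math.radians(swing_deg)
    lateral = link_len_mm * math.sin(theta)           # changes eye-to-eye spacing
    vertical = link_len_mm * (1.0 - math.cos(theta))  # must match on both sides
    return lateral, vertical

def interpupillary_distance(nominal_ipd_mm: float, link_len_mm: float,
                            left_swing_deg: float, right_swing_deg: float) -> float:
    """IPD when the left and right housings are each swung outward; mirrored
    swing angles keep the two eyepieces at the same height."""
    left_lat, _ = housing_axis_offset(link_len_mm, left_swing_deg)
    right_lat, _ = housing_axis_offset(link_len_mm, right_swing_deg)
    return nominal_ipd_mm + left_lat + right_lat

# Assumed numbers: a 20 mm link swung 10 degrees outward on each side
print(interpupillary_distance(60.0, 20.0, 10.0, 10.0))  # ~66.9 mm, inside 57-74 mm
```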
As shown in
There may be a second display in the second housing to allow a fused image to also be projected in the second eyepiece, for example in front of the left eye. If there were a second display in the second housing, the second coupler would likewise provide the ability to align the image from the detector with the image from the second imager in the second housing. In addition, the second coupler may allow the operator to keep the first and second optical axes at the same height for each eye.
The fusion night vision system may have a helmet mount coupled to the third housing that allows the system to be mounted to a helmet. The fusion night vision system may be powered by an internal or external battery and may receive information to be displayed on the display wirelessly or through a connector. The fusion night vision system may have one or more adjustable stops that allow an operator to set the system up so that the first and second housings can be quickly returned to a desired position relative to the third housing appropriate for his or her IPD. The third housing may have an interface board to coordinate signals and power for the other housings. The wires may extend through a channel in each of the couplers (see
The housings may have a Controller #1 to turn the unit on and off and a Controller #2 to adjust other parameters, for example the display brightness, the auto/manual gain of the thermal channel, and the mix of thermal and image intensification information viewable through the eyepieces.
Since the optical axis of the detector is physically offset a distance from the optical axis of the first imager, the fusion night vision system may provide an operator-selectable menu that allows the operator to select an offset of the image on the display to correct for parallax.
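As a rough illustrative calculation only (the baseline, focal length, pixel pitch, and range below are assumed values, not figures from the disclosure), the parallax between the two channels shrinks with target range, so the display offset needed to register the detector image with the intensified image depends on how far away the scene is:

```python
def parallax_offset_pixels(baseline_m: float, focal_length_mm: float,
                           pixel_pitch_um: float, range_m: float) -> float:
    """Approximate image shift, in pixels, between two parallel optical
    channels separated by baseline_m, for a target at range_m.

    Small-angle approximation: shift on the focal plane = f * B / R,
    then converted to pixels using the pixel pitch.
    """
    shift_mm = focal_length_mm * baseline_m / range_m
    return shift_mm * 1000.0 / pixel_pitch_um

# Assumed numbers: 50 mm channel separation, 25 mm focal length,
# 12 um pixels, target at 50 m -> about 2 pixels of offset
print(parallax_offset_pixels(baseline_m=0.05, focal_length_mm=25.0,
                             pixel_pitch_um=12.0, range_m=50.0))
```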
Although several embodiments of the invention have been described in detail herein, the invention is not limited hereto. It will be appreciated by those having ordinary skill in the art that various modifications can be made without materially departing from the novel and advantageous teachings of the invention. Accordingly, the embodiments disclosed herein are by way of example. It is to be understood that the scope of the invention is not to be limited thereby.
This application is a divisional of U.S. patent application Ser. No. 15/284,709, filed Oct. 4, 2016, which claims the benefit of U.S. provisional patent application Ser. No. 62/238,778, filed Oct. 8, 2015, the entire disclosures of which are incorporated herein by reference.
Number | Date | Country
---|---|---
20190394376 A1 | Dec 2019 | US

Number | Date | Country
---|---|---
62238778 | Oct 2015 | US

 | Number | Date | Country
---|---|---|---
Parent | 15284709 | Oct 2016 | US
Child | 16542709 | | US