SYSTEM AND METHOD FOR FUSING AN IMAGE

Abstract
A fusion vision system has a first sensor configured to detect scene information in a first range of wavelengths, a second sensor configured to detect scene information in a second range of wavelengths, and a processor configured to resize one of a first image and a second image to improve viewability of a fused image of the scene.
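A minimal, purely illustrative sketch of the resizing and combining step described in the abstract is shown below; it is not the claimed implementation. It assumes, hypothetically, that each sensor delivers a grayscale frame as a NumPy array, that nearest-neighbor scaling is acceptable, and that a fixed blend weight is used:

    import numpy as np

    def resize_nearest(img, out_rows, out_cols):
        # Nearest-neighbor resize of a 2-D grayscale frame (illustrative only).
        in_rows, in_cols = img.shape
        row_idx = np.arange(out_rows) * in_rows // out_rows
        col_idx = np.arange(out_cols) * in_cols // out_cols
        return img[row_idx[:, None], col_idx[None, :]]

    def fuse_frames(first_frame, second_frame, weight=0.5):
        # Resize the second frame to the first frame's pixel grid, then blend.
        resized = resize_nearest(second_frame, *first_frame.shape)
        mixed = (weight * first_frame.astype(np.float32)
                 + (1.0 - weight) * resized.astype(np.float32))
        return mixed.astype(first_frame.dtype)

This sketch only conveys the general idea of matching the two channels' image sizes before combining them; as the drawings and claims note, the actual combination may instead be performed optically (for example, with a partial beam splitter) or in dedicated mixer hardware rather than in software.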
Description

BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of the invention, together with other objects, features and advantages, reference should be made to the following detailed description which should be read in conjunction with the following figures wherein like numerals represent like parts:



FIG. 1A is a block diagram of an electronically fused vision system.



FIG. 1B is a block diagram of an optically fused vision system.



FIG. 2A is a block diagram of a first fusion vision system consistent with the invention.



FIG. 2B is a block diagram of a second fusion vision system consistent with the invention.



FIG. 3 illustrates resizing the output of an image intensification or thermal channel consistent with the invention.



FIG. 4 is a first calibration target useful in a method consistent with the invention.



FIG. 5 is a second calibration target useful in a method consistent with the invention.


Claims
  • 1. A fusion vision system, comprising: a housing; a first channel having a first sensor and a first objective lens at least partially disposed within the housing for processing scene information in a first range of wavelengths; a second channel having a second sensor and a second objective lens at least partially disposed within the housing for processing scene information in a second range of wavelengths; a processor configured to resize one of a first and a second output of one of the first and second channels; and an image combiner for combining the output of the first or second channel with the resized output of the second or first channel.
  • 2. The fusion vision system of claim 1, wherein the first range of wavelengths is approximately 400 nm to approximately 900 nm and the second range of wavelengths is approximately 7,000 nm to approximately 14,000 nm.
  • 3. The fusion vision system of claim 1, further comprising a display for projecting an image to an operator.
  • 4. The fusion vision system of claim 3, wherein the display has a plurality of individual pixels arranged in rows and columns.
  • 5. The fusion vision system of claim 1, wherein the processor adds or removes one or more rows or columns of pixels before displaying in a display.
  • 6. The fusion vision system of claim 1, wherein the first channel has an objective focus and an image intensification tube and the second channel has an objective focus and an infrared sensor.
  • 7. The fusion vision system of claim 1, wherein the image combiner is a partial beam splitter.
  • 8. The fusion vision system of claim 1, wherein the image combiner is a selected one of a digital fusion mixer and an analog fusion mixer.
  • 9. The fusion vision system of claim 8, wherein the image combiner is an optical image combiner.
  • 10. The fusion vision system of claim 1, further comprising a display coupled to the image combiner, the display having a plurality of pixels arranged in rows and columns for projecting an image to an operator.
  • 11. The fusion vision system of claim 1, further comprising a parallax compensation circuit coupled to the display and configured to receive distance to target information.
  • 12. The fusion vision system of claim 1, wherein the processor resizes the first or second output to correct for the two channels having differing fields of view.
  • 13. The fusion vision system of claim 3, further comprising an eyepiece aligned with the display for viewing a fused image from the first and the second channels.
  • 14. The fusion vision system of claim 11, further comprising an objective lens aligned with the first channel for determining the distance to target information.
  • 15. A method of displaying fused information representative of a scene, the method comprising the steps of: acquiring first information representative of the scene from a first channel configured to process information in a first range of wavelengths; acquiring second information representative of the scene from a second channel configured to process information in a second range of wavelengths; and resizing one of the first and the second acquired information to improve viewability of the scene.
  • 16. The method of claim 15, wherein a processor calculates a value for an added pixel based on a value of a surrounding pixel and the calculated value is displayed in a display for viewing by an operator.
  • 17. The method of claim 15, wherein information from a selected one of the first and the second channels is shifted on a display by a parallax compensation circuit so as to align the first information and the second information when viewed through an eyepiece.
  • 18. The method of claim 15, wherein the first channel has an objective focus and an image intensification tube and the second channel has an infrared sensor and an objective focus.
  • 19. The method of claim 15, wherein movement of the objective lens communicates a signal to a parallax compensation circuit indicative of the distance to target.
  • 20. A fusion vision system, comprising: a housing; a first sensor at least partially disposed within the housing for processing information in a first range of wavelengths; a second sensor at least partially disposed within the housing for processing information in a second range of wavelengths; a processor configured to resize one of a first and a second output of one of the first and second sensors; and an image combiner for combining the output of the first or second sensor with the resized output of the second or first sensor for viewing by an operator.
  • 21. The fusion vision system of claim 20, further comprising a display having a plurality of individual pixels arranged in rows and columns for projecting an image to an operator.
  • 22. The fusion vision system of claim 21, wherein the processor adds or removes one or more rows or columns of pixels before displaying in the display.
  • 23. The fusion vision system of claim 20, wherein the image combiner is a partial beam splitter.
  • 24. The fusion vision system of claim 20, wherein the image combiner is a selected one of a digital fusion mixer and an analog fusion mixer.
  • 25. The fusion vision system of claim 24, wherein the image combiner is an optical image combiner.
  • 26. The fusion vision system of claim 20, further comprising a parallax compensation circuit coupled to the display and configured to receive distance to target information.
  • 27. The fusion vision system of claim 20, further comprising an eyepiece aligned with the display for viewing a fused image from the first and the second sensors.
  • 28. The fusion vision system of claim 26, further comprising an objective lens aligned with the first sensor for determining the distance to target information.
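Claims 5, 16, and 22 above describe resizing by adding or removing rows or columns of pixels, with an added pixel's value computed from surrounding pixels, and claims 11, 17, and 26 describe shifting one channel on the display based on distance-to-target information to compensate for parallax. The sketch below is one hypothetical software reading of those operations, assuming frames are NumPy arrays, evenly spaced insertion and deletion positions, and a simple horizontal shift; it is not asserted to be the patented method:

    import numpy as np

    def add_interpolated_rows(frame, count):
        # Insert `count` evenly spaced rows; each new row's pixel values are the
        # mean of the two neighboring rows (one reading of claims 5 and 16).
        out = frame.astype(np.float32)
        positions = np.linspace(1, frame.shape[0] - 1, count, dtype=int)
        for offset, pos in enumerate(positions):
            p = int(pos) + offset                  # account for rows already inserted
            new_row = (out[p - 1] + out[p]) / 2.0  # value based on surrounding pixels
            out = np.insert(out, p, new_row, axis=0)
        return out.astype(frame.dtype)

    def remove_rows(frame, count):
        # Remove `count` evenly spaced rows to shrink the image.
        drop = np.linspace(0, frame.shape[0] - 1, count, dtype=int)
        return np.delete(frame, drop, axis=0)

    def parallax_shift(frame, shift_px):
        # Shift one channel horizontally by shift_px columns (derived from a
        # distance-to-target signal) so the two channels align when fused;
        # wrap-around is used here only for brevity.
        return np.roll(frame, shift_px, axis=1)

Column insertion and removal would follow the same pattern along the other image axis; a fielded system might realize these steps in the processor or display electronics rather than in software of this form.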
Provisional Applications (1)
Number     Date       Country
60728710   Oct 2005   US