The present invention relates to avionics systems and, more specifically, to an avionics imaging system.
Many aircraft employ imaging systems to assist the pilot in seeing what is in front of the aircraft. Some imaging systems employ a visible light camera to capture real-time video, which is displayed on a screen in the cockpit. Such systems can be useful in showing areas blocked by the nose of the aircraft and, thereby, avoiding potential hazards. Some systems employ cameras that are sensitive to wavelengths either outside of the average visible light spectrum or on the periphery of the average visible light spectrum. These systems provide video information to pilots when the availability of ordinary visible light is limited, such as at dusk. Many aircraft employ displays that show the outputs from both visual light cameras and long wave infrared (LWIR) cameras.
Such cameras can be temperature sensitive, and temperature is a function of factors such as elevation and weather conditions. When an aircraft climbs, the ambient temperature decreases and, as a result, any external electronic cameras typically become more sensitive. Similarly, as the aircraft descends, any external electronic cameras typically become less sensitive. Pilots employing external cameras (such as visual light cameras and long wave infrared cameras) often have to adjust the intensity of images being displayed from the cameras as they change altitude. In fact, many aircraft video systems allow the pilot to adjust several different parameters to optimize the display for different camera conditions. Such adjustments can become distracting during higher intensity maneuvers, such as on final approach and shortly after takeoff.
The time of day often determines which type of image is most important to a pilot. During broad daylight, infrared images are usually of little use and the pilot generally relies on visual light images. Conversely, at nighttime there is little visual light available to an imaging system (other than runway lights, etc.) and the pilot will make use of images based on infrared radiation. At dusk, as daytime transitions into nighttime, a blending of visual light data and infrared radiation data can be useful in constructing an image to be displayed to the pilot. However, the relative availability of visual light versus infrared radiation changes quickly at dusk and, therefore, pilots often either have to adjust the visual light and infrared radiation intensities of the cockpit display or fly while viewing a sub-optimal image.
Therefore, there is a need for an avionics imaging system that automatically adjusts an image display based on sensed physical conditions so as to generate an optimal image, thereby minimizing distractions to the pilot.
The disadvantages of the prior art are overcome by the present invention which, in one aspect, is an imaging apparatus for imaging a scene that includes a thermal imaging camera, a control circuit, an image generating circuit and a display. The thermal imaging camera includes an array of pixels that sense thermal radiation within the scene. The control circuit is programmed to: detect an area of high thermal energy within the image; define a region of interest having a predetermined number of pixels within the area of high thermal energy; determine an average temperature sensed by the predetermined number of pixels; and adjust each of a plurality of adjustable parameters to optimal settings corresponding to the average temperature. The image generating circuit is operationally coupled to the thermal imaging camera and generates image data corresponding to a portion of the scene so that the image data is based on the optimal settings for the plurality of adjustable parameters. The display displays the image data.
In another aspect, the invention is an aircraft imaging apparatus for imaging a scene that includes a visual light camera. A temperature sensor detects a temperature associated with the visual light camera. A control circuit, including a memory, adjusts a plurality of parameters associated with the visual light camera as a function of the temperature so as to generate an optimal image.
In another aspect, the invention is an image blending system that includes a visible light camera, a thermal imaging camera, a visible light sensitivity sensor, a long wave-IR sensor, an image generating circuit and a display. The visible light camera includes an array of pixels that sense visible light within a scene. The thermal imaging camera includes an array of pixels that sense thermal radiation within the scene. The visible light sensitivity sensor detects an intensity of visible light available to the visible light camera. The long wave-IR sensor detects an intensity of infra-red energy available to the thermal imaging camera. The image generating circuit is operationally coupled to the visible light camera and the thermal imaging camera. The image generating circuit is configured to blend image data from the thermal imaging camera with image data from the visible light camera proportionally based on the intensity of visible light available to the visible light camera and the intensity of infra-red energy available to the thermal imaging camera so as to generate blended image data that shows the scene so as to be optimally perceptible by a user. The display is configured to display the blended image data.
In another aspect, the invention is a method of calibrating a thermal imaging system that generates an image onto a display, in which the image is affected by a plurality of adjustable parameters. An image of a scene is received from a thermal camera. An area of high thermal energy is detected within the image. A region of interest having a predetermined number of pixels is defined within the area of high thermal energy. An average temperature sensed by the predetermined number of pixels is determined. Each of the plurality of adjustable parameters is adjusted to an optimal setting corresponding to the average temperature.
In yet another aspect, the invention is a method of generating a display. An image of a scene is received from a thermal imaging camera. An intensity of visible light available to a visible light camera is detected. An intensity of infra-red energy available to the thermal imaging camera is detected. Image data from the thermal imaging camera is blended with image data from the visible light camera proportionally based on the intensity of visible light available to the visible light camera and the intensity of infra-red energy available to the thermal imaging camera so as to generate blended image data that, when displayed, shows the scene so as to be optimally perceptible by a user.
These and other aspects of the invention will become apparent from the following description of the preferred embodiments taken in conjunction with the following drawings. As would be obvious to one skilled in the art, many variations and modifications of the invention may be effected without departing from the spirit and scope of the novel concepts of the disclosure.
A preferred embodiment of the invention is now described in detail. Referring to the drawings, like numbers indicate like parts throughout the views. Unless otherwise specifically indicated in the disclosure that follows, the drawings are not necessarily drawn to scale. As used in the description herein and throughout the claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise: the meaning of “a,” “an,” and “the” includes plural reference, the meaning of “in” includes “in” and “on.” As used herein, “thermal imaging camera” includes any type of camera or other video sensor that is sensitive to thermal energy and capable of providing image data based thereon.
As shown in
The video sensor unit 110 includes a visual light camera 154 that is coupled to the control unit 120 and a secondary video region 160 in which is mounted a near infra-red camera 162 that is coupled to the control unit 120. The visual light camera 154 could include, for example, a Python 1300 CMOS camera available from ON Semiconductor, Phoenix, Ariz. 85008. The near infra-red camera 162 could include, for example, a Taramisk series or Tenum series thermal imaging camera available from DRS Technologies, Inc., Dallas, Tex. or a TAU series long wave camera available from FLIR Systems, Wilsonville, Oreg. These digital cameras include an array of radiation-sensing pixels on a semiconductor chip. An aerodynamic screen 111 is placed in front of the video sensor unit 110 to minimize aerodynamic effects at high speeds. A mounting unit 152 facilitates mounting the video sensor unit 110 to the fuselage of an aircraft (typically to the underside of the nose).
Three photodiodes are also coupled to the control unit 120 and can be embedded in the video sensor unit 110. The photodiodes can include: a first photodiode 140 configured to sense bright visible light (e.g., in the 10 lux to 200,000 lux range), a second photodiode 142 configured to sense moderately bright visible light (e.g., in the 0.1 lux to 100,000 lux range), and a third photodiode 144 configured to sense low luminance visible and long wave infrared light (e.g., in the 0.001 lux to 10,000 lux range).
The system can adjust the images generated by the near infra-red camera 162 and the visual light camera 154 by detecting a temperature associated with each camera and adjusting a plurality of parameters associated with the cameras as a function of the temperature so as to generate an optimal image. The temperature can be a temperature measured near the camera, or it can be based on a scene temperature.
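As an illustrative sketch only (not the patent's actual implementation), the temperature-to-settings mapping described above can be realized as linear interpolation over a stored calibration table. The parameter names (gain, black level) and the calibration points below are hypothetical assumptions:

```python
# Hypothetical calibration table: (temperature_C, gain, black_level).
# The parameters and values are illustrative, not from the patent.
CALIBRATION = [
    (-40.0, 3.0, 40),
    (0.0,   2.0, 25),
    (40.0,  1.0, 10),
]

def settings_for_temperature(temp_c):
    """Linearly interpolate optimal camera settings for a measured temperature,
    clamping to the nearest calibration point outside the table's range."""
    pts = sorted(CALIBRATION)
    if temp_c <= pts[0][0]:
        return pts[0][1:]
    if temp_c >= pts[-1][0]:
        return pts[-1][1:]
    for (t0, g0, b0), (t1, g1, b1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            f = (temp_c - t0) / (t1 - t0)
            return (g0 + f * (g1 - g0), b0 + f * (b1 - b0))
```

A lookup table with interpolation keeps the control loop simple and lets the optimal settings be characterized empirically on the ground, one calibration point per temperature.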
The cockpit views of the horizon for several different aircraft attitudes are shown in
From this region of interest 316, the system determines an average scene temperature (which could be measured by using a radiometric mode of the near infra-red camera 162) and adjusts various video parameters for optimal response to the scene temperature according to an optimal setting graph, as shown in
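A minimal sketch of this region-of-interest step, assuming a radiometric frame in which each pixel value is a temperature; the ROI size and the clamping behavior at the frame edges are illustrative assumptions, not the patent's specified method:

```python
import numpy as np

def roi_average_temperature(frame, roi_size=3):
    """Locate the hottest pixel, center a roi_size x roi_size window on it
    (clamped so the window stays inside the frame), and return the mean
    temperature within that region of interest."""
    rows, cols = frame.shape
    r, c = np.unravel_index(np.argmax(frame), frame.shape)
    half = roi_size // 2
    r0 = min(max(r - half, 0), rows - roi_size)
    c0 = min(max(c - half, 0), cols - roi_size)
    roi = frame[r0:r0 + roi_size, c0:c0 + roi_size]
    return float(roi.mean())
```

The returned average scene temperature would then index into the optimal-setting data to select the video parameters.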
A practical example is shown in
As shown in
Two examples of parameter setting are shown in
The system responds to changing lighting conditions, for example, at different times during the day. As shown in
To do this, the photodiodes 140, 142 and 144 sense the availability of each type of radiation (i.e., visual light vs. infra-red) and the control unit 120 then adjusts the signals from each of the visual light CMOS camera 154 and the near infra-red camera 162 to achieve an image with an optimal amount of contrast and sharpness for the available radiation conditions. The control unit 120 can determine the camera settings based on a lookup table stored in memory that gives the camera setting data for each camera as a function of current photodiode values. To control relative image intensity, the control unit 120 can adjust the “shutter speed” of each camera. In the case of digital cameras, the “shutter speed” is the frame data capture rate for the sensor chip, wherein longer periods between data capture result in more intense images.
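The proportional blend described above can be sketched as a weighted average in which each camera's weight is its band's share of the total sensed radiation. The weighting rule is an illustrative assumption; the patent's control unit uses a lookup table of camera settings rather than this exact arithmetic:

```python
import numpy as np

def blend_frames(visible_frame, thermal_frame, visible_intensity, ir_intensity):
    """Blend the two camera frames proportionally to the relative availability
    of visible light versus infra-red energy, as reported by the photodiodes."""
    total = visible_intensity + ir_intensity
    w_vis = visible_intensity / total
    w_ir = ir_intensity / total
    return w_vis * visible_frame + w_ir * thermal_frame
```

At midday the visible weight approaches 1 and the display is essentially the visual light image; at night the infra-red weight dominates; at dusk the blend shifts continuously between the two without pilot input.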
A simulated example is shown in
In another embodiment, the system can be used to blend data from a long wave infra-red (LWIR) camera with data from a short wave infrared (SWIR) camera. In yet another embodiment, the system can be used to blend data from an LWIR camera and an SWIR camera with data from a camera sensitive to low-light visible radiation. It will be understood that the invention is not limited to any particular type of camera and can be used to blend image data from cameras or other video sensors of the many types used in association with aviation and other transportation systems.
The above described embodiments, while including the preferred embodiment and the best mode of the invention known to the inventor at the time of filing, are given as illustrative examples only. It will be readily appreciated that many deviations may be made from the specific embodiments disclosed in this specification without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is to be determined by the claims below rather than being limited to the specifically described embodiments above.
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/688,593, filed Jun. 22, 2018, the entirety of which is hereby incorporated herein by reference.