There exists a need to accurately detect potentially dangerous threat materials in a sample scene. Discrimination between a particular target material and a particular class of background material requires a few well-chosen wavebands. Different materials exhibit different spectral features; therefore, a wide selection of wavelengths is necessary to address a wide variety of materials. As more types of targets and backgrounds are added to the sample scene, however, the number and location of wavelengths needed to discriminate between any given spectral pair grows rapidly.
To fully discriminate materials, spectral features must have differing intensities at different spectral locations. One possible solution would be to build many special-purpose sensors, each of which collects only a minimal set of wavebands needed for a limited set of targets and backgrounds. However, this approach has several drawbacks including cost, time, and limited results. Therefore, there exists a need for a system and method for multispectral imaging that enables cost-effective and robust results while discriminating target materials in a sample scene.
The present disclosure provides for a system and method for real time, stationary, dynamic, or on the move (“OTM”) multispectral imaging. More specifically, the present disclosure provides for the use of color addition to identify target materials in a sample scene. The system and method described herein utilize a small number of spectral channels (i.e., frequencies, wavelengths, wavenumbers, energies, or colors) for rapid multispectral imaging of target materials in a sample scene. Such system and method may provide for dual wavelength imaging. The system and method described herein hold potential for accurate identification of target materials and for discriminating the chemical properties of these target materials (“analyte”) from interfering background components. The system and method of the present disclosure overcome the limitations of the prior art by providing for a cost-effective and robust solution wherein one type of sensor oversamples the spectral information. The present disclosure also contemplates the use of algorithms to reduce or eliminate redundant or undesired spectral information and to determine the proper number of wavelengths for analysis.
The accompanying drawings, which are included to provide further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.
In the drawings:
Reference will now be made in detail to the embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout to refer to the same or like parts.
A ratio is obtained in step 140 of said first spectroscopic band of interest and said second spectroscopic band of interest. A ratio may be obtained using any processing known in the art including but not limited to dividing the spectroscopic bands, subtracting the spectroscopic bands, and applying a chemometric technique such as partial least squares discriminant analysis (“PLSDA”). Other chemometric techniques that may be implemented include, but are not limited to, one selected from the group consisting of: principal component analysis (PCA), Cosine Correlation Analysis (CCA), Euclidean Distance Analysis (EDA), k-means clustering, multivariate curve resolution (MCR), Band Target Entropy Method (BTEM), Mahalanobis Distance (MD), Adaptive Subspace Detector (ASD), Spectral Mixture Resolution, and combinations thereof.
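By way of illustration, a minimal sketch of the division and subtraction approaches is shown below, assuming the spectroscopic data are held in a NumPy array indexed by (row, column, band); the function names, band indices, and stabilizing constant are illustrative assumptions rather than elements of the disclosed method.

```python
import numpy as np

def band_ratio(cube, band_a, band_b, eps=1e-6):
    """Per-pixel ratio of two spectroscopic bands of interest (division).

    cube   : ndarray of shape (rows, cols, bands) of spectral intensities
    band_a : index of the first spectroscopic band of interest
    band_b : index of the second spectroscopic band of interest
    eps    : small constant to avoid division by zero (illustrative choice)
    """
    a = cube[:, :, band_a].astype(float)
    b = cube[:, :, band_b].astype(float)
    return a / (b + eps)

def band_difference(cube, band_a, band_b):
    """Alternative per-pixel metric formed by subtracting the two bands."""
    return cube[:, :, band_a].astype(float) - cube[:, :, band_b].astype(float)
```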
In step 150 this ratio is compared to one or more ranges of threshold values. In one embodiment, these threshold ranges may correspond to known relationships between high and low absorption spectroscopic bands of various materials that may be present in a sample scene. The present disclosure contemplates an embodiment wherein, if a ratio is within a range, the target material is associated with the corresponding known material or class of materials. In other embodiments, a ratio outside of the range may be indicative of a target material not being associated with a known material or known class of materials. In yet another embodiment, whether a ratio falls above or below a certain threshold or range of threshold values may indicate whether or not a target material is associated with a known material or class of materials.
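One possible way to organize this comparison in software is sketched below, assuming each known material or class of materials is characterized by a (low, high) ratio range; the material names and numerical ranges are hypothetical placeholders, not values from the disclosure.

```python
import numpy as np

# Hypothetical threshold ranges; real values would come from the measured
# relationship between high and low absorption bands of each material.
THRESHOLD_RANGES = {
    "threat_material_A": (1.8, 2.4),
    "non_threat_plastic": (0.6, 0.9),
}

def classify_ratio(ratio_image, threshold_ranges=THRESHOLD_RANGES):
    """Return one boolean detection mask per known material or class.

    A pixel is associated with a material when its band ratio falls inside
    that material's threshold range; pixels falling outside every range
    remain unassociated with any known material.
    """
    ratio_image = np.asarray(ratio_image, dtype=float)
    return {
        name: (ratio_image >= low) & (ratio_image <= high)
        for name, (low, high) in threshold_ranges.items()
    }
```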
Based on the comparison of a ratio with one or more ranges of threshold values, a result is displayed in step 160. In one embodiment, the result comprises a first pseudo color overlay on said optical image. In one embodiment, this first pseudo color overlay may be obtained by assigning a first pseudo color to said first target material, wherein said first pseudo color corresponds to at least one of: a known material and a class of known materials. In one embodiment, this methodology enables a target material in an optical image to be identifiable based on the color it is assigned. In such an embodiment, a target material can be easily identified by a user as a threat agent (i.e., explosive, chemical hazard, biological hazard, or other hazardous agent). In another embodiment, this methodology may enable a user to identify a target material as not being a known material or belonging to a known class of materials. In such an embodiment, an absence of color assigned to a target material or present in an optical image may indicate that the target material is not a threat material (i.e., not an explosive, chemical hazard, biological hazard, or other hazardous substance). It is also contemplated by the present disclosure that another embodiment may enable a user to identify a target material as a concealment material based on the presence or absence of color. The presence or absence of color can also be used in another embodiment to identify a target material as a non-threat material (i.e., plastic).
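A sketch of how such a first pseudo color overlay might be rendered is given below, assuming a grayscale optical image scaled to [0, 1] and a boolean detection mask produced by the thresholding step; the choice of color and blending weight is purely illustrative.

```python
import numpy as np

def pseudo_color_overlay(optical_gray, mask, color=(1.0, 0.0, 0.0), alpha=0.5):
    """Blend an assigned pseudo color into an optical image wherever mask is True.

    optical_gray : (rows, cols) grayscale optical image scaled to [0, 1]
    mask         : (rows, cols) boolean detection mask for the target material
    color        : RGB pseudo color assigned to the detected material
    alpha        : blending weight of the pseudo color over the optical image
    """
    gray = np.asarray(optical_gray, dtype=float)
    rgb = np.dstack([gray, gray, gray])            # grayscale -> RGB
    for ch in range(3):
        channel = rgb[:, :, ch]                    # view into rgb
        channel[mask] = (1 - alpha) * channel[mask] + alpha * color[ch]
    return rgb
```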
The present disclosure also contemplates the acquisition and addition of additional spectral information representative of the sample scene. In one embodiment, this additional spectral information may correspond to one or more other target materials of interest present in the sample scene. The addition of this spectral information may allow for the addition of more pseudo color information to the first pseudo color overlay (also referred to herein as “color addition”). This holds potential for improving contrast between target materials and for identifying additional target materials present in a sample scene. As a result, color addition holds potential for improving the analysis of target materials present in a sample scene, for increasing the accuracy of target material identification, and therefore for reducing the number of falsely identified target materials (reducing the false alarm rate).
In step 235 spectroscopic data representative of a second target material in the sample scene is acquired, wherein the spectroscopic data comprises at least one spectrum representative of the second target material. In step 240 a first spectroscopic band of interest and a second spectroscopic band of interest are selected from said spectrum representative of said second target material. A ratio is obtained in step 245 of said first spectroscopic band of interest and said second spectroscopic band of interest. The ratio is compared to one or more ranges of threshold values in step 250. Based on said comparison, a result is displayed in step 255. In one embodiment, the result comprises the first pseudo color overlay generated after assessing the first target material and further comprises at least one of: assigning a second pseudo color to said second target material, wherein said second pseudo color corresponds to at least one of a known material and a known class of materials, and changing the contrast of said first pseudo color assigned to said first target material. In another embodiment, separate and distinct pseudo color overlays may be generated wherein each pseudo color overlay is assigned to a color channel. These pseudo color overlays can then be combined to obtain an image comprising all of the information contained in each individual pseudo color overlay.
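A minimal sketch of this per-channel combination follows, assuming one boolean detection mask per target material and at most three materials mapped in order onto the red, green, and blue channels of the final image; the channel assignment and blending weight are illustrative assumptions.

```python
import numpy as np

def combine_overlays(optical_gray, masks, alpha=0.5):
    """Combine several single-material overlays into one pseudo colored image.

    optical_gray : (rows, cols) grayscale optical image scaled to [0, 1]
    masks        : sequence of boolean detection masks, one per target
                   material, assigned in order to the R, G, and B channels
    alpha        : blending weight of each assigned pseudo color
    """
    gray = np.asarray(optical_gray, dtype=float)
    rgb = np.dstack([gray, gray, gray])
    for ch, mask in enumerate(masks[:3]):          # one material per color channel
        channel = rgb[:, :, ch]                    # view into rgb
        channel[mask] = (1 - alpha) * channel[mask] + alpha
    return rgb
```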
In one embodiment, the spectroscopic data may be obtained by illuminating a sample to thereby generate a plurality of interacted photons wherein said photons are selected from the group consisting of: photons absorbed by the sample, photons reflected by the sample, photons scattered by the sample, photons emitted by the sample, and combinations thereof. The illumination source may comprise an active illumination source such as a laser, a passive illumination source such as the sun, and combinations thereof.
In one embodiment, the spectroscopic data may comprise infrared spectroscopic data. This infrared spectroscopic data may be data selected from the group consisting of: short wave infrared spectroscopic data, near infrared spectroscopic data, mid infrared spectroscopic data, far infrared spectroscopic data, and combinations thereof. In another embodiment, the spectroscopic data may comprise one of: Raman, fluorescence, visible, ultraviolet, and combinations thereof.
In one embodiment, the present disclosure provides for overlaying the optical image comprising a pseudo color overlay with an infrared image. The infrared image may be selected from the group consisting of: a short wave infrared image, a near infrared image, a mid infrared image, a far infrared image, and combinations thereof. A schematic representation of the overlay of an optical image comprising a pseudo color (or “false color”) overlay with a near infrared image is illustrated in
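One way such an overlay might be computed is sketched below, assuming the infrared image has already been registered to the same pixel grid as the pseudo colored optical image and scaled to [0, 1]; the blending weight is an illustrative assumption.

```python
import numpy as np

def overlay_with_infrared(pseudo_rgb, infrared, weight=0.4):
    """Blend a registered infrared image into a pseudo colored optical image.

    pseudo_rgb : (rows, cols, 3) optical image carrying the pseudo color overlay
    infrared   : (rows, cols) infrared image on the same pixel grid, in [0, 1]
    weight     : contribution of the infrared image to the combined display
    """
    ir = np.asarray(infrared, dtype=float)
    ir_rgb = np.dstack([ir, ir, ir])
    return (1 - weight) * np.asarray(pseudo_rgb, dtype=float) + weight * ir_rgb
```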
The present disclosure contemplates the use of the system and method disclosed herein in real-time, stationary, dynamic and OTM configurations. In one embodiment, the system and method of the present disclosure may be operated in a stationary mode. In another embodiment, the system and method of the present disclosure may be operated in an On-the-Move (“OTM”) mode. An OTM material-specific detection result is illustrated in
The present disclosure also provides for a system for detecting target materials in a sample scene wherein images generated through two independent LCTF channels are focused side-by-side on a single CCD or independently on two CCDs. Embodiments of such a system hold potential for OTM configurations because they provide for quick (even simultaneous) imaging and may eliminate the need for aligning images. However, the present disclosure also contemplates the use of these embodiments in other configurations, including stationary imaging.
In one embodiment, the present disclosure may incorporate technology available from ChemImage Corporation, Pittsburgh, Pa., including that described more fully in U.S. Pat. No. 7,692,775, filed on Jun. 9, 2006, entitled “Time and Space Resolved Standoff Hyperspectral IED Explosives LIDAR Detection,” U.S. patent application Ser. No. 12/199,145, filed on Aug. 27, 2008, entitled “Time and Space Resolved Standoff Hyperspectral IED Explosives LIDAR Detector,” U.S. Pat. No. 7,692,776, filed on Dec. 22, 2006, entitled “Chemical Imaging Explosives (CHIMED) Optical Sensor,” and U.S. patent application Ser. No. 12/754,229, filed on Apr. 5, 2010, entitled “Chemical Imaging Explosives (CHIMED) Optical Sensor Using SWIR.” These patents and applications are hereby incorporated by reference in their entireties.
The choice of tunable filter depends on the desired optical region and/or the nature of the sample being analyzed. In another embodiment, the tunable filter may comprise a filter selected from the group consisting of: a Fabry Perot angle tuned filter, a Lyot filter, an Evans split element liquid crystal tunable filter, a Solc liquid crystal tunable filter, a spectral diversity filter, a photonic crystal filter, a fixed wavelength Fabry Perot tunable filter, an air-tuned Fabry Perot tunable filter, a mechanically-tuned Fabry Perot tunable filter, and a liquid crystal Fabry Perot tunable filter. In one embodiment, the tunable filter may be selected to operate in one or more of the following spectral ranges: the ultraviolet (UV), visible, near infrared, and mid-infrared.
In general, the sample size determines the choice of image gathering optic. For example, a microscope is typically employed for the analysis of sub micron to millimeter spatial dimension samples. For larger objects, in the range of millimeter to meter dimensions, macro lens optics are appropriate. For samples located within relatively inaccessible environments, flexible fiberscopes or rigid borescopes can be employed. For very large scale objects, telescopes are appropriate image gathering optics.
In the arrangement of
The system may further comprise a controller 855 that may be configured to tune the filters 840a and 840b. In one embodiment, the controller 855 may be controlled by a computer (not shown) that may also carry the interface elements coupled to and controlling the CCD 880 and contain a display on which the collected image can be viewed, stored, transmitted over a network, etc. The controller 855 may also be used to tune the filters 840a and 840b in unison. In another embodiment, the controller 855 may independently tune the passband wavelengths ω1 and ω2 that respectively process components of the input. Therefore, by appropriate control, the filters 840a and 840b can be tuned to the same wavelength or to two different passband wavelengths (ω1≠ω2) at the same time. The controller 855 may be programmable or implemented in software to allow a user to selectively tune each filter 840a and 840b as desired.
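The tuning logic described above can be summarized in the conceptual sketch below; the class and method names are hypothetical stand-ins for whatever hardware interface a real controller 855 and filters 840a and 840b would expose, and do not reflect an actual filter API.

```python
class FilterController:
    """Conceptual model of controller 855 driving two tunable filters.

    `set_passband` is a hypothetical method; a real system would substitute
    the vendor-specific tuning command for each filter (840a, 840b).
    """

    def __init__(self, filter_a, filter_b):
        self.filter_a = filter_a   # e.g., filter 840a
        self.filter_b = filter_b   # e.g., filter 840b

    def tune_in_unison(self, wavelength_nm):
        """Tune both filters to the same passband wavelength."""
        self.filter_a.set_passband(wavelength_nm)
        self.filter_b.set_passband(wavelength_nm)

    def tune_independently(self, omega_1_nm, omega_2_nm):
        """Tune each filter to its own passband (omega_1 may differ from omega_2)."""
        self.filter_a.set_passband(omega_1_nm)
        self.filter_b.set_passband(omega_2_nm)
```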
It is noted here that although laser light may be coherent, the light received from the sample 820 (e.g., visible light, Raman scatter, fluorescence emission, infrared absorbance or reflectance) and fed to the filters 840a and 840b may not be coherent. Therefore, wavefront errors may not be present or may be substantially avoided in the two filter versions of
The present disclosure may be embodied in other specific forms without departing from the spirit or essential attributes of the disclosure. Although the foregoing description is directed to the embodiments of the disclosure, it is noted that other variations and modifications will be apparent to those skilled in the art, and may be made without departing from the spirit or scope of the disclosure.
This Application claims priority to U.S. Provisional Application No. 61/215,082, entitled “Method for Component Discrimination Enhancement Based on Hyperspectral Addition Imaging,” filed on May 1, 2009, which is hereby incorporated by reference in its entirety. This Application is also a continuation-in-part of U.S. patent application Ser. No. 11/681,326, entitled “Polarization Independent Raman Imaging with Liquid Crystal Tunable Filter,” filed on Mar. 2, 2007, which is hereby incorporated by reference in its entirety.