A method using Long Wave Infrared (“LWIR”) Imaging Polarimetry for facial recognition in total or near darkness is disclosed herein. The method employs LWIR imaging polarimetry to enhance thermal imagery for facial recognition purposes. In one embodiment, the method comprises capturing a number of images of the face at different polarization states using a polarization filter in conjunction with a thermal camera, correcting each image for non-uniformity, performing weighted subtractions of the corrected images to produce Stokes parameter images S0, S1 and S2, and computing Degree of Linear Polarization (“DoLP”) images from those Stokes images. Finally, facial recognition algorithms may be applied to the DoLP image, or the images may simply be viewed by a human for facial recognition.
In another embodiment, a subject's face is centered in the field of view of the LWIR imaging polarimeter and brought into focus, and a thermal/polarimetric hybrid image is collected.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawings will be provided by the Office upon request and payment of the necessary fee.
The polarimeter 101 comprises a polarizing imaging device (not shown) for recording polarized images, such as a digital camera or thermal imager. The polarimeter 101 transmits raw image data to the signal processing unit 107, which processes the data as further discussed herein.
In the illustrated embodiment, the polarimeter 101 sends raw image data (not shown) to the signal processing unit 107 over a network 105. The signal processing unit 107 may be any suitable computer known in the art or future-developed. The signal processing unit 107 receives the raw image data, filters the data, and analyzes the data as discussed further herein to provide facial images. The network 105 may be any type of network or networks known in the art or future-developed, such as the Internet backbone, Ethernet, Wi-Fi, WiMax, broadband over power line, coaxial cable, and the like. The network 105 may be any combination of hardware, software, or both. Further, the network 105 could be resident in a sensor (not shown) housing both the polarimeter 101 and the signal processing unit 107.
The polarimeter 101 comprises an objective imaging lens 1201, a filter array 1203, and a focal plane array 1202. The objective imaging lens 1201 comprises a lens pointed at the subject's face 102.
The signal processing unit 107 comprises image processing logic 120 and system data 121. In the exemplary signal processing unit 107, the image processing logic 120 and system data 121 are shown as stored in memory 1123. The image processing logic 120 and system data 121 may be implemented in hardware, software, or a combination of hardware and software.
The signal processing unit 107 also comprises a processor 130, which comprises a digital processor or other type of circuitry configured to execute the image processing logic 120. The processor 130 communicates with and drives the other elements within the signal processing unit 107 via a local interface 1124, which can include one or more buses. When stored in memory 1123, the image processing logic 120 and the system data 121 can be stored and transported on any computer-readable medium for use by or in connection with logic circuitry, a processor, an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “computer-readable medium” can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Note that the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured via, for instance, optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
Exemplary system data 121 comprises data used by the image processing logic 120 in performing the processes described herein.
The image processing logic 120 executes the processes described herein.
The external interface device 126 is shown as part of the signal processing unit 107 in the exemplary embodiment.
In step 1002, the signal processing unit 107 obtains the polarized images captured by the polarimeter 101 and corrects each image for non-uniformity.
In step 1003, the Stokes parameters (S0, S1, S2) are calculated by weighted subtraction of the polarized images obtained in step 1002. The LWIR imaging polarimeter measures both a radiance image and a polarization image. A radiance image is a standard image whereby each pixel in the image is a measure of the radiance, typically expressed in units of radiance (Watts/cm²-sr), reflected or emitted from the corresponding pixel area of the scene. Standard photographs and thermal images are radiance images, simply mappings of the radiance distribution emitted or reflected from the scene. A polarization image is a mapping of the polarization state distribution across the image. The polarization state distribution is typically expressed in terms of a Stokes image.
Of the Stokes parameters, S0 represents the conventional LWIR thermal image with no polarization information, while S1 and S2 contain the orthogonal linear polarimetric information. The Stokes vector, first introduced by G. G. Stokes in 1852, is useful for describing partially polarized light and is defined as follows.
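In standard notation, consistent with the definitions given below, the Stokes vector may be written as

$$
S=\begin{bmatrix}S_0\\ S_1\\ S_2\\ S_3\end{bmatrix}
=\begin{bmatrix}I_{0}+I_{90}\\ I_{0}-I_{90}\\ I_{45}-I_{135}\\ I_{R}-I_{L}\end{bmatrix}\tag{1}
$$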
where I0 is the radiance that is linearly polarized in a direction making an angle of 0 degrees with the horizontal plane, and I90 is the radiance linearly polarized in a direction making an angle of 90 degrees with the horizontal plane. Similarly, I45 and I135 are radiance values of linearly polarized light making angles of 45° and 135° with respect to the horizontal plane. Finally, IR and IL are radiance values for right and left circularly polarized light. For this invention, measurement of right and left circularly polarized light is not necessary, and the imaging polarimeter does not need to measure these states of polarization. For this reason, the Stokes vectors considered are limited to the first three elements, which express linearly polarized light only.
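With the circular components omitted, equation (2) may be written in the same notation as

$$
S=\begin{bmatrix}S_0\\ S_1\\ S_2\end{bmatrix}
=\begin{bmatrix}I_{0}+I_{90}\\ I_{0}-I_{90}\\ I_{45}-I_{135}\end{bmatrix}\tag{2}
$$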
Another useful form of equation (2) is a normalized form in which each element is divided by S0.
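In that notation, the normalized Stokes vector of equation (3) may be written as

$$
s=\frac{1}{S_0}\begin{bmatrix}S_0\\ S_1\\ S_2\end{bmatrix}
=\begin{bmatrix}1\\ S_1/S_0\\ S_2/S_0\end{bmatrix}
=\begin{bmatrix}1\\ s_1\\ s_2\end{bmatrix}\tag{3}
$$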
The polarization state emitted or reflected from the surface of human skin depends on a number of factors including the angle of emission, the surface temperature of the skin, the micro-roughness of the skin, the complex refractive index of the skin and the background temperature of the surrounding environment. The invention here primarily makes use of the fact that the polarization state of light emitted and reflected from the skin is a function of angle of emission.
The emissivity of an object is determined from Kirchhoff's radiation law. The most familiar form of Kirchhoff's law gives the emissivity of a surface ε in terms of the reflectance r, given by
ε(θ,ϕ)=1−r(θ) (4)
where θ is the angle between the surface normal and the camera's line of sight. The more general equations for Kirchhoff's law are given by
εp(θ)=1−rp(θ) (5)
and
εs(θ)=1−rs(θ) (6)
where the subscripts p and s denote the emissivity and reflectance of the respective polarization states. The p-state denotes light that is linearly polarized in the plane of emission, i.e., the plane that contains the surface normal and the line of sight to the camera. For example, if the camera is looking down at a horizontal surface, the p-state of polarization would appear vertically polarized. The s-state of polarization is perpendicular to the p-state. Note that temperature and wavelength dependence is suppressed in equations (4)-(6).
Substituting equations (5) and (6) into equation (3) gives the normalized Stokes vector of the emitted radiance.
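In one common form, up to sign convention and assuming purely emitted radiance, the result is

$$
s=\begin{bmatrix}1\\ P(\theta)\cos 2\phi\\ P(\theta)\sin 2\phi\end{bmatrix}\tag{7}
$$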
where ϕ is the angle that the plane of incidence makes with the horizontal plane and P(θ) is the degree of polarization of the emitted radiance.
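A standard definition of P(θ), consistent with equation (7) as given above, is

$$
P(\theta)=\frac{\varepsilon_p(\theta)-\varepsilon_s(\theta)}{\varepsilon_p(\theta)+\varepsilon_s(\theta)}\tag{8}
$$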
Equation (8) can be written out more explicitly using equations (5) and (6).
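The resulting expression, in one common form, is

$$
P(\theta)=\frac{r_s(\theta)-r_p(\theta)}{2-r_p(\theta)-r_s(\theta)}
$$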
where rp and rs are given by the Fresnel equations for reflection.
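For a surface of complex refractive index n viewed in air, these reflectances may be written in a standard form as

$$
r_s(\theta)=\left|\frac{\cos\theta-\sqrt{n^{2}-\sin^{2}\theta}}{\cos\theta+\sqrt{n^{2}-\sin^{2}\theta}}\right|^{2},
\qquad
r_p(\theta)=\left|\frac{n^{2}\cos\theta-\sqrt{n^{2}-\sin^{2}\theta}}{n^{2}\cos\theta+\sqrt{n^{2}-\sin^{2}\theta}}\right|^{2}
$$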
Note that P(θ) does not explicitly depend on the angle ϕ that the plane of incidence makes with the horizontal plane. The angle ϕ is nevertheless critical for determining the orientation of the plane of incidence and, ultimately, the azimuthal angle of the surface normal. The angle ϕ can be determined from the normalized Stokes parameters.
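A standard relation, consistent with equation (7) as given above, is

$$
\phi=\tfrac{1}{2}\arctan\!\left(\frac{s_2}{s_1}\right)=\tfrac{1}{2}\arctan\!\left(\frac{S_2}{S_1}\right)
$$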
The angle θ can be determined in a number of ways. Methods for determining θ and ϕ from normalized Stokes images (equation (3)) are known in the art.
In step 1004, a degree of linear polarization (DoLP) image is computed from the Stokes images. A DoLP image is useful for visualizing a face.
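The standard definition, in terms of the Stokes images above, is

$$
\mathrm{DoLP}=\frac{\sqrt{S_1^{2}+S_2^{2}}}{S_0}
$$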
Note that DoLP captures only linear polarization. As one with skill in the art would know, in some situations polarization that is not linear (e.g., circular) may be desired. Thus, in other embodiments, step 1004 may use polarization images derived from any combination of S0, S1, S2, or S3 and is not limited to DoLP.
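By way of non-limiting example, the following sketch (written with the NumPy library; the function and array names are illustrative assumptions, not part of the disclosed system) carries out the weighted subtractions of step 1003 and the DoLP computation of step 1004 for a four-state measurement:

```python
import numpy as np

def stokes_and_dolp(i0, i45, i90, i135):
    """Compute linear Stokes images and a DoLP image from four
    non-uniformity-corrected polarized radiance images (steps 1003-1004)."""
    i0, i45, i90, i135 = (np.asarray(a, dtype=np.float64) for a in (i0, i45, i90, i135))
    s0 = i0 + i90               # total radiance, per equation (2); (i45 + i135) is an
                                # equivalent estimate and the two are often averaged
    s1 = i0 - i90               # 0/90 degree weighted subtraction
    s2 = i45 - i135             # 45/135 degree weighted subtraction
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)  # guard against division by zero
    return s0, s1, s2, dolp
```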
In step 1005, facial recognition algorithms that are known in the art are applied to the DoLP image from step 1004. Some facial recognition algorithms identify facial features by extracting landmarks, or features, from an image of the subject's face. For example, an algorithm may analyze the relative position, size, and/or shape of the eyes, nose, cheekbones, and jaw. These features are then used to search for other images with matching features. Other algorithms normalize a gallery of face images and then compress the face data, saving only the data in the image that is useful for face recognition. A probe image is then compared with the face data. One of the earliest successful systems is based on template matching techniques applied to a set of salient facial features, providing a sort of compressed face representation. Non-restrictive examples include Principal Component Analysis using eigenfaces, Linear Discriminant Analysis (Fisherfaces), and Elastic Bunch Graph Matching. In other embodiments, the DoLP image may be viewed by humans for facial recognition, and no algorithms are applied.
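By way of non-limiting example, the following sketch applies a Principal Component Analysis (eigenfaces) recognizer to DoLP face images using the scikit-learn library; the gallery arrays, labels, and parameter values are illustrative assumptions rather than part of the disclosed system:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def recognize(dolp_gallery, labels, dolp_probe, n_components=50):
    """Identify a probe DoLP face image against an enrolled gallery.

    dolp_gallery: (n_faces, h, w) array of enrolled DoLP face images
    labels:       identity label for each gallery image
    dolp_probe:   (h, w) DoLP image of the unknown subject
    """
    n, h, w = dolp_gallery.shape
    X = dolp_gallery.reshape(n, h * w)
    pca = PCA(n_components=n_components, whiten=True).fit(X)   # learn eigenfaces
    clf = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(X), labels)
    return clf.predict(pca.transform(dolp_probe.reshape(1, h * w)))[0]
```

Any of the recognition algorithms named above, or commercial recognition software, may of course be substituted for this simple nearest-neighbor matcher.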
The polarization information collected with the LWIR imaging polarimeter can be presented in a number of ways, and those practiced in the art can imagine a very large number of ways to combine the measured parameters to enhance the visualization of a human face. One exemplary way is to generate a hybrid thermal/polarization image by overlaying a polarization image obtained from the method 1000 described above on the conventional thermal (S0) image.
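As a non-limiting illustration of one such combination, the following sketch blends a normalized S0 (conventional thermal) image with a normalized DoLP image; the blending weight alpha is an illustrative assumption:

```python
import numpy as np

def hybrid_image(s0, dolp, alpha=0.5):
    """Overlay a DoLP image on a conventional thermal (S0) image.
    alpha = 0 gives pure thermal; alpha = 1 gives pure polarization."""
    normalize = lambda a: (a - a.min()) / (a.max() - a.min() + 1e-12)
    return (1.0 - alpha) * normalize(s0) + alpha * normalize(dolp)
```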
Many different imaging polarimeter architectures have been developed and are commercially available to use for capturing LWIR DoLP images of faces. One exemplary architecture is a Divided Focal Plane Array (DFPA) imaging polarimeter. In such a polarimeter, a Pixelated Polarizer Array (PPA) comprises pixels which are aligned to and brought into close proximity to the pixels of a Focal Plane Array (FPA).
In an alternative embodiment, a super pixel may comprise a three (3) pixel super-pixel arrangement, in which the measurements of linear polarization are 60 degrees apart, for example 0°, 60° and 120°.
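Assuming ideal polarizers, one standard set of weighted combinations for such a three-state super-pixel is

$$
S_0=\tfrac{2}{3}\left(I_{0}+I_{60}+I_{120}\right),\qquad
S_1=\tfrac{2}{3}\left(2I_{0}-I_{60}-I_{120}\right),\qquad
S_2=\tfrac{2}{\sqrt{3}}\left(I_{60}-I_{120}\right)
$$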
A DFPA imaging polarimeter of this type works very similarly to a color camera that uses a Bayer filter array. The Bayer filter array also has super-pixels, typically containing a red pixel, a blue pixel, and two green pixels. Interpolation techniques that are very well known to those practiced in the art are used to interpolate between pixels of like polarization states, just as they are between pixels of like color. The images are interpolated to recover the pixel format of the FPA. For example, if the FPA is 640×512 pixels, then each extracted polarized image is 320×256 pixels; the images are up-sampled to become 640×512 images again. Once the images are up-sampled, they are subtracted to obtain the Stokes images.
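By way of non-limiting example, the following sketch extracts the four polarization sub-images from a 2×2 super-pixel mosaic and up-samples each back to the full FPA format using the SciPy library; the assumed super-pixel layout is illustrative only and real arrays may differ:

```python
import numpy as np
from scipy.ndimage import zoom

def demosaic_dfpa(raw, layout=((0, 45), (90, 135))):
    """Split a DFPA mosaic (e.g., 640x512) into per-angle sub-images
    (e.g., 320x256) and up-sample each back to the full FPA format."""
    sub_images = {}
    for r in range(2):
        for c in range(2):
            sub = raw[r::2, c::2]                             # pixels of one polarizer orientation
            sub_images[layout[r][c]] = zoom(sub, 2, order=1)  # bilinear up-sampling
    return sub_images  # maps polarizer angle (degrees) -> full-format image
```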
The images must be captured fast enough (minimizing the time between subsequent measurements of polarization images) that the subject does not move between measurements. The speed at which the measurements must be taken depends on how quickly the test subject is expected to move. The rule of thumb is that the subject should not move more than ¼ pixel between measurements of the required polarization images. If the test subject moves more than ¼ pixel, then the images must be registered using standard affine image registration methods.
By way of non-limiting example, if the test subject is asked to be still, a frame capture rate of 60 frames per second is fast enough that the polarization images are registered to one another to within ¼ pixel. At 60 frames per second, the entire sequence of images may be captured in about 50 milliseconds. Therefore, if the image of the face is resolved to 1 mm spatial resolution, the test subject should not move faster than (0.001 m/0.05 s)×(¼)=5 mm/sec.
In some instances, unless the test subject is asked to be still, motion artifacts must be removed in processing. Registration algorithms that are known in the art may be applied to register the images using the features of the face.
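By way of non-limiting example, the following sketch estimates and removes translational motion between polarization frames using phase cross-correlation from the scikit-image library; it is a simplified stand-in for the registration methods described above, and the sub-pixel upsample factor is an illustrative assumption:

```python
from scipy.ndimage import shift
from skimage.registration import phase_cross_correlation

def register_to_reference(reference, images):
    """Align each polarized image to a reference frame to within a
    fraction of a pixel before the Stokes subtractions are performed."""
    registered = []
    for img in images:
        offset, _, _ = phase_cross_correlation(reference, img, upsample_factor=10)
        registered.append(shift(img, offset))   # translate the frame into alignment
    return registered
```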
The embodiments discussed herein have been directed to facial recognition. However, human ears may also be recognized using ear detection algorithms that are known to persons with skill in the art. Thus, the methods disclosed herein for facial recognition are equally applicable to human ear detection.
This disclosure may be provided in other specific forms and embodiments without departing from the essential characteristics described herein. The embodiments described are to be considered in all respects as illustrative only and not restrictive in any manner. Other polarimeter embodiments are known to those practiced in the art, including Division of Amplitude and Division of Aperture polarimetric architectures.
This application is a continuation of, and claims priority to, U.S. Non-provisional patent application Ser. No. 14/602,823, entitled “Polarization Imaging for Facial Recognition Enhancement System and Method,” and filed on Jan. 22, 2015, which claims priority to Provisional Patent Application U.S. Ser. No. 61/930,272, entitled “Polarization Imaging for Facial Recognition Enhancement” and filed on Jan. 22, 2014. Both applications are fully incorporated herein by reference.
This invention was made with government support under Contract Number W911QX-12-C-0008 awarded by the U.S. Army. The government has certain rights in the invention.
Related Publications

Number | Date | Country
---|---|---
20200082159 A1 | Mar 2020 | US

Provisional Applications

Number | Date | Country
---|---|---
61930272 | Jan 2014 | US

Continuations

 | Number | Date | Country
---|---|---|---
Parent | 14602823 | Jan 2015 | US
Child | 16431374 | | US