The present disclosure relates to facial detection and recognition, and more particularly to facial detection and recognition for occupants in vehicles.
At security check points, border crossings, high occupancy vehicle (HOV) lanes, and the like, it is desirable to know how many occupants are in each vehicle that passes. At a traditional checkpoint an officer can count occupants that are visible in a vehicle. In security applications, it can be desirable to know who the occupants of a vehicle are. An officer can verify this by inspection of identification documents such as a photo ID for each occupant of the vehicle. However, these techniques require each vehicle to stop for inspection before passing through.
The conventional techniques have been considered satisfactory for their intended purpose. However, there is an ever present need for improved systems and methods for detecting, counting, and identifying occupants in vehicles. This disclosure provides a solution for this need.
A system for detecting occupants in a vehicle includes a controller and a plurality of camera systems external to the vehicle in a vehicle approach area, wherein each camera system is operatively connected to the controller. A trigger in the vehicle approach area is operatively connected to the controller to detect an approaching vehicle and control the camera systems to acquire images of the approaching vehicle. The controller includes machine readable instructions configured to cause the controller to perform any method as disclosed herein.
Each camera system can include an imaging sensor, a pulsed illumination device, and a processor operatively connecting the imaging sensor to the pulsed illumination device for synchronizing an illumination pulse from the pulsed illumination device with exposure of the imaging sensor. Each camera system can include a lens optically coupled to the imaging sensor, an optical bandpass filter operatively connected to filter light passing through the lens, and a linear polarization filter operatively connected to filter light passing through the lens.
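By way of a non-limiting sketch (the disclosure specifies no software interface), the synchronization performed by the processor might resemble the following, where `ImagingSensor` and `PulsedIlluminator` are hypothetical stand-ins for the actual sensor and illumination drivers:

```python
import time
from dataclasses import dataclass

@dataclass
class PulsedIlluminator:
    """Hypothetical driver for a pulsed illumination device."""
    pulse_width_s: float

    def fire(self) -> None:
        # Placeholder: a real driver would strobe the illuminator here.
        print(f"pulse for {self.pulse_width_s * 1e3:.1f} ms")

@dataclass
class ImagingSensor:
    """Hypothetical driver for an imaging sensor."""
    exposure_s: float

    def expose(self) -> bytes:
        # Placeholder: a real driver would read out raw frame data.
        time.sleep(self.exposure_s)
        return b"frame"

def acquire_synchronized(sensor: ImagingSensor, illuminator: PulsedIlluminator) -> bytes:
    """Fire the illumination pulse so that it brackets the sensor exposure."""
    illuminator.fire()       # pulse begins just before the exposure window
    return sensor.expose()   # exposure completes while the scene is illuminated
```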
A method of detecting occupants in a vehicle includes detecting an oncoming vehicle and acquiring a plurality of images of occupants in the vehicle in response to detection of the vehicle. The method includes performing automated facial detection on the plurality of images and adding a facial image for each face detected to a gallery of facial images for the occupants of the vehicle. The method includes performing automated facial recognition on the gallery of facial images to group the facial images into groups based on which occupant is in the respective facial images, and counting the groups to determine how many occupants are in the vehicle.
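A minimal sketch of that flow, assuming hypothetical `detect_faces` and `same_person` primitives in place of whatever automated detector and recognizer a given implementation uses:

```python
from typing import Callable, List

Image = bytes     # placeholder type for an acquired frame
FaceCrop = bytes  # placeholder type for a cropped facial image

def count_occupants(
    images: List[Image],
    detect_faces: Callable[[Image], List[FaceCrop]],
    same_person: Callable[[FaceCrop, FaceCrop], bool],
) -> int:
    """Group detected faces by occupant and count the groups."""
    gallery: List[List[FaceCrop]] = []  # one group of facial images per occupant
    for image in images:
        for face in detect_faces(image):
            for group in gallery:
                if same_person(group[0], face):
                    group.append(face)  # the same occupant, seen again
                    break
            else:
                gallery.append([face])  # a face not seen in any earlier image
    return len(gallery)                 # one group per occupant
```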
The method can include selecting a representative image from each group, and outputting a set of cropped selected images, one uniquely cropped selected image for each of the occupants. It is contemplated that no duplicate images of a given occupant need be stored or displayed. Selecting the representative image from each group can include selecting images based on corresponding confidence scores from the automated facial detection. Selecting the representative image from each group can include selecting images based on which image in the group has the least facial offset angle from the line of sight of an imaging sensor which acquired the respective image. The method can include running the selected images through a database to check for matches between the occupants and known individuals in the database. The method can include initiating a response upon finding a match in the database, wherein the response includes at least one of outputting an alert on a visual display, sounding an audible alarm, closing a physical barrier, transmitting a citation, mailing a citation, and/or dispatching an officer. It is also contemplated that the method can include initiating a response upon determining an improper number of occupants in the vehicle, wherein the response includes at least one of outputting an alert on a visual display, sounding an audible alarm, closing a physical barrier, transmitting a citation, mailing a citation, and/or dispatching an officer.
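Either selection rule named in this paragraph reduces to an argmax or argmin over the group; in the sketch below, the `confidence` and `offset_angle_deg` attributes are assumptions standing in for whatever scores and pose estimates the facial detection stage actually reports:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DetectedFace:
    crop: bytes              # cropped facial image
    confidence: float        # detection confidence score (assumed available)
    offset_angle_deg: float  # face angle off the sensor's line of sight (assumed)

def best_by_confidence(group: List[DetectedFace]) -> DetectedFace:
    """Select the image the automated facial detection scored highest."""
    return max(group, key=lambda face: face.confidence)

def best_by_offset(group: List[DetectedFace]) -> DetectedFace:
    """Select the most frontal image, i.e., least offset from the line of sight."""
    return min(group, key=lambda face: face.offset_angle_deg)
```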
Each image can be acquired from a different sensor viewing the vehicle from a different respective angle. The method can include illuminating the vehicle with a respective pulse of illumination for each image acquired, wherein each pulse of illumination is performed at a different time to reduce shadows cast onto the occupants while acquiring the plurality of images.
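One hedged way to realize the staggered pulses is to assign each camera a non-overlapping time slot; the pulse width and guard interval below are illustrative values, not taken from the disclosure:

```python
from typing import List

def pulse_schedule(n_cameras: int, pulse_width_s: float, guard_s: float = 0.001) -> List[float]:
    """Stagger the pulse start times so that no two pulses overlap,
    preventing one camera's illumination from casting shadows that
    appear in another camera's exposure."""
    slot = pulse_width_s + guard_s
    return [i * slot for i in range(n_cameras)]  # per-camera start offsets

# Example: three cameras with 2 ms pulses fire at t = 0.0 s, 0.003 s, and 0.006 s.
print(pulse_schedule(3, 0.002))
```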
One of the sensors can be a primary sensor that acquires a primary image of occupants in the vehicle, wherein faces detected in the primary image serve as references in the gallery for facial recognition for subsequent ones of the images of occupants in the vehicle. The method can include adding a new face to the gallery each time a detected face in a subsequent one of the images of occupants in the vehicle does not match a face already in the gallery. The method can include iteratively comparing faces detected in subsequent ones of the images of occupants in the vehicle, adding each detected face that is not already in the gallery, until there is an image in the gallery of each face detected by the automated facial detection.
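The primary-image seeding described here might be sketched as follows, again treating `same_person` as a hypothetical recognition primitive:

```python
from typing import Callable, List

def build_gallery(
    primary_faces: List[bytes],
    subsequent_images_faces: List[List[bytes]],
    same_person: Callable[[bytes, bytes], bool],
) -> List[bytes]:
    """Seed the gallery with the faces from the primary image, then add
    only those later faces that match no reference already in the gallery."""
    gallery = list(primary_faces)            # references from the primary image
    for faces in subsequent_images_faces:    # iterate the remaining images
        for face in faces:
            if not any(same_person(ref, face) for ref in gallery):
                gallery.append(face)         # an occupant not yet represented
    return gallery
```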
These and other features of the systems and methods of the subject disclosure will become more readily apparent to those skilled in the art from the following detailed description of the preferred embodiments taken in conjunction with the drawings.
So that those skilled in the art to which the subject disclosure appertains will readily understand how to make and use the devices and methods of the subject disclosure without undue experimentation, preferred embodiments thereof will be described in detail herein below with reference to certain figures, wherein:
Reference will now be made to the drawings wherein like reference numerals identify similar structural features or aspects of the subject disclosure. For purposes of explanation and illustration, and not limitation, a partial view of an exemplary embodiment of a system in accordance with the disclosure is shown in FIG. 1 and is designated generally by reference character 100.
The system 100 for detecting occupants in a vehicle 105 includes a controller 104 and a plurality of camera systems 106, 108, and 110 that are external to the vehicle 105 in the vehicle approach area 112. Each camera system 106, 108, and 110 is operatively connected to the controller 104. A trigger 114 in the vehicle approach area 112 is operatively connected to the controller 104 to detect an approaching vehicle 105 and to control the camera systems 106, 108, and 110 to acquire images of the approaching vehicle 105. The controller 104 includes machine readable instructions configured to cause the controller 104 to perform any method as disclosed herein.
With reference now to FIG. 2, each camera system 106, 108, and 110 includes an imaging sensor 120, a pulsed illumination device, and a processor operatively connecting the imaging sensor 120 to the pulsed illumination device for synchronizing an illumination pulse from the pulsed illumination device with exposure of the imaging sensor 120. Each camera system 106, 108, and 110 can also include a lens optically coupled to the imaging sensor 120, an optical bandpass filter operatively connected to filter light passing through the lens, and a linear polarization filter operatively connected to filter light passing through the lens.
With reference now to FIG. 3, a method of detecting occupants in a vehicle includes detecting an oncoming vehicle 105, e.g., using the trigger 114, and acquiring a plurality of images 128, 130, and 132 of occupants in the vehicle 105 in response to detection of the vehicle 105, with each image acquired from a different imaging sensor 120 viewing the vehicle 105 from a different respective angle.
The method includes having the controller 104 perform automated facial detection on the plurality of images 128, 130, and 132, and add a facial image for each face detected to a gallery 202 of facial images for the occupants of the vehicle 105. In this example, three faces are detected in the image 128, and four faces are detected in each of the images 130 and 132. The controller 104 performs automated facial recognition on the facial images of the gallery 202 to group the facial images into groups 134, 136, 138, and 140 based on which occupant is in the respective facial images, as indicated by the facial recognition groupings 204.
Facial detection and facial recognition need not necessarily be performed one after another, but instead can be performed together on the fly. One of the sensors 120 can be a primary sensor, e.g., the sensor 120 of the camera system 106, that acquires a primary image, e.g., the image 128, of occupants in the vehicle 105. The faces detected in the primary image 128 can serve as references in the gallery 202 for facial recognition for subsequent ones of the images 130 and 132 of occupants in the vehicle. The controller 104 can add a new face to the gallery 202 each time a detected face in a subsequent one of the images 130 and 132 does not match a face already in the gallery 202. The controller 104 can iteratively compare faces detected in subsequent ones of the images 130 and 132, adding each detected face that is not already in the gallery 202, until there is an image in the gallery 202 of each face detected by the automated facial detection.
Whenever a face is detected for which there is already an image in the gallery 202, the best image of the face can be retained in the gallery 202. The controller 104 selects a representative image 142, 144, 146, and 148 from each group 134, 136, 138, and 140 and can output a set 206 of cropped selected images, one uniquely cropped selected image for each of the occupants. The set 206 includes no duplicate images, i.e., no more than one image is in the set 206 for a given occupant, so no duplicate images of a given occupant need be stored or displayed. The controller 104 can select the representative image 142, 144, 146, and 148 from each group 134, 136, 138, and 140 based on corresponding confidence scores from the automated facial detection. It is also contemplated that the controller 104 can select the representative image 142, 144, 146, and 148 from each group 134, 136, 138, and 140 based on which image in the group has the least facial offset angle from the line of sight of the imaging sensor 120 which acquired the respective image. This selection process can be run on the fly with facial detection and facial recognition to winnow the gallery 202 down to the set 206.
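The winnowing step amounts to keeping one best crop per group; a sketch under the same assumptions as the selection rules above:

```python
from typing import Callable, List

def winnow(
    groups: List[List["DetectedFace"]],
    better: Callable[["DetectedFace", "DetectedFace"], bool],
) -> List["DetectedFace"]:
    """Reduce each identity group to a single representative cropped image,
    e.g., with `better` comparing detection confidence or facial offset angle."""
    selected = []
    for group in groups:
        best = group[0]
        for face in group[1:]:
            if better(face, best):  # a higher-quality crop of the same occupant
                best = face
        selected.append(best)
    return selected                 # one uniquely cropped image per occupant
```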
The controller 104 can determine how many occupants are in the vehicle 105 by counting the groups 134, 136, 138, and 140. In this example, there are four groups 134, 136, 138, and 140, indicating there are four occupants in the vehicle 105. If the groups 134, 136, 138, and 140 are winnowed down to the set 206 on the fly as described above, then the groups 134, 136, 138, and 140 can be counted indirectly by simply counting the final cropped images in the set 206 to determine how many occupants are in the vehicle 105.
The controller 104 can output the number of occupants in the vehicle 105, and can provide other output actions as needed. For example, the controller 104 can initiate a response, e.g., via the output device 150, upon determining an improper number of occupants in the vehicle. For example, if the controller 104 determines there are not enough occupants in a vehicle in an HOV lane, the controller 104 can use the output device 150 to output an alert on a visual display, sound an audible alarm, close a physical barrier, transmit a citation, mail a citation, update a database, and/or dispatch an officer.
It is also contemplated that with the set of images 206, controller 104 can run the final cropped facial images through a facial recognition database, either locally or remotely, to check for matches between the occupants and known individuals in the database. If a match is found, e.g., one of the occupants in the vehicle 105 is on a watch list, the controller 104 can initiate an output response, e.g., using output device 150, such as outputting an alert on a visual display, sounding an audible alarm, closing a physical barrier, transmitting a citation, mailing a citation, updating a database, and/or dispatching an officer.
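A combined sketch of the response dispatch for both the occupancy check of the preceding paragraph and the watch-list check here; the `actions` object and its methods are hypothetical hooks for the output device 150, not an interface named in the disclosure:

```python
from typing import List

def respond(occupant_count: int, required_count: int,
            watchlist_matches: List[str], actions) -> None:
    """Initiate a response on an improper occupant count or a database match.
    `actions` stands in for an output device exposing the responses named in
    the disclosure (visual alert, audible alarm, barrier, citation, officer)."""
    if occupant_count < required_count:
        actions.alert(f"HOV violation: {occupant_count} of {required_count} occupants")
        actions.transmit_citation()
    for person in watchlist_matches:    # matches from a local or remote database
        actions.alert(f"watch-list match: {person}")
        actions.dispatch_officer()
```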
While shown and described herein in an exemplary context where there are n=3 camera systems and m=4 occupants in the vehicle 105, those skilled in the art will readily appreciate that any suitable number n of camera systems can be used, and any suitable number m of occupants in a vehicle can be counted/identified without departing from the scope of this disclosure.
The methods and systems of the present disclosure, as described above and shown in the drawings, provide for counting and identifying occupants in vehicles with superior properties including reliable, automated detection and identification of all occupants in a moving vehicle. While the apparatus and methods of the subject disclosure have been shown and described with reference to preferred embodiments, those skilled in the art will readily appreciate that changes and/or modifications may be made thereto without departing from the scope of the subject disclosure.
This application claims the benefit of priority to U.S. Provisional Patent Application No. 62/596,497 filed Dec. 8, 2017, which is incorporated by reference herein in its entirety.