The present disclosure relates to imaging, and more particularly to scanners for entry control systems, such as under car scanners for security check points.
Several technologies exist which can scan the underside of motor vehicles. Many of these technologies rely on the ability to link a vehicle with a vehicle identifier (e.g., license plate number, radio frequency identification (RFID) tag, etc.) so as to be able to perform an automated search of the underside. Other technologies produce only a single image requiring manual inspection of the vehicle image on a screen. One issue that arises is artifacts in the images, wherein areas within a given image have lower image quality. Such artifacts can interfere with software performing analysis on the images, e.g., giving rise to false positives on potential security issues, and can even hamper manual inspection of the vehicle images on screen.
The conventional techniques have been considered satisfactory for their intended purpose. However, there is an ever present need for improved imaging systems and methods. This disclosure provides a solution for this need.
A scanner system includes a scanner framework having a front end, a back end and a top surface. A scanner camera is operatively connected to the scanner framework and has a lens and a sensor for recording images captured from a field of view of the camera. A first mirror arrangement is secured to the framework so as to provide a first reflecting surface angled upwardly toward the top surface and toward the framework front end for imaging a second portion of the field of view. A second mirror arrangement is secured to the framework so as to provide a second reflecting surface angled upwardly in a direction facing the framework top surface and the framework back end for imaging a first portion of the field of view. An illuminator is operatively connected to the camera to illuminate the first and second portions of the field of view. A band pass filter is operatively connected to the scanner camera to filter out illumination outside of an atmospheric absorption band, wherein the sensor is sensitive to illumination in the atmospheric absorption band.
The filter can be configured to pass illumination within plus or minus 60 nm of at least one atmospheric absorption band selected from the list consisting of 780 nm, 940 nm, 1120 nm, 1400 nm, and 1900 nm. The filter can be configured to pass illumination within plus or minus 50 nm of at least one atmospheric absorption band selected from the list consisting of 780 nm, 940 nm, 1120 nm, 1400 nm, and 1900 nm. The illuminator can be configured to illuminate a scene with illumination in the at least one atmospheric absorption band selected from the list consisting of 780 nm, 940 nm, 1120 nm, 1400 nm, and 1900 nm. The sensor can include at least one of Germanium sensitive to plus or minus 50 nm of 1120 nanometers, InGaAs sensitive to plus or minus 50 nm of 780 nm to 1900 nm, and/or HgCdTe (Mercury Cadmium Telluride, also known as MerCadTelluride). The illuminator can be an LED or laser based illuminator that emits 940 nm illumination, wherein the filter is configured to pass illumination within plus or minus 50 nm of 940 nm, and wherein the sensor is a silicon based sensor sensitive to illumination within plus or minus 50 nm of 940 nm.
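For illustration only, the band and bandwidth options described above can be expressed as a small configuration structure. The following Python sketch is not part of the disclosed system; the band list, the FilterSpec helper, and the example values are assumptions used to show how a passband check against a selected atmospheric absorption band might look.

```python
# Illustrative sketch only: models the band/bandwidth choices described above.
# The FilterSpec class and the band list are assumptions for illustration.
from dataclasses import dataclass

# Candidate atmospheric absorption bands named in the disclosure (nm).
ABSORPTION_BANDS_NM = (780, 940, 1120, 1400, 1900)

@dataclass(frozen=True)
class FilterSpec:
    center_nm: float      # selected atmospheric absorption band
    half_width_nm: float  # e.g. 50 or 60 nm, as described above

    def passes(self, wavelength_nm: float) -> bool:
        """True if the wavelength lies inside the filter passband."""
        return abs(wavelength_nm - self.center_nm) <= self.half_width_nm

# Example: a 940 nm band pass filter with a +/- 50 nm passband.
bp_940 = FilterSpec(center_nm=940, half_width_nm=50)
assert bp_940.passes(955)        # in-band illuminator return is kept
assert not bp_940.passes(850)    # out-of-band ambient light is rejected
```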
The scanner camera can be secured to the framework such that the lens faces the framework front end. The first mirror arrangement can include a mirror mounted at or near the framework front end. The second mirror arrangement can include a primary mirror mounted at or near the framework back end, and a secondary mirror mounted at or near a location between the framework front and back ends. The scanner camera can be secured to the framework such that a portion of the lens faces the first mirror arrangement and a portion of the lens faces the second mirror arrangement. The first mirror arrangement can include a mirror mounted at or near the framework front end and the second mirror arrangement can include a primary mirror mounted at or near the framework back end and a secondary mirror mounted at or near a location between the framework front and back ends.
The scanner camera can be secured such that the camera lens is angled downwardly away from the framework top surface. The scanner camera can be secured to the framework such that the lens faces the framework back end. The framework can include a first glass member secured between the framework top surface and front end, and a second glass member secured between the framework top surface and back end. The first reflecting surface can be angled toward the first glass member and the second reflecting surface can be angled toward the second glass member. The camera can be provided with a single board computer (SBC) in two-way communication with a remote computer monitoring system.
A method for imaging includes illuminating a vehicle undercarriage with illumination in an atmospheric absorption band and imaging the vehicle undercarriage, wherein imaging includes filtering out illumination returned from the vehicle undercarriage that is outside the atmospheric absorption band. The method includes forming an image with the filtered illumination returned from the vehicle undercarriage.
Illuminating can include illuminating the undercarriage with illumination that includes at least one atmospheric absorption band selected from the list consisting of 780 nm, 940 nm, 1120 nm, 1400 nm, and 1900 nm. Filtering out illumination can include filtering out illumination that is not within plus or minus 50 nm of at least one band selected from the list consisting of 780 nm, 940 nm, 1120 nm, 1400 nm, and 1900 nm. The illuminator can be an LED or laser based illuminator that emits at least one of 780 nm and/or 940 nm illumination, wherein the filter is configured to pass illumination within plus or minus 50 nm of at least one of 780 nm and/or 940 nm, and wherein the sensor is a silicon based sensor sensitive to illumination within plus or minus 50 nm of at least one of 780 nm and/or 940 nm.
These and other features of the systems and methods of the subject disclosure will become more readily apparent to those skilled in the art from the following detailed description of the preferred embodiments taken in conjunction with the drawings.
So that those skilled in the art to which the subject disclosure appertains will readily understand how to make and use the devices and methods of the subject disclosure without undue experimentation, preferred embodiments thereof will be described in detail herein below with reference to certain figures, wherein:
Reference will now be made to the drawings wherein like reference numerals identify similar structural features or aspects of the subject disclosure. For purposes of explanation and illustration, and not limitation, a partial view of an embodiment of a system in accordance with the disclosure is shown in
As shown in
With reference now to
Two windows 73, 74 can be securely positioned between the top surface 66 and the end walls 67, 69 in order to cover the respective openings while maintaining visibility therethrough.
The windows 73, 74 are secured at respective angles A and B to the horizontal. The first window 73 is positioned to face forward (i.e., in the direction of travel of the overriding vehicle) and the second window 74 is positioned to face backward (i.e., against the direction of travel of the overriding vehicle) to assist in capturing two simultaneous views of the vehicle. The direction of travel of a given vehicle is indicated by arrow C in
The scanner system 54 includes a camera 75 and first 76 and second 77 internal mirror arrangements, which can be angled such that internal mirrors 82 and 84 face out through the anti-reflective, anti-glare, water-repellant glass members of the windows 73, 74. The camera 75 can be a Basler A602f wide area scan camera manufactured by Basler Vision Technology of Ahrensburg, Germany, capable of recording digital video images at a rate of at least 200 frames per second. The camera is provided with a lens 78 mounted thereto. The scanner camera 75 is secured in a position that faces the direction C of oncoming travel of a vehicle. The camera 75 is secured such that the lens 78 faces at an angle downwardly away from the framework top surface 66 such that the camera 75 is appropriately positioned to capture images reflected off of the first 76 and second 77 mirror arrangements. It is contemplated that the camera 75 can be oriented such that its lens faces either the front end or the back end of the framework.
As further shown in
In a similar manner, the second mirror arrangement, indicated at 77, can be secured to the framework so as to provide a second reflecting surface angled upwardly in a direction facing the framework top surface 66 and the framework back end 64. The positioning of the second mirror arrangement 77 enables the camera 75 to record images reflected by the second mirror arrangement as they appear on the other side of window member 74. The second mirror arrangement 77 can include a larger primary mirror 84 mounted at or near the framework back end 64 and a smaller secondary mirror 86 mounted at a location 87 in between the front 62 and back 64 ends of the scanner framework. The primary mirror 84 of the second mirror arrangement 77 is secured inside the back wall 64 of the framework and underneath the back window member 74. The secondary mirror 86 of the second mirror arrangement 77 can be positioned roughly halfway between the scanner framework front 62 and back 64 ends, and can be secured in a substantially perpendicular relation to the framework bottom floor 61. The mirror 82 can be secured at an angle D of between approximately 20 and 30 degrees from the horizontal, and the mirror 84 can be secured at an angle E of between approximately 25 and 35 degrees from the horizontal.
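As a rough sanity check on the mirror angles D and E, the direction of a reflected ray can be computed from a mirror's tilt. The Python sketch below uses a simplified two dimensional model with an assumed horizontal ray from the camera; it is not the disclosed optical design, and the reflect helper and the example angle are illustrative only.

```python
# Simplified 2D sketch (assumed geometry, not the disclosed optical design):
# reflect a horizontal camera ray off a mirror tilted by `tilt_deg` from horizontal.
import math

def reflect(direction, tilt_deg):
    """Reflect a 2D ray direction about a mirror inclined tilt_deg above horizontal."""
    t = math.radians(tilt_deg)
    # Mirror surface direction is (cos t, sin t); its unit normal is:
    nx, ny = -math.sin(t), math.cos(t)
    dx, dy = direction
    dot = dx * nx + dy * ny
    return (dx - 2 * dot * nx, dy - 2 * dot * ny)

# A horizontal ray reflected off a mirror at 25 degrees (within the 20-30 degree
# range given for angle D) leaves at 2 * 25 = 50 degrees above horizontal,
# i.e. angled upward toward the window above the mirror.
rx, ry = reflect((1.0, 0.0), 25.0)
print(math.degrees(math.atan2(ry, rx)))  # ~50.0
```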
The arrangement of the camera 75 and the first and second mirror arrangements 76, 77 allows the scanner system 54 to operate such that the camera 75 can detect multiple images from an overriding vehicle at the same time. The top half of the camera lens looks over the small mirror 86 onto the front mirror 82. The bottom half of the camera lens looks onto the small mirror 86 that captures the view reflected by the back main mirror 84. A first view is taken of the vehicle as it approaches wall 69, as shown by the dashed lines 92. In this view, the camera is recording the image of the vehicle as reflected by the back mirror 84 at the back end of the scanner framework looking toward the back of the vehicle via the smaller mirror 86. A second view is simultaneously recorded by the camera as it is reflected from the first mirror arrangement, as indicated in dashed lines at 90.
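Because both views land on a single sensor, each captured frame can be treated as two stacked sub-images. The following Python sketch is a minimal illustration under the assumption that the two views occupy the upper and lower halves of the frame; in practice the division and orientation would depend on the optics and calibration of a given installation, and the split_views helper is hypothetical.

```python
# Minimal sketch, assuming the two mirror views occupy the upper and lower
# halves of each captured frame; the real split and orientation depend on the
# optics and on calibration of a given installation.
import numpy as np

def split_views(frame: np.ndarray):
    """Split one camera frame into the two simultaneously captured mirror views."""
    rows = frame.shape[0]
    view_a = frame[: rows // 2]   # assumed to hold the view via one mirror arrangement
    view_b = frame[rows // 2 :]   # assumed to hold the view via the other
    return view_a, view_b

# Example with a dummy 480 x 640 grayscale frame.
frame = np.zeros((480, 640), dtype=np.uint8)
view_a, view_b = split_views(frame)
print(view_a.shape, view_b.shape)  # (240, 640) (240, 640)
```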
An illuminator 95, which includes an upward facing laser or LED bank on either side end of the scanner system 54, is operatively connected to the camera 75 to illuminate the first and second portions of the field of view. A band pass filter 96 is operatively connected to the scanner camera 75 to filter out illumination outside of an atmospheric absorption band, wherein the sensor 97 of the camera 75 is sensitive to illumination in the atmospheric absorption band. The filter 96 can be located outside of the lens 78 as shown, or it can be mounted inside the interface between the lens 78 and the camera 75.
The filter 96 can be configured to pass illumination within plus or minus 50 or 60 nm of at least one atmospheric absorption band such as 780 nm, 940 nm, 1120 nm, 1400 nm, and 1900 nm. The illuminator 95 can be configured to illuminate the scene in the field of view with illumination in the at least one atmospheric absorption band such as 780 nm, 940 nm, 1120 nm, 1400 nm, and 1900 nm. Those skilled in the art will readily appreciate that there can be advantages to systems that use even smaller bandwidth spectral filters. For example, a system 54 may be able to reject even more of the solar irradiance transmitted through the atmosphere by implementing a bandpass filter as narrow as plus or minus 5 nm. This approach would typically require laser illumination, which has a spectral bandwidth smaller than 5 nm, making for an efficient system that collects most of the laser illumination through that small spectral band. Implementing an LED solution involves tradeoffs. LEDs have a larger spectral bandwidth, typically above 30 nm, and therefore a filter wider than 30 nm is needed to make sure the system 54 captures all of the LED illumination efficiently. If a spectral filter of plus or minus 5 nm bandpass is used with LEDs, most of the light from the LEDs would be rejected by the filter. In appropriate applications, it may be desirable to implement a laser based solution with a smaller spectral bandwidth filter, since some atmospheric absorption bands are narrower than 50 or 60 nm.
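To illustrate this tradeoff numerically, the fraction of an illuminator's output that passes a band pass filter can be estimated by assuming an approximately Gaussian illuminator spectrum and an ideal rectangular passband. The Python sketch below uses assumed spectral widths (about 1 nm FWHM for a laser and about 35 nm for an LED); it is only meant to show why a plus or minus 5 nm filter pairs well with a laser but rejects most LED output.

```python
# Illustrative estimate only: assumes Gaussian illuminator spectra and ideal
# rectangular filter passbands, which is a simplification of real components.
import math

def fraction_passed(fwhm_nm: float, filter_half_width_nm: float) -> float:
    """Fraction of a Gaussian spectrum (given FWHM) within +/- filter_half_width of its center."""
    sigma = fwhm_nm / (2 * math.sqrt(2 * math.log(2)))
    return math.erf(filter_half_width_nm / (sigma * math.sqrt(2)))

# Laser at 940 nm with ~1 nm FWHM vs LED with ~35 nm FWHM (assumed values).
for name, fwhm in (("laser (~1 nm FWHM)", 1.0), ("LED (~35 nm FWHM)", 35.0)):
    for half_width in (5.0, 50.0):
        print(f"{name}, +/-{half_width:.0f} nm filter: "
              f"{fraction_passed(fwhm, half_width):.0%} of output passed")
# A +/-5 nm filter passes essentially all of the laser output but only a
# fraction of the LED output, so LEDs call for the wider (e.g. +/-50 nm) passband.
```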
The sensor 97 can include at least one of Germanium sensitive to plus or minus 50 nm of 1120 nanometers (Germanium can be sensitive to wavelengths from 700 nm to about 1600 nm; it is not typically sensitive up to 1900 nm), InGaAs sensitive to plus or minus 50 nm of 780 nm to 1900 nm, and/or HgCdTe (Mercury Cadmium Telluride, also known as MerCadTelluride). It is contemplated that the illuminator 95 can be an LED or laser based illuminator that emits 940 nm illumination, wherein the filter 96 is configured to pass illumination within plus or minus 50 nm of 940 nm, and wherein the sensor 97 is a silicon based sensor sensitive to illumination within plus or minus 50 nm of 940 nm. Those skilled in the art having the benefit of this disclosure will readily appreciate that the bandwidths described in this paragraph can be tailored larger or smaller as suitable for a specific application, e.g. as explained in the previous paragraph, and to account for manufacturing tolerances in illuminators or the like.
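The sensor material options can likewise be summarized as approximate sensitivity ranges and matched against a chosen band. The ranges in the Python sketch below are approximations drawn from the discussion above plus the commonly cited figure of roughly 350 nm to 1100 nm for silicon (not a value from the disclosure); the candidate_sensors helper is illustrative only.

```python
# Approximate sensitivity ranges (nm); values are illustrative, drawn from the
# discussion above plus the common ~350-1100 nm figure for silicon.
SENSOR_RANGES_NM = {
    "Si":     (350, 1100),   # covers the 780 nm and 940 nm bands
    "Ge":     (700, 1600),   # covers ~1120 nm but not 1900 nm
    "InGaAs": (730, 1950),   # 780 nm to 1900 nm, +/- 50 nm, as stated above
    # HgCdTe (mercury cadmium telluride) is tunable by composition; omitted here.
}

def candidate_sensors(band_nm: float, half_width_nm: float = 50.0):
    """Return sensor materials whose range covers band_nm +/- half_width_nm."""
    lo, hi = band_nm - half_width_nm, band_nm + half_width_nm
    return [m for m, (lo_m, hi_m) in SENSOR_RANGES_NM.items() if lo_m <= lo and hi <= hi_m]

print(candidate_sensors(940))   # ['Si', 'Ge', 'InGaAs']
print(candidate_sensors(1900))  # ['InGaAs']
```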
With reference now to
A separate computer 35 is shown, which may be a remote computer not located near the physical entry control deployment elements 10. Thus, communications from the scanner system 54 can be used while being operated either locally at computer 15 or remotely at computer 35. It will be appreciated that computer and monitor 15 may be considered remote even when located at the implementation site, since they may be connected to elements 10 via Ethernet or fiber cabling 12, for example, or via wireless communication.
A method for imaging can include illuminating a vehicle undercarriage with illumination in an atmospheric absorption band and imaging the vehicle undercarriage, wherein imaging includes filtering out illumination returned from the vehicle undercarriage that is outside the atmospheric absorption band. The method includes forming an image with the filtered illumination returned from the vehicle undercarriage.
Illuminating can include illuminating the undercarriage with illumination that includes at least one atmospheric absorption band selected from the list consisting of 780 nm, 940 nm, 1120 nm, 1400 nm, and 1900 nm. Filtering out illumination can include filtering out illumination that is not within plus or minus 50 nm of at least one band selected from the list consisting of 780 nm, 940 nm, 1120 nm, 1400 nm, and 1900 nm (or other suitable bandwidths as described above).
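As a toy illustration of these method steps, the following Python sketch models a single pixel whose recorded value is the sum of the illuminator return and ambient sunlight, with the band pass filter removing the out-of-band ambient component. All numbers are arbitrary assumed values chosen only to show the effect, not measurements from the disclosed system.

```python
# Toy numerical sketch of the method steps (illuminate, filter, form image).
# All radiance numbers are made-up illustrative values, not measurements.

def form_pixel(illuminator_return: float,
               ambient_in_band: float,
               ambient_out_of_band: float,
               band_pass_filter: bool) -> float:
    """Return the recorded pixel signal with or without the band pass filter."""
    if band_pass_filter:
        # Only light inside the atmospheric absorption band reaches the sensor.
        return illuminator_return + ambient_in_band
    return illuminator_return + ambient_in_band + ambient_out_of_band

# Assumed values: steady illuminator return, small in-band sunlight (absorbed by
# the atmosphere), large and variable out-of-band sunlight (e.g. glints/shadows).
signal = 100.0
for ambient_out in (0.0, 300.0):  # shade vs direct sun, arbitrary units
    unfiltered = form_pixel(signal, 5.0, ambient_out, band_pass_filter=False)
    filtered = form_pixel(signal, 5.0, ambient_out, band_pass_filter=True)
    print(f"out-of-band ambient={ambient_out:5.1f}: "
          f"unfiltered={unfiltered:6.1f}, filtered={filtered:6.1f}")
# The filtered pixel stays nearly constant across lighting conditions, which is
# the artifact-reduction effect described above.
```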
The methods and systems of the present disclosure, as described above and shown in the drawings, provide for reduction or even elimination of certain artifacts in vehicle undercarriage imagery. While the apparatus and methods of the subject disclosure have been shown and described with reference to preferred embodiments, those skilled in the art will readily appreciate that changes and/or modifications may be made thereto without departing from the scope of the subject disclosure.
This application is a continuation of U.S. patent application Ser. No. 16/808,725 filed Mar. 4, 2020, which claims priority to U.S. Provisional Patent Application No. 62/925,892 filed Oct. 25, 2019, the contents of which are incorporated by reference herein in their entirety.
Number | Date | Country
---|---|---
20220094880 A1 | Mar 2022 | US

Number | Date | Country
---|---|---
62925892 | Oct 2019 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 16808725 | Mar 2020 | US
Child | 17542329 | | US