Positive identification of human beings has been used for many purposes, including law enforcement. One of the most conventional techniques of positive identification is the fingerprint. Other techniques include photography and various other biometric measurements.
Around 1870, the French anthropologist Alphonse Bertillon devised a system to measure and record the dimensions of certain bony parts of the body. These measurements were reduced to a value, effectively a multidimensional vector, which theoretically applied to only one person and did not change during that person's adult life. The so-called Bertillon system was used for many years, until it was discredited in 1903. According to legend, a prisoner named Will West was sentenced to the US penitentiary in Leavenworth, Kans. His Bertillon measurements were close enough to those of another prisoner, held there at the same time, to identify the two men as the same person; coincidentally they shared nearly the same name, but they were different people with different fingerprints. It was later alleged that they were identical twin brothers. Around this time, law enforcement began using fingerprints to identify criminals.
In 1905, the U.S. Army began using fingerprints, and two years later, the U.S. Navy started doing the same. Around 1907, the U.S. Marine Corps followed. Many of the agencies began sending copies of their fingerprint cards to the National Bureau of Criminal Identification.
The science of fingerprinting continued to advance. In 1918, Edmond Locard wrote that if 12 points, called Galton's details, were the same between two fingerprints, that would suffice for positive identification. This is the origin of the so-called 12 points of comparison between fingerprints. It is a guideline, however: there is no universally required number of points for identification. Some countries set minimum standards for points of comparison; the United States has no set standard.
Since 1924, the FBI identification division has stored fingerprint files. By 1971, there were 200 million cards on file. The fingerprints are often scanned in terms of minutia as explained in U.S. Pat. Nos. 6,766,040; 6,763,127; 6,270,011; 6,078,265; 6,072,895; and 5,878,158.
Modern “AFIS” technology has split these files into a computerized part and a manually maintained part. Many of the manual files are duplicates, but there is no single accepted filing system. The FBI's new integrated AFIS site plans to stop using paper fingerprint cards completely; many of the fingerprint cards are stored in a warehouse facility. Still more fingerprint data has been acquired from the US-VISIT program, under which foreign nationals who wish to visit the United States must first go to their local US embassy and apply for a visa. If the visa is approved, the traveler is fingerprinted and then photographed. The fingerprint and photograph are compared against the arriving traveler, to ensure identity. Difficulties with fingerprint readers have often caused the fingerprinting to be abandoned in favor of simple manual comparison.
Other forms of positive identification such as DNA matching, blood typing and saliva matching have been used. Facial imaging, iris scanning, and palm geometry readers have been used in military applications and for government security.
The present inventors have filed other patent applications relating to unique individual identification.
According to the present system, a body shape part is obtained, and a geometrical object is aligned with different reference points within the body shape. Measurements of that geometrical object are used as minutia to represent the body shape, either for unique identification or for comparison to a database.
These and other aspects will now be described in detail with reference to the accompanying drawings, wherein:
The general structure and techniques, and more specific embodiments which can be used to effect different ways of carrying out the more general goals, are described herein.
A human body can be scanned using the techniques described in U.S. Pat. No. 6,891,381, and co-pending application Ser. No. 11/056,945.
The processor 615 is then used to process the data as described herein. The present application teaches techniques which can be used for reducing that raw information into minutia that can be used, for example, for at least one of storing, retrieving, cataloging, and/or comparing the results of a body scan.
One aspect describes using an individual's two-dimensional profile image to determine if that person is the same person listed on a specified identification card. One application is visa identification, carried out by comparing a person's two-dimensional profile image to the image in the visa application, taken at an earlier date.
In the embodiment, a silhouette of the person is taken at the time they apply for the identification card, or a visa. The image is sampled using electromagnetic radiation, for example millimeter waves, microwave energy, ultrasonic waves, coherent light, or a photo diode array. The sampling may illuminate the subject's head, neck and upper torso from one side. In the embodiment, a series of detectors or receivers may be located across from the transmission device on the opposite side of the individual to capture information therefrom. The received silhouette is cataloged.
In the embodiment, the silhouette is reduced to minutia and stored for further recall. In the embodiment, the resolution of the minutia may be sufficient to allow the sample to be used at some future time to determine if the previously stored image matches the person being screened or does not match the person.
If the image matches, then the person is granted access. If the image does not match, then the subject fails the identification test. The subject may be automatically denied access, or may be further processed in some way, either by manual screening or by being subjected to criminal enforcement proceedings.
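The match/no-match decision described above can be sketched as a simple component-wise comparison of stored minutiae against freshly measured ones. The function below is only an illustration of such a comparison, not the patented method; the template layout (a flat vector of angles and lengths) and the tolerance value are assumptions made for the example.

```python
def templates_match(stored, candidate, tolerance=1.5):
    """Return True if every component of the freshly measured
    minutiae template agrees with the stored template to within
    `tolerance` (e.g. degrees for angles, mm for lengths).
    The tolerance value is an assumed example, not a figure
    taken from the specification."""
    if len(stored) != len(candidate):
        return False
    return all(abs(s - c) <= tolerance for s, c in zip(stored, candidate))

# A subject whose re-measured values drift slightly still passes;
# a different individual fails.
print(templates_match([53.1, 36.9, 91.0, 74.5], [53.6, 36.5, 90.8, 75.1]))  # True
print(templates_match([53.1, 36.9, 91.0, 74.5], [48.0, 42.0, 95.0, 70.0]))  # False
```

In practice the tolerance would be tuned to the repeatability of the scanner and the head-positioning correction described below.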
When the subject is scanned, the position of the subject's head is used as a point of reference in determining whether the person is the same as the previous sample. If the person was originally sampled looking straight ahead, but a subsequent screening captures the head looking up, looking down, or looking in some other direction, previous systems may have had difficulty making the comparison.
The present application defines the concept of a reference point. The reference point can be a center line, or a mean reference point. This is used for identification purposes.
Geometrical objects which are described herein may include lines, angles, triangles, and others. It should be understood that any other geometrical object, such as a trapezoid, polygon, or the like, can alternatively be used. A special line may be established within the diagram above the person's skull. This line is referred to within the specification as “the baseline”.
A second line 110 is established that extends vertically upward from the external occipital protuberance. A third line extends from the nose tip 99 to a point in space that intersects the second line. The three lines 100, 110 and 120 form a right triangle 125.
A second triangle 135 is an isosceles triangle formed between the first line 100 and a portion of the third line 120, at the point where line 120 intersects a fourth line 140 that extends from the external occipital protuberance.
These lines are all formed between baselines on this view, and form angles shown herein as a, b, c and d. While the above describes certain reference points, it should be understood that different reference points can be similarly used to form unique angles.
The operation may be carried out by the processor 615, which may execute the flow chart of
At 700, the image is processed using conventional image processing techniques to identify different features in the image. The image processing, for example, may use a correlation technique, or may use artificial intelligence techniques. At 700, the image is processed to find the nose within the image. One way of doing this is to obtain a number of different exemplary forms of noses. Each of these noses is then correlated across the image, to find the portion of the image with the best least-mean-squares score against each of the sample noses. Another way is to simply look for a specified part of the nose, such as a nostril, which may not exist in other face portions. Yet another way uses a ruleset that simply assumes that the portion of the face which extends furthest from the eye sockets is in fact the tip of the nose. In any case, any of these techniques can be used to find the nose tip at 700. Data indicative of the nose tip is stored; for example, the image may be defined as a series of points in x,y space, and the position of the nose may be stored as its x,y coordinates.
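The correlation search described above might be sketched as follows. This is a minimal illustration: an exhaustive sum-of-squared-differences scan stands in for the least-mean-squares correlation, and the library of exemplary nose templates is assumed to be supplied from elsewhere.

```python
import numpy as np

def find_nose_tip(profile, templates):
    """Slide each exemplary nose template over a 2-D profile image and
    keep the position with the lowest sum-of-squared-differences
    (least-squares) score, returning the center of the best window
    as (x, y) coordinates."""
    best_score, best_xy = float("inf"), None
    h, w = profile.shape
    for t in templates:
        th, tw = t.shape
        for y in range(h - th + 1):
            for x in range(w - tw + 1):
                score = np.sum((profile[y:y + th, x:x + tw] - t) ** 2)
                if score < best_score:
                    best_score, best_xy = score, (x + tw // 2, y + th // 2)
    return best_xy
```

A production system would use a fast FFT-based correlation rather than this brute-force scan, but the matching criterion is the same.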
At 702, the image is again correlated to find the occipital protuberance, which is likewise stored as its x and y coordinates. At 705, an operation called geometric weighting is carried out, which extends a geometric line 100 between the nose tip obtained at 700 and the occipital protuberance obtained at 702. Another line 110 is obtained as the perpendicular to that line, extending from the occipital protuberance. Another line 120 is obtained by extending a line from the nose tip to intersect with line 110. Each of these lines can be geometrically obtained from the already obtained information. At 710, the values of the angles a-d are obtained and stored. These may be obtained using conventional trigonometric techniques.
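The trigonometric step at 710 can be illustrated with a short calculation. This sketch assumes image coordinates for the two landmarks, and assumes (per the earlier description of lines 110 and 120) that the legs of triangle 125 are vertical and horizontal, so the right angle sits at their intersection; the function name is illustrative only.

```python
import math

def profile_angles(nose_tip, occipital):
    """Given (x, y) coordinates for the nose tip and the external
    occipital protuberance, treat the segment joining them as the
    hypotenuse (line 100) of a right triangle whose legs lie along
    the vertical line 110 and the horizontal line 120, and return
    the triangle's two acute angles in degrees."""
    dx = abs(nose_tip[0] - occipital[0])  # horizontal leg, along line 120
    dy = abs(nose_tip[1] - occipital[1])  # vertical leg, along line 110
    angle_at_nose = math.degrees(math.atan2(dy, dx))
    angle_at_occipital = 90.0 - angle_at_nose
    return angle_at_nose, angle_at_occipital
```

For landmarks at (0, 0) and (30, 40), for example, the two acute angles are roughly 53.1 and 36.9 degrees, which with the right angle sum to 180.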
Angles formed by the line intersections are unique to the individual subject. These angles can be concatenated to form a basis for a probability analysis indicating that the subject being imaged can be distinguished from another individual. One aspect of this system may be a measure of the probability of error.
It is projected that the world's population will increase to 8 billion by the year 2020. Therefore, to allow the system to operate effectively with zero error, one useful probability may be greater than 1 in 10^10.
The scanning system may obtain an image with a resolution of 1 mm. This enables measuring the angles described above, as well as the length of the nose, the length of the forehead, the length of the jaw line, and the end-to-end baseline.
The lengths and angles of the four lines (eight values, taken eight at a time) may be used to create a unique profile of the subject. This may be used along with height and gender to provide a combination of ten samples, taken ten at a time. This may provide 10^10 variations.
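The counting argument above, in which independent measurements each resolvable into a number of distinguishable levels multiply into a large template space, can be made concrete with a one-line calculation. The number of levels per feature is an assumption of the example.

```python
def template_space(num_features, levels_per_feature):
    """Number of distinct templates possible when each of
    `num_features` independent measurements resolves into
    `levels_per_feature` distinguishable values."""
    return levels_per_feature ** num_features

# Ten samples taken ten at a time, as in the text:
assert template_space(10, 10) == 10 ** 10  # 10,000,000,000 variations
```

The same arithmetic explains why adding even two coarse features, such as height and gender, multiplies the size of the template space.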
In one embodiment, length and curvature of the jaw line may be avoided because it changes drastically from significant weight loss or gain, and from time to time. Again, while this describes geometric lines and angles taken along specified reference points, different reference points may be defined.
The angles, or information about the lines themselves, may be used to form template minutiae. The minutiae represent a reduced version of the unique information about the individual, and can be used to rapidly distinguish individuals in known user groups from those outside the known groups. By taking the lengths of the four lines, and the angles formed between them, a mathematical probability of correct identification is obtained. In an embodiment, this may be proportional to 6^6, and if height and gender are added, 8^8, or approximately 1 in 134 million. For example, if there is a group of 500,000 police officers, this might provide a likely error rate of 0.03% failure.
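The relationship between the size of the template space and the error rate in a finite group can be illustrated with standard collision arithmetic. This is our illustrative model, not a calculation taken from the text: it assumes templates fall uniformly at random over the template space, and the group and space sizes are example inputs.

```python
import math

def any_collision_probability(group_size, space_size):
    """Birthday-problem approximation for the probability that at
    least two members of a group share the same template, assuming
    templates are uniformly distributed over `space_size` values."""
    n = group_size
    return 1.0 - math.exp(-n * (n - 1) / (2.0 * space_size))

def false_match_rate(others, space_size):
    """Probability that one specific individual's template collides
    with at least one of `others` previously enrolled templates."""
    return 1.0 - (1.0 - 1.0 / space_size) ** others
```

Under these assumptions, with a space of 10^10 templates the chance that a given individual's template collides with any of 499,999 others is about 0.005%; the achievable error rate therefore depends directly on how finely the lines and angles can be resolved.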
As an alternative to the technique of obtaining silhouette information from an individual, the technique shown in
Although only a few embodiments have been disclosed in detail above, other embodiments are possible and the inventor(s) intend these to be encompassed within this specification. The specification describes specific examples to accomplish a more general goal that may be accomplished in other ways. This disclosure is intended to be exemplary, and the claims are intended to cover any modification or alternative which might be predictable to a person having ordinary skill in the art. For example, different lines, reference points, and different geometric constructs can be used. Polygons may be used to represent 3D information. Different numbers of lines/angles/polygons can be obtained. The information may be stored as raw numbers or as vectors.
Also, the inventor(s) intend that only those claims which use the words “means for” are intended to be interpreted under 35 USC 112, sixth paragraph. Moreover, no limitations from the specification are intended to be read into any claims, unless those limitations are expressly included in the claims.
This application claims priority under 35 USC §119(e) to U.S. Patent Application Ser. No. 60/603,603, filed on Aug. 23, 2004, the entire contents of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
3490037 | Williams | Jan 1970 | A |
3691557 | Constant | Sep 1972 | A |
3801978 | Gershberg et al. | Apr 1974 | A |
4519037 | Brodeur et al. | May 1985 | A |
4552151 | Bolomey et al. | Nov 1985 | A |
4773029 | Claesson et al. | Sep 1988 | A |
4901084 | Huguenin et al. | Feb 1990 | A |
4910523 | Huguenin et al. | Mar 1990 | A |
4940986 | Huguenin | Jul 1990 | A |
4975968 | Yukl | Dec 1990 | A |
4975969 | Tal | Dec 1990 | A |
5047783 | Hugenin | Sep 1991 | A |
5073782 | Huguenin et al. | Dec 1991 | A |
5202692 | Huguenin et al. | Apr 1993 | A |
5227800 | Huguenin et al. | Jul 1993 | A |
5302962 | Rebuffi et al. | Apr 1994 | A |
5363050 | Guo et al. | Nov 1994 | A |
5438336 | Lee et al. | Aug 1995 | A |
5450504 | Calia | Sep 1995 | A |
5541985 | Ishii et al. | Jul 1996 | A |
5578933 | Nonaka | Nov 1996 | A |
5588435 | Weng et al. | Dec 1996 | A |
5715819 | Svenson et al. | Feb 1998 | A |
5729591 | Bailey | Mar 1998 | A |
5740266 | Weiss et al. | Apr 1998 | A |
5787186 | Schroeder | Jul 1998 | A |
5841288 | Meaney et al. | Nov 1998 | A |
5850599 | Seiderman | Dec 1998 | A |
5859628 | Ross et al. | Jan 1999 | A |
5878158 | Ferris et al. | Mar 1999 | A |
5956525 | Minsky | Sep 1999 | A |
6057761 | Yukl | May 2000 | A |
6072895 | Bolle et al. | Jun 2000 | A |
6078265 | Bonder et al. | Jun 2000 | A |
6122737 | Bjorn et al. | Sep 2000 | A |
6144848 | Walsh et al. | Nov 2000 | A |
6168079 | Becker et al. | Jan 2001 | B1 |
6175923 | Bailey | Jan 2001 | B1 |
6219793 | Li et al. | Apr 2001 | B1 |
6232937 | Jacobsen et al. | May 2001 | B1 |
6243447 | Swartz et al. | Jun 2001 | B1 |
6270011 | Gottfried | Aug 2001 | B1 |
6334575 | Su-Hui | Jan 2002 | B1 |
6405314 | Bailey | Jun 2002 | B1 |
6429625 | LeFevre et al. | Aug 2002 | B1 |
6453301 | Niwa | Sep 2002 | B1 |
6454711 | Haddad et al. | Sep 2002 | B1 |
6492986 | Metaxas et al. | Dec 2002 | B1 |
6507309 | McMakin et al. | Jan 2003 | B2 |
6571188 | Clarridge et al. | May 2003 | B1 |
6587891 | Janky et al. | Jul 2003 | B1 |
6612488 | Suzuki | Sep 2003 | B2 |
6664916 | Stafford et al. | Dec 2003 | B1 |
6687345 | Swartz et al. | Feb 2004 | B1 |
6687346 | Swartz et al. | Feb 2004 | B1 |
6703964 | McMakin et al. | Mar 2004 | B2 |
6763127 | Lin et al. | Jul 2004 | B1 |
6766040 | Catalano et al. | Jul 2004 | B1 |
6775777 | Bailey | Aug 2004 | B2 |
6792291 | Topol et al. | Sep 2004 | B1 |
6870791 | Caulfield et al. | Mar 2005 | B1 |
6891381 | Bailey et al. | May 2005 | B2 |
6900980 | Christopher | May 2005 | B2 |
6965340 | Baharav et al. | Nov 2005 | B1 |
7180441 | Rowe et al. | Feb 2007 | B2 |
20010044331 | Miyoshi et al. | Nov 2001 | A1 |
20010056359 | Abreu | Dec 2001 | A1 |
20020060243 | Janiak et al. | May 2002 | A1 |
20020087478 | Hudd et al. | Jul 2002 | A1 |
20020089410 | Janiak et al. | Jul 2002 | A1 |
20020129257 | Parmelee et al. | Sep 2002 | A1 |
20020163780 | Christopher | Nov 2002 | A1 |
20030067630 | Stringham | Apr 2003 | A1 |
20030083042 | Abuhamdeh | May 2003 | A1 |
20030118216 | Goldberg | Jun 2003 | A1 |
20030123754 | Toyama | Jul 2003 | A1 |
20030222142 | Stevens | Dec 2003 | A1 |
20030235335 | Yukhin et al. | Dec 2003 | A1 |
20040012398 | Bailey et al. | Jan 2004 | A1 |
20040058705 | Morgan et al. | Mar 2004 | A1 |
20040096086 | Miyasaka et al. | May 2004 | A1 |
20040104268 | Bailey | Jun 2004 | A1 |
20040133604 | Lordo | Jul 2004 | A1 |
20040204120 | Jiles | Oct 2004 | A1 |
20050050617 | Moore et al. | Mar 2005 | A1 |
20050061873 | Pirillo | Mar 2005 | A1 |
20050264303 | Bailey et al. | Dec 2005 | A1 |
20060110010 | Bailey et al. | May 2006 | A1 |
Number | Date | Country |
---|---|---|
2002-247162 | Aug 2002 | JP |
WO 9815929 | Oct 1996 | WO |
Number | Date | Country |
---|---|---|
20060104489 A1 | May 2006 | US |
Number | Date | Country |
---|---|---|
60603603 | Aug 2004 | US |