System for reading and authenticating a composite image in a sheeting

Information

  • Patent Grant
  • Patent Number
    8,072,626
  • Date Filed
    Thursday, August 20, 2009
  • Date Issued
    Tuesday, December 6, 2011
Abstract
A system for reading and authenticating a composite image in a sheeting. An exemplary embodiment of the invention provides a system for reading and authenticating a sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting or both. The present invention also relates to methods of reading and authenticating a composite image that appears to the unaided eye to be floating above or below the sheeting or both.
Description
TECHNICAL FIELD

The present invention relates to a system for reading and authenticating a composite image in a sheeting. The present invention relates more particularly to a system for reading and authenticating a sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting. The present invention also relates more particularly to methods of reading and authenticating a composite image that appears to the unaided eye to be floating above or below the sheeting.


BACKGROUND OF THE INVENTION

As tampering and counterfeiting of identification documents, such as passports, driver's licenses, identification cards and badges, and documents of value, such as bonds, certificates, and negotiable instruments, increase, there is a need for greater security features and measures. Using commonly available technology, it is possible to alter the typed, printed, photographed, or handwritten details on such documents so that the document then shows that ownership of the document, or of an article to which the document relates, has been transferred to a party not legally entitled to it. To impede such tampering or alteration, it is a known practice to apply a security laminate over the top of these details. Such laminates may contain security features that will indicate whether the laminate itself is genuine, whether the laminate has been lifted or replaced, whether the laminate's surface has been penetrated, and whether the laminate surface has been overprinted or overlabelled. Other security features can include printing or patterns that respond to ultra-violet or infra-red light.


One example of a commercially available security laminate is the 3M™ Confirm™ Security Laminate with Floating Images, which is sold by 3M Company based in St. Paul, Minn. This security laminate with floating image is also described in U.S. Pat. No. 6,288,842 B1, “Sheeting with Composite Image that Floats,” (Florczak et al.), which is owned by the same assignee as the present application. This patent discloses microlens sheetings with composite images in which the composite image floats above or below the sheeting, or both. The composite image may be two-dimensional or three-dimensional. Methods for providing such an imaged sheeting, including by the application of radiation to a radiation sensitive material layer adjacent the microlens, are also disclosed in this patent.


A variety of security readers are known in the art. For example, U.S. Pat. No. 6,288,842, "Security Reader for Automatic Detection of Tampering and Alteration," (Mann) discloses a security reader for reading and processing information about security laminates. One example of a passport reader is commercially available from 3M Company based in St. Paul, Minn. and 3M AiT, Ltd. based in Ottawa, Ontario, Canada, as the 3M™ Full Page Reader (formerly sold as the AiT™ imPAX™ Reader).


A variety of machine vision systems are known in the art. For example, Computer Vision, written by Dana Ballard and Christopher Brown, is a textbook concerning computer vision or machine vision. Computer Vision discloses that computer vision or machine vision is the enterprise of automating and integrating a wide range of processes and representations used for visual perception. It includes as parts many techniques that are useful by themselves, such as image processing (transforming, encoding, and transmitting images), statistical pattern classification (statistical decision theory applied to general patterns, visual or otherwise), geometric modeling, and cognitive processing. In essence, machine vision is taking a two-dimensional representation of a three-dimensional scene and trying to replicate the three-dimensional scene. However, machine vision systems are not used for verifying the existence of a perceived three-dimensional security feature and then authenticating such a security feature by comparing it to a database of security features.


Although the commercial success of available security features and security readers has been impressive, as the capabilities of counterfeiters continue to evolve, it is desirable to further improve the ability to indicate that a security feature has been tampered with or somehow compromised, to help protect against counterfeiting, alteration, duplication, and simulation.


SUMMARY OF THE INVENTION

One aspect of the present invention provides a system for reading and authenticating a composite image in a sheeting. The system for reading and authenticating a composite image in a sheeting comprises: a sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting or both; a reader, comprising: a first camera to capture a first image of the sheeting and a first image of the composite image floating above or below the sheeting or both; a second camera to capture a second image of the sheeting and a second image of the composite image floating above or below the sheeting or both; and a computer for comparing the first image and the second image of the sheeting and for comparing the first image and second image of the composite image floating above or below the sheeting or both to calculate the perceived distance between the sheeting and the composite image floating above or below the sheeting or both.


In one preferred embodiment of the above system, the system further comprises a database including information about composite images that float above or below the sheeting or both and their floating distances relative to the sheeting. In another aspect of this embodiment, the computer compares the first image of the composite image that floats above or below the sheeting or both to the database of composite images to identify the composite image. In another aspect of this embodiment, the system compares the calculated perceived distance between the sheeting and the composite image with the floating distances in the database to provide information about the sheeting. In yet another aspect of this embodiment, the calculated perceived distance matches the floating distance in the database for the identified composite image and the system thereby authenticates the sheeting. In another aspect of this embodiment, the calculated perceived distance does not match the floating distances in the database for the identified composite image and the system thereby determines that the sheeting is not authentic.


In one preferred embodiment of the above system, the first camera and second camera are perpendicular to the sheeting. In another preferred embodiment of the above system, the sheeting is located in a fixed position. In another preferred embodiment of the above system, the composite image appears under reflected light to float above the sheeting. In yet another preferred embodiment of the above system, the composite image appears in transmitted light to float above the sheeting.


In another preferred embodiment of the above system, the composite image appears under reflected light to float below the sheeting. In another preferred embodiment of the above system, the composite image appears in transmitted light to float below the sheeting. In another preferred embodiment of the above system, the composite image also appears to the unaided eye to be at least in part in the plane of the sheeting.


Another aspect of the present invention provides an alternative system for reading and authenticating a composite image in a sheeting. The system for reading and authenticating a composite image in a sheeting comprises: a sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting or both; a reader, comprising: a camera moveable between a first position and a second position, wherein in the first position the camera captures a first image of the sheeting and a first image of the composite image floating above or below the sheeting or both, wherein in the second position the camera captures a second image of the sheeting and captures a second image of the composite image floating above or below the sheeting or both; and a computer for comparing the first image and the second image of the sheeting and for comparing the first image and second image of the composite image floating above or below the sheeting or both to calculate the perceived distance between the sheeting and the composite image floating above or below the sheeting or both.


In one preferred embodiment of the above system, the system further comprises a database including information about composite images that float above or below the sheeting or both and their floating distances relative to the sheeting. In another preferred embodiment of the above system, the computer compares the first image of the composite image that floats above or below the sheeting or both to the database of composite images to identify the composite image. In another preferred embodiment of the above system, the system compares the calculated perceived distance between the sheeting and the composite image with the floating distances in the database to provide information about the sheeting.


In another preferred embodiment of the above system, the calculated perceived distance of the floating image, above or below the sheeting or both, matches the floating distance in the database for the identified composite image and the system thereby authenticates the sheeting. In another preferred embodiment of the above system, the calculated perceived distance does not match the floating distances in the database for the identified composite image and the system thereby determines that the sheeting is not authentic. In yet another preferred embodiment of the above system, the sheeting is located in a fixed position.


In another preferred embodiment of the above system, the composite image appears under reflected light to float above the sheeting. In another preferred embodiment of the above system, the composite image appears in transmitted light to float above the sheeting. In another preferred embodiment of the above system, the composite image appears under reflected light to float below the sheeting. In yet another preferred embodiment of the above system, the composite image appears in transmitted light to float below the sheeting. In another aspect of this embodiment, the composite image also appears to the unaided eye to be at least in part in the plane of the sheeting. In another preferred embodiment of the above system, the camera is perpendicular to the sheeting.


Another aspect of the present invention provides an alternative system for reading and authenticating a composite image in a sheeting. The system for reading and authenticating a composite image in a sheeting comprises: a sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting; a reader, comprising: a camera; and a sheeting holder moveable between a first position and a second position, wherein the microlens sheeting is positioned on the sheeting holder, wherein in the first position the camera captures a first image of the sheeting and a first image of the composite image floating above or below the sheeting or both, wherein in the second position the camera captures a second image of the microlens sheeting and a second image of the composite image floating above or below the sheeting or both; and a computer for comparing the first image and the second image of the sheeting and for comparing the first image and second image of the composite image floating above or below the sheeting or both to calculate the perceived distance between the sheeting and the composite image floating above or below the sheeting or both.


In one preferred embodiment of the above system, the system further comprises a database including information about composite images that float above or below the sheeting or both and their floating distances relative to the sheeting. In another aspect of this embodiment, the computer compares the first image of the composite image that floats above or below the sheeting or both to the database of composite images to identify the composite image. In another aspect of this embodiment, the system compares the calculated perceived distance between the sheeting and the composite image with the floating distances in the database to provide information about the sheeting. In another aspect of this embodiment, the calculated perceived distance matches the floating distance in the database for the identified composite image and the system thereby authenticates the sheeting. In yet another aspect of this embodiment, the calculated distance does not match the floating distances in the database for the identified composite image and the system thereby determines that the sheeting is not authentic.


In another preferred embodiment of the above system, the camera is perpendicular to the sheeting. In yet another aspect of this embodiment, the sheeting is located in a fixed position. In another preferred embodiment of the above system, the composite image appears under reflected light to float above the sheeting. In another preferred embodiment of the above system, the composite image appears in transmitted light to float above the sheeting. In another preferred embodiment of the above system, the composite image appears under reflected light to float below the sheeting. In another preferred embodiment of the above system, the composite image appears in transmitted light to float below the sheeting. In yet another aspect of this embodiment, the composite image also appears to the unaided eye to be at least in part in the plane of the sheeting.


Another aspect of the present invention provides a method of reading and authenticating a composite image in a sheeting. The method comprises the steps of: providing a sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting or both; recording a first image of the microlens sheeting and recording a first image of the composite image floating above or below the sheeting or both; recording a second image of the microlens sheeting and recording a second image of the composite image floating above or below the sheeting or both; calculating the perceived distance between the sheeting and the composite image floating above or below the sheeting or both by comparing the first image and the second image of the microlens sheeting and by comparing the first image and second image of the composite image floating above or below the sheeting or both.


In one preferred embodiment of the above method, the method further includes the step of: providing a database including information about composite images that float above or below the sheeting or both and their floating distances relative to the sheeting. In another aspect of this embodiment, the method further includes the step of: identifying the composite image by comparing the first image of the composite image that floats above or below the sheeting or both to the database of composite images. In another aspect of this embodiment, the method further includes the step of: comparing the calculated perceived distance between the sheeting and the composite image with the floating distances in the database to provide information about the sheeting. In another aspect of this embodiment, the method further includes the step of: providing a signal to a user that the sheeting is authentic when the calculated perceived distance matches the floating distance in the database for the identified composite image. In another aspect of this embodiment, the method further includes the step of: providing a signal to a user that the sheeting is not authentic when the calculated perceived distance does not match the floating distances in the database for the identified composite image.


In one preferred embodiment of the above method, the composite image appears under reflected light to float above the sheeting. In another preferred embodiment of the above method, the composite image appears in transmitted light to float above the sheeting. In another preferred embodiment of the above method, the composite image appears under reflected light to float below the sheeting. In one preferred embodiment of the above method, the composite image appears in transmitted light to float below the sheeting. In yet another preferred embodiment of the above method, the composite image also appears to the unaided eye to be at least in part in the plane of the sheeting.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be further explained with reference to the appended Figures, wherein like structure is referred to by like numerals throughout the several views, and wherein:



FIG. 1 is a perspective view of one exemplary embodiment of a reader for reading and authenticating a composite image in a sheeting of the present invention;



FIG. 2 is a top view of a passport including composite images that appear to float above and appear to float below the sheeting;



FIG. 2a is a photomicrograph of a passport including composite images that appear to float above and appear to float below the sheeting;



FIG. 3 is a perspective view of the passport of FIG. 2 being read by the reader of FIG. 1;



FIG. 4 is a side, cross-sectional, schematic view of the passport reader and passport of FIG. 3;



FIG. 5 illustrates a schematic view of one exemplary embodiment of the cameras in the system for reading and authenticating a composite image in a sheeting of the present invention;



FIG. 6 illustrates a schematic view of another exemplary embodiment of the camera in the system for reading and authenticating a composite image in a sheeting of the present invention;



FIG. 7 illustrates a schematic view of yet another exemplary embodiment of the camera in the system for reading and authenticating a composite image in a sheeting of the present invention; and



FIG. 8 illustrates the optics associated with the embodiments of the systems illustrated in FIGS. 5-7.





DETAILED DESCRIPTION OF THE INVENTION

The system of the present invention reads a composite image that appears to be suspended, or to float, above, in the plane of, and/or below a sheeting. The system of the present invention is also useful for providing information to a user whether or not a sheeting having such a composite image is authentic or not. The system of the present invention is for reading and authenticating a composite image that appears to the unaided eye to be floating above or below a sheeting or both, such a floating composite image as taught in U.S. Pat. No. 6,288,842 B1, (“the '842 patent”), “Sheeting with Composite Image that Floats,” (Florczak et al.), which is owned by the same assignee as the present application, and which is hereby incorporated by reference. These composite images are actually three-dimensional, optical illusions, and they are perceived by the user to either be floating above or below the sheeting or both. The system of the present invention assists in calculating the distance that is perceived by the user between the composite image and the sheeting in this optical illusion.


Composite images that appear to the unaided eye to be floating above a sheeting, below a sheeting, or both, are suspended images and are referred to for convenience as floating images. The term “unaided eye” means normal (or corrected to normal) human vision not enhanced by, for example, magnification. These suspended or floating images may be either two or three-dimensional images, can be in black or white or in color, and can appear to move with the observer or change in shape. The sheeting that has a composite image may be viewed using light that impinges on the sheeting from the same side as the observer (reflected light), or from the opposite side of the sheeting as the observer (transmitted light), or both. One example of sheeting including such composite images is shown in FIG. 2a, which is explained in more detail below.


In one exemplary embodiment of sheeting containing such composite images as described above, the sheeting includes: (a) at least one layer of microlens, the layer having first and second sides; (b) a layer of material disposed adjacent the first side of the layer of microlens; and (c) an at least partially complete image formed in the material associated with each of a plurality of the microlens, where the image contrasts with the material. Microlens may also be called lenticular lens or microlenslets. The composite image is provided by the individual images, and it appears to the unaided eye to be floating above or below the sheeting, or both. The '842 patent provides a complete description of the microlens sheeting, exemplary material layers of such sheeting, some of which are preferably radiation sensitive material layers, examples of radiation sources for creating the individual images, and exemplary imaging processes.


The sheeting having a composite image as described in the '842 patent may be used in a variety of applications, such as: securing tamperproof images in passports, ID badges, event passes, affinity cards, or other documents of value; product identification formats and advertising promotions for verification and authenticity; brand enhancement images which provide a floating or sinking or a floating and sinking image of the brand; identification presentation images in graphics applications such as emblems for police, fire or other emergency vehicles; information presentation images in graphics applications such as kiosks, night signs and automotive dashboard displays; and novelty enhancement through the use of composite images on products such as business cards, hang-tags, art, shoes and bottled products. The system of the present invention for reading and authenticating sheeting having a composite image includes a reader for reading and authenticating any of the items mentioned above. For the sake of simplicity, the figures of the present application illustrate a passport having a floating image and a passport reader for reading and authenticating the floating image. However, the system of the present invention may include any reader for reading and authenticating any item having a floating image.



FIG. 1 illustrates one embodiment of a reader 10 that is a part of the system of the present invention for reading and authenticating a floating image. In this embodiment, the reader 10 is configured to read passports having floating images. The passport reader 10 includes a housing 50. The housing 50 includes a first portion 42 and a second portion 44. The first portion 42 includes a window 40, preferably made of glass, which is convenient for viewing the optical information found in the passport, such as printed images, photographs, signatures, personal alphanumeric information, and barcodes, and for viewing the floating images on the passport. The second portion 44 of the passport reader includes a ledge, which is convenient for supporting half of a passport when the passport 14 is inserted into the passport reader 10 to be read (shown in FIG. 2). The other half of the passport is placed on the glass 40 when the passport 14 is inserted into the passport reader 10 to be read and authenticated or verified.



FIG. 2 illustrates one embodiment of a schematic document of value including a floating image. FIG. 2a is a photomicrograph of a close-up view of a portion of an actual document of value including floating images. In this embodiment, the document of value is a passport booklet 14. The passport 14 is typically a booklet filled with several bound pages. One of the pages usually includes personalization data 18, often presented as printed images, which can include photographs 16, signatures, personal alphanumeric information, and barcodes, and allows human or electronic verification that the person presenting the document for inspection is the person to whom the passport 14 is assigned. This same page of the passport may have a variety of covert and overt security features, such as those security features described in U.S. patent application Ser. No. 10/193,850, "Tamper-Indicating Printable Sheet for Securing Documents of Value and Methods of Making the Same," filed on Aug. 6, 2004, by the same assignee as the present application, which is hereby incorporated by reference. In addition, this same page of the passport 14 includes a laminate of microlens sheeting 20 having composite images 30, which appear to the unaided eye to float either above or below the sheeting 20 or both. This feature is a security feature that is used to verify that the passport is an authentic passport and not a fake passport. One example of suitable microlens sheeting 20 is commercially available from 3M Company based in St. Paul, Minn. as 3M™ Confirm™ Security Laminate with Floating Images.


In this embodiment of the passport 14, the composite images 30 or floating images 30 include three different types of floating images. The first type of floating image 30a is a “3M” that appears to the unaided eye to float above the page in the passport 14. The second type of floating image 30b is a “3M” that appears to the unaided eye to float below the page in the passport 14. The third type of floating image 30c is a sine wave that appears to the unaided eye to float above the page in the passport 14. When the passport 14 is tilted by a user, the floating images 30a, 30b, 30c may appear to move to the observer. In reality, the floating images 30a, 30b, 30c are optical illusions that appear to the viewer's unaided eye to be floating above or below the sheeting 20 or both. The passport 14 or document of value may include any combination of floating images that float above, below and/or in the plane of the passport 14. The floating images may be any configuration and may include words, symbols, or particular designs that correspond to the document of value. For instance, passports issued by the Australian government include microlens sheeting having floating images in the shape of a kangaroo and boomerangs, two symbols representing the country. The other pages of the passport booklet may contain blank pages for receiving a country's stamp, as the person is processed through customs.


In the past, when a passport has been presented to a customs official as the person is being processed through customs to either leave or enter a country, the customs official would typically look at the passport 14 with his or her unaided eyes to see if the passport included the appropriate floating images 30 to verify that the passport was authentic. However, as counterfeiters become more and more sophisticated, it may become necessary in the future to provide systems that assist the official in verifying that the passport is authentic based on the security feature of the floating images. The system of the present invention first verifies that the passport or document of value contains at least one floating image 30. Then, the system verifies that the floating image 30 is the correct floating image 30. Lastly, the system verifies the perceived distance between the floating image 30 and the passport page having the microlens sheeting, known as the "floating distance." If this floating distance is the correct distance, or within some margin of error, then the system verifies or authenticates or otherwise communicates to the customs official that the passport is an authentic passport. If, however, the floating distance is not the correct distance, the system indicates to the customs official that the passport is a forgery or a fake. The system also helps reduce the time and effort spent by the customs official processing the passport.
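
The three verification stages just described lend themselves to a simple software pipeline. The following Python sketch is offered only as an illustration: the helper callables and the database layout are assumptions made for exposition and are not details prescribed by the patent.

from typing import Callable, Dict, Optional, Tuple

# Assumed database layout: image identifier -> (expected floating distance, tolerance),
# both in centimeters. The patent does not prescribe any particular format.
FloatDb = Dict[str, Tuple[float, float]]

def authenticate(first_image, second_image,
                 detect: Callable,      # hypothetical: finds a floating image, or returns None
                 identify: Callable,    # hypothetical: names the floating image, or returns None
                 measure: Callable,     # hypothetical: returns the perceived floating distance in cm
                 database: FloatDb) -> str:
    """Three-stage check: presence, identity, then floating distance."""
    floating = detect(first_image, second_image)
    if floating is None:
        return "reject: no floating image found"
    image_id = identify(floating, database)
    if image_id is None or image_id not in database:
        return "reject: unrecognized floating image"
    expected, tolerance = database[image_id]
    measured = measure(first_image, second_image)
    if abs(measured - expected) <= tolerance:
        return "authentic"
    return "reject: floating distance out of range"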



FIG. 3 illustrates the passport reader 10 of the system in combination with a passport 14. To read the passport, the passport booklet 14 is opened up to the page containing the floating images, creating a first portion 46 of the passport and a second portion 48 of the passport. In this case, the page of the passport 14 having the floating images is the same page that contains the personalization data 18, such as the picture 16 of the individual carrying the passport. Next, the passport booklet is inserted into the passport reader 10, such that the floating images 30 and the personalization data 18 in the first portion 46 of the passport 14 are adjacent (or placed over) the glass 40 of the reader 10. The second portion 48 of the passport 14 is in contact with the ledge 44 of the reader, and the seam of the passport 14 extends along the junction between adjacent edges of the glass 40 and the ledge 44. This placement of the passport 14 on the passport reader 10 is convenient for reading the floating images 30 and the personalization data 18, which is explained in more detail below in reference to FIGS. 4-7.



FIG. 4 is convenient for illustrating the inside of the passport reader 10 when the passport is being read and verified. The passport reader 10 can read the personalization data 18 from the passport, and to perform this function, the passport reader 10 contains many of the same parts (not illustrated) as the Full Page Readers sold under the 3M brand by 3M Company located in St. Paul, Minn. For example, the cameras in the reader 10 are also used to record and transmit the personalization information on the passport to the computer. However, the difference between the passport reader 10 of the system of the present invention and the Full Page Readers is that the passport reader 10 of the present invention can read and authenticate floating images 30.


The passport reader 10 includes a light source 52, a mirror 54, and at least a first camera 58. The reader 10 may optionally include a second camera 60 (FIG. 5). The mirror 54 is preferably a half-silvered mirror that can both reflect and transmit light. The microlens sheeting 20 on the passport 14 is viewable through the glass window 40. As mentioned above, the microlens sheeting 20 preferably includes a layer of microlens 22 and a radiation sensitive material layer 24.


In an exemplary embodiment, the mirror 54 is positioned at a 45° angle relative to both the light source 52 and the camera 58. This arrangement is such that the light from the light source 52 is reflected off the half-silvered mirror, up to the microlens sheeting or substrate 20 through the glass 40, and then reflected back down through the half-silvered mirror 54 and into the camera 58, as illustrated in FIG. 4. The light source 52 may provide light of a certain wavelength, polarized light, or retroreflected light. The term “retroreflected” as used herein refers to the attribute of reflecting an incident light ray in a direction antiparallel to its incident direction, or nearly so, such that it returns to the light source or the immediate vicinity thereof. Retroreflected light is preferred because it helps eliminate viewing the printed personalization information on the passport 14, making the floating image 30 easier to view.


The reader 10 may include a stationary camera 58, one moveable camera 58a, or two cameras 58, 60, as discussed in more detail in reference to FIGS. 5-8. One example of a suitable light source 52 is commercially available from Lumex, Inc. located in Palatine, Ill., as a white, clear lens, T1 format LED, under part number SSL-LX3054 UWC/A. One example of a suitable camera 58 is commercially available from Micron Technology, Inc. located in Boise, Id. as a 1.3 Mega-pixel CMOS color sensor camera. One example of a suitable half-silvered mirror 54 is commercially available from Edmund Industrial Optics located in Barrington, N.J., having part number NT43-817.


The system includes a computer 56 (illustrated as box 56) in communication with the camera 58. The computer 56 processes the information obtained by either the first camera 58, second camera 60 or both cameras 58, 60. Any computer known in the art is suitable to be used in the passport reader 10.



FIGS. 5-7 illustrate three different embodiments of the reader 10. In the first embodiment, which is illustrated in FIG. 5, the reader 10 includes a first camera 58 and a second camera 60. In the second embodiment, which is illustrated in FIG. 6, the reader includes a first moveable camera 58a. The camera 58a may move along a track inside the reader and be powered by a motor. In the third embodiment, which is illustrated in FIG. 7, the camera 58 is stationary, but a holder 38a of the passport 14 is moveable relative to the camera 58. The holder 38a may move along a track on top of the reader and be powered by a motor. The holder 38a preferably includes the glass 40. The three embodiments illustrated in FIGS. 5-7 are arranged so as to provide at least two views of the microlens sheeting 20 and the floating image 30. The images of the microlens sheeting 20 and floating image 30 are captured on the camera image planes 66, 68 and transmitted to the computer 56 for further processing. The first image 70 and second image 72 of the microlens sheeting are depicted graphically by boxes 70 and 72. The first image 74 and second image 76 of the composite floating image 30 are depicted graphically by boxes 74 and 76. The first image 70 and second image 72 of the microlens sheeting are compared by the computer 56. The first image 74 and second image 76 of the floating image 30 are compared by the computer 56. In one exemplary embodiment, the images 70, 72, 74, 76 are measured relative to the centers of the camera planes 66, 68, as discussed in reference to FIG. 8.



FIG. 8 illustrates the optics associated with the embodiments of the system illustrated in FIGS. 5-7. For simplicity, FIG. 8 illustrates a first camera image plane 66 and a second camera image plane 68. In one embodiment, the first image plane 66 may be part of the first camera 58 and the second image plane 68 may be part of a second camera 60, as illustrated in FIG. 5. However, the first image plane 66 may represent one camera 58a in a first position and the second image plane 68 may represent the same camera in a second position, as illustrated in FIG. 6. The optics illustrated in FIG. 8 represent the same relative measurements for the embodiment illustrated in FIG. 7, where the microlens sheeting 20 moves relative to the camera 58. In addition, the optics illustrated in FIG. 8 represent the same measurements whether the composite image 30 is floating above or below the sheeting 20. Preferably, the position of the sheeting is fixed during the first and second pictures of the sheeting 20 by either the first and second cameras 58, 60 or by the single camera 58. Alternatively, the single camera 58 is fixed during the first and second pictures of the sheeting 20 and the sheeting 20 moves from a first position to a second position using the holder 38a. Regardless, the system preferably captures two images of the sheeting 20 and the floating image 30 from two different perspectives.


The measurements illustrated in FIG. 8 are for calculating the distance “p” between the microlens sheeting 20 in the passport 14 and the floating image 30 floating above or below the sheeting, which is useful for authenticating or verifying the sheeting 20. Essentially, the system is comparing the first image and the second image of the microlens sheeting and comparing the first image and second image of the composite image floating above or below the sheeting, so that the images will cancel each other out, except for the floating distance.


The first camera 58 includes a first camera lens 62 and a first camera image plane 66, and the second camera 60 includes a second camera lens 64 and a second camera image plane 68. The first and second cameras 58, 60 both include a focal length "f" of their lenses 62, 64. Preferably, the first and second cameras 58, 60 are similar cameras with the same focal lengths. The first camera image plane 66 has a center point 78. The second camera image plane 68 has a center point 80. The focal length "f" is measured from the center point of the camera image plane to the lens of the camera. The first camera 58 takes a first picture, records or captures a first image of the sheeting 20 and the floating image 30. The second camera 60 takes a second picture, records or captures a second image of the sheeting 20 and the floating image 30. The first image of the microlens sheeting 20 is represented schematically on the first camera image plane 66 as reference number 70. The first image of the floating image 30 is represented schematically on the first camera image plane 66 as reference number 72. The second image of the microlens sheeting 20 is represented schematically on the second camera image plane 68 as reference number 74. The second image of the floating image 30 is represented on the second camera image plane 68 as reference number 76. The lenses 62, 64 of the cameras 58, 60 are preferably orthogonal relative to the microlens sheeting 20.


Distance “a” is the distance between the second image 74 of the microlens sheeting on the camera image plane 68 and the center 80 of the camera image plane 68. Distance “b” is the distance between the second image 76 of the floating image 30 on the camera image plane 68 and the center 80 of the camera image plane 68. Distance “d” is the distance between the first image 72 of the floating image 30 on the camera image plane 66 and the center 78 of the camera image plane 66. Distance “c” is the distance between the first image 70 of the microlens sheeting on the camera image plane 66 and the center 78 of the camera image plane 66. Distance “e” is the known distance between the centers of the lenses 62, 64 of the cameras. Distance “g” is the known orthogonal distance between the lenses 62, 64 of the cameras 58, 60 and the microlens sheeting 20. A relational point other than the center point of the lens could be used with appropriate modification of the math formulas.


As a result, the system can measure distances “a”, “b”, “c”, and “d”. The distances “e”, “f”, and “g” are known distances based on how the reader 10 is built. The floating distance or distance p is the unknown distance. The system calculates distance “p” using the measured distances and known distances as follows:

h/e=f/(d−b) and g/e=f/(c−a)


Dividing h/e by g/e cancels out the distances "e" and "f":










h/e=f/(d−b)

g/e=f/(c−a)

h/g=(c−a)/(d−b)






which provides a calculation for distance “h”:

h=g(c−a)/(d−b)


Now that distance “h” can be calculated, the floating distance “p” can be calculated as follows:

p=g−h
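
Expressed in code, the derivation above reduces to the two formulas h=g(c−a)/(d−b) and p=g−h. The following minimal Python sketch simply restates them; the argument names mirror the measured and known distances defined in reference to FIG. 8.

def floating_distance(c_minus_a: float, d_minus_b: float, g: float) -> float:
    """Perceived floating distance "p" from the measurements of FIG. 8.

    c_minus_a: disparity of the sheeting images between the two image planes
    d_minus_b: disparity of the floating-image images between the two image planes
    g:         known orthogonal distance from the camera lens to the sheeting
    All inputs must be in the same unit; the result is in that same unit.
    """
    h = g * c_minus_a / d_minus_b   # intermediate distance "h" from the formulas above
    return g - h                    # p = g - h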

The example below provides a calculation of an actual floating distance based on the formulas above.


The system's computer 56 calculates the floating distance “p.” Then, the computer can compare the floating distance to the database of floating distances. This enables inspection authorities to identify any anomalies or discrepancies between the data presented by a traveler and data held in databases. If the calculated floating distance matches the floating distance in the database for the identified composite image 30, then the system authenticates the sheeting 20. If the calculated floating distance does not match the floating distances in the database for the identified composite image 30, then the system determines that the sheeting is not authentic.


In the embodiments illustrated in FIGS. 5-8, the system includes at least one camera that takes a first image and a second image of the microlens sheeting 20 having a floating image 30. The camera may move in any direction relative to the sheeting 20 to obtain these first and second images. For instance, the camera may move in the x, y, or z direction relative to the sheeting 20. Alternatively, the camera may rotate around its center of mass relative to the sheeting. In addition, the camera may take multiple images of the sheeting and composite images.


In another alternative embodiment of the reader 10 (not illustrated), the reader may have a single fixed focal-length camera. In this embodiment, the single-focus camera is moveable between a first position and a second position perpendicular to the sheeting 20. The camera moves along a track between the first position and the second position. First, the camera moves until the microlens sheeting 20 comes into full focus, which establishes the first position of the camera. Then the camera captures a first image of the sheeting 20 and the composite image 30. Next, the camera moves until the composite image 30 comes into full focus, which establishes the second position of the camera. In the second position, the camera captures a second image of the microlens sheeting 20 and the composite image 30. The distance between the first camera position and the second camera position corresponds to the perceived distance "p" between the microlens sheeting 20 in the passport 14 and the floating image 30 floating above or below the sheeting or both.
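
One way to realize this focus-based variant in software would be to score image sharpness at each camera position and take the positions of best focus for the sheeting and for the floating image. The patent does not specify a focus metric, so the variance-of-Laplacian measure and the region-of-interest arguments below are assumptions for illustration only.

import numpy as np

def sharpness(gray: np.ndarray) -> float:
    """Variance of a discrete Laplacian; larger values indicate sharper focus.
    (An assumed focus metric; the patent does not name one.)"""
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def floating_distance_by_focus(positions, frames, sheeting_roi, image_roi) -> float:
    """positions: camera positions along the track; frames: grayscale captures at
    those positions; *_roi: (row slice, column slice) windows over the sheeting
    and over the floating image. Returns the separation of the two best-focus
    positions, which corresponds to the floating distance "p"."""
    sheet_scores = [sharpness(f[sheeting_roi].astype(float)) for f in frames]
    float_scores = [sharpness(f[image_roi].astype(float)) for f in frames]
    best_sheet = positions[int(np.argmax(sheet_scores))]
    best_float = positions[int(np.argmax(float_scores))]
    return abs(best_float - best_sheet)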


The reader 10 is capable of locating the floating image 30 and identifying the floating image 30. The camera will first record the floating image 30, and then the computer 56 will compare the recorded floating image 30 with a database of floating images to identify the floating image. The computer 56 preferably includes a template matching program or a normalization correlation matrix, which compares a known image with a recorded image. One example of a normalization correlation is described in Computer Vision by Dana Ballard and Christopher Brown, copyright 1982, published by Prentice-Hall, Inc., pages 65-70, which are hereby incorporated by reference.
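
The normalized correlation relied on by the computer 56 can be sketched in a few lines. The following is a generic zero-mean normalized cross-correlation written from the standard definition, not code taken from the reader or from the Ballard and Brown text; the exhaustive sliding search is deliberately simple and slow, and a production reader would use a faster matcher.

import numpy as np

def normalized_correlation(patch: np.ndarray, template: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation of two equally sized arrays.
    Returns a value in [-1, 1]; values near 1 indicate a close match."""
    p = patch.astype(float) - patch.mean()
    t = template.astype(float) - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def best_match(image: np.ndarray, template: np.ndarray):
    """Slide the template over the image; return (best score, (row, column))."""
    rows, cols = template.shape
    best_score, best_pos = -2.0, (0, 0)
    for r in range(image.shape[0] - rows + 1):
        for c in range(image.shape[1] - cols + 1):
            score = normalized_correlation(image[r:r + rows, c:c + cols], template)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_score, best_pos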


The reader 10 may include radio-frequency identification ("RFID") reading capabilities. For instance, the reader 10 may include the features disclosed in U.S. patent application Ser. No. 10/953,200, "A Passport Reader for Processing a Passport Having an RFID Element," (Jesme), which is hereby incorporated by reference. The system will read and authenticate a variety of different floating images.


In an additional embodiment, the floating distance may vary from one sheeting to another. Optionally, the system reads a security code embedded in the sheeting that contains information relating to the floating distance of that sheeting and authenticates the sheeting only if the calculated floating distance matches the floating distance provided in the security code. Alternatively, the security code is used to retrieve the proper floating distance from a database of floating distances.
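
A minimal sketch of this variant, in which the expected floating distance travels with the sheeting as a security code, follows; the code format, the lookup table, and the tolerance value are assumptions made only for illustration.

def authenticate_with_security_code(security_code: str,
                                    measured_distance_cm: float,
                                    code_to_distance_cm: dict,
                                    tolerance_cm: float = 0.1) -> bool:
    """Accept the sheeting only if the measured floating distance matches the
    distance associated with its embedded security code (assumed encoding)."""
    expected = code_to_distance_cm.get(security_code)
    if expected is None:
        return False                 # unknown code: treat the sheeting as not authentic
    return abs(measured_distance_cm - expected) <= tolerance_cm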


The operation of the present invention will be further described with regard to the following detailed example, which for convenience references the Figures. This example is offered to further illustrate the various specific and preferred embodiments and techniques. It should be understood, however, that many variations and modifications may be made while remaining within the scope of the present invention.


In this example, a single 1.3 Mega-pixel color sensor camera from Micron Semiconductor, located in Boise, Id., and a microlens sheeting with a composite image floating at a known distance of 1 centimeter, +/−1 millimeter, were arranged as depicted in FIG. 6. The camera lens 62 was located at a measured distance of 12.5 centimeters (‘g’ in FIG. 8) from the microlens sheeting 20. The microlens sheeting with the floating image was a sample of 3M™ Confirm™ Security Laminate with Floating Images, which is commercially available from 3M Company located in St. Paul, Minn., as part number ES502.


A first image of the microlens sheeting and of the composite image was captured. The camera was then moved laterally and a second image of the microlens sheeting and the composite image was captured.


The first image of the microlens sheeting and composite image was first used to identify whether the microlens sheeting had a composite image and to verify whether the composite image was the correct image. The computer ran the template matching program, which was based on the normalization correlation matrix disclosed in Computer Vision by Dana Ballard and Christopher Brown, published by Prentice-Hall, Inc., copyright 1982, pages 65-70, which has been incorporated by reference. Using the template matching program, the computer was able to identify at least one of the floating images and verify that the floating image was what was expected.


Distances ‘c−a’ and ‘d−b’ (FIG. 8) were determined by the computer. Since the camera captures the images in discrete pixels and the pixel density of the images formed by the camera is known, i.e., the number of pixels per millimeter is known, the computer can calculate the distances a, b, c and d. The computer calculates ‘a’—the distance between points 72 and 80, ‘b’—the distance between points 76 and 80, ‘c’—the distance between points 70 and 78, and ‘d’—the distance between points 74 and 78—by counting the number of pixels in each respective length, i.e., a, b, c and d, and then converting the number of counted pixels to a length using the image pixel density. For this example, the computer determined that the values for c−a and d−b were 7.6 millimeters and 8.3 millimeters, respectively.
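
The pixel-count conversion described above amounts to dividing the counted pixels by the image-plane pixel density. A minimal sketch follows; the pixel counts and the density of 100 pixels per millimeter are hypothetical numbers chosen only so that the results match the 7.6 mm and 8.3 mm reported above.

def pixels_to_millimeters(pixel_count: int, pixels_per_mm: float) -> float:
    """Convert a length counted in image-plane pixels to millimeters."""
    return pixel_count / pixels_per_mm

# Hypothetical counts and density, for illustration only.
c_minus_a_mm = pixels_to_millimeters(760, 100.0)   # 7.6 mm
d_minus_b_mm = pixels_to_millimeters(830, 100.0)   # 8.3 mm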


With g known and c−a and d−b now determined, h was calculated as follows.

h=g(c−a)/(d−b)=12.5(0.76)/(0.83)=11.45 centimeters


With h now determined and g known, p—the floating height of the composite image—was calculated as follows.

p=g−h=12.5−11.45=1.05 centimeters


As the known floating height of the composite image was 1 centimeter +/−1 millimeter, the measured floating height of 1.05 centimeters was within range. Therefore, the system verified the security laminate with the floating images as an authentic security laminate.
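
Plugging the example's numbers into the floating_distance() sketch given earlier in the discussion of FIG. 8 reproduces the reported result (lengths in centimeters); the 0.1 centimeter tolerance simply mirrors the +/−1 millimeter stated above.

# Assumes the floating_distance() sketch defined earlier.
p = floating_distance(c_minus_a=0.76, d_minus_b=0.83, g=12.5)
print(round(p, 2))             # 1.05 (centimeters)

# Check against the known floating height of 1 centimeter +/- 0.1 centimeter.
print(abs(p - 1.0) <= 0.1)     # True -> the laminate is accepted as authentic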


The tests and test results described above are intended solely to be illustrative, rather than predictive, and variations in the testing procedure can be expected to yield different results.


The present invention has now been described with reference to several embodiments thereof. The foregoing detailed description and example have been given for clarity of understanding only. No unnecessary limitations are to be understood therefrom. All patents and patent applications cited herein are hereby incorporated by reference. It will be apparent to those skilled in the art that many changes can be made in the embodiments described without departing from the scope of the invention. Thus, the scope of the present invention should not be limited to the exact details and structures described herein, but rather by the structures described by the language of the claims, and the equivalents of those structures.

Claims
  • 1. A system for reading and authenticating a composite image in a sheeting, the sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting or both, the system comprising: a reader, comprising: a first camera to capture a first image of the sheeting and a first image of the composite image floating above or below the sheeting or both; a second camera to capture a second image of the sheeting and a second image of the composite image floating above or below the sheeting or both; and a computer for comparing the first image and the second image of the sheeting and for comparing the first image and second image of the composite image floating above or below the sheeting or both to calculate the perceived distance between the sheeting and the composite image floating above or below the sheeting or both.
  • 2. A system for reading and authenticating a composite image in a sheeting, the sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting or both, the system comprising: a reader, comprising: a camera moveable between a first position and a second position, wherein in the first position the camera captures a first image of the sheeting and a first image of the composite image floating above or below the sheeting or both, wherein in the second position the camera captures a second image of the sheeting and a second image of the composite image floating above or below the sheeting or both; and a computer for comparing the first image and the second image of the sheeting and for comparing the first image and second image of the composite image floating above or below the sheeting or both to calculate the perceived distance between the sheeting and the composite image floating above or below the sheeting or both.
  • 3. A system for reading and authenticating a composite image in a sheeting, the sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting or both, the system comprising: a reader, comprising: a camera; and a sheeting holder moveable between a first position and a second position, wherein the microlens sheeting is positioned on the sheeting holder, wherein in the first position the camera captures a first image of the sheeting and a first image of the composite image floating above or below the sheeting or both, wherein in the second position the camera captures a second image of the microlens sheeting and a second image of the composite image floating above or below the sheeting or both; and a computer for comparing the first image and the second image of the sheeting and for comparing the first image and second image of the composite image floating above or below the sheeting or both to calculate the perceived distance between the sheeting and the composite image floating above or below the sheeting or both.
  • 4. A method of reading and authenticating a composite image in a sheeting, comprising the steps of: providing a sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting; recording a first image of the microlens sheeting and recording a first image of the composite image floating above or below the sheeting or both; recording a second image of the microlens sheeting and recording a second image of the composite image floating above or below the sheeting or both; calculating the distance between the sheeting and the composite image floating above or below the sheeting or both by comparing the first image and the second image of the microlens sheeting and by comparing the first image and second image of the composite image floating above or below the sheeting or both.
Parent Case Info

This application is a divisional of Ser. No. 11/002,943, U.S. Pat. No. 7,616,332, filed Dec. 2, 2004, the entire content of which is hereby incorporated herein by reference.

US Referenced Citations (172)
Number Name Date Kind
1905716 Ives Apr 1933 A
1918705 Ives Jul 1933 A
2039648 Ives May 1936 A
2063985 Coffey Dec 1936 A
2279825 Kaszab Apr 1942 A
2326634 Gebhard et al. Aug 1943 A
2500511 Bonnet Mar 1950 A
2622472 Bonnet Dec 1952 A
2833176 Ossoinak May 1958 A
3154872 Nordgren Nov 1964 A
3161509 Howe et al. Dec 1964 A
3306974 Cunnally Feb 1967 A
3357770 Clay Dec 1967 A
3365350 Cahn Jan 1968 A
3442569 Bonnet May 1969 A
3459111 Cooper, Jr. Aug 1969 A
3503315 Montebello Mar 1970 A
3584369 Montebello Jun 1971 A
3607273 Kinney Sep 1971 A
3613539 Dudley Oct 1971 A
3671122 Dudley Jun 1972 A
3676130 Burckhardt et al. Jul 1972 A
3683773 Dudley Aug 1972 A
3706486 de Montebello Dec 1972 A
3751258 Howe et al. Aug 1973 A
3801183 Sevelin et al. Apr 1974 A
4034555 Rosenthal Jul 1977 A
4082426 Brown Apr 1978 A
4099838 Cook et al. Jul 1978 A
4121011 Glover et al. Oct 1978 A
4200875 Galanos Apr 1980 A
4315665 Haines Feb 1982 A
4420527 Conley Dec 1983 A
4424990 White et al. Jan 1984 A
4541727 Rosenthal Sep 1985 A
4541830 Hotta et al. Sep 1985 A
4552442 Street Nov 1985 A
4557590 Winnek Dec 1985 A
4618552 Tanaka et al. Oct 1986 A
4621898 Cohen Nov 1986 A
4629667 Kistner et al. Dec 1986 A
4632895 Patel et al. Dec 1986 A
4634220 Hockert et al. Jan 1987 A
4650283 Orensteen et al. Mar 1987 A
4668063 Street May 1987 A
4688894 Hockert Aug 1987 A
4691993 Porter et al. Sep 1987 A
4700207 Vanier et al. Oct 1987 A
4708920 Orensteen et al. Nov 1987 A
4714656 Bradshaw et al. Dec 1987 A
4732453 de Montebello et al. Mar 1988 A
4743526 Ando et al. May 1988 A
4757350 Street Jul 1988 A
4765656 Becker et al. Aug 1988 A
4772582 DeBoer Sep 1988 A
4775219 Appeldorn et al. Oct 1988 A
4783141 Baba et al. Nov 1988 A
4799739 Newswanger Jan 1989 A
4833124 Lum May 1989 A
4876235 DeBoer Oct 1989 A
4892336 Kaule et al. Jan 1990 A
4917292 Drexler Apr 1990 A
4920039 Fotland et al. Apr 1990 A
4927238 Green et al. May 1990 A
4935335 Fotland Jun 1990 A
5064272 Bailey et al. Nov 1991 A
5091483 Mazurek et al. Feb 1992 A
5105206 Sarraf et al. Apr 1992 A
5169707 Faykish et al. Dec 1992 A
5204160 Rouser Apr 1993 A
5244288 Nagaoka et al. Sep 1993 A
5254390 Lu Oct 1993 A
5279912 Telfer et al. Jan 1994 A
5308737 Bills et al. May 1994 A
5326619 Dower et al. Jul 1994 A
5330799 Sandor et al. Jul 1994 A
5359454 Steenblik et al. Oct 1994 A
5360694 Thien et al. Nov 1994 A
5364740 Fohrenkamm et al. Nov 1994 A
5449597 Sawyer Sep 1995 A
5455689 Taylor et al. Oct 1995 A
5459016 Debe et al. Oct 1995 A
5491045 DeBoer et al. Feb 1996 A
5506300 Ward et al. Apr 1996 A
5514730 Mazurek et al. May 1996 A
5521035 Wolk et al. May 1996 A
5554432 Sandor et al. Sep 1996 A
5589246 Calhoun et al. Dec 1996 A
5594841 Schutz Jan 1997 A
5639580 Morton Jun 1997 A
5642226 Rosenthal Jun 1997 A
5644431 Magee Jul 1997 A
5671089 Allio Sep 1997 A
5680171 Lo et al. Oct 1997 A
5681676 Telfer et al. Oct 1997 A
5685939 Wolk et al. Nov 1997 A
5689372 Morton Nov 1997 A
5706133 Orensteen et al. Jan 1998 A
5712731 Drinkwater et al. Jan 1998 A
5717844 Lo et al. Feb 1998 A
5744291 Ip Apr 1998 A
5757550 Gulick, Jr. May 1998 A
5828488 Ouderkirk et al. Oct 1998 A
5850278 Lo et al. Dec 1998 A
5850580 Taguchi et al. Dec 1998 A
5882774 Jonza et al. Mar 1999 A
5894069 Wen et al. Apr 1999 A
5896230 Goggins Apr 1999 A
5935758 Patel et al. Aug 1999 A
5945249 Patel et al. Aug 1999 A
5986781 Long Nov 1999 A
5994026 DeBoer et al. Nov 1999 A
6028621 Yakubovich Feb 2000 A
6057067 Isberg et al. May 2000 A
6084713 Rosenthal Jul 2000 A
6092465 Agronin Jul 2000 A
6095566 Yamamoto et al. Aug 2000 A
6110645 DeBoer et al. Aug 2000 A
6197474 Niemeyer et al. Mar 2001 B1
6212805 Hill Apr 2001 B1
6222650 Long Apr 2001 B1
6228555 Hoffend, Jr. et al. May 2001 B1
6242152 Staral et al. Jun 2001 B1
6280891 Daniel et al. Aug 2001 B2
6286873 Seder Sep 2001 B1
6288842 Krasa et al. Sep 2001 B1
6291143 Patel et al. Sep 2001 B1
6351537 Dovogodko et al. Feb 2002 B1
6369844 Neumann et al. Apr 2002 B1
6388043 Langer et al. May 2002 B1
6398270 Fukui et al. Jun 2002 B1
6468715 Hoffend, Jr. et al. Oct 2002 B2
6531230 Weber et al. Mar 2003 B1
6552830 Long Apr 2003 B2
6602578 Tompkin et al. Aug 2003 B1
6729655 Dorricott et al. May 2004 B1
6781733 Hira Aug 2004 B1
6791723 Vallmajo et al. Sep 2004 B1
6919892 Cheiky et al. Jul 2005 B1
7054042 Holmes et al. May 2006 B2
7068434 Florczak et al. Jun 2006 B2
7196822 Hu Mar 2007 B2
7246824 Hudson Jul 2007 B2
7253958 Aizenberg et al. Aug 2007 B2
7255909 Mann et al. Aug 2007 B2
7265904 Schilling et al. Sep 2007 B2
7333268 Steenblik et al. Feb 2008 B2
7336422 Dunn et al. Feb 2008 B2
7591415 Jesme Sep 2009 B2
7616332 Kenner Nov 2009 B2
7648744 Kuo et al. Jan 2010 B2
20020054434 Krasa et al. May 2002 A1
20020126396 Dolgoff Sep 2002 A1
20020145807 Nishikawa Oct 2002 A1
20030116630 Carey et al. Jun 2003 A1
20050057812 Raber Mar 2005 A1
20050142468 Blood et al. Jun 2005 A1
20050142469 Blood et al. Jun 2005 A1
20050161512 Jones et al. Jul 2005 A1
20060029753 Kuo et al. Feb 2006 A1
20060129489 Hersch et al. Jun 2006 A1
20060209412 Schilling et al. Sep 2006 A1
20060262411 Dunn et al. Nov 2006 A1
20070081254 Endle et al. Apr 2007 A1
20070132227 Dean Jun 2007 A1
20070196616 Stalder et al. Aug 2007 A1
20070284546 Ryzi et al. Dec 2007 A1
20080023890 Sherman et al. Jan 2008 A1
20080024872 Dunn et al. Jan 2008 A1
20080027199 Mazurek et al. Jan 2008 A1
20080037131 Steenblik et al. Feb 2008 A1
20080130126 Brooks et al. Jun 2008 A1
Foreign Referenced Citations (41)
Number Date Country
2326180 Mar 1999 CA
2 400 894 Aug 2001 CA
198 04 997 Feb 1999 DE
0 175 504 Mar 1986 EP
0 314 134 May 1989 EP
0 363 919 Jan 1990 EP
0 404 004 Dec 1990 EP
0 583 766 Feb 1994 EP
0 658 443 Jun 1995 EP
0 673 785 Sep 1995 EP
0 688 351 Aug 1997 EP
0 655 347 Sep 1997 EP
0 615 860 Aug 1998 EP
1 079 274 Feb 2001 EP
1 130 541 Sep 2001 EP
1 130 541 Sep 2001 EP
03 005 075 Jan 2003 EP
1 308 116 Feb 1973 GB
1 433 025 Apr 1976 GB
2 083 726 Mar 1982 GB
1-181083 Dec 1989 JP
03 068610 Mar 1991 JP
4309583 Nov 1992 JP
6-308895 Nov 1994 JP
7-140571 Jun 1995 JP
7-281327 Oct 1995 JP
10-186276 Jul 1998 JP
11-500236 Jan 1999 JP
01 065153 Mar 1999 JP
01 116917 Apr 2001 JP
WO 8303019 Sep 1983 WO
WO 9526281 Oct 1995 WO
WO 9624867 Aug 1996 WO
WO 9715173 Apr 1997 WO
WO 9746631 Dec 1997 WO
WO 9937949 Jul 1999 WO
WO 9942147 Aug 1999 WO
WO 0222376 Mar 2002 WO
WO 03005075 Jan 2003 WO
WO 03022598 Mar 2003 WO
WO 03061983 Jul 2003 WO
Related Publications (1)
Number Date Country
20090310824 A1 Dec 2009 US
Divisions (1)
Number Date Country
Parent 11002943 Dec 2004 US
Child 12544932 US