The present invention relates to verification of authenticity, and in particular, but not exclusively, to verification of authenticity of an article such as a personal identification (ID) card, vendable product, important document or other item.
Many traditional authentication security systems rely on a process which is difficult for anybody other than the manufacturer to perform, where the difficulty may be imposed by expense of capital equipment, complexity of technical know-how or preferably both. Examples are the provision of a watermark in bank notes and a hologram on credit cards or passports. Unfortunately, criminals are becoming more sophisticated and can reproduce virtually anything that original manufacturers can do.
Because of this, there is a known approach to authentication security systems which relies on creating security tokens using some process governed by laws of nature which results in each token being unique, and more importantly having a unique characteristic that is measurable and can thus be used as a basis for subsequent verification. According to this approach tokens are manufactured and measured in a set way to obtain a unique characteristic. The characteristic can then be stored in a computer database, or otherwise retained. Tokens of this type can be embedded in the carrier article, e.g. a banknote, passport, ID card, important document. Subsequently, the carrier article can be measured again and the measured characteristic compared with the characteristics stored in the database to establish if there is a match.
Security tokens can be used to access information, authorise transactions or many other purposes. However, damaged tokens and imperfect token identification apparatuses can lead to difficulties in carrying out the activities to which the token should provide enablement.
The present invention has been made, at least in part, in consideration of problems and drawbacks of conventional systems.
The present invention has at least in part resulted from the inventor's work on applying authentication techniques using tokens made of magnetic materials, where the uniqueness is provided by unreproducible defects in the magnetic material that affect the token's magnetic response (as detailed in PCT/GB03/03917, Cowburn). As part of this work, magnetic materials were fabricated in barcode format, i.e. as a number of parallel strips. As well as reading the unique magnetic response of the strips by sweeping a magnetic field with a magnetic reader, an optical scanner was built to read the barcodes by scanning a laser beam over the barcode and using contrast from the varying reflectivity of the barcode strips and the article on which they were formed. This information was complementary to the magnetic characteristic, since the barcode was being used to encode a digital signature of the unique magnetic response in a type of well-known self-authentication scheme, for example as also described above for banknotes (see for example, Kravolec “Plastic tag makes foolproof ID”, Technology research news, 2 Oct. 2002).
To the surprise of the inventor, it was discovered when using this optical scanner that the paper background material on which the magnetic chips were supported gave a unique optical response to the scanner. On further investigation, it was established that many other unprepared surfaces, such as surfaces of various types of cardboard and plastic, show the same effect. Moreover, it has been established by the inventor that the unique characteristic arises at least in part from speckle, but also includes non-speckle contributions.
It has thus been discovered that it is possible to gain all the advantages of speckle based techniques without having to use a specially prepared token or specially prepare an article in any other way. In particular, many types of paper, cardboard and plastics have been found to give unique characteristic scattering signals from a coherent light beam, so that unique digital signatures can be obtained from almost any paper document or cardboard packaging item.
The above-described known speckle readers used for security devices appear to be based on illuminating the whole of a token with a laser beam and imaging a significant solid angle portion of the resultant speckle pattern with a CCD (see for example GB 2 221 870 and U.S. Pat. No. 6,584,214), thereby obtaining a speckle pattern image of the token made up of a large array of data points.
The reader used by the inventor does not operate in this manner. It uses four single channel detectors (four simple phototransistors) which are angularly spaced apart to collect only four signal components from the scattered laser beam. The laser beam is focused to a spot covering only a very small part of the surface. Signal is collected from different localised areas on the surface by the four single channel detectors as the spot is scanned over the surface. The characteristic response from the article is thus made up of independent measurements from a large number (typically hundreds or thousands) of different localised areas on the article surface. Although four phototransistors are used, analysis using only data from a single one of the phototransistors shows that a unique characteristic response can be derived from this single channel alone! However, higher security levels are obtained if further ones of the four channels are included in the response.
Viewed from a first aspect, the present invention provides an article identification method. The method can comprise: determining a signature from an article based upon an intrinsic characteristic of the article; and comparing the determined signature to a stored signature. The method can also comprise splitting the determined signature into blocks of contiguous data, and performing a comparison operation between each block and respective blocks of the stored signature. Thus a higher level of granularity in verifying the article can be achieved.
In some embodiments the method can also comprise comparing an attribute of a comparison result from each block comparison to an expected attribute of the block comparison to determine a compensation value for use in determining a comparison result. The method can also comprise determining a similarity result between the determined signature and the stored signature, using the compensation value to adjust the determined signature. Thus an article damaged by stretching or shrinking can be successfully identified. Also, a non-linear signature determination can be accommodated without losing identification accuracy. Thus a variety of physical alignment deviations during a signature generation step can be compensated for to allow a correct comparison result to be achieved.
In some embodiments determining the signature comprises: exposing the value entitlement token to coherent radiation; collecting a set of data points that measure scatter of the coherent radiation from intrinsic structure of the value entitlement token; and determining a signature of the value entitlement token from the set of data points. Thus the intrinsic characteristic can be a surface pattern of a material from which an article is made.
In some embodiments the comparing of an attribute of a comparison result from each block comparison to an expected attribute of the block comparison comprises comparing an actual cross-correlation peak location of a comparison result between a block of the determined signature and the corresponding block of the stored signature to an expected cross correlation peak location to determine the compensation value for use in determining a comparison result. Thus the expected result can be used to work out physical alignment deviations of the article during scanning.
In some embodiments the determining of the compensation value comprises estimating a function of best fit to the cross-correlation peak locations for each of the block comparisons, the function of best fit representing an average deviation from the expected cross-correlation peak locations. Thus an average deviation from the expected position can be used for the compensation. The average deviation may be measured in many ways, and may result in a function of best fit which is one of a variety of functions, which can include a straight line function, an exponential function, a trigonometric function and an x² function.
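The block-comparison compensation described above can be sketched in code. The following is a minimal illustrative sketch, not the claimed implementation: the function name `block_compensation`, the block size and the choice of a straight-line fit are assumptions made for the example. Each block of the determined signature is cross-correlated with the corresponding block of the stored signature, the deviation of the correlation peak from its expected (zero-lag) position is recorded, and a straight line of best fit through those deviations estimates the average stretch/shift.

```python
import numpy as np

def block_compensation(determined, stored, block_size=100):
    """Illustrative sketch only: estimate a linear compensation from the
    deviation of each block's cross-correlation peak from its expected
    (zero-lag) position."""
    n_blocks = min(len(determined), len(stored)) // block_size
    centres, offsets = [], []
    for b in range(n_blocks):
        d = determined[b*block_size:(b+1)*block_size].astype(float)
        s = stored[b*block_size:(b+1)*block_size].astype(float)
        d -= d.mean()
        s -= s.mean()
        xc = np.correlate(d, s, mode="full")
        # for perfectly aligned blocks the peak sits at zero lag,
        # i.e. at index block_size - 1 of the 'full' correlation
        peak_offset = int(np.argmax(xc)) - (block_size - 1)
        centres.append(b*block_size + block_size/2)
        offsets.append(peak_offset)
    # straight-line function of best fit through the peak deviations;
    # other fits (exponential, trigonometric, quadratic) are possible
    slope, intercept = np.polyfit(centres, offsets, 1)
    return slope, intercept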
In some embodiments, the method further comprises comparing the determined signature to a plurality of stored signatures. A closest match result between the determined signature and the plurality of stored signatures can be found. Also, a no-match result can be found if the similarity result determined for the determined signature is lower than a predetermined threshold for each of the stored signatures. Thus an item can be compared to a database of item signatures to determine whether that item is a member of that database. This can be used to determine the authenticity of a variety of articles, such as products, value transfer tokens, value transfer authorisation tokens, entitlement tokens and access tokens.
In some embodiments, the method can also comprise calculating a similarity result for each compared block. In some embodiments the method can also comprise comparing the similarity result for at least one predetermined block to a predetermined similarity threshold and returning a negative comparison result in the event of the similarity result for the at least one predetermined block being below the predetermined similarity threshold, regardless of a similarity result for the signatures as a whole. Thus a critical portion of an article can be identified, and a positive match for that portion, as well as for the signature as a whole, can be required for authenticity verification of the article.
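The critical-block veto can be illustrated with a short sketch. This is an assumption-laden example, not the claimed method: the function name `verify`, the bit-agreement similarity measure and the particular threshold values are all chosen only for illustration.

```python
import numpy as np

def verify(determined, stored, block_size=100,
           critical_blocks=(0,), critical_threshold=0.9,
           overall_threshold=0.75):
    """Illustrative sketch only: per-block bit-agreement similarity, with
    a stricter test on designated 'critical' blocks.  A failure in any
    critical block vetoes the match regardless of the overall score."""
    n_blocks = min(len(determined), len(stored)) // block_size
    sims = []
    for b in range(n_blocks):
        d = determined[b*block_size:(b+1)*block_size]
        s = stored[b*block_size:(b+1)*block_size]
        sims.append(np.mean(d == s))   # fraction of agreeing bits
    for b in critical_blocks:
        if sims[b] < critical_threshold:
            return False, sims         # critical block failed: veto
    return np.mean(sims) >= overall_threshold, sims
```

In use, a signature compared against itself passes, while damage concentrated in a critical block produces a negative result even if the overall similarity would otherwise be acceptable.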
Viewed from a second aspect, the present invention provides a system for identifying an article. The system can comprise a signature determination unit operable to determine a signature from an article based upon an intrinsic characteristic of the article and a comparison unit operable to compare the determined signature to a stored signature. The comparison unit can be operable to split the determined signature into blocks of contiguous data and to perform a comparison operation between each block and respective blocks of the stored signature. Thus a high granularity analysis of the article signature can be performed.
In some embodiments, the comparison unit can be further operable to compare an attribute of a comparison result from each block comparison to an expected attribute of the block comparison to determine a compensation value for use in determining a comparison result. The comparison unit can be further operable to determine a similarity result between the determined signature and the stored signature, using the compensation value to adjust the determined signature. Thus an article damaged by stretching or shrinking can be successfully identified. Also, a non-linear signature determination can be accommodated without losing identification accuracy. Thus a variety of physical alignment deviations during a signature generation step can be compensated for to allow a correct comparison result to be achieved.
In some embodiments, the comparison unit can be operable to calculate a similarity result for each compared block. The comparison unit can also be operable to compare the similarity result for at least one predetermined block to a predetermined similarity threshold and to return a negative comparison result in the event of the similarity result for the at least one predetermined block being below a predetermined similarity threshold, regardless of a similarity result for the signatures as a whole. Thus a critical part of an article can be identified and subjected to a higher level of scrutiny than other areas of an article.
In some embodiments, it is ensured that different ones of the data gathered in relation to the intrinsic property of the article relate to scatter from different parts of the article by providing for movement of the coherent beam relative to the article. The movement may be provided by a motor that moves the beam over an article that is held fixed. The motor could be a servo motor, free running motor, stepper motor or any suitable motor type. Alternatively, the drive could be manual in a low cost reader. For example, the operator could scan the beam over the article by moving a carriage on which the article is mounted across a static beam. The coherent beam cross-section will usually be at least one order of magnitude (preferably at least two) smaller than the projection of the article so that a significant number of independent data points can be collected. A focusing arrangement may be provided for bringing the coherent beam into focus in the article. The focusing arrangement may be configured to bring the coherent beam to an elongate focus, in which case the drive is preferably configured to move the coherent beam over the article in a direction transverse to the major axis of the elongate focus. An elongate focus can conveniently be provided with a cylindrical lens, or equivalent mirror arrangement.
In other embodiments, it can be ensured that different ones of the data points relate to scatter from different parts of the article, in that the detector arrangement includes a plurality of detector channels arranged and configured to sense scatter from respective different parts of the article. This can be achieved with directional detectors, local collection of signal with optical fibres or other measures. With directional detectors or other localised collection of signal, the coherent beam does not need to be focused. Indeed, the coherent beam could be static and illuminate the whole sampling volume. Directional detectors could be implemented by focusing lenses fused to, or otherwise fixed in relation to, the detector elements. Optical fibres may be used in conjunction with microlenses.
It is possible to make a workable reader when the detector arrangement consists of only a single detector channel. Other embodiments use a detector arrangement that comprises a group of detector elements angularly distributed and operable to collect a group of data points for each different part of the reading volume, preferably a small group of a few detector elements. Security enhancement is provided when the signature incorporates a contribution from a comparison between data points of the same group. This comparison may conveniently involve a cross-correlation.
Although a working reader can be made with only one detector channel, there are preferably at least 2 channels. This allows cross-correlations between the detector signals to be made, which is useful for the signal processing associated with determining the signature. It is envisaged that between 2 and 10 detector channels will be suitable for most applications with 2 to 4 currently being considered as the optimum balance between apparatus simplicity and security.
The detector elements are advantageously arranged to lie in a plane intersecting the reading volume, with the detector elements angularly distributed in the plane in relation to the coherent beam axis, preferably with one or more detector elements on either side of the beam axis. However, non-planar detector arrangements are also acceptable.
The use of cross-correlations of the signals obtained from the different detectors has been found to give valuable data for increasing the security levels and also for allowing the signatures to be more reliably reproducible over time. The utility of the cross-correlations is somewhat surprising from a scientific point of view, since speckle patterns are inherently uncorrelated (with the exception of signals from opposed points in the pattern). In other words, for a speckle pattern there will by definition be zero cross-correlation between the signals from the different detectors so long as they are not arranged at equal magnitude angles offset from the excitation location in a common plane intersecting the excitation location. The value of using cross-correlation contributions therefore indicates that an important part of the scatter signal is not speckle. The non-speckle contribution could be viewed as being the result of direct scatter, or a diffuse scattering contribution, from a complex surface, such as paper fibre twists. At present the relative importance of the speckle and non-speckle scatter signal contribution is not clear. However, it is clear from the experiments performed to date that the detectors are not measuring a pure speckle pattern, but a composite signal with speckle and non-speckle components.
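A cross-correlation contribution of the kind described above can be sketched as a normalised zero-lag correlation coefficient between two detector channels. This is an illustrative sketch only; the function name `channel_cross_correlation` and the use of the zero-lag coefficient as the signature component are assumptions for the example.

```python
import numpy as np

def channel_cross_correlation(a, b):
    """Illustrative sketch only: normalised zero-lag cross-correlation
    coefficient between two detector channels, usable as an extra
    signature component alongside the per-channel digitised data."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt(np.sum(a * a) * np.sum(b * b))
    return float(np.sum(a * b) / denom) if denom else 0.0
```

For a pure speckle pattern this coefficient would be near zero for most detector geometries; a measurable non-zero value reflects the non-speckle (diffuse scatter) contribution discussed above.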
Incorporating a cross-correlation component in the signature can also be of benefit for improving security. This is because, even if it is possible using high resolution printing to make an article that reproduces the contrast variations over the surface of the genuine article, this would not be able to match the cross-correlation coefficients obtained by scanning the genuine article.
In one embodiment, the detector channels are made up of discrete detector components in the form of simple phototransistors. Other simple discrete components could be used, such as PIN diodes or photodiodes. Integrated detector components, such as a detector array, could also be used, although this would add to the cost and complexity of the device.
From initial experiments which modify the illumination angle of the laser beam on the article to be scanned, it also seems to be preferable in practice that the laser beam is incident approximately normal to the surface being scanned in order to obtain a characteristic that can be repeatedly measured from the same surface with little change, even when the article is degraded between measurements. At least some known readers use oblique incidence (see GB 2 221 870). Once appreciated, this effect seems obvious, but it is clearly not immediately apparent as evidenced by the design of some prior art speckle readers including that of GB 2 221 870 and indeed the first prototype reader built by the inventor. The inventor's first prototype reader with oblique incidence functioned reasonably well in laboratory conditions, but was quite sensitive to degradation of the paper used as the article. For example, rubbing the paper with fingers was sufficient to cause significant differences to appear upon re-measurement. The second prototype reader used normal incidence and has been found to be robust against degradation of paper by routine handling, and also more severe events such as: passing through various types of printer including a laser printer, passing through a photocopier machine, writing on, printing on, deliberate scorching in an oven, and crushing and reflattening.
It can therefore be advantageous to mount the source so as to direct the coherent beam onto the reading volume so that it will strike an article with near normal incidence. By near normal incidence is meant ±5, 10 or 20 degrees. Alternatively, the beam can be directed to have oblique incidence on the article. This will usually have a negative influence in the case that the beam is scanned over the article.
It is also noted that in the readers described in the detailed description, the detector arrangement is arranged in reflection to detect radiation back scattered from the reading volume. However, if the article is transparent, the detectors could be arranged in transmission.
A signature generator can be operable to access the database of previously recorded signatures and perform a comparison to establish whether the database contains a match to the signature of an article that has been placed in the reading volume. The database may be part of a mass storage device that forms part of the reader apparatus, or may be at a remote location and accessed by the reader through a telecommunications link. The telecommunications link may take any conventional form, including wireless and fixed links, and may be available over the internet. The data acquisition and processing module may be operable, at least in some operational modes, to allow the signature to be added to the database if no match is found.
When using a database, in addition to storing the signature it may also be useful to associate that signature in the database with other information about the article such as a scanned copy of the document, a photograph of a passport holder, details on the place and time of manufacture of the product, or details on the intended sales destination of vendable goods (e.g. to track grey importation).
The invention allows identification of articles made of a variety of different kinds of materials, such as paper, cardboard and plastic.
By intrinsic structure we mean structure that the article inherently will have by virtue of its manufacture, thereby distinguishing over structure specifically provided for security purposes, such as structure given by tokens or artificial fibres incorporated in the article.
By paper or cardboard we mean any article made from wood pulp or equivalent fibre process. The paper or cardboard may be treated with coatings or impregnations or covered with transparent material, such as cellophane. If long-term stability of the surface is a particular concern, the paper may be treated with an acrylic spray-on transparent coating, for example.
Data points can thus be collected as a function of position of illumination by the coherent beam. This can be achieved either by scanning a localised coherent beam over the article, or by using directional detectors to collect scattered light from different parts of the article, or by a combination of both.
The signature is envisaged to be a digital signature in most applications. Typical sizes of the digital signature with current technology would be in the range 200 bits to 8 k bits, where currently it is preferable to have a digital signature size of about 2 k bits for high security.
A further implementation of the invention can be performed without storing the digital signatures in a database, but rather by labelling the entitlement token with a label derived from the signature, wherein the label conforms to a machine-readable encoding protocol.
Specific embodiments of the present invention will now be described by way of example only with reference to the accompanying figures in which:
b shows an example of cross-correlation data gathered from a scan where the scanned article is distorted;
While the invention is susceptible to various modifications and alternative forms, specific embodiments are shown by way of example in the drawings and are herein described in detail. It should be understood, however, that drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the invention is to cover all modifications, equivalents and alternatives falling within the scope of the present invention as defined by the appended claims.
For providing security and authorisation services in environments such as an e-commerce environment, a system for uniquely identifying a physical item can be used to reduce possibilities for fraud, and to enhance both actual and perceived reliability of the e-commerce system, for both provider and end-users.
Examples of systems suitable for performing such item identification will now be described with reference to
Generally it is desirable that the depth of focus is large, so that any differences in the article positioning in the z direction do not result in significant changes in the size of the beam in the plane of the reading aperture. In the present example, the depth of focus is approximately 0.5 mm, which is sufficiently large to produce good results where the position of the article relative to the scanner can be controlled to some extent. The parameters of depth of focus, numerical aperture and working distance are interdependent, resulting in a well-known trade-off between spot size and depth of focus.
A drive motor 22 is arranged in the housing 12 for providing linear motion of the optics subassembly 20 via suitable bearings 24 or other means, as indicated by the arrows 26. The drive motor 22 thus serves to move the coherent beam linearly in the x direction over the reading aperture 10 so that the beam 15 is scanned in a direction transverse to the major axis of the elongate focus. Since the coherent beam 15 is dimensioned at its focus to have a cross-section in the xz plane (plane of the drawing) that is much smaller than a projection of the reading volume in a plane normal to the coherent beam, i.e. in the plane of the housing wall in which the reading aperture is set, a scan of the drive motor 22 will cause the coherent beam 15 to sample many different parts of the reading volume under action of the drive motor 22.
Also illustrated schematically are optional distance marks 28 formed on the underside of the housing 12 adjacent the slit 10 along the x direction, i.e. the scan direction. An example spacing between the marks in the x-direction is 300 micrometres. These marks are sampled by a tail of the elongate focus and provide for linearisation of the data in the x direction in situations where such linearisation is required, as is described in more detail further below. The measurement is performed by an additional phototransistor 19 which is a directional detector arranged to collect light from the area of the marks 28 adjacent the slit.
In alternative examples, the marks 28 can be read by a dedicated encoder emitter/detector module 19 that is part of the optics subassembly 20. Encoder emitter/detector modules are used in bar code readers. In one example, an Agilent HEDS-1500 module that is based on a focused light emitting diode (LED) and photodetector can be used. The module signal is fed into the PIC ADC as an extra detector channel (see discussion of
With an example minor dimension of the focus of 40 micrometers, and a scan length in the x direction of 2 cm, n=500, giving 2000 data points with k=4. A typical range of values for k×n depending on desired security level, article type, number of detector channels ‘k’ and other factors is expected to be 100<k×n<10,000. It has also been found that increasing the number of detectors k also improves the insensitivity of the measurements to surface degradation of the article through handling, printing etc. In practice, with the prototypes used to date, a rule of thumb is that the total number of independent data points, i.e. k×n, should be 500 or more to give an acceptably high security level with a wide variety of surfaces. Other minima (either higher or lower) may apply where a scanner is intended for use with only one specific surface type or group of surface types.
In some examples, the PC 34 can have access through an interface connection 38 to a database (dB) 40. The database 40 may be resident on the PC 34 in memory, or stored on a drive thereof. Alternatively, the database 40 may be remote from the PC 34 and accessed by wireless communication, for example using mobile telephony services or a wireless local area network (LAN) in combination with the internet. Moreover, the database 40 may be stored locally on the PC 34, but periodically downloaded from a remote source. The database may be administered by a remote entity, which entity may provide access to only a part of the total database to the particular PC 34, and/or may limit access to the database on the basis of a security policy.
The database 40 can contain a library of previously recorded signatures. The PC 34 can be programmed so that in use it can access the database 40 and perform a comparison to establish whether the database 40 contains a match to the signature of the article that has been placed in the reading volume. The PC 34 can also be programmed to allow a signature to be added to the database if no match is found.
The way in which data flow between the PC and database is handled can be dependent upon the location of the PC and the relationship between the operator of the PC and the operator of the database. For example, if the PC and reader are being used to confirm the authenticity of an article, then the PC will not need to be able to add new articles to the database, and may in fact not directly access the database, but instead provide the signature to the database for comparison. In this arrangement the database may provide an authenticity result to the PC to indicate whether the article is authentic. On the other hand, if the PC and reader are being used to record or validate an item within the database, then the signature can be provided to the database for storage therein, and no comparison may be needed. In this situation a comparison could be performed however, to avoid a single item being entered into the database twice.
Thus there has now been described an example of a scanning and signature generation apparatus suitable for use in a security mechanism for remote verification of article authenticity. Such a system can be deployed to allow an article to be scanned in more than one location, and for a check to be performed to ensure that the article is the same article in both instances, and optionally for a check to be performed to ensure that the article has not been tampered with between initial and subsequent scannings.
Thus there has now been described an apparatus suitable for scanning articles in an automated feeder type device. Depending upon the physical arrangement of the feed arrangement, the scanner may be able to scan one or more single sheets of material, joined sheets of material or three-dimensional items such as packaging cartons.
As shown in
Thus there have now been described an arrangement for manually initiated scanning of an article. This could be used for scanning bank cards and/or credit cards. Thereby a card could be scanned at a terminal where that card is presented for use, and a signature taken from the card could be compared to a stored signature for the card to check the authenticity and un-tampered nature of the card. Such a device could also be used, for example in the context of reading a military-style metal ID-tag (which tags are often also carried by allergy sufferers to alert others to their allergy). This could enable medical personnel treating a patient to ensure that the patient being treated was in fact the correct bearer of the tag. Likewise, in a casualty situation, a recovered tag could be scanned for authenticity to ensure that a casualty has been correctly identified before informing family and/or colleagues.
The above-described examples are based on localised excitation with a coherent light beam of small cross-section in combination with detectors that accept light signal scattered over a much larger area that includes the local area of excitation. It is possible to design a functionally equivalent optical system which is instead based on directional detectors that collect light only from localised areas in combination with excitation of a much larger area.
A hybrid system with a combination of localised excitation and localised detection may also be useful in some cases.
Having now described the principal structural components and functional components of various reader apparatuses, the numerical processing used to determine a signature will now be described. It will be understood that this numerical processing can be implemented for the most part in a computer program that runs on the PC 34 with some elements subordinated to the PIC 30. In alternative examples, the numerical processing could be performed by a dedicated numerical processing device or devices in hardware or firmware.
In other words, it can be essentially pointless to go to the effort and expense of making specially prepared tokens, when unique characteristics are measurable in a straightforward manner from a wide variety of every day articles. The data collection and numerical processing of a scatter signal that takes advantage of the natural structure of an article's surface (or interior in the case of transmission) is now described.
Step S1 is a data acquisition step during which the optical intensity at each of the photodetectors is acquired approximately every 1 ms during the entire length of scan. Simultaneously, the encoder signal is acquired as a function of time. It is noted that if the scan motor has a high degree of linearisation accuracy (e.g. as would a stepper motor) then linearisation of the data may not be required. The data is acquired by the PIC 30 taking data from the ADC 31. The data points are transferred in real time from the PIC 30 to the PC 34. Alternatively, the data points could be stored in memory in the PIC 30 and then passed to the PC 34 at the end of a scan. The number of data points per detector channel collected in each scan is defined as N in the following. Further, the value ak(i) is defined as the i-th stored intensity value from photodetector k, where i runs from 1 to N. Examples of two raw data sets obtained from such a scan are illustrated in
Step S2 uses numerical interpolation to locally expand and contract ak(i) so that the encoder transitions are evenly spaced in time. This corrects for local variations in the motor speed. This step can be performed in the PC 34 by a computer program.
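The linearisation of Step S2 can be sketched as follows (an illustrative Python sketch, not part of the described apparatus; the function and variable names are invented for illustration): intensity samples taken at the non-uniform positions implied by the encoder transitions are linearly interpolated onto an evenly spaced grid spanning the same range.

```python
def linearise(samples, positions):
    """Linearly interpolate 'samples' (recorded at non-uniform 'positions',
    e.g. encoder transition times) onto a uniform grid of the same length."""
    n = len(samples)
    uniform = [positions[0] + (positions[-1] - positions[0]) * i / (n - 1)
               for i in range(n)]
    out = []
    j = 0
    for x in uniform:
        # advance to the segment of the non-uniform grid containing x
        while j < n - 2 and positions[j + 1] < x:
            j += 1
        x0, x1 = positions[j], positions[j + 1]
        y0, y1 = samples[j], samples[j + 1]
        t = 0.0 if x1 == x0 else (x - x0) / (x1 - x0)
        out.append(y0 + t * (y1 - y0))
    return out
```

A motor that slows down mid-scan compresses the later samples; resampling onto the uniform grid locally expands and contracts the data as the step describes.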
Step S3 is an optional step. If performed, this step numerically differentiates the data with respect to time. It may also be desirable to apply a weak smoothing function to the data. Differentiation may be useful for highly structured surfaces, as it serves to attenuate uncorrelated contributions from the signal relative to correlated (speckle) contributions.
Step S4 is a step in which, for each photodetector, the mean of the recorded signal is taken over the N data points. For each photodetector, this mean value is subtracted from all of the data points so that the data are distributed about zero intensity. Reference is made to
Step S5 digitises the analogue photodetector data to compute a digital signature representative of the scan. The digital signature is obtained by applying the rule: ak(i)>0 maps onto binary ‘1’ and ak(i)<=0 maps onto binary ‘0’. The digitised data set is defined as dk(i) where i runs from 1 to N. The signature of the article may incorporate further components in addition to the digitised signature of the intensity data just described. These further optional signature components are now described.
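Steps S4 and S5 together amount to a per-channel mean subtraction followed by a zero threshold, which might be sketched as follows (illustrative Python; the function name `digitise` is invented):

```python
def digitise(channel):
    """Sketch of Steps S4-S5: subtract the channel mean so the data are
    distributed about zero, then apply the rule value > 0 -> 1, else 0."""
    mean = sum(channel) / len(channel)
    centred = [a - mean for a in channel]
    return [1 if a > 0 else 0 for a in centred]
```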
Step S6 is an optional step in which a smaller ‘thumbnail’ digital signature is created. This is done either by averaging together adjacent groups of m readings, or more preferably by picking every cth data point, where c is the compression factor of the thumbnail. The latter is preferred since averaging may disproportionately amplify noise. The same digitisation rule used in Step S5 is then applied to the reduced data set. The thumbnail digitisation is defined as tk(i), where i runs from 1 to N/c.
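The preferred decimation approach of Step S6 might be sketched as follows (illustrative; here the zero threshold is taken about the mean of the reduced data set, one possible reading of applying the Step S5 rule to the reduced set):

```python
def thumbnail(channel, c):
    """Sketch of Step S6: pick every c-th data point (c is the compression
    factor), then digitise the reduced set about its mean."""
    reduced = channel[::c]
    mean = sum(reduced) / len(reduced)
    return [1 if a > mean else 0 for a in reduced]
```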
Step S7 is an optional step applicable when multiple detector channels exist. The additional component is a cross-correlation component calculated between the intensity data obtained from different ones of the photodetectors. With 2 channels there is one possible cross-correlation coefficient, with 3 channels up to 3, and with 4 channels up to 6, etc. The cross-correlation coefficients are useful, since it has been found that they are good indicators of material type. For example, for a particular type of document, such as a passport of a given type, or laser printer paper, the cross-correlation coefficients always appear to lie in predictable ranges. A normalised cross-correlation can be calculated between ak(i) and al(i), where k≠l and k,l vary across all of the photodetector channel numbers. The normalised cross-correlation function Γ(k,l) is defined as

Γ(k,l) = Σi ak(i) al(i) / √(Σi ak(i)^2 · Σi al(i)^2)

where the sums run over i from 1 to N.
Another aspect of the cross-correlation function that can be stored for use in later verification is the width of the peak in the cross-correlation function, for example the full width half maximum (FWHM). The use of the cross-correlation coefficients in verification processing is described further below.
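A zero-offset normalised cross-correlation coefficient of the kind described in Step S7 could be computed as follows (illustrative sketch; mean-subtracted channel data, as produced by Step S4, is assumed, so the coefficient lies in [-1, 1]):

```python
import math

def cross_correlation(ak, al):
    """Normalised cross-correlation coefficient between two
    mean-subtracted detector channels ak and al."""
    num = sum(x * y for x, y in zip(ak, al))
    den = math.sqrt(sum(x * x for x in ak) * sum(y * y for y in al))
    return num / den if den else 0.0
```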
Step S8 is another optional step which is to compute a simple intensity average value indicative of the signal intensity distribution. This may be an overall average of each of the mean values for the different detectors or an average for each detector, such as a root mean square (rms) value of ak(i). If the detectors are arranged in pairs either side of normal incidence as in the reader described above, an average for each pair of detectors may be used. The intensity value has been found to be a good crude filter for material type, since it is a simple indication of overall reflectivity and roughness of the sample. For example, one can use as the intensity value the unnormalised rms value after removal of the average value, i.e. the DC background.
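The per-channel intensity value of Step S8 (the unnormalised rms after removal of the DC background) might be computed as follows (illustrative sketch):

```python
import math

def rms_intensity(channel):
    """Sketch of Step S8: rms of a channel after subtracting the mean
    (the DC background), used as a crude material-type filter."""
    mean = sum(channel) / len(channel)
    return math.sqrt(sum((a - mean) ** 2 for a in channel) / len(channel))
```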
The signature data obtained from scanning an article can be compared against records held in a signature database for verification purposes and/or written to the database to add a new record of the signature to extend the existing database.
A new database record will include the digital signature obtained in Step S5. This can optionally be supplemented by one or more of its smaller thumbnail version obtained in Step S6 for each photodetector channel, the cross-correlation coefficients obtained in Step S7 and the average value(s) obtained in Step S8. Alternatively, the thumbnails may be stored on a separate database of their own optimised for rapid searching, and the rest of the data (including the thumbnails) on a main database.
In a simple implementation, the database could simply be searched to find a match based on the full set of signature data. However, to speed up the verification process, the process can use the smaller thumbnails and pre-screening based on the computed average values and cross-correlation coefficients as now described.
Verification Step V1 is the first step of the verification process, which is to scan an article according to the process described above, i.e. to perform Scan Steps S1 to S8.
Verification Step V2 takes each of the thumbnail entries and evaluates the number of matching bits between it and tk(i+j), where j is a bit offset which is varied to compensate for errors in placement of the scanned area. For each entry, the value of j giving the maximum number of matching bits is determined; the thumbnail entry with the greatest number of matching bits is then selected. This is the ‘hit’ used for further processing.
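The offset search of Verification Step V2 can be sketched as follows (illustrative Python; `max_offset` bounds the placement error considered, and the function returns both the best bit-match count and the offset j that achieved it):

```python
def best_thumbnail_match(scanned, stored, max_offset):
    """Count matching bits between a scanned thumbnail and a stored one,
    trying offsets j in [-max_offset, max_offset]; bits shifted outside
    the scanned data are simply not counted."""
    best_score, best_j = -1, 0
    for j in range(-max_offset, max_offset + 1):
        score = 0
        for i in range(len(stored)):
            if 0 <= i + j < len(scanned) and scanned[i + j] == stored[i]:
                score += 1
        if score > best_score:
            best_score, best_j = score, j
    return best_score, best_j
```

Running this against every stored thumbnail and keeping the entry with the highest score yields the ‘hit’ described above.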
Verification Step V3 is an optional pre-screening test that is performed before analysing the full digital signature stored for the record against the scanned digital signature. In this pre-screen, the rms values obtained in Scan Step S8 are compared against the corresponding stored values in the database record of the hit. The ‘hit’ is rejected from further processing if the respective average values do not agree within a predefined range. The article is then rejected as non-verified (i.e. jump to Verification Step V6 and issue fail result).
Verification Step V4 is a further optional pre-screening test that is performed before analysing the full digital signature. In this pre-screen, the cross-correlation coefficients obtained in Scan Step S7 are compared against the corresponding stored values in the database record of the hit. The ‘hit’ is rejected from further processing if the respective cross-correlation coefficients do not agree within a predefined range. The article is then rejected as non-verified (i.e. jump to Verification Step V6 and issue fail result).
Another check using the cross-correlation coefficients that could be performed in Verification Step V4 is to check the width of the peak in the cross-correlation function, where the cross-correlation function is evaluated between the values stored from the original scan in Scan Step S7 above and the re-scanned values:

Γk(j) = Σi ak(i) akdb(i+j) / √(Σi ak(i)^2 · Σi akdb(i)^2)

where akdb(i) denotes the intensity data stored from the original scan.
If the width of the re-scanned peak is significantly higher than the width of the original scan, this may be taken as an indicator that the re-scanned article has been tampered with or is otherwise suspicious. For example, this check should beat a fraudster who attempts to fool the system by printing a bar code or other pattern with the same intensity variations that are expected by the photodetectors from the surface being scanned.
Verification Step V5 is the main comparison between the scanned digital signature obtained in Scan Step S5 and the corresponding stored values in the database record of the hit. The full stored digitised signature, dkdb(i), is split into n blocks of q adjacent bits on k detector channels, i.e. there are qk bits per block. A typical value for q is 4 and a typical value for k is 4, making typically 16 bits per block. The qk bits are then matched against the qk corresponding bits in the scanned digital signature dk(i+j). If the number of matching bits within the block is greater than or equal to some pre-defined threshold zthresh, then the number of matching blocks is incremented. A typical value for zthresh is 13. This is repeated for all n blocks. This whole process is repeated for different offset values of j, to compensate for errors in placement of the scanned area, until a maximum number of matching blocks is found. Defining M as the maximum number of matching blocks, the probability of an accidental match is calculated by evaluating:

p(M) = Σ (from r=M to n) C(n,r) s^r (1−s)^(n−r)

where C(n,r) is the binomial coefficient,
s is the probability of an accidental match between any two blocks (which in turn depends upon the chosen value of zthresh), M is the number of matching blocks and p(M) is the probability of M or more blocks matching accidentally. The value of s is determined by comparing blocks within the database from scans of different objects of similar materials, e.g. a number of scans of paper documents etc. For the case of q=4, k=4 and zthresh=13, a typical value of s is 0.1. If the qk bits were entirely independent, then probability theory would give s=0.01 for zthresh=13. The fact that a higher value is found empirically is because of correlations between the k detector channels and also correlations between adjacent bits in the block due to a finite laser spot width. A typical scan of a piece of paper yields around 314 matching blocks out of a total number of 510 blocks, when compared against the database entry for that piece of paper. Setting M=314, n=510, s=0.1 in the above equation gives a probability of an accidental match of 10^−177.
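The block matching of Verification Step V5 and the accidental-match probability p(M) might be sketched as follows (illustrative Python for a single offset j; `math.comb` supplies the binomial coefficient):

```python
from math import comb

def matching_blocks(scanned_bits, stored_bits, block_size, z_thresh):
    """Count blocks in which at least z_thresh of the block_size bits agree.
    Returns (M, n): matching blocks and total blocks."""
    n = len(stored_bits) // block_size
    m = 0
    for b in range(n):
        lo, hi = b * block_size, (b + 1) * block_size
        agree = sum(1 for x, y in zip(scanned_bits[lo:hi],
                                      stored_bits[lo:hi]) if x == y)
        if agree >= z_thresh:
            m += 1
    return m, n

def accidental_match_probability(m, n, s):
    """Binomial tail: probability of m or more of n blocks matching by
    accident, given per-block accidental match probability s."""
    return sum(comb(n, r) * s ** r * (1 - s) ** (n - r)
               for r in range(m, n + 1))
```

In the full procedure this counting is repeated over a range of offsets j and the maximum M is kept before evaluating the probability.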
Verification Step V6 issues a result of the verification process. The probability result obtained in Verification Step V5 may be used in a pass/fail test in which the benchmark is a pre-defined probability threshold. In this case the probability threshold may be set at a level by the system, or may be a variable parameter set at a level chosen by the user. Alternatively, the probability result may be output to the user as a confidence level, either in raw form as the probability itself, or in a modified form using relative terms (e.g. no match/poor match/good match/excellent match) or other classification.
It will be appreciated that many variations are possible. For example, instead of treating the cross-correlation coefficients as a pre-screen component, they could be treated together with the digitised intensity data as part of the main signature. For example the cross-correlation coefficients could be digitised and added to the digitised intensity data. The cross-correlation coefficients could also be digitised on their own and used to generate bit strings or the like which could then be searched in the same way as described above for the thumbnails of the digitised intensity data in order to find the hits.
Thus there have now been described a number of example arrangements for scanning an article to obtain a signature based upon intrinsic properties of that article. There have also been described examples of how that signature can be generated from the data collected during the scan, and how the signature can be compared to a later scan from the same or a different article to provide a measure of how likely it is that the same article has been scanned in the later scan.
Such a system has many applications, amongst which are security and confidence screening of items for fraud prevention and item traceability.
In some examples, the method for extracting a signature from a scanned article can be optimised to provide reliable recognition of an article despite deformations to that article caused by, for example, stretching or shrinkage. Such stretching or shrinkage of an article may be caused by, for example, water damage to a paper or cardboard based article.
Also, an article may appear to a scanner to be stretched or shrunk if the relative speed of the article to the sensors in the scanner is non-linear. This may occur if, for example the article is being moved along a conveyor system, or if the article is being moved through a scanner by a human holding the article. An example of a likely scenario for this to occur is where a human scans, for example, a bank card using a scanner such as that described with reference to
As described above, where a scanner is based upon a scan head which moves within the scanner unit relative to an article held stationary against or in the scanner, then linearisation guidance can be provided by the optional distance marks 28 to address any non-linearities in the motion of the scan head. Where the article is moved by a human, these non-linearities can be greatly exaggerated.
To address recognition problems which could be caused by these non-linear effects, it is possible to adjust the analysis phase of a scan of an article. Thus a modified validation procedure will now be described with reference to
The process carried out in accordance with
As shown in
For each of the blocks, a cross-correlation is performed against the equivalent block for each stored signature with which it is intended that article be compared at step S23. This can be performed using a thumbnail approach with one thumbnail for each block. The results of these cross-correlation calculations are then analysed to identify the location of the cross-correlation peak. The location of the cross-correlation peak is then compared at step S24 to the expected location of the peak for the case where a perfectly linear relationship exists between the original and later scans of the article.
This relationship can be represented graphically as shown in
In the example of
In the example of
A variety of functions can be test-fitted to the plot of points of the cross-correlation peaks to find a best-fitting function. Thus curves to account for stretch, shrinkage, misalignment, acceleration, deceleration, and combinations thereof can be used. Examples of suitable functions include straight line functions, exponential functions, trigonometric functions, and polynomial functions such as x² and x³ functions.
Once a best-fitting function has been identified at step S25, a set of change parameters can be determined which represent how much each cross-correlation peak is shifted from its expected position at step S26. These compensation parameters can then, at step S27, be applied to the data from the scan taken at step S21 in order substantially to reverse the effects of the shrinkage, stretch, misalignment, acceleration or deceleration on the data from the scan. As will be appreciated, the better the best-fit function obtained at step S25 fits the scan data, the better the compensation effect will be.
The compensated scan data is then broken into contiguous blocks at step S28 as in step S22. The blocks are then individually cross-correlated with the respective blocks of data from the stored signature at step S29 to obtain the cross-correlation coefficients. This time the magnitudes of the cross-correlation peaks are analysed to determine the uniqueness factor at step S29. Thus it can be determined whether the scanned article is the same as the article which was scanned when the stored signature was created.
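The compensation procedure of steps S22 to S27 might be sketched, under simplifying assumptions (integer shifts only, and a straight-line best fit corresponding to uniform stretch plus misalignment), as follows (illustrative Python; all names are invented):

```python
def block_peak_offset(block, ref, max_shift):
    """Locate the cross-correlation peak of a scanned block against the
    corresponding reference block: the integer shift j maximising the
    correlation. Mean-subtracted data is assumed."""
    best_c, best_j = float("-inf"), 0
    for j in range(-max_shift, max_shift + 1):
        c = sum(block[i + j] * ref[i]
                for i in range(len(ref)) if 0 <= i + j < len(block))
        if c > best_c:
            best_c, best_j = c, j
    return best_j

def fit_line(xs, ys):
    """Least-squares straight-line fit to the per-block peak offsets:
    the slope captures stretch/shrinkage, the intercept misalignment."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    slope = num / den if den else 0.0
    return slope, my - slope * mx
```

The fitted function evaluated at each block position gives the compensation parameters of step S26, which are then applied to shift the scan data back into registration before the block-wise comparison of steps S28 and S29.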
Accordingly, there has now been described an example of a method for compensating for physical deformations in a scanned article, and for non-linearities in the motion of the article relative to the scanner. Using this method, a scanned article can be checked against a stored signature for that article obtained from an earlier scan of the article to determine with a high level of certainty whether or not the same article is present at the later scan. Thereby an article constructed from easily distorted material can be reliably recognised. Also, a scanner where the motion of the scanner relative to the article may be non-linear can be used, thereby allowing the use of a low-cost scanner without motion control elements.
Another characteristic of an article which can be detected using a block-wise analysis of a signature generated based upon an intrinsic property of that article is that of localised damage to the article. For example, such a technique can be used to detect modifications to an article made after an initial record scan.
For example, many documents, such as passports, ID cards and driving licenses, include photographs of the bearer. If an authenticity scan of such an article includes a portion of the photograph, then any alteration made to that photograph will be detected. Taking an arbitrary example of splitting a signature into 10 blocks, three of those blocks may cover a photograph on a document and the other seven cover another part of the document, such as a background material. If the photograph is replaced, then a subsequent rescan of the document can be expected to provide a good match for the seven blocks where no modification has occurred, but the replaced photograph will provide a very poor match. By knowing that those three blocks correspond to the photograph, the fact that all three provide a very poor match can be used to automatically fail the validation of the document, regardless of the average score over the whole signature.
Also, many documents include written indications of one or more persons, for example the name of a person identified by a passport, driving licence or identity card, or the name of a bank account holder. Many documents also include a place where the written signature of a bearer or certifier is applied. Using a block-wise analysis of a signature obtained therefrom for validation can detect a modification to alter a name or other important word or number printed or written onto a document. A block which corresponds to the position of an altered printing or writing can be expected to produce a much lower quality match than blocks where no modification has taken place. Thus a modified name or written signature can be detected and the document failed in a validation test even if the overall match of the document is sufficiently high to obtain a pass result.
An example of an identity card 300 is shown in
The area and elements selected for the scan area can depend upon a number of factors, including the element of the document which it is most likely that a fraudster would attempt to alter. For example, for any document including a photograph the most likely alteration target will usually be the photograph as this visually identifies the bearer. Thus a scan area for such a document might beneficially be selected to include a portion of the photograph. Another element which may be subjected to fraudulent modification is the bearer's signature, as it is easy for a person to pretend to have a name other than their own, but harder to copy another person's signature. Therefore for signed documents, particularly those not including a photograph, a scan area may beneficially include a portion of a signature on the document.
In the general case therefore, it can be seen that a test for authenticity of an article can comprise a test for a sufficiently high quality match between a verification signature and a record signature for the whole of the signature, and a sufficiently high match over at least selected blocks of the signatures. Thus regions important to assessing the authenticity of an article can be selected as being critical to achieving a positive authenticity result.
In some examples, blocks other than those selected as critical blocks may be allowed to present a poor match result. Thus a document may be accepted as authentic despite being torn or otherwise damaged in parts, so long as the critical blocks provide a good match and the signature as a whole provides a good match.
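The combined whole-signature and critical-block test might be sketched as follows (illustrative; the per-block match scores and the threshold values are hypothetical parameters):

```python
def validate(block_scores, critical, overall_threshold, block_threshold):
    """Pass only if the average match over all blocks is high enough AND
    every block designated critical (e.g. those covering a photograph or
    written signature) individually matches well. Non-critical blocks
    may score poorly (e.g. due to a tear) without failing the article."""
    overall = sum(block_scores) / len(block_scores)
    if overall < overall_threshold:
        return False
    return all(block_scores[b] >= block_threshold for b in critical)
```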
Thus there have now been described a number of examples of a system, method and apparatus for identifying localised damage to an article, and for rejecting an inauthentic article with localised damage or alteration in predetermined regions thereof. Damage or alteration in other regions may be ignored, thereby allowing the document to be recognised as authentic.
In some scanner apparatuses, it is also possible that it may be difficult to determine where a scanned region starts and finishes. Of the examples discussed above, this is most problematic for the example of
In this example, the scan head is operational prior to the application of the article to the scanner. Thus initially the scan head receives data corresponding to the unoccupied space in front of the scan head. As the article is passed in front of the scan head, the data received by the scan head immediately changes to be data describing the article. Thus the data can be monitored to determine where the article starts and all data prior to that can be discarded. The position and length of the scan area relative to the article leading edge can be determined in a number of ways. The simplest is to make the scan area the entire length of the article, such that the end can be detected by the scan head again picking up data corresponding to free space. Another method is to start and/or stop the recorded data a predetermined number of scan readings from the leading edge. Assuming that the article always moves past the scan head at approximately the same speed, this would result in a consistent scan area. Another alternative is to use actual marks on the article to start and stop the scan region, although this may require more work, in terms of data processing, to determine which captured data corresponds to the scan area and which data can be discarded.
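The leading-edge detection just described might be sketched as follows (illustrative; `background` and `threshold` are hypothetical calibration values for the free-space reading):

```python
def trim_to_article(samples, background, threshold):
    """With the scan head running continuously, free space produces
    readings near a known background level; the article's leading edge is
    taken as the first sample departing from background by more than
    'threshold', and all earlier samples are discarded."""
    for i, s in enumerate(samples):
        if abs(s - background) > threshold:
            return samples[i:]
    return []  # article never seen
```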
Thus there has now been described a number of techniques for scanning an item to gather data based on an intrinsic property of the article, compensating if necessary for damage to the article or non-linearities in the scanning process, and comparing the article to a stored signature based upon a previous scan of an article to determine whether the same article is present for both scans.
When using a biometric technique such as the identity technique described with reference to
Examples of systems, methods and apparatuses for addressing these difficulties will now be described. First, with reference to
As shown in
In some examples, further read heads can be used, such that three, four or more signatures are created for each item. Each scan head can be offset from the others in order to provide signatures from positions adjacent the intended scan location. Thus greater robustness to article misalignment on verification scanning can be provided.
The offset between scan heads can be selected dependent upon factors such as the width of the scanned portion of the article, the size of the scanned area relative to the total article size, the likely misalignment amount during verification scanning, and the article material.
Thus there has now been described a system for scanning an article to create a signature database against which an article can be checked to verify the identity and/or authenticity of the article.
An example of another system for providing multiple signatures in an article database will now be described with reference to
As shown in
In some examples, further read head positions can be used, such that three, four or more signatures are created for each item. Each scan head position can be offset from the others in order to provide signatures from positions adjacent the intended scan location. Thus greater robustness to article misalignment on verification scanning can be provided.
The offset between scan head positions can be selected dependent upon factors such as the width of the scanned portion of the article, the size of the scanned area relative to the total article size, the likely misalignment amount during verification scanning, and the article material.
Thus there has now been described another example of a system for scanning an article to create a signature database against which an article can be checked to verify the identity and/or authenticity of the article.
Although it has been described above that a scanner used for record scanning (i.e. scanning of articles to create reference signatures against which the article can later be validated) can use multiple scan heads and/or scan head positions to create multiple signatures for an article, it is also possible to use a similar system for later validation scanning.
For example, a scanner for use in a validation scan may have multiple read heads to enable multiple validation scan signatures to be generated. Each of these multiple signatures can be compared to a database of recorded signatures, which may itself contain multiple signatures for each recorded item. Although the different signatures for a given item may vary, they will all still be extremely different from the signatures of any other item, so a match between any one record scan signature and any one validation scan signature should provide sufficient confidence in the identity and/or authenticity of an item.
A multiple read head validation scanner can be arranged much as described with reference to
Although the embodiments above have been described in considerable detail, numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications as well as their equivalents.
This application claims priority to and incorporates by reference U.S. provisional application No. 60/702,732 filed on Jul. 27, 2005, and Great Britain patent application GB 0515460.4 filed on Jul. 27, 2005.
Number | Date | Country |
---|---|---|
1588847 | Mar 2005 | CN |
19632269 | Feb 1997 | DE |
19612819 | Oct 1997 | DE |
10155780 | May 2003 | DE |
10234431 | Feb 2004 | DE |
0234105 | Sep 1987 | EP |
0278058 | Aug 1988 | EP |
0334201 | Sep 1989 | EP |
0472192 | Feb 1992 | EP |
0480620 | Apr 1992 | EP |
0570162 | Nov 1993 | EP |
0590826 | Apr 1994 | EP |
0691632 | Jan 1996 | EP |
1087348 | Mar 2001 | EP |
1273461 | Jan 2003 | EP |
1286315 | Feb 2003 | EP |
1388797 | Feb 2004 | EP |
1418542 | May 2004 | EP |
1434161 | Jun 2004 | EP |
1507227 | Feb 2005 | EP |
1577812 | Sep 2005 | EP |
1587030 | Oct 2005 | EP |
1616711 | Dec 2005 | EP |
1990779 | Nov 2008 | EP |
1319928 | Mar 1972 | GB |
1458726 | Dec 1976 | GB |
2097979 | Nov 1982 | GB |
2221870 | Feb 1990 | GB |
2228821 | Sep 1990 | GB |
2304077 | Dec 1997 | GB |
2346110 | Jan 2000 | GB |
2346111 | Jan 2000 | GB |
2411954 | Sep 2005 | GB |
2417074 | Feb 2006 | GB |
2417592 | Mar 2006 | GB |
2417707 | Mar 2006 | GB |
2426100 | Nov 2006 | GB |
2428846 | Feb 2007 | GB |
2428948 | Feb 2007 | GB |
2429092 | Feb 2007 | GB |
2429097 | Feb 2007 | GB |
2431759 | May 2007 | GB |
2433632 | Jun 2007 | GB |
2434642 | Aug 2007 | GB |
2462059 | Jul 2008 | GB |
H02-10482 | Jan 1990 | JP |
H03192523 | Aug 1991 | JP |
H06301840 | Oct 1994 | JP |
07210721 | Aug 1995 | JP |
H08-003548 | Jan 1996 | JP |
H08180189 | Jul 1996 | JP |
09218910 | Aug 1997 | JP |
H11-224319 | Aug 1999 | JP |
2000140987 | May 2000 | JP |
2002092682 | Mar 2002 | JP |
2004102562 | Apr 2002 | JP |
2003-150585 | May 2003 | JP |
2003141595 | May 2003 | JP |
2004171109 | Jun 2004 | JP |
2005217805 | Aug 2005 | JP |
2008523438 | Jul 2010 | JP |
20050023050 | Mar 2005 | KR |
8002604 | Dec 1981 | NL |
9401796 | Oct 1994 | NL |
2043201 | Sep 1995 | RU |
2065819 | Aug 1996 | RU |
8900742 | Jan 1989 | WO |
9111703 | Aug 1991 | WO |
9119614 | Dec 1991 | WO |
9322745 | Nov 1993 | WO |
9524691 | Sep 1995 | WO
9534018 | Dec 1995 | WO |
9636934 | Nov 1996 | WO |
9724699 | Jul 1997 | WO |
9913391 | Mar 1999 | WO |
0045344 | Aug 2000 | WO |
200046980 | Aug 2000 | WO |
0065541 | Nov 2000 | WO |
0118754 | Mar 2001 | WO |
0125024 | Apr 2001 | WO |
0143086 | Jun 2001 | WO |
0186574 | Nov 2001 | WO |
0186589 | Nov 2001 | WO |
0191007 | Nov 2001 | WO |
0250790 | Jun 2002 | WO |
03087991 | Oct 2003 | WO |
2004025548 | Mar 2004 | WO |
2004025549 | Mar 2004 | WO |
2004057525 | Jul 2004 | WO |
2004070667 | Aug 2004 | WO |
2004109479 | Dec 2004 | WO |
200504039 | Jan 2005 | WO |
2005004797 | Jan 2005 | WO |
2005027032 | Mar 2005 | WO |
2005029447 | Mar 2005 | WO |
2005088517 | Mar 2005 | WO |
2005048256 | May 2005 | WO |
2005078651 | Aug 2005 | WO |
2005080088 | Sep 2005 | WO |
2005086158 | Sep 2005 | WO |
2005088533 | Sep 2005 | WO |
2005122100 | Dec 2005 | WO |
2006016112 | Feb 2006 | WO |
2006016114 | Feb 2006 | WO |
2006021083 | Mar 2006 | WO |
2006132584 | Dec 2006 | WO |
2007012815 | Feb 2007 | WO |
2007012821 | Feb 2007 | WO |
2007028799 | Mar 2007 | WO |
2007072048 | Jun 2007 | WO |
2007080375 | Jul 2007 | WO |
2007111548 | Oct 2007 | WO |
2007144598 | Dec 2007 | WO |
2009141576 | Nov 2009 | WO |
2010004281 | Jan 2010 | WO |
2010004282 | Jan 2010 | WO |
2004097826 | Apr 2010 | WO |
Number | Date | Country | |
---|---|---|---|
20070028093 A1 | Feb 2007 | US |
Number | Date | Country | |
---|---|---|---|
60702732 | Jul 2005 | US |