The invention relates generally to technology for sensing and recording finger motion and fingerprints and, more particularly, to systems, devices, and methods for finger motion tracking, both alone and in combination with fingerprint image processing and navigation operations.
Partial fingerprint scanners are becoming popular for a wide variety of security applications. In contrast to “all at once” fingerprint scanners, which capture an image of an entire fingerprint at the same time, partial fingerprint sensing devices use a sensing area that is smaller than the fingerprint area to be imaged. By imaging only a portion of a fingerprint at any given time, a partial fingerprint sensor can be made considerably smaller and cheaper than a full fingerprint sensor. However, to capture a full fingerprint image, the user must move his or her finger and “swipe” it across the sensing zone of the partial fingerprint sensor.
Various types of partial fingerprint readers exist. Some work by optical means, some by pressure sensor means, and others by capacitance sensing means or radiofrequency sensing means.
For example, one common configuration used for a fingerprint sensor is a one or two dimensional array of CCD (charge coupled device) or CMOS circuit sensor elements (pixels). These components are embedded in a sensing surface to form a matrix of pressure sensing elements that generate signals in response to pressure applied to the surface by a finger. These signals are read by a processor and used to reconstruct the fingerprint of a user and to verify identification.
Other devices include one or two dimensional arrays of optical sensors that read light reflected off a person's finger onto an array of optical detectors. The reflected light is converted to a signal that defines the fingerprint of the finger analyzed and is used to reconstruct the fingerprint and to verify identification.
Many types of partial fingerprint scanners consist of linear (one dimensional) arrays of sensing elements (pixels). These one dimensional sensors create a two dimensional image of a fingerprint through the motion of the finger pad relative to the sensor array.
One class of partial fingerprint sensors that is particularly useful for small device applications is the deep finger penetrating radio frequency (RF) based sensor. These are described in U.S. Pat. Nos. 7,099,496; 7,146,024; and US Publication Nos. US 2005-0244038 A1, US 2005-0244039 A1, US 2006-0083411 A1, and US 2007-0031011 A1, and the contents of these patents and patent applications are incorporated herein by reference. These types of sensors are commercially produced by Validity Sensors, Inc., San Jose, Calif. This class of sensor mounts the sensing elements (usually arranged in a one dimensional array) on a thin, flexible, and environmentally robust support, and mounts the IC used to drive the sensor in a protected location some distance away from the sensing zone. Such sensors are particularly advantageous in applications where small sensor size and sensor robustness are critical.
The Validity fingerprint sensors measure the intensity of electric fields conducted by finger ridges and valleys, using deep finger penetrating radio frequency (RF) based sensing technology, and use this information to sense and create the fingerprint image. These devices create sensing elements by forming a linear array composed of many miniature excitation electrodes, spaced at a high density, such as approximately 500 electrodes per inch. The tips of these electrodes are separated from a single sensing electrode by a small sensor gap. The electrodes are electrically excited in a progressive scan pattern, and the ridges and valleys of a finger pad alter the electrical properties (usually the capacitive properties) of the excitation electrode-sensing electrode interaction, which in turn creates a detectable electrical signal. The electrodes and sensors are mounted on a thin flexible printed circuit support, and these electrodes and sensors are usually excited, and the sensor read, by an integrated circuit chip (scanner chip, driver chip, scan IC) designed for this purpose. The end result is a one dimensional “image” of the portion of the finger pad immediately over the electrode array and sensor junction.
As the finger surface is moved across the sensor, portions of the fingerprint are sensed and captured by the device's one dimensional scanner, creating an array of one dimensional images indexed by order of data acquisition and/or annotated with additional time and/or finger pad location information. Circuitry, such as a computer processor or microprocessor, then creates a full two dimensional fingerprint image by assembling a mosaic of these one dimensional partial fingerprint images.
Often the processor will then compare this recreated two dimensional full fingerprint, usually stored in working memory, with an authorized fingerprint stored in a fingerprint recognition memory, and determine if there is a match or not. Software for fingerprint matching is disclosed in U.S. Pat. Nos. 7,020,591 and 7,194,392 by Wei et al., and is commercially available from sources such as Cogent Systems, Inc., South Pasadena, Calif.
If the scanned fingerprint matches the record of an authorized user, the processor then usually unlocks a secure area or computer system and allows the user access. This enables various types of sensitive areas and information (financial data, security codes, etc.) to be protected from unauthorized users, yet still be easily accessible to authorized users.
The main drawback of partial fingerprint sensors is that in order to obtain a valid fingerprint scan, the user must swipe his or her finger across the sensor surface in a relatively uniform manner. Unfortunately, due to various human factors issues, this usually isn't possible. In the real world, users will not swipe their fingers with a constant speed. Some will swipe more quickly than others, some may swipe at non-uniform speeds, and some may stop partially through a scan and then resume. In order to account for this type of variation, modern partial fingerprint sensors often incorporate finger position sensors to determine, relative to the fingerprint sensor, how the overall finger position and speed vary during a finger swipe.
One type of finger position indicator, represented by U.S. Pat. No. 7,146,024 and US Publication Nos. US 2005-0244038 A1 and US 2005-0244039 A1 (the contents of which are incorporated herein by reference), detects relative finger position using a long array of electrical drive plate sensors. These plates sense the bulk of a finger (rather than the fine details of the fingerprint ridges), and thus sense the position of the finger relative to the linear array used for fingerprint sensing. A second type of finger position indicator, represented by U.S. patent publication US 2007-0031011 A1 (the contents of which are incorporated herein by reference), uses two linear partial fingerprint sensors, located about 400 microns apart. The two linear sensors use the slight timing differences that occur when a fingerprint swipe hits first one sensor and then the other to detect when a fingerprint edge passes over the sensors. This technique can also detect the relative speed of passage over the two partial sensors. This type of information can be used to deduce overall finger location during the course of a fingerprint swipe.
Another device is described in U.S. Pat. No. 6,002,815 of Immega, et al. The technique used by the Immega device is based on the amount of time required for the finger to travel a fixed distance between two parallel image lines that are oriented perpendicular to the axis of motion.
Still another technique is described in U.S. Pat. No. 6,289,114 of Mainguet. A device utilizing this method reconstructs fingerprints based on sensing and recording images taken of rectangular slices of the fingerprint and piecing them together using an overlapping mosaic algorithm.
In any case, once finger position is known, each of the one-dimensional partial fingerprint images can then be annotated with additional (and optional) time data (time stamp) or finger (finger tip, finger pad, fingerprint location) location data (location stamp). This optional annotation information, which supplements the “order of data acquisition” that would normally be used to keep track of the multiple stored partial fingerprint images in memory, can be used to help correct distortions (artifacts) when the various one dimensional partial images are assembled into a full two dimensional fingerprint image.
For example, if the user momentarily stops moving the finger during the finger swipe, the system will generate a series of nearly identical partial (one dimensional) fingerprint images. These images will have different orders of acquisition, and differing time stamps, which could confuse a processor when it attempts to create a correct two dimensional full fingerprint image. However if the fingerprint scanner also has a finger position sensor, the finger location data stamp associated with these nearly identical one dimensional partial fingerprint images will provide evidence that the finger stopped because the finger location data linked to these various one-dimensional partial fingerprint images will be almost the same. The computer processor that reassembles the partial fingerprint images into the complete fingerprint image can be instructed or programmed to also analyze the finger position (location) data, and perform appropriate image corrections when the location data shows that the finger paused during a scan.
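By way of illustration only, the following Python sketch shows one way a processor might use such location stamps to discard the near-duplicate scan lines produced during a pause. The field name "location" and the threshold MIN_STEP_UM are hypothetical choices for this sketch, not values taken from the referenced patents.

```python
# Minimal sketch: each partial image is assumed to be a dict carrying a
# "location" stamp (bulk finger position, e.g. in microns) alongside its
# scan line pixel data. MIN_STEP_UM is a hypothetical movement threshold.
MIN_STEP_UM = 10.0

def drop_paused_lines(partial_images):
    """Keep only scan lines whose location stamp advanced past the last
    kept line, discarding near-identical lines captured while the finger
    was paused mid-swipe."""
    kept = []
    for image in partial_images:
        if not kept or abs(image["location"] - kept[-1]["location"]) >= MIN_STEP_UM:
            kept.append(image)
    return kept
```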
U.S. Pat. Nos. 7,099,496 and 7,146,024, and US Publication Nos. US 20050244038 and US 20050244039, describe a combination fingerprint sensor and finger location apparatus, and the contents of these applications are incorporated herein by reference. FIGS. 11-18 of U.S. Pat. No. 7,099,496 show various ways in which finger location sensors and partial fingerprint imagers can be packaged together to produce a system capable of reproducing an entire fingerprint. FIG. 11 of U.S. Pat. No. 7,099,496 shows a fingerprint sensor that contains both a fingerprint imager (110), (114), (116), (118) and fingerprint location pickup plates (112), and (1120) through (1162).
Similarly, US Publication No. US 2005-0244038 A1 describes related combinations of finger location sensing and partial fingerprint imaging.
One drawback of these earlier approaches was that the system was still too sensitive to individual variations in user technique. One way to address this issue, discussed in US Publication US 20050244039A1, is to assist the user in producing a finger swipe that the system can properly read by giving the user visual or audio feedback as to whether the finger swipe produced finger location and speed data that was readable by the system. This way, the user can be encouraged to optimize his or her finger swipe technique. However, alternative approaches, such as improved signal analysis techniques that make the system more tolerant of variations in user technique, either with or without audio or visual user feedback, are also desirable.
One of the reasons why these earlier approaches were overly sensitive to variations in user technique is that the systems were not accurate enough. In a rate based sensor, noise can occur when the sensor that is configured to detect the finger and take a reading of finger features is unable to accurately detect the location and motion of the finger while it is being swiped. The result can be noise, such as electronic noise, arising as the finger transitions from one sensor element to another. As a result, the finger motion rate calculation is not totally accurate, because the signal noise occurring from one sensor element to another creates uncertainty with respect to the location and timing at which a finger transitions from one sensor to another.
Therefore, there exists a need in the art to more accurately sense finger swiping motion across a fingerprint sensor and to accurately calculate the speed and location of the finger that is in motion across the sensor. There also exists a great need in the art for a more efficient means to accurately sense and capture fingerprints on portable microprocessor controlled devices (e.g. cell phones, smart cards, PDAs, laptop computers, MP3 players, and the like). There is also a need for more convenient and efficient ways to provide navigation and control operations on such portable devices. As will be seen, the invention provides for these multiple needs in an elegant manner.
Further improvements in the finger location and movement sensing technology previously disclosed in U.S. Pat. Nos. 7,099,496 and 7,146,024, and also Publication Nos. US 20050244038 and US 20050244039, which are commonly assigned to applicant and incorporated herein by reference, are possible, and some of these additional improvements are described herein. In particular, the present invention teaches improved signal analysis techniques that enable finger location and movement during a fingerprint swipe to be determined with improved accuracy and precision, greater resistance to noise (false signals), and over a wider variety of different finger swipe techniques. The present invention also teaches techniques to use this improved information to produce more accurate fingerprint images.
The motion of a finger as it is being swiped across various sensor elements or plates generates noise as the finger transitions from one sensing plate to another plate. Finger motion may be detected by analyzing the sensor signals as a function of time, detecting stable regions of the signals, followed by regions where the sensor signal becomes distorted or noisy. This will usually define finger motion from one sensor element to the other. The problem, however, is that the noisy area is both broad and poorly defined, and this leads to inaccurate results. This is particularly true when the finger being swiped across the sensor changes in velocity or direction. This commonly happens with different types of users, each of whom may have their own unique techniques. Alternative algorithms that determine finger position and movement with higher accuracy are disclosed, and these algorithms in turn allow finger motion sensors to be produced that can track small variations in finger position and velocity with higher degrees of accuracy.
These enhanced accuracy finger position and motion sensors can be used in a greater variety of different applications. In one type of application, these higher accuracy finger motion sensors can be used in conjunction with partial fingerprint scanners to produce improved fingerprint imagers that function with greater robustness and accuracy. In a second type of application, these higher accuracy finger motion sensors may be used (either with or without a partial fingerprint imager) to control electronic devices. When several of these finger motion and position sensors are aligned in different directions, finger motion over a two dimensional surface may be detected. This allows a new type of finger controlled “mouse” computer input device to be created. Motion of a finger along the surface of such sensors may allow a user to control the movement of an indicator on a display screen, and control a microprocessor device. Such techniques are particularly useful for small space constrained devices, such as cell phones, smart cards, music players, portable computers, personal digital accessories, and the like.
The techniques discussed here can generally be used with the sensing circuits previously described in U.S. Pat. Nos. 7,099,496 and 7,146,024, and also U.S. Publication Nos. US 2005-0244038A1 and US 2005-0244039A1 that are commonly assigned to applicant and incorporated herein by reference. Please see these applications for a more detailed discussion of the electronic elements. The present invention is focused on signal analysis techniques, methods, and algorithms, and improved fingerprint sensors and navigational devices that use these previously disclosed finger position sensing devices. Thus the present application will not reiterate the details of these previously discussed electrical circuits unless they are relevant to the present invention.
Referring to
In operation by a user, the finger (104) is swiped against a top surface (105) of the sensing array, where the finger surface (106) is physically juxtaposed against the surfaces of the individual plates, Pn through P0, and the location and movement of the bulk of the finger is captured by the sensing array. Normally, a partial fingerprint scanner that captures a more detailed image of a portion of the fingerprint will also be present (see, for example, present FIG. 1I, FIGS. 3 and 4, and FIGS. 5-8 of Publication US2005-0244038A1), but for simplicity, this is not shown in this figure.
The sensing array has a combination of individual sensing elements that are used for sensing the presence and location of the bulk of the finger surface (106) when it is in contact with the surface (105) of the sensing array. These individual elements can be used to sense the presence, location, and motion of the bulk of the finger with respect to the sensing array surface (105). This information can be used to determine the presence and motion of the finger skin (which holds the fingerprint) (106) with respect to sensing array surface (105), but these elements do not capture an image of the fingerprint itself. That is done by the partial fingerprint imaging sensors (not shown).
This bulk finger motion and location information, together with partial fingerprint images obtained by the fingerprint imager, is used to reconstruct a captured image of the fingerprint surface on the underside of the finger (106), for use in authorizing a user by matching the captured fingerprint against a stored version of the user's fingerprint.
Thus when finger (104) is moved in the swipe direction (A), the sensor elements (100) can detect the finger's presence, velocity, acceleration, and direction. Depending upon the particular application, this information can be used in different ways. In some applications, only finger velocity information may be desired. In other applications, finger acceleration and direction may also be desired in order to allow a more complete image of the total fingerprint to be reconstructed from a mosaic of partial fingerprint images. Here, speed, acceleration and direction information can help correct for distortions that may occur if the finger is moved in an abnormal or unpredictable direction.
In another application, this information can be used for navigational applications, and can be used somewhat like a computer “mouse” to help the user command a microprocessor equipped computerized device.
Typically the sensing array (100) is connected to a sensor circuit (116), often by link (117) such as a wire connection or flexible circuit board. Using a sensor circuit (116) that is physically separated from the array (100) allows the sensor circuitry, which may be delicate or prone to picking up environmental noise, to be placed in a more protected or convenient location, and can help improve system robustness.
The sensor circuit (116) may include a processor (116a) and also persistent memory (116b) for storing general system process software and instructions executable by the processor. The sensor circuit may also include memory that can be read and written to such as RAM (116c). Motion process software (116d) may be included in the RAM or persistent memory (116b) (typically this software will require at least some RAM to function).
The motion process software (116d), when executed by the processor (116a), can activate the sensing array and interpret signals from the array via I/O (116e). This allows the processor to receive data from the sensing array as a fingerprint surface (106) is swiped across the surface (105). As previously discussed, such data may include finger presence, location, speed, acceleration, and direction. As previously discussed, this data can be used to determine the finger location where a partial fingerprint sensor (not shown) has received a partial fingerprint image, and using this data, a series of partial fingerprint images can be reassembled into a mosaic representative of a complete fingerprint image. In this diagram, the partial fingerprint imager (not shown) is located proximal to the sensing array medium (100).
Here, improved motion process software is described that enables finger location and speed to be determined with superior accuracy and superior resistance to variations in user technique. In order to describe how this improved motion process software operates, the system will be described in further detail.
In order to properly analyze the signal generated by the various individual sensing plates Pn, Pn-1, Pn-2, . . . P3, P2, P1 and P0 as the finger moves across these plates, a significant amount of signal analysis must be done. In particular, the data from the various plates must be properly interpreted, and then properly meshed with the data from a partial fingerprint sensor in order to generate a usable result.
In order to determine the location of the fingertip (107), the processor needs to receive a signal that indicates a finger transfer from one sensor plate to another. Although the analog signal data from the various individual sensing elements could be rounded off or truncated to produce simple binary “on plate” or “off plate” signals (that is, is the finger “on” or “off” sensing element Pn), as was done with some previous implementations, such a low resolution approach discards important information. Much useful information can also be found by analyzing slight variations in the plate signals (that is, digitizing the analog signal generated by the plates to a higher degree of precision), and then using this information, together with time of acquisition data and sensor plate location data, to determine finger position and movement more precisely. Here, techniques to do this more precise analysis are described.
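The contrast can be made concrete with a short sketch. The following Python fragment is illustrative only: the first function shows the legacy binary reduction, while the second keeps the digitized amplitudes so that later stages can examine their short-term stability; the threshold and window size are assumptions of the sketch.

```python
import statistics

ON_THRESHOLD = 0.5  # hypothetical normalized level for the binary scheme

def binary_plate_state(sample):
    """Low resolution approach: collapse one plate reading into a simple
    "on plate"/"off plate" decision, discarding amplitude information."""
    return sample >= ON_THRESHOLD

def plate_noise_level(samples, window=8):
    """Higher precision approach: keep the digitized amplitudes and track
    their short-term variance, which rises while the finger tip crosses a
    plate edge and falls while it sits over the plate center."""
    return statistics.pvariance(samples[-window:])
```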
Referring to
In essence then, the signal analysis problem is one of using the electrical signals generated by the various sensing plates to determine the speed of motion of the fingerprint surface, and its location, with respect to the surfaces s1, s2 . . . sn of the respective sensing elements. By carefully analyzing the intensity and timing of these electrical signals, and using the techniques taught herein, the position and movement of the finger may be determined with improved accuracy.
In order to facilitate this discussion, the dimensions of the sensor array will be defined. Here, the fingerprint surface transfers from one sensor element to another between the gaps “d”, the finger surface moves between each and every plate from plates Pn through P0, and these plates have a width wn, a surface sn and a center point cn.
Usually, a sensor array will contain a large number of individual plates, and these are shown in
Referring back to
As a finger tip (107) passes over a particular plate, the plate generates varying electrical signals. In particular, when the finger tip (107) first contacts either edge of the plate, for a while an intermittent (noisy) electrical signal is generated. Then, when the edge of the finger migrates away from the edge more towards the middle of the plate, there is a region where the electrical signal is more stable. Finally, when the edge of the finger migrates toward the opposite edge of the plate, there is another region where again a noisy electrical signal is generated. Thus the timing of the regions of noise and stability contains useful information relative to the precise location and speed of the bulk of the finger, which the present invention exploits to improve the robustness and accuracy of the fingerprint scanner.
Thus, according to the invention, the sensor circuit is able to determine a stable period during which the fingerprint surface is present and being sensed by a particular sensing element and the readings are continuous. The processor (116a) and motion process software (116d) can use this information, in conjunction with preprogrammed knowledge of the geometry and dimensions of the various sensor plates (which can be stored in persistent memory (116b) or alternatively in RAM (116c)), to determine, with improved accuracy, exactly what portion of the finger surface containing fingerprint (106) is passing over a fingerprint imager (sensor) at any given instant of time. This improved finger location information allows the various partial fingerprint images, produced by the fingerprint imager (sensor), to be stitched together by software into a more accurate complete fingerprint image. The complete fingerprint image is more accurate because the relative location of the various partial fingerprint images is better defined. An additional advantage is that by this more detailed analysis of noise and stable periods, more marginal finger swipes can be “saved”, and users will be able to use the system with less training, or alternatively will not need to repeat finger swipes so often. The end result is improved user satisfaction and higher commercial success.
In order to do this, the motion process software (116d) will perform algorithms or processes similar to that shown in
In the parallel process (210), the fingerprint images are captured throughout the sensing period by the fingerprint sensor. In this separate process, which can be completed essentially simultaneously with the bulk finger speed and location sensing, the actual image of the fingerprint (formed from the underside of the bulk finger) that is being swiped across the sensor is being recorded.
It should be appreciated that the accuracy and success of this approach is dependent on the accuracy of the finger location and movement data. Earlier approaches, which neglected finger edge and noise considerations, were essentially blind to finger location while the finger traversed a given sensor plate Pn. By contrast, by making use of the fact that the plate signal becomes noisier as the finger approaches plate edges, and becomes more stable while the finger is near the center of the plate (process step (206)), additional information is available to help determine finger location and movement to higher precision.
Put another way, according to the invention, the process step (206) is an improved process step which more accurately calculates the bulk finger motion and relative location as the finger moves from one sensor plate (element) to another during a finger swipe. This data, in combination with the speed and location calculations (208), determines the location of the fingerprint images (210) on the finger surface more precisely. In Step (212), the fingerprint image is reconstructed using the images captured in Step (210), together with the speed calculations from Step (208). Using this information, the fingerprint image can be reconstructed more accurately. Usually this reconstructed fingerprint will then be compared to an authorized fingerprint in order to give security authorization for the user.
Once the finger has settled onto the sensor, the signal will then enter a region of relative stability (203A). However, this stability then becomes interrupted as the finger begins to move across the various sensor plates during the progress of the swipe. During the swipe, as the finger tip (107) transitions from one plate gap to the next, the signal changes as a series of somewhat noisy steps.
The transition stage as the finger tip moves off of a first sensing plate (usually the plate furthest away from the fingerprint imaging sensor) is shown in the area circled by region (204). As the finger moves, the originally steady signal once again becomes unstable and uncertain (to keep the diagram simple, this signal noise is not shown). During transition region (204), noise in the signal makes it difficult to precisely determine if the finger is on or off that particular plate, and approaches that attempt to simplify the signal into a binary “on plate” or “off plate” result are thus prone to error. In reality, the signal transition region where the finger tip transitions from one plate to another may be relatively broad, beginning on the left side of the region (204L) and ranging to (204R), with a center point (204C).
One possible way to cope with the problem is to look at the time where the noise begins (passes a certain threshold), look at the time where it ends (again passes a certain threshold), and use the midway point to localize the exact time when the tip of the finger passed a given plate gap. Although this works, the results are still prone to error because the beginning time and ending time of a noisy signal can be rather indefinite. Sometimes random noise will cause the beginning of the noisy signal to be immediately detected, and sometimes, due to random fluctuations, the beginning of the noisy signal will not be immediately detected. Similarly, sometimes random noise will cause the end of the noisy signal to be immediately detected, and sometimes the end of the noisy signal will not be immediately detected. Thus the “midway noise time” approach to precisely locating finger location and velocity does not produce timing results with optimal accuracy, and can produce many small timing errors. These timing errors in turn translate into bulk finger velocity and location data with suboptimal accuracy. This in turn leads to suboptimal accuracy in assembling a complete fingerprint scan from a mosaic of partial fingerprint images.
In a more favored embodiment of the invention, the transition is not determined by the center of the noise field (204C). Rather, it is determined from the center point of the stable (noise free) region (203B) and the center point of the subsequent stable (noise free) region (203C).
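Expressed as code, this timing rule might look like the following sketch, which assumes the stable regions have already been segmented into (t_begin, t_end) pairs and that the plates lie at a uniform center-to-center pitch; both assumptions are for illustration only.

```python
def region_center(stable_region):
    """Center time of a stable (noise free) region given as (t_begin, t_end)."""
    t_begin, t_end = stable_region
    return (t_begin + t_end) / 2.0

def plate_speeds(stable_regions, plate_pitch_um):
    """Estimate finger speed from the time between the centers of
    consecutive stable regions: the finger travels one plate pitch
    (center to center) per interval, so the poorly defined edges of
    the noise field (204) never enter the calculation."""
    centers = [region_center(r) for r in stable_regions]
    return [plate_pitch_um / (later - earlier)
            for earlier, later in zip(centers, centers[1:])]
```

Because the center of a stable region is far better defined than either edge of a noise field, timing derived this way is considerably less sensitive to random fluctuations in when noise is first or last detected.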
Returning to
Returning to
Processor (116a) and motion process software (116d) can use this information to deduce finger location and speed. A flow chart for doing this is shown in
After the finger presence is sensed, then the system is initiated and the sensing of a finger location and fingerprint scan commences. In Step (308), a sensor circuit connected to the finger position sensing array (see
Once the signal loses stability, this is detected in Step (316), and the ending time t_end of that formerly stable period is assigned in Step (318). In Step (320), the center time of the stable period is calculated by the following formula:
t_stable = t_begin + (t_end − t_begin)/2,
In Step (322), the value of t_stable is stored. In Step (324), it is determined whether or not the sensing processes are completed.
If the noisy signal lasts for less than a preset time, such as 2 milliseconds, the system may determine that the finger was very briefly swiped but is no longer present on the sensor. In this case, the process may return to Step (308) and continue to monitor for a stable signal. If the process is completed, then the process proceeds to Step (326), where the end of the sensing period is determined and the entire fingerprint is presumed to have been swiped and sensed. The process then returns to Step (304), where the system continues to monitor for the presence of another fingerprint. As a result of this process, the finger position sensing array, processor, and software thus act to collect the data previously discussed in
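A compact sketch of this monitoring loop, with the sensor interface abstracted behind three hypothetical callbacks (read_sample, is_stable, finger_present), might look as follows; it is a simplified, single-threaded rendering of Steps (308) through (324), not the exact firmware.

```python
import time

def track_stable_periods(read_sample, is_stable, finger_present):
    """Wait for the signal to become stable, record t_begin, wait for
    stability to end, record t_end, and store the pair for the later
    speed and location calculations."""
    periods = []
    while finger_present():
        # Steps (308)-(312): monitor until a stable signal is detected.
        while finger_present() and not is_stable(read_sample()):
            pass
        t_begin = time.monotonic()            # Step (314)
        # Step (316): monitor until the signal loses stability.
        while finger_present() and is_stable(read_sample()):
            pass
        t_end = time.monotonic()              # Step (318)
        periods.append((t_begin, t_end))      # Steps (320)-(322)
    return periods
```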
As previously discussed in
In
The fingerprint image reconstruction strategy will differ somewhat depending upon how fast the finger was swiped. Usually the partial fingerprint images will be collected at a set number of images per second, producing a defined number of different partial fingerprint images every second. If the finger was swiped slowly, then so much data and so many partial fingerprint images may have been collected as to create much redundancy, and many of these redundant partial fingerprint images can and should be discarded so as not to clutter system memory and processing resources with a lot of redundant data. On the other hand, if the finger swipe was fast, the reverse problem may have occurred, in which an insufficient number of partial fingerprint images may have been recorded. In this case, since fingerprints usually consist of a series of connected grooves, it is usually adequate to make up for at least a small amount of missing data by interpolating between neighboring images (image segments).
In Step (412), the finger speed data from the finger position sensing array is used to determine if the finger motion was fast enough to be treated by the “fast swipe” algorithm. If the swipe was a fast swipe, then any gaps between the various partial fingerprint images (again usually one-dimensional fingerprint scans), here called “image segments”, may be filled in by interpolation in Step (414). If the swipe was a very slow swipe, then redundant (overlapping) partial fingerprint images may be discarded in Step (420), and the selected partial fingerprint images (image segments) are used to reconstruct the entire fingerprint. If the swipe was neither too fast nor too slow (422), then most of the images can be used. In any of the three scenarios, the process ends up in Step (416), which determines whether or not the last image has been used in order to reconstruct the fingerprint image. If it is not the last image, then the process returns to Step (406), where the timing data is retrieved. Once the last image is captured, as determined in Step (416), the reconstruction of the image ends in Step (424).
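The speed-dependent branching of Step (412) can be summarized in a few lines; the numeric thresholds below are hypothetical placeholders, and the two named strategies are sketched after the following paragraphs.

```python
# Hypothetical speed thresholds, in microns per second, for illustration.
FAST_SWIPE_UM_S = 120_000
SLOW_SWIPE_UM_S = 20_000

def choose_strategy(swipe_speed_um_s):
    """Step (412): pick a reconstruction strategy from the measured finger
    speed. "interpolate" fills gaps left by a fast swipe (Step (414));
    "discard" drops redundant overlapping segments from a slow swipe
    (Step (420)); "direct" uses the segments as captured (422)."""
    if swipe_speed_um_s > FAST_SWIPE_UM_S:
        return "interpolate"
    if swipe_speed_um_s < SLOW_SWIPE_UM_S:
        return "discard"
    return "direct"
```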
In operation, the motion of finger (501) will generate differential motion signals on motion sensor arrays (510) and (514). In the configuration as shown, if the two arrays produce an identical motion signal, then the finger is proceeding directly towards the partial fingerprint scanner (530). If motion sensor (510) is registering more motion, then the finger is veering to the left. If motion sensor (514) is registering more motion, then the finger is veering to the right. In this way, the same device may produce both fingerprint scans and two dimensional finger motion information that can be used in a “mouse like” manner to control a cursor or perform other control operations on an electronic device. Also in this configuration, sensing arrays (510), (514), (530) and traces (512), (516), (532), (534) may all be mounted on the same flexible film-like circuit, and IC chip (542) may also be mounted on this flexible film-like circuit a few centimeters away, in a more remote location that is less prone to environmental stress. In some embodiments, this configuration may resemble FIG. (2D) or (8A).
Some examples of how a collection of partial fingerprint images are reassembled to form complete fingerprint images are shown in
In
In order to determine which partial fingerprint images should be discarded or excluded, it will sometimes be useful to perform image analysis to confirm or detect image redundancy. Often, this can be done by simple algorithms, such as subtracting a first image from a second image on a per pixel basis, summing the differences over the pixels in the image, and determining if this sum is less than a preset threshold. If so, then the images are likely to be redundant. More sophisticated redundancy determination algorithms may also be used.
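Such a simple redundancy test might be sketched as follows; the threshold value is an assumption of the sketch and would in practice be tuned to the sensor's pixel depth and noise floor.

```python
REDUNDANCY_THRESHOLD = 16  # hypothetical sum-of-differences budget

def is_redundant(line_a, line_b, threshold=REDUNDANCY_THRESHOLD):
    """Subtract one scan line from the other on a per pixel basis, sum
    the absolute differences over the line, and call the pair redundant
    when the total falls below the preset threshold."""
    return sum(abs(a - b) for a, b in zip(line_a, line_b)) < threshold
```

For example, is_redundant([10, 12, 9], [10, 11, 9]) returns True, flagging the second line as a likely duplicate that can be discarded.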
By contrast, in
Various types of interpolation may be used to correct for the missing data between images. The interpolation may be an adaptive algorithm that changes its parameters depending upon the image data, or a non-adaptive algorithm that treats all pixels in the image the same way. If a non-adaptive algorithm is used, this algorithm may use a variety of different techniques, including bicubic, bilinear, Lanczos, nearest neighbor, sinc, spline, or other methods. If an adaptive method is used, this method may use a variety of different techniques, including Genuine Fractals, PhotoZoom Pro, and Qimage. Often, it will be convenient to use nearest neighbor or bicubic interpolation methods because such methods tend to require less microprocessor processing time. Anti-aliasing techniques to remove the jagged edges that separate the images on one side of a gap from the image on the other side of the gap may also be used.
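As one concrete illustration, the non-adaptive bilinear case reduces, for one dimensional scan lines, to a simple per-pixel linear blend between the two captured neighbors; the sketch below shows this simplest form and is not intended to represent any of the named commercial methods.

```python
def fill_gap(line_a, line_b, missing=1):
    """Linearly interpolate `missing` scan lines between two captured
    neighbor lines; for small gaps in ridge/valley data this cheap blend
    is usually adequate and light on microprocessor time."""
    filled = []
    for k in range(1, missing + 1):
        t = k / (missing + 1)
        filled.append([a + t * (b - a) for a, b in zip(line_a, line_b)])
    return filled
```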
The problem of an inconsistent fingerprint swipe, with mixed slow and fast regions, is shown in
The problem of a very slow fingerprint swipe is shown in
Alternate embodiments and sensor array configurations are also possible. In operation of another embodiment of the invention, the linear sensor array senses and captures fingerprint features in the form of a string of data signals by first sensing the features in an initial sensing and capture, followed by one or more subsequent operations in which a sample of a subset of the fingerprint features is captured again over a known time period. This time period may be predetermined, or measured as time progresses between the sensing and capturing of the samples. Once at least two samples are taken, a subsequent sample is compared against a previous sample to determine the amount of shift of the previous sample relative to the subsequent sample. In one embodiment, a single linear line of sensor pixels is used to sense a one-dimensional track of fingerprint features, and the signal sensed by the pixels is converted from an analog signal to a digital signal, where the features are then represented as a string of digital values. For example, the ridges of the fingerprint features may be represented as logical ones, and valleys represented as logical zeros.
When compared, the first string of digital values from one sample can be compared to the second string in a one to one relationship, and a similarity score can be produced that measures the number of matching values. If there is an immediate match, where both strings are substantially identical, then this would indicate that there was no movement during the time between which the two samples were taken. If there is not an immediate match, then this would indicate that there was some movement, and additional comparisons may be needed to determine the distance traveled. For each comparison, the strings of digital values can be shifted one or more pixels at a time. Once a good match is found, the distance traveled by the fingerprint is simply the number of pixels shifted times the distance between the pixels, which may be measured from the center point of one pixel to the center point of another pixel in the array of pixel sensors for example.
In one embodiment, a predetermined number of comparisons can be made along with corresponding similarity scores. The process may then choose the highest score to determine the most accurate comparison. The number of pixels that were shifted to get the best comparison can then be used to determine the distance traveled, since the size of and distance between the pixels can be predetermined, and the number of pixels can thus be used to measure the distance traveled by the fingerprint across the motion sensor over the time period of the motion.
In another embodiment, the process could make comparisons and generate scores to measure against a predetermined threshold, rather than making a predetermined number of comparisons. In this embodiment, the similarity score from each comparison can be measured after the comparison is made. If the score is within the threshold, then it can be used to indicate the amount of shift from one sample to another. This can then be used to determine the distance traveled by the fingerprint across the linear motion sensor.
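The shift-and-score comparison described in the last three paragraphs can be sketched as follows. This is a minimal illustration: it scans only one direction of shift and does not normalize scores by overlap length, refinements a production implementation would likely add.

```python
def similarity(sample_a, sample_b):
    """Count matching values between two digital strings (ridges as ones,
    valleys as zeros), compared in a one to one relationship."""
    return sum(1 for a, b in zip(sample_a, sample_b) if a == b)

def best_shift(first, second, max_shift=8):
    """Shift the samples one pixel at a time, score each alignment, and
    return the shift with the highest similarity score; the distance
    traveled is then this shift times the pixel pitch."""
    best, best_score = 0, -1
    for shift in range(max_shift + 1):
        score = similarity(first[shift:], second)
        if score > best_score:
            best, best_score = shift, score
    return best
```

For instance, if the second sample equals the first shifted by two pixels, best_shift returns 2, and the distance traveled over the sample interval is two pixel pitches.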
In one embodiment, generally, the invention provides a fingerprint motion tracking system and method, where a single linear sensor array is configured to sense features of a fingerprint along an axis of finger motion. The linear sensor array includes a plurality of substantially contiguous sensing elements or pixels configured to capture a segment of image data that represents a series of fingerprint features passing over a sensor surface. A buffer is configured to receive and store image data from the linear sensor array. And, a processing element is configured to generate fingerprint motion data. The linear sensor array may be configured to repeatedly sense at least two substantially contiguous segments of fingerprint data, and the processor can generate motion data based on at least two sensed contiguous segments of fingerprint data. In operation, the linear sensor array is configured to sense a first set of features of a fingerprint along an axis of finger motion and to generate a first set of image data captured by a plurality of substantially contiguous pixels of the sensor array. The linear sensor array is also configured to subsequently sense a second set of features of the fingerprint along an axis of finger motion and to generate a second set of image data captured by a plurality of substantially contiguous pixels of the sensor array. The processing element can then compare first and second sets of image data to determine the distance traveled by the fingerprint over a time interval.
As used herein, linear sensor array is a generic term that relates to a portion of sensing elements, whether they are pixels in an optical reader, a static or radio frequency reader that reads electric field intensity to capture a fingerprint image, piezoelectric components in touch-sensitive circuit fingerprint readers, or other elements indicative of fingerprint readers, where the elements are used to sense a portion of the fingerprint, rather than the entire fingerprint. Such sensor arrays may be configured in a number of ways within a matrix of well known sensor devices. For example, as previously discussed, several modern configurations are described and illustrated in pending U.S. Publication No. US 2006-0083411 A1 entitled: Fingerprint Sensing Assemblies and Methods of Making; U.S. Publication US 2005-0244039 A1 entitled: Methods and Apparatus for Acquiring a Swiped Fingerprint Image; U.S. Publication US 2005-0244038 A1, entitled: Fingerprint Sensing Methods and Apparatus; U.S. Publication US 2003-0035570 A1 entitled: Swiped aperture capacitive fingerprint sensing systems and methods, and other applications that are all assigned to common assignee Validity, Inc. Also, many other types of sensor matrices exist in the art directed to capturing fingerprint images. The invention is directed to a novel system, device and method that are not limited in application to any particular sensor matrix or array configuration. In fact, the invention can be used in conjunction with or incorporated into such configurations to improve performance, and further to reduce the processing resources required to capture and reconstruct images.
According to the invention, the linear sensor is substantially contiguous, which is to say that the sensor elements are in relative proximity to each other so that a first reading of a portion of fingerprint features can be taken, followed by a second reading after a short period of time from another position. The two samples can be compared to determine the relative distance traveled by the fingerprint surface in relation to the sensor surface. The linear sensor is configured to merely take a relatively small sample of the fingerprint at one point in time, then another at a subsequent time. These two samples are used to determine movement of the fingerprint. Two or more samples may be compared in order to compute direction and velocity of a fingerprint surface relative to the linear sensing elements. These samples may be linear, as described below and illustrated in the drawings, so that a linear array of fingerprint features can be recorded and easily compared to provide a basis for measuring motion, that is, distance traveled over time. If more than one sensor is employed, it is possible to determine direction of motion using vector addition with the different linear samples taken. Thus, some of the functions provided by the invention are a result of taking a linear sample to give a basis for vector analysis. However, those skilled in the art will understand that, given the description below and the related drawings, other embodiments are possible using other configurations of motion sensors, which would not depart from the spirit and scope of the invention, which is defined by the appended claims and their equivalents, as well as any claims and amendments presented in the future and their equivalents.
One useful feature of the invention is that ambiguity in results is substantially prevented. If properly configured, a system configured according to the invention can consistently produce a result, where at least two samples can be taken such that the features of one sample overlap with another sample. Then, comparisons can be made to determine the amount of shift, indicating the amount of movement of the fingerprint across the linear sensor. In prior art systems and methods, it is often the case that no result occurs, and a singularity results. Thus, a user would need to repeat sensing the fingerprint. In some systems, substantial predictor algorithms have been created in an attempt to compensate for or resolve the singularity when it occurs. Such applications are very large and demand a good deal of computation and processing resources, which would greatly bog down a portable device. According to the invention, sensing motion of a fingerprint is substantially certain, where samples taken from the fingerprint surface are consistently reliable. This is particularly important in navigation applications, where relative movement of the finger translates to movement of an object such as a cursor on a graphical user interface (GUI), discussed further below.
In one embodiment, the linear sensor array may be used alone to determine linear movement of a fingerprint. In another embodiment, the single sensor array may be used in conjunction with one or more other linear sensor arrays to determine movement in two dimensions. In either embodiment, the linear sensor arrays are utilized solely for determining motion. If the motion of the analyzed fingerprint occurs generally along a predetermined axis of motion, the single linear sensor array can be utilized to sense the velocity of the fingerprint being analyzed. To capture and record the motion of a fingerprint that is not directed along a predetermined axis of motion, two or more linear arrays (a plurality of arrays) can be used together to sense and record such motion, and a processor can determine the direction and speed of the fingerprint using vector arithmetic.
In yet another embodiment, one or more such linear arrays may be used in conjunction with a fingerprint sensor matrix to more accurately capture and reconstruct a fingerprint image. The sensor matrix can be configured to sense and capture an image of a portion of a fingerprint being analyzed, and the one or more linear arrays can provide motion information for use in reconstructing a fingerprint image. A device so configured would be able to more accurately sense, capture, record and reconstruct a fingerprint image using less processing resources than conventional devices and methods.
Alternatively, in yet another embodiment, one or more arrays can be used to generate motion information for use in accurate navigational operations, such as for use in navigating a cursor on a graphical user interface (GUI). Utilizing the improved processing functions of the invention, an improved navigation device can be constructed that is compatible with a portable device that has the power and processing restrictions discussed above. Examples of such embodiments are described and illustrated below.
A motion sensor configured according to the invention uses substantially less space and power compared to conventional configurations for motion sensing, navigation and fingerprint image reconstruction. Such a configuration can further aid conventional fingerprint reconstruction processes by better sensing motion of a finger while it is being analyzed by a sensing device. This gives a fingerprint sensing device the ability to reconstruct a fingerprint analyzed by a fingerprint sensor with reduced power. Utilizing the invention, conventional processes that need to match and construct fragmented images of a fingerprint, particularly devices that sense and process a fingerprint in portions, can be optimized with information related to fingerprint motion that occurs while a fingerprint surface is being read. Also, using this unique motion detection technology, optimal navigation functions can be provided that demand significantly less power than conventional devices. Such navigation functions can enable a low power navigation device to be integrated in a portable device system, such as a mouse pad used to move a cursor across a graphical user interface (GUI) on portable electronic devices including cellular phones, laptop computers, personal data assistants (PDAs), and other devices where low power navigation functions are desired. A novel system and method are provided that use minimal space and processing resources in providing accurate motion detection, from which fingerprint sensors as well as navigation systems can greatly benefit.
A device or system configured according to the invention can be implemented as a stand alone navigation device, or as a device to provide image reconstruction information for use with a line imaging device that matches and assembles a fingerprint image. Such a line imaging device may be any imaging device configured to sense and capture portions of a fingerprint, whether it captures individual perpendicular image lines of a fingerprint or multiple perpendicular lines. In operation, a motion detection device can operate as a separate motion detection and/or direction detection device. Alternatively, a motion detection device can be used in conjunction with a line imaging device to more accurately and efficiently sense, capture, store and reconstruct a fingerprint image. A device configured according to the invention may include a single array of finger ridge sensing pixels or data sensor points centrally located along the principal axis of motion to be detected, a sampling system to periodically sample the finger contact across the array, and a computational module or element that compares two sets of samples collected at different times to determine the distance traveled between the two sample times. According to the invention, the motion sensor pixels do not necessarily need to have the same resolution as the line imager. The motion sensor pixels may in fact use a different sensing technique than the imager.
Again, the invention provides separate operations for detecting motion and for sensing and capturing a fingerprint image. Thus, the techniques used for the separate processes can be the same or may be different depending on the application. Those skilled in the art will understand that different variations of the separate processes are possible using known techniques and techniques can be derived without any undue experimentation. Such variations would not depart from the spirit and scope of the invention.
Devices and Applications for Improved Navigation and Control:
The same techniques used to derive finger location and speed to help assist in assembling complete fingerprint images from partial fingerprint image scans can also be used for other purposes as well. In another embodiment of the present invention, these techniques can be used to create elegant “finger mouse” devices that allow the motion of a user's finger to control a computerized system in a manner similar to that of a conventional computer “mouse”.
In this type of embodiment, the invention provides the capability of dual axis finger motion sensing through additional finger motion sensing arrays. In this embodiment, there are two or more (a plurality of) sensor arrays for detecting motion, and each axis is independently measured to determine the component of velocity in that axis. The velocity components from the individual axes are used to compute a vector sum to determine the actual direction and velocity of motion of the finger with respect to the sensor surface. According to the invention, it is not necessary to capture a full image of the fingerprint in order to determine the distance traveled and the velocity. It is only necessary to capture either the finger location, or enough of a linear sample of fingerprint features along the line of motion of the fingerprint to allow motion to be computed.
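As a minimal sketch of this vector sum, assuming the per-axis velocity components have already been measured:

```python
import math

def finger_motion(vx, vy):
    """Combine independently measured axis components into the actual
    speed and direction of finger motion (degrees counterclockwise from
    the x axis) with respect to the sensor surface."""
    return math.hypot(vx, vy), math.degrees(math.atan2(vy, vx))
```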
In one embodiment, a plurality of samples, such as two or three samples, is captured by the motion sensor pixels and used to determine the distance traveled along the axis of motion of the fingerprint relative to the sensor surface, and the velocity at which the motion occurs. This information can also be used in “mouse like” computerized device navigational operations. If desired, the information can also, of course, be used further in combination with a fingerprint imager to aid in reconstructing a fingerprint image.
In order to provide a navigation device, as well as to detect and correct for finger motion that is not completely aligned with the desired axis, either of the embodiments may be combined in ensembles such that one sensor is aligned on the axis of motion, and additional sensors aligned at an angle (such as 22.5 or 30 degrees) to the principal axis of finger motion. Examples of different embodiments are discussed below.
Referring to
The system further includes a sensor module (102) that is used to sense a user's finger (104) and fingerprint surface (106) when it is moved across fingerprint sensing surface (108). As can be seen, the fingerprint sensing surface (108) is illustrated as a narrow surface that is designed to sense and capture portions of a fingerprint as it moves across the sensor. These portions can be subsequently reconstructed according to the invention using motion information from the motion sensors (110), (112). Thus, the sensor components illustrated in
Referring to
As an example, in
According to the invention, such a navigational pad can be greatly enhanced using sensor technology according to the invention, where directional movement sensors (110), (112) are used to guide the cursor (206) in searching for and selecting indicia such as toolbar items or icons for opening files, photos and other items when selected. In some applications, a multi-step sensor can read the fingerprint structures for guidance at one level, and may select indicia when the user presses harder on the sensor for another level of sensing. Thus, a user can move the cursor around by lightly pressing on and moving a finger along the surface, then pressing harder when selecting an icon, toolbar or other indicia. Utilizing the invention, a more efficient navigation tool can be adapted to perform all of these tasks at low power and high accuracy, a very adaptable feature for portable devices.
Referring to
In this case, as previously discussed for
Referring again to
One advantage of having motion sensors and possibly fingerprint imagers arranged to accept fingers moving with two dimensions of freedom (that is, combinations of up and down and right and left), is that fingerprints can now be created from very different types of finger swipes. As shown in
According to another embodiment 102(a) of the invention illustrated in
Referring to
Referring to
Referring to
Referring to
Referring to
If used for navigation purposes, any of the motion sensor configurations above can be utilized for different navigation operations. For example, referring again to
Another application for the invention is the implementation of a scroll function for lists of data or text in a GUI. Precise power control over a range may be useful in manufacturing environments, where small changes in power can greatly affect a process. Another application may be to operate a medical instrument where accuracy is useful to the device's operation.
Computer-mouse-like navigation requires the ability to sense motion in two dimensional space, where both motion and direction information are required. Referring again to
Thus, if a user would stroke a fingerprint surface against a motion sensor surface, the arrays could pick up the motion and direction information, and a processor could process the information to generate relative motion and direction information for use in navigation, such as for a computer mouse. In this example, a user can move a finger relative to a cursor on a graphical user interface (GUI), such as a computer screen, a cellular phone, a personal data assistant (PDA) or other personal device. The navigation sensor could then cause the cursor to move relative to the fingerprint motion, and a user can navigate across the GUI to operate functions on a computer or other device. Since the motion of the cursor is relative to the movement of the fingerprint surface against the navigation sensor, relatively small movements can translate to equal, lesser or even greater distance movement of the cursor.
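One simple way to express this relative mapping, with a hypothetical gain factor standing in for the equal, lesser, or greater translation just described:

```python
CURSOR_GAIN = 1.5  # hypothetical: >1 amplifies finger motion, <1 attenuates

def move_cursor(cursor_xy, finger_dx, finger_dy, gain=CURSOR_GAIN):
    """Map relative finger motion onto cursor motion on the GUI; the gain
    determines whether a small finger stroke translates to an equal,
    lesser, or greater cursor distance."""
    x, y = cursor_xy
    return (x + gain * finger_dx, y + gain * finger_dy)
```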
Referring to
In the present example, there are three sensors that fan upward for detecting motion and direction. In operation, a user will typically stroke over the sensor in a downward direction, and the three sensors can determine the direction and speed using vector analysis. However, it may be desired to account for motion in either an upward or downward direction, and multiple sensors oriented in each direction would be useful to better capture that information. From an orientation of a user facing the sensor illustrated in
Referring to
Referring to
In one embodiment, in order to support motion at any arbitrary angle, sensor arrays may be oriented at approximately 0, 30, 60, 90, 120, and 150 degrees. Another, more robust system might space them at 22.5-degree increments rather than 30. Once motion reaches 180 degrees, the process can use reverse motion on the zero-degree sensor array, and so on. As previously discussed, a device configured in this way would have some of the properties of a navigation touchpad such as those used in laptop computers, with the relative motion sensing capability of a computer mouse.
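To make the vector analysis concrete, the following sketch (an illustrative assumption, not the disclosed algorithm) recovers a two-dimensional velocity from several one-dimensional arrays at known orientations by least squares. Each array measures only the component of motion along its own axis, and a negative reading represents the reverse motion used to cover angles past 180 degrees:

    import math

    def estimate_velocity(angles_deg, readings):
        """Least-squares solve v from the projections r_i = v . u_i,
        where u_i is the unit vector along array i."""
        # Accumulate the 2x2 normal equations (A^T A) v = A^T r.
        sxx = sxy = syy = bx = by = 0.0
        for a, r in zip(angles_deg, readings):
            ux, uy = math.cos(math.radians(a)), math.sin(math.radians(a))
            sxx += ux * ux; sxy += ux * uy; syy += uy * uy
            bx += ux * r;   by += uy * r
        det = sxx * syy - sxy * sxy
        return ((syy * bx - sxy * by) / det,
                (sxx * by - sxy * bx) / det)

    # A finger moving diagonally at 45 degrees projects most strongly onto
    # the 30- and 60-degree arrays and reverses on the 150-degree array.
    vx, vy = estimate_velocity([0, 30, 60, 90, 120, 150],
                               [1.0, 1.366, 1.366, 1.0, 0.366, -0.366])
    assert abs(vx - 1.0) < 0.01 and abs(vy - 1.0) < 0.01

With only two non-parallel arrays the system is exactly determined; additional orientations over-determine it and average out noise, which is one reason finer angular spacing (such as 22.5 degrees) yields a more robust estimate.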
Such finger motion sensors will often be deep finger penetrating radio frequency (RF) based sensors, as previously discussed in U.S. Pat. Nos. 7,099,496; 7,146,024; and U.S. patent application Ser. Nos. 11/107,682; 11/112,338; 11/243,100; 11/184,464; however, alternative sensing techniques (optical sensors, etc.) may also be used. The circuitry used to drive such sensors was previously shown in partial detail in
As before, the overall device (100) comprises one or more sensor elements composed of linear arrays of finger position sensing plates, along with other optional sensors such as fingerprint imaging sensors. The finger position sensing plates will usually be activated or scanned by sensor control logic (252), which will send electrical signals to the sensing plates in a rapid scanning order. Control logic (252) may also control power, reset the sensor pixels or data contact points, control the output signal, control light sources or cooling (if an optical sensor is used), or perform other standard control functions. The output from these plates will in turn be detected by a readout circuit (254). This readout circuit usually employs an amplifier (256) to detect and amplify the electrical signal from a particular plate; this signal is normally affected by the presence or absence of a finger. The output from amplifier (256) will often then be filtered with a low pass filter (258) to reduce ambient electrical noise, and will normally then be digitized by an analog-to-digital converter (260). This data will then normally be transmitted by a communications link, such as a system bus (280), serial link, parallel link, or some other data transmission means, to other processing devices for further analysis.
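The chain just described can be summarized in a short behavioral model. The following sketch is illustrative only; the gain, filter coefficient and converter resolution are assumptions, and the parenthesized numbers merely echo the reference numerals above:

    def scan_plates(raw_samples, gain=8.0, alpha=0.3, adc_bits=8, v_ref=1.0):
        """Emulate scan -> amplify -> low-pass filter -> A/D convert.

        raw_samples: per-plate voltages produced by the progressive scan.
        alpha:       coefficient of a one-pole IIR low-pass, standing in
                     here for the ambient-noise filter.
        Returns one digital code per sensing plate.
        """
        codes, filtered = [], 0.0
        full_scale = (1 << adc_bits) - 1
        for v in raw_samples:                # scan order set by control logic (252)
            amplified = v * gain             # amplifier (256)
            filtered += alpha * (amplified - filtered)  # low pass filter (258)
            clamped = min(max(filtered, 0.0), v_ref)
            codes.append(round(clamped / v_ref * full_scale))  # ADC (260)
        return codes                         # transmitted over the bus (280)

    # One scan line: ridges produce stronger plate signals than valleys.
    line = scan_plates([0.02, 0.09, 0.10, 0.03, 0.08])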
The readout circuit (254) may store the output signal (data) in storage (262). If fingerprint images are obtained, the fingerprint data (264) is stored and preserved, either temporarily until the processor (266) can process it, or for later use by the processor as needed. The processor (266) includes an arithmetic unit (268) configured to process algorithms used for navigation of a cursor, such as those described in connection with navigation features of
Referring to
More specifically, the navigation sensor operation algorithm can be used as a finger mouse cursor control algorithm, and this more specific case is shown in
The invention may also involve a number of functions to be performed by a computer processor, such as a microprocessor. The microprocessor may be a specialized or dedicated microprocessor that is configured to perform particular tasks by executing machine-readable software code that defines the particular tasks. The microprocessor may also be configured to operate and communicate with other devices such as direct memory access modules, memory storage devices, Internet related hardware, and other devices that relate to the transmission of data in accordance with the invention. The software code may be configured using software formats such as Java, C++, XML (Extensible Mark-up Language) and other languages that may be used to define functions that relate to operations of devices required to carry out the functional operations related to the invention. The code may be written in different forms and styles, many of which are known to those skilled in the art. Different code formats, code configurations, styles and forms of software programs and other means of configuring code to define the operations of a microprocessor in accordance with the invention will not depart from the spirit and scope of the invention.
Within the different types of computers, such as computer servers, that utilize the invention, there exist different types of memory devices for storing and retrieving information while performing functions according to the invention. Cache memory devices are often included in such computers for use by the central processing unit as a convenient storage location for information that is frequently stored and retrieved. Similarly, a persistent memory is also frequently used with such computers for maintaining information that is frequently retrieved by a central processing unit, but that is not often altered within the persistent memory, unlike the cache memory. Main memory is also usually included for storing and retrieving larger amounts of information such as data and software applications configured to perform functions according to the invention when executed by the central processing unit. These memory devices may be configured as random access memory (RAM), static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, and other memory storage devices that may be accessed by a central processing unit to store and retrieve information. The invention is not limited to any particular type of memory device, or any commonly used protocol for storing and retrieving information to and from these memory devices respectively.
The invention thus provides a method and apparatus for enabling and controlling fingerprint sensors, fingerprint image data and motion data in conjunction with the operation of an electronic device where navigation and fingerprint verification processes are utilized. Although this embodiment is described and illustrated in the context of devices, systems and related methods for imaging fingerprints and providing navigation features for a portable device, the scope of the invention extends to other applications where such functions are useful. Furthermore, while the foregoing description has been with reference to particular embodiments of the invention, it will be appreciated that these are only illustrative of the invention and that changes may be made to those embodiments without departing from the principles of the invention.
This application is related to U.S. Non-Provisional application Ser. No. TBD, filed Dec. 14, 2007, entitled “Method and Apparatus for Fingerprint Image Reconstruction,” and U.S. Non-Provisional application Ser. No. TBD, filed Dec. 14, 2007, entitled “Method and Algorithm for Accurate Finger Motion Tracking.” This application is a continuation in part of, and claims the priority benefit of, U.S. Non-Provisional application Ser. No. 11/519,383, filed Sep. 11, 2006, and also a continuation in part of U.S. Non-Provisional application Ser. No. 11/519,362, filed Sep. 11, 2006. This application is a continuation in part of, and claims the priority benefit of, U.S. Non-Provisional application Ser. No. 11/107,682, filed Apr. 15, 2005. U.S. Non-Provisional patent application Ser. No. 11/107,682 claimed the priority benefit of U.S. Provisional Patent Application No. 60/563,139, filed Apr. 16, 2004. This application is also a continuation in part of, and claims the priority benefit of, U.S. Non-Provisional patent application Ser. No. 11/112,338, filed Apr. 22, 2005. U.S. Non-Provisional patent application Ser. No. 11/112,338 claimed the priority benefit of U.S. Provisional Application 60/564,791, filed Apr. 23, 2004. This application is also a continuation in part of, and claims the priority benefit of, U.S. Non-Provisional patent application Ser. No. 11/243,100, filed Oct. 4, 2005. U.S. Non-Provisional patent application Ser. No. 11/243,100 claimed the priority benefit of U.S. Provisional Patent Application 60/615,718, filed Oct. 4, 2004.
Number | Name | Date | Kind |
---|---|---|---|
4151512 | Riganati et al. | Apr 1979 | A |
4225850 | Chang et al. | Sep 1980 | A |
4310827 | Asai | Jan 1982 | A |
4353056 | Tsikos | Oct 1982 | A |
4405829 | Rivest et al. | Sep 1983 | A |
4525859 | Bowles et al. | Jun 1985 | A |
4550221 | Mabusth | Oct 1985 | A |
4580790 | Doose | Apr 1986 | A |
4758622 | Gosselin | Jul 1988 | A |
4817183 | Sparrow | Mar 1989 | A |
5076566 | Kriegel | Dec 1991 | A |
5109427 | Yang | Apr 1992 | A |
5140642 | Hsu et al. | Aug 1992 | A |
5305017 | Gerpheide | Apr 1994 | A |
5319323 | Fong | Jun 1994 | A |
5325442 | Knapp | Jun 1994 | A |
5420936 | Fitzpatrick et al. | May 1995 | A |
5422807 | Mitra et al. | Jun 1995 | A |
5456256 | Schneider et al. | Oct 1995 | A |
5543591 | Gillespie et al. | Aug 1996 | A |
5569901 | Bridgelall et al. | Oct 1996 | A |
5623552 | Lane | Apr 1997 | A |
5627316 | De Winter et al. | May 1997 | A |
5650842 | Maase et al. | Jul 1997 | A |
5717777 | Wong et al. | Feb 1998 | A |
5781651 | Hsiao et al. | Jul 1998 | A |
5801681 | Sayag | Sep 1998 | A |
5818956 | Tuli | Oct 1998 | A |
5838306 | O'Connor | Nov 1998 | A |
5848176 | Harra et al. | Dec 1998 | A |
5850450 | Schweitzer et al. | Dec 1998 | A |
5852670 | Setlak et al. | Dec 1998 | A |
5864296 | Upton | Jan 1999 | A |
5887343 | Salatino et al. | Mar 1999 | A |
5892824 | Beatson et al. | Apr 1999 | A |
5903225 | Schmitt et al. | May 1999 | A |
5915757 | Tsuyama et al. | Jun 1999 | A |
5920384 | Borza | Jul 1999 | A |
5920640 | Salatino et al. | Jul 1999 | A |
5940526 | Setlak et al. | Aug 1999 | A |
5999637 | Toyoda et al. | Dec 1999 | A |
6002815 | Immega et al. | Dec 1999 | A |
6016355 | Dickinson et al. | Jan 2000 | A |
6052475 | Upton | Apr 2000 | A |
6067368 | Setlak et al. | May 2000 | A |
6073343 | Petrick et al. | Jun 2000 | A |
6076566 | Lowe | Jun 2000 | A |
6088585 | Schmitt et al. | Jul 2000 | A |
6098175 | Lee | Aug 2000 | A |
6118318 | Fifield et al. | Sep 2000 | A |
6134340 | Hsu et al. | Oct 2000 | A |
6157722 | Lerner et al. | Dec 2000 | A |
6161213 | Lofstrom | Dec 2000 | A |
6175407 | Santor | Jan 2001 | B1 |
6182076 | Yu et al. | Jan 2001 | B1 |
6182892 | Angelo et al. | Feb 2001 | B1 |
6185318 | Jain et al. | Feb 2001 | B1 |
6234031 | Suga | May 2001 | B1 |
6241288 | Bergenek et al. | Jun 2001 | B1 |
6259108 | Antonelli et al. | Jul 2001 | B1 |
6289114 | Mainguet | Sep 2001 | B1 |
6317508 | Kramer et al. | Nov 2001 | B1 |
6320394 | Tartagni | Nov 2001 | B1 |
6332193 | Glass et al. | Dec 2001 | B1 |
6333989 | Borza | Dec 2001 | B1 |
6337919 | Dunton | Jan 2002 | B1 |
6346739 | Lepert et al. | Feb 2002 | B1 |
6347040 | Fries et al. | Feb 2002 | B1 |
6360004 | Akizuki | Mar 2002 | B1 |
6362633 | Tartagni | Mar 2002 | B1 |
6392636 | Ferrari et al. | May 2002 | B1 |
6399994 | Shobu | Jun 2002 | B2 |
6400836 | Senior | Jun 2002 | B2 |
6408087 | Kramer | Jun 2002 | B1 |
6473072 | Comiskey et al. | Oct 2002 | B1 |
6509501 | Eicken et al. | Jan 2003 | B2 |
6539101 | Black | Mar 2003 | B1 |
6580816 | Kramer et al. | Jun 2003 | B2 |
6597289 | Sabatini | Jul 2003 | B2 |
6643389 | Raynal et al. | Nov 2003 | B1 |
6672174 | Deconde et al. | Jan 2004 | B2 |
6710461 | Chou et al. | Mar 2004 | B2 |
6738050 | Comiskey et al. | May 2004 | B2 |
6741729 | Bjorn et al. | May 2004 | B2 |
6757002 | Oross et al. | Jun 2004 | B1 |
6766040 | Catalano et al. | Jul 2004 | B1 |
6785407 | Tschudi et al. | Aug 2004 | B1 |
6836230 | Le Pailleur et al. | Dec 2004 | B2 |
6838905 | Doyle | Jan 2005 | B1 |
6886104 | McClurg et al. | Apr 2005 | B1 |
6897002 | Teraoka et al. | May 2005 | B2 |
6898299 | Brooks | May 2005 | B1 |
6924496 | Manansala | Aug 2005 | B2 |
6937748 | Schneider et al. | Aug 2005 | B1 |
6941001 | Bolle et al. | Sep 2005 | B1 |
6941810 | Okada | Sep 2005 | B2 |
6950540 | Higuchi | Sep 2005 | B2 |
6959874 | Bardwell | Nov 2005 | B2 |
6963626 | Shaeffer et al. | Nov 2005 | B1 |
6970584 | O'Gorman et al. | Nov 2005 | B2 |
6980672 | Saito et al. | Dec 2005 | B2 |
6983882 | Cassone | Jan 2006 | B2 |
7013030 | Wong et al. | Mar 2006 | B2 |
7020591 | Wei et al. | Mar 2006 | B1 |
7030860 | Hsu et al. | Apr 2006 | B1 |
7035443 | Wong | Apr 2006 | B2 |
7040561 | Wong | May 2006 | B2 |
7042535 | Katoh et al. | May 2006 | B2 |
7043644 | DeBruine | May 2006 | B2 |
7046230 | Zadesky et al. | May 2006 | B2 |
7064743 | Nishikawa | Jun 2006 | B2 |
7099496 | Benkley | Aug 2006 | B2 |
7110577 | Tschud | Sep 2006 | B1 |
7113622 | Hamid | Sep 2006 | B2 |
7126389 | McRae et al. | Oct 2006 | B1 |
7129926 | Mathiassen et al. | Oct 2006 | B2 |
7136514 | Wong | Nov 2006 | B1 |
7146024 | Benkley | Dec 2006 | B2 |
7146026 | Russon et al. | Dec 2006 | B2 |
7146029 | Manansala | Dec 2006 | B2 |
7184581 | Johansen et al. | Feb 2007 | B2 |
7190816 | Mitsuyu et al. | Mar 2007 | B2 |
7194392 | Tuken et al. | Mar 2007 | B2 |
7197168 | Russo | Mar 2007 | B2 |
7200250 | Chou | Apr 2007 | B2 |
7251351 | Mathiassen et al. | Jul 2007 | B2 |
7258279 | Schneider et al. | Aug 2007 | B2 |
7260246 | Fujii | Aug 2007 | B2 |
7263212 | Kawabe | Aug 2007 | B2 |
7263213 | Rowe | Aug 2007 | B2 |
7289649 | Walley et al. | Oct 2007 | B1 |
7290323 | Deconde et al. | Nov 2007 | B2 |
7308121 | Mathiassen et al. | Dec 2007 | B2 |
7308122 | McClurg et al. | Dec 2007 | B2 |
7321672 | Sasaki et al. | Jan 2008 | B2 |
7356169 | Hamid | Apr 2008 | B2 |
7360688 | Harris | Apr 2008 | B1 |
7369685 | DeLeon | May 2008 | B2 |
7379569 | Chikazawa et al. | May 2008 | B2 |
7409876 | Ganapathi et al. | Aug 2008 | B2 |
7412083 | Takahashi | Aug 2008 | B2 |
7424618 | Roy et al. | Sep 2008 | B2 |
7447339 | Mimura et al. | Nov 2008 | B2 |
7447911 | Chou et al. | Nov 2008 | B2 |
7460697 | Erhart et al. | Dec 2008 | B2 |
7463756 | Benkley | Dec 2008 | B2 |
7505611 | Fyke | Mar 2009 | B2 |
7505613 | Russo | Mar 2009 | B2 |
7565548 | Fiske et al. | Jul 2009 | B2 |
7574022 | Russo | Aug 2009 | B2 |
7643950 | Getzin et al. | Jan 2010 | B1 |
7646897 | Fyke | Jan 2010 | B2 |
7681232 | Nordentoft et al. | Mar 2010 | B2 |
7689013 | Shinzaki | Mar 2010 | B2 |
7706581 | Drews et al. | Apr 2010 | B2 |
7733697 | Picca et al. | Jun 2010 | B2 |
7751601 | Benkley | Jul 2010 | B2 |
7843438 | Onoda | Nov 2010 | B2 |
7899216 | Watanabe et al. | Mar 2011 | B2 |
7953258 | Dean et al. | May 2011 | B2 |
8005276 | Dean et al. | Aug 2011 | B2 |
8077935 | Geoffroy et al. | Dec 2011 | B2 |
8107212 | Nelson et al. | Jan 2012 | B2 |
8115530 | Lewis et al. | Feb 2012 | B2 |
8131026 | Benkley et al. | Mar 2012 | B2 |
20010043728 | Kramer et al. | Nov 2001 | A1 |
20020025062 | Black | Feb 2002 | A1 |
20020061125 | Fujii | May 2002 | A1 |
20020064892 | Lepert et al. | May 2002 | A1 |
20020067845 | Griffis | Jun 2002 | A1 |
20020073046 | David | Jun 2002 | A1 |
20020089044 | Simmons et al. | Jul 2002 | A1 |
20020089410 | Janiak et al. | Jul 2002 | A1 |
20020096731 | Wu et al. | Jul 2002 | A1 |
20020122026 | Bergstrom | Sep 2002 | A1 |
20020126516 | Jeon | Sep 2002 | A1 |
20020133725 | Roy et al. | Sep 2002 | A1 |
20020181749 | Matsumoto et al. | Dec 2002 | A1 |
20030002717 | Hamid | Jan 2003 | A1 |
20030002719 | Hamid et al. | Jan 2003 | A1 |
20030021495 | Cheng | Jan 2003 | A1 |
20030035570 | Benkley, III | Feb 2003 | A1 |
20030063782 | Acharya et al. | Apr 2003 | A1 |
20030068072 | Hamid | Apr 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030076303 | Huppi | Apr 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030102874 | Lane et al. | Jun 2003 | A1 |
20030123714 | O'Gorman et al. | Jul 2003 | A1 |
20030123715 | Uchida | Jul 2003 | A1 |
20030141959 | Keogh et al. | Jul 2003 | A1 |
20030147015 | Katoh et al. | Aug 2003 | A1 |
20030161510 | Fuji | Aug 2003 | A1 |
20030161512 | Mathiassen et al. | Aug 2003 | A1 |
20030169228 | Mathiassen et al. | Sep 2003 | A1 |
20030174871 | Yoshioka et al. | Sep 2003 | A1 |
20030186157 | Teraoka et al. | Oct 2003 | A1 |
20030209293 | Sako et al. | Nov 2003 | A1 |
20030224553 | Manansala | Dec 2003 | A1 |
20040012773 | Puttkammer | Jan 2004 | A1 |
20040022001 | Chu et al. | Feb 2004 | A1 |
20040042642 | Bolle et al. | Mar 2004 | A1 |
20040050930 | Rowe | Mar 2004 | A1 |
20040066613 | Leitao | Apr 2004 | A1 |
20040076313 | Bronstein et al. | Apr 2004 | A1 |
20040081339 | Benkley | Apr 2004 | A1 |
20040096086 | Miyasaka | May 2004 | A1 |
20040113956 | Bellwood et al. | Jun 2004 | A1 |
20040120400 | Linzer | Jun 2004 | A1 |
20040125993 | Zhao et al. | Jul 2004 | A1 |
20040129787 | Saito | Jul 2004 | A1 |
20040136612 | Meister et al. | Jul 2004 | A1 |
20040172339 | Snelgrove et al. | Sep 2004 | A1 |
20040179718 | Chou | Sep 2004 | A1 |
20040184641 | Nagasaka et al. | Sep 2004 | A1 |
20040190761 | Lee | Sep 2004 | A1 |
20040208346 | Baharav et al. | Oct 2004 | A1 |
20040208347 | Baharav et al. | Oct 2004 | A1 |
20040208348 | Baharav et al. | Oct 2004 | A1 |
20040213441 | Tschudi | Oct 2004 | A1 |
20040215689 | Dooley et al. | Oct 2004 | A1 |
20040228505 | Sugimoto | Nov 2004 | A1 |
20040228508 | Shigeta | Nov 2004 | A1 |
20040240712 | Rowe et al. | Dec 2004 | A1 |
20040252867 | Lan et al. | Dec 2004 | A1 |
20050031174 | Ryhanen et al. | Feb 2005 | A1 |
20050036665 | Higuchi | Feb 2005 | A1 |
20050047485 | Khayrallah et al. | Mar 2005 | A1 |
20050100196 | Scott et al. | May 2005 | A1 |
20050109835 | Jacoby et al. | May 2005 | A1 |
20050110103 | Setlak | May 2005 | A1 |
20050111708 | Chou | May 2005 | A1 |
20050123176 | Ishii et al. | Jun 2005 | A1
20050136200 | Durell et al. | Jun 2005 | A1 |
20050139656 | Arnouse | Jun 2005 | A1 |
20050139685 | Kozlay | Jun 2005 | A1 |
20050162402 | Watanachote | Jul 2005 | A1 |
20050169503 | Howell et al. | Aug 2005 | A1 |
20050210271 | Chou et al. | Sep 2005 | A1 |
20050219200 | Weng | Oct 2005 | A1 |
20050220329 | Payne et al. | Oct 2005 | A1 |
20050231213 | Chou et al. | Oct 2005 | A1 |
20050238212 | Du et al. | Oct 2005 | A1 |
20050244038 | Benkley | Nov 2005 | A1 |
20050244039 | Geoffroy et al. | Nov 2005 | A1 |
20050249386 | Juh | Nov 2005 | A1 |
20050258952 | Utter et al. | Nov 2005 | A1 |
20050269402 | Spitzer et al. | Dec 2005 | A1 |
20060006224 | Modi | Jan 2006 | A1 |
20060055500 | Burke et al. | Mar 2006 | A1 |
20060066572 | Yumoto et al. | Mar 2006 | A1 |
20060078176 | Abiko et al. | Apr 2006 | A1 |
20060083411 | Benkley | Apr 2006 | A1 |
20060110537 | Huang et al. | May 2006 | A1 |
20060140461 | Kim et al. | Jun 2006 | A1 |
20060144953 | Takao | Jul 2006 | A1 |
20060170528 | Fukushige et al. | Aug 2006 | A1 |
20060187200 | Martin | Aug 2006 | A1 |
20060210082 | Devadas et al. | Sep 2006 | A1 |
20060214512 | Iwata | Sep 2006 | A1 |
20060239514 | Watanabe et al. | Oct 2006 | A1 |
20060249008 | Luther | Nov 2006 | A1 |
20060259873 | Mister | Nov 2006 | A1 |
20060261174 | Zellner et al. | Nov 2006 | A1 |
20060271793 | Devadas et al. | Nov 2006 | A1 |
20060287963 | Steeves et al. | Dec 2006 | A1 |
20070031011 | Erhart et al. | Feb 2007 | A1 |
20070036400 | Watanabe et al. | Feb 2007 | A1 |
20070057763 | Blattner et al. | Mar 2007 | A1 |
20070067828 | Bychkov | Mar 2007 | A1 |
20070076926 | Schneider et al. | Apr 2007 | A1 |
20070076951 | Tanaka et al. | Apr 2007 | A1 |
20070086634 | Setlak et al. | Apr 2007 | A1 |
20070090312 | Stallinga et al. | Apr 2007 | A1 |
20070138299 | Mitra | Jun 2007 | A1 |
20070180261 | Akkermans et al. | Aug 2007 | A1 |
20070198141 | Moore | Aug 2007 | A1 |
20070198435 | Siegal et al. | Aug 2007 | A1 |
20070228154 | Tran | Oct 2007 | A1 |
20070237366 | Maletsky | Oct 2007 | A1 |
20070248249 | Stoianov | Oct 2007 | A1 |
20080002867 | Mathiassen et al. | Jan 2008 | A1 |
20080013805 | Sengupta et al. | Jan 2008 | A1 |
20080019578 | Saito et al. | Jan 2008 | A1 |
20080049987 | Champagne et al. | Feb 2008 | A1 |
20080049989 | Iseri et al. | Feb 2008 | A1 |
20080063245 | Benkley et al. | Mar 2008 | A1 |
20080069412 | Champagne et al. | Mar 2008 | A1 |
20080126260 | Cox et al. | May 2008 | A1 |
20080169345 | Keane et al. | Jul 2008 | A1 |
20080170695 | Adler et al. | Jul 2008 | A1 |
20080175450 | Scott et al. | Jul 2008 | A1 |
20080178008 | Takahashi et al. | Jul 2008 | A1 |
20080179112 | Qin et al. | Jul 2008 | A1 |
20080185429 | Saville | Aug 2008 | A1 |
20080201265 | Hewton | Aug 2008 | A1 |
20080205714 | Benkley et al. | Aug 2008 | A1 |
20080219521 | Benkley et al. | Sep 2008 | A1 |
20080222049 | Loomis et al. | Sep 2008 | A1 |
20080223925 | Saito et al. | Sep 2008 | A1 |
20080226132 | Gardner | Sep 2008 | A1 |
20080244277 | Orsini et al. | Oct 2008 | A1 |
20080267462 | Nelson et al. | Oct 2008 | A1 |
20080279373 | Erhart et al. | Nov 2008 | A1 |
20090130369 | Huang et al. | May 2009 | A1 |
20090153297 | Gardner | Jun 2009 | A1 |
20090154779 | Satyan et al. | Jun 2009 | A1 |
20090155456 | Benkley et al. | Jun 2009 | A1 |
20090169071 | Bond et al. | Jul 2009 | A1 |
20090174974 | Huang et al. | Jul 2009 | A1 |
20090237135 | Ramaraju et al. | Sep 2009 | A1 |
20090252384 | Dean et al. | Oct 2009 | A1 |
20090252385 | Dean et al. | Oct 2009 | A1 |
20090252386 | Dean et al. | Oct 2009 | A1 |
20090279742 | Abiko | Nov 2009 | A1 |
20090319435 | Little et al. | Dec 2009 | A1 |
20090324028 | Russo | Dec 2009 | A1 |
20100026451 | Erhart et al. | Feb 2010 | A1 |
20100045705 | Vertegaal et al. | Feb 2010 | A1 |
20100083000 | Kesanupalli et al. | Apr 2010 | A1 |
20100119124 | Satyan | May 2010 | A1 |
20100123675 | Ippel | May 2010 | A1 |
20100127366 | Bond et al. | May 2010 | A1 |
20100176823 | Thompson et al. | Jul 2010 | A1 |
20100176892 | Thompson et al. | Jul 2010 | A1 |
20100177940 | Thompson et al. | Jul 2010 | A1 |
20100180136 | Thompson et al. | Jul 2010 | A1 |
20100189314 | Benkley et al. | Jul 2010 | A1 |
20100208953 | Gardner et al. | Aug 2010 | A1 |
20100244166 | Shibuta et al. | Sep 2010 | A1 |
20100272329 | Benkley | Oct 2010 | A1 |
20100284565 | Benkley et al. | Nov 2010 | A1 |
20110002461 | Erhart et al. | Jan 2011 | A1 |
20110018556 | Le et al. | Jan 2011 | A1 |
20110102567 | Erhart | May 2011 | A1 |
20110102569 | Erhart | May 2011 | A1 |
20110182486 | Valfridsson et al. | Jul 2011 | A1 |
20110214924 | Perezselsky et al. | Sep 2011 | A1 |
20110267298 | Erhart et al. | Nov 2011 | A1 |
20110298711 | Dean et al. | Dec 2011 | A1 |
20110304001 | Erhart et al. | Dec 2011 | A1 |
20120044639 | Garcia | Feb 2012 | A1 |
Number | Date | Country |
---|---|---|
2213813 | Oct 1973 | DE |
0929028 | Jan 1998 | EP |
0905646 | Mar 1999 | EP
0973123 | Jan 2000 | EP |
1018697 | Jul 2000 | EP |
1139301 | Oct 2001 | EP |
1531419 | May 2005 | EP |
1533759 | May 2005 | EP |
1538548 | Jun 2005 | EP |
1624399 | Feb 2006 | EP |
1939788 | Jul 2008 | EP |
2331613 | May 1999 | GB |
2480919 | Dec 2011 | GB |
04158434 | Jun 1992 | JP |
2005242856 | Sep 2005 | JP |
WO 9003620 | Apr 1990 | WO |
WO 9858342 | Dec 1998 | WO |
WO 9928701 | Jun 1999 | WO |
WO 9943258 | Sep 1999 | WO |
WO 0122349 | Mar 2001 | WO |
WO 0194902 | Dec 2001 | WO
WO 0195304 | Dec 2001 | WO
WO 0211066 | Feb 2002 | WO
WO 0247018 | Jun 2002 | WO
WO 02061668 | Aug 2002 | WO |
WO 02077907 | Oct 2002 | WO |
WO 03063054 | Jul 2003 | WO |
WO 03075210 | Sep 2003 | WO |
WO 2004066194 | Aug 2004 | WO |
WO 2004066693 | Aug 2004 | WO |
WO 2005104012 | Nov 2005 | WO
WO 2005106774 | Nov 2005 | WO
WO 2006040724 | Apr 2006 | WO |
WO 2006041780 | Apr 2006 | WO |
WO 2007011607 | Jan 2007 | WO |
WO 2008033264 | Mar 2008 | WO
WO 2008033265 | Jun 2008 | WO
WO 2008137287 | Nov 2008 | WO |
WO 2009002599 | Dec 2008 | WO
WO 2009029257 | Jun 2009 | WO |
WO 2009079219 | Jun 2009 | WO |
WO 2009079221 | Jun 2009 | WO |
WO 2009079257 | Jun 2009 | WO |
WO 2009079262 | Jun 2009 | WO |
WO 2010034036 | Mar 2010 | WO |
WO 2010036445 | Apr 2010 | WO |
WO 2010143597 | Dec 2010 | WO |
WO 2011053797 | May 2011 | WO |
Entry |
---|
Wikipedia (Mar. 2003). "Integrated circuit." http://en.wikipedia.org/wiki/Integrated_circuit. Revision as of Mar. 23, 2003. |
Matsumoto et al., Impact of Artificial “Gummy” Fingers on Fingerprint Systems, SPIE 4677 (2002), reprinted from cryptome.org. |
Maltoni, "Handbook of Fingerprint Recognition", XP002355942, Springer, New York, USA, Jun. 2003, pp. 65-69. |
Vermasan, et al., "A 500 dpi AC Capacitive Hybrid Flip-Chip CMOS ASIC/Sensor Module for Fingerprint, Navigation, and Pointer Detection With On-Chip Data Processing", IEEE Journal of Solid State Circuits, vol. 38, No. 12, Dec. 2003, pp. 2288-2294. |
Ratha, et al., "Adaptive Flow Orientation Based Feature Extraction in Fingerprint Images," Pattern Recognition, vol. 28, No. 11, pp. 1657-1672, Nov. 1995. |
Ratha, et al., “A Real Time Matching System for Large Fingerprint Databases,” IEEE, Aug. 1996. |
Suh, et al., "Design and Implementation of the AEGIS Single-Chip Secure Processor Using Physical Random Functions", Computer Architecture, 2005, ISCA '05, Proceedings, 32nd International Symposium, Jun. 2005 (MIT Technical Report CSAIL CSG-TR-843, 2004). |
Rivest, et al., "A Method for Obtaining Digital Signatures and Public-Key Cryptosystems", Communications of the ACM, vol. 21 (2), pp. 120-126 (1978). |
Hiltgen, et al., “Secure Internet Banking Authentication”, IEEE Security and Privacy, IEEE Computer Society, New York, NY, US, Mar. 1, 2006, pp. 24-31, XP007908655, ISSN: 1540-7993. |
Hegt, "Analysis of Current and Future Phishing Attacks on Internet Banking Services", Master Thesis, Technische Universiteit Eindhoven, Department of Mathematics and Computer Science, May 31, 2008, pp. 1-149, XP002630374, Retrieved from the Internet: URL:http://alexandria.tue.nl/extral/afstversl/wsk-i/hgt2008.pdf [retrieved on Mar. 29, 2011] *pp. 127-134, paragraph 6.2*. |
Gassend, et al., “Controlled Physical Random Functions”, In Proceedings of the 18th Annual Computer Security Conference, Las Vegas, Nevada, Dec. 12, 2002. |
Wikipedia (Dec. 2006). "Integrated circuit." Revision as of Dec. 10, 2006. http://en.wikipedia.org/wiki/Integrated_circuit. |
BELLAGIODESIGNS.COM (Internet Archive Wayback Machine, www.bellagiodesigns.com date: Oct. 29, 2005). |
Closed Loop Systems, The Free Dictionary, http://www.thefreedictionary.com/closed-loop+system (downloaded Dec. 1, 2011). |
Feedback: Electronic Engineering, Wikipedia, p. 5, http://en.wikipedia.org/wiki/Feedback#Electronic_engineering (downloaded Dec. 1, 2011). |
Galy et al. (Jul. 2007) “A full fingerprint verification system for a single-line sweep sensor.” IEEE Sensors J., vol. 7 No. 7, pp. 1054-1065. |
Number | Date | Country | |
---|---|---|---|
20080240523 A1 | Oct 2008 | US |
Number | Date | Country | |
---|---|---|---|
60563139 | Apr 2004 | US | |
60564791 | Apr 2004 | US | |
60615718 | Oct 2004 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11519383 | Sep 2006 | US |
Child | 11957332 | US | |
Parent | 11519362 | Sep 2006 | US |
Child | 11519383 | US | |
Parent | 11107682 | Apr 2005 | US |
Child | 11519362 | US | |
Parent | 11957332 | US | |
Child | 11519362 | US | |
Parent | 11112338 | Apr 2005 | US |
Child | 11957332 | US | |
Parent | 11243100 | Oct 2005 | US |
Child | 11112338 | US |