The invention relates generally to technology for sensing and recording fingerprints and, more particularly, to systems, devices and methods for fingerprint motion tracking, alone and in combination with fingerprint image processing.
A number of devices and techniques exist for sensing, capturing, and reconstructing the image of a fingerprint as it moves across a sensor array. Though many devices exist to sense and record an entire fingerprint, partial fingerprint sensing devices have been developed for small portable devices to save space. The sensing devices themselves vary widely, and many devices and related techniques exist for sensitively detecting the presence of the finger surface and the features located on the surface that make up a person's unique fingerprint. For example, one common configuration used for a fingerprint sensing surface includes CCD (charge-coupled device) or CMOS circuits. These components are embedded in a sensing surface to form a matrix of piezoelectric elements that generate signals in response to pressure applied to the surface by a finger. These signals are read by a processor and used to reconstruct the fingerprint of a user and to verify identification. Other devices include a matrix of optical sensors that read light reflected off of a person's finger and onto optical elements. The reflected light is converted to a signal that defines the fingerprint of the finger analyzed and is used to reconstruct the fingerprint and to verify identification. More modern devices include static or radio frequency (RF) devices configured to measure the intensity of electric fields conducted by finger ridges and valleys to sense and capture the fingerprint image. Regardless of the method used to sense the fingerprint, conventional devices and techniques have common drawbacks, particularly when used in combination with portable electronic devices. Such devices require small components because portable devices offer little space and surface area, and further require that any power demand be as small as possible due to limited battery life.
Specifically, devices exist that have a sensing area smaller than the fingerprint area to be imaged. Such devices are greatly desired because they take up much less space than a full fingerprint sensor, a very useful feature for small portable devices. These sensing devices generally consist of one or more imaging lines disposed perpendicular to the axis of motion. As the finger surface is moved across the sensor, portions of the fingerprint are sensed and captured by the device. These portions are subsequently reconstructed in a mosaic or overlapping manner. In operation, however, current conventional devices have severe drawbacks. They generally require extensive processing resources for computing the algorithms and data required for reconstructing fingerprints.
For applications of fingerprint identification devices in portable electronics, such as laptops and cellular telephones, low power consumption is a strict requirement. Therefore, it is important to keep computational processing minimal in such applications. Again, present conventional fingerprint sensor technology requires a substantial amount of processing, and thus a large amount of power, to perform the tasks required for reconstructing fingerprints for identification. One major problem is that a large amount of pixel information must be recorded and matched in a short amount of time, burdening the device processor and consuming substantial power. This is especially problematic for small devices, which already operate under restrictions on power consumption.
One conventional device is described in U.S. Pat. No. 6,002,815 of Immega et al. The technique used by the Immega device is based on the amount of time required for the finger to travel a fixed distance between two parallel image lines that are oriented perpendicular to the axis of motion. After a time history of samples is captured, the speed is determined by finding the time delay that provides the best match between data from the first line and data from the second line. The device captures the entire image of an object and stores the image line by line. Such an object is illustrated as a photocopy of a document, and the reference does not suggest a fingerprint or other image. Thus, it is directed to a device and method for scanning an image passing over a perpendicular slit pair at a variable speed, as opposed to objects that pass over the slit pair at a fixed speed. It does not address the problem of excessive processor power expended to perform the process. Also, the perpendicular lines of the image are used for determining the speed of the object as it passes through the perpendicular slit where the image is captured. These recorded lines are also used in reconstructing the image when the scan is complete. Thus, a large amount of data is processed and stored in the process. The amount of processing resources required to calculate the speed at any given moment is immense, where those resources include processing time, processor computation, and processor power. Furthermore, this time series approach has the disadvantage that it is not possible to quickly determine an absolute distance of motion by comparing only the instantaneous data from the two image lines. This is true for all cases other than the rare coincidental case where the finger happens to travel exactly the distance between the image lines during the interval between the two samples. Another problem arises when the object is moving much more slowly than the sample rate of the device. In this case, the number of samples needed to find a match is substantial. In addition, at slow speeds, the device must compare a larger number of stored lines in order to find a match. This greatly increases the computational requirements, placing a substantial burden on the device processor. Thus, expensive high-order processors are required for adequate performance, and substantial power is needed to operate such processors.
Another technique is described in U.S. Pat. No. 6,289,114 of Mainguet. A device utilizing this method reconstructs fingerprints based on sensing and recording images taken of rectangular slices of the fingerprint and piecing them together using an overlapping mosaic algorithm. Like Immega, the technique described in Mainguet is computationally burdensome on the device processor. Furthermore, the Mainguet method requires a substantial amount of memory as well as a larger number of imaging pixels in order to properly record the images. Again, this method demands substantial power to perform its algorithms, a significant problem for power-rationed portable devices.
For accurate fingerprint capture, it is often advantageous to provide a navigation function with the same device used for fingerprint sensing. The navigation function can provide more functionality in as little area as possible in a portable device, and provide a more accurate fingerprint image. However, conventional devices and methods for navigation require substantial processor resources, and thus demand more power. In such devices, in order to sense finger motion, the sensing device must sample the image at a periodic rate that is fast enough to ensure that a moving feature will be sampled when it passes both the primary imaging line and the auxiliary line of pixels. As a consequence, the sensor needs to operate at full imaging speeds, thus consuming full imaging power while in the navigation mode. Consequently, conventional navigation methods demand substantial power, and are thus impractical for small devices.
Thus, there exists a great need in the art for a more efficient means to accurately sense and capture fingerprints on portable devices and also to provide navigation operations without unduly demanding power. As will be seen, the invention provides a means to overcome the shortcomings of conventional systems in an elegant manner.
a-b are diagrammatic views of a sensor and fingerprint configured according to the invention;
a-j are diagrammatic views of a sensor and fingerprint images configured according to the invention;
The invention is directed to a fingerprint motion tracking method and system for sensing features of a fingerprint along an axis of finger motion, where a linear sensor array has a plurality of substantially contiguous sensing elements configured to capture substantially contiguous overlapping segments of image data. A processing element is configured to receive segments of image data captured by the linear sensor array and to generate fingerprint motion data. Multiple sensor arrays may be included for generating directional data. The motion tracking data may be used in conjunction with a fingerprint image sensor to reconstruct a fingerprint image using the motion data either alone or together with the directional data.
The invention provides an independent relative motion sensor that does not require the power demanded by conventional devices. The independent relative motion sensor includes a linear array of sensing elements that captures a narrow string of data indicative of fingerprint features along a relatively narrow sample. In operation, the linear sensor array senses and captures fingerprint features in the form of a string of data signals by first sensing the features in an initial capture, followed by one or more subsequent operations in which a subset of the fingerprint features is captured again over a known time period. This time period may be predetermined or measured as time progresses between sensing and capturing of the samples. Once at least two samples are taken, a subsequent sample is compared against a previous sample to determine the amount of shift of the previous sample relative to the subsequent sample. In one embodiment, a single linear line of sensor pixels is used to sense a one-dimensional track of fingerprint features, and the signal sensed by the pixels is converted from an analog signal to a digital signal, where the features are then represented as a string of digital values. For example, the ridges of the fingerprint features may be represented as logical ones, and valleys represented as logical zeros.
When compared, the first string of digital values from one sample can be compared to the second string in a one-to-one relationship, and a similarity score can be produced that measures the number of matching values. If there is an immediate match, where both strings are substantially identical, this indicates that there was no movement during the time between which the two samples were taken. If there is not an immediate match, this indicates that there was some movement, and additional comparisons may be needed to determine the distance traveled. For each comparison, the strings of digital values can be shifted one or more pixels at a time. Once a good match is found, the distance traveled by the fingerprint is simply the number of pixels shifted times the distance between the pixels, which may be measured, for example, from the center point of one pixel to the center point of another pixel in the array of pixel sensors.
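To make the shift-and-compare operation concrete, the following sketch illustrates one possible implementation; the function names, maximum shift, pixel pitch, and sample values are illustrative assumptions, and the invention is not limited to any particular implementation. Each candidate shift is scored by the fraction of matching values, and the best-scoring shift is converted into a distance:

```python
# Hypothetical sketch of the shift-and-compare matching described above.
# Two binary samples (1 = ridge, 0 = valley) are taken from the same
# linear array at times t0 and t1; the shift producing the highest
# similarity score gives the number of pixels traveled between samples.

def similarity(a, b):
    # Fraction of positions at which the two overlapping segments agree.
    return sum(x == y for x, y in zip(a, b)) / len(a)

def best_shift(sample_t0, sample_t1, max_shift=8):
    # Try each candidate shift and keep the one with the highest score.
    best_s, best_score = 0, -1.0
    for s in range(max_shift + 1):
        score = similarity(sample_t0[s:], sample_t1[:len(sample_t1) - s])
        if score > best_score:
            best_s, best_score = s, score
    return best_s, best_score

# Example: the fingerprint has moved two pixels between the two samples.
t0 = [0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1]
t1 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 0]   # t0 shifted by 2, new data at end
shift, score = best_shift(t0, t1)            # -> shift == 2, score == 1.0
PIXEL_PITCH_MM = 0.05                        # assumed center-to-center spacing
distance_mm = shift * PIXEL_PITCH_MM         # distance traveled, as in the text
```

A practical implementation would also search shifts in the opposite direction to handle reverse finger motion, as discussed in the navigation embodiments below.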
In one embodiment, a predetermined number of comparisons can be made along with corresponding similarity scores. The process may then choose the highest score to determine the most accurate comparison. The number of pixels that were shifted to get the best comparison can then be used to determine the distance traveled, since the size of and distance between the pixels can be predetermined, and the number of pixels can thus be used to measure the distance traveled by the fingerprint across the motion sensor over the time period of the motion.
In another embodiment, the process could make comparisons and generate scores to measure against a predetermined threshold, rather than making a predetermined number of comparisons. In this embodiment, the similarity score from each comparison can be measured after the comparison is made. If the score is within the threshold, then it can be used to indicate the amount of shift from one sample to another. This can then be used to determine the distance traveled by the fingerprint across the linear motion sensor.
In one embodiment, generally, the invention provides a fingerprint motion tracking system and method, where a single linear sensor array is configured to sense features of a fingerprint along an axis of finger motion. The linear sensor array includes a plurality of substantially contiguous sensing elements or pixels configured to capture a segment of image data that represents a series of fingerprint features passing over a sensor surface. A buffer is configured to receive and store image data from the linear sensor array, and a processing element is configured to generate fingerprint motion data. The linear sensor array may be configured to repeatedly sense at least two substantially contiguous segments of fingerprint data, and the processor can generate motion data based on at least two sensed contiguous segments of fingerprint data. In operation, the linear sensor array is configured to sense a first set of features of a fingerprint along an axis of finger motion and to generate a first set of image data captured by a plurality of substantially contiguous pixels of the sensor array. The linear sensor array is also configured to subsequently sense a second set of features of the fingerprint along the axis of finger motion and to generate a second set of image data captured by a plurality of substantially contiguous pixels of the sensor array. The processing element can then compare the first and second sets of image data to determine the distance traveled by the fingerprint over a time interval.
As used herein, "linear sensor array" is a generic term that refers to a portion of sensing elements, whether they are pixels in an optical reader, a static or radio frequency reader that reads electric field intensity to capture a fingerprint image, piezoelectric components in touch-sensitive circuit fingerprint readers, or other elements indicative of fingerprint readers, where the elements are used to sense a portion of the fingerprint rather than the entire fingerprint. Such sensor arrays may be configured in a number of ways within a matrix of well known sensor devices. For example, several modern configurations are described and illustrated in pending U.S. Pub. No. US 2006/0083411 A1, entitled "Fingerprint Sensing Assemblies and Methods of Making"; U.S. Pub. No. US 2005/0244039 A1, entitled "Methods and Apparatus for Acquiring a Swiped Fingerprint Image"; U.S. Pub. No. US 2005/0244038 A1, entitled "Fingerprint Sensing Methods and Apparatus"; U.S. Pub. No. US 2003/0035570 A1, entitled "Swiped Aperture Capacitive Fingerprint Sensing Systems and Methods"; and other applications that are all assigned to common assignee Validity, Inc. Also, many other types of sensor matrices exist in the art directed to capturing fingerprint images. The invention is directed to a novel system, device and method that is not limited in application to any particular sensor matrix or array configuration. In fact, the invention can be used in conjunction with or incorporated into such configurations to improve performance, and further to reduce the processing resources required to capture and reconstruct images.
According to the invention, the linear sensor is substantially contiguous, which is to say that the sensor elements are in relative proximity to each other so that a first reading of a portion of fingerprint features can be taken, followed by a second reading a short time later from a relatively stationary position. The two samples can be compared to determine the relative distance traveled by the fingerprint surface in relation to the sensor surface. The linear sensor is configured to take merely a relatively small sample of the fingerprint at one point in time, then another at a subsequent time. These two samples are used to determine movement of the fingerprint. Two or more samples may be compared in order to compute the direction and velocity of a fingerprint surface relative to the linear sensing elements. These samples may be linear, as described below and illustrated in the drawings, so that a linear array of fingerprint features can be recorded and easily compared to provide a basis for measuring motion, i.e., distance traveled over time. If more than one sensor is employed, it is possible to determine the direction of motion using vector addition with the different linear samples taken. Thus, some of the functions provided by the invention are a result of taking a linear sample to give a basis for vector analysis. However, those skilled in the art will understand that, given the description below and the related drawings, other embodiments are possible using other configurations of motion sensors, which would not depart from the spirit and scope of the invention, which is defined by the appended claims and their equivalents, as well as any claims and amendments presented in the future and their equivalents.
One useful feature of the invention is that ambiguity in results is substantially prevented. If properly configured, a system configured according to the invention can consistently produce a result, where at least two samples can be taken such that the features of one sample overlap with those of another sample. Comparisons can then be made to determine the amount of shift, indicating the amount of movement of the fingerprint across the linear sensor. In prior art systems and methods, it is often the case that no result occurs, and a singularity results. Thus, a user would need to repeat sensing the fingerprint. In some systems, substantial predictor algorithms have been created in an attempt to compensate for or resolve the singularity when it occurs. Such applications are very large and demand a good deal of computation and processing resources, which would greatly bog down a portable device. According to the invention, sensing motion of a fingerprint is substantially certain, where samples taken from the fingerprint surface are consistently reliable. This is particularly important in navigation applications, where relative movement of the finger translates to movement of an object such as a cursor on a graphical user interface (GUI), discussed further below.
In one embodiment, the linear sensor array may be used alone to determine linear movement of a fingerprint. In another embodiment, the single sensor array may be used in conjunction with one or more other linear sensor arrays to determine movement in two dimensions. In either embodiment, the linear sensor arrays are utilized solely for determining motion. If the motion of the analyzed fingerprint occurs generally along a predetermined axis of motion, the single linear sensor array can be utilized to sense the velocity of the fingerprint being analyzed. To capture and record the motion of a fingerprint that is not directed along a predetermined axis of motion, two or more linear arrays (a plurality of arrays) can be used together to sense and record such motion, and a processor can determine the direction and speed of the fingerprint using vector arithmetic.
In yet another embodiment, one or more such linear arrays may be used in conjunction with a fingerprint sensor matrix to more accurately capture and reconstruct a fingerprint image. The sensor matrix can be configured to sense and capture an image of a portion of a fingerprint being analyzed, and the one or more linear arrays can provide motion information for use in reconstructing a fingerprint image. A device so configured would be able to more accurately sense, capture, record and reconstruct a fingerprint image using less processing resources than conventional devices and methods.
The primary distinction between the invention and the prior art, Immega and Mainguet for example, is that the invention separates the analysis of motion from the capturing of the entire fingerprint image. The concept described in Immega, for example, requires the entire image to be captured and recorded line by line. The recorded lines are used both to calculate the speed of the object as it passes over the perpendicular slot and to reconstruct the image of the object being sensed. Immega requires immense processing and storage resources to sense, capture, record and reconstruct the image, and all of these functions are carried out by processing the entire lot of image data captured and recorded. Similarly, a device configured according to Mainguet must capture large portions of the fingerprint image and requires substantial processing and storage resources to overlap and match the image mosaics to reconstruct the image. In stark contrast, the invention provides a means for detecting motion of a fingerprint separately from the process of capturing a fingerprint image, and uses the motion information to more efficiently reconstruct the fingerprint image using less processing and storage resources.
Alternatively, in yet another embodiment, one or more arrays can be used to generate motion information for use in accurate navigational operations, such as for use in navigating a cursor on a graphical user interface (GUI). Utilizing the improved processing functions of the invention, an improved navigation device can be constructed that is compatible with a portable device that has the power and processing restrictions discussed above. Examples of such embodiments are described and illustrated below.
A motion sensor configured according to the invention uses substantially less space and power than conventional configurations for motion sensing, navigation and fingerprint image reconstruction. Such a configuration can further aid conventional fingerprint reconstruction processes by better sensing motion of a finger while it is being analyzed by a sensing device. This gives a fingerprint sensing device the ability to reconstruct an analyzed fingerprint with reduced power. Utilizing the invention, conventional processes that need to match and construct fragmented images of a fingerprint, particularly devices that sense and process a fingerprint in portions, can be optimized with information related to fingerprint motion that occurs while a fingerprint surface is being read. Also, using this unique motion detection technology, optimal navigation functions can be provided that demand significantly less power than conventional devices. Such navigation functions can enable a low power navigation device to be integrated in a portable device system, such as a mouse pad used to move a cursor across a graphical user interface (GUI) on portable electronic devices including cellular phones, laptop computers, personal data assistants (PDAs), and other devices where low power navigation functions are desired. A novel system and method are provided that use minimal space and processing resources in providing accurate motion detection, from which fingerprint sensors as well as navigation systems can greatly benefit.
A device or system configured according to the invention can be implemented as a stand-alone navigation device, or as a device to provide image reconstruction information for use with a line imaging device that matches and assembles a fingerprint image. Such a line imaging device may be any imaging device configured to sense and capture portions of a fingerprint, whether it captures individual perpendicular image lines of a fingerprint or multiple perpendicular lines. In operation, a motion detection device can operate as a separate motion detection and/or direction detection device. Alternatively, a motion detection device can be used in conjunction with a line imaging device to more accurately and efficiently sense, capture, store and reconstruct a fingerprint image. A device configured according to the invention may include a single array of finger ridge sensing pixels or data sensor points centrally located along the principal axis of motion to be detected, a sampling system to periodically sample the finger contact across the array, and a computational module or element that compares two sets of samples collected at different times to determine the distance traveled between the two sample times. According to the invention, the motion sensor pixels do not necessarily need to have the same resolution as the line imager. The motion sensor pixels may in fact use a different sensing technique than the imager.
Again, the invention provides separate operations for detecting motion and for sensing and capturing a fingerprint image. Thus, the techniques used for the separate processes can be the same or may be different depending on the application. Those skilled in the art will understand that different variations of the separate processes are possible using known techniques and techniques can be derived without any undue experimentation. Such variations would not depart from the spirit and scope of the invention.
In another embodiment, the invention provides the capability of multi-axis motion sensing with additional off-axis sensing arrays. In this embodiment, there are two or more (a plurality of) sensor arrays for detecting motion, and each axis is independently measured to determine the component of velocity in that axis. The velocity components from the individual axes are used to compute a vector sum to determine the actual direction and velocity of motion of the finger with respect to the sensor surface. According to the invention, it is not necessary to capture the full image of the fingerprint in order to determine the distance traveled and the velocity. It is only necessary to capture a linear sample of fingerprint features along the line of motion of the fingerprint. In one embodiment, a plurality of samples, such as two or three samples, are captured by motion sensor pixels and are used to determine the distance traveled across the axis of motion of the fingerprint relative to the sensor surface and the velocity at which the motion occurs. This information can also be used in navigational operations, and can further be used in combination with a fingerprint imager to aid in reconstructing a fingerprint image. Utilizing the invention, either application can be configured in an economical and useful manner. Moreover, the operation of such a sensor or navigational device can be optimized to consume substantially less power than conventional devices, which require excessive processor operations for reassembly of the fingerprint image. And, given the motion information generated by a system configured according to the invention, the distance traveled and velocity of the fingerprint can be used to more accurately and efficiently reconstruct a full fingerprint.
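The vector computation may be illustrated with a short sketch. The following fragment is an illustrative assumption rather than the patent's prescribed method: it treats each array's measurement as the projection of the true velocity onto that array's axis and solves the resulting two-axis system to recover the actual speed and direction of finger motion.

```python
import math

# Hypothetical sketch: each linear array reports the component of finger
# velocity along its own axis. Given two non-parallel axes, the true
# velocity v = (vx, vy) satisfies v . u_i = m_i for each axis unit vector
# u_i; the 2x2 system is solved here by Cramer's rule.

def resolve_motion(a1_deg, m1, a2_deg, m2):
    a1, a2 = math.radians(a1_deg), math.radians(a2_deg)
    det = math.cos(a1) * math.sin(a2) - math.sin(a1) * math.cos(a2)
    vx = (m1 * math.sin(a2) - m2 * math.sin(a1)) / det
    vy = (m2 * math.cos(a1) - m1 * math.cos(a2)) / det
    return math.hypot(vx, vy), math.degrees(math.atan2(vy, vx))

# Example: arrays at 0 and 90 degrees measure 8.66 and 5.0 units/s,
# which resolves to ~10 units/s at ~30 degrees from the first axis.
speed, direction_deg = resolve_motion(0.0, 8.66, 90.0, 5.0)
```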
Aligning the pixels along the axis of motion, rather than perpendicular to it, enables the use of motion detection algorithms that can be both time-variant and distance-variant. This enables development of algorithms that use short-distance measurements over long time periods for low-speed motion and longer-distance measurements to more accurately measure higher-speed motion, thus optimizing response time and accuracy. Both embodiments share the advantages gained by acquiring and comparing multiple spatial measurements of the fingerprint pattern at each sampling instance. Because multiple samples are taken and compared simultaneously, the effects of sampling error, due both to noise and to imprecision in the sampling of the finger pattern, are minimized. Also, because samples are taken at multiple locations along the axis of motion simultaneously at each sampling period, the images from two sampling periods can be compared to detect whether there has been any significant finger motion between the two sample times. One shared advantage is that both systems are capable of detecting under-sampling of the image being acquired by the line imager, as a consequence of their ability to detect motion of multiple pixels in a short time interval.
An embodiment using a single segmented motion sensor array offers the advantage of detecting motion over a shorter range of distance. This provides faster response time, particularly at low finger speeds that may be encountered in navigation applications. Because this embodiment is sensitive to single pixel motion, it provides unique features that may also reduce the memory requirements for the computational elements. In order to provide a navigation device, as well as to detect and correct for finger motion that is not completely aligned with the desired axis, either of the embodiments may be combined in ensembles such that one sensor is aligned on the axis of motion, and additional sensors aligned at an angle (such as 22.5 or 30 degrees) to the principal axis of finger motion. Examples of different embodiments are discussed below.
Referring to
The system further includes a sensor module 102 that is used to sense the fingerprint surface 106 of a user's finger 104 when it is moved across fingerprint sensing surface 108. As can be seen, the fingerprint sensing surface 108 is illustrated as a narrow surface that is designed to sense and capture portions of a fingerprint as it moves across the sensor. These portions can be subsequently reconstructed according to the invention using motion information from the motion sensors 110, 112. Thus, the sensor components illustrated in
Referring to
Referring to
Referring to
Referring to
Referring again to
According to another embodiment 102(a) of the invention illustrated in
Referring to
Referring to
Referring to
Referring to
Thus, if a user strokes a fingerprint surface against a motion sensor surface, the arrays can pick up the motion and direction information, and a processor can process the information to generate relative motion and direction information for use in navigation, such as for a computer mouse. In this example, a user can move a finger to control a cursor on a graphical user interface (GUI), such as that of a computer screen, a cellular phone, a personal data assistant (PDA) or other personal device. The navigation sensor can then cause the cursor to move relative to the fingerprint motion, and a user can navigate across the GUI to operate functions on a computer or other device. Since the motion of the cursor is relative to the movement of the fingerprint surface against the navigation sensor, relatively small movements can translate to equal, lesser or even greater distance movements of the cursor.
One aspect of the invention that is very useful to navigation configurations is the ability to consistently generate a motion result. As discussed above, the invention provides a means to substantially ensure a result when a fingerprint moves across a motion sensor. This is true for single array motion sensors as well as multiple array sensors used for two-dimensional motion processing. In a navigation application, such a configuration can provide accurate and consistent motion and directional information that allows for smooth and reliable navigational operations.
Referring to
Referring to
Referring to
In one embodiment, in order to support motion at any arbitrary angle, sensor arrays may be oriented at approximately 0, 30, 60, 90, 120, and 150 degrees. Another more robust system might space them at 22.5 degree increments rather than 30. Once motion reaches 180 degrees, the process can use reverse motion on the zero degree sensor array, and so on. A device configured in this way would have some of the properties of a navigation touchpad such as those used in laptop computers, with the relative motion sensing capability of a computer mouse.
Referring to
The readout circuit includes an amplifier 256 configured to amplify the analog signal so that it can be read more accurately in subsequent operations. Low pass filter 258 is configured to filter out any noise from the analog signal so that the analog signal can be more efficiently processed. The readout circuit further includes an analog to digital converter 260 that is configured to convert the output signal from the sensor element to a digital signal that indicates a series of logic 0s and 1s that define the sensing of the fingerprint features by the pixels or data contact points of the sensor surface 107. Such signals may be separately received by the motion sensors and the fingerprint sensing surfaces as discussed in the embodiments above, and may be read out and processed separately. The readout circuit may store the output signal in storage 262, where fingerprint data 264 is stored and preserved, either temporarily until the processor 266 can process the signal, or for later use by the processor. The processor 266 includes arithmetic unit 268 configured to process algorithms used for navigation of a cursor, such as that described in connection with navigation features of
At each sample time, the state of the sense elements is converted to a series of numerical values from digitized segments 203a, 203b. For the sake of simplification, digitized segments 203a, 203b show a binary digitization, indicating the presence or absence of a ridge. The sensor values may be encoded with higher precision if the chosen sensor methodology allows. Because the two image samples 203a and 203b were taken along the axis of motion 106 at different times, they may be sequentially shifted and compared against each other until a match is found for an absolute distance of motion D in the period between the samples T, resulting in a direct finger velocity measurement D/T.
Unlike conventional systems and methods, the system does not have to accumulate a large time history when no motion is detected between samples 203a and 203b. It can simply retain the earlier sample 203a and perform a new computation when the next sample is acquired. This is advantageous in the case where there is no prior knowledge of the approximate speed of the finger. Often in practice, the finger velocity relative to the sensory surface may vary greatly. The invention eliminates the need for a large buffer of samples to cover a wide dynamic range of finger speeds.
A further advantage offered by the invention is the ability to adjust the sample rate, and therefore the distance of motion traveled between samples, as a function of finger velocity. As the finger velocity increases, the number of sample periods required to traverse between two adjacent pixels decreases. This effectively decreases the resolution of a velocity measurement, and as the uncertainty of the measurement approaches the measurement period, all resolution is lost. Accordingly, in order to maintain the accuracy of the estimated velocity, the measurement system may adjust the sample rate to optimize the distance traveled when looking for a match between two frames. For example, requiring ten pixels of motion at fast finger swipe speeds can ensure 10% accuracy in velocity measurements. Conversely, as the finger velocity decreases, the number of time samples required to travel a significant distance increases. In this case, the system could decrease the sample rate and reduce the distance traveled for a match to as little as one pixel. This would provide a significantly more rapid response to motion changes in navigation applications and would better track finger velocity changes used to reconstruct two-dimensional images from a one-dimensional sensor. Those skilled in the art will understand that there are various methods for changing the sample rate in order to achieve these and other objectives; the invention is not limited to any particular method, and moreover is inclusive of the various known methods as well as methods readily ascertainable by one skilled in the art without undue experimentation.
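One such rate-adjustment policy is sketched below, under stated assumptions: the speed threshold, rate limits, and proportional update rule are illustrative, since the text only calls for targeting roughly ten pixels of travel per comparison at fast swipe speeds and as little as one pixel at very low speeds.

```python
# Hypothetical proportional controller for the adaptive sampling
# described above.

MIN_RATE_HZ, MAX_RATE_HZ = 50.0, 5000.0    # assumed hardware limits

def choose_target_px(speed_px_per_s, fast_threshold=500.0):
    # ~10 px of travel per frame at fast speeds (~10% velocity
    # resolution); 1 px at low speeds for rapid navigation response.
    return 10 if speed_px_per_s >= fast_threshold else 1

def next_sample_rate(current_rate_hz, measured_shift_px, speed_px_per_s):
    target = choose_target_px(speed_px_per_s)
    if measured_shift_px == 0:
        # No detectable motion: slow the sampling down until at least
        # one pixel of travel separates successive samples.
        proposed = current_rate_hz / 2.0
    else:
        # Scale the rate so the next per-frame shift lands near the target.
        proposed = current_rate_hz * measured_shift_px / target
    return max(MIN_RATE_HZ, min(MAX_RATE_HZ, proposed))
```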
FIGS. 11a and 11b show the digitization results sampled at two instances 303a and 303b as the finger moves in a downward direction 306. In this example, the finger has traveled downward approximately seven pixels between samples 303a and 303b.
The match results show a strong correlation with the actual motion of seven pixels of vertical distance clearly distinguished in just one sample pair, even though the ridge frequency is fairly uniform for the selected segment of the fingerprint. It should also be clear to those knowledgeable in the art that the accuracy of the match would be significantly enhanced by additional levels of gray scale in the pixel data.
FIGS. 12B(a) and 12B(b) depict an embodiment of the invention that includes three linear arrays disposed at different angles to measure motion across a range of angles from the principal axis (in this case +/−25 degrees from the main axis). The central imaging array 302 is augmented with an array 301 oriented at a −25 degree angle to the central axis and an array 303 oriented at a +25 degree angle to the central axis. It will be understood by those skilled in the art that, given this disclosure, various different angles of the arrays can be implemented, as well as different numbers of arrays.
In FIG. 12B(b) we see the image of a fingerprint at the initial starting position superimposed on the sensor arrays, and the resulting binary images 304a, 305a, and 306a with the finger in the initial position. In FIG. 12B(a), the finger has moved a short distance at an approximately +25 degree angle shown between positions 310 and 311, and the resulting binary images are shown in 304b, 305b, and 306b. The following table shows the results of binary comparison for the pairings of 304a/304b, 305a/305b, and 306a/306b using the shift and compare method previously described:
Because the motion principally follows the axis of sensor 303, the correlation for the pairing 306a/306b is strong at the correct six pixel distance, but the pairings 304a/304b, and 305a/305b show weak correlation. When the direction of motion is at an angle between the axes of any two of the sensor arrays, a correlation will be found in both of the sensors, and the true motion will be found by taking the vector sum of the estimates from the two sensors.
The example above covers the simple case where the motion is completely aligned with one of the sensor axes. In the case of motion that lies between two axes, the distance a feature travels along a sensor array will be less than the entire length of the sensor. To detect motion across a range of angles, sensor arrays must be provided at a series of angles disposed so that a match will be found on at least two of the sensor arrays. For example, by arranging the arrays in 30 degree increments across the allowable range of motion axes, it is possible to ensure that even with worst case alignment (i.e., a 15 degree misalignment between the actual axis of motion and the two sensor arrays on either side of it), an image feature will still approximately follow the nearest sensor arrays for more than three pixels of travel. Thus, by sampling the sensor arrays fast enough to ensure that the finger has not traveled more than three pixels between samples, it is possible to determine the axis of motion by finding the adjacent pair of sensors with the highest correlation, and computing the vector sum of the distances traveled along each of them.
Referring to
In step 1210, a similarity score is generated, defining the amount of correlation between the two arrays. This may be in the form of a probability value, a percentage correlation value, or another mathematical value that can be used by the processor to determine the best similarity score among different comparisons. In step 1212, it is determined whether the similarity score falls within a threshold. In one embodiment, the threshold is a predetermined number that is decided according to a particular application. In practice, the invention can be configured to produce correlations that are of a high value, thus justifying a high threshold. Those skilled in the art will understand that such a threshold can be determined without undue experimentation, and that it depends on the application. If the score does not fall within the threshold, then the arrays are shifted to offset alignment in step 1214. The direction of the shifting may be chosen according to a predicted direction in which a user would be expected to move the fingerprint surface across the sensor. If the direction is not known, or if the design calls for either direction, flexibility can be accommodated by shifting the arrays in multiple directions until an alignment is reached that is within the threshold. In either case, the process returns to step 1208, where the arrays are compared again. A new similarity score is generated in step 1210, and the new score is measured against the threshold. This process can be reiterated until a score passes the threshold, and could register an error if the threshold is not met within a set time or a predetermined number of cycles. In a practical application, the two arrays can be shifted and processed once for each pixel in one array, since the arrays are equal in length, having been taken from the same array. If a score occurs that is within the threshold, then the distance is estimated in step 1216. This can be done by simply counting the number of pixels by which the arrays were shifted before a score occurred within the threshold, and multiplying this number by the distance between pixels, which can be estimated as the distance between the midpoints of two pixels. The distance can be accurately measured by sampling distances between individual pixels and groups of pixels in an array, but the exact method of measurement would depend on the application. Then, the velocity can be estimated in step 1218 by dividing the distance traveled by the time expended during the travel. The process ends at step 1220, where an estimated velocity value can be generated.
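The threshold-driven loop of steps 1208 through 1220 might be realized as in the following sketch, where the threshold value, pixel pitch, and single-direction shift search are illustrative assumptions rather than values taken from this description:

```python
# Hypothetical sketch of steps 1208-1220: compare (1208), score (1210),
# test against the threshold (1212), shift and retry (1214), then convert
# the accepted shift to distance (1216) and velocity (1218).

THRESHOLD = 0.9          # assumed similarity threshold; application-specific
PIXEL_PITCH_MM = 0.05    # assumed distance between pixel midpoints

def estimate_velocity(sample_t0, sample_t1, period_s):
    n = len(sample_t0)
    for shift in range(n):                                   # one try per pixel
        a, b = sample_t0[shift:], sample_t1[:n - shift]
        score = sum(x == y for x, y in zip(a, b)) / len(a)   # similarity score
        if score >= THRESHOLD:
            distance_mm = shift * PIXEL_PITCH_MM             # step 1216
            return distance_mm / period_s                    # step 1218 (mm/s)
    return None   # no alignment met the threshold: caller may register an error
```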
Referring to
Continuing, in step 1308, the two arrays are compared. In an initial alignment, referring briefly to
Then the distance is estimated in step 1318. Again, this can be done by simply counting the number of pixels by which the arrays were shifted before a score occurred within the threshold, and multiplying this number by the distance between pixels, which can be estimated as the distance between the midpoints of two pixels. The distance can be accurately measured by sampling distances between individual pixels and groups of pixels in an array, but the exact method of measurement would depend on the application. Then, the velocity can be estimated in step 1320 by dividing the distance traveled by the time expended during the travel. The process ends in step 1322, where a velocity value can be generated.
Referring to
Continuing, in step 1408, the two arrays are compared for each sensor. In an initial alignment, referring briefly to
Then the distance is estimated in step 1418. Again, this can be done by simply counting the number of pixels by which the arrays were shifted, and multiplying this number by the distance between pixels, which can be estimated as the distance between the midpoints of two pixels. The distance can be accurately measured by sampling distances between individual pixels and groups of pixels in an array, but the exact method of measurement would depend on the application.
If the similarity score for either of the adjacent arrays exceeds the threshold and this similarity score occurs at a distance less than the distance traveled on the predominant axis, then the principal axis of motion is assumed to lie between the predominant axis and this second axis. The angle of motion is then estimated by computing the ratio of distances along the predominant and secondary axes. The ratio of these distances is approximately equal to the ratio of the cosines of the angles between the actual axis of motion and the axes of the two sensor arrays.
The final estimated distance is computed by taking the distance measured on the predominant axis sensor and dividing it by the cosine of the difference between the estimated angle of motion and the angle of the sensor axis.
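Combining the two relationships just stated, the cosine-ratio relation and the division by the cosine of the angular difference, the motion angle can be solved in closed form. The sketch below is one derivation-based illustration of that computation, not the prescribed implementation; the function name and example values are assumptions.

```python
import math

# Hypothetical sketch: given distances measured along the predominant and
# secondary sensor axes, estimate the true angle of motion and the true
# distance. With phi the angle between the motion axis and the predominant
# sensor axis, and delta the angle between the two sensor axes, the cosine
# ratio d_sec/d_pred = cos(phi - delta)/cos(phi) rearranges to
#   tan(phi) = (d_sec/d_pred - cos(delta)) / sin(delta).

def estimate_motion(d_pred, a_pred_deg, d_sec, a_sec_deg):
    delta = math.radians(a_sec_deg - a_pred_deg)
    r = d_sec / d_pred                          # ratio of cosines (see text)
    phi = math.atan2(r - math.cos(delta), math.sin(delta))
    true_distance = d_pred / math.cos(phi)      # final step from the text
    return a_pred_deg + math.degrees(phi), true_distance

# Example: sensors at 0 and 30 degrees; true motion of 12 units at 10
# degrees projects to ~11.82 and ~11.28 units on the two axes, and the
# estimate recovers ~10 degrees and ~12.0 units.
angle_deg, distance = estimate_motion(11.82, 0.0, 11.28, 30.0)
```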
Then, the velocity can be estimated in step 1420 by dividing the distance traveled by the time expended during the travel. The process ends in step 1422 where a velocity value can be generated.
The invention may also involve a number of functions to be performed by a computer processor, such as a microprocessor. The microprocessor may be a specialized or dedicated microprocessor that is configured to perform particular tasks by executing machine-readable software code that defines the particular tasks. The microprocessor may also be configured to operate and communicate with other devices such as direct memory access modules, memory storage devices, Internet-related hardware, and other devices that relate to the transmission of data in accordance with the invention. The software code may be configured using software formats such as Java, C++, XML (Extensible Markup Language) and other languages that may be used to define functions that relate to operations of devices required to carry out the functional operations related to the invention. The code may be written in different forms and styles, many of which are known to those skilled in the art. Different code formats, code configurations, styles and forms of software programs and other means of configuring code to define the operations of a microprocessor in accordance with the invention will not depart from the spirit and scope of the invention.
Within the different types of computers, such as computer servers, that utilize the invention, there exist different types of memory devices for storing and retrieving information while performing functions according to the invention. Cache memory devices are often included in such computers for use by the central processing unit as a convenient storage location for information that is frequently stored and retrieved. Similarly, a persistent memory is also frequently used with such computers for maintaining information that is frequently retrieved by a central processing unit, but that is not often altered within the persistent memory, unlike the cache memory. Main memory is also usually included for storing and retrieving larger amounts of information such as data and software applications configured to perform functions according to the invention when executed by the central processing unit. These memory devices may be configured as random access memory (RAM), static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, and other memory storage devices that may be accessed by a central processing unit to store and retrieve information. The invention is not limited to any particular type of memory device, or any commonly used protocol for storing and retrieving information to and from these memory devices respectively.
The apparatus and method described herein enable and control fingerprint sensors and the handling of fingerprint image data and motion data in conjunction with the operation of an electronic device where navigation and fingerprint verification processes are utilized. Although this embodiment is described and illustrated in the context of devices, systems and related methods for imaging fingerprints and providing navigation features for a portable device, the scope of the invention extends to other applications where such functions are useful. Furthermore, while the foregoing description has been with reference to particular embodiments of the invention, it will be appreciated that these are only illustrative of the invention and that changes may be made to those embodiments without departing from the principles of the invention.
Number | Name | Date | Kind |
---|---|---|---|
4151512 | Riganati et al. | Apr 1979 | A |
4225850 | Chang et al. | Sep 1980 | A |
4310827 | Asi | Jan 1982 | A |
4353056 | Tsikos | Oct 1982 | A |
4405829 | Rivest et al. | Sep 1983 | A |
4525859 | Bowles et al. | Jun 1985 | A |
4550221 | Mabusth | Oct 1985 | A |
4580790 | Doose | Apr 1986 | A |
4582985 | Loftberg | Apr 1986 | A |
4675544 | Shrenk | Jun 1987 | A |
4758622 | Gosselin | Jul 1988 | A |
4817183 | Sparrow | Mar 1989 | A |
5076566 | Kriegel | Dec 1991 | A |
5109427 | Yang | Apr 1992 | A |
5140642 | Hau et al. | Aug 1992 | A |
5305017 | Gerpheide | Apr 1994 | A |
5319323 | Fong | Jun 1994 | A |
5325442 | Knapp | Jun 1994 | A |
5359243 | Norman | Oct 1994 | A |
5420936 | Fitzpatrick et al. | May 1995 | A |
5422807 | Mitra et al. | Jun 1995 | A |
5429006 | Tamori | Jul 1995 | A |
5456256 | Schneider et al. | Oct 1995 | A |
5543591 | Gillespie et al. | Aug 1996 | A |
5569901 | Bridgelall et al. | Oct 1996 | A |
5623552 | Lane | Apr 1997 | A |
5627316 | De Winter et al. | May 1997 | A |
5650842 | Maase et al. | Jul 1997 | A |
5717777 | Wong et al. | Feb 1998 | A |
5781651 | Hsiao et al. | Jul 1998 | A |
5801681 | Sayag | Sep 1998 | A |
5818956 | Tuli | Oct 1998 | A |
5838306 | O'Connor | Nov 1998 | A |
5848176 | Harra et al. | Dec 1998 | A |
5850450 | Schweitzer et al. | Dec 1998 | A |
5852670 | Setlak et al. | Dec 1998 | A |
5864296 | Upton | Jan 1999 | A |
5887343 | Salatino et al. | Mar 1999 | A |
5892824 | Beatson et al. | Apr 1999 | A |
5903225 | Schmitt et al. | May 1999 | A |
5915757 | Tsuyama et al. | Jun 1999 | A |
5920384 | Borza | Jul 1999 | A |
5920640 | Salatino et al. | Jul 1999 | A |
5940526 | Setlak et al. | Aug 1999 | A |
5963679 | Setlak | Oct 1999 | A |
5995630 | Borza | Nov 1999 | A |
5999637 | Toyoda et al. | Dec 1999 | A |
6002815 | Immega et al. | Dec 1999 | A |
6011859 | Kalnitsky et al. | Jan 2000 | A |
6016355 | Dickinson et al. | Jan 2000 | A |
6052475 | Upton | Apr 2000 | A |
6067368 | Setlak et al. | May 2000 | A |
6073343 | Petrick et al. | Jun 2000 | A |
6076566 | Lowe | Jun 2000 | A |
6088585 | Schmitt et al. | Jul 2000 | A |
6098175 | Lee | Aug 2000 | A |
6118318 | Fifield et al. | Sep 2000 | A |
6134340 | Hsu et al. | Oct 2000 | A |
6157722 | Lerner et al. | Dec 2000 | A |
6161213 | Lofstrom | Dec 2000 | A |
6175407 | Santor | Jan 2001 | B1 |
6182076 | Yu et al. | Jan 2001 | B1 |
6182892 | Angelo et al. | Feb 2001 | B1 |
6185318 | Jain et al. | Feb 2001 | B1 |
6234031 | Suga | May 2001 | B1 |
6241288 | Bergenek et al. | Jun 2001 | B1 |
6259108 | Antonelli et al. | Jul 2001 | B1 |
6289114 | Mainguet | Sep 2001 | B1 |
6292272 | Okauchi et al. | Sep 2001 | B1 |
6317508 | Kramer et al. | Nov 2001 | B1 |
6320394 | Tartagni | Nov 2001 | B1 |
6325285 | Baratelli | Dec 2001 | B1 |
6327376 | Harkin | Dec 2001 | B1 |
6332193 | Glass et al. | Dec 2001 | B1 |
6333989 | Borza | Dec 2001 | B1 |
6337919 | Dunton | Jan 2002 | B1 |
6343162 | Saito et al. | Jan 2002 | B1 |
6346739 | Lepert et al. | Feb 2002 | B1 |
6347040 | Fries et al. | Feb 2002 | B1 |
6357663 | Takahashi et al. | Mar 2002 | B1 |
6360004 | Akizuki | Mar 2002 | B1 |
6362633 | Tartagni | Mar 2002 | B1 |
6392636 | Ferrari et al. | May 2002 | B1 |
6399994 | Shobu | Jun 2002 | B2 |
6400836 | Senior | Jun 2002 | B2 |
6473072 | Comiskey et al. | Oct 2002 | B1 |
6509501 | Eicken et al. | Jan 2003 | B2 |
6525547 | Hayes | Feb 2003 | B2 |
6525932 | Ohnishi et al. | Feb 2003 | B1 |
6539101 | Black | Mar 2003 | B1 |
6580816 | Kramer et al. | Jun 2003 | B2 |
6597289 | Sabatini | Jul 2003 | B2 |
6628812 | Setlak et al. | Sep 2003 | B1 |
6631201 | Dickinson et al. | Oct 2003 | B1 |
6643389 | Raynal et al. | Nov 2003 | B1 |
6672174 | Deconde et al. | Jan 2004 | B2 |
6710461 | Chou et al. | Mar 2004 | B2 |
6738050 | Comiskey et al. | May 2004 | B2 |
6741729 | Bjorn et al. | May 2004 | B2 |
6757002 | Oross et al. | Jun 2004 | B1 |
6766040 | Catalano et al. | Jul 2004 | B1 |
6785407 | Tschudi et al. | Aug 2004 | B1 |
6799275 | Bjorn et al. | Sep 2004 | B1 |
6836230 | Le Pailleur et al. | Dec 2004 | B2 |
6838905 | Doyle | Jan 2005 | B1 |
6873356 | Kanbe et al. | Mar 2005 | B1 |
6886104 | McClurg et al. | Apr 2005 | B1 |
6897002 | Teraoka et al. | May 2005 | B2 |
6898299 | Brooks | May 2005 | B1 |
6924496 | Manansala | Aug 2005 | B2 |
6937748 | Schneider et al. | Aug 2005 | B1 |
6941001 | Bolle et al. | Sep 2005 | B1 |
6941810 | Okada | Sep 2005 | B2 |
6950540 | Higuchi | Sep 2005 | B2 |
6959874 | Bardwell | Nov 2005 | B2 |
6963626 | Shaeffer et al. | Nov 2005 | B1 |
6970584 | O'Gorman et al. | Nov 2005 | B2 |
6980672 | Saito et al. | Dec 2005 | B2 |
6983882 | Cassone | Jan 2006 | B2 |
7013030 | Wong et al. | Mar 2006 | B2 |
7020591 | Wei et al. | Mar 2006 | B1 |
7030860 | Hsu et al. | Apr 2006 | B1 |
7031670 | May | Apr 2006 | B2 |
7035443 | Wong | Apr 2006 | B2 |
7042535 | Katoh et al. | May 2006 | B2 |
7043061 | Hamid et al. | May 2006 | B2 |
7043644 | DeBruine | May 2006 | B2 |
7046230 | Zadesky et al. | May 2006 | B2 |
7064743 | Nishikawa | Jun 2006 | B2 |
7099496 | Benkley, III | Aug 2006 | B2 |
7110574 | Haruki et al. | Sep 2006 | B2 |
7110577 | Tschudi | Sep 2006 | B1 |
7113622 | Hamid | Sep 2006 | B2 |
7126389 | McRae et al. | Oct 2006 | B1 |
7129926 | Mathiassen et al. | Oct 2006 | B2 |
7136514 | Wong | Nov 2006 | B1 |
7146024 | Benkley | Dec 2006 | B2 |
7146026 | Russo et al. | Dec 2006 | B2 |
7146029 | Manansala | Dec 2006 | B2 |
7184581 | Johansen et al. | Feb 2007 | B2 |
7190209 | Kang et al. | Mar 2007 | B2 |
7190816 | Mitsuyu et al. | Mar 2007 | B2 |
7194392 | Tuken et al. | Mar 2007 | B2 |
7197168 | Russo | Mar 2007 | B2 |
7200250 | Chou | Apr 2007 | B2 |
7251351 | Mathiassen et al. | Jul 2007 | B2 |
7258279 | Schneider et al. | Aug 2007 | B2 |
7260246 | Fujii | Aug 2007 | B2 |
7263212 | Kawabe | Aug 2007 | B2 |
7263213 | Rowe | Aug 2007 | B2 |
7289649 | Walley et al. | Oct 2007 | B1 |
7290323 | Deconde et al. | Nov 2007 | B2 |
7308121 | Mathiassen et al. | Dec 2007 | B2 |
7308122 | McClurg et al. | Dec 2007 | B2 |
7321672 | Sasaki et al. | Jan 2008 | B2 |
7356169 | Hamid | Apr 2008 | B2 |
7360688 | Harris | Apr 2008 | B1 |
7369685 | DeLeon | May 2008 | B2 |
7379569 | Chikazawa et al. | May 2008 | B2 |
7408135 | Fujieda | Aug 2008 | B2 |
7409876 | Ganapathi et al. | Aug 2008 | B2 |
7412083 | Takahashi | Aug 2008 | B2 |
7424618 | Roy et al. | Sep 2008 | B2 |
7447339 | Mimura et al. | Nov 2008 | B2 |
7447911 | Chou et al. | Nov 2008 | B2 |
7460697 | Erhart et al. | Dec 2008 | B2 |
7463756 | Benkley | Dec 2008 | B2 |
7505611 | Fyke | Mar 2009 | B2 |
7505613 | Russo | Mar 2009 | B2 |
7565548 | Fiske et al. | Jul 2009 | B2 |
7574022 | Russo | Aug 2009 | B2 |
7596832 | Hsieh et al. | Oct 2009 | B2 |
7643950 | Getzin et al. | Jan 2010 | B1 |
7681232 | Nordentoft et al. | Mar 2010 | B2 |
7689013 | Shinzaki | Mar 2010 | B2 |
7706581 | Drews et al. | Apr 2010 | B2 |
7733697 | Picca et al. | Jun 2010 | B2 |
7751601 | Benkley | Jul 2010 | B2 |
7826645 | Cayen | Nov 2010 | B1 |
7843438 | Onoda | Nov 2010 | B2 |
7848798 | Martinsen et al. | Dec 2010 | B2 |
7899216 | Watanabe et al. | Mar 2011 | B2 |
7953258 | Dean et al. | May 2011 | B2 |
8005276 | Dean et al. | Aug 2011 | B2 |
8031916 | Abiko et al. | Oct 2011 | B2 |
8063734 | Conforti | Nov 2011 | B2 |
8077935 | Geoffroy et al. | Dec 2011 | B2 |
8107212 | Nelson et al. | Jan 2012 | B2 |
8116540 | Dean et al. | Feb 2012 | B2 |
8131026 | Benkley et al. | Mar 2012 | B2 |
8165355 | Benkley et al. | Apr 2012 | B2 |
8175345 | Gardner | May 2012 | B2 |
8204281 | Satya et al. | Jun 2012 | B2 |
8224044 | Benkley | Jul 2012 | B2 |
8229184 | Benkley | Jul 2012 | B2 |
8276816 | Gardner | Oct 2012 | B2 |
8278946 | Thompson | Oct 2012 | B2 |
8290150 | Erhart et al. | Oct 2012 | B2 |
8315444 | Gardner | Nov 2012 | B2 |
8331096 | Garcia | Dec 2012 | B2 |
8358815 | Benkley et al. | Jan 2013 | B2 |
20010026636 | Mainguet | Oct 2001 | A1 |
20010030644 | Allport | Oct 2001 | A1 |
20010036299 | Senior | Nov 2001 | A1 |
20010043728 | Kramer et al. | Nov 2001 | A1 |
20020025062 | Black | Feb 2002 | A1 |
20020061125 | Fujii | May 2002 | A1 |
20020064892 | Lepert et al. | May 2002 | A1 |
20020067845 | Griffis | Jun 2002 | A1 |
20020073046 | David | Jun 2002 | A1 |
20020089044 | Simmons et al. | Jul 2002 | A1 |
20020089410 | Janiak et al. | Jul 2002 | A1 |
20020096731 | Wu et al. | Jul 2002 | A1 |
20020122026 | Bergstrom | Sep 2002 | A1 |
20020126516 | Jeon | Sep 2002 | A1 |
20020133725 | Roy et al. | Sep 2002 | A1 |
20020152048 | Hayes | Oct 2002 | A1 |
20020181749 | Matsumoto et al. | Dec 2002 | A1 |
20030002717 | Hamid | Jan 2003 | A1 |
20030002719 | Hamid et al. | Jan 2003 | A1 |
20030021495 | Cheng | Jan 2003 | A1 |
20030035570 | Benkley | Feb 2003 | A1 |
20030063782 | Acharya et al. | Apr 2003 | A1 |
20030068072 | Hamid | Apr 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030076303 | Huppi | Apr 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030095690 | Su et al. | May 2003 | A1 |
20030123714 | O'Gorman et al. | Jul 2003 | A1 |
20030123715 | Uchida | Jul 2003 | A1 |
20030141959 | Keogh et al. | Jul 2003 | A1 |
20030147015 | Katoh et al. | Aug 2003 | A1 |
20030161510 | Fujii | Aug 2003 | A1 |
20030161512 | Mathiassen et al. | Aug 2003 | A1 |
20030169228 | Mathiassen et al. | Sep 2003 | A1 |
20030174871 | Yoshioka et al. | Sep 2003 | A1 |
20030186157 | Teraoka et al. | Oct 2003 | A1 |
20030209293 | Sako et al. | Nov 2003 | A1 |
20030224553 | Manansala | Dec 2003 | A1 |
20040012773 | Puttkammer | Jan 2004 | A1 |
20040017934 | Kocher et al. | Jan 2004 | A1 |
20040022001 | Chu et al. | Feb 2004 | A1 |
20040042642 | Bolle et al. | Mar 2004 | A1 |
20040050930 | Rowe | Mar 2004 | A1 |
20040066613 | Leitao | Apr 2004 | A1 |
20040076313 | Bronstein et al. | Apr 2004 | A1 |
20040081339 | Benkley | Apr 2004 | A1 |
20040096086 | Miyasaka | May 2004 | A1 |
20040113956 | Bellwood et al. | Jun 2004 | A1 |
20040120400 | Linzer | Jun 2004 | A1 |
20040125993 | Zhao et al. | Jul 2004 | A1 |
20040129787 | Saito | Jul 2004 | A1 |
20040136612 | Meister et al. | Jul 2004 | A1 |
20040155752 | Radke | Aug 2004 | A1 |
20040172339 | Snelgrove et al. | Sep 2004 | A1 |
20040179718 | Chou | Sep 2004 | A1 |
20040184641 | Nagasaka et al. | Sep 2004 | A1 |
20040188838 | Okada et al. | Sep 2004 | A1 |
20040190761 | Lee | Sep 2004 | A1 |
20040208346 | Baharav et al. | Oct 2004 | A1 |
20040208347 | Baharav et al. | Oct 2004 | A1 |
20040208348 | Baharav et al. | Oct 2004 | A1 |
20040213441 | Tschudi | Oct 2004 | A1 |
20040215689 | Dooley et al. | Oct 2004 | A1 |
20040228505 | Sugimoto | Nov 2004 | A1 |
20040228508 | Shigeta | Nov 2004 | A1 |
20040240712 | Rowe et al. | Dec 2004 | A1 |
20040252867 | Lan et al. | Dec 2004 | A1 |
20050031174 | Ryhanen et al. | Feb 2005 | A1 |
20050036665 | Higuchi | Feb 2005 | A1 |
20050047485 | Khayrallah et al. | Mar 2005 | A1 |
20050100196 | Scott et al. | May 2005 | A1 |
20050100938 | Hoffmann et al. | May 2005 | A1 |
20050109835 | Jacoby et al. | May 2005 | A1 |
20050110103 | Setlak | May 2005 | A1 |
20050111708 | Chou | May 2005 | A1 |
20050123176 | Ishii et al. | Jun 2005 | A1 |
20050129291 | Boshra | Jun 2005 | A1 |
20050136200 | Durell et al. | Jun 2005 | A1 |
20050139656 | Arnouse | Jun 2005 | A1 |
20050139685 | Kozlay | Jun 2005 | A1 |
20050162402 | Watanachote | Jul 2005 | A1 |
20050169503 | Howell et al. | Aug 2005 | A1 |
20050174015 | Scott et al. | Aug 2005 | A1 |
20050210271 | Chou et al. | Sep 2005 | A1 |
20050219200 | Weng | Oct 2005 | A1 |
20050220329 | Payne et al. | Oct 2005 | A1 |
20050231213 | Chou et al. | Oct 2005 | A1 |
20050238212 | Du et al. | Oct 2005 | A1 |
20050244038 | Benkley | Nov 2005 | A1 |
20050244039 | Geoffroy et al. | Nov 2005 | A1 |
20050247559 | Frey et al. | Nov 2005 | A1 |
20050249386 | Juh | Nov 2005 | A1 |
20050258952 | Utter et al. | Nov 2005 | A1 |
20050269402 | Spitzer et al. | Dec 2005 | A1 |
20060006224 | Modi | Jan 2006 | A1 |
20060055500 | Burke et al. | Mar 2006 | A1 |
20060066572 | Yumoto et al. | Mar 2006 | A1 |
20060078176 | Abiko et al. | Apr 2006 | A1 |
20060083411 | Benkley | Apr 2006 | A1 |
20060110537 | Huang et al. | May 2006 | A1 |
20060140461 | Kim et al. | Jun 2006 | A1 |
20060144953 | Takao | Jul 2006 | A1 |
20060170528 | Funushige et al. | Aug 2006 | A1 |
20060181521 | Perrault et al. | Aug 2006 | A1 |
20060182319 | Setlak et al. | Aug 2006 | A1 |
20060187200 | Martin | Aug 2006 | A1 |
20060210082 | Devadas et al. | Sep 2006 | A1 |
20060214512 | Iwata | Sep 2006 | A1 |
20060214767 | Carrieri | Sep 2006 | A1 |
20060239514 | Watanabe et al. | Oct 2006 | A1 |
20060249008 | Luther | Nov 2006 | A1 |
20060259873 | Mister | Nov 2006 | A1 |
20060261174 | Zellner et al. | Nov 2006 | A1 |
20060267125 | Huang et al. | Nov 2006 | A1 |
20060267385 | Steenwyk et al. | Nov 2006 | A1 |
20060271793 | Devadas et al. | Nov 2006 | A1 |
20060285728 | Leung et al. | Dec 2006 | A1 |
20060287963 | Steeves et al. | Dec 2006 | A1 |
20070031011 | Erhart et al. | Feb 2007 | A1 |
20070036400 | Watanabe et al. | Feb 2007 | A1 |
20070057763 | Blattner et al. | Mar 2007 | A1 |
20070058843 | Theis et al. | Mar 2007 | A1 |
20070067828 | Bychkov | Mar 2007 | A1 |
20070076926 | Schneider et al. | Apr 2007 | A1 |
20070076951 | Tanaka et al. | Apr 2007 | A1 |
20070086634 | Setlak et al. | Apr 2007 | A1 |
20070090312 | Stallinga et al. | Apr 2007 | A1 |
20070138299 | Mitra | Jun 2007 | A1 |
20070154072 | Taraba et al. | Jul 2007 | A1 |
20070160269 | Kuo | Jul 2007 | A1 |
20070180261 | Akkermans et al. | Aug 2007 | A1 |
20070196002 | Choi et al. | Aug 2007 | A1 |
20070198141 | Moore | Aug 2007 | A1 |
20070198435 | Siegal et al. | Aug 2007 | A1 |
20070228154 | Tran | Oct 2007 | A1 |
20070237366 | Maletsky | Oct 2007 | A1 |
20070237368 | Bjorn et al. | Oct 2007 | A1 |
20070248249 | Stoianov | Oct 2007 | A1 |
20070290124 | Neil et al. | Dec 2007 | A1 |
20080002867 | Mathiassen et al. | Jan 2008 | A1 |
20080013805 | Sengupta et al. | Jan 2008 | A1 |
20080019578 | Saito et al. | Jan 2008 | A1 |
20080049987 | Champagne et al. | Feb 2008 | A1 |
20080049989 | Iseri et al. | Feb 2008 | A1 |
20080063245 | Benkley et al. | Mar 2008 | A1 |
20080069412 | Champagne et al. | Mar 2008 | A1 |
20080126260 | Cox et al. | May 2008 | A1 |
20080169345 | Keane et al. | Jul 2008 | A1 |
20080170695 | Adler et al. | Jul 2008 | A1 |
20080175450 | Scott et al. | Jul 2008 | A1 |
20080178008 | Takahashi et al. | Jul 2008 | A1 |
20080179112 | Qin et al. | Jul 2008 | A1 |
20080185429 | Saville | Aug 2008 | A1 |
20080201265 | Hewton | Aug 2008 | A1 |
20080205714 | Benkley et al. | Aug 2008 | A1 |
20080219521 | Benkley et al. | Sep 2008 | A1 |
20080222049 | Loomis et al. | Sep 2008 | A1 |
20080223925 | Saito et al. | Sep 2008 | A1 |
20080226132 | Gardner | Sep 2008 | A1 |
20080240523 | Benkley et al. | Oct 2008 | A1 |
20080240537 | Yang et al. | Oct 2008 | A1 |
20080244277 | Orsini et al. | Oct 2008 | A1 |
20080267462 | Nelson et al. | Oct 2008 | A1 |
20080279373 | Erhart et al. | Nov 2008 | A1 |
20080317290 | Tazoe | Dec 2008 | A1 |
20090001999 | Douglas | Jan 2009 | A1 |
20090130369 | Huang et al. | May 2009 | A1 |
20090153297 | Gardner | Jun 2009 | A1 |
20090154779 | Satyan et al. | Jun 2009 | A1 |
20090155456 | Benkley et al. | Jun 2009 | A1 |
20090174974 | Huang et al. | Jul 2009 | A1 |
20090212902 | Haddock | Aug 2009 | A1 |
20090218698 | Lam | Sep 2009 | A1 |
20090237135 | Ramaraju et al. | Sep 2009 | A1 |
20090252383 | Adam et al. | Oct 2009 | A1 |
20090252384 | Dean et al. | Oct 2009 | A1 |
20090252385 | Dean et al. | Oct 2009 | A1 |
20090252386 | Dean et al. | Oct 2009 | A1 |
20090279742 | Abiko | Nov 2009 | A1 |
20090319435 | Little et al. | Dec 2009 | A1 |
20090324028 | Russo | Dec 2009 | A1 |
20100026451 | Erhart et al. | Feb 2010 | A1 |
20100083000 | Kesanupalli | Apr 2010 | A1 |
20100117794 | Adams et al. | May 2010 | A1 |
20100119124 | Satyan | May 2010 | A1 |
20100123675 | Ippel | May 2010 | A1 |
20100127366 | Bond et al. | May 2010 | A1 |
20100176823 | Thompson et al. | Jul 2010 | A1 |
20100176892 | Thompson et al. | Jul 2010 | A1 |
20100177940 | Thompson et al. | Jul 2010 | A1 |
20100180136 | Thompson et al. | Jul 2010 | A1 |
20100189314 | Benkley et al. | Jul 2010 | A1 |
20100208953 | Gardner et al. | Aug 2010 | A1 |
20100244166 | Shibuta et al. | Sep 2010 | A1 |
20100272329 | Benkley | Oct 2010 | A1 |
20110002461 | Erhart et al. | Jan 2011 | A1 |
20110018556 | Le et al. | Jan 2011 | A1 |
20110090047 | Patel | Apr 2011 | A1 |
20110175703 | Benkley | Jul 2011 | A1 |
20110176037 | Benkley | Jul 2011 | A1 |
20110182486 | Valfridsson et al. | Jul 2011 | A1 |
20110214924 | Perezselsky et al. | Sep 2011 | A1 |
20110267298 | Erhart et al. | Nov 2011 | A1 |
20110298711 | Dean et al. | Dec 2011 | A1 |
20110304001 | Erhart et al. | Dec 2011 | A1 |
20120044639 | Garcia | Feb 2012 | A1 |
20120189166 | Russo | Jul 2012 | A1 |
20120189172 | Russo | Jul 2012 | A1 |
20120206586 | Gardner | Aug 2012 | A1 |
20120256280 | Erhart | Oct 2012 | A1 |
20120257032 | Benkley | Oct 2012 | A1 |
20120308092 | Benkley et al. | Dec 2012 | A1 |
20130021044 | Thompson et al. | Jan 2013 | A1 |
Number | Date | Country |
---|---|---|
2213813 | Oct 1973 | DE |
0929028 | Jan 1998 | EP |
0905646 | Mar 1999 | EP |
0973123 | Jan 2000 | EP |
1018697 | Jul 2000 | EP |
1139301 | Oct 2001 | EP |
1531419 | May 2005 | EP |
1533759 | May 2005 | EP |
1538548 | Jun 2005 | EP |
1624399 | Feb 2006 | EP |
1775674 | Apr 2007 | EP |
1939788 | Jul 2008 | EP |
2331613 | May 1999 | GB |
2480919 | Dec 2011 | GB |
2487661 | Aug 2012 | GB |
2489100 | Sep 2012 | GB |
2490192 | Oct 2012 | GB |
01094418 | Apr 1989 | JP |
04158434 | Jun 1992 | JP |
2005011002 | Jan 2005 | JP |
2005242856 | Sep 2005 | JP |
2007305097 | Nov 2007 | JP |
200606745 | Feb 2006 | TW |
200620140 | Jun 2006 | TW |
200629167 | Aug 2006 | TW |
WO 9003620 | Apr 1990 | WO |
WO 9858342 | Dec 1998 | WO |
WO 9928701 | Jun 1999 | WO |
WO 9943258 | Sep 1999 | WO |
WO 0122349 | Mar 2001 | WO |
WO 0194902 | Dec 2001 | WO |
WO 0195304 | Dec 2001 | WO |
WO 0211066 | Feb 2002 | WO |
WO 0247018 | Jun 2002 | WO |
WO 02061668 | Aug 2002 | WO |
WO 02077907 | Oct 2002 | WO |
WO 03063054 | Jul 2003 | WO |
WO 03075210 | Sep 2003 | WO |
WO 2004066194 | Aug 2004 | WO |
WO 2004066693 | Aug 2004 | WO |
WO 2005104012 | Nov 2005 | WO |
WO 2005106774 | Nov 2005 | WO |
WO 2006040724 | Apr 2006 | WO |
WO 2006041780 | Apr 2006 | WO |
WO 2007011607 | Jan 2007 | WO |
WO 2008033264 | Mar 2008 | WO |
WO 2008033265 | Jun 2008 | WO |
WO 2008137287 | Nov 2008 | WO |
WO 2009002599 | Dec 2008 | WO |
WO 2009029257 | Jun 2009 | WO |
WO 2009079219 | Jun 2009 | WO |
WO 2009079221 | Jun 2009 | WO |
WO 2009079257 | Jun 2009 | WO |
WO 2009079262 | Jun 2009 | WO |
WO 2010034036 | Mar 2010 | WO |
WO 2010036445 | Apr 2010 | WO |
WO 2010143597 | Dec 2010 | WO |
WO 2011088248 | Jan 2011 | WO |
WO 2011088252 | Jan 2011 | WO |
WO 2011053797 | May 2011 | WO |
Entry |
---|
Matsumoto et al., “Impact of Artificial ‘Gummy’ Fingers on Fingerprint Systems,” Proc. SPIE, vol. 4677 (2002), reprinted from cryptome.org. |
Maltoni, “Handbook of Fingerprint Recognition,” XP002355942, Springer, New York, USA, Jun. 2003, pp. 65-69. |
Vermasan, et al., “A 500 dpi AC Capacitive Hybrid Flip-Chip CMOS ASIC/Sensor Module for Fingerprint, Navigation, and Pointer Detection With On-Chip Data Processing,” IEEE Journal of Solid-State Circuits, vol. 38, No. 12, Dec. 2003, pp. 2288-2294. |
Ratha, et al., “Adaptive Flow Orientation-Based Feature Extraction in Fingerprint Images,” Pattern Recognition, vol. 28, No. 11, pp. 1657-1672, Nov. 1995. |
Ratha, et al., “A Real Time Matching System for Large Fingerprint Databases,” IEEE, Aug. 1996. |
Suh, et al., “Design and Implementation of the AEGIS Single-Chip Secure Processor Using Physical Random Functions,” Computer Architecture, 2005, ISCA '05, Proceedings, 32nd International Symposium, Jun. 2005 (MIT Technical Report CSAIL CSG-TR-843, 2004). |
Rivest, et al., “A Method for Obtaining Digital Signatures and Public-Key Cryptosystems,” Communications of the ACM, vol. 21, No. 2, pp. 120-126 (1978). |
Hiltgen, et al., “Secure Internet Banking Authentication”, IEEE Security and Privacy, IEEE Computer Society, New York, NY, US, Mar. 1, 2006, pp. 24-31, XP007908655, ISSN: 1540-7993. |
Hegt, “Analysis of Current and Future Phishing Attacks on Internet Banking Services,” Master Thesis, Technische Universiteit Eindhoven, Department of Mathematics and Computer Science, May 31, 2008, pp. 1-149, XP002630374, Retrieved from the Internet: URL:http://alexandria.tue.nl/extral/afstversl/wsk-i/hgt2008.pdf [retrieved on Mar. 29, 2011] *pp. 127-134, paragraph 6.2*. |
Gassend, et al., “Controlled Physical Random Functions,” In Proceedings of the 18th Annual Computer Security Applications Conference, Las Vegas, Nevada, Dec. 12, 2002. |
Wikipedia (Mar. 2003). “Integrated Circuit,” http://en.wikipedia.org/wiki/integrated_circuit. Revision as of Mar. 23, 2003. |
Wikipedia (Dec. 2006). “Integrated circuit,” Revision as of Dec. 10, 2006. http://en.wikipedia.org/wiki/Integrated_circuit. |
Bellagiodesigns.com (Internet Archive Wayback Machine, www.bellagiodesigns.com, dated Oct. 29, 2005). |
Closed Loop Systems, The Free Dictionary, http://www.thefreedictionary.com/closed-loop+system (downloaded Dec. 1, 2011). |
Feedback: Electronic Engineering, Wikipedia, p. 5, http://en.wikipedia.org/wiki/Feedback#Electronic_engineering (downloaded Dec. 1, 2011). |
Galy et al. (Jul. 2007). “A full fingerprint verification system for a single-line sweep sensor,” IEEE Sensors Journal, vol. 7, No. 7, pp. 1054-1065. |
Number | Date | Country | |
---|---|---|
20100284565 A1 | Nov 2010 | US |