Method and apparatus for fingerprint motion tracking using an in-line array

Information

  • Patent Grant
  • Patent Number
    8,447,077
  • Date Filed
    Monday, September 11, 2006
  • Date Issued
    Tuesday, May 21, 2013
Abstract
A fingerprint motion tracking method and system is provided for sensing features of a fingerprint along an axis of finger motion, where a linear sensor array has a plurality of substantially contiguous sensing elements configured to capture substantially contiguous overlapping segments of image data. A processing element is configured to receive segments of image data captured by the linear sensor array and to generate fingerprint motion data. Multiple sensor arrays may be included for generating directional data. The motion tracking data may be used in conjunction with a fingerprint image sensor to reconstruct a fingerprint image using the motion data either alone or together with the directional data.
Description
BACKGROUND

The invention relates generally to technology for sensing and recording fingerprints and, more particularly, to systems, devices and methods for fingerprint motion tracking, alone and in combination with fingerprint image processing.


A number of devices and techniques exist for sensing, capturing, and reconstructing the image of a fingerprint as it moves across a sensor array. Though many devices exist to sense and record an entire fingerprint, partial fingerprint sensing devices have been developed for small portable devices to save space. The sensing devices themselves vary widely, and many devices and related techniques exist for sensitively detecting the presence of the finger surface and the features located on the surface that make up a person's unique fingerprint. For example, one common configuration used for a fingerprint sensing surface includes CCD (charge-coupled device) or CMOS circuits. These components are embedded in a sensing surface to form a matrix of piezoelectric elements that generate signals in response to pressure applied to the surface by a finger. These signals are read by a processor and used to reconstruct the fingerprint of a user and to verify identification. Other devices include a matrix of optical sensors that read light reflected off of a person's finger and onto optical elements. The reflected light is converted to a signal that defines the fingerprint of the finger being analyzed and is used to reconstruct the fingerprint and to verify identification. More modern devices include static or radio frequency (RF) devices configured to measure the intensity of electric fields conducted by finger ridges and valleys to sense and capture the fingerprint image. Regardless of the sensing method used, conventional devices and techniques share common drawbacks, particularly when used in portable electronic devices. Such devices require small component size because of the limited space and surface area of the host device, and further require that power demand be as small as possible due to limited battery life.


Specifically, devices exist that have a sensing area smaller than the fingerprint area to be imaged. Such devices are greatly desired because they take up much less space than a full fingerprint sensor, a very useful feature for small portable devices. These sensing devices generally consist of one or more imaging lines disposed perpendicular to the axis of finger motion. As the finger surface moves across the sensor, portions of the fingerprint are sensed and captured by the device, and those portions are subsequently reconstructed in a mosaic or overlapping manner. In operation, however, current conventional devices have severe drawbacks: they generally require extensive processing resources to compute the algorithms and data required for reconstructing fingerprints.


For applications of fingerprint identification devices in portable electronics, such as laptops and cellular telephones, low power consumption is a strict requirement. Therefore, it is important to keep computational processing to a minimum in such applications. Again, present conventional fingerprint sensor technology requires a substantial amount of processing, and thus a large amount of power, to perform the tasks required for reconstructing fingerprints for identification. One major problem is that a large amount of pixel information must be recorded and matched in a short amount of time, burdening the device processor and consuming substantial power. This is a significant problem for small devices, which already operate under strict power constraints.


One conventional device is described in U.S. Pat. No. 6,002,815 of Immega, et al. The technique used by the Immega device is based on the amount of time required for the finger to travel a fixed distance between two parallel image lines oriented perpendicular to the axis of motion. After a time history of samples is captured, the speed is determined by finding the time delay that provides the best match between data from the first line and data from the second line. The device captures the entire image of an object and stores the image line by line. The object illustrated is a photocopy of a document, and the reference does not suggest a fingerprint or other image. Thus, it is directed to a device and method for scanning an image passing over a perpendicular slit pair at a variable speed, as opposed to objects that pass over the slit pair at a fixed speed. It does not address the problem of excessive processor power expended to perform the process. Also, the perpendicular lines of the image are used for determining the speed of the object as it passes through the perpendicular slit where the image is captured. These recorded lines are also used in reconstructing the image when the scan is complete. Thus, a large amount of data is processed and stored in the process. The processing resources required to calculate the speed at any given moment are immense, including the time required, the calculations performed by the processor, and the power the processor demands. Furthermore, this time-series approach has the disadvantage that it is not possible to quickly determine an absolute distance of motion by comparing only the instantaneous data from the two image lines. This is true for all cases other than the rare coincidental case where the finger happens to travel exactly the distance between the image lines during the interval between the two samples. Another problem arises when the object is moving much more slowly than the sample rate of the device. In this case, the number of samples needed to find a match is substantial. In addition, at slow speeds, the device must compare a larger number of stored lines in order to find a match. This greatly increases the computational requirements, placing a substantial burden on the device processor. Thus, expensive high-order processors are required for adequate performance, and substantial power is needed to operate them.


Another technique is described in U.S. Pat. No. 6,289,114 of Mainguet. A device utilizing this method reconstructs fingerprints by sensing and recording images of rectangular slices of the fingerprint and piecing them together using an overlapping mosaic algorithm. Like Immega, the technique described in Mainguet is computationally burdensome on the device processor. Furthermore, the Mainguet method requires a substantial amount of memory, as well as a larger number of imaging pixels, in order to properly record the images. Again, this method demands substantial power to perform its algorithms, a serious problem for power-rationed portable devices.


For accurate fingerprint capture, it is often advantageous to provide a navigation function with the same device used for fingerprint sensing. The navigation function can provide more functionality in as little area as possible in a portable device, and can provide a more accurate fingerprint image. However, conventional devices and methods for navigation require substantial processor resources, and thus demand more power. In such devices, in order to sense finger motion, the sensing device must sample the image at a periodic rate fast enough to ensure that a moving feature will be sampled when it passes both the primary imaging line and the auxiliary line of pixels. As a consequence, the sensor needs to operate at full imaging speed, and thus consumes full imaging power, while in navigation mode. Consequently, conventional navigation methods demand substantial power and are impractical for small devices.


Thus, there exists a great need in the art for a more efficient means to accurately sense and capture fingerprints on portable devices and also to provide navigation operations without unduly demanding power. As will be seen, the invention provides a means to overcome the shortcomings of conventional systems in an elegant manner.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is a diagrammatic view of a sensor configured according to the invention;



FIG. 1B is a diagrammatic view of a sensor configured according to the invention;



FIG. 2A is a diagrammatic view of a sensor configured according to the invention;



FIG. 2B is a diagrammatic view of a sensor configured according to the invention;



FIG. 2C is a diagrammatic view of a fingerprint scan result;



FIG. 2D is a diagrammatic view of a fingerprint scan result;



FIG. 3 is a diagrammatic view of a sensor configured according to the invention;



FIG. 4 is a diagrammatic view of a sensor configured according to the invention;



FIG. 5A is a diagrammatic view of a sensor configured according to the invention;



FIG. 5B is a diagrammatic view of a sensor configured according to the invention;



FIG. 6 is a diagrammatic view of a sensor configured according to the invention;



FIG. 7 is a diagrammatic view of a sensor configured according to the invention;



FIG. 8A is a diagrammatic view of a sensor configured according to the invention;



FIG. 8B is a diagrammatic view of a sensor configured according to the invention;



FIG. 8C is a diagrammatic view of a sensor configured according to the invention;



FIG. 9 is a diagrammatic view of a system configured according to the invention;



FIGS. 10a-10b are diagrammatic views of a sensor and fingerprint configured according to the invention;



FIGS. 11a-11j are diagrammatic views of a sensor and fingerprint images configured according to the invention;



FIG. 12 is a diagrammatic view of a sensor and fingerprint configured according to the invention;



FIGS. 12B (a) and (b) are illustrations of fingerprint images according to the invention;



FIG. 13 is a flow diagram of a method configured according to the invention; and



FIG. 14 is a flow diagram of a method configured according to the invention.





DETAILED DESCRIPTION

The invention is directed to a fingerprint motion tracking method and system for sensing features of a fingerprint along an axis of finger motion, where a linear sensor array has a plurality of substantially contiguous sensing elements configured to capture substantially contiguous overlapping segments of image data. A processing element is configured to receive segments of image data captured by the linear sensor array and to generate fingerprint motion data. Multiple sensor arrays may be included for generating directional data. The motion tracking data may be used in conjunction with a fingerprint image sensor to reconstruct a fingerprint image using the motion data either alone or together with the directional data.


The invention provides an independent relative motion sensor that does not require the power demanded by conventional devices. The independent relative motion sensor includes a linear array of sensing elements that captures a narrow string of data indicative of fingerprint features along a relatively narrow sample. In operation, the linear sensor array senses and captures fingerprint features in the form of a string of data signals: the features are first sensed in an initial capture, and this is followed by one or more subsequent operations in which a sample of a subset of the fingerprint features is captured again after a known time period. This time period may be predetermined, or measured as time progresses between the sensing and capturing of the samples. Once at least two samples are taken, a subsequent sample is compared against a previous sample to determine the amount of shift of the previous sample relative to the subsequent sample. In one embodiment, a single linear line of sensor pixels is used to sense a one-dimensional track of fingerprint features, and the signal sensed by the pixels is converted from an analog signal to a digital signal, where the features are then represented as a string of digital values. For example, the ridges of the fingerprint may be represented as logical ones, and the valleys as logical zeros.
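
As a minimal sketch of this digitization step (the threshold value and the 8-bit sample range are illustrative assumptions, not specified by the patent), each analog pixel reading may be thresholded into the ridge/valley bit values described above:

```python
RIDGE_THRESHOLD = 128  # assumed cutoff for an 8-bit ADC reading

def digitize_segment(analog_pixels):
    """Convert one linear-array sample (0-255 ADC readings) into a
    string of digital values: 1 = ridge present, 0 = valley."""
    return [1 if p >= RIDGE_THRESHOLD else 0 for p in analog_pixels]

# Example: a ridge/valley pattern passing over a 12-pixel array.
print(digitize_segment([200, 210, 40, 35, 190, 220, 50, 30, 205, 195, 45, 60]))
# -> [1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0]
```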


When compared, the first string of digital values from one sample can be compared to the second string in a one-to-one relationship, and a similarity score can be produced that measures the number of matching values. If there is an immediate match, where both strings are substantially identical, this indicates that there was no movement during the time between the two samples. If there is not an immediate match, this indicates that there was some movement, and additional comparisons may be needed to determine the distance traveled. For each comparison, the strings of digital values can be shifted one or more pixels at a time. Once a good match is found, the distance traveled by the fingerprint is simply the number of pixels shifted times the distance between pixels, which may be measured, for example, from the center point of one pixel to the center point of another pixel in the array of pixel sensors.
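
The following sketch illustrates this shift-and-compare step (the function names, the normalized scoring, the example bit strings, and the 500 dpi pixel pitch are illustrative assumptions; a positive shift here means the features moved toward higher pixel indices between samples):

```python
def similarity(a, b):
    """Fraction of positions at which two equal-length bit strings agree."""
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

def best_shift(prev, curr, max_shift):
    """Try signed shifts of -max_shift..+max_shift pixels and return
    (shift, score) for the best overlap between the two samples.
    A shift of 0 means no motion between the samples. Note: strongly
    periodic ridge patterns can match at more than one shift, so a
    practical implementation keeps max_shift small."""
    best = (0, -1.0)
    for d in range(-max_shift, max_shift + 1):
        if d >= 0:
            score = similarity(curr[d:], prev[:len(prev) - d])
        else:
            score = similarity(prev[-d:], curr[:len(curr) + d])
        if score > best[1]:
            best = (d, score)
    return best

prev = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1]
curr = [0] + prev[:-1]   # the same features, moved one pixel
shift, score = best_shift(prev, curr, max_shift=4)
PIXEL_PITCH_MM = 0.0508  # assumed 500 dpi sensor, center to center
print(shift, round(score, 2), shift * PIXEL_PITCH_MM)
# -> 1 1.0 0.0508  (one pixel of travel between the two samples)
```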


In one embodiment, a predetermined number of comparisons can be made, each producing a corresponding similarity score. The process may then choose the highest score as the most accurate comparison. The number of pixels shifted to obtain the best comparison can then be used to determine the distance traveled: since the size of and distance between the pixels are predetermined, the number of pixels shifted directly measures the distance traveled by the fingerprint across the motion sensor over the time period of the motion.


In another embodiment, rather than making a predetermined number of comparisons, the process could make comparisons and measure the resulting scores against a predetermined threshold. In this embodiment, the similarity score from each comparison is evaluated as soon as the comparison is made. If the score meets the threshold, it can be used to indicate the amount of shift from one sample to another, which in turn determines the distance traveled by the fingerprint across the linear motion sensor.
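
This threshold-driven variant can reuse the scoring convention from the sketch above; the sketch below searches small displacements first and stops at the first score that clears the cutoff (the 0.9 value is an assumed threshold, not specified in the patent, and similarity() is the helper defined earlier):

```python
def first_match_shift(prev, curr, max_shift, threshold=0.9):
    """Return the first signed shift whose similarity meets the
    threshold, trying small displacements first; None if none match."""
    for m in range(max_shift + 1):
        for d in ([0] if m == 0 else [m, -m]):
            if d >= 0:
                score = similarity(curr[d:], prev[:len(prev) - d])
            else:
                score = similarity(prev[-d:], curr[:len(curr) + d])
            if score >= threshold:
                return d
    return None
```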


In one embodiment, generally, the invention provides a fingerprint motion tracking system and method, where a single linear sensor array is configured to sense features of a fingerprint along an axis of finger motion. The linear sensor array includes a plurality of substantially contiguous sensing elements, or pixels, configured to capture a segment of image data that represents a series of fingerprint features passing over a sensor surface. A buffer is configured to receive and store image data from the linear sensor array, and a processing element is configured to generate fingerprint motion data. The linear sensor array may be configured to repeatedly sense at least two substantially contiguous segments of fingerprint data, and the processor can generate motion data based on those segments. In operation, the linear sensor array is configured to sense a first set of features of a fingerprint along an axis of finger motion and to generate a first set of image data captured by a plurality of substantially contiguous pixels of the sensor array. The linear sensor array is also configured to subsequently sense a second set of features of the fingerprint along the axis of finger motion and to generate a second set of image data captured by a plurality of substantially contiguous pixels of the sensor array. The processing element can then compare the first and second sets of image data to determine the distance traveled by the fingerprint over a time interval.
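
As a structural sketch of the arrangement just described (the class, the read_array callable, and the two-deep buffer are hypothetical choices, and best_shift is the comparison helper sketched earlier):

```python
from collections import deque

class MotionTracker:
    """Model of the described system: a linear sensor array feeds a
    buffer, and a processing element compares successive segments."""
    def __init__(self, read_array, pixel_pitch_mm):
        self.read_array = read_array     # callable returning one bit-string sample
        self.pitch = pixel_pitch_mm
        self.segments = deque(maxlen=2)  # buffer for the two latest segments

    def step(self, dt_s, max_shift=8):
        """Take one sample; return velocity (mm/s) once two are buffered."""
        self.segments.append(self.read_array())
        if len(self.segments) < 2:
            return 0.0
        prev, curr = self.segments
        d, _ = best_shift(prev, curr, max_shift)  # from the earlier sketch
        return d * self.pitch / dt_s
```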


As used herein, "linear sensor array" is a generic term for a portion of sensing elements, whether they are pixels in an optical reader, a static or radio frequency reader that reads electric field intensity to capture a fingerprint image, piezoelectric components in touch-sensitive fingerprint reader circuits, or other elements indicative of fingerprint readers, where the elements are used to sense a portion of the fingerprint rather than the entire fingerprint. Such sensor arrays may be configured in a number of ways within a matrix of well-known sensor devices. For example, several modern configurations are described and illustrated in pending U.S. Pub. No. US 2006/0083411 A1, entitled: Fingerprint Sensing Assemblies and Methods of Making; U.S. Pub. No. US 2005/0244039 A1, entitled: Methods and Apparatus for Acquiring a Swiped Fingerprint Image; U.S. Pub. No. US 2005/0244038 A1, entitled: Fingerprint Sensing Methods and Apparatus; and U.S. Pub. No. US 2003/0035570 A1, entitled: Swiped Aperture Capacitive Fingerprint Sensing Systems and Methods, all assigned to common assignee Validity, Inc. Many other types of sensor matrices directed to capturing fingerprint images also exist in the art. The invention is directed to a novel system, device and method whose application is not limited to any particular sensor matrix or array configuration. In fact, the invention can be used in conjunction with or incorporated into such configurations to improve performance, and further to reduce the processing resources required to capture and reconstruct images.


According to the invention, the linear sensor is substantially contiguous, which is to say that the sensor elements are in close proximity to each other so that a first reading of a portion of fingerprint features can be taken, followed by a second reading after a short period of time from a relatively stationary position. The two samples can be compared to determine the relative distance traveled by the fingerprint surface in relation to the sensor surface. The linear sensor is configured to take a relatively small sample of the fingerprint at one point in time, then another at a subsequent time, and these two samples are used to determine movement of the fingerprint. Two or more samples may be compared in order to compute the direction and velocity of a fingerprint surface relative to the linear sensing elements. These samples may be linear, as described below and illustrated in the drawings, so that a linear array of fingerprint features can be recorded and easily compared, providing a basis for measuring motion as distance traveled over time. If more than one sensor is employed, it is possible to determine direction of motion using vector addition with the different linear samples taken. Thus, some of the functions provided by the invention result from taking a linear sample that gives a basis for vector analysis. However, those skilled in the art will understand that, given the description below and the related drawings, other embodiments are possible using other configurations of motion sensors, which would not depart from the spirit and scope of the invention, which is defined by the appended claims and their equivalents, as well as any claims and amendments presented in the future and their equivalents.


One useful feature of the invention is that ambiguity in results is substantially prevented. If properly configured, a system configured according to the invention can consistently produce a result, since at least two samples can be taken such that the features of one sample overlap with those of another. Comparisons can then be made to determine the amount of shift, indicating the amount of movement of the fingerprint across the linear sensor. In prior art systems and methods, it is often the case that no result occurs, and a singularity results; the user then needs to repeat sensing the fingerprint. In some systems, substantial predictor algorithms have been created in an attempt to compensate for or resolve the singularity when it occurs. Such applications are very large and demand a great deal of computation and processing resources, which would greatly bog down a portable device. According to the invention, sensing the motion of a fingerprint is substantially certain, and samples taken from the fingerprint surface are consistently reliable. This is particularly important in navigation applications, where relative movement of the finger translates to movement of an object such as a cursor on a graphical user interface (GUI), as discussed further below.


In one embodiment, the linear sensor array may be used alone to determine linear movement of a fingerprint. In another embodiment, the single sensor array may be used in conjunction with one or more other linear sensor arrays to determine movement in two dimensions. In either embodiment, the linear sensor arrays are utilized solely for determining motion. If the motion of the analyzed fingerprint occurs generally along a predetermined axis of motion, the single linear sensor array can be utilized to sense the velocity of the fingerprint being analyzed. To capture and record the motion of a fingerprint that is not directed along a predetermined axis of motion, two or more linear arrays (a plurality of arrays) can be used together to sense and record such motion, and a processor can determine the direction and speed of the fingerprint using vector arithmetic.


In yet another embodiment, one or more such linear arrays may be used in conjunction with a fingerprint sensor matrix to more accurately capture and reconstruct a fingerprint image. The sensor matrix can be configured to sense and capture an image of a portion of a fingerprint being analyzed, and the one or more linear arrays can provide motion information for use in reconstructing a fingerprint image. A device so configured would be able to more accurately sense, capture, record and reconstruct a fingerprint image using less processing resources than conventional devices and methods.


The primary distinction between the invention and the prior art, Immega and Mainguet for example, is that the invention separates the analysis of motion from the capture of the entire fingerprint image. The concept described in Immega, for example, requires the entire image to be captured and recorded line by line. The recorded lines are used both to determine the speed of the object as it passes over the perpendicular slit and to reconstruct the image. Immega requires immense processing and storage resources to sense, capture, record and reconstruct the image, and all of these functions are carried out by processing the entire body of captured image data. Similarly, a device configured according to Mainguet must capture large portions of the fingerprint image and requires substantial processing and storage resources to overlap and match the image mosaics when reconstructing the image. In stark contrast, the invention provides a means for detecting motion of a fingerprint separately from the process of capturing the fingerprint image, and uses the motion information to reconstruct the fingerprint image more efficiently, using fewer processing and storage resources.


Alternatively, in yet another embodiment, one or more arrays can be used to generate motion information for use in accurate navigational operations, such as for use in navigating a cursor on a graphical user interface (GUI). Utilizing the improved processing functions of the invention, an improved navigation device can be constructed that is compatible with a portable device that has the power and processing restrictions discussed above. Examples of such embodiments are described and illustrated below.


A motion sensor configured according to the invention uses substantially less space and power than conventional configurations for motion sensing, navigation and fingerprint image reconstruction. Such a configuration can further aid conventional fingerprint reconstruction processes by better sensing the motion of a finger while it is being analyzed by a sensing device. This gives a fingerprint sensing device the ability to reconstruct an analyzed fingerprint with reduced power. Utilizing the invention, conventional processes that need to match and construct fragmented images of a fingerprint, particularly devices that sense and process a fingerprint in portions, can be optimized with information about the fingerprint motion that occurs while the fingerprint surface is being read. Also, using this unique motion detection technology, optimal navigation functions can be provided that demand significantly less power than conventional devices. Such navigation functions enable a low-power navigation device to be integrated in a portable system, such as a mouse pad used to move a cursor across a graphical user interface (GUI) on portable electronic devices including cellular phones, laptop computers, personal data assistants (PDAs), and other devices where low-power navigation functions are desired. A novel system and method are provided that use minimal space and processing resources while providing accurate motion detection, from which fingerprint sensors as well as navigation systems can greatly benefit.


A device or system configured according to the invention can be implemented as a stand-alone navigation device, or as a device that provides image reconstruction information for use with a line imaging device that matches and assembles a fingerprint image. Such a line imaging device may be any imaging device configured to sense and capture portions of a fingerprint, whether it captures individual perpendicular image lines of a fingerprint or multiple perpendicular lines. In operation, a motion detection device can operate as a separate motion detection and/or direction detection device. Alternatively, a motion detection device can be used in conjunction with a line imaging device to more accurately and efficiently sense, capture, store and reconstruct a fingerprint image. A device configured according to the invention may include a single array of finger ridge sensing pixels or data sensor points centrally located along the principal axis of motion to be detected, a sampling system to periodically sample the finger contact across the array, and a computational module or element that compares two sets of samples collected at different times to determine the distance traveled between the two sample times. According to the invention, the motion sensor pixels do not necessarily need to have the same resolution as the line imager; the motion sensor pixels may in fact use a different sensing technique than the imager.


Again, the invention provides separate operations for detecting motion and for sensing and capturing a fingerprint image. Thus, the techniques used for the separate processes can be the same or may be different depending on the application. Those skilled in the art will understand that different variations of the separate processes are possible using known techniques and techniques can be derived without any undue experimentation. Such variations would not depart from the spirit and scope of the invention.


In another embodiment, the invention provides the capability of multi-axis motion sensing with additional off-axis sensing arrays. In this embodiment, there are two or more (a plurality of) sensor arrays for detecting motion, and each axis is independently measured to determine the component of velocity in that axis. The velocity components from the individual axes are used to compute a vector sum to determine the actual direction and velocity of motion of the finger with respect to the sensor surface. According to the invention, it is not necessary to capture the full image of the fingerprint in order to determine the distance traveled and the velocity. It is only necessary to capture a linear sample of fingerprint features along the line of motion of the fingerprint. In one embodiment, a plurality of samples, such as two or three samples, are captured by motion sensor pixels and are used to determine the distance traveled across the axis of motion of the fingerprint relative to the sensor surface and the velocity at which the motion occurs. This information can also be used in navigational operations, and can further be used in combination with a fingerprint imager to aid in reconstructing a fingerprint image. Utilizing the invention, either application can be configured in an economical and useful manner. Moreover, the operation of such a sensor or navigational device can be optimized to consume substantially less power than conventional devices, which require excessive processor operations for reassembly of the fingerprint image. And, given the motion information generated by a system configured according to the invention, the distance traveled and velocity of the fingerprint can be used to more accurately and efficiently reconstruct a full fingerprint.
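
A sketch of this vector-sum step follows (the function name, array angles, and the orthogonal-axes example are assumptions; with non-orthogonal arrays the raw sum would need a normalization such as a least-squares fit, which the patent does not detail):

```python
import math

def vector_sum_velocity(measurements):
    """Combine per-axis velocity components into one motion vector.
    measurements: list of (angle_deg, speed_mm_s) pairs, where each
    speed is the signed component measured along that array's axis.
    Returns (overall speed in mm/s, heading in degrees)."""
    vx = sum(s * math.cos(math.radians(a)) for a, s in measurements)
    vy = sum(s * math.sin(math.radians(a)) for a, s in measurements)
    return math.hypot(vx, vy), math.degrees(math.atan2(vy, vx))

# Example with two orthogonal arrays: 3 mm/s along the 0-degree axis
# and 4 mm/s along the 90-degree axis -> 5 mm/s at ~53 degrees.
print(vector_sum_velocity([(0, 3.0), (90, 4.0)]))
```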


Aligning the pixels along the axis of motion, rather than perpendicular to it, enables the use of motion detection algorithms that can be both time-variant and distance-variant. This enables development of algorithms that use short-distance measurement over long time periods for low-speed motion, and longer-distance measurement to more accurately gauge higher-speed motion, thus optimizing response time and accuracy. Both embodiments share the advantages gained by acquiring and comparing multiple spatial measurements of the fingerprint pattern at each sampling instance. Because multiple samples are taken and compared simultaneously, the effects of sampling error, due both to noise and to imprecision in the sampling of the finger pattern, are minimized. Also, because samples are taken at multiple locations along the axis of motion simultaneously at each sampling period, the images from two sampling periods can be compared to detect whether there has been any significant finger motion between the two sample times. One shared advantage is that both systems are capable of detecting under-sampling of the image being acquired by the line imager, as a consequence of their ability to detect motion of multiple pixels in a short time interval.


An embodiment using a single segmented motion sensor array offers the advantage of detecting motion over a shorter range of distance. This provides faster response time, particularly at the low finger speeds that may be encountered in navigation applications. Because this embodiment is sensitive to single-pixel motion, it provides unique features that may also reduce the memory requirements of the computational elements. In order to provide a navigation device, as well as to detect and correct for finger motion that is not completely aligned with the desired axis, either of the embodiments may be combined in ensembles such that one sensor is aligned with the axis of motion and additional sensors are aligned at an angle (such as 22.5 or 30 degrees) to the principal axis of finger motion. Examples of different embodiments are discussed below.


Referring to FIG. 1A, a diagrammatic view of a motion detection and tracking system configured according to the invention is illustrated. An integrated circuit package 100 is shown having circuits, and possibly embedded software (not shown), and electrical connections 101 for integration in and connection with a system that utilizes the circuit package. FIG. 1A illustrates an embodiment of the invention where a finger 104 can move its fingerprint surface 106 against sensor surface 108 to be read by the sensors 110, 112. These sensors can pick up movement information of a fingerprint for use in navigational applications, or can be used in conjunction with an integrated fingerprint sensor surface 108 to simultaneously capture and record portions of a fingerprint. Such a system configured according to the invention may be a stand-alone component as shown, or can be integrated with other circuits for further space and power savings as well as efficiency. Those skilled in the art will understand that many variations of the configuration are possible, and that the invention is not limited to any particular configuration, but is defined by the claims and all equivalents.


The system further includes a sensor module 102 that is used to sense the fingerprint surface 106 of a user's finger 104 when it is moved across fingerprint sensing surface 108. As can be seen, the fingerprint sensing surface 108 is illustrated as a narrow surface designed to sense and capture portions of a fingerprint as it moves across the sensor. These portions can be subsequently reconstructed according to the invention using motion information from the motion sensors 110, 112. Thus, the sensor components illustrated in FIG. 1 have multiple utilities and can be configured in devices that utilize some or all of them, whether as a stand-alone motion sensor configured to sense movement and velocity in one direction, a multidirectional motion sensor configured to sense movement and velocity in several directions, or a combination device configured to sense motion in one or more directions and used in combination with a fingerprint sensor surface that reads portions of fingerprints and reassembles them using the motion information from the motion sensors. The features and benefits of several embodiments of the invention are discussed and illustrated below. Again, these are intended as mere examples of different embodiments, and are not intended as an exhaustive set of samples. And again, those skilled in the art will understand that these and other embodiments of the invention described herein are illustrative of the invention and are not intended to limit the spirit and scope of the invention, which is defined by the appended claims and all equivalents, including claims appended herein upon filing and also those as possibly amended at a later date.


Referring to FIG. 1B, a side view of the sensor system of FIG. 1A is illustrated. In operation, the finger 104 is placed by a user onto the sensor surface 107, which includes fingerprint sensing surface 108, so that the fingerprint sensing surface 108 and the fingerprint surface 106 are juxtaposed relative to each other. The finger 104 and sensor 100 move in opposite directions A, B, so that the sensor 102 can move across and analyze the fingerprint surface 106. In different applications and devices, this interaction may take many forms. A user may hold the fingerprint surface stationary so that sensor 102 moves relative to the fingerprint, similar to the operation of a photocopy machine. Or, if the sensor is fixed in a surface, such as the surface of a laptop computer or cellular phone, the user can move the fingerprint surface 106 by rubbing it against and along the fingerprint sensing surface 108 so that the sensor 102 can analyze and read the fingerprint.


Referring to FIG. 2A, one practical application of a navigational system is illustrated, where a portable device 202, such as a portable music player, cellular phone, PDA or other device, has a graphical user interface (GUI) or screen 204 and a cursor 206 that appears on the screen and can be moved across it under control of a user operating a touch-sensitive pad 208. The touch-sensitive pad has navigational indicia 210, which may be merely directional indicators located about sensor 102; the sensor is located within or about the touch-sensitive pad, which acts as a navigation pad similar to the mouse pads commonly used on laptop computers. According to the invention, such a navigational pad can be greatly enhanced using the sensor technology of the invention, where directional movement sensors 110, 112 are used to guide the cursor 206 in searching for and selecting indicia such as toolbar items or icons that open files, photos and other items when selected. In some applications, a multi-step sensor can read the fingerprint structures for guidance at one level, and may select an indicium when the user presses harder on the sensor for another level of sensing. Thus, a user can move the cursor around by lightly pressing on and moving a finger along the surface, then pressing harder to select an icon, toolbar item or other indicia. Utilizing the invention, a more efficient navigation tool can be adapted to perform all of these tasks at low power and high accuracy, a very valuable feature for portable devices.


Referring to FIG. 2B, another embodiment of the invention is illustrated, where the integrated circuit (IC) chip 114 is separate from the sensor surface 108(b). In the illustration of FIGS. 1A and 1B, the sensor surface may be located on top of an IC, as in many conventional configurations, but with the novel array sensors 110, 112 of the invention. FIG. 2B illustrates a novel configuration where the sensor surface 108(b) is located on a film 118 and the IC 116 is located separately, allowing for more flexible and useful applications. As discussed herein, the invention can be applied to either type of configuration, and is adaptable to any application where motion and direction information may be useful, such as capturing and reconstructing fingerprint images or other applications.


Referring to FIG. 2C, a sample output of a scanned fingerprint is illustrated, where multiple scans, Scan-a through Scan-d, are assembled to reconstruct the fingerprint image. As can be seen, if the finger and sensor are moved in the directions A, B relative to each other, the scanned portions arrive in order from the finger knuckle location (C) to the fingertip location (D) of FIG. 2A. In conventional systems, the portions must be matched and assembled using processor-intensive algorithms that match the overlapping parts of the different portions. Utilizing the invention, the scan portions, such as Scans a-d, can be assembled using the motion information, where the distance and the time expended over it can be recorded and together give a velocity factor. This way, reconstruction can be done efficiently, with a low burden on the device's processor and power supply. While this example shows an image composed of rectangular segments, the invention may also be used to construct an image from a series of single-line images. Referring to FIG. 2D, another configuration of the sample output is illustrated, where vertical scans of the image are captured as Scans a-d. The invention is also adaptable to such a configuration.
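
As an illustrative sketch of this motion-guided assembly (the data layout, names, and row-based displacement are assumptions; each displacement would come from the motion sensor rather than from image matching):

```python
def assemble_image(scans, displacements_px):
    """Stack scan strips at row offsets given by the measured finger
    displacement (pixels of motion since the previous strip). Assumes
    strips overlap or abut, i.e., each displacement <= strip height."""
    image, row = [], 0
    for strip, d in zip(scans, [0] + list(displacements_px)):
        row += d
        for i, line in enumerate(strip):
            if row + i < len(image):
                image[row + i] = line   # overwrite the overlapping rows
            else:
                image.append(line)
    return image

# Example: three 4-row strips with 2 rows of finger travel between
# samples reconstruct an 8-row image with no mosaic matching at all.
strips = [[f"strip{s}_row{r}" for r in range(4)] for s in range(3)]
print(len(assemble_image(strips, [2, 2])))  # -> 8
```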


Referring again to FIG. 1A, the surface 108 has embedded motion sensors 112 that, according to the invention, operate to detect the presence and motion of a fingerprint surface 106 about the sensor surface 108. A single motion sensor 110 is aligned with the general direction of fingerprint motion for detecting the distance traveled by the fingerprint across the sensor over a period of time, allowing a processor to compute the velocity of the fingerprint over the sensor surface. Depending on the application, there may be a single motion sensor 110 on the surface 108, or a plurality of two or more motion sensors 110, 112. The additional sensors 112 may be used to detect the direction of a fingerprint's motion across the sensor surface. In practical applications, a user may not move the finger exactly parallel to the axis of sensor 110, but rather may rub the fingerprint surface 106 at an angle to it. A processor analyzing the velocity of the fingerprint motion would then arrive at an inaccurate velocity reading. This matters when the data generated by the sensor is used for reconstructing a fingerprint, or when the sensor data is used for navigational purposes. According to this additional embodiment of the invention, the additional sensors 112 can be used to determine the direction of the fingerprint surface while it is being analyzed. Using the data captured by the sensors, a processor can apply vector analysis to generate motion information, which can be used in processes for reconstructing fingerprint images or for navigation.



FIGS. 3-7 discussed below have a similar numbering pattern, where the sensor surface 107 includes the two other sensing surfaces: fingerprint sensing surface 108 and motion sensors 110 and 112. The different embodiments, though similar in general function, are separately identified to differentiate the different components in the different embodiments. These are intended as mere examples of different embodiments, and are not intended as an exhaustive set of samples. Again, those skilled in the art will understand that these and other embodiments of the invention described herein are illustrative of the invention and do not limit the spirit and scope of the invention, which is defined by the appended claims and all equivalents, including claims appended herein upon filing and also those as possibly amended at a later date.


According to another embodiment 102(a) of the invention, illustrated in FIG. 3, the sensor surface 108(a) may include image sensing elements used for broadly sensing and recording the fingerprint features. In addition, a motion sensor 110(a) is included for sensing and recording the motion of the fingerprint. Such a device may be a single sensor embedded within the two dimensions of the sensor surface 107(a), with the fingerprint sensing surface 108(a) included for sensing and recording the full fingerprint, and the motion sensor configured to separately sense and record motion information. Here, the sensor surface 107(a) includes a motion sensor 110(a) configured separately from fingerprint sensing surface 108(a). According to this embodiment, the motion sensor is separate from the fingerprint sensing surface, though located on the same sensor surface. In operation, a fingerprint surface 106 can be moved simultaneously along motion sensor 110(a) and fingerprint sensing surface 108(a). The motion information from the motion sensor, such as the distance traveled and the time expended over that distance, can be utilized together with the output of the fingerprint sensing surface as an aid in reconstructing the separate portions of the fingerprint.


Referring to FIG. 4, another embodiment 102(b) of the invention is illustrated, where motion sensors 110(b), 112(b) are located about fingerprint sensor surface 108(b) within sensor surface 107(b). The motion sensor 110(b) is located along the anticipated axis of motion of finger 106 with respect to device 100 in directions A, B. Motion sensor 110(b) can sense the distance traveled and the time expended over that distance to determine velocity, which can be used in reconstructing the fingerprint portions simultaneously captured by fingerprint sensor surface 108(b). Using the additional motion sensors 112(b), a fingerprint surface 106 can be sensed and captured even if a user slides the finger at an angle to the axis of the motion sensor 110(b). In fact, given the angles of the additional sensors 112(b) with respect to the central axis of the device, the direction of motion can be computed by a processor using vector addition. Thus, the direction, distance and time expended during fingerprint surface travel across the sensors can be used, along with the fingerprint portions captured by the fingerprint sensor, to accurately reconstruct the fingerprint image. This can be done with a fraction of the processing power, and thus far less supply power, than conventional methods and devices known in the prior art. The invention therefore provides great utility for fingerprint reconstruction and verification in devices that have power and processing restrictions.


Referring to FIG. 5a, yet another embodiment 102(C) of the invention is illustrated, where the motion sensors 110(C), 112(C) are interleaved with fingerprint sensor surface 108(C) in a combined component within sensor surface 107(C). Such a configuration can be created in a sensor surface where the pixels or data contact points that sense the fingerprint features are read from the sensors separately by a processor. For example, in a matrix of sensor pixels or data contact points, individual points can be singled out in one or more arrays to operate as motion sensing arrays. In the same matrix, the remaining pixels or data contact points can form a fingerprint sensor surface for sensing and capturing the fingerprint image. In operation, a fingerprint can be juxtaposed against and moved along the sensor surface 107(C), either along the anticipated axis of motion or at another angle, and an accurate sense and capture of a fingerprint can be achieved without undue computation and power load. While the fingerprint sensor surface 108(C) senses and captures the portions of images of the fingerprint features upon contact with the fingerprint surface 106, the motion sensors can simultaneously capture motion information as the features move past them. The motion information can be used in combination with the portions of fingerprint images to reconstruct the fingerprint image. Referring to FIG. 5b, the same configuration of FIG. 5a is illustrated, with the motion sensors shown much smaller in comparison to the overall sensor surface. In a sensor surface that is densely populated with pixels or data contact points, the portion of the sensor surface covered by the motion sensing arrays is very small compared to the pixels and data points that make up the fingerprint sensing surface 108(C), both located within sensor surface 107(C). Thus, the fingerprint can be sensed and captured without any interference from the interleaved motion sensing arrays, and accurate portions of a fingerprint image can be captured and accurately reconstructed using the combined information from the fingerprint sensors and the motion sensors. Utilizing this embodiment, a universal component can be constructed and utilized for both motion detection and fingerprint capture, and the results from both functions can be utilized to produce an efficient and power-thrifty method of sensing, reconstructing and verifying a fingerprint.


Referring to FIG. 6, another embodiment 102(d) of the invention is illustrated, where a single motion sensor array 110(d) is interleaved within the fingerprint sensor surface 108(d) of sensor surface 107(d). Unlike the embodiment illustrated in FIGS. 5a, 5b, this embodiment is limited to one motion sensor array located along the anticipated axis of motion of the finger, which is anticipated to move in directions A,B with respect to the device 100. In operation, the interleaved sensor array 110(d) can sense and capture motion information regarding the motion of the finger across the sensor surface 107(d), while simultaneously fingerprint sensor surface 108(d) can sense and capture the fingerprint images for subsequent reconstruction. The information from both sensors can be used to more accurately reconstruct the fingerprint image.


Referring to FIG. 7, yet another embodiment 102(e) of the invention is illustrated, where multiple motion sensors 112(e) are interleaved within fingerprint sensor surface 108(e). This embodiment is similar to that illustrated in FIGS. 5a, 5b, but with more motion sensors at various angles. In operation, a fingerprint can be juxtaposed and moved along the sensor surface 107(e) along the anticipated axis of motion or at another angle, and an accurate sense and capture of a fingerprint can be achieved without undue computation and power load. While the fingerprint sensor surface 108(e) senses and captures the portions of images of the fingerprint features upon contact with the fingerprint surface 106, the motion sensors can simultaneously capture motion information as the features move past the motion sensors. The motion information can be used in combination with the portions of fingerprint images to reconstruct the fingerprint image. Those skilled in the art will understand that many variations on the concept of multiple motion sensors embedded or interleaved within the sensor surface are possible, and that different applications will have varying demands for the different sensor features.


Thus, if a user strokes a fingerprint surface against a motion sensor surface, the arrays can pick up the motion and direction information, and a processor can process it to generate relative motion and direction information for use in navigation, such as for a computer mouse. In this example, a user can move a finger to control a cursor on a graphical user interface (GUI), such as on a computer screen, a cellular phone, a personal data assistant (PDA) or other personal device. The navigation sensor causes the cursor to move relative to the fingerprint motion, and the user can navigate across the GUI to operate functions on a computer or other device. Since the motion of the cursor is relative to the movement of the fingerprint surface against the navigation sensor, relatively small movements can translate to equal, lesser or even greater distances of cursor movement.
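
A minimal sketch of that relative mapping (the function name and gain value are assumptions; a gain below 1 gives lesser cursor travel and a gain above 1 gives greater travel, as the text notes):

```python
def cursor_delta(dx_mm, dy_mm, gain=12.0):
    """Map finger motion in millimeters to cursor motion in screen
    pixels; the cursor moves relative to the finger, scaled by gain."""
    return round(dx_mm * gain), round(dy_mm * gain)

print(cursor_delta(1.5, -0.5))  # small finger motion -> (18, -6) pixels
```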


One aspect of the invention that is very useful to navigation configurations is the ability to consistently generate a motion result. As discussed above, the invention provides a means to substantially ensure a result when a fingerprint moves across a motion sensor. This is true for single array motion sensors as well as multiple array sensors used for two-dimensional motion processing. In a navigation application, such a configuration can provide accurate and consistent motion and directional information that allows for smooth and reliable navigational operations.


Referring to FIG. 8A, another embodiment of the invention is illustrated, where multiple arrays are located on the sensor surface to allow for sensing and capturing motion and direction information in different directions of fingerprint travel. The base film 120, which may be a 35 mm film or other material, includes a sensor surface 121 having several motion sensor arrays. Similar to the three-sensor array illustrated in FIG. 5A, there are three sensors that fan upward for detecting motion and direction. In operation, a user typically will stroke over the sensor in a downward direction, and the three sensors can determine the direction and speed using vector analysis. However, it may be desirable to account for motion in either an upward or downward direction, and multiple sensors in either direction would be useful to better capture the information. From the orientation of a user facing the sensor illustrated in FIG. 8(a), the right sensors 122, 124 are configured to capture movement toward the right, where either sensor could capture motion from the upper right to the lower left or from the upper left to the lower right. Sensors 126, 128 capture up or down movement, and the left sensors 130, 132 are configured to capture movement toward the left, where either sensor could likewise capture motion angled across the principal axis. Utilizing the multiple sensors, the device would be more robust, capable of sensing more fingerprint features, and also able to process more movement and directional information for use in capturing and reconstructing fingerprint images or for other applications such as navigation. The angle θ between an off-axis sensor and center horizontal line 134 can be any angle, such as 30, 45 or 22.5 degrees, in order to most effectively capture movement that is not aligned with center sensors 126, 128. All off-axis sensors 122, 124, 130, 132 can be set at various angles, depending on the particular application.


Referring to FIG. 8B, an even more robust example is illustrated: a sensor set on film 136 having a surface 137 located on the film. The sensor 138 is located on the film surface 137 and includes multiple array sensors 140 set at various angles. In this embodiment, adjacent arrays may be separated by 22.5 degrees, providing a wide variety of angles at which to sense and capture motion information. The sensor, similar to those of FIGS. 8(a) and 2B, has an IC chip 139 that is separate from the sensor surface 138.


Referring to FIG. 8C, a diagrammatic view of multiple array sensors located on a sensor 142 is illustrated. Sensors 144, 144′ are vertical arrays set to capture one axis of motion. Sensors 146, 146′ and 150, 150′ are located off axis at an angle to sensors 144, 144′. Sensors 148, 148′ are optional and may be used in conjunction with the other sensors to gather motion information in a horizontal direction with respect to the vertical sensors. In practice, any or all of these sensors can be utilized by a system to accurately sense and capture motion and direction information in multiple directions. Again, which sensors to use may depend on the particular application and configuration.


In one embodiment, in order to support motion at any arbitrary angle, sensor arrays may be oriented at approximately 0, 30, 60, 90, 120, and 150 degrees. Another more robust system might space them at 22.5-degree increments rather than 30. Once motion reaches 180 degrees, the process can use reverse motion on the zero-degree sensor array, and so on. A device configured in this way would have some of the properties of a navigation touchpad such as those used in laptop computers, with the relative motion sensing capability of a computer mouse.
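
A sketch of how a motion heading might be mapped onto such an ensemble (the function and its spacing parameter are assumptions; it returns which array to read and whether to interpret its output as reverse motion, as described above):

```python
def select_array(heading_deg, spacing_deg=30):
    """Pick the array best aligned with a motion heading. Arrays span
    0..180 degrees; headings past 180 reuse the opposite array with
    the sign of its output reversed."""
    n = 180 // spacing_deg            # number of arrays (6 at 30 degrees)
    h = heading_deg % 360
    reverse = h >= 180
    h %= 180
    idx = round(h / spacing_deg)
    if idx == n:                      # wrapped past the last array
        idx, reverse = 0, not reverse
    return idx, (-1 if reverse else +1)

print(select_array(95))    # -> (3, 1): the 90-degree array, forward
print(select_array(275))   # -> (3, -1): the same array, reverse motion
```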


Referring to FIG. 9, a diagrammatic view of a sensing device 100 configured according to the invention is illustrated. The device includes a linear array 112, as described in the embodiments above, and also includes a sensor element 102, also discussed above. The device further includes sensor control logic 252 configured to control the basic operations of the sensor element. The exact operations governed by the sensor control logic depend greatly on the particular sensor configuration employed, and may include power control, reset control of the pixels or data contact points, output signal control, cooling control in the case of some optical sensors, and other basic controls of a sensor element. Sensor controls are well known by those skilled in the art and, again, depend on the particular operation. The device further includes a readout circuit 254 for reading analog output signals from the sensor element when it is subject to a fingerprint juxtaposed on the sensor surface 107.


The readout circuit includes an amplifier 256 configured to amplify the analog signal so that it can be read more accurately in subsequent operations. Low-pass filter 258 is configured to filter out noise from the analog signal so that it can be processed more efficiently. The readout circuit further includes an analog-to-digital converter 260 configured to convert the output signal from the sensor element into a digital signal, a series of logic 0s and 1s that encodes the sensing of the fingerprint features by the pixels or data contact points of the sensor surface 107. Such signals may be separately received from the motion sensors and the fingerprint sensing surfaces as discussed in the embodiments above, and may be read out and processed separately. The readout circuit may store the output signal in storage 262, where fingerprint data 264 is stored and preserved, either temporarily until the processor 266 can process the signal, or for later use by the processor. The processor 266 includes arithmetic unit 268 configured to process algorithms used for navigation of a cursor, such as that described in connection with the navigation features of FIG. 2A, and for reconstruction of fingerprints. Processing logic 270 is configured to process information and includes analog-to-digital converters, amplifiers, signal filters, logic gates (all not shown) and other logic utilized by a processor. Persistent memory 274 is used to store algorithms 276 and software applications 278 that are used by the processor for the various functions described above and in more detail below. The system bus 280 is a data bus configured to enable communication among the various components in the system 100.



FIG. 10 depicts the operation of the invention as a section 101 of fingerprint 100 passes over the sensor array 202. Sensor array 202 comprises a number of imaging pixel elements arranged along the axis of motion of the finger with a pixel density sufficient to resolve fingerprint ridges and valleys, typically 250-500 dpi. The pixels may sense the presence or absence of a fingerprint ridge through a variety of techniques, such as capacitance, optical imaging, or mechanical pressure. The array of imaging pixels 202 is sampled at a predetermined rate, sufficient to ensure that the finger will not travel more than two pixels in a sample period. Any reasonable time period could be set, but one example is 500 microseconds. In this embodiment, the pixels are configured as a single extended array, and software may subdivide the larger array into a number of potentially overlapping windows.
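
A quick back-of-envelope check of these numbers, assuming a 500 dpi array and the 500 microsecond example period (the two-pixel bound comes from the text above; the arithmetic is our own):

```python
# Maximum finger speed that still moves at most two pixels per
# sample period, for an assumed 500 dpi array sampled every 500 usec.
DPI = 500
SAMPLE_PERIOD_S = 500e-6          # 500 usec
pixel_pitch_m = 25.4e-3 / DPI     # ~50.8 micrometers
max_speed = 2 * pixel_pitch_m / SAMPLE_PERIOD_S
print(f"{max_speed:.3f} m/s")     # ~0.203 m/s, i.e. about 20 cm/s
```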


At each sample time, the state of the sense elements is converted to a series of numerical values, shown as digitized segments 203a, 203b. For simplicity, digitized segments 203a, 203b show a binary digitization, indicating the presence or absence of a ridge. The sensor values may be encoded with higher precision if the chosen sensor methodology allows. Because the two image samples 203a and 203b were taken along the axis of motion 106 at different times, they may be sequentially shifted and compared against each other until a match is found, yielding an absolute distance of motion D over the period T between the samples and thus a direct finger velocity measurement D/T.
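
A minimal sketch of this shift-and-compare matching, assuming one-bit segments; the score convention, matching pixels divided by the overlap length, is inferred from the fractional scores tabulated below:

```python
def similarity(a, b, shift):
    """Fraction of matching pixels when segment b is shifted by
    `shift` pixels along segment a (matches / overlap length)."""
    overlap = len(a) - shift          # caller keeps shift < len(a)
    matches = sum(1 for i in range(overlap) if a[i + shift] == b[i])
    return matches / overlap

def best_shift(a, b, max_shift):
    """Return (shift, score) for the highest-scoring alignment."""
    return max(((s, similarity(a, b, s)) for s in range(max_shift + 1)),
               key=lambda pair: pair[1])

def velocity(shift_px, pixel_pitch_m, period_s):
    """Direct velocity measurement D/T from the matched shift."""
    return shift_px * pixel_pitch_m / period_s
```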


Unlike conventional systems and methods, the system does not have to accumulate a large time history when no motion is detected between samples 203a and 203b. It can simply retain the earlier sample 203a and perform a new computation when the next sample is acquired. This is advantageous when there is no prior knowledge of the approximate speed of the finger. In practice, the finger velocity relative to the sensor surface often varies greatly. The invention eliminates the need for a large buffer of samples to cover a wide dynamic range of finger speeds.


A further advantage offered by the invention is the ability to adjust the sample rate, and therefore the distance of motion traveled between samples, as a function of finger velocity. As the finger velocity increases, the number of sample periods required to traverse between two adjacent pixels decreases. This effectively decreases the resolution of a velocity measurement, and as the uncertainty of the measurement approaches the measurement period, all resolution is lost. Accordingly, in order to maintain the accuracy of the estimated velocity, the measurement system may adjust the sample rate to optimize the distance traveled when looking for a match between two frames. For example, requiring ten pixels of motion at fast finger swipe speeds can ensure 10% accuracy in velocity measurements. Conversely, as the finger velocity decreases, the number of time samples required to travel a significant distance increases. In this case, the system could decrease the sample rate and reduce the distance traveled for a match to as little as one pixel. This provides a significantly more rapid response to motion changes in navigation applications and better tracks the finger velocity changes used to reconstruct two-dimensional images from a one-dimensional sensor. Those skilled in the art will understand that there are various methods for changing the sample rate to achieve these and other objectives; the invention is not limited to any particular method, and moreover is inclusive of the various known methods as well as methods readily ascertainable by one skilled in the art without undue experimentation.
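
One plausible way to implement such an adjustment, sketched here under assumed names and bounds rather than as the patent's own algorithm:

```python
# Hedged sketch: scale the sample period so that roughly `target_px`
# pixels of motion occur between frames, keeping the relative error
# of the velocity estimate bounded. Bounds are invented examples.
def next_sample_period(period_s, last_shift_px, target_px=10,
                       min_period_s=100e-6, max_period_s=10e-3):
    if last_shift_px == 0:
        return max_period_s        # finger nearly still: sample slowly
    scaled = period_s * target_px / last_shift_px
    return min(max_period_s, max(min_period_s, scaled))
```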



FIGS. 11a and 11b show the digitization results sampled at two instances 203a and 203b as the finger moves in a downward direction 306. In this example, the finger has traveled downward approximately 7 pixels between the two samples. FIGS. 11a-11j illustrate results from a similarity comparison between samples 203a and 203b that were converted into binary numbers, giving the following match results:

Pixel Shift    FIG. 11    Score
0              (a)        (9/16) ≈ 0.56
1              (b)        (7/15) ≈ 0.47
2              (c)        (9/14) ≈ 0.64
3              (d)        (4/13) ≈ 0.31
4              (e)        (7/12) ≈ 0.58
5              (f)        (8/11) ≈ 0.73
6              (g)        (1/10) = 0.10
7              (h)        (8/9)  ≈ 0.89
8              (j)        (3/8)  ≈ 0.38

The match results show a strong correlation with the actual motion: seven pixels of vertical distance are clearly distinguished in just one sample pair, even though the ridge frequency is fairly uniform for the selected segment of the fingerprint. It should also be clear to those knowledgeable in the art that the accuracy of the match would be significantly enhanced by additional levels of gray scale in the pixel data.


FIGS. 12B(a) and 12B(b) depict an embodiment of the invention that includes three linear arrays disposed at different angles to measure motion across a range of angles from the principal axis (in this case +/−25 degrees from the main axis). The central imaging array 302 is augmented with an array 301 oriented at a −25 degree angle to the central axis and an array 303 oriented at a +25 degree angle to the central axis. It will be understood by those skilled in the art that, given this disclosure, various different angles of the arrays can be implemented, as well as different numbers of arrays.


In FIG. 12B(b) we see the image of a fingerprint at the initial starting position superimposed on the sensor arrays, and the resulting binary images 304a, 305a, and 306a with the finger in the initial position. In FIG. 12B(a), the finger has moved a short distance at an approximately +25 degree angle, shown between positions 310 and 311, and the resulting binary images are shown in 304b, 305b, and 306b. The following table shows the results of binary comparison for the pairings 304a/304b, 305a/305b, and 306a/306b using the shift and compare method previously described:



Pixel Shift    Score 304    Score 305    Score 306
0              0.38         0.44         0.38
1              0.67         0.47         0.67
2              0.64         0.43         0.21
3              0.38         0.54         0.77
4              0.50         0.33         0.50
5              0.82         0.36         0.18
6              0.40         0.60         1.00
7              0.44         0.44         0.22
8              0.50         0.25         0.50

Because the motion principally follows the axis of sensor 303, the correlation for the pairing 306a/306b is strong at the correct six pixel distance, but the pairings 304a/304b and 305a/305b show weak correlation. When the direction of motion is at an angle between the axes of any two of the sensor arrays, a correlation will be found in both of those sensors, and the true motion can be found by taking the vector sum of the estimates from the two sensors.


The example above covers the simple case where the motion is completely aligned with one of the sensor axes. In the case of motion that lies between two axes, the distance a feature travels along a sensor array will be less than the entire length of the sensor. To detect motion across a range of angles, sensor arrays must be provided at a series of angles disposed so that a match will be found on at least two of the sensor arrays. For example, by arranging the arrays in 30 degree increments across the allowable range of motion axes, it is possible to ensure that even with worst case alignment (i.e., a 15 degree misalignment between the actual axis of motion and the two sensor arrays on either side of it), an image feature will still approximately follow the nearest sensor arrays for more than three pixels of travel. Thus, by sampling the sensor arrays fast enough to ensure that the finger has not traveled more than three pixels between samples, it is possible to determine the axis of motion by finding the adjacent pair of sensors with the highest correlation, and computing the vector sum of the distances traveled along each of them.
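
A minimal sketch of the vector sum described above, with assumed function names; the example reuses the +/−25 degree geometry of FIGS. 12B(a) and 12B(b):

```python
import math

def vector_sum(axis1_deg, d1, axis2_deg, d2):
    """Combine per-axis distance estimates from the two best-matching
    sensor arrays into a single (dx, dy) motion estimate."""
    a1, a2 = math.radians(axis1_deg), math.radians(axis2_deg)
    dx = d1 * math.cos(a1) + d2 * math.cos(a2)
    dy = d1 * math.sin(a1) + d2 * math.sin(a2)
    return dx, dy

# Example: six pixels along the +25 degree array and negligible
# motion on its neighbor recover a heading close to +25 degrees.
dx, dy = vector_sum(25, 6, -25, 0)
print(math.degrees(math.atan2(dy, dx)))  # ~25.0
```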


Referring to FIG. 12, a flow chart 1200 is illustrated that shows one embodiment of a motion sensor process that can be used for simply detecting and sensing motion, in conjunction with an image sensor for use in reconstructing a fingerprint image, or in navigation applications or other applications where accurate motion sensing is desired. The process begins at step 1202. In step 1204, an initial sample array of a fingerprint is sensed. In step 1206, a second sample array is sensed after a period of time, t=n. The arrays are converted into a digital representation of the array of fingerprint sensors, and a string of digital ones and zeros is used by a processor to determine the relative movement between the two samplings. In practice, a predetermined period of time can be selected, or the time between the first and second samples can be measured. In either case, once the distance between the two samples is determined, and assuming that movement has occurred, velocity can be calculated as the distance traveled divided by the time expended during such travel. Continuing, in step 1208, the two arrays are compared. In an initial alignment, referring briefly to FIG. 10, the arrays are compared side by side. If this comparison shows a high correlation, it indicates no relative motion between the fingerprint and the motion sensor.


In step 1210, a similarity score is generated, defining the amount of correlation between the two arrays. This may be in the form of a probability value, a percentage correlation value, or another mathematical value that the processor can use to determine the best similarity score among different comparisons. In step 1212, it is determined whether the similarity score falls within a threshold. In one embodiment, the threshold is a predetermined number chosen according to the particular application. In practice, the invention can be configured to produce correlations of a high value, justifying a high threshold. Those skilled in the art will understand that such a threshold can be determined without undue experimentation, and that it depends on the application. If the score does not fall within the threshold, then the arrays are shifted to an offset alignment in step 1214. The direction of the shifting may be chosen according to the direction in which a user would be expected to move the fingerprint surface across the sensor. If the direction is not known, or if the design calls for either direction, flexibility can be accommodated by shifting the arrays in both directions until an alignment is reached that is within the threshold. In either case, the process returns to step 1208, where the arrays are compared again. A new similarity score is generated in step 1210, and the new score is measured against the threshold. This process can be reiterated until a score passes the threshold, and the system could register an error if none is met over time or within a predetermined number of cycles. In a practical application, the two arrays can be shifted and processed once for each pixel in one array, since the arrays are equal in length, having been taken from the same sensor. If a score occurs that is within the threshold, then the distance is estimated in step 1216. This can be done by simply counting the number of pixels by which the arrays were shifted before a score within the threshold occurred, and multiplying this number by the distance between pixels, which can be estimated as the distance between the midpoints of two pixels. The distance can be measured more precisely by sampling distances between individual pixels and groups of pixels in an array, but the exact method of measurement depends on the application. Then, the velocity can be estimated in step 1218 by dividing the distance traveled by the time expended during the travel. The process ends at step 1220, where an estimated velocity value can be generated.
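
A hedged sketch of the FIG. 12 loop, reusing the similarity() helper from the earlier sketch; the 0.85 threshold is an invented example value:

```python
# Shift one pixel at a time until a score passes the threshold, then
# convert the shift to a distance and a velocity.
def estimate_motion(sample_a, sample_b, pixel_pitch_m, period_s,
                    threshold=0.85):
    for shift in range(len(sample_a)):      # one shift per pixel
        if similarity(sample_a, sample_b, shift) >= threshold:
            distance = shift * pixel_pitch_m
            return distance, distance / period_s
    return None   # no score passed the threshold: register an error
```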


Referring to FIG. 13, another flow chart 1300 is illustrated that shows one embodiment of a motion sensor process that can be used for simply detecting and sensing motion, in conjunction with an image sensor for use in reconstructing a fingerprint image, or in navigation applications or other applications where accurate motion sensing is desired. The process begins at step 1302. In step 1304, an initial sample array of a fingerprint is sensed. In step 1306, a second sample array is sensed after a period of time, t=n. The arrays are converted into a digital representation of the array of fingerprint sensors, and a string of digital ones and zeros is used by a processor to determine the relative movement between the two samplings. In practice, a predetermined period of time can be selected, or the time between the first and second samples can be measured. In either case, once the distance between the two samples is determined, and assuming that movement has occurred, velocity can be calculated as the distance traveled divided by the time expended during such travel.


Continuing, in step 1308, the two arrays are compared. In an initial alignment, referring briefly to FIG. 10, the arrays are compared side by side. If this comparison shows a high correlation, it indicates no relative motion between the fingerprint and the motion sensor. In step 1310, a similarity score is generated, defining the amount of correlation between the two arrays. This may be in the form of a probability value, a percentage correlation value, or another mathematical value that the processor can use to determine the best similarity score among different comparisons. In step 1312, it is determined whether the shift is the last shift in a predetermined number of shifts. In practice, it is reasonable to shift at least the number of pixels in the array sensor, since both image arrays are sensed and sampled by the same sensor array. Again, similar to the process embodied in FIG. 12, the direction of the shifting may be chosen according to the direction in which a user would be expected to move the fingerprint surface across the sensor. If the direction is not known, or if the design calls for either direction, flexibility can be accommodated by shifting the arrays in both directions. If it is not the last shift, then the array is shifted in step 1314, and the process returns to step 1308, where the arrays are again compared, a new score is generated in step 1310, and it is again queried whether this is the last shift. If it is the last shift, then the highest similarity score is chosen in step 1316.


Then the distance is estimated in step 1318. Again, this can be done by simply counting the number of pixels by which the arrays were shifted to reach the highest similarity score, and multiplying this number by the distance between pixels, which can be estimated as the distance between the midpoints of two pixels. The distance can be measured more precisely by sampling distances between individual pixels and groups of pixels in an array, but the exact method of measurement depends on the application. Then, the velocity can be estimated in step 1320 by dividing the distance traveled by the time expended during the travel. The process ends in step 1322, where a velocity value can be generated.
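
Continuing the earlier sketches (best_shift is defined above), the FIG. 13 variant exhausts a fixed number of shifts and keeps the best score rather than exiting at a threshold; the sample data and constants below are placeholders for illustration:

```python
sample_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
sample_b = sample_a[7:] + [0, 1, 1, 0, 1, 0, 0]   # same row, 7 px later
shift, score = best_shift(sample_a, sample_b, max_shift=len(sample_a) - 1)
distance_m = shift * 50.8e-6          # pixel pitch at 500 dpi
velocity_mps = distance_m / 500e-6    # sample period T = 500 usec
print(shift, round(velocity_mps, 3))  # 7 pixels -> ~0.711 m/s
```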


Referring to FIG. 14, another flow chart 1400 is illustrated that shows one embodiment of a motion sensor process in which multiple sensor arrays are used to produce motion and direction information, whether for simply detecting and sensing motion, in conjunction with an image sensor for use in reconstructing a fingerprint image, or in navigation applications or other applications where accurate motion sensing is desired. The process begins at step 1402. In step 1404, initial sample arrays of a fingerprint are sensed. In step 1406, a second set of sample arrays is sensed after a period of time, t=n. The arrays are converted into a digital representation of the arrays of fingerprint sensors, and a string of digital ones and zeros is used by a processor to determine the relative movement between each pair of samplings from each sensor. In practice, a predetermined period of time can be selected, or the time between the first and second samples can be measured. In either case, once the distance is determined between the two samples, and assuming that movement has occurred, velocity can be calculated as the distance traveled divided by the time expended during such travel, and direction can be determined using vector analysis of the several sensors' motion information.


Continuing, in step 1408, the two arrays are compared for each sensor. In an initial alignment, referring briefly to FIG. 10, the digital representations of the arrays of features are compared side by side for each sensor array. If this initial comparison shows a high correlation, it indicates no relative motion between the fingerprint and the motion sensor. In step 1410, a similarity score is generated for each array, defining the amount of correlation between the two arrays. This may be in the form of a probability value, a percentage correlation value, or another mathematical value that the processor can use to determine the best similarity score among different comparisons. In step 1412, it is determined whether the shift is the last shift in a predetermined number of shifts. In practice, it is reasonable to shift at least the number of pixels in each of the array sensors, since both image arrays from each sensor are sensed and sampled by the same sensor array. Again, similar to the process embodied in FIG. 12, the direction of the shifting may be chosen according to the direction in which a user would be expected to move the fingerprint surface across the sensor. If the direction is not known, or if the design calls for either direction, flexibility can be accommodated by shifting the arrays in both directions. If it is not the last shift, then the array is shifted in step 1414, and the process returns to step 1408, where the arrays are again compared, a new score is generated in step 1410, and it is again queried whether this is the last shift. If it is the last shift, then the highest similarity score is chosen in step 1416. In step 1417, the predominant direction of motion is determined by selecting the array with the highest similarity score at its local maximum. The arrays adjacent to the array on the predominant motion axis are then examined to determine whether either of their similarity scores exceeds the threshold for a secondary component axis (this threshold is lower than the threshold for the predominant axis).


Then the distance is estimated in step 1418. Again, this can be done by simply counting the number of pixels by which the arrays were shifted, and multiplying this number by the distance between pixels, which can be estimated as the distance between the midpoints of two pixels. The distance can be measured more precisely by sampling distances between individual pixels and groups of pixels in an array, but the exact method of measurement depends on the application.


If the similarity score for either of the adjacent arrays exceeds the threshold and this similarity score occurs at a distance less than the distance traveled on the predominant axis, then the principal axis of motion is assumed to lie between the predominant axis and this second axis. The angle of motion is then estimated by computing the ratio of distances along the predominant and secondary axes. The ratio of these distances is approximately equal to the ratio of the cosines of the angles between the actual axis of motion and the axes of the two sensor arrays.


The final estimated distance is computed by taking the distance measured on the predominant axis sensor and dividing it by the cosine of the difference between the estimated angle of motion and the angle of the sensor axis.
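
The angle and distance recovery described in the last two paragraphs admits a closed form: with the sensor axes delta degrees apart and alpha the angle between the actual motion and the predominant axis, d_sec/d_pred = cos(delta − alpha)/cos(alpha), which rearranges to tan(alpha) = (r − cos(delta))/sin(delta) for r = d_sec/d_pred. A minimal sketch under assumed names:

```python
import math

def resolve_motion(d_pred, d_sec, delta_deg=30.0):
    """Given distances measured along the predominant axis (d_pred)
    and an adjacent secondary axis (d_sec), separated by delta_deg,
    return (angle from predominant axis in degrees, true distance)."""
    delta = math.radians(delta_deg)
    r = d_sec / d_pred
    alpha = math.atan((r - math.cos(delta)) / math.sin(delta))
    true_distance = d_pred / math.cos(alpha)
    return math.degrees(alpha), true_distance

# Example: motion of 10 pixels at 10 degrees off the predominant axis
# gives d_pred = 10*cos(10 deg) ~ 9.85 and d_sec = 10*cos(20 deg) ~ 9.40.
print(resolve_motion(9.85, 9.40))  # ~ (10.0, 10.0)
```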


Then, the velocity can be estimated in step 1420 by dividing the distance traveled by the time expended during the travel. The process ends in step 1422 where a velocity value can be generated.


The invention may also involve a number of functions to be performed by a computer processor, such as a microprocessor. The microprocessor may be a specialized or dedicated microprocessor that is configured to perform particular tasks by executing machine-readable software code that defines the particular tasks. The microprocessor may also be configured to operate and communicate with other devices such as direct memory access modules, memory storage devices, Internet related hardware, and other devices that relate to the transmission of data in accordance with the invention. The software code may be configured using software formats such as Java, C++, XML (Extensible Mark-up Language) and other languages that may be used to define functions that relate to operations of devices required to carry out the functional operations related to the invention. The code may be written in different forms and styles, many of which are known to those skilled in the art. Different code formats, code configurations, styles and forms of software programs and other means of configuring code to define the operations of a microprocessor in accordance with the invention will not depart from the spirit and scope of the invention.


Within the different types of computers, such as computer servers, that utilize the invention, there exist different types of memory devices for storing and retrieving information while performing functions according to the invention. Cache memory devices are often included in such computers for use by the central processing unit as a convenient storage location for information that is frequently stored and retrieved. Similarly, a persistent memory is also frequently used with such computers for maintaining information that is frequently retrieved by a central processing unit, but that is not often altered within the persistent memory, unlike the cache memory. Main memory is also usually included for storing and retrieving larger amounts of information such as data and software applications configured to perform functions according to the invention when executed by the central processing unit. These memory devices may be configured as random access memory (RAM), static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, and other memory storage devices that may be accessed by a central processing unit to store and retrieve information. The invention is not limited to any particular type of memory device, or any commonly used protocol for storing and retrieving information to and from these memory devices respectively.


The apparatus and method described herein enable and control fingerprint sensors and process fingerprint image data and motion data in conjunction with the operation of an electronic device where navigation and fingerprint verification processes are utilized. Although this embodiment is described and illustrated in the context of devices, systems and related methods of imaging fingerprints and navigation features for a portable device, the scope of the invention extends to other applications where such functions are useful. Furthermore, while the foregoing description has been with reference to particular embodiments of the invention, it will be appreciated that these are only illustrative of the invention and that changes may be made to those embodiments without departing from the principles of the invention.

Claims
  • 1. A method of determining motion of a fingerprint surface with respect to a sensor surface, comprising: sensing at least two temporally separated sets, each of a plurality of pixels in a fingerprint image along a sensor axis of the sensor surface, the sensing performed by a linear array of fingerprint image feature detection sensors spaced along the sensor axis; storing digital data corresponding to each of the plurality of pixels in each of the at least two sets of a plurality of pixels; processing the digital data to generate fingerprint motion data, wherein generating fingerprint motion data comprises comparing the digital data corresponding to each of the plurality of pixels in a first of the at least two sets of a plurality of pixels in the fingerprint image to the digital data corresponding to each of the plurality of pixels in a second of the at least two sets of a plurality of pixels in the fingerprint image, through sequentially shifting the pixels in the first of the at least two sets with respect to the pixels in the second of the at least two sets, until a match is found for the pixels in the first of the at least two sets with respect to the pixels in the second of the at least two sets; estimating the distance D traveled by the fingerprint surface with respect to the sensor surface by multiplying a pixel shift required to achieve the match times a pitch of the image feature detection sensors; and computing the velocity measurement D/T of the fingerprint surface with respect to the sensor surface by dividing the estimated distance D by the temporal difference T between sensing the first set of the at least two sets and the second set of the at least two sets.
  • 2. A method of determining motion of a fingerprint surface with respect to a sensor surface, comprising: collecting at least a first image sample and a second image sample from a linear sensor array, wherein each of the at least a first image sample and a second image sample has a series of pixel values corresponding to substantially contiguous fingerprint image features along an axis of the linear sensor array, taken at different times; generating, by a processor in communication with the linear sensor array, a list of similarity match scores by comparing the respective series of pixels corresponding to each of the at least a first image sample and a second image sample; determining, by the processor, a pixel shift between the at least the first image sample and the second image sample resulting in a highest similarity match score; estimating the distance traveled by multiplying the pixel shift times a pitch of sensors in the linear sensor array; and computing a velocity estimate by dividing the estimated distance by the time expended between collecting the first image sample and the second image sample with the highest match score.
  • 3. A method of determining motion of a fingerprint surface with respect to a sensor surface, comprising: collecting at least a first image and a second image, each image comprising values for each of a plurality of pixels in a first linear image sensor array along a first axis of the array taken respectively at a first time and a second time; generating, by a processor in communication with the first linear image sensor array, a list of similarity match scores for each of a plurality of pixel shifted versions of the at least the first image and the second image; determining, by the processor, a pixel shift between the at least the first image and the second image that results in a highest similarity match score; estimating the distance traveled along the first axis by multiplying the number of pixels in the pixel shift by a pitch of sensors in the linear image sensor array; and computing the velocity of the fingerprint surface with respect to the sensor surface and producing a velocity value equal to multiplying the distance traveled by the difference between the first time and the second time.
  • 4. The method of claim 3, further comprising: collecting from at least a second linear image sensor array having a second axis at least a third image and a fourth image; generating, by a processor in communication with the second linear image sensor array, a list of similarity match scores for each of a plurality of pixel shifted versions of the at least a third image and a fourth image that has a next highest similarity match score to compute a velocity of the fingerprint in the second axis by determining a pixel shift in the second axis with a highest similarity match score; and computing, by the processor, a velocity and direction as a vector sum of the velocity in the first axis and the velocity in the second axis.
  • 5. A system for determining the motion of a fingerprint surface with respect to a sensor surface, comprising: a sensing means, including a first linear array of image feature detection sensors spaced along a first axis of motion with respect to the sensing means, for sensing at least two temporally separated sets, each of a plurality of pixels in a fingerprint image along the first axis of motion; memory means for storing digital data corresponding to each of the plurality of pixels in each of the at least two sets; processor means for processing the digital data to generate fingerprint motion data, wherein generating fingerprint motion data comprises comparing the digital data corresponding to each of the plurality of pixels in a first of the at least two sets of a plurality of pixels in the fingerprint image to the digital data corresponding to each of a plurality of pixels in a second of the at least two sets of a plurality of pixels in the fingerprint image, through sequentially shifting the pixels in the first of the at least two sets with respect to the pixels in the second of the at least two sets, and for computing a respective list of similarity match scores for the first set of a plurality of pixels and the second set of a plurality of pixels for each of a plurality of pixel shift positions, until a match is found for the pixels in the first set with respect to the pixels in the second set, as indicated by the occurrence of a highest of the computed similarity match scores, the processor means also being for estimating a distance D traveled by the fingerprint surface with respect to the sensor surface by multiplying a pixel shift required to achieve the match times the physical distance between image feature detection sensors in the first linear array as measured from centerline to centerline of the respective image feature detection sensors, and the processor means also being for computing a first velocity measurement D/T of the fingerprint surface with respect to the sensor surface by dividing the estimated distance D by the temporal difference between sensing of the first of the at least two sets and the second of the at least two sets in the first axis; a sensing means, including a second linear array of image feature detection sensors spaced along a second axis with respect to the sensing means, for sensing at least two temporally separated sets, each of a plurality of pixels in a third fingerprint image and a fourth fingerprint image along the second axis of motion; the memory means being for storing digital data corresponding to each of the plurality of pixels in each of the third fingerprint feature image and the fourth fingerprint feature image along the second axis; the processor means being for processing the digital data to generate fingerprint motion data, wherein generating fingerprint motion data comprises comparing the digital data corresponding to each of the plurality of pixels in the third fingerprint feature image in the second axis to the digital data corresponding to each of the plurality of pixels in a fourth fingerprint feature image in the second axis, through sequentially shifting the pixels in the third image with respect to the pixels in the fourth image, and computing a respective list of similarity match scores for the first set of a plurality of pixels and the second set of a plurality of pixels for each of a plurality of pixel shift positions, until a match is found for the pixels in the third image with respect to the pixels in the fourth image, as indicated by the occurrence of a highest of the computed similarity match scores; the processor means also being for estimating a distance D traveled by the fingerprint surface with respect to the sensor surface in the second axis by multiplying a pixel shift required to achieve the match times the physical distance between the fingerprint feature image sensors in the second linear array as measured from centerline to centerline of the respective image feature detection sensors; and the processor means also being for computing a first velocity measurement D/T of the fingerprint surface with respect to the sensor surface in the second axis by dividing the estimated distance D by the temporal difference between sensing of the third image and the fourth image; the processor means for determining which of the first sensor array and the second sensor array produced a highest similarity match score and for selecting the one of the first sensor array and the second sensor array that produced the highest similarity match score to represent a principal axis of motion; and the processor means being for generating fingerprint motion and direction data using the velocity in the principal axis of motion and the velocity in the other axis of motion.
  • 6. The system of claim 5, further comprising: the processor means being for producing navigation data using the fingerprint motion data.
  • 7. The system of claim 5, further comprising the processor means being for sensing portions of a fingerprint image to be reconstructed into a full fingerprint image using the fingerprint motion data.
  • 8. The system of claim 5, further comprising the processor means being for sensing portions of a fingerprint image to be reconstructed into a full fingerprint image using fingerprint motion and direction data.
  • 9. A method of determining motion of a fingerprint surface with respect to a sensor surface, comprising: sensing, via a computing device, at least two temporally separated sets, each of a plurality of pixels in a fingerprint image along an axis of motion of a fingerprint surface with respect to a sensor surface, the sensing performed by a linear array of image feature detection sensors spaced along the axis of motion; storing digital data corresponding to each of the plurality of pixels in each of the at least two sets; processing, via a computing device, the digital data to generate fingerprint motion data, wherein generating fingerprint motion data comprises comparing, via the computing device, the digital data corresponding to each of the plurality of pixels in a first of the at least two sets of a plurality of pixels in the fingerprint image to the digital data corresponding to each of the plurality of pixels in a second of the at least two sets of a plurality of pixels in the fingerprint image through sequentially shifting the pixels in the first of the at least two sets with respect to the pixels in the second of the at least two sets, until a match is found for the pixels in the first and second sets; estimating, via the computing device, the distance D traveled by the fingerprint surface with respect to the sensor surface by multiplying the pixel shift required to achieve the match times the physical distance between the pixels in the image feature detection sensor as measured from centerline to centerline of the respective image feature detection sensors; computing, via the computing device, the velocity measurement D/T of the fingerprint surface with respect to the sensor surface by dividing the estimated distance D by the temporal difference between sensing the first of the at least two sets and the second of the at least two sets; and reconstructing a fingerprint image using the velocity.
  • 10. A method of determining motion of a fingerprint surface with respect to a sensor surface, comprising: sensing, via a computing device, at least two temporally separated sets, each of a plurality of pixels in a fingerprint image along an axis of motion of a fingerprint surface with respect to a sensor surface, the sensing performed by a linear array of image feature detection sensors spaced along the axis of motion; storing digital data corresponding to each of the plurality of pixels in each of the at least two sets; processing, via a computing device, the digital data to generate fingerprint motion data, wherein generating fingerprint motion data comprises comparing, via the computing device, the digital data corresponding to each of the plurality of pixels in a first of the at least two sets of a plurality of pixels in the fingerprint image to the digital data corresponding to each of the plurality of pixels in a second of the at least two sets of a plurality of pixels in the fingerprint image through sequentially shifting the pixels in the first of the at least two sets with respect to the pixels in the second of the at least two sets, until a match is found for the pixels in the first and second sets; estimating, via the computing device, the distance D traveled by the fingerprint surface with respect to the sensor surface by multiplying the pixel shift required to achieve the match times the physical distance between the pixels in the image feature detection sensor as measured from centerline to centerline of the respective image feature detection sensors; and computing, via the computing device, the velocity measurement D/T of the fingerprint surface with respect to the sensor surface by dividing the estimated distance D by the temporal difference between sensing the first of the at least two sets and the second of the at least two sets.