User input utilizing dual line scanner apparatus and method

Information

  • Patent Grant
  • Patent Number
    8,538,097
  • Date Filed
    Wednesday, January 26, 2011
  • Date Issued
    Tuesday, September 17, 2013
Abstract
A user input method and apparatus may comprise a two line object imaging sensor having a primary line scan-sensor providing a primary line scan-sensor output and a secondary line scan-sensor providing a secondary line scan-sensor output, representing pixels in a current primary scan row and a current secondary scan row, and adapted to scan an object; a storage apparatus storing, for each scan time, each current primary line scan-sensor output and each current secondary line scan-sensor output; and a correlation unit correlating at least one of the current representations of pixels in a primary line sensor output with stored representations and the current representations of pixels in a secondary line sensor output with stored representations, the correlation unit providing as an output a motion indicator.
Description
BACKGROUND OF THE INVENTION

Some conventional fingerprint scanners include large, postage-stamp size units, called contact or placement sensors, that sense an entire fingerprint at once (e.g., an entire fingerprint image of 200-500 rows and 128-200 columns of pixels). Other fingerprint scanners include smaller swipe scanners incorporated into laptop and notebook computers, mobile phones, mobile email devices, and smartphones. Smaller swipe scanners are much less expensive to manufacture than larger placement scanners. Stationary swipe fingerprint scanners sense a finger being swiped across the scanner and can be single line, dual line, or multi-line scanners.


One example of a dual line scanner is disclosed in U.S. Pat. No. 6,002,815, issued to Immega et al. on Dec. 14, 1999 (“Immega”), the entire contents of which are herein incorporated by reference. The Immega dual line scanner must determine and track the velocity of the finger as it passes over the sensor, and it performs 1×n linear array cross-correlation on current and historic line scans to initially image the fingerprint. The velocity of the finger must then be known in order to reconstruct the fingerprint image from the line scans.


Conventional fingerprint navigation methods require the velocity of the finger to be known. For example, United States Patent Application Publication No. 2010/0284565, entitled “Method and Apparatus for Fingerprint Motion Tracking Using an In-Line Array,” published on Nov. 11, 2010, and United States Patent Application Publication No. 2008/0063245, entitled “Method and Apparatus for Fingerprint Motion Tracking Using an In-Line Array for Use in Navigation Applications,” published on Mar. 13, 2008, each disclose matrix scanner arrays that image portions of a fingerprint and determine velocity and direction of movement with at least one linear array aligned to a direction of finger movement for user input navigation purposes.


Currently, a user input device (such as a mouse) uses various electrical and optical configurations to track the movement of the user's hand to control the position of a cursor on the screen or to click on icons or links. Such devices can be cumbersome when a portable computing device is being used in a tight space, such as on an airplane, and are inconvenient to carry along as an extra item. Built-in user input devices, such as those found on the casings of many laptop and notebook computing devices, have been found difficult to use. Built-in user input devices often lack the feeling of smooth response to the application of pressure to the pressure plate and are often too large and cumbersome for use on mobile phones and handheld computing devices.


Thus, there is a need for a very compact user input device including a fingerprint scanner that can serve to manipulate the position of a cursor on the screen of a computing device.


SUMMARY OF THE INVENTION

A user input method and apparatus may comprise a two line object imaging sensor having a primary line scan-sensor providing a primary line scan-sensor output and a secondary line scan-sensor providing a secondary line scan-sensor output, representing pixels in a current primary scan row and a current secondary scan row, and adapted to scan an object; a storage apparatus storing, for each scan time, each current primary line scan-sensor output and each current secondary line scan-sensor output; and a correlation unit correlating at least one of the current representations of pixels in a primary line sensor output with stored representations and the current representations of pixels in a secondary line sensor output with stored representations, the correlation unit providing as an output a motion indicator.


INCORPORATION BY REFERENCE

All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.





BRIEF DESCRIPTION OF THE DRAWINGS

The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:



FIG. 1 is a schematic block diagram of a basic configuration for a fingerprint scanning and image reconstruction system according to embodiments of the present disclosure.



FIG. 2 is a schematic view, partly in block diagram form, of a dual line fingerprint scanner according to one embodiment of the present disclosure.



FIG. 3 is a flow diagram for a user input device according to one embodiment of the present disclosure.



FIGS. 4a-4g are schematic illustrations of a cross correlation technique according to one embodiment of the present disclosure.



FIG. 5 is a schematic block diagram of a user input device according to one embodiment of the present disclosure.





DETAILED DESCRIPTION OF THE INVENTION

Before any embodiments of the invention are explained in detail, it is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the described drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used in this application is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” is meant to encompass the items listed thereafter and equivalents, as well as additional items. Unless specified or limited otherwise, the terms used are intended to cover variations ordinarily known, now or in the future. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings and can include both physical and electrical, magnetic, and capacitive couplings and connections.


The following discussion is presented to enable a person skilled in the art to make and use embodiments of the present disclosure. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures depict selected embodiments and are not intended to limit the scope of embodiments of the present disclosure.



FIG. 1 schematically illustrates a fingerprint scanning and image reconstruction system 200 according to embodiments of the present disclosure. The fingerprint scanning and image reconstruction system 200 includes a sensor 202 and an image reconstruction module 204. The image reconstruction module 204 can be connected to or integrated with a host computing device 206 (as shown in FIG. 2) and can receive inputs from the sensor 202. The host computing device 206 can be connected to a database 210. In some embodiments, the sensor 202 can also include a culling module 205 to reduce the amount of data transmitted over the bandwidth of the communication links, whether wired or wireless, between the sensor 202, the image reconstruction module 204, and the host computing device 206. Culling is a technique for keeping line scans with very little variation from one clock time to the next clock time from being sent to the image reconstruction module 204 and/or the host computing device 206. If there is no change from one clock time to the next clock time, the finger is not moving with respect to the sensor 202. It is well understood in the art that such essentially redundant scan lines are not useful in image reconstruction.
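
By way of illustration only, the culling test described above might be sketched in software as follows. This is a minimal sketch: the mean-absolute-difference metric, the threshold value, and the function name are illustrative assumptions, not details taken from the present disclosure.

    def should_cull(current_line, previous_line, threshold=2.0):
        """Return True when a new scan line is essentially redundant.

        Culling keeps near-duplicate scan lines (e.g., a finger that is
        not moving) from being sent to the image reconstruction module
        204 and/or the host computing device 206. The mean-absolute-
        difference metric and threshold used here are illustrative
        assumptions.
        """
        if previous_line is None:
            return False  # always keep the first scan line
        total = sum(abs(a - b) for a, b in zip(current_line, previous_line))
        return total / len(current_line) < threshold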



FIG. 2 schematically illustrates a dual line fingerprint scanner 220 according to one embodiment of the present disclosure. The dual line scanner 220 includes a primary linear scanner segment 230 and a secondary linear scanner segment 250. The primary linear scanner segment 230 can be a 1×n linear pixel array, where n is typically 128-200 pixel scan points (for illustrative purposes, only 12 pixel scan points 232 are shown in FIG. 2). The secondary linear scanner segment 250 can be a 1×n linear pixel array, where n is about half of the number of pixels in the primary linear scanner segment 230 (e.g., about 64-100 pixel scan points 252, but with only 6 pixel scan points 252 being shown in FIG. 2).


Drive signals are supplied to each pixel scan point 232, 252 through leads 234, 254 across from reference voltage plates 236, 256 using a multiplexer 270 connected to the leads 234, 254 through contacts 262. The responses to the drive signals are influenced by capacitive couplings between the leads 234, 254 and the voltage plates 236, 256 at the pixel scan points 232, 252, as sensed by sensors 272. The capacitive couplings are influenced by whether the portion of the fingerprint being scanned at the pixel scan points 232, 252 is a ridge or a valley of the fingerprint. The output of each pixel scan point 232, 252 is a gray scale value from zero to 255, a convenient byte-size data value range that is exemplary only; other gray scale granularities can be used. Typically, the gray scale value of zero is white and the gray scale value of 255 is black, with intervening incremental shades of gray between these values. The image reconstruction module 204 can perform image reconstruction using these scan lines and the gray scale values to reconstruct the fingerprint with dark indicating ridges and light indicating valleys.


Each pixel scan point 232, 252 is provided VHF (20-80 MHz) signal bursts in sequence, at a very high clock rate, e.g., 40 MHz, as described, for example, in U.S. Pat. No. 7,099,496, entitled SWIPED APERTURE CAPACITIVE FINGERPRINT SENSING SYSTEMS AND METHODS, issued to Benkley on Aug. 29, 2006, the disclosure of which is hereby incorporated by reference. The signal bursts are provided from the multiplexer 270 to the scan points 232, 252 through the respective leads 234, 254 having contacts 262. An output for each sequentially sampled pixel scan point 232, 252 is taken from the primary linear array scanner reference plate 236 and the secondary linear array scanner reference plate 256, and is influenced by the capacitive coupling between the respective lead 234, 254 and voltage plate 236, 256, which in turn depends upon whether, in the case of a finger, there is a fingerprint ridge or valley at the respective pixel scan point 232, 252. The leads 234, 254 may be provided the signal for a very short time period, e.g., 2-5 μsec, so that, compared to the speed of movement of the object, e.g., a finger, the scan is a linear single line across the object in the direction generally orthogonal to the object movement direction. The outputs can be sensed by the sensor 272 and sent to the host computing device 206 by the sensor unit 202. As noted above, each pixel scan point output is typically a gray scale value of 0-255, a convenient byte-size range that is exemplary only and could easily be another gray scale granularity, e.g., up to about 12-bit resolution, with 0 typically being white and 255 black. The host computing device 206 in the fingerprint imaging and reconstruction arrangement of the prior art can thus perform image reconstruction using these scanned lines and the gray scale values to form an image of the object, such as reconstructing the fingerprint 10 with dark indicating ridges and light indicating valleys. Various well-known techniques can also be used to sharpen edge contrast and the like to obtain a more faithful reconstructed image of the fingerprint in the example where a finger is being scanned and a fingerprint is being imaged.


The value of n in the primary linear scanner segment 230, such as 128, is the number of pixels in the x direction, with currently existing systems having around 128-200 pixels along the x axis. Leads 234, 254 of 25 μm in width, with spaces 238, 258 of 25 μm between the leads 234, 254, give a pitch R between the centerlines of the leads 234, 254 of 50 μm. The pitch determines the sensor's resolution in the x direction.


It will be understood that, for purposes of illustrating the operation of the disclosed subject matter only, +y is chosen to be the direction of movement of the object, such as the finger, as it moves first across the primary linear scan segment 230 and then across the secondary linear scan segment 250. Such orientation of the primary linear scan segment 230, the secondary linear scan segment 250, and the finger movement is not intended to be limiting. Indeed, the system 200 operates whether the finger moves in the just-oriented +y direction or in the −y direction, and on certain kinds of user devices for which the disclosed subject matter can be utilized to provide user input, the user may not always swipe the finger in the “right” direction; the system can nevertheless determine the direction of movement, as is explained in more detail in the present application. Note that the +y direction as oriented to the plane of the paper on which FIG. 2 appears is “down” rather than “up.” “Up,” as used in the present application, unless otherwise clearly stated, shall mean the +y axis direction and “down” shall mean the −y axis direction, with the +y axis direction extending from the primary scan portion 230 to the secondary scan portion 250.


Applicant has determined that this same basic object (i.e., finger) scanning and image (i.e., fingerprint) regeneration system 200 can be the basis for a very compact and miniaturized computing device user input, as will be explained in regard to possible embodiments of the disclosed subject matter. Turning now to FIG. 3, there is shown a block diagram of a process 300 flow that can be utilized according to aspects of the disclosed subject matter. The process starts at START, and in block 304 a new scan line is saved in the circular buffer 210. The buffer 210 may be a first-in-first-out store, such as the circular buffer 210 shown in FIG. 2, which can have any selected number of buffer locations (not shown), such as 8, 16, or 32. A 16-row buffer is used in the preferred embodiment; however, an illustrative buffer with 8 locations is described in relation to an embodiment of the disclosed subject matter. The data that has been in a respective primary or secondary scan line circular buffer the longest, representing a single line scan by the primary linear scan array 230 or the secondary linear scan array 250, respectively, will be dropped from the respective circular buffer 210 as a new scan line is entered, when the buffer 210 is full.
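
As a minimal sketch of such a first-in-first-out store, assuming Python, the illustrative 8-location depth, and hypothetical class and method names:

    from collections import deque

    class ScanLineBuffer:
        """First-in-first-out store of the most recent scan lines.

        Mirrors the circular buffer 210 described above: once the
        buffer is full, saving a new scan line drops the oldest one.
        The depth of 8 matches the illustrative example; the preferred
        embodiment uses a 16-row buffer.
        """
        def __init__(self, depth=8):
            self._rows = deque(maxlen=depth)  # deque drops oldest when full

        def save(self, scan_line):
            self._rows.append(list(scan_line))

        def history(self):
            return list(self._rows)  # oldest first, newest last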


In block 306 of the process 300, it is determined whether the object (finger) is present. Various ways of performing this step are known in the art, such as calculating a mean, a variance, and the like, or combinations of these. If the finger is not present, as determined in block 308, the process 300 returns to START. If the finger is present, as determined in block 308, correlation begins in block 310.


To make the navigation algorithm track finger motion most reliably, it is necessary to decide whether a finger is likely to be on the sensor at the time tracking is attempted. It is also necessary to know whether the finger is in contact with enough pixels of the primary and/or secondary line to yield a correlation value with a high-enough confidence level. Such a “finger detect” operation preferably can be done before correlation, to save computing correlations when an object, such as a finger, is not there. Culling by itself cannot distinguish between a non-moving finger on the sensor and no finger at all.


There are many methods of doing this. A preferred embodiment can calculate the variance of the gray level pixels in a subset of either or both of the primary and secondary scan lines 230, 250. If that variance is below a threshold, no finger is detected; if it is above the threshold, a finger is detected. A preferred embodiment analyzes the variance in two subsets (left half and right half) to ensure both halves have a finger on either or both of the primary and secondary scan lines 230, 250 (each half's variance must be above some threshold). Other ways to do this in software include analyzing zero crossings of the pixels in the line, or the frequency content of the gray levels along the line, or combinations of both. In hardware, finger detect could be done through a micro-switch or other pressure-sensitive means, optically (blocking a light source with the finger), or electrically (measuring a bulk capacitance and/or resistance of the finger).
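
The left-half/right-half variance test described above might look like the following sketch; the threshold value and the function name are illustrative assumptions:

    def finger_present(scan_line, threshold=100.0):
        """Variance-based finger detect.

        Splits the line into left and right halves and requires each
        half's gray-level variance to exceed a threshold, so a finger
        must cover both halves of the array. The threshold is an
        illustrative assumption and would be tuned to the hardware.
        """
        def variance(pixels):
            mean = sum(pixels) / len(pixels)
            return sum((p - mean) ** 2 for p in pixels) / len(pixels)

        half = len(scan_line) // 2
        return (variance(scan_line[:half]) > threshold and
                variance(scan_line[half:]) > threshold)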


In blocks 320, 322 and 324, respectively, various correlations are performed, the nature of which is explained by truncated illustrative example with respect to FIG. 4 below. In block 320 the current primary scan line is correlated to past secondary scan lines contained in the circular buffer 210 in an attempt to sense downward movement, i.e., movement in the +y axis direction of the object being sensed, such as a finger, with respect to the primary linear scan array 230 and the secondary linear scan array 250. That is, a line of the object extending along the x axis, generally perpendicular to the longitudinal axis of each of the primary linear scan array 230 and the secondary linear scan array 250, passes over the primary linear scan array 230 before passing over the secondary linear scan array 250. When the object (i.e., the finger) is so scanned, correlation of the current primary scan line obtained from the primary linear scan array 230 with one of a plurality of secondary scan lines obtained from the secondary linear scan array 250, and stored in the secondary scan line circular buffer 210, can show that the finger is swiping in the +y axis direction (“down” in relation to the illustrated orientation of FIG. 2).


In block 322 the current secondary scan line can be correlated to past primary scan lines in the primary scan line circular buffer 210 in an attempt to sense −y axis direction movement of the object being sensed, such as a finger, with respect to the primary linear scan array 230 and the secondary linear scan array 250 (in this case “up” as viewed in FIG. 2). That is, a line of the object extending along the x axis, generally perpendicular to the longitudinal axis of each of the primary linear scan array 230 and the secondary linear scan array 250, passes over the secondary linear scan array 250 before passing over the primary linear scan array 230. When the object (i.e., the finger) is so scanned, correlation of the current secondary scan line from the secondary linear scan array 250 with one of a plurality of primary scan lines from the primary linear scan array 230, stored in the primary scan line circular buffer 210, can show that the finger is swiping in the −y axis direction, crossing the secondary linear scan line segment 250 first and then the primary linear scan line segment 230 (though “up” as viewed in FIG. 2).


In block 324 the current primary scan line is correlated to the immediately past primary scan line, which is stored in the buffer 210 containing past primary and secondary lines; the immediately past primary line is the most recent one added to the circular buffer. This correlation is performed in an attempt to sense purely sideways movement, in the x axis direction, of the object being sensed, such as a finger, with respect to the primary linear scan array 230. That is, when the object (i.e., the finger) is so scanned, correlation of the current primary scan line with the immediately past primary scan line can show that the finger is swiping in a sideways direction (±x axis direction only, i.e., with no corresponding movement in the +y or −y directions), as sketched below.
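
The three searches of blocks 320, 322 and 324 can be expressed with one common helper that scans a candidate set of stored rows for the best match. This is a hedged sketch only; `correlate` stands in for the NCC measure given in equation (ii) below, and all names are hypothetical:

    def best_match(current_line, candidate_rows, correlate):
        """Return (row_index, score) of the stored row that correlates
        best with the current line."""
        best_index, best_score = None, float("-inf")
        for index, row in enumerate(candidate_rows):
            score = correlate(current_line, row)
            if score > best_score:
                best_index, best_score = index, score
        return best_index, best_score

    # Block 320: current primary line against stored secondary lines.
    # Block 322: current secondary line against stored primary lines.
    # Block 324: current primary line against the most recently stored
    # primary line(s), for purely sideways motion.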


It will be understood that the foregoing is exemplary only and many modifications will occur to those skilled in the art, such as accounting for half-pixel registration of the purely sidewise movement, so that the current primary scan line may be compared to several immediate past primary scan lines, each kept in the primary scan line circular buffer for y axis direction correlation, to detect a pixel-aligned correlation with a second or third most recent primary scan line. It will also be understood that the secondary scan line could suitably be used for this sidewise motion detection, either as an alternative, as a verification, or to supply added data.


In blocks 330, 332 and 334 the process 300 may choose the past secondary scan line with the highest correlation to the current primary line (block 330), choose the past primary scan line with the highest correlation to the current secondary line (block 332), and choose the past primary scan line with the highest correlation to the current primary line (block 334). In block 336, the choice of the past primary with the highest correlation may be confirmed, e.g., by, as noted above, correlating the current secondary scan line to an immediate past secondary scan line(s) at the same x-lag chosen in block 334. Confidence may then be calculated in blocks 340, 342 and 344, which may be done by comparisons of the current correlation measure with past correlation measures or, in the case of the sidewise motion (blocks 334, 336), the secondary-to-secondary correlation mentioned for block 336.


In block 350 the process 300 chooses a direction, up, down or sideways, from the highest correlation confidence measure, which is the direction of the motion of the object, such as the finger. This may involve some choice algorithms, such as when no clear correlation measure can be selected over either one or both of the other two confidence measures. Such choice algorithms could rely on past results, e.g., the object was moving up, so assume that up is the right answer. Or, if none of the correlations is above a threshold, declare that no finger motion has occurred. Other algorithms for improving choice accuracy may be used as well, such as those described in further detail below.


In block 352 the process 300, based on a corresponding time lag for the y-lag (in the case of up/down movement), may determine a y-velocity, e.g., based on timestamps, as is well known in the art. Similarly, as explained illustratively with respect to FIG. 4, an x-velocity can be determined, e.g., based on the x-lag and the same determined time difference. In block 354 a motion in the x and y directions may then be computed based on the calculated x and y velocities and the elapsed time since the last processed primary linear scan array 230 and secondary linear scan array 250 scan line. Such a motion event may be stored in an event buffer 360 as an entry on an event list equal to the computed x and y motion, as is further explained below in regard to FIG. 5. The event list may then be processed, and any event that has matured can be sent to a calling program, such as the program 420 shown in FIG. 5.


In an alternate embodiment, the velocity calculation step 352 may be omitted entirely and motion may be calculated directly as a function of the x-lag and y-lag. For the y-lag, this function would be an inverse proportion: if the y-lag is smaller, then the y-motion would be larger, because the finger must be moving very quickly to have traversed from the primary to the secondary in a small number of scan lines. The corresponding x-motion would simply be a straight proportion of the calculated x-lag. Using this method, velocities need not be calculated and timestamps are not necessary. This is illustrated in more detail in co-pending application U.S. 2012/0189172 A1, published Jul. 26, 2012, entitled SYSTEM FOR AND METHOD OF IMAGE RECONSTRUCTION WITH DUAL LINE SCANNER USING LINE COUNTS, the disclosure of which is hereby incorporated by reference.
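
A sketch of this velocity-free calculation follows; the gain constants are illustrative assumptions, since the present disclosure does not give specific proportionality factors:

    def motion_from_lags(x_lag, y_lag, k_x=1.0, k_y=8.0):
        """Velocity-free motion estimate from correlation lags.

        The y-motion is taken as inversely proportional to the y-lag
        (a small lag means the finger crossed from the primary to the
        secondary quickly, hence a large motion per event), while the
        x-motion is a straight proportion of the x-lag. The gains k_x
        and k_y are illustrative assumptions.
        """
        delta_y = k_y / y_lag if y_lag else 0.0
        delta_x = k_x * x_lag
        return delta_x, delta_y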


Correlation may be illustrated schematically by considering FIG. 4. Correlation is actually done differently in a current actual embodiment of the disclosed subject matter, as explained below, for various reasons, such as computational time and memory capacity limitations, but could be implemented in the way illustrated in FIG. 4, which in any event illustrates the intent of correlation, such as cross-correlation, in the disclosed subject matter. Turning to FIG. 4, as illustrated in FIG. 4a, a primary linear scan array 230 line scan 370, e.g., taken from the schematically illustrated scanner 220 of FIG. 2, can be compared to a secondary linear scan array 250 scan line 372. In FIG. 4a, the six pixels of the scan line 372 are compared to the first six pixels in the scan line 370. It will be understood that the scan line 370 could be a current scan line and the secondary scan line 372 could be a past scan line from the secondary scan line circular buffer 210, or vice versa.


Various comparison algorithms can be used, depending in part on the size of the primary scan line 370, i.e., number of pixel locations available for comparison, and the size of the secondary scan line 372. For example, matching of four out of six of the secondary scan line 372 pixel locations with the portion of the primary scan line 370 pixel locations being compared, or four consecutive such pixels, or the like can be employed.


Each of the different symbols in the individual pixel locations, 12 for the primary scan line 370 and six for the secondary scan line 372, represents a different value, e.g., a gray scale value for the respective pixel location sensed by the dual line scanner 220. It will also be understood that for this exemplary implementation of cross correlation, due to noise in the system, the individual values represented by the different symbols may have to be flexibly valued within some range. That is, the symbol in a pixel location of the primary pixel line 370 may not have to be identical to that of the corresponding pixel location in the secondary pixel scan line 372 for each comparison.


Assuming, for illustration, either a noiseless system or that each of the individual symbols used in FIG. 4 for the pixel values is not within such a range of any other, i.e., a match requires that the same symbol be matched for all of the different symbols shown in FIG. 4, then, as can be seen, there is no correlation between the first six pixel locations of the secondary scan line 372 and the primary scan line 370 in the position illustrated in FIG. 4a. There the initial pixel locations of the primary scan line 370 and the secondary scan line 372 are aligned, and there is no match of four out of six or four adjacent pixel locations. The same applies for FIG. 4b, where the secondary scan line initial pixel location s1 is aligned with the primary scan line pixel location p2. In fact, the same situation prevails through each of FIGS. 4c, 4d, 4e and 4f, until FIG. 4g. There, with the secondary scan line 372 pixel location s1 aligned with primary scan line 370 pixel location p8, the first four symbols in the secondary pixel scan line 372, s1-s4, are the same as those at the four adjacent primary scan line 370 pixel locations p8-p11.
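
The FIG. 4 walkthrough amounts to sliding the shorter secondary line across the primary line until a run of matching pixels is found. The following is a sketch under the noiseless assumption stated above, using exact equality; a real system would allow each comparison a tolerance band:

    def sliding_match(primary, secondary, run_required=4):
        """Slide the secondary line across the primary line and return
        the first (zero-based) shift at which run_required consecutive
        pixel values match exactly, or None if no shift correlates.
        The secondary line is allowed to overhang the end of the
        primary line, as in FIG. 4g."""
        for shift in range(len(primary)):
            run = 0
            for i, s in enumerate(secondary):
                if shift + i >= len(primary):
                    break
                run = run + 1 if primary[shift + i] == s else 0
                if run >= run_required:
                    return shift  # the x-lag, in pixel positions
        return None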


Given this cross correlation, the system can determine movement in a particular direction (x and y) depending on whether the primary scan line 370 or the secondary scan line 372 is the current scan line and the other is taken from the respective scan line buffer; the y-lag equates to the difference between the current scan line and the position of the historical scan line in the respective buffer. As an example, an 8 line separation would be a separation of 400 μm and would have a time difference based upon time stamp comparison. Assuming the secondary scan line 372 is the current scan line in the comparison, this would be a y-lag in the +y axis direction. In addition, it can be seen that there is a lag in the +x axis direction of 8 pixel locations. This would also amount to 400 μm, given the pitch of 50 μm noted above for the pixel locations in the primary linear scan array 230 and the secondary linear scan array 250.


While FIG. 4 is illustrative of a correlation process that could be employed and the objective of such correlation, other mathematical approaches to cross correlation are much easier to implement and even more effective, including normalized cross correlation (“NCC”) as discussed in more detail below.


Turning now to FIG. 5, which illustrates schematically and in block diagram form, according to aspects of embodiments of the disclosed subject matter, an architecture of a user input device 400, the user input device 400 can include a navigation system module 402 and further can include a device emulation module 410. The navigation module 402 can include a gesture recognition unit 406 and the motion tracking module 204 of the object imaging and image reconstruction system 200 described above. As seen in FIG. 5, in the motion tracking module 204 of the object imaging and image reconstruction system 200, the host computing device 206 of FIG. 2 consumes sensor rows one scan line at a time, as received from the sensor 220 of FIG. 2, for both the primary scan line 230 and the secondary scan line 250. The motion tracking module 204 can calculate an estimate of the raw finger motion relative to the sensor 220 surface, as is explained in the present application, and accumulate an output event queue 360, which can contain ΔX and ΔY values, which may be time-stamped, along with finger on/off status.


Gesture recognition in a gesture recognition module 406 can be integrated into motion tracking and can consume a motion event list every G milliseconds, while the module 406 attempts to determine if a gesture has occurred. The module 406 may then characterize any gesture(s) it finds and append all gesture events to the output event queue 360. Every N milliseconds, events in the output event queue 360 whose “time has come” can be sent via a device emulation callback 412 to the device emulation module 410, which can consume these motion and gesture events and produce as outputs on-screen behavior 430 through an application or an operating system 420, having an application callback 422, according to the device under emulation. Alternatively, device emulation 410 can create its own output event (i.e., with no queuing being used) and immediately send the output event to the calling program 420.


Motion tracking is at the heart of the navigation task. For gestures and device emulation to work well, good estimates of actual finger motion are required. Motion tracking, as noted, employs correlation, and more specifically in an example of an embodiment of the disclosed subject matter normalized cross-correlation (NCC), to determine when a finger has moved vis-a-vis the primary linear scan array 230 and/or the secondary linear scan array 250. NCC can be the choice because it is very robust to noise and changes in DC level.


As explained in regard to FIG. 3, three sets of data are correlated: the current primary scan line to past secondary scan lines for −y axis motion; the current secondary scan line to past primary scan lines for +y axis motion; and the current primary scan line to past primary scan lines for sideways motion. The atomic datum is one scan line. The algorithm can operate on non-culled scan lines, since culled lines are not sent to the host computing device 206, in order to reduce bandwidth over the bus, although the navigation module 402 could implement its own culling or improve on the culling done by the sensor module 202. Once an input scan line is processed and its movement calculated, the navigation module 402 may be set up not to revisit that movement, e.g., based on results for future scan lines, though doing so is possible and could improve fidelity. When the navigation module 402 is called with a new non-culled row, the simplified processing flow is as shown in FIG. 3.


The correlation algorithm discussed above, being mathematically well known, having no tunable parameters, and consuming over 95% of the processing, can preferably be implemented in hard-wired hardware. Each new scan line that is not culled, i.e., that is delivered to the host computing device 206, must be correlated to older scan lines saved in the circular buffer(s) 210. As an example, 64 rows of history, i.e., 64 past primary scan lines 230 and 64 past secondary scan lines 250 in respective circular buffers 210, can be used, but as little as 48 or even 32 total may be enough, especially with improved culling techniques.


A possible configuration can be to use 48 rows of 4-bit data, i.e., using the middle 128 pixels of a 256 pixel primary linear scan array 230 and the entire 64 pixels of a secondary linear scan array 250. This would require a RAM buffer space equal to 48×(128+64)×0.5=4.6 KB. Having 128 primary pixels with 64 secondary pixels can allow for +/−32 column-wise shifts (lags), whereas an alternative of 112 primary pixels can accommodate +/−24 column-wise shifts. The number of pixels in the secondary scan line 250 is believed to be less susceptible to modification.


With regard to timing considerations, the worst case is that the finger is moving so fast that no lines are culled at that moment, which means that, to operate in real time, the system must complete its tasks within the time of a single scan. This could be, as an example, 250 μsec, but 330 μsec or longer could also be acceptable. Without any optimizations to reduce the correlation set search space, the system will need to correlate three sets of data at each iteration, i.e., the current primary to all (48) stored secondary lines (for −y direction motion), the current secondary to all (48) stored primary lines (for +y direction motion), and the current primary to the 2 or 3 most recently stored primary lines (for purely horizontal motion tracking).


A total of 48+48+2=98 rows thus needs to be correlated in the preferred embodiment. Each row's correlation can use 64 pixels and needs to be computed over a number of different shifts or lags, which is required to track angles above or below 90 degrees (pure vertical); the larger the maximum lag that is allowed for, the wider the deviation from pure vertical that can be successfully tracked. The primary-to-primary correlation can only track pure horizontal motion. A lag range of +/−16 could be used, along with a prediction of the next lag from the lag history, combined with centering the lag range on the predicted value, or even on the last computed value, as a possibility. Alternatively, a wider range may be used, e.g., +/−24, centered on the center of the secondary scan line. Using this number, 24+24+1=49 NCC values would need to be calculated per row. The total number of NCC calculations per new scan line, without the possible optimizations noted above, is therefore 49×98=4802.


The classic equation for a single NCC calculation for a given lag is:

NCC=[E(qRef*qTest)−E(qRef)E(qTest)]/[σ(qRef)*σ(qTest)]  (i)

where qRef is the set of pixels in the secondary scanline, qTest is the shifted set of pixels used in the primary scanline, E(q) is the expected value of the set of pixels in q, and σ(q) is the standard deviation of the set of pixels in q. Because standard deviations require square roots, it can be helpful to use the squared-NCC value to simplify the equation. Other straightforward simplifications can include elimination of most of the divisions. The navigation algorithm equation for a given lag being considered (i.e., a respective shift between the secondary set of pixels with respect to the primary set of pixels) then becomes:

NCC2=[N*Σ(qRef*qTest)−Σ(qRef)*Σ(qTest)]2/([N*Σ(qRef*qRef)−Σ(qRef)*Σ(qRef)]*[N*Σ(qTest*qTest)−Σ(qTest)*Σ(qTest)])   (ii)

where N=64 and all sums are from 1-64.
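
Equation (ii) translates directly into code, since it needs only five running sums and no square roots. A minimal sketch, with hypothetical names:

    def ncc_squared(q_ref, q_test):
        """Squared normalized cross-correlation per equation (ii).

        q_ref is the set of pixels in the secondary scan line and
        q_test the shifted set of pixels in the primary scan line;
        both must have the same length N (64 in the embodiment above).
        """
        n = len(q_ref)
        s_ref = sum(q_ref)
        s_test = sum(q_test)
        s_cross = sum(r * t for r, t in zip(q_ref, q_test))
        s_ref2 = sum(r * r for r in q_ref)
        s_test2 = sum(t * t for t in q_test)

        numerator = (n * s_cross - s_ref * s_test) ** 2
        denominator = ((n * s_ref2 - s_ref * s_ref) *
                       (n * s_test2 - s_test * s_test))
        return numerator / denominator if denominator else 0.0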


Choosing among the three correlation scores for the one most likely to represent true motion, i.e., correlation set choice, according to aspects of an embodiment of the disclosed subject matter, involves three correlation peaks to choose from: primary-to-past-secondaries (CorrUP, i.e., +y axis motion), secondary-to-past-primaries (CorrDOWN, i.e., −y axis motion), and primary-to-past-primaries (CorrSIDE), the preferred embodiment using primary-to-past-primaries for sideways motion because the number of pixel locations is larger and hence more reliable.


The following logic may be used: if CorrUP>CorrDOWN+Margin_updown and CorrUP>CorrSIDE+Margin_side and CorrUP>Thresh_updown, then the finger moved upward (+y direction). In other words, CorrUP must be significantly greater than CorrDOWN and CorrSIDE, and must also be greater than a selected threshold. Similarly, if CorrDOWN>CorrUP+Margin_updown and CorrDOWN>CorrSIDE+Margin_side and CorrDOWN>Thresh_updown, then the finger moved downward (−y axis direction motion). Finally, if CorrSIDE>CorrUP+Margin_side and CorrSIDE>CorrDOWN+Margin_side and CorrSIDE>Thresh_side, then the finger moved sideways.
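
That selection logic might be sketched as follows; the margin and threshold values are illustrative picks from the ranges discussed in the next paragraph, assuming correlation scores scaled 0-100:

    def choose_direction(corr_up, corr_down, corr_side,
                         margin_updown=10, margin_side=-10,
                         thresh_updown=50, thresh_side=50):
        """Correlation set choice: the winning score must beat the
        other two by its margin and also exceed its own threshold."""
        if (corr_up > corr_down + margin_updown and
                corr_up > corr_side + margin_side and
                corr_up > thresh_updown):
            return "up"        # +y axis motion
        if (corr_down > corr_up + margin_updown and
                corr_down > corr_side + margin_side and
                corr_down > thresh_updown):
            return "down"      # -y axis motion
        if (corr_side > corr_up + margin_side and
                corr_side > corr_down + margin_side and
                corr_side > thresh_side):
            return "sideways"
        return None            # no clear winner: declare no motion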


Typically, Margin_updown is greater than zero and, if correlation values are said to range between 0-100, then typically 0<Margin_updown<50. Margin_side is typically less than zero because it has been found that, in general, side-to-side correlations tend to be higher than up-down correlations. This is because the data being correlated comes from the same part of the sensor (e.g., primary to primary), versus up-down, where a primary is correlated against a secondary. So, typically, −25<Margin_side<0, but this could differ with the sensing hardware.


As noted above, NCC is not the only way to measure the similarity of two scan lines; nearly equivalent measures, i.e., mean-squared-error (MSE) and sum-of-absolute-differences (SAD), are possible substitutes. While they may be only slightly less computationally complex than NCC after the optimizations just noted, they can perhaps be easier to implement in hardware. If so, accuracy impact may be the determining factor in the choice.
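
For comparison, a SAD measure takes only a few lines. Note that, unlike NCC, a lower SAD indicates a better match, and SAD is not invariant to changes in DC level, so margins and thresholds would need re-tuning:

    def sad(q_ref, q_test):
        """Sum of absolute differences between two equal-length scan
        line pixel sets; smaller means more similar."""
        return sum(abs(r - t) for r, t in zip(q_ref, q_test))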


Embodiments of the present disclosure can be used to scan objects other than fingers and to create images of such objects other than fingerprint images. The present disclosure can be used to scan other biometric data, such as the palm of a hand or a retina. The present disclosure can also be used to scan virtually any type of object by a swipe scan without having to calculate the velocity of the object as it moves across the swipe scanner.


The present disclosure is described with reference to block diagrams and operational illustrations of methods and devices implementing methods (collectively “block diagrams”). Each block of the block diagram, and combinations of blocks in the block diagram, can be implemented with analog or digital hardware and/or computer program instructions, such as in a computing device. The computer program instructions can be provided to a processor of a general purpose computer, special purpose computer, microcontroller, ASIC, or any other programmable data processing apparatus, so that the instructions implement the functions/acts specified in the block diagram when executed via the computing device. The functions/acts noted in the blocks can occur out of the order noted in the block diagram. For example, two blocks shown in succession can in fact be executed substantially concurrently or the blocks can sometimes be executed in the reverse order, depending upon the functionality/acts involved. In addition, different blocks may be implemented by different computing devices, such as an array of processors within computing devices operating in series or parallel arrangement, and exchanging data and/or sharing common data storage media.


The term “computer readable medium” as used herein and in the claims generally means a non-transitory medium that stores computer programs and/or data in computing device machine-readable form. Computer readable medium can include computer storage media and communication media. Computer storage media can include volatile and non-volatile, removable and non-removable media implemented with any suitable method or technology for the storage of information, such as computer-readable instructions, data structures, program modules, or specific applications.


The term “module” as used herein and in the claims generally means a software, hardware, and/or firmware system, process and/or functionality that can perform or facilitate the processes, features, and/or functions of the present disclosure, with or without human interaction or augmentation. A module can include sub-modules. Software components of a module can be stored on a non-transitory computing device readable medium. Modules can be integral to one or more servers or can be loaded and executed by one or more servers. One or more modules can be grouped into an engine or an application.


While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims
  • 1. A user input apparatus comprising: a two line object imaging sensor having a primary line scan-sensor providing a primary line scan-sensor output and a secondary line scan-sensor providing a secondary line scan-sensor output, each representing, respectively, the pixels in a current primary scan row of pixels and a current secondary scan row of pixels, the primary line scan-sensor and the secondary line scan-sensor each adapted to scan an object; a storage apparatus storing for each scan time each current primary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for a plurality of past primary line scan sensor outputs and storing each current secondary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for a plurality of past secondary line scan sensor outputs; a correlation unit correlating at least one of the current representations of pixels in a primary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past secondary line scan-sensor outputs and the current representations of pixels in a secondary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past primary line scan-sensor outputs; and, the correlation unit providing as an output a motion indicator comprising a direction and amplitude of the motion of the object being scanned in a coordinate system coplanar with the primary and secondary line scan-sensors and aligned to the primary line scan sensor and the secondary line scan-sensor.
  • 2. The user input apparatus of claim 1 wherein the correlation unit correlates the current representations of pixels in the primary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past primary line scan sensor outputs.
  • 3. The user input apparatus of claim 1 wherein the correlation unit correlates the current representations of pixels in the secondary line scan sensor output with stored representations of pixels in a row of pixels for a preceding secondary line scan-sensor output.
  • 4. The user input apparatus of claim 1 wherein the correlation unit output is operatively connected to a user device user interface.
  • 5. The user input apparatus of claim 1 wherein the correlation unit creates a correlation score for each correlation performed by the correlation unit and provides the correlation unit output based upon evaluation of the correlation scores.
  • 6. The user input apparatus of claim 5 wherein the correlation scores include a down correlation score, an up correlation score, and a sideways correlation score.
  • 7. The user input apparatus of claim 1 wherein the motion indicator is computed using a velocity of the object.
  • 8. The user input apparatus of claim 1 wherein the motion indicator is computed without using a velocity of the object.
  • 9. A method of providing a user input, the method comprising: providing a primary line scan-sensor output and a secondary line scan-sensor output, via a two line object imaging sensor having a primary line scan-sensor and a secondary line scan-sensor, each output representing, respectively, the pixels in a current primary scan row of pixels and a current secondary scan row of pixels, utilizing the primary line scan-sensor and the secondary line scan-sensor to scan an object; storing in a storage apparatus, for each scan time, each current primary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for each of a plurality of past primary line scan-sensor outputs and storing in a storage apparatus, for each scan time, each current secondary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for each of a plurality of past secondary line scan-sensor outputs; correlating with a correlating unit at least one of the current representations of pixels in the primary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past secondary line scan-sensor outputs and the current representations of pixels in a secondary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past primary line scan-sensor outputs; and, providing as an output of the correlation unit a motion indicator of a direction and amplitude of the motion of the object being scanned in a coordinate system coplanar with the primary line scan-sensor and secondary line scan-sensor and aligned to the primary line scan-sensor and the secondary line scan-sensor.
  • 10. The method of claim 9 and further comprising correlating with the correlating unit the current representations of pixels in the primary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past primary line scan-sensor outputs.
  • 11. The method of claim 9 and further comprising correlating with the correlation unit the current representations of pixels in the secondary line scan sensor output with stored representations of pixels in a row of pixels for a preceding secondary line scan-sensor output.
  • 12. The method of claim 9 further comprising providing to a user device user interface the correlation unit output.
  • 13. The method of claim 9 and further comprising creating via the correlation unit a correlation score for each correlation performed by the correlation unit and providing the correlation unit output based upon evaluation of the correlation scores.
  • 14. The method of claim 13 wherein the correlation scores comprise a down correlation score, an up correlation score, and a sideways correlation score.
  • 15. The method of claim 9 wherein the motion indicator is computed using a velocity of the object.
  • 16. The method of claim 9 wherein the motion indicator is computed without using a velocity of the object.
  • 17. A user input apparatus comprising: a two line object imaging sensor having a primary line scan-sensor providing a primary line scan-sensor output and a secondary line scan-sensor providing a secondary line scan-sensor output, each representing, respectively, the pixels in a current primary line scan row of pixels and a current secondary line scan row of pixels, the primary line scan-sensor and the secondary line scan-sensor each adapted to scan an object; a storage apparatus storing for each scan time each current primary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for a plurality of past primary line scan-sensor outputs and storing each current secondary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for a plurality of past secondary line scan-sensor outputs; a correlation unit correlating at least one of the current representations of pixels in a primary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past secondary line scan-sensor outputs and the current representations of pixels in a secondary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past primary line scan-sensor outputs; the correlation unit providing as a correlation unit output a motion indicator comprising a direction and amplitude of motion of the object being scanned in a coordinate system coplanar with the primary line scan-sensor and the secondary line scan-sensor and aligned to the primary line scan-sensor and the secondary line scan-sensor; wherein the correlation unit creates a correlation score for each correlation performed by the correlation unit and provides the correlation unit output based upon evaluation of the correlation scores; and wherein the correlation scores include a down correlation score, an up correlation score, and a sideways correlation score.
  • 18. A method of providing a user input, the method comprising: providing a primary line scan-sensor output and a secondary line scan-sensor output, via a two line object imaging sensor having a primary line scan-sensor and a secondary line scan-sensor, each output representing, respectively, the pixels in a current primary scan row of pixels and a current secondary scan row of pixels, the primary line scan-sensor and the secondary line scan-sensor scanning an object; storing in a storage apparatus, for each scan time, each current primary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for each of a plurality of past primary line scan-sensor outputs and storing in a storage apparatus, for each scan time, each current secondary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for each of a plurality of past secondary line scan-sensor outputs; correlating with a correlating unit at least one of the current representations of pixels in the primary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past secondary line scan-sensor outputs and the current representations of pixels in the secondary line scan-sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past primary line scan-sensor outputs; providing as an output of the correlation unit a motion indicator of a direction and amplitude of the motion of the object being scanned in a coordinate system coplanar with the primary line scan-sensor and secondary line scan-sensor and aligned to the primary line scan sensor and the secondary line scan-sensor; creating via the correlation unit a correlation score for each correlation performed by the correlation unit and providing the correlation unit output based upon evaluation of the correlation scores; and wherein the correlation scores comprise a down correlation score, an up correlation score, and a sideways correlation score.
  • 19. A non-transitory tangible machine readable medium storing instructions that, when executed by a computing device, cause the computing device to perform a method of providing user input, the method comprising: receiving a primary line scan-sensor output and a secondary line scan-sensor output each from a two line object imaging sensor having a primary line scan-sensor and a secondary line scan-sensor, representing, respectively, the pixels in a current primary scan row of pixels and a current secondary scan row of pixels, the primary line scan-sensor and the secondary line scan-sensor scanning an object; storing, for each scan time, each current primary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for each of a plurality of past primary line scan sensor outputs and storing, for each scan time, each current secondary line scan-sensor output to maintain a plurality of stored representations of pixels in a row of pixels for each of a plurality of past secondary line scan sensor outputs; correlating at least one of the current representations of pixels in the primary line scan sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past secondary line scan sensor outputs and the current representations of pixels in the secondary line scan sensor output with stored representations of pixels in a row of pixels for respective ones of the plurality of past primary line scan sensor outputs; providing a motion indicator of a direction and amplitude of the motion of the object being scanned in a coordinate system coplanar with the primary line scan-sensor and secondary line scan-sensor and aligned to the primary line scan sensor and the secondary line scan-sensor; creating a correlation score for each correlation performed and providing the motion indicator based upon evaluation of the correlation scores; and wherein the correlation scores comprise a down correlation score, an up correlation score, and a sideways correlation score.
Related Publications (1)
Number: 20120189166 A1; Date: Jul. 2012; Country: US