This invention relates to the field of biometric sensors. In particular, this invention relates to systems and methods that use fingerprint images to emulate electronic positioning devices.
The emergence of portable electronic computing platforms allows functions and services to be enjoyed wherever necessary. Palmtop computers, personal digital assistants, mobile phones, portable game consoles, biometric/health monitors, remote controls, and digital cameras, to name a few, are daily-life examples of portable electronic computing platforms. The desire for portability has driven these computing platforms to become ever smaller, even as users expect long battery life. A dilemma arises when these ever-smaller devices must still collect user input efficiently.
Portable electronic computing platforms need these user inputs for multiple purposes, including (a) navigating a cursor or a pointer to a certain location on a display, (b) selecting (e.g., choosing or not choosing) an item or an action, and (c) orientating (e.g., changing direction with or without visual feedback) an input device.
User-input concepts have been borrowed from much larger personal computers. Micro joysticks, navigation bars, scroll wheels, touch pads, steering wheels, and buttons have all been adopted, with limited success, in conventional devices. All of these positioning devices consume substantial amounts of valuable surface real estate on a portable device. Mechanical positioning devices such as joysticks, navigation bars, and scroll wheels can wear out and become unreliable. Their sizes and required movements often preclude optimal ergonomic placement on portable computing platforms.
Prior art methods calculate rotation by rotating one frame with respect to another and then applying standard correlation methods. These methods require the selection of a pivot point (e.g., the origin), followed by additional computations. These computations are not helpful for determining linear motion (e.g., non-rotational movement in the x- and y-directions). Such a shortcoming makes prior art systems even more inefficient when used in portable devices, in which both rotational and linear movement are required, such as when emulating, respectively, a steering wheel and a pointing device.
The present invention discloses a system for and method of obtaining rotation information from a patterned image, such as a fingerprint image. Embodiments of the present invention thus require smaller footprints than those that use joysticks, steering wheels, and other, larger devices that require additional power. Embodiments of the present invention use linear correlation methods that are easier to use than rotational and other methods such as those using trigonometric functions. Embodiments of the present invention thus use simpler algorithms that can be performed faster and more reliably.
In a first aspect of the present invention, a method of obtaining rotation information comprises capturing a plurality of patterned images from a plurality of locations, correlating the plurality of patterned images to generate sets of linear differences, and using the sets of linear differences to generate the rotation information. The plurality of locations comprise a first part of a sensor and a second part of the sensor. A first of the plurality of patterned images is captured in the first part of the sensor and a second of the plurality of patterned images is captured in the second part of the sensor.
Preferably, the sensor is a biometric image sensor, such as a finger image sensor. The first of the plurality of patterned images and the second of the plurality of patterned images together correspond to a fingerprint image in a first position on the sensor. A third of the plurality of patterned images is captured in the first part of the sensor and a fourth of the plurality of patterned images is captured in the second part of the sensor. The third of the plurality of patterned images and the fourth of the plurality of patterned images together correspond to the fingerprint image in a second position on the sensor. In one embodiment, the rotation information corresponds to an angular difference between the first position and the second position.
In one embodiment, correlating the plurality of patterned images comprises correlating the first patterned image with the third patterned image to generate a first set of linear differences from the sets of linear differences, correlating the second patterned image with the fourth patterned image to generate a second set of linear differences from the sets of linear differences, and correlating a first combination of the first patterned image and the second patterned image with a second combination of the third patterned image and the fourth patterned image to generate a third set of linear differences from the sets of linear differences. Correlating the first patterned image with the third patterned image, correlating the second patterned image with the fourth patterned image, and correlating the first combination with the second combination all comprise performing a cross correlation. In one embodiment, the cross correlation is either a normalized cross correlation or a standardized cross correlation.
In one embodiment, the first part of the sensor and the second part of the sensor are contiguous. Alternatively, the first part of the sensor and the second part of the sensor are not contiguous.
In one embodiment, the first part of the sensor comprises a first sub-frame of pixels and the second part of the sensor comprises a second sub-frame of pixels. In this embodiment, capturing the first patterned image comprises storing in the first sub-frame first data corresponding to the first patterned image, capturing the second patterned image comprises storing in the second sub-frame second data corresponding to the second patterned image, capturing the third patterned image comprises storing in the first sub-frame third data corresponding to the third patterned image, and capturing the fourth patterned image comprises storing in the second sub-frame fourth data corresponding to the fourth patterned image. Correlating the first patterned image with the third patterned image comprises correlating the first data with the third data to generate first and second linear differences from the first set of linear differences. Correlating the second patterned image with the fourth patterned image comprises correlating the second data with the fourth data to generate first and second linear differences from the second set of linear differences. And correlating the first combination with the second combination comprises correlating a combination of the first data and the second data with a combination of the third data and the fourth data to generate first and second linear differences from the third set of linear differences.
In another embodiment, correlating comprises determining a lag to correlate elements of one of the first and second sub-frames, the lag and a difference between the elements corresponding to first and second linear differences from one of the sets of linear differences. Each element corresponds to a row of one of the first and second sub-frames. Alternatively, each element corresponds to a column of one of the first and second sub-frames.
In another embodiment, the method further comprises filtering the first set of linear differences, the second set of linear differences, the third set of linear differences, and the rotation information. Filtering comprises multiplying by a scaling factor, performing a smoothing function, and performing a clipping function.
Preferably, the finger image sensor is a finger swipe sensor. Alternatively, the finger image sensor is a finger placement sensor.
In another embodiment, the method further comprises using the rotation information on a host platform having a display screen, the rotation information used to rotate an object on the display screen, thereby emulating a computer input device. The computer input device is selected from the group consisting of a steering wheel, a joystick, and a navigation bar. Emulating a computer input device comprises moving the object on the display screen at a rate related to the angular difference or the angular position.
In accordance with a second aspect of the invention, a system for obtaining rotation information comprises means for capturing a plurality of patterned images from a plurality of locations and means for correlating the plurality of patterned images to generate sets of linear differences and for using the sets of linear differences to generate the rotation information.
In accordance with a third aspect of the present invention, a method of emulating a rotational device using a pattern comprises capturing a first image of the pattern at a first orientation, capturing a second image of the pattern at a second orientation, correlating the first image with the second image to calculate linear differences between the first orientation and the second orientation, translating the linear difference into rotational data, and using the rotational data to emulate the movement of a rotational device.
In accordance with a fourth aspect of the present invention, a system for emulating a positional device comprises a sensor for capturing an image of a pattern and a processor coupled to the sensor. The processor is configured to calculate linear differences between a first position of the image of the pattern and a second position of the image of the pattern and to translate the linear differences into rotational data corresponding to a rotation of the image of the pattern.
In accordance with a fifth aspect of the present invention, a method of sensing rotation of an object on an image sensor comprises sensing a first image of the object, sensing a second image of the object, and comparing the first image with the second image to determine whether there is linear motion in each of at least two portions of an area containing the first image and the second image to determine whether the object remained stationary, moved in a linear manner, or rotated.
The present invention is directed to systems for and methods of determining the rotational position and movement of an arbitrary patterned material imaged by an imaging sensor. Preferably, the arbitrary patterned material is a finger and the rotational position and movement of the image of the finger are determined.
Embodiments of the present invention advantageously determine and collect finger rotational information for use in a digital device and most preferably in personal computing devices. Unlike prior art rotational position correlators, which are non-linear, requiring trigonometric functions like sine, cosine, and tangent calculations, embodiments of the present invention use a linear correlation method that is easier to implement and more computationally efficient. Embodiments of the present invention allow for extremely efficient calculation of linear motion from the components used to determine the rotational motion, thereby reducing the complexity of systems that require one sensor to be used to gather both linear and rotational movement inputs.
A system in accordance with embodiments of the present invention reconstructs fingerprint images from swipe sensors, thereby efficiently providing rotational motion data along with data necessary to reconstruct the image. The system is particularly well suited for applications that do not require high precision rotational information. Methods of and systems for fingerprint sensing are described in detail in the U.S. patent application Ser. No. 10/194,994, filed Jul. 12, 2002, and titled “Method and System for Biometric Image Assembly from Multiple Partial Biometric Frame Scans,” and in the U.S. patent application Ser. No. 10/099,558, filed Mar. 13, 2002, and titled “Fingerprint Biometric Capture Device and Method with Integrated On-Chip Data Buffering,” both of which are hereby incorporated by reference in their entireties. In the preferred embodiment, the fingerprint sensor is an Atrua Wings ATW 100 capacitive swipe sensor by Atrua Technologies, Inc., at 1696 Dell Avenue, Campbell, California 95008.
A key aspect of the present invention is determining rotation from linear correlation, rather than by the prior art methods, which determine rotation by rotating one frame with respect to another and then applying standard correlation methods. The prior art methods require choosing a pivot point (center of origin) and then performing additional computation. Furthermore, such computation is not helpful for determining linear motion (non-rotational movement in the x- and y-directions). These computations are even more inefficient in portable electronic devices where it may be important to calculate both kinds of movement, for instance, when emulating a pointing device and a steering wheel on one component.
Most of the prior art concentrates on calculating exact rotational movement and therefore requires the more precise steps outlined above. Many applications, however, do not require such precision, and it is in these cases that the present invention is best suited. Embodiments of the present invention make use of the fact that, as a finger rotates clockwise, the left side of the image of the finger will appear to move upward, while the right half will appear to move downward. This is sometimes referred to as shear. The opposite is true of counterclockwise motion. Furthermore, in both cases, the left side will appear to move toward the right, and the right side will appear to move toward the left, as shown in
Using the observations outlined above, embodiments of the present invention use simpler linear correlation methods to determine rotational movement, which occurs when the motion of the left side of a finger image is in a direction opposite to that of the right side. For instance, if the left half is moving upward and the right half downward, there is clockwise rotational movement as shown in
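The shear observation above can be sketched in a few lines. This is an illustrative sketch, not the patented algorithm: the function name, the sign convention (positive means upward), and the noise threshold are all assumptions.

```python
def classify_motion(dy_left, dy_right, threshold=1):
    """Classify frame-to-frame motion from the vertical movement of the
    left and right halves of a finger image.  Positive dy is taken to
    mean upward motion; threshold suppresses sensor noise (assumed)."""
    if abs(dy_left) < threshold and abs(dy_right) < threshold:
        return "stationary"
    # Opposite vertical motion of the two halves indicates shear, i.e. rotation.
    if dy_left > 0 and dy_right < 0:
        return "clockwise"        # left half up, right half down
    if dy_left < 0 and dy_right > 0:
        return "counterclockwise"
    return "linear"               # both halves move the same way
```

Note that no trigonometry is required: the sign relationship between the two half-frame motions alone distinguishes rotation from linear movement.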
Accordingly, the present invention offers a reliable and computationally-efficient solution to obtain high-resolution rotational information about a user's finger as it contacts a finger imaging sensor, so that such a sensor can, for example, emulate a steering wheel for use in gaming, or rotate the image of a map for easier viewing on a display.
It will be appreciated that Δα can correspond to Δβ in any number of ways. For example, Δα can equal Δβ, Δα can be a multiple of Δβ, Δα can be a fraction of Δβ, Δα can be a multiple of Δβ plus some offset, etc. It will also be appreciated that in accordance with one embodiment, the finger 30 does not have to maintain a pivot point on the finger image sensor 25. The finger 30 can be moved horizontally or vertically on the finger image sensor 25 before, while, or after it is rotated so that the line segment 36 is displaced horizontally, or vertically, or both, and the angle Δβ still determined. It will also be appreciated that vertical and horizontal movements of the finger 30 can also be captured to vertically and horizontally displace the triangular image 15 on the display device 11. It will also be appreciated that the triangular image 15 can be moved at a rate related to Δβ (called the rate mode) or at a rate related to β1.
While the finger image sensor 25 is depicted as a placement sensor, it will be appreciated that other types of sensors can be used in accordance with the present invention. Preferably, the finger image sensor 25 is a swipe sensor, described in more detail below.
Many prior art electronic finger imaging sensors actively sense the entire surface of a fingerprint at the same time. Whether based on optical or electrical sensing methods, these sensors have a surface area at least as large as a typical person's fingertip pad (typically 15 mm×15 mm). Using these sensors, the user simply places his finger on the sensor until the image is captured. These sensors, now known as placement sensors, contain rows and columns of pixels and can capture large images, typically ranging from 250-500 rows and 200-500 columns, depending on the sensor's capabilities and size. Such devices are capable of sensing rotational input, and can indeed be used with the present invention to collect rotational information, but they are larger than today's more miniaturized finger sensors.
The most promising of the miniaturized sensors is one that is fully sized in one direction (typically width) but abbreviated in the other (typically height). This results in a sensor that is capable of sensing only a small rectangular portion of the fingerprint at any one time.
Such smaller sensors are better suited for use with the present invention, not only because they are more apropos for portable devices, but also because they produce smaller images. The smaller images have less data in them, making the computations less intense. While it is possible to ignore or mask off data from a larger sensor to make it resemble a smaller one, such an approach is not ideal, because it does not guarantee that the finger of the user is even touching the area of interest. With a swipe sensor, this is not an issue.
Embodiments of the present invention can acquire rotational position data from any device capable of imaging the surface of a human finger or other patterned material and is therefore not limited to use with placement or swipe finger image sensors, which typically provide at least 250 dots per inch resolution. The present invention will also work with lower or higher resolution devices that may become available in the future.
Next, in the step 213, a frame is read by the sensor at a rate supported by it or by the hardware platform's computing power and bandwidth. In the step 215, the properties of the frame are estimated to determine whether it is useful. The metrics of the frame are analyzed in the step 220, to determine whether the frame is useful. If the frame is useful, it is kept and processing continues in the step 225; otherwise, the frame is disregarded, and processing continues in the step 255. As described in more detail below, in a preferred embodiment the usefulness of a frame is determined by measuring image statistics such as the average value and the variance of pixel data in the frame. The usefulness of a frame is directly related to whether or not a finger is present on the sensor. It will be appreciated that the step 215 can be eliminated if a less efficient implementation is acceptable, or when the sensor only generates frames when a finger is present on it.
In the step 225, the current frame (e.g., the frame most recently read and currently being processed) is correlated with the last stored useful frame. On the very first iteration, since there is no “last useful frame,” the current frame is copied to the last useful frame. In a preferred embodiment, the frame is divided into a left half and a right half. It will be appreciated, however, that a frame can be divided into any number of parts. Next, in the step 230, the linear movement of the left half of the frame and the linear movement of the right half of the frame are both calculated. In the step 235, using the linear movement of the left half of the frame and the linear movement of the right half of the frame, the overall linear movement of the frame is calculated. This calculation is described in more detail below. In the step 240, the calculations made in the step 235 are used to calculate the rotational movement of the fingerprint image.
Next, in the step 245 the process checks whether there was any movement, linear or rotational, of the fingerprint image. If there was movement, the process continues in the step 250, otherwise it continues in the step 255. In the step 250, the last useful frame is updated, and in the step 251, the last useful frame is stored. Processing then continues in the step 225.
In the step 255, the process checks whether more frames are to be acquired. If more frames are to be acquired, the process continues to the step 260, where a counter is incremented, and then continues on to the step 213. If no more frames are to be acquired, the process continues to the step 265, where it stops.
As described above, the pixels for the current frame are correlated to the pixels of the last useful frame to determine the amount of rotational or linear motion. If overall linear movement has been calculated in the step 235, data corresponding to the movement are sent to whatever downstream process needs it. For example, a program (e.g., an application program, a device driver, or an operating system) can use the corresponding data to linearly position a pointer on a display screen. If any overall rotational movement has been calculated in the step 240, data corresponding to the movement are sent to whatever downstream process needs it. For example, a program can use the corresponding data to rotate an image on the display screen. Once it is determined that movement has occurred, the last useful frame is replaced by the current frame and the algorithm continues by acquiring new image data from the sensor.
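The frame-processing loop of steps 213 through 255 can be summarized in skeleton form. This is a sketch only: `is_useful` and `correlate` are hypothetical callables standing in for the metric test (steps 215-220) and the correlation and motion calculations (steps 225-240); the real implementation iterates against sensor hardware.

```python
def track_motion(frames, is_useful, correlate):
    """Skeleton of the frame-processing loop described above.

    correlate(last_frame, current_frame) is assumed to return the tuple
    (dx, dy, dtheta) of overall linear and rotational motion."""
    last_useful = None
    events = []
    for frame in frames:                  # step 213: read a frame
        if not is_useful(frame):
            continue                      # step 220: discard noise frames
        if last_useful is None:
            last_useful = frame           # first iteration: nothing to correlate
            continue
        dx, dy, dtheta = correlate(last_useful, frame)  # steps 225-240
        if dx or dy or dtheta:            # step 245: any movement?
            events.append((dx, dy, dtheta))
            last_useful = frame           # steps 250-251: replace stored frame
    return events
```

The key detail is that the stored frame is replaced only when movement is detected, so a stationary finger is always compared against the same reference frame.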
In a preferred embodiment, the system iterates in real time. Alternatively, the system stores all the frames in a memory buffer and calculates movement after multiple frames are acquired from the sensor. Preferably, the iteration halts when either the application or operating system tells it to stop. When the system is used as a pointing device for an operating system, the process can continue indefinitely.
The algorithm starts whenever there is a need for rotational feedback, such as power up of a device or start of a game. The algorithm terminates when rotational information is no longer needed.
In a preferred embodiment, the system executes on a computing device that is connected to a swipe fingerprint sensor 310 shown in
Typically, swipe sensors are capable of delivering anywhere from 250 to 3000 frames per second (the “frame rate”), depending on the sensor's capabilities, the interface used and the speed of the host personal computer.
It is possible, in alternative embodiments, to use the system, with little or no modification, with other similar types of sensors, such as optical document scanners. In one embodiment, the system of the present invention executes in specialized hardware or firmware instead of in software. In another embodiment, portions of the algorithm execute on a general-purpose CPU and other portions execute solely in hardware.
It will be appreciated that signs (positive or negative) assigned to a particular direction in an x-direction and a y-direction are arbitrarily chosen for the sake of explanation. The signs can be reversed.
The algorithm 210 in
In the preferred embodiment, the usefulness of a frame is determined by ensuring the frame contains at least some finger image information. For instance, if a frame is collected when no finger is on the device, that frame likely will contain only noise or a blank image. This is done using rules based on measuring image statistics of the frame, namely the average value and the variance. Some sensors provide information on finger presence, and that can be used in systems where it is available, either by itself or in conjunction with the above statistics.
Mathematically, if the pixel in the nth row and mth column is given by frame[n,m], then:
For the purposes of efficiency, the Metrics (calculated in the step 215) can also be calculated just using portions of the frame (rather than the entire frame), where the portions are arbitrary sub-images or are obtained by skipping every pth pixel in the calculation.
The frame is considered noise, and thereby disregarded, if:
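A minimal sketch of this usefulness test follows. The mean and variance thresholds shown are illustrative assumptions; real values depend on the sensor's pixel range and noise characteristics, and a sensor-provided finger-presence flag could replace or supplement this test, as noted above.

```python
def frame_is_useful(frame, mean_lo=20, mean_hi=235, var_min=50.0):
    """Decide whether a frame contains finger information by checking its
    average pixel value and variance (thresholds are assumed examples)."""
    pixels = [p for row in frame for p in row]
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    # A blank or saturated frame has an extreme mean; a frame containing
    # only uniform background has very little variance.
    return mean_lo <= mean <= mean_hi and var >= var_min
```

As suggested in the text, the same statistics can be computed over a sub-image or over every pth pixel for efficiency.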
Once the current frame has been found useful, it is next correlated to the last useful frame (stored in the step 251) to determine the finger movement, if any, that occurred. Once it is determined that finger movement has occurred, the last useful frame (step 251) is replaced by the current frame and the algorithm continues by acquiring a new frame from the sensor.
In accordance with the present invention, a new frame (“cF”) is correlated with an older one stored in memory (“oF”). Correlation is well known to any person skilled in the art, but it is described in more detail here to better explain the present invention.
Standard cross-correlation SCC of row R of the last useful frame with row S of the current frame is mathematically expressed as:
where d is referred to as the “lag” or “offset.” The lag is related to the horizontal movement of the data in one frame with respect to the data in another, and can also be thought of as a velocity. Typically, the lag satisfies −L ≤ d ≤ +L, where L is much less than M. All of the equations in this description are written assuming d ≥ 0 to keep the equations clear. It will be appreciated, however, that negative lag values can be processed by interchanging the column indices on oF and cF as shown below:
where |d| is the absolute value of d.
This interchange method is valid for all correlation equations in this document, not just SCC but also normalized cross-correlation NCC, discussed below.
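A sketch of the standard cross-correlation at a given lag, including the index interchange for negative lags, might look as follows. The overlap convention (summing products over the M − |d| overlapping columns) is a generic one and may differ in detail from the patent's Equation 1.

```python
def scc(oF_row, cF_row, d):
    """Standard cross-correlation of one row of the last useful frame
    (oF_row) with one row of the current frame (cF_row) at lag d.
    Negative lags are handled by interchanging the two rows, as
    described in the text."""
    if d < 0:
        # interchange the column indices on oF and cF for negative lag
        oF_row, cF_row, d = cF_row, oF_row, -d
    M = len(oF_row)
    # correlate the overlapping region of length M - d
    return sum(oF_row[m + d] * cF_row[m] for m in range(M - d))
```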
Though it is feasible to use standard correlation, the preferred embodiment uses a slightly modified version of the correlation called Normalized Cross Correlation NCC, defined in Equation 2 below, which is better suited to image registration tasks like fingerprint image reconstruction. Unlike standard correlation, NCC is invariant to changes in image intensity, has a range that is independent of the number of pixels used in the calculation, and is more accurate because it is less dependent on local properties of the image frames being correlated.
is the sum along the row R from column d+1 through column M−d, and
is the sum along row S from column 1 through column M−2d.
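For illustration, a textbook normalized cross-correlation at a non-negative lag can be sketched as below. This is a generic form; the patent's Equation 2 may normalize over slightly different column ranges, but the intensity-invariance property described above holds for both.

```python
import math

def ncc(oF_row, cF_row, d):
    """Generic normalized cross-correlation at lag d >= 0 (a sketch).
    Subtracting each row's mean and dividing by the product of the
    standard-deviation terms makes the result invariant to changes in
    image intensity and bounds it to [-1, 1]."""
    a = oF_row[d:]                # overlapping part of the old row
    b = cF_row[:len(a)]           # overlapping part of the new row
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0
```

Adding a constant offset to every pixel of one row leaves the result unchanged, which is the invariance that makes NCC preferable for this task.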
The above equations are in terms of rows of each frame, but it is more general to think of “patches” of each frame, where a patch can be some subset or superset of a row. While the patches of each frame to be correlated can be any arbitrary set of pixels, the preferred embodiment uses a patch centered in the middle of each frame. While any row or subset of a row could be used, if the patch is too small, the statistical significance of the cross-correlation value will erode.
Since the lag, or offset, of the information in the current frame to that in the last frame corresponds to an unknown amount of movement in the x-direction, NCC_whole(R,S,d) must typically be calculated for multiple values of d to find the one that corresponds to the best fit. Therefore, in one embodiment:
PeakNCCwhole(R,S,L) = MAX{NCC_whole(R,S,d)} for d = −L to d = L. [Equation 3]
dpeakwhole(R,S,L)=the value of d at which the above equation is satisfied.
In the preferred embodiment, L=8, but L should be chosen so that it is as large as the maximum x-velocity that can occur from frame to frame. A smaller L is more computationally efficient, but will produce inaccurate results if the finger shifts more than ±L from one frame to the next. In an alternative embodiment, L can be a function of the dpeakwhole from the previous iteration i−1. For example,
L(at iteration i) = dpeakwhole(at iteration i−1) + e, where e is typically equal to 1 or 2.
In yet another embodiment, L can be a function of the row number in the frame (i.e. R and/or S). Also note that it is possible to use scaled versions of the NCC equations so that floating-point operations can be avoided, and that for computing purposes it is also possible to use NCC-squared to avoid an expensive square-root operation.
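The peak search of Equation 3 and the adaptive window of the alternative embodiment can be sketched as follows. Here `corr_fn` is any correlation function accepting a lag (such as a standard or normalized cross-correlation that handles negative lags), and the margin `e` and cap `L_max` values are illustrative assumptions.

```python
def peak_corr(oF_row, cF_row, L, corr_fn):
    """Search lags d = -L .. +L for the best correlation (Equation 3 in
    spirit).  Returns (peak value, dpeak), where dpeak is the lag at
    which the maximum is attained."""
    best_d, best_v = 0, float("-inf")
    for d in range(-L, L + 1):
        v = corr_fn(oF_row, cF_row, d)
        if v > best_v:
            best_v, best_d = v, d
    return best_v, best_d

def adaptive_L(prev_dpeak, e=2, L_max=8):
    """Alternative embodiment described above: bound the search window by
    the previous iteration's peak lag plus a small margin e (values assumed)."""
    return min(abs(prev_dpeak) + e, L_max)
```

A smaller window is cheaper but, as noted, fails if the finger shifts more than ±L columns between frames, so the window must cover the maximum expected x-velocity.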
The PeakNCCwhole corresponds to the correlation coefficient of the best fit, while dpeakwhole corresponds to the amount of movement in the x direction. A method to obtain the amount of motion in the y direction is described below.
The NCC calculation in Equation 2 can be restated, for last frame oF and current frame cF,
where the numerator and denominator have both been multiplied by (M−2d)2 to make it simpler to compute and understand.
This can be separated into left and right halves of each row as:
NCC_whole(R,S,d) = (A − B)/(C^(1/2) × D)
is the sum along row R from column M/2+1 through column M−d,
is the sum along row R from column d+1 through column M/2,
is the sum along row S from column M/2−d+1 through column M−2d, and
is the sum along row S from column 1 through column M/2−d.
Furthermore, the NCC for the left and right halves of each row can be determined using:
In addition, the PeakNCC for the left and right sides is given by:
PeakNCCleft(R,S,L) = MAX{NCC_left(R,S,d)} for d = −L to d = L. [Equation 6a]
dpeakleft(R,S,L)=the value of d at which the above equation is satisfied.
PeakNCCright(R,S,L) = MAX{NCC_right(R,S,d)} for d = −L to d = L. [Equation 6b]
dpeakright(R,S,L)=the value of d at which the above equation is satisfied.
These equations allow the left and right sides of the sensor array to be treated separately, and efficiently determine the rotational movement as described below.
Once all the NCC terms for the left and right sides are calculated in Equations 5a and 5b, only a few addition and division operations are required to calculate NCC_whole for the entire sensing array using Equation 4. Then, overall linear motion can be calculated using Equation 3 as before, while rotational movement is calculated using the linear motion for the left and right sides of the sensing array.
Using PeakNCC(R,S,L) defined above in Equation 3 or Equations 6a and 6b, the calculation of exact x and y motion is straightforward. The last useful frame at iteration i has rows numbered 1 through N, as shown in
For a given row R in the last frame, PeakNCC and dpeak are calculated as in Equations 3, 6a, and 6b with respect to rows 1 through N of the current frame, and the dpeak that corresponds to the maximum PeakNCC is taken. Preferably, this is done for two values of R: R = 1 and R = N. In this way, both upward and downward motion in the y direction can be determined while maximizing the speed at which a user can move his finger. It is also possible to choose only one R, at R = N/2 (or very near the middle row), but that is sub-optimal. It is also possible to choose values other than 1 or N, such as R = 2 and R = N−1, which may be advantageous for accuracy since those rows are not on the edge of the sensor array.
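The row search just described can be sketched as below. This is an illustrative summary, not the pseudo-code of Table 1: `peak_fn(row_a, row_b, L)` is a hypothetical callable returning `(PeakNCC, dpeak)` for one row pair, and rows are indexed from 0 rather than 1.

```python
def frame_motion(oF, cF, L, peak_fn):
    """For the boundary rows R of the last frame, find the current-frame
    row S with the maximum peak correlation.  Returns
    (MaxPeakNCC, dpeakMax, bestR, bestS); bestS - bestR then gives the
    y motion and dpeakMax the x motion."""
    N = len(oF)
    best = (float("-inf"), 0, 0, 0)
    for R in (0, N - 1):              # R = 1 and R = N in 1-based terms
        for S in range(N):
            v, d = peak_fn(oF[R], cF[S], L)
            if v > best[0]:
                best = (v, d, R, S)
    return best
```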
Table 1 shows the pseudo-code for performing a single frame iteration for a given value of R. Although the calculations are carried out separately for the left side, right side, and whole row, only the generic case is described by the pseudo-code in Table 1.
The pseudo-code in Table 1 can be summarized as:
MaxPeakNCC = NCC_whole(bestR, bestS, dpeak(bestR, bestS, L)) [Equation 6c]
dpeakMax = dpeak(bestR, bestS, L) [Equation 6d]
Thus, after the above calculations, the following information is obtained:
Typically, MaxPeakNCC will be close to 1.0 (the closer to 1.0, the stronger the correlation), but if the finger being analyzed is moved too quickly, it is possible that the current frame does not have any rows in common with the last frame (i.e. a non-overlapping case). Therefore, MaxPeakNCC must be checked to ensure that it is large enough.
Using the above information, the following calculations are performed:
In any case, after the x and y motion for the left side, right side, and whole rows have been calculated for the current iteration i (denoted by Δxleft(i) and Δyleft(i); Δxright(i) and Δyright(i); and Δxwhole(i) and Δywhole(i), respectively), the rotational movement Δtheta(i) can be determined. The Δxwhole(i) and Δywhole(i) are made available to the host requiring the rotational information and represent the overall linear x- and y-motion.
Table 2 shows the pseudo code for determining rotational movement. The pseudo code continues iterating until told to stop by the application program or operating system using the rotational data.
It will be appreciated that there are alternative ways to compute the rotational delta, including arbitrary functions of Δyleft(i) and Δyright(i). For alternative mountings of the sensor, in which the x and y directions are transposed, Δxleft(i) and Δxright(i) are used in the pseudo-code in Table 2. On full-size placement sensors, more accuracy can be achieved by using Δxleft(i) and Δxright(i) together with Δyleft(i) and Δyright(i). In one embodiment, this is done by computing Δtheta(i) via the pseudo-code in Table 2 once using the Δx values and again using the Δy values; the two resulting estimates are then averaged or otherwise combined to form the final Δtheta(i). The Δtheta(i) values are made available to the host application and/or operating system.
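One such "arbitrary function" of the left and right deltas can be sketched as below. This is an assumed small-angle estimate, not the exact formula of Table 2 (which is not reproduced in this text); `half_width`, the pixel distance between the half-row centers, is an assumed parameter.

```python
import math

def delta_theta(d_left, d_right, half_width):
    """Illustrative rotational estimate: rotation moves the left and
    right halves of a row in opposite directions, so their difference
    over the distance between the half-row centers approximates the
    rotation angle (small-angle approximation). Assumed form only."""
    return math.atan2(d_right - d_left, 2.0 * half_width)

def combined_theta(dx_left, dx_right, dy_left, dy_right, half_width):
    """On a full-size placement sensor, average the Δx-based and
    Δy-based estimates, as suggested in the text."""
    return 0.5 * (delta_theta(dx_left, dx_right, half_width)
                  + delta_theta(dy_left, dy_right, half_width))
```

A pure translation (equal left and right deltas) yields zero rotation, while opposite-signed deltas yield a nonzero Δtheta(i).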
In other embodiments, the standard cross-correlation (SCC) given in Equation 1 is used instead of normalized cross-correlation. It is straightforward to split Equation 1 into terms from the left and right sides of each row.
In this case, which is much simpler than the NCC case in the preferred embodiment, the SCC value for the entire row is simply the sum of the correlation values of each half.
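The additivity of the SCC terms can be demonstrated with a short sketch (the row values here are arbitrary illustration data):

```python
def scc(a, b):
    """Standard (unnormalized) cross-correlation of two equal-length
    row segments at a fixed lag: a simple sum of products."""
    return sum(x * y for x, y in zip(a, b))

row_last = [3, 1, 4, 1, 5, 9, 2, 6]
row_cur  = [2, 7, 1, 8, 2, 8, 1, 8]
half = len(row_last) // 2

left  = scc(row_last[:half], row_cur[:half])
right = scc(row_last[half:], row_cur[half:])
whole = scc(row_last, row_cur)

# Unlike NCC, whose denominator couples the two halves, the whole-row
# SCC is exactly the sum of the per-half values.
assert whole == left + right
```

This is why the SCC case is much simpler than the NCC case: no per-half renormalization is needed when combining the halves.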
It will also be appreciated that the maximum standard cross-correlation between a row S of a frame and a row R of the last useful frame can be given by other expressions. For example, weighted or voting schemes can be used. In one embodiment,
PeakSCCwhole(R,S,L) = Weighted_MAX{SCC_whole(R,S,d)} for d = −L to d = L [Equation 8]
where Weighted_MAX is a function that assigns a plurality of predetermined weights to its elements before generating a value, dpeakwhole(R,S,L) is the value of d at which Equation 8 is satisfied, and L is approximately equal to the maximum horizontal speed from the last useful frame to the current frame.
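A minimal sketch of such a Weighted_MAX is shown below. The particular weighting scheme (here, simply scaling each candidate by a predetermined weight, e.g., to favor small displacements d) is an assumption; the text only requires that predetermined weights be applied before a value is generated.

```python
def weighted_max(values, weights):
    """Illustrative Weighted_MAX: scale each candidate correlation
    value by a predetermined weight before taking the maximum.
    Returns (weighted_peak, index_of_peak); the index corresponds to
    the lag d at which Equation 8 is satisfied."""
    best_i = max(range(len(values)), key=lambda i: values[i] * weights[i])
    return values[best_i] * weights[best_i], best_i
```

With weights favoring the center lag, a slightly smaller correlation at a small displacement can win over a larger one at a large displacement.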
While the preferred embodiment splits each row into left and right sides of equal length, alternative embodiments use any arbitrary division of each row, including more than two parts, divisions of differing lengths, and divisions that do not touch one another.
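Such arbitrary divisions can be sketched as a list of index ranges; the function name and interface are illustrative assumptions.

```python
def segment_scc(row_a, row_b, segments):
    """Illustrative generalization to arbitrary, possibly non-adjacent
    row divisions: 'segments' is a list of (start, end) index pairs.
    The per-segment correlations can be used separately (as the
    left/right halves are above) or summed for a whole-row value."""
    return [sum(x * y for x, y in zip(row_a[s:e], row_b[s:e]))
            for s, e in segments]
```

For example, `segments=[(0, 2), (4, 8)]` correlates two divisions that do not touch, as the text permits.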
For example,
In other embodiments, it is desirable to modify the raw values Δxwhole(i), Δywhole(i), and Δtheta(i) before sending them to the host. These modifications involve three different mathematical transformations, generically called filtering, where the transformed output is denoted by the ′ notation:
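The three transformations themselves are not enumerated in this excerpt, so the following exponential-smoothing filter is only an assumed example of the kind of filtering meant, not the patent's actual transformations.

```python
def smooth(raw_values, alpha=0.5):
    """Assumed example filter: exponential smoothing of the raw
    per-iteration deltas, producing the primed (filtered) outputs.
    alpha is an illustrative smoothing parameter in (0, 1]."""
    out, prev = [], 0.0
    for v in raw_values:
        prev = alpha * v + (1 - alpha) * prev
        out.append(prev)
    return out
```

Smoothing of this kind reduces jitter in the cursor or rotation output at the cost of a small amount of lag.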
Those skilled in the art will recognize that correlation is computationally intensive. Accordingly, in one embodiment, the calculation of Δtheta(i) and/or the Δx(i) and Δy(i) for the left, right, and whole array is performed on a separate processor or on dedicated hardware, which can be integrated into the silicon fingerprint sensor itself. The hardware performing the correlation must have access to the current frame and the last useful frame, both of which can be stored in memory on the device. If the hardware is integrated into the finger image sensor, the device has access to the current frame (since it created it) and can save the last useful frame in volatile memory registers. In such a case, the device must also determine whether a frame is useful, using the method described here in the preferred embodiment, and the host computing device is not necessary. Such a hardware implementation could also be used to reconstruct fingerprint images, since doing so requires only the Δx(i) and Δy(i) for the whole array.
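The two-frame storage and usefulness test described above can be sketched as below; the class name and the correlation threshold value are illustrative assumptions.

```python
class FrameBuffer:
    """Sketch of on-device frame storage: the hardware keeps the last
    *useful* frame, i.e., one whose peak correlation with its
    predecessor was strong enough (the overlapping case)."""

    def __init__(self, threshold=0.75):
        self.threshold = threshold  # assumed MaxPeakNCC cutoff
        self.last_useful = None

    def accept(self, frame, max_peak_ncc):
        """Replace the stored frame only when the new frame overlaps
        the previous one; non-overlapping frames are discarded."""
        if self.last_useful is None or max_peak_ncc >= self.threshold:
            self.last_useful = frame
```

The correlation hardware then always compares the incoming frame against `last_useful`, never against a discarded, non-overlapping frame.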
It will be readily apparent to one skilled in the art that various modifications may be made to the embodiments without departing from the spirit and scope of the invention as defined by the appended claims.
This application claims priority under 35 U.S.C. § 119(e) of U.S. provisional application Ser. No. 60/497,045, filed on Aug. 22, 2003, and titled "ROTATIONAL INPUT METHOD PATENT," which is hereby incorporated by reference.