Claims
- 1. A method for reconstructing two overlapping images, comprising:
collecting a first slice of image data; collecting a second slice of image data; determining the correlation factors for a plurality of frames of image data within the first slice; determining the correlation factors for a frame of image data within the second slice; comparing the correlation factors from each of the plurality of frames of image data from the first slice to the correlation factors for the frame of image data from the second slice; determining the frame within the first slice with the highest correlation to the frame from the second slice; and positioning the first slice of image data relative to the second slice of image data based upon the location of the frame within the first slice with the highest correlation to the frame from the second slice.
- 2. The method according to claim 1 wherein the image data is taken from a biometric object.
- 3. The method according to claim 2 wherein the biometric object is a fingerprint or a palmprint.
- 4. The method according to claim 1 wherein the steps of collecting a first slice of image data and collecting a second slice of image data are performed by collecting outputs from an array of sensitive elements in a biometric sensor.
- 5. The method according to claim 4 wherein the biometric sensor is a fingerprint sensor.
- 6. The method according to claim 1 wherein the step of determining the correlation factors for a plurality of frames of image data within the first slice further comprises the step of determining the deviation per column values for each column of sensitive elements within the frame.
- 7. The method according to claim 1 wherein the step of determining the correlation factors for a frame of image data within the second slice further comprises the step of determining the deviation per column values for each column of sensitive elements within the frame.
- 8. The method according to claim 1 wherein the step of comparing the correlation factors from each of the plurality of frames of image data from the first slice to the correlation factors for the frame of image data from the second slice further comprises the steps of:
determining the difference between the deviation per column values for each of the frames in the first slice and the deviation per column values for the frame of the second slice; and calculating the sum of the differences between the deviation per column values.
- 9. The method according to claim 8 wherein the step of determining the frame within the first slice with the highest correlation to the frame from the second slice further comprises the step of:
comparing the sum of the difference between the deviation per column values to find the frames with the smallest value of the sum of the difference between the deviation per column values.
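The alignment procedure of claims 1-9 can be sketched as follows. This is an illustrative reading only: the frame height, the use of standard deviation as the per-column deviation measure, and the choice of the reference frame from the top of the second slice are assumptions not fixed by the claims.

```python
import numpy as np

def deviation_per_column(frame):
    # Correlation factors per claims 6-7: one deviation value per column
    # of sensitive-element outputs within the frame (standard deviation
    # is an assumed measure).
    return frame.std(axis=0)

def best_matching_frame(slice1, slice2, frame_rows=4):
    # Reference frame taken from the top of the second slice (an assumption).
    ref = deviation_per_column(slice2[:frame_rows])
    best_row, best_score = None, None
    # Slide a frame-sized window over the first slice (claim 1). Per
    # claims 8-9, the frame whose summed deviation difference is smallest
    # has the highest correlation to the reference frame.
    for row in range(slice1.shape[0] - frame_rows + 1):
        cand = deviation_per_column(slice1[row:row + frame_rows])
        score = np.abs(cand - ref).sum()
        if best_score is None or score < best_score:
            best_row, best_score = row, score
    return best_row  # vertical offset used to position slice1 against slice2
```

The returned row index locates the best-correlated frame within the first slice, which in turn fixes the relative position of the two slices during reconstruction.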
- 10. A method for reconstructing fingerprint images from a fingerprint sensor, comprising the steps of:
collecting a first slice of fingerprint image data from a first plurality of sensitive element outputs; collecting a second slice of fingerprint image data from a second plurality of sensitive element outputs; and reconstructing the fingerprint image by positioning the first slice relative to the second slice based on comparing the correlation factors of the frames of the first slice to the correlation factors of a frame in the second slice.
- 11. The method according to claim 10 wherein the fingerprint image is generated by swiping a finger along a fingerprint sensor.
- 12. The method according to claim 10 wherein the fingerprint image is generated by placing a finger on a fingerprint sensor in a plurality of positions to generate a complete fingerprint image.
- 13. The method according to claim 10 wherein the number of sensitive elements in each of the frames of the first slice is less than half the number of sensitive elements in the first plurality of sensitive elements.
- 14. The method according to claim 10 wherein the number of sensitive elements in each of the frames of the first slice is more than half the number of sensitive elements in the first plurality of sensitive elements.
- 15. The method according to claim 10 wherein the number of sensitive elements in the first plurality of sensitive elements is the same as the number of sensitive elements in the second plurality of sensitive elements.
- 16. The method according to claim 10 wherein the step of comparing the correlation factors of the frames compares the frames in one dimension.
- 17. The method according to claim 10 wherein the step of comparing the correlation factors of the frames compares the frames in two dimensions.
- 18. The method according to claim 10 wherein the correlation factor for a frame of image data is based upon the comparison of outputs from columns of sensitive elements arranged in a biometric sensor.
- 19. The method according to claim 10 wherein the correlation factor for a frame of image data is based upon the comparison of outputs from rows of sensitive elements arranged in a biometric sensor.
- 20. The method according to claim 10 wherein the correlation factor for a frame of image data is based upon comparison of the outputs from the rows and the outputs from the columns of sensitive elements arranged in a biometric sensor.
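Claims 18-20 permit the correlation factor for a frame to be built from column outputs, row outputs, or both. A minimal sketch of the combined variant of claim 20, reusing the deviation measure assumed above; concatenating the two profiles is an assumption, since the claim only requires that rows and columns both contribute:

```python
import numpy as np

def correlation_factor(frame):
    col_dev = frame.std(axis=0)  # one value per column (claim 18)
    row_dev = frame.std(axis=1)  # one value per row (claim 19)
    # Claim 20: combine both profiles into a single correlation factor.
    return np.concatenate([col_dev, row_dev])
```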
- 21. A method for compensating for stretch in biometric object data collected from a swipe sensor, comprising the steps of:
collecting two slices of image data; determining the shift between the slices by comparing frames within the slices; determining the amount of stretch in the collected image data; and adjusting the collected image data to compensate for the amount of stretch.
- 22. The method according to claim 21 wherein the step of determining the amount of stretch in the collected image data further comprises the steps of:
determining a hardware stretch factor; determining a finger swipe speed stretch factor; and applying the hardware stretch factor and the finger swipe speed stretch factor to the shift to determine the amount of image stretch.
- 23. The method according to claim 21 wherein the step of adjusting the collected image data to compensate for the amount of stretch further comprises the step of removing some of the shift image data.
- 24. The method according to claim 23 wherein the step of removing some of the shift image data further comprises removing a plurality of rows of image data from the shift image data.
- 25. The method according to claim 23 wherein the step of removing some of the shift image data introduces a rounding error into adjusted collected image data.
- 26. The method according to claim 25 wherein the introduced rounding error is collected and applied to the adjusted collected image data.
- 27. The method according to claim 24 wherein the plurality of rows of image data removed from the shift image data are uniformly removed from the shift image data.
- 28. The method according to claim 24 wherein the plurality of rows of image data removed from the shift image data are non-uniformly removed from the shift image data.
- 29. The method according to claim 21 wherein the step of adjusting the collected image data to compensate for the amount of stretch further comprises the step of removing a plurality of rows of shift image data from that portion of the shift image data furthest from the overlapping portion of the collected slices of image data.
- 30. The method according to claim 21 wherein the step of adjusting the collected image data to compensate for the amount of stretch further comprises the step of removing a portion of the shift image data, in proportion to the amount of stretch, from the portion of the shift image data where the most image stretch occurs.
- 31. The method according to claim 22 wherein the step of adjusting the collected image data to compensate for the amount of stretch further comprises the step of determining an interval of image removal based upon the shift image data and the amount of image stretch.
- 32. The method according to claim 31 wherein the step of adjusting the collected image data to compensate for the amount of stretch further comprises the steps of:
removing a portion of the shift image data based on a fraction of the image removal interval; and removing a portion of the shift image data based on the full image removal interval.
- 33. A method according to claim 32 wherein the fraction of the image removal interval is about half of the image removal interval.
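The stretch compensation of claims 21-33 can be sketched as uniform row removal. The sketch assumes the shift has already been measured (e.g. by the frame comparison of claim 1); the product form for combining the hardware and swipe-speed stretch factors, and the exact interval arithmetic, are illustrative assumptions:

```python
import numpy as np

def compensate_stretch(slice_data, shift_rows, hw_factor, speed_factor):
    # Claim 22: the amount of image stretch is derived from the measured
    # shift and the hardware / swipe-speed stretch factors (the product
    # form here is an assumption).
    stretch_rows = int(round(shift_rows * hw_factor * speed_factor))
    if stretch_rows <= 0:
        return slice_data
    # Claims 24, 27, 31: remove stretch_rows rows from the shift portion,
    # uniformly spaced at an interval derived from the shift and the
    # amount of stretch.
    interval = max(shift_rows // stretch_rows, 1)
    # Claims 32-33: begin removal at about half the interval, then
    # continue at the full interval.
    drop = set(list(range(interval // 2, shift_rows, interval))[:stretch_rows])
    keep = [r for r in range(slice_data.shape[0]) if r not in drop]
    return slice_data[keep]
```

Rounding the stretch to a whole number of rows introduces the rounding error of claim 25, which an implementation could accumulate across slices and apply later, as claim 26 suggests.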
- 34. A method for detecting swipe start on a swipe sensor, comprising the steps of:
collecting slices of image data; comparing the collected slices of image data to detect an image shift between two slices; and determining that a swipe has started when an image shift is detected.
- 35. The method according to claim 34 wherein the step of comparing the collected slices of image data to detect an image shift between two slices further comprises the steps of:
determining correlation factors for a plurality of frames within one slice; determining correlation factors for a frame within another slice; and determining the shift between the one slice and the other slice by comparing the correlation factors for each of the plurality of frames within the one slice to the correlation factors for the frame within the other slice.
- 36. A method for determining when swiping has stopped in a swipe sensor, the method comprising the steps of:
collecting multiple slices of image data from a biometric sensor; comparing adjacent slices within the multiple collected slices of image data to detect an image shift between two slices; and determining that swiping has stopped when there is no image shift detected before a threshold number of image slices is collected.
- 37. A method for detecting a swipe too fast condition on a swipe sensor, comprising the steps of:
collecting slices of image data from a swipe sensor; attempting to correlate any one of a plurality of frames of image data from within one slice to a frame of image data within an adjacent slice; and determining that there is a swipe too fast condition when none of the plurality of frames of image data from the one slice correlates to a frame of image data from an adjacent slice.
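Claims 34-37 together amount to a small state machine driven by the slice-to-slice shift measurement. A hedged sketch, assuming each adjacent slice pair has already been reduced to a shift value: an integer number of rows, 0 for no movement, or None when no frame of one slice correlates to any frame of the next; the stop threshold is illustrative:

```python
def classify_swipe(shifts, stop_threshold=5):
    """Interpret a sequence of per-slice-pair shift measurements."""
    started = False
    zero_run = 0
    for shift in shifts:
        if shift is None:
            # Claim 37: no frame of one slice correlates to a frame of
            # the adjacent slice -> the finger moved too far per slice.
            return "swipe too fast"
        if shift > 0:
            # Claim 34: the first detected image shift marks swipe start.
            started = True
            zero_run = 0
        elif started:
            zero_run += 1
            # Claim 36: no shift detected for a threshold number of
            # slices means swiping has stopped.
            if zero_run >= stop_threshold:
                return "swipe stopped"
    return "swiping" if started else "no swipe yet"
```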
- 38. A method of authenticating fingerprints in a swipe fingerprint system, the method comprising the steps of:
creating an enrolled fingerprint image data file for a true user by instructing the user to swipe at several different speeds; collecting slices of fingerprint image data while the true user swipes at several different speeds; instructing an unknown user claiming to be the true user to swipe at several different speeds; collecting slices of image data as the unknown user swipes at different speeds; and determining whether the unknown user is the true user by comparing the slices of image data collected from the true user at several different swipe speeds to the slices of image data collected from the unknown user at several different swipe speeds.
- 39. A method for authenticating a user based on biometric image data, comprising the steps of:
collecting a standard initial enrolled swipe image from an enrolled user; collecting a secondary enrolled swipe image from the enrolled user; collecting a standard initial swipe image from an unknown user; collecting a secondary swipe image from the unknown user; and determining whether the unknown user is the enrolled user by comparing the standard initial enrolled swipe image to the standard initial swipe image from the unknown user and comparing the secondary enrolled swipe image to the secondary swipe image from the unknown user.
- 40. The method according to claim 39 wherein the steps of collecting a secondary enrolled swipe image from the enrolled user and collecting a secondary swipe image from the unknown user further comprise the collection of image data from altered swipe patterns.
- 41. The method according to claim 40 wherein the altered swipe patterns are selected from or are combinations from the group of swipe patterns consisting of: swipe fast, swipe slow, swipe with finger tilt left, swipe with fingertip, swipe and stop half way along the swipe sensor, and swiping a pattern across the sensor.
- 42. The method according to claim 39 wherein the steps of collecting a secondary enrolled swipe image from the enrolled user and collecting a secondary swipe image from the unknown user further comprise the step of collecting a plurality of secondary swipe images from a variety of altered swipe conditions.
- 43. The method according to claim 42 wherein the step of determining whether the unknown user is the enrolled user further comprises the step of comparing less than all of the collected plurality of enrolled secondary images to less than all of the collected unknown user secondary images.
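The authentication scheme of claims 38-43 can be sketched as condition-by-condition matching of enrolled and unknown swipe images. The `match_images` similarity function, the score threshold, the all-conditions-must-match rule, and the minimum-conditions parameter are all illustrative assumptions, not requirements of the claims:

```python
def authenticate(enrolled, unknown, match_images, threshold=0.8, min_conditions=2):
    """enrolled / unknown: dicts mapping a swipe condition name
    (e.g. "fast", "slow", "tilt_left" per claim 41) to image data."""
    # Claim 43: compare only the conditions present in both collections,
    # which may be fewer than all of the enrolled conditions.
    common = set(enrolled) & set(unknown)
    if len(common) < min_conditions:
        return False
    # Claim 39: accept the unknown user only when the compared images
    # match the enrollment under every shared condition.
    return all(match_images(enrolled[c], unknown[c]) >= threshold
               for c in common)
```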
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application is related to U.S. Provisional Application Serial No. 60/337,933 filed Nov. 6, 2001, entitled, “Method and System For Capturing Fingerprints From Multiple Swipe Images”, which is incorporated herein by reference in its entirety and to which priority is claimed.
[0002] Appendix A, which is part of the present disclosure, consists of 14 pages of a software program operable on a host computer in accordance with embodiments of the present invention. These 14 pages correspond to pages A-10 to A-23 of the provisional application Ser. No. 60/337,933 filed Nov. 6, 2001. A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
Provisional Applications (1)

| Number | Date | Country |
| --- | --- | --- |
| 60/337,933 | Nov. 2001 | US |