System and method for counting ridges in a captured print image

Information

  • Patent Grant
  • Patent Number
    6,996,259
  • Date Filed
    Friday, August 1, 2003
  • Date Issued
    Tuesday, February 7, 2006
Abstract
A system and method for counting ridges in a captured print image frame is described. A pixel path through the captured print image frame is traversed. A hysteresis band for the pixel path is determined. A number of crossings of the determined hysteresis band is counted while traversing the pixel path. A number of print ridges based on the counted number of hysteresis band crossings is determined.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


The invention relates generally to the field of fingerprint scanner technology and, more particularly, to counting ridges in a captured fingerprint image frame.


2. Background Art


Biometrics are a group of technologies that provide a high level of security. Fingerprint capture and recognition is an important biometric technology. Law enforcement, banking, voting, and other industries increasingly rely upon fingerprints as a biometric to recognize or verify identity. See, Biometrics Explained, v. 2.0, G. Roethenbaugh, International Computer Society Assn. Carlisle, Pa. 1998, pages 1-34 (incorporated herein by reference in its entirety).


Fingerprint scanners having cameras are available that capture an image of a fingerprint. Typically, to capture a fingerprint image electronically with a fingerprint scanner, a light source is directed towards a fingerprint capture surface that reflects light from the light source towards a camera. The fingerprint capture surface is generally glass. Contact between the surface of a finger and the fingerprint capture surface causes the reflected light to be representative of the fingerprint of the particular finger placed against the fingerprint capture surface. This reflected light is then captured by the camera. The fingerprint scanner may have processing that produces a signal representative of a captured fingerprint image from the reflected light.


The quality of contact between a finger and the fingerprint capture surface plays a large role in the intensity of the reflected light. A very dry skin surface on a clean fingerprint capture surface may result in a low intensity level of reflected light. On the other hand, an oily skin surface and/or a less-clean fingerprint capture surface may result in a high level of reflected light. Additional factors, such as the location of the finger, the pressure applied to press the finger to the platen, and the ambient environment may affect whether an acceptable quality fingerprint image is captured.


As a result of the above variations, a fingerprint scanner system and method that captures an acceptable fingerprint image is needed. Moreover, a system and method for determining the quality of a captured fingerprint image is desired.


BRIEF SUMMARY OF THE INVENTION

A system and method for counting fingerprint ridges in a captured fingerprint image frame is described. In an aspect of the present invention, a fingerprint image is captured. The captured fingerprint image is stored. A region of interest in the stored fingerprint image frame is determined. A pixel path is determined through the region of interest.


In a further aspect, the pixel path through the captured fingerprint image frame is traversed. A hysteresis band for the pixel path is determined. A number of crossings of the determined hysteresis band is counted while traversing the pixel path. A number of fingerprint ridges based on the counted number of hysteresis band crossings is determined.


In a further aspect, the determined number of fingerprint ridges is stored.


In a further aspect, the stored number of fingerprint ridges is evaluated to determine a quality of the captured fingerprint image.


In a further aspect, a plurality of pixel paths may be determined, and individually traversed.


In an aspect of the present invention, the hysteresis band is defined by a hysteresis band first edge value and a hysteresis band second edge value. The hysteresis band may be determined as follows: A first ridge pixel value peak for the determined pixel path is measured. A first valley pixel value peak for the determined pixel path is measured. A hysteresis band center pixel value between the first ridge pixel value peak and the first valley pixel value peak is selected. The hysteresis band first edge value is calculated by adding a delta value to the selected hysteresis band center pixel value. The hysteresis band second edge value is calculated by subtracting the delta value from the selected hysteresis band center pixel value.


In a further aspect, the hysteresis band center pixel value may be selected by calculating an average pixel value of the first ridge pixel value peak and the first valley pixel value peak, and setting the hysteresis band center pixel value to the calculated average pixel value.


In a further aspect, the delta value may be calculated according to the following equation:

delta value=|(first valley pixel value peak−first ridge pixel value peak)|/N,

    • wherein N is any number greater than one. For example, N may be an integer, such as six.


In a further aspect, a hysteresis band crossing may be detected when sequentially detected pixel values along the pixel path range from the hysteresis band first edge value to the hysteresis band second edge value. Furthermore, a hysteresis band crossing may be detected when sequentially detected pixel values along the pixel path range from the hysteresis band second edge value to the hysteresis band first edge value.


In a further aspect, the number of fingerprint ridges based on the counted number of hysteresis band crossings is determined by dividing the counted number of hysteresis band crossings by two.
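The band-edge equations and the divide-by-two relationship in the aspects above can be sketched in a few lines (an illustrative sketch; the function names are ours, not the patent's):

```python
def hysteresis_band_edges(ridge_peak, valley_peak, n=6):
    """Band edges per the equations above: the center is the average of
    the first ridge and first valley pixel value peaks, and the delta
    value is |valley peak - ridge peak| / N, with N any number > 1."""
    center = (ridge_peak + valley_peak) / 2.0
    delta = abs(valley_peak - ridge_peak) / n
    return center + delta, center - delta  # (first edge, second edge)

def ridge_count(band_crossings):
    """Each ridge produces two band crossings, one in each direction."""
    return band_crossings // 2
```

For peaks of 230 and 30 with N = 6, the edges come out near 163.3 and 96.7, matching the worked example in the detailed description.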


In another aspect of the present invention, a system is described for counting fingerprint ridges in a captured fingerprint image frame. A ridge counter module traverses a pixel path through the captured fingerprint image frame, determines a hysteresis band for the pixel path, counts a number of crossings of the determined hysteresis band while traversing the pixel path, and determines a number of fingerprint ridges based on the counted number of hysteresis band crossings.


In a further aspect, the system includes a camera that captures a fingerprint image and outputs the captured fingerprint image frame.


In a further aspect, the system includes a memory that stores the captured fingerprint image frame, and is accessible by the ridge counter module.


In a further aspect, the system includes a platen that has a finger application area.


In a further aspect, the system includes an illumination source that provides light to illuminate the finger application area to produce the fingerprint image.


In a further aspect, the system includes an optical system that directs the light to the camera.


In a further aspect, the system includes a controller that includes the ridge counter module and controls the illumination source and/or the camera.


This system and method for counting ridges according to the present invention can be used with any type of print including, but not limited to, a print of all or part of a finger, palm, hand, toe, and foot.


Further aspects, features, and advantages of the present invention, as well as the structure and operation of the various embodiments of the present invention, are described in detail below with reference to the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES

The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.



FIG. 1 illustrates an example block diagram of a fingerprint scanner system, according to embodiments of the present invention.



FIG. 2A shows an example captured fingerprint image frame.



FIG. 2B shows the example captured fingerprint image frame of FIG. 2A with two example fingerprint image frame regions, according to embodiments of the present invention.



FIG. 2C shows example pixel paths through a captured fingerprint image frame region, according to the present invention.



FIG. 2D shows an example finger that is applied to an example platen, according to an embodiment of the present invention.



FIG. 3 shows an example plot of pixel intensity for a traversed pixel path.



FIG. 4 shows a flowchart providing high level steps for performing the present invention.



FIG. 5 shows example steps for counting fingerprint ridges in a captured fingerprint image frame, according to embodiments of the present invention.



FIG. 6 shows example steps for determining a hysteresis band, according to an embodiment of the present invention.



FIG. 7 shows example steps for counting a number of crossings of a determined hysteresis band while traversing a determined pixel path, according to an embodiment of the present invention.



FIG. 8 shows example steps for counting fingerprint ridges in a captured fingerprint image frame for one or more pixel paths, according to embodiments of the present invention.





The present invention will now be described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.


DETAILED DESCRIPTION OF THE INVENTION

While the present invention is described herein with reference to illustrative embodiments for particular applications, it should be understood that the invention is not limited thereto. Those skilled in the art with access to the teachings provided herein will recognize additional modifications, applications, and embodiments within the scope thereof and additional fields in which the present invention would be of significant utility.


Overview


The present invention is directed to a method, system, and apparatus for counting ridges in a captured fingerprint image. The present invention may be applied in any type of print scanner, including but not limited to any type of fingerprint and/or palm print scanner.


Numerous embodiments of the present invention are presented herein. Detail on the above mentioned embodiments for counting fingerprint ridges in a captured fingerprint image frame, and additional embodiments according to the present invention, are described. The embodiments described herein may be combined in any applicable manner, as required by a particular application.


Terminology


To more clearly delineate the present invention, an effort is made throughout the specification to adhere to the following term definitions consistently.


The term “finger” refers to any digit on a hand including, but not limited to, a thumb, an index finger, middle finger, ring finger, or a pinky finger.


The term “live scan” refers to a scan of any type of fingerprint and/or palm print image made by a print scanner. A live scan can include, but is not limited to, a scan of a finger, a finger roll, a flat finger, slap print of four fingers, thumb print, palm print, toe, or foot, or a combination of fingers, such as, sets of fingers and/or thumbs from one or more hands or one or more palms, or one or more toes disposed on a platen.


In a live scan, one or more fingers or palms from either a left hand or a right hand, or both hands or all or part of a foot are placed on a platen of a scanner. Different types of print images are detected depending upon a particular application. For example, a flat print consists of a fingerprint image of a digit (finger or thumb) pressed flat against the platen. A roll print consists of an image of a digit (finger or thumb) made while the digit (finger or thumb) is rolled from one side of the digit to another side of the digit over the surface of the platen. A slap print consists of an image of four flat fingers pressed flat against the platen. A palm print involves pressing all or part of a palm upon the platen. A platen can be movable or stationary depending upon the particular type of scanner and the type of print being captured by the scanner.


The terms “biometric imaging system”, “scanner”, “live scanner”, “live print scanner”, “fingerprint scanner” and “print scanner” are used interchangeably, and refer to any type of scanner which can obtain an image of all or part of one or more fingers and/or palm in a live scan. The obtained images can be combined in any format including, but not limited to, an FBI, state, or international ten-print format.


The term “platen” refers to a component that includes an imaging surface upon which at least one finger is placed during a live scan. A platen can include, but is not limited to, a surface of an optical prism, set of prisms, or set of micro-prisms, or a surface of a silicone layer or other element disposed in optical contact with a surface of an optical prism, set of prisms, or set of micro-prisms.


Embodiments for Counting Fingerprint Ridges in a Captured Fingerprint Image Frame


Example embodiments for counting fingerprint ridges according to the present invention are described at a high-level and at a more detailed level. These example embodiments are provided herein for illustrative purposes, and are not limiting. In particular, fingerprint ridge counting as described in this section can be achieved using any number of structural implementations, including hardware, firmware, software, or any combination thereof.



FIG. 1 illustrates an example high level block diagram of a live scanner system 100, according to embodiments of the present invention. Live scanner system 100 includes an illumination source 110, a live scanner optical system 120, a camera 130, a memory 135, and a live scanner controller 140. Live scanner system 100 captures a user's print. Furthermore, live scanner system 100 is capable of performing ridge counting according to the present invention.


Live scanner system 100 may be a portion of, or may be included in, any suitable type of print scanner known to persons skilled in the relevant art(s). For example, live scanner system 100 may be included in any live scanner available from Cross Match Technologies, Inc., or other manufacturer. Furthermore, one or more portions of live scanner system 100 may be incorporated in any computer system that can process captured fingerprint images.


Optical system 120 shown in FIG. 1 includes a fingerprint image capturing platen, where a user may apply a finger. In some embodiments, the fingerprint image capturing platen may allow a user to roll the applied finger across the platen. Illumination source 110 provides light for illuminating the applied finger at the platen. Optical system 120 may focus and direct the light to the platen. Optical system 120 focuses and/or directs light reflected from the applied finger to camera 130. Camera 130 periodically samples the reflected light, and outputs captured fingerprint image data. The data is output to memory 135, which stores the captured fingerprint image data in the form of a captured fingerprint image frame. For example, the captured fingerprint image frame may be stored in the form of a two-dimensional array of pixel data.


Controller 140 accesses the captured fingerprint image data stored in memory 135, and/or directly from camera 130. Controller 140 may provide a sampling signal to camera 130 and/or illumination source 110 that causes camera 130 to capture fingerprint image frames while being illuminated by illumination source 110.


Controller 140 may be included in a personal computer, a mainframe computer, one or more processors, specialized hardware, software, firmware, or any combination thereof, and/or any other device capable of processing the captured fingerprint image data as described herein. Controller 140 may allow a user to initiate and terminate a fingerprint capture session. Controller 140 also allows a user to evaluate the quality of captured fingerprint images, as described below.


As shown in FIG. 1, controller 140 comprises a ridge counter module 150. Ridge counter module 150 counts fingerprint ridges in captured fingerprint image frames. Further structural and operational detail of ridge counter module 150 is provided below. Ridge counter module 150 may be implemented in hardware, firmware, software, or a combination thereof. Other structural embodiments for ridge counter module 150 will be apparent to persons skilled in the relevant art(s) based on the discussion contained herein.



FIG. 4 shows a flowchart 400 providing high level steps for the present invention. Operational and structural embodiments related to flowchart 400 will become apparent to persons skilled in the relevant art(s) based on the discussion herein.


Flowchart 400 begins with step 402. In step 402, a fingerprint ridge count is determined for one or more pixel paths across a captured fingerprint image frame. Procedures for determining a fingerprint ridge count according to step 402 are described further below. Step 402 may be performed by live scanner system 100, for example.


In step 404, the determined fingerprint ridge count is evaluated to determine at least a quality of the captured fingerprint image frame. Controller 140 or other processing hardware/software may use the output of ridge counter module 150 for any number of reasons. For example, controller 140 may use a ridge count as a factor in determining whether a quality fingerprint image has been captured. For example, if an unusually low number of ridges are counted, controller 140 may determine that a poor quality fingerprint image was captured. If an expected number of ridges are counted, controller 140 may use this information as a factor indicating that a relatively good quality fingerprint image was captured. A fingerprint ridge count may be used for additional reasons, including evaluating the performance of the corresponding fingerprint scanner, identity verification, and for further reasons.



FIG. 5 shows example steps for performing step 402, according to one or more embodiments of the present invention. Optional steps are shown surrounded by dotted lines in FIG. 5. The steps of FIG. 5 do not necessarily have to occur in the order shown, as will be apparent to persons skilled in the relevant art(s) based on the teachings herein. Operational and structural embodiments will be apparent to persons skilled in the relevant art(s) based on the following discussion.


In step 502, a fingerprint image is captured.


In step 504, the captured fingerprint image is stored to be accessed as the stored fingerprint image frame.


In step 506, a region of interest in a stored fingerprint image frame is identified.


In step 508, a pixel path through the region of interest is determined.


In step 510, the determined pixel path is traversed.


In step 512, a hysteresis band for the determined pixel path is determined.


In step 514, a number of crossings of the determined hysteresis band while traversing the determined pixel path is counted.


In step 516, a number of fingerprint ridges based on the counted number of hysteresis band crossings is determined.


In step 518, the number of fingerprint ridges determined in step 516 is stored.


The steps shown in FIG. 5 are described in further detail in the following discussion.


In optional step 502 of FIG. 5, a fingerprint image is captured. For example, camera 130 captures a fingerprint image reflected from a finger applied to a platen. FIG. 2D shows an example finger 230 that is applied to a platen 240, according to an embodiment of the present invention. As shown in FIG. 2D, platen 240 is a surface of an optical prism. Finger 230 includes five ridges 260a-260e and four valleys 280a-d, for illustrative purposes. Illumination source 110 emits light to finger 230. A portion of the emitted light from illumination source 110 is shown as first light 270a and second light 270b in FIG. 2D. As shown in FIG. 2D, first light 270a contacts ridge 260a at the surface of platen 240. First light 270a is largely diffused and/or absorbed by finger 230, and does not substantially cause reflected light. Hence, because first light 270a is not substantially reflected from ridge 260a, camera 130 will receive relatively little light, and will output one or more “dark” pixels that correspond to ridge 260a. Conversely, second light 270b contacts space or valley 280c at the surface of platen 240, causing reflected light 290. Second light 270b is relatively less diffused and/or absorbed by finger 230, causing more light to be reflected. Hence, camera 130 receives reflected light 290 and outputs one or more relatively “bright” pixels that correspond to valley 280c.


Note that in this description, ridges are described to cause relatively “dark” reflections, and valleys to cause relatively “bright” reflections. However, in alternative embodiments, this may be reversed. In other words, ridges may instead cause “bright” reflections, while valleys may cause relatively “dark” reflections.


In embodiments, different amounts of light are reflected depending on whether a ridge, valley, or intermediate portion of finger 230 is in contact with a prism. The light captured by the camera may be output as data by the camera in the form of grey-scale pixel intensity values. For example, the grey-scale pixel intensity values may range from 0 to 255, where 0 represents a white pixel, 255 represents a black pixel, and values in between 0 and 255 represent corresponding shades of grey. Alternatively, 0 may represent a black pixel and 255 may represent a white pixel. Furthermore, in alternative embodiments, the pixel intensity values may have greater and lesser ranges than 0 to 255. Furthermore, the pixel intensity values may include shades of one or more colors in addition to black, grey, and white. For illustrative purposes, fingerprint image data is described herein as in the form of pixels with grey-scale pixel intensity values ranging from 0 (white) to 255 (black).


Note that in alternative embodiments, the fingerprint image may have already been captured, and step 502 is therefore not necessary.


In optional step 504, shown in FIG. 5, the captured fingerprint image is stored to be accessed as the stored fingerprint image frame. For example, the captured fingerprint image frame is stored by camera 130 in memory 135. FIG. 2A shows an example stored captured fingerprint image frame 200. Captured fingerprint image frame 200 may have been captured by camera 130 shown in FIG. 1, and stored in memory 135, for example. In an embodiment, captured fingerprint image frame 200 is stored as a two-dimensional array of pixel data. Note that in alternative embodiments, the captured fingerprint image may be prestored, and step 504 is therefore not necessary.


As shown in FIG. 2A, stored captured fingerprint image frame 200 includes a captured fingerprint image 202. Captured fingerprint image 202 is an image captured from a user's finger applied to a corresponding platen. Darker pixel areas of captured fingerprint image 202 represent areas where relatively better contact is made between the user's finger and the platen. The darker pixel areas tend to correspond to ridges of the user's fingerprint. Lighter pixel areas of captured fingerprint 202 represent areas where relatively less contact is made between the user's finger and the platen. The lighter pixel areas tend to correspond to spaces or valleys between ridges of the user's fingerprint.


In optional step 506, shown in FIG. 5, a region of interest in a stored fingerprint image frame is identified. In the embodiment shown in FIG. 1, captured fingerprint image frame 200 is processed by controller 140 to obtain desired information related to the user's fingerprint. All of captured fingerprint image frame 200, or a portion of fingerprint image frame 200 may be considered to be the region of interest. Processing only a portion of fingerprint image frame 200 determined to be a region of interest may save processing time and other resources relative to processing all of fingerprint image frame 200.


For example, FIG. 2B shows an example first fingerprint image frame region 204 and an example second fingerprint image frame region 206, which are each portions of captured fingerprint image frame 200. First fingerprint image frame region 204 is smaller in area than captured fingerprint image frame 200, and therefore includes fewer pixels than captured fingerprint image frame 200. Because first fingerprint image frame region 204 has fewer pixels than captured fingerprint image frame 200, any processing of first fingerprint image frame region 204 may be more efficient than processing captured fingerprint image frame 200. Furthermore, second fingerprint image frame region 206 is smaller in area than first fingerprint image frame region 204, and therefore includes fewer pixels than first fingerprint image frame region 204. Because second fingerprint image frame region 206 has fewer pixels than first fingerprint image frame region 204, any processing of second fingerprint image frame region 206 may be more efficient than processing first fingerprint image frame region 204.


For illustrative purposes, in the following example of fingerprint image processing, any processing is described as performed on second fingerprint image frame region 206. However, in embodiments of the present invention, an entire captured fingerprint image frame, or any portion thereof, may be processed.
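As a sketch of the saving described above, a rectangular region of interest can be taken as a sub-array of the stored frame (the helper name and coordinates here are illustrative assumptions, not part of the patent):

```python
def region_of_interest(frame, top, left, height, width):
    """Return a rectangular sub-array of a captured frame stored as a
    two-dimensional list of grey-scale pixel intensity values."""
    return [row[left:left + width] for row in frame[top:top + height]]
```

Cropping a 4x4 frame to a 2x2 region, for instance, leaves 4 pixels to process instead of 16, which is the efficiency gain described above.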


In optional step 508, shown in FIG. 5, a pixel path through the region of interest is determined. In embodiments, one or more pixel paths through a single captured fingerprint image may be determined. FIG. 2C shows example pixel paths through captured fingerprint image frame region 206, according to the present invention. As shown in the example of FIG. 2C, six pixel paths are present: first horizontal pixel path 212, second horizontal pixel path 214, third horizontal pixel path 216, first vertical pixel path 218, second vertical pixel path 220, and third vertical pixel path 222. Only six paths are shown for clarity; however, in general a fewer or greater number of pixel paths can be used.


Note that in embodiments, pixel paths may be straight horizontal and vertical paths, such as are shown in FIG. 2C, or may be other shaped paths. For example, pixel paths may be elliptical, triangular, rectangular, other polygonal, irregular, or other shaped paths. Furthermore, in alternative embodiments, the one or more pixel paths are predetermined, and are not determined on an application basis. Hence, step 508 may not be necessary.


In step 510, shown in FIG. 5, the determined pixel path is traversed. According to the present invention, a pixel path is traversed so that the number of fingerprint ridges occurring along the pixel path may be counted. Pixel paths may be traversed in either direction along the pixel path. Typically, a pixel path is traversed by detecting pixel values sequentially along the determined pixel path. During the traversal, a pixel intensity value may be detected for every pixel in the pixel path, for every other pixel in the pixel path, for every third pixel in the pixel path, or for other multiples of pixels in the pixel path.
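Traversal with a sampling stride, as just described, might look like the following (a minimal sketch for a straight horizontal path; vertical or other-shaped paths would index differently):

```python
def traverse_horizontal_path(frame, row, stride=1):
    """Return pixel intensity values sampled left to right along a
    horizontal pixel path, taking every `stride`-th pixel. A stride of
    1 detects a value for every pixel, 2 for every other pixel, etc."""
    return frame[row][::stride]
```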


In the example of FIG. 2C, horizontal pixel paths are traversed from left to right, and the number of ridges occurring along the pixel path are counted during the traversal. During an example traversal of first horizontal pixel path 212, three fingerprint ridges are counted. During an example traversal of second horizontal pixel path 214, four fingerprint ridges are counted. During an example traversal of third horizontal pixel path 216, seven fingerprint ridges are counted. During an example traversal of first vertical pixel path 218, three fingerprint ridges are counted. During an example traversal of second vertical pixel path 220, five fingerprint ridges are counted. During an example traversal of third vertical pixel path 222, three fingerprint ridges are counted. The discussion below describes how to count fingerprint ridges such as these, according to the present invention.


In step 512, shown in FIG. 5, a hysteresis band for the determined pixel path is determined. According to the present invention, fingerprint ridges are counted using a hysteresis band, as is further described below. FIG. 3 shows an example plot 300 of pixel intensity for a traversed pixel path. Pixel intensity values range from 0 (white) to 255 (black). The pixel path begins at a first end of the pixel path, traversing sequentially through the pixels in the pixel path, to a second end of the pixel path. As shown in FIG. 3, four fingerprint ridges 304a-304d and four fingerprint valleys 302a-302d are present in the captured fingerprint image frame portion during traversal of the pixel path.


According to the present invention, a hysteresis band 306, as shown in FIG. 3, is determined. As shown in FIG. 3, hysteresis band 306 is defined by a hysteresis band first edge value 308 and a hysteresis band second edge value 310. FIG. 6 shows example steps for step 512 of FIG. 5, according to one or more embodiments of the present invention. The steps of FIG. 6 do not necessarily have to occur in the order shown, as will be apparent to persons skilled in the relevant art(s) based on the teachings herein. Operational and structural embodiments will be apparent to persons skilled in the relevant art(s) based on the following discussion related to flowchart 600.


Flowchart 600 begins with step 602. In step 602, a first ridge pixel value peak for the pixel path is measured. In other words, for the example of FIG. 3, the pixel path is traversed from left to right, to find a first maximum pixel value for the pixel intensity, which corresponds to a fingerprint ridge peak. In FIG. 3, by traversing the pixel path from left to right, a first ridge pixel value peak is found as the maximum pixel value for first ridge 304a. First ridge 304a is the first ridge found while traversing the pixel path from left to right. For instance, the peak pixel value for first ridge 304a may be equal to a pixel intensity value of 230. In examples, the first ridge pixel value peak can be any selected ridge value peak along the pixel path, an average or mean of the ridge value peaks, or the minimum ridge value peak of a group of ridge value peaks along the pixel path.


In step 604, a first valley pixel value peak for the pixel path is measured. In other words, for the example of FIG. 3, the pixel path is traversed from left to right, to find a first minimum pixel value for the pixel intensity, which corresponds to a fingerprint valley peak. In FIG. 3, by traversing the pixel path from left to right, a first valley pixel value peak is found as the minimum pixel value for first valley 302a. First valley 302a is the first valley found while traversing the pixel path from left to right. For instance, the peak pixel value for first valley 302a may be equal to a pixel intensity value of 30. In examples, the first valley pixel value peak can be any selected valley value peak along the pixel path, an average or mean of the valley value peak, or the maximum valley value peak of a group of valley value peaks along the pixel path.


In step 606, a hysteresis band center pixel value is selected between the first ridge pixel value peak and the first valley pixel value peak. In the example of FIG. 3, the hysteresis band center pixel value is shown as hysteresis band center pixel value 312. Any pixel intensity value between the intensity values of the first ridge pixel value peak and first valley pixel value peak may be chosen to be hysteresis band center pixel value 312. In an embodiment, hysteresis band center pixel value 312 is chosen to be the average value of the first ridge pixel value peak and first valley pixel value peak. In the current example, the average value of the first ridge pixel value peak and first valley pixel value peak is equal to

(230+30)/2=130=hysteresis band center pixel value 312.


In step 608, the hysteresis band first edge value is calculated by adding a delta value to the selected hysteresis band center pixel value. Hence, hysteresis band first edge value 308 is equal to the sum of hysteresis band center pixel value 312 and a delta value. In embodiments, the delta value may be predetermined, may be calculated as a fraction or percentage of the difference between the first ridge pixel value peak and the first valley pixel value peak, or may be calculated in other ways. In an embodiment, the delta value is one-sixth of the difference between the first ridge pixel value peak and the first valley pixel value peak:

delta value=(first ridge pixel value peak−first valley pixel value peak)/6

which in the current example is approximately equal to:

delta value=(230−30)/6≈33.3.

Hence, in the current example, hysteresis band first edge value 308 is approximately equal to

130+33.3=163.3=hysteresis band first edge value 308.


In step 610, the hysteresis band second edge value is calculated by subtracting the delta value from the selected hysteresis band center pixel value. Hence, in the current example, hysteresis band second edge value 310 is approximately equal to

130−33.3=96.7=hysteresis band second edge value 310.

Hence, in the current example, hysteresis band 306 is defined to range from 96.7 to 163.3. The determined hysteresis band 306 may be used to count fingerprint ridges, according to the present invention, as described herein.
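The band computation of steps 602 through 610 can be sketched as follows. This is a minimal illustration, not the patented implementation: the function and variable names are assumptions, the ridge and valley peaks are taken simply as the maximum and minimum intensity on the path (one of the choices the text permits), and the delta is one-sixth of the peak-to-peak difference as in the embodiment above.

```python
def determine_hysteresis_band(pixel_values, delta_fraction=1.0 / 6.0):
    """Compute the (second edge, first edge) of a hysteresis band for a pixel path.

    Illustrative sketch: the ridge peak is taken as the maximum intensity on
    the path and the valley peak as the minimum; the text also permits other
    choices, e.g. the first peak encountered or an averaged peak.
    """
    ridge_peak = max(pixel_values)            # step 602: first ridge pixel value peak
    valley_peak = min(pixel_values)           # step 604: first valley pixel value peak
    center = (ridge_peak + valley_peak) / 2.0 # step 606: band center (average of the peaks)
    delta = (ridge_peak - valley_peak) * delta_fraction  # one-sixth of the difference
    # step 610: second edge = center - delta; step 608: first edge = center + delta
    return center - delta, center + delta
```

For a path with peak 230 and valley 30, this reproduces the worked example: center 130, delta about 33.3, and band edges of about 96.7 and 163.3.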


In step 514, shown in FIG. 5, a number of crossings of the determined hysteresis band while traversing the determined pixel path is counted. A crossing of the hysteresis band is determined whenever pixels in a pixel path completely cross the hysteresis band. FIG. 7 shows example steps for step 514, according to an embodiment of the present invention. Operational and structural embodiments related to the steps shown in FIG. 7 will become apparent to persons skilled in the relevant art(s) based on the discussion herein.


In step 702, a hysteresis band crossing is detected when sequentially detected pixel values range from the hysteresis band first edge value to the hysteresis band second edge value.


In step 704, a hysteresis band crossing is detected when sequentially detected pixel values range from the hysteresis band second edge value to the hysteresis band first edge value.


For illustrative purposes, these processes are further described with respect to FIG. 3. As shown in FIG. 3, a hysteresis band crossing occurs between valley 302a and ridge 304a. According to step 704, a hysteresis band crossing is detected when sequentially detected pixel values range from the hysteresis band second edge value to the hysteresis band first edge value. Between valley 302a and ridge 304a, hysteresis band 306 is crossed because sequentially detected pixel values shown in plot 300 range from hysteresis band second edge value 310 to hysteresis band first edge value 308. Hence, under step 704, a hysteresis band crossing is detected between valley 302a and ridge 304a.


Furthermore, as shown in FIG. 3, a hysteresis band crossing occurs between ridge 304a and valley 302b. According to step 702, a hysteresis band crossing is detected when sequentially detected pixel values range from the hysteresis band first edge value to the hysteresis band second edge value. Between ridge 304a and valley 302b, hysteresis band 306 is crossed because sequentially detected pixel values shown in plot 300 range from hysteresis band first edge value 308 to hysteresis band second edge value 310. Hence, under step 702, a hysteresis band crossing is detected between ridge 304a and valley 302b.


Likewise, hysteresis band crossings may be detected between valley 302b and ridge 304b, between ridge 304b and valley 302c, and between valley 302c and ridge 304c.


However, a hysteresis band crossing is not detected between ridge 304c and valley 302d, or between valley 302d and ridge 304d. This is because hysteresis band 306 is not completely crossed. In other words, sequentially detected pixel values do not range from hysteresis band first edge value 308 to hysteresis band second edge value 310, or from hysteresis band second edge value 310 to hysteresis band first edge value 308 between ridge 304c and valley 302d or between valley 302d and ridge 304d.


A final hysteresis band crossing in plot 300 may be detected between ridge 304d and valley 302e according to step 702. Hence, a total of six hysteresis band crossings are detected in the example of plot 300 shown in FIG. 3.
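Steps 702 and 704 can be sketched as a small state machine over the sequentially detected pixel values. The function name is illustrative, not from the patent. A crossing is counted only when the path fully traverses the band from one edge to the other, so partial excursions that enter but do not exit the band, such as those between ridge 304c and valley 302d, are ignored.

```python
def count_band_crossings(pixel_values, low_edge, high_edge):
    """Count complete crossings of the hysteresis band [low_edge, high_edge]."""
    crossings = 0
    side = None  # last edge fully exited: "above" the first edge or "below" the second
    for value in pixel_values:
        if value >= high_edge:
            if side == "below":   # step 704: second edge -> first edge crossing
                crossings += 1
            side = "above"
        elif value <= low_edge:
            if side == "above":   # step 702: first edge -> second edge crossing
                crossings += 1
            side = "below"
        # values strictly inside the band leave the state unchanged,
        # so a partial dip or bump is not counted as a crossing
    return crossings
```

Applying step 516 to the six crossings counted in plot 300 then gives 6 / 2 = 3 ridges.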


In step 516, shown in FIG. 5, a number of fingerprint ridges based on the counted number of hysteresis band crossings is determined. Typically, the counted number of hysteresis band crossings is divided by two to determine the number of fingerprint ridges. However, in alternative embodiments, the number of fingerprint ridges based on the counted number of hysteresis band crossings may be determined in other ways.


Hence, in the example of plot 300 shown in FIG. 3, the number of fingerprint ridges may be determined by dividing the counted number of hysteresis band crossings by two:

6÷2=3=number of fingerprint ridges determined in plot 300


In optional step 518, shown in FIG. 5, the number of fingerprint ridges determined in step 516 is stored. For example, the number of fingerprint ridges that have been determined may be stored in a memory, such as memory 135, or any other memory or storage device, temporary or permanent. In embodiments, the number of fingerprint ridges may be displayed and/or transmitted. In alternative embodiments, the determined number of fingerprint ridges is not stored, and hence step 518 is not necessary.



FIG. 8 shows an alternative embodiment for step 402 of FIG. 5, according to the present invention. In FIG. 8, steps 508, 510, 512, 514, 516, and 518 are repeated for multiple pixel paths through the region of interest of the captured fingerprint image frame. Hence, after step 518 is performed (when present), operation proceeds back to step 508, so that a next pixel path may be determined. Once all of the desired pixel paths have been traversed, operation proceeds from step 518 to end step 804.


Furthermore, FIG. 8 shows an additional optional step, step 802. In step 802, it is evaluated whether a range of pixels in the determined pixel path is acceptable. In other words, controller 140, for example, may compare pixel intensities for pixels in the current pixel path to determine whether they have enough intensity variation to warrant further processing. For example, if there is relatively little intensity variation in the pixel path, the pixel path may include few or no fingerprint ridges. The pixel path may actually cross the region of interest in an area where there is little or no fingerprint information, and hence may not warrant further processing. Hence, as shown in FIG. 8, if the answer at step 802 is no, the current pixel path may be skipped, and operation may proceed from step 802 to step 508, where a next pixel path may be determined. If the pixel path does have a range of pixel intensities that are determined to be acceptable, operation may proceed from step 802 to step 510.


Further steps for the processes shown in FIGS. 4-8 will be known to persons skilled in the relevant art(s) from the teachings herein.


The present invention has been described with respect to fingerprints; however, the system and method for counting ridges can be used to count ridges in any type of print, including all or part of a finger, palm, hand, toe, and foot.


CONCLUSION

While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims
  • 1. A method for counting print ridges in a captured print image frame, comprising the steps of: (A) traversing a pixel path through the captured print image frame; (B) determining a hysteresis band for the pixel path; (C) counting a number of crossings of the determined hysteresis band while traversing the pixel path; and (D) determining a number of print ridges based on the counted number of hysteresis band crossings; wherein the hysteresis band is defined by a hysteresis band first edge value and a hysteresis band second edge value, wherein step (B) comprises the steps of: (1) measuring a first ridge pixel value peak for the pixel path; (2) measuring a first valley pixel value peak for the pixel path; (3) selecting a hysteresis band center pixel value between the first ridge pixel value peak and the first valley pixel value peak; (4) calculating the hysteresis band first edge value by adding a delta value to the selected hysteresis band center pixel value; and (5) calculating the hysteresis band second edge value by subtracting the delta value from the selected hysteresis band center pixel value.
  • 2. The method of claim 1, wherein step (D) comprises the step of: dividing the counted number of hysteresis band crossings by two to determine the number of print ridges.
  • 3. The method of claim 1, further comprising the steps of: (E) traversing a second pixel path across the captured print image frame; and (F) repeating steps (C) and (D) using the second pixel path.
  • 4. The method of claim 1, further comprising the steps of: (E) determining a second hysteresis band; (F) traversing a second pixel path across the captured print image frame; and (G) repeating steps (C) and (D) using the second determined hysteresis band and the second pixel path.
  • 5. The method of claim 1, further comprising the steps of: (E) determining a second hysteresis band; (F) traversing the pixel path across the captured print image frame a second time; and (G) repeating steps (C) and (D) using the second determined hysteresis band and the second traversal of the pixel path.
  • 6. The method of claim 1, wherein step (B)(3) comprises the step of: calculating an average pixel value of the first ridge pixel value peak and the first valley pixel value peak; and setting the hysteresis band center pixel value to the calculated average pixel value.
  • 7. The method of claim 1, further comprising the step of: calculating the delta value according to the following equation delta value=|(first valley pixel value peak−first ridge pixel value peak)|/6.
  • 8. The method of claim 1, wherein step (A) comprises the step of: detecting pixel values sequentially along the pixel path.
  • 9. The method of claim 8, wherein step (C) comprises the steps of: detecting a hysteresis band crossing when sequentially detected pixel values range from the hysteresis band first edge value to the hysteresis band second edge value; and detecting a hysteresis band crossing when sequentially detected pixel values range from the hysteresis band second edge value to the hysteresis band first edge value.
  • 10. A method for counting fingerprint ridges, comprising the steps of: (A) identifying a region of interest in a stored fingerprint image frame; (B) determining a pixel path through the region of interest; (C) traversing the determined pixel path; (D) determining a hysteresis band for the determined pixel path; (E) counting a number of crossings of the determined hysteresis band while traversing the determined pixel path; (F) determining a number of fingerprint ridges based on the counted number of hysteresis band crossings; and (G) storing the number of fingerprint ridges determined in step (F); wherein the hysteresis band is defined by a hysteresis band first edge value and a hysteresis band second edge value, wherein step (D) comprises the steps of: (1) measuring a first ridge pixel value peak for the pixel path; (2) measuring a first valley pixel value peak for the pixel path; (3) selecting a hysteresis band center pixel value between the first ridge pixel value peak and the first valley pixel value peak; (4) calculating the hysteresis band first edge value by adding a delta value to the selected hysteresis band center pixel value; and (5) calculating the hysteresis band second edge value by subtracting the delta value from the selected hysteresis band center pixel value.
  • 11. The method of claim 10, further comprising the step of: (H) capturing a fingerprint image; and (I) storing the captured fingerprint image to be accessed as the stored fingerprint image frame.
  • 12. The method of claim 11, further comprising the step of: (J) evaluating the stored number of fingerprint ridges to determine a quality of the captured fingerprint image.
  • 13. The method of claim 10, further comprising the step of: (H) repeating steps (B)-(G) at least one additional time.
  • 14. The method of claim 10, further comprising the step of: (H) evaluating the stored number of fingerprint ridges to determine a quality of the stored fingerprint image frame.
  • 15. The method of claim 10, wherein step (F) comprises the step of: dividing the counted number of hysteresis band crossings by two to determine the number of fingerprint ridges.
  • 16. The method of claim 10, wherein step (D)(3) comprises the step of: calculating an average pixel value of the first ridge pixel value peak and the first valley pixel value peak; and setting the hysteresis band center pixel value to the calculated average pixel value.
  • 17. The method of claim 10, further comprising the step of: calculating the delta value according to the following equation delta value=|(first valley pixel value peak−first ridge pixel value peak)|/6.
  • 18. The method of claim 10, wherein step (C) comprises the step of: detecting pixel values sequentially along the determined pixel path.
  • 19. The method of claim 18, wherein step (E) comprises the steps of: detecting a hysteresis band crossing when sequentially detected pixel values range from the hysteresis band first edge value to the hysteresis band second edge value; and detecting a hysteresis band crossing when sequentially detected pixel values range from the hysteresis band second edge value to the hysteresis band first edge value.
  • 20. A system for counting fingerprint ridges in a captured fingerprint image frame, comprising: a ridge counter module that includes means for traversing a pixel path through the captured fingerprint image frame, means for determining a hysteresis band for the pixel path; means for counting a number of crossings of the determined hysteresis band while traversing the pixel path, and means for determining a number of fingerprint ridges based on the counted number of hysteresis band crossings; wherein the hysteresis band is defined by a hysteresis band first edge value and a hysteresis band second edge value, said hysteresis band determining means comprises: means for measuring a first ridge pixel value peak for the pixel path; means for measuring a first valley pixel value peak for the pixel path; means for selecting a hysteresis band center pixel value between the first ridge pixel value peak and the first valley pixel value peak; means for calculating the hysteresis band first edge value by adding a delta value to the selected hysteresis band center pixel value; and means for calculating the hysteresis band second edge value by subtracting the delta value from the selected hysteresis band center pixel value.
  • 21. The system of claim 20, wherein said means for determining a number of fingerprint ridges comprises: means for dividing the counted number of hysteresis band crossings by two to determine the number of fingerprint ridges.
  • 22. The system of claim 20, further comprising: a camera that captures a fingerprint image and outputs said captured fingerprint image frame.
  • 23. The system of claim 22, further comprising: a memory that stores said captured fingerprint image frame, and is accessible by said ridge counter module.
  • 24. The system of claim 22, further comprising: a platen that has a finger application area.
  • 25. The system of claim 24, further comprising: an illumination source that provides light to illuminate said finger application area to produce said fingerprint image.
  • 26. The system of claim 25, further comprising: an optical system that directs said light to said camera.
  • 27. The system of claim 24, further comprising: a controller that includes said ridge counter module and controls said illumination source and said camera.
  • 28. The system of claim 20, wherein said means for selecting a hysteresis band center pixel value comprises: means for calculating an average pixel value of the first ridge pixel value peak and the first valley pixel value peak; and means for setting the hysteresis band center pixel value to the calculated average pixel value.
  • 29. The system of claim 20, further comprising: means for calculating the delta value according to the following equation delta value=|(first valley pixel value peak−first ridge pixel value peak)|/6.
  • 30. The system of claim 20, wherein said means for traversing a pixel path comprises: means for detecting pixel values sequentially along the pixel path.
  • 31. The system of claim 30, wherein said means for counting a number of crossings comprises: means for detecting a hysteresis band crossing when sequentially detected pixel values range from the hysteresis band first edge value to the hysteresis band second edge value; and means for detecting a hysteresis band crossing when sequentially detected pixel values range from the hysteresis band second edge value to the hysteresis band first edge value.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application No. 60/400,103, filed Aug. 2, 2002, which is herein incorporated by reference in its entirety.

US Referenced Citations (168)
Number Name Date Kind
2500017 Altman Mar 1950 A
3200701 White Aug 1965 A
3475588 McMaster Oct 1969 A
3482498 Becker Dec 1969 A
3495259 Rocholl et al. Feb 1970 A
3527535 Monroe Sep 1970 A
3540025 Levin et al. Nov 1970 A
3617120 Roka Nov 1971 A
3699519 Campbell Oct 1972 A
3906520 Phillips Sep 1975 A
3947128 Weinberger et al. Mar 1976 A
3968476 McMahon Jul 1976 A
3975711 McMahon Aug 1976 A
4032975 Malueg et al. Jun 1977 A
4063226 Kozma et al. Dec 1977 A
4120585 DePalma et al. Oct 1978 A
4152056 Fowler May 1979 A
4209481 Kashiro et al. Jun 1980 A
4210899 Swonger et al. Jul 1980 A
4253086 Szwarcbier Feb 1981 A
4322163 Schiller Mar 1982 A
4414684 Blonder Nov 1983 A
4537484 Fowler et al. Aug 1985 A
4544267 Schiller Oct 1985 A
4553837 Marcus Nov 1985 A
4601195 Garritano Jul 1986 A
4669487 Frieling Jun 1987 A
4681435 Kubota et al. Jul 1987 A
4684802 Hakenewerth et al. Aug 1987 A
4701772 Anderson et al. Oct 1987 A
4783823 Tasaki et al. Nov 1988 A
4784484 Jensen Nov 1988 A
4792226 Fishbine et al. Dec 1988 A
4811414 Fishbine et al. Mar 1989 A
4876726 Capello et al. Oct 1989 A
4905293 Asai et al. Feb 1990 A
4924085 Kato et al. May 1990 A
4933976 Fishbine et al. Jun 1990 A
4942482 Kakinuma et al. Jul 1990 A
4946276 Chilcott Aug 1990 A
4995086 Lilley et al. Feb 1991 A
5054090 Knight et al. Oct 1991 A
5067162 Driscoll, Jr. et al. Nov 1991 A
5067749 Land Nov 1991 A
5131038 Puhl et al. Jul 1992 A
5146102 Higuchi et al. Sep 1992 A
5157497 Topper et al. Oct 1992 A
5185673 Sobol Feb 1993 A
5187747 Capello et al. Feb 1993 A
5210588 Lee May 1993 A
5222152 Fishbine et al. Jun 1993 A
5222153 Beiswenger Jun 1993 A
5230025 Fishbine et al. Jul 1993 A
5233404 Lougheed et al. Aug 1993 A
5249370 Stanger et al. Oct 1993 A
5253085 Maruo et al. Oct 1993 A
5261266 Lorenz et al. Nov 1993 A
5285293 Webb et al. Feb 1994 A
5291318 Genovese Mar 1994 A
D348445 Fishbine et al. Jul 1994 S
5351127 King et al. Sep 1994 A
D351144 Fishbine et al. Oct 1994 S
5363318 McCauley Nov 1994 A
5384621 Hatch et al. Jan 1995 A
5412463 Sibbald et al. May 1995 A
5416573 Sartor, Jr. May 1995 A
5448649 Chen et al. Sep 1995 A
5467403 Fishbine et al. Nov 1995 A
5469506 Berson et al. Nov 1995 A
5471240 Prager et al. Nov 1995 A
5473144 Mathurin, Jr. Dec 1995 A
5483601 Faulkner Jan 1996 A
5509083 Abtahi et al. Apr 1996 A
5517528 Johnson May 1996 A
5528355 Maase et al. Jun 1996 A
5548394 Giles et al. Aug 1996 A
5591949 Bernstein Jan 1997 A
5596454 Hebert Jan 1997 A
5598474 Johnson Jan 1997 A
5613014 Eshera et al. Mar 1997 A
5615277 Hoffman Mar 1997 A
5625448 Ranalli et al. Apr 1997 A
5640422 Johnson Jun 1997 A
5649128 Hartley Jul 1997 A
5650842 Maase et al. Jul 1997 A
5661451 Pollag Aug 1997 A
5680205 Borza Oct 1997 A
5689529 Johnson Nov 1997 A
5717777 Wong et al. Feb 1998 A
5729334 Van Ruyven Mar 1998 A
5736734 Marcus et al. Apr 1998 A
5745684 Oskouy et al. Apr 1998 A
5748766 Maase et al. May 1998 A
5748768 Sivers et al. May 1998 A
5755748 Borza May 1998 A
5757278 Itsumi May 1998 A
5767989 Sakaguchi Jun 1998 A
5778089 Borza Jul 1998 A
5781647 Fishbine et al. Jul 1998 A
5793218 Oster et al. Aug 1998 A
5801681 Sayag Sep 1998 A
5805777 Kuchta Sep 1998 A
5809172 Melen Sep 1998 A
5812067 Bergholz et al. Sep 1998 A
5815252 Price-Francis Sep 1998 A
5818956 Tuli Oct 1998 A
5822445 Wong Oct 1998 A
5825005 Behnke Oct 1998 A
5825474 Maase Oct 1998 A
5828773 Setlak et al. Oct 1998 A
5832244 Jolley et al. Nov 1998 A
5848231 Teitelbaum et al. Dec 1998 A
5855433 Velho et al. Jan 1999 A
5859420 Borza Jan 1999 A
5859710 Hannah Jan 1999 A
5862247 Fisun et al. Jan 1999 A
5867802 Borza Feb 1999 A
5869822 Meadows, II et al. Feb 1999 A
5872834 Teitelbaum Feb 1999 A
5892599 Bahuguna Apr 1999 A
5900993 Betensky May 1999 A
5907627 Borza May 1999 A
5920384 Borza Jul 1999 A
5920640 Salatino et al. Jul 1999 A
5928347 Jones Jul 1999 A
5942761 Tuli Aug 1999 A
5946135 Auerswald et al. Aug 1999 A
5960100 Hargrove Sep 1999 A
5973731 Schwab Oct 1999 A
5974162 Metz et al. Oct 1999 A
5987155 Dunn et al. Nov 1999 A
5991467 Kamiko Nov 1999 A
5995014 DiMaria Nov 1999 A
5999307 Whitehead et al. Dec 1999 A
6018739 McCoy et al. Jan 2000 A
6023522 Draganoff et al. Feb 2000 A
6038332 Fishbine et al. Mar 2000 A
6041372 Hart et al. Mar 2000 A
6055071 Kuwata et al. Apr 2000 A
6064398 Ellenby et al. May 2000 A
6064753 Bolle et al. May 2000 A
6064779 Neukermans et al. May 2000 A
6072891 Hamid et al. Jun 2000 A
6075876 Draganoff Jun 2000 A
6078265 Bonder et al. Jun 2000 A
6088585 Schmitt et al. Jul 2000 A
6097873 Filas et al. Aug 2000 A
6104809 Berson et al. Aug 2000 A
6111978 Bolle et al. Aug 2000 A
6115484 Bowker et al. Sep 2000 A
6122394 Neukermans et al. Sep 2000 A
6144408 MacLean Nov 2000 A
6150665 Suga Nov 2000 A
6154285 Teng et al. Nov 2000 A
6162486 Samouilhan et al. Dec 2000 A
6166787 Akins et al. Dec 2000 A
6178255 Scott et al. Jan 2001 B1
6195447 Ross Feb 2001 B1
6198836 Hauke Mar 2001 B1
6204331 Sullivan et al. Mar 2001 B1
6212290 Gagne et al. Apr 2001 B1
6259108 Antonelli et al. Jul 2001 B1
6272562 Scott et al. Aug 2001 B1
6281931 Tsao et al. Aug 2001 B1
6327047 Motamed Dec 2001 B1
6347163 Roustaei Feb 2002 B2
20020021827 Smith Feb 2002 A1
20020030668 Hoshino et al. Mar 2002 A1
Foreign Referenced Citations (38)
Number Date Country
0 101 772 Mar 1984 EP
0 308 162 Mar 1989 EP
0 308 162 Mar 1989 EP
0 379 333 Jul 1990 EP
0 623 890 Nov 1994 EP
0 623 890 Nov 1994 EP
0 653 882 May 1995 EP
0 379 333 Jul 1995 EP
0 889 432 Jan 1999 EP
0 889 432 Jan 1999 EP
0 905 646 Mar 1999 EP
0 785 750 Jun 1999 EP
0 924 656 Jun 1999 EP
0 623 890 Aug 2001 EP
2 089 545 Jun 1982 GB
2 313 441 Nov 1997 GB
62-212892 Sep 1987 JP
1-205392 Aug 1989 JP
3-161884 Jul 1991 JP
3-194674 Aug 1991 JP
3-194675 Aug 1991 JP
11-225272 Aug 1999 JP
11-289421 Oct 1999 JP
WO 8702491 Apr 1987 WO
WO 8706378 Oct 1987 WO
WO 9003620 Apr 1990 WO
WO 9211608 Jul 1992 WO
WO 9422371 Oct 1994 WO
WO 9422371 Oct 1994 WO
WO 9617480 Jun 1996 WO
WO 9617480 Jun 1996 WO
WO 9729477 Aug 1997 WO
WO 9741528 Nov 1997 WO
WO 9809246 Mar 1998 WO
WO 9812670 Mar 1998 WO
WO 9912123 Mar 1999 WO
WO 9926187 May 1999 WO
WO 9940535 Aug 1999 WO
Related Publications (1)
Number Date Country
20040109590 A1 Jun 2004 US
Provisional Applications (1)
Number Date Country
60400103 Aug 2002 US