System and method for counting ridges in a captured print image

Information

  • Patent Application
  • Publication Number
    20060133656
  • Date Filed
    January 30, 2006
  • Date Published
    June 22, 2006
Abstract
A system and method for counting ridges in a captured print image frame is described. A pixel path through the captured print image frame is traversed. A hysteresis band for the pixel path is determined. A number of crossings of the determined hysteresis band is counted while traversing the pixel path. A number of print ridges based on the counted number of hysteresis band crossings is determined.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


The invention relates generally to the field of fingerprint scanner technology and, more particularly, to counting ridges in a captured fingerprint image frame.


2. Background Art


Biometrics are a group of technologies that provide a high level of security. Fingerprint capture and recognition is an important biometric technology. Law enforcement, banking, voting, and other industries increasingly rely upon fingerprints as a biometric to recognize or verify identity. See Biometrics Explained, v. 2.0, G. Roethenbaugh, International Computer Society Assn., Carlisle, Pa., 1998, pages 1-34 (incorporated herein by reference in its entirety).


Fingerprint scanners having cameras are available that capture an image of a fingerprint. Typically, to capture a fingerprint image electronically with a fingerprint scanner, a light source is directed towards a fingerprint capture surface that reflects light from the light source towards a camera. The fingerprint capture surface is generally glass. Contact between the surface of a finger and the fingerprint capture surface causes the reflected light to be representative of the fingerprint of the particular finger placed against the fingerprint capture surface. This reflected light is then captured by the camera. The fingerprint scanner may have processing that produces a signal representative of a captured fingerprint image from the reflected light.


The quality of contact between a finger and the fingerprint capture surface plays a large role in the intensity of the reflected light. A very dry skin surface on a clean fingerprint capture surface may result in a low intensity level of reflected light. On the other hand, an oily skin surface and/or a less-clean fingerprint capture surface may result in a high level of reflected light. Additional factors, such as the location of the finger, the pressure applied to press the finger to the platen, and the ambient environment may affect whether an acceptable quality fingerprint image is captured.


As a result of the above variations, a fingerprint scanner system and method that captures an acceptable fingerprint image is needed. Moreover, a system and method for determining the quality of a captured fingerprint image is desired.


BRIEF SUMMARY OF THE INVENTION

A system and method for counting fingerprint ridges in a captured fingerprint image frame is described. In an aspect of the present invention, a fingerprint image is captured. The captured fingerprint image is stored. A region of interest in the stored fingerprint image frame is determined. A pixel path is determined through the region of interest.


In a further aspect, the pixel path through the captured fingerprint image frame is traversed. A hysteresis band for the pixel path is determined. A number of crossings of the determined hysteresis band is counted while traversing the pixel path. A number of fingerprint ridges based on the counted number of hysteresis band crossings is determined.


In a further aspect, the determined number of fingerprint ridges is stored.


In a further aspect, the stored number of fingerprint ridges is evaluated to determine a quality of the captured fingerprint image.


In a further aspect, a plurality of pixel paths may be determined, and individually traversed.


In an aspect of the present invention, the hysteresis band is defined by a hysteresis band first edge value and a hysteresis band second edge value. The hysteresis band may be determined as follows: A first ridge pixel value peak for the determined pixel path is measured. A first valley pixel value peak for the determined pixel path is measured. A hysteresis band center pixel value between the first ridge pixel value peak and the first valley pixel value peak is selected. The hysteresis band first edge value is calculated by adding a delta value to the selected hysteresis band center pixel value. The hysteresis band second edge value is calculated by subtracting the delta value from the selected hysteresis band center pixel value.


In a further aspect, the hysteresis band center pixel value may be selected by calculating an average pixel value of the first ridge pixel value peak and the first valley pixel value peak, and setting the hysteresis band center pixel value to the calculated average pixel value.


In a further aspect, the delta value may be calculated according to the following equation:

delta value=|(first valley pixel value peak−first ridge pixel value peak)|/N,


wherein N is any number greater than one. For example, N may be an integer, such as six.


In a further aspect, a hysteresis band crossing may be detected when sequentially detected pixel values along the pixel path range from the hysteresis band first edge value to the hysteresis band second edge value. Furthermore, a hysteresis band crossing may be detected when sequentially detected pixel values along the pixel path range from the hysteresis band second edge value to the hysteresis band first edge value.


In a further aspect, the number of fingerprint ridges based on the counted number of hysteresis band crossings is determined by dividing the counted number of hysteresis band crossings by two.
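
By way of illustration only, the method summarized above can be expressed as a short Python sketch. In the sketch, the function names, the use of the path's global maximum and minimum pixel values in place of the first ridge and valley pixel value peaks, and the default value N = 6 are assumptions made for the example rather than requirements of the invention.

```python
# Illustrative sketch of the summarized method; not a definitive implementation.
# Assumed conventions: darker (ridge) pixels have larger grey-scale values, the
# global max/min of the path stand in for the first ridge/valley peaks, and
# N = 6 sizes the hysteresis band.

def determine_hysteresis_band(path_values, n=6):
    """Return (first_edge, second_edge) of the hysteresis band for a pixel path."""
    ridge_peak = max(path_values)             # stands in for the first ridge pixel value peak
    valley_peak = min(path_values)            # stands in for the first valley pixel value peak
    center = (ridge_peak + valley_peak) / 2.0
    delta = abs(valley_peak - ridge_peak) / float(n)
    return center + delta, center - delta     # first (upper) edge, second (lower) edge

def count_ridges_along_path(path_values, n=6):
    """Estimate the number of print ridges crossed by a pixel path."""
    first_edge, second_edge = determine_hysteresis_band(path_values, n)
    crossings, side = 0, None                 # side last fully crossed: 'high' or 'low'
    for value in path_values:
        if value >= first_edge:
            if side == 'low':
                crossings += 1                # full crossing: second edge to first edge
            side = 'high'
        elif value <= second_edge:
            if side == 'high':
                crossings += 1                # full crossing: first edge to second edge
            side = 'low'
    return crossings // 2                     # two band crossings per counted ridge
```

Because a ridge ordinarily produces one upward and one downward band crossing, the final division by two recovers the ridge count.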


In another aspect of the present invention, a system is described for counting fingerprint ridges in a captured fingerprint image frame. A ridge counter module traverses a pixel path through the captured fingerprint image frame, determines a hysteresis band for the pixel path, counts a number of crossings of the determined hysteresis band while traversing the pixel path, and determines a number of fingerprint ridges based on the counted number of hysteresis band crossings.


In a further aspect, the system includes a camera that captures a fingerprint image and outputs the captured fingerprint image frame.


In a further aspect, the system includes a memory that stores the captured fingerprint image frame, and is accessible by the ridge counter module.


In a further aspect, the system includes a platen that has a finger application area.


In a further aspect, the system includes an illumination source that provides light to illuminate the finger application area to produce the fingerprint image.


In a further aspect, the system includes an optical system that directs the light to the camera.


In a further aspect, the system includes a controller that includes the ridge counter module and controls the illumination source and/or the camera.


This system and method for counting ridges according to the present invention can be used with any type of print including, but not limited to, a print of all or part of a finger, palm, hand, toe, and foot.


Further aspects, features, and advantages of the present invention, as well as the structure and operation of the various embodiments of the present invention, are described in detail below with reference to the accompanying drawings.




BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES

The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.



FIG. 1 illustrates an example block diagram of a fingerprint scanner system, according to embodiments of the present invention.



FIG. 2A shows an example captured fingerprint image frame.



FIG. 2B shows the example captured fingerprint image frame of FIG. 2A with two example fingerprint image frame regions, according to embodiments of the present invention.



FIG. 2C shows example pixel paths through a captured fingerprint image frame region, according to the present invention.



FIG. 2D shows an example finger that is applied to an example platen, according to an embodiment of the present invention.



FIG. 3 shows an example plot of pixel intensity for a traversed pixel path.



FIG. 4 shows a flowchart providing high level steps for performing the present invention.



FIG. 5 shows example steps for counting fingerprint ridges in a captured fingerprint image frame, according to embodiments of the present invention.



FIG. 6 shows example steps for determining a hysteresis band, according to an embodiment of the present invention.



FIG. 7 shows example steps for counting a number of crossings of a determined hysteresis band while traversing a determined pixel path, according to an embodiment of the present invention.



FIG. 8 shows example steps for counting fingerprint ridges in a captured fingerprint image frame for one or more pixel paths, according to embodiments of the present invention.




The present invention will now be described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.


DETAILED DESCRIPTION OF THE INVENTION

While the present invention is described herein with reference to illustrative embodiments for particular applications, it should be understood that the invention is not limited thereto. Those skilled in the art with access to the teachings provided herein will recognize additional modifications, applications, and embodiments within the scope thereof and additional fields in which the present invention would be of significant utility.


Overview


The present invention is directed to a method, system, and apparatus for counting ridges in a captured fingerprint image. The present invention may be applied in any type of print scanner, including but not limited to any type of fingerprint and/or palm print scanner.


Numerous embodiments of the present invention are presented herein. Detail on the above mentioned embodiments for counting fingerprint ridges in a captured fingerprint image frame, and additional embodiments according to the present invention, are described. The embodiments described herein may be combined in any applicable manner, as required by a particular application.


Terminology


To more clearly delineate the present invention, an effort is made throughout the specification to adhere consistently to the following term definitions.


The term “finger” refers to any digit on a hand including, but not limited to, a thumb, an index finger, middle finger, ring finger, or a pinky finger.


The term “live scan” refers to a scan of any type of fingerprint and/or palm print image made by a print scanner. A live scan can include, but is not limited to, a scan of a finger, a finger roll, a flat finger, a slap print of four fingers, a thumb print, a palm print, a toe, or a foot, or a combination of fingers, such as sets of fingers and/or thumbs from one or more hands or one or more palms, or one or more toes disposed on a platen.


In a live scan, one or more fingers or palms from either a left hand or a right hand, or both hands or all or part of a foot are placed on a platen of a scanner. Different types of print images are detected depending upon a particular application. For example, a flat print consists of a fingerprint image of a digit (finger or thumb) pressed flat against the platen. A roll print consists of an image of a digit (finger or thumb) made while the digit (finger or thumb) is rolled from one side of the digit to another side of the digit over the surface of the platen. A slap print consists of an image of four flat fingers pressed flat against the platen. A palm print involves pressing all or part of a palm upon the platen. A platen can be movable or stationary depending upon the particular type of scanner and the type of print being captured by the scanner.


The terms “biometric imaging system”, “scanner”, “live scanner”, “live print scanner”, “fingerprint scanner” and “print scanner” are used interchangeably, and refer to any type of scanner which can obtain an image of all or part of one or more fingers and/or palm in a live scan. The obtained images can be combined in any format including, but not limited to, an FBI, state, or international tenprint format.


The term “platen” refers to a component that includes an imaging surface upon which at least one finger is placed during a live scan. A platen can include, but is not limited to, a surface of an optical prism, set of prisms, or set of micro-prisms, or a surface of a silicone layer or other element disposed in optical contact with a surface of an optical prism, set of prisms, or set of micro-prisms.


Embodiments for Counting Fingerprint Ridges in a Captured Fingerprint Image Frame


Example embodiments for counting fingerprint ridges according to the present invention are described at a high-level and at a more detailed level. These example embodiments are provided herein for illustrative purposes, and are not limiting. In particular, fingerprint ridge counting as described in this section can be achieved using any number of structural implementations, including hardware, firmware, software, or any combination thereof.



FIG. 1 illustrates an example high level block diagram of a live scanner system 100, according to embodiments of the present invention. Live scanner system 100 includes an illumination source 110, a live scanner optical system 120, a camera 130, a memory 135, and a live scanner controller 140. Live scanner system 100 captures a user's print. Furthermore, live scanner system 100 is capable of performing ridge counting according to the present invention.


Live scanner system 100 may be a portion of, or may be included in, any suitable type of print scanner known to persons skilled in the relevant art(s). For example, live scanner system 100 may be included in any live scanner available from Cross Match Technologies, Inc., or other manufacturer. Furthermore, one or more portions of live scanner system 100 may be incorporated in any computer system that can process captured fingerprint images.


Optical system 120 shown in FIG. 1 includes a fingerprint image capturing platen, where a user may apply a finger. In some embodiments, the fingerprint image capturing platen may allow a user to roll the applied finger across the platen. Illumination source 110 provides light for illuminating the applied finger at the platen. Optical system 120 may focus and direct the light to the platen. Optical system 120 focuses and/or directs light reflected from the applied finger to camera 130. Camera 130 periodically samples the reflected light, and outputs captured fingerprint image data. The data is output to memory 135, which stores the captured fingerprint image data in the form of a captured fingerprint image frame. For example, the captured fingerprint image frame may be stored in the form of a two-dimensional array of pixel data.


Controller 140 accesses the captured fingerprint image data stored in memory 135, and/or directly from camera 130. Controller 140 may provide a sampling signal to camera 130 and/or illumination source 110 that causes camera 130 to capture fingerprint image frames while being illuminated by illumination source 110.


Controller 140 may be included in a personal computer, a mainframe computer, one or more processors, specialized hardware, software, firmware, or any combination thereof, and/or any other device capable of processing the captured fingerprint image data as described herein. Controller 140 may allow a user to initiate and terminate a fingerprint capture session. Controller 140 also allows a user to evaluate the quality of captured fingerprint images, as described below.


As shown in FIG. 1, controller 140 comprises a ridge counter module 150. Ridge counter module 150 counts fingerprint ridges in captured fingerprint image frames. Further structural and operational detail of ridge counter module 150 is provided below. Ridge counter module 150 may be implemented in hardware, firmware, software, or a combination thereof. Other structural embodiments for ridge counter module 150 will be apparent to persons skilled in the relevant art(s) based on the discussion contained herein.



FIG. 4 shows a flowchart 400 providing high level steps for the present invention. Operational and structural embodiments related to flowchart 400 will become apparent to persons skilled in the relevant art(s) based on the discussion herein.


Flowchart 400 begins with step 402. In step 402, a fingerprint ridge count is determined for one or more pixel paths across a captured fingerprint image frame. Procedures for determining a fingerprint ridge count according to step 402 are described further below. Step 402 may be performed by live scanner system 100, for example.


In step 404, the determined fingerprint ridge count is evaluated to determine at least a quality of the captured fingerprint image frame. Controller 140 or other processing hardware/software may use the output of ridge counter module 150 for any number of reasons. For example, controller 140 may use a ridge count as a factor in determining whether a quality fingerprint image has been captured. For example, if an unusually low number of ridges are counted, controller 140 may determine that a poor quality fingerprint image was captured. If an expected number of ridges are counted, controller 140 may use this information as a factor indicating that a relatively good quality fingerprint image was captured. A fingerprint ridge count may be used for additional reasons, including evaluating the performance of the corresponding fingerprint scanner, identity verification, and for further reasons.
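
As one hypothetical way in which such an evaluation could be performed, the following sketch accepts a frame only if its ridge count falls within an expected range; the bounds and the function name are assumptions chosen for illustration and are not values taken from the invention.

```python
def ridge_count_quality_ok(ridge_count, expected_min=4, expected_max=40):
    """Hypothetical quality gate: accept a frame only if the counted number of
    ridges lies within an assumed expected range."""
    return expected_min <= ridge_count <= expected_max

print(ridge_count_quality_ok(2))    # False: an unusually low count suggests a poor capture
print(ridge_count_quality_ok(18))   # True: a count in the expected range supports good quality
```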



FIG. 5 shows example steps for performing step 402, according to one or more embodiments of the present invention. Optional steps are shown surrounded by dotted lines in FIG. 5. The steps of FIG. 5 do not necessarily have to occur in the order shown, as will be apparent to persons skilled in the relevant art(s) based on the teachings herein. Operational and structural embodiments will be apparent to persons skilled in the relevant art(s) based on the following discussion.


In step 502, a fingerprint image is captured.


In step 504, the captured fingerprint image is stored to be accessed as the stored fingerprint image frame.


In step 506, a region of interest in a stored fingerprint image frame is identified.


In step 508, a pixel path through the region of interest is determined.


In step 510, the determined pixel path is traversed.


In step 512, a hysteresis band for the determined pixel path is determined.


In step 514, a number of crossings of the determined hysteresis band while traversing the determined pixel path is counted.


In step 516, a number of fingerprint ridges based on the counted number of hysteresis band crossings is determined.


In step 518, the number of fingerprint ridges determined in step 516 is stored.


The steps shown in FIG. 5 are described in further detail in the following discussion.


In optional step 502 of FIG. 5, a fingerprint image is captured. For example, camera 130 captures a fingerprint image reflected from a finger applied to a platen. FIG. 2D shows an example finger 230 that is applied to a platen 240, according to an embodiment of the present invention. As shown in FIG. 2D, platen 240 is a surface of an optical prism. Finger 230 includes five ridges 260a-260e and four valleys 280a-d, for illustrative purposes. Illumination source 110 emits light to finger 230. A portion of the emitted light from illumination source 110 is shown as first light 270a and second light 270b in FIG. 2D. As shown in FIG. 2D, first light 270a contacts ridge 260a at the surface of platen 240. First light 270a is largely diffused and/or absorbed by finger 230, and does not substantially cause reflected light. Hence, because first light 270a is not substantially reflected from ridge 260a, camera 130 will receive relatively little light, and will output one or more “dark” pixels that correspond to ridge 260a. Conversely, second light 270b contacts space or valley 280c at the surface of platen 240, causing reflected light 290. Second light 270b is relatively less diffused and/or absorbed by finger 230, causing more light to be reflected. Hence, camera 130 receives reflected light 290 and outputs one or more relatively “bright” pixels that correspond to valley 280c.


Note that in this description, ridges are described to cause relatively “dark” reflections, and valleys to cause relatively “bright” reflections. However, in alternative embodiments, this may be reversed. In other words, ridges may instead cause “bright” reflections, while valleys may cause relatively “dark” reflections.


In embodiments, different amounts of light are reflected depending on whether a ridge, valley, or intermediate portion of finger 230 is in contact with the prism. The light captured by the camera may be output as data in the form of grey-scale pixel intensity values. For example, the grey-scale pixel intensity values may range from 0 to 255, where 0 represents a white pixel, 255 represents a black pixel, and values in between 0 and 255 represent corresponding shades of grey. Alternatively, 0 may represent a black pixel and 255 may represent a white pixel. Furthermore, in alternative embodiments, the pixel intensity values may have greater or lesser ranges than 0 to 255. Furthermore, the pixel intensity values may include shades of one or more colors in addition to black, grey, and white. For illustrative purposes, fingerprint image data is described herein as being in the form of pixels with grey-scale pixel intensity values ranging from 0 (white) to 255 (black).
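
As a small concrete illustration of this convention (0 for white through 255 for black), a captured frame can be held as a two-dimensional array of grey-scale intensities. The values below are synthetic, and the plain list-of-lists representation is an assumption used only for the sketches in this description.

```python
# Synthetic captured fingerprint image frame: rows of grey-scale pixel values,
# using the convention adopted here (0 = white/valley, 255 = black/ridge).
frame = [
    [30, 40, 220, 235, 200, 45, 35],
    [25, 50, 230, 240, 210, 40, 30],
    [35, 45, 215, 225, 195, 50, 25],
]
rows, cols = len(frame), len(frame[0])   # frame dimensions in pixels
```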


Note that in alternative embodiments, the fingerprint image may have already been captured, and step 502 is therefore not necessary.


In optional step 504, shown in FIG. 5, the captured fingerprint image is stored to be accessed as the stored fingerprint image frame. For example, the captured fingerprint image frame is stored by camera 130 in memory 135. FIG. 2A shows an example stored captured fingerprint image frame 200. Captured fingerprint image frame 200 may have been captured by camera 130 shown in FIG. 1, and stored in memory 135, for example. In an embodiment, captured fingerprint image frame 200 is stored as a two-dimensional array of pixel data. Note that in alternative embodiments, the captured fingerprint image may be prestored, and step 504 is therefore not necessary.


As shown in FIG. 2A, stored captured fingerprint image frame 200 includes a captured fingerprint image 202. Captured fingerprint image 202 is an image captured from a user's finger applied to a corresponding platen. Darker pixel areas of captured fingerprint image 202 represent areas where relatively better contact is made between the user's finger and the platen. The darker pixel areas tend to correspond to ridges of the user's fingerprint. Lighter pixel areas of captured fingerprint 202 represent areas where relatively less contact is made between the user's finger and the platen. The lighter pixel areas tend to correspond to spaces or valleys between ridges of the user's fingerprint.


In optional step 506, shown in FIG. 5, a region of interest in a stored fingerprint image frame is identified. In the embodiment shown in FIG. 1, captured fingerprint image frame 200 is processed by controller 140 to obtain desired information related to the user's fingerprint. All of captured fingerprint image frame 200, or a portion of fingerprint image frame 200 may be considered to be the region of interest. Processing only a portion of fingerprint image frame 200 determined to be a region of interest may save processing time and other resources relative to processing all of fingerprint image frame 200.


For example, FIG. 2B shows an example first fingerprint image frame region 204 and an example second fingerprint image frame region 206, which are each portions of captured fingerprint image frame 200. First fingerprint image frame region 204 is smaller in area than captured fingerprint image frame 200, and therefore includes fewer pixels than captured fingerprint image frame 200. Because first fingerprint image frame region 204 has fewer pixels than captured fingerprint image frame 200, any processing of first fingerprint image frame region 204 may be more efficient than processing captured fingerprint image frame 200. Furthermore, second fingerprint image frame region 206 is smaller in area than first fingerprint image frame region 204, and therefore includes fewer pixels than first fingerprint image frame region 204. Because second fingerprint image frame region 206 has fewer pixels than first fingerprint image frame region 204, any processing of second fingerprint image frame region 206 may be more efficient than processing first fingerprint image frame region 204.


For illustrative purposes, in the following example of fingerprint image processing, any processing is described as performed on second fingerprint image frame region 206. However, in embodiments of the present invention, an entire captured fingerprint image frame, or any portion thereof, may be processed.
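
A minimal sketch of restricting processing to a rectangular region of interest follows, assuming the list-of-lists frame representation used above; the region coordinates are hypothetical and simply mimic a sub-window such as second fingerprint image frame region 206.

```python
def crop_region(frame, top, left, height, width):
    """Return the sub-array of a frame covering a rectangular region of interest."""
    return [row[left:left + width] for row in frame[top:top + height]]

# Hypothetical example on a synthetic 10 x 10 frame.
frame = [[(r * 10 + c) % 256 for c in range(10)] for r in range(10)]
region = crop_region(frame, top=3, left=3, height=4, width=4)   # akin to region 206
```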


In optional step 508, shown in FIG. 5, a pixel path through the region of interest is determined. In embodiments, one or more pixel paths through a single captured fingerprint image may be determined. FIG. 2C shows example pixel paths through captured fingerprint image frame region 206, according to the present invention. As shown in the example of FIG. 2C, six pixel paths are present: first horizontal pixel path 212, second horizontal pixel path 214, third horizontal pixel path 216, first vertical pixel path 218, second vertical pixel path 220, and third vertical pixel path 222. Only six paths are shown for clarity; however, in general a fewer or greater number of pixel paths can be used.


Note that in embodiments, pixel paths may be straight horizontal and vertical paths, such as are shown in FIG. 2C, or may be other shaped paths. For example, pixel paths may also be elliptical, triangular, rectangular, other polygonal, irregular, or otherwise shaped paths. Furthermore, in alternative embodiments, the one or more pixel paths are predetermined, and are not determined on an application basis. Hence, step 508 may not be necessary.


In step 510, shown in FIG. 5, the determined pixel path is traversed. According to the present invention, a pixel path is traversed so that the number of fingerprint ridges occurring along the pixel path may be counted. Pixel paths may be traversed in either direction along the pixel path. Typically, a pixel path is traversed by detecting pixel values sequentially along the determined pixel path. During the traversal, a pixel intensity value may be detected for every pixel in the pixel path, for every other pixel in the pixel path, for every third pixel in the pixel path, or for other multiples of pixels in the pixel path.
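
For the straight horizontal and vertical paths of FIG. 2C, traversal reduces to reading a row or a column of the region, optionally with a stride so that only every second or third pixel is detected. The helper names and the stride parameter in the sketch below are illustrative.

```python
def horizontal_path(region, row_index, stride=1):
    """Pixel values along a horizontal path, detecting every stride-th pixel."""
    return region[row_index][::stride]

def vertical_path(region, col_index, stride=1):
    """Pixel values along a vertical path, detecting every stride-th pixel."""
    return [row[col_index] for row in region][::stride]

# Example: traverse the middle row of a small region, detecting every second pixel.
region = [[10, 200, 30, 220, 40], [20, 210, 35, 230, 45], [15, 190, 25, 215, 50]]
values = horizontal_path(region, row_index=1, stride=2)   # [20, 35, 45]
```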


In the example of FIG. 2C, horizontal pixel paths are traversed from left to right, and the number of ridges occurring along the pixel path are counted during the traversal. During an example traversal of first horizontal pixel path 212, three fingerprint ridges are counted. During an example traversal of second horizontal pixel path 214, four fingerprint ridges are counted. During an example traversal of third horizontal pixel path 216, seven fingerprint ridges are counted. During an example traversal of first vertical pixel path 218, three fingerprint ridges are counted. During an example traversal of second vertical pixel path 220, five fingerprint ridges are counted. During an example traversal of third vertical pixel path 222, three fingerprint ridges are counted. The discussion below describes how to count fingerprint ridges such as these, according to the present invention.


In step 512, shown in FIG. 5, a hysteresis band for the determined pixel path is determined. According to the present invention, fingerprint ridges are counted using a hysteresis band, as is further described below. FIG. 3 shows an example plot 300 of pixel intensity for a traversed pixel path. Pixel intensity values range from 0 (white) to 255 (black). The pixel path begins at a first end of the pixel path, traversing sequentially through the pixels in the pixel path, to a second end of the pixel path. As shown in FIG. 3, four fingerprint ridges 304a-304d and four fingerprint valleys 302a-302d are present in the captured fingerprint image frame portion during traversal of the pixel path.


According to the present invention, a hysteresis band 306, as shown in FIG. 3, is determined. As shown in FIG. 3, hysteresis band 306 is defined by a hysteresis band first edge value 308 and a hysteresis band second edge value 310. FIG. 6 shows example steps for step 512 of FIG. 5, according to one or more embodiments of the present invention. The steps of FIG. 6 do not necessarily have to occur in the order shown, as will be apparent to persons skilled in the relevant art(s) based on the teachings herein. Operational and structural embodiments will be apparent to persons skilled in the relevant art(s) based on the following discussion related to flowchart 600.


Flowchart 600 begins with step 602. In step 602, a first ridge pixel value peak for the pixel path is measured. In other words, for the example of FIG. 3, the pixel path is traversed from left to right, to find a first maximum pixel value for the pixel intensity, which corresponds to a fingerprint ridge peak. In FIG. 3, by traversing the pixel path from left to right, a first ridge pixel value peak is found as the maximum pixel value for first ridge 304a. First ridge 304a is the first ridge found while traversing the pixel path from left to right. For instance, the peak pixel value for first ridge 304a may be equal to a pixel intensity value of 230. In examples, the first ridge pixel value peak can be any selected ridge value peak along the pixel path, an average or mean of the ridge value peaks, or the minimum ridge value peak of a group of ridge value peaks along the pixel path.


In step 604, a first valley pixel value peak for the pixel path is measured. In other words, for the example of FIG. 3, the pixel path is traversed from left to right, to find a first minimum pixel value for the pixel intensity, which corresponds to a fingerprint valley peak. In FIG. 3, by traversing the pixel path from left to right, a first valley pixel value peak is found as the minimum pixel value for first valley 302a. First valley 302a is the first valley found while traversing the pixel path from left to right. For instance, the peak pixel value for first valley 302a may be equal to a pixel intensity value of 30. In examples, the first valley pixel value peak can be any selected valley value peak along the pixel path, an average or mean of the valley value peaks, or the maximum valley value peak of a group of valley value peaks along the pixel path.


In step 606, a hysteresis band center pixel value is selected between the first ridge pixel value peak and the first valley pixel value peak. In the example of FIG. 3, the hysteresis band center pixel value is shown as hysteresis band center pixel value 312. Any pixel intensity value between the intensity values of the first ridge pixel value peak and first valley pixel value peak may be chosen to be hysteresis band center pixel value 312. In an embodiment, hysteresis band center pixel value 312 is chosen to be the average value of the first ridge pixel value peak and first valley pixel value peak. In the current example, the average value of the first ridge pixel value peak and first valley pixel value peak is equal to

(230+30)/2=130=hysteresis band center pixel value 312.


In step 608, the hysteresis band first edge value is calculated by adding a delta value to the selected hysteresis band center pixel value. Hence, hysteresis band first edge value 308 is equal to the sum of hysteresis band center pixel value 312 and a delta value. In embodiments, the delta value may be predetermined, may be calculated as a fraction or percentage of the difference between the first ridge pixel value peak and the first valley pixel value peak, and may be calculated in other ways. In an embodiment, the delta value is one-sixth of the difference between the first ridge pixel value peak and the first valley pixel value peak:

delta value=(first ridge pixel value peak−first valley pixel value peak)/6

which in the current example is approximately equal to:

delta value=(230−30)/6=33.3.

Hence, in the current example, hysteresis band first edge value 308 is approximately equal to

130+33.3=163.3=hysteresis band first edge value 308.


In step 610, the hysteresis band second edge value is calculated by subtracting the delta value from the selected hysteresis band center pixel value. Hence, in the current example, hysteresis band second edge value 310 is approximately equal to

130−33.3=96.7=hysteresis band second edge value 310.

Hence, in the current example, hysteresis band 306 is defined to range from 163.3 to 96.7. The determined hysteresis band 306 may be used to count fingerprint ridges, according to the present invention, as described herein.
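
Steps 606 through 610 can be sketched as follows, using the worked numbers from the example above (ridge peak 230, valley peak 30, N = 6). The function name is illustrative, and steps 602 and 604 are assumed to have already supplied the ridge and valley pixel value peaks.

```python
def hysteresis_band(ridge_peak, valley_peak, n=6):
    """Steps 606-610: band center, delta value, and the two band edge values."""
    center = (ridge_peak + valley_peak) / 2.0          # step 606: average of the two peaks
    delta = abs(ridge_peak - valley_peak) / float(n)   # one N-th of the peak separation
    first_edge = center + delta                        # step 608: upper band edge
    second_edge = center - delta                       # step 610: lower band edge
    return first_edge, second_edge

# Worked example from the text: center (230 + 30) / 2 = 130, delta 200 / 6 = 33.3.
upper, lower = hysteresis_band(230, 30)                # approximately (163.3, 96.7)
```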


In step 514, shown in FIG. 5, a number of crossings of the determined hysteresis band while traversing the determined pixel path is counted. A crossing of the hysteresis band is determined whenever pixels in a pixel path completely cross the hysteresis band. FIG. 7 shows example steps for step 514, according to an embodiment of the present invention. Operational and structural embodiments related to the steps shown in FIG. 7 will become apparent to persons skilled in the relevant art(s) based on the discussion herein.


In step 702, a hysteresis band crossing is detected when sequentially detected pixel values range from the hysteresis band first edge value to the hysteresis band second edge value.


In step 704, a hysteresis band crossing is detected when sequentially detected pixel values range from the hysteresis band second edge value to the hysteresis band first edge value.


For illustrative purposes, these processes are further described with respect to FIG. 3. As shown in FIG. 3, a hysteresis band crossing occurs between valley 302a and ridge 304a. According to step 704, a hysteresis band crossing is detected when sequentially detected pixel values range from the hysteresis band second edge value to the hysteresis band first edge value. Between valley 302a and ridge 304a, hysteresis band 306 is crossed because sequentially detected pixel values shown in plot 300 range from hysteresis band second edge value 310 to hysteresis band first edge value 308. Hence, under step 704, a hysteresis band crossing is detected between valley 302a and ridge 304a.


Furthermore, as shown in FIG. 3, a hysteresis band crossing occurs between ridge 304a and valley 302b. According to step 702, a hysteresis band crossing is detected when sequentially detected pixel values range from the hysteresis band first edge value to the hysteresis band second edge value. Between ridge 304a and valley 302b, hysteresis band 306 is crossed because sequentially detected pixel values shown in plot 300 range from hysteresis band first edge value 308 to hysteresis band second edge value 310. Hence, under step 702, a hysteresis band crossing is detected between ridge 304a and valley 302b.


Likewise, hysteresis band crossings may be detected between valley 302b and ridge 304b, ridge 304b and valley 302c, and valley 302c and ridge 304c.


However, a hysteresis band crossing is not detected between ridge 304c and valley 302d, or between valley 302d and ridge 304d. This is because hysteresis band 306 is not completely crossed. In other words, sequentially detected pixel values do not range from hysteresis band first edge value 308 to hysteresis band second edge value 310, or from hysteresis band second edge value 310 to hysteresis band first edge value 308 between ridge 304c and valley 302d or between valley 302d and ridge 304d.


A final hysteresis band crossing in plot 300 may be detected between ridge 304d and valley 302d according to step 702. Hence, a total of six hysteresis band crossings are detected in the example of plot 300 shown in FIG. 3.
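
Taken together, steps 702 and 704 behave like a Schmitt trigger: a crossing is registered only when the pixel values run all the way from one band edge to the other, which is why the shallow excursion around valley 302d contributes no crossings. The sketch below is one way to express this behavior; the state names and the synthetic example values are illustrative only.

```python
def count_band_crossings(path_values, first_edge, second_edge):
    """Count complete crossings of the hysteresis band along a pixel path."""
    crossings, side = 0, None            # side last fully crossed: 'above', 'below', or none yet
    for value in path_values:
        if value >= first_edge:          # at or beyond the upper (first) band edge
            if side == 'below':
                crossings += 1           # step 704: ran from the second edge to the first edge
            side = 'above'
        elif value <= second_edge:       # at or beyond the lower (second) band edge
            if side == 'above':
                crossings += 1           # step 702: ran from the first edge to the second edge
            side = 'below'
        # values strictly inside the band leave the last-crossed side unchanged
    return crossings

# Synthetic values loosely mimicking plot 300, with band edges of about 163.3 and 96.7.
example = [30, 230, 30, 225, 35, 228, 120, 140, 60]
print(count_band_crossings(example, 163.3, 96.7))   # 6
```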


In step 516, shown in FIG. 5, a number of fingerprint ridges based on the counted number of hysteresis band crossings is determined. Typically, the counted number of hysteresis band crossings is divided by two to determine the number of fingerprint ridges. However, in alternative embodiments, the number of fingerprint ridges based on the counted number of hysteresis band crossings may be determined in other ways.


Hence, in the example of plot 300 shown in FIG. 3, the number of fingerprint ridges may be determined by dividing the counted number of hysteresis band crossings by two:

6÷2=3=number of fingerprint ridges determined in plot 300


In optional step 518, shown in FIG. 5, the number of fingerprint ridges determined in step 516 is stored. For example, the number of fingerprint ridges that have been determined may be stored in a memory, such as memory 135, or any other memory or storage device, temporary or permanent. In embodiments, the number of fingerprint ridges may be displayed and/or transmitted. In alternative embodiments, the determined number of fingerprint ridges is not stored, and hence step 518 is not necessary.



FIG. 8 shows an alternative to the embodiment of FIG. 5 for performing step 402, according to the present invention. In FIG. 8, steps 508, 510, 512, 514, 516, and 518 are repeated for multiple pixel paths through the region of interest of the captured fingerprint image frame. Hence, after step 518 is performed (when present), operation proceeds back to step 508, so that a next pixel path may be determined. Once all of the desired pixel paths have been traversed, operation proceeds from step 518 to end step 804.


Furthermore, FIG. 8 shows an additional optional step, step 802. In step 802, it is evaluated whether a range of pixels in the determined pixel path is acceptable. In other words, controller 140, for example, may compare pixel intensities for pixels in the current pixel path to determine whether they have enough intensity variation to warrant further processing. For example, if there is relatively little intensity variation in the pixel path, the pixel path may include few or no fingerprint ridges. The pixel path may actually cross the region of interest in an area where there is little or no fingerprint information, and hence may not warrant further processing. Hence, as shown in FIG. 8, if the answer at step 802 is no, the current pixel path may be skipped, and operation may proceed from step 802 to step 508, where a next pixel path may be determined. If the pixel path does have a range of pixel intensities that is determined to be acceptable, operation may proceed from step 802 to step 510.
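
One way to express the loop of FIG. 8, including the optional range check of step 802, is sketched below. The minimum acceptable intensity spread (min_range) is an assumed parameter rather than a value given in the description, and count_ridges may be any per-path ridge-counting callable such as the one sketched earlier.

```python
def count_ridges_per_path(paths, count_ridges, min_range=40):
    """Repeat steps 510-518 for each pixel path, skipping any path whose range of
    pixel values is unacceptable (step 802); min_range is an assumed threshold."""
    results = []
    for path_values in paths:
        if max(path_values) - min(path_values) < min_range:
            continue                                  # step 802 answered "no": skip this path
        results.append(count_ridges(path_values))     # steps 510-518 for this path
    return results
```

The collected per-path counts (for example, the 3, 4, 7, 3, 5, and 3 ridges noted for the paths of FIG. 2C) can then be stored and evaluated as described for steps 518 and 404.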


Further steps for the processes shown in FIGS. 4-8 will be known to persons skilled in the relevant art(s) from the teachings herein.


The present invention has been described with respect to fingerprints; however, the system and method for counting ridges can be used to count ridges in any type of print, including all or part of a finger, palm, hand, toe, and foot.


CONCLUSION

While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims
  • 1-34. (canceled)
  • 35. A method for counting print ridges in a captured print image frame, comprising the steps of: (1) traversing a pixel path through the captured print image frame; (2) counting a number of crossings of a first pixel intensity threshold and a second pixel intensity threshold while traversing the pixel path; and (3) determining a number of print ridges in the captured print image frame based on the counted number of crossings of the first pixel intensity threshold and the second pixel intensity threshold.
  • 36. The method of claim 35, wherein step (3) comprises determining the number of print ridges by dividing the counted number of first pixel intensity threshold and second pixel intensity threshold crossings by two.
  • 37. The method of claim 35, further comprising the step of: (4) determining the first pixel intensity threshold and the second pixel intensity threshold.
  • 38. The method of claim 37, wherein step (4) comprises determining a third pixel intensity threshold based on a first maximum pixel intensity value and a first minimum pixel intensity value in the captured print image frame.
  • 39. The method of claim 38, wherein step (4) further comprises determining a delta value based on the first maximum pixel intensity value and the first minimum pixel intensity value in the captured print image frame.
  • 40. The method of claim 39, wherein step (4) further comprises determining the first pixel intensity threshold as a sum of the third pixel intensity threshold and the delta value.
  • 41. The method of claim 39, wherein step (4) further comprises determining the second pixel intensity threshold as a difference of the third pixel intensity threshold and the delta value.
  • 42. The method of claim 35, further comprising the steps of: (4) traversing a second pixel path across the captured print image frame; and (5) repeating steps (2) and (3) using the second pixel path.
  • 43. The method of claim 35, further comprising the steps of: (4) storing the number of print ridges; and (5) evaluating the stored number of print ridges to determine a quality of the captured fingerprint image.
  • 44. A print scanner system, comprising: an optical system enabled to direct light to an imaging surface of a platen of the optical system; a camera enabled to capture a print image frame from light reflected by the print surface and directed to the camera by the optical system; a memory coupled to the camera and enabled to store the captured print image frame; and a ridge counter coupled to the memory and enabled to traverse a pixel path through the captured print image frame, count a number of crossings of a first pixel intensity threshold and a second pixel intensity threshold while traversing the pixel path, and determine a number of print ridges in the captured print image frame based on the counted number of crossings.
  • 45. The print scanner system of claim 44, wherein the ridge counter is enabled to determine the number of print ridges by dividing the number of crossings by two.
  • 46. The print scanner system of claim 44, wherein the ridge counter is enabled to determine a region of interest in the captured print image frame.
  • 47. The print scanner system of claim 44, wherein the ridge counter is enabled to determine a first pixel intensity threshold and a second pixel intensity threshold.
  • 48. The print scanner system of claim 47, wherein the ridge counter is enabled to determine a third pixel intensity threshold based on a first maximum pixel intensity value and a first minimum pixel intensity value in the captured print image frame.
  • 49. The print scanner system of claim 48, wherein the ridge counter is enabled to determine a delta value based on the first maximum pixel intensity value and first minimum pixel intensity value in the captured print image frame.
  • 50. The print scanner system of claim 49, wherein the ridge counter is enabled to determine the first pixel intensity threshold as a sum of the third pixel intensity threshold and the delta value.
  • 51. The print scanner system of claim 49, wherein the ridge counter is enabled to determine the second pixel intensity threshold as a difference of the third pixel intensity threshold and the delta value.
  • 52. A ridge counter embodied in software and including control logic for counting print ridges in a captured print image frame, comprising: first control logic means for enabling a processor to traverse a pixel path through the captured print image frame; second control logic means for enabling a processor to count a number of crossings of a first pixel intensity threshold and a second pixel intensity threshold while traversing the pixel path; and third control logic means for enabling a processor to determine a number of print ridges in the captured print image frame based on the counted number of first pixel intensity threshold crossings and second pixel intensity threshold crossings.
  • 53. The ridge counter of claim 52, further comprising fourth control logic means for enabling a processor to determine the number of print ridges by dividing the counted number of crossings of the first pixel intensity threshold and the second pixel intensity threshold by two.
  • 54. The ridge counter of claim 52, further comprising fourth control logic means for enabling a processor to determine the first pixel intensity threshold and the second pixel intensity threshold.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application No. 60/400,103, filed Aug. 2, 2002 (Atty. Dkt. No. 1823.0680000), which is herein incorporated by reference in its entirety.

Provisional Applications (1)

  Number     Date      Country
  60400103   Aug 2002  US

Continuations (1)

  Number            Date      Country
  Parent 10631899   Aug 2003  US
  Child 11341377    Jan 2006  US