1. Field of the Invention
The present invention is directed to the field of rolled fingerprint capture, and more specifically, to capturing and combining multiple fingerprint images to generate an overall rolled fingerprint image.
2. Related Art
A rolled fingerprint scanner is a device used to capture rolled fingerprint images. The scanner captures the image of a user's fingerprint as the user rolls a finger across an image capturing surface. Multiple fingerprint images may be captured by the scanner as the finger is rolled. These images may be combined to form a composite rolled fingerprint image. A computer system may be used to create the composite rolled fingerprint image. Fingerprint images captured by a digital camera are generally comprised of pixels. Combining the pixels of fingerprint images into a composite fingerprint image is commonly referred to as pixel “knitting.”
The captured composite rolled fingerprint image may be used to identify the user. Fingerprint biometrics are largely regarded as an accurate method of identification and verification. A biometric is a unique, measurable characteristic or trait of a human being for automatically recognizing or verifying identity. See, e.g., Roethenbaugh, G. Ed., Biometrics Explained (International Computer Security Association: Carlisle, Pa. 1998), pages 1–34, which is herein incorporated by reference in its entirety.
Capturing rolled fingerprints using a fingerprint scanner coupled to a computer may be accomplished in a number of ways. Many current technologies implement a guide to assist the user. These guides primarily come in two varieties. The first type includes a guide located on the fingerprint scanner itself. This type may include guides such as light emitting diodes (LEDs) that move across the top and/or bottom of the scanner. The user is instructed to roll the finger at the same speed as the LEDs moving across the scanner. In doing so, the user inevitably goes too fast or too slow, resulting in poor quality images. The second type includes a guide located on a computer screen. Again, the user must match the speed of the guide, with the accompanying disadvantages. What is needed is a method and apparatus for capturing rolled fingerprint images without the requirement of a guide.
Current devices exist for collecting rolled fingerprint images. For instance, U.S. Pat. No. 4,933,976 describes using the statistical variance between successive fingerprint image “slices” to knit together a composite fingerprint image. This patent also describes techniques for averaging successive slices into the composite image. These techniques have the disadvantage of less than desirable image contrast. What is needed is a method and apparatus for capturing rolled fingerprint images with improved contrast imaging.
The present invention is directed to a method and apparatus for rolled fingerprint capture. The invention detects the start of a fingerprint roll. A plurality of fingerprint image frames are captured. A centroid window corresponding to each of the plurality of captured fingerprint image frames is determined. Pixels of each determined centroid window are knitted into a composite fingerprint image. The end of the fingerprint roll is detected.
In an embodiment, a pixel intensity difference count percentage value between a current fingerprint image frame and a previous fingerprint image frame is generated. Whether the generated pixel intensity difference count percentage value is greater than a start roll sensitivity threshold percentage value is determined.
Furthermore, in embodiments, a pixel window in a captured fingerprint image frame is determined. A leading edge column and a trailing edge column of a fingerprint image in the corresponding generated pixel window are found. A centroid window in the captured fingerprint image frame bounded by the leading edge column found and the trailing edge column found is generated.
The present invention further provides a novel algorithm for knitting fingerprint images together. Instead of averaging successive pixels, the algorithm of the present invention compares an existing pixel value to a captured potential new pixel value. New pixel values are only knitted if they are darker than the existing pixel value. The resultant image of the present invention has a much higher contrast than images that have been averaged or smoothed by previous techniques. In an embodiment, the invention compares the intensity of each pixel of the determined centroid window to the intensity of a corresponding pixel of a composite fingerprint image. The pixel of the composite fingerprint image is replaced with the corresponding pixel of the determined centroid window if the pixel of the determined centroid window is darker than the corresponding pixel of the composite fingerprint image.
Furthermore, existing fingerprint capturing devices require actuating a foot pedal to begin the capture process. The present invention requires no such activation. The algorithm of the present invention can be instantiated through a variety of software/hardware means (e.g. mouse click, voice command, etc.).
According to a further feature, the present invention provides a rolled fingerprint capture algorithm that can operate in either of two modes: guided and unguided. The present invention may provide the guided feature in order to support legacy systems; however, the preferred mode of operation is the unguided mode. Capturing rolled fingerprints without a guide has advantages. These advantages include decreased fingerprint scanner device complexity (no guide components required), and no need to train users to follow the speed of the guide.
Further embodiments, features, and advantages of the present inventions, as well as the structure and operation of the various embodiments of the present invention, are described in detail below with reference to the accompanying drawings.
The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention. In the drawings:
The present invention will now be described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
Overview and Terminology
The present invention is directed to a method and apparatus for rolled fingerprint capture. The invention detects the start and end of a fingerprint roll. One or more fingerprint image frames are captured. A centroid window corresponding to each of the captured fingerprint image frames is determined. Pixels of each determined centroid window are knitted into a composite fingerprint image. The composite fingerprint image represents an image of a complete fingerprint roll.
To more clearly delineate the present invention, an effort is made throughout the specification to adhere to the following term definitions as consistently as possible.
“USB” port means a universal serial bus port.
The term “fingerprint image frame” means the image data obtained in a single sample of a fingerprint image area of a fingerprint scanner, including fingerprint image data. A fingerprint image frame has a certain width and height in terms of image pixels, determined by the fingerprint scanner and the application.
The terms “centroid” or “fingerprint centroid” mean the pixels of a fingerprint image frame that comprise a fingerprint.
The term “centroid window” means an area of pixels substantially surrounding and including a fingerprint centroid. This area of pixels can be any shape, including but not limited to rectangular, square, or other shape.
Example Rolled Fingerprint Capture Environment
Structural implementations for rolled fingerprint capture according to the present invention are described at a high-level and at a more detailed level. These structural implementations are described herein for illustrative purposes, and are not limiting. In particular, rolled fingerprint capture as described in this section can be achieved using any number of structural implementations, including hardware, firmware, software, or any combination thereof.
Fingerprint scanner 102 captures a user's fingerprint. Fingerprint scanner 102 may be any suitable type of fingerprint scanner, known to persons skilled in the relevant art(s). For example, fingerprint scanner 102 may be a Cross Match Technologies Verifier Model 290 Fingerprint Capture Device. Fingerprint scanner 102 includes a fingerprint-image capturing area or surface, where a user may apply a finger, and roll the applied finger across the fingerprint capturing area or surface. Fingerprint scanner 102 periodically samples the fingerprint image capturing area, and outputs captured image data from the fingerprint image capturing area. Fingerprint scanner 102 is coupled to computer system 104.
Fingerprint scanner 102 may be coupled to computer system 104 in any number of ways. Some of the more common methods include coupling by a frame grabber, a USB port, and a parallel port. Other methods of coupling fingerprint scanner 102 to computer system 104 will be known by persons skilled in the relevant art(s), and are within the scope of the present invention.
Computer system 104 receives captured fingerprint image data from fingerprint scanner 102. Computer system 104 may provide a sampling signal to fingerprint scanner 102 that causes fingerprint scanner 102 to capture fingerprint image frames. Computer system 104 combines the captured fingerprint image data/frames into composite or overall fingerprint images. Further details of combining captured fingerprint image frames into composite or overall fingerprint images are provided below.
Computer system 104 may comprise a personal computer, a mainframe computer, one or more processors, specialized hardware, software, firmware, or any combination thereof, and/or any other device capable of processing the captured fingerprint image data as described herein. Computer system 104 may comprise a hard drive, a floppy drive, memory, a keyboard, a computer mouse, and any additional peripherals known to person(s) skilled in the relevant art(s), as necessary. Computer system 104 allows a user to initiate and terminate a rolled fingerprint capture session. Computer system 104 also allows a user to modify rolled fingerprint capture session options and parameters, as further described below.
Computer system 104 may be optionally coupled to a communications interface signal 110. Computer system 104 may output fingerprint image data, or any other related data, on optional communications interface signal 110. Optional communications interface signal 110 may interface the data with a network, the Internet, or any other data communication medium known to persons skilled in the relevant art(s). Through this communication medium, the data may be routed to any fingerprint image data receiving entity of interest, as would be known to persons skilled in the relevant art(s). For example, such entities may include the police and other law enforcement agencies. Computer system 104 may comprise a modem, or any other communications interface, as would be known to persons skilled in the relevant art(s), to transmit and receive data on optional communications interface signal 110.
Display 106 is coupled to computer system 104. Computer system 104 outputs fingerprint image data, including individual frames and composite rolled fingerprint images, to display 106. Any related rolled fingerprint capture session options, parameters, or outputs of interest, may be output to display 106. Display 106 displays the received fingerprint image data and related rolled fingerprint capture session options, parameters, and outputs. Display 106 may include a computer monitor, or any other applicable display known to persons skilled in the relevant art(s) from the teachings herein.
Embodiments for computer system 104 are further described below with respect to
As shown in
The present invention is described in terms of the exemplary environment shown in
Description in these terms is provided for convenience only. It is not intended that the invention be limited to application in this example environment. In fact, after reading the following description, it will become apparent to a person skilled in the relevant art how to implement the invention in alternative environments known now or developed in the future.
Rolled Fingerprint Capture Module Embodiments
Implementations for a rolled fingerprint capture module 108 are described at a high-level and at a more detailed level. These structural implementations are described herein for illustrative purposes, and are not limiting. In particular, the rolled fingerprint capture module 108 as described in this section can be achieved using any number of structural implementations, including hardware, firmware, software, or any combination thereof. The details of such structural implementations will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein.
Fingerprint frame capture module 202 receives a fingerprint scanner data signal 208. Fingerprint scanner data signal 208 comprises fingerprint image frame data captured by fingerprint scanner 102. In an embodiment, fingerprint frame capture module 202 allocates memory to hold a fingerprint frame, initiates transfer of the frame from the fingerprint scanner 102, and arranges the pixels for subsequent analysis. Fingerprint frame capture module 202 outputs a captured fingerprint image frame data signal 210. Captured fingerprint image frame data signal 210 comprises fingerprint image frame data, preferably in the form of digitized image pixels. For instance, fingerprint image frame data signal 210 may comprise a series of slices of fingerprint image frame data, where each slice is a vertical line of image pixels.
Fingerprint image format module 204 receives captured fingerprint image frame data signal 210. Fingerprint image format module 204 detects the start and stop of fingerprint rolls using captured fingerprint image frame data signal 210. Furthermore, fingerprint image format module 204 combines captured rolled fingerprint image frames into composite rolled fingerprint images. Further structural and operational embodiments of rolled fingerprint capture module 108 are provided below. Fingerprint image format module 204 outputs a composite fingerprint image data signal 212. Composite fingerprint image data signal 212 comprises fingerprint image data, such as a single rolled fingerprint image frame, or any combination of one or more rolled fingerprint image frames, including a complete rolled fingerprint image.
Fingerprint image display module 206 receives composite fingerprint image data signal 212. Fingerprint image display module 206 provides any display formatting and any display drivers necessary for displaying fingerprint images on display 106. In a preferred embodiment, fingerprint image display module 206 formats the fingerprint image pixels into a Windows Device Independent Bitmap (DIB). This is a preferred image format used by the Microsoft Windows Graphical Device Interface (GDI) Engine. Fingerprint image display module 206 outputs a fingerprint image display signal 214, preferably in DIB format.
Fingerprint roll detector module 216 detects when a fingerprint roll has started, and detects when the fingerprint roll has stopped.
Referring back to
Pixel knitting module 220 knits together the relevant portions of centroid windows to create composite rolled fingerprint images. Embodiments of this module are described in further detail below.
The embodiments described above are provided for purposes of illustration. These embodiments are not intended to limit the invention. Alternate embodiments, differing slightly or substantially from those described herein, will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein.
Operation
Exemplary operational and/or structural implementations related to the structure(s), and/or embodiments described above are presented in this section (and its subsections). These components and methods are presented herein for purposes of illustration, and not limitation. The invention is not limited to the particular examples of components and methods described herein. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the present invention.
The process begins with step 302. In step 302, system variables are initialized. Control then passes to step 304.
In step 304, the start of a fingerprint roll is detected. Control then passes to step 306.
In step 306, a plurality of fingerprint image frames are captured. Control then passes to step 308.
In step 308, a centroid window corresponding to each of the plurality of captured fingerprint image frames is determined. Control then passes to step 310.
In step 310, pixels of the determined centroid windows are knitted into an overall fingerprint image. Control then passes to step 312.
In step 312, the end of a fingerprint roll is detected. The algorithm then ends.
More detailed structural and operational embodiments for implementing the steps of
System Variable Initialization
In step 302, variables used by the routine steps must be initialized before the process proceeds. In a preferred embodiment, the initialization phase resets at least the variables shown in Table 1:
Both RollInitialized and RollDetected are initially set to FALSE. When a roll is initialized, RollInitialized is set to TRUE. When a roll is detected, RollDetected is set to TRUE. When a roll is complete, both variables are set to FALSE.
StartRollSensitivity and StopRollSensitivity may be fixed or adjustable values. In an embodiment, the StartRollSensitivity and StopRollSensitivity variables may be configured from a user interface to control the sensitivity of the rolling process. In an embodiment, these variables can take values between 0 and 100 representing low sensitivity to high sensitivity. Other value ranges may be used, as would be recognized by persons skilled in the relevant art(s).
CurrentBits, PreviousBits, and ImageBits are comprised of arrays of pixels, with each pixel having a corresponding intensity. In a preferred embodiment, all pixel intensity values in CurrentBits, PreviousBits, and ImageBits are set to 255 (base 10), which corresponds to white. Zero (0) corresponds to black. Pixel values in between 0 and 255 correspond to shades of gray, becoming lighter when approaching 255 from 0. This scheme may be chosen in keeping with the concept that a fingerprint image contains black ridges against a white background. Other pixel intensity value ranges may be used, as would be recognized by persons skilled in the relevant art(s). Furthermore, the invention is fully applicable to the use of a color fingerprint scanner, with colored pixel values, as would be recognized by persons skilled in the relevant art(s).
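The initialization described above may be sketched as follows (a minimal illustration only; the dictionary layout, default sensitivity values, and frame dimensions are hypothetical and not part of the invention):

```python
WHITE = 255  # maximum pixel intensity; 0 corresponds to black


def initialize_session(width=400, height=400):
    """Reset the session variables described above.

    All pixel arrays start as all-white (255), in keeping with the
    convention of black fingerprint ridges against a white background.
    """
    return {
        "RollInitialized": False,
        "RollDetected": False,
        "StartRollSensitivity": 50,  # 0-100, hypothetical default
        "StopRollSensitivity": 50,   # 0-100, hypothetical default
        "CurrentBits": [[WHITE] * width for _ in range(height)],
        "PreviousBits": [[WHITE] * width for _ in range(height)],
        "ImageBits": [[WHITE] * width for _ in range(height)],
    }


state = initialize_session()
```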
Detecting Start of Fingerprint Roll
In step 304, the start of a fingerprint roll is detected. Before a rolled fingerprint image can begin to be created, the system must detect that the user has placed a finger in or against the fingerprint image capturing area of fingerprint scanner 102 (shown in
In a preferred embodiment, a method for detecting a finger on fingerprint scanner 102 is based on calculating a percentage intensity change from a previously captured fingerprint scanner image (PreviousBits) to a currently captured fingerprint scanner image (CurrentBits). Each pixel in CurrentBits may be compared to the corresponding, identically located pixel in PreviousBits. If the difference in the intensities of a compared pixel pair is greater than a predetermined threshold, that pixel pair is counted as being different. Once all pixels have been compared, the percentage of different pixels is calculated. In alternate embodiments, the number of different pixels may be calculated without determining a percentage. This calculated pixel difference percentage is used to determine when a roll has started and stopped. When the algorithm is initiated, this calculated percentage will be relatively low since virtually no pixels will be different.
Once the calculated pixel difference percentage goes beyond a predetermined start roll sensitivity threshold value (StartRollSensitivity), the algorithm goes into rolling mode. As soon as the percentage goes below a predetermined stop roll sensitivity threshold value (StopRollSensitivity), the algorithm exits rolling mode (discussed in greater detail below). As discussed above, in alternate embodiments, the number of different pixels may be calculated, without determining a percentage, and this number may be compared to a predetermined stop roll sensitivity threshold value.
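The mode transitions described above may be sketched as follows (an illustration only; the function and parameter names are hypothetical, and the pixel difference percentage is assumed to be computed elsewhere):

```python
def update_roll_state(rolling, diff_percent,
                      start_sensitivity, stop_sensitivity):
    """Return the new rolling-mode flag given the pixel difference
    percentage between the current and previous frames."""
    if not rolling and diff_percent > start_sensitivity:
        return True   # enter rolling mode: a finger has been detected
    if rolling and diff_percent < stop_sensitivity:
        return False  # exit rolling mode: the finger has been removed
    return rolling    # otherwise, remain in the current mode
```

For example, with a StartRollSensitivity of 50, a jump in the difference percentage to 60 switches the algorithm into rolling mode.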
In step 314, a pixel intensity difference percentage value between a current fingerprint image frame and a previous fingerprint image frame is generated. In an alternate embodiment, a pixel intensity difference count value between a current fingerprint image frame and a previous fingerprint image frame may be generated. Control then proceeds to step 316.
In step 316, whether the generated pixel intensity difference percentage value is greater than a start roll sensitivity threshold percentage value is determined. In the alternate embodiment stated in step 314 above, whether a generated pixel difference count value is greater than a start roll sensitivity threshold value may be determined.
In step 318, a current fingerprint image frame is captured. Control then proceeds to step 320.
In step 320, the intensity of each pixel of the current fingerprint image frame is compared to the intensity of a corresponding pixel of a previously captured fingerprint image frame to obtain a pixel intensity difference count value. In embodiments, compared pixels are found different if their respective pixel intensity values are not the same. In alternative embodiments, compared pixels may be found different if their intensity values differ by greater than a pixel intensity difference threshold. The pixel intensity difference threshold may be established as a system variable, and may be set to a fixed value, or may be adjustable by a user. Control then proceeds to step 322.
In step 322, a pixel intensity difference percentage value is calculated. In an alternate embodiment, a pixel difference count value may be calculated.
In the following example of a preferred embodiment, a finger detected function (FingerDetected) is presented. This function may be called to detect whether a finger is present on a fingerprint scanner.
In this preferred embodiment, the finger detected function calls a frame difference function. The frame difference routine calculates the percentage difference between two frames. This difference is calculated down to the pixel level. Two pixels are considered to be different if their values are more than a certain value (nDiff) apart. In the following example of a preferred embodiment, a frame difference function (FrameDifference) is presented.
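A minimal sketch of such a frame difference function follows (illustrative only; frames are represented here as flat lists of pixel intensities, and the default nDiff value is hypothetical):

```python
def frame_difference(current, previous, n_diff=10):
    """Return the percentage of pixel pairs whose intensities differ
    by more than n_diff, comparing identically located pixels of two
    frames of equal size (a sketch of the FrameDifference routine)."""
    assert len(current) == len(previous) and current
    different = sum(
        1 for cur, prev in zip(current, previous)
        if abs(cur - prev) > n_diff
    )
    return 100.0 * different / len(current)
```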
In embodiments, fingerprint roll detector module 216 of
Capturing Fingerprint Image Frames
Returning to
Determining a Centroid Window
In step 308, a centroid window corresponding to each of the plurality of captured fingerprint image frames is determined. After the algorithm has detected that a fingerprint roll has started, the task of combining pixels from captured fingerprint image frames into a composite rolled fingerprint image begins. However, all of the pixels in a particular frame are not necessarily read. Only those pixels inside a particular window, the “centroid window,” are read. A centroid window comprises captured fingerprint image pixels, substantially trimming off the non-relevant pixels of a captured fingerprint image frame. By focusing only on the relevant portion of the captured frame, the processing of the captured frame can proceed much faster.
In an embodiment, to determine a centroid window, the leading and trailing edges of the fingerprint image in a captured fingerprint image frame are found. These edges are determined by sampling a thin strip of pixels in a pixel window across the center of a fingerprint frame.
Furthermore, in alternative embodiments, more than one pixel window 702 may be generated to determine a centroid window. For example, three pixel windows may be generated within the fingerprint image frame, with pixel windows generated across the center, at or near the top, and at or near the bottom of the fingerprint image frame. Generating more than one pixel window may provide advantages in locating a fingerprint image within a fingerprint image frame, particularly if the fingerprint image is off-center.
A histogram is built from the generated pixel window. The histogram includes the total pixel intensity value for each column of pixels in pixel window 606. An example histogram 704 is shown graphically in
To find the leading edge of a fingerprint image, the algorithm scans the histogram in the direction opposite of the direction of the fingerprint roll. The algorithm searches for a difference above a predetermined threshold between two adjacent columns. Where the total pixel intensity changes above the threshold value between columns, becoming darker, the leading edge is found. To find the trailing edge of the fingerprint, the algorithm scans the histogram in the direction of the fingerprint roll, in a similar fashion to finding the leading edge. The “x” coordinates of the leading and trailing edges of the histogram become the “x” coordinates of the leading and trailing edges of the centroid window.
In the example of
In step 324, a pixel window in a captured fingerprint image frame is generated. Control then proceeds to step 326.
In step 326, a leading edge column and a trailing edge column of a fingerprint image are found in the corresponding generated pixel window. Control then proceeds to step 328.
In step 328, a centroid window in the captured fingerprint image frame bounded by the leading edge column found and the trailing edge column found is generated.
In a preferred embodiment, the window generated in step 328 is centered in an axis perpendicular to the direction that a finger is rolled. In such an embodiment, the generated window may include columns of a height of a predetermined number of pixels in the axis perpendicular to the direction that the finger is rolled. Furthermore, the generated window may have a length in an axis parallel to the direction that the finger is rolled equal to the number of pixels spanning the captured fingerprint image frame along that axis.
In step 330, a histogram representative of the cumulative pixel intensity of pixels present in each column of the generated window is built. Control then proceeds to step 332.
In step 332, the histogram is scanned in the direction opposite of that in which the finger is rolled. The algorithm scans for a difference in the pixel intensities of adjacent columns of the generated window that is greater than a first fingerprint edge threshold. In an embodiment, the darker column of the adjacent columns is designated the leading edge column. In alternative embodiments, other columns, such as the other adjacent column, may be designated as the leading edge column. Control then proceeds to step 334.
In step 334, the histogram is scanned in the direction in which the finger is rolled for a difference in the pixel intensities of two adjacent columns that is greater than a second fingerprint edge threshold. In an embodiment, the darker column of the adjacent columns is designated the trailing edge column. In alternative embodiments, other columns, such as the other adjacent column, may be designated as the trailing edge column.
In the following example of a preferred embodiment, a find centroid function (FindCentroid) is presented. The FindCentroid function may be called to determine a centroid window. The function builds a histogram from a generated pixel window in a captured fingerprint image frame, and searches from left to right and then right to left through the histogram, looking for the edges of a fingerprint.
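A minimal sketch of such a find centroid function follows (illustrative only; it assumes a left-to-right roll, represents the pixel window as a list of pixel columns, and uses a hypothetical edge threshold value):

```python
def find_centroid(window, edge_threshold=200):
    """Sketch of the FindCentroid routine described above.

    window: list of pixel columns, each a list of intensities
    (0 = black, 255 = white). Returns the pair of column indices
    bounding the fingerprint, in ascending order, or None if no
    edge is found. Assumes a left-to-right roll, so the leading
    edge is sought scanning right-to-left and the trailing edge
    scanning left-to-right."""
    # Histogram: total intensity of each column; darker columns sum lower.
    hist = [sum(col) for col in window]
    n = len(hist)

    leading = trailing = None
    # Leading edge: scan opposite the roll direction (right to left),
    # looking for an adjacent-column darkening above the threshold.
    for x in range(n - 1, 0, -1):
        if hist[x] - hist[x - 1] > edge_threshold:
            leading = x - 1  # the darker column
            break
    # Trailing edge: scan in the roll direction (left to right).
    for x in range(n - 1):
        if hist[x] - hist[x + 1] > edge_threshold:
            trailing = x + 1  # the darker column
            break
    if leading is None or trailing is None:
        return None
    return (min(leading, trailing), max(leading, trailing))
```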
In an embodiment, centroid window determiner module 218 of
Knitting Pixels
Returning to
In a preferred embodiment, the copying of pixels for knitting is a conditional copy based on the intensity, or darkness, of the pixel. In other words, the algorithm for copying pixels from the centroid window is not a blind copy. The algorithm compares each pixel of the ImageBits array with the corresponding pixel of the CurrentBits array. A pixel will only be copied from CurrentBits if the pixel is darker than the corresponding pixel of ImageBits. This rule prevents the image from becoming too blurry during the capture process. This entire process is referred to herein as “knitting.”
For example, pixel 808 has an intensity value of 94. Pixel 808 is compared against pixel 810, which has an intensity value of 118. Because pixel 808 is darker than pixel 810 (i.e., 94 is a lower intensity value than 118), the intensity value of pixel 812 is set to the intensity value of pixel 808. Likewise, pixel 814 has an intensity value of 123. Pixel 814 is compared against pixel 816, which has an intensity value of 54. Because pixel 816 is darker than pixel 814 (i.e., 54 is a lower intensity value than 123), the intensity value of pixel 818 remains that of pixel 816.
In step 336, the intensity of each pixel of the determined centroid window is compared to the intensity of the corresponding pixel of an overall fingerprint image. Control then proceeds to step 338.
In step 338, the pixel of the composite fingerprint image is replaced with the corresponding pixel of the determined centroid window if the pixel of the determined centroid window is darker than the corresponding pixel of the composite fingerprint image.
In the following example of a preferred embodiment, a pixel knitting function (CopyConditionalBits) is presented. The CopyConditionalBits function copies pixels from the determined centroid window (CurrentBits) into the overall image (ImageBits). The function does not blindly copy the pixels. Instead, the function will only copy a pixel if the new value is less than the previous value.
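A minimal sketch of such a conditional copy follows (illustrative only; both arrays are represented here as flat lists of pixel intensities, where 0 is black and 255 is white):

```python
def copy_conditional_bits(image_bits, current_bits):
    """Sketch of the CopyConditionalBits knitting rule described
    above: a pixel is copied from the centroid window into the
    composite image only when it is darker (numerically lower)
    than the existing composite pixel."""
    return [min(existing, new)
            for existing, new in zip(image_bits, current_bits)]
```

Applied to the worked example above, an existing composite pixel of 118 is replaced by a new pixel of 94, while an existing pixel of 54 is retained over a new pixel of 123.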
In an embodiment of the present invention, pixel knitting module 220 of
Detecting End of Fingerprint Roll
Returning to
Returning to
As discussed above, this calculated pixel difference percentage is used to determine when a roll has started and stopped. When the rolled fingerprint capture algorithm is operating, and a fingerprint is being captured, this percentage will be relatively low because a relatively small number of pixels will be changing during the roll. However, when the user removes their finger from the scanner surface, the difference percentage will increase, as fingerprint image data is no longer being captured.
As discussed above, as soon as the percentage goes below a predetermined stop roll sensitivity threshold value (StopRollSensitivity), the algorithm exits rolled fingerprint capture mode.
In step 340, a pixel intensity difference percentage value between a current fingerprint image frame and a previous fingerprint image frame is generated. In an alternative embodiment, a pixel intensity difference count value between a current fingerprint image frame and a previous fingerprint image frame may be generated. Control then proceeds to step 342.
In step 342, it is determined whether the generated pixel intensity difference percentage value is less than a stop roll sensitivity threshold percentage value. In the alternative embodiment mentioned in step 340, it may instead be determined whether the generated pixel intensity difference count value is less than a stop roll sensitivity threshold count value.
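The stop-roll test of steps 340 and 342 can be sketched as follows. The exit condition (difference percentage falling below StopRollSensitivity) comes from the text; the flat-list frame representation and the per-pixel change threshold used to decide whether a pixel "differs" are illustrative assumptions.

```python
def pixel_difference_percentage(current_frame, previous_frame, per_pixel_threshold=8):
    """Step 340: percentage of pixels whose intensity changed between
    consecutive frames. A pixel counts as changed when its intensity
    differs by more than per_pixel_threshold (an assumed tolerance
    against sensor noise, not a value from the text)."""
    changed = sum(1 for c, p in zip(current_frame, previous_frame)
                  if abs(c - p) > per_pixel_threshold)
    return 100.0 * changed / len(current_frame)

def roll_stopped(current_frame, previous_frame, stop_roll_sensitivity):
    """Step 342: the roll is considered finished when the difference
    percentage falls below the StopRollSensitivity threshold, i.e.
    frames have stopped changing because the finger was lifted."""
    return pixel_difference_percentage(current_frame, previous_frame) < stop_roll_sensitivity
```

During a roll, consecutive frames differ substantially, so the percentage stays above the threshold; once the finger is lifted, consecutive (now static) frames are nearly identical and the percentage drops, triggering the exit from rolled fingerprint capture mode.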
A finger detected function (FingerDetected) is presented above in reference to step 304 of
In embodiments, fingerprint roll stop detector module 224 of
Rolled Fingerprint Display Panel
In an embodiment, rolled fingerprint display panel 904 of
Example Computer System
An example of a computer system 104 is shown in
The computer system 104 includes one or more processors, such as processor 1004. One or more processors 1004 can execute software implementing the routine shown in
Computer system 104 may include a graphics subsystem 1003 (optional). Graphics subsystem 1003 can be any type of graphics system supporting computer graphics. Graphics subsystem 1003 can be implemented as one or more processor chips. The graphics subsystem 1003 can be included as a separate graphics engine or processor, or as part of processor 1004. Graphics data is output from the graphics subsystem 1003 to bus 1002. Display interface 1005 forwards graphics data from the bus 1002 for display on the display unit 106.
Computer system 104 also includes a main memory 1008, preferably random access memory (RAM), and can also include a secondary memory 1010. The secondary memory 1010 can include, for example, a hard disk drive 1012 and/or a removable storage drive 1014, representing a floppy disk drive, a magnetic tape drive, an optical disk drive, etc. The removable storage drive 1014 reads from and/or writes to a removable storage unit 1018 in a well known manner. Removable storage unit 1018 represents a floppy disk, magnetic tape, optical disk, etc., which is read by and written to by removable storage drive 1014. As will be appreciated, the removable storage unit 1018 includes a computer usable storage medium having stored therein computer software and/or data.
In alternative embodiments, secondary memory 1010 may include other similar means for allowing computer programs or other instructions to be loaded into computer system 104. Such means can include, for example, a removable storage unit 1022 and an interface 1020. Examples can include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM, or PROM) and associated socket, and other removable storage units 1022 and interfaces 1020 which allow software and data to be transferred from the removable storage unit 1022 to computer system 104.
Computer system 104 can also include a communications interface 1024. Communications interface 1024 allows software and data to be transferred between computer system 104 and external devices via communications path 1026. Examples of communications interface 1024 can include a modem, a network interface (such as Ethernet card), a communications port, etc. Software and data transferred via communications interface 1024 are in the form of signals which can be electronic, electromagnetic, optical or other signals capable of being received by communications interface 1024, via communications path 1026. Note that communications interface 1024 provides a means by which computer system 104 can interface to a network such as the Internet.
Graphical user interface module 1030 transfers user inputs from peripheral devices 1032 to bus 1002. In an embodiment, one of the peripheral devices 1032 may be fingerprint scanner 102. These peripheral devices 1032 also can be a mouse, keyboard, touch screen, microphone, joystick, stylus, light pen, voice recognition unit, or any other type of peripheral unit.
The present invention can be implemented using software running (that is, executing) in an environment similar to that described above with respect to
Computer programs (also called computer control logic) are stored in main memory 1008 and/or secondary memory 1010. Computer programs can also be received via communications interface 1024. Such computer programs, when executed, enable the computer system 104 to perform the features of the present invention as discussed herein. In particular, the computer programs, when executed, enable the processor 1004 to perform the features of the present invention. Accordingly, such computer programs represent controllers of the computer system 104.
In an embodiment where the invention is implemented using software, the software may be stored in a computer program product and loaded into computer system 104 using removable storage drive 1014, hard drive 1012, or communications interface 1024. Alternatively, the computer program product may be downloaded to computer system 104 over communications path 1026. The control logic (software), when executed by the one or more processors 1004, causes the processor(s) 1004 to perform the functions of the invention as described herein.
In another embodiment, the invention is implemented primarily in firmware and/or hardware using, for example, hardware components such as application specific integrated circuits (ASICs). Implementation of a hardware state machine so as to perform the functions described herein will be apparent to persons skilled in the relevant art(s).
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
2500017 | Altman | Mar 1950 | A |
3200701 | White | Aug 1965 | A |
3482498 | Becker | Dec 1969 | A |
3527535 | Monroe | Sep 1970 | A |
3617120 | Roka | Nov 1971 | A |
3699519 | Campbell | Oct 1972 | A |
3947128 | Weinberger et al. | Mar 1976 | A |
3968476 | McMahon | Jul 1976 | A |
4032975 | Malueg et al. | Jun 1977 | A |
4047154 | Vitols et al. | Sep 1977 | A |
4063226 | Kozma et al. | Dec 1977 | A |
4210899 | Swonger et al. | Jul 1980 | A |
4414684 | Blonder | Nov 1983 | A |
4537484 | Fowler et al. | Aug 1985 | A |
4544267 | Schiller | Oct 1985 | A |
4601195 | Garritano | Jul 1986 | A |
4681435 | Kubota et al. | Jul 1987 | A |
4783823 | Tasaki et al. | Nov 1988 | A |
4784484 | Jensen | Nov 1988 | A |
4792226 | Fishbine et al. | Dec 1988 | A |
4811414 | Fishbine et al. | Mar 1989 | A |
4876726 | Capello et al. | Oct 1989 | A |
4924085 | Kato et al. | May 1990 | A |
4933976 | Fishbine et al. | Jun 1990 | A |
4995086 | Lilley et al. | Feb 1991 | A |
5054090 | Knight et al. | Oct 1991 | A |
5067162 | Driscoll, Jr. et al. | Nov 1991 | A |
5067749 | Land | Nov 1991 | A |
5131038 | Puhl et al. | Jul 1992 | A |
5146102 | Higuchi et al. | Sep 1992 | A |
5187747 | Capello et al. | Feb 1993 | A |
5222152 | Fishbine et al. | Jun 1993 | A |
5230025 | Fishbine et al. | Jul 1993 | A |
5233404 | Lougheed et al. | Aug 1993 | A |
5249370 | Stanger et al. | Oct 1993 | A |
D348445 | Fishbine et al. | Jul 1994 | S |
D351144 | Fishbine et al. | Oct 1994 | S |
5384621 | Hatch et al. | Jan 1995 | A |
5412463 | Sibbald et al. | May 1995 | A |
5416573 | Sartor, Jr. | May 1995 | A |
5467403 | Fishbine et al. | Nov 1995 | A |
5469506 | Berson et al. | Nov 1995 | A |
5473144 | Mathurin, Jr. | Dec 1995 | A |
5509083 | Abtahi et al. | Apr 1996 | A |
5517528 | Johnson | May 1996 | A |
5528355 | Maase et al. | Jun 1996 | A |
5548394 | Giles et al. | Aug 1996 | A |
5591949 | Bernstein | Jan 1997 | A |
5596454 | Hebert | Jan 1997 | A |
5598474 | Johnson | Jan 1997 | A |
5613014 | Eshera et al. | Mar 1997 | A |
5615277 | Hoffman | Mar 1997 | A |
5625448 | Ranalli et al. | Apr 1997 | A |
5640422 | Johnson | Jun 1997 | A |
5649128 | Hartley | Jul 1997 | A |
5650842 | Maase et al. | Jul 1997 | A |
5659626 | Ort et al. | Aug 1997 | A |
5661451 | Pollag | Aug 1997 | A |
5680205 | Borza | Oct 1997 | A |
5689529 | Johnson | Nov 1997 | A |
5717777 | Wong et al. | Feb 1998 | A |
5745684 | Oskouy et al. | Apr 1998 | A |
5748766 | Maase et al. | May 1998 | A |
5755748 | Borza | May 1998 | A |
5778089 | Borza | Jul 1998 | A |
5781647 | Fishbine et al. | Jul 1998 | A |
5793218 | Oster et al. | Aug 1998 | A |
5799098 | Ort et al. | Aug 1998 | A |
5805777 | Kuchta | Sep 1998 | A |
5812067 | Bergholz et al. | Sep 1998 | A |
5815252 | Price-Francis | Sep 1998 | A |
5818956 | Tuli | Oct 1998 | A |
5822445 | Wong | Oct 1998 | A |
5825005 | Behnke | Oct 1998 | A |
5825474 | Maase | Oct 1998 | A |
5828773 | Setlak et al. | Oct 1998 | A |
5832244 | Jolley et al. | Nov 1998 | A |
5848231 | Teitelbaum et al. | Dec 1998 | A |
5859420 | Borza | Jan 1999 | A |
5862247 | Fisun et al. | Jan 1999 | A |
5867802 | Borza | Feb 1999 | A |
5869822 | Meadows, II et al. | Feb 1999 | A |
5872834 | Teitelbaum | Feb 1999 | A |
5900993 | Betensky | May 1999 | A |
5907627 | Borza | May 1999 | A |
5920384 | Borza | Jul 1999 | A |
5920640 | Salatino et al. | Jul 1999 | A |
5926555 | Ort et al. | Jul 1999 | A |
5928347 | Jones | Jul 1999 | A |
5960100 | Hargrove | Sep 1999 | A |
5973731 | Schwab | Oct 1999 | A |
5974162 | Metz et al. | Oct 1999 | A |
5978495 | Thomopoulos et al. | Nov 1999 | A |
5987155 | Dunn et al. | Nov 1999 | A |
5995014 | DiMaria | Nov 1999 | A |
6018739 | McCoy et al. | Jan 2000 | A |
6023522 | Draganoff et al. | Feb 2000 | A |
6041372 | Hart et al. | Mar 2000 | A |
6075876 | Draganoff | Jun 2000 | A |
6078265 | Bonder et al. | Jun 2000 | A |
6088585 | Schmitt et al. | Jul 2000 | A |
6104809 | Berson et al. | Aug 2000 | A |
6658164 | Irving et al. | Dec 2003 | B1 |
7010148 | Irving et al. | Mar 2006 | B1 |
20030128240 | Martinez et al. | Jul 2003 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
0 101 772 | Mar 1984 | EP |
0 308 162 | Mar 1989 | EP |
0 379 333 | Jul 1990 | EP |
0 623 890 | Nov 1994 | EP |
0 379 333 | Jul 1995 | EP |
0 889 432 | Jan 1999 | EP |
0 905 646 | Mar 1999 | EP |
0 924 656 | Jun 1999 | EP |
2 089 545 | Jun 1982 | GB |
2 313 441 | Nov 1997 | GB |
WO 8702491 | Apr 1987 | WO |
WO 9003620 | Apr 1990 | WO |
WO 9211608 | Jul 1992 | WO |
WO 9422371 | Oct 1994 | WO |
WO 9617480 | Jun 1996 | WO |
WO 9729477 | Aug 1997 | WO |
WO 9741528 | Nov 1997 | WO |
WO 9809246 | Mar 1998 | WO |
WO 9812670 | Mar 1998 | WO |
WO 9912123 | Mar 1999 | WO |
WO 9926187 | May 1999 | WO |
WO 9940535 | Aug 1999 | WO |
Prior Publication Data

Number | Date | Country |
---|---|---|
20030091219 A1 | May 2003 | US |
Related U.S. Application Data

Relation | Number | Date | Country |
---|---|---|---|
Parent | 09377597 | Aug 1999 | US |
Child | 10247285 | | US |