Multispectral barcode imaging

Information

  • Patent Grant
  • 8787630
  • Patent Number
    8,787,630
  • Date Filed
    Wednesday, January 5, 2011
  • Date Issued
    Tuesday, July 22, 2014
  • Inventors
  • Original Assignees
  • Examiners
    • Strege; John
  • Agents
    • Marsh Fischmann & Breyfogle LLP
    • Sherwinter; Daniel J.
Abstract
A multispectral sensor is provided with an illumination source and a digital imaging system. The illumination source is disposed to provide light at multiple wavelengths to an object. The digital imaging system is disposed to receive light scattered from the object and has a digital array of light detectors and a color filter array. The color filter array has multiple distributed filter elements, each of which is adapted to transmit light of one of a limited number of specified narrowband wavelength ranges. The color filter array is disposed to filter the light scattered from the object prior to encountering the digital array of light detectors.
Description

This application is also related to the following applications, each of which is incorporated herein by reference for all purposes: U.S. patent application Ser. No. 11/219,006, entitled “COMPARATIVE TEXTURE ANALYSIS OF TISSUE FOR BIOMETRIC SPOOF DETECTION,” filed Sep. 1, 2005 by Robert K. Rowe (U.S. Pat. No. 7,668,350); U.S. patent application Ser. No. 10/818,698, entitled “MULTISPECTRAL BIOMETRIC SENSOR,” filed Apr. 5, 2004 by Robert K. Rowe et al. (U.S. Pat. No. 7,147,153); U.S. patent application Ser. No. 09/874,740, entitled “APPARATUS AND METHOD OF BIOMETRIC DETERMINATION USING SPECIALIZED OPTICAL SPECTROSCOPY SYSTEM,” filed Jun. 5, 2001 by Robert K. Rowe et al.; U.S. Prov. Pat. Appl. No. 60/659,024, entitled “MULTISPECTRAL IMAGING OF THE FINGER FOR BIOMETRICS,” filed Mar. 4, 2005 by Robert K. Rowe et al.; U.S. Prov. Pat. Appl. No. 60/654,354, entitled “SYSTEMS AND METHODS FOR MULTISPECTRAL FINGERPRINT SENSING,” filed Feb. 18, 2005 by Robert K. Rowe; U.S. Prov. Pat. Appl. No. 60/610,802, entitled “FINGERPRINT SPOOF DETECTION USING MULTISPECTRAL IMAGING,” filed Sep. 17, 2004 by Robert K. Rowe; U.S. Prov. Pat. Appl. No. 60/576,364, entitled “MULTISPECTRAL FINGER RECOGNITION,” filed Jun. 1, 2004 by Robert K. Rowe and Stephen P. Corcoran; U.S. Prov. Pat. Appl. No. 60/552,662, entitled “OPTICAL SKIN SENSOR FOR BIOMETRICS,” filed Mar. 10, 2004 by Robert K. Rowe; U.S. Prov. Pat. Appl. No. 60/504,594, entitled “HYPERSPECTRAL FINGERPRINTING,” filed Sep. 18, 2003 by Robert K. Rowe et al.; U.S. Prov. Pat. Appl. No. 60/483,281, entitled “HYPERSPECTRAL FINGERPRINT READER,” filed Jun. 27, 2003 by Robert K. Rowe et al.; and U.S. Prov. Pat. Appl. No. 60/460,247, entitled “NONINVASIVE ALCOHOL MONITOR,” filed Apr. 4, 2003 by Robert K. Rowe and Robert M. Harbour.


BACKGROUND OF THE INVENTION

This application relates generally to biometrics. More specifically, this application relates to methods and systems for performing biometric measurements with a multispectral imaging sensor.


“Biometrics” refers generally to the statistical analysis of characteristics of living bodies. One category of biometrics includes “biometric identification,” which commonly operates under one of two modes to provide automatic identification of people or to verify purported identities of people. Biometric sensing technologies measure the physical features or behavioral characteristics of a person and compare those features to similar prerecorded measurements to determine whether there is a match. Physical features that are commonly used for biometric identification include faces, irises, hand geometry, vein structure, and fingerprint patterns, which are the most prevalent of all biometric-identification features. Current methods for analyzing collected fingerprints include optical, capacitive, radio-frequency, thermal, ultrasonic, and several other less common techniques.


Most of the fingerprint-collection methods rely on measuring characteristics of the skin at or very near the surface of a finger. In particular, optical fingerprint readers typically rely on the presence or absence of a difference in the index of refraction between the sensor platen and the finger placed on it. When an air-filled valley of the fingerprint is above a particular location of the platen, total internal reflectance (“TIR”) occurs in the platen because of the air-platen index difference. Alternatively, if skin of the proper index of refraction is in optical contact with the platen, then the TIR at this location is “frustrated,” allowing light to traverse the platen-skin interface. A map of the differences in TIR across the region where the finger is touching the platen forms the basis for a conventional optical fingerprint reading. There are a number of optical arrangements used to detect this variation of the optical interface in both bright-field and dark-field optical arrangements. Commonly, a single, quasimonochromatic beam of light is used to perform this TIR-based measurement.


There also exist non-TIR optical fingerprint sensors. In most cases, these sensors rely on some arrangement of quasimonochromatic light to illuminate the front, sides, or back of a fingertip, causing the light to diffuse through the skin. The fingerprint image is formed due to the differences in light transmission across the skin-platen boundary for the ridges and valleys. The differences in optical transmission are due to changes in the Fresnel reflection characteristics caused by the presence or absence of an intermediate air gap in the valleys, as is known to those familiar with the art.


Optical fingerprint readers are particularly susceptible to image quality problems due to non-ideal conditions. If the skin is overly dry, the index match with the platen will be compromised, resulting in poor image contrast. Similarly, if the finger is very wet, the valleys may fill with water, causing an optical coupling to occur all across the fingerprint region and greatly reducing image contrast. Similar effects may occur if the pressure of the finger on the platen is too little or too great, the skin or sensor is dirty, the skin is aged and/or worn, or overly fine features are present such as may be the case for certain ethnic groups and in very young children. These effects decrease image quality and thereby decrease the overall performance of the fingerprint sensor. In some cases, commercial optical fingerprint readers incorporate a thin membrane of soft material such as silicone to help mitigate some of these effects and restore performance. As a soft material, the membrane is subject to damage, wear, and contamination, limiting the use of the sensor without maintenance.


Biometric sensors, particularly fingerprint biometric sensors, are generally prone to being defeated by various forms of spoof samples. In the case of fingerprint readers, a variety of methods are known in the art for presenting readers with a fingerprint pattern of an authorized user that is embedded in some kind of inanimate material such as paper, gelatin, epoxy, latex, and the like. Thus, even if a fingerprint reader can be considered to reliably determine the presence or absence of a matching fingerprint pattern, it is also critical to the overall system security to ensure that the matching pattern is being acquired from a genuine, living finger, which may be difficult to ascertain with many common sensors.


A common approach to making biometric sensors more robust, more secure, and less error-prone is to combine sources of biometric signals using an approach sometimes referred to in the art as “dual,” “combinatoric,” “layered,” “fused,” or “multifactor” biometric sensing. To provide enhanced security in this way, biometric technologies are combined in such a way that different technologies measure the same portion of the body at the same time and are resistant to being defeated by using different samples or techniques to defeat the different sensors that are combined. When technologies are combined in a way that they view the same part of the body they are referred to as being “tightly coupled.”


There is accordingly a general need in the art for improved methods and systems for biometric sensing.


BRIEF SUMMARY OF THE INVENTION

Embodiments of the invention provide a multispectral sensor that comprises an illumination source and a digital imaging system. The illumination source is disposed to provide light at a plurality of wavelengths to an object. The digital imaging system is disposed to receive light scattered from the object and comprises a digital array of light detectors and a color filter array. The color filter array has a plurality of distributed filter elements, each of which is adapted to transmit light of one of a limited number of specified narrowband wavelength ranges. The color filter array is disposed to filter the light scattered from the object prior to encountering the digital array of light detectors.


The multispectral sensor may function as a biometric sensor when the object comprises a skin site of an individual, and may be configured to detect blanching or blood pooling at the skin site as part of a spoof detection. In some instances, the filter elements are distributed according to a Bayer pattern. In some embodiments, a first polarizer may be disposed to polarize the light provided by the illumination source, with the digital imaging system further comprising a second polarizer disposed to polarize the light scattered from the object. The first and second polarizers may advantageously be provided in a crossed configuration.


The multispectral sensor may be incorporated within a portable electronic device and have such functionality as an ability to read a bar code, an ability to scan printed matter, an ability to securely receive data related to functionality changes of the portable electronic device, and the like. In other instances, the multispectral sensor may be configured for use as a smart switch, configured for use as a pointing device, configured for use as a text entry device, configured for measuring an ambient light condition, and the like. In some embodiments, the multispectral sensor is integrated with a separate biometric sensor.





BRIEF DESCRIPTION OF THE DRAWINGS

A further understanding of the nature and advantages of the present invention may be realized by reference to the remaining portions of the specification and the drawings wherein like reference labels are used throughout the several drawings to refer to similar components. In some instances, reference labels include a numerical portion followed by a Latin-letter suffix; reference to only the numerical portion of reference labels is intended to refer collectively to all reference labels that have that numerical portion but different Latin-letter suffixes.



FIG. 1 provides a front view of a multispectral biometric sensor in one embodiment of the invention; and



FIG. 2 provides a front view of a multispectral biometric sensor shown in another embodiment.





DETAILED DESCRIPTION OF THE INVENTION

1. Overview


Embodiments of the invention provide methods and systems that allow for the collection and processing of biometric measurements. These biometric measurements may provide strong assurance of a person's identity, as well as of the authenticity of the biometric sample being taken, and may be incorporated within a number of different types of devices, such as cellular telephones, personal digital assistants, laptop computers, and other portable electronic devices. In some embodiments, a sensor provides light that penetrates the surface of an individual's skin, and scatters within the skin and/or underlying tissue. Skin sites applicable to multispectral imaging and biometric determination include all surfaces and all joints of the fingers and thumbs, the fingernails and nail beds, the palms, the backs of the hands, the wrists and forearms, the face, the eyes, the ears, and all other external surfaces of the body. While the discussion below sometimes makes reference to “fingers,” it should be understood that this refers merely to exemplary embodiments and that other embodiments may use skin sites at other body parts.


A portion of the light scattered by the skin and/or underlying tissue exits the skin and is used to form a multispectral image of the structure of the tissue at and below the surface of the skin. As used herein, the term “multispectral” is intended to be construed broadly as referring to methods and systems that use multiple wavelengths, and thus includes imaging systems that are “hyperspectral” or “ultraspectral” as those terms are understood by those of skill in the art. Because of the wavelength-dependent properties of the skin, the image formed from each wavelength of light is usually different from images formed at other wavelengths. Accordingly, embodiments of the invention collect images from each of the wavelengths of light in such a way that characteristic spectral and spatial information may be extracted by an algorithm applied to the resulting multispectral image data.


Embodiments of the invention provide for multispectral imaging of tissue using a digital imaging system. An illustration of a simplified arrangement is shown in FIG. 1, which shows one embodiment for a multispectral biometric sensor 101. The multispectral sensor 101 comprises one or more sources of light 103 and a digital imaging system 123. The number of illumination sources may conveniently be selected to achieve certain levels of illumination, to meet packaging requirements, and to meet other structural constraints of the multispectral biometric sensor 101. Illumination passes from the source 103 through illumination optics that shape the illumination to a desired form, such as in the form of flood light, light lines, light points, and the like. The illumination optics 105 are shown for convenience as consisting of a lens but may more generally include any combination of one or more lenses, one or more mirrors, and/or other optical elements. The illumination optics 105 may also comprise a scanner mechanism (not shown) to scan the illumination light in a specified one-dimensional or two-dimensional pattern. The light source 103 may comprise a point source, a line source, an area source, or may comprise a series of such sources in different embodiments.


After the light passes through the illumination optics 105, it passes through a platen 117 and illuminates the finger 119 or other skin site so that reflected light is directed to a digital imaging system 123. The digital imaging system 123 generally comprises a digital array 115 and detection optics 113 adapted to focus the light reflected from the object onto the array. For example, the detection optics 113 may comprise a lens, a mirror, a pinhole, combination of such optical elements, or other optical elements known to those of skill in the art. The digital imaging system 123 also comprises a color filter array 121, which may in some instances be incorporated as part of the digital array 115. The color filter array 121 may comprise a red-green-blue filter array in the well-known Bayer pattern or in other patterns. In some instances, the filter elements may function to transmit wavelengths that differ from the standard red-green-blue wavelengths, may include additional wavelengths, and/or may be arranged in a pattern that differs from the Bayer pattern. In instances where such a color filter array 121 is included, the illumination source(s) 103 may be a white-light or broadband source. Alternatively, the illumination source(s) 103 may comprise a plurality of narrowband sources, such as LEDs, with central wavelengths that are within the pass bands of filter elements comprised by the color filter array 121.
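
A minimal sketch of how per-band sub-images might be pulled from a Bayer-patterned raw frame follows; the RGGB layout, the band names, and the function name are illustrative assumptions rather than a prescribed implementation:

```python
import numpy as np

def split_bayer_channels(raw):
    """Separate a Bayer-patterned (RGGB assumed) raw frame into per-band sub-images.

    raw: 2D array from the digital array of light detectors, captured through
    the color filter array. Returns one half-resolution image per filter band.
    """
    r  = raw[0::2, 0::2]                       # red-filtered photosites
    g1 = raw[0::2, 1::2]                       # first set of green-filtered photosites
    g2 = raw[1::2, 0::2]                       # second set of green-filtered photosites
    b  = raw[1::2, 1::2]                       # blue-filtered photosites
    return {"red": r, "green": (g1 + g2) / 2.0, "blue": b}

# Example: a synthetic 4x4 frame stands in for real sensor data.
frame = np.arange(16, dtype=float).reshape(4, 4)
bands = split_bayer_channels(frame)
print({name: img.shape for name, img in bands.items()})
```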


The sensor layout and components may advantageously be selected to minimize the direct reflection of the illumination into the digital imaging system 123. In one embodiment, such direct reflections are reduced by relatively orienting the illumination and detection optics such that the amount of directly reflected light detected is minimized. For instance, optical axes of the illumination optics 105 and the detection optics 113 may be placed at angles such that a mirror placed on the platen 117 does not direct an appreciable amount of illumination light into the detection subsystem 123. In addition, the optical axes of the illumination and detection optics may be placed at angles relative to the platen 117 such that the angular acceptance of both subsystems is less than the critical angle of the system; such a configuration avoids appreciable effects due to total internal reflectance between the platen 117 and the skin site 119.
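
As a quick illustration of the critical-angle constraint discussed above, the following snippet computes the critical angle for an assumed glass platen (refractive index of roughly 1.5) against air; the numerical values are examples only:

```python
import math

def critical_angle_deg(n_platen, n_outside=1.0):
    """Critical angle (degrees from the surface normal) at the platen/outside interface.

    Rays inside the platen striking the surface beyond this angle undergo total
    internal reflectance when air fills the fingerprint valleys; keeping the
    angular acceptance of both subsystems below it avoids TIR effects.
    """
    return math.degrees(math.asin(n_outside / n_platen))

# Illustrative value for a glass-like platen against air.
print(round(critical_angle_deg(1.5), 1))  # ~41.8 degrees
```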


The specific characteristics of the optical components comprised by the multispectral sensor 101 may be implemented to configure the multispectral sensor 101 for different form factors. For example, in an embodiment where the multispectral sensor is implemented in the top of a gear shift as part of a system to verify the identity of a driver of a vehicle, the light sources 103 and digital array 115 might not fit within the gear-shift handle as constructed. In such an embodiment, an optical relay system may be implemented. For example, relay optics that comprise individual lenses similar to those in a bore scope may be used, or alternatively optical fibers such as used in orthoscopes may be used. Still other techniques for implementing an optical relay system will be evident to those of skill in the art. In this way, components of the sensor may be located remotely from the sampling surface.


The multispectral sensor may take multiple images and combine them for processing. For example, one image may be made with one or more illumination wavelengths present and be followed immediately by an image taken with no illumination turned on. The difference between these two images allows the effect of the sensor's illumination to be separated from ambient background illumination. The difference image may then be used for further processing according to other aspects of the invention.
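
A minimal sketch of this differencing step, assuming 8-bit frames captured with the illumination on and off, might look like the following:

```python
import numpy as np

def ambient_corrected(lit_frame, dark_frame):
    """Subtract a frame taken with the illumination off from one taken with it on,
    isolating the sensor's own illumination from the ambient background."""
    diff = lit_frame.astype(float) - dark_frame.astype(float)
    return np.clip(diff, 0, None)   # negative residue is treated as noise

# Illustrative 8-bit frames; real data would come from the digital array.
lit  = np.array([[120, 200], [90, 250]], dtype=np.uint8)
dark = np.array([[ 20,  30], [15,  40]], dtype=np.uint8)
print(ambient_corrected(lit, dark))
```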


In some embodiments, the multispectral sensor uses optical polarizers. One example of such an embodiment is provided in FIG. 2. The basic structure of the sensor 101′ is similar to that of FIG. 1, but polarizers 107 have been added to the illumination system(s) 121 and a polarizer 111 has been added to the digital imaging system. The polarizers 107 and 111 may be linear or circular, or a combination of the two. In the case of linear polarizers, one useful arrangement is that in which the illumination light is polarized along a particular axis while the detected light requires an orthogonal polarization. Such an arrangement has utility in ensuring that detected light has undergone multiple scatter events in a medium such as skin. Further utility is derived from the observation that such an arrangement greatly reduces the visibility of latent prints left on the platen 117 by previous users, thus providing improved image quality and reducing the likelihood of spoofing by “reactivating” the latent prints. The utility of the arrangement also extends to conventional optical fingerprint readers as well as multispectral imagers. In particular, dark-field optical fingerprint systems are well-suited for the addition of linear polarizers in such an arrangement. Further discussion of a multispectral finger-recognition sensor that uses such a crossed-polarizer arrangement is described in copending, commonly assigned U.S. Prov. Pat. Appl. No. 60/576,364, entitled “MULTISPECTRAL FINGER RECOGNITION,” filed Jun. 1, 2004 by Robert K. Rowe and Stephen P. Corcoran, the entire disclosure of which is incorporated herein by reference for all purposes.


2. Applications


In a number of specific embodiments, a multispectral imaging sensor may be incorporated in a cellular telephone, a personal digital assistant, a laptop computer, or other portable electronic device. Such a multispectral sensor may be configured to collect multispectral biometric data on a finger. The sensor may require that a person touch the sensor, or may be able to collect the necessary multispectral data in a noncontact fashion with appropriate images being collected while the skin is located at a distance from the sensor.


In some embodiments, the multispectral imaging sensor incorporated in a portable electronic device may contain an optical system to enable adjustable focus. The mechanism for adjusting the focus may include one or more lenses that may be moved into various positions. The focusing mechanism itself may be a conventional zoom arrangement. Alternatively, the mechanism for focusing may use a liquid lens based on the known phenomenon of electro-wetting.


In a system configuration in which the portable electronic device has been designed to accommodate a “close-up” or macro image of the finger for biometric sensing, the same optical system may be used to read an optical code such as a barcode. Such a barcode reading could, for example, initiate a service in which product information for a product corresponding to the UPC barcode is downloaded to the portable device to provide the consumer with comparative pricing and performance data. Similar barcode scans may be used in other embodiments for promotional games or various gaming activities. The conjunction of a barcode scan taken in close temporal proximity to a biometric scan could provide for an audit trail for legal matters, including financial documents and transactions, forensic chain-of-evidence scenarios, and a variety of logical and/or physical security applications.
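
As one illustrative sketch of such barcode reading, a decoder could be tried on each spectral channel in turn, starting with the band showing the most contrast; the use of the pyzbar library here is an assumption, and any barcode decoder could stand in for it:

```python
import numpy as np
from pyzbar.pyzbar import decode   # assumed decoder; any barcode library would do

def read_barcode_multispectral(channels):
    """Try to decode a barcode from each spectral channel and keep the first hit.

    channels: dict mapping band name -> 2D uint8 image. The highest-contrast
    band is tried first, since print/substrate contrast varies with wavelength.
    """
    ordered = sorted(channels.items(), key=lambda kv: kv[1].std(), reverse=True)
    for band, img in ordered:
        results = decode(img)
        if results:
            return band, results[0].data.decode("utf-8")
    return None, None
```

The decoded value could then key the product-information lookup or other service described above.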


An imaging system on a portable electronic device that is configured to collect multispectral biometric data may also be used to scan in text, graphics, or other printed matter. In the case of text, the scanned data may be converted to an interpretable form using known optical-character-recognition (“OCR”) techniques. Such text recognition may then be used to provide input for text-translation services, copying services, and other such services that may be aided by a rapid and convenient character input.
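
A minimal sketch of this scanning step, assuming the Tesseract engine via pytesseract as one possible OCR back end, might be:

```python
import pytesseract            # assumed OCR binding; any OCR engine could stand in
from PIL import Image

def scan_printed_text(image_path):
    """Convert a captured frame of printed matter into machine-readable text
    for translation, copying, or similar services."""
    return pytesseract.image_to_string(Image.open(image_path))
```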


An imaging system on a portable electronic device may also be used as an optical input device to provide a mechanism for securely inputting data into the device for functions such as reprogramming, security overrides, and secure digital communications. The illumination components of the imaging system may be used as optical output devices in the reverse direction from the detector elements. The use of multiple, filtered wavelengths can provide multiple high-bandwidth channels for rapid and/or robust optical communication.


The multispectral sensor may also be used as a smart switch to turn on or enable an associated device, system, or service. In such a capacity, the multispectral sensor may be set to a video-streaming mode to collect several frames per second. Each frame may then be analyzed to detect motion and, if motion is detected, perform image processing steps to confirm that the motion is due to a finger by analyzing the overall shape, the texture, and/or the spectral qualities relative to a living finger.
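
One simple way to sketch the motion test for such a smart-switch mode, with illustrative threshold values, is frame differencing:

```python
import numpy as np

def finger_motion_detected(prev_frame, cur_frame, motion_thresh=8.0, area_frac=0.05):
    """Crude motion test for a smart-switch mode running at a few frames per second.

    Flags motion when enough pixels change between consecutive frames; a real
    system would follow this with the shape, texture, and spectral checks
    described above before enabling the associated device.
    """
    delta = np.abs(cur_frame.astype(float) - prev_frame.astype(float))
    changed_fraction = (delta > motion_thresh).mean()
    return changed_fraction > area_frac
```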


The multispectral sensor may be used as a pointing device with similar functionality as a touchpad commonly used on a laptop PC. The multispectral sensor can be used in this fashion by monitoring the motion of the finger over the sensing area. Sliding the finger in a linear motion to the left can indicate a leftward motion to the PC (or cell phone, PDA, or other device), with similar effects for motions to the right, up, down, diagonal, or other directions. The cursor of the PC (or cell phone, PDA, or other device) may then be made to move in the indicated direction, or other appropriate action may be taken. In a similar fashion, the surface of the sensor may be tapped in different regions to simulate a click or double-click of a conventional PC mouse. Other motions, such as circles, X's, and the like, may be used to indicate other specific actions. In the case of touching or tapping the sensor, the degree of pressure may be estimated by evaluating the degree of blanching occurring in the finger. In this manner, different actions may be taken in response to a soft pressure being sensed relative to a hard pressure.
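
A rough sketch of such direction sensing, based on tracking the centroid of the finger-covered region between frames (a stand-in for a proper tracking algorithm), might be:

```python
import numpy as np

def swipe_direction(prev_frame, cur_frame):
    """Estimate the direction of a finger swipe from the shift of the bright
    (finger-covered) region's centroid between two frames.

    Returns 'left', 'right', 'up', 'down', or None when the finger barely moves.
    """
    def centroid(frame):
        mask = frame > frame.mean()            # rough finger segmentation
        ys, xs = np.nonzero(mask)
        return np.array([ys.mean(), xs.mean()]) if len(xs) else None

    c0, c1 = centroid(prev_frame), centroid(cur_frame)
    if c0 is None or c1 is None:
        return None
    dy, dx = c1 - c0
    if max(abs(dy), abs(dx)) < 1.0:
        return None
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```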


The spectral qualities of the finger in motion may be assessed to ensure that the detected motion is from that of a finger rather than some spurious object. In this way, false motions can be avoided.


The sensor surface may also be used as a simple text entry device. In a similar fashion as in the case of a pointing device, the user may make motions with the fingertip that describe single letters or numbers, which are then accumulated by the portable electronic device.


A particular motion of the finger may be used to increase the security of the sensing system. In such a configuration, the spectral and spatial qualities of the finger are confirmed to match those that are on record while the particular finger motion that is made is assessed to ensure it is similar to the motion on record. In this way, both the finger qualities and the motion need to match in order to determine an overall match.


The multispectral sensor may be used to measure the ambient light condition. In order to do so, an image is taken without any illumination light turned on at a time when a finger is not covering the sensor surface. The amount of ambient light may be determined from the image. Further details about ambient lighting may be derived in the case where the imager uses a color filter array or a similar mechanism to assess spectral characteristics of the light. The measured levels of ambient light may then be used by the associated device to set levels for display brightness, backlighting, etc. Such settings are particularly useful in ensuring the usability of portable electronic devices while conserving battery power.
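
A minimal sketch of such an ambient-light estimate, optionally broken down by the color-filter-array bands, could be:

```python
import numpy as np

def ambient_light_level(dark_illumination_frame, bands=None):
    """Estimate ambient light from a frame taken with the sensor's own
    illumination off and no finger present.

    If per-band images from the color filter array are supplied, a coarse
    spectral breakdown is returned as well; the overall level can then drive
    display-brightness and backlight settings.
    """
    level = float(np.mean(dark_illumination_frame))
    spectrum = {name: float(np.mean(img)) for name, img in (bands or {}).items()}
    return level, spectrum
```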


3. Combinations of Multispectral Sensing with Other Biometric Sensors


A small and rugged embodiment of a multispectral sensor may be constructed from solid-state components such as silicon digital imaging arrays and light-emitting diodes. Such a sensor may be integrated into a conventional fingerprint sensor to provide a second biometric reading when a fingerprint is taken. The conventional fingerprint sensor may be an optical fingerprint sensor. The multispectral sensor may use one or more illumination wavelengths to sense optical characteristics of the skin, including the presence, degree, and/or distribution of blood in the finger or other body part. The illumination wavelength(s) may include one or more wavelengths shorter than approximately 600 nm, where blood is known to become highly optically absorbing and thus discernible from other tissue components.


The images taken by the multispectral sensor prior to the finger touching the sensor may be used in whole or in part to perform a biometric assessment of the person's identity. In the case of a fingerprint, the individual ridge lines may be identified and tracked through a series of images to quantify the degree and type of distortion that such ridge images undergo when pressure is applied to the sensor by the finger.


The fingerprint pattern observed by the multispectral imager using one or more illumination wavelengths may be combined with the TIR pattern to provide a combinatoric biometric. The multispectral image may contain information on the external friction ridge pattern, the internal friction ridge pattern, the composition and position of other subsurface structures, the spectral qualities of the finger, the size and shape of the finger, and other features that are somewhat distinct from person to person. In this way, one or more multispectral features may be combined with the optical fingerprint data to provide additional biometric information.


In some cases, the multispectral imaging data may be processed to improve the quality of the TIR fingerprint. In particular, a linear or nonlinear numerical relationship may be established on parts of the image where both the multispectral image data and the TIR data are well defined. These parts are then used to establish a mathematical model, such as with Principal Component Regression, Partial Least Squares, Neural Networks, or other methods known to those familiar with the art. The parts of the TIR image that are missing due to poor contact, etc. can thus be estimated from the model so established. In another embodiment, the entire images may be used, but the numerical model is built using robust statistics in which the relationship is relatively unaffected by missing or degraded portions of the TIR image. Alternatively, numerical models may be established through the examination of previously collected TIR/multispectral image sets and then applied to new data.
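
As an illustration of the simplest, linear case, an ordinary least-squares fit (standing in for the Principal Component Regression, Partial Least Squares, or neural-network models mentioned above) might be built and applied as follows:

```python
import numpy as np

def fit_tir_model(ms_pixels, tir_pixels):
    """Least-squares linear model mapping multispectral pixel vectors to TIR values,
    fitted only on regions where both images are well defined.

    ms_pixels:  (N, B) array, one row per pixel, one column per band.
    tir_pixels: (N,) array of corresponding TIR intensities.
    """
    X = np.hstack([ms_pixels, np.ones((ms_pixels.shape[0], 1))])  # add offset term
    coeffs, *_ = np.linalg.lstsq(X, tir_pixels, rcond=None)
    return coeffs

def estimate_missing_tir(ms_pixels, coeffs):
    """Predict TIR values for pixels where platen contact was poor, from the model."""
    X = np.hstack([ms_pixels, np.ones((ms_pixels.shape[0], 1))])
    return X @ coeffs
```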


4. Spoof Detection


The multispectral sensor may be used to make a determination about the authenticity of the sample and thereby detect attempts to spoof the optical fingerprint sensor. The multispectral sensor may be able to make a static spectral reading of the sample either when it touches the sensor surface or at a remote distance to ensure that the spectral qualities match those of a living finger.


The multispectral sensor may also use one or more illumination wavelengths to illuminate the finger as it moves to touch the sensor surface. During this interval of time, blanching of the skin may be observed in the vicinity of the sensor as pressure is applied by the finger. As well, areas of the skin may show a distinct pooling of blood, especially those regions at the perimeter of the area of contact between the finger and sensor. This blanching and/or pooling of the skin provides an identifiable set of changes to the corresponding images. In particular, wavelengths less than approximately 600 nm, which are highly absorbed by the blood, are seen to get brighter in the region of blanching and darker in areas of blood pooling. Wavelengths longer than approximately 600 nm are seen to change much less during blanching and/or pooling. The presence, magnitude, and/or relative amounts of spectral changes that occur while the finger touches the fingerprint sensor can be used as an additional means of discriminating between genuine measurements and attempts to spoof the sensor.
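
A rough sketch of how such a blanching/pooling check might be scored, with the short- and long-wavelength band names chosen purely for illustration, is:

```python
import numpy as np

def blanching_metric(before, after, short_band="green", long_band="red"):
    """Compare how a short-wavelength band (<~600 nm, strongly absorbed by blood)
    and a long-wavelength band change as the finger presses onto the sensor.

    before/after: dicts of band name -> 2D image taken before and during contact.
    A genuine finger should show a much larger relative change in the short band
    (brighter where blanched, darker where blood pools).
    """
    def rel_change(band):
        b = before[band].astype(float)
        a = after[band].astype(float)
        return float(np.mean(np.abs(a - b)) / (np.mean(b) + 1e-9))

    return rel_change(short_band) - rel_change(long_band)
```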


In the case where the multispectral sensor is combined with an optical TIR fingerprint reader, the pattern detected by the multispectral sensor using one or more illumination wavelengths may be compared with the pattern detected by the fingerprint sensor and consistency confirmed. In this way, the internal fingerprint data due to blood and other subsurface structures is used to confirm the image of the external fingerprint that the conventional fingerprint sensor collects. If there is a discrepancy between the two patterns, an attempt to spoof the fingerprint sensor using a thin, transparent film placed on the finger may be indicated. Appropriate action may be taken in response to this discrepancy to ensure that such a spoof attempt is not being perpetrated.
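
One simple consistency score between the two patterns is a normalized correlation, sketched below with the registration and thresholding details left out:

```python
import numpy as np

def pattern_consistency(ms_ridge_image, tir_ridge_image):
    """Normalized correlation between the internally derived ridge pattern and the
    conventional TIR fingerprint image, used here as a consistency score.

    A low score may indicate a thin transparent film or other spoof; the decision
    threshold and any spatial registration step are application-specific.
    """
    a = ms_ridge_image.astype(float).ravel()
    b = tir_ridge_image.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0
```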


Other factors that can be monitored to discriminate between a genuine finger and attempts to spoof the detector using an artificial or altered sample of some kind include monitoring the image taken with one or more wavelengths over time. During the specified time interval, changes such as those due to the pulse can be measured and used to confirm a genuine finger. As well, changes in the image due to sweating at the ridge pores can be observed and used for spoof detection.
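
A minimal sketch of such a pulse check, treating the per-frame mean intensity as a time series and looking for a dominant peak in the heart-rate band, might be:

```python
import numpy as np

def has_pulse(mean_intensities, frame_rate_hz, band=(0.7, 3.0), snr_thresh=3.0):
    """Look for a pulse-like periodicity in the mean image intensity over time.

    mean_intensities: per-frame mean of an image region, sampled at frame_rate_hz.
    A dominant spectral peak between ~0.7 and 3 Hz (roughly 40-180 beats/min) is
    taken as evidence of a living finger.
    """
    x = np.asarray(mean_intensities, dtype=float)
    x -= x.mean()
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / frame_rate_hz)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    if not in_band.any():
        return False
    peak = spectrum[in_band].max()
    noise = np.median(spectrum[1:]) + 1e-9   # skip the DC bin
    return bool(peak / noise > snr_thresh)
```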


Thus, having described several embodiments, it will be recognized by those of skill in the art that various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the invention. Accordingly, the above description should not be taken as limiting the scope of the invention, which is defined in the following claims.

Claims
  • 1. A sensor comprising: an illumination source disposed to provide light to an object; and an imaging system disposed to receive light scattered from the object substantially without total internal reflectance (TIR) effects, wherein: the imaging system is configured to perform a biometric analysis of the object when the object comprises a skin site of an individual; and the imaging system is configured to read a barcode from the object when the object comprises the barcode.
  • 2. The sensor recited in claim 1 wherein: the illumination source provides light to the object at a plurality of wavelengths; and the biometric analysis comprises a multispectral biometric analysis.
  • 3. The sensor recited in claim 2 wherein the imaging system comprises: an array of light detectors; and a color filter array having a plurality of distributed filter elements, each filter element being adapted to transmit light of one of a limited number of specified narrowband wavelength ranges, the color filter array being disposed to filter the light scattered from the object prior to encountering the array of light detectors.
  • 4. The sensor recited in claim 3 wherein the filter elements are distributed according to a Bayer pattern.
  • 5. The sensor recited in claim 1 further comprising: a first polarizer disposed to polarize the light provided by the illumination source; and a second polarizer disposed to polarize the light scattered from the object.
  • 6. The sensor recited in claim 1 wherein the first and second polarizers are in a substantially crossed configuration.
  • 7. The sensor recited in claim 1 wherein the imaging system is further configured to initiate a service in response to reading the barcode.
  • 8. The sensor recited in claim 1 wherein the imaging system is further configured to generate an audit trail for the object in response to reading the barcode.
  • 9. The sensor recited in claim 1 wherein the imaging system is configured to scan printed matter from the object when the object comprises the printed matter.
  • 10. The sensor recited in claim 1 wherein: the printed matter comprises text; andthe imaging system is further configured to convert the scanned printed matter to an interpretable form using optical-character-recognition techniques.
  • 11. The sensor recited in claim 1 wherein: the illumination source is comprised by an illumination system; and the illumination system is configured to generate an image in response to receiving light scattered from the object.
  • 12. The sensor recited in claim 1 wherein the imaging system comprises a focusing mechanism providing an adjustable focus to image the object.
  • 13. A portable electronic device comprising the sensor recited in claim 1.
  • 14. The portable electronic device recited in claim 13 wherein the portable electronic device comprises a cellular telephone.
  • 15. A method comprising: illuminating an object with light, the object comprising a barcode; receiving light scattered from the object substantially without total internal reflectance (TIR) effects; reading the barcode from the object with the received light scattered from the object; illuminating a skin site of an individual with light; receiving light scattered from the skin site; and performing a biometric analysis of the skin site from the received light scattered from the skin site.
  • 16. The method recited in claim 15 wherein: illuminating the skin site of the individual comprises illuminating the skin site of the individual at a plurality of wavelengths; and performing the biometric analysis comprises performing a multispectral biometric analysis of the skin site.
  • 17. The method recited in claim 15 further comprising initiating a service in response to reading the barcode.
  • 18. The method recited in claim 15 wherein illuminating the skin of the individual and illuminating the object with light are performed in close temporal proximity.
  • 19. The method recited in claim 15 further comprising generating an audit trail for the object in response to reading the barcode.
  • 20. The method recited in claim 15 further comprising: illuminating a second object with light, the second object comprising printed matter; receiving light scattered from the second object; and generating an image of the printed matter from the second object with the received light scattered from the second object.
  • 21. The method recited in claim 20 wherein the printed matter comprises text, the method further comprising converting the image of the printed matter to an interpretable form using optical-character-recognition techniques.
CROSS-REFERENCES TO RELATED APPLICATIONS

This application is a continuation-in-part of U.S. patent application Ser. No. 12/815,196, entitled “WHITE-LIGHT SPECTRAL BIOMETRIC SENSORS,” filed Jun. 14, 2010 by Robert K. Rowe et al., which is a continuation of U.S. patent application Ser. No. 11/458,607, entitled “WHITE-LIGHT SPECTRAL BIOMETRIC SENSORS,” filed Jul. 19, 2006 by Robert K. Rowe et al. (“the '607 application”; U.S. Pat. No. 7,751,594). The '607 application is a continuation-in-part of U.S. patent application Ser. No. 11/115,100, entitled “MULTISPECTRAL IMAGING BIOMETRICS,” filed Apr. 25, 2005 by Robert K. Rowe (U.S. Pat. No. 7,460,696), which claims the benefit of the filing date of U.S. Prov. Pat. Appl. No. 60/600,687, entitled “MULTISPECTRAL IMAGING BIOMETRIC,” filed Aug. 11, 2004. The '607 application is also a continuation-in-part of U.S. patent application Ser. No. 11/115,101, entitled “MULTISPECTRAL IMAGING BIOMETRIC,” filed Apr. 25, 2005 by Robert K. Rowe (U.S. Pat. No. 7,394,919), which claims the benefit of the filing date of U.S. Prov. Pat. Appl. No. 60/600,687, entitled “MULTISPECTRAL IMAGING BIOMETRIC,” filed Aug. 11, 2004. The '607 application is also a continuation-in-part of U.S. patent application Ser. No. 11/115,075, entitled “MULTISPECTRAL LIVENESS DETERMINATION,” filed Apr. 25, 2005 by Robert K. Rowe (U.S. Pat. No. 7,539,330), which claims the benefit of the filing date of U.S. Prov. Pat. Appl. No. 60/600,687, entitled “MULTISPECTRAL IMAGING BIOMETRIC,” filed Aug. 11, 2004. The entire disclosure of each of the aforementioned patent applications is hereby incorporated by reference for all purposes.

STATEMENT AS TO RIGHTS TO INVENTIONS MADE UNDER FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

The Government of the United States may have rights in this invention.

US Referenced Citations (356)
Number Name Date Kind
3508830 Hopkins et al. Apr 1970 A
3619060 Johnson Nov 1971 A
3854319 Burroughs et al. Dec 1974 A
3872443 Ott Mar 1975 A
3910701 Henderson et al. Oct 1975 A
RE29008 Ott Oct 1976 E
4035083 Woodriff et al. Jul 1977 A
4142797 Astheimer Mar 1979 A
4169676 Kaiser Oct 1979 A
4170987 Anselmo et al. Oct 1979 A
4260220 Whitehead Apr 1981 A
4322163 Schiller Mar 1982 A
4427889 Muller Jan 1984 A
4537484 Fowler et al. Aug 1985 A
4598715 Machler et al. Jul 1986 A
4653880 Sting et al. Mar 1987 A
4654530 Dybwad Mar 1987 A
4655225 Dahne et al. Apr 1987 A
4656562 Sugino Apr 1987 A
4657397 Oehler et al. Apr 1987 A
4661706 Messerschmidt et al. Apr 1987 A
4684255 Ford Aug 1987 A
4699149 Rice Oct 1987 A
4712912 Messerschmidt Dec 1987 A
4730882 Messerschmidt Mar 1988 A
4747147 Sparrow May 1988 A
4787013 Sugino et al. Nov 1988 A
4787708 Whitehead Nov 1988 A
4830496 Young May 1989 A
4853542 Milosevic et al. Aug 1989 A
4857735 Noller Aug 1989 A
4859064 Messerschmidt et al. Aug 1989 A
4866644 Shenk et al. Sep 1989 A
4867557 Takatani et al. Sep 1989 A
4882492 Schlager Nov 1989 A
4883953 Koashi et al. Nov 1989 A
4936680 Henkes et al. Jun 1990 A
4937764 Komatsu et al. Jun 1990 A
4944021 Hoshino et al. Jul 1990 A
4975581 Robinson et al. Dec 1990 A
5015100 Doyle May 1991 A
5019715 Sting et al. May 1991 A
5028787 Rosenthal et al. Jul 1991 A
5051602 Sting et al. Sep 1991 A
5055658 Cockburn Oct 1991 A
5068536 Rosenthal Nov 1991 A
5070874 Barnes et al. Dec 1991 A
5077803 Kato et al. Dec 1991 A
5088817 Igaki et al. Feb 1992 A
5109428 Igaki et al. Apr 1992 A
5146102 Higuchi et al. Sep 1992 A
5158082 Jones Oct 1992 A
5163094 Prokoski et al. Nov 1992 A
5177802 Fujimoto et al. Jan 1993 A
5178142 Harjunmaa et al. Jan 1993 A
5179951 Knudson Jan 1993 A
5204532 Rosenthal Apr 1993 A
5222495 Clarke et al. Jun 1993 A
5222496 Clarke et al. Jun 1993 A
5223715 Taylor Jun 1993 A
5225678 Messerschmidt Jul 1993 A
5230702 Lindsay et al. Jul 1993 A
5237178 Rosenthal et al. Aug 1993 A
5243546 Maggard Sep 1993 A
5257086 Fateley et al. Oct 1993 A
5258922 Grill Nov 1993 A
5267152 Yang et al. Nov 1993 A
5268749 Weber et al. Dec 1993 A
5291560 Daugman Mar 1994 A
5299570 Hatschek Apr 1994 A
5303026 Strobl et al. Apr 1994 A
5311021 Messerschmidt May 1994 A
5313941 Braig et al. May 1994 A
5321265 Block Jun 1994 A
5331958 Oppenheimer Jul 1994 A
5335288 Faulkner Aug 1994 A
5348003 Caro Sep 1994 A
5351686 Steuer et al. Oct 1994 A
5355880 Thomas et al. Oct 1994 A
5360004 Purdy et al. Nov 1994 A
5361758 Hall et al. Nov 1994 A
5366903 Lundsgaard et al. Nov 1994 A
5372135 Mendelson et al. Dec 1994 A
5379764 Barnes et al. Jan 1995 A
5402778 Chance Apr 1995 A
5405315 Khuri et al. Apr 1995 A
5413096 Hart May 1995 A
5413098 Benaron May 1995 A
5419321 Evans May 1995 A
5435309 Thomas et al. Jul 1995 A
5441053 Lodder et al. Aug 1995 A
5452723 Wu et al. Sep 1995 A
5459317 Small et al. Oct 1995 A
5459677 Kowalski et al. Oct 1995 A
5460177 Purdy et al. Oct 1995 A
5483335 Tobias Jan 1996 A
5494032 Robinson et al. Feb 1996 A
5505726 Meserol Apr 1996 A
5507723 Keshaviah Apr 1996 A
5515847 Braig et al. May 1996 A
5518623 Keshaviah et al. May 1996 A
5523054 Switalski et al. Jun 1996 A
5533509 Koashi et al. Jul 1996 A
5537208 Bertram et al. Jul 1996 A
5539207 Wong Jul 1996 A
5552997 Massart Sep 1996 A
5559504 Itsumi et al. Sep 1996 A
5568251 Davies et al. Oct 1996 A
5596992 Haaland et al. Jan 1997 A
5606164 Price et al. Feb 1997 A
5613014 Eshera et al. Mar 1997 A
5630413 Thomas et al. May 1997 A
5636633 Messerschmidt et al. Jun 1997 A
5655530 Messerschmidt Aug 1997 A
5672864 Kaplan Sep 1997 A
5672875 Block et al. Sep 1997 A
5677762 Ortyn et al. Oct 1997 A
5681273 Brown Oct 1997 A
5708593 Saby et al. Jan 1998 A
5719399 Alfano et al. Feb 1998 A
5719950 Osten et al. Feb 1998 A
5724268 Sodickson et al. Mar 1998 A
5729619 Puma Mar 1998 A
5737439 Lapsley et al. Apr 1998 A
5743262 Lepper, Jr. et al. Apr 1998 A
5747806 Khalil et al. May 1998 A
5750994 Schlager May 1998 A
5751835 Topping et al. May 1998 A
5751836 Wildes et al. May 1998 A
5761330 Stoianov et al. Jun 1998 A
5782755 Chance et al. Jul 1998 A
5792050 Alam et al. Aug 1998 A
5792053 Skladnev et al. Aug 1998 A
5793881 Stiver et al. Aug 1998 A
5796858 Zhou et al. Aug 1998 A
5808739 Turner et al. Sep 1998 A
5818048 Sodickson et al. Oct 1998 A
5823951 Messerschmidt Oct 1998 A
5828066 Messerschmidt Oct 1998 A
5830132 Robinson Nov 1998 A
5830133 Osten et al. Nov 1998 A
5850623 Carman, Jr. et al. Dec 1998 A
5853370 Chance et al. Dec 1998 A
5857462 Thomas et al. Jan 1999 A
5859420 Borza Jan 1999 A
5860421 Eppstein et al. Jan 1999 A
5867265 Thomas Feb 1999 A
5886347 Inoue et al. Mar 1999 A
5902033 Levis et al. May 1999 A
5914780 Turner et al. Jun 1999 A
5929443 Alfano et al. Jul 1999 A
5933792 Andersen et al. Aug 1999 A
5935062 Messerschmidt et al. Aug 1999 A
5945676 Khalil et al. Aug 1999 A
5949543 Bleier et al. Sep 1999 A
5957841 Maruo et al. Sep 1999 A
5961449 Toida et al. Oct 1999 A
5963319 Jarvis et al. Oct 1999 A
5978495 Thomopoulos et al. Nov 1999 A
5987346 Benaron et al. Nov 1999 A
5999637 Toyoda et al. Dec 1999 A
6005722 Butterworth et al. Dec 1999 A
6016435 Maruo et al. Jan 2000 A
6025597 Sterling et al. Feb 2000 A
6026314 Amerov et al. Feb 2000 A
6028773 Hundt Feb 2000 A
6031609 Funk et al. Feb 2000 A
6034370 Messerschmidt Mar 2000 A
6040578 Malin et al. Mar 2000 A
6041247 Weckstrom et al. Mar 2000 A
6041410 Hsu et al. Mar 2000 A
6043492 Lee et al. Mar 2000 A
6044285 Chaiken et al. Mar 2000 A
6045502 Eppstein et al. Apr 2000 A
6046808 Fateley Apr 2000 A
6049727 Crothall Apr 2000 A
6056738 Marchitto et al. May 2000 A
6057925 Anthon May 2000 A
6061581 Alam et al. May 2000 A
6061582 Small et al. May 2000 A
6066847 Rosenthal May 2000 A
6069689 Zeng et al. May 2000 A
6070093 Oosta et al. May 2000 A
6073037 Alam et al. Jun 2000 A
6081612 Gutkowicz-Krusin et al. Jun 2000 A
6088605 Griffith et al. Jul 2000 A
6088607 Diab et al. Jul 2000 A
6097035 Belongie et al. Aug 2000 A
6100811 Hsu et al. Aug 2000 A
6115484 Bowker et al. Sep 2000 A
6115673 Malin et al. Sep 2000 A
6122042 Wunderman et al. Sep 2000 A
6122394 Neukermans et al. Sep 2000 A
6122737 Bjorn et al. Sep 2000 A
6125192 Bjorn et al. Sep 2000 A
6141101 Bleier et al. Oct 2000 A
6147749 Kubo et al. Nov 2000 A
6148094 Kinsella Nov 2000 A
6152876 Robinson et al. Nov 2000 A
6154658 Caci Nov 2000 A
6157041 Thomas et al. Dec 2000 A
6159147 Lichter et al. Dec 2000 A
6172743 Kley et al. Jan 2001 B1
6175407 Sartor Jan 2001 B1
6181414 Raz et al. Jan 2001 B1
6181958 Steuer et al. Jan 2001 B1
6188781 Brownlee Feb 2001 B1
6193153 Lambert Feb 2001 B1
6208749 Gutkowicz-Krusin et al. Mar 2001 B1
6212424 Robinson Apr 2001 B1
6226541 Eppstein et al. May 2001 B1
6229908 Edmonds, III et al. May 2001 B1
6230034 Messerschmidt et al. May 2001 B1
6236047 Malin et al. May 2001 B1
6240306 Rohrscheib et al. May 2001 B1
6240309 Yamashita et al. May 2001 B1
6241663 Wu et al. Jun 2001 B1
6256523 Diab et al. Jul 2001 B1
6272367 Chance Aug 2001 B1
6280381 Malin et al. Aug 2001 B1
6282303 Brownlee Aug 2001 B1
6285895 Ristolainen et al. Sep 2001 B1
6292576 Brownlee Sep 2001 B1
6301375 Choi Oct 2001 B1
6301815 Sliwa Oct 2001 B1
6304767 Soller et al. Oct 2001 B1
6307633 Mandella et al. Oct 2001 B1
6309884 Cooper et al. Oct 2001 B1
6317507 Dolfing Nov 2001 B1
6324310 Brownlee Nov 2001 B1
6330346 Peterson et al. Dec 2001 B1
6404904 Einighammer et al. Jun 2002 B1
6419361 Cabib et al. Jul 2002 B2
6483929 Murakami et al. Nov 2002 B1
6504614 Messerschmidt et al. Jan 2003 B1
6537225 Mills Mar 2003 B1
6560352 Rowe et al. May 2003 B2
6574490 Abbink et al. Jun 2003 B2
6597945 Marksteiner Jul 2003 B2
6606509 Schmitt Aug 2003 B2
6628809 Rowe et al. Sep 2003 B1
6631199 Topping et al. Oct 2003 B1
6741729 Bjorn et al. May 2004 B2
6749115 Gressel et al. Jun 2004 B2
6799275 Bjorn Sep 2004 B1
6799726 Stockhammer Oct 2004 B2
6816605 Rowe et al. Nov 2004 B2
6825930 Cronin et al. Nov 2004 B2
6853444 Haddad Feb 2005 B2
6898299 Brooks May 2005 B1
6928181 Brooks Aug 2005 B2
6937885 Lewis et al. Aug 2005 B1
6958194 Hopper et al. Oct 2005 B1
6995384 Lee et al. Feb 2006 B2
7047419 Black May 2006 B2
7084415 Iwai Aug 2006 B2
7147153 Rowe et al. Dec 2006 B2
7254255 Dennis Aug 2007 B2
7263213 Rowe Aug 2007 B2
7287013 Schneider et al. Oct 2007 B2
7347365 Rowe Mar 2008 B2
7366331 Higuchi Apr 2008 B2
7386152 Rowe et al. Jun 2008 B2
7394919 Rowe et al. Jul 2008 B2
7397943 Merbach et al. Jul 2008 B2
7440597 Rowe Oct 2008 B2
7460696 Rowe Dec 2008 B2
7508965 Rowe et al. Mar 2009 B2
7515252 Hernandez Apr 2009 B2
7539330 Rowe May 2009 B2
7545963 Rowe Jun 2009 B2
7627151 Rowe Dec 2009 B2
7668350 Rowe Feb 2010 B2
7735729 Rowe Jun 2010 B2
7751594 Rowe et al. Jul 2010 B2
7801338 Rowe Sep 2010 B2
7801339 Sidlauskas et al. Sep 2010 B2
7804984 Sidlauskas et al. Sep 2010 B2
7819311 Rowe et al. Oct 2010 B2
7831072 Rowe Nov 2010 B2
7835554 Rowe Nov 2010 B2
7899217 Uludag et al. Mar 2011 B2
7995808 Rowe et al. Aug 2011 B2
20020009213 Rowe et al. Jan 2002 A1
20020065468 Utzinger et al. May 2002 A1
20020101566 Elsner et al. Aug 2002 A1
20020111546 Cook et al. Aug 2002 A1
20020138768 Murakami et al. Sep 2002 A1
20020171834 Rowe et al. Nov 2002 A1
20020183624 Rowe et al. Dec 2002 A1
20030025897 Iwai Feb 2003 A1
20030044051 Fujieda Mar 2003 A1
20030078504 Rowe Apr 2003 A1
20030095525 Lavin et al. May 2003 A1
20030128867 Bennett Jul 2003 A1
20030163710 Ortiz et al. Aug 2003 A1
20030223621 Rowe et al. Dec 2003 A1
20040003295 Elderfield et al. Jan 2004 A1
20040008875 Linares Jan 2004 A1
20040022421 Endoh et al. Feb 2004 A1
20040042642 Bolle et al. Mar 2004 A1
20040047493 Rowe et al. Mar 2004 A1
20040068394 Maekawa et al. Apr 2004 A1
20040114783 Spycher et al. Jun 2004 A1
20040120553 Stobbe Jun 2004 A1
20040125994 Engels et al. Jul 2004 A1
20040179722 Moritoki et al. Sep 2004 A1
20040240712 Rowe et al. Dec 2004 A1
20040240713 Hata Dec 2004 A1
20040264742 Zhang et al. Dec 2004 A1
20050007582 Villers et al. Jan 2005 A1
20050125339 Tidwell et al. Jun 2005 A1
20050169504 Black Aug 2005 A1
20050180620 Takiguchi Aug 2005 A1
20050185847 Rowe Aug 2005 A1
20050205667 Rowe Sep 2005 A1
20050265585 Rowe Dec 2005 A1
20050265586 Rowe et al. Dec 2005 A1
20050265607 Chang Dec 2005 A1
20050271258 Rowe Dec 2005 A1
20060002597 Rowe Jan 2006 A1
20060002598 Rowe et al. Jan 2006 A1
20060045330 Marion Mar 2006 A1
20060062438 Rowe Mar 2006 A1
20060110015 Rowe May 2006 A1
20060115128 Mainguet Jun 2006 A1
20060171571 Chan et al. Aug 2006 A1
20060173256 Ridder et al. Aug 2006 A1
20060202028 Rowe et al. Sep 2006 A1
20060210120 Rowe et al. Sep 2006 A1
20060244947 Rowe Nov 2006 A1
20060274921 Rowe Dec 2006 A1
20070014437 Sato Jan 2007 A1
20070030475 Rowe et al. Feb 2007 A1
20070052827 Hiltunen Mar 2007 A1
20070116331 Rowe et al. May 2007 A1
20070153258 Hernandez Jul 2007 A1
20070165903 Munro et al. Jul 2007 A1
20080008359 Beenau et al. Jan 2008 A1
20080013806 Hamid Jan 2008 A1
20080025579 Sidlauskas et al. Jan 2008 A1
20080025580 Sidlauskas et al. Jan 2008 A1
20080192988 Uludag et al. Aug 2008 A1
20080232653 Rowe Sep 2008 A1
20080260211 Bennett et al. Oct 2008 A1
20080298649 Ennis et al. Dec 2008 A1
20090046903 Corcoran et al. Feb 2009 A1
20090080709 Rowe et al. Mar 2009 A1
20090092290 Rowe Apr 2009 A1
20090148005 Rowe Jun 2009 A1
20090245591 Rowe et al. Oct 2009 A1
20100067748 Rowe Mar 2010 A1
20100246902 Rowe et al. Sep 2010 A1
20110085708 Martin et al. Apr 2011 A1
20110211055 Martin et al. Sep 2011 A1
20110235872 Rowe et al. Sep 2011 A1
Foreign Referenced Citations (66)
Number Date Country
130771 Aug 2001 CN
1402183 Mar 2003 CN
1509454 Jun 2004 CN
10153808 May 2003 DE
0280418 Aug 1988 EP
0317121 May 1989 EP
0372748 Jun 1990 EP
0426358 May 1991 EP
0449335 Oct 1991 EP
0573137 Dec 1993 EP
0631137 Dec 1994 EP
0670143 Sep 1995 EP
0681166 Nov 1995 EP
0757243 Feb 1997 EP
0788000 Aug 1997 EP
0801297 Oct 1997 EP
0836083 Apr 1998 EP
0843986 May 1998 EP
0869348 Oct 1998 EP
0897164 Feb 1999 EP
0897691 Feb 1999 EP
0924656 Jun 1999 EP
0982583 Mar 2000 EP
0990945 Apr 2000 EP
1353292 Oct 2003 EP
1434162 Jun 2004 EP
2761180 Sep 1998 FR
61182174 Aug 1986 JP
03-016160 Jan 1991 JP
7075629 Mar 1995 JP
10-127585 May 1998 JP
2001033381 Feb 2001 JP
2001-112742 Apr 2001 JP
2001-184490 Jul 2001 JP
2002-133402 May 2002 JP
2002-517835 Jun 2002 JP
2003-050993 Feb 2003 JP
2003-511101 Mar 2003 JP
2003-308520 Oct 2003 JP
9200513 Jan 1992 WO
9217765 Oct 1992 WO
9300855 Jan 1993 WO
9307801 Apr 1993 WO
9927848 Jun 1999 WO
0030530 Jun 2000 WO
0046739 Aug 2000 WO
0115596 Mar 2001 WO
0118332 Mar 2001 WO
0120538 Mar 2001 WO
0127882 Apr 2001 WO
0152180 Jul 2001 WO
0152726 Jul 2001 WO
0153805 Jul 2001 WO
0165471 Sep 2001 WO
0169520 Sep 2001 WO
02054337 Jul 2002 WO
02084605 Oct 2002 WO
02099393 Dec 2002 WO
03010510 Feb 2003 WO
03096272 Nov 2003 WO
2004068388 Aug 2004 WO
2004068394 Aug 2004 WO
2004090786 Oct 2004 WO
2006049394 May 2006 WO
2006077446 Jul 2006 WO
2006093508 Sep 2006 WO
Non-Patent Literature Citations (49)
Entry
Rowe. “LumiGuard: A Novel Spectroscopic Sensor for Biometric Security Applications”, American Chemical Society 225th National Meeting, Mar. 25, 2003, 20 pages.
Rowe, R.K. Nixon, K.A. and Butler, P.W., “Multispectral Fingerprint Image Acquisition”, Advances in Biometrics, SpringerLink Oct. 6, 2006 [retrieved Jun. 22, 2010] Retrieved from the internet. <URL: http://www.lumidigm.com/download/Multispectral-Fingerprint-Image-Acquisition.pdf.> Entire document, especially: p. 4, para 2; p. 6, para 2 and Fig. 2; p. 7, para 3 and Fig. 3; p. 8, para 2; p. 9, para 5 and Fig 4; p. 10, para 4; p. 18 para 1.
Anderson, C.E. et al. “Fundamentals of Calibration Transfer Through Procrustes Analysis,” Appln, Spectros., vol. 53, No. 10 (1999) p. 1268-1276.
Ashbourn, Julian, Biometrics; Advanced Identity Verification, Springer, 2000, pp. 63-64.
Bantle, John P. et al., “Glucose Measurement in Patients With Diabetes Mellitus With Dermal Interstitial Fluid,” Mosby-Year Book, Inc., 9 pages, 1997.
Berkoben, Michael S. et al., “Vascular Access for Hemodialysis,” Clinical Dialysis, Third Edition, pp. 2 cover pages and 26-45, 1995.
Blank, T.B. et al., “Transfer of Near-Infrared Multivariate Calibrations Without Standards,” Anal. Chem., vol. 68 (1996) p. 2987.
Bleyer, Anthony J. et al., “The Costs of Hospitalizations Due to Hemodialysis Access Management,” Nephrology News & Issues, pp. 19, 20 and 22, Jan. 1995.
Brasunas, John C. et al., “Uniform Time-Sampling Fourier Transform Spectroscopy,” Applied Optics, vol. 36, No. 10, Apr. 1, 1997, pp. 2206-2222.
Brault, James W., “New Approach to High-Precision Fourier Transform Spectrometer Design,” Applied Optics, vol. 35, No. 16, Jun. 1, 1996, pp. 2891-2896.
Brochure entitled “Improve the Clinical Outcome of Every Patient”, In Line Diagnostics, published on or before Oct. 30, 1997, 2 pages.
Cassarly, W.J. et al. “Distributed Lighting Systems: Uniform Light Delivery,” Source Unknown, pp. 1698-1702, 1996.
Chang, Chong-Min et al., “A Uniform Rectangular Illuminating Optical System for Liquid Crystal Light Valve Projectors,” Euro Display '96 (1996) pp. 257-260.
Coyne, Lawrence J. et. al., “Distributive Fiber Optic couplers Using Rectangular Lightguides as Mixing Elements,” (Information Gatekeepers, Inc. Brookline, MA, 1979) pp. 160-164.
Daugirdas, JT et al. “Comparison of Methods to Predict the Equilibrated Kt/V (eKt/V) in The Hemo Study,” National Institutes of Health, pp. 1-28, Aug. 20, 1996.
de Noord, Onno E., “Multivariate Calibration Standardization,” Chemometrics and Intelligent Laboratory Systems 25, (1994) pp. 85-97.
Demos, S.G. et al., “Optical Fingerprinting Using Polarisation Contrast Improvement,” Electronics Letters, vol. 33, No. 7, pp. 582-584, Mar. 27, 1997.
Depner, Thomas A. et al., “Clinical Measurement of Blood Flow in Hemodialysis Access Fistulae and Grafts by Ultrasound Dilution,” Division of Nephrology, University of California, pp. M745-M748, published on or before Oct. 30, 1997.
Despain, Alvin M. et al., “A Large-Aperture Field-Widened Interferometer-Spectrometer for Airglow Studies”, Aspen International Conference on Fourier Spectroscopy, 1970, pp. 293-300.
Faber, Nicolaas, “Multivariate Sensitivity for the Interpretation of the Effect of Spectral Pretreatment Methods on Near-Infrared Calibration Model Predictions”, Analytical Chemistry, vol. 71, No. 3, Feb. 1, 1999, pp. 557-565.
Fresenius USA, “Determination of Delivered Therapy Through Measurement of Effective Clearance,” 2 pages, Dec. 1994.
Geladi, Paul et al., A Multivariate NIR Study of Skin Alterations in Diabetic Patients as Compared to Control Subjects. Near Infrared Spectrosc., vol. 8 (2000) pp. 217-227.
Hakim, Raymond M. et al., Effects of Dose of Dialysis on Morbidity and Mortality, American Journal of Kidney Diseases, vol. 23, No. 5, pp. 661-669, May 1994.
Jacobs, Paul et al., “A Disposable Urea Sensor for Continuous Monitoring of Hemodialysis Efficiency”, ASAIO Journal, pp. M353-M358, 1993.
Keshaviah, Prakash R. et al., “On-Line Monitoring of the Delivery of the Hemodialysis Prescription”, Pediatric Nephrology, vol. 9, pp. S2-S8, 1995.
Krivitski, Nikolai M., “Theory and Validation of Access Flow Measurement by Dilution Technique During Hemodialysis”, Kidney International, vol. 48, pp. 244-250, 1995.
Lee et al., “Fingerprint Recognition Using Principal Gabor Basis Function”, Proceedings of 2001 International Symposium on Intelligent Multimedia, Video and Speech Processing, May 2-4, 2001, Sections 2-3.
Maltoni et al., “Handbook of Fingerprint Recognition”, 2005, pp. 58-61.
Marbach, Ralf, “Measurement Techniques for IR Spectroscopic Blood Glucose Determination”, Fortschritt Bericht, Series 8: Measurement and Control Technology, No. 346, pp. cover and 1-158, Mar. 28, 1994.
Mardia K.V. et al., “Chapter 11—Discriminant Analysis”, Multivariate Analysis, pp. 2 cover pages and 300-325, 1979.
Nichols, Michael G. et al., “Design and Testing of a White-Light, Steady-State Diffuse Reflectance Spectrometer for Determination of Optical Properties of Highly Scattering Systems”, Applied Optics, vol. 36, No. 1, pp. 93-104, Jan. 1, 1997.
Nixon, Kristin A. et al., “Novel Spectroscopy-Based Technology for Biometric and Liveness Verification”, Technology for Human Identification. Proceedings of SPIE, vol. 5404, No. 1, XP-002458441, Apr. 12-13, 2004, pp. 287-295 (ISSN: 0277-786x).
Pan et al., “Face Recognition in Hyperspectral Images”, IEEE Transactions on Pattern Analysis and Machine Intelligence vol. 25, No. 12, Dec. 2003.
Ripley, B. D., “Chapter 3—Linear Discriminant Analysis”, Pattern Recognition and Neural Networks, pp. 3 cover pages and 91-120, 1996.
Ronco, C. et al., “On-Line Urea Monitoring: A Further Step Towards Adequate Dialysis Prescription and Delivery”, The International Journal of Artificial Organs, vol. 18, No. 9, pp. 534-543, 1995.
Ross et al., “A Hybrid Fingerprint Matcher”, Pattern Recognition 36, The Journal of the Pattern Recognition Society, 2003 Elsevier Science Ltd., pp. 1661-1673.
Selvaraj et al., Fingerprint Verification Using Wavelet Transform, Proceedings of the Fifth International Conference on Computational Intelligence and Multimedia Applications, IEEE, 2003.
Service, F. John et al., “Dermal Interstitial Glucose as an Indicator of Ambient Glycemia”, Diabetes Care, vol. 20, No. 9, 8 pages, Aug. 1997.
Sherman, Richard A., “Chapter 4—Recirculation in the Hemodialysis Access”, Principles and Practice of Dialysis, pp. 2 cover pages and 38-46, 1994.
Sherman, Richard A., “The Measurement of Dialysis Access Recirculation”, American Journal of Kidney Diseases, vol. 22, No. 4, pp. 616-621, Oct. 1993.
Steuer, Robert R. et al., “A New Optical Technique for Monitoring Hematocrit and Circulating Blood Volume: Its Application in Renal Dialysis”, Dialysis & Transplantation, vol. 22, No. 5, pp. 260-265, May 1993.
Webb, Paul, “Temperatures of Skin, Subcutaneous Tissue, Muscle and Core in Resting Men in Cold, Comfortable and Hot Conditions”, European Journal of Applied Physiology, vol. 64, pp. 471-476, 1992.
Zavala, Albert et al., “Using Fingerprint Measures to Predict Other Anthropometric Variables”, Human Factors, vol. 17, No. 6, pp. 591-602, 1975.
Chinese Patent Application No. 2006/80038579.4, First Office Action mailed on Mar. 23, 2011, 7 pages.
European Patent Application No. 10166537.0, Extended European Search Report mailed on Jun. 1, 2011, 7 pages.
International Search Report and Written Opinion of PCT/US2008/066585 mailed on Oct. 30, 2008, 10 pages.
International Search Report and Written Opinion of PCT/US2010/025463 mailed on Jun. 30, 2010, 12 pages.
International Search Report of PCT/US2010/046852 mailed on Dec. 29, 2010, 5 pages.
Rowe, et al. “Multispectral Fingerprint Image Acquisition”, Advances in Biometrics, 2008, 22 pages.
Related Publications (1)
Number Date Country
20110165911 A1 Jul 2011 US
Provisional Applications (1)
Number Date Country
60600687 Aug 2004 US
Continuations (1)
Number Date Country
Parent 11458607 Jul 2006 US
Child 12815196 US
Continuation in Parts (4)
Number Date Country
Parent 12815196 Jun 2010 US
Child 12985161 US
Parent 11115100 Apr 2005 US
Child 11458607 US
Parent 11115101 Apr 2005 US
Child 11115100 US
Parent 11115075 Apr 2005 US
Child 11115101 US