Fake finger detection based on transient features

Information

  • Patent Grant
  • Patent Number
    11,216,681
  • Date Filed
    Wednesday, June 24, 2020
  • Date Issued
    Tuesday, January 4, 2022
Abstract
In a method for determining whether a finger is a real finger at an ultrasonic fingerprint sensor, a sequence of images of a fingerprint of a finger is captured at an ultrasonic fingerprint sensor, wherein the sequence of images includes images captured during a change in contact state between the finger and the ultrasonic fingerprint sensor. A plurality of transient features of the finger is extracted from the sequence of images. A classifier is applied to the plurality of transient features to classify the finger as one of a real finger and a fake finger. It is determined whether the finger is a real finger based on an output of the classifier.
Description
BACKGROUND

Fingerprint sensors have become ubiquitous in mobile devices as well as other devices (e.g., locks on cars and buildings) and applications for authenticating a user's identity. They provide a fast and convenient way for the user to unlock a device, provide authentication for payments, etc. It is essential that fingerprint sensors operate at a level of security that, at a minimum, reduces the potential for circumvention of security of fingerprint authentication. For instance, fake fingers having fake or spoofed fingerprints can be used to attempt to circumvent fingerprint authentication at fingerprint sensors.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and form a part of the Description of Embodiments, illustrate various non-limiting and non-exhaustive embodiments of the subject matter and, together with the Description of Embodiments, serve to explain principles of the subject matter discussed below. Unless specifically noted, the drawings referred to in this Brief Description of Drawings should be understood as not being drawn to scale and like reference numerals refer to like parts throughout the various figures unless otherwise specified.



FIG. 1 is a block diagram of an example electronic device 100 upon which embodiments described herein may be implemented.



FIG. 2 illustrates a block diagram of an example fingerprint sensing system for determining whether a fingerprint image was generated using a real finger or a fake finger, according to some embodiments.



FIG. 3 illustrates a block diagram of a transient feature extractor, according to some embodiments.



FIG. 4A illustrates example graphs of fingerprint image contrast over time and ridge signal strength over time during capture of a sequence of fingerprint images generated using a real finger, according to embodiments.



FIG. 4B illustrates example graphs of fingerprint image contrast over time and ridge signal strength over time during capture of a sequence of fingerprint images generated using a fake finger, according to embodiments.



FIG. 5 illustrates an example graph of ridge signal strength over time during capture of a sequence of fingerprint images, according to embodiments.



FIG. 6 illustrates an example graph of ridge signal strength over time during capture of a sequence of fingerprint images using a real finger, according to embodiments.



FIG. 7 illustrates an example graph of ridge signal strength over time during capture of a sequence of fingerprint images using a fake finger, according to embodiments.



FIG. 8 illustrates example distributions of transient features as used by a classifier, according to some embodiments.



FIG. 9 illustrates an example process for determining whether a finger is a real finger at an ultrasonic fingerprint sensor, according to some embodiments.





DESCRIPTION OF EMBODIMENTS

The following Description of Embodiments is merely provided by way of example and not of limitation. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding background or in the following Description of Embodiments.


Reference will now be made in detail to various embodiments of the subject matter, examples of which are illustrated in the accompanying drawings. While various embodiments are discussed herein, it will be understood that they are not intended to limit the subject matter to these embodiments. On the contrary, the presented embodiments are intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the various embodiments as defined by the appended claims. Furthermore, in this Description of Embodiments, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present subject matter. However, embodiments may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the described embodiments.


Notation and Nomenclature

Some portions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing and other symbolic representations of operations on data within an electrical device. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present application, a procedure, logic block, process, or the like, is conceived to be one or more self-consistent procedures or instructions leading to a desired result. The procedures are those requiring physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of acoustic (e.g., ultrasonic) signals capable of being transmitted and received by an electronic device and/or electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in an electrical device.


It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the description of embodiments, discussions utilizing terms such as “capturing,” “extracting,” “applying,” “determining,” “performing,” “providing,” “receiving,” “analyzing,” “confirming,” “displaying,” “presenting,” “using,” “completing,” “instructing,” “comparing,” “executing,” or the like, refer to the actions and processes of an electronic device such as an electrical device.


Embodiments described herein may be discussed in the general context of processor-executable instructions residing on some form of non-transitory processor-readable medium, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.


In the figures, a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, logic, circuits, and steps have been described generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. Also, the example fingerprint sensing system and/or mobile electronic device described herein may include components other than those shown, including well-known components.


Various techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium comprising instructions that, when executed, perform one or more of the methods described herein. The non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.


The non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.


Various embodiments described herein may be executed by one or more processors, such as one or more motion processing units (MPUs), sensor processing units (SPUs), host processor(s) or core(s) thereof, digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), a programmable logic controller (PLC), a complex programmable logic device (CPLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein, or other equivalent integrated or discrete logic circuitry. The term “processor,” as used herein, may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein. As it is employed in the subject specification, the term “processor” can refer to substantially any computing processing unit or device comprising, but not limited to comprising, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory. Moreover, processors can exploit nano-scale architectures such as, but not limited to, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance performance of user equipment. A processor may also be implemented as a combination of computing processing units.


In addition, in some aspects, the functionality described herein may be provided within dedicated software modules or hardware modules configured as described herein. Also, the techniques could be fully implemented in one or more circuits or logic elements. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of an SPU/MPU and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with an SPU core, MPU core, or any other such configuration.


Overview of Discussion

Discussion begins with a description of a device including a fingerprint sensor, upon which described embodiments can be implemented. An example fingerprint sensor and system for determining whether a fingerprint image is generated using a real finger or a fake finger is then described, in accordance with various embodiments. Example operations of a fingerprint sensor for determining whether a fingerprint image is generated using a real finger or a fake finger using transient features are then described.


Fingerprint sensors are used for user authentication in electronic devices, such as mobile electronic devices and applications operating on mobile electronic devices, and in locks for accessing cars or buildings, to protect against unauthorized access to the devices and/or applications. Authentication of a fingerprint at a fingerprint sensor is performed before providing access to a device and/or application. In order to circumvent fingerprint authentication, attempts can be made to copy or spoof fingerprints of an authorized user using a fake or artificial finger. As such, fingerprint sensors should be capable of distinguishing real fingers from fake, artificial, or even dead fingers, also referred to herein as performing "spoof detection" or "fake finger detection". A "spoofed" fingerprint is a fake or artificial fingerprint that is used to attempt to circumvent security measures requiring fingerprint authentication. For example, an artificial finger may be used to gain unauthorized access to the electronic device or application, by making an unauthorized copy of the fingerprint of an authorized user, e.g., "spoofing" an actual fingerprint. The spoof detection may be performed by analyzing fingerprint images captured by the fingerprint sensor, e.g., performing biometric analysis of the fingerprint images, or looking at any characteristics that can help distinguish a fake/spoof fingerprint from a real fingerprint. These characteristics may be static features or dynamic features which have a certain time dependency because they change over time.


Embodiments described herein provide methods and systems for determining whether a finger interacting with a fingerprint sensor, for purposes of authentication, is a real finger or a fake finger based on dynamic features, also referred to herein as transient features. Transient features may refer to the characteristics of the signals or changes of the signal (e.g., a transient signal feature), or may refer to any dynamic characteristics of the fingerprint itself (e.g., a transient spatial feature). For example, a transient spatial feature may include how the fingerprint, or a feature of the fingerprint, deforms when pressed on the sensor surface or lifted from the sensor surface. Physiological transient features may also be used, such as the influence of transpiration on the measurements.


Embodiments described herein provide for determining whether a finger is a real finger at an ultrasonic fingerprint sensor. A sequence of images of a fingerprint of a finger is captured at an ultrasonic fingerprint sensor, wherein the sequence of images includes images captured during a change in contact state between the finger and the ultrasonic fingerprint sensor. In one embodiment, the sequence of images includes images of the finger separating or lifting from a contact surface of the ultrasonic fingerprint sensor. In one embodiment, the sequence of images includes images of the finger contacting or pressing on a contact surface of the ultrasonic fingerprint sensor. It should be appreciated that the sequence of images can include images of the finger contacting the contact surface of the ultrasonic fingerprint sensor and separating from the contact surface of the ultrasonic fingerprint sensor.


A plurality of transient features of the finger is extracted from the sequence of images. In some embodiments, extracting the plurality of transient features of the finger from the sequence of images includes extracting the plurality of transient features of the finger from the sequence of images at pixels of the sequence of images that satisfy a certain criteria, e.g., a signal change criteria. In one embodiment, the pixels of the sequence of images exhibiting signal changes relative to other pixels exceeding a change threshold include pixels at ridges of the fingerprint. In other embodiments, extracting the plurality of transient features of the finger from the sequence of images includes extracting the plurality of transient features of the finger from the sequence of images at pixels corresponding to a ridge of the fingerprint.


In some embodiments, the plurality of transient features includes at least one transient signal feature. In some embodiments, the plurality of transient features includes at least one transient spatial feature. In some embodiments, the at least one transient spatial feature includes a transient fingerprint pattern feature. In some embodiments, the at least one transient spatial feature includes a transient contact pattern feature. In some embodiments, at least one transient feature of the plurality of transient features is related to a deformation of the finger, the fingerprint pattern, or the ridge/valley pattern or profile.


A classifier is applied to the plurality of transient features to classify the finger as one of a real finger and a fake finger. In some embodiments, one or more transient features of the plurality of transient features are used as a feature vector of the classifier. In some embodiments, the classifier is constrained to considering the finger for an enrolled user. It is determined whether the finger is a real finger based on an output of the classifier. In some embodiments, the output of the classifier includes a probability whether the finger is a real finger or a fake finger.
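By way of illustration only, the following minimal Python sketch outlines the capture-extract-classify-determine flow summarized above. The sensor and classifier interfaces (sensor.read_image, classifier.predict_proba) are hypothetical placeholders, and the simplified feature definitions are assumptions made for illustration, not the prescribed embodiments.

```python
# Illustrative sketch of the overall flow; sensor.read_image() and a
# scikit-learn-style classifier.predict_proba() are hypothetical interfaces.
import numpy as np

def capture_sequence(sensor, num_frames=10):
    # Capture a sequence of frames spanning the change in contact state.
    return [sensor.read_image() for _ in range(num_frames)]

def extract_transient_features(images):
    frames = [np.asarray(img, dtype=float) for img in images]
    # Ridge pixels approximated as pixels below the frame median.
    ridge_means = np.array([f[f < np.median(f)].mean() for f in frames])
    signal_drop = ridge_means[0] - ridge_means[-1]   # drop to steady state
    early_slope = ridge_means[1] - ridge_means[0]    # initial rate of change
    return np.array([signal_drop, early_slope])

def determine_real_finger(features, classifier, threshold=0.5):
    # The classifier output is a probability that the finger is real.
    p_real = classifier.predict_proba(features.reshape(1, -1))[0, 1]
    return p_real >= threshold, p_real
```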


EXAMPLE MOBILE ELECTRONIC DEVICE

Turning now to the figures, FIG. 1 is a block diagram of an example electronic device 100. As will be appreciated, electronic device 100 may be implemented as a device or apparatus, such as a handheld mobile electronic device. For example, such a mobile electronic device may be, without limitation, a mobile telephone (e.g., a smartphone, cellular phone, a cordless phone running on a local network, or any other cordless telephone handset), a wired telephone (e.g., a phone attached by a wire), a personal digital assistant (PDA), a video game player, video game controller, a Head Mounted Display (HMD), a virtual or augmented reality device, a navigation device, an activity or fitness tracker device (e.g., bracelet, clip, band, or pendant), a smart watch or other wearable device, a mobile internet device (MID), a personal navigation device (PND), a digital still camera, a digital video camera, a portable music player, a portable video player, a portable multi-media player, a remote control, or a combination of one or more of these devices. In other embodiments, electronic device 100 may be implemented as a fixed electronic device, such as and without limitation, an electronic lock, a doorknob, a car start button, an automated teller machine (ATM), etc. In accordance with various embodiments, electronic device 100 is capable of reading fingerprints.


As depicted in FIG. 1, electronic device 100 may include a host processor 110, a host bus 120, a host memory 130, and a sensor processing unit 170. Some embodiments of electronic device 100 may further include one or more of a display device 140, an interface 150, a transceiver 160 (all depicted in dashed lines) and/or other components. In various embodiments, electrical power for electronic device 100 is provided by a mobile power source such as a battery (not shown), when not being actively charged.


Host processor 110 can be one or more microprocessors, central processing units (CPUs), DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs or applications, which may be stored in host memory 130, associated with the functions and capabilities of electronic device 100.


Host bus 120 may be any suitable bus or interface to include, without limitation, a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI) or other equivalent. In the embodiment shown, host processor 110, host memory 130, display 140, interface 150, transceiver 160, sensor processing unit (SPU) 170, and other components of electronic device 100 may be coupled communicatively through host bus 120 in order to exchange commands and data. Depending on the architecture, different bus configurations may be employed as desired. For example, additional buses may be used to couple the various components of electronic device 100, such as by using a dedicated bus between host processor 110 and memory 130.


Host memory 130 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory), hard disk, optical disk, or some combination thereof. Multiple layers of software can be stored in host memory 130 for use with/operation upon host processor 110. For example, an operating system layer can be provided for electronic device 100 to control and manage system resources in real time, enable functions of application software and other layers, and interface application programs with other software and functions of electronic device 100. Similarly, a user experience system layer may operate upon or be facilitated by the operating system. The user experience system may comprise one or more software application programs such as menu navigation software, games, device function control, gesture recognition, image processing or adjusting, voice recognition, navigation software, communications software (such as telephony or wireless local area network (WLAN) software), and/or any of a wide variety of other software and functional interfaces for interaction with the user. In some embodiments, multiple different applications can be provided on a single electronic device 100, and in some of those embodiments, multiple applications can run simultaneously as part of the user experience system. In some embodiments, the user experience system, operating system, and/or the host processor 110 may operate in a low-power mode (e.g., a sleep mode) where very few instructions are processed. Such a low-power mode may utilize only a small fraction of the processing power of a full-power mode (e.g., an awake mode) of the host processor 110.


Display 140, when included, may be a liquid crystal device, (organic) light emitting diode device, or other display device suitable for creating and visibly depicting graphic images and/or alphanumeric characters recognizable to a user. Display 140 may be configured to output images viewable by the user and may additionally or alternatively function as a viewfinder for camera. It should be appreciated that display 140 is optional, as various electronic devices, such as electronic locks, doorknobs, car start buttons, etc., may not require a display device.


Interface 150, when included, can be any of a variety of different devices providing input and/or output to a user, such as audio speakers, touch screen, real or virtual buttons, joystick, slider, knob, printer, scanner, computer network I/O device, other connected peripherals and the like.


Transceiver 160, when included, may be one or more of a wired or wireless transceiver which facilitates receipt of data at electronic device 100 from an external transmission source and transmission of data from electronic device 100 to an external recipient. By way of example, and not of limitation, in various embodiments, transceiver 160 comprises one or more of: a cellular transceiver, a wireless local area network transceiver (e.g., a transceiver compliant with one or more Institute of Electrical and Electronics Engineers (IEEE) 802.11 specifications for wireless local area network communication), a wireless personal area network transceiver (e.g., a transceiver compliant with one or more IEEE 802.15 specifications for wireless personal area network communication), and a wired serial transceiver (e.g., a universal serial bus for wired communication).


Electronic device 100 also includes a general purpose sensor assembly in the form of integrated Sensor Processing Unit (SPU) 170 which includes sensor processor 172, memory 176, a fingerprint sensor 178, and a bus 174 for facilitating communication between these and other components of SPU 170. In some embodiments, SPU 170 may include at least one additional sensor 180 (shown as sensor 180-1, 180-2, . . . 180-n) communicatively coupled to bus 174. In some embodiments, at least one additional sensor 180 is a force or pressure sensor (e.g., a touch sensor) configured to determine a force or pressure, or a temperature sensor configured to determine a temperature, at electronic device 100. The force or pressure sensor may be disposed within, under, or adjacent fingerprint sensor 178. In some embodiments, all of the components illustrated in SPU 170 may be embodied on a single integrated circuit. It should be appreciated that SPU 170 may be manufactured as a stand-alone unit (e.g., an integrated circuit), that may exist separately from a larger electronic device and is coupled to host bus 120 through an interface (not shown). It should be appreciated that, in accordance with some embodiments, SPU 170 can operate independently of host processor 110 and host memory 130 using sensor processor 172 and memory 176.


Sensor processor 172 can be one or more microprocessors, CPUs, DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs, which may be stored in memory 176, associated with the functions of SPU 170. It should also be appreciated that fingerprint sensor 178 and additional sensor 180, when included, may also utilize processing and memory provided by other components of electronic device 100, e.g., host processor 110 and host memory 130.


Bus 174 may be any suitable bus or interface to include, without limitation, a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI) or other equivalent. Depending on the architecture, different bus configurations may be employed as desired. In the embodiment shown, sensor processor 172, memory 176, fingerprint sensor 178, and other components of SPU 170 may be communicatively coupled through bus 174 in order to exchange data.


Memory 176 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory). Memory 176 may store algorithms or routines or other instructions for processing data received from fingerprint sensor 178 and/or one or more sensor 180, as well as the received data either in its raw form or after some processing. Such algorithms and routines may be implemented by sensor processor 172 and/or by logic or processing capabilities included in fingerprint sensor 178 and/or sensor 180.


A sensor 180 may comprise, without limitation: a temperature sensor, a humidity sensor, an atmospheric pressure sensor, an infrared sensor, a radio frequency sensor, a navigation satellite system sensor (such as a global positioning system receiver), an acoustic sensor (e.g., a microphone), an inertial or motion sensor (e.g., a gyroscope, accelerometer, or magnetometer) for measuring the orientation or motion of the sensor in space, or other type of sensor for measuring other physical or environmental factors. In one example, sensor 180-1 may comprise an acoustic sensor, sensor 180-2 may comprise a temperature sensor, and sensor 180-n may comprise a motion sensor.


In some embodiments, fingerprint sensor 178 and/or one or more sensors 180 may be implemented using a microelectromechanical system (MEMS) that is integrated with sensor processor 172 and one or more other components of SPU 170 in a single chip or package. It should be appreciated that fingerprint sensor 178 may be disposed behind display 140. Although depicted as being included within SPU 170, one, some, or all of fingerprint sensor 178 and/or one or more sensors 180 may be disposed externally to SPU 170 in various embodiments. It should be appreciated that fingerprint sensor 178 can be any type of fingerprint sensor, including without limitation, an ultrasonic sensor, an optical sensor, a camera, etc.


EXAMPLE FINGERPRINT SENSOR AND SYSTEM FOR DETERMINING WHETHER A FINGER IS A REAL FINGER OR A FAKE FINGER


FIG. 2 illustrates a block diagram of an example fingerprint sensing system 200 for determining whether a fingerprint image was generated using a real finger or a fake finger, according to some embodiments. Fingerprint sensing system 200 is configured to determine whether a finger is a real finger or a fake finger using transient features extracted from fingerprint images captured during a change in contact state between the finger and the ultrasonic fingerprint sensor. It should be appreciated that fingerprint sensing system 200 can be implemented as hardware, software, or any combination thereof. It should also be appreciated that fingerprint image capture 210, transient feature extractor 220, classifier 230, and fake finger determiner 240 may be separate components, may be comprised within a single component, or may be comprised in various combinations of multiple components (e.g., classifier 230 and fake finger determiner 240 may be comprised within a single component), in accordance with some embodiments.


Fingerprint images 215 are captured at fingerprint image capture 210. It should be appreciated that, in accordance with various embodiments, fingerprint image capture 210 is an ultrasonic sensor (e.g., a sensor capable of transmitting and receiving ultrasonic signals). The fingerprint sensor is operable to emit and detect ultrasonic waves (also referred to as ultrasonic signals or ultrasound signals). An array of ultrasonic transducers (e.g., Piezoelectric Micromachined Ultrasonic Transducers (PMUTs)) may be used to transmit and receive the ultrasonic waves. The emitted ultrasonic waves are reflected from any objects in contact with (or in front of) the fingerprint sensor, and these reflected ultrasonic waves, or echoes, are then detected. Where the object is a finger, the waves are reflected from different features of the finger, such as the surface features on the skin, fingerprint, or features present in deeper layers of the finger (e.g., the dermis). Examples of surface features of a finger are ridges and valleys of a fingerprint. For example, the reflection of the sound waves from the ridge/valley pattern enables the fingerprint sensor to produce a fingerprint image that may be used for identification of the user.


Fingerprint image capture 210 is configured to capture a plurality of fingerprint images 215 in a sequence, where the sequence of images includes images captured during a change in contact state between the finger and the ultrasonic fingerprint sensor. A sequence of images is used to capture the transient nature of the signals, features, and characteristics of the fingerprint. In one embodiment, the sequence of images includes images of the finger contacting a contact surface of the ultrasonic fingerprint sensor, or changing a contact state. In one embodiment, the sequence of images includes images of the finger separating from a contact surface of the ultrasonic fingerprint sensor. It should be appreciated that the sequence of images can include images of the finger contacting the contact surface of the ultrasonic fingerprint sensor and separating from the contact surface of the ultrasonic fingerprint sensor. Capturing the sequence of images when the user presses a finger on the contact surface, or lifts a finger from the contact surface, enables the capturing of the sequence of images during a state of change. This change of state is related to the fact that a finger may deform and the contact between the fingerprint sensor and the finger changes. The characteristics of these changes are different for a real finger and a fake finger. The more the fake finger resembles a real finger, the smaller the difference in these characteristics. The transient features discussed in this disclosure enable a characterization of the state of change and are therefore used to differentiate between a real finger and a fake finger. Although, in the example embodiments discussed herein, a sequence of fingerprint images is used to derive the transient features, it should be appreciated that other transient features may be derived directly from the received ultrasonic signals, without the forming of an image. Moreover, other types of transient features may be utilized in accordance with the described embodiments, such as temperature information (e.g., initial and steady state temperatures) detected by the ultrasonic fingerprint sensor or a temperature sensor of or related to the ultrasonic fingerprint sensor.


The capturing of the image sequence may be initiated whenever a change in signal is detected. For example, to capture the image sequence when the user presses the finger on the sensor, the image capture may be started as soon as an object or finger starts interacting with the sensor. For an ultrasonic sensor with an array of ultrasonic transducers, a subset of transducers may be active in a low power mode, and as soon as a finger starts interacting with the sensor, the full sensor may be activated to capture the sequence of images. In another example, where the user starts lifting the finger, a change in signal may occur as the pressure of the finger is reduced, and this may initiate the image sequence capture. The change of contact state may thus be determined by the fingerprint sensor itself, or it may be detected by a second sensor associated with the fingerprint sensor. For example, a pressure sensor, a force sensor, or a touch sensor may be positioned near, below, or above the fingerprint sensor, and this additional sensor may be used to detect a change in contact state that initiates the capturing of the image sequence.
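As an illustrative sketch only, the following Python code shows one way such signal-change-triggered capture could be organized; the read_subset and read_full_image callbacks, the polling interval, and the threshold are hypothetical and would depend entirely on the sensor hardware.

```python
import time
import numpy as np

def wait_for_contact_change(read_subset, baseline, threshold, poll_interval_s=0.005):
    # Poll a small subset of transducers in a low power mode and return once the
    # mean signal deviates from the baseline by more than the threshold.
    while True:
        sample = np.asarray(read_subset(), dtype=float)
        if abs(sample.mean() - baseline) > threshold:
            return
        time.sleep(poll_interval_s)

def capture_on_trigger(read_subset, read_full_image, baseline, threshold, num_frames=8):
    # Once a change in contact state is detected, activate the full array and
    # capture the image sequence used for transient feature extraction.
    wait_for_contact_change(read_subset, baseline, threshold)
    return [read_full_image() for _ in range(num_frames)]
```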


The sequence of fingerprint images 215 can include any number of fingerprint images. In some embodiments, fingerprint images 215 are captured at periodic intervals (e.g., every 10 milliseconds) over a time period (e.g., 10 seconds). In some embodiments, the sequence of fingerprint images 215 includes at least three fingerprint images. In some embodiments, the fingerprint sensor may have a higher image capture rate when a change of contact state is detected, and a lower image capture rate during a steady state. For example, the sequence of fingerprint images 215 forwarded to transient feature extractor 220 may include two or more images captured during the change of contact state right after a finger is detected on the ultrasonic sensor, and a steady state image captured at a fixed amount of time after the finger is detected. In embodiments where the transient features are deduced directly from the ultrasonic signals, the data transferred to transient feature extractor 220 includes a sequence of ultrasonic signal data.


Fingerprint images 215 are received at transient feature extractor 220, which is configured to extract transient features from the sequence of fingerprint images 215. Transient features may refer to the characteristics of the signals themselves (e.g., a transient signal feature), or to any dynamic characteristics of the fingerprint itself (e.g., a transient spatial feature). For example, a transient spatial feature may include how the fingerprint, fingerprint pattern, or ridge profile deforms when the finger is pressed on the contact surface or lifted away from the contact surface. Physiological transient features may also be used, such as the influence of transpiration on the measurements.



FIG. 3 illustrates a block diagram of a transient feature extractor 220, according to some embodiments. In some embodiments, fingerprint images 215 are received at region selector 310, which is configured to select regions 315 of fingerprint images 215 for extracting transient features. It should be appreciated that regions 315 can include one or more pixels of fingerprint images 215. Region selector 310 allows for selection of a subset of pixels of fingerprint images 215, allowing for local pixel/region selection for transient feature extraction. It should be appreciated that region selector 310 is optional, and may not be used for global pixel selection of fingerprint images 215. For purposes of the instant specification, global pixel selection allows for the extraction of transient features of a large part of the pixels or all pixels of fingerprint images 215, and local pixel selection allows for the extraction of transient features of a subset of pixels of fingerprint images 215. In some embodiments, local pixel selection is performed for identifying pixels that exhibit a sufficient amount of signal change to be indicative of transient features (e.g., at ridges of the fingerprint, or involved in initial contact). It should be appreciated that the described embodiments can be performed using global or local pixel selection.


In some embodiments, region selector 310 identifies regions 315 of fingerprint images 215 that satisfy a signal change criteria. For example, a signal change criteria may be a signal change threshold value that is satisfied when exceeded. In some embodiments, pixels of the sequence of fingerprint images 215 exhibiting signal changes, e.g., relative to other pixels, exceeding a change threshold include pixels at ridges of the fingerprint. In other embodiments, region selector 310 identifies pixels corresponding to a ridge of the fingerprint.
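The following sketch illustrates, under simplifying assumptions, how local pixel selection by a signal change criteria (or by approximate ridge location) could be expressed; the thresholds and the ridge approximation (pixels below the image median) are illustrative choices, not prescribed by the embodiments.

```python
import numpy as np

def select_changing_pixels(images, change_threshold):
    # Local pixel selection: keep pixels whose signal range over the sequence
    # exceeds the change threshold (these tend to be ridge / initial-contact pixels).
    stack = np.asarray(images, dtype=float)          # shape (frames, rows, cols)
    per_pixel_range = stack.max(axis=0) - stack.min(axis=0)
    return per_pixel_range > change_threshold

def select_ridge_pixels(steady_state_image):
    # Alternative local selection: approximate ridge pixels as pixels with a
    # lower signal than the median of a steady-state frame.
    frame = np.asarray(steady_state_image, dtype=float)
    return frame < np.median(frame)

def select_all_pixels(image_shape):
    # Global pixel selection: every pixel participates in feature extraction.
    return np.ones(image_shape, dtype=bool)
```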


Regions 315 (local pixel selection or global pixel selection) are received at extractor 320, which is operable to extract a plurality of transient features 225 from pixels of regions 315. In some embodiments, the plurality of transient features 225 includes at least one transient signal feature (e.g., a signal value). In some embodiments, the plurality of transient features 225 includes at least one transient spatial feature (e.g., a width of a fingerprint ridge). In some embodiments, the at least one transient spatial feature includes a transient fingerprint pattern feature, e.g., a flattening of ridges as the pressure of the contact is increased. Another example of a transient spatial feature is ridge continuity, referring to how broken up a ridge line is or how long the ridge segments are (more or less continuous, or broken up into many parts). In some embodiments, the at least one transient spatial feature includes a transient contact pattern feature, such as the part of the contact surface/image covered by a fingerprint pattern (e.g., starting at the first point of contact and then spreading over the image as the pressure increases, or vice-versa). In some embodiments, at least one transient feature of the plurality of transient features is related to a deformation of the finger due to the change in contact state. The deformation may be limited to the outer surface of the fingerprint, or may involve the deeper layers of the finger (or fingerprint), depending on the penetration depth of the ultrasound. In some embodiments, the transient feature used to determine whether the finger is real is a combination of two or more of the above features. For example, a comparison of the transient signal features to the transient spatial features may be used as an indication of whether a finger is a real finger. In other words, does the change in the observed ultrasound signal correspond to the observed change of state based on the spatial features, such as the contact surface area and/or changes in ridges?
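A minimal, hedged example of extracting one transient signal feature and one transient contact pattern feature from a selected region follows; the specific statistics used here (region mean per frame, fraction of pixels below a contact threshold) are illustrative stand-ins for the features described above.

```python
import numpy as np

def transient_signal_curve(images, region_mask):
    # One transient signal feature source: the mean signal within the selected
    # region for each frame of the sequence.
    return np.array([np.asarray(img, dtype=float)[region_mask].mean()
                     for img in images])

def transient_contact_area(images, contact_threshold):
    # One transient contact pattern feature source: the fraction of the image
    # covered by the fingerprint pattern (pixels below the contact threshold)
    # in each frame, which typically grows as the finger presses down.
    return np.array([(np.asarray(img, dtype=float) < contact_threshold).mean()
                     for img in images])
```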



FIG. 4A illustrates example graph 410 of fingerprint image contrast over time and graph 415 of ridge signal strength over time during capture of a sequence of fingerprint images generated using a real finger, according to embodiments. The fingerprint images 412a through 412i illustrated in graph 410 show the fingerprint images captured at different times and are indicative of changes in contact state between the real finger and the ultrasonic sensor. As illustrated, fingerprint images 412a through 412d represent fingerprint images captured while the real finger is initially making contact with a contact surface of the ultrasonic fingerprint sensor, fingerprint images 412e and 412f represent fingerprint images captured during a substantially steady state of the real finger in contact with a contact surface of the ultrasonic fingerprint sensor, and fingerprint images 412g through 412i represent fingerprint images captured while the real finger is separating from the contact surface of the ultrasonic fingerprint sensor. Graph 415 illustrates the ridge signal strength that is substantially inverse to the contrast level of graph 410.



FIG. 4B illustrates example graph 420 of fingerprint image contrast over time and graph 425 of ridge signal strength over time during capture of a sequence of fingerprint images generated using a fake finger, according to embodiments. The fingerprint images 422a through 422j illustrated in graph 420 are indicative of changes in contact state between the fake finger and the ultrasonic sensor. As illustrated, fingerprint images 422a through 422d represent fingerprint images captured while the fake finger is initially making contact with a contact surface of the ultrasonic fingerprint sensor, fingerprint images 422e through 422h represent fingerprint images captured during a substantially steady state of the fake finger in contact with a contact surface of the ultrasonic fingerprint sensor, and fingerprint images 422i and 422j represent fingerprint images captured while the fake finger is separating from the contact surface of the ultrasonic fingerprint sensor. Graph 425 illustrates the ridge signal strength that is substantially inverse to the contrast level of graph 420.


As illustrated, graph 415 shows that the transient features of a real finger exhibit a more gradual change in the slope of the ridge signal strength over time relative to the transient features of a fake finger, as shown in graph 425. This is particularly apparent while the finger is making contact with the ultrasonic fingerprint sensor, but is also apparent while the finger is separating from the ultrasonic sensor. By analyzing the transient features extracted from fingerprint images captured during a change in contact state between the finger and the ultrasonic fingerprint sensor, this difference can be used to distinguish a real finger from a fake finger. The transient signal features can be extracted from the signal intensities, contrasts, etc., as discussed in more detail below. The fingerprint images 412a through 412i of graph 410 also show the change in spatial features that can be used to derive transient spatial features (e.g., area of image showing fingerprint pattern, continuity of ridges, width of ridges). In some embodiments, the transient features are extracted and classification is applied to classify the finger as a real finger or a fake finger.



FIG. 5 illustrates an example graph 500 of ridge signal strength over time during capture of a sequence of fingerprint images, according to embodiments. When the ridges of the fingerprint make contact with the sensor, ultrasonic waves are coupled into the finger and, as a consequence, fewer ultrasonic waves are reflected back to the transducer. This results in a signal drop at the location of ridges. This effect is much less pronounced at valleys because of the air gap between the sensor and the fingerprint at the location of the valleys, resulting in a higher acoustic mismatch and thus more reflection. Graph 500 illustrates two example transient features used for classifying a finger as a real finger or a fake finger: the ridge signal drop at steady state and a slope of the signal as the finger is making contact with the ultrasonic fingerprint sensor. Normalized negative slope 510 and normalized negative slope 520 illustrate slopes of the ridge signal at different stages as the finger is making contact with the ultrasonic fingerprint sensor. In this example, the slope of the signal change is determined at two instances, and a difference or ratio between the slopes may be used as a transient signal feature. In other examples, other characteristics of the change in signal or rate of change in signal may be used. Ridge signal drop 530 illustrates the ridge signal drop to a steady state of the finger in contact with the ultrasonic sensor. In general, fake fingers have a smaller ridge signal drop at steady state than real fingers, and have a faster response (e.g., a faster slope rate change) as the finger is making contact with the ultrasonic fingerprint sensor (as will be discussed below in relation to FIG. 7).
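For concreteness, the following sketch computes the two transient signal features illustrated in graph 500 from a ridge-signal-versus-time trace; the choice of sampling instants and the use of a simple numerical gradient are assumptions made for illustration only.

```python
import numpy as np

def ridge_signal_features(ridge_signal, times, t_early, t_late):
    # ridge_signal: mean ridge signal per frame; times: capture times in seconds.
    ridge_signal = np.asarray(ridge_signal, dtype=float)
    times = np.asarray(times, dtype=float)
    # Ridge signal drop from first contact to the (assumed) steady-state last frame.
    signal_drop = ridge_signal[0] - ridge_signal[-1]
    # Slope of the signal at two instants during contact; their difference
    # (or ratio) serves as a transient signal feature.
    slopes = np.gradient(ridge_signal, times)
    slope_early = slopes[np.argmin(np.abs(times - t_early))]
    slope_late = slopes[np.argmin(np.abs(times - t_late))]
    return signal_drop, slope_early - slope_late
```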



FIG. 6 illustrates an example graph 600 of ridge signal strength over time during capture of a sequence of fingerprint images using a real finger, according to embodiments. As described above, local pixel selection or global pixel selection can be used for transient feature extraction. Line 610 of graph 600 illustrates a plot of the global pixel selection and line 620 of graph 600 illustrates a plot of the local pixel selection. As discussed above, global pixel selection means that pixels over the whole image are taken into consideration, while local pixel selection means that a subset of the pixels in the images is used. In one embodiment, global may mean considering all the pixels in the images, while local may mean using only the pixels corresponding to ridges. In another embodiment, local may indicate that only those pixels involved with the initial contact, or showing an initial change above a certain threshold, are used. For a real finger, local pixel selection may provide a less steep slope than global pixel selection, and a reduced change in steady state signal drop.



FIG. 7 illustrates an example graph 700 of ridge signal strength over time during capture of a sequence of fingerprint images using a fake finger, according to embodiments. As described above, local pixel selection or global pixel selection can be used for transient feature extraction. Line 710 of graph 700 illustrates a plot of the global pixel selection and line 720 of graph 700 illustrates a plot of the local pixel selection. For a fake finger, the plots generated from local pixel selection and global pixel selection are much more alike, and may be difficult to distinguish. The comparison of the plots of local pixel selection to global pixel selection for ridge signal strength over time may also be used to distinguish between a real finger and a fake finger.
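One possible way to quantify how similar the local and global pixel-selection curves are is sketched below; the mean absolute difference used here is merely one illustrative divergence measure, not a feature prescribed by the embodiments.

```python
import numpy as np

def local_global_divergence(images, change_threshold):
    # Mean absolute difference between the ridge-signal curves obtained from
    # local and global pixel selection; for a fake finger the two curves tend
    # to coincide, so a small divergence can itself hint at a spoof.
    stack = np.asarray(images, dtype=float)
    global_curve = stack.reshape(stack.shape[0], -1).mean(axis=1)
    local_mask = (stack.max(axis=0) - stack.min(axis=0)) > change_threshold
    local_curve = np.array([frame[local_mask].mean() for frame in stack])
    return float(np.mean(np.abs(local_curve - global_curve)))
```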


In one embodiment, the transient features are used as feature vectors in a classifier to determine if the finger is a real finger or a fake finger, but alternative methods of using the transient features to determine if the finger is a real finger may also be used. For example, the transient feature values may be compared to reference values, and if the transient features are within a threshold range of the references, the finger is likely to be a real finger, thereby classifying the finger as a real finger or a fake finger based on the comparison. The probability of the finger being a real finger may be deduced from the difference between the transient feature value and the reference value. Using a plurality of transient features may increase the confidence in the determination. The reference values may be predetermined, for example from measurements based on a plurality of users and/or a plurality of (different types of) fake fingers. For increased performance, the reference values may be determined for the authenticated user, for example during enrollment, and may also be context dependent. The context dependence can help correct for external influences, such as temperature, because at cold temperatures the contact between the finger and the sensor is less optimal, often due to dryness of the finger.
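A simplified sketch of this reference-comparison alternative follows; the per-feature tolerance and the crude confidence measure are illustrative assumptions rather than a specified implementation.

```python
import numpy as np

def classify_against_reference(features, reference, tolerance):
    # features / reference / tolerance: one value per transient feature; the
    # reference may be predetermined or measured for the user at enrollment.
    features = np.asarray(features, dtype=float)
    reference = np.asarray(reference, dtype=float)
    tolerance = np.asarray(tolerance, dtype=float)
    deviations = np.abs(features - reference)
    is_real = bool(np.all(deviations <= tolerance))
    # Crude confidence: how far inside the tolerated range the worst feature lies.
    confidence = float(np.clip(1.0 - (deviations / tolerance).max(), 0.0, 1.0))
    return is_real, confidence
```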


With reference again to FIG. 2, transient features 225 are received at classifier 230. Classifier 230 performs classification on transient features 225 to classify the finger as one of a real finger and a fake finger. In some embodiments, each transient feature of the plurality of transient features is a feature vector of the classifier 230. In some embodiments, the classifier is constrained to considering the finger for an enrolled user. This means that the decision between the different classes is based on references determined for an authorized user, e.g., based on a user enrollment where the transient features for that user are measured and stored as a reference. Based on the transient features 225, classifier 230 generates output 235 as to whether the finger is a real finger or a fake finger. In one embodiment, output 235 of the classifier comprises a probability whether the finger is a real finger or a fake finger.



FIG. 8 illustrates example distributions of transient features as used by a classifier, according to some embodiments. Graph 800 illustrates the use of transient features to classify a plurality of real fingers from different users and a plurality of different fake fingers fabricated using various methods and materials. As can be seen in FIG. 8, the real finger measurements are located towards the bottom right while the fake finger measurements are located more towards the top left of the figure. Graph 800 is based on transient features extracted from a global pixel selection and graph 810 illustrates similar observations using transient features extracted from a local pixel selection. The ridge signal drop to steady state and a difference in signal slope during the initial contact are used as transient features in this example, as discussed in relation to FIG. 5. This example shows a simple linear regression classifier to demonstrate how the transient features can be used to determine whether a finger is a real finger or a fake finger. Many other types of classifiers are known to the person skilled in the art, e.g., logistic regression classifiers, Gaussian Mixture Model classifiers, linear discriminant analysis (LDA) classifiers, neural network based classifiers, multilayer perceptron classifiers, support vector machine classifiers, etc., which can be used in a similar manner. The decision on which classifier to use may also depend on the transient features and/or the number of transient features to use. This example shows the use of two transient signal features in a two-dimensional classifier. As discussed above, more transient features can be used in a multi-dimensional classifier using a plurality of transient signal features and/or transient spatial features.
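As an illustration of such a two-feature classifier, the following sketch uses scikit-learn's LogisticRegression as a stand-in for the linear boundary shown in FIG. 8; the training values are synthetic and made up for this example and do not come from the figure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Feature vectors: [ridge signal drop to steady state, slope difference at contact].
# The values below are synthetic and purely illustrative.
X_train = np.array([[0.80, 0.10], [0.70, 0.20], [0.90, 0.15],   # real fingers
                    [0.30, 0.60], [0.20, 0.50], [0.25, 0.70]])  # fake fingers
y_train = np.array([1, 1, 1, 0, 0, 0])                          # 1 = real, 0 = fake

clf = LogisticRegression().fit(X_train, y_train)

measurement = np.array([[0.75, 0.12]])
probability_real = clf.predict_proba(measurement)[0, 1]      # classifier output as a probability
boundary_distance = clf.decision_function(measurement)[0]    # signed distance to the class boundary
```

The signed distance to the boundary plays the role described below for lines 805 and 815: the logistic function converts that distance into the probability output.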


As illustrated in graph 800, line 805 bisects graph 800 such that measured fingerprints plotted on the lower right of graph 800 are indicative of a real finger and measured fingerprints plotted on the upper left of graph 800 are indicative of a fake finger. Similarly, as illustrated in graph 810, line 815 bisects graph 810 such that measured fingerprints plotted on the lower right of graph 810 are indicative of a real finger and measured fingerprints plotted on the upper left of graph 810 are indicative of a fake finger. Lines 805 and 815 indicate a class boundary, and thus fingerprint measurements to the bottom right of the boundary are classified as a real finger and fingerprint measurements to the top left of the boundary are classified as a fake finger. The distance between the fingerprint measurements and the boundary can be used as a measure of the confidence in the classification, and this distance can be converted into a probability using any type of transfer function or activation function.


It is determined whether the finger is a real finger based on output 235 of the classifier at fake finger determiner 240. In some embodiments, the output 235 of the classifier includes a probability whether the finger is a real finger or a fake finger. In some embodiments, fake finger determiner 240 receives input from other types of spoof detection or fake finger detection, and makes the determination as to whether the finger is a fake finger based on multiple inputs, including output 235.
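A minimal sketch of combining output 235 with other spoof-detection scores is shown below; the weighted-average fusion and the score convention (higher meaning more likely real) are assumptions for illustration, as the embodiments do not prescribe a particular fusion rule.

```python
def fuse_spoof_scores(scores, weights=None, threshold=0.5):
    # scores: per-detector probabilities that the finger is real (the transient
    # feature classifier output among them), fused via a weighted average.
    if weights is None:
        weights = [1.0] * len(scores)
    fused = sum(w * s for w, s in zip(weights, scores)) / sum(weights)
    return fused >= threshold, fused
```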


EXAMPLE OPERATIONS FOR OPERATING A FINGERPRINT SENSOR FOR DETERMINING WHETHER A FINGER IS A REAL FINGER OR A FAKE FINGER


FIG. 9 illustrates an example process 900 for determining whether a finger is a real finger at a fingerprint sensor, according to some embodiments. Procedures of process 900 will be described with reference to elements and/or components of various figures described herein. It is appreciated that in some embodiments, the procedures may be performed in a different order than described, that some of the described procedures may not be performed, and/or that one or more additional procedures to those described may be performed. The flow diagram includes some procedures that, in various embodiments, are carried out by one or more processors (e.g., a host processor or a sensor processor) under the control of computer-readable and computer-executable instructions that are stored on non-transitory computer-readable storage media. It is further appreciated that one or more procedures described in the flow diagrams may be implemented in hardware, or a combination of hardware with firmware and/or software.


At procedure 910 of flow diagram 900, a sequence of images of a fingerprint of a finger are captured at an ultrasonic fingerprint sensor, wherein the sequence of images includes images captured during a change in contact state between the finger and the ultrasonic fingerprint sensor. In one embodiment, the sequence of images includes images of the finger separating from a contact surface of the ultrasonic fingerprint sensor. In one embodiment, the sequence of images includes images of the finger contacting a contact surface of the ultrasonic fingerprint sensor. It should be appreciated that the sequence of images can include images of the finger contacting the contact surface of the ultrasonic fingerprint sensor and separating from the contact surface of the ultrasonic fingerprint sensor.


At procedure 920, a plurality of transient features of the finger is extracted from the sequence of images. In some embodiments, as shown at procedure 922, extracting the plurality of transient features of the finger from the sequence of images includes extracting the plurality of transient features of the finger from the sequence of images at pixels of the sequence of images that satisfy a signal change criteria. In one embodiment, as shown at procedure 924, the pixels of the sequence of images exhibiting signal changes relative to other pixels exceeding a change threshold include pixels at ridges of the fingerprint. In other embodiments, extracting the plurality of transient features of the finger from the sequence of images includes extracting the plurality of transient features of the finger from the sequence of images at pixels corresponding to a ridge of the fingerprint.


In some embodiments, the plurality of transient features includes at least one transient signal feature. In some embodiments, the plurality of transient features includes at least one transient spatial feature. In some embodiments, the at least one transient spatial feature includes a transient fingerprint pattern feature. In some embodiments, the at least one transient spatial feature includes a transient contact pattern feature. In some embodiments, at least one transient feature of the plurality of transient features is related to a deformation of the finger.


At procedure 930, a classifier is applied to the plurality of transient features to classify the finger as one of a real finger and a fake finger. In some embodiments, each transient feature of the plurality of transient features is a feature vector of the classifier. In some embodiments, the classifier is constrained to considering the finger for an enrolled user. At procedure 940, it is determined whether the finger is a real finger based at least in part on output of the classifier. In some embodiments, the output of the classifier includes a probability whether the finger is a real finger or a fake finger.


CONCLUSION

The examples set forth herein were presented in order to best explain the principles of the described embodiments and their particular applications, and to thereby enable those skilled in the art to make and use them. However, those skilled in the art will recognize that the foregoing description and examples have been presented for the purposes of illustration and example only. Many aspects of the different example embodiments described above can be combined into new embodiments. The description as set forth is not intended to be exhaustive or to limit the embodiments to the precise form disclosed. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.


Reference throughout this document to “one embodiment,” “certain embodiments,” “an embodiment,” “various embodiments,” “some embodiments,” or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any embodiment may be combined in any suitable manner with one or more other features, structures, or characteristics of one or more other embodiments without limitation.

Claims
  • 1. A method for determining whether a finger is a real finger at an ultrasonic fingerprint sensor, the method comprising: capturing a sequence of images of a fingerprint of a finger at an ultrasonic fingerprint sensor, wherein the sequence of images comprises a plurality of images captured during a change in contact state between the finger and the ultrasonic fingerprint sensor, the plurality of images comprising one of images captured during an instance of the finger initiating contact with a contact surface of the ultrasonic fingerprint sensor and images captured during an instance of the finger separating from the contact surface of the ultrasonic fingerprint sensor;extracting a plurality of transient features of the finger from the sequence of images;applying a classifier to the plurality of transient features to classify the finger as one of a real finger and a fake finger; anddetermining whether the finger is a real finger based on an output of the classifier.
  • 2. The method of claim 1, wherein the extracting the plurality of transient features of the finger from the sequence of images comprises: extracting the plurality of transient features of the finger from the sequence of images at pixels of the sequence of images that satisfy a signal change criteria.
  • 3. The method of claim 2, wherein the pixels of the sequence of images exhibiting signal changes relative to other pixels exceeding a change threshold comprise pixels at ridges of the fingerprint.
  • 4. The method of claim 1, wherein the extracting the plurality of transient features of the finger from the sequence of images comprises: extracting the plurality of transient features of the finger from the sequence of images at pixels corresponding to a ridge of the fingerprint.
  • 5. The method of claim 1, wherein each transient feature of the plurality of transient features is a feature vector of the classifier.
  • 6. The method of claim 1, wherein the output of the classifier comprises a probability whether the finger is a real finger or a fake finger.
  • 7. The method of claim 1, wherein the classifier is constrained to considering the finger for an enrolled user.
  • 8. The method of claim 1, wherein the plurality of transient features comprises at least one transient signal feature.
  • 9. The method of claim 1, wherein the plurality of transient features comprises at least one transient spatial feature.
  • 10. The method of claim 9, wherein the at least one transient spatial feature comprises a transient fingerprint pattern feature.
  • 11. The method of claim 9, wherein the at least one transient spatial feature comprises a transient contact pattern feature.
  • 12. The method of claim 1, wherein at least one transient feature of the plurality of transient features is related to a deformation of the finger.
  • 13. An ultrasonic fingerprint sensor device comprising: a two-dimensional array of ultrasonic transducers; anda processor, wherein the processor is configured to: capture a sequence of images of a fingerprint of a finger at a fingerprint sensor, wherein the sequence of images comprises a plurality of images captured during a change in contact state between the finger and the ultrasonic fingerprint sensor device, the plurality of images comprising one of images captured during an instance of the finger initiating contact with a contact surface of the ultrasonic fingerprint sensor device and images captured during an instance of the finger separating from the contact surface of the ultrasonic fingerprint sensor device;extract a plurality of transient features of the finger from the sequence of images;apply a classifier to the plurality of transient features to classify the finger as one of a real finger and a fake finger; anddetermine whether the finger is a real finger based at least in part on an output of the classifier.
  • 14. The ultrasonic fingerprint sensor device of claim 13, wherein the processor is further configured to: extract the plurality of transient features of the finger from the sequence of images at pixels of the sequence of images that satisfy a signal change criteria.
  • 15. The ultrasonic fingerprint sensor device of claim 13, wherein the processor is further configured to: extract the plurality of transient features of the finger from the sequence of images at pixels corresponding to a ridge of the fingerprint.
  • 16. The ultrasonic fingerprint sensor device of claim 13, wherein the plurality of transient features comprises at least one transient signal feature or at least one transient spatial feature.
  • 17. A non-transitory computer readable storage medium having computer readable program code stored thereon for causing a computer system to perform a method for determining whether a finger is a real finger at a fingerprint sensor, the method comprising: capturing a sequence of images of a fingerprint of a finger at an ultrasonic fingerprint sensor, wherein the sequence of images comprises a plurality of images captured during a change in contact state between the finger and the ultrasonic fingerprint sensor, the plurality of images comprising one of images captured during an instance of the finger initiating contact with a contact surface of the ultrasonic fingerprint sensor and images captured during an instance of the finger separating from the contact surface of the ultrasonic fingerprint sensor;extracting a plurality of transient features of the finger from the sequence of images;applying a classifier to the plurality of transient features to classify the finger as one of a real finger and a fake finger; anddetermining whether the finger is a real finger based on an output of the classifier.
RELATED APPLICATIONS

This application claims priority to and the benefit of U.S. Provisional Patent Application 62/866,510, filed on Jun. 25, 2019, entitled “FAKE FINGER INVESTIGATION USING TRANSIENT FEATURES,” by Akhbari et al., and assigned to the assignee of the present application, which is incorporated herein by reference in its entirety.

Related Publications (1)
Number Date Country
20200410268 A1 Dec 2020 US
Provisional Applications (1)
Number Date Country
62866510 Jun 2019 US