Pressure-based activation of fingerprint spoof detection

Information

  • Patent Grant
  • Patent Number
    11,328,165
  • Date Filed
    Friday, April 24, 2020
  • Date Issued
    Tuesday, May 10, 2022
Abstract
In a method for operating a fingerprint sensor, an image of a fingerprint of a finger is captured at a fingerprint sensor. A force applied by the finger at the fingerprint sensor is determined, where the force is a measure of pressure applied by the finger on the fingerprint sensor during capture of the image. The force is compared to a pressure threshold. Provided the force satisfies the pressure threshold, a spoof detection operation is performed to determine whether the finger is a real finger. Provided the force does not satisfy the pressure threshold, fingerprint authentication using the image of the fingerprint is performed without performing the spoof detection operation.
Description
BACKGROUND

Fingerprint sensors have become ubiquitous in mobile devices as well as other applications for authenticating a user's identity. They provide a fast and convenient way for the user to unlock a device, provide authentication for payments, etc. Some applications, e.g., banking or payment applications, may require a higher level of security than other applications utilizing the same fingerprint sensor, where a higher level of security typically requires significantly more processing power and increases the latency of the authentication procedure, slowing its response time.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and form a part of the Description of Embodiments, illustrate various non-limiting and non-exhaustive embodiments of the subject matter and, together with the Description of Embodiments, serve to explain principles of the subject matter discussed below. Unless specifically noted, the drawings referred to in this Brief Description of Drawings should be understood as not being drawn to scale and like reference numerals refer to like parts throughout the various figures unless otherwise specified.



FIG. 1 is a block diagram of an example mobile electronic device 100 upon which embodiments described herein may be implemented.



FIG. 2 illustrates a block diagram of an example fingerprint sensing system for pressure-based activation of fingerprint spoof detection, according to some embodiments.



FIG. 3 illustrates a block diagram of a force-based spoof detection activator, according to some embodiments.



FIGS. 4A and 4B illustrate cross section views of an example ultrasonic sensor and a finger, according to some embodiments.



FIGS. 5A and 5B illustrate cross section views of an example ridge/valley pattern of a finger at different forces, according to some embodiments.



FIG. 6 illustrates an example pressure profile 600 of finger pressure applied to a fingerprint sensor over time, according to an embodiment.



FIG. 7 illustrates a block diagram of a spoof detection operation, according to some embodiments.



FIG. 8 illustrates an example process for operating a fingerprint sensor for pressure-based activation of fingerprint spoof detection, according to some embodiments.



FIG. 9 illustrates an example process for performing spoof detection, according to some embodiments.





DESCRIPTION OF EMBODIMENTS

The following Description of Embodiments is merely provided by way of example and not of limitation. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding background or in the following Description of Embodiments.


Reference will now be made in detail to various embodiments of the subject matter, examples of which are illustrated in the accompanying drawings. While various embodiments are discussed herein, it will be understood that the subject matter is not intended to be limited to these embodiments. On the contrary, the presented embodiments are intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the various embodiments as defined by the appended claims. Furthermore, in this Description of Embodiments, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present subject matter. However, embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the described embodiments.


Notation and Nomenclature

Some portions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing and other symbolic representations of operations on data within an electrical device. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present application, a procedure, logic block, process, or the like, is conceived to be one or more self-consistent procedures or instructions leading to a desired result. The procedures are those requiring physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of acoustic (e.g., ultrasonic) signals capable of being transmitted and received by an electronic device and/or electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in an electrical device.


It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the description of embodiments, discussions utilizing terms such as “capturing,” “determining,” “performing,” “providing,” “receiving,” “analyzing,” “confirming,” “displaying,” “presenting,” “using,” “completing,” “instructing,” “comparing,” “executing,” or the like, refer to the actions and processes of an electronic device such as an electrical device.


Embodiments described herein may be discussed in the general context of processor-executable instructions residing on some form of non-transitory processor-readable medium, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.


In the figures, a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, logic, circuits, and steps have been described generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. Also, the example fingerprint sensing system and/or mobile electronic device described herein may include components other than those shown, including well-known components.


Various techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium comprising instructions that, when executed, perform one or more of the methods described herein. The non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.


The non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.


Various embodiments described herein may be executed by one or more processors, such as one or more motion processing units (MPUs), sensor processing units (SPUs), host processor(s) or core(s) thereof, digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), a programmable logic controller (PLC), a complex programmable logic device (CPLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein, or other equivalent integrated or discrete logic circuitry. The term “processor,” as used herein may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein. As employed in the subject specification, the term “processor” can refer to substantially any computing processing unit or device comprising, but not limited to comprising, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory. Moreover, processors can exploit nano-scale architectures such as, but not limited to, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance performance of user equipment. A processor may also be implemented as a combination of computing processing units.


In addition, in some aspects, the functionality described herein may be provided within dedicated software modules or hardware modules configured as described herein. Also, the techniques could be fully implemented in one or more circuits or logic elements. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of an SPU/MPU and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with an SPU core, MPU core, or any other such configuration.


Overview of Discussion

Discussion begins with a description of a device including a fingerprint sensor, upon which described embodiments can be implemented. An example fingerprint sensor and system for pressure-based activation of fingerprint spoof detection is then described, in accordance with various embodiments. Example operations of a fingerprint sensor for pressure-based activation of fingerprint spoof detection are then described.


Fingerprint sensors are used in electronic devices, such as mobile electronic devices and applications operating on mobile electronic devices, for user authentication, protecting against unauthorized access to the devices and/or applications. Different applications may have different security requirements for authenticating user access. For example, banking or payment applications may have higher security requirements than less critical applications or unlocking the mobile device. To meet higher security requirements, fingerprint sensors should be capable of distinguishing real fingers from fake/artificial, or even dead, fingers, also referred to herein as performing “spoof detection.” A “spoofed” fingerprint is a fake or artificial fingerprint that is used to attempt to circumvent security measures requiring fingerprint authentication. For example, an artificial finger may be used to gain unauthorized access to the electronic device or application, by making an unauthorized copy of the fingerprint of an authorized user, e.g., “spoofing” an actual fingerprint. The spoof detection may be performed by analyzing fingerprint images captured by the fingerprint sensor, e.g., performing biometric analysis of the fingerprint images, or looking at any characteristics that can help distinguish a fake/spoof fingerprint from a real fingerprint.


Embodiments described herein provide systems and methods for selectively controlling implementation of heightened security measures used during fingerprint authentication at a fingerprint sensor. Heightened security measures used during fingerprint authentication, such as liveness and/or spoof detection, typically require increased computing resources relative to standard fingerprint authentication. Moreover, these heightened security measures typically introduce latency into the response time of the fingerprint authentication.


Embodiments described herein provide for selective activation of spoof detection. In some embodiments, spoof detection is activated during the fingerprint authentication responsive to a user triggering the spoof detection, e.g., a hard finger press by the user on the fingerprint sensor. For example, spoof detection may be initiated when an increase in pressure or force is detected at the fingerprint sensor. In one embodiment, the pressure is compared to a threshold to trigger the spoof detection. In another embodiment, finger pressure matching a particular pressure profile is utilized to trigger spoof detection.
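The two trigger mechanisms described above can be illustrated with a short sketch. This is not the patented implementation; the function names, tolerance value, and pressure units are illustrative assumptions.

```python
def matches_pressure_profile(samples, profile, tolerance=0.15):
    """Return True if sampled pressures track a reference profile.

    samples and profile are equal-length sequences of pressure values
    (arbitrary units) taken at the same time offsets; tolerance is the
    allowed relative deviation at each sample point.  All names and the
    tolerance default are illustrative, not from the patent.
    """
    if len(samples) != len(profile):
        return False
    for s, p in zip(samples, profile):
        if abs(s - p) > tolerance * max(p, 1e-9):
            return False
    return True


def should_activate_spoof_detection(samples, threshold=None, profile=None):
    """Trigger spoof detection on either a hard press (peak pressure
    exceeding a threshold) or a press matching a reference profile."""
    if threshold is not None and max(samples) >= threshold:
        return True
    if profile is not None and matches_pressure_profile(samples, profile):
        return True
    return False
```

In this sketch, either criterion alone suffices to activate spoof detection, mirroring the two embodiments (threshold comparison and profile matching) described in the text.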


In one example, the spoof detection includes capturing a first fingerprint image responsive to a finger pressing the fingerprint sensor and capturing a second fingerprint image responsive to the finger pressing the fingerprint sensor at a pressure greater than during the capture of the first fingerprint image. In other words, two fingerprint images are captured at different finger pressures on the fingerprint sensor. The second fingerprint image at the greater pressure activates the spoof detection. Single-image spoof detection is then performed on the first fingerprint image and the second fingerprint image. A difference image of the first fingerprint image and the second fingerprint image is determined, and a classifier is then run on the difference image to determine the probability that the deformation of the fingerprint at the different pressures comes from a real finger. In one embodiment, a fusion score is determined using the three scores (the two single-image scores and the classifier score from the difference image), where the fusion score is used to make a final determination as to whether the fingerprints are from a real finger.


Embodiments described herein provide a method for operating a fingerprint sensor, where an image of a fingerprint of a finger is captured at a fingerprint sensor. A force applied by the finger at the fingerprint sensor is determined, where the force is a measure of pressure applied by the finger on the fingerprint sensor during capture of the image. The force is compared to a pressure threshold. Provided the force satisfies the pressure threshold, a spoof detection operation is performed to determine whether the finger is a real finger. Provided the force does not satisfy the pressure threshold, fingerprint authentication using the image of the fingerprint is performed without performing the spoof detection operation. In one embodiment, the pressure threshold comprises a value. In another embodiment, the pressure threshold comprises a pressure profile of force over time. In one embodiment, the pressure threshold is associated with an application using the image of the fingerprint for authentication. In one embodiment, provided the spoof detection operation determines that the finger is a real finger, fingerprint authentication is performed using the image of the fingerprint. Embodiments described herein provide fingerprint authentication having multiple levels of authentication. For instance, a lower security authentication level may allow access to a device and a limited number of applications (e.g., those with lower security concerns), while a higher security authentication level may allow access to all applications including applications requiring higher security, such as banking or payment applications. It should be appreciated that more than two levels of authentication may be provided herein, in accordance with various embodiments.
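The branching flow just described (capture, force measurement, threshold comparison, conditional spoof detection, then matching) can be sketched as follows. The capture, force, spoof-check, and matching callables are hypothetical placeholders, not APIs from the patent.

```python
def authenticate(capture_image, measure_force, pressure_threshold,
                 spoof_check, match):
    """Sketch of the pressure-gated authentication flow.

    capture_image() returns a fingerprint image; measure_force() returns
    the pressure applied during capture; spoof_check(image) and
    match(image) stand in for the spoof-detection and fingerprint-matching
    stages.  All parameter names are illustrative assumptions.
    """
    image = capture_image()
    force = measure_force()  # pressure applied during capture

    if force >= pressure_threshold:
        # Hard press: run the (more expensive) spoof detection first.
        if not spoof_check(image):
            return False  # finger judged to be fake; deny authentication
    # Soft press (spoof detection skipped), or spoof detection passed:
    # perform standard fingerprint authentication on the image.
    return match(image)
```

Note that below the threshold the spoof check is bypassed entirely, which is how the method avoids the extra latency for low-security presses.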


In one embodiment, performing the spoof detection operation to determine whether the finger is a real finger includes capturing a second image of the fingerprint, wherein the force applied by the finger during capture of the second image is greater than the force applied by the finger during capture of the image of the fingerprint. A difference image of the image and the second image is determined. A probability that the finger is a real finger is determined based on deformation indicated in the difference image. In one embodiment, the probability that the finger is a real finger is determined by applying a classifier on the difference image. In one embodiment, whether the finger is a real finger is determined based on the probability. In order to increase reliability of the probability, in some embodiments, the probability that the finger is a real finger can be computed based on the first image alone, the second image alone, and the difference image. The final probability is a combination of the three previously computed probabilities.


Example Mobile Electronic Device

Turning now to the figures, FIG. 1 is a block diagram of an example electronic device 100. As will be appreciated, electronic device 100 may be implemented as a device or apparatus, such as a handheld mobile electronic device. For example, such a mobile electronic device may be, without limitation, a mobile telephone (e.g., smartphone, cellular phone, a cordless phone running on a local network, or any other cordless telephone handset), a wired telephone (e.g., a phone attached by a wire), a personal digital assistant (PDA), a video game player, video game controller, a Head Mounted Display (HMD), a virtual or augmented reality device, a navigation device, an activity or fitness tracker device (e.g., bracelet, clip, band, or pendant), a smart watch or other wearable device, a mobile internet device (MID), a personal navigation device (PND), a digital still camera, a digital video camera, a portable music player, a portable video player, a portable multi-media player, a remote control, or a combination of one or more of these devices. In other embodiments, electronic device 100 may be implemented as a fixed electronic device, such as and without limitation, an electronic lock, a doorknob, a car start button, an automated teller machine (ATM), etc. In accordance with various embodiments, electronic device 100 is capable of reading fingerprints.


As depicted in FIG. 1, electronic device 100 may include a host processor 110, a host bus 120, a host memory 130, and a sensor processing unit 170. Some embodiments of electronic device 100 may further include one or more of a display device 140, an interface 150, a transceiver 160 (all depicted in dashed lines) and/or other components. In various embodiments, electrical power for electronic device 100 is provided by a mobile power source such as a battery (not shown), when not being actively charged.


Host processor 110 can be one or more microprocessors, central processing units (CPUs), DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs or applications, which may be stored in host memory 130, associated with the functions and capabilities of electronic device 100.


Host bus 120 may be any suitable bus or interface to include, without limitation, a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI) or other equivalent. In the embodiment shown, host processor 110, host memory 130, display 140, interface 150, transceiver 160, sensor processing unit (SPU) 170, and other components of electronic device 100 may be coupled communicatively through host bus 120 in order to exchange commands and data. Depending on the architecture, different bus configurations may be employed as desired. For example, additional buses may be used to couple the various components of electronic device 100, such as by using a dedicated bus between host processor 110 and memory 130.


Host memory 130 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory), hard disk, optical disk, or some combination thereof. Multiple layers of software can be stored in host memory 130 for use with/operation upon host processor 110. For example, an operating system layer can be provided for electronic device 100 to control and manage system resources in real time, enable functions of application software and other layers, and interface application programs with other software and functions of electronic device 100. Similarly, a user experience system layer may operate upon or be facilitated by the operating system. The user experience system may comprise one or more software application programs such as menu navigation software, games, device function control, gesture recognition, image processing or adjusting, voice recognition, navigation software, communications software (such as telephony or wireless local area network (WLAN) software), and/or any of a wide variety of other software and functional interfaces for interaction with the user can be provided. In some embodiments, multiple different applications can be provided on a single electronic device 100, and in some of those embodiments, multiple applications can run simultaneously as part of the user experience system. In some embodiments, the user experience system, operating system, and/or the host processor 110 may operate in a low-power mode (e.g., a sleep mode) where very few instructions are processed. Such a low-power mode may utilize only a small fraction of the processing power of a full-power mode (e.g., an awake mode) of the host processor 110.


Display 140, when included, may be a liquid crystal device, (organic) light emitting diode device, or other display device suitable for creating and visibly depicting graphic images and/or alphanumeric characters recognizable to a user. Display 140 may be configured to output images viewable by the user and may additionally or alternatively function as a viewfinder for camera. It should be appreciated that display 140 is optional, as various electronic devices, such as electronic locks, doorknobs, car start buttons, etc., may not require a display device.


Interface 150, when included, can be any of a variety of different devices providing input and/or output to a user, such as audio speakers, touch screen, real or virtual buttons, joystick, slider, knob, printer, scanner, computer network I/O device, other connected peripherals and the like.


Transceiver 160, when included, may be one or more of a wired or wireless transceiver which facilitates receipt of data at electronic device 100 from an external transmission source and transmission of data from electronic device 100 to an external recipient. By way of example, and not of limitation, in various embodiments, transceiver 160 comprises one or more of: a cellular transceiver, a wireless local area network transceiver (e.g., a transceiver compliant with one or more Institute of Electrical and Electronics Engineers (IEEE) 802.11 specifications for wireless local area network communication), a wireless personal area network transceiver (e.g., a transceiver compliant with one or more IEEE 802.15 specifications for wireless personal area network communication), and a wired serial transceiver (e.g., a universal serial bus for wired communication).


Electronic device 100 also includes a general purpose sensor assembly in the form of integrated Sensor Processing Unit (SPU) 170 which includes sensor processor 172, memory 176, a fingerprint sensor 178, and a bus 174 for facilitating communication between these and other components of SPU 170. In some embodiments, SPU 170 may include at least one additional sensor 180 (shown as sensor 180-1, 180-2, . . . 180-n) communicatively coupled to bus 174. In some embodiments, at least one additional sensor 180 is a force or pressure sensor configured to determine a force or pressure. The force or pressure sensor may be disposed within, under, or adjacent fingerprint sensor 178. In some embodiments, all of the components illustrated in SPU 170 may be embodied on a single integrated circuit. It should be appreciated that SPU 170 may be manufactured as a stand-alone unit (e.g., an integrated circuit), that may exist separately from a larger electronic device and is coupled to host bus 120 through an interface (not shown).


Sensor processor 172 can be one or more microprocessors, CPUs, DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs, which may be stored in memory 176, associated with the functions of SPU 170. It should also be appreciated that fingerprint sensor 178 and additional sensor 180, when included, may also utilize processing and memory provided by other components of electronic device 100, e.g., host processor 110 and host memory 130.


Bus 174 may be any suitable bus or interface to include, without limitation, a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI) or other equivalent. Depending on the architecture, different bus configurations may be employed as desired. In the embodiment shown, sensor processor 172, memory 176, fingerprint sensor 178, and other components of SPU 170 may be communicatively coupled through bus 174 in order to exchange data.


Memory 176 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory). Memory 176 may store algorithms or routines or other instructions for processing data received from fingerprint sensor 178 and/or one or more sensor 180, as well as the received data either in its raw form or after some processing. Such algorithms and routines may be implemented by sensor processor 172 and/or by logic or processing capabilities included in fingerprint sensor 178 and/or sensor 180.


A sensor 180 may comprise, without limitation: a temperature sensor, a humidity sensor, an atmospheric pressure sensor, an infrared sensor, a radio frequency sensor, a navigation satellite system sensor (such as a global positioning system receiver), an acoustic sensor (e.g., a microphone), an inertial or motion sensor (e.g., a gyroscope, accelerometer, or magnetometer) for measuring the orientation or motion of the sensor in space, or other type of sensor for measuring other physical or environmental factors used in the adaptation of a quality threshold. In one example, sensor 180-1 may comprise an acoustic sensor, sensor 180-2 may comprise a temperature sensor, and sensor 180-n may comprise a motion sensor.


In some embodiments, fingerprint sensor 178 and/or one or more sensors 180 may be implemented using a microelectromechanical system (MEMS) that is integrated with sensor processor 172 and one or more other components of SPU 170 in a single chip or package. It should be appreciated that fingerprint sensor 178 may be disposed behind display 140. Although depicted as being included within SPU 170, one, some, or all of fingerprint sensor 178 and/or one or more sensors 180 may be disposed externally to SPU 170 in various embodiments. It should be appreciated that fingerprint sensor 178 can be any type of fingerprint sensor, including without limitation, an ultrasonic sensor, an optical sensor, a camera, etc.


Example Fingerprint Sensor and System for Pressure-Based Activation of Fingerprint Spoof Detection


FIG. 2 illustrates a block diagram of an example fingerprint sensing system 200 for pressure-based activation of fingerprint spoof detection, according to some embodiments. Fingerprint sensing system 200 is configured to initiate spoof detection for captured fingerprint images prior to fingerprint authentication in response to detecting satisfaction of a pressure threshold. It should be appreciated that fingerprint sensing system 200 can be implemented as hardware, software, or any combination thereof. It should be appreciated that fingerprint image capture 210, force-based spoof detection activator 220, spoof detection 230, and fingerprint authentication 240 may be separate components, may be comprised within a single component, or may be comprised in various combinations of multiple components (e.g., spoof detection 230 and fingerprint authentication 240 may be comprised within a single component), in accordance with some embodiments.


Fingerprint image 215 is captured at fingerprint image capture 210. It should be appreciated that fingerprint image capture 210 can be any type of image capture device, including without limitation, an ultrasonic sensor, an optical sensor, a camera, etc. It should be appreciated that fingerprint image capture 210 is configured to capture one or more fingerprint images 215, and that embodiments described herein may utilize one or more fingerprint images 215 for performing force-based activation of spoof detection.


At least one fingerprint image 215 is received at force-based spoof detection activator 220, which is configured to determine a force applied by a finger at the fingerprint sensor during fingerprint image capture 210, and to determine whether to perform spoof detection 230 based on the force applied by the finger. The force may be derived by analyzing deformation of the fingerprint when the user presses the finger on the sensor. For example, the ridge/valley ratio may be an indication of the force because when a finger is pressed harder the ridge/valley ratio increases as the ridges are compressed against the sensor surface. It should be appreciated that in accordance with some embodiments, a force calibration is performed at user enrollment (or at another time) for use in determining the force applied by a finger at the fingerprint sensor during fingerprint image capture 210.
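The ridge/valley-ratio idea above can be sketched in a few lines. The binarized-image representation, function names, and linear calibration mapping are illustrative assumptions, not the patented algorithm.

```python
def ridge_valley_ratio(binary_image):
    """Fraction of ridge (contact) pixels in a binarized fingerprint image,
    given as a list of rows of 0 (valley) / 1 (ridge) values.  A harder
    press compresses ridges against the sensor, raising this fraction."""
    ridge_pixels = sum(row.count(1) for row in binary_image)
    total_pixels = sum(len(row) for row in binary_image)
    return ridge_pixels / total_pixels


def estimate_force(binary_image, ratio_light, ratio_hard):
    """Linearly map the measured ridge/valley ratio onto a 0..1 force
    scale using two calibration ratios recorded at enrollment (light
    touch and hard press), clamped to the calibrated range.  The linear
    mapping is an assumption for illustration."""
    r = ridge_valley_ratio(binary_image)
    span = ratio_hard - ratio_light
    return max(0.0, min(1.0, (r - ratio_light) / span))
```

The enrollment-time calibration mentioned in the text corresponds here to recording `ratio_light` and `ratio_hard` for the user's finger.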



FIG. 3 illustrates a block diagram of a force-based spoof detection activator 220, according to some embodiments. Fingerprint image 215 is received at finger force determiner 310, which is configured to determine the force, e.g., pressure, applied by a finger to the fingerprint sensor during the fingerprint capture operation. In some embodiments, finger force determiner 310 is configured to determine a force applied by the finger by analyzing fingerprint characteristics of the fingerprint image 215. Examples of fingerprint characteristics include, without limitation: ridge/valley patterns, widths of fingerprint ridges/valleys, depths of valleys, heights of ridges, surface ratio of ridges to valleys, curvature of dermal layers, depths of dermal layers or other layers of the finger, depth of surface features of the fingerprint, skin condition (e.g., dryness), etc. Skin condition is an important factor because it influences the acoustic coupling between the finger and the sensor. For example, a dry finger may have worse coupling and may therefore influence the captured ridge pattern (e.g., a broken ridge pattern). Changes in the (broken) ridge pattern can be used as an indication of increased force. In some embodiments, finger force determiner 310 is a force or pressure sensor (e.g., an additional sensor 180) that is configured to determine a force or pressure.


In some embodiments, finger force determiner 310 is configured to determine a force applied by the finger using one fingerprint image 215. For example, by analyzing fingerprint characteristics of a single fingerprint image 215, finger force determiner 310 is able to estimate or calculate a force applied by the finger. In other embodiments, finger force determiner 310 is configured to determine a force applied by the finger using more than one fingerprint image 215. For example, by comparing characteristics between fingerprint images 215 of the same finger taken at different times during a fingerprint image capture, a force can be calculated using the relative difference in characteristics between fingerprint images 215. In other embodiments, finger force determiner 310 is configured to determine a force using an additional force sensor disposed within, under, or adjacent the image capture device.
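The multi-image variant can be sketched as follows (hypothetical Python; the measured characteristic could be contact area, ridge width, or the ridge/valley ratio, and the function name is an assumption):

```python
def relative_force_change(characteristic_first, characteristic_second):
    """Relative change of a pressure-sensitive fingerprint characteristic
    between two captures of the same finger taken at different times.

    A positive value indicates the second capture was pressed harder;
    a negative value indicates it was pressed more lightly."""
    if characteristic_first == 0:
        raise ValueError("first characteristic must be non-zero")
    return (characteristic_second - characteristic_first) / characteristic_first
```

For example, a contact area growing from 100 to 150 pixels between captures yields a relative change of 0.5, suggesting increased force.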



FIGS. 4A and 4B illustrate cross section views of an example fingerprint sensor 430 and a finger 410, according to some embodiments. With reference to FIG. 4A, finger 410 is shown interacting with fingerprint sensor 430. It should be appreciated that the dimensions of fingerprint sensor 430 may be chosen to capture only a small section of the fingerprint of finger 410, or the dimensions of fingerprint sensor 430 may be chosen larger to capture substantially the complete fingerprint. In one embodiment, a cover 420 overlies fingerprint sensor 430. In various embodiments, cover 420 may be made of any type of transparent, translucent, or opaque material, including but not limited to: a thin sheet of glass, plastic, resin, rubber, Teflon, epoxy, aluminum-based alloys, sapphire, titanium nitride (TiN), silicon carbide (SiC), diamond, etc. It should be appreciated that the type of material may be dependent on the type of sensor. For example, cover 420 may provide protection for fingerprint sensor 430 by preventing a user from coming into contact with fingerprint sensor 430. It should be appreciated that fingerprint sensor 430 may be in direct contact with cover 420, or there may be a gap separating fingerprint sensor 430 and cover 420. In various embodiments, the gap may be filled with an acoustic coupling material, including air, solid, liquid, or gel-like materials, or other materials for supporting transmission of acoustic signals.


Fingerprint sensor 430 may be incorporated on the different exterior faces of an electronic device (e.g., electronic device 100 of FIG. 1), depending on the ergonomics and the ease with which the user can interact with fingerprint sensor 430 using a finger 410. For example, if the electronic device includes a display, fingerprint sensor 430 may be included on the same side as the display, behind the display, on an edge of the electronic device, or on the back of the electronic device. In accordance with some embodiments, fingerprint sensor 430 may be incorporated in a button of the electronic device. In some embodiments, visual or textural markers may be present on cover 420 to indicate to the user where fingerprint sensor 430 is positioned and where to put finger 410.


Fingerprint sensor 430 may provide multiple functionalities. For instance, in addition to being operable to capture the fingerprint of the user, fingerprint sensor 430 may also be used to determine the force applied by the user (e.g., the force of finger 410 applied to fingerprint sensor 430). The different functionalities or modes may be selected and/or activated automatically, for example, depending on the context or application of the device, and the different functionalities or modes may be adaptive to the user and the user's habits or preferences. In some embodiments, the parameters of the force detection process may be adapted to use less power or computing resources, which may come at the cost of quality or confidence in the determined force. Embodiments described herein pertain to methods to derive the applied force for use in determining whether to activate spoof detection.


In some embodiments, fingerprint sensor 430 is an ultrasonic sensor. In such embodiments, fingerprint sensor 430 is operable to emit and detect ultrasonic waves (also referred to as ultrasonic signals or ultrasound signals). The emitted ultrasonic waves are reflected from any objects in front of fingerprint sensor 430, and these reflected ultrasonic waves, or echoes, are then detected. Where the object is a finger (e.g., finger 410), the waves are reflected from different features of the finger, such as the surface features (e.g., surface features 412 of FIG. 4A and surface features 422 of FIG. 4B) on the skin (e.g., the epidermis), or features (e.g., features 416 of FIG. 4A and features 426 of FIG. 4B) present in deeper layers of the finger (e.g., the dermis). Examples of surface features of a finger are ridges and valleys of a fingerprint. For example, the reflection of the sound waves from the ridge/valley pattern enables fingerprint sensor 430 to produce a fingerprint image that may be used for identification of the user. In optical fingerprint sensors, the same principle of emission and reflection is used to detect the fingerprint. In some embodiments, fingerprint sensor 430 is able to provide depth information, from which a multi-dimensional fingerprint may be determined, e.g., a three-dimensional fingerprint.


It should be appreciated that the features that can reflect ultrasonic waves, and be used to determine deformation, may be any anatomical feature from the different layers of the finger, e.g., the epidermis layer, the dermis layer, or subcutaneous tissue. The features may be the layers themselves, transitions between different layers, features within the layers (e.g., pores), or features traversing the layers (e.g., capillary blood vessels). Which features may be used depends on the penetration depth of the ultrasound waves and the imaging resolution. The features need not directly be the anatomical features, but may be features of ultrasonic signals caused by the anatomical features, such as specific reflections or absorptions of the signal.


In order to obtain the three-dimensional fingerprint, the depth information is detected using fingerprint sensor 430. The depth information can be obtained due to the fact that the ultrasonic waves reflect from features at different depths in the skin. The reflection time, which is defined as the time between the emission of the ultrasonic waves and the detection of the reflected ultrasonic waves, increases as a function of the depth of the features. Therefore, by analyzing the reflected waves as a function of time, the features can be determined as a function of depth. Images can be created that correspond to a certain depth within a finger. An array of images of different depths may be defined as the 3D fingerprint. Images may also be created to visualize other cross sections of the finger, for example perpendicular to the cover surface or sensor surface. Fingerprints or 3D fingerprints may be defined not just as images, but also as multi-dimensional data corresponding to various (acoustic) properties of the finger (e.g., density, acoustic absorption, acoustic reflection).
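The relation between reflection time and feature depth can be made explicit. The sketch below is illustrative only (the sound-speed constant is an assumed approximate value for soft tissue, not taken from this disclosure):

```python
# Approximate speed of sound in soft tissue, in mm per microsecond
# (an assumed value for illustration; real devices would calibrate this).
SPEED_OF_SOUND_TISSUE_MM_PER_US = 1.54

def reflection_depth_mm(reflection_time_us,
                        speed=SPEED_OF_SOUND_TISSUE_MM_PER_US):
    """Depth of a reflecting feature from its round-trip reflection time.

    The wave travels to the feature and back, hence the division by two."""
    return speed * reflection_time_us / 2.0
```

Sampling the reflected signal at successive times thus yields a stack of images at successive depths, which can serve as the 3D fingerprint described above.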


As illustrated in FIGS. 4A and 4B, a finger 410 interacts with fingerprint sensor 430 and is in contact with cover 420. FIGS. 4A and 4B show finger 410 contacting cover 420 with a different force, as illustrated by the different compression of (epi)dermal layers and features and a different size of a contact region. With reference to FIG. 4A, finger 410 is in contact with cover 420 at contact region 418, where contact region 418 defines the portion of the surface of finger 410 that is in contact with cover 420. Similarly, with reference to FIG. 4B, finger 410 is in contact with cover 420 at contact region 428, where contact region 428 is larger than contact region 418. Moreover, dermal layers and features 416 of FIG. 4A are spaced farther apart than dermal layers and features 426 of FIG. 4B. Thus, finger 410 is contacting cover 420 with a larger force in FIG. 4B than in FIG. 4A.



FIGS. 5A and 5B illustrate cross section views of an example ridge/valley pattern of a finger at different forces, according to some embodiments. The detection of the features discussed above may be limited to the surface features of the finger, such as the actual ridges and valleys of the surface of the finger. A depth analysis may be limited to approximately the height of the fingerprint structures. By pressing the finger against the surface (e.g., cover) of the fingerprint sensor, the ridge/valley pattern may be modified or compressed, which can then be detected and used to determine the applied force. For example, the air cavity due to the valleys may be decreased due to the applied force, and the surface ratio of the ridges and valleys may be changed. The shape of the ridges may also change due to the applied force, and this change may be determined through the depth analysis using the sensor. Thus, the change of shape of the ridges and valleys due to compression and/or deformation can be used to derive the applied force. In one example, the determined contact surface may be used to derive the applied force. FIG. 5A shows an example of the ridge/valley pattern at a low force, shown as features 510, and FIG. 5B shows the same pattern, shown as features 520, at a higher force where the pattern is compressed leading to a greater contact surface and smaller valleys. Any number of (depth) images may be used to make the determination.


In some embodiments, the analysis of the ridge/valley pattern and the analysis of deeper layers and/or features may be combined. In some embodiments, each analysis may be performed separately, and then the results may be combined, for example, by averaging or weighted averaging. The applied weight may depend on the obtained results and a confidence factor. The different algorithms may produce a confidence factor of the determined force, and the higher the confidence factor, the higher the weight in the averaging. In other embodiments, the different algorithms may also be performed sequentially. For example, a first algorithm may determine the force and a certain confidence factor. The second algorithm may only be used in case the confidence factor is below a preset threshold. For each analysis, the active section of the sensor may be adapted; for example, only a central section of the sensor may be used. In one embodiment, results from the surface analysis to determine the force may help determine the best location to perform an in-depth analysis.
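Both combination strategies can be sketched compactly (hypothetical Python; function names, the confidence threshold, and the requirement of at least one algorithm are assumptions):

```python
def fuse_force_estimates(estimates):
    """Confidence-weighted average of (force, confidence) pairs produced
    by different analyses (e.g., surface ridge/valley analysis and
    deeper-layer analysis). Higher confidence means higher weight."""
    total_confidence = sum(conf for _, conf in estimates)
    if total_confidence == 0:
        raise ValueError("no confident estimate available")
    return sum(force * conf for force, conf in estimates) / total_confidence

def sequential_force(algorithms, confidence_threshold=0.8):
    """Run force algorithms in order and stop at the first result whose
    confidence meets the threshold; fall back to the last result
    otherwise. Each algorithm returns a (force, confidence) pair."""
    result = None
    for algorithm in algorithms:
        result = algorithm()
        if result[1] >= confidence_threshold:
            return result
    return result
```

In the sequential variant, the cheaper surface analysis would typically be listed first, so the deeper-layer analysis only runs when needed.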


With reference again to FIG. 3, finger force determiner 310 is configured to output force 315. Force 315 is received at pressure threshold comparer 320, which is configured to compare force 315 to a pressure threshold. If force 315 satisfies the pressure threshold, spoof detection activation determination 325 is generated, where spoof detection activation determination 325 indicates an activation of spoof detection on the associated fingerprint image(s) 215. If force 315 does not satisfy the pressure threshold, the associated fingerprint image(s) are forwarded directly to fingerprint authentication without performing spoof detection. For example, this provides access to devices and/or applications requiring a lower security level authentication.


In one embodiment, the pressure threshold comprises pressure threshold value 322 such that pressure threshold comparer 320 compares force 315 to pressure threshold value 322. In such an embodiment, force 315 also comprises a value, and the value of force 315 is compared to pressure threshold value 322. If force 315 satisfies, e.g., is larger than, pressure threshold value 322, then spoof detection activation determination 325 is generated. For example, a first fingerprint image may have a first associated fingerprint force and a second fingerprint image may have a second associated fingerprint force. The pressure threshold value 322 may be dynamically set to be the first fingerprint force, such that if the second associated fingerprint force (force 315) is greater than the first associated fingerprint force (pressure threshold value 322), the pressure threshold value 322 is satisfied. In other words, pressure threshold value 322 may be a fixed value, or a value set dynamically during the spoof detection operation such that the threshold is satisfied as long as the second associated fingerprint force is greater than the first associated fingerprint force.
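The two variants of the value-based comparison reduce to a few lines (hypothetical Python for illustration; the function names are assumptions):

```python
def satisfies_pressure_threshold(force, threshold):
    """Static variant: spoof detection is activated when the measured
    force exceeds a fixed pressure threshold value."""
    return force > threshold

def satisfies_dynamic_threshold(first_force, second_force):
    """Dynamic variant: the force of the first capture serves as the
    threshold, so the second capture must be pressed harder."""
    return satisfies_pressure_threshold(second_force, threshold=first_force)
```

When the comparison fails, the image(s) would be forwarded directly to authentication, as described above.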


In another embodiment, the pressure threshold comprises reference pressure profile 324 such that pressure threshold comparer 320 compares force 315 to reference pressure profile 324. A pressure profile includes two or more pressure values over time, where a latter pressure value is greater than an earlier pressure value. In such an embodiment, force 315 also comprises a pressure profile, and the pressure profile of force 315 is compared to reference pressure profile 324. If force 315 satisfies, e.g., indicates similar finger pressure profile during fingerprint image capture to reference pressure profile 324, then spoof detection activation determination 325 is generated.



FIG. 6 illustrates an example pressure profile 600 of finger pressure applied to a fingerprint sensor over time, according to an embodiment. It should be appreciated that pressure profile 600 is an example pressure profile that can be implemented as reference pressure profile 324 of FIG. 3. As illustrated, example pressure profile 600 shows that a first fingerprint image is captured at time 610 and a second fingerprint image is captured at time 620. The pressure exerted by the finger on the fingerprint sensor increases over time to a first peak as illustrated at time 610, then decreases slightly as illustrated at time 615, then increases to a pressure higher than the first peak as illustrated at time 620, indicating that the second fingerprint image is captured at a force greater than a force of the first fingerprint image. Where the pressure profile during fingerprint image capture indicates this type of behavior, a reference pressure threshold profile (e.g., reference pressure profile 324) is satisfied. Comparison of the detected pressure profile with a reference pressure profile may be performed using any method known to the person skilled in the art. Techniques similar to gesture recognition, e.g., Dynamic Time Warping (DTW), may be used.
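For illustration, a classic dynamic time warping distance can score how closely a measured pressure profile matches the reference press-release-press shape (hypothetical Python; the tolerance value is an assumption, and real systems might use an optimized DTW library instead):

```python
def dtw_distance(profile_a, profile_b):
    """Classic O(n*m) dynamic time warping distance between two pressure
    profiles sampled over time. Smaller means more similar shapes,
    tolerating differences in timing."""
    n, m = len(profile_a), len(profile_b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(profile_a[i - 1] - profile_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def matches_reference(profile, reference, tolerance=1.0):
    """Treat the pressure threshold as satisfied when the measured
    profile is close (under DTW) to the reference profile."""
    return dtw_distance(profile, reference) <= tolerance
```

Because DTW warps the time axis, users pressing faster or slower than the reference can still satisfy the profile, which is why gesture-recognition techniques fit this comparison well.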


With reference again to FIG. 3, pressure threshold comparer 320 is configured to output spoof detection activation determination 325 in response to force 315 satisfying a pressure threshold, resulting in activation of spoof determination. It should be appreciated that in some embodiments, a determination to not perform spoof detection on an associated fingerprint image(s) 215 may be generated in response to force 315 not satisfying the pressure threshold.


Fingerprint image forwarder 330 is configured to receive fingerprint images 215 and spoof detection activation determination 325. If fingerprint image forwarder 330 receives spoof detection activation determination 325 for an associated fingerprint image(s) 215, fingerprint image forwarder 330 forwards the associated fingerprint image(s) 215 to spoof detection 230. If fingerprint image forwarder 330 does not receive spoof detection activation determination 325 for an associated fingerprint image(s) 215, fingerprint image forwarder 330 forwards the associated fingerprint image(s) 215 directly to fingerprint authentication 240. In some embodiments, fingerprint image forwarder 330 is also configured to receive a determination to not perform spoof detection on an associated fingerprint image(s) 215, resulting in the associated fingerprint image(s) 215 being forwarded directly to fingerprint authentication 240. In some embodiments, fingerprint image forwarder 330 can forward fingerprint image(s) 215 to fingerprint authentication 240 while concurrently performing the spoof detection 230, authenticating a user for access to a device and/or applications having a lower level of authentication while performing spoof detection for access to applications having a higher level of authentication.


With reference to FIG. 2, responsive to the generation of spoof detection activation determination 325 of FIG. 3, associated fingerprint image(s) 215 is received at spoof detection 230. Spoof detection 230 is configured to perform a spoof detection operation to determine whether the finger used during the fingerprint image capture 210 is a real finger. In one embodiment, spoof detection 230 receives at least two fingerprint images 215 and determines a probability as to whether the finger used to capture the fingerprint images 215 is a real finger. Embodiments described herein provide fingerprint authentication having multiple levels of authentication. For instance, a lower security authentication level may allow access to a device and a limited number of applications (e.g., those with lower security concerns) without performing spoof detection, while a higher security authentication level may allow access to all applications including applications requiring higher security, such as banking or payment applications, after performing spoof detection.



FIG. 7 illustrates a block diagram of spoof detection 230, according to some embodiments. At least two fingerprint images 215 are received at both image spoof verification 710 and difference image generator 720. Image spoof verification 710 is configured to perform a spoof verification operation on each received fingerprint image 215 and to generate a spoof score 715 for each fingerprint image 215. The spoof verification operation performed on each fingerprint image 215 evaluates characteristics of each fingerprint image 215 in generating spoof score 715. Spoof score 715 is an estimated value as to the likelihood that an individual fingerprint image 215 is of a real finger. It should be appreciated that spoof score 715 may be a value, e.g., 0 through 100, a probability, or any other relative valuation of the likelihood that an individual fingerprint image 215 is of a real finger. Such spoof scores can typically be obtained by training a classifier from example images with or without class labels (e.g., a class label includes information as to whether the image is a real finger or a spoof) using state-of-the-art machine learning techniques. Typically, one or more characteristic features are extracted from the example images (e.g., characteristics of the gray scale distribution, characteristics of the ridge/valley pattern or cross sections, presence of pores or other fingerprint features, etc.). The feature definition may be performed manually, or in more advanced methods, the distinguishable features may be deduced automatically. The characteristics are then used as input of a classifier. The output is then a label, a score, or a likelihood.
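The disclosure does not prescribe a particular classifier; the nearest-centroid scorer below is one simple stand-in (hypothetical Python; the toy features and the 0-100 scoring scheme are assumptions chosen only to illustrate the train-then-score pattern):

```python
import numpy as np

def extract_features(image):
    """Toy feature vector: mean gray level and the fraction of pixels
    above the mean. Real systems use much richer descriptors."""
    mean_gray = image.mean()
    ridge_fraction = (image > mean_gray).mean()
    return np.array([mean_gray, ridge_fraction])

class NearestCentroidSpoofScorer:
    """Scores an image by its distance to 'real' vs 'spoof' feature
    centroids learned from labeled example images."""

    def fit(self, features, labels):
        features = np.asarray(features, dtype=float)
        labels = np.asarray(labels)
        self.real_centroid = features[labels == 1].mean(axis=0)
        self.spoof_centroid = features[labels == 0].mean(axis=0)
        return self

    def spoof_score(self, feature):
        """0..100 score; higher means more likely a real finger."""
        d_real = np.linalg.norm(feature - self.real_centroid)
        d_spoof = np.linalg.norm(feature - self.spoof_centroid)
        return 100.0 * d_spoof / (d_real + d_spoof + 1e-12)
```

A production system would swap in a trained discriminative model, but the interface (fit on labeled examples, then score per image) is the same.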


Difference image generator 720 is configured to generate a difference image of the two fingerprint images 215 also received at image spoof verification 710. A difference image is an image that includes the differences between the two fingerprint images 215. For example, where the two fingerprint images 215 are captured at different times under different pressures, the characteristics of the fingerprint will be different due to the different deformation of the characteristics (e.g., widths of fingerprint ridges/valleys). In some embodiments, the difference image generator may only be used if the force difference between the first and second image acquisition exceeds a certain threshold (e.g., a pressure difference threshold). For example, the pressure difference threshold can be set to ensure that the force difference between the first and second image acquisition is sufficient to result in a difference image. The difference image is received at image classifier 730, where the classifier determines the probability that the deformation of the difference image is from a real finger. Based on this determination, image classifier 730 generates spoof probability 735. It should be appreciated that a real finger may have a number of constraints on the skin and on how the skin deforms when pressed. A spoofed finger is frequently made of thick and uniform material, e.g., silicone, rubber, modeling compound (such as Play-Doh), etc. When such a thick spoof is pressed on the sensor, the deformation may be very different from a natural deformation using a real finger. Embodiments described herein require the user to apply variable pressure. The difference between the natural deformation of a real finger and the deformation of a thick spoof will be amplified and visible more easily under different applied pressures.
For training the classifier to use the difference image, images from real users may be captured at different forces, and images from spoof/fake fingers and fingerprints may be captured at different forces. Furthermore, the deformation may also be determined as a function of the force, and this dependency may also be used to determine the confidence whether or not a finger is a real finger. For example, for a fake finger, more or less force may be used to obtain a certain deformation compared to a real finger. This dependency may also depend on the user, and therefore a calibration of the deformation versus force may be performed.
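The gated difference-image computation can be sketched as follows (hypothetical Python; the threshold value and the convention of returning None when the force gap is too small are assumptions):

```python
import numpy as np

def difference_image(image_low, image_high, force_low, force_high,
                     pressure_difference_threshold=0.1):
    """Pixelwise difference between two captures of the same finger
    taken at different forces.

    Returns None when the force gap is below the pressure difference
    threshold, i.e., too small for the deformation to be informative."""
    if force_high - force_low < pressure_difference_threshold:
        return None
    return np.asarray(image_high, dtype=float) - np.asarray(image_low, dtype=float)
```

The resulting array (when not None) is what would be handed to image classifier 730 to estimate the probability that the observed deformation came from a real finger.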


Spoof determiner 740 makes a spoof determination 745 as to whether fingerprint images 215 include a fingerprint of a real finger. In one embodiment, spoof determiner 740 receives spoof scores 715 and spoof probability 735, and computes a fusion score of the inputs in determining whether fingerprint images 215 include a fingerprint of a real finger. In some embodiments, images used during the enrollment of a user's fingerprint for authentication purposes are also utilized in making spoof determination 745.
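The exact fusion rule is not specified in the disclosure; a weighted average of the per-image spoof scores and the deformation-based probability is one plausible sketch (hypothetical Python; the weight, scales, and decision threshold are assumptions):

```python
def fusion_score(spoof_scores, spoof_probability, image_weight=0.5):
    """Combine per-image spoof scores (0..100, higher = more likely a
    real finger) with the deformation-based probability (0..1) into a
    single score in [0, 1]."""
    per_image = sum(spoof_scores) / (100.0 * len(spoof_scores))
    return image_weight * per_image + (1.0 - image_weight) * spoof_probability

def is_real_finger(spoof_scores, spoof_probability, decision_threshold=0.5):
    """Spoof determination: True when the fused evidence favors a real
    finger."""
    return fusion_score(spoof_scores, spoof_probability) >= decision_threshold
```

Enrollment images could contribute additional terms to the same fusion, as noted above.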


Spoof determination 745 is a determination as to whether fingerprint images 215 include a fingerprint of a real finger. Provided spoof determination 745 indicates that fingerprint images 215 include a fingerprint of a real finger, at least one fingerprint image 215 is forwarded to fingerprint authentication 240 of FIG. 2 for performing authentication. Provided spoof determination 745 indicates that fingerprint images 215 do not include a fingerprint of a real finger (e.g., the fingerprint is spoofed), the spoof detection operation fails, and fingerprint authentication 240 is not performed.


Example Operations for Operating a Fingerprint Sensor for Pressure-Based Activation of Fingerprint Spoof Detection


FIGS. 8 and 9 illustrate flow diagrams of example methods for operating a fingerprint sensor for pressure-based activation of fingerprint spoof detection, according to various embodiments. Procedures of these methods will be described with reference to elements and/or components of various figures described herein. It is appreciated that in some embodiments, the procedures may be performed in a different order than described, that some of the described procedures may not be performed, and/or that one or more additional procedures to those described may be performed. The flow diagrams include some procedures that, in various embodiments, are carried out by one or more processors (e.g., a host processor or a sensor processor) under the control of computer-readable and computer-executable instructions that are stored on non-transitory computer-readable storage media. It is further appreciated that one or more procedures described in the flow diagrams may be implemented in hardware, or a combination of hardware with firmware and/or software.


With reference to FIG. 8, flow diagram 800 illustrates an example process for operating a fingerprint sensor for pressure-based activation of fingerprint spoof detection, according to some embodiments. At procedure 810 of flow diagram 800, a first image of a fingerprint of a finger is captured at a fingerprint sensor. In one embodiment, as shown at procedure 820, a second image of a fingerprint of the finger is captured at the fingerprint sensor. At procedure 830, a pressure applied to the fingerprint sensor during fingerprint capture is determined.


At procedure 840, it is determined whether the pressure satisfies a pressure threshold. In one embodiment, the pressure threshold comprises a value. In another embodiment, the pressure threshold comprises a pressure profile of force over time. In one embodiment, the pressure threshold is associated with an application using the image of the fingerprint for authentication.


Provided the force satisfies a pressure threshold, as shown at procedure 850, a spoof detection operation is performed to determine whether the finger is a real finger. In one embodiment, the spoof detection operation includes analyzing deformation of the finger as a function of the force applied by the finger during capture of an image. Provided the force does not satisfy the pressure threshold, as shown at procedure 860, fingerprint authentication is performed using at least one image of the fingerprint.
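The branch logic of procedures 840-860 can be sketched end to end (hypothetical Python; the callable parameters, return labels, and security-level names are assumptions used only to illustrate the control flow):

```python
def handle_capture(image, force, pressure_threshold,
                   spoof_detector, authenticator):
    """Illustrative decision flow: run spoof detection only when the
    force satisfies the pressure threshold; otherwise authenticate
    directly at the lower security level.

    spoof_detector(image) -> True when the finger appears real.
    authenticator(image)  -> True when the fingerprint matches."""
    if force > pressure_threshold:
        if not spoof_detector(image):
            return "spoof_rejected"
        if authenticator(image):
            return "authenticated_high"
        return "auth_failed"
    if authenticator(image):
        return "authenticated_low"
    return "auth_failed"
```

The two success labels correspond to the two security levels described at procedure 880: direct authentication yields the lower level, while authentication after passing spoof detection yields the higher level.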


In some embodiments, procedure 850 is performed according to the procedures of flow diagram 900 of FIG. 9. Flow diagram 900 illustrates an example process for performing spoof detection, according to some embodiments. At procedure 910 of flow diagram 900, spoof verification is performed on the first and second fingerprint images. At procedure 920, a difference image is generated using the first and second fingerprint images. At procedure 930, a classifier is applied to the difference image. At procedure 940, a fusion score is generated using the spoof verification and the classified difference image. At procedure 950, a probability that the finger is a real finger is determined based on deformation indicated in the difference image. At procedure 960, whether the finger is a real finger is determined based on the probability.


With reference again to FIG. 8, at procedure 870, it is determined whether a spoof is detected based on the spoof detection of procedure 850. If a spoof is detected, flow diagram 800 ends. If a spoof is not detected (e.g., the finger is determined to be a real finger), flow diagram 800 proceeds to procedure 860, where fingerprint authentication is performed using at least one image of the fingerprint.


In some embodiments, flow diagram 800 proceeds to procedure 880 after performing the fingerprint authentication. At procedure 880, a security level of fingerprint authentication is provided. Embodiments described herein provide fingerprint authentication having multiple levels of authentication. For instance, a lower security authentication level may allow access to a device and a limited number of applications (e.g., those with lower security concerns), while a higher security authentication level may allow access to all applications including applications requiring higher security, such as banking or payment applications. It should be appreciated that more than two levels of authentication may be provided herein, in accordance with various embodiments. In one embodiment, responsive to passing the fingerprint authentication without performing the spoof detection operation, a first security level of fingerprint authentication is provided. In one embodiment, responsive to satisfying the spoof detection operation and passing the fingerprint authentication, a second security level of fingerprint authentication is provided, wherein the second security level of fingerprint authentication provides access to applications having a higher security level requirement than the first security level of fingerprint authentication.


Conclusion

The examples set forth herein were presented in order to best explain, to describe particular applications, and to thereby enable those skilled in the art to make and use embodiments of the described examples. However, those skilled in the art will recognize that the foregoing description and examples have been presented for the purposes of illustration and example only. Many aspects of the different example embodiments that are described above can be combined into new embodiments. The description as set forth is not intended to be exhaustive or to limit the embodiments to the precise form disclosed. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.


Reference throughout this document to “one embodiment,” “certain embodiments,” “an embodiment,” “various embodiments,” “some embodiments,” or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any embodiment may be combined in any suitable manner with one or more other features, structures, or characteristics of one or more other embodiments without limitation.

Claims
  • 1. A method for operating a fingerprint sensor, the method comprising: capturing an image of a fingerprint of a finger at a fingerprint sensor;determining a force applied by the finger at the fingerprint sensor by analyzing the image of the fingerprint, wherein the force is a measure of pressure applied by the finger on the fingerprint sensor during capture of the image;comparing the force to a pressure threshold;provided the force satisfies the pressure threshold, performing a spoof detection operation to determine whether the finger is a real finger; andprovided the force does not satisfy the pressure threshold, performing fingerprint authentication using the image of the fingerprint without performing the spoof detection operation.
  • 2. The method of claim 1, wherein the pressure threshold comprises a value.
  • 3. The method of claim 1, wherein the pressure threshold comprises a pressure profile of force over time.
  • 4. The method of claim 1, further comprising: provided the spoof detection operation determines that the finger is a real finger, performing fingerprint authentication using the image of the fingerprint.
  • 5. The method of claim 1, wherein the pressure threshold is associated with an application using the image of the fingerprint for authentication.
  • 6. The method of claim 1, wherein the performing the spoof detection operation to determine whether the finger is a real finger comprises: capturing a second image of the fingerprint, wherein a force applied by the finger during capture of the second image is greater than the force applied by the finger during capture of the image of the fingerprint; anddetermining a difference image of the image and the second image.
  • 7. The method of claim 6, wherein the determining the difference image of the image and the second image is performed responsive to the force applied by the finger during capture of the second image satisfying a pressure difference threshold.
  • 8. The method of claim 6, wherein the determining a force applied by the finger at the fingerprint sensor utilizes the second image of the fingerprint to determine the force applied by the finger at the fingerprint sensor.
  • 9. The method of claim 6, wherein the performing the spoof detection operation to determine whether the finger is a real finger further comprises: determining a probability that the finger is a real finger based on deformation indicated in the difference image.
  • 10. The method of claim 9, wherein the probability that the finger is a real finger is determined by applying a classifier on the difference image.
  • 11. The method of claim 1, wherein the performing the spoof detection to determine that the finger is a real finger comprises: analyzing deformation of the finger as a function of the force applied by the finger during capture of the image.
  • 12. The method of claim 1, further comprising: responsive to passing the fingerprint authentication without performing the spoof detection operation, providing a first security level of fingerprint authentication.
  • 13. The method of claim 12, further comprising: responsive to satisfying the spoof detection operation and passing the fingerprint authentication, providing a second security level of fingerprint authentication, wherein the second security level of fingerprint authentication provides access to applications having a higher security level requirement than the first security level of fingerprint authentication.
  • 14. A non-transitory computer readable storage medium having computer readable program code stored thereon for causing a computer system to perform a method for operating a fingerprint sensor, the method comprising: capturing a first image of a fingerprint of a finger at a fingerprint sensor; capturing a second image of the fingerprint of the finger at the fingerprint sensor; determining a first force applied by the finger at the fingerprint sensor by analyzing the first image of the fingerprint; determining a second force applied by the finger at the fingerprint sensor by analyzing the second image of the fingerprint; determining whether the second force applied by the finger during capture of the second image is greater than the first force applied by the finger during capture of the first image; provided the second force is not greater than the first force, performing fingerprint authentication using at least one of the first image and the second image; and provided the second force is greater than the first force, performing a spoof detection operation to determine whether the finger is a real finger.
  • 15. The non-transitory computer readable storage medium of claim 14, wherein the performing the spoof detection operation to determine whether the finger is a real finger comprises: determining a difference image of the first image and the second image; and determining a probability that the finger is a real finger based on deformation indicated in the difference image.
  • 16. The non-transitory computer readable storage medium of claim 14, the method further comprising: responsive to passing the fingerprint authentication without performing the spoof detection operation, providing a first security level of fingerprint authentication.
  • 17. The non-transitory computer readable storage medium of claim 16, the method further comprising: responsive to satisfying the spoof detection operation and passing the fingerprint authentication, providing a second security level of fingerprint authentication, wherein the second security level of fingerprint authentication provides access to applications having a higher security level requirement than the first security level of fingerprint authentication.
  • 18. The non-transitory computer readable storage medium of claim 14, wherein the performing the spoof detection operation to determine whether the finger is a real finger comprises: analyzing deformation of the finger as a function of the second force applied by the finger during capture of the second image.
  • 19. An electronic device comprising: a fingerprint sensor; a memory; and a processor configured to: capture an image of a fingerprint of a finger at a fingerprint sensor; determine a force applied by the finger at the fingerprint sensor by analyzing the image of the fingerprint, wherein the force is a measure of pressure applied by the finger on the fingerprint sensor during capture of the image; compare the force to a pressure threshold; provided the force satisfies the pressure threshold, perform a spoof detection operation to determine whether the finger is a real finger; and provided the force does not satisfy the pressure threshold, perform fingerprint authentication using the image of the fingerprint without performing the spoof detection operation.
  • 20. The electronic device of claim 19, wherein the processor is further configured to: capture a second image of the fingerprint, wherein a force applied by the finger during capture of the second image is greater than the force applied by the finger during capture of the image of the fingerprint; determine a difference image of the image and the second image; and determine a probability that the finger is a real finger based on deformation indicated in the difference image.
  • 21. The electronic device of claim 19, wherein the processor is further configured to: provide a first security level of fingerprint authentication responsive to passing fingerprint authentication without performing the spoof detection operation.
  • 22. The electronic device of claim 21, wherein the processor is further configured to: provide a second security level of fingerprint authentication responsive to satisfying the spoof detection operation and passing the fingerprint authentication, wherein the second security level of fingerprint authentication provides access to applications having a higher security level requirement than the first security level of fingerprint authentication.
  • 23. The electronic device of claim 19, wherein the processor is further configured to: analyze deformation of the finger as a function of the force applied by the finger during capture of the second image during the spoof detection operation.
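The pressure-gated flow recited in claims 1, 12, and 13 can be illustrated in a minimal Python sketch. This is not the patented implementation: the function names, the force-estimation heuristic, and the threshold value are all hypothetical stand-ins, and the spoof check and matcher are placeholders.

```python
# Hypothetical sketch of the pressure-gated spoof-detection flow
# (claims 1, 12, and 13). All names and values are illustrative.

PRESSURE_THRESHOLD = 5.0  # arbitrary illustrative units


def estimate_force(image):
    """Claim 1 derives force by analyzing the fingerprint image itself.

    As a crude stand-in, treat mean pixel intensity as a proxy for
    contact pressure (harder presses flatten ridges and darken pixels).
    """
    return sum(image) / len(image)


def is_real_finger(image):
    """Placeholder spoof check; a real system would analyze
    pressure-dependent skin deformation (claims 6-11)."""
    return True


def matches_template(image):
    """Placeholder fingerprint matcher."""
    return True


def authenticate(image):
    force = estimate_force(image)
    if force >= PRESSURE_THRESHOLD:
        # Force satisfies the threshold: run spoof detection before
        # matching (claim 1); success grants the higher security
        # level (claim 13).
        if is_real_finger(image) and matches_template(image):
            return "second_security_level"
        return "rejected"
    # Force below threshold: authenticate without spoof detection,
    # granting only the lower security level (claims 1 and 12).
    if matches_template(image):
        return "first_security_level"
    return "rejected"
```

A light press thus yields a fast, lower-assurance unlock, while a firm press triggers the additional spoof-detection step required by higher-security applications.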
US Referenced Citations (292)
Number Name Date Kind
4880012 Sato Nov 1989 A
5575286 Weng et al. Nov 1996 A
5684243 Gururaja et al. Nov 1997 A
5808967 Yu et al. Sep 1998 A
5867302 Fleming Feb 1999 A
5911692 Hussain et al. Jun 1999 A
6071239 Cribbs et al. Jun 2000 A
6104673 Cole et al. Aug 2000 A
6289112 Jain et al. Sep 2001 B1
6292576 Brownlee Sep 2001 B1
6350652 Libera et al. Feb 2002 B1
6428477 Mason Aug 2002 B1
6500120 Anthony Dec 2002 B1
6676602 Barnes et al. Jan 2004 B1
6736779 Sano et al. May 2004 B1
7067962 Scott Jun 2006 B2
7109642 Scott Sep 2006 B2
7243547 Cobianu et al. Jul 2007 B2
7257241 Lo Aug 2007 B2
7400750 Nam Jul 2008 B2
7433034 Huang Oct 2008 B1
7459836 Scott Dec 2008 B2
7471034 Schlote-Holubek et al. Dec 2008 B2
7489066 Scott et al. Feb 2009 B2
7634117 Cho Dec 2009 B2
7739912 Schneider et al. Jun 2010 B2
8018010 Tigli et al. Sep 2011 B2
8139827 Schneider et al. Mar 2012 B2
8255698 Li et al. Aug 2012 B2
8311514 Bandyopadhyay et al. Nov 2012 B2
8335356 Schmitt Dec 2012 B2
8433110 Kropp et al. Apr 2013 B2
8508103 Schmitt et al. Aug 2013 B2
8515135 Clarke et al. Aug 2013 B2
8666126 Lee et al. Mar 2014 B2
8703040 Liufu et al. Apr 2014 B2
8723399 Sammoura et al. May 2014 B2
8805031 Schmitt Aug 2014 B2
9056082 Liautaud et al. Jun 2015 B2
9070861 Bibl et al. Jun 2015 B2
9224030 Du Dec 2015 B2
9245165 Slaby et al. Jan 2016 B2
9424456 Kamath Koteshwara et al. Aug 2016 B1
9572549 Belevich et al. Feb 2017 B2
9582102 Setlak Feb 2017 B2
9582705 Du et al. Feb 2017 B2
9607203 Yazdandoost et al. Mar 2017 B1
9607206 Schmitt et al. Mar 2017 B2
9613246 Gozzini et al. Apr 2017 B1
9618405 Liu et al. Apr 2017 B2
9665763 Du et al. May 2017 B2
9747488 Yazdandoost et al. Aug 2017 B2
9785819 Oreifej Oct 2017 B1
9815087 Ganti et al. Nov 2017 B2
9817108 Kuo et al. Nov 2017 B2
9818020 Schuckers et al. Nov 2017 B2
9881195 Lee et al. Jan 2018 B2
9881198 Lee et al. Jan 2018 B2
9898640 Ghavanini Feb 2018 B2
9904836 Yeke Yazdandoost et al. Feb 2018 B2
9909225 Lee et al. Mar 2018 B2
9922235 Cho et al. Mar 2018 B2
9933319 Li et al. Apr 2018 B2
9934371 Hong et al. Apr 2018 B2
9939972 Shepelev et al. Apr 2018 B2
9953205 Rasmussen et al. Apr 2018 B1
9959444 Young et al. May 2018 B2
9967100 Hong et al. May 2018 B2
9983656 Merrell et al. May 2018 B2
9984271 King et al. May 2018 B1
10006824 Tsai et al. Jun 2018 B2
10275638 Yousefpor et al. Apr 2019 B1
10315222 Salvia et al. Jun 2019 B2
10322929 Soundara Pandian et al. Jun 2019 B2
10325915 Salvia et al. Jun 2019 B2
10387704 Dagan et al. Aug 2019 B2
10445547 Tsai Oct 2019 B2
10461124 Berger et al. Oct 2019 B2
10478858 Lasiter et al. Nov 2019 B2
10488274 Li et al. Nov 2019 B2
10497747 Tsai et al. Dec 2019 B2
10515255 Strohmann et al. Dec 2019 B2
10539539 Garlepp et al. Jan 2020 B2
10600403 Garlepp et al. Mar 2020 B2
10656255 Ng et al. May 2020 B2
10670716 Apte et al. Jun 2020 B2
10706835 Garlepp et al. Jul 2020 B2
10726231 Tsai et al. Jul 2020 B2
10755067 De Foras et al. Aug 2020 B2
11107858 Berger et al. Aug 2021 B2
20020135273 Mauchamp et al. Sep 2002 A1
20030013955 Poland Jan 2003 A1
20040085858 Khuri-Yakub et al. May 2004 A1
20040122316 Satoh et al. Jun 2004 A1
20040174773 Thomenius et al. Sep 2004 A1
20050023937 Sashida et al. Feb 2005 A1
20050057284 Wodnicki Mar 2005 A1
20050100200 Abiko et al. May 2005 A1
20050110071 Ema et al. May 2005 A1
20050146240 Smith et al. Jul 2005 A1
20050148132 Wodnicki Jul 2005 A1
20050162040 Robert Jul 2005 A1
20060052697 Hossack et al. Mar 2006 A1
20060079777 Karasawa Apr 2006 A1
20060230605 Schlote-Holubek et al. Oct 2006 A1
20070046396 Huang Mar 2007 A1
20070047785 Jang et al. Mar 2007 A1
20070073135 Lee et al. Mar 2007 A1
20070202252 Sasaki Aug 2007 A1
20070215964 Khuri-Yakub et al. Sep 2007 A1
20070230754 Jain et al. Oct 2007 A1
20080125660 Yao et al. May 2008 A1
20080146938 Hazard et al. Jun 2008 A1
20080150032 Tanaka Jun 2008 A1
20080194053 Huang Aug 2008 A1
20080240523 Benkley et al. Oct 2008 A1
20090005684 Kristoffersen et al. Jan 2009 A1
20090182237 Angelsen et al. Jul 2009 A1
20090232367 Shinzaki Sep 2009 A1
20090274343 Clarke Nov 2009 A1
20090303838 Svet Dec 2009 A1
20100030076 Vortman et al. Feb 2010 A1
20100046810 Yamada Feb 2010 A1
20100113952 Raguin et al. May 2010 A1
20100168583 Dausch et al. Jul 2010 A1
20100195851 Buccafusca Aug 2010 A1
20100201222 Adachi et al. Aug 2010 A1
20100202254 Roest et al. Aug 2010 A1
20100239751 Regniere Sep 2010 A1
20100251824 Schneider et al. Oct 2010 A1
20100256498 Tanaka Oct 2010 A1
20100278008 Ammar Nov 2010 A1
20110285244 Lewis et al. Nov 2011 A1
20110291207 Martin et al. Dec 2011 A1
20120016604 Irving et al. Jan 2012 A1
20120092026 Liautaud et al. Apr 2012 A1
20120095335 Sverdlik et al. Apr 2012 A1
20120095344 Kristoffersen et al. Apr 2012 A1
20120095347 Adam et al. Apr 2012 A1
20120147698 Wong et al. Jun 2012 A1
20120179044 Chiang et al. Jul 2012 A1
20120232396 Tanabe Sep 2012 A1
20120238876 Tanabe et al. Sep 2012 A1
20120263355 Monden Oct 2012 A1
20120279865 Regniere et al. Nov 2012 A1
20120288641 Diatezua et al. Nov 2012 A1
20120300988 Ivanov et al. Nov 2012 A1
20130051179 Hong Feb 2013 A1
20130064043 Degertekin et al. Mar 2013 A1
20130127297 Bautista et al. May 2013 A1
20130127592 Fyke et al. May 2013 A1
20130133428 Lee et al. May 2013 A1
20130201134 Schneider et al. Aug 2013 A1
20130271628 Ku et al. Oct 2013 A1
20130294201 Hajati Nov 2013 A1
20130294202 Hajati Nov 2013 A1
20140060196 Falter et al. Mar 2014 A1
20140117812 Hajati May 2014 A1
20140176332 Alameh et al. Jun 2014 A1
20140208853 Onishi et al. Jul 2014 A1
20140219521 Schmitt et al. Aug 2014 A1
20140232241 Hajati Aug 2014 A1
20140265721 Robinson et al. Sep 2014 A1
20140313007 Harding Oct 2014 A1
20140355387 Kitchens et al. Dec 2014 A1
20150036065 Yousefpor et al. Feb 2015 A1
20150049590 Rowe et al. Feb 2015 A1
20150087991 Chen et al. Mar 2015 A1
20150097468 Hajati et al. Apr 2015 A1
20150105663 Kiyose et al. Apr 2015 A1
20150145374 Xu et al. May 2015 A1
20150164473 Kim et al. Jun 2015 A1
20150165479 Lasiter et al. Jun 2015 A1
20150169136 Ganti et al. Jun 2015 A1
20150189136 Chung et al. Jul 2015 A1
20150198699 Kuo et al. Jul 2015 A1
20150206738 Rastegar Jul 2015 A1
20150213180 Herberholz Jul 2015 A1
20150220767 Yoon et al. Aug 2015 A1
20150241393 Ganti et al. Aug 2015 A1
20150261261 Bhagavatula et al. Sep 2015 A1
20150286312 Kang et al. Oct 2015 A1
20150301653 Urushi Oct 2015 A1
20150345987 Hajati Dec 2015 A1
20150357375 Tsai et al. Dec 2015 A1
20150358740 Tsai et al. Dec 2015 A1
20150362589 Tsai Dec 2015 A1
20150371398 Qiao et al. Dec 2015 A1
20160041047 Liu et al. Feb 2016 A1
20160051225 Kim et al. Feb 2016 A1
20160063294 Du et al. Mar 2016 A1
20160063300 Du et al. Mar 2016 A1
20160070967 Du et al. Mar 2016 A1
20160070968 Gu et al. Mar 2016 A1
20160086010 Merrell et al. Mar 2016 A1
20160091378 Tsai et al. Mar 2016 A1
20160092715 Yazdandoost et al. Mar 2016 A1
20160092716 Yazdandoost et al. Mar 2016 A1
20160100822 Kim et al. Apr 2016 A1
20160107194 Panchawagh et al. Apr 2016 A1
20160117541 Lu et al. Apr 2016 A1
20160180142 Riddle et al. Jun 2016 A1
20160296975 Lukacs et al. Oct 2016 A1
20160299014 Li et al. Oct 2016 A1
20160326477 Fernandez-Alcon et al. Nov 2016 A1
20160350573 Kitchens et al. Dec 2016 A1
20160358003 Shen et al. Dec 2016 A1
20170004352 Jonsson et al. Jan 2017 A1
20170330552 Garlepp et al. Jan 2017 A1
20170032485 Vemury Feb 2017 A1
20170059380 Li et al. Mar 2017 A1
20170075700 Abudi et al. Mar 2017 A1
20170100091 Eigil et al. Apr 2017 A1
20170110504 Panchawagh et al. Apr 2017 A1
20170119343 Pintoffl May 2017 A1
20170124374 Rowe et al. May 2017 A1
20170168543 Dai et al. Jun 2017 A1
20170185821 Chen et al. Jun 2017 A1
20170194934 Shelton et al. Jul 2017 A1
20170219536 Koch et al. Aug 2017 A1
20170231534 Agassy et al. Aug 2017 A1
20170255338 Medina et al. Sep 2017 A1
20170293791 Mainguet et al. Oct 2017 A1
20170316243 Ghavanini Nov 2017 A1
20170316248 He et al. Nov 2017 A1
20170322290 Ng Nov 2017 A1
20170322291 Salvia et al. Nov 2017 A1
20170322292 Salvia et al. Nov 2017 A1
20170322305 Apte et al. Nov 2017 A1
20170323133 Tsai Nov 2017 A1
20170325081 Chrisikos et al. Nov 2017 A1
20170326590 Daneman Nov 2017 A1
20170326591 Apte et al. Nov 2017 A1
20170326593 Garlepp et al. Nov 2017 A1
20170326594 Berger et al. Nov 2017 A1
20170328866 Apte et al. Nov 2017 A1
20170328870 Garlepp et al. Nov 2017 A1
20170330012 Salvia et al. Nov 2017 A1
20170330553 Garlepp et al. Nov 2017 A1
20170357839 Yazdandoost et al. Dec 2017 A1
20180025202 Ryshtun Jan 2018 A1
20180101711 D'Souza et al. Apr 2018 A1
20180107854 Tsai et al. Apr 2018 A1
20180129849 Strohmann et al. May 2018 A1
20180129857 Bonev May 2018 A1
20180178251 Foncellino et al. Jun 2018 A1
20180206820 Anand et al. Jul 2018 A1
20180217008 Li et al. Aug 2018 A1
20180225495 Jonsson et al. Aug 2018 A1
20180229267 Ono et al. Aug 2018 A1
20180276443 Strohmann et al. Sep 2018 A1
20180329560 Kim Nov 2018 A1
20180349663 Garlepp et al. Dec 2018 A1
20180357457 Rasmussen et al. Dec 2018 A1
20180369866 Sammoura et al. Dec 2018 A1
20180373913 Panchawagh et al. Dec 2018 A1
20190005300 Garlepp et al. Jan 2019 A1
20190012673 Chakraborty et al. Jan 2019 A1
20190018123 Narasimha-Iyer et al. Jan 2019 A1
20190043920 Berger et al. Feb 2019 A1
20190046263 Hayashida et al. Feb 2019 A1
20190057267 Kitchens et al. Feb 2019 A1
20190073507 D'Souza et al. Mar 2019 A1
20190087632 Raguin et al. Mar 2019 A1
20190095015 Han et al. Mar 2019 A1
20190102046 Miranto et al. Apr 2019 A1
20190130083 Agassy May 2019 A1
20190171858 Ataya et al. Jun 2019 A1
20190188441 Hall et al. Jun 2019 A1
20190188442 Flament et al. Jun 2019 A1
20190247887 Salvia et al. Aug 2019 A1
20190325185 Tang Oct 2019 A1
20190340455 Jung et al. Nov 2019 A1
20190370518 Maor et al. Dec 2019 A1
20200030850 Apte et al. Jan 2020 A1
20200050816 Tsai Feb 2020 A1
20200050817 Salvia et al. Feb 2020 A1
20200050820 Iatsun et al. Feb 2020 A1
20200050828 Li et al. Feb 2020 A1
20200074135 Garlepp et al. Mar 2020 A1
20200111834 Tsai et al. Apr 2020 A1
20200125710 Andersson Apr 2020 A1
20200147644 Chang May 2020 A1
20200158694 Garlepp et al. May 2020 A1
20200175143 Lee Jun 2020 A1
20200194495 Berger et al. Jun 2020 A1
20200210666 Flament Jul 2020 A1
20200250393 Tsai et al. Aug 2020 A1
20200285882 Skovgaard Christensen et al. Sep 2020 A1
20200302140 Lu et al. Sep 2020 A1
20200355824 Apte et al. Nov 2020 A1
20200400800 Ng et al. Dec 2020 A1
Foreign Referenced Citations (44)
Number Date Country
1826631 Aug 2006 CN
101192644 Jun 2008 CN
102159334 Aug 2011 CN
105264542 Jan 2016 CN
105378756 Mar 2016 CN
106458575 Jul 2018 CN
109196671 Jan 2019 CN
109255323 Jan 2019 CN
1214909 Jun 2002 EP
2884301 Jun 2015 EP
3086261 Oct 2016 EP
1534140 Jan 2019 EP
3292508 Dec 2020 EP
3757884 Dec 2020 EP
2011040467 Feb 2011 JP
201531701 Aug 2015 TW
2009096576 Aug 2009 WO
2009137106 Nov 2009 WO
2014035564 Mar 2014 WO
2015009635 Jan 2015 WO
2015112453 Jul 2015 WO
2015120132 Aug 2015 WO
2015131083 Sep 2015 WO
2015134816 Sep 2015 WO
2015183945 Dec 2015 WO
2016007250 Jan 2016 WO
2016011172 Jan 2016 WO
2016022439 Feb 2016 WO
2016040333 Mar 2016 WO
2016053587 Apr 2016 WO
2016061406 Apr 2016 WO
2016061410 Apr 2016 WO
2017003848 Jan 2017 WO
2017053877 Mar 2017 WO
2017192890 Nov 2017 WO
2017192895 Nov 2017 WO
2017192899 Nov 2017 WO
2017196678 Nov 2017 WO
2017196682 Nov 2017 WO
2017192903 Dec 2017 WO
2018148332 Aug 2018 WO
2019005487 Jan 2019 WO
2019164721 Aug 2019 WO
2020081182 Apr 2020 WO
Non-Patent Literature Citations (62)
Entry
Tang, et al., “Pulse-Echo Ultrasonic Fingerprint Sensor on a Chip”, IEEE Transducers, Anchorage, Alaska, USA, Jun. 21-25, 2015, pp. 674-677.
ISA/EP, Partial International Search Report for International Application No. PCT/US2019/034032, 8 pages, dated Sep. 12, 2019.
ISA/EP, International Search Report and Written Opinion for International Application # PCT/US2018/063431, pp. 1-15, dated Feb. 5, 2019.
ISA/EP, International Search Report and Written Opinion for International Application # PCT/US2019/015020, pp. 1-23, dated Jul. 1, 2019.
ISA/EP, International Search Report and Written Opinion for International Application # PCT/US2019/023440, pp. 1-10, dated Jun. 4, 2019.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031120, 12 pages, dated Aug. 29, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031127, 13 pages, dated Sep. 1, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031134, 12 pages, dated Aug. 30, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031140, 18 pages, dated Nov. 2, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031421, 13 pages, dated Jun. 21, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031426, 13 pages, dated Jun. 22, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031431, 14 pages, dated Aug. 1, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031434, 13 pages, dated Jun. 26, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031439, 10 pages, dated Jun. 20, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031824, 18 pages, dated Sep. 22, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031827, 16 pages, dated Aug. 1, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031831, 12 pages, dated Jul. 21, 2017.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2018/037364, 10 pages, dated Sep. 3, 2018.
ISA/EP, International Search Report for International Application No. PCT/US2017/031826, 16 pages, dated Feb. 27, 2018.
ISA/EP, Partial International Search Report for International Application No. PCT/US2017/031140, 13 pages, dated Aug. 29, 2017.
ISA/EP, Partial International Search Report for International Application No. PCT/US2017/031823, 12 pages, dated Nov. 30, 2017.
“Moving Average Filters”, Waybackmachine XP05547422, Retrieved from the Internet: URL:https://web.archive.org/web/20170809081353/https//www.analog.com/media/en/technical-documentation/dsp-book/dsp_book_Ch15.pdf [retrieved on Jan. 24, 2019], Aug. 9, 2017, 1-8.
Office Action for CN App No. 201780029016.7 dated Mar. 24, 2020, 7 pages.
“Receiver Thermal Noise Threshold”, Fisher Telecommunication Services, Satellite Communications. Retrieved from the Internet: URL:https://web.archive.org/web/20171027075705/http//www.fishercom.xyz:80/satellite-communications/receiver-thermal-noise-threshold.html, Oct. 27, 2017, 3.
“Sleep Mode”, Wikipedia, Retrieved from the Internet: URL:https://web.archive.org/web/20170908153323/https://en.wikipedia.org/wiki/Sleep_mode [retrieved on Jan. 25, 2019], Sep. 8, 2017, 1-3.
“TMS320C5515 Fingerprint Development Kit (FDK) Hardware Guide”, Texas Instruments, Literature No. SPRUFX3, XP055547651, Apr. 2010, 1-26.
“ZTE V7 MAX. 5,5” smartphone on MediaTeck Helio P10 cpu; Published on Apr. 20, 2016; https://www.youtube.com/watch?v=ncNCbpkGQzU (Year: 2016).
Cappelli, et al., “Fingerprint Image Reconstruction from Standard Templates”, IEEE Transactions On Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 29, No. 9, Sep. 2007, 1489-1503.
Dausch, et al., “Theory and Operation of 2-D Array Piezoelectric Micromachined Ultrasound Transducers”, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 55, No. 11, Nov. 2008, 2484-2492.
Feng, et al., “Fingerprint Reconstruction: From Minutiae to Phase”, IEEE Transactions On Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 33, No. 2, Feb. 2011, 209-223.
Hopcroft, et al., “Temperature Compensation of a MEMS Resonator Using Quality Factor as a Thermometer”, Retrieved from Internet: http://micromachine.stanford.edu/˜amanu/linked/MAH_MEMS2006.pdf, 2006, 222-225.
Hopcroft, et al., “Using the temperature dependence of resonator quality factor as a thermometer”, Applied Physics Letters 91. Retrieved from Internet: http://micromachine.stanford.edu/˜hopcroft/Publications/Hopcroft_QT_ApplPhysLett_91_013505.pdf, 2007, 013505-1-031505-3.
Jiang, et al., “Ultrasonic Fingerprint Sensor with Transmit Beamforming Based on a PMUT Array Bonded to CMOS Circuitry”, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, Jan. 1, 2017, 1-9.
Kumar, et al., “Towards Contactless, Low-Cost and Accurate 3D Fingerprint Identification”, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 37, No. 3, Mar. 2015, 681-696.
Lee, et al., “Low jitter and temperature stable MEMS oscillators”, Frequency Control Symposium (FCS), 2012 IEEE International, May 2012, 1-5.
Li, et al., “Capacitive micromachined ultrasonic transducer for ultra-low pressure measurement: Theoretical study”, AIP Advances 5.12. Retrieved from Internet: http://scitation.aip.org/content/aip/journal/adva/5/12/10.1063/1.4939217, 2015, 127231.
Pang, et al., “Extracting Valley-Ridge Lines from Point-Cloud-Based 3D Fingerprint Models”, IEEE Computer Graphics and Applications, IEEE Service Center, New York, vol. 33, No. 4, Jul./Aug. 2013, 73-81.
Papageorgiou, et al., “Self-Calibration of Ultrasonic Transducers in an Intelligent Data Acquisition System”, International Scientific Journal of Computing, 2003, vol. 2, Issue 2 Retrieved Online: URL: https://scholar.google.com/scholar?q=self-calibration+of+ultrasonic+transducers+in+an+intelligent+data+acquisition+system&hl=en&as_sdt=0&as_vis=1&oi=scholart, 2003, 9-15.
Qiu, et al., “Piezoelectric Micromachined Ultrasound Transducer (PMUT) Arrays for Integrated Sensing, Actuation and Imaging”, Sensors 15, doi:10.3390/s150408020, Apr. 3, 2015, 8020-8041.
Ross, et al., “From Template to Image: Reconstructing Fingerprints from Minutiae Points”, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 29, No. 4, Apr. 2007, 544-560.
Rozen, et al., “Air-Coupled Aluminum Nitride Piezoelectric Micromachined Ultrasonic Transducers at 0.3 MHz to 0.9 MHz”, 2015 28th IEEE International Conference on Micro Electro Mechanical Systems (MEMS), IEEE, Jan. 18, 2015, 921-924.
Savoia, et al., “Design and Fabrication of a cMUT Probe for Ultrasound Imaging of Fingerprints”, 2010 IEEE International Ultrasonics Symposium Proceedings, Oct. 2010, 1877-1880.
Shen, et al., “Anisotropic Complementary Acoustic Metamaterial for Canceling out Aberrating Layers”, American Physical Society, Physical Review X 4.4: 041033., Nov. 19, 2014, 041033-1-041033-7.
Thakar, et al., “Multi-resonator approach to eliminating the temperature dependence of silicon-based timing references”, Hilton Head'14. Retrieved from the Internet: http://blog.narotama.ac.id/wp-content/uploads/2014/12/Multi-resonator-approach-to-eliminating-the-temperature-dependance-of-silicon-based-timing-references.pdf, 2014, 415-418.
Zhou, et al., “Partial Fingerprint Reconstruction with Improved Smooth Extension”, Network and System Security, Springer Berlin Heidelberg, Jun. 3, 2013, 756-762.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2019/061516, 14 pages, dated Mar. 12, 2020.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/033854, 16 pages, dated Nov. 3, 2020.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/039208, 10 pages, dated Oct. 9, 2020.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/039452, 11 pages, dated Sep. 9, 2020.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/042427, 18 pages, dated Dec. 14, 2020.
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2021/021412, 12 pages, dated Jun. 9, 2021.
ISA/EP, Partial Search Report and Provisional Opinion for International Application No. PCT/US2020/042427, 13 pages, dated Oct. 26, 2020.
ISA/EP, Partial Search Report for International Application No. PCT/US2020/033854, 10 pages, dated Sep. 8, 2020.
Office Action for CN App No. 201780029016.7 dated Sep. 25, 2020, 7 pages.
Taiwan Application No. 106114623, 1st Office Action, dated Aug. 5, 2021, pp. 1-8.
Tang, et al., “11.2 3D Ultrasonic Fingerprint Sensor-on-a-Chip”, 2016 IEEE International Solid-State Circuits Conference, IEEE, Jan. 31, 2016, 202-203.
EP Office Action, for Application 17724184.1, dated Oct. 12, 2021, 6 pages.
EP Office Action, dated Oct. 9, 2021, 6 pages.
European Patent Office, Office Action, App 17725018, pp. 5, dated Oct. 25, 2021.
European Patent Office, Office Action, App 17725020.6, pp. 4, dated Oct. 25, 2021.
Tang, et al., “Pulse-echo ultrasonic fingerprint sensor on a chip”, 2015 Transducers, 2015 18th International Conference on Solid-State Sensors, Actuators and Microsystems, Apr. 1, 2015, 674-677.
EP Office Action, for Application 17725017.2 dated Feb. 25, 2022, 7 pages.
Related Publications (1)
Number Date Country
20210334568 A1 Oct 2021 US