Fingerprint sensors have become ubiquitous in mobile devices, as well as in other devices (e.g., locks on cars and buildings) and applications, for authenticating a user's identity. They provide a fast and convenient way for the user to unlock a device, provide authentication for payments, etc. It is essential that fingerprint sensors operate at a level of security that, at a minimum, reduces the potential for fingerprint authentication to be circumvented. For instance, fake fingers bearing fake or spoofed fingerprints can be used in attempts to circumvent fingerprint authentication at fingerprint sensors.
The accompanying drawings, which are incorporated in and form a part of the Description of Embodiments, illustrate various non-limiting and non-exhaustive embodiments of the subject matter and, together with the Description of Embodiments, serve to explain principles of the subject matter discussed below. Unless specifically noted, the drawings referred to in this Brief Description of Drawings should be understood as not being drawn to scale and like reference numerals refer to like parts throughout the various figures unless otherwise specified.
The following Description of Embodiments is merely provided by way of example and not of limitation. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding background or in the following Description of Embodiments.
Reference will now be made in detail to various embodiments of the subject matter, examples of which are illustrated in the accompanying drawings. While various embodiments are discussed herein, it will be understood that they are not intended to limit the subject matter to these embodiments. On the contrary, the presented embodiments are intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the various embodiments as defined by the appended claims. Furthermore, in this Description of Embodiments, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present subject matter. However, embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the described embodiments.
Some portions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing and other symbolic representations of operations on data within an electrical device. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present application, a procedure, logic block, process, or the like, is conceived to be one or more self-consistent procedures or instructions leading to a desired result. The procedures are those requiring physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of acoustic (e.g., ultrasonic) signals capable of being transmitted and received by an electronic device and/or electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in an electrical device.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the description of embodiments, discussions utilizing terms such as “capturing,” “quantifying,” “determining,” “comparing,” “generating,” “providing,” “receiving,” “analyzing,” “confirming,” “displaying,” “presenting,” “using,” “completing,” “instructing,” “executing,” or the like, refer to the actions and processes of an electronic device such as an electrical device.
Embodiments described herein may be discussed in the general context of processor-executable instructions residing on some form of non-transitory processor-readable medium, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.
In the figures, a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, logic, circuits, and steps have been described generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. Also, the example fingerprint sensing system and/or mobile electronic device described herein may include components other than those shown, including well-known components.
Various techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium comprising instructions that, when executed, perform one or more of the methods described herein. The non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.
The non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.
Various embodiments described herein may be executed by one or more processors, such as one or more motion processing units (MPUs), sensor processing units (SPUs), host processor(s) or core(s) thereof, digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), a programmable logic controller (PLC), a complex programmable logic device (CPLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein, or other equivalent integrated or discrete logic circuitry. The term “processor,” as used herein may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein. As employed in the subject specification, the term “processor” can refer to substantially any computing processing unit or device comprising, but not limited to comprising, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory. Moreover, processors can exploit nano-scale architectures such as, but not limited to, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance performance of user equipment. A processor may also be implemented as a combination of computing processing units.
In addition, in some aspects, the functionality described herein may be provided within dedicated software modules or hardware modules configured as described herein. Also, the techniques could be fully implemented in one or more circuits or logic elements. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of an SPU/MPU and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with an SPU core, MPU core, or any other such configuration.
Discussion begins with a description of a device including a fingerprint sensor, upon which described embodiments can be implemented. An example fingerprint sensor and system for determining whether a fingerprint image is generated using a real finger or a fake finger is then described, in accordance with various embodiments. Example operations of a fingerprint sensor for determining whether a fingerprint image is generated using a real finger or a fake finger based on features of finger ridges of the captured fingerprint images are then described.
Fingerprint sensors are used for user authentication in electronic devices, such as mobile electronic devices, applications operating on mobile electronic devices, and locks for accessing cars or buildings, to protect against unauthorized access to the devices and/or applications. Authentication of a fingerprint at a fingerprint sensor is performed before providing access to a device and/or application. In order to circumvent fingerprint authentication, attempts can be made to copy or spoof fingerprints of an authorized user using a fake or artificial finger. As such, fingerprint sensors should be capable of distinguishing real fingers from fake, artificial, or even dead fingers, also referred to herein as performing "spoof detection" or "fake finger detection". A "spoofed" fingerprint is a fake or artificial fingerprint that is used to attempt to circumvent security measures requiring fingerprint authentication. For example, an artificial finger may be used to gain unauthorized access to the electronic device or application by making an unauthorized copy of the fingerprint of an authorized user, e.g., "spoofing" an actual fingerprint. The spoof detection may be performed by analyzing fingerprint images captured by the fingerprint sensor, e.g., performing biometric analysis of the fingerprint images, or examining any characteristics that can help distinguish a fake/spoof fingerprint from a real fingerprint. These characteristics may be static features or dynamic features, the latter having a certain time dependency because they change over time.
Embodiments described herein provide methods and systems for determining whether a finger interacting with a fingerprint sensor, for purposes of authentication, is a real finger or a fake finger based on observed features of finger ridges of the captured fingerprint images. Observed features of finger ridges may refer to the width of ridges of the ridge/valley pattern of captured fingerprint images. For example, a feature of the finger ridge may include a profile of the ridge, and how the profile changes based on depth and/or deformation. In some embodiments, capturing multiple fingerprint images using different times of flight allows for detecting ridge features that are indicative of whether a finger is a real finger or a fake finger.
Embodiments described herein provide for determining whether a finger is a real finger at an ultrasonic fingerprint sensor. A first image of a fingerprint pattern is captured at an ultrasonic fingerprint sensor, wherein the first image is based on ultrasonic signals corresponding to a first time of flight range. A second image of the fingerprint pattern is captured at the ultrasonic fingerprint sensor, wherein the second image is based on ultrasonic signals corresponding to a second time of flight range, the second time of flight range being delayed compared to the first time of flight range. A difference in a width of ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image is quantified. Based on the quantification of the difference, a probability whether the finger is a real finger is determined.
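By way of illustration only, the following Python sketch mirrors this overall flow. The sensor interface and the injected quantifier function are hypothetical placeholders (candidate quantifiers are sketched in later paragraphs), not an actual fingerprint sensor API; the default 100 ns delay and 0.8 threshold are simply the example values discussed herein.

```python
# Minimal sketch of the overall decision flow, assuming a hypothetical
# sensor object and an injected quantifier function (e.g., one of the
# sketches that follow); this is not an actual fingerprint sensor API.

def is_real_finger(sensor, quantify, tof_delay_ns=100, ratio_threshold=0.8):
    """Return (ratio, decision) for the finger currently on the sensor."""
    # First image: time-of-flight window at the sensor contact surface.
    img_surface = sensor.capture_image(tof_delay_ns=0)
    # Second image: delayed time-of-flight window (e.g., 50-150 ns later),
    # imaging slightly deeper into the finger.
    img_deep = sensor.capture_image(tof_delay_ns=tof_delay_ns)

    # Quantify ridge narrowing between the two images; for a real finger,
    # ridges in the delayed image appear narrower.
    ratio = quantify(img_surface, img_deep)

    # The probability that the finger is real increases with the ratio;
    # a ratio satisfying the threshold indicates a real finger.
    return ratio, ratio > ratio_threshold
```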
In some embodiments, quantifying the difference in the width of the ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image includes determining a difference between the first image and the second image. In one embodiment, a difference image is generated by subtracting the second image from the first image (or vice versa). A first signal strength is determined at a first spatial frequency range of at least one of the first image and the second image, and a second signal strength is determined at a second spatial frequency range of the difference between the first image and the second image. In one embodiment, the first signal strength corresponds to a maximum in signal strength of the first spatial frequency range and the second signal strength corresponds to a maximum in signal strength of the second spatial frequency range. In one embodiment, the second spatial frequency range is distributed around a frequency substantially double a main frequency contribution of the first spatial frequency range. In one embodiment, the first spatial frequency range corresponds to a spatial ridge frequency range of the fingerprint pattern.
The first signal strength is compared to the second signal strength. In some embodiments, a ratio of the second signal strength to the first signal strength is determined. Provided the ratio satisfies a ratio range threshold, it is determined that the finger is a real finger. In one embodiment, the ratio range threshold is above 0.8. Based on the comparing, the probability that the finger is a real finger is determined. In one embodiment, the probability that the finger is a real finger is based on the ratio of the second signal strength to the first signal strength, wherein the probability that the finger is a real finger increases as the ratio of the second signal strength to the first signal strength increases.
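One plausible realization of this spectral comparison is sketched below in Python, assuming the two captures are 2D grayscale NumPy arrays; the ridge-frequency band edges are illustrative assumptions, and this is one possible example rather than the claimed implementation.

```python
import numpy as np

def spectral_ridge_ratio(img1, img2):
    """Quantify ridge narrowing in the spatial frequency domain.

    img1, img2: 2D grayscale arrays holding the first (surface time of
    flight) and second (delayed time of flight) images of the same
    finger placement. Returns the ratio of the difference image's peak
    signal strength near double the ridge frequency to the first
    image's peak at the ridge frequency.
    """
    diff = img1.astype(float) - img2.astype(float)

    def radial_spectrum(img):
        # Direction-averaged 2D power spectrum: signal strength as a
        # function of spatial frequency magnitude (in FFT bins), so the
        # ridge frequency can be found without estimating orientation.
        f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
        power = np.abs(f) ** 2
        cy, cx = img.shape[0] // 2, img.shape[1] // 2
        y, x = np.indices(img.shape)
        r = np.hypot(y - cy, x - cx).astype(int)
        counts = np.maximum(np.bincount(r.ravel()), 1)  # avoid /0 in gaps
        return np.bincount(r.ravel(), power.ravel()) / counts

    spec1 = radial_spectrum(img1)
    spec_diff = radial_spectrum(diff)

    # First signal strength: maximum within an assumed ridge-frequency
    # band of the first image (band edges here are illustrative only
    # and would be tuned to the sensor's resolution).
    lo, hi = 5, 40
    f_ridge = lo + int(np.argmax(spec1[lo:hi]))
    s1 = spec1[f_ridge]

    # Second signal strength: maximum of the difference image's spectrum
    # in a band distributed around double the main ridge frequency.
    s2 = spec_diff[int(1.5 * f_ridge):int(2.5 * f_ridge)].max()

    return s2 / s1
```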
In other embodiments, determining the second signal strength at the second spatial frequency range of the difference between the first image and the second image includes determining the second signal strength at the second spatial frequency of the difference image. In some embodiments, quantifying the difference in the width of the ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image includes determining a first average width of ridges of the fingerprint pattern of the first image and determining a second average width of ridges of the fingerprint pattern of the second image. A difference between the first average width and the second average width is quantified. In some embodiments, a ratio of the first average width to the second average width is determined. In some embodiments, determining the probability whether the finger is a real finger includes comparing the ratio of the first average width to the second average width to a width range threshold.
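The width-based alternative admits a similarly minimal sketch, given below under the assumptions that ridges appear as the brighter pixels and that a simple per-row run-length average is an adequate width estimate; a practical implementation would use a more robust, orientation-aware estimator.

```python
import numpy as np

def average_ridge_width(img):
    """Estimate the mean ridge width (in pixels) of a fingerprint image.

    Simple run-length estimate: binarize so ridges are True (assumes
    ridges are the brighter pixels; invert for the opposite polarity),
    then average the lengths of contiguous ridge runs along each row.
    """
    ridges = img > img.mean()
    run_lengths = []
    for row in ridges:
        # +1 marks the start of a ridge run, -1 marks the end.
        edges = np.diff(np.concatenate(([False], row, [False])).astype(int))
        starts = np.flatnonzero(edges == 1)
        ends = np.flatnonzero(edges == -1)
        run_lengths.extend(ends - starts)
    return float(np.mean(run_lengths)) if run_lengths else 0.0

def width_ratio(img1, img2):
    """Ratio of first-image to second-image average ridge width. For a
    real finger the delayed (second) image shows narrower ridges, so
    the ratio is expected to exceed 1; it would then be compared
    against a width range threshold."""
    return average_ridge_width(img1) / average_ridge_width(img2)
```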
Turning now to the figures, FIG. 1 is a block diagram of an example electronic device 100 upon which embodiments described herein may be implemented.
As depicted in FIG. 1, electronic device 100 may include a host processor 110, a host bus 120, a host memory 130, a display 140, an interface 150, a transceiver 160, and a sensor processing unit (SPU) 170.
Host processor 110 can be one or more microprocessors, central processing units (CPUs), DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs or applications, which may be stored in host memory 130, associated with the functions and capabilities of electronic device 100.
Host bus 120 may be any suitable bus or interface to include, without limitation, a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI) or other equivalent. In the embodiment shown, host processor 110, host memory 130, display 140, interface 150, transceiver 160, sensor processing unit (SPU) 170, and other components of electronic device 100 may be coupled communicatively through host bus 120 in order to exchange commands and data. Depending on the architecture, different bus configurations may be employed as desired. For example, additional buses may be used to couple the various components of electronic device 100, such as by using a dedicated bus between host processor 110 and memory 130.
Host memory 130 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory), hard disk, optical disk, or some combination thereof. Multiple layers of software can be stored in host memory 130 for use with/operation upon host processor 110. For example, an operating system layer can be provided for electronic device 100 to control and manage system resources in real time, enable functions of application software and other layers, and interface application programs with other software and functions of electronic device 100. Similarly, a user experience system layer may operate upon or be facilitated by the operating system. The user experience system may comprise one or more software application programs such as menu navigation software, games, device function control, gesture recognition, image processing or adjusting, voice recognition, navigation software, communications software (such as telephony or wireless local area network (WLAN) software), and/or any of a wide variety of other software and functional interfaces for interaction with the user. In some embodiments, multiple different applications can be provided on a single electronic device 100, and in some of those embodiments, multiple applications can run simultaneously as part of the user experience system. In some embodiments, the user experience system, operating system, and/or the host processor 110 may operate in a low-power mode (e.g., a sleep mode) where very few instructions are processed. Such a low-power mode may utilize only a small fraction of the processing power of a full-power mode (e.g., an awake mode) of the host processor 110.
Display 140, when included, may be a liquid crystal device, an (organic) light emitting diode device, or another display device suitable for creating and visibly depicting graphic images and/or alphanumeric characters recognizable to a user. Display 140 may be configured to output images viewable by the user and may additionally or alternatively function as a viewfinder for a camera. It should be appreciated that display 140 is optional, as various electronic devices, such as electronic locks, doorknobs, car start buttons, etc., may not require a display device.
Interface 150, when included, can be any of a variety of different devices providing input and/or output to a user, such as audio speakers, touch screen, real or virtual buttons, joystick, slider, knob, printer, scanner, computer network I/O device, other connected peripherals and the like.
Transceiver 160, when included, may be one or more of a wired or wireless transceiver which facilitates receipt of data at electronic device 100 from an external transmission source and transmission of data from electronic device 100 to an external recipient. By way of example, and not of limitation, in various embodiments, transceiver 160 comprises one or more of: a cellular transceiver, a wireless local area network transceiver (e.g., a transceiver compliant with one or more Institute of Electrical and Electronics Engineers (IEEE) 802.11 specifications for wireless local area network communication), a wireless personal area network transceiver (e.g., a transceiver compliant with one or more IEEE 802.15 specifications for wireless personal area network communication), and a wired serial transceiver (e.g., a universal serial bus for wired communication).
Electronic device 100 also includes a general purpose sensor assembly in the form of integrated Sensor Processing Unit (SPU) 170 which includes sensor processor 172, memory 176, a fingerprint sensor 178, and a bus 174 for facilitating communication between these and other components of SPU 170. In some embodiments, SPU 170 may include at least one additional sensor 180 (shown as sensor 180-1, 180-2, . . . 180-n) communicatively coupled to bus 174. In some embodiments, at least one additional sensor 180 is a force or pressure sensor (e.g., a touch sensor) configured to determine a force or pressure, or a temperature sensor configured to determine a temperature, at electronic device 100. The force or pressure sensor may be disposed within, under, or adjacent fingerprint sensor 178. In some embodiments, all of the components illustrated in SPU 170 may be embodied on a single integrated circuit. It should be appreciated that SPU 170 may be manufactured as a stand-alone unit (e.g., an integrated circuit) that may exist separately from a larger electronic device and is coupled to host bus 120 through an interface (not shown). It should be appreciated that, in accordance with some embodiments, SPU 170 can operate independently of host processor 110 and host memory 130 using sensor processor 172 and memory 176.
Sensor processor 172 can be one or more microprocessors, CPUs, DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs, which may be stored in memory 176, associated with the functions of SPU 170. It should also be appreciated that fingerprint sensor 178 and additional sensor 180, when included, may also utilize processing and memory provided by other components of electronic device 100, e.g., host processor 110 and host memory 130.
Bus 174 may be any suitable bus or interface to include, without limitation, a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI) or other equivalent. Depending on the architecture, different bus configurations may be employed as desired. In the embodiment shown, sensor processor 172, memory 176, fingerprint sensor 178, and other components of SPU 170 may be communicatively coupled through bus 174 in order to exchange data.
Memory 176 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory). Memory 176 may store algorithms or routines or other instructions for processing data received from fingerprint sensor 178 and/or one or more sensor 180, as well as the received data either in its raw form or after some processing. Such algorithms and routines may be implemented by sensor processor 172 and/or by logic or processing capabilities included in fingerprint sensor 178 and/or sensor 180.
A sensor 180 may comprise, without limitation: a temperature sensor, a humidity sensor, an atmospheric pressure sensor, an infrared sensor, a radio frequency sensor, a navigation satellite system sensor (such as a global positioning system receiver), an acoustic sensor (e.g., a microphone), an inertial or motion sensor (e.g., a gyroscope, accelerometer, or magnetometer) for measuring the orientation or motion of the sensor in space, or other type of sensor for measuring other physical or environmental factors. In one example, sensor 180-1 may comprise an acoustic sensor, sensor 180-2 may comprise a temperature sensor, and sensor 180-n may comprise a motion sensor.
In some embodiments, fingerprint sensor 178 and/or one or more sensors 180 may be implemented using a microelectromechanical system (MEMS) that is integrated with sensor processor 172 and one or more other components of SPU 170 in a single chip or package. It should be appreciated that fingerprint sensor 178 may be disposed behind display 140. Although depicted as being included within SPU 170, one, some, or all of fingerprint sensor 178 and/or one or more sensors 180 may be disposed externally to SPU 170 in various embodiments. It should be appreciated that fingerprint sensor 178 can be any type of fingerprint sensor, including without limitation, an ultrasonic sensor, an optical sensor, a camera, etc.
Fingerprint images 215 are captured at fingerprint image capture 210. It should be appreciated that, in accordance with various embodiments, fingerprint image capture 210 is an ultrasonic sensor (e.g., a sensor capable of transmitting and receiving ultrasonic signals). The fingerprint sensor is operable to emit and detect ultrasonic waves (also referred to as ultrasonic signals or ultrasound signals). An array of ultrasonic transducers (e.g., Piezoelectric Micromachined Ultrasonic Transducers (PMUTs)) may be used to transmit and receive the ultrasonic waves, where the ultrasonic transducers of the array are capable of performing both the transmission and receipt of the ultrasonic waves. The emitted ultrasonic waves are reflected from any objects in contact with (or in front of) the fingerprint sensor, and these reflected ultrasonic waves, or echoes, are then detected. Where the object is a finger, the waves are reflected from different features of the finger, such as the surface features of the skin (e.g., the fingerprint) or features present in deeper layers of the finger (e.g., the dermis). Examples of surface features of a finger are the ridges and valleys of a fingerprint, e.g., the ridge/valley pattern of the finger. For example, the reflection of the sound waves from the ridge/valley pattern enables the fingerprint sensor to produce a fingerprint image that may be used for identification of the user.
In accordance with some embodiments, at least two fingerprint images 215 are captured at an ultrasonic fingerprint sensor at different times of flight. It should be appreciated that operating parameters of an ultrasonic fingerprint sensor can be controlled, allowing for image capture at different times of flight. For instance, an adjustment of the timing of transmission of the ultrasonic signals for ultrasonic transducers of an ultrasonic fingerprint sensor can change the time of flight. For example, a first fingerprint image is captured at a finger surface time of flight (e.g., a time of flight for imaging at a contact surface of the ultrasonic transducer) and a second fingerprint image is captured at a delayed time of flight (e.g., 50-150 nanoseconds) relative to the first image. Where the finger used for generating the fingerprint images is a real finger, it is typically observed (either visually or analytically) that ridges of the second fingerprint image are narrower than ridges of the first fingerprint image. This effect (ridge narrowing) is typically not observed in fake fingers. The time of flight delay may be selected for an optimum ridge narrowing effect, and may be dependent on the user. As such, the delay in time of flight may be determined for a user during enrollment.
In some embodiments, the first image is captured with the optimal time of flight (e.g., a calibrated time of flight for an optimum image) for measuring at the sensor surface, and the signal integration windows may be optimized for fake finger detection. For example, in normal fingerprint detection, the integration window may be large (e.g., 50-200 ns), while for fake finger detection a shorter integration window (e.g., <50 ns) may be used, because the shorter integration window provides more depth resolution. The signal integration window for the first and second image may be different. Embodiments described herein focus on the use of a first and second image; however, more images may be captured at different times of flight, and the methods described herein may then be applied in a similar manner to the plurality of images. It should be appreciated that the actual integration windows may depend on the ultrasonic fingerprint sensor stack, its thickness and materials, and the acoustic properties of the specific ultrasonic fingerprint sensor design.
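For intuition, the pulse-echo relation depth = (speed of sound × time) / 2 connects these delay and window values to physical depth. The short sketch below assumes a nominal soft-tissue speed of sound of about 1500 m/s; per the preceding paragraph, the actual value would vary with the sensor stack and materials.

```python
# Pulse-echo relation between time-of-flight settings and imaging depth:
# depth = speed_of_sound * time / 2 (the signal travels there and back).
# The speed of sound here is an assumed nominal value for soft tissue;
# actual values depend on the sensor stack materials, as noted above.

SPEED_OF_SOUND_M_S = 1500.0  # assumption; material dependent

def tof_to_depth_um(delay_ns, speed_m_s=SPEED_OF_SOUND_M_S):
    """Convert an extra round-trip delay (ns) to extra depth (micrometers)."""
    return speed_m_s * (delay_ns * 1e-9) / 2.0 * 1e6

# The 50-150 ns delay range discussed above corresponds to imaging
# roughly 37-113 micrometers deeper than the contact surface.
for delay in (50, 100, 150, 200):
    print(f"{delay:3d} ns -> ~{tof_to_depth_um(delay):5.1f} um")
# A <50 ns integration window likewise averages over less than ~37 um of
# depth, which is why the shorter window provides more depth resolution.
```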
In some embodiments, the beam focusing for the two images may be the same, and may be adapted to focus on the top of the sensor stack. In other embodiments, the beam focusing may be different, where the focusing for the second image is at a certain depth in the finger or a certain depth from the sensor surface.
The capturing of the first and second images may be initiated whenever a change in signal is detected. For example, to capture the images when the user presses the finger on the sensor, the image capture may be started as soon as an object or finger starts interacting with the sensor. For an ultrasonic sensor with an array of ultrasonic transducers, a subset of transducers may be active in a low power mode, and as soon as a finger starts interacting with the sensor, the full sensor may be activated to capture the sequence of images. In another example, where the user starts lifting the finger, a change in signal may occur as the pressure of the finger is reduced, and this may initiate the image capture. In some embodiments, a background image is captured before a finger contacts the sensor, where the background image can be used to enhance image quality by subtracting it from the captured fingerprint images. The change of contact state may be determined by the fingerprint sensor itself, or it may be detected by a second sensor associated with the fingerprint sensor. For example, a pressure sensor, a force sensor, or a touch sensor may be positioned near, below, or above the fingerprint sensor, and this additional sensor may be used to detect a change in contact state that initiates the capturing of the image sequence.
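A minimal sketch of this trigger-and-capture sequence follows; the sensor methods and numeric values in it are hypothetical placeholders rather than an actual driver interface.

```python
import time

# Illustrative sketch of the low-power trigger-and-capture sequence.
# The sensor methods (read_subset, set_full_array, capture) and the
# numeric values are hypothetical placeholders, not a real driver API.

def wait_and_capture(sensor, delta_threshold, tof_delay_ns=100, poll_s=0.01):
    """Poll a small subset of transducers in a low power mode; on a change
    in signal (finger touching or lifting), activate the full array and
    capture the background-corrected image pair."""
    # Background images captured before contact, one per time of flight,
    # used to enhance image quality by subtraction.
    bg_surface = sensor.capture(tof_delay_ns=0)
    bg_deep = sensor.capture(tof_delay_ns=tof_delay_ns)

    baseline = sensor.read_subset()
    while abs(sensor.read_subset() - baseline) < delta_threshold:
        time.sleep(poll_s)  # remain in low power mode until contact changes

    sensor.set_full_array(active=True)  # wake the full transducer array
    img1 = sensor.capture(tof_delay_ns=0)
    img2 = sensor.capture(tof_delay_ns=tof_delay_ns)
    return img1 - bg_surface, img2 - bg_deep
```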
The fingerprint images 215 can include any number of fingerprint images. In some embodiments, two fingerprint images 215 are captured using different times of flight. In some embodiments, fingerprint images 215 include at least two fingerprint images, but it should be appreciated that more than two images can be captured. For example, the fingerprint images 215 forwarded to difference quantifier 220 may include two or more images captured during a steady state while the finger is contacting the ultrasonic fingerprint sensor. The embodiments described herein may then be applied in a similar manner to the plurality of images to determine a change in ridge width as a function of time of flight.
Fingerprint images 215 are received at difference quantifier 220, which is configured to quantify a difference of ridge features (e.g., ridge width) of the fingerprint images 215. The (difference of) ridge features may be determined as a function of time of flight. For instance, ridge narrowing as a function of time of flight can be taken as an indicator for the probability that the finger is a real or fake finger. In general, the depth profile of the ridges can be used as an indicator as to the probability that the finger is a real or fake finger. To determine the probability, embodiments herein quantify the ridge narrowing (e.g., profile change).
Signal strength determiner 320 receives difference 315 and at least one of fingerprint images 215, and is configured to determine signal strengths corresponding to spatial frequency values or ranges. For example, signal strength determiner 320 determines a maximum signal strength at a first spatial frequency of at least one of fingerprint images 215 and a maximum signal strength at a second spatial frequency of difference 315. The first spatial frequency of each of the fingerprint images 215 should be substantially the same. In some embodiments, the second spatial frequency range is distributed around a frequency substantially double a main frequency contribution of the first spatial frequency range. In some embodiments, signal strength determiner 320 determines the maximum signal strength for at least one of fingerprint images 215, and identifies the corresponding spatial frequency (within the first spatial frequency range). Signal strength determiner 320 then determines the maximum signal strength for difference 315 at the second spatial frequency range, where the second spatial frequency range is distributed around a frequency substantially double the main frequency contribution of the first spatial frequency range.
The signal strengths 325 are output to signal strength comparer 330. Signal strength comparer 330 is configured to compare the signal strengths 325 for at least one fingerprint image 215 and difference 315. Output 335 is generated as a result of the comparison, where output 335 includes a probability whether the finger is a real finger or a fake finger. In one embodiment, the ratio of the signal strength of difference 315 to the signal strength of a fingerprint image 215 is determined. The ratio is then compared to a ratio range threshold. In some embodiments, the probability that the finger is a real finger is based on the ratio, wherein the probability that the finger is a real finger increases as the ratio increases. The reasoning is that for real fingers the change in width as a function of time of flight is more observable, and therefore the signal strength at the second spatial frequency range is higher. In one embodiment, the ratio range threshold is greater than 0.8. In some embodiments, output 335 is transmitted to fake finger determiner 230 as difference quantification 225.
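The exact mapping from the ratio to a probability is not prescribed here beyond being monotonically increasing, with ratios above the ratio range threshold indicating a real finger; the sketch below uses a logistic mapping as one illustrative, assumed choice.

```python
import math

def probability_real(ratio, midpoint=0.8, steepness=10.0):
    """Map the signal-strength ratio to a probability that the finger is
    real. The description only requires a monotonically increasing
    mapping with a threshold around 0.8; this logistic form and its
    steepness value are illustrative assumptions."""
    return 1.0 / (1.0 + math.exp(-steepness * (ratio - midpoint)))

# A ratio well above the 0.8 ratio range threshold yields a high
# probability; a low ratio (little ridge narrowing) yields a low one.
print(probability_real(1.0))  # ~0.88 -> likely a real finger
print(probability_real(0.5))  # ~0.05 -> likely a fake finger
```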
It should be appreciated that the use of the spatial frequency spectrum is only one method to analyze the ridge narrowing. Alternative methods may also be used to compare the ridges at the different times of flight, and to create an indicator of liveness. For example, in some embodiments, the widths of the ridges in the first image and the second image can be compared and a difference in the width of the ridges quantified.
With reference to FIG. 5, example 500 includes a first fingerprint image 510 and a second fingerprint image 520 captured at different times of flight, along with a difference image 530 generated by subtracting one image from the other, and example 550 includes a first fingerprint image 560, a second fingerprint image 570, and a difference image 580 captured and generated in the same manner.
Graph 515 illustrates a graph of spectrogram power (e.g., signal strength) versus frequency of first fingerprint image 510, in which the maximum power corresponds to a spatial frequency range centered at approximately 2.4 line pairs per millimeter (lpmm). Graph 525 illustrates a graph of spectrogram power versus frequency of second fingerprint image 520, in which the maximum power also corresponds to a spatial frequency range centered at approximately 2.4 lpmm. Graph 535 illustrates a graph of spectrogram power versus frequency of difference image 530, in which the maximum power corresponds to a spatial frequency range centered at approximately 4.8 lpmm.
As illustrated in example 500, the frequency of detected ridges in difference image 530, as illustrated in graph 535, is substantially double the frequency of detected ridges in first fingerprint image 510 and second fingerprint image 520, indicative of the finger being a real finger. Where the finger is a real finger, due to ridge narrowing in second fingerprint image 520, the ridges of difference image 530 are observably split. Therefore, difference image 530 has substantially twice the main spatial frequency compared to either of first fingerprint image 510 and second fingerprint image 520. Accordingly, example 500 illustrates that first fingerprint image 510 and second fingerprint image 520 were generated using a real finger.
In some embodiments, the ratio of the signal strength of difference image 530 to the signal strength of one of first fingerprint image 510 and second fingerprint image 520 is determined. The ratio is then compared to a ratio range threshold. In some embodiments, the probability that the finger is a real finger is based on the ratio, wherein the probability that the finger is a real finger increases as the ratio increases. The reasoning is that for real fingers the change in width as a function of time of flight is more observable, and therefore the signal strength of the second spatial frequency range is higher.
Graph 565 illustrates a graph of spectrogram power versus frequency of first fingerprint image 560, in which the maximum power corresponds to a spatial frequency range centered at approximately 2.1 line pairs per millimeter (lpmm). Graph 575 illustrates a graph of spectrogram power versus frequency of second fingerprint image 570, in which the maximum power also corresponds to a spatial frequency range centered at approximately 2.1 lpmm. Graph 585 illustrates a graph of spectrogram power versus frequency of difference image 580, in which a maximum power at around twice the main frequency contributions of the first and/or second image is difficult to determine.
As illustrated in example 550, the frequency of detected ridges in difference image 580, as illustrated in graph 585, is not substantially double the frequency of detected ridges in first fingerprint image 560 or second fingerprint image 570, indicative of the finger being a fake finger. Moreover, there is no maximum or peak signal strength at or near 4.2 lpmm in difference image 580. Where the finger is a fake finger, the ridges of difference image 580 are not clearly observable. In particular, there is no observable ridge splitting in difference image 580, and no observable ridge narrowing between first fingerprint image 560 and second fingerprint image 570. Therefore, example 550 illustrates that first fingerprint image 560 and second fingerprint image 570 were generated using a fake finger.
Graph 615 illustrates a graph of normalized power (e.g., signal strength) versus frequency of first fingerprint image 610, second fingerprint image 620, and difference image 630. First fingerprint image 610 corresponds to a ridge frequency of approximately 2.5 lpmm, as indicated at point 614 of line 612. Second fingerprint image 620 corresponds to a ridge frequency of approximately 2.3 lpmm, as indicated at point 624 of line 622. Difference image 630 corresponds to a maximum ridge frequency of approximately 4.6 lpmm, as indicated at point 634 of line 632.
As illustrated in graph 615, the frequency of detected ridges in difference image 630 is substantially double the frequency of detected ridges in first fingerprint image 610 and second fingerprint image 620, indicative of the finger being a real finger. Where the finger is a real finger, due to ridge narrowing in second fingerprint image 620, the ridges of difference image 630 are observably split. Therefore, difference image 630 has substantially twice the main spatial frequency compared to either of first fingerprint image 610 and second fingerprint image 620. Accordingly, example 600 illustrates that first fingerprint image 610 and second fingerprint image 620 were generated using a real finger.
In some embodiments, the ratio of the signal strength of difference image 630 to the signal strength of one of first fingerprint image 610 and second fingerprint image 620 is determined. The ratio is then compared to a ratio range threshold. In some embodiments, the probability that the finger is a real finger is based on the ratio, wherein the probability that the finger is a real finger increases as the ratio increases. The reasoning is that for real fingers the change in width as a function of time of flight is more observable, and therefore the signal strength of the second spatial frequency range is higher.
Graph 690 illustrates a graph of normalized power (e.g., signal strength) versus frequency of first fingerprint image 660, second fingerprint image 670, and difference image 680. First fingerprint image 660 and second fingerprint image 670 correspond to a ridge frequency of approximately 2.2 lpmm, as indicated at point 664 of line 662 and point 674 of line 672, respectively. Difference image 680 corresponds to a maximum ridge frequency of approximately 1.4 lpmm, as indicated at point 684 of line 682.
As illustrated in graph 690, the frequency of detected ridges in difference image 680 is not substantially double the frequency of detected ridges in first fingerprint image 660 or second fingerprint image 670, indicative of the finger being a fake finger. Where the finger is a fake finger, the ridges of difference image 680 are not observable. In particular, there is no observable ridge splitting in difference image 680, and no observable ridge narrowing between first fingerprint image 660 and second fingerprint image 670. Therefore, example 650 illustrates that first fingerprint image 660 and second fingerprint image 670 were generated using a fake finger.
At procedure 710 of flow diagram 700, a first image of a fingerprint pattern is captured at an ultrasonic fingerprint sensor, wherein the first image is based on ultrasonic signals corresponding to a first time of flight range. At procedure 720, a second image of the fingerprint pattern is captured at the ultrasonic fingerprint sensor, wherein the second image is based on ultrasonic signals corresponding to a second time of flight range, the second time of flight range being delayed compared to the first time of flight range.
At procedure 730, a difference in a width of ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image is quantified.
In some embodiments, procedure 730 is performed according to flow diagram 800 of FIG. 8.
At procedure 850, the first signal strength is compared to the second signal strength. In some embodiments, as shown at procedure 852, a ratio of the second signal strength to the first signal strength is determined. Provided the ratio satisfies a ratio range threshold, as shown at procedure 854, it is determined that the finger is a real finger. In one embodiment, the ratio range threshold is above 0.8. Based on the comparing, as shown at procedure 860, the probability that the finger is a real finger is determined. In one embodiment, the probability that the finger is a real finger is based on the ratio of the second signal strength to the first signal strength, wherein the probability that the finger is a real finger increases as the ratio of the second signal strength to the first signal strength increases.
In other embodiments, procedure 730 is performed according to flow diagram 900 of FIG. 9.
Returning to flow diagram 700 of FIG. 7, based on the quantification of the difference, a probability whether the finger is a real finger is determined.
The examples set forth herein were presented in order to best explain the described embodiments and their particular applications, and to thereby enable those skilled in the art to make and use them. However, those skilled in the art will recognize that the foregoing description and examples have been presented for the purposes of illustration and example only. Many aspects of the different example embodiments described above can be combined into new embodiments. The description as set forth is not intended to be exhaustive or to limit the embodiments to the precise form disclosed. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Reference throughout this document to “one embodiment,” “certain embodiments,” “an embodiment,” “various embodiments,” “some embodiments,” or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any embodiment may be combined in any suitable manner with one or more other features, structures, or characteristics of one or more other embodiments without limitation.
This application claims priority to and the benefit of U.S. Provisional Patent Application 62/865,810, filed on Jun. 24, 2019, entitled “FAKE FINGER DETECTION BY RIDGE NARROWING,” by Akhbari et al., and assigned to the assignee of the present application, which is incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4880012 | Sato | Nov 1989 | A |
5575286 | Weng et al. | Nov 1996 | A |
5684243 | Gururaja et al. | Nov 1997 | A |
5808967 | Yu et al. | Sep 1998 | A |
5867302 | Fleming | Feb 1999 | A |
5911692 | Hussain et al. | Jun 1999 | A |
6071239 | Cribbs et al. | Jun 2000 | A |
6104673 | Cole et al. | Aug 2000 | A |
6289112 | Jain et al. | Sep 2001 | B1 |
6292576 | Brownlee | Sep 2001 | B1 |
6350652 | Libera et al. | Feb 2002 | B1 |
6428477 | Mason | Aug 2002 | B1 |
6483932 | Martinez et al. | Nov 2002 | B1 |
6500120 | Anthony | Dec 2002 | B1 |
6676602 | Barnes et al. | Jan 2004 | B1 |
6736779 | Sano et al. | May 2004 | B1 |
7067962 | Scott | Jun 2006 | B2 |
7109642 | Scott | Sep 2006 | B2 |
7243547 | Cobianu et al. | Jul 2007 | B2 |
7257241 | Lo | Aug 2007 | B2 |
7400750 | Nam | Jul 2008 | B2 |
7433034 | Huang | Oct 2008 | B1 |
7459836 | Scott | Dec 2008 | B2 |
7471034 | Schlote-Holubek et al. | Dec 2008 | B2 |
7489066 | Scott et al. | Feb 2009 | B2 |
7634117 | Cho | Dec 2009 | B2 |
7739912 | Schneider et al. | Jun 2010 | B2 |
8018010 | Tigli et al. | Sep 2011 | B2 |
8139827 | Schneider et al. | Mar 2012 | B2 |
8255698 | Li et al. | Aug 2012 | B2 |
8311514 | Bandyopadhyay et al. | Nov 2012 | B2 |
8335356 | Schmitt | Dec 2012 | B2 |
8433110 | Kropp et al. | Apr 2013 | B2 |
8508103 | Schmitt et al. | Aug 2013 | B2 |
8515135 | Clarke et al. | Aug 2013 | B2 |
8666126 | Lee et al. | Mar 2014 | B2 |
8703040 | Liufu et al. | Apr 2014 | B2 |
8723399 | Sammoura et al. | May 2014 | B2 |
8805031 | Schmitt | Aug 2014 | B2 |
9056082 | Liautaud et al. | Jun 2015 | B2 |
9070861 | Bibl et al. | Jun 2015 | B2 |
9224030 | Du et al. | Dec 2015 | B2 |
9245165 | Slaby et al. | Jan 2016 | B2 |
9424456 | Kamath Koteshwara et al. | Aug 2016 | B1 |
9572549 | Belevich et al. | Feb 2017 | B2 |
9582102 | Setlak | Feb 2017 | B2 |
9582705 | Du et al. | Feb 2017 | B2 |
9607203 | Yazdandoost et al. | Mar 2017 | B1 |
9607206 | Schmitt et al. | Mar 2017 | B2 |
9613246 | Gozzini et al. | Apr 2017 | B1 |
9665763 | Du et al. | May 2017 | B2 |
9747488 | Yazdandoost et al. | Aug 2017 | B2 |
9785819 | Oreifej | Oct 2017 | B1 |
9815087 | Ganti et al. | Nov 2017 | B2 |
9817108 | Kuo et al. | Nov 2017 | B2 |
9818020 | Schuckers et al. | Nov 2017 | B2 |
9881195 | Lee et al. | Jan 2018 | B2 |
9881198 | Lee et al. | Jan 2018 | B2 |
9898640 | Ghavanini | Feb 2018 | B2 |
9904836 | Yeke Yazdandoost et al. | Feb 2018 | B2 |
9909225 | Lee et al. | Mar 2018 | B2 |
9922235 | Cho et al. | Mar 2018 | B2 |
9934371 | Hong et al. | Apr 2018 | B2 |
9939972 | Shepelev et al. | Apr 2018 | B2 |
9953205 | Rasmussen et al. | Apr 2018 | B1 |
9959444 | Young et al. | May 2018 | B2 |
9967100 | Hong et al. | May 2018 | B2 |
9983656 | Merrell et al. | May 2018 | B2 |
9984271 | King et al. | May 2018 | B1 |
10275638 | Yousefpor et al. | Apr 2019 | B1 |
10315222 | Salvia et al. | Jun 2019 | B2 |
10322929 | Soundara Pandian et al. | Jun 2019 | B2 |
10387704 | Dagan et al. | Aug 2019 | B2 |
10461124 | Berger et al. | Oct 2019 | B2 |
10478858 | Lasiter et al. | Nov 2019 | B2 |
10497747 | Tsai et al. | Dec 2019 | B2 |
10515255 | Strohmann | Dec 2019 | B2 |
10539539 | Garlepp et al. | Jan 2020 | B2 |
10600403 | Garlepp et al. | Mar 2020 | B2 |
10656255 | Ng et al. | May 2020 | B2 |
10670716 | Apte et al. | Jun 2020 | B2 |
10706835 | Garlepp et al. | Jul 2020 | B2 |
10755067 | De Foras et al. | Aug 2020 | B2 |
20020135273 | Mauchamp et al. | Sep 2002 | A1 |
20030013955 | Poland | Jan 2003 | A1 |
20040085858 | Khuri-Yakub et al. | May 2004 | A1 |
20040122316 | Satoh et al. | Jun 2004 | A1 |
20040174773 | Thomenius et al. | Sep 2004 | A1 |
20050023937 | Sashida et al. | Feb 2005 | A1 |
20050057284 | Wodnicki | Mar 2005 | A1 |
20050100200 | Abiko et al. | May 2005 | A1 |
20050110071 | Ema et al. | May 2005 | A1 |
20050146240 | Smith et al. | Jul 2005 | A1 |
20050148132 | Wodnicki et al. | Jul 2005 | A1 |
20050162040 | Robert | Jul 2005 | A1 |
20060052697 | Hossack et al. | Mar 2006 | A1 |
20060079777 | Karasawa | Apr 2006 | A1 |
20060230605 | Schlote-Holubek et al. | Oct 2006 | A1 |
20060280346 | Machida | Dec 2006 | A1 |
20070046396 | Huang | Mar 2007 | A1 |
20070047785 | Jang et al. | Mar 2007 | A1 |
20070073135 | Lee et al. | Mar 2007 | A1 |
20070202252 | Sasaki | Aug 2007 | A1 |
20070215964 | Khuri-Yakub et al. | Sep 2007 | A1 |
20070223791 | Shinzaki | Sep 2007 | A1 |
20070230754 | Jain et al. | Oct 2007 | A1 |
20080125660 | Yao et al. | May 2008 | A1 |
20080150032 | Tanaka | Jun 2008 | A1 |
20080194053 | Huang | Aug 2008 | A1 |
20080240523 | Benkley et al. | Oct 2008 | A1 |
20090005684 | Kristoffersen et al. | Jan 2009 | A1 |
20090182237 | Angelsen et al. | Jul 2009 | A1 |
20090232367 | Shinzaki | Sep 2009 | A1 |
20090274343 | Clarke | Nov 2009 | A1 |
20090303838 | Svet | Dec 2009 | A1 |
20100030076 | Vortman et al. | Feb 2010 | A1 |
20100046810 | Yamada | Feb 2010 | A1 |
20100113952 | Raguin et al. | May 2010 | A1 |
20100168583 | Dausch et al. | Jul 2010 | A1 |
20100195851 | Buccafusca | Aug 2010 | A1 |
20100201222 | Adachi et al. | Aug 2010 | A1 |
20100202254 | Roest et al. | Aug 2010 | A1 |
20100239751 | Regniere | Sep 2010 | A1 |
20100251824 | Schneider et al. | Oct 2010 | A1 |
20100256498 | Tanaka | Oct 2010 | A1 |
20100278008 | Ammar | Nov 2010 | A1 |
20110285244 | Lewis et al. | Nov 2011 | A1 |
20110291207 | Martin et al. | Dec 2011 | A1 |
20120016604 | Irving et al. | Jan 2012 | A1 |
20120092026 | Liautaud et al. | Apr 2012 | A1 |
20120095347 | Adam et al. | Apr 2012 | A1 |
20120147698 | Wong et al. | Jun 2012 | A1 |
20120224041 | Monden | Sep 2012 | A1 |
20120232396 | Tanabe | Sep 2012 | A1 |
20120238876 | Tanabe et al. | Sep 2012 | A1 |
20120263355 | Monden | Oct 2012 | A1 |
20120279865 | Regniere et al. | Nov 2012 | A1 |
20120288641 | Diatezua et al. | Nov 2012 | A1 |
20120300988 | Ivanov et al. | Nov 2012 | A1 |
20130051179 | Hong | Feb 2013 | A1 |
20130064043 | Degertekin et al. | Mar 2013 | A1 |
20130127592 | Fyke et al. | May 2013 | A1 |
20130133428 | Lee et al. | May 2013 | A1 |
20130201134 | Schneider et al. | Aug 2013 | A1 |
20130271628 | Ku et al. | Oct 2013 | A1 |
20130294201 | Hajati | Nov 2013 | A1 |
20130294202 | Hajati | Nov 2013 | A1 |
20140003679 | Han et al. | Jan 2014 | A1 |
20140060196 | Falter et al. | Mar 2014 | A1 |
20140117812 | Hajati | May 2014 | A1 |
20140176332 | Alameh et al. | Jun 2014 | A1 |
20140208853 | Onishi et al. | Jul 2014 | A1 |
20140219521 | Schmitt et al. | Aug 2014 | A1 |
20140232241 | Hajati | Aug 2014 | A1 |
20140265721 | Robinson et al. | Sep 2014 | A1 |
20140294262 | Schuckers et al. | Oct 2014 | A1 |
20140313007 | Harding | Oct 2014 | A1 |
20140355387 | Kitchens et al. | Dec 2014 | A1 |
20150036065 | Yousefpor et al. | Feb 2015 | A1 |
20150049590 | Rowe et al. | Feb 2015 | A1 |
20150087991 | Chen et al. | Mar 2015 | A1 |
20150097468 | Hajati et al. | Apr 2015 | A1 |
20150145374 | Xu et al. | May 2015 | A1 |
20150164473 | Kim et al. | Jun 2015 | A1 |
20150165479 | Lasiter et al. | Jun 2015 | A1 |
20150169136 | Ganti et al. | Jun 2015 | A1 |
20150189136 | Chung et al. | Jul 2015 | A1 |
20150198699 | Kuo et al. | Jul 2015 | A1 |
20150206738 | Rastegar | Jul 2015 | A1 |
20150213180 | Herberholz | Jul 2015 | A1 |
20150220767 | Yoon et al. | Aug 2015 | A1 |
20150241393 | Ganti et al. | Aug 2015 | A1 |
20150261261 | Bhagavatula et al. | Sep 2015 | A1 |
20150286312 | Kang et al. | Oct 2015 | A1 |
20150301653 | Urushi | Oct 2015 | A1 |
20150345987 | Hajati | Dec 2015 | A1 |
20150371398 | Qiao et al. | Dec 2015 | A1 |
20160051225 | Kim et al. | Feb 2016 | A1 |
20160063294 | Du et al. | Mar 2016 | A1 |
20160063300 | Du et al. | Mar 2016 | A1 |
20160070967 | Du et al. | Mar 2016 | A1 |
20160070968 | Gu | Mar 2016 | A1 |
20160086010 | Merrell et al. | Mar 2016 | A1 |
20160092715 | Yazdandoost et al. | Mar 2016 | A1 |
20160092716 | Yazdandoost et al. | Mar 2016 | A1 |
20160100822 | Kim et al. | Apr 2016 | A1 |
20160107194 | Panchawagh et al. | Apr 2016 | A1 |
20160180142 | Riddle et al. | Jun 2016 | A1 |
20160326477 | Fernandez-Alcon et al. | Nov 2016 | A1 |
20160350573 | Kitchens, II | Dec 2016 | A1 |
20160358003 | Shen et al. | Dec 2016 | A1 |
20170004346 | Kim et al. | Jan 2017 | A1 |
20170004352 | Jonsson et al. | Jan 2017 | A1 |
20170330552 | Garlepp et al. | Jan 2017 | A1 |
20170032485 | Vemury | Feb 2017 | A1 |
20170075700 | Abudi et al. | Mar 2017 | A1 |
20170076132 | Sezan et al. | Mar 2017 | A1 |
20170100091 | Eigil et al. | Apr 2017 | A1 |
20170110504 | Panchawagh et al. | Apr 2017 | A1 |
20170119343 | Pintoffl | May 2017 | A1 |
20170124374 | Rowe et al. | May 2017 | A1 |
20170168543 | Dai et al. | Jun 2017 | A1 |
20170185821 | Chen et al. | Jun 2017 | A1 |
20170194934 | Shelton et al. | Jul 2017 | A1 |
20170200054 | Du | Jul 2017 | A1 |
20170219536 | Koch et al. | Aug 2017 | A1 |
20170231534 | Agassy et al. | Aug 2017 | A1 |
20170255338 | Medina et al. | Sep 2017 | A1 |
20170293791 | Mainguet et al. | Oct 2017 | A1 |
20170316243 | Ghavanini | Nov 2017 | A1 |
20170316248 | He et al. | Nov 2017 | A1 |
20170322290 | Ng | Nov 2017 | A1 |
20170322291 | Salvia et al. | Nov 2017 | A1 |
20170322292 | Salvia et al. | Nov 2017 | A1 |
20170322305 | Apte et al. | Nov 2017 | A1 |
20170323133 | Tsai | Nov 2017 | A1 |
20170325081 | Chrisikos et al. | Nov 2017 | A1 |
20170326590 | Daneman | Nov 2017 | A1 |
20170326591 | Apte et al. | Nov 2017 | A1 |
20170326593 | Garlepp et al. | Nov 2017 | A1 |
20170326594 | Berger et al. | Nov 2017 | A1 |
20170328866 | Apte et al. | Nov 2017 | A1 |
20170328870 | Garlepp et al. | Nov 2017 | A1 |
20170330012 | Salvia et al. | Nov 2017 | A1 |
20170330553 | Garlepp et al. | Nov 2017 | A1 |
20170357839 | Yazdandoost et al. | Dec 2017 | A1 |
20180025202 | Ryshtun et al. | Jan 2018 | A1 |
20180032788 | Krenzer et al. | Feb 2018 | A1 |
20180101711 | D'Souza et al. | Apr 2018 | A1 |
20180107852 | Fenrich et al. | Apr 2018 | A1 |
20180107854 | Tsai et al. | Apr 2018 | A1 |
20180129849 | Strohmann et al. | May 2018 | A1 |
20180129857 | Bonev | May 2018 | A1 |
20180150679 | Kim et al. | May 2018 | A1 |
20180178251 | Foncellino et al. | Jun 2018 | A1 |
20180206820 | Anand et al. | Jul 2018 | A1 |
20180225495 | Jonsson | Aug 2018 | A1 |
20180229267 | Ono et al. | Aug 2018 | A1 |
20180276443 | Strohmann et al. | Sep 2018 | A1 |
20180329560 | Kim et al. | Nov 2018 | A1 |
20180341799 | Schwartz et al. | Nov 2018 | A1 |
20180349663 | Garlepp et al. | Dec 2018 | A1 |
20180357457 | Rasmussen et al. | Dec 2018 | A1 |
20180369866 | Sammoura et al. | Dec 2018 | A1 |
20180373913 | Panchawagh et al. | Dec 2018 | A1 |
20190005300 | Garlepp et al. | Jan 2019 | A1 |
20190012673 | Chakraborty et al. | Jan 2019 | A1 |
20190018123 | Narasimha-Iyer et al. | Jan 2019 | A1 |
20190057267 | Kitchens, II | Feb 2019 | A1 |
20190073507 | D'Souza et al. | Mar 2019 | A1 |
20190087632 | Raguin et al. | Mar 2019 | A1 |
20190102046 | Miranto et al. | Apr 2019 | A1 |
20190130083 | Agassy et al. | May 2019 | A1 |
20190171858 | Ataya et al. | Jun 2019 | A1 |
20190188441 | Hall et al. | Jun 2019 | A1 |
20190188442 | Flament et al. | Jun 2019 | A1 |
20190325185 | Tang | Oct 2019 | A1 |
20190340455 | Jung et al. | Nov 2019 | A1 |
20190370518 | Maor et al. | Dec 2019 | A1 |
20200030850 | Apte et al. | Jan 2020 | A1 |
20200050816 | Tsai | Feb 2020 | A1 |
20200050817 | Salvia et al. | Feb 2020 | A1 |
20200050820 | Iatsun et al. | Feb 2020 | A1 |
20200050828 | Li et al. | Feb 2020 | A1 |
20200074135 | Garlepp et al. | Mar 2020 | A1 |
20200125710 | Andersson et al. | Apr 2020 | A1 |
20200147644 | Chang | May 2020 | A1 |
20200158694 | Garlepp et al. | May 2020 | A1 |
20200175143 | Lee et al. | Jun 2020 | A1 |
20200210666 | Flament | Jul 2020 | A1 |
20200285882 | Skovgaard et al. | Sep 2020 | A1 |
20200302140 | Lu et al. | Sep 2020 | A1 |
20200342203 | Lin et al. | Oct 2020 | A1 |
20200410193 | Wu | Dec 2020 | A1 |
Number | Date | Country |
---|---|---|
1826631 | Aug 2006 | CN |
102159334 | Aug 2011 | CN |
105264542 | Jan 2016 | CN |
105378756 | Mar 2016 | CN |
109255323 | Jan 2019 | CN |
1214909 | Jun 2002 | EP |
2884301 | Jun 2015 | EP |
3086261 | Oct 2016 | EP |
2011040467 | Feb 2011 | JP |
201531701 | Aug 2015 | TW |
2009096576 | Aug 2009 | WO |
2009137106 | Nov 2009 | WO |
2014035564 | Mar 2014 | WO |
2015009635 | Jan 2015 | WO |
2015112453 | Jul 2015 | WO |
2015120132 | Aug 2015 | WO |
2015131083 | Sep 2015 | WO |
2015134816 | Sep 2015 | WO |
2015183945 | Dec 2015 | WO |
2016007250 | Jan 2016 | WO |
2016011172 | Jan 2016 | WO |
2016040333 | Mar 2016 | WO |
2016061406 | Apr 2016 | WO |
2016061410 | Apr 2016 | WO |
2017003848 | Jan 2017 | WO |
2017053877 | Mar 2017 | WO |
2017192895 | Nov 2017 | WO |
2017196678 | Nov 2017 | WO |
2017196681 | Nov 2017 | WO |
2017196682 | Nov 2017 | WO |
2017192903 | Dec 2017 | WO |
2019164721 | Aug 2019 | WO |
Entry |
---|
Tang, et al., “Pulse-Echo Ultrasonic Fingerprint Sensor on a Chip”, IEEE Transducers, Anchorage, Alaska, USA, Jun. 21-25, 2015, pp. 674-677. |
ISA/EP, Partial International Search Report for International Application No. PCT/US2019/034032, 8 pages, dated Sep. 12, 2019. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2018/063431, 15 pages, dated Feb. 5, 2019. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2019/015020, 23 pages, dated Jul. 1, 2019. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2019/023440, 10 pages, dated Jun. 4, 2019. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031120, 12 pages, dated Aug. 29, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031127, 13 pages, dated Sep. 1, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031134, 12 pages, dated Aug. 30, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031140, 18 pages, dated Nov. 2, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031421, 13 pages, dated Jun. 21, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031426, 13 pages, dated Jun. 22, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031431, 14 pages, dated Aug. 1, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031434, 13 pages, dated Jun. 26, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031439, 10 pages, dated Jun. 20, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031824, 18 pages, dated Sep. 22, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031827, 16 pages, dated Aug. 1, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031831, 12 pages, dated Jul. 21, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2018/037364, 10 pages, dated Sep. 3, 2018. |
ISA/EP, International Search Report for International Application No. PCT/US2017/031826, 16 pages, dated Feb. 27, 2018. |
ISA/EP, Partial International Search Report for International Application No. PCT/US2017/031826, 12 pages, dated Nov. 30, 2017. |
“Moving Average Filters”, Wayback Machine, XP05547422, Retrieved from the Internet: URL:https://web.archive.org/web/20170809081353/https://www.analog.com/media/en/technical-documentation/dsp-book/dsp_book_Ch15.pdf [retrieved on Jan. 24, 2019], Aug. 9, 2017, 1-8. |
Office Action for CN App No. 201780029016.7 dated Mar. 24, 2020, 7 pages. |
“Receiver Thermal Noise Threshold”, Fisher Telecommunication Services, Satellite Communications. Retrieved from the Internet: URL:https://web.archive.org/web/20171027075705/http://www.fishercom.xyz:80/satellite-communications/receiver-thermal-noise-threshold.html, Oct. 27, 2017, 3 pages. |
“Sleep Mode”, Wikipedia, Retrieved from the Internet: URL:https://web.archive.org/web/20170908153323/https://en.wikipedia.org/wiki/Sleep_mode [retrieved on Jan. 25, 2019], Sep. 8, 2017, 1-3. |
“TMS320C5515 Fingerprint Development Kit (FDK) Hardware Guide”, Texas Instruments, Literature No. SPRUFX3, XP055547651, Apr. 2010, 1-26. |
“ZTE V7 MAX 5.5″ smartphone on MediaTek Helio P10 CPU”, Published on Apr. 20, 2016; https://www.youtube.com/watch?v=ncNCbpkGQzU (Year: 2016). |
Cappelli, et al., “Fingerprint Image Reconstruction from Standard Templates”, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 29, No. 9, Sep. 2007, 1489-1503. |
Dausch, et al., “Theory and Operation of 2-D Array Piezoelectric Micromachined Ultrasound Transducers”, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 55, No. 11, Nov. 2008, 2484-2492. |
Feng, et al., “Fingerprint Reconstruction: From Minutiae to Phase”, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 33, No. 2, Feb. 2011, 209-223. |
Hopcroft, et al., “Temperature Compensation of a MEMS Resonator Using Quality Factor as a Thermometer”, Retrieved from Internet: http://micromachine.stanford.edu/~amanu/linked/MAH_MEMS2006.pdf, 2006, 222-225. |
Hopcroft, et al., “Using the temperature dependence of resonator quality factor as a thermometer”, Applied Physics Letters 91. Retrieved from Internet: http://micromachine.stanford.edu/~hopcroft/Publications/Hopcroft_QT_ApplPhysLett_91_013505.pdf, 2007, 013505-1-013505-3. |
Jiang, et al., “Ultrasonic Fingerprint Sensor with Transmit Beamforming Based on a PMUT Array Bonded to CMOS Circuitry”, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, Jan. 1, 2017, 1-9. |
Kumar, et al., “Towards Contactless, Low-Cost and Accurate 3D Fingerprint Identification”, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 37, No. 3, Mar. 2015, 681-696. |
Lee, et al., “Low jitter and temperature stable MEMS oscillators”, Frequency Control Symposium (FCS), 2012 IEEE International, May 1-5, 2012. |
Li, et al., “Capacitive micromachined ultrasonic transducer for ultra-low pressure measurement: Theoretical study”, AIP Advances 5.12. Retrieved from Internet: http://scitation.aip.org/content/aip/journal/adva/5/12/10.1063/1.4939217, 2015, 127231. |
Pang, et al., “Extracting Valley-Ridge Lines from Point-Cloud-Based 3D Fingerprint Models”, IEEE Computer Graphics and Applications, IEEE Service Center, New York, vol. 33, No. 4, Jul./Aug. 2013, 73-81. |
Papageorgiou, et al., “Self-Calibration of Ultrasonic Transducers in an Intelligent Data Acquisition System”, International Scientific Journal of Computing, 2003, vol. 2, Issue 2. Retrieved Online: URL: https://scholar.google.com/scholar?q=self-calibration+of+ultrasonic+transducers+in+an+intelligent+data+acquisition+system&hl=en&as_sdt=0&as_vis=1&oi=scholart, Sep. 15, 2003. |
Qiu, et al., “Piezoelectric Micromachined Ultrasound Transducer (PMUT) Arrays for Integrated Sensing, Actuation and Imaging”, Sensors 15, doi:10.3390/s150408020, Apr. 3, 2015, 8020-8041. |
Ross, et al., “From Template to Image: Reconstructing Fingerprints from Minutiae Points”, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 29, No. 4, Apr. 2007, 544-560. |
Rozen, et al., “Air-Coupled Aluminum Nitride Piezoelectric Micromachined Ultrasonic Transducers at 0.3 MHz to 0.9 MHz”, 2015 28th IEEE International Conference on Micro Electro Mechanical Systems (MEMS), IEEE, Jan. 18, 2015, 921-924. |
Savoia, et al., “Design and Fabrication of a cMUT Probe for Ultrasound Imaging of Fingers”, 2010 IEEE International Ultrasonics Symposium Proceedings, Oct. 2010, 1877-1880. |
Shen, et al., “Anisotropic Complementary Acoustic Metamaterial for Canceling out Aberrating Layers”, American Physical Society, Physical Review X 4.4: 041033., Nov. 19, 2014, 041033-1-041033-7. |
Thakar, et al., “Multi-resonator approach to eliminating the temperature dependence of silicon-based timing references”, Hilton Head'14. Retrieved from the Internet: http://blog.narotama.ac.id/wp-content/uploads/2014/12/Multi-resonator-approach-to-eliminating-the-temperature-dependance-of-silicon-based-timing-references.pdf, 2014, 415-418. |
Zhou, et al., “Partial Fingerprint Reconstruction with Improved Smooth Extension”, Network and System Security, Springer Berlin Heidelberg, Jun. 3, 2013, 756-762. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/033854, 16 pages, dated Nov. 3, 2020. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/039208, 10 pages, dated Oct. 9, 2020. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/039452, 11 pages, dated Sep. 9, 2020. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/042427, 18 pages, dated Dec. 14, 2020. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2021/021412, 12 pages, dated Jun. 9, 2021. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2021/021561, 9 pages, dated Jun. 28, 2021. |
ISA/EP, Partial Search Report and Provisional Opinion for International Application No. PCT/US2020/042427, 13 pages, dated Oct. 26, 2020. |
ISA/EP, Partial Search Report for International Application No. PCT/US2020/033854, 10 pages, dated Sep. 8, 2020. |
Office Action for CN App No. 201780029016.7 dated Sep. 25, 2020, 7 pages. |
Tang, et al., “11.2 3D Ultrasonic Fingerprint Sensor-on-a-Chip”, 2016 IEEE International Solid-State Circuits Conference, IEEE, Jan. 31, 2016, 202-203. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2019/061516, 14 pages, dated Mar. 12, 2020. |
Office Action for TW App No. 106114623 dated Aug. 5, 2021, 8 pages. |
Number | Date | Country | |
---|---|---|
20200401783 A1 | Dec 2020 | US |
Number | Date | Country | |
---|---|---|
62865810 | Jun 2019 | US |