Fingerprint sensors have become ubiquitous in mobile devices as well as other applications for authenticating a user's identity. They provide a fast and convenient way for the user to unlock a device, provide authentication for payments, etc. Current fingerprint sensors are typically area sensors that obtain a two-dimensional image of the user's finger area presented to the sensor. Different technologies can be used to image the finger such as capacitive, ultrasound, and optical sensing. Once an image is obtained, that image is processed by a matcher to extract features and to compare against stored images to authenticate the user. As such, accuracy of captured images is essential to the performance of image matching for user authentication.
The accompanying drawings, which are incorporated in and form a part of the Description of Embodiments, illustrate various embodiments of the subject matter and, together with the Description of Embodiments, serve to explain principles of the subject matter discussed below. Unless specifically noted, the drawings referred to in this Brief Description of Drawings should be understood as not being drawn to scale. Herein, like items are labeled with like item numbers.
The following Description of Embodiments is merely provided by way of example and not of limitation. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding background or brief summary, or in the following detailed description.
Reference will now be made in detail to various embodiments of the subject matter, examples of which are illustrated in the accompanying drawings. While various embodiments are discussed herein, it will be understood that they are not intended to limit the subject matter to these embodiments. On the contrary, the presented embodiments are intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the various embodiments as defined by the appended claims. Furthermore, in this Description of Embodiments, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present subject matter. However, embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the described embodiments.
Some portions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing and other symbolic representations of operations on data within an electrical circuit. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present application, a procedure, logic block, process, or the like, is conceived to be one or more self-consistent procedures or instructions leading to a desired result. The procedures are those requiring physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in an electronic device.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the description of embodiments, discussions utilizing terms such as “receiving,” “determining,” “decreasing,” “increasing,” “forwarding,” “changing,” “capturing,” “resetting,” “monitoring,” “detecting,” “using,” “adapting,” or the like, refer to the actions and processes of an electronic device such as an electrical circuit.
Embodiments described herein may be discussed in the general context of processor-executable instructions residing on some form of non-transitory processor-readable medium, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.
In the figures, a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, logic, circuits, and steps have been described generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. Also, the example fingerprint sensing system and/or mobile electronic device described herein may include components other than those shown, including well-known components.
Various techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium comprising instructions that, when executed, perform one or more of the methods described herein. The non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.
The non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.
Various embodiments described herein may be executed by one or more processors, such as one or more motion processing units (MPUs), sensor processing units (SPUs), host processor(s) or core(s) thereof, digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), programmable logic controllers (PLCs), complex programmable logic devices (CPLDs), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein, or other equivalent integrated or discrete logic circuitry. The term “processor,” as used herein, may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein. As employed in the subject specification, the term “processor” can refer to substantially any computing processing unit or device comprising, but not limited to, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory. Moreover, processors can exploit nano-scale architectures such as, but not limited to, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance performance of user equipment. A processor may also be implemented as a combination of computing processing units.
In addition, in some aspects, the functionality described herein may be provided within dedicated software modules or hardware modules configured as described herein. Also, the techniques could be fully implemented in one or more circuits or logic elements. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of an SPU/MPU and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with an SPU core, MPU core, or any other such configuration.
Discussion begins with a description of an example electronic device with which or upon which various embodiments described herein may be implemented. Different types of adaptation of the quality threshold are then introduced following the description of the example electronic device. Examples of short term adaptation of the quality threshold are then described. Examples of long term adaptation of the quality threshold according to environmental factors are then described. Finally, examples of long term adaptation of the quality threshold based on monitoring fingerprint image quality are described.
Fingerprint sensors, in accordance with the described embodiments, are used for capturing fingerprint images that are used for performing fingerprint authentication at a matcher. A matcher compares a fingerprint image to at least one stored fingerprint image (e.g., an image acquired during enrollment of the user) and authenticates the user of the sensor based on the comparison. In order for the matcher to provide accurate results, the fingerprint image must be of sufficient quality. For example, the fingerprint image should exhibit a correct ridge/valley pattern, exhibit ridge connectedness, be of sufficient contrast, and/or have a high contrast-to-noise ratio (CNR).
Embodiments described herein perform an image quality check on the captured fingerprint image prior to forwarding the fingerprint image to the matcher. The image quality check confirms that the fingerprint image is of sufficient quality to send to the matcher. For example, the image quality check may also act as a filter to ensure that non-fingerprint images (e.g., void images) do not get forwarded to the matcher. In some embodiments, the image quality of the captured fingerprint image is compared against a quality threshold. If the fingerprint image satisfies the quality threshold (e.g., the quality of the fingerprint image is higher than the quality threshold), the fingerprint image is forwarded to the matcher. If the fingerprint image does not satisfy the quality threshold (e.g., the quality of the fingerprint image is not higher than the quality threshold), the fingerprint image is not forwarded to the matcher.
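As an illustration of this gating behavior, the following sketch (hypothetical names and a simple pass/no-pass rule, not any particular implementation) forwards a captured image to the matcher only when its determined quality satisfies the threshold:

```python
def quality_gate(image, quality_threshold, compute_quality, matcher):
    """Forward the image to the matcher only if it passes the quality check.

    `compute_quality` and `matcher` stand in for the system's own quality
    metric and matching engine; the names here are illustrative only.
    """
    quality = compute_quality(image)        # e.g., a normalized 0-100 score
    if quality >= quality_threshold:        # quality threshold satisfied
        return matcher.authenticate(image)  # image forwarded to the matcher
    return None                             # below threshold: not forwarded
```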
Selection and setting of a quality threshold is an important determination in fingerprint image authentication. Forwarding non-fingerprint images or low-quality images to the matcher, which have a low likelihood of authentication, uses unnecessary system resources. For example, such images may wake up the processor performing the matching from a low-power mode unnecessarily. If the quality threshold is a high value, only high-quality images are forwarded to the matcher, and as a result, if the user of the sensor is an authorized user, there is a high confidence of the matcher authenticating the user. However, some fingerprint images from an authorized user may not satisfy a high quality threshold; these fingerprint images are not forwarded to the matcher, so authentication cannot be confirmed, resulting in an incorrect refusal of the user, also referred to as a false reject. Therefore, a high quality threshold may lead to a high false reject ratio (FRR) and is disadvantageous as authentication of a user is incorrectly denied. The inverse condition may also occur: if the quality threshold is a low value, confidence in the results of the matcher can degrade, as low-quality or incorrect fingerprint images may be accepted, resulting in a high false accept ratio (FAR). A high FAR is disadvantageous as user authentication may be improperly granted, resulting in security concerns. Therefore, adapting and setting the correct quality threshold is important for user experience, use of resources, and security.
Image quality of a fingerprint image can vary as a result of environmental conditions that impact the appearance of the fingerprint. For example, cold and dry conditions can decrease the quality of a fingerprint image, as the appearance of the fingerprint may be degraded. In another example, moist or wet conditions can also decrease the quality of the fingerprint image.
Embodiments described herein provide adaptation of the quality threshold for a fingerprint sensor to account for changes in quality of a user's fingerprints. In some embodiments, adaptation of the quality threshold is provided for a limited duration and the quality threshold is reset to an original (or other) value upon completion. Such embodiments are described herein as “short term” adaptation of the quality threshold. For example, a short term adaptation of a quality threshold may occur during an authentication session, such that upon completion of the authentication session, the quality threshold is reset to an original (or other) value. In other embodiments, adaptation of the quality threshold is provided without resetting the quality threshold value. Such embodiments are described herein as “long term” adaptation of the quality threshold. For example, a long term adaptation of a quality threshold may occur responsive to detecting a trend indicative of a change in fingerprint image quality, such as a change in season or a change in fingerprint quality over time. It should be appreciated that the short term adaptation and long term adaptation of a quality threshold can be used cooperatively. For example, a long term adaptation can adapt the original quality threshold value used in a short term adaptation, such that upon completion of the short term adaptation, the quality threshold is reset to the original quality threshold set by the long term adaptation.
In some embodiments, a short term adaptation of the quality threshold is provided, where the quality threshold is reset to an original (or other) value at completion of the adaptation operation. In such an embodiment, if a user fails to satisfy the quality threshold, the quality threshold is decreased, thereby increasing the likelihood of satisfying the quality threshold. In some embodiments, if a user fails to satisfy the quality threshold on successive attempts, the quality threshold is progressively decreased. In some embodiments, the quality threshold cannot be decreased below a minimum quality threshold. Once the fingerprint image satisfies a decreased quality threshold, the fingerprint image is forwarded to the matcher. In some embodiments, once the fingerprint image satisfies a decreased quality threshold, the quality threshold is reset to the initial value. An example of a short term adaptation of the quality threshold occurs during an authentication session, where upon completion of the adaptation of the quality threshold, the quality threshold is reset to an original (or other) value.
In some embodiments, a long term adaptation of the quality threshold is provided. Quality of a fingerprint image may change gradually over time (e.g., due to environmental factors or aging). For example, fingers may get drier during winter months, resulting in lower image quality, with the image quality improving for the same fingers in summer months. In some embodiments, a change in an environmental factor is monitored, and the quality threshold is adapted according to the change in the environmental factor. In other embodiments, the quality of the fingerprint image for a user is monitored over time. Responsive to detecting a trend indicative of a change in fingerprint image quality, the quality threshold is adapted. It should be appreciated that the short term adaptation of the quality threshold can be used in conjunction with the long term adaptation of the quality threshold.
Turning now to the figures,
As depicted in
Host processor 110 can be one or more microprocessors, central processing units (CPUs), DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs or applications, which may be stored in host memory 130, associated with the functions and capabilities of electronic device 100.
Host bus 120 may be any suitable bus or interface to include, without limitation, a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI) or other equivalent. In the embodiment shown, host processor 110, host memory 130, display 140, interface 150, transceiver 160, sensor processing unit (SPU) 170, and other components of electronic device 100 may be coupled communicatively through host bus 120 in order to exchange commands and data. Depending on the architecture, different bus configurations may be employed as desired. For example, additional buses may be used to couple the various components of electronic device 100, such as by using a dedicated bus between host processor 110 and memory 130.
Host memory 130 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory), hard disk, optical disk, or some combination thereof. Multiple layers of software can be stored in host memory 130 for use with/operation upon host processor 110. For example, an operating system layer can be provided for electronic device 100 to control and manage system resources in real time, enable functions of application software and other layers, and interface application programs with other software and functions of electronic device 100. Similarly, a user experience system layer may operate upon or be facilitated by the operating system. The user experience system may comprise one or more software application programs such as menu navigation software, games, device function control, gesture recognition, image processing or adjusting, voice recognition, navigation software, communications software (such as telephony or wireless local area network (WLAN) software), and/or any of a wide variety of other software and functional interfaces for interaction with the user can be provided. In some embodiments, multiple different applications can be provided on a single electronic device 100, and in some of those embodiments, multiple applications can run simultaneously as part of the user experience system. In some embodiments, the user experience system, operating system, and/or the host processor 110 may operate in a low-power mode (e.g., a sleep mode) where very few instructions are processed. Such a low-power mode may utilize only a small fraction of the processing power of a full-power mode (e.g., an awake mode) of the host processor 110.
Display 140, when included, may be a liquid crystal device, (organic) light emitting diode device, or other display device suitable for creating and visibly depicting graphic images and/or alphanumeric characters recognizable to a user. Display 140 may be configured to output images viewable by the user and may additionally or alternatively function as a viewfinder for a camera. It should be appreciated that display 140 is optional, as various electronic devices, such as electronic locks, doorknobs, car start buttons, etc., may not require a display device.
Interface 150, when included, can be any of a variety of different devices providing input and/or output to a user, such as audio speakers, touch screen, real or virtual buttons, joystick, slider, knob, printer, scanner, computer network I/O device, other connected peripherals and the like.
Transceiver 160, when included, may be one or more of a wired or wireless transceiver which facilitates receipt of data at electronic device 100 from an external transmission source and transmission of data from electronic device 100 to an external recipient. By way of example, and not of limitation, in various embodiments, transceiver 160 comprises one or more of: a cellular transceiver, a wireless local area network transceiver (e.g., a transceiver compliant with one or more Institute of Electrical and Electronics Engineers (IEEE) 802.11 specifications for wireless local area network communication), a wireless personal area network transceiver (e.g., a transceiver compliant with one or more IEEE 802.15 specifications for wireless personal area network communication), and a wired serial transceiver (e.g., a universal serial bus for wired communication).
Electronic device 100 also includes a general purpose sensor assembly in the form of integrated Sensor Processing Unit (SPU) 170 which includes sensor processor 172, memory 176, a fingerprint sensor 178, and a bus 174 for facilitating communication between these and other components of SPU 170. In some embodiments, SPU 170 may include at least one additional sensor 180 (shown as sensor 180-1, 180-2, . . . 180-n) communicatively coupled to bus 174. In some embodiments, all of the components illustrated in SPU 170 may be embodied on a single integrated circuit. It should be appreciated that SPU 170 may be manufactured as a stand-alone unit (e.g., an integrated circuit), that may exist separately from a larger electronic device and is coupled to host bus 120 through an interface (not shown).
Sensor processor 172 can be one or more microprocessors, CPUs, DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs, which may be stored in memory 176, associated with the functions of SPU 170. It should also be appreciated that fingerprint sensor 178 and additional sensor 180, when included, may also utilize processing and memory provided by other components of electronic device 100, e.g., host processor 110 and host memory 130.
Bus 174 may be any suitable bus or interface to include, without limitation, a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI) or other equivalent. Depending on the architecture, different bus configurations may be employed as desired. In the embodiment shown, sensor processor 172, memory 176, fingerprint sensor 178, and other components of SPU 170 may be communicatively coupled through bus 174 in order to exchange data.
Memory 176 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory). Memory 176 may store algorithms or routines or other instructions for processing data received from fingerprint sensor 178 and/or one or more sensor 180, as well as the received data either in its raw form or after some processing. Such algorithms and routines may be implemented by sensor processor 172 and/or by logic or processing capabilities included in fingerprint sensor 178 and/or sensor 180.
A sensor 180 may comprise, without limitation: a temperature sensor, a humidity sensor, an atmospheric pressure sensor, an infrared sensor, a radio frequency sensor, a navigation satellite system sensor (such as a global positioning system receiver), an acoustic sensor (e.g., a microphone), an inertial or motion sensor (e.g., a gyroscope, accelerometer, or magnetometer) for measuring the orientation or motion of the sensor in space, or other type of sensor for measuring other physical or environmental factors used in the adaptation of a quality threshold. In one example, sensor 180-1 may comprise an acoustic sensor, sensor 180-2 may comprise a temperature sensor, and sensor 180-n may comprise a motion sensor.
In some embodiments, fingerprint sensor 178 and/or one or more sensors 180 may be implemented using a microelectromechanical system (MEMS) that is integrated with sensor processor 172 and one or more other components of SPU 170 in a single chip or package. Although depicted as being included within SPU 170, one, some, or all of fingerprint sensor 178 and/or one or more sensors 180 may be disposed externally to SPU 170 in various embodiments.
Fingerprint image 215 is captured at fingerprint image capture 210. It should be appreciated that fingerprint image capture 210 can be any type of image capture device, including without limitation, an ultrasonic sensor, an optical sensor, a camera, etc. Fingerprint image 215 is received at fingerprint image quality check 220, where a quality check is performed on fingerprint image 215. Fingerprint image quality check 220 confirms that the fingerprint image is of sufficient quality to send to matcher 230. For example, fingerprint image quality check 220 may also act as a filter to ensure that non-fingerprint images (e.g., void images) do not get forwarded to matcher 230. In some embodiments, the image quality of fingerprint image 215 is compared against a quality threshold. If fingerprint image 215 satisfies the quality threshold (e.g., the quality of fingerprint image 215 is higher than the quality threshold), fingerprint image 215 is forwarded to matcher 230. If fingerprint image 215 does not satisfy the quality threshold (e.g., the quality of fingerprint image 215 is not higher than the quality threshold), fingerprint image 215 is not forwarded to matcher 230 for authentication.
Matcher 230 is configured to receive fingerprint image 215 and perform user authentication. Matcher 230 is the part of fingerprint sensing system 200 that compares fingerprint image 215 to at least one authentication fingerprint image (e.g., a fingerprint image acquired during enrollment of the user) and authenticates the user based on the comparison. Based on the comparison between fingerprint image 215 and at least one authentication image, matcher 230 outputs an authentication determination 240, e.g., fingerprint image 215 is authenticated or fingerprint image 215 is not authenticated. As used herein, enrollment refers to an operation where an authorized user inputs his/her fingerprint images into a fingerprint sensing system for later comparison during an authentication operation. In general, the higher the quality of fingerprint image 215, the higher the confidence of the authentication results. For example, fingerprint image 215 should exhibit ridge connectedness, be of sufficient contrast, and/or have a high CNR.
Fingerprint sensing system 300 includes quality determiner 320 for determining a quality of enrollment images 310. Enrollment images 310 include fingerprint images acquired during enrollment of a user and are used for comparison to fingerprint image 215 for authenticating the user of the sensor. Enrollment images 310 may also be referred to as the fingerprint templates, or templates. Enrollment images 310 are used as reference images for authentication. In some embodiments, these enrollment images 310 may be updated after enrollment, referred to herein as a dynamic update of the fingerprint templates.
In some embodiments, a quality threshold determination is made during or concurrent to an enrollment operation. During enrollment, quality determiner 320 determines a quality of enrollment images 310. Quality of an image is an indication of the condition of an image for purposes of matching. It should be appreciated that many factors can be included in a fingerprint image quality determination, including without limitation: ridge-connectedness, ridge distinctness, ridge continuity, image contrast, CNR, average gray-level, etc. Quality can be dependent on matcher 230, e.g., in consideration of features of the fingerprint such as pores. In some embodiments, the quality is a normalized value in the range of 0-100.
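Quality metrics differ between systems; purely for illustration, the sketch below combines a simple contrast estimate and a crude contrast-to-noise estimate into a 0-100 score. The weights and saturation constants are assumptions, not values prescribed by the embodiments.

```python
import numpy as np

def image_quality_score(image, w_contrast=0.5, w_cnr=0.5):
    """Toy 0-100 quality score built from image contrast and a crude CNR estimate.

    `image` is a 2-D grayscale array; weights and scaling constants are
    illustrative placeholders for a real quality determiner.
    """
    img = image.astype(np.float64)
    contrast = img.std()                                 # global contrast proxy
    noise = np.diff(img, axis=0).std() / np.sqrt(2.0)    # rough noise estimate
    cnr = contrast / noise if noise > 0 else 0.0         # contrast-to-noise ratio
    contrast_score = min(contrast / 64.0, 1.0) * 100.0   # saturate at an assumed level
    cnr_score = min(cnr / 10.0, 1.0) * 100.0
    return w_contrast * contrast_score + w_cnr * cnr_score
```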
Quality determiner 320 forwards the qualities of enrollment images 310 to quality thresholder 330. Quality thresholder 330 determines a quality threshold based on the received qualities. For example, quality thresholder 330 may determine an average quality of enrollment images 310. In some embodiments, the quality threshold can be set as the average quality of enrollment images 310. In some embodiments, the quality threshold can be set as a percentage or fraction of the average quality of enrollment images 310 (e.g., 80% of the average quality). It should be appreciated that the setting of the quality threshold may also depend on the variation of the quality of the enrollment images. For example, a large variation in the quality of the images during enrollment may suggest that a lower quality threshold may provide better performance. It should be appreciated that an initial or default quality threshold can be set to any value, where it is generally desirable to optimize the FRR and FAR, which may depend on the user or device in consideration. In some embodiments, the quality threshold may not consider the quality of the enrollment images, and may instead be set based on system and security considerations. The security considerations may depend on the application requesting the authentication. In some embodiments, the threshold may be a weighted average between the average quality of the enrollment images and a threshold based on system and security considerations.
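A sketch of how such an initial threshold could be derived from the enrollment qualities (the 80% fraction and the blending weight are only example values) is shown below.

```python
def enrollment_quality_threshold(enroll_qualities, fraction=0.8,
                                 security_threshold=None, security_weight=0.0):
    """Derive an initial quality threshold from enrollment image qualities.

    `fraction` scales the average enrollment quality (e.g., 80%); an optional
    system/security-driven value can be blended in with `security_weight`.
    All parameter values are illustrative assumptions.
    """
    avg_quality = sum(enroll_qualities) / len(enroll_qualities)
    threshold = fraction * avg_quality
    if security_threshold is not None:
        # Weighted average between the enrollment-based and security-based values.
        threshold = ((1.0 - security_weight) * threshold
                     + security_weight * security_threshold)
    return threshold
```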
In accordance with various embodiments described herein, adaptation of the quality threshold for a fingerprint sensor is provided to account for changes in quality of a user's fingerprints.
Fingerprint image 215 is received (e.g., from fingerprint image capture 210) at quality determiner 410, where image quality 415 of fingerprint image 215 is determined. It should be appreciated that quality determiner 410 utilizes a quality determination procedure consistent with that used for the enrollment images (e.g., the determination of the quality of enrollment images 310 relied upon by quality thresholder 330). Quality determiner 410 forwards image quality 415 for fingerprint image 215 to quality comparer 420.
Quality comparer 420 receives image quality 415 for fingerprint image 215 and quality threshold 425 from quality thresholder 330. Quality comparer 420 compares image quality 415 to quality threshold 425. If image quality 415 satisfies quality threshold 425 (e.g., is not less than quality threshold 425), fingerprint image 215 is determined to pass the quality check, and fingerprint image 215 is forwarded to matcher 230 for outputting authentication determination 240. If image quality 415 does not satisfy quality threshold 425 (e.g., is less than quality threshold 425), fingerprint image 215 is determined to fail the quality check.
In some embodiments, if a fingerprint image fails to satisfy the quality threshold, the quality threshold is decreased, thereby increasing the likelihood of satisfying the quality threshold. In such embodiments, quality comparer 420 transmits a notification to quality thresholder 330 to decrease quality threshold 425. It should be appreciated that quality thresholder 330 can use different means to decrease quality threshold 425, e.g., decrease by a set value, decrease by a percentage or fraction, etc. The decreased quality threshold 425 is received at quality comparer 420, where a comparison between image quality 415 and the reduced quality threshold 425 is performed.
In some embodiments, if fingerprint image 215 fails to satisfy the quality threshold on successive attempts, quality threshold 425 is progressively decreased. In some embodiments, quality threshold 425 cannot be decreased below a minimum quality threshold. If the minimum quality threshold is reached, the authentication process may be aborted, and a message may be sent to the system or user. The message may include suggestions/instructions for improving image quality during capture. Once fingerprint image 215 satisfies a decreased quality threshold 425, fingerprint image 215 is forwarded to matcher 230 for outputting authentication determination 240. In some embodiments, once fingerprint image 215 satisfies a decreased quality threshold 425, quality threshold 425 is reset to the initial value. In some embodiments, once fingerprint image 215 satisfies a decreased quality threshold 425, quality threshold 425 is only partially reset to the initial value, where the amount of reset may depend on the quality of the image.
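One possible way to express this decrease-and-reset behavior in code is sketched below; the fixed decrease step, the floor, and the quality-dependent partial reset rule are all assumptions rather than prescribed values.

```python
class ShortTermThresholdAdapter:
    """Sketch of short term quality-threshold adaptation (illustrative only)."""

    def __init__(self, initial_threshold, min_threshold, step=2.0):
        self.initial = initial_threshold
        self.minimum = min_threshold
        self.step = step
        self.current = initial_threshold

    def on_failed_check(self):
        """Progressively decrease the threshold; return False once the floor is reached."""
        self.current = max(self.current - self.step, self.minimum)
        return self.current > self.minimum  # False -> abort and notify the user/system

    def on_passed_check(self, image_quality, partial=False):
        """Reset the threshold after a successful check, fully or partially."""
        if partial:
            # Partial reset: move back toward the initial value by an amount
            # that grows with the quality of the accepted image (assumed rule).
            alpha = min(image_quality / 100.0, 1.0)
            self.current += alpha * (self.initial - self.current)
        else:
            self.current = self.initial
        return self.current
```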
In some embodiments, quality data is passed to matcher 230 for use in the fingerprint authentication. For example, where fingerprint image 215 fails to satisfy an initial quality threshold 425 and satisfies a reduced quality threshold 425, quality data is passed to matcher 230 for use in the fingerprint authentication by considering the reduced quality threshold 425 in an authentication operation. Different applications using the authentication provided by a fingerprint sensing system may have different security requirements. For example, banking or financial applications may have higher security requirements such that a decreased quality threshold may not satisfy the authentication requirement, or may allow satisfaction of a reduced quality threshold subject to a particular minimum reduced quality threshold. In such embodiments, the quality data includes information related to the reduced quality threshold, allowing the application to make a determination as to whether fingerprint image 215 is of sufficient quality to perform authentication. For example, the application can have a minimum quality threshold it is willing to accept for authentication purposes, and the quality data informs the application of the quality threshold used for authenticating the fingerprint image in determining whether to accept the authentication.
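A small sketch of how such quality data could accompany the match result and be checked against an application's own minimum is shown below; the field names and the per-application minimum are hypothetical.

```python
def app_accepts_authentication(quality_data, app_min_threshold):
    """Let an application veto a match obtained under a reduced quality threshold.

    `quality_data` is assumed to carry the threshold actually used and the
    failure count from the quality check stage; field names are illustrative.
    """
    # A security-sensitive application (e.g., banking) may refuse results
    # obtained below its own minimum acceptable quality threshold.
    return quality_data["threshold_used"] >= app_min_threshold

# Example: the matcher authenticated with a threshold reduced to 16,
# but the requesting application requires at least 18.
quality_data = {"threshold_used": 16, "failure_count": 2}
print(app_accepts_authentication(quality_data, app_min_threshold=18))  # False
```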
At procedure 510 of flow diagram 500, a fingerprint image is received. In some embodiments, the fingerprint image is captured at a fingerprint sensor (e.g., an ultrasonic fingerprint sensor) associated with an electronic device. In some embodiments, the received fingerprint image is a previously captured fingerprint image that has undergone image processing to enhance the image quality (e.g., at procedure 550).
At procedure 520, an image quality of the fingerprint image is determined (e.g., at quality determiner 410). At procedure 530, the quality of the fingerprint image is compared to a quality threshold (e.g., at quality comparer 420). In one embodiment, the quality threshold is associated with an electronic device at which the fingerprint image was captured. In one embodiment, the quality threshold is associated with an application.
At procedure 540, it is determined whether the quality threshold is satisfied (e.g., the quality is not less than the quality threshold). If it is determined that the quality threshold is satisfied, as shown at procedure 580, the fingerprint image is forwarded to the matcher for authentication.
In one embodiment, if it is determined that the quality threshold is not satisfied, flow diagram 500 proceeds to procedure 550. At procedure 550, the operation of the fingerprint sensor for capturing the fingerprint image is changed. It should be appreciated that procedure 550 is optional. For example, the power to the fingerprint sensor can be increased, the gain of the fingerprint sensor can be increased, the beamforming operation of the fingerprint sensor can be changed, etc. In some embodiments, a new fingerprint image is captured according to the changed operation. In some embodiments, where the operation of the fingerprint sensor is changed, flow diagram 500 returns to procedure 510 where a new verification image is received. In some embodiments, procedure 550 includes performing image processing on the previously received fingerprint image to enhance the quality of the fingerprint image. For example, the image processing may reduce noise, increase contrast, smooth ridges, etc. In some embodiments, the verification image received at procedure 510 is a previously captured image to which the image processing was applied. In some embodiments, the new verification image is compared to a decreased quality threshold (e.g., according to procedures 560 and 570). In some embodiments, the iteration of capturing a new fingerprint image can be performed at a high rate of multiple times per second (e.g., 1-30 Hz or higher). For instance, the procedures of flow diagram 500 can be performed while the user keeps the finger placed on the sensor, without lifting the finger. In some embodiments, the user may remove and replace the finger between each iteration.
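For illustration, a failed quality check could trigger either a recapture with adjusted sensor settings or reprocessing of the existing image, as in the sketch below; the `sensor` and `enhance_image` interfaces are hypothetical stand-ins for device-specific code.

```python
def retry_after_failed_quality_check(sensor, previous_image, enhance_image):
    """Produce a new verification image after a failed quality check (illustrative).

    `sensor` and `enhance_image` are hypothetical interfaces standing in for
    device-specific capture control and image processing.
    """
    # Option 1: change the capture operation (e.g., raise the gain) and recapture.
    sensor.set_gain(sensor.get_gain() * 1.5)
    recaptured = sensor.capture()

    # Option 2: reprocess the previously captured image (noise reduction,
    # contrast enhancement, ridge smoothing, ...).
    enhanced = enhance_image(previous_image)

    # Either image is then fed back into the quality check as a new verification image.
    return recaptured if recaptured is not None else enhanced
```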
In another embodiment, if it is determined that the quality threshold is not satisfied, flow diagram 500 proceeds to procedure 555. At procedure 555, in accordance with an embodiment, a failure count is incremented. It should be appreciated that the failure count is associated with an authentication session. For example, the failure count can be used to limit the number of times the quality threshold can be reduced. In some embodiments, the failure count is included within quality data that can be sent to the matcher for use in authentication. It should be appreciated that procedure 555 is optional, and that flow diagram 500 can proceed directly to procedure 560.
At procedure 560, it is determined whether the quality threshold is at a minimum quality threshold. For example, reducing the quality threshold may be subject to a minimum quality threshold, ensuring that the quality threshold is not reduced below a particular value, thereby ensuring that a fingerprint image of insufficient quality will not be sent to the matcher. If it is determined that the quality threshold is at the minimum value, in one embodiment, flow diagram 500 ends.
If it is determined that the quality threshold is not at the minimum value, flow diagram 500 proceeds to procedure 570. At procedure 570, the quality threshold is decreased. Flow diagram 500 then proceeds to procedure 530, where the quality of the fingerprint image is compared to the reduced quality threshold.
At procedure 590, in some embodiments, responsive to the fingerprint image being forwarded to the matcher for the fingerprint authentication, the quality threshold is reset. In some embodiments, the quality threshold is reset to the initial quality threshold determination made at enrollment. In other embodiments, the quality threshold is reset to a different value, e.g., an average of the initial quality threshold and the satisfied quality threshold.
It should be appreciated that, in accordance with various embodiments, procedures 530, 540, 560, and 570 are repeated such that the quality threshold can be progressively decreased until the quality of the fingerprint image satisfies a decreased quality threshold.
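Putting these procedures together, a compact sketch of the loop (hypothetical helper names; the procedure numbers in the comments refer to flow diagram 500) might look like the following. Under the assumed step of 2, an image quality of 17 with an initial threshold of 20 would pass after two decreases (20 to 18 to 16), mirroring the example attempts described below.

```python
def quality_check_flow(image, threshold, min_threshold, compute_quality,
                       matcher, step=2.0):
    """Sketch of the quality check loop of flow diagram 500 (illustrative only)."""
    initial_threshold = threshold
    failure_count = 0
    quality = compute_quality(image)                      # procedure 520
    while True:
        if quality >= threshold:                          # procedures 530/540
            result = matcher.authenticate(image)          # procedure 580
            threshold = initial_threshold                 # procedure 590: reset
            return result, threshold, failure_count
        failure_count += 1                                # procedure 555 (optional)
        if threshold <= min_threshold:                    # procedure 560
            return None, threshold, failure_count         # floor reached: end
        threshold = max(threshold - step, min_threshold)  # procedure 570
```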
At t2, authentication attempt #2 is performed at the fingerprint sensing system. Since the quality of the received fingerprint image is 17 and the reduced quality threshold is 18, the quality threshold is not satisfied. The quality threshold is then adapted by decreasing it from 18 to 16. In one embodiment, the failure count is incremented from one to two.
At t3, authentication attempt #3 is performed at the fingerprint sensing system. Since the quality of the received fingerprint image is 17 and the reduced quality threshold is 16, the quality threshold is satisfied. The fingerprint image is sent to the matcher for authentication. The quality threshold is then reset to 20. In one embodiment, the failure count of two and the final reduced quality threshold of 16 are included in quality data also forwarded to the matcher.
Fingerprint sensing system 700 includes environmental factor monitor 710 for monitoring changes or trends in environmental factors impacting an image quality of a captured fingerprint image. Different environmental factors can impact a user's fingerprint. For example, fingers may get drier during winter months, resulting in lower image quality, with the image quality improving for the same fingers in summer months. In some embodiments, environmental factor monitor 710 monitors at least one environmental factor. It should be appreciated that the environmental factor can include, without limitation: a time of year, a time of day, a season, a location, a temperature, humidity, activity of the user, a passage of time, etc. Environmental factors may include any factor that may influence the quality of the fingerprint image. Moreover, it should be appreciated that in some embodiments the environmental factor is actively monitored by the fingerprint sensing system (e.g., using a temperature sensor or a location sensor).
Responsive to detecting a change in the environmental factor that impacts a user's fingerprint, environmental factor monitor 710 transmits an indication to quality thresholder 330 to adapt the quality threshold. In one embodiment, quality thresholder 330 adapts the initial quality threshold (e.g., the quality threshold determined based on the enrollment images). In some embodiments, responsive to receiving a fingerprint image, the initial quality threshold used during the fingerprint image quality determination operation (e.g., at flow diagram 500 of
At procedure 810 of flow diagram 800, an environmental factor is monitored. At procedure 820, it is determined whether a change in the environmental factor impacting an image quality of a captured fingerprint image is detected. In one embodiment, the change in the environmental factor is a seasonal weather change impacting fingerprint characteristics of a user. In one embodiment, the change in the environmental factor is a temperature change impacting fingerprint characteristics of a user. For example, in colder weather the fingers get drier, which may decrease the quality of the captured fingerprint image. Therefore, in one embodiment, as the temperature decreases/increases with a change in seasons, the image quality threshold may be decreased/increased. In one embodiment, the change in the environmental factor is a change in geographical location impacting fingerprint characteristics of a user. If a change in the environmental factor impacting an image quality of a captured fingerprint image is not detected, flow diagram 800 ends.
If a change or trend in the environmental factor impacting an image quality of a captured fingerprint image is detected, flow diagram 800 proceeds to procedure 830. At procedure 830, a value of the quality threshold of a fingerprint image quality determination operation is adapted. In one embodiment, the quality threshold is the initial quality threshold of the short term quality threshold adaptation described above. In one embodiment, the value of the quality threshold is decreased. In another embodiment, the value of the quality threshold is increased. It should be appreciated that the determination whether to increase or decrease the quality threshold is based on the change in environmental factors. For example, during summer months the quality threshold may be higher due to better conditions for capturing high quality fingerprint images. It should be appreciated that the rate of change in the quality threshold with the change in the environmental factor may vary with the user or the geographic location of the user. The rate of change may be predefined, or may be learned by monitoring the trend of the image quality of the user (as a function of the environmental factors). Models for the rate of change may also be built using crowd sourcing and machine learning techniques.
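As an illustration of such an environmental adjustment, the sketch below lowers or raises the threshold linearly with temperature; the reference temperature, per-degree rate, and clamping bounds are assumed values that could be predefined or learned per user or location.

```python
def adapt_threshold_for_environment(base_threshold, temperature_c,
                                    reference_temp_c=20.0, rate_per_degree=0.2,
                                    min_threshold=10.0, max_threshold=40.0):
    """Adjust the long term quality threshold based on an environmental factor.

    Colder (drier) conditions lower the expected image quality, so the
    threshold is lowered proportionally; all constants are illustrative.
    """
    delta = (temperature_c - reference_temp_c) * rate_per_degree
    adapted = base_threshold + delta                 # colder -> lower threshold
    return max(min_threshold, min(adapted, max_threshold))

# Example: a winter reading of 0 degrees C lowers a base threshold of 20 to 16.
print(adapt_threshold_for_environment(20.0, temperature_c=0.0))  # 16.0
```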
Fingerprint sensing system 900 includes image quality monitor 910 for monitoring a fingerprint image quality determination operation of the plurality of fingerprint images over a time range. Image quality monitor 910 receives the image quality for received verification images from fingerprint image quality check 220 over a time range. As described above, in some embodiments, fingerprint image quality check 220 is configured to adapt a quality threshold responsive to a fingerprint image not satisfying the quality threshold. It should be appreciated that the image quality can be received from a plurality of users (e.g., crowd sourced).
Responsive to detecting a trend indicative of a change in fingerprint image quality, image quality monitor 910 transmits an indication to quality thresholder 330 to adapt the quality threshold. In one embodiment, quality thresholder 330 adapts the initial quality threshold (e.g., the quality threshold determined based on the enrollment images). In one embodiment, the value of the quality threshold is adapted according to the trend. For example, where the trend identifies a linear decline in quality over the time range, the quality threshold is decreased according to the linear trend. It should be appreciated that the trend may be in one direction, e.g., the quality of the fingerprint decreases with the age of the user; that the trend may be cyclical, e.g., depending on the seasons; and/or that the trend may be associated with activities of the user, e.g., the image quality decreases due to wear on the fingertips as a result of heavy manual labor. A plurality of trends may also be combined into more complex trends. The timescale or time range of the trends may vary from hours, to weeks, months, or even years.
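One way to turn such a trend into a threshold adjustment is to fit a simple model to the observed qualities and place the threshold below the predicted quality, as in the sketch below; the linear model, margin, and floor are assumptions for illustration.

```python
import numpy as np

def adapt_threshold_from_trend(timestamps, qualities, horizon,
                               margin=3.0, min_threshold=10.0):
    """Adapt the quality threshold from a trend in observed image quality.

    Fits a linear trend to (timestamp, quality) observations, predicts the
    expected quality at `horizon` (e.g., the next expected authentication
    time), and places the threshold a `margin` below that prediction.
    """
    slope, intercept = np.polyfit(np.asarray(timestamps, dtype=float),
                                  np.asarray(qualities, dtype=float), deg=1)
    predicted_quality = slope * horizon + intercept
    adapted = predicted_quality - margin   # may move up or down with the trend
    return max(adapted, min_threshold)     # never drop below an assumed floor
```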
In some embodiments, responsive to receiving a fingerprint image, the adapted quality threshold during the fingerprint image quality determination operation (e.g., at flow diagram 500 of
At procedure 1010 of flow diagram 1000, a plurality of fingerprint images captured over a time range is received. In one embodiment, the plurality of fingerprint images are received from a single user. In another embodiment, the plurality of fingerprint images are received from a plurality of users (e.g., crowd sourced). At procedure 1020, a fingerprint image quality determination operation of the plurality of fingerprint images is monitored over the time range, where the fingerprint image quality determination operation is configured to adapt a quality threshold responsive to a fingerprint image not satisfying the quality threshold. At procedure 1030, it is determined whether a trend indicative of a change in image quality of the plurality of fingerprint images is detected. If a trend indicative of a change in image quality of the plurality of fingerprint images is not detected, flow diagram 1000 ends.
If a trend indicative of a change in image quality of the plurality of fingerprint images is detected, flow diagram 1000 proceeds to procedure 1040. At procedure 1040, a value of the quality threshold of a fingerprint image quality determination operation is adapted. In one embodiment, the quality threshold is the initial quality threshold of the short term quality threshold adaptation described above. In one embodiment, the value of the quality threshold is adapted according to the trend. In one embodiment, the value of the quality threshold is decreased. In another embodiment, the value of the quality threshold is increased. The principle of the adaptation is that, based on the observed trend, the system predicts the expected image quality and adapts the quality threshold accordingly. This adaptation takes into consideration the security aspects and system resources, as discussed above.
At t2, image quality for a fingerprint image received at day 2 is determined. The quality of the received fingerprint image is determined to have a value of 23 (within a range of 0-100) and the initial quality threshold (TQ) is 20. Since the quality of the received fingerprint image is 23, the quality threshold is satisfied, and no adaptation of the quality threshold is needed to allow for authentication of the fingerprint image. However, there is a decreasing trend in the quality of the fingerprint images. As such, subsequent to t2, the quality threshold is adapted to 15, in accordance with the detected trend.
At t3, image quality for a fingerprint image received at day N is determined. The quality of the received fingerprint image is determined to have a value of 16 (within a range of 0-100) and the adapted quality threshold (TQ) is 15. Since the quality of the received fingerprint image is 16, the quality threshold is satisfied, and no adaptation of the quality threshold is performed. In the example shown, the quality threshold is adapted before an attempt where the quality of the captured image is below the threshold. The adaptation may be based on the trend and the quality of the captured image. In some embodiments, the quality threshold is adapted based on the trend, which in this example would mean that for day N, the threshold would already be set at 15 so that the image would be accepted and transferred to the matcher. In contrast to the example in
The examples set forth herein were presented in order to best explain the described embodiments, to describe particular applications, and to thereby enable those skilled in the art to make and use them. However, those skilled in the art will recognize that the foregoing description and examples have been presented for the purposes of illustration and example only. Many aspects of the different example embodiments that are described above can be combined into new embodiments. The description as set forth is not intended to be exhaustive or to limit the embodiments to the precise form disclosed. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Reference throughout this document to “one embodiment,” “certain embodiments,” “an embodiment,” “various embodiments,” “some embodiments,” or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any embodiment may be combined in any suitable manner with one or more other features, structures, or characteristics of one or more other embodiments without limitation.
Number | Name | Date | Kind |
---|---|---|---|
4880012 | Sato | Nov 1989 | A |
5575286 | Weng et al. | Nov 1996 | A |
5684243 | Gururaja et al. | Nov 1997 | A |
5808967 | Yu et al. | Sep 1998 | A |
5867302 | Fleming | Feb 1999 | A |
5911692 | Hussain et al. | Jun 1999 | A |
6071239 | Cribbs et al. | Jun 2000 | A |
6104673 | Cole et al. | Aug 2000 | A |
6289112 | Jain et al. | Sep 2001 | B1 |
6292576 | Brownlee | Sep 2001 | B1 |
6350652 | Libera et al. | Feb 2002 | B1 |
6428477 | Mason | Aug 2002 | B1 |
6483932 | Martinez et al. | Nov 2002 | B1 |
6500120 | Anthony | Dec 2002 | B1 |
6676602 | Barnes et al. | Jan 2004 | B1 |
6736779 | Sano et al. | May 2004 | B1 |
7067962 | Scott | Jun 2006 | B2 |
7109642 | Scott | Sep 2006 | B2 |
7243547 | Cobianu et al. | Jul 2007 | B2 |
7257241 | Lo | Aug 2007 | B2 |
7400750 | Nam | Jul 2008 | B2 |
7433034 | Huang | Oct 2008 | B1 |
7459836 | Scott | Dec 2008 | B2 |
7471034 | Schlote-Holubek et al. | Dec 2008 | B2 |
7489066 | Scott et al. | Feb 2009 | B2 |
7634117 | Cho | Dec 2009 | B2 |
7739912 | Schneider et al. | Jun 2010 | B2 |
8018010 | Tigli et al. | Sep 2011 | B2 |
8139827 | Schneider et al. | Mar 2012 | B2 |
8255698 | Li | Aug 2012 | B2 |
8311514 | Bandyopadhyay et al. | Nov 2012 | B2 |
8335356 | Schmitt | Dec 2012 | B2 |
8433110 | Kropp et al. | Apr 2013 | B2 |
8508103 | Schmitt et al. | Aug 2013 | B2 |
8515135 | Clarke et al. | Aug 2013 | B2 |
8666126 | Lee et al. | Mar 2014 | B2 |
8703040 | Liufu et al. | Apr 2014 | B2 |
8723399 | Sammoura et al. | May 2014 | B2 |
8805031 | Schmitt | Aug 2014 | B2 |
9056082 | Liautaud et al. | Jun 2015 | B2 |
9070861 | Bibl et al. | Jun 2015 | B2 |
9224030 | Du et al. | Dec 2015 | B2 |
9245165 | Slaby et al. | Jan 2016 | B2 |
9424456 | Koteshwara et al. | Aug 2016 | B1 |
9572549 | Belevich et al. | Feb 2017 | B2 |
9582102 | Setlak | Feb 2017 | B2 |
9582705 | Du et al. | Feb 2017 | B2 |
9607203 | Yazdandoost et al. | Mar 2017 | B1 |
9607206 | Schmitt et al. | Mar 2017 | B2 |
9613246 | Gozzini et al. | Apr 2017 | B1 |
9665763 | Du et al. | May 2017 | B2 |
9747488 | Yazdandoost et al. | Aug 2017 | B2 |
9785819 | Oreifej | Oct 2017 | B1 |
9815087 | Ganti et al. | Nov 2017 | B2 |
9817108 | Kuo et al. | Nov 2017 | B2 |
9818020 | Schuckers et al. | Nov 2017 | B2 |
9881195 | Lee et al. | Jan 2018 | B2 |
9881198 | Lee et al. | Jan 2018 | B2 |
9898640 | Ghavanini | Feb 2018 | B2 |
9904836 | Yazdandoost et al. | Feb 2018 | B2 |
9909225 | Lee et al. | Mar 2018 | B2 |
9922235 | Cho et al. | Mar 2018 | B2 |
9934371 | Hong et al. | Apr 2018 | B2 |
9939972 | Shepelev et al. | Apr 2018 | B2 |
9953205 | Rasmussen et al. | Apr 2018 | B1 |
9959444 | Young et al. | May 2018 | B2 |
9967100 | Hong et al. | May 2018 | B2 |
9983656 | Merrell et al. | May 2018 | B2 |
9984271 | King et al. | May 2018 | B1 |
10275638 | Yousefpor et al. | Apr 2019 | B1 |
10315222 | Salvia et al. | Jun 2019 | B2 |
10322929 | Pandian et al. | Jun 2019 | B2 |
10387704 | Dagan et al. | Aug 2019 | B2 |
10461124 | Berger et al. | Oct 2019 | B2 |
10478858 | Lasiter et al. | Nov 2019 | B2 |
10497747 | Tsai et al. | Dec 2019 | B2 |
10515255 | Strohmann et al. | Dec 2019 | B2 |
10539539 | Garlepp et al. | Jan 2020 | B2 |
10600403 | Garlepp et al. | Mar 2020 | B2 |
10656255 | Ng et al. | May 2020 | B2 |
10670716 | Apte et al. | Jun 2020 | B2 |
10706835 | Garlepp et al. | Jul 2020 | B2 |
10755067 | De Foras et al. | Aug 2020 | B2 |
20020135273 | Mauchamp et al. | Sep 2002 | A1 |
20030013955 | Poland | Jan 2003 | A1 |
20040085858 | Khuri-Yakub et al. | May 2004 | A1 |
20040122316 | Satoh et al. | Jun 2004 | A1 |
20040174773 | Thomenius et al. | Sep 2004 | A1 |
20050023937 | Sashida et al. | Feb 2005 | A1 |
20050057284 | Wodnicki | Mar 2005 | A1 |
20050100200 | Abiko et al. | May 2005 | A1 |
20050110071 | Ema et al. | May 2005 | A1 |
20050146240 | Smith et al. | Jul 2005 | A1 |
20050148132 | Wodnicki et al. | Jul 2005 | A1 |
20050162040 | Robert | Jul 2005 | A1 |
20060052697 | Hossack et al. | Mar 2006 | A1 |
20060079777 | Karasawa | Apr 2006 | A1 |
20060230605 | Schlote-Holubek et al. | Oct 2006 | A1 |
20060280346 | Machida | Dec 2006 | A1 |
20070046396 | Huang | Mar 2007 | A1 |
20070047785 | Jang et al. | Mar 2007 | A1 |
20070073135 | Lee et al. | Mar 2007 | A1 |
20070202252 | Sasaki | Aug 2007 | A1 |
20070215964 | Khuri-Yakub et al. | Sep 2007 | A1 |
20070223791 | Shinzaki | Sep 2007 | A1 |
20070230754 | Jain et al. | Oct 2007 | A1 |
20080125660 | Yao et al. | May 2008 | A1 |
20080150032 | Tanaka | Jun 2008 | A1 |
20080194053 | Huang | Aug 2008 | A1 |
20080240523 | Benkley et al. | Oct 2008 | A1 |
20090005684 | Kristoffersen et al. | Jan 2009 | A1 |
20090182237 | Angelsen et al. | Jul 2009 | A1 |
20090232367 | Shinzaki | Sep 2009 | A1 |
20090274343 | Clarke | Nov 2009 | A1 |
20090303838 | Svet | Dec 2009 | A1 |
20100030076 | Vortman et al. | Feb 2010 | A1 |
20100046810 | Yamada | Feb 2010 | A1 |
20100113952 | Raguin et al. | May 2010 | A1 |
20100168583 | Dausch et al. | Jul 2010 | A1 |
20100195851 | Buccafusca | Aug 2010 | A1 |
20100201222 | Adachi et al. | Aug 2010 | A1 |
20100202254 | Roest et al. | Aug 2010 | A1 |
20100239751 | Regniere | Sep 2010 | A1 |
20100251824 | Schneider et al. | Oct 2010 | A1 |
20100256498 | Tanaka | Oct 2010 | A1 |
20100278008 | Ammar | Nov 2010 | A1 |
20110285244 | Lewis et al. | Nov 2011 | A1 |
20110291207 | Martin et al. | Dec 2011 | A1 |
20120016604 | Irving et al. | Jan 2012 | A1 |
20120092026 | Liautaud et al. | Apr 2012 | A1 |
20120095335 | Sverdlik et al. | Apr 2012 | A1 |
20120095347 | Adam et al. | Apr 2012 | A1 |
20120147698 | Wong et al. | Jun 2012 | A1 |
20120224041 | Monden | Sep 2012 | A1 |
20120232396 | Tanabe | Sep 2012 | A1 |
20120238876 | Tanabe et al. | Sep 2012 | A1 |
20120263355 | Monden | Oct 2012 | A1 |
20120279865 | Regniere et al. | Nov 2012 | A1 |
20120288641 | Diatezua et al. | Nov 2012 | A1 |
20120300988 | Ivanov et al. | Nov 2012 | A1 |
20130051179 | Hong | Feb 2013 | A1 |
20130064043 | Degertekin et al. | Mar 2013 | A1 |
20130127297 | Bautista et al. | May 2013 | A1 |
20130127592 | Fyke et al. | May 2013 | A1 |
20130133428 | Lee et al. | May 2013 | A1 |
20130201134 | Schneider et al. | Aug 2013 | A1 |
20130271628 | Ku et al. | Oct 2013 | A1 |
20130294201 | Hajati | Nov 2013 | A1 |
20130294202 | Hajati | Nov 2013 | A1 |
20140060196 | Falter et al. | Mar 2014 | A1 |
20140117812 | Hajati | May 2014 | A1 |
20140176332 | Alameh et al. | Jun 2014 | A1 |
20140208853 | Onishi et al. | Jul 2014 | A1 |
20140219521 | Schmitt et al. | Aug 2014 | A1 |
20140232241 | Hajati | Aug 2014 | A1 |
20140265721 | Robinson et al. | Sep 2014 | A1 |
20140294262 | Schuckers et al. | Oct 2014 | A1 |
20140313007 | Harding | Oct 2014 | A1 |
20140355387 | Kitchens et al. | Dec 2014 | A1 |
20150036065 | Yousefpor et al. | Feb 2015 | A1 |
20150049590 | Rowe et al. | Feb 2015 | A1 |
20150087991 | Chen et al. | Mar 2015 | A1 |
20150097468 | Hajati et al. | Apr 2015 | A1 |
20150105663 | Kiyose et al. | Apr 2015 | A1 |
20150145374 | Xu et al. | May 2015 | A1 |
20150164473 | Kim et al. | Jun 2015 | A1 |
20150165479 | Lasiter et al. | Jun 2015 | A1 |
20150169136 | Ganti et al. | Jun 2015 | A1 |
20150189136 | Chung et al. | Jul 2015 | A1 |
20150198699 | Kuo et al. | Jul 2015 | A1 |
20150206738 | Rastegar | Jul 2015 | A1 |
20150213180 | Herberholz | Jul 2015 | A1 |
20150220767 | Yoon et al. | Aug 2015 | A1 |
20150241393 | Ganti et al. | Aug 2015 | A1 |
20150261261 | Bhagavatula et al. | Sep 2015 | A1 |
20150286312 | Kang et al. | Oct 2015 | A1 |
20150301653 | Urushi | Oct 2015 | A1 |
20150345987 | Hajati | Dec 2015 | A1 |
20150371398 | Qiao et al. | Dec 2015 | A1 |
20160051225 | Kim et al. | Feb 2016 | A1 |
20160063294 | Du et al. | Mar 2016 | A1 |
20160063300 | Du et al. | Mar 2016 | A1 |
20160070967 | Du et al. | Mar 2016 | A1 |
20160070968 | Gu et al. | Mar 2016 | A1 |
20160086010 | Merrell et al. | Mar 2016 | A1 |
20160092715 | Yazdandoost et al. | Mar 2016 | A1 |
20160092716 | Yazdandoost et al. | Mar 2016 | A1 |
20160100822 | Kim et al. | Apr 2016 | A1 |
20160107194 | Panchawagh et al. | Apr 2016 | A1 |
20160117541 | Lu et al. | Apr 2016 | A1 |
20160180142 | Riddle et al. | Jun 2016 | A1 |
20160326477 | Fernandez-Alcon et al. | Nov 2016 | A1 |
20160350573 | Kitchens et al. | Dec 2016 | A1 |
20160358003 | Shen et al. | Dec 2016 | A1 |
20170004352 | Jonsson et al. | Jan 2017 | A1 |
20170330552 | Garlepp et al. | Nov 2017 | A1 |
20170032485 | Vemury | Feb 2017 | A1 |
20170059380 | Li et al. | Mar 2017 | A1 |
20170075700 | Abudi et al. | Mar 2017 | A1 |
20170100091 | Eigil et al. | Apr 2017 | A1 |
20170110504 | Panchawagh et al. | Apr 2017 | A1 |
20170119343 | Pintoffl | May 2017 | A1 |
20170124374 | Rowe et al. | May 2017 | A1 |
20170168543 | Dai et al. | Jun 2017 | A1 |
20170185821 | Chen et al. | Jun 2017 | A1 |
20170194934 | Shelton et al. | Jul 2017 | A1 |
20170200054 | Du et al. | Jul 2017 | A1 |
20170219536 | Koch et al. | Aug 2017 | A1 |
20170231534 | Agassy et al. | Aug 2017 | A1 |
20170255338 | Medina et al. | Sep 2017 | A1 |
20170293791 | Mainguet et al. | Oct 2017 | A1 |
20170316243 | Ghavanini | Nov 2017 | A1 |
20170316248 | He et al. | Nov 2017 | A1 |
20170322290 | Ng | Nov 2017 | A1 |
20170322291 | Salvia et al. | Nov 2017 | A1 |
20170322292 | Salvia et al. | Nov 2017 | A1 |
20170322305 | Apte et al. | Nov 2017 | A1 |
20170323133 | Tsai | Nov 2017 | A1 |
20170325081 | Chrisikos et al. | Nov 2017 | A1 |
20170326590 | Daneman | Nov 2017 | A1 |
20170326591 | Apte et al. | Nov 2017 | A1 |
20170326593 | Garlepp et al. | Nov 2017 | A1 |
20170326594 | Berger et al. | Nov 2017 | A1 |
20170328866 | Apte et al. | Nov 2017 | A1 |
20170328870 | Garlepp et al. | Nov 2017 | A1 |
20170330012 | Salvia et al. | Nov 2017 | A1 |
20170330553 | Garlepp et al. | Nov 2017 | A1 |
20170357839 | Yazdandoost et al. | Dec 2017 | A1 |
20180025202 | Ryshtun et al. | Jan 2018 | A1 |
20180032788 | Krenzer et al. | Feb 2018 | A1 |
20180101711 | D'Souza et al. | Apr 2018 | A1 |
20180107852 | Fenrich et al. | Apr 2018 | A1 |
20180107854 | Tsai et al. | Apr 2018 | A1 |
20180129849 | Strohmann et al. | May 2018 | A1 |
20180129857 | Bonev | May 2018 | A1 |
20180178251 | Foncellino et al. | Jun 2018 | A1 |
20180206820 | Anand et al. | Jul 2018 | A1 |
20180225495 | Jonsson et al. | Aug 2018 | A1 |
20180229267 | Ono et al. | Aug 2018 | A1 |
20180276443 | Strohmann et al. | Sep 2018 | A1 |
20180329560 | Kim et al. | Nov 2018 | A1 |
20180349663 | Garlepp et al. | Dec 2018 | A1 |
20180357457 | Rasmussen et al. | Dec 2018 | A1 |
20180369866 | Sammoura et al. | Dec 2018 | A1 |
20180373913 | Panchawagh et al. | Dec 2018 | A1 |
20190005300 | Garlepp et al. | Jan 2019 | A1 |
20190012673 | Chakraborty et al. | Jan 2019 | A1 |
20190018123 | Narasimha-Iyer et al. | Jan 2019 | A1 |
20190046263 | Hayashida et al. | Feb 2019 | A1 |
20190057267 | Kitchens et al. | Feb 2019 | A1 |
20190073507 | D'Souza et al. | Mar 2019 | A1 |
20190087632 | Raguin et al. | Mar 2019 | A1 |
20190095015 | Han et al. | Mar 2019 | A1 |
20190102046 | Miranto et al. | Apr 2019 | A1 |
20190130083 | Agassy et al. | May 2019 | A1 |
20190171858 | Ataya et al. | Jun 2019 | A1 |
20190188441 | Hall et al. | Jun 2019 | A1 |
20190188442 | Flament et al. | Jun 2019 | A1 |
20190325185 | Tang | Oct 2019 | A1 |
20190340455 | Jung et al. | Nov 2019 | A1 |
20190370518 | Maor et al. | Dec 2019 | A1 |
20200030850 | Apte et al. | Jan 2020 | A1 |
20200050816 | Tsai | Feb 2020 | A1 |
20200050817 | Salvia et al. | Feb 2020 | A1 |
20200050820 | Iatsun et al. | Feb 2020 | A1 |
20200050828 | Li et al. | Feb 2020 | A1 |
20200074135 | Garlepp et al. | Mar 2020 | A1 |
20200125710 | Andersson et al. | Apr 2020 | A1 |
20200147644 | Chang | May 2020 | A1 |
20200158694 | Garlepp et al. | May 2020 | A1 |
20200175143 | Lee et al. | Jun 2020 | A1 |
20200210666 | Flament | Jul 2020 | A1 |
20200285882 | Skovgaard Christensen et al. | Sep 2020 | A1 |
20200302140 | Lu et al. | Sep 2020 | A1 |
Number | Date | Country |
---|---|---|
1826631 | Aug 2006 | CN |
101192644 | Jun 2008 | CN |
102159334 | Aug 2011 | CN |
105264542 | Jan 2016 | CN |
105378756 | Mar 2016 | CN |
109255323 | Jan 2019 | CN |
1214909 | Jun 2002 | EP |
2884301 | Jun 2015 | EP |
3086261 | Oct 2016 | EP |
2011040467 | Feb 2011 | JP |
201531701 | Aug 2015 | TW |
2009096576 | Aug 2009 | WO |
2009137106 | Nov 2009 | WO |
2014035564 | Mar 2014 | WO |
2015009635 | Jan 2015 | WO |
2015112453 | Jul 2015 | WO |
2015120132 | Aug 2015 | WO |
2015131083 | Sep 2015 | WO |
2015134816 | Sep 2015 | WO |
2015183945 | Dec 2015 | WO |
2016007250 | Jan 2016 | WO |
2016011172 | Jan 2016 | WO |
2016040333 | Mar 2016 | WO |
2016061406 | Apr 2016 | WO |
2016061410 | Apr 2016 | WO |
2017003848 | Jan 2017 | WO |
2017053877 | Mar 2017 | WO |
2017192895 | Nov 2017 | WO |
2017196678 | Nov 2017 | WO |
2017196682 | Nov 2017 | WO |
2017192903 | Dec 2017 | WO |
2018148332 | Aug 2018 | WO |
2019164721 | Aug 2019 | WO |
Entry |
---|
Tang, et al., “Pulse-Echo Ultrasonic Fingerprint Sensor on a Chip”, IEEE Transducers, Anchorage, Alaska, USA, Jun. 21-25, 2015, pp. 674-677. |
ISA/EP, Partial International Search Report for International Application No. PCT/US2019/034032, 8 pages, dated Sep. 12, 2019. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2018/063431, 15 pages, dated Feb. 5, 2019. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2019/015020, 23 pages, dated Jul. 1, 2019. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2019/023440, 10 pages, dated Jun. 4, 2019. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031120, 12 pages, dated Aug. 29, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031127, 13 pages, dated Sep. 1, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031134, 12 pages, dated Aug. 30, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031140, 18 pages, dated Nov. 2, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031421, 13 pages, dated Jun. 21, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031426, 13 pages, dated Jun. 22, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031431, 14 pages, dated Aug. 1, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031434, 13 pages, dated Jun. 26, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031439, 10 pages, dated Jun. 20, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031824, 18 pages, dated Sep. 22, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031827, 16 pages, dated Aug. 1, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2017/031831, 12 pages, dated Jul. 21, 2017. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2018/037364, 10 pages, dated Sep. 3, 2018. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/033854, 16 pages, dated Nov. 3, 2020. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/039208, 10 pages, dated Oct. 9, 2020. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/039452, 11 pages, dated Sep. 9, 2020. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2020/042427, 18 pages, dated Dec. 14, 2020. |
ISA/EP, International Search Report for International Application No. PCT/US2017/031826, 16 pages, dated Feb. 27, 2018. |
ISA/EP, Partial International Search Report for International Application No. PCT/US2017/031140, 13 pages, dated Aug. 29, 2017. |
ISA/EP, Partial International Search Report for International Application No. PCT/US2017/031823, 12 pages, dated Nov. 30, 2017. |
ISA/EP, Partial Search Report and Provisional Opinion for International Application No. PCT/US2020/042427, 13 pages, dated Oct. 26, 2020. |
ISA/EP, Partial Search Report for International Application No. PCT/US2020/033854, 10 pages, dated Sep. 8, 2020. |
“Moving Average Filters”, Waybackmachine XP05547422, Retrieved from the Internet: URL:https://web.archive.org/web/20170809081353/https//www.analog.com/media/en/technical-documentation/dsp-book/dsp_book_Ch15.pdf [retrieved on Jan. 24, 2019], Aug. 9, 2017, 1-8. |
Office Action for CN App No. 201780029016.7 dated Mar. 24, 2020, 7 pages. |
Office Action for CN App No. 201780029016.7 dated Sep. 25, 2020, 7 pages. |
“Receiver Thermal Noise Threshold”, Fisher Telecommunication Services, Satellite Communications. Retrieved from the Internet: URL:https://web.archive.org/web/20171027075705/http//www.fishercom.xyz:80/satellite-communications/receiver-thermal-noise-threshold.html, Oct. 27, 2017, 3. |
“Sleep Mode”, Wikipedia, Retrieved from the Internet: URL:https://web.archive.org/web/20170908153323/https://en.wikipedia.org/wiki/Sleep_mode [retrieved on Jan. 25, 2019], Sep. 8, 2017, 1-3. |
“TMS320C5515 Fingerprint Development Kit (FDK) Hardware Guide”, Texas Instruments, Literature No. SPRUFX3, XP055547651, Apr. 2010, 1-26. |
ZTE V7 MAX. 5,5″ smartphone on MediaTek Helio P10 CPU; Published on Apr. 20, 2016; https://www.youtube.com/watch?v=ncNCbpkGQzU (Year: 2016). |
Cappelli, et al., “Fingerprint Image Reconstruction from Standard Templates”, IEEE Transactions On Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 29, No. 9, Sep. 2007, 1489-1503. |
Feng, et al., “Fingerprint Reconstruction: From Minutiae to Phase”, IEEE Transactions On Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 33, No. 2, Feb. 2011, 209-223. |
Jiang, et al., “Ultrasonic Fingerprint Sensor with Transmit Beamforming Based on a PMUT Array Bonded to CMOS Circuitry”, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, Jan. 1, 2017, 1-9. |
Kumar, et al., “Towards Contactless, Low-Cost and Accurate 3D Fingerprint Identification”, IEEE Transactions On Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 37, No. 3, Mar. 2015, 681-696. |
Pang, et al., “Extracting Valley-Ridge Lines from Point-Cloud-Based 3D Fingerprint Models”, IEEE Computer Graphics and Applications, IEEE Service Center, New York, vol. 33, No. 4, Jul./Aug. 2013, 73-81. |
Papageorgiou, et al., “Self-Calibration of Ultrasonic Transducers in an Intelligent Data Acquisition System”, International Scientific Journal of Computing, 2003, vol. 2, Issue 2 Retrieved Online: URL: https://scholar.google.com/scholar?q=self-calibration+of+ultrasonic+transducers+in+an+intelligent+data+acquisition+system&hl=en&as_sdt=0&as_vis=1&oi=scholart, 2003, 9-15. |
Ross, et al., “From Template to Image: Reconstructing Fingerprints from Minutiae Points”, IEEE Transactions On Pattern Analysis and Machine Intelligence, IEEE Computer Society, vol. 29, No. 4, Apr. 2007, 544-560. |
Rozen, et al., “Air-Coupled Aluminum Nitride Piezoelectric Micromachined Ultrasonic Transducers at 0.3 MHz to 0.9 MHz”, 2015 28th IEEE International Conference on Micro Electro Mechanical Systems (MEMS), IEEE, Jan. 18, 2015, 921-924. |
Tang, et al., “11.2 3D Ultrasonic Fingerprint Sensor-on-a-Chip”, 2016 IEEE International Solid-State Circuits Conference, IEEE, Jan. 31, 2016, 202-203. |
Thakar, et al., “Multi-resonator approach to eliminating the temperature dependence of silicon-based timing references”, Hilton Head'14. Retrieved from the Internet: http://blog.narotama.ac.id/wp-content/uploads/2014/12/Multi-resonator-approach-to-eliminating-the-temperature-dependance-of-silicon-based-timing-references.pdf, 2014, 415-418. |
Zhou, et al., “Partial Fingerprint Reconstruction with Improved Smooth Extension”, Network and System Security, Springer Berlin Heidelberg, Jun. 3, 2013, 756-762. |
Dausch, et al., “Theory and Operation of 2-D Array Piezoelectric Micromachined Ultrasound Transducers”, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, vol. 55, No. 11, Nov. 2008, 2484-2492. |
Hopcroft, et al., “Temperature Compensation of a MEMS Resonator Using Quality Factor as a Thermometer”, Retrieved from Internet: http://micromachine.stanford.edu/˜amanu/linked/MAH_MEMS2006.pdf, 2006, 222-225. |
Hopcroft, et al., “Using the temperature dependence of resonator quality factor as a thermometer”, Applied Physics Letters 91. Retrieved from Internet: http://micromachine.stanford.edu/˜hopcroft/Publications/Hoperoft_QT_ApplPhysLett_91_013505.pdf, 2007, 013505-1-013505-3. |
Lee, et al., “Low jitter and temperature stable MEMS oscillators”, Frequency Control Symposium (FCS), 2012 IEEE International, May 2012, 1-5. |
Li, et al., “Capacitive micromachined ultrasonic transducer for ultra-low pressure measurement: Theoretical study”, AIP Advances 5.12, Retrieved from Internet: http://scitation.aip.org/content/aip/journal/adva/5/12/10.1063/1.4939217, 2015, 127231. |
Qiu, et al., “Piezoelectric Micromachined Ultrasound Transducer (PMUT) Arrays for Integrated Sensing, Actuation and Imaging”, Sensors 15, doi:10.3390/s150408020, Apr. 3, 2015, 8020-8041. |
Savoia, et al., “Design and Fabrication of a cMUT Probe for Ultrasound Imaging of Fingerprints”, 2010 IEEE International Ultrasonics Symposium Proceedings, Oct. 2010, 1877-1880. |
Shen, et al., “Anisotropic Complementary Acoustic Metamaterial for Canceling out Aberrating Layers”, American Physical Society, Physical Review X 4.4: 041033., Nov. 19, 2014, 041033-1-041033-7. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2019/061516, 14 pages, dated Mar. 12, 2020. |
ISA/EP, International Search Report and Written Opinion for International Application No. PCT/US2021/021412, 12 pages, dated Jun. 9, 2021. |
Taiwan Application No. 106114623, 1st Office Action, 8 pages, dated Aug. 5, 2021. |
EP Office Action for Application 17724184.1, 6 pages, dated Oct. 12, 2021. |
EP Office Action, 6 pages, dated Oct. 9, 2021. |
European Patent Office, Office Action for Application 17725018, 5 pages, dated Oct. 25, 2021. |
European Patent Office, Office Action for Application 17725020.6, 4 pages, dated Oct. 25, 2021. |
Tang, et al., “Pulse-echo ultrasonic fingerprint sensor on a chip”, 2015 Transducers, 2015 18th International Conference on Solid-State Sensors, Actuators and Microsystems, Apr. 1, 2015, 674-677. |
Number | Date | Country
---|---|---
20210056673 A1 | Feb 2021 | US |