The present disclosure relates to biometric security systems and methods and, in particular, to systems that verify a subject based on a combination of a response to a non-biometric challenge and biometric authenticity.
Security is a concern in a variety of transactions involving private information. Biometric identification systems have been used in government and commercial systems around the world to enable secure transactions. Biometric systems generally rely on a unique feature of an individual, which is enrolled and later verified when the individual seeks access to a system. For example, traditional biometric systems can use unique features associated with a fingerprint, face, iris, or voice to verify an individual's identity.
In one class of attacks against traditional biometric systems, the spoofer presents a facsimile of the real user's biometric feature to the system; if the facsimile is adequately realistic by the system's criteria, it can trick the system into granting access to the spoofer. Examples of such attacks include the gummy-bear fingerprint spoof attack and the use of a photograph to trick the face recognition system of a smart phone. Defenses against biometric facsimile attacks include liveness testing. In the case of iris recognition systems, pupillometry uses a light stimulus to induce pupil contraction, and the system can also measure saccadic eye movement. Both pupil contraction and saccades are involuntary and cannot be easily mimicked by a photograph or a video. However, because they are involuntary or passive, the type of information retrieved from pupil contraction and saccades can be limited.
Thus, a need exists for an improved method of identifying subjects while enhancing security to counter spoofing attacks. These and other needs are addressed by the biometric security systems and methods of the present disclosure.
In accordance with embodiments of the present disclosure, an exemplary biometric security system is provided that includes an interface, a camera, and a processing device in communication with the interface and camera. The processing device can be configured to display a challenge to a subject via the interface, and receive as input a response to the challenge from the subject. Contemporaneous (e.g., simultaneous) to receiving the response to the challenge from the subject, the processing device can be configured to capture one or more images of the subject with the camera. The processing device can be configured to analyze the received response to the challenge relative to a preset valid response, and analyze the captured one or more images of the subject for biometric authenticity. The processing device can be configured to verify the subject based on a combination of both a successful match between the response to the challenge and the preset valid response, and a successful finding of biometric authenticity.
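As a purely illustrative, non-limiting sketch of this dual-factor decision logic (in Python, with a hypothetical authenticity_check callable standing in for the biometric analysis routines described herein), the subject is verified only when both the challenge response and the biometric finding succeed:

```python
from dataclasses import dataclass

@dataclass
class VerificationResult:
    challenge_ok: bool   # response matched the preset valid response
    biometric_ok: bool   # captured images passed the authenticity analysis

    @property
    def verified(self) -> bool:
        # The subject is verified only when BOTH factors succeed.
        return self.challenge_ok and self.biometric_ok

def verify(response, preset_valid_response, images, authenticity_check) -> VerificationResult:
    # authenticity_check is a hypothetical callable standing in for the
    # iris/fingerprint analysis routines described in this disclosure.
    return VerificationResult(
        challenge_ok=(response == preset_valid_response),
        biometric_ok=authenticity_check(images),
    )
```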
In some embodiments, the interface can include a graphical user interface (GUI) including a display. In some embodiments, the biometric security system can include an illumination source (e.g., a near infrared illumination source) configured to illuminate an iris of the subject. In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical or alphanumerical passcode. In such embodiments, the interface can include a numerical display, and the processing device can be configured to provide a signal to the subject for visually entering the numerical passcode using the numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display. The signal can be at least one of a visual signal, an auditory signal, a tactile signal, combinations thereof, or the like.
In such embodiments, the camera can be configured to capture one or more images of the subject during sequential focus of the subject on each number of the numerical passcode. The processing device can be configured to determine a distance of the subject and a gaze angle of the subject relative to the interface based on the one or more captured images. The processing device can be configured to select a number of the numerical display determined to be of focus by the subject based on the distance of the subject and the gaze angle. The processing device can be configured to output a visual indicator regarding the selected number of the numerical display. The processing device can provide a limited time period for the subject to focus on each sequential number of the numerical passcode.
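One plausible, non-limiting way to implement the number-selection step is to project the gaze ray onto the display plane and select the nearest key center; the key layout, units, and error bound below are illustrative assumptions, and the distance and gaze angles are assumed to come from upstream image analysis:

```python
import math

# Hypothetical key-center coordinates (cm) on the display plane for a
# phone-style keypad: rows 1-2-3, 4-5-6, 7-8-9, with 0 centered below.
KEY_CENTERS = {str(d): (((d - 1) % 3 - 1) * 4.0, -((d - 1) // 3) * 4.0)
               for d in range(1, 10)}
KEY_CENTERS["0"] = (0.0, -12.0)

def gazed_key(distance_cm, yaw_deg, pitch_deg, max_error_cm=2.0):
    """Project the gaze ray onto the display and pick the nearest key.

    distance_cm: estimated eye-to-display distance from the captured images.
    yaw_deg/pitch_deg: gaze angles relative to the display normal.
    Returns the key label, or None if the gaze point misses every key.
    """
    x = distance_cm * math.tan(math.radians(yaw_deg))
    y = distance_cm * math.tan(math.radians(pitch_deg))
    key, (kx, ky) = min(KEY_CENTERS.items(),
                        key=lambda kv: math.hypot(kv[1][0] - x, kv[1][1] - y))
    return key if math.hypot(kx - x, ky - y) <= max_error_cm else None
```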
In some embodiments, the interface can include a numerical display, and the processing device can be configured to provide a signal to the subject for visually entering the numerical passcode using the numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display and blinking to sequentially confirm selection of each number. In some embodiments, the interface can include a numerical display, and the processing device can be configured to provide a signal to the subject for visually entering the numerical passcode using the numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display and actuating an input means (e.g., a button) of the interface to sequentially confirm selection of each number. In some embodiments, a fingerprint scanner of the interface can detect a fingerprint of the subject during actuation of the input means.
In some embodiments, for the biometric authenticity, the processing device can be configured to analyze the captured one or more images of the subject using iris segmentation and matching routines. In some embodiments, for the biometric authenticity, the processing device can be configured to measure at least one of a position of an iris of the subject within a socket relative to corners of an eye, a distance of the iris from eyelids of the eye, an eyelid opening distance, eyelid opening movement, a relative position between a pupil of the subject and specular reflection, a size of the specular reflection, combinations thereof, or the like. Such measurements can assist the biometric security system in determining the liveness of the subject.
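As a non-limiting sketch of how such liveness measurements might be aggregated, the feature names and plausibility bands below are illustrative assumptions rather than values prescribed by this disclosure:

```python
from dataclasses import dataclass

@dataclass
class EyeLivenessFeatures:
    iris_offset_from_corners: float   # iris position within the socket (normalized)
    iris_to_eyelid_distance: float    # distance from iris to eyelids (normalized)
    eyelid_opening: float             # eyelid opening distance (normalized)
    pupil_to_specular_offset: float   # pupil vs. specular-reflection separation
    specular_size: float              # specular-reflection size (normalized)

# Illustrative plausibility bands for a live, three-dimensional eye; real
# thresholds would be calibrated per sensor and subject population.
PLAUSIBLE = {
    "iris_offset_from_corners": (0.2, 0.8),
    "iris_to_eyelid_distance": (0.05, 0.5),
    "eyelid_opening": (0.3, 1.0),
    "pupil_to_specular_offset": (0.0, 0.3),
    "specular_size": (0.01, 0.2),
}

def looks_live(f: EyeLivenessFeatures) -> bool:
    # All measured features must fall inside their plausibility bands.
    return all(lo <= getattr(f, name) <= hi
               for name, (lo, hi) in PLAUSIBLE.items())
```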
In some embodiments, the interface can include one or more fingerprint scanners. In such embodiments, the challenge can be a request for input of the preset valid response in a form of an initial position of a finger of the subject against the fingerprint scanner and a subsequent position of the finger of the subject against the fingerprint scanner, the initial and subsequent positions of the finger being different (e.g., different orientations). The processing device can be configured to scan the finger of the subject positioned against the fingerprint scanner in the initial position, and the processing device can be configured to provide a signal (e.g., visual, audio, tactile, combinations thereof, or the like) to the subject for rotating the finger by a preset angle (e.g., preselected by the subject) to the subsequent position. In such embodiments, matching of the preset angle by the subject represents the response to the challenge, and scanning of the fingerprint at both positions represents the biometric authenticity portion.
In some embodiments, for the biometric authenticity, the processing device can be configured to analyze the captured one or more images of the subject for facial expression variation. In some embodiments, for the biometric authenticity, the processing device can be configured to analyze the captured one or more images of the subject for blinking frequency. In some embodiments, for the biometric authenticity, the processing device can be configured to analyze the captured one or more images of the subject for iris texture. The biometric security system can include one or more databases configured to electronically store the response to the challenge from the subject, the captured one or more images of the subject, and the preset valid response.
In accordance with embodiments of the present disclosure, an exemplary method of verification of a biometric security system is provided. The method includes displaying a challenge to a subject via an interface of the biometric security system, and receiving as input a response to the challenge from the subject. The method includes, contemporaneous (e.g., simultaneous) to receiving the response to the challenge from the subject, capturing one or more images of the subject with a camera. The method includes analyzing the received response to the challenge relative to a preset valid response, and analyzing the captured one or more images of the subject for biometric authenticity. The method includes verifying the subject based on both a successful match between the response to the challenge and the preset valid response, and a successful finding of biometric authenticity.
In some embodiments, the method can include illuminating the iris of the subject with an illumination source (e.g., a near infrared illumination source). In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical or alphanumerical passcode. In such embodiments, the method can include providing a signal to the subject for visually entering the numerical passcode using a numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display. The method can include capturing one or more images of the subject during sequential focus of the subject on each number of the numerical passcode, determining a distance of the subject and a gaze angle of the subject relative to the interface based on the one or more captured images, and selecting a number of the numerical display determined to be of focus by the subject based on the distance of the subject and the gaze angle. The method can include outputting a visual indicator regarding the selected number of the numerical display.
Determining which of the numbers of the numerical display is selected by the gaze of the subject can be performed by one or a combination of different methods. In some embodiments, a predetermined period of time during which the subject's gaze is detected to hover over a number can be indicative of the desired selection. In such embodiments, the system can indicate to the subject when it is time to move the subject's gaze to the next number. In some embodiments, the system can request the subject to blink after the subject's gaze is hovering over a number to indicate the desired selection, the subject's blink selecting the number and explicitly advancing the system to the next number (if any). In some embodiments, the user interface can include a “next” or “enter” button (physical and/or electronic) that the subject can actuate while the subject's gaze is hovering over a number to indicate the desired selection, actuation of the button selecting the number and explicitly advancing the system to the next number (if any). In some embodiments, actuation of the button can substantially simultaneously capture the number on the numerical display and the subject's fingerprint via a fingerprint scanner embedded in the button, resulting in a multi-biometric characteristic capture within a tight timing tolerance.
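The dwell-based variant of this selection logic can be sketched as follows (the dwell and timeout durations are illustrative assumptions; the blink- and button-confirmed variants would replace the dwell test with the corresponding blink or actuation event):

```python
import time

class DwellSelector:
    """Select a digit once the gaze has hovered over it long enough."""

    def __init__(self, dwell_s=1.0, timeout_s=5.0):
        self.dwell_s = dwell_s        # hypothetical hover time per selection
        self.timeout_s = timeout_s    # bound on how long each digit may take
        self._current = None
        self._since = None
        self._started = time.monotonic()

    def update(self, gazed_key):
        """Feed the per-frame gaze estimate; returns a digit when selected."""
        now = time.monotonic()
        if now - self._started > self.timeout_s:
            raise TimeoutError("digit entry took too long; restart the sequence")
        if gazed_key != self._current:            # gaze moved to a new key
            self._current, self._since = gazed_key, now
            return None
        if gazed_key is not None and now - self._since >= self.dwell_s:
            self._started = now                   # restart the per-digit timer
            self._current, self._since = None, None
            return gazed_key                      # selection confirmed
        return None
```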
In some embodiments, the method can include analyzing the captured one or more images of the subject using iris segmentation and matching routines. In some embodiments, the method can include measuring at least one of a position of an iris of the subject within a socket relative to corners of an eye, a distance of the iris from eyelids of the eye, an eyelid opening distance, eyelid opening movement, a relative position between a pupil of the subject and specular reflection, a size of the specular reflection, combinations thereof, or the like.
In some embodiments, the interface can include a fingerprint scanner, and the challenge can be a request for input of the preset valid response in a form of an initial position of a finger of the subject against the fingerprint scanner and a subsequent position of the finger of the subject against the fingerprint scanner. The method can include scanning the finger of the subject positioned against the fingerprint scanner in the initial position, and providing a signal to the subject for rotating the finger by a preset angle to the subsequent position.
In some embodiments, the method can include analyzing the captured one or more images of the subject for facial expression variation. In some embodiments, the method can include analyzing the captured one or more images of the subject for blinking frequency. In some embodiments, the method can include analyzing the captured one or more images of the subject for iris texture. The method can include electronically storing the response to the challenge from the subject, the captured one or more images of the subject, and the preset valid response in a database.
In accordance with embodiments of the present disclosure, an exemplary non-transitory computer-readable medium storing instructions is provided for biometric security system verification, the instructions being executable by a processing device. Execution of the instructions by the processing device can cause the processing device to display a challenge to a subject via an interface of the biometric security system, and receive as input a response to the challenge from the subject. Execution of the instructions by the processing device can cause the processing device to, contemporaneous (e.g., simultaneous) to receiving the response to the challenge from the subject, capture one or more images of the subject with a camera.
Execution of the instructions by the processing device can cause the processing device to analyze the received response to the challenge relative to a preset valid response. Execution of the instructions by the processing device can cause the processing device to analyze the captured one or more images of the subject for biometric authenticity. Execution of the instructions by the processing device can cause the processing device to verify the subject based on both a successful match between the response to the challenge and the preset valid response, and a successful finding of biometric authenticity.
In accordance with embodiments of the present disclosure, an exemplary biometric security system is provided. The system includes an interface, a biometric acquisition device (e.g., camera, fingerprint scanner, combinations thereof, or the like), and a processing device in communication with the interface and the biometric acquisition device. The processing device is configured to display a challenge to a subject via the interface, and receive as input a response to the challenge from the subject. Contemporaneous to receiving the response to the challenge from the subject, the processing device is configured to capture a biometric characteristic of the subject with the biometric acquisition device. The processing device is configured to analyze the received response to the challenge relative to a preset valid response, analyze the biometric characteristic of the subject for biometric authenticity, and verify the subject based on both a successful match between the response to the challenge and the preset valid response, and a successful finding of biometric authenticity.
In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical passcode, the interface includes a numerical display, and the processing device is configured to provide a signal to the subject for visually entering the numerical passcode using the numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display for a predetermined period of time.
In some embodiments, the biometric acquisition device includes a camera, the camera is configured to capture one or more images of the subject during sequential focus of the subject on each number of the numerical passcode, and the processing device is configured to determine a distance of the subject and a gaze angle of the subject relative to the interface based on the one or more captured images, and the processing device is configured to select a number of the numerical display determined to be of focus by the subject based on the distance of the subject and the gaze angle. In some embodiments, the processing device can be configured to output a visual indicator regarding the selected number of the numerical display. In some embodiments, the processing device can provide a limited time period for the subject to focus on each sequential number of the numerical passcode.
In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical passcode, the interface includes a numerical display, and the processing device is configured to provide a signal to the subject for visually entering the numerical passcode using the numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display and blinking to sequentially confirm selection of each number.
In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical passcode, the interface includes a numerical display, the processing device is configured to provide a signal to the subject for entering the numerical passcode using the numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display and actuating an input means of the interface to sequentially confirm selection of each number, and the biometric acquisition device includes a fingerprint scanner of the interface configured to detect a fingerprint of the subject during actuation of the input means.
In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical passcode, the interface includes a numerical display, the processing device is configured to provide a signal to the subject for entering the numerical passcode using the numerical display of the interface by sequentially actuating each number of the numerical passcode on the numerical display, and the biometric acquisition device includes a fingerprint scanner of the interface configured to detect a fingerprint of the subject during actuation of at least one number of the numerical passcode.
In some embodiments, the processing device is configured to analyze the captured one or more images of the subject using iris segmentation and matching routines. In some embodiments, the processing device is configured to measure at least one of a position of an iris of the subject within a socket relative to corners of an eye, a distance of the iris from eyelids of the eye, an eyelid opening distance, eyelid opening movement, a relative position between a pupil of the subject and specular reflection, or a size of the specular reflection.
In some embodiments, the biometric acquisition device includes a fingerprint scanner, and the challenge is a request for input of the preset valid response in a form of an initial position of a finger of the subject against the fingerprint scanner and a subsequent position of the finger of the subject against the fingerprint scanner. In such embodiments, the processing device can be configured to scan the finger of the subject positioned against the fingerprint scanner in the initial position, and the processing device can be configured to provide a signal to the subject for rotating the finger by a preset angle to the subsequent position.
In some embodiments, the processing device can be configured to analyze the captured one or more images of the subject for at least one of facial expression variation, blinking frequency, or iris texture. In some embodiments, the processing device can be configured to substantially simultaneously receive the response to the challenge from the subject and capture the biometric characteristic of the subject with the biometric acquisition device.
In accordance with embodiments of the present disclosure, an exemplary method of verification of a biometric security system is provided. The method includes displaying a challenge to a subject via an interface of the biometric security system, and receiving as input a response to the challenge from the subject. Contemporaneous to receiving the response to the challenge from the subject, the method includes capturing a biometric characteristic of the subject with a biometric acquisition device. The method includes analyzing the received response to the challenge relative to a preset valid response, analyzing the captured biometric characteristic of the subject for biometric authenticity, and verifying the subject based on both a successful match between the response to the challenge and the preset valid response, and a successful finding of biometric authenticity.
In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical passcode, and the method includes providing a signal to the subject for visually entering the numerical passcode using a numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display for a predetermined period of time.
In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical passcode, and the method includes providing a signal to the subject for visually entering the numerical passcode using a numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display and blinking to sequentially confirm selection of each number.
In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical passcode, and the method includes providing a signal to the subject for visually entering the numerical passcode using a numerical display of the interface by sequentially focusing on each number of the numerical passcode on the numerical display and actuating an input means of the interface to sequentially confirm selection of each number, and detecting a fingerprint of the subject with a biometric acquisition device during actuation of the input means.
In some embodiments, the challenge can be a request for input of the preset valid response in a form of a numerical passcode, and the method includes providing a signal to the subject for entering the numerical passcode using a numerical display of the interface by sequentially actuating each number of the numerical passcode on the numerical display, and detecting a fingerprint of the subject with a biometric acquisition device during actuation of at least one number of the numerical passcode.
In some embodiments, the biometric acquisition device includes a fingerprint scanner, and the challenge can be a request for input of the preset valid response in a form of an initial position of a finger of the subject against the fingerprint scanner and a subsequent position of the finger of the subject against the fingerprint scanner, the method including scanning the finger of the subject positioned against the fingerprint scanner in the initial position, and providing a signal to the subject for rotating the finger by a preset angle to the subsequent position.
In accordance with embodiments of the present disclosure, an exemplary non-transitory computer-readable medium storing instructions for biometric security system verification is provided. The instructions are executable by a processing device. Execution of the instructions by the processing device causes the processing device to display a challenge to a subject via an interface of the biometric security system, and receive as input a response to the challenge from the subject. Contemporaneous to receiving the response to the challenge from the subject, execution of the instructions by the processing device causes the processing device to capture a biometric characteristic of the subject with a biometric acquisition device. Execution of the instructions by the processing device causes the processing device to analyze the received response to the challenge relative to a preset valid response, analyze the captured biometric characteristic of the subject for biometric authenticity, and verify the subject based on both a successful match between the response to the challenge and the preset valid response, and a successful finding of biometric authenticity.
Other objects and features will become apparent from the following detailed description considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed as an illustration only and not as a definition of the limits of the invention.
To assist those of skill in the art in making and using the disclosed biometric security systems and methods, reference is made to the accompanying figures.
In accordance with embodiments of the present disclosure, exemplary biometric security systems are provided that verify a subject based on a combination of a response to a non-biometric challenge and biometric authenticity, thereby strengthening the anti-spoofing measures of traditional biometric identification systems. In particular, the exemplary biometric security systems provide additional layers of biometric security by necessitating that the subject possess and use a piece of private information (e.g., a numerical passcode, an alphanumeric passcode, unique credentials, a radio-frequency identification (RFID) card, or the like) contemporaneously (e.g., simultaneously) to biometric authentication when and only when challenged by the system to provide such information (e.g., a response to a challenge). The biometric security systems therefore require that the correct response to a challenge be provided by the subject at a correctly timed moment (a class of challenge and response measures), in combination with biometric identification of the subject. By relying on a multi-layer combination of the properly timed response to a challenge and biometric identification (as opposed to a simple presentation of a biometric feature), the level of difficulty for a spoof attack is increased.
In some embodiments, existing hardware of a biometric identification system can be programmed to present a challenge and interpret the response to the challenge in combination with biometric authentication. The requirement for the correct and correctly timed response to a challenge increases the level of difficulty for a spoof attack without necessitating a large investment in software or even a small investment in hardware for the system. The exemplary biometric security systems can be used in a variety of environments, such as, e.g., smart phones, door locks, ATMs, home security, corporate security, military security, or the like. By offering an increase in the level of security to self-recognizing systems, the exemplary biometric security systems can be used in any environment requiring heightened security, e.g., for financial transactions, access to sensitive areas, or the like. The exemplary biometric security systems can also be used with lower security systems, e.g., unlocking smart phones, entry to a home, or the like, by layering a biometric authentication (something that you are) with private personal information (something that you know). The combination of a challenge response with biometric authentication can be applied in a variety of biometric modalities, such as voice, fingerprint, face, and iris identification.
With reference to the accompanying figures, an exemplary biometric security system 100 (hereinafter "system 100") is provided. The system 100 includes one or more illumination sources 102 (e.g., near infrared illumination sources) configured to illuminate at least a portion of the subject, such as an iris of the subject.
The system 100 includes one or more cameras 104 (e.g., one type of biometric acquisition device) configured to capture images of the subject, such as of the face and/or iris(es) of the subject. The illumination sources 102 and the cameras 104 can be part of a subject acquisition subsystem. The system 100 includes a user interface 106. In some embodiments, the user interface 106 can include a display in the form of a graphical user interface (GUI) 108. In some embodiments, the interface 106 can include a fingerprint scanner 110 for scanning one or more fingers of the subject. In some embodiments, the interface 106 can include a numerical (or alphanumerical) display 112. In some embodiments, the display 112 can be provided to the subject electronically via the GUI 108.
The system 100 includes a processing device 114 with a processor 116 in communication with the user interface 106, the camera 104 and the illumination source 102. The system 100 includes one or more databases 118 configured to electronically store a variety of data, such as one or more challenges 120 that can be presented to the subject via the interface 106, one or more images 122 captured by the camera 104, preset valid responses 124 to the challenges 120, and biometric data 126 associated with one or more subjects. For example, when initially enrolling into the system 100, the subject can be provided with one or more challenges 120 and can provide responses to such challenges 120. The correct responses to the challenges 120 can be stored as the preset valid responses 124 for matching at a future verification stage of the subject. The responses to the challenges 120 can be customized by the subject and can be changed by the subject when desired. The revocable nature of the responses to the challenges 120 allows the subject to vary the verification process if the passcode or biometric characteristics have been compromised.
As a further example, during initial enrollment into the system 100, one or more images 122 of the subject can be captured by the system 100, and biometric identification information can be extracted from the one or more images 122 and stored as the biometric data 126. Thus, the database 118 can electronically store historical data from enrollment of the subject into the system 100, historical data associated with previous verification of the subject by the system 100, and/or real-time data associated with an attempt of the subject to be verified by the system 100.
The system 100 can include a timer 128 in communication with the processing device 114. The timer 128 can be used by the system 100 to ensure that the response to the challenge 120 is provided by the subject in a timely manner (e.g., within a predetermined period of time). The system 100 can include a communication interface 130 configured to provide for a communication network between components of the system 100, thereby allowing data to be transmitted and/or received by the components of the system 100. The system 100 can include a central computing system 132 for receiving and processing the data captured by the camera 104 and transmitted by the processing device 114. The system 100 can include a feedback module 134 configured to provide feedback to the subject regarding, for example, a request for response to a challenge 120, proper alignment of the subject with the field-of-view of the camera 104, a specific step to be taken by the subject during response to a challenge 120, combinations thereof, or the like. In some embodiments, the feedback module 134 can be configured to provide visual, auditory, and/or tactile feedback to the subject.
Security of a biometric self-recognizing system is strengthened by a challenge/response that requires the authentic subject to know how to respond to the challenge (e.g., the correct passcode) and to know when to respond to the challenge (a temporal requirement). Because of the purposeful nature of a challenge and response, the system 100 uses both something a user knows (e.g., a passcode) and something a person is (e.g., a biometric characteristic), rather than just the latter. The system 100 can initiate the verification step with an alignment phase where the subject is guided (e.g., via the feedback module 134) into the proper capture position for entering digits, a pattern, or other information that can be readily changed by the subject (e.g., responses to a challenge). Contemporaneous (e.g., simultaneous) to entry of such information by the subject, the system 100 captures and analyzes one or more biometric characteristics of the subject. The period of time when the challenge response and biometric information are provided or extracted can be identified as the entry phase. Upon successful entry of the subject-defined information and biometric information, the system 100 can make a decision to grant or deny access to the subject. The security can be derived from the fusing of subject-defined keys, biometric credentials, and time limitations.
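The three phases can be summarized in a brief, non-limiting sketch, with align, collect_entry, and decide as hypothetical callbacks standing in for the alignment, entry, and decision stages described above:

```python
from enum import Enum, auto

class Phase(Enum):
    ALIGNMENT = auto()   # guide the subject into the capture volume
    ENTRY = auto()       # challenge response + biometric capture together
    DECISION = auto()    # grant or deny based on the fused result

def run_session(align, collect_entry, decide):
    align()                                  # Phase.ALIGNMENT
    response, biometrics = collect_entry()   # Phase.ENTRY (contemporaneous)
    return decide(response, biometrics)      # Phase.DECISION
```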
Thus, the system 100 can be configured to initially display a challenge 120 to the subject via the interface 106. In some embodiments, the challenge 120 can be to enter a numerical or alphanumerical passcode stored as a preset valid response 124 by sequentially following the numbers and/or letters of the passcode on a numerical or alphanumerical display provided at the interface 106 with one or more eyes of the subject. In such embodiments, the timer 128 can be used to provide the subject with a limited amount of time to gaze at the next number or letter in the passcode and, once selected, the feedback module 134 can be used to provide feedback to the subject to continue to the next number or letter of the passcode. In some embodiments, the challenge 120 can be to gaze at predetermined images or icons presented to the user at the interface 106 in a specific order. In some embodiments, the challenge 120 can be to position the subject's finger in a first orientation against the fingerprint scanner 110 (e.g., one type of biometric acquisition device) and, after a signal from the feedback module 134, rotate the finger by a predetermined angle to be scanned again by the fingerprint scanner 110. The system 100 is therefore configured to receive as input a response to the challenge 120 from the subject, whether in the form of the sequentially followed numbers and/or letters of the passcode or the correct change in angle of the finger for subsequent scanning. The received response to the challenge 120 can be analyzed and/or compared to the preset valid response 124 stored in the system 100.
Contemporaneous (e.g., simultaneous) to receiving the response to the challenge 120, the system 100 can be configured to capture one or more images 122 of the subject with the camera 104. For example, the camera 104 can capture images 122 of the iris of the subject. The images 122 can be analyzed by the system 100 for biometric authenticity. In some embodiments, biometric authenticity can be performed by iris segmentation and matching routines. In some embodiments, biometric authenticity can be performed by measurement of at least one of a position of an iris of the subject within a socket relative to corners of the eye, a distance of the iris from eyelids of the eye, an eyelid opening distance, eyelid opening movement, a relative position between a pupil of the subject and specular reflection, a size of the specular reflection, combinations thereof, or the like.
In some embodiments, biometric authenticity can be performed by facial expression variation, blinking frequency, iris texture, or the like, as analyzed and extracted from the images 122. In some embodiments, biometric authenticity can be performed by scanning the fingerprint of the user via the scanner 110. Thus, while the subject is providing the response to the challenge 120, the system 100 can use the captured images 122 to contemporaneously (e.g., simultaneously) determine the biometric authenticity of the subject. Based on both a successful match between the response to the challenge 120 and the preset valid response 124, and a successful finding of biometric authenticity, the subject can be verified by the system 100.
As noted above, in some embodiments, the challenge 120 presented to the subject can be a request for input of the preset valid response 124 in the form of a numerical or alphanumerical passcode. In some embodiments, the user interface 106 can initially display the subject's eye for alignment until the subject is in the capture volume or field-of-view of the camera 104. Once the subject is in the capture volume, the display can switch from an eye preview to an entry display capable of receiving input from the subject. In some embodiments, the GUI 108 can display a numeric or alphanumeric digital display to the subject for visually entering a preset passcode. For example, a possible entry display can be a 12-key numeric keypad (e.g., an array of digits).
The processing device 114 can provide a signal (e.g., visual, auditory, tactile, combinations thereof, or the like) to the subject via the feedback module 134 to begin visually entering the passcode using the display provided on the interface 106. The subject can begin entering their personal identification number or passcode by sequentially looking at each number and maintaining their gaze on each number until notified by the system 100 to move their gaze to the next number. During the subject's gaze at each of the numbers or letters of the passcode on the display, the system 100 can capture one or more images 122 of the subject with the camera 104.
The processing device 114 can be configured to analyze each image substantially in real-time while the subject is maintaining their gaze on a specific number on the display to determine the distance of the subject and the gaze angle of the subject relative to the interface 106. Based on the calculated distance to the display and the gaze angle on every frame captured, the system 100 determines which of the numbers in the numerical display the subject was focusing on. The system 100 can output feedback in the form of a visual indicator (e.g., highlighting, bold, different color, flashing, or the like) on the interface 106 regarding the number determined by the system 100 to be of focus by the subject (e.g., the number the subject was looking at). Such feedback indicates to the subject that the digit or letter has been accepted and it is time to look at the subsequent digit or letter in the passcode. In particular, once the system 100 determines that the subject is staring at a specific number, the system 100 can simulate depression of the digit on the screen and provide the feedback to the subject. Selection of each digit can therefore be performed visually without physical depression on the display by the subject. In some embodiments, in addition to the above-described feedback, the feedback module 134 can provide a visual, auditory and/or tactile signal to the subject, indicating that the subject should focus their gaze on the next number in the passcode.
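One plausible way to decide which digit is of focus from noisy per-frame estimates is a sliding-window vote; the window size and agreement fraction below are illustrative assumptions:

```python
from collections import Counter, deque

def focused_digit(frame_estimates, window=15, agreement=0.8):
    """Decide which digit the subject is staring at from per-frame estimates.

    frame_estimates: iterable of per-frame gazed-key labels (or None when no
    key is hit). A digit is accepted once `agreement` of the last `window`
    frames agree, smoothing over blinks and momentary estimation noise.
    """
    recent = deque(maxlen=window)
    for estimate in frame_estimates:
        recent.append(estimate)
        if len(recent) == window:
            digit, count = Counter(recent).most_common(1)[0]
            if digit is not None and count / window >= agreement:
                return digit
    return None
```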
By requiring the subject to use their eyes to select the digit, the system 100 is able to determine who is entering the information into the system 100 with higher confidence. In particular, contemporaneously (e.g., simultaneously) to detecting the response to the challenge 120 from the subject, the system 100 can analyze the captured images 122 to determine biometric identification of the subject. The system 100 gains this confidence by measuring multiple features of the acquired images 122. In some embodiments, iris segmentation and matching routines can be used to verify the claimed identity of the subject. Measuring additional features of the eye, such as the position of the eye within the socket relative to the corners of the eyes, distance from the eyelids, eyelid opening, relative position between the pupil and specular reflection, size of the specular reflection, or other features, can increase the confidence that the eye is an authentic three-dimensional eye.
In some embodiments, if the subject moves out of the capture volume or field-of-view of the camera 104, the system 100 can detect such movement and can reset the sequence of responding to the challenge 120 and determining biometric authenticity. In some embodiments, if the subject moves out of the capture volume or field-of-view of the camera 104, the system 100 can alert the subject to move back into the field-of-view of the camera 104 and allow the subject to continue where the verification sequence left off. Leaving the capture volume may be indicative of someone attempting to spoof the system 100. Therefore, waiting until the subject is near the center of the capture volume or field-of-view of the camera 104 provides an extra buffer against small movements. Keeping the subject near the center of the field-of-view of the camera 104 should be simple since the subject is generally only shifting their eyes, thereby maintaining the proper distance, angle, reflections, or the like, during analysis by the system 100.
Time should be used to limit the entry of each piece of information (e.g., numbers for a numerical passcode), preventing an attacker from getting lucky with each required input. For example, an attacker shifting a piece of paper around may be able to accomplish the correct gaze periodically. However, accomplishing the correct gaze within the allotted time and holding that position is far more challenging for an attacker than for an authentic eye. Security is therefore increased by reducing the time allowed between gaze positions. Such time limits can be enforced by the timer 128. Additionally, security can be increased by using longer sequences of digits for the passcode to be entered by the subject.
In some embodiments, the time between verification attempts can be limited by the timer 128. For example, if a subject makes a mistake during the verification process, or an attacker fails to accomplish the entire sequence correctly, the system 100 can allow an immediate retry. After a small number of retries (e.g., two retries), the system 100 can require a long pause before additional attempts are allowed to ensure that the number of attack sequences per day is limited. For example, in some embodiments, the system 100 can require that the subject wait one hour before attempting the verification process again after three incorrect sequences. Security can be increased by allowing fewer retries and necessitating longer pauses between verification attempts.
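A sketch of such retry throttling follows, using the example figures from above (two immediate retries, then a one-hour lockout); real deployments would tune both parameters:

```python
import time

class RetryLimiter:
    """Allow a few immediate retries, then force a long pause."""

    def __init__(self, free_retries=2, lockout_s=3600.0):
        self.free_retries = free_retries   # retries allowed before lockout
        self.lockout_s = lockout_s         # pause after exhausting retries
        self.failures = 0
        self.locked_until = 0.0

    def attempt_allowed(self):
        return time.monotonic() >= self.locked_until

    def record_failure(self):
        self.failures += 1
        if self.failures > self.free_retries:
            self.locked_until = time.monotonic() + self.lockout_s
            self.failures = 0

    def record_success(self):
        self.failures = 0
```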
Thus, in some embodiments, a predetermined period of time during which the subject's gaze is detected to hover over a number can be indicative of the desired selection. In such embodiments, the system 100 can indicate to the subject when it is time to move the subject's gaze to the next number. In some embodiments, the system 100 can request the subject to blink after the subject's gaze is hovering over a number to indicate the desired selection, the subject's blink selecting the number and explicitly advancing the system 100 to the next number (if any). In some embodiments, the user interface 106 can include a “next” or “enter” button (physical and/or electronic) that the subject can actuate while the subject's gaze is hovering over a number to indicate the desired selection, actuation of the button selecting the number and explicitly advancing the system 100 to the next number (if any). In some embodiments, actuation of the button can substantially simultaneously capture the number on the numerical display and the subject's fingerprint via a fingerprint scanner 110 embedded in the button of the user interface 106, resulting in a multi-biometric characteristic capture within a tight timing tolerance. In some embodiments, one or more fingerprint scanners 110 can be embedded into the user interface 106 such that each respective fingerprint scanner 110 underlies a button associated with the user interface 106. In such embodiments, the challenge issued to the subject can be to enter a security code using the physical or electronic numerical display of the user interface 106, and the system 100 can capture the subject's fingerprint via the fingerprint scanner 110 simultaneous to actuation of the button(s).
In some embodiments, the subject could enter their passcode using two or more fingers. In some embodiments, a first finger (associated with a first fingerprint) is used to enter a first digit of a passcode, and a second finger different than the first finger (and which is associated with a second fingerprint different than the first fingerprint) can be used for a second digit. For example, if a PIN has four digits (e.g., 5-7-6-8), a first finger (e.g., the left index finger) could be used to enter a first digit (e.g., the third-position digit, in this example a "6"), and a second finger different than the first finger (e.g., the right index finger or the left thumb) could be used to enter a second digit (e.g., the first-position digit, in this example a "5"). Increased complexity could be provided, for example, such that a different finger is used for different digits (e.g., six different fingerprints for six different digits).
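A compact, non-limiting sketch of this digit-to-finger binding is shown below; the enrollment record format and finger identifiers are hypothetical:

```python
# Hypothetical enrollment record: which enrolled finger must press which
# position of the passcode (positions are 0-indexed).
EXPECTED_FINGER_BY_POSITION = {0: "right_index", 2: "left_index"}

def passcode_and_fingers_ok(entered, expected_code, finger_by_position):
    """Check the digits AND that designated positions used the enrolled finger.

    entered: list of (digit, matched_finger_id_or_None) per keypress, where
    matched_finger_id comes from a fingerprint match on the scanner under the
    key; positions absent from finger_by_position accept any finger.
    """
    if [digit for digit, _ in entered] != list(expected_code):
        return False
    return all(entered[pos][1] == finger
               for pos, finger in finger_by_position.items()
               if pos < len(entered))

# Example: PIN 5-7-6-8 with the first digit bound to the right index finger
# and the third digit bound to the left index finger:
# passcode_and_fingers_ok(entered, "5768", EXPECTED_FINGER_BY_POSITION)
```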
In some embodiments, rather than asking the subject to gaze at each individual number of a passcode, the challenge 120 can involve directing the subject to swipe their pupil through a pattern by moving a device of the system 100 vertically and/or horizontally or tilting the device to sweep the pattern. Similar to a swipe pattern entered by using a finger, this pattern can be changed at any time by the user. The key to security is contemporaneous (e.g., simultaneous) iris capture and verification throughout the entry of the swipe motion. Longer patterns can increase overall security. The time to complete the pattern and the time allowed between attempts can be used to strengthen security of the system 100.
In some embodiments, the system 100 can apply variations in the type of biometric authentication used during the subject's response to the challenge 120. In some embodiments, the system 100 can change the position of the illumination source 102 (or change the direction of the illumination beam) to force a change in position of the specular reflection(s), and use such changes in position of the specular reflection(s) to determine if the eye has a three-dimensional shape. In some embodiments, such change in illumination can be performed by rotating a mobile device, which would move the position of the illumination source 102 and the rotation angle of the camera 104. In embodiments using multiple illumination sources 102 and/or cameras 104, different patterns of illumination can be switched on or off to logically move and change the illumination position.
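A crude version of this specular-motion liveness test can be sketched as follows, assuming a hypothetical upstream detector that returns the specular-highlight centroid for each illumination pattern:

```python
def specular_moves_with_light(highlight_positions, min_shift_px=3.0):
    """Coarse three-dimensionality check: the specular highlight should shift
    whenever the active illuminator changes.

    highlight_positions: list of (x, y) highlight centroids, one per
    illumination pattern, in pixel coordinates. A flat photograph tends to
    show little or no consistent shift between patterns.
    """
    shifts = [abs(x2 - x1) + abs(y2 - y1)
              for (x1, y1), (x2, y2) in zip(highlight_positions,
                                            highlight_positions[1:])]
    return bool(shifts) and min(shifts) >= min_shift_px
```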
In some embodiments, the challenge 120 presented to the subject can be a request for input of the preset valid response 124 in the form of an initial position of a finger of the subject against the fingerprint scanner 110, and a signal to orient the finger in a different, subsequent position for an additional scan of the finger. For example, the subject can initially position the finger against the scanner 110 in any orientation with the system 100 designating the initial position as a zero angle position. Upon receiving a signal from the feedback module 134 directing the subject to change the position of the finger (e.g., the challenge 120), the subject can rotate the finger on the platen by a predetermined amount. In such embodiments, the angle and/or direction of rotation can be previously stored as the preset valid response 124 to the challenge 120.
Upon receiving the signal, the subject can have a limited amount of time to reposition the finger and stabilize the position of the finger in the new orientation. For example, a subject can present a right index fingerprint and, when an LED on the scanner 110 turns green to prompt reorientation of the finger, rotate the finger by approximately 45 degrees counterclockwise. Another subject can rotate the finger by approximately 45 degrees clockwise. The angle and/or direction of rotation therefore serves as the response to the challenge 120, while the actual scan of the fingerprint in both orientations serves as the biometric authentication of the subject.
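The angle-matching portion of this challenge reduces to a simple tolerance check; the measured angle is assumed to come from a hypothetical ridge-orientation alignment step, and the tolerance below is illustrative:

```python
def rotation_response_ok(measured_angle_deg, preset_angle_deg,
                         tolerance_deg=10.0):
    """Check the fingerprint-rotation response against the enrolled preset.

    measured_angle_deg: signed rotation between the initial and subsequent
    scans (positive = counterclockwise). Because the sign encodes direction,
    a 45-degree clockwise turn (-45) will not match a 45-degree
    counterclockwise preset (+45).
    """
    return abs(measured_angle_deg - preset_angle_deg) <= tolerance_deg
```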
A subject's face is much more expressive than a fingerprint. For example, in response to a challenge 120, a subject can frown, smile or look surprised. The system 100 can analyze the captured images 122 to determine such variations in the subject's face during presentation of a challenge 120 and response to the challenge 120. Although discussed herein as variations to the system 100, it should be understood that the system 100 can use any combinations of challenges 120 and biometric authentication described herein for verifying the subject.
Iris recognition can also be used for verification by the system 100. For example, the eye is well adapted for looking left, right, up, down and for blinking. An eye can trace a swipe pattern on a grid presented on a screen of the interface 106, with the gaze being tracked using gaze tracking technology. An eye can blink once slowly and once rapidly, or could spell a password in Morse code. Iris recognition can be used to recognize the fine details of a subject's iris texture. Thus, the system 100 can use iris recognition to discern whether the subject is fixated in the proper direction with eyes open. Based on the captured images 122, the system 100 can detect and analyze a forward gaze, an open-eyed gaze, eye blinks, and off-axis gaze angles as a response to a challenge 120.
The system 100 therefore provides a biometric means for responding to a challenge 120. As noted above, challenges 120 can be presented in a variety of ways through visual, auditory and/or tactile signals from the feedback module 134 and/or the interface 106. For example, in the fingerprint example of the challenge 120, illuminating an LED of the system 100 green can signal the request for the subject to begin the response to the challenge (e.g., rotating the finger by the appropriate angle and the correct direction). As another example, a face or iris recognition system can present a grid over which the subject traces a subject-specific gaze pattern while being gaze-tracked.
An iris recognition system can, in the course of requiring the subject to look toward the iris camera 104, present a challenge (e.g., an LED, a display indicator, a sound, a tactile buzz or vibration, or the like), at which time the subject simply blinks once or briefly looks off in a subject-specific direction and then back at the camera 104. Variations in the gaze angle or direction can therefore be used as the response to the challenge 120. The responses to the challenge 120 are tracked within an appropriate time frame monitored by the system 100 to ensure that each response meets its temporal requirements.
In some embodiments, the system 100 can combine a variety of liveness measures. For example, the system 100 can monitor the change in biometric characteristics of the iris 170 in response to the command or prompt to look down at the gaze attractor 166 and track the downward movement of the eye 164 and/or iris 170 as the subject follows the position of the gaze attractor 166. In some embodiments, the system 100 can select the different positions of the gaze attractor 166 randomly. The subject can therefore be directed to gaze and stare at the gaze attractor 166 in each position for a predetermined period of time, with the subject moving the gaze and staring at the gaze attractor 166 with each change in position. During the subject's focused gaze for the predetermined period of time, the system 100 can check the gaze angle relative to the interface 160. The biometric characteristics of the subject can be contemporaneously (e.g., simultaneously) measured, particularly in downward gazes. For example, in downward gazes, the gaze angle changes and the eyelid begins to naturally close substantially simultaneously, making it difficult to perform biometric analysis in the form of iris recognition. The combined and contemporaneous (e.g., simultaneous) measurement of change in gaze and biometric characteristics increases the overall security of the system 100, even in instances where the eyelid closes as the gaze angle changes.
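A sketch of the random attractor sequencing and gaze-following check follows; the attractor positions, error bound, and normalized coordinate convention are illustrative assumptions:

```python
import random

# Hypothetical attractor positions on the display (normalized coordinates).
ATTRACTOR_POSITIONS = [(0.1, 0.1), (0.9, 0.1), (0.5, 0.5), (0.1, 0.9), (0.9, 0.9)]

def attractor_sequence(n, seed=None):
    """Pick n random attractor positions for one liveness session."""
    rng = random.Random(seed)
    return [rng.choice(ATTRACTOR_POSITIONS) for _ in range(n)]

def gaze_followed(attractors, measured_gaze_points, max_error=0.15):
    """Require the measured gaze point to track each attractor position."""
    return len(attractors) == len(measured_gaze_points) and all(
        abs(ax - gx) + abs(ay - gy) <= max_error
        for (ax, ay), (gx, gy) in zip(attractors, measured_gaze_points))
```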
For example, consider entry of a four-digit personal identification number via a numerical display presented at the interface 106.
The subject can be prompted to sequentially gaze from number to number to input the unique personal identification number, e.g., 1-2-3-4, while maintaining the gaze at each number for a predetermined period of time, e.g., one second. The system 100 can monitor the subject's gaze position to accept or reject the input passcode, while contemporaneously (e.g., simultaneously) verifying the subject's identity using iris recognition. The contemporaneous (e.g., simultaneous) combination of visual input of a passcode and biometric authentication in the form of iris recognition provides a multiple-layer defense against spoofing.
Virtualization may be employed in the computing device 300 so that infrastructure and resources in the computing device 300 may be shared dynamically. A virtual machine 314 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor. Memory 306 may include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 306 may include other types of memory as well, or combinations thereof.
A user may interact with the computing device 300 through a visual display device 318 (e.g., a personal computer, a mobile smart device, or the like), such as a computer monitor, which may display one or more user interfaces 320 (e.g., a graphical user interface) that may be provided in accordance with exemplary embodiments. The computing device 300 may include other I/O devices for receiving input from a user, for example, a camera, a sensor, a fingerprint scanner, a keyboard or any suitable multi-point touch interface 308, and a pointing device 310 (e.g., a mouse). The keyboard 308 and the pointing device 310 may be coupled to the visual display device 318. The computing device 300 may include other suitable conventional I/O peripherals.
The computing device 300 may also include one or more storage devices 324, such as a hard-drive, CD-ROM, eMMC (MultiMediaCard), SD (secure digital) card, flash drive, non-volatile storage media, or other computer readable media, for storing data and computer-readable instructions and/or software that implement exemplary embodiments of the biometric security systems described herein. Exemplary storage device 324 may also store one or more databases 326 for storing any suitable information required to implement exemplary embodiments. For example, exemplary storage device 324 can store one or more databases 326 for storing information, such as data relating to challenges, captured images, preset valid responses, biometric data, combinations thereof, or the like, and computer-readable instructions and/or software that implement exemplary embodiments described herein. The databases 326 may be updated manually or automatically at any suitable time to add, delete, and/or update one or more items in the databases.
The computing device 300 can include a network interface 312 configured to interface via one or more network devices 322 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56 kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above. The network interface 312 may include a built-in network adapter, network interface card, PCMCIA network card, PCI/PCIe network adapter, SD adapter, Bluetooth adapter, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing device 300 to any type of network capable of communication and performing the operations described herein. Moreover, the computing device 300 may be any computer system, such as a workstation, desktop computer, server, laptop, handheld computer, tablet computer, mobile computing or communication device (e.g., a smart phone), an embedded computing platform, or other form of computing or telecommunications device that is capable of communication and that has sufficient processor power and memory capacity to perform the operations described herein.
The computing device 300 may run any operating system 316, such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, or any other operating system capable of running on the computing device and performing the operations described herein. In exemplary embodiments, the operating system 316 may be run in native mode or emulated mode. In an exemplary embodiment, the operating system 316 may be run on one or more cloud machine instances.
The environment 400 can include repositories or databases 418, 420, which can be in communication with the servers 402, 404, as well as the one or more illumination sources 406, one or more cameras 408, one or more processing devices 410, the feedback module 412, the user interface 414, and the central computing system 416, via the communications platform 422.
In exemplary embodiments, the servers 402, 404, one or more illumination sources 406, one or more cameras 408, one or more processing devices 410, the feedback module 412, the user interface 414, and the central computing system 416 can be implemented as computing devices (e.g., computing device 300). Those skilled in the art will recognize that the databases 418, 420 can be incorporated into one or more of the servers 402, 404. In some embodiments, the databases 418, 420 can store data relating to challenges, captured images, preset valid responses, biometric data, combinations thereof, or the like, and such data can be distributed over multiple databases 418, 420.
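By way of illustration only, the sketch below shows one way such data could be distributed over the databases 418, 420, routing each record by a hash of its key; the routing rule and the in-memory dictionaries standing in for the databases are assumptions made for this example.

```python
# Illustrative sketch only: routing records across the databases 418, 420.
# The hash-based routing rule and the dict stand-ins are assumptions.
import hashlib

def pick_database(record_key: str, databases: list) -> dict:
    """Choose a database deterministically from the record key."""
    digest = hashlib.sha256(record_key.encode("utf-8")).digest()
    return databases[digest[0] % len(databases)]

db_418: dict = {}   # stand-in for database 418
db_420: dict = {}   # stand-in for database 420

key = "subject-42/captured-image-007"
store = pick_database(key, [db_418, db_420])
store[key] = b"<image bytes>"   # each record lands in exactly one database
```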
While exemplary embodiments have been described herein, it is expressly noted that these embodiments should not be construed as limiting, but rather that additions and modifications to what is expressly described herein also are included within the scope of the invention. Moreover, it is to be understood that the features of the various embodiments described herein are not mutually exclusive and can exist in various combinations and permutations, even if such combinations or permutations are not made express herein, without departing from the spirit and scope of the invention.
The present application claims the benefit of priority to U.S. Provisional Application No. 62/537,253, filed Jul. 26, 2017, which is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3852592 | Scoville et al. | Dec 1974 | A |
3993888 | Fellman | Nov 1976 | A |
4109237 | Hill | Aug 1978 | A |
4641349 | Flom et al. | Feb 1987 | A |
5291560 | Daugman | Mar 1994 | A |
5337104 | Smith et al. | Aug 1994 | A |
5481622 | Gerhardt et al. | Jan 1996 | A |
5572596 | Wildes et al. | Nov 1996 | A |
5835616 | Lobo et al. | Nov 1998 | A |
5861940 | Robinson et al. | Jan 1999 | A |
5933515 | Pu et al. | Aug 1999 | A |
5953440 | Zhang et al. | Sep 1999 | A |
5966197 | Yee | Oct 1999 | A |
5987459 | Swanson et al. | Nov 1999 | A |
6055322 | Salganicoff et al. | Apr 2000 | A |
6081607 | Mori et al. | Jun 2000 | A |
6119096 | Mann et al. | Sep 2000 | A |
6144754 | Okano et al. | Nov 2000 | A |
6204858 | Gupta | Mar 2001 | B1 |
6229907 | Okano et al. | May 2001 | B1 |
6247813 | Kim et al. | Jun 2001 | B1 |
6252976 | Schildkraut et al. | Jun 2001 | B1 |
6301370 | Steffens et al. | Oct 2001 | B1 |
6307954 | Suzaki | Oct 2001 | B1 |
6320610 | Van Sant et al. | Nov 2001 | B1 |
6421462 | Christian et al. | Jul 2002 | B1 |
6424727 | Musgrave et al. | Jul 2002 | B1 |
6433326 | Levine et al. | Aug 2002 | B1 |
6525303 | Gladnick | Feb 2003 | B1 |
6526160 | Ito | Feb 2003 | B1 |
6542624 | Oda | Apr 2003 | B1 |
6549644 | Yamamoto | Apr 2003 | B1 |
6614919 | Suzaki et al. | Sep 2003 | B1 |
6714665 | Hanna et al. | Mar 2004 | B1 |
6765581 | Cheng | Jul 2004 | B2 |
6836554 | Bolle et al. | Dec 2004 | B1 |
6850252 | Hoffberg | Feb 2005 | B1 |
6851051 | Bolle | Feb 2005 | B1 |
6895103 | Chen et al. | May 2005 | B2 |
6912298 | Wilensky | Jun 2005 | B1 |
6970582 | Langley | Nov 2005 | B2 |
6977989 | Bothe et al. | Dec 2005 | B2 |
7015955 | Funston et al. | Mar 2006 | B2 |
7095901 | Lee et al. | Aug 2006 | B2 |
7099495 | Kondo et al. | Aug 2006 | B2 |
7118042 | Moore et al. | Oct 2006 | B2 |
7130453 | Kondo et al. | Oct 2006 | B2 |
7146027 | Kim et al. | Dec 2006 | B2 |
7295686 | Wu | Nov 2007 | B2 |
7310443 | Kris et al. | Dec 2007 | B1 |
7380938 | Chmielewski, Jr. et al. | Jun 2008 | B2 |
7428320 | Northcott et al. | Sep 2008 | B2 |
7466308 | Dehlin | Dec 2008 | B2 |
7466847 | Komura | Dec 2008 | B2 |
7542628 | Loiacono et al. | Jun 2009 | B2 |
7574021 | Matey | Aug 2009 | B2 |
7583823 | Jones et al. | Sep 2009 | B2 |
7599524 | Camus et al. | Oct 2009 | B2 |
7627147 | Loiacono et al. | Dec 2009 | B2 |
7634114 | Zappia | Dec 2009 | B2 |
7657127 | Loiacono et al. | Feb 2010 | B2 |
7751598 | Matey et al. | Jul 2010 | B2 |
7925059 | Hoyos et al. | Apr 2011 | B2 |
7986816 | Hoanca | Jul 2011 | B1 |
8050463 | Hamza | Nov 2011 | B2 |
8170293 | Tosa et al. | May 2012 | B2 |
8189879 | Cambier | May 2012 | B2 |
8195576 | Grigg et al. | Jun 2012 | B1 |
8200980 | Robinson et al. | Jun 2012 | B1 |
8317325 | Raguin et al. | Nov 2012 | B2 |
8337104 | Takiguchi et al. | Dec 2012 | B2 |
8374404 | Williams et al. | Feb 2013 | B2 |
8553948 | Hanna | Oct 2013 | B2 |
8603165 | Park | Dec 2013 | B2 |
8639058 | Bergen et al. | Jan 2014 | B2 |
8682073 | Bergen | Mar 2014 | B2 |
8755607 | Bergen et al. | Jun 2014 | B2 |
8854446 | Bergen et al. | Oct 2014 | B2 |
8934005 | De Bruijn | Jan 2015 | B2 |
9100825 | Schultz et al. | Aug 2015 | B2 |
9131141 | Tinker et al. | Sep 2015 | B2 |
9195890 | Bergen | Nov 2015 | B2 |
9514365 | Tinker et al. | Dec 2016 | B2 |
9665772 | Bergen | May 2017 | B2 |
9836647 | Perna et al. | Dec 2017 | B2 |
9836648 | Perna et al. | Dec 2017 | B2 |
10025982 | Perna et al. | Jul 2018 | B2 |
20020026585 | Hondros | Feb 2002 | A1 |
20020080141 | Imai et al. | Jun 2002 | A1 |
20020118864 | Kondo et al. | Aug 2002 | A1 |
20020150280 | Li | Oct 2002 | A1 |
20020154794 | Cho | Oct 2002 | A1 |
20020164054 | McCartney et al. | Nov 2002 | A1 |
20020173354 | Winans | Nov 2002 | A1 |
20020180586 | Kitson et al. | Dec 2002 | A1 |
20030046553 | Angelo | Mar 2003 | A1 |
20030103652 | Lee et al. | Jun 2003 | A1 |
20030123711 | Kim et al. | Jul 2003 | A1 |
20030169334 | Braithwaite et al. | Sep 2003 | A1 |
20030174211 | Imaoka et al. | Sep 2003 | A1 |
20040037452 | Shin | Feb 2004 | A1 |
20040088584 | Shachar et al. | May 2004 | A1 |
20040146187 | Jeng | Jul 2004 | A1 |
20040170304 | Haven | Sep 2004 | A1 |
20040213437 | Howard et al. | Oct 2004 | A1 |
20040236549 | Dalton | Nov 2004 | A1 |
20050047655 | Luo et al. | Mar 2005 | A1 |
20050063582 | Park et al. | Mar 2005 | A1 |
20050084179 | Hanna et al. | Apr 2005 | A1 |
20050088200 | Takekuma et al. | Apr 2005 | A1 |
20050165327 | Thibault et al. | Jul 2005 | A1 |
20050187883 | Bishop | Aug 2005 | A1 |
20050210267 | Sugano et al. | Sep 2005 | A1 |
20050270386 | Saitoh et al. | Dec 2005 | A1 |
20060008125 | Lauper et al. | Jan 2006 | A1 |
20060028617 | Matsumura et al. | Feb 2006 | A1 |
20060098097 | Wach et al. | May 2006 | A1 |
20060105806 | Vance et al. | May 2006 | A1 |
20060120570 | Azuma et al. | Jun 2006 | A1 |
20060140454 | Northcott et al. | Jun 2006 | A1 |
20060150928 | Lehmann et al. | Jul 2006 | A1 |
20060184243 | Yilmaz | Aug 2006 | A1 |
20060202036 | Wang et al. | Sep 2006 | A1 |
20060210123 | Kondo et al. | Sep 2006 | A1 |
20060222212 | Du et al. | Oct 2006 | A1 |
20060245623 | Loiacono et al. | Nov 2006 | A1 |
20060274918 | Amantea et al. | Dec 2006 | A1 |
20070014439 | Ando | Jan 2007 | A1 |
20070025598 | Kobayashi et al. | Feb 2007 | A1 |
20070036397 | Hamza | Feb 2007 | A1 |
20070047770 | Swope et al. | Mar 2007 | A1 |
20070140531 | Hamza | Jun 2007 | A1 |
20070160266 | Jones et al. | Jul 2007 | A1 |
20070174628 | Charrette, III | Jul 2007 | A1 |
20070189582 | Hamza et al. | Aug 2007 | A1 |
20070198850 | Martin et al. | Aug 2007 | A1 |
20070201728 | Monro | Aug 2007 | A1 |
20070206935 | Ono | Sep 2007 | A1 |
20070236567 | Pillman et al. | Oct 2007 | A1 |
20070285537 | Dwinell et al. | Dec 2007 | A1 |
20080021331 | Grinvald et al. | Jan 2008 | A1 |
20080049185 | Huffman et al. | Feb 2008 | A1 |
20080069411 | Friedman et al. | Mar 2008 | A1 |
20080121721 | Chen et al. | May 2008 | A1 |
20080180544 | Drader et al. | Jul 2008 | A1 |
20080187174 | Metaxas et al. | Aug 2008 | A1 |
20080219515 | Namgoong | Sep 2008 | A1 |
20080271116 | Robinson et al. | Oct 2008 | A1 |
20090041309 | Kim | Feb 2009 | A1 |
20090049536 | Ooi | Feb 2009 | A1 |
20090092292 | Carver et al. | Apr 2009 | A1 |
20090208064 | Cambier | Aug 2009 | A1 |
20090216606 | Coffman et al. | Aug 2009 | A1 |
20090220126 | Claret-Tournier et al. | Sep 2009 | A1 |
20090232418 | Loiacono et al. | Sep 2009 | A1 |
20090278922 | Tinker et al. | Nov 2009 | A1 |
20100026853 | Mokhnatyuk | Feb 2010 | A1 |
20100034529 | Jelinek | Feb 2010 | A1 |
20100046808 | Connell et al. | Feb 2010 | A1 |
20100063880 | Atsmon et al. | Mar 2010 | A1 |
20100082398 | Davis et al. | Apr 2010 | A1 |
20100142938 | Zhang | Jun 2010 | A1 |
20100176802 | Huguet | Jul 2010 | A1 |
20100238316 | Kim et al. | Sep 2010 | A1 |
20100278394 | Raguin et al. | Nov 2010 | A1 |
20100287053 | Ganong et al. | Nov 2010 | A1 |
20100290668 | Friedman et al. | Nov 2010 | A1 |
20100301113 | Bohn et al. | Dec 2010 | A1 |
20100310133 | Mason et al. | Dec 2010 | A1 |
20100328420 | Roman | Dec 2010 | A1 |
20110007205 | Lee | Jan 2011 | A1 |
20110043683 | Beach et al. | Feb 2011 | A1 |
20110075893 | Connell, II et al. | Mar 2011 | A1 |
20110081946 | Singh | Apr 2011 | A1 |
20110134268 | MacDonald | Jun 2011 | A1 |
20110142297 | Yu et al. | Jun 2011 | A1 |
20110167489 | Ziv | Jul 2011 | A1 |
20110187878 | Mor et al. | Aug 2011 | A1 |
20110317991 | Tsai | Dec 2011 | A1 |
20120086645 | Zheng et al. | Apr 2012 | A1 |
20120154536 | Stoker et al. | Jun 2012 | A1 |
20120155716 | Kim | Jun 2012 | A1 |
20120163783 | Braithwaite et al. | Jun 2012 | A1 |
20120243729 | Pasquero | Sep 2012 | A1 |
20120293642 | Berini et al. | Nov 2012 | A1 |
20130014153 | Bhatia et al. | Jan 2013 | A1 |
20130044199 | Nanu et al. | Feb 2013 | A1 |
20130051631 | Hanna | Feb 2013 | A1 |
20130081119 | Sampas | Mar 2013 | A1 |
20130083185 | Coleman, III | Apr 2013 | A1 |
20130089240 | Northcott et al. | Apr 2013 | A1 |
20130091520 | Chen | Apr 2013 | A1 |
20130147603 | Malhas et al. | Jun 2013 | A1 |
20130150120 | Wu et al. | Jun 2013 | A1 |
20130162798 | Hanna et al. | Jun 2013 | A1 |
20130188943 | Wu | Jul 2013 | A1 |
20130194407 | Kim | Aug 2013 | A1 |
20130215228 | Stoker et al. | Aug 2013 | A1 |
20130250085 | MacKinnon | Sep 2013 | A1 |
20130329115 | Palmeri | Dec 2013 | A1 |
20140046772 | Raman | Feb 2014 | A1 |
20140055337 | Karlsson | Feb 2014 | A1 |
20140059607 | Upadhyay et al. | Feb 2014 | A1 |
20140071547 | O'Neill et al. | Mar 2014 | A1 |
20140078389 | Merz | Mar 2014 | A1 |
20140161325 | Bergen | Jun 2014 | A1 |
20140165175 | Sugiyama | Jun 2014 | A1 |
20140171150 | Hurst et al. | Jun 2014 | A1 |
20140232930 | Anderson | Aug 2014 | A1 |
20140327815 | Auger | Nov 2014 | A1 |
20140369575 | Riopka et al. | Dec 2014 | A1 |
20150037935 | Kim et al. | Feb 2015 | A1 |
20150098629 | Perna et al. | Apr 2015 | A1 |
20150098630 | Perna et al. | Apr 2015 | A1 |
20150126245 | Barkan et al. | May 2015 | A1 |
20150193666 | Derakhshani et al. | Jul 2015 | A1 |
20150227790 | Smits | Aug 2015 | A1 |
20150286864 | Gottemukkula et al. | Oct 2015 | A1 |
20150338915 | Publicover et al. | Nov 2015 | A1 |
20150379325 | Tinker et al. | Dec 2015 | A1 |
20160012275 | Bergen | Jan 2016 | A1 |
20160012292 | Perna et al. | Jan 2016 | A1 |
20160014121 | Perna et al. | Jan 2016 | A1 |
20160022167 | Simon | Jan 2016 | A1 |
20160112414 | Tsou | Apr 2016 | A1 |
20160117544 | Hoyos et al. | Apr 2016 | A1 |
20160148384 | Bud et al. | May 2016 | A1 |
20160180169 | Bae et al. | Jun 2016 | A1 |
20160274660 | Publicover et al. | Sep 2016 | A1 |
20160345818 | Suzuki et al. | Dec 2016 | A1 |
20160364609 | Ivanisov et al. | Dec 2016 | A1 |
20170032114 | Turgeman | Feb 2017 | A1 |
20170111568 | Hsieh et al. | Apr 2017 | A1 |
20170124314 | Laumea | May 2017 | A1 |
20170132399 | Pawluk et al. | May 2017 | A1 |
20170286790 | Mapen et al. | Oct 2017 | A1 |
20170286792 | Ackerman et al. | Oct 2017 | A1 |
20170323167 | Mapen et al. | Nov 2017 | A1 |
20170337439 | Ackerman et al. | Nov 2017 | A1 |
20170337440 | Green et al. | Nov 2017 | A1 |
20170337441 | Mapen et al. | Nov 2017 | A1 |
20170347000 | Perna et al. | Nov 2017 | A1 |
20180025244 | Bohl et al. | Jan 2018 | A1 |
20180165537 | Ackerman | Jun 2018 | A1 |
20180336336 | Elovici | Nov 2018 | A1 |
Number | Date | Country |
---|---|---|
102708357 | Oct 2012 | CN |
103048848 | Apr 2013 | CN |
103099624 | May 2013 | CN |
0821912 | Feb 1998 | EP |
1324259 | Jul 2003 | EP |
2007011667 | Jan 2007 | JP |
2008-538425 | Oct 2008 | JP |
4372321 | Nov 2009 | JP |
2003-0066512 | Aug 2003 | KR |
10-2011-0134848 | Dec 2011 | KR |
WO-199619132 | Jun 1996 | WO |
WO-199714873 | Apr 1997 | WO |
WO-199721188 | Jun 1997 | WO |
WO-199808439 | Mar 1998 | WO |
WO-199931183 | Jun 1999 | WO |
WO-200039760 | Jul 2000 | WO |
WO-2013056001 | Apr 2013 | WO |
WO-2014093227 | Jun 2014 | WO |
WO-2014100250 | Jun 2014 | WO |
WO-2015102704 | Jul 2015 | WO |
WO-2017172695 | Oct 2017 | WO |
WO-2017173228 | Oct 2017 | WO |
Entry |
---|
Jacomet, Marcel, Josef Goette, and Andreas Eicher. "On using fingerprint-sensors for pin-pad entry." 4th IEEE International Symposium on Electronic Design, Test and Applications (DELTA 2008). IEEE, 2008. (Year: 2008). |
Cymek, Dietlind Helene, et al. “Entering PIN codes by smooth pursuit eye movements.” (2014). (Year: 2014). |
Usher, David, Yasunari Tosa, and Marc Friedman. “Ocular biometrics: simultaneous capture and analysis of the retina and iris.” Advances in Biometrics. Springer, London, 2008. 133-155. (Year: 2008). |
Lee, Hyeon Chang, et al. “A new mobile multimodal biometric device integrating finger vein and fingerprint recognition.” Proceedings of the 4th International Conference on Ubiquitous Information Technologies & Applications. IEEE, 2009. (Year: 2009). |
Macek, Nemanja, et al. “Multimodal biometric authentication in IoT: Single camera case study.” (2016): 33-38. (Year: 2016). |
NPL Search Results (Year: 2020). |
Annapoorani et al., Accurate and Fast Iris Segmentation. International Journal of Engineering Science and Technology. 2010;2(6):1492-1499. |
Arfken, G., "Mathematical Methods for Physicists," Academic Press, NY, 6th Ed. (2005). |
Atos Origin, “UK Passport Service, Biometrics Enrollment Trial.” Atos Origin Report (May 2005). |
Bertalmio et al., Navier-Stokes, Fluid Dynamics, and Image and Video Inpainting. Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. CVPR 2001, 8 pages (2001). |
Betke, et al., "Preliminary Investigation of Real-time Monitoring of a Driver in City Traffic," IEEE Intelligent Vehicles Symposium, Oct. 3-5, 2000, Dearborn, MI, 563-568. |
Boehnen et al., A Multi-Sample Standoff Multimodal Biometric System, Theory, Applications and Systems (BTAS), Sep. 23, 2012, pp. 127-134. |
Bowyer et al., Image Understanding for Iris Biometrics: A Survey. Computer Vision and Image Understanding. 2008;110:281-307. |
Braithwaite, Michael et al., “Application-Specific Biometric Templates,” AutoID 2002 Workshop, Tarrytown, NY, pp. 1-10 (2002). |
Burt, et al., “The Laplacian Pyramid as a Compact Image Code,” IEEE Transactions on Communications, 31(4): 532-540, 1983. |
Canadian Office Action for Application 2,833,740 dated Jan. 15, 2018. |
Office Action dated Nov. 19, 2018, issued in connection with U.S. Appl. No. 15/661,297 (22 pages). |
Office Action dated Oct. 30, 2018, issued in connection with U.S. Appl. No. 15/514,098 (35 pages). |
Office Action dated Sep. 26, 2018, issued in connection with U.S. Appl. No. 15/471,131 (15 pages). |
Daugman, John, "How Iris Recognition Works," IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, No. 1 (Jan. 2004). |
Daugman, J., “High confidence visual recognition of persons by a test of statistical independence”, IEEE Transactions on Pattern Analysis and Machine Intelligence, 15 (11), pp. 1148-1161 (1993). |
Daugman, J., "Recognizing Persons by Their Iris Patterns," in Biometrics: Personal Identification in a Networked Society, A.K. Jain, et al., eds., Kluwer Academic Pub. 1999. |
Daugman, John et al., “Iris recognition border-crossing system in the UAE,” International Airport Review, Issue 2 (2004). |
Daugman, John, "How Iris Recognition Works," Jun. 13, 2003, IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, No. 1. |
Daugman, The Importance of Being Random: Statistical Principles of Iris Recognition. Pattern Recognition. Pre-publication version. 13 pages, Dec. 21, 2001. |
DellaVecchia, et al., “Methodology and apparatus for using the human iris as a robust biometric,” Ophthalmic Technologies VIII, SPIE Biomedical Optics Society, Photonics West Conference, San Jose, CA Jan. 24, 1998. |
Du et al., Analysis of Partial Iris Recognition Using a 1-D Approach. Proceedings, IEEE International Conference on Acoustics, Speech, and Signal Processing. Mar. 18-23, 2005;2:961-964. |
European Office Action for Application 12719332.4 dated Jan. 29, 2018. |
European Search Report for Application 14876521.7 dated Oct. 19, 2017. |
Extended European Search Report in connection with European Patent Application No. 15864635.6 dated Jun. 6, 2018 (8 pages). |
Fan, et al., “An Efficient Automatic Iris Image Acquisition and Preprocessing System,” Proceedings of the 2006 IEEE International Conference on Mechatronics and Automation, pp. 1779-1784 (6 pages). |
Final Office Action dated Aug. 18, 2016 from U.S. Appl. No. 14/858,715, filed Sep. 18, 2015 (6 pages). |
Final Office Action dated Aug. 4, 2016 from U.S. Appl. No. 14/509,366, filed Oct. 8, 2014 (24 pages). |
Final Office Action dated Mar. 21, 2017 from U.S. Appl. No. 14/863,936, filed Sep. 24, 2015 (17 pages). |
Final Office Action dated Mar. 22, 2017 from U.S. Appl. No. 14/863,950, filed Sep. 24, 2015 (16 pages). |
Final Office Action dated Mar. 22, 2017 from U.S. Appl. No. 14/863,960, filed Sep. 24, 2015 (21 pages). |
Final Office Action for U.S. Appl. No. 10/818,307, dated Jan. 21, 2009, 28 pages. |
Final Office Action for U.S. Appl. No. 10/818,307, dated Jan. 30, 2008, 19 pages. |
Final Office Action for U.S. Appl. No. 11/377,042, dated Nov. 14, 2008, 20 pages. |
Final Office Action for U.S. Appl. No. 11/510,197, dated May 5, 2009, 16 pages. |
Final Office Action for U.S. Appl. No. 12/464,369, dated Aug. 5, 2014, 22 pages. |
Final Office Action for U.S. Appl. No. 12/464,369, dated Oct. 3, 2012, 27 pages. |
Final Office Action for U.S. Appl. No. 12/576,644, dated Oct. 13, 2010, 11 pages. |
Final Office Action for U.S. Appl. No. 14/100,615, dated Sep. 1, 2015, 22 pages. |
Final Office Action for U.S. Appl. No. 14/509,356, dated Sep. 28, 2016, 20 pages. |
Final Office Action for U.S. Appl. No. 14/509,366, dated Aug. 4, 2016, 29 pages. |
Final Office Action for U.S. Appl. No. 14/846,090, dated Jun. 15, 2016, 17 pages. |
Final Office Action for U.S. Appl. No. 14/858,715, dated Aug. 18, 2016, 29 pages. |
Final Office Action for U.S. Appl. No. 14/858,715, dated Aug. 18, 2016, 6 pages. |
Final Office Action for U.S. Appl. No. 14/863,936, dated Mar. 21, 2017, 17 pages. |
Final Office Action for U.S. Appl. No. 14/863,950, dated Mar. 22, 2017, 16 pages. |
Final Office Action for U.S. Appl. No. 14/863,960, dated Mar. 22, 2017, 21 pages. |
First Japanese Office Action for Application 2015-545911 dated Feb. 26, 2018 (with English translation). |
FIT Validation Studies, http://www.pmifit.com/validation.htm, Mar. 2, 2004. |
Google Scholar Search—“Rida Hadma” pp. 1 of 2. |
Haro, et al., "Detecting and Tracking Eyes by Using Their Physiological Properties, Dynamics and Appearance," CVPR 2000, 163-168. |
Hutchinson, et al., "Human-Computer Interaction Using Eye-Gaze Input," IEEE Transactions on Systems, Man and Cybernetics, 19(6): 1527-1534, 1989. |
International Biometrics Group, “Independent Testing of Iris Recognition Technology, Final Report,” Study Commissioned by the US Department of Homeland Security (May 2005). |
International Preliminary Report on Patentability for Application No. PCT/US2015/051863, dated Mar. 28, 2017, 6 pages. |
International Search Report and Written Opinion for Application No. PCT/US17/13110, dated May 18, 2017, 12 pages. |
International Search Report and Written Opinion for Application No. PCT/US17/24444, dated Jun. 19, 2017, 9 pages. |
International Search Report and Written Opinion for Application No. PCT/US2013/073887, dated Mar. 20, 2014, 11 pages. |
International Search Report and Written Opinion for Application No. PCT/US2017/025303, dated Jun. 16, 2017, 11 pages. |
International Search Report and Written Opinion for PCT/US2017/24444 dated Jun. 19, 2017, pp. 1-15. |
International Search Report and Written Opinion for PCT/US2018/042807, dated Sep. 27, 2018, pp. 1-19. |
International Search Report and Written Opinion for PCT/US2017/025303 dated Jun. 16, 2017. |
International Search Report for Application No. PCT/US2015/051863, dated Dec. 10, 2015, 1 page. |
International Search Report for Application No. PCT/US2017/065793, dated Feb. 16, 2018, 3 pages. |
International Search Report for PCT/US2015/061024, dated Mar. 31, 2016. |
International Search Report of the International Searching Authority dated Jun. 28, 2018, issued in connection with International Application No. PCT/US2018/025895 (3 pages). |
Iwai, Daisuke, Shoichiro Mihara, and Kosuke Sato. "Extended depth-of-field projector by fast focal sweep projection." IEEE Transactions on Visualization and Computer Graphics 21.4 (2015): 462-470. |
Jacob, R., "The Use of Eye Movements in Human-Computer Interaction Techniques: What you Look At is What you Get," ACM Trans. Info. Sys., 9(3):152-169. |
Japanese Office Action for Application No. 2015-545911, dated Feb. 20, 2018, 6 pages. |
Li, Zexi, “An Iris Recognition Algorithm Based on Coarse and Fine Location,” 2017 IEEE 2nd International Conference on Big Data Analysis, pp. 744-747 (4 pages). |
Ma et al., “Efficient Iris Recognition by Characterizing Key Local Variations”, IEEE Transactions on Image Processing, vol. 13, No. 6, Jun. 2004, 12 pages. |
Ma., et al. “Iris Recognition Using Circular Symmetric Filters,” Pattern Recognition, 2002, Proceedings 16th International Conference on vol. 2 IEEE, 2002 (4 pages). |
Ma., et al., “Iris Recognition Based on Multichannel Gabor Filtering” ACCV2002: The 5th Asian Conference on Computer Vision, Jan. 23-25, 2002, Melbourne, Australia (5 pages). |
Mansfield, Tony et al., “Biometric Product Testing Final Report,” CESG Contract X92A/4009309, CESG/BWG Biometric Test Programme; Centre for Mathematics & Scientific Computing, National Physical Laboratory (2001). |
Matey et al., Iris on the Move: Acquisition of Images for Iris Recognition in Less Constrained Environments. Proceedings of the IEEE. Nov. 2006;94(11):1936-1947. |
Miyazawa et al., Iris Recognition Algorithm Based on Phase-Only Correlation, The Institute of Image Information and Television Engineers, Japan, Jun. 27, 2006, vol. 30, No. 33, pp. 45-48. |
Monro et al., An Effective Human Iris Code with Low Complexity. IEEE International Conference on Image Processing. Sep. 14, 2005;3:277-280. |
Narayanswamy, et al., “Extended Depth-of-Field Iris Recognition System for a Workstation Environment,” Proc. SPIE. vol. 5779 (2005) (10 pages). |
Negin, et al., “An Iris Biometric System for Public and Personal Use,” IEEE Computer, pp. 70-75, Feb. 2000. |
Nguyen, et al., “Quality-Driven Super-Resolution for Less Constrained Iris Recognition at a Distance and on the Move,” IEEE Transactions on Information Forensics and Security 6.4 (2011) pp. 1248-1558 (11 pages). |
Non-Final Office Action for U.S. Appl. No. 10/809,471, dated Mar. 19, 2007, 12 pages. |
Non-Final Office Action for U.S. Appl. No. 10/818,307, dated Jul. 10, 2008, 28 pages. |
Non-Final Office Action for U.S. Appl. No. 10/818,307, dated Mar. 20, 2007, 22 pages. |
Non-Final Office Action for U.S. Appl. No. 11/334,968, dated Jan. 6, 2009, 28 pages. |
Non-Final Office Action for U.S. Appl. No. 11/377,042, dated Apr. 8, 2009, 22 pages. |
Non-Final Office Action for U.S. Appl. No. 11/377,042, dated Jan. 7, 2008, 13 pages. |
Non-Final Office Action for U.S. Appl. No. 11/510,197, dated Oct. 10, 2008, 36 pages. |
Non-Final Office Action for U.S. Appl. No. 11/510,197, dated Oct. 8, 2009, 21 pages. |
Non-Final Office Action for U.S. Appl. No. 11/849,969, dated Dec. 19, 2008, 17 pages. |
Non-Final Office Action for U.S. Appl. No. 11/857,432, dated Dec. 30, 2008, 23 pages. |
Non-Final Office Action for U.S. Appl. No. 12/429,695, dated Sep. 2, 2009, 11 pages. |
Non-Final Office Action for U.S. Appl. No. 12/464,369, dated Jan. 2, 2015, 23 pages. |
Non-Final Office Action for U.S. Appl. No. 12/464,369, dated May 9, 2012, 33 pages. |
Non-Final Office Action for U.S. Appl. No. 12/576,644, dated Jul. 14, 2010, 14 pages. |
Non-Final Office Action for U.S. Appl. No. 13/096,716, dated May 23, 2013, 16 pages. |
Non-Final Office Action for U.S. Appl. No. 13/096,724, dated Jan. 16, 2014, 29 pages. |
Non-Final Office Action for U.S. Appl. No. 13/096,728, dated May 7, 2013, 33 pages. |
Non-Final Office Action for U.S. Appl. No. 13/096,728, dated Nov. 8, 2012, 37 pages. |
Non-Final Office Action for U.S. Appl. No. 14/100,615, dated Mar. 4, 2015, 19 pages. |
Non-Final Office Action for U.S. Appl. No. 14/509,356, dated Feb. 29, 2016, 19 pages. |
Non-Final Office Action for U.S. Appl. No. 14/509,356, dated Mar. 16, 2017, 21 pages. |
Non-Final Office Action for U.S. Appl. No. 14/509,366, dated Feb. 21, 2017, 25 pages. |
Non-Final Office Action for U.S. Appl. No. 14/509,366, dated Mar. 3, 2016, 40 pages. |
Non-Final Office Action for U.S. Appl. No. 14/846,090, dated Jan. 7, 2016, 35 pages. |
Non-Final Office Action for U.S. Appl. No. 14/858,715, dated Mar. 14, 2016, 37 pages. |
Non-Final Office Action for U.S. Appl. No. 14/863,936, dated Aug. 4, 2016, 16 pages. |
Non-Final Office Action for U.S. Appl. No. 14/863,936, dated Sep. 26, 2017, 28 pages. |
Non-Final Office Action for U.S. Appl. No. 14/863,950, dated Aug. 3, 2016, 15 pages. |
Non-Final Office Action for U.S. Appl. No. 14/863,950, dated Sep. 26, 2017, 22 pages. |
Non-Final Office Action for U.S. Appl. No. 14/863,960, dated Aug. 3, 2016, 21 pages. |
Non-Final Office Action for U.S. Appl. No. 14/863,960, dated Sep. 28, 2017, 28 pages. |
Non-Final Office Action for U.S. Appl. No. 15/475,425, dated Jul. 12, 2018, 31 pages. |
Non-Final Office Action for U.S. Appl. No. 15/531,922, dated Jun. 12, 2018, 17 pages. |
Non-Final Office Action for U.S. Appl. No. 12/464,369, dated Feb. 27, 2014, 25 pages. |
Notice of Allowance dated Feb. 1, 2017 from U.S. Appl. No. 14/858,715, filed Sep. 18, 2015 (8 pages). |
Notice of Allowance for U.S. Appl. No. 10/809,471, dated Mar. 24, 2008, 14 pages. |
Notice of Allowance for U.S. Appl. No. 10/809,471, dated Oct. 5, 2007, 11 pages. |
Notice of Allowance for U.S. Appl. No. 10/818,307, dated May 18, 2009, 8 pages. |
Notice of Allowance for U.S. Appl. No. 11/334,968, dated Apr. 17, 2009, 11 pages. |
Notice of Allowance for U.S. Appl. No. 11/377,042, dated Sep. 8, 2009, 16 pages. |
Notice of Allowance for U.S. Appl. No. 11/510,197, dated Feb. 1, 2010, 13 pages. |
Notice of Allowance for U.S. Appl. No. 11/849,969, dated Aug. 20, 2009, 21 pages. |
Notice of Allowance for U.S. Appl. No. 11/849,969, dated Jul. 10, 2009, 18 pages. |
Notice of Allowance for U.S. Appl. No. 11/857,432, dated Jun. 17, 2009, 17 pages. |
Notice of Allowance for U.S. Appl. No. 12/429,695, dated Dec. 15, 2009, 7 pages. |
Notice of Allowance for U.S. Appl. No. 12/429,695, dated Nov. 17, 2009, 12 pages. |
Notice of Allowance for U.S. Appl. No. 12/464,369, dated May 8, 2015, 29 pages. |
Notice of Allowance for U.S. Appl. No. 12/576,644, dated Dec. 10, 2010, 14 pages. |
Notice of Allowance for U.S. Appl. No. 13/096,716, dated Oct. 30, 2013, 25 pages. |
Notice of Allowance for U.S. Appl. No. 13/096,724, dated Aug. 19, 2014, 17 pages. |
Notice of Allowance for U.S. Appl. No. 13/096,728, dated Feb. 7, 2014, 33 pages. |
Notice of Allowance for U.S. Appl. No. 13/096,735, dated Jun. 24, 2013, 24 pages. |
Notice of Allowance for U.S. Appl. No. 13/096,735, dated Oct. 4, 2013, 26 pages. |
Notice of Allowance for U.S. Appl. No. 14/100,615, dated Sep. 28, 2015, 22 pages. |
Notice of Allowance for U.S. Appl. No. 14/509,356, dated Aug. 1, 2017, 29 pages. |
Notice of Allowance for U.S. Appl. No. 14/509,366, dated Jul. 31, 2017, 59 pages. |
Notice of Allowance for U.S. Appl. No. 14/846,090, dated Jul. 25, 2016, 22 pages. |
Notice of Allowance for U.S. Appl. No. 14/858,715, dated Feb. 1, 2017, 42 pages. |
Notice of Allowance for U.S. Appl. No. 14/858,715, dated Feb. 1, 2017, 8 pages. |
Notice of Allowance for U.S. Appl. No. 14/858,715, dated Mar. 1, 2017, 13 pages. |
Notice of Allowance for U.S. Appl. No. 14/863,936, dated Mar. 20, 2018, 9 pages. |
Notice of Allowance for U.S. Appl. No. 14/863,950, dated Mar. 27, 2018, 9 pages. |
Notice of Allowance for U.S. Appl. No. 14/863,960, dated Mar. 20, 2018, 9 pages. |
Office Action dated Aug. 3, 2016 from U.S. Appl. No. 14/863,950, filed Sep. 24, 2015 (15 pages). |
Office Action dated Aug. 3, 2016 from U.S. Appl. No. 14/863,960, filed Sep. 24, 2015 (21 pages). |
Office Action dated Aug. 4, 2016 from U.S. Appl. No. 14/863,936, filed Sep. 24, 2015 (16 pages). |
Office Action dated Feb. 21, 2017 from U.S. Appl. No. 14/509,366, filed Oct. 8, 2014 (25 pages). |
Office Action dated Mar. 14, 2016 from U.S. Appl. No. 14/858,715, filed Sep. 18, 2015 (9 pages). |
Office Action dated Mar. 3, 2016 from U.S. Appl. No. 14/509,366, filed Oct. 8, 2014 (19 pages). |
Ortiz et al., An Optimal Strategy for Dilation Based Iris Image Enrollment. IEEE International Joint Conference on Biometrics. 6 pages, Sep. 29-Oct. 2, 2014. |
Restriction Requirement for U.S. Appl. No. 11/510,197, dated May 16, 2008, 12 pages. |
Robert J.K. Jacob, "Eye Movement Based Human Computer Interaction Techniques: Toward Non-Command Interfaces," Advances in Human-Computer Interaction, vol. 4, ed. by H.R. Hartson and D. Hix, pp. 151-190, Ablex Publishing Co., Norwood, N.J. (1993). |
Robert J.K. Jacob, "Eye Tracking in Advanced Interface Design," in Virtual Environments and Advanced Interface Design, ed. by W. Barfield and T.A. Furness, pp. 258-288, Oxford University Press, New York (1995). |
Roth, Mouthpiece Meditations, Part 3. Online Trombone Journal, www.trombone.org. 5 pages, Jul. 23, 2018. |
Schovanec, Ocular Dynamics and Skeletal Systems, IEEE Control Systems Magazine. Aug. 2001;21(4):70-79. |
Scoblete, The Future of the Electronic Shutter. pdn, Photo District News, retrieved online at: https://www.pdnonline.com/gear/cameras/the-future-of-the-electronic-shutter/, 6 pages, May 9, 2016. |
Second Japanese Office Action for Application 2015-545911 dated Feb. 26, 2018 (with English translation). |
Singapore Search Report and Written Report for Application No. 11201704097X, dated Mar. 13, 2018, 5 pages. |
SRI International, “Seeing the Future of Iris Recognition”, available at www.sri.com/iom, Mar. 2014, 9 pages. |
Swiniarski, Experiments on Human Recognition Using Error Backpropagation Artificial Neural Network. Neural Networks Class (CS553) of San Diego State University Computer Science Department, Apr. 2004. |
Tan et al., Efficient Iris Recognition by Characterizing Key Local Variations. IEEE Transactions on Image Processing. Jun. 2004;13(6):739-750. |
U.S. Appl. No. 14/100,615, “Iris Biometric Matching System”, filed Dec. 9, 2013, 57 pages. |
U.S. Appl. No. 14/100,615, “Iris Biometric Matching System,” filed Dec. 9, 2013, 61 pages. |
U.S. Appl. No. 61/888,130, filed Oct. 8, 2013, 20 pages. |
Van der Wal, et al., “The Acadia Vision Processor,” IEEE International Workshop on Computer Architecture for Machine Perception, pp. 31-40, Padova, Italy, Sep. 11-13, 2000. |
Weisstein, E. et al., "Circle," From MathWorld, A Wolfram Web Resource, www.mathworld.wolfram.com/circle.html, pp. 1-8, Jul. 3, 2008. |
Wildes, R., “Iris Recognition: An Emerging Biometric Technology,” Proc. IEEE, 85(9):1348-1363, Sep. 1997. |
Written Opinion for Application No. PCT/US2015/051863, dated Dec. 10, 2015, 5 pages. |
Written Opinion for Application No. PCT/US2017/065793, dated Feb. 16, 2018, 10 pages. |
Written Opinion for PCT/US2015/061024, dated Mar. 21, 2016. |
Written Opinion of the International Searching Authority dated Jun. 28, 2018, issued in connection with International Application No. PCT/US2018/025895 (10 pages). |
www.m-w.com—definition—“ellipse” (Refer to Ellipse Illustration; also attached) pp. 1 of 2. |
Yokoya, Ryunosuke, and Shree K. Nayar. “Extended depth of field catadioptric imaging using focal sweep.” Proceedings of the IEEE International Conference on Computer Vision. 2015. |
Zhu, et al., “Biometric Personal Identification Based on Iris Patterns,” Pattern Recognition, Proceedings 15th International Conference on vol. 2 IEEE (2000) (4 pages). |
Number | Date | Country | |
---|---|---|---|
20190034606 A1 | Jan 2019 | US |
Number | Date | Country | |
---|---|---|---|
62537253 | Jul 2017 | US |