The present disclosure relates to a method of a biometric recognition system of performing real-eye detection, and a biometric recognition system performing the method.
When capturing images of an eye of a user for performing iris recognition, for instance using a camera of a smartphone for subsequently unlocking the smartphone of the user, subtle visual structures and features of the user's iris are identified in the captured image and compared to corresponding features of a previously enrolled iris image in order to find a match. These structures are a strong carrier of eye identity and, by association, subject identity.
Both during authentication and enrolment of the user, accurate detection of these features is pivotal for performing reliable iris recognition. However, iris recognition is susceptible to spoofing where an attacker e.g. may present a credible and detailed iris printout to an iris recognition system for attaining false authentication.
One objective is to solve, or at least mitigate, this problem in the art and thus to provide an improved method of a biometric recognition system of performing real-eye detection.
This objective is attained in a first aspect by a method of a biometric recognition system of performing real-eye detection for an individual. The method comprises capturing a plurality of images comprising a representation of an eye of the individual, the images being captured utilizing polarized light reflected at the eye and received at a polarization-sensitive camera capturing said images, wherein for each image being captured a different polarization rotation is selected, detecting from the representation in each captured image birefringent features of a cornea of the individual, determining, by matching the detected birefringent cornea features of the captured images with expected birefringent cornea features, whether the detected birefringent cornea features are correctly rendered in at least one of the captured images, and if so determining that the eye is a real eye.
This objective is attained in a second aspect by a biometric recognition system configured to perform real-eye detection. The system comprises a polarization-sensitive camera configured to capture a plurality of images comprising a representation of an eye of an individual, the images being captured utilizing polarized light reflected at the eye and received at the polarization-sensitive camera capturing said images, wherein for each image being captured a different polarization rotation is selected. The system further comprises a processing unit configured to detect, from the representation in each captured image, birefringent features of a cornea of the individual, to determine, by matching the detected birefringent cornea features of the captured images with expected birefringent cornea features, whether the detected birefringent cornea features are correctly rendered in at least one of the captured images, and if so to determine that the eye is a real eye.
Thus, by subjecting the eye of an individual to polarized light and capturing an image of the eye with a polarization-sensitive camera, so-called birefringent features of the cornea covering the iris will be present in the image. A spoof eye provided by an attacker, such as a paper printout, will not exhibit the birefringent cornea features and may thus be detected as a spoof.
If the detected birefringent cornea features have the appearance that would be expected upon a particular polarization configuration being applied, they are considered correctly rendered and the eye will be determined to be a real, authentic eye. To this effect, the detected birefringent cornea features may be matched against birefringent cornea features of a reference image and if there is a match, the detected birefringent cornea features are considered correctly rendered.
Now, if the user tilts her head (and thus her eyes), the birefringent cornea features typically become blurred and/or deformed in the captured image due to various polarization properties, making it difficult to determine whether or not the detected blurred/deformed birefringent cornea features are correctly rendered in a captured image, or even that the birefringent cornea features are present at all in a captured image for subsequent detection.
To resolve this issue, a plurality of images are captured, each with a different rotation of the polarization orientation, where only a selected polarization rotation which aligns with (or at least comes close to aligning with) the eye orientation of the user will result in an image free from blur/deformations.
The detected birefringent cornea features of that image (or potentially all captured images) are then compared with expected birefringent cornea features e.g. from a reference image for a match and thus it is determined whether the birefringent features are correctly rendered in the captured image and if so, it is determined that the iris originates from a real eye.
Advantageously, this makes the system invariant to user head-tilts; even if the user tilts her head, images are captured where the rotation of the orientation of the polarization is different for each captured image, and thus it is possible to capture an image where the rotation of the polarization aligns with the head-tilt.
Thus, this will make the detection of birefringent cornea features in a captured image far more effective, and the number of false rejections may be decreased since the sensitivity of the system is improved.
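The capture sweep described above can be sketched as follows. This is a minimal illustration only; the names `set_rotation` and `capture` are hypothetical placeholders for the filter-control and camera interfaces and are not part of the disclosure:

```python
def polarization_sweep(capture, set_rotation, rotations_deg=(-20, -10, 0, 10, 20)):
    """Capture one image per selected polarization rotation.

    Both polarizing filters are assumed to be rotated together by
    set_rotation(), so their relative orientation (e.g. cross-polarized)
    is maintained while the absolute rotation varies per image.
    """
    images = []
    for rot in rotations_deg:
        set_rotation(rot)              # rotate both filters by rot degrees
        images.append((rot, capture()))  # record rotation alongside the image
    return images
```

One of the captured images will then have a polarization rotation aligned (or nearly aligned) with the user's head-tilt.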
In an embodiment, the detected birefringent cornea features of the captured images are aligned with birefringent cornea features of an expected eye representation.
In an embodiment, the polarization of the light and the polarization at the camera is selected to be cross-polarized.
In an embodiment, information indicating an estimated head-tilt of the individual is acquired, wherein the capturing of images is performed with a polarization rotation selected to be in a range adapted to match the estimated head-tilt, and/or the aligning of the detected birefringent cornea features with birefringent cornea features of an expected eye representation is performed by selecting images captured with a polarization rotation being in a range of the estimated head-tilt as candidates for the aligning.
In an embodiment, the aligning is performed by rotating each captured representation of the eye such that said each captured eye representation has a same orientation as a non-tilted expected eye representation or by rotating the non-tilted expected eye representation to have the same eye orientation as said each captured eye representation, the rotation performed corresponding to the selected polarization rotation.
In an embodiment, if a set number of failed attempts have been made to determine that the birefringent features are correctly rendered in the at least one of the captured images, the individual is required to prove knowledge of secret credentials before further attempts are allowed.
In an embodiment, the detected birefringent cornea features determined to be correctly rendered are compared with previously enrolled birefringent cornea features, and if there is a match an individual associated with the birefringent cornea features determined to be correctly rendered is authenticated.
In an embodiment, iris, face or periocular features are detected from the image in which the birefringent cornea features are determined to be correctly rendered and compared with previously enrolled iris, face or periocular features, and if there is a match an individual associated with the detected iris, face or periocular features is authenticated.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to “a/an/the element, apparatus, component, means, step, etc.” are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
Aspects and embodiments are now described, by way of example, with reference to the accompanying drawings, in which:
The aspects of the present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown.
These aspects may, however, be embodied in many different forms and should not be construed as limiting; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and to fully convey the scope of all aspects of the invention to those skilled in the art. Like numbers refer to like elements throughout the description.
After having captured the image(s), the user's iris is identified in the image(s) and unique features of the iris are extracted from the image and compared to features of an iris image previously captured during enrolment of the user 100. If the iris features of the currently captured image—at least to a sufficiently high degree—correspond to those of the previously enrolled image, there is a match and the user 100 is authenticated. The smart phone 101 is hence unlocked.
It should be noted that this is exemplifying only, and that authentication may be utilized for numerous purposes, such as e.g. unlocking a vehicle to be entered by a user, allowing a user to enter a building, or performing a purchase at a point-of-sale terminal, etc., using appropriately adapted iris recognition systems.
The camera 103 will capture an image of the user's eye 102, resulting in a representation of the eye being created by the image sensor 104, whereupon the processing unit 105 determines whether the iris data extracted from the image sensor data corresponds to the iris of an authorised user by comparing the iris image to one or more previously enrolled iris templates pre-stored in the memory 106.
With reference again to
Now, with reference to
In
Thus, the camera 103 must be polarization-sensitive in order to be able to perceive the polarized light 120 being reflected against the eye 102 and impinging on the image sensor of the camera 103.
In practice, the image sensor 104 of
Now, a human cornea—i.e. the outer membrane in front of the iris—exhibits birefringent properties that are apparent in a captured image when the iris is illuminated with polarized light and the image is captured with a polarization-sensitive camera 103.
Thus, as shown in
Thus, an image captured by the polarization-sensitive camera 103 while subjecting the iris 121 to polarized light will comprise birefringent cornea features 122 and may thus be utilized for detecting whether the eye 102 is a real, authentic eye or not.
For instance, assume that an attacker subjects the iris recognition system 110 to a spoof attempt by presenting e.g. a printout of a user's iris. Such a printout will not comprise the birefringent cornea features 122 of the eye 102 of the user, even though it should be noted that the iris features of this printout may correspond perfectly to those of the user. Thus, if no birefringent features are detected in the captured image, the system 110 terminates the authentication process since the presented iris is not deemed to originate from a real eye.
Hence, the appearance of the birefringent features depends on the combination of polarization configuration selected for the first polarizing filter 109 and the second polarizing filter 111. As is understood, one or both of the polarization filters 109, 111 may be configured to be electrically controllable to change polarization. In such a case, the processing unit 105 may control the filters 109, 111 to change polarization configuration as desired.
As mentioned, the system 110 may conclude whether or not birefringent cornea features 122 are present in the captured image and if so determine that the eye is a real eye. However, if the user tilts her head (and thus her eyes), the birefringent cornea features typically become blurred and/or deformed in the captured image due to various polarization properties, making it difficult to determine whether or not the detected blurred/deformed birefringent cornea features are correctly rendered in a captured image, or even to detect that the birefringent cornea features are present at all in a captured image.
As is understood, a head tilt may occur in any direction, i.e. pitch (back and forth), roll (from side to side) or yaw (rotation around the neck), all of which may cause blurring or deformation of birefringent cornea features. Embodiments disclosed herein mainly pertain to adapting to user head tilt in the form of head rolls.
Now, with reference to
Typically, with reference to
Reference will further be made to
As is illustrated, five images are captured, each with a different selected rotation of the polarization of the first polarization filter 109 and the second polarization filter 111. Thus, a different polarization rotation is utilized for each captured image while maintaining a same relative orientation between the polarization of the first filter 109 and the polarization of the second filter 111.
In this example, the rotation of the polarization ranges from −20° to +20° with a resolution of 10°, wherein if the polarization of the first polarization filter 109 is rotated e.g. +10°, then the polarization of the second polarization filter 111 is also rotated +10°, if the polarization of the first polarization filter 109 is rotated e.g. +20°, then the polarization of the second polarization filter 111 is also rotated +20°, and so on.
In the embodiment illustrated in
Now, for each image being captured with a selected polarization rotation deviating from the actual head tilt—and thus eye orientation—of the user 100, a deformation will occur in the captured image as previously discussed with reference to
As illustrated in
Thus, if an image is captured using a same rotation of the polarization by the first polarization filter 109 and the polarization of the second polarization filter 111 as the head-tilt of the user 100, i.e. the selected rotation of the two polarizations being aligned with the eye orientation of the user 100 as illustrated with the far-right 20° rotation of the polarization of both the first filter 109 and the second filter 111, the birefringent cornea features 122 will not be deformed.
The rationale is that if both the polarizing filters 109, 111 and the eye 102 are tilted by the same angle, the relative rotation between the camera 103 and the eye 102 is zero resulting in an image with non-deformed birefringent cornea features.
In this embodiment, a cross-polarization configuration is selected for the first filter 109 and the second filter 111, i.e. one of the two filters is vertically polarized while the other is horizontally polarized. An advantage of using cross-polarization is that capturing of direct reflections that cause saturated spots in the image may be avoided. The reflections may occur on the eye itself or on glasses worn by the user.
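The reflection-suppressing effect of cross-polarization can be illustrated with Malus's law, which gives the fraction of polarized light an analyser transmits as a function of the relative angle between polarizer and analyser. This is a general optics sketch, not a formula taken from the disclosure:

```python
import math

def transmitted_fraction(relative_angle_deg):
    """Malus's law: I/I0 = cos^2(theta) for polarized light passing an
    analyser at relative angle theta. At 90 degrees (cross-polarization),
    direct specular reflections, which largely preserve the illumination
    polarization, are suppressed, avoiding saturated spots in the image."""
    return math.cos(math.radians(relative_angle_deg)) ** 2
```

Light that has been depolarized by scattering in the eye, in contrast, partially passes the crossed analyser and remains visible.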
However, any appropriate polarization configuration may be selected for the first polarization filter 109 and the second polarization filter 111. The selected polarization configuration must be taken into account upon subsequently determining whether or not detected birefringent cornea features 122 are correctly rendered in a captured image, since different combinations of polarizations will have different expected eye appearances as previously discussed with reference to
In step S101, for each image captured the polarization of the first and second filter 109, 111 is hence slightly rotated, for instance starting at −20° and ending at +20° in steps of 10° thereby capturing five images. As discussed, this may be effected by implementing the polarization filters 109, 111 as an electrically controllable filter where the polarization of both filters 109, 111 is rotated by 10° in a left-hand direction for each image being captured.
In other words, for each image being captured a different rotation is selected for the polarization orientation of the light 120 and the polarization orientation at the camera 103 while maintaining a same relative orientation between the polarization of the light 120 and the polarization at the camera 103.
For any captured image where the selected rotation of the polarization of the first filter 109 and the second filter 111 does not correspond to the head tilt of the user (or at least does not correspond to a sufficient degree), any birefringent cornea features in the image will be blurred and deformed, in line with what has previously been discussed.
Nevertheless, for each image, birefringent cornea features are detected by the processing unit 105 in step S102 (or at least an attempt is made to detect the birefringent cornea features).
As previously illustrated in
Thereafter, the detected birefringent cornea features 122 of each captured image are aligned with birefringent cornea features of an expected eye representation in step S103.
In practice, taking the fifth captured image with a polarization rotation of +20° as example, this may be performed by rotating the captured representation of the iris 20° to the right or by rotating a non-tilted, expected iris representation 20° to the left such that the captured representation and the expected representation have the same orientation (i.e. either zero tilt or a 20° left-tilt), thereby causing the captured representation and the expected representation to orientationally match.
As is understood, in a scenario where the camera 103 is equipped with multiple rotated sensors, or where the sensor 104 is arranged to be mechanically rotatable along with the rotation of the polarization orientation of the filters 109, 111, the aligning of step S103 is not required and may thus be omitted. Nevertheless, in embodiments described hereinbelow, it is assumed that a single fixed sensor is used and alignment according to step S103 thus is required.
Thus, either the captured eye representation or the expected eye representation is rotated to the same extent as the rotation of the polarization used for the captured image in order to have the detected cornea features 122 align with those of the expected eye representation.
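The rotation performed in step S103 can be sketched as a plane rotation of detected feature coordinates about the eye centre. This is an illustrative sketch under the assumption that features are represented as (x, y) points; the actual feature representation is not specified here:

```python
import math

def rotate_features(features, angle_deg):
    """Rotate (x, y) feature coordinates about the origin by angle_deg,
    so that e.g. a representation captured with a +20 degree polarization
    rotation can be brought into the orientation of a non-tilted expected
    eye representation (or vice versa)."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(x * c - y * s, x * s + y * c) for x, y in features]
```

The angle applied corresponds to the polarization rotation selected for the image in question.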
As is understood, if the aligning process would start with the first captured image having a polarization rotation of −20°, a match would not be found for that image since the selected polarization rotation would not match the actual head tilt of the user thereby causing deformed cornea features, whereupon the process would proceed by aligning the second captured image with the expected representation, and so on, until a match indeed is found for the fifth image. Should the aligning process start with the fifth image and thus find a match after the 20° alignment has been performed, the process will not perform any further alignment attempts with the remaining four images.
Alternatively, an indication of the user head-tilt may be acquired and used as a starting point in order to avoid having to evaluate each captured image during alignment. For example, the head-tilt may be estimated by means of performing face analysis on a captured image. Based on the estimated head-tilt, one or more captured images are selected as candidates for the alignment. For instance, if the head-tilt is estimated to be around 10-20°, then the processing unit 105 may conclude that only images having been captured with a polarization rotation within the range of the estimated head-tilt will be considered for alignment. For instance, with respect to
Further, if an indication of the user head-tilt is acquired from an initially captured image (e.g. with a zero polarization rotation selected), such as e.g. head-tilt estimated to be around 10-20°, then the processing unit 105 may conclude that a number of images are captured with a polarization rotation selected to be in that range. As is understood, if the head tilt is estimated to be around +20°, it is not useful to capture an image having a polarization rotation of, say, −20°.
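The candidate selection based on an estimated head-tilt can be sketched as a simple filtering of the available polarization rotations. The window size is an illustrative assumption, not a value given in the disclosure:

```python
def candidate_rotations(estimated_tilt_deg, available=(-20, -10, 0, 10, 20),
                        window_deg=10):
    """Keep only polarization rotations within window_deg of the estimated
    head-tilt, so that e.g. a tilt estimate of around 10-20 degrees
    restricts the capture (or the alignment) to the +10 and +20 degree
    rotations, avoiding needless evaluation of the remaining images."""
    return [r for r in available if abs(r - estimated_tilt_deg) <= window_deg]
```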
In other words, a best match may be evaluated for one or more of the captured images by aligning the captured eye representation with the expected eye representation by tilting one or the other representation with the angle corresponding to the selected polarization rotation (−20°, −10°, 0°, +10° and +20°, respectively) such that the captured eye representation is aligned with the expected eye representation for each image.
Thereafter, in step S104, the processing unit 105 detects—by matching the detected birefringent cornea features 122 with the expected birefringent cornea features (e.g. taken from a reference image)—whether the birefringent features 122 are correctly rendered in at least one of the five captured images, i.e. that the detected birefringent cornea features 122 of said at least one image match the expected appearance of the birefringent cornea features (cf.
If so, the processing unit 105 determines in step S105 that the eye in the captured image is a real eye. If not, the processing unit 105 may advantageously determine that a spoof attempt has been detected and the authentication process is terminated.
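The decision of steps S104 and S105 can be sketched as follows, under the assumption that the matching of each (aligned) captured image against the expected birefringent cornea features yields a scalar similarity score; the threshold is an illustrative placeholder:

```python
def is_real_eye(match_scores, threshold=0.9):
    """Steps S104/S105 in sketch form: the eye is determined to be real if
    the detected birefringent cornea features are correctly rendered, i.e.
    match the expected features, in at least one of the captured images."""
    return any(score >= threshold for score in match_scores)
```

If no image yields a sufficient score, a potential spoof attempt is flagged and the authentication process is terminated.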
As will be concluded in step S104, only a captured image which indeed has the same (or at least similar) selected polarization rotation as the head tilt of the user—in this example the image having the selected +20° polarization rotation—will present birefringent cornea features free from deformation, which thus can be aligned with and likely be regarded to correspond to the expected birefringent cornea features.
Hence, as illustrated in
Advantageously, this makes the system invariant to user head-tilts; even if the user would tilt her head as illustrated in
If none of the selected polarization rotations sufficiently corresponds to the actual head tilt of the user, resulting in a non-successful match in step S104 after alignment (due to blurred/deformed cornea features in the captured images), the processing unit 105 may determine that a potential spoof attempt has been detected and the authentication process is terminated.
As is understood, the system 110 may accept some degree of misalignment between a captured eye representation and an expected eye representation in terms of polarization rotation. In practice, if the polarization rotation is selected to have a resolution of 10°, any +/−5° misalignment of the polarization rotation with the actual head tilt may still result in an acceptable deformation of the birefringent cornea features. Hence, in this example, any image being captured with a polarization rotation between +15° and +25° may in practice be accepted by the system in that the detected cornea features would be determined to be correctly rendered after having been aligned to take into account the eye orientation of the user.
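The misalignment tolerance can be sketched as a simple angular check; the ±5° value follows from the 10° sweep resolution discussed above:

```python
def rotation_accepted(head_tilt_deg, polarization_rotation_deg, tolerance_deg=5.0):
    """A residual misalignment of up to tolerance_deg between the actual
    head-tilt and the selected polarization rotation still yields an
    acceptably small deformation of the birefringent cornea features."""
    return abs(head_tilt_deg - polarization_rotation_deg) <= tolerance_deg
```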
In a further embodiment, it is envisaged that if a number of failed detection attempts, such as two, have been made in step S104 for a set of captured images, the iris recognition system 110 enters a breach mode, where the user is required to prove knowledge of secret credentials, for instance enter a PIN code, before any further attempts can be made.
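The breach mode can be sketched as a small state holder; the class name and the reset-on-unlock behaviour are illustrative assumptions:

```python
class BreachGuard:
    """Track failed real-eye determinations; after max_failures failures the
    system locks and requires secret credentials (e.g. a PIN) to proceed."""

    def __init__(self, max_failures=2):
        self.max_failures = max_failures
        self.failures = 0

    def record_failure(self):
        self.failures += 1

    @property
    def locked(self):
        return self.failures >= self.max_failures

    def unlock(self, credentials_ok):
        # Assumed behaviour: a successful credential check resets the counter.
        if credentials_ok:
            self.failures = 0
```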
As previously discussed with reference to
A user will typically exhibit characteristic birefringent features for each given configuration, from which characteristic birefringent features the user may be recognized, in addition to detecting whether or not the eye is authentic as has been described in the previous embodiments.
In the embodiment of
Thus, after the processing unit 105 has determined in step S105 from one of the captured images that the eye indeed is a real eye, the birefringent cornea features detected in step S104 to be correctly rendered are compared in step S106 to previously enrolled birefringent cornea features of templates stored in the memory 106 of the iris recognition system 110, and if there is a match between the detected birefringent features and the previously enrolled birefringent features, the user is authenticated in step S107.
In the first scenario (a), the birefringent cornea features detected in step S104 to be correctly rendered in one of the captured images are compared in step S106 to the previously enrolled birefringent cornea features of the templates stored in the memory 106, and since in this scenario there is a match between the detected birefringent features and the previously enrolled birefringent features, the user is authenticated in step S107. In other words, the identity of the user 100 associated with the detected birefringent features determined to be correctly rendered in step S104 must indeed correspond to identity A associated with the birefringent feature template pre-stored in the memory 106.
In the second scenario (b), the birefringent cornea features detected in step S104 to be correctly rendered are compared in step S106 to the previously enrolled birefringent cornea features of the templates stored in the memory 106. However, since the detected birefringent features do not match the birefringent feature template in step S106, authentication is not successful. Thus, the detected birefringent features determined to be correctly rendered in step S104 cannot correspond to enrolled identity A but rather to a different identity, in this example denoted identity B. As a result, the user is rejected.
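Scenarios (a) and (b) of steps S106-S107 can be sketched as a template lookup. The feature representation, similarity measure and threshold are illustrative assumptions:

```python
def authenticate(detected_features, enrolled_templates, similarity, threshold=0.9):
    """Compare correctly rendered birefringent cornea features against
    enrolled templates; return the matching identity (scenario a) or None
    when no template matches and the user is rejected (scenario b)."""
    for identity, template in enrolled_templates.items():
        if similarity(detected_features, template) >= threshold:
            return identity
    return None
```

For example, with features modelled as sets and a Jaccard similarity, an exact match yields the enrolled identity while disjoint features yield rejection.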
In the third scenario (c), an attempt is made in step S104 to detect birefringent cornea features from the images captured in step S101, but since in this scenario no birefringent features can be detected in any of the images, the system 110 concludes that a spoof attempt has occurred where an attacker presents e.g. a printout of a user's iris. It should be noted that the iris features of this printout may nevertheless correspond perfectly to those of the user. As a result, the authentication process is terminated.
In a further embodiment, in addition to (or alternatively to) performing authentication based on detected birefringent cornea features, further detected biometric features of the captured images may also be considered.
It is noted that birefringent features of the cornea typically are less expressive than face features, and even more so when compared to iris features. Thus, in a scenario where high security and reliability are required in the authentication process, the birefringent cornea feature detection described hereinabove is expanded upon such that iris feature detection and/or face feature detection, and subsequent iris/face feature authentication, is further undertaken.
Further envisaged biometric features to be utilized include those in the so-called periocular region, which is the area around the eye including features like eyelashes, eyebrows, eyelids, eye shape, tear duct, skin texture, etc.
Thereafter, the processing unit 105 compares the detected iris features to previously enrolled iris feature template(s) in step S106b.
If there is a match also for the compared iris features, the user 100 is authenticated in step S107. If not, authentication fails.
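The combined decision, where the birefringent cornea check gates liveness and the iris comparison then decides authentication, can be sketched as follows; the three boolean inputs and the string outcomes are illustrative:

```python
def multi_factor_decision(cornea_real, cornea_match, iris_match):
    """Liveness first: without correctly rendered birefringent cornea
    features the attempt is a spoof (step S104); otherwise authentication
    additionally requires the iris features to match (steps S106b/S107)."""
    if not cornea_real:
        return "spoof"      # authentication terminated
    if cornea_match and iris_match:
        return "authenticated"
    return "rejected"
```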
Advantageously, not only is the level of security and reliability raised in the authentication process, but liveness detection is further provided by means of the birefringent cornea feature detection. In other words, if the presented iris is a spoof (such as a printout of an iris image), no birefringent cornea features will be detected and the authentication will be terminated in the match operation undertaken in step S104.
As is understood, if for some reason the iris features are difficult to detect in the captured image(s) being subjected to polarized light, appropriate image processing may be applied, such as filtering, before the iris detection. As a further alternative, an image not being subjected to polarization is captured, from which the iris features are detected.
While
The aspects of the present disclosure have mainly been described above with reference to a few embodiments and examples thereof. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.
Thus, while various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Number | Date | Country | Kind |
---|---|---|---|
2250613-3 | May 2022 | SE | national |
This application is a bypass continuation of International Application No. PCT/SE2023/050503, filed May 22, 2023, which claims priority to Swedish Patent Application No. 2250613-3, filed May 23, 2022. The disclosures of each of the above applications are incorporated herein by reference in their entirety.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/SE2023/050503 | May 2023 | WO |
Child | 18956117 | US |