The present disclosure relates to processing of an iris image.
As one kind of biometric authentication, iris authentication has been known. Generally, in iris authentication, an iris pattern of a user is registered in a database. At the time of authentication, an iris pattern acquired from a user is collated with the iris pattern registered in the database, and the authentication is thereby performed.
Since the fine pattern of an iris is used for registration and authentication, iris information has to be extracted from an image in which the iris is in focus. In order to avoid extracting the iris information from an image in which the iris is out of focus, it is desirable to determine whether or not the photographed image is focused on the iris before the registration or authentication process is performed. Further, in view of the calculation amount, it is desirable to perform the focus determination in advance because the extraction process of the iris information can then be omitted for an image in which the iris is out of focus. Patent Document 1 discloses a procedure to determine whether or not an iris image is in focus by using edge strength of an iris region.
In the procedure of Patent Document 1, a focus determination is performed by using an edge of an iris region of a photographed image. However, unless the photographed image is in focus to some extent, the edge of the iris region may not be correctly detected and the focus determination may not be performed.
One object of the present disclosure is to provide a focus determination device capable of precisely performing a focus determination for plural iris images.
According to one aspect of the present disclosure, there is provided a focus determination device comprising:
According to another aspect of the present disclosure, there is provided a focus determination method comprising:
According to still another aspect of the present disclosure, there is provided a recording medium recording a program causing a computer to execute the processes of:
Preferable example embodiments of the present disclosure will hereinafter be described with reference to drawings.
[Imaging Device]
When the iris image is taken, the user X stands in front of the camera 2. In a state where the face of the user X is illuminated by the light 3, the camera 2 images the region around the face of the user. Note that an imaging range of the camera 2 may be the whole face of the user X or may be only a part around an eye of the user X. Among the images taken by the camera 2, an image of an eye region of the user X is used as the iris image. Therefore, in a case where the imaging range of the camera 2 is wider than the eye region of the user X, the eye region is cut out from the taken image and is used as the iris image.
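By way of illustration only, cutting the eye region out of a wider face image could be done as in the following Python sketch, assuming OpenCV's Haar eye cascade as a stand-in eye detector; the cascade file, the margin, and the choice of the largest detection are assumptions of this sketch and are not part of the present disclosure.

```python
import cv2

# Hypothetical sketch: cut an eye region out of a wider face image so it can
# serve as the iris image. The Haar cascade is only a stand-in eye detector.
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def crop_eye_region(face_image, margin=0.2):
    gray = cv2.cvtColor(face_image, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return None  # no eye found; the frame cannot be used as an iris image
    # Take the largest detection and add a small margin around it.
    x, y, w, h = max(eyes, key=lambda e: e[2] * e[3])
    dx, dy = int(w * margin), int(h * margin)
    h_img, w_img = gray.shape
    x0, y0 = max(x - dx, 0), max(y - dy, 0)
    x1, y1 = min(x + w + dx, w_img), min(y + h + dy, h_img)
    return face_image[y0:y1, x0:x1]
```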
[Focus Determination Device]
(Hardware Configuration)
The IF 11 performs inputs and outputs with an external device. Specifically, the iris image of the user X is inputted to the focus determination device 100 via the IF 11. Further, a determination result by the focus determination device 100 is outputted to an external device through the IF 11.
The processor 12 is a computer such as a CPU (Central Processing Unit), and executes a program prepared in advance to thereby control the whole focus determination device 100. Note that as the processor 12, one of a CPU, a GPU (Graphics Processing Unit), an FPGA (Field-Programmable Gate Array), a DSP (Digital Signal Processor), and an ASIC (Application Specific Integrated Circuit) may be used, or plural processors may be used in parallel. Specifically, the processor 12 executes a focus determination process described later.
The memory 13 is configured with a ROM (Read-Only Memory), a RAM (Random Access Memory), and so forth. The memory 13 is also used as a working memory during execution of various processes by the processor 12.
The recording medium 14 is a non-volatile and non-transitory recording medium such as a disk-type recording medium or a semiconductor memory and is configured to be detachable from the focus determination device 100. The recording medium 14 records various programs to be executed by the processor 12. When the focus determination device 100 executes the focus determination process, a program recorded in the recording medium 14 is loaded to the memory 13 and is executed by the processor 12.
The database 15 temporarily stores the iris image inputted through the IF 11, a focus determination result by the focus determination device 100, and so forth. Note that in a case where the memory 13 has a sufficient storage capacity for those sets of data, the database 15 may be omitted. Further, the focus determination device 100 may include input units such as a keyboard and a mouse and a display unit such as a liquid crystal display, by which an administrator or the like performs instructions and inputs.
[Functional Configuration]
The iris image acquisition unit 101 acquires the iris image from the camera 2 or the like illustrated in the drawings and outputs it to the filter processing unit 102.
The filter processing unit 102 executes filter processing for the focus determination. Specifically, the filter processing unit 102 applies an image processing filter of a shape decided based on the shape of the light 3 illustrated as an example in the drawings to the iris image, and outputs the filter-processed image to the filter output integration unit 103.
In the following, the image processing filter will be described in detail. As one example, the shape of the image processing filter is determined in advance based on the shape of the light 3.
When the image processing filter as shown in the drawings is applied to the iris image, a large filter output is obtained at and around the reflection region of the light if the iris image is in focus, whereas the filter output becomes small if the iris image is out of focus.
Further, as with an image processing filter F2 illustrated in the drawings, an image processing filter of a different shape may be prepared in accordance with the shape of the light or the shape of its reflection region.
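By way of illustration only, the following Python sketch shows one way such a light-shaped filter could be built and applied. The elongated difference-of-Gaussians kernel, the kernel size, and the use of cv2.filter2D are assumptions made for this sketch; the disclosure only requires that the filter shape be decided based on the shape of the light 3.

```python
import numpy as np
import cv2

def make_light_shaped_kernel(size=15, aspect_ratio=2.0):
    """Illustrative band-pass kernel elongated to the light's aspect ratio.

    The exact coefficients are an assumption of this sketch; only the idea
    that the kernel shape follows the light shape comes from the disclosure.
    """
    ys, xs = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
    sigma = size / 6.0
    # Stretch the horizontal axis so the kernel matches a horizontally elongated light.
    g_small = np.exp(-((xs / aspect_ratio) ** 2 + ys ** 2) / (2 * sigma ** 2))
    g_large = np.exp(-((xs / aspect_ratio) ** 2 + ys ** 2) / (2 * (2 * sigma) ** 2))
    # Difference of Gaussians: a simple band-pass filter.
    kernel = g_small / g_small.sum() - g_large / g_large.sum()
    return kernel.astype(np.float32)

def apply_filter(iris_image, kernel):
    """Apply the light-shaped kernel to the (grayscale) iris image."""
    gray = iris_image if iris_image.ndim == 2 else cv2.cvtColor(iris_image, cv2.COLOR_BGR2GRAY)
    return cv2.filter2D(gray.astype(np.float32), -1, kernel)
```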
Returning to the functional configuration, the filter output integration unit 103 performs an integration process using the pixel values of the filter-processed image inputted from the filter processing unit 102 and outputs the obtained filter output integration value P to the focus determination unit 104. Note that the filter output integration unit 103 may divide the filter-processed image into local regions and output a filter output integration value for each of the local regions.
The focus determination unit 104 compares the filter output integration value P with a threshold value Q decided in advance and thereby performs the focus determination of the iris image. Specifically, in a case where the filter output integration value P is equal to or larger than the threshold value Q, the focus determination unit 104 determines that the iris image is in focus. On the other hand, in a case where the filter output integration value P is smaller than the threshold value Q, the focus determination unit 104 determines that the iris image is out of focus. Note that in a case where the filter output integration unit 103 outputs the filter output integration value of each of the local regions, the focus determination unit 104 may determine that the iris image is in focus when the number of local regions having the filter output integration values equal to or larger than the threshold value Q is equal to or larger than a predetermined number, and may determine that the iris image is out of focus when the number of local regions having the filter output integration values equal to or larger than the threshold value Q is smaller than the predetermined number.
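For reference, a minimal Python sketch of the integration and threshold comparison described above is shown below; the use of the absolute value of the filter output and the block-wise splitting into local regions are assumptions of this sketch, since the disclosure does not fix a particular integration method.

```python
import numpy as np

def focus_decision(filtered, Q, block=None, min_blocks=1):
    """Integrate the filter output and compare it with the threshold Q.

    If `block` is given, the image is split into local regions and the image
    is judged to be in focus when at least `min_blocks` regions reach Q.
    """
    response = np.abs(filtered)  # assumption: magnitude of the filter output
    if block is None:
        P = response.sum()       # filter output integration value P
        return P >= Q
    h, w = response.shape
    hits = 0
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            if response[y:y + block, x:x + block].sum() >= Q:
                hits += 1
    return hits >= min_blocks
```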
In the above configuration, the iris image acquisition unit 101 is one example of an acquisition unit, the filter processing unit 102 is one example of a filter processing unit, and the filter output integration unit 103 and the focus determination unit 104 are one example of a focus determination unit.
(Focus Determination Process)
First, the iris image acquisition unit 101 acquires the iris image inputted from the camera 2 or the like and outputs it to the filter processing unit 102 (step S11). Next, the filter processing unit 102 executes the filter processing by applying the image processing filter prepared in advance to the iris image and outputs the filter-processed image to the filter output integration unit 103 (step S12). The filter output integration unit 103 performs the integration process using the pixel values of the filter-processed image and outputs the filter output integration value P to the focus determination unit 104 (step S13). The focus determination unit 104 performs the focus determination by comparing the filter output integration value P with the predetermined threshold value Q (step S14), and outputs a determination result (step S15). Then, the focus determination process ends.
As described above, in the first example embodiment, the filter processing is performed by using the image processing filter in the shape decided based on the shape of the light which is actually used when the iris image is taken, and the focus determination is performed. Thus, by using the shape of the actually used light as a reference, it becomes possible to precisely perform the focus determination.
In the first example embodiment, the shape of the image processing filter is decided based on the shape of the light 3. Differently, in a second example embodiment, the shape of the image processing filter is decided in advance based on the shape of the reflection region of the light in the eye region of the iris image. Thus, since the image processing filter in the shape corresponding to the reflection region of the light can be obtained, the focus determination can be precisely performed. Note that except for this point, the second example embodiment is similar to the first example embodiment.
In a third example embodiment, the shape of the image processing filter is decided such that the aspect ratio (length-to-width ratio) between the horizontal direction and the vertical direction of the light 3 agrees with the ratio between the magnitude in a lateral direction and the magnitude in a longitudinal direction of spatial frequency components of the image processing filter. Thus, since the image processing filter in the shape having the same length-to-width ratio as the light can be obtained, the focus determination can be precisely performed. Note that except for this point, the third example embodiment is similar to the first example embodiment.
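As an illustration of the third example embodiment, the following Python sketch builds a filter whose horizontal and vertical spatial-frequency magnitudes follow the aspect ratio of the light; the Gabor-like cosine carrier, the base frequency, and the Gaussian envelope are assumptions of this sketch, not part of the disclosure.

```python
import numpy as np

def make_frequency_matched_kernel(light_w, light_h, size=15, base_freq=0.1):
    """Illustrative kernel whose lateral/longitudinal spatial-frequency
    magnitudes follow the light's aspect ratio (light_w : light_h).

    base_freq (cycles/pixel) and the Gaussian envelope are assumptions made
    for this sketch; only the frequency ratio follows the embodiment.
    """
    aspect = light_w / light_h
    fx = base_freq * aspect   # lateral spatial-frequency magnitude
    fy = base_freq            # longitudinal spatial-frequency magnitude
    ys, xs = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
    envelope = np.exp(-(xs ** 2 + ys ** 2) / (2 * (size / 6.0) ** 2))
    carrier = np.cos(2 * np.pi * (fx * xs + fy * ys))
    kernel = envelope * carrier
    return (kernel - kernel.mean()).astype(np.float32)  # zero-mean band-pass
```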
In a fourth example embodiment, the shape of the image processing filter is decided such that the aspect ratio (length-to-width ratio) between the horizontal direction and the vertical direction of the reflection region 53 of the light agrees with the ratio between the magnitude in the lateral direction and the magnitude in the longitudinal direction of the spatial frequency components of the image processing filter. In a case where the shape of the image processing filter is decided based on the shape of the reflection region of the light, the shape of the reflection region may be detected by using an image of the eye region experimentally taken while the light is irradiated, and the shape of the image processing filter may be decided based on the detected shape. Thus, since an image processing filter in a shape having the same length-to-width ratio as the reflection region of the light can be obtained, the focus determination can be precisely performed. Note that except for this point, the fourth example embodiment is similar to the first example embodiment.
In a fifth example embodiment, the shape of the image processing filter is decided such that the ratio between the magnitudes of a spatial frequency component in the horizontal direction and of a spatial frequency component in the vertical direction of the light 3 agrees with the ratio between a spatial frequency component in the lateral direction and a spatial frequency component in the longitudinal direction of the image processing filter. Thus, since the image processing filter in the shape having the same length-to-width ratio as the light can be obtained, the focus determination can be precisely performed. Note that except for this point, the fifth example embodiment is similar to the first example embodiment.
In a sixth example embodiment, the shape of the image processing filter is decided such that the ratio between the magnitudes of a spatial frequency component in the horizontal direction and of a spatial frequency component in the vertical direction of the reflection region 53 of the light 3 agrees with the ratio between the spatial frequency component in the lateral direction and the spatial frequency component in the longitudinal direction of the image processing filter. Thus, since the image processing filter in the shape having the same length-to-width ratio as the reflection region of the light can be obtained, the focus determination can be precisely performed. Note that except for this point, the sixth example embodiment is similar to the first example embodiment.
In the above fifth and sixth example embodiments, the shape of the image processing filter is decided such that the ratio between the magnitudes of the spatial frequency component in the horizontal direction and of the spatial frequency component in the vertical direction of the light 3 or of the reflection region 53 of the light agrees with the ratio between the spatial frequency component in the lateral direction and the spatial frequency component in the longitudinal direction of the image processing filter. However, the ratio between the spatial frequency components in the longitudinal direction and the lateral direction of the image processing filter does not have to strictly agree with the aspect ratio of the shape of the light 3 or the shape of the reflection region 53 of the light. In this regard, in a seventh example embodiment, the shape of the image processing filter is decided such that the size relation between the lengths of the shape of the light 3 in the horizontal direction and in the vertical direction agrees with the size relation between the magnitude in the lateral direction and the magnitude in the longitudinal direction of the spatial frequency components of the image processing filter. Thus, since an image processing filter whose longitudinal/lateral size relation agrees with that of the light can be obtained, the focus determination can be performed by using a highly versatile image processing filter. Note that except for this point, the seventh example embodiment is similar to the first example embodiment.
In the seventh example embodiment, the shape of the image processing filter is decided such that the size relation of the lengths of the shape of the light 3 in the horizontal direction and in the vertical direction agrees with the size relation of the magnitude in the lateral direction and of the magnitude in the longitudinal direction of the spatial frequency components of the image processing filter. Instead, in an eighth example embodiment, the shape of the image processing filter may be decided such that the size relation of the lengths of the shape of the reflection region 53 of the light in the horizontal direction and in the vertical direction agrees with the size relation of the magnitude in the lateral direction and of the magnitude in the longitudinal direction of the spatial frequency components of the image processing filter. Thus, since the image processing filter in the shape whose longitudinal/lateral size relation agrees with that of the reflection region of the light can be obtained, the focus determination can be performed by using a highly versatile image processing filter. Note that except for this point, the eighth example embodiment is similar to the first example embodiment.
In the first to eighth example embodiments, the filter processing is performed by using the image processing filter which is prepared in advance based on the shape of the light or the shape of the reflection region of the light. That is, in the first to eighth example embodiments, a versatile image processing filter which is prepared in advance is applied to plural iris images inputted. Differently, in a ninth example embodiment, a reflection region of a light in an inputted iris image is detected to estimate the shape of the light, and the shape of the filter is decided based on the estimated shape.
(Hardware Configuration)
A hardware configuration of the focus determination device of the ninth example embodiment is similar to that of the focus determination device 100 of the first example embodiment illustrated in the drawings.
(Functional Configuration)
The light reflection detection unit 105 detects a reflection region of the light, as illustrated by the reflection region 53 in the drawings, from the iris image inputted from the iris image acquisition unit 101, and outputs the lengths in the longitudinal direction and the lateral direction of the detected reflection region to the light shape estimation unit 106.
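By way of illustration, the reflection region could be detected as the brightest connected region of the eye image, as in the following Python sketch; the brightness threshold and the use of the largest connected component are assumptions, since the disclosure does not fix a particular detection method.

```python
import cv2
import numpy as np

def detect_light_reflection(iris_gray, brightness_thresh=240):
    """Illustrative detection of the light's reflection (specular highlight)
    in an 8-bit grayscale iris image. Returns (lateral, longitudinal) lengths.
    """
    _, mask = cv2.threshold(iris_gray, brightness_thresh, 255, cv2.THRESH_BINARY)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    if n <= 1:
        return None  # no reflection region found
    # stats row 0 is the background; pick the largest remaining component.
    idx = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    w = stats[idx, cv2.CC_STAT_WIDTH]
    h = stats[idx, cv2.CC_STAT_HEIGHT]
    return w, h  # lengths in the lateral and longitudinal directions
```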
The light shape estimation unit 106 estimates the shape of the light based on the sizes in the longitudinal direction and the lateral direction of the reflection region detected by the light reflection detection unit 105 and outputs a condition which defines the shape of the light to the filter generation unit 107. The condition which defines the shape of the light (hereinafter, also referred to as "light shape condition") may be the ratio between the longitudinal and lateral lengths of the light (i.e., the aspect ratio) or may be the size relation between the longitudinal and lateral lengths of the light. In addition to those, the light shape condition may specify the shape of the light itself, such as a circular shape or a square shape.
The filter generation unit 107 decides the shape of the image processing filter to be used in the filter processing unit 102 based on the light shape condition inputted from the light shape estimation unit 106. That is, the filter generation unit 107 decides the shape of the image processing filter for extracting the shape defined by the inputted light shape condition and thereby generates the image processing filter. Note that as described above, the shape of the filter includes the size of the filter and the coefficient value of each segment of the filter.
Here, the filter generation unit 107 decides the shape of the image processing filter by a method similar to any one of the first to eighth example embodiments. That is, the filter generation unit 107 may decide the shape of the image processing filter such that the aspect ratio between the horizontal direction and the vertical direction of the light defined by the light shape condition agrees with the ratio between the magnitude in the lateral direction and the magnitude in the longitudinal direction of the spatial frequency components of the image processing filter. Alternatively, the filter generation unit 107 may decide the shape of the image processing filter such that the ratio between the magnitudes of the spatial frequency component in the horizontal direction and of the spatial frequency component in the vertical direction of the light defined by the light shape condition agrees with the ratio between the spatial frequency component in the lateral direction and the spatial frequency component in the longitudinal direction of the image processing filter. Further, the filter generation unit 107 may decide the shape of the image processing filter such that the size relation of the lengths in the horizontal direction and in the vertical direction of the shape of the light defined by the light shape condition agrees with the size relation of the magnitude in the lateral direction and of the magnitude in the longitudinal direction of the spatial frequency components of the image processing filter. The filter generation unit 107 outputs the generated image processing filter to the filter processing unit 102.
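The following Python sketch illustrates one possible form of the filter generation unit 107, assuming the light shape condition is passed as a small dictionary and reusing the make_frequency_matched_kernel sketch shown above for the third example embodiment; the condition representation, the dictionary keys, and the fallback values are assumptions of this sketch.

```python
def generate_filter(light_shape_condition, size=15):
    """Illustrative filter generation from an estimated light shape condition.

    The condition is assumed here to be a dict such as
    {"aspect_ratio": 2.0} or {"wider_than_tall": True}; the actual
    representation is not fixed by the disclosure.
    """
    if "aspect_ratio" in light_shape_condition:
        a = light_shape_condition["aspect_ratio"]
        return make_frequency_matched_kernel(light_w=a, light_h=1.0, size=size)
    # Coarse fallback: only the longitudinal/lateral size relation is known.
    a = 2.0 if light_shape_condition.get("wider_than_tall", True) else 0.5
    return make_frequency_matched_kernel(light_w=a, light_h=1.0, size=size)
```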
The operations of the filter processing unit 102, the filter output integration unit 103, and the focus determination unit 104 are similar to those in the first example embodiment. Note that the filter processing unit 102 applies the image processing filter generated by the filter generation unit 107 to the iris image inputted from the iris image acquisition unit 101 and thereby performs the filter processing.
In such a manner, in the ninth example embodiment, the filter processing is performed by using not a versatile image processing filter used for plural iris images but the image processing filter which is individually generated based on the reflection region of the light in the iris image which is a target of the focus determination. Thus, it becomes possible to perform the focus determination of individual iris images more precisely.
In the above configuration, the light reflection detection unit 105 is one example of a detection unit, the light shape estimation unit 106 is one example of an estimation unit, and the filter generation unit 107 is one example of a filter generation unit.
(Focus Determination Process)
First, the iris image acquisition unit 101 acquires the iris image inputted from the camera 2 or the like and outputs it to the filter processing unit 102 and the light reflection detection unit 105 (step S21). The light reflection detection unit 105 detects the reflection region of the light on the eyeball surface from the inputted iris image and outputs the lengths in the longitudinal direction and the lateral direction of the detected reflection region of the light to the light shape estimation unit 106 (step S22). The light shape estimation unit 106 estimates the shape of the light based on the inputted lengths of the reflection region in the longitudinal direction and the lateral direction and outputs the light shape condition which defines the shape of the light to the filter generation unit 107 (step S23). The filter generation unit 107 decides the shape of the image processing filter based on the inputted light shape condition, generates the image processing filter, and outputs it to the filter processing unit 102 (step S24).
The filter processing unit 102 applies the image processing filter generated by the filter generation unit 107 to the iris image inputted from the iris image acquisition unit 101 and thereby performs the filter processing (step S25). The subsequent processes are basically similar to the first example embodiment. That is, the filter output integration unit 103 performs the integration process using the pixel values of the filter-processed image and outputs the filter output integration value P to the focus determination unit 104 (step S26). Next, the focus determination unit 104 performs the focus determination by comparing the filter output integration value P with the predetermined threshold value Q (step S27) and outputs a determination result (step S28). Then, the focus determination process ends.
As described above, in the ninth example embodiment, the image processing filter to be used for the focus determination of a certain iris image is generated based on the shape of the light which is detected from the iris image. Therefore, the focus determination can be performed highly precisely by using the image processing filter of the shape close to the shape of the reflection of the light included in each iris image.
A tenth example embodiment is related to an iris authentication device which includes the focus determination device of any one of the first to ninth example embodiments. In the iris authentication, it is important to use an iris image in which the iris is in focus in order to capture the fine pattern of the iris. Generally, in order to obtain an image in which the iris is in focus, a method is used in which plural images are taken while the focus position is changed. However, in a case where the iris authentication is performed by using the plural iris images thus taken, there is a problem that the authentication process takes time due to an increase in the number of images to which the authentication process is applied. In the present example embodiment, the iris authentication is efficiently performed by using the focus determination device of any one of the first to ninth example embodiments.
(Hardware Configuration)
A hardware configuration of the iris authentication device according to the tenth example embodiment is basically similar to that of the focus determination device 100 of the first example embodiment illustrated in the drawings.
(Functional Configuration)
Plural iris images of the same person in different focus states are inputted to the focus determination device 100 or 100X. The focus determination device 100 or 100X performs the focus determination of the inputted iris images and outputs determination results of the plural inputted iris images to the focused image selection unit 201.
The focused image selection unit 201 outputs the iris image determined to be in focus (hereinafter, also referred to as "focused iris image") among the plural iris images to the iris detection unit 202, based on the determination results. The iris detection unit 202 detects an iris region from the focused iris image and outputs it to the iris feature extraction unit 203. The iris feature extraction unit 203 extracts the iris feature values from the iris region detected by the iris detection unit 202 and outputs the iris feature values to the iris authentication unit 204. Note that the iris feature extraction unit 203 extracts the iris feature values by using Daugman's algorithm or any other algorithm.
The iris DB 205 stores the iris images and the iris feature values for the persons whose iris information has already been registered. The iris authentication unit 204 collates the iris feature values inputted from the iris feature extraction unit 203 with the iris feature values of the person who has already been registered in the iris DB 205 and thereby performs the iris authentication. For example, in a case where a collation score between the iris feature values from the iris feature extraction unit 203 and the iris feature values of a certain person who has already been registered in the iris DB 205 is equal to or larger than a predetermined score, the iris authentication unit 204 determines that the person of the iris image is a person who has already been registered.
Note that the focus determination device 100 or 100X detects the reflection region of the light by performing the filter processing. However, halation usually occurs in the reflection region of the light, so the pixel values of the reflection region are unsuitable for collation of feature values. Therefore, it is preferable that the iris authentication unit 204 either not use the reflection region of the light for authentication or lower the weight of the reflection region in the authentication so that the reflection region contributes little to the collation score. Then, the iris authentication unit 204 outputs an authentication result.
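By way of illustration, excluding the reflection region from the collation could be done with a validity mask, as in the following Python sketch in the style of Daugman's iris codes; the boolean code and mask representation and the Hamming-distance score are assumptions of this sketch, not the collation method of the disclosure.

```python
import numpy as np

def masked_hamming_distance(code_a, code_b, valid_a, valid_b):
    """Illustrative collation that ignores masked-out bits (for example, bits
    inside the light's reflection region). Codes and masks are assumed to be
    boolean arrays of the same shape, in the style of Daugman's iris codes.
    """
    valid = valid_a & valid_b           # bits usable in both codes
    n = valid.sum()
    if n == 0:
        return 1.0                      # nothing comparable: maximally distant
    return float(np.count_nonzero((code_a ^ code_b) & valid)) / n

# A similarity-type collation score (larger means more similar) could then be
# defined as 1.0 - masked_hamming_distance(...). The reflection region is
# simply cleared from the validity mask beforehand:
#   valid_probe[reflection_mask] = False
```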
In the above configuration, the focused image selection unit 201 is one example of an image selection unit, the iris detection unit 202 is one example of an iris detection unit, the iris feature extraction unit 203 is one example of a feature values extraction unit, and the iris authentication unit 204 is one example of an iris authentication unit.
As described above, the iris authentication device 200 of the tenth example embodiment extracts the focused iris image from plural iris images of the same person by the focus determination process and can thereby perform the iris authentication process. Therefore, an iris detection process, an iris feature extraction process, and the iris authentication process can be omitted for the iris images which are out of focus, and it becomes possible to reduce the calculation amount and to shorten a processing time in the iris authentication process. In addition, by removing the images which are out of focus in advance, the iris images which are blurry and unsuitable for authentication can be prevented from being used for the iris authentication process, and it becomes possible to reduce an occurrence of false authentication.
An eleventh example embodiment is also related to an iris authentication device which includes the focus determination device of any one of the first to ninth example embodiments. Usually, when the iris image is taken, the user stops in front of an imaging device as illustrated as an example in the drawings. On the other hand, in a walk-through iris authentication system or the like, a large number of iris images are taken while the user is moving, and performing the focus determination for all of the taken images increases the processing time. In the present example embodiment, the focus determination is therefore performed for only a portion of the taken iris images, thereby efficiently carrying out the iris authentication.
(Hardware Configuration)
A hardware configuration of the iris authentication device according to the eleventh example embodiment is basically similar to that of the focus determination device 100 of the first example embodiment illustrated in the drawings.
(Functional Configuration)
To the iris image selection unit 210, a large number of iris images, which are taken for the same person by a walk-through iris authentication system or the like, are inputted. The iris image selection unit 210 selects, as determination target images, a portion of the inputted iris images and outputs them to the focus determination device 100 or 100X. The focus determination device 100 or 100X performs the focus determination of the inputted determination target images and outputs determination results to the focused image selection unit 201. The focused image selection unit 201 selects the iris images which are in focus based on the determination results of the determination target images and outputs them to the iris detection unit 202. Alternatively, in a case where the number of determination target images whose filter output integration values P exceed the predetermined threshold value Q is equal to or larger than a predetermined number, the focused image selection unit 201 may again apply the focus determination process to those iris images to thereby extract a predetermined number of iris images in a good focus state, and output them to the iris detection unit 202. Note that processes by the iris detection unit 202, the iris feature extraction unit 203, and the iris authentication unit 204 are similar to the tenth example embodiment, and descriptions thereof will be omitted.
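As a simple illustration, the selection of determination target images could sample the taken sequence at a fixed interval, as in the following Python sketch; the sampling interval is an assumption, since the disclosure does not fix how the portion of images is chosen.

```python
def select_determination_targets(iris_images, step=5):
    """Illustrative selection of determination target images: only every
    `step`-th image of the taken sequence is sent to the focus determination
    device. Returns a mapping from image index to image.
    """
    return {i: img for i, img in enumerate(iris_images) if i % step == 0}
```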
Next, a detailed description will be given of a selection method of iris images by the iris image selection unit 210.
Here, it is assumed that the focus deviation degrees of the iris images P6, P11, and P15 with respect to the threshold value Q as a reference are 40%, 10%, and 50%, respectively. Supposing that the moving speed of the user is constant, the focus deviation degrees of the iris images P7 to P10 and P12 to P14 can be estimated to be the values indicated in the parentheses in the drawings. The iris image having the smallest estimated focus deviation degree, or an iris image in its vicinity, can thus be decided as the focused iris image without performing the focus determination for all of the taken iris images.
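The estimation of the focus deviation degrees of the images that were not sent to the focus determination device can be illustrated by a linear interpolation, assuming a constant moving speed, as in the following Python sketch; the 0-based indexing and the interpolation itself are assumptions made for this sketch.

```python
import numpy as np

def estimate_deviation_degrees(measured, num_images):
    """Linearly interpolate focus deviation degrees for images that were not
    sent to the focus determination device, assuming the user moves at a
    constant speed. `measured` maps image indices to measured degrees,
    e.g. {5: 40.0, 10: 10.0, 14: 50.0} for images P6, P11, P15 (0-based).
    """
    idx = np.array(sorted(measured))
    val = np.array([measured[i] for i in idx], dtype=float)
    return {i: float(np.interp(i, idx, val)) for i in range(num_images)}

# The image with the smallest estimated degree (or one of its neighbours) can
# then be treated as the candidate focused iris image.
```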
In a case where the focused iris image decided as described above includes a relatively large halation portion caused by the light, the iris image is not suitable for authentication even though it is in focus. Accordingly, for the focused iris image decided by the above method, the iris image selection unit 210 may determine whether the size of the reflection region of the light, or the size of the portion of the reflection region which overlaps with the iris, is equal to or larger than a predetermined value. In a case where the size is equal to or larger than the predetermined value, the iris image selection unit 210 may again select, as the image to be used for authentication, an image from among the images preceding and following the above focused iris image whose focus state is at a level allowable for authentication, even if it is somewhat out of focus, and whose reflection region of the light is small.
Further, in the above method, after the iris images of a moving user are taken, an iris image which is in a good focus state and suitable for authentication is selected from the plural iris images. Instead, there is also a method of repeatedly performing the focus determination of plural iris images taken each time the user moves a predetermined distance. In this case, the filter processing and the focus determination process are performed for the plural iris images taken while the user is present at a certain distance, and if an iris image which is in focus and suitable for authentication is obtained, the process is finished. On the other hand, if an iris image which is in focus cannot be obtained, or if the only iris images obtained are in focus but unsuitable for authentication due to halation or the like in the reflection region of the light, the work of taking plural iris images while the user moves the next predetermined distance and selecting an iris image which is in focus and suitable for authentication is repeated.
As described above, even in a case where a large number of iris images are taken by the walk-through iris authentication system or the like, the iris authentication device 200X of the eleventh example embodiment selects an iris image which is in focus by performing the focus determination of only a portion of the iris images and can thereby efficiently carry out the iris authentication.
Similarly to the eleventh example embodiment, a twelfth example embodiment is applied to a case where a large number of iris images are taken by the walk-through iris authentication system or the like. However, the twelfth example embodiment is different from the eleventh example embodiment in the selection method of the iris image by the iris image selection unit 210.
Here, it is assumed that the focus deviation degrees of the iris images P6 and P15 with respect to the threshold value Q as the reference are 20% and 40%, respectively. Supposing that the moving speed of the user is constant, it can be inferred that any one of the iris images P2, P3, P9, and P10, which are apart by predetermined distances from the iris image P6 having a deviation degree of 20%, is in the focused state. Thus, the focused image selection unit 201 inputs the iris images P2, P3, P9, and P10 to the focus determination device 100 or 100X, causes the focus determination device 100 or 100X to again perform the focus determination, and decides the focused iris image from the determination results.
As described above, even in a case where a large number of iris images are taken by the walk-through iris authentication system or the like, the iris authentication device 200X of the twelfth example embodiment selects an iris image which is in focus by performing the focus determination of only a portion of the iris images and can thereby efficiently carry out the iris authentication.
A thirteenth example embodiment is also related to an iris authentication device which includes the focus determination device of any one of the first to ninth example embodiments. When the iris image is taken, the iris of the user is usually irradiated with the light. However, in a case where the user whose iris image is taken wears a device covering the eyes, such as glasses, reflection by the lenses or the frame of the glasses enters the taken image. When an iris image including such reflection is used, the focus determination may falsely determine that the iris is in focus due to an edge of the reflection by the lenses or the frame, even though the iris is actually out of focus. Accordingly, in the thirteenth example embodiment, the focus determination device 100 or 100X performs the focus determination of an iris image taken in a state where the user wears a device covering the eyes (hereinafter, referred to as "eyewear"), such as glasses (such an image is hereinafter referred to as an "eyewear wearing image"), but does not perform the focus determination of an iris image taken in a state where the user does not wear eyewear (hereinafter, referred to as an "eyewear non-wearing image"). Note that the eyewear includes sunglasses, goggles, and so forth in addition to glasses.
(Hardware Configuration)
A hardware configuration of the iris authentication device according to the thirteenth example embodiment is basically similar to the configuration of the focus determination device 100 of the first example embodiment illustrated in the drawings.
(Functional Configuration)
To the eyewear wearing image detection unit 215, plural iris images are inputted. The eyewear wearing image detection unit 215 detects eyewear wearing images from the inputted iris images. Note that the eyewear wearing image detection unit 215 may detect the eyewear wearing images based on colors or the like of the iris images. Further, in a case where the iris image includes a whole eyewear, the eyewear itself may be detected by image processing. The eyewear wearing image detection unit 215 outputs the eyewear wearing images to the focus determination device 100 or 100X and outputs eyewear non-wearing images to the iris detection unit 202. The focus determination device 100 or 100X performs the focus determination of the eyewear wearing images, and the focused image selection unit 201 selects the iris images in the focused state among the eyewear wearing images and outputs them to the iris detection unit 202. Thus, the iris images in the focused state among the eyewear wearing images and the eyewear non-wearing images are inputted to the iris detection unit 202. Note that the processes by the iris detection unit 202, the iris feature extraction unit 203, and the iris authentication unit 204 are similar to the tenth example embodiment, and descriptions thereof will be omitted.
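The routing described above could be sketched as follows in Python; is_eyewear_image, focus_device, and focused_selector are hypothetical callables standing in for the eyewear wearing image detection unit 215, the focus determination device 100 or 100X, and the focused image selection unit 201, and their interfaces are assumptions of this sketch.

```python
def route_iris_images(iris_images, is_eyewear_image, focus_device, focused_selector):
    """Illustrative routing: only images detected as eyewear-wearing go through
    the focus determination; the rest go straight to iris detection.
    """
    for_iris_detection = []   # images forwarded to the iris detection unit
    eyewear_images = []       # images forwarded to the focus determination device
    for img in iris_images:
        (eyewear_images if is_eyewear_image(img) else for_iris_detection).append(img)
    # Focus determination only for the eyewear wearing images.
    results = [focus_device(img) for img in eyewear_images]
    for_iris_detection.extend(focused_selector(eyewear_images, results))
    return for_iris_detection
```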
In the thirteenth example embodiment, instead of performing the focus determination of the iris images of all users, the focus determination is performed only for the iris images of users wearing the eyewear. Thus, the focus determination and the iris authentication can efficiently be executed.
Next, a fourteenth example embodiment of the present disclosure will be described.
In the fourteenth example embodiment, since a focus state of the iris image is determined by using the shape of the light used when the iris image is taken, it becomes possible to precisely determine the focus state based on a known shape.
A part or all of the above example embodiments may also be described as the following supplementary notes, but not limited thereto.
(Supplementary Note 1)
A focus determination device comprising:
(Supplementary Note 2)
The focus determination device according to supplementary note 1,
(Supplementary Note 3)
The focus determination device according to supplementary note 2,
(Supplementary Note 4)
The focus determination device according to supplementary note 2,
(Supplementary Note 5)
The focus determination device according to supplementary note 2,
(Supplementary Note 6)
The focus determination device according to supplementary note 1,
(Supplementary Note 7)
The focus determination device according to supplementary note 6,
(Supplementary Note 8)
The focus determination device according to supplementary note 6,
(Supplementary Note 9)
The focus determination device according to supplementary note 6,
(Supplementary Note 10)
The focus determination device according to supplementary note 1, further comprising:
(Supplementary Note 11)
An iris authentication device comprising:
(Supplementary Note 12)
The iris authentication device according to supplementary note 11,
(Supplementary Note 13)
The iris authentication device according to supplementary note 11,
(Supplementary Note 14)
An iris authentication device comprising:
(Supplementary Note 15)
A focus determination method comprising:
(Supplementary Note 16)
A recording medium recording a program causing a computer to execute the processes of:
While the present disclosure has been described with reference to the example embodiments and examples, the present disclosure is not limited to the above example embodiments and examples. Various changes which can be understood by those skilled in the art within the scope of the present disclosure can be made in the configuration and details of the present disclosure.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2020/034887 | 9/15/2020 | WO |