The present invention relates to an imaging device and an imaging method.
Biometric authentication has become widespread in recent years. It is used to verify that the operator of a computer or other device is an authorized operator, for example, to authenticate an operator of a cash dispenser at a store. In this case, an imaging device images, for example, the palm veins of an individual. When the vein pattern extracted from the image obtained by the imaging device matches a vein pattern registered in advance, the individual is verified. The cash dispenser accepts operations only from operators verified by the biometric authentication.
Imaging devices for various purposes, including biometric authentication systems, emit infrared light to irradiate a palm held toward the imaging direction of the imaging device and capture an image. Part of the infrared light is reflected at the surface of the palm, but part of it passes through the skin and is absorbed by the hemoglobin in the veins inside the hand. Therefore, by imaging with an infrared-sensitive image sensor, the vein pattern inside the palm can be acquired in addition to surface information of the palm.
For example, such an imaging device has a self-illuminating device that emits infrared light to irradiate a palm through a ring-shaped light guide, and acquires a palm image with an imaging system constituted of a lens and an image sensor. The acquired palm image includes a background image, such as of a ceiling or a wall, which, depending on conditions, can adversely affect cutting the palm out of the whole image or extracting the vein-pattern signal in the image processing stage.
Therefore, in practice, a method is applied in which images are taken under two conditions, with the self-illuminating device turned on and turned off, and their difference image is used as the palm image. The images taken when the self-illuminating device is on and off include the same background, so the background is canceled in the difference image, and an image of the palm alone is acquired.
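The on/off difference step described above can be sketched as follows. The function name `difference_palm_image` is hypothetical, and clipping negative values to zero is an assumption (sensor noise can push individual pixel differences slightly negative):

```python
import numpy as np

def difference_palm_image(lit_frame: np.ndarray, unlit_frame: np.ndarray) -> np.ndarray:
    """Cancel the shared background by subtracting the frame taken with
    the self-illumination OFF from the frame taken with it ON.
    Negative residuals (noise) are clipped to zero -- an assumption."""
    diff = lit_frame.astype(np.float64) - unlit_frame.astype(np.float64)
    return np.clip(diff, 0.0, None)
```

Because the background contributes equally to both frames, it vanishes in the subtraction and only the self-illuminated palm remains.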
Ambient light, such as sunlight and room light, illuminates not only the background but also the palm. Illumination of the palm by such ambient or external light can also adversely affect extraction of the vein-pattern signal, but it is canceled in the difference image, so an image illuminated only by the self-illuminating device can be obtained (for example, refer to Patent Document 1).
Patent Document 1: Japanese Laid-open Patent Publication No. 2014-078857
However, the above technique has limited tolerance to external light, and when the external light is strong, as it is outdoors, the following problems arise.
To obtain an accurate difference image, it is a precondition that the brightness of the palm portion is not saturated in either of the images taken with the self-illuminating device on and off. Accordingly, as the external light becomes intense, the exposure for imaging with the self-illuminating device turned on or off is to be decreased. This decreases the difference between the two images, and it thus becomes difficult to extract a vein pattern accurately. An intensity limit is therefore set on the external light in the use environment, and there is a disadvantage that measures such as using an appropriate cover to create an artificial shade become necessary for outdoor use.
When the external light intensity is high, using only the external light for illumination is another option, but a locally high-brightness region (halation or the like) can occur depending on the incident angle of the external light on the palm, which can adversely affect extraction of the vein-pattern signal. There is thus a problem that accurate extraction of a vein pattern becomes difficult.
According to an aspect of the embodiments, an imaging device includes: an image sensor that images an object to be imaged through polarizing plates arranged to have a different polarization direction for each pixel in a pixel group that includes a plurality of pixels corresponding to each of the points of the object to be imaged; a pixel selecting unit that selects a pixel having the lowest brightness in each pixel group corresponding to each of the points; and an image output unit that outputs a captured image of the object to be imaged that is generated from the pixels selected by the pixel selecting unit.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention.
Embodiments of the present application are explained below with reference to the accompanying drawings. The following embodiments explain, as an example, an imaging device applied to a vein authentication apparatus that authenticates a person based on characteristics of the person's veins, but they are not intended to limit the disclosed technique. The disclosed technique is applicable to most imaging devices that acquire, for example, biometric information of a subject from a difference image. The respective embodiments can be combined to the extent that no contradiction arises. Furthermore, like reference symbols are assigned to like parts throughout the embodiments, and explanation of components and processing that have already been explained is omitted.
(Imaging Device According to First Embodiment)
In the following, an imaging device according to a first embodiment is explained, referring to
As illustrated in
As illustrated in
The image sensor 120 is a multi-polarization image sensor in which one pixel group is structured, for example, by four pixels, each having a polarizing plate that differs from the others in polarization direction. The pixel structuring method of the image sensor 120 is explained below. Multi-polarization image sensors are structured on the same idea as color image sensors, by replacing the color filters of a color image sensor with polarizing plates.
In this example, wire-grid polarizing plates are assumed to be used, and the transmission direction and the reflection direction of the respective polarizing plates are as illustrated in
Thus, as illustrated in
The multi-polarization image sensor also structures pixels similarly to the color image sensor.
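Reading out the four polarization channels from such a mosaic can be sketched as below, in direct analogy to demosaicing a color sensor. The specific 2×2 layout (0° top-left, 45° top-right, 90° bottom-left, 135° bottom-right) is an assumed example; the actual arrangement depends on the sensor:

```python
import numpy as np

def split_polarization_channels(raw: np.ndarray) -> dict:
    """Split a multi-polarization mosaic into its four angle channels.
    Assumes a 2x2 repeating layout: 0 deg top-left, 45 deg top-right,
    90 deg bottom-left, 135 deg bottom-right (an assumed layout)."""
    return {
        0:   raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        90:  raw[1::2, 0::2],
        135: raw[1::2, 1::2],
    }
```

Each returned channel is a quarter-resolution image seen through one polarizer orientation, just as each color channel of a Bayer sensor is a quarter-resolution image behind one color filter.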
To explain the functions and effects of the first embodiment, the behavior of light reflected from an object to be imaged is first explained.
As illustrated in
Furthermore, as illustrated in
Moreover, the regular reflection also has distinctive polarization characteristics. The reflectivity of light at an object surface differs depending on the polarization direction: P-polarized light, in which the electric field vibrates within the plane of incidence, readily passes through the boundary and has low reflectivity. On the other hand, S-polarized light, in which the electric field vibrates perpendicular to the plane of incidence, that is, parallel to the boundary, has high reflectivity. Therefore, even if the incident light is randomly polarized, the regular reflection light has a strong S-polarization component. This phenomenon, that is, the regular reflection and the change of polarization direction at the object surface, is illustrated in a perspective view in
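The asymmetry between the S and P reflectivities follows from the standard Fresnel reflectance formulas (a textbook result, not recited in this document). For light incident at angle $\theta_i$ on a boundary from refractive index $n_1$ to $n_2$, with refraction angle $\theta_t$:

```latex
R_s = \left| \frac{n_1\cos\theta_i - n_2\cos\theta_t}{n_1\cos\theta_i + n_2\cos\theta_t} \right|^2,
\qquad
R_p = \left| \frac{n_1\cos\theta_t - n_2\cos\theta_i}{n_1\cos\theta_t + n_2\cos\theta_i} \right|^2
```

For oblique incidence $R_s \ge R_p$, and $R_p$ vanishes at Brewster's angle $\theta_B = \arctan(n_2/n_1)$, which is why the regular (specular) reflection is dominated by the S-polarization component.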
As illustrated in
Even when the incident direction of the illumination varies, if a polarizing plate that shields the S-polarized light with respect to that direction can always be applied, the regular reflection light can be significantly reduced even under randomly polarized illumination light such as sunlight.
In the first embodiment, the function of reducing this regular reflection light to a practically sufficient level is achieved by using the property that the four adjacent pixels of the multi-polarization image sensor pass polarized light of respectively different directions. Thus, even under illumination by external light that is randomly polarized and whose incident direction cannot be controlled, an effect of acquiring an excellent palm image without a local high-brightness region can be expected.
For example, when external illumination light enters from direction (B) in
As described, by choosing pixels over the entire image following the rule of using the pixel of the lowest brightness in each group of four pixels, an excellent palm image from which the regular reflection light is removed can be acquired.
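This per-group minimum rule amounts to a 2×2 minimum pooling over the raw mosaic. The sketch below assumes non-overlapping 2×2 groups; `min_per_2x2_group` is a hypothetical name:

```python
import numpy as np

def min_per_2x2_group(raw: np.ndarray) -> np.ndarray:
    """For each non-overlapping 2x2 pixel group, keep the pixel with the
    lowest brightness: the channel whose polarizer blocks the most of the
    S-polarized regular reflection at that point."""
    h, w = raw.shape
    # Trim odd edges, then view the image as (rows of groups, 2, cols of groups, 2)
    blocks = raw[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    return blocks.min(axis=(1, 3))
```

The result is a quarter-resolution image in which each output pixel is the darkest of the four polarization readings for the corresponding point on the palm.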
Explanation is returned to
The control unit 110 includes a lighting control unit 110a, an image acquiring unit 110b, a pixel selecting unit 110c, and an image output unit 110d. The lighting control unit 110a controls lighting on and off of the light emitting device 130.
The image acquiring unit 110b instructs the lighting control unit 110a to turn off the light emitting device 130. The image acquiring unit 110b then acquires an image (hereinafter, external light image in some cases) by causing the image sensor 120 to capture an image including a palm held over the imaging device 100 and the background, in a state in which the light emitting device 130 is turned off. The external light is light other than the self-illumination, such as outdoor natural light or indoor illumination light.
Furthermore, the image acquiring unit 110b instructs the lighting control unit 110a to turn on the light emitting device 130. The image acquiring unit 110b then acquires an image by causing the image sensor 120 to capture an image (hereinafter, self-illumination image in some cases) including a palm held over the imaging device 100 and a background in a state in which the light emitting device 130 is turned on.
Subsequently, the image acquiring unit 110b acquires a difference image between the external light image and the self-illumination image.
The image acquiring unit 110b determines whether the brightness of a predetermined portion (for example, a central portion) of the external light image is equal to or higher than a first threshold, or whether the brightness of a predetermined portion (for example, a central portion) of the difference image between the external light image and the self-illumination image is lower than a second threshold. The image acquiring unit 110b uses the external light image as the image subject to pixel-selection processing when the brightness of the predetermined portion of the external light image is equal to or higher than the first threshold, or when the brightness of the predetermined portion of the difference image is lower than the second threshold.
On the other hand, the image acquiring unit 110b uses the difference image as the image subject to pixel-selection processing when the brightness of the predetermined portion of the external light image is lower than the first threshold and the brightness of the predetermined portion of the difference image is equal to or higher than the second threshold. That is, even if the brightness of the predetermined portion of the external light image is lower than the first threshold, the image acquiring unit 110b uses the external light image when the brightness of the predetermined portion of the difference image is lower than the second threshold.
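The selection rule of the image acquiring unit 110b can be summarized in a short sketch. Here `brightness` is an assumed callable that measures the predetermined portion (the document does not fix the exact metric), and the function name is hypothetical:

```python
def choose_processing_image(external, difference, brightness, first_threshold, second_threshold):
    """Select the image subject of pixel-selection processing.

    Use the external light image when external light is strong
    (brightness >= first_threshold) or when the difference image is too
    dim to be useful (brightness < second_threshold); otherwise use the
    difference image."""
    if brightness(external) >= first_threshold or brightness(difference) < second_threshold:
        return external
    return difference
```

Representing each image by a single brightness number, `choose_processing_image(200, 10, lambda x: x, 150, 20)` picks the external light image, while `choose_processing_image(100, 50, lambda x: x, 150, 20)` picks the difference image.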
The pixel selecting unit 110c selects the pixel whose brightness is the lowest in each pixel group, constituted of four adjacent pixels as one group, throughout the whole image subject to processing. The four adjacent pixels are four pixels arranged in a 2×2 lattice. The image output unit 110d generates one palm image by connecting the pixels selected by the pixel selecting unit 110c.
(Brightness Distribution of Image Including Regular Reflection)
As illustrated in
(Switch from Self-Illumination to External Illumination According to First Embodiment)
When the intensity of the external light is not high, as is the case indoors, the brightness of an image when the self-illumination is OFF is low, as expressed by a broken line in
Specifically, when the intensity of the external light becomes high, the brightness becomes high both in the case in which the self-illumination is ON, illustrated in
Therefore, in the first embodiment, the method is switched, by threshold determination, from a method of acquiring a palm vein pattern from a difference image to a method of acquiring a palm vein pattern only from the external light image. Specifically, in the first embodiment, when the brightness of, for example, a central portion of the external light image illustrated in
That is, in the first embodiment, a palm is imaged by the self-illumination indoors or the like, where the intensity of the external light is lower than a predetermined intensity. As the intensity of the external light increases due to use near a window or outdoors, the appropriate exposure time for acquiring a difference image between the self-illumination ON and OFF images becomes short. Therefore, in the first embodiment, the method is switched from the method of acquiring a palm vein pattern from the difference image to the method of acquiring a palm vein pattern only from the external light image under a predetermined condition.
(Imaging Processing in Imaging Device According to First Embodiment)
At step S12, the imaging device 100 turns off the self-illumination. Subsequently, the imaging device 100 acquires an external light image of the palm (step S13). Subsequently, the imaging device 100 determines whether the brightness of the external light image (for example, at a predetermined portion) is equal to or higher than the first threshold (step S14). When the brightness of the external light image (at a predetermined portion) is equal to or higher than the first threshold (step S14: YES), the imaging device 100 shifts the processing to step S21. On the other hand, when the brightness of the external light image (at a predetermined portion) is lower than the first threshold (step S14: NO), the imaging device 100 shifts the processing to step S15.
At step S15, the imaging device 100 turns ON the self-illumination. Subsequently, the imaging device 100 acquires a self-illumination image (step S16). Subsequently, the imaging device 100 turns OFF the self-illumination (step S17). Subsequently, the imaging device 100 acquires a difference image between the external light image acquired at step S13 and the self-illumination image acquired at step S16 (step S18).
Subsequently, the imaging device 100 determines whether the brightness of the difference image (for example, at a predetermined portion) is lower than the second threshold (step S19). When the brightness of the difference image (for example, at a predetermined portion) is lower than the second threshold (step S19: YES), the imaging device 100 shifts the processing to step S21. On the other hand, when the brightness of the difference image (for example, at a predetermined portion) is equal to or higher than the second threshold (step S19: NO), the imaging device 100 shifts the processing to step S20.
At step S20, the imaging device 100 uses the difference image as an image subject of processing. On the other hand, at step S21, the imaging device 100 uses the external light image as an image subject of processing. When step S20 or step S21 is completed, the imaging device 100 shifts the processing to step S22.
At step S22, the imaging device 100 selects a pixel having the lowest brightness per pixel group from the image subject of processing. The imaging device 100 then generates an output image from pixels selected at step S22 (step S23). Subsequently, the imaging device 100 determines whether to end the imaging processing (step S24). When determining to end the imaging processing (step S24: YES), the imaging device 100 ends the imaging processing, and when determining not to end the imaging processing (step S24: NO), the processing is shifted to step S11.
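One pass of the flow at steps S12 to S23 can be sketched as below. This is a sketch under assumed interfaces: `capture()` returns a 2D mosaic frame, `set_illumination(on)` drives the light emitting device, and the mean brightness of the image center stands in for the "predetermined portion", none of which the document fixes:

```python
import numpy as np

def centre_brightness(img: np.ndarray) -> float:
    """Mean brightness of the central half of the frame (assumed metric)."""
    h, w = img.shape
    return float(img[h // 4: 3 * h // 4, w // 4: 3 * w // 4].mean())

def capture_palm_image(capture, set_illumination, first_threshold, second_threshold):
    """One pass of the imaging flow of the first embodiment (steps S12-S23)."""
    set_illumination(False)
    external = capture().astype(np.float64)                 # S12-S13
    if centre_brightness(external) >= first_threshold:      # S14
        subject = external                                  # S21
    else:
        set_illumination(True)
        lit = capture().astype(np.float64)                  # S15-S16
        set_illumination(False)                             # S17
        diff = np.clip(lit - external, 0.0, None)           # S18
        # S19-S21: fall back to the external light image if the difference is too dim
        subject = external if centre_brightness(diff) < second_threshold else diff
    # S22-S23: keep the darkest pixel of each 2x2 polarization group
    h, w = subject.shape
    blocks = subject[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    return blocks.min(axis=(1, 3))
```

The looping decision at step S24 is left to the caller, which would invoke this function repeatedly until authentication completes.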
At step S19, when the brightness of the difference image (for example, at a predetermined portion) is lower than the second threshold (step S19: YES), the imaging device 100 can skip performing the processing at step S21 to step S24, and can shift the processing to step S11. Alternatively, the determination can be performed only based on the first threshold at step S14, and the processing at step S19 can be omitted.
Alternatively, upon acquiring either of the “external light image” or the “self-illumination image”, the processing of pixel selection and output-image generation at steps S22 to S23 can be performed on the acquired image, and acquisition of the other image can be omitted.
Alternatively, turning OFF of the self-illumination at step S12 and turning ON of the self-illumination at step S15 can be switched, and acquisition of an external light image at step S13 and acquisition of a self-illumination image at step S16 can be switched. That is, a “first image” and a “second image” can be the “external light image” and the “self-illumination image”, respectively, or can be the “self-illumination image” and the “external light image”, respectively.
In the first embodiment, an excellent palm image from which a local high-brightness region, such as glare, is removed can be acquired while using external light for illumination.
Although the control unit 110 of the imaging device 100 performs the various kinds of processing illustrated in the flowchart of
In a second embodiment, the pixel structuring is based on another pixel structuring method, unlike the first embodiment, which is based on the pixel structuring method of the color image sensor illustrated in
In the second embodiment, the pixels of the multi-polarization image sensor are likewise structured in a manner similar to a pixel structuring method of a color image sensor. That is, the approach of replacing the respective R, G, B color filters with respective polarizing plates of 0°, 45°, 90°, and 135° is the same as in the first embodiment; the arrangement differs from the Bayer arrangement. The other points are also the same as in the first embodiment.
Pixels can be structured by using various publicly known pixel structuring methods other than those of the first and second embodiments, according to the demanded image quality.
The respective components illustrated in the first and second embodiments can be modified or omitted within a range not departing from the technical scope of the imaging device according to the disclosed technique. Furthermore, the first and second embodiments are merely examples, and not only the modes described in the disclosure of the invention but also other modes in which various modifications and improvements are made based on the knowledge of a person skilled in the art are included in the disclosed technique.
According to one example of the disclosed technique, for example, accurate extraction of a vein pattern is enabled.
All examples and conditional language provided herein are intended for the pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventors to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although one or more embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
This application is a continuation application of International Application PCT/JP2016/056095, filed on Feb. 29, 2016 and designating the U.S., the entire contents of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
9645074 | Kanamori | May 2017 | B2 |
10848727 | Kondo | Nov 2020 | B2 |
20070222781 | Kondo | Sep 2007 | A1 |
20090279807 | Kanamori et al. | Nov 2009 | A1 |
20120230551 | Hama | Sep 2012 | A1 |
20140055664 | Yamagata et al. | Feb 2014 | A1 |
20150212294 | Imamura | Jul 2015 | A1 |
20150219552 | Kanamori | Aug 2015 | A1 |
20150235375 | Imagawa | Aug 2015 | A1 |
20150256733 | Kanamori | Sep 2015 | A1 |
20170075050 | Yamagata et al. | Mar 2017 | A1 |
20170243079 | Hiriyannaiah | Aug 2017 | A1 |
20170316266 | Kawabata | Nov 2017 | A1 |
20190273856 | Hirasawa | Sep 2019 | A1 |
Number | Date | Country |
---|---|---|
11-41514 | Feb 1999 | JP |
2012-173916 | Sep 2012 | JP |
2014-78857 | May 2014 | JP |
2015-164518 | Sep 2015 | JP |
WO 2008099589 | Aug 2008 | WO |
WO 2013114888 | Aug 2013 | WO |
Entry |
---|
Japanese Office Action dated Apr. 16, 2019 in corresponding Japanese Patent Application No. 2018-502875 (2 pages). |
International Search Report dated May 17, 2016 in corresponding International Patent Application No. PCT/JP2016/056095. |
Written Opinion of the International Searching Authority dated May 17, 2016 in corresponding International Patent Application No. PCT/JP2016/056095. |
Number | Date | Country | |
---|---|---|---|
20180336655 A1 | Nov 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | PCT/JP2016/056095 | Feb 2016 | US |
Child | 16052019 | US |