This disclosure relates to technical fields of an authentication system, an authentication apparatus, an authentication method, and a recording medium that authenticate a target.
A system that uses information other than a face in face authentication is known as a system of this type. For example, Patent Literature 1 discloses a technique/technology of using a background image when a user is imaged. Patent Literature 2 discloses a technique/technology of preventing spoofing by using a background image. Patent Literature 3 discloses a technique/technology of collating/verifying a feature quantity of an object located around a face.
This disclosure aims to improve the techniques/technologies disclosed in the Citation List.
An authentication system according to an example aspect of this disclosure includes: a first acquisition unit that obtains face information and environment information of a target, from an image including the target; a second acquisition unit that obtains imaging angle information that is information indicating an angle when the image is captured; and an authentication execution unit that performs an authentication process on the target, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information.
An authentication apparatus according to an example aspect of this disclosure includes: a first acquisition unit that obtains face information and environment information of a target, from an image including the target; a second acquisition unit that obtains imaging angle information that is information indicating an angle when the image is captured; and an authentication execution unit that performs an authentication process on the target, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information.
An authentication method according to an example aspect of this disclosure includes: obtaining face information and environment information of a target, from an image including the target; obtaining imaging angle information that is information indicating an angle when the image is captured; and performing an authentication process on the target, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information.
A recording medium according to an example aspect of this disclosure is a recording medium on which a computer program that allows a computer to execute an authentication method is recorded, the authentication method including: obtaining face information and environment information of a target, from an image including the target; obtaining imaging angle information that is information indicating an angle when the image is captured; and performing an authentication process on the target, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information.
Hereinafter, an authentication system, an authentication apparatus, an authentication method, and a recording medium according to example embodiments will be described with reference to the drawings.
An authentication system according to a first example embodiment will be described with reference to
First, a hardware configuration of an authentication system 10 according to the first example embodiment will be described with reference to
As illustrated in
The processor 11 reads a computer program. For example, the processor 11 is configured to read a computer program stored by at least one of the RAM 12, the ROM 13 and the storage apparatus 14. Alternatively, the processor 11 may read a computer program stored in a computer-readable recording medium by using a not-illustrated recording medium reading apparatus. The processor 11 may obtain (or read) a computer program from a not-illustrated apparatus disposed outside the authentication system 10, through a network interface. The processor 11 controls the RAM 12, the storage apparatus 14, the input apparatus 15, and the output apparatus 16 by executing the read computer program. Especially in this example embodiment, when the processor 11 executes the read computer program, a functional block for performing an authentication process on a target is realized or implemented in the processor 11.
The processor 11 may be configured as, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an FPGA (Field-Programmable Gate Array), a DSP (Digital Signal Processor), or an ASIC (Application Specific Integrated Circuit). The processor 11 may include one of them, or may use a plurality of them in parallel.
The RAM 12 temporarily stores the computer program to be executed by the processor 11. The RAM 12 also temporarily stores data that the processor 11 uses while executing the computer program. The RAM 12 may be, for example, a D-RAM (Dynamic RAM).
The ROM 13 stores the computer program to be executed by the processor 11. The ROM 13 may otherwise store fixed data. The ROM 13 may be, for example, a P-ROM (Programmable ROM).
The storage apparatus 14 stores the data that is stored for a long term by the authentication system 10. The storage apparatus 14 may operate as a temporary storage apparatus of the processor 11. The storage apparatus 14 may include, for example, at least one of a hard disk apparatus, a magneto-optical disk apparatus, a SSD (Solid State Drive), and a disk array apparatus.
The input apparatus 15 is an apparatus that receives an input instruction from a user of the authentication system 10. The input apparatus 15 may include, for example, at least one of a keyboard, a mouse, and a touch panel.
The output apparatus 16 is an apparatus that outputs information about the authentication system 10 to the outside. For example, the output apparatus 16 may be a display apparatus (e.g., a display) that is configured to display the information about the authentication system 10. In addition, the output apparatus 16 may be a speaker that is configured to audio-output the information about the authentication system 10.
The camera 20 is a camera installed at a position where an image of the user (e.g., an image including a face of the user) can be captured. The user here is not limited to a human, and may be an animal such as a dog or a snake, a robot, or the like. The camera 20 may be a camera that captures a still image, or a camera that captures a video. The camera 20 may be configured as a visible light camera or as a near infrared camera. The camera 20 may be provided, for example, in a portable terminal or the like.
The sensor 21 is a sensor that is configured to detect various information used in the system. A plurality of sensors 21 may be provided. A plurality of types of sensors 21 may be provided. Especially, the sensor 21 according to this example embodiment includes a sensor that detects an imaging angle of the camera 20. The sensor may be, for example, a gyro sensor or an acceleration sensor.
Next, a functional configuration of the authentication system 10 according to the first example embodiment will be described with reference to
As illustrated in
The face/environment acquisition unit 110 is configured to obtain, from an image of a target of an authentication process, information about a face of the target (hereinafter referred to as “face information” as appropriate) and information about an environment when the image is captured (hereinafter referred to as “environment information” as appropriate). The environment information is information about elements other than the target (e.g., a background or a landscape) included in the image. The image of the target may be obtained by the camera 20 (see
The imaging angle acquisition unit 120 is configured to obtain information indicating an angle (hereinafter referred to as an “imaging angle information”) when the image of the target (i.e., the image from which the face information and the environment information are obtained) is captured. The imaging angle information may be, for example, information indicating an angle of a terminal that captures the image. The imaging angle may be, for example, an angle indicating a single value such as “90 degrees”, or may be information having a width such as “80 to 90 degrees.” The imaging angle information may be obtained, for example, from the sensor 21 (e.g., gyro sensor) provided in the terminal. The imaging angle information may be two-dimensional angle information (e.g., information indicating the angle on a plane defined by an X-axis and a Y-axis: specifically, information indicating the angle in a vertical direction and in a lateral direction), or may be three-dimensional angle information (e.g., information indicating the angle on a three-dimensional space defined by the X-axis, Y-axis and a Z-axis: specifically, information indicating the angle in the vertical direction, in the lateral direction, and in a depth direction). The imaging angle information obtained by the imaging angle acquisition unit 120 is configured to be outputted to the authentication execution unit 140.
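As an illustration of the two-dimensional and three-dimensional imaging angle information described above, such information can be represented, for example, as follows. This is a minimal sketch in Python; the class and field names are assumptions for illustration, not part of this disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImagingAngle:
    """Imaging angle information for one captured image, in degrees."""
    vertical: float                 # angle in the vertical direction
    lateral: float                  # angle in the lateral direction
    depth: Optional[float] = None   # set only for three-dimensional angle information

# Two-dimensional angle information (e.g., derived from a gyro sensor of a terminal)
angle_2d = ImagingAngle(vertical=90.0, lateral=0.0)
# Three-dimensional angle information additionally holds a depth-direction angle
angle_3d = ImagingAngle(vertical=90.0, lateral=0.0, depth=15.0)
```

An angle "having a width" such as "80 to 90 degrees" could likewise be represented by a pair of such values; that variant is omitted here for brevity.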
The registered information storage unit 130 stores the face information, the environment information, and the imaging angle information as registered information in advance. The registered information may be information obtained by the face/environment acquisition unit 110 and the imaging angle acquisition unit 120. That is, the registered information may be information obtained from the image of the target captured in advance. The registered information storage unit 130 may store the registered information for a plurality of registered users.
The registered information storage unit 130 may be configured to update, add, and delete the registered information as appropriate. The registered information stored in the registered information storage unit 130 is configured to be readable by the authentication execution unit 140 as appropriate.
The authentication execution unit 140 is configured to perform an authentication process on the target, on the basis of the face information and the environment information obtained by the face/environment acquisition unit 110, the imaging angle information obtained by the imaging angle acquisition unit 120, and the registered information stored in the registered information storage unit 130. The authentication execution unit 140 may be configured to perform a collation/verification process using the face information, the environment information, and the imaging angle information (e.g., a process of determining a degree of matching between the obtained information and the registered information). Specific contents of the authentication process will be described in detail in another example embodiment described later. The authentication execution unit 140 may be configured to output a result of the authentication process.
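As a minimal sketch of the collation/verification process mentioned above (determining a degree of matching between obtained information and registered information), the comparison could be implemented, for example, as a cosine similarity over feature vectors checked against a threshold. The feature representation and the threshold value are assumptions for illustration.

```python
import math

def degree_of_matching(obtained, registered):
    """Cosine similarity between two feature vectors (1.0 means identical direction)."""
    dot = sum(a * b for a, b in zip(obtained, registered))
    norm = (math.sqrt(sum(a * a for a in obtained))
            * math.sqrt(sum(b * b for b in registered)))
    return dot / norm if norm else 0.0

def collate(obtained, registered, threshold=0.8):
    """Match when the degree of matching is greater than or equal to the threshold."""
    return degree_of_matching(obtained, registered) >= threshold
```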
As illustrated in
Next, with reference to
As illustrated in
Subsequently, the authentication execution unit 140 executes the authentication process, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information (step S104). As illustrated in
Next, a technical effect obtained by the authentication system 10 according to the first example embodiment will be described.
As described in
The authentication system 10 according to a second example embodiment will be described with reference to
First, with reference to
As illustrated in
Subsequently, the registered information storage unit 130 stores the obtained face information, the obtained environment information, and the obtained imaging angle information, as the registered information (step S205). Then, the registered information storage unit 130 determines whether or not all the registrations are ended (step S206). The determination here may be whether or not the registration is ended for a predetermined number of images. In other words, it may be determined whether or not the number of registrations reaches a preset number of times.
When it is determined that all the registrations are not ended (step S206: NO), the process is repeated from the step S201. In the step S201 for the second time or after, however, an image captured at an imaging angle different from those of the previously captured images is obtained. That is, the target to be registered is imaged a plurality of times while changing the imaging angle. By repeating such a process, the registered information storage unit 130 stores a plurality of pieces of face information, environment information, and imaging angle information for a single target. When it is determined that all the registrations are ended (step S206: YES), a sequence of processing steps is ended.
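The registration flow above (steps S201 to S206) can be sketched, for example, as follows. The data layout and the preset number of registrations are assumptions for illustration.

```python
registered_info = {}  # target identifier -> list of (face, environment, imaging_angle)

def register(target_id, face, environment, imaging_angle, preset_count=3):
    """Store one registration entry (corresponding to step S205) and report
    whether all registrations are ended (corresponding to step S206)."""
    entries = registered_info.setdefault(target_id, [])
    entries.append((face, environment, imaging_angle))
    return len(entries) >= preset_count

# The same target is imaged a plurality of times while changing the imaging angle
done = False
for angle in (80.0, 90.0, 100.0):
    done = register("target-1", face="face-feature", environment="env-feature",
                    imaging_angle=angle)
```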
Next, with reference to
The example illustrated in
As described above, in the authentication system 10 according to the second example embodiment, a plurality of images are captured at different imaging angles, and the face information, the environment information, and the imaging angle information obtained from the plurality of images are stored in the registered information storage unit 130 as the registered information. Here, an example in which the camera 20 is shifted in a horizontal direction is described, but a plurality of images may be captured in a vertical direction or a diagonal direction, for example. The registered information may be stored by using more than three images.
Next, with reference to
As illustrated in
Next, a technical effect obtained by the authentication system 10 according to the second example embodiment will be described.
As described in
The authentication system 10 according to a third example embodiment will be described with reference to
First, with reference to
As illustrated in
The face collation/verification unit 141 is configured to perform a collation/verification process of collating the face information obtained from the image of the target (i.e., the face of the target to be authenticated) with the face information stored as the registered information (i.e., the face registered in advance). The face collation/verification unit 141 may be configured to determine that these pieces of face information match when a degree of matching between the obtained face information and the stored face information is greater than or equal to a predetermined threshold, for example.
The environment selection unit 142 is configured to select the environment information corresponding to the imaging angle information from the registered information, in accordance with the imaging angle information when the image of the target is captured. In particular, the registered information storage unit 130 according to this example embodiment stores the environment information and the imaging angle information in association with each other. Therefore, the environment selection unit 142 is configured to select the environment information corresponding to the imaging angle, by using the obtained imaging angle information. For example, the environment selection unit 142 may select the environment information associated with the imaging angle information that is the closest to the obtained imaging angle information. The environment selection unit 142 typically selects one piece of environment information corresponding to the imaging angle, but when there are a plurality of pieces of environment information whose imaging angles are close, all of those pieces of environment information may be selected.
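The selection described above can be sketched, for example, as follows, assuming each registered entry pairs a scalar imaging angle (degrees) with its environment information; when several registered angles are equally close, all of their environment information is returned.

```python
def select_environment(obtained_angle, registered_entries):
    """Select the environment information whose registered imaging angle is
    the closest to the obtained imaging angle.

    registered_entries: list of (imaging_angle, environment_info) pairs.
    """
    best = min(abs(angle - obtained_angle) for angle, _ in registered_entries)
    return [env for angle, env in registered_entries
            if abs(angle - obtained_angle) == best]

registered = [(80.0, "env-80"), (90.0, "env-90"), (100.0, "env-100")]
selected = select_environment(88.0, registered)  # the closest registered angle is 90.0
```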
Furthermore, in the operation of selecting (narrowing down) the environment information, position information obtained from, for example, a GPS may be used in addition to the imaging angle information. In this case, the environment image and the position information may be stored in advance in association with each other. When the environment information is selected, for example, the environment information corresponding to the position information may be narrowed down from a plurality of pieces of environment information in accordance with the position information obtained by the GPS function of a smartphone, and the environment information may be further narrowed down by using the imaging angle information.
In addition, when a plurality of users are registered (e.g., when the system is configured as a shared system, instead of being owned by an individual), face authentication may be performed first, the environment image associated with the target identified by the face authentication may be narrowed down, and then the environment information may be further narrowed down by using the imaging angle information.
The environment collation/verification unit 143 is configured to perform a collation/verification process of collating the environment information obtained from the image of the target (i.e., the environment of the target to be authenticated) with the environment information selected by the environment selection unit 142 (i.e., the environment information about the image captured at the imaging angle that is close to the current imaging angle, out of the environment information registered in advance). The environment collation/verification unit 143 may be configured to determine that these pieces of environment information match when a degree of matching between the obtained environment information and the selected environment information is greater than or equal to a predetermined threshold, for example.
Next, with reference to
As illustrated in
Subsequently, the face collation/verification unit 141 performs the collation/verification process of collating the obtained face information with the registered face information (step S301). When these pieces of face information do not match, the subsequent steps S302 and S303 may be omitted, and a result indicating that the authentication is failed may be outputted.
Subsequently, the environment selection unit 142 selects the environment information corresponding to the obtained imaging angle information (step S302). Then, the environment collation/verification unit 143 performs the collation/verification process of collating the obtained environment information with the selected environment information (step S303). The steps S301 to S303 may be performed before or after one another, or may be performed at the same time in parallel.
Then, the authentication execution unit 140 outputs the result of the authentication process (step S304). Specifically, the authentication execution unit 140 may output a result indicating that the authentication process is successful when the collation/verification process performed on the face information by the face collation/verification unit 141 and the collation/verification process performed on the environment information by the environment collation/verification unit 143 are both successful (i.e., matching). On the other hand, the authentication execution unit 140 may output a result indicating that the authentication process is failed when one of the collation/verification process performed on the face information by the face collation/verification unit 141 and the collation/verification process performed on the environment information by the environment collation/verification unit 143 is failed (i.e., not matching).
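The output step (S304) described above reduces to a logical AND of the two collation/verification results; a one-function sketch, with the result strings chosen for illustration:

```python
def authentication_result(face_matched, environment_matched):
    """Successful only when both the face collation/verification and the
    environment collation/verification are successful (i.e., matching)."""
    return "success" if face_matched and environment_matched else "failure"
```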
Next, a technical effect obtained by the authentication system 10 according to the third example embodiment will be described.
As described in
The authentication execution unit 140 according to the third example embodiment may be configured without the environment selection unit 142 (i.e., may include only the face collation/verification unit 141 and the environment collation/verification unit 143). In this case, the environment collation/verification unit 143 may perform the collation/verification process against all of the pieces of registered environment information.
The authentication system 10 according to the third example embodiment may be realized or implemented in combination with the authentication system 10 according to the second example embodiment (i.e., the configuration that images are captured from a plurality of angles).
The authentication system 10 according to a fourth example embodiment will be described with reference to
First, with reference to
As illustrated in
The imaging angle collation/verification unit 144 is configured to perform a collation/verification process of collating the obtained imaging angle information (i.e., the imaging angle when the image for authentication is captured) with the registered imaging angle information (the imaging angle stored in advance). The imaging angle collation/verification unit 144 may determine that these pieces of imaging angle information match when a difference between the obtained imaging angle information and the stored imaging angle information is less than or equal to a predetermined threshold, for example.
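A minimal sketch of this collation/verification process, assuming the angles are scalar values in degrees and an illustrative threshold value:

```python
def collate_imaging_angle(obtained, registered, threshold=5.0):
    """Match when the difference between the obtained and registered imaging
    angles is less than or equal to the threshold (degrees)."""
    return abs(obtained - registered) <= threshold
```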
Next, with reference to
As illustrated in
Subsequently, the face collation/verification unit 141 performs the collation/verification process of collating the obtained face information with the registered face information (step S401). Then, the environment collation/verification unit 143 performs the collation/verification process of collating the obtained environment information with the registered environment information (step S402). In the fourth example embodiment, the environment information corresponding to the imaging angle is not selected as in the third example embodiment. Thus, the environment collation/verification unit 143 performs the collation/verification process against a plurality of pieces of registered environment information.
Subsequently, the imaging angle collation/verification unit 144 performs the collation/verification process of collating the obtained imaging angle information with the registered imaging angle information (step S403). The imaging angle collation/verification unit 144 may determine whether or not the obtained imaging angle matches the imaging angle information associated with the face information or the environment information that is matched in the collation/verification process described above. The steps S401 to S403 may be performed before or after one another, or may be performed at the same time in parallel.
Then, the authentication execution unit 140 outputs the result of the authentication process (step S404). Specifically, the authentication execution unit 140 may output a result indicating that the authentication process is successful when the collation/verification process performed on the face information by the face collation/verification unit 141, the collation/verification process performed on the environment information by the environment collation/verification unit 143, and the collation/verification process performed on the imaging angle information by the imaging angle collation/verification unit 144 are all successful (i.e., matching). On the other hand, the authentication execution unit 140 may output a result indicating that the authentication process is failed when any of the collation/verification process performed on the face information by the face collation/verification unit 141, the collation/verification process performed on the environment information by the environment collation/verification unit 143, and the collation/verification process performed on the imaging angle information by the imaging angle collation/verification unit 144 is failed (i.e., not matching).
Next, a technical effect obtained by the authentication system 10 according to the fourth example embodiment will be described.
As described in
The authentication system 10 according to a fifth example embodiment will be described with reference to
First, with reference to
As illustrated in
The first imaging unit 210 and the second imaging unit 220 are configured as imaging units having different imaging ranges. Specifically, the first imaging unit 210 has a first imaging range. The second imaging unit 220 has a second imaging range that is different from the first imaging range. The first imaging unit 210 and the second imaging unit 220 are allowed to perform the imaging at the same time and to capture two images having different imaging ranges, for example. The first imaging unit 210 and the second imaging unit 220 may be configured as an in-camera and an out-camera provided by a common terminal (e.g., a smartphone), for example.
The target may be included in at least one of the image captured by the first imaging unit 210 and the image captured by the second imaging unit 220. That is, one of the image captured by the first imaging unit 210 and the image captured by the second imaging unit 220 may be an image including only an environment part.
Next, with reference to
As illustrated in
Subsequently, the face/environment acquisition unit 110 obtains the face information about the target from at least one of the first image and the second image (step S502). The face/environment acquisition unit 110 obtains the environment information from each of the first image and the second image (step S503). The face/environment acquisition unit 110 may obtain only the face information from the first image, and may obtain only the environment information from the second image. Then, the imaging angle acquisition unit 120 obtains the imaging angle information when the first image and the second image are captured (step S504). When a relative angle of the first imaging unit 210 and the second imaging unit 220 is fixed, the imaging angle acquisition unit 120 may obtain one imaging angle that is common to the first image and the second image.
Subsequently, the authentication execution unit 140 performs the authentication process, on the basis of the face information, the environment information, and the imaging angle information obtained from the first image and the second image, and the registered information (step S505). The authentication execution unit 140 determines whether or not the authentication is successful in both the first image and the second image (step S506). That is, the authentication execution unit 140 determines whether or not the result of the authentication process performed on the face information, the environment information, and the imaging angle information obtained from the first image, and the result of the authentication process performed on the face information, the environment information, and the imaging angle information obtained from the second image, are both successful.
When the authentication is successful in both the first image and the second image (step S506: YES), the authentication execution unit 140 outputs information indicating that the authentication process is successful (step S507). On the other hand, when the authentication is failed in at least one of the first image and the second image (step S506: NO), the authentication execution unit 140 outputs information indicating that the authentication process is failed (step S508).
Here, such a configuration that the image is obtained from the two imaging units that are the first imaging unit 210 and the second imaging unit 220 is exemplified, but the image may be obtained from three or more imaging units, for example. In this case, the same authentication process may be performed by using a plurality of images captured by the respective imaging units.
Next, a technical effect obtained by the authentication system 10 according to the fifth example embodiment will be described.
As described in
The authentication system 10 according to a sixth example embodiment will be described with reference to
First, with reference to
As illustrated in
The imaging angle difference calculation unit 145 is configured to calculate a difference in the imaging angle information corresponding to a plurality of images captured at different timings. Specifically, the imaging angle difference calculation unit 145 is configured to calculate a difference between the imaging angle information for the first image captured at a first timing and the imaging angle information for the second image captured at a second timing. The difference in the imaging angle information calculated here is information indicating how the user moves the camera 20 when the plurality of images are captured. For example, when the first image is captured at 90 degrees in the vertical direction and at 0 degrees in the lateral direction and the second image is captured at 80 degrees in the vertical direction and at 0 degrees in the lateral direction, the difference in the imaging angle information is calculated as 10 degrees in the vertical direction.
The authentication execution unit 140 according to the sixth example embodiment is configured to perform the authentication process by using the difference in the imaging angle information calculated as described above. The authentication execution unit 140 may determine whether or not the difference in the imaging angle information matches the registered information, for example. For example, the authentication execution unit 140 may determine whether or not the user moves the camera 20 as registered when the first image and the second image are captured. In this case, the authentication execution unit 140 may determine that the authentication is successful when the user moves the camera as registered (i.e., when the difference in the imaging angle information matches the registered information), and may determine that the authentication is failed when the user does not move the camera as registered (i.e., when the difference in the imaging angle information is different from the registered information). The authentication execution unit 140 may perform the authentication process by using the imaging angle information itself, in addition to the difference in the imaging angle information.
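The difference calculation and its comparison against the registered movement described above can be sketched, for example, as follows. The per-axis dictionary layout and the tolerance value are assumptions for illustration.

```python
def angle_difference(first, second):
    """Per-axis difference between the imaging angle information of the first
    image and that of the second image (degrees)."""
    return {axis: first[axis] - second[axis] for axis in first}

def movement_matches(calculated, registered, tolerance=3.0):
    """True when the calculated difference matches the registered difference
    within the tolerance on every axis."""
    return all(abs(calculated[axis] - registered[axis]) <= tolerance
               for axis in registered)

# First image captured at 90 degrees vertical, second image at 80 degrees vertical
first = {"vertical": 90.0, "lateral": 0.0}
second = {"vertical": 80.0, "lateral": 0.0}
diff = angle_difference(first, second)  # vertical difference of 10 degrees
```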
Next, with reference to
As illustrated in
Subsequently, the face/environment acquisition unit 110 obtains the face information about the target from the obtained first image and the obtained second image of the target (step S603). The face/environment acquisition unit 110 obtains the environment information from the obtained first image and the obtained second image of the target (step S604). Then, the imaging angle acquisition unit 120 obtains the imaging angle information when the first image and the second image of the target are captured (step S605). The steps S603 to S605 may be performed before or after one another, or may be performed at the same time in parallel.
Subsequently, the imaging angle difference calculation unit 145 calculates the difference between the imaging angle information about the first image and the imaging angle information about the second image (step S606). Then, the authentication execution unit 140 performs the authentication process, on the basis of the face information, the environment information, and the difference in the imaging angle information obtained from the images of the target, and the registered information (step S607). Then, the authentication execution unit 140 outputs the result of the authentication process (step S608).
Next, a technical effect obtained by the authentication system 10 according to the sixth example embodiment will be described.
As described in
The authentication system 10 according to a seventh example embodiment will be described with reference to
First, with reference to
As illustrated in
The time series change calculation unit 146 is configured to calculate a change in the environment information and a change in the imaging angle information, on the basis of a plurality of time series images (typically, images of respective frames of a video) captured in a time series. Specifically, the time series change calculation unit 146 may calculate how the environment information in the time series images is changed (e.g., how the background information is changed over time). The time series change calculation unit 146 may calculate how the imaging angle information in the time series images is changed (e.g., how the angle of the camera 20 is changed over time).
The authentication execution unit 140 according to the seventh example embodiment performs the authentication process by using the change in the environment information and the change in the imaging angle information. For example, the authentication execution unit 140 may determine whether or not the background of the images is changed correctly in accordance with the movement of the camera 20. More specifically, the authentication execution unit 140 may determine whether an object that was on a right side appears in the images when the camera is moved to the right. The authentication execution unit 140 may determine whether or not the imaging angle is as expected when the background is changed. More specifically, when the background is gradually changed, the authentication execution unit 140 may determine whether or not the imaging angle information is correctly changed from 88 degrees to 89 degrees and to 90 degrees (in other words, whether the angle is not an abnormal value, or whether the angle is not abruptly changed).
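The two consistency checks described above can be sketched as follows. The helper names, the valid angle range, the maximum per-frame angle step, and the sign convention for background shift are assumptions made for the sketch, not part of the disclosure.

```python
# Illustrative sketch of the seventh example embodiment's consistency checks:
# (1) the imaging angle changes gradually (no abnormal values, no abrupt
# jumps) and (2) the background shifts consistently with the camera movement.
# Thresholds and conventions are assumptions.

MAX_STEP = 2.0              # assumed max plausible angle change per frame (deg)
VALID_RANGE = (0.0, 180.0)  # assumed range of valid imaging angles

def angle_series_plausible(angles):
    """True when every imaging angle is in range and changes gradually."""
    if any(not (VALID_RANGE[0] <= a <= VALID_RANGE[1]) for a in angles):
        return False  # abnormal value
    return all(abs(b - a) <= MAX_STEP for a, b in zip(angles, angles[1:]))

def background_consistent(camera_pans, background_shifts):
    """True when the background shifts opposite to each camera pan (panning
    right brings objects on the right into view; image content shifts left)."""
    return all(p * s < 0
               for p, s in zip(camera_pans, background_shifts) if p != 0)

print(angle_series_plausible([88.0, 89.0, 90.0]))   # True: gradual change
print(angle_series_plausible([88.0, 120.0, 90.0]))  # False: abrupt jump
print(background_consistent([1.0], [-3.0]))          # True: pan right, shift left
```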
Next, with reference to
As illustrated in
Subsequently, the face/environment acquisition unit 110 obtains the face information about the target from the obtained time series images (step S702). In addition, the face/environment acquisition unit 110 obtains the environment information from the obtained time series images (step S703). Then, the imaging angle acquisition unit 120 obtains the imaging angle information when each of the time series images is captured (step S704). The steps S702 to S704 may be performed in any order, or may be performed in parallel at the same time.
Subsequently, the time series change calculation unit 146 calculates the change in the time series images (specifically, the change in the environment information and the change in the imaging angle information) (step S705). The authentication execution unit 140 performs the authentication process, on the basis of the face information, the change in the environment information, and the change in the imaging angle information obtained from the images of the target, and the registered information (step S706). Then, the authentication execution unit 140 outputs the result of the authentication process (step S707).
Next, a technical effect obtained by the authentication system 10 according to the seventh example embodiment will be described.
As described in
The authentication system 10 according to an eighth example embodiment will be described with reference to
First, with reference to
As illustrated in
The position acquisition unit 150 is configured to obtain information about a position where the image of the target is captured (hereinafter referred to as "position information" as appropriate). The position acquisition unit 150 may be configured to obtain the position information about a terminal that captures the image, for example. The position acquisition unit 150 may be configured to obtain the position information by using a GPS (Global Positioning System), for example. The position information obtained by the position acquisition unit 150 is outputted to the authentication execution unit 140.
Next, with reference to
As illustrated in
In the eighth example embodiment, furthermore, the position acquisition unit 150 obtains the position information when the image of the target is captured (step S701). The process of obtaining the position information may be performed before or after the processes of obtaining the other information (i.e., the steps S101 to S103), or may be performed in parallel at the same time.
Subsequently, the authentication execution unit 140 performs the authentication process, on the basis of the obtained face information, the obtained environment information, the obtained imaging angle information, the obtained position information, and the registered information (step S702). The authentication execution unit 140 outputs the result of the authentication process (step S703).
The registered information according to this example embodiment includes the position information in addition to the face information, the environment information, and the imaging angle information. Therefore, in the authentication process, a collation/verification process of collating the obtained position information with the registered position information may be performed. Alternatively, the position information may be used as information for narrowing the other information (i.e., the face information, the environment information, and the imaging angle information) (e.g., as information for selecting the information that is used for the collation/verification, such as the imaging angle information in the third example embodiment).
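The narrowing use of the position information described above can be sketched as follows. The entry format, the coordinate representation (a plain latitude/longitude pair with a squared-degree tolerance, with no geodesic distance handling), and all names are assumptions introduced for illustration.

```python
# Illustrative sketch of using position information to narrow down the
# registered information: select the registered entry whose stored position
# is closest to the obtained position, within an assumed tolerance.

POSITION_TOLERANCE = 0.001  # assumed tolerance in degrees of latitude/longitude

def select_registered_entry(position, registered_entries):
    """Return the registered entry nearest the obtained position, or None
    when no entry is within tolerance."""
    def dist2(entry):
        lat, lon = entry["position"]
        return (lat - position[0]) ** 2 + (lon - position[1]) ** 2
    best = min(registered_entries, key=dist2)
    return best if dist2(best) <= POSITION_TOLERANCE ** 2 else None

entries = [
    {"position": (35.6895, 139.6917), "environment": "home-entrance"},
    {"position": (35.4437, 139.6380), "environment": "office-lobby"},
]
entry = select_registered_entry((35.6895, 139.6917), entries)
print(entry["environment"])  # home-entrance
```

Once an entry is selected this way, only its environment information and imaging angle information need to be collated, which is the narrowing role the embodiment describes.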
Next, a technical effect obtained by the authentication system 10 according to the eighth example embodiment will be described.
As described in
The authentication system 10 according to a ninth example embodiment will be described with reference to
First, with reference to
As illustrated in
The notification unit 160 is configured to notify the target to be authenticated to change a condition of capturing the image. More specifically, the notification unit 160 is configured to make a notification when the obtained face information matches the registered face information (i.e., the face collation/verification is successful), but the obtained environment information or the obtained imaging angle information does not match the registered environment information or the registered imaging angle information (i.e., the environment collation/verification or the imaging angle collation/verification is not successful) in the authentication process by the authentication execution unit 140. The notification unit 160 may make a notification by using a display, a speaker, or the like provided by the output apparatus 16 (see
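The decision condition for making the notification can be sketched as follows; the function and parameter names are assumptions, and the sketch covers only the decision, not the display or speaker output.

```python
# Illustrative decision logic for the notification unit 160: notify only when
# the face collation succeeds but the environment collation or the imaging
# angle collation fails. Names are assumptions, not part of the disclosure.

def should_notify(face_ok, env_ok, angle_ok):
    """True when the target should be asked to change the capture condition."""
    return face_ok and not (env_ok and angle_ok)

print(should_notify(True, True, False))   # True: angle mismatch, so notify
print(should_notify(True, True, True))    # False: authentication succeeded
print(should_notify(False, True, True))   # False: the face itself did not match
```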
Next, with reference to
In the example illustrated in
In the example illustrated in
In the example illustrated in
Next, a technical effect obtained by the authentication system 10 according to the ninth example embodiment will be described.
As described in
The authentication system 10 according to a tenth example embodiment will be described with reference to
First, with reference to
As illustrated in
The stereoscopic determination unit 170 is configured to determine whether or not the face of the target is stereoscopic/three-dimensional (i.e., whether the face is a real face rather than a face on a plane such as a paper sheet or a display). Specifically, the stereoscopic determination unit 170 is configured to determine whether or not the face of the target is stereoscopic/three-dimensional, by using the face information obtained before the notification by the notification unit 160 (in other words, the face information obtained from the image captured before the notification by the notification unit 160) and the face information obtained after the notification by the notification unit 160 (in other words, the face information obtained from the image captured after the notification by the notification unit 160).
In particular, the image captured before the notification (hereinafter referred to as an “image before the notification” as appropriate) and the image captured after the notification (hereinafter referred to as an “image after the notification” as appropriate) are captured at different angles due to the notification. The stereoscopic determination unit 170 determines whether or not the face of the target is stereoscopic/three-dimensional, by using the image before the notification and the image after the notification in which the imaging angles are different. The stereoscopic determination unit 170 may be configured to determine whether or not the face of the target is stereoscopic/three-dimensional, by using a plurality of images captured from different angles, even if they are not the image before the notification and the image after the notification.
Next, a flow of operation of the stereoscopic determination unit 170 will be described with reference to
As illustrated in
Subsequently, the stereoscopic determination unit 170 determines whether or not the face of the target is stereoscopic/three-dimensional, on the basis of the face information for the image before the notification and the face information for the image after the notification (step S903). The stereoscopic determination unit 170 outputs a determination result (i.e., information indicating whether or not the face of the target is stereoscopic/three-dimensional) (step S904).
The determination result by the stereoscopic determination unit 170 may be outputted to the authentication execution unit 140, for example. In this case, the authentication execution unit 140 may perform the authentication process, on the basis of the determination result by the stereoscopic determination unit 170. The determination result by the stereoscopic determination unit 170 may be outputted as another information that is separate from the authentication result of the authentication execution unit 140. In this case, the determination result by the stereoscopic determination unit 170 may be outputted from the output apparatus 16 (e.g., a display, a speaker, or the like).
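One way the determination in the steps S903 and S904 could work is a parallax check: when the imaging angle changes, landmarks on a real (three-dimensional) face move by different amounts, while landmarks on a flat photo move almost uniformly. The following sketch illustrates that idea only; the landmark format, the variance measure, and the threshold are all assumptions, not the disclosure's method.

```python
# Illustrative parallax sketch of the stereoscopic determination: compare
# per-landmark horizontal displacement between the image before the
# notification and the image after it. A flat photo shifts uniformly (low
# variance); a real face shows depth-dependent parallax (high variance).

PARALLAX_THRESHOLD = 1.0  # assumed variance threshold in pixels^2

def is_stereoscopic(landmarks_before, landmarks_after):
    """True when per-landmark displacement varies enough to indicate depth."""
    dx = [a[0] - b[0] for b, a in zip(landmarks_before, landmarks_after)]
    mean = sum(dx) / len(dx)
    variance = sum((d - mean) ** 2 for d in dx) / len(dx)
    return variance > PARALLAX_THRESHOLD

flat_before = [(10, 20), (30, 20), (20, 35)]
flat_after  = [(15, 20), (35, 20), (25, 35)]   # uniform 5 px shift: a photo
solid_after = [(15, 20), (35, 20), (29, 35)]   # nose moved more: a real face
print(is_stereoscopic(flat_before, flat_after))   # False
print(is_stereoscopic(flat_before, solid_after))  # True
```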
Next, a technical effect obtained by the authentication system 10 according to the tenth example embodiment will be described.
As described in
A processing method in which a program for allowing the configuration of each of the example embodiments to operate to realize the functions of each example embodiment is recorded on a recording medium, and in which the program recorded on the recording medium is read as code and executed on a computer, is also included in the scope of each of the example embodiments. That is, a computer-readable recording medium is also included in the scope of each of the example embodiments. Not only the recording medium on which the above-described program is recorded, but also the program itself, is included in each example embodiment.
The recording medium to use may be, for example, a floppy disk (registered trademark), a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a magnetic tape, a nonvolatile memory card, or a ROM. Furthermore, not only the program that is recorded on the recording medium and executes processing alone, but also the program that operates on an OS and executes processing in cooperation with the functions of expansion boards and other software, is included in the scope of each of the example embodiments.
This disclosure is not limited to the examples described above and is allowed to be changed, if desired, without departing from the essence or spirit of this disclosure which can be read from the claims and the entire specification. An authentication system, an authentication apparatus, an authentication method, and a recording medium with such changes are also intended to be within the technical scope of this disclosure.
The example embodiments described above may be further described as, but not limited to, the following Supplementary Notes below.
An authentication system according to Supplementary Note 1 is an authentication system including: a first acquisition unit that obtains face information and environment information of a target, from an image including the target; a second acquisition unit that obtains imaging angle information that is information indicating an angle when the image is captured; and an authentication execution unit that performs an authentication process on the target, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information.
An authentication system according to Supplementary Note 2 is the authentication system according to Supplementary Note 1, further including a storage unit that stores the face information, the environment information, and the imaging angle information that are obtained by performing a plurality of times of imaging by changing an imaging angle, in advance as registered information, wherein the storage unit stores a plurality of pieces of environment information and a plurality of pieces of imaging angle information in association with each other, and the authentication execution unit selects the environment information corresponding to the obtained imaging angle information, from the registered information, and determines that the authentication process is successful when the obtained environment information matches the selected environment information and the obtained face information matches registered face information.
An authentication system according to Supplementary Note 3 is the authentication system according to Supplementary Note 1 or 2, wherein the first acquisition unit obtains a plurality of pieces of face information from a first imaging unit having a first imaging range, and obtains a plurality of pieces of environment information from a second imaging unit having a second imaging range that is different from the first imaging range, the second acquisition unit obtains the imaging angle information when the first imaging unit and the second imaging unit capture the image, and the authentication execution unit performs the authentication process, on the basis of the face information and the environment information obtained from the first acquisition unit, and the imaging angle information obtained from the second acquisition unit.
An authentication system according to Supplementary Note 4 is the authentication system according to any one of Supplementary Notes 1 to 3, wherein the first acquisition unit obtains the face information and the environment information from a first image captured at a first timing and a second image captured at a second timing, the second acquisition unit obtains the imaging angle information when the first image and the second image are captured, and the authentication execution unit performs the authentication process, on the basis of the face information and the environment information obtained from the first image and the second image, and a difference between the imaging angle information for the first image and the imaging angle information for the second image.
An authentication system according to Supplementary Note 5 is the authentication system according to any one of Supplementary Notes 1 to 3, wherein the first acquisition unit obtains the face information and the environment information from a plurality of time series images captured in a time series, the second acquisition unit obtains the imaging angle information when each of the time series images is captured, and the authentication execution unit performs the authentication process, on the basis of the face information, a change in the environment information in the time series, and a change in the imaging angle information in the time series.
An authentication system according to Supplementary Note 6 is the authentication system according to any one of Supplementary Notes 2 to 5, further including a third acquisition unit that obtains position information when the image is captured, wherein the storage unit stores the position information as the registered information, in addition to the face information, the environment information, and the imaging angle information, and the authentication execution unit performs the authentication process, on the basis of the obtained face information, the obtained environment information, the obtained imaging angle information, the obtained position information, and the registered information.
An authentication system according to Supplementary Note 7 is the authentication system according to any one of Supplementary Notes 1 to 6, further including a notification unit that notifies the target to change a condition of capturing the image, when the obtained face information matches registered face information, but the obtained environment information or the obtained imaging angle information does not match registered environment information or registered imaging angle information in the authentication process.
An authentication system according to Supplementary Note 8 is the authentication system according to Supplementary Note 7, further including a stereoscopic determination unit that determines whether or not a face of the target is stereoscopic/three-dimensional, by using the face information obtained before the target is notified by the notification unit and the face information obtained after the target is notified by the notification unit.
An authentication apparatus according to Supplementary Note 9 is an authentication apparatus including: a first acquisition unit that obtains face information and environment information of a target, from an image including the target; a second acquisition unit that obtains imaging angle information that is information indicating an angle when the image is captured; and an authentication execution unit that performs an authentication process on the target, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information.
An authentication method according to Supplementary Note 10 is an authentication method including: obtaining face information and environment information of a target, from an image including the target; obtaining imaging angle information that is information indicating an angle when the image is captured; and performing an authentication process on the target, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information.
A recording medium according to Supplementary Note 11 is a recording medium on which a computer program that allows a computer to execute an authentication method is recorded, the authentication method including: obtaining face information and environment information of a target, from an image including the target; obtaining imaging angle information that is information indicating an angle when the image is captured; and performing an authentication process on the target, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information.
A computer program according to Supplementary Note 12 is a computer program that allows a computer to execute an authentication method, the authentication method including: obtaining face information and environment information of a target, from an image including the target; obtaining imaging angle information that is information indicating an angle when the image is captured; and performing an authentication process on the target, on the basis of the obtained face information, the obtained environment information, and the obtained imaging angle information.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2021/025990 | 7/9/2021 | WO |