The present invention relates to an information processing device, an information processing system, an information processing method, and a program.
PTL 1 discloses an entry and exit management system provided with an authentication server that collates input biological information with biological information on a registered user to perform identity confirmation, and unlocks a gate on the basis of authentication permission from the authentication server.
[PTL 1] JP 6246403 B
In the system exemplified in PTL 1, a face feature amount of a person captured in an authentication area is collated with face feature amounts of N (an integer of 2 or more) registrants stored in advance in a database in order. For this reason, if the number N of elements of a registrant population increases, authentication accuracy and authentication speed in face authentication may decrease.
Therefore, in view of the above-described problem, an object of the present invention is to provide an information processing device, an information processing system, an information processing method, and a program capable of improving the authentication accuracy and the authentication speed in the face authentication.
According to one aspect of the present invention, an information processing device is provided, which includes an acquisition unit configured to extract, from a registered biological information group including biological information of a plurality of registrants, a first biological information group including biological information on a first person detected from a first image obtained by capturing an object at a first distance; and a collation unit configured to collate biological information on a second person detected from a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image with the biological information included in the first biological information group.
According to another aspect of the present invention, an information processing device is provided, which includes a storage unit configured to store a registered biological information group including biological information of a plurality of registrants; a first collation unit configured to collate biological information on a first person detected from a first image obtained by capturing an object at a first distance with biological information included in the registered biological information group; a specifying unit configured to extract a first biological information group including the biological information on the first person from the registered biological information group based on a collation result in the first collation unit; and a second collation unit configured to collate biological information on a second person detected from a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image with biological information included in the first biological information group.
According to still another aspect of the present invention, an information processing system is provided, which includes a first server configured to collate a registered biological information group including biological information of a plurality of registrants with biological information on a first person detected from a first image obtained by capturing an object at a first distance, and extract a first biological information group including the biological information on the first person from the registered biological information group; and a second server configured to collate biological information on a second person detected from a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image with biological information included in the first biological information group.
According to still another aspect of the present invention, an information processing method is provided, which includes a step of acquiring a first image obtained by capturing an object at a first distance; a step of extracting, from a registered biological information group including biological information of a plurality of registrants, a first biological information group including biological information on a first person detected from the first image; a step of acquiring a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image; and a step of collating biological information on a second person detected from the second image with the biological information included in the first biological information group.
According to still another aspect of the present invention, a program is provided, which causes a computer to execute: processing of acquiring a first image obtained by capturing an object at a first distance; processing of extracting, from a registered biological information group including biological information of a plurality of registrants, a first biological information group including biological information on a first person detected from the first image; processing of acquiring a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image; and processing of collating biological information on a second person detected from the second image with the biological information included in the first biological information group.
According to the present invention, an information processing device, an information processing system, an information processing method, and a program capable of improving the authentication accuracy and the authentication speed in the face authentication can be provided.
Hereinafter, example embodiments of the present invention will be described with reference to the drawings. Similar elements or corresponding elements may be designated by the same reference signs in the drawings, and description thereof may be omitted or simplified.
A configuration of an information processing system 1 according to the present example embodiment will be described with reference to
The center server 10 is an information processing device (first server) that centrally manages information on a base.
Note that the phrase “biological information” in the present example embodiment means the face image and a feature amount extracted from the face image. The face image of the registrant is obtained by, for example, uploading an image file from a user when the user performs member registration online. Furthermore, the center server 10 and the relay server 40 each have a function to detect the biological information on a person from a received captured image. The feature amount (also referred to as a “face feature amount”) extracted from the face image can be, for example, an amount indicating a feature of the face, such as the positions of characteristic parts including the pupils, the nose, and the mouth ends.
In the present example embodiment, the description will be given on the assumption that the camera 20 captures the main authentication image at the timing when approach (or entry) of the user is detected by a sensor provided in or in the vicinity of the gate device 50. With such a configuration, the main authentication image can be obtained at the position where the camera 20 and the object are separated by the distance t2. Similarly, the pre-authentication image may be captured at the position where the camera 20 and the object are separated by the distance t1, by causing the camera 20 to capture the image in response to detection by a sensor installed at a position separated from the gate device 50 by a predetermined distance.
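The disclosure does not prescribe a concrete representation of the face feature amount or a particular scoring rule for collation. The following is a minimal sketch, assuming that a feature amount is held as a fixed-length numeric vector and that the collation score is a cosine similarity normalized to the range 0 to 1; the names FaceFeature and collation_score are illustrative only.

```python
import math
from dataclasses import dataclass
from typing import List

@dataclass
class FaceFeature:
    """Illustrative container for one person's biological information:
    a registrant identifier plus a fixed-length face feature vector."""
    registrant_id: str
    vector: List[float]  # e.g. numeric encoding of pupil, nose, and mouth-end positions

def collation_score(a: List[float], b: List[float]) -> float:
    """Assumed scoring rule: cosine similarity mapped to the range [0, 1].
    The actual scoring method is implementation dependent."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    if norm == 0.0:
        return 0.0
    return (dot / norm + 1.0) / 2.0
```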
Furthermore, as illustrated in
The camera 20 is a capture device that captures the object located at the distance t1 and the distance t2 and generates the first image (pre-authentication image) and the second image (main authentication image).
As illustrated in
The center server 10 collates a registered biological information group including biological information of a plurality of registrants with biological information on a person (hereinafter referred to as a “first person”) detected from the first image (pre-authentication image) on the basis of a request from the relay server 40, and extracts a first biological information group including the biological information on the first person from the registered biological information group. Then, the center server 10 transmits the first biological information group narrowed down from the registered biological information group (registrant information) to the relay server 40 as candidate information.
The relay server 40 is an information processing device (second server) that performs face authentication processing for a person in each base. The relay server 40 collates biological information on a person (hereinafter referred to as a “second person”) detected from the second image (main authentication image) captured in the main authentication image capturing area with the biological information included in the first biological information group. The relay server 40 includes a candidate information database 41 (hereinafter referred to as a “candidate information DB”) that stores the candidate information (first biological information group) received from the center server 10.
The gate device 50 is a passage restriction device installed at the boundary between the restricted area and the non-restricted area. The gate device 50 is not limited to a passage restriction device such as an automatic ticket gate, and may be an automatic door-type passage restriction device. Furthermore, the gate device 50 may not have a passage restriction function, and may be, for example, a device having a function to display a result (passable or impassable) of person authentication by the relay server 40.
Note that, in
The CPU 151 is a processor that performs predetermined operations according to a program stored in the ROM 153, the HDD 154, or the like and has a function to control each unit of the center server 10. The RAM 152 includes a volatile storage medium and provides a temporary memory area necessary for the operation of the CPU 151. The ROM 153 includes a non-volatile storage medium, and stores necessary information such as a program used for the operation of the center server 10. The HDD 154 is a storage device including a non-volatile storage medium, and stores data necessary for processing, an operation program of the center server 10, and the like.
The communication I/F 155 is a communication interface based on standards such as Ethernet (registered trademark), Wi-Fi (registered trademark), 4G, or 5G, and is a module for communicating with other devices. The display device 156 is a liquid crystal display, an organic light-emitting diode (OLED) display, or the like, and is used for displaying an image, a character, an interface, or the like. The input device 157 is a keyboard, a pointing device, or the like, and is used by the user to operate the information processing system 1. Examples of the pointing device include a mouse, a track ball, a touch panel, and a pen tablet. The display device 156 and the input device 157 may be integrally formed as a touch panel.
Note that functions of a CPU 451, a RAM 452, a ROM 453, an HDD 454, a communication I/F 455, a display device 456, an input device 457, and a bus 458 included in the relay server 40 are similar to those of the CPU 151, the RAM 152, the ROM 153, the HDD 154, the communication I/F 155, the display device 156, the input device 157, and the bus 158 of the center server 10, and thus description thereof is omitted.
Furthermore, the hardware configuration illustrated in
The CPU 151 of the center server 10 loads a program stored in the ROM 153, the HDD 154, or the like into the RAM 152 and executes the program. As a result, the CPU 151 of the center server 10 implements the functions of the first collation unit 102 and the candidate information output unit 103. Moreover, the CPU 151 of the center server 10 implements the function of the first storage unit 101 by controlling the HDD 154. In the present example embodiment, the registrant information database 11 corresponds to the first storage unit 101.
Similarly, the CPU 451 of the relay server 40 loads a program stored in the ROM 453, the HDD 454, or the like into the RAM 452 and executes the program. As a result, the CPU 451 of the relay server 40 implements the functions of the candidate information acquisition unit 401, the image acquisition unit 403, the feature amount calculation unit 404, the second collation unit 405, the determination unit 406, the gate control unit 407, and the candidate information deletion unit 408. Processing performed in each unit will be described below. Moreover, the CPU 451 of the relay server 40 implements the function of the second storage unit 402 by controlling the HDD 454. In the present example embodiment, the candidate information database 41 corresponds to the second storage unit 402.
Next, functions and effects of the information processing system 1 according to the present example embodiment will be described with reference to
First, the image acquisition unit 403 of the relay server 40 causes the camera 20 to capture the object located in the pre-authentication capturing area (step S101) and acquires the captured image (first image; pre-authentication image) (step S102). Next, the feature amount calculation unit 404 of the relay server 40 calculates first feature amounts of one or more detected persons (first persons) included in the acquired captured image, and requests the center server 10 to perform collation (step S103).
Next, the first collation unit 102 of the center server 10 collates the first feature amount of the detected person received from the relay server 40 with the feature amounts of the registrants stored in the registrant information DB 11 (step S104). Next, the first collation unit 102 extracts, as a candidate, any registrant whose collation score with the first feature amount of the detected person is equal to or larger than a predetermined threshold (determination reference value) (step S105). Note that the number of candidates extracted by the center server 10 is not limited to one. In a case where a plurality of registrants has a collation score equal to or larger than the predetermined threshold (determination reference value), the plurality of registrants may be extracted.
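The candidate extraction in steps S104 and S105 can be sketched as a simple filter over the registrant information DB 11. The code below reuses the illustrative FaceFeature and collation_score helpers introduced earlier; the threshold of 0.80 is an arbitrary example rather than a value taken from the disclosure.

```python
def extract_candidates(detected_features, registrants, threshold=0.80):
    """Sketch of steps S104-S105: collate each detected first feature amount
    against every registrant and keep every registrant whose collation score
    is equal to or larger than the determination reference value."""
    candidates = []
    for detected in detected_features:       # one entry per first person in the pre-authentication image
        for registrant in registrants:       # registrant information DB 11
            score = collation_score(detected.vector, registrant.vector)
            if score >= threshold and registrant not in candidates:
                candidates.append(registrant)   # zero, one, or several registrants may qualify
    return candidates                        # candidate information to be sent to the relay server (step S106)
```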
Next, the candidate information output unit 103 of the center server 10 transmits the candidate information on the person extracted from the registrants by the collation processing to the relay server 40 (step S106). Next, the candidate information acquisition unit 401 of the relay server 40 stores the candidate information received from the center server 10 in the candidate information database 41 that is the second storage unit 402 (step S107).
The image acquisition unit 403 of the relay server 40 causes the camera 20 to capture an image of the object located in the main authentication capturing area (step S108) and acquires the captured image (second image; main authentication image) (step S109). Next, the feature amount calculation unit 404 of the relay server 40 calculates a second feature amount of the detected person (second person) included in the received captured image (second image) (step S110). Then, the second collation unit 405 of the relay server 40 collates the second feature amount with the first feature amount of the candidate (step S111).
Next, the determination unit 406 of the relay server 40 determines whether the collation score between the second feature amount of the detected person and the first feature amount of the candidate is equal to or larger than the predetermined threshold (determination reference value) (step S112). Here, in a case of determining that the collation score of the feature amount is equal to or larger than the threshold (step S112: YES), the determination unit 406 of the relay server 40 specifies the candidate having the highest collation score of the feature amount from among the corresponding candidates and authenticates the detected person (step S113). That is, the two feature amounts are considered to match. Next, the gate control unit 407 of the relay server 40 transmits gate control information to the gate device 50 (step S114).
When opening a gate on the basis of the gate control information received from the relay server 40 (step S115), the gate device 50 transmits status information indicating completion of the gate opening to the relay server 40 (step S116).
When determining that the collation score of the feature amount is less than the threshold (step S112: NO), the determination unit 406 of the relay server 40 determines that the detected person (second person) is impassable (step S118) and terminates the processing. In this case, the closed state is maintained in the gate device 50.
In step S117, the candidate information deletion unit 408 of the relay server 40 deletes the candidate information on the person who has passed through the gate from the candidate information database 41, and terminates the processing. Note that, in the above-described processing, the description has been given on the assumption that an initial state of the gate device 50 is the closed state, but the initial state may be an open state. In this case, when it is determined that the collation score is less than the threshold, the gate device 50 may be controlled from the open state to the closed state. Furthermore, in the case where the collation score is less than the threshold value, an alert may be output from the gate device 50 or the like by voice, light, text, or the like, for example, instead of controlling the opening/closing operation of the gate device 50.
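The determination in steps S111 to S113 (together with the impassable branch of step S118) can be sketched as a one-to-few search over the candidate information DB 41: the stored candidate with the highest collation score is taken, and authentication succeeds only when that score reaches the threshold. As before, collation_score is the illustrative helper defined earlier and the threshold is an assumed value, not one specified by the disclosure.

```python
def authenticate(second_feature, candidates, threshold=0.80):
    """Sketch of steps S111-S113: collate the second feature amount with each
    stored candidate and authenticate the detected person as the candidate
    having the highest collation score, provided that score reaches the threshold."""
    best, best_score = None, -1.0
    for candidate in candidates:             # candidate information DB 41 (first biological information group)
        score = collation_score(second_feature.vector, candidate.vector)
        if score > best_score:
            best, best_score = candidate, score
    if best is not None and best_score >= threshold:
        return best                          # matched: transmit gate control information (step S114)
    return None                              # below threshold: determined impassable (step S118)
```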
In the case where the first person detected from the captured image (first image) captured in the pre-authentication capturing area is registered as a registrant, the information processing system 1 in the present example embodiment adopts that person as a candidate for the collation processing with the person detected from the captured image (second image) captured in the main authentication capturing area. More specifically, the information processing system 1 specifies the first person from among a plurality of registrants, and transmits the candidate information thereof to the relay server 40. Then, the final authentication is performed by comparing the feature amounts between the candidate narrowed down in the pre-authentication capturing area and the person detected from the captured image in the main authentication capturing area. That is, since the information processing system has a configuration in which the final authentication is performed after the number of persons belonging to the population N of one-to-N authentication is significantly reduced, the authentication accuracy and the authentication speed in the final authentication can be significantly improved.
Furthermore, as illustrated in step S117 of
Hereinafter, an information processing system 2 according to a second example embodiment will be described. Note that the same reference signs as those assigned in the drawings of the first example embodiment represent the same objects. Therefore, description of portions common to the first example embodiment will be omitted, and different portions will be described in detail.
Next, functions and effects of the information processing system 2 according to the present example embodiment will be described with reference to
First, an image acquisition unit 403 of the relay server 40 causes a camera 20 to capture an object located in a pre-authentication capturing area (step S201) and acquires a captured image (first image; pre-authentication image) (step S202). Next, a feature amount calculation unit 404 of the relay server 40 calculates first feature amounts of one or more detected persons (first persons) included in the acquired captured image, and requests the center server 10 to perform collation (step S203).
Next, a first collation unit 102 of the center server 10 collates the first feature amount of the detected person with the feature amounts of the registrants stored in a registrant information DB 11 (step S204). Next, the first collation unit 102 extracts, as a candidate, any registrant whose collation score with the first feature amount of the detected person is equal to or larger than a predetermined threshold (determination reference value) (step S205). Note that the number of candidates extracted by the center server 10 is not limited to one. In a case where a plurality of registrants has a collation score equal to or larger than the predetermined threshold (determination reference value), the plurality of registrants may be extracted.
Next, a candidate information output unit 103 of the center server 10 transmits candidate information on the person extracted from the registrants by the collation processing to the relay server 40 (step S206). Next, a candidate information acquisition unit 401 of the relay server 40 stores the candidate information received from the center server 10 in a candidate information database 41 that is a second storage unit 402 (step S207).
Next, the balance information acquisition unit 409 of the relay server 40 transmits member authentication information on the extracted person to the external server 60 (step S208). The member authentication information can be acquired from the center server 10 together with, for example, the candidate information. The external server 60 acquires the balance information on the electronic money of the corresponding person from the member information database 61 on the basis of the member authentication information received from the relay server 40, and returns the information to the relay server 40 (step S209).
The image acquisition unit 403 of the relay server 40 causes the camera 20 to capture an image of the object located in a main authentication capturing area (step S210) and acquires the captured image (second image; main authentication image) (step S211). Next, the feature amount calculation unit 404 of the relay server 40 calculates a second feature amount of the detected person (second person) included in the received captured image (second image) (step S212). Then, a second collation unit 405 of the relay server 40 collates the second feature amount with the first feature amount of the candidate (step S213).
Next, a determination unit 406 of the relay server 40 determines whether the score of collation of the second feature amount of the detected person with the first feature amount of the candidate is equal to or larger than the predetermined threshold (determination reference value) (step S214). Here, in a case of determining that the collation score of the feature amount is equal to or larger than the threshold (step S214: YES), the determination unit 406 of the relay server 40 specifies the candidate having the highest collation score of the feature amount from among the corresponding candidates and authenticates the detected person (step S215). Thereafter, the processing proceeds to step S216.
On the other hand, in a case of determining that the collation score of the feature amount is less than the threshold (step S214: NO), the determination unit 406 of the relay server 40 determines that the second person is impassable (step S223) and terminates the processing.
In step S216, the determination unit 406 of the relay server 40 refers to the balance information on the first person and determines whether the balance is equal to or larger than a charge amount. Here, in a case where the determination unit 406 of the relay server 40 determines that the balance is equal to or larger than the charge amount (step S216: YES), the relay server 40 transmits charging information to the external server 60 (step S217). Upon receiving the charging information, the external server 60 updates the balance information included in the member information in the member information database 61 (step S218).
Furthermore, a gate control unit 407 of the relay server 40 transmits gate control information to a gate device 50 (step S219) in parallel with the processing of step S217.
When opening a gate on the basis of the gate control information received from the relay server 40 (step S220), the gate device 50 transmits status information indicating completion of the gate opening to the relay server 40 (step S221).
In a case of determining that the electronic money balance is less than the charge amount (step S216: NO), the determination unit 406 of the relay server 40 determines that the second person is impassable (step S223) and terminates the processing. In this case, the closed state is maintained in the gate device 50.
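Steps S214 to S218 and S223 combine the biometric result with a balance condition. The following is a minimal sketch of that combined decision, assuming the balance and the charge amount are plain integer amounts; the function name and return convention are illustrative.

```python
def settle_and_decide(face_matched: bool, balance: int, charge: int):
    """Sketch of the second-embodiment gate decision:
    - face collation failed (step S214: NO)            -> impassable
    - balance below the charge amount (step S216: NO)  -> impassable
    - otherwise charge the balance (steps S217-S218)   -> passable, gate is opened (steps S219-S220)
    Returns (passable, updated_balance)."""
    if not face_matched:
        return False, balance
    if balance < charge:
        return False, balance
    return True, balance - charge

# Illustrative use: a matched person with a balance of 500 and a charge of 300 may pass.
print(settle_and_decide(True, 500, 300))   # (True, 200)
print(settle_and_decide(True, 200, 300))   # (False, 200)
```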
In step S222, a candidate information deletion unit 408 of the relay server 40 deletes the candidate information on the person who has passed through the gate from the candidate information DB 41, and terminates the processing. In the sequence diagram of
As described above, the information processing system 2 according to the present example embodiment is configured to determine whether a person is passable to a restricted area in consideration of not only the collation result of the face feature amount but also the balance information of the candidate acquired from the external server 60 as a cooperation partner, and thus can determine a person who can use the service. Furthermore, since the relay server 40 already holds the balance information of the candidate before an image is captured by the camera 20 installed in the vicinity of the gate device 50, the relay server 40 can immediately open the gate in the case of determining that the collation score between the second feature amount obtained from the image captured by the camera 20 and the first feature amount is equal to or larger than the threshold.
Note that, in the above description, it has been described that the relay server 40 inquires of the external server 60 about the balance information of the member, but the center server 10 that has specified the candidate may perform this inquiry. In this case, in step S206, the center server 10 transmits, to the relay server 40, the balance information acquired from the external server 60 in addition to the candidate information. Therefore, steps S208 and S209 can be omitted.
Hereinafter, an information processing system 3 according to a third example embodiment will be described. Note that the same reference signs as those assigned in the drawings of the first example embodiment represent the same objects. Therefore, description of portions common to the first example embodiment will be omitted, and different portions will be described in detail.
Next, functions and effects of the information processing system 3 according to the present example embodiment will be described with reference to
First, when receiving the candidate information from the center server 10 (step S301), a candidate information acquisition unit 401 of the relay server 40 refers to a candidate information DB 41 (second storage unit 402) and determines whether a first feature amount included in the candidate information is a registered feature amount (step S302). For example, in a case of receiving the candidate information obtained from the pre-authentication image B, the relay server 40 determines whether that candidate information overlaps with the first feature amount included in the candidate information obtained from the pre-authentication image A, which was received earlier.
Here, in a case where the candidate information acquisition unit 401 of the relay server 40 determines that the first feature amount is a registered feature amount (step S302: YES), a candidate information deletion unit 408 of the relay server 40 deletes the registered candidate information having the older registration date and time (step S303). Thereafter, the processing proceeds to step S304. On the other hand, in a case where the candidate information acquisition unit 401 of the relay server 40 determines that the first feature amount is an unregistered feature amount (step S302: NO), the processing proceeds to step S304.
In step S304, the candidate information acquisition unit 401 registers the candidate information received in step S301 in the candidate information database 41, and terminates the processing. Note that, in the above-described determination processing in step S302, the overlapping has been determined according to whether the feature amount has been registered, but the determination method is not limited thereto. For example, in a case where a registrant ID is stored in the candidate information DB 41 in association with the feature amount, the overlapping candidate information may be detected and deleted on the basis of the registrant ID.
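The registration logic of steps S302 to S304 amounts to an upsert on the candidate information DB 41 keyed by the feature amount or, when available, the registrant ID. The dictionary-based sketch below is illustrative; the key type and the stored fields are assumptions.

```python
import time

def register_candidate(candidate_db: dict, key: str, candidate_info) -> None:
    """Sketch of steps S302-S304: if candidate information with the same key
    (feature amount or registrant ID) is already registered, delete the entry
    with the older registration date and time, then register the newly
    received candidate information."""
    if key in candidate_db:                  # step S302: already registered?
        del candidate_db[key]                # step S303: remove the older entry
    candidate_db[key] = {                    # step S304: register the received candidate information
        "info": candidate_info,
        "registered_at": time.time(),        # DB registration time, reused by later deletion conditions
    }
```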
In the information processing system 3 according to the present example embodiment, persons are detected from the plurality of pre-authentication images A and B captured at different distances from the camera 20 to the object, and the corresponding candidate information is transmitted from the center server 10 to the relay server 40. In this case, the candidate information may be transmitted a plurality of times for the same person. However, as described in step S303, since the candidate information DB 41 is controlled such that candidates having the same feature amount are not present, unnecessary candidate information can be prevented from being accumulated. As a result, a decrease in authentication accuracy and authentication speed in one-to-N face authentication can be prevented.
Hereinafter, an information processing system 5 according to a fourth example embodiment will be described. Note that the same reference signs as those assigned in the drawings of the first example embodiment represent the same objects. Therefore, description of portions common to the first example embodiment will be omitted, and different portions will be described in detail.
The management server 80 has functions similar to the center server 10 and the relay server 40 illustrated in
Next, functions and effects of the information processing system 5 according to the present example embodiment will be described with reference to
First, the management server 80 causes a camera 20 to capture an object located in a pre-authentication capturing area (step S501) and acquires a captured image of the object (first image; pre-authentication image) (step S502). Next, the management server 80 calculates a first feature amount of a detected person (first person) included in the captured image (first image) received from the camera 20 (step S503).
Next, the management server 80 collates the first feature amount of the detected person with the feature amounts of the registrants stored in the registrant information DB 11 (step S504). Next, the management server 80 extracts, as a candidate, any registrant whose collation score with the first feature amount of the detected person is equal to or larger than a predetermined threshold (determination reference value) (step S505). Note that the number of candidates extracted by the management server 80 is not limited to one. In a case where a plurality of registrants has a collation score equal to or larger than the predetermined threshold (determination reference value), the plurality of registrants may be extracted.
Next, the management server 80 stores candidate information on the person specified from the registrants by the collation processing in the candidate information DB 41 (step S506).
Similarly, the camera 20 captures the object located in a main authentication capturing area (step S507) and transmits a captured image of the object (second image; main authentication image) to the management server 80 (step S508). Next, the management server 80 calculates a second feature amount of a detected person (second person) included in the captured image (second image) received from the camera 20 (step S509). Then, the management server 80 collates the second feature amount with the first feature amount of the candidate (step S510).
Next, the management server 80 determines whether the score of collation of the second feature amount of the detected person with the first feature amount of the candidate is equal to or larger than the predetermined threshold (determination reference value) (step S511). Here, in a case of determining that the collation score of the feature amount is equal to or larger than the threshold (step S511: YES), the management server 80 specifies the candidate having the highest collation score from among corresponding registrants and authenticates the detected person (step S512). Thereafter, the management server 80 transmits gate control information to a gate device 50 (step S513).
When opening a gate on the basis of the gate control information received from the management server 80 (step S514), the gate device 50 transmits status information indicating completion of the gate opening to the management server 80 (step S515). Thereafter, the processing proceeds to step S516.
On the other hand, in a case of determining that the collation score is less than the predetermined threshold in step S511 (step S511: NO), the management server 80 determines that the second person is impassable (step S517) and terminates the processing.
In step S516, the management server 80 deletes the candidate information on the person who has passed through the gate from the candidate information database 41, and terminates the processing.
Unlike the above-described first example embodiment, the information processing system 5 according to the present example embodiment has a configuration in which the single management server 80 has the functions of a plurality of servers. Therefore, in addition to the effects of the above-described first example embodiment, there is an advantage that data transmission and reception between servers is not required.
Hereinafter, an information processing system 6 according to a fifth example embodiment will be described. Note that the same reference numerals as those assigned in the drawings of the second example embodiment represent the same objects. Therefore, description of portions common to the second example embodiment will be omitted, and different portions will be described in detail.
Next, functions and effects of the information processing system 6 according to the present example embodiment will be described with reference to
First, an image acquisition unit 403 of a relay server 40 causes a camera 20 to capture an object located in a pre-authentication capturing area (step S601) and acquires a captured image (first image; pre-authentication image) (step S602). Next, a feature amount calculation unit 404 of the relay server 40 calculates a first feature amount of a detected person (first person) included in the received captured image (first image), and requests the center server 10 to perform collation (step S603).
Next, a first collation unit 102 of the center server 10 collates the first feature amount of the detected person with the feature amounts of the registrants stored in the registrant information DB 11 (step S604). The first collation unit 102 extracts, as a candidate, any registrant whose collation score with the first feature amount of the detected person is equal to or larger than a predetermined threshold (determination reference value) (step S605). Note that the number of candidates extracted by the center server 10 is not limited to one. In a case where a plurality of registrants has a collation score equal to or larger than the predetermined threshold (determination reference value), the plurality of registrants may be extracted.
Next, a candidate information output unit 103 of the center server 10 transmits the feature amount (candidate information) of the candidate extracted from the registrants by the collation processing and the balance information on the electronic money of the candidate to the relay server 40 (step S606). The balance information on the electronic money of the candidate can be acquired by searching the member information DB 61 using the registrant ID of the candidate as a search key and reading the member information whose member ID matches that registrant ID. Next, a candidate information acquisition unit 401 of the relay server 40 stores the candidate information and the balance information received from the center server 10 in a candidate information DB 41 that is a second storage unit 402 (step S607).
Similarly, the image acquisition unit 403 of the relay server 40 causes the camera 20 to capture an image of the object located in a main authentication capturing area (step S608) and acquires the captured image (second image; main authentication image) (step S609). Next, the feature amount calculation unit 404 of the relay server 40 calculates a second feature amount of the detected person (second person) included in the received captured image (second image) (step S610). Then, a second collation unit 405 of the relay server 40 collates the second feature amount with the first feature amount of the candidate (step S611).
Next, a determination unit 406 of the relay server 40 determines whether the score of collation of the second feature amount of the detected person with the first feature amount of the candidate is equal to or larger than the predetermined threshold (determination reference value) (step S612). Here, in a case of determining that the collation score of the feature amount is equal to or larger than the threshold (step S612: YES), the determination unit 406 of the relay server 40 specifies the candidate having the highest collation score of the feature amount from among the corresponding candidates and authenticates the detected person (step S613). Thereafter, the processing proceeds to step S614.
In a case where the determination unit 406 of the relay server 40 determines that the collation score of the feature amount is less than the threshold (step S612: NO), the determination unit 406 of the relay server 40 determines that the second person is impassable (step S621) and terminates the processing.
In step S614, the determination unit 406 of the relay server 40 refers to the balance information on the electronic money of the first person stored in the candidate information database 41, and determines whether the balance is equal to or larger than the charge amount. Here, in a case where the determination unit 406 of the relay server 40 determines that the balance is equal to or larger than the charge amount (step S614: YES), a gate control unit 407 of the relay server 40 transmits gate control information to a gate device 50 (step S615).
When opening a gate on the basis of the gate control information received from the relay server 40 (step S616), the gate device 50 transmits status information indicating completion of the gate opening to the relay server 40 (step S617).
Furthermore, the relay server 40 transmits charging information to the center server 10 (step S618) in parallel with steps S615 to S617. Upon receiving the charging information, the center server 10 updates the balance information included in the member information in the member information DB 61 (step S619).
In a case of determining that the balance is less than the charge amount (step S614: NO), the determination unit 406 of the relay server 40 determines that the second person is impassable (step S621) and terminates the processing. In this case, a closed state (initial state) is maintained in the gate device 50.
In step S620, a candidate information deletion unit 408 of the relay server 40 deletes the candidate information on the person who has passed through the gate from the candidate information DB 41, and terminates the processing. In the sequence diagram of
The information processing system 6 according to the present example embodiment is configured to determine whether a person is passable to a restricted area in consideration of not only the collation result of the face feature amount but also the balance information of the candidate, and thus can determine a person who can use the service. Furthermore, since the relay server 40 already holds the balance information of the candidate before an image is captured by the camera 20 installed in the vicinity of the gate device 50, the relay server 40 can immediately open the gate in the case of determining that the collation score between the second feature amount of the captured image (second image) captured by the camera 20 and the first feature amount is equal to or larger than the threshold. Moreover, unlike the above-described second example embodiment, the information processing system 6 according to the present example embodiment has a configuration in which the center server 10 includes both the registrant information database 11 and the member information DB 61. Therefore, there is also an advantage that the balance information of the registrant can be acquired without performing inquiry processing to an external server.
The second collation unit 200D collates biological information on a second person detected from a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image with the biological information included in the first biological information group. The information processing device 200 according to the present example embodiment can improve authentication accuracy and authentication speed in face authentication.
The present invention is not limited to the above-described example embodiments, and can be appropriately modified without departing from the gist of the present invention.
In the above-described first example embodiment, the case of determining the presence or absence of the passage authority of the detected person on the basis of the collation result of the feature amount has been described. However, a determination result by a predetermined authentication card (for example, a security card, a transportation IC card, or the like) may be used together with the collation result of the feature amount.
First, when receiving the captured image (second image) of the main authentication area (step S701), the relay server 40 calculates the second feature amount of the detected person included in the captured image (second image) (step S702). Next, the relay server 40 collates the calculated second feature amount with the first feature amount of the candidate (step S703).
Next, the relay server 40 determines whether the collation score between the second feature amount of the detected person and the first feature amount of the candidate is equal to or larger than the predetermined threshold (determination reference value) (step S704). Here, in the case where it is determined that the collation score is equal to or larger than the threshold (step S704: YES), the relay server 40 specifies the candidate having the highest collation score of the feature amount from among the corresponding candidates (step S705). Accordingly, the specified candidate (registrant) and the detected person are regarded as the same person. Thereafter, the processing proceeds to step S709. On the other hand, in the case where the relay server 40 determines that the collation score is less than the threshold (step S704: NO), the processing proceeds to step S706.
In step S706, the relay server 40 determines the presence or absence of presentation of the authentication card from the detected person (hereinafter referred to as “person to be authenticated”) in the main authentication area. Here, in a case where the relay server 40 determines the presence of presentation of the authentication card (step S706: YES), the processing proceeds to step S707. On the other hand, in a case where the relay server 40 determines the absence of presentation of the authentication card (step S706: NO), the processing proceeds to step S710.
In step S707, the relay server 40 inquires of the center server 10 whether the person to be authenticated is a registrant on the basis of the authentication information read from the authentication card by a card reader device (not illustrated). Next, the relay server 40 determines whether the person to be authenticated is a registrant on the basis of reply information from the center server 10 (step S708). Here, in a case where the relay server 40 determines that the person to be authenticated is a registrant (step S708: YES), the processing proceeds to step S709. On the other hand, in a case where the relay server 40 determines that the person to be authenticated is not a registrant (step S708: NO), the processing proceeds to step S710.
In step S709, when the relay server 40 transmits the gate control information instructing gate opening to the gate device 50, the processing proceeds to step S115. On the other hand, in step S710, when the relay server 40 transmits the gate control information instructing gate closing to the gate device 50, the processing ends. In this manner, by combining the face authentication and the authentication using the authentication card, even a registrant whose face image is not registered can be handled.
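Steps S704 to S710 can be summarized as a face-first decision with an authentication-card fallback. In the sketch below, card_presented and is_registrant stand in for the card reader check and the inquiry to the center server (steps S706 and S707); these helper names are assumptions, not elements named in the disclosure.

```python
from typing import Callable

def gate_decision(face_score: float, threshold: float,
                  card_presented: bool, is_registrant: Callable[[], bool]) -> str:
    """Sketch of the modified flow: open the gate when the face collation score
    reaches the threshold (steps S704-S705, S709); otherwise fall back to the
    authentication card (steps S706-S708) and open only for a confirmed registrant."""
    if face_score >= threshold:
        return "open"                        # face authentication succeeded
    if card_presented and is_registrant():   # inquiry to the center server (step S707)
        return "open"                        # card-based authentication succeeded (step S708: YES)
    return "close"                           # step S710: instruct gate closing
```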
In the above-described fourth example embodiment, the management server 80 includes both the registrant information database 11 (the first storage unit 101) and the candidate information DB 41 (the second storage unit 402), but these databases can be integrated into one database.
Furthermore, in the above-described example embodiments, it has been described that the candidate information is deleted under three types of deletion conditions: (A) the case where the person to be authenticated has passed through the gate device 50, (B) the case where the candidate information regarding a registered feature amount is received again, and (C) the case where the candidate information on a person included in the captured image of the monitoring area has been registered. However, the deletion conditions are not limited thereto. For example, the management server 80 may delete the candidate information on the basis of whether an elapsed time from the registration date and time (DB registration time) of the candidate information to the current time has reached a predetermined time. In this case, the candidate information that becomes unnecessary with time can be prevented from remaining in the candidate information DB 41.
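A minimal sketch of the time-based deletion condition described above, assuming each entry in the candidate information DB 41 carries its DB registration time as in the earlier registration sketch; the retention period of 15 minutes is an arbitrary example.

```python
import time

def purge_expired_candidates(candidate_db: dict, max_age_seconds: float = 15 * 60) -> None:
    """Delete candidate information whose elapsed time from the DB registration
    time to the current time has reached the predetermined time."""
    now = time.time()
    expired = [key for key, entry in candidate_db.items()
               if now - entry["registered_at"] >= max_age_seconds]
    for key in expired:
        del candidate_db[key]
```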
Furthermore, in each of the above-described example embodiments, the case in which the camera 20 has only the function to transmit the captured image has been described. However, the camera 20 may further have a function to detect a person from the captured image and calculate and transmit the feature amount. In this case, it is only necessary to transmit the feature amount instead of the captured image to the center server 10 and the relay server 40.
Furthermore, in each of the above-described example embodiments, the case in which the center server 10 transmits only the face feature amount to the relay server 40 has been described. However, a face image of the specified candidate may be transmitted, or the ID of the candidate may be transmitted together with the feature amount and the face image. In the case of transmitting the face image, passage history information including the collated face image of the candidate and the face image of the second person can be generated. Furthermore, in the case of transmitting the ID (registrant ID) of the candidate together and holding the feature amount and the ID in association with each other on the relay server 40 side, not only the collation of the feature amount but also the overlapping candidate information can be extracted and deleted on the basis of the collation result of the ID.
Furthermore, in the above-described first to fourth example embodiments, the case in which one relay server 40 is installed in one base has been described, but a plurality of relay servers may be installed in one base. For example, in a facility having an extremely large number of users, there is an advantage that the collation processing can be performed in a distributed manner by installing a plurality of the relay servers 40. In this case, it is favorable to control data registration so as not to hold overlapping candidate information among the plurality of relay servers 40.
Furthermore, in each of the above-described example embodiments, the case in which the threshold at the time of collation using the pre-authentication image and the threshold at the time of collation using the main authentication image are the same value has been described, but the threshold may be different between the pre-authentication area and the main authentication area. For example, the threshold at the time of collation in the pre-authentication area may be set to be lower than the threshold at the time of collation in the main authentication area. This has an advantage that omission of extraction of candidates in the pre-authentication area can be prevented. On the contrary, the threshold at the time of collation in the pre-authentication area may be set to be higher than the threshold at the time of collation in the main authentication area. Accordingly, the pre-authentication can be executed under a strict condition. In addition, in a case where a plurality of pre-authentication areas is set, the threshold at the time of collation in each pre-authentication area may be set so as to gradually increase as the pre-authentication area gets closer to the restricted area. As a result, the persons to be authenticated (candidates) in the authentication using the main authentication image can be gradually narrowed down according to the distance from each pre-authentication area to the restricted area.
Furthermore, in each of the above-described example embodiments, the case has been described in which the candidate information (first feature amount) on the same person is not redundantly registered even when the same person is detected a plurality of times in the pre-authentication images, and the number of times of detection is not considered. However, in a case where the camera 20 transmits a plurality of pre-authentication images, the same person may be captured a plurality of times. Therefore, in the case where the same person is detected a plurality of times in the plurality of pre-authentication images, the threshold at the time of collation using the main authentication image may be changed (raised or lowered) according to the number of times of detection. As a result, for example, a person detected a predetermined number of times or more in the pre-authentication images can be strictly authenticated as a person to watch out for by making the threshold in the main authentication image higher than usual. Conversely, the authentication can be relaxed by setting the threshold in the main authentication area to be lower than usual.
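The two variations above (area-dependent thresholds and detection-count-dependent thresholds) can both be expressed as small policy functions that yield the determination reference value to be used at collation time. All numeric values in the sketch below are illustrative assumptions.

```python
def pre_authentication_thresholds(num_areas: int, nearest: float = 0.80, step: float = 0.05):
    """Sketch for a plurality of pre-authentication areas: the threshold at the
    time of collation gradually increases as the area approaches the restricted area."""
    return [nearest - step * (num_areas - 1 - i) for i in range(num_areas)]

def main_authentication_threshold(base: float = 0.80, detection_count: int = 1,
                                  step: float = 0.02, stricter: bool = True) -> float:
    """Sketch of a detection-count-dependent policy: raise (or lower) the
    main-authentication threshold according to how many times the same person
    was detected in the pre-authentication images."""
    adjustment = step * max(detection_count - 1, 0)
    value = base + adjustment if stricter else base - adjustment
    return min(max(value, 0.0), 1.0)         # keep the reference value within [0, 1]

# Example: three pre-authentication areas, farthest to nearest (approximately 0.70, 0.75, 0.80)
print(pre_authentication_thresholds(3))
```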
A processing method of recording, on a storage medium, a program for operating the configurations of the above-described example embodiments to implement the functions of the example embodiments, reading the program recorded on the storage medium as code, and executing the program on a computer is also included in the scope of each of the example embodiments. That is, a computer-readable storage medium is also included in the scope of each of the example embodiments. In addition, not only the storage medium in which the above-described program is recorded but also the program itself is included in each of the example embodiments. In addition, one or more configuration elements included in the above-described example embodiments may be a circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA) configured to implement the functions of the configuration elements.
Examples of the storage medium include a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a compact disk (CD)-ROM, a magnetic tape, a non-volatile memory card, and a ROM. In addition, not only the configuration in which the processing is executed by the program recorded on the storage medium alone but also the configuration in which the processing is executed on an operating system (OS) in cooperation with other software or the functions of an extension board is included in the scope of each of the example embodiments.
The service implemented by the functions of each of the above-described example embodiments can also be provided to the user in the form of software as a service (SaaS).
Note that the above-described example embodiments are merely examples of embodiments for implementing the present invention, and the technical scope of the present invention should not be interpreted in a limited manner by these example embodiments. That is, the present invention can be implemented in various forms without departing from the technical idea or the main features thereof. For example, the relay server 40 described in each of the above-described example embodiments can be configured as an information processing device built in the gate device 50. Moreover, in a case where there is a plurality of gate devices 50, it is also possible to adopt a configuration in which information processing devices functioning as the relay server 40 share information with each other.
Some or all of the above-described example embodiments can be described as but are not limited to the following supplementary notes.
(Supplementary note 1)
An information processing device including:
an acquisition unit configured to acquire a first image obtained by capturing an object at a first distance, and extract, from a registered biological information group including biological information of a plurality of registrants, a first biological information group including biological information on a first person detected from the first image obtained by capturing the object at the first distance; and
a collation unit configured to collate biological information on a second person detected from a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image with the biological information included in the first biological information group.
(Supplementary note 2)
The information processing device according to supplementary note 1, further including:
a detection unit configured to detect the biological information on the second person from the second image.
(Supplementary note 3)
The information processing device according to supplementary note 1 or 2, further including:
a determination unit configured to determine presence or absence of passage authority of the second person to a predetermined restricted area based on a collation result of the collation unit.
(Supplementary note 4)
The information processing device according to supplementary note 3, further including:
a control unit configured to control opening and closing of a door of a passage restriction device that restricts entry and exit to and from the restricted area based on the presence or absence of the passage authority.
(Supplementary note 5)
The information processing device according to supplementary note 3 or 4, further including:
a balance information acquisition unit configured to acquire balance information of the first person, in which
the determination unit determines the presence or absence of the passage authority of the second person to the restricted area based on the collation result and the balance information.
(Supplementary note 6)
The information processing device according to any one of supplementary notes 1 to 5, further including:
a storage unit configured to store the acquired first biological information group; and
a deletion unit configured to delete the biological information on the first person, the biological information meeting a predetermined condition, from the first biological information group.
(Supplementary note 7)
The information processing device according to supplementary note 6, in which
the deletion unit deletes the biological information on the first person of which an elapsed time from registration time to the storage unit has reached a predetermined time.
(Supplementary note 8)
The information processing device according to supplementary note 6 or 7, in which,
in a case where a plurality of pieces of the biological information for the first person is present in the storage unit, the deletion unit deletes the biological information having earlier registration time to the storage unit.
(Supplementary note 9)
An information processing device including:
a storage unit configured to store a registered biological information group including biological information of a plurality of registrants;
a first collation unit configured to acquire a first image obtained by capturing an object at a first distance and collate biological information on a first person detected from the first image with biological information included in the registered biological information group;
a specifying unit configured to extract a first biological information group including the biological information on the first person from the registered biological information group based on a collation result in the first collation unit; and
a second collation unit configured to collate biological information on a second person detected from a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image with biological information included in the first biological information group.
(Supplementary note 10)
An information processing system including:
a first server configured to acquire a first image obtained by capturing an object at a first distance, collate a registered biological information group including biological information of a plurality of registrants with biological information on a first person detected from the first image, and extract a first biological information group including the biological information on the first person from the registered biological information group; and
a second server configured to collate biological information on a second person detected from a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image with biological information included in the first biological information group.
(Supplementary note 11)
The information processing system according to supplementary note 10, in which
the second server
determines presence or absence of passage authority of the second person to a predetermined restricted area, and
controls opening and closing of a door of a passage restriction device that restricts entry and exit to and from the predetermined restricted area based on the presence or absence of the passage authority.
(Supplementary note 12)
The information processing system according to supplementary note 10 or 11, in which
a reference value for determining matching in the collation of the biological information on the registrant with the biological information on the first person in the first server is set to be lower than a reference value for determining matching in the collation of the biological information on the first person with the biological information on the second person in the second server.
(Supplementary note 13)
The information processing system according to supplementary note 10 or 11, in which
a reference value for determining matching in the collation of the biological information on the registrant with the biological information on the first person in the first server is set to be higher than a reference value for determining matching in the collation of the biological information on the first person with the biological information on the second person in the second server.
(Supplementary note 14)
The information processing system according to any one of supplementary notes 10 to 13, in which
a reference value at the time of collation of the biological information on the first person with the biological information on the second person is determined according to the number of times of detecting the same person by the collation of the registered biological information group with the biological information on the first person.
(Supplementary note 15)
An information processing method including:
acquiring a first image obtained by capturing an object at a first distance;
extracting, from a registered biological information group including biological information of a plurality of registrants, a first biological information group including biological information on a first person detected from the first image;
acquiring a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image; and
collating biological information on a second person detected from the second image with the biological information included in the first biological information group.
(Supplementary note 16)
A program recording medium recording a program for causing a computer to execute:
processing of acquiring a first image obtained by capturing an object at a first distance;
processing of extracting, from a registered biological information group including biological information of a plurality of registrants, a first biological information group including biological information on a first person detected from the first image;
processing of acquiring a second image obtained by capturing the object at a second distance shorter than the distance to the object in the first image; and
processing of collating biological information on a second person detected from the second image with the biological information included in the first biological information group.
Filing Document: PCT/JP2020/012717
Filing Date: 3/23/2020
Country: WO