The present invention relates to an information processing apparatus, an information processing method, and a storage medium.
Patent Literature 1 discloses a ticketless boarding system that uses passenger biometric information (face image) to perform various procedures via face authentication at a plurality of check points (a check-in lobby, a security inspection site, a boarding gate, and the like) in an airport.
PTL 1: Japanese Patent Application Laid-Open No. 2007-79656
Patent Literature 1 merely discloses a configuration that uses face authentication for a particular operation in an airport. However, efficiency of operation is expected to be improved by implementing a face authentication function also in various operations in transportation systems, the manufacturing industry, the transportation industry, the retail industry, the accommodation industry, and the like.
Accordingly, in view of the problem described above, the present invention intends to provide an information processing apparatus, an information processing method, and a storage medium that can easily implement a face authentication function for various operations.
According to one aspect of the present invention, provided is an information processing apparatus including: a receiving unit that receives request data having control data from an operation terminal used for a predetermined operation; an extraction unit that extracts biometric information used for biometric authentication and encapsulated operation data from the request data based on the control data; an authentication unit that performs biometric authentication on a user based on the biometric information; and an operation processing unit that processes the operation data.
According to another aspect of the present invention, provided is an information processing method including: receiving request data having control data from an operation terminal used for a predetermined operation; extracting biometric information used for biometric authentication and encapsulated operation data from the request data based on the control data; performing biometric authentication on a user based on the biometric information; and processing the operation data.
According to yet another aspect of the present invention, provided is a storage medium storing a program that causes a computer to perform: receiving request data having control data from an operation terminal used for a predetermined operation; extracting biometric information used for biometric authentication and encapsulated operation data from the request data based on the control data; performing biometric authentication on a user based on the biometric information; and processing the operation data.
According to the present invention, it is possible to provide an information processing apparatus, an information processing method, and a storage medium that can easily implement a face authentication function for various operations.
Exemplary embodiments of the present invention will be described below with reference to the drawings. Throughout the drawings, the same or corresponding components are labeled with the same references, and the description thereof may be omitted or simplified.
In the information processing system 1 of the present example embodiment, a check-in terminal 20, an automatic baggage check-in machine 30, a security inspection apparatus 40, an automated gate apparatus 50, and a boarding gate apparatus 60 are each connected to a shared management server 10 via a network NW. The security inspection apparatus 40, the automated gate apparatus 50, and the boarding gate apparatus 60 are installed in a security area SA illustrated by a dashed line. The network NW is formed of a local area network (LAN) including a private communication network of the airport A, a wide area network (WAN), a mobile communication network, or the like. The connection scheme is not limited to a wired scheme and may be a wireless scheme. Note that, for simplified illustration,
The management server 10 is an information processing apparatus that manages operations related to the immigration inspection procedures of the user U. The management server 10 is installed, for example, in a facility of an airport company operating the airport A, an airline company, or the like. Further, the management server 10 may be a cloud server instead of a server installed in the facility in which the operations are actually performed. Note that the management server 10 is not necessarily required to be a single server and may be formed as a server group including a plurality of servers.
As illustrated in
The check-in terminal 20 is installed in a check-in lobby (hereafter, referred to as “touch point P1”) in the airport A. The check-in terminal 20 is a self-service terminal operated by the user U by himself/herself to perform a check-in procedure. The check-in terminal 20 is also called a Common Use Self Service (CUSS) terminal. After completion of the check-in procedure at the touch point P1, the user U moves to a baggage check-in place or a security inspection site.
The automatic baggage check-in machine 30 is installed in a region adjacent to a baggage counter (manned counter) in the airport A or a region near the check-in terminal 20 (hereafter, referred to as “touch point P2”). The automatic baggage check-in machine 30 is a self-service terminal operated by the user U by himself/herself to perform a procedure to check in baggage not carried in the aircraft (baggage check-in procedure). The automatic baggage check-in machine 30 is also called a Common Use Bag Drop (CUBD) terminal. After completion of the baggage check-in procedure, the user U moves to the security inspection site. Note that, when the user U does not check in his/her baggage, the procedure at the touch point P2 is omitted.
The security inspection apparatus 40 is installed in the security inspection site (hereafter, referred to as “touch point P3”) in the airport A. The security inspection apparatus 40 is an apparatus that uses a metal detector to check whether or not the user U is wearing a metal object that may be a dangerous object. Note that the term “security inspection apparatus” in the present example embodiment is used as a meaning including not only a metal detector but also an X-ray inspection device that uses an X-ray to check whether or not there is a dangerous object in carry-on baggage or the like, a terminal device of a Passenger Reconciliation System (PRS) that determines whether or not to permit passage of the user U at the entrance of a security inspection site, or the like. After completion of the security inspection procedure with the security inspection apparatus 40 at the touch point P3, the user U moves to a departure inspection site.
The automated gate apparatus 50 is installed in the departure inspection site (hereafter, referred to as “touch point P4”) in the airport A. The automated gate apparatus 50 is an apparatus that automatically performs a departure inspection procedure on the user U. After completion of the departure inspection procedure at the touch point P4, the user U moves to a departure area where duty free shops or boarding gates are provided.
The boarding gate apparatus 60 is a passage control apparatus installed at each boarding gate (hereafter, referred to as "touch point P5") of the departure area. The boarding gate apparatus 60 is also called an Automated Boarding Gates (ABG) terminal. The boarding gate apparatus 60 confirms that the user U is a passenger of an aircraft that is available for boarding through the boarding gate. After completion of the procedure at the touch point P5, the user U boards the aircraft and departs from the country to a second country.
Further, as illustrated in
The group ID is an identifier used for grouping ID information. The registered face image is a face image registered for the user U. The feature amount is a value extracted from biometric information (a registered face image). Note that, although the term "biometric information" in the present example embodiment means a face image or a feature amount extracted from a face image, biometric information is not limited to a face image or a face feature amount. That is, biometric authentication may be performed by using an iris image, a fingerprint image, a palmprint image, an auricle image, or the like as the biometric information on the user U.
The token issuance time is the time at which the management server 10 issued the token ID. The token issuance device name is the device name of the acquisition source of the registered face image that triggered issuance of the token ID. The invalidation flag is flag information indicating whether or not the token ID is currently valid. In response to issuance of a token ID, the invalidation flag in the present example embodiment is set to a value of "1", indicating a state where the token ID is valid. Further, if a predetermined condition is satisfied, the invalidation flag is updated to a value of "0", indicating a state where the token ID is invalid. The invalidation time is a timestamp of the time at which the token ID was invalidated.
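One record of the token ID information described above can be sketched as a simple data structure. This is an illustrative sketch only; the class and field names are assumptions, not the actual database schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class TokenIdRecord:
    """Illustrative sketch of one record of the token ID information DB 11.
    All names are hypothetical."""
    token_id: str
    group_id: str
    registered_face_image: bytes
    feature_amount: List[float]        # feature vector extracted from the registered face image
    token_issuance_time: datetime
    token_issuance_device_name: str
    invalidation_flag: int = 1         # set to "1" (valid) in response to issuance
    invalidation_time: Optional[datetime] = None

    def invalidate(self) -> None:
        # When a predetermined condition is satisfied, the flag is updated
        # to "0" and the invalidation time is stamped.
        self.invalidation_flag = 0
        self.invalidation_time = datetime.now()
```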
The reservation number is an identifier that uniquely identifies boarding reservation information. The airline code is an identifier that uniquely identifies an airline company. The boarding reservation information included in the operation information may be a passenger name, a reservation number, a departure place, a destination, an airline code, a flight number, a flight date, a seat number, a nationality, a passport number, a family name, a first name, a date of birth, a gender, or the like. The boarding reservation information can be acquired from a recording medium such as a passport, a boarding ticket, or the like. Further, the boarding reservation information can also be acquired from a reservation system (not illustrated) of an airline company by using a passport number, a reservation number, or the like as a key. The acquired boarding reservation information is then stored as operation information in the operation information DB 13.
The storage unit 10A stores token ID information, passage history information, operation information, and the like described above. The transceiver unit 10B receives request data D1 from an edge terminal 200 and transmits a process result in the management server 10 to the edge terminal 200 as response data D2. Note that the edge terminal 200 in the present example embodiment corresponds to each terminal device of the check-in terminal 20, the automatic baggage check-in machine 30, the security inspection apparatus 40, the automated gate apparatus 50, and the boarding gate apparatus 60.
The data extraction unit 10C determines the API to be called based on a command included in the received request data D1, extracts the control data, the face authenticating data, and the operation data included in the request data D1, and passes these data to the corresponding APIs.
If the data extraction unit 10C determines that the command content indicates a "token ID issuance request", the matching unit 10D matches a target face image extracted from the request data D1 with a passport face image. Further, if the data extraction unit 10C determines that the command content indicates a "face authentication execution request", the matching unit 10D matches a target face image extracted from the request data D1 with the face images of registrants (registered face images) stored in the storage unit 10A. If the matching unit 10D successfully matches the target face image with the passport face image, the token ID issuance unit 10E issues a token ID to the user U.
The operation processing unit 10F is a set of X (X ≥ 1) APIs that perform data processing related to operations and are called by the data extraction unit 10C. For example, when the transceiver unit 10B receives the request data D1 from the automatic baggage check-in machine 30, the data extraction unit 10C first calls the matching unit 10D to perform a matching process and, based on the result of the matching, calls an operation API related to the baggage check-in procedure. In such a way, since the matching unit 10D or any operation API can be started in accordance with the data extracted by the data extraction unit 10C, face authentication technology can easily be applied to various operations.
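The command-based dispatch performed by the data extraction unit 10C might be sketched as follows. The command strings, handler names, and envelope keys here are assumptions made for illustration, not taken from the actual implementation.

```python
# Hypothetical handlers standing in for the matching unit 10D / token ID
# issuance unit 10E and for an operation API; names are illustrative.
def issue_token(face_data, operation_data):
    # 1:1 matching against the passport face image, then token ID issuance
    return {"result": "token_issued"}

def execute_face_auth(face_data, operation_data):
    # 1:N matching against registered face images, then an operation API call
    return {"result": "matched"}

# Command-to-API dispatch table (command strings are assumptions)
DISPATCH = {
    "tokenIdIssuanceRequest": issue_token,
    "faceAuthExecutionRequest": execute_face_auth,
}

def handle_request(request_data: dict) -> dict:
    """Select the API based on the command in the header part, and pass it
    the face authenticating data and operation data from the body part."""
    command = request_data["header"]["command"]
    body = request_data["body"]
    api = DISPATCH[command]
    return api(body.get("faceData"), body.get("appdata"))
```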
The body part B1 is a field storing control data B11, face authenticating data B12, and operation data B13. The control data B11 is data used for controlling the operation of operation APIs and made up of data items that do not depend on the operation. The control data B11 differs in the control target from the control data stored in the header part H1. The control data of the header part H1 includes an execution command related to at least one of an operation data registration process, an operation data search process, a token ID issuance process, and biometric authentication. In contrast, the control data B11 of the body part B1 includes data items of a device name of a source of the request data D1, a system type, a location, and the like.
Further, in
Subsequently, the hardware configuration of respective devices forming the information processing system 1 will be described with reference to
The CPU 101 is a processor having a function of performing a predetermined operation in accordance with a program stored in the storage device 103 and controlling each component of the management server 10. In the management server 10, the CPU 101 functions as the transceiver unit 10B, the data extraction unit 10C, the matching unit 10D, the token ID issuance unit 10E, and the operation processing unit 10F described above. The RAM 102 is formed of a volatile storage medium and provides a temporary memory area required for the operation of the CPU 101.
The storage device 103 is formed of a storage medium such as a nonvolatile memory, a hard disk drive, or the like and functions as the storage unit 10A. The storage device 103 stores a program executed by the CPU 101, data referenced by the CPU 101 in execution of the program, or the like.
The communication I/F 104 is a communication interface based on a specification such as Ethernet (registered trademark), Wi-Fi (registered trademark), 4G, or the like and is a module for communicating with the check-in terminal 20 or the like. The communication I/F 104 functions as the transceiver unit 10B together with the CPU 101.
The input device 206 is a pointing device such as a touch panel, a keyboard, or the like, for example. In the check-in terminal 20 of the present example embodiment, the display device 207 and the input device 206 are integrally formed as a touch panel. The display device 207 is a liquid crystal display device, an organic light emitting diode (OLED) display device, or the like and is used for displaying a moving image, a still image, a text, or the like.
The medium reading device 208 is a device that reads a medium such as a passport, an airline ticket, or the like of the user U and acquires information recorded in the medium. The airline ticket medium may be, for example, a paper airline ticket, a mobile terminal displaying an e-ticket receipt, or the like. The medium reading device 208 is formed of a code reader, an image scanner, a contactless integrated circuit (IC) reader, an optical character reader (OCR) device, or the like, for example, and acquires information from various media presented to the reading unit thereof.
The biometric information acquisition device 209 is a device that acquires a face image of the user U as biometric information on the user U. The biometric information acquisition device 209 is, for example, a digital camera used for capturing a face of the user U standing in front of the check-in terminal 20 and captures the face of the user U to acquire the face image.
The baggage transport device 310 is a device that transports the baggage of the user U so that the baggage is loaded onto the aircraft that the user U boards. The baggage transport device 310 transports, to a baggage handling place, baggage which is placed on a receiving part by the user U and to which a baggage tag is attached.
The output device 311 is a device that outputs a baggage tag to be attached to checking-in baggage. Further, the output device 311 outputs a baggage claim tag that is necessary when the user U claims his/her baggage after arriving at the destination. Note that a baggage tag or a baggage claim tag is associated with at least one of a passport number, a reservation number, and a token ID.
The metal detector gate 410 is a gate type metal detector and detects a metal object worn by the user U passing through the metal detector gate 410.
The gate 511 transitions from a closed state that blocks passage of the user U during standby to an open state that permits passage of the user U, under the control of the CPU 501, when identity verification of the user U at the automated gate apparatus 50 is successful and the user U has passed the departure inspection. The scheme of the gate 511 is not particularly limited, and the gate 511 may be, for example, a flapper gate in which one or more flappers provided on one side or both sides of a passage are opened and closed, a turnstile gate in which three bars revolve, or the like.
Subsequently, the operation of each apparatus in the information processing system 1 in the present example embodiment will be described with reference to
First, the check-in terminal 20 determines whether or not an airline ticket medium of the user U is presented to the reading unit (not illustrated) of the medium reading device 208 (step S101) and stands by until an airline ticket medium is presented (step S101, NO).
Next, if the check-in terminal 20 determines that an airline ticket medium is presented to a reading unit of the medium reading device 208 (step S101, YES), the check-in terminal 20 acquires boarding reservation information on the user U from the presented airline ticket medium (step S102). The acquired boarding reservation information includes a family name, a first name, an airline code, a flight number, a boarding date, a departure place (boarding airport), a destination (arrival airport), a seat number, a boarding time, an arrival time, or the like.
Next, the check-in terminal 20 determines whether or not a passport of the user U is presented to the reading unit of the medium reading device 208 (step S103) and stands by until a passport is presented (step S103, NO).
Next, if the check-in terminal 20 determines that a passport is presented to the reading unit of the medium reading device 208 (step S103, YES), the check-in terminal 20 acquires passport information on the user U from the presented passport (step S104). The acquired passport information includes a passport face image of the user U, identity verification information, a passport number, a passport issuance country, or the like.
Next, the check-in terminal 20 captures a face of the user U by the biometric information acquisition device 209 and acquires a face image as a target face image (step S105). Note that it is preferable to display a guidance message related to capturing of a face image (for example, “By registering your face image, you can easily perform the following procedures required before departure through face recognition. The registered face image will be deleted from the system after boarding is completed.”) on a screen and obtain the consent of the user U before capturing a face image.
Next, the check-in terminal 20 transmits the request data D1 that requests matching of the face image and issuance of a token ID to the management server 10 (step S106).
Further, the body part B1 is formed of the control data B11, the face authenticating data B12, and the operation data B13. In the control data B11, label information and data on each item of a location of a terminal in the airport A (“location”), a terminal (“terminal”), a device name (“deviceName”), a system type (“sysType”), information on a system vender (“sysVender”), request data transmission time (“reqTimeStamp”), a camera ID used for face image capturing (“cameraId”), and a camera model name (“cameraModel”) are written.
Further, in the face authenticating data B12, label information and data on each item of a file name of a passport face image ("PassportFaceImage"), a capturing time of a captured face image ("queryTimeStamp"), and a file name of a captured face image ("queryFaceImage") are written.
Further, the operation data B13 is provided with a single label (“appdata”), and label information and data on each data item of operation data are hierarchically written in a portion bracketed between a symbol indicating a start part (“{”) and a symbol indicating an end part (“}”).
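Putting the pieces together, the body part B1 might look like the following fragment. Only the item labels ("location", "queryFaceImage", "appdata", and so on) come from the description above; the wrapper keys ("controlData", "faceAuthData") and all concrete values are invented for illustration.

```python
import json

# Illustrative body part B1 of request data D1 (values are hypothetical).
body_b1 = {
    "controlData": {                      # control data B11
        "location": "check-in lobby",
        "terminal": "P1-01",
        "deviceName": "check-in terminal 20",
        "sysType": "CUSS",
        "sysVender": "ExampleVendor",
        "reqTimeStamp": "2020-01-01T10:00:00",
        "cameraId": "CAM-001",
        "cameraModel": "ExampleCam",
    },
    "faceAuthData": {                     # face authenticating data B12
        "PassportFaceImage": "passport_face.jpg",
        "queryTimeStamp": "2020-01-01T10:00:05",
        "queryFaceImage": "query_face.jpg",
    },
    "appdata": {                          # operation data B13, written hierarchically
        "reservationNumber": "RSV12345",
        "airlineCode": "XX",
    },
}

print(json.dumps(body_b1, indent=2))
```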
In response to receiving the request data D1 from the check-in terminal 20, the management server 10 matches the target face image captured by the check-in terminal 20 with the passport face image of the user U on a one-to-one basis (step S107). That is, in the example of
Next, if the management server 10 determines that the result of matching of the target face image with the passport face image is that the matching is successful (step S108, YES), the management server 10 issues a token ID (step S109). The token ID is set to a unique value based on date and time of processing or a sequence number, for example.
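One way to obtain the unique value described above, combining the date and time of processing with a sequence number, might be sketched as follows; the format of the generated token ID is an assumption.

```python
import itertools
from datetime import datetime

# Hypothetical sequence counter; in practice this would be persisted.
_seq = itertools.count(1)

def issue_token_id() -> str:
    """Issue a token ID as a unique value based on the date and time of
    processing and a sequence number (illustrative format)."""
    now = datetime.now().strftime("%Y%m%d%H%M%S")
    return f"{now}-{next(_seq):06d}"
```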
Next, the management server 10 registers the relationship between the token ID and the registered face image to the token ID information DB 11 by using the target face image as the registered face image (step S110).
In the present example embodiment, a face image captured on site (a target face image) is used as the registered face image because the valid period (lifecycle) of a token ID expires within the day, and because a captured image is closer in quality (appearance) to the images captured in subsequent authentication processes than a passport face image is. However, instead of a captured face image, a passport face image may be set as the registered face image (registered biometric information). For example, when the lifecycle of a token ID is long (for example, when a token ID remains valid for a certain period for a membership in the airline business or the like), a face image from a passport or a license can be set as the registered face image.
Next, the management server 10 uses passport information and boarding reservation information as operation information to register the relationship between the token ID and the operation information to the operation information DB 13 (step S111). In such a way, while control data required for face authentication and operation information required for performing operation are managed in separate databases, a registered face image and the operation information are associated with each other by a token ID. In the example of
Next, the management server 10 transmits the response data D2 including the issued token ID and matching result information indicating a successful matching to the check-in terminal 20 (step S112).
On the other hand, if the management server 10 determines that the result of matching of the passport face image with the target face image is that the matching failed (step S108, NO), the management server 10 transmits the response data D2 including matching result information indicating the failed matching to the check-in terminal 20 (step S113).
Next, if the check-in terminal 20 references the response data D2 received from the management server 10 and determines that it is possible to perform a check-in procedure (step S114, YES), the check-in terminal 20 performs a check-in procedure such as confirmation of an itinerary, a selection of a seat, or the like based on input information from the user U (step S115). The check-in terminal 20 then transmits the request data D1 that requests registration of passage history information on the user U to the management server 10 (step S116).
Next, in response to receiving the request data D1 from the check-in terminal 20, the management server 10 registers passage history information indicating the relationship between the token ID and passage information on the touch point P1 to the passage history information DB 12 (step S117).
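The registration of passage history information in step S117 can be sketched as appending a record that associates the token ID with the touch point; the store and field names are assumptions for illustration.

```python
from datetime import datetime

# Illustrative in-memory stand-in for the passage history information DB 12.
passage_history_db = []

def register_passage(token_id: str, touch_point: str) -> dict:
    """Register passage history information indicating the relationship
    between a token ID and passage information at a touch point."""
    record = {
        "tokenId": token_id,
        "touchPoint": touch_point,        # e.g. "P1" for the check-in lobby
        "passageTime": datetime.now().isoformat(),
    }
    passage_history_db.append(record)
    return record
```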
The management server 10 then transmits the response data D2 to the check-in terminal 20 (step S118) and ends the process.
On the other hand, if the check-in terminal 20 references the response data D2 received from the management server 10 and determines that it is not possible to perform a check-in procedure on the user U (step S114, NO), the check-in terminal 20 notifies the user U of an error message (step S119). For example, a notification screen including a message such as “Please perform a check-in procedure at the manned counter” is displayed on the display device 207.
In such a way, a target face image (captured face image) successfully matched with the passport face image acquired from a passport in the check-in procedure is registered in the token ID information DB 11 as the registered face image, and the registered face image and the operation information in the operation information DB 13 are associated with each other by the issued token ID. This enables a matching process between a captured face image and the registered face image at each subsequent touch point. That is, the token ID associated with the registered face image is identification information that can be used commonly at all the touch points. The use of such a common token ID can increase the efficiency of inspection of the user U.
The automatic baggage check-in machine 30 captures an image of an area in front of the machine continually or periodically and determines whether or not a face of the user U standing in front of the automatic baggage check-in machine 30 is detected in the captured image (step S201). The automatic baggage check-in machine 30 stands by until a face of the user U is detected in an image by the biometric information acquisition device 309 (step S201, NO).
If the automatic baggage check-in machine 30 determines that a face of the user U is detected by the biometric information acquisition device 309 (step S201, YES), the automatic baggage check-in machine 30 captures the face of the user U and acquires a face image of the user U as a target face image (step S202).
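The standby-until-detected behavior of steps S201 and S202 amounts to a polling loop over captured frames. The sketch below uses a stub detector standing in for the biometric information acquisition device 309; the detector, camera callable, and frame format are all hypothetical.

```python
import time

def detect_face(frame):
    """Hypothetical detector stub: returns a face image if a face of the
    user U is detected in the frame, otherwise None."""
    return frame.get("face")

def wait_and_capture(camera, poll_interval=0.0):
    """Poll captured frames until a face is detected (step S201), then
    return the face image as the target face image (step S202)."""
    while True:
        frame = camera()
        face = detect_face(frame)
        if face is not None:
            return face
        time.sleep(poll_interval)   # stand by, then capture again
```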
Next, the automatic baggage check-in machine 30 transmits the request data D1 that requests execution of a matching process between the target face image and the registered face image to the management server 10 (step S203).
Further, in the face authenticating data B12, label information and data on each item of a capturing time of a captured face image (“queryTimeStamp”) and a file name of a captured face image (“queryFaceImage”) are written. When the command is a matching request, unlike the example of
In response to receiving the request data D1 from the automatic baggage check-in machine 30, the management server 10 performs matching of a face image of the user U (step S204). That is, the management server 10 matches the target face image included in the request data D1 received from the automatic baggage check-in machine 30 with a plurality of registered face images registered in the token ID information DB 11 on a one-to-N basis. Note that the registered face images to be matched are limited to images associated with a token ID whose invalidation flag value is “1” (valid).
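The one-to-N matching in step S204, restricted to token IDs whose invalidation flag is "1" (valid), might be sketched as follows. The cosine-similarity measure, the threshold value, and the record keys are assumptions for illustration; the actual matching algorithm is not specified here.

```python
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def match_one_to_n(target_feature, token_db, threshold=0.9):
    """Match the target feature amount against the registered feature
    amounts of valid token IDs and return the best record above the
    threshold, or None if the matching failed."""
    best = None
    best_score = threshold
    for rec in token_db:
        if rec["invalidationFlag"] != 1:
            continue                      # skip invalidated token IDs
        score = cosine_similarity(target_feature, rec["featureAmount"])
        if score >= best_score:
            best, best_score = rec, score
    return best
```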
Herein, if the management server 10 determines that the matching failed (step S205, NO), the management server 10 transmits the response data D2 including matching result information indicating the failed matching to the automatic baggage check-in machine 30 (step S207), and the process proceeds to step S209. In contrast, if the management server 10 determines that the matching is successful (step S205, YES), the process proceeds to step S206.
In step S206, the management server 10 acquires operation information from the operation information DB 13 by using, as a key, the token ID associated with the successfully matched registered face image in the token ID information DB 11. The management server 10 then transmits the response data D2 to the automatic baggage check-in machine 30 (step S208).
Further, in the operation data B22, operation information (a passenger name ("PassengerName"), a passport number ("PassportNum"), a nationality ("Nationality"), a date of birth ("DateofBirth"), a gender ("Sex"), or the like) acquired from the operation information DB 13 based on the token ID is written.
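A successful response data D2 carrying the operation data B22 might look like the following fragment. Only the operation data labels come from the description above; the envelope keys ("matchResult", "tokenId") and all values are invented for illustration.

```python
import json

# Illustrative response data D2 for a successful match (values are hypothetical).
response_d2 = {
    "matchResult": "success",
    "tokenId": "20200101100005-000001",
    "appdata": {                          # operation data B22
        "PassengerName": "TARO KOKUSAI",
        "PassportNum": "AB1234567",
        "Nationality": "JPN",
        "DateofBirth": "1980-01-01",
        "Sex": "M",
    },
}

print(json.dumps(response_d2, indent=2))
```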
Next, if the automatic baggage check-in machine 30 references the response data D2 and determines that it is possible to perform the procedure (step S209, YES), the automatic baggage check-in machine 30 performs a process of a baggage check-in procedure on the user U (step S210).
Next, the automatic baggage check-in machine 30 transmits the request data D1 that requests registration of passage history information on the user U to the management server 10 (step S211).
In response to receiving the request data D1 from the automatic baggage check-in machine 30, the management server 10 registers passage history information indicating the relationship between the token ID and passage information on the user U at the touch point P2 to the passage history information DB 12 (step S212).
The management server 10 then transmits the response data D2 to the automatic baggage check-in machine 30 (step S213) and ends the process.
On the other hand, if the automatic baggage check-in machine 30 references the response data D2 and determines that it is not possible to perform the procedure (step S209, NO), the automatic baggage check-in machine 30 notifies the user U of an error message (step S214). For example, a notification screen including a message such as “Please check in your baggage at the manned counter.” is displayed on the display device 307.
The security inspection apparatus 40 captures an image of an area in front of the entrance of the security inspection site continually or periodically and determines whether or not a face of the user U standing in front of the entrance is detected in the captured image (step S301). The security inspection apparatus 40 stands by until a face of the user U is detected in an image by the biometric information acquisition device 409 (step S301, NO).
If the security inspection apparatus 40 determines that a face of the user U is detected by the biometric information acquisition device 409 (step S301, YES), the security inspection apparatus 40 captures the face of the user U and acquires a face image of the user U as a target face image (step S302).
Next, the security inspection apparatus 40 transmits the request data D1 that requests execution of a matching process between the target face image and the registered face image to the management server 10 (step S303).
In response to receiving the request data D1 from the security inspection apparatus 40, the management server 10 performs matching of a face image of the user U (step S304). That is, the management server 10 matches the target face image included in the request data D1 received from the security inspection apparatus 40 with a plurality of registered face images registered in the token ID information DB 11 on a one-to-N basis. Note that the registered face images to be matched are limited to images associated with a token ID whose invalidation flag value is “1” (valid).
Herein, if the management server 10 determines that the matching failed (step S305, NO), the management server 10 transmits the response data D2 including matching result information indicating the failed matching to the security inspection apparatus 40 (step S307), and the process proceeds to step S309. In contrast, if the management server 10 determines that the matching is successful (step S305, YES), the process proceeds to step S306.
In step S306, the management server 10 acquires operation information from the operation information DB 13 by using, as a key, the token ID associated with the successfully matched registered face image in the token ID information DB 11. The management server 10 then transmits the response data D2 including the matching result information, the token ID, and the operation information to the security inspection apparatus 40 (step S308).
Next, if the security inspection apparatus 40 references the response data D2 and determines that it is possible to perform the procedure (step S309, YES), the security inspection apparatus 40 performs a security inspection process on the user U (step S310). In the security inspection process, the CPU 401 controls each component of the security inspection apparatus 40. Accordingly, the security inspection apparatus 40 detects a metal object worn by the user U passing through the metal detector gate 410. The user U who has passed through the metal detector gate 410 moves to the departure inspection site.
Next, the security inspection apparatus 40 transmits the request data D1 that requests registration of passage history information on the user U to the management server 10 (step S311).
In response to receiving the request data D1 from the security inspection apparatus 40, the management server 10 registers passage history information indicating the relationship between the token ID and passage information on the user U at the touch point P3 to the passage history information DB 12 (step S312).
The management server 10 then transmits the response data D2 to the security inspection apparatus 40 (step S313) and ends the process.
On the other hand, if the security inspection apparatus 40 references the response data D2 and determines that it is not possible to perform the procedure (step S309, NO), the security inspection apparatus 40 notifies the user U of an error message (step S314).
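The one-to-N matching of steps S304 to S308 can be sketched as follows. This is a minimal illustration only: the record layout of the token ID information DB 11, the feature representation, and the similarity function are assumptions for explanation and are not disclosed by the embodiment.

```python
# Illustrative sketch of one-to-N matching restricted to valid token IDs.
# The DB layout and the similarity function are hypothetical.

def similarity(a, b):
    # Placeholder similarity: fraction of matching feature values.
    hits = sum(1 for x, y in zip(a, b) if x == y)
    return hits / max(len(a), 1)

def match_one_to_n(target_features, token_db, threshold=0.8):
    """Match the target face features against every registered face image
    whose token ID is still valid (invalidation flag value "1")."""
    best_token, best_score = None, 0.0
    for token_id, record in token_db.items():
        if record["invalidation_flag"] != "1":  # skip invalidated token IDs
            continue
        score = similarity(target_features, record["registered_features"])
        if score > best_score:
            best_token, best_score = token_id, score
    if best_score >= threshold:
        return {"result": "success", "token_id": best_token}
    return {"result": "failure", "token_id": None}
```

The filter on the invalidation flag is applied before any comparison, which is what limits the registered face images to be matched, as described above.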
The automated gate apparatus 50 captures an image of an area in front of the automated gate apparatus 50 continually or periodically and determines whether or not a face of the user U standing in front of the automated gate apparatus 50 is detected in the captured image (step S401). The automated gate apparatus 50 stands by until a face of the user U is detected in an image by the biometric information acquisition device 509 (step S401, NO).
If the automated gate apparatus 50 determines that a face of the user U is detected by the biometric information acquisition device 509 (step S401, YES), the automated gate apparatus 50 captures the face of the user U and acquires a face image of the user U as a target face image (step S402).
Next, the automated gate apparatus 50 transmits the request data D1 that requests execution of a matching process between the target face image and the registered face image to the management server 10 (step S403).
In response to receiving the request data D1 from the automated gate apparatus 50, the management server 10 performs matching of a face image of the user U (step S404). That is, the management server 10 matches the target face image included in the request data D1 received from the automated gate apparatus 50 with a plurality of registered face images registered in the token ID information DB 11 on a one-to-N basis. Note that the registered face images to be matched are limited to images associated with a token ID whose invalidation flag value is “1” (valid).
Herein, if the management server 10 determines that the result of matching is that the matching failed (step S405, NO), the management server 10 transmits the response data D2 including matching result information indicating the failed matching to the automated gate apparatus 50 (step S407), and the process proceeds to step S409. In contrast, if the management server 10 determines that the result of matching is that the matching is successful (step S405, YES), the process proceeds to step S406.
In step S406, the management server 10 acquires operation information from the operation information DB 13 by using, as a key, the token ID associated with the successfully matched registered face image in the token ID information DB 11. The management server 10 then transmits the response data D2 including the matching result information, the token ID, and the operation information to the automated gate apparatus 50 (step S408).
Next, if the automated gate apparatus 50 references the response data D2 and determines that it is possible to perform the procedure (step S409, YES), the automated gate apparatus 50 performs a departure inspection procedure on the user U and opens the gate 511 (step S410). The user U who has passed through the touch point P4 moves to the departure area where a boarding gate is present.
Next, the automated gate apparatus 50 transmits the request data D1 that requests registration of passage history information on the user U to the management server 10 (step S411).
In response to receiving the request data D1 from the automated gate apparatus 50, the management server 10 registers passage history information indicating the relationship between the token ID and passage information on the user U at the touch point P4 to the passage history information DB 12 (step S412).
The management server 10 then transmits the response data D2 to the automated gate apparatus 50 (step S413) and ends the process.
On the other hand, if the automated gate apparatus 50 references the response data D2 and determines that it is not possible to perform the procedure (step S409, NO), the automated gate apparatus 50 notifies the user U of an error message (step S414). For example, a notification screen including a message such as “Please perform a departure inspection procedure at the manned counter.” is displayed on the display device 507.
The boarding gate apparatus 60 captures an image of an area in front of the apparatus continually or periodically and determines whether or not a face of the user U standing in front of the boarding gate apparatus 60 is detected in the captured image (step S501). The boarding gate apparatus 60 stands by until a face of the user U is detected in an image by the biometric information acquisition device 609 (step S501, NO).
If the boarding gate apparatus 60 determines that a face of the user U is detected by the biometric information acquisition device 609 (step S501, YES), the boarding gate apparatus 60 captures the face of the user U and acquires a face image of the user U as a target face image (step S502).
Next, the boarding gate apparatus 60 transmits the request data D1 that requests execution of a matching process between the target face image and the registered face image to the management server 10 (step S503).
In response to receiving the request data D1 from the boarding gate apparatus 60, the management server 10 performs matching of a face image of the user U (step S504). That is, the management server 10 matches the target face image included in the request data D1 received from the boarding gate apparatus 60 with a plurality of registered face images registered in the token ID information DB 11 on a one-to-N basis. Note that the registered face images to be matched are limited to images associated with a token ID whose invalidation flag value is “1” (valid).
Herein, if the management server 10 determines that the result of matching is that the matching failed (step S505, NO), the management server 10 transmits the response data D2 including matching result information indicating the failed matching to the boarding gate apparatus 60 (step S507), and the process proceeds to step S509. In contrast, if the management server 10 determines that the result of matching is that the matching is successful (step S505, YES), the process proceeds to step S506.
In step S506, the management server 10 acquires operation information from the operation information DB 13 by using, as a key, the token ID associated with the successfully matched registered face image in the token ID information DB 11. The management server 10 then transmits the response data D2 including the matching result information, the token ID, and the operation information to the boarding gate apparatus 60 (step S508).
Next, if the boarding gate apparatus 60 references the response data D2 and determines that it is possible to perform the procedure (step S509, YES), the boarding gate apparatus 60 performs an aircraft boarding procedure on the user U and opens the gate 611 (step S510). The user U who has passed through the touch point P5 boards the aircraft.
Next, the boarding gate apparatus 60 transmits the request data D1 that requests invalidation of the token ID and registration of passage history information on the user U to the management server 10 (step S511).
In response to receiving the request data D1 from the boarding gate apparatus 60, the management server 10 updates the token ID information DB 11 (step S512). Specifically, the management server 10 updates the invalidation flag of the token ID information DB 11 to the invalid value (“0”). Accordingly, the validity period (lifecycle) of the token ID expires.
Next, the management server 10 registers passage history information indicating the relationship between the token ID and passage information on the user U at the touch point P5 to the passage history information DB 12 (step S513).
The management server 10 then transmits the response data D2 to the boarding gate apparatus 60 (step S514) and ends the process.
On the other hand, if the boarding gate apparatus 60 references the response data D2 and determines that it is not possible to perform the procedure (step S509, NO), the boarding gate apparatus 60 notifies the user U of an error message (step S515). For example, the boarding gate apparatus 60 displays a notification screen including a message such as “Please perform the procedure at the manned counter.” on the display device 607.
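The token ID invalidation of step S512 can be sketched as follows. The flag values follow the embodiment (“1” for valid, “0” for invalid), while the record layout is an assumption for illustration.

```python
# Illustrative sketch of token ID invalidation at boarding (step S512).
# The token ID information DB 11 is modeled as a plain dict.

def invalidate_token(token_db, token_id):
    """Set the invalidation flag to "0" (invalid), ending the token ID
    lifecycle so that later one-to-N matching skips this record."""
    record = token_db.get(token_id)
    if record is None:
        return False
    record["invalidation_flag"] = "0"  # "1" = valid, "0" = invalid
    return True
```

Because matching is limited to records whose flag is “1”, an invalidated token ID can no longer be hit by face authentication at any touch point.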
As described above, according to the present example embodiment, a face authentication technology can be easily applied to a plurality of operations performed in the airport A. This increases the efficiency of each operation. Further, although airport operations performed on the user U departing from a country have been described in the present example embodiment, the configuration of the present example embodiment can also be applied to other operations such as an immigration inspection procedure on entry to a country, a customs procedure, and the like.
An information processing system 2 in the present example embodiment will be described below. Note that reference numerals common to those in the drawings of the first example embodiment denote the same objects. The description of features common to the first example embodiment will be omitted, and different features will be described in detail.
Further, the management server 10 in the present example embodiment has the token ID information DB 11, the passage history information DB 12, and the operation information DB 13 in the same manner as in the first example embodiment. However, since the operations in the present example embodiment differ from the operations of the first example embodiment, the data items of the operation information stored in the operation information DB 13 also differ. Specifically, the operations in the present example embodiment include an operation involving commercial transactions in a station. Thus, the operation information includes payment information such as a credit card number. Note that the management server 10 and the automatic ticket gate 80 can control entry and exit of a user U based on face authentication even for the user U whose payment information is not registered.
The automatic change machine 712 is an apparatus that, when the total amount of money put into a deposit slot exceeds the payment for purchasing an item or using a service, automatically discharges change to a dispensing port in an amount calculated by the CPU 701. The printer 713 prints a train ticket, a receipt, a credit card statement, or the like under the control of the CPU 701.
Subsequently, the operation of each device of the information processing system 2 in the present example embodiment will be described with reference to
First, the automatic ticket vending machine 70 acquires information related to a train ticket such as a date, a section, a train name, and a seat category and payment information for purchasing (step S601).
In
Next, the automatic ticket vending machine 70 determines whether or not there is a consent from the user U about face image capturing. Herein, if the automatic ticket vending machine 70 determines that there is a consent from the user U (step S602, YES), the automatic ticket vending machine 70 acquires a face image of the user U captured by the biometric information acquisition device 709 as a registered face image (step S603), and the process proceeds to step S604. In contrast, if the automatic ticket vending machine 70 determines that there is no consent from the user U (step S602, NO), the process proceeds to step S604.
In step S604, the automatic ticket vending machine 70 transmits the request data D1 that requests purchasing of a train ticket and issuance of a token ID to the management server 10. If a face image has been captured in step S603, the face image of the user U (registered face image) is included in the request data D1. Further, if a consent about registration of payment information has been obtained from the user U, it is preferable to include an instruction for payment information registration in the request data D1.
In the control data B11, label information and data on each item of a station name (“stationName”), an installation area of the automatic ticket vending machine 70 (“area”), a device name (“deviceName”), a system type (“sysType”), information on a system vender (“sysVender”), a request data transmission time (“reqTimeStamp”), a camera ID used for capturing a face image (“cameraId”), and a camera model name (“cameraModel”) are written.
Further, in the face authenticating data B12, label information and data on each item of a capturing time of a captured face image (“queryTimeStamp”) and a file name of a captured face image (“queryFaceImage”) are written.
Further, the operation data B13 is provided with a single label (“appdata”), and label information and data on each data item of operation data are hierarchically written in a portion bracketed between a symbol indicating a start part (“{”) and a symbol indicating an end part (“}”). In the example of
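Putting the above together, the request data D1 can be sketched in the JSON form as follows, with the items of the control data B11 and the face authenticating data B12 and the operation data B13 encapsulated under the single “appdata” label. The item names follow the description above; the concrete values are illustrative only.

```python
import json

# Illustrative sketch of the request data D1 (values are made up).
request_d1 = {
    # control data B11
    "stationName": "Central",
    "area": "north-concourse",
    "deviceName": "ticket-vending-70",
    "sysType": "ticketing",
    "sysVender": "ExampleVendor",
    "reqTimeStamp": "2020-01-01T09:00:00",
    "cameraId": "CAM-01",
    "cameraModel": "ModelX",
    # face authenticating data B12
    "queryTimeStamp": "2020-01-01T09:00:01",
    "queryFaceImage": "face_0001.jpg",
    # operation data B13, encapsulated under a single label
    "appdata": {
        "date": "2020-01-02",
        "section": "Central-Harbor",
        "trainName": "Express 5",
        "seatCategory": "reserved",
    },
}

payload = json.dumps(request_d1)      # serialized D1 sent to the server
received = json.loads(payload)        # management server 10 side
operation_data = received["appdata"]  # extracted as one encapsulated unit
```

Because the operation data B13 is bracketed under one label, the receiving side can pass it to operation processing as a unit without interpreting its hierarchical items.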
In step S605, in response to receiving the request data D1 from the automatic ticket vending machine 70, the management server 10 performs a train ticket purchasing process. The management server 10 then issues a token ID (step S606). Next, the management server 10 registers the relationship between the token ID and the registered face image to the token ID information DB 11 (step S607).
Next, the management server 10 registers the relationship between the token ID and operation information (the operation data B13 of
Next, the management server 10 transmits the response data D2 including the issued token ID to the automatic ticket vending machine 70 (step S609).
In step S610, the automatic ticket vending machine 70 references the response data D2 and determines whether or not the user U successfully purchased a train ticket. Herein, if the automatic ticket vending machine 70 determines that the user U successfully purchased a train ticket (step S610, YES), the automatic ticket vending machine 70 prints a receipt (step S611) and ends the process. Note that, instead of a receipt, a paper train ticket may be printed.
On the other hand, if the automatic ticket vending machine 70 references the response data D2 and determines that the user U failed to purchase a train ticket (step S610, NO), the automatic ticket vending machine 70 notifies the user U that the purchasing process failed (step S612).
First, the management server 10 queries reservation information stored in a reservation database (not illustrated) based on conditions specified by the user U (step S701).
Next, the management server 10 determines based on a query result whether or not a boarding reservation is available (step S702). Herein, if the management server 10 determines that the boarding reservation is available (step S702, YES), the management server 10 performs a payment process based on payment information (step S703), and the process proceeds to step S704. In contrast, if the management server 10 determines that the boarding reservation is unavailable (step S702, NO), the process proceeds to step S710.
In step S704, the management server 10 determines whether or not the payment process is normally completed. Herein, if the management server 10 determines that the payment process is normally completed (step S704, YES), the management server 10 registers reservation information to the reservation system (step S705), and the process proceeds to step S706. In contrast, if the management server 10 determines that the payment process failed (step S704, NO), the process proceeds to step S710.
In step S706, the management server 10 determines whether or not car dispatch is requested. Herein, if the management server 10 determines that car dispatch is requested (step S706, YES), the management server 10 acquires an estimated arrival time at the arrival station AS of the train TR to be used (step S707) and performs a car dispatch reservation process (step S708). In the car dispatch reservation process, it is preferable to reflect a car dispatch place and a desired car dispatch time input by the user U (see
In step S709, the management server 10 outputs a process result indicating purchase completion, and the process proceeds to step S606 of
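The branching of steps S701 to S710 can be condensed into the following sketch. The reservation and payment back ends are stubbed as boolean inputs, which is an assumption made purely for illustration.

```python
# Illustrative sketch of the purchase subroutine (steps S701 to S710).
# Back-end calls (reservation query, payment gateway) are stubbed.

def purchase_flow(reservation_available, payment_ok, dispatch_requested):
    if not reservation_available:                      # step S702, NO
        return {"result": "failure"}                   # -> step S710
    if not payment_ok:                                 # step S704, NO
        return {"result": "failure"}                   # -> step S710
    performed = ["reservation_registered"]             # step S705
    if dispatch_requested:                             # step S706, YES
        performed.append("car_dispatch_reserved")      # steps S707-S708
    return {"result": "completed", "steps": performed}  # step S709
```

Note that car dispatch is strictly optional: its absence does not affect the purchase completion result, matching the flow described above.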
The automatic ticket gate 80 captures an image of an area in front of the automatic ticket gate 80 continually or periodically and determines whether or not a face of the user U standing in front of the automatic ticket gate 80 is detected in the captured image (step S801). The automatic ticket gate 80 stands by until a face of the user U is detected in an image by the biometric information acquisition device 809 (step S801, NO).
If the automatic ticket gate 80 determines that a face of the user U is detected by the biometric information acquisition device 809 (step S801, YES), the automatic ticket gate 80 captures the face of the user U and acquires a face image of the user U as a target face image (step S802).
Next, the automatic ticket gate 80 transmits the request data D1 that requests execution of a matching process between the target face image and the registered face image to the management server 10 (step S803).
In response to receiving the request data D1 from the automatic ticket gate 80, the management server 10 performs matching of a face image of the user U (step S804). That is, the management server 10 matches the target face image included in the request data D1 received from the automatic ticket gate 80 with a plurality of registered face images registered in the token ID information DB 11 on a one-to-N basis. Note that the registered face images to be matched are limited to images associated with a token ID whose invalidation flag value is “1” (valid).
Herein, if the management server 10 determines that the result of matching is that the matching failed (step S805, NO), the management server 10 transmits the response data D2 including matching result information indicating the failed matching to the automatic ticket gate 80 (step S807), and the process proceeds to step S809. In contrast, if the management server 10 determines that the result of matching is that the matching is successful (step S805, YES), the process proceeds to step S806.
In step S806, the management server 10 acquires operation information from the operation information DB 13 by using, as a key, the token ID associated with the successfully matched registered face image in the token ID information DB 11. The management server 10 then transmits the response data D2 including the matching result information, the token ID, and the operation information to the automatic ticket gate 80 (step S808).
Next, if the automatic ticket gate 80 references the response data D2 and determines that it is possible to permit entry (step S809, YES), the automatic ticket gate 80 opens the gate 811 (step S810). The user U who has passed the ticket gate moves to a predetermined place in order to board the train TR.
Next, the automatic ticket gate 80 transmits the request data D1 that requests registration of passage history information on the user U to the management server 10 (step S811).
In response to receiving the request data D1 from the automatic ticket gate 80, the management server 10 registers passage history information indicating the relationship between the token ID and passage information on the user U to the passage history information DB 12 (step S812).
The management server 10 then transmits the response data D2 to the automatic ticket gate 80 (step S813) and ends the process.
On the other hand, if the automatic ticket gate 80 references the response data D2 and determines that it is not possible to permit entry (step S809, NO), the automatic ticket gate 80 notifies the user U of an error message (step S814). For example, a notification screen including a message such as “Please contact the station staff nearby.” is displayed on the display device 807.
First, when the start of a payment process is instructed by staff, the POS terminal 90 determines whether or not a face of the user U standing in front of the POS terminal 90 is detected in an image captured of the area in front of the POS terminal 90 (step S901). The POS terminal 90 stands by until a face of the user U is detected in an image by the biometric information acquisition device 909 (step S901, NO).
If the POS terminal 90 determines that a face of the user U is detected by the biometric information acquisition device 909 (step S901, YES), the POS terminal 90 captures the face of the user U and acquires a face image of the user U as a target face image (step S902).
Next, the POS terminal 90 transmits the request data D1 that requests execution of a matching process between the target face image and the registered face image and execution of a payment process to the management server 10 (step S903).
In response to receiving the request data D1 from the POS terminal 90, the management server 10 performs matching of a face image of the user U (step S904). That is, the management server 10 matches the target face image included in the request data D1 received from the POS terminal 90 with a plurality of registered face images registered in the token ID information DB 11 on a one-to-N basis. Note that the registered face images to be matched are limited to images associated with a token ID whose invalidation flag value is “1” (valid).
Herein, if the management server 10 determines that the result of matching is that the matching failed (step S905, NO), the management server 10 transmits the response data D2 including matching result information indicating the failed matching to the POS terminal 90 (step S910), and the process proceeds to step S911. In contrast, if the management server 10 determines that the result of matching is that the matching is successful (step S905, YES), the process proceeds to step S906.
In step S906, the management server 10 acquires payment information included in operation information from the operation information DB 13 by using, as a key, the token ID associated with the successfully matched registered face image.
Next, the management server 10 performs a payment process based on the payment information (step S907) and then updates the operation information DB 13 based on the token ID, information on the purchased item, and the like (step S908).
The management server 10 then transmits the response data D2 including the result of matching, the token ID, and a payment result to the POS terminal 90 (step S909), and the process then proceeds to step S911.
In step S911, the POS terminal 90 references the response data D2 and determines whether or not the payment process is normally completed. Herein, if the POS terminal 90 determines that the payment process is normally completed (step S911, YES), the POS terminal 90 prints a receipt (step S912) and ends the process.
On the other hand, if the POS terminal 90 determines that it is not possible to perform the payment process based on the payment information (step S911, NO), the POS terminal 90 notifies the user U of an error message (step S913).
As described above, a face image and payment information are associated with each other by a token ID when a train ticket is purchased, thereby enabling a payment process using face authentication during the period in which the token ID is valid.
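The payment steps S906 to S908 can be sketched as follows, with the token ID obtained by face matching used as the key into the operation information DB 13. The field names and the in-memory update are illustrative assumptions; an actual system would call a payment gateway rather than merely record the purchase.

```python
# Illustrative sketch of payment keyed by a token ID (steps S906 to S908).
# The operation information DB 13 is modeled as a dict; "cardNumber" is a
# hypothetical field name for the registered payment information.

def pay_by_face(token_id, operation_db, amount):
    info = operation_db.get(token_id)
    if info is None or "cardNumber" not in info:
        return {"paymentResult": "failure"}  # no payment information linked
    # A real system would charge via a payment gateway here; this sketch
    # only updates the purchase history in the operation information.
    info.setdefault("purchases", []).append(amount)
    return {"paymentResult": "success", "tokenId": token_id}
```

The point of the design is that the POS terminal 90 never handles the card number itself; only the token ID travels back in the response data D2.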
The automatic ticket gate 80 captures an image of an area in front of the automatic ticket gate 80 continually or periodically and determines whether or not a face of the user U standing in front of the automatic ticket gate 80 is detected in the captured image (step S1001). The automatic ticket gate 80 stands by until a face of the user U is detected in an image by the biometric information acquisition device 809 (step S1001, NO).
If the automatic ticket gate 80 determines that a face of the user U is detected by the biometric information acquisition device 809 (step S1001, YES), the automatic ticket gate 80 captures the face of the user U and acquires a face image of the user U as a target face image (step S1002).
Next, the automatic ticket gate 80 transmits the request data D1 that requests execution of a matching process between the target face image and the registered face image to the management server 10 (step S1003).
In response to receiving the request data D1 from the automatic ticket gate 80, the management server 10 performs matching of a face image of the user U (step S1004). That is, the management server 10 matches the target face image included in the request data D1 received from the automatic ticket gate 80 with a plurality of registered face images registered in the token ID information DB 11 on a one-to-N basis. Note that the registered face images to be matched are limited to images associated with a token ID whose invalidation flag value is “1” (valid).
Herein, if the management server 10 determines that the result of matching is that the matching failed (step S1005, NO), the management server 10 transmits the response data D2 including matching result information indicating the failed matching to the automatic ticket gate 80 (step S1007), and the process proceeds to step S1009. In contrast, if the management server 10 determines that the result of matching is that the matching is successful (step S1005, YES), the process proceeds to step S1006.
In step S1006, the management server 10 acquires operation information from the operation information DB 13 by using, as a key, the token ID associated with the successfully matched registered face image in the token ID information DB 11. The management server 10 then transmits the response data D2 including the matching result information, the token ID, and the operation information to the automatic ticket gate 80 (step S1008).
Next, if the automatic ticket gate 80 references the response data D2 and determines that it is possible to permit exit (step S1009, YES), the automatic ticket gate 80 opens the gate 811 (step S1010).
Next, the automatic ticket gate 80 transmits the request data D1 that requests invalidation of the token ID and registration of passage history information on the user U to the management server 10 (step S1011).
In response to receiving the request data D1 from the automatic ticket gate 80, the management server 10 updates the token ID information DB 11 (step S1012). Specifically, the management server 10 updates the invalidation flag of the token ID information DB 11 to the invalid value (“0”). Accordingly, the validity period (lifecycle) of the token ID expires.
Next, the management server 10 registers passage history information indicating the relationship between the token ID and ticket gate passage information on the user U to the passage history information DB 12 (step S1013).
The management server 10 then transmits the response data D2 to the automatic ticket gate 80 (step S1014) and ends the process.
On the other hand, if the automatic ticket gate 80 references the response data D2 and determines that it is not possible to permit exit (step S1009, NO), the automatic ticket gate 80 notifies the user U of an error message (step S1015).
As described above, according to the present example embodiment, the management server 10 can easily apply a face authentication technology to a plurality of operations performed in railroad facilities. This increases the efficiency of each operation.
Further, when the face authentication technology is used in railroad operations, a face image is captured when a train ticket or a limited express ticket is purchased at a mobile terminal, a ticket vending machine, or a counter. When ticket inspection is performed in the train by a conductor, the conductor may use a portable operation terminal to capture a face image of the user U and upload the face image to the management server 10 via the networks NW and NW3. This enables a ticket inspection operation based on face authentication in the train TR. Note that the ticket inspection may be automatically performed by the management server 10 based on an image captured by a network camera installed inside a vehicle.
Furthermore, if a token ID and payment information are associated with each other in advance, the user U can pay the fare by using face authentication when exiting at the alighting station, even when boarding a train without specifying a destination or when riding past the destination.

[Third Example Embodiment]
Although the present invention has been described above with reference to the example embodiments, the present invention is not limited to the example embodiments described above. Various modifications that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope not departing from the spirit of the present invention. For example, it should be understood that an example embodiment in which a configuration of a part of any of the example embodiments is added to another example embodiment or an example embodiment in which a configuration of a part of any of the example embodiments is replaced with a configuration of a part of another example embodiment is also an example embodiment to which the present invention may be applied.
In the first and second example embodiments described above, the cases in which the present invention is applied to operations in an airport and a railroad have been described. However, the configuration of the present invention is applicable to operations in any type of business, such as the accommodation industry, the service industry, the manufacturing industry, and the like. For example, in application to operations in the accommodation industry, by associating a face image of a guest with operation information by a token ID at check-in to a hotel, it is possible to use face authentication to perform purchase of an item, use of a service, control of entry into and exit from a guest room, and the like in the facility during the stay period of the guest.
Although written in the JSON form in the first and second example embodiments described above, the request data D1 and the response data D2 may be written in other data formats such as an XML form. That is, any data format can be employed as long as it can encapsulate operation data.
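An extraction unit that handles both forms can be sketched as follows. The JSON keys follow the examples of this embodiment, while the XML element names are assumptions introduced only for illustration.

```python
import json
import xml.etree.ElementTree as ET

# Illustrative sketch of format-agnostic extraction of the biometric
# information and the encapsulated operation data from the request data D1.
# The XML element names mirror the JSON labels and are hypothetical.

def extract_operation_data(raw, data_format):
    if data_format == "json":
        doc = json.loads(raw)
        return doc["queryFaceImage"], doc["appdata"]
    if data_format == "xml":
        root = ET.fromstring(raw)
        face = root.findtext("queryFaceImage")
        appdata = {child.tag: child.text for child in root.find("appdata")}
        return face, appdata
    raise ValueError("unsupported format")
```

Either way, the operation data stays encapsulated until this point, so switching the wire format does not affect the authentication unit or the operation processing unit.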
The scope of each of the example embodiments also includes a processing method that stores, in a storage medium, a program that causes the configuration of each of the example embodiments to operate so as to implement the function of each of the example embodiments described above, reads the program stored in the storage medium as code, and executes the program in a computer. That is, the scope of each of the example embodiments also includes a computer-readable storage medium. Further, each of the example embodiments includes not only the storage medium in which the program described above is stored but also the individual program itself.
As the storage medium, for example, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a magnetic tape, a nonvolatile memory card, or the like can be used. Further, the scope of each of the example embodiments is not limited to an example in which a process is performed by an individual program stored in the storage medium, but also includes an example that operates on an OS to perform a process in cooperation with other software or a function of an add-in board.
The whole or part of the example embodiments disclosed above can be described as, but not limited to, the following supplementary notes.
(Supplementary Note 1)
An information processing apparatus comprising:
a receiving unit that receives request data having control data from an operation terminal used for a predetermined operation;
an extraction unit that extracts biometric information used for biometric authentication and encapsulated operation data from the request data based on the control data;
an authentication unit that performs biometric authentication on a user based on the biometric information; and
an operation processing unit that processes the operation data.
(Supplementary Note 2)
The information processing apparatus according to supplementary note 1, further comprising a transmission unit that transmits, to the operation terminal, response data having the control data and the operation data in which process results from the operation processing unit and the authentication unit are reflected, respectively,
wherein in the response data, the operation data is encapsulated.
(Supplementary Note 3)
The information processing apparatus according to supplementary note 2, wherein the operation data includes a label for the operation processing unit to identify a data item.
(Supplementary Note 4)
The information processing apparatus according to supplementary note 3, further comprising an issuance unit that issues an identifier that associates the biometric information on the user with the operation data.
(Supplementary Note 5)
The information processing apparatus according to supplementary note 4, wherein the control data includes an execution command related to at least one of a registration process of the operation data, a search process of the operation data, an issuance process of the identifier, and the biometric authentication.
(Supplementary Note 6)
The information processing apparatus according to supplementary note 4 or 5, wherein when the biometric information of the request data includes first biometric information acquired from the user at the operation terminal provided in an airport facility and second biometric information acquired from a passport possessed by the user,
the authentication unit matches the first biometric information with the second biometric information, and
when a result of matching performed by the authentication unit is that the matching is successful, the issuance unit sets the first biometric information as registered biometric information on the user and associates the operation data related to the passport or a boarding ticket with the registered biometric information by using the identifier.
(Supplementary Note 7)
The information processing apparatus according to supplementary note 6, wherein when a result of matching of the first biometric information with the registered biometric information performed by the authentication unit is that the matching is successful, the response data includes the operation data associated with the registered biometric information.
(Supplementary Note 8)
The information processing apparatus according to supplementary note 4 or 5, wherein the issuance unit sets the biometric information acquired from the user as registered biometric information on the user when a train ticket is issued at the operation terminal provided in a railroad facility and associates the operation data related to the train ticket with the registered biometric information by using the identifier.
(Supplementary Note 9)
The information processing apparatus according to supplementary note 8, wherein when a result of matching of the biometric information with the registered biometric information performed by the authentication unit is that the matching is successful, the response data includes the operation data associated with the registered biometric information.
(Supplementary Note 10)
The information processing apparatus according to any one of supplementary notes 1 to 9,
wherein the operation processing unit is formed of a plurality of APIs, and
wherein the extraction unit assigns the operation data to two or more of the APIs based on the control data of the request data.
(Supplementary Note 11)
The information processing apparatus according to any one of supplementary notes 1 to 10, wherein the request data is written in JSON or XML.
(Supplementary Note 12)
The information processing apparatus according to any one of supplementary notes 1 to 11, wherein the biometric information is any of a face image, an iris image, and a fingerprint image.
(Supplementary Note 13)
An information processing method comprising:
receiving request data having control data from an operation terminal used for a predetermined operation;
extracting biometric information used for biometric authentication and encapsulated operation data from the request data based on the control data;
performing biometric authentication on a user based on the biometric information; and
processing the operation data.
(Supplementary Note 14)
A storage medium storing a program that causes a computer to perform:
receiving request data having control data from an operation terminal used for a predetermined operation;
extracting biometric information used for biometric authentication and encapsulated operation data from the request data based on the control data;
performing biometric authentication on a user based on the biometric information; and
processing the operation data.
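The receive-extract-authenticate-process flow recited above can be sketched end to end as follows. This is a minimal, hypothetical Python illustration with placeholder authentication and processing logic and assumed field names, not the claimed implementation.

```python
import json

def handle_request(request_json: str) -> str:
    """Sketch of the apparatus: receive, extract, authenticate, process, respond."""
    request = json.loads(request_json)        # receiving unit: request data
    control = request["control"]              # control data steers extraction
    biometric = request["biometric"]          # extraction unit: biometric information
    operation = request["operation"]          # extraction unit: encapsulated operation data

    auth_ok = authenticate(biometric) if "authenticate" in control["commands"] else None
    op_result = process_operation(operation)  # operation processing unit

    response = {                              # response data: control data kept,
        "control": control,                   # both process results reflected,
        "auth_result": auth_ok,               # operation data re-encapsulated
        "operation": op_result,
    }
    return json.dumps(response)

def authenticate(biometric: dict) -> bool:
    # Placeholder for the authentication unit; a real one would match the
    # biometric information against registered biometric information.
    return bool(biometric.get("data"))

def process_operation(operation: dict) -> dict:
    # Placeholder for the operation processing unit; each labeled data item
    # could instead be dispatched to one of a plurality of APIs.
    return {label: {"status": "processed", **item}
            for label, item in operation.items()}
```

For example, a request carrying a single labeled operation item yields a response in which both the authentication result and the processed operation data are reflected alongside the echoed control data.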
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2019/038826 | 10/1/2019 | WO |