This application is a National Stage Entry of PCT/JP2020/010952 filed on Mar. 12, 2020, the contents of which are incorporated herein by reference in their entirety.
The present invention relates to an information processing apparatus, an information processing method, and a computer readable recording medium.
Many display devices that display various types of information are installed in airport facilities. These display devices display flight information of aircraft, information regarding airport facilities, information regarding a transportation system operating at the airport, and the like. For example, Patent Document 1 discloses an invention for displaying guidance information for a transportation system. In Patent Document 1, a database in which a face image and a destination are registered in association with each other is searched, a destination corresponding to a captured face image of a user is acquired, and guidance information to the destination is generated and displayed.
Patent Document 1: Japanese Patent Laid-Open Application No. 2009-205504
Incidentally, at an airport, and in particular at a hub airport, passengers sometimes stay at the airport for a long time due to an aircraft transfer or the like. In such a case, some passengers participate in tours at the airport where they disembark for a transfer, or rest at a hotel. For this reason, it is desirable to display information to passengers who have time to spare and to suggest an optimum service to them. The technique of Patent Document 1 cannot display information suitable for such passengers, and therefore cannot suggest sufficient services to them.
In view of this, an example of an object of the present invention is to provide an information processing apparatus, an information processing method, and a computer readable recording medium that display information to a passenger who has disembarked from an aircraft.
In order to achieve the above-described object, an information processing apparatus according to an aspect of the present invention includes:
Also, in order to achieve the above-described object, an information processing method according to an aspect of the present invention includes:
Furthermore, in order to achieve the above-described object, a computer readable recording medium according to an aspect of the present invention includes a program recorded thereon, the program including instructions that cause a computer to carry out:
According to the present invention, information can be displayed to a passenger who has disembarked from an aircraft.
Hereinafter, an information processing apparatus, an information processing method, and a program according to an example embodiment of the present invention will be described with reference to
First, a configuration of the information processing apparatus according to the present example embodiment will be described with reference to
The information processing apparatus 10 is an apparatus that displays information to a passenger who is to stay at an airport for a long time for transit or transfer. The information processing apparatus 10 includes a biological information acquisition unit 1, an identification information acquisition unit 2, a passenger determination unit 3, a tour determination unit 4, and a display control unit 5.
The biological information acquisition unit 1 acquires biological information of a passenger who has disembarked from the aircraft. The biological information includes face image data, fingerprint data, vocal cord data, iris data, and the like.
The identification information acquisition unit 2 acquires the identification information corresponding to the biological information acquired by the biological information acquisition unit 1 from a storage device in which the identification information of the passenger, which includes the flight information of the passenger, and the biological information of the passenger are stored in association with each other in advance.
The passenger determination unit 3 determines whether the passenger is a passenger who is to re-board an aircraft after disembarking from an aircraft (hereinafter referred to as a transit passenger) based on the flight information included in the identification information acquired by the identification information acquisition unit 2.
If the passenger is a transit passenger, the tour determination unit 4 determines a tour suitable for the passenger based on the identification information acquired by the identification information acquisition unit 2.
The display control unit 5 displays information of the determined tour on the display device.
If the passenger who has disembarked from the aircraft is a transit passenger, the information processing apparatus 10 having this configuration can display information of a tour suitable for that passenger. For example, a transit passenger stays at the airport for a long time in some cases. For this reason, because the information processing apparatus 10 displays the tour information, the transit passenger can spend the layover at the airport effectively.
Next, the configuration of the information processing apparatus 10 in the present example embodiment will be described more specifically.
The information processing apparatus 10 is connected to an image capture device 51, a storage device 52, and a display device 53 so as to be capable of data communication therewith.
The image capture device 51 is a device that is installed in an airport facility and captures an image of the face of the passenger 55. The image capture device 51 is installed at a location where an image of the face of the passenger 55 who has disembarked from an aircraft can be captured, such as an arrival gate or a baggage claim area of the airport.
The storage device 52 stores identification information of the passenger 55 and face image data of the passenger 55 in association with each other. The identification information includes unique information and flight information. The unique information is passport information, an e-mail address, and the like of the passenger 55. The passport information includes a name, nationality, gender, passport number, and so on. The flight information includes the destination of the passenger 55, a transit point, the departure and arrival times of the aircraft, and the like. The flight information need only include at least information from which it can be determined that the passenger 55 is a transit passenger, for example, information on the destination of the passenger 55 and information on the departure and arrival of the aircraft to be boarded, and the information included in the flight information is not limited to the above.
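As a minimal sketch, the records held in the storage device 52 might be organized as follows. All field names and types here are illustrative assumptions; the embodiment does not prescribe a particular data layout.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical record layout for the storage device 52.
# Field names are illustrative assumptions, not part of the embodiment.

@dataclass
class UniqueInfo:
    name: str
    nationality: str
    gender: str
    passport_number: str
    email: Optional[str] = None          # used for sending tour suggestions by e-mail

@dataclass
class FlightInfo:
    destination_airport: str             # final destination of the passenger
    transit_airport: Optional[str]       # transit point, if any
    arrival_time: datetime               # arrival time of the disembarked flight
    next_boarding_time: Optional[datetime] = None  # boarding start of the onward flight

@dataclass
class PassengerRecord:
    face_image: bytes                    # biological information (face image data)
    unique_info: UniqueInfo
    flight_info: FlightInfo
```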
For example, during check-in at the departure airport, identity confirmation is performed by matching the passport of the passenger 55 against the face image data, and then the face image data, the flight information, and the unique information are associated with each other and registered in the storage device 52. The passenger 55 is registered, for example, using a dedicated terminal installed at the airport. This makes it possible to confirm the identity by capturing an image of the face of the passenger 55 at each touch point after check-in. At each touch point, when an image of the face is captured, the face image data of the passenger 55 is selected from the storage device 52 through one-to-N matching using the captured face image data. By acquiring the unique information and the flight information corresponding to the selected face image data, identity confirmation is performed at each touch point. Note that, since the identity can be confirmed through face recognition at each touch point, the procedure status of the passenger 55 can also be managed.
The various procedures using face recognition and the processing for storing information in the storage device 52 may be performed by the information processing apparatus 10 or by another apparatus. Also, in the above description, the face image data, the unique information, and the flight information (identification information) are registered in the storage device 52 during check-in, but the timing of registration in the storage device 52 is not limited to this. For example, the passenger 55 may store the unique information, the flight information (identification information), and the face image data in association with each other in advance using a dedicated application of a mobile terminal such as a smartphone, a dedicated website, or the like.
The display device 53 is installed in the airport, for example, near the arrival gate 61. The display device 53 is, for example, a liquid crystal display device, and displays various types of information. Note that the display device 53 may be integrated with an image capture device, or an image capture device separate from the display device 53 may be provided that cooperates with the display device 53 to capture an image of the face of the passenger 55 looking at the display device 53.
The information processing apparatus 10 and the storage device 52 are connected to a network, and may be installed in the airport or in a facility other than the airport.
Also, the information processing apparatus 10 can transmit data to a display device 54. The display device 54 is, for example, a smartphone, a tablet terminal, a PC (personal computer), or the like owned by the passenger 55. The information processing apparatus 10 transmits an e-mail to the display device 54 based on the e-mail address included in the unique information obtained by capturing an image of the face of the passenger 55.
As described above, the information processing apparatus 10 includes the biological information acquisition unit 1, the identification information acquisition unit 2, the passenger determination unit 3, the tour determination unit 4, and the display control unit 5.
The biological information acquisition unit 1 acquires face image data of the passenger 55 from the image capture device 51. For example, as shown in
The identification information acquisition unit 2 acquires identification information corresponding to the face image data acquired by the biological information acquisition unit 1 from the storage device 52. When the biological information acquisition unit 1 acquires face image data, the identification information acquisition unit 2 selects the corresponding registered face image data from the storage device 52 through one-to-N matching using the acquired face image data. The identification information acquisition unit 2 then acquires the identification information associated with the selected face image data from the storage device 52.
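The embodiment does not specify a particular matching algorithm. A minimal sketch of one-to-N matching, assuming each registered face image has already been converted into an embedding vector by some face-recognition model (the embedding step and the similarity threshold are assumptions of this sketch), might look like this:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def one_to_n_match(query_embedding: np.ndarray, gallery, threshold: float = 0.6):
    """Select from the gallery the record whose stored embedding best matches
    the query embedding, or return None if no similarity exceeds the threshold.

    `gallery` is an iterable of (record, embedding) pairs read from the
    storage device 52; the threshold of 0.6 is an illustrative assumption.
    """
    best_record, best_score = None, threshold
    for record, embedding in gallery:
        score = cosine_similarity(query_embedding, embedding)
        if score > best_score:
            best_record, best_score = record, score
    return best_record
```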
The passenger determination unit 3 determines, based on the flight information included in the identification information, whether the passenger 55 whose image was captured by the biological information acquisition unit 1 is a transit passenger. For example, the passenger determination unit 3 determines, based on the flight information included in the identification information, whether the airport in which the image of the passenger 55 was captured (the airport where the image capture device 51 is installed) is the destination of the passenger 55. If it is not the destination, the passenger determination unit 3 determines that the passenger 55 is a transit passenger.
Note that transit passengers include passengers who are to re-board an aircraft different from the one they disembarked from, and passengers who are to re-board the same aircraft as the one they disembarked from.
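A minimal sketch of the determination made by the passenger determination unit 3, assuming the flight information carries the destination airport code and the apparatus knows the code of the airport where the image was captured (both are assumptions of this sketch):

```python
def is_transit_passenger(flight_info, captured_airport_code: str) -> bool:
    """The passenger 55 is treated as a transit passenger when the airport
    where the image was captured is not the passenger's final destination."""
    return flight_info.destination_airport != captured_airport_code

# Example: a passenger bound for "LHR" whose image was captured at "SIN"
# would be determined to be a transit passenger.
```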
If the passenger 55 is a transit passenger, the tour determination unit 4 determines a tour suitable for the passenger 55 based on the identification information. The tour determination unit 4 first searches for tours being held on the current day. The tour determination unit 4 may search for a tour using information provided by a tour provider, or such information may be stored in the storage device 52 and the tour determination unit 4 may search the storage device 52.
The tour determination unit 4 selects a tour suitable for the passenger 55 from among the searched-for tours based on the unique information or the flight information included in the identification information acquired by the identification information acquisition unit 2. For example, the tour determination unit 4 selects a tour that is popular with people who have the same nationality as the passenger 55 or are close in age to the passenger 55, based on the nationality, age, and gender of the passenger 55. The nationality, age, and gender of the passenger 55 are acquired from the passport information included in the unique information. Alternatively, the tour determination unit 4 estimates the stay time of the passenger 55 at the airport from the flight information, and selects a tour that can be participated in within the stay time. The stay time is, for example, the time from the current time until the transfer flight to be re-boarded starts boarding. Alternatively, the tour determination unit 4 acquires the destination of the flight of the passenger 55 and selects a tour according to the travel preference of the passenger 55.
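A sketch of the stay-time estimation and the duration-based selection follows, assuming each tour record carries its required duration (a field name assumed here) and allowing an illustrative safety margin for returning to the gate:

```python
from datetime import datetime, timedelta

def estimate_stay_time(now: datetime, boarding_start: datetime) -> timedelta:
    """Stay time: time from the current time until boarding of the
    transfer flight to be re-boarded starts."""
    return boarding_start - now

def tours_within_stay(tours, stay: timedelta,
                      margin: timedelta = timedelta(minutes=30)):
    """Keep only tours whose duration fits within the stay time.
    The 30-minute safety margin is an illustrative assumption."""
    return [tour for tour in tours if tour["duration"] + margin <= stay]
```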
The tour determination unit 4 may acquire information on the destination of the tour (hereinafter referred to as a sightseeing area), and narrow down the tours from the selected tours by reflecting that information. Information about a sightseeing area is information such as the degree of congestion of the sightseeing area, the weather at the sightseeing area, an event being held at the sightseeing area, and traffic conditions around the sightseeing area. The degree of congestion of the sightseeing area is, for example, the degree of concentration of people, and can be calculated by acquiring an image from an image capture device installed at the sightseeing area and performing image recognition on people in the image. In this case, it is possible to suggest a tour in which real-time information is reflected.
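One way the degree of congestion might be computed from an image of the sightseeing area is sketched below, assuming OpenCV's built-in HOG people detector is used; the people-per-megapixel normalization is an assumption of this sketch, not part of the embodiment.

```python
import cv2

def congestion_degree(image_path: str) -> float:
    """Rough congestion measure: detected people per megapixel of image.

    Uses OpenCV's default HOG people detector; the normalization by image
    area is an illustrative assumption.
    """
    img = cv2.imread(image_path)
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    rects, _weights = hog.detectMultiScale(img, winStride=(8, 8))
    megapixels = (img.shape[0] * img.shape[1]) / 1_000_000
    return len(rects) / megapixels
```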
When reflecting the degree of congestion, the tour determination unit 4 may refrain from selecting a tour if the degree of congestion is a predetermined value or more, or may refrain from selecting a tour if the degree of congestion is less than a predetermined value, on the grounds that the sightseeing area is not lively.
Also, when reflecting information on an event held at the sightseeing area, the tour determination unit 4 may preferentially select a tour whose sightseeing area has an event being held.
When reflecting the traffic conditions, the tour determination unit 4 calculates the round-trip time between the airport and the sightseeing area. Then, the tour determination unit 4 may select a tour in which the time obtained by subtracting the round-trip time from the stay time of the passenger 55 at the airport is a predetermined value or more.
When reflecting the weather at the sightseeing area, the tour determination unit 4 may refrain from selecting a tour when it is raining at the sightseeing area, because the sightseeing area cannot be enjoyed (e.g., the scenery is poor).
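The narrowing criteria described above might be combined as in the following sketch; the keys of the tour dictionary, the congestion limit, and the minimum remaining time are all illustrative assumptions.

```python
from datetime import timedelta

def narrow_down_tours(tours, stay: timedelta,
                      congestion_limit: float = 0.8,
                      min_remaining: timedelta = timedelta(minutes=60)):
    """Apply the congestion, event, traffic, and weather criteria."""
    selected = []
    for tour in tours:
        if tour["raining"]:
            continue  # weather: the sightseeing area cannot be enjoyed in the rain
        if tour["congestion"] >= congestion_limit:
            continue  # too crowded
        if stay - tour["round_trip_time"] < min_remaining:
            continue  # too little time left after the round trip to the sightseeing area
        selected.append(tour)
    # prioritize tours whose sightseeing area has an event being held
    selected.sort(key=lambda tour: tour["event_held"], reverse=True)
    return selected
```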
Also, the identification information may include information on a companion, such as a family member of the passenger 55, and the tour determination unit 4 may determine a tour suitable for the passenger 55 based on the information on the companion. For example, if the companion of the passenger 55 is a child, the tour determination unit 4 determines a tour suited to a child.
The display control unit 5 displays the information determined by the tour determination unit 4 on the display device 53, or transmits the information determined by the tour determination unit 4 to the display device 54 owned by the passenger 55 by e-mail.
If the passenger 55 who has disembarked from the aircraft is a transit passenger, the display control unit 5 suggests the tour to the passenger 55 by displaying the information of the tour as shown in
The display control unit 5 sends an e-mail including the information of the tour to the display device 54. As a result, a mail screen as shown in
Alternatively, the display control unit 5 transmits an address of a website displaying the information of the tour to the display device 54 by e-mail. The owner (passenger 55) of the display device 54 taps (clicks on) the address included in the e-mail to display the website screen as shown in
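A minimal sketch of the e-mail transmission by the display control unit 5, using Python's standard smtplib; the SMTP host, sender address, and message wording are assumptions of this sketch.

```python
import smtplib
from email.message import EmailMessage

def send_tour_mail(recipient: str, tour_name: str, tour_url: str,
                   smtp_host: str = "localhost") -> None:
    """Send the determined tour information (or the address of a website
    showing it) to the display device 54 of the passenger 55 by e-mail."""
    msg = EmailMessage()
    msg["Subject"] = f"Suggested tour during your layover: {tour_name}"
    msg["From"] = "noreply@example-airport.test"   # illustrative sender address
    msg["To"] = recipient
    msg.set_content(f"A tour that fits your stay has been found:\n{tour_name}\n{tour_url}")
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```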
The passenger 55, who is a transit passenger, can acquire information on an effective way to spend time at the airport where the passenger 55 is to stay for a long time by checking the screens shown in
[Apparatus Operation]
Next, the operation of the information processing apparatus 10 in the present example embodiment will be described with reference to
The biological information acquisition unit 1 acquires face image data captured by the image capture device 51 (S1). The identification information acquisition unit 2 acquires the identification information corresponding to the face image data from the storage device 52 (S2). The passenger determination unit 3 determines whether the passenger 55 of the face image data is a transit passenger based on the flight information included in the identification information acquired in S2 (S3).
If the passenger is a transit passenger (S3: YES), the tour determination unit 4 determines a suitable tour for the passenger 55 based on the identification information (S4). As shown in
If the passenger 55 is not a transit passenger (S3: NO), this flow ends without the display control unit 5 displaying the tour information on the display devices 53 and 54.
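Putting the above together, the flow of steps S1 to S5 can be summarized as the following sketch, which reuses the hypothetical helper functions shown earlier; the storage and display interfaces are likewise assumptions of this sketch.

```python
from datetime import datetime

def embed_face(image):
    """Placeholder: extract a face-embedding vector from the captured image.
    The embodiment does not specify a particular face-recognition model."""
    raise NotImplementedError

def process_disembarked_passenger(captured_image, airport_code, storage, display):
    embedding = embed_face(captured_image)                    # S1: acquire face image data
    record = one_to_n_match(embedding, storage.gallery())     # S2: acquire identification information
    if record is None:
        return
    if not is_transit_passenger(record.flight_info, airport_code):
        return                                                # S3: NO -> nothing is displayed
    stay = estimate_stay_time(datetime.now(),                 # S4: determine a suitable tour
                              record.flight_info.next_boarding_time)
    candidates = narrow_down_tours(tours_within_stay(storage.todays_tours(), stay), stay)
    if candidates:
        display.show(candidates[0])                           # S5: display the tour information
```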
[Program]
The program according to this example embodiment may be any program that causes a computer to execute steps S1 to S5 shown in
Note that if the computer on which the program of the present example embodiment is installed functions as the display control unit 5, the program of the present example embodiment may be any program that causes the computer to execute steps S1 to S4 shown in
Also, the program in the present example embodiment may be executed by a computer system constructed by a plurality of computers. In this case, for example, each computer may function as any one of the biological information acquisition unit 1, the identification information acquisition unit 2, the passenger determination unit 3, the tour determination unit 4, and the display control unit 5.
In addition to general-purpose PCs, examples of computers include smartphones and tablet terminal devices.
(Physical Configuration of the Apparatus)
Here, a computer that realizes the information processing apparatus 10 by executing the program in the example embodiment will be described with reference to
As shown in
The CPU 111 loads the program (codes) of the present example embodiment stored in the storage device 113 into the main memory 112, and executes the program in a predetermined order to perform various computations. The main memory 112 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory). Also, the program according to the present example embodiment is provided in a state of being stored in a computer readable recording medium 120. Note that the program according to the present example embodiment may also be distributed over the Internet, to which the computer is connected via the communication interface 117.
Also, specific examples of the storage device 113 may include a semiconductor storage device such as a flash memory in addition to a hard disk. The input interface 114 mediates data transmission between the CPU 111 and input devices 118 such as a keyboard and mouse. The display controller 115 is connected to a display device 119 and controls the display on the display device 119. The data reader/writer 116 mediates data transmission between the CPU 111 and the recording medium 120, reads out the program from the recording medium 120, and writes the result of processing performed by the computer 110 in the recording medium 120. The communication interface 117 mediates data transmission between the CPU 111 and other computers.
Specific examples of the recording medium 120 include a general-purpose semiconductor storage device such as a CF (Compact Flash (registered trademark)) or an SD (Secure Digital), a magnetic storage medium such as a Flexible Disk, and an optical storage medium such as a CD-ROM (Compact Disk Read Only Memory).
A portion or all of the above-described example embodiment can be expressed by (Supplementary note 1) to (Supplementary note 27) described below, but is not limited thereto.
(Supplementary Note 1)
An information processing apparatus including:
(Supplementary Note 2)
The information processing apparatus according to supplementary note 1,
(Supplementary Note 3)
The information processing apparatus according to supplementary note 2,
(Supplementary Note 4)
The information processing apparatus according to supplementary note 3,
(Supplementary Note 5)
The information processing apparatus according to any one of supplementary notes 1 to 4,
(Supplementary Note 6)
The information processing apparatus according to supplementary note 5,
(Supplementary Note 7)
The information processing apparatus according to supplementary note 6,
(Supplementary Note 8)
The information processing apparatus according to supplementary note 5,
(Supplementary Note 9)
The information processing apparatus according to any one of supplementary notes 1 to 8,
(Supplementary Note 10)
An information processing method including:
(Supplementary Note 11)
The information processing method according to supplementary note 10,
(Supplementary Note 12)
The information processing method according to supplementary note 11,
(Supplementary Note 13)
The information processing method according to supplementary note 12,
(Supplementary Note 14)
The information processing method according to any one of supplementary notes 10 to 13,
(Supplementary Note 15)
The information processing method according to supplementary note 14,
(Supplementary Note 16)
The information processing method according to supplementary note 15,
(Supplementary Note 17)
The information processing method according to supplementary note 14,
(Supplementary Note 18)
The information processing method according to any one of supplementary notes 10 to 17,
(Supplementary Note 19)
A computer readable recording medium including a program recorded thereon, the program including instructions that cause a computer to carry out:
(Supplementary Note 20)
The computer readable recording medium according to supplementary note 19,
(Supplementary Note 21)
The computer readable recording medium according to supplementary note 20,
(Supplementary Note 22)
The computer readable recording medium according to supplementary note 21,
(Supplementary Note 23)
The computer readable recording medium according to any one of supplementary notes 19 to 22,
(Supplementary Note 24)
The computer readable recording medium according to supplementary note 23,
(Supplementary Note 25)
The computer readable recording medium according to supplementary note 24,
(Supplementary Note 26)
The computer readable recording medium according to supplementary note 23,
(Supplementary Note 27)
The computer readable recording medium according to any one of supplementary notes 19 to 26,
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/010952 | 3/12/2020 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2021/181637 | 9/16/2021 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
9965819 | DeVries | May 2018 | B1 |
20030229897 | Frisco | Dec 2003 | A1 |
20110284627 | Stefani | Nov 2011 | A1 |
20130070974 | Stefani | Mar 2013 | A1 |
20180210613 | Chen et al. | Jul 2018 | A1 |
20200019691 | Milgram | Jan 2020 | A1 |
Number | Date | Country |
---|---|---|
2000-163689 | Jun 2000 | JP |
2006-099483 | Apr 2006 | JP |
2007-048132 | Feb 2007 | JP |
2007-079656 | Mar 2007 | JP |
2007-257219 | Oct 2007 | JP |
2009-205504 | Sep 2009 | JP |
2015-018545 | Jan 2015 | JP |
2015-103234 | Jun 2015 | JP |
2018-534662 | Nov 2018 | JP |
2019-082450 | May 2019 | JP |
Entry |
---|
Changi Airport published article “How to Score a Free tour of Singapore while transiting through Changi Airport”, Dec. 12, 2019, www.ChangiAirport.com (Year: 2019). |
English translation of Written opinion for PCT Application No. PCT/JP2020/010952, dated Aug. 11, 2020. |
International Search Report for PCT Application No. PCT/JP2020/010952, dated Aug. 11, 2020. |
Extended European Search Report for EP Application No. 20924092.8, dated Mar. 3, 2023. |
JP Office Action for JP Application No. 2022-505672, dated Aug. 22, 2023 with English Translation. |
Number | Date | Country
---|---|---|
20220318851 A1 | Oct 2022 | US |