This application claims the benefit of Japanese Patent Application No. 2018-138421, filed on Jul. 24, 2018, which is hereby incorporated by reference herein in its entirety.
The present disclosure relates to an information processing apparatus and an information processing method.
In recent years, use of what is called "ride sharing", which is a form of transportation in which a plurality of users share a ride in a same vehicle, has become widespread. Also, for such ride sharing, techniques for matching a plurality of users with each other have been developed.
Also, Patent Document 1 discloses a technique for determining whether or not respective owners of terminals share a ride in a same vehicle.
[Patent document 1] Japanese Patent Laid-Open No. 2011-237842
An object of the present disclosure is to provide a ride-sharing technique that enables alleviation of congestion caused by vehicles visiting a predetermined facility.
An information processing apparatus according to the present disclosure may include a control unit configured to: for each of a plurality of visitors who visit a predetermined facility by vehicle, acquire a schedule information piece which is an information piece relating to a schedule for the visitor to visit the facility, a first vehicle being specified in the schedule information piece, the first vehicle being a vehicle that the visitor schedules to ride in when the visitor visits the facility; and if the number of a plurality of visiting vehicles visiting the predetermined facility during a predetermined period of time is predicted to be equal to or larger than a first predetermined number based on the respective schedule information pieces of the plurality of visitors, calculate ride-sharing user candidates that are candidates of visitors that each join a ride in a vehicle that is different from the first vehicle specified in the schedule information piece of the relevant visitor, to visit the predetermined facility during the predetermined period of time, from among the plurality of visitors, and second vehicles that are vehicles that the ride-sharing user candidates can join a ride in, each of the second vehicles being the first vehicle for a visitor that is different from the ride-sharing user candidates, and perform matching between a part or all of users in the calculated ride-sharing user candidates and the second vehicles so that the number of the plurality of visiting vehicles becomes smaller than the first predetermined number.
Also, an aspect of the present disclosure can provide an information processing method. For example, the present disclosure may provide an information processing method including the steps of: for each of a plurality of visitors who visit a predetermined facility by vehicle, acquiring a schedule information piece which is an information piece relating to a schedule for the visitor to visit the facility, a first vehicle being specified in the schedule information piece, the first vehicle being a vehicle that the visitor schedules to ride in when the visitor visits the facility; and if the number of a plurality of visiting vehicles visiting the predetermined facility during a predetermined period of time is predicted to be equal to or larger than a first predetermined number based on the respective schedule information pieces of the plurality of visitors, calculating ride-sharing user candidates that are candidates of visitors that each join a ride in a vehicle that is different from the first vehicle specified in the schedule information piece of the relevant visitor, to visit the predetermined facility during the predetermined period of time, from among the plurality of visitors, and second vehicles that are vehicles that the ride-sharing user candidates can join a ride in, each of the second vehicles being the first vehicle for a visitor that is different from the ride-sharing user candidates, and performing matching between a part or all of users in the calculated ride-sharing user candidates and the second vehicles so that the number of the plurality of visiting vehicles becomes smaller than the first predetermined number. Then, the present disclosure may provide a non-transitory recording medium storing a program for causing a computer to execute such an information processing method.
The present disclosure enables provision of a ride-sharing technique that enables alleviation of congestion caused by vehicles visiting a predetermined facility.
Where the number of vehicles visiting a predetermined facility (visiting vehicles) during a predetermined period of time becomes equal to or larger than a first predetermined number, congestion is caused by the vehicles. The term “congestion” mentioned here includes congestion of a parking lot attached to a predetermined facility and congestion of a road leading to the facility or the parking lot. Also, the first predetermined number may be, for example, a number determined based on a capacity of a parking lot attached to a predetermined facility.
Here, where it is assumed that the first predetermined number is set to a number close to the capacity of a parking lot, if the number of visiting vehicles becomes equal to or larger than the first predetermined number, even though a vehicle enters the parking lot, the vehicle fails to be smoothly parked. As a result, congestion of the parking lot and congestion of a road leading to the parking lot occur.
Also, even where a capacity of a parking lot is relatively large, if the number of entrances to the parking lot is small relative to the capacity, which thus becomes a bottleneck to the flow of vehicles, congestion of a road leading to the parking lot may occur although the parking lot is relatively empty. In this case, the first predetermined number may be set based on ease of entry to the parking lot. In other words, even if parking lots have a same capacity, the first predetermined number may be set to be larger for a parking lot that is easy to enter than for a parking lot that is difficult to enter.
Therefore, an information processing apparatus according to the present disclosure makes it possible to predict, by acquiring schedule information pieces, whether or not the number of visiting vehicles becomes equal to or larger than a first predetermined number. Then, if the number of visiting vehicles is predicted to become equal to or larger than the first predetermined number, a control unit may perform ride-sharing matching for making a plurality of users ride in a same vehicle so that the number of visiting vehicles becomes smaller than the first predetermined number. More specifically, the control unit may calculate ride-sharing user candidates, which are candidates of visitors to share a ride to visit a predetermined facility during a predetermined period of time, and, for each of the ride-sharing user candidates, calculate a second vehicle that is a vehicle that the ride-sharing user candidate can join a ride in, the vehicle being a first vehicle for a visitor that is different from the ride-sharing user candidate. Then, a part or all of users in the calculated ride-sharing user candidates may be matched with the respective second vehicles.
Then, each of the ride-sharing user candidates matched with the respective second vehicles joins a ride in the relevant second vehicle to visit the predetermined facility, and the number of visiting vehicles is thus decreased. Hereinafter, a ride-sharing user candidate that is matched with a second vehicle and joins a ride in the second vehicle to visit a predetermined facility may be referred to as “ride-sharing user”. Therefore, as a result of ride-sharing matching being performed so that the number of visiting vehicles becomes smaller than the first predetermined number, congestion caused by visiting vehicles is suppressed to the extent possible. In other words, the information processing apparatus according to the present disclosure enables provision of a ride-sharing system that enables alleviation of congestion caused by visiting vehicles.
Specific embodiments of the present disclosure will be described below with reference to the drawings. Unless otherwise stated, dimensions, materials, shapes, relative dispositions, etc., of the components described in the below embodiments are not intended to limit the technical scope of the disclosure thereto.
In the information processing system 1, the respective user terminals 200 and the server apparatus 300 are interconnected by a network N1. For the network N1, for example, a WAN (wide area network), which is a worldwide public communication network such as the Internet, or another communication network may be employed. Also, the network N1 may include a telephone communication network for, e.g., mobile phones and a wireless communication network for, e.g., WiFi.
A user using the information processing system 1 can input information relating to a schedule of the user himself/herself visiting a predetermined facility by vehicle (hereinafter "schedule information (piece)") using his/her user terminal 200. Here, in each user terminal 200, a predetermined application for using the information processing system 1 (hereinafter may be referred to as "predetermined application") has been installed and each user can input schedule information using the predetermined application installed in his/her user terminal 200. However, the above is not intended to limit a form of an input of schedule information to the form in which schedule information is input using a user terminal 200, and schedule information may be input using an arbitrary terminal that is connectable to the network N1 (e.g., a smartphone, a mobile phone, a tablet terminal, a personal digital assistant or a wearable computer) or a personal computer (PC).
Here, the predetermined facility is an arbitrary facility that a user can visit by vehicle. Also, the schedule information includes information relating to a vehicle that a user is scheduled to ride in when the user visits the facility (hereinafter may be referred to as “first vehicle”), in addition to a schedule of the visit to the predetermined facility.
The server apparatus 300 is a management server that accepts registrations of schedule information pieces input using user terminals 200 and manages traffic to a predetermined facility. Here, the server apparatus 300 predicts whether or not congestion is caused by vehicles visiting the predetermined facility, based on the schedule information pieces and traffic information of traffic in the periphery of the predetermined facility (the traffic information includes, e.g., a current condition of the traffic in the periphery of the facility, a capacity of a parking lot attached to the predetermined facility and the number of vehicles currently parked, and information relating to entrances to the parking lot). Then, if congestion is predicted to occur, the server apparatus 300 performs ride-sharing matching for making a plurality of users share a ride in a same vehicle. Details of matching processing will be described later.
First, the server apparatus 300 will be described. The server apparatus 300 has a configuration of a general computer. The server apparatus 300 includes a processor 301, a main memory unit 302, an auxiliary memory unit 303 and a communication unit 304. These components are interconnected via a bus. Each of the main memory unit 302 and the auxiliary memory unit 303 is a computer-readable recording medium. The hardware configuration of the computer is not limited to the example illustrated in
In the server apparatus 300, the processor 301 loads a program stored in a recording medium into a work area of the main memory unit 302 and executes the program, and the respective functional component units are controlled through the execution of the program, enabling provision of functions meeting a predetermined purpose.
The processor 301 is, for example, a CPU (central processing unit) or a DSP (digital signal processor). The processor 301 controls the server apparatus 300 and performs various arithmetic operations for information processing. The main memory unit 302 includes a RAM (random access memory) and a ROM (read-only memory). The auxiliary memory unit 303 is, for example, an EPROM (erasable programmable ROM) or a hard disk drive (HDD). Also, the auxiliary memory unit 303 can include a removable medium, that is, a portable recording medium. The removable medium is, for example, a USB (universal serial bus) memory or a disk recording medium such as a CD (compact disc) or a DVD (digital versatile disc).
The auxiliary memory unit 303 stores various programs, various data and various tables in a recording medium in such a manner that programs, data and tables can be read/written. In the auxiliary memory unit 303, e.g., an operating system (OS), various programs and various tables are stored. Information to be stored in the auxiliary memory unit 303 may be stored in the main memory unit 302. Also, information to be stored in the main memory unit 302 may be stored in the auxiliary memory unit 303.
The communication unit 304 is connected to another apparatus and controls communication between the server apparatus 300 and the other apparatus. The communication unit 304 is, for example, a LAN (local area network) interface board or a wireless communication circuit for wireless communication. The LAN interface board or the wireless communication circuit is connected to the network N1 such as the Internet, which is a public communication network.
A series of processing performed by the server apparatus 300 can be performed by hardware but may also be performed by software.
Next, the user terminal 200 will be described. The user terminal 200 is, for example, a compact computer such as a smartphone, a mobile phone, a tablet terminal, a personal digital assistant or a wearable computer (e.g., a smart watch). Note that the user terminal 200 may be a personal computer (PC) connected to the server apparatus 300 via the network N1 such as the Internet, which is a public communication network.
The user terminal 200 includes a processor 201, a main memory unit 202, an auxiliary memory unit 203, a display unit 204, an input unit 205 and a communication unit 206. The processor 201, the main memory unit 202 and the auxiliary memory unit 203 are similar to the processor 301, the main memory unit 302 and the auxiliary memory unit 303 of the server apparatus 300, respectively, and thus, description thereof will be omitted. The display unit 204 is, for example, an LCD (liquid-crystal display) or an EL (electroluminescence) panel. The input unit 205 includes, e.g., a touch panel and push-buttons. Also, the input unit 205 can include a camera that enables an input of video or an image and an audio input unit such as a microphone. The communication unit 206 is, for example, a communication circuit for accessing the network N1 using a telephone communication network for, e.g., mobile phones or a wireless communication network for, e.g., WiFi, and performing data communication with, e.g., the server apparatus 300.
Each of the schedule information database D310, the traffic information database D320 and the matching information database D330 is built by management of data stored in the auxiliary memory unit 303, by a program for a database management system (DBMS), the program being executed by the processor 301. Each of the schedule information database D310, the traffic information database D320 and the matching information database D330 is, for example, a relational database.
Any of the functional components of the server apparatus 300 or a part of processing in any of the functional components may be provided or performed by another computer connected to the network N1. For example, processing in the matching processing unit F330 and processing in the schedule information acquisition unit F310, the traffic information acquisition unit F320 and the matching information provision unit F340 may be performed by separate computers.
The schedule information acquisition unit F310 acquires schedule information from a user. The schedule information includes information relating to a first vehicle that a user is scheduled to ride in when the user visits a predetermined facility in addition to information relating to a schedule of the user's visit to the facility. Then, the schedule information acquisition unit F310 registers the schedule information in the schedule information database D310.
Here, the schedule information database D310 is a database that stores schedule information. The schedule information database D310 includes a schedule information table, which is illustrated in
The schedule information table illustrated in
In the example illustrated in
Such schedule information is transmitted from each user terminal 200 to the server apparatus 300 by the user inputting the schedule information to the user terminal 200. More specifically, each user terminal 200 includes a functional configuration that receives an input of schedule information and transmits the input information to the server apparatus 300. The processor 201 of each user terminal 200 performs processing for transmitting schedule information input from the input unit 205, to the server apparatus 300 via the communication unit 206, according to a computer program in the main memory unit 202. Then, the schedule information acquisition unit F310 acquires the information transmitted from the user terminal 200.
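For reference, a minimal sketch of how a schedule information piece might be represented and transmitted from a user terminal 200 to the server apparatus 300 is shown below. The field names (user_id, facility_id, visit_start, visit_end, first_vehicle_id) and the endpoint URL are illustrative assumptions and do not reflect the actual layout of the schedule information database D310.

```python
from dataclasses import dataclass, asdict
from datetime import datetime
import json
import urllib.request

@dataclass
class ScheduleInfo:
    user_id: str           # e.g. "S001"
    facility_id: str       # e.g. "F001"
    visit_start: datetime  # start of the scheduled visiting period
    visit_end: datetime    # end of the scheduled visiting period
    first_vehicle_id: str  # vehicle the user schedules to ride in, e.g. "101"

def send_schedule(info: ScheduleInfo, server_url: str) -> None:
    """Transmit a schedule information piece to the server apparatus 300,
    which registers it in the schedule information database D310."""
    payload = {key: (value.isoformat() if isinstance(value, datetime) else value)
               for key, value in asdict(info).items()}
    request = urllib.request.Request(server_url,
                                     data=json.dumps(payload).encode("utf-8"),
                                     headers={"Content-Type": "application/json"})
    urllib.request.urlopen(request)
```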
The traffic information database D320 is a database that stores traffic information. The traffic information database D320 includes traffic information tables illustrated in
The traffic information table illustrated in
Also, the traffic information table illustrated in
Also, the traffic information table illustrated in
For example, parking lot P1 of facility F001 indicated in the above-described
Occurrence of congestion may also be determined by using the aforementioned passage rate. For example, parking lot P2 of facility F002 indicated in the above-described
In view of the above, it can be understood that, if the number of visiting vehicles calculated based on schedule information pieces becomes equal to or larger than the first predetermined number, congestion may be caused by the vehicles. Here, as described above, the first predetermined number is a number set based on traffic information (e.g., the parking lot capacity, the current number of vehicles parked, the passage rate and the degree of congestion).
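As an illustration of the prediction described above, the following sketch counts the distinct first vehicles scheduled to visit the facility during the period and compares that count with a first predetermined number derived from traffic information. The way the first predetermined number is derived here (free parking spaces, capped by what the entrances can pass during the period) is an assumption made for illustration only; the sketch reuses the ScheduleInfo structure sketched earlier.

```python
from datetime import datetime
from typing import Iterable

def count_visiting_vehicles(schedules: Iterable["ScheduleInfo"],
                            facility_id: str,
                            period_start: datetime,
                            period_end: datetime) -> int:
    """Count the distinct first vehicles scheduled to visit during the period."""
    vehicles = {s.first_vehicle_id for s in schedules
                if s.facility_id == facility_id
                and s.visit_start < period_end
                and s.visit_end > period_start}
    return len(vehicles)

def first_predetermined_number(capacity: int, currently_parked: int,
                               passage_rate_per_hour: int,
                               period_hours: float) -> int:
    """Illustrative rule: free parking spaces, capped by what the entrances
    can pass during the period."""
    free_spaces = max(capacity - currently_parked, 0)
    entrance_limit = int(passage_rate_per_hour * period_hours)
    return min(free_spaces, entrance_limit)

def congestion_predicted(num_visiting: int, first_number: int) -> bool:
    """Congestion is predicted when the visiting vehicles reach the first number."""
    return num_visiting >= first_number
```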
Then, if the number of visiting vehicles visiting the predetermined facility during a predetermined period of time is predicted to be equal to or larger than the first predetermined number, processing in the calculation unit F331 of the matching processing unit F330 and processing in the matching unit F332 of the matching processing unit F330 are further performed. The present embodiment will be described in terms of a case where users S001 to S004 indicated in the above-described
The calculation unit F331 of the matching processing unit F330 calculates ride-sharing user candidates and second vehicles. Here, a ride-sharing user candidate refers to a candidate of a visitor that joins a ride to visit a predetermined facility during a predetermined period of time, where joining a ride refers to riding in a vehicle that is different from the first vehicle specified in the schedule information of the user himself/herself. Also, a second vehicle refers to a vehicle that a ride-sharing user candidate can join a ride in, the vehicle being a first vehicle for a user that is different from the ride-sharing user candidate. In other words, a first vehicle specified in schedule information for a user that is different from a ride-sharing user candidate can be a second vehicle for the ride-sharing user candidate. An example of the processing performed by the calculation unit F331 will be described with reference to the above-described
In the present embodiment, as described above, the calculation unit F331 calculates user S002 and user S004 as ride-sharing user candidates and calculates vehicle 101 as a second vehicle for user S002 and vehicle 103 as a second vehicle for user S004. Then, next, the matching unit F332 of the matching processing unit F330 performs matching between a part or all of users in the calculated ride-sharing user candidates and the second vehicles so that the number of visiting vehicles becomes smaller than the first predetermined number.
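A minimal sketch of the processing in the calculation unit F331 and the matching unit F332 might look as follows. The criteria for when a candidate can join a ride in another visitor's first vehicle (an overlapping visiting period and a free seat) and the assumption that each matched candidate removes exactly one visiting vehicle are illustrative simplifications; seats_available is a hypothetical mapping from vehicle to free seats.

```python
from typing import Dict, List

def calculate_candidates(schedules: List["ScheduleInfo"],
                         seats_available: Dict[str, int]) -> Dict[str, List[str]]:
    """For each possible ride-sharing user candidate, list the second vehicles
    (first vehicles of other visitors) that the candidate could join."""
    candidates: Dict[str, List[str]] = {}
    for s in schedules:
        second_vehicles = [o.first_vehicle_id for o in schedules
                           if o.user_id != s.user_id
                           and o.first_vehicle_id != s.first_vehicle_id
                           and o.visit_start < s.visit_end
                           and o.visit_end > s.visit_start
                           and seats_available.get(o.first_vehicle_id, 0) > 0]
        if second_vehicles:
            candidates[s.user_id] = second_vehicles
    return candidates

def match_until_below(candidates: Dict[str, List[str]],
                      num_visiting: int, first_number: int) -> Dict[str, str]:
    """Match only as many candidates as needed so that the number of visiting
    vehicles becomes smaller than the first predetermined number. Each matched
    candidate is assumed here to be the only scheduled rider of his/her first
    vehicle, so each match removes one visiting vehicle."""
    matches: Dict[str, str] = {}
    for user_id, second_vehicles in candidates.items():
        if num_visiting < first_number:
            break
        matches[user_id] = second_vehicles[0]
        num_visiting -= 1
    return matches
```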
As described above, if the number of vehicles visiting facility F001 indicated in the above-described
Then, after matching between the ride-sharing user candidate(s) and the second vehicle(s), the matching unit F332 generates matching information. Here, the matching information is information to be registered in the matching information database D330.
The matching information database D330 is a database that stores matching information. The matching information database D330 includes a matching information table, which is illustrated in
Then, such matching information is provided to the ride-sharing user(s) matched with the second vehicle(s) and the riding user(s) of the second vehicle(s), by the matching information provision unit F340 illustrated in
The flow of processing in the server apparatus 300 according to the present embodiment will be described.
In the processing in this flow, first, in step S101, the schedule information pieces of the respective users are acquired.
Next, in step S102, traffic information is acquired. In step S102, the server apparatus 300 can acquire a capacity of a parking lot, which is a piece of traffic information, by, for example, reading information registered in a predetermined database in advance, and acquire the current number of vehicles parked, which is a piece of traffic information, based on a known method.
Next, in step S103, whether or not the number of visiting vehicles is predicted to be equal to or larger than a first predetermined number is determined. Here, the server apparatus 300 can calculate the number of first vehicles scheduled to visit a predetermined facility during a predetermined period of time in the schedule information pieces acquired in step S101, as the number of visiting vehicles. Also, the server apparatus 300 can determine the first predetermined number based on the traffic information pieces acquired in step S102. Then, if an affirmative determination is made in step S103, the processor 301 of the server apparatus 300 proceeds to the processing in step S104 and if a negative determination is made in step S103, the execution of the present flow ends.
If an affirmative determination is made in step S103, next, in step S104, ride-sharing user candidates and second vehicles are calculated. Then, in step S105, matching between a part or all of users in the ride-sharing user candidates calculated in step S104 and the second vehicles is performed so that the number of visiting vehicles becomes smaller than the first predetermined number. Here, details of the processing by the calculation unit F331, which is performed in step S104, and the processing by the matching unit F332, which is performed in step S105, are as described above.
Next, in step S106, matching information pieces on the matching according to the processing in step S105 are transmitted. In step S106, the server apparatus 300 transmits the matching information pieces to the relevant user terminals 200. Then, ride-sharing users matched with the respective second vehicles and riding users of the second vehicles can acquire the respective matching information pieces. Then, each of these users transmits information on whether or not to approve the matching, using the relevant user terminal 200, to the server apparatus 300. Then, in step S107, the server apparatus 300 determines whether or not the matching according to the processing in step S105 has been approved by the user. Then, if an affirmative determination is made in step S107, the processor 301 proceeds to processing in step S108 and if a negative determination is made in step S107, the processor 301 proceeds to the processing in step S105.
If an affirmative determination is made in step S107, next, in step S108, the matching information piece approved by the user is registered. In step S108, the server apparatus 300 registers the relevant matching information piece in the matching information database D330. Then, after the processing in step S108, execution of the present flow ends.
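Putting the steps together, a rough sketch of the flow from step S101 to step S108 might look as follows, reusing the helpers sketched earlier. The server and traffic objects and their methods (acquire_schedule_information, all_users_approved, and so on) are hypothetical placeholders, and approval handling is simplified.

```python
def run_matching_flow(server, facility_id, period_start, period_end):
    # S101: acquire the schedule information pieces of the respective users
    schedules = server.acquire_schedule_information(facility_id)
    # S102: acquire traffic information (capacity, vehicles parked, passage rate, ...)
    traffic = server.acquire_traffic_information(facility_id)
    num = count_visiting_vehicles(schedules, facility_id, period_start, period_end)
    period_hours = (period_end - period_start).total_seconds() / 3600
    first_number = first_predetermined_number(traffic.capacity,
                                              traffic.currently_parked,
                                              traffic.passage_rate,
                                              period_hours)
    # S103: is the number of visiting vehicles predicted to reach the first number?
    if num < first_number:
        return  # no congestion predicted; the flow ends here
    # S104: calculate ride-sharing user candidates and second vehicles
    candidates = calculate_candidates(schedules, traffic.seats_available)
    # S105: match only as many candidates as needed
    matches = match_until_below(candidates, num, first_number)
    # S106: transmit matching information pieces to the relevant user terminals
    server.transmit_matching_information(matches)
    # S107: wait for the users' approval of the matching
    if server.all_users_approved(matches):
        # S108: register the approved matching information in database D330
        server.register_matching_information(matches)
    # If a user declines, the flow would return to S105 and retry the matching
    # with other candidates (omitted in this sketch).
```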
The above-described information processing system suppresses congestion caused by visiting vehicles visiting a predetermined facility during a predetermined period of time to the extent possible. In other words, the information processing apparatus according to the present disclosure can provide a ride-sharing system that enables alleviation of congestion caused by visiting vehicles.
A program that causes a computer or other machine or apparatus (hereinafter, “computer or the like”) to provide any of the above-described functions can be recorded in a recording medium that can be read by the computer or the like. The function can be provided by causing the computer or the like to read and execute the program in the recording medium.
Here, the recording medium that can be read by the computer or the like refers to a non-transitory recording medium that can store information such as data or a program by means of an electrical, magnetic, optical, mechanical or chemical action and allows the information to be read by the computer or the like. Examples of a recording medium that can be removed from the computer or the like from among such recording media include a flexible disk, a magneto-optical disk, a CD-ROM, a CD-R/W, a DVD, a Blu-ray disk, a DAT, an 8 mm tape and a memory card such as a flash memory. Also, examples of a recording medium fixed in the computer or the like from among such recording media include, e.g., a hard disk and a ROM (read-only memory). Furthermore, an SSD (solid-state drive) can be used as a recording medium that can be removed from the computer or the like or a recording medium fixed in the computer or the like.
Next, an alteration of the above-described first embodiment will be described. In the present alteration, detailed description of components that are substantially the same as those of the first embodiment and control processing that is substantially the same as that of the first embodiment will be omitted.
In the above-described first embodiment, the number of visiting vehicles is reduced by performing ride-sharing matching. Consequently, congestion of a parking lot attached to a predetermined facility and congestion of a road leading to the parking lot can be suppressed; however, depending on the travel routes of visiting vehicles to the facility, congestion may still occur in the periphery of the parking lot. Therefore, in the present alteration, the server apparatus 300 generates a travel route in the periphery of a predetermined facility for each of a part or all of a plurality of visiting vehicles. This will be described with reference to
Here, a case is assumed where ten visiting vehicles from area b′ and ten visiting vehicles from area d′ are scheduled to enter parking lot P2 during a same period of time. In this case, as described above with reference to
The travel route generation unit F350 generates a travel route in the periphery of a predetermined facility (hereinafter simply referred to as “travel route”) for each of a part or all of a plurality of visiting vehicles. More specifically, the travel route generation unit F350 generates the travel routes so that the number of vehicles having a predetermined route from among the visiting vehicles for which the travel route has been generated by the travel route generation unit F350 becomes equal to or smaller than a second predetermined number. This will be described with reference to
In the present alteration, travel routes of 20 visiting vehicles including ten visiting vehicles from area b′ and ten visiting vehicles from area d′ are generated so that the number of vehicles scheduled to enter each of the entrances of parking lot P2 illustrated in the above-described
Accordingly, time taken for the vehicles to enter the parking lot from the respective entrances is reduced to be equal to or shorter than a predetermined length of time, and congestion caused by visiting vehicles is thus suppressed. Here, each of routes L1, L2, L3 and L4 corresponds to the above predetermined route. Also, in the present alteration, the second predetermined number for each route is defined so that the number of vehicles scheduled to enter the relevant entrance connected to the route becomes no more than twice the passage rate of the entrance.
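A minimal sketch of the route generation in the travel route generation unit F350 might distribute the visiting vehicles over the entrance routes so that no route is assigned more vehicles than its second predetermined number (here taken as twice the passage rate of the corresponding entrance). The route names and the greedy assignment are illustrative assumptions.

```python
from typing import Dict, List, Optional

def generate_travel_routes(vehicle_ids: List[str],
                           passage_rate: Dict[str, int]) -> Dict[str, Optional[str]]:
    """Assign each visiting vehicle a route (entrance); None if every route is full."""
    # second predetermined number per route: no more than twice the passage rate
    second_number = {route: 2 * rate for route, rate in passage_rate.items()}
    assigned = {route: 0 for route in passage_rate}
    routes: Dict[str, Optional[str]] = {}
    for vehicle_id in vehicle_ids:
        open_routes = [r for r in passage_rate if assigned[r] < second_number[r]]
        if not open_routes:
            routes[vehicle_id] = None       # no route satisfies the constraint
            continue
        route = min(open_routes, key=lambda r: assigned[r])  # least-loaded entrance
        assigned[route] += 1
        routes[vehicle_id] = route
    return routes

# e.g. twenty visiting vehicles distributed over four entrance routes:
# generate_travel_routes([f"V{i:02d}" for i in range(20)],
#                        {"L1": 5, "L2": 5, "L3": 5, "L4": 5})
```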
Then, the travel routes generated as described above are provided to the visiting vehicles by the travel route provision unit F360 illustrated in
Here, the processor 301 functions as a control unit according to the present disclosure by performing the processing in the schedule information acquisition unit F310, the processing in the calculation unit F331, the processing in the matching unit F332, the processing in the travel route generation unit F350 and the processing in the travel route provision unit F360.
The information processing system described above enables congestion caused by visiting vehicles visiting a predetermined facility during a predetermined period of time to be suppressed to the extent possible.
Next, a second embodiment of the present disclosure will be described. In the present embodiment, detailed description of components that are substantially the same as those of the first embodiment and control processing that is substantially the same as that of the first embodiment will be omitted.
In the above-described first embodiment, the server apparatus 300 acquires schedule information pieces through communication with user terminals 200. Then, the server apparatus 300 performs matching between ride-sharing user candidates and second vehicles and transmits matching information pieces on the matching to the user terminals 200. On the other hand, in the present embodiment, a server apparatus 300 acquires schedule information pieces through communication with a management server that manages schedule information pieces for users using the information processing system 1. Also, the server apparatus 300 performs matching between ride-sharing user candidates and second vehicles and transmits matching information on the matching to the management server.
In the information processing system 1, the server apparatus 300 and the management servers 400 are interconnected by a network N1. Also, each user terminal 200 and a relevant management server 400 are interconnected via a network N2. Here, for the network N2, for example, a WAN (wide area network), which is a worldwide public communication network such as the Internet, or another communication network may be employed. Also, the network N2 may include a telephone communication network for, e.g., mobile phones and a wireless communication network for, e.g., WiFi.
Also, the management server 400 has a hardware configuration that is similar to that of the server apparatus 300, and includes a processor 401, a main memory unit 402, an auxiliary memory unit 403 and a communication unit 404. Furthermore, the management server 400 includes a schedule information acquisition unit F410, a matching information provision unit F440 and a schedule information database D410 as functional components. These functional components function in a manner similar to the schedule information acquisition unit F310, the matching information provision unit F340 and the schedule information database D310 mentioned in the description of the first embodiment, respectively.
Here, the flow of operation of the information processing system according to the present embodiment will be described.
In the present embodiment, first, schedule information pieces are registered by respective users. Each user terminal 200 receives a schedule information piece from the relevant user (step S201) and transmits the schedule information piece to the management server 400 (step S202). Then, a relevant management server 400 acquires the schedule information piece transmitted from the user terminal 200 through reception by the communication unit 404 and registers the schedule information piece in the schedule information database D410 (step S203). Furthermore, the management server 400 transmits the schedule information piece to the server apparatus 300 (step S204). Then, the server apparatus 300 acquires the schedule information pieces transmitted from the management server 400 through reception by the communication unit 304 (step S205).
Next, the server apparatus 300 acquires traffic information (step S206). Then, if the number of visiting vehicles is predicted to be equal to or larger than a first predetermined number (if an affirmative determination is made in step S103 in
The management server 400 acquires the matching information including the ride-sharing requests (step S209) and changes schedule information pieces of the ride-sharing users matched with the second vehicles (step S210). This will be described based on
Here, it is assumed that users S001 and S002 and vehicle 103 are matched with each other as a result of the matching in step S207. In this case, users S001 and S002 correspond to ride-sharing users according to the present disclosure and vehicle 103 corresponds to a second vehicle for users S001 and S002. Then, as illustrated in
Then, the management server 400 transmits the schedule information pieces changed as described above to the respective user terminals 200 (step S211). Then, the user terminals 200 acquire the respective changed schedule information pieces (step S212). Also, the management server 400 registers the changed schedule information pieces in the schedule information database D410 (step S213).
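A minimal sketch of the schedule change in step S210 is shown below: for each ride-sharing user matched with a second vehicle, the management server 400 rewrites the first vehicle in that user's schedule information piece to the matched second vehicle. It reuses the ScheduleInfo structure sketched in the first embodiment; the surrounding transmission and registration steps are only indicated in comments.

```python
from typing import Dict, List

def apply_matching(schedules: List["ScheduleInfo"],
                   matches: Dict[str, str]) -> List["ScheduleInfo"]:
    """Change the schedule information pieces of the matched ride-sharing users
    (step S210) and return only the changed pieces."""
    changed: List["ScheduleInfo"] = []
    for s in schedules:
        second_vehicle = matches.get(s.user_id)
        if second_vehicle is not None:
            s.first_vehicle_id = second_vehicle  # the user now rides in the second vehicle
            changed.append(s)
    return changed

# The changed pieces would then be transmitted to the user terminals (step S211)
# and registered in the schedule information database D410 (step S213).
```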
As a result of the schedule information pieces of the users being managed by the management servers 400 as described above, burden on the server apparatus 300 is reduced. Here, when the management server 400 changes the schedule information pieces as described above, the management server 400 may request an approval of each user. Also, the server apparatus 300 may perform matching so that users belonging to different groups ride in a same vehicle.
The information processing system described above also suppresses congestion caused by visiting vehicles visiting a predetermined facility during a predetermined period of time to the extent possible. In other words, the information processing apparatus according to the present disclosure can provide a ride-sharing system that enables alleviation of congestion caused by visiting vehicles.
Each of the above-described embodiments is a mere example and the present disclosure can be carried out with arbitrary change made thereto without departing from the spirit of the disclosure.
Also, the processing and measures described in the present disclosure can freely be combined and carried out as long as such combination causes no technical contradiction.
Also, the processing described as processing performed by a single apparatus may be shared and performed by a plurality of apparatuses. Alternatively, the processing described as processing performed by different apparatuses may be performed by a single apparatus. In a computer system, what hardware configuration (server configuration) is to be employed to provide the respective functions can flexibly be changed.
The present disclosure can also be carried out by supplying computer programs implementing the functions described in the above embodiments to a computer and causing one or more processors included in the computer to read and execute the programs. Such computer programs may be provided to the computer via a non-transitory computer-readable recording medium that is connectable to a system bus of the computer or may be provided to the computer via a network. Examples of the non-transitory computer-readable recording medium include arbitrary types of disks including magnetic disks (e.g., a floppy (registered trademark) disk and a hard disk drive (HDD)), optical disks (e.g., a CD-ROM, a DVD disk and a Blu-ray disk), a read-only memory (ROM), a random access memory (RAM), an EPROM, an EEPROM, a magnetic card, a flash memory, an optical card, and an arbitrary type of medium suitable for storing electronic instructions.