This application claims priority to Japanese Patent Application No. 2021-034980 filed on Mar. 5, 2021, incorporated herein by reference in its entirety.
The present disclosure relates to an automated driving vehicle that is used as an unmanned taxi, a vehicle allocation management device that manages vehicle allocation in an unmanned taxi service, and a terminal device that can make a reservation for vehicle allocation.
For example, as described in Japanese Unexamined Patent Application Publication No. 2016-210417 (JP 2016-210417 A), an automated driving vehicle that does not require driving by a driver and that can operate autonomously is known.
In Japanese Unexamined Patent Application Publication No. 2005-56135 (JP 2005-56135 A), a pickup vehicle owned by a nursing care service provider is utilized effectively by operating it as a care taxi when the vehicle and its driver are unoccupied. Japanese Unexamined Patent Application Publication No. 2016-162438 (JP 2016-162438 A) discloses a pickup service based on vehicle sharing in which a vehicle with a driver is shared by aged persons.
The present disclosure provides an automated driving vehicle, a vehicle allocation management device, and a terminal device that can remotely watch a passenger when providing an unmanned taxi service using an automated driving vehicle.
An automated driving vehicle according to an aspect of the present disclosure includes an analysis unit, an autonomous travel control unit, and a passenger ascertaining unit. The analysis unit is configured to be able to perform attribute recognition and distance measurement of a nearby object of a host vehicle. The autonomous travel control unit is configured to be able to autonomously control a steering mechanism, a brake mechanism, and a drive mechanism based on predetermined travel route information and the attributes and distance of the nearby object. The passenger ascertaining unit is configured to transmit a notification to an interested person's contact-number device set in a reservation for vehicle allocation in at least one of (1) a case in which a passenger set in the reservation for vehicle allocation has boarded at a boarding point set in the reservation for vehicle allocation, (2) a case in which the passenger has not been recognized by the analysis unit within a predetermined waiting time from a target boarding point arrival time set in the reservation for vehicle allocation, and (3) a case in which the passenger has alighted at a destination set in the reservation for vehicle allocation.
With this configuration, information on boarding and alighting of a passenger and information on the absence of a passenger at the time of boarding are transmitted to the interested person's contact-number device. For example, when the passenger is an aged person or a child, the information is transmitted to a contact number of a family member, and thus the family member can ascertain the state of the passenger.
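For illustration, the dispatch logic of such a passenger ascertaining unit might be sketched as follows; the class names, event cases, and send callback are assumptions introduced here and are not elements of the disclosure.

```python
from dataclasses import dataclass
from enum import Enum, auto


class PassengerEvent(Enum):
    BOARDED = auto()   # case (1): boarded at the boarding point
    ABSENT = auto()    # case (2): not recognized within the waiting time
    ALIGHTED = auto()  # case (3): alighted at the destination


@dataclass
class Reservation:
    passenger_name: str
    contact_device_address: str  # interested person's contact-number device


def notify_interested_person(event: PassengerEvent,
                             reservation: Reservation, send) -> None:
    """Send a notification for one of the three cases.

    `send` is a hypothetical transport callback (e.g. an HTTP POST
    wrapper); the disclosure does not specify the transport.
    """
    messages = {
        PassengerEvent.BOARDED: "has boarded.",
        PassengerEvent.ABSENT: "was not found at the boarding point.",
        PassengerEvent.ALIGHTED: "has alighted.",
    }
    text = f"{reservation.passenger_name} {messages[event]}"
    send(reservation.contact_device_address, text)
```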
In the configuration, the automated driving vehicle may further include an imaging device that images the periphery of the automated driving vehicle and transmits a captured image to the analysis unit. In this case, the passenger ascertaining unit may be configured to transmit an absence notification as the notification to the interested person's contact-number device and to transmit the captured image in (2) the case in which the passenger has not been recognized by the analysis unit within the predetermined waiting time from the target boarding point arrival time. The imaging device may be configured to change imaging settings based on an operation command from the interested person's contact-number device having received the absence notification.
With this configuration, when the automated driving vehicle cannot recognize a passenger even after the target boarding point arrival time, an interested person can view, on the interested person's contact-number device, an image captured by the imaging device of the automated driving vehicle and can remotely operate the imaging device. Accordingly, the remote interested person can use the captured image to ascertain, for example, features of the passenger which the automated driving vehicle has difficulty recognizing, and the imaging settings of the imaging device can be changed by a remote operation so as to capture features which can be easily recognized, such as the face of the passenger.
In the configuration, the analysis unit may be configured to recognize the passenger in the captured image based on a face image of the passenger set in the reservation for vehicle allocation.
With this configuration, it is possible to save the passenger the labor of inputting a password or the like to a device mounted in the automated driving vehicle.
In the configuration, the automated driving vehicle may further include an outside speaker that is able to output voice which is input to a calling unit of the interested person's contact-number device.
With this configuration, it is possible to attract the attention of a passenger who is not aware of the automated driving vehicle by outputting the interested person's voice from the outside speaker.
A vehicle allocation management device according to another aspect of the present disclosure includes a vehicle allocation managing unit and a passenger ascertaining unit. The vehicle allocation managing unit is configured to allocate an automated driving vehicle to a boarding point set in a reservation for vehicle allocation at the time of pickup based on the reservation for vehicle allocation. The passenger ascertaining unit is configured to transmit a notification to an interested person's contact-number device set in the reservation for vehicle allocation in at least one of (1) a case in which a passenger set in the reservation for vehicle allocation has boarded the automated driving vehicle at the boarding point, (2) a case in which the passenger has not been recognized by the automated driving vehicle within a predetermined waiting time from a target boarding point arrival time set in the reservation for vehicle allocation, and (3) a case in which the passenger has alighted from the automated driving vehicle at a destination set in the reservation for vehicle allocation.
In the configuration, the passenger ascertaining unit may be configured to transmit an absence notification as the notification to the interested person's contact-number device and to transmit an image captured by an imaging device configured to image the periphery of the automated driving vehicle in (2) the case in which the passenger has not been recognized by the automated driving vehicle within the predetermined waiting time from the target boarding point arrival time. The absence notification may include a message for demanding an input of an operation command for the imaging device from the interested person's contact-number device.
A terminal device according to another aspect of the present disclosure includes an input unit configured to be able to receive an input of a reservation for vehicle allocation of an automated driving vehicle. A boarding point, a target boarding point arrival time, and a passenger are set in the reservation for vehicle allocation. The terminal device further includes a transceiver unit and a remote operation unit. The transceiver unit is configured to receive an absence notification and an image captured by an imaging device configured to image the periphery of the automated driving vehicle when the passenger has not been recognized at the boarding point by the automated driving vehicle within a predetermined waiting time from the target boarding point arrival time. The remote operation unit is configured to be able to input an operation command for the imaging device when the absence notification and the captured image are received by the transceiver unit.
With the automated driving vehicle, the vehicle allocation management device, and the terminal device according to the present disclosure, it is possible to remotely watch a passenger when providing an unmanned taxi service using an automated driving vehicle.
Features, advantages, and technical and industrial significance of exemplary embodiments of the disclosure will be described below with reference to the accompanying drawings, in which like signs denote like elements, and wherein:
Hereinafter, an embodiment of the disclosure will be described with reference to the accompanying drawings. Shapes, materials, numbers, and numerical values which will be described below are merely provided for description, and can be appropriately changed according to specifications of an automated driving vehicle, a vehicle allocation management device, and a terminal device. In the following description, the same or similar elements in all the drawings will be referred to by the same reference signs.
In the vehicle allocation system according to this embodiment, a user who is an owner of a terminal device 70 such as a smartphone makes a reservation for allocation of an automated driving vehicle 10 which is an unmanned taxi. The vehicle allocation management device 50 receives reservations for vehicle allocation and manages the automated driving vehicles 10.
In the vehicle allocation system according to this embodiment, a person other than the user of the terminal device 70 who has made the reservation for vehicle allocation is assumed to be the main passenger. For example, an aged person or a child is the passenger, and a family member thereof makes the reservation for vehicle allocation using his or her terminal device 70.
As will be described below in detail, in the vehicle allocation system according to this embodiment, an automated driving vehicle 10 which is an unmanned taxi travels to a boarding point which is set in a reservation for vehicle allocation at the time of pickup. When the automated driving vehicle 10 arrives at the boarding point, the automated driving vehicle 10 images the periphery of the automated driving vehicle and recognizes a passenger.
When the recognized passenger boards the automated driving vehicle 10 and the vehicle departs, a boarding notification is transmitted to an interested person's contact-number device including the terminal device 70 having made the reservation for vehicle allocation. When the automated driving vehicle 10 with the passenger therein arrives at a destination and the passenger alights, an alighting notification is transmitted to the interested person's contact-number device. In this way, the remote interested person can be notified that the passenger has boarded or alighted and can thus watch the passenger.
In the vehicle allocation system according to this embodiment, when the automated driving vehicle 10 cannot recognize a passenger at the boarding point after a target boarding point arrival time, an absence notification and an image captured around the automated driving vehicle 10 are transmitted to the terminal device 70 which is an interested person's contact-number device. The user of the terminal device 70 having received the transmitted information remotely operates an outside camera 11A of the automated driving vehicle 10. For example, an interested person who sees the captured image can visually recognize features of the passenger which have not been set at the time of the reservation for vehicle allocation and remotely operate the outside camera 11A (for example, enlarging the image of the passenger), and thus the automated driving vehicle 10 can accurately recognize the passenger.
The vehicle allocation management device 50 is installed, for example, at a management company that provides an unmanned taxi service using an automated driving vehicle 10. The vehicle allocation management device 50 is constituted by, for example, a computer (an electronic calculator). Referring to
A program for performing vehicle allocation management is stored in at least one of the ROM 55 and the hard disk drive 57 which are storage devices. By causing the CPU 52 or the like of the vehicle allocation management device 50 to execute the program, functional blocks illustrated in
That is, the vehicle allocation management device 50 includes a dynamic map storage unit 66, a user registration information storage unit 67, a vehicle allocation reservation storage unit 68, and an automated driving vehicle storage unit 69 as storage units. The vehicle allocation management device 50 includes a transceiver unit 60, a vehicle allocation managing unit 61, a vehicle allocation reservation setting unit 62, a map preparing unit 63, and a timepiece 65 as functional units.
Dynamic map data which is map data is stored in the dynamic map storage unit 66. A dynamic map is a three-dimensional map and stores, for example, a position and a shape (three-dimensional shape) of a road. The three-dimensional shape of a road includes, for example, gradients and widths. Positions of lanes, crosswalks, stop lines, and the like which are drawn on a road are also stored in the dynamic map. Positions and shapes (three-dimensional shapes) of structures such as buildings and traffic signals near a road are also stored in the dynamic map. Positions and shapes of parking lots are also stored in the dynamic map.
For example, a geographic coordinate system including latitude and longitude is used in the dynamic map. When an automated driving vehicle 10 travels by automated driving, the map preparing unit 63 extracts dynamic map data from the dynamic map storage unit 66 and prepares guidance map data including a travel route. The travel route includes a current position of the automated driving vehicle 10, a boarding point, and a destination.
Registration information of users who are provided with an unmanned taxi service using automated driving vehicles 10 is stored in the user registration information storage unit 67.
Entries of a user management number, a user name, a user account name, a user terminal identification symbol, an additional contact number identification symbol, and face images of passengers 1 and 2 are provided in the user registration information management table. The user account name is a name (a member name) for identifying a user who uses the unmanned taxi service using an automated driving vehicle 10 and is a name for using an unmanned taxi application 82 installed in a terminal device 70 of the user. For example, an email address of a user is used as the user account name.
The user terminal identification symbol is a symbol for identifying a terminal device 70 which is an interested person's contact-number device owned by a user over the Internet 95. The user terminal identification symbol may be, for example, an IP address which is assigned to the terminal device 70. As will be described later, when a passenger set in a reservation for vehicle allocation has boarded an automated driving vehicle 10, when the passenger has alighted from the vehicle, and when the scan data analyzing unit 40 of the automated driving vehicle 10 cannot recognize the passenger, the passenger ascertaining unit 45 accesses the terminal device 70 which is an interested person's contact-number device via the Internet 95 with reference to the user terminal identification symbol.
An identification symbol of a device that can receive a notification from the passenger ascertaining unit 45 in addition to the terminal device 70 which is a user terminal is stored in the entry of the additional contact number identification symbol. For example, when the passenger is an aged person, a smartphone owned by a family hospital is used as the additional device. The additional contact number identification symbol may be, for example, an IP address assigned to the additional device.
Face image data of a passenger who uses the unmanned taxi service is stored in the entries of the face images of passengers 1 and 2. The face image data may be image data such as JPEG data. As will be described later, when a passenger boards an automated driving vehicle 10 which is an unmanned taxi, the face of the passenger is recognized based on the face image data. Because the face is recognized in this way, the passenger does not need to operate a device for authentication, which is useful when a person unaccustomed to operating an information and communication device, such as a preschool child or an aged person, is assumed as a passenger of the unmanned taxi.
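As a rough sketch, one row of this table could be represented as follows; every field name is an assumption, and only the entries listed above come from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class UserRegistration:
    """One row of the user registration information management table."""
    management_number: int
    user_name: str
    account_name: str                # member name, e.g. an email address
    terminal_id: str                 # e.g. IP address of the user terminal
    additional_contact_id: Optional[str] = None  # e.g. family hospital device
    passenger_face_images: list[bytes] = field(default_factory=list)  # JPEG
```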
A vehicle allocation reservation information management table illustrated in
The user account name corresponds to the user account name stored in the user registration information management table illustrated in
A boarding point and a destination which are input from the unmanned taxi application 82 are stored in the entries of the boarding point and the destination. Addresses of the boarding point and the destination are stored in these entries. Geographic coordinates (latitude and longitude) of the boarding point and the destination are also stored to match the coordinate system of the dynamic map data.
A name of a passenger of an unmanned taxi is stored in the entry of the passenger name. The passenger name is input by a user, for example, using the unmanned taxi application 82. Face image data of a passenger is stored in the entry of the passenger face image. For example, this data is selected out of face images of passengers stored in the user registration information management table illustrated in
Vehicle information of an automated driving vehicle 10 which is an unmanned taxi managed by the vehicle allocation management device 50 is stored in the automated driving vehicle storage unit 69. The vehicle information includes an identification symbol (for example, a vehicle number), a vehicle color, mileage, a state of charge, an operating state (in pickup, occupied, or available), etc. of the automated driving vehicle 10.
The vehicle allocation managing unit 61 manages automated driving vehicles 10 based on a reservation for vehicle allocation. As will be described later, when a target boarding point arrival time approaches, the vehicle allocation managing unit 61 sets an automated driving vehicle 10 in an available state as an allocated vehicle. The automated driving vehicle 10 is allocated for a boarding point which is set in the reservation for vehicle allocation. The vehicle allocation reservation setting unit 62 makes a reservation for vehicle allocation of an automated driving vehicle 10 in cooperation with the unmanned taxi application 82 of the terminal device 70.
The map preparing unit 63 prepares a travel route connecting three points including a current position (that is, a departure point) of the automated driving vehicle 10 set as the allocated vehicle and a boarding point and a destination set in the reservation for vehicle allocation. The map preparing unit 63 transmits dynamic map data including the travel route to the automated driving vehicle 10 set as the allocated vehicle. At the time of reservation for vehicle allocation using the unmanned taxi application 82, the map preparing unit 63 transmits map data (for example, two-dimensional map data) to the terminal device 70.
An appearance of an automated driving vehicle 10 is illustrated in
An automated driving vehicle 10 can operate at levels from level 0 (at which a driver performs all operations) to level 5 (at which driving is fully automated) based on the standard of the Society of Automotive Engineers (SAE) of the United States. For example, the level of automated driving is set to level 4 or level 5 when the automated driving vehicle 10 operates.
An automated driving mechanism of an automated driving vehicle 10 is illustrated in
The automated driving vehicle 10 includes an outside camera 11A, a LiDAR unit 11B, a proximity sensor 12, a positioning unit 13, and a control unit 20 as mechanisms for acquisition of a self-position or ascertainment of surrounding circumstances.
Referring to
The LiDAR unit 11B is a sensor unit for travel by automated driving and is a distance measuring unit that can measure a distance between a nearby object of the host vehicle and the host vehicle. A technique of measuring a distance to a nearby object using Light Detection and Ranging (LiDAR), that is, laser light, is used in the LiDAR unit 11B. The LiDAR unit 11B includes an emitter that emits infrared laser light to the outside, a receiver that receives reflected light, and a motor that rotates the emitter and the receiver.
For example, the emitter emits infrared laser light to the outside. When laser light emitted from the emitter reaches a nearby object of the automated driving vehicle 10, reflected light thereof is received by the receiver. A distance between a reflecting point and the receiver is calculated based on a time required from emission of light from the emitter to reception of light by the receiver. When the emitter and the receiver are rotated by the motor, laser light is emitted in the horizontal direction and the vertical direction and thus three-dimensional point group data for the periphery of the automated driving vehicle 10 can be acquired.
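As a worked sketch of this time-of-flight measurement: the one-way distance is c times the round-trip time divided by two, and the emitter orientation turns each echo into one point of the three-dimensional point group. The function below is illustrative only.

```python
import math

C = 299_792_458.0  # speed of light [m/s]


def lidar_point(tof_s: float, azimuth_rad: float, elevation_rad: float):
    """Convert one LiDAR echo into a 3D point in the sensor frame.

    The laser light travels to the reflecting point and back, so the
    one-way distance is c * tof / 2.
    """
    r = C * tof_s / 2.0
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)


# Example: an echo received 200 ns after emission reflects from a point
# about 30 m ahead of the sensor.
print(lidar_point(200e-9, 0.0, 0.0))  # -> (29.979..., 0.0, 0.0)
```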
The outside camera 11A is an imaging device that captures an image in the same field of view as the LiDAR unit 11B. The outside camera 11A includes an image sensor such as a CMOS sensor or a CCD sensor. The proximity sensor 12 is, for example, an infrared sensor and is provided on the front surface, the two side surfaces, and the rear surface of the automated driving vehicle 10 as illustrated in
The positioning unit 13 is a system that performs positioning using artificial satellites and, for example, a Global Navigation Satellite System (GNSS) is used. The self-position (latitude and longitude) can be estimated using the positioning unit 13.
An inside camera 29 images the inside of the automated driving vehicle 10. The captured image is used to determine whether a passenger has boarded the automated driving vehicle. The inside camera 29 includes an image sensor such as a CMOS sensor or a CCD sensor.
An outside speaker 19 is provided on the front surface of the automated driving vehicle 10 as illustrated in
For example, an outside noticeboard 90 (see
Referring to
A program for performing automated driving control of the automated driving vehicle 10 is stored in at least one of the ROM 25 and the hard disk drive 27 which are the storage devices. By causing the CPU 22 or the like of the control unit 20 to execute the program, functional blocks illustrated in
The scan data analyzing unit 40 is configured to be able to perform recognition of attributes of a nearby object of the host vehicle and measurement of a distance to the nearby object. The scan data analyzing unit 40 acquires a captured image captured by the outside camera 11A. The scan data analyzing unit 40 performs image recognition on the acquired captured image using a known supervised deep learning method such as a single-shot multibox detector (SSD) or You Only Look Once (YOLO). Detection of an object in the captured image and recognition of attributes (such as a vehicle, a pedestrian, or a structure) of the object are performed through the image recognition.
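As a minimal sketch of this detect-and-classify step, the snippet below substitutes a pretrained Faster R-CNN from torchvision for the SSD or YOLO detector named above; the model choice and the score threshold are assumptions.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Pretrained detector as a stand-in for the SSD/YOLO model in the text.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

PERSON = 1  # in the COCO label set, class 1 is "person" (pedestrian candidate)


def detect_objects(image_path: str, score_threshold: float = 0.5):
    """Return (label, score, box) triples for one outside-camera frame."""
    img = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        pred = model([img])[0]  # dict with 'boxes', 'labels' and 'scores'
    return [(int(l), float(s), b.tolist())
            for b, l, s in zip(pred["boxes"], pred["labels"], pred["scores"])
            if s >= score_threshold]
```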
The scan data analyzing unit 40 acquires three-dimensional point group data from the LiDAR unit 11B. The scan data analyzing unit 40 performs clustering of dividing the three-dimensional point group into a plurality of clusters. The scan data analyzing unit 40 prepares periphery data in which the captured image subjected to the image recognition and coordinates of the clustered three-dimensional point group data are superimposed. The distances of objects having certain attributes from the automated driving vehicle 10 can be detected based on the periphery data. The periphery data is transmitted to the self-position estimating unit 41 and the autonomous travel control unit 42.
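The clustering step could be sketched as follows, assuming DBSCAN from scikit-learn as a stand-in for the unspecified clustering method; the eps and min_samples values are illustrative.

```python
import numpy as np
from sklearn.cluster import DBSCAN


def cluster_point_cloud(points: np.ndarray) -> np.ndarray:
    """Divide an (N, 3) LiDAR point cloud into clusters.

    Returns one integer label per point; -1 marks noise points. Each
    cluster can then be superimposed on the recognized image so that a
    measured distance is attached to an object with known attributes.
    """
    return DBSCAN(eps=0.5, min_samples=10).fit_predict(points)
```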
As will be described later, the scan data analyzing unit 40 recognizes a face of a passenger based on a face image of the passenger set in a reservation for vehicle allocation. For example, the scan data analyzing unit 40 includes a face image database in which a plurality of face images is stored. Face image data of passengers are also stored in the face image database. The scan data analyzing unit 40 is configured to be able to extract a face image from the captured image captured by the outside camera 11A and to extract face information with the highest similarity to the extracted face image from the face image database.
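A minimal sketch of this matching step, assuming the open-source face_recognition package as a stand-in for the face image database lookup; the 0.6 tolerance is that library's common default, not a value from the disclosure.

```python
import numpy as np
import face_recognition  # open-source library, used here as a stand-in


def match_passenger(captured_path: str, registered_encodings: list,
                    tolerance: float = 0.6):
    """Return the index of the best-matching registered face, or None.

    `registered_encodings` would be built once from the face images set
    in the reservation for vehicle allocation.
    """
    if not registered_encodings:
        return None
    image = face_recognition.load_image_file(captured_path)
    for candidate in face_recognition.face_encodings(image):
        distances = face_recognition.face_distance(registered_encodings,
                                                   candidate)
        best = int(np.argmin(distances))
        if distances[best] <= tolerance:
            return best
    return None
```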
The self-position estimating unit 41 acquires self-position information (latitude and longitude) from the positioning unit 13. For example, the self-position estimating unit 41 acquires self-position information from satellites. It is known that self-position information acquired from satellites includes an error of up to about 100 m. Therefore, the self-position estimating unit 41 may correct the self-position information acquired from the positioning unit 13.
For example, the self-position estimating unit 41 estimates a rough self-position from the self-position information acquired from the satellites and extracts dynamic map data near the self-position from the map storage unit 46. The self-position estimating unit 41 matches the periphery data from the scan data analyzing unit 40 against the three-dimensional image based on the dynamic map. A coordinate point on the dynamic map, that is, the self-position, is acquired through this matching. The acquired self-position information (vehicle position information) is sent to the autonomous travel control unit 42.
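The matching against the dynamic map could be sketched as follows, assuming point-to-point ICP via Open3D as a stand-in for the unspecified matching method; the correspondence distance is illustrative.

```python
import numpy as np
import open3d as o3d


def refine_pose(scan_xyz: np.ndarray, map_xyz: np.ndarray,
                rough_pose: np.ndarray) -> np.ndarray:
    """Refine a rough GNSS pose by aligning a LiDAR scan with the
    dynamic map's 3D geometry; returns a 4x4 pose matrix."""
    scan = o3d.geometry.PointCloud()
    scan.points = o3d.utility.Vector3dVector(scan_xyz)
    ref = o3d.geometry.PointCloud()
    ref.points = o3d.utility.Vector3dVector(map_xyz)
    result = o3d.pipelines.registration.registration_icp(
        scan, ref, max_correspondence_distance=1.0, init=rough_pose,
        estimation_method=o3d.pipelines.registration
        .TransformationEstimationPointToPoint())
    return result.transformation
```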
The transceiver unit 43 serves both as a receiver unit that receives a signal sent from the outside to the automated driving vehicle 10 and as a transmitter unit that transmits a signal from the automated driving vehicle 10 to the outside. For example, travel route map data and travel schedule data are transmitted from the vehicle allocation management device 50 to the transceiver unit 43. As will be described later, the travel route map data includes dynamic map data and is transmitted from the map preparing unit 63 of the vehicle allocation management device 50. The travel schedule data includes a departure time, a target boarding point arrival time, and a target destination arrival time.
The autonomous travel control unit 42 is configured to be able to control the steering mechanism 15, the brake mechanism 14, and the inverter 18 which is the drive mechanism based on predetermined travel route information and the attributes and distance of a nearby object which have been analyzed by the scan data analyzing unit 40.
Specifically, the autonomous travel control unit 42 performs travel control of the automated driving vehicle 10 based on the travel route map data stored in the map storage unit 46, the travel schedule data stored in the vehicle allocation reservation storage unit 47, the self-position information (vehicle position information) transmitted from the self-position estimating unit 41, and the periphery data transmitted from the scan data analyzing unit 40.
For example, a global route is determined based on the vehicle position and the travel route map data. A local route in which an obstacle ahead is avoided or the like is determined based on the periphery data. The autonomous travel control unit 42 controls the brake mechanism 14, the steering mechanism 15, and the inverter 18 according to the determined routes.
Vehicle allocation reservation data illustrated in
As will be described later, the passenger ascertaining unit 45 determines whether the scan data analyzing unit 40 has recognized a passenger at the boarding point. In at least one of the following cases of (1) to (3), the passenger ascertaining unit 45 transmits a notification to the terminal device 70 which is an interested person's contact-number device set in the reservation for vehicle allocation via the transceiver unit 43:
(1) a case in which a passenger set in a reservation for vehicle allocation has boarded at a boarding point set in the reservation for vehicle allocation;
(2) a case in which a passenger has not been recognized by the scan data analyzing unit 40 within a predetermined waiting time from a target boarding point arrival time set in a reservation for vehicle allocation; and
(3) a case in which a passenger has alighted at a destination set in a reservation for vehicle allocation.
In the case of (2), that is, when a passenger has not been recognized at the boarding point by the scan data analyzing unit 40, the passenger ascertaining unit 45 transmits an absence notification to the terminal device 70 (the interested person's contact-number device) to request a remote operation of the outside camera 11A.
When the passenger's face is recognized by the scan data analyzing unit 40, the passenger ascertaining unit 45 displays a pickup message using the outside noticeboard 90 (see
Terminal Device
Referring to
A hardware configuration of the terminal device 70 is illustrated in
The input unit 73 and the display unit 74 may be integrally formed as a touch panel. As will be described later, the input unit 73 can input a destination, a boarding point, a date of use, and a scheduled boarding time (a target boarding point arrival time) in a reservation for vehicle allocation of an automated driving vehicle 10.
A program for performing a reservation for vehicle allocation of an automated driving vehicle 10 is stored in at least one of the ROM 75 and the storage device 77 which are storage devices. By causing the CPU 72 or the like of the terminal device 70 to execute the program, the functional blocks illustrated in
The unmanned taxi application 82 includes a display control unit 82A, a vehicle allocation reservation setting unit 82B, a remote camera operating unit 82C, and a remote speaker operating unit 82D. When the unmanned taxi application 82 is started, an authentication screen is displayed on the display unit 74. When a user account name and a password are input to the screen by a user, for example, using the input unit 73, the unmanned taxi application 82 logs in to the vehicle allocation reservation system.
For example, the display control unit 82A displays a vehicle allocation reservation screen including the authentication image on the display unit 74. The display control unit 82A displays guidance screens illustrated in
The vehicle allocation reservation setting unit 82B sets a reservation for vehicle allocation of an automated driving vehicle which is an unmanned taxi. For example, the vehicle allocation reservation setting unit 82B causes the display control unit 82A to display a vehicle allocation reservation screen on the display unit 74, and allows a user to input entries required for a reservation for vehicle allocation, which are illustrated in
When a passenger has not been recognized by the scan data analyzing unit 40 of the automated driving vehicle 10 at the time of pickup (when the passenger is absent) as will be described later, the remote camera operating unit 82C is configured to remotely operate the outside camera 11A. The remote speaker operating unit 82D is configured to transmit voice data, such as the natural voice of an interested person input to the calling unit 78, to the passenger ascertaining unit 45 when the passenger is absent. The passenger ascertaining unit 45 outputs the received voice data from the outside speaker 19.
The vehicle allocation managing unit 61 refers to a vehicle allocation reservation management table stored in the vehicle allocation reservation storage unit 68. The vehicle allocation managing unit 61 acquires the current time from the timepiece 65. The vehicle allocation managing unit 61 extracts a reservation for vehicle allocation in which the current time is included in a pickup time period before the target boarding point arrival time set in the vehicle allocation reservation management table (hereinafter appropriately referred to as a previous reservation for vehicle allocation) (S10). The pickup time period may be, for example, 1 hour before the target boarding point arrival time set in the previous reservation for vehicle allocation.
The vehicle allocation managing unit 61 accesses the automated driving vehicle storage unit 69, ascertains an operating state of each automated driving vehicle 10 under management (in pickup, occupied, or available), and extracts available automated driving vehicles 10 (S12). The vehicle allocation managing unit 61 sets an automated driving vehicle 10 closest to the boarding point set in the previous reservation for vehicle allocation out of the available automated driving vehicles 10 as an allocated vehicle (S14).
Then, the map preparing unit 63 prepares travel route map data passing through three points including the current position of the automated driving vehicle 10 set as an allocated vehicle and a boarding point and a destination which are set in the previous reservation for vehicle allocation. This map data is prepared by processing the dynamic map data stored in the dynamic map storage unit 66.
The vehicle allocation managing unit 61 transmits the travel route map data and travel schedule data to the automated driving vehicle 10 set as the allocated vehicle (S16). The travel schedule data includes the target boarding point arrival time set in the previous reservation for vehicle allocation and a target destination arrival time calculated on the assumption that the automated driving vehicle 10 departs the boarding point at the target boarding point arrival time and travels to the destination at a predetermined rated speed. The vehicle allocation managing unit 61 transmits a departure command to the automated driving vehicle 10 set as the allocated vehicle (S18).
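Steps S10 to S14 (extracting a due reservation and choosing the nearest available vehicle) might be sketched as follows; the attribute names and the great-circle distance measure are assumptions introduced for illustration.

```python
import math
from datetime import datetime, timedelta

PICKUP_WINDOW = timedelta(hours=1)  # S10: e.g. 1 hour before arrival time


def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))


def allocate(reservations, vehicles, now: datetime):
    """Pair each due reservation with the nearest available vehicle.

    Reservations carry .arrival_time and .boarding_point (lat, lon);
    vehicles carry .state and .position. All names are illustrative.
    """
    for r in reservations:
        if not (now <= r.arrival_time <= now + PICKUP_WINDOW):
            continue  # S10: not yet within the pickup time period
        available = [v for v in vehicles if v.state == "available"]  # S12
        if available:
            v = min(available,
                    key=lambda v: haversine_km(v.position, r.boarding_point))
            v.state = "in pickup"  # S14: this vehicle is the allocated one
            yield r, v
```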
A pickup routine to a boarding point which is performed by an automated driving vehicle 10 set as an allocated vehicle is illustrated in
As described above, the automated driving vehicle 10 receives a departure command from the vehicle allocation managing unit 61 of the vehicle allocation management device 50. The autonomous travel control unit 42 of the automated driving vehicle 10 having received the departure command performs automated-driving travel from the current position to the boarding point along the travel route map while controlling the brake mechanism 14, the steering mechanism 15, and the inverter 18 which is a drive mechanism (S30).
The vehicle allocation managing unit 61 of the vehicle allocation management device 50 refers to the vehicle allocation management table (see
The vehicle allocation managing unit 61 acquires a user terminal identification symbol correlated with the acquired user account name with reference to the user registration information management table (see
The vehicle allocation managing unit 61 transmits map image data and departure message data to the terminal device 70 (S32). The display control unit 82A of the terminal device 70 displays the transmitted data on the display unit 74 (S34).
For example, as illustrated in
When the automated driving vehicle 10 arrives at the boarding point, the autonomous travel control unit 42 controls the brake mechanism 14 and the inverter 18 such that the automated driving vehicle 10 stops at that position (S36). The vehicle allocation managing unit 61 transmits a notification indicating that the allocated vehicle has arrived at the boarding point to the terminal device 70. As illustrated in
The scan data analyzing unit 40 acquires a captured image of the periphery of the automated driving vehicle 10 from the outside camera 11A and performs image recognition (S38). An image recognition result from the scan data analyzing unit 40 is transmitted to the passenger ascertaining unit 45. The passenger ascertaining unit 45 determines whether a passenger is included in attributes of an object recognized by the scan data analyzing unit 40, that is, whether the scan data analyzing unit 40 has recognized the passenger's face in the captured outside image (S40).
When the scan data analyzing unit 40 has recognized the passenger's face in the captured outside image, the passenger ascertaining unit 45 transmits a notification indicating that recognition of the passenger is successful to the terminal device 70. A message box 110D illustrated in
The passenger ascertaining unit 45 ascertains whether the passenger has boarded the vehicle. Specifically, as illustrated in
When the passenger has not been recognized from the captured inside image, the passenger ascertaining unit 45 outputs a boarding demand message using at least one of the outside noticeboard 90 (see
Although not included in the routines illustrated in
When the scan data analyzing unit 40 has not recognized the passenger's face from the captured outside image in Step S40, the passenger ascertaining unit 45 determines whether the current time is past the target boarding point arrival time set in the reservation for vehicle allocation with reference to the timepiece 48 (S42). When the current time is not past the target boarding point arrival time, there is a likelihood that the passenger has not yet reached the boarding point, and thus the scan data analyzing unit 40 continues to perform image recognition on a captured outside image in Step S38.
On the other hand, when the current time is past the target boarding point arrival time, the passenger ascertaining unit 45 calculates an elapsed time Tw from the target boarding point arrival time and determines whether the elapsed time is greater than a predetermined threshold time Tth (waiting time) (S44). The threshold time Tth may be, for example, 10 minutes. When the elapsed time Tw from the target boarding point arrival time is equal to or less than the threshold time Tth, the scan data analyzing unit 40 continues to perform image recognition on a captured outside image in S38.
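The decision of S42 and S44 reduces to the following check; the 10-minute threshold is the example given above.

```python
from datetime import datetime, timedelta

THRESHOLD = timedelta(minutes=10)  # example waiting time Tth


def should_send_absence_notification(target_arrival: datetime,
                                     now: datetime) -> bool:
    """Keep scanning until the elapsed time Tw past the target boarding
    point arrival time exceeds the waiting time Tth."""
    if now <= target_arrival:
        return False  # S42: the passenger may simply not have arrived yet
    return (now - target_arrival) > THRESHOLD  # S44: Tw > Tth
```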
On the other hand, when the elapsed time Tw from the target boarding point arrival time is greater than the threshold time Tth (waiting time), the passenger ascertaining unit 45 transmits data of absence notification 1 to the terminal device 70 (S46). The passenger ascertaining unit 45 transmits a captured image from the outside camera 11A to the terminal device 70.
When the transceiver unit 80 of the terminal device 70 receives the data of absence notification 1, a message box 110F describing absence notification 1, which indicates that the automated driving vehicle 10 has not ascertained (recognized) the passenger, is displayed in the map image 100 on the display unit 74 as illustrated in
A message for demanding a remote operation of the outside camera 11A and a message indicating that voice input to the calling unit 78 can be output from the outside speaker 19 are also described in the message box 110F. The display unit 74 is switched to, for example, a remote operation screen illustrated in
Operation buttons 120A to 120D for changing a display position and a zoom button 124 for changing an imaging magnification are displayed in the remote operation screen. These buttons are operated by a user via the remote camera operating unit 82C. Imaging settings of the outside camera 11A are changed in accordance with an operation command input by the user (S48). The captured image after the settings have been changed is transmitted from the passenger ascertaining unit 45 to the terminal device 70 (S50).
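As an illustrative sketch of such an operation command (the wire format and the camera driver methods are assumptions; the disclosure only states that the display position and imaging magnification change with the command):

```python
from dataclasses import dataclass


@dataclass
class CameraCommand:
    """One remote operation sent from the terminal device to the vehicle."""
    pan_deg: float = 0.0      # buttons 120A to 120D: shift the display position
    tilt_deg: float = 0.0
    zoom_factor: float = 1.0  # zoom button 124: imaging magnification


def apply_command(camera, cmd: CameraCommand) -> None:
    """Apply a terminal-side command to a camera driver object.

    `camera` is assumed to expose set_orientation() and set_zoom();
    both method names are hypothetical.
    """
    camera.set_orientation(pan=cmd.pan_deg, tilt=cmd.tilt_deg)
    camera.set_zoom(cmd.zoom_factor)
```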
For example, when the passenger faces a direction different from that in the face image set at the time of reservation for vehicle allocation, it may be difficult for the scan data analyzing unit 40 to recognize a face. When the passenger is apart from the automated driving vehicle 10, it may be difficult to analyze features of a face.
In this case, an interested person may be able to recognize the passenger, for example, based on features such as the clothes or height of the passenger. Therefore, the user is allowed to remotely operate the outside camera 11A, and settings such as the magnification or the imaging angle (camera angle) are changed such that the face of the passenger can be recognized.
When the user speaks into the calling unit 78 of the terminal device 70, the calling unit 78 transmits voice data to the passenger ascertaining unit 45 (S52). The passenger ascertaining unit 45 outputs the received voice data from the outside speaker 19 (S54). When a voice the passenger has heard before is output from the outside speaker 19, the passenger's attention may be attracted, for example, the passenger may turn his or her face toward the automated driving vehicle 10, and thus the face may be recognized more easily. Steps S48 to S54 may be repeatedly performed within a predetermined time.
The passenger ascertaining unit 45 determines whether the scan data analyzing unit 40 has recognized the passenger's face from the captured outside image (S56). When the passenger's face has been recognized, the routine proceeds to Step S58, in which it is ascertained whether the passenger has boarded the automated driving vehicle 10.
On the other hand, when the scan data analyzing unit 40 could not recognize the passenger's face from the captured outside image in Step S56, the passenger ascertaining unit 45 transmits data of absence notification 2 to the terminal device 70 (S66). As illustrated in
A message demanding that a relevant party such as the police or a facility for aged people be contacted, or a message indicating that the reservation for vehicle allocation is cancelled, is also described in the message box 110G. The passenger ascertaining unit 45 cancels the reservation for vehicle allocation (S68). For example, the passenger ascertaining unit 45 transmits a notification indicating that the passenger could not be recognized even by the user's remote operation and a notification for cancelling the reservation for vehicle allocation to the vehicle allocation managing unit 61 of the vehicle allocation management device 50. The vehicle allocation managing unit 61 having received the notifications cancels the corresponding reservation for vehicle allocation in the vehicle allocation reservation management table (see
In the aforementioned embodiment, the passenger ascertaining unit 45 transmits a notification to the terminal device 70 in all of the case in which the passenger boards an automated driving vehicle 10, the case in which the passenger alights from the automated driving vehicle 10, and the case in which the passenger is absent at the boarding point; however, the notification may be transmitted in only one of these three cases. For example, the passenger ascertaining unit 45 may perform S46 (transmission of absence notification 1) in
That is, the passenger ascertaining unit 45 provided in the vehicle allocation management device 50 transmits a notification to a terminal device 70 which is an interested person's contact-number device set in a reservation for vehicle allocation in at least one of:
(1) a case in which a passenger set in the reservation for vehicle allocation has boarded at a boarding point set in the reservation for vehicle allocation;
(2) a case in which the passenger has not been recognized by the scan data analyzing unit 40 within a predetermined waiting time from a target boarding point arrival time set in the reservation for vehicle allocation; and
(3) a case in which the passenger has alighted at a destination set in the reservation for vehicle allocation.
Specifically, a pickup routine which is performed by the vehicle allocation system illustrated in
As can be clearly seen from the routine illustrated in
Although not included in the routines illustrated in
In the aforementioned embodiment, the passenger ascertaining unit 45 of the vehicle allocation management device 50 transmits a notification to the terminal device 70 in all of the case in which the passenger boards an automated driving vehicle 10, the case in which the passenger alights from the automated driving vehicle 10, and the case in which the passenger is absent at the boarding point; however, the notification may be transmitted in only one of these three cases. For example, the passenger ascertaining unit 45 may perform S146 (transmission of absence notification 1) in