AUTONOMOUS DRIVING VEHICLE INFORMATION PRESENTATION DEVICE

Information

  • Patent Application
    20220063486
  • Publication Number
    20220063486
  • Date Filed
    August 25, 2021
  • Date Published
    March 03, 2022
Abstract
An autonomous driving vehicle information presentation device includes an identification unit that searches for a person existing around the own vehicle, identifies whether the person extracted by the search coincides with a user of the own vehicle, and determines whether the person extracted by the search is an owner of the own vehicle, and an information presentation unit that presents information to the person using an external display device. In a case where it is identified that the person extracted by the search coincides with the user of the own vehicle and it is determined that the person extracted by the search is the owner of the own vehicle, the information presentation unit presents information unique to the owner in a preset presentation mode with the owner serving as a presentation target.
Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims priority from Japanese Patent Application No. 2020-143968 filed on Aug. 27, 2020, the entire content of which is incorporated herein by reference.


BACKGROUND
Technical Field

The present invention relates to an autonomous driving vehicle information presentation device that is used in an autonomous driving vehicle and presents information to a person existing around an own vehicle.


Description of the Related Art

In recent years, in order to achieve safe and comfortable operation of a vehicle while reducing a burden on a driver, a technique called autonomous driving has been actively proposed.


As an example of the autonomous driving technique, JP-A-2017-199317 discloses a vehicle control system including: a detection unit that detects a surrounding state of a vehicle; an autonomous driving control unit that executes autonomous driving in which at least one of speed control and steering control of the vehicle is automatically performed based on the surrounding state of the vehicle detected by the detection unit; a recognition unit that recognizes a direction of a person relative to the vehicle based on the surrounding state of the vehicle detected by the detection unit; and an output unit that outputs information recognizable by the person recognized by the recognition unit, the information having directivity in the direction of the person recognized by the recognition unit.


Further, JP-A-2008-017227 discloses a face recognition device including: an imaging device that is attached to an automobile and images a face of a person existing in a field of view of imaging; and a face registration unit that stores face feature information of a person registered as a user in association with user identification information. The face recognition device performs recognition processing based on face feature information of an imaged face image and the face feature information registered in the face registration unit and outputs a recognition result thereof. When recognition fails, the face recognition device turns on an illumination device that illuminates a face of a person in the field of view of imaging to acquire a face image again, and performs re-recognition processing.
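The retry-with-illumination flow described above can be sketched as follows; the callable names (`capture`, `match`, `illuminate_on`) are illustrative assumptions, not taken from JP-A-2008-017227:

```python
def recognize_with_retry(capture, match, illuminate_on):
    """Try face recognition once; on failure, turn on illumination and retry once."""
    if match(capture()):
        return True
    illuminate_on()          # illuminate the face in the field of view of imaging
    return match(capture())  # re-recognition processing on the newly acquired image

# Minimal usage with stub callables: the first (dark) capture fails to match,
# the second capture after illumination succeeds.
frames = iter(["dark", "lit"])
result = recognize_with_retry(
    capture=lambda: next(frames),
    match=lambda img: img == "lit",
    illuminate_on=lambda: None,
)
print(result)  # True
```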


However, in the related art, there is room for improvement from the viewpoint of causing an owner of an autonomous driving vehicle to develop a feeling of attachment to the autonomous driving vehicle.


SUMMARY

The present invention provides an autonomous driving vehicle information presentation device capable of causing an owner of an autonomous driving vehicle to develop a feeling of attachment to the autonomous driving vehicle.


According to an aspect of the present invention, there is provided an autonomous driving vehicle information presentation device used for an autonomous driving vehicle that acquires external environment information including a target existing around an own vehicle, generates an action plan of the own vehicle based on the acquired external environment information, and automatically performs at least one of speed control and steering control of the own vehicle in accordance with the generated action plan, the autonomous driving vehicle information presentation device being configured to present information to a person existing around the own vehicle. The autonomous driving vehicle information presentation device includes: an identification unit configured to search for a person existing around the own vehicle based on the external environment information, identify whether the person extracted by the search coincides with a user of the own vehicle, and determine whether the person extracted by the search is an owner of the own vehicle; and an information presentation unit configured to present information to the person using an external display device provided in at least one of a front portion and a rear portion of the own vehicle. In a case where, as a result of the identification performed by the identification unit, it is identified that the person extracted by the search coincides with the user of the own vehicle, and, as a result of the determination performed by the identification unit, it is determined that the person extracted by the search is the owner of the own vehicle, the information presentation unit presents information unique to the owner in a preset presentation mode with the owner serving as a presentation target.
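The identify-then-determine flow of this aspect can be sketched roughly as follows; all names (`Person`, `identify_user`, `present_information`) and the returned mode strings are illustrative assumptions, and the tuple comparison is only a placeholder for real recognition of a person extracted from the external environment information:

```python
from dataclasses import dataclass

@dataclass
class Person:
    face_features: tuple  # features extracted from images of the vehicle surroundings

def identify_user(person, registered_users):
    """Return a user ID if the person's features match a registered user, else None."""
    for user_id, features in registered_users.items():
        if person.face_features == features:  # placeholder for real face matching
            return user_id
    return None

def present_information(person, registered_users, owner_id):
    """Choose a presentation mode following the identification/determination flow."""
    user_id = identify_user(person, registered_users)
    if user_id is None:
        return "no presentation"            # not a registered user of the own vehicle
    if user_id == owner_id:
        return "owner-unique presentation"  # preset mode, owner as presentation target
    return "generic user presentation"      # registered user who is not the owner

registered = {"owner01": (1, 2, 3), "family02": (4, 5, 6)}
print(present_information(Person((1, 2, 3)), registered, "owner01"))
# owner-unique presentation
```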


According to the present invention, the autonomous driving vehicle information presentation device capable of causing the owner of the autonomous driving vehicle to develop the feeling of attachment to the autonomous driving vehicle can be provided.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is an overall configuration diagram of an autonomous driving vehicle including an information presentation device according to an embodiment of the present invention.



FIG. 2 is a functional block configuration diagram showing configurations of a vehicle control device including an autonomous driving vehicle information presentation device according to the embodiment of the present invention and a peripheral portion thereof.



FIG. 3 is a schematic configuration diagram of an HMI provided in the autonomous driving vehicle information presentation device.



FIG. 4 shows a vehicle interior front structure of an autonomous driving vehicle.



FIG. 5A is an external view showing a front structure of the autonomous driving vehicle.



FIG. 5B is an external view showing a rear structure of the autonomous driving vehicle.



FIG. 5C is a front view showing a schematic configuration of left and right front lighting units provided in the autonomous driving vehicle.



FIG. 6 is a block configuration diagram conceptually showing functions of the autonomous driving vehicle information presentation device.



FIG. 7 shows an example of an information presentation mode stored by a storage unit of the autonomous driving vehicle information presentation device.



FIG. 8 is a flowchart showing an operation of the autonomous driving vehicle information presentation device.





DESCRIPTION OF EMBODIMENTS

Hereinafter, an autonomous driving vehicle information presentation device according to an embodiment of the present invention will be described in detail with reference to the drawings.


Note that, in the drawings described below, members having common functions are denoted by common reference signs. In addition, the sizes and shapes of members may be schematically illustrated in a deformed or exaggerated manner for convenience of description.


In description of a vehicle control device according to the embodiment of the present disclosure, when expressions of left and right are used for an own vehicle M, orientation of a vehicle body of the own vehicle M is used as a reference. Specifically, for example, in a case where the own vehicle M has a right hand drive specification, a driver seat side is referred to as a right side, and a passenger seat side is referred to as a left side.


[Configuration of Own Vehicle M]


First, a configuration of an autonomous driving vehicle (hereinafter, also referred to as an “own vehicle”) M including a vehicle control device 100 according to the embodiment of the present invention will be described with reference to FIG. 1.



FIG. 1 is an overall configuration diagram of the autonomous driving vehicle M including the vehicle control device 100 according to the embodiment of the present invention.


In FIG. 1, the own vehicle M on which the vehicle control device 100 is mounted is, for example, an automobile such as a two-wheeled automobile, a three-wheeled automobile, or a four-wheeled automobile.


Examples of the own vehicle M include an automobile having an internal combustion engine such as a diesel engine or a gasoline engine as a power source, an electric automobile having an electric motor as a power source, and a hybrid automobile having both an internal combustion engine and an electric motor. Among these automobiles, the electric automobile is driven by using electric power discharged from a battery such as a secondary battery, a hydrogen fuel battery, a metal fuel battery, or an alcohol fuel battery.


As shown in FIG. 1, the own vehicle M is equipped with an external environment sensor 10 that has a function of detecting external environment information on a target including an object or a sign existing around the own vehicle M, a navigation device 20 that has a function of mapping a current position of the own vehicle M on a map and performing route guidance to a destination and the like, and the vehicle control device 100 that has a function of performing autonomous travel control of the own vehicle M including steering, acceleration and deceleration of the own vehicle M, and the like.


These devices and instruments are connected to each other via a communication medium such as a controller area network (CAN) so as to be capable of performing data communication with each other.
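As a rough illustration of the kind of framing used on such a bus, the following sketch packs an 11-bit identifier and a short payload into a classic-CAN-like byte layout; the actual bus stack of the own vehicle M is not described in this document, so the layout and names here are assumptions for illustration only:

```python
import struct

def pack_frame(can_id, data):
    """Pack an 11-bit identifier, a length byte, and up to 8 data bytes
    into a simplified classic-CAN-style frame (big-endian header)."""
    assert can_id < 0x800 and len(data) <= 8
    return struct.pack(">HB", can_id, len(data)) + data

# e.g. a hypothetical sensor message with identifier 0x123 and two data bytes
frame = pack_frame(0x123, b"\x01\x02")
print(frame.hex())  # 0123020102
```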


In the present embodiment, a configuration in which the external environment sensor 10 and the like are provided outside the vehicle control device 100 is described as an example; alternatively, the vehicle control device 100 may be configured to include the external environment sensor 10 and the like.


[External Environment Sensor 10]


The external environment sensor 10 includes a camera 11, a radar 13, and a LIDAR 15.


The camera 11 has an optical axis inclined obliquely downward in front of the own vehicle and has a function of capturing an image in the traveling direction of the own vehicle M. As the camera 11, for example, a complementary metal oxide semiconductor (CMOS) camera, a charge-coupled device (CCD) camera, or the like can be appropriately used. The camera 11 is, for example, provided in the vicinity of a rearview mirror (not shown) in a vehicle interior of the own vehicle M, and in a front portion of a right door and a front portion of a left door outside the vehicle interior of the own vehicle M.


For example, the camera 11 periodically and repeatedly images a state of a front side, a right rear side, and a left rear side in the traveling direction of the own vehicle M. In the present embodiment, the camera 11 provided in the vicinity of the rearview mirror is configured with a pair of monocular cameras arranged side by side. The camera 11 may also be a stereo camera.


Image information on the front side, the right rear side, and the left rear side in the traveling direction of the own vehicle M acquired by the camera 11 is transmitted to the vehicle control device 100 via a communication medium.


The radar 13 has a function of emitting a radar wave to a target including a preceding vehicle, which travels in front of the own vehicle M and is a follow-up target thereof, and receiving the radar wave reflected by the target, thereby acquiring distribution information of the target including a distance to the target and an azimuth of the target. As the radar wave, a laser, a microwave, a millimeter wave, an ultrasonic wave, or the like can be appropriately used.


In the present embodiment, as shown in FIG. 1, five radars 13 are provided, specifically, three on a front side and two on a rear side. The distribution information of the target acquired by the radar 13 is transmitted to the vehicle control device 100 via a communication medium.


The LIDAR (Light Detection and Ranging) 15 has, for example, a function of detecting the presence or absence of a target and the distance to the target by measuring the time from emission of irradiation light to detection of the scattered light. In the present embodiment, as shown in FIG. 1, five LIDARs 15 are provided, specifically, two on the front side and three on the rear side. The distribution information of the target acquired by the LIDAR 15 is transmitted to the vehicle control device 100 via a communication medium.
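The time-of-flight measurement described above reduces to standard physics: the round-trip time of the light, multiplied by the speed of light and halved, gives the distance. A minimal sketch, assuming a simple round-trip timing model (the function name is illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def lidar_distance(round_trip_time_s):
    """Distance to a target from the measured round-trip time of the scattered light."""
    return C * round_trip_time_s / 2.0

# A return detected 400 ns after emission corresponds to roughly 60 m.
print(round(lidar_distance(400e-9), 1))  # 60.0
```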


[Navigation Device 20]


The navigation device 20 includes a global navigation satellite system (GNSS) receiver, map information (navigation map), a touch panel type internal display device 61 functioning as a human machine interface, a speaker 63 (see FIG. 3), a microphone, and the like. The navigation device 20 serves to calculate a current position of the own vehicle M by the GNSS receiver and to derive a route from the current position to a destination designated by a user.


The route derived by the navigation device 20 is provided to a target lane determination unit 110 (to be described below) of the vehicle control device 100. The current position of the own vehicle M may be specified or complemented by an inertial navigation system (INS) using an output of a vehicle sensor 30 (see FIG. 2). When the vehicle control device 100 executes a manual driving mode, the navigation device 20 provides guidance on a route to a destination by voice or map display.


The function for calculating the current position of the own vehicle M may be provided independently of the navigation device 20. The navigation device 20 may be implemented by, for example, a function of a terminal device such as a smartphone or a tablet terminal carried by a user (hereinafter also referred to as a “terminal device”). In this case, transmission and reception of information is performed between the terminal device and the vehicle control device 100 by wireless or wired communication.


[Configurations of Vehicle Control Device 100 and Peripheral Portion Thereof]


Next, configurations of the vehicle control device 100 mounted on the own vehicle M and a peripheral portion thereof will be described with reference to FIG. 2.



FIG. 2 is a functional block configuration diagram showing the configurations of the vehicle control device 100 according to the embodiment of the present invention and the peripheral portion thereof.


As illustrated in FIG. 2, in addition to the external environment sensor 10, the navigation device 20, and the vehicle control device 100 described above, a communication device 25, the vehicle sensor 30, a human machine interface (HMI) 35, a travel driving force output device 200, a steering device 210, and a brake device 220 are mounted on the own vehicle M.


The communication device 25, the vehicle sensor 30, the HMI 35, the travel driving force output device 200, the steering device 210, and the brake device 220 are connected to the vehicle control device 100 via a communication medium so as to be capable of performing data communication with the vehicle control device 100.


[Communication Device 25]


The communication device 25 has a function of performing communication via a wireless communication medium such as a cellular network, a Wi-Fi network, Bluetooth (registered trademark), or dedicated short range communication (DSRC).


The communication device 25 performs wireless communication with an information providing server of a system that monitors a traffic condition of a road, such as the Vehicle Information and Communication System (VICS) (registered trademark), and acquires traffic information indicating a traffic condition of a road on which the own vehicle M is traveling or is scheduled to travel. The traffic information includes information on traffic congestion in front of the own vehicle M, information on time required for passing through a traffic congestion point, information on accidents, disabled vehicles, and construction work, information on speed regulation and lane regulation, position information of parking lots, information on whether a parking lot, a service area, or a parking area is full or vacant, and the like.


The communication device 25 may acquire the traffic information by performing communication with a wireless beacon provided on a roadside or the like, or by performing vehicle-to-vehicle communication with another vehicle traveling around the own vehicle M.


For example, the communication device 25 performs wireless communication with an information providing server of the Traffic Signal Prediction Systems (TSPS), and acquires signal information related to a traffic light provided on a road on which the own vehicle M is traveling or is scheduled to travel. The TSPS serves to support driving for smoothly passing through a signalized intersection by using the signal information of the traffic light.


The communication device 25 may acquire the signal information by performing communication with an optical beacon provided on a roadside or the like, or by performing vehicle-to-vehicle communication with another vehicle traveling around the own vehicle M.


Furthermore, the communication device 25 may perform wireless communication with a terminal device such as a smartphone or a tablet terminal carried by a user, for example, and acquire user identification information indicating an identifier of the user. The terminal device is not limited to a smartphone or a tablet terminal, and may also be, for example, a so-called smart key. The user identification information may also be information indicating an identifier of the terminal device. However, in this case, for example, the vehicle control device 100 can refer to information in which the identifier of the terminal device and the identifier of the user are associated with each other, such that the user can be specified from the identifier of the terminal device.
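The association between a terminal identifier and a user identifier described above can be sketched as a simple lookup table; the table contents, identifiers, and function name here are hypothetical, for illustration only:

```python
# Hypothetical table associating a terminal identifier with a user identifier.
terminal_to_user = {
    "smartkey-7F3A": "owner01",
    "phone-91CC": "family02",
}

def user_from_terminal(terminal_id):
    """Resolve the user identifier from a terminal identifier, or None if unregistered."""
    return terminal_to_user.get(terminal_id)

print(user_from_terminal("smartkey-7F3A"))  # owner01
print(user_from_terminal("unknown-0000"))   # None
```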


[Vehicle Sensor 30]


The vehicle sensor 30 has a function of detecting various types of information relating to the own vehicle M. The vehicle sensor 30 includes a vehicle speed sensor that detects a vehicle speed of the own vehicle M, an acceleration sensor that detects an acceleration of the own vehicle M, a yaw-rate sensor that detects an angular velocity around a vertical axis of the own vehicle M, an azimuth sensor that detects orientation of the own vehicle M, an inclination angle sensor that detects an inclination angle of the own vehicle M, an illuminance sensor that detects illuminance of a place where the own vehicle M is present, a raindrop sensor that detects an amount of raindrops of the place where the own vehicle M is present, and the like.


[Configuration of HMI 35]


Next, the HMI 35 will be described with reference to FIGS. 3, 4, 5A, and 5B.



FIG. 3 is a schematic configuration diagram of the HMI 35 connected to the vehicle control device 100 according to the embodiment of the present invention. FIG. 4 shows a vehicle interior front structure of the vehicle M including the vehicle control device 100. FIGS. 5A and 5B are external views showing a front structure and a rear structure of the vehicle M including the vehicle control device 100, respectively.


As shown in FIG. 3, the HMI 35 includes components of a driving operation system and components of a non-driving operation system. A boundary between the components of the driving operation system and the components of the non-driving operation system is not clear, and the components of the driving operation system may also be configured to have functions of the non-driving operation system (or vice versa).


The HMI 35 includes, as the components of the driving operation system, an accelerator pedal 41, an accelerator opening degree sensor 43, an accelerator pedal reaction force output device 45, a brake pedal 47, a brake depression amount sensor 49, a shift lever 51, a shift position sensor 53, a steering wheel 55, a steering angle sensor 57, a steering torque sensor 58, and other driving operation devices 59.


The accelerator pedal 41 is an acceleration operator for receiving an acceleration instruction (or a deceleration instruction by a return operation) from a driver. The accelerator opening degree sensor 43 detects a depression amount of the accelerator pedal 41, and outputs an accelerator opening degree signal indicating the depression amount to the vehicle control device 100.


Instead of outputting the accelerator opening degree signal to the vehicle control device 100, a configuration in which the accelerator opening degree signal is directly output to the travel driving force output device 200, the steering device 210, or the brake device 220 may be adopted. The same applies to other configurations of the driving operation system described below. The accelerator pedal reaction force output device 45 outputs a force (operation reaction force) in a direction opposite to an operation direction relative to the accelerator pedal 41, for example, in response to an instruction from the vehicle control device 100.


The brake pedal 47 is a deceleration operation element configured to receive a deceleration instruction given by the driver. The brake depression amount sensor 49 detects a depression amount (or a depression force) of the brake pedal 47, and outputs a brake signal indicating a detection result thereof to the vehicle control device 100.


The shift lever 51 is a speed changing operation element configured to receive a shift stage change instruction given by the driver. The shift position sensor 53 detects a shift stage instructed by the driver, and outputs a shift position signal indicating a detection result thereof to the vehicle control device 100.


The steering wheel 55 is a steering operation element configured to receive a turning instruction given by the driver. The steering angle sensor 57 detects an operation angle of the steering wheel 55, and outputs a steering angle signal indicating a detection result thereof to the vehicle control device 100. The steering torque sensor 58 detects torque applied to the steering wheel 55, and outputs a steering torque signal indicating a detection result thereof to the vehicle control device 100.


The other driving operation device 59 is, for example, a joystick, a button, a dial switch, a graphical user interface (GUI) switch, and the like. The other driving operation device 59 receives an acceleration instruction, a deceleration instruction, a turning instruction, and the like, and outputs the received instructions to the vehicle control device 100.


Further, the HMI 35 includes, as the components of the non-driving operation system, the internal display device 61, the speaker 63, a contact operation detection device 65, a content reproduction device 67, various operation switches 69, a seat 73 and a seat driving device 75, a window glass 77 and a window driving device 79, a vehicle interior camera 81, and an external display device 83, for example.


The internal display device 61 is preferably a touch panel type display device having a function of displaying various types of information for an occupant in the vehicle interior. As shown in FIG. 4, the internal display device 61 includes, in an instrument panel 60, a meter panel 85 that is provided at a position directly facing a driver seat, a multi-information panel 87 that is provided to face the driver seat and a passenger seat and is horizontally long in a vehicle width direction (a Y-axis direction of FIG. 4), a right panel 89a that is provided on a driver seat side in the vehicle width direction, and a left panel 89b that is provided on a passenger seat side in the vehicle width direction. The internal display device 61 may be additionally provided at a position facing a rear seat (on a back side of a front seat).


The meter panel 85 displays, for example, a speedometer, a tachometer, an odometer, shift position information, lighting status information of lights, and the like.


The multi-information panel 87 displays, for example, various types of information such as map information on surroundings of the own vehicle M, current position information of the own vehicle M on a map, traffic information (including signal information) on a current traveling path or a scheduled route of the own vehicle M, traffic participant information on traffic participants (including pedestrians, bicycles, motorcycles, other vehicles, and the like) existing around the own vehicle M, and messages issued to the traffic participants.


The right panel 89a displays image information on a rear side and a lower side on the right side of the own vehicle M imaged by the camera 11 provided on the right side of the own vehicle M.


The left panel 89b displays image information on a rear side and a lower side on the left side of the own vehicle M imaged by the camera 11 provided on the left side of the own vehicle M.


The internal display device 61 is not particularly limited, and may be configured with, for example, a liquid crystal display (LCD), an organic electroluminescence (EL) display, or the like. The internal display device 61 may be configured with a head-up display (HUD) that projects a required image on the window glass 77.


The speaker 63 has a function of outputting a sound. An appropriate number of the speakers 63 are provided at appropriate positions such as the instrument panel 60, a door panel, and a rear parcel shelf (all of which are not shown) in the vehicle interior, for example.


When the internal display device 61 is of a touch panel type, the contact operation detection device 65 functions to detect a touch position on a display screen of the internal display device 61, and output information on the detected touch position to the vehicle control device 100. When the internal display device 61 is not of the touch panel type, the contact operation detection device 65 may not be provided.


The content reproduction device 67 includes, for example, a digital versatile disc (DVD) reproduction device, a compact disc (CD) reproduction device, a television receiver, and a device for generating various guide images. A part or all of the internal display device 61, the speaker 63, the contact operation detection device 65, and the content reproduction device 67 may be configured to be common to the navigation device 20.


The various operation switches 69 are provided at appropriate positions in the vehicle interior. The various operation switches 69 include an autonomous driving changeover switch 71 that instructs immediate start (or future start) and stop of autonomous driving. The autonomous driving changeover switch 71 may be a graphical user interface (GUI) switch or a mechanical switch. The various operation switches 69 may include switches configured to drive the seat driving device 75 and the window driving device 79.


The seat 73 is a seat where an occupant of the own vehicle M sits. The seat driving device 75 freely drives a reclining angle, a front-rear direction position, a yaw angle, and the like of the seat 73. The window glass 77 is provided, for example, in each door. The window driving device 79 drives the window glass 77 to open and close.


The vehicle interior camera 81 is a digital camera using a solid-state imaging element such as a CCD or a CMOS. The vehicle interior camera 81 is provided at a position that enables imaging of at least a head portion of a driver seated in the driver seat, such as a rearview mirror, a steering boss portion (both of which are not shown), and the instrument panel 60. For example, the vehicle interior camera 81 periodically and repeatedly images a state of the vehicle interior including the driver.


The external display device 83 has a function of displaying (informing) various types of information for traffic participants (including pedestrians, bicycles, motorcycles, other vehicles, and the like) existing around the own vehicle M. As shown in FIG. 5A, the external display device 83 provided in a front portion of the own vehicle M includes, in a front grille 90 of the own vehicle M, a right front lighting unit 91A and a left front lighting unit 91B that are provided apart from each other in the vehicle width direction, and a front display unit 93 provided between the left and right front lighting units 91A and 91B.


The external display device 83 provided in the front portion of the own vehicle M further includes a front indicator 92. When the own vehicle M is moved by autonomous travel control of the vehicle control device 100, that is, when the own vehicle M is moved by autonomous driving, the front indicator 92 is lighted toward the front side of the own vehicle M, and informs a traffic participant existing in front of the own vehicle M that the own vehicle M is moved by autonomous driving.


As shown in FIG. 5B, the external display device 83 provided in a rear portion of the own vehicle M includes, in a rear grille 94 of the own vehicle M, a right rear lighting unit 95A and a left rear lighting unit 95B that are provided apart from each other in the vehicle width direction, and a rear display unit 97 that is provided in the vehicle interior of the own vehicle M at a position visible from the outside through a central lower portion of a rear window 96. The rear display unit 97 is provided, for example, at an opening lower end portion (not shown) of the rear window 96.


The external display device 83 provided in the rear portion of the own vehicle M further includes a rear indicator 98. When the own vehicle M is moved by autonomous travel control of the vehicle control device 100, that is, when the own vehicle M is moved by autonomous driving, the rear indicator 98 is lighted toward the rear side of the own vehicle M, and informs a traffic participant existing behind the own vehicle M that the own vehicle M is moved by autonomous driving.


Note that a right indicator may be provided such that, when the own vehicle M is moved by autonomous driving, the right indicator is lighted toward a right side of the own vehicle M and informs a traffic participant existing on the right side of the own vehicle M that the own vehicle M is moved by autonomous driving. Detailed description and illustration thereof are omitted. Similarly, a left indicator may be provided such that, when the own vehicle M is moved by autonomous driving, the left indicator is lighted toward a left side of the own vehicle M and informs a traffic participant existing on the left side of the own vehicle M that the own vehicle M is moved by autonomous driving.


Here, a configuration of the left and right front lighting units 91A and 91B of the external display device 83 will be described with reference to FIG. 5C. FIG. 5C is a front view showing a schematic configuration of the left and right front lighting units 91A and 91B provided in the own vehicle M. Since the left and right front lighting units 91A and 91B have the same configuration, only one front lighting unit is shown in FIG. 5C. In the following description of FIG. 5C, reference signs without parentheses in FIG. 5C are referred to in description of the right front lighting unit 91A, and reference signs in parentheses in FIG. 5C are referred to in description of the left front lighting unit 91B.


The right front lighting unit 91A is formed in a circular shape as viewed from the front. The right front lighting unit 91A is configured such that a direction indicator 91Ab, a lighting display unit 91Ac, and a position lamp 91Ad, each of which is formed in an annular shape, are sequentially arranged concentrically outward in a radial direction around a headlamp 91Aa, which is formed in a circular shape as viewed from the front and has a smaller diameter dimension than an outer diameter dimension of the right front lighting unit 91A.


The headlamp 91Aa serves to assist a front field of view of the occupant by emitting light forward in the traveling direction of the own vehicle M while the own vehicle M travels in a dark place. When the own vehicle M turns right or left, the direction indicator 91Ab serves to notify traffic participants existing around the own vehicle M of the intention of turning right or left. For example, the lighting display unit 91Ac is provided for communication with the user (including an owner) of the own vehicle M in combination with display contents of the front display unit 93. The position lamp 91Ad serves to notify the traffic participants existing around the own vehicle M of a vehicle width of the own vehicle M while the own vehicle M travels in a dark place.


Similarly to the right front lighting unit 91A, the left front lighting unit 91B is also configured such that a direction indicator 91Bb, a lighting display unit 91Bc, and a position lamp 91Bd, each of which is formed in an annular shape, are sequentially arranged concentrically outward in the radial direction around a headlamp 91Ba formed in a circular shape as viewed from the front. The left and right front lighting units 91A and 91B (for example, the left and right lighting display units 91Ac and 91Bc) are used for information presentation by an information presentation unit 331 to be described later below.


[Configuration of Vehicle Control Device 100]


Next, referring back to FIG. 2, the configuration of the vehicle control device 100 will be described.


The vehicle control device 100 is implemented by, for example, one or more processors or hardware having equivalent functions. The vehicle control device 100 may be configured by combining an electronic control unit (ECU), a micro-processing unit (MPU), or the like in which a processor such as a central processing unit (CPU), a storage device, and a communication interface are connected by an internal bus.


The vehicle control device 100 includes the target lane determination unit 110, a driving support control unit 120, a travel control unit 160, an HMI control unit 170, and a storage unit 180.


Functions of the target lane determination unit 110 and the driving support control unit 120, and a part or all of functions of the travel control unit 160 are implemented by a processor executing a program (software). A part or all of such functions may be implemented by hardware such as large scale integration (LSI) or an application specific integrated circuit (ASIC), or may be implemented by a combination of software and hardware.


In the following description, when an “XX unit” is mainly described, it is assumed that the driving support control unit 120 reads each program from a ROM or electrically erasable programmable read-only memory (EEPROM) as necessary, then loads the program onto a RAM, and executes each function (which will be described later below). Each program may be stored in the storage unit 180 in advance, or may be loaded onto the vehicle control device 100 via another storage medium or communication medium as necessary.


[Target Lane Determination Unit 110]


The target lane determination unit 110 is implemented by, for example, a micro-processing unit (MPU). The target lane determination unit 110 divides a route provided from the navigation device 20 into a plurality of blocks (for example, divides the route every 100 m relative to a vehicle traveling direction), and determines a target lane for each block with reference to high-precision map information 181. For example, the target lane determination unit 110 determines which lane from the left the vehicle is to travel in. For example, in a case where a branching point, a merging point, or the like exists in the route, the target lane determination unit 110 determines a target lane such that the own vehicle M can travel along a reasonable travel route so as to travel to a branch destination. The target lane determined by the target lane determination unit 110 is stored in the storage unit 180 as target lane information 182.
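The block division and per-block lane selection described above can be sketched as follows. This is an illustrative sketch only; the function names, the branch-approach heuristic, and the lane-index convention are assumptions, not taken from the patent, and only the 100 m block length follows the example given.

```python
# Hypothetical sketch of target lane determination per route block.
BLOCK_LENGTH_M = 100  # example block length from the description above

def split_route_into_blocks(route_length_m, block_length_m=BLOCK_LENGTH_M):
    """Return (start, end) tuples covering the route in fixed-length blocks."""
    blocks = []
    start = 0
    while start < route_length_m:
        end = min(start + block_length_m, route_length_m)
        blocks.append((start, end))
        start = end
    return blocks

def choose_target_lane(block, branch_points, default_lane=0):
    """Pick a lane for one block; move toward the branch lane when a
    branching point lies in this block or in the next one (assumed rule)."""
    start, end = block
    for position, branch_lane in branch_points:
        if start <= position < end:
            return branch_lane
        if 0 <= position - end < BLOCK_LENGTH_M:
            return branch_lane  # approach the branch one block early
    return default_lane
```

A route of 250 m would thus be divided into blocks (0, 100), (100, 200), and (200, 250), with a lane assigned to each.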


[Driving Support Control Unit 120]


The driving support control unit 120 includes a driving support mode control unit 130, a recognition unit 140, and a switching control unit 150.


<Driving Support Mode Control Unit 130>


The driving support mode control unit 130 determines an autonomous driving mode (autonomous driving support state) to be executed by the driving support control unit 120, based on an operation of the driver on the HMI 35, an event determined by an action plan generation unit 144, a traveling mode determined by a trajectory generation unit 147, and the like. The autonomous driving mode is notified to the HMI control unit 170.


In any autonomous driving mode, it is possible to switch (override) to a lower-ranking autonomous driving mode by an operation on a component of the driving operation system in the HMI 35.


The override is started, for example, in a case where an operation on a component of the driving operation system of the HMI 35 performed by the driver of the own vehicle M continues for more than a predetermined time, in a case where a predetermined operation change amount (for example, an accelerator opening degree of the accelerator pedal 41, a brake depression amount of the brake pedal 47, or a steering angle of the steering wheel 55) is exceeded, or in a case where an operation on a component of the driving operation system is performed for more than a predetermined number of times.
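The three override triggers just listed (sustained operation time, exceeded operation change amount, repeated operations) can be sketched as a simple detector. This is a hedged sketch only; the class name, threshold values, and counting scheme are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch of the three override trigger conditions.
class OverrideDetector:
    def __init__(self, max_hold_s=2.0, amount_threshold=0.3, max_count=3):
        self.max_hold_s = max_hold_s              # continuous-operation time limit
        self.amount_threshold = amount_threshold  # e.g. accelerator opening degree
        self.max_count = max_count                # allowed number of operations
        self._count = 0

    def should_override(self, hold_time_s, operation_amount):
        """Return True when any of the three trigger conditions is met."""
        if hold_time_s > self.max_hold_s:
            return True  # operation continued for more than a predetermined time
        if operation_amount > self.amount_threshold:
            return True  # predetermined operation change amount exceeded
        self._count += 1
        return self._count > self.max_count  # operated more than a set number of times
```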


<Recognition Unit 140>


The recognition unit 140 includes an own vehicle position recognition unit 141, an external environment recognition unit 142, an area identification unit 143, the action plan generation unit 144, and the trajectory generation unit 147.


<Own Vehicle Position Recognition Unit 141>


The own vehicle position recognition unit 141 recognizes a traveling lane where the own vehicle M travels and a relative position of the own vehicle M relative to the traveling lane, based on the high-precision map information 181 stored in the storage unit 180 and information input from the camera 11, the radar 13, the LIDAR 15, the navigation device 20, or the vehicle sensor 30.


The own vehicle position recognition unit 141 recognizes the traveling lane by comparing a pattern (for example, arrangement of solid lines and broken lines) of road lane marking recognized from the high-precision map information 181 with a pattern of road lane marking around the own vehicle M recognized from an image imaged by the camera 11. During such recognition, a current position of the own vehicle M acquired from the navigation device 20 or a processing result of the INS may be taken into consideration.


<External Environment Recognition Unit 142>


As shown in FIG. 2, the external environment recognition unit 142 recognizes, for example, an external environment state including a position, a vehicle speed, and acceleration of a surrounding vehicle based on external environment information input from the external environment sensor 10 including the camera 11, the radar 13, and the LIDAR 15. The surrounding vehicle is, for example, a vehicle traveling around the own vehicle M, and is another vehicle traveling in the same direction as the own vehicle M (a preceding vehicle and a following vehicle to be described later below).


The position of the surrounding vehicle may be indicated by a representative point such as a center of gravity or a corner of the other vehicle, or may be indicated by a region represented by a contour of the other vehicle. A state of the surrounding vehicle may include a speed and acceleration of the surrounding vehicle and whether the surrounding vehicle is changing a lane (or whether the surrounding vehicle is attempting to change a lane), which are grasped based on information of the various devices described above. The external environment recognition unit 142 may be configured to recognize a position of a target including a guardrail, a utility pole, a parked vehicle, a pedestrian, and a traffic sign, in addition to surrounding vehicles including a preceding vehicle and a following vehicle.


In the present embodiment, among surrounding vehicles, a vehicle that travels in a traveling lane common to the own vehicle M immediately in front of the own vehicle M and is a follow-up target during follow-up travel control is referred to as a “preceding vehicle”. In addition, among the surrounding vehicles, a vehicle that travels in a traveling lane common to the own vehicle M and immediately behind the own vehicle M is referred to as a “following vehicle”.


<Area Identification Unit 143>


Based on map information, the area identification unit 143 acquires information on a specific area (interchange (IC)/junction (JCT)/lane increase and decrease point) existing around the own vehicle M. Accordingly, even in a case where a traveling direction image cannot be acquired via the external environment sensor 10 due to blockage of front vehicles including the preceding vehicle, the area identification unit 143 can acquire the information on the specific area that assists smooth traveling of the own vehicle M.


Instead of acquiring the information on the specific area based on the map information, the area identification unit 143 may acquire the information on the specific area by identifying a target by image processing based on the traveling direction image acquired via the external environment sensor 10 or by recognizing the target based on a contour of the traveling direction image by internal processing of the external environment recognition unit 142.


In addition, as will be described later below, a configuration in which accuracy of the information on the specific area acquired by the area identification unit 143 is increased by using VICS information acquired by the communication device 25 may be adopted.


<Action Plan Generation Unit 144>


The action plan generation unit 144 sets a start point of autonomous driving and/or a destination of autonomous driving. The start point of autonomous driving may be a current position of the own vehicle M or may be a point where an operation that instructs autonomous driving is performed. The action plan generation unit 144 generates an action plan for a section between the start point and the destination of autonomous driving. Note that the action plan generation unit 144 is not limited thereto, and may generate an action plan for any section.


The action plan includes, for example, a plurality of events to be sequentially executed. The plurality of events include, for example, a deceleration event of decelerating the own vehicle M, an acceleration event of accelerating the own vehicle M, a lane keep event of causing the own vehicle M to travel without deviating from a traveling lane, a lane change event of changing a traveling lane, an overtaking event of causing the own vehicle M to overtake a preceding vehicle, a branching event of causing the own vehicle M to change to a desired lane at a branching point or causing the own vehicle M to travel without deviating from a current traveling lane, a merging event of accelerating and decelerating the own vehicle M in a merging lane so as to merge with a main lane and changing the traveling lane, and a handover event of causing the own vehicle M to transition from a manual driving mode to an autonomous driving mode (autonomous driving support state) at a starting point of autonomous driving or causing the own vehicle M to transition from the autonomous driving mode to the manual driving mode at a scheduled end point of autonomous driving.
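An action plan as described above is an ordered sequence of events. The sketch below models it with a minimal enum; the enum member names follow the event types listed above, but the plan-building logic itself is an illustrative assumption.

```python
# Hypothetical sketch of an action plan as a sequential event list.
from enum import Enum, auto

class Event(Enum):
    DECELERATION = auto()
    ACCELERATION = auto()
    LANE_KEEP = auto()
    LANE_CHANGE = auto()
    OVERTAKING = auto()
    BRANCHING = auto()
    MERGING = auto()
    HANDOVER = auto()

def generate_action_plan(has_merge, has_branch):
    """Build an example event sequence for one route section."""
    plan = [Event.HANDOVER, Event.LANE_KEEP]   # transition into autonomous driving
    if has_merge:
        plan += [Event.MERGING, Event.LANE_KEEP]
    if has_branch:
        plan += [Event.BRANCHING, Event.LANE_KEEP]
    plan.append(Event.HANDOVER)                # hand back at the scheduled end point
    return plan
```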


The action plan generation unit 144 sets a lane change event, a branching event, or a merging event at a place where the target lane determined by the target lane determination unit 110 is switched. Information indicating the action plan generated by the action plan generation unit 144 is stored in the storage unit 180 as action plan information 183.


The action plan generation unit 144 includes a mode change unit 145 and a notification control unit 146.


<Mode Change Unit 145>


For example, based on a recognition result of a target existing in the traveling direction of the own vehicle M provided by the external environment recognition unit 142, the mode change unit 145 selects a driving mode corresponding to the recognition result from driving modes including a preset multi-stage autonomous driving mode and a manual driving mode, and uses the selected driving mode to perform a driving operation of the own vehicle M.


<Notification Control Unit 146>


When a driving mode of the own vehicle M is transitioned by the mode change unit 145, the notification control unit 146 notifies the fact that the driving mode of the own vehicle M is transitioned. The notification control unit 146 notifies the fact that the driving mode of the own vehicle M is transitioned, for example, by causing the speaker 63 to output sound information stored in advance in the storage unit 180.


As long as the driver can be notified of the transition of the driving mode of the own vehicle M, the notification is not limited to the notification by sound, and the notification may also be performed by display, light emission, vibration, or a combination thereof.


<Trajectory Generation Unit 147>


The trajectory generation unit 147 generates a trajectory along which the own vehicle M is to travel based on the action plan generated by the action plan generation unit 144.


<Switching Control Unit 150>


As shown in FIG. 2, the switching control unit 150 switches between the autonomous driving mode and the manual driving mode based on a signal input from the autonomous driving changeover switch 71 (see FIG. 3) and the like. In addition, based on an operation that instructs acceleration, deceleration, or steering relative to a component of the driving operation system in the HMI 35, the switching control unit 150 switches the autonomous driving mode at that time to a lower-ranking driving mode. For example, when a state where an operation amount indicated by a signal input from the component of the driving operation system in the HMI 35 exceeds a threshold continues for a reference time or more, the switching control unit 150 switches (overrides) the autonomous driving mode at that time to a lower-ranking driving mode.


In addition, the switching control unit 150 may perform switching control for returning to an original autonomous driving mode in a case where no operation is detected on any component of the driving operation system in the HMI 35 within a predetermined time after the switching to the lower-ranking driving mode by the override.


<Travel Control Unit 160>


The travel control unit 160 performs travel control of the own vehicle M by controlling the travel driving force output device 200, the steering device 210, and the brake device 220 in such a manner that the own vehicle M passes, at preset time points, along the trajectory generated by the trajectory generation unit 147 on which the own vehicle M is to travel.


<HMI Control Unit 170>


When setting information on the autonomous driving mode of the own vehicle M is notified by the driving support control unit 120, the HMI control unit 170 refers to mode-specific operability information 184 indicating, for each driving mode, a device permitted to be used (a part or all of the navigation device 20 and the HMI 35) and a device not permitted to be used, and controls the HMI 35 according to setting contents of the autonomous driving mode.


As shown in FIG. 2, the HMI control unit 170 determines the device permitted to be used (a part or all of the navigation device 20 and the HMI 35) and the device not permitted to be used, based on driving mode information of the own vehicle M acquired from the driving support control unit 120 and by referring to the mode-specific operability information 184. Based on a determination result thereof, the HMI control unit 170 controls whether to accept a driver operation related to the HMI 35 of the driving operation system or the navigation device 20.
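The mode-specific operability information 184 described above can be sketched as a lookup from driving mode to permitted devices. The table keys and device names below are illustrative assumptions; the patent does not specify a concrete data layout.

```python
# Hypothetical sketch of mode-specific operability information 184.
MODE_SPECIFIC_OPERABILITY = {
    "manual":     {"permitted": {"accelerator", "brake", "steering", "navigation"}},
    "autonomous": {"permitted": {"navigation"}},  # driving operations not accepted
}

def is_operation_accepted(driving_mode, device):
    """Decide whether a driver operation on a device is accepted in this mode."""
    entry = MODE_SPECIFIC_OPERABILITY.get(driving_mode)
    return entry is not None and device in entry["permitted"]
```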


For example, when a driving mode executed by the vehicle control device 100 is the manual driving mode, the HMI control unit 170 accepts a driver operation related to the HMI 35 of the driving operation system (for example, the accelerator pedal 41, the brake pedal 47, the shift lever 51, and the steering wheel 55 in FIG. 3).


The HMI control unit 170 includes a display control unit 171.


<Display Control Unit 171>


The display control unit 171 performs display control related to the internal display device 61 and the external display device 83. Specifically, for example, when the driving mode executed by the vehicle control device 100 is an autonomous driving mode with a high degree of automation, the display control unit 171 performs control such that the internal display device 61 and/or the external display device 83 display information such as attention calling, warning, and driving assistance for traffic participants existing around the own vehicle M. This will be described in detail later below.


<Storage Unit 180>


The storage unit 180 stores information such as the high-precision map information 181, the target lane information 182, the action plan information 183, and the mode-specific operability information 184. The storage unit 180 is implemented by a read only memory (ROM), a random access memory (RAM), a hard disk drive (HDD), a flash memory, or the like. A program to be executed by a processor may be stored in advance in the storage unit 180, or may be downloaded from an external device via an in-vehicle Internet device or the like. In addition, the program may be installed in the storage unit 180 when a portable storage medium storing the program is mounted on a drive device (not shown).


The high-precision map information 181 is map information with higher precision than map information normally provided in the navigation device 20. The high-precision map information 181 includes, for example, information on a center of a lane and information on a boundary of the lane. The boundary of the lane includes a lane mark type, a color, a length, a road width, a road shoulder width, a main line width, a lane width, a boundary position, a boundary type (guardrail, planting, curbstone), a zebra zone, and the like, and these boundaries are included in a high-precision map.


The high-precision map information 181 may include road information, traffic regulation information, address information (address and postal code), facility information, telephone number information, and the like. The road information includes information indicating a road type such as an expressway, a toll road, a national highway, and a prefectural road, and information such as the number of lanes of a road, a width of each lane, a gradient of a road, a position of a road (three-dimensional coordinates including longitude, latitude, and height), a curve curvature of a lane, positions of merging and branching points of lanes, and signs provided on a road. The traffic regulation information includes, for example, information indicating that a lane is blocked due to construction, a traffic accident, traffic congestion, or the like.


[Travel Driving Force Output Device 200, Steering Device 210, and Brake Device 220]


As shown in FIG. 2, the vehicle control device 100 controls driving of the travel driving force output device 200, the steering device 210, and the brake device 220 in accordance with a travel control command of the travel control unit 160.


<Travel Driving Force Output Device 200>


The travel driving force output device 200 outputs a driving force (torque) for the own vehicle M to travel to driving wheels. For example, when the own vehicle M is an automobile using an internal combustion engine as a power source, the travel driving force output device 200 includes an internal combustion engine, a transmission, and an engine electronic control unit (ECU) that controls the internal combustion engine (all of which are not shown).


When the own vehicle M is an electric automobile using an electric motor as a power source, the travel driving force output device 200 includes a travel motor and a motor ECU that controls the travel motor (both of which are not shown).


Further, when the own vehicle M is a hybrid automobile, the travel driving force output device 200 includes an internal combustion engine, a transmission, an engine ECU, a travel motor, and a motor ECU (all of which are not shown).


When the travel driving force output device 200 includes only the internal combustion engine, the engine ECU adjusts a throttle opening degree, a shift stage, and the like of the internal combustion engine in accordance with information input from the travel control unit 160 to be described later below.


When the travel driving force output device 200 includes only the travel motor, the motor ECU adjusts a duty ratio of a PWM signal provided to the travel motor in accordance with information input from the travel control unit 160.


When the travel driving force output device 200 includes the internal combustion engine and the travel motor, the engine ECU and the motor ECU control a travel driving force in cooperation with each other in accordance with information input from the travel control unit 160.


<Steering Device 210>


The steering device 210 includes, for example, a steering ECU and an electric motor (both of which are not shown). The electric motor, for example, changes a direction of a steered wheel by applying a force to a rack and pinion mechanism. The steering ECU drives the electric motor in accordance with information input from the vehicle control device 100 or input information on a steering angle or on steering torque to change the direction of the steered wheel.


<Brake Device 220>


The brake device 220 is, for example, an electric servo brake device including a brake caliper, a cylinder that transmits hydraulic pressure to the brake caliper, an electric motor that generates the hydraulic pressure in the cylinder, and a braking control unit (all of which are not shown). The braking control unit of the electric servo brake device controls the electric motor according to information input from the travel control unit 160 in such a manner that brake torque corresponding to a braking operation is output to each wheel. The electric servo brake device may include, as a backup, a mechanism that transmits hydraulic pressure generated by an operation of the brake pedal 47 to the cylinder via a master cylinder.


The brake device 220 is not limited to the electric servo brake device described above, and may also be an electronically controlled hydraulic brake device. The electronically controlled hydraulic brake device controls an actuator in accordance with information input from the travel control unit 160 to transmit hydraulic pressure of a master cylinder to the cylinder. The brake device 220 may include a regenerative brake using a travel motor that may be included in the travel driving force output device 200.


[Block Configuration of Autonomous Driving Vehicle Information Presentation Device 300]


Next, a block configuration of an autonomous driving vehicle information presentation device 300 according to the embodiment of the present invention included in the vehicle control device 100 described above will be described with reference to FIG. 6.



FIG. 6 is a block configuration diagram conceptually showing functions of the autonomous driving vehicle information presentation device 300 according to the embodiment of the present invention.


As shown in FIG. 6, the autonomous driving vehicle information presentation device 300 includes an external environment information acquisition unit 311, an identification unit 321, a storage unit 323, an extraction unit 325, and an information presentation unit 331.


<External Environment Information Acquisition Unit 311>


The external environment information acquisition unit 311 has a function of acquiring external environment information on a distribution condition of targets existing around the own vehicle M (in front of the own vehicle M in the traveling direction and behind the own vehicle M in the traveling direction) detected by the external environment sensor 10. An external environment information acquisition path of the external environment information acquisition unit 311 is not limited to the external environment sensor 10, and the navigation device 20 and the communication device 25 may also be adopted. For example, the external environment information acquisition unit 311 may acquire the above-described user identification information from the communication device 25 as one piece of the external environment information.


The external environment information acquisition unit 311 is a functional member corresponding to the recognition unit 140 of the vehicle control device 100 shown in FIG. 2.


<Identification Unit 321>


The identification unit 321 has a function of searching for a person existing around the own vehicle M based on the external environment information acquired by the external environment information acquisition unit 311 and identifying whether the person extracted by the search coincides with a user registered in the own vehicle M. Such identification may be implemented, for example, by performing face recognition processing of collating and recognizing face information of a person imaged by the camera 11 with face information of a user registered in a database (not shown).


Further, the identification unit 321 also has a function of determining whether the person extracted by the above search is an owner registered in the own vehicle M. In the own vehicle M, one of users of the own vehicle M is registered (set) in advance as the owner. The determination of whether the person is the owner may be implemented by, for example, using the user identification information acquired from the communication device 25 (that is, a terminal device) to perform user identification processing, or may be implemented by performing face recognition processing of collating and recognizing face information of a person imaged by the camera 11 with face information of an owner registered in a database (not shown).
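The two-step check performed by the identification unit 321 (is the detected person a registered user, and is that user the registered owner?) can be sketched as follows. The face-matching function is a placeholder standing in for real face recognition processing, and all names and data shapes are illustrative assumptions.

```python
# Hypothetical sketch of the identification unit 321's two-step check.
def match_face(detected_face, registered_face):
    # Placeholder: a real system would compare face feature vectors,
    # not raw values, as part of face recognition processing.
    return detected_face == registered_face

def identify_person(detected_face, registered_users, owner_id):
    """Return (user_id or None, is_owner) for a person extracted by the search."""
    for user_id, face in registered_users.items():
        if match_face(detected_face, face):
            return user_id, user_id == owner_id
    return None, False
```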


The identification unit 321 is a functional member corresponding to the recognition unit 140 of the vehicle control device 100 shown in FIG. 2.


<Storage Unit 323>


The storage unit 323 has a function of storing a presentation mode of information (for example, lighting modes of the left and right front lighting units 91A and 91B and the front and rear indicators 92 and 98, a display mode of the front display unit 93, and the like, and hereinafter is also referred to as an “information presentation mode”) of the information presentation unit 331 to be described later below. For example, the storage unit 323 stores an information presentation mode for presenting information unique to the owner (hereinafter, also referred to as an “owner presentation mode”) in association with a user registered as the owner of the own vehicle M, and stores a user presentation mode, which is an information presentation mode different from the owner presentation mode, in association with users other than the owner.


Here, an example of the information presentation mode stored in the storage unit 323 will be described with reference to FIG. 7. FIG. 7 shows the example of the information presentation mode stored by the storage unit 323 of the autonomous driving vehicle information presentation device 300. The storage unit 323 stores, for example, an information presentation mode table T1 shown in FIG. 7. The information presentation mode table T1 is configured by associating a plurality of information presentation modes with conditions under which the information presentation modes are extracted by the extraction unit 325 to be described later below (hereinafter, also referred to as “extraction conditions”).


The extraction conditions are set through using, for example, a user ID that is an identifier of the user. In the present embodiment, a user whose user ID is “U1” is registered as the owner of the own vehicle M. That is, in FIG. 7, an information presentation mode in which the user ID in the extraction condition is “U1”, such as information presentation modes P11 to P13, is the owner presentation mode. On the other hand, in FIG. 7, an information presentation mode in which the user ID in the extraction condition is not “U1” (for example, the user ID is “U2”), such as the information presentation mode P21, is the user presentation mode.


In FIG. 7, the number of times of activation of the own vehicle M (simply referred to as the “number of times of activation”) is also set as the extraction condition for the information presentation modes P11 to P13 and the like that are owner presentation modes. For example, the own vehicle M is activated when it is detected that the user (including the owner) of the own vehicle M approaches the own vehicle M (for example, a terminal device carried by the user approaches the own vehicle M) during parking. The number of times of activation of the own vehicle M is, for example, the number of times the own vehicle M is activated in this manner after the owner is registered in the own vehicle M. The number of times of activation of the own vehicle M is not limited thereto, and may be the number of times an ignition power source is turned on after the owner is registered in the own vehicle M, or the like.


In the present embodiment, the number of times of activation “first time” is set as the extraction condition for the information presentation mode P11 where a message “I look forward to working with you in the future” is displayed on the front display unit 93 among the owner presentation modes. The number of times of activation “second to ninth times (from a second time to a ninth time)” is set as the extraction condition for the information presentation mode P12 where a message “ready for departure” is displayed on the front display unit 93. The number of times of activation “tenth time to (tenth time and thereafter)” is set as the extraction condition for the information presentation mode P13 where a message “please drive” is displayed on the front display unit 93.


That is, in the present embodiment, as the number of times of activation that serves as the extraction condition becomes larger, the owner presentation mode becomes an owner presentation mode where communication with the owner is performed in a more familiar way (the own vehicle M behaves more actively). As a result, it is possible to cause the extraction unit 325, which will be described later below, to extract the owner presentation mode where the more familiar communication is performed as the number of times of activation of the own vehicle M increases, that is, as the companionship between the owner and the own vehicle M increases.


Therefore, the autonomous driving vehicle information presentation device 300 can perform the communication with the owner in a more natural (more realistic) way such that intimacy between the owner and the own vehicle M increases as the companionship between the owner and the own vehicle M increases. As a result, it is possible to cause the owner of the own vehicle M to develop a feeling of attachment to the own vehicle M.


As shown in FIG. 7, a situation where information is presented according to the information presentation mode may also be set as the extraction condition for each information presentation mode. For example, “when the user approaches the own vehicle” may be set as a situation in the extraction condition for an information presentation mode of information presentation that is desired to be performed when the user approaches the own vehicle M.


For example, “at the time of getting off the own vehicle M” may be set as a situation in the extraction condition for an information presentation mode of information presentation that is desired to be performed at the time of getting off the own vehicle M (for example, an information presentation mode where “bye-bye” is displayed on the front display unit 93). Further, for example, “future weather around the own vehicle M is sunny” may be set as a situation in the extraction condition for an information presentation mode of information presentation that is desired to be performed when future weather around the own vehicle M is sunny (for example, an information presentation mode where “today is sunny” is displayed on the front display unit 93). In addition, the situation in the extraction condition may be set through using time, a state of the own vehicle M (for example, a charging state of a battery), or the like.


As described above, by setting the situation where information presentation is performed in the information presentation mode as the extraction condition for each information presentation mode, it is possible to cause the extraction unit 325 described below to extract the information presentation mode according to the situation. Therefore, the autonomous driving vehicle information presentation device 300 can use an appropriate information presentation mode to present information in accordance with a situation, and can perform communication in a more natural (more realistic) way.


<Extraction Unit 325>


The extraction unit 325 has a function of extracting any information presentation mode from the information presentation modes stored in the storage unit 323. For example, the extraction unit 325 has a function of extracting the owner presentation mode from stored contents of the storage unit 323 in a case where the identification unit 321 identifies that the person coincides with the user of the own vehicle M and the user is determined to be the owner of the own vehicle M as a result of the identification of the identification unit 321. At this time, for example, the extraction unit 325 refers to the number of times of activation of the own vehicle M up to now, and extracts the owner presentation mode corresponding to the number of times of activation.


Specifically, for example, in a case where the identification indicating that the person coincides with the user of the own vehicle M is performed and it is determined that the user is the owner of the own vehicle M, the extraction unit 325 extracts the information presentation mode P11 from the storage unit 323 (the information presentation mode table T1) when the number of times of activation of the own vehicle M is “first time”. In the case where the identification indicating that the person coincides with the user of the own vehicle M is performed and it is determined that the user is the owner of the own vehicle M, the extraction unit 325 extracts the information presentation mode P12 from the storage unit 323 when the number of times of activation of the own vehicle M is “second to ninth time”. In the case where the identification indicating that the person coincides with the user of the own vehicle M is performed and it is determined that the user is the owner of the own vehicle M, the extraction unit 325 extracts the information presentation mode P13 from the storage unit 323 when the number of times of activation of the own vehicle M is “tenth time to”.


The extraction unit 325 is a functional member belonging to the recognition unit 140 in the vehicle control device 100 shown in FIG. 2.


<Information Presentation Unit 331>


The information presentation unit 331 has a function of presenting information by the information presentation mode extracted by the extraction unit 325.


The information presentation unit 331 is configured to include the right front lighting unit 91A (see FIGS. 5A and 5C) that is a right eye corresponding portion of the own vehicle M, the left front lighting unit 91B (see FIGS. 5A and 5C) that is a left eye corresponding portion of the own vehicle M, and the front display unit 93 (see FIG. 5A).


For example, the right front lighting unit 91A, the left front lighting unit 91B, and the front display unit 93 are each configured with an LED panel in which a plurality of light emitting diode (LED) lights are integrated. The information presentation unit 331 performs information presentation by driving such LED panels in accordance with the information presentation mode (for example, the owner presentation mode) extracted by the extraction unit 325.


Specifically, for example, when the information presentation mode extracted by the extraction unit 325 is the information presentation mode P11, the information presentation unit 331 causes the front display unit 93 to display the message “I look forward to working with you in the future”. When the information presentation mode extracted by the extraction unit 325 is the information presentation mode P12, the information presentation unit 331 causes the front display unit 93 to display the message “ready for departure”. When the information presentation mode extracted by the extraction unit 325 is the information presentation mode P13, the information presentation unit 331 causes the front display unit 93 to display the message “please drive”.


During the information presentation, the information presentation unit 331 may express a line of sight or the like of the own vehicle M through using the left and right front lighting units 91A and 91B that correspond to eyes when the own vehicle M is personified in a front view.


Specifically, as shown in FIG. 5A, the left and right front lighting units 91A and 91B having circular outer peripheral edges are provided at left and right end portions of the front grille 90 in the vehicle width direction with an interval provided therebetween. Therefore, the left and right front lighting units 91A and 91B look like a pair of eyes when the own vehicle M is personified in a front view.


For example, when only upper half portions of an annular-shaped right lighting display unit 91Ac and annular-shaped left lighting display unit 91Bc of the left and right front lighting units 91A and 91B are lighted while lower half portions thereof are extinguished, the information presentation unit 331 can express a smile of the own vehicle M as if the own vehicle M is smiling when the own vehicle M is personified in the front view.
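The lighting pattern described above can be sketched as an on/off mask over the annular LED ring. This is a hypothetical illustration only; the LED indexing convention (index 0 at the top of the ring, increasing clockwise) and the function name are assumptions, not part of the disclosure:

```python
def smile_mask(n_leds: int):
    """Return a per-LED on/off list that lights only the upper half of an
    annular ring of n_leds LEDs (assumed: index 0 at top, clockwise order).

    The upper half consists of the quarter of LEDs on each side of the top,
    so the lower half remains extinguished, producing the "smile" expression.
    """
    quarter = n_leds // 4
    return [i < quarter or i >= n_leds - quarter for i in range(n_leds)]
```

For an 8-LED ring this lights LEDs 0, 1, 6, and 7, leaving the lower arc dark.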


The information presentation unit 331 is a functional member corresponding to the HMI control unit 170 of the vehicle control device 100 shown in FIG. 2.


[Operation of Autonomous Driving Vehicle Information Presentation Device 300]


Next, an operation of the autonomous driving vehicle information presentation device 300 according to another embodiment of the present invention will be described with reference to FIG. 8.


For example, as described above, the autonomous driving vehicle information presentation device 300 performs an operation shown in FIG. 8 in a case where the user of the own vehicle M including the owner (for example, a terminal device such as a smart key carried by the user) approaches the own vehicle M during parking and the own vehicle M that has detected the approach is activated.


In step S11 shown in FIG. 8, the external environment information acquisition unit 311 acquires external environment information related to a distribution condition of targets existing around the own vehicle M, which is detected by the external environment sensor 10.


In step S12, the identification unit 321 searches for a person around the own vehicle M based on the external environment information acquired by the external environment information acquisition unit 311.


In step S13, the identification unit 321 identifies whether the person extracted by the search in step S12 coincides with the user registered in the own vehicle M.


In step S14, when it is identified, as a result of the identification in step S13, that the person extracted by the search in step S12 coincides with the user registered in the own vehicle M, the autonomous driving vehicle information presentation device 300 causes the flow of processing to proceed to the next step S15.


On the other hand, when it is identified, as the result of the identification in step S13, that the person extracted by the search in step S12 does not coincide with the user registered in the own vehicle M, the autonomous driving vehicle information presentation device 300 directly ends the operation shown in FIG. 8.


In step S15, the identification unit 321 determines whether the person extracted by the search in step S12 is the owner registered in the own vehicle M. When it is determined, as a result of the determination, that the person extracted by the search in step S12 is the owner registered in the own vehicle M, the autonomous driving vehicle information presentation device 300 causes the flow of processing to proceed to the next step S16.


On the other hand, when it is determined that the person extracted by the search in step S12 is not the owner registered in the own vehicle M, the autonomous driving vehicle information presentation device 300 causes the flow of processing to proceed to the next step S18.


In step S16, the extraction unit 325 refers to the number of times of activation of the own vehicle M.


In step S17, the extraction unit 325 extracts an owner presentation mode corresponding to the number of times of activation obtained in step S16 among the owner presentation modes from the stored contents of the storage unit 323.


In step S18, the extraction unit 325 extracts a user presentation mode different from the owner presentation mode from the stored contents of the storage unit 323. In a case where each user presentation mode is stored in the storage unit 323 in a state of being associated with a user who is a target of information presentation of the user presentation mode, in step S17, the extraction unit 325 may extract a user presentation mode corresponding to the user of the own vehicle M identified by the identification unit 321 among the user presentation modes from the storage unit 323.


In step S19, the information presentation unit 331 performs information presentation by the information presentation mode extracted in any one of steps S17 and S18 with the person extracted by the search in step S12 serving as a presentation target.
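The flow of steps S11 to S19 described above can be sketched as follows. This is an illustrative outline only; the interfaces of the units (method names and arguments) are assumptions for the sketch and are not part of the disclosure:

```python
def present_information(external_env_info, identification_unit,
                        extraction_unit, presentation_unit, activation_count):
    """Illustrative sketch of steps S11-S19 in FIG. 8 (names are assumptions)."""
    # S11-S12: acquire external environment information, search for a person
    person = identification_unit.search(external_env_info)
    # S13-S14: end immediately if no registered user is identified
    if person is None or not identification_unit.is_user(person):
        return
    if identification_unit.is_owner(person):
        # S15-S17: owner -> extract the owner presentation mode that
        # corresponds to the number of times of activation
        mode = extraction_unit.extract_owner_mode(activation_count)
    else:
        # S18: registered non-owner user -> extract a user presentation mode
        mode = extraction_unit.extract_user_mode(person)
    # S19: present information with the extracted person as the target
    presentation_unit.present(mode, target=person)
```

The early return mirrors the branch in which the device directly ends the operation of FIG. 8 when the person does not coincide with a registered user.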


As described above, according to the autonomous driving vehicle information presentation device 300, in a case where it is identified that the person extracted by the search coincides with the user of the own vehicle M and it is determined that the person is the owner of the own vehicle M as a result of the identification performed by the identification unit 321, the information presentation unit 331 presents information unique to the owner in the owner presentation mode with the owner serving as the presentation target. That is, the autonomous driving vehicle information presentation device 300 presents the information in the owner presentation mode under the condition that the presentation target is the owner. As a result, the autonomous driving vehicle information presentation device 300 can provide pleasure and a feeling of superiority for the owner, such as “I am specially treated by the autonomous driving vehicle M”, and can thus cause the owner to develop a feeling of attachment to the autonomous driving vehicle M. Therefore, it is possible to improve marketability of the autonomous driving vehicle M.


The present invention is not limited to the embodiment described above, and modifications, improvements, or the like can be made as appropriate.


For example, although an example in which the number of times of activation of the own vehicle M is used as the extraction condition of the owner presentation mode in order to communicate with the owner with increased intimacy as the companionship between the owner and the own vehicle M increases has been described in the embodiment described above, the present invention is not limited thereto. For example, a length of a travel distance of the own vehicle M driven by the owner or a length of an owning period when the owner owns the own vehicle M may be used as the extraction condition instead of the number of times of activation described above or in addition to the number of times of activation. In this case, the communication can still be performed such that the intimacy between the owner and the own vehicle M increases as the companionship between the owner and the own vehicle M increases. Further, an evaluation value of the intimacy between the owner and the own vehicle M may be calculated from the number of times of activation, the travel distance, the owning period described above, and the like, and the information presentation may be performed in an owner presentation mode according to the evaluation value.
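The evaluation value mentioned above can be sketched as a weighted combination of the three companionship indicators. The weights and scales below are purely illustrative assumptions; the disclosure does not specify how the evaluation value is calculated:

```python
def intimacy_score(activation_count: int, travel_distance_km: float,
                   owning_days: int, weights=(1.0, 0.01, 0.1)) -> float:
    """Hypothetical evaluation value of the intimacy between the owner and
    the own vehicle M, combining the number of times of activation, the
    travel distance, and the owning period (weighting is an assumption).
    """
    w_activation, w_distance, w_days = weights
    return (w_activation * activation_count
            + w_distance * travel_distance_km
            + w_days * owning_days)
```

An owner presentation mode could then be selected by comparing this score against thresholds, in the same way the activation-count ranges are used in FIG. 7.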


For example, the information presentation where the predetermined message is displayed on the front display unit 93 has been described in the above-described embodiment, but the present invention is not limited thereto. For example, as described above, the external display device 83 includes the front indicator 92 and the rear indicator 98 as lighting units that are lighted when the own vehicle M is moved by autonomous driving so as to inform a person around the own vehicle M that the own vehicle is moved by autonomous driving.


The autonomous driving vehicle information presentation device 300 may blink the front and rear indicators 92 and 98 according to the information presentation mode of the information presentation unit 331. For example, the autonomous driving vehicle information presentation device 300 may blink the front indicator 92 when the predetermined message is displayed on the front display unit 93. In this way, by blinking the front indicator 92 and the rear indicator 98 according to the information presentation mode of the information presentation unit 331, a presentation effect at the time of the information presentation performed by the information presentation unit 331 can be improved, and a person that is a presentation target of the information can be informed that the information presentation is performed.


The autonomous driving vehicle information presentation device 300 may blink the left and right front lighting units 91A and 91B and the left and right rear lighting units 95A and 95B in accordance with the information presentation mode of the information presentation unit 331, instead of the front and rear indicators 92 and 98 or in addition to the front and rear indicators 92 and 98.


For example, the owner may approach the own vehicle M from behind the own vehicle M during parking. Therefore, the information presentation unit 331 may switch the external display device 83 used in the case of performing presentation of the information unique to the owner according to a positional relationship between the own vehicle M and the owner. Specifically, when the owner is positioned in front of the own vehicle M, the information presentation unit 331 performs presentation of the information unique to the owner by the left and right front lighting units 91A and 91B, the front display unit 93, the front indicator 92, and the like. On the other hand, when the owner is positioned behind the own vehicle M, the information presentation unit 331 performs presentation of the information unique to the owner by the left and right rear lighting units 95A and 95B, the rear display unit 97, the rear indicator 98, and the like. In this way, the information unique to the owner can be presented through using the appropriate external display device 83 according to the positional relationship between the own vehicle M and the owner, and thus communication with the owner can be achieved.
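The switching described above can be sketched as a choice based on the owner's bearing relative to the vehicle. This is an illustrative sketch only; the angle convention (0 degrees meaning straight ahead of the vehicle) and the 90-degree split are assumptions, not specified by the disclosure:

```python
def choose_display_side(owner_bearing_deg: float) -> str:
    """Select the front-side or rear-side external display device 83 from the
    owner's bearing relative to the vehicle heading (0 deg = straight ahead;
    the convention and threshold are illustrative assumptions).
    """
    # Normalize the bearing to the range 0..180 degrees off the heading
    angle = abs(owner_bearing_deg) % 360
    if angle > 180:
        angle = 360 - angle
    # Owner within +/-90 deg of the heading -> front lighting units 91A/91B,
    # front display unit 93, front indicator 92; otherwise the rear devices
    return "front" if angle <= 90 else "rear"
```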


The autonomous driving vehicle information presentation device 300 may perform recommendation suitable for preference of the owner as the presentation of the information unique to the owner. For example, the autonomous driving vehicle information presentation device 300 may perform information presentation such that a message of “how about going flower viewing” is displayed in April for an owner who goes flower viewing in April every year.


For example, in a case where the own vehicle M is not locked, the autonomous driving vehicle information presentation device 300 may perform information presentation such that a message of “not locked” is displayed only for the owner from the viewpoint of ensuring theft prevention.


The present invention can also be implemented in a form in which a program for implementing one or more functions according to the above-described embodiment is supplied to a system or a device via a network or a storage medium, and one or more processors in a computer of the system or the device read and execute the program. The present invention may be implemented by a hardware circuit (for example, an ASIC) that implements one or more functions. Information including a program for implementing each function can be held in a recording device such as a memory or a hard disk, or a recording medium such as a memory card or an optical disk.


At least the following matters are described in the present specification. Components corresponding to those according to the embodiment described above are shown in parentheses. However, the present invention is not limited thereto.


(1) An autonomous driving vehicle information presentation device (autonomous driving vehicle information presentation device 300) used for an autonomous driving vehicle (autonomous driving vehicle M) that acquires external environment information including a target existing around an own vehicle, generates an action plan of the own vehicle based on the acquired external environment information, and automatically performs at least one of speed control and steering control of the own vehicle in accordance with the generated action plan, the autonomous driving vehicle information presentation device being configured to present information to a person existing around the own vehicle and including:


an identification unit (identification unit 321) configured to search for a person existing around the own vehicle based on the external environment information, identify whether the person extracted by the search coincides with a user of the own vehicle, and determine whether the person extracted by the search is an owner of the own vehicle; and


an information presentation unit (information presentation unit 331) configured to perform information presentation to the person through using an external display device (external display device 83) provided in at least one of a front portion and a rear portion of the own vehicle.


In a case where, as a result of the identification performed by the identification unit, it is identified that the person extracted by the search coincides with the user of the own vehicle, and, as a result of the determination performed by the identification unit, it is determined that the person extracted by the search is the owner of the own vehicle, the information presentation unit presents information unique to the owner in a preset presentation mode (information presentation modes P11 to P13) with the owner serving as a presentation target.


According to (1), in the case where it is identified that the person extracted by the search coincides with the user of the own vehicle and it is determined that the person extracted by the search is the owner of the own vehicle, the information unique to the owner is presented in the preset presentation mode with the owner serving as the presentation target. As a result, the information unique to the owner can be presented only when the owner of the own vehicle is a presentation target, so that pleasure and a feeling of superiority can be provided for the owner, such as “I am specially treated by the autonomous driving vehicle”, and the owner can thus develop a feeling of attachment to the autonomous driving vehicle.


(2) The autonomous driving vehicle information presentation device according to (1), in which


when presenting the information unique to the owner, the information presentation unit presents the information unique to the owner in a presentation mode corresponding to at least one of the number of times of activation of the own vehicle, a travel distance of the own vehicle, and an owning period of the own vehicle owned by the owner.


According to (2), it is possible to present information to the owner in such a manner that more familiar communication is performed as companionship between the owner and the own vehicle M increases. Therefore, the communication can be performed in a more natural (more realistic) way, and it is possible to cause the owner to develop a feeling of attachment to the autonomous driving vehicle.


(3) The autonomous driving vehicle information presentation device according to (1) or (2), in which


the external display device includes a lighting unit (front indicator 92 and rear indicator 98) configured to light up when the own vehicle is moved by autonomous driving and inform the person that the own vehicle is moved by autonomous driving.


According to (3), since the lighting unit that lights up when the own vehicle is moved by autonomous driving and informs the person around the own vehicle that the own vehicle is moved by autonomous driving is provided, the person can be easily informed that the own vehicle is moved by autonomous driving.


(4) The autonomous driving vehicle information presentation device according to (3), in which


the lighting unit blinks in accordance with an information presentation mode of the information presentation unit.


According to (4), since the lighting unit blinks in accordance with the information presentation mode of the information presentation unit, a presentation effect at the time of the information presentation performed by the information presentation unit can be improved, and the person that is the presentation target of the information can be informed that the information presentation is performed.


(5) The autonomous driving vehicle information presentation device according to any one of (1) to (4), in which


the external display device is provided in the front portion and the rear portion of the own vehicle, and


the information presentation unit switches, in accordance with a positional relationship between the own vehicle and the owner, the external display device to be used when presenting the information unique to the owner.


According to (5), since the external display device is provided at the front portion and the rear portion of the own vehicle, and the information presentation unit switches, in accordance with the positional relationship between the own vehicle and the owner, the external display device used when presenting the information unique to the owner, it is possible to perform the presentation of the information unique to the owner by using an appropriate external display device corresponding to the positional relationship between the own vehicle and the owner.

Claims
  • 1. An autonomous driving vehicle information presentation device used for an autonomous driving vehicle that acquires external environment information including a target existing around an own vehicle, generates an action plan of the own vehicle based on the acquired external environment information, and automatically performs at least one of speed control and steering control of the own vehicle in accordance with the generated action plan, the autonomous driving vehicle information presentation device being configured to present information to a person existing around the own vehicle, the autonomous driving vehicle information presentation device comprising: an identification unit configured to search for a person existing around the own vehicle based on the external environment information, identify whether the person extracted by the search coincides with a user of the own vehicle, and determine whether the person extracted by the search is an owner of the own vehicle; andan information presentation unit configured to perform information presentation to the person through using an external display device provided in at least one of a front portion and a rear portion of the own vehicle, whereinin a case where, as a result of the identification performed by the identification unit, it is identified that the person extracted by the search coincides with the user of the own vehicle, and, as a result of the determination performed by the identification unit, it is determined that the person extracted by the search is the owner of the own vehicle, the information presentation unit presents information unique to the owner in a preset presentation mode with the owner serving as a presentation target.
  • 2. The autonomous driving vehicle information presentation device according to claim 1, wherein when presenting the information unique to the owner, the information presentation unit presents the information unique to the owner in a presentation mode corresponding to at least one of the number of times of activation of the own vehicle, a travel distance of the own vehicle, and an owning period of the own vehicle owned by the owner.
  • 3. The autonomous driving vehicle information presentation device according to claim 1, wherein the external display device includes a lighting unit configured to light up when the own vehicle is moved by autonomous driving and inform the person that the own vehicle is moved by autonomous driving.
  • 4. The autonomous driving vehicle information presentation device according to claim 3, wherein the lighting unit blinks in accordance with an information presentation mode of the information presentation unit.
  • 5. The autonomous driving vehicle information presentation device according to claim 1, wherein the external display device is provided in the front portion and the rear portion of the own vehicle, andthe information presentation unit switches, in accordance with a positional relationship between the own vehicle and the owner, the external display device to be used when presenting the information unique to the owner.
Priority Claims (1)
Number Date Country Kind
2020-143968 Aug 2020 JP national