The present invention relates to a driving support device, an autonomous driving control device, a vehicle, a driving support method, and a program.
When an obstacle is present on a rear side of a vehicle and a lane change is attempted in the direction where the obstacle is present, a rear side obstacle warning system issues a notice that the obstacle is present on the rear side. In the rear side obstacle warning system, a display unit for indicating the presence of the obstacle is provided on a door mirror, while a failure notification unit is provided on an instrument panel. With this arrangement, it is difficult to reliably understand whether or not the rear side obstacle warning system is out of order. Therefore, the failure notification unit is instead provided on the door mirror (for example, refer to PTL 1).
PTL 1: Unexamined Japanese Patent Publication No. 2007-1436
The present invention provides a technique for collectively issuing information regarding a sensor mounted on a vehicle.
A driving support device according to an aspect of the present invention includes: a monitoring unit that monitors whether a sensor to be mounted on a vehicle is operating; and an output unit that outputs operation-state information indicating a result of the monitoring by the monitoring unit. The monitoring unit detects malfunction of the sensor based on detection accuracy of the sensor. The detection accuracy is received when the sensor operates. The output unit outputs malfunction information on the malfunction of the sensor together with the operation-state information when the monitoring unit detects the malfunction of the sensor.
Another aspect of the present invention provides an autonomous driving control device. The autonomous driving control device includes: a monitoring unit that monitors whether a sensor to be mounted on a vehicle is operating; an output unit that outputs operation-state information indicating a result of the monitoring by the monitoring unit; and an autonomous driving controller that controls autonomous driving of the vehicle based on a detection result of the sensor. The monitoring unit detects malfunction of the sensor based on detection accuracy of the sensor. The detection accuracy is received when the sensor operates. The output unit outputs malfunction information on the malfunction of the sensor together with the operation-state information when the monitoring unit detects the malfunction.
Still another aspect of the present invention provides a vehicle. The vehicle includes a driving support device. The driving support device includes: a monitoring unit that monitors whether a sensor to be mounted on a vehicle is operating; and an output unit that outputs operation-state information indicating a result of the monitoring by the monitoring unit. The monitoring unit detects malfunction of the sensor based on detection accuracy of the sensor. The detection accuracy is received when the sensor operates. The output unit outputs malfunction information on the malfunction of the sensor together with the operation-state information when the monitoring unit detects the malfunction.
Still another aspect of the present invention provides a driving support method. The driving support method includes: monitoring whether a sensor to be mounted on a vehicle is operating; outputting operation-state information indicating a result of the monitoring; detecting malfunction of the sensor based on detection accuracy of the sensor, the detection accuracy being received when the sensor operates; and outputting malfunction information on the malfunction of the sensor together with the operation-state information when the malfunction of the sensor is detected.
Note that arbitrary combinations of the above constituents and any conversions of expressions of the present invention made among devices, systems, methods, programs, recording media recording programs, vehicles equipped with the devices, and the like are also effective as aspects of the present invention.
According to the present invention, information regarding a sensor mounted on a vehicle can be issued collectively.
Prior to description of an exemplary embodiment of the present invention, problems found in a conventional technique will briefly be described herein. In general, a plurality of sensors are mounted on a vehicle capable of executing autonomous driving. Presence of an obstacle is detected based on detection results of the plurality of sensors. Moreover, a direction where the obstacle is present or the like is displayed on a display in order to notify a driver of the presence of the obstacle. However, there is a problem in that the driver is not notified of whether the sensors are operating or whether the detection accuracy of the sensors is low.
Prior to specific description of the exemplary embodiment of the present invention, an outline of the present invention will be described herein. The exemplary embodiment relates to notification of information about sensors to be used for autonomous driving of a vehicle. In particular, the present exemplary embodiment relates to a device (hereinafter also referred to as a “driving support device”) that controls a human machine interface (HMI) for exchanging information regarding a driving behavior of the vehicle with an occupant (for example, driver) of the vehicle. The “driving behavior” includes an operating state such as steering and braking during traveling and stopping of the vehicle, or control contents related to autonomous driving control. For example, the driving behavior is constant speed traveling, acceleration, deceleration, pause, stop, lane change, course change, right/left turn, parking, or the like. Moreover, the driving behavior may be cruising (running while keeping a lane and maintaining a vehicle speed), lane keeping, following a preceding vehicle, stop and go during following, lane change, passing, addressing a merging vehicle, crossover (interchange) including entry and exit to and from an expressway, merging, addressing a construction zone, addressing an emergency vehicle, addressing an interrupting vehicle, addressing lanes exclusive to right/left turns, interaction with a pedestrian/bicycle, avoidance of an obstacle other than a vehicle, addressing a sign, addressing restrictions of right/left turns and a U turn, addressing lane restriction, addressing one-way traffic, addressing a traffic sign, addressing an intersection/roundabout, or the like.
When the vehicle executes the autonomous driving, the presence of the obstacle is detected based on the detection results in the sensors, and the driving behavior is determined so that the obstacle is avoided. Moreover, the vehicle travels in accordance with the determined driving behavior. At this time, information regarding the detected obstacle or the like is displayed on the display, whereby the driver is notified of the presence of the obstacle. Meanwhile, when manual driving is executed in the vehicle, the presence of the obstacle is detected based on the detection results of the sensors, and the information regarding the detected obstacle or the like is displayed on the display, whereby the vehicle is driven so as to avoid the obstacle. Moreover, with regard to the sensors, it is preferable that the driver also be notified of information about operation/non-operation, information about malfunction, and information about a detection range corresponding to a travel state of the vehicle. It is preferable that these pieces of information be displayed on the display together with the information regarding the obstacle in order to alert the driver.
Hereinafter, the exemplary embodiment of the present invention will be described in detail with reference to the drawings. Note that each exemplary embodiment described below is only illustrative, and does not limit the present invention.
Notification device 2 notifies the driver of information regarding travel of vehicle 100. Notification device 2 is a display for displaying information, such as a light emitter, for example, a light emitting diode (LED), provided on a car navigation system, a head-up display, a center display, a steering wheel, a pillar, a dashboard, a vicinity of an instrument panel, or the like installed in the vehicle interior. Moreover, notification device 2 may be a speaker for notifying the driver of information converted into a sound, or may be a vibrator provided at a position (for example, a seat of the driver, the steering wheel, or the like) where the driver can sense vibrations. Furthermore, notification device 2 may be a combination of these elements. Input device 4 is a user interface device that receives an operation input performed by an occupant. For example, input device 4 receives information regarding autonomous driving of the subject vehicle, the information having been input by the driver. Input device 4 outputs the received information to driving support device 40 as an operation signal.
Wireless device 8 is adapted to a mobile phone communication system, wireless metropolitan area network (WMAN) or the like, and executes wireless communication. Driving operating unit 10 includes steering wheel 11, brake pedal 12, accelerator pedal 13, and indicator switch 14. Steering wheel 11, brake pedal 12, accelerator pedal 13, and indicator switch 14 can be electronically controlled by a steering electronic control unit (ECU), a brake ECU, at least one of an engine ECU and a motor ECU, and an indicator controller, respectively. In the autonomous driving mode, the steering ECU, the brake ECU, the engine ECU, and the motor ECU drive actuators according to control signals supplied from autonomous driving control device 30. In addition, the indicator controller turns on or off an indicator lamp according to a control signal supplied from autonomous driving control device 30.
Detector 20 detects a surrounding situation and travel state of vehicle 100. For example, detector 20 detects a speed of vehicle 100, a relative speed of a preceding vehicle with respect to vehicle 100, a distance between vehicle 100 and the preceding vehicle, a relative speed of a vehicle in an adjacent lane with respect to vehicle 100, a distance between vehicle 100 and the vehicle in the adjacent lane, and location information of vehicle 100. Detector 20 outputs the various pieces of detected information (hereinafter referred to as “detection information”) to autonomous driving control device 30 and driving support device 40. Detector 20 includes location information acquisition unit 21, sensor 22, speed information acquisition unit 23, and map information acquisition unit 24.
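Purely as an illustration of the kind of detection information detector 20 could output to autonomous driving control device 30 and driving support device 40, a minimal container might look as follows. The field names and units here are assumptions for the sketch, not taken from the embodiment:

```python
from dataclasses import dataclass

@dataclass
class DetectionInfo:
    """Illustrative container for the detection information output by
    detector 20 (field names and units are hypothetical)."""
    own_speed_kmh: float                  # speed of vehicle 100
    preceding_relative_speed_kmh: float   # relative speed of the preceding vehicle
    preceding_distance_m: float           # distance to the preceding vehicle
    adjacent_relative_speed_kmh: float    # relative speed of a vehicle in the adjacent lane
    adjacent_distance_m: float            # distance to the vehicle in the adjacent lane
    location: tuple                       # (latitude, longitude) from the GPS receiver
```

Such a structure would simply be filled by detector 20 and read by both downstream devices; the embodiment itself does not prescribe a format.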
Location information acquisition unit 21 acquires a current location of vehicle 100 from a global positioning system (GPS) receiver. Sensor 22 is a general term for various sensors for detecting a situation outside the vehicle and the state of vehicle 100. As the sensor for detecting the situation outside the vehicle, for example, a camera, a millimeter-wave radar, a light detection and ranging / laser imaging detection and ranging (LIDAR) device, a temperature sensor, an atmospheric pressure sensor, a humidity sensor, and an illuminance sensor are mounted. The situation outside the vehicle includes a situation of a road where the subject vehicle travels, which includes lane information, an environment including weather, a surrounding situation of the subject vehicle, and other vehicles (such as other vehicles traveling in the adjacent lane) present nearby. Note that any information may be included as long as the information is vehicle exterior information that can be detected by sensor 22. Moreover, as sensor 22 for detecting the state of vehicle 100, for example, an acceleration sensor, a gyroscope sensor, a geomagnetism sensor, and an inclination sensor are mounted.
Speed information acquisition unit 23 acquires the current speed of vehicle 100 from a speed sensor. Map information acquisition unit 24 acquires map information around the current location of vehicle 100 from a map database. The map database may be recorded in a recording medium in vehicle 100, or may be downloaded from a map server via a network when being used.
Autonomous driving control device 30 is an autonomous driving controller having an autonomous driving control function mounted thereto, and determines a behavior of vehicle 100 in autonomous driving. Autonomous driving control device 30 includes controller 31, storage unit 32, and input/output (I/O) unit 33. A configuration of controller 31 can be implemented by cooperation between hardware resources and software resources or by only hardware resources. Hardware resources which can be used include a processor, a read only memory (ROM), a random access memory (RAM), and other large scale integrations (LSIs). Software resources which can be used include programs such as an operating system, applications, and firmware. Storage unit 32 has a non-volatile recording medium such as a flash memory. I/O unit 33 executes communication control according to various communication formats. For example, I/O unit 33 outputs information regarding the autonomous driving to driving support device 40, and receives a control command from driving support device 40. I/O unit 33 receives the detection information from detector 20.
Controller 31 applies the control command input from driving support device 40 and the various pieces of information collected from detector 20 or the various ECUs to an autonomous driving algorithm, thereby calculating control values for controlling autonomous control targets such as a travel direction of vehicle 100. Controller 31 transmits the calculated control values to the ECUs or the controllers as the respective control targets. In the present exemplary embodiment, controller 31 transmits the calculated control values to the steering ECU, the brake ECU, the engine ECU, and the indicator controller. Note that, in a case of an electrically driven vehicle or a hybrid car, controller 31 transmits the control values to the motor ECU in place of or in addition to the engine ECU.
Driving support device 40 is an HMI controller executing an interface function between vehicle 100 and the driver, and includes controller 41, storage unit 42, and I/O unit 43. Controller 41 executes a variety of data processing such as HMI control. Controller 41 can be implemented by cooperation between hardware resources and software resources or by only hardware resources. Hardware resources which can be used include a processor, a ROM, a RAM, and other LSIs. Software resources which can be used include programs such as an operating system, applications, and firmware.
Storage unit 42 is a storage area for storing data which is referred to or updated by controller 41. For example, storage unit 42 is implemented by a non-volatile recording medium such as a flash memory. I/O unit 43 executes various types of communication controls corresponding to various types of communication formats. I/O unit 43 includes operation input unit 50, image/sound output unit 51, detection information input unit 52, command interface (IF) 53, and communication IF 56.
Operation input unit 50 receives, from input device 4, an operation signal input by an operation performed for input device 4 by the driver, the occupant, or a user outside of vehicle 100, and outputs this operation signal to controller 41. Image/sound output unit 51 outputs image data or a sound message, which is generated by controller 41, to notification device 2, and causes notification device 2 to display the image data or output the sound message. Detection information input unit 52 receives, from detector 20, information (hereinafter referred to as "detection information") which is a result of the detection process performed by detector 20 and indicates the current surrounding situation and travel state of vehicle 100, and outputs the received information to controller 41.
Command IF 53 executes an interface process with autonomous driving control device 30, and includes action information input unit 54 and command output unit 55. Action information input unit 54 receives information regarding the autonomous driving of vehicle 100, the information having been transmitted from autonomous driving control device 30. Then, action information input unit 54 outputs the received information to controller 41. Command output unit 55 receives, from controller 41, a control command which indicates a manner of the autonomous driving, and transmits this control command to autonomous driving control device 30.
Communication IF 56 executes an interface process with wireless device 8. Communication IF 56 transmits the data, which is output from controller 41, to wireless device 8, and transmits this data to an external device from wireless device 8. Moreover, communication IF 56 receives data transmitted from the external device, the data having been transferred by wireless device 8, and outputs this data to controller 41.
Note that, herein, autonomous driving control device 30 and driving support device 40 are configured as individual devices. As a modification, autonomous driving control device 30 and driving support device 40 may be integrated into one controller as indicated by a broken line in
Input unit 70 is connected to each of sensors 22 via I/O unit 43, and receives the detection result from each of sensors 22 when sensor 22 is operating. The detection result from sensor 22 indicates a direction and the like of the obstacle when the obstacle is detected. Now,
When input unit 70 receives the detection result from each of sensors 22, input unit 70 also receives detection accuracy for the detection result in sensor 22. That is, monitoring unit 72 receives the detection accuracy of sensor 22 when sensor 22 is operating. The detection accuracy is a value indicating a certainty of detected obstacle 220, and increases, for example, as the detection result becomes more accurate. Note that the detection accuracy differs depending on the type of sensor 22. Input unit 70 outputs the direction of obstacle 220 to image generator 74, and outputs the detection accuracy to monitoring unit 72.
Monitoring unit 72 receives the detection accuracy from input unit 70. Based on the detection accuracy, monitoring unit 72 detects malfunction of sensor 22 for the obstacle. For example, monitoring unit 72 stores a threshold value for each type of sensors 22, and selects a threshold value corresponding to sensor 22 that has derived the input detection accuracy. Moreover, when the detection accuracy is lower than the threshold value as a result of comparing the detection accuracy and the threshold value with each other, monitoring unit 72 detects the malfunction. When having detected the malfunction, monitoring unit 72 notifies image generator 74 that the malfunction is detected.
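The per-sensor-type threshold comparison described above can be sketched as follows. The sensor type names and threshold values here are hypothetical illustrations; the embodiment only states that a threshold is stored for each type of sensor 22 and that accuracy below the threshold indicates malfunction:

```python
# Hypothetical accuracy thresholds per sensor type (values are illustrative).
THRESHOLDS = {
    "camera": 0.70,
    "millimeter_wave_radar": 0.60,
    "lidar": 0.65,
}

def detect_malfunction(sensor_type: str, detection_accuracy: float) -> bool:
    """Select the threshold corresponding to the sensor type that derived the
    input detection accuracy, and report a malfunction when the accuracy is
    lower than that threshold."""
    threshold = THRESHOLDS[sensor_type]
    return detection_accuracy < threshold
```

For example, `detect_malfunction("camera", 0.5)` would signal a malfunction under these illustrative values, whereas a LIDAR accuracy of 0.9 would not.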
Moreover, monitoring unit 72 receives, as the travel state of vehicle 100, the current speed from speed information acquisition unit 23 via I/O unit 43. Monitoring unit 72 stores a threshold value for the current speed separately from the above-mentioned threshold value, and compares the threshold value and the current speed with each other. If the current speed is the threshold value or less, then monitoring unit 72 determines that a current state of vehicle 100 is a normal travel state. Meanwhile, when the current speed is larger than the threshold value, monitoring unit 72 determines that the current state is a high-speed travel state. Note that, based on the current location acquired in location information acquisition unit 21 and the map information acquired in map information acquisition unit 24, monitoring unit 72 specifies a type of a road on which vehicle 100 is traveling. If the road is an ordinary road, monitoring unit 72 may determine that the current state is the normal travel state. If the road is an expressway, monitoring unit 72 may determine that the current state is the high-speed travel state. Monitoring unit 72 outputs a determination result to image generator 74. Furthermore, monitoring unit 72 receives information as to whether vehicle 100 is under autonomous driving or manual driving from autonomous driving control device 30 via I/O unit 43, and also outputs the received information to image generator 74.
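The travel-state determination above might be sketched like this. The speed threshold value is a hypothetical placeholder (the embodiment does not fix one), and the road-type strings are likewise assumptions for the sketch:

```python
from typing import Optional

SPEED_THRESHOLD_KMH = 80.0  # illustrative value; the embodiment does not specify one

def classify_travel_state(current_speed_kmh: float,
                          road_type: Optional[str] = None) -> str:
    """Return "normal" or "high_speed". A speed at or below the threshold means
    the normal travel state; a higher speed means the high-speed travel state.
    When the road type is known from the current location and map information,
    it may be used instead: an ordinary road gives the normal travel state,
    an expressway gives the high-speed travel state."""
    if road_type == "ordinary":
        return "normal"
    if road_type == "expressway":
        return "high_speed"
    return "normal" if current_speed_kmh <= SPEED_THRESHOLD_KMH else "high_speed"
```

The road-type branch is placed first here as one possible design; the embodiment leaves open how the speed-based and map-based determinations are combined.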
Image generator 74 receives the direction of obstacle 220 from input unit 70, and receives, from monitoring unit 72, information on the detection of the operation/non-operation and malfunction of each of sensors 22, the normal travel state/high-speed travel state of vehicle 100, and the autonomous driving/manual driving of vehicle 100. Image generator 74 specifies an area that includes obstacle 220 based on the received direction of obstacle 220.
Moreover, when non-operating sensor 22 is present in the received operation/non-operation of each of sensors 22, image generator 74 specifies an area, which corresponds to the detection range of that sensor 22, as a "non-operation area". Note that information regarding the area corresponding to the detection range of sensor 22 is stored in image generator 74 in advance for each sensor 22. For example, when sensor 22 whose detection range is the rear of vehicle 100 is not operating, image generator 74 specifies fifth area 208 as the non-operation area. Moreover, when having received the detection of the malfunction, image generator 74 specifies an area, which corresponds to the detection of the malfunction, as a "malfunction area". The malfunction area overlaps the detection area; however, the malfunction area is given priority.
When having received the normal travel state, image generator 74 does not specify an area. However, when having received the high-speed travel state, image generator 74 specifies, as a "non-notification area", an area corresponding to a detection range of sensor 22 that is not used in the high-speed travel state. Here, third area 204 and seventh area 212, which are the right and left areas of vehicle 100, are specified as such non-notification areas. As described above, in response to the travel state of vehicle 100, image generator 74 changes the ranges where sensors 22 are detectable. Moreover, image generator 74 selects a first color when having received the autonomous driving, and selects a second color when having received the manual driving. Here, the first color and the second color just need to be different from each other, and may be set arbitrarily.
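The area classification performed by image generator 74 can be sketched as follows, assuming eight numbered areas around the vehicle (the text names the third, fifth, and seventh areas; a total of eight is an assumption of this sketch). The priority ordering, with malfunction areas taking precedence as stated above, is made explicit:

```python
ALL_AREAS = list(range(1, 9))  # assumed: eight areas surrounding the vehicle
REAR_AREA = 5                  # fifth area: detection range to the rear
SIDE_AREAS = {3, 7}            # third and seventh areas: right and left of the vehicle

def classify_areas(non_operating_areas, malfunction_areas, travel_state):
    """Assign a display category to each area. A malfunction area is given
    priority over a non-operation area; in the high-speed travel state, the
    right and left areas become non-notification areas."""
    labels = {}
    for area in ALL_AREAS:
        if area in malfunction_areas:
            labels[area] = "malfunction"
        elif area in non_operating_areas:
            labels[area] = "non_operation"
        elif travel_state == "high_speed" and area in SIDE_AREAS:
            labels[area] = "non_notification"
        else:
            labels[area] = "normal"
    return labels
```

The resulting labels would then drive the image data that image generator 74 produces, with the background color chosen separately according to autonomous or manual driving.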
Image generator 74 generates image data corresponding to these processes.
Also when the malfunction is detected, a display similar to that in the case where non-operating sensor 22 is present is made. For example, in
Output unit 76 receives the image data from image generator 74, and outputs the image to center display 2b in
An operation of driving support device 40 having the above configuration will be described.
According to the present exemplary embodiment, the information on the malfunction of the sensor is output together with the information on the operation/non-operation of the sensors. Accordingly, a notice on the information regarding the sensors mounted on the vehicle can be issued collectively. Moreover, the information on the detection/non-detection of the obstacle is also output together with the information on the operation/non-operation of the sensors, so that this information, too, can be issued collectively. Moreover, the detectable ranges are changed and output in response to the travel state of the vehicle. Accordingly, the travel state of the vehicle and the detection ranges of the sensors can be recognized in association with each other. Furthermore, the information regarding the sensors is displayed collectively on one screen. Accordingly, the driver can easily grasp the situation. Moreover, the background color is changed in response to whether the vehicle is under autonomous driving or manual driving. Accordingly, the driver can be alerted in a manner corresponding to whether the vehicle is under autonomous driving or manual driving.
While the exemplary embodiment according to the present invention has been described above with reference to the drawings, the functions of the above-mentioned devices and processing units can be implemented by a computer program. A computer that achieves the above-mentioned functions through execution of a program is provided with an input device such as a keyboard, a mouse and a touch pad, an output device such as a display and a speaker, a central processing unit (CPU), a storage device such as a read only memory (ROM), a random access memory (RAM), a hard disk device and a solid state drive (SSD), a reading device for reading information from a recording medium such as a digital versatile disk read only memory (DVD-ROM) and a universal serial bus (USB) memory, and a network card that performs communication through a network. These units of the computer are interconnected with a bus.
The reading device reads the program from the recording medium recording the program therein, and the storage device stores the program. Alternatively, the network card performs communication with a server device connected to the network, and a program for implementing the respective functions of the above-described devices, the program having been downloaded from the server device, is stored in the storage device. Moreover, onto the RAM, the CPU copies the program stored in the storage device, and from the RAM, sequentially fetches instructions included in the program, and executes each of the instructions. In this way, the respective functions of the above-described devices are implemented.
An outline of an aspect of the present invention is as follows. A driving support device according to an aspect of the present invention includes: a monitoring unit that monitors whether a sensor to be mounted on a vehicle is operating; and an output unit that outputs operation-state information indicating a result of the monitoring by the monitoring unit. The monitoring unit detects malfunction of the sensor based on detection accuracy of the sensor. The detection accuracy is received when the sensor operates. The output unit outputs malfunction information on the malfunction of the sensor together with the operation-state information when the monitoring unit detects the malfunction of the sensor.
According to this aspect, the information on the malfunction of the sensor is output together with the information on the operation/non-operation of the sensor. Accordingly, a notice on the information regarding the sensors mounted on the vehicle can be issued collectively.
The driving support device may further include an input unit that receives a detection result indicating a result of detection by the sensor. The output unit may output detection information together with the operation-state information. The detection information indicates a result of the detection received by the input unit. In this case, the information on the detection/non-detection of the obstacle is also output together with the information on the operation/non-operation of the sensor. Accordingly, a notice on the information regarding the sensors mounted on the vehicle can be issued collectively.
The output unit may output the information in association with a range detectable by the sensor, the monitoring unit may also receive a travel state of the vehicle, and the output unit may change the detectable range of the information to be output in response to the travel state of the vehicle. In this case, the detectable range is changed and output in response to the travel state of the vehicle. Accordingly, the travel state of the vehicle and the detection range of the sensor can be recognized in association with each other.
The output unit may change an output mode in response to whether the vehicle is under autonomous driving or manual driving. In this case, the driver can be alerted in a manner corresponding to whether the vehicle is under autonomous driving or manual driving.
Another aspect of the present invention provides an autonomous driving control device. This device includes: a monitoring unit that monitors whether a sensor to be mounted on a vehicle is operating; an output unit that outputs operation-state information indicating a result of the monitoring by the monitoring unit; and an autonomous driving controller that controls autonomous driving of the vehicle based on a detection result of the sensor. The monitoring unit detects malfunction of the sensor based on detection accuracy of the sensor. The detection accuracy is received when the sensor operates. The output unit outputs malfunction information on the malfunction of the sensor together with the operation-state information when the monitoring unit detects the malfunction.
Still another aspect of the present invention provides a vehicle. The vehicle includes a driving support device. The driving support device includes: a monitoring unit that monitors whether a sensor to be mounted on a vehicle is operating; and an output unit that outputs operation-state information indicating a result of the monitoring by the monitoring unit. The monitoring unit detects malfunction of the sensor based on detection accuracy of the sensor. The detection accuracy is received when the sensor operates. The output unit outputs malfunction information on the malfunction of the sensor together with the operation-state information when the monitoring unit detects the malfunction.
Yet another aspect of the present invention provides a driving support method. This method includes: monitoring whether a sensor to be mounted on a vehicle is operating; outputting operation-state information indicating a result of the monitoring; detecting malfunction of the sensor based on detection accuracy of the sensor, the detection accuracy being received when the sensor operates; and outputting malfunction information on the malfunction of the sensor together with the operation-state information when the malfunction of the sensor is detected.
The present invention has been described above based on the exemplary embodiment. It will be understood by those skilled in the art that the exemplary embodiment is merely an example, other exemplary modifications in which components and/or processes of the exemplary embodiment are variously combined are possible, and the other exemplary modifications still fall within the scope of the present invention.
The present invention is applicable to a vehicle, a driving support method provided in the vehicle, a driving support device using the driving support method, an autonomous driving control device, a program, and the like.
Number | Date | Country | Kind |
---|---|---|---
2016-072731 | Mar 2016 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---
PCT/JP2017/002439 | 1/25/2017 | WO | 00 |