The disclosure relates to a driver assistance apparatus to be mounted on a vehicle, a vehicle, and a driver assistance method.
In recent years, as to vehicles such as automobiles, development of automated driving control techniques has been in progress. The automated driving control techniques include allowing vehicles to travel automatically without drivers' driving operations. Moreover, various proposals have been made for driver assistance apparatuses that perform various kinds of control to assist a driver in making driving operations by using such automated driving control techniques, and such driver assistance apparatuses have been widely put into practical use. Techniques related to such driver assistance apparatuses are disclosed in, for example, Japanese Patent Nos. 7171808 and 2969174, Japanese Unexamined Patent Application Publication (JP-A) No. 2020-101986, and Japanese Patent No. 5776838.
An aspect of the disclosure provides a driver assistance apparatus including a processor. The processor is configured to predict behavior of a prediction target vehicle when a non-priority road is present ahead of a first vehicle and the prediction target vehicle is stopped at a waiting point on the non-priority road or is traveling toward the waiting point. The non-priority road merges with or intersects a priority road including one or more lanes on each side. The processor is configured to perform the following (A1), (A2), and (A3).
An aspect of the disclosure provides a vehicle including a processor. The processor is configured to predict behavior of a prediction target vehicle when a non-priority road is present ahead of a first vehicle and the prediction target vehicle is stopped at a waiting point on the non-priority road or is traveling toward the waiting point. The non-priority road merges with or intersects a priority road including one or more lanes on each side. The processor is configured to perform the following (B1), (B2), and (B3).
An aspect of the disclosure provides a driver assistance method including predicting behavior of a prediction target vehicle when a non-priority road is present ahead of a first vehicle and the prediction target vehicle is stopped at a waiting point on the non-priority road or is traveling toward the waiting point. The non-priority road merges with or intersects a priority road including one or more lanes on each side. The driver assistance method includes the following (C1), (C2), and (C3).
An aspect of the disclosure provides a driver assistance apparatus including a processor. The processor is configured to predict behavior of a prediction target vehicle when a non-priority road is present ahead of a first vehicle and the prediction target vehicle is stopped at a waiting point on the non-priority road or is traveling toward the waiting point. The non-priority road merges with or intersects a priority road including one or more lanes on each side. The processor is configured to perform the following (D1), (D2), and (D3).
An aspect of the disclosure provides a driver assistance apparatus including a processor. The processor is configured to predict behavior of a prediction target vehicle when a non-priority road is present ahead of a first vehicle and the prediction target vehicle is stopped at a waiting point on the non-priority road or is traveling toward the waiting point. The non-priority road merges with or intersects a priority road including one or more lanes on each side. The processor is configured to perform the following (E1), (E2), and (E3).
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments and, together with the specification, serve to explain the principles of the disclosure.
In the following, some example embodiments of the disclosure are described in detail with reference to the accompanying drawings. Note that the following description is directed to illustrative examples of the disclosure and not to be construed as limiting to the disclosure. Factors including, without limitation, numerical values, shapes, materials, components, positions of the components, and how the components are coupled to each other are illustrative only and not to be construed as limiting to the disclosure. Further, elements in the following example embodiments which are not recited in a most-generic independent claim of the disclosure are optional and may be provided on an as-needed basis. The drawings are schematic and are not intended to be drawn to scale. Throughout the present specification and the drawings, elements having substantially the same function and configuration are denoted with the same reference numerals to avoid any redundant description. In addition, elements that are not directly related to any embodiment of the disclosure are unillustrated in the drawings.
In recent years, as to vehicles such as automobiles, development of automated driving control techniques has been in progress. The automated driving control techniques include allowing vehicles to travel automatically without drivers' driving operations. Moreover, various proposals have been made for driver assistance apparatuses that perform various kinds of control to assist a driver in making driving operations by using such automated driving control techniques, and such driver assistance apparatuses have been widely put into practical use. Techniques related to such driver assistance apparatuses are disclosed in, for example, Japanese Patent Nos. 7171808 and 2969174, JP-A No. 2020-101986, and Japanese Patent No. 5776838.
The invention described in Japanese Patent No. 7171808 discloses a technique including: predicting driving intentions of drivers of surrounding vehicles based on positions of the surrounding vehicles traveling around a subject vehicle and travel parameters such as speed; and estimating whether there is a possibility that any of the surrounding vehicles will merge into traffic on a lane on which the subject vehicle is traveling. The invention described in Japanese Patent No. 2969174 discloses a technique including: identifying a merged vehicle on a priority road; and, when an inter-vehicle distance to the merged vehicle is equal to or less than a merging safety inter-vehicle distance, determining a traffic situation ahead of and behind the merged vehicle to determine whether the subject vehicle is to merge. The merged vehicle is a vehicle that is going to become a following vehicle after merging.
The invention described in JP-A No. 2020-101986 discloses a technique including: predicting behavior of moving bodies based on dynamic information regarding the moving bodies generated based on sensor data collected from multiple sensors; and determining a combination of behavior having possibility of collision between the moving bodies based on the predicted behavior. The invention described in Japanese Patent No. 5776838 discloses a technique including: predicting a moving body that may possibly rush out of a blind spot; and calculating a speed range in which a subject vehicle may possibly come into contact with the moving body, based on a predicted assumed speed of the moving body.
However, the inventions described in Japanese Patent Nos. 7171808 and 2969174, JP-A No. 2020-101986, and Japanese Patent No. 5776838 merely estimate a motion of the vehicle of the other party based on the presence or absence of the possibility of collision by using parameters such as the vehicle speed of the vehicle of the other party and the inter-vehicle distance to the vehicle of the other party, without consideration of parameters highly correlated with a mental state of a driver of the vehicle of the other party. Thus, in these inventions, it is difficult to predict the possibility of merging, a lane change, etc. of the vehicle of the other party made under an influence of the mental state of the driver of the vehicle of the other party, even when such merging, a lane change, etc. is theoretically very unlikely. As a result, these inventions end up taking measures only after the vehicle of the other party starts to merge, make a lane change, etc. This results in high possibility of an incident of collision between the subject vehicle and the vehicle of the other party.
As described, the existing inventions have a concern that it is difficult to predict the possibility of merging, a lane change, etc. of the vehicle of the other party made under the influence of the mental state of the driver of the vehicle of the other party. As a result of intensive studies, the inventors of the application have created a technology that makes it possible to predict such a possibility. In the following, the background of the newly created technology is described by presenting hypothetical traffic situations.
A driver of the vehicle 100a is aware that the vehicle 100a is traveling on the priority road Lm. Accordingly, the vehicle 100a is going to enter the intersection CL without decelerating. At this occasion, on the non-priority road Ls, the vehicle 100b is stopped in front of the stop line SL (waiting point). A driver of the vehicle 100b intends to pass through (cross or travel across) the intersection CL or turn left at the intersection CL (merge into traffic on the opposite lane L2). The driver of the vehicle 100b, on the non-priority road Ls, is looking out for timing of passing through the intersection CL or timing of turning left at the intersection CL while stopped in front of the stop line SL (waiting point). At this occasion, the driver of the vehicle 100b spots a large space SP between a vehicle 100c and a vehicle 100d on the lane (opposite lane L2) for leftward travel on the priority road Lm. The driver of the vehicle 100b decides to pass through the intersection CL or turn left at the intersection CL by using this space SP and starts to allow the vehicle 100b to enter the intersection CL (time tb in
As the stop time (waiting time) in front of the stop line SL becomes longer, the driver of the vehicle 100b becomes more irritated because they are unable to start the vehicle. Thus, normally, the driver of the vehicle 100b can recognize the presence of the vehicle 100a, but the irritated driver of the vehicle 100b inadvertently overlooks the presence of the vehicle 100a because of the irritation. As a result, the driver of the vehicle 100b starts the vehicle 100b without recognizing the presence of the vehicle 100a. In such a traffic situation, there is high possibility of an incident of collision between the vehicle 100a and the vehicle 100b as they meet at the intersection CL (time tc in
The driver of the vehicle 100a is aware that the vehicle 100a is traveling on the priority road Lm. Accordingly, the vehicle 100a is going to enter the intersection CL without decelerating. At this occasion, on the non-priority road Ls, the vehicle 100b is traveling far short of the stop line SL (waiting point). The driver of the vehicle 100b intends to pass through the intersection CL just ahead or turn left at the intersection CL. The driver of the vehicle 100b, on the non-priority road Ls, is looking out for the timing of passing through the intersection CL or the timing of turning left at the intersection CL while allowing the vehicle 100b to travel. At this occasion, the driver of the vehicle 100b spots the space SP on the lane (opposite lane L2) for the leftward travel on the priority road Lm. The driver of the vehicle 100b decides to pass through the intersection CL or turn left at the intersection CL by using the space SP, and starts to allow the vehicle 100b to enter the intersection CL without stopping in front of the stop line SL (time tb in
As the time (time to spare) from the time when the driver of the vehicle 100b spots the space SP to the time when the driver of the vehicle 100b allows the vehicle 100b to enter the intersection CL becomes shorter, the driver of the vehicle 100b becomes more impatient, feeling that they have to enter the intersection CL immediately. In particular, when the vehicle 100b can enter the space SP without decelerating, or when the vehicle 100b can enter the space SP by entering the intersection CL with a small amount of deceleration, the driver of the vehicle 100b tends to make a hasty determination. Thus, normally, the driver of the vehicle 100b can recognize the presence of the vehicle 100a, but the impatient driver of the vehicle 100b inadvertently overlooks the presence of the vehicle 100a because of the impatience. As a result, the driver of the vehicle 100b allows the vehicle 100b to enter the intersection CL without recognizing the presence of the vehicle 100a. In such a traffic situation, there is high possibility of an incident of collision between the vehicle 100a and the vehicle 100b as they meet at the intersection CL (time tc in
Thus, the inventors of the application have thought of predicting behavior of the vehicle 100b by using a parameter highly correlated with the mental state of the driver of the vehicle 100b, such as the waiting time for the vehicle 100b or the time to spare for the vehicle 100b, as a measure to reduce a risk of collision between the vehicle 100a and the vehicle 100b in a specific traffic situation in which the vehicle 100a and the vehicle 100b are going to enter the intersection CL where the priority road Lm and the non-priority road Ls intersect. In the following, detailed description is given of a travel control system to realize this.
The controller 200 may sequentially integrate and update road map information transmitted from the travel controller 10 of each vehicle, and transmit the updated road map information to each vehicle. The controller 200 may include, for example, a road map information integrated ECU 201 and a transceiver 202.
The road map information integrated ECU 201 may integrate the road map information collected from the multiple vehicles through the transceiver 202, and sequentially update the road map information surrounding the vehicles on the road. The road map information may include, for example, a dynamic map. The road map information may include static information and quasi-static information that mainly constitute road information, and quasi-dynamic information and dynamic information that mainly constitute traffic information.
The static information constituting the road information may include, for example, information to be updated within one month, e.g., roads and structures on the roads, structures around the roads, lane information, road surface information, and permanent regulatory information. The “roads” may include, for example, positions and shapes of the roads, intersections, and attributes of the roads (e.g., national roads, prefectural roads, municipal roads, private roads, priority roads, non-priority roads, general roads, and expressways), etc. The “structures on the roads” may include, for example, traffic signs, traffic lights, convex traffic mirrors at road curves, footbridges, and the like. The “structures around the roads” may include, for example, various buildings, parks, and the like.
The quasi-static information constituting the road information may include, for example, information to be updated within one hour, e.g., traffic regulatory information by road construction, events, and the like, wide-area weather information, and traffic congestion prediction.
The quasi-dynamic information constituting the traffic information may include, for example, information to be updated within one minute, e.g., an actual congestion state and a travel regulation at the time of observation, temporary states of obstacles to travel such as a falling object or an obstacle, an actual incident state, and narrow-area weather information.
The dynamic information constituting the traffic information may include information to be updated in units of one second, e.g., information transmitted and exchanged between mobile bodies, information regarding current display of the traffic lights, information regarding pedestrians and bicycles at an intersection, and vehicle information regarding vehicles traveling on the roads. Such road map information may be maintained and updated on cycles until the next piece of information is received from each vehicle, and the updated road map information may be transmitted as appropriate to each vehicle through the transceiver 202.
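The four-tier update cadence described above (static within one month, quasi-static within one hour, quasi-dynamic within one minute, dynamic in units of one second) can be sketched, for illustration only, as a small staleness check. All identifiers below are hypothetical; the numeric budgets simply restate the update intervals named in the text:

```python
from enum import Enum

class MapLayer(Enum):
    STATIC = "static"                # updated within one month
    QUASI_STATIC = "quasi_static"    # updated within one hour
    QUASI_DYNAMIC = "quasi_dynamic"  # updated within one minute
    DYNAMIC = "dynamic"              # updated in units of one second

# Update budgets in seconds, derived from the tiers described above.
UPDATE_BUDGET_S = {
    MapLayer.STATIC: 30 * 24 * 3600,
    MapLayer.QUASI_STATIC: 3600,
    MapLayer.QUASI_DYNAMIC: 60,
    MapLayer.DYNAMIC: 1,
}

def is_stale(layer: MapLayer, age_s: float) -> bool:
    """A layer entry is stale once its age exceeds the layer's update budget."""
    return age_s > UPDATE_BUDGET_S[layer]
```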
The travel controller 10 may include a travel environment recognizer 11 and a locator unit 12 as units that recognize a travel environment around the vehicle. Moreover, the travel controller 10 may include a travel control unit (hereinafter, referred to as a “travel_ECU”) 21, an engine control unit (hereinafter, referred to as an “E/G_ECU”) 22, a power steering control unit (hereinafter, referred to as a “PS_ECU”) 23, and a brake control unit (hereinafter, referred to as a “BK_ECU”) 24. These control units 21 to 24 may be coupled together through an in-vehicle communication line such as a CAN (Controller Area Network), together with the travel environment recognizer 11 and the locator unit 12.
The travel_ECU 21 may control the vehicle in accordance with, for example, a driving mode. Non-limiting examples of the driving mode may include a manual driving mode and a travel control mode. The manual driving mode is a driving mode that involves steering by a driver. The manual driving mode is a driving mode that includes allowing the subject vehicle to travel in accordance with driving operations by the driver, e.g., a steering operation, an accelerator operation, and a brake operation. The travel control mode is a driving mode that includes assisting the driver in making the driving operations to enhance safety of pedestrians, vehicles, or the like around the vehicle (subject vehicle). In the travel control mode, for example, when the vehicle (subject vehicle) approaches an intersection, the travel_ECU 21 may predict behavior of a traveling vehicle or a stopped vehicle on a road intersecting at the intersection (hereinafter, referred to as a “target vehicle”). As a result of the prediction, when there is high possibility of entry of the target vehicle into the intersection, the travel_ECU 21 is configured to, for example, call for the driver's attention, alert the driver, and furthermore, make a risk avoidance control such as braking. Contents of processing in the travel control mode are described later in detail.
To the output side of the E/G_ECU 22, a throttle actuator 25 may be coupled. The throttle actuator 25 may open and close a throttle valve of an electronically controlled throttle provided in a throttle body of an engine. The E/G_ECU 22 may control operation of the throttle actuator 25 by outputting a drive signal to the throttle actuator 25. The throttle actuator 25 may generate a desired engine output by opening and closing the throttle valve based on the drive signal from the E/G_ECU 22 to adjust the intake air flow rate.
To the output side of the PS_ECU 23, an electric power steering motor 26 may be coupled. The electric power steering motor 26 may apply steering torque to a steering mechanism by a rotational force of the motor. The PS_ECU 23 may control operation of the electric power steering motor 26 by outputting a drive signal to the electric power steering motor 26. In automated driving, the electric power steering motor 26 may perform a lane keeping travel control and a lane change control (e.g., a lane change control for an overtaking control) based on the drive signal from the PS_ECU 23. The lane keeping travel control includes maintaining travel on the current travel lane. The lane change control includes moving the subject vehicle to an adjacent lane.
To the output side of the BK_ECU 24, a brake actuator 27 may be coupled. The brake actuator 27 may adjust brake hydraulic pressure to be supplied to brake wheel cylinders provided on the respective wheels. The BK_ECU 24 may control operation of the brake actuator 27 by outputting a drive signal to the brake actuator 27. The brake actuator 27 may generate braking forces for the respective wheels by the brake wheel cylinders based on the drive signal from the BK_ECU 24, to decelerate the vehicle.
The travel environment recognizer 11 may be fixed to, for example, the upper center of an inner front portion of the vehicle. The travel environment recognizer 11 may include an in-vehicle camera (stereo camera), an image processing unit (IPU) 11c, and a travel environment detector 11d. The in-vehicle camera may include a main camera 11a and a sub-camera 11b.
The main camera 11a and the sub-camera 11b are autonomous sensors that sense the real space around the vehicle. The main camera 11a and the sub-camera 11b may be disposed at, for example, horizontally symmetrical positions with respect to the widthwise center of the vehicle. The main camera 11a and the sub-camera 11b are configured to capture vehicle-frontward stereo images from different viewpoints.
The IPU 11c is configured to generate a distance image. The distance image may be obtained based on an amount of displacement between positions of a corresponding target in a pair of the vehicle-frontward stereo images obtained by imaging by the main camera 11a and the sub-camera 11b.
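The conversion from the inter-image displacement (disparity) to a distance image can be illustrated with the standard pinhole stereo relation Z = f·B/d. This is a generic sketch, not the actual IPU 11c implementation; the focal length and baseline values below are invented for illustration:

```python
import numpy as np

# Hypothetical camera parameters: focal length in pixels, stereo baseline in meters.
FOCAL_PX = 1400.0
BASELINE_M = 0.35

def disparity_to_distance(disparity_px: np.ndarray) -> np.ndarray:
    """Convert a disparity map (pixels) to a distance image (meters).

    Uses the pinhole stereo relation Z = f * B / d; zero or negative
    disparities (no stereo match found) are mapped to infinity.
    """
    disparity = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        distance = np.where(disparity > 0.0,
                            FOCAL_PX * BASELINE_M / disparity,
                            np.inf)
    return distance
```

Larger displacement between the two viewpoints thus corresponds to a nearer target, which is what allows the travel environment detector 11d to attach a distance to each detected object.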
For example, the travel environment detector 11d is configured to obtain lane lines that define a road around the vehicle, based on the distance image received from the IPU 11c. For example, the travel environment detector 11d is further configured to obtain a road curvature [1/m] of the lane lines that define the right and left edges of a travel road (travel lane) on which the vehicle travels, and a width (lane width) between the right and left lane lines. For example, the travel environment detector 11d is further configured to perform predetermined pattern-matching with respect to the distance image, to detect lanes and three-dimensional objects such as the structures present around the vehicle.
Here, in the detection of three-dimensional objects in the travel environment detector 11d, for example, detection of the kind of a three-dimensional object, a distance to the three-dimensional object, a speed of the three-dimensional object, a relative speed of the three-dimensional object to the vehicle (subject vehicle), and the like may be performed. Non-limiting examples of the three-dimensional objects to be detected may include traffic lights, intersections, road signs, stop lines, other vehicles, pedestrians, and various buildings. For example, the travel environment detector 11d is configured to output information regarding the detected three-dimensional objects to the travel_ECU 21.
The locator unit 12 may estimate a position of the vehicle (vehicle position) on a road map, and may include a locator calculator 13. The locator calculator 13 may estimate the vehicle position. To the input side of the locator calculator 13, sensors involved in estimating the position of the vehicle (vehicle position) may be coupled. Non-limiting examples of such sensors may include an acceleration rate sensor 14, a vehicle speed sensor 15, a gyro sensor 16, and a GNSS receiver 17. The acceleration rate sensor 14 is configured to detect a longitudinal acceleration rate of the vehicle. The vehicle speed sensor 15 is configured to detect a speed of the vehicle. The gyro sensor 16 is configured to detect an angular speed or an angular acceleration rate of the vehicle. The GNSS receiver 17 is configured to receive positioning signals transmitted from positioning satellites. Moreover, to the locator calculator 13, a transceiver 18 may be coupled. The transceiver 18 may transmit and receive information to and from the controller 200, and transmit and receive information to and from other vehicles.
Moreover, to the locator calculator 13, a high-precision road map database 19 may be coupled. The high-precision road map database 19 may include a large-capacity storage medium such as an HDD and hold high-precision road map information (dynamic map). As with the road map information included in the road map information integrated ECU 201, the high-precision road map information may include, for example, static information and quasi-static information that mainly constitute road information, and quasi-dynamic information and dynamic information that mainly constitute traffic information.
The locator calculator 13 may include, for example, a map information acquirer 13a, a vehicle position estimator 13b, and a travel environment recognizer 13c.
The vehicle position estimator 13b is configured to acquire positional coordinates of the vehicle (subject vehicle) based on the positioning signals received by the GNSS receiver 17. Moreover, the vehicle position estimator 13b is configured to estimate the vehicle position on the road map by map-matching the acquired positional coordinates on the road map information. The map information acquirer 13a is configured to acquire map information regarding a predetermined range including the vehicle (subject vehicle) from the map information held in the high-precision road map database 19, based on the positional coordinates of the vehicle (subject vehicle) acquired by the vehicle position estimator 13b.
In an environment in which a decrease in sensitivity of the GNSS receiver 17 inhibits the GNSS receiver 17 from receiving valid positioning signals from the positioning satellites, e.g., on travel through a tunnel, the vehicle position estimator 13b is configured to estimate the vehicle position on the road map by switching to autonomous navigation. The autonomous navigation includes estimating the vehicle position based on the vehicle speed detected by the vehicle speed sensor 15, the angular speed detected by the gyro sensor 16, and the longitudinal acceleration rate detected by the acceleration rate sensor 14.
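The autonomous-navigation update described above can be sketched as a single dead-reckoning step that integrates the wheel speed, the gyro's yaw rate, and the longitudinal acceleration rate. This is a minimal Euler integration under assumed planar motion, not the actual estimator of the vehicle position estimator 13b:

```python
import math

def dead_reckon_step(x, y, heading_rad, speed_mps, yaw_rate_rps, accel_mps2, dt_s):
    """One dead-reckoning step: advance the planar pose from the
    vehicle-speed, gyro, and longitudinal-acceleration measurements.

    The mean speed over the step approximates the effect of the
    longitudinal acceleration; the heading is advanced by the yaw rate.
    """
    avg_speed = speed_mps + 0.5 * accel_mps2 * dt_s  # mean speed over the step
    new_x = x + avg_speed * math.cos(heading_rad) * dt_s
    new_y = y + avg_speed * math.sin(heading_rad) * dt_s
    new_heading = heading_rad + yaw_rate_rps * dt_s
    return new_x, new_y, new_heading
```

Repeating this step at the sensor sampling rate keeps a usable position estimate through, for example, a tunnel until valid positioning signals are received again.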
When estimating the position of the vehicle (vehicle position) on the road map based on, for example, the positioning signals received by the GNSS receiver 17 or the information detected by the gyro sensor 16 as described above, the vehicle position estimator 13b is configured to determine, for example, the kind of the travel road on which the vehicle (subject vehicle) is traveling, based on the estimated vehicle position on the road map.
The travel environment recognizer 13c is configured to update the road map information held in the high-precision road map database 19 to the latest state by using the road map information acquired by external communication (road-to-vehicle communication and vehicle-to-vehicle communication) through the transceiver 18. This information update may be made with respect to not only the static information but also the quasi-static information, the quasi-dynamic information, and the dynamic information. Thus, the road map information includes road information and traffic information acquired by the communication with outside the vehicle, and information regarding moving bodies such as vehicles traveling on the road is updated in substantially real time.
The travel environment recognizer 13c is configured to verify the road map information based on travel environment information recognized by the travel environment recognizer 11, and update the road map information held in the high-precision road map database 19 to the latest state. This information update may be made with respect to not only the static information but also the quasi-static information, the quasi-dynamic information, and the dynamic information. Thus, information regarding moving bodies such as vehicles traveling on the road recognized by the travel environment recognizer 11 is updated in real time.
The road map information thus updated may be transmitted to the controller 200, surrounding vehicles around the vehicle (subject vehicle), and the like by the road-to-vehicle communication and the vehicle-to-vehicle communication through the transceiver 18. Furthermore, the travel environment recognizer 13c is configured to output, to the travel_ECU 21, the map information regarding the predetermined range including the vehicle position estimated by the vehicle position estimator 13b, within the updated road map information, together with the vehicle position (vehicle position information).
Next, the travel_ECU 21 is described in detail.
In
Meanwhile, the road intersecting the priority road Lm at the intersection CL is a non-priority road Ls in relation to the priority road Lm. On the non-priority road Ls, the vehicle (target vehicle) 100b is stopped in front of the stop line SL (waiting point) or traveling toward the intersection CL. In an embodiment of the disclosure, the vehicle 100b may serve as a “prediction target vehicle”. No traffic lights are installed at the intersection CL.
The driver of the vehicle 100a is aware that the vehicle 100a is traveling on the priority road Lm. Accordingly, the vehicle 100a is going to enter the intersection CL without decelerating. At this occasion, on the non-priority road Ls, the vehicle 100b is stopped in front of the stop line SL (waiting point) or traveling toward the intersection CL. The driver of the vehicle 100b intends to pass through the intersection CL or turn left at the intersection CL. The driver of the vehicle 100b, on the non-priority road Ls, is looking out for the timing of passing through the intersection CL or the timing of turning left at the intersection CL while the vehicle 100b is stopped in front of the stop line SL (waiting point) or traveling toward the intersection CL. At this occasion, the driver of the vehicle 100b spots the large space SP between the vehicle 100c and the vehicle 100d on the lane (opposite lane L2) for the leftward travel on the priority road Lm. The driver of the vehicle 100b decides to pass through the intersection CL or turn left at the intersection CL by using this space SP, and allows the vehicle 100b to enter the intersection CL. However, the driver of the vehicle 100b is preoccupied with passing through the intersection CL or turning left at the intersection CL by using the space SP they have spotted, and inadvertently overlooks the presence of the vehicle 100a.
Here, let us assume that the vehicle 100b is stopped in front of the stop line SL. At this occasion, as the stop time (waiting time) in front of the stop line SL becomes longer, the driver of the vehicle 100b becomes more irritated because they are unable to start the vehicle. Thus, normally, the driver of the vehicle 100b can recognize the presence of the vehicle 100a, but the irritated driver of the vehicle 100b inadvertently overlooks the presence of the vehicle 100a because of the irritation. As a result, the driver of the vehicle 100b starts the vehicle 100b without recognizing the presence of the vehicle 100a. In such a traffic situation, there is high possibility of an incident of collision between the vehicle 100a and the vehicle 100b as they meet at the intersection CL.
Moreover, let us assume that the vehicle 100b is traveling short of the stop line SL. At this occasion, as the time (time to spare) from the time when the driver of the vehicle 100b spots the space SP to the time when the driver of the vehicle 100b allows the vehicle 100b to enter the intersection CL becomes shorter, the driver of the vehicle 100b becomes more impatient, feeling that they have to enter the intersection CL immediately. In particular, when the vehicle 100b can enter the space SP without decelerating, or when the vehicle 100b can enter the space SP by entering the intersection CL with a small amount of deceleration, the driver of the vehicle 100b tends to make a hasty determination. Thus, normally, the driver of the vehicle 100b can recognize the presence of the vehicle 100a, but the impatient driver of the vehicle 100b inadvertently overlooks the presence of the vehicle 100a because of the impatience and the hasty determination. As a result, the driver of the vehicle 100b allows the vehicle 100b to enter the intersection CL without recognizing the presence of the vehicle 100a. In such a traffic situation, there is high possibility of an incident of collision between the vehicle 100a and the vehicle 100b as they meet at the intersection CL.
Thus, the travel_ECU 21 is configured to take such circumstances into account in its calculations. In one example, the travel_ECU 21 is configured to determine whether the specific traffic situation applies in which the vehicle 100a and the vehicle 100b are going to enter the intersection CL where the priority road Lm and the non-priority road Ls intersect. Moreover, after determining that the specific traffic situation applies, the travel_ECU 21 is configured to predict the possibility of the entry of the vehicle 100b into the space SP (passing probability P), based on the presence of the space SP (entry space) available for the entry of the vehicle 100b and a result of calculation of the waiting time Tw or the time to spare Ts of the vehicle 100b.
The space SP refers to a space formed by any two adjacent vehicles on a common lane (e.g., the opposite lane L2). The “space SP available for the entry of the vehicle 100b (entry space)” refers to a space having a width or a length theoretically large enough for the vehicle 100b to enter, when the vehicle 100b is stopped in front of the stop line SL or traveling toward the intersection CL. The “entry space” needs to be present at least within a range recognizable by the driver of the vehicle 100b. Accordingly, the “entry space” needs to be present, for example, within a radius of about 50 m centered on the vehicle 100b.
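As an illustration only, the entry-space test described above can be sketched as follows. The 50 m recognizable radius comes from the description; the minimum gap length is an assumed placeholder, and all names here are hypothetical rather than part of the disclosure.

```python
from dataclasses import dataclass

RECOGNIZABLE_RADIUS_M = 50.0  # range recognizable by the driver of vehicle 100b (from the text)
MIN_GAP_LENGTH_M = 7.0        # assumed: roughly one vehicle length plus a maneuvering margin

@dataclass
class Gap:
    length_m: float        # longitudinal length of the space SP
    dist_to_100b_m: float  # distance from vehicle 100b to the space SP

def is_entry_space(gap: Gap) -> bool:
    """A space SP counts as an 'entry space' when it is theoretically large
    enough for vehicle 100b to enter and lies within the driver's view."""
    return (gap.length_m >= MIN_GAP_LENGTH_M
            and gap.dist_to_100b_m <= RECOGNIZABLE_RADIUS_M)
```

For example, a 9 m gap 30 m away would qualify, while the same gap 80 m away would fall outside the recognizable range.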
For example, the travel_ECU 21 is configured to determine presence or absence of the space SP satisfying the following passing conditions (1) and (2), among one or more spaces SP each formed by any two adjacent vehicles on the opposite lane L2 of the priority road Lm. When the passing conditions (1) and (2) are represented by mathematical expressions, the mathematical expressions are as given in the following paragraph. When a space SP that satisfies the mathematical expressions of the passing conditions is present, the travel_ECU 21 is configured to recognize that space SP as the space SP (entry space) available for the entry of the vehicle 100b.
It is to be noted that non-limiting examples of the traffic situation of “absence of the entry space” are as follows.
The travel_ECU 21 is configured to estimate whether the traffic situation ahead of the vehicle 100a is such a traffic situation, based on, for example, the data obtained from the sensors of the vehicle 100a (for example, the travel environment recognizer 11), data obtained from a road surface sensor by the road-to-vehicle communication by the transceiver 18, or data obtained from random vehicles by the vehicle-to-vehicle communication by the transceiver 18. When these pieces of data include, for example, data indicating that multiple vehicles are traveling seamlessly on the opposite lane L2, the travel_ECU 21 is configured to determine that the traffic situation ahead of the vehicle 100a is the traffic situation as described above.
The waiting time Tw indicates time during which the vehicle 100b is stopped in front of the stop line SL. The time refers to time (predicted time) predicted to be spent by the vehicle 100b stopped in front of the stop line SL until the vehicle 100b starts at the stop line SL, or actual measured time having predetermined correlation with the predicted time.
Start timing of the predicted time and the actual measured time may include, for example, various kinds of timing as described below. The start timing of the predicted time and the actual measured time may be, for example, timing at which the vehicle 100b stops in front of the stop line SL, or timing at which measurement of the predicted time and the actual measured time is started when the vehicle 100b is stopped in front of the stop line SL. The start timing of the predicted time and the actual measured time may be, for example, timing at which the “entry space” is detected when the vehicle 100b is stopped in front of the stop line SL. The start timing of the predicted time and the actual measured time may be, for example, timing at which the vehicle 100b is detected to stop in front of the stop line SL, or timing at which the vehicle 100b stopped in front of the stop line SL is detected.
End timing of the actual measured time may be, for example, timing at which calculation of the waiting time Tw is started in the travel_ECU 21 (timing at which step S110 described later is started). The timing at which the travel_ECU 21 starts the calculation of the waiting time Tw is timing earlier, by a predetermined time, than the timing at which the vehicle 100b actually starts at the stop line SL. The end timing of the actual measured time is not limited to the timing at which step S110 described later is started.
The travel_ECU 21 is configured to calculate the waiting time Tw (predicted time or actual measured time) based on, for example, the data obtained from the sensors of the vehicle 100a (for example, the travel environment recognizer 11), the data obtained from the road surface sensor by the road-to-vehicle communication by the transceiver 18, or the data obtained from random vehicles by the vehicle-to-vehicle communication by the transceiver 18.
The time to spare Ts refers to time to spare until the entry of the vehicle 100b into the “entry space”. The time to spare Ts is, for example, a difference between the time at which the vehicle 100b is predicted to reach the “entry space” and the current time. The time to spare Ts may be, for example, time having predetermined correlation with the difference between the time at which the vehicle 100b is predicted to reach the “entry space” and the current time. The time to spare Ts may be, for example, time having predetermined correlation with a difference between the time at which the vehicle 100b is predicted to reach the stop line SL and the current time.
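The definition above can be illustrated with a minimal sketch. The constant-speed arrival prediction is an assumption introduced here for illustration; the text only defines Ts as the difference between the predicted arrival time at the entry space and the current time.

```python
def time_to_spare(dist_to_entry_space_m: float, speed_mps: float) -> float:
    """Time to spare Ts for vehicle 100b traveling toward the entry space.

    Under a constant-speed assumption, the predicted arrival time is
    (now + distance/speed), so Ts = (predicted arrival time) - now reduces
    to distance/speed. Deceleration profiles are ignored in this sketch.
    """
    if speed_mps <= 0.0:
        raise ValueError("vehicle 100b is not approaching the entry space")
    return dist_to_entry_space_m / speed_mps
```

For example, a vehicle 40 m from the entry space at 10 m/s would have a time to spare of 4 s.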
The travel_ECU 21 is configured to calculate the time to spare Ts based on, for example, the data obtained from the sensors of the vehicle 100a (for example, the travel environment recognizer 11), the data obtained from the road surface sensor by the road-to-vehicle communication by the transceiver 18, or the data obtained from random vehicles by the vehicle-to-vehicle communication by the transceiver 18.
The passing probability P refers to the possibility of the entry of the vehicle 100b into the space SP. It is possible to derive the passing probability P by, for example, the following Expression (1) or Expression (2). Expression (1) is an expression to derive the passing probability P when the vehicle 100b is stopped in front of the stop line SL. Expression (2) is an expression to derive the passing probability P when the vehicle 100b is traveling on the non-priority road Ls.
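Expressions (1) and (2) themselves are not reproduced in this excerpt. Purely as a hedged reconstruction, one form that is numerically consistent with the weights (α=1, β=γ=0.4) and the sample values given later for step S112 is P = α·(N1/N2) − β·(1/Tw) for a stopped vehicle and P = α·(N1/N2) − γ·Ts for a traveling vehicle. This is an assumption for illustration, not the disclosed expression.

```python
# Assumed weights; the sample values alpha=1 and beta=gamma=0.4 appear in the text.
ALPHA, BETA, GAMMA = 1.0, 0.4, 0.4

def passing_probability_stopped(n1: int, n2: int, tw_s: float) -> float:
    """Sketch of Expression (1): vehicle 100b stopped in front of the stop line SL.
    n1 and n2 are the vehicle counts used in the prediction; tw_s is waiting time Tw."""
    return ALPHA * (n1 / n2) - BETA * (1.0 / tw_s)

def passing_probability_traveling(n1: int, n2: int, ts_s: float) -> float:
    """Sketch of Expression (2): vehicle 100b traveling on the non-priority road Ls;
    ts_s is the time to spare Ts."""
    return ALPHA * (n1 / n2) - GAMMA * ts_s
```

With this assumed form, the four sample cases in the step S112 description (e.g., N2=3, N1=2, 1/Tw=2.6 giving P<0.25) all land in the stated probability bands.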
The travel_ECU 21 is configured to calculate the passing probability P based on, for example, the data obtained from the sensors of the vehicle 100a (for example, the travel environment recognizer 11), the data obtained from the road surface sensor by the road-to-vehicle communication by the transceiver 18, or the data obtained from random vehicles by the vehicle-to-vehicle communication by the transceiver 18. Calculation timing of the number of the vehicles N1 and the number of the vehicles N2 may be, for example, timing at which the presence of the space SP (entry space) available for the entry of the vehicle 100b has been determined, that is, timing at which step S108 described later is performed.
Description now moves on to the driver assistance procedure in the travel control system 1 with reference to
Thereafter, the travel environment recognizer 13c may detect the priority road Lm, the travel lane L1, the opposite lane L2, the non-priority road Ls, the intersection CL, the vehicles on the priority road Lm (for example, the vehicles 100a, and 100c to 100f), and the vehicle on the non-priority road Ls (for example, the vehicle 100b) by using the road map information acquired by the external communication. Here, it is assumed that the road map information acquired by the external communication includes information regarding the vehicles on the priority road Lm (for example, the vehicles 100a, and 100c to 100f) and information regarding the vehicle on the non-priority road Ls (for example, the vehicle 100b). At this occasion, it is possible for the travel environment recognizer 13c to detect the vehicles on the priority road Lm (for example, the vehicles 100a, and 100c to 100f) and the vehicle on the non-priority road Ls (for example, the vehicle 100b) by using the road map information acquired by the external communication.
The vehicle position estimator 13b may acquire the positional coordinates of the vehicle 100a based on the positioning signals received by the GNSS receiver 17. The vehicle position estimator 13b may further acquire the vehicle speed (the speed of the vehicle 100a) detected by the vehicle speed sensor 15.
Thereafter, the travel_ECU 21 may acquire road information Da and vehicle information Db based on various kinds of the information obtained from the travel environment detector 11d, the vehicle position estimator 13b, and the travel environment recognizer 13c (step S101). Here, the road information Da may include information regarding the priority road Lm, the travel lane L1, the opposite lane L2, the non-priority road Ls, and the intersection CL detected by the travel environment detector 11d or the travel environment recognizer 13c. The vehicle information Db may include information regarding the speed (vehicle speed) of the vehicle 100a acquired from the vehicle position estimator 13b, and information regarding the vehicles on the priority road Lm (for example, the vehicles 100a, and 100c to 100f) and the vehicle on the non-priority road Ls (for example, the vehicle 100b) acquired from the travel environment detector 11d or the travel environment recognizer 13c.
Thereafter, the travel_ECU 21 may determine presence or absence of any intersections CL ahead of the vehicle 100a (step S102). When the information regarding the intersection CL is included in the road information Da (step S102; Y), the travel_ECU 21 may determine whether the lane (travel lane L1) on which the vehicle 100a is traveling is the priority road Lm (step S103). When the information regarding the priority road Lm is included in the road information Da (step S103; Y), the travel_ECU 21 may determine presence or absence of any vehicles (target vehicles) 100b traveling on the non-priority road Ls (step S104). When the information regarding the vehicle 100b is included in the vehicle information Db (step S104; Y), the travel_ECU 21 may calculate the inter-vehicle spaces ΔL formed by the vehicles traveling on the opposite lane L2 of the priority road Lm (step S105). When any one of the calculated inter-vehicle spaces ΔL is equal to or larger than a predetermined threshold value ΔLth (step S106; Y), the travel_ECU 21 may recognize the space having the inter-vehicle space ΔL equal to or larger than the threshold value ΔLth as the space SP mentioned above.
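The screening sequence of steps S102 through S106 can be sketched as a short chain of checks. The function name, the data shapes, and the ΔLth value here are illustrative assumptions; only the order and intent of the checks come from the description above.

```python
DELTA_L_TH_M = 8.0  # assumed threshold ΔLth for an inter-vehicle space

def find_spaces(road_info: dict, vehicle_info: dict) -> list[float]:
    """Return the inter-vehicle spaces ΔL on the opposite lane L2 that are
    at least ΔLth, after the step S102-S104 screening checks pass."""
    if not road_info.get("intersection_ahead"):          # step S102: intersection CL ahead?
        return []
    if not road_info.get("own_lane_is_priority"):        # step S103: travel lane L1 on priority road Lm?
        return []
    if not vehicle_info.get("target_on_non_priority"):   # step S104: vehicle 100b on non-priority road Ls?
        return []
    gaps = vehicle_info.get("opposite_lane_gaps_m", [])  # step S105: inter-vehicle spaces ΔL
    return [g for g in gaps if g >= DELTA_L_TH_M]        # step S106: ΔL >= ΔLth
```

Any failed check short-circuits the sequence, mirroring the N branches of steps S102 through S106.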
Thereafter, the travel_ECU 21 may calculate the passing conditions for the space SP (step S107). When the space SP satisfies the passing conditions (step S108; Y), the travel_ECU 21 may calculate the number of the vehicles N1, the number of the vehicles N2, the waiting time Tw or the time to spare Ts, and the passing probability P (steps S109, S110, and S111).
In each step described above, the travel_ECU 21 may perform step S101 when any one of the following applies.
Thereafter, the travel_ECU 21 may provide driver assistance in accordance with the passing probability P (step S112). Here, it is assumed that α=1 and β=γ=0.4. For example, when P<0.25 (N2=3, N1=2, 1/Tw or Ts=2.6), the travel_ECU 21 may provide no driver assistance.
For example, when 0.25≤P<0.50 (N2=6, N1=5, 1/Tw or Ts=1.3), the travel_ECU 21 may call for attention of the driver of the vehicle 100a. The travel_ECU 21 may output, for example, a video signal on which a form image with a color (for example, yellow) indicating the presence of the vehicle 100b on the non-priority road Ls is superimposed, to a head-up display that displays a video on a front windshield. This makes it possible for the driver of the vehicle 100a to recognize the presence of the vehicle 100b on the non-priority road Ls by the video displayed on the front windshield, and, for example, pass through the intersection CL while decelerating.
For example, when 0.50≤P<0.75 (N2=10, N1=8, 1/Tw or Ts=0.2), the travel_ECU 21 may alert the driver of the vehicle 100a. The travel_ECU 21 may output, for example, a video signal on which a form image with a color (for example, red) indicating the presence of the vehicle 100b on the non-priority road Ls is superimposed, to the head-up display that displays a video on the front windshield. The travel_ECU 21 may output, for example, an audio signal that produces an intermittent sound, to a speaker. Thus, the driver of the vehicle 100a recognizes the presence of the vehicle 100b on the non-priority road Ls by the image displayed on the front windshield, and further recognizes a risk of a rush-out of the vehicle 100b on the non-priority road Ls, by the intermittent sound from the speaker. As a result, it is possible for the driver of the vehicle 100a to, for example, pass through the intersection CL slowly.
For example, when 0.75≤P (N2=15, N1=12, 1/Tw or Ts=0.1), the travel_ECU 21 may make the risk avoidance control such as braking with respect to the vehicle 100a. The travel_ECU 21 may make predetermined braking for risk avoidance at a stage where, for example, the collision between the vehicle 100a and the vehicle 100b is going to happen within 3 seconds or less. Hence, it is possible to avoid the collision between the vehicle 100a and the vehicle 100b.
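The graduated responses above can be summarized as a simple mapping from P to an assistance level. The thresholds 0.25, 0.50, and 0.75 come from the description; the level names are illustrative labels.

```python
def assistance_level(p: float) -> str:
    """Map passing probability P to the graduated responses described in
    step S112: no assistance, an attention call (yellow HUD image), an
    alert (red HUD image plus intermittent sound), or risk-avoidance braking."""
    if p < 0.25:
        return "none"
    if p < 0.50:
        return "attention"
    if p < 0.75:
        return "alert"
    return "braking"
```

Each band escalates the response, so a borderline value such as P=0.75 already falls into the braking band.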
Next, effects of the travel control system 1 according to the embodiment of the disclosure are described.
In the embodiment, the data is acquired that indicates that the vehicle 100b is present, that multiple vehicles are present on one or more lanes ahead of the vehicle 100a (the travel lane L1 and the opposite lane L2), and that the space SP (entry space) available for the entry of the vehicle 100b is present among one or more spaces each formed by any two adjacent vehicles on a common lane (the opposite lane L2) out of the one or more lanes ahead of the vehicle 100a. When the vehicle 100b is waiting in front of the stop line SL, the waiting time Tw for the vehicle 100b is estimated based on the acquired data. When the vehicle 100b is traveling toward the stop line SL, the time to spare Ts until the entry of the vehicle 100b into the entry space is estimated based on the acquired data. Furthermore, based on the waiting time Tw or the time to spare Ts, the possibility of the entry of the vehicle 100b into the entry space (passing probability P) is predicted. This makes it possible to predict the possibility that the vehicle 100b passes through the intersection CL under the influence of the mental state of the driver of the vehicle 100b. As a result, it is possible to call for the driver's attention, alert the driver, make a braking control, and the like, making it possible to avoid a collision between the vehicle 100a and the vehicle 100b.
In the embodiment, the possibility of the entry of the vehicle 100b into the entry space (passing probability P) may be predicted based on the number of the vehicles N1, the number of the vehicles N2, and the waiting time Tw or the time to spare Ts. This makes it possible to predict the possibility that the vehicle 100b passes through the intersection CL under the influence of the mental state of the driver of the vehicle 100b. As a result, it is possible to call for the driver's attention, alert the driver, make the braking control, and the like, making it possible to avoid the collision between the vehicle 100a and the vehicle 100b.
In the embodiment, when the road information Da and the vehicle information Db are acquired from the sensors provided in the vehicle 100a, it is possible to predict the possibility of the entry of the vehicle 100b into the entry space (passing probability P) even when it is difficult for the vehicle 100a to communicate with the network environment NW.
In the embodiment, when the road information Da and the vehicle information Db are acquired from the sensors provided in the vehicle 100a and the network environment NW, it is possible to predict the possibility of the entry of the vehicle 100b into the entry space (passing probability P) more accurately than the case where the road information Da and the vehicle information Db are generated only by the sensors provided in the vehicle 100a.
Although some example embodiments of the disclosure have been described in the foregoing by way of example with reference to the accompanying drawings, the disclosure is by no means limited to the embodiments described above. It should be appreciated that modifications and alterations may be made by persons skilled in the art without departing from the scope as defined by the appended claims. The disclosure is intended to include such modifications and alterations in so far as they fall within the scope of the appended claims or the equivalents thereof.
In the foregoing embodiment, the evaluation target region Rb may be, for example, as illustrated in
In the foregoing embodiment and the modification examples thereof, for example, as illustrated in step S113 in
In the foregoing embodiment, the disclosure is applied to the driver assistance at the intersection CL where the priority road Lm and the non-priority road Ls intersect. However, in the foregoing embodiment and the modification examples thereof, the disclosure may be applied to, for example, the driver assistance at a merging point where the non-priority road Ls merges with the priority road Lm. In such a case, as with the foregoing embodiment and the modification examples thereof, it is possible to predict the possibility of merging of the vehicle 100b under the influence of the mental state of the driver of the vehicle 100b.
In the foregoing embodiment and the modification examples thereof, when it is difficult for the vehicle 100a to communicate with the network environment NW, the travel_ECU 21 may acquire the road information Da and the vehicle information Db based on, for example, various kinds of data regarding a sensor detection region SR obtained from the various sensors mounted on the vehicle 100a. Here, the road information Da may include the information regarding the priority road Lm, the travel lane L1, the opposite lane L2, the non-priority road Ls, and the intersection CL detected by the travel environment recognizer 13c. The vehicle information Db may include the information regarding the speed (vehicle speed) of the vehicle 100a acquired from the vehicle position estimator 13b and the information regarding the vehicles on the priority road Lm (for example, the vehicles 100a, and 100c to 100f) and the vehicle on the non-priority road Ls (for example, the vehicle 100b). Even in such a case, it is possible to predict the possibility of merging, crossing, etc. of the vehicle 100b under the influence of the mental state of the driver of the vehicle 100b.
It is to be noted that the effects described in the specification are merely examples. Effects of the disclosure are not limited to the effects described in the specification. The disclosure may produce other effects than described in the specification.
The example embodiment described above explains an example of a driver assistance apparatus in the case where the subject vehicle travels on a road where drivers keep to the right by law. Needless to say, if the driver assistance apparatus is to be applied to a road where drivers keep to the left by law, the right and left settings or the like may be reversed as appropriate.
As used herein, the term “collision” may be used interchangeably with the term “contact”.
Moreover, for example, the disclosure may take the following configurations.
(1)
A driver assistance apparatus including
The driver assistance apparatus according to (1), in which
The driver assistance apparatus according to (1) or (2), in which
The driver assistance apparatus according to any one of (1) to (3), in which
The driver assistance apparatus according to (4), in which
The driver assistance apparatus according to (4), in which
A vehicle including
A driver assistance method including
A driver assistance apparatus including
A driver assistance apparatus including
The travel_ECU 21 illustrated in
This application is a continuation of International Application No. PCT/JP2023/030240, filed on Aug. 23, 2023, the entire contents of which are hereby incorporated by reference.
Parent application: PCT/JP2023/030240, Aug 2023 (WO)
Child application: 19052617 (US)