The present application relates to a vehicle travel path generation device and to a method for generating a vehicle travel path.
In a drive support device that detects the division lines of a road with a front recognition camera mounted in a vehicle, computes an autonomous-sensor target travel path from the shape of the white lines of the detected host vehicle drive lane, and maintains travel by employing the autonomous-sensor target travel path as a travel path, there remains the problem that the detection performance for road division lines deteriorates in traffic jams and bad weather, so that the drive support cannot be continued.
To address this problem, a device has been proposed in which at least two trajectories are detected from among a trajectory of a target path on which a host vehicle travels, a running trajectory of a leading vehicle which travels ahead of the host vehicle, and a running trajectory of a parallel running vehicle which travels parallel to the host vehicle or the leading vehicle, where those trajectories are detected using the information from a front recognition camera mounted in the host vehicle. Further, the trajectories are integrated with individual weights, and the integrated path is defined as the target path (Patent Document 1).
Moreover, a drive control device has been proposed which detects lane information and sets a target travel path using a variable adoption ratio between image information and map information, where the adoption ratio depends on the reliability of the image information from a front recognition camera and on the reliability of high-precision map information obtained by a GNSS, such as the GPS, which includes a lane center point group, white line position information, and the like for the roads around the host vehicle (Patent Document 2).
In these conventional travel path generation devices, image information is obtained with a camera which recognizes the area ahead, and the travel path of a vehicle is generated from it. However, it is desired that the accuracy of control be further enhanced.
The present application aims at offering a vehicle travel path generation device which estimates and outputs the travel path of a vehicle, so that optimal control can be conducted according to the state in which the host vehicle is placed.
A vehicle travel path generation device according to the present application, includes
a first travel path generation part which approximates a lane on which a host vehicle travels and outputs the result as first travel path information,
a second travel path generation part which approximates a road division line ahead of the host vehicle and outputs the result as second travel path information,
a travel path weight setting part which sets a weight denoting a certainty between the first travel path information and the second travel path information, and
an integrated path generation part which generates integrated path information, using the first travel path information, the second travel path information, and the weight set by the travel path weight setting part,
wherein the travel path weight setting part sets the weight, on the basis of at least one of outputs from a bird's-eye view detection travel path weight setting part, a vehicle state weight setting part, a path distance weight setting part, and a peripheral environment weight setting part,
where the bird's-eye view detection travel path weight setting part computes a weight between the first travel path information and the second travel path information, on the basis of the first travel path information,
the vehicle state weight setting part computes a weight between the first travel path information and the second travel path information, on the basis of a state of the host vehicle,
the path distance weight setting part computes a weight between the first travel path information and the second travel path information, on the basis of a distance of a travel path of the second travel path information, and
the peripheral environment weight setting part computes a weight between the first travel path information and the second travel path information, on the basis of a peripheral road environment of the host vehicle.
The vehicle travel path generation device according to the present application makes it possible to generate a travel path with sufficient accuracy, according to the state where the host vehicle is placed.
As shown in
From the host vehicle position and azimuth detection part 10 and the road map data 20, a first travel path generation part 60 approximates, by a polynomial equation, a lane on which a host vehicle should travel, and outputs the approximation result as the first travel path information. A second travel path generation part 70 approximates, by a polynomial equation, a front road division line which is acquired with the front camera sensor 30, and outputs the approximation result as the second travel path information.
For example, the first travel path information which the first travel path generation part 60 outputs and the second travel path information which the second travel path generation part 70 outputs each correspond to determining the coefficients of an approximated curve, namely a lateral position deviation, an angle deviation, a path curvature, and a path curvature change, with respect to the host vehicle. It is worth noticing that, henceforth, the first travel path information and the second travel path information are abbreviated as the first travel path and the second travel path, respectively.
From the information of the first travel path generation part 60, the host vehicle position and azimuth detection part 10, the road map data 20, the second travel path generation part 70, the front camera sensor 30, and the vehicle sensor 40, the travel path weight setting part 90 sets a weight which denotes the certainty, that is, the ratio of plausibility, between the first travel path of the first travel path generation part 60 and the second travel path of the second travel path generation part 70. The integrated travel path generation part 100 integrates these into a single path and outputs it as an integrated travel path, on the basis of the information of the first travel path generation part 60, the second travel path generation part 70, and the travel path weight setting part 90.
Next, on the basis of
On the basis of the information from the vehicle sensor 40, the vehicle state weight setting part 92 sets a weight between the first travel path and the second travel path, that is, a vehicle state weight W sens. On the basis of the information on the path distance for both travel paths of the first travel path generation part 60 and the second travel path generation part 70, the path distance weight setting part 93 sets a weight between the first travel path and the second travel path, that is, a path distance weight W dist. On the basis of the information from the road map data 20, the peripheral environment weight setting part 94 sets a weight between the first travel path and the second travel path, that is, a peripheral environment weight W map.
On the basis of the information on the reliability of both travel paths of the first travel path generation part 60 and the second travel path generation part 70, the detection means state weight setting part 95 sets a weight between the first travel path and the second travel path, that is, a detection means state weight W status. The weight integration part 96 computes a final weight W total between the first travel path and the second travel path, from the bird's-eye view detection travel path weight W bird according to the bird's-eye view detection travel path weight setting part 91, the vehicle state weight W sens according to the vehicle state weight setting part 92, the path distance weight W dist according to the path distance weight setting part 93, the peripheral environment weight W map according to the peripheral environment weight setting part 94, and the detection means state weight W status according to the detection means state weight setting part 95. After that, the weight integration part 96 outputs the result of computation to the integrated travel path generation part 100.
Next, using the flow chart of
First, in the first travel path generation part 60, a target point sequence (a point sequence arranged fundamentally in the lane center) of a lane on which a host vehicle is traveling presently and the state of the host vehicle are computed as an approximate expression on a host vehicle reference coordinate system, from the information of the host vehicle position and azimuth detection part 10 and the road map data 20. The expression is represented as the Equation 1 (Step S100).
Eq. 1
path_1(x)=C3_1×x^3+C2_1×x^2+C1_1×x+C0_1 (Equation 1)
Next, in the second travel path generation part 70, the travel path on which a host vehicle should travel is computed from the information of a division line which is detected with the front camera sensor 30, where the division line is ahead of a host vehicle. The expression is represented as the Equation 2 (Step S200).
Eq. 2
path_2(x)=C3_2×x^3+C2_2×x^2+C1_2×x+C0_2 (Equation 2)
In the Equation 1 and the Equation 2, the second-order term denotes the curvature of each path, the first-order term denotes the angle of the host vehicle with respect to each path, and the zeroth-order term denotes the lateral position of the host vehicle with respect to each path. In this way, a travel path is computed for each of the states in Step S100 and Step S200. In addition, a weight W for each travel path, which is represented by the Equation 3, is computed by the travel path weight setting part 90 (Step S400).
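For reference, the cubic approximations of the Equation 1 and the Equation 2 can be evaluated as sketched below; this is only an illustrative sketch, and the function name `eval_path` is not part of the application:

```python
def eval_path(c3, c2, c1, c0, x):
    """Evaluate a cubic travel-path approximation at longitudinal distance x.

    Per the text, the second-order coefficient relates to the path
    curvature, the first-order coefficient to the heading angle of the
    host vehicle relative to the path, and the zeroth-order coefficient
    to its lateral position.
    """
    return c3 * x**3 + c2 * x**2 + c1 * x + c0
```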
After that, in the integrated travel path generation part 100, an integrated travel path Path_total, on which the host vehicle should travel, is computed by the Equation 4, from the paths computed in Step S100 and Step S200 and the weights for the respective paths computed in Step S400 (Step S500).
It is worth noticing that, as for the computing operation of each of the paths in Step S100 and Step S200, the computed result of one path does not influence the computing operation of the other. Therefore, there are no restrictions on the order of computation.
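The Equation 4 is not reproduced in the text; as an assumption, the integration can be sketched as a per-coefficient weighted average of the two paths, with the weights normalized per order:

```python
def integrate_paths(coeffs_1, coeffs_2, w_1, w_2):
    """Combine the first and second travel paths coefficient by coefficient.

    coeffs_1/coeffs_2 hold [C3, C2, C1, C0] of the Equation 1 and the
    Equation 2; w_1[x]/w_2[x] are the per-order weights W total_1_cx and
    W total_2_cx. This weighted-average form is an assumption, since the
    Equation 4 itself is not reproduced in the text.
    """
    return [(wa * ca + wb * cb) / (wa + wb)
            for ca, cb, wa, wb in zip(coeffs_1, coeffs_2, w_1, w_2)]
```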
Next, using the flow chart of
First, using the information from the first travel path generation part 60, a bird's-eye view detection travel path weight W bird is set, and is represented as the Equation 5 (Step S410).
Next, using the information from the vehicle sensor 40, a vehicle state weight W sens is set, and is represented as the Equation 6 (Step S420).
Next, using the information on the path distance of each of the paths of the first travel path generation part 60 and the second travel path generation part 70, a path distance weight W dist is set, and is represented as the Equation 7 (Step S430).
Next, using the information from the road map data 20, a peripheral environment weight W map is set, and is represented as the Equation 8 (Step S440).
Next, using the information on the reliability of each of the paths of the first travel path generation part 60 and the second travel path generation part 70, a detection means state weight W status is set, and is represented as the Equation 9 (Step S450).
Next, from each of the weights which are set in Step S410 to Step S450, a weight for the first travel path W total_1 and a weight for the second travel path W total_2 are computed, and are represented as the Equation 10 (Step S460).
Eq. 10
W total_n_cx=W bird_n_cx×W sens_n_cx×W dist_n_cx×W map_n_cx×W status_n_cx (n=1, 2, x=0, 1, 2, 3) (Equation 10)
It is worth noticing that, as for the setting operation of each of the weights in Step S410 to Step S450, the setting result of one weight does not influence the other setting operations. Therefore, there are no restrictions on the order of computation.
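The product form of the Equation 10 can be sketched directly; the function below is illustrative only and is not part of the application:

```python
def total_weight(w_bird, w_sens, w_dist, w_map, w_status):
    """Equation 10: the final weight for one path n and one coefficient
    order x is the product of the five partial weights, so any single
    low-confidence judgment lowers the total weight for that path."""
    return w_bird * w_sens * w_dist * w_map * w_status
```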
Next, using the flow chart of
First, the bird's-eye view detection travel path weight W bird_1_cX (X=0, 1, 2, 3) for the first travel path is set to a maximum value of 1 (Step S411). Next, it is judged whether the magnitude of the coefficient of the curvature element of the approximated curve, which shows the relation between the host vehicle and the target path and is computed in the first travel path generation part 60, is larger than a threshold value C2_threshold, namely, whether the road curvature is larger than the threshold value (Step S412). When it is judged in Step S412 that the path curvature is larger, the bird's-eye view detection travel path weight W bird_2_cX for the second travel path is set to a value which is smaller than the bird's-eye view detection travel path weight W bird_1_cX for the first travel path (Step S413).
Moreover, when it is judged in Step S412 that the road curvature is smaller, it is judged whether the magnitude of the coefficient of the angle element of the approximated curve is larger than a threshold value C1_threshold, namely, whether the inclination of the host vehicle to the travel path is larger than the threshold value (Step S414). When it is judged in Step S414 that the inclination of the host vehicle to the travel path is larger, the process proceeds to Step S413. Moreover, when it is judged in Step S414 that the inclination is smaller, it is judged whether the magnitude of the coefficient of the position element of the approximated curve is larger than a threshold value C0_threshold, namely, whether the host vehicle is separated from the travel path by more than the threshold value (Step S415).
When it is judged in Step S415 that the host vehicle is separated from the travel path, the process proceeds to Step S413. Moreover, when it is judged in Step S415 that the host vehicle is not separated, it is judged that the accuracy of the second travel path is high, and the bird's-eye view detection travel path weight W bird_2_cX for the second travel path is set to a value which is equivalent to the bird's-eye view detection travel path weight W bird_1_cX for the first travel path (Step S416).
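The judgment sequence of Step S411 to Step S416 can be sketched as follows; the threshold arguments and the reduced weight value 0.5 are illustrative assumptions, since the application does not fix concrete values:

```python
def birdseye_weight(c2_1, c1_1, c0_1, c2_thr, c1_thr, c0_thr, reduced=0.5):
    """Return (W bird_1, W bird_2) per Steps S411-S416.

    A large curvature coefficient (C2), heading-angle coefficient (C1),
    or lateral-offset coefficient (C0) of the first travel path lowers
    confidence in the camera-based second travel path. The thresholds
    and the reduced value are assumptions.
    """
    w_bird_1 = 1.0  # S411: maximum weight for the first travel path
    # S412 / S414 / S415: any one exceeded threshold triggers S413
    if abs(c2_1) > c2_thr or abs(c1_1) > c1_thr or abs(c0_1) > c0_thr:
        return w_bird_1, reduced      # S413: lower weight for second path
    return w_bird_1, w_bird_1         # S416: equivalent weights
```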
In the operation of the bird's-eye view detection travel path weight setting part 91 according to the Embodiment 1,
In
The second travel path 201 is a travel path which is computed in the second travel path generation part 70. Moreover, the numeral 202 in
In the vehicle state of
As shown in
As shown in
As shown in
In the scene where a path curvature is small as in
In this way, according to the travel path generation device 1000 of vehicle use in the Embodiment 1, a weight is output to the weight integration part 96, from each of the bird's-eye view detection travel path weight setting part 91, the vehicle state weight setting part 92, the path distance weight setting part 93, the peripheral environment weight setting part 94, and the detection means state weight setting part 95, and further, the weight between the first travel path 200 and the second travel path 201 is set on the basis of each of the weights. Thereby, for example, even in the situation where the second travel path generation part 70 outputs the travel path information which is different from an actual travel path, it becomes possible in the bird's-eye view detection travel path weight setting part 91 to set a low weight to the concerned travel path depending on the positional relationship of a travel path to the host vehicle 1, from the information of the first travel path 200. Therefore, it becomes possible to generate an integrated travel path which is further in agreement with the actual travel path, and the convenience of an automatic operation function can be enhanced.
Next, using the flow chart of
First, the vehicle state weight W sens_1_cX (X=0, 1, 2, 3) for the first travel path 200 is set to a maximum value of 1 (Step S421). Next, it is judged, from the information of the vehicle sensor 40 which is mounted in the host vehicle 1, whether the vehicle body pitch angle θ pitch of the host vehicle 1 is larger than a threshold value θ_threshold, namely, whether the vehicle body is tilted frontward or backward (Step S422). When it is judged in Step S422 that the vehicle body pitch angle is larger, the vehicle state weight W sens_2_cX for the second travel path 201 is set to a value which is smaller than the vehicle state weight W sens_1_cX for the first travel path 200 (Step S423). Moreover, when it is judged in Step S422 that the vehicle body pitch angle is smaller, it is judged that the accuracy of the second travel path 201 is high, and the vehicle state weight W sens_2_cX for the second travel path 201 is set to a value which is equivalent to the vehicle state weight W sens_1_cX for the first travel path 200 (Step S424).
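Steps S421 to S424 can be sketched as follows; the threshold and the reduced weight value 0.5 are illustrative assumptions:

```python
def vehicle_state_weight(pitch, pitch_thr, reduced=0.5):
    """Return (W sens_1, W sens_2) per Steps S421-S424.

    A large vehicle-body pitch angle theta_pitch tilts the front camera's
    view, so the camera-based second travel path receives a lowered
    weight. The threshold and the reduced value are assumptions.
    """
    w_sens_1 = 1.0                    # S421: maximum weight for first path
    if abs(pitch) > pitch_thr:        # S422: body tilted frontward/backward
        return w_sens_1, reduced      # S423: lower weight for second path
    return w_sens_1, w_sens_1         # S424: equivalent weights
```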
In the operation of the vehicle state weight setting part 92 according to the present Embodiment 1,
In
In the state where a vehicle body pitch angle is small as in
Moreover, as mentioned already, the first travel path information which is output from the first travel path generation part 60 is a travel path which represents, in a bird's-eye view, the relation of a target path to the host vehicle 1, using an approximated curve, where the absolute coordinate information and absolute azimuth of the host vehicle 1, from the host vehicle position and azimuth detection part 10, and the information on the target point sequence 20A of a host vehicle drive lane, from the road map data 20 are used. Then, the decrease in the accuracy of a path due to the influence of a vehicle body pitch angle is small. From above, it can be said that the first travel path 200 is a high precision path to an actual travel path.
In this way, the travel path generation device 1000 of vehicle use according to the Embodiment 1 makes it possible, in the vehicle state weight setting part 92, to set a low weight to the concerned travel path in the situation where the travel path information of the second travel path generation part 70 is different from an actual travel path due to the influence of the vehicle body pitch angle of the host vehicle. Thereby, it becomes possible to generate an integrated travel path which is further in agreement with the actual travel path, and the convenience of an automatic operation function can be enhanced.
Next, using the flow chart of
First, the path distance weight W dist_1_cX (X=0, 1, 2, 3) for the first travel path is set to a maximum value of 1 (Step S431). Next, it is judged whether the path detection distance dist_2 in the second travel path generation part 70 is shorter than a set threshold value dist_threshold (Step S432). When it is judged in Step S432 that the detection distance of the second travel path is shorter, the path distance weight W dist_2_cX for the second travel path is set to a value which is smaller than the path distance weight W dist_1_cX for the first travel path (Step S433). Moreover, when it is judged in Step S432 that the detection distance of the second travel path 201 is longer, the path distance weight W dist_2_cX for the second travel path 201 is set to a value which is equivalent to the path distance weight W dist_1_cX for the first travel path 200 (Step S434).
In order to represent the operation of the path distance weight setting part 93 according to the present Embodiment 1,
The first travel path 200 is a travel path denoted by an approximated curve, showing the relation of the target path to the host vehicle 1, on the basis of the absolute coordinate information and absolute azimuth of the host vehicle 1, from the host vehicle position and azimuth detection part 10, and the information on the target point sequence 20A of the host vehicle drive lane, from the road map data 20. In addition, since the first travel path is acquired from a result detected in a bird's-eye view, it can be said that the first travel path is a path whose reliability is high. The second travel path 201 is a path which is generated using the information within the range of the image capturing distance 205, among the road division lines 202 whose images are captured with the front camera sensor 30.
As shown in
In the Equation 11, shown is an equation for computing a threshold value dist_threshold in Step S432 of
Eq. 11
dist_threshold=V×Tld (Equation 11)
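Combining the Equation 11 with the judgment of Steps S431 to S434 gives the following sketch; the interpretation of V as the host-vehicle speed and Tld as a look-ahead time, as well as the reduced weight value 0.5, are assumptions:

```python
def path_distance_weight(dist_2, v, t_ld, reduced=0.5):
    """Return (W dist_1, W dist_2) per Steps S431-S434.

    dist_threshold = V x Tld (Equation 11); V is assumed to be the
    host-vehicle speed and Tld a look-ahead time. A second travel path
    detected over too short a distance receives a lowered weight.
    """
    dist_threshold = v * t_ld         # Equation 11
    w_dist_1 = 1.0                    # S431: maximum weight for first path
    if dist_2 < dist_threshold:       # S432: detection distance too short
        return w_dist_1, reduced      # S433: lower weight for second path
    return w_dist_1, w_dist_1         # S434: equivalent weights
```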
In this way, in the path distance weight setting part 93, when the detection distance of the second travel path generation part 70 is short, the travel path generation device of vehicle use according to the Embodiment 1 makes it possible to set a low weight to the concerned travel path in the situation where the travel path information of the second travel path generation part 70 is different from an actual travel path. Therefore, it becomes possible to generate an integrated travel path which is further in agreement with an actual travel path, and the convenience of an automatic operation function can be enhanced.
Next, using the flow chart of
First, the peripheral environment weight W map_1_cX (X=0, 1, 2, 3) for the first travel path 200 is set to a maximum value of 1 (Step S441). Next, it is judged, using the information from the road map data 20, whether the magnitude of the change amount dθ of the road slope, between the current position of the host vehicle and a point a fixed distance ahead of the host vehicle, is larger than a set threshold value dθ_slope_threshold (Step S442). When it is judged in Step S442 that the change of the road slope is larger, the peripheral environment weight W map_2_cX for the second travel path 201 is set to a value which is smaller than the peripheral environment weight W map_1_cX for the first travel path 200 (Step S443). Moreover, when it is judged in Step S442 that the change of the road slope is smaller, it is judged that the accuracy of the second travel path is high, and the peripheral environment weight W map_2_cX for the second travel path 201 is set to a value which is equivalent to the peripheral environment weight W map_1_cX for the first travel path 200 (Step S444).
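Steps S441 to S444 can be sketched similarly; the threshold and the reduced weight value 0.5 are illustrative assumptions:

```python
def peripheral_environment_weight(d_theta, d_theta_thr, reduced=0.5):
    """Return (W map_1, W map_2) per Steps S441-S444.

    A large change d_theta of road slope between the host vehicle and a
    fixed-distance point ahead distorts the camera's view of the division
    lines, so the second travel path receives a lowered weight.
    """
    w_map_1 = 1.0                     # S441: maximum weight for first path
    if abs(d_theta) > d_theta_thr:    # S442: slope change exceeds threshold
        return w_map_1, reduced       # S443: lower weight for second path
    return w_map_1, w_map_1           # S444: equivalent weights
```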
In the operation of the peripheral environment weight setting part 94 according to the present Embodiment 1, the road slope, which is in the range between the host vehicle 1 and the front, changes from a downward slope to an upward slope.
In
In this way, in the travel path generation device 1000 of vehicle use according to the Embodiment 1, when the change amount of the road slope ahead of the host vehicle 1 is large, the peripheral environment weight setting part 94 makes it possible to set a low weight to the second travel path 201 in the situation where the travel path information of the second travel path generation part 70 is different from an actual travel path. Therefore, it becomes possible to generate an integrated travel path which is further in agreement with the actual travel path, and the convenience of an automatic operation function can be enhanced.
It is worth noticing that, in the Embodiment 1, as shown in
Next, regarding the method for generating a first travel path, explanation will be made about another example of the path generation by a “bird's-eye view” detection means. It is worth noticing that, according to the present Embodiment, in the first travel path generation part 60, the first travel path information is output from the host vehicle position and azimuth detection part 10 and the road map data 20. However, the method is not necessarily a means which uses the positioning information from an artificial satellite and road map data.
For example, road-side sensors, such as a millimeter wave sensor, a laser sensor (LiDAR), or a camera sensor, which are installed on a utility pole or signboard at the edge of a travel path, may be used to recognize the position and angle of a vehicle in the sensing domain and the road shape around the vehicle. Further, a polynomial equation is used to express the relation between the host vehicle and the travel path on the periphery of the host vehicle. Thereby, the same benefit can be acquired.
It is worth noticing that, according to the present Embodiment, as shown in the Equation 3, the Equation 5, the Equation 6, the Equation 7, the Equation 8, the Equation 9, and the Equation 10, a weight which is set to the first travel path and a weight which is set to the second travel path are set in the travel path weight setting part 90. Those weights are set for the coefficient of each order, where each travel path is denoted by an approximate equation of third order. However, the weights are not necessarily limited to weights for the coefficient of each order.
For example, the first travel path and the second travel path may be converted into point group information, which expresses the target passing points of each path. It is also allowed to set the weight for each path with respect to this point group information.
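The point-group alternative can be sketched as follows; the sampling distances and the function name are illustrative assumptions:

```python
def path_to_points(c3, c2, c1, c0, distances):
    """Convert a cubic travel-path approximation into a sequence of target
    passing points (x, y); weights could then be attached per point rather
    than per polynomial coefficient, as the text suggests."""
    return [(x, c3 * x**3 + c2 * x**2 + c1 * x + c0) for x in distances]
```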
The weight W which is set by the travel path weight setting part 90 is shown in the Equation 12, the bird's-eye view detection travel path weight W bird is shown in the Equation 13, the vehicle state weight W sens is shown in the Equation 14, the path distance weight W dist is shown in the Equation 15, the peripheral environment weight W map is shown in the Equation 16, the detection means state weight W status is shown in the Equation 17, and the weight W total_1 for the first travel path and the weight W total_2 for the second travel path are both shown in the Equation 18.
It is worth noticing that, as shown in
It is worth noticing that, as shown in
Although the present application is described above in terms of an exemplary embodiment, it should be understood that the various features, aspects, and functionality described in the embodiment are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to the embodiment. It is therefore understood that numerous modifications which have not been exemplified can be devised without departing from the scope of the present application. For example, at least one of the constituent components may be modified, added, or eliminated.
1 Host vehicle: 10 Host vehicle position and azimuth detection part: 20 Road map data: 20A Target point sequence: 30 Front camera sensor: 40 Vehicle sensor: 60 First travel path generation part: 70 Second travel path generation part: 90 Travel path weight setting part: 91 Bird's-eye view detection travel path weight setting part: 92 Vehicle state weight setting part: 93 Path distance weight setting part: 94 Peripheral environment weight setting part: 95 Detection means state weight setting part: 96 Weight integration part: 100 Integrated travel path generation part: 200 First travel path: 201 Second travel path: 202 Road division line: 203 Image capturing range boundary: 205 Image capturing distance: 206 Integrated travel path: 500 Processor: 501 Memory storage: 1000 Travel path generation device: 2000 Drive control device
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/005793 | 2/14/2020 | WO |