The present application relates to a travel path generation device.
In recent years, various devices using automatic driving technology have been developed and proposed for vehicles, so that a driver can operate a vehicle more comfortably and more safely. For example, Patent Document 1 proposes a vehicle control device for following an optimal path. The vehicle control device detects an autonomous sensor travel path, which is computed from the information of a front recognition camera, and a bird's-eye view sensor travel path, which is computed from high precision map information and a GNSS (Global Navigation Satellite System) such as GPS, where the high precision map information includes the point group of the traffic lane center, white line position information, and the like, of the road around a host vehicle. In addition, the vehicle control device computes a unified travel path in accordance with a weight for each of the travel paths, where the weight is determined based on the reliability judged from the detection state of the front recognition camera and the reliability judged from the receiving state of the GNSS.
In general, a path is expressed as a polynomial, and the bird's-eye view sensor travel path, the autonomous sensor travel path, and the integrated path are represented by Equations (1) to (3), respectively. In each equation, the coefficient of the first term (the second-order term) represents the curvature component of the path (hereafter referred to as the curvature component), the coefficient of the second term (the first-order term) represents the angle component between the host vehicle and the path (hereafter referred to as the angle component), and the coefficient of the third term (the intercept term) represents the lateral position component between the host vehicle and the path (hereafter referred to as the lateral position component).
Eq. 1
path_sat(x) = C2_sat × x² + C1_sat × x + C0_sat (1)
Eq. 2
path_cam(x) = C2_cam × x² + C1_cam × x + C0_cam (2)
Eq. 3
path_all(x) = C2_all × x² + C1_all × x + C0_all (3)
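By way of illustration only, the following minimal Python sketch evaluates a path of the form of Equations (1) to (3); the function name and the numeric coefficient values are hypothetical and not part of the present application.

```python
# Minimal sketch (illustrative only): evaluating the quadratic travel path
# of Equations (1) to (3). Coefficient names mirror the text; the numeric
# values below are hypothetical.

def eval_path(c2: float, c1: float, c0: float, x: float) -> float:
    """Lateral position of the path at longitudinal distance x [m].

    c2: curvature component (second-order coefficient)
    c1: angle component between host vehicle and path (first-order coefficient)
    c0: lateral position component between host vehicle and path (intercept)
    """
    return c2 * x**2 + c1 * x + c0

# Example: a gently curving path, nearly aligned with the host vehicle.
C2_sat, C1_sat, C0_sat = -0.0005, 0.01, 0.2  # hypothetical values
print(eval_path(C2_sat, C1_sat, C0_sat, x=30.0))
```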
Moreover, the respective components of the integrated path are represented by Equations (4) to (6). In each equation, w2_sat, w1_sat, and w0_sat represent the weights for the components of the bird's-eye view sensor travel path, and w2_cam, w1_cam, and w0_cam represent the weights for the components of the autonomous sensor travel path. The paths are averaged with these weights (a weighted average) component by component, and thereby each component of the integrated path is obtained.
Eq. 4
C2_all = w2_sat × C2_sat + w2_cam × C2_cam (4)
(where w2_sat + w2_cam = 1)
Eq. 5
C1_all = w1_sat × C1_sat + w1_cam × C1_cam (5)
(where w1_sat + w1_cam = 1)
Eq. 6
C0_all = w0_sat × C0_sat + w0_cam × C0_cam (6)
(where w0_sat + w0_cam = 1)
Note that, in these equations, w2_sat is the weight for the bird's-eye view sensor travel path in the curvature component of the integrated path; w2_cam is the weight for the autonomous sensor travel path in the curvature component; w1_sat is the weight for the bird's-eye view sensor travel path in the angle component; w1_cam is the weight for the autonomous sensor travel path in the angle component; w0_sat is the weight for the bird's-eye view sensor travel path in the lateral position component; and w0_cam is the weight for the autonomous sensor travel path in the lateral position component.
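By way of illustration only, the following is a minimal Python sketch of the per-component weighted average of Equations (4) to (6). The tuple layout and the helper name integrate_paths are assumptions for this sketch, not part of the present application.

```python
# Minimal sketch (illustrative only) of Equations (4) to (6): each weight
# pair must sum to 1, per the constraints under the equations, so only the
# bird's-eye view weights are passed and the autonomous weights are derived.

def integrate_paths(sat, cam, w_sat):
    """Blend (C2, C1, C0) of the bird's-eye view path `sat` and the
    autonomous path `cam` using per-component bird's-eye view weights
    `w_sat` = (w2_sat, w1_sat, w0_sat); each autonomous weight is 1 - w."""
    return tuple(w * s + (1.0 - w) * c for w, s, c in zip(w_sat, sat, cam))

sat_path = (-0.0005, 0.01, 0.2)  # hypothetical (C2_sat, C1_sat, C0_sat)
cam_path = (-0.0004, 0.02, 0.1)  # hypothetical (C2_cam, C1_cam, C0_cam)
w_sat = (0.7, 0.3, 0.3)          # (w2_sat, w1_sat, w0_sat)
C2_all, C1_all, C0_all = integrate_paths(sat_path, cam_path, w_sat)
```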
Here, according to the technology proposed in Patent Document 1, in the vicinity of a tunnel entrance or the like, it is difficult for the front recognition camera to recognize the inside of the tunnel. On the assumption that the accuracy of the angle component and curvature component of the autonomous sensor travel path is therefore low, the weight for the bird's-eye view sensor travel path is set higher than the weight for the autonomous sensor travel path in the angle component and curvature component of the integrated path.
However, in practice, due to the influence of errors in the position and azimuth obtained by the GNSS, the lateral position component and angle component of the bird's-eye view sensor travel path are lower in accuracy than those of the autonomous sensor travel path. Therefore, even if the weight of the bird's-eye view sensor travel path is set high in the angle component of the integrated path, there remains the problem that the conventional weighted averaging cannot generate an optimal integrated path.
The present application aims at generating a path with higher precision than existing path generation devices, so that optimal control may be performed according to the situation in which the host vehicle is placed.
A travel path generation device according to the present application includes:
a first path generation part that outputs, based on road map data, a bird's-eye view travel path which is constituted of a bird's-eye view curvature component, a bird's-eye view angle component of a host vehicle, and a bird's-eye view lateral position component of the host vehicle,
a second path generation part that outputs, based on information from a sensor which is mounted in the host vehicle, an autonomous travel path which is constituted of an autonomous curvature component, an autonomous angle component of the host vehicle, and an autonomous lateral position component of the host vehicle, and
a path generation part that receives the outputs of the first path generation part and the second path generation part; sets a curvature component of a travel path of the host vehicle, an angle component of the host vehicle relative to the travel path, and a lateral position component of the host vehicle relative to the travel path, based on the bird's-eye view curvature component, the autonomous angle component, and the autonomous lateral position component; and generates the travel path of the host vehicle.
The travel path generation device according to the present application generates and represents a travel path using the curvature component, the angle component, and the lateral position component of a bird's-eye view travel path and an autonomous travel path. Thereby, it becomes possible to generate an integrated path with higher accuracy than before.
Hereinafter, Embodiment 1 will be explained with reference to the drawings. Note that, in the drawings, the same symbols or numerals denote the same or corresponding parts.
As shown in
From the host vehicle position and azimuth detection part 10 and the road map data 20, a specific section ahead of the host vehicle (extending over a so-called look-ahead distance) is adopted as an approximation range. The bird's-eye view sensor travel path generation part 60 then outputs the result of approximating, within the approximation range, the traffic lane on which the host vehicle should travel by a polynomial. That is, as shown in
Note that the bird's-eye view sensor travel path is based on the road map data. Thereby, there is a benefit that the bird's-eye view sensor travel path can express the curvature of a path with better accuracy than the autonomous sensor travel path. Moreover, the autonomous sensor travel path is based on graphical image information from a camera. Thereby, there is a benefit that the autonomous sensor travel path can express the angle between the host vehicle and a path, and the lateral position between the host vehicle and a path, with better accuracy than the bird's-eye view sensor travel path, which is subject to the influence of errors in the position or azimuth obtained by the GNSS. Note that "bird's-eye view" denotes a state of looking down from a high place, and "bird's-eye view like" denotes a state close to looking down from a high position. On the other hand, "autonomous type" denotes a state of recognizing the surroundings and responding to them, using various kinds of sensors mounted in a car, such as a camera or a sonar.
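By way of illustration only, the following is a minimal Python sketch of the approximation described above: a quadratic is fitted to the lane-center target point sequence within the look-ahead range. The helper name, the input layout points_xy (host-vehicle coordinates, x forward), and the use of np.polyfit are assumptions for this sketch, not the implementation of the present application.

```python
# Minimal sketch (illustrative only) of the approximation performed by the
# bird's-eye view sensor travel path generation part 60: fit a quadratic to
# the lane-center target points that fall within the look-ahead range.
import numpy as np

def fit_birdseye_path(points_xy: np.ndarray, look_ahead: float):
    """Return (C2_sat, C1_sat, C0_sat) fitted over 0 <= x <= look_ahead.

    points_xy: hypothetical (N, 2) array of lane-center target points in
               host-vehicle coordinates (x forward, y lateral).
    """
    in_range = (points_xy[:, 0] >= 0.0) & (points_xy[:, 0] <= look_ahead)
    x, y = points_xy[in_range, 0], points_xy[in_range, 1]
    c2, c1, c0 = np.polyfit(x, y, deg=2)  # coefficients, highest order first
    return c2, c1, c0
```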
The travel path weight set up part 90 sets the weights that denote the relative reliability of the two travel paths from the bird's-eye view sensor travel path generation part 60 and the autonomous sensor travel path generation part 70. The integrated path generation part 100 outputs an integrated path, which is a single path, from the information of the bird's-eye view sensor travel path generation part 60, the autonomous sensor travel path generation part 70, and the travel path weight set up part 90.
Next, the overall operation of the vehicle control device according to Embodiment 1 will be explained, using the flow chart of
Here, as for the curvature component of a path, the weight of the bird's-eye view sensor travel path is set higher than the weight of the autonomous sensor travel path; and as for the angle component and the lateral position component between the host vehicle and a path, a predetermined value is set so that the weight of the autonomous sensor travel path becomes larger than the weight of the bird's-eye view sensor travel path. Note that the weight of the bird's-eye view sensor travel path and the weight of the autonomous sensor travel path sum to 1. For example, for the curvature component, the weight of the bird's-eye view sensor travel path is set to 0.7 and the weight of the autonomous sensor travel path to 0.3; and for the angle component and the lateral position component, the weight of the autonomous sensor travel path is set to 0.7 and the weight of the bird's-eye view sensor travel path to 0.3. Alternatively, for the curvature component, the weight of the bird's-eye view sensor travel path may be set to 1 and the weight of the autonomous sensor travel path to 0; and for the angle component and the lateral position component, the weight of the autonomous sensor travel path may be set to 1 and the weight of the bird's-eye view sensor travel path to 0. In that case, the bird's-eye view sensor travel path is substantially used as-is for the curvature component, and the autonomous sensor travel path is substantially used as-is for the angle component and the lateral position component.
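By way of illustration only, the two weight settings described above can be written down as follows; the constant names are hypothetical, and the ordering (curvature, angle, lateral position) follows the sketch after Equation (6).

```python
# Hypothetical weight constants; each tuple holds the bird's-eye view
# weights (w2_sat, w1_sat, w0_sat), and the autonomous weight for each
# component is 1 minus the corresponding entry.
W_SAT_BLENDED = (0.7, 0.3, 0.3)  # the 0.7/0.3 example from the text
W_SAT_SELECT = (1.0, 0.0, 0.0)   # the 1/0 example: substantially selects
                                 # curvature from the bird's-eye view path,
                                 # angle and lateral position from the
                                 # autonomous path
```

With W_SAT_SELECT, the weighted average of Equations (4) to (6) reduces to taking the curvature component from the bird's-eye view sensor travel path and the angle and lateral position components from the autonomous sensor travel path.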
After that, from the coefficients of the paths computed in Step S100 and Step S200, and the weights set in Step S400, the integrated path generation part 100 computes the coefficients of the integrated path (Equation (3)) on which the host vehicle should travel, by Equations (4) to (6) (Step S500).
Finally, using the integrated path, the vehicle control part 110 performs vehicle control (Step S600). Note that, in the computation of the paths in Step S100 and Step S200, the result of one does not influence the computation of the other; therefore, there is no restriction on the order of computation.
In this way, the path generation device according to the present Embodiment carries out a weighted averaging among the components of a plurality of paths. At that time, as for the curvature component of a path, the weight of the bird's-eye view sensor travel path is set higher than the weight of the autonomous sensor travel path; and, as for the angle component and the lateral position component between the host vehicle and a path, the weight of the autonomous sensor travel path is set higher than the weight of the bird's-eye view sensor travel path. It thereby becomes possible to generate an integrated path with higher accuracy than before.
Note that, in the present Embodiment, the weights are always set as described above: higher for the bird's-eye view sensor travel path in the curvature component, and higher for the autonomous sensor travel path in the angle component and the lateral position component. However, a better result is obtained if the above weight setting is carried out only in situations where the curvature accuracy of the autonomous sensor travel path becomes low, while in other situations the weights are set, as before, based on the reliability judged from the detection state of the front recognition camera and the reliability judged from the receiving state of the GNSS. In that case, for example, the vehicle control device is configured in the constitution of
Alternatively, as shown in
Alternatively, the vehicle control device 400 is configured in the constitution which is shown in
Next, Embodiment 2 will be explained with reference to the drawings.
Next, the overall operation of the vehicle control device 400 according to the present Embodiment will be explained. The overall flow chart is the same as in Embodiment 1; however, the method of setting the weights in Step S400 is different. In the present Embodiment, the travel path weight set up part 90 sets the weights in Step S400 based on the flow chart of
First, it is judged whether the vehicle speed V, which is input from the vehicle sensor 50, is lower than a set threshold value V1 (Step S401). When the vehicle speed of the host vehicle is judged to be low in Step S401, the weight of the autonomous sensor travel path is set higher than the weight of the bird's-eye view sensor travel path in all of the curvature component, the angle component, and the lateral position component (Step S402). When the vehicle speed of the host vehicle is not judged to be low in Step S401, the weight of the bird's-eye view sensor travel path is set higher than the weight of the autonomous sensor travel path for the curvature component, and the weight of the autonomous sensor travel path is set higher than the weight of the bird's-eye view sensor travel path for the angle component and the lateral position component (Step S403).
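By way of illustration only, the judgment of Steps S401 to S403 can be sketched as follows; the function name and the concrete weight values (which reuse the illustrative 0.7/0.3 setting of Embodiment 1) are assumptions, not part of the present application.

```python
# Minimal sketch (illustrative only) of the weight selection of Steps S401
# to S403. Returns the bird's-eye view weights (w2_sat, w1_sat, w0_sat);
# the autonomous weight for each component is 1 minus the returned entry.

def select_weights(vehicle_speed: float, v1: float):
    if vehicle_speed < v1:      # Step S401: low-speed judgment against V1
        return (0.3, 0.3, 0.3)  # Step S402: favor the autonomous path in
                                # all three components
    return (0.7, 0.3, 0.3)      # Step S403: favor the bird's-eye view path
                                # in curvature only
```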
Regarding the operation of the bird's-eye view sensor travel path generation part 60 according to the present Embodiment,
In this way, according to the present Embodiment, when the vehicle speed of the host vehicle is low, the weight of the autonomous sensor travel path is set higher than the weight of the bird's-eye view sensor travel path in all of the curvature component, the angle component, and the lateral position component. Therefore, the present Embodiment is not subject to the influence of the problem mentioned above, and when the vehicle speed is low, an integrated path with higher accuracy than in Embodiment 1 can be generated.
Note that, according to the present Embodiment, when the vehicle speed of the host vehicle is low, the weight of the autonomous sensor travel path is set higher than the weight of the bird's-eye view sensor travel path in all of the curvature component, the angle component, and the lateral position component. However, it is even more beneficial to judge directly whether the target point sequence of the host vehicle's driving traffic lane, which is used for the computation of the approximated curve in the bird's-eye view sensor travel path generation part 60, is small in number or not. In that case, for example, the vehicle control device 400 is configured in the constitution which is shown in
Moreover, in Embodiment 1 and Embodiment 2, the bird's-eye view sensor travel path computed in the bird's-eye view sensor travel path generation part 60, the autonomous sensor travel path computed in the autonomous sensor travel path generation part 70, and the integrated path are expressed by a quadratic expression consisting of the curvature component of a path, the angle component between the host vehicle and a path, and the lateral position component between the host vehicle and a path, as in Equations (1) to (6).
However, the paths are not necessarily limited to this form. For example, the travel path may be expressed by a cubic expression which additionally includes the curvature change component of a path as a third-order term (Equations (7) to (10)), and the same weight setting as for the curvature component is employed for the curvature change component. Thereby, the same benefit as when each travel path is expressed by a quadratic expression can be obtained; a minimal sketch follows the equations below. Here, C2_all, C1_all, and C0_all are the same as in Equations (4) to (6), and their descriptions are omitted.
Eq. 7
path_sat(x) = C3_sat × x³ + C2_sat × x² + C1_sat × x + C0_sat (7)
Eq. 8
path_cam(x) = C3_cam × x³ + C2_cam × x² + C1_cam × x + C0_cam (8)
Eq. 9
path_all(x) = C3_all × x³ + C2_all × x² + C1_all × x + C0_all (9)
Eq. 10
C3_all = w3_sat × C3_sat + w3_cam × C3_cam (10)
(where w3_sat + w3_cam = 1)
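By way of illustration only, the following minimal Python sketch extends the per-component weighted average to the cubic form of Equations (7) to (10); per the text, the curvature change component C3 uses the same weight setting as the curvature component C2. The helper name and the numeric values are hypothetical.

```python
# Minimal sketch (illustrative only): blending cubic paths per Equations
# (7) to (10). w_sat = (w3_sat, w2_sat, w1_sat, w0_sat), and the
# autonomous weight for each component is 1 minus the corresponding entry.

def integrate_cubic(sat, cam, w_sat):
    """Blend (C3, C2, C1, C0) of the two paths component by component."""
    return tuple(w * s + (1.0 - w) * c for w, s, c in zip(w_sat, sat, cam))

sat_path = (1e-6, -0.0005, 0.01, 0.2)  # hypothetical (C3, C2, C1, C0)
cam_path = (2e-6, -0.0004, 0.02, 0.1)  # hypothetical (C3, C2, C1, C0)
w_sat = (0.7, 0.7, 0.3, 0.3)           # w3_sat mirrors w2_sat, per the text
C3_all, C2_all, C1_all, C0_all = integrate_cubic(sat_path, cam_path, w_sat)
```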
Note that the travel path generation device 300 consists of a processor 500 and a memory storage 501, as shown in
Although the present application is described above in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects, and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to one or more of the embodiments.
It is therefore understood that numerous modifications which have not been exemplified can be devised without departing from the scope of the present application. For example, at least one of the constituent components may be modified, added, or eliminated. At least one of the constituent components mentioned in at least one of the preferred embodiments may be selected and combined with the constituent components mentioned in another preferred embodiment.
1 Host vehicle; 10 Host vehicle position and azimuth detection part; 20 Road map data; 21 Target point sequence information; 22 Host traffic lane; 23 Approximation range; 24 Division line information; 25 Approximated curve; 26 Travel path; 30 Camera sensor; 40 Forward looking radar; 50 Vehicle sensor; 60 Bird's-eye view sensor travel path generation part; 70 Autonomous sensor travel path generation part; 80 Vehicle speed sensor; 90 Travel path weight set up part; 91 Tunnel entrance travel judging part; 92 Host vehicle near travel judging part; 93 Autonomous sensor travel path effective range judging part; 94 Vehicle speed judging part; 95 Point sequence number judging part; 100 Integrated path generation part; 101 Bird's-eye view sensor travel path; 110 Vehicle control part; 200 Path generation part; 300 Path generation device; 400 Vehicle control device; 500 Processor; 501 Memory storage
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/016142 | 4/10/2020 | WO |