This application is a U.S. national stage application of International Application No. PCT/JP2017/043928, filed on Dec. 7, 2017.
The present invention relates to a road surface condition determination method for determining a condition of a road surface.
Japanese Laid-Open Patent Application No. 2007-232652 discloses a technique for determining a condition of a road surface based on information acquired by a camera capable of capturing ultraviolet images, infrared images, and temperature distribution images.
However, when a camera such as the one described in Japanese Laid-Open Patent Application No. 2007-232652 is installed in a vehicle and vehicle control is performed using an estimated road surface friction coefficient, simply ascertaining the distribution of road surface friction coefficients ahead has not been enough for proper control. Specifically, when a low μ road and a high μ road are both distributed ahead of the vehicle, ensuring the safety of the vehicle dictates that control be performed on the assumption of a low μ road. However, a problem has been encountered in which, when the region in which the vehicle is actually traveling, or more precisely the region in which the wheels of the vehicle contact the ground, is in fact a high μ road, controlling on the low μ assumption cannot ensure sufficient travel performance.
An object of the present invention is to provide a road surface condition determination method and a road surface condition determination device with which it is possible to accurately estimate a road surface friction coefficient optimal for vehicle travel from road surface friction coefficients ahead of the vehicle.
To achieve this object, in the present invention, when a road surface condition is determined based on information acquired by a camera installed in a vehicle, a route of a host vehicle is predicted and the road surface condition of the predicted route is determined.
Therefore, a road surface condition necessary for the host vehicle can be accurately determined.
Preferred embodiments according to the present disclosure will be described below in detail based on the drawings.
The vehicle has vehicle wheel speed sensors SS (FL, FR, RL, RR) that detect rotation states of vehicle wheels FL, FR, RL, RR, and an integrated sensor CS that detects a longitudinal acceleration rate Gx, lateral acceleration rate Gy, and yaw rate Yaw of the vehicle. A brake controller 10 receives sensor signals from the vehicle wheel speed sensors SS, computes a vehicle wheel speed Vw and a vehicle body speed Vx, and receives various sensor signals (Gx, Gy, Yaw) from the integrated sensor CS.
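As a rough illustration of the computations attributed to the brake controller 10, the following Python sketch derives a wheel speed from a sensor pulse count and estimates the vehicle body speed. The pulse-per-revolution count, tire radius, and the simple averaging are assumptions for illustration, not details from the source.

```python
import math

# Minimal sketch of the brake controller's speed computations.
# PULSES_PER_REV and TIRE_RADIUS_M are illustrative assumptions.
PULSES_PER_REV = 48
TIRE_RADIUS_M = 0.32

def wheel_speed(pulse_count: int, dt_s: float) -> float:
    """Wheel speed Vw [m/s] from sensor pulses counted over dt_s seconds."""
    revolutions = pulse_count / PULSES_PER_REV
    return revolutions * 2.0 * math.pi * TIRE_RADIUS_M / dt_s

def body_speed(vw: list[float]) -> float:
    """Crude body-speed estimate Vx: mean of the four wheel speeds
    (real controllers use more robust select-low/select-high logic)."""
    return sum(vw) / len(vw)
```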
Based on the received sensor signals and the computed information, the brake controller 10 executes, inter alia, anti-lock brake control (written below as ABS) that suppresses a tendency of the vehicle wheels to lock, vehicle dynamics control (written below as VDC) that stabilizes vehicle behavior, and automatic braking control based on a braking request received from an automatic driving controller (not shown), and controls the braking condition via a brake unit (not shown).
A controller 20 has an engine control unit that controls the driving condition of the engine 1, a shift control unit that controls the shift condition of the automatic transmission 2, and a driving force distribution control unit that controls a driving force distribution condition of the transfer 3. The engine control unit controls the speed and the torque of the engine 1 by means of the throttle position, the fuel injection amount, the spark plug ignition timing, and the like. The shift control unit decides an optimal shift position based on the vehicle speed VSP and the accelerator pedal position APO, and shifts to the selected shift position through hydraulic pressure control within the automatic transmission 2. Based on the traveling condition of the vehicle, the driving force distribution control unit computes the driving force to be distributed to the front wheels and the driving force to be distributed to the rear wheels, and controls the torque transmitted from the transfer 3 to the front-wheel side.
The vehicle has a camera 5 capable of capturing ultraviolet images, infrared images, and temperature distribution images of the area ahead of the vehicle. Images captured by the camera 5 are inputted to the controller 20, which contains a road surface condition determination unit. A road surface condition determination device in the first embodiment is configured from the camera 5 and the road surface condition determination unit inside the controller 20. Using an ultraviolet image, an infrared image, and a temperature distribution image of the road surface whose condition is to be determined, the road surface condition determination unit determines the condition of the road surface in a state in which light including ultraviolet rays and infrared rays strikes the road surface. The term “condition of the road surface” refers to the condition of snow, water, and ice on the road surface, and to the distributions thereof.

Ultraviolet rays scatter readily upon striking a physical object, and are known to scatter especially strongly on snowy surfaces. The condition of snow on the road surface can therefore be sensed from a feature quantity value pertaining to ultraviolet rays on the road surface, and the distribution of snow can be sensed if this sensing is performed over the entire area ahead of the vehicle. Because infrared rays are readily absorbed by water, the condition of water on the road surface can likewise be sensed from a feature quantity value of infrared rays on the road surface, and the distribution of water obtained in the same manner. The condition of ice on the road surface has a correlative relationship with road surface temperature. Accordingly, the feature quantity values of ultraviolet rays and infrared rays are calculated from the images, and the condition of the road surface is determined together with the temperature distribution image of the road surface. In the first embodiment, a dry road surface is determined as DRY, a wet road surface as WET, a snow-covered road surface as SNOW, and an icy road surface as ICE.
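As a concrete picture of this per-area determination, the following Python sketch classifies one determination area from the three feature quantities. The threshold values, feature-quantity scales, and function name are assumptions for illustration only, not values from the source.

```python
# Hypothetical per-determination-area classification from the three
# camera-derived feature quantities. All thresholds are illustrative.
FREEZING_C = 0.0

def classify_area(uv_scatter: float, ir_absorption: float, temp_c: float) -> str:
    """Return DRY / WET / SNOW / ICE for one determination area."""
    if uv_scatter > 0.8:            # strong UV scattering -> snow cover
        return "SNOW"
    if ir_absorption > 0.6:         # strong IR absorption -> water present
        # water at or below freezing suggests ice rather than a wet surface
        return "ICE" if temp_c <= FREEZING_C else "WET"
    return "DRY"
```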
The brake controller 10 and the controller 20 are connected via a CAN communication line. The controller 20 receives data such as pulse signals of the vehicle wheel speed sensors SS, the vehicle wheel speed Vw, the vehicle body speed Vx, the longitudinal acceleration rate Gx, the lateral acceleration rate Gy, and the yaw rate Yaw from the brake controller 10. The brake controller 10 receives data such as engine torque information, the shift position, and the driving force distribution condition from the controller 20.
In step S1, the required torque requested by the driver is calculated based on the accelerator pedal position. When VDC or automatic driving control is being executed, the torque requested by those controls is calculated as the required torque.
In step S2, a road surface friction coefficient (written below as road surface μ) is read from a road surface condition determination process. Details of the road surface condition determination process are described hereinafter.
In step S3, driving force distribution control is executed based on the required torque, the road surface μ, and a control map shown in the drawings.
A region S3 above the threshold value L2 in
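Taken together, steps S1 through S3 compute a required torque, read the estimated road surface μ, and look up a front/rear torque split from the control map. The following Python sketch illustrates only a one-dimensional map lookup over μ; the breakpoints and front-wheel shares are illustrative assumptions (the actual map also depends on the required torque, with threshold regions such as L2 and S3).

```python
# Placeholder for the step S3 control map: front-wheel torque share as a
# function of road surface mu for a rear-drive-based four-wheel-drive
# vehicle. Breakpoints and shares are illustrative assumptions.
MU_POINTS    = [0.1, 0.3, 0.7, 1.0]   # assumed ICE, SNOW, WET, DRY levels
FRONT_SHARES = [0.5, 0.4, 0.2, 0.0]   # more torque forward on slippery roads

def front_torque_share(road_mu: float) -> float:
    """Linearly interpolate the assumed control map."""
    if road_mu <= MU_POINTS[0]:
        return FRONT_SHARES[0]
    for i in range(1, len(MU_POINTS)):
        if road_mu <= MU_POINTS[i]:
            t = (road_mu - MU_POINTS[i - 1]) / (MU_POINTS[i] - MU_POINTS[i - 1])
            return FRONT_SHARES[i - 1] + t * (FRONT_SHARES[i] - FRONT_SHARES[i - 1])
    return FRONT_SHARES[-1]
```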
In step S21, a road surface condition determination is executed for each determination area based on the image captured by the camera 5.
In step S22, a predicted route for the host vehicle is calculated.
R = \left\{ \frac{L}{\sin \alpha} + \sqrt{L^{2} + \left( \frac{L}{\tan \beta} + T_{f} \right)^{2}} \right\} \Big/ 2

R_{1} = \sqrt{R_{2}^{2} + L^{2}} = \sqrt{\left( R \cos \alpha - T_{r} \right)^{2} + L^{2}}
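A direct Python transcription of these two radii follows. Note that the interpretation of the symbols (L as the wheelbase, Tf and Tr as the front and rear track widths, α and β as angles from the steering geometry) is an assumption, since their definitions do not appear in this excerpt.

```python
import math

def path_radius(L: float, alpha: float, beta: float, Tf: float) -> float:
    """R = { L/sin(alpha) + sqrt(L^2 + (L/tan(beta) + Tf)^2) } / 2"""
    return (L / math.sin(alpha) + math.hypot(L, L / math.tan(beta) + Tf)) / 2.0

def wheel_path_radius(R: float, L: float, alpha: float, Tr: float) -> float:
    """R1 = sqrt(R2^2 + L^2), with R2 = R*cos(alpha) - Tr"""
    R2 = R * math.cos(alpha) - Tr
    return math.hypot(R2, L)
```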
In step S23, the slipperiest road surface condition on the predicted route is extracted from among the road surface conditions (DRY, WET, SNOW, ICE) determined for each grid cell.
In step S24, the road surface μ is estimated from the extracted road surface condition and a table shown in the drawings.
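Steps S23 and S24 thus amount to a reduction over the grid cells on the predicted route followed by a table lookup. A minimal Python sketch follows; the μ values stand in for the table in the drawings and are illustrative placeholders, not values from the source.

```python
# Steps S23-S24 as a reduction over the grid cells on the predicted route.
SLIPPERY_ORDER = ["ICE", "SNOW", "WET", "DRY"]   # most to least slippery
MU_TABLE = {"ICE": 0.1, "SNOW": 0.3, "WET": 0.7, "DRY": 1.0}  # placeholders

def estimate_route_mu(route_conditions: list[str]) -> float:
    """Pick the slipperiest condition along the route, then convert it
    to a road surface mu via the (assumed) table."""
    worst = min(route_conditions, key=SLIPPERY_ORDER.index)
    return MU_TABLE[worst]

# e.g. estimate_route_mu(["DRY", "WET", "SNOW"]) -> 0.3
```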
The action of the road surface condition determination process shall next be described. When the road surface condition ahead of the vehicle is determined from the image captured by the camera 5, there can be scenarios in which a low μ road and a high μ road are both distributed ahead of the vehicle. For example, when traveling on a morning after a snowfall, there are often scenarios in which the snow has melted and μ is high in the central region of a road having a large amount of traffic (written below as a main road), while μ is low on a road intersecting the main road and having a small amount of traffic (written below as a non-main road) because snow remains there. Also, should the vehicle make a right or left turn from the main road onto the non-main road, the vehicle passes through a region outside the central region of the main road, and this region also has a low μ if snow remains in it.
When regions each having a different road surface μ are thus present ahead of the vehicle, sufficient driving force is not produced if, for the sake of vehicle safety, control is performed on a low μ assumption even while the vehicle passes through the lightly snowed central region of the main road. Conversely, if the assessment uses only the road surface μ of the central region of the main road and a large driving force is outputted, vehicle stability is difficult to ensure because the driving force becomes excessive when the vehicle turns right or left toward the non-main road and travels onto the low μ surface. In view of this, in the first embodiment, the route of the host vehicle is predicted and the road surface condition on the predicted route is determined, whereby an appropriate road surface μ and corresponding driving force distribution control are achieved.
The following operational effects are achieved in the first embodiment, as described above.
(1) When determining the road surface condition based on information acquired by the camera 5 installed in the vehicle, the controller 20 predicts the route of the host vehicle (step S22) and determines the road surface condition of the predicted route (step S23).
In other words, the road surface condition determination device includes a prediction unit (step S22) that predicts the route of the host vehicle and a determination unit (step S23) that determines the road surface condition of the predicted route based on the information acquired by the camera 5 installed in the vehicle.
Therefore, it is possible to determine the road surface condition necessary for the host vehicle.
(2) The controller 20 divides the ahead-of-vehicle image acquired by the camera 5 into determination areas, determines the road surface condition of each determination area (step S21), and determines the road surface condition based on the road surface conditions in the determination areas the predicted route will pass through.
Therefore, in addition to the road surface condition in the direction in which the vehicle proceeds, the road surface conditions in other directions are ascertained at all times, so even when the route is changed, the post-change road surface condition can be ascertained instantly.
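A minimal sketch of how this instant re-evaluation could look: the per-area determinations for the whole image are kept in a grid keyed by cell index, so re-evaluating a changed route is only a lookup rather than a fresh image-based determination. The grid representation, coordinate convention, and 1 m cell size are assumptions for illustration.

```python
def conditions_along_route(grid: dict[tuple[int, int], str],
                           route_xy: list[tuple[float, float]],
                           cell_size_m: float = 1.0) -> list[str]:
    """Collect the already-determined condition of every determination
    area the route passes through, in order and without duplicates."""
    seen: set[tuple[int, int]] = set()
    out: list[str] = []
    for x, y in route_xy:
        cell = (int(x // cell_size_m), int(y // cell_size_m))
        if cell in grid and cell not in seen:
            seen.add(cell)
            out.append(grid[cell])
    return out
```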
The second embodiment shall next be described. Only points of difference shall be described because the basic configuration is the same as that in the first embodiment. In the first embodiment, when a road surface condition is estimated, first, the road surface conditions in the captured image of the view ahead of the vehicle are all determined irrespective of the predicted route, and the road surface condition through which the predicted route of the host vehicle will pass is then extracted. By contrast, in the second embodiment, the predicted route of the host vehicle is calculated and a determination area is decided at a point of intersection between the predicted route of the host vehicle and a road surface condition determination line set ahead of the current host vehicle position by a prescribed distance. The second embodiment differs in that the road surface condition of the determination area containing this intersection point is extracted, and the road surface μ is estimated based on the extracted road surface condition.
In step S31, the predicted route of the host vehicle is calculated. This calculation is identical to that in step S22 in the first embodiment, and a description is therefore omitted.
In step S32, the road surface condition of the determination area containing the point of intersection between the predicted route of the host vehicle and the road surface condition determination line is extracted. The road surface condition determination line is set ahead of the current host vehicle position by, for example, 10 m; however, the distance is not particularly limited and may be set to any other distance to which driving force distribution control can adapt.
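The intersection test can be sketched as follows, assuming the vehicle sits at the origin heading along +y, the predicted path is a circular arc of radius R, and the determination line is the lateral line 10 m ahead. This geometry and the cell indexing are illustrative assumptions, not the source's formulation.

```python
import math

def crossing_point(R: float, line_distance_m: float = 10.0) -> tuple[float, float]:
    """Point where a circular predicted path of radius R (turn centre at
    (R, 0), vehicle at origin heading +y) crosses y = line_distance_m."""
    d = line_distance_m
    if abs(R) <= d:
        raise ValueError("path curls back before reaching the line")
    # circle through the origin: x^2 - 2*R*x + y^2 = 0; take the near branch
    x = R - math.copysign(math.sqrt(R * R - d * d), R)
    return (x, d)

def determination_area(R: float, cell_size_m: float = 1.0) -> tuple[int, int]:
    """Grid index of the determination area containing the crossing."""
    x, y = crossing_point(R)
    return (int(x // cell_size_m), int(y // cell_size_m))
```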
In step S33, the road surface μ is estimated from the extracted road surface condition and the table shown in the drawings.
Specifically, in the first embodiment, the computation load for the determination is large because all road surface conditions in the captured image of the view ahead of the vehicle are determined. By contrast, in the second embodiment, the computation load can be reduced because only the road surface conditions of determination areas corresponding to the predicted route of the host vehicle are determined and these conditions are used to estimate the road surface μ.
The following operational effects are achieved in the second embodiment, as described above.
(3) The controller 20 extracts determination areas through which the predicted route will pass from the ahead-of-vehicle image acquired by the camera 5 and determines the road surface condition of these determination areas.
Therefore, the computation load when determining road surface conditions can be reduced, and a quick road surface μ estimation can be realized.
The present invention was described above based on the embodiments, but the specific configuration is not limited to these embodiments. In the first embodiment, an example is presented in which the invention is applied to a four-wheel-drive vehicle having a rear-wheel-drive base, but the invention may also be applied to a four-wheel-drive vehicle having a front-wheel-drive base.
In the first embodiment, the road surface μ to be used when performing driving force distribution control is estimated, but an estimated road surface μ may be used in automatic driving control or other types of control such as ABS and VDC.
In the examples, the predicted route is calculated based on Ackermann theory, but when a destination is set in a navigation system, predicted route information may be taken from the navigation system.
In the examples, road surface conditions are determined from the image from the camera 5, which is capable of capturing ultraviolet images, infrared images, and temperature distribution images, but this configuration is not provided by way of limitation, and road surface conditions may instead be determined using laser, millimeter-wave, or other various types of radar.
In the second embodiment, the road surface conditions of determination areas at the intersection points between the predicted route of each wheel and the road surface condition determination line are determined, but there is no limitation to determination areas including these intersection points; the road surface conditions of determination areas containing the intersection points between the paths of the center points of the left and right front wheels and the road surface condition determination line may also be determined, and the lowest road surface μ among these may be employed.
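Adopting the lowest μ across wheels is a one-line reduction; a self-contained Python sketch follows, reusing the same placeholder μ values as in the earlier sketches (illustrative, not from the source).

```python
MU_TABLE = {"ICE": 0.1, "SNOW": 0.3, "WET": 0.7, "DRY": 1.0}  # placeholders

def lowest_wheel_mu(wheel_conditions: list[str]) -> float:
    """Given the determined condition at each wheel path's crossing of
    the determination line, adopt the safest (lowest) road surface mu."""
    return min(MU_TABLE[c] for c in wheel_conditions)

# e.g. left front wheel on remaining snow, right front on cleared asphalt:
# lowest_wheel_mu(["SNOW", "DRY"]) -> 0.3
```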
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2017/043928 | 12/7/2017 | WO | 00

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2019/111366 | 6/13/2019 | WO | A
Number | Date | Country |
---|---|---
2002-127882 | May 2002 | JP |
2002-310896 | Oct 2002 | JP |
2005-178622 | Jul 2005 | JP |
2007-232652 | Sep 2007 | JP |
2008-709998 | Mar 2008 | JP |
2016-143399 | Aug 2016 | JP |
6482116 | Mar 2019 | JP |
Number | Date | Country
---|---|---
20200317205 A1 | Oct 2020 | US