This invention relates to a method and an apparatus for estimating the friction coefficient in a moving vehicle.
Sensing or determining the friction coefficient acting between the tires and the road, or detecting the condition of the road (e.g. dry, wet, snow-covered, or icy) from which the friction coefficient can be derived, is a major prerequisite for supporting the driver in his or her driving task and for avoiding severe accidents. In general, assessing the road condition is the job of the driver, who then adjusts his or her driving style accordingly. Vehicle control systems such as ESC (Electronic Stability Control), TCS (Traction Control System), or ABS (Anti-lock Braking System) help the driver stabilize the vehicle in risky conditions, so that he or she is better able to cope with extreme driving situations.
Accident prevention is becoming increasingly important in driver assistance systems. Emergency braking systems and, most recently, also emergency collision avoidance systems make an important contribution here. Their effectiveness, however, depends decisively on the friction coefficient of the road surface: moisture, snow, and ice considerably reduce the coefficient of friction available between the tires and the road compared to a dry road.
Document EP 792 228 B1 discloses a directional stability control system for ESP (Electronic Stability Program)/ESC controllers, which can be used in special situations to determine a friction coefficient. If at least one wheel fully utilizes the available friction, e.g. when driving on a slippery road, the vehicle brake control system can determine the friction coefficient from the rotational behavior of the wheels and from the ESP/ESC acceleration sensors.
Document DE 102 56 726 A1 discloses a method for generating a signal depending on road conditions using a reflection signal sensor, such as a radar or optical sensor. This facilitates proactive detection of the road condition in a motor vehicle.
Document DE 10 2004 018 088 A1 discloses a road recognition system having a temperature sensor, an ultrasonic sensor, and a camera. The road data obtained from the sensors is filtered and compared to reference data to determine whether the road is in a drivable condition; in the process, the type of road surface (e.g. concrete, asphalt, dirt, grass, sand, or gravel) and its condition (e.g. dry, icy, snow-covered, wet) can be classified.
Document DE 10 2004 047 914 A1 discloses a method for estimating the road condition in which data from multiple different sensors, such as camera, infrared sensor, rain sensor, or microphone, is merged to obtain a classification of the road condition to which a friction coefficient can be assigned.
It is further known that the friction coefficient information is not only output as driver information but is also provided to other vehicle or driver assistance systems, so that these can adjust their operating state accordingly. For example, in the case of a low friction coefficient, the ACC (Adaptive Cruise Control) can be set to a longer following distance, or a curve warning function can be adjusted accordingly.
Tire slip and tire vibration can be analyzed based on the wheel speed signal and then be used to classify the friction coefficient. The advantage is that this solution can be implemented as a pure software solution, i.e. cost-efficiently, in an electronic braking system (e.g. an ESP controller). The disadvantage is that the friction coefficient cannot be determined proactively, but only after the road surface has been passed over.
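As a rough illustration of this kind of wheel speed evaluation, the following sketch computes a frequency spectrum of the wheel speed fluctuations and maps it to a coarse friction class. All function names, band limits, and threshold values are illustrative assumptions and are not taken from the cited documents.

```python
import numpy as np

def road_excitation_spectrum(wheel_speed: np.ndarray) -> np.ndarray:
    """Magnitude spectrum of the wheel speed fluctuations (the road-induced excitation)."""
    fluctuation = wheel_speed - np.mean(wheel_speed)   # remove the mean rolling speed
    window = np.hanning(len(fluctuation))              # reduce spectral leakage
    return np.abs(np.fft.rfft(fluctuation * window))

def wheel_friction_coefficient(spectrum: np.ndarray) -> float:
    """Map the excitation spectrum to a coarse friction coefficient class (illustrative)."""
    high_band = spectrum[len(spectrum) // 2:].sum()    # energy in the upper half of the band
    ratio = high_band / (spectrum.sum() + 1e-9)
    # Illustrative mapping: little high-frequency excitation -> smooth, slippery surface.
    if ratio < 0.1:
        return 0.2      # e.g. ice or packed snow
    if ratio < 0.3:
        return 0.5      # e.g. wet asphalt
    return 0.9          # e.g. dry asphalt
```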
Document DE 10 2008 047 750 A1 discloses a corresponding determination of traction using few sensors, in which torsional vibrations of a wheel of a vehicle are analyzed and a friction coefficient is estimated based on said analysis.
Document DE 10 2009 041 566 A1 discloses a method for determining a road friction coefficient μ in which a first constantly updated friction coefficient characteristic and a second friction coefficient variable that is only updated depending on the situation are combined into a joint estimated friction coefficient.
Document WO 2011/007015 A1 discloses a laser-based method for friction coefficient classification in motor vehicles.
Signals of a LiDAR or CV sensor aimed at the road surface are analyzed, and friction coefficients are subsequently assigned based on the amplitude of the signal reflected from the road surface. It can be estimated, for example, whether snow, asphalt, or ice makes up the road surface.
It is further known that images provided by one or several cameras in a vehicle can be interpreted in such a manner that conclusions with respect to the road surface can be drawn (e.g. based on reflections and brightness levels), which can also be used for friction coefficient classification. Since cameras observing the vehicle surroundings are becoming more and more common in driver assistance systems (e.g. for detecting lane departures, traffic signs, and objects), this solution can also be provided cost-efficiently as a software add-on. It has the advantage that the friction coefficient can be estimated proactively. The disadvantage is that this method does not allow a consistently precise interpretation, because interferences (other vehicles, light sources, etc.) may have a negative effect on the interpretation and result in misinterpretation.
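A minimal sketch of such an image-based road classification is given below; the brightness and reflection features, the thresholds, and the returned reliability values are assumptions for illustration only, not the classifier of any cited document.

```python
import numpy as np

def camera_friction_coefficient(road_region_gray: np.ndarray) -> tuple:
    """Estimate (mu_k, reliability) from a grayscale patch of the road surface ahead."""
    mean_brightness = float(road_region_gray.mean())
    saturated_fraction = float((road_region_gray > 240).mean())  # proxy for specular reflections
    if saturated_fraction > 0.05:
        return 0.4, 0.5    # many reflections: possibly wet road, low reliability
    if mean_brightness > 180:
        return 0.3, 0.7    # bright, homogeneous surface: possibly snow-covered
    return 0.9, 0.8        # default assumption: dry asphalt
```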
Document WO 2012/110030 A2 discloses a method and an apparatus for estimating the friction coefficient using a 3D camera, such as a stereo camera. At least one image of the vehicle environment is taken using the 3D camera. The image data of the 3D camera is used to create a height profile of the road surface in the entire area in front of the vehicle. The anticipated local coefficient of friction of the road surface in the area in front of the vehicle is estimated from said height profile.
The camera-based and wheel-speed-based approaches described above are each subject to the disadvantages outlined for them.
In view of the above, it is particularly an object of an embodiment of this invention to overcome or reduce the possibility of a misinterpretation of data with regard to estimation of a friction coefficient for a vehicle.
One basic idea of the solution according to the invention is a combination or merger of the two approaches: interpretation of the wheel speed signals and interpretation of the camera image of the surroundings. Suitable exchange and combined evaluation of the information from both subsystems allow a proactive friction coefficient assessment and minimize the risk of misinterpretation.
A method according to the invention for estimating the friction coefficient in a moving vehicle involves an analysis of image data of a forward-looking camera in the vehicle so as to allow conclusions with respect to the road surface. Classification of a friction coefficient from the camera (image) data provides a camera friction coefficient μk. Tire slip and tire vibration are analyzed based on a wheel speed signal. From this analysis of the vibration behavior of the tire, an excitation spectrum imposed by the road is determined, which correlates with the friction coefficient. This spectrum is then used to classify the friction coefficient, which provides a wheel friction coefficient μw. Such an analysis is described, for example, in document DE 10 2008 047 750 A1. Proactive estimation of the friction coefficient is performed as a merger of the camera and wheel friction coefficients, wherein the proactive friction coefficient estimation is primarily based on the camera friction coefficient μk, but the wheel friction coefficient μw is continuously taken into account to check the camera friction coefficient μk for plausibility.
An estimation of the friction coefficient can also be generated from the camera image of a stationary vehicle. But this friction coefficient estimate can only be checked for plausibility when the vehicle is moving.
In other words, both subsystems continuously provide their estimated friction coefficients μk (based on the camera) and μw (based on the wheel speed sensor), and preferably associated reliability information, to a central evaluation unit (e.g. a software module in a driver assistance or electronic brake controller). There, the proactive friction coefficient estimation is performed, substantially based on the camera signal (μ = μk). In addition, μw is constantly taken into account to increase robustness.
The camera data can advantageously be analyzed for a wet road surface.
The friction coefficient, also called coefficient of friction, adhesion coefficient, or friction factor, indicates the maximum force that can be transmitted between a road surface and a vehicle tire (e.g. in the tangential direction) and is therefore an essential parameter of the road condition. In addition to the road condition, tire properties are required for a complete determination of the friction coefficient. It is typically just road condition information that is considered for an estimate of the friction coefficient, e.g. from camera image data, since tire properties generally cannot be detected from camera image data. The term camera friction coefficient therefore always denotes an estimated friction coefficient that is determined by classifying the road conditions from the camera data. The wheel speed signals also contain information about the tire condition. However, the classification performed to determine the wheel friction coefficient typically also takes into account different road conditions.
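For reference, the relation behind this definition is standard Coulomb friction: the friction coefficient bounds the maximum transmissible tangential force relative to the wheel load. The numerical ranges below are typical textbook values, not measurements from this document.

```latex
F_{t,\max} = \mu \cdot F_N, \qquad
\mu \approx 0.8\text{--}1.0 \ (\text{dry asphalt}), \quad
\mu \approx 0.5\text{--}0.7 \ (\text{wet asphalt}), \quad
\mu \approx 0.1\text{--}0.3 \ (\text{snow or ice})
```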
One advantage of such a merger is its cost-efficient implementation. Moreover, the two systems complement one another advantageously: even very small differences in the road surface can be detected based on the wheel speed signals (when the vehicle passes over them), even if these were not visible in the image. Conversely, interferences in the image can quickly be confirmed as such if no change in the road surface is found after passing over the respective spot.
In a preferred embodiment, synchronism of the camera friction coefficient μk and the wheel friction coefficient μw is ensured by taking into account the travel speed of the vehicle. The actual speed of the vehicle is used here to determine when a road section lying ahead, to which a camera friction coefficient μk has already been assigned from a current camera image, will be passed over. Camera and wheel friction coefficients must be made congruent to be merged, but they cannot refer to the same road section at any given point in time, because the camera looks forward and cannot capture the road surface on which the tire is sitting at that moment. Advantageously, other vehicle sensor information on the movement of the vehicle, or odometer and time information, is considered to determine the movement of the vehicle relative to the image sections and in this way ensure synchronism.
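One way to implement this synchronization is to store each camera estimate together with the odometer position at which the observed road section will be under the wheels, and to match it against the wheel estimate once that position is reached. The following sketch uses hypothetical names and a simple odometer-based buffer; it is not a prescribed implementation.

```python
from collections import deque

class FrictionSynchronizer:
    """Buffers camera estimates until the observed road section is passed over."""

    def __init__(self):
        # Entries: (odometer position [m] at which the section lies under the wheels, mu_k, reliability)
        self._pending = deque()

    def add_camera_estimate(self, odometer_m, look_ahead_m, mu_k, reliability):
        self._pending.append((odometer_m + look_ahead_m, mu_k, reliability))

    def match_at(self, odometer_m, tolerance_m=1.0):
        """Return the buffered camera estimate for the road section now under the wheels, if any."""
        while self._pending and self._pending[0][0] < odometer_m - tolerance_m:
            self._pending.popleft()          # section passed without a usable match
        if self._pending and abs(self._pending[0][0] - odometer_m) <= tolerance_m:
            return self._pending.popleft()   # (position, mu_k, reliability)
        return None
```

At constant speed v, a section observed at look-ahead distance d is reached after roughly t = d / v, so time stamps can be used instead of odometer positions.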
Advantageously, associated reliability information that can be taken into account in the merger is assigned to each camera friction coefficient μk and wheel friction coefficient μw.
Preferably, a camera friction coefficient μk with associated high reliability information can be immediately released as the proactive friction coefficient μ. If low reliability information is associated with the camera friction coefficient μk, however, the system waits for confirmation from the wheel friction coefficient μw. If this confirmation is given, the confirmed camera friction coefficient can be released, or an averaged value can be output if there is a deviation. The reliability information may be used as a weighting when determining said averaged value. If the camera friction coefficient μk becomes unavailable, the system resorts to the wheel friction coefficient μw alone.
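The release logic described above could look roughly as follows; the reliability threshold, deviation tolerance, and weighting scheme are illustrative assumptions, not prescribed values.

```python
def merge_friction(mu_k, rel_k, mu_w, rel_w, high_reliability=0.8, deviation_tol=0.15):
    """Return the proactive friction coefficient to be released, or None while waiting."""
    if mu_k is None:
        return mu_w                        # camera estimate unavailable: use the wheel value only
    if rel_k >= high_reliability:
        return mu_k                        # reliable camera estimate is released immediately
    if mu_w is None:
        return None                        # low reliability: wait for confirmation by the wheel value
    if abs(mu_k - mu_w) <= deviation_tol:
        return mu_k                        # wheel value confirms the camera value
    # Deviation: output an average weighted by the reliability information.
    return (rel_k * mu_k + rel_w * mu_w) / (rel_k + rel_w)
```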
In an advantageous embodiment, suitable merger or analysis of the two pieces of information or friction coefficient estimates μk and μw, respectively, provides an opportunity to adapt the method to the characteristic field conditions of the vehicle or the driver by “learning” (e.g. the typical interferences in the image) during operation, thus increasing its availability and robustness. Typical interferences in the image are identified and learned in particular by checking the plausibility of the camera friction coefficient μk using the wheel friction coefficient μw if a deviation is detected in the process. Once the system is fully trained, the precision of friction coefficient estimation that the wheel speed analysis provides can also be achieved proactively.
Depending on the main use of the camera system, the camera can be a mono camera, a stereo camera, or a camera of a panoramic view system.
According to a preferred embodiment, information of the method for friction coefficient estimation according to the invention is made available to other driver assistance systems or drive dynamics control systems and can be used there in a known way to adjust the operating state accordingly (e.g. the ACC distance). Suitable information includes the proactive friction coefficient, the camera friction coefficient μk, the wheel friction coefficient μw, and/or the associated reliability information.
The invention further relates to an apparatus for estimating the friction coefficient in a moving vehicle using a forward-looking camera, at least one wheel speed sensor, and evaluation means.
The evaluation means are configured to analyze image data of the camera so as to allow conclusions with respect to the road surface and to perform a friction coefficient classification. Said friction coefficient classification provides a camera friction coefficient μk. The wheel speed sensor transmits wheel speed signals to the evaluation means. The evaluation means are further configured to analyze tire slip and tire vibration based on a wheel speed signal and to use this analysis to perform a classification of the friction coefficient, which provides a wheel friction coefficient μw. Proactive estimation of the friction coefficient is performed as a merger of the camera friction coefficient μk and the wheel friction coefficient μw, wherein the proactive friction coefficient estimation is primarily based on the camera friction coefficient μk, but the wheel friction coefficient μw is continuously taken into account to check the camera friction coefficient μk for plausibility. The evaluation means can be distributed across multiple evaluation units based on the modular principle. It is preferred that the images are analyzed in a controller of the camera system, which can transmit the camera friction coefficient μk to a bus system. The wheel speed analysis is performed in an EBS (electronic braking system) controller, which can transmit the wheel friction coefficient μw to the bus system. Taking the synchronism of the friction coefficient data into consideration, the merger is performed, for example, in a third controller that can receive the friction coefficients μk and μw from the bus system. In another preferred embodiment, the synchronization of the friction coefficient data and its subsequent merger can be performed in the controller of the camera system or in the EBS controller, which for this purpose can receive the other friction coefficient μw or μk, respectively, from the bus system. It is also conceivable that a central evaluation unit comprises all evaluation means and performs all evaluation steps.
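The modular distribution could be sketched as follows, with the camera controller and the EBS controller publishing their estimates on a vehicle bus and a third controller performing the merger; the message layout and controller names are assumptions, not a mandated partitioning.

```python
from dataclasses import dataclass

@dataclass
class FrictionMessage:
    source: str          # e.g. "camera_ecu" or "ebs_ecu"
    mu: float            # estimated friction coefficient
    reliability: float   # associated reliability information
    odometer_m: float    # position reference used for synchronization

class FusionController:
    """Third controller that receives mu_k and mu_w from the bus and merges them."""

    def __init__(self):
        self._latest = {}

    def on_bus_message(self, msg: FrictionMessage):
        self._latest[msg.source] = msg
        cam = self._latest.get("camera_ecu")
        wheel = self._latest.get("ebs_ecu")
        if cam is None or wheel is None:
            return None
        # Synchronization and merging as sketched above would be applied here;
        # a trivial placeholder simply prefers the more reliable of the two values.
        return cam.mu if cam.reliability >= wheel.reliability else wheel.mu
```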
The invention will be explained in more detail below with reference to the drawings and exemplary embodiments.
A reflecting road surface is visible in the camera image (gray scale image of a monocular camera) shown in the figure.
Reflection on the road surface can have various causes: a road surface wet from rain, but also fresh, dry pavement, can result in reflections. A road section comprising reflections is assigned a different friction coefficient μk than a section that does not reflect. However, the actual friction coefficient of a road wet from rain clearly differs from that of a freshly paved, dry road surface.
The flow diagram in the drawing illustrates an exemplary sequence of the method described above.
Furthermore, when a disparity between the camera friction coefficient and the wheel friction coefficient is recognized, the method can adjust the analysis of the image data so that the camera friction coefficient better matches the wheel friction coefficient in successive cycles of the method steps. For example, in the problematic case of a reflecting new road, the friction coefficient μk estimated from the camera image would initially have to be rated uncertain. After passing over such a road section and evaluating the friction coefficient μw obtained for it from a wheel speed signal, the system would learn this “interference” and determine μk correctly and reliably thereafter. This subsequent learning by checking the plausibility is a major advantage of the invention.
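A minimal sketch of this subsequent learning is given below: whenever the wheel value contradicts the camera value for a road section after it has been passed over, the correction learned for that image appearance (e.g. “reflecting surface”) is applied to later camera estimates. The appearance classes and tolerance are illustrative assumptions.

```python
class InterferenceLearner:
    """Learns typical image interferences from plausibility checks against the wheel value."""

    def __init__(self, tolerance=0.15):
        self._tolerance = tolerance
        self._learned = {}   # image appearance class -> friction value observed via the wheels

    def update(self, appearance_class, mu_k, mu_w):
        """Call after the road section predicted by the camera has been passed over."""
        if abs(mu_k - mu_w) > self._tolerance:
            # The camera was misled (e.g. reflections on fresh, dry pavement):
            # remember the wheel-based value for this appearance class.
            self._learned[appearance_class] = mu_w

    def corrected_mu_k(self, appearance_class, mu_k):
        """Use the learned value for appearance classes known to mislead the camera."""
        return self._learned.get(appearance_class, mu_k)
```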
Number | Date | Country | Kind |
---|---|---|---|
10 2012 112 725 | Dec 2012 | DE | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/DE2013/200340 | 12/9/2013 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2014/094767 | 6/26/2014 | WO | A |
Number | Date | Country |
---|---|---|
198 56 510 | Sep 1999 | DE |
198 54 964 | Jun 2000 | DE |
101 55 488 | May 2003 | DE |
102 56 726 | Jun 2004 | DE |
102004018088 | Feb 2005 | DE |
102004055069 | Feb 2006 | DE |
102004047914 | Mar 2006 | DE |
102004048637 | Apr 2006 | DE |
102006012289 | Sep 2007 | DE |
102008047750 | May 2009 | DE |
102010013339 | Jan 2011 | DE |
102009041566 | Mar 2011 | DE |
102011100907 | Jan 2012 | DE |
102010045162 | Mar 2012 | DE |
WO 2012117044 | Sep 2012 | DE |
102011081362 | Feb 2013 | DE |
0 412 791 | Feb 1991 | EP |
0 792 228 | Sep 1997 | EP |
0 827 127 | Mar 1998 | EP |
1 201 521 | May 2002 | EP |
2 521 111 | Nov 2012 | EP |
07-035522 | Feb 1995 | JP |
2005-226671 | Aug 2005 | JP |
1020110032422 | Mar 2011 | KR |
WO 2011007015 | Jan 2011 | WO |
WO 2012110030 | Aug 2012 | WO |
WO 2012113385 | Aug 2012 | WO |
WO 2013009697 | Jan 2013 | WO |
Number | Date | Country | |
---|---|---|---|
20150251659 A1 | Sep 2015 | US |