Method for evaluating image data of a vehicle camera taking into account information about rain

Information

  • Patent Grant
  • Patent Number
    9,508,015
  • Date Filed
    Friday, November 16, 2012
  • Date Issued
    Tuesday, November 29, 2016
Abstract
In a method for evaluating image data of a vehicle camera, information about raindrops on the vehicle's windshield within the field of view of the camera is taken into account in the evaluation of the image data for the detection and classification of objects in the environment of the vehicle. For example, depending on the number and the size of the raindrops on the windshield, different detection algorithms, image evaluation criteria, classification parameters, or classification algorithms are used for the detection and classification of objects.
Description
FIELD OF THE INVENTION

The invention relates to a method for evaluating image data of a vehicle camera, said method being in particular used in a driver assistance system.


BACKGROUND INFORMATION

Rain sensors and light sensors are already incorporated into many vehicles today in order to control the actuation of the windshield wipers or the vehicle lights. As more and more vehicles have integrated cameras as a basis for assistance or comfort functions, rain or light detection is also increasingly performed using a camera.


WO 2010/072198 A1 describes rain detection using a camera that is simultaneously used for automotive driver assistance functions. A bifocal optic is used for the rain detection, producing a sharp image of a partial region of the windshield on a partial region of the camera's image sensor.


EP 2057583 B1 describes a camera-based driver assistance function for the automatic control of the headlamps, which distinguishes the vehicle lights of preceding or oncoming vehicles from reflectors. The headlamps of one's own vehicle can thus be controlled automatically in such a way that blinding of the drivers of preceding or oncoming vehicles is prevented.


The range and distribution of the illumination provided by the headlamps can be adjusted according to the vehicles ahead and the oncoming vehicles.


Difficulties arise with camera-based driver assistance functions due to environmental effects such as rain or darkness at night, which can considerably affect the imaging quality of the camera.


SUMMARY OF THE INVENTION

In view of the above, it is an object of at least one embodiment of the invention to overcome or avoid these difficulties in the prior art.


A basic idea of the invention is to use information provided by a rain or light sensor system in order to correspondingly adjust assistance and object detection functions that are based on the data of a vehicle camera.


A method according to the invention for evaluating the image data of a vehicle camera (or for detecting objects by means of a vehicle camera) provides that information about raindrops on a window and/or information about the detected lighting conditions within the field of view of the vehicle camera is taken into account in the evaluation of the image data (or in the detection of objects).


Information about raindrops includes in particular the number and the size of the raindrops (or, more generally, of the precipitation particles), wherein precipitation particles such as hailstones, snowflakes, ice crystals and dirt particles, as well as raindrops, are regarded as "raindrops" in the sense of the presently claimed invention.


Information about the detected lighting conditions is in particular the brightness of the surroundings (e.g. day/night, driving through tunnels); individual light sources such as street lights or vehicle lights can also be part of the detected lighting conditions.


It is regarded as an advantage that the evaluation of the image data or the detection of objects becomes more reliable because the information about raindrops or about the lighting conditions is taken into account. The detection reliability of objects can thus be better estimated, so that even difficult situations can be handled by the camera system.


According to a preferred embodiment the information (raindrops or lighting conditions) is determined from the image data. This means that the vehicle camera serves simultaneously as a rain and/or light sensor, e.g. within a partial region of the image sensor as shown in WO 2010/072198 A1. The detection of rain and/or the lighting conditions is now taken into account for the (further) evaluation of image data or object detection for driver assistance functions.


In particular, the effect of rain and/or light on the camera functions can thus be better estimated, because the same camera detects the weather/lighting situation directly, and the resulting visibility properties can therefore be estimated most accurately.


Advantageously, in the evaluation of the image data, at least one criterion regarding the detection of edges in the image (e.g. a threshold value) can be varied as a function of the information about raindrops on the window and/or the detected lighting conditions within the field of view of the vehicle camera.


For example, the influence of rain on the edges seen by the camera (light/dark or color transitions) can be estimated from a detected rain intensity. These edge transitions are mostly smoother in the event of rain, which means that the edge gradient is less steep than it would be without rain. Edge-based evaluation methods should therefore have their threshold values adjusted accordingly. Correspondingly, multiple parameterizations can be provided, and the one matching the detected weather condition can be used.
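

As a minimal sketch of such a parameterization (in Python; the threshold values and condition names are invented for illustration, the patent gives no concrete numbers):

    import numpy as np

    # Invented thresholds per detected weather condition; in rain the edge
    # gradients are less steep, so a lower threshold is used.
    EDGE_THRESHOLDS = {"dry": 40.0, "light_rain": 25.0, "heavy_rain": 15.0}

    def detect_edges(gray: np.ndarray, weather: str) -> np.ndarray:
        """Return a boolean edge map using the condition-specific threshold."""
        gy, gx = np.gradient(gray.astype(float))
        magnitude = np.hypot(gx, gy)
        return magnitude >= EDGE_THRESHOLDS[weather]

    # Usage with a synthetic 8-bit image and a condition reported by the rain sensor.
    frame = (np.random.rand(480, 640) * 255).astype(np.uint8)
    edges = detect_edges(frame, weather="heavy_rain")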


In particular, a quality criterion of the image data can be derived from the information and taken into account in the evaluation of the image data.


Preferably, individual assistance functions can be switched off entirely at a certain rain intensity, if the quality of the sensor signals is no longer sufficient, i.e. if the quality criterion of the image data falls below a minimum value.


Advantageously, assistance functions providing speed control are restricted with regard to the maximum controllable speed. In particular, it might no longer be possible to activate an ACC (Adaptive Cruise Control) in heavy rain at higher speeds, and this would be communicated to the driver. The maximum speed which can be activated or controlled is preferably determined as a function of a quality criterion of the image data.
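

A minimal sketch of this gating, assuming a simple quality mapping and invented limits (the minimum quality of 0.3 and the 180 km/h ceiling are not from the patent):

    from typing import Optional

    MIN_QUALITY = 0.3          # assumed: below this, assistance functions switch off
    ABSOLUTE_MAX_KMH = 180.0   # assumed ACC speed ceiling under ideal conditions

    def image_quality(rain_intensity: float) -> float:
        """Map a normalized rain intensity (0 = dry, 1 = cloudburst) to a
        quality criterion of the image data in [0, 1]."""
        return max(0.0, 1.0 - rain_intensity)

    def acc_max_speed_kmh(quality: float) -> Optional[float]:
        """Maximum speed that may be activated or controlled; None means
        the ACC cannot be activated at all."""
        if quality < MIN_QUALITY:
            return None        # signal quality no longer sufficient
        return ABSOLUTE_MAX_KMH * quality

    q = image_quality(rain_intensity=0.5)
    limit = acc_max_speed_kmh(q)
    print("ACC unavailable" if limit is None else "ACC limited to %.0f km/h" % limit)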


According to an advantageous embodiment, "blockage detection" can be performed. Usually, the windshield wipers are turned on in the event of rain, and the wiper blades may temporarily occlude regions of the image. In this case, the tracking of objects across subsequent images of an image sequence (object tracking) can be made more robust against failures in individual images. Objects can thus be assessed as valid across multiple cycles, even if individual measurements are missing.


This preferably also applies to the detection of a gush of water, because here, too, objects may no longer be detectable in individual images of an image sequence. If a gush of water has been detected, functions which have already been correctly triggered can advantageously remain active, while the triggering of new functions is prevented.
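

One possible realization of this behavior is the following "coasting" sketch; the track structure and the limit of three consecutive missed cycles are assumptions, not taken from the patent:

    from dataclasses import dataclass

    # Assumed tuning value: how many consecutive missed measurements a
    # track survives before it is invalidated.
    MAX_MISSES = 3

    @dataclass
    class Track:
        object_id: int
        misses: int = 0
        valid: bool = True

        def update(self, measured: bool) -> None:
            """Reset the miss counter on a measurement; otherwise coast
            until MAX_MISSES consecutive failures invalidate the track."""
            if measured:
                self.misses = 0
            else:
                self.misses += 1
                if self.misses > MAX_MISSES:
                    self.valid = False

    track = Track(object_id=7)
    for measured in (True, False, False, True):  # wiper briefly occludes the object
        track.update(measured)
    print(track.valid)  # True: the track survived two missed frames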


DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS OF THE INVENTION

The invention will now be explained in greater detail with reference to exemplary embodiments.


With the camera, vehicle rear ends, for example, can be detected and classified. Initially, this detection is edge-based.


In a first step, the image is searched for basic properties such as rear lamps, vehicle outlines, or a shadow beneath the vehicle.


As there are usually no vehicle shadows in the event of rain, greater emphasis can be put on the lamps or on edge detection in this case, for example, as in the weighting sketch below.
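

The following sketch illustrates such condition-dependent cue weighting; the condition names and weights are invented for illustration and would have to be tuned in practice:

    # Invented cue weights per environment condition, for illustration only.
    CUE_WEIGHTS = {
        "dry_day":  {"lamps": 0.2, "edges": 0.4, "shadow": 0.4},
        "rain_day": {"lamps": 0.4, "edges": 0.6, "shadow": 0.0},  # shadows vanish in rain
        "night":    {"lamps": 0.9, "edges": 0.1, "shadow": 0.0},  # lamps dominate at night
    }

    def rear_end_score(cue_scores: dict, condition: str) -> float:
        """Fuse per-cue detector scores (each in [0, 1]) into one confidence."""
        weights = CUE_WEIGHTS[condition]
        return sum(weights[cue] * cue_scores.get(cue, 0.0) for cue in weights)

    print(rear_end_score({"lamps": 0.8, "edges": 0.5}, condition="night"))  # 0.77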


At night, lamp detection can be weighted even more strongly, because in the darkness hardly any other property remains visible. It is in particular the illumination which plays an essential role in the number and quality of the measuring points.


As a general rule, at night the visual range of some functions is practically limited to the illumination range of the headlamps. Here, the illumination limit can be used as the detection limit for the reliable detection of non-illuminated objects in the surroundings of the vehicle.
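

A small sketch of this detection limit, assuming rough headlamp ranges (the figures are illustrative, not from the patent):

    # Rough assumed headlamp ranges in metres; actual values depend on the lamps.
    HEADLAMP_RANGE_M = {"low_beam": 65.0, "high_beam": 140.0}

    def detection_limit_m(function_range_m: float, night: bool, beam: str) -> float:
        """Cap the detection range for non-illuminated objects at night by
        the illumination range of the active headlamps."""
        if not night:
            return function_range_m
        return min(function_range_m, HEADLAMP_RANGE_M[beam])

    print(detection_limit_m(120.0, night=True, beam="low_beam"))  # -> 65.0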


In a subsequent step, the classification of the detected vehicles can likewise be adjusted to the weather situation, both in its parameters and in the way in which it is performed. For example, special classifiers can be used for different weather conditions.


It is possible, for example, to develop different classification algorithms or parameter sets for different weather situations. Here, too, the classification algorithm is then used which has been trained for the prevailing ambient conditions (rain/light).
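

One way to realize this selection is a registry of condition-specific classifiers, as sketched below; the keys, the dummy decision rules and the fallback are assumptions standing in for trained models:

    from typing import Callable, Dict, List

    Classifier = Callable[[List[float]], str]

    # Each entry stands in for a model trained under the corresponding condition.
    CLASSIFIERS: Dict[str, Classifier] = {
        "dry":  lambda f: "vehicle" if f[0] > 0.50 else "other",
        "rain": lambda f: "vehicle" if f[0] > 0.35 else "other",  # lower bar in rain
    }

    def classify(features: List[float], ambient: str) -> str:
        """Use the classifier trained for the ambient condition, falling back
        to the generic model if no special one exists."""
        return CLASSIFIERS.get(ambient, CLASSIFIERS["dry"])(features)

    print(classify([0.4], ambient="rain"))  # -> "vehicle"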


Depending on the weather situation, individual detection algorithms can also be switched off while others are relied on more heavily.


Furthermore, the signal qualities of the objects provided by the camera or by other weather-dependent sensors, such as lidar or PMD (Photonic Mixing Device) sensors, can be better assessed, so that in particular a quality criterion of the image data can be derived from the information about rain or lighting conditions.


Apart from edge detection and the classification algorithms, the gathering of information about color can also be adjusted.


Lane markings often appear as a black line on light ground at night and in the event of rain, irrespective of their actual color (yellow/white). Based on this knowledge, algorithms for lane marking detection can be designed to be more robust.
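

A sketch of this adjustment on a single image scan line, assuming a gradient threshold of 20 grey levels (an invented value):

    import numpy as np

    def marking_candidates(scanline: np.ndarray, night_rain: bool) -> np.ndarray:
        """Find marking candidates along one image row."""
        diff = np.diff(scanline.astype(float))
        if night_rain:
            return np.where(diff < -20)[0]  # markings appear as black lines
        return np.where(diff > 20)[0]       # markings appear as bright lines

    row = np.array([90, 92, 91, 40, 38, 88, 90], dtype=np.uint8)  # dark marking
    print(marking_candidates(row, night_rain=True))  # index of the falling edge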


Another option is to activate or adapt the vehicle lighting (low beam, high beam, fog lamps) accordingly in order to obtain any remaining information about color. Preferably, special low-glare lamps with a corresponding color temperature can also be installed on the vehicle and activated in addition.


For highly automated driving up to autonomous driving, it is of particular interest how far ahead (in time, or in distance along the path in the direction of travel) a defined signal reliability can be assured. Highly automated systems can thus adjust the driving speed accordingly; driving faster would then be possible only manually.
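

As a worked illustration (with an assumed reaction time and deceleration; neither figure comes from the patent), the speed bound follows from requiring the stopping distance to stay within the range for which the signal reliability is assured:

    REACTION_TIME_S = 0.5      # assumed system reaction time
    DECELERATION_MS2 = 6.0     # assumed comfortable wet-road deceleration

    def max_automated_speed_ms(reliable_range_m: float) -> float:
        """Largest v satisfying v*t_r + v**2 / (2*a) <= reliable_range_m,
        i.e. the positive root of v**2 + 2*a*t_r*v - 2*a*d = 0."""
        a, t, d = DECELERATION_MS2, REACTION_TIME_S, reliable_range_m
        return -a * t + (a * a * t * t + 2.0 * a * d) ** 0.5

    # With signal reliability assured only up to low-beam range (65 m),
    # the automated speed would be capped at roughly 90 km/h.
    print("%.0f km/h" % (max_automated_speed_ms(65.0) * 3.6))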

Claims
  • 1. A method comprising steps: a) with a camera in a vehicle, producing first image data representing an object in outside surroundings of the vehicle and second image data representing precipitation particles on an area of a windshield of the vehicle; b) evaluating the second image data to determine precipitation particle information including a number and/or a size of the precipitation particles on the area of the windshield; and c) evaluating the first image data by applying at least one evaluation algorithm comprising an edge detection algorithm with at least one evaluation parameter to detect and classify the object, wherein the evaluation parameter comprises an edge detection threshold value for determining which ones of image transitions in the first image data are regarded as edges, and selecting or changing the edge detection threshold value in response to and dependent on the precipitation particle information.
  • 2. The method according to claim 1, further comprising providing plural different parameterizations, and wherein the selecting or changing in the step c) comprises selecting, as the edge detection threshold value, a respective one of the parameterizations in response to and dependent on the precipitation particle information.
  • 3. The method according to claim 1, further comprising providing plural different classifiers for classifying the object, and wherein the step c) further comprises selecting, as the evaluation algorithm for classifying the object, one of the classifiers in response to and dependent on the precipitation particle information.
  • 4. The method according to claim 1, further comprising providing plural different classification algorithms that have been respectively trained for operation under respective different conditions of the precipitation particle information, and wherein the step c) further comprises selecting, as the evaluation algorithm for classifying the object, a respective one of the classification algorithms that has been trained for operation at the respective condition corresponding to the precipitation particle information determined in the step b).
  • 5. The method according to claim 1, further comprising providing plural different detection algorithms for detecting the object, and wherein the step c) further comprises selecting, as the evaluation algorithm for detecting the object, a respective one of the detection algorithms in response to and dependent on the precipitation particle information.
  • 6. The method according to claim 1, wherein the precipitation particles are raindrops.
  • 7. The method according to claim 1, wherein the precipitation particles are hailstones, snowflakes or ice crystals.
  • 8. The method according to claim 1, further comprising comparing the precipitation particle information to a threshold requirement, and switching off a driver assistance function of a driver assistance system of the vehicle when the precipitation particle information fails to satisfy the threshold requirement.
  • 9. The method according to claim 1, further comprising adjusting a maximum speed limitation of an adaptive cruise control of the vehicle in response to and dependent on the precipitation particle information.
  • 10. A method comprising steps: a) with a camera in a vehicle, producing first image data representing an object in outside surroundings of the vehicle and second image data representing precipitation particles on an area of a windshield of the vehicle; b) evaluating the second image data to determine precipitation particle information including a number and/or a size of the precipitation particles on the area of the windshield; and c) evaluating the first image data at least by gathering color information from the first image data to detect and classify the object, and changing the gathering of the color information in response to and dependent on the precipitation particle information.
Priority Claims (1)
Number Date Country Kind
10 2011 056 051 Dec 2011 DE national
PCT Information
Filing Document Filing Date Country Kind
PCT/DE2012/100350 11/16/2012 WO 00
Publishing Document Publishing Date Country Kind
WO2013/083120 6/13/2013 WO A
US Referenced Citations (65)
Number Name Date Kind
5923027 Stam et al. Jul 1999 A
5987152 Weisser Nov 1999 A
6323477 Blasing et al. Nov 2001 B1
6331819 Hog Dec 2001 B1
6376824 Michenfelder et al. Apr 2002 B1
6392218 Kuehnle May 2002 B1
6555804 Blasing Apr 2003 B1
6617564 Ockerse et al. Sep 2003 B2
6841767 Mindl et al. Jan 2005 B2
7130448 Nagaoka et al. Oct 2006 B2
7253898 Saikalis et al. Aug 2007 B2
7259367 Reime Aug 2007 B2
7609857 Franz Oct 2009 B2
7612356 Utida et al. Nov 2009 B2
7646889 Tsukamoto Jan 2010 B2
7804980 Sasaki Sep 2010 B2
7855353 Blaesing et al. Dec 2010 B2
7863568 Fleury Jan 2011 B2
8270676 Heinrich et al. Sep 2012 B2
8274562 Walter et al. Sep 2012 B2
8541732 Rothenhaeusler Sep 2013 B2
8548200 Suzuki et al. Oct 2013 B2
8913132 Seger et al. Dec 2014 B2
9058643 Cord et al. Jun 2015 B2
20010028729 Nishigaki et al. Oct 2001 A1
20020148987 Hochstein Oct 2002 A1
20030138133 Nagaoka et al. Jul 2003 A1
20030201380 Ockerse et al. Oct 2003 A1
20040004456 LeBa et al. Jan 2004 A1
20040165749 Holz et al. Aug 2004 A1
20050035926 Takenaga et al. Feb 2005 A1
20050178954 Yukawa Aug 2005 A1
20050206511 Heenan et al. Sep 2005 A1
20050231725 Franz Oct 2005 A1
20050254688 Franz Nov 2005 A1
20050276447 Taniguchi et al. Dec 2005 A1
20060076477 Ishikawa Apr 2006 A1
20060163458 Reime Jul 2006 A1
20060228001 Tsukamoto Oct 2006 A1
20070047809 Sasaki Mar 2007 A1
20070053671 Garg et al. Mar 2007 A1
20070216768 Smith et al. Sep 2007 A1
20070267993 Leleve et al. Nov 2007 A1
20070272884 Utida et al. Nov 2007 A1
20080192984 Higuchi Aug 2008 A1
20090085755 Schafer et al. Apr 2009 A1
20090128629 Egbert et al. May 2009 A1
20100208060 Kobayashi et al. Aug 2010 A1
20110031921 Han Feb 2011 A1
20110043624 Haug Feb 2011 A1
20110098716 Peterson et al. Apr 2011 A1
20110128543 Choi Jun 2011 A1
20110204206 Taoka Aug 2011 A1
20110253917 Rothenhaeusler Oct 2011 A1
20110273564 Seger et al. Nov 2011 A1
20110273582 Gayko et al. Nov 2011 A1
20120026318 Huelsen et al. Feb 2012 A1
20120026330 Huelsen et al. Feb 2012 A1
20120153154 Rothenhaeusler et al. Jun 2012 A1
20130235381 Kroekel et al. Sep 2013 A1
20130245945 Morita Sep 2013 A1
20140300738 Mueller Oct 2014 A1
20140321709 Kasahara et al. Oct 2014 A1
20150070499 Roelke et al. Mar 2015 A1
20150332099 Kosubek et al. Nov 2015 A1
Foreign Referenced Citations (43)
Number Date Country
44 17 385 Nov 1995 DE
195 04 606 Aug 1996 DE
197 04 818 Aug 1997 DE
103 01 468 Oct 2003 DE
102 30 200 Jan 2004 DE
197 00 665 Jul 2004 DE
103 03 046 Oct 2004 DE
103 16 794 Nov 2004 DE
102004015040 Oct 2005 DE
102004037871 Mar 2006 DE
102005004513 Mar 2006 DE
102006008274 Aug 2007 DE
102007061725 Jun 2009 DE
102008001679 Nov 2009 DE
102008043737 May 2010 DE
0 832 798 Apr 1998 EP
1 580 092 Sep 2005 EP
1 637 837 Mar 2006 EP
1 764 835 Mar 2007 EP
1 826 648 Aug 2007 EP
1 962 254 Aug 2008 EP
2 057 583 May 2009 EP
2 230 496 Sep 2010 EP
2 381 416 Oct 2011 EP
08-030898 Feb 1996 JP
2001-160146 Jun 2001 JP
2003-315256 Nov 2003 JP
2005-292544 Oct 2005 JP
2006-184844 Jul 2006 JP
2006-227876 Aug 2006 JP
2007-228448 Sep 2007 JP
2009-092453 Apr 2009 JP
2010-096604 Apr 2010 JP
2011-165050 Aug 2011 JP
WO 03029757 Apr 2003 WO
WO 03093864 Nov 2003 WO
WO 2005075248 Aug 2005 WO
WO 2006015905 Feb 2006 WO
WO 2006024247 Mar 2006 WO
WO 2009020918 Feb 2009 WO
WO 2010072198 Jul 2010 WO
WO 2010084707 Jul 2010 WO
WO 2011098716 Aug 2011 WO
Non-Patent Literature Citations (5)
Entry
International Search Report of the International Searching Authority for International Application PCT/DE2012/100350, mailed Mar. 18, 2013, 3 pages, European Patent Office, HV Rijswijk, Netherlands.
PCT International Preliminary Report on Patentability including English Translation of PCT Written Opinion of the International Searching Authority for International Application PCT/DE2012/100350, issued Jun. 10, 2014, 7 pages, International Bureau of WIPO, Geneva, Switzerland.
German Search Report for German Application No. 10 2011 056 051.3, dated Oct. 12, 2012, 5 pages, Muenchen, Germany, with English translation, 5 pages.
Partial English translation of Japanese Office Action in Japanese Patent Application No. 2014-543769, mailed Jul. 6, 2016, 1 page.
Corrected partial English translation of Japanese Office Action in Japanese Patent Application No. 2014-543769, mailed Jul. 6, 2016, 1 page.
Related Publications (1)
Number Date Country
20150220792 A1 Aug 2015 US