The invention relates to a method of raindrop detection on a vehicle windscreen by processing images, in particular in order to trigger the automatic control of a vehicle functionality, such as the windscreen wipers.
In motor vehicles, several driving assistance systems are known that use images captured by one or several cameras.
The images obtained can be processed to allow a display on screens, for example at the dashboard or projected on the windscreen, in particular to alert the driver in the case of danger or simply to improve his visibility. The images can also make it possible to detect raindrops or fog on the windscreen. These images can participate in the automatic triggering of a functionality of the vehicle (alert to the driver, automatic triggering of braking, automatic triggering of the windscreen wipers in the case of detection of drops of water on the windscreen, visual or audible warning, control of certain functions of the headlights, etc.).
Image processing techniques aimed at detecting raindrops generally require the camera to be focused on the windscreen. Techniques have then been developed to detect the presence of raindrops from information contained in these captured images.
An object of the invention is to propose an improved method of raindrop detection requiring less computation time and less memory capacity.
In a first aspect of the present invention, this object is achieved by a method of raindrop detection on a vehicle windscreen by capturing images using a camera which is at least focused on the windscreen, including a step of detecting edges in a research area of said captured images, characterized in that said method of raindrop detection comprises a rejection step in which edges that are uncharacteristic with respect to a raindrop are rejected.
Indeed, the study of the morphology of droplets on windscreens has shown that droplets deposited on a windscreen have specific shape or luminance properties that make it possible to reject edges that are not raindrop edges.
In a further embodiment, said rejection step includes a stage in which closed edges are selected while unclosed edges are rejected. Indeed, the raindrop edge shape, elliptic or approximately round, is characteristic of droplets.
Said rejection step may include one or more of the following stages:
In a further embodiment, extracted connected components are rejected only if the vehicle speed is below a predetermined speed threshold. Indeed, at low speed, raindrops falling on the windscreen form clean, well-defined contours that can be reliably interpreted.
Said method of raindrop detection may further include a tracking step in which each extracted connected component in a current captured image is tracked in the next captured image by comparing the grey level of the extracted connected component with the grey level at the corresponding position in the next captured image.
This method makes it possible to reliably distinguish between raindrops and other spots such as lane markings or lamps without requiring significant calculation time or resources.
The invention also relates to a method of raindrop detection on a vehicle windscreen by capturing images using a camera which is at least focused on the windscreen, including a step of detecting edges in a research area of said captured images, characterized in that said method of raindrop detection comprises a rejection step in which edges that are uncharacteristic with respect to a raindrop are rejected, wherein this rejection step comprises at least the following sequence:
This particular stage order optimizes the rejection of closed edges, thereby reducing calculation time.
The invention further relates to a driving assistance device comprising:
characterized in that the processing means are further configured to reject edges that are uncharacteristic with respect to a raindrop.
According to an embodiment, said camera comprises a bifocal objective providing a first zone that is focused to infinity and a second zone focused on the windscreen to detect raindrops.
Other features and advantages of the invention will become apparent from the following description of a non-limited example and enclosed drawings.
For a better understanding of the invention with regard to the embodiments thereof, reference is made to the accompanying drawings, in which like numerals designate corresponding elements or sections throughout, and in which:
a is a graphic of the luminance of a selected raindrop closed edge,
b is a graphic similar to
a is a 3D representation of the selected raindrop closed edge of
b is a 3D representation of the rejected lamp closed edge of
The driving assistance device comprises a camera mounted onboard a motor vehicle.
The camera is installed in the cab interior of the vehicle, opposite a “wipable” zone of the vehicle windscreen, that is to say a zone which is swept by one of the wiper blades while in operation. The camera takes successive images of the road in front of the vehicle through the vehicle windscreen. For example, the camera is mounted behind the windscreen wiped area, at approximately ten to twenty centimeters from it.
As used herein, the term “camera” designates any device for acquiring images of the camera type (more particularly of the CCD (“Charge-Coupled Device”) or CMOS (“Complementary Metal-Oxide Semiconductor”) type) or photosensitive sensor, for example a black and white sensor or a color sensor. The camera is generally sensitive in the visible range and/or in the infrared range, in particular in the near infrared.
According to an embodiment of the driving assistance device, the camera comprises a bifocal (or multi-focal) objective providing a first zone 2 that is focused to infinity and a second zone 3 focused on the windscreen (image 1 on
It is thus possible to capture images in the distance with the adapted focal distance, and close by with a shorter focal distance. Therefore, methods for detecting fog, lane edges, road verges, pedestrians or obstacles can use the portions of images captured at a distance, which are sufficiently sharp by virtue of the zone of the objective focused to infinity, while the method of raindrop detection can use the portions of images taken at a very short distance, on the windscreen, through the zone of the objective focused on the windscreen.
The driving assistance device also comprises processing means configured to:
The processing means are further configured to implement a method of raindrop detection on a vehicle windscreen 100 (
Said method of raindrop detection 100 comprises the following steps.
In a first step 101, a research area is defined in the second zone 3 of said captured images. This research area may be situated in the lower part of the windscreen wiped area, corresponding to the road images (image 1 on
The method of raindrop detection 100 comprises a second step 102 where edges are detected in said research area. In non-limiting examples, edge detection methods such as the Sobel, Prewitt, Roberts, Zero-cross, Canny methods etc. can be used.
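Any of the cited operators can implement step 102. The following is a minimal pure-Python sketch of the Sobel variant; the threshold value and image layout are illustrative assumptions, not taken from the description:

```python
import math

# Sobel convolution kernels for horizontal and vertical gradients.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_edges(img, threshold=100):
    """Binary edge map: threshold the Sobel gradient magnitude.
    `img` is a list of rows of grey levels; border pixels are skipped."""
    h, w = len(img), len(img[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gy = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    v = img[y + dy][x + dx]
                    gx += SOBEL_X[dy + 1][dx + 1] * v
                    gy += SOBEL_Y[dy + 1][dx + 1] * v
            edges[y][x] = math.hypot(gx, gy) > threshold
    return edges

# A bright square on a dark background: edges appear on its border only.
img = [[0] * 8 for _ in range(8)]
for y in range(2, 6):
    for x in range(2, 6):
        img[y][x] = 200
edges = sobel_edges(img)
```

The gradient magnitude is high where grey levels change abruptly (the square's border) and zero over uniform regions, which is exactly the property the subsequent closed-edge selection relies on.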
The method of raindrop detection 100 also comprises a rejection step 103 in which edges that are uncharacteristic with respect to a raindrop are rejected. Indeed, the study of the morphology of droplets on windscreens has shown that droplets deposited on a windscreen have specific shape or luminance properties that make it possible to reject edges that are not raindrop edges.
In a first stage 103a of this rejection step 103, closed edges are selected while unclosed edges are rejected. Indeed, the raindrop edge shape, elliptic or approximately round, is characteristic of droplets.
A binary image resulting from this first stage 103a is represented on image 4 on
The rejection step 103 may further include a stage 103b followed by one or more of the stages 103c, 103d, 103e, 103f or 103h.
In stage 103b, connected components are extracted.
This stage 103b uses a classical image processing algorithm based on graph theory. For each pixel, the algorithm checks the values of neighbouring pixels to isolate or merge pixels within the research area of the captured image. This stage 103b creates labels of connected components; for example, labels 6 correspond to lane spots (image 4).
In stage 103c, selected closed edges (also referred to as “extracted connected components” or “labeled components”) that have geometric criteria different from raindrop ones are rejected.
For example, if the closed edge height is greater than the closed edge width, then said closed edge is rejected. In another example, if the closed edge height is lower than 2 pixels or if the closed edge width is lower than 3 pixels, then the closed edge is rejected. In a further example, if the closed edge size is greater than ⅓ of the research area, then the closed edge is rejected.
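The geometric criteria of stage 103c can be combined into one predicate; the comparisons below use the example values from the text, applied to a component's bounding box and pixel count:

```python
def reject_by_geometry(height, width, area, research_area_size):
    """Reject a closed edge whose bounding box does not match raindrop
    geometry, using the example criteria from the description."""
    if height > width:                      # drops are wider than tall
        return True
    if height < 2 or width < 3:             # too small to be a drop
        return True
    if area > research_area_size / 3:       # too large for the research area
        return True
    return False
```

A component passing all three tests is kept for the later luminance-based stages; any single failed test is enough for rejection.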
In stage 103d, selected closed edges (also referred to as “extracted connected components” or “labeled components”) that are inclined below a predetermined angle threshold, such as below 45°, are selected, while closed edges that are inclined over said predetermined angle are rejected.
The angle of a selected closed edge is taken between the closed edge major axis and the horizontal. Closed edges showing an inclination over said predetermined angle threshold are characteristic of lane marking closed edges.
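One way to obtain the major axis angle of stage 103d is from the component's second-order central moments; this is a standard approximation and only an assumed realisation of the stage:

```python
import math

def major_axis_angle(pixels):
    """Orientation of a component's major axis relative to the horizontal,
    in degrees, from second-order central moments of its (x, y) pixels."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    mu20 = sum((x - cx) ** 2 for x, _ in pixels) / n
    mu02 = sum((y - cy) ** 2 for _, y in pixels) / n
    mu11 = sum((x - cx) * (y - cy) for x, y in pixels) / n
    return abs(math.degrees(0.5 * math.atan2(2 * mu11, mu20 - mu02)))

def reject_by_inclination(pixels, angle_threshold=45.0):
    """Stage 103d: reject components inclined over the threshold."""
    return major_axis_angle(pixels) > angle_threshold

# A flat horizontal blob (drop-like) is kept; a steep streak (lane-like)
# rising 3 pixels per column is rejected (its axis is ~71.6 degrees).
flat = [(x, 0) for x in range(10)] + [(x, 1) for x in range(10)]
steep = [(x, 3 * x) for x in range(10)]
```

The moment-based angle is robust for elongated components such as lane markings, which is the case this stage targets.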
An example is shown in
In stage 103e, closed edges (also referred to as “extracted connected components” or “labeled components”) whose grey level variation differs from the raindrop grey level variation, which has one minimum and one maximum, are rejected.
The grey level variation (or profile) is obtained from a line of the closed edge, taken for example in the middle of said closed edge. The raindrop profile presents a dark/light and a light/dark transition, which are characteristic. Closed edges showing only one extremum, however, are characteristic of lamps.
a and 3b show two examples of grey level profiles according a pixel line in a raindrop closed edge (
In stage 103f, selected closed edges (also referred to as “extracted connected components” or “labeled components”) having a predetermined number of pixels with a grey level higher than a predetermined grey level threshold are rejected.
Two examples are shown in
In
In
Thus, closed edges of
Those stages 103c, 103e and 103f can be implemented as a function of the vehicle speed. That is to say,
The predetermined speed threshold is for example 10 km/h.
Indeed, at low speed, raindrops falling on the windscreen form clean, well-defined contours that can be reliably interpreted.
In stage 103h, if two or more closed edges (also referred to as “extracted connected components” or “labeled components”) have previously been rejected by stage 103f within a predetermined distance threshold in a critical research area, the selected closed edges between them are rejected. Indeed, these closed edges may correspond to another vehicle's taillights. An example is shown in
The critical research area 11 may be the middle of the lower zone 2 of the windscreen wiped area. The two closed edges 12a, 12b, corresponding to a followed vehicle 13, have a predetermined number of pixels with a grey level higher than a predetermined grey level threshold, so they were rejected during stage 103f.
A supplementary red color condition or a vehicle stop condition may also be applied.
Moreover, the selected closed edge 14 between the two rejected closed edges 12a, 12b is also rejected, as it is likely to correspond to the reflection of the vehicle's beams on said vehicle ahead 13.
Said method of raindrop detection may further include a tracking step 104 in which each selected closed edge (also referred to as “extracted connected component” or “labeled component”) in a current captured image is tracked in the next captured image by comparing the grey level of the selected closed edge with the grey level at the corresponding position in the next captured image.
Indeed, from a current image n captured at t to a next image n+1 captured at t+1, each detected raindrop in the current captured image n can be tracked in the next captured image n+1, because the changing environment of a raindrop can prevent its detection by the edge detection means.
This tracking step may calculate a ratio representing the difference between each last detected raindrop and its corresponding zone in the current image. The ratio can be, for example, an accumulator of absolute differences between the grey level of each pixel of a drop and that of the corresponding pixel in the current image.
This tracking step 104 may be performed only in low luminosity, such as at night (block 105). For example, this tracking step evaluates the mean grey level of the captured image pixels. The comparison of that mean with a predetermined mean threshold gives an indication of the luminosity at the windscreen. During the day or at a high luminosity level, captured images are precise enough, but during the night or at a low luminosity level, the raindrop tracking may be useful. The raindrop tracking may thus be conditioned on this preliminary luminosity test.
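The accumulator of absolute differences can be sketched as a normalised sum; the data layout (a mapping from pixel coordinates to grey levels) is an assumption for illustration:

```python
def sad_ratio(drop_pixels, next_image):
    """Tracking step 104: mean absolute grey level difference between a
    detected drop and the same positions in the next image.
    `drop_pixels` maps (y, x) -> grey level in the current image;
    `next_image` is the next captured image as a list of rows."""
    total = sum(abs(g - next_image[y][x])
                for (y, x), g in drop_pixels.items())
    return total / len(drop_pixels)

# A drop whose pixels barely change between frames yields a low ratio,
# so it is still considered present at the same windscreen position.
drop = {(0, 0): 100, (0, 1): 110, (1, 0): 105, (1, 1): 115}
next_img = [[102, 111],
            [104, 118]]
ratio = sad_ratio(drop, next_img)
```

A low ratio means the grey levels at the drop's position are nearly unchanged, so the drop is deemed still present even if edge detection missed it in the new frame.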
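The preliminary luminosity test of block 105 amounts to comparing the image mean with a threshold; the threshold value below is an assumed placeholder:

```python
def is_low_luminosity(image, mean_threshold=60):
    """Block 105: decide whether tracking should run, by comparing the
    mean grey level of the captured image with a predetermined
    threshold (the value 60 is illustrative, not from the text)."""
    pixels = [g for row in image for g in row]
    return sum(pixels) / len(pixels) < mean_threshold

night = [[10, 20], [15, 25]]      # mean 17.5: low luminosity, track
day = [[180, 200], [190, 210]]    # mean 195: bright enough, skip tracking
```

Tracking is then enabled only when this predicate is true, keeping the extra per-frame comparison cost confined to night-time operation.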
In a last stage 106, the method can count the number of raindrop closed edges. Image 5 on
The raindrop count makes it possible to carry out an appropriate action in real time on the vehicle. For example, the speed and cycle of the vehicle's wipers are adapted as a function of the rain conditions.
This method makes it possible to reliably distinguish between raindrops and other spots such as lane markings or lamps without requiring significant calculation time or resources.
According to a further embodiment, the method of raindrop detection 100 implements the following sequence of stages:
This particular stage order optimizes the rejection of closed edges, thereby reducing calculation time.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/EP2010/001355 | 3/4/2010 | WO | 00 | 4/12/2013 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2011/107118 | 9/9/2011 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
6555804 | Blasing | Apr 2003 | B1 |
8797417 | Gayko | Aug 2014 | B2 |
20040200948 | Bos et al. | Oct 2004 | A1 |
20050206511 | Heenan | Sep 2005 | A1 |
20060228001 | Tsukamoto | Oct 2006 | A1 |
20070093949 | Stam et al. | Apr 2007 | A1 |
20070267993 | Leleve | Nov 2007 | A1 |
20070272884 | Utida | Nov 2007 | A1 |
Number | Date | Country |
---|---|---|
10 2007 057745 | Jun 2009 | DE |
1 790 541 | May 2007 | EP |
1 860 426 | Nov 2007 | EP |
1 923 280 | May 2008 | EP |
Entry |
---|
International Search Report from PCT/EP2010/001355 mailed Oct. 22, 2010 (3 pages). |
Written Opinion from PCT/EP2010/001355 mailed Oct. 22, 2010 (5 pages). |
Number | Date | Country | |
---|---|---|---|
20150310304 A1 | Oct 2015 | US |