Exemplary embodiments of the invention relate to a method for detecting dirt on a viewing window of a lidar, as well as to a device for detecting dirt on a viewing window of a lidar.
Detecting contamination on a viewing window of a lidar is a particular challenge for automated vehicles, for example level 3 or higher. Contamination degrades the sensor performance and thus limits the safety and availability of such systems. Lidars are active sensors and have a transmitter, for example one or more laser diodes, and a receiver, for example one or more avalanche photodiodes, in particular single-photon avalanche diodes.
A method for recognizing contamination in lidar systems is known from DE 10 2017 222 618 A1, which has the following steps:
Exemplary embodiments of the invention are directed to an improved method compared to the prior art and an improved device for detecting dirt on a viewing window of a lidar.
A method for detecting dirt on a viewing window of a lidar, for example a windscreen, is proposed according to the invention, wherein
It is provided according to the invention that
Due to the active illumination from the interior of the sensor, the intensities of the reflections show a different sensitivity to contamination of the viewing window than the background light, so that an assessment of the degree of contamination is possible.
In one embodiment, the laser beam is transmitted in a pulse-like manner.
In one embodiment, the background light image is recorded shortly before the transmission of the laser beam. A temporally close sequence of recording the background light image and the intensity image is particularly advantageous when the method is used in a motor vehicle during a journey, since approximately the same scene is then captured in both greyscale images.
In one embodiment, features of buildings and/or vehicles and/or windows are identified in the intensity image and in the background light image.
In one embodiment, edges are detected in the intensity image and in the background light image by means of an edge detection algorithm.
In one embodiment, edge distances and/or edge positions are determined on the basis of the detected edges.
According to one aspect of the present invention, a device for detecting dirt on a viewing window of a lidar is proposed, comprising a data processing unit, which is connected to the lidar and is configured for carrying out the method described above.
Further, a motor vehicle comprising such a device is proposed, in particular an automated vehicle, for example level 3 or higher.
Further, a use of the method described above or of the device described above in a motor vehicle is proposed. A use in other autonomous platforms that use a lidar for navigation, such as lorries, buses, or robots, is also possible.
Exemplary embodiments of the invention are illustrated in greater detail below by means of drawings.
Here are shown:
Parts that correspond to one another are provided with the same reference numerals in all figures.
The invention relates to a method for detecting dirt on a viewing window of a lidar, for example a windscreen. A lidar is an active sensor and has at least one transmitter, for example one or more laser diodes, and at least one receiver, for example one or more avalanche photodiodes, in particular single-photon avalanche diodes. In the method according to the invention, the transmitter transmits a pulse-like laser beam and the receiver detects reflections of the laser beam off objects within a detection region. Suitable receivers also make additional information available along with the distance information, such as the intensities of the reflections and the background light of the scene. This additional information shows a different degree of sensitivity to contamination of the viewing window. The background light can, for example, be determined by recording a greyscale image with the receiver while the transmitter is not emitting a laser beam, for example shortly before the transmission of the laser beam.
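The acquisition sequence can be illustrated by the following sketch in Python; the sensor object `lidar` and its methods are purely hypothetical placeholders and do not correspond to any concrete lidar driver API.

```python
# Minimal sketch of the acquisition sequence described above; the sensor
# interface is a hypothetical placeholder, not a real driver API.
import numpy as np

def capture_image_pair(lidar):
    """Record the background light image shortly before the laser pulse and the
    intensity image of the reflections of that pulse."""
    background = lidar.read_background_frame()  # greyscale frame without laser emission
    lidar.emit_pulse()                          # pulse-like laser transmission
    intensity = lidar.read_intensity_frame()    # greyscale image of the pulse reflections
    return np.asarray(background), np.asarray(intensity)
```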
For a lidar that provides this additional information, suitable features of the intensity image and of the background light image, for example edges, in particular road markings or edges of windows in walls of buildings, are therefore compared using image processing methods. With edge detection algorithms known from the field of image processing, edges can be extracted from the background light image and the intensity image. For both resulting edge images, edge features, for example edge distances and edge positions, are then calculated and subsequently compared with each other. This comparison yields a measure of the similarity between the intensity image and the background light image. If the number of common features lies above a certain threshold value, the images are interpreted as similar and it is concluded that there is no contamination.
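A minimal sketch of such a comparison is given below in Python, using OpenCV's Canny detector as one possible edge detection algorithm. The Canny thresholds, the simple edge-overlap similarity measure, and the decision threshold are illustrative assumptions, not values prescribed by the method.

```python
import cv2
import numpy as np

def edge_similarity(intensity_img: np.ndarray, background_img: np.ndarray) -> float:
    """Extract edges from both 8-bit greyscale images and return a similarity
    measure based on the overlap of the two edge images (1.0 = identical)."""
    edges_intensity = cv2.Canny(intensity_img, 50, 150)
    edges_background = cv2.Canny(background_img, 50, 150)
    common = int(np.logical_and(edges_intensity > 0, edges_background > 0).sum())
    total = max(int(np.logical_or(edges_intensity > 0, edges_background > 0).sum()), 1)
    return common / total

def is_contaminated(intensity_img: np.ndarray, background_img: np.ndarray,
                    threshold: float = 0.4) -> bool:
    """Images sharing many edge features are interpreted as similar (clean window);
    a similarity below the threshold indicates contamination."""
    return edge_similarity(intensity_img, background_img) < threshold
```

Instead of raw edge overlap, more elaborate edge features such as edge distances or edge positions could be matched, as described above; the overlap measure merely keeps the sketch short.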
If there are few or no common features as shown in
The threshold value can be defined sensor-specifically. The method can be used for the whole detection region of the lidar or for a section of it, allowing localized detection of contamination.
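Building on `is_contaminated` from the previous sketch, localized detection could, for example, divide the detection region into a grid of sections and evaluate each section separately; the grid size is an illustrative assumption.

```python
import numpy as np

def contamination_map(intensity_img, background_img, rows=4, cols=8, threshold=0.4):
    """Flag sections of the detection region whose edge similarity falls below
    the threshold, giving a coarse map of the contaminated areas."""
    h, w = intensity_img.shape[:2]
    flags = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            ys = slice(r * h // rows, (r + 1) * h // rows)
            xs = slice(c * w // cols, (c + 1) * w // cols)
            flags[r, c] = is_contaminated(intensity_img[ys, xs],
                                          background_img[ys, xs], threshold)
    return flags
```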
The proposed method for detecting contamination may only function to a limited extent if there are few or no structures or edges within the field of vision of the lidar sensor, for example when recording a monochrome wall or the sky. When the method is used in road traffic, such scenarios are, however, the exception. A minimum number of features in the sensor field of vision, in particular structures and/or edges, that is required for detecting contamination can be defined sensor-specifically.
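Such a sensor-specific minimum could, for example, be checked as a simple guard before running the comparison; the edge-pixel count below is an illustrative assumption.

```python
import cv2

MIN_EDGE_PIXELS = 500  # hypothetical sensor-specific minimum amount of structure

def scene_has_enough_structure(background_img, min_edge_pixels=MIN_EDGE_PIXELS) -> bool:
    """Only run the contamination check when the scene contains enough edges,
    e.g. not for a monochrome wall or the open sky."""
    return int((cv2.Canny(background_img, 50, 150) > 0).sum()) >= min_edge_pixels
```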
Although the invention has been illustrated and described in detail by way of preferred embodiments, the invention is not limited by the examples disclosed, and other variations can be derived from these by the person skilled in the art without leaving the scope of the invention. It is therefore clear that there is a plurality of possible variations. It is also clear that embodiments stated by way of example are only really examples that are not to be seen as limiting the scope, application possibilities or configuration of the invention in any way. In fact, the preceding description and the description of the figures enable the person skilled in the art to implement the exemplary embodiments in concrete manner, wherein, with the knowledge of the disclosed inventive concept, the person skilled in the art is able to undertake various changes, for example, with regard to the functioning or arrangement of individual elements stated in an exemplary embodiment without leaving the scope of the invention, which is defined by the claims and their legal equivalents, such as further explanations in the description.
Number | Date | Country | Kind
---|---|---|---
10 2020 130 481.1 | Nov 2020 | DE | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/EP2021/081186 | 11/10/2021 | WO |