Method for calibrating a Lidar sensor

Information

  • Patent Grant
  • Patent Number
    12,291,218
  • Date Filed
    Friday, February 19, 2021
  • Date Issued
    Tuesday, May 6, 2025
  • Inventors
    • Kobetz; Christian
  • Examiners
    • Goodbody; Joan T
  • Agents
    • PATENT PORTFOLIO BUILDERS PLLC
Abstract
Decalibration of a Lidar sensor of a vehicle is identified using a reference measurement and a verification measurement, each of which projects at least one line onto a road surface. The positions of the at least one line projected during the reference and verification measurements are compared to determine whether the lines are displaced from each other by more than a specified amount, in which case a degradation of the Lidar sensor is identified.
Description
BACKGROUND AND SUMMARY OF THE INVENTION

Exemplary embodiments of the invention relate to a method for calibrating a Lidar sensor of a vehicle by means of a camera, as well as to a use of such a method.


A method for calibrating a sensor for distance measurement with two sensor channels, the radiation of which is lobe-shaped, is known from DE 10 2005 037 094 B3.


The method comprises the following steps:






    • transmitting a radiation lobe of a sensor channel onto a calibration surface by means of the sensor,

    • detecting the radiation lobe with a camera of a video system,

    • identifying a first representative value of the radiation lobe in image coordinates of the camera,

    • adjusting a position of the calibration surface in relation to the sensor and the camera,

    • detecting the radiation lobe with the camera,

    • identifying a second representative value of the radiation lobe in image coordinates of the camera,

    • repeating the preceding steps for a further sensor channel,

    • modelling the lobe alignment of the emitted radiation lobes as straight lines in sensor coordinates,

    • transforming the modelled straight lines from sensor coordinates into image coordinates of the camera,

    • comparing the straight lines with the identified representative values in image coordinates for every sensor channel,

    • identifying a calibration function to compensate for a possible deviation of the modelled straight lines from several representative values.


The sensor is a Lidar system based on light emission and light absorption, i.e., a Lidar sensor.
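
As a generic illustration of the coordinate-transformation step in this known method (not taken from DE 10 2005 037 094 B3; the intrinsic and extrinsic values below are arbitrary example numbers), points of a modelled straight line can be projected from sensor coordinates into image coordinates with a pinhole camera model:

```python
# Minimal pinhole-projection sketch (generic, not from DE 10 2005 037 094 B3)
# illustrating the transformation of points of a modelled straight line from
# sensor coordinates into image coordinates of the camera. All numeric values
# are arbitrary example assumptions.
import numpy as np

K = np.array([[800.0,   0.0, 320.0],   # intrinsic camera matrix
              [  0.0, 800.0, 240.0],   # (focal lengths and principal point)
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # rotation sensor frame -> camera frame
t = np.array([0.1, 0.0, 0.0])          # translation sensor frame -> camera frame (metres)

def sensor_to_image(points_sensor):
    """Project Nx3 sensor-coordinate points into pixel coordinates (u, v)."""
    cam = points_sensor @ R.T + t      # sensor frame -> camera frame
    uvw = cam @ K.T                    # apply camera intrinsics
    return uvw[:, :2] / uvw[:, 2:3]    # perspective division

# Two points on a modelled radiation-lobe axis, 5 m and 10 m ahead of the sensor:
line_points = np.array([[0.0, 0.0, 5.0], [0.0, 0.0, 10.0]])
print(sensor_to_image(line_points))
```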





Exemplary embodiments of the invention are directed to a method for calibrating a Lidar sensor of a vehicle that is improved compared to the prior art, and a use of such a method.


In one method for calibrating a Lidar sensor of a vehicle, laser radiation is transmitted onto a road surface by means of the Lidar sensor, and the laser radiation reflected back to the Lidar sensor is detected by means of the Lidar sensor.


According to the invention, a reference measurement is taken in a calibrated state by means of the Lidar sensor. Here, the laser radiation is emitted in at least one predetermined scanning plane, and at least one line is thereby projected onto the road surface. By detecting the laser radiation reflected back, the position of this at least one line relative to the vehicle is identified and saved for a later evaluation. A verification measurement is taken at a later time. During the verification measurement, the laser radiation is likewise emitted in the at least one predetermined scanning plane, and at least one line is thereby also projected onto the road surface. By detecting the laser radiation reflected back, the position of this at least one line relative to the vehicle is identified. It is then checked whether the positions identified during the reference measurement and the verification measurement are displaced relative to each other. If a displacement by more than a specified amount is found, it is concluded that there is a degradation of the Lidar sensor.
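
Purely as an illustration of this comparison (the function name and data layout are assumptions; the positions are represented here by the per-line distances mentioned below, and the 0.1 m default is the example threshold given later in the description), a minimal Python sketch:

```python
# Illustrative sketch of the reference/verification comparison described above.
# Names and the 0.1 m default threshold are assumptions for demonstration.

def is_decalibrated(reference_distances, verification_distances, threshold_m=0.1):
    """Return True if any projected line has shifted by more than threshold_m.

    reference_distances / verification_distances: distances (in metres) between
    the vehicle and each line projected onto the road surface, identified during
    the reference and verification measurements respectively.
    """
    if len(reference_distances) != len(verification_distances):
        raise ValueError("Both measurements must contain the same number of lines")

    # A displacement above the threshold indicates a degradation of the Lidar sensor.
    return any(
        abs(ref - ver) > threshold_m
        for ref, ver in zip(reference_distances, verification_distances)
    )


# Example: reference lines at 5 m, 10 m, 15 m; verification lines shifted by 0.3 m.
print(is_decalibrated([5.0, 10.0, 15.0], [5.3, 10.3, 15.3]))  # True
```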


The positions of the lines projected onto the road surface are advantageously identified by determining the distances between the respective lines and the vehicle.


To use a Lidar sensor, a calibration of the same is necessary. A quick test of whether the Lidar sensor is still calibrated is very advantageous for the rapid availability of vehicle systems operated using data collected by means of the Lidar sensor, for example vehicle systems for executing an automated, in particular highly automated, autonomous, or driverless driving operation. Such a particularly fast and simultaneously reliable test can be implemented with little effort by means of the method.


In one possible embodiment of the method, the displacement is only identified if the reference measurement and the verification measurement are taken at the same place, i.e., at the same vehicle position. Due to road curvature, the lines projected onto the road surface can be distorted. If the reference measurement and the verification measurement are taken at the same place, the lines projected onto the road surface are distorted in the same way. In this way, measurement errors caused by road curvature can be avoided.
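
Purely as an illustration of this gating (the position representation and the 0.5 m offset are assumptions, not values from the patent), such a same-place check could look as follows:

```python
# Hypothetical position gate: only compare the two measurements if reference
# and verification were taken at (approximately) the same vehicle position.
import math

def same_place(pos_reference, pos_verification, max_offset_m=0.5):
    # pos_reference / pos_verification: (x, y) vehicle positions in a common
    # local frame (an assumed representation for illustration).
    return math.dist(pos_reference, pos_verification) <= max_offset_m

print(same_place((12.0, 3.0), (12.1, 3.05)))  # True -> displacement may be evaluated
```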


In a further possible embodiment of the method, the reference measurement and the verification measurement are taken while the vehicle is stationary. For example, the reference measurement is taken while the vehicle is parked, and the verification measurement is taken during the subsequent starting of the vehicle. In this way, it can be determined whether an event took place during parking, or in the time between the reference measurement and the verification measurement, that led to the decalibration of the sensor. Such an event could, for example, be a change in the load of the vehicle or the weight distribution in the vehicle, or a tire change. With the method, it can quickly be checked whether the Lidar sensor is calibrated, for example by briefly stopping in a car park or at a petrol station. If it is verified that the Lidar sensor is calibrated, learnt calibration parameters can be used for operation. Otherwise, an online calibration can be initiated in order to update the calibration parameters.
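
A sketch of this parked/startup workflow is shown below; the helper functions are stubs assumed for illustration only and are not defined by the patent:

```python
# Hypothetical startup check, assuming stub functions for measurement and
# calibration; none of these helpers are specified by the patent itself.

def take_line_distances():
    # Stub: in a real system this would trigger a Lidar line scan and return
    # the distances between the vehicle and each projected line.
    return [5.0, 10.0, 15.0]

def run_online_calibration():
    print("online calibration started; automated driving locked until done")

def startup_check(saved_reference, threshold_m=0.1):
    verification = take_line_distances()
    if any(abs(r - v) > threshold_m for r, v in zip(saved_reference, verification)):
        run_online_calibration()   # decalibration detected since parking
    else:
        print("Lidar sensor still calibrated; learnt parameters can be reused")

# Reference measurement stored while the vehicle was parked:
startup_check(saved_reference=[5.0, 10.0, 15.0])
```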


In a further possible embodiment of the method, more than one line is projected onto the road surface during the reference measurement and the verification measurement, respectively. The lines projected onto the road surface during the reference measurement form a reference pattern, and the lines projected onto the road surface during the verification measurement form a verification pattern. Such patterns with line-shaped structures can be particularly easily generated, detected, and evaluated.


In a further possible embodiment of the method, the lines are generated by means of several lasers of the Lidar sensor that are arranged one on top of the other. Such a generation can be carried out particularly simply and precisely.


In a further possible embodiment of the method, the projection of the defined pattern onto the road surface during the verification measurement is carried out for a specified time period, for example for a duration of 30 seconds. Such a time period enables the influence of road fluctuations to be reduced by averaging or integration.
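
For illustration, averaging the per-line distances over the scans captured during such a projection window could look like the following sketch; the sampling interface and numeric values are assumptions:

```python
# Sketch of averaging repeated line-distance readings over a fixed projection
# window (e.g. 30 s) to suppress the influence of small road fluctuations.

def averaged_line_distances(samples):
    """samples: list of per-scan distance lists, e.g. one list per Lidar frame
    captured during the projection period. Returns the per-line mean."""
    n_lines = len(samples[0])
    return [
        sum(scan[i] for scan in samples) / len(samples)
        for i in range(n_lines)
    ]

# Three scans of the same three lines, with small road-induced jitter:
scans = [[5.02, 10.01, 14.98], [4.99, 9.97, 15.03], [5.00, 10.02, 14.99]]
print(averaged_line_distances(scans))  # approx. [5.00, 10.00, 15.00]
```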


In the use according to the invention, the previously described method is used in an automated, in particular highly automated, or autonomously operable vehicle. Due to the particularly quick availability of data collected by means of the Lidar sensor to the vehicle systems carrying out the automated driving operation, such automated driving operation can be started especially quickly and reliably after a vehicle stop.


In this context, if, for example, a decalibration of the Lidar sensor is recognized, the automated, in particular highly automated, or autonomous operation of the vehicle is stopped and/or locked. In this way, traffic safety is significantly increased.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

Exemplary embodiments of the invention are illustrated in more detail in the following, using drawings.


Here:



FIG. 1 schematically shows a perspective view of a vehicle and a road surface at a first point in time; and



FIG. 2 schematically shows a perspective view of the vehicle and the road surface according to FIG. 1 at a second point in time.





DETAILED DESCRIPTION

Parts that correspond to each other are provided with the same reference numerals in all figures.


FIG. 1 shows a perspective view of a vehicle 1 and a road surface FB at a first point in time, at which the reference measurement is taken.


The vehicle 1 comprises a Lidar sensor 2, which emits laser radiation in a conventional manner and detects the laser radiation that is reflected back. At least the data collected by means of the Lidar sensor 2 about the vehicle's surroundings is used for an automated, in particular highly automated, autonomous, or driverless operation of the vehicle 1.


Due to changing load conditions of the vehicle 1, changes to the vehicle, and outside influences, the orientation of the Lidar sensor 2 can change relative to the vehicle's environment, which influences the data collected by means of the Lidar sensor 2. For this reason, an exact calibration, in particular an online calibration, of the Lidar sensor 2 is always necessary.


While the vehicle 1 is parked, the Lidar sensor 2 can become decalibrated, for example due to a parking maneuver. If the parking process is ended and the vehicle 1 is put back into operation, it is important to recognize quickly whether the Lidar sensor 2 is still calibrated or has become decalibrated. With a decalibrated Lidar sensor 2, operational safety during automated or driverless operation is no longer guaranteed. In such a case, the automated or driverless operation of the vehicle 1 is locked, at least until the Lidar sensor 2 has been recalibrated, for example by means of an online calibration.


In order to recognize a decalibrated state of the Lidar sensor 2, a reference measurement is first taken. In this process, one or more lines S1, S2, S3, S4, or a reference pattern RM made up of several lines, are projected by means of laser radiation onto the, in particular flat, road surface FB. To this end, the Lidar sensor 2 is designed for so-called line scans, in which several lasers of the Lidar sensor 2, arranged one on top of the other, produce lines S1, S2, S3, S4 at different distances A1 to A4, so-called layers, on the road surface FB. The distances A1 to A4 to the vehicle 1 increase with the height at which the respective lasers are arranged. The first three lines S1, S2, S3 here form the reference pattern RM.
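
As a purely illustrative aside (not part of the patent disclosure), the flat-road geometry behind such layers can be sketched as follows: a beam emitted from sensor height h at a downward angle α intersects a flat road at a horizontal distance of roughly h/tan(α), so shallower layers project their lines farther from the vehicle. A minimal sketch with assumed example values:

```python
# Flat-road geometry sketch (an assumption for illustration, not taken from
# the patent): shallower beam angles project lines at larger distances.
import math

def line_distance(sensor_height_m, depression_angle_deg):
    # Horizontal distance at which a downward-angled beam meets a flat road.
    return sensor_height_m / math.tan(math.radians(depression_angle_deg))

# Four layers of a sensor mounted at 1.6 m, with increasingly shallow beams:
for angle in (12.0, 9.0, 6.0, 4.0):
    print(f"{angle:4.1f} deg -> A = {line_distance(1.6, angle):5.2f} m")
```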


This reference pattern RM is recorded by the Lidar sensor 2 and saved as the reference pattern RM shown in FIG. 2. In further possible exemplary embodiments, the reference pattern RM can be formed by a different number of lines S1, S2, S3.



FIG. 2 shows a perspective view of the vehicle 1 and the road surface FB according to FIG. 1 at a later, second point in time, in particular after a restart of the vehicle 1 or during a new ignition cycle. At this second point in time, a verification measurement is taken. Here, the dashed lines S1′, S2′, S3′, or the verification pattern M made up of these lines, are projected onto the road surface FB, and the positions of these lines S1′, S2′, S3′, or the position of the verification pattern M, relative to the vehicle 1 are identified.


A decalibration of the Lidar sensor 2 occurs before this restart, for example, if the loading of the vehicle 1 has been significantly changed or if the orientation of the Lidar sensor 2 has been changed due to a mechanical action, for example in a collision with an obstacle. These changes result in a displacement of the lines S1′, S2′, S3′ projected onto the road surface FB at the second point in time compared to the lines S1, S2, S3 projected onto the road surface FB at the first point in time, and thus in a displacement between the reference pattern RM and the verification pattern M.


Based on this displacement, which can for example be identified by determining the distances A1′ to A3′ between the lines S1′ to S3′ and the vehicle 1 and comparing them with the distances A1 to A3 contained in the reference pattern RM, a decalibration of the Lidar sensor 2 can be identified.
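
For illustration only, the following sketch computes such per-line displacements; the numeric distances are assumptions, with the line and distance labels borrowed from the figures:

```python
# Illustrative computation of per-line displacements between the reference
# distances (A1..A3) and the verification distances (A1'..A3'). All values
# are example assumptions.

reference_pattern = {"S1": 5.0, "S2": 10.0, "S3": 15.0}         # A1..A3 in metres
verification_pattern = {"S1'": 5.25, "S2'": 10.2, "S3'": 15.3}  # A1'..A3' in metres

displacements = {
    ref_line: ver_dist - ref_dist
    for (ref_line, ref_dist), ver_dist
    in zip(reference_pattern.items(), verification_pattern.values())
}
print(displacements)  # approx. {'S1': 0.25, 'S2': 0.2, 'S3': 0.3}
```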


This means that, to recognize a decalibration of the Lidar sensor 2, the verification pattern M is projected onto the road surface FB by means of the Lidar sensor 2 and recorded by means of the Lidar sensor 2 during a verification measurement, for example during the starting of the vehicle. A displacement between the verification pattern M and the reference pattern RM is then identified, and if the displacement exceeds a predetermined amount of, for example, +/−0.1 m, it is concluded that there is a decalibration of the Lidar sensor 2.


When such a decalibration of the Lidar sensor 2 is recognized, the automated, in particular highly automated, or driverless operation of the vehicle 1 is, in one possible embodiment, stopped or blocked. Preferably, an online calibration is then started, and after its completion the automated, in particular highly automated, or driverless operation of the vehicle 1 is permitted again.


If the identified displacement does not exceed the specified amount, it is assumed that the Lidar sensor 2 is not decalibrated, and data collected by means of the Lidar sensor 2 is used, based on the calibration parameters recorded according to FIG. 1, for the operation of the Lidar sensor 2 and for evaluation.


The predetermined amount is in particular separately determined and set for different types of vehicle.


In order to be able to take account of any possible unevenness in the road surface during the verification measurement, the defined pattern M is projected onto the road surface FB, i.e., the laser radiation is transmitted, for a predetermined duration of, for example, 30 seconds during the verification measurement. In this way, measurement errors can be reduced by integration or averaging.


Furthermore, in one possible embodiment, the verification pattern M is compared with the reference pattern RM to find whether the verification pattern M is distorted compared to the reference pattern RM. As such a distortion of the verification pattern M is an indication of road curvature, and measurement errors can result from road curvature, the displacement is in particular only determined if it has previously been determined that the pattern generated in the verification measurement is undistorted compared to the reference pattern, or that the distortion is below a specified tolerance value.
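
As an illustrative sketch of this pre-check (names and the tolerance value are assumptions, not taken from the patent), one could compare the spacings between adjacent lines of the two patterns:

```python
# Sketch of the distortion pre-check described above: the displacement is only
# evaluated if the line spacings of the verification pattern match those of the
# reference pattern within an assumed tolerance.

def spacings(distances):
    # Gaps between adjacent projected lines, e.g. A2 - A1, A3 - A2.
    return [b - a for a, b in zip(distances, distances[1:])]

def is_undistorted(reference, verification, tolerance_m=0.05):
    return all(
        abs(rs - vs) <= tolerance_m
        for rs, vs in zip(spacings(reference), spacings(verification))
    )

reference = [5.0, 10.0, 15.0]
verification = [5.2, 10.2, 15.2]   # uniformly shifted, not distorted
print(is_undistorted(reference, verification))  # True -> displacement check may run
```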


The laser radiation reflected back to the Lidar sensor 2 is detected in the present exemplary embodiment by means of a photodetector provided in the Lidar sensor 2. However, it is also conceivable to detect the radiation reflected back by means of a photodetector 3 arranged outside the Lidar sensor, for example by means of a camera that has a sensor chip sensitive to the wavelength of the laser radiation, typically IR radiation.


Although the invention has been illustrated and described in detail by way of preferred embodiments, the invention is not limited by the examples disclosed, and other variations can be derived from these by the person skilled in the art without leaving the scope of the invention. It is therefore clear that there is a plurality of possible variations. It is also clear that the embodiments stated by way of example are really only examples that are not to be seen as limiting the scope, application possibilities, or configuration of the invention in any way. In fact, the preceding description and the description of the figures enable the person skilled in the art to implement the exemplary embodiments in a concrete manner, wherein, with knowledge of the disclosed inventive concept, the person skilled in the art is able to undertake various changes, for example with regard to the functioning or arrangement of individual elements stated in an exemplary embodiment, without leaving the scope of the invention, which is defined by the claims and their legal equivalents, such as further explanations in the description.

Claims
  • 1. A method, the method comprising: performing a reference measurement by
  • 2. The method of claim 1, wherein the first plurality of lines is a reference pattern, the second plurality of lines is a verification pattern, and wherein the comparison involves comparing the reference and verification patterns.
  • 3. The method of claim 2, wherein the first plurality of lines and the second plurality of lines are generated by several lasers of the Lidar sensor, wherein the several lasers are arranged one on top of the other.
  • 4. The method of claim 1, wherein the vehicle includes an automated driving operation, the method further comprising: stopping or blocking the automated driving operation responsive to the identification of the degradation of the Lidar sensor.
  • 5. The method of claim 4, further comprising: permitting the automated driving operation responsive to the recalibration of the Lidar sensor.
  • 6. The method of claim 1, wherein the reference and verification measurements are each performed for a predetermined duration, the distance between the vehicle and each of the first plurality of lines is determined by integration or averaging over the predetermined duration, and the distance between the vehicle and each of the second plurality of lines is determined by integration or averaging over the predetermined duration.
  • 7. A method, the method comprising: performing a reference measurement by
  • 8. The method of claim 1, wherein the photodetector is part of the Lidar sensor.
  • 9. The method of claim 1, wherein the photodetector is arranged outside of the Lidar sensor.
Priority Claims (2)
Number Date Country Kind
10 2020 109 374.8 Apr 2020 DE national
10 2020 007 645.9 Dec 2020 DE national
PCT Information
Filing Document Filing Date Country Kind
PCT/EP2021/054202 2/19/2021 WO
Publishing Document Publishing Date Country Kind
WO2021/197709 10/7/2021 WO A
US Referenced Citations (64)
Number Name Date Kind
8825260 Silver Sep 2014 B1
8928757 Maekawa Jan 2015 B2
9052721 Dowdall Jun 2015 B1
9866819 Suzuki Jan 2018 B2
9927813 Ferguson Mar 2018 B1
10107899 Han Oct 2018 B1
10392013 Hakki Aug 2019 B2
10688987 Bariant Jun 2020 B2
10788316 Kalscheur Sep 2020 B1
11243308 Matsui Feb 2022 B2
11321210 Huang May 2022 B2
11460855 Lim Oct 2022 B1
11506769 Zhou et al. Nov 2022 B2
11629835 Kuffner, Jr. Apr 2023 B2
11747453 Shepard Sep 2023 B1
11906672 Fendt Feb 2024 B2
20060114320 Nagaoka Jun 2006 A1
20060290920 Kampchen Dec 2006 A1
20100179781 Raphael Jul 2010 A1
20100194886 Asari Aug 2010 A1
20100235129 Sharma Sep 2010 A1
20100253493 Szczerba Oct 2010 A1
20110018973 Takayama Jan 2011 A1
20120081542 Suk Apr 2012 A1
20130321629 Zhang Dec 2013 A1
20140320658 Pliefke Oct 2014 A1
20150102995 Shen Apr 2015 A1
20150260833 Schumann Sep 2015 A1
20150362587 Rogan Dec 2015 A1
20150367855 Parchami Dec 2015 A1
20160349371 Suzuki Dec 2016 A1
20170018087 Yamaguchi Jan 2017 A1
20170160094 Zhang Jun 2017 A1
20170169627 Kim Jun 2017 A1
20170210282 Rodriguez Barros Jul 2017 A1
20170309066 Bybee Oct 2017 A1
20170345159 Aoyagi Nov 2017 A1
20180143018 Kimura May 2018 A1
20180342113 Kislovskiy Nov 2018 A1
20190049958 Liu Feb 2019 A1
20190118705 Yu Apr 2019 A1
20190152487 Tokuhiro May 2019 A1
20190285726 Muto Sep 2019 A1
20190324129 Castorena Martinez Oct 2019 A1
20200041650 Matsui Feb 2020 A1
20200064483 Li Feb 2020 A1
20200082570 Wunderwald Mar 2020 A1
20200191927 Lin Jun 2020 A1
20200209853 Leach Jul 2020 A1
20200249354 Yeruhami Aug 2020 A1
20200284882 Kirillov Sep 2020 A1
20200410704 Choe Dec 2020 A1
20210025997 Rosenzweig Jan 2021 A1
20210033255 Kuffner, Jr. Feb 2021 A1
20210201464 Tariq Jul 2021 A1
20210208263 Sutavani Jul 2021 A1
20210221389 Long Jul 2021 A1
20210264169 Speigle Aug 2021 A1
20210278543 Simon Sep 2021 A1
20210389469 Sakata Dec 2021 A1
20220118901 El Idrissi Apr 2022 A1
20220137196 Hayashi May 2022 A1
20230082700 Schindler Mar 2023 A1
20230184900 Newman Jun 2023 A1
Foreign Referenced Citations (8)
Number Date Country
102005037094 Oct 2006 DE
102014014295 Mar 2016 DE
102020102466 Aug 2021 DE
3001137 Mar 2016 EP
2016045330 Apr 2016 JP
2020020694 Feb 2020 JP
2020042024 Mar 2020 JP
2020054392 Mar 2020 WO
Non-Patent Literature Citations (6)
Entry
DE_102020102466 image and translation (Year: 2020).
Office Action dated Jun. 27, 2023 in related/corresponding JP Application No. 2022-560266.
International Search Report mailed Apr. 28, 2021 in related/corresponding International Application No. PCT/EP2021/054202.
Written Opinion mailed Apr. 28, 2021 in related/corresponding International Application No. PCT/EP2021/054202.
Office Action dated Jan. 25, 2025 in related/corresponding KR Application No. 2022-7031954.
Office Action created Feb. 19, 2025 in related/corresponding DE Application No. 10 2020 007 645.9.
Related Publications (1)
Number Date Country
20230150519 A1 May 2023 US