Method and system for increasing safety of partially or fully automated driving functions

Information

  • Patent Grant
  • Patent Number
    11,908,205
  • Date Filed
    Monday, January 18, 2021
  • Date Issued
    Tuesday, February 20, 2024
  • CPC
    • G06V20/588
    • G06F18/22
    • G06V10/75
  • Field of Search
    • CPC
    • G06V20/588
    • G06K9/6215
    • G06K9/6201
  • International Classifications
    • G06V20/56
    • G06F18/22
    • G06V10/75
    • Term Extension
      7
Abstract
The invention relates to a method for increasing the safety of driving functions in a partially automated or fully autonomous vehicle, including the following steps: capturing (S1) an environment image or a sequence of environment images by means of at least one surroundings detection sensor (2); detecting (S2) driving lane boundaries in the environment image or the sequence of environment images; determining (S3) a driving lane course based on the detected driving lane boundaries; retrieving (S4) a further driving lane course from a data source; checking the plausibility (S5) of the determined driving lane course by verifying a matching of the driving lane courses; identifying (S6) a degree of matching; setting (S7) a confidence value based on the degree of matching; deciding (S8) whether the determined driving lane course is provided to a driving function.
Description
CROSS REFERENCE TO RELATED APPLICATION

This application claims priority to German Patent Application No. 10 2020 202 964.4, filed Mar. 9, 2020, the contents of such application being incorporated by reference herein.


FIELD OF THE INVENTION

The invention relates to a method for increasing the safety of driving functions.


BACKGROUND

In current driver assistance systems with steering intervention, trajectories are determined based on the road markings and sent to the steering system as corresponding signals. The lane markings are detected by sensors in the form of different camera systems. An erroneous detection can in this case only be recognized by further camera systems. If only one camera is installed, an erroneous detection of the lane marking leads to a departure from the planned trajectory. This poses a serious safety problem, in particular when the driver is not steering themselves (hands-off).


SUMMARY OF THE INVENTION

Therefore, an aspect of the invention is to provide a method which increases the safety and robustness of driving functions.


Initial considerations showed that even with a further camera, plausibility checks of detected driving lanes are not expedient: in certain situations, both cameras are equally compromised by common cause errors (shared electronic/electric faults as well as backlight, snow, rain, etc.). In addition, a second camera has to be installed, which is a significant cost factor.


The development of a safety concept for a lane keeping function which may be used hands-off leads to increased requirements: the ASIL of the safety goals derived for hands-off operation is higher than for hands-on functions and necessitates additional plausibility checks of the detected driving lanes. Cameras currently on the market do not fulfill this increased integrity requirement.


According to an aspect of the invention, a method is therefore proposed for increasing the safety of driving functions in a vehicle with a degree of automation equal to or above SAE L2 in accordance with SAE J3016, including the following steps:

    • capturing an environment image or a sequence of environment images by means of at least one surroundings detection sensor,
    • detecting driving lane boundaries in the environment image or the sequence of environment images,
    • determining a driving lane course based on the detected driving lane boundaries;
    • retrieving a further driving lane course from a data source;
    • checking the plausibility of the determined driving lane course by verifying a matching of the driving lane courses;
    • identifying a degree of matching;
    • setting a confidence value based on the degree of matching;
    • deciding whether the determined driving lane course is provided to a driving function.


The surroundings detection sensor is preferably a mono or stereo camera. Within the scope of an aspect of this invention, driving lane boundaries are understood to be driving lane markings as well as guardrails or other elevated road boundaries such as, for example, construction site boundaries.


The driving lane course can, for example, be determined by means of keypoint regression of detection points of the driving lane detection.
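
A minimal sketch of such a keypoint regression, assuming a simple third-order polynomial fit of the detection points in vehicle coordinates (the function names, the polynomial model and the sampling range are illustrative and not taken from the patent text):

    # Illustrative sketch only: fitting a lane course to detected boundary keypoints.
    import numpy as np

    def fit_lane_course(keypoints_xy, degree=3):
        """Regress a polynomial lateral offset y(x) through detected keypoints.

        keypoints_xy: array of shape (N, 2) with longitudinal x and lateral y
        positions of lane-boundary detection points in vehicle coordinates.
        Returns the polynomial coefficients describing the driving lane course.
        """
        pts = np.asarray(keypoints_xy, dtype=float)
        return np.polyfit(pts[:, 0], pts[:, 1], deg=degree)

    def sample_lane_course(coeffs, x_max=80.0, step=1.0):
        """Sample the fitted course at regular longitudinal distances ahead."""
        x = np.arange(0.0, x_max + step, step)
        y = np.polyval(coeffs, x)
        return np.stack([x, y], axis=1)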


The degree of matching can, for example, be a percentage of matching of the driving lane courses. Based on this percentage of matching, a confidence value can be set for the determined driving lane course. This confidence value can subsequently be compared to a previously set confidence threshold value. In this way, it can be determined whether the determined driving lane course is sufficiently reliable for a driving function, for example a lane keeping assistant, to be able to function without error or whether autonomous driving along a planned trajectory is possible.
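
A hedged sketch of how the degree of matching and the confidence value could be computed, assuming both lane courses are sampled at the same longitudinal positions; the lateral tolerance, the percentage definition and the threshold value are assumptions:

    # Sketch of the matching/confidence step; tolerance and threshold are assumed values.
    import numpy as np

    def degree_of_matching(course_a, course_b, lateral_tolerance_m=0.3):
        """Percentage of sample points at which both lane courses agree.

        course_a, course_b: arrays of shape (N, 2) sampled at the same
        longitudinal positions (x, y) in vehicle coordinates.
        """
        lateral_error = np.abs(course_a[:, 1] - course_b[:, 1])
        return 100.0 * np.mean(lateral_error <= lateral_tolerance_m)

    def confidence_from_matching(matching_percent):
        """Map the degree of matching to a confidence value in [0, 1]."""
        return matching_percent / 100.0

    CONFIDENCE_THRESHOLD = 0.9  # assumed value

    def is_course_reliable(confidence):
        """True if the determined course may be provided to the driving function."""
        return confidence > CONFIDENCE_THRESHOLD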


In a preferred embodiment, the further driving lane course is retrieved from an HD map, a cloud or an infrastructure. Said data source can thus be, for example, an HD map which is generated from different current sensor data of the ego vehicle and/or of further vehicles. A map may also already be stored in the system, said map being based on a previous drive through, and recording of, the same vehicle environment. It would also be conceivable to retrieve the driving lane course via a cloud, in which a current driving lane course could be provided, for example, by other vehicles and/or by infrastructure. Alternatively or cumulatively, it would also be possible to establish a direct V2X communication with an infrastructure element. This infrastructure element can, for example, be disposed next to the driving lane and transmit the actual driving lane course to relevant traffic participants. This is advantageous, since current and precise driving lane courses are thus always available for verifying the determined driving lane course.
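
A possible, purely illustrative abstraction for retrieving the further driving lane course from one of the named sources (V2X infrastructure, cloud, HD map); the interfaces are hypothetical, since the patent only names the possible sources:

    # Hypothetical data-source abstraction; sources are queried in order of preference.
    from typing import List, Optional, Protocol
    import numpy as np

    class LaneCourseSource(Protocol):
        def get_lane_course(self, ego_position) -> Optional[np.ndarray]:
            """Return a sampled lane course of shape (N, 2) for the ego position, or None."""

    def retrieve_further_lane_course(sources: List[LaneCourseSource], ego_position):
        """Query the available sources (e.g. V2X, cloud, HD map) and return the
        first lane course that is currently available."""
        for source in sources:
            course = source.get_lane_course(ego_position)
            if course is not None:
                return course
        return None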


In a further preferred embodiment, the determined driving lane course is provided to a driving function when a confidence value above a predetermined confidence threshold value is present. In this way, it is ensured that only those driving lane courses are provided which allow an error-free driving function.


Particularly preferably, a matching of curve radii is verified when the matching of the driving lane courses is checked. In particular in bends, an exact knowledge of the driving lane course is important for preventing erroneous steering interventions. In this way, bends recognized with an inverted direction of curvature or otherwise falsely recognized bends can advantageously be detected, and a driving function can be controlled accordingly.
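
A sketch of the curve-radius check, assuming both courses are available as sampled (x, y) points; the use of signed curvature, the straight-line threshold and the relative radius tolerance are assumptions:

    # Sketch: compare curve direction (sign of curvature) and curve radius of two courses.
    import numpy as np

    def signed_curvature(course):
        """Signed curvature along a sampled course of shape (N, 2)."""
        x, y = course[:, 0], course[:, 1]
        dx, dy = np.gradient(x), np.gradient(y)
        ddx, ddy = np.gradient(dx), np.gradient(dy)
        return (dx * ddy - dy * ddx) / np.power(dx * dx + dy * dy, 1.5)

    def curve_radii_match(course_a, course_b, radius_rel_tol=0.2, kappa_min=1e-4):
        """Check that bend direction and curve radius of both courses agree.

        Catches bends recognized with an inverted direction of curvature as well
        as radii that deviate strongly between the two courses.
        """
        ka = np.median(signed_curvature(course_a))
        kb = np.median(signed_curvature(course_b))
        if abs(ka) < kappa_min and abs(kb) < kappa_min:
            return True                      # both courses essentially straight
        if np.sign(ka) != np.sign(kb):
            return False                     # inverted bend direction
        ra, rb = 1.0 / abs(ka), 1.0 / abs(kb)
        return abs(ra - rb) / max(ra, rb) <= radius_rel_tol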


Furthermore, a warning is preferably output to the driver when a confidence value below a predetermined confidence threshold value is present. The warning can also be a take-over request for the driver. Alternatively or cumulatively to the warning, it would also be conceivable to provide the driving function with the retrieved driving lane course instead of the determined driving lane course. This would be particularly advantageous for fully autonomous vehicles, since they can be configured without a steering wheel and an intervention by persons would thus not be possible. In this way, a fallback level is created, contributing to an increase in the reliability of the entire system.
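
A sketch of this decision step including the fallback behaviour; the take-over request interface is a hypothetical placeholder:

    # Sketch of the decision step with warning/take-over request and fallback course.
    def issue_takeover_request():
        """Placeholder for the warning / take-over request to the driver."""
        print("Take-over request: lane detection confidence below threshold")

    def provide_lane_course(determined_course, retrieved_course, confidence, threshold=0.9):
        """Decide which lane course is handed to the driving function."""
        if confidence > threshold:
            return determined_course
        # Confidence too low: warn the driver / request a take-over ...
        issue_takeover_request()
        # ... and fall back to the retrieved course, e.g. for vehicles configured
        # without a steering wheel, where a driver take-over is not possible.
        return retrieved_course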





BRIEF DESCRIPTION OF THE DRAWINGS

Further advantageous configurations can be seen in the drawings, in which:



FIG. 1 shows a schematic flow chart of an embodiment of the method;



FIG. 2 shows a schematic representation of an embodiment of the system.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

In FIG. 1, a schematic flow chart of an embodiment of the method is shown. In a first step S1, an environment image or a sequence of environment images is captured by means of at least one surroundings detection sensor. In a step S2, driving lane boundaries are detected in the environment image or the sequence of environment images. In step S3, a driving lane course is determined based on the detected driving lane boundaries. In a further step S4, a further driving lane course is retrieved from a data source. In step S5, the plausibility of the determined driving lane course is checked by verifying a matching of the driving lane courses. In a further step S6, a degree of matching is identified. In step S7, a confidence value is set based on the degree of matching. Finally, in step S8, it is decided whether the determined driving lane course is provided to a driving function.
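
A condensed, hypothetical end-to-end sketch of steps S1 to S8, reusing the helper functions sketched above; the sensor and driving-function interfaces are placeholders, and treating a failed curve-radius check as zero confidence is an assumption:

    # Hypothetical pipeline combining steps S1-S8 using the sketches above.
    def plausibility_checked_lane_course(sensor, data_sources, driving_function,
                                         threshold=0.9):
        image = sensor.capture()                              # S1
        keypoints = sensor.detect_lane_boundaries(image)      # S2
        coeffs = fit_lane_course(keypoints)                   # S3
        determined = sample_lane_course(coeffs)
        retrieved = retrieve_further_lane_course(             # S4
            data_sources, sensor.ego_position())
        plausible = curve_radii_match(determined, retrieved)  # S5
        matching = degree_of_matching(determined, retrieved)  # S6
        confidence = confidence_from_matching(matching)       # S7
        if not plausible:
            confidence = 0.0
        if confidence > threshold:                            # S8
            driving_function.set_lane_course(determined)
        else:
            issue_takeover_request()
            driving_function.set_lane_course(retrieved)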



FIG. 2 shows a schematic representation of an embodiment of the system 1. The system 1 comprises a surroundings detection sensor 2 as well as a data processing device 3. The data processing device 3 is connected to the surroundings detection sensor 2 via a data connection D.

Claims
  • 1. A method for increasing the safety of driving functions in a partially automated or fully autonomous vehicle, including the following steps: capturing an environment image or a sequence of environment images by means of at least one surroundings detection sensor, detecting driving lane boundaries in the environment image or the sequence of environment images, determining a driving lane course based on the detected driving lane boundaries, the driving lane course including a route of the driving lane in front of the vehicle; retrieving a further driving lane course from a data source; checking the plausibility of the determined driving lane course by determining whether the determined driving lane course matches the retrieved further driving lane course; identifying a degree of matching of the determined driving lane course to the retrieved further driving lane course, including verifying a degree of matching of curve radii of curves of the determined driving lane course and the retrieved further driving lane course; setting a confidence value based on the degree of matching; providing the determined driving lane course to a driving function when the confidence value is above a predetermined confidence threshold value, the driving function comprising a lane keeping assistant function; and providing the retrieved further driving lane course to the lane keeping assistant function when the confidence value below the predetermined confidence threshold value is present.
  • 2. The method according to claim 1, wherein the further driving lane course is retrieved from an HD map, a cloud or an infrastructure.
  • 3. The method according to claim 1, wherein a warning is outputted to the driver when the confidence value below the predetermined confidence threshold value is present.
  • 4. A system for increasing the safety of driving functions in a partially automated or fully autonomous vehicle, comprising at least one surroundings detection sensor as well as a data processing device, wherein the data processing device is configured for performing the method according to claim 1.
Priority Claims (1)
Number Date Country Kind
10 2020 202 964.4 Mar 2020 DE national
US Referenced Citations (54)
Number Name Date Kind
5337245 Matsuzaki Aug 1994 A
6282478 Akita Aug 2001 B1
8190361 Lee May 2012 B2
8824826 Lukac Sep 2014 B2
10210631 Cinnamon Feb 2019 B1
10739152 Fryer Aug 2020 B2
10902521 Kanevsky Jan 2021 B1
20040064241 Sekiguchi Apr 2004 A1
20060241854 Tu Oct 2006 A1
20070118276 Suzuki May 2007 A1
20090222203 Mueller Sep 2009 A1
20100082238 Nakamura Apr 2010 A1
20110046843 Caveney Feb 2011 A1
20130063595 Niem Mar 2013 A1
20130070318 Byun Mar 2013 A1
20140039716 Buerkle Feb 2014 A1
20140379164 Joh Dec 2014 A1
20150177007 Su Jun 2015 A1
20160176333 Langkabel Jun 2016 A1
20160375908 Biemer Dec 2016 A1
20170025017 Thomas Jan 2017 A1
20170097241 Prokhorov Apr 2017 A1
20170178499 Dong Jun 2017 A1
20170210359 Brandin Jul 2017 A1
20170351925 Yeh Dec 2017 A1
20180046193 Takada Feb 2018 A1
20180156626 Kang Jun 2018 A1
20180178785 Lin Jun 2018 A1
20190063945 Liu Feb 2019 A1
20190101405 Feng Apr 2019 A1
20190111922 Nath Apr 2019 A1
20190145784 Ma May 2019 A1
20190179324 Rottkamp Jun 2019 A1
20190318174 Miklos Oct 2019 A1
20190347821 Stein Nov 2019 A1
20190392715 Strauß Dec 2019 A1
20200031335 Ohmura Jan 2020 A1
20200082183 Liu Mar 2020 A1
20200122717 Kim Apr 2020 A1
20200180610 Schneider et al. Jun 2020 A1
20200184308 Li Jun 2020 A1
20200247433 Scharfenberger Aug 2020 A1
20200271453 Wang Aug 2020 A1
20200310450 Reschka Oct 2020 A1
20200357138 Xiang Nov 2020 A1
20200380383 Kwong Dec 2020 A1
20210033416 Vladimerou Feb 2021 A1
20210063172 Jung Mar 2021 A1
20210089807 Liu Mar 2021 A1
20210107520 Oltmann Apr 2021 A1
20210188356 Goto Jun 2021 A1
20210365694 Lee Nov 2021 A1
20220009526 Campanale Jan 2022 A1
20220108545 Lee Apr 2022 A1
Foreign Referenced Citations (6)
Number Date Country
102016007567 Dec 2017 DE
102016214045 Feb 2018 DE
102018212219 Jan 2020 DE
102018212555 Jan 2020 DE
102018131466 Jun 2020 DE
102018222227 Jun 2020 DE
Non-Patent Literature Citations (1)
Entry
German Search Report for German Patent Application No. 10 2020 202 964.4, dated Oct. 16, 2020, 8 pages, German Patent and Trademark Office, Muenchen, Germany, with English partial translation, 7 pages.
Related Publications (1)
Number Date Country
20210279483 A1 Sep 2021 US