The present invention relates to a device and a method for localizing a vehicle in its surroundings. The vehicle has surround sensors, which detect views of the surroundings at first times and supply these to an evaluation unit, and furthermore a communication interface, via which up-to-date surroundings data regarding the current surroundings of the vehicle are transmitted to the evaluation unit at second times. The vehicle is localized by superimposing, in the evaluation unit, the surroundings data detected by the surround sensors at the first times on the temporally corresponding surroundings data transmitted via the communication interface. If it is detected that features in the surroundings data detected by the sensors and/or in the surroundings data supplied via the communication interface occur multiple times in the data pertaining to one point in time and represent one or multiple objects, these features are transmitted to the evaluation unit only once; for each repeated occurrence of the features in the data pertaining to that point in time, only the positional data of the repeatedly occurring object are transmitted again.
German patent document DE 10 2004 075 922 A1 discusses a device for transmitting image data from a video camera situated in a vehicle to an image evaluation unit situated in the vehicle at a distance from the video camera. A device for data reduction is provided in spatial proximity to the video camera; it reduces the image data produced by the video camera in a manner adapted to the image evaluation unit, and the reduced image data are then transmitted to the image evaluation unit.
Modern driver assistance systems and vehicle guidance systems, which guide vehicles in highly automated or autonomous fashion, require a localization of the vehicle in its surroundings that is as exact as possible. An aspect of the present invention is to improve the localization of a vehicle in its surroundings: views of the surroundings of the vehicle are detected by surround sensors and supplied to an evaluation unit, and up-to-date surroundings data regarding the current surroundings of the vehicle are transmitted via a communication interface, thus making a highly precise localization of the vehicle possible. At the same time, it is possible to reduce the data transmission from the surround sensors to the evaluation unit and/or from the communication interface to the evaluation unit and/or from infrastructure devices to the communication interface of the vehicle, without having to accept losses in the transmitted information in the process.
According to the present invention, this is achieved by the combination of the features described herein. Advantageous further developments and refinements are derived from the further descriptions herein.
In practice, certain features occur multiple times in the image information that is transmitted, for example, from a video camera mounted in the vehicle to an evaluation unit likewise mounted in the vehicle. Conventionally, each of these image features must be transmitted separately, which results in a very large quantity of information to be transmitted and processed. By transmitting each of these image features only once and, for each repeated occurrence of the feature in the image, additionally transmitting only the new position of the feature in the image, it is possible to reduce the quantity of image information to be transmitted and evaluated significantly.
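The transmission scheme described above can be sketched as follows; this is an illustrative Python sketch, and all names (`Feature`, `encode_frame`) are assumptions rather than terms from the patent. Each distinct feature descriptor enters the encoded frame once; every further occurrence is reduced to a reference index plus its position:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Feature:
    """Descriptor of a detected feature (e.g. one crosswalk stripe)."""
    kind: str
    descriptor: bytes  # illustrative payload, e.g. an edge template

def encode_frame(features):
    """Encode a frame so that each distinct feature is transmitted once.

    `features` is a list of (Feature, (x, y)) occurrences. Returns a
    table of unique features plus, for every occurrence, only a
    (table index, x, y) placement record.
    """
    table = []       # unique feature descriptors, sent in full once
    index = {}       # feature -> position in `table`
    placements = []  # per-occurrence (table index, x, y) records
    for feat, (x, y) in features:
        if feat not in index:
            index[feat] = len(table)
            table.append(feat)
        placements.append((index[feat], x, y))
    return table, placements
```

With three occurrences of two distinct features, the full descriptor is sent twice and the third occurrence shrinks to an index-plus-position record, which is the data reduction the text describes.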
The described idea is not limited to a video sensor system and the transmission of image information; it may also be applied to a stereo-video, radar, lidar or ultrasonic sensor system, or a combination of these, in that the respective features of the object detection in the vehicle surroundings, from which the type of object can be inferred, are transmitted and processed further.
Advantageously, the features existing multiple times in a sensor view are objects such as pedestrian crosswalk stripes, guardrail sections, guardrail posts on which the guardrails are fastened, broken lane markings delineating adjacent driving lanes, delineator posts marking the edge of the roadway, directional arrows on the roadway that signal the driving direction in the respective lane, or similar features occurring multiple times and repeatedly within the sensor information. In this connection, there may be a provision for the respective characteristics in the sensor information to be defined and programmed in before the method of the present invention is taken into operation. It is furthermore conceivable that an algorithm is executed in connection with the present invention which searches for regularly recurring object patterns in the sensor information; if such objects occur with sufficient frequency, their characteristics are extracted by this software and stored and supplemented in a database of the features that exist multiple times, so that the number of such features may be continuously increased and adapted to the respective situation.
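The self-extending database of multiply occurring features might be organized as in the following sketch; the class and parameter names (`FeatureCatalog`, `min_occurrences`) are illustrative assumptions, not terms from the patent. Pre-programmed characteristics form the seed, and signatures that recur often enough are promoted into the catalog:

```python
class FeatureCatalog:
    """Illustrative self-extending catalog of recurring features."""

    def __init__(self, min_occurrences=5, seed=()):
        # Characteristics defined and programmed in before operation.
        self.known = set(seed)
        # Occurrence counts for not-yet-promoted signatures.
        self.counts = {}
        self.min_occurrences = min_occurrences

    def observe(self, signature):
        """Register a detected object signature.

        Returns True if the signature is (now) a known multiply
        occurring feature; promotes it once it recurs often enough.
        """
        if signature in self.known:
            return True
        self.counts[signature] = self.counts.get(signature, 0) + 1
        if self.counts[signature] >= self.min_occurrences:
            self.known.add(signature)
            return True
        return False
```

A seeded signature such as a crosswalk stripe is recognized immediately, while a new signature such as a guardrail post is added only after it has been observed the configured number of times.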
It is furthermore advantageous that the frequent and regularly occurring features in the sensor data are ascertained using a filter adapted to the one or to the multiple objects. Particularly suitable for this purpose is the use of a comb filter or a data processing algorithm that is able to detect regularly recurring patterns.
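As a crude stand-in for such a comb filter, a pattern detector can look for a dominant, roughly constant gap between successive detections along the roadway (e.g. equally spaced guardrail posts). The following Python sketch is an assumption about how this could be done, not the patent's own algorithm:

```python
from collections import Counter

def regular_spacing(positions, tolerance=0.5, min_count=3):
    """Detect a regularly recurring pattern in 1-D detection positions.

    Quantizes the gaps between successive detections to `tolerance`
    and returns the dominant gap if it occurs at least `min_count`
    times, else None (no regular pattern found).
    """
    gaps = [round((b - a) / tolerance) * tolerance
            for a, b in zip(positions, positions[1:])]
    if not gaps:
        return None
    spacing, count = Counter(gaps).most_common(1)[0]
    return spacing if count >= min_count else None
```

Detections at roughly two-meter intervals would yield a spacing of 2.0, whereas two isolated detections yield no pattern; a real comb filter would additionally exploit the phase of the recurrence, which this sketch ignores.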
It is furthermore advantageous that the first times and the second times are identical. This achieves the result that the data for localizing the vehicle, which were on the one hand obtained by the sensor from the surroundings of the vehicle and which are on the other hand received via a communication interface, concern identical times and thus both information sources describe the same point in time and may thus be superimposed. This makes it possible to determine the location of the vehicle in its surroundings even more precisely.
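Requiring identical first and second times amounts to pairing on-board sensor frames with communicated surroundings data only when their timestamps agree, so that both sources describe the same instant before superimposition. A minimal sketch, assuming timestamped `(timestamp, payload)` tuples sorted by time and a hypothetical skew tolerance:

```python
def pair_by_time(sensor_frames, received_frames, max_skew=0.05):
    """Pair sensor frames with received frames of matching timestamp.

    Both inputs are lists of (timestamp, payload) tuples sorted by
    timestamp; frames whose timestamps differ by more than `max_skew`
    seconds are skipped rather than superimposed.
    """
    pairs = []
    i = j = 0
    while i < len(sensor_frames) and j < len(received_frames):
        ts, sensor = sensor_frames[i]
        tr, received = received_frames[j]
        if abs(ts - tr) <= max_skew:
            pairs.append((ts, sensor, received))
            i += 1
            j += 1
        elif ts < tr:
            i += 1  # sensor frame has no temporal counterpart
        else:
            j += 1  # received frame has no temporal counterpart
    return pairs
```

Only the temporally matching pair survives; everything else is discarded instead of being superimposed across different points in time.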
Furthermore, the localization of the vehicle may be a highly precise localization, and the vehicle may be moved autonomously or in automated fashion in the current vehicle surroundings, that is, without driver intervention. In the case of automated or autonomous driving interventions, in which the driver does not directly participate in the task of driving, highly precise knowledge of the vehicle surroundings is necessary, for which purpose it is advantageous to superimpose multiple items of information from different information sources.
It is furthermore advantageous that the highly precise localization is a determination of the current location of the vehicle in its vehicle surroundings with a precision of approximately ±10 centimeters. This precision is particularly advantageous since it very closely approximates the distance estimation capacity of a human driver.
It is furthermore advantageous that the method additionally uses a GPS signal or a DGPS (differential GPS) signal. As an alternative to the GPS (global positioning system) signal, it is also possible to use a Glonass signal, a Galileo signal or a signal of another satellite positioning system, or several of these signals simultaneously.
It is furthermore advantageous that the surroundings data are detected by a sensor system that is a radar sensor system, a lidar sensor system, a video sensor system, an ultrasonic sensor system or a combination of these sensor types. Particularly in the context of autonomous or automated driving, a combination of several mutually differing sensor types is necessary, which is able to provide a trustworthy representation of the vehicle surroundings by superimposition.
It is furthermore advantageous that the current surroundings data, which are transmitted to the vehicle via the communication network, are surroundings data that were detected by the vehicle sensor systems of other vehicles. The other vehicles are road users that traveled through the same vehicle surroundings just prior to the host vehicle, collected surroundings data using their vehicle sensor systems in the process, and transmitted these data via a communication interface to a data-infrastructure unit. The host vehicle, which is currently traveling through the same vehicle surroundings, receives the previously detected data via the infrastructure unit and is thereby able to ascertain its own vehicle position in a highly precise manner. It is important in this regard that only those data are used that are still current, that is, data that originate from vehicle sensor systems whose vehicles traveled through the same vehicle surroundings only recently. It is particularly advantageous if the duration within which the surroundings data are regarded as still current is a period of up to a maximum of 5 minutes, 10 minutes or 15 minutes. Advantageously, surroundings data that are older than this period can no longer be regarded as current and are thus no longer taken into account.
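The currency window can be enforced with a simple age filter over timestamped records, as in the following sketch; the record layout (`recorded_at` field) and function name are assumptions for illustration. The 5-minute default reflects the shortest window named in the text:

```python
import time

MAX_AGE_S = 5 * 60  # currency window; the text names 5, 10 or 15 minutes

def current_records(records, now=None, max_age_s=MAX_AGE_S):
    """Keep only surroundings data that are still current.

    `records` are dicts with a `recorded_at` UNIX timestamp set by the
    preceding vehicle; anything older than `max_age_s` seconds is
    discarded and no longer taken into account.
    """
    now = time.time() if now is None else now
    return [r for r in records if now - r["recorded_at"] <= max_age_s]
```

A record captured 200 seconds ago passes a 300-second window, while one captured 400 seconds ago is dropped, exactly the "no longer current" behavior described above.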
It is furthermore advantageous that the current surroundings data are items of information that were detected and provided by vehicle sensor systems of vehicles that previously detected the same vehicle surroundings. This provides an infrastructure service that automatically detects current surroundings situations and provides them to subsequent vehicles, the subsequent vehicles in turn detecting data and providing these to vehicles that pass the same location at a later point in time. This measure makes it possible to update data regarding changing surroundings situations and always to provide up-to-date data. These data are uploaded to a data server, are stored and held ready for download by other road users that travel the same route section at a later time.
It is furthermore advantageous that the communication interface is a mobile telephone connection or a digital radio network or a vehicle-to-infrastructure network (C2I network) or a vehicle-to-vehicle network (C2C network) or an interface to a navigation system. The navigation system may in particular be one having a stored digital map or one that accesses digital map data completely or at least partially via an interface to a data cloud.
The implementation of the method of the present invention in the form of a control element provided for a control unit of an adaptive distance and velocity control of a motor vehicle is of particular importance. For this purpose, a program, which is executable on a computer, in particular on a microprocessor or signal processor, and is suitable for carrying out the method of the invention, is stored on the control element. In this case, the present invention is thus implemented by a program stored in the control element such that this control element equipped with the program represents the present invention in the same manner as the method which the program is suited to implement. In particular, an electrical storage medium, for example a read-only memory, may be used as control element.
Additional features, application options and advantages of the present invention result from the following description of exemplary embodiments of the present invention, which are shown in the figures of the drawing. In this context, all of the described or represented features, alone or in any combination, form the subject matter of the present invention, regardless of their combination in the patent claims or their antecedent reference, and regardless of their wording and representation in the specification and in the drawings.
Exemplary embodiments of the present invention are explained below with reference to drawings.
In step S21 of
Number | Date | Country | Kind |
---|---|---|---|
10 2014 221 888 | Oct 2014 | DE | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2015/073641 | 10/13/2015 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2016/066419 | 5/6/2016 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
5584035 | Duggan | Dec 1996 | A |
6922632 | Foxlin | Jul 2005 | B2 |
7148913 | Keaton | Dec 2006 | B2 |
8934709 | Saptharishi | Jan 2015 | B2 |
9240029 | Lynch | Jan 2016 | B2 |
9335766 | Silver | May 2016 | B1 |
9507346 | Levinson | Nov 2016 | B1 |
9612123 | Levinson | Apr 2017 | B1 |
9632502 | Levinson | Apr 2017 | B1 |
9734455 | Levinson | Aug 2017 | B2 |
10056001 | Harris | Aug 2018 | B1 |
10409284 | Kentley-Klay | Sep 2019 | B2 |
20040073360 | Foxlin | Apr 2004 | A1 |
20100198513 | Zeng | Aug 2010 | A1 |
20110190972 | Timmons | Aug 2011 | A1 |
20110210965 | Thorpe | Sep 2011 | A1 |
20130325243 | Lipkowski | Dec 2013 | A1 |
20140121964 | Stanley | May 2014 | A1 |
20140350852 | Nordbruch | Nov 2014 | A1 |
20150025917 | Stempora | Jan 2015 | A1 |
20150377607 | Einecke | Dec 2015 | A1 |
20160207526 | Franz | Jul 2016 | A1 |
20160305794 | Horita | Oct 2016 | A1 |
20170120904 | Kentley | May 2017 | A1 |
20170124476 | Levinson | May 2017 | A1 |
20170124781 | Douillard | May 2017 | A1 |
20170248963 | Levinson | Aug 2017 | A1 |
20170316333 | Levinson | Nov 2017 | A1 |
20170351261 | Levinson | Dec 2017 | A1 |
20180059779 | Sisbot | Mar 2018 | A1 |
20180061129 | Sisbot | Mar 2018 | A1 |
20180204111 | Zadeh | Jul 2018 | A1 |
20180217251 | Stanley | Aug 2018 | A1 |
20180233047 | Mandeville-Clarke | Aug 2018 | A1 |
20180282955 | McClendon | Oct 2018 | A1 |
20180292834 | Kindo | Oct 2018 | A1 |
20180307915 | Olson | Oct 2018 | A1 |
20190084577 | Nobre | Mar 2019 | A1 |
20190101649 | Jensen | Apr 2019 | A1 |
20190137287 | Pazhayampallil | May 2019 | A1 |
20190145784 | Ma | May 2019 | A1 |
20190176794 | Pinto, IV | Jun 2019 | A1 |
20190271559 | Colgate | Sep 2019 | A1 |
20190272446 | Kangaspunta | Sep 2019 | A1 |
Number | Date | Country |
---|---|---|
102107661 | Jun 2011 | CN |
102576075 | Jul 2012 | CN |
103538588 | Jan 2014 | CN |
102009044284 | Apr 2010 | DE |
102010002092 | Dec 2010 | DE |
102012211391 | Jan 2014 | DE |
102012219637 | Apr 2014 | DE |
1669273 | Jun 2006 | EP |
H082228 | Jan 1996 | JP |
H0822281 | Jan 1996 | JP |
2007156754 | Jun 2007 | JP |
2009168567 | Jul 2009 | JP |
2014089691 | May 2014 | JP |
Entry |
---|
International Search Report for PCT/EP2015/073641, dated Jun. 7, 2016. |
Number | Date | Country | |
---|---|---|---|
20170248962 A1 | Aug 2017 | US |