The present application is the national stage of International Patent Application No. PCT/EP2017/055150, filed Mar. 6, 2017, and claims priority under 35 U.S.C. § 119 to DE 10 2016 207 463.6, filed in the Federal Republic of Germany on Apr. 29, 2016, the content of each of which is incorporated herein by reference in its entirety.
German Patent Application DE 10 2013 216 951 A1 describes a radar sensor for motor vehicles, including an antenna array having at least two groups of antenna elements whose effective directions differ in elevation, a control device designed for alternately activating the groups, and an evaluation device for evaluating the radar echoes received by the antenna array and for localizing objects by angular resolution, the evaluation device being designed to estimate the elevation angle of the objects on the basis of the radar echoes received from the various groups.
German Patent Application DE 10 2013 019 803 A1 describes a method for determining an object height from radar data acquired by a radar device mounted on a vehicle, a change over time in a distance of the object from the radar device being determined, and an intensity modulation of an echo signal received by the radar device being evaluated.
According to an example embodiment of the present invention, a method for operating at least one vehicle relative to at least one passable object in the surrounding area of the at least one vehicle includes inputting map data values from a map, the map data values including the at least one passable object in the form of first object data values; recording surrounding area data values, which represent the surrounding area of the at least one vehicle and include the at least one passable object in the form of second object data values; reconciling the input map data values with the recorded surrounding area data values in accordance with predefined first comparison criteria; and operating the at least one vehicle as a function of the reconciliation of the data values.
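Purely for illustration, the four method steps can be sketched in Python as follows; this is a minimal sketch, and all function names, data layouts, and thresholds are hypothetical assumptions rather than part of the claimed method.

```python
# Minimal illustrative sketch of the four method steps; all names, data
# layouts, and thresholds here are hypothetical, not part of the claimed
# method.

def input_map_data(map_objects, vehicle_x, radius=200.0):
    # Step 1: input map data values (first object data values) near the vehicle.
    return [o for o in map_objects if abs(o["x"] - vehicle_x) <= radius]

def record_surroundings(sensor_readings):
    # Step 2: record surrounding-area data values (second object data values).
    return list(sensor_readings)

def reconcile(first_values, second_values, tolerance=2.0):
    # Step 3: reconcile both data records according to a predefined first
    # comparison criterion (here: positional agreement within a tolerance).
    return [
        o for o in first_values
        if not any(abs(o["x"] - s["x"]) <= tolerance for s in second_values)
    ]

def operate(discrepancies):
    # Step 4: operate the vehicle as a function of the reconciliation result.
    return "adapt_driving_function" if discrepancies else "continue_normally"
```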
A map is understood here to be, first, a (two-dimensional) map as used for navigation, and/or, secondly, a (three-dimensional) map that includes data enabling sensor data recorded by a vehicle, for example, to be reconciled with data stored in the map. This means, for example, that the (three-dimensional) map was created by a first vehicle recording surrounding area data using sensors thereof, such as radar, ultrasonic, lidar, video, etc., and storing them in the form of a (three-dimensional) map, for example, in conjunction with GPS data. If such a map is then stored within a second vehicle, this second vehicle can likewise record the surrounding area thereof using the sensors thereof and reconcile the data stored in the (three-dimensional) map with the data it itself recorded. Thus, for example, the map can first be completed and/or updated, and the already stored map can also be used to recognize changes that may influence the safety of the second vehicle and to initiate appropriate measures. The maps can be (physically) separate from each other, as well as in the form of one map; the individual map data can be provided, for example, as the (two-dimensional) map for navigating, as a first (three-dimensional) radar map, and/or as a second (three-dimensional) video map, in the form of different map layers.
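As one possible illustration of such map layers, the individual map data could be organized as follows; the layer names and sample contents are purely hypothetical.

```python
# Hypothetical illustration of one map provided as different layers;
# the layer names and sample contents are invented for this sketch.

layered_map = {
    "navigation_2d": {"roads": ["A8", "B27"]},       # (two-dimensional) navigation layer
    "radar_3d": {"grid": [[0.0, 0.9], [0.1, 0.0]]},  # first (three-dimensional) radar map
    "video_3d": {"features": ["bridge_pillars"]},    # second (three-dimensional) video map
}
```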
The at least one vehicle is to be understood as a partially, highly, or fully automated vehicle, as well as a non-automated vehicle.
An object can be understood to be anything stored in the (two-dimensional and/or three-dimensional) map, such as landscape features including lakes, mountains, woods, etc., and/or buildings, bridges, parts of the traffic infrastructure, etc., as well as any objects that can be recorded by sensors of a vehicle. Objects can also include, inter alia, those that only temporarily influence an operation of the at least one vehicle.
An advantage of the present invention resides in that the reconciliation yields additional information that is highly relevant to a safe operation of the at least one vehicle and that cannot be obtained with a conventional use of a map or of sensor data alone. This information can be made available, first, to an operator or driver of the at least one vehicle, allowing the operator or driver to recognize potential dangers to the operator or driver and/or to the at least one vehicle and to respond suitably (and earlier); and, secondly, to a driver assistance function, for example, that would perform its operation incorrectly and/or incompletely on the basis of the map data or the sensor data alone. A further advantage is that, in addition to safety-critical information, information relating to ride comfort is obtained: by recognizing in a timely manner that an object is not classified as passable, a bypassing thereof can be initiated in a timely fashion.
In an especially preferred example embodiment, the at least one passable object is an object, in particular a bridge or a tunnel, that can be passed underneath.
It is especially the safety aspect that is brought to bear in this case because sensors often produce erroneous detections in areas in a tunnel or underneath a bridge, and incorrect conclusions are then drawn on the basis of this sensor data. This can adversely affect the safety of the at least one vehicle as well as of the occupants. Reconciling the data, i.e., distinguishing between data already stored in a map and newly recorded data, makes it possible to recognize an object underneath a bridge as an obstacle, for example. This would not be possible without the reconciliation.
The at least one vehicle is preferably operated in such a way that the height of the at least one vehicle is considered.
An advantage here is that special consideration can also be given to objects that must be passed underneath, especially in blind areas, such as tunnels or underneath a bridge, thereby minimizing any risks when driving underneath such an object.
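A simple way to picture this consideration of the vehicle height is a clearance check of the following kind; the margin value and function names are arbitrary assumptions for this sketch, not values from the description.

```python
# Hypothetical clearance check; the 0.2 m safety margin is an arbitrary
# assumption for this sketch.

def can_pass_underneath(vehicle_height_m, clearance_m, margin_m=0.2):
    """True if the vehicle, including a safety margin, fits underneath
    the object to be passed underneath (e.g., a bridge or a tunnel)."""
    return vehicle_height_m + margin_m <= clearance_m

print(can_pass_underneath(3.8, 4.0))  # True: 3.8 m + 0.2 m <= 4.0 m
print(can_pass_underneath(3.9, 4.0))  # False: 3.9 m + 0.2 m > 4.0 m
```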
In an especially preferred example embodiment, the first object data values include the at least one passable object in the form of recorded sensor data.
The reconciliation is preferably performed in accordance with the predefined first comparison criteria in such a way that it is determined whether the at least one passable object is at least temporarily passable or impassable for the at least one vehicle.
The map can hereby be temporarily updated, for example, with a notification or danger message being transmitted to other vehicles.
The map data values are preferably updated with regard to the at least one passable object.
In an especially preferred example embodiment, the operation is carried out in such a way that at least one driving assistance function of the at least one vehicle, in particular at least one automated driving assistance function of the at least one vehicle, is performed and/or modified in dependence upon the reconciliation of the data values.
An especially preferred example embodiment provides that an emergency braking be prepared and/or performed as a driving assistance function in response to the at least one passable object being recognized as temporarily impassable.
According to an example embodiment of the present invention, a device for operating at least one vehicle relative to at least one passable object in the surrounding area of the at least one vehicle includes first means for inputting map data values from a map, the map data values including the at least one passable object in the form of first object data values; second means for recording surrounding area data values, which represent the surrounding area of the at least one vehicle and include the at least one passable object in the form of second object data values; third means for reconciling the input map data values with the recorded surrounding area data values in accordance with predefined first comparison criteria; and fourth means for operating the at least one vehicle as a function of the reconciliation of the data values.
Exemplary embodiments of the present invention are illustrated in the drawings and are explained in greater detail in the following description.
First means 111 are primarily designed for inputting 310 object data stored in map 120 as first object data values.
Furthermore, the device includes second means 112 for recording 320 surrounding area data values. These include sensors 101, which can be an integral part both of device 110 and of vehicle 100. The second means are thereby designed to allow access to the recorded data values of sensors 101 of vehicle 100 and use thereof for the method according to the present invention.
The sensors are video, lidar, radar, and/or ultrasonic sensors, for example. Other sensors suited for capturing an object in surrounding area 250 of vehicle 100 can also be used for this purpose. Moreover, GPS and/or other position-finding sensors can be used, which can be an integral part both of device 110 and of vehicle 100. They can be used, for example, to determine objects more reliably and/or more rapidly on the basis of knowledge of the vehicle location, by accessing two-dimensional map 120, for example, to compare the objects to already stored objects and to identify them.
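A position-based lookup of this kind might, purely as a sketch, look as follows; the local metric coordinate frame and the search radius are assumptions, and the function name is hypothetical.

```python
import math

# Hypothetical sketch of a position-based lookup: the GPS-derived vehicle
# position is used to retrieve objects already stored in the map, so that
# sensed objects can be compared with and identified as stored ones.

def expected_objects(map_objects, vehicle_xy, radius_m=150.0):
    """map_objects: iterable of (object_id, x, y); vehicle_xy: (x, y),
    both in the same local metric frame (an assumption of this sketch)."""
    vx, vy = vehicle_xy
    return [
        (oid, x, y)
        for (oid, x, y) in map_objects
        if math.hypot(x - vx, y - vy) <= radius_m
    ]
```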
The second means are primarily designed to use sensors 101 to record 320 objects in the form of second object data values.
Third means 113 make it possible to compare 330 the first object data values to the second object data values. This can be accomplished, for example, by the first object data values including an object in the form of radar values that were already recorded in advance by another vehicle, for example. The second object data values thereby include the same object, which is to be expected in surrounding area 250 of vehicle 100 on the basis of the location knowledge of vehicle 100, for example, likewise in the form of radar values, which were actually recorded by radar sensors 101.
At this stage, these data values are compared 330 in that differences between the two data records, for example, are sought and identified as such. This is accomplished by appropriate computer programs that are likewise included in third means 113.
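One conceivable form of such a difference search, treating both radar data records as occupancy grids aligned via the known vehicle location, is sketched below; the grid representation and the threshold are assumptions, not taken from the description.

```python
# Hypothetical sketch of comparison 330: both data records are treated as
# occupancy grids over the same area, and cells whose values differ by more
# than a threshold are flagged as differences.

def find_differences(stored_grid, recorded_grid, threshold=0.3):
    """Grids: 2-D lists of occupancy probabilities in [0, 1], aligned
    beforehand via the known vehicle location (an assumption)."""
    flagged = []
    for i, (row_a, row_b) in enumerate(zip(stored_grid, recorded_grid)):
        for j, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                flagged.append((i, j, a, b))
    return flagged

# A cell that is free in the stored map (value near 0) but occupied in the
# new recording (value near 1) may indicate an obstacle, e.g., underneath
# a bridge.
```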
Following the reconciliation 330 of the first and second object data values by third means 113, the reconciliation results are transmitted to fourth means 114.
Furthermore, third means 113 are thereby designed to enable map 120 to be updated following the reconciliation 330 of the first and second object data values when it is determined, for example, that the second object data values are consistently correct in comparison to the first object data values.
This can occur, for example, when sensors 101 of vehicle 100 detect a bridge that is not yet stored in map 120.
Fourth means 114 can be an integral part both of device 110 and of vehicle 100, for example, in the form of a control unit that is designed for performing driving assistance functions, such as emergency-braking maneuvers. The reconciliation results are thereby forwarded in a way that enables the driving assistance function to use the results to appropriately adapt and influence the maneuvers to be performed.
In the situation shown here by way of example, an object 210, which prevents a safe passage of vehicle 100, is located underneath the bridge in the lane of vehicle 100. Since this object 210 is located underneath the bridge, it is hardly possible for it to be recorded by the sensors of vehicle 100 and recognized as a danger. At this stage, reconciling the recorded second object data values with the input first object data values makes it possible for object 210 to be recognized due to a difference in the data values, whereby the operation 340 of vehicle 100 can be adapted accordingly. In the case of a non-automated vehicle, this can be effected, for example, by already preparing certain safety systems, such as an airbag or an emergency braking system, to ensure that the systems are available as quickly as possible in case a driver of vehicle 100 fails to brake. In the case of an automated vehicle, the trajectory, for example, can be adapted (taking into account the existing traffic), and vehicle 100 can be controlled to drive around object 210.
Moreover, it can occur that an object 200, such as the bridge, for example, is altogether recognized as an obstacle, since, for example, the area underneath the bridge is perceived as being so dark that a video sensor recognizes the actual passage possibility as an obstacle. At this stage, reconciliation 330 of the first and second object data values can result in operation 340 of vehicle 100 being adapted in such a way that no unnecessary emergency braking is triggered, but rather vehicle 100 is able to pass underneath the bridge without any problems.
The method is explained in the following on the basis of a flowchart.
In step 310, first object data values, which include the passable object, are input from map 120. In step 320, sensors 101 record second object data values, which likewise include passable object 200. In step 330, the first object data values are compared to the second object data values.
As a function of the comparison performed in step 330, the at least one vehicle 100 is operated in step 340 in such a way that it is determined whether or not there is a danger for the at least one vehicle 100 and/or for the occupants thereof upon passing passable object 200. If such a danger exists, step 341 follows; if there is no danger, the method ends with step 350.
In step 341, safety systems, such as an airbag or an emergency braking system, for example, are prepared for a potential collision.
In step 342, the decision is made as to whether an emergency braking is to be initiated already and, if indicated, it is also initiated. If an emergency braking is initiated, step 350 follows, and the method ends. If the decision is made that there is no need to initiate an emergency braking, step 320 follows, since sensors 101 may be able to better detect passable object 200 as the at least one vehicle 100 approaches passable object 200 and thus evaluate the situation more efficiently or accurately on the basis of reconciliation 330.
The method ends in step 350.
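The control flow of steps 310 through 350 can be summarized, purely as an illustrative sketch, as follows; all helper functions are hypothetical placeholders standing in for the operations described above.

```python
# Hypothetical sketch of the control flow of steps 310-350; the helper
# functions are placeholders for the operations described above.

def run_method(first_values, record_fn, max_cycles=10):
    """first_values: object data input from map 120 (step 310);
    record_fn(): returns freshly recorded second object data (step 320)."""
    for _ in range(max_cycles):
        second_values = record_fn()                   # step 320
        danger = first_values != second_values        # step 330 (stub check)
        if not danger:
            return "end"                              # step 340 -> step 350
        prepare_safety_systems()                      # step 341
        if emergency_braking_needed(second_values):   # step 342
            return "emergency_braking"                # -> step 350
        # Otherwise loop back to step 320: approaching the object may
        # yield a better detection and a more accurate reconciliation.
    return "end"

def prepare_safety_systems():
    print("airbag and emergency braking system prepared")

def emergency_braking_needed(second_values):
    # Stub decision: initiate braking once an obstacle is confirmed.
    return "obstacle" in second_values
```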
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
10 2016 207 463 | Apr. 29, 2016 | DE | national
PCT Information

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/EP2017/055150 | Mar. 6, 2017 | WO | 00

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO 2017/186385 | Nov. 2, 2017 | WO | A
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
4284971 | Lowry | Aug 1981 | A |
4477184 | Endo | Oct 1984 | A |
5389912 | Arvin | Feb 1995 | A |
5828320 | Buck | Oct 1998 | A |
6311123 | Nakamura | Oct 2001 | B1 |
6377205 | Eckersten | Apr 2002 | B1 |
6438491 | Farmer | Aug 2002 | B1 |
7605746 | Matsuura | Oct 2009 | B2 |
7877209 | Harris | Jan 2011 | B2 |
8199046 | Nanami | Jun 2012 | B2 |
8207836 | Nugent | Jun 2012 | B2 |
8289141 | Haberland | Oct 2012 | B2 |
8354920 | Kole | Jan 2013 | B2 |
8731816 | Dintzer | May 2014 | B2 |
8810382 | Laurita | Aug 2014 | B1 |
8935086 | Sadekar | Jan 2015 | B2 |
9121934 | Ohkado | Sep 2015 | B2 |
9297892 | Smith | Mar 2016 | B2 |
9472103 | Baskaran | Oct 2016 | B1 |
9477894 | Reed | Oct 2016 | B1 |
9495603 | Hegemann | Nov 2016 | B2 |
9546876 | Kleve | Jan 2017 | B2 |
9575170 | Kurono | Feb 2017 | B2 |
9618607 | Asanuma | Apr 2017 | B2 |
9718402 | Smyth | Aug 2017 | B2 |
9766336 | Gupta | Sep 2017 | B2 |
9847025 | Mohtashami | Dec 2017 | B2 |
10001560 | Ishimori | Jun 2018 | B2 |
10031224 | Aoki | Jul 2018 | B2 |
10317910 | Stein | Jun 2019 | B2 |
10460534 | Brandmaier | Oct 2019 | B1 |
10507766 | Gable | Dec 2019 | B1 |
10585183 | Cornic | Mar 2020 | B2 |
10585188 | Millar | Mar 2020 | B2 |
10586348 | Kawano | Mar 2020 | B2 |
20020161513 | Bechtolsheim | Oct 2002 | A1 |
20030001771 | Ono | Jan 2003 | A1 |
20030002713 | Chen | Jan 2003 | A1 |
20030011509 | Honda | Jan 2003 | A1 |
20040189451 | Zoratti | Sep 2004 | A1 |
20040201495 | Lim | Oct 2004 | A1 |
20050012603 | Ewerhart | Jan 2005 | A1 |
20070103282 | Caird | May 2007 | A1 |
20080111733 | Spyropulos | May 2008 | A1 |
20080189039 | Sadekar | Aug 2008 | A1 |
20090002222 | Colburn | Jan 2009 | A1 |
20090315693 | Nugent | Dec 2009 | A1 |
20100057353 | Friedman | Mar 2010 | A1 |
20100238066 | Lohmeier | Sep 2010 | A1 |
20110221628 | Kamo | Sep 2011 | A1 |
20120083964 | Montemerlo | Apr 2012 | A1 |
20120139756 | Djurkovic | Jun 2012 | A1 |
20130038484 | Ohkado | Feb 2013 | A1 |
20130099910 | Merritt | Apr 2013 | A1 |
20130103305 | Becker | Apr 2013 | A1 |
20130107669 | Nickolaou | May 2013 | A1 |
20130110346 | Huber | May 2013 | A1 |
20130222592 | Gieseke | Aug 2013 | A1 |
20140303886 | Roemersperger | Oct 2014 | A1 |
20150066349 | Chan | Mar 2015 | A1 |
20150120178 | Kleve | Apr 2015 | A1 |
20160028824 | Stenneth | Jan 2016 | A1 |
20170054948 | Angel | Feb 2017 | A1 |
20170113664 | Nix | Apr 2017 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
102010038970 | Feb 2012 | DE |
102013201799 | Aug 2014 | DE |
102013216951 | Feb 2015 | DE |
102013019803 | May 2015 | DE |
102014210259 | Dec 2015 | DE
102014216008 | Feb 2016 | DE |
1736797 | Dec 2006 | EP |
2149799 | Feb 2010 | EP |
2006119090 | May 2006 | JP |
2015126491 | Aug 2015 | WO |
2016192934 | Dec 2016 | WO |
Other Publications

International Search Report dated May 22, 2017, for corresponding International Application No. PCT/EP2017/055150, filed Mar. 6, 2017.
Prior Publication Data

Number | Date | Country
---|---|---
US 2019/0092291 A1 | Mar 2019 | US