The present invention relates to a method for detecting a traffic zone.
Traffic zone, as understood in this application, refers to those areas of the road, and the road users moving on them, which the driver of a vehicle is able to detect, in particular with the aid of optical and electronic aids. Currently, environment sensors such as ultrasonic sensors, radar sensors, mono or stereo video sensors, or laser scanners in different configurations are used for detecting the traffic zone. Suitable sensor configurations are derived from the particular requirements of the downstream driver assistance systems.
The configurations differ in the type, number, and arrangement of the individual sensors and sensor clusters used. Currently, the sensor data are mostly used directly for the driving functions. Thus, in ACC (Adaptive Cruise Control), the distance and relative velocity with respect to the preceding vehicle are used for following it at a constant time gap. No further analysis of the available signals is performed. The driving environment is not described in detail at this time, and the data delivered by the sensors are thus not fully exploited. If the data are suitably combined, however, additional information results, which may be put to further use if appropriately processed.
An object of the exemplary embodiments and/or exemplary methods of the present invention is to provide information about the number of traffic lanes on the road surface and the direction of movement of the vehicles in those traffic lanes. In addition, the method should make an average velocity available for each detected traffic lane. For example, five traffic lanes are detected: three to the left of the host vehicle's lane, the host vehicle's lane itself, and one traffic lane to its right. Oncoming vehicles are moving on the two traffic lanes farthest to the left; vehicles moving in the same direction as the host vehicle are on the three traffic lanes farthest to the right, including the host vehicle's lane. In addition, an average velocity is calculated for each of these five traffic lanes. The exemplary embodiments and/or exemplary methods of the present invention therefore relate to a method for determining the number of traffic lanes, the directions of travel, and the average velocities on the individual traffic lanes. Furthermore, an interface is provided which displays the generated information about the traffic flow to the driver in a suitable manner, and another interface is provided which conveys the information to driver assistance systems.
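The per-lane result can be pictured as a small data record. The following Python sketch is purely illustrative: the names (LaneInfo, direction, avg_speed_mps) and the speed values are assumptions, not taken from the description; only the five-lane example itself comes from the text above.

```python
from dataclasses import dataclass

@dataclass
class LaneInfo:
    index: int            # lane position, counted from the left
    direction: int        # -1 oncoming, +1 same direction as the host, 0 not detected
    avg_speed_mps: float  # average velocity of the objects on the lane

# The five-lane example from the text: oncoming traffic on the two lanes
# farthest to the left, host direction on the three lanes farthest to the
# right. The speed values are invented for illustration.
lanes = [
    LaneInfo(1, -1, 27.0),
    LaneInfo(2, -1, 31.0),
    LaneInfo(3, +1, 33.0),
    LaneInfo(4, +1, 30.0),  # the host vehicle's lane
    LaneInfo(5, +1, 25.0),
]
```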
The information about the traffic flow may be made available to the driver via a suitable display. The driver may use this additional information in planning his route, thus obtaining additional benefit from the installed sensor system. Downstream driver assistance systems may likewise use the information about the traffic flow. In the event of a lane change or a deviation into an adjacent traffic lane, knowledge of the direction of travel on that lane is valuable information (for example, for a possible warning). The method according to the present invention allows a driver assistance system having enhanced usefulness to the driver to be implemented. The course of the traffic lanes may thus be recognized even on roads without markings. If the host vehicle intends to make a lane change to a traffic lane on which oncoming traffic has previously been detected, a warning may be output for the driver. By comparing the average velocities on adjacent traffic lanes, the driver may be given a suggestion for a lane change. In the case of driver assistance systems having collision warning, the possibility of a risky evasive path leading into the oncoming traffic may be ruled out; instead, an earlier warning or a more intensive braking operation may be considered. In the event of a failure of the sensors provided for lane recognition via road marking detection, the method according to the present invention may continue to recognize traffic lanes and thus generate virtual traffic lane markings. In road construction areas in particular, markings are often absent or barely recognizable. In this case, too, the method according to the present invention may provide a fully adequate substitute for the detection of traffic lane markings by detecting traffic lanes on the basis of the traffic flow.
The present invention is elucidated in detail below with reference to the drawings.
The method basically works with any on-board surroundings sensors capable of detecting objects in the covered traffic zone. Radar sensors (close-range and long-range), video sensors (mono and stereo), and laser scanners are advantageously used. The method works exclusively with moving objects; stationary objects are filtered out in an input stage of the system executing the method. This advantageously prevents construction along the road's edge from being included in the detection of traffic lanes. However, traffic lanes used exclusively by stationary vehicles, such as the traffic lane labeled lane 5 in the figure, are consequently not detected either.
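As a minimal sketch of this input stage, assuming the sensors deliver object speeds relative to the host vehicle: adding the host speed yields the absolute speed, and objects whose absolute speed is near zero are discarded as stationary. The threshold value is an assumption.

```python
STATIONARY_THRESHOLD_MPS = 1.0  # assumed limit below which an object counts as stationary

def filter_moving(objects, host_speed_mps):
    """Keep only moving objects; each object carries its sensor-relative speed."""
    moving = []
    for obj in objects:
        absolute_speed = obj["rel_speed_mps"] + host_speed_mps
        if abs(absolute_speed) > STATIONARY_THRESHOLD_MPS:
            moving.append(obj)
    return moving

# Example: at 30 m/s host speed, a relative speed of -30 m/s marks a
# stationary object (e.g., roadside construction), which is removed,
# while -58 m/s marks an oncoming vehicle, which is kept.
print(filter_moving([{"rel_speed_mps": -30.0}, {"rel_speed_mps": -58.0}], 30.0))
```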
At this time, there are no approaches applicable in practice for the above-mentioned cases. The exemplary embodiments and/or exemplary methods of the present invention are therefore directed to a model for describing the traffic lanes. The model for detecting the traffic flow in the surroundings of the vehicle is based on the assumption that in general there is a plurality of traffic lanes, a maximum number of traffic lanes nmax being assumed. Parameters of the model include, in particular, the number of traffic lanes and the widths of the individual traffic lanes.
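Pictured as code, one hypothesis of this lane model could look like the following sketch; the container names and the value chosen for nmax are assumptions.

```python
from dataclasses import dataclass
from typing import List

N_MAX = 5  # assumed maximum number of traffic lanes n_max

@dataclass
class LaneModel:
    """One hypothesis of the traffic-lane model."""
    widths_m: List[float]      # one width per modeled lane, from left to right
    plausibility: float = 0.0  # how well the hypothesis matches the observations

model = LaneModel(widths_m=[3.5] * N_MAX)
```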
Determining an optimum set of parameters for the model is an important task: the set of parameters is optimum when it describes reality in the best possible manner. This optimization problem is solved in each cycle triggered by new sensor data; new sensor data are ascertained every 100 ms, for example. The parameters are advantageously ascertained in two stages. In a first stage, the optimum widths of the traffic lanes are determined (function modules 23, 24, 25). In a subsequent second stage, the direction of travel and the average velocity on the particular traffic lanes are determined (function module 26).
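A schematic of this cyclic two-stage evaluation is sketched below, with the stage contents only stubbed (they are elaborated in the sketches that follow). The function names are assumptions; only the 100 ms period comes from the text.

```python
CYCLE_S = 0.1  # a new sensor data set arrives roughly every 100 ms

def stage1_lane_widths(moving_objects):
    """Stage 1: determine the optimum lane widths (function modules 23-25)."""
    ...  # see the weighting sketch further below

def stage2_direction_speed(lane_model, moving_objects):
    """Stage 2: direction of travel and average speed per lane (module 26)."""
    ...

def on_new_sensor_data(objects, host_speed_mps):
    """One optimization cycle, triggered by a new set of sensor data."""
    # Input-stage filter: keep moving objects only (assumed 1 m/s threshold).
    moving = [o for o in objects if abs(o["rel_speed_mps"] + host_speed_mps) > 1.0]
    lane_model = stage1_lane_widths(moving)
    return stage2_direction_speed(lane_model, moving)
```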
In order to avoid strong oscillations of the result, all output quantities are advantageously low-pass filtered (function module 27). Since the resulting information is used neither for maneuvers that are critical in terms of driving dynamics nor for time-critical decisions, filtering with a rather large time constant is recommended. Erroneous individual models are thus reliably filtered out, and relatively stable traffic lane information is obtained. The model information is stored in a matrix (function module 20). The matrix has nmax columns, one column being provided for each traffic lane. Each row of the matrix contains a possible combination of traffic lane widths. A matrix element describes the plausibility that a certain traffic lane has a certain width. The matrix is recalculated in each cycle. At the beginning of each cycle, stationary objects are filtered out (function module 21), since stationary vehicles cannot be differentiated from construction along the road's edge and are therefore not used for determining the traffic lanes.
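The matrix layout described here can be sketched as follows, under the assumption of a small discrete set of candidate widths (the candidate values themselves are invented), together with a simple exponential low-pass for the output quantities.

```python
import itertools

N_MAX = 5
CANDIDATE_WIDTHS_M = (2.75, 3.25, 3.75)  # assumed discretization of lane widths

# One row per possible width combination, one column per traffic lane;
# element [r][c] holds the plausibility that lane c has the width which
# row r assigns to it. The matrix is recomputed in every sensor cycle.
width_rows = list(itertools.product(CANDIDATE_WIDTHS_M, repeat=N_MAX))
plausibility = [[0.0] * N_MAX for _ in width_rows]

def low_pass(previous, new, alpha=0.05):
    """Output filter with a rather large time constant (small alpha),
    suppressing oscillations and erroneous individual models."""
    return (1.0 - alpha) * previous + alpha * new
```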
Moving objects detected by the sensors of the host vehicle, in particular the other vehicles 40, 41, 43 from traffic zone 10, are assigned to the traffic lanes lane 1, lane 2, and lane 4 ascertained using the model. The plausibility for the existence of the particular lane increases with each assignment of this type. The currently valid model is the most plausible one, and it is this model that is output. If road markings are also recognized by a sensor that is sensitive to them, this information advantageously flows into the parameterization of the traffic lane parameters of the model: the width of such a traffic lane may then be detected directly via the sensor and need not be ascertained as a model parameter. The block diagram shown in the figure is elucidated below.
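A sketch of this assignment step, assuming the host vehicle's lateral position defines the origin: lane boundaries follow from the hypothesised widths, and every assignment of a moving object raises the plausibility of the lane's existence. All names and the example values are illustrative.

```python
def lane_boundaries(widths_m, host_lane_index):
    """Left-to-right lane boundaries, with the host centred in its own lane at y = 0."""
    left_edge = -sum(widths_m[:host_lane_index]) - widths_m[host_lane_index] / 2.0
    bounds = [left_edge]
    for w in widths_m:
        bounds.append(bounds[-1] + w)
    return bounds

def assign_to_lane(lateral_m, bounds):
    """Lane index for a lateral position, or None if outside the model."""
    for i in range(len(bounds) - 1):
        if bounds[i] <= lateral_m < bounds[i + 1]:
            return i
    return None

widths = [3.5] * 5
bounds = lane_boundaries(widths, host_lane_index=3)
existence = [0.0] * len(widths)
for lateral in (-9.8, -6.2, 4.1):  # example lateral positions of moving objects
    lane = assign_to_lane(lateral, bounds)
    if lane is not None:
        existence[lane] += 1.0     # each assignment supports the lane's existence
```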
The method according to the present invention, also known as “SIL” (Situation Interpretation Lane), is executed in the function module labeled with reference numeral 2. The input of function module 2 is connected to function modules 3, 4, 5, and 6. Function module 3 provides performance characteristics of the host vehicle (reference numeral 42 in the figure).
The sequence of the processing is illustrated by the arrows drawn in function module 2. Function module 2 in turn includes additional function modules 20, 21, 22, 23, 24, 25, 26, 27. Most of these function modules, namely function modules 22, 23, 24, 25, 26, and 27, are connected to function module 20. Only function module 21 is not connected to function module 20, but to function module 23. Function module 6 is connected to function module 21 within function module 2. Function module 3 is connected to function module 26, and function modules 4 and 7 are connected to function module 27. In the following, the functional relationship of the function modules and the sequence of the method steps are further elucidated. Data of objects that have been detected by the sensors of host vehicle 42 in the traffic zone 10 covered by the sensors are supplied to function module 21 via function module 6. The object data are selected in function module 21: stationary objects are eliminated, and only the data of moving objects are relayed to a function module 21.1. Interfering data from construction along the road's edge and the like are thus advantageously eliminated.
A list of the objects resulting from this selection is prepared in function module 21.1; the lateral positions of the detected objects with respect to host vehicle 42 are also contained in this list. As mentioned previously, the method according to the present invention depends on a model of the covered traffic zone. In particular, lane widths of the traffic lanes of the covered traffic zone are prepared in function module 22 and supplied to function module 20. The prepared values of the lane widths are also supplied, via function module 20, to function module 23, where the lanes described by the model are weighted, and the weighted data are in turn returned to function module 20.
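One conceivable form of this weighting (function module 23) is sketched below, assuming a Gaussian fit between the observed lateral positions and the lane centres a given width combination predicts; the description does not prescribe a particular weighting function, so both the Gaussian and the sigma value are assumptions.

```python
import math

def weight_combination(lateral_positions_m, bounds, sigma_m=0.8):
    """Score a width hypothesis: higher when objects sit near its lane centres."""
    centres = [(a + b) / 2.0 for a, b in zip(bounds, bounds[1:])]
    score = 0.0
    for y in lateral_positions_m:
        nearest = min(centres, key=lambda c: abs(c - y))
        score += math.exp(-((y - nearest) ** 2) / (2.0 * sigma_m ** 2))
    return score

# Boundaries of the five 3.5 m lanes from the previous sketch:
bounds = [-12.25, -8.75, -5.25, -1.75, 1.75, 5.25]
print(weight_combination([-9.8, -6.2, 4.1], bounds))
```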
The existence of the traffic lanes detected by the sensors is checked on the basis of the weighted lane data in function module 24. In function module 25, a plausibility check is performed on the basis of the weighted lane data, and the most plausible model of the traffic lanes is computed. This is then conveyed to function modules 26 and 27. The average velocity of the objects moving on a traffic lane and their direction of travel are ascertained in function module 26, which is also connected to function module 3 and receives data of the host vehicle therefrom. The result is relayed to function module 20. In function module 27, the data about the calculated traffic lanes and the most plausible models are combined and transmitted as output information to function module 7.
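A sketch of the second stage (function module 26), assuming each object carries a signed absolute speed where a positive sign means travel in the host direction; this sign convention and the names are assumptions.

```python
from statistics import mean

def direction_and_speed(speeds_by_lane):
    """speeds_by_lane: {lane index: [signed absolute speeds in m/s]}.
    Returns {lane index: (direction, average speed)}, with direction
    -1 for oncoming, +1 for the host direction, 0 for no moving objects."""
    result = {}
    for lane, speeds in speeds_by_lane.items():
        if not speeds:
            result[lane] = (0, 0.0)
            continue
        avg = mean(speeds)
        result[lane] = (1 if avg > 0.0 else -1, abs(avg))
    return result

# Example: oncoming traffic on lane 0, host-direction traffic on lane 3,
# no moving objects on lane 4.
print(direction_and_speed({0: [-27.0, -31.0], 3: [30.0, 29.0], 4: []}))
```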
The diagram of the further figure shows an example of a traffic zone 10 detected using the method according to the present invention.
The host vehicle is labeled with reference numeral 42. The traffic lanes detected using the method according to the present invention on which objects, here the vehicles 40, 41, move in the direction opposite to that of the host vehicle are labeled “−1.” The traffic lanes on which objects, here the other vehicle 43 and the host vehicle 42, move in the same direction as the host vehicle are labeled “1.” A traffic lane that has not been detected is labeled “0”; this may be an empty lane or an emergency lane on which there are only stationary objects. The model on which the method is based thus includes a total of five traffic lanes, four of which have been recognized as existing. The information about the traffic zone gained using the method according to the present invention is advantageously made visible to the driver via a suitable display. For example, the data may be displayed in a window of the instrument cluster or on the monitor of a navigation system already present in the vehicle. The number of detected traffic lanes may also advantageously be represented in a convenient manner in the form of appropriate symbols.
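A purely illustrative rendering of these labels for such a display; the symbols and the layout are assumptions, only the label values −1, 0, 1 come from the text.

```python
SYMBOLS = {-1: "v", 0: ".", 1: "^"}  # oncoming / not detected / host direction

def render_lanes(labels, host_lane_index):
    """One display cell per lane, marking the host vehicle's own lane."""
    cells = [SYMBOLS[d] for d in labels]
    cells[host_lane_index] = "H"
    return " | ".join(cells)

# The figure's example: two oncoming lanes, host direction on the next two
# lanes, and the rightmost lane not detected (only stationary objects on it).
print(render_lanes([-1, -1, 1, 1, 0], host_lane_index=3))  # v | v | ^ | H | .
```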
The direction of travel of the objects on the traffic lanes may advantageously be represented via arrows, as in the figure.
The method according to the present invention allows a driver assistance system having considerably enhanced usefulness to the driver to be implemented. The course of the traffic lanes may thus be recognized even on roads without markings. If the host vehicle intends to make a lane change to a traffic lane on which oncoming traffic has previously been detected (for example, a change from lane 3 to lane 2 in the figure), a warning may be output for the driver. By comparing the average velocities on adjacent traffic lanes, the driver may also be given a suggestion for a lane change.
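A hedged sketch of the two uses just named: warning when the target lane of an intended change carries oncoming traffic, and suggesting a change when an adjacent lane in the host direction flows noticeably faster. The 10% margin is an assumption.

```python
def advise_lane_change(lanes, current, target):
    """lanes: {index: (direction, average speed in m/s)}; returns advice or None."""
    t_dir, t_speed = lanes[target]
    c_dir, c_speed = lanes[current]
    if t_dir == -1:
        return "warning: oncoming traffic detected on the target lane"
    if t_dir == c_dir and t_speed > 1.1 * c_speed:
        return "suggestion: adjacent lane is moving faster"
    return None

# Example: changing into an oncoming lane triggers the warning.
print(advise_lane_change({1: (-1, 28.0), 2: (1, 30.0)}, current=2, target=1))
```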
With driver assistance systems having collision warning, the possibility of a risky evasive maneuver leading into the oncoming traffic may be ruled out; instead, an earlier warning or a more intensive braking operation may be considered. In the event of a failure of the sensors provided for lane recognition by road marking detection, the method according to the present invention may continue to recognize traffic lanes and thus generate virtual traffic lane markings. In road construction areas in particular, markings are often absent or barely recognizable. In this case, too, the method according to the present invention may provide a fully adequate substitute for the detection of traffic lane markings by detecting traffic lanes on the basis of the traffic flow.