Not Applicable.
Not Applicable.
The present invention relates in general to monitoring a location of a roadway lane relative to a vehicle, and, more specifically, to improved lane detection during times when optical identifiers are not present.
Automatic lane detection and monitoring is useful for supporting various driver assistance systems such as a lane departure warning system or a lane keeping assist system. The primary sensor used in conventional lane detection systems is vision-based, e.g., an optical camera. A lane detection algorithm detects lane markings such as painted lane lines or surface features corresponding to a road edge, and then estimates the vehicle's lateral position within the lane, the lane width, and the vehicle's heading angle with respect to the lane.
Currently used image processing technologies in lane detection algorithms have sufficiently advanced to detect various kinds of lane markings and road edges in a wide range of conditions. However, the lane markings on road surfaces can still be hard to detect. They may be worn away or covered by dirt. There are many other possible impediments that cause the optical system to fail to detect the lane location, such as shadows, overhead bridges, rain, and snow. In such cases, gaps may form in the representation of the lane being tracked. When the lane is lost by the optical system, the lane departure warning/lane keeping assist system is disabled so that no action is taken based on inaccurate or missing information.
It would be desirable to estimate or fill in any missing lane markings in order to improve overall system availability. When multiple lane borders are being tracked and the markings for one border temporarily disappear, it is known to reconstruct the missing border at a fixed offset distance from the detected border. Nevertheless, instances still occur when the camera-based system is unable to produce a valid output.
Another possibility for lane tracking is the use of geopositioning to pinpoint a vehicle location and correlate that location onto a digital map representing the roadway. Geographic coordinates are typically measured with an on-board GPS receiving unit and/or a dead-reckoning system based on inertial sensor outputs in the vehicle. In addition to the current position, these systems can also provide an instantaneous vehicle speed and heading angle.
Map databases have been constructed for most of the roads in the country, making it theoretically possible to determine lane placement of the vehicle. Geometric and attribute information about the roadway at the matching coordinates for the vehicle can be looked up from the digital map database. The collection of this road information around a vehicle is called an Electronic Horizon (EH). In a typical EH system, roadways are composed of many road segments (also called links) for which the road geometric and attribute information is defined. The geometric information of a road segment includes the longitude, latitude, elevation, horizontal curvature, and grade along the road. The road attribute information may include road signs, number of lanes, road function class (e.g., freeway, ramp, arterial), lane marking type, paved/unpaved, and divided/undivided.
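The road-segment (link) record described above can be sketched as a simple data structure. This is an illustrative sketch only; the field names and types are hypothetical and do not reflect the schema of any particular map database:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ShapePoint:
    """One sampled point along a road segment's pathline."""
    longitude: float   # degrees
    latitude: float    # degrees
    elevation: float   # meters
    curvature: float   # 1/m, horizontal curvature at this point
    grade: float       # rise over run along the road

@dataclass
class RoadSegment:
    """A link in the Electronic Horizon, pairing geometry with attributes."""
    segment_id: int
    shape_points: List[ShapePoint] = field(default_factory=list)
    num_lanes: int = 1
    function_class: str = "arterial"   # e.g., "freeway", "ramp", "arterial"
    lane_marking: str = "painted"
    paved: bool = True
    divided: bool = False
```

The one-dimensional pathline of the link is carried by the ordered list of shape points, while lane count and marking type are attributes of the whole segment, matching the description above.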
Although the number of lanes may be represented, the map database typically does not directly represent the coordinates of individual lanes because of the significant increase in the volume of data that would have to be represented. Instead, the links represent a one-dimensional pathline that typically corresponds with the centerline of the roadway. Even in the event that a digital map database does directly represent actual lane boundaries for a given roadway, issues of sporadic positional errors and intermittent availability of the geopositioning systems have limited the reliability of these systems. Consequently, optical camera-based lane monitoring systems have usually been preferred over GPS-based systems.
The present invention employs an optical-based system as a primary detector and uses a geopositioning system as a backup data source when optical data is unavailable, wherein the validity of the geopositioning data is enhanced using an offset adjustment derived during times that optical data is available.
In one aspect of the invention, an apparatus for a vehicle operated on a roadway having lane markers comprises an optical sensor providing optical data of the roadway. A first lane model is stored in an electronic memory in response to detected lane markers in the optical data. An electronic horizon system tracks a position of the vehicle and provides roadway data in response to the position. A second lane model is stored in the electronic memory in response to the roadway data. A confidence checker compares a discrepancy between the first and second lane models to a threshold in order to determine a confidence level. An output selector selects the first lane model when lane markers are detected in the optical data, and selects the second lane model if the lane markers are not detected in the optical data and the confidence level is greater than a predetermined level.
Referring now to
In the present invention, road segments represented in a digital map database are used to derive a secondary lane tracking model that can be used during times that a primary, optically-derived lane model becomes unavailable. As shown in
In order to identify a relative position of vehicle 10 with respect to pathline 22, the point on pathline 22 nearest to vehicle 10 is determined as shown in
dEH=((xj−xi)·(y0−yi)−(x0−xi)·(yj−yi))/p,

where (x0, y0) is the position of vehicle 10, (xi, yi) and (xj, yj) are consecutive points on pathline 22, and p=√((xj−xi)²+(yj−yi)²) is the distance between those two points.
The distance dEH gives the lateral offset distance between vehicle 10 and map-derived pathline 22, which, although it does not correspond with any particular lane border, should run parallel to all the lanes. Pathline 22 between points 27 and 28 further has a heading direction represented as an angle φEH, which is measured with respect to north. Heading angle φEH can be derived according to the formula:

φEH=arctan((xj−xi)/(yj−yi)).
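Assuming a local planar frame with x pointing east and y pointing north (in meters), the lateral offset dEH and heading φEH for one pathline segment can be computed as in the following sketch. The function name is hypothetical, and `atan2` is used in place of a plain arctangent so that all quadrants are handled:

```python
import math

def lateral_offset_and_heading(xi, yi, xj, yj, x0, y0):
    """Signed lateral offset d_EH and heading phi_EH for one pathline segment.

    (xi, yi), (xj, yj): consecutive pathline points (local east/north, meters)
    (x0, y0):           vehicle position in the same frame
    """
    # p: length of the pathline segment between the two points
    p = math.hypot(xj - xi, yj - yi)
    # Cross-product form of the point-to-line distance; positive when the
    # vehicle lies to the left of the direction of travel
    d_eh = ((xj - xi) * (y0 - yi) - (x0 - xi) * (yj - yi)) / p
    # Heading measured from north (the y axis), clockwise toward east
    phi_eh = math.atan2(xj - xi, yj - yi)
    return d_eh, phi_eh
```

For a segment pointing due north, a vehicle two meters to the east yields dEH = −2 and φEH = 0 under this sign convention.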
The foregoing calculations to determine the distance and angle are preferably conducted periodically. For example, each iteration may be triggered after traveling a predetermined distance, such as one meter.
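A distance-triggered iteration of this kind might be sketched as follows, assuming traveled distance is integrated from vehicle speed; the class name, parameter names, and default interval are illustrative:

```python
class DistanceTrigger:
    """Fires once per `interval` meters of travel (e.g., one meter)."""

    def __init__(self, interval=1.0):
        self.interval = interval
        self.accumulated = 0.0

    def step(self, speed_mps, dt_s):
        # Integrate odometry; fire when a full interval has been covered
        self.accumulated += speed_mps * dt_s
        if self.accumulated >= self.interval:
            self.accumulated -= self.interval
            return True
        return False
```

At 10 m/s with a 50 ms update period, the trigger fires on every second update, i.e., once per meter.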
Whenever the actual lane border is actively being detected by the optical sensing system, an offset between the detected lane placement and the map-derived pathline can be determined as shown in
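Assuming dLANE (the vehicle-to-lane-border distance from the optical system) and dEH share the same sign convention, the learned offset and its later use to reconstruct a virtual lane border can be sketched as follows; the function names are hypothetical:

```python
def pathline_to_lane_offset(d_lane, d_eh):
    """Offset learned while vision is valid: how far the detected lane
    border sits from the map-derived pathline, as seen from the vehicle."""
    return d_lane - d_eh

def reconstruct_lane_distance(d_eh_now, stored_offset):
    """Virtual lane-border distance when optical detection is lost: the
    current pathline distance shifted by the previously learned offset."""
    return d_eh_now + stored_offset
```

Because the pathline runs parallel to the lanes, the offset should stay substantially constant while the vehicle remains in its lane, which is what makes the reconstruction valid.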
In
The variance is preferably calculated over a moving window of the difference between distances dEH and dLANE. Preferably, the measure may take the form of either a statistical variance σ² or a standard deviation σ. Confidence is high if the variance is below a respective threshold, and confidence is low if it is above the threshold. In one embodiment, a confidence level is tracked using a count that is accumulated over a series of the periodic measurements. The count tracks the consecutive number of samples in which the variance is below the threshold. Once the consecutive count reaches a predetermined number (i.e., a confidence threshold), the pathline can be reliably used to reconstruct a lane boundary in the event of loss of optical detection. Otherwise, no lane detection can be made.
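The moving-window variance and consecutive-count confidence logic described above might be sketched as follows; the window size and both thresholds are illustrative placeholders, not values taken from the specification:

```python
from collections import deque

class ConfidenceChecker:
    """Tracks whether the map pathline reliably parallels the optical lane.

    Keeps a moving window of (d_LANE - d_EH) differences; confidence is
    declared high once the sample variance has stayed below `var_threshold`
    for `count_threshold` consecutive updates.
    """

    def __init__(self, window=20, var_threshold=0.05, count_threshold=10):
        self.diffs = deque(maxlen=window)
        self.var_threshold = var_threshold
        self.count_threshold = count_threshold
        self.count = 0

    def update(self, d_lane, d_eh):
        self.diffs.append(d_lane - d_eh)
        mean = sum(self.diffs) / len(self.diffs)
        variance = sum((d - mean) ** 2 for d in self.diffs) / len(self.diffs)
        if variance < self.var_threshold:
            self.count += 1    # another consecutive in-threshold sample
        else:
            self.count = 0     # any excursion resets the streak
        return self.count >= self.count_threshold

    def reset(self):
        self.count = 0
        self.diffs.clear()
```

A single large excursion in the difference resets the streak to zero, so confidence is only reported after a sustained run of consistent samples.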
A virtual lane generator 48 receives the confidence level from confidence checker 47, the pathline and offset distance from calculator 45, and the first lane model from vision system 40. Based on the offset distance, virtual lane generator 48 shifts the pathline so that it coincides with the previously determined offset between the vehicle and the lane border from vision system 40.
The second lane model based on the tracked vehicle position and roadway data from the map database is provided from the memory of generator 48 to one input of lane selector 41. The confidence level is also provided to lane selector 41 so that when the first lane model from vision system 40 loses validity, selector 41 checks the confidence level and then outputs the virtually-generated lane of the second lane model only if confidence is high. Otherwise, it outputs no lane model at all. Any lane model data that is output from selector 41 is coupled to applications 50 such as a lane departure warning system.
The matching confidence for the map-derived pathline is checked and updated in step 68. For example, a confidence number may be maintained. If the matching conditions for the heading and offset distances are met as described above, then the confidence number is incremented; otherwise, it may be reset.
A check is made in step 70 to determine whether the vision-based lane information is available. If available, then the vision derived lane information is output in step 71 and a return is made to step 62 for the next iteration.
If vision information is not available in step 70, then a check is made in step 72 to determine whether the matching confidence is high. If not, then no lane information is output to the applications and a return is made to step 62. If matching confidence is high, then the virtual lane of the second lane model is generated in step 73. Optionally, a check is performed in step 74 to determine whether the host vehicle is still within the target roadway or lane. If not, then a return is made to step 62 for the next iteration without outputting a lane model (and the confidence number may preferably be reset). Otherwise, the virtual lane information of the second lane model based on the map derived information is output in step 75 before returning to step 62.
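The selection flow of steps 70 through 75 can be sketched as a single iteration of the loop; the names are hypothetical, and `None` stands in for "no lane model output":

```python
def select_lane_output(vision_lane, confidence_high, virtual_lane_fn, in_lane_fn):
    """One iteration of the output-selection flow (steps 70-75).

    vision_lane:     first lane model, or None when optical detection is lost
    confidence_high: result of the matching-confidence check (step 72)
    virtual_lane_fn: builds the second lane model from the pathline (step 73)
    in_lane_fn:      checks the host vehicle is still in the roadway (step 74)
    """
    if vision_lane is not None:    # steps 70-71: prefer the optical model
        return vision_lane
    if not confidence_high:        # step 72: map model not trustworthy
        return None
    virtual = virtual_lane_fn()    # step 73: reconstruct from the pathline
    if not in_lane_fn():           # step 74: vehicle left the tracked lane
        return None
    return virtual                 # step 75: output the map-derived model
```

The virtual lane is thus produced only in the narrow case where vision has dropped out, matching confidence is high, and the vehicle remains within the target roadway.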
As used herein, the matching confidence (i.e., similarity between the two lane models) can be characterized according to various alternative tests or thresholds. Since the heading derived from each model should be the same, the difference between the headings can be compared to a threshold. Since the lateral offset distance from the vehicle to the lane border and from the vehicle to the map pathline may often not be the same, but instead have a difference which should stay substantially constant, a variance or standard deviation exhibited by the difference over time is used in the illustrated embodiment.
Number | Date | Country
---|---|---
20140379164 A1 | Dec 2014 | US