The present disclosure relates generally to vision-based lane sensing, and more particularly, to utilizing map software to assist in vision-based lane sensing.
Vision-based lane sensing (LS) systems detect roadway lane markings and can utilize this information for lane departure warning (LDW), road departure warning, and lane keeping (LK), as well as for other purposes (e.g., road geometry prediction). In general, the LS algorithms utilize information from both right and left lane markings to inform the driver of an inadvertent lane deviation, or to steer or keep the vehicle within the lane using, for example, electric power steering (EPS) or active front steering (AFS).
In most situations, the LS system utilizes the contrast between the lane marking and the pavement to detect the markings. For example, a bright white lane marking on black tar pavement can be detected by the image processor without much difficulty. As this contrast deteriorates, so does the lane sensing performance. In this regard, it is more difficult for the image processor to detect yellow lane markings in a grayscale image (and, to a lesser degree, in a color image) because of the lower intensities that they generate. At the same time, there is an abundance of yellow lane markings on roadways in the United States. For example, as shown in
According to one aspect of the invention, a method is provided for map-aided vision-based lane sensing. The method includes receiving map information corresponding to a current geographic position of a vehicle on a roadway. The map information includes the number of lanes on the roadway. Information about the number of lanes crossed by the vehicle on the roadway is received from a vision system. It is determined which of the lanes on the roadway is currently occupied by the vehicle based on the map information and the number of lanes crossed by the vehicle on the roadway.
In another aspect of the invention, a system is provided for map-aided vision-based lane sensing. The system includes an input device for receiving map information corresponding to a current geographic position of a vehicle on a roadway. The map information includes the number of lanes on the roadway. The input device also receives information about the number of lanes crossed by the vehicle on the roadway from the vision system. The system further includes a processor in communication with the input device. The processor includes instructions for facilitating determining which of the lanes on the roadway is currently occupied by the vehicle. The determining is based on the map information and the number of lanes crossed by the vehicle on the roadway.
In a further aspect of the invention, a computer program product is provided for map-aided vision-based lane sensing. The computer program product includes a storage medium readable by a processing circuit and storing instructions for execution by the processing circuit for performing a method. The method includes receiving map information corresponding to a current geographic position of a vehicle on a roadway. The map information includes the number of lanes on the roadway. Information about the number of lanes crossed by the vehicle on the roadway is received from a vision system. It is determined which of the lanes on the roadway is currently occupied by the vehicle based on the map information and the number of lanes crossed by the vehicle on the roadway.
Referring now to the figures, which are meant to be exemplary embodiments, and wherein like elements are numbered alike:
To determine whether a lane marker is yellow, exemplary embodiments of the present invention utilize digital road maps, where such road attributes are available and can be provided to the lane sensing (LS) processor in advance. The image processor in the LS processor can utilize this information to adjust the algorithms and/or filters that it utilizes to detect lane markers. For example, the image processor could react by using a proper yellow filter (a typical yellow filter is αR + βG − γB, with typical values of α, β > 0.5 and γ < 0.2) to enhance the detectability of yellow markers without sacrificing the detectability of white markers. In addition, knowing in advance that a lane marker is white reduces the computational effort required to detect it.
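As a concrete illustration (not part of the disclosure), a minimal sketch of such a yellow filter is shown below, assuming an RGB camera frame held in a NumPy array; the function name and coefficient defaults are hypothetical, chosen within the stated typical ranges.

```python
import numpy as np

def yellow_response(rgb, alpha=0.6, beta=0.6, gamma=0.1):
    """Combine RGB channels into a single response that emphasizes yellow
    lane markers (strong red and green, weak blue): alpha*R + beta*G - gamma*B.

    Coefficient defaults are illustrative only, within the typical ranges
    alpha, beta > 0.5 and gamma < 0.2 noted above.
    """
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    b = rgb[..., 2].astype(np.float32)
    response = alpha * r + beta * g - gamma * b
    # Clip to the usual 8-bit intensity range before thresholding or edge detection.
    return np.clip(response, 0, 255).astype(np.uint8)
```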
On a freeway where the yellow lane marker is on the left side of the road (the leftmost lane), it is useful to know the number of lanes and the lane where the vehicle is traveling. This knowledge provides the image processor with information about the color of the lane markers that is useful for lane sensing purposes. The combination of global positioning system (GPS) coordinates, map software and a vision system can be utilized to identify the lane where the vehicle is traveling. The number of lanes in the roadway is provided by the map database. As the vehicle enters a road, the vision system can detect the lane markers and keep track of them as the vehicle changes lanes. In this manner, the lane determination software can determine which lane (e.g., lane number) the vehicle is traveling in.
The vision-based LS system 310 depicted in
The GPS 306, depicted in
In exemplary embodiments, the lane determination module 308 is implemented by software instructions located on a processor (e.g., a microprocessor) within the vehicle 302. In alternate exemplary embodiments, the lane determination module 308 is implemented by hardware and/or software. The input device in the lane determination module 308 may be implemented by any mechanism for receiving information into the lane determination module 308 from the map software 304 and the vision-based LS system 310. The input device may receive data via a network that is internal and/or external to the vehicle 302. In exemplary embodiments, one or more of the elements depicted in
In exemplary embodiments, the information provided by the map database 304 includes the number of lanes on the roadway. In alternate exemplary embodiments, the map information from the map software 304 also includes a roadway type (e.g., highway, rural road), a lane marker type (e.g., reflector, painted) and/or an entrance point (e.g., left or right side) onto the roadway. In further alternate exemplary embodiments, depending on the information available from the map software 304, the lane marker type is more detailed and includes, for example, attributes (e.g., color, dotted/solid) of each of the lane markers.
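For illustration only, such map attributes could be grouped in a simple record like the following sketch; the field names are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MapRoadInfo:
    """Hypothetical container for the map attributes described above."""
    num_lanes: int                           # number of lanes on the roadway
    roadway_type: Optional[str] = None       # e.g., "highway", "rural road"
    marker_type: Optional[str] = None        # e.g., "reflector", "painted"
    entrance_side: Optional[str] = None      # e.g., "left" or "right"
    marker_attributes: List[dict] = field(default_factory=list)  # per-marker color, dotted/solid
```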
At block 404, the LS system 310 is utilized to detect the lane markers and to keep track of the number and direction of vehicle lane changes. The information from the LS system 310 is received at the lane determination module 308.
At block 406, the lane currently occupied by the vehicle is determined. In exemplary embodiments, this is determined based on the data received from the LS system 310 that indicates the number of lanes on the roadway that have been crossed by the vehicle 302 and the directions of those crossings. This number is compared to the number of lanes on the roadway. Based on this comparison, the lane currently occupied by the vehicle 302 is determined. Once the lane currently occupied is determined, the LS system 310 may send an update to the lane determination module 308 when the vehicle 302 has changed lanes. Further, the LS system 310 may indicate to the lane determination module 308 whether the change was in the right direction or the left direction.
In exemplary embodiments, the LS system 310 keeps track of the relative lane number from the point of the vehicle 302 entrance onto the roadway. For example, on a four-lane road, a vehicle 302 enters in the rightmost lane, crosses two lanes to the left, then one lane to the right, and then two lanes to the left again; the vehicle 302 is now in the leftmost lane.
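A minimal sketch of this bookkeeping is shown below, assuming the vision system reports each lane change as +1 (one lane to the left) or -1 (one lane to the right); the function and its signature are hypothetical, not taken from the disclosure.

```python
def current_lane(entry_lane, lane_changes, num_lanes):
    """Track the lane number (1 = rightmost, num_lanes = leftmost) relative to
    the entry point on the roadway, given signed lane changes reported by the
    vision system: +1 for one lane to the left, -1 for one lane to the right.
    """
    lane = entry_lane
    for change in lane_changes:
        # Clamp to a valid lane number in case of a missed or spurious detection.
        lane = min(max(lane + change, 1), num_lanes)
    return lane

# Four-lane road example from the text: enter in the rightmost lane, cross two
# lanes left, one lane right, then two lanes left -> now in the leftmost lane.
assert current_lane(1, [+1, +1, -1, +1, +1], 4) == 4
```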
Emphasis has been placed on the markers of the leftmost and rightmost lanes; however, it should be noted that the other lane markers on the roadway are always white and mostly dashed (they can also be solid or reflectors). This information can also be used by the vision system processor to enhance robustness.
At block 508, the attributes of the lane markers (if known) are transmitted to the LS system 310. The LS system 310 utilizes the attributes (e.g., color) to select filters for use in detecting the lane markers. If the attribute specifies a color of white, a filter (or filters) optimized for identifying white lane markers will be utilized by the LS system 310 to detect the lane marker. If the attribute specifies a color of yellow, a filter (or filters) optimized for detecting yellow lane markers will be utilized by the LS system 310 to detect the lane marker. Alternatively, or in addition to using different filter types in response to attributes, the LS system 310 may utilize different types of algorithms in response to different attributes. In general, the algorithms required for detecting white lines are less complex than those required for detecting yellow lines. Thus, by knowing attributes about lane markers, such as the color, the LS system 310 may utilize the most efficient algorithms for detecting lane markers.
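One way to sketch this attribute-driven selection, assuming hypothetical filter functions passed in by the caller (the names and signature below are not from the disclosure), is:

```python
from typing import Callable, Dict
import numpy as np

def select_marker_filter(
    attributes: Dict[str, str],
    white_filter: Callable[[np.ndarray], np.ndarray],
    yellow_filter: Callable[[np.ndarray], np.ndarray],
) -> Callable[[np.ndarray], np.ndarray]:
    """Pick a detection filter based on the map-provided marker color."""
    color = attributes.get("color")
    if color == "white":
        return white_filter   # cheaper path: white-on-pavement contrast is strong
    if color == "yellow":
        return yellow_filter  # e.g., the alpha*R + beta*G - gamma*B response above
    # Unknown color: run both filters and keep the stronger response per pixel.
    return lambda frame: np.maximum(white_filter(frame), yellow_filter(frame))
```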
In addition, the lane determination module 308 may use the lane currently occupied by the vehicle 302 to identify the lane position relative to the roadway (e.g., rightmost, leftmost, center). The lane position on the roadway is transmitted to the LS system 310. In exemplary embodiments, the LS system 310 utilizes the lane position on the roadway to determine what type of warning (or message) to transmit to the operator of the vehicle 302. For example, the LS system 310 may provide stronger warnings to a driver of the vehicle 302 if the vehicle is crossing the center line of a divided roadway or driving off of the roadway. The LS system 310 may issue different kinds of warnings to a driver of the vehicle 302 depending on whether the vehicle 302 is going off the roadway (e.g., the vehicle 302 is in the rightmost lane and crossing the right lane marker) or whether the vehicle 302 is moving into a different lane on the roadway.
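A sketch of such a position-dependent warning policy is shown below; the severity categories and the crossing-direction convention are assumptions for illustration, not specified in the disclosure.

```python
def warning_level(lane_position: str, crossing_direction: str) -> str:
    """Choose a warning severity from the lane position on the roadway
    ("leftmost", "rightmost", "center") and the direction of the detected
    lane crossing ("left" or "right").
    """
    if lane_position == "leftmost" and crossing_direction == "left":
        return "strong"    # e.g., crossing toward the center line of a divided roadway
    if lane_position == "rightmost" and crossing_direction == "right":
        return "strong"    # e.g., departing the roadway on the right
    return "standard"      # ordinary lane change within the roadway
```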
Alternate exemplary embodiments support the detection of Bots Dots (a special type of reflector) used to mark lanes on a roadway. Currently, many roadways in southern states (e.g., California, Texas) have Bots Dots as the sole lane markers, without any painted lane markings. The image processing algorithms required to detect Bots Dots, or reflectors in general, are very different from the algorithms used to detect painted lane markings. Currently, the lane sensing image processor typically runs both algorithms simultaneously and has to decide which algorithm is the appropriate one to select. This imposes an extra computational burden and, at times, leads to inaccurate selection of an algorithm. In such driving scenarios, the information about Bots Dots (or reflectors) can be made available as part of the digital map attributes. This information may be provided to the image processor (e.g., as attributes of the lane markers as described above in reference to
Exemplary embodiments of the present invention may be utilized to fuse data from a digital map database with an image processor on a LS system to enhance the robustness of the LS system and to reduce the computational burden of the LS system. The ability to detect road departures as well as lane departures may lead to the enhanced robustness of the LS system. The ability to identify attributes of a lane marker (e.g., color, type) allows the LS system to utilize filters and algorithms that will detect the lane markers in a more efficient manner and reduce computational burden.
As described above, the embodiments of the invention may be embodied in the form of hardware, software, firmware, or any processes and/or apparatuses for practicing the embodiments. Embodiments of the invention may also be embodied in the form of computer program code containing instructions embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention. The present invention can also be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention. When implemented on a general-purpose microprocessor, the computer program code segments configure the microprocessor to create specific logic circuits.
While the invention has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope of the appended claims. Moreover, the use of the terms first, second, etc. do not denote any order or importance, but rather the terms first, second, etc. are used to distinguish one element from another.