Fully or highly automated driving systems, e.g., autonomous or self-driving systems, are designed to operate a vehicle on the road with little or no driver interaction or other external control. Autonomous driving systems require certainty in the position of, and distance to, geographic features surrounding the vehicle with a degree of accuracy sufficient to adequately control the vehicle. Details about the road and other geographic features surrounding the vehicle can be recorded on a detailed virtual map. The more accurate the detailed virtual map, the better the performance of the autonomous driving system. Existing virtual maps, however, do not include geographic feature details that are sufficient, or sufficiently accurate, for optimized autonomous operation.
The detailed map format described here can be used to represent the drivable area of a road, including the boundary locations of each lane, the exact width of each lane, and the locations of the impassable borders of a given lane, such as curbs, medians, and islands. The detailed map format can also include information to support driving rules associated with a given lane of the road, to calculate the distance from any object within the map format to the boundary of a lane, and to identify other map features intersecting a lane, such as crosswalks and driveways. The highly detailed nature of this map format allows for improved control of a highly automated or autonomous vehicle as well as for improved localization (exact positioning) of the autonomous vehicle with respect to the detailed map format.
Each lane within the detailed map format can include lane segments formed of waypoints. The detailed map format disclosed can also include border segments formed of borderpoints. Information associated with these border segments and borderpoints includes border type and border color. An autonomous vehicle can be controlled to operate according to driving rules based on a given border type and border color associated with the detailed map format. Border segments can also be used to determine the distance to an edge of a lane for a given lane segment or the width of the lane at any point along the lane segment, providing for more accurate control of the autonomous vehicle than is possible using lane segments formed of waypoints alone.
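As a rough illustration of how such a format might be organized in software, the following Python sketch models lane segments of waypoints and border segments of borderpoints as just described. The class and field names (Waypoint, Borderpoint, heading_deg, speed_limit_mps, and so on) are hypothetical placeholders, not part of the disclosed format.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    lat: float              # geographic location of the lane center
    lon: float
    heading_deg: float      # travel direction of the lane at this point
    speed_limit_mps: float  # speed limit associated with the lane

@dataclass
class Borderpoint:
    lat: float
    lon: float
    border_type: str        # e.g., "curb", "single solid line"
    border_color: str       # e.g., "yellow", "white", "unknown"

@dataclass
class LaneSegment:
    waypoints: list[Waypoint] = field(default_factory=list)

@dataclass
class BorderSegment:
    borderpoints: list[Borderpoint] = field(default_factory=list)
```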
In one respect, the subject matter described herein is directed to a method of operating an autonomous vehicle using a computer-readable map format. The method can include determining a current location of the autonomous vehicle. The method can also include comparing the current location of the autonomous vehicle to a computer-readable map format. The map format can include a lane segment and a border segment. The method can further include determining a distance between the current location of the autonomous vehicle and an edge of the lane segment at a location along the lane segment. The determining of the distance can include measuring a distance between the current location of the autonomous vehicle and a portion of the border segment closest to the current location of the autonomous vehicle. The method can include determining a driving maneuver based on the determined distance. The method can include causing one or more vehicle systems of the autonomous vehicle to implement the determined driving maneuver.
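The maneuver-selection step of such a method could be as simple as a threshold on the measured clearance. The sketch below is a minimal, hypothetical illustration of that step only; the function name, the threshold value, and the maneuver labels are assumptions, not part of the claimed method.

```python
def plan_maneuver(distance_to_border_m: float, min_clearance_m: float = 0.5) -> str:
    """Pick a driving maneuver from the measured distance to the
    nearest border segment (hypothetical rule and labels)."""
    if distance_to_border_m < min_clearance_m:
        return "steer away from border"   # clearance too small: correct course
    return "hold lane"                    # adequate clearance: maintain position
```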
In another respect, the subject matter described herein is directed to an autonomous vehicle system. The system can include a processing unit. The system can also include a memory communicatively coupled to the processing unit. The memory can include a computer-readable map format. The map format can include a lane segment and a border segment. The system can include one or more vehicle systems communicatively coupled to the processing unit. The processing unit can be configured to compare a current location of an autonomous vehicle to the map format. The processing unit can be configured to determine a distance between the current location of the autonomous vehicle and an edge of the lane segment at a location along the lane segment. Such determining can include measuring a distance between the current location of the autonomous vehicle and a portion of the border segment closest to the current location of the autonomous vehicle. The processing unit can be configured to determine a driving maneuver based on the determined distance. The processing unit can be configured to cause one or more of the vehicle systems to implement the determined driving maneuver.
The description herein makes reference to the accompanying drawings, wherein like reference numerals refer to like parts throughout the several views.
A computer-readable, highly detailed map format for an autonomous vehicle is disclosed. The detailed map format includes information representing the geographical location, travel direction, and speed limit of lanes on a given road using lane segments formed of waypoints. Beyond this basic information, the detailed map format also includes the geographical location for the borders of each lane in the form of border segments formed of borderpoints. Information associated with the border segments and borderpoints within the detailed map format can include the border type and border color, such that driving rules can be associated with the lane segments based on the closest border segments. The detailed map format can also include stop lines linked to the ends of lanes at traffic intersections to better position the autonomous vehicle for entry into a traffic intersection and to indicate where the autonomous vehicle should stop at the traffic intersection. Crosswalks can also be included in the detailed map format and associated with safety rules to be followed when the autonomous vehicle approaches the crosswalk.
The memory 104 can also include an operating system 110 and installed applications 112, the installed applications 112 including programs that permit the CPU 102 to perform automated driving methods using the detailed map format described below. The computing device 100 can also include secondary, additional, or external storage 114, for example, a memory card, flash drive, or any other form of computer readable medium. The installed applications 112 can be stored in whole or in part in the external storage 114 and loaded into the memory 104 as needed for processing.
The computing device 100 can also be in communication with one or more sensors 116. The sensors 116 can capture data and/or signals for processing by an inertial measurement unit (IMU), a dead-reckoning system, a global navigation satellite system (GNSS), a light detection and ranging (LIDAR) system, a radar system, a sonar system, an image-based sensor system, or any other type of system capable of capturing information specific to the environment surrounding a vehicle, and can output corresponding data and/or signals to the CPU 102. This information can include information specific to objects such as features of the route being travelled by the vehicle, as well as other localized position data and/or signals, for use in creating the detailed map format described below.
In the examples described below, the sensors 116 can capture, at least, signals for a GNSS or other system that determines vehicle position and velocity, and data for a LIDAR system or other system that measures vehicle distance from lane lines (e.g., route surface markings or route boundaries), obstacles, objects, or other environmental features including traffic lights and road signs. The computing device 100 can also be in communication with one or more vehicle systems 118, such as vehicle braking systems, vehicle propulsion systems, etc. The vehicle systems 118 can also be in communication with the sensors 116, the sensors 116 being configured to capture data indicative of performance of the vehicle systems 118.
The vehicle 200 can also include a plurality of sensors, such as the sensors 116 described above in reference to the computing device 100.
Map formats can be constructed using geographic features captured by the vehicle 200, such as lane lines and curbs proximate the vehicle 200, as it travels a route. These geographic features can be captured using the above-described LIDAR system and/or cameras in combination with an algorithm such as random sample consensus (RANSAC) to find lines, record the position of the vehicle 200, and collect data on position from a GNSS and/or an IMU. The captured geographic features can then be manipulated using a simultaneous localization and mapping (SLAM) technique to position all of the geographic features in relation to the position of the vehicle 200. Some of the geographic features can be categorized as lane borders, and lane centers can be determined based on the lane borders. Alternatively, map formats can be constructed by tracing overhead images (e.g., satellite images) of geographic features in a map editor that allows selection of different categories for each geographic feature.
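As a sketch of the line-finding step, the toy RANSAC routine below repeatedly fits a candidate line through two random points and keeps the candidate supported by the most inliers. The iteration count and tolerance are illustrative assumptions, not values specified by the disclosure.

```python
import math
import random

def ransac_line(points, iters=200, tol=0.1):
    """Toy RANSAC line finder for 2-D points (e.g., LIDAR returns from
    a lane line); returns the inlier set of the best candidate line."""
    best_inliers = []
    for _ in range(iters):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        # Candidate line through the two samples: a*x + b*y + c = 0.
        a, b = y2 - y1, x1 - x2
        c = -(a * x1 + b * y1)
        norm = math.hypot(a, b)
        if norm == 0.0:
            continue  # the two samples coincide; skip this candidate
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c) / norm < tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers
```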
In the example map format shown in the figures, two lanes 302, 304 of a road are represented by lane segments 306, 308 formed of waypoints 310, 312, 314, 316. Information associated with each waypoint 310, 312, 314, 316 can include the geographical location, travel direction, and speed limit of the corresponding lane 302, 304.
Additional detail can be added to the map format in order to improve its use with the autonomous vehicle 200. As shown in the figures, the borders of the lanes 302, 304 can be represented by border segments 318, 320, 322, each formed of borderpoints, such as the borderpoints 324, 326, 328, 330, 332, 334.
The information associated with each borderpoint 324, 326, 328, 330, 332, 334 can be used by the autonomous vehicle 200 in order to determine navigation routes, make decisions regarding passing other vehicles, position or localize the autonomous vehicle 200 with respect to the border segments 318, 320, 322, and determine the drivable area along a given navigation route in order to support safety maneuvers or obstacle tracking. The information associated with each borderpoint 324, 326, 328, 330, 332, 334 can, for example, be built from data collected using a LIDAR sensor and manipulated using a SLAM technique when building the detailed map as described above. The map information associated with the borders and lanes 302, 304 can be stored, for example, in the form of spline points or as curves with knot vectors in the memory 104 of the computing device 100, or can be available from a remote location.
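For the "curves with knot vectors" storage option, a border polyline can be fit and stored as a parametric B-spline, whose knot vector and coefficients compactly encode the curve. The sketch below uses SciPy for illustration; the choice of SciPy and the sample coordinates are assumptions, not part of the disclosed format.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical borderpoint coordinates captured along a lane border.
x = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
y = np.array([0.0, 0.2, 0.1, -0.1, 0.0])

# Fit a parametric B-spline; tck bundles the knot vector, spline
# coefficients, and degree -- the "curves with knot vectors" form.
tck, u = splprep([x, y], s=0.0)

# Evaluate the stored curve wherever needed to recover border geometry.
xs, ys = splev(np.linspace(0.0, 1.0, 50), tck)
```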
Examples of different border types that can be associated with the borderpoints 324, 326, 328, 330, 332, 334 can include a “curb,” a “single solid line,” a “double solid line,” a “single dashed line,” a “combined dashed line and solid line,” and “no line.” For example, the borderpoints 324, 326 and hence the border segment 318 extending between them can be associated with a “single solid line” border type. To represent the “single solid line” border type within the map format, borderpoints 324, 326 are shown with an open circle representation and the border segment 318 is shown using a thin, solid line representation. Similarly, the border segment 320 extending between the borderpoints 328, 330 can be associated with a “double solid line” border type. The partially-shown border segments 336, 338 can be associated with a “combined dashed line and solid line” border type. In addition to border type, border color can also be associated with the borderpoints 324, 326, 328, 330, 332, 334. For example, border colors can include “yellow,” “white,” or “unknown.”
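In code, these categories could be captured as simple enumerations. This sketch lists exactly the types and colors named above; the identifier names are hypothetical.

```python
from enum import Enum

class BorderType(Enum):
    CURB = "curb"
    SINGLE_SOLID = "single solid line"
    DOUBLE_SOLID = "double solid line"
    SINGLE_DASHED = "single dashed line"
    DASHED_AND_SOLID = "combined dashed line and solid line"
    NO_LINE = "no line"

class BorderColor(Enum):
    YELLOW = "yellow"
    WHITE = "white"
    UNKNOWN = "unknown"
```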
Border types and border colors can be used to associate a driving rule with each of the various lane segments 306, 308 (and/or with the waypoints 310, 312, 314, 316 forming the lane segments 306, 308). A driving rule can be based at least in part on the border type and the border color associated with the borderpoints 324, 326, 328, 330, 332, 334 and the border segments 318, 320, 322 closest to the lane segment 306, 308. For example, two driving rules can be associated with the lane segment 306: first, a driving rule of “no passing border” based on the border segment 320 extending between the borderpoints 328, 330, given that the border segment 320 can be associated with a border type of “double solid line” and a border color of “yellow;” second, a driving rule of “drivable lane border” based on the border segment 318 extending between the borderpoints 324, 326, given that the border segment 318 can be associated with a border type of “single solid line” and a border color of “white.” Though the border types, border colors, and driving rules described here are provided as examples, other border types, border colors, and driving rules are possible.
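A driving-rule association can then be a lookup keyed on border type and color. The table below encodes only the two examples just given, plus the curb case described later; the fallback to "impassable" for unmapped combinations is a hypothetical conservative policy, not one specified by the disclosure.

```python
DRIVING_RULES = {
    ("double solid line", "yellow"): "no passing border",
    ("single solid line", "white"): "drivable lane border",
    ("curb", "unknown"): "impassable",
}

def driving_rule(border_type: str, border_color: str) -> str:
    # Unmapped combinations fall back to the most restrictive rule
    # (a hypothetical default policy).
    return DRIVING_RULES.get((border_type, border_color), "impassable")
```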
Another benefit of storing information for both lane segments 306, 308 and border segments 318, 320, 322 in the map format is that the distance to an edge of the lane segment 306, 308 can be determined at any location along the lane segment 306, 308 by measuring a distance between the location and a portion of the border segment 318, 320, 322 closest to the location. This allows the autonomous vehicle 200 to be positioned within, for example, either of the lanes 302, 304 at an optimum spacing based on the actual geographical location, border color, and border type of the border segments 318, 320, 322, instead of relying on fixed lane widths associated only with the waypoints 310, 312, 314, 316. Knowing the actual distance to an edge of the lane segment 306, 308 leads to greater maneuverability of the autonomous vehicle 200. Further, the ability to localize the autonomous vehicle 200 is improved because the border segments 318, 320, 322 as stored within the detailed map format can be matched to images of lane borders or other geographic features captured during autonomous operation of the vehicle 200.
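Measuring the distance from a location to the closest portion of a border segment is a point-to-polyline computation. A minimal sketch, assuming borderpoints are given as planar (x, y) tuples:

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b, all 2-D (x, y) tuples."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        return math.hypot(px - ax, py - ay)  # degenerate segment
    # Project p onto the segment, clamping to the endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def distance_to_border(position, borderpoints):
    """Smallest distance from a location to the border polyline formed
    by consecutive borderpoints."""
    return min(point_segment_distance(position, borderpoints[i], borderpoints[i + 1])
               for i in range(len(borderpoints) - 1))
```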
In some examples, border segments 318, 320, 322 can be positioned both proximate to and on opposite sides of a given lane segment 306, 308. In these cases, a lane width of the lane segment 306, 308 can be determined at a chosen location along the lane segment 306, 308 by measuring the distance between the two border segments 318, 320, 322 positioned proximate to and on opposite sides of the lane segment 306, 308. For example, the lane width for lane 302 can be calculated anywhere along lane segment 306 by measuring the distance between border segments 318, 320 within the map format. Again, knowing the actual lane width at any point along the lane 302, 304 is beneficial both for overall positioning and maneuvering of the autonomous vehicle 200. The positioning benefit is further described below.
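Building on the distance_to_border sketch above, the lane width at a location along a lane segment can be approximated as the sum of the clearances to the two opposing borders; this is exact when the location lies on the common perpendicular between the borders, and an assumption otherwise.

```python
def lane_width_at(position, left_border, right_border):
    # Approximate width at this location as the combined clearance to
    # the border polylines on opposite sides of the lane segment
    # (distance_to_border is defined in the previous sketch).
    return (distance_to_border(position, left_border)
            + distance_to_border(position, right_border))
```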
Similar information as described above can be associated with a map format representing a more complex road 400 that includes a traffic intersection. The road 400 can include lanes, such as the lanes 402, 404, 406, represented by lane segments formed of waypoints, such as the waypoint 424, and bordered by border segments, such as the border segments 442, 448, formed of borderpoints, such as the borderpoints 444, 446, 450, 452. The map format for the road 400 can also include stop lines 454, 456 linked to the ends of lanes at the traffic intersection.
For example, the border segments 442, 448 can be associated with the “curb” border type, which is represented within the detailed map format using a dotted, hashed line type. The borderpoints 444, 446, 452 are represented using filled circles, and together, the borderpoints 444, 446, 452 and border segments 442, 448 indicate an “impassable” driving rule. When the driving rule indicated is “impassable,” the autonomous vehicle 200 is controlled in a manner such that the vehicle 200 will not attempt to navigate beyond the border. The borderpoint 450 is represented using a half-filled circle, indicating a transition in the border type from a “solid line” type to a “curb” type at the location of the borderpoint 450.
Understanding the location of a transition between border types is important for autonomous control, as the vehicle 200 is physically able, if necessary, to navigate along or across a “line” type border, but is not able to safely navigate along or across a “curb” type border. The benefit of using border segments is clear in this example. A median 464 is shown as present between lane 404 and lane 406. The left-most part of the median 464 is bordered by border segments of solid lines associated with a “drivable lane border” driving rule and a “line” type border while the right-most part of the median 464 is bordered by border segments of dotted, hashed lines associated with an “impassable” driving rule and a “curb” type border. If necessary, the vehicle 200 could navigate across only the left-most part of the median 464.
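A control layer might encode this as a per-type crossability flag together with a scan for the borderpoints where the type changes. The dictionary below is a hypothetical sketch reflecting the rule just stated (a "line" type border is physically crossable if necessary, a "curb" is not); it is not a rule set given by the disclosure.

```python
CROSSABLE = {
    "curb": False,               # "impassable": never navigate across
    "single solid line": True,   # physically crossable if necessary
    "double solid line": True,
    "single dashed line": True,
}

def type_transitions(border_types):
    """Indices where the border type changes along a border, e.g., a
    'solid line' -> 'curb' transition like the borderpoint 450."""
    return [i for i in range(1, len(border_types))
            if border_types[i] != border_types[i - 1]]
```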
The additional information provided by the stop lines 454, 456 is useful in operation of the autonomous vehicle 200 because the stop lines 454, 456 allow the autonomous vehicle 200 to be positioned at the traffic intersection in a manner consistent with manual operation of a vehicle. For example, if the autonomous vehicle 200 approaches the traffic intersection within lane 402, instead of stopping at the waypoint 424 denoting the end of the lane segment 412, the autonomous vehicle 200 can be controlled to move forward to the stop line 456 and slightly around the corner of the lane 402 as denoted by the border segment 442. This maneuver is more consistent with how a driver would manually operate a vehicle on the road 400 when making a right turn at a traffic intersection. Though not shown, crosswalks can also be included in the detailed map format in a manner similar to that used for the stop lines 454, 456. Information associated with the crosswalks can include a geographical location of a position of the crosswalk and a driving rule associated with the crosswalk that directs the automated vehicle system to implement additional safety protocols.
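Stop lines and crosswalks can be carried in the same map format as small geometric features tagged with rules. The field names and the default rule string below are hypothetical placeholders for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class StopLine:
    lat: float
    lon: float
    lane_id: int    # lane whose intersection end this stop line is linked to

@dataclass
class Crosswalk:
    outline: list = field(default_factory=list)  # geographic boundary points
    driving_rule: str = "apply additional safety protocols"
```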
Traffic signals are another feature that can be present within the map format of the road 400.
The foregoing description relates to what are presently considered to be the most practical embodiments. It is to be understood, however, that the disclosure is not to be limited to these embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. The scope of the claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.
This application is a continuation of U.S. application Ser. No. 14/265,370, filed Apr. 30, 2014, which is hereby incorporated by reference in its entirety.