Change detection using curve alignment

Information

  • Patent Grant
  • Patent Number
    11,327,493
  • Date Filed
    Wednesday, March 11, 2020
  • Date Issued
    Tuesday, May 10, 2022
Abstract
Aspects of the disclosure relate to determining whether a feature of map information has changed. For example, data identifying an object detected in a vehicle's environment and including location coordinates is received. This information is used to identify a corresponding feature from pre-stored map information based on a map location of the corresponding feature. The corresponding feature is defined as a curve and associated with a tag identifying a type of the corresponding object. A tolerance constraint is identified based on the tag. The curve is divided into two or more line segments. Each line segment has a first position. The first position of a line segment is changed in order to determine a second position based on the location coordinates and the tolerance constraint. A value is determined based on a comparison of the first position to the second position. This value indicates a likelihood that the corresponding feature has changed.
Description
BACKGROUND

Autonomous vehicles, such as vehicles which do not require a human driver when operating in an autonomous driving mode, may be used to aid in the transport of passengers or items from one location to another. An important component of an autonomous vehicle is the perception system, which allows the vehicle to perceive and interpret its surroundings using cameras, radar, sensors, and other similar devices. The perception system executes numerous decisions while the autonomous vehicle is in motion, such as speeding up, slowing down, stopping, turning, etc. Autonomous vehicles may also use cameras, sensors, and global positioning devices to gather and interpret images and sensor data about their surrounding environment, e.g., parked cars, trees, buildings, etc.


Information from the perception system may be combined with highly detailed map information in order to allow a vehicle's computer to safely maneuver the vehicle in various environments. This highly detailed map information may describe expected conditions of the vehicle's environment such as the shape and location of roads, traffic signals, and other objects. In this regard, the information from the perception system and detailed map information may be used to assist a vehicle's computer in making driving decisions involving intersections and traffic signals.


BRIEF SUMMARY

One aspect of the disclosure provides a computer-implemented method. The method includes receiving, by one or more computing devices, data identifying an object detected in a vehicle's environment. The data includes location coordinates for the object. The method also includes identifying, by the one or more computing devices, a corresponding feature from pre-stored map information based on the location coordinates and a map location of the corresponding feature. The corresponding feature is defined as a curve and associated with a tag identifying a type of the corresponding object. The method includes identifying, by the one or more computing devices, a tolerance constraint based on the tag identifying the type of the corresponding object and dividing, by the one or more computing devices, the curve into two or more line segments. Each line segment of the two or more line segments has a first position. The method includes changing, by the one or more computing devices, the first position of one of the two or more line segments to determine a second position of the one of the two or more line segments based on the location coordinates and the tolerance constraint. Changing the first position includes at least one of shifting or rotating the first position of the one of the two or more line segments. The method also includes determining, by the one or more computing devices, a value based on a comparison of the first position to the second position. The value indicates a likelihood that the corresponding feature has changed.


In one example, the corresponding feature is identified based on whether a distance between the location coordinates and the map location satisfies a threshold. In another example, the tolerance constraint limits at least one of the shifting or rotating of the one of the two or more line segments. In another example, the method also includes identifying a second tolerance constraint based on the tag identifying the type of the corresponding object, and changing the first position is further based on the second tolerance constraint, and the second tolerance constraint prohibits at least one of the shifting or rotating of the first position of the one of the two or more line segments. In another example, changing the first position includes both shifting and rotating the first position of the one of the two or more line segments. In another example, the method also includes comparing the value to a threshold value to determine whether the corresponding feature no longer exists. In another example, the method also includes comparing the value to a threshold value to determine whether the corresponding feature has been shifted. In another example, the method also includes, before the dividing, determining whether the detected object is used to define a driving lane based on the type of the corresponding object. In another example, the method also includes using, by the one or more processors, the value to maneuver the vehicle.


A further aspect of the disclosure provides a system including one or more computing devices. The one or more computing devices are configured to receive data identifying an object detected in a vehicle's environment. The data includes location coordinates for the object. The one or more computing devices are also configured to identify a corresponding feature from pre-stored map information based on the location coordinates and a map location of the corresponding feature. The corresponding feature is defined as a curve and associated with a tag identifying a type of the corresponding object. The one or more computing devices are further configured to identify a tolerance constraint based on the tag identifying the type of the corresponding object and divide the curve into two or more line segments. Each line segment of the two or more line segments has a first position. The one or more computing devices are also configured to change the first position of one of the two or more line segments to determine a second position of the one of the two or more line segments based on the location coordinates and the tolerance constraint. Changing the first position includes at least one of shifting or rotating the first position of the one of the two or more line segments. The one or more computing devices are configured to determine a value based on a comparison of the first position to the second position. The value indicates a likelihood that the corresponding feature has changed.


In one example, the corresponding feature is identified based on whether a distance between the location coordinates and the map location satisfies a threshold. In another example, the tolerance constraint limits at least one of the shifting or rotating of the one of the two or more line segments. In another example, the one or more computing devices are also configured to identify a second tolerance constraint based on the tag identifying the type of the corresponding object, and to change the first position further based on the second tolerance constraint, and the second tolerance constraint prohibits at least one of the shifting or rotating of the first position of the one of the two or more line segments. In another example, the one or more computing devices are also configured to change the first position by both shifting and rotating the first position of the one of the two or more line segments. In another example, the one or more computing devices are further configured to compare the value to a threshold value to determine whether the corresponding feature no longer exists. In another example, the one or more computing devices are further configured to compare the value to a threshold value to determine whether the corresponding feature has been shifted. In another example, the one or more computing devices are also configured to, before the dividing, determine whether the detected object is used to define a driving lane based on the type of the corresponding object. In another example, the system also includes the vehicle, and the one or more computing devices are further configured to maneuver the vehicle based on the value.


A further aspect of the disclosure provides a non-transitory, tangible computer readable medium on which instructions are stored. The instructions, when executed by one or more processors, cause the one or more processors to perform a method. The method includes receiving data identifying an object detected in a vehicle's environment. The data includes location coordinates for the object. The method also includes identifying a corresponding feature from pre-stored map information based on the location coordinates and a map location of the corresponding feature. The corresponding feature is defined as a curve and associated with a tag identifying a type of the corresponding object. The method includes identifying a tolerance constraint based on the tag identifying the type of the corresponding object and dividing the curve into two or more line segments. Each line segment of the two or more line segments has a first position. The method includes changing the first position of one of the two or more line segments to determine a second position of the one of the two or more line segments based on the location coordinates and the tolerance constraint. Changing the first position includes at least one of shifting or rotating the first position of the one of the two or more line segments. The method also includes determining a value based on a comparison of the first position to the second position. The value indicates a likelihood that the corresponding feature has changed.


In one example, the corresponding feature is identified based on whether a distance between the location coordinates and the map location satisfies a threshold. In another example, the tolerance constraint limits at least one of the shifting or rotating of the one of the two or more line segments. In another example, the method also includes identifying a second tolerance constraint based on the tag identifying the type of the corresponding object, and changing the first position is further based on the second tolerance constraint, and the second tolerance constraint prohibits at least one of the shifting or rotating of the first position of the one of the two or more line segments. In another example, changing the first position includes both shifting and rotating the first position of the one of the two or more line segments. In another example, the method also includes comparing the value to a threshold value to determine whether the corresponding feature no longer exists. In another example, the method also includes comparing the value to a threshold value to determine whether the corresponding feature has been shifted. In another example, the method also includes, before the dividing, determining whether the detected object is used to define a driving lane based on the type of the corresponding object. In another example, the method also includes using, by the one or more processors, the value to maneuver the vehicle.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a functional diagram of a system in accordance with aspects of the disclosure.



FIG. 2 is an interior of an autonomous vehicle in accordance with aspects of the disclosure.



FIG. 3 is an example of map information in accordance with aspects of the disclosure.



FIG. 4 is an example of an intersection in accordance with aspects of the disclosure.



FIG. 5 is an exterior of an autonomous vehicle in accordance with aspects of the disclosure.



FIG. 6 is an example of objects identified by a perception system in accordance with aspects of the disclosure.



FIG. 7 is a comparison of the objects of FIG. 6 to features of the map information of FIG. 3 in accordance with aspects of the disclosure.



FIG. 8 is another comparison of an object of FIG. 6 to features of the map information of FIG. 3 in accordance with aspects of the disclosure.



FIG. 9 is a further comparison of an object of FIG. 6 to features of the map information of FIG. 3 in accordance with aspects of the disclosure.



FIGS. 10A and 10B are examples of tolerance constraints in accordance with aspects of the disclosure.



FIG. 11 is an example of adjusting a location of a feature of the map information using a tolerance constraint in accordance with aspects of the disclosure.



FIG. 12 is an example flow diagram in accordance with aspects of the disclosure.





DETAILED DESCRIPTION
Overview

The technology relates to determining a probability of changes between pre-stored map information and a current state of the world. The prior map may include pre-stored map information used to maneuver a vehicle autonomously. The more the vehicle can depend on the accuracy and detail of the pre-stored map information, the less the vehicle must detect for itself in real time. However, in order for this approach to be effective, the pre-stored map information must be accurate and up-to-date. Since this condition cannot be absolutely guaranteed, it is useful for an autonomous vehicle's computing devices to be able to detect relevant changes (and ignore some minor changes) between the pre-stored map information and a current state of the world. For example, the vehicle's computing devices may perform an optimization procedure that moves and morphs a curve in the pre-stored map information in an attempt to best align it with a corresponding portion of a detected object, while also trying to preserve the general shape of the curve and minimize the overall shift in position. From this, the vehicle's computing devices may determine the probability of changes between the pre-stored map information and a current state of the world.


The pre-stored map information may include information that describes the shapes and geographic location coordinates of features observed in the past. The features may include those that are used to define driving lanes for the vehicle such as lane markers, curbs, barriers, guard rails, or transitions from one type of road surface to another as well as other features such as crosswalks, signs, stopping lines, etc. Examples of lane markers may include painted lines, rumble strips, botts (round, non-raised reflective markers), and other types of reflectors. The shapes of these features may be described as curves. In addition, each feature may be associated with one or more tags. A tag may identify a type of the feature.


The autonomous vehicle may include an object detection system. This system may include a plurality of sensors which provide sensor data to the vehicle's computing devices. This sensor data may describe the shape and geographic location coordinates of objects detected in the vehicle's environment.


The geographic location coordinates of the detected object may be compared to the pre-stored map information in order to identify a corresponding feature. As an example, features having geographic location coordinates that are within a threshold distance (e.g., a few inches, a half meter, etc.) of the geographic location coordinates of the detected object may be identified as a corresponding feature.
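
A minimal sketch of this matching step follows (illustrative only, not the patented implementation); the function name, data layout, and the 0.5 meter default are assumptions:

```python
# Illustrative sketch: identify a corresponding map feature by checking
# whether any of its coordinates fall within a threshold distance of the
# detected object's coordinates.
import math

def find_corresponding_feature(object_coords, map_features, threshold_m=0.5):
    """Return the first feature with a point within threshold_m of any
    point of the detected object, or None if nothing is close enough."""
    for feature in map_features:
        for fx, fy in feature["coords"]:
            for ox, oy in object_coords:
                if math.hypot(fx - ox, fy - oy) <= threshold_m:
                    return feature
    return None

# Hypothetical data: a detected lane line near a stored lane-marker curve.
detected = [(10.0, 5.2), (10.1, 6.0)]
features = [{"tag": "painted_line", "coords": [(10.2, 5.3), (10.3, 6.1)]}]
print(find_corresponding_feature(detected, features))  # prints the feature
```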


The curve of the corresponding feature may be divided into two or more segments. These segments may be described as a pair of points that correspond to a starting geographic location coordinate and an ending geographic location coordinate of the segment. These segments may also be described as a single point and a vector. As an example, each segment may be a predetermined distance, such as 0.5 meters or more or less. This predetermined distance may be selected based upon the underlying resolution of the sensor data, the pre-stored map information, computing resources of the vehicle's computing devices, etc.
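
As a sketch of this segmentation step, the following divides a polyline into roughly fixed-length pieces stored as (start, end) coordinate pairs, with a helper showing the equivalent point-and-vector form; the names and the simplification of snapping segment boundaries to input vertices are assumptions for illustration:

```python
import math

def divide_curve(points, segment_length=0.5):
    """Split a polyline into (start, end) pairs roughly segment_length
    meters long. For simplicity, boundaries snap to input vertices."""
    segments, start, travelled, prev = [], points[0], 0.0, points[0]
    for curr in points[1:]:
        travelled += math.hypot(curr[0] - prev[0], curr[1] - prev[1])
        if travelled >= segment_length:
            segments.append((start, curr))
            start, travelled = curr, 0.0
        prev = curr
    if travelled > 0.0:
        segments.append((start, prev))  # keep the final partial segment
    return segments

def as_point_and_vector(segment):
    """Equivalent single-point-plus-vector description of a segment."""
    (x0, y0), (x1, y1) = segment
    return (x0, y0), (x1 - x0, y1 - y0)
```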


Using the tag associated with the corresponding feature, a tolerance constraint may be identified. For example, the vehicle's computing devices may access a lookup table, database, matrix, etc. which relates each of the different tags of the pre-stored map information to tolerance constraints. A tolerance constraint may limit the amount a given segment can be shifted or rotated. For instance, the tolerance constraint may be related to the likelihood that the type of feature identified by the tag can change.
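
Such a lookup might be sketched as a simple table, as below; the tags and limits are invented for illustration and are not values from the disclosure:

```python
# Hypothetical tag-to-tolerance lookup table. Painted markers are allowed
# to move more than curbs, reflecting how easily each type can change.
TOLERANCE_BY_TAG = {
    "painted_line": {"max_shift_m": 1.0, "max_rotation_deg": 30.0},
    "rumble_strip": {"max_shift_m": 0.5, "max_rotation_deg": 15.0},
    "curb":         {"max_shift_m": 0.0, "max_rotation_deg": 0.0},
}

def tolerance_for(tag):
    # Unknown tags fall back to the most conservative constraint,
    # i.e. the feature is treated as immovable.
    return TOLERANCE_BY_TAG.get(tag, {"max_shift_m": 0.0, "max_rotation_deg": 0.0})
```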


Each of the segments may then be repositioned in order to better align that segment with the location coordinates of a corresponding portion of the detected object given the restrictions of any tolerance constraints. This may include laterally shifting the position of the segment relative to the corresponding portion of the detected object. In addition or alternatively, the segment may be rotated about a center point. As noted above, the tolerance constraint may be used to limit the amount by which a segment can be shifted or rotated.
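
One simple way to realize this shift-and-rotate step is sketched below, assuming each segment is matched to a pair of points on the detected object; the decomposition into a midpoint shift plus a rotation about the midpoint is an illustrative choice, and the names are assumptions:

```python
import math

def propose_repositioning(segment, target):
    """Return (shift_vector, rotation_rad) taking `segment` onto `target`:
    a lateral shift between midpoints plus a rotation about the midpoint."""
    (ax, ay), (bx, by) = segment
    (cx, cy), (dx, dy) = target
    mid_seg = ((ax + bx) / 2.0, (ay + by) / 2.0)
    mid_tgt = ((cx + dx) / 2.0, (cy + dy) / 2.0)
    shift = (mid_tgt[0] - mid_seg[0], mid_tgt[1] - mid_seg[1])
    delta = math.atan2(dy - cy, dx - cx) - math.atan2(by - ay, bx - ax)
    rotation = math.atan2(math.sin(delta), math.cos(delta))  # wrap to [-pi, pi]
    return shift, rotation
```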


The location coordinates of the repositioned segments for a corresponding feature may then be compared to corresponding location coordinates of the curve of the corresponding feature of the pre-stored map information. Based on this comparison, a value indicative of a likelihood that the corresponding feature changed, or rather moved, may be determined. For example, the value may include a probability that some or all of the curve of the corresponding feature has changed. In this regard, a probability may be determined for each section or for a plurality of the sections based on the differences between the two positions of each segment and the clustering of those differences from different segments.


In the case where the probability of a change is very high, the vehicle's computing devices may also compute a value or probability that the corresponding feature of the pre-stored map information no longer exists in the current state of the world. For example, the probability that some or all of the curve of the corresponding feature has changed may be compared to one or more threshold values to determine whether the feature has merely shifted or if the feature no longer exists. These threshold values may be learned from training on actual data.
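
As an illustrative sketch, thresholding the resulting probability might look like the following; the numeric thresholds are placeholders standing in for values that, per the disclosure, would be learned from training on actual data:

```python
# Toy thresholding of the change probability; thresholds are placeholders.
def classify_change(p_changed, shift_threshold=0.6, removal_threshold=0.95):
    if p_changed >= removal_threshold:
        return "feature may no longer exist"
    if p_changed >= shift_threshold:
        return "feature appears to have shifted"
    return "no significant change"

print(classify_change(0.7))  # feature appears to have shifted
```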


The vehicle's computing devices may use this probability in various ways. For example, if the probability is high and the change appears to be dramatic, the vehicle's computing devices may use this information to make driving decisions for the vehicle.


As described in more detail below, the aspects described herein may accommodate various alternatives. For example, before segmenting a corresponding feature, each of the objects detected in the vehicle's environment may be compared to the pre-stored map information to determine whether that detected object corresponds to a feature used to define a driving lane. In another example, rather than segmenting and repositioning the curve of a corresponding feature, an edge corresponding to the shape of the detected object may be segmented. As another example, when the probability that a corresponding feature has moved is very high, the detected object may be a new object in that it may not have a corresponding feature in the pre-stored map information. Similarly, a detected object that is identified as a new object may be used as a signal to indicate that another detected object is also a new object. In another example, when a detected object appears to have a corresponding feature that has shifted on top of another feature in the pre-stored map information, the vehicle's computing devices may assume that there has been no change or simply ignore the change.


As shown in FIG. 1, a vehicle 100 in accordance with one aspect of the disclosure includes various components. While certain aspects of the disclosure are particularly useful in connection with specific types of vehicles, the vehicle may be any type of vehicle including, but not limited to, cars, trucks, motorcycles, busses, boats, airplanes, helicopters, lawnmowers, recreational vehicles, amusement park vehicles, farm equipment, construction equipment, trams, golf carts, trains, and trolleys. The vehicle may have one or more computing devices, such as computing device 110 containing one or more processors 120, memory 130 and other components typically present in general purpose computing devices.


The memory 130 stores information accessible by the one or more processors 120, including data 132 and instructions 134 that may be executed or otherwise used by the processor(s) 120. The memory 130 may be of any type capable of storing information accessible by the processor(s), including a computing device-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories. Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.


The data 132 may be retrieved, stored or modified by processor(s) 120 in accordance with the instructions 134. For instance, although the claimed subject matter is not limited by any particular data structure, the data may be stored in computing device registers, in a relational database as a table having a plurality of different fields and records, XML documents or flat files. The data may also be formatted in any computing device-readable format.


The instructions 134 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computing device code on the computing device-readable medium. In that regard, the terms “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.


The one or more processors 120 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor, such as a field programmable gate array (FPGA). Although FIG. 1 functionally illustrates the processor(s), memory, and other elements of computing device 110 as being within the same block, it will be understood by those of ordinary skill in the art that the processor, computing device, or memory may actually include multiple processors, computing devices, or memories that may or may not be stored within the same physical housing. For example, memory may be a hard drive or other storage media located in a housing different from that of computing device 110. Accordingly, references to a processor or computing device will be understood to include references to a collection of processors or computing devices or memories that may or may not operate in parallel.


Computing device 110 may have all of the components normally used in connection with a computing device such as the processor and memory described above, as well as a user input 150 (e.g., a mouse, keyboard, touch screen and/or microphone) and various electronic displays (e.g., a monitor having a screen, a small LCD touch-screen or any other electrical device that is operable to display information). In this example, the vehicle includes an internal electronic display 152. In this regard, internal electronic display 152 may be located within a cabin of vehicle 100 and may be used by computing device 110 to provide information to passengers within the vehicle 100.


In one example, computing device 110 may be an autonomous driving computing system incorporated into vehicle 100. The autonomous driving computing system may be capable of communicating with various components of the vehicle as needed in order to control the vehicle in fully autonomous (without input from a driver) as well as semiautonomous (some input from a driver) driving modes.


As an example, FIG. 2 depicts an interior design of a vehicle having autonomous, semiautonomous, and manual (continuous input from a driver) driving modes. In this regard, the autonomous vehicle may include all of the features of a non-autonomous vehicle, for example: a steering apparatus, such as steering wheel 210; a navigation display apparatus, such as navigation display 215 (which may be a part of electronic display 152); and a gear selector apparatus, such as gear shifter 220. The vehicle may also have various user input devices 140 in addition to the foregoing, such as touch screen 217 (again, which may be a part of electronic display 152), or button inputs 219, for activating or deactivating one or more autonomous driving modes and for enabling a driver or passenger 290 to provide information, such as a navigation destination, to the computing device 110.


Returning to FIG. 1, when engaged, computing device 110 may control some or all of these functions of vehicle 100 and thus be fully or partially autonomous. It will be understood that although various systems and computing device 110 are shown within vehicle 100, these elements may be external to vehicle 100 or physically separated by large distances.


In this regard, computing device 110 may be in communication with various systems of vehicle 100, such as deceleration system 160, acceleration system 162, steering system 164, signaling system 166, navigation system 168, positioning system 170, and perception system 172, such that one or more systems working together may control the movement, speed, direction, etc. of vehicle 100 in accordance with the instructions 134 stored in memory 130. Although these systems are shown as external to computing device 110, in actuality, these systems may also be incorporated into computing device 110, again as an autonomous driving computing system for controlling vehicle 100.


As an example, computing device 110 may interact with deceleration system 160 and acceleration system 162 in order to control the speed of the vehicle. Similarly, steering system 164 may be used by computing device 110 in order to control the direction of vehicle 100. For example, if vehicle 100 is configured for use on a road, such as a car or truck, the steering system may include components to control the angle of the wheels to turn the vehicle. Signaling system 166 may be used by computing device 110 in order to signal the vehicle's intent to other drivers or vehicles, for example, by lighting turn signals or brake lights when needed.


Navigation system 168 may be used by computing device 110 in order to determine and follow a route to a location. In this regard, the navigation system 168 and/or data 132 may store map information, e.g., highly detailed maps identifying the shapes, geographic location coordinates, and elevations of various objects that were previously observed such as roadways, features used to define driving lanes, intersections, crosswalks, traffic signals, buildings, signs, vegetation, or other such objects and information that the vehicle's computers may use to control the vehicle safely.


Examples of features that are used to define driving lanes may include lane markers (painted lines, rumble strips, botts, etc.), curbs, barriers, guard rails, crosswalks, transitions from one type of road surface to another, or other such features. In some examples, the shapes of these features that are used to define driving lanes may be described as curves.



FIG. 3 is an example of detailed map information 300 for a section of roadway including an intersection 302. In this example, the detailed map information 300 includes information identifying the shape, location, and other characteristics of lane lines 310, 312, 314, and 316, traffic signals 320, 322, 324, and 326, as well as crosswalks 330, 332, and 334. As noted above, the features used to define driving lanes, such as lane lines 310, 312, 314, and 316, may be associated with one or more tags.


In addition, each of these features may be associated with one or more tags identifying the specific type of that feature. In the example of FIG. 3, tags 340, 342, 344, and 346 each identify the type of the corresponding feature, here lane lines 310, 312, 314, and 316.


Each tag may be associated with a tolerance constraint. As described in more detail below, a tolerance constraint may limit the amount a feature of the map information can be shifted or rotated when comparing the map information to a detected object. These tolerance constraints may be hard constraints, e.g., a segment cannot be rotated more than 30 degrees in any direction or shifted more than one meter. In addition, the tolerance constraints may be soft constraints where a segment is penalized for rotating or shifting greater than some threshold. As an example, penalties define the tradeoff between the improved appearance of an alignment and how much that alignment distorts the geometry. For instance, penalties may be defined such that a portion of the curve can only move some small distance (such as 10 centimeters or more or less) if the newly aligned location appears to have a shape that is somewhat more like the proper feature type for that portion, but the same portion can be moved a greater distance (such as 1 meter or more or less) if the newly aligned location appears to have a shape that is significantly more like the proper feature type for that portion.


A tolerance constraint may then help to maintain the shape of a given curve. For example, along with penalties based on how much any one segment moves or rotates, there may be penalties based on how the shape of connected segments changes. These penalties can be overcome when the changed positions of a portion of a curve suggest a change in shape, but are useful in many other cases to prevent a noisy detection from indicating a change when there is not one.
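
A toy scoring function can make these soft constraints concrete: in the sketch below, a candidate repositioning is worthwhile only if its match improvement outweighs the penalties for moving a segment and for distorting the shape formed with its neighbor. All weights and names are invented assumptions:

```python
# Invented soft-constraint scoring sketch, not the patented formulation.
def movement_penalty(shift_m, rotation_deg, w_shift=1.0, w_rot=0.1):
    return w_shift * shift_m + w_rot * abs(rotation_deg)

def shape_penalty(old_angle_deg, new_angle_deg, w_shape=0.5):
    # Penalize changing the angle between two connected segments.
    return w_shape * abs(new_angle_deg - old_angle_deg)

def alignment_score(match_gain, shift_m, rotation_deg,
                    old_angle_deg, new_angle_deg):
    return (match_gain
            - movement_penalty(shift_m, rotation_deg)
            - shape_penalty(old_angle_deg, new_angle_deg))

# A small move with a big match improvement is accepted (score > 0);
# the same move with little improvement is rejected.
print(alignment_score(2.5, 0.5, 5.0, 10.0, 12.0) > 0)  # True
print(alignment_score(0.3, 0.5, 5.0, 10.0, 12.0) > 0)  # False
```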


The tolerance constraints may be related to the likelihood that the type of feature identified by the tag will change. For example, the probability that painted line markers will move may be much higher than the probability that curbs will move (painted lines are much more easily moved by repainting than curbs, which may require significantly more labor). Thus, in some instances, such as where the corresponding object is a type of feature which is unlikely to be moved, a tolerance constraint may prevent a segment from being shifted and/or rotated at all. This tolerance constraint may be included in the tag, associated with the tag in the map information, or stored in some other location such as a lookup table, database, matrix, etc. which relates each of the different tags of the map information to a tolerance constraint.


In addition, the detailed map information includes a network of rails 350, 352, and 354, which provide the vehicle's computer with guidelines for maneuvering the vehicle so that the vehicle follows the rails and obeys traffic laws. As an example, a vehicle's computer may maneuver the vehicle from point A to point B (two fictitious locations not actually part of the detailed map information) by following rail 350, transitioning to rail 352, and subsequently transitioning to rail 354 in order to make a left turn at intersection 302.


As noted above, the map information may correspond to information observed in the past. In this regard, FIG. 4 is an example of a bird's eye view of an intersection that corresponds to the features of intersection 302. In this example, lane lines 410, 412, 414, and 416 correspond to the shape, location, and other characteristics of lane lines 310, 312, 314, and 316, respectively. Similarly, crosswalks 440, 442, and 444 correspond to the shape, location, and other characteristics of crosswalks 330, 332, and 334, respectively, and traffic signals 422, 420, and 426 correspond to the shape, location, and other characteristics of traffic signals 320, 322, and 324.


Positioning system 170 may be used by computing device 110 in order to determine the vehicle's relative or absolute position on a map or on the earth. For example, the positioning system 170 may include a GPS receiver to determine the device's latitude, longitude and/or altitude position. Other location systems such as laser-based localization systems, inertial-aided GPS, or camera-based localization may also be used to identify the location of the vehicle. The location of the vehicle may include an absolute geographical location, such as latitude, longitude, and altitude, as well as relative location information, such as location relative to other cars immediately around it, which can often be determined with less noise than the absolute geographical location.


The positioning system 170 may also include other devices in communication with computing device 110, such as an accelerometer, gyroscope or another direction/speed detection device to determine the direction and speed of the vehicle or changes thereto. By way of example only, an acceleration device may determine its pitch, yaw or roll (or changes thereto) relative to the direction of gravity or a plane perpendicular thereto. The device may also track increases or decreases in speed and the direction of such changes. The device's provision of location and orientation data as set forth herein may be provided automatically to the computing device 110, other computing devices and combinations of the foregoing.


The perception system 172 also includes one or more components for detecting and performing analysis on objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc. For example, the perception system 172 may include lasers, sonar, radar, one or more cameras, or any other detection devices which record data which may be processed by computing device 110. In the case where the vehicle is a small passenger vehicle such as a car, the car may include a laser mounted on the roof or other convenient location as well as other sensors such as cameras, radars, sonars, and additional lasers.


The computing device 110 may control the direction and speed of the vehicle by controlling various components. By way of example, if the vehicle is operating completely autonomously, computing device 110 may navigate the vehicle to a location using data from the detailed map information and navigation system 168. Computing device 110 may use the positioning system 170 to determine the vehicle's location and perception system 172 to detect and respond to objects when needed to reach the location safely. In order to do so, computing device 110 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine by acceleration system 162), decelerate (e.g., by decreasing the fuel supplied to the engine or by applying brakes by deceleration system 160), change direction (e.g., by turning the front or rear wheels of vehicle 100 by steering system 164), and signal such changes (e.g. by lighting turn signals of signaling system 166).



FIG. 5 is an example external view of vehicle 100 described above. As shown, various components of the perception system 172 may be positioned on or in the vehicle 100 in order to better detect external objects while the vehicle is being driven. In this regard, one or more sensors, such as laser range finders 510 and 512, may be positioned or mounted on the vehicle. As an example, the one or more computing devices 110 (not shown) may control laser range finder 510, e.g., by rotating it 180 degrees. In addition, the perception system may include one or more cameras 520 mounted internally on the windshield of vehicle 100 to receive and analyze various images about the environment. In addition to the laser range finder 510 of perception system 172, shown positioned on top of the vehicle in FIG. 5, and the one or more cameras 520 mounted internally on the windshield, other detection devices, such as sonar, radar, GPS, etc., may also be positioned in a similar manner.


The one or more computing devices 110 may also include features such as transmitters and receivers that allow the one or more devices to send and receive information to and from other devices. For example, the one or more computing devices may determine that the vehicle's environment has changed from an expected representation of the environment defined in the map information according to the aspects described herein. The one or more computing devices may send this information to other computing devices associated with other vehicles. Similarly, the one or more computing devices may receive such information from other computing devices.


This information may be sent and received via any wireless transmission method, such as radio, cellular, Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing. Such communication may be facilitated by any device capable of transmitting data to and from other computers, such as modems and wireless interfaces.


Example Methods

In addition to the operations described above and illustrated in the figures, various operations will now be described. It should be understood that the following operations do not have to be performed in the precise order described below. Rather, various steps can be handled in a different order or simultaneously, and steps may also be added or omitted.


As noted above, a vehicle's one or more computing devices may maneuver the vehicle using the various systems described above. While doing so, the perception system 172 may identify the shape and location of various objects in the vehicle's environment. For example, FIG. 6 depicts a section of roadway 600 including an intersection 602 identified by the perception system 172 from sensor data (e.g., lasers, cameras, radar, sonar, etc.). In this example, the perception system 172 has identified the shape and geographic location coordinates of various features such as lane lines 610, 612, 614, 616, and 618 as well as crosswalks 630, 632, and 634. In this example, the general location of intersection 602 may correspond to the location of intersection 302; however, as can be seen, there are various changes to the vehicle's environment. Here, a left-hand turn lane has been added, the roadway has been widened, and the location and placement of various crosswalks have changed.


The geographic location coordinates of the detected object may be compared to the map information in order to identify corresponding features between the map information and the objects detected by the perception system. As an example, features having at least some geographic location coordinates that are within a threshold distance (e.g., a few inches, a half meter, etc.) of the geographic location coordinates of a detected object may be identified as a corresponding feature. For example, FIG. 7 is a comparison of shapes and geographic location coordinates of lane lines 310, 312, 314, and 316 to the detected objects of FIG. 6. In this example, lane lines 312 and 612, 314 and 614, as well as 316 and 616 may be identified as corresponding features because of their close proximity to one another. In this example, however, only a portion of lane lines 316 and 616 are proximate to one another, though they may still be identified as corresponding features.


The curve of the corresponding feature of the map information may be divided into two or more segments. For example, FIG. 8 is a representation of lane lines 312 (identified from the map information) and 612 (a detected object). Here lane line 612 is shown as divided into segments 810, 812, 814, 816, and 818.


These segments may be described as a pair of points that correspond to a starting geographic location coordinate and an ending geographic location coordinate of the segment. Thus, each of the segments 810, 812, 814, 816, and 818 is bounded by two of end points 820, 822, 824, 826, 828, and 830. Each of these end points represents geographic location coordinates for the ending location of a corresponding segment. For example, segment 810 is bounded and defined by the geographic location coordinates of end points 820 and 822, segment 812 is bounded and defined by the geographic location coordinates of end points 822 and 824, etc. Alternatively, the segments may also be described as a single point and a vector.


As an example, each segment may be a predetermined distance. For example, lane line 612 may be divided into segments 810, 812, 814, 816, and 818 that are each 0.5 meters or more or less. This predetermined distance may be selected based upon the underlying resolution of the sensor data, the pre-stored map information, computing resources of the vehicle's computing devices, etc.


Each of the segments may then be repositioned in order to better align that segment with the location coordinates of a corresponding portion of the detected object given the restrictions of any tolerance constraints. This may include laterally shifting the position of the segment relative to the corresponding portion of the detected object. In addition or alternatively, the segment may be rotated about a center point. FIG. 9 depicts the segments of lane line 612 as well as lane line 312 with arrows 910, 912, 914, and 916 indicating the direction of any shifting or rotating needed to move the geographic location coordinates of segments of lane line 612 to the geographic location coordinates of lane line 312. In this example, segment 810 does not need to be shifted or rotated, segment 812 needs to be shifted and rotated in the direction of arrow 910, segment 814 needs to be shifted and rotated in the direction of arrow 912, and segments 816 and 818 need to be shifted in the directions of arrows 912 and 914, respectively, in order to better align the segments of lane line 612 with lane line 312.


In some examples, a tolerance constraint may then be identified. For example, the vehicle's computing devices may use the tag associated with a map feature to identify a tolerance constraint. As noted above, this information may be included in the tag, associated with the tag, or stored in some other location.


As noted above, a tolerance constraint may be used to limit the amount by which a segment can be shifted or rotated. FIG. 10A is an example of a tolerance constraint that limits the amount a given segment can be shifted. In this example, segment 1010 is compared to the location of feature 1020 of the map information. Segment 1010 and feature 1020 are a distance S1 apart from one another. However, the tolerance constraint limits the distance that segment 1010 can be shifted towards feature 1020 to the distance T1.


Similarly, FIG. 10B is an example of a tolerance constraint that limits the amount a given segment can be rotated. In this example, segment 1030 is compared to the location of feature 1040 of the map information. Segment 1030 and feature 1040 are an angular distance S2 apart from one another. However, the tolerance constraint limits the degree to which segment 1030 can be rotated towards feature 1040 to the angular distance T2.
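
These limits can be applied as simple clamps on a proposed shift and rotation, as in the illustrative sketch below, where the arguments play the roles of T1 and T2 from FIGS. 10A and 10B; the function names are assumptions:

```python
import math

def clamp_shift(shift, max_shift_m):
    """Scale the shift vector so its length is at most max_shift_m (T1)."""
    length = math.hypot(shift[0], shift[1])
    if length <= max_shift_m:
        return shift
    scale = max_shift_m / length
    return (shift[0] * scale, shift[1] * scale)

def clamp_rotation(rotation_rad, max_rotation_deg):
    """Limit the rotation to at most max_rotation_deg either way (T2)."""
    limit = math.radians(max_rotation_deg)
    return max(-limit, min(limit, rotation_rad))

print(clamp_shift((3.0, 4.0), 1.0))          # (0.6, 0.8): direction kept
print(clamp_rotation(math.radians(45), 30))  # capped at ~0.524 rad (30 deg)
```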


Thus, each of the segments may be repositioned in order to better align that segment with the location coordinates of a corresponding portion of the detected object given the restrictions of any tolerance constraints. FIG. 11 depicts an example of shifting and rotating segments using tolerance constraints. In this example, tolerance constraint 1110 limits the rotation and shifting of segment 814 along arrow 912, tolerance constraint 1112 limits the shifting of segment 816 along arrow 912, and tolerance constraint 1114 limits the shifting of segment 818 along arrow 914. In this example, the tolerance constraints for lane line 312 do not affect the shifting and rotating of segment 812, as the amount of shifting and rotating needed for segment 812 may be within the tolerance constraint for lane line 312.


By shifting and rotating the segments, the end points of the segments will have new geographic location coordinates as can be seen from the example of FIG. 11. The geographic location coordinates of the repositioned segments for a corresponding feature may then be compared to corresponding geographic location coordinates of the curve of the corresponding feature of the pre-stored map information. Based on this comparison, a value indicative of a likelihood that the corresponding feature changed, or rather moved, may be determined. For example, the value may include a probability that some or all of the curve of the corresponding feature has changed. In this regard, a probability may be determined for each section or for a plurality of the sections based on the differences between the two positions of each segment and the clustering of those differences from different segments.
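
One toy way to turn per-segment displacements into such a value is sketched below: displacements that cluster around a common value push the score up, while scattered ones, which look more like detection noise, pull it down. The formula and the noise_scale_m constant are invented for illustration:

```python
# Invented scoring of the change likelihood from per-segment displacements.
import math

def change_likelihood(displacements_m, noise_scale_m=0.2):
    mean = sum(displacements_m) / len(displacements_m)
    var = sum((d - mean) ** 2 for d in displacements_m) / len(displacements_m)
    # Reward a large mean displacement, discount a high spread (noise).
    score = (mean - math.sqrt(var)) / noise_scale_m
    return 1.0 / (1.0 + math.exp(-score))  # squash to (0, 1)

print(round(change_likelihood([0.9, 1.0, 1.1, 1.0]), 2))  # clustered -> ~0.99
print(round(change_likelihood([0.0, 1.5, 0.1, 1.2]), 2))  # scattered -> ~0.55
```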


In the case where the probability of a change is very high, the vehicle's computing devices may also compute a value or probability that the corresponding feature of the pre-stored map information no longer exists in the current state of the world. For example, the probability that some or all of the curve of the corresponding feature has changed may be compared to one or more threshold values to determine whether the feature has merely shifted or if the feature no longer exists. These threshold values may be learned from training on actual data.


In another example, rather than relying on the probability of a change for a feature being very high, if a good new location for a set of segments from the detailed map information is not found, the sensor data corresponding to the original location of the segments may be checked to see whether the proper feature was detected there. If it was not, this may indicate that the feature has been completely removed.


The vehicle's computing devices may use this probability in various ways. For example, if the probability is high and the change appears to be dramatic, the vehicle's computing devices may use this information to make driving decisions for the vehicle. This may include slowing the vehicle down, maneuvering the vehicle in a more cautious mode, stopping the vehicle (e.g., to protect the safety of passengers), requesting that a passenger of the vehicle take control of the vehicle, etc. The vehicle's computing devices may also save the probability information, share the information with other autonomous vehicles, send the information to a system operator or centralized computing device for review and possible incorporation into the pre-stored map information, etc.


Although the examples of FIGS. 7-11 relate to lane markers, the aspects described above for aligning and determining probabilities may be used with regard to other types of features in the detailed map information. For example, such aspects may also be especially useful with regard to detecting changes in the location of crosswalks and stop lines.



FIG. 12 is an example flow diagram 1200 which depicts some of the aspects described above which may be performed by one or more computing devices such as one or more computing devices 110 of vehicle 100. In this example, data identifying an object detected in a vehicle's environment is received at block 1210. This data includes location coordinates for the object. A corresponding feature is identified from pre-stored map information based on the location coordinates and a map location of the corresponding feature at block 1220. This corresponding feature is defined as a curve and associated with a tag identifying a type of the corresponding object. A tolerance constraint is identified based on the tag identifying the type of the corresponding object at block 1230. The curve is divided into two or more line segments at block 1240. Each line segment of the two or more line segments has a first position. The first position of one of the two or more line segments is changed to determine a second position of the one of the two or more line segments based on the location coordinates and the tolerance constraint at block 1250. Changing the first position includes at least one of shifting or rotating the first position of the one of the two or more line segments. A value is then determined based on a comparison of the first position to the second position at block 1260. This value indicates a likelihood that the corresponding feature has changed.
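
Purely for illustration, the flow of blocks 1210 through 1260 might be strung together as in the following simplified sketch, with each step reduced to a toy stand-in (pointwise displacements rather than a true curve alignment); nothing here is the patented implementation:

```python
import math

def detect_change(object_pts, feature_pts, tag, tolerances):
    max_shift = tolerances[tag]["max_shift_m"]               # block 1230
    displacements = []
    for (fx, fy), (ox, oy) in zip(feature_pts, object_pts):  # blocks 1240-1250
        d = math.hypot(ox - fx, oy - fy)                     # proposed move...
        displacements.append(min(d, max_shift))              # ...limited by T1
    mean = sum(displacements) / len(displacements)           # block 1260
    return 1.0 / (1.0 + math.exp(-(mean - 0.25) / 0.1))

tol = {"painted_line": {"max_shift_m": 1.0}}
# A lane line detected about a meter from its mapped location scores high.
print(detect_change([(0.0, 1.0), (1.0, 1.1)], [(0.0, 0.0), (1.0, 0.0)],
                    "painted_line", tol))
```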


As noted above, the aspects described herein may accommodate various alternatives. For example, before segmenting a corresponding feature, each of the objects detected in the vehicle's environment may be compared to the pre-stored map information to determine whether that detected object corresponds to a feature used to define a driving lane, as in the tag filter sketched below. This may be achieved by comparing the location information of the sensor data for the detected object to the location information of the pre-stored map to identify a corresponding feature. Then, based on the tag associated with the corresponding feature, the vehicle's computing devices may determine whether the detected object corresponds to the location of a feature used to define driving lanes. If so, the corresponding feature may be segmented and processed as described above; if not, the corresponding feature need not be segmented or processed as described above.
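
This pre-check might amount to a simple tag filter, sketched below with an invented tag set:

```python
# Hypothetical pre-check before segmenting: keep only features whose tag
# marks them as lane-defining. The tag set is invented for illustration.
LANE_DEFINING_TAGS = {"painted_line", "rumble_strip", "botts_dot",
                      "curb", "barrier", "guard_rail"}

def defines_driving_lane(feature):
    return feature.get("tag") in LANE_DEFINING_TAGS

print(defines_driving_lane({"tag": "curb"}))       # True: segment and process
print(defines_driving_lane({"tag": "crosswalk"}))  # False: skip segmentation
```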


Alternatively, rather than segmenting and repositioning the curve of a corresponding feature, an edge of the detected object may be segmented. The segments of the edge may then be shifted or rotated to better align the segment to the curve of the corresponding feature. Again, the tolerance constraint identified based on the tag of the corresponding feature may be used to limit the shifting and/or rotation of the segment. The location coordinates of the repositioned segments for the edge may then be compared to the corresponding location coordinates of the edge of the detected object (before the segment was repositioned). Based on this comparison, various values may be determined as described above.


In some examples, the detected object may be a new object in that it may not have a corresponding feature in the pre-stored map information. In this case, other features of the pre-stored map information may be used as signals to indicate additional characteristics of detected objects not readily detectable from the location and orientation characteristics of the detected objects. For example, if a new driving lane was added, the boundaries for that new driving lane may have a similar angle and heading as the boundaries for any previous driving lane or lanes in the pre-stored map information that are also in the same general area. In that regard, if a detected object appears to follow the general shape of the boundaries of a curve corresponding to a lane line in the pre-stored map information but appears in another location, the vehicle's computing devices may determine that the detected object corresponds to a lane line which is likely to have a heading that corresponds to the heading of the lane lines in the pre-stored map information.


Similarly, one detected object that is identified as a new object may be used as a signal to indicate that another detected object is also a new object. For example, if a new crosswalk or a new bike lane is detected, for example using image matching or other identification techniques, the likelihood that other features in that immediate area have changed may be relatively high. Thus, the vehicle's computing devices may be more likely to determine that another detected object is a new object.


In some instances, when a detected object appears to have a corresponding feature that has shifted on top of another feature in the pre-stored map information, the vehicle's computing devices may assume that there has been no change or simply ignore the change. For example, in the case of a solid double lane line, one of the lane lines may be more faded than the other, making it more difficult for the vehicle's detection system to detect the faded lane line. This may cause the vehicle's computing devices to determine that one of the lane lines has shifted on top of the other, when actually there has been no change. Thus, in this example, the vehicle's computing devices may assume that there has been no change or simply ignore the change.


Unless otherwise stated, the foregoing alternative examples are not mutually exclusive, but may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. In addition, the provision of the examples described herein, as well as clauses phrased as “such as,” “including” and the like, should not be interpreted as limiting the subject matter of the claims to the specific examples; rather, the examples are intended to illustrate only one of many possible embodiments. Further, the same reference numbers in different drawings can identify the same or similar elements.

Claims
  • 1. A computer-implemented method of maneuvering a vehicle when an object is detected in the vehicle's environment, the computer-implemented method comprising: receiving, by one or more computing devices from a perception system of a vehicle, data identifying an object detected in the vehicle's environment, the data including location coordinates for the object, the perception system including a plurality of sensors; identifying, by the one or more computing devices, a corresponding feature from pre-stored map information based on the location coordinates and a map location of the corresponding feature; dividing, by the one or more computing devices, at least one of a feature of the object or the corresponding feature identified from the pre-stored map information into two or more line segments, each line segment of the two or more line segments having a first position; changing, by the one or more computing devices, the first position of one of the two or more line segments to determine a second position of the one of the two or more line segments, wherein the changing the first position includes at least one of shifting or rotating the first position of the one of the two or more line segments; and maneuvering, by the one or more computing devices, the vehicle while operating in an autonomous driving mode based on the changing.
  • 2. The computer-implemented method of claim 1, wherein the identifying further comprises identifying the corresponding feature based on whether a distance between the location coordinates and the map location satisfies a threshold.
  • 3. The computer-implemented method of claim 2, further comprising identifying a first tolerance constraint that limits at least one of the shifting or rotating of the one of the two or more line segments.
  • 4. The computer-implemented method of claim 3, further comprising: identifying a second tolerance constraint based on a tag identifying a type of a corresponding object, wherein the changing the first position is further based on the second tolerance constraint, and the second tolerance constraint prohibits at least one of the shifting or rotating of the first position of the one of the two or more line segments.
  • 5. The computer-implemented method of claim 1, wherein the changing the first position includes both shifting and rotating the first position of the one of the two or more line segments.
  • 6. The computer-implemented method of claim 1, further comprising determining whether the corresponding feature no longer exists.
  • 7. The computer-implemented method of claim 1, further comprising determining whether the corresponding feature has been shifted.
  • 8. A vehicle comprising: a perception system configured to detect an object in the vehicle's environment, the perception system including a plurality of sensors; and one or more computing devices configured to: receive, from the perception system, data identifying the object detected in the vehicle's environment, the data including location coordinates for the object; identify a corresponding feature from pre-stored map information based on the location coordinates and a map location of the corresponding feature; divide at least one of a feature of the object or the corresponding feature identified from the pre-stored map information into two or more line segments, each line segment of the two or more line segments having a first position; change the first position of one of the two or more line segments to determine a second position of the one of the two or more line segments, wherein the changing the first position includes at least one of shifting or rotating the first position of the one of the two or more line segments; and maneuver the vehicle while operating in an autonomous driving mode based on the changing.
  • 9. The vehicle of claim 8, wherein the one or more computing devices are further configured to identify the corresponding feature based on whether a distance between the location coordinates and the map location satisfies a threshold.
  • 10. The vehicle of claim 9, wherein the one or more computing devices are further configured to: identify a first tolerance constraint that limits at least one of the shifting or rotating of the one of the two or more line segments.
  • 11. The vehicle of claim 10, wherein the one or more computing devices are further configured to: identify a second tolerance constraint based on a tag identifying a type of a corresponding object, wherein the changing the first position is further based on the second tolerance constraint, and the second tolerance constraint prohibits at least one of the shifting or rotating of the first position of the one of the two or more line segments.
  • 12. The vehicle of claim 8, wherein the changing the first position includes both shifting and rotating the first position of the one of the two or more line segments.
  • 13. The vehicle of claim 8, wherein the one or more computing devices are further configured to determine whether the corresponding feature no longer exists.
  • 14. The vehicle of claim 8, wherein the one or more computing devices are further configured to determine whether the corresponding feature has been shifted.
  • 15. A non-transitory, tangible computer readable medium on which instructions are stored, the instructions, when executed by one or more processors, cause the one or more processors to perform a method, the method comprising:
    receiving data identifying an object detected in a vehicle's environment, the data including location coordinates for the object;
    identifying a corresponding feature from pre-stored map information based on the location coordinates and a map location of the corresponding feature;
    dividing at least one of a feature of the object or the corresponding feature identified from the pre-stored map information into two or more line segments, each line segment of the two or more line segments having a first position;
    changing the first position of one of the two or more line segments to determine a second position of the one of the two or more line segments, wherein the changing the first position includes at least one of shifting or rotating the first position of the one of the two or more line segments; and
    maneuvering the vehicle while operating in an autonomous driving mode based on the changing.
  • 16. The non-transitory, tangible computer readable medium of claim 15, wherein the identifying further comprises identifying the corresponding feature based on whether a distance between the location coordinates and the map location satisfies a threshold.
  • 17. The non-transitory, tangible computer readable medium of claim 16, wherein the method further comprises identifying a first tolerance constraint that limits at least one of the shifting or rotating of the one of the two or more line segments.
  • 18. The non-transitory, tangible computer readable medium of claim 17, wherein the method further comprises identifying a second tolerance constraint based on a tag identifying a type of a corresponding object, wherein the changing the first position is further based on the second tolerance constraint, and the second tolerance constraint prohibits at least one of the shifting or rotating of the first position of the one of the two or more line segments.
  • 19. The non-transitory, tangible computer readable medium of claim 15, wherein the changing the first position includes both shifting and rotating the first position of the one of the two or more line segments.
  • 20. The non-transitory, tangible computer readable medium of claim 15, wherein the method further comprises determining whether the corresponding feature no longer exists or has been shifted.
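To make the claimed procedure concrete, the following is a minimal sketch, in Python, of the steps recited in claims 1, 8, and 15, together with the distance-threshold matching of claims 2, 9, and 16 and the tag-based tolerance constraints of claims 3, 4, 10, 11, 17, and 18. Every name, data structure, and numeric tolerance below is a hypothetical illustration for exposition only; the claims do not prescribe this or any other particular implementation.

    import math

    # Hypothetical per-tag tolerance constraints: a maximum shift in meters and
    # a maximum rotation in radians. A zero entry prohibits that adjustment, in
    # the manner of the second tolerance constraint of claims 4, 11, and 18.
    TOLERANCES = {
        "lane_edge": {"max_shift": 0.5, "max_rot": math.radians(5.0)},
        "crosswalk": {"max_shift": 0.2, "max_rot": 0.0},  # rotation prohibited
    }

    def match_feature(obj_xy, map_features, threshold=2.0):
        """Identify the corresponding map feature whose map location lies
        within `threshold` meters of the detected object's location
        coordinates (claims 2, 9, and 16); None if no feature qualifies."""
        best, best_dist = None, threshold
        for feature in map_features:
            d = math.dist(obj_xy, feature["location"])
            if d <= best_dist:
                best, best_dist = feature, d
        return best

    def divide_curve(points):
        """Divide a feature's curve, given here as a polyline of (x, y)
        vertices, into line segments; each segment's pair of endpoints is
        its first position."""
        return [(points[i], points[i + 1]) for i in range(len(points) - 1)]

    def change_position(segment, observed_points, tolerance):
        """Shift one segment toward the centroid of the observed sensor
        points, clamped by the tag-based tolerance constraint, to obtain its
        second position. A rotation fit, clamped to tolerance["max_rot"],
        would be applied the same way; it is omitted to keep the sketch
        short."""
        (x0, y0), (x1, y1) = segment
        cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
        ox = sum(p[0] for p in observed_points) / len(observed_points)
        oy = sum(p[1] for p in observed_points) / len(observed_points)
        dx, dy = ox - cx, oy - cy
        length = math.hypot(dx, dy)
        if tolerance["max_shift"] == 0.0:
            dx, dy = 0.0, 0.0  # shifting prohibited for this tag
        elif length > tolerance["max_shift"]:
            scale = tolerance["max_shift"] / length
            dx, dy = dx * scale, dy * scale
        return ((x0 + dx, y0 + dy), (x1 + dx, y1 + dy))

    def change_value(first, second):
        """Compare a segment's first and second positions; a larger mean
        endpoint displacement indicates a higher likelihood that the mapped
        feature has changed."""
        return sum(math.dist(a, b) for a, b in zip(first, second)) / len(first)

For example, after feature = match_feature(obj_xy, map_features), each segment from divide_curve(feature["curve"]) could be adjusted with change_position(segment, observed_points, TOLERANCES[feature["tag"]]), and the resulting change_value compared against a threshold to decide, in the manner of claims 6, 7, 13, 14, and 20, whether the mapped feature has been shifted or no longer exists. The field names "location", "curve", and "tag" are likewise assumptions of this sketch.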
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/799,304, filed Oct. 31, 2017, which is a continuation of U.S. patent application Ser. No. 15/070,425, filed Mar. 15, 2016, now issued as U.S. Pat. No. 9,836,052, which is a continuation of U.S. patent application Ser. No. 14/472,795, filed Aug. 29, 2014, now issued as U.S. Pat. No. 9,321,461, the disclosures of which are incorporated herein by reference.

US Referenced Citations (330)
Number Name Date Kind
1924984 Fageol Aug 1933 A
3186508 Lamont Jun 1965 A
3324805 Mulch Jun 1967 A
3596728 Neville Aug 1971 A
4372414 Anderson et al. Feb 1983 A
4387783 Carman Jun 1983 A
4543572 Tanaka Sep 1985 A
4656834 Elpern Apr 1987 A
4924795 Ottemann May 1990 A
4970653 Kenue Nov 1990 A
4982072 Takigami Jan 1991 A
5187666 Watanabe Feb 1993 A
5415468 Latarnik May 1995 A
5448487 Arai Sep 1995 A
5470134 Toepfer et al. Nov 1995 A
5521579 Bernhard May 1996 A
5684696 Rao et al. Nov 1997 A
5774069 Tanaka et al. Jun 1998 A
5790403 Nakayama Aug 1998 A
5906645 Kagawa et al. May 1999 A
5913376 Takei Jul 1999 A
5954781 Slepian et al. Sep 1999 A
6055042 Sarangapani Apr 2000 A
6064926 Sarangapani et al. May 2000 A
6070682 Isogai et al. Jun 2000 A
6151539 Bergholz et al. Nov 2000 A
6173222 Seo et al. Jan 2001 B1
6195610 Kaneko Feb 2001 B1
6226570 Hahn May 2001 B1
6321147 Takeda et al. Nov 2001 B1
6332354 Lalor et al. Dec 2001 B1
6343247 Jitsukata et al. Jan 2002 B2
6385539 Wilson et al. May 2002 B1
6414635 Stewart et al. Jul 2002 B1
6438472 Tano et al. Aug 2002 B1
6438491 Farmer Aug 2002 B1
6453056 Laumeyer et al. Sep 2002 B2
6470874 Mertes Oct 2002 B1
6504259 Kuroda et al. Jan 2003 B1
6516262 Takenaga et al. Feb 2003 B2
6560529 Janssen May 2003 B1
6591172 Oda et al. Jul 2003 B2
6606557 Kotzin Aug 2003 B2
6643576 O'Connor et al. Nov 2003 B1
6832156 Farmer Dec 2004 B2
6836719 Andersson et al. Dec 2004 B2
6847869 Dewberry et al. Jan 2005 B2
6862524 Nagda Mar 2005 B1
6876908 Cramer et al. Apr 2005 B2
6934613 Yun Aug 2005 B2
6963657 Nishigaki et al. Nov 2005 B1
7011186 Frentz et al. Mar 2006 B2
7031829 Nisiyama Apr 2006 B2
7085633 Nishira et al. Aug 2006 B2
7102496 Ernst, Jr. et al. Sep 2006 B1
7177760 Kudo Feb 2007 B2
7194347 Harumoto et al. Mar 2007 B2
7207304 Iwatsuki Apr 2007 B2
7233861 Van Buer et al. Jun 2007 B2
7327242 Holloway et al. Feb 2008 B2
7340332 Underdahl et al. Mar 2008 B2
7346439 Bodin Mar 2008 B2
7373237 Wagner et al. May 2008 B2
7394046 Olsson et al. Jul 2008 B2
7486802 Hougen Feb 2009 B2
7499774 Barrett et al. Mar 2009 B2
7499776 Allard et al. Mar 2009 B2
7499804 Svendsen et al. Mar 2009 B2
7515101 Bhogal et al. Apr 2009 B1
7565241 Tauchi Jul 2009 B2
7579942 Kalik Aug 2009 B2
7639841 Zhu et al. Dec 2009 B2
7656280 Hines et al. Feb 2010 B2
7694555 Howell et al. Apr 2010 B2
7778759 Tange et al. Aug 2010 B2
7818124 Herbst et al. Oct 2010 B2
7835859 Bill Nov 2010 B2
7865277 Larson et al. Jan 2011 B1
7894951 Norris et al. Feb 2011 B2
7908040 Howard et al. Mar 2011 B2
7956730 White et al. Jun 2011 B2
7979175 Allard et al. Jul 2011 B2
8024102 Swoboda et al. Sep 2011 B2
8050863 Trepagnier et al. Nov 2011 B2
8078349 Prada Gomez et al. Dec 2011 B1
8095313 Blackburn Jan 2012 B1
8099213 Zhang et al. Jan 2012 B2
8126642 Trepagnier et al. Feb 2012 B2
8144926 Mori Mar 2012 B2
8190322 Lin et al. May 2012 B2
8194927 Zhang et al. Jun 2012 B2
8195341 Huang et al. Jun 2012 B2
8224031 Saito Jul 2012 B2
8244408 Lee et al. Aug 2012 B2
8244458 Blackburn Aug 2012 B1
8260515 Huang et al. Sep 2012 B2
8280601 Huang et al. Oct 2012 B2
8280623 Trepagnier et al. Oct 2012 B2
8311274 Bergmann et al. Nov 2012 B2
8352111 Mudalige Jan 2013 B2
8352112 Mudalige Jan 2013 B2
8412449 Trepagnier et al. Apr 2013 B2
8452506 Groult May 2013 B2
8457827 Ferguson et al. Jun 2013 B1
8527199 Burnette et al. Sep 2013 B1
8634980 Urmson et al. Jan 2014 B1
8694236 Takagi Apr 2014 B2
8706394 Trepagnier et al. Apr 2014 B2
8718861 Montemerlo et al. May 2014 B1
8724093 Sakai et al. May 2014 B2
8762021 Yoshihama Jun 2014 B2
8775063 Zeng Jul 2014 B2
8825259 Ferguson Sep 2014 B1
8831813 Ferguson et al. Sep 2014 B1
8855849 Ferguson Oct 2014 B1
8855860 Isaji et al. Oct 2014 B2
8874267 Dolgov et al. Oct 2014 B1
8880272 Ferguson et al. Nov 2014 B1
8918277 Niem et al. Dec 2014 B2
8929604 Platonov et al. Jan 2015 B2
8935057 Dolinar et al. Jan 2015 B2
8948954 Ferguson et al. Feb 2015 B1
8949016 Ferguson et al. Feb 2015 B1
8970397 Nitanda et al. Mar 2015 B2
8972093 Joshi Mar 2015 B2
9008369 Schofield et al. Apr 2015 B2
9014903 Zhu et al. Apr 2015 B1
9062979 Ferguson et al. Jun 2015 B1
9063548 Ferguson et al. Jun 2015 B1
9081383 Montemerlo et al. Jul 2015 B1
9145139 Ferguson Sep 2015 B2
9182759 Wimmer et al. Nov 2015 B2
9285230 Silver Mar 2016 B1
9310804 Ferguson Apr 2016 B1
9321461 Silver Apr 2016 B1
9384394 Joshi Jul 2016 B2
9395192 Silver Jul 2016 B1
9460624 Pandita et al. Oct 2016 B2
9562777 Kang Feb 2017 B2
9600768 Ferguson Mar 2017 B1
9836052 Silver Dec 2017 B1
10042362 Fairfield Aug 2018 B2
10161754 Matsushita Dec 2018 B2
10181084 Ferguson Jan 2019 B2
10248871 Ramasamy Apr 2019 B2
10304333 Engel May 2019 B2
10474154 Wengreen Nov 2019 B1
10474162 Browning Nov 2019 B2
10481606 Wengreen Nov 2019 B1
10627816 Silver Apr 2020 B1
10732632 Li Aug 2020 B2
20010037927 Nagler et al. Nov 2001 A1
20020188499 Jenkins et al. Dec 2002 A1
20030014302 Jablin Jan 2003 A1
20030016804 Sheha et al. Jan 2003 A1
20030037977 Tatara et al. Feb 2003 A1
20030055554 Shioda et al. Mar 2003 A1
20030093209 Andersson et al. May 2003 A1
20030125963 Haken Jul 2003 A1
20040243292 Roy Dec 2004 A1
20050012589 Kokubu et al. Jan 2005 A1
20050099146 Nishikawa et al. May 2005 A1
20050125154 Kawasaki Jun 2005 A1
20050131645 Panopoulos Jun 2005 A1
20050149251 Donath et al. Jul 2005 A1
20050273251 Nix et al. Dec 2005 A1
20060037573 Iwatsuki et al. Feb 2006 A1
20060082437 Yuhara Apr 2006 A1
20060089764 Filippov et al. Apr 2006 A1
20060089765 Pack et al. Apr 2006 A1
20060089800 Svendsen et al. Apr 2006 A1
20060116801 Shirley et al. Jun 2006 A1
20060173841 Bill Aug 2006 A1
20060178240 Hansel Aug 2006 A1
20060276942 Anderson et al. Dec 2006 A1
20070010942 Bill Jan 2007 A1
20070024501 Yeh Feb 2007 A1
20070084655 Kakinami Apr 2007 A1
20070142992 Gronau et al. Jun 2007 A1
20070149214 Walsh et al. Jun 2007 A1
20070165910 Nagaoka et al. Jul 2007 A1
20070193798 Allard et al. Aug 2007 A1
20070203617 Haug Aug 2007 A1
20070225909 Sakano Sep 2007 A1
20070239331 Kaplan Oct 2007 A1
20070247281 Shimomura Oct 2007 A1
20070279250 Kume et al. Dec 2007 A1
20080021628 Tryon Jan 2008 A1
20080033615 Khajepour et al. Feb 2008 A1
20080039991 May et al. Feb 2008 A1
20080040039 Takagi Feb 2008 A1
20080056535 Bergmann et al. Mar 2008 A1
20080059015 Whittaker et al. Mar 2008 A1
20080059048 Kessler et al. Mar 2008 A1
20080084283 Kalik Apr 2008 A1
20080089556 Salgian et al. Apr 2008 A1
20080120025 Naitou et al. May 2008 A1
20080120171 Ikeuchi et al. May 2008 A1
20080147253 Breed Jun 2008 A1
20080161987 Breed Jul 2008 A1
20080162036 Breed Jul 2008 A1
20080167771 Whittaker et al. Jul 2008 A1
20080183512 Benzinger et al. Jul 2008 A1
20080188246 Sheha et al. Aug 2008 A1
20080195268 Sapilewski et al. Aug 2008 A1
20080277183 Huang et al. Nov 2008 A1
20080303696 Aso et al. Dec 2008 A1
20080306969 Mehta et al. Dec 2008 A1
20090005959 Bargman et al. Jan 2009 A1
20090074249 Moed et al. Mar 2009 A1
20090082879 Dooley et al. Mar 2009 A1
20090115594 Han May 2009 A1
20090164071 Takeda Jun 2009 A1
20090198400 Allard et al. Aug 2009 A1
20090248231 Kamiya Oct 2009 A1
20090276154 Subramanian et al. Nov 2009 A1
20090287367 Salinger Nov 2009 A1
20090287368 Bonne Nov 2009 A1
20090306834 Hjelm et al. Dec 2009 A1
20090313077 Wheeler, IV Dec 2009 A1
20090313095 Hurpin Dec 2009 A1
20090319096 Offer et al. Dec 2009 A1
20090319112 Fregene et al. Dec 2009 A1
20090322872 Muehlmann et al. Dec 2009 A1
20090326799 Crook Dec 2009 A1
20100010699 Taguchi et al. Jan 2010 A1
20100014714 Zhang et al. Jan 2010 A1
20100017056 Asakura et al. Jan 2010 A1
20100042282 Taguchi et al. Feb 2010 A1
20100052945 Breed Mar 2010 A1
20100066587 Yamauchi et al. Mar 2010 A1
20100076640 Maekawa et al. Mar 2010 A1
20100079590 Kuehnle et al. Apr 2010 A1
20100097383 Nystad et al. Apr 2010 A1
20100098295 Zhang Apr 2010 A1
20100104199 Zhang Apr 2010 A1
20100114416 Au et al. May 2010 A1
20100179715 Puddy Jul 2010 A1
20100179720 Lin et al. Jul 2010 A1
20100191433 Groult Jul 2010 A1
20100198491 Mays Aug 2010 A1
20100205132 Taguchi et al. Aug 2010 A1
20100207787 Catten et al. Aug 2010 A1
20100208937 Kmiecik et al. Aug 2010 A1
20100228419 Lee et al. Sep 2010 A1
20100241297 Aoki et al. Sep 2010 A1
20100246889 Nara et al. Sep 2010 A1
20100253542 Seder et al. Oct 2010 A1
20100256836 Mudalige Oct 2010 A1
20100265354 Kameyama Oct 2010 A1
20100299063 Nakamura et al. Nov 2010 A1
20110010131 Miyajima et al. Jan 2011 A1
20110040481 Trombley et al. Feb 2011 A1
20110071718 Norris et al. Mar 2011 A1
20110099040 Felt et al. Apr 2011 A1
20110137520 Rector et al. Jun 2011 A1
20110150348 Anderson Jun 2011 A1
20110206273 Plagemann et al. Aug 2011 A1
20110210866 David et al. Sep 2011 A1
20110213511 Visconti et al. Sep 2011 A1
20110239146 Dutta et al. Sep 2011 A1
20110246156 Zecha et al. Oct 2011 A1
20110254655 Maalouf et al. Oct 2011 A1
20110264317 Druenert et al. Oct 2011 A1
20120053775 Nettleton et al. Mar 2012 A1
20120069185 Stein Mar 2012 A1
20120083960 Zhu et al. Apr 2012 A1
20120114178 Platonov et al. May 2012 A1
20120157052 Quade Jun 2012 A1
20120271483 Samukawa et al. Oct 2012 A1
20120277947 Boehringer et al. Nov 2012 A1
20120283912 Lee et al. Nov 2012 A1
20120314070 Zhang et al. Dec 2012 A1
20130035821 Bonne et al. Feb 2013 A1
20130054049 Uno Feb 2013 A1
20130054106 Schmudderich et al. Feb 2013 A1
20130054128 Moshchuk et al. Feb 2013 A1
20130144520 Ricci Jun 2013 A1
20130179382 Fritsch et al. Jul 2013 A1
20130282277 Rubin et al. Oct 2013 A1
20130321400 Van Os et al. Dec 2013 A1
20130321422 Pahwa et al. Dec 2013 A1
20140050362 Park et al. Feb 2014 A1
20140067187 Ferguson et al. Mar 2014 A1
20140088855 Ferguson Mar 2014 A1
20140156164 Schuberth et al. Jun 2014 A1
20140195093 Litkouhi Jul 2014 A1
20140195138 Stelzig et al. Jul 2014 A1
20140214255 Dolgov et al. Jul 2014 A1
20140236473 Kondo et al. Aug 2014 A1
20140297181 Kondo et al. Oct 2014 A1
20140350836 Stettner et al. Nov 2014 A1
20140369168 Max et al. Dec 2014 A1
20150110344 Okumura Apr 2015 A1
20150112571 Schmudderich Apr 2015 A1
20150153735 Clarke et al. Jun 2015 A1
20150177007 Su et al. Jun 2015 A1
20150198951 Thor et al. Jul 2015 A1
20150203107 Lippman Jul 2015 A1
20150260530 Stenborg Sep 2015 A1
20150293216 O'Dea et al. Oct 2015 A1
20150302751 Strauss et al. Oct 2015 A1
20150321665 Pandita et al. Nov 2015 A1
20150325127 Pandita et al. Nov 2015 A1
20150345966 Meuleau Dec 2015 A1
20160046290 Aharony et al. Feb 2016 A1
20160091609 Ismail et al. Mar 2016 A1
20160327947 Ishikawa et al. Nov 2016 A1
20160334230 Ross et al. Nov 2016 A1
20160334797 Ross et al. Nov 2016 A1
20170016734 Gupta Jan 2017 A1
20170016740 Cui Jan 2017 A1
20170277960 Ramasamy Sep 2017 A1
20170316684 Jammoussi Nov 2017 A1
20170329330 Hatano Nov 2017 A1
20170356746 Iagnemma Dec 2017 A1
20170356747 Iagnemma Dec 2017 A1
20180024562 Bellaiche Jan 2018 A1
20180111613 Oh Apr 2018 A1
20180120859 Eagelberg May 2018 A1
20180131924 Jung May 2018 A1
20180188743 Wheeler Jul 2018 A1
20180189578 Yang Jul 2018 A1
20180322782 Engel Nov 2018 A1
20180348757 Mimura Dec 2018 A1
20180348758 Nakamura Dec 2018 A1
20180373250 Nakamura Dec 2018 A1
20190064826 Matsui Feb 2019 A1
20190137287 Pazhayampallil May 2019 A1
20190235498 Li Aug 2019 A1
Foreign Referenced Citations (50)
Number Date Country
101073018 Nov 2007 CN
101364111 Feb 2009 CN
10218010 Nov 2003 DE
10336986 Mar 2005 DE
0884666 Dec 1998 EP
2042405 Apr 2009 EP
2692064 Dec 1993 FR
H05246635 Sep 1993 JP
H08110998 Apr 1996 JP
09066853 Feb 1997 JP
09160643 Jun 1997 JP
H09161196 Jun 1997 JP
H09166209 Jun 1997 JP
H1139598 Feb 1999 JP
11282530 Oct 1999 JP
2000149188 May 2000 JP
2000193471 Jul 2000 JP
2000305625 Nov 2000 JP
2000338008 Dec 2000 JP
2001101599 Apr 2001 JP
2002236993 Aug 2002 JP
2002251690 Sep 2002 JP
2003081039 Mar 2003 JP
2003162799 Jun 2003 JP
2003205804 Jul 2003 JP
2004206510 Jul 2004 JP
2004326730 Nov 2004 JP
2004345862 Dec 2004 JP
2005067483 Mar 2005 JP
2005071114 Mar 2005 JP
2005297621 Oct 2005 JP
2005339181 Dec 2005 JP
2006264530 Oct 2006 JP
2006322752 Nov 2006 JP
2007001475 Jan 2007 JP
2007022135 Feb 2007 JP
2007331458 Dec 2007 JP
2008087545 Apr 2008 JP
2008117082 May 2008 JP
2008152655 Jul 2008 JP
2008170404 Jul 2008 JP
2008213581 Sep 2008 JP
2008290680 Dec 2008 JP
2009026321 Feb 2009 JP
2009053925 Mar 2009 JP
0070941 Nov 2000 WO
2001088827 Nov 2001 WO
2005013235 Feb 2005 WO
2007145564 Dec 2007 WO
2009028558 Mar 2009 WO
Non-Patent Literature Citations (23)
Entry
“Extended European Search Report received for European Patent Application No. 11831362.6, dated Mar. 14, 2017”, 11 pages.
“Extended European Search Report received for European Patent Application No. 11831503.5, dated Dec. 3, 2015”, 14 pages.
“Fact Sheet: Beyond Traffic Signals: A Paradigm Shift Intersection Control For Autonomous Vehicles”, Available online at: <http://www.fhwa.dot.gov/advancedresearch/pubs/10023/index.cfm>, 3 pages.
“International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2011/054154, dated Apr. 24, 2012”, 9 pages.
“International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2011/054896, dated Apr. 25, 2012”, 8 pages.
“International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2011/054899, dated May 4, 2012”, 8 pages.
“International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2013/061604, dated Jul. 3, 2014”, 10 pages.
“Notice of Preliminary Rejection received for Korean Patent Application No. 10-2013-7011657, dated Feb. 1, 2016”, 10 pages (4 pages of English Translation and 6 pages of Official Copy).
“Notice of Reasons for Rejection received for Japanese Patent Application No. 2013-532909, dated May 26, 2016”, 6 pages (3 pages of English Translation and 3 pages of Official Copy).
“Notice of reasons for rejection received for Japanese Patent Application No. 2013-532909, dated Nov. 25, 2015”, 9 pages (5 pages of English Translation and 4 pages of Official Copy).
“Office Action received for Chinese Patent Application No. 201180057942.8, dated Jun. 3, 2015”, 21 pages (14 pages of English Translation and 7 pages of Official Copy).
“Office Action received for Chinese Patent Application No. 201180057954.0, dated Apr. 29, 2015”, 14 pages (8 pages of English Translation and 6 pages of Official Copy).
“Office Action received for Japanese Patent Application No. 2013-532908, dated Sep. 8, 2015”, 12 pages (6 pages of English Translation and 6 pages of Official Copy).
“Partial Supplementary European Search Report received for European Patent Application No. 11831505.0, dated Dec. 20, 2016”, 6 pages.
“TomTom GO user manual”, Accessed on Oct. 1, 2007. XP055123040. Available online at: <http://download.tomtom.com/open/manuals/device/refman/TomTom-GO-en-GB.pdf>, 100 pages.
Crane, et al., “Team Gator Nation's Autonomous Vehicle Development for the 2007 DARPA Urban Challenge”, Journal of Aerospace Computing, Information and Communication, vol. 4, Dec. 2007, pp. 1059-1085.
Di Lecce, et al., “Experimental System to Support Real-Time Driving Pattern Recognition”, Advanced Intelligent Computing Theories and Applications, With Aspects of Artificial Intelligence, ICIC 2008, Lecture Notes in Computer Science, vol. 5227, Springer, Berlin, 2008, pp. 1192-1199.
Guizzo, Erico, “How Google's Self-Driving Car Works”, IEEE.org, IEEE, Accessed on Oct. 18, 2011, pp. 1-31.
Jaffe, Eric, “The First Look at How Google's Self-Driving Car Handles City Streets”, The Atlantic City Lab, Apr. 28, 2014, 16 pages.
Markoff, John, “Google Cars Drive Themselves, in Traffic”, Available online at: <http://www.nytimes.com/2010/10/10/science/10google.html>, Oct. 9, 2010, 4 pages.
McNaughton, et al., “Motion Planning for Autonomous Driving with a Conformal Spatiotemporal Lattice”, IEEE, International Conference on Robotics and Automation, May 9-13, 2011, pp. 4889-4895.
Schonhof, et al., “Autonomous Detection and Anticipation of Jam Fronts From Messages Propagated by Intervehicle Communication”, Journal of the Transportation Research Board, vol. 1999, No. 1, Jan. 1, 2007, pp. 3-12.
Tiwari, et al., “Survival Analysis: Pedestrian Risk Exposure at Signalized Intersections”, Transportation Research Part F: Traffic Psychology and Behaviour, Pergamon, Amsterdam, vol. 10, No. 2, 2007, pp. 77-89.
Continuations (3)
Number Date Country
Parent 15799304 Oct 2017 US
Child 16815204 US
Parent 15070425 Mar 2016 US
Child 15799304 US
Parent 14472795 Aug 2014 US
Child 15070425 US