Autonomous vehicles, such as vehicles that do not require a human driver when operating in an autonomous driving mode, may be used to aid in the transport of passengers or items from one location to another. An important component of an autonomous vehicle is the perception system, which allows the vehicle to perceive and interpret its surroundings using cameras, radar, sensors, and other similar devices. The perception system informs numerous decisions executed while the autonomous vehicle is in motion, such as speeding up, slowing down, stopping, turning, etc. Autonomous vehicles may also use the cameras, sensors, and global positioning devices to gather and interpret images and sensor data about their surrounding environment, e.g., parked cars, trees, buildings, etc.
Information from the perception system may be combined with highly detailed map information in order to allow a vehicle's computer to safely maneuver the vehicle in various environments. This highly detailed map information may describe expected conditions of the vehicle's environment such as the shape and location of roads, traffic signals, and other objects. In this regard, the information from the perception system and detailed map information may be used to assist a vehicle's computer in making driving decisions involving intersections and traffic signals.
One aspect of the disclosure provides a computer-implemented method. The method includes receiving, by one or more computing devices, data identifying an object detected in a vehicle's environment. The data includes location coordinates for the object. The method also includes identifying, by the one or more computing devices, a corresponding feature from pre-stored map information based on the location coordinates and a map location of the corresponding feature. The corresponding feature is defined as a curve and associated with a tag identifying a type of the corresponding feature. The method includes identifying, by the one or more computing devices, a tolerance constraint based on the tag identifying the type of the corresponding feature and dividing, by the one or more computing devices, the curve into two or more line segments. Each line segment of the two or more line segments has a first position. The method includes changing, by the one or more computing devices, the first position of one of the two or more line segments to determine a second position of that line segment based on the location coordinates and the tolerance constraint. Changing the first position includes at least one of shifting or rotating the first position of the line segment. The method also includes determining, by the one or more computing devices, a value based on a comparison of the first position to the second position. The value indicates a likelihood that the corresponding feature has changed.
In one example, the corresponding feature is identified based on whether a distance between the location coordinates and the map location satisfies a threshold. In another example, the tolerance constraint limits at least one of the shifting or rotating of the one of the two or more line segments. In another example, the method also includes identifying a second tolerance constraint based on the tag identifying the type of the corresponding feature, changing the first position is further based on the second tolerance constraint, and the second tolerance constraint prohibits at least one of the shifting or rotating of the first position of the one of the two or more line segments. In another example, changing the first position includes both shifting and rotating the first position of the one of the two or more line segments. In another example, the method also includes comparing the value to a threshold value to determine whether the corresponding feature no longer exists. In another example, the method also includes comparing the value to a threshold value to determine whether the corresponding feature has been shifted. In another example, the method also includes, before dividing the curve, determining whether the detected object is used to define a driving lane based on the type of the corresponding feature. In another example, the method also includes using, by the one or more computing devices, the value to maneuver the vehicle.
A further aspect of the disclosure provides a system including one or more computing devices. The one or more computing devices are configured to receive data identifying an object detected in a vehicle's environment. The data includes location coordinates for the object. The one or more computing devices are also configured to identify a corresponding feature from pre-stored map information based on the location coordinates and a map location of the corresponding feature. The corresponding feature is defined as a curve and associated with a tag identifying a type of the corresponding feature. The one or more computing devices are further configured to identify a tolerance constraint based on the tag identifying the type of the corresponding feature and divide the curve into two or more line segments. Each line segment of the two or more line segments has a first position. The one or more computing devices are also configured to change the first position of one of the two or more line segments to determine a second position of that line segment based on the location coordinates and the tolerance constraint. Changing the first position includes at least one of shifting or rotating the first position of the line segment. The one or more computing devices are configured to determine a value based on a comparison of the first position to the second position. The value indicates a likelihood that the corresponding feature has changed.
In one example, the corresponding feature is identified based on whether a distance between the location coordinates and the map location satisfies a threshold. In another example, the tolerance constraint limits at least one of the shifting or rotating of the one of the two or more line segments. In another example, the one or more computing devices are also configured to identify a second tolerance constraint based on the tag identifying the type of the corresponding feature and to change the first position further based on the second tolerance constraint, and the second tolerance constraint prohibits at least one of the shifting or rotating of the first position of the one of the two or more line segments. In another example, the one or more computing devices are also configured to change the first position by both shifting and rotating the first position of the one of the two or more line segments. In another example, the one or more computing devices are further configured to compare the value to a threshold value to determine whether the corresponding feature no longer exists. In another example, the one or more computing devices are further configured to compare the value to a threshold value to determine whether the corresponding feature has been shifted. In another example, the one or more computing devices are also configured to, before dividing the curve, determine whether the detected object is used to define a driving lane based on the type of the corresponding feature. In another example, the system also includes the vehicle, and the one or more computing devices are further configured to maneuver the vehicle based on the value.
A further aspect of the disclosure provides a non-transitory, tangible computer-readable medium on which instructions are stored. The instructions, when executed by one or more processors, cause the one or more processors to perform a method. The method includes receiving data identifying an object detected in a vehicle's environment. The data includes location coordinates for the object. The method also includes identifying a corresponding feature from pre-stored map information based on the location coordinates and a map location of the corresponding feature. The corresponding feature is defined as a curve and associated with a tag identifying a type of the corresponding feature. The method includes identifying a tolerance constraint based on the tag identifying the type of the corresponding feature and dividing the curve into two or more line segments. Each line segment of the two or more line segments has a first position. The method includes changing the first position of one of the two or more line segments to determine a second position of that line segment based on the location coordinates and the tolerance constraint. Changing the first position includes at least one of shifting or rotating the first position of the line segment. The method also includes determining a value based on a comparison of the first position to the second position. The value indicates a likelihood that the corresponding feature has changed.
In one example, the corresponding feature is identified based on whether a distance between the location coordinates and the map location satisfies a threshold. In another example, the tolerance constraint limits at least one of the shifting or rotating of the one of the two or more line segments. In another example, the method also includes identifying a second tolerance constraint based on the tag identifying the type of the corresponding feature, changing the first position is further based on the second tolerance constraint, and the second tolerance constraint prohibits at least one of the shifting or rotating of the first position of the one of the two or more line segments. In another example, changing the first position includes both shifting and rotating the first position of the one of the two or more line segments. In another example, the method also includes comparing the value to a threshold value to determine whether the corresponding feature no longer exists. In another example, the method also includes comparing the value to a threshold value to determine whether the corresponding feature has been shifted. In another example, the method also includes, before dividing the curve, determining whether the detected object is used to define a driving lane based on the type of the corresponding feature. In another example, the method also includes using, by the one or more processors, the value to maneuver the vehicle.
Overview
The technology relates to determining a probability of changes between pre-stored map information and a current state of the world. The pre-stored map information may be used to maneuver a vehicle autonomously. The more the vehicle can depend on the accuracy and detail of the pre-stored map information, the less the vehicle must detect for itself in real time. However, for this approach to be effective, the pre-stored map information must be accurate and up to date. Since this condition cannot be absolutely guaranteed, it is useful for an autonomous vehicle's computing devices to be able to detect relevant changes (and ignore some minor changes) between the pre-stored map information and a current state of the world. For example, the vehicle's computing devices may perform an optimization procedure that moves and morphs a curve in the pre-stored map information in an attempt to best align it with a corresponding portion of a detected object, while also trying to preserve the general shape of the curve and minimize the overall shift in position. From this, the vehicle's computing devices may determine the probability of changes between the pre-stored map information and the current state of the world.
The pre-stored map information may include information that describes the shapes and geographic location coordinates of features observed in the past. The features may include those that are used to define driving lanes for the vehicle, such as lane markers, curbs, barriers, guard rails, or transitions from one type of road surface to another, as well as other features such as crosswalks, signs, stopping lines, etc. Examples of lane markers may include painted lines, rumble strips, Botts' dots (round raised reflective markers), and other types of reflectors. The shapes of these features may be described as curves. In addition, each feature may be associated with one or more tags. A tag may identify a type of the feature.
The autonomous vehicle may include an object detection system. This system may include a plurality of sensors which provide sensor data to the vehicle's computing devices. This sensor data may describe the shape and geographic location coordinates of objects detected in the vehicle's environment.
The geographic location coordinates of the detected object may be compared to the pre-stored map information in order to identify a corresponding feature. As an example, features having geographic location coordinates that are within a threshold distance (e.g., a few inches, a half meter, etc.) of the geographic location coordinates of the detected object may be identified as a corresponding feature.
The curve of the corresponding feature may be divided into two or more segments. These segments may be described as a pair of points that correspond to a starting geographic location coordinate and an ending geographic location coordinate of the segment. These segments may also be described as a single point and a vector. As an example, each segment may be a predetermined distance, such as 0.5 meters or more or less. This predetermined distance may be selected based upon the underlying resolution of the sensor data, the pre-stored map information, computing resources of the vehicle's computing devices, etc.
Using the tag associated with the corresponding feature, a tolerance constraint may be identified. For example, the vehicle's computing devices may access a lookup table, database, matrix, etc. which relates each of the different tags of the pre-stored map information to tolerance constraints. A tolerance constraint may limit the amount a given segment can be shifted or rotated. For instance, the tolerance constraint may be related to the likelihood that the type of feature identified by the tag can change.
Each of the segments may then be repositioned in order to better align that segment with the location coordinates of a corresponding portion of the detected object given the restrictions of any tolerance constraints. This may include laterally shifting the position of the segment relative to the corresponding portion of the detected object. In addition or alternatively, the segment may be rotated about a center point. As noted above, the tolerance constraint may be used to limit the amount by which a segment can be shifted or rotated.
The location coordinates of the repositioned segments for a corresponding feature may then be compared to corresponding location coordinates of the curve of the corresponding feature of the pre-stored map information. Based on this comparison, a value indicative of a likelihood that the corresponding feature changed, or rather moved, may be determined. For example, the value may include a probability that some or all of the curve of the corresponding feature has changed. In this regard, a probability may be determined for each segment or for a plurality of the segments based on the differences between the two positions of each segment and the clustering of those differences across different segments.
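To make this comparison concrete, below is a minimal Python sketch that turns per-segment displacements into such a value. The helper names (segment_displacement, change_likelihood), the exponential mapping, and the variance-based agreement term are illustrative assumptions, not the scoring actually used by the vehicle's computing devices.

```python
import math

def segment_displacement(old_seg, new_seg):
    """Mean distance between corresponding end points of a segment
    before and after repositioning."""
    return sum(math.dist(p, q) for p, q in zip(old_seg, new_seg)) / len(old_seg)

def change_likelihood(old_segments, new_segments, scale=0.25):
    """Map per-segment displacements to a pseudo-probability in [0, 1).

    Displacements that are both large and consistent (clustered) across
    segments count for more than isolated, noisy ones.
    """
    disps = [segment_displacement(o, n)
             for o, n in zip(old_segments, new_segments)]
    mean = sum(disps) / len(disps)
    # Agreement term: low variance across segments suggests a coherent
    # shift of the whole feature rather than detection noise.
    var = sum((d - mean) ** 2 for d in disps) / len(disps)
    return (1.0 - math.exp(-mean / scale)) / (1.0 + var)

# Example: both segments of a lane-line curve shifted ~0.4 m laterally.
old = [((0.0, 0.0), (0.5, 0.0)), ((0.5, 0.0), (1.0, 0.0))]
new = [((0.0, 0.4), (0.5, 0.4)), ((0.5, 0.4), (1.0, 0.4))]
print(change_likelihood(old, new))  # ~0.8: the feature likely moved
```

In this toy example the displacements are large and perfectly clustered, so the value is high; scattered, inconsistent displacements would drive it down.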
In the case where the probability of a change is very high, the vehicle's computing devices may also compute a value or probability that the corresponding feature of the pre-stored map information no longer exists in the current state of the world. For example, the probability that some or all of the curve of the corresponding feature has changed may be compared to one or more threshold values to determine whether the feature has merely shifted or if the feature no longer exists. These threshold values may be learned from training on actual data.
The vehicle's computing devices may use this probability in various ways. For example, if the probability is high and the change appears to be dramatic, the vehicle's computing devices may use this information to make driving decisions for the vehicle.
As described in more detail below, the aspects described herein may accommodate various alternatives. For example, before segmenting a corresponding feature, each of the objects detected in the vehicle's environment may be compared to the pre-stored map information to determine whether that detected object corresponds to a feature used to define a driving lane. In another example, rather than segmenting and repositioning the curve of a corresponding feature, an edge corresponding to the shape of the detected object may be segmented. As another example, when the probability that a corresponding feature has moved is very high, the detected object may be a new object in that it may not have a corresponding feature in the pre-stored map information. Similarly, a detected object that is identified as a new object may be used as a signal to indicate that another detected object is also a new object. In another example, when a detected object appears to have a corresponding feature that has shifted on top of another feature in the pre-stored map information, the vehicle's computing devices may assume that there has been no change or simply ignore the change.
As shown in the figures, a vehicle 100 in accordance with one aspect of the disclosure may include one or more computing devices, such as computing device 110 containing one or more processors 120, memory 130, and other components typically present in general purpose computing devices.
The memory 130 stores information accessible by the one or more processors 120, including data 132 and instructions 134 that may be executed or otherwise used by the processor(s) 120. The memory 130 may be of any type capable of storing information accessible by the processor(s), including a computing device-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories. Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
The data 132 may be retrieved, stored or modified by processor(s) 120 in accordance with the instructions 134. For instance, although the claimed subject matter is not limited by any particular data structure, the data may be stored in computing device registers, in a relational database as a table having a plurality of different fields and records, in XML documents, or in flat files. The data may also be formatted in any computing device-readable format.
The instructions 134 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computing device code on the computing device-readable medium. In that regard, the terms “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.
The one or more processors 120 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor, such as a field programmable gate array (FPGA).
Computing device 110 may have all of the components normally used in connection with a computing device such as the processor and memory described above, as well as a user input 150 (e.g., a mouse, keyboard, touch screen and/or microphone) and various electronic displays (e.g., a monitor having a screen, a small LCD touch-screen or any other electrical device that is operable to display information). In this example, the vehicle includes an internal electronic display 152. In this regard, internal electronic display 152 may be located within a cabin of vehicle 100 and may be used by computing device 110 to provide information to passengers within the vehicle 100.
In one example, computing device 110 may be an autonomous driving computing system incorporated into vehicle 100. The autonomous driving computing system may be capable of communicating with various components of the vehicle as needed in order to control the vehicle in fully autonomous (without input from a driver) as well as semiautonomous (some input from a driver) driving modes.
In this regard, computing device 110 may be in communication with various systems of vehicle 100, such as deceleration system 160, acceleration system 162, steering system 164, signaling system 166, navigation system 168, positioning system 170, and perception system 172, such that one or more systems working together may control the movement, speed, direction, etc. of vehicle 100 in accordance with the instructions 134 stored in memory 130. Although these systems are shown as external to computing device 110, in actuality, these systems may also be incorporated into computing device 110, again as an autonomous driving computing system for controlling vehicle 100.
As an example, computing device 110 may interact with deceleration system 160 and acceleration system 162 in order to control the speed of the vehicle. Similarly, steering system 164 may be used by computing device 110 in order to control the direction of vehicle 100. For example, if vehicle 100 is configured for use on a road, such as a car or truck, the steering system may include components to control the angle of the wheels to turn the vehicle. Signaling system 166 may be used by computing device 110 in order to signal the vehicle's intent to other drivers or vehicles, for example, by lighting turn signals or brake lights when needed.
Navigation system 168 may be used by computing device 110 in order to determine and follow a route to a location. In this regard, the navigation system 168 and/or data 132 may store map information, e.g., highly detailed maps identifying the shapes, geographic location coordinates, and elevations of various objects that were previously observed such as roadways, features used to define driving lanes, intersections, crosswalks, traffic signals, buildings, signs, vegetation, or other such objects and information that the vehicle's computers may use to control the vehicle safely.
Examples of features that are used to define driving lanes may include lane markers (painted lines, rumble strips, Botts' dots, etc.), curbs, barriers, guard rails, crosswalks, transitions from one type of road surface to another, or other such features. In some examples, the shapes of these features that are used to define driving lanes may be described as curves.
In addition, each of these features may be associated with one or more tags identifying the specific type of that feature.
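As a rough illustration, such a tagged feature might be represented as in the Python sketch below; the MapFeature class and its field names are hypothetical stand-ins for whatever format the map information actually uses.

```python
from dataclasses import dataclass, field

@dataclass
class MapFeature:
    """Illustrative pre-stored map feature: a curve plus descriptive tags."""
    feature_id: str
    curve: list[tuple[float, float]]               # ordered location coordinates
    tags: list[str] = field(default_factory=list)  # e.g., ["lane marker"]

# A fictitious double lane line described as a polyline with two tags.
lane_line = MapFeature(
    feature_id="lane_line_612",
    curve=[(0.0, 0.0), (0.5, 0.0), (1.0, 0.02), (1.5, 0.05)],
    tags=["lane marker", "double yellow"],
)
```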
Each tag may be associated with a tolerance constraint. As described in more detail below, a tolerance constraint may limit the amount a feature of the map information can be shifted or rotated when comparing the map information to a detected object. These tolerance constraints may be hard constraints, e.g., a segment cannot be rotated more than 30 degrees in any direction or shifted more than one meter. In addition, the tolerance constraints may be soft constraints where a segment is penalized for rotating or shifting more than some threshold. As an example, penalties define the tradeoff between the improved appearance of an alignment and how that alignment affects the geometry. For instance, penalties may be defined such that a portion of the curve can only move some small distance (such as 10 centimeters or more or less) if the newly aligned location appears to have a shape that is somewhat more like the proper feature type for that portion, but the same portion can be moved a greater distance (such as 1 meter or more or less) if the newly aligned location appears to have a shape that is significantly more like the proper feature type for that portion.
A tolerance constraint may thus help to maintain the shape of a given curve. For example, along with penalties based on how much any one segment moves or rotates, there may be penalties based on how the shape of connected segments changes. These penalties can be overcome when the changed positions of a portion of a curve suggest a change in shape, but are useful in many other cases to prevent a noisy detection from indicating a change when there is not one.
The tolerance constraints may be related to the likelihood that the type of feature identified by the tag will change. For example, the probability that painted lane markers will move may be much higher than the probability that curbs will move (lane markers are much more easily moved by repainting than curbs, which may require significantly more labor). Thus, in some instances, such as where the corresponding feature is a type of feature which is unlikely to be moved, a tolerance constraint may prevent a segment from being shifted and/or rotated at all. This tolerance constraint may be included in the tag, associated with the tag in the map information, or stored in some other location such as a lookup table, database, matrix, etc. which relates each of the different tags of the map information to a tolerance constraint.
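The sketch below shows one plausible encoding of this relationship, with hard caps on shifting and rotating plus a soft per-meter penalty; the ToleranceConstraint class, the TOLERANCES table, and all of the numbers are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class ToleranceConstraint:
    """Illustrative limits on how far a segment may be repositioned."""
    max_shift_m: float       # hard cap on shift, in meters
    max_rotation_deg: float  # hard cap on rotation, in degrees
    penalty_per_m: float     # soft penalty per meter of shift

# Hypothetical lookup relating tags to constraints: paint is easy to move,
# so it tolerates large shifts; curbs are unlikely to move at all.
TOLERANCES = {
    "lane marker": ToleranceConstraint(1.0, 30.0, penalty_per_m=0.5),
    "curb":        ToleranceConstraint(0.0, 0.0,  penalty_per_m=10.0),
}

def alignment_cost(shift_m, score_gain, constraint):
    """Trade off improved alignment against geometric deformation.

    Returns None when a hard constraint forbids the move; otherwise a net
    cost, where a negative value means the move is worth making.
    """
    if shift_m > constraint.max_shift_m:
        return None  # hard constraint: move not allowed
    return constraint.penalty_per_m * shift_m - score_gain

# A 0.1 m shift is accepted only if it buys enough alignment improvement.
print(alignment_cost(0.1, score_gain=0.2,
                     constraint=TOLERANCES["lane marker"]))  # -0.15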
In addition, the detailed map information includes a network of rails 350, 352, and 354, which provide the vehicle's computer with guidelines for maneuvering the vehicle so that the vehicle follows the rails and obeys traffic laws. As an example, a vehicle's computer may maneuver the vehicle from point A to point B (two fictitious locations not actually part of the detailed map information) by following rail 350, transitioning to rail 352, and subsequently transitioning to rail 354 in order to make a left turn at intersection 302.
As noted above, the map information may correspond to information observed in the past.
Positioning system 170 may be used by computing device 110 in order to determine the vehicle's relative or absolute position on a map or on the earth. For example, the positioning system 170 may include a GPS receiver to determine the device's latitude, longitude and/or altitude position. Other location systems such as laser-based localization systems, inertial-aided GPS, or camera-based localization may also be used to identify the location of the vehicle. The location of the vehicle may include an absolute geographical location, such as latitude, longitude, and altitude, as well as relative location information, such as location relative to other cars immediately around it, which can often be determined with less noise than the absolute geographical location.
The positioning system 170 may also include other devices in communication with computing device 110, such as an accelerometer, gyroscope or another direction/speed detection device to determine the direction and speed of the vehicle or changes thereto. By way of example only, an acceleration device may determine its pitch, yaw or roll (or changes thereto) relative to the direction of gravity or a plane perpendicular thereto. The device may also track increases or decreases in speed and the direction of such changes. Location and orientation data as set forth herein may be provided automatically to the computing device 110, other computing devices, and combinations of the foregoing.
The perception system 172 also includes one or more components for detecting and performing analysis on objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc. For example, the perception system 172 may include lasers, sonar, radar, one or more cameras, or any other detection devices which record data which may be processed by computing device 110. In the case where the vehicle is a small passenger vehicle such as a car, the car may include a laser mounted on the roof or other convenient location as well as other sensors such as cameras, radars, sonars, and additional lasers.
The computing device 110 may control the direction and speed of the vehicle by controlling various components. By way of example, if the vehicle is operating completely autonomously, computing device 110 may navigate the vehicle to a location using data from the detailed map information and navigation system 168. Computing device 110 may use the positioning system 170 to determine the vehicle's location and perception system 172 to detect and respond to objects when needed to reach the location safely. In order to do so, computing device 110 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine by acceleration system 162), decelerate (e.g., by decreasing the fuel supplied to the engine or by applying brakes by deceleration system 160), change direction (e.g., by turning the front or rear wheels of vehicle 100 by steering system 164), and signal such changes (e.g. by lighting turn signals of signaling system 166).
The one or more computing devices 110 may also include features such as transmitters and receivers that allow the one or more devices to send and receive information to and from other devices. For example, the one or more computing devices may determine that the vehicle's environment has changed from an expected representation of the environment defined in the map information according to the aspects described herein. The one or more computing devices may send this information to other computing devices associated with other vehicles. Similarly, the one or more computing devices may receive such information from other computing devices.
This information may be sent and received via any wireless transmission method, such as radio, cellular, Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing. Such communication may be facilitated by any device capable of transmitting data to and from other computers, such as modems and wireless interfaces.
Example Methods
In addition to the operations described above and illustrated in the figures, various operations will now be described. It should be understood that the following operations do not have to be performed in the precise order described below. Rather, various steps can be handled in a different order or simultaneously, and steps may also be added or omitted.
As noted above, a vehicle's one or more computing devices may maneuver the vehicle using the various systems described above. While doing so, the perception system 172 may identify the shape and location of various objects in the vehicle's environment.
The geographic location coordinates of the detected object may be compared to the map information in order to identify corresponding features between the map information and the objects detected by the perception system. As an example, features having at least some geographic location coordinates that are within a threshold distance (e.g., a few inches, a half meter, etc.) of the geographic location coordinates of a detected object may be identified as a corresponding feature.
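A brute-force rendering of this matching step is sketched below; the function name, the 0.5-meter default threshold, and the dictionary-of-polylines representation are assumptions (a production system would more plausibly query a spatial index than scan every feature).

```python
import math

def find_corresponding_feature(detected_coords, features, threshold_m=0.5):
    """Return the id of the map feature whose curve comes closest to the
    detected coordinates, provided it is within threshold_m; else None.

    'features' is assumed to map a feature id to a polyline of (x, y) points.
    """
    best_id, best_dist = None, threshold_m
    for feature_id, curve in features.items():
        dist = min(math.dist(p, q) for p in detected_coords for q in curve)
        if dist <= best_dist:
            best_id, best_dist = feature_id, dist
    return best_id

features = {"lane_line_612": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)]}
detected = [(0.02, 0.3), (0.55, 0.35)]
print(find_corresponding_feature(detected, features))  # -> lane_line_612
```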
The curve of the corresponding feature of the map information may be divided into two or more segments.
These segments may be described as a pair of points that correspond to a starting geographic location coordinate and an ending geographic location coordinate of the segment. Thus, each of the segments 810, 812, 814, 816, and 818 is bounded by two of end points 820, 822, 824, 826, 828, and 830. Each of these end points represents geographic location coordinates for the ending location of a corresponding segment. For example, segment 810 is bounded and defined by the geographic location coordinates of end points 820 and 822, segment 812 is bounded and defined by the geographic location coordinates of end points 822 and 824, etc. Alternatively, the segments may also be described as a single point and a vector.
As an example, each segment may be a predetermined distance. For example, lane line 612 may be divided into segments 810, 812, 814, 816, and 818 that are each 0.5 meters or more or less. This predetermined distance may be selected based upon the underlying resolution of the sensor data, the pre-stored map information, computing resources of the vehicle's computing devices, etc.
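One way to carry out this division is to resample the curve at fixed arc-length steps and pair consecutive samples, as in the sketch below; the helper names and the plain polyline representation are illustrative assumptions.

```python
import math

def resample(curve, step=0.5):
    """Sample points along a polyline every 'step' meters of arc length."""
    points, travelled, target = [curve[0]], 0.0, step
    for p, q in zip(curve, curve[1:]):
        edge = math.dist(p, q)
        while travelled + edge >= target:
            t = (target - travelled) / edge
            points.append((p[0] + t * (q[0] - p[0]),
                           p[1] + t * (q[1] - p[1])))
            target += step
        travelled += edge
    return points

def divide_curve(curve, step=0.5):
    """Represent the curve as (start, end) point pairs ~step meters long."""
    pts = resample(curve, step)
    return list(zip(pts, pts[1:]))

for segment in divide_curve([(0.0, 0.0), (1.0, 0.0), (2.0, 0.1)]):
    print(segment)  # four segments of roughly 0.5 m each
```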
Each of the segments may then be repositioned in order to better align that segment with the location coordinates of a corresponding portion of the detected object given the restrictions of any tolerance constraints. This may include laterally shifting the position of the segment relative to the corresponding portion of the detected object. In addition or alternatively, the segment may be rotated about a center point.
In some examples, a tolerance constraint may then be identified. For example, the vehicle's computing devices may use the tag associated with a map feature to identify a tolerance constraint. As noted above, this information may be included in the tag, associated with the tag, or stored in some other location.
As noted above, a tolerance constraint may be used to limit the amount by which a segment can be shifted or rotated.
Thus, each of the segments may be repositioned in order to better align that segment with the location coordinates of a corresponding portion of the detected object given the restrictions of any tolerance constraints.
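As a sketch of this repositioning step, the bounded grid search below stands in for the actual optimization procedure; the function name, the misfit measure, and the grid resolution are assumptions, with the tolerance constraint applied as hard bounds on the search space.

```python
import math

def reposition_segment(segment, detected, max_shift=1.0, max_rot_deg=30.0):
    """Search shifts and rotations, within the tolerance constraint, for
    the placement whose end points best align with the detected points."""
    (x1, y1), (x2, y2) = segment
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0  # rotate about the center point

    def place(dx, dy, theta):
        c, s = math.cos(theta), math.sin(theta)
        move = lambda x, y: (cx + c * (x - cx) - s * (y - cy) + dx,
                             cy + s * (x - cx) + c * (y - cy) + dy)
        return (move(x1, y1), move(x2, y2))

    def misfit(seg):  # summed distance from each end point to the detection
        return sum(min(math.dist(p, d) for d in detected) for p in seg)

    grid = [i / 4.0 for i in range(-4, 5)]  # -1.0 ... 1.0 in steps of 0.25
    candidates = (place(gx * max_shift, gy * max_shift,
                        gt * math.radians(max_rot_deg))
                  for gx in grid for gy in grid for gt in grid)
    return min(candidates, key=misfit)

seg = ((0.0, 0.0), (0.5, 0.0))
detected = [(0.0, 0.4), (0.5, 0.4)]        # the line appears ~0.4 m away
print(reposition_segment(seg, detected))   # end points move toward y = 0.4
```

A finer grid, or a continuous optimizer, would find tighter alignments; the point here is only that the tolerance constraint bounds the candidate moves.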
By shifting and rotating the segments, the end points of the segments will have new geographic location coordinates.
In the case where the probability of a change is very high, the vehicle's computing devices may also compute a value or probability that the corresponding feature of the pre-stored map information no longer exists in the current state of the world. For example, the probability that some or all of the curve of the corresponding feature has changed may be compared to one or more threshold values to determine whether the feature has merely shifted or if the feature no longer exists. These threshold values may be learned from training on actual data.
In another example, rather than relying on the probability of a change for a feature being very high, if a good new location for a set of segments from the detailed map information is not found, then sensor data corresponding to the original location of the segments may be checked to see whether the proper feature was detected there. If it was not, this may indicate that the feature has been completely removed.
The vehicle's computing devices may use this probability in various ways. For example, if the probability is high and the change appears to be dramatic, the vehicle's computing devices may use this information to make driving decisions for the vehicle. This may include slowing the vehicle down, maneuvering the vehicle in a more cautious mode, stopping the vehicle (e.g., to protect the safety of passengers), requesting that a passenger of the vehicle take control of the vehicle, etc. The vehicle's computing devices may also save the probability information, share the information with other autonomous vehicles, send the information to a system operator or centralized computing device for review and possible incorporation into the pre-stored map information, etc.
As noted above, the aspects described herein may accommodate various alternatives. For example, before segmenting a corresponding feature, each of the objects detected in the vehicle's environment may be compared to the pre-stored map information to determine whether that detected object corresponds to a feature used to define a driving lane. This may be achieved by comparing the location information of the sensor data for the detected object to the location information of the pre-stored map to identify a corresponding feature. Then, based on the tag associated with the corresponding feature, the vehicle's computing devices may determine whether the detected object corresponds to the location of a feature used to define driving lanes. If so, the corresponding feature may be segmented and processed as described above; if not, the corresponding feature need not be segmented or processed as described above.
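This pre-filtering might be rendered as in the short sketch below; the tag vocabulary, the match callable, and the function name are all hypothetical.

```python
LANE_DEFINING_TAGS = {"lane marker", "curb", "barrier", "guard rail"}

def lane_defining_detections(detections, match, feature_tags):
    """Yield (detection, feature_id) pairs whose corresponding map feature
    carries a lane-defining tag; other detections are skipped before the
    segmentation step."""
    for det in detections:
        feature_id = match(det)
        if feature_id and feature_tags.get(feature_id, set()) & LANE_DEFINING_TAGS:
            yield det, feature_id

feature_tags = {"lane_line_612": {"lane marker"}, "sign_17": {"stop sign"}}
match = {"det_a": "lane_line_612", "det_b": "sign_17", "det_c": None}.get
print(list(lane_defining_detections(["det_a", "det_b", "det_c"],
                                    match, feature_tags)))
# -> [('det_a', 'lane_line_612')]
```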
Alternatively, rather than segmenting and repositioning the curve of a corresponding feature, an edge of the detected object may be segmented. The segments of the edge may then be shifted or rotated to better align the segment to the curve of the corresponding feature. Again, the tolerance constraint identified based on the tag of the corresponding feature may be used to limit the shifting and/or rotation of the segment. The location coordinates of the repositioned segments for the edge may then be compared to the corresponding location coordinates of the edge of the detected object (before the segment was repositioned). Based on this comparison, various values may be determined as described above.
In some examples, the detected object may be a new object in that it may not have a corresponding feature in the pre-stored map information. In this case, other features of the pre-stored map information may be used as signals to indicate additional characteristics of detected objects not readily detectable from the location and orientation characteristics of the detected objects. For example, if a new driving lane was added, the boundaries for that new driving lane may have a similar angle and heading as the boundaries for any previous driving lane or lanes in the pre-stored map information that are also in the same general area. In that regard, if a detected object appears to follow the general shape of the boundaries of a curve corresponding to a lane line in the pre-stored map information but appears in another location, the vehicle's computing devices may determine that the detected object corresponds to a lane line which is likely to have a heading that corresponds to the heading of the lane lines in the pre-stored map information.
Similarly, a detected object that is identified as a new object may be used as a signal to indicate that another detected object is also a new object. For example, if a new crosswalk or a new bike lane is detected, for example using image matching or other identification techniques, the likelihood that other features in that immediate area have changed may be relatively high. Thus, the vehicle's computing devices may be more likely to determine that another detected object is a new object.
In some instances, when a detected object appears to have a corresponding feature that has shifted on top of another feature in the pre-stored map information, the vehicle's computing devices may assume that there has been no change or simply ignore the change. For example, in the case of a solid double lane line, one of the lane lines may be more faded than the other, making it more difficult for the vehicle's detection system to detect the faded lane line. This may cause the vehicle's computing devices to determine that one of the lane lines has shifted on top of another, when actually there has been no change. Thus, in this example, the vehicle's computing devices may assume that there has been no change or simply ignore the change.
Unless otherwise stated, the foregoing alternative examples are not mutually exclusive, but may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. In addition, the provision of the examples described herein, as well as clauses phrased as “such as,” “including” and the like, should not be interpreted as limiting the subject matter of the claims to the specific examples; rather, the examples are intended to illustrate only one of many possible embodiments. Further, the same reference numbers in different drawings can identify the same or similar elements.
The present application is a continuation of U.S. patent application Ser. No. 17/705,967, filed Mar. 28, 2022, which is a continuation of U.S. patent application Ser. No. 16/815,204, filed Mar. 11, 2020, now issued as U.S. Pat. No. 11,327,493, which is a continuation of U.S. patent application Ser. No. 15/799,304, filed Oct. 31, 2017, now issued as U.S. Pat. No. 10,627,816, which is a continuation of U.S. patent application Ser. No. 15/070,425, filed on Mar. 15, 2016, now issued as U.S. Pat. No. 9,836,052, which is a continuation of U.S. patent application Ser. No. 14/472,795, filed Aug. 29, 2014, now issued as U.S. Pat. No. 9,321,461, the disclosures of which are incorporated herein by reference.
Number | Date | Country
---|---|---
Parent 17705967 | Mar 2022 | US
Child 18492921 | | US
Parent 16815204 | Mar 2020 | US
Child 17705967 | | US
Parent 15799304 | Oct 2017 | US
Child 16815204 | | US
Parent 15070425 | Mar 2016 | US
Child 15799304 | | US
Parent 14472795 | Aug 2014 | US
Child 15070425 | | US