Route planning for an autonomous vehicle

Information

  • Patent Grant
  • Patent Number
    10,126,136
  • Date Filed
    Tuesday, June 14, 2016
  • Date Issued
    Tuesday, November 13, 2018
  • Examiners
    • Sweeney; Brian P
  • Agents
    • Fish & Richardson P.C.
Abstract
Among other things, a determination is made by a computer of an ability of an autonomous vehicle to safely or robustly travel a road feature or a road segment or a route that is being considered for the autonomous vehicle as of a time or range of times. The road feature or road segment or route conforms to properties of stored road network information. The road feature or road segment or route is eliminated from consideration if the computer has determined that the road feature or road segment or route cannot be safely or robustly traveled by the autonomous vehicle. The determination by the computer is based on properties of the environment in which the autonomous vehicle travels.
Description
BACKGROUND

This description relates to route planning for an autonomous vehicle.


An autonomous vehicle can drive safely without human intervention during part of a journey or an entire journey.


An autonomous vehicle includes sensors, actuators, computers, and communication devices to enable automated generation and following of routes through the environment. Some autonomous vehicles have wireless two-way communication capability to communicate with remotely-located command centers that may be manned by human monitors, to access data and information stored in a cloud service, and to communicate with emergency services.


As shown in FIG. 1, in a typical use of an autonomous vehicle 10, a desired goal position 12 (e.g., a destination address or street intersection) may be identified in a variety of ways. The goal position may be specified by a rider (who may be, for example, an owner of the vehicle or a passenger in a mobility-as-a-service “robo-taxi” application). The goal position may be provided by an algorithm (which, for example, may be running on a centralized server in the cloud and tasked with optimizing the locations of a fleet of autonomous vehicles with a goal of minimizing rider wait times when hailing a robo-taxi). In some cases, the goal position may be provided by a process (e.g., an emergency process that identifies the nearest hospital as the goal position due to a detected medical emergency on board the vehicle).


Given a desired goal position, a routing algorithm 20 determines a route 14 through the environment from the vehicle's current position 16 to the goal position 12. We sometimes call this process “route planning.” In some implementations, a route is a series of connected segments of roads, streets, and highways (which we sometimes refer to as road segments or simply segments).


Routing algorithms typically operate by analyzing road network information. Road network information typically is a digital representation of the structure, type, connectivity, and other relevant information about the road network. A road network is typically represented as a series of connected road segments. The road network information, in addition to identifying connectivity between road segments, may contain additional information about the physical and conceptual properties of each road segment, including but not limited to the geographic location, road name or number, road length and width, speed limit, direction of travel, lane edge boundary type, and any special information about a road segment such as whether it is a bus lane, whether it is a right-turn only or left-turn only lane, whether it is part of a highway, minor road, or dirt road, whether the road segment allows parking or standing, and other properties.


The routing algorithm typically identifies one or more candidate routes 22 from the current position to the goal position. Identification of the best, or optimal, route 14 from among the candidate routes is generally accomplished by employing algorithms (such as A*, D*, Dijkstra's algorithm, and others) that identify a route that minimizes a specified cost. This cost is typically a function of one or more criteria, often including the distance traveled along a candidate route and the expected time to travel along the candidate route considering speed limits, traffic conditions, and other factors. The routing algorithm may identify one or more good routes to be presented to the rider (or other person, for example, an operator at a remote location) for selection or approval. In some cases, a single optimal route may simply be provided to a vehicle trajectory planning and control module 28, which has the function of guiding the vehicle toward the goal (we sometimes refer to the goal position simply as the goal) along the optimal route.


As shown in FIG. 2, road network information typically is stored in a database 30 that is maintained on a centrally accessible server 32 and may be updated at high frequency (e.g., 1 Hz or more). The network information can be accessed either on-demand (e.g., requested by the vehicle 34), or pushed to the vehicle by a server.


Road network information can have temporal information associated with it, to enable descriptions of traffic rules, parking rules, or other effects that are time dependent (e.g., a road segment that does not allow parking during standard business hours, or on weekends, for example), or to include information about expected travel time along a road segment at specific times of day (e.g., during rush hour).


SUMMARY

In general, in an aspect, a determination is made by a computer of an ability of an autonomous vehicle to safely or robustly travel a road feature or a road segment or a route that is being considered for the autonomous vehicle as of a time or range of times. The road feature or road segment or route conforms to properties of stored road network information. The road feature or road segment or route is eliminated from consideration if the computer has determined that the road feature or road segment or route cannot be safely or robustly traveled by the autonomous vehicle. The determination by the computer is based on properties of the environment in which the autonomous vehicle travels.


Implementations may include one or a combination of two or more of the following features. The environment includes road features. The properties of the environment include navigability by the autonomous vehicle. The properties of the environment include spatial characteristics of road features. The properties of the environment include connectivity characteristics of road features. The properties of the environment include spatial orientations of road features. The properties of the environment include locations of road work or traffic accidents. The properties of the environment include the road surface roughness of road features. The properties of the environment include curvature or slope that affect visibility. The properties of the environment include characteristics of markings of road features. The properties of the environment include physical navigation challenges of road features associated with inclement weather. The computer determines an ability of the autonomous vehicle to safely or robustly travel each of a set of road features or road segments or routes.


The ability of the autonomous vehicle to safely or robustly travel a road feature or a road segment or a route depends on characteristics of sensors on the vehicle. The characteristics include an actual or estimated level of performance as a function of current or predicted future conditions. The computer determines the ability of the autonomous vehicle as of a given time. The route is one of two or more candidate routes determined by a route planning process. The ability of the autonomous vehicle to safely or robustly travel a road feature or a road segment or a route depends on characteristics of software processes. The software processes include processing of data from sensors on the vehicle. The software processes include motion planning. The software processes include decision-making. The software processes include vehicle motion control. The characteristics include an actual or estimated level of performance as a function of current or predicted future conditions.


These and other aspects, features, implementations, and advantages, and combinations of them, can be expressed as methods, systems, components, apparatus, program products, methods of doing business, means and steps for performing functions, and in other ways.


Other aspects, features, implementations, and advantages will become apparent from the following description and from the claims.





DESCRIPTION


FIGS. 1 through 3 are block diagrams.



FIGS. 4 through 9 are schematic diagrams of roadway scenarios.



FIG. 10 is a schematic view of a vehicle and a remotely located database.





For route planning involving human-piloted vehicles, it is generally assumed that a route identified by a routing algorithm from a current position to a goal position that is composed of connected road segments is a route that can be driven safely by the driver. However, this assumption may not be valid for routes identified by the routing algorithm for an autonomous vehicle for various reasons. Autonomous vehicles may not be able to safely navigate certain road segments, intersections, or other geographic regions (which we will broadly refer to as road features) due to the specific properties of the road features and the vehicle's capabilities with respect to those road features. Also, autonomous vehicles may not be able to safely navigate certain road features during certain times of the day, periods of the year, or under certain weather conditions.


An example of the physical locations of sensors and software processes in a vehicle and at a cloud-based server and database is shown in FIGS. 3 and 10.


Sensors and Software Processes

In many cases, this inability to safely navigate road features relates to characteristics of sensors and software processes that the autonomous vehicle uses to perceive the environment, process data from the sensors, understand conditions that are currently presented by and may at future times be presented by the perceived environment, perform motion planning, perform motion control, and make decisions based on those perceptions and understandings. Among other things, under certain conditions and at certain times, the ability of the sensors and processes to perceive the environment, understand the conditions, perform motion planning and motion control, and make the decisions may be degraded or lost or may be subject to unacceptable variation.


Examples of such degradation or unacceptable variation of sensor and software process outputs are as follows:


Sensors for Perceiving the Vehicle's Environment


As shown in FIG. 3, sensors 40 of the following types are commonly available on vehicles that have a driver assistance capability or a highly automated driving capability (e.g., an autonomous vehicle): Sensors able to measure properties of the vehicle's environment including but not limited to, e.g., LIDAR, RADAR, monocular or stereo video cameras in the visible light, infrared, or thermal spectra, ultrasonic sensors, time-of-flight (TOF) depth sensors, as well as temperature and rain sensors, and combinations of them. Data 42 from such sensors can be processed 44 to yield “data products” 46, e.g., information about the type, position, velocity, and estimated future motion of other vehicles, pedestrians, cyclists, scooters, carriages, carts, animals, and other moving objects. Data products also include the position, type, and content of relevant objects and features such as static obstacles (e.g., poles, signs, curbs, traffic marking cones and barrels, traffic signals, traffic signs, road dividers, and trees), road markings, and road signs.


The ability of the software processes 44 to use such sensor data to compute such data products at specified levels of performance depends on the properties of the sensors, such as the detection range, resolution, noise characteristics, temperature dependence, and other factors. The ability to compute such data products at a specified level of performance may also depend on the environmental conditions, such as the properties of the ambient lighting (e.g., whether there is direct sunlight, diffuse sunlight, sunrise or sunset conditions, dusk, or darkness), the presence of mist, fog, smog, or air pollution, whether or not it is raining or snowing or has recently rained or snowed, and other factors.


Generally, it is possible to characterize the capability of a particular sensor (and associated processing software) to yield a data product of interest at a specific level of performance (e.g., a specific level of accuracy of detection, range of detection, rate of true or false positives, or other metric) as a function of a measurable metric relating to the environmental conditions. For example, it is generally possible to characterize the range at which a particular monocular camera sensor can detect moving vehicles at a specified performance level, as a function of ambient illumination levels associated with daytime and nighttime conditions.


Further, it is generally possible to identify specific failure modes of the sensor, i.e., conditions or circumstances where the sensor will reliably degrade or fail to generate a data product of interest, and to identify data products that the sensor has not been designed to be able to generate.



FIG. 9 shows an example of an autonomous vehicle sensor configuration.


Software for Processing Data from Sensors


As noted above, data from sensors can be used by software processes 44 to yield a variety of data products of interest. The ability of each of the software processes to generate data products that conform to specified levels of performance depends on properties of the sensor software processes (e.g., algorithms), which may limit their performance in scenarios with certain properties, such as a very high or very low density of data features relevant to the sensing task at hand.


For example, an algorithm (we sometimes use the terms software process and algorithm interchangeably) for pedestrian detection that relies on data from a monocular vision sensor may degrade or fail in its ability to detect, at a specified level of performance (e.g., a specified processing rate), more than a certain number of pedestrians and may therefore degrade or fail (in the sense of not detecting all pedestrians in a scene at the specified level of performance) in scenarios with a large number of pedestrians. Also, an algorithm for determining the location of the ego vehicle (termed “localization”) based on comparison of LIDAR data collected from a vehicle-mounted sensor to data stored in a map database may fail in its ability to determine the vehicle's current position at a specified level of performance (e.g., at a specified degree of positional accuracy) in scenarios with little geometric relief, such as a flat parking lot.


Generally, it is possible to characterize the capability of a particular sensor software process to yield a data product of interest at a specific level of performance as a function of measurable scenario properties.


Often the data provided by more than one sensor is combined in a data fusion framework implemented by one or more software processes, with an aim of improving the overall performance of computing a data product or data products. For example, data from a video camera can be combined with data from a LIDAR sensor to enable detection of pedestrians, at a level of performance that is designed to exceed the level of performance achievable through the use of either a video camera or LIDAR sensor alone. In data fusion scenarios such as this, the above remains true: it is generally possible to characterize the capability of a particular data fusion framework to yield a data product of interest at a specific level of performance.


Software Processes for Motion Planning


Vehicles capable of highly automated driving (e.g., autonomous vehicles) rely on a motion planning process, i.e., an algorithmic process to automatically generate and execute a trajectory through the environment toward a designated short-term goal. We use the term trajectory broadly to include, for example, a path from one place to another. To distinguish the trajectory that is generated by the motion planning process from the route that is generated by a route planning process, we note that trajectories are paths through the vehicle's immediate surroundings (e.g., with distance scales typically on the order of several meters to several hundred meters) that are specifically designed to be free of collisions with obstacles and often have desirable characteristics related to path length, ride quality, required travel time, lack of violation of rules of the road, adherence to driving practices, or other factors.


Some motion planning processes employed on autonomous vehicles exhibit known limitations. For example, a certain motion planning process may be able to compute paths for the vehicle from its current position to a goal under the assumption that the vehicle moves only in the forward direction, but not in reverse. Or, a certain motion planning process may be able to compute paths for a vehicle only when the vehicle is traveling at a speed that is less than a specified speed limit.


It is generally possible to identify these and similar performance characteristics (e.g., limitations) on a motion planning process, based on knowledge of the process's algorithmic design or its observed performance in simulation or experimental testing. Depending on the limitations of a particular motion planning process, it may prove difficult or impossible to navigate safely in specific regions, e.g., highways that require travel at high speeds, or multi-level parking structures that require complex multi-point turns involving both forward and reverse maneuvering.


Software Processes for Decision Making


Vehicles capable of highly automated driving rely on a decision making process, i.e., an algorithmic process, to automatically decide an appropriate short-term course of action for a vehicle at a given time, e.g., whether to pass a stopped vehicle or wait behind it; whether to proceed through a four-way stop intersection or yield to a vehicle that had previously arrived at the intersection.


Some decision making processes employed on autonomous vehicles exhibit known limitations. For example, a certain decision making process may not be able to determine an appropriate course of action for a vehicle in certain scenarios of high complexity, e.g., in roundabouts that include traffic lights, or in multi-level parking structures.


As in the case of motion planning processes, it is generally possible to identify these and similar performance characteristics (e.g., limitations) on a decision making process, based on knowledge of the process's algorithmic design or its observed performance in simulation or experimental testing. Depending on the limitations of a particular decision making process, it may prove difficult or impossible to navigate safely in specific regions.


Software Processes for Vehicle Motion Control


Autonomous vehicles generally aim to follow the trajectory provided by a motion planning process with a high degree of precision by employing a motion control process. Motion control processes compute a set of control inputs (i.e., steering, brake, and throttle inputs) based on analysis of the current and predicted deviation from a desired trajectory and other factors.


Such motion control processes exhibit known limitations. For example, a certain motion control process may allow for stable operation only in the forward direction, but not in reverse. Or, a certain motion control process may possess the capability to track (to a specified precision) a desired trajectory only when the vehicle is traveling at a speed that is less than a specified speed limit. Or, a certain motion control process may possess the capability to execute steering or braking inputs requiring a certain level of lateral or longitudinal acceleration only when the road surface friction coefficient exceeds a certain specified level.


As in the case of motion planning and decision making processes, it is generally possible to identify these and similar limitations on a motion control process, based on knowledge of the process's algorithmic design or its observed performance in simulation or experimental testing. Depending on the limitations of a particular motion control process, it may prove difficult or impossible to navigate safely in specific regions.


Safe or robust operation of the autonomous vehicle can be determined based on specific levels of performance of sensors and software processes as functions of current and future conditions.


Characteristics of Road Features

The route planning process aims to exclude candidate routes that include road features that can be determined to be not safely navigable by an autonomous vehicle. For this purpose the route planning process can usefully consider sources of information that are specifically relevant to autonomous vehicles, including information about characteristics of road features such as spatial characteristics, orientation, surface characteristics, and others. Generally, such information would be used to avoid routing the autonomous vehicle through areas of the road network that would be difficult or impossible for the vehicle to navigate at a required level of performance or safety. Examples of sources of information, and an explanation of their effects on autonomous vehicle performance or safety, are described here.


Spatial Characteristics of Intersections, Roundabouts, Junctions, or Other Road Features


As illustrated by the example shown in FIG. 5, road network information may contain, or allow calculation by a separate process, information pertaining to the spatial characteristics of road intersections, roundabouts, junctions, or other roadway features including multi-lane surface roads and highways. Such information may include the width of a roadway, the distance across an intersection (i.e., the distance from a point on a travel lane at the edge of an intersection to a point on an opposing lane at an opposite edge of an intersection), and the distance across a roundabout (i.e., the diameter of a roundabout), for example.


Analysis of such spatial characteristics, in light of knowledge of the detection properties of the autonomous vehicle's sensor system, may allow a determination that certain road segments cannot be navigated by the autonomous vehicle at a specified level of safety or robustness without regard to or in light of a certain time or times of day or range of times (e.g., after sunset and before sunrise). This may allow the autonomous vehicle to avoid (for example) certain intersections that are, for example, “too large to see across after sunset,” given practical limitations on the autonomous vehicle's sensing capabilities and the allowable travel speed of the roadway. These limitations may make it impossible for the autonomous vehicle sensors to provide data products to the motion planning process with sufficient time to react to oncoming traffic.


Connectivity Characteristics of Intersections, Roundabouts, Junctions, or Other Road Features


As illustrated by the example shown in FIG. 4, road network information may contain, or allow calculation by a separate process, information pertaining to the connectivity characteristics of specific road segments or individual road segment lanes or other road features. Such information may include the orientation of intersecting road segments with respect to one another, for example. It may also include designations of specialized travel lanes such as right turn only and left turn only lane designations, or identifications of highway entrance ramps and exit ramps.


Analysis of such connectivity characteristics, in light of knowledge of the detection properties of the autonomous vehicle's sensor system, the capabilities of the motion planning process, and the capabilities of the decision-making process, may allow determination that certain road segments or junctions cannot be navigated by the autonomous vehicle at a specified level of safety or robustness, potentially at a certain time(s) of day or range of times. This may allow the autonomous vehicle to avoid, for example, intersections with geometric properties that make it impossible for the autonomous vehicle sensors to provide data products to the motion planning process with sufficient time to react to oncoming traffic. It may also allow the autonomous vehicle to avoid, for example, intersections that are too complex to safely navigate (e.g., due to complex required merging, or inability to reason about travel in specialized travel lanes), given known limitations on the vehicle's decision-making capability.


Spatial Orientations of Road Features


As illustrated by the examples shown in FIG. 6, road network information may contain, or allow calculation by a separate process, information pertaining to the spatial orientation (e.g., the orientation in an inertial coordinate frame) of specific road segments or individual road segment lanes or other road features.


Analysis of orientation of road features, in light of knowledge of the detection properties of the autonomous vehicle's sensor system, may allow determination that certain road segments or junctions cannot be navigated by the autonomous vehicle at a specified level of safety or robustness, potentially at a certain time(s) of day or range of times. This may allow the autonomous vehicle to avoid (for example) being “sun blinded” (i.e., experiencing severely degraded performance of video and/or LIDAR sensors due to exposure to direct sunlight at a low oblique incidence angle).


Locations of Roadworks and Traffic Accidents


Road network information may contain, or be augmented to include via real time mapping service providers or another input, information regarding the location of roadworks or accidents, potentially resulting in closure of certain road segments. Analysis of such information, in light of knowledge of the detection properties of the autonomous vehicle's sensor system, may allow determination that certain road segments or junctions cannot be navigated by the autonomous vehicle due to the vehicle's inability to detect ad hoc signage, barriers, or hand signals presented by human traffic guides associated with the roadworks or accident.


Locations of Rough Road Features


Road network information may contain, or be augmented to include via real time mapping service providers or similar inputs, information regarding the locations of regions of rough, degraded, potholed, damaged, washboarded, or partially constructed roads, including unprepared roads and secondary roads, and roads deliberately constructed with speed bumps or rumble strips. This information may be in the form of a binary designation (e.g., “ROUGH ROAD” or “SMOOTH ROAD”) or in the form of a continuous numerical or semantic metric that quantifies road surface roughness.


Analysis of road surface roughness, in light of knowledge of the detection properties of the autonomous vehicle's sensor system, may allow determination that certain road segments or junctions cannot be navigated by the autonomous vehicle at a specified level of safety or robustness, potentially at a certain time(s) of day or range of times. This may allow the autonomous vehicle to avoid (for example) severely washboarded roads that incite vibration in the physical sensor mounts, leading to poor sensor system performance, or road regions with speed bumps that might be accidentally classified as impassable obstacles by a perception process.


Locations of Road Features Having Poor Visibility Due to Curvature and Slope


As shown in FIGS. 7 and 8, road network information may contain, or allow calculation by a separate process of, information pertaining to the curvature and slope (along the vehicle pitch or roll axis) of the road feature.


Analysis of curvature and slope of road features, in light of knowledge of the detection properties of the autonomous vehicle's sensor system, may allow determination that certain road segments or junctions cannot be navigated by the autonomous vehicle at a specified level of safety or robustness, potentially at a certain time(s) of day or range of times. This may allow the autonomous vehicle to avoid road segments that are steeply pitched or sharply curved and that therefore make it impossible for the vehicle sensor system to “see over the hill” (i.e., detect the presence of traffic in the surrounding environment, due to the limited vertical field of view of the sensors) or to “see around the corner” (i.e., detect the presence of traffic in the surrounding environment, due to the limited horizontal field of view of the sensors).


Locations of Road Features having Illegible, Eroded, Incomprehensible, Poorly Maintained or Positioned Markings, Signage, or Signals


Road network information may contain, or be augmented to include via real time mapping service providers or another input, information regarding the locations of road regions with illegible, eroded, incomprehensible, or poorly maintained or positioned lane markings and other road markings, signage, or signals.


Analysis of such information, in light of knowledge of the detection properties of the autonomous vehicle's sensor system and (potentially) the capabilities of the motion planning or decision-making process, may allow determination that certain road segments or junctions cannot be navigated by the autonomous vehicle at a specified level of safety or robustness, potentially at a certain time(s) of day or range of times. This may allow the autonomous vehicle to avoid (for example) poorly marked road regions in which the vehicle would be unable to safely navigate within the lanes, and intersections with traffic signs or signals that are partially occluded (e.g., by foliage) or otherwise difficult to detect from the nominal travel lane(s). It may also allow the autonomous vehicle to avoid (for example) road regions with signals or signage that are region- or country-specific and cannot be reliably detected by the vehicle perception process(es).


Locations of Road Features having Poor Prior Driving Performance by the Autonomous Vehicle or Another Autonomous Vehicle


Road network information may contain, or be augmented to include (via real time mapping service providers or another input, or by the autonomous vehicle of interest or any other vehicle in a fleet of autonomous vehicles), information regarding the locations of road features where the autonomous vehicle of interest, or another autonomous vehicle, has experienced dangerous, degraded, unsafe, or otherwise undesirable driving performance, potentially due to high scenario traffic or pedestrian density, occlusion from static objects, traffic junction complexity, or other factors. Regions of poor vehicle performance can be “tagged” in a map database and marked for avoidance when the number of tagged incidents exceeds a specified threshold. This may allow the autonomous vehicle to avoid road features where the vehicle or other vehicles have experienced navigation difficulty.


Locations of Road Features having Poor Prior Simulation Performance by a Modeled Autonomous Vehicle


Road network information may contain, or be augmented to include, information regarding the locations of road regions where a model of the autonomous vehicle of interest has been observed in a simulated environment to experience dangerous, degraded, unsafe, or otherwise undesirable driving performance, potentially due to scenario traffic or pedestrian density, occlusion from static objects, traffic junction complexity, or other factors. Regions of poor vehicle performance can be “tagged” in a map database and marked for avoidance. This may allow the autonomous vehicle to avoid road regions where a model of the vehicle has experienced difficulty in safely navigating in a simulation environment, suggesting that the actual vehicle may face similar navigation challenges in the real world environment.


Locations of Road Features that may Present Physical Navigation Challenges in Inclement Weather


Road network information may contain, or allow calculation by a separate process, or be augmented to include via real time mapping service providers or another input, information pertaining to the locations of road features that may present navigation challenges in inclement weather or under specified environmental conditions.


Analysis of such information, in light of knowledge of the detection properties of the autonomous vehicle's sensor system, and knowledge of the performance characteristics of the vehicle's motion control process, may allow determination that certain road segments or junctions cannot be navigated by the autonomous vehicle at a specified level of safety or robustness, potentially at a certain time(s) of day or range of times. This may allow the autonomous vehicle to avoid (for example) road segments containing road inclination or curvature that are impossible to safely navigate when covered with ice, snow, or freezing rain.


Locations of Road Features that may Lead to known Vehicle Fault or Failure Conditions


Road network information may contain, or allow calculation by a separate process, or be augmented to include via real time mapping service providers or another input, information pertaining to the locations of road features that may lead to known vehicle fault or failure conditions in various sensors or software processes.


Analysis of such information, in light of knowledge of the detection properties of the autonomous vehicle's sensor system, and knowledge of the performance characteristics of the vehicle's motion control process, may allow determination that certain road segments or junctions cannot be navigated by the autonomous vehicle at a specified level of safety or robustness, potentially at a certain time(s) of day or range of times. This may allow the autonomous vehicle to avoid (for example) specific types of metal bridges or overpasses that may induce false readings from RADAR sensors, certain tunnels that may block GPS signals and therefore lead to poor vehicle localization performance, and certain extremely flat roadway regions that may not provide vertical features that are detectable by LIDAR sensors and may therefore lead to poor vehicle localization performance.




In addition to identifying specific road segments that are not able to be safely navigated by an autonomous vehicle, it is possible to do the opposite: to identify specific road segments that are able to be safely navigated by an autonomous vehicle, based on analysis of relevant information sources as described above. For example, based on analysis of known performance characteristics of vehicle sensors and software processes, and given information about road features, it is possible to determine if a given road segment can be safely and robustly navigated by the autonomous vehicle.


Such analysis would be useful for compiling a map data product or a feed of data to be used by other products or processes, describing “validated autonomous driving routes” of the autonomous vehicle. In some implementations, a data product or data feed could describe “unsafe autonomous driving routes”. This data could be used as one of the properties of road segments that are maintained as part of road network information. In some cases, the validation of road segments and routes (or determination of inability to travel safely or robustly) could be based on successful experimental travel (or simulated travel) by an autonomous vehicle at the level of road features such as streets or at a lane level within a given road feature. A routing algorithm could make use of such information by considering only validated autonomous driving routes when determining an optimal route between the ego vehicle's current position and a goal position. Such an optimal route might attempt to include only road segments that have been deemed “validated autonomous driving routes,” or it might attempt to include a combination of validated and unvalidated driving routes, with the combination determined by an optimization process that considers a variety of factors such as travel distance, expected travel time, and whether or not the road segments are validated or unvalidated, among other factors. In general, the routing algorithm could explore only candidate routes that are known to have a viability status that exceeds a viability threshold, for example, to allow for sufficiently robust or sufficiently safe travel or both.


In some instances, such information could be used for urban planning purposes, to enable users (i.e., human planners of road networks or automated road network planning software processes) to avoid designing road segments or intersections that are likely to pose navigation challenges to autonomous vehicles. In such a use case, the analysis methods described here would be employed in the context of road design software tools or processes.


Such a road design software tool or process would allow a user to specify or design a road segment, road network, intersection, highway, or other road feature using a variety of possible input devices and user interfaces. As the user employs the road design software tool to specify or design a road segment, road network, intersection, highway, or other road feature, a software process (i.e., a “viability status process”) would analyze the viability status of a road segment or region of multiple, potentially connected, road segments (e.g., a freeway, or intersection). The viability status process may also analyze the viability status of a route. The viability status is determined based on the analysis methods described above.


The output of the viability status process can be a viability status assessment, i.e., an assessment of the viability of the road segment, road network, intersection, highway, or other road feature, or route, expressed as a binary designation (e.g., “VIABLE” or “NOT VIABLE”) or as a continuous numerical or semantic metric that quantifies viability. The viability status assessment may include independent assessments of the safety or robustness of the road segment, road network, intersection, highway, or other road feature, or route, each expressed as a binary designation or as a continuous numerical or semantic metric that quantifies safety or robustness. The output of the viability status process may include a warning to the user based on the value of the viability status assessment.


Depending on the value of the viability status assessment, the road segment, road network, intersection, highway, or other road feature designed by the user may be automatically deleted. Depending on the value of the viability status assessment, the road segment, road network, intersection, highway, or other road feature may be automatically modified in such a manner as to improve the viability status assessment.


In this manner, a road design tool or process may be able to alert a user when the user has designed a hazardous road segment, intersection, or route and thereby deter construction of such a road segment, intersection, or route, and potentially also suggest improved designs of the road segment, intersection, or route.


We sometimes use the phrase “viability status” broadly to include, for example, any determination or indication for a route or road feature or segment of a route of the level of suitability for travel by an autonomous vehicle, for example, whether it is unsafe, the degree to which it is unsafe, whether it is safe, the degree to which it is safe, whether it can be traveled robustly or not, and the degree of robustness, whether it is valid or not, and other similar interpretations.


Other implementations are also within the scope of the following claims.

Claims
  • 1. A method for causing safe or robust travel of an autonomous vehicle, the method comprising receiving signals from a sensor of the autonomous vehicle, the signals representing a sensed aspect of an environment in which the autonomous vehicle is to travel, by computer determining an ability of the autonomous vehicle to safely or robustly travel a road feature or a road segment or a route that is being considered for the travel of the autonomous vehicle as of a time or range of times, the road feature or road segment or route conforming to properties of stored road network information, eliminating the road feature or road segment or route from consideration if the computer has determined that the road feature or road segment or route cannot be safely or robustly traveled by the autonomous vehicle, the determining of the ability of the autonomous vehicle to safely or robustly travel the road feature or road segment or route being based on properties of the environment in which the autonomous vehicle travels, the determining of the ability of the autonomous vehicle to safely or robustly travel the road feature or road segment or route being based on an actual or estimated level of performance of the sensor with respect to the properties of the environment, and sending signals to cause the autonomous vehicle to travel and to avoid the road feature or road segment or route while traveling.
  • 2. The method of claim 1 in which the environment comprises road features.
  • 3. The method of claim 1 in which the properties of the environment comprise navigability by the autonomous vehicle.
  • 4. The method of claim 1 in which the properties of the environment comprise spatial characteristics of road features.
  • 5. The method of claim 4 in which the spatial characteristics comprise aspects of intersections, roundabouts, or junctions.
  • 6. The method of claim 1 in which the properties of the environment comprise connectivity characteristics of road features.
  • 7. The method of claim 6 in which the connectivity characteristics comprise aspects of intersections, roundabouts, or junctions.
  • 8. The method of claim 1 in which the properties of the environment comprise spatial orientations of road features.
  • 9. The method of claim 1 in which the properties of the environment comprise locations of road work or traffic accidents.
  • 10. The method of claim 1 in which the properties of the environment comprise the road surface roughness of road features.
  • 11. The method of claim 1 in which the properties of the environment comprise curvature or slope that affect visibility.
  • 12. The method of claim 1 in which the properties of the environment comprise characteristics of markings of road features.
  • 13. The method of claim 1 in which the properties of the environment comprise physical navigation challenges of road features associated with inclement weather.
  • 14. The method of claim 1 in which the computer determines an ability of the autonomous vehicle to safely or robustly travel each of a set of road features or road segments or routes.
  • 15. The method of claim 1 in which the computer determines the ability of the autonomous vehicle as of a given time.
  • 16. The method of claim 1 in which the route is one of two or more candidate routes determined by a route planning process.
  • 17. The method of claim 1 in which the ability of the autonomous vehicle to safely or robustly travel a road feature or a road segment or a route depends on characteristics of software processes.
  • 18. The method of claim 17 in which the software processes comprise processing of data from sensors on the vehicle.
  • 19. The method of claim 17 in which the software processes comprise motion planning.
  • 20. The method of claim 17 in which the software processes comprise decision-making.
  • 21. The method of claim 17 in which the software processes comprise vehicle motion control.
  • 22. The method of claim 17 in which the characteristics comprise an actual or estimated level of performance as a function of current or predicted future conditions.
  • 23. The method of claim 1 in which the detection property of the sensor comprises the reaction time of the sensor to a change in the aspect of the environment.
  • 24. The method of claim 23 in which the detection property of the sensor comprises the reaction time of the sensor relative to a characteristic of the travel of the autonomous vehicle.
  • 25. The method of claim 24 in which the characteristic of the travel of the autonomous vehicle comprises a speed of the autonomous vehicle.
  • 26. The method of claim 23 in which the change in the aspect of the environment comprises a motion of another vehicle or object.
  • 27. The method of claim 1 in which the properties of the environment comprise road roughness.
  • 28. The method of claim 1 in which the properties of the environment comprise aspects contributing to poor visibility.
  • 29. The method of claim 1 in which the properties of the environment comprise flawed markings, signage, or signals.
  • 30. A method for causing safe or robust travel of an autonomous vehicle, the method comprising receiving a signal from a motion-planning, decision-making, or motion-control process representing a trajectory for travel of the autonomous vehicle, a decision about a short-term course of action for travel of the autonomous vehicle, or control inputs for driving functions of the autonomous vehicle, by computer, determining an ability of the autonomous vehicle to safely or robustly travel a road feature or a road segment or a route that is being considered for the travel of the autonomous vehicle as of a time or range of times, the road feature or road segment or route conforming to properties of stored road network information, eliminating the road feature or road segment or route from consideration if the computer has determined that the road feature or road segment or route cannot be safely or robustly traveled by the autonomous vehicle, the determining of the ability of the autonomous vehicle to safely or robustly travel the road feature or road segment or route being based on a capability of the motion-planning, decision-making, or motion-control process with respect to the trajectory, decision, or control inputs, and sending signals to cause the autonomous vehicle, based on the trajectory, decision, or control inputs, to travel and to avoid the road feature or road segment or route while traveling.
  • 31. The method of claim 30 in which the capability of the motion-planning, decision-making, or motion-control process with respect to the trajectory, decision, or control inputs comprises a performance characteristic.
  • 32. The method of claim 30 in which the capability comprises a limitation.
  • 33. The method of claim 30 in which the capability is based on knowledge of the process design or observed performance in simulation or experimental testing of the process.
  • 34. The method of claim 30 in which the road feature, road segment, or route comprises connectivity characteristics.
  • 35. The method of claim 34 in which the connectivity characteristics comprise aspects of intersections, roundabouts, or junctions.
20150379468 Harvey Dec 2015 A1
20160013934 Smereka et al. Jan 2016 A1
20160016127 Mentzel et al. Jan 2016 A1
20160016525 Chauncey et al. Jan 2016 A1
20160025505 Oh et al. Jan 2016 A1
20160033964 Sato et al. Feb 2016 A1
20160041820 Ricci et al. Feb 2016 A1
20160047657 Caylor et al. Feb 2016 A1
20160047666 Fuchs Feb 2016 A1
20160075333 Sujan et al. Mar 2016 A1
20160078758 Basalamah Mar 2016 A1
20160107655 Desnoyer et al. Apr 2016 A1
20160109245 Denaro Apr 2016 A1
20160117923 Dannenbring Apr 2016 A1
20160121482 Bostick et al. May 2016 A1
20160129907 Kim et al. May 2016 A1
20160137199 Kuhne et al. May 2016 A1
20160137206 Chandraker et al. May 2016 A1
20160138924 An May 2016 A1
20160139594 Okumura et al. May 2016 A1
20160139598 Ichikawa et al. May 2016 A1
20160139600 Delp May 2016 A1
20160147921 VanHolme May 2016 A1
20160148063 Hong et al. May 2016 A1
20160161266 Crawford et al. Jun 2016 A1
20160161270 Okumura Jun 2016 A1
20160161271 Okumura Jun 2016 A1
20160167652 Slusar Jun 2016 A1
20160176398 Prokhorov et al. Jun 2016 A1
20160180707 MacNeille et al. Jun 2016 A1
20160209843 Meuleau et al. Jul 2016 A1
20160231122 Beaurepaire Aug 2016 A1
20160231746 Hazelton et al. Aug 2016 A1
20160239293 Hoffman et al. Aug 2016 A1
20160260328 Mishra et al. Sep 2016 A1
20160266581 Dolgov et al. Sep 2016 A1
20160280264 Baek Sep 2016 A1
20160282874 Kurata et al. Sep 2016 A1
20160288788 Nagasaka Oct 2016 A1
20160291155 Nehmadi et al. Oct 2016 A1
20160318437 Vilakathara Nov 2016 A1
20160318531 Johnson et al. Nov 2016 A1
20160321551 Priness et al. Nov 2016 A1
20160332574 Park et al. Nov 2016 A1
20160334229 Ross et al. Nov 2016 A1
20160334230 Ross et al. Nov 2016 A1
20160355192 James et al. Dec 2016 A1
20160370194 Colijn et al. Dec 2016 A1
20160370801 Fairfield et al. Dec 2016 A1
20160379486 Taylor Dec 2016 A1
20170010613 Fukumoto Jan 2017 A1
20170016730 Gawrilow Jan 2017 A1
20170059335 Levine et al. Mar 2017 A1
20170059339 Sugawara et al. Mar 2017 A1
20170082453 Fischer et al. Mar 2017 A1
20170090480 Ho et al. Mar 2017 A1
20170106871 You et al. Apr 2017 A1
20170110022 Gulash Apr 2017 A1
20170113683 Mudalige et al. Apr 2017 A1
20170122766 Nemec et al. May 2017 A1
20170123430 Nath et al. May 2017 A1
20170139701 Lin et al. May 2017 A1
20170153639 Stein Jun 2017 A1
20170154225 Stein Jun 2017 A1
20170219371 Suzuki et al. Aug 2017 A1
20170242436 Creusot Aug 2017 A1
20170245151 Hoffman et al. Aug 2017 A1
20170248949 Moran et al. Aug 2017 A1
20170249848 Niino et al. Aug 2017 A1
20170276502 Fischer et al. Sep 2017 A1
20170277193 Frazzoli et al. Sep 2017 A1
20170277194 Frazzoli et al. Sep 2017 A1
20170277195 Frazzoli et al. Sep 2017 A1
20170286784 Bhatia et al. Oct 2017 A1
20170291608 Engel et al. Oct 2017 A1
20170292843 Wei et al. Oct 2017 A1
20170305335 Wei et al. Oct 2017 A1
20170309179 Kodama Oct 2017 A1
20170327128 Denaro Nov 2017 A1
20170336788 Iagnemma Nov 2017 A1
20170337819 Wei et al. Nov 2017 A1
20170341652 Sugawara et al. Nov 2017 A1
20170345321 Cross et al. Nov 2017 A1
20170349181 Wei et al. Dec 2017 A1
20170351263 Lambermont et al. Dec 2017 A1
20170356746 Iagnemma Dec 2017 A1
20170356748 Iagnemma Dec 2017 A1
20170356750 Iagnemma Dec 2017 A1
20170356751 Iagnemma Dec 2017 A1
20170369051 Sakai et al. Dec 2017 A1
20180004206 Iagnemma Jan 2018 A1
20180004210 Iagnemma et al. Jan 2018 A1
20180039269 Lambermont et al. Feb 2018 A1
20180050664 Tarte Feb 2018 A1
20180053276 Iagnemma et al. Feb 2018 A1
20180053412 Iagnemma et al. Feb 2018 A1
20180086280 Nguyen Mar 2018 A1
20180113455 Iagnemma et al. Apr 2018 A1
20180113456 Iagnemma et al. Apr 2018 A1
20180113457 Iagnemma et al. Apr 2018 A1
20180113459 Bennie et al. Apr 2018 A1
20180113463 Iagnemma et al. Apr 2018 A1
20180113470 Iagnemma et al. Apr 2018 A1
20180114442 Minemura et al. Apr 2018 A1
20180120845 Lambermont et al. May 2018 A1
20180120859 Eagelberg et al. May 2018 A1
Foreign Referenced Citations (30)
Number Date Country
105652300 Jun 2016 CN
102013010983 Jan 2015 DE
0436213 Jul 1991 EP
2381361 Oct 2011 EP
2639781 Sep 2013 EP
2955077 Dec 2015 EP
2982562 Feb 2016 EP
2005-189983 Jul 2005 JP
2009-102003 May 2009 JP
2009-251759 Oct 2009 JP
2010-086269 Apr 2010 JP
2011-253379 Dec 2011 JP
2013-242737 Dec 2013 JP
2015-044432 Mar 2015 JP
2016095627 May 2016 JP
2018-012478 Jan 2018 JP
10-2013-0085235 Jul 2013 KR
10-2014-0069749 Jun 2014 KR
10-2014-0130968 Nov 2014 KR
10-1480652 Jan 2015 KR
10-1590787 Feb 2016 KR
20160049017 May 2016 KR
WO2007053350 May 2007 WO
WO2014139821 Sep 2014 WO
WO2015008032 Jan 2015 WO
WO2015151055 Oct 2015 WO
WO2016018636 Feb 2016 WO
WO2017205278 Nov 2017 WO
WO2017218563 Dec 2017 WO
WO2018005819 Jan 2018 WO
Non-Patent Literature Citations (177)
Entry
Kessels et al., “Electronic Horizon: Energy Management using Telematics Information”, Vehicle Power and Propulsion Conference, 2007. VPPC 2007. IEEE, 6 pages.
Hammerschmidt, "Bosch to Focus on Cloud for Connected Car Services", EE Times Europe, Dec. 3, 2015, 4 pages.
“Gain Scheduling”, Wikipedia, 1 page. https://en.wikipedia.org/wiki/Gain_scheduling.
http://www.bosch-presse.de/pressportal/en/connected-horizon--seeing-beyong-the-bends-ahead-35691.html.
International Search Report and Written Opinion from PCT application PCT/US2017/037294 dated Oct. 17, 2017 (22 pages).
Dolgov, Dmitri et al., “Path Planning for Autonomous Vehicles in Unknown Semi-structured Environments”, International Journal of Robotics Research, vol. 29 Issue 5, pp. 485-501, Apr. 2010 (18 pages).
International Search Report and Written Opinion from PCT application PCT/US2017/040040 dated Sep. 15, 2017 (22 pages).
Transaction history and application as filed of U.S. Appl. No. 15/182,281, filed Jun. 14, 2016.
Transaction history and application as filed of U.S. Appl. No. 15/200,050, filed Jul. 1, 2016.
Transaction history and application as filed of U.S. Appl. No. 15/182,313, filed Jun. 14, 2016.
Transaction history and application as filed of U.S. Appl. No. 15/182,400, filed Jun. 14, 2016.
Transaction history and application as filed of U.S. Appl. No. 15/182,365, filed Jun. 14, 2016.
Transaction history and application as filed of U.S. Appl. No. 15/200,035, filed Jul. 1, 2016.
U.S. Appl. No. 15/182,281, filed Jun. 14, 2016—Pending.
U.S. Appl. No. 15/200,050, filed Jul. 1, 2016—Pending.
U.S. Appl. No. 15/182,313, filed Jun. 14, 2016—Pending.
U.S. Appl. No. 15/182,400, filed Jun. 14, 2016—Pending.
U.S. Appl. No. 15/182,365, filed Jun. 14, 2016—Pending.
U.S. Appl. No. 15/200,035, filed Jul. 1, 2016—Pending.
U.S. Appl. No. 15/477,833, filed Apr. 3, 2017, Ravichandran et al.
U.S. Appl. No. 15/624,780, filed Jun. 16, 2017, Liu et al.
U.S. Appl. No. 15/624,802, filed Jun. 16, 2017, Liu et al.
U.S. Appl. No. 15/624,819, filed Jun. 16, 2017, Liu et al.
U.S. Appl. No. 15/624,838, filed Jun. 16, 2017, Liu et al.
U.S. Appl. No. 15/624,839, filed Jun. 16, 2017, Liu et al.
U.S. Appl. No. 15/624,857, filed Jun. 16, 2017, Liu et al.
Aguiar et al., “Path-following for non-minimum phase systems removes performance limitations,” IEEE Transactions on Automatic Control, 2005, 50(2):234-239.
Aguiar et al., “Trajectory-tracking and path-following of under-actuated autonomous vehicles with parametric modeling uncertainty,” Transactions on Automatic Control, 2007, 52(8):1362-1379.
Amidi and Thorpe, “Integrated mobile robot control,” International Society for Optics and Photonics, Boston, MA, 1991, 504-523.
Aoude et al., “Mobile agent trajectory prediction using Bayesian nonparametric reachability trees,” American Institute of Aeronautics and Astronautics, 2011, 1587-1593.
Autoliv.com [online], "Vision Systems—another set of "eyes"," available on or before Sep. 8, 2012, retrieved Oct. 20, 2016, <https://www.autoliv.com/ProductsAndInnovations/ActiveSafetySystems/Pages/VisionSystems.aspx>, 2 pages.
Autonomoustuff.com [online], “ibeo Standard Four Layer Multi-Echo LUX Sensor: Bringing together the World's Best Technologies,” available on or before Jul. 2016, retrieved on Feb. 7, 2017, <http://www.autonomoustuff.com/product/ibeo-lux-standard/>, 2 pages.
Bahlmann et al., “A system for traffic sign detection, tracking, and recognition using color, shape, and motion information.” IEEE Intelligent Vehicles Symposium, 2005, 255-260.
Balabhadruni, “Intelligent traffic with connected vehicles: intelligent and connected traffic systems,” IEEE International Conference on Electrical, Electronics, Signals, Communication, and Optimization, 2015, 2 pages (Abstract Only).
Bertozzi et al., "Stereo inverse perspective mapping: theory and applications," Image and Vision Computing, 1999, 16:585-590.
Betts, “A survey of numerical methods for trajectory optimization,” AIAA Journal of Guidance, Control, and Dynamics, Mar.-Apr. 1998, 21(2):193-207.
Castro et al., “Incremental Sampling-based Algorithm for Minimum-violation Motion Planning”, Decision and Control, IEEE 52nd Annual Conference, Dec. 2013, 3217-3224.
Chaudhari et al., "Incremental Minimum-Violation Control Synthesis for Robots Interacting with External Agents," American Control Conference, Jun. 2014, <http://vision.ucla.edu/˜pratikac/pub/chaudhari.wongpiromsarn.ea.acc14.pdf>, 1761-1768.
Chen et al., “Likelihood-Field-Model-Based Dynamic Vehicle Detection and Tracking for Self-Driving,” IEEE Transactions on Intelligent Transportation Systems, Nov. 2016, 17(11):3142-3158.
d'Andrea-Novel et al., “Control of Nonholonomic Wheeled Mobile Robots by State Feedback Linearization,” The International Journal of Robotics Research, Dec. 1995, 14(6):543-559.
de la Escalera et al., “Road traffic sign detection and classification,” IEEE Transactions on Industrial Electronics, Dec. 1997, 44(6):848-859.
Delphi.com [online], “Delphi Electronically Scanning Radar: Safety Electronics,” retrieved on Feb. 7, 2017, <http://delphi.com/manufacturers/auto/safety/active/electronically-scanning-radar>, 4 pages.
Demiris, “Prediction of intent in robotics and multi-agent systems.” Cognitive Processing, 2007, 8(3):151-158.
Dominguez et al., "An optimization technique for positioning multiple maps for self-driving car's autonomous navigation," IEEE International Conference on Intelligent Transportation Systems, 2015, 2694-2699.
Fairfield and Urmson, “Traffic light mapping and detection,” In Proceedings of the International Conference on Robotics and Automation (ICRA), 2011, 6 pages.
Falcone et al., “A linear time varying model predictive control approach to the integrated vehicle dynamics control problem in autonomous systems,” IEEE Conference on Decision and Control, 2007, 2980-2985.
Falcone et al., "A Model Predictive Control Approach for Combined Braking and Steering in Autonomous Vehicles", Ford Research Laboratories, Mediterranean Conference on Control & Automation, 2007, <http://www.me.berkeley.edu/˜frborrel/pdfpub/pub-20.pdf>, 6 pages.
Fong et al., "Advanced Interfaces for Vehicle Teleoperation: Collaborative Control, Sensor Fusion Displays, and Remote Driving Tools", Autonomous Robots 11, 2001, 77-85.
Franke et al., “Autonomous driving goes downtown,” IEEE Intelligent Systems and their Applications, 1998, 6:40-48.
Fraser, “Differential Synchronization,” ACM: DocEng '09, Sep. 2009, <https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/35605.pdf>, 13-20.
Garcia et al., “Model predictive control: theory and practice—a survey,” Automatica, 1989, 25(3):335-348.
Gavrila and Philomin, “Real-time object detection for “smart” vehicles,” In Proceedings of the Seventh IEEE International Conference on Computer Vision, 1999, 1:87-93.
Golovinsky et al., “Shape-based Recognition of 3D Point Clouds in Urban Environments,” Proceedings of the 12th International Conference on Computer Vision, 2009, 2154-2161.
He et al., “Color-Based Road Detection in Urban Traffic Scenes,” IEEE Transactions on Intelligent Transportation Systems, Dec. 2004, 5(4):309-318.
Himmelsbach et al., "Fast Segmentation of 3D Point Clouds for Ground Vehicles," IEEE Intelligent Vehicles Symposium, Jul. 21-24, 2010, 6 pages.
IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems, "Ethically Aligned Design: A Vision for Prioritizing Human Wellbeing with Artificial Intelligence and Autonomous Systems," IEEE Advancing Technology for Humanity, Dec. 13, 2016, 138 pages.
ISO.org, “ISO 14229-1:2006; Road Vehicles—Unified diagnostic services (UDS)—Part 1: Specification and requirements,” International Standard Organization, 2006, retrieved on Apr. 4, 2018, <https://www.iso.org/standard/45293.html>, 2 pages (abstract).
ISO.org, “ISO 15765-3:2004; Road Vehicles—Diagnostics on Controller Area Networks (CAN)—Part 3: Implementation of unified diagnostic services (UDS on CAN),” International Standard Organization, Oct. 2004, retrieved on Apr. 4, 2018, <https://www.iso.org/obp/ui/#iso:std:iso:14229:-1:ed-1:v2:en>, 2 pages (abstract).
Jiang and Nijmeijer, “Tracking control of mobile robots: a case study in backstepping,” Automatica, 1997, 33(7):1393-1399.
Kala et al., "Motion Planning of Autonomous Vehicles on a Dual Carriageway without Speed Lanes", Electronics, Jan. 13, 2015, 4(1):59-81.
Kanayama, “A Stable Tracking Control Method for an Autonomous Mobile Robot,” International Conference on Robotics and Automation, 1990, 384-389.
Karaman and Frazzoli, “Sampling-based algorithms for optimal motion planning.” Int. Journal of Robotics Research, Jun. 2011, <http://ares.lids.mit.edu/papers/Karaman.Frazzoli.IJRR11.pdf>, 30(7):846-894.
Karaman et al., "Sampling-based Algorithms for Optimal Motion Planning with Deterministic μ-Calculus Specifications", 2012 American Control Conference, Jun. 27-Jun. 29, 2012, 8 pages.
Kavraki et al., “Probabilistic roadmaps for path planning in high-dimensional configuration spaces.” IEEE Transactions on Robotics and Automation, 1996, 12(4):566-580.
Kim, “Robust lane detection and tracking in challenging scenarios.” IEEE Transactions on Intelligent Transportation Systems, 2008, 9(1):16-26.
Larson et al., “Securing Vehicles against Cyber Attacks,” ACM, 2008, retrieved on [date], <http://dl.acm.org/citation.cfm?id=1413174>, 3 pages.
Lindner et al., “Robust recognition of traffic signals,” IEEE Intelligent Vehicles Symposium, 2004, 5 pages.
Liu et al., "Nonlinear Stochastic Predictive Control with Unscented Transformation for Semi-Autonomous Vehicles," American Control Conference, Jun. 4-6, 2014, 5574-5579.
Liu et al., "Robust semi-autonomous vehicle control for roadway departure and obstacle avoidance," ICCAS, Oct. 20-23, 2013, 794-799.
Lobdell, "Robust Over-the-air Firmware Updates Using Program Flash Memory Swap on Kinetis Microcontrollers," Freescale Semiconductor Inc., 2012, retrieved on Apr. 11, 2018, <http://cachefreescale.com/flies/microcontrollers/doc/app_note/AN4533.pdf>, 20 pages.
Luzcando (searcher), “EIC 3600 Search Report,” STIC—Scientific & Technical Information Center, Feb. 14, 2018, 20 pages.
Maldonado-Bascón et al., “Road-sign detection and recognition based on support vector machines,” IEEE Transactions on Intelligent Transportation Systems, 2007, 8(2):264-278.
Mayne et al., “Constrained model predictive control: Stability and optimality,” Automatica, 2000, 36(6):789-814.
Mobileye [online], “Advanced Driver Assistance Systems (ADAS) systems range on the spectrum of passive/active,” Copyright 2017, retrieved on Oct. 20, 2016, <http://www.mobileye.com/our-technology/adas/>, 2 pages.
Mogelmose et al., “Vision-based traffic sign detection and analysis for intelligent driver assistance systems: Perspectives and survey,” IEEE Transactions on Intelligent Transportation Systems, 2012, 13(4):1484-1497.
Morris et al., “Learning, modeling, and classification of vehicle track patterns from live video.” IEEE Transactions on Intelligent Transportation Systems, 2008, 9(3):425-437.
Nilsson et al., “Conducting Forensic Investigations of Cyber Attacks on Automobiles In-Vehicle Networks,” ICST, 2008, retrieved on Mar. 20, 2016, <http://dl.acm.org/citation.cfm?id=1363228>, 6 pages.
Nilsson et al., “A Framework for Self-Verification of Firmware Updates over the Air in Vehicle ECUs,” IEEE: GLOBECOM Workshops, Nov. 2008, 5 pages.
Ollero and Amidi, "Predictive path tracking of mobile robots. Application to the CMU Navlab," in 5th International Conference on Advanced Robotics, 1991, 91:1081-1086.
Paik et al., “Profiling-based Log Block Replacement Scheme in FTL for Update-intensive Executions,” IEEE: Embedded and Ubiquitous Computing (EUC), Oct. 2011, 182-188.
Premebida et al., “A lidar and vision-based approach for pedestrian and vehicle detection and tracking.” In Proceedings of the IEEE Intelligent Transportation Systems Conference, 2007, 1044-1049.
Premebida et al., “LIDAR and vision-based pedestrian detection system.” Journal of Field Robotics, 2009, 26(9):696-711.
Ponomarev, "Augmented reality's future isn't glasses. It's the car," Venturebeat.com, available on or before Aug. 2017, retrieved on Mar. 30, 2018, <https://venturebeat.com/2017/08/23/ar-will-drive-the-evolution-of-automated-cars/>, 4 pages.
Rankin et al., “Autonomous path planning navigation system used for site characterization,” SPIE—International Society for Optics and Photonics, 1996, 176-186.
Shalev-Shwartz et al., "Avoiding a "Winter of Autonomous Driving": On a Formal Model of Safe, Scalable, Self-driving Cars," arXiv preprint, Aug. 17, 2017, 25 pages.
Shen et al., “A Robust Video based Traffic Light Detection Algorithm for Intelligent Vehicles,” Proceedings of the IEEE Intelligent Vehicles Symposium, 2009, 521-526.
Shin, “Hot/Cold Clustering for Page Mapping in NAND Flash Memory,” IEEE: Transactions on Consumer Electronics, Nov. 2011, 57(4):1728-1731.
Spieser et al., "Toward a systematic approach to the design and evaluation of automated mobility-on-demand systems: A case study in Singapore," Road Vehicle Automation, 2014, 229-245.
Standards.sae.org, “Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles,” SAE International, Sep. 2016, retrieved on Apr. 18, 2017, <http://standards.sae.org/j3016_201609/>, 3 pages.
Stahn et al., "Laser Scanner-Based Navigation for Commercial Vehicles," IEEE Intelligent Vehicles Symposium, Jun. 13-15, 2007, 969-974.
Steger et al., "Applicability of IEEE 802.11s for Automotive Wireless Software Updates," IEEE: Telecommunications (ConTEL), Jul. 2015, 8 pages.
Stokar, "Perform over-the-air updates for car ECUs," eMedia Asia Ltd., 2013, retrieved on Apr. 11, 2018, <http://www.eetasia.com/STATIC/PDF/201312/EEOL_2013DEC05_NET_EMS_TA_01.pdf?SOURCES=DOWNLOAD>, 3 pages.
Tabuada and Pappas, “Linear time logic control of discrete-time linear systems,” IEEE Transactions on Automatic Control, 2006, 51(12):1862-1877.
Wallace et al., “First results in robot road-following,” in IJCAI, 1985, 1089-1095.
Wang et al., “Lane detection and tracking using B-Snake,” Image and Vision Computing, 2004, 22(4):269-280.
Wang et al., “Simultaneous localization, mapping and moving object tracking,” The International Journal of Robotics Research, 2007, 26(9):889-916.
Weiskircher et al., “Predictive Guidance and Control Framework for (Semi-) Autonomous Vehicles in Public Traffic,” IEEE Transactions on Control Systems Technology, 2017, 25(6):2034-2046.
Weiss et al., “Autonomous v. Tele-operated: How People Perceive Human-Robot Collaboration with HRP-2,” Proceedings of the 4th ACM/IEEE international conference on Human robot interaction, 2009, 3 pages.
Wit et al., “Autonomous ground vehicle path tracking,” Journal of Robotic Systems, 2004, 21(8):439-449.
Wu et al., “Data Sorting in Flash Memory,” ACM, 2015, <http://dl.acm.org/citation.cfm?id=2747982.2665067>, 25 pages.
Yilmaz et al., “Object tracking: A survey,” Computing Surveys, 2006, 31 pages.
Zax, "A Software Update for Your Car? Ford reboots its infotainment system, following consumer complaints," MIT Technology Review, 2012, retrieved on Apr. 11, 2018, <http://www.technologyreview.com/view/427153/a-software-update-for-yourcar?/>, 6 pages.
Zheng et al., "Lane-level positioning system based on RFID and vision," IET International Conference on Intelligent and Connected Vehicles, 2016, 5 pages.
U.S. Appl. No. 15/872,554, Censi et al., filed Jan. 16, 2018.
U.S. Appl. No. 15/478,991, Frazzoli et al., filed Apr. 4, 2017.
U.S. Appl. No. 15/605,335, Frazzoli et al., filed May 25, 2017.
U.S. Appl. No. 15/605,388, Frazzoli et al., filed May 25, 2017.
U.S. Appl. No. 15/161,996, Iagnemma, filed May 23, 2016.
U.S. Appl. No. 15/182,281, Iagnemma, filed Jun. 14, 2016.
U.S. Appl. No. 15/200,050, Iagnemma et al., filed Jul. 1, 2016.
U.S. Appl. No. 15/182,313, Iagnemma, filed Jun. 14, 2016.
U.S. Appl. No. 15/182,400, Iagnemma, filed Jun. 14, 2016.
U.S. Appl. No. 15/182,365, Iagnemma, filed Jun. 14, 2016.
U.S. Appl. No. 15/200,035, Iagnemma et al., filed Jul. 1, 2016.
U.S. Appl. No. 15/240,072, Iagnemma et al., filed Aug. 18, 2016.
U.S. Appl. No. 15/240,150, Iagnemma et al., filed Aug. 18, 2016.
U.S. Appl. No. 15/688,345, Frazzoli et al., filed Aug. 28, 2017.
U.S. Appl. No. 15/298,935, Iagnemma et al., filed Oct. 20, 2016.
U.S. Appl. No. 15/298,984, Iagnemma et al., filed Oct. 20, 2016.
U.S. Appl. No. 15/298,936, Iagnemma et al., filed Oct. 20, 2016.
U.S. Appl. No. 15/298,970, Iagnemma et al., filed Oct. 20, 2016.
U.S. Appl. No. 15/299,028, Iagnemma et al., filed Oct. 20, 2016.
U.S. Appl. No. 15/401,473, Iagnemma et al., filed Jan. 9, 2017.
U.S. Appl. No. 15/401,499, Iagnemma et al., filed Jan. 9, 2017.
U.S. Appl. No. 15/401,519, Iagnemma et al., filed Jan. 9, 2017.
U.S. Appl. No. 15/451,703, Frazzoli et al., filed Mar. 7, 2017.
U.S. Appl. No. 15/490,694, Qin et al., filed Apr. 18, 2017.
U.S. Appl. No. 15/624,780, Liu et al., filed Jun. 16, 2017.
U.S. Appl. No. 15/477,833, Ravichandran et al., filed Apr. 3, 2017.
U.S. Appl. No. 15/451,734, Frazzoli et al., filed Mar. 7, 2017.
U.S. Appl. No. 15/451,747, Frazzoli et al., filed Mar. 7, 2017.
U.S. Appl. No. 15/477,872, Ravichandran et al., filed Apr. 3, 2017.
U.S. Appl. No. 15/477,882, Ravichandran et al., filed Apr. 3, 2017.
U.S. Appl. No. 15/477,936, Ravichandran et al., filed Apr. 3, 2017.
U.S. Appl. No. 15/477,930, Ravichandran et al., filed Apr. 3, 2017.
U.S. Appl. No. 15/477,970, Ravichandran et al., filed Apr. 3, 2017.
U.S. Appl. No. 15/490,599, Qin et al., filed Apr. 18, 2017.
U.S. Appl. No. 15/490,616, Qin et al., filed Apr. 18, 2017.
U.S. Appl. No. 15/490,682, Qin et al., filed Apr. 18, 2017.
U.S. Appl. No. 15/624,802, Liu et al., filed Jun. 16, 2017.
U.S. Appl. No. 15/624,819, Liu et al., filed Jun. 16, 2017.
U.S. Appl. No. 15/624,838, Liu et al., filed Jun. 16, 2017.
U.S. Appl. No. 15/624,839, Liu et al., filed Jun. 16, 2017.
U.S. Appl. No. 15/624,857, Liu et al., filed Jun. 16, 2017.
U.S. Appl. No. 15/879,015, Yershov et al., filed Jan. 24, 2018.
U.S. Appl. No. 15/688,470, Frazzoli et al., filed Aug. 28, 2017.
U.S. Appl. No. 15/688,500, Frazzoli et al., filed Aug. 28, 2017.
U.S. Appl. No. 15/688,503, Frazzoli et al., filed Aug. 28, 2017.
U.S. Appl. No. 15/688,548, Frazzoli et al., filed Aug. 28, 2017.
U.S. Appl. No. 15/688,540, Frazzoli et al., filed Aug. 28, 2017.
U.S. Appl. No. 15/872,627, Censi et al., filed Jan. 16, 2018.
U.S. Appl. No. 15/872,603, Censi et al., filed Jan. 16, 2018.
U.S. Appl. No. 15/872,614, Censi et al., filed Jan. 16, 2018.
U.S. Appl. No. 15/138,602, Hoffman et al., filed Apr. 26, 2016.
U.S. Appl. No. 15/590,610, Hoffman et al., filed May 9, 2017 (abandoned).
U.S. Appl. No. 15/350,275, Hoffman et al., filed Nov. 14, 2016 (abandoned).
U.S. Appl. No. 15/083,520, Bhatia et al., filed Mar. 29, 2016.
U.S. Appl. No. 15/093,021, Wei et al., filed Apr. 7, 2016.
U.S. Appl. No. 15/146,534, Wei et al., filed May 4, 2016.
U.S. Appl. No. 15/824,037, Wei et al., filed Nov. 28, 2017.
U.S. Appl. No. 15/135,861, Lee et al., filed Apr. 22, 2016 (abandoned).
U.S. Appl. No. 15/135,825, Wei et al., filed Apr. 22, 2016 (abandoned).
U.S. Appl. No. 15/171,129, Lambermont et al., filed Jun. 2, 2016.
U.S. Appl. No. 15/159,234, Wei et al., filed May 16, 2016 (abandoned).
U.S. Appl. No. 15/160,655, Wei et al., filed May 20, 2016.
U.S. Appl. No. 15/171,148, Bhatia et al., filed Jun. 2, 2016 (abandoned).
U.S. Appl. No. 15/171,174, Wei et al., filed Jun. 2, 2016 (abandoned).
U.S. Appl. No. 15/177,955, Wei et al., filed Jun. 6, 2016 (abandoned).
U.S. Appl. No. 15/187,157, Wei et al., filed Jun. 20, 2016 (abandoned).
U.S. Appl. No. 15/336,942, Lambermont et al., filed Oct. 28, 2016.
U.S. Appl. No. 15/230,019, Lambermont et al., filed Aug. 5, 2016.
U.S. Appl. No. 15/198,598, Wei et al., filed Jun. 30, 2016 (abandoned).
U.S. Appl. No. 15/619,939, Wei et al., filed Jun. 12, 2017.
U.S. Appl. No. 15/591,680, Wei et al., filed May 10, 2017.
U.S. Appl. No. 15/653,879, Xu et al., filed Jul. 19, 2017.
U.S. Appl. No. 15/906,059, Pai et al., filed Feb. 27, 2018.
U.S. Appl. No. 15/967,862, Wei et al., filed May 1, 2018.
Related Publications (1)
Number Date Country
20170356747 A1 Dec 2017 US