This description relates to affecting functions of a vehicle based on function-related characteristics of its environment.
An autonomous vehicle can drive safely without human intervention during part of a journey or an entire journey.
An autonomous vehicle includes sensors, actuators, computers, and communication devices to enable automated generation and following of routes through the environment. Some autonomous vehicles have wireless two-way communication capability to communicate with remotely-located command centers that may be manned by human monitors, to access data and information stored in a cloud service and to communicate with emergency services.
As shown in
Given a desired goal position, a routing algorithm 20 determines a route 14 through the environment from the vehicle's current position 16 to the goal position 12. We sometimes call this process “route planning.” In some implementations, a route is a series of connected segments of roads, streets, and highways (which we sometimes refer to as road segments or simply segments).
Routing algorithms typically operate by analyzing road network information. Road network information typically is a digital representation of the structure, type, connectivity, and other relevant information about the road network. A road network is typically represented as a series of connected road segments. The road network information, in addition to identifying connectivity between road segments, may contain additional information about the physical and conceptual properties of each road segment, including but not limited to the geographic location, road name or number, road length and width, speed limit, direction of travel, lane edge boundary type, and any special information about a road segment such as whether it is a bus lane, whether it is a right-turn only or left-turn only lane, whether it is part of a highway, minor road, or dirt road, whether the road segment allows parking or standing, and other properties.
The routing algorithm typically identifies one or more candidate routes 22 from the current position to the goal position. Identification of the best, or optimal, route 14 from among the candidate routes is generally accomplished by employing algorithms (such as A*, D*, Dijkstra's algorithm, and others) that identify a route that minimizes a specified cost. This cost is typically a function of one or more criteria, often including the distance traveled along a candidate route and the expected time to travel along the candidate route considering speed limits, traffic conditions, and other factors. The routing algorithm may identify one or more good routes to be presented to the rider (or other person, for example, an operator at a remote location) for selection or approval. In some cases, a single optimal route may simply be provided to a vehicle trajectory planning and control module 28, which has the function of guiding the vehicle toward the goal (we sometimes refer to the goal position simply as the goal) along the optimal route.
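As an illustration of cost-minimizing route selection over connected road segments, the following sketch applies Dijkstra's algorithm to a small hypothetical road-segment graph whose edge costs represent expected travel times; the segment names and costs are invented for illustration and are not drawn from the description above.

```python
import heapq

# Hypothetical road-segment graph: node -> list of (neighbor, expected travel time in seconds).
ROAD_GRAPH = {
    "A": [("B", 120), ("C", 300)],
    "B": [("D", 180)],
    "C": [("D", 60)],
    "D": [("goal", 240)],
}

def dijkstra_route(graph, start, goal):
    """Return (total_cost, route) minimizing the summed edge cost from start to goal."""
    frontier = [(0, start, [start])]        # (cost so far, node, route so far)
    visited = set()
    while frontier:
        cost, node, route = heapq.heappop(frontier)
        if node == goal:
            return cost, route
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier, (cost + edge_cost, neighbor, route + [neighbor]))
    return float("inf"), []

if __name__ == "__main__":
    cost, route = dijkstra_route(ROAD_GRAPH, "A", "goal")
    print(f"optimal route {route} with expected travel time {cost} s")
```

An A* variant would add a heuristic term (for example, straight-line distance to the goal) to each queued cost while leaving the rest of the procedure unchanged.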
As shown in
Road network information can have temporal information associated with it, to enable descriptions of traffic rules, parking rules, or other effects that are time dependent (e.g., a road segment that does not allow parking during standard business hours or on weekends), or to include information about expected travel time along a road segment at specific times of day (e.g., during rush hour).
In general, in an aspect, information is received about a function-related feature of an environment of a vehicle that has been identified in connection with a location of the vehicle.
Execution of a function of the vehicle is affected to alter performance characteristics of the function based on the identified function-related feature.
Implementations may include one or a combination of two or more of the following features. The function-related feature includes a geographic region. The function-related feature includes a road feature. The function includes a software process. The function includes motion planning. The function includes trajectory tracking. The function includes actuator control. The function includes decision-making. The function includes perception processing. The function includes localization of the vehicle. The function includes sensor data recording on-board the vehicle. The execution of the function is affected by switching from one function to another function. The execution of the function is affected by altering an operation of a function. The affecting of the execution of the function is based on parameter settings. The parameter settings are selected from among two or more different sets of parameter settings. The affecting of the execution of the function is based on prior information. The prior information includes a simulation model. The prior information includes two or more simulation models. The information about the function-related feature is received from a database.
In general, in an aspect, information is received that identifies or defines a function-related feature of an environment of a vehicle. Function-related information is generated that corresponds to the function-related feature.
Implementations may include one or a combination of two or more of the following features. The information is received from an automated process. The information is received from a user interacting with a user interface. The information is received from a combination of an automated process and a user interacting with a user interface. The function-related information includes a location that corresponds to the function-related feature. The function-related information includes a geographic region that corresponds to the function-related feature. The function-related feature includes a road feature. The information is received while the vehicle is traveling in the vicinity of the function-related feature. The generated function-related information is stored in a database. The database includes a road network information database that includes information about road segments. The function-related information is fed through a communication network to a location where road network information is accumulated.
These and other aspects, features, implementations, and advantages, and combinations of them, can be expressed as methods, systems, components, apparatus, program products, methods of doing business, means and steps for performing functions, and in other ways.
Other aspects, features, implementations, and advantages will become apparent from the following description and from the claims.
As shown in
We sometimes refer to this information broadly as “function-related information” and to the functions of the vehicle broadly as “vehicle functions.” Such function-related information may be used not only to select a route but also to manage, control, influence, inform, or otherwise (in broad terms) “affect” the functions and performance of a vehicle. In some implementations, the function-related information is used to affect the vehicle functions by modifying the type, structure, or performance (or a combination of them) of software processes on the vehicle.
We here introduce the term “function-related features” 52 which we use broadly to include, for example, any location or geographic region or other features of the environment of a vehicle that can be related to one or more vehicle functions. The identity or location or boundary or other identification information for a function-related feature 52 is part of the function-related information that is stored in a road network information database or similar database.
Referring to
As illustrated also in
Function-related features may include a wide variety of different types of features of the environment of the vehicle.
For example, function-related features may be associated with physical roadway structures such as pedestrian crosswalks, intersection zones, railway crossing areas; freeways, turnpikes, highways, and other high-speed travel zones; secondary roads; parking lots; pedestrian zones that allow vehicular traffic; designated school zones; driveways, road shoulders or breakdown lanes; toll booths; gas stations or electric vehicle charging stations; taxi stands; car rental lots; airport arrival/departure levels; construction zones; areas with known vehicle-to-vehicle (v2v) or vehicle-to-infrastructure (v2i) capability; designated autonomous vehicle travel areas (e.g., specified travel lanes or road regions that only admit passage to autonomous vehicles); drive-throughs (e.g., for bank automated tellers, or fast food), or car washes, to name a few.
Function-related features may be associated with drivable areas such as portions of a road, parking lot, unpaved drivable area, or other drivable surface.
In addition to identifying or defining different types of function-related features to be represented by function-related information in the database, sub-features 54 (
Function-related sub-features can be identified or defined based on geographic proximity to objects or road features of interest. For example, a section of a secondary road that is near to a stop or yield sign can be identified in the database as a sub-feature having proximity to objects or road features of interest.
Different portions of function-related features that lie in different geographic places, and therefore are associated with different driving rules or customs, can be identified in the database as being in the corresponding geographic places. For example, an instance of a type “geographic region” can be defined as a 2D polygon describing the geographic extent of the function-related feature, such as an ordered set of 2D coordinates, together with a unique map identifier for which the 2D coordinates are valid. An instance of a type “road feature” (such as a specific traffic sign or a set of traffic signs) could be defined by a set of 2D coordinates together with a unique map identifier for which the 2D coordinates are valid.
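The following is a minimal sketch of how such records might be represented, assuming hypothetical class and field names (the description above specifies only the content: an ordered set of 2D coordinates together with a map identifier). The point-in-polygon test uses standard ray casting to decide whether a vehicle position falls inside a geographic-region feature.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class GeographicRegion:
    """Function-related feature of type 'geographic region': a 2D polygon plus the map it is valid for."""
    map_id: str
    polygon: List[Point]          # ordered 2D vertices describing the geographic extent

    def contains(self, p: Point) -> bool:
        """Ray-casting point-in-polygon test for a 2D vehicle position."""
        x, y = p
        inside = False
        n = len(self.polygon)
        for i in range(n):
            x1, y1 = self.polygon[i]
            x2, y2 = self.polygon[(i + 1) % n]
            if (y1 > y) != (y2 > y):
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:
                    inside = not inside
        return inside

@dataclass
class RoadFeature:
    """Function-related feature of type 'road feature', e.g., a specific traffic sign or set of signs."""
    map_id: str
    positions: List[Point]        # 2D coordinates of the sign or signs

# Example: a rectangular parking-lot region on a hypothetical map.
parking_lot = GeographicRegion("map_v3", [(0, 0), (100, 0), (100, 50), (0, 50)])
print(parking_lot.contains((20.0, 10.0)))   # True: the position is inside the region
```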
The following are exemplary ways in which vehicle functions (e.g., software processes) of self-driving vehicles can be invoked or modified (that is, affected) based on function-related feature information.
Motion Planning Processes
As mentioned earlier and as also illustrated in
Certain motion planning processes may exhibit performance characteristics 63 (
For example, a certain motion planning process (process A) may be able to quickly generate candidate trajectories that exhibit little path curvature and may therefore be well-suited to motion planning for driving at high speeds on highways, where due to high vehicle travel speeds it is impossible to safely follow high-curvature trajectories. Another motion planning process (process B) may require greater computational effort and therefore operate at a lower update rate than process A, but may be able to generate complex (e.g., involving high path curvature or multi-point turn) trajectories, and may therefore be well-suited to motion planning for driving at low speeds in parking lots and driveways.
In other words, the most effective operation of a motion planning process may depend on responding to characteristics of the function-related features through which the vehicle is to travel.
Therefore, based on known properties (performance characteristics) of motion planning process A and motion planning process B, it may be desirable to use process A for function-related features that are associated with highways, freeways, turnpikes, and other high-speed travel zones, and process B for function-related features associated with parking lots, driveways, pedestrian zones that allow vehicular traffic, and other low-speed travel zones that require complex maneuvering.
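One simple way to realize such an association, sketched below with hypothetical process names and feature-type labels, is a lookup table that maps each function-related feature type to a motion planning process; the selector switches processes as the vehicle's current feature changes.

```python
# Hypothetical motion planning processes; real processes would generate candidate trajectories.
def motion_planner_a(state):
    return f"low-curvature, high-update-rate trajectory from {state}"        # suited to highways

def motion_planner_b(state):
    return f"high-curvature, multi-point-turn trajectory from {state}"       # suited to parking lots

# Association of feature types with planning processes (assumed feature-type labels).
PLANNER_BY_FEATURE = {
    "highway": motion_planner_a,
    "freeway": motion_planner_a,
    "turnpike": motion_planner_a,
    "parking_lot": motion_planner_b,
    "driveway": motion_planner_b,
    "pedestrian_zone": motion_planner_b,
}

def plan(current_feature_type, vehicle_state):
    planner = PLANNER_BY_FEATURE.get(current_feature_type, motion_planner_a)  # default planner
    return planner(vehicle_state)

print(plan("parking_lot", "pose=(3.2, 7.5, 90deg)"))
```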
Parameter Settings for a Given Motion Planning Process
Motion planning processes of various types generally rely on user-selected parameter settings 71 (
As a result, the output of a given motion planning process that uses given parameter settings may be better suited (and therefore result in improved performance, by some metric) to the properties of a particular function-related feature. Therefore, it may be desirable to associate different sets of parameters for a given motion planning process with corresponding different function-related features.
For example, a certain motion planning process based on a class of algorithms known as rapidly exploring random trees (RRT), employing a parameter set A, tends to bias the RRT algorithm tree growth along a pre-defined nominal route. Such an approach may exhibit better performance characteristics (e.g., faster identification of a path of motion whose quality exceeds some user-defined threshold) in driving scenarios that do not require complex maneuvering, compared to the same motion planning process employing parameter set B, which does not tend to bias the RRT algorithm tree growth toward any particular route.
In some cases, a certain motion planning process based on a class of algorithms known as state lattices, using a parameter set A, defines a coarsely-spaced lattice in both space and time and therefore enables only coarse vehicle maneuvers and can identify candidate motion trajectories extremely rapidly. Such an approach may exhibit better performance characteristics (e.g., faster identification of a motion path whose quality exceeds some user-defined threshold) in a range of high-speed driving scenarios compared to the same planning process using a parameter set B, which defines a finely-spaced lattice in both space and time and therefore enables precise maneuvering at the cost of additional computation time.
Therefore, based on known properties of a motion planning process that can employ parameter set A or parameter set B, it may be desirable to arrange for the use of parameter set A for function-related features that are associated with highways, freeways, turnpikes, and other high-speed travel zones and the parameter set B for function-related features associated with parking lots, driveways, pedestrian zones that allow vehicular traffic, and other low-speed travel zones that require complex maneuvering. In some implementations, the two different parameter sets may be employed by a single given motion planning process. In some cases, the two different parameter sets may be employed by two different motion planning processes or two separate instances of a single process.
Motion planning process parameter settings can also be adjusted to influence common vehicle operational settings such as the travel speed, nominal and maximum longitudinal and lateral acceleration and deceleration, and other settings.
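A sketch of how such parameter sets might be organized and selected follows; the field names and numeric values (lattice spacing, route bias, speed and acceleration limits) are illustrative assumptions rather than recommended settings.

```python
# Hypothetical parameter sets for a single motion planning process.
PARAMETER_SETS = {
    # Set A: coarse lattice, route-biased sampling, fast planning for high-speed zones.
    "A": {
        "lattice_spacing_m": 2.0,
        "lattice_spacing_s": 1.0,
        "bias_toward_nominal_route": True,
        "max_speed_mps": 30.0,
        "max_lateral_accel_mps2": 2.0,
    },
    # Set B: fine lattice, unbiased sampling, precise maneuvering for low-speed zones.
    "B": {
        "lattice_spacing_m": 0.25,
        "lattice_spacing_s": 0.2,
        "bias_toward_nominal_route": False,
        "max_speed_mps": 5.0,
        "max_lateral_accel_mps2": 1.0,
    },
}

FEATURE_TO_PARAMETER_SET = {"highway": "A", "freeway": "A", "parking_lot": "B", "driveway": "B"}

def configure_planner(feature_type):
    """Return the parameter set the planning process should use for this feature type."""
    return PARAMETER_SETS[FEATURE_TO_PARAMETER_SET.get(feature_type, "A")]

print(configure_planner("driveway")["lattice_spacing_m"])   # 0.25
```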
Use of Motion Planning Process Prior Information
Motion planning processes of various types may take advantage of information of various types (“prior information”) that is provided or available at initiation of the motion planning process or at a time prior to the planning of motion. Such prior information 79 (
For example, given information about an ego vehicle's current position and goal position, and knowledge of properties of the road network (e.g., function-related information 42), it may be possible to compute a nominal path from the current position to the goal position under the assumption that the route is free of obstacles. This nominal path can then be provided to the motion planning process as prior information, which may reduce the burden on the planning process of computing a path in certain scenarios. The nominal path can also take account of other information associated with the applicable function-related feature, such as the nominal travel speed, which influences the path curvature that can be followed by the vehicle to within a specified precision.
In some cases, given knowledge of the properties of the road network stored in the database, it may be possible to compute multiple nominal paths between a collection of start positions and goal positions on the road network. This collection of nominal paths can then be provided to the motion planning process as prior information, which may reduce the burden on the planning process of computing a motion plan in certain scenarios. This type of prior information may be especially useful in scenarios that require complex maneuvering, such as parking lots, because complex maneuvers can be challenging or time consuming to compute using a motion planning process, and the presentation of the nominal paths as prior information can reduce the computational burden.
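The sketch below illustrates the idea of supplying a precomputed nominal path as prior information, assuming a hypothetical library of parking-lot paths keyed by start and goal positions; the planner shown is a stub that merely accepts the warm start.

```python
# Hypothetical library of precomputed nominal paths for a parking lot, keyed by
# (start position, goal position). Each path is a list of (x, y) waypoints computed
# offline under the assumption that the lot is free of obstacles.
NOMINAL_PATHS = {
    ("entrance", "stall_12"): [(0, 0), (5, 0), (10, 2), (12, 6)],
    ("stall_12", "exit"): [(12, 6), (10, 10), (4, 12), (0, 12)],
}

def plan_with_prior(start, goal, planner):
    """Pass the stored nominal path, if any, to the planner as prior information."""
    prior_path = NOMINAL_PATHS.get((start, goal))    # None if no prior is available
    return planner(start, goal, prior=prior_path)

def toy_planner(start, goal, prior=None):
    # A real planner would refine the prior around observed obstacles; here we
    # simply return the warm start when one is available.
    return prior if prior is not None else [start, goal]

print(plan_with_prior("entrance", "stall_12", toy_planner))
```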
Trajectory Tracking Processes
As also shown in
Certain trajectory tracking processes may exhibit performance characteristics that vary depending on the operational environment, for example, a function-related feature associated with the vehicle's location. Therefore, it may be desirable to associate different trajectory tracking processes with corresponding different function-related features, so that when the ego vehicle enters or leaves or transitions between function-related features it also transitions between execution of different trajectory tracking processes.
For example, a certain trajectory tracking process (process A) 105 based on a pure pursuit algorithm may quickly generate sets of input levels that are suitable for tracking paths that exhibit little path curvature, and may therefore be well-suited to driving at high speeds on highways for which it is impossible to safely follow high-curvature trajectories. In contrast, another trajectory tracking process (process B) 107 based on model predictive control (MPC) may require greater computational effort and therefore operate at a lower update rate than process A, but may generate sets of input levels that are suitable for tracking complex (e.g., involving high path curvature or multi-point turn) trajectories, and may therefore be well-suited to driving at low speeds in parking lots and driveways.
Therefore, based on known properties (e.g., performance characteristics 109) of trajectory tracking process A and trajectory tracking process B, it may be desirable to use process A to track paths in function-related features that are associated with highways, freeways, turnpikes, and other high-speed travel zones, and process B to track paths in function-related features associated with parking lots, driveways, pedestrian zones that allow vehicular traffic, and other low-speed travel zones that require complex maneuvering.
Adjustment of Trajectory Tracking Process Parameter Settings
Trajectory tracking processes of various types generally rely on user-selected parameter settings 112 that influence the performance characteristics and input levels that are the output of a trajectory tracking process. As a result, the output from a given trajectory tracking process that uses given parameter settings may be better suited (and therefore yield better performance, by some metric) to the properties of a particular type of function-related feature. Therefore, it may be desirable to associate different sets of parameters for a single trajectory tracking process with different types of function-related features.
For example, a certain trajectory tracking process that is based on a pure pursuit algorithm and uses a parameter set A, which includes a large “look ahead distance,” may track trajectories more accurately in driving scenarios that do not require high-curvature maneuvering, compared to the same trajectory tracking process using parameter set B, which includes a small look ahead distance and results in accurate tracking of high-curvature trajectories.
In some cases, a certain trajectory tracking process based on a class of algorithms known as MPC, and using a parameter set A, may define a cost function that weakly penalizes deviation from the trajectory provided by the motion planning process. This approach may produce coarse maneuvering, though it also may generate steering inputs that have a relatively small rate of change of amplitude and are comfortable to a passenger. By contrast, the same trajectory tracking process using parameter set B, which defines a cost function that heavily penalizes deviation from the trajectory provided by the motion planning process and therefore results in precise maneuvering, may yield steering inputs that have a relatively large rate of change of amplitude and are less comfortable to a passenger.
Therefore, based on known properties of the trajectory tracking process when using different parameter sets (such as parameter set A or parameter set B), it may be desirable to have the process use a parameter set A for paths for function-related features that are associated with highways, freeways, turnpikes, and other high-speed travel zones, and to use a parameter set B for paths for function-related features associated with parking lots, driveways, pedestrian zones that allow vehicular traffic, and other low-speed travel zones that require complex maneuvering.
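The following sketch shows how a look ahead distance parameter could change the behavior of a pure pursuit tracking process; the steering law is the standard pure pursuit geometry, while the wheelbase, look-ahead values, and parameter-set labels are assumptions made for illustration.

```python
import math

# Hypothetical parameter sets differing only in look-ahead distance (meters).
PURE_PURSUIT_PARAMS = {"A": {"lookahead_m": 15.0},   # highways: smooth, low-curvature tracking
                       "B": {"lookahead_m": 2.5}}    # parking lots: tight, high-curvature tracking

WHEELBASE_M = 2.7   # assumed vehicle wheelbase

def pure_pursuit_steering(alpha_rad, parameter_set):
    """Standard pure pursuit law: curvature = 2*sin(alpha)/lookahead, steering = atan(wheelbase*curvature).

    alpha_rad is the angle between the vehicle heading and the line to the look-ahead point.
    """
    ld = PURE_PURSUIT_PARAMS[parameter_set]["lookahead_m"]
    curvature = 2.0 * math.sin(alpha_rad) / ld
    return math.atan(WHEELBASE_M * curvature)

# The same tracking error produces a gentler steering command with the long look-ahead (set A)
# and a sharper one with the short look-ahead (set B).
alpha = math.radians(10.0)
print(f"set A steering: {math.degrees(pure_pursuit_steering(alpha, 'A')):.2f} deg")
print(f"set B steering: {math.degrees(pure_pursuit_steering(alpha, 'B')):.2f} deg")
```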
Actuator Control Processes
As also shown in
Certain actuator control processes may exhibit performance characteristics 121 that vary depending on the vehicle operational conditions and the environment conditions. Therefore, it may be desirable to associate different actuator control processes with corresponding different function-related features. Transitions of the ego vehicle between function-related features can then cause transitions between execution of different corresponding actuator control processes. Changes in vehicle operational conditions may include, for example, changes in vehicle speed, changes in engine throttle level, changes in selected transmission gear, and others. Changes in the environment conditions may include, for example, road surface friction levels, road pitch and roll, road roughness levels, whether or not it is raining or snowing or has recently rained or snowed, whether or not slush or puddles are present on the road, and others.
For example, a certain throttle actuator control process (process A) 123 may yield accurate tracking of a desired throttle input level when the engine RPM level is relatively high and may therefore be well-suited to scenarios involving driving at high speeds on highways. Another throttle actuator control process (process B) 125 may yield accurate tracking of desired throttle input levels when the engine RPM level is relatively low and may therefore be well-suited to scenarios involving tracking complex (e.g., involving high path curvature or multi-point turn) trajectories, such as driving at low speeds in parking lots and driveways.
In some cases, a certain steering actuator control process (process A) may yield accurate tracking of a desired steering input level when the vehicle speed level is relatively high and the wheel “scrubbing” torque that resists turning is thus relatively low and may therefore be well-suited to scenarios involving driving at high speeds on highways. Another steering actuator control process (process B) may yield accurate tracking of desired steering input levels when the vehicle speed is relatively low and the wheel “scrubbing” torque that resists turning is thus relatively high, and may therefore be well-suited to scenarios involving tracking complex (e.g., involving high path curvature or multi-point turn) trajectories, such as driving at low speeds in parking lots and driveways.
Therefore, in both examples and other examples, based on known properties of actuator control process A and actuator control process B, it may be desirable to use process A for function-related features that are associated with highways, freeways, turnpikes, and other high-speed travel zones and process B for function-related features associated with parking lots, driveways, pedestrian zones that allow vehicular traffic, and other low-speed travel zones that require complex maneuvering.
Actuator Control Process Parameter Settings
As also shown in
For example, a certain throttle actuator control process based on a proportional-derivative algorithm, using a parameter set A, may track a desired throttle input level more accurately when the engine RPM level is relatively high and may therefore be well-suited to scenarios involving driving at high speeds on highways. The same throttle actuator control process based on the proportional-derivative algorithm, and using a parameter set B, may track the throttle input levels more accurately when the engine RPM level is relatively low and may therefore be well-suited to scenarios involving tracking complex (e.g., involving high path curvature or multi-point turn) trajectories, such as driving at low speeds in parking lots and driveways.
In some cases, a certain steering actuator control process, using a parameter set A having relatively low proportional gain, may track a steering input level more accurately when the vehicle speed level is relatively high and the wheel “scrubbing” torque that resists turning is thus relatively low and may therefore be well-suited to scenarios involving driving at high speeds on highways. The same steering actuator control process using a parameter set B with relatively high proportional gain may track steering input levels more accurately when the vehicle speed is relatively low and the wheel “scrubbing” torque that resists turning is thus relatively high and may therefore be well-suited to scenarios involving tracking complex (e.g., involving high path curvature or multi-point turn) trajectories, such as driving at low speeds in parking lots and driveways.
Therefore, based on known properties of the actuator control process employing parameter set A and employing parameter set B, it may be desirable to apply parameter set A to function-related features that are associated with highways, freeways, turnpikes, and other high-speed travel zones, and parameter set B to function-related features associated with parking lots, driveways, pedestrian zones that allow vehicular traffic, and other low-speed travel zones that require complex maneuvering.
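A minimal sketch of a proportional-derivative throttle controller whose gain set is chosen per function-related feature appears below; the gain values are invented for illustration and would in practice be tuned for the particular vehicle and engine.

```python
# Hypothetical proportional-derivative gain sets for the throttle actuator control process.
PD_GAINS = {"A": {"kp": 0.4, "kd": 0.05},   # tuned for high-RPM, highway operation
            "B": {"kp": 1.2, "kd": 0.15}}   # tuned for low-RPM, parking-lot operation

class ThrottlePD:
    """Proportional-derivative tracking of a desired throttle input level."""

    def __init__(self, gain_set):
        self.kp = PD_GAINS[gain_set]["kp"]
        self.kd = PD_GAINS[gain_set]["kd"]
        self.previous_error = None

    def command(self, desired_level, measured_level, dt):
        error = desired_level - measured_level
        # Skip the derivative term on the first call to avoid an artificial derivative kick.
        derivative = 0.0 if self.previous_error is None else (error - self.previous_error) / dt
        self.previous_error = error
        return self.kp * error + self.kd * derivative

controller = ThrottlePD("B")             # a low-speed feature selected gain set B
print(controller.command(desired_level=0.30, measured_level=0.10, dt=0.02))
```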
Actuator Control Process Prior Information
Actuator control processes of various types may take advantage of prior information 128 typically in the form of a model that is provided at initiation of the process and that may influence the performance characteristics and output of the actuator control process. This may result in the output from a given actuator control process being better suited (and therefore producing better performance, by some metric) to the properties of a particular function-related feature when the actuator control process makes use of prior information, compared to when the actuator control process does not make use of prior information. Therefore, it may be desirable to associate prior information contained in a model with corresponding different function-related features.
For example, given information about the ego vehicle's current position and knowledge of the properties of the road network available in the database, it is generally possible to generate a nominal speed associated with travel along a particular road segment. Based on this, it may be possible to infer whether or not vehicle motion (including both longitudinal motion and yaw, e.g., steering-induced motion) arising from given actuator inputs can be accurately predicted through the use of models of varying complexity. Examples of models of varying complexity include kinematic models and dynamic models of increasing order that consider effects such as longitudinal wheel slip, vehicle rollover, and lateral skidding.
Control processes that exploit prior information contained in a model can generally be referred to as “model-based control processes.” Model-based control processes generally work better when the specific model used in the control process accurately represents the physical relationship described by the model. Therefore, it may be desirable to associate with a particular type of function-related feature a prior model that is expected to accurately represent the physical relationship described by the model during operation of the vehicle in the particular function-related feature.
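As an example of a low-complexity prior model, the sketch below steps a standard kinematic bicycle model, which predicts vehicle motion well at low speeds where wheel slip and skidding are negligible; the wheelbase and the simulated inputs are assumptions.

```python
import math

WHEELBASE_M = 2.7   # assumed wheelbase

def kinematic_bicycle_step(x, y, heading, speed, steering, dt):
    """One step of a kinematic bicycle model, a low-complexity prior model:

        x'       = v * cos(heading)
        y'       = v * sin(heading)
        heading' = (v / wheelbase) * tan(steering)
    """
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += speed / WHEELBASE_M * math.tan(steering) * dt
    return x, y, heading

# Predict 1 s of motion at parking-lot speed with a large steering angle.
state = (0.0, 0.0, 0.0)
for _ in range(50):                      # 50 steps of 20 ms
    state = kinematic_bicycle_step(*state, speed=2.0, steering=math.radians(25), dt=0.02)
print(state)
```

A higher-order dynamic model would add states and parameters for tire slip and load transfer, at the cost of more computation and more parameters to identify.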
Different Decision Making Processes
As also shown in
Certain decision making processes may exhibit performance characteristics that vary depending on the operational environment (e.g., function-related features). Therefore, it may be desirable to associate different decision making processes with different types of function-related features, so that when the ego vehicle transitions between function-related features it makes a corresponding transition between execution of different decision making processes.
For example, a certain decision making process (process A) 133 based on finite state machines may rapidly calculate a desirable navigation decision in scenarios where the decision-space of possible decisions is small and may therefore be well-suited to scenarios involving driving on highways, where the decision space relates primarily to determination of the vehicle's appropriate travel speed and lane. Another decision making process (process B) based on formal logic may calculate a desirable navigation decision in scenarios where the space of possible decisions is large and may therefore be well-suited to scenarios involving driving in crowded urban centers. For driving in crowded urban centers, the decision making process must adhere to a rule set governing not only vehicles but pedestrians and cyclists, and the vehicle must negotiate intersections. The corresponding decision space relates not only to the vehicle's appropriate travel speed and lane but also to the temporal sequencing of actions in light of complex actions of other road users.
Therefore, based on known properties of decision making process A and decision making process B, it may be desirable to use process A for types of function-related features that are associated with highways, freeways, turnpikes, and other high-speed travel zones, and process B for types of function-related features associated with parking lots, driveways, pedestrian zones that allow vehicular traffic, and other low-speed travel zones that require complex maneuvering.
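The sketch below gives a toy finite state machine for the small highway decision space; the states and triggering events are hypothetical, and a formal-logic-based process for urban driving is not shown.

```python
# A toy finite state machine for the small highway decision space (hypothetical
# states and triggers). Unknown events leave the state unchanged.
HIGHWAY_FSM = {
    ("LANE_KEEP", "slow_vehicle_ahead"): "PREPARE_LANE_CHANGE",
    ("PREPARE_LANE_CHANGE", "adjacent_lane_clear"): "LANE_CHANGE",
    ("PREPARE_LANE_CHANGE", "adjacent_lane_blocked"): "LANE_KEEP",
    ("LANE_CHANGE", "lane_change_complete"): "LANE_KEEP",
}

def next_state(state, event):
    """Return the next navigation state for a given triggering event."""
    return HIGHWAY_FSM.get((state, event), state)

state = "LANE_KEEP"
for event in ["slow_vehicle_ahead", "adjacent_lane_clear", "lane_change_complete"]:
    state = next_state(state, event)
    print(event, "->", state)
```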
Decision Making Process Parameter Settings
Decision making processes of various types generally rely on user-selected parameter settings 140 that influence the performance characteristics and output of the decision making process. This may result in output from a given decision making process that uses given parameter settings being better suited (and therefore producing better performance, by some metric) to the properties of a particular type of function-related feature. Therefore, it may be desirable to associate different sets of parameters for a given decision making process with different function-related features.
For example, a certain decision making process based on finite state machines, using a parameter set A that restricts the breadth of allowable decisions, may be well-suited to scenarios involving driving at high speeds on highways where the space of possible decisions is small and relates primarily to determination of the vehicle's appropriate travel speed and lane. The same decision making process using a parameter set B that expands the breadth of allowable decisions may be well-suited to driving in crowded urban centers. In urban centers, the space of possible decisions is large, and the decision making process must adhere to a rule set governing not only vehicles but pedestrians and cyclists. The vehicle also must negotiate intersections, corresponding to a decision space that relates not only to the vehicle's appropriate travel speed and lane but also to the temporal sequencing of actions in light of complex actions of other road users.
Therefore, based on known properties (characteristics 139) of the decision making process using parameter set A and using parameter set B, it may be desirable to use the process employing parameter set A for types of function-related features that are associated with highways, freeways, turnpikes, and other high-speed travel zones, and the process employing parameter set B for types of function-related features associated with parking lots, driveways, pedestrian zones that allow vehicular traffic, and other low-speed travel zones that require complex maneuvering.
Parameter sets may also adjust the level of assertiveness of the vehicle, or define the rules of the road or cultural norms in different areas, including but not limited to the time to wait at an intersection and whether overtaking on the right is permitted by local law.
Adjustment of Perception Process Parameter Settings
As also shown in
Perception processes of various types generally rely on user-selected parameter settings 150 that influence the performance characteristics and output 146 of the perception process. As a result, output from a given perception process that uses given parameter settings may be better suited (and therefore produce better performance, by some metric) to the properties of a particular function-related feature. Therefore, it may be desirable to associate different sets of parameters for a given perception process with different function-related features.
For example, a certain object detection perception process using a parameter set A may discount the importance of RADAR (or another sensor type) in a sensor fusion perception process in a geographic area where certain road infrastructure affects the quality of sensor readings, for example, by inducing invalid measurements.
In some cases, a certain sensor fusion perception process employing a parameter set A that heavily weights long-range RADAR data for the purpose of detecting vehicles at long range, at the expense of precise estimation of vehicle position, may be well-suited to scenarios involving driving on highways where long-range detection of vehicles is required in order to give sufficient reaction time to respond to emergency situations. In some cases, the same sensor fusion perception process employing a parameter set B that heavily weights short-range LIDAR data for the purpose of precisely detecting vehicle position at short range, at the expense of accurate vehicle detection at longer range, may be well-suited to driving in crowded urban centers, where precise, short-range detection of vehicles is required.
In some instances, a perception process using a tunable RADAR sensor employing a parameter set A that adjusts the RADAR sensor to collect data at long range with a narrow field of view for the purpose of detecting vehicles at long range, at the expense of detecting vehicles in the broader surroundings, may be well-suited to scenarios involving driving on highways where long-range detection of vehicles is required in order to give sufficient reaction time to respond to emergency situations. In contrast, the same perception process using a tunable RADAR sensor employing a parameter set B that adjusts the RADAR sensor to collect data at short range with a wide field of view for the purpose of detecting vehicles at short range, though in a broader surrounding region, may be well-suited to driving in crowded urban centers, where precise, short-range detection of vehicles is required.
Therefore, in various examples, based on known properties of the perception process when it is using parameter set A and when it is using parameter set B, it may be desirable to use the process with parameter set A for function-related features that are associated with highways, freeways, turnpikes, and other high-speed travel zones, and to use the process with parameter set B for function-related features associated with parking lots, driveways, pedestrian zones that allow vehicular traffic, and other low-speed travel zones that require complex maneuvering.
In some implementations, the output of a certain perception process that enables detection of vehicles and pedestrians may be ignored completely when the detected vehicles and pedestrians lie outside a function-related feature that describes the drivable road surface and a nearby buffer zone, because such vehicles and pedestrians (e.g., vehicles parked in a distant parking lot, or pedestrians walking on a sidewalk that is far from the road) are considered to be irrelevant to the driving task.
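A minimal sketch of that filtering step follows, approximating the drivable-surface feature as an axis-aligned rectangle with a buffer zone; the geometry and the buffer width are assumptions for illustration.

```python
# Hypothetical drivable-surface feature approximated as an axis-aligned rectangle
# (xmin, ymin, xmax, ymax) in map coordinates, plus a buffer zone around it.
DRIVABLE_SURFACE = (0.0, -4.0, 200.0, 4.0)    # a straight road segment
BUFFER_M = 2.0

def is_relevant(detection_xy):
    """Keep a detected object only if it lies on the drivable surface or within the buffer."""
    x, y = detection_xy
    xmin, ymin, xmax, ymax = DRIVABLE_SURFACE
    return (xmin - BUFFER_M <= x <= xmax + BUFFER_M) and (ymin - BUFFER_M <= y <= ymax + BUFFER_M)

detections = [(50.0, 1.0),     # pedestrian near the lane edge: kept
              (60.0, 25.0)]    # pedestrian on a distant sidewalk: ignored
print([d for d in detections if is_relevant(d)])
```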
Therefore, as shown in
For example, a function selector 212 can use information about feature-related performance characteristics 204 of function alternatives together with the identified feature information to select among the feature-related function alternatives or to alter or affect in a wide variety of other ways the operation of a vehicle function.
Also, for example, a prior information process 208 can use the information about the identified features in generating, updating, or modifying prior information and then providing it to the currently active vehicle function.
Also, for example, a process that manages sets of parameter settings 206 and provides selected sets to the feature-related function alternatives can operate in response to the identified function-related features.
The function-related information 42 in the database 30 can be generated by a function-related information generation process 210, which can be manual, semi-automated, automated, or a combination of them, based on known, observed, or identified function-related features and sub-features 52, 54 in the environment 48.
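The sketch below shows, with hypothetical names and labels, how a single feature transition could drive all three mechanisms at once: switching the active process, selecting a parameter set, and supplying prior information.

```python
# Hypothetical wiring of the mechanisms described above: when the vehicle's current
# function-related feature changes, the function selector switches the active process,
# the parameter manager supplies the matching parameter set, and the prior-information
# process supplies the matching prior.
ACTIVE_PROCESS = {"highway": "motion_planner_a", "parking_lot": "motion_planner_b"}
PARAMETER_SET = {"highway": "A", "parking_lot": "B"}
PRIOR_INFO = {"highway": "route-biased nominal path", "parking_lot": "precomputed maneuver library"}

def on_feature_transition(feature_type):
    return {
        "process": ACTIVE_PROCESS.get(feature_type, "motion_planner_a"),
        "parameters": PARAMETER_SET.get(feature_type, "A"),
        "prior": PRIOR_INFO.get(feature_type),
    }

print(on_feature_transition("parking_lot"))
```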
Different Localization Processes
As also shown in
Example data inputs include lane marker information, road curbs, raw or processed sensor readings from RADAR and LIDAR, the most recent GPS pose, and relative vehicle motion estimates from odometry and inertial measurement sensors. These data inputs are commonly combined in the localization process into a probabilistic state estimate based on a filtering process (for example, a Kalman or particle filter process) to compute the pose of the vehicle with respect to the map at the current time.
Different localization processes may utilize different data inputs. For example, GPS precision may be poor in urban regions due to well-known multipath and other errors or a lack of satellite availability. Similarly, lane marker information may be unavailable in certain regions.
Therefore, based on known properties of localization process A and localization process B, it may be desirable to use process A for types of function-related features that are associated with highways, and process B for types of function-related features associated with urban settings.
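As a sketch of the filtering idea, the following one-dimensional Kalman filter fuses an odometry-based prediction with a GPS measurement and de-weights the GPS update when the current function-related feature indicates an urban region; the noise values and the feature label are assumptions.

```python
# A minimal 1-D Kalman filter along the road: odometry drives the prediction and a
# GPS fix drives the update. When the current function-related feature is an urban
# region (assumed label "urban"), the GPS measurement is treated as much noisier.
def kalman_localize(position, variance, odometry_delta, gps_measurement, feature_type):
    # Predict using relative motion from odometry.
    position += odometry_delta
    variance += 0.05                     # odometry process noise (assumed)

    # Update with GPS, de-weighted when the feature indicates unreliable GPS.
    gps_variance = 100.0 if feature_type == "urban" else 1.0
    gain = variance / (variance + gps_variance)
    position += gain * (gps_measurement - position)
    variance *= (1.0 - gain)
    return position, variance

pos, var = 0.0, 1.0
pos, var = kalman_localize(pos, var, odometry_delta=1.2, gps_measurement=1.0, feature_type="highway")
print(pos, var)
```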
Sensor Data Recording On-Board the Vehicle
Different geographic regions may impose restrictions on what data may be collected by a self-driving vehicle. For example, privacy laws may differ among states or countries and disallow the collection of video data on-board the vehicle.
Therefore, it may be desirable to disallow or restrict the duration of sensor data recording or storage on and off the vehicle for types of function-related features that are associated with different areas including, but not limited to, countries, states, and communities, based on prior knowledge of the respective local laws or policies.
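A sketch of such a policy lookup follows; the jurisdiction labels and retention limits are invented examples, not statements of any actual law.

```python
# Hypothetical recording policies keyed by a jurisdiction-type function-related feature.
RECORDING_POLICY = {
    "jurisdiction_x": {"video_allowed": False, "max_retention_s": 0},
    "jurisdiction_y": {"video_allowed": True, "max_retention_s": 600},
}

def configure_recorder(jurisdiction):
    """Return the recording policy for the current jurisdiction, with a conservative default."""
    return RECORDING_POLICY.get(jurisdiction, {"video_allowed": False, "max_retention_s": 0})

print(configure_recorder("jurisdiction_x"))   # video recording disabled in this region
```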
Transitions Between Function-Related Features
A given geographic region may have associated with it two or more function-related features. In this scenario, if the function-related features would cause modifications of (affect) the structure or performance of software processes on the vehicle in contradictory or conflicting ways, an arbitration method can be employed to prioritize one function-related feature above the others for purposes of influencing a process, or to run multiple processes concurrently and choose between, switch between, or otherwise merge the outputs of the multiple processes.
In practice, when the ego-vehicle engages in transitions between geographic regions associated with different function-related features, care must be taken to ensure that the transition between different software processes (or different operating modes of a given software process) associated with the respective different function-related features does not result in abrupt changes in vehicle behavior. Such abrupt changes may cause vehicle motion that is uncomfortable or unsafe to a passenger or unusual to an outside observer. A variety of methods may be pursued to ensure that transitions between different software processes (or different modes of operation of a given software process) are accomplished in such a manner that vehicle operation remains safe, stable, and robust, including one or more of the following exemplary methods (a simplified sketch combining the filtering and hysteresis methods follows the list):
1. Comparing the outputs of the software processes (or modes of operation of a software process) associated with the respective function-related features and ensuring smooth transitioning between software process outputs through the use of a filtering method (e.g., averaging filter, low-pass filter, exponential moving average filter) on the value of the outputs;
2. Allowing a transition between outputs of the software processes associated with the respective function-related features and ensuring smooth transitioning between software process outputs by only allowing the transition when the difference between the outputs is smaller than a pre-defined threshold;
3. Ensuring a smooth transitioning between software process outputs by smoothly transitioning between the software process parameter settings associated with the respective function-related features;
4. Allowing the transition between outputs of the software processes associated with the respective function-related features, but ensuring that the outputs do not switch back and forth at high frequency (potentially due to the presence of high frequency noise in the vehicle's location estimate) by applying a hysteresis function.
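The following sketch combines the filtering and hysteresis methods above: an exponential moving average smooths the published output while hysteresis on the feature boundary prevents high-frequency switching; the smoothing constant and the threshold are illustrative assumptions.

```python
class SmoothedSwitcher:
    """Blend the outputs of two software processes during a feature transition.

    Combines an exponential moving average on the published output with hysteresis
    on the switching decision; the threshold and smoothing factor are assumptions.
    """

    def __init__(self, alpha=0.2, hysteresis_m=5.0):
        self.alpha = alpha                 # EMA smoothing factor
        self.hysteresis_m = hysteresis_m   # distance past the boundary before switching
        self.active = "A"
        self.output = 0.0

    def step(self, output_a, output_b, signed_distance_past_boundary):
        # Hysteresis: only change the active process once the vehicle is clearly inside
        # the new feature, so noise in the location estimate cannot cause chatter.
        if self.active == "A" and signed_distance_past_boundary > self.hysteresis_m:
            self.active = "B"
        elif self.active == "B" and signed_distance_past_boundary < -self.hysteresis_m:
            self.active = "A"

        target = output_a if self.active == "A" else output_b
        # Exponential moving average keeps the published output from jumping abruptly.
        self.output += self.alpha * (target - self.output)
        return self.output

switcher = SmoothedSwitcher()
for d in [-2.0, 1.0, 4.0, 6.0, 8.0]:          # vehicle crossing into the new feature
    print(round(switcher.step(output_a=0.0, output_b=1.0, signed_distance_past_boundary=d), 3))
```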
Information about function-related features can be used during operation of either or both of a physical vehicle and a simulation process, in which a model (at a chosen level of fidelity) of the vehicle and environment is analyzed to study or predict vehicle operational performance characteristics in a range of user-defined scenarios. Such a simulation process could be executed on the vehicle during operation (to yield a model-based predictive capability of future operational performance) or on computing resources located in an office or in the cloud.
In certain scenarios a vehicle may be controlled or managed in certain or all functions by a remote operator (termed a “teleoperator”). We use the term teleoperator broadly to include any person or software process that is located away from the vehicle and receives over a wireless communication link sufficient information (e.g., video feedback captured by a forward-facing video camera) to provide actuator input levels, navigation decisions, or other vehicle operation commands. In scenarios where the vehicle is under teleoperator control, the vehicle may be considered in effect to lie within a function-related feature that may result in the affecting of one or more of the processes described above.
Generation of Function-Related Information
As discussed earlier, function-related information 42 may be included in a road network information database or similar database and can contain information, such as geographically-dependent information, that can be used to modify (affect) the operation or performance of a vehicle, such as an autonomous vehicle, for example by affecting the operation of software processes associated with such operation or performance. The function-related information can be generated in several ways, including through the use of manual, semi-automated, or automated processes, as described below. Function-related information generated by any of these processes (or any combination of them) can be stored in a database in memory units located on automated vehicles or located on a cloud server and accessed by wireless communication by numerous automated vehicles that have access to that communication network.
Manual Generation of Function-Related Information
Function-related information can be generated by a software process termed a manual map labeling process. Such a process requires a human user to identify or define different function-related features and associate geometric regions and the road network information related to these regions with different function-related features, using a variety of possible input devices and user interfaces.
A manual map labeling process might present to a user a visual representation of a road network on a display screen, including an aerial view of the road network overlaid with a graphical representation of the connectivity of the travel lanes present in the road network. The user could then “paint,” or bound with geometric shapes of varying size, function-related features associated with highways, parking lots, driveways, toll booths, or other road features or geographic regions.
A manual map labeling process can be run on a computer processor that is not located on a vehicle or it may be run on a computer processor that is located on a vehicle or both. In applications in which the manual map labeling process is run on a computer processor that is located on an autonomous vehicle, a vehicle passenger or other human user could manually identify the presence of and the boundaries of one or more different physical roadway structures or geographic regions through a user interface. A function-related feature associated with the distinct physical roadway structure or geographic region could then be generated by the manual map labeling process.
As an example of such a process, when an automated vehicle enters a parking lot, a vehicle passenger could employ an in-vehicle user interface to indicate that the vehicle has entered a parking lot. Alternatively, a vehicle-to-infrastructure communication process could send a message to the vehicle to indicate that the vehicle has entered a parking lot. Function-related information for a function-related feature of the type associated with parking lots then can be generated by the manual map labeling process, and various software processes could then be automatically adjusted to optimize vehicle performance for operation in parking lots.
Semi-Automated Generation of Function-Related Information
Function-related features can be defined using a software process termed a semi-automated map labeling process. This process can use one or more supervised or unsupervised classification algorithms to automatically identify distinct physical roadway structures based on analysis of characteristic visual, geometric, or other features associated with the structures, including features derived from observed traffic flow and from data collected by manually-driven vehicles or automated vehicles. Such classification algorithms include, for example, support vector machines, neural networks (including convolutional neural networks and similar “deep learning” methods), and others.
A human user then can be presented with the classified physical roadway structures and asked to accept, label, modify, or reject the classifications. If the human user accepts or modifies a classification, a function-related feature associated with the particular physical roadway structure is assigned to the geographic region or regions associated with the roadway structures. Such a method promises to increase efficiency in function-related feature definition, because part of the labeling process is performed rapidly and automatically by a machine.
A semi-automated map labeling process can be run on a computer processor that is not located on a vehicle, or it may be run on a computer processor that is located on a vehicle. In applications where the semi-automated map labeling process is run on a computer processor that is located on a vehicle, data collected by sensors located on board the vehicle are analyzed, potentially in concert with data stored on a memory located on the vehicle or in the cloud, to identify the presence of distinct physical roadway structures associated with distinct function-related features. A vehicle passenger or other human user is then presented with the classified physical roadway structures and asked to accept, label, modify, or reject the classifications.
As an example of such a process, data collected by a vision sensor mounted on the vehicle could be analyzed to detect geometric and color features associated with traffic cones, thereby indicating the likely presence of a construction zone. When such a construction zone is identified, a vehicle passenger or remote vehicle monitor could be asked to confirm that the nearby road contains a construction zone using a visual prompt on an in-vehicle display, an auditory prompt, or some other prompt. If the passenger or remote monitor answers affirmatively, function-related information for a function-related feature of the type associated with construction zones can be associated with the geographic region or regions associated with the detected cones by the semi-automated map labeling process, and various software processes could then be automatically adjusted to optimize vehicle performance for operation in construction zones.
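The sketch below outlines the confirm-and-store step of such a semi-automated flow, with the classifier stubbed out and all names hypothetical; a real classifier might be a support vector machine or a convolutional network as noted above.

```python
# Sketch of the semi-automated flow: a classifier proposes a feature label for a
# geographic region, a human accepts, relabels, or rejects it, and accepted labels
# are stored as function-related information. The classifier is a stub.
def classify_region(sensor_summary):
    """Stub classifier: a real implementation might use an SVM or a convolutional network."""
    if sensor_summary.get("traffic_cones_detected", 0) >= 3:
        return "construction_zone", 0.9
    return None, 0.0

def confirm_with_user(proposed_label, prompt=input):
    answer = prompt(f"Detected possible {proposed_label}. Accept (y), relabel (text), or reject (n)? ")
    if answer.strip().lower() == "y":
        return proposed_label
    if answer.strip().lower() == "n":
        return None
    return answer.strip()                  # user-supplied replacement label

def label_region(region_id, sensor_summary, database, prompt=input):
    label, confidence = classify_region(sensor_summary)
    if label is None:
        return
    confirmed = confirm_with_user(label, prompt)
    if confirmed is not None:
        database[region_id] = {"feature_type": confirmed, "classifier_confidence": confidence}

db = {}
label_region("region_42", {"traffic_cones_detected": 5}, db, prompt=lambda _: "y")  # simulated acceptance
print(db)
```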
Automated Generation of Function-Related Features
Function-related features can be defined using a software process termed an automated map labeling process. This process can use one or more supervised or unsupervised classification algorithms to automatically identify distinct physical roadway structures or geographic regions based on analysis of unique visual, geometric, or other features associated with the structures, including features derived from observed traffic flow and from data collected by manually-driven vehicles or automated vehicles. Such classification algorithms include, for example, support vector machines, neural networks (including convolutional neural networks and similar “deep learning” methods), and others. Function-related information for a function-related feature associated with the distinct physical roadway structure is then automatically assigned to the geographic region(s) associated with the roadway structures. Such a method promises to greatly increase efficiency in function-related feature definition, since all of the labeling process is performed rapidly by a machine. However, care must be taken to ensure that the algorithms employed for automated identification are highly accurate, since erroneous classifications by the classification process could lead to erroneous function-related feature assignment.
An automated map labeling process can be run on a computer processor that is not located on a vehicle, or it may also be run on a computer processor that is located on a vehicle. In applications where the automated map labeling process is run on a computer processor that is located on a vehicle, data collected by sensors located on board the vehicle are analyzed, potentially in concert with data stored on a memory located on the vehicle or in the cloud, to identify the presence of distinct physical roadway structures associated with distinct function-related features. Function-related features are then automatically defined based on the geometric region(s) associated with these roadway structures.
As an example of such a process, data collected by a vision sensor mounted on the automated vehicle could be analyzed to detect color and texture features associated with unpaved roads. When an unpaved road is identified, a function-related feature associated with the unpaved road can be associated with the geographic region(s) associated with the unpaved road by the automated map labeling process, and various software processes could then be automatically adjusted to optimize vehicle performance for operation on unpaved roads.
Other implementations are also within the scope of the following claims.
20150279210 | Zafiroglu et al. | Oct 2015 | A1 |
20150285644 | Pfaff et al. | Oct 2015 | A1 |
20150292894 | Goddard et al. | Oct 2015 | A1 |
20150310744 | Farrelly et al. | Oct 2015 | A1 |
20150319093 | Stolfus | Nov 2015 | A1 |
20150338849 | Nemec et al. | Nov 2015 | A1 |
20150339928 | Ramanujam | Nov 2015 | A1 |
20150345966 | Meuleau | Dec 2015 | A1 |
20150345967 | Meuleau | Dec 2015 | A1 |
20150345971 | Meuleau et al. | Dec 2015 | A1 |
20150346727 | Ramanujam | Dec 2015 | A1 |
20150348112 | Ramanujam | Dec 2015 | A1 |
20150353082 | Lee et al. | Dec 2015 | A1 |
20150353085 | Lee et al. | Dec 2015 | A1 |
20150355641 | Choi et al. | Dec 2015 | A1 |
20150358329 | Noda et al. | Dec 2015 | A1 |
20150379468 | Harvey | Dec 2015 | A1 |
20160013934 | Smereka et al. | Jan 2016 | A1 |
20160016127 | Mentzel et al. | Jan 2016 | A1 |
20160016525 | Chauncey et al. | Jan 2016 | A1 |
20160033964 | Sato et al. | Feb 2016 | A1 |
20160041820 | Ricci et al. | Feb 2016 | A1 |
20160047657 | Caylor et al. | Feb 2016 | A1 |
20160075333 | Sujan et al. | Mar 2016 | A1 |
20160107655 | Desnoyer et al. | Apr 2016 | A1 |
20160109245 | Denaro | Apr 2016 | A1 |
20160129907 | Kim et al. | May 2016 | A1 |
20160137206 | Chandraker et al. | May 2016 | A1 |
20160138924 | An | May 2016 | A1 |
20160139594 | Okumura et al. | May 2016 | A1 |
20160139598 | Ichikawa et al. | May 2016 | A1 |
20160139600 | Delp | May 2016 | A1 |
20160147921 | VanHolme | May 2016 | A1 |
20160148063 | Hong et al. | May 2016 | A1 |
20160161266 | Crawford et al. | Jun 2016 | A1 |
20160161271 | Okumura | Jun 2016 | A1 |
20160167652 | Slusar | Jun 2016 | A1 |
20160209843 | Meuleau et al. | Jul 2016 | A1 |
20160214608 | Packwood-Ace | Jul 2016 | A1 |
20160231122 | Beaurepaire | Aug 2016 | A1 |
20160239293 | Hoffman et al. | Aug 2016 | A1 |
20160266581 | Dolgov et al. | Sep 2016 | A1 |
20160282874 | Kurata et al. | Sep 2016 | A1 |
20160288788 | Nagasaka | Oct 2016 | A1 |
20160334229 | Ross et al. | Nov 2016 | A1 |
20160334230 | Ross et al. | Nov 2016 | A1 |
20160355192 | James et al. | Dec 2016 | A1 |
20160370801 | Fairfield et al. | Dec 2016 | A1 |
20160379486 | Taylor | Dec 2016 | A1 |
20170010613 | Fukumoto | Jan 2017 | A1 |
20170016730 | Gawrilow | Jan 2017 | A1 |
20170024500 | Sebastian et al. | Jan 2017 | A1 |
20170059335 | Levine et al. | Mar 2017 | A1 |
20170059339 | Sugawara et al. | Mar 2017 | A1 |
20170082453 | Fischer et al. | Mar 2017 | A1 |
20170090480 | Ho et al. | Mar 2017 | A1 |
20170110022 | Gulash | Apr 2017 | A1 |
20170122766 | Nemec et al. | May 2017 | A1 |
20170139701 | Lin et al. | May 2017 | A1 |
20170192437 | Bier et al. | Jul 2017 | A1 |
20170219371 | Suzuki et al. | Aug 2017 | A1 |
20170242436 | Creusot | Aug 2017 | A1 |
20170245151 | Hoffman et al. | Aug 2017 | A1 |
20170276502 | Fischer et al. | Sep 2017 | A1 |
20170277193 | Frazzoli et al. | Sep 2017 | A1 |
20170277194 | Frazzoli et al. | Sep 2017 | A1 |
20170277195 | Frazzoli et al. | Sep 2017 | A1 |
20170291608 | Engel et al. | Oct 2017 | A1 |
20170305420 | Desens et al. | Oct 2017 | A1 |
20170327128 | Denaro | Nov 2017 | A1 |
20170336788 | Iagnemma | Nov 2017 | A1 |
20170341652 | Sugawara et al. | Nov 2017 | A1 |
20170345321 | Cross et al. | Nov 2017 | A1 |
20170356746 | Iagnemma | Dec 2017 | A1 |
20170356747 | Iagnemma | Dec 2017 | A1 |
20170356748 | Iagnemma | Dec 2017 | A1 |
20170356750 | Iagnemma | Dec 2017 | A1 |
20170356751 | Iagnemma | Dec 2017 | A1 |
20170369051 | Sakai et al. | Dec 2017 | A1 |
20180004210 | Iagnemma et al. | Jan 2018 | A1 |
20180053276 | Iagnemma et al. | Feb 2018 | A1 |
20180053412 | Iagnemma et al. | Feb 2018 | A1 |
20180113455 | Iagnemma et al. | Apr 2018 | A1 |
20180113456 | Iagnemma et al. | Apr 2018 | A1 |
20180113457 | Iagnemma et al. | Apr 2018 | A1 |
20180113459 | Bennie et al. | Apr 2018 | A1 |
20180113463 | Iagnemma et al. | Apr 2018 | A1 |
20180113470 | Iagnemma et al. | Apr 2018 | A1 |
20180114442 | Minemura et al. | Apr 2018 | A1 |
20180120859 | Eagelberg et al. | May 2018 | A1 |
Number | Date | Country |
---|---|---|
105652300 | Jun 2016 | CN |
2381361 | Oct 2011 | EP |
2982562 | Feb 2016 | EP |
2009-102003 | May 2009 | JP |
2018-012478 | Jan 2018 | JP |
WO2014139821 | Sep 2014 | WO |
WO2015008032 | Jan 2015 | WO |
WO2015151055 | Oct 2015 | WO |
WO2017205278 | Nov 2017 | WO |
WO2017218563 | Dec 2017 | WO |
WO2018005819 | Jan 2018 | WO |
Entry |
---|
Kessels et al., “Electronic Horizon: Energy Management using Telematics Information”, Vehicle Power and Propulsion Conference, 2007. VPPC 2007. IEEE, 6 pages. |
Hammerschmidt, “Bosch to Focus on Cloud for Connected Car Services”, EE Times Europe. Dec. 3, 2015, 4 pages. |
“Gain Scheduling”, Wikipedia, 1 page. https://en.wikipedia.org/wiki/Gain_scheduling. |
http://www.bosch-presse.de/pressportal/en/connected-horizon----seeing-beyond-the-bends-ahead-35691.html. |
Transaction history and application as filed of U.S. Appl. No. 15/182,281, filed Jun. 14, 2016. |
Transaction history and application as filed of U.S. Appl. No. 15/200,050, filed Jul. 1, 2016. |
Transaction history and application as filed of U.S. Appl. No. 15/182,313, filed Jun. 14, 2016. |
Transaction history and application as filed of U.S. Appl. No. 15/182,360, filed Jun. 14, 2016. |
Transaction history and application as filed of U.S. Appl. No. 15/182,400, filed Jun. 14, 2016. |
Transaction history and application as filed of U.S. Appl. No. 15/182,365, filed Jun. 14, 2016. |
U.S. Appl. No. 15/182,281, filed Jun. 14, 2016, Iagnemma. |
U.S. Appl. No. 15/200,050, filed Jul. 1, 2016, Iagnemma. |
U.S. Appl. No. 15/182,313, filed Jun. 14, 2016, Iagnemma. |
U.S. Appl. No. 15/182,360, filed Jun. 14, 2016, Iagnemma. |
U.S. Appl. No. 16/186,289, filed Nov. 9, 2018, Iagnemma. |
U.S. Appl. No. 15/182,400, filed Jun. 14, 2016, Iagnemma. |
U.S. Appl. No. 15/182,365, filed Jun. 14, 2016, Iagnemma. |
Dolgov et al. “Path Planning for Autonomous Vehicles in Unknown Semi-structured Environments,” International Journal of Robotics Research, 2010, 29(5):485-501. |
Florentine et al., “Pedestrian notification methods in autonomous vehicles for multi-class mobility-on-demand service.” Proceedings of the Fourth International Conference on Human Agent Interaction, Oct. 4, 2016, pp. 387-392. |
Pendleton et al., "Autonomous golf cars for public trial of mobility-on-demand service." Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on Sep. 28, 2015, pp. 1164-1171. |
International Search Report and Written Opinion in International Application No. PCT/US2017/040040, dated Sep. 15, 2017, 22 pages. |
Aguiar et al., “Path-following for non-minimum phase systems removes performance limitations,” IEEE Transactions on Automatic Control, 2005, 50(2):234-239. |
Aguiar et al., “Trajectory-tracking and path-following of under-actuated autonomous vehicles with parametric modeling uncertainty,” Transactions on Automatic Control, 2007, 52(8):1362-1379. |
Amidi and Thorpe, “Integrated mobile robot control,” International Society for Optics and Photonics, Boston, MA, 1991, 504-523. |
Aoude et al., “Mobile agent trajectory prediction using Bayesian nonparametric reachability trees,” American Institute of Aeronautics and Astronautics, 2011, 1587-1593. |
Autoliv.com [online], “Vision Systems—another set of “eyes”,” available on or before Sep. 8, 2012, retrieved Oct. 20, 2016,<https://www.autoliv.com/ProductsAndInnovations/ActiveSafetySystems/Pages/VisionSystems.aspx>, 2 pages. |
Autonomoustuff.com [online], “ibeo Standard Four Layer Multi-Echo LUX Sensor: Bringing together the World's Best Technologies,” available on or before Jul. 2016, retrieved on Feb. 7, 2017, <http://www.autonomoustuff.com/product/ibeo-lux-standard/>, 2 pages. |
Bahlmann et al., “A system for traffic sign detection, tracking, and recognition using color, shape, and motion information.” IEEE Intelligent Vehicles Symposium, 2005, 255-260. |
Balabhadruni, “Intelligent traffic with connected vehicles: intelligent and connected traffic systems,” IEEE International Conference on Electrical, Electronics, Signals, Communication, and Optimization, 2015, 2 pages (Abstract Only). |
Bertozzi et al., “Stereo inverse perspective mapping: theory and applications” Image and Vision Computing, 1999, 16:585-590. |
Betts, “A survey of numerical methods for trajectory optimization,” AIAA Journal of Guidance, Control, and Dynamics, Mar.-Apr. 1998, 21(2):193-207. |
Castro et al., “Incremental Sampling-based Algorithm for Minimum-violation Motion Planning”, Decision and Control, IEEE 52nd Annual Conference, Dec. 2013, 3217-3224. |
Chaudhari et al., "Incremental Minimum-Violation Control Synthesis for Robots Interacting with External Agents," American Control Conference, Jun. 2014, <http://vision.ucla.edu/˜pratikac/pub/chaudhari.wongpiromsarn.ea.acc14.pdf>, 1761-1768. |
Chen et al., “Likelihood-Field-Model-Based Dynamic Vehicle Detection and Tracking for Self-Driving,” IEEE Transactions on Intelligent Transportation Systems, Nov. 2016, 17(11):3142-3158. |
D'Andrea-Novel et al., “Control of Nonholonomic Wheeled Mobile Robots by State Feedback Linearization,” The International Journal of Robotics Research, Dec. 1995, 14(6):543-559. |
De la Escalera et al., “Road traffic sign detection and classification,” IEEE Transactions on Industrial Electronics, Dec. 1997, 44(6):848-859. |
Delphi.com [online], “Delphi Electronically Scanning Radar: Safety Electronics,” retrieved on Feb. 7, 2017, <http://delphi.com/manufacturers/auto/safety/active/electronically-scanning-radar>, 4 pages. |
Demiris, “Prediction of intent in robotics and multi-agent systems.” Cognitive Processing, 2007, 8(3):151-158. |
Dominguez et al., "An optimization technique for positioning multiple maps for self-driving car's autonomous navigation," IEEE International Conference on Intelligent Transportation Systems, 2015, 2694-2699. |
Fairfield and Urmson, “Traffic light mapping and detection,” In Proceedings of the International Conference on Robotics and Automation (ICRA), 2011, 6 pages. |
Falcone et al., “A linear time varying model predictive control approach to the integrated vehicle dynamics control problem in autonomous systems,” IEEE Conference on Decision and Control, 2007, 2980-2985. |
Falcone et al., "A Model Predictive Control Approach for Combined Braking and Steering in Autonomous Vehicles", Ford Research Laboratories, Mediterranean Conference on Control & Automation, 2007, <http://www.me.berkeley.edu/˜frborrel/pdf/pub/pub-20.pdf>, 6 pages. |
Fong et al., "Advanced Interfaces for Vehicle Teleoperation: Collaborative Control, Sensor Fusion Displays, and Remote Driving Tools", Autonomous Robots 11, 2001, 77-85. |
Franke et al., “Autonomous driving goes downtown,” IEEE Intelligent Systems and their Applications, 1998, 6:40-48. |
Fraser, “Differential Synchronization,” ACM: DocEng '09, Sep. 2009, <https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/35605.pdf>, 13-20. |
Garcia et al., “Model predictive control: theory and practice—a survey,” Automatica, 1989, 25(3):335-348. |
Gavrila and Philomin, “Real-time object detection for “smart” vehicles,” In Proceedings of the Seventh IEEE International Conference on Computer Vision, 1999, 1:87-93. |
Golovinsky et al., “Shape-based Recognition of 3D Point Clouds in Urban Environments,” Proceedings of the 12th International Conference on Computer Vision, 2009, 2154-2161. |
He et al., “Color-Based Road Detection in Urban Traffic Scenes,” IEEE Transactions on Intelligent Transportation Systems, Dec. 2004, 5(4):309-318. |
Himmelsbach et al., "Fast Segmentation of 3D Point Clouds for Ground Vehicles," IEEE Intelligent Vehicles Symposium, Jul. 21-24, 2010, 6 pages. |
IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems, "Ethically Aligned Design: A Vision for Prioritizing Human Wellbeing with Artificial Intelligence and Autonomous Systems," IEEE Advancing Technology for Humanity, Dec. 13, 2016, 138 pages. |
ISO.org, “ISO 14229-1:2006; Road Vehicles—Unified diagnostic services (UDS)—Part 1: Specification and requirements,” International Standard Organization, 2006, retrieved on Apr. 4, 2018, <https://www.iso.org/standard/45293.html>, 2 pages (abstract). |
ISO.org, “ISO 15765-3:2004; Road Vehicles—Diagnostics on Controller Area Networks (CAN)—Part 3: Implementation of unified diagnostic services (UDS on CAN),” International Standard Organization, Oct. 2004, retrieved on Apr. 4, 2018, <https://www.iso.org/obp/ui/#iso:std:iso:14229:-1:ed-1:v2:en>, 2 pages (abstract). |
Jiang and Nijmeijer, “Tracking control of mobile robots: a case study in backstepping,” Automatica, 1997, 33(7):1393-1399. |
Kanayama, “A Stable Tracking Control Method for an Autonomous Mobile Robot,” International Conference on Robotics and Automation, 1990, 384-389. |
Karaman and Frazzoli, “Sampling-based algorithms for optimal motion planning.” Int. Journal of Robotics Research, Jun. 2011, <http://ares.lids.mit.edu/papers/Kamman.Frazzoli.IJRR11.pdf>, 30(7):846-894. |
Karaman et al., "Sampling-based Algorithms for Optimal Motion Planning with Deterministic μ-Calculus Specifications", 2012 American Control Conference, Jun. 27-Jun. 29, 2012, 8 pages. |
Kavraki et al., “Probabilistic roadmaps for path planning in high-dimensional configuration spaces.” IEEE Transactions on Robotics and Automation, 1996, 12(4):566-580. |
Kim, “Robust lane detection and tracking in challenging scenarios.” IEEE Transactions on Intelligent Transportation Systems, 2008, 9(1):16-26. |
Larson et al., “Securing Vehicles against Cyber Attacks,” ACM, 2008, retrieved on [date], <http://dl.acm.org/citation.cfm?id=1413174>, 3 pages. |
Lindner et al., “Robust recognition of traffic signals,” IEEE Intelligent Vehicles Symposium, 2004, 5 pages. |
Liu et al., "Nonlinear Stochastic Predictive Control with Unscented Transformation for Semi-Autonomous Vehicles," American Control Conference, Jun. 4-6, 2014, 5574-5579. |
Liu et al., “Robust semi-autonomous vehicle control for roadway departure and obstacle avoidance,” ICCAS, Oct. 20-23, 2013, 794-799. |
Lobdell, "Robust Over-the-air Firmware Updates Using Program Flash Memory Swap on Kinetis Microcontrollers," Freescale Semiconductor Inc., 2012, retrieved on Apr. 11, 2018, <http://cache.freescale.com/flies/microcontrollers/doc/app_note/AN4533.pdf>, 20 pages. |
Luzcando (searcher), “EIC 3600 Search Report,” STIC—Scientific & Technical Information Center, Feb. 14, 2018, 20 pages. |
Maldonado-Bascón et al., “Road-sign detection and recognition based on support vector machines,” IEEE Transactions on Intelligent Transportation Systems, 2007, 8(2):264-278. |
Mayne et al., “Constrained model predictive control: Stability and optimality,” Automatica, 2000, 36(6):789-814. |
Mobileye [online], “Advanced Driver Assistance Systems (ADAS) systems range on the spectrum of passive/active,” Copyright 2017, retrieved on Oct. 20, 2016, <http://www.mobileye.com/our-technology/adas/>, 2 pages. |
Mogelmose et al., “Vision-based traffic sign detection and analysis for intelligent driver assistance systems: Perspectives and survey,” IEEE Transactions on Intelligent Transportation Systems, 2012, 13(4):1484-1497. |
Morris et al., “Learning, modeling, and classification of vehicle track patterns from live video.” IEEE Transactions on Intelligent Transportation Systems, 2008, 9(3):425-437. |
Nilsson et al., “A Framework for Self-Verification of Firmware Updates over the Air in Vehicle ECUs,” IEEE. GLOBECOM Workshops, Nov. 2008, 5 pages. |
Nilsson et al., “Conducting Forensic Investigations of Cyber Attacks on Automobiles In-Vehicle Networks,” ICST, 2008, retrieved on Mar. 20, 2016, <http://dl.acm.org/citation.cfm?id=1363228>, 6 pages. |
Ollero and Amidi, “Predictive path tracking of mobile robots. application to the CMU Navlab,” in 5th International Conference on Advanced Robotics, 1991, 91:1081-1086. |
Paik et al., “Profiling-based Log Block Replacement Scheme in FTL for Update-intensive Executions,” IEEE: Embedded and Ubiquitous Computing (EUC), Oct. 2011, 182-188. |
Ponomarev, "Augmented reality's future isn't glasses. It's the car," Venturebeat.com, available on or before Aug. 2017, retrieved on Mar. 30, 2018, <https://venturebeat.com/2017/08/23/ar-will-drive-the-evolution-of-automated-cars/>, 4 pages. |
Premebida et al., “A lidar and vision-based approach for pedestrian and vehicle detection and tracking.” In Proceedings of the IEEE Intelligent Transportation Systems Conference, 2007, 1044-1049. |
Premebida et al., “LIDAR and vision-based pedestrian detection system.” Journal of Field Robotics, 2009, 26(9):696-711. |
Rankin et al., “Autonomous path planning navigation system used for site characterization,” SPIE—International Society for Optics and Photonics, 1996, 176-186. |
Shalev-Shwartz et al., "Avoiding a "Winter of Autonomous Driving": On a Formal Model of Safe, Scalable, Self-driving Cars," arXiv preprint, Aug. 17, 2017, 25 pages. |
Shen et al., “A Robust Video based Traffic Light Detection Algorithm for Intelligent Vehicles,” Proceedings of the IEEE Intelligent Vehicles Symposium, 2009, 521-526. |
Shin, “Hot/Cold Clustering for Page Mapping in NAND Flash Memory,” IEEE: Transactions on Consumer Electronics, Nov. 2011, 57(4):1728-1731. |
Spieser et al., "Toward a systematic approach to the design and evaluation of automated mobility-on-demand systems: A case study in Singapore," Road Vehicle Automation, 2014, 229-245. |
Standards.sae.org, “Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles,” SAE International, Sep. 2016, retrieved on Apr. 18, 2017, <http://standards.sae.org/j3016_201609/>, 3 pages. |
Steger et al., "Applicability of IEEE 802.11s for Automotive Wireless Software Updates," IEEE: Telecommunications (ConTEL), Jul. 2015, 8 pages. |
Stokar, "Perform over-the-air updates for car ECUs," eMedia Asia Ltd., 2013, retrieved on Apr. 11, 2018, <http://www.eetasia.com/STATIC/PDF/201312/EEOL_2013DEC05_NET_EMS_TA_01.pdf?SOURCES=DOWNLOAD>, 3 pages. |
Stahn et al., "Laser Scanner-Based Navigation for Commercial Vehicles," IEEE Intelligent Vehicles Symposium, Jun. 13-15, 2007, 969-974. |
Tabuada and Pappas, “Linear time logic control of discrete-time linear systems,” IEEE Transactions on Automatic Control, 2006, 51(12):1862-1877. |
Wallace et al., “First results in robot road-following,” in IJCAI, 1985, 1089-1095. |
Wang et al., “Lane detection and tracking using B-Snake,” Image and Vision Computing, 2004, 22(4):269-280. |
Wang et al., “Simultaneous localization, mapping and moving object tracking,” The International Journal of Robotics Research, 2007, 26(9):889-916. |
Weiskircher et al., “Predictive Guidance and Control Framework for (Semi-) Autonomous Vehicles in Public Traffic,” IEEE Transactions on Control Systems Technology, 2017, 25(6):2034-2046. |
Weiss et al., “Autonomous v. Tele-operated: How People Perceive Human-Robot Collaboration with HRP-2,” Proceedings of the 4th ACM/IEEE international conference on Human robot interaction, 2009, 3 pages. |
Wit et al., “Autonomous ground vehicle path tracking,” Journal of Robotic Systems, 2004, 21(8):439-449. |
Wu et al., “Data Sorting in Flash Memory,” ACM, 2015, <http://dl.acm.org/citation.cfm?id=2747982.2665067>, 25 pages. |
Yilmaz et al., “Object tracking: A survey,” ACM Computing Surveys, 2006, 31 pages. |
Zax, "A Software Update for Your Car? Ford reboots its infotainment system, following consumer complaints," MIT Technology Review, 2012, retrieved on Apr. 11, 2018, <http://www.technologyreview.com/view/427153/a-software-update-for-yourcar?/>, 6 pages. |
Zheng et al., "Lane-level positioning system based on RFID and vision," IET International Conference on Intelligent and Connected Vehicles, 2016, 5 pages. |
International Preliminary Report on Patentability in International Application No. PCT/US2017/040040, dated Jan. 10, 2019, 7 pages. |
EP Extended Search Report in European Application No. 17821263, dated Jun. 25, 2019, 8 pages. |
Number | Date | Country | |
---|---|---|---|
20180004206 A1 | Jan 2018 | US |