The subject disclosure relates to operation of an autonomous vehicle and, in particular, to a system and method for predicting an environmental state or road condition based on the behavior of road actors with respect to the environmental state or road condition.
An autonomous vehicle navigates through its environment by detecting objects in the environment and planning its trajectory to avoid the objects. Such operation focuses on what is immediately detectable by the autonomous vehicle. However, some road conditions are outside of the range of detection or awareness of the vehicle, such as construction down the road, a stalled vehicle, an inactive traffic light, etc. Thus, the autonomous vehicle cannot plan its trajectory for these road conditions. These obstacles nonetheless cause the vehicle, as well as other road actors, to change their behavior from what one normally expects. Accordingly, it is desirable to be able to predict the presence of a degraded environmental state based on the behavior of the other road actors.
In one exemplary embodiment, a method of operating a vehicle is disclosed. A current behavior of a road actor in response to an environmental state is detected. The environmental state is determined based on the current behavior of the road actor. A driving policy for the vehicle is planned based on the environmental state. A movement of the vehicle is actuated according to the driving policy.
In addition to one or more of the features described herein, the environmental state further includes at least one of an unknown road condition, road construction, a traffic signal malfunction, a stalled vehicle, an obstruction in the road, a weakly controlled or uncontrolled road intersection, and a newly changed road condition. The method further includes obtaining raw data of the road actor, determining a feature for the road actor from the raw data, and determining the current behavior from the feature. The feature of the road actor is at least one of a deceleration, an acceleration, a stopped motion, an initiated motion, a deviation from a lane, and a turn maneuver. The method further includes determining the behavior from a location of the feature within at least one of a temporal and a spatial sequence. The method further includes determining the environmental state using at least one of a Bayesian inference algorithm and a tree diagram. The method further includes creating a model for vehicle behavior, identifying a parameter of the model for an expected behavior of the road actor under a normal environmental state, and detecting a difference between the current behavior and the expected behavior to determine the environmental state from a comparison of the parameter of the model to the parameter for the current behavior.
In another exemplary embodiment, a system for navigating an autonomous vehicle is disclosed. The system includes a sensor and a processor. The sensor is configured to obtain raw data of a road actor in an environment. The processor is configured to determine a current behavior of the road actor from the raw data, wherein the current behavior is in response to an environmental state, determine the environmental state based on the current behavior of the road actor, plan a driving policy for the vehicle based on the environmental state, and actuate a movement of the vehicle according to the driving policy.
In addition to one or more of the features described herein, the environmental state further includes at least one of a road condition, road construction, a traffic signal malfunction, a stalled vehicle, an obstruction in the road, a weakly controlled or uncontrolled road intersection, and a newly changed road condition. The processor is further configured to determine a feature for the road actor from the raw data and determine the current behavior from the feature. The feature of the road actor is at least one of a deceleration, an acceleration, a stopped motion, an initiated motion, a deviation from a lane, and a turn maneuver. The processor is further configured to determine the behavior from a location of the feature within at least one of a temporal and a spatial sequence. The processor is further configured to determine the environmental state using at least one of a Bayesian inference algorithm and a tree diagram. The processor is further configured to create a model for vehicle behavior, identify a parameter of the model for an expected behavior of the road actor under a normal environmental state, and detect a difference between the current behavior and the expected behavior to determine the environmental state from a comparison of the parameter of the model to the parameter for the current behavior.
In yet another exemplary embodiment, a vehicle is disclosed. The vehicle includes a sensor and a processor. The sensor is configured to obtain raw data of a road actor in an environment. The processor is configured to determine a current behavior of the road actor from the raw data, wherein the current behavior is in response to an environmental state, determine the environmental state based on the current behavior of the road actor, plan a driving policy for the vehicle based on the environmental state, and actuate a movement of the vehicle according to the driving policy.
In addition to one or more of the features described herein, the environmental state further includes at least one of a road condition, road construction, a traffic signal malfunction, a stalled vehicle, an obstruction in the road, a weakly controlled or uncontrolled road intersection, and a newly changed road condition. The processor is further configured to determine a feature for the road actor from the raw data and determine the current behavior from the feature. The processor is further configured to determine the behavior from a location of the feature within at least one of a temporal sequence and a spatial sequence. The processor is further configured to determine the environmental state using at least one of a Bayesian inference algorithm and a tree diagram. The processor is further configured to create a model for vehicle behavior, identify a parameter of the model for an expected behavior of the road actor under a normal environmental state, and detect a difference between the current behavior and the expected behavior to determine the environmental state from a comparison of the parameter of the model to the parameter for the current behavior.
The above features and advantages, and other features and advantages of the disclosure are readily apparent from the following detailed description when taken in connection with the accompanying drawings.
Other features, advantages and details appear, by way of example only, in the following detailed description, the detailed description referring to the drawings in which:
The following description is merely exemplary in nature and is not intended to limit the present disclosure, its application or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.
In accordance with an exemplary embodiment, an autonomous vehicle 10 is shown.
The autonomous vehicle 10 generally includes at least a navigation system 20, a propulsion system 22, a transmission system 24, a steering system 26, a brake system 28, a sensor system 30, an actuator system 32, and a controller 34. The navigation system 20 determines a road-level route plan for automated driving of the autonomous vehicle 10. The propulsion system 22 provides power for creating a motive force for the autonomous vehicle 10 and can, in various embodiments, include an internal combustion engine, an electric machine such as a traction motor, and/or a fuel cell propulsion system. The transmission system 24 is configured to transmit power from the propulsion system 22 to two or more wheels 16 of the autonomous vehicle 10 according to selectable speed ratios. The steering system 26 influences a position of the two or more wheels 16. While depicted as including a steering wheel 27 for illustrative purposes, in some embodiments contemplated within the scope of the present disclosure, the steering system 26 may not include a steering wheel 27. The brake system 28 is configured to provide braking torque to the two or more wheels 16.
The sensor system 30 includes a radar system 40 that senses objects in an exterior environment of the autonomous vehicle 10 and determines various parameters of the objects useful in locating the position and relative velocities of various remote vehicles in the environment of the autonomous vehicle. Such parameters can be provided to the controller 34. In operation, the transmitter 42 of the radar system 40 sends out a radio frequency (RF) reference signal 48 that is reflected back at the autonomous vehicle 10 by one or more objects 50 in the field of view of the radar system 40 as one or more echo signals 52, which are received at receiver 44. The one or more echo signals 52 can be used to determine various parameters of the one or more objects 50, such as a range of the object, Doppler frequency or relative radial velocity of the object, and azimuth, etc. The sensor system 30 can include additional sensors such as digital cameras, Lidar, etc.
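The range and relative radial velocity parameters mentioned above follow from the standard radar relations between round-trip delay, Doppler shift, and carrier frequency. The following is a minimal illustrative sketch of those relations, not the radar system 40's actual implementation; the 77 GHz automotive carrier and the sample numbers are assumed examples.

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_echo(round_trip_delay_s: float) -> float:
    """Range to the reflecting object from the echo's round-trip delay."""
    return C * round_trip_delay_s / 2.0

def radial_velocity_from_doppler(doppler_shift_hz: float, carrier_freq_hz: float) -> float:
    """Relative radial velocity from the Doppler shift of the echo.

    A positive Doppler shift corresponds to an object closing on the radar.
    """
    return C * doppler_shift_hz / (2.0 * carrier_freq_hz)

# Assumed example: a 77 GHz automotive radar
rng = range_from_echo(1.0e-6)                     # 1 microsecond round trip -> ~150 m
vel = radial_velocity_from_doppler(5133.0, 77e9)  # ~10 m/s closing speed
```

Each echo signal 52 thus yields one (range, radial velocity) pair per detected object; azimuth estimation additionally requires the antenna geometry and is not sketched here.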
The controller 34 builds a trajectory or driving policy for the autonomous vehicle 10 based on the output of sensor system 30. The controller 34 can provide the trajectory or driving policy to the actuator system 32 to control the propulsion system 22, transmission system 24, steering system 26, and/or brake system 28 in order to move the autonomous vehicle 10 to follow the trajectory or according to the driving policy.
The controller 34 includes a processor 36 and a computer readable storage medium 38. The computer readable storage medium 38 includes programs or instructions 39 that, when executed by the processor 36, operate the autonomous vehicle 10 based on output from the sensor system 30. In various embodiments, the instructions 39 can cause the processor 36 to determine the behavior of road actors and predict a degraded environmental state based on the behavior of the road actors. Exemplary degraded environmental states can include, but are not limited to, a road condition, road construction, a traffic light malfunction, a stalled vehicle, an obstruction in the road, etc.
The raw data can be provided to a reasoning engine 216 operating on the processor 36. The reasoning engine 216 determines the current behavior of the road actors (e.g., road actors 212 and 214) from the raw data and predicts the environmental state, such as a road obstruction due to a stalled vehicle, from the detected behaviors of the current road actors.
The reasoning engine 216 can be trained using data obtained when the environmental state is under normal conditions in order to determine an expected behavior of the road actors under a normal environmental state. The reasoning engine 216 can compare the current behavior of the road actors to the expected behavior. The reasoning engine 216 can be used to generate a model for vehicle behavior and identify a parameter of the model that is a characteristic of the expected behavior of the vehicle. A parameter of the current behavior of the road actors can also be determined. A difference between these two behaviors (or their corresponding parameters) can indicate the presence of a degraded environmental condition. The reasoning engine 216 can provide the degraded environmental condition to the navigation system 20 for planning a trajectory or driving policy for the ego vehicle 208. Additionally, or alternatively, the degraded environmental condition can be used at one or more downstream software modules.
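The expected-versus-current comparison can be sketched as follows. This is a minimal illustration under assumed simplifications: the behavior parameter is taken to be mean speed over a road segment, the expected behavior is modeled as a Gaussian fit to speeds observed under a normal environmental state, and the 3-sigma threshold is an arbitrary example.

```python
from statistics import mean, stdev

def fit_expected_behavior(normal_speeds):
    """Model expected behavior as a Gaussian (mean, std) over speeds
    observed under a normal environmental state."""
    return mean(normal_speeds), stdev(normal_speeds)

def is_degraded(current_speeds, expected_mean, expected_std, k=3.0):
    """Flag a possible degraded environmental state when the current
    behavior parameter deviates from the expected one by more than
    k standard deviations."""
    return abs(mean(current_speeds) - expected_mean) > k * expected_std

# Assumed example data (m/s): free-flowing traffic vs. a sudden crawl
mu, sigma = fit_expected_behavior([13.0, 14.0, 13.5, 14.5, 13.8])
slowed = is_degraded([4.0, 3.5, 5.0], mu, sigma)  # traffic crawling
```

In practice the model parameter could equally be a stop duration, lane offset, or maneuver frequency; speed is used here only to make the comparison concrete.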
The reasoning engine 216 can also be in communication with a remote computer 218, such as a cloud computer, map server, etc. This remote computer 218 can provide map data or a normal expected behavior over a road segment, or other prior knowledge of the road segment. This information can be used in determining the presence of the degraded environmental state. In addition, once the environmental state is determined, this information can be shared back to the remote computer 218 and accessed by other vehicles 220.
The raw data of the first level 902 includes data such as a road actor's speed 910, the road actor's position 912 and the presence of a road sign 914 (or traffic light).
The features of the second level 904 can include, for example, “cruising” 916 (maintaining a constant speed), “stopped” 918, “decelerating” 920, “accelerating” 922, which can be determined from the road actor's speed 910. The features can also be an indicator of the road actor being the lead vehicle (“lead vehicle” 924) or whether the road actor is crossing traffic (“crossing traffic” 926).
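The mapping from the first-level raw speed data 910 to the second-level features can be sketched as follows. This is an illustrative classifier only; the window length, time step, and thresholds are assumed values, not parameters from the disclosure.

```python
def speed_feature(speeds, dt=0.1, stop_eps=0.3, accel_eps=0.5):
    """Map a short window of a road actor's speed samples (m/s, sampled
    every dt seconds) to one of the second-level features.

    Thresholds are illustrative: stop_eps bounds "stopped" speeds and
    accel_eps (m/s^2) separates cruising from accelerating/decelerating.
    """
    if all(v < stop_eps for v in speeds):
        return "stopped"
    accel = (speeds[-1] - speeds[0]) / (dt * (len(speeds) - 1))
    if accel > accel_eps:
        return "accelerating"
    if accel < -accel_eps:
        return "decelerating"
    return "cruising"

speed_feature([0.0, 0.1, 0.0])     # -> "stopped"
speed_feature([10.0, 10.1, 9.9])   # -> "cruising"
speed_feature([10.0, 8.0, 6.0])    # -> "decelerating"
```

The "lead vehicle" 924 and "crossing traffic" 926 indicators would instead be derived from the road actor's position 912 relative to the ego lane and the intersection geometry.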
The detected behaviors at the third level 906 can include, for example, “the crossing traffic does not stop” 928, “the lead vehicle decelerates, stops and goes” 930, and “the crossing traffic decelerates, stops and goes” 932. Attributes such as “inactive light” 934, “all-way stop sign” 936, and “stop sign” 938 also reside at this level and can be determined from the raw data of the road sign 914.
The estimated environmental states of the fourth level 908 can include “partially inactive traffic light” 940, “all-way inactive traffic light” 942, “missing an all-way stop sign” 944, “minor road-only stop control” 946 and “all-way stop” 948. The states “partially inactive traffic light” 940, “all-way inactive traffic light” 942, and “missing an all-way stop sign” 944 are due to weak signal communication for traffic control at the intersection and are detected by the environmental state detector 302.
In one embodiment, each feature can be assigned a probability, and the reasons or conclusions predicted from the feature are a result of probabilistic calculations, which can include use of a Bayesian inference algorithm to update confidences or probabilities, for example.
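One form such a Bayesian update can take is sketched below: a prior belief over candidate environmental states is reweighted by the likelihood of an observed behavior under each state. The candidate states, the observed behavior, and all probability values are assumed illustrative numbers, not figures from the disclosure.

```python
def bayes_update(prior, likelihoods):
    """Update a belief over environmental states given one observation,
    via Bayes' rule: P(state | obs) is proportional to P(obs | state) * P(state)."""
    unnorm = {s: prior[s] * likelihoods[s] for s in prior}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

# Illustrative: how likely is the behavior "crossing traffic does not stop"
# under each candidate state of the intersection?
prior = {"all_way_stop": 0.5, "inactive_light": 0.3, "minor_road_stop": 0.2}
obs_likelihood = {"all_way_stop": 0.05, "inactive_light": 0.2, "minor_road_stop": 0.7}
posterior = bayes_update(prior, obs_likelihood)
```

Each additional observed feature would apply another such update, so confidence in the correct environmental state accumulates as more road-actor behaviors are observed.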
Returning to the “lead vehicle cruises” node 1114, a first branch leads to a “crossing traffic decelerates, stops and goes” node 1124, which allows for the conclusion of “a minor road only stop control” environmental state 1126. Another branch from the “lead vehicle cruises” node 1114 leads to a “crossing traffic cruises” node 1128.
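The branching just described can also be read as a walk through a small decision tree over observed behaviors. The sketch below is a hand-written illustration of that idea; the tree shape, behavior labels, and the conclusion attached to the "crossing traffic cruises" branch are assumptions for demonstration, not the full tree of the disclosure.

```python
def infer_state(lead_vehicle, crossing_traffic):
    """Walk an illustrative decision tree from observed behaviors of the
    lead vehicle and the crossing traffic to an environmental state."""
    if lead_vehicle == "cruises":
        if crossing_traffic == "decelerates_stops_goes":
            return "minor_road_only_stop_control"
        if crossing_traffic == "cruises":
            return "no_stop_control_detected"  # assumed label for this branch
    if lead_vehicle == "decelerates_stops_goes":
        if crossing_traffic == "decelerates_stops_goes":
            return "all_way_stop_or_inactive_light"
    return "unknown"

infer_state("cruises", "decelerates_stops_goes")  # -> "minor_road_only_stop_control"
```

A tree of this form complements the probabilistic approach: the tree yields a single categorical conclusion per path, while Bayesian updating maintains a graded confidence over all candidate states.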
The determined environmental state can be used to improve various aspects of the driving process. Once the environmental state is determined, a trajectory or driving policy can be generated and used to move and navigate the vehicle. Alternatively, the environmental state can be used to increase a confidence of the vehicle in a trajectory that has already been generated. Mismatches can be detected between the prior mapping information for the vehicle and a new mapping necessitated by the degraded environmental conditions. Also, the driver can be notified of the detected situation. The environmental state can be used to update map system information, which is shared with other vehicles, thereby improving the quality of data provided to the other vehicles and facilitating trajectory planning at these other vehicles.
While the above disclosure has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from its scope. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the disclosure without departing from the essential scope thereof. Therefore, it is intended that the present disclosure not be limited to the particular embodiments disclosed, but will include all embodiments falling within the scope thereof.