The present disclosure relates to a system for an autonomous vehicle, where the system predicts a location-based maneuver of a remote vehicle located in a surrounding environment. The system also determines an adaptive maneuver that the autonomous vehicle performs in response to predicting the location-based maneuver of the remote vehicle.
Autonomous vehicles may employ a variety of technologies that collect sensory information to detect their surroundings such as, but not limited to, radar, laser light, global positioning systems (GPS), and cameras. The autonomous vehicle may interpret the sensory information collected by these sensors to identify appropriate navigation paths, as well as obstacles and relevant signage. Autonomous vehicles provide numerous advantages such as, for example, increased roadway capacity and reduced traffic congestion. Autonomous vehicles also relieve vehicle occupants of driving and navigation chores, allowing them to perform other tasks during long journeys or in heavy traffic. However, autonomous vehicles still face some challenges. For example, autonomous vehicles are presently unable to predict the probability that a remote vehicle will perform a maneuver or undergo a change in vehicle speed in the immediate future, which in turn may affect motion planning.
Thus, while current autonomous vehicles achieve their intended purpose, there is a need in the art for an approach for a system that predicts the probability that a remote vehicle will perform a maneuver or undergo a change in vehicle speed in the immediate future.
According to several aspects, a system for an autonomous vehicle that predicts a location-based maneuver of a remote vehicle located in a surrounding environment is disclosed. The system includes one or more vehicle sensors collecting sensory data indicative of one or more vehicles located in the surrounding environment and one or more automated driving controllers in electronic communication with the one or more vehicle sensors. The one or more automated driving controllers execute instructions to monitor the one or more vehicle sensors for the sensory data. The one or more automated driving controllers identify the remote vehicle located in a specific geographical location relative to the autonomous vehicle based on the sensory data. The one or more automated driving controllers determine a lateral distance and a longitudinal distance between the remote vehicle and the autonomous vehicle. The one or more automated driving controllers compare the lateral distance and the longitudinal distance with respective threshold distance values based on the sensory data. In response to determining the lateral distance and the longitudinal distance are less than the respective threshold distance values, the one or more automated driving controllers determine a lane of travel of the remote vehicle based on the sensory data. The one or more automated driving controllers compare the lane of travel of the remote vehicle with a current lane of travel of the autonomous vehicle. In response to determining the lane of travel of the remote vehicle is a different lane than the current lane of the autonomous vehicle, the one or more automated driving controllers predict the location-based maneuver of the remote vehicle based on aggregated vehicle metrics that are based on historical data collected at the specific geographical location.
The one or more automated driving controllers determine an adaptive maneuver that the autonomous vehicle performs in response to predicting the location-based maneuver of the remote vehicle.
In an aspect, the remote vehicle is located in front of the autonomous vehicle and travels in the same direction as the autonomous vehicle.
In another aspect, the location-based maneuver of the remote vehicle is a lane change.
In yet another aspect, the one or more automated driving controllers execute instructions to compare the lateral distance with a maximum threshold lateral distance value that is part of the aggregated vehicle metrics. The one or more automated driving controllers determine the lateral distance is less than the maximum threshold lateral distance value. In response to determining the lateral distance is less than the maximum threshold lateral distance value, the one or more automated driving controllers determine a probability that the remote vehicle performs the lane change from its lane of travel into the current lane in which the autonomous vehicle is located, based on the aggregated vehicle metrics.
In an aspect, the remote vehicle travels in an opposite direction from the autonomous vehicle. The autonomous vehicle and the remote vehicle are both located at a four-way intersection.
In another aspect, the location-based maneuver is a turn at the four-way intersection.
In yet another aspect, the one or more automated driving controllers execute instructions to compare the lateral distance with a maximum threshold lateral distance value that is part of the aggregated vehicle metrics. The one or more automated driving controllers determine the lateral distance is less than the maximum threshold lateral distance value. In response to determining the lateral distance is less than the maximum threshold lateral distance value, the one or more automated driving controllers determine a probability that the remote vehicle performs a turn at the four-way intersection based on the aggregated vehicle metrics.
In an aspect, the adaptive maneuver is either decelerating the autonomous vehicle or having the autonomous vehicle come to a stop.
In another aspect, the historical data is collected over a period of time and represents overall vehicle behavior in the specific geographical location.
In yet another aspect, the historical data accounts for changes in overall vehicle behavior based on a time of day, a day of the week, and zoning rules.
In an aspect, the historical data may include discrete profiles for a unique geographical location based on different times of the day or day of the week.
In another aspect, the adaptive maneuver includes a deceleration or stop, increasing a longitudinal distance between the autonomous vehicle and the remote vehicle, merging left or right, or changing lanes.
In an aspect, a method for predicting a location-based maneuver of a remote vehicle located in a surrounding environment is disclosed. The method includes monitoring, by one or more controllers, one or more vehicle sensors for sensory data. The one or more vehicle sensors are part of an autonomous vehicle and collect sensory data indicative of one or more vehicles located in the surrounding environment. The method includes identifying, by the one or more controllers, the remote vehicle located in a specific geographical location relative to the autonomous vehicle based on the sensory data. The method includes determining a lateral distance and a longitudinal distance between the remote vehicle and the autonomous vehicle. The method includes comparing the lateral distance and the longitudinal distance with respective threshold distance values based on the sensory data. In response to determining the lateral distance and the longitudinal distance are less than the respective threshold distance values, the method includes determining a lane of travel of the remote vehicle based on the sensory data. The method includes comparing the lane of travel of the remote vehicle with a current lane of travel of the autonomous vehicle. In response to determining the lane of travel of the remote vehicle is a different lane than the current lane of the autonomous vehicle, the method includes predicting the location-based maneuver of the remote vehicle based on aggregated vehicle metrics that are based on historical data collected at the specific geographical location relative to the autonomous vehicle. Finally, the method includes determining an adaptive maneuver that the autonomous vehicle performs in response to predicting the location-based maneuver of the remote vehicle.
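The sequence of determinations recited above can be sketched as a single decision function. The threshold values, field names, and maneuver labels below are illustrative assumptions for this sketch, not values taken from the disclosure:

```python
from dataclasses import dataclass

# Hypothetical threshold values; in the disclosure these are part of the
# aggregated vehicle metrics derived from historical data for the location.
LATERAL_THRESHOLD_M = 3.5
LONGITUDINAL_THRESHOLD_M = 50.0

@dataclass
class SensoryData:
    lateral_distance: float       # lateral distance d_lat between vehicles, meters
    longitudinal_distance: float  # longitudinal distance d_long, meters
    remote_lane: str              # lane of travel of the remote vehicle
    ego_lane: str                 # current lane of travel of the autonomous vehicle

def predict_location_based_maneuver(data, aggregated_metrics):
    """Return the most probable maneuver, or None if the gating checks fail.

    aggregated_metrics maps maneuver labels to historical probabilities
    for the specific geographical location.
    """
    # Both distances must be less than their respective threshold values.
    if (data.lateral_distance >= LATERAL_THRESHOLD_M or
            data.longitudinal_distance >= LONGITUDINAL_THRESHOLD_M):
        return None
    # The remote vehicle must be in a different lane than the ego vehicle.
    if data.remote_lane == data.ego_lane:
        return None
    # Predict the maneuver with the highest historical probability.
    return max(aggregated_metrics, key=aggregated_metrics.get)
```

For example, with metrics of `{"continue_straight": 0.80, "turn_right": 0.05, "turn_left": 0.15}` and a remote vehicle within both thresholds in an adjacent lane, the sketch predicts the highest-probability maneuver; an adaptive maneuver would then be selected in response.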
In an aspect, a system for an autonomous vehicle that predicts a change in vehicle speed of a remote vehicle located in a surrounding environment is disclosed. The system includes one or more vehicle sensors collecting sensory data indicative of one or more vehicles located in the surrounding environment; and one or more automated driving controllers in electronic communication with the one or more vehicle sensors. The one or more automated driving controllers execute instructions to monitor the one or more vehicle sensors for the sensory data, and identify the remote vehicle located in a specific geographical location relative to the autonomous vehicle based on the sensory data. The one or more controllers determine a lateral distance and a longitudinal distance between the remote vehicle and the autonomous vehicle. The one or more controllers compare the lateral distance and the longitudinal distance with respective threshold distance values based on the sensory data. In response to determining the lateral distance and the longitudinal distance are less than the respective threshold distance values, the one or more controllers determine a lane of travel of the remote vehicle based on the sensory data. The one or more controllers compare the lane of travel of the remote vehicle with a current lane of travel of the autonomous vehicle. In response to determining the lane of travel of the remote vehicle is a different lane than the current lane of the autonomous vehicle, the one or more controllers predict the change in vehicle speed of the remote vehicle based on aggregated vehicle metrics that are based on historical data collected at the specific geographical location relative to the autonomous vehicle. Finally, the one or more controllers determine an adaptive maneuver that the autonomous vehicle performs in response to predicting the change in vehicle speed of the remote vehicle.
In an aspect, the change in vehicle speed is either a deceleration event or an acceleration event.
In another aspect, the remote vehicle travels in the same direction as the autonomous vehicle.
In yet another aspect, the historical data is collected over a period of time and represents overall vehicle behavior in the specific geographical location.
In an aspect, the historical data accounts for changes in overall vehicle behavior based on a time of day, a day of the week, and zoning rules.
In another aspect, the historical data may include discrete profiles for a unique geographical location based on different times of the day or day of the week.
In yet another aspect, the adaptive maneuver includes a deceleration or stop, increasing a longitudinal distance between the autonomous vehicle and the remote vehicle, merging left or right, or changing lanes.
Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses.
Referring to
As explained below, the location-based maneuver of the remote vehicle 14 that is predicted by the system 12 is either a lane change performed by a remote vehicle 14 located in a position in front of the autonomous vehicle 10, where the autonomous vehicle 10 and the remote vehicle 14 travel in the same direction (seen in
As explained below, the system 12 predicts the location-based maneuver of the remote vehicle 14 based on aggregated vehicle metrics that are based on historical data collected at a specific geographical location where the remote vehicle 14 is presently located. The aggregated vehicle metrics are stored in memory of the one or more automated driving controllers 20 or, in the alternative, by one or more databases 40 that are part of one or more centralized computers 42 located at the back-end office 36. The historical data that the aggregated vehicle metrics are based on is collected over a period of time and is representative of overall vehicle behavior in the specific geographical location. The overall vehicle behavior includes information such as vehicle speed, whether the vehicle accelerated or decelerated, and any possible maneuvers that were performed. In an embodiment, the aggregated vehicle metrics include the probability that the remote vehicle 14 will perform a specific maneuver at the specific geographical location. For example, the aggregated vehicle metrics may indicate an eighty percent probability that a vehicle continues straight at a specific intersection, a five percent probability that the vehicle turns right, and a fifteen percent probability that the vehicle turns left.
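One plausible way to derive such per-maneuver probabilities is to normalize counts of historical maneuver observations at the location. This is a minimal sketch under that assumption; the function name and maneuver labels are illustrative, not from the disclosure:

```python
from collections import Counter

def aggregate_vehicle_metrics(observations):
    """Convert raw historical maneuver observations recorded at one
    geographical location into per-maneuver probabilities."""
    counts = Counter(observations)
    total = sum(counts.values())
    return {maneuver: count / total for maneuver, count in counts.items()}

# 100 historical observations at a hypothetical intersection.
history = (["continue_straight"] * 80 + ["turn_right"] * 5 + ["turn_left"] * 15)
metrics = aggregate_vehicle_metrics(history)
# metrics now holds the 80% / 5% / 15% split used in the example above.
```

The resulting mapping can be stored in controller memory or in the databases 40 and queried by location when a remote vehicle is identified there.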
The historical data accounts for changes in the overall vehicle behavior based on a time of day, a day of the week, and zoning rules. Some examples of zoning rules include, but are not limited to, areas of reduced speed during specific hours of the day such as school zones, and signage forbidding vehicles to perform specific maneuvers such as, for example, turning during a red light. In one embodiment, the historical data may include discrete profiles for a unique geographical location based on different times of the day or day of the week. For example, a first profile may be used during a morning rush hour time during the weekday, a second profile for an evening rush hour time during the weekday, and a third profile for weekends with respect to a unique geographical location. For example, if the specific geographical location is in a school zone, then the probability that a remote vehicle 14 may turn left or right at an intersection in a school zone may be significantly greater during the morning rush hour time during a weekday, as parents drop off their children at school, when compared to other times of the day or on weekends.
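Selecting among such discrete profiles could be sketched as follows. The profile names, rush-hour windows, and the off-peak fallback are assumptions made for illustration; the disclosure only names morning-rush, evening-rush, and weekend profiles:

```python
from datetime import datetime

def select_profile(profiles, when):
    """Pick the historical-data profile for a location based on the
    day of the week and time of day."""
    if when.weekday() >= 5:  # Saturday (5) or Sunday (6)
        return profiles["weekend"]
    if 7 <= when.hour < 9:   # assumed weekday morning rush window
        return profiles["weekday_morning_rush"]
    if 16 <= when.hour < 19:  # assumed weekday evening rush window
        return profiles["weekday_evening_rush"]
    return profiles["weekday_off_peak"]  # assumed fallback profile
```

In a school-zone example, the `weekday_morning_rush` profile would carry elevated turn probabilities relative to the off-peak and weekend profiles for the same intersection.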
Referring to
Referring to
In block 204, the one or more automated driving controllers 20 identify the remote vehicle 14 located in the specific geographical location relative to the autonomous vehicle 10 based on the sensory data. In addition to the specific geographical location, the one or more automated driving controllers 20 also determine a direction of travel of the remote vehicle 14 relative to the autonomous vehicle 10. In the example as shown in
In block 206, the one or more automated driving controllers 20 determine the lateral distance dlat and the longitudinal distance dlong between the remote vehicle 14 and the autonomous vehicle 10 based on the sensory data. The method 200 may then proceed to block 208.
In block 208, the one or more automated driving controllers 20 compare the lateral distance dlat and the longitudinal distance dlong with respective threshold distance values. That is, the lateral distance dlat is compared with a lateral threshold distance value and the longitudinal distance dlong is compared with a longitudinal threshold distance value.
The lateral threshold distance value and the longitudinal threshold distance value are part of the aggregated vehicle metrics that are stored in memory of the one or more automated driving controllers 20 or, in the alternative, by the one or more databases 40. When the lateral distance dlat is less than the lateral threshold distance value and the longitudinal distance dlong is less than the longitudinal threshold distance value, the one or more automated driving controllers 20 determine a potential change in motion of the autonomous vehicle 10. The potential change in motion occurs when the remote vehicle 14 performs the location-based maneuver. For example, in the embodiment as shown in FIG. 2A, the potential change in motion occurs when the remote vehicle 14 changes lanes from the center lane C to the right lane R. In addition to the lateral distance dlat and the longitudinal distance dlong, in an embodiment the potential change is also determined based on factors such as, for example, road shape and speed limit.
In response to determining the lateral distance dlat is less than the lateral threshold distance value and the longitudinal distance dlong is less than the longitudinal threshold distance value, the method 200 may proceed to block 210. Otherwise, the method 200 terminates.
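Since the threshold distance values are stored with the aggregated vehicle metrics, the block 208 comparison can be sketched as a lookup-and-compare step. The dictionary keys below are hypothetical names for this sketch:

```python
def passes_distance_gate(d_lat, d_long, location_metrics):
    """Block 208 gating: both the lateral and longitudinal distances must
    be less than the per-location threshold values retrieved from the
    aggregated vehicle metrics; otherwise the method terminates."""
    return (d_lat < location_metrics["lat_threshold_m"] and
            d_long < location_metrics["long_threshold_m"])

# Hypothetical per-location thresholds stored with the aggregated metrics.
location_metrics = {"lat_threshold_m": 3.5, "long_threshold_m": 50.0}
```

Only when this gate passes does the method go on to determine the remote vehicle's lane of travel; factors such as road shape and speed limit could be folded in as additional conditions in an embodiment.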
In block 210, in response to determining the lateral distance dlat and the longitudinal distance dlong are less than the respective threshold distance values, the one or more automated driving controllers 20 determine a lane of travel of the remote vehicle 14 based on the sensory data. In the example as shown in
In block 212, the one or more automated driving controllers 20 compare the lane of travel of the remote vehicle 14 with a current lane of the autonomous vehicle 10. In response to the one or more automated driving controllers 20 determining both the autonomous vehicle 10 and the remote vehicle 14 are traveling in the same lane, the method 200 may terminate. However, in response to determining the lane of travel of the remote vehicle 14 is different than the current lane of the autonomous vehicle 10, the method 200 may then proceed to block 214.
In block 214, in response to determining the lane of travel of the remote vehicle 14 is different than the current lane of the autonomous vehicle 10, the one or more controllers 20 may predict the location-based maneuver of the remote vehicle 14 based on the aggregated vehicle metrics that are based on historical data collected at the specific geographical location relative to the autonomous vehicle 10.
In the example as shown in
In block 216, the one or more automated driving controllers 20 determine the adaptive maneuver that the autonomous vehicle 10 performs in response to predicting the location-based maneuver of the remote vehicle 14. That is, in the example as shown in
Referring now to
In block 304, the one or more automated driving controllers 20 identify the remote vehicle 14 located in the specific geographical location relative to the autonomous vehicle 10 based on the sensory data. In the embodiment as shown in
In block 306, the one or more automated driving controllers 20 determine the lateral distance dlat and the longitudinal distance dlong between the remote vehicle 14 and the autonomous vehicle 10. The method 300 may then proceed to block 308.
In block 308, the one or more automated driving controllers 20 compare the lateral distance dlat and the longitudinal distance dlong with respective threshold distance values. That is, the lateral distance dlat is compared with the lateral threshold distance value and the longitudinal distance dlong is compared with the longitudinal threshold distance value. In response to determining the lateral distance dlat is less than the lateral threshold distance value and the longitudinal distance dlong is less than the longitudinal threshold distance value, the method 300 may proceed to block 310. Otherwise, the method 300 terminates.
In block 310, in response to determining the lateral distance dlat and the longitudinal distance dlong are less than the respective threshold distance values, the one or more automated driving controllers 20 determine a lane of travel of the remote vehicle 14 based on the sensory data. In the example as shown in
In block 312, the one or more automated driving controllers 20 compare the lane of travel of the remote vehicle 14 with a current lane of the autonomous vehicle 10. In response to the one or more automated driving controllers 20 determining both the autonomous vehicle 10 and the remote vehicle 14 are traveling in the same lane, the method 300 may terminate. However, in response to determining the lane of travel of the remote vehicle 14 is different than the current lane of the autonomous vehicle 10, the method 300 may then proceed to block 314.
In block 314, in response to determining the lane of travel of the remote vehicle 14 is different than the current lane of the autonomous vehicle 10, the one or more automated driving controllers 20 predict the location-based maneuver of the remote vehicle 14 based on the aggregated vehicle metrics.
In the example as shown in
In block 316, the one or more automated driving controllers 20 determine the adaptive maneuver that the autonomous vehicle 10 performs in response to predicting the location-based maneuver of the remote vehicle 14. That is, in the example as shown in
Referring now to
In block 404, the one or more automated driving controllers 20 identify the remote vehicle 14 located in the specific geographical location relative to the autonomous vehicle 10 based on the sensory data. In addition to the specific geographical location, the one or more automated driving controllers 20 also determine a direction of travel of the remote vehicle 14 relative to the autonomous vehicle 10. In the example as shown in
In block 406, the one or more automated driving controllers 20 determine the lateral distance dlat and the longitudinal distance dlong between the remote vehicle 14 and the autonomous vehicle 10. The method 400 may then proceed to block 408.
In block 408, the one or more automated driving controllers 20 compare the lateral distance dlat and the longitudinal distance dlong with respective threshold distance values. That is, the lateral distance dlat is compared with the lateral threshold distance value and the longitudinal distance dlong is compared with the longitudinal threshold distance value. In response to determining the lateral distance dlat is less than the lateral threshold distance value and the longitudinal distance dlong is less than the longitudinal threshold distance value, the method 400 may proceed to block 410. Otherwise, the method 400 terminates.
In block 410, in response to determining the lateral distance dlat and the longitudinal distance dlong are less than the respective threshold distance values, the one or more automated driving controllers 20 determine a lane of travel of the remote vehicle 14 based on the sensory data. In the example as shown in
In block 412, the one or more automated driving controllers 20 compare the lane of travel of the remote vehicle 14 with a current lane of the autonomous vehicle 10. In response to the one or more automated driving controllers 20 determining both the autonomous vehicle 10 and the remote vehicle 14 are traveling in different lanes, the method 400 may terminate. However, in response to determining the lane of travel of the remote vehicle 14 is the same as the current lane of the autonomous vehicle 10, the method 400 may then proceed to block 414.
In block 414, in response to determining the lane of travel of the remote vehicle 14 is the same as the current lane of the autonomous vehicle 10, the one or more automated driving controllers 20 predict the change in vehicle speed of the remote vehicle 14 based on the aggregated vehicle metrics that are based on historical data collected at the specific geographical location relative to the autonomous vehicle 10. In the example as shown in
In block 416, the one or more automated driving controllers 20 determine the adaptive maneuver that the autonomous vehicle 10 performs in response to predicting the change in vehicle speed of the remote vehicle 14. That is, in the example as shown in
Referring generally to the figures, the disclosed system provides various technical effects and benefits by providing an approach to predict the behavior of vehicles surrounding the host or autonomous vehicle. The prediction is determined based on aggregated vehicle metrics that are based on historical data collected at the specific geographical location of the remote vehicle. The system also determines adaptive maneuvers for the autonomous vehicle to perform to accommodate the behavior of the remote vehicle. Thus, the disclosed system anticipates likely maneuvers by surrounding vehicles and instructs the autonomous vehicle to react to the likely maneuvers, thereby allowing the autonomous vehicle to operate more naturalistically in traffic.
The controllers may refer to, or be part of, an electronic circuit, a combinational logic circuit, a field programmable gate array (FPGA), a processor (shared, dedicated, or group) that executes code, or a combination of some or all of the above, such as in a system-on-chip. Additionally, the controllers may be microprocessor-based such as a computer having at least one processor, memory (RAM and/or ROM), and associated input and output buses. The processor may operate under the control of an operating system that resides in memory. The operating system may manage computer resources so that computer program code embodied as one or more computer software applications, such as an application residing in memory, may have instructions executed by the processor. In an alternative embodiment, the processor may execute the application directly, in which case the operating system may be omitted.
The description of the present disclosure is merely exemplary in nature and variations that do not depart from the gist of the present disclosure are intended to be within the scope of the present disclosure. Such variations are not to be regarded as a departure from the spirit and scope of the present disclosure.