This disclosure relates generally to vehicle driver assist or autonomous driving systems. More specifically, this disclosure relates to vehicle path prediction and closest in path vehicle detection.
Advanced driving assist system (ADAS) features, which use automated technology to assist the vehicle operator in driving and parking, form a foundation for autonomous driving (AD). Determination of vehicle position information and/or detection of nearby objects enables features such as: collision detection and avoidance for adaptive cruise control (ACC) and emergency braking; blind spot detection for collision warning and/or evasive steering; lane detection for lane keeping and/or centering, lane changing, or lane departure warning; and path planning and control. Other ADAS and AD features may also be implemented using the same sensor set(s).
Electric vehicles (EVs) are often capable of higher driving and handling performance relative to conventional vehicles. EV designs can include low centers of gravity, independent steering, and immediate, quick, and smooth acceleration. As a result, ADAS and AD features for EVs can involve different considerations than those for conventional vehicles.
Vehicle path prediction and closest in path vehicle detection for collision avoidance, within the vehicle's ADAS or AD features, are improved in ways suited to EVs having higher driving and handling performance. Predicted path curvature is calculated under both a low-vehicle-speed assumption and a high-vehicle-speed assumption, and the two path curvatures and the corresponding derived rates of curvature are combined in a weighted manner based on the vehicle's current speed. The weighted combinations of predicted path curvature and rate more accurately predict the vehicle's path, improving identification of the closest in path vehicle for evaluation of collision potential. Steering and/or braking actuator(s) may be activated, if necessary, to avoid collision with the identified closest in path vehicle.
In one embodiment, an apparatus includes at least one camera configured to capture an image of a traffic lane in front of a vehicle. The apparatus also includes a radar transceiver configured to detect one or more target vehicles proximate to the vehicle. The apparatus further includes a path prediction and vehicle detection controller configured to determine first parameters for predicting a path of the vehicle; determine second parameters for predicting the path of the vehicle; predict the path of the vehicle using a combination of the first parameters and the second parameters, where the combination is weighted based on a speed of the vehicle; identify one of the one or more target vehicles as a closest in path vehicle based on the predicted path of the vehicle; and activate at least one of a braking control and a steering control based on a proximity of the identified closest in path vehicle.
The first parameters may include a first path curvature for predicting the path of the vehicle and a first rate of the first path curvature, and the second parameters may include a second path curvature for predicting the path of the vehicle and a second rate of the second path curvature. The first and second parameters may be accurate for different ranges of speed. The weighted combination may weight the first parameters using a weight α and may weight the second parameters using a weight 1−α. The weight α may be applied to the first parameters for vehicle speeds below a first threshold, and the weight 1−α may be applied to the second parameters for vehicle speeds above a second threshold. The weight α and the weight 1−α may vary linearly with vehicle speed between the first and second thresholds. The first parameters may include a first path curvature κL and a first rate κ′L, the second parameters may include a second path curvature κH and a second rate κ′H, the first path curvature κL and the second path curvature κH may be combined according to κ=α·κL+(1−α)·κH, and the first rate κ′L and the second rate κ′H may be combined according to κ′=α·κ′L+(1−α)·κ′H. The path prediction and vehicle detection controller may be configured to generate at least one of a braking command and a steering command based on the speed of the vehicle and a proximity of the identified closest in path vehicle.
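As a minimal sketch of the weighted combination just described (written in Python for illustration; the disclosure itself provides no code, and the function and variable names are assumptions), the fused parameters follow κ = α·κL + (1−α)·κH and κ′ = α·κ′L + (1−α)·κ′H:

```python
def fuse_path_parameters(kappa_l, kappa_h, rate_l, rate_h, alpha):
    """Weighted combination of the low-speed (kappa_l, rate_l) and
    high-speed (kappa_h, rate_h) path parameters, where alpha in
    [0, 1] is derived from the vehicle speed."""
    kappa = alpha * kappa_l + (1.0 - alpha) * kappa_h
    kappa_rate = alpha * rate_l + (1.0 - alpha) * rate_h
    return kappa, kappa_rate
```

With α = 1 the prediction relies entirely on the low-speed parameters, and with α = 0 entirely on the high-speed parameters.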
In another embodiment, a vehicle includes the apparatus and a motor configured to drive wheels of the vehicle. The vehicle also includes a chassis supporting axles on which the wheels are mounted. The steering control may be configured to generate a steering command configured to control the wheels when the steering control is activated based on the proximity of the identified closest in path vehicle. A brake actuator is configured to actuate brakes for one or more of the wheels, and the brake actuator may be configured to receive a braking control signal from the braking control when the braking control is activated based on the proximity of the identified closest in path vehicle. The vehicle may be an electric vehicle.
In still another embodiment, a method includes capturing an image of a traffic lane in front of a vehicle using at least one camera; detecting one or more target vehicles proximate to the vehicle using a radar transceiver; determining first parameters for predicting a path of the vehicle; determining second parameters for predicting the path of the vehicle; predicting the path of the vehicle using a combination of the first parameters and the second parameters, where the combination is weighted based on a speed of the vehicle; identifying one of the one or more target vehicles as a closest in path vehicle based on the predicted path of the vehicle; and activating at least one of a braking control and a steering control based on a proximity of the identified closest in path vehicle.
The first parameters may include a first path curvature for predicting the path of the vehicle and a first rate of the first path curvature, and the second parameters may include a second path curvature for predicting the path of the vehicle and a second rate of the second path curvature. The first and second parameters may be accurate for different ranges of speed. The weighted combination may weight the first parameters using a weight α and may weight the second parameters using a weight 1−α. The weight α may be applied to the first parameters for vehicle speeds below a first threshold, and the weight 1−α may be applied to the second parameters for vehicle speeds above a second threshold. The weight α and the weight 1−α may vary linearly with vehicle speed between the first and second thresholds. The first parameters may include a first path curvature κL and a first rate κ′L, the second parameters may include a second path curvature κH and a second rate κ′H, the first path curvature κL and the second path curvature κH may be combined according to κ=α·κL+(1−α)·κH, and the first rate κ′L and the second rate κ′H may be combined according to κ′=α·κ′L+(1−α)·κ′H. At least one of a braking command and a steering command may be generated based on the speed of the vehicle and a proximity of the identified closest in path vehicle.
The method may further include driving wheels of the vehicle with a motor; generating a steering command controlling the wheels when the steering control is activated based on the proximity of the identified closest in path vehicle; and actuating brakes for one or more of the wheels when the braking control is activated based on the proximity of the identified closest in path vehicle. The vehicle may be an electric vehicle.
Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
For a more complete understanding of this disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts.
Vehicle path prediction, planning, and/or control within ADAS or AD features often needs to account for the closest in path vehicle (CIPV). However, due to the characteristics of radar reflections off surrounding stationary or moving objects, it is difficult for a vehicle using ADAS or AD to detect and track the object of main interest. Selecting the radar-detected object that is "closest" to the vehicle often requires accounting for relevant information relating to the detected target vehicle(s), such as relative distance and velocity, as well as the vehicle's predicted path.
With kinematics, the curvature of the vehicle's predicted path (e.g., the occupied traffic lane) can be obtained from the vehicle's lateral acceleration and speed. For example, lateral vehicle acceleration Ay (in units of meters per second squared [m/s²]), path curvature κ (in units of [m⁻¹]), and vehicle speed Vx (in units of [m/s]) are related as:

Ay = κ·Vx²

As an illustration, at Vx = 20 m/s on a curve of radius 200 m (κ = 0.005 m⁻¹), the lateral acceleration is Ay = 0.005·20² = 2 m/s².
However, the lateral acceleration signal is noisy and must be filtered, and this relationship does not consider the vehicle's dynamic effects.
The Ackerman steer angle relationship gives the path curvature from the road wheel angle δrwa (in units of radians [rad]) and the wheelbase length L (in units of [m]) as follows:

κ = δrwa/L

For instance, a road wheel angle of 0.015 rad with a wheelbase of 3 m gives κ = 0.005 m⁻¹.
However, this relationship neglects tire slip and is therefore not considered accurate for high-speed maneuvers.
The vehicle 100 of FIG. 1 includes a passenger compartment or cabin 101.
Passengers may enter and exit the cabin 101 through at least one door 102 forming part of the cabin 101. A transparent windshield 103 and other transparent panels mounted within and forming part of the cabin 101 allow at least one passenger (referred to as the “operator,” even when the vehicle 100 is operating in an AD mode) to see outside the cabin 101. Rear view mirrors 104 mounted to sides of the cabin 101 enable the operator to see objects to the sides and rear of the cabin 101 and may include warning indicators (e.g., selectively illuminated warning lights) for ADAS features such as blind spot warning (indicating that another vehicle is in the operator's blind spot) and/or lane departure warning.
Wheels 105 mounted on axles that are supported by the chassis and driven by the motor(s) (all not visible in FIG. 1) allow the vehicle 100 to move.
In the present disclosure, the vehicle 100 includes a vision system including at least a front camera 106, side cameras 107 (mounted on the bottoms of the rear view mirrors 104 in the example depicted), and a rear camera. The cameras 106, 107 provide images to the vehicle control system for use as part of ADAS and AD features as described below, and the images may optionally be displayed to the operator. In addition, the vehicle 100 includes a radar transceiver 120 (shown in phantom in FIG. 1).
Although FIG. 1 illustrates one example of a vehicle 100, various changes may be made to FIG. 1.
By way of example, power doors on a vehicle may be operated by an ECU called the body control module (not shown in FIG. 2).
Notably, vehicle control systems are migrating to higher-speed networks with an Ethernet-like bus for which each ECU is assigned an Internet protocol (IP) address. Among other things, this may allow both centralized vehicle ECUs and remote computers to pass around huge amounts of information and participate in the Internet of Things (IoT).
In the example shown in FIG. 2, the vehicle 100 includes a vehicle control system 200.
For the present disclosure, the vehicle control system 200 includes an image processing module (IPM) CAN 211 to which the front camera ECU 216, side camera ECU 217, and rear camera ECU 218 are connected. The front camera ECU 216 receives image data from the front camera 106 on the vehicle 100, while the side camera ECU 217 receives image data from each of the side cameras 107, and the rear camera ECU 218 receives image data from the rear camera. In some embodiments, a separate ECU may be used for each camera, such that two side camera ECUs may be employed. The IPM CAN 211 and the front camera ECU 216, side camera ECU 217, and rear camera ECU 218 process image data for use in vision-based ADAS features, such as providing a rear back-up camera display and/or stitching together the images to create a “bird's eye” view of the vehicle's surroundings.
For the present disclosure, the vehicle control system 200 also includes a radar CAN 220 to which a radar ECU 221 and a radar transceiver are connected. The radar CAN 220, radar ECU 221, and radar transceiver are used to detect objects around the vehicle 100 and to measure the relative distance to and velocity of those objects.
Although FIG. 2 illustrates one example of a vehicle control system 200, various changes may be made to FIG. 2.
To support various ADAS functions such as collision avoidance during high performance operation, the IPM CAN 211 for the vehicle 100 can accurately predict the vehicle path, and the radar CAN 220 can detect the closest in path object to determine whether collision is likely. In the present disclosure, a combination of radar and vision (with optional input from other sensors) is used to detect and track target vehicles within the ego vehicle path (where “ego” refers to the vehicle implementing the ADAS and/or AD feature(s)).
To support ADAS and AD features, the system 300 includes the functions of camera perception and radar 301, ego and target vehicle behavior prediction 302, decision and motion planning 303, and motion control 304. Camera perception and radar 301 detects a traffic lane ahead and the relative position and velocity of other vehicles, while vehicle behavior prediction 302 determines whether the ego vehicle could potentially collide with another vehicle ahead in the ego vehicle's path based on the ego vehicle's speed, the predicted path, and the relative position and velocity of each detected target vehicle. Decision and motion planning 303 and motion control 304 respectively determine and, if necessary, implement reactive responses to the ego vehicle's possible collision with a target vehicle, such as evasive steering and/or emergency braking.
Camera perception and radar 301 performs radar detection 305 to determine a list of surrounding objects, the direction and distance of each object detected from the ego vehicle, and the velocity of each object relative to the ego vehicle. Objects may be identified as a target vehicle based, for example, on the relative velocity of the corresponding object. Camera perception and radar 301 also determines motion by the ego vehicle 306, such as the ego vehicle speed, yaw rate, lateral offset from a reference path, longitudinal acceleration, and steering angle. In some embodiments, these ego vehicle parameters may be determined from vision, radar, other sensors such as an inertial measurement unit (IMU), or some combination thereof.
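To make these interfaces concrete, the following Python sketch shows one possible representation of the radar object list and the ego vehicle motion parameters described above; all type and field names are illustrative assumptions rather than structures from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class RadarTarget:
    """One entry in the radar object list (illustrative fields)."""
    bearing_rad: float     # direction from the ego vehicle
    distance_m: float      # range from the ego vehicle
    rel_speed_mps: float   # velocity relative to the ego vehicle

@dataclass
class EgoMotion:
    """Ego vehicle motion parameters (illustrative fields)."""
    speed_mps: float
    yaw_rate_rps: float
    lateral_offset_m: float
    longitudinal_accel_mps2: float
    steering_angle_rad: float
```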
Based on the ego vehicle motion parameters 306, behavior prediction 302 predicts the ego vehicle's motion 307, such as path curvature of the predicted path and rate of curvature of the predicted path. Using the predicted ego vehicle motion 307 and the object list from radar detection 305, behavior prediction 302 determines a closest in path vehicle 308, which involves using the ego vehicle's predicted path to filter target vehicles and determine which target vehicle is closest along the ego vehicle's predicted path.
The ego vehicle's predicted path may be represented as a third-order polynomial:

y = a0 + a1x + a2x² + a3x³,
where x is distance along the longitudinal direction of the ego vehicle, y is distance along the lateral direction, a0 is the ego vehicle lateral offset from the reference path, a1 is the ego vehicle heading offset from the reference path, ½a2 is the curvature of the predicted (and reference) path to be found, and ⅙a3 is the rate of curvature to be found. When the ego vehicle 100 travels along the reference path (the lane centerline), the above polynomial (with a0 = 0) represents the predicted lane centerline 403 shown in FIG. 4.
Using the path prediction represented by the above polynomial, the closest in path vehicle from among the target vehicles 404, 405, and 406 shown in FIG. 4 can be identified.
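One way to implement this selection is sketched below in Python: each target's lateral deviation from the predicted path is evaluated at the target's longitudinal distance, targets within a lane-width tolerance are treated as in-path, and the nearest of those is returned. The helper names, the ego-frame (x, y) target representation, and the lane-width constant are assumptions for illustration:

```python
import math

def predicted_lateral_position(x, a0, a1, a2, a3):
    """Evaluate the third-order path polynomial y(x) at longitudinal distance x."""
    return a0 + a1 * x + a2 * x**2 + a3 * x**3

def closest_in_path_vehicle(targets, coeffs, half_lane_width_m=1.8):
    """Pick the nearest target lying within half a lane width of the
    predicted path (half_lane_width_m is an illustrative tolerance).

    targets: iterable of (x, y) positions in the ego frame, x forward.
    coeffs:  (a0, a1, a2, a3) path polynomial coefficients.
    """
    cipv, best_range = None, math.inf
    for (x, y) in targets:
        if x <= 0.0:            # ignore targets behind the ego vehicle
            continue
        path_y = predicted_lateral_position(x, *coeffs)
        if abs(y - path_y) > half_lane_width_m:
            continue            # not within the predicted path corridor
        rng = math.hypot(x, y)
        if rng < best_range:
            cipv, best_range = (x, y), rng
    return cipv
```

For example, with coeffs = (0, 0, 0.001, 0), a target at (40, 0.2) lies within the corridor and is returned, while one at (25, 3.5) is rejected as out of lane.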
Vehicle kinematics 501, vehicle dynamics 502, and weighting 503 may be implemented as part of behavior prediction 302 and/or decision and motion planning 303. Vehicle kinematics 501 receive as inputs 504 the ego vehicle steering angle, speed, and yaw rate. Vehicle dynamics 502 receive as inputs 505 the ego vehicle steering angle and speed. Weighting 503 receives as an input 506 the ego vehicle speed.
In some embodiments, the ego vehicle path prediction is made with a third-order polynomial including both curvature and rate of curvature. Two candidate curvatures, with corresponding rates of curvature, may be obtained using the Ackerman angle, kinematics, and vehicle dynamics. The final curvature and rate of curvature may then be determined by fusing the two candidates based on the vehicle speed.
The path curvature κL can be expressed from the Ackerman angle as:

κL = δrwa/L
The rate of that path curvature κ′L can be derived by differentiation as:

κ′L = δ̇rwa/L
where the derivative of the road wheel angle δ̇rwa can be obtained from the first-order delay between the road wheel angle δrwa and the steering wheel angle δswa, characterized by a time delay τ and a steering ratio k:

δrwa(s)/δswa(s) = 1/(k·(τ·s + 1)),

which can be written in the time domain as:

δ̇rwa(t) = (δswa(t)/k − δrwa(t))/τ
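A minimal discrete-time sketch of this first-order steering lag, assuming the reconstructed form above (forward-Euler integration; the values of τ, k, and dt are illustrative):

```python
def road_wheel_angle_step(delta_rwa, delta_swa, tau=0.1, k=15.0, dt=0.01):
    """One forward-Euler step of the first-order steering lag
    d(delta_rwa)/dt = (delta_swa / k - delta_rwa) / tau."""
    return delta_rwa + dt * (delta_swa / k - delta_rwa) / tau
```

Iterating this update from delta_rwa = 0 with a constant delta_swa converges toward delta_swa/k with time constant τ.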
The path curvature κH can also be expressed from kinematics using the yaw rate ω and vehicle speed Vx:

κH = ω/Vx
The rate of path curvature κ′H can be derived by differentiation as:

κ′H = ω̇/Vx − (ω·V̇x)/Vx²
where ω̇ can be obtained from bicycle dynamics. The integrated system model with first-order delay and bicycle dynamics can be expressed, in one standard form, as:

δ̇rwa = (δswa/k − δrwa)/τ
β̇ = −ω + [Cf·(δrwa − β − lf·ω/Vx) + Cr·(−β + lr·ω/Vx)]/(m·Vx)
ω̇ = [lf·Cf·(δrwa − β − lf·ω/Vx) − lr·Cr·(−β + lr·ω/Vx)]/Iz
where β is the side slip angle, ω is the yaw rate, Cf and Cr are the front and rear cornering stiffnesses, respectively, lf and lr are the front and rear axle distances from the vehicle center of gravity, respectively, m is the vehicle mass, and Iz is the yaw rotational inertia.
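The following Python sketch evaluates the bicycle-model derivatives in the standard linear-tire form given above; the vehicle parameter values are illustrative assumptions:

```python
def bicycle_derivatives(beta, omega, delta_rwa, vx,
                        cf=8.0e4, cr=9.0e4,    # cornering stiffnesses [N/rad]
                        lf=1.4, lr=1.5,        # CG-to-axle distances [m]
                        m=1900.0, iz=3200.0):  # mass [kg], yaw inertia [kg*m^2]
    """Return (beta_dot, omega_dot) for the linear bicycle model.

    vx must be positive; small-angle tire slip approximations are used.
    """
    alpha_f = delta_rwa - beta - lf * omega / vx  # front tire slip angle
    alpha_r = -beta + lr * omega / vx             # rear tire slip angle
    fyf = cf * alpha_f                            # front lateral force
    fyr = cr * alpha_r                            # rear lateral force
    beta_dot = -omega + (fyf + fyr) / (m * vx)
    omega_dot = (lf * fyf - lr * fyr) / iz
    return beta_dot, omega_dot
```

The ω̇ returned by such a model supplies the yaw acceleration used in the rate-of-curvature expression for κ′H above.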
Accordingly, vehicle kinematics 501 output two curvatures 507, κL and κH. Vehicle dynamics 502 employ those two curvatures to derive two rates of curvature 508, κ̇L and κ̇H (or, equivalently, κ′L and κ′H). The relationship among vehicle speed Vx, steering wheel angle δswa, and road wheel angle δrwa may be provided in a mapping table 509 as illustrated in FIG. 5.
The final curvature used for ego vehicle path prediction can be determined from the curvatures calculated from the Ackerman angle and from kinematics, and the final rate of curvature can be derived from the corresponding rates. For example, this may be accomplished by applying weights α and 1−α as follows:
y = κx² + κ′x³
κ=α·κL+(1−α)·κH
κ′=α·κ′L+(1−α)·κ′H
The weights α and 1−α can be applied by weighting 503 based on vehicle speed according to tuning parameters νfading,start and νfading,width as shown in FIG. 5. For example, the weight α may equal 1 for vehicle speeds below νfading,start, decrease linearly across the fading region, and equal 0 for speeds above νfading,start + νfading,width.
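A minimal Python sketch of this speed-dependent fading, assuming the linear variation described earlier and interpreting νfading,width as the width of the transition region (an interpretation based on the parameter name):

```python
def fading_weight(v, v_fading_start, v_fading_width):
    """Weight alpha applied to the low-speed path parameters.

    alpha is 1 below v_fading_start, falls linearly across the fading
    region, and is 0 above v_fading_start + v_fading_width.
    """
    if v <= v_fading_start:
        return 1.0
    if v >= v_fading_start + v_fading_width:
        return 0.0
    return 1.0 - (v - v_fading_start) / v_fading_width
```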
The third-order polynomial above for ego vehicle path prediction can be completed with the following coefficients:
a0=0
a1=0
a2=2κ
a3=6κ′
Note that a0 = 0 and a1 = 0 when the ego vehicle follows the reference path (the predicted lane centerline 403). The left and right lane boundaries 401, 402 have the same coefficients a1, a2, and a3 but different lateral offsets a0.
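For illustration, the completed coefficients can be assembled directly from the fused curvature and rate (a Python sketch; the function name is an assumption), and the resulting tuple can feed a path evaluation such as the closest in path vehicle selection sketched earlier:

```python
def path_coefficients(kappa, kappa_rate):
    """Coefficients of the predicted-path polynomial along the
    reference path: a0 = a1 = 0, a2 = 2*kappa, a3 = 6*kappa_rate."""
    return (0.0, 0.0, 2.0 * kappa, 6.0 * kappa_rate)
```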
The example process 600 illustrated in FIG. 6 begins with detection of surrounding objects and identification of target vehicles using radar (step 601) and determination of the ego vehicle's motion parameters, such as speed, yaw rate, and steering angle (step 602).
A kinematics control (e.g., kinematics 501) is used to determine path curvatures according to kinematics and Ackerman steering angle, and a dynamics control (e.g., dynamics 502) is used to determine rates of the two curvatures (step 603). Vehicle speed-dependent weighting is applied (e.g., by weighting 503) to determine final predicted path curvature and rate of curvature so that the predicted path is determined (step 604). Based on the predicted path and relative direction and distance of each identified target vehicle, a closest in path vehicle is determined (step 605).
Optionally, the process 600 includes a check of whether the vehicle speed (determined, for example, from the input signal for the speedometer within the dashboard ECU 209) exceeds a value determined based on distance to and speed of the closest in path vehicle (step 606). If not, another iteration of the process is started. If so, the process activates a brake control (step 607) or illuminates a warning indicator until the vehicle speed is sufficiently reduced, and another iteration of the process is started.
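One simple, hypothetical realization of the check in step 606 uses a constant-deceleration stopping model; the reaction time and deceleration limit are illustrative assumptions, not values from the disclosure:

```python
def speed_exceeds_safe_value(ego_speed_mps, cipv_distance_m, cipv_speed_mps,
                             reaction_time_s=1.0, max_decel_mps2=6.0):
    """Return True if the ego vehicle could not comfortably slow to the
    closest in path vehicle's speed within the available distance
    (constant-deceleration model with a reaction-time allowance)."""
    closing_speed = max(0.0, ego_speed_mps - cipv_speed_mps)
    # Distance covered while reacting plus braking to match the CIPV speed.
    required = (ego_speed_mps * reaction_time_s
                + closing_speed**2 / (2.0 * max_decel_mps2))
    return required > cipv_distance_m
```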
The determination of two curvatures based on kinematics and the Ackerman steering angle, together with two rates of curvature, and the use of weighted combinations of both for path prediction, helps to detect and track the closest in path target object(s) so that prediction, planning, and control may take advantage of that information. In some embodiments, radar and sensor fusion is used so that there is little or no increase in cost or difficulty in manufacturability and assembly.
It may be advantageous to set forth definitions of certain words and phrases used throughout this patent document. The terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation. The term “or” is inclusive, meaning and/or. The phrase “associated with,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, have a relationship to or with, or the like. The phrase “at least one of,” when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C.
The description in this patent document should not be read as implying that any particular element, step, or function is an essential or critical element that must be included in the claim scope. Also, none of the claims is intended to invoke 35 U.S.C. § 112(f) with respect to any of the appended claims or claim elements unless the exact words “means for” or “step for” are explicitly used in the particular claim, followed by a participle phrase identifying a function. Use of terms such as (but not limited to) “mechanism,” “module,” “device,” “unit,” “component,” “element,” “member,” “apparatus,” “machine,” “system,” “processor,” “processing device,” or “controller” within a claim is understood and intended to refer to structures known to those skilled in the relevant art, as further modified or enhanced by the features of the claims themselves, and is not intended to invoke 35 U.S.C. § 112(f).
While this disclosure has described certain embodiments and generally associated methods, alterations and permutations of these embodiments and methods will be apparent to those skilled in the art. Accordingly, the above description of example embodiments does not define or constrain this disclosure. Other changes, substitutions, and alterations are also possible without departing from the spirit and scope of this disclosure, as defined by the following claims.
This application is related to the subject matter of: U.S. patent application Ser. No. ______/______,______ filed ______, 2021 and entitled SYSTEM AND METHOD IN THE PREDICTION OF TARGET VEHICLE BEHAVIOR BASED ON IMAGE FRAME AND NORMALIZATION (Attorney Docket CNOO01-00048); U.S. patent application Ser. No. ______/______,______ filed ______, 2021 and entitled SYSTEM AND METHOD IN DATA-DRIVEN VEHICLE DYNAMIC MODELING FOR PATH-PLANNING AND CONTROL (Attorney Docket CNOO01-00049); U.S. patent application Ser. No. ______/______,______ filed ______, 2021 and entitled SYSTEM AND METHODS OF INTEGRATING VEHICLE KINEMATICS AND DYNAMICS FOR LATERAL CONTROL FEATURE AT AUTONOMOUS DRIVING (Attorney Docket CNOO01-00050); U.S. patent application Ser. No. ______/______,______ filed ______, 2021 and entitled SYSTEM AND METHOD IN LANE DEPARTURE WARNING WITH FULL NONLINEAR KINEMATICS AND CURVATURE (Attorney Docket CNOO01-00052); U.S. patent application Ser. No. ______/______,______ filed ______, 2021 and entitled SYSTEM AND METHOD FOR LANE DEPARTURE WARNING WITH EGO MOTION AND VISION (Attorney Docket CNOO01-00066). The content of the above-identified patent documents is incorporated herein by reference.