1. Field of the Invention
This invention relates generally to a system for providing vehicle lateral stability control and, more particularly, to a system for providing vehicle lateral stability control that integrates vehicle dynamics control from sensor measurements and target path projections, and path tracking control that integrates vehicle kinematics control with vehicle dynamics control.
2. Discussion of the Related Art
Vehicle dynamics typically refers to the yaw, side-slip and roll of a vehicle, and vehicle kinematics typically refers to vehicle path and lane tracking. Vehicle stability control systems are known in the art for providing stability control based on vehicle dynamics. Further, lane keeping and/or lane tracking systems are known that use vehicle kinematics. If the vehicle is traveling along a curve where the road surface has a low coefficient of friction because of ice or snow, vehicle dynamics and kinematics are both important. Conventionally, vehicle dynamics control and vehicle kinematics control have been performed separately and independently; they may be coordinated by a supervisory controller, but only to the extent that they do not interfere with each other.
A typical vehicle stability control system relies solely on the driver steering input to generate a control command for steering assist and/or differential braking. However, driver response and style vary greatly, and there is no reliable way to identify a driver's skill level and driving style to determine how the driver is handling a particular driving situation. Contributing factors include driver incapacity, lack of experience, panic situations, etc.
Further, during a path tracking maneuver, the vehicle may encounter stability problems because of sensor data quality issues, such as noise, slow throughput and environmental disturbances. Also, because the road surface condition is unknown, and typically is not considered in path-tracking control, a control design tuned for a high coefficient of friction surface may generate significant vehicle oscillation, or even instability, for a vehicle traveling on a low coefficient of friction surface.
In accordance with the teachings of the present invention, a vehicle lateral control system is disclosed that integrates both vehicle dynamics control and kinematics control. The system includes a driver interpreter that generates desired vehicle dynamics and a predicted vehicle path based on driver input. Error signals between the desired and measured vehicle dynamics, and between the predicted vehicle path and the measured vehicle path, are sent to dynamics and kinematics control processors, respectively, for generating separate dynamics and kinematics command signals. The command signals are integrated by a control integration processor that combines the commands and reduces the error signals to stabilize the vehicle as well as track the path. The integrated command signal can be used to control front-wheel steering assist, rear-wheel steering assist and/or differential braking.
Additional features of the present invention will become apparent from the following description and appended claims, taken in conjunction with the accompanying drawings.
The following discussion of the embodiments of the invention directed to a vehicle lateral control system that combines both vehicle dynamics control and kinematics control is merely exemplary in nature, and is in no way intended to limit the invention or its applications or uses.
The system 10 generates an integrated control command that is sent to an actuator 12 to assist the driver in controlling the vehicle to provide the lateral stability control and path tracking control. The actuator 12 is intended to be any one or more of several control actuators used in vehicle stability control systems, such as front-wheel steering assist actuators, rear-wheel steering assist actuators, differential braking actuators, etc., all well known to those skilled in the art.
For the discussion below, the following nomenclature is used:
a: distance between the vehicle front axle and the vehicle center of gravity;
b: distance between the vehicle rear axle and the vehicle center of gravity;
C_f: vehicle front tire cornering stiffness;
C_r: vehicle rear tire cornering stiffness;
I_z: vehicle yaw moment of inertia about the center of gravity;
L: feedback gain of a state observer;
m: vehicle mass;
r: vehicle yaw rate;
u: vehicle speed;
v_y: vehicle lateral speed;
x: system state variables;
δ_f: vehicle front wheel angle; and
δ_r: vehicle rear wheel angle.
The system 10 includes a hand-wheel angle sensor 14 that measures the angle of the vehicle hand-wheel to provide a signal indicative of the driver's steering intent. The hand-wheel angle sensor 14 is one known device that can provide the driver's steering intent. Those skilled in the art will recognize that other types of sensors, such as road wheel angle sensors, can also be employed for this purpose. Also, the driver input can be a braking input or a throttle input in other embodiments.
The signal from the hand-wheel angle sensor 14 is provided to a driver interpreter 16. The driver interpreter 16 includes a command interpreter processor 20 that interprets the driver input as a desired yaw rate and/or side-slip (rate) based on the hand-wheel angle signal. In other words, the processor 20 translates the driver steering input into desired vehicle dynamics. In one non-limiting embodiment, the command interpreter processor 20 uses a two-degree-of-freedom bicycle model for a high coefficient of friction surface, well known to those skilled in the art. The desired yaw rate and/or desired side-slip (rate) signals are sent to a subtractor 24.
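For illustration only, a commonly used steady-state mapping from front wheel angle to desired yaw rate under a two-degree-of-freedom bicycle model on a high coefficient of friction surface is sketched below; the understeer gradient K_us is an auxiliary symbol introduced for this illustration and is not part of the nomenclature above, and the actual mapping implemented in the command interpreter processor 20 may differ:

$$ r_{des} = \frac{u\,\delta_f}{(a+b) + K_{us}\,u^2}, \qquad K_{us} = \frac{m\,(b\,C_r - a\,C_f)}{(a+b)\,C_f\,C_r} $$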
Additionally, sensor measurement signals from sensors 26 are provided to the subtractor 24. The subtractor 24 subtracts the signals and provides a vehicle dynamical error signal Δe_dyn. The sensors 26 are intended to represent any of the sensors used in the system 10, including, but not limited to, a yaw rate sensor, a lateral acceleration sensor and a vehicle speed sensor. If the command interpreter processor 20 provides a desired yaw rate signal, then the actual measurement from the vehicle yaw rate sensor is used. If the command interpreter processor 20 provides a desired side-slip rate signal, then an estimate of the side-slip rate is provided from the yaw rate sensor and the lateral acceleration sensor. It is well known in the art how to provide an estimate of the side-slip rate.
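As an example of such an estimate, one widely used kinematic relation, assuming a flat road, small angles and roughly constant vehicle speed (offered only as an illustration, not necessarily the estimator used in the system 10), is

$$ \dot{v}_y \approx a_y - u\,r, $$

where a_y is the measured lateral acceleration; the side-slip rate then follows as approximately \dot{v}_y / u.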
The driver interpreter 16 also includes a motion/path prediction processor 30 that receives the hand-wheel angle signal. The prediction processor 30 generates an objectively predicted path signal of the trajectory or path of the vehicle as ŷ = [ŷ_0, ŷ_1, . . . , ŷ_N].
The vehicle dynamics estimation processor 32, shown in the accompanying drawings, estimates the vehicle state variables x, such as the lateral speed v_y and the yaw rate r, from the driver steering input using a state observer with feedback gain L, as given in equation (1).
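Purely as an assumed illustration of such a state observer, a typical bicycle-model form consistent with the nomenclature above (states x = [v_y, r]^T, observer feedback gain L, driven here by the measured yaw rate) is

$$ \dot{\hat{x}} = A\,\hat{x} + B\,\delta_f + L\,\big(r_{meas} - \hat{r}\big), \qquad \hat{x} = \begin{bmatrix} \hat{v}_y \\ \hat{r} \end{bmatrix}, $$

with the bicycle-model matrices

$$ A = \begin{bmatrix} -\dfrac{C_f + C_r}{m\,u} & -u - \dfrac{a\,C_f - b\,C_r}{m\,u} \\[1ex] -\dfrac{a\,C_f - b\,C_r}{I_z\,u} & -\dfrac{a^2 C_f + b^2 C_r}{I_z\,u} \end{bmatrix}, \qquad B = \begin{bmatrix} \dfrac{C_f}{m} \\[1ex] \dfrac{a\,C_f}{I_z} \end{bmatrix}, $$

where r_meas denotes the measured yaw rate.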
The vehicle state signal from the vehicle dynamics estimation processor 32 is then sent to the vehicle kinematics estimation processor 34 to determine the vehicle position and heading with respect to a vehicle-fixed coordinate system (X, Y) as:
$$ \dot{\hat{X}} = u\cdot\cos(\hat{\psi}) - \hat{v}_y\cdot\sin(\hat{\psi}) \qquad (2) $$

$$ \dot{\hat{Y}} = u\cdot\sin(\hat{\psi}) + \hat{v}_y\cdot\cos(\hat{\psi}) \qquad (3) $$

$$ \dot{\hat{\psi}} = \hat{r} \qquad (4) $$
where ψ is the orientation of the vehicle. The predicted vehicle trajectory ŷ = [ŷ_0, ŷ_1, . . . , ŷ_N] can then be calculated from these equations over the preview time period.
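A minimal sketch of this path prediction step is given below, assuming a simple forward-Euler discretization of equations (2)-(4) with the estimated lateral speed and yaw rate held constant over the preview horizon; the function and parameter names are illustrative only:

```python
import math

def predict_path(v_y_hat, r_hat, u, n_points=20, dt=0.05):
    """Integrate the kinematic equations (2)-(4) with forward Euler to produce
    a predicted path [(X_0, Y_0), ..., (X_N, Y_N)] in the (X, Y) frame."""
    X, Y, psi = 0.0, 0.0, 0.0
    path = [(X, Y)]
    for _ in range(n_points):
        X += dt * (u * math.cos(psi) - v_y_hat * math.sin(psi))  # equation (2)
        Y += dt * (u * math.sin(psi) + v_y_hat * math.cos(psi))  # equation (3)
        psi += dt * r_hat                                         # equation (4)
        path.append((X, Y))
    return path
```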
The predicted path signal from the prediction processor 30 is sent to a subtractor 46. The system 10 also includes a path projection processor 50 that provides a target path signal to the subtractor 46. The path projection processor 50 can include one or more of a vision system, a radar system and/or a map system with a GPS sensor, all known to those skilled in the art and available on some vehicle models. The target path signal may differ depending on what type of device the path projection processor 50 uses. For example, if the path projection processor 50 uses a radar system for collision avoidance, then the target path signal may be used to avoid another vehicle. However, if the path projection processor 50 uses a map system, then the target path may simply follow the road curvature. The processor 50 provides the target path signal to the subtractor 46 indicative of the curvature of the road ahead of the vehicle. The subtractor 46 generates a kinematical error signal Δe_kin, given by equation (7), where w_i is a weighting factor, as the difference between the predicted vehicle path from the prediction processor 30 and the target path from the processor 50. The weighting factor w_i is used to properly weight the contributing importance of each path error, such as by reducing the weighting of projected path points farther from the vehicle.
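One plausible form of equation (7) consistent with this description, offered only as an assumed illustration (the exact structure of the error, for example whether the pointwise offsets are signed or squared, is not specified here), is

$$ \Delta e_{kin} = \sum_{i=0}^{N} w_i\,\big(y_i - \hat{y}_i\big), $$

where y_i and ŷ_i are the target and predicted lateral offsets at the i-th preview point.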
The error signal Δe_dyn from the subtractor 24 is sent to a dynamics control processor 54. The dynamics control processor 54 uses the error signal Δe_dyn to generate a dynamics control command signal δ_cmd_dyn.
The kinematical error signal Δe_kin from the subtractor 46 is sent to a kinematics control processor 56 that generates a kinematics control command signal δ_cmd_kin based on a preview control law, where y(t) and ŷ(t) are the vehicle's target offset and predicted offset, respectively, and T is the preview time period. In the discretized form of the control law, C_i and D_i are the system free-response array and forced-response array, respectively, and N is the number of sampling points used during the preview time period.
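As an assumed illustration of how these quantities could enter such a preview control law (the actual control law of the processor 56 is not specified here), one plausible discretized form is the weighted least-squares command that minimizes the preview path error over the N sampling points:

$$ \delta_{cmd\_kin} = \frac{\sum_{i=1}^{N} w_i\, D_i\,\big(y_i - C_i\,\hat{x}\big)}{\sum_{i=1}^{N} w_i\, D_i^{\,2}}, $$

where C_i x̂ is the free response of the predicted offset at the i-th sampling point and D_i is its forced response (sensitivity) to the steering command.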
The command signal δ_cmd_dyn from the dynamics control processor 54 and the command signal δ_cmd_kin from the kinematics control processor 56 are sent to a control integration processor 70.
The control integration processor 70 outputs a command signal δ_cmd to the actuator 12 as a weighted combination of the two commands, with the dynamics command δ_cmd_dyn scaled by a time-varying weighting factor ρ1(t) and the kinematics command δ_cmd_kin scaled by a corresponding weighting factor.
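A minimal sketch of one such weighting rule, assuming the kinematics weighting factor (here called ρ2(t), a symbol introduced only for this illustration) is complementary to ρ1(t), is

$$ \delta_{cmd}(t) = \rho_1(t)\,\delta_{cmd\_dyn}(t) + \rho_2(t)\,\delta_{cmd\_kin}(t), \qquad \rho_1(t) + \rho_2(t) = 1, $$

so that, for example, more authority can be shifted toward dynamics control when the kinematics data from the processor 50 are stale or missing.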
The control integration processor 70 is designed to handle cases where the kinematics control is constrained as a result of slow sensing or data transfer from the processor 50. When the vehicle is traveling at high speeds, properly handling the slow throughput is necessary to avoid significant adverse effects. This approach is also useful in handling occasional loss of data from the sensors.
An example of handling the slow throughput of the processor 50 is depicted in the accompanying drawings, where the vision data from the processor 50 is read at a slower rate than the control update rate.
A vehicle-fixed coordinate system (X, Y) is defined at the time when a set of vision data is read, and a vehicle-fixed coordinate system (x, y) is defined at each control update time. The position and orientation of (x, y) with respect to (X, Y) can be estimated as (X_0, Y_0, ψ_0), again based on the motion/path estimation of equations (1)-(6). Thus, a coordinate transform can be performed from (X, Y) to (x, y). The sensor data read at the time associated with (X, Y) is defined in that coordinate system, and can then be transformed into the (x, y) frame by the coordinate transform of equation (10).
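A minimal sketch of such a planar frame transform is given below, assuming (X_0, Y_0, ψ_0) is the pose of the (x, y) frame expressed in the (X, Y) frame and that equation (10) is a standard rotation-plus-translation; the function and variable names are illustrative only, not the actual implementation:

```python
import math

def transform_to_control_frame(points_XY, X0, Y0, psi0):
    """Transform vision data points from the (X, Y) frame, defined when the
    vision data was read, into the current control frame (x, y), whose origin
    is at (X0, Y0) with orientation psi0 relative to (X, Y)."""
    transformed = []
    for X, Y in points_XY:
        dX, dY = X - X0, Y - Y0
        x = math.cos(psi0) * dX + math.sin(psi0) * dY   # rotate into (x, y)
        y = -math.sin(psi0) * dX + math.cos(psi0) * dY
        transformed.append((x, y))
    return transformed
```

In this way, previously read vision data can continue to be expressed in the current control frame between vision updates.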
The foregoing discussion discloses and describes merely exemplary embodiments of the present invention. One skilled in the art will readily recognize from such discussion and from the accompanying drawings and claims that various changes, modifications and variations can be made therein without departing from the spirit and scope of the invention as defined in the following claims.
This application is a Divisional application of U.S. patent application Ser. No. 11/220,996, filed Sep. 7, 2005, titled “Method and Apparatus for Preview-Based Vehicle Lateral Control.”