Process for determining state of a vehicle

Information

  • Patent Grant
  • Patent Number
    9,916,699
  • Date Filed
    Monday, July 24, 2017
  • Date Issued
    Tuesday, March 13, 2018
Abstract
A method for determining a corrected yaw rate for a vehicle includes receiving a first yaw rate input from a yaw rate sensor of the vehicle and determining if the vehicle is moving or stationary. If the vehicle is determined to be moving, the method includes determining a steering angle of the vehicle, and determining an offset correction value based at least in part on a determined speed of the vehicle and the determined steering angle. A yaw rate offset is determined based at least in part on the determined offset correction value and the received first yaw rate input. A second yaw rate input is received from the yaw rate sensor of the vehicle, and a corrected yaw rate value is determined based at least in part on the received second yaw rate input and the determined yaw rate offset.
Description
FIELD OF THE INVENTION

This disclosure relates to vehicles, and more particularly to improving the accuracy of the determination of yaw rate in vehicles.


BACKGROUND OF THE INVENTION

Sensing yaw rate is important to land-based vehicles, and specifically to road-going vehicles. Vehicle systems, such as collision avoidance systems, collision mitigation systems and stability control systems, may require accurate values of yaw rate to correctly determine the projected path of vehicle travel. Yaw rate sensors are susceptible to error, and if the measured yaw rate has significant error, then these kinds of vehicle systems may perform poorly or even fail. For example, significant error in the measured yaw rate could cause a collision mitigation system to mistakenly determine that the vehicle is going to collide with another vehicle and to invoke emergency braking of the vehicle. Such braking could be dangerous if other vehicles are close behind and emergency braking is not warranted. Perhaps even worse, the system may mistakenly determine that the vehicle is not about to be in a collision and therefore fail to take action to prevent an otherwise avoidable collision.


SUMMARY OF THE INVENTION

A yaw rate offset is used to correct measured yaw rate error. In some embodiments, the yaw rate offset may be updated when the vehicle is stationary. The vehicle may be determined to be stationary by referencing at least two sensors. The first sensor provides an indication that the vehicle speed is zero. In embodiments wherein the first sensor is a speed sensor, it can be insensitive to movement at low speeds, such as when pulling out of a parking space. Thus, the acceleration or the derivative of acceleration (commonly referred to as “jerk”) obtained from the second sensor, which is sensitive to slight movements, is also used to more confidently determine that the vehicle is stationary. In some embodiments, the yaw rate offset may be updated when the vehicle is moving straight. A camera that captures images of, or image data representative of, the road the vehicle is driving on may be used to determine that the vehicle is moving straight. Lane delimiters may be detected in the camera images to determine whether the vehicle is moving parallel to the lane delimiters. In a preferred embodiment, the yaw rate offset is updated both at times when the vehicle is stationary and at times when the vehicle is moving straight. In some embodiments, the yaw rate offset may be updated by combining a new yaw rate offset with a previous yaw rate offset according to a ratio. The ratio may be based on a level of confidence that the vehicle is indeed in an operating condition suitable for updating the yaw rate offset.


In a particular embodiment, the present invention is directed to a method for determining a yaw rate for a road-based vehicle having a yaw rate sensor, the method comprising:


(a) capturing images or image data representative of the environment outside the vehicle;


(b) determining if the vehicle is moving and has a zero yaw rate at least based on the images, such as based at least in part on image processing of image data captured by a camera;


(c) obtaining a first measured yaw rate from the yaw rate sensor when the vehicle is determined in step (b) to be moving and to have a zero yaw rate;


(d) determining a yaw rate offset based at least in part on the measured yaw rate obtained in step (c);


(e) obtaining a second measured yaw rate from the yaw rate sensor; and


(f) determining a corrected yaw rate for the vehicle based on the second measured yaw rate and the yaw rate offset.


In another embodiment, the present invention is directed to a system for determining a corrected yaw rate for a yaw rate sensor on a land-based vehicle. The system includes a camera configured to be mounted to the vehicle, and a control system connected to the yaw rate sensor and the camera, wherein the control system is operable to carry out the method described above.


In another embodiment, the present invention is directed to a method for determining a yaw rate for a road-based vehicle having a yaw rate sensor, the method comprising:


(a) obtaining an acceleration from an accelerometer positioned to sense a longitudinal acceleration of the vehicle;


(b) determining a rate of change of the acceleration;


(c) determining the vehicle speed;


(d) carrying out a determination of a yaw rate offset based at least in part on a first measured yaw rate from the yaw rate sensor, at least in part depending on whether the rate of change of the acceleration determined in step (b) is approximately zero, and at least in part depending on whether the vehicle speed is zero;


(e) obtaining a second measured yaw rate from the yaw rate sensor; and


(f) determining a corrected yaw rate for the vehicle based on the second measured yaw rate and the yaw rate offset.


In another embodiment, the present invention is directed to a system for determining a corrected yaw rate for a yaw rate sensor on a land-based vehicle. The system includes an accelerometer configured to or operable to sense a longitudinal acceleration of the vehicle, a speed sensor and a control system connected to all three sensors. The control system is configured to or operable to carry out the method described above.


In another embodiment, the present invention is directed to a method for determining a yaw rate for a road-based vehicle having a yaw rate sensor and at least one other sensor, the method comprising:


(a) determining whether the vehicle has a yaw rate of zero;


(b) obtaining a first measured yaw rate from the yaw rate sensor depending on the determination made in step (a);


(c) determining a yaw rate offset that is a first selected proportion of the first measured yaw rate obtained in step (b) and a second selected proportion of a previous yaw rate offset, wherein the first selected proportion and the second selected proportion are selected based on a set of criteria based on data determined from the at least one other sensor;


(d) obtaining a second measured yaw rate from the yaw rate sensor; and


(e) determining a corrected yaw rate for the vehicle based on the second measured yaw rate and the yaw rate offset.


These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The drawings illustrate, by way of example only, embodiments of the present disclosure.



FIG. 1 is a perspective view of a land-based vehicle in accordance with an embodiment of the present invention;



FIGS. 2a-c are diagrams showing projected paths of the vehicle;



FIG. 3 is a diagram illustrating an example of yaw rate error;



FIG. 4 is a functional block diagram of a control system, camera, and sensors that are part of the vehicle shown in FIG. 1;



FIG. 5 is a flowchart of a method of determining a corrected yaw rate; and



FIG. 6 is an example image from a camera at the vehicle shown in FIG. 1.





DETAILED DESCRIPTION OF THE INVENTION

Referring now to the drawings and the illustrative embodiments depicted therein, FIG. 1 illustrates a land-based vehicle 10 equipped with the system of the present invention. In this example, the vehicle 10 is a passenger car, but in other examples, the vehicle may be a truck, bus, van, motorcycle, or any other kind of vehicle. In the illustrated embodiment, the equipped vehicle 10 includes a body, a passenger area, wheels 12 (including front wheels 12a and rear wheels 12b), an internal combustion engine and/or an electric motor to drive the vehicle 10, a transmission 14 to convey power from the engine or motor to the wheels 12, a steering wheel 16 to turn the front wheels 12a, as well as other components for powering and controlling the vehicle 10. Clearly, the equipped vehicle may have other systems or components, such as, for example, steering of the rear wheels 12b or the like, without affecting the scope of the present invention.


As shown in FIG. 1, the vehicle 10 further includes a control system 18, a camera 20, a yaw rate sensor 22, a longitudinal accelerometer 24, a transmission sensor 26, a steering angle sensor 28, and a speed sensor 30. The camera 20, yaw rate sensor 22, longitudinal accelerometer 24, transmission sensor 26, steering angle sensor 28, and speed sensor 30 are each connected to the control system 18 to provide sensed information to the control system 18. Such connections may be by way of conductive wires or wireless signals. A bus, such as a Controller-Area Network (CAN) bus or a Local Interconnect Network (LIN) bus or the like, may be used for communication between the sensors and the control system 18.


The control system 18 controls aspects of the vehicle's operations. The control system 18 may comprise a single device or it may comprise a plurality of devices that communicate with one another. The control system 18 may comprise a separate device that is dedicated to carrying out the functions described below, or alternatively it may be partly or wholly contained within a unit that carries out other functions, such as the engine control unit (not shown) or vehicle control unit (not shown). The control system 18 will be discussed in further detail below with respect to FIG. 4.


The camera 20 is positioned to capture image data or images representative of the scene exterior of the vehicle 10 and encompassed by the field of view of the camera. In this example, the camera 20 is positioned in a forward-facing manner at the windshield of the vehicle 10. The camera 20 may be included as part of a rearview mirror assembly. In other examples, the camera 20 may be positioned elsewhere on the vehicle 10, such as at the front or rear bumpers. The camera 20 is operable to capture images of the road on which the vehicle 10 is travelling. The camera 20 may also be operable to capture images of other vehicles, road surface characteristics (such as, for example, lane markings or lane delimiters or the like), hazards on or near the road and other features of interest. The camera 20 may be provided with a camera control system (not shown) that processes images or image data captured by the camera. The camera 20 or the camera control system is connected to the control system 18 to output images and/or image information to the control system 18.


One or more of the camera control system and the control system 18 is capable of processing images captured by the camera 20 to detect lane markers or delimiters 32 (FIGS. 2a-c), such as painted lines or Botts' dots. Any suitable algorithm may be used to detect lane delimiters. For example, portions of the images may be scanned and processed for elements in a selected range of colors that are representative of the colors of typical roadway lane delimiters. Ambient lighting conditions may be taken into account. The color ranges being searched for may be adjusted depending on the time of day. In embodiments wherein the camera control system performs lane detection, the camera control system may output information indicative of characteristics of the lane delimiters, such as the number of detected lane delimiters, their positions and angles, a curvature of one or more lane delimiters, and/or a quality of a lane delimiter and/or the like, to the control system 18. Alternatively, the camera 20 may provide images to the control system 18, which performs lane detection on the images. Detection and assessment of lane delimiters will be discussed in further detail below.


The yaw rate sensor 22 is operable to sense the left and right yaw rate of the vehicle 10 (in other words, to sense the positive and negative angular rotational velocity of the vehicle about a local vertical axis A of the vehicle). Output of the yaw rate sensor 22 to the control system 18 may comprise a voltage within a range of voltages, such as about 0 to 5 volts, with about 2.5 volts being indicative of zero yaw rate, or a data message sent over a communications bus or network bus of the vehicle, such as a CAN bus or the like. The yaw rate sensor 22 may include any type of device, such as a piezoelectric device, a micromechanical device, a microelectromechanical device, or similar. The longitudinal accelerometer 24 is operable to sense the longitudinal (forward or reverse) acceleration of the vehicle 10 and provide a signal indicative of a magnitude of such acceleration to the control system 18. The longitudinal accelerometer 24 may include any type of device, such as a piezoelectric device, a micromechanical device, a microelectromechanical device, or similar. The longitudinal accelerometer 24 may be part of a multi-axis accelerometer.


The yaw rate sensor 22 and the longitudinal accelerometer 24 may be provided together in a sensor package that outputs aggregated data to the control system 18. Alternatively, the yaw rate sensor 22 and the longitudinal accelerometer 24 may be provided separately.


The transmission sensor 26 senses whether the vehicle transmission 14 is in park, in the case of an automatic transmission, or whether the transmission is disengaged, in the case of a manual transmission. The brake sensor 31 (FIG. 1) senses whether any brake system or systems of the vehicle are engaged. For greater clarity, ‘a brake system’ may be a main brake system 27 or a parking brake system 29. The transmission sensor 26 (alone in some cases such as in the case of an automatic transmission, and when combined with the brake sensor 31 in some cases such as in the case of a manual transmission) can thus provide to the control system 18 information indicative of whether the vehicle 10 is stationary or moving. The transmission sensor 26 may sense other gears as well, such as drive, reverse, first gear, and second gear, among others.


The steering angle sensor 28 communicates to the control system 18 information relating to the steering angle of the vehicle 10. In the embodiment shown, the steering angle sensor 28 senses a rotational position of the steering column (not shown); however, the steering angle sensor 28 may be positioned anywhere suitable for sensing any suitable vehicle component related to the vehicle steering. The steering angle sensor 28 can provide to the control system 18 information indicative of whether the vehicle 10 is not turning, that is, whether the wheels 12 are positioned to move the vehicle 10 straight.


The speed sensor 30 is operable to sense a speed of the vehicle 10. The speed sensor 30 may be positioned at all of the wheels 12 of the vehicle 10. The speed sensor 30 provides to the control system 18 an indication of the vehicle's speed, and in some embodiments the speed sensor 30 may further provide an indication of the vehicle's direction of travel. For example, the speed sensors 30 may be used to determine whether the vehicle is driving forward or in reverse. In another example, a difference between the speeds recorded at left and right speed sensors 30 is an indication that the vehicle is travelling in an arc.
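As a non-limiting illustration, the relationship between a left/right wheel-speed differential and arc travel may be sketched as follows; the function name, track width, and threshold are illustrative assumptions rather than values from the described system:

```python
# Illustrative sketch: for wheels on a rigid axle, the difference between the
# right and left wheel speeds divided by the track width approximates the
# vehicle yaw rate (in rad/s). All names and values here are assumptions.

def is_travelling_in_arc(v_left_mps: float, v_right_mps: float,
                         track_width_m: float = 1.6,
                         threshold_radps: float = 0.005) -> bool:
    """Return True if the wheel-speed differential suggests arc travel."""
    yaw_rate_estimate = (v_right_mps - v_left_mps) / track_width_m
    return abs(yaw_rate_estimate) > threshold_radps
```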


Referring to FIGS. 2a-c, the yaw rate sensor 22 may be used by the control system 18 to determine the projected path of the vehicle 10. Yaw rate may be expressed in any suitable units such as degrees per second or radians per second. Once the projected path of the vehicle is known (such as based in part on having an accurate determination of the yaw rate), the control system 18 can determine what other vehicles or obstacles on the road are in the projected path of the vehicle.


The control system 18 may operate on yaw rates expressed in degrees per second or radians per second. Alternatively, the control system 18 may operate directly on yaw rates expressed in other units, such as volts, binary values, or pulses, to name a few.


An example of the operation of the vehicle 10 using the yaw rate sensor 22 is shown in FIG. 2a. As shown, the vehicle 10 follows a second vehicle 34 in the same lane, as defined by lane delimiters 32. The yaw rate sensor 22 ideally outputs to the control system 18 a signal indicative of a yaw rate of about 0 degrees per second, and the control system 18 accordingly determines that the projected path 38 of the vehicle 10 is straight. Using an obstacle detection system that may, for example, include the camera 20 and/or a radar system (not shown), the control system 18 may detect the presence of the second vehicle 34. Using the yaw rate, the control system 18 can determine that the second vehicle 34 is in the projected path of the vehicle 10. If the control system 18 determines that there is a risk of collision between the vehicle 10 and the second vehicle 34 (such as a likelihood of collision that is at or above a threshold level of risk), the control system 18 can take an appropriate action, such as warning the driver of vehicle 10 via a visual and/or audible warning, and/or automatically applying the brakes of the vehicle 10.


Another example of the operation of the control system 18 is shown in FIG. 2b, in which the vehicle 10 is directly behind a third vehicle 36 in a left lane but has started a rightward lane change. Accordingly, the yaw rate sensor 22 ideally outputs to the control system 18 a signal indicative of a yaw rate of some value, such as about 0.5 degrees per second, and the control system 18 accordingly determines that the projected path, shown at 40, of the vehicle 10 is curved to the right. Consequently, the control system 18 then determines that the third vehicle 36 is not in the projected path 40 of the vehicle 10, but that the second vehicle 34 in the right lane is in the projected path of the vehicle 10, even though the second vehicle 34 is not directly in front of the vehicle 10.


Yet another example is shown in FIG. 2c, in which the vehicle 10 is travelling in a right lane of a curved road, and thus, the third vehicle 36, which is located ahead of the vehicle 10 in a left lane, may appear to be in the projected path of the vehicle 10. However, assuming the road has a radius of curvature of about 600 m (1970 ft) and the speed of the vehicle 10 is about 113 km/hr (70 mph), the yaw rate sensor 22 ideally outputs to the control system 18 a signal indicative of a yaw rate of about 3.0 degrees per second. Accordingly, the control system 18 determines that the projected path 42 of the vehicle 10 is curved to the right. Consequently, the control system 18 then determines that the third vehicle 36 is not in the projected path 42 of the vehicle 10, but that the second vehicle 34 in the right lane is in the projected path 42 of the vehicle 10.
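The ideal yaw rates in FIGS. 2a-c follow from the kinematic relation between vehicle speed v, radius of curvature R, and yaw rate, namely yaw rate = v/R. A minimal sketch (the function name is assumed for illustration) reproduces the FIG. 2c example:

```python
import math

def yaw_rate_deg_per_s(speed_kmh: float, radius_m: float) -> float:
    """Yaw rate (degrees per second) of a vehicle following a curve of
    radius R at speed v, from the kinematic relation omega = v / R."""
    speed_mps = speed_kmh / 3.6
    return math.degrees(speed_mps / radius_m)

print(yaw_rate_deg_per_s(113.0, 600.0))  # about 3.0 degrees per second
```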


Thus, it can be seen that yaw rate is a factor in predicting collisions. The vehicle 10 may use the yaw rate sensor 22 in a collision mitigation or avoidance system, of which the control system 18 may be a component. The vehicle 10 may additionally or alternatively use the yaw rate sensor 22 with an electronic stability control system, of which the control system 18 may be a component.


However, it should be noted that the examples of FIGS. 2a-c reference ideal yaw rate values. In practical applications, yaw rate sensors are susceptible to error. Error in the determined yaw rate can cause vehicle systems to incorrectly determine the projected path of the vehicle 10. As an example, when a prior art vehicle is driving straight, the error in the determined yaw rate can cause a control system in the prior art vehicle to determine that the projected path of the vehicle is curved. As another example, when a prior art vehicle is driving in an arc, that same error can cause a control system in the prior art vehicle to determine that the vehicle is driving straight. A relatively large magnitude error may be exhibited when the vehicle 10 is first turned on and can change with changing temperature after the vehicle 10 is started and as the vehicle 10 heats up to normal operating temperature. Such an error may be from about 0.5 degrees per second to more than about 7 degrees per second. A generally more gradual and sometimes smaller kind of error is known as drift error, which may also be attributable to temperature changes, and may also occur as the vehicle 10 is operated. Total yaw rate error may be greater than about 7 degrees per second.


Besides temperature, errors can also be caused by electromagnetic interference (EMI) and variation in sensor quality. Although a temperature sensor located at a yaw rate sensor could be used to compensate for errors due to temperature, this can greatly increase the cost of the sensor, since such sensors are typically instrumentation-grade and impractical to use for high production volumes. Known error compensation and filtering techniques may delay output of yaw rate to other systems, such as a stability control system, that may require low delays in yaw rate data. In addition, since multiple different systems of the vehicle 10 may require yaw rates according to different criteria (such as, for example, low delay, averaged, filtered), it may be advantageous to allow those systems to process yaw rates according to their specific criteria by providing to all such systems common yaw rates that have been filtered as little and as quickly as possible.



FIG. 3 shows an example of the effect of yaw rate error. In this example, the vehicle 10 drives along a straight path in the right lane (shown at 90) from position A to position B, and then steers into the left lane (shown at 92) to avoid the second vehicle 34. In this example, the yaw rate sensor 22 of the vehicle 10 has an error of about +0.5 degrees per second, with positive yaw rate indicating a projected path that is rightward and a negative yaw rate indicating a projected path that is leftward. As noted above, the vehicle 10 travels straight in a right lane of a road from a position A to a position B, at which point it approaches a slower-moving or stationary second vehicle 34. When the vehicle 10 is at position A and is driving straight, if the control system 18 does not compensate for the error in the yaw rate sensor 22, it would determine that the yaw rate for the vehicle 10 is 0.5 degrees per second, which falsely indicates that the vehicle is turning to the right, as shown by projected path 52, as opposed to correctly indicating that the vehicle 10 is traveling along a straight path 54.


As a result, a collision mitigation system on board the vehicle 10 would not determine that the second vehicle 34 is in the projected path of the vehicle 10, and consequently it would not appropriately apply emergency braking or warn the driver of vehicle 10 in the event of an impending collision between the two vehicles 10 and 34. When the vehicle 10 reaches point B and steers into the left lane 92 (such as, for example, with an actual yaw rate of about −0.5 degrees/second) to avoid a collision with the second vehicle 34, the sensor error would cause the control system 18 to determine the vehicle's yaw rate to be about 0 degrees per second, which falsely indicates that the vehicle 10 is travelling straight, as shown by projected path 56, as opposed to correctly indicating that the vehicle 10 is turning left with a yaw rate of about −0.5 degrees per second, as indicated at 58.


Due to this error, a collision avoidance or mitigation system of the vehicle 10 may incorrectly apply emergency braking to prevent the apparent imminent collision with the second vehicle 34, which can be dangerous. Unfortunately, even if the control system 18 attempts to correct the error in the yaw rate signal from the yaw rate sensor 22 based on a fixed formula or a fixed lookup table (so as to compensate for error caused by ambient temperature), it is still not sufficient, since the amount of error in the yaw rate signal can drift for a variety of reasons. Accordingly, a control system 18 can still have significant error in the determined yaw rate for the vehicle 10 when employing such techniques to correct for error.


Techniques for correcting output of the yaw rate sensor 22 for error in accordance with an embodiment of the present invention will now be discussed.



FIG. 4 illustrates a functional block diagram of the control system 18, the camera 20, and the sensors 22-31. The control system 18 includes a processor 62, a memory 64 connected to the processor 62, and an input/output interface 66 connected to the processor 62. The sensors 22-31 are connected to the processor 62 via the input/output interface 66 to provide sensed information to the processor 62. The camera 20 may also be connected to the processor 62 in this way, or a camera control system of the camera 20 may be connected to the processor 62 via the input/output interface 66.


The input/output interface 66 can include a bus, such as a CAN bus. The input/output interface 66 can include one or more analog-to-digital converters to convert analog signals from any of the sensors 22-31 (in embodiments wherein any of them are analog devices) into digital signals usable by the processor 62.


The memory 64 stores a vehicle stationary detection routine 72, a vehicle straight driving detection routine 74, and a yaw rate offset filter routine 76. Such routines may be in the form of any programmatic entity such as a program, a routine, a subroutine, a function, a script, executable instructions, intermediate instructions that are executed to generate executable instructions, an object, and a class, to name a few. Any of the routines 72-76 can include lookup tables for obtaining lookup values using known values. The routines 72-76 may be provided as separate routines, as illustrated. Alternatively, two or more of the routines 72-76 may be provided together in a larger routine. In addition, any of the routines 72-76 may be separated into two or more smaller routines.


The memory 64 further stores a yaw rate offset 78 as a variable value that may be updated when appropriate. To determine a new yaw rate offset (such as to update the yaw rate offset 78), the processor 62 references the camera 20 and sensors 22-30, executes the routines 72-76 to act on the information obtained from the camera 20 and sensors 22-30, and further references the stored yaw rate offset 78.


As the processor 62 obtains one or more yaw rates from the yaw rate sensor 22, the one or more yaw rates may be stored in a buffer 82 in the memory 64. The processor 62 can reference the buffer 82 for a yaw rate required by the routines 72-76, and can ignore spurious yaw rates in the buffer 82 or apply an input filter to one or more of these yaw rates to reduce noise and obtain a filtered input yaw rate.


The processor 62 can further obtain a corrected yaw rate 80 by applying the yaw rate offset 78 to a measured yaw rate obtained from the yaw rate sensor 22, as set out in the corrected yaw rate formula below. The corrected yaw rate 80 may be output to the memory 64 for storage for later use by the processor 62 in performing routines that require a corrected yaw rate, such as routines for collision mitigation or avoidance or stability control. The processor 62 may additionally or alternatively output the corrected yaw rate 80 to the input/output interface 66 for use by other systems of the vehicle 10.


The vehicle stationary detection routine 72 and the vehicle straight driving detection routine 74 are used to determine when the vehicle 10 is in a condition in which the actual yaw rate of the vehicle 10 is approximately zero, and therefore the signal sent from an ideal yaw rate sensor would indicate a yaw rate of zero. When the vehicle is in such a condition, it is conducive to determining the amount of error that exists in the real yaw rate sensor 22, and therefore is conducive to updating the yaw rate offset 78. In a simple embodiment, when the vehicle 10 is determined to be either stationary or moving in a straight path, the yaw rate offset 78 may simply be determined to be the measured yaw rate from the yaw rate sensor 22. It will be understood that, in some circumstances, it is at least theoretically possible for even the most sophisticated control system 18 to determine that the vehicle is stationary or is moving in a straight path when the vehicle 10 is, in fact, not. If the yaw rate offset 78 were simply replaced with the measured yaw rate at that time, a potentially large error could inadvertently be introduced into the yaw rate offset 78. To reduce the potential for introducing a large error into the yaw rate offset 78, in some more complex embodiments, updating the yaw rate offset 78 entails:


(a) determining the vehicle 10 is either stationary or is moving in a straight path; and


(b) if the vehicle is determined to be either stationary or moving in a straight path, adding a percentage of the value of the measured yaw rate to a percentage of the previously stored value of the yaw rate offset, to arrive at an updated value for the yaw rate offset 78.


Each of the routines 72-76 will now be discussed in detail. The vehicle stationary detection routine 72 references the longitudinal accelerometer 24 to determine whether the vehicle 10 is stationary. A longitudinal acceleration of the vehicle 10 may be obtained over a selected duration of time. A derivative or rate of change of the longitudinal acceleration can then be determined. The first derivative of acceleration is known as jerk. When the value of jerk is determined to indicate that the vehicle 10 is stationary, the vehicle stationary detection routine 72 can reference the yaw rate sensor 22 to obtain a measured yaw rate that is used for the updating of the yaw rate offset 78.


The value of jerk that would be indicative of a stationary vehicle would be a value that is approximately zero. The selected duration of time may be, for example, about 0.5 seconds. However, it will be noted that when the vehicle 10 is under a constant acceleration, the value of jerk would also be approximately zero. Thus, the control system 18 does not rely on the value of jerk alone to determine when the vehicle 10 is stationary. The control system 18 also uses at least one other criterion to support a determination that the vehicle 10 is stationary. For example, another criterion may be that the vehicle speed (as measured by speed sensor 30) is measured to be approximately zero. An example of another criterion (in embodiments wherein the vehicle 10 is equipped with an automatic transmission) is whether the gear selector for the transmission is in ‘Park’, as determined by the transmission sensor 26. If the vehicle 10 is equipped with a manual transmission, the criterion could instead be whether the gear selector for the transmission is in ‘Neutral’, as determined by the transmission sensor 26, and whether the main brake system (such as the brake pedal of the vehicle) has been depressed beyond a selected amount for a selected period of time or whether the parking brake system has been actuated.


As a condition for referencing the longitudinal accelerometer 24 to determine the current value of jerk, the vehicle stationary detection routine 72 may first determine whether the vehicle speed obtained from the vehicle speed sensor 30 is approximately zero for a selected duration (such as, for example, about 0.5 seconds), and/or whether the gear selector is in ‘Park’ (for an automatic transmission) or whether the gear selector is in neutral (for a manual transmission) and the brake is sufficiently depressed for sufficiently long, thereby saving the processor 62 from having to determine jerk when the vehicle 10 is known by the control system 18 to not meet other criteria for being stationary.


Determining whether the vehicle 10 is stationary by using the value of jerk in addition to vehicle speed and/or the position of the gear selector is advantageous over a system that would determine whether the vehicle is stationary using only vehicle speed and/or gear selector position, without using the value of jerk. This is because there are situations in which the measured vehicle speed from the speed sensor 30 could be zero, even though the vehicle is moving. An example would be when the vehicle 10 is being carried on a ferry. It will be noted that when the vehicle 10 is being carried on a ferry, the gear selector for the transmission would also be in ‘Park’. Thus, when the vehicle is on a ferry, a control system of the prior art could be fooled into determining that the vehicle is stationary even though it is not. However, the movements of the ferry during transport of the vehicle 10 could be sensed by the longitudinal accelerometer 24 and would result in a value of jerk that is non-zero. Thus, by basing the determination of whether the vehicle 10 is stationary on the value of jerk in addition to at least one other criterion, such as vehicle speed and/or gear selector position, the control system 18 is inhibited from determining that the vehicle 10 is stationary in such a situation.


Using the value of jerk is advantageous over simply using the value of longitudinal acceleration itself, because there are situations in which the vehicle 10 is stationary but where the longitudinal accelerometer 24 would signal to the control system 18 that there is an acceleration on the vehicle, thereby misleading the control system 18 to determine that the vehicle 10 is not stationary. Such a situation would be, for example, when the vehicle 10 is stationary on a downhill slope or on an uphill slope. In such situations, the accelerometer 24 would sense the force of gravity urging the vehicle to roll down the slope. Thus, the sensor 24 would send signals to the control system 18 that are non-zero and which have a magnitude determined by the angle of the slope on which the vehicle 10 is positioned. As a result, the control system 18 could determine that the vehicle is under acceleration even though the vehicle may be stationary. However, the force of gravity is constant, and so the value of jerk that would exist in such a situation would be approximately zero. Thus, by determining whether the vehicle 10 is stationary based on jerk instead of longitudinal acceleration, situations in which the vehicle 10 is stationary can be captured that might otherwise be missed.


As described above, examples of conditions that the vehicle stationary detection routine 72 can evaluate in order to determine whether the vehicle 10 is stationary include:


1. A value of jerk being less than a threshold value, preferably for at least a selected period of time,


2. A vehicle speed being less than a threshold speed, preferably for at least a selected period of time, and


3. The transmission 14 being determined to be in ‘Park’ (for an automatic transmission) or in ‘Neutral’ with the brake depressed (for a manual transmission), preferably for at least a selected period of time. The three aforementioned selected periods of time need not be the same as each other.
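As a non-limiting illustration, these conditions may be combined into a single check along the lines of the following sketch; the sampling period, window length, and thresholds are illustrative assumptions, and the transmission/brake criterion is taken as a precomputed input:

```python
# Illustrative sketch of the stationary check: jerk (the discrete derivative
# of longitudinal acceleration) and vehicle speed must both stay near zero
# over a window, and the transmission/brake criterion must also hold.

from collections import deque

class StationaryDetector:
    def __init__(self, dt_s=0.01, window_s=0.5,
                 jerk_thresh_mps3=0.05, speed_thresh_mps=0.05):
        n = int(window_s / dt_s)
        self.dt = dt_s
        self.jerk_thresh = jerk_thresh_mps3
        self.speed_thresh = speed_thresh_mps
        self.accels = deque(maxlen=n + 1)  # longitudinal acceleration samples
        self.speeds = deque(maxlen=n)      # vehicle speed samples

    def update(self, accel_mps2, speed_mps, in_park_or_neutral_braked):
        self.accels.append(accel_mps2)
        self.speeds.append(speed_mps)
        if len(self.speeds) < self.speeds.maxlen:
            return False  # not enough history to evaluate the window yet
        a = list(self.accels)
        jerks = [(a[i + 1] - a[i]) / self.dt for i in range(len(a) - 1)]
        jerk_ok = all(abs(j) < self.jerk_thresh for j in jerks)
        speed_ok = all(v < self.speed_thresh for v in self.speeds)
        return jerk_ok and speed_ok and in_park_or_neutral_braked
```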


The vehicle straight driving detection routine 74 references the camera 20 (and one or more other sensors such as the steering angle sensor 28 and the speed sensor 30) to determine whether the vehicle 10 is driving straight (and therefore has an actual yaw rate of about zero) based on an analysis of images captured by the camera 20. In an embodiment, to determine whether the vehicle 10 is driving straight (such as driving in a straight path), the control system 18 detects any lane delimiters 32 present in the images. An example image is shown at 300 in FIG. 6. In this image, two lane delimiters shown at 32 (shown individually at 32a and 32b) are detected (one on either side of the vehicle 10). Optionally, the control system 18 may be programmed to continue to the next steps only with lane delimiters 32 that are detected for a continuous selected period of time, such as about 0.5 seconds or more or less.


In order for the control system 18 to determine whether or not a lane delimiter 32 has been detected, it may assess the ‘quality’ of the data obtained from the camera images, such as, for example, whether the control system 18 has received sufficient image information to accurately determine the position and curvature of the lane delimiter 32. When the quality of the data does not meet a threshold quality, the control system 18 determines that the data does not relate to a lane delimiter 32. Evaluation of lane delimiter quality may be performed by the control system 18, for example as part of the vehicle straight driving detection routine 74, or by a camera control system or lane-keeping system.


After detecting any lane delimiters 32, the control system 18 may be programmed to determine whether the detected lane delimiters 32 are generally straight. Each lane delimiter 32 may be modeled using a polynomial equation, such as a third-order polynomial equation. A curvature of the lane delimiter 32 may be obtained by taking a derivative of the polynomial equation. Coefficients of the terms of the derivative equation may be tested for linearity. A relatively small coefficient for a non-linear term (such as, for example, x² or x³), when compared to a linear coefficient (such as, for example, x), can indicate a lane delimiter 32 of low curvature. Lane curvature may be determined by the controller 18, for example as part of the vehicle straight driving detection routine 74, or, for example, by a camera controller or lane-keeping system.
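As a non-limiting illustration, the fit-and-differentiate straightness test may be sketched as follows using NumPy; the ratio threshold is an illustrative assumption:

```python
# Illustrative sketch: model lane-delimiter points with a third-order
# polynomial and treat the delimiter as generally straight when the
# non-linear coefficients are small relative to the linear one.

import numpy as np

def is_generally_straight(x, y, ratio_thresh=0.01):
    """x, y: at least four sampled points along a detected lane delimiter."""
    c3, c2, c1, c0 = np.polyfit(x, y, 3)  # y ~ c3*x^3 + c2*x^2 + c1*x + c0
    linear = max(abs(c1), 1e-9)           # guard against division by zero
    return abs(c3) / linear < ratio_thresh and abs(c2) / linear < ratio_thresh
```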


For any lane delimiters 32 that are determined to be straight, the controller 18 may be programmed to determine the heading of the vehicle 10 relative to each generally straight lane delimiter 32. Thus, if there is only one lane delimiter 32 detected, and it is determined to be generally straight, the controller 18 may be programmed to determine the heading of the vehicle 10 relative to that lane delimiter 32. If there are two generally straight lane delimiters 32 detected, the controller 18 may be programmed to determine the heading of the vehicle 10 relative to the first lane delimiter (such as, for example, lane delimiter 32a in FIG. 6) and then to determine the heading of the vehicle relative to the second lane delimiter (such as, for example, lane delimiter 32b in FIG. 6).


The heading of the vehicle may be determined by the controller 18 by any suitable method. For example, it may be determined using one or more of the steering angle sensor 28, the yaw rate sensor 22 and by analysis of the images from the camera 20. For example, the images from the camera 20 may be analyzed by the controller 18 to determine if, over time, any detected lane delimiters 32 are moving by more than a selected amount along the x-axis in the images. Such movement would be indicative that the vehicle 10 is not precisely parallel to the lane delimiters 32. The amount of such movement could be correlated to a relative angle between the vehicle 10 heading and the direction of the lane delimiters 32.
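As a non-limiting illustration, the image-based heading estimate may be sketched as follows; the pixel-to-angle scale factor is an illustrative assumption that would, in practice, come from camera calibration:

```python
# Illustrative sketch: if a detected delimiter drifts along the image x-axis
# over successive frames, the vehicle heading is not parallel to it. The
# drift rate is converted to a relative heading angle by an assumed scale.

def relative_heading_deg(x_positions_px, frame_dt_s, deg_per_px_per_s=0.05):
    """x_positions_px: image x-position of a delimiter in successive frames."""
    if len(x_positions_px) < 2:
        return 0.0
    elapsed_s = frame_dt_s * (len(x_positions_px) - 1)
    drift_px_per_s = (x_positions_px[-1] - x_positions_px[0]) / elapsed_s
    return drift_px_per_s * deg_per_px_per_s
```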


When the control system 18 determines that the vehicle 10 is sufficiently parallel to (such as, for example, within a selected number of degrees of) one or more generally straight lane delimiters 32 for at least a threshold duration of time (such as, for example, about 0.1 seconds), the vehicle straight driving detection routine 74 may update the yaw rate offset 78. To update the yaw rate offset 78, the control system 18 may obtain a measured yaw rate from the yaw rate sensor 22. This measured yaw rate may, for example, be used by the control system 18 directly as the new yaw rate offset 78 and may simply replace the previous yaw rate offset. Alternatively (and preferably), the new yaw rate offset 78 may be some selected proportion or percentage of the measured yaw rate added to some selected proportion or percentage of the previous yaw rate offset, as described in more detail below.


In addition to using the images from the camera 20, data from other sensors may be used by the control system 18 to determine whether the vehicle 10 is driving in a straight path. For example, the control system 18 may receive signals from the steering angle sensor 28 to determine whether the steering angle of the vehicle 10 is less than a selected threshold steering angle (such as, for example, approximately zero). Also, the speed sensor 30 will be used to determine that the vehicle 10 is, in fact, moving (such as, for example, moving above a selected threshold speed).


It is advantageous to be able to determine the error (which may be referred to as the offset) of the yaw rate sensor 22 while driving (not just when the vehicle is stationary), since drift error can increase the error in yaw rate over time, and it is not always practical to wait until the vehicle stops to determine the yaw rate offset 78, especially during a long drive.


Using the camera 20 to detect lane delimiters 32 that are straight and to determine whether the vehicle heading is parallel with the lane delimiters 32 has advantages over other approaches, such as relying solely on steering angle. This is because it is possible in some situations for the steering angle not to reflect the true heading of the vehicle 10. For example, if the road has a crown, as many roads do, the vehicle may be steered by some small amount towards the peak in order to drive straight. Similarly, in a crosswind, a driver may steer the vehicle into the crosswind by some small amount in order to maintain a straight heading on the road. Therefore, relying on steering angle may mask situations of straight driving that could have been used to determine the yaw rate offset 78.


Even though the aforementioned examples describe situations in which the steering angle is not zero when the vehicle 10 heading is straight, the control system 18 may still compare the measured steering angle (such as, for example, from steering angle sensor 28) with a threshold steering angle as a condition for determining whether the vehicle 10 has some chance of travelling straight before going on to carry out the detection and operations relating to the camera images. For example, the control system 18 may determine that the vehicle 10 has at least some chance of following a straight path if the steering angle of the vehicle is less than, for example, about 10 degrees angularly to the left or to the right, preferably for at least a selected period of time (such as, for example, about 5 seconds). Additionally, the control system 18 can determine whether or not the rate of change of the steering angle exceeds a threshold rate of change of the steering angle (such as, for example, about 2 degrees per second) before permitting the control system 18 to analyze the images from the camera 20 for lane delimiters 32. Alternatively, the analysis of images from the camera 20 may be carried out simultaneously with the determinations made above relating to steering angle and the rate of change of the steering angle.


Even in situations where there are no lane delimiters detected, the control system 18 may still update the yaw rate offset 78 if, for example, the steering angle is sufficiently low and if the vehicle is moving.


The conditions that the vehicle straight driving detection routine 74 can evaluate in order to determine whether the vehicle 10 is moving straight are summarized as follows:


1. Detection of any straight lane delimiters 32 from images from the camera 20 for at least a selected duration,


2. A steering angle from the steering angle sensor 28 being less than a threshold angle for at least a selected duration,


3. A rate of change of steering angle from the steering angle sensor 28 being less than a threshold rate of change of angle for at least a selected duration, and


4. A vehicle speed from the speed sensor 30 being greater than a threshold speed for at least a selected duration.
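As a non-limiting illustration, these four conditions may be evaluated together along the lines of the following sketch; the thresholds and durations are illustrative assumptions, and each sample history is assumed to already span its selected duration:

```python
# Illustrative sketch of the straight-driving gate: all four conditions
# summarized above must hold before a measured yaw rate is used to update
# the yaw rate offset while the vehicle is moving.

def straight_driving_ok(straight_delimiter_seconds, steering_angles_deg,
                        steering_rates_degps, speeds_kmh,
                        min_delimiter_s=0.5, max_angle_deg=10.0,
                        max_rate_degps=2.0, min_speed_kmh=10.0):
    return (straight_delimiter_seconds >= min_delimiter_s
            and all(abs(a) < max_angle_deg for a in steering_angles_deg)
            and all(abs(r) < max_rate_degps for r in steering_rates_degps)
            and all(v > min_speed_kmh for v in speeds_kmh))
```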


When updating the yaw rate offset 78, it will be noted that the potential for error in determining that the vehicle 10 should have an actual yaw rate of zero varies depending on the specifics of each situation. For example, there is relatively less potential for error in a determination that the vehicle 10 is stationary than there is in a determination that the vehicle 10 is driving straight when only one lane delimiter 32 is detected by the camera 20. Depending on the potential for error associated with a particular updating of the yaw rate, the proportions of the measured yaw rate and the previous yaw rate offset that are added together to form the new yaw rate offset 78 can be adjusted. More specifically, when there is a relatively high potential for error in the measured offset (such as, for example, when the measured yaw rate is obtained when the vehicle is determined to be driving straight but where the control system 18 could not identify any lane delimiters), the new yaw rate offset 78 may be generated from a relatively small proportion (such as, for example, about 1 percent) of the measured yaw rate added to a relatively large proportion (such as, for example, about 99 percent) of the previous yaw rate offset. By contrast, when there is relatively less potential for error (such as, for example, when the measured yaw rate was obtained when the vehicle 10 is determined to be stationary), the new yaw rate offset 78 may be determined from a relatively higher proportion (such as, for example, about 10 percent) of the measured yaw rate added to a relatively reduced proportion (such as, for example, about 90 percent) of the previous yaw rate offset.


Accordingly, the yaw rate offset filter routine 76 can apply the following offset update formula:

W_OFFSET = K(0 − W_BUFFER) + (1 − K)W_OLD_OFFSET

where:


W_OFFSET is the new yaw rate offset 78 being determined;


K is the proportion of the measured yaw rate used to determine the new yaw rate offset 78;


(1−K) is the proportion of the previous yaw rate offset used to determine the new yaw rate offset 78;


W_BUFFER is the measured yaw rate (e.g., obtained using one of the routines 72, 74); and


W_OLD_OFFSET is the previous yaw rate offset.


As can be seen, the proportion K defines a ratio by which the measured yaw rate obtained by one of the routines 72, 74 is combined with the stored yaw rate offset 78, the ratio being K/(1−K). For small values of K, the stored yaw rate offset 78 is thus updated only slightly by the measured yaw rate obtained by one of the routines 72, 74. The proportion K can be considered to be a numerical expression of the level of confidence that the measured yaw rate obtained by one of the routines 72, 74 is an accurate value for the yaw rate offset 78. Combining the yaw rate obtained by one of the routines 72, 74 with the stored yaw rate offset 78 in such a way also serves as a time-based filter, so that error in the yaw rate obtained by one of the routines 72, 74 is mitigated and eventually eliminated by subsequent iterations. The proportion K may be set to zero so that the stored yaw rate offset 78 is not updated, or to 1 to allow an immediate and complete copy of the current measured yaw rate into the yaw rate offset 78. The yaw rate offset filter routine 76 may be understood to implement a low-pass filter, where the filter constant is the proportion K.


In this exemplary embodiment, the proportion K ranges from 0.01 to 0.10, giving ratios of 1:99 to 1:9 for the measured yaw rate (W_BUFFER) obtained by one of the routines 72, 74 to the previous yaw rate offset 78 (W_OLD_OFFSET). The proportion K may be about 0.10 when the vehicle stationary detection routine 72 obtains the yaw rate, to reflect a relatively high confidence in the accuracy of W_BUFFER in such a situation. Continuing with this exemplary embodiment, when the vehicle straight driving detection routine 74 obtains the yaw rate, the proportion K may be somewhere in the range of about 0.01 to 0.04, depending on such factors as the number of straight lane delimiters that are detected. It will be understood that these are merely example values. Higher values for the proportion K mean that the yaw rate offset 78 will change more quickly, while lower values mean that the yaw rate offset 78 will change more slowly. The proportion K will be discussed in more detail below with respect to FIG. 5.
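As a non-limiting illustration, the offset update formula above, together with the corrected yaw rate formula given further below, may be sketched as follows; the sensor bias and iteration count in the usage example are illustrative assumptions:

```python
# Illustrative sketch of the yaw rate offset filter and correction formulas.
# Variable names mirror W_OFFSET, W_BUFFER, W_OLD_OFFSET, and W_INPUT.

def update_offset(w_buffer, w_old_offset, k):
    """W_OFFSET = K(0 - W_BUFFER) + (1 - K)W_OLD_OFFSET"""
    return k * (0.0 - w_buffer) + (1.0 - k) * w_old_offset

def corrected_yaw_rate(w_input, w_offset):
    """W_CORRECTED = W_INPUT + W_OFFSET"""
    return w_input + w_offset

# Usage example: a sensor with an assumed +0.5 deg/s bias, sampled while the
# vehicle is stationary (K = 0.10). The offset converges toward -0.5, so the
# corrected yaw rate converges toward the true value of zero.
w_offset = 0.0
for _ in range(50):
    w_offset = update_offset(w_buffer=0.5, w_old_offset=w_offset, k=0.10)
print(corrected_yaw_rate(w_input=0.5, w_offset=w_offset))  # close to 0.0
```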


Once the yaw rate offset filter routine 76 determines the new yaw rate offset 78, W_OFFSET, the new yaw rate offset is stored in the memory 64 and becomes the stored yaw rate offset 78.


The routines 72, 74, 76 may be repeated to continually update the yaw rate offset 78. Using the yaw rate offset 78, the control system 18 can correct for error in signals from the yaw rate sensor 22. In other words, using the yaw rate offset 78, the control system 18 can receive a measured yaw rate from the yaw rate sensor 22 (which may be referred to as a second measured yaw rate to distinguish it from the measured yaw rate taken in the routines 72, 74 used to update the offset 78) and can apply the known offset 78 to it to arrive at a corrected yaw rate 80. When the control system 18 needs to determine the yaw rate for the vehicle 10 (such as, for example, for use by the collision mitigation system or the stability control system) the processor 62 can obtain a corrected yaw rate 80 by combining a new measured yaw rate sensed at the yaw rate sensor 22 (the aforementioned second measured yaw rate), with the yaw rate offset 78, as may be expressed by the following corrected yaw rate formula:

W_CORRECTED = W_INPUT + W_OFFSET

where:


W_CORRECTED is the corrected yaw rate 80 being determined;


W_INPUT is the yaw rate obtained from the yaw rate sensor 22, and this may be a yaw rate stored in the buffer 82; and


W_OFFSET is the yaw rate offset 78.


The corrected yaw rate 80 may be stored in the memory 64 or output at the input/output interface 66, as needed.



FIG. 5 illustrates a flowchart of a method 100 of correcting yaw rate. The method 100 may be embodied by the routines 72, 74, 76 discussed above, and the description for these routines may be referenced to better understand the method 100.


At step 102, the method is initialized. The buffer 82 may be cleared and then filled with raw yaw rate input from the yaw rate sensor 22.


At step 104, the input yaw rates from the sensor 22 may be filtered to reduce the effects of noise. Spurious values may be ignored. Any suitable filter may be used. Step 104 may be performed after the buffer 82 is filled or while the buffer 82 is being filled during step 102. The result of steps 102 and 104 is a filtered input yaw rate that will be used by the remainder of the method 100.


At step 106, it is determined whether the vehicle 10 is stationary for a sufficient duration, as described above. If it is determined at step 106 that the vehicle 10 is stationary, the value of the proportion K is set to be equal to K4 at step 108. K4 may, for example, be 0.1, which reflects a relatively high degree of confidence that the vehicle 10 is in fact stationary (and therefore has a true yaw rate of about zero).


At step 110, the yaw rate offset 78 is updated as described above, using the formulas described above. At step 112, the corrected yaw rate 80 is determined using the formula provided above.


When it is determined at step 106 that the vehicle is not stationary, step 114 is carried out. At step 114, it is determined whether the vehicle 10 is moving straight. To that end, one or more of the following conditions may be evaluated:


the steering angle meeting related criteria, such as whether the steering angle is within about +/−10 degrees of zero for a selected period of time, and the vehicle speed meeting related criteria, such as whether the vehicle speed is greater than a minimum acceptable speed (such as, for example, about 10 kph, or for example, about 60 kph) for a selected (optionally different) period of time;


the differential speeds of the wheels 12 on the right side of the vehicle vs. the left side of the vehicle being below a selected threshold differential speed;


data from an onboard GPS system meeting selected criteria that indicate that the vehicle is driving straight.


Step 114 is a coarse determination of whether the vehicle 10 is moving in a straight path. When it is determined that the vehicle 10 is not stationary (step 106) and not moving in a straight path (step 114), the yaw rate offset 78 is not updated (step 116), and the method 100 can proceed to step 112 to determine a corrected yaw rate 80 by referencing the previously obtained yaw rate offset 78.


If, on the other hand, at step 114 it is determined that the vehicle 10 is moving relatively straight, then the control system 18 analyzes images from the camera 20 to determine how many straight lane delimiters can be detected.


At steps 118 and 122, one or more images from the environment outside the vehicle 10 are captured by the camera 20 and the control system 18 performs image analysis to detect and evaluate lane delimiters in the images. This may be part of an ongoing lane-keeping process or may be a process that is only performed when yaw rate offset 78 is being determined by the method 100.


If at step 118 no lane delimiters are found to be acceptable (such as of sufficient quality and of sufficient straightness, optionally for a sufficient period of time), then the proportion K is set to be equal to offset correction value K1 at step 120 and the yaw rate offset 78 is updated accordingly. If at steps 118 and 122 one lane delimiter (32a or 32b) is found to be acceptable, but not two lane delimiters (32a and 32b), then the proportion K is set to be equal to offset correction value K2 at step 124 and the yaw rate offset 78 is updated accordingly. If at step 122 two lane delimiters are found to be acceptable, then the proportion K is set to be equal to offset correction value K3 at step 126 and the yaw rate offset 78 is updated accordingly. As noted above, K1 may be less than K2, which may be less than K3, which may be less than K4. Exemplary values for K1, K2 and K3 may comprise about 0.02, 0.03 and 0.04, respectively.
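As a non-limiting illustration, the confidence-based selection of the proportion K in steps 108, 120, 124, and 126 may be sketched as follows, using the example values given above:

```python
# Illustrative sketch of selecting K by confidence level (K1 < K2 < K3 < K4),
# using the example values from the text.

def select_k(stationary: bool, acceptable_delimiters: int) -> float:
    if stationary:
        return 0.10  # K4: vehicle determined stationary (highest confidence)
    if acceptable_delimiters >= 2:
        return 0.04  # K3: two acceptable straight lane delimiters detected
    if acceptable_delimiters == 1:
        return 0.03  # K2: one acceptable straight lane delimiter detected
    return 0.02      # K1: coarse straight-driving criteria only (step 114)
```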


At step 118, if no acceptable lane delimiter is detected, or if one or more lane delimiters are detected but did not remain detected for a sufficient period of time, then step 120 is performed to select the offset correction value K1, corresponding to a first level of confidence that the vehicle 10 is moving straight based on the conditions met at step 114. If one acceptable lane delimiter (such as either the left lane delimiter 32a or the right lane delimiter 32b in FIG. 6) is detected for a first duration of time, then step 122 is performed to determine whether two acceptable lane delimiters (32a and 32b) are detected for a second duration of time.


The first and second durations may be of different lengths or the same length, and may be coincident times, partially overlapping times or separate times. Steps 118 and 122 may be performed at the same time by the same process, and are merely described as separate for clarity.


In relation to the above-described routines 72-76, the method 100 may be understood as follows. Steps 106 and 108 correspond to the vehicle stationary detection routine 72. Steps 114, 118, 120, 122, 124, and 126 correspond to the vehicle straight driving detection routine 74. Step 110 corresponds to the yaw rate offset filter routine 76.


When steps 102, 104, 106, 114, 116, and 112 are performed in that sequence, the vehicle 10 is not being operated in a manner conducive to updating the previously obtained yaw rate offset 78 and corrected yaw rates 80 are being determined using the previously obtained yaw rate offset.


The steps of the method 100 may be performed in orders different from that described. Any of the steps may be split into two or more smaller steps. Any two or more of the steps may be combined into a larger step. Steps may be omitted.


It will be understood that the use of variable values for the proportion K may be advantageous regardless of the specific details of the routines that are used to determine the new yaw rate offset 78. In other words, it is contemplated to be advantageous to assign different values to the proportion K based on a set of different situations determined to be conducive to updating the value of the yaw rate offset 78, wherein the different situations have different levels of confidence associated therewith.


The techniques described herein may be repeated while the vehicle 10 is being operated in order to continually determine whether conditions are suitable for updating the yaw rate offset 78. Such repetition can occur according to a period, such as 10 ms, 50 ms, or 100 ms, and such period may be allowed to vary, such as by about +/−50 percent.


Thus, the present invention may comprise a system for determining a corrected yaw rate of a land-based vehicle, with the system comprising a yaw rate sensor, a camera and a processor. The camera is configured to be mounted to the vehicle and operable to capture image data representative of the environment exterior of the vehicle. The processor is operable to:


(a) process image data captured by the camera;


(b) determine if the vehicle is moving and has a zero yaw rate based at least in part on processing of captured image data;


(c) obtain a first measured yaw rate from the yaw rate sensor when the vehicle is determined in step (b) to be moving and to have a zero yaw rate;


(d) determine a yaw rate offset based at least in part on the measured yaw rate obtained in step (c);


(e) obtain a second measured yaw rate from the yaw rate sensor; and


(f) determine a corrected yaw rate for the vehicle based on the second measured yaw rate and the yaw rate offset.
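

A minimal sketch of steps (a) through (f) follows, assuming, as one plausible implementation rather than a requirement of the system, that the correction of step (f) is a simple subtraction of the offset; the sensor and image-processing interfaces are hypothetical stubs:

    #include <stdbool.h>
    #include <stdio.h>

    /* Hypothetical stubs standing in for the yaw rate sensor and for the
     * image-based determination of step (b). */
    static double read_yaw_rate_sensor(void)      { return 0.12; }
    static bool   moving_with_zero_yaw_rate(void) { return true; }

    int main(void)
    {
        static double yaw_rate_offset = 0.0; /* persists across updates */

        /* Steps (b)-(d): when image processing indicates the vehicle is
         * moving with zero true yaw rate, any non-zero reading is sensor
         * error, so it can be folded into the stored offset. */
        if (moving_with_zero_yaw_rate()) {
            double first_measured = read_yaw_rate_sensor();   /* step (c) */
            yaw_rate_offset = first_measured;                  /* step (d) */
        }

        /* Steps (e)-(f): correct a subsequent measurement. */
        double second_measured = read_yaw_rate_sensor();       /* step (e) */
        double corrected = second_measured - yaw_rate_offset;  /* step (f) */
        printf("corrected yaw rate: %.4f\n", corrected);
        return 0;
    }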


Optionally, the processor may be operable in step (b) to (g) determine, at least from processing of captured image data, that the vehicle is driving straight. The processor may be operable in step (b) to:


(h) detect any lane delimiters present in the exterior scene represented by captured image data;


(i) determine whether any lane delimiters detected in step (h) are qualifying lane delimiters which meet selected criteria; and


(j) compare a heading of the vehicle with the direction of at least one qualifying lane delimiter.


Optionally, when there are two qualifying lane delimiters, the processor may be operable in step (j) to compare a heading of the vehicle with the direction of the two qualifying lane delimiters. Optionally, the processor may be operable in step (i) to determine whether the amount of curvature in any lane delimiters detected in step (h) is below a selected threshold amount of curvature. Optionally, the processor may be operable in step (i) to determine whether any lane delimiters detected in step (h) have been detected for more than a selected duration.
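

The qualification and heading tests of steps (i) and (j) may be sketched as follows; the struct layout, thresholds and names are illustrative assumptions only, since the selected criteria and tolerances are left to the implementation:

    #include <math.h>
    #include <stdbool.h>

    /* Hypothetical representation of a detected lane delimiter; the
     * fields and thresholds are illustrative assumptions only. */
    struct lane_delimiter {
        double curvature;        /* 1/m, from a curve fit to image data  */
        double detected_seconds; /* continuous tracking time             */
        double direction_rad;    /* delimiter direction in vehicle frame */
    };

    #define MAX_CURVATURE       0.001 /* step (i): straightness criterion */
    #define MIN_DETECT_SECONDS  2.0   /* step (i): duration criterion     */
    #define MAX_HEADING_ERR_RAD 0.01  /* step (j): parallelism tolerance  */

    /* Step (i): does a detected delimiter meet the selected criteria? */
    static bool is_qualifying(const struct lane_delimiter *d)
    {
        return fabs(d->curvature) < MAX_CURVATURE &&
               d->detected_seconds >= MIN_DETECT_SECONDS;
    }

    /* Step (j): is the vehicle heading parallel to the delimiter? */
    static bool heading_parallel(double vehicle_heading_rad,
                                 const struct lane_delimiter *d)
    {
        return fabs(vehicle_heading_rad - d->direction_rad) <
               MAX_HEADING_ERR_RAD;
    }

    int main(void)
    {
        struct lane_delimiter left = { 0.0004, 3.0, 0.002 };
        return (is_qualifying(&left) &&
                heading_parallel(0.0, &left)) ? 0 : 1;
    }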


Optionally, the yaw rate offset may be a first selected proportion of the first measured yaw rate obtained in step (c) and a second selected proportion of a previous yaw rate offset, wherein the first and second selected proportions are selected based at least in part on how many lane delimiters meet the selected criteria.


Optionally, the processor may be operable in step (b) to determine whether a steering angle of the vehicle is less than a selected threshold steering angle. Optionally, the processor may be operable in step (b) to determine whether the speed of the vehicle is greater than a selected threshold speed. Optionally, the processor may be operable in step (b) to determine whether the steering angle of the vehicle is approximately zero.


Optionally, the yaw rate offset may be a first selected proportion of the first measured yaw rate obtained in step (c) and a second selected proportion of a previous yaw rate offset.


According to another aspect of the present invention, a system for determining a corrected yaw rate for a yaw rate sensor on a land-based vehicle comprises an accelerometer operable to sense a longitudinal acceleration of the vehicle and a processor connected to the yaw rate sensor and the accelerometer, with the processor operable to:


(a) obtain an acceleration from an accelerometer positioned to sense a longitudinal acceleration of the vehicle;


(b) determine a rate of change of the acceleration;


(c) carry out a determination of a yaw rate offset based at least in part on a first measured yaw rate from the yaw rate sensor, depending at least in part on whether the rate of change of the acceleration determined in step (b) is approximately zero;


(d) obtain a second measured yaw rate from the yaw rate sensor; and


(e) determine a corrected yaw rate for the vehicle based on the second measured yaw rate and the yaw rate offset.


The processor may be operable to carry out step (c) depending at least in part on whether the rate of change of the acceleration determined in step (b) is approximately zero for a selected duration. The processor may be operable to carry out step (c) depending in part on a determination of whether the vehicle has a speed that is approximately zero.
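

A minimal sketch of the jerk-based gating of steps (a) through (c), assuming hypothetical thresholds for "approximately zero" jerk and speed and a selected duration of one second:

    #include <math.h>
    #include <stdbool.h>
    #include <stdio.h>

    /* Illustrative thresholds; the actual selected values are not
     * specified by the description. */
    #define JERK_EPS_MPS3    0.05 /* "approximately zero" jerk           */
    #define SPEED_EPS_MPS    0.01 /* "approximately zero" speed          */
    #define REQUIRED_STILL_S 1.0  /* selected duration of near-zero jerk */

    /* Steps (a)-(b): differentiate successive longitudinal-acceleration
     * samples to estimate jerk, and gate the offset update of step (c)
     * on jerk and speed both staying approximately zero for a duration. */
    static bool stationary_gate(double accel_now, double accel_prev,
                                double dt_s, double speed_mps,
                                double *still_time_s)
    {
        double jerk = (accel_now - accel_prev) / dt_s; /* step (b) */

        if (fabs(jerk) < JERK_EPS_MPS3 && fabs(speed_mps) < SPEED_EPS_MPS)
            *still_time_s += dt_s;
        else
            *still_time_s = 0.0;

        return *still_time_s >= REQUIRED_STILL_S;
    }

    int main(void)
    {
        double still = 0.0;
        /* Constant acceleration and zero speed sampled every 10 ms: the
         * jerk is zero, so the gate opens after one second. */
        for (int i = 0; i < 150; i++) {
            if (stationary_gate(0.0, 0.0, 0.01, 0.0, &still)) {
                printf("stationary after %d ms; offset update permitted\n",
                       (i + 1) * 10);
                break;
            }
        }
        return 0;
    }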


Optionally, the yaw rate offset is a first selected proportion of the first measured yaw rate referenced in step (c) and a second selected proportion of a previous yaw rate offset.


According to another aspect of the present invention, a system for determining a corrected yaw rate for a yaw rate sensor on a land-based vehicle comprises a memory and a processor connected to the yaw rate sensor and the memory, with the processor operable to:


(a) determine whether the vehicle has a yaw rate of zero;


(b) obtain a first measured yaw rate from the yaw rate sensor depending on the determination made in step (a);


(c) determine a yaw rate offset that is a first selected proportion of the first measured yaw rate obtained in step (b) and a second selected proportion of a previous yaw rate offset, wherein the first selected proportion and the second selected proportion are selected based on a set of criteria based on data determined from at least one other sensor of the vehicle;


(d) obtain a second measured yaw rate from the yaw rate sensor; and


(e) determine a corrected yaw rate for the vehicle based on the second measured yaw rate and the yaw rate offset.


The data may include a derivative of longitudinal acceleration for the vehicle, and the set of criteria may include whether the derivative of the longitudinal acceleration for the vehicle is approximately zero. The ratio of the first and second selected proportions may vary between about 1:99 and about 1:9. The at least one other sensor may include a vehicle-mounted camera, and the data may include image data captured by the vehicle-mounted camera and representative of the environment exterior of the vehicle. The set of criteria may include the number of lane delimiters detected in the image data.
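

For concreteness, a ratio of the first to the second selected proportion between about 1:99 and about 1:9 corresponds to a first proportion K between about 0.01 and about 0.1, since K/(1-K) = 1/99 yields K = 0.01 and K/(1-K) = 1/9 yields K = 0.1. A minimal sketch with hypothetical names:

    #include <stdio.h>

    /* Clamp the first selected proportion K so that the ratio of the
     * first to the second proportion, K : (1 - K), stays between about
     * 1:99 and about 1:9, i.e. K in [0.01, 0.1]. Names are illustrative. */
    static double clamp_proportion(double k)
    {
        if (k < 0.01) return 0.01; /* ratio 1:99 */
        if (k > 0.10) return 0.10; /* ratio 1:9  */
        return k;
    }

    int main(void)
    {
        double k = clamp_proportion(0.04); /* e.g. K3 from the description */
        double offset_prev = 0.10, measured = 0.30;
        double offset = k * measured + (1.0 - k) * offset_prev;
        printf("K=%.2f -> updated offset %.4f\n", k, offset);
        return 0;
    }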


Optionally, and desirably, the system of the present invention utilizes an image-based sensor or camera and image processing of image data captured by the camera. The system and/or camera of the vehicle includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EyeQ2 or EyeQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.


The camera or imager or imaging sensor may comprise any suitable camera or imager or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in PCT Application No. PCT/US2012/066571, filed Nov. 27, 2012, which is hereby incorporated herein by reference in its entirety.


The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, an array of a plurality of photosensor elements arranged in at least about 640 columns and 480 rows (at least about a 640×480 imaging array), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data. For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, PCT Application No. PCT/US2010/047256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686 and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or PCT Application No. PCT/US2012/048800, filed Jul. 30, 2012, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012, and/or PCT Application No. PCT/US2012/056014, filed Sep. 19, 2012, and/or PCT Application No. PCT/US12/57007, filed Sep. 25, 2012, and/or PCT Application No. PCT/US2012/061548, filed Oct. 24, 2012, and/or PCT Application No. PCT/US2012/062906, filed Nov. 1, 2012, and/or PCT Application No. PCT/US2012/063520, filed Nov. 5, 2012, and/or PCT Application No. PCT/US2012/064980, filed Nov. 14, 2012, and/or PCT Application No. PCT/US2012/066570, filed Nov. 27, 2012, and/or PCT Application No. PCT/US2012/066571, filed Nov. 27, 2012, and/or PCT Application No. PCT/US2012/068331, filed Dec. 7, 2012, and/or PCT Application No. PCT/US2012/071219, filed Dec. 21, 2012, and/or PCT Application No. PCT/US2013/022119, filed Jan. 18, 2013, and/or PCT Application No. PCT/US2013/027342, filed Feb. 22, 2013, and/or U.S. patent application Ser. No. 13/681,963, filed Nov. 20, 2012, now U.S. Pat. No. 9,264,673; Ser. No. 13/660,306, filed Oct. 25, 2012, now U.S. Pat. No. 9,146,898; Ser. No. 13/653,577, filed Oct. 17, 2012, now U.S. Pat. No. 9,174,574; and/or Ser. No. 13/534,657, filed Jun. 27, 2012 and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, and/or U.S. provisional applications, Ser. No. 61/736,104, filed Dec. 12, 2012; Ser. No. 61/736,103, filed Dec. 12, 2012; Ser. No. 61/735,314, filed Dec. 10, 2012; Ser. No. 61/734,457, filed Dec. 7, 2012; Ser. No. 61/733,598, filed Dec. 5, 2012; Ser. No. 61/733,093, filed Dec. 4, 2012; Ser. No. 61/727,912, filed Nov. 19, 2012; Ser. No. 61/727,911, filed Nov. 19, 2012; Ser. No. 61/727,910, filed Nov. 19, 2012; Ser. No. 
61/718,382, filed Oct. 25, 2012; Ser. No. 61/710,924, filed Oct. 8, 2012; Ser. No. 61/696,416, filed Sep. 4, 2012; Ser. No. 61/682,995, filed Aug. 14, 2012; Ser. No. 61/682,486, filed Aug. 13, 2012; Ser. No. 61/680,883, filed Aug. 8, 2012; Ser. No. 61/676,405, filed Jul. 27, 2012; Ser. No. 61/666,146, filed Jun. 29, 2012; Ser. No. 61/648,744, filed May 18, 2012; Ser. No. 61/624,507, filed Apr. 16, 2012; Ser. No. 61/616,126, filed Mar. 27, 2012; Ser. No. 61/613,651, filed Mar. 21, 2012; and/or Ser. No. 61/607,229, filed Mar. 6, 2012, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in PCT Application No. PCT/US10/038477, filed Jun. 14, 2010, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.


The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and 6,824,281, and/or International Publication No. WO 2010/099416, published Sep. 2, 2010, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or U.S. patent application Ser. No. 12/508,840, filed Jul. 24, 2009, and published Jan. 28, 2010 as U.S. Pat. Publication No. US 2010-0020170, and/or PCT Application No. PCT/US2012/048110, filed Jul. 25, 2012, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012 and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. patent application Ser. No. 12/091,359, filed Apr. 24, 2008 and published Oct. 1, 2009 as U.S. Publication No. US-2009-0244361, and/or Ser. No. 13/260,400, filed Sep. 26, 2011, now U.S. Pat. No. 8,542,451, and/or U.S. Pat. Nos. 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606 and/or 7,720,580, and/or U.S. patent application Ser. No. 10/534,632, filed May 11, 2005, now U.S. Pat. No. 7,965,336; and/or PCT Application No. PCT/US2008/076022, filed Sep. 11, 2008 and published Mar. 19, 2009 as International Publication No. WO/2009/036176, and/or PCT Application No. PCT/US2008/078700, filed Oct. 3, 2008 and published Apr. 9, 2009 as International Publication No. WO/2009/046268, which are all hereby incorporated herein by reference in their entireties.


The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, and/or U.S. patent application Ser. No. 11/239,980, filed Sep. 30, 2005, now U.S. Pat. No. 7,881,496, and/or U.S. provisional applications, Ser. No. 60/628,709, filed Nov. 17, 2004; Ser. No. 60/614,644, filed Sep. 30, 2004; Ser. No. 60/618,686, filed Oct. 14, 2004; Ser. No. 60/638,687, filed Dec. 23, 2004, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.


Optionally, the circuit board or chip may include circuitry for the imaging array sensor and or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. No. 7,255,451 and/or U.S. Pat. No. 7,480,149; and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, and/or Ser. No. 12/578,732, filed Oct. 14, 2009 and published Apr. 22, 2010 as U.S. Publication No. US-2010-0097469, which are hereby incorporated herein by reference in their entireties.


Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. patent application Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in PCT Application No. PCT/US2011/056295, filed Oct. 14, 2011 and published Apr. 19, 2012 as International Publication No. WO 2012/051500, which is hereby incorporated herein by reference in its entirety).


Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in PCT Application No. PCT/US10/25545, filed Feb. 26, 2010 and published on Sep. 2, 2010 as International Publication No. WO 2010/099416, and/or PCT Application No. PCT/US10/47256, filed Aug. 31, 2010 and published Mar. 10, 2011 as International Publication No. WO 2011/028686, and/or PCT Application No. PCT/US2011/062834, filed Dec. 1, 2011 and published Jun. 7, 2012 as International Publication No. WO 2012/075250, and/or PCT Application No. PCT/US2012/048993, filed Jul. 31, 2012, and/or PCT Application No. PCT/US11/62755, filed Dec. 1, 2011 and published Jun. 7, 2012 as International Publication No. WO 2012-075250, and/or PCT Application No. PCT/CA2012/000378, filed Apr. 25, 2012, and/or PCT Application No. PCT/US2012/066571, filed Nov. 27, 2012, and/or PCT Application No. PCT/US2012/068331, filed Dec. 7, 2012, and/or PCT Application No. PCT/US2013/022119, filed Jan. 18, 2013, and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties.


Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. patent application Ser. No. 12/091,525, filed Apr. 25, 2008, now U.S. Pat. No. 7,855,755; Ser. No. 11/226,628, filed Sep. 14, 2005 and published Mar. 23, 2006 as U.S. Publication No. US-2006-0061008; and/or Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036 and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.


Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and 6,124,886, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


While the foregoing provides certain non-limiting example embodiments, it should be understood that combinations, subsets, and variations of the foregoing are contemplated. The monopoly sought is defined by the claims.

Claims
  • 1. A method for determining a corrected yaw rate for a vehicle, said method comprising: receiving a first yaw rate input from a yaw rate sensor of the vehicle; determining, by a processor, if the vehicle is moving or stationary; if the vehicle is determined to be moving, determining a steering angle of the vehicle; determining, by the processor, an offset correction value based at least in part on a determined speed of the vehicle and the determined steering angle; determining, by the processor, a yaw rate offset based at least in part on the determined offset correction value and the received first yaw rate input; receiving a second yaw rate input from the yaw rate sensor of the vehicle; and determining, by the processor, a corrected yaw rate value based at least in part on the received second yaw rate input and the determined yaw rate offset.
  • 2. The method of claim 1, wherein, if the vehicle is determined to be stationary, said method comprises determining that the offset correction value is within 10 percent of a previously determined offset correction value.
  • 3. The method of claim 1, wherein, if the vehicle is determined to be moving below a threshold speed and the steering angle is determined to be less than a threshold level, the yaw rate offset is not changed from a previously determined yaw rate offset.
  • 4. The method of claim 1, wherein, if the vehicle is determined to be moving at a determined speed that is above a threshold speed or the steering angle is determined to be greater than a threshold level, the yaw rate offset is determined based at least in part on the determined speed or the determined steering angle.
  • 5. The method of claim 1, wherein, if the vehicle is determined to be moving at a determined speed that is above a threshold speed and the steering angle is determined to be greater than a threshold level, the yaw rate offset is determined based at least in part on the determined speed and the determined steering angle.
  • 6. The method of claim 1, wherein determining the offset correction value is based at least in part on determination of lane markers on the road being traveled by the vehicle.
  • 7. The method of claim 1, wherein determining the offset correction value is based at least in part on how many lane markers are determined on the road being traveled by the vehicle.
  • 8. The method of claim 1, wherein determining the yaw rate offset comprises using a first selected proportion of the received first yaw rate input and a second selected proportion of a previously determined yaw rate offset.
  • 9. The method of claim 8, comprising selecting the first and second selected proportions based at least in part on detection of lane markers on the road being traveled by the vehicle.
  • 10. The method of claim 8, wherein a ratio of the first and second selected proportions varies between 1:99 and 1:9.
  • 11. The method of claim 8, comprising selecting the first selected proportion and the second selected proportion based on a set of criteria based on data determined from at least one other sensor of the vehicle.
  • 12. The method of claim 11, wherein the at least one sensor comprises a vehicle-mounted camera.
  • 13. A method for determining a corrected yaw rate for a vehicle, said method comprising: receiving a first yaw rate input from a yaw rate sensor of the vehicle; determining, by a processor, if the vehicle is moving or stationary; if the vehicle is determined to be moving, determining a steering angle of the vehicle; determining, by the processor, an offset correction value based at least in part on a determined speed of the vehicle and the determined steering angle; determining, by the processor, a yaw rate offset based at least in part on the determined offset correction value and the received first yaw rate input; receiving a second yaw rate input from the yaw rate sensor of the vehicle; determining, by the processor, a corrected yaw rate value based at least in part on the received second yaw rate input and the determined yaw rate offset; wherein, if the vehicle is determined to be moving below a first threshold speed and the steering angle is determined to be less than a first threshold level, the yaw rate offset is not changed from a previously determined yaw rate offset; and wherein, if the vehicle is determined to be moving at a determined speed that is above a second threshold speed or the steering angle is determined to be greater than a second threshold level, the yaw rate offset is determined based at least in part on the determined speed or the determined steering angle.
  • 14. The method of claim 13, wherein, if the vehicle is determined to be stationary, said method comprises determining that the offset correction value is within 10 percent of a previously determined offset correction value.
  • 15. The method of claim 13, wherein, if the vehicle is determined to be moving at a determined speed that is above the second threshold speed and the steering angle is determined to be greater than the second threshold level, the yaw rate offset is determined based at least in part on the determined speed and the determined steering angle.
  • 16. The method of claim 13, wherein determining the offset correction value is based at least in part on determination of lane markers on the road being traveled by the vehicle.
  • 17. The method of claim 13, wherein determining the yaw rate offset comprises using a first selected proportion of the received first yaw rate input and a second selected proportion of a previously determined yaw rate offset.
  • 18. A method for determining a corrected yaw rate for a vehicle, said method comprising: receiving a first yaw rate input from a yaw rate sensor of the vehicle; determining, by a processor, if the vehicle is moving or stationary; if the vehicle is determined to be moving, determining a steering angle of the vehicle; determining, by the processor, an offset correction value based at least in part on a determined speed of the vehicle and the determined steering angle; determining lane markers on the road being traveled by the vehicle via processing of image data captured by a vehicle-mounted camera; wherein determining the offset correction value is based at least in part on determination of lane markers on the road being traveled by the vehicle; determining, by the processor, a yaw rate offset based at least in part on the determined offset correction value and the received first yaw rate input; receiving a second yaw rate input from the yaw rate sensor of the vehicle; and determining, by the processor, a corrected yaw rate value based at least in part on the received second yaw rate input and the determined yaw rate offset.
  • 19. The method of claim 18, wherein, if the vehicle is determined to be stationary, said method comprises determining that the offset correction value is within 10 percent of a previously determined offset correction value.
  • 20. The method of claim 18, wherein determining the offset correction value is based at least in part on how many lane markers are determined on the road being traveled by the vehicle.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 15/161,709, filed May 23, 2016, now U.S. Pat. No. 9,715,769, which is a continuation of U.S. patent application Ser. No. 14/499,784, filed Sep. 29, 2014, now U.S. Pat. No. 9,346,468, which is a continuation of U.S. patent application Ser. No. 14/246,495, filed Apr. 7, 2014, now U.S. Pat. No. 8,849,495, which is a continuation of U.S. patent application Ser. No. 13/779,881, filed Feb. 28, 2013, now U.S. Pat. No. 8,694,224, which claims the filing benefit of U.S. provisional application, Ser. No. 61/605,516, filed Mar. 1, 2012, which is hereby incorporated herein by reference in its entirety.

US Referenced Citations (368)
Number Name Date Kind
4720790 Miki et al. Jan 1988 A
4987357 Masaki Jan 1991 A
4991054 Walters Feb 1991 A
5001558 Burley et al. Mar 1991 A
5003288 Wilhelm Mar 1991 A
5012082 Watanabe Apr 1991 A
5016977 Baude et al. May 1991 A
5027001 Torbert Jun 1991 A
5027200 Petrossian et al. Jun 1991 A
5044706 Chen Sep 1991 A
5055668 French Oct 1991 A
5059877 Teder Oct 1991 A
5064274 Alten Nov 1991 A
5072154 Chen Dec 1991 A
5073012 Lynam Dec 1991 A
5076673 Lynam et al. Dec 1991 A
5086253 Lawler Feb 1992 A
5096287 Kakinami et al. Mar 1992 A
5097362 Lynas Mar 1992 A
5115346 Lynam May 1992 A
5121200 Choi Jun 1992 A
5124549 Michaels et al. Jun 1992 A
5130709 Toyama et al. Jul 1992 A
5148014 Lynam Sep 1992 A
5151816 Varaprasad et al. Sep 1992 A
5168378 Black Dec 1992 A
5170374 Shimohigashi et al. Dec 1992 A
5172235 Wilm et al. Dec 1992 A
5177685 Davis et al. Jan 1993 A
5182502 Slotkowski et al. Jan 1993 A
5184956 Langlais et al. Feb 1993 A
5189561 Hong Feb 1993 A
5193000 Lipton et al. Mar 1993 A
5193029 Schofield Mar 1993 A
5204778 Bechtel Apr 1993 A
5208701 Maeda May 1993 A
5245422 Borcherts et al. Sep 1993 A
5253109 O'Farrell Oct 1993 A
5255442 Schierbeek et al. Oct 1993 A
5276389 Levers Jan 1994 A
5285060 Larson et al. Feb 1994 A
5289182 Brillard et al. Feb 1994 A
5289321 Secor Feb 1994 A
5305012 Faris Apr 1994 A
5307136 Saneyoshi Apr 1994 A
5309137 Kajiwara May 1994 A
5313072 Vachss May 1994 A
5325096 Pakett Jun 1994 A
5325386 Jewell et al. Jun 1994 A
5329206 Slotkowski et al. Jul 1994 A
5331312 Kudoh Jul 1994 A
5336980 Levers Aug 1994 A
5341437 Nakayama Aug 1994 A
5351044 Mathur et al. Sep 1994 A
5355118 Fukuhara Oct 1994 A
5374852 Parkes Dec 1994 A
5386285 Asayama Jan 1995 A
5394333 Kao Feb 1995 A
5406395 Wilson et al. Apr 1995 A
5406414 O'Farrell et al. Apr 1995 A
5410346 Saneyoshi et al. Apr 1995 A
5414257 Stanton May 1995 A
5414461 Kishi et al. May 1995 A
5416313 Larson et al. May 1995 A
5416318 Hegyi May 1995 A
5416478 Morinaga May 1995 A
5424952 Asayama Jun 1995 A
5426294 Kobayashi et al. Jun 1995 A
5430431 Nelson Jul 1995 A
5434407 Bauer et al. Jul 1995 A
5440428 Hegg et al. Aug 1995 A
5444478 Lelong et al. Aug 1995 A
5451822 Bechtel et al. Sep 1995 A
5457493 Leddy et al. Oct 1995 A
5461357 Yoshioka et al. Oct 1995 A
5461361 Moore Oct 1995 A
5469298 Suman et al. Nov 1995 A
5471515 Fossum et al. Nov 1995 A
5475494 Nishida et al. Dec 1995 A
5497306 Pastrick Mar 1996 A
5498866 Bendicks et al. Mar 1996 A
5500766 Stonecypher Mar 1996 A
5510983 Lino Apr 1996 A
5515448 Nishitani May 1996 A
5521633 Nakajima et al. May 1996 A
5528698 Kamei et al. Jun 1996 A
5529138 Shaw et al. Jun 1996 A
5530240 Larson et al. Jun 1996 A
5530420 Tsuchiya et al. Jun 1996 A
5535314 Alves et al. Jul 1996 A
5537003 Bechtel et al. Jul 1996 A
5539397 Asanuma et al. Jul 1996 A
5541590 Nishio Jul 1996 A
5550677 Schofield et al. Aug 1996 A
5555555 Sato et al. Sep 1996 A
5568027 Teder Oct 1996 A
5574443 Hsieh Nov 1996 A
5581464 Woll et al. Dec 1996 A
5594222 Caldwell Jan 1997 A
5610756 Lynam et al. Mar 1997 A
5614788 Mullins Mar 1997 A
5619370 Guinosso Apr 1997 A
5632092 Blank et al. May 1997 A
5634709 Iwama Jun 1997 A
5642299 Hardin et al. Jun 1997 A
5648835 Uzawa Jul 1997 A
5650944 Kise Jul 1997 A
5660454 Mori et al. Aug 1997 A
5661303 Teder Aug 1997 A
5666028 Bechtel et al. Sep 1997 A
5670935 Schofield et al. Sep 1997 A
5677851 Kingdon et al. Oct 1997 A
5699044 Van Lente et al. Dec 1997 A
5724316 Brunts Mar 1998 A
5732379 Eckert et al. Mar 1998 A
5737226 Olson et al. Apr 1998 A
5760828 Cortes Jun 1998 A
5760931 Saburi et al. Jun 1998 A
5760962 Schofield et al. Jun 1998 A
5761094 Olson et al. Jun 1998 A
5765116 Wilson-Jones et al. Jun 1998 A
5765118 Fukatani Jun 1998 A
5781437 Wiemer et al. Jul 1998 A
5786772 Schofield et al. Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker et al. Aug 1998 A
5793308 Rosinski et al. Aug 1998 A
5793420 Schmidt Aug 1998 A
5796094 Schofield et al. Aug 1998 A
5835255 Miles Nov 1998 A
5837994 Stam et al. Nov 1998 A
5844505 Van Ryzin Dec 1998 A
5844682 Kiyomoto et al. Dec 1998 A
5845000 Breed et al. Dec 1998 A
5848802 Breed et al. Dec 1998 A
5850176 Kinoshita et al. Dec 1998 A
5850254 Takano et al. Dec 1998 A
5867591 Onda Feb 1999 A
5877707 Kowalick Mar 1999 A
5877897 Schofield et al. Mar 1999 A
5878357 Sivashankar et al. Mar 1999 A
5878370 Olson Mar 1999 A
5883739 Ashihara et al. Mar 1999 A
5884212 Lion Mar 1999 A
5890021 Onoda Mar 1999 A
5896085 Mori et al. Apr 1999 A
5899956 Chan May 1999 A
5915800 Hiwatashi et al. Jun 1999 A
5923027 Stam et al. Jul 1999 A
5924212 Domanski Jul 1999 A
5929786 Schofield et al. Jul 1999 A
5949331 Schofield et al. Sep 1999 A
5959555 Furuta Sep 1999 A
5963247 Banitt Oct 1999 A
5986796 Miles Nov 1999 A
5990469 Bechtel et al. Nov 1999 A
5990649 Nagao et al. Nov 1999 A
6020704 Buschur Feb 2000 A
6049171 Stam et al. Apr 2000 A
6066933 Ponziana May 2000 A
6084519 Coulling et al. Jul 2000 A
6097023 Schofield et al. Aug 2000 A
6097024 Stam et al. Aug 2000 A
6144022 Tenenbaum et al. Nov 2000 A
6175300 Kendrick Jan 2001 B1
6178034 Allemand et al. Jan 2001 B1
6198409 Schofield et al. Mar 2001 B1
6222447 Schofield et al. Apr 2001 B1
6223114 Boros et al. Apr 2001 B1
6227689 Miller May 2001 B1
6250148 Lynam Jun 2001 B1
6266082 Yonezawa et al. Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6285393 Shimoura et al. Sep 2001 B1
6294989 Schofield et al. Sep 2001 B1
6297781 Turnbull et al. Oct 2001 B1
6302545 Schofield et al. Oct 2001 B1
6310611 Caldwell Oct 2001 B1
6317057 Lee Nov 2001 B1
6320176 Schofield et al. Nov 2001 B1
6320282 Caldwell Nov 2001 B1
6333759 Mazzilli Dec 2001 B1
6341523 Lynam Jan 2002 B2
6353392 Schofield et al. Mar 2002 B1
6370329 Teuchert Apr 2002 B1
6392315 Jones et al. May 2002 B1
6420975 DeLine et al. Jul 2002 B1
6424273 Gutta et al. Jul 2002 B1
6430303 Naoi et al. Aug 2002 B1
6442465 Breed et al. Aug 2002 B2
6477464 McCarthy et al. Nov 2002 B2
6497503 Dassanayake et al. Dec 2002 B1
6498620 Schofield et al. Dec 2002 B2
6516664 Lynam Feb 2003 B2
6523964 Schofield et al. Feb 2003 B2
6539306 Turnbull Mar 2003 B2
6547133 DeVries, Jr. et al. Apr 2003 B1
6553130 Lemelson et al. Apr 2003 B1
6559435 Schofield et al. May 2003 B2
6574033 Chui et al. Jun 2003 B1
6589625 Kothari et al. Jul 2003 B1
6594583 Ogura et al. Jul 2003 B2
6611202 Schofield et al. Aug 2003 B2
6611610 Stam et al. Aug 2003 B1
6636258 Strumolo Oct 2003 B2
6650455 Miles Nov 2003 B2
6672731 Schnell et al. Jan 2004 B2
6674562 Miles Jan 2004 B1
6678614 McCarthy et al. Jan 2004 B2
6680792 Miles Jan 2004 B2
6690268 Schofield et al. Feb 2004 B2
6700605 Toyoda et al. Mar 2004 B1
6704621 Stein et al. Mar 2004 B1
6710908 Miles et al. Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6714331 Lewis et al. Mar 2004 B2
6717610 Bos et al. Apr 2004 B1
6735506 Breed et al. May 2004 B2
6741377 Miles May 2004 B2
6744353 Sjönell Jun 2004 B2
6757109 Bos Jun 2004 B2
6762867 Lippert et al. Jul 2004 B2
6794119 Miles Sep 2004 B2
6795221 Urey Sep 2004 B1
6802617 Schofield et al. Oct 2004 B2
6806452 Bos et al. Oct 2004 B2
6819231 Berberich et al. Nov 2004 B2
6822563 Bos et al. Nov 2004 B2
6823241 Shirato et al. Nov 2004 B2
6824281 Schofield et al. Nov 2004 B2
6831261 Schofield et al. Dec 2004 B2
6882287 Schofield Apr 2005 B2
6889161 Winner et al. May 2005 B2
6891563 Schofield et al. May 2005 B2
6909753 Meehan et al. Jun 2005 B2
6946978 Schofield Sep 2005 B2
6953253 Schofield et al. Oct 2005 B2
6968736 Lynam Nov 2005 B2
6975775 Rykowski et al. Dec 2005 B2
6989736 Berberich et al. Jan 2006 B2
7004606 Schofield Feb 2006 B2
7005974 McMahon et al. Feb 2006 B2
7038577 Pawlicki et al. May 2006 B2
7062300 Kim Jun 2006 B1
7065432 Moisel et al. Jun 2006 B2
7079017 Lang et al. Jul 2006 B2
7085637 Breed et al. Aug 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7111968 Bauer et al. Sep 2006 B2
7116246 Winter et al. Oct 2006 B2
7123168 Schofield Oct 2006 B2
7145519 Takahashi et al. Dec 2006 B2
7149613 Stam et al. Dec 2006 B2
7161616 Okamoto et al. Jan 2007 B1
7167796 Taylor et al. Jan 2007 B2
7195381 Lynam et al. Mar 2007 B2
7202776 Breed Apr 2007 B2
7205904 Schofield Apr 2007 B2
7227459 Bos et al. Jun 2007 B2
7227611 Hull et al. Jun 2007 B2
7311406 Schofield et al. Dec 2007 B2
7325934 Schofield et al. Feb 2008 B2
7325935 Schofield et al. Feb 2008 B2
7338177 Lynam Mar 2008 B2
7339149 Schofield et al. Mar 2008 B1
7344261 Schofield et al. Mar 2008 B2
7355524 Schofield Apr 2008 B2
7370983 De Wind et al. May 2008 B2
7380948 Schofield et al. Jun 2008 B2
7388182 Schofield et al. Jun 2008 B2
7402786 Schofield et al. Jul 2008 B2
7423248 Schofield et al. Sep 2008 B2
7425076 Schofield et al. Sep 2008 B2
7446650 Schofield et al. Nov 2008 B2
7459664 Schofield et al. Dec 2008 B2
7460951 Altan Dec 2008 B2
7480149 DeWard et al. Jan 2009 B2
7490007 Taylor et al. Feb 2009 B2
7492281 Lynam et al. Feb 2009 B2
7526103 Schofield et al. Apr 2009 B2
7561181 Schofield et al. Jul 2009 B2
7581859 Lynam Sep 2009 B2
7592928 Chinomi et al. Sep 2009 B2
7616781 Schofield et al. Nov 2009 B2
7619508 Lynam et al. Nov 2009 B2
7639149 Katoh Dec 2009 B2
7681960 Wanke et al. Mar 2010 B2
7720580 Higgins-Luthman May 2010 B2
7777611 Desai Aug 2010 B2
7881496 Camilleri et al. Feb 2011 B2
7914187 Higgins-Luthman et al. Mar 2011 B2
7965336 Bingle et al. Jun 2011 B2
8027029 Lu et al. Sep 2011 B2
8058977 Lynam Nov 2011 B2
8280586 Jung Oct 2012 B2
8340866 Hanzawa et al. Dec 2012 B2
8694224 Chundrlik, Jr. Apr 2014 B2
8849495 Chundrlik, Jr. Sep 2014 B2
9090234 Johnson et al. Jul 2015 B2
9092986 Salomonsson et al. Jul 2015 B2
9205742 Seo Dec 2015 B2
9318020 Salomonsson et al. Apr 2016 B2
9327693 Wolf May 2016 B2
9346468 Chundrlik, Jr. et al. May 2016 B2
9346486 Itamoto May 2016 B2
9481301 Schaffner Nov 2016 B2
9499139 Koravadi Nov 2016 B2
9715769 Chundrlik, Jr. et al. Jul 2017 B2
20020015153 Downs Feb 2002 A1
20020044065 Quist et al. Apr 2002 A1
20020113873 Williams Aug 2002 A1
20020159270 Lynam et al. Oct 2002 A1
20030137586 Lewellen Jul 2003 A1
20030156015 Winner Aug 2003 A1
20030222982 Hamdan et al. Dec 2003 A1
20030227777 Schofield Dec 2003 A1
20040012488 Schofield Jan 2004 A1
20040016870 Pawlicki et al. Jan 2004 A1
20040032321 McMahon et al. Feb 2004 A1
20040051634 Schofield et al. Mar 2004 A1
20040114381 Salmeen et al. Jun 2004 A1
20040128065 Taylor et al. Jul 2004 A1
20040200948 Bos et al. Oct 2004 A1
20050078389 Kulas et al. Apr 2005 A1
20050134966 Burgner Jun 2005 A1
20050134983 Lynam Jun 2005 A1
20050146792 Schofield et al. Jul 2005 A1
20050169003 Lindahl et al. Aug 2005 A1
20050195488 McCabe et al. Sep 2005 A1
20050200700 Schofield et al. Sep 2005 A1
20050232469 Schofield et al. Oct 2005 A1
20050264891 Uken et al. Dec 2005 A1
20060018511 Stam et al. Jan 2006 A1
20060018512 Stam et al. Jan 2006 A1
20060028731 Schofield et al. Feb 2006 A1
20060050018 Hutzel et al. Mar 2006 A1
20060061008 Karner et al. Mar 2006 A1
20060091813 Stam et al. May 2006 A1
20060103727 Tseng May 2006 A1
20060164230 DeWind et al. Jul 2006 A1
20060250501 Wildmann et al. Nov 2006 A1
20060290479 Akatsuka et al. Dec 2006 A1
20070023613 Schofield et al. Feb 2007 A1
20070104476 Yasutomi et al. May 2007 A1
20070109406 Schofield et al. May 2007 A1
20070109651 Schofield et al. May 2007 A1
20070109652 Schofield et al. May 2007 A1
20070109653 Schofield et al. May 2007 A1
20070109654 Schofield et al. May 2007 A1
20070120657 Schofield et al. May 2007 A1
20070176080 Schofield et al. Aug 2007 A1
20080180529 Taylor et al. Jul 2008 A1
20080189013 Iwazaki Aug 2008 A1
20090113509 Tseng et al. Apr 2009 A1
20090177347 Breuer et al. Jul 2009 A1
20090243824 Peterson et al. Oct 2009 A1
20090244361 Gebauer et al. Oct 2009 A1
20090295181 Lawlor et al. Dec 2009 A1
20100020170 Higgins-Luthman et al. Jan 2010 A1
20100045797 Schofield et al. Feb 2010 A1
20100097469 Blank et al. Apr 2010 A1
20100228437 Hanzawa et al. Sep 2010 A1
20120062743 Lynam et al. Mar 2012 A1
20120245817 Cooprider et al. Sep 2012 A1
20120303222 Cooprider et al. Nov 2012 A1
20130124052 Hahne May 2013 A1
20130131918 Hahne May 2013 A1
20130231825 Chundrlik, Jr. Sep 2013 A1
Non-Patent Literature Citations (4)
Entry
Yihu et al., A Fuzzy Control Method to Improve Vehicle Yaw Stability Based on Integrated Yaw Moment Control and Active Front Steering, 2007, IEEE, p. 1508-1512.
Kim et al., Sensor offset compensation for a vehicle Yaw Rate Sensor Using fuzzy logic, 2007, IEEE, p. 362-366.
Pi et al., Enhancements in Vehicle Stability with Yaw Moment Control via Differential Braking, 2009, IEEE, p. 136-140.
Zhao et al., Estimation of Vehicle Yaw Rate and Side Slip Angle Using Moving Horizon Strategy, 2006, IEEE, p. 1828-1832.
Related Publications (1)
Number Date Country
20170323493 A1 Nov 2017 US
Provisional Applications (1)
Number Date Country
61605516 Mar 2012 US
Continuations (4)
Number Date Country
Parent 15161709 May 2016 US
Child 15657332 US
Parent 14499784 Sep 2014 US
Child 15161709 US
Parent 14246495 Apr 2014 US
Child 14499784 US
Parent 13779881 Feb 2013 US
Child 14246495 US