1. Field of the Invention
This invention relates generally to estimating state of health of an object sensing fusion system and, more particularly, to a method for estimating state of health of an object sensing fusion system in which target data from two or more object sensors with an overlapping field of view are analyzed to determine target correlation matching scores, where the matching scores are calculated both within individual frames of data and across a sequence of frames.
2. Discussion of the Related Art
Object detection systems, also known as object sensing systems, have become increasingly common in modern vehicles. Object detection systems can provide a warning to a driver about an object in the path of a vehicle. Object detection systems can also provide input to active vehicle systems—such as Adaptive Cruise Control, which controls vehicle speed to maintain appropriate longitudinal spacing to a leading vehicle, and Collision Avoidance systems, which can control both steering and braking to attempt to avoid an imminent collision.
Object detection systems use one or more sensors, which may be radar, lidar, camera, or other technologies, to detect the presence of an object in or near the path of a host vehicle. Software is used to track the relative motion of objects over time, determine if the objects are moving or stationary, determine what each object is likely to be (another vehicle, a pedestrian, a tree, etc.), and determine whether each object poses a collision threat to the host vehicle.
Object sensing fusion systems are also known in the art, where the fusion system performs a fusion calculation on target data from two or more sensors, and provides a more robust assessment of in-path objects as a result. However, even with an object sensing fusion system, it is possible for accuracy to suffer if a sensor fails, or if a sensor is partially or completely obscured by dirt or debris, or if a sensor is blinded by direct sun or other light. It would be advantageous to have an assessment of the state of health of the object sensing fusion system, and an indication of possible faulty sensors if the state of health is low.
In accordance with the teachings of the present invention, a method and system are disclosed for estimating the state of health of an object sensing fusion system. Target data from a vision system and a radar system, which are used by an object sensing fusion system, are also stored in a context queue. The context queue maintains the vision and radar target data for a sequence of many frames covering a sliding window of time. The target data from the context queue are used to compute matching scores, which are indicative of how well vision targets correlate with radar targets, and vice versa. The matching scores are computed within individual frames of vision and radar data, and across a sequence of multiple frames. The matching scores are used to assess the state of health of the object sensing fusion system. If the fusion system state of health is below a certain threshold, one or more faulty sensors are identified.
Additional features of the present invention will become apparent from the following description and appended claims, taken in conjunction with the accompanying drawings.
The following discussion of the embodiments of the invention directed to a method and apparatus for estimating the state of health of an object sensing fusion system is merely exemplary in nature, and is in no way intended to limit the invention or its applications or uses.
Many modern vehicles include an object sensing system for detecting objects in the path of the vehicle. The object sensing system can provide a warning to a driver, or even trigger other systems to take evasive action to avoid a vehicle collision. Some vehicles also include an object sensing fusion system, which numerically “fuses” the data from two or more object detection sensors and then bases its object detection logic on the fused object data. One such fusion system is described in U.S. Pat. No. 7,460,951, which is assigned to the assignee of the present application, and which is hereby incorporated by reference in its entirety.
However, even with an object sensing fusion system and two or more object sensors, accuracy can suffer if an object detection sensor is faulty or is obscured in some way. The nature of this problem is illustrated in
The vision system 20 has a vision coverage area 22, in which vision targets 24, 26 and 28 exist. The radar system 30 has a radar coverage area 32, in which radar targets 34, 36 and 38 exist. In the
It is to be noted that only targets which exist in zones where the vision system 20 and the radar system 30 have a common field-of-view can be used for comparison. That is, targets in areas where the vision coverage area 22 overlaps the radar coverage area 32, as are all of the targets in
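The common field-of-view restriction described above can be sketched as a simple geometric test. The sector geometry below (each sensor modeled as a forward-facing sector defined by a maximum range and an angular half-width, sensor at the origin, boresight along +x) and the example range/angle values are illustrative assumptions, not parameters from the specification.

```python
import math

def in_sector(x, y, max_range, half_angle_deg):
    """Return True if point (x, y) lies inside a forward-facing sensor
    sector with the given range and angular half-width (sensor at the
    origin, boresight along the +x axis)."""
    r = math.hypot(x, y)
    if r > max_range:
        return False
    bearing = math.degrees(math.atan2(y, x))
    return abs(bearing) <= half_angle_deg

def in_common_fov(x, y, vision=(60.0, 25.0), radar=(120.0, 10.0)):
    """A target is usable for cross-sensor comparison only if it falls
    inside BOTH coverage areas, i.e. in the overlapping field of view.
    The (range, half-angle) defaults are hypothetical."""
    return in_sector(x, y, *vision) and in_sector(x, y, *radar)
```

A target 30 m ahead and 2 m to the side passes both sector tests, while one well off boresight falls outside the narrow radar sector and is excluded from comparison.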
The context queue 54 is a buffer which contains vision target data and radar target data for a sliding window of time. The data in the context queue 54 is organized into “frames”, or snapshots in time. The data frames are maintained in the buffer of the context queue 54 for a certain period of time, then discarded when they become too stale to be of any use. The context queue 54 provides the frame data—containing vision targets and radar targets—to a matching score computation module 56. The matching score computation module 56 computes matching scores, which are an indication of how well the vision targets correlate with the radar targets. The matching scores include a score for target matching within individual frames and a score for target matching across a sequence of frames, as will be discussed in detail below.
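The sliding-window behavior of the context queue can be sketched with a bounded buffer that discards stale frames automatically. The class name, the frame tuple layout, and the window length of seven frames are illustrative assumptions.

```python
from collections import deque

class ContextQueue:
    """Sliding-window buffer of (vision_targets, radar_targets) frames.
    Once the window is full, pushing a new frame silently discards the
    oldest (stalest) frame, mirroring the context queue 54."""

    def __init__(self, max_frames=7):
        # deque with maxlen drops the oldest entry on overflow
        self.frames = deque(maxlen=max_frames)

    def push(self, vision_targets, radar_targets):
        """Append one snapshot-in-time frame of target data."""
        self.frames.append((list(vision_targets), list(radar_targets)))

    def window(self):
        """Return the current window, oldest frame first."""
        return list(self.frames)
```

After ten pushes into a seven-frame queue, only the seven most recent frames remain available to the matching score computation.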
The matching scores from the matching score computation module 56 are provided to a state of health estimation module 58, which determines a value for the state of health of the fusion system 12 based on the matching scores. If the state of health value is below a predetermined threshold, which may be in the range of 50-70%, then a fault identification module 60 identifies one or more faulty sensors. The fault identification module 60 uses data from the vision system 20, the radar system 30 and the matching score computation module 56 to identify faulty sensors, as will be discussed below.
The state of health value and the faulty sensor information, if any, can be used by the fusion system 12 and by other vehicle control systems (not shown) to enhance the performance of vehicle control applications which use the object sensing data.
It is also noted that the system 50 could include more or different object sensing systems than the vision system 20 and the radar system 30. For example, some vehicles are equipped with a front-left and a front-right radar system (not shown), which provide object sensing coverage in lateral areas ahead of the vehicle. Although these other object sensing systems, such as the front-left and front-right radar systems, may have different ranges and fields-of-view than straight-ahead vision or radar systems, they still typically have significant areas of overlapping coverage, and therefore can be used to great benefit in the system 50.
At box 78, matching scores are calculated based on the target data in the context queue 54. At box 80, a state of health of the object sensing fusion system 12 is calculated from the matching scores. If the state of health of the fusion system 12 is determined to be below a certain threshold, one or more faulty sensors are identified at box 82. The matching scores and the state of health may be calculated by the processor in the fusion system 12 or another onboard vehicle processor, or the elements 52-60 of the system 50 may be packaged together as a standalone processing module. In any case, the matching score and state of health calculations are performed on some controller or microprocessor onboard the vehicle 10. The matching score calculations performed at the box 78, the state of health calculation performed at the box 80 and the faulty sensor identification performed at the box 82 will be discussed in detail below.
It can be seen in the combined frame 120 of the example shown in
A matching score Sf is defined as the matching score within a particular time frame, in this case the combined frame 120. The matching score Sf is defined such that a smaller score represents better matching of the sensor targets or objects within the frame. First, three statistics are determined for the frame of interest. A value n1 is defined as the number of matched pairs of vision and radar objects within the frame. For the combined frame 120 of
The matching score Sf can then be calculated as:
Where Pmis
The radius R and the penalty costs Pmis
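The within-frame score can be sketched in code under stated assumptions. The greedy nearest-neighbor pairing inside radius R, the normalization by the total target count, and the exact combination of the penalty costs are assumptions made for illustration; the specification's Equation (1) may differ in form. The statistics follow the text: n1 counts matched vision/radar pairs, and the unmatched remainders of each sensor are penalized.

```python
import math

def frame_matching_score(vision, radar, R=2.0, p_mis_vis=1.0, p_mis_rad=1.0):
    """Within-frame matching score S_f (smaller = better match).

    Assumed form:  S_f = (n2 * p_mis_vis + n3 * p_mis_rad) / (n1 + n2 + n3)
    where n1 = matched vision/radar pairs within radius R,
          n2 = vision targets left unmatched,
          n3 = radar targets left unmatched.
    """
    radar_left = list(radar)
    n1 = 0
    for vx, vy in vision:
        best, best_d = None, R
        for i, (rx, ry) in enumerate(radar_left):
            d = math.hypot(vx - rx, vy - ry)
            if d <= best_d:
                best, best_d = i, d
        if best is not None:
            radar_left.pop(best)   # pair consumed; no double matching
            n1 += 1
    n2 = len(vision) - n1          # vision targets with no radar partner
    n3 = len(radar_left)           # radar targets with no vision partner
    total = n1 + n2 + n3
    return 0.0 if total == 0 else (n2 * p_mis_vis + n3 * p_mis_rad) / total
```

A frame in which every vision target has a nearby radar partner scores zero, while each orphaned target raises the score toward its penalty cost.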
In
The sequence 140 focuses on the targets within the neighborhood circle 122 across the K frames of data. In the sequence 140, K=7. At the beginning of the sequence 140 (on the left-hand side), the neighborhood circle 122 contains the vision target 112, which is used to define the center of the circle 122, and the radar target 102. Although the location of the radar target 102 moves around slightly relative to the vision target 112, the matched pair remains the same. That is, throughout the sequence 140, the radar target 102 is matched with the vision target 112.
The sequence 150 represents another scenario, where target switching occurs within the matched pairs over time. At the beginning of the sequence 150, again the neighborhood circle 122 contains the vision target 112 and the radar target 102. However, after three frames, the radar target 102 no longer appears within the neighborhood circle 122; it is replaced instead with the radar target 106. Two frames later, the radar target changes again, this time to radar target 108. These changes, or switches, in a matched pair of targets across a sequence of frames will be reflected in a higher value of the sequence matching score Sseq, which can be defined as:
Where nswitch is the number of target switches within the sliding time window, and Pswitch is the penalty cost of each target switch. In the sequence 150, K=7 and nswitch=2. A smaller value of the sequence matching score, Sseq, indicates a better match between radar and vision targets, as was the case with the single-frame matching scores Sf.
As each new frame f is added to the context queue 54, the sliding time window of the sequence 130 is adjusted and the sequence matching score Sseq is re-computed. Thus, matching scores are continuously updated as the vision system 20 and the radar system 30 track objects during driving of the vehicle 10.
In
A regression function from the two sequence matching scores to the fusion system state of health can be established using a learning algorithm. Examples of learning algorithms include support vector machine, nearest neighbor, neural network, and discriminant analysis. Other types of learning algorithms may also be used. As an example, the probability that the fusion system 12 is healthy (has a value SOH=100%) can be written as:
Where β0, β1 and β2 are calibrated from training data. It can be seen from Equation 3 that higher values of the matching scores, Sseq
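The regression from the two sequence matching scores to a health probability can be sketched with a standard logistic model. The logistic form, the sign convention (negative weights on the scores so that worse matching lowers the probability), and the example coefficient values are assumptions; in practice β0, β1 and β2 would be calibrated from labeled training data as the text describes.

```python
import math

def fusion_soh_probability(s_seq_vis, s_seq_rad, b0=4.0, b1=-2.0, b2=-2.0):
    """Probability that the fusion system is healthy (SOH = 100%),
    modeled as a logistic regression on the two sequence matching
    scores.  Coefficients here are hypothetical placeholders for
    values learned from training data."""
    z = b0 + b1 * s_seq_vis + b2 * s_seq_rad
    return 1.0 / (1.0 + math.exp(-z))
```

With this sign convention, near-zero matching scores (good sensor agreement) map to a probability near one, and large scores drive the probability toward zero, consistent with higher matching scores indicating a less healthy fusion system.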
If the fusion system state of health is below the threshold at the decision diamond 184, then at box 188 self-diagnostic information is collected from each object detection sensor system. For example, both the vision system 20 and the radar system 30 would typically have self-diagnostic capability to detect some types of internal faults. At decision diamond 190, the self-diagnostic information from each object detection sensor system is evaluated to determine if any of the object sensors are reporting a low state of health value. If any of the object sensors reports a state of health value below a predetermined threshold, which could again be 70% or any other appropriate value, then the faulty sensor identification is complete at box 192. Note that there may be more than one faulty sensor. Conversely, there may be no sensors reporting a low state of health, as some external factor such as blockage or blinding saturation may be causing inaccurate sensor readings.
If none of the object sensors reports a state of health value below the threshold, then at decision diamond 194 it is determined how many object sensors are included in the vehicle 10 and providing data to the fusion system 12. If the number of object sensors is less than three (that is, two), then at box 196 the sensor which is experiencing more frequent target switching in a sequence is identified. That is, in the current sequence, if the number of target switches using the vision system 20 as the reference (nswitch
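For the two-sensor case, the comparison of switch counts can be sketched as below. The attribution rule is one plausible reading of the truncated passage: counting switches with sensor A held fixed as the reference attributes the instability to the other sensor's reports, and the sensor on the unstable side is flagged. A tie is treated as inconclusive.

```python
def identify_faulty_sensor_two(n_switch_vision_ref, n_switch_radar_ref):
    """Two-sensor fault identification by comparing target-switch
    frequency.  Mapping from switch counts to the blamed sensor is an
    assumption: more switching with vision as the reference means the
    radar partners keep changing, and vice versa."""
    if n_switch_vision_ref == n_switch_radar_ref:
        return None  # equally frequent switching: inconclusive
    return "radar" if n_switch_vision_ref > n_switch_radar_ref else "vision"
```

For example, five switches against a stable vision reference versus one switch against a stable radar reference would flag the radar system as the likely faulty sensor.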
If the number of object sensors is three or more, then at box 198 another technique known as majority consistency is used to identify the faulty sensor. For example, if there are three object sensors onboard, then sequence matching scores can be computed between each pair of sensors. The sequence matching score between sensors 1 and 2 can be designated Sseq
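The majority consistency idea for three or more sensors can be sketched as a vote over pairwise sequence matching scores. The exact voting rule and the threshold value are assumptions; the principle follows the text, in that the faulty sensor is the one that disagrees with every other sensor while the remaining sensors agree among themselves.

```python
def majority_consistency(pair_scores, threshold=0.5):
    """Identify a faulty sensor from pairwise sequence matching scores.

    pair_scores: dict mapping a sensor-id pair (i, j) to the sequence
                 matching score S_seq computed between sensors i and j
                 (smaller = better agreement).
    Flags a sensor only when every pair containing it scores poorly
    (above threshold) AND all pairs excluding it agree well.
    """
    sensors = sorted({s for pair in pair_scores for s in pair})
    for s in sensors:
        disagrees_with_all = all(score > threshold
                                 for pair, score in pair_scores.items()
                                 if s in pair)
        rest_agree = all(score <= threshold
                         for pair, score in pair_scores.items()
                         if s not in pair)
        if disagrees_with_all and rest_agree:
            return s
    return None  # no single sensor stands out as the outlier
```

With three sensors, if the scores between sensors 1-2 and 1-3 are both poor while sensors 2 and 3 match each other well, sensor 1 is identified as the faulty one.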
The fusion system state of health calculated by the state of health estimation module 58 and the faulty sensor identification from the fault identification module 60 can be used by the fusion system 12 or another onboard controller to modify system functions. For example, if one of three object sensors is determined to be faulty or providing inaccurate data, then the fusion system 12 can temporarily disregard input from the faulty sensor, until such time as that sensor is demonstrated to be providing reliable data. Other types of system behavior modification logic are also made possible by the fusion system state of health estimation methods disclosed above.
The foregoing discussion discloses and describes merely exemplary embodiments of the present invention. One skilled in the art will readily recognize from such discussion and from the accompanying drawings and claims that various changes, modifications and variations can be made therein without departing from the spirit and scope of the invention as defined in the following claims.
Number | Name | Date | Kind |
---|---|---|---|
4255799 | Laing | Mar 1981 | A |
6513022 | Morgan | Jan 2003 | B1 |
7991550 | Zeng | Aug 2011 | B2 |
20080306666 | Zeng et al. | Dec 2008 | A1 |
20090138141 | Nwadiogbu et al. | May 2009 | A1 |
20100191391 | Zeng | Jul 2010 | A1 |
20120002016 | Zhang et al. | Jan 2012 | A1 |
20120140061 | Zeng | Jun 2012 | A1 |
20130242284 | Zeng | Sep 2013 | A1 |
Number | Date | Country | |
---|---|---|---|
20140142800 A1 | May 2014 | US |