This invention pertains generally to object-tracking systems, and more specifically to measurement systems associated with object-tracking systems related to vehicle operation.
Modern vehicles may be equipped with various sensing devices and systems that assist a vehicle operator in managing vehicle operation. One type of sensing system is intended to identify relative locations and trajectories of other vehicles and other objects on a highway. Exemplary systems employing such sensors include collision-avoidance systems and adaptive cruise control systems.
Sensor systems installed on vehicles are typically calibrated during the vehicle assembly process. However, there is an ongoing concern that sensor orientation and signal output may drift during the life of the sensor, such that the orientation of the sensor relative to the vehicle is changed. When the sensor orientation changes or drifts, measurements become skewed relative to the vehicle. When there are multiple sensors, the concern is further complicated in that outputs between the sensors become skewed.
In order for the data from various sensors to be successfully combined to produce a consistent object map, i.e., the locus and trajectory of a remote object, the sensor data need to be correctly registered. That is, the relative locations of the sensors, and the relationship between their coordinate systems and the vehicle coordinate system, typically oriented to the vehicle frame, need to be determined. When a system fails to correctly account for registration errors, the result may be a mismatch between the compiled object map (sensor data) and ground truth. Examples include overstated confidence in the location and movement of a remote object (or target) such as a vehicle, and an unnecessary multiplicity of tracks in an on-board tracking database, including multiple tracks corresponding to a single remote object.
Therefore, there is a need to align each individual sensor with an accuracy comparable to its intrinsic resolution, e.g., having an alignment accuracy of 0.1 degree for a sensor having an azimuth accuracy on an order of 0.1 degree. Precision sensor mounting is vulnerable to drift during the vehicle's life and difficult to maintain manually.
There is a need to ensure that signals output from sensors are aligned and oriented with a fixed coordinate system to eliminate risk of errors associated with skewed readings. Therefore, it is desirable to have a sensor system that automatically aligns sensor output to a reference coordinate system. It is also desirable to align the sensors using a tracked object as a reference, in order to facilitate regular, ongoing alignments, to improve sensor accuracy and reduce errors associated with drift.
This invention presents a method and apparatus by which object-locating sensors mounted on a vehicle can be aligned to high precision with respect to each other. The invention includes a method and associated apparatus to automatically perform on-line fine alignment of multiple sensors. Up to three geometrical parameters, two for location and one for bearing alignment, can be computed for each sensor based upon object trajectories.
Thus, in accordance with the present invention, an article of manufacture is provided, comprising a storage medium having a computer program encoded therein for effecting a method to align one of a plurality of object-locating sensors mounted on a vehicle. Executing the program accomplishes a method which includes establishing initial values for alignments of each of the object-locating sensors relative to a coordinate system for the vehicle, and determining a plurality of positions for a target object for each of the object-locating sensors. A trajectory is determined for the target object. The alignment of each of the object-locating sensors is adjusted relative to the coordinate system for the vehicle based upon the trajectory for the target object.
Another aspect of the invention comprises establishing initial values for alignments of each of the object-locating sensors using a manual calibration process.
Another aspect of the invention comprises determining positions of the target object for each of the object-locating sensors at a series of substantially time-coincident moments occurring over a period of time, including determining a plurality of matched positions of the target object.
A further aspect of the invention comprises adjusting the alignment of each of the object-locating sensors relative to the coordinate system for the vehicle based upon the trajectory for the target object, including determining matched positions of the target object at a series of substantially time-coincident moments occurring over a period of time, and estimating corrections using a least-squares method. An angular alignment of the sensor is determined relative to the vehicle coordinate system. Each matched position of the target object comprises a fused position of the target object, and, a time-coincident sensor-observed position of the target object.
Another aspect of the invention comprises estimating a plurality of corrections by iteratively executing a least-squares estimation equation.
Another aspect of the invention comprises incrementally iteratively correcting the angular alignment of the sensor relative to the vehicle coordinate system.
Another aspect of the invention comprises the object-locating sensors and subsystems, which can comprise a short-range radar subsystem, a long-range radar subsystem, and a forward vision subsystem.
Another aspect of the invention comprises a system for locating a target object. The system comprises a vehicle equipped with a control system operably connected to a plurality of object-locating sensors each operable to generate a signal output characterizing location of the target object in terms of a range, a time-based change in range, and an angle measured from a coordinate system oriented to the vehicle. The control system operates to fuse the plurality of signal outputs of the object-locating sensors to locate the target object. The control system includes an algorithm for aligning the signal outputs of each of the object-locating sensors.
These and other aspects of the invention will become apparent to those skilled in the art upon reading and understanding the following detailed description of the embodiments.
The invention may take physical form in certain parts and arrangement of parts, the preferred embodiment of which will be described in detail and illustrated in the accompanying drawings which form a part hereof, and wherein:
Referring now to the drawings, wherein the showings are for the purpose of illustrating the invention only and not for the purpose of limiting the same,
The exemplary vehicle comprises a passenger vehicle intended for use on highways, although it is understood that the invention described herein is applicable on any vehicle or other system seeking to monitor position and trajectory of remote vehicles and other objects. The vehicle includes a control system containing various algorithms and calibrations which it is operable to execute at various times. The control system is preferably a subset of an overall vehicle control architecture which is operable to provide coordinated vehicle system control. The control system is operable to monitor inputs from various sensors, synthesize pertinent information and inputs, and execute algorithms to control various actuators to achieve control targets, including such parameters as collision avoidance and adaptive cruise control. The vehicle control architecture comprises a plurality of distributed processors and devices, including a system controller providing functionality such as antilock brakes, traction control, and vehicle stability.
Each processor is preferably a general-purpose digital computer generally comprising a microprocessor or central processing unit, read only memory (ROM), random access memory (RAM), electrically programmable read only memory (EPROM), high speed clock, analog-to-digital (A/D) and digital-to-analog (D/A) circuitry, and input/output circuitry and devices (I/O) and appropriate signal conditioning and buffer circuitry. Each processor has a set of control algorithms, comprising resident program instructions and calibrations stored in ROM and executed to provide the respective functions of each computer.
Algorithms described herein are typically executed during preset loop cycles such that each algorithm is executed at least once each loop cycle. Algorithms stored in the non-volatile memory devices are executed by one of the central processing units and are operable to monitor inputs from the sensing devices and execute control and diagnostic routines to control operation of a respective device, using preset calibrations. Loop cycles are typically executed at regular intervals, for example each 3, 6.25, 15, 25 and 100 milliseconds during ongoing engine and vehicle operation. Alternatively, algorithms may be executed in response to occurrence of an event.
Referring now to
Each object-locating sensor and subsystem provides an output typically characterized in terms of range, R, time-based change in range, R_dot, and angle, Θ, preferably measured from a longitudinal axis of the vehicle. An exemplary short-range radar subsystem has a field-of-view (‘FOV’) of 160 degrees and a maximum range of thirty meters. An exemplary long-range radar subsystem has a field-of-view of 17 degrees and a maximum range of 220 meters. An exemplary forward vision subsystem has a field-of-view of 45 degrees and a maximum range of fifty (50) meters. For each subsystem the field-of-view is preferably oriented around the longitudinal axis of the vehicle 10. The vehicle is preferably oriented to a coordinate system, referred to as an XY-coordinate system 20, wherein the longitudinal axis of the vehicle 10 establishes the X-axis, with a locus at a point convenient to the vehicle and to signal processing, and the Y-axis is established by an axis orthogonal to the longitudinal axis of the vehicle 10 and in a horizontal plane, which is thus parallel to ground surface.
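By way of non-limiting illustration, an observation reported in these polar terms (range R, angle Θ from the longitudinal axis) can be converted to the XY-coordinate system as follows; the helper name below is hypothetical and not taken from the invention:

```python
import math

def polar_to_xy(range_m, angle_rad):
    """Convert a sensor observation (range R, angle Theta measured from
    the vehicle's longitudinal X-axis) to XY-coordinates.
    Hypothetical helper for illustration only."""
    x = range_m * math.cos(angle_rad)  # along the longitudinal (X) axis
    y = range_m * math.sin(angle_rad)  # orthogonal (Y) axis, horizontal plane
    return x, y

# An object 30 m away, 10 degrees off the longitudinal axis:
x, y = polar_to_xy(30.0, math.radians(10.0))
```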
As shown in
The preferred DAC module 24 includes a controller 28, wherein an algorithm and associated calibration (not shown) are stored and configured to receive the estimate data from each of the sensors A, B, to cluster data into like observation tracks (i.e. time-coincident observations of the object 30 by sensors 14, 16 over a series of discrete time events), and to fuse the clustered observations to determine a true track status. It is understood that fusing data from different sensing systems and technologies yields robust results. Again, it is appreciated that any number of sensors can be used in this technique. However, it is also appreciated that an increased number of sensors results in increased algorithm complexity, and the requirement of more computing power to produce results within the same time frame. The preferred controller 28 is housed within the host vehicle 10, but may also be located at a remote location. In this regard, the preferred controller 28 is electrically coupled to the sensor processors 14a, 16a, but may also be wirelessly coupled through RF, LAN, infrared or other conventional wireless technology. The TLM module 26 is configured to receive fused data of like observations, and store the fused observations in a list of tracks 26a.
The invention, as now described, comprises a method to determine an alignment of each object-locating sensor relative to the XY-coordinate system 20 for the vehicle, executed as one or more algorithms in the aforementioned control system. The method comprises establishing initial values for the alignments of each of the object-locating sensors relative to the XY-coordinate system for the vehicle, for each sensor. A plurality of positions for target object 30 is determined, as measured by each of the object-locating sensors, and trajectories are thus determined. A fused trajectory for the target object is determined, based upon the aforementioned trajectories. Alignment of each of the object-locating sensors is adjusted relative to the XY-coordinate system for the vehicle based upon the fused trajectory for the target object. This is now described in greater detail.
The schematic illustration of
The trajectory fusion process comprises a method and apparatus for fusing tracking data from a plurality of sensors to more accurately estimate a location of an object. An exemplary target tracking system and method utilizing a plurality of sensors and data fusion increases the precision and certainty of system measurements above that of any single system sensor. Sensor coverage is expanded by merging sensor fields-of-view and reducing capture/recapture time of objects, thus decreasing a likelihood of producing false positives and false negatives. The exemplary target tracking and sensor fusion system can estimate a condition of at least one object. The system includes a first sensor configured to determine a first estimate of a condition of the object, and a second sensor configured to determine a second estimate of the condition. The system includes a controller communicatively coupled to the sensors, and configured to determine a third estimate of the condition. The third estimate is based in part on the first and second estimates, and each of the first and second estimates includes a measured value and a standard deviation value. The third estimate presents a calculated value and a standard deviation less than each of the first and second standard deviations. A computer program executed by the controller is configured to receive initial estimate data of at least one condition from the sensors, e.g. position, range, or angle, and apply the fusion algorithm to the initial estimate data, so as to determine a state estimate for the condition. The state estimate presents a higher probability and smaller standard deviation than the initial estimate data. The sensor fusion algorithm is applied to a vehicle having like or dissimilar sensors, which increases the robustness of object detection. In this configuration, applications, such as full speed adaptive cruise control (ACC), automatic vehicle braking, and pre-crash systems can be enhanced.
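The fused third estimate described above, whose standard deviation is smaller than either input's, can be illustrated with inverse-variance weighting. The patent does not prescribe this particular fusion rule, so the following sketch is only one plausible realization:

```python
def fuse(m1, s1, m2, s2):
    """Fuse two measurements of one condition, each given as
    (measured value, standard deviation), by inverse-variance weighting.
    Illustrative only; not necessarily the fusion rule of the invention."""
    w1, w2 = 1.0 / s1 ** 2, 1.0 / s2 ** 2
    mean = (w1 * m1 + w2 * m2) / (w1 + w2)   # precision-weighted value
    std = (1.0 / (w1 + w2)) ** 0.5           # always below min(s1, s2)
    return mean, std

# Two range estimates of the same object from dissimilar sensors:
m, s = fuse(100.0, 2.0, 102.0, 3.0)
```

Because the combined precision is the sum of the individual precisions, the fused standard deviation is necessarily smaller than that of either contributing sensor, matching the property claimed for the third estimate.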
The aforementioned fusion process permits determining position of an object in the XY-coordinate system relative to the vehicle. The fusion process comprises measuring forward object 30 in terms of RA, RB, R_dotA, R_dotB, ΘA, ΘB, using sensors 14, 16, located at points A, B. A fused location for the forward object 30 is determined, represented as RF, R_dotF, ΘF, Θ_dotF, described in terms of range, R, and angle, Θ, as previously described. The position of forward object 30 is then converted to parametric coordinates relative to the vehicle's XY-coordinate system. The control system preferably uses fused track trajectories (Line rf1, rf2, rf3), comprising a plurality of fused objects, as a benchmark, i.e., ground truth, to estimate true sensor positions for sensors 14, 16. As shown in
With reference now to
To transform a point, representing a time-stamped location of a target object 30 located on the sensor coordinate system (u, v) to the vehicle coordinate system (x, y) the following actions are executed as algorithms and calibrations in the vehicle control system, as described hereinabove, starting with Eq. 1:
r=Rq+r0 (1)
wherein r=(x, y), q=(u, v), R is a 2-D rotation and r0=(x0, y0) is the position of the sensor center in the vehicle frame.
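Eq. 1 can be sketched in code as follows, with the rotation R parameterized by the sensor's alignment angle ψ relative to the vehicle frame; the function and variable names are illustrative assumptions:

```python
import math

def sensor_to_vehicle(q, psi, r0):
    """Apply Eq. 1, r = Rq + r0: rotate a point q = (u, v) in the sensor
    (UV) frame by the 2-D rotation R(psi) and translate by the sensor
    center r0 = (x0, y0) expressed in the vehicle (XY) frame."""
    u, v = q
    c, s = math.cos(psi), math.sin(psi)
    x = c * u - s * v + r0[0]
    y = s * u + c * v + r0[1]
    return x, y

# Hypothetical sensor mounted 2 m forward, 0.5 m left, rotated 5 degrees:
x, y = sensor_to_vehicle((10.0, 0.0), math.radians(5.0), (2.0, 0.5))
```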
Initially R and r0 are typically determined by a manual calibration process in the vehicle assembly plant. During operation, this information is corrected by an incremental rotation δR and translation δr0 so that the new rotation and translation become as shown in Eqs. 2 and 3, below:
R′=δRR, and, (2)
r′0=r0+δr0 (3)
wherein R is the 2-D rotation matrix written in terms of the alignment angle ψ as:

R=[cos ψ, −sin ψ; sin ψ, cos ψ]
The value ψ denotes the specific sensor's angular alignment with respect to the vehicle frame, i.e. the orientation of the UV-coordinate system relative to the XY-coordinate system. Since the alignment corrections are typically small, the incremental rotation δR can be approximated by Eq. 4, below:
δR=I+ε (4)
wherein:

ε=[0, −δψ; δψ, 0]

and δψ denotes correction of the alignment angle.
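The quality of this small-angle approximation can be checked numerically. The following sketch compares the exact incremental rotation against I + ε for a 0.1-degree correction (a value chosen for illustration, on the order of the sensor resolution discussed above):

```python
import math

# Exact incremental rotation for a small correction d_psi versus its
# first-order approximation I + epsilon (Eq. 4).
d_psi = math.radians(0.1)  # 0.1-degree alignment correction
exact = [[math.cos(d_psi), -math.sin(d_psi)],
         [math.sin(d_psi),  math.cos(d_psi)]]
approx = [[1.0,   -d_psi],
          [d_psi,  1.0]]
max_err = max(abs(exact[i][j] - approx[i][j])
              for i in range(2) for j in range(2))
```

The largest element-wise discrepancy is on the order of d_psi squared, which justifies treating δR as I + ε for the small corrections encountered in practice.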
A correction of the object position is given by Eq. 5:
Δr=r′−r=R′q+r′0−Rq−r0 (5)
Equations 1-5, above, are combined to yield Eq. 6:
Δr=δRRq+δr0−Rq=ε(r−r0)+δr0. (6)
Eq. 6 is rewritten in component form, as Eq. 7:

Δri=[1, 0, −(yi−y0); 0, 1, (xi−x0)]β=Xiβ (7)

wherein:
δr0=(δx0, δy0)T,
ri=(xi, yi)T,
r0=(x0, y0)T, and
β=(δx0, δy0, δψ)T.
Correction of the sensor position is determined by using matched objects. Results calculated in Eq. 7 provide a model by which unknown corrections β are estimated by minimizing a respective χ2 function using a large number of matched objects.
As an example, assume the matched objects are denoted by {(rfi, rai)|i=1, . . . , N}, wherein rfi and rai denote the positions of the i-th fused object and the sensor-observed object, respectively.
The χ2 function to be minimized is given in Eq. 8:

χ2=Σi wi(Δri−Xiβ)T(Δri−Xiβ) (8)

wherein the sum is taken over all matched object pairs (rfi, rai), Δri=rfi−rai and W=diag{w1, w2, . . . , wN} is a weight matrix. Here wi is a function of object range (i.e., wi=f(ri)) such that distant matched objects are attributed larger weighting factors than nearby matched objects. The correction β is found by the least-squares estimation procedure. The solution is shown in Eq. 9, below:

β=X†Δr=(XTWX)−1XTWΔr (9)

wherein X and Δr denote the Xi and Δri stacked over all matched pairs, and X† denotes a pseudoinverse of X.
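The least-squares estimation of Eqs. 7 through 9 can be sketched in pure Python via the normal equations; the data layout and function names below are illustrative assumptions, not part of the invention:

```python
def estimate_corrections(matched, r0):
    """Estimate beta = (dx0, dy0, dpsi) per Eqs. 7-9 by weighted least
    squares. `matched` is a list of (rf, ra, w) tuples: fused position,
    sensor-observed position, and weight w_i. Illustrative sketch."""
    x0, y0 = r0
    # Accumulate normal equations A beta = b, with A = sum_i w_i Xi^T Xi
    # and b = sum_i w_i Xi^T dr_i, where Xi is the 2x3 matrix of Eq. 7.
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for (rf, ra, w) in matched:
        dr = (rf[0] - ra[0], rf[1] - ra[1])          # dr_i = rf_i - ra_i
        Xi = [[1.0, 0.0, -(rf[1] - y0)],
              [0.0, 1.0,  (rf[0] - x0)]]
        for j in range(3):
            for k in range(3):
                A[j][k] += w * sum(Xi[m][j] * Xi[m][k] for m in range(2))
            b[j] += w * sum(Xi[m][j] * dr[m] for m in range(2))
    return solve3(A, b)

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with
    partial pivoting (stands in for the pseudoinverse of Eq. 9)."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]
```

Given matched pairs generated by a known misalignment, this estimator recovers the underlying (δx0, δy0, δψ) from the position residuals alone, which is the essence of the correction step.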
Therefore, the incremental correction equations of the sensor position (R and r0) comprise Eqs. 10 and 11, below:

R′=(I+ηε)R, and (10)

r′0=r0+ηδr0 (11)

wherein η is a learning factor, typically a small positive number (e.g., η=0.01) for updating the sensor position iteratively through time. A large value of η helps the algorithm converge quickly to the true value, but may lead to undesirable overshoot. On the other hand, the drift of sensor position is typically a slow process, thus permitting a small parametric value for η.
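The iterative update with learning factor η can be sketched as follows; here the rotation R is carried as its alignment angle ψ, a first-order simplification of the matrix update of Eq. 10, and all names are illustrative:

```python
def update_alignment(psi, r0, beta, eta=0.01):
    """Apply the incremental corrections with learning factor eta:
    psi <- psi + eta * dpsi and r0 <- r0 + eta * dr0, a first-order
    angle form of the Eq. 10-11 updates. Illustrative sketch."""
    dx0, dy0, dpsi = beta
    psi_new = psi + eta * dpsi
    r0_new = (r0[0] + eta * dx0, r0[1] + eta * dy0)
    return psi_new, r0_new

# One small step toward a correction beta = (0.1, -0.05, 0.01):
psi_new, r0_new = update_alignment(0.0, (1.0, 0.0), (0.1, -0.05, 0.01))
```

Repeated over many loop cycles, these small steps track the slow drift of the sensor position without the overshoot a large η could introduce.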
To recapitulate, adjusting the alignment of each object-locating sensor relative to the vehicle coordinate system comprises initially setting each sensor's position (R and r0) to nominal values, and then repeating the following steps. Each object map is compensated based upon each sensor's position (R and r0). Outputs from each of the sensors are fused to determine a series of temporal benchmark positions for the targeted object, and the trajectory and associated object map are stored in a circular queue for the fused outputs. When the queues of fused objects contain a sufficient amount of data, the following actions are executed for each sensor: the matched objects {(rfi, rai)|i=1, . . . , N} in the queues are output, wherein rfi and rai denote the positions of the fused object and the sensor-observed object, respectively; Eq. 9 is executed to compute the corrections β; and Eqs. 10 and 11 are executed to update the sensor's position (R and r0).
The invention has been described with specific reference to the preferred embodiments and modifications thereto. Further modifications and alterations may occur to others upon reading and understanding the specification. It is intended to include all such modifications and alterations insofar as they come within the scope of the invention.