The subject disclosure relates to vehicle sensors and, in particular, to a system and method for automatically aligning vehicle sensors.
Autonomous, semi-autonomous and driver-assisted vehicles use sensors such as lidar, radar and cameras to obtain measurements of the surroundings of a vehicle. Such measurements are then used by a processor or navigation system of the vehicle to control operation and navigation of the vehicle. Proper geometric alignment of these sensors is important for providing self-consistent data to the processor or navigation system. However, normal use and wear of the vehicle can cause these sensors to lose alignment over time. Accordingly, it is desirable to provide a system and method for realigning these sensors automatically.
In one exemplary embodiment, a method for aligning a sensor with a vehicle is disclosed. A first measurement of a kinematic vector of the vehicle is obtained at a first inertial measurement unit (IMU) associated with the vehicle. A second measurement of the kinematic vector is obtained at a second IMU associated with the sensor. A current relative orientation between a first reference frame associated with the vehicle and a second reference frame associated with the sensor is determined from the kinematic vector. An alignment error between the sensor and the vehicle is determined based on the current relative orientation and a specified relative orientation. The sensor is adjusted to the specified relative orientation to correct for the alignment error.
In addition to one or more of the features described herein, determining the current relative orientation further includes determining a rotation matrix for rotating the first reference frame into the second reference frame. Determining the rotation matrix further includes reducing a cost function. The cost function includes a difference between the first measurement of the kinematic vector in the first reference frame and a rotation of the second measurement of the kinematic vector. In various embodiments, the first measurement of the kinematic vector is obtained at a first time and the second measurement of the kinematic vector is obtained at a second time. The kinematic vector is at least one of an acceleration vector and an angular velocity vector. The first IMU is associated with one of the vehicle and another sensor.
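By way of a non-limiting numerical sketch of the method summarized above, the relative orientation between the vehicle frame and a sensor frame can be recovered from paired kinematic measurements by reducing a least-squares cost. The closed-form singular-value-decomposition (Kabsch) solution used below is an assumption for illustration only; the disclosure does not name a particular solver, and all function names are hypothetical.

```python
import numpy as np

def estimate_relative_rotation(v_vehicle, v_sensor):
    """Estimate R such that v_sensor[m] ~= R @ v_vehicle[m] for each paired
    measurement, by reducing the summed squared residual (closed-form
    Kabsch/SVD solution; an assumption, not taken from the source)."""
    H = v_vehicle.T @ v_sensor          # cross-covariance of the pairs
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection so the result is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

# Simulated check: rotate noise-free vehicle-frame kinematic vectors by a
# known misalignment (5 degrees of yaw) and recover it.
rng = np.random.default_rng(0)
v_vehicle = rng.normal(size=(20, 3))
angle = np.deg2rad(5.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
v_sensor = v_vehicle @ R_true.T         # row-wise application of R_true
R_est = estimate_relative_rotation(v_vehicle, v_sensor)
```

Comparing R_est to the specified relative orientation then yields the alignment error described above.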
In another exemplary embodiment, a system for aligning a sensor with a vehicle is disclosed. The system includes a first inertial measurement unit (IMU) associated with the vehicle, the first IMU configured to obtain a first measurement of a kinematic vector of the vehicle, a second IMU associated with the sensor, the second IMU configured to obtain a second measurement of the kinematic vector, and a processor. The processor is configured to determine a current relative orientation between a first reference frame associated with the vehicle and a second reference frame associated with the sensor from the kinematic vector, determine an alignment error between the sensor and the vehicle based on the current relative orientation and a specified relative orientation, and adjust the sensor from the current relative orientation to the specified relative orientation to correct for the alignment error.
In addition to one or more of the features described herein, the processor is further configured to determine the current relative orientation by determining a rotation matrix for rotating the first reference frame into the second reference frame. The processor is further configured to determine the rotation matrix by reducing a cost function. The cost function includes a difference between the first measurement of the kinematic vector and a rotation of the second measurement of the kinematic vector. The processor is further configured to obtain the first measurement at a first time and obtain the second measurement at a second time. The kinematic vector is at least one of an acceleration vector and an angular velocity vector. The first IMU is associated with one of the vehicle and another sensor.
In yet another exemplary embodiment, a vehicle is disclosed. The vehicle includes a first inertial measurement unit (IMU) associated with the vehicle, the first IMU configured to obtain a first measurement of a kinematic vector of the vehicle, a second IMU associated with a sensor of the vehicle, the second IMU configured to obtain a second measurement of the kinematic vector, and a processor. The processor is configured to determine a current relative orientation between a first reference frame associated with the vehicle and a second reference frame associated with the sensor from the kinematic vector, determine an alignment error between the sensor and the vehicle based on the current relative orientation and a specified relative orientation, and adjust the sensor from the current relative orientation to the specified relative orientation to correct for the alignment error.
In addition to one or more of the features described herein, the processor is further configured to determine the current relative orientation by determining a rotation matrix for rotating the first reference frame into the second reference frame. The processor is further configured to determine the rotation matrix by reducing a cost function, the cost function including a difference between the first measurement of the kinematic vector and a rotation of the second measurement of the kinematic vector. The processor is further configured to obtain the first measurement at a first time and obtain the second measurement at a second time. The kinematic vector is at least one of an acceleration vector and an angular velocity vector. The first IMU is associated with one of the vehicle and another sensor.
The above features and advantages, and other features and advantages of the disclosure are readily apparent from the following detailed description when taken in connection with the accompanying drawings.
Other features, advantages and details appear, by way of example only, in the following detailed description, the detailed description referring to the drawings in which:
The following description is merely exemplary in nature and is not intended to limit the present disclosure, its application or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.
In accordance with an exemplary embodiment,
In general, a trajectory planning system 100 determines a trajectory plan for automated driving of the vehicle 10. The vehicle 10 generally includes a chassis 12, a body 14, front wheels 16, and rear wheels 18. The body 14 is arranged on the chassis 12 and substantially encloses components of the vehicle 10. The body 14 and the chassis 12 may jointly form a frame. The wheels 16 and 18 are each rotationally coupled to the chassis 12 near respective corners of the body 14.
As shown, the vehicle 10 generally includes a propulsion system 20, a transmission system 22, a steering system 24, a brake system 26, a sensor system 28, an actuator system 30, at least one data storage device 32, at least one controller 34, and a communication system 36. The propulsion system 20 may, in various embodiments, include an internal combustion engine, an electric machine such as a traction motor, and/or a fuel cell propulsion system. The transmission system 22 is configured to transmit power from the propulsion system 20 to the vehicle wheels 16 and 18 according to selectable speed ratios. According to various embodiments, the transmission system 22 may include a step-ratio automatic transmission, a continuously-variable transmission, or other appropriate transmission. The brake system 26 is configured to provide braking torque to the vehicle wheels 16 and 18. The brake system 26 may, in various embodiments, include friction brakes, brake by wire, a regenerative braking system such as an electric machine, and/or other appropriate braking systems. The steering system 24 influences a position of the vehicle wheels 16 and 18. While depicted as including a steering wheel for illustrative purposes, in some embodiments contemplated within the scope of the present disclosure, the steering system 24 may not include a steering wheel.
The sensor system 28 includes one or more sensing devices 40a-40n that sense observable conditions of the exterior environment and/or the interior environment of the vehicle 10. The sensing devices 40a-40n can include, but are not limited to, radars, lidars, global positioning systems, optical cameras, thermal cameras, ultrasonic sensors, and/or other sensors for observing and measuring parameters of the exterior environment. The sensing devices 40a-40n may further include brake sensors, steering angle sensors, wheel speed sensors, etc. for observing and measuring in-vehicle parameters of the vehicle. The cameras can include two or more digital cameras spaced at a selected distance from each other, in which the two or more digital cameras are used to obtain stereoscopic images of the surrounding environment in order to obtain a three-dimensional image. The actuator system 30 includes one or more actuator devices 42a-42n that control one or more vehicle features such as, but not limited to, the propulsion system 20, the transmission system 22, the steering system 24, and the brake system 26. In various embodiments, the vehicle features can further include interior and/or exterior vehicle features such as, but not limited to, doors, a trunk, and cabin features such as air, music, lighting, etc. (not numbered).
The at least one controller 34 includes at least one processor 44 and a computer readable storage device or media 46. The at least one processor 44 can be any custom made or commercially available processor, a central processing unit (CPU), a graphics processing unit (GPU), an auxiliary processor among several processors associated with the at least one controller 34, a semiconductor-based microprocessor (in the form of a microchip or chip set), a macroprocessor, any combination thereof, or generally any device for executing instructions. The computer readable storage device or media 46 may include volatile and nonvolatile storage in read-only memory (ROM), random-access memory (RAM), and keep-alive memory (KAM), for example. KAM is a persistent or non-volatile memory that may be used to store various operating variables while the at least one processor 44 is powered down. The computer-readable storage device or media 46 may be implemented using any of a number of known memory devices such as PROMs (programmable read-only memory), EPROMs (erasable PROM), EEPROMs (electrically erasable PROM), flash memory, or any other electric, magnetic, optical, or combination memory devices capable of storing data, some of which represent executable instructions, used by the at least one controller 34 in controlling the vehicle 10.
The instructions may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions. The instructions, when executed by the at least one processor 44, receive and process signals from the sensor system 28, perform logic, calculations, methods and/or algorithms for automatically controlling the components of the vehicle 10, and generate control signals to the actuator system 30 to automatically control the components of the vehicle 10 based on the logic, calculations, methods, and/or algorithms. Although only one controller is shown in
The communication system 36 is configured to wirelessly communicate information to and from other entities 48, such as but not limited to, other vehicles (“V2V” communication), infrastructure (“V2I” communication), remote systems, and/or personal devices. In an exemplary embodiment, the communication system 36 is a wireless communication system configured to communicate via a wireless local area network (WLAN) using IEEE 802.11 standards or by using cellular data communication. However, additional or alternate communication methods, such as a dedicated short-range communications (DSRC) channel, are also considered within the scope of the present disclosure. DSRC channels refer to one-way or two-way short-range to medium-range wireless communication channels specifically designed for automotive use and a corresponding set of protocols and standards.
Each IMU 200, 202a, 202b, 202c, . . . 202N includes kinematic sensors for measuring a kinematic vector. Each sensor-centered IMU 202a, 202b, 202c, . . . 202N measures the kinematic vector in a reference frame of its associated sensor, while the vehicle-centered IMU 200 measures the kinematic vector in a reference frame of the vehicle 10. In various embodiments, the kinematic vector includes an angular velocity vector Ω and an acceleration vector A. An IMU can measure components of the kinematic vector in three dimensions. The acceleration vector is a three-dimensional vector. However, the largest component of the acceleration vector is along a forward-axis direction, while sideways acceleration and vertical acceleration are considerably smaller. Similarly, the angular velocity is a three-dimensional vector. However, the largest component of the angular velocity is a yaw rate component, while pitch rate and roll rate components are considerably smaller. Acceleration and angular velocity vectors are generally measured while the vehicle is in motion.
During vehicle motion, each IMU obtains measurements of the kinematic vector and registers its vector measurements at the processor 44. The processor 44 determines a current relative orientation between the vector measurements, thereby determining a current relative orientation between either a sensor and the chassis of the vehicle or between any two sensors. The current relative orientation can be compared to a specified orientation required for the sensors, thereby determining an alignment error. The processor 44 can then send a signal to a selected IMU, causing the selected IMU to activate one or more of its adjustment actuators to adjust the sensor back to the specified orientation.
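As a non-limiting sketch of this comparison step (the rotation values and the adjustment tolerance below are hypothetical), the alignment error can be represented as the rotation taking the current relative orientation to the specified one, with its angular magnitude used to decide whether an adjustment signal is warranted:

```python
import numpy as np

def rot_z(angle_rad):
    """Rotation about the vertical (yaw) axis."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def alignment_error(R_current, R_specified):
    """Rotation taking the current relative orientation to the specified
    one; the identity when the sensor is perfectly aligned."""
    return R_specified @ R_current.T

def error_angle_deg(dR):
    """Angular magnitude of a residual rotation, from its trace."""
    cos_theta = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

R_specified = rot_z(np.deg2rad(10.0))   # hypothetical specified mounting yaw
R_current = rot_z(np.deg2rad(12.5))     # the sensor has drifted out of spec
dR = alignment_error(R_current, R_specified)
needs_adjustment = error_angle_deg(dR) > 0.5  # hypothetical tolerance (deg)
```

When needs_adjustment is true, the processor would signal the selected IMU's adjustment actuators; the actuation details are not modeled here.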
In box 306, a correlation function is applied to the kinematic measurements in order to resolve the differences in the kinematic vectors due to time differences between measurements. The result of the correlation function, shown in box 308, is a set of kinematic measurements for all sensors at a common time t (i.e., (a_vehicle(t), Ω_vehicle(t)), (a_1(t), Ω_1(t)), (a_2(t), Ω_2(t)), . . . , (a_N(t), Ω_N(t))).
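The disclosure does not specify the correlation function of box 306; one common assumption, sketched below for illustration only, is to estimate the sample lag between two IMU streams from the peak of their cross-correlation over a shared component such as the yaw rate:

```python
import numpy as np

def estimate_lag(signal_vehicle, signal_sensor):
    """Estimate how many samples signal_sensor lags signal_vehicle
    (positive result: the sensor stream is delayed)."""
    a = signal_vehicle - signal_vehicle.mean()
    b = signal_sensor - signal_sensor.mean()
    corr = np.correlate(b, a, mode="full")
    # Shift the peak index so that zero lag maps to zero.
    return int(np.argmax(corr)) - (len(a) - 1)

# Simulated yaw-rate traces: both IMUs observe the same maneuver, but the
# sensor IMU's stream is delayed by three samples.
t = np.linspace(0.0, 10.0, 500)
yaw_rate = np.exp(-0.5 * ((t - 5.0) / 0.5) ** 2)  # a single steering pulse
lag = 3
vehicle_stream = yaw_rate[lag:]
sensor_stream = yaw_rate[:-lag]
estimated_lag = estimate_lag(vehicle_stream, sensor_stream)
```

Once the lag is known, each sensor stream can be shifted so that all measurements refer to the common time t of box 308.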
In box 310, rotation matrices are found between each sensor-centered IMU and the vehicle-centered IMU, using the time-corrected kinematic vectors obtained in box 308. A rotation matrix can be determined using the relevant kinematic vectors. For example, the rotation matrix between the frame of reference (or “reference frame”) of the first IMU and the frame of reference of the vehicle can be found using (a_vehicle(t), Ω_vehicle(t)) and (a_1(t), Ω_1(t)). The current relative orientation of the first IMU to the vehicle chassis is therefore given by the rotation matrix. In box 312, relative rotation matrices Rij between any two sensor-centered IMUs can be determined from the rotation matrices found in box 310. Comparing the current relative orientation to the specified relative orientation yields an alignment error. The processor 44 can determine this alignment error and send a signal to the relevant IMUs in order to correct for the alignment error.
Referring to the illustrative example of
A rotation matrix R1 rotates vectors expressed in the vehicle frame of reference 402 into the first sensor frame of reference 404, consistent with Eq. (1) below. A rotation matrix R2 similarly rotates the vehicle frame of reference 402 into the second sensor frame of reference 406. Assuming orthogonality, the rotation matrix R1R2^T rotates the second sensor frame of reference 406 into the first sensor frame of reference 404.
Stated generally, acceleration vector ai in the ith sensor reference frame and angular velocity vector Ωi in the ith sensor reference frame are obtained by applying the rotation matrix Ri to the acceleration vector A and angular velocity vector Ω in the vehicle frame of reference 402, as shown in Eqs. (1) and (2):
ai=RiA  Eq. (1)
and
Ωi=RiΩ Eq. (2)
A rotation matrix between an ith sensor frame of reference and a jth sensor frame of reference is given by
ai=Rijaj Eq. (3)
where
Rij(θij, ϕij, ψij)=RiRj^−1=RiRj^T  Eq. (4)
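Eqs. (3) and (4) can be illustrated numerically. In the following non-limiting sketch (yaw-only rotations with hypothetical mounting angles), the relative rotation Rij composed per Eq. (4) maps a sensor-j measurement to the sensor-i frame, as Eq. (3) states:

```python
import numpy as np

def rot_z(psi):
    """Rotation about the vertical (yaw) axis."""
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical mounting orientations: vehicle frame -> sensor frame, Eq. (1).
R_i = rot_z(np.deg2rad(30.0))
R_j = rot_z(np.deg2rad(-45.0))

# Eq. (4): relative rotation taking sensor-j measurements to sensor i.
R_ij = R_i @ R_j.T

# A vehicle-frame acceleration, mostly along the forward axis as described.
A = np.array([2.0, 0.1, 0.05])
a_i = R_i @ A          # Eq. (1) for sensor i
a_j = R_j @ A          # Eq. (1) for sensor j
# Per Eq. (3), a_i equals R_ij @ a_j.
```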
Given any two sets of measurements (e.g., m measurements {v(m)}i in the ith frame of reference and m measurements {v(m)}j in the jth frame of reference), the method disclosed herein determines a relative rotation matrix between the IMUi of the ith frame of reference and the IMUj of the jth frame of reference. The relative rotation matrix is determined by finding the values of the angular rotation variables that minimize or reduce a cost function (Eq. (5)):
θ̂ij, ϕ̂ij, ψ̂ij=arg min[Φ(θij, ϕij, ψij)]  Eq. (5)
where the cost function Φ is given by:
Φ(θij, ϕij, ψij)=Σm(vi(m)−Rijvj(m))^TΠv^−1(vi(m)−Rijvj(m))  Eq. (6)
where the total covariance Πv is calculated from the measurement covariances Πv(i) and Πv(j) as:
Πv=Πv(i)+RijΠv(j)Rij^T  Eq. (7)
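A non-limiting numerical sketch of Eqs. (5)-(7) follows, using a general-purpose optimizer over the three Euler angles; for brevity the total covariance of Eq. (7) is taken as the identity, the simulated measurements are noise-free, and all variable names are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def cost(angles, v_i, v_j, Pi_inv):
    """Eq. (6): weighted squared residuals between the frame-i measurements
    and the rotated frame-j measurements."""
    R_ij = Rotation.from_euler("xyz", angles).as_matrix()
    r = v_i - v_j @ R_ij.T              # residual v_i - R_ij v_j, row-wise
    return float(np.einsum("mi,ij,mj->", r, Pi_inv, r))

# Simulated paired measurements related by a known relative rotation.
rng = np.random.default_rng(2)
true_angles = np.deg2rad([1.0, -2.0, 5.0])
R_true = Rotation.from_euler("xyz", true_angles).as_matrix()
v_j = rng.normal(size=(50, 3))
v_i = v_j @ R_true.T                    # v_i = R_true v_j, noise-free

# Eq. (5): reduce the cost over the three angular rotation variables
# (the Eq. (7) covariance is taken as the identity here).
result = minimize(cost, x0=np.zeros(3), args=(v_i, v_j, np.eye(3)))
estimated_angles = result.x
```

With noisy measurements, the covariance weighting of Eqs. (6)-(7) would de-emphasize the less reliable measurement components.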
The calculations performed in Eqs. (5)-(7) can be extended to global alignment of a plurality of IMUs by including all corresponding measurements from all IMUs to obtain a cost function:
Φg=Σi>jΦ(θij, ϕij, ψij) Eq. (8)
and minimizing the cost function of Eq. (8) as indicated in Eqs. (5)-(7). This approach tends to distribute errors between the sensors and reduces the overall error propagation of the method.
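The global cost of Eq. (8) can be sketched as a sum of pairwise costs over all IMU pairs. In this non-limiting illustration (unweighted residuals and hypothetical yaw-only mountings), the cost vanishes when the per-IMU rotations are mutually consistent with the shared vehicle-frame measurements:

```python
import numpy as np
from itertools import combinations

def rot_z(psi):
    """Rotation about the vertical (yaw) axis."""
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def global_cost(rotations, measurements):
    """Eq. (8): sum of unweighted pairwise costs (Eq. (6)) over all IMU
    pairs, with each R_ij composed from per-IMU rotations per Eq. (4)."""
    total = 0.0
    for i, j in combinations(range(len(rotations)), 2):
        R_ij = rotations[i] @ rotations[j].T
        r = measurements[i] - measurements[j] @ R_ij.T
        total += float(np.sum(r * r))
    return total

# Shared vehicle-frame vectors observed by three IMUs with hypothetical
# yaw-only mounting rotations; per Eq. (1), v_i = R_i A applied row-wise.
rng = np.random.default_rng(3)
A = rng.normal(size=(10, 3))
rotations = [np.eye(3), rot_z(0.3), rot_z(-0.5)]
measurements = [A @ R.T for R in rotations]
total = global_cost(rotations, measurements)
```

Minimizing this sum jointly over all rotations, rather than one pair at a time, is what distributes the error among the sensors.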
The simulated signal measurements can be used to determine an estimate of a relative rotation matrix R̂12 between the first frame of reference of the first IMU and the second frame of reference of the second IMU. An alignment error can then be determined based on the estimate R̂12 and the known rotation matrices R1 and R2 used in stage 504. The alignment error is therefore given as:
δR=(R2R1^T)R̂12  Eq. (9)
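Eq. (9) can be illustrated as follows (hypothetical yaw-only rotation values): when the estimate R̂12 is exact, δR reduces to the identity, and the angle of the residual rotation δR quantifies the alignment error of the method:

```python
import numpy as np

def rot_z(psi):
    """Rotation about the vertical (yaw) axis."""
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Known simulation rotations (hypothetical yaw-only values).
R1 = rot_z(np.deg2rad(20.0))
R2 = rot_z(np.deg2rad(-35.0))
R12_true = R1 @ R2.T                  # Eq. (4): a relative yaw of 55 degrees

# Suppose the estimator returned a slightly wrong relative rotation.
R12_hat = rot_z(np.deg2rad(55.5))

# Eq. (9): residual rotation; the identity indicates a perfect estimate.
dR = (R2 @ R1.T) @ R12_hat
residual_deg = np.degrees(
    np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)))
```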
By performing the simulation shown in
While the above disclosure has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from its scope. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the disclosure without departing from the essential scope thereof. Therefore, it is intended that the present disclosure not be limited to the particular embodiments disclosed, but will include all embodiments falling within the scope thereof.