Sensor alignment calibration

Information

  • Patent Number: 10,241,215
  • Date Filed: Monday, November 14, 2016
  • Date Issued: Tuesday, March 26, 2019
Abstract
A calibration scheme measures roll, pitch, and yaw rates and other speeds and accelerations during a series of vehicle maneuvers. Based on the measurements, the calibration scheme calculates inertial sensor misalignments. The calibration scheme also calculates offsets of the inertial sensors and GPS antennas from a vehicle control point. The calibration scheme can also estimate other calibration parameters, such as minimum vehicle radii and nearest orthogonal orientation. Automated sensor calibration reduces the amount of operator input needed when calibrating sensor parameters. Automated sensor calibration also allows the operator to install an electronic control unit (ECU) in any convenient orientation (roll, pitch, and yaw), removing the need for the ECU to be installed in a restrictive orthogonal configuration. The calibration scheme may remove dependencies on a heading filter and steering interfaces by calculating sensor parameters based on raw sensor measurements taken during the vehicle maneuvers.
Description
COPYRIGHT NOTICE

A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.


TECHNICAL FIELD

One or more implementations relate generally to sensor calibration.


BACKGROUND

Accurate and robust automated steering applications rely on accurate sensor positioning and installation. Measured vehicle position and heading are based on knowledge of the relative alignment and offset distance of the vehicle sensors from a vehicle control point. If the relative sensor distances from the control point are incorrectly measured, vehicle positions and headings based on those relative sensor distances are also incorrect.


Sensor installation parameters, such as distances and angles relative to the vehicle control point, are currently measured by the vehicle operator. Depending on the care and expertise of the vehicle operator, manually measured installation parameters may include parallax errors and other types of errors that result in poor steering performance and possible vehicle instability.





BRIEF DESCRIPTION OF THE DRAWINGS

The included drawings are for illustrative purposes and serve to provide examples of possible structures and operations for the disclosed inventive systems, apparatus, methods and computer-readable storage media. These drawings in no way limit any changes in form and detail that may be made by one skilled in the art without departing from the spirit and scope of the disclosed implementations.



FIG. 1 shows example orthogonal electronic control unit (ECU) installations.



FIG. 2 shows example non-orthogonal ECU installations.



FIGS. 3A and 3B show vehicle maneuvers performed while reading vehicle sensor data.



FIG. 4 shows an example process for calculating sensor pitch misalignment.



FIG. 5 shows an example process for calculating sensor roll misalignment.



FIG. 6 shows an example process for calculating inertial sensor offsets and yaw misalignment.



FIG. 7 shows an example process for calculating GPS antenna offsets.



FIG. 8 shows an example process for automatically calibrating vehicle sensors.



FIG. 9 shows an example guidance system for calibrating vehicle sensors.



FIG. 10 shows the guidance system controlling an auto-steering system.





DETAILED DESCRIPTION

A calibration scheme measures roll, pitch, and yaw rates and other speeds and accelerations during a series of vehicle maneuvers. Based on the measurements, the calibration scheme calculates inertial sensor misalignments. The calibration scheme also calculates offsets of inertial sensors and GPS antennas from a vehicle control point. The calibration scheme can also estimate other calibration parameters, such as minimum vehicle radii and nearest orthogonal orientation.


Automated sensor calibration reduces the amount of operator input used when defining sensor installation parameters. Automated sensor calibration also allows the operator to install an electronic control unit (ECU) in any convenient orientation (roll, pitch and yaw), removing the need for the ECU to be installed in a restrictive orthogonal configuration. Automated sensor calibration may remove dependencies on a heading filter and steering interfaces by calculating sensor parameters based on raw sensor measurements taken during the vehicle maneuvers.



FIG. 1 illustrates two typical orthogonal ECU installations. In one example, ECU 120 may contain one or more inertial measurement units (IMUs) 150. The IMUs 150 may include a collection of accelerometers and gyroscopes, alternatively referred to as inertials, used for measuring speed, acceleration, yaw, pitch, roll, or any other vehicle heading or position. ECU 120 may include other electronics and processors for recording the inertial measurements.


A plate 127 is attached to a vehicle 100, and a set of hinges 124, 125, and 126 attaches plate 127 to a plate 123 attached to ECU 120. Hinges 124, 125, and 126 may rotate ECU 120 into a variety of different alignments relative to vehicle 100. In alignment 1, ECU 120 is aligned and level with attached vehicle 100. In alignment 2, ECU 120 is rotated in pitch by 90° and appears vertically aligned with vehicle 100. The orthogonal installations in FIG. 1 are indicative of current installation procedures, where ECU 120 has to be aligned parallel or perpendicular to the vehicle axes.


A GPS antenna may be installed on the chassis of vehicle 100 and offset from ECU 120. Preferably, IMU 150 within ECU 120 is located as close as possible to a pivot point of vehicle 100. The pivot point is alternatively referred to as a vehicle control point or center of gravity. However, due to space limitations on vehicle 100, ECU 120 may be offset and/or misaligned with the control point and longitudinal/lateral axes of vehicle 100 that extend through the control point.



FIG. 2 illustrates non-orthogonal installations, where no axes of ECU 120 are aligned with axes of vehicle 100. These cases represent more general ECU installations, where there is no restriction on the alignment of ECU 120. Sensor alignment calibration as described below allows an operator to attach ECU 120 in any orientation with respect to vehicle 100 as shown in FIGS. 1 and 2 and automatically calculate the misalignments and offsets relative to the vehicle axes.


References below to a local frame refer to a frame of reference fixed on the ground, typically oriented toward north; it is also referred to as a fixed frame. References to a body frame refer to the frame of vehicle 100 as it moves, where the x-axis is typically the heading of the body frame. An ECU frame refers to a frame fixed to ECU box 120, which may be misaligned with the body frame.


Calibration



FIGS. 3A and 3B show example vehicle maneuvers performed during automated sensor alignment calibration. In one example, an operator may manually drive vehicle 100 through a series of steps 1-6 based on instructions provided via user interface prompts displayed on a screen of navigation system 110.


During step 1, the operator may turn on vehicle 100 and navigation system 110 but keep vehicle 100 in a stationary position. Navigation system 110 takes first sensor readings during step 1, such as vehicle acceleration readings from accelerometers in ECU 120 in FIGS. 1 and 2. During step 2, the operator may drive vehicle 100 from the current stationary location in a circular pattern and then return to the initial location, facing in the opposite (180 degree) direction. Navigation system 110 then takes additional sensor readings during steps 2 and 3 while vehicle 100 is stationary in the opposite 180 degree position.


Referring now to FIG. 3B, during step 4, navigation system 110 takes sensor readings while the operator drives vehicle 100 forward in a straight line at a constant speed. For example, navigation system 110 may take speed and yaw baseline readings.


During step 5, navigation system 110 takes sensor readings while the operator drives vehicle 100 in a full lock left turn while maintaining a constant speed. During step 6, navigation system 110 takes sensor readings while the operator drives vehicle 100 in a full lock right turn while maintaining a constant speed. A full lock turn refers to the vehicle operator turning the vehicle steering wheel as far as possible to the left or to the right. Sensor readings taken by navigation system 110 during steps 5 and 6 may include global positioning system (GPS) speed readings, gyroscope yaw rate readings, and accelerometer readings.
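For reference only, the six maneuvers above can be summarized in a small data structure. The following Python sketch is illustrative and not part of the patent disclosure; the step labels and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CalibrationStep:
    number: int
    maneuver: str
    readings: str

# Steps 1-6 of FIGS. 3A and 3B (labels are illustrative, not from the patent).
CALIBRATION_STEPS = [
    CalibrationStep(1, "stationary at initial heading", "accelerometer roll/pitch"),
    CalibrationStep(2, "drive a loop, return to the same spot facing 180 degrees opposite", "in-motion readings"),
    CalibrationStep(3, "stationary at opposite heading", "accelerometer roll/pitch, gyro yaw rate bias"),
    CalibrationStep(4, "straight line at constant speed", "calibration speed V0, yaw baseline"),
    CalibrationStep(5, "full lock left turn at constant speed", "GPS speed, yaw rate, accelerations"),
    CalibrationStep(6, "full lock right turn at constant speed", "GPS speed, yaw rate, accelerations"),
]
```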


ECU Installation Misalignment


One objective of step 1 is to obtain the roll ϕ1 and pitch θ1 measured by IMU 150 (FIG. 1). Step 2 then obtains the roll ϕ2 and pitch θ2 measured by IMU 150 following the repositioning of vehicle 100 at the 180 degree heading change. Steps 1 and 2 are part of an initial calibration used to determine the roll and pitch installation misalignment of ECU 120 with the reference axis of vehicle 100. Performing these measurements in opposite vehicle directions removes terrain effects that may contribute to false roll and pitch ECU measurements.



FIG. 4 illustrates the principles behind steps 1 and 2 in more detail. In step 1, vehicle 100 is stationary, facing up on a sloped surface, with a terrain pitch of θt. ECU 120 was installed with an installation misalignment of θm relative to a body axis 122 (Xvehicle, Yvehicle) of vehicle 100. The roll and pitch values calculated in steps 1 and 2 are derived by comparing acceleration components measured by the IMU accelerometers within ECU 120 relative to a gravity vector. When sensor measurements are taken during step 1, the measured pitch may be a combination of both the terrain and IMU misalignment pitch angles, where:

\theta_1 = \theta_t + \theta_m  (2.1)


In step 2, vehicle 100 has changed direction relative to the terrain, facing down the slope, but misalignment angle θm of ECU 120 relative to body axis 122 remains the same. Therefore, the measured pitch angle during step 2 is:

\theta_2 = -\theta_t + \theta_m  (2.2)


With this information, navigation system 110 may calculate the ECU pitch misalignment θm by averaging the pitch estimates obtained in steps 1 and 2, where:










\theta_m = \frac{\theta_1 + \theta_2}{2}  (2.3)







Equation 2.3 eliminates the terrain pitch effects, so only the pitch installation misalignment angle θm of ECU 120 remains.
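As an illustration of Equations 2.1-2.3, the following Python sketch estimates pitch from a stationary accelerometer reading and averages the up-slope and down-slope estimates. It is not part of the patent; the gravity-vector pitch formula and axis conventions are assumptions, and the same averaging applies to roll in Equations 2.4-2.6 below.

```python
import math

def pitch_from_accel(ax, ay, az):
    """Estimate pitch (radians) from a stationary 3-axis accelerometer
    reading by comparing the measured gravity vector with the sensor axes.
    Assumes x forward, y right, z down (sign conventions may differ)."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def pitch_misalignment(accel_step1, accel_step2):
    """Equation 2.3: average the pitch measured facing up the slope (step 1)
    and down the slope (step 2); the terrain pitch theta_t cancels and only
    the installation misalignment theta_m remains."""
    theta_1 = pitch_from_accel(*accel_step1)
    theta_2 = pitch_from_accel(*accel_step2)
    return (theta_1 + theta_2) / 2.0
```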



FIG. 5 shows how navigation system 110 calculates the roll misalignment angle ϕm. By measuring vehicle roll in opposite directions, the terrain effects on roll can be eliminated, where:










\phi_1 = \phi_t + \phi_m  (2.4)

\phi_2 = -\phi_t + \phi_m  (2.5)

\phi_m = \frac{\phi_1 + \phi_2}{2}  (2.6)







With the roll and pitch misalignments calculated, navigation system 110 performs an intermediate frame transformation to align the measurements from IMU 150 with a local-level frame. As mentioned above, the local frame is the frame of reference fixed on the ground, often referenced to north. An intermediate frame transformation can be expressed as:

C_b^L = L_y(-\theta_m)\, L_x(-\phi_m)  (2.7)


where C_b^L describes the transformation from the sensor (body) frame to the local-level frame. The matrices L_x and L_y refer to rotations about the x and y axes, respectively.


The frame transformation in Equation 2.7 may align the IMU sensor readings with the expected vehicle axes. For example, if IMU 150 is pitched up and rolled to the right, Equation 2.7 may unroll and de-pitch the measurements to align with the vehicle axes that extend through the vehicle control point, effectively straightening out IMU 150 algorithmically.
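A minimal sketch of the intermediate frame transformation in Equation 2.7, assuming conventional right-handed rotation matrices for L_x and L_y; the exact sign and axis conventions used in the patent are not specified here, so treat this as illustrative.

```python
import numpy as np

def rot_x(angle):
    """Rotation about the x-axis (roll) by `angle` radians."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def rot_y(angle):
    """Rotation about the y-axis (pitch) by `angle` radians."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

def body_to_level(theta_m, phi_m):
    """Equation 2.7: C_b^L = L_y(-theta_m) L_x(-phi_m), which de-pitches
    and un-rolls raw IMU vectors into the local-level frame."""
    return rot_y(-theta_m) @ rot_x(-phi_m)

# Usage: a_level = body_to_level(theta_m, phi_m) @ a_body
```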


Navigation system 110 uses step 3 in FIG. 3A to estimate yaw rate bias {dot over (ψ)}b, which is used to de-bias yaw rate measurements during subsequent vehicle maneuvers. A yaw rate de-bias is then performed so subsequent calculations are not affected by drifting yaw rate measurements that can be common to inertial sensors. Navigation system 110 may achieve greater steering accuracy by estimating and eliminating yaw rate bias from navigation measurements.
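One simple way the yaw rate bias of step 3 might be estimated and applied is shown below as a hedged sketch; the patent does not specify the averaging window or method.

```python
def estimate_yaw_rate_bias(stationary_yaw_rates):
    """Step 3 sketch: with the vehicle stationary the true yaw rate is zero,
    so the mean gyro output over the stationary period approximates the
    yaw rate bias psi_dot_b."""
    return sum(stationary_yaw_rates) / len(stationary_yaw_rates)

def debias_yaw_rate(yaw_rate, yaw_rate_bias):
    """Remove the estimated bias from subsequent yaw rate measurements."""
    return yaw_rate - yaw_rate_bias
```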


Navigation system 110 uses step 4 to determine the calibration speed V0, maintained throughout the calibration process. In one example, calibration speed V0 is selected to be as high as possible without the vehicle wheels slipping on the terrain. The same calibration speed may be used throughout the other vehicle maneuvers in steps 5 and 6.



FIG. 6 illustrates how ECU misalignment estimation is performed in steps 5 and 6 of FIG. 3B. Navigation system 110 uses step 5 to determine the IMU velocity VIMU,L and yaw rate {dot over (ψ)}L for a left-hand vehicle turn. This is performed by steering vehicle 100 to full-lock left (the maximum amount vehicle 100 can turn left) and recording the measured data from ECU 120.


Similarly, navigation system 110 uses step 6 in FIG. 3B to determine the IMU velocity VIMU,R and yaw rate {dot over (ψ)}R for the right-hand turn at full-lock right. Left- and right-hand turning maneuvers are used because the installation location of ECU 120 can be off-center; turning in both directions allows navigation system 110 to compensate for the effects of an off-center installation of ECU 120. At the completion of step 6, navigation system 110 has collected all sensor data needed for calculating installation misalignment and offset values for ECU 120 and a GPS antenna.


In one example, the velocities from IMU 150 within ECU 120 cannot be measured directly, and are estimated using corrected accelerations that have been transformed into the local-level frame. Again, IMU 150 may include any combination of gyroscopes and accelerometers that measure yaw, pitch and roll rates, speed, acceleration, etc. Navigation system 110 uses the corrected accelerations to calculate the IMU velocities as follows:










V_{IMU,L} = \frac{\sqrt{a_{\hat{x},L}^2 + a_{\hat{y},L}^2}}{\dot{\psi}_L - \dot{\psi}_b}  (2.8)

V_{IMU,R} = \frac{\sqrt{a_{\hat{x},R}^2 + a_{\hat{y},R}^2}}{\dot{\psi}_R - \dot{\psi}_b}  (2.9)








where a_x̂,L, a_ŷ,L and a_x̂,R, a_ŷ,R are the corrected accelerations measured by the x-axis and y-axis accelerometers for each turn direction, obtained during steps 5 and 6.
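Equations 2.8 and 2.9 follow from steady circular motion, where the horizontal acceleration magnitude equals speed times yaw rate. A hedged Python sketch (variable names are illustrative):

```python
import math

def imu_speed(ax_corr, ay_corr, yaw_rate, yaw_rate_bias):
    """Equations 2.8/2.9: during a constant-speed full-lock turn the corrected
    horizontal acceleration is centripetal (|a| = v * omega), so the IMU speed
    is |a| divided by the de-biased yaw rate."""
    return math.hypot(ax_corr, ay_corr) / (yaw_rate - yaw_rate_bias)
```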


With these measured values, navigation system 110 determines the forward (longitudinal) IMU offset lx and lateral IMU offset ly as follows:










l_y = \frac{n_R - n_L - \frac{m_R - m_L}{n_R + n_L}}{2}  (2.10)

l_x = \sqrt{m_R - (n_R - l_y)^2}  (2.11)

where

m_R = \left( \frac{V_{IMU,R}}{\dot{\psi}_R - \dot{\psi}_b} \right)^2  (2.12)

m_L = \left( \frac{V_{IMU,L}}{\dot{\psi}_L - \dot{\psi}_b} \right)^2  (2.13)

and

n_R = \frac{V_0}{\dot{\psi}_R - \dot{\psi}_b}  (2.15)

n_L = \frac{V_0}{\dot{\psi}_L - \dot{\psi}_b}  (2.16)








Note that Equations 2.10 and 2.11 are functions of estimated values obtained during the calibration process, namely accelerations and yaw rates.
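A Python sketch of Equations 2.10-2.16 as reconstructed above. It assumes the yaw rates are signed consistently for both turns and that the quantity under the square root in Equation 2.11 is non-negative; it is illustrative, not the patent's implementation.

```python
def inertial_offsets(v_imu_left, v_imu_right, v0,
                     yaw_rate_left, yaw_rate_right, yaw_rate_bias):
    """Lateral (l_y) and longitudinal (l_x) offsets of the IMU from the
    vehicle control point, from the left/right full-lock turn data."""
    # Control-point turn radii (Equations 2.15, 2.16).
    n_r = v0 / (yaw_rate_right - yaw_rate_bias)
    n_l = v0 / (yaw_rate_left - yaw_rate_bias)
    # Squared turn radii of the IMU itself (Equations 2.12, 2.13).
    m_r = (v_imu_right / (yaw_rate_right - yaw_rate_bias)) ** 2
    m_l = (v_imu_left / (yaw_rate_left - yaw_rate_bias)) ** 2
    # Lateral offset (Equation 2.10), then longitudinal offset (Equation 2.11).
    l_y = (n_r - n_l - (m_r - m_l) / (n_r + n_l)) / 2.0
    l_x = (m_r - (n_r - l_y) ** 2) ** 0.5
    return l_x, l_y
```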


Navigation system 110 estimates the ECU yaw misalignment ψm following the calculation of the IMU offsets relative to the vehicle control point. The yaw misalignment can be expressed as:











\psi_m = \frac{\psi_{m,L} + \psi_{m,R}}{2}  (2.17)

where:

\psi_{m,L} = \hat{\psi}_L + \psi_{a,L}  (2.18)

\psi_{m,R} = \hat{\psi}_R + \psi_{a,R}  (2.19)

\hat{\psi}_L = \tan^{-1}\left( \frac{a_{\hat{x},L}}{a_{\hat{y},L}} \right)  (2.21)

\hat{\psi}_R = \tan^{-1}\left( \frac{a_{\hat{x},R}}{a_{\hat{y},R}} \right)  (2.22)

\psi_{a,L} = -\tan^{-1}\left( \frac{l_x}{V_0/\dot{\psi}_L + l_y} \right)  (2.23)

\psi_{a,R} = -\tan^{-1}\left( \frac{l_x}{V_0/\dot{\psi}_R - l_y} \right)  (2.24)







In a similar manner to the calculation of the pitch and roll misalignment angles calculated in Equations 2.3 and 2.6, the calculation of yaw misalignment in Equation 2.17 is an averaging process between the left and right hand turn maneuvers. This is to eliminate any effects of an off-center ECU installation.
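For illustration, a sketch of the yaw misalignment averaging in Equations 2.17-2.24, using the reconstructed forms above; the atan2 quadrant handling and sign conventions are assumptions rather than details given in the patent.

```python
import math

def yaw_misalignment(ax_l, ay_l, ax_r, ay_r,
                     l_x, l_y, v0, yaw_rate_left, yaw_rate_right):
    """Average the yaw misalignment estimated from the left and right
    full-lock turns (Equation 2.17) to cancel off-center installation effects."""
    # Apparent heading of the corrected acceleration in the sensor frame
    # (Equations 2.21, 2.22); atan2 is used here to resolve the quadrant.
    psi_hat_l = math.atan2(ax_l, ay_l)
    psi_hat_r = math.atan2(ax_r, ay_r)
    # Geometric correction for the sensor offset (Equations 2.23, 2.24).
    psi_a_l = -math.atan(l_x / (v0 / yaw_rate_left + l_y))
    psi_a_r = -math.atan(l_x / (v0 / yaw_rate_right - l_y))
    # Per-turn estimates (Equations 2.18, 2.19) and their average (2.17).
    psi_m_l = psi_hat_l + psi_a_l
    psi_m_r = psi_hat_r + psi_a_r
    return (psi_m_l + psi_m_r) / 2.0
```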



FIG. 7 illustrates the process for estimating the offset of GPS antenna 12 during steps 4-6 discussed above in FIG. 3B. The GPS antenna offset calculation is similar to the ECU offset calculation but may not calculate antenna pitch and roll. Again, a GPS antenna offset from vehicle control point 115 may produce different velocity readings when the vehicle turns to the left and to the right. Thus, navigation system 110 electronically corrects/calibrates GPS readings to the control point axes of vehicle 100.


Navigation system 110 during step 4 determines the calibration speed V0, which is maintained throughout the calibration process. Navigation system 110 during step 5 measures the GPS velocity VGPS,L and yaw rate {dot over (ψ)}L during the left-hand turn. Similarly, navigation system 110 during step 6 measures the GPS velocity VGPS,R and yaw rate {dot over (ψ)}R for the right-hand turn. The values for V0, VGPS,L, VGPS,R, {dot over (ψ)}L, and {dot over (ψ)}R are measured and stored within ECU 120 or navigation system 110 as they are collected during steps 1-6.


Navigation system 110 uses the measured values to determine the longitudinal antenna offset lx and lateral antenna offset ly as follows:










l_y = \frac{n_R - n_L - \frac{m_R - m_L}{n_R + n_L}}{2}  (2.25)

l_x = \sqrt{m_R - (n_R - l_y)^2}  (2.26)

where:

m_R = \left( \frac{V_{GPS,R}}{\dot{\psi}_R - \dot{\psi}_b} \right)^2  (2.27)

m_L = \left( \frac{V_{GPS,L}}{\dot{\psi}_L - \dot{\psi}_b} \right)^2  (2.28)

n_R = \frac{V_0}{\dot{\psi}_R - \dot{\psi}_b}  (2.30)

n_L = \frac{V_0}{\dot{\psi}_L - \dot{\psi}_b}  (2.31)








Note that the antenna offset estimation may be more effective when GPS antenna 12 is installed ahead of control point 115 on vehicle 100, i.e. lx>0.
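Equations 2.25-2.31 have the same form as the IMU offset equations, with GPS-measured speeds substituted for the IMU speeds. A hedged sketch, with the same assumptions as the IMU offset example above:

```python
def gps_antenna_offsets(v_gps_left, v_gps_right, v0,
                        yaw_rate_left, yaw_rate_right, yaw_rate_bias):
    """Longitudinal (l_x) and lateral (l_y) offsets of the GPS antenna from
    the vehicle control point (Equations 2.25-2.31, as reconstructed above)."""
    n_r = v0 / (yaw_rate_right - yaw_rate_bias)                   # Eq. 2.30
    n_l = v0 / (yaw_rate_left - yaw_rate_bias)                    # Eq. 2.31
    m_r = (v_gps_right / (yaw_rate_right - yaw_rate_bias)) ** 2   # Eq. 2.27
    m_l = (v_gps_left / (yaw_rate_left - yaw_rate_bias)) ** 2     # Eq. 2.28
    l_y = (n_r - n_l - (m_r - m_l) / (n_r + n_l)) / 2.0           # Eq. 2.25
    l_x = (m_r - (n_r - l_y) ** 2) ** 0.5                         # Eq. 2.26
    return l_x, l_y
```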


Minimum Radii


Navigation system 110 may calculate the left and right minimum radii for vehicle 100 using the GPS speed and yaw rates measured during each turn direction. The left and right minimum radii can be expressed as:










R_{L,min} = \frac{V_{GPS,L}}{\dot{\psi}_L - \dot{\psi}_b} - l_y  (2.32)

R_{R,min} = \frac{V_{GPS,R}}{\dot{\psi}_R - \dot{\psi}_b} + l_y  (2.33)
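A short sketch of Equations 2.32 and 2.33; as with the other examples, the variable names and sign handling are illustrative assumptions.

```python
def minimum_radii(v_gps_left, v_gps_right,
                  yaw_rate_left, yaw_rate_right, yaw_rate_bias, l_y):
    """Minimum left/right turn radii of the vehicle control point from GPS
    speed, de-biased yaw rate, and the lateral antenna offset l_y."""
    r_l_min = v_gps_left / (yaw_rate_left - yaw_rate_bias) - l_y    # Eq. 2.32
    r_r_min = v_gps_right / (yaw_rate_right - yaw_rate_bias) + l_y  # Eq. 2.33
    return r_l_min, r_r_min
```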








Installation Attitude Biases


In adhering to the current definition of roll and pitch bias, the bias angles are represented relative to the nearest orthogonal installation. This rotation can be expressed as follows:

C_{orth}^{ECU} = C_b^{ECU}\, C_{orth}^{b}  (2.34)


where C_{orth}^{ECU} is the transformation from the nearest orthogonal alignment to the ECU frame, C_b^{ECU} is the transformation from the body frame to the ECU frame (known), and C_{orth}^{b} is the transformation from the nearest orthogonal installation to the body frame (known). An orthogonal alignment is one that has 90° rotations between axes.


Extraction of the angles from C_{orth}^{ECU} yields the roll ϕb, pitch θb, and yaw ψb bias angles relative to the nearest orthogonal installation in the ECU frame. To represent the bias angles in the body frame, the following rotation is applied:











\begin{bmatrix} \phi_b \\ \theta_b \\ \psi_b \end{bmatrix}_b = C_{orth}^{b} \begin{bmatrix} \phi_b \\ \theta_b \\ \psi_b \end{bmatrix}_{ECU}  (2.35)
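The patent does not spell out how the nearest orthogonal installation is selected. One plausible approach, shown purely as an assumption, snaps each column of the measured body-to-ECU rotation to the closest signed axis (valid when the installation is within roughly 45° of an orthogonal orientation) and then applies Equation 2.34.

```python
import numpy as np

def nearest_orthogonal(c_b_ecu):
    """Snap a rotation matrix to the nearest signed-permutation matrix, i.e.
    the nearest multiple-of-90-degree (orthogonal) installation. Assumes the
    misalignment from orthogonal is well under 45 degrees."""
    snapped = np.zeros_like(c_b_ecu)
    for col in range(3):
        row = int(np.argmax(np.abs(c_b_ecu[:, col])))
        snapped[row, col] = np.sign(c_b_ecu[row, col])
    return snapped

def installation_bias_rotation(c_b_ecu):
    """Equation 2.34: C_orth^ECU = C_b^ECU @ C_orth^b, the residual rotation
    between the nearest orthogonal installation and the actual ECU frame."""
    c_orth_b = nearest_orthogonal(c_b_ecu).T  # orth-to-body transformation
    return c_b_ecu @ c_orth_b
```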








FIG. 8 shows an example process for automatic sensor calibration. In operation 200A, the navigation system reads sensor data while the vehicle is in a stationary position and oriented in a first direction. In operation 200B, the navigation system reads the sensor data while the vehicle is in a stationary position and oriented in the opposite direction.


In operation 200C, the navigation system reads sensor data while the vehicle is traveling in a straight line at a constant speed. In operation 200D, the navigation system reads vehicle sensor data while the vehicle is turning locked to the left and in operation 200E, the navigation system reads vehicle sensor data while the vehicle is turning locked to the right.


In operation 200F, the navigation system calculates the roll, pitch, and yaw misalignments of the ECU based on the measured vehicle sensor data. For example, the navigation system calculates the roll and pitch misalignments based on the average roll and pitch readings from operations 200A and 200B. The navigation system calculates yaw misalignment based on vehicle speed, acceleration, and yaw rate measurements made during operations 200C, 200D, and 200E.


In operation 200G, the navigation system calculates offsets of the ECU and GPS antenna based on the vehicle speed, acceleration and yaw rate measurements made during operations 200C, 200D, and 200E. For example, the navigation system calculates the lateral and longitudinal distances of the inertial sensors and GPS antenna from the vehicle control point. In operation 200H, the navigation system calibrates the sensors by adjusting sensor readings to the vehicle frame axes or local axes based on the derived misalignments and offsets.


Computer, Software, and Sensor Systems


A global navigation satellite system (GNSS) is broadly defined to include GPS (U.S.), Galileo (European Union, proposed), GLONASS (Russia), Beidou (China), Compass (China, proposed), IRNSS (India, proposed), QZSS (Japan, proposed), and other current and future positioning technology using signals from satellites, with or without augmentation from terrestrial sources.


IMUs may include gyroscopic (gyro) sensors, accelerometers, and similar technologies for providing outputs corresponding to the inertia of moving components in all axes, i.e., through six degrees of freedom (positive and negative directions along transverse X, longitudinal Y, and vertical Z axes). Yaw, pitch, and roll refer to moving component rotation about the Z, X, and Y axes respectively. Said terminology will include the words specifically mentioned, derivatives thereof, and words of similar meaning.



FIG. 9 generally shows guidance system 110 used in conjunction with an electrical direct-drive steering assistance mechanism 3. Without limitation on the generality of useful applications of guidance system 110, a GNSS receiver 4 and a guidance processor 6 are connected to a GNSS antenna 12 and installed into vehicle 100, such as an agricultural vehicle or tractor. An auto-steering system 166 is electrically connected to guidance processor 6, and is mechanically interfaced with vehicle 100 via steering assistance mechanism 3.



FIG. 10 shows additional detail of guidance system 110. The GNSS receiver 4 is further comprised of an RF convertor (i.e., downconvertor) 16, a tracking device 18, and a rover RTK receiver element 20. The receiver electrically communicates with, and provides GNSS positioning data to, guidance processor 6. Guidance processor 6 includes a graphical user interface (GUI) 26, a microprocessor 24, and a media element 22, such as a memory storage drive. Guidance processor 6 electrically communicates with, and provides control data to auto-steering system 166. Auto-steering system 166 includes a wheel movement detection switch 28 and an encoder 30 for interpreting guidance and steering commands from CPU 6.


Auto-steering system 166 may interface mechanically with the vehicle's steering column 34, which is mechanically attached to steering wheel 32. A control line 42 may transmit guidance data from the CPU 6 to the auto-steering system 166. An electrical subsystem 44, which powers the electrical needs of vehicle 100, may interface directly with auto-steering system 166 through a power cable 46. The auto-steering subsystem 166 can be mounted to steering column 34 near the floor of the vehicle, and in proximity to the vehicle's control pedals 36. Alternatively, auto-steering system 166 can be mounted at other locations along steering column 34.


The auto-steering system 166 physically drives and steers vehicle 100 by actively turning the steering wheel 32 via steering column 34. A motor 45 powered by vehicle electrical subsystem 44 may power a worm drive, which in turn drives a worm gear 48 affixed to auto-steering system 166. These components are preferably contained within an enclosure. In other embodiments, auto-steering system 166 is integrated directly into the vehicle drive control system independently of steering column 34.


Some of the operations described above may be implemented in software and other operations may be implemented in hardware. One or more of the operations, processes, or methods described herein may be performed by an apparatus, device, or system similar to those as described herein and with reference to the illustrated figures.


“Computer-readable storage medium” (or alternatively, “machine-readable storage medium”) used in guidance system 110 may include any type of memory, as well as new technologies that may arise in the future, as long as they may be capable of storing digital information in the nature of a computer program or other data, at least temporarily, in such a manner that the stored information may be “read” by an appropriate processing device. The term “computer-readable” may not be limited to the historical usage of “computer” to imply a complete mainframe, mini-computer, desktop, wireless device, or even a laptop computer. Rather, “computer-readable” may comprise a storage medium that may be readable by a processor, processing device, or any computing system. Such media may be any available media that may be locally and/or remotely accessible by a computer or processor, and may include volatile and non-volatile media, and removable and non-removable media.


Examples of systems, apparatus, computer-readable storage media, and methods are provided solely to add context and aid in the understanding of the disclosed implementations. It will thus be apparent to one skilled in the art that the disclosed implementations may be practiced without some or all of the specific details provided. In other instances, certain processes or methods, also referred to herein as “blocks,” have not been described in detail in order to avoid unnecessarily obscuring the disclosed implementations. Other implementations and applications also are possible, and as such, the following examples should not be taken as definitive or limiting either in scope or setting.


References have been made to accompanying drawings, which form a part of the description and in which are shown, by way of illustration, specific implementations. Although these disclosed implementations are described in sufficient detail to enable one skilled in the art to practice the implementations, it is to be understood that these examples are not limiting, such that other implementations may be used and changes may be made to the disclosed implementations without departing from their spirit and scope. For example, the blocks of the methods shown and described are not necessarily performed in the order indicated in some other implementations. Additionally, in other implementations, the disclosed methods may include more or fewer blocks than are described. As another example, some blocks described herein as separate blocks may be combined in some other implementations. Conversely, what may be described herein as a single block may be implemented in multiple blocks in some other implementations. Additionally, the conjunction “or” is intended herein in the inclusive sense where appropriate unless otherwise indicated; that is, the phrase “A, B or C” is intended to include the possibilities of “A,” “B,” “C,” “A and B,” “B and C,” “A and C” and “A, B and C.”


Having described and illustrated the principles of a preferred embodiment, it should be apparent that the embodiments may be modified in arrangement and detail without departing from such principles. Claim is made to all modifications and variations coming within the spirit and scope of the following claims.

Claims
  • 1. A system for calibrating sensor readings, comprising: a hardware processor to:identify a vehicle control point for a vehicle, wherein the vehicle control point is a measured center of gravity of the vehicle;read first sensor data from sensors located on the vehicle, the vehicle at a first stationary position and location;read second sensor data from the sensors at substantially the same location and at a second opposite vehicle position;estimate a first pitch angle of the vehicle from the first sensor data;estimate a second pitch angle of the vehicle from the second sensor data;calculate a pitch misalignment of the sensors relative to the vehicle control point by averaging the first and second estimated pitch angles;estimate a first roll angle of the vehicle from the first sensor data;estimate a second roll angle of the vehicle from the second sensor data;calculate a roll misalignment of the sensors relative to a vehicle control point by averaging the first and second estimated roll angles; andcalibrate the sensors based on the pitch misalignment and the roll misalignment.
  • 2. The system of claim 1, the hardware processor further to: read third sensor data while the vehicle travels in a straight line at a constant speed;read fourth sensor data while the vehicle turns to the left;read fifth sensor data while the vehicle turns to the right; andcalculate offsets of the sensors from the lateral and longitudinal axes of the control point based on a yaw bias rate, and the third, fourth, and fifth sensor data.
  • 3. The system of claim 2, the hardware processor further to calculate a yaw misalignment of the sensors relative to the control point based on the yaw bias rate and the third, fourth, and fifth sensor data.
  • 4. The system of claim 2, the hardware processor further to: identify left turn acceleration and yaw rate values from the fourth sensor data;identify right turn acceleration and yaw rate values from the fifth sensor data; andcalculate the offsets of the sensors based on the left turn and right turn acceleration and yaw rate values.
  • 5. The system of claim 2, the hardware processor further to calculate the offsets for an inertial measurement unit (IMU) on the vehicle according to:
  • 6. The system of claim 5, the hardware processor further to calculate a yaw misalignment ψm of the sensors, wherein:
  • 7. The system of claim 2, the hardware processor further to calculate a global positioning system (GPS) antenna offset according to:
  • 8. A computer system for calibrating sensors in a vehicle, the computer system comprising: a processor; andmemory coupled to the processor and storing a set of instructions that, when executed by the processor are operable to:measure a calibration speed V0 of the vehicle while traveling at a constant speed in a straight line;measure a yaw rate bias {dot over (ψ)}b of the vehicle while stationary;measure a left yaw rate {dot over (ψ)}L during a left-hand turn of the vehicle;measure a right yaw rate {dot over (ψ)}R during a right-hand turn of the vehicle;calculate a misalignment of the sensors relative to a lateral and longitudinal vehicle axes based on the calibration speed V0, left yaw rate {dot over (ψ)}L, right yaw rate {dot over (ψ)}R and yaw rate bias {dot over (ψ)}b of the vehicle; andcalibrate the sensors based on the misalignment of the sensors relative to the lateral and longitudinal vehicle axes.
  • 9. The computer system of claim 8, wherein the misalignment of the sensors is a yaw misalignment.
  • 10. The computer system of claim 8, the set of instructions when executed by the processor are further operable to calculate an offset of the sensors from the lateral and longitudinal vehicle axes based on the calibration speed V0, left yaw rate {dot over (ψ)}L right yaw rate {dot over (ψ)}R and yaw rate bias {dot over (ψ)}b of the vehicle.
  • 11. The computer system of claim 8, wherein the sensors are located in an inertial measurement unit (IMU) and the set of instructions when executed by the processor are operable to calculate a misalignment and offset of the IMU relative to the lateral and longitudinal vehicle axes.
  • 12. The computer system of claim 8, wherein the sensors include a global positioning system (GPS) antenna and the set of instructions when executed by the processor are operable to calculate an offset of the GPS antenna relative to the lateral and longitudinal vehicle axes.
  • 13. The computer system of claim 8, wherein the instructions when executed by the processor are further operable to: measure a first pitch angle θ1 of the vehicle at a location when pointed in a first direction;measure a second pitch angle θ2 of the vehicle at substantially the same location when pointed in a second opposite direction; andcalculate a pitch misalignment of the sensors based on the first pitch angle θ1 and the second pitch angle θ2.
  • 14. The computer system of claim 8, wherein the instructions when executed by the processor are further operable to: measure a first roll angle ϕ1 of the vehicle at a location when pointed in a first direction;measure a second roll angle ϕ2 of the vehicle at substantially the same location when pointed in a second opposite direction; andcalculate a roll misalignment of the sensors based on the first roll angle ϕ1 and the second roll angle ϕ2.
  • 15. The computer system of claim 8, wherein the instructions when executed by the processor are further operable to: measure left turn acceleration values for the vehicle;measure right turn acceleration values for the vehicle; andcalculate the misalignment of the sensors based on the left turn acceleration values and the right turn acceleration values.
  • 16. A method for calibrating sensor readings, comprising: reading first sensor data from sensors located on a vehicle at a location and first stationary vehicle position;reading second sensor data from the sensors at substantially the same location and a second opposite stationary vehicle position;reading third sensor data while the vehicle travels in a straight line at a constant speed;reading fourth sensor data while the vehicle turns to the left;reading fifth sensor data while the vehicle turns to the right;calculating misalignments and offsets of the sensors based on the sensor data; andcalibrate the sensors based on the misalignments and offsets of the sensors.
  • 17. The method of claim 16, further comprising: reading a first pitch angle and first roll angle of the vehicle from the first sensor data;reading a second pitch angle and first roll angle of the vehicle from the second sensor data;calculating a pitch misalignment for the sensors based on the first and second pitch angle; andcalculating a roll misalignment for the sensors based on the first and second roll angle.
  • 18. The method of claim 16, further comprising: reading a vehicle speed from the third sensor data;reading left yaw rate and left acceleration from the fourth sensor data;reading right yaw rate and right acceleration from the fifth sensor data; andcalculating a sensor yaw misalignment and sensor offsets from a vehicle control point based on the vehicle speed, left yaw rate, left acceleration, right yaw rate, and right acceleration.
Parent Case Info

The present application claims priority to U.S. Provisional Patent Application Ser. No. 62/257,395, filed on Nov. 19, 2015, entitled SENSOR ALIGNMENT CALIBRATION, which is incorporated by reference in its entirety.

US Referenced Citations (98)
Number Name Date Kind
4713697 Gotou Dec 1987 A
5194851 Kraning et al. Mar 1993 A
5390125 Sennott et al. Feb 1995 A
5446658 Pastor Aug 1995 A
5663879 Trovato et al. Sep 1997 A
5923270 Sampo et al. Jul 1999 A
6052647 Parkinson et al. Apr 2000 A
6070673 Wendte Jun 2000 A
6212453 Kawagoe et al. Apr 2001 B1
6282496 Chowdhary Aug 2001 B1
6373432 Rabinowitz et al. Apr 2002 B1
6377889 Soest Apr 2002 B1
6415223 Lin Jul 2002 B1
6445983 Dickson et al. Sep 2002 B1
6522992 McCall Feb 2003 B1
6539303 McClure et al. Mar 2003 B2
6711501 McClure et al. Mar 2004 B2
6714848 Schubert Mar 2004 B2
6714851 Hrovat Mar 2004 B2
6789014 Rekow et al. Sep 2004 B1
6819780 Benson et al. Nov 2004 B2
6865465 McClure Mar 2005 B2
6876920 Mailer Apr 2005 B1
7142956 Heiniger et al. Nov 2006 B2
7162348 McClure et al. Jan 2007 B2
7277792 Overschie Oct 2007 B2
7373231 McClure et al. May 2008 B2
7400956 Feller et al. Jul 2008 B1
7437230 McClure Oct 2008 B2
7460942 Mailer Dec 2008 B2
7689354 Heiniger et al. Mar 2010 B2
7698036 Watson Apr 2010 B2
RE41358 Heiniger et al. May 2010 E
7835832 Macdonald et al. Nov 2010 B2
7885745 McClure et al. Feb 2011 B2
8018376 McClure et al. Sep 2011 B2
8065104 Fiedler et al. Nov 2011 B2
8190337 McClure May 2012 B2
8214111 Heiniger et al. Jul 2012 B2
8271175 Takenaka Sep 2012 B2
8311696 Reeve Nov 2012 B2
8386129 Collins et al. Feb 2013 B2
8392102 Fiedler Mar 2013 B2
8401704 Pollock et al. Mar 2013 B2
8489291 Dearborn et al. Jul 2013 B2
8521372 Hunt et al. Aug 2013 B2
8532899 Loomis Sep 2013 B1
8548649 Guyette et al. Oct 2013 B2
8566034 Van Wyck Loomis Oct 2013 B1
8583312 Schreiber Nov 2013 B2
8583315 Whitehead et al. Nov 2013 B2
8583326 Collins et al. Nov 2013 B2
8589013 Pieper et al. Nov 2013 B2
8589015 Willis Nov 2013 B2
8594879 Roberge et al. Nov 2013 B2
8634993 McClure et al. Jan 2014 B2
8639416 Jones et al. Jan 2014 B2
8649930 Reeve et al. Feb 2014 B2
8676620 Hunt et al. Mar 2014 B2
8718874 McClure et al. May 2014 B2
8768558 Reeve et al. Jul 2014 B2
8781685 McClure Jul 2014 B2
8803735 McClure Aug 2014 B2
8897973 Hunt et al. Nov 2014 B2
8924152 Hunt et al. Dec 2014 B2
9002565 Jones et al. Apr 2015 B2
9002566 McClure et al. Apr 2015 B2
9141111 Webber et al. Sep 2015 B2
9162703 Miller et al. Oct 2015 B2
9173337 Guyette et al. Nov 2015 B2
9223314 McClure et al. Dec 2015 B2
9255992 McClure Feb 2016 B2
9389615 Webber et al. Jul 2016 B2
9791279 Somieski Oct 2017 B1
20020008661 McCall Jan 2002 A1
20020022924 Begin Feb 2002 A1
20020072850 McClure et al. Jun 2002 A1
20020128795 Schiffmann Sep 2002 A1
20020135420 McCall Sep 2002 A1
20020183899 Wallner Dec 2002 A1
20040150557 Ford et al. Aug 2004 A1
20040186644 McClure et al. Sep 2004 A1
20060167600 Nelson, Jr. et al. Jul 2006 A1
20080048405 DeLorenzis Feb 2008 A1
20090025998 Brandmeier Jan 2009 A1
20090088974 Yasan Apr 2009 A1
20100274452 Ringwald et al. Oct 2010 A1
20120221244 Georgy et al. Aug 2012 A1
20140266877 McClure Sep 2014 A1
20140277676 Gattis Sep 2014 A1
20150175194 Gattis Jun 2015 A1
20160039454 Mortimer Feb 2016 A1
20160040992 Palella Feb 2016 A1
20160154108 McClure et al. Jun 2016 A1
20160205864 Gattis et al. Jul 2016 A1
20160214643 Joughin et al. Jul 2016 A1
20160252909 Webber et al. Sep 2016 A1
20160334804 Webber et al. Nov 2016 A1
Foreign Referenced Citations (1)
Number Date Country
102005033237 Jan 2007 DE
Non-Patent Literature Citations (2)
Entry
Noh, Kwang-Mo, Self-tuning controller for farm tractor guidance, Iowa State University Retrospective Theses and Dissertations, Paper 9874 (1990).
Van Zuydam, R.P., Centimeter-Precision Guidance of Agricultural Implements in the Open Field by Means of Real Time Kinematic DGPS, ASA-CSSA-SSSA, pp. 1023-1034 (1999).
Related Publications (1)
Number Date Country
20170146667 A1 May 2017 US
Provisional Applications (1)
Number Date Country
62257395 Nov 2015 US