The present invention relates to a method and device for determining navigation parameters of an aircraft, in particular of a transport airplane, during a landing phase, which are intended to afford an aid to the navigation of the aircraft.
Within the framework of the present invention, the landing phase comprises the approach and/or the landing proper.
In a standard manner, a device for determining navigation parameters provides an estimation of navigation parameters of the aircraft, with the aid of baro-inertial data and GNSS data.
Within the framework of the present invention, the following is meant:
More particularly, the device for determining navigation parameters is intended to determine and to provide at least some of the following navigation parameters: aircraft position, velocity and attitude parameters, as well as other parameters relating to sensor errors, such as inertial sensor measurement errors, a GNSS receiver clock bias and offset, and GNSS pseudo-distance correlated measurement errors.
The integration of GNSS data with baro-inertial data arising from INS (“Inertial Navigation System”) sensors is one of the main standard solutions for improving aircraft location in difficult environments; in particular, the measurements of the inertial sensors (INS sensors) make it possible to bridge the intervals between successive GNSS data.
In a standard manner, such a device generally carries out an estimation of the navigation parameters with the aid of a Kalman filter in which the INS measurements and the GNSS measurements are integrated.
However, when fusing the INS measurements and the GNSS measurements, the temporal degradation of precision caused by the drift of the INS system must be taken into account. This drift depends on the quality of the INS system, and the choice of the INS system results from a compromise between performance and cost. Thus, the drift may vary, for example, from 1 meter per minute to several hundred meters per minute according to the INS system, and costs likewise differ widely among INS systems.
It may therefore be advantageous to be able to deploy a very precise and low-cost solution for determining navigation parameters.
The object of the present invention is to propose such a solution. It relates to a method for determining navigation parameters of an aircraft during a landing phase, said method comprising steps implemented in an automatic and repetitive manner and entailing:
in a first step, determining on the aircraft at least:
in a second step, computing the navigation parameters on the basis at least of said first and second data, with the aid of an extended Kalman filter.
According to the invention:
the first step comprises, moreover, an operation entailing determining, on the aircraft, video data corresponding to current data relating to at least one characteristic point on the Earth, whose coordinates are known, said video data being generated by at least one digital video camera arranged on the aircraft and observing said characteristic point; and
the second step is configured to compute the navigation parameters on the basis also of said video data, in addition to said first and second data.
Advantageously, for the implementation of the second step, the extended Kalman filter is configured and adapted to take into account the video data so as to compute the navigation parameters.
Thus, by virtue of the integration of the video data, it is possible to improve the performance (precision, integrity and availability) of the navigation parameters determined with the aid of the extended Kalman filter. In particular, it is thus possible to obtain precise navigation parameters able to be used on the aircraft to aid it with navigation during the landing phase (approach and/or landing), using for this purpose baro-inertial data obtained for example on the basis of a lower-cost INS system.
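The fusion principle invoked here can be sketched in a deliberately simplified, one-dimensional form: an INS-propagated estimate whose uncertainty grows over time is periodically corrected by a noisy GNSS fix. All names, gains and noise values below are illustrative assumptions, not values from the invention.

```python
# Minimal 1-D illustration of INS/GNSS fusion with a Kalman filter.
# The noise levels q (INS drift per step) and r (GNSS variance) are
# illustrative assumptions.

def kalman_fuse(ins_increments, gnss_fixes, q=0.04, r=25.0):
    """Propagate a position estimate with INS increments (prediction)
    and correct it with GNSS fixes when available (update)."""
    x, p = 0.0, 1.0  # state estimate and its variance
    for dx, z in zip(ins_increments, gnss_fixes):
        # Prediction: integrate the INS increment; variance grows (drift).
        x += dx
        p += q
        if z is not None:
            # Update: blend in the GNSS fix according to the variances.
            k = p / (p + r)          # Kalman gain
            x += k * (z - x)
            p *= (1.0 - k)
    return x, p
```

A biased INS increment accumulates error between fixes; each GNSS update pulls the estimate back toward the true position and shrinks the variance, which is exactly what makes a lower-cost (higher-drift) INS usable.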
The video camera or cameras used for the implementation of the present invention may be:
Advantageously, in the first step of the method, the video data comprise the following angular measurements:
a first measurement, at a focal point of the video camera, of a first angle between an axis of an aircraft frame and a projection on a vertical plane of a line of sight of the video camera, observing said characteristic point; and
a second measurement, at a focal point of the video camera, of a second angle between the axis of the aircraft frame and a projection on a horizontal plane of the line of sight of the video camera, observing said characteristic point.
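One common way to obtain such angular measurements from a calibrated camera is the pinhole model, in which a pixel offset from the image center maps to a line-of-sight angle through the focal length. The helper below is an illustrative assumption about the video processing, not the processing chain of the invention.

```python
import math

def pixel_to_angles(u, v, focal_px):
    """Convert a pixel offset (u rightward, v downward) of the observed
    characteristic point from the image center into horizontal and
    vertical line-of-sight angles, under a pinhole camera model with
    focal length focal_px expressed in pixels (illustrative assumption)."""
    alpha_horizontal = math.atan2(u, focal_px)  # angle in the horizontal plane
    alpha_vertical = math.atan2(v, focal_px)    # angle in the vertical plane
    return alpha_horizontal, alpha_vertical
```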
Furthermore, in an advantageous manner, the second step of the method takes into account an observation matrix comprising a first observation matrix relating to said first data and a second observation matrix relating to said video data, and said second observation matrix comprises the tangent of the first measurement and the tangent of the second measurement which are defined with respect to the following parameters:
the latitude, the longitude and the height with respect to the ground of the aircraft; and
the angles of roll, of pitch and of yaw of the aircraft.
Within the framework of the present invention, the characteristic point on the Earth which is used for the measurement of video data can represent any point which can be identified by the video camera (and located on the images taken by the video camera), and whose coordinates are known.
Preferably, this characteristic point corresponds to a particular point of a runway scheduled for a landing of the aircraft during the landing phase, and especially the threshold of the runway.
Moreover, advantageously, said navigation parameters comprise at least some of the following data:
a position parameter of the aircraft;
a velocity parameter of the aircraft;
an attitude parameter of the aircraft; and
at least one parameter relating to an error of at least one sensor.
The present invention also relates to a device for determining navigation parameters of an aircraft during a landing phase.
Said device of the type comprising:
a first data generating unit, configured to determine first data corresponding to current navigation data of the aircraft, arising from a satellite navigation system;
a second data generating unit, configured to determine second data corresponding to current inertial data of the aircraft; and
a data processing unit comprising an extended Kalman filter and configured to determine the navigation parameters on the basis at least of said first and second data,
is noteworthy, according to the invention, in that said device comprises, moreover, a video system comprising at least one digital video camera arranged on the aircraft, the digital video camera being configured to generate on the aircraft current video data relating to at least one characteristic point on the Earth, whose coordinates are known, and in that the data processing unit is configured to determine the navigation parameters on the basis also of said video data.
In a particular embodiment, said device also comprises user means which make use of the navigation parameters determined by the data processing unit.
The present invention also relates to an aircraft navigation system, which comprises the aforementioned device.
The present invention relates, furthermore, to an aircraft, in particular a transport airplane, which comprises a navigation device and/or system, such as those aforementioned.
The appended figures will elucidate the manner in which the invention may be embodied. In these figures, identical references designate similar elements.
The device 1 illustrating the invention, and represented schematically in the appended figures, is onboard the aircraft AC and comprises, in a standard manner:
a data generating unit 2 which comprises at least one standard receiver 3 (preferably a GPS receiver) associated with a global navigation satellite system of GNSS (“Global Navigation Satellite System”) type, especially a satellite positioning system of GPS (“Global Positioning System”) type. The data generating unit 2 is configured to determine, in a standard manner, first data corresponding to current navigation data (or GPS or GNSS data) of the aircraft AC, on the basis of the information received by the receiver 3 associated with the satellite navigation system;
a data generating unit 4, for example an inertial reference and anemobarometric data system of ADIRS (“Air Data and Reference System”) type, which combines both inertial data and barometric data. The data generating unit 4 comprises a plurality of standard inertial sensors 5A to 5M (INS sensors), M being an integer, and is configured to determine, in a standard manner, second data corresponding to current inertial data (or baro-inertial or INS data) of the aircraft AC; and
a data processing unit 6 which comprises an extended Kalman filter 7 and which is connected by way of links 8 and 9, respectively, to said units 2 and 4. The data processing unit 6 is configured to determine the navigation parameters, on the basis at least of said first and second data received from the units 2 and 4 via the links 8 and 9.
According to the invention, said device 1 comprises, moreover, a video system 10 comprising at least one digital video camera 11 which is arranged on the aircraft.
The video camera 11 is arranged on the aircraft AC in such a way as to take video images of the outside environment in front of the aircraft AC.
This video camera 11 is configured to generate on the aircraft AC current video data relating to at least one characteristic point 12 on the Earth T, whose coordinates are known. Accordingly, the device 1 can comprise a database 14 which contains the coordinates (longitude and latitude especially) of said characteristic point 12 and which is, for example, integrated into the data processing unit 6.
This characteristic point 12 (or target point) can be any point which can be identified by the video camera 11 (and located on the video images taken by the video camera 11), and whose coordinates are known. Preferably, this characteristic point 12 corresponds to a particular point of a runway scheduled for a landing during the landing phase, and especially the threshold of the runway.
The video system 10 also comprises a standard unit 21 for processing video data, which processes the data generated by the video camera 11 and which provides the video data specified hereinbelow.
Moreover, according to the invention, the data processing unit 6 is configured to determine the navigation parameters on the basis also of the video data generated by the video system 10 and received by way of a link 13, as specified hereinbelow, in addition to the first and second aforementioned data.
Accordingly, the extended Kalman filter 7 of the data processing unit 6 is configured and adapted to take into account the video data so as to compute the navigation parameters, as specified hereinbelow.
Thus, by virtue of the integration of the video data, the device 1 is able to improve the performance (precision, integrity and availability) of the navigation parameters determined with the aid of the extended Kalman filter 7. In particular, the device 1 thus makes it possible to obtain precise navigation parameters, able to be used on the aircraft AC to aid it with navigation during the landing phase, using for this purpose baro-inertial data obtained for example on the basis of a lower-cost unit 4.
The video system 10 used for the implementation of the present invention can comprise:
either one or more cameras already installed on the aircraft, this being the case especially on modern commercial airplanes;
or one or more cameras dedicated to the implementation of the present invention and installed specifically with this aim.
In a particular embodiment, said device 1 also comprises a set 15 of user means onboard the aircraft AC, for example display systems and/or computers in the cockpit, for command of control surfaces of the aircraft AC (flight controls) or else for guidance of the aircraft AC (automatic pilot), which use the navigation parameters determined by the data processing unit 6 (and received via a link 19).
The present invention also relates to a navigation system of the aircraft AC, which comprises said device 1 and which is charged with the navigation of the aircraft AC especially during the landing phase.
The data processing unit 6 comprises an extended Kalman filter 7 with an error state vector which estimates errors of the position, velocity and attitude parameters, as well as other parameters relating to sensor errors, such as inertial sensor (sensors 5A to 5M) measurement errors, a clock bias and offset of a receiver 3 of GNSS type, and GNSS pseudo-distance correlated measurement errors.
In the following description of the invention, the following parameters are considered:
Moreover, some of the frames used in the implementation of the invention have been represented in the appended figures:
A/(X,Y,Z) is an inertial frame (or I frame). It is defined as a reference frame, in which Newton's laws of motion apply. The origin O of the inertial frame coincides with the center of mass of the Earth T. The X axis is directed toward the vernal point (or “vernal equinox”), the Z axis is directed along the rotation axis of the Earth T, and the Y axis is defined so as to complete the right-handed coordinate system;
B/(Xe,Ye,Ze) is a frame termed E (or ECEF frame for “Earth-Centered Earth-Fixed frame”). Its origin O is fixed at the center of the Earth T. The axis Ze is aligned with the Z axis of the inertial frame. The ECEF frame rotates with respect to the inertial frame at a rate of:
ω_e/i ≈ 7.292115×10⁻⁵ rad/s
In the ECEF frame, two coordinate systems can be used:
The relation between the two sets of coordinates is as follows:
X = (R_E + h)·cos λ·cos φ
Y = (R_E + h)·cos λ·sin φ
Z = ((1 − e²)·R_E + h)·sin λ
where e = 0.0818 is the eccentricity and R_E is the terrestrial radius;
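These relations translate directly into code. The numerical value of R_E below (6 378 137 m, the WGS-84 semi-major axis) is an assumed constant, since the text does not fix it numerically; the eccentricity is the value given above.

```python
import math

E = 0.0818        # eccentricity (value given in the text)
R_E = 6378137.0   # assumed terrestrial radius in meters (WGS-84 semi-major axis)

def geodetic_to_ecef(lat, lon, h):
    """ECEF coordinates (X, Y, Z) from latitude lat and longitude lon
    (radians) and height h (meters), per the equations above."""
    x = (R_E + h) * math.cos(lat) * math.cos(lon)
    y = (R_E + h) * math.cos(lat) * math.sin(lon)
    z = ((1.0 - E ** 2) * R_E + h) * math.sin(lat)
    return x, y, z
```

At zero latitude, longitude and height, the point lands on the equator at distance R_E from the center of the Earth, as expected.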
C/(N,E,D) is a geographical navigation frame (NED frame or N frame) which is defined locally with respect to the geoid of the Earth T. The axis Zn(D) is directed toward the interior of the ellipsoid along the normal to the ellipsoid. The axis Xn(N) is directed northward, and the axis Yn(E) is directed eastward to complete the right-handed coordinate system. The origin of the frame is the projection of the origin of the platform on the geoid of the Earth T.
The inertial rotation rate of the Earth T, expressed in the N frame, is:
D/(Xb,Yb,Zb) is a body or moving object frame (B frame or M frame) which is tied rigidly to the vehicle considered, in this instance to the aircraft AC, usually at a fixed point such as its center of gravity G;
E/(Xp,Yp,Zp) is a platform frame (P frame) which is considered to be aligned with the body frame (aircraft frame); and
F/(Xw,Yw,Zw) is an azimuth reference frame termed “Wander” (W frame) which solves the high-latitude problem encountered by the geographical frame. The definition is expressed in terms of angular velocity of the frame with respect to the Earth frame. Thus, if the vector of instantaneous rotation of the N frame with respect to the E frame, expressed in the N frame, is:
then the vector of instantaneous rotation of the W frame with respect to the E frame, expressed in the N frame, is:
The data received and used by the data processing unit 6 of the device 1 are now specified.
Firstly, in a standard manner, the unit 2 provides the data processing unit 6, via the link 8, with measurements of pseudo-distances as current navigation data.
Secondly, the unit 4 provides (via the link 9) the following data as current inertial data:
Thirdly, the video system 10 detects from the aircraft AC one or more characteristic points 12 of the exterior environment, and especially of the environment of the runway scheduled for the landing, and it provides, for each characteristic point considered, the following data, as represented in the appended figures:
The data processing unit 6 uses the previous data to determine the navigation parameters with the aid of the extended Kalman filter 7.
The processing operations implemented by the extended Kalman filter 7 are presented hereinafter while specifying firstly the use of the baro-inertial data and navigation data (GPS data) before specifying the use of the video data.
Moreover, in the description hereinbelow:
The radius of curvature R_M of the Earth T along a meridian 18 and the transverse radius of curvature R_N are:
R_M = a·(1 − e²)/(1 − e²·sin²λ)^(3/2)
R_N = a/(1 − e²·sin²λ)^(1/2)
where a is the semi-major axis of the terrestrial ellipsoid.
The radii of curvature in the directions Xw and Yw of the W frame are:
1/(R_x + h_B) = cos²w/(R_N + h_B) + sin²w/(R_M + h_B)
1/(R_y + h_B) = cos²w/(R_M + h_B) + sin²w/(R_N + h_B)
1/(R_xy + h_B) = cos w·sin w·(1/(R_N + h_B) − 1/(R_M + h_B))
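As a numerical sketch of the radii-of-curvature relations above (the semi-major axis a ≈ 6 378 137 m is an assumed WGS-84 value; the eccentricity is the value from the text):

```python
import math

A = 6378137.0   # assumed semi-major axis of the terrestrial ellipsoid (m)
E = 0.0818      # eccentricity (value given in the text)

def meridian_transverse_radii(lat):
    """Meridian radius R_M and transverse radius R_N at latitude lat (rad)."""
    s = 1.0 - E ** 2 * math.sin(lat) ** 2
    r_m = A * (1.0 - E ** 2) / s ** 1.5
    r_n = A / math.sqrt(s)
    return r_m, r_n

def wander_radii(lat, w, h_b=0.0):
    """Radii of curvature R_x and R_y along the Xw and Yw axes of the
    W frame, for wander angle w and baro-inertial altitude h_b."""
    r_m, r_n = meridian_transverse_radii(lat)
    inv_rx = math.cos(w) ** 2 / (r_n + h_b) + math.sin(w) ** 2 / (r_m + h_b)
    inv_ry = math.cos(w) ** 2 / (r_m + h_b) + math.sin(w) ** 2 / (r_n + h_b)
    return 1.0 / inv_rx - h_b, 1.0 / inv_ry - h_b
```

For a zero wander angle the W frame coincides with the NED frame, so R_x reduces to R_N and R_y to R_M.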
Moreover, the components of the error state vector δX for the processing of the extended Kalman filter 7 (implemented by the data processing unit 6) are, successively, as follows:
The propagation matrix is specified first.
For this purpose, firstly, the horizontal angular position error transition matrix is:
Secondly, concerning the baro-inertial altitude error transition matrix, the equations of the baro-inertial altitude error model are:
δḣ_B = −δV_Z − K1·(δh_B − δh_baro)
δȧ_B = K3·(δh_B − δh_baro)
where K1, K2, and K3 are the gains of a third-order baro-inertial loop.
The transition state matrix is then:
The complete 3D position error transition matrix is thus:
Thirdly, concerning the horizontal velocity error transition matrix, the equation for the INS horizontal velocity error in the W frame is derived from that for the navigation frame, specified subsequently:
δv̇_e^w = δf_m/i^w + (−(ω̂_w/e^w + 2ω̂_e/i^w)Λ)·δv_e^w + (v̂_e^w Λ)·(δω_w/e^w + 2δω_e/i^w) + (−f̂_m/i^w Λ)·ρ + δg^w
The gravity error vector depends on the gravity model error, g0 being the gravitational constant. The gravity model error is not modeled in the state vector; these errors are integrated into the velocity state noise vector.
For the equation for the horizontal velocity error, the first two rows of the gravity error are negligible:
Fvelo/baro=0
The computation of δωw/ew+2δωe/iw is carried out with the aid of the expression for the partial derivatives. For this purpose, it is necessary to firstly compute the following partial derivatives:
The rotation of the W frame with respect to the E frame and the rotation of the E frame with respect to the I frame, both expressed in the W frame, are:
The horizontal velocity error dynamic matrix FveloINS then comprises the following successive parameters:
F_velo/pos = (v̂_e^w Λ)·F_velo/pos1
F_velo/baro = 0_(3×2)
F_velo/velo = (v̂_e^w Λ)·F_velo/velo1 − F_velo/velo2
F_velo/att = (−f̂_m/i^w Λ)
F_velo/bg = 0_(3×3)
F_velo/ba = R̃_m2w
The final horizontal velocity error dynamic matrix Fhorizvelo is composed of the first two rows of the matrix FveloINS. The baro-inertial vertical speed error transition matrix is computed hereinafter.
Fourthly, concerning the baro-inertial vertical speed error transition matrix, the equation for the model of baro-inertial vertical speed error is:
δV̇_z = δf_corr,z − 2·g0·δh_B/R_E + δa_B + K2·(δh_B − δh_baro)
where f_corr,z is the vertical specific force of the aircraft AC with inertial corrections.
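The altitude, vertical speed and bias error equations above form a third-order feedback loop that can be integrated numerically. The gains K1 = 3, K2 = 3, K3 = 1 below are illustrative choices that place the loop poles near s = −1 (the text does not specify gain values), and the specific-force error term is taken as zero.

```python
G0 = 9.81        # gravitational constant (m/s^2), illustrative value
R_E = 6378137.0  # terrestrial radius (m), illustrative value

def simulate_baro_loop(dh0, k1=3.0, k2=3.0, k3=1.0,
                       dt=0.01, steps=2000, dh_baro=0.0):
    """Euler integration of the baro-inertial error equations:
       d(dh_B)/dt = -dV_z - K1*(dh_B - dh_baro)
       d(dV_z)/dt = -2*g0*dh_B/R_E + da_B + K2*(dh_B - dh_baro)
       d(da_B)/dt = K3*(dh_B - dh_baro)
    (specific-force error term taken as zero here). Returns the final
    altitude error dh_B starting from an initial error dh0."""
    dh, dv, da = dh0, 0.0, 0.0
    for _ in range(steps):
        e = dh - dh_baro
        dh_dot = -dv - k1 * e
        dv_dot = -2.0 * G0 * dh / R_E + da + k2 * e
        da_dot = k3 * e
        dh += dh_dot * dt
        dv += dv_dot * dt
        da += da_dot * dt
    return dh
```

With these gains an initial altitude error of several meters decays toward the baro reference within a few tens of seconds, which is the purpose of the third-order baro-inertial loop.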
The transition state matrix is then:
The complete 3D velocity error transition matrix is:
Fifthly, concerning the attitude error transition matrix, the INS attitude error equation in the W frame, is as follows:
ρ̇ = −Ω̂_w/i^w·ρ + R̂_m2w·δω_m/i^m − δω_w/i^w
The computation of δω_w/i^w and of Ω̂_w/i^w is defined on the basis of the corresponding equations indicated hereinbelow:
ω_w/i^w = ω_w/e^w + ω_e/i^w
F_att/pos = −(∂ω_w/e^w/∂(θx,θy,h) + ∂ω_e/i^w/∂(θx,θy,h))
F_att/velo = −(∂ω_w/e^w/∂v_e^w + ∂ω_e/i^w/∂v_e^w)
F_att/att = (−ω_w/i^w Λ)
F_att/bg = R̂_m2w
F_att/ba = 0
The attitude dynamic matrix may then be written:
F_att = [F_att/pos F_att/velo F_att/att F_att/bg F_att/ba]
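Two mechanical operations recur in the expressions above: the (· Λ) operator, which, as is conventional in inertial navigation, denotes the skew-symmetric cross-product matrix, and the horizontal concatenation of blocks into a dynamic matrix such as F_att. Both can be sketched in pure Python (illustrative helpers, not the invention's implementation):

```python
def skew(v):
    """Skew-symmetric matrix (v Λ) such that skew(v) @ u = v x u."""
    x, y, z = v
    return [[0.0, -z, y],
            [z, 0.0, -x],
            [-y, x, 0.0]]

def hstack(*blocks):
    """Horizontally concatenate row-aligned matrix blocks, e.g.
    F_att = hstack(F_att_pos, F_att_velo, F_att_att, F_att_bg, F_att_ba)."""
    return [sum((b[i] for b in blocks), []) for i in range(len(blocks[0]))]
```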
And the INS dynamic matrix is:
Moreover, sixthly, the discrete matrix of transition of measurements of the sensors is:
Moreover, seventhly, concerning the drift and the bias of the clock of the GPS receiver (receiver 3), the transition processing of the clock drift and bias is:
The clock drift and bias transition matrix Fclock is:
Moreover, eighthly, concerning the correlated errors of GPS pseudo-distances, the discrete transition matrix FerrGPS for the correlated errors of pseudo-distances is:
The final GPS dynamic matrix FGPS is then:
And the global dynamic matrix is:
The state noise covariance matrix relating to the extended Kalman filter 7 is now defined.
Firstly, the INS and IMU state noise covariance matrix is:
The diagonal components reflect the dynamics of the evolution of the inertial error state as adapted to the inertial sensor classes.
Secondly, the GPS state noise covariance function is:
The diagonal components reflect the dynamics of the evolution of the GPS error state as adapted to the type of receiver.
The global state noise covariance matrix is thus:
The observation matrix relating to the extended Kalman filter 7 is now defined.
Concerning the observation matrix, firstly, the GPS pseudo-distance observation function is:
with:
The linear observation matrix HGPS is then:
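A standard way to linearize the pseudo-distance observation, sketched below, is to fill the position entries of each H_GPS row with minus the unit line-of-sight vector from the estimated position to the satellite, and a 1 for the receiver clock-bias state. This is the conventional GNSS linearization, offered as an illustration; the exact layout of H_GPS in the invention is not reproduced here.

```python
import math

def pseudo_distance_row(receiver_pos, satellite_pos):
    """Predicted pseudo-distance (geometric range) and the corresponding
    observation-row entries: minus the unit line-of-sight vector for the
    position states, plus 1.0 for the receiver clock-bias state."""
    dx = [s - r for s, r in zip(satellite_pos, receiver_pos)]
    rng = math.sqrt(sum(d * d for d in dx))
    los = [d / rng for d in dx]          # unit line-of-sight vector
    return rng, [-c for c in los] + [1.0]
```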
Moreover, secondly, concerning the observation matrix for the video measurements generated by the video system 10, the observation matrix for optical angular measurements is:
The optical angular measurements are defined as follows:
tan(α_x) = f_x(λ,φ,h_B,θ)·cos φ + f_y(λ,φ,h_B,Ψ)·sin φ
tan(α_y) = −f_x(λ,φ,h_B,θ)·sin φ + f_y(λ,φ,h_B,Ψ)·cos φ
f_x(λ,φ,h_B,θ) = tan(θ + π/2 − α_D) = (1 + tan α_D·tan θ)/(tan α_D − tan θ)
L = R_T·sin(GCA)
Reference is made, for the previous processing operations, to the appended figures.
It is known that half the versed sine (“haversine”) of an angle is defined by:
haversine(x) = sin²(x/2) = (1 − cos(x))/2
Half the versed sine (“haversine”) of the great circle angle GCA is:
haversine(GCA) = haversine(Δλ − Δφ) + sin Δλ·sin Δφ·haversine(π/2) = (1 − cos Δλ·cos Δφ)/2
The great circle angle GCA is:
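Putting the haversine relations together, the great circle angle is recovered by inverting the haversine, GCA = 2·asin(√haversine(GCA)), and the distance to the characteristic point follows from L = R_T·sin(GCA). The value of R_T below (6 378 137 m) is an assumed Earth radius, since the text does not fix it numerically.

```python
import math

R_T = 6378137.0  # assumed Earth radius (m)

def great_circle_angle(d_lat, d_lon):
    """Great circle angle from the latitude/longitude differences
    (radians) between the aircraft and the characteristic point, using
    haversine(GCA) = (1 - cos(d_lat)*cos(d_lon)) / 2 as above."""
    hav = (1.0 - math.cos(d_lat) * math.cos(d_lon)) / 2.0
    return 2.0 * math.asin(math.sqrt(hav))

def ground_distance(d_lat, d_lon):
    """Distance L = R_T * sin(GCA) to the characteristic point."""
    return R_T * math.sin(great_circle_angle(d_lat, d_lon))
```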
Reference is made, for the previous processing operations, to the appended figures.
The functions which describe the observation function are:
The linearized observation matrix Hvideo is:
The partial derivatives of k with respect to λ and to φ are:
On the basis of the previous data, the global observation matrix H is obtained, which corresponds, within the framework of the present invention, to:
Moreover, concerning the measurement noise covariance matrix:
The device 1, such as described hereinabove, integrates video data, thereby making it possible to improve the precision of the navigation parameters determined with the aid of the extended Kalman filter 7 (adapted in an appropriate manner, as specified hereinabove) of the data processing unit 6 of the device 1. The device 1 thus affords an aid to the navigation of the aircraft AC, especially during the approach and/or landing, being for example integrated into a navigation system of the aircraft AC.
Moreover, the integration of the additional visual measurements (video data), derived from a video processing operation of at least one digital video camera 11, makes it possible to obtain a navigation device 1 and/or system, which are autonomous (and which do not require, in particular, any ground stations).
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
14 51848 | Mar 2014 | FR | national

U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
6028624 | Watkins | Feb 2000 | A
6157876 | Tarleton, Jr. et al. | Dec 2000 | A
6405975 | Sankrithi | Jun 2002 | B1
7616130 | Astruc | Nov 2009 | B2
7855675 | Fouet | Dec 2010 | B2
20030225487 | Robert et al. | Dec 2003 | A1
20050125142 | Yamane | Jun 2005 | A1
20080269966 | Markiton | Oct 2008 | A1
20100232639 | Ibrahim | Sep 2010 | A1
20110142281 | He | Jun 2011 | A1
20110282580 | Mohan | Nov 2011 | A1
20120022784 | Louis | Jan 2012 | A1
20130282208 | Mendez-Rodriguez | Oct 2013 | A1
20140236398 | Zhang | Aug 2014 | A1

Foreign Patent Documents

Number | Date | Country
---|---|---
1 335 258 | Aug 2003 | EP

Other Publications

Search Report (FR 14 51848) dated Nov. 13, 2014.

Publication

Number | Date | Country
---|---|---
20150253150 A1 | Sep 2015 | US