The present subject matter relates generally to dialog devices and methods for an aircraft, for example a transport airplane, enabling a dialog between an operator of the aircraft, in particular a pilot, and a guidance system of the aircraft.
Airplanes that are provided with a guidance system, either a flight director that computes piloting targets on the basis of guidance targets or an automatic piloting system that makes it possible to follow guidance targets automatically, are typically provided with an item of equipment, for example one called FCU (Flight Control Unit) on airplanes of the AIRBUS type or one called MCP (Mode Control Panel) on airplanes of the BOEING type, that enables a pilot of the airplane to enter guidance targets into the guidance system. Generally, the pilot chooses a guidance target, then he or she controls the engagement (activation) of the associated guidance mode, so that it takes into account either the value entered (in a so-called “selected” mode), or a value computed by the system according to various criteria (in a so-called “managed” mode).
More particularly, the pilot can, with respect to the speed axis, enter a speed (i.e., calibrated airspeed CAS) or Mach target or give control to the system so as to use a speed or Mach target computed on the basis of certain criteria. On the lateral axis, the pilot can enter a heading (HEADING) or route (TRACK) target or give control to the system so as to use the route from the predefined flight plan. On the vertical axis, the pilot can provide a level, follow an axis (e.g., an approach axis), enter an altitude target, indicate how to reach this altitude target by observing a vertical speed or a gradient, by optimizing the climb or descent time while observing an air speed, or by observing a geometrical vertical profile defined by the system according to certain criteria. These targets are taken into account by the guidance system, either directly as soon as their value is modified if the associated mode is active, or after validation (i.e., engagement of the associated mode) in the case where another guidance mode is initially engaged. In the latter case, the target is to be preset before its validation.
Each selection of a target to be reached or to be maintained corresponds to a guidance mode of the airplane, and exactly one mode is engaged on each axis (speed, lateral, vertical). As an illustration, on the lateral axis, a heading or route can be captured or maintained, the trajectory of the flight plan can be joined or maintained, or the approach axis can be captured or maintained in the horizontal plane. On the vertical axis, an altitude can be captured or maintained, a desired altitude can be reached (in climb or descent) while observing an air speed, a climb or descent can be performed while observing a vertical speed or a gradient, a climb or descent can be performed while observing a geometrical profile or altitude constraints, or the approach axis can be captured or maintained in the vertical plane.
A synthetic summary of the behavior of the guidance system (flight director or automatic piloting system, associated or not with an automatic thrust control) is produced, generally, on the screens displaying the primary flight parameters, of PFD (Primary Flight Display) type, on a panel of FMA (Flight Mode Annunciator) type. This synthetic summary reviews, generally, the guidance modes that are engaged (active) on each axis (speed, lateral, vertical), as well as the guidance modes that are armed, that is to say those which have been requested by the pilot and which will be engaged automatically when conditions for engaging the mode are satisfied. As an example, when the airplane is outside the trajectory of the flight plan, flying in heading-maintain mode converging toward that trajectory with the join-or-maintain-flight-plan mode armed, the latter mode is engaged automatically on approaching the flight plan.
In most airplanes with two pilots, the control unit of the guidance system is situated in the center of the cockpit (above the screens showing the flight parameters) so that both pilots can access it. This control unit, for example of FCU type, makes it possible to select guidance targets, to engage the modes associated with a guidance target (render the mode active), or to request the arming of the mode, and to change reference (for example heading rather than route) for a guidance target.
The task of the pilot responsible for the guidance of the airplane is to select the guidance targets and modes. Currently, he or she performs this task through the dedicated control unit (FCU or MCP) which is located between the two pilots, then he or she has to check the selection of his or her targets (values) on the primary flight screen which is located facing him or her (PFD, standing for Primary Flight Display) and/or on the navigation screens (ND, standing for Navigation Display in the lateral plane; VD, standing for Vertical Display in the vertical plane). Then, the guidance is monitored on these screens which indicate the behavior of the guidance. For instance, the guidance can be a summary of the behavior via the synthesis of the modes that are armed and engaged (e.g., shown on an FMA panel), a display of guidance targets (e.g., speed CAS, heading/route, altitude, vertical speed/gradient) and deviations in relation to the current parameters of the airplane (e.g., shown on a PFD screen), or margins in relation to the limits, such as a margin in relation to the minimum operational speed and stall speed (e.g., shown on a PFD screen).
This standard solution presents drawbacks, however, such as the pilot having to select the guidance targets and modes in one place (control unit FCU), then check and monitor the behavior of the airplane in another place (on the playback screens). This involves visual toing and froing and a dispersion of the guidance elements between the control and the display of the behavior of the system. In addition, the control unit is a physical item of equipment that is costly and difficult to modify (because it is of hardware type), and this control unit is bulky in the cockpit.
The present subject matter provides novel dialog devices and methods for an operator, notably a pilot, of an aircraft and a guidance system of the aircraft, which make it possible to remedy the above-mentioned drawbacks. To this end, and according to the subject matter disclosed herein, the dialog device can be installed on the aircraft and can comprise a global screen configured for displaying guidance information related to each of a navigation display, a vertical display, and a primary flight display. The global screen can comprise at least one graphic object that can be produced in the form of an interaction element that can represent a control feature that can be grasped and moved along a path, such as a curve, by an operator so as to modify a value of at least one guidance target of the guidance system. Thus, by virtue of the present subject matter, there is on the screen (e.g., PFD, ND, or VD type) at least one interaction element that is associated with a guidance target of the guidance system and that not only makes it possible to restore the value of this guidance target with which it is associated, but also enables an operator to modify this value on the screen. In this way, the control and the monitoring are combined or co-located.
The present subject matter can be applied to any guidance target used by a guidance system and in particular to the following guidance targets: speed/Mach, heading/route, altitude, vertical speed/gradient. An interaction function (direct) can thus be obtained on a screen (which was hitherto dedicated only to the display of the flight parameters and guidance), through an interaction element (namely a graphic object allowing an interaction) associated with a guidance target.
This interaction element can be grasped or selected and moved by an operator along a curve (e.g., on a scale, which can appear dynamically and contextually when modifying a target) so as to modify the associated guidance target. By way of example, the present subject matter can make it possible to grasp an interaction element indicating a heading target, move it along a heading scale (a heading rose for example) to modify the heading target so that the new heading target is taken into account by the guidance system of the aircraft. The path, such as a curve, which is predefined can be a scale of values displayed by default or an independent path or curve on which a scale of values can appear dynamically and contextually.
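By way of a purely illustrative sketch, and not as part of the original disclosure, the conversion of a pointer position dragged along a heading rose into a heading target might be computed as follows; the function name, the screen coordinate convention, and the example coordinates are assumptions introduced here for illustration only:

```python
import math

def heading_from_pointer(cx, cy, px, py):
    """Convert a pointer position (px, py) dragged along a heading rose
    centered at (cx, cy) into a heading target in degrees [0, 360).

    Assumed screen convention: x grows rightward, y grows downward,
    and heading 0 (north) points straight up from the rose center."""
    dx = px - cx
    dy = py - cy
    # Angle measured from the upward (north) direction, clockwise positive
    angle = math.degrees(math.atan2(dx, -dy))
    return angle % 360.0

# Dragging the interaction element to the right of the rose center
# corresponds to an easterly heading target.
print(heading_from_pointer(100, 100, 150, 100))  # 90.0 (east)
```

Once such a value is obtained, it would be handed to the guidance system as the new heading target, either immediately (selected mode active) or as a presetting.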
A dialog device according to the present subject matter, of interactive type, thus makes it possible for the pilot to select guidance targets (as well as guidance modes, as specified below) in the same place (screen) where he or she can check and monitor the behavior of the aircraft. This arrangement avoids the visual toing and froing and a dispersion of the guidance elements that exists on the standard dialog devices. The dialog device can further make it possible, in circumstances specified below, to do away with a control unit (e.g., FCU type), which is an item of equipment that is costly, difficult to modify and bulky.
In one particular configuration, the interaction element can comprise a plurality of states which allow different actions to be implemented. In this case, advantageously, the interaction element can be movable to any of a plurality of states which allow at least some of the following different actions to be implemented: modifying a guidance target, called selected, which is directly applied by the guidance system; modifying a preset guidance target, which will be applied by the guidance system after validation; engaging a capture or maintain mode for a selected guidance target; and/or engaging a capture or maintain mode for a computed guidance target (called “managed”). Furthermore, advantageously the transition from one state to another of the interaction element can be generated by a corresponding movement thereof.
Moreover, in one configuration, the dialog device can comprise a plurality of interaction elements, each of which is intended for a given guidance target (speed/Mach, heading/route, altitude, vertical speed/gradient) of the guidance system. The use of a plurality of interaction elements, namely an interaction element for each guidance target, on the screens dedicated to the playback of the flight parameters and of the guidance (PFD, ND, VD) makes it possible to directly implement on these screens all the functions of a standard physical control unit, for example of FCU type, and therefore to do away with such a control unit, which represents a significant saving in particular in terms of cost, weight and bulk.
In one particular configuration, the global screen can generate a dynamic visual feedback on a predicted trajectory associated with the guidance target, which makes it possible to have directly on the same screen both a way for selecting the guidance target, for displaying its value, and an indication of the effect generated on the trajectory of the aircraft. This embodiment is particularly advantageous operationally, since the pilot can immediately interpret the impact of his or her guidance target modifications on the trajectory, and can do so without the need for any visual toing and froing between a control panel and a display screen. Furthermore, in this case, advantageously the screen can automatically display at least one characteristic point of the predicted trajectory, and the interaction element is capable of acting on the characteristic point(s), thus displayed, of the predicted trajectory to modify them.
In a first embodiment of a dialog device, the screen can be a touch screen, and a graphic object can be controlled by a direct contact (e.g., finger contact) on the part of the operator on this touch screen. Furthermore, in a second embodiment, the dialog device can comprise, in addition to the screen, a control device, such as a trackball or a touchpad in particular (of the multi-touch type or not), that can be linked to the screen and that can enable an operator to control the movement of a cursor on the screen, intended to act on the interaction element provided.
The present subject matter also relates to a guidance system of an aircraft, namely a flight director or an automatic piloting system which may be associated with an automatic thrust system, the automatic piloting system comprising a dialog device such as that mentioned above, to enable a dialog between the guidance system and an operator, notably a pilot, of the aircraft. The present subject matter also relates to an aircraft, in particular a transport airplane, which is equipped with such a dialog device and/or with such a guidance system.
These and other objects of the present disclosure as can become apparent from the disclosure herein are achieved, at least in whole or in part, by the subject matter disclosed herein.
A full and enabling disclosure of the present subject matter including the best mode thereof to one of ordinary skill in the art is set forth more particularly in the remainder of the specification, including reference to the accompanying figures, in which:
The present subject matter provides devices, systems, and methods that enable a dialog between an operator of an aircraft, in particular a pilot, and a guidance system of the aircraft. In one aspect schematically represented in
For this, the dialog device 1 that can be installed on the aircraft can comprise a display system 2 that can comprise at least one screen 3 capable of displaying guidance information of the guidance system 4. The dialog device 1 may comprise one or more screens 3. Specifically, for example, the dialog device 1 can comprise at least one of a piloting screen of Primary Flight Display (PFD) type, a navigation screen of Navigation Display (ND) type in relation to the lateral plane, and/or a navigation screen of Vertical Display (VD) type in relation to the vertical plane.
According to the present subject matter, the screen 3 can comprise at least one graphic object that can be produced in the form of an interaction element 8. This interaction element 8 can be associated with at least one guidance target of the guidance system 4 and can represent, on the one hand, a display element that indicates the value of this guidance target of the guidance system 4, in conjunction with a scale of values and, on the other hand, a control feature that can be grasped and moved along a curve by an operator, in particular the pilot of the aircraft, so as to modify the value of the guidance target (of the guidance system 4).
To do this, the display system 2 comprising the screen 3 can be linked such as via a link 5 to guidance components 4A, 4B, and 4C of the guidance system 4, so as to be able to provide a communication of information between the two assemblies. The guidance system 4 may comprise, as guidance components, a standard flight director 4A, that can compute piloting targets on the basis of guidance targets, and/or a standard automatic piloting system 4B, which makes it possible to follow guidance targets automatically, and/or a standard automatic thrust system 4C which makes it possible to manage the engine thrust automatically. Thus, by virtue of the dialog device 1 according to the present subject matter, the operator has on the screen 3 at least one interaction element 8 that can be associated with a guidance target of the guidance system 4 and that not only makes it possible to restore the value of this guidance target with which it is associated, but also enables this value to be modified on the screen 3.
A dialog device 1 according to the present subject matter therefore allows a direct interaction on a screen 3 (which was hitherto dedicated solely to the display of the flight parameters and guidance), through an interaction element 8 (namely a graphic object allowing an interaction) associated with a guidance target. For example, in a first configuration of the dialog device, the screen 3 can be a touch screen, as represented in
Furthermore, in a second configuration, dialog device 1 can comprise a control device 6, represented by broken lines in
The eye tracking system/software can be configured to detect the focus of the pilot's eyes even if there are perturbations. Therefore, the eye tracker can be calibrated so that a pilot has only to look at a large zone around the interaction element that the pilot wants to select, thus limiting the accuracy required for selection. In this way, control device 6 can be configured such that the interaction element can be selected only if the pilot looks at the large zone for a predetermined time (e.g., 1 second).
Referring to
If it is desired to change which guidance target can be modified, the pilot's focus can be shifted from interaction element 8 to a second interaction element. After the second interaction element is selected, a specific gesture could be performed using the second control device to confirm the selection. This gesture can be, for example, maintaining the finger on the touch pad during a predetermined time, touching the pad with a second finger, acting with any handles that could be present in the cockpit, and/or pushing/pulling on any knobs in the cockpit panel. This selection confirmation can prevent perturbations in case the eyes of the pilot cannot stay focused or directed for a period of time.
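The dwell-then-confirm selection logic described above might be sketched as follows; this is an illustrative sketch only, and the class name, zone representation, and gesture handling are assumptions introduced here, not part of the original disclosure:

```python
DWELL_TIME_S = 1.0  # illustrative threshold, per the text ("e.g., 1 second")

class GazeSelector:
    """Minimal sketch of dwell-then-confirm selection: an interaction
    element is pre-selected after the gaze stays in its (deliberately
    large) zone for DWELL_TIME_S, and committed only by an explicit
    confirmation gesture (e.g., holding a finger on the touchpad)."""

    def __init__(self, zone):
        self.zone = zone          # (x_min, y_min, x_max, y_max)
        self.dwell_start = None
        self.preselected = False
        self.selected = False

    def on_gaze(self, x, y, t):
        x0, y0, x1, y1 = self.zone
        if x0 <= x <= x1 and y0 <= y <= y1:
            if self.dwell_start is None:
                self.dwell_start = t
            if t - self.dwell_start >= DWELL_TIME_S:
                self.preselected = True
        else:
            # Gaze left the zone: reset so brief glances cannot select
            self.dwell_start = None
            self.preselected = False

    def on_confirm_gesture(self):
        # Only a pre-selected element can be committed by the gesture
        if self.preselected:
            self.selected = True
```

Separating the gaze dwell from the confirmation gesture is what tolerates the perturbations mentioned above: an unstable gaze can at worst lose the pre-selection, never trigger a command.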
Regardless of the specific form, control device 6 can be configured to allow an operator to select or grasp the interaction element 8 and move it on a display along a predefined path, such as a curved or straight path (on a scale for example, which may appear dynamically and contextually when modifying a target), so as to modify the associated guidance target. The path, such as a curve for example, may be a scale of values that can be displayed by default, as represented in
As an illustration, in
More specifically,
Moreover, by way of illustration, in
It is also possible to implement a climb mode to a target altitude by observing a particular constraint, for example an altitude or geometrical profile constraint. As an illustration, in the example of
Furthermore, in the latter embodiment, screen 3 may also display, automatically, at least one characteristic point 31 of the predicted trajectory 30 (
As an illustration, it is thus notably possible to carry out the following operations. First, on the heading presetting, it can be possible to delay the start of turn by pushing back, along the predicted trajectory for example, the representation (on the ND screen) of the point at which the taking into account of the heading presetting target begins. Similarly, on the gradient/speed presetting, it can be possible to delay the descent/climb start point by an interaction on the graphic representation of this point (e.g., on the VD screen). It can be further possible to modify the vertical speed/gradient target by an interaction on the end-of-climb/descent graphic representation.
As an illustration, as shown in
In addition, in yet another configuration of the present subject matter, the screen 3 can be a primary flight display PFD type, including a second heading scale 12b, a second altitude scale 22b, an airspeed indicator 42, and a vertical speed indicator 44. As with the other configurations for screen 3 discussed above, an interaction element 8 can allow integrated control of one or more of these guidance targets, for example by having the screen 3 be configured as a touch screen device or by using a separate control device 6 (e.g., an eye tracker used in combination with a secondary control device).
In another particular configuration of dialog device 1, rather than a plurality of individual screens 3 that each display one of the three guiding screens (e.g., navigation display ND (See, e.g.,
Global screen 3a can be a tactile screen as discussed above, or it can be a conventional screen connected to one or more control devices 6 (e.g., an eye tracker combined with a second control device) that can be used to manipulate the data. In any configuration, the flight information (e.g., speed, altitude, vertical speed, heading, and/or track) displayed on one of the three guiding “screens” (i.e., portions of global screen 3a) can be linked to the information on the two other “screens”. Furthermore, each guiding screen can have at least one interaction element 8.
Specifically, for example, as to navigation display ND, it can be possible to act with the heading of the aircraft, such as is discussed above with respect to the embodiments shown in
In addition, on the global screen 3a, the interaction elements 8 associated with related guidance targets on different displays can be linked. For example, changing the position of an interaction element 8 with respect to the first altitude scale 22a on the vertical display VD can cause a corresponding change of an interaction element associated with a second altitude scale 22b on the primary flight display PFD (and reciprocally). In another example, changing the position of an interaction element 8 with respect to the first heading scale 12a on the navigation display ND can change the position of a corresponding interaction element 8 associated with a second heading scale 12b on the primary flight display PFD (and reciprocally). As described above, each of these changes can be managed through the interaction element 8.
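One way such linking might be realized, shown here only as an illustrative sketch with assumed names and not as the original implementation, is to keep each guidance target's value in a single shared object that notifies every scale displaying it:

```python
class GuidanceTarget:
    """Minimal sketch of linking interaction elements that display the
    same guidance target (e.g., altitude on both the VD and the PFD):
    the value lives in one place and every registered listener is
    notified of changes, so moving either element updates the other."""

    def __init__(self, value):
        self._value = value
        self._listeners = []

    def bind(self, listener):
        self._listeners.append(listener)
        listener(self._value)  # initialize the bound display

    def set(self, value):
        self._value = value
        for listener in self._listeners:
            listener(value)

# Two "scales" on different displays observing one altitude target.
shown = {}
altitude = GuidanceTarget(10000)
altitude.bind(lambda v: shown.__setitem__("VD", v))
altitude.bind(lambda v: shown.__setitem__("PFD", v))

# Dragging the element on the VD scale also moves the PFD element.
altitude.set(12000)
print(shown)  # {'VD': 12000, 'PFD': 12000}
```

The reciprocity described above falls out naturally: whichever display originates the change, the same shared target fans it out to all the others.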
In addition, this global screen 3a can allow further interactivity. For example, each “screen” (e.g., portions of the global screen 3a corresponding to a navigation display ND, vertical display VD, or primary flight display PFD) can be selectively magnified (i.e., enlarged compared to the others), and the scale of the other screens can be adapted accordingly. As shown in
Dialog device 1 according to the present subject matter thus enables the pilot to select guidance targets (as well as guidance modes) in the same place (screen 3 or global screen 3a) where the pilot can check and monitor the behavior of the aircraft. This avoids the visual toing and froing and a dispersion of the guidance elements, which exist on the standard dialog devices. These comments also apply to the second embodiment using a control device 6 since, in this case, the pilot can visually follow, on the screen 3, the commands produced using the control device 6 (which may be located separately from the screen 3).
The present subject matter also relates to a guidance system 4 of an aircraft, namely a flight director 4A or an automatic piloting system 4B or an auto thrust system 4C, which comprises a dialog device 1 such as that mentioned above, to enable a dialog between the guidance system 4 and a pilot of the aircraft.
Moreover, in one aspect, dialog device 1 can comprise an interaction element 8 associated with each of one or more given guidance targets (e.g., speed/Mach, heading/route, altitude, vertical speed/gradient) of the guidance system 4. The use of each interaction element 8, namely one interaction element for each guidance target, on the screens 3 dedicated to the playback of the flight parameters and guidance (e.g., PFD, ND, VD), makes it possible to implement, directly on these screens 3, all the functions of a standard physical control unit (e.g., of FCU type), and therefore to dispense with such a control unit, which represents a significant saving, notably in terms of cost, weight and bulk. For example,
In addition, in one particular configuration, interaction element 8 can comprise a plurality of states which allow different actions to be implemented. The transition from one state to another of the interaction element 8 can be generated by a corresponding movement thereof. In this case, the interaction element 8 comprises states which allow at least some of the following different actions to be implemented: modifying a selected guidance target, which is directly applied by guidance system 4; modifying a preset guidance target, which will be applied by guidance system 4 after validation; arming or engaging a capture or maintain mode for a selected guidance target (selected mode); and/or engaging a capture or maintain mode for a guidance target computed automatically in the usual manner (managed mode).
In one particular configuration, interaction element 8 thus makes it possible to control the engagement (i.e., activation) of the associated guidance mode on the defined value (so-called selected mode) or on a value computed by the system according to certain criteria (so-called managed mode), and also the arming of a guidance mode. In a particular embodiment, interaction element 8 is not displayed continuously on screen 3, but rather appears on request by placing a pointing element on the corresponding graphic object (by a direct contact or by the positioning of a cursor), as illustrated in
Furthermore, each interaction element 8 can have the abovementioned states (e.g., not visible, modification directly taken into account for guidance, preset, request to arm or engage the managed mode) which can be accessed by a cursor movement, by contact in touch mode, or by eye focus when using an eye-tracking version of control device 6. The management of interaction element 8 can be such that, by default, the state of interaction element 8 is invisible (e.g., only the display of the target value is displayed in the case where a target exists). Interaction element 8 can be configured to appear, on request, by placing the cursor (or a finger 9) on the graphic object representing the value of the guidance target or the current value of the parameter. Consequently, the modification of the associated target can be effected by moving interaction element 8 along a predefined path such as for example a curve. The guidance target can then be taken into account immediately.
Alternatively, if the pilot wants to preset the guidance target (i.e., choose a value without activating it), and activate it only later (e.g., after validation of his or her request by air traffic control), the pilot can access the presetting state by locating the interaction element 8, selecting or grasping it (such as by simply touching it), and moving it appropriately. For example, the pilot can move interaction element 8 backward (i.e., away from the scale or the curve of movement for the modification) so as to cause a different graphic state associated with the presetting to appear (which is highlighted by an appropriate color, for example yellow). Then, the pilot can modify the presetting value by moving the interaction element 8 along the predefined path, such as a curve for example (as for the guidance target). To actually activate a presetting, an appropriate movement of the interaction element 8, such as toward the interior this time (i.e., toward the scale, as shown in
To engage or arm the managed mode of the axis concerned (mode for which the guidance target is computed automatically by the system according to predefined criteria), the interaction element 8 can be pushed further toward the interior of the interface, giving control to the system and causing a graphic object to appear temporarily, which must be covered to validate the command. In a particular embodiment as shown in
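The interaction-element states and movements described above might be summarized as a small state machine; the state names and gesture vocabulary below are assumptions introduced for this illustrative sketch, not terms from the original disclosure:

```python
# Illustrative transition table for the interaction-element states:
# (current state, gesture) -> next state.
TRANSITIONS = {
    ("invisible", "hover"):       "visible",   # pointer/finger placed on the target value
    ("visible",   "drag_along"):  "direct",    # move along the scale: target applied at once
    ("visible",   "move_back"):   "preset",    # move away from the scale: presetting state
    ("preset",    "drag_along"):  "preset",    # adjust the preset value along the path
    ("preset",    "move_inward"): "direct",    # move toward the scale: preset activated
    ("visible",   "push_inward"): "managed_request",  # give control to the system
    ("visible",   "leave"):       "invisible",
}

def next_state(state, gesture):
    # Unknown gestures leave the element in its current state
    return TRANSITIONS.get((state, gesture), state)

# A preset-then-activate sequence: hover, move back to preset,
# adjust the value, then move inward to activate it.
state = "invisible"
for gesture in ("hover", "move_back", "drag_along", "move_inward"):
    state = next_state(state, gesture)
print(state)  # direct: the preset value has been activated
```

A table-driven design of this kind keeps every gesture's effect explicit and makes it straightforward to verify that, by default, the element stays invisible until deliberately invoked.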
In the context of the present subject matter, the interaction element 8 can be moved by a direct action. It is, however, also possible to envisage moving the interaction element by a so-called “lever arm” effect. In the latter case, an operator interacts with the graphic object representing the guidance target (for example heading/route), not by a direct interaction on this object, but with a lever arm located diametrically opposite this target representation, along the scale, notably in heading rose form, as illustrated by a dashed line 17 in
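The geometry of the lever-arm effect might be sketched as follows; as before, the function name, screen convention, and coordinates are assumptions introduced purely for illustration:

```python
import math

def heading_via_lever_arm(cx, cy, px, py):
    """Sketch of the 'lever arm' interaction: the operator grasps a
    point diametrically opposite the heading target on the rose, so the
    commanded heading is the pointer's bearing plus 180 degrees.

    Assumed screen convention: y grows downward, north points up."""
    dx, dy = px - cx, py - cy
    pointer_bearing = math.degrees(math.atan2(dx, -dy)) % 360.0
    return (pointer_bearing + 180.0) % 360.0

# Grasping the lever arm south of the rose center commands a northerly
# heading target on the opposite side of the scale.
print(heading_via_lever_arm(100, 100, 100, 150))  # 0.0
```

Because the hand or cursor sits opposite the target symbol, this style of interaction avoids masking the very value being adjusted.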
Moreover, in a particular embodiment, dialog device 1 can comprise at least one interaction element, which is capable of controlling at least two different references (e.g., speed/Mach, heading/route, vertical speed/gradient) of a guidance target of the guidance system 4. In this case, the interaction element 8 can control only one reference at a time, and the selection of one of the references to be controlled depends on the way in which the interaction element 8 is made to appear.
In the latter embodiment, the manner in which the interaction element 8 is made to appear therefore makes it possible to select the target reference. For example, by bringing the interaction element over the first heading scale 12a (See, e.g.,
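Selecting the controlled reference from where the element is made to appear might be sketched as a simple hit test; the zone bounds and reference names below are assumptions introduced for this illustration only:

```python
# Illustrative: the reference controlled by a shared interaction element
# is chosen by the scale over which the element is brought up.
APPEARANCE_ZONES = [
    ((0, 0, 200, 200),   "heading"),   # over the heading rose
    ((220, 0, 260, 200), "altitude"),  # over the altitude scale
]

def select_reference(x, y):
    """Return the reference associated with the scale under (x, y),
    or None if the element was not brought up over any known scale."""
    for (x0, y0, x1, y1), reference in APPEARANCE_ZONES:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return reference
    return None

print(select_reference(100, 100))  # heading
print(select_reference(240, 50))   # altitude
```

The same element thus needs no separate mode switch: the place of appearance implicitly carries the choice of reference.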
The subject matter described herein can be implemented in software in combination with hardware and/or firmware. For example, the subject matter described herein may be implemented in software executed by one or more processors. In one exemplary implementation, the subject matter described herein may be implemented using a non-transitory computer readable medium having stored thereon computer executable instructions that when executed by the processor of a computer control the computer to perform steps. Exemplary computer readable media suitable for implementing the subject matter described herein can include non-transitory computer readable media such as, for example and without limitation, disk memory devices, chip memory devices, programmable logic devices, and application specific integrated circuits. In addition, a computer readable medium that implements the subject matter described herein may be located on a single device or computing platform or may be distributed across multiple devices or computing platforms.
The present subject matter can be embodied in other forms without departure from the spirit and essential characteristics thereof. The embodiments described therefore are to be considered in all respects as illustrative and not restrictive. Although the present subject matter has been described in terms of certain preferred embodiments, other embodiments that are apparent to those of ordinary skill in the art are also within the scope of the present subject matter.
Foreign application priority data: Number 11 60884; Date Nov 2011; Country FR; Kind national.
This application is a continuation-in-part application from and claims priority to co-pending U.S. patent application Ser. No. 13/687,729 filed Nov. 28, 2012, which relates and claims priority to French Patent Application No. 11 60884 filed Nov. 29, 2011, the entire disclosures of which are incorporated by reference herein.
Related U.S. application data: Parent application Ser. No. 13687729, filed Nov 2012 (US); Child application Ser. No. 13834401 (US).