This disclosure generally relates to a user interface for enabling a user to control a mode of a machine. In particular, this disclosure relates to hardware control devices on a flight deck of an aircraft.
Modern jet transports are equipped with a cockpit mode control panel that interfaces with a flight management system to control the selection and engagement of automatic flight control modes of operation. These automatic flight control modes of operation include, for example, flight level change (FLCH), vertical navigation (VNAV) and lateral navigation (LNAV). The FLCH mode can automatically manage thrust and speed to climb or descend from one altitude to another. The VNAV mode can provide automatic optimized profile control from initial climb through final approach, including adherence to terminal area procedure speed and altitude constraints. The LNAV mode can provide steering to a preprogrammed route including selected terminal area procedures.
The pilot chooses the available modes that will best accomplish the desired vertical flight profile and lateral routing. In most instances, the pilot plans the flight in advance, both laterally and vertically, and preprograms the LNAV and VNAV modes so that the desired flight path will be followed. While preprogrammed flights are advantageous because they reduce the pilot's burden, particularly during takeoff and landing, in practice, rarely can flights be flown as preplanned. For example, rerouting and clearance instructions may be received from air traffic control (ATC) during the flight. These instructions force the pilot to depart from the vertical flight profile and/or the lateral route that was originally planned. In some instances, rerouting and reclearance come far enough in advance to allow the pilot to reprogram the route or profile instructions stored in the memory of an auto flight computer so that the flight management system can remain in the LNAV and VNAV flight control modes. On other occasions, pilots are forced to manually intervene in order to depart from LNAV and VNAV preprogrammed flight paths and comply with ATC instructions in a timely manner.
Intervention-capable flight management systems (FMS) have been developed which allow a pilot to intervene in the operation of the preprogrammed auto flight computer of a flight management system and change the speed and/or flight path of an aircraft in response to ATC instructions. One such system is disclosed in U.S. Pat. No. 4,811,230, entitled “Intervention Flight Management System.” The intervention FMS disclosed in that patent includes a mode control panel via which the pilot interfaces with an FMS program. The FMS program includes several modules that, when engaged, override the preprogrammed instructions stored in the memory of the auto flight computer. In this manner, the FMS allows the pilot to manually intervene and control the auto flight computer and, thus, the aircraft in response to, for example, ATC instructions to change heading, altitude, airspeed or vertical speed. The FMS automatically returns to fully optimized flight along the preprogrammed profile when the intervention is cancelled.
An FMS control panel consists of a multitude of control devices for enabling a pilot to interact with airplane systems and displays. Often these control devices produce a corresponding change to an associated display. For example, turning a knob in one location can change a displayed value in another location. However, some control devices are typically used without looking directly at the controls. To avoid confusion, similarly shaped control devices can be discriminated from each other spatially and sometimes with unique textures. However, operation of the hardware controls on the flight deck is still susceptible to confusion because these control devices do not completely prevent the pilot from reaching for and activating the wrong control device. Furthermore, pilots sometimes only detect the error after the incorrect control device has been used, which produces an unexpected change in the associated display (and sometimes unwanted aircraft performance).
There is a need for an improved system and method that can be used to help avoid such control errors.
An improved system and method are disclosed for enabling a control panel user, while looking at an associated display screen rather than the control panel, to verify that the user is touching the correct control device on that control panel. In accordance with some embodiments, touch or proximity sensors are integrated into control devices such as knobs, dials, levers, or wheels incorporated in a user control panel. A control output processor detects whether any of these sensors are outputting signals indicating that a control device is being touched (or nearly touched) by a user. The control output processor causes an associated display area to be highlighted or otherwise modified on a display screen to visually indicate to the user which control device is being touched. (An “associated” display area is a region on a display screen that will be affected by an action performed by the user when manipulating the touched control device.) This feature allows for “no-look” interaction with control panels and keeps the user's eyes focused on the relevant displays where the control devices produce effects. It allows users to preview and verify that the correct control device is being manipulated before any (potentially erroneous) control action is taken, without the user looking at the control panel.
The benefits of the above-described system and method include at least the following:
(1) The highlighting (or other visual change) offers a preview of where the control inputs will produce changes on the display device. This allows the user to locate the correct control device while keeping his/her visual attention directed at a non-collocated display device, without having to refocus attention on the control panel itself.
(2) Because the user can direct his/her attention toward a display device rather than the control device he/she is manipulating, the control panel can be simplified, requiring less feedback/data (such as numerical windows or other displays) on the panel itself. Information (e.g., control settings and values) can be consolidated on the main display rather than being redundantly shown on the control panel.
(3) The foregoing technical feature aids error prevention by confirming that the correct control device is being touched before any action is taken by manipulating that control device.
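The touch-to-highlight behavior summarized above can be sketched as a simple mapping from control devices to their associated display areas, polled by the control output processor. This is an illustrative sketch only: the class names (`Display`, `ControlOutputProcessor`), the dictionary-based mapping, and the control/area identifiers are assumptions, not details from the disclosure.

```python
# Illustrative sketch: each control device has a touch sensor and an associated
# display area; while the sensor reports contact, the mapped area is highlighted.

class Display:
    """Minimal stand-in for the display system: tracks highlighted areas."""
    def __init__(self):
        self.highlighted = set()

    def set_highlight(self, area, on):
        # Add or remove the area from the set of highlighted display regions.
        if on:
            self.highlighted.add(area)
        else:
            self.highlighted.discard(area)

class ControlOutputProcessor:
    def __init__(self, display, area_map):
        self.display = display
        self.area_map = area_map  # control-device id -> associated display area

    def update(self, touch_states):
        """touch_states: dict mapping control id -> True if its sensor is active."""
        for control_id, area in self.area_map.items():
            self.display.set_highlight(area, touch_states.get(control_id, False))

display = Display()
processor = ControlOutputProcessor(display, {
    "speed_knob": "speed_window",
    "altitude_knob": "altitude_window",
})
processor.update({"speed_knob": True})
print(display.highlighted)  # the speed window is highlighted while the knob is touched
```

When the touch state clears, the next `update` call removes the highlight, which is the "preview then revert" behavior the benefits list describes.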
In accordance with one aspect, an interface system for enabling a user to control a mode of a flight vehicle is provided. The system comprises: a control panel comprising first and second settable control devices configured to output first and second setting signals representing respective current settings of the first and second settable control devices; a first sensor configured to output a user proximity signal while the first settable control device is being touched by a user; a second sensor configured to output a user proximity signal while the second settable control device is being touched by a user; a display system comprising a display screen that is not part of or collocated with the control panel; and a computer system coupled to receive the first and second setting signals from the first and second settable control devices and receive any user proximity signal output by the first or second sensor and further coupled to send display control signals to the display system. The computer system is programmed to output display control signals that control the display system to display pixel data indicating the current settings of the first and second settable control devices and whether or not the first or second settable control device is currently being touched by the user. 
In accordance with one embodiment, the computer system is programmed to produce a first display control signal that controls the display system to display pixel data indicating the current setting of the first settable control device in a first area of the display screen, a second display control signal that controls the display system to display pixel data indicating the current setting of the second settable control device in a second area of the display screen which does not overlap the first area of the display screen, and a third display control signal that changes pixel data displayed in the first area of the display screen in response to the presence of a user proximity signal from the first sensor for a time interval having a duration greater than a first threshold value.
In accordance with another aspect, an interface system for enabling a user to control a mode of a machine comprises: a control panel comprising first and second settable control devices configured to output first and second setting signals representing respective settings of the first and second settable control devices; a first sensor configured to output a user proximity signal while the first settable control device is being touched by a user; a second sensor configured to output a user proximity signal while the second settable control device is being touched by a user; a display system that is not part of or collocated with the control panel; and a computer system coupled to receive the first and second setting signals from the first and second settable control devices and receive any user proximity signal output by the first or second sensor and further coupled to send display control signals to the display system. The computer system is programmed to output display control signals that control the display system to display pixel data indicating the settings of the first and second settable control devices and pixel data indicating that the first or second settable control device is being touched by the user. The foregoing system may further comprise an auto flight computer coupled to receive the first and second setting signals from the computer system, and a flight control system comprising hardware components which are operated in a mode that is responsive to the first and second settings. A further aspect is an intervention flight management system comprising the system as described in this paragraph.
In accordance with a further aspect, a method for producing a visual indication that a user is touching a control device on a control panel is provided. The method comprises the following steps: producing an electrical user proximity signal whenever the user is currently touching the control device and not producing the electrical user proximity signal whenever the user is not touching the control device; producing a first electrical display control signal in response to the absence of an electrical user proximity signal for a time interval having a duration greater than a first threshold value; displaying a first set of pixel data in an area on a display screen that is associated with the control device, but is not part of or collocated with the control panel, while the first electrical display control signal is being produced, the first set of pixel data comprising a first subset of pixel data indicating a current setting of the control device; producing a second electrical display control signal in response to the presence of an electrical user proximity signal for a time interval having a duration greater than a second threshold value; and displaying a second set of pixel data in the area on the display screen while the second electrical display control signal is being produced, the second set of pixel data comprising a first subset of pixel data indicating the current setting of the control device, wherein the second set of pixel data is not identical to said first set of pixel data.
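The two-threshold method above can be read as a small state machine: a proximity signal that persists beyond the second threshold selects the modified display (the second set of pixel data), and an absence that persists beyond the first threshold restores the normal display (the first set). The sketch below is a hedged illustration of that logic; the threshold values, names, and sampling interface are assumptions, not values from the disclosure.

```python
# Illustrative two-threshold state machine for the display control signals:
# sustained presence -> second pixel set; sustained absence -> first pixel set.

NORMAL, HIGHLIGHTED = "first_pixel_set", "second_pixel_set"

class TouchStateMachine:
    def __init__(self, absent_threshold_s, present_threshold_s):
        self.absent_threshold_s = absent_threshold_s
        self.present_threshold_s = present_threshold_s
        self.displayed = NORMAL     # which pixel set is currently displayed
        self.last_signal = False    # most recent raw proximity reading
        self.since = 0.0            # time the raw reading last changed

    def sample(self, proximity_present, now_s):
        if proximity_present != self.last_signal:
            # Raw signal changed; restart the persistence timer.
            self.last_signal = proximity_present
            self.since = now_s
        held = now_s - self.since
        if proximity_present and held > self.present_threshold_s:
            self.displayed = HIGHLIGHTED
        elif not proximity_present and held > self.absent_threshold_s:
            self.displayed = NORMAL
        return self.displayed

sm = TouchStateMachine(absent_threshold_s=0.2, present_threshold_s=0.05)
sm.sample(True, 0.00)                        # touch begins
assert sm.sample(True, 0.02) == NORMAL       # brush shorter than threshold: no change
assert sm.sample(True, 0.10) == HIGHLIGHTED  # sustained touch: display modified
sm.sample(False, 0.50)                       # touch released
assert sm.sample(False, 0.80) == NORMAL      # sustained release: display reverts
```

Requiring the signal to persist before either transition filters out momentary brushes against a control and momentary sensor dropouts, so the display does not flicker between the two pixel sets.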
Other aspects of the system and method are disclosed and claimed below.
Reference will hereinafter be made to the drawings in which similar elements in different drawings bear the same reference numerals.
Modern aircraft may employ a flight management system (FMS).
The embodiment shown in
The mode control panel 14 also receives current parameter value signals (not shown) from the auto flight computer 10. The auto flight computer 10 also sends current parameter value signals to a display system 22 for display thereby. The display system 22 comprises a primary flight display that is configured to display symbology, graphical elements, icons, coloring, shading, highlighting, etc. in order to visually communicate air data and basic flight information.
In accordance with the embodiment shown in
The guidance commands produced by the auto flight computer 10 control the orientation and speed of the aircraft in a well-known manner. In this regard, it should be understood that
The mode control panel 14 allows a user, such as a pilot of the aircraft, to interface with the FMS. The mode control panel 14 may include a number of different sections, such as a direction section, an altitude section, a speed section and a vertical path section, for allowing the pilot to control various functions of the FMS. (One example of such a mode control panel will be described later with reference to
In accordance with one embodiment, the mode control panel 14 further comprises a multiplicity of touch or proximity sensors 20, each touch sensor corresponding to a respective control device having an associated graphical element to be displayed on the primary flight display (described in detail below). In these embodiments, each touch sensor 20 outputs a signal in response to an effect of a pilot's finger touching or closely approaching a corresponding control device. Many different types of touch or proximity sensors capable of being integrated into a control device could be used. In accordance with one embodiment, the sensors can be capacitive-type touch sensors of the types well known for use in human interface devices. Capacitive sensing is a technology based on capacitive coupling which takes human body capacitance as input. Capacitive sensors detect anything that is conductive or has a dielectric constant different than that of air. Capacitive sensors can be constructed from many different media, such as copper, indium tin oxide and printed ink. There are two types of capacitive sensing systems: (1) mutual capacitance, where the object (such as a finger) alters the mutual coupling between two electrodes; and (2) self- or absolute capacitance, where the object (such as a finger) loads the sensor or increases the parasitic capacitance to ground. In both cases, the difference of a preceding absolute position from the present absolute position yields the relative motion of the object or finger during that time. In accordance with alternative embodiments, the sensors can be infrared detectors which react to infrared radiation, such as infrared radiation emitted by a pilot's finger.
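The differencing described above, where relative motion is recovered by subtracting each preceding absolute position from the present one, can be shown in a few lines. The function name and the sample readings below are illustrative assumptions, not part of the disclosure.

```python
# Illustrative differencing of absolute position samples from a capacitive
# sensor: each delta is (present position) - (preceding position).

def relative_motion(readings):
    """Return the successive deltas between absolute position samples."""
    return [b - a for a, b in zip(readings, readings[1:])]

# Hypothetical absolute finger positions in arbitrary sensor units.
positions = [10, 12, 15, 14]
print(relative_motion(positions))  # [2, 3, -1]
```

A single sample yields no deltas, reflecting that relative motion is only defined between two successive readings.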
In accordance with a different design, rather than incorporating individual touch sensors into the design of the control devices (which may have different sizes and shapes), the human interface state of a control device can be monitored by an infrared camera mounted to the mode control panel.
The control output processor 16 processes the touch sensor outputs and then sends display control signals to the display system 22 which cause the primary flight display of the latter to display symbology, graphical elements, icons, coloring, shading, highlighting, changes in font size, etc. which indicate to the pilot which control device he/she is touching, as will be described in more detail below with reference to
The control output processor 16 processes the touch sensor outputs in accordance with an algorithm designed to discriminate each instance where a control device 18 of interest has been touched or nearly touched by a pilot in a predetermined manner. One embodiment of such an algorithm will be described later with reference to
For this embodiment, the speed section 24 comprises a speed knob/selection button 32, an indicated airspeed (IAS)/MACH number speed mode selection switch 34, a speed display window 36 that displays the speed selected, and mode selection buttons 38 with mode active indicator lights. The IAS/MACH speed mode selection switch 34 is a toggle switch that allows the pilot to choose between IAS and MACH number speed modes of operation, including alternately changing the speed display window 36 between IAS and Mach number displays. In operation, the mode selection buttons 38 can be pushed to engage a particular mode of guidance (i.e., LNAV, VNAV, FLCH or A/T (auto-throttle)) and can illuminate to indicate that the selected mode is active.
A speed management module is engaged by pushing speed knob/selection button 32. The term “module” as used herein, may refer to any combination of software, firmware, or hardware used to perform the specified function or functions. When speed knob/selection button 32 is pushed, the speed management module is synchronized to the current aircraft speed. Thereafter the speed of the aircraft is increased or decreased by rotating the speed knob/selection button 32. During knob rotation, the indicated speeds in the speed display window 36 and in the primary flight display (not shown in
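The push-to-synchronize, rotate-to-adjust behavior of the speed knob/selection button 32 can be sketched as follows. The class, the one-knot-per-detent step size, and the example values are assumptions for illustration only; the actual speed management module is a combination of software, firmware, or hardware as defined above.

```python
# Illustrative sketch of the speed-management engagement: pushing the knob
# synchronizes the commanded speed to the current aircraft speed; each rotation
# detent then nudges the commanded value, which both the speed display window
# and the primary flight display would show.

class SpeedKnob:
    DETENT_KTS = 1  # assumed speed change per rotation detent

    def __init__(self):
        self.commanded = None  # no commanded speed until the knob is pushed

    def push(self, current_aircraft_speed):
        # Engage the speed management module, synced to the current speed.
        self.commanded = current_aircraft_speed

    def rotate(self, detents):
        # Positive detents increase the commanded speed; negative decrease it.
        self.commanded += detents * self.DETENT_KTS

knob = SpeedKnob()
knob.push(250)         # module engages, synchronized to 250 kt
knob.rotate(+5)        # five detents clockwise
print(knob.commanded)  # 255
```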
Referring again to
Still referring to
Still referring to
Other control panel devices depicted in
In accordance with the concept shown in
The indicator 72 is a centrally located electronic attitude director indicator which is substantially rectangular in shape and has a central boresight box 82 representing the airplane longitudinal axis at the center of the box. On either side thereof are conventional stationary aircraft symbols 84 and 86. An artificial horizon is provided by line 88. The overall presentation by the electronic attitude director indicator 72 is substantially conventional.
Adjacent and along the left-hand side of attitude director indicator 72 is an air speed indicator 74 comprising a vertically movable scale or “tape” 90 having graduations representing air speed values along the right-hand side thereof. The air speed indicator 74 further comprises a fixed pointer 92 which points inwardly toward the air speed scale 90. The pointer 92 is provided with a window 94 digitally indicating the air speed in response to instrumentation of the aircraft. As the air speed changes, the scale or tape 90 moves vertically relative to the fixed pointer 92. The tape 90 presents a range of speed values above and below the current speed, with the numerics being disposed immediately to the left of the corresponding scale graduations. Portions of the scale or tape above and below the viewable range are blanked from the presentation. Moreover, the scale is blanked at the location of window 94 which supplies the numerical readout of the current speed as a “rolling” number. The air speed indicator 74 further includes a pilot-controlled marker or “bug” 96 consisting of a pair of adjacent horizontal lines, with the current value of the selected air speed being numerically displayed at location 98 above the air speed presentation. When the selected air speed is attained, the marker or “bug” 96 will reach pointer 92.
Indicator 76 for aircraft heading comprises an area having the shape of a segment of a circle or compass rose which is easily comprehensible by the viewer. The indicator 76 is provided with a degree scale 102 along the upper, arc-shaped portion thereof adjacent to the attitude director indicator 72, and like the previously described air speed indicator 74, the scale 102 of heading indicator 76 moves with respect to a fixed vertical heading/track line 104 which indicates the current track according to the auto flight computer. For other than the segment of the heading display as illustrated in
A vertically disposed altitude indicator 78 is located adjacent the right-hand side of the attitude director indicator 72 in
The primary flight display 70 shown in
In accordance with the touch-sensitive hardware control concept disclosed herein, each time that the pilot touches or nearly touches a control panel device that incorporates a touch or proximity sensor, an associated portion of the primary flight display is modified to visually indicate to the pilot which control panel device he/she is touching or nearly touching while he/she is looking at the primary flight display.
For the purpose of illustration,
Similarly, in the case where the pilot is touching or nearly touching the altitude knob/selection button (item 62 in
In the case where the pilot is touching or nearly touching the vertical speed wheel (item 52 in
Instead of contrasting color borders such as the rectangular border 120 seen in
For example, in the case where the pilot is touching or nearly touching the heading knob/selection button (item 42 in
In accordance with one embodiment, the control output processor (item 16 in
The touch-sensitive hardware control concept disclosed hereinabove is not limited in its application to the mapping of mode control panel devices to areas on a cockpit or flight deck display. More generally, the concept may be applied to any physical hardware control devices that have associated areas on a non-collocated display.
While touch-sensitive hardware controls have been described with reference to particular embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the teachings herein. In addition, many modifications may be made to adapt a particular situation to the teachings herein without departing from the essential scope thereof. Therefore it is intended that the claims set forth hereinafter not be limited to the disclosed embodiment.
As used in the claims, the term “computer system” should be construed broadly to encompass a system having at least one computer or processor, and which may have two or more interconnected computers or processors.
Number | Name | Date | Kind |
---|---|---|---|
4185281 | Silverstone | Jan 1980 | A |
4390861 | Cohen et al. | Jun 1983 | A |
4811230 | Graham et al. | Mar 1989 | A |
4860007 | Konicke et al. | Aug 1989 | A |
4922061 | Meadows et al. | May 1990 | A |
5398045 | Sach et al. | Mar 1995 | A |
5844503 | Riley et al. | Dec 1998 | A |
6112141 | Briffe et al. | Aug 2000 | A |
6128553 | Gordon et al. | Oct 2000 | A |
6130663 | Null | Oct 2000 | A |
6567014 | Hansen et al. | May 2003 | B1 |
6587056 | Fraser et al. | Jul 2003 | B1 |
6664945 | Gyde et al. | Dec 2003 | B1 |
6745113 | Griffin et al. | Jun 2004 | B2 |
7256710 | Mumaw et al. | Aug 2007 | B2 |
7308343 | Horvath et al. | Dec 2007 | B1 |
7321318 | Crane et al. | Jan 2008 | B2 |
7346854 | Hedrick | Mar 2008 | B2 |
7418319 | Chen et al. | Aug 2008 | B2 |
7512464 | Tarleton et al. | Mar 2009 | B2 |
7602382 | Hinckley et al. | Oct 2009 | B2 |
7626515 | Langner et al. | Dec 2009 | B1 |
7724259 | Hedrick et al. | May 2010 | B2 |
7772995 | Cabaret De Alberti et al. | Aug 2010 | B2 |
7834779 | He et al. | Nov 2010 | B2 |
7986153 | Easter | Jul 2011 | B2 |
8098175 | Berthou et al. | Jan 2012 | B2 |
8132117 | Hedrick | Mar 2012 | B2 |
8165810 | Fetzmann et al. | Apr 2012 | B2 |
8223119 | Krenz et al. | Jul 2012 | B1 |
8312479 | Boillot | Nov 2012 | B2 |
8330735 | Lin et al. | Dec 2012 | B2 |
8380366 | Schulte et al. | Feb 2013 | B1 |
8520015 | Krishna et al. | Aug 2013 | B2 |
8694184 | Boorman et al. | Apr 2014 | B1 |
8761971 | Gershzohn | Jun 2014 | B2 |
20050231390 | Crane et al. | Oct 2005 | A1 |
20070198141 | Moore | Aug 2007 | A1 |
20090009491 | Grivna | Jan 2009 | A1 |
20090019188 | Mattice et al. | Jan 2009 | A1 |
20090215500 | You et al. | Aug 2009 | A1 |
20090237372 | Kim et al. | Sep 2009 | A1 |
20090289902 | Carlvik et al. | Nov 2009 | A1 |
20100001132 | Detouillon et al. | Jan 2010 | A1 |
20100053107 | Tsuzaki et al. | Mar 2010 | A1 |
20110125347 | Boorman et al. | May 2011 | A1 |
20120050180 | King et al. | Mar 2012 | A1 |
20120050210 | King et al. | Mar 2012 | A1 |
20120113051 | Bird et al. | May 2012 | A1 |
20120154307 | Nunomaki | Jun 2012 | A1 |
20120169623 | Grossman et al. | Jul 2012 | A1 |
20120217982 | Narayanasamy et al. | Aug 2012 | A1 |
20120235892 | Narendra et al. | Sep 2012 | A1 |
20120242608 | Koshiyama et al. | Sep 2012 | A1 |
20120256768 | Kratchounova et al. | Oct 2012 | A1 |
20120268374 | Heald | Oct 2012 | A1 |
20120296496 | Hedrick et al. | Nov 2012 | A1 |
20120327034 | Dominici et al. | Dec 2012 | A1 |
20130097550 | Grossman et al. | Apr 2013 | A1 |
Number | Date | Country |
---|---|---|
10015726 | Oct 2001 | DE |
102005049514 | May 2006 | DE |
102010012239 | Sep 2011 | DE |
1932727 | Jun 2008 | EP |
2045789 | Apr 2009 | EP |
Entry |
---|
PCT International Search Report and Written Opinion dated Jan. 7, 2014, International Application No. PCT/US2013/040188 (PCT counterpart to U.S. Appl. No. 13/606,082). |
Number | Date | Country | |
---|---|---|---|
20140074325 A1 | Mar 2014 | US |