The present disclosure generally relates to display systems, including aircraft display systems, and methods for providing displays. More particularly, the present disclosure relates to display systems and methods for providing displays having an adaptive combined vision system.
Display systems are known in the art that include a sensory image overlaid on a synthetic image. In the context of a primary flight display in the cockpit of an aircraft, for example, such display systems may include a synthetic image of an area forward of the direction of travel, with a sensory image overlaid over a portion of the synthetic image. Such systems are commonly referred to in the art as “combined vision systems” (“CVS”), and are provided to increase the decision-aiding cues available to the pilot of the aircraft when flying at low altitudes and under low visibility conditions.
In known CVS systems, the sensory image is always fixed in the middle of the synthetic image, and occupies only a small portion of the overall display. It has been found that, even if the sensory image is capable of capturing the entire area shown by the display, uneven reflected colors captured in the sensory image do not blend smoothly with the synthetic image. Thus, it is generally desirable for the sensory image to show only the details that are particularly relevant to aiding the pilot, such as the runway and the immediately surrounding area. In this manner, it is generally desirable for the sensory image to occupy only a portion of the synthetic image over which it is positioned, such as less than half of the synthetic image.
In such systems, however, in circumstances where the aircraft is executing turns, such as a circling approach, the sensory image, which is centered within the synthetic image and is smaller than the synthetic image, will fail to capture the relevant imagery that the aircraft will actually encounter and that is desirable to display to the pilot, such as the runway. Further, in situations such as cross-wind landings, where the angle of the aircraft does not coincide with the direction of travel, the sensory image will likewise fail to capture the relevant imagery that the aircraft will actually encounter. Thus, the prior art remains deficient.
Accordingly, it is desirable to provide improved display systems and methods for providing displays that overcome the deficiencies in the prior art. Furthermore, other desirable features and characteristics of the present disclosure will become apparent from the subsequent detailed description of the inventive subject matter and the appended claims, taken in conjunction with the accompanying drawings and this background of the inventive subject matter.
Display systems and methods for providing displays are disclosed. In one exemplary embodiment, a method for providing a display to a flight crew of an aircraft includes the steps of providing a synthetic image including a first field of view forward of a direction of travel of the aircraft and providing a sensory image overlaying a first portion of the synthetic image. The sensory image includes a second field of view forward of the direction of travel of the aircraft. At least a portion of the first field of view and the second field of view overlap one another. The sensory image is centered within the synthetic image with respect to a horizontal axis. The method further includes moving the sensory image so as to include a third field of view forward of the direction of travel of the aircraft and so as to overlay a second portion of the synthetic image such that the sensory image is no longer centered with respect to the horizontal axis within the synthetic image. At least a portion of the first field of view and the third field of view overlap one another.
In another exemplary embodiment, a display system configured to provide a display to a flight crew of an aircraft includes an image sensor, an image display device, a data storage device that stores navigation information and runway information, and a computer processor device. The computer processor device is configured to generate for display on the image display device a synthetic image that includes a first field of view forward of a direction of travel of the aircraft based at least in part on the navigation information and the runway information. The computer processor device is further configured to receive for display on the image display device and from the image sensor a sensory image and display the sensory image overlaying a first portion of the synthetic image. The sensory image includes a second field of view forward of the direction of travel of the aircraft. At least a portion of the first field of view and the second field of view overlap one another. The sensory image is centered within the synthetic image with respect to a horizontal axis. Still further, the computer processor device is configured to receive for display on the image display device and from the image sensor a further sensory image that includes a third field of view forward of the direction of travel of the aircraft and move the sensory image so as to overlay a second portion of the synthetic image such that the sensory image is no longer centered with respect to the horizontal axis within the synthetic image. At least a portion of the first field of view and the third field of view overlap one another.
In yet another exemplary embodiment, a method for providing a display to a flight crew of an aircraft includes the following steps: while the aircraft is descending but prior to reaching a first predetermined position, providing a first synthetic image that includes a first field of view forward of a direction of travel of the aircraft and providing a first sensory image overlaying a first portion of the first synthetic image. The first sensory image includes a second field of view forward of the direction of travel of the aircraft. At least a portion of the first field of view and the second field of view overlap one another. The first sensory image is centered within the first synthetic image with respect to a horizontal axis. While the aircraft is descending and after reaching the first predetermined position but prior to reaching a second predetermined position, the method further includes providing a second synthetic image that includes the first field of view forward of the direction of travel of the aircraft and providing a second sensory image overlaying a first portion of the second synthetic image. The second sensory image includes a third field of view forward of the direction of travel of the aircraft. At least a portion of the first field of view and the third field of view overlap one another. The second sensory image is centered on a flight path vector with respect to the horizontal axis. Still further, while the aircraft is descending and after reaching the second predetermined position but prior to reaching a runway, the method includes providing a third synthetic image that includes the first field of view forward of the direction of travel of the aircraft and the runway and providing a third sensory image overlaying a first portion of the third synthetic image. The third sensory image includes a third field of view forward of the direction of travel of the aircraft and the runway. 
At least a portion of the first field of view and the third field of view overlap one another. The third sensory image is centered on a touchdown zone of the runway with respect to the horizontal axis.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The present disclosure will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
The following detailed description is merely illustrative in nature and is not intended to limit the embodiments of the subject matter or the application and uses of such embodiments. As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” Any implementation described herein as exemplary is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.
Referring to
The processor 104 may be any one of numerous known general-purpose microprocessors or an application-specific processor that operates in response to program instructions. In the depicted embodiment, the processor 104 includes on-board RAM (random access memory) 103 and on-board ROM (read only memory) 105, and/or other non-transitory data storage media known in the art. The program instructions that control the processor 104 may be stored in either or both of the RAM 103 and the ROM 105. For example, the operating system software may be stored in the ROM 105, whereas various operating mode software routines and various operational parameters may be stored in the RAM 103. It will be appreciated that this is merely exemplary of one scheme for storing operating system software and software routines, and that various other storage schemes may be implemented. It will also be appreciated that the processor 104 may be implemented using various other circuits, in addition to or in lieu of a programmable processor; for example, digital logic circuits and analog signal processing circuits could also be used.
Regardless of how the processor 104 is specifically implemented, it is in operable communication with the sensor 125 and the display device 116, and is coupled to receive data about the installation of the imaging sensor 125 on the aircraft. In one embodiment, this information can be hard-coded in the ROM memory 105. In another embodiment, this information can be entered by a pilot. In yet another embodiment, an external source of aircraft data can be used. The information about the installation of the sensor 125 on board may include, for example, that it is forward looking and aligned with the main axis of the aircraft body in the horizontal direction. More precise information may be provided, such as but not limited to, detailed information about sensor position in the aircraft reference frame, or sensor projection characteristics.
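By way of a non-limiting illustration only, the installation information described above could be represented as a simple configuration record. The following Python sketch is purely hypothetical; the field names and default values are assumptions made for illustration and are not part of this disclosure:

```python
from dataclasses import dataclass


@dataclass
class SensorInstallation:
    """Hypothetical record of imaging-sensor installation data.

    All field names and defaults are illustrative assumptions; an
    actual system would define these per its own sensor and airframe
    conventions (e.g., hard-coded like ROM data, pilot-entered, or
    supplied by an external aircraft-data source).
    """
    forward_looking: bool = True          # sensor faces direction of travel
    aligned_with_body_axis: bool = True   # horizontally aligned with main axis
    # Optional, more precise data in the aircraft reference frame:
    position_xyz_m: tuple = (0.0, 0.0, 0.0)   # sensor position (meters)
    horizontal_fov_deg: float = 30.0          # projection characteristic


# A processor could consult such a record when registering the
# sensory image against the synthetic scene.
install = SensorInstallation()
```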
In one embodiment, the processor 104 may further receive navigation information from the navigation sensors 113 or the external data sources 114, identifying the position of the aircraft. In some embodiments, information from the navigation database 108 may be utilized during this process. Having received the navigation information, the processor 104 may be further configured to receive information from the runway database 110. In some embodiments, the display system includes a combined vision system (CVS). In particular, the imaging sensor 125 may include the CVS sensor, the processor 104 may include a CVS processor, and the display device 116 may include a CVS display. The CVS may also use other data sources, such as a terrain database, an obstacle database, and the like.
The navigation databases 108 include various types of navigation-related data, such as flight-plan-related data including, for example, waypoints, distances between waypoints, headings between waypoints, data related to different airports, navigational aids, obstructions, special use airspace, political boundaries, communication frequencies, and aircraft approach information. It will be appreciated that, although the navigation databases 108 and the runway databases 110 are, for clarity and convenience, shown as being stored separate from the processor 104, all or portions of either or both of these databases 108, 110 could be loaded into the RAM 103, or integrally formed as part of the processor 104, the RAM 103, and/or the ROM 105. The databases 108, 110 could also be part of a device or system that is physically separate from the system 100. The sensors 113 may be implemented using various types of inertial sensors, systems, and/or subsystems, now known or developed in the future, for supplying various types of inertial data. The inertial data may vary, but preferably include data representative of the state of the aircraft such as, for example, aircraft speed, heading, altitude, and attitude. The number and type of external data sources 114 may also vary. The external systems (or subsystems) may include, for example, a flight director, a navigation computer, and various position detecting systems. However, for ease of description and illustration, only a global positioning system (GPS) receiver 122 is depicted in
The GPS receiver 122 is a multi-channel receiver, with each channel tuned to receive one or more of the GPS broadcast signals transmitted by the constellation of GPS satellites (not illustrated) orbiting the earth. Each GPS satellite encircles the earth two times each day, and the orbits are arranged so that at least four satellites are always within line of sight from almost anywhere on the earth. The GPS receiver 122, upon receipt of the GPS broadcast signals from at least three, and preferably four, or more of the GPS satellites, determines the distance between the GPS receiver 122 and the GPS satellites and the position of the GPS satellites. Based on these determinations, the GPS receiver 122, using a technique known as trilateration, determines, for example, aircraft position, groundspeed, and ground track angle.
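The trilateration technique noted above can be illustrated with a short sketch. Given four satellite positions and measured ranges, subtracting one range equation from the others yields a linear 3x3 system in the receiver position. This is a simplified geometric illustration only (a real GPS solution also estimates receiver clock bias and other error terms, which are omitted here):

```python
def trilaterate(sats, ranges):
    """Estimate a 3-D position from four satellite positions and ranges.

    Each range satisfies |x - p_i|^2 = r_i^2. Subtracting the first
    equation from the others cancels the quadratic term, leaving a
    linear 3x3 system solved here by Cramer's rule. Simplified sketch:
    receiver clock bias and measurement noise are ignored.
    """
    (x0, y0, z0), r0 = sats[0], ranges[0]
    A, b = [], []
    for (xi, yi, zi), ri in zip(sats[1:], ranges[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)])
        b.append(r0**2 - ri**2
                 + (xi**2 + yi**2 + zi**2) - (x0**2 + y0**2 + z0**2))

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(A)
    solution = []
    for c in range(3):           # Cramer's rule, one coordinate per column
        m = [row[:] for row in A]
        for r in range(3):
            m[r][c] = b[r]
        solution.append(det3(m) / d)
    return tuple(solution)
```

With exact (noise-free) ranges from four non-coplanar satellites, the sketch recovers the receiver position directly; groundspeed and ground track angle would then follow from successive position fixes.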
The display device 116, as noted above, in response to display commands supplied from the processor 104, selectively renders various textual, graphic, and/or iconic information, thereby supplying visual feedback to the user 109. It will be appreciated that the display device 116 may be implemented using any one of numerous known display devices suitable for rendering textual, graphic, and/or iconic information in a format viewable by the user 109. Non-limiting examples of such display devices include various cathode ray tube (CRT) displays and various flat panel displays, such as various types of LCD (liquid crystal display) and TFT (thin film transistor) displays. The display device 116 may additionally be implemented as a panel mounted display, a HUD (head-up display) projection, or any one of numerous known or emerging technologies. It is additionally noted that the display device 116 may be configured as any one of numerous types of aircraft flight deck displays. For example, it may be configured as a multi-function display, a horizontal situation indicator, or a vertical situation indicator. In the depicted embodiment, however, the display device 116 is configured as a primary flight display (PFD).
As such,
In an exemplary embodiment, the amount that the sensory image 151 is shifted from the center of the synthetic image 150 (i.e., up, down, left, or right) depends upon the attitude of the aircraft. For example, a five degree banking turn will shift the image 151 to the left or right by a relatively small amount, whereas a thirty degree banking turn will shift the image 151 by a relatively larger amount. Likewise, a five degree descent angle will shift the image 151 downward by a relatively small amount, whereas a ten degree descent angle will shift the image 151 downward by a relatively larger amount. All forms and amounts of lateral and vertical translation of the sensory image 151 within the synthetic image 150 will thus be understood to be within the scope of the present disclosure.
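One non-limiting way to realize the attitude-dependent shift described above is a proportional mapping from bank and descent angles to pixel offsets, clamped so the sensory image remains inside the synthetic image. The gain and clamp values in this sketch are illustrative assumptions, not values taken from this disclosure:

```python
def sensory_image_offset(bank_deg, descent_deg,
                         px_per_deg=8.0, max_dx=160, max_dy=120):
    """Map aircraft attitude to a (dx, dy) shift of the sensory image.

    A larger bank angle shifts the image farther left/right, and a
    steeper descent shifts it farther down, proportionally. Positive dx
    is a shift right; positive dy is a shift down. The px_per_deg gain
    and the clamp limits are illustrative assumptions only.
    """
    def clamp(value, limit):
        return max(-limit, min(limit, value))

    dx = clamp(bank_deg * px_per_deg, max_dx)
    dy = clamp(descent_deg * px_per_deg, max_dy)
    return dx, dy
```

Under these assumed values, a five degree bank produces a small lateral shift while a thirty degree bank saturates at the clamp limit, mirroring the proportional behavior described above.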
In an exemplary embodiment, the amount of shift from center of the sensory image 151 relative to the synthetic image 150 is coordinated based on the movement of the flight path vector symbol 157, which, as noted above, is already provided on many CVS systems known in the art. As shown in
Further embodiments of the present disclosure are depicted in
In
The various exemplary embodiments of a display system having now been described,
The presently described method may feature automatic transitioning between the above-noted modes. For example, once the aircraft starts descending, the CVS may be displayed in normal mode. Near the IAF 203, the CVS image may transition into the track mode, in which the image is centered on the FPV. Near the FAF 204, once the runway is in view, the CVS image may transition into the runway lock mode, so that the image is centered on the runway. If the landing is aborted and a missed approach is performed, the runway image will slide out of the view and the CVS image will again automatically transition to track mode.
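The automatic transitions described above can be sketched as a small state machine. The three modes and the triggering events come from the description; the event names and the function itself are illustrative assumptions:

```python
# Hypothetical sketch of the CVS mode transitions described above.
NORMAL, TRACK, RUNWAY_LOCK = "normal", "track", "runway_lock"


def next_mode(mode, event):
    """Return the CVS mode after an event; unrecognized events keep the
    current mode. Event names are illustrative assumptions."""
    transitions = {
        (NORMAL, "near_iaf"): TRACK,             # near IAF: center on FPV
        (TRACK, "runway_in_view"): RUNWAY_LOCK,  # near FAF: center on runway
        (RUNWAY_LOCK, "missed_approach"): TRACK, # go-around: back to track
    }
    return transitions.get((mode, event), mode)
```

Starting in normal mode during the descent, the sequence near_iaf, runway_in_view, missed_approach walks the system through track mode, runway lock mode, and back to track mode, matching the behavior described above.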
In some embodiments, the operation of flight display system 100 may be provided in connection with an air traffic alert system, such as traffic collision avoidance system (TCAS). As is known in the art, a TCAS system includes a display, such as a primary flight display, with symbols superimposed thereover indicating the position and altitude of other aircraft within a pre-defined vicinity of the aircraft. As such, the TCAS system includes data representing the position of other nearby aircraft. The presently described flight display system may be provided to operate in association with a TCAS system. For example, in one embodiment, the CVS system may be provided in an “alert mode.” As used herein, the term alert mode refers to the operation of the CVS wherein, based on the location of a traffic alert (TA) issued by the TCAS system, the sensory image 151 may be centered on the “intruder” aircraft location if the aircraft is within the CVS view frustum. Alert mode may be provided in place of any other operational mode, as needed based on the receipt of a traffic alert.
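Whether the sensory image can be centered on the intruder depends on a view-frustum containment test. The following is a hedged sketch assuming a simple angular frustum about the sensor boresight; the half-field-of-view limits are illustrative assumptions:

```python
def in_view_frustum(az_deg, el_deg, h_half_fov=17.5, v_half_fov=13.0):
    """True if a target at (azimuth, elevation), measured relative to the
    sensor boresight, lies inside a simple angular view frustum.

    The half-field-of-view limits are illustrative assumptions only.
    """
    return abs(az_deg) <= h_half_fov and abs(el_deg) <= v_half_fov


def alert_mode_center(intruder_az_el, current_center):
    """Center the sensory image on the intruder when it is inside the
    frustum; otherwise keep the current center (sketch only)."""
    az, el = intruder_az_el
    return intruder_az_el if in_view_frustum(az, el) else current_center
```

The same containment test would apply to terrain or obstacle alerts, with the alert location substituted for the intruder location.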
In further embodiments, the alert mode may be provided to operate in coordination with other alerting systems of the aircraft, such as terrain or obstacle alerting systems. Thus, based on a terrain alert or an obstacle alert, the sensory image 151 may be positioned on the obstacle location if it is within the CVS view frustum. This mode of operation gives the pilot precise awareness of the location of the obstacle or intruder, helping to avoid a collision.
Regarding any of the modes described above, a mode override option may be provided to allow the pilot to choose an alternate mode other than the one provided automatically by the system.
At a further position along the approach to the airport, such as upon crossing the IAF as indicated at block 710, flight path vector information is retrieved from the PFD at block 711 and the CVS system changes to track mode at block 712. As described above, in track mode, the sensory image changes position based on the flight path of the aircraft, for example as indicated by the flight path vector, as shown at block 713.
Thereafter, at a further position along the approach to the airport, such as within a given distance and altitude, or at the FAF, as shown at block 714, the CVS system retrieves runway information at block 715 and changes to runway lock mode at block 716. As described above, in runway lock mode, the sensory image changes position to be fixed on the runway, for example centered on the touchdown zone of the runway. In the event of a go-around, as shown at block 718, the CVS system reverts to track mode.
As such, the embodiments described herein provide an adaptive combined vision system that allows the position of the sensory image within the synthetic image to change under various circumstances. The embodiments allow the sensory image to remain desirably small while still providing the pilot with the imagery most relevant to the flight. Further, the exemplary methods of providing a display set forth above allow for automatic transitioning of the mode of operation of the CVS system based on the stage of flight of the aircraft. Still further, the CVS may automatically transition to an alert mode in the event of an aircraft intrusion or the presence of terrain or an obstacle, thereby providing enhanced safety in the operation of the aircraft.
While at least one exemplary embodiment has been presented in the foregoing detailed description of the inventive subject matter, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the inventive subject matter in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the inventive subject matter, it being understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the inventive subject matter as set forth in the appended claims.