The present invention generally relates to a system for enhancing a pilot's situational awareness of terrain around an aircraft, and more particularly relates to a system for depicting terrain.
In order to prevent controlled flight into terrain (“CFIT”) events, Terrain Awareness and Warning Systems (hereinafter “TAWS”) have been developed to provide pilots with an electronic means to detect terrain and to provide warnings when an aircraft's flight path, if unaltered, will lead to a collision with terrain. One such example of a TAWS is a Ground Proximity Warning System (hereinafter “GPWS”), which uses a radar altimeter to assist in calculating terrain closure rates.
An improvement to GPWS is an Enhanced Ground Proximity Warning System (hereinafter “EGPWS”) which incorporates the use of the Global Positioning System (hereinafter “GPS”) and a terrain database storing data relating to the altitudes and/or elevations of terrain worldwide. EGPWS determines the position of an aircraft and then provides a two dimensional image of the terrain around the aircraft on a display screen in the cockpit of the aircraft. The terrain around the aircraft is depicted with a color code that correlates to the altitude and/or elevation of the terrain. For instance, if the terrain reaches elevations well above the altitude of the aircraft, it is shown in red. Terrain that lies well below the aircraft is shown in green. Terrain that lies at approximately the same altitude as the aircraft is shown in yellow.
Another system that electronically provides pilots with an enhanced awareness of the terrain near the aircraft is an integrated Primary Flight Display (hereinafter “IPFD”). An IPFD provides a pilot with a three dimensional image of the terrain located in front of the pilot's aircraft. The image is constantly updated and resembles a video while the aircraft is in flight.
While EGPWS and IPFD have greatly reduced the likelihood of CFIT events, these systems have their limitations. For instance, the image provided by EGPWS provides a pilot with a two dimensional view from an above-the-aircraft perspective. The pilot must mentally manipulate and interpret the image to gain a situational awareness of where the illustrated terrain is with respect to the aircraft. In certain circumstances, the pilot may not be able to absorb the information provided by EGPWS. With respect to IPFD, this system only provides a view of the terrain lying ahead of the aircraft, not beneath it, behind it, or on either side of it.
Accordingly, it is desirable to provide a system that enhances a pilot's ability to gain a situational awareness of the terrain located around the pilot's aircraft which, in turn, will reduce the occurrence of CFIT. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description of the invention and the appended claims, taken in conjunction with the accompanying drawings and this background of the invention.
A system for providing a pilot of an aircraft with a visual depiction of a terrain is disclosed herein. In a first non-limiting embodiment, the system includes, but is not limited to, a data storage unit that is configured to store terrain data. The system also includes a position determination unit that is configured to determine a position of the aircraft. The system further includes a near to eye display unit that is adapted to be worn by the pilot and that is configured to display visual images and to determine a view direction of the pilot. The system also includes a processing unit that is communicatively connected to the data storage unit, to the position determination unit and to the near to eye display unit. The processing unit is configured to control the near to eye display unit to display a three dimensional representation of a viewed terrain region, the three dimensional representation including a depiction of the terrain within the viewed terrain region, the depiction of the terrain being graphically differentiated in a manner that corresponds with an altitude of the terrain within the viewed terrain region. The depiction of the terrain is at least partially transparent.
In a second non-limiting embodiment, the system includes a data storage unit that is configured to store terrain data and structure data. The system also includes a position determination unit that is configured to determine a position of the aircraft. The system also includes a near to eye display unit that is adapted to be worn by the pilot and that is configured to display visual images and to determine a view direction of the pilot. The system also includes a processing unit that is communicatively connected to the data storage unit, to the position determination unit and to the near to eye display unit. The processing unit is configured to control the near to eye display unit to display a three dimensional representation of a viewed terrain region. The three dimensional representation includes a depiction of the terrain and a depiction of a structure within the viewed terrain region. The depiction of the terrain and the depiction of the structure are color coded in a manner that corresponds with an altitude of the terrain and an altitude of the structure within the viewed terrain region. The depiction of the terrain and the depiction of the structure are at least partially transparent.
In a third non-limiting embodiment, the system includes a data storage unit that is configured to store terrain data. The system also includes a position determination unit that is configured to determine a position of the aircraft. The system further includes a near to eye display unit that is adapted to be worn by the pilot and that is configured to display visual images and to determine a view direction of the pilot. The system also includes a processing unit that is communicatively connected to the data storage unit, to the position determination unit and to the near to eye display unit. The processing unit is configured to obtain the position of the aircraft from the position determination unit, to obtain the view direction from the near to eye display, to determine a viewed terrain region utilizing the position of the aircraft and the view direction, to obtain a sub-set of the terrain data relating to the viewed terrain region from the data storage unit, and to control the near to eye display unit to display a three dimensional representation of the viewed terrain region. The three dimensional representation includes a depiction of the terrain overlaying the terrain within the viewed terrain region. The depiction of the terrain is color coded in a manner that corresponds with an altitude of the terrain within the viewed terrain region. The depiction of the terrain comprises a hollow wire frame.
The present invention will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and
The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
One solution to the problem described above is to provide the pilot of an aircraft with an on-board system that displays a three dimensional depiction of the terrain located in generally all directions around the aircraft as the aircraft is in flight. The three dimensional display is provided to the pilot through a near to eye display unit that displays the three dimensional depiction of the terrain over the actual terrain that would be visible to the pilot if visual conditions permitted the pilot to see the terrain. The three dimensional display is transparent (e.g., a hollow wire frame image) to prevent obstruction of the pilot's vision and to allow the pilot to see air traffic, the interior of the cockpit, and the view through the windshield.
Use of a near to eye display unit permits the pilot to turn and/or tilt his or her head to see a three dimensional depiction of the terrain all around the aircraft. In this manner, the pilot can gain an understanding of the terrain located around the aircraft in the darkness of night or in other environments that present diminished or limited visibility. By providing the display in three dimensions, the pilot can inherently understand the information presented without having to engage in any substantial mental manipulation of the information presented.
The three dimensional depiction of the terrain is graphically differentiated to provide the pilot with information about the altitude of the depicted terrain. Any method of graphic differentiation may be implemented. For example, lines of differing thickness or brightness may be used to communicate differing elevations of terrain. The use of solid and broken lines may also be implemented. In other embodiments, portions of the terrain may be flashed or strobed to communicate information about terrain elevation to the pilot.
In another example, the graphic differentiation may be achieved by color coding the three dimensional depiction such that terrain located generally above the aircraft would be displayed in a first color (e.g., red), terrain located at generally the same altitude as the aircraft would be displayed in a second color (e.g., yellow) and terrain located generally below the aircraft would be displayed in a third color (e.g., green). By color coding the three dimensional depiction, the pilot can easily assess whether the depicted terrain poses a danger to the aircraft. Color coding the three dimensional depiction in this manner provides the pilot with assistance even when visibility through the windshield is unobstructed. For example, while flying through mountainous terrain on a clear day, a pilot may become confused as his or her eyes attempt to adjust to the various terrain features that are constantly changing in depth and distance. The color coding allows the pilot to instantly ascertain whether a visible terrain feature is a danger to the aircraft even when the pilot's eyes cannot adjust quickly enough to provide this assessment. The use of other types of graphic differentiation can similarly assist the pilot in this manner.
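The color coding scheme described above can be sketched as follows. This is a minimal illustration only; the 500-foot band, the function name, and the parameter names are assumptions for the sketch and are not part of the disclosed system:

```python
def terrain_color(terrain_elevation_ft, aircraft_altitude_ft, band_ft=500):
    """Classify a terrain cell relative to the aircraft's altitude.

    Illustrative thresholds: terrain within +/- band_ft of the aircraft
    is 'yellow'; terrain above that band is 'red'; below it is 'green'.
    """
    delta = terrain_elevation_ft - aircraft_altitude_ft
    if delta > band_ft:
        return "red"      # terrain generally above the aircraft
    if delta < -band_ft:
        return "green"    # terrain generally below the aircraft
    return "yellow"       # terrain at generally the same altitude
```

For example, with the aircraft at 8,000 feet, a 9,000-foot ridge would be classified "red" while a 2,000-foot valley floor would be classified "green".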
The system disclosed herein includes a processing unit that is communicatively connected to the near to eye display unit. The near to eye display unit is configured to determine a view direction of the pilot and to communicate the view direction to the processing unit. As used herein, the term “view direction” refers to the direction that the pilot is looking in from the cockpit of the aircraft. The system also includes a position determination unit and a data storage unit, both of which are communicatively connected to the processing unit.
The position determination unit is configured to detect the position of the aircraft with respect to the earth and to communicate the position to the processing unit. The processing unit is configured to determine a viewed terrain region using the position of the aircraft and the view direction of the pilot. As used herein, the term “viewed terrain region” refers to a portion of terrain that the pilot is presently looking at. Depending on the angle of the pilot's head and the elevation of the aircraft, the viewed terrain region may include items other than terrain. For example, if a pilot is looking toward the horizon, then a portion of the viewed terrain region will include sky.
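One way the processing unit might combine the aircraft's position with the pilot's view direction to bound a viewed terrain region is sketched below. The flat-earth approximation, the field-of-view and range defaults, and all names are illustrative assumptions, not details taken from the specification:

```python
import math

def viewed_terrain_region(lat_deg, lon_deg, heading_deg, view_azimuth_deg,
                          fov_deg=60.0, range_nm=40.0):
    """Approximate the footprint of terrain the pilot is looking at.

    The absolute look direction is the aircraft heading plus the head
    azimuth offset reported by the near to eye display unit.
    """
    look_deg = (heading_deg + view_azimuth_deg) % 360.0
    half_fov = fov_deg / 2.0
    # One nautical mile of latitude is ~1/60 degree; a degree of
    # longitude shrinks with cos(latitude) in this approximation.
    d_lat = (range_nm / 60.0) * math.cos(math.radians(look_deg))
    d_lon = ((range_nm / 60.0) * math.sin(math.radians(look_deg))
             / math.cos(math.radians(lat_deg)))
    return {"azimuth_min": (look_deg - half_fov) % 360.0,
            "azimuth_max": (look_deg + half_fov) % 360.0,
            "far_point": (lat_deg + d_lat, lon_deg + d_lon)}
```

For instance, a pilot looking straight ahead (head azimuth 0) in an aircraft heading due east yields a region centered on azimuth 090 whose far point lies east of the aircraft.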
The data storage unit stores data including information about the terrain on the surface of the earth. The processing unit controls the data storage unit to provide a subset of the terrain data from the data storage unit that relates to the viewed terrain region. Using the subset of data relating to the viewed terrain region, the processing unit controls the near to eye display unit to present the three dimensional depiction of the terrain that is located within the viewed terrain region.
A greater understanding of the embodiments of the system disclosed herein may be obtained through a review of the illustrations accompanying this application together with a review of the detailed description that follows.
With respect to
Processing unit 12 may be any suitable computer processor such as, for example, a microprocessor, a digital signal processor, or any other processor capable of at least receiving and/or retrieving data, calculating a result using the data, and controlling a display unit to display the result. Processing unit 12 may comprise multiple computer processors that are communicatively connected to one another over a local area network (LAN) or a wide area network (WAN). In some embodiments, processing unit 12 may be housed onboard an aircraft. In other embodiments, processing unit 12 may be configured to wirelessly communicate with the aircraft employing system 10 and housed remotely at an airport, at an air traffic control facility, at a central location, or otherwise housed externally to the aircraft employing system 10.
Near to eye display unit 14 is a system that is configured to be worn by a pilot and that is further configured to present a display screen close to one or both eyes of the pilot. Near to eye display systems are well known in the art and exemplary near to eye systems are disclosed in U.S. Pat. Nos. RE28,847; 5,003,300; 5,539,422; 5,742,263; and 5,673,059. In at least one non-limiting embodiment of system 10, near to eye display unit 14 is configured to display three dimensional images to the pilot. The three dimensional images are transparent and may have the appearance of hollow wire frames. In other embodiments, other types of transparent three dimensional images may be displayed. By providing a transparent image, near to eye display unit 14 permits the pilot to see the three dimensional image without obstructing the pilot's view. In some embodiments, near to eye display unit 14 may also be configured to display images in different colors to permit color coding of the three dimensional images, as discussed below.
In some embodiments, near to eye display unit 14 includes a display screen 20 (see
Near to eye display unit 14 is communicatively connected to processing unit 12 and is configured to communicate the view direction to processing unit 12. Processing unit 12 is configured to provide control commands to near to eye display unit 14 to cause near to eye display unit 14 to display three dimensional images. Any type of connection effective to transmit signals between near to eye display unit 14 and processing unit 12 may be employed. For example, and without limitation, coaxial cables, transmission lines, microstrips, an input/output bus connection, and any type of wireless communication system, or any combination of the foregoing, may be employed to communicatively connect near to eye display unit 14 to processing unit 12.
Position determination unit 16 is an aircraft mounted device that is configured to determine the aircraft's current position (e.g., latitude, longitude and altitude) and to provide the aircraft's current position to processing unit 12. Position determination unit 16 may comprise an onboard navigation system that can include, but which is not limited to, an inertial navigation system, a satellite navigation system (e.g., Global Positioning System) receiver, VLF/OMEGA, Loran C, VOR/DME, DME/DME, IRS, a Flight Management System (FMS), and/or an altimeter or any combination of the foregoing. Position determination unit 16 is communicatively connected to processing unit 12. Any type of connection effective to transmit signals between position determination unit 16 and processing unit 12 may be employed. For example, and without limitation, coaxial cables, transmission lines, microstrips, an input/output bus connection, and any type of wireless communication system, or any combination of the foregoing, may be employed to communicatively connect position determination unit 16 to processing unit 12.
Data storage unit 18 is a data storage component that may be housed onboard the aircraft employing system 10. In other embodiments, data storage unit 18 may be configured to wirelessly communicate with the aircraft employing system 10 and may be housed remotely at an airport, at an air traffic control facility, at a central location, or otherwise housed externally to the aircraft employing system 10. Data storage unit 18 is communicatively connected to processing unit 12. Any type of connection effective to transmit signals between processing unit 12 and data storage unit 18 may be employed. For example, and without limitation, coaxial cables, transmission lines, microstrips, an input/output bus connection, and any type of wireless communication system, or any combination of the foregoing, may be employed to communicatively connect data storage unit 18 to processing unit 12.
In at least the embodiment illustrated in
In at least one embodiment, data storage unit 18 is configured to store data pertaining to the altitude and/or the elevation of terrain (hereinafter, terrain data 22). In some embodiments, terrain data 22 includes information about terrain located world wide. In other embodiments, terrain data 22 may relate to terrain only in predetermined geographical regions of the world where the aircraft is expected to be employed.
In some embodiments, data storage unit 18 is further configured to store data related to manmade structures such as buildings, towers, navigation beacons, antennas, and other manmade structures whose height may be relevant to a pilot of an aircraft (hereinafter, structure data 24). In still other embodiments, data storage unit 18 may be further configured to store data indicative of the type of terrain present. For example, data storage unit 18 may store data indicating whether terrain is a woods, a mountain, a lake, an ocean, a canyon, a river, a ravine, a flat land, etc.
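A gridded elevation store is one simple way terrain data 22 might be organized for retrieval by latitude and longitude. The 0.1-degree cell size, the class name, and the dictionary layout below are illustrative assumptions rather than details of data storage unit 18:

```python
class TerrainDatabase:
    """Sketch of a gridded elevation store: one elevation value per
    0.1-degree latitude/longitude cell (an assumed resolution)."""

    def __init__(self):
        self._cells = {}  # (lat_index, lon_index) -> elevation in feet

    @staticmethod
    def _key(lat_deg, lon_deg):
        # Quantize coordinates to 0.1-degree cells.
        return (round(lat_deg * 10), round(lon_deg * 10))

    def store(self, lat_deg, lon_deg, elevation_ft):
        self._cells[self._key(lat_deg, lon_deg)] = elevation_ft

    def elevation(self, lat_deg, lon_deg, default_ft=0.0):
        return self._cells.get(self._key(lat_deg, lon_deg), default_ft)
```

A processing unit could query such a store cell by cell over the viewed terrain region to build the subset of terrain data needed for display.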
Processing unit 12 is configured (i.e., processing unit 12 is loaded with, and operates, appropriate software, algorithms and/or sub-routines) to receive the aircraft's position from position determination unit 16 and to receive the view direction from near to eye display unit 14 and to utilize the aircraft's position and the view direction to determine the viewed terrain region. Once processing unit 12 has determined the viewed terrain region, it is configured to obtain a subset of terrain data 22 from data storage unit 18 that contains information about the altitude of the terrain located within the viewed terrain region. Processing unit 12 is further configured to send control commands to near to eye display unit 14 to display a three dimensional depiction of the terrain located within the viewed terrain region.
As the pilot turns his or her head, near to eye display unit 14 determines a new view direction. As the plane continues in flight, position determination unit 16 determines a new position of the aircraft. Processing unit 12 utilizes the new view direction and the new position of the aircraft to determine a new viewed terrain region. Processing unit 12 uses the new viewed terrain region to obtain a new subset of terrain data 22 which processing unit 12 then uses to control near to eye display unit 14 to display a new three dimensional depiction of the terrain located within the new viewed terrain region. In at least some embodiments, this process repeats itself continuously throughout the flight of the aircraft to provide the pilot with a constantly updated, real-time three dimensional depiction of the terrain located in the viewed terrain region.
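The continuously repeated sequence described above can be sketched as a single update cycle, with callables standing in for the position determination unit, the near to eye display unit, the data storage unit, and the display command path. The decomposition and all names are illustrative assumptions:

```python
def update_cycle(get_position, get_view_direction, query_terrain, render):
    """One pass of a hypothetical display-update loop.

    get_position       -> aircraft position (e.g., lat, lon, altitude)
    get_view_direction -> pilot's current view direction
    query_terrain      -> subset of terrain data for a viewed region
    render             -> command the display to draw that subset
    """
    position = get_position()                 # from position determination unit
    view_direction = get_view_direction()     # from near to eye display unit
    region = (position, view_direction)       # viewed terrain region (simplified)
    subset = query_terrain(region)            # from data storage unit
    render(subset)                            # control the near to eye display
    return region
```

Running this cycle repeatedly throughout the flight yields the constantly updated depiction: each head movement or change in aircraft position produces a new region, a new data subset, and a new display command.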
In the embodiments illustrated in
With respect to
With respect to
With respect to
In some embodiments, system 10 is configured to display the three dimensional representation 40 of the viewed terrain region utilizing depth cueing. As used herein, the term “depth cueing” refers to a display strategy wherein displayed objects that are distant from the viewer are displayed lightly or faintly as compared with objects that are closer to the viewer. In other embodiments, system 10 is configured to display the three dimensional representation 40 of the viewed terrain region utilizing color cueing. As used herein, the term “color cueing” refers to a display strategy wherein colors that are used to display objects that are disposed at a higher altitude are displayed more brightly than colors used to display objects that are disposed at a lower altitude.
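Depth cueing and color cueing as defined above can each be sketched as a brightness function. The linear falloff, the clamping ranges, and the function names are illustrative assumptions; the specification does not fix any particular formula:

```python
def depth_cue_brightness(distance_nm, max_range_nm=40.0):
    """Depth cueing: objects farther from the viewer are drawn more
    faintly. A linear falloff over an assumed 40 nm range."""
    distance_nm = min(max(distance_nm, 0.0), max_range_nm)
    return 1.0 - (distance_nm / max_range_nm)

def color_cue_brightness(elevation_ft, min_ft=0.0, max_ft=15000.0):
    """Color cueing: objects at higher altitude are drawn more brightly.
    Linear scaling over an assumed 0-15,000 ft elevation span."""
    span = max_ft - min_ft
    clamped = min(max(elevation_ft, min_ft), max_ft)
    return (clamped - min_ft) / span if span else 1.0
```

Under these assumptions, a wireframe line 40 nm away fades to zero brightness, while a 15,000-foot peak is drawn at full color intensity.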
With respect to
In the embodiment illustrated in
As with the embodiment illustrated in
With respect to
While at least one exemplary embodiment has been presented in the foregoing detailed description of the invention, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the invention. It should be understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the invention as set forth in the appended claims.