The present invention generally relates to situational awareness, and more particularly relates to a system and method of providing enhanced situational awareness to an operator, either within a vehicle or a centralized control station.
Air travel has long been, and continues to be, a safe mode of transportation. Nonetheless, substantial effort continues to be expended to develop flight systems and human-factors practices that even further improve aircraft flight safety. Some examples of these flight systems include flight management systems, global navigation satellite systems, differential global positioning systems, air data computers, instrument landing systems, satellite landing systems, traffic alert and collision avoidance systems, weather avoidance systems, thrust management systems, flight control surface systems, and flight control computers, just to name a few.
Despite good flight system design and improved human-factors practices, there is a continuous desire to provide further flight safety improvements. One particular aspect presently undergoing significant improvement is obstacle avoidance. It is generally understood that improving aircraft flight crew situational awareness during flight operations, ground operations, and landing operations will likely improve the ability of a flight crew to avoid obstacles.
During flight operations, flight crews make every effort to consistently survey the region around the aircraft. However, aircraft structures, such as the wings and the aft lower fuselage, may block large regions of airspace from view. Moreover, at times the cockpit workload can distract the flight crew from visual scanning. To enhance situational awareness during crowded air traffic and/or low-visibility flight operations, many aircraft are equipped with a Traffic Alert and Collision Avoidance System (TCAS). Although the TCAS does provide significant improvements to situational awareness, the burden of avoiding other aircraft remains on the pilots of TCAS-equipped aircraft.
During ground operations, the possibility for a runway incursion exists, especially at relatively large and complex airports. Governmental regulatory bodies suggest that most runway incursions that have occurred are due to pilot-induced errors. These regulatory bodies also suggest that the likelihood of a runway incursion increases if a pilot lacks awareness of the position and intent of other traffic in the vicinity of the aircraft.
Regarding landing operations, there is presently no method or device that provides a visual display of another aircraft encroaching on the flight path of the host aircraft during simultaneous approaches on parallel runways. Although the Instrument Landing System (ILS) does provide lateral, along-course, and vertical guidance to aircraft that are attempting to land, the ILS may not maintain adequate separation during simultaneous approaches on parallel runways because the localizer signal displayed during an ILS approach does not support independent parallel approaches. Although parallel approaches may be adequately staggered in fair weather, and the ILS is intended to maintain adequate vertical separation between aircraft until an approach is established, inclement weather may decrease airport capacity and compound the potential parallel-approach problem.
Hence, there is a need for a system and method of improving aircraft flight crew situational awareness during flight operations, ground operations, and landing operations that does not suffer the drawbacks of presently known systems. The present invention addresses at least this need.
In one embodiment, and by way of example only, a method of providing enhanced situational awareness to an operator includes receiving automatic dependent surveillance-broadcast (ADS-B) traffic data transmitted by a traffic entity. The ADS-B traffic data are processed to determine traffic entity position. The traffic entity position is mapped to corresponding image coordinates on an enhanced vision system (EVS) display. A region of interest around at least a portion of the corresponding image coordinates is selected. An actual image of the traffic entity is rendered on the EVS display, at the corresponding image coordinates, and with at least a portion of the region of interest being highlighted.
In another exemplary embodiment, a system for providing enhanced situational awareness to an operator includes an enhanced vision system (EVS) display and a processor. The EVS display is coupled to receive image rendering display commands and is operable, in response thereto, to render images. The processor is in operable communication with the EVS display. The processor is adapted to receive automatic dependent surveillance-broadcast (ADS-B) traffic data associated with a traffic entity and image data representative of the traffic entity and is operable, in response to these data, to determine traffic entity position, map the traffic entity position to corresponding image coordinates on the EVS display, select a region of interest around at least a portion of the corresponding image coordinates, and supply image rendering display commands to the EVS display that cause the EVS display to render an actual image of the traffic entity, at the corresponding image coordinates, and with at least a portion of the region of interest being highlighted.
In still another exemplary embodiment, a system for providing enhanced situational awareness to an operator includes a plurality of enhanced vision system (EVS) image sensors, an EVS display, and a processor. Each EVS image sensor is operable to sense one or more target entities within a predetermined range and supply image data representative thereof. The EVS display is coupled to receive image rendering display commands and is operable, in response thereto, to render images. The processor is in operable communication with the EVS display and the EVS image sensors, and is adapted to receive automatic dependent surveillance-broadcast (ADS-B) traffic data associated with a traffic entity and image data from one or more of the EVS image sensors. The processor is operable, in response to the received data, to determine a position of each of the traffic entities, compute a threat level of each of the traffic entities, assign a priority level to each of the traffic entities based on the computed threat levels, select one of the plurality of EVS image sensors from which to receive image data based at least in part on the priority level of each of the traffic entities, map each traffic entity position to corresponding image coordinates on the EVS display, select a region of interest around at least a portion of each of the corresponding image coordinates, and supply image rendering display commands to the EVS display that cause the EVS display to render actual images of selected ones of the traffic entities, at the corresponding image coordinates, and with at least a portion of each region of interest being highlighted.
Furthermore, other desirable features and characteristics of the enhanced situational awareness system and method will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the preceding background.
The present invention will hereinafter be described in conjunction with the accompanying drawing figures, wherein like numerals denote like elements.
The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
Turning first to FIG. 1, a functional block diagram of an exemplary enhanced situational awareness system 100 is depicted. The depicted system 100 includes, among other components, an enhanced vision system (EVS) display 102 and a processor 104, each of which is described in more detail below.
It will be appreciated that the EVS display 102 may be implemented using any one of numerous known displays suitable for rendering image and/or text data in a format viewable by the user 101. Non-limiting examples of such displays include various cathode ray tube (CRT) displays and various flat panel displays, such as various types of LCD (liquid crystal display) and TFT (thin film transistor) displays. The EVS display 102 may be implemented as a panel-mounted display, a head-up display (HUD) projection, or any one of numerous other display technologies now known or developed in the future. The EVS display 102 may additionally be implemented as a stand-alone, dedicated display, or be implemented as part of an existing flight deck display, such as a primary flight display (PFD) or a multi-function display (MFD), just to name a few.
The processor 104 is in operable communication with the EVS display 102 and a plurality of data sources via, for example, a communication bus 106. The processor 104 is coupled to receive data from the data sources and is operable, in response to the received data, to supply appropriate image rendering display commands to the EVS display 102 that cause the EVS display 102 to render various images. The data sources that supply data to the processor 104 may vary, but in the depicted embodiment these data sources include at least an automatic dependent surveillance-broadcast (ADS-B) receiver 108, one or more EVS image sensors 112, and a weather data source 114. Moreover, though not depicted in FIG. 1, various other data sources could also supply data to the processor 104.
The processor 104 may include one or more microprocessors, each of which may be any one of numerous known general-purpose microprocessors or application-specific processors that operate in response to program instructions. In the depicted embodiment, the processor 104 includes on-board RAM (random access memory) 103 and on-board ROM (read only memory) 105. The program instructions that control the processor 104 may be stored in either or both the RAM 103 and the ROM 105. For example, the operating system software may be stored in the ROM 105, whereas various operating mode software routines and various operational parameters may be stored in the RAM 103. It will be appreciated that this is merely exemplary of one scheme for storing operating system software and software routines, and that various other storage schemes may be implemented. It will also be appreciated that the processor 104 may be implemented using various other circuits, not just one or more programmable processors. For example, digital logic circuits and analog signal processing circuits could also be used.
The ADS-B receiver 108 is configured to receive ADS-B transmissions from one or more external traffic entities (e.g., other aircraft) and supplies ADS-B traffic data to the processor 104. As is generally known, ADS-B is a cooperative surveillance technique for air traffic control and related applications. More specifically, each ADS-B equipped aircraft automatically and periodically transmits its state vector, preferably via a digital datalink. An aircraft state vector typically includes its position, airspeed, altitude, intent (e.g., whether the aircraft is turning, climbing, or descending), aircraft type, and flight number. Each ADS-B receiver, such as the ADS-B receiver 108 in the depicted system 100, that is within the broadcast range of an ADS-B transmission, processes the ADS-B transmission and supplies ADS-B traffic data to one or more other devices. In the depicted embodiment, and as was just mentioned, these traffic data are supplied to the processor 104 for additional processing. This additional processing will be described in more detail further below.
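By way of a non-limiting illustration only, the state vector described above maps naturally onto a simple record type. In the following Python sketch, the field names and the parse_adsb_message helper are hypothetical conveniences, not part of any standardized ADS-B decoder:

```python
from dataclasses import dataclass

@dataclass
class AdsbStateVector:
    """One ADS-B report from a traffic entity (hypothetical field layout)."""
    icao_address: str          # 24-bit airframe identifier, hex-encoded
    latitude_deg: float
    longitude_deg: float
    altitude_ft: float
    airspeed_kt: float
    vertical_rate_fpm: float   # >0 climbing, <0 descending (intent cue)
    heading_deg: float
    aircraft_type: str
    flight_number: str

def parse_adsb_message(raw: dict) -> AdsbStateVector:
    """Map an already-demodulated ADS-B payload onto the record.

    The key names below are assumptions about an upstream decoder's
    output, not a real message format.
    """
    return AdsbStateVector(
        icao_address=raw["icao"],
        latitude_deg=raw["lat"],
        longitude_deg=raw["lon"],
        altitude_ft=raw["alt_ft"],
        airspeed_kt=raw["spd_kt"],
        vertical_rate_fpm=raw["vr_fpm"],
        heading_deg=raw["hdg"],
        aircraft_type=raw.get("type", "unknown"),
        flight_number=raw.get("flight", ""),
    )
```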
The EVS image sensor 112 is operable to sense one or more target entities within a predetermined range and supply image data representative of each of the sensed target entities. The image data are supplied to the processor 104 for further processing, which will also be described further below. The EVS image sensor 112 may be implemented using any one of numerous suitable image sensors now known or developed in the future. Some non-limiting examples of presently known EVS image sensors 112 include various long-wave infrared (LWIR) cameras, medium-wave infrared (MWIR) cameras, short-wave infrared (SWIR) cameras, electro-optical (EO) cameras, line scan cameras, radar devices, lidar devices, and visible-band cameras, just to name a few.
No matter the particular type of EVS sensor 112 that is used, it is noted that each EVS sensor type exhibits different range, resolution, and other performance characteristics. As such, in a particular preferred embodiment, the system 100 includes a plurality of EVS sensors 112 of varying capability. Moreover, in the context of an aircraft environment, the EVS sensors 112 are preferably mounted on the outer surface of the aircraft and are strategically located, either together or at various locations on the aircraft, to optimize performance, design, and cost. As will be described further below, when a plurality of EVS image sensors 112 are included, the processor 104 implements a process to select one or more of the EVS image sensors 112 from which to receive image data for further processing.
The weather data source 114, as the nomenclature connotes, supplies data representative of environmental weather conditions. Preferably, the weather data used by the processor 104 in the depicted system are representative of the environmental weather conditions within a predetermined range of the aircraft in which the system 100 is installed, for example, within the range of the EVS sensor 112 having the maximum range. It will be appreciated, of course, that this may vary. Nonetheless, as will be described further below, the processor 104, at least in some embodiments, uses the weather data as part of the process to select one or more of the EVS sensors 112 from which to receive image data for further processing. Moreover, in some embodiments, the system 100 could be implemented without the weather data source 114.
The system 100 described above and depicted in FIG. 1 is configured to implement a process 200 for providing enhanced situational awareness, an exemplary embodiment of which is depicted in flowchart form in FIG. 2. In the following description, the parenthetical references correspond to the flowchart blocks of FIG. 2.
The process 200 begins upon receipt, by the processor 104, of ADS-B traffic data supplied from the ADS-B receiver 108 (202). The processor 104 processes the received ADS-B traffic data to determine, among other things, the position of each traffic entity associated with the received ADS-B traffic data (204). The processor 104 then maps the position of the traffic entity to corresponding image coordinates on the EVS display 102 (208), and selects a region of interest around at least a portion of the corresponding image coordinates (212). Thereafter, the processor 104 supplies image rendering display commands to the EVS display 102 that cause the EVS display 102 to render an actual image of the traffic entity, at the corresponding image coordinates, and with at least a portion of the region of interest being highlighted (214).
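By way of illustration only, the following Python sketch shows one way the mapping step (208) and the region-of-interest step (212) could be realized. It is a minimal sketch under strong simplifying assumptions: a flat-earth local tangent plane in place of proper geodetic transforms, an ideal pinhole camera boresighted along the ownship heading, and invented constants and helper names. It reuses the hypothetical AdsbStateVector record from the earlier sketch, for both the ownship state and the traffic entity.

```python
import math

# Assumed camera model: ideal pinhole boresighted along the ownship
# heading, with focal length and principal point given in pixels.
FOCAL_PX = 1200.0
CX, CY = 640.0, 360.0        # assumed principal point (image center)
EARTH_R = 6_371_000.0        # mean earth radius, meters

def geodetic_to_local(own, tgt):
    """Flat-earth east/north/up offsets (meters) of target from ownship."""
    d_lat = math.radians(tgt.latitude_deg - own.latitude_deg)
    d_lon = math.radians(tgt.longitude_deg - own.longitude_deg)
    north = d_lat * EARTH_R
    east = d_lon * EARTH_R * math.cos(math.radians(own.latitude_deg))
    up = (tgt.altitude_ft - own.altitude_ft) * 0.3048
    return east, north, up

def map_to_image(own, tgt):
    """Step (208): map the traffic entity position to display pixel
    coordinates. Returns None when the target is behind the camera."""
    east, north, up = geodetic_to_local(own, tgt)
    h = math.radians(own.heading_deg)
    forward = north * math.cos(h) + east * math.sin(h)
    right = east * math.cos(h) - north * math.sin(h)
    if forward <= 0.0:
        return None
    u = CX + FOCAL_PX * right / forward
    v = CY - FOCAL_PX * up / forward
    return u, v

def select_roi(u, v, half_size_px=40):
    """Step (212): a square region of interest around the mapped point."""
    return (u - half_size_px, v - half_size_px,
            u + half_size_px, v + half_size_px)
```

An actual avionics implementation would of course use the installed sensor's calibration and proper WGS-84 transforms; the sketch only conveys the chain from traffic entity position to display coordinates to region of interest.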
It will be appreciated that the system 100 could implement the process 200 for each and every target entity from which ADS-B traffic data are received. However, in a particular preferred embodiment, the system 100 is configured to implement the entire process 200 only for selected traffic entities, in particular, only for traffic entities that are considered to present a sufficiently high threat. For example, some traffic entities may be static (e.g., not presently moving) entities, or may be moving away from the aircraft in which the system 100 is installed. In both of these exemplary instances, the traffic entity (or entities) that made the ADS-B transmission, while within range, may not be assessed as a viable potential threat and/or may not be classified as a threat of sufficiently high priority.
In view of the foregoing, and as FIG. 2 further depicts, the processor 104 preferably computes a threat level for each traffic entity and assigns a priority level to each traffic entity based on its computed threat level. Only those traffic entities that are assigned a sufficiently high priority level are then processed further.
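The description above does not prescribe a particular threat or priority computation, so the following Python fragment is only one plausible heuristic, with invented weights and thresholds, offered to make the filtering concrete:

```python
def compute_threat_level(range_m: float, closure_rate_mps: float,
                         is_moving: bool) -> float:
    """Illustrative threat score in [0, 1]: closer and faster-closing
    traffic scores higher. Weights and limits are invented."""
    if not is_moving and range_m > 2_000.0:
        return 0.0                                  # distant static entity
    proximity = max(0.0, 1.0 - range_m / 20_000.0)  # 20 km assumed horizon
    closure = min(1.0, max(0.0, closure_rate_mps) / 200.0)
    return min(1.0, 0.6 * proximity + 0.4 * closure)

def assign_priority(threat: float) -> str:
    """Map the numeric score onto coarse priority levels."""
    if threat >= 0.7:
        return "high"
    if threat >= 0.3:
        return "medium"
    return "low"
```

Under this scheme, only entities whose score maps to the "high" level would proceed to image capture and rendering; receding or distant static entities naturally score low and are filtered out.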
It was noted above that the system 100 is preferably implemented with a plurality of EVS image sensors 112 of varying capability. This is, in part, because no single EVS image sensor 112 may exhibit suitable capabilities under all weather conditions. In addition, in most embodiments the computational resources of the system 100 may not be sufficient to simultaneously operate all of the EVS sensors 112, process the image data, and render the captured images. Thus, as FIG. 2 also depicts, the processor 104 selects one or more of the EVS image sensors 112 from which to receive image data, based at least in part on the priority levels of the traffic entities and, at least in some embodiments, on the weather data supplied from the weather data source 114.
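By way of illustration only, the sensor-selection step might be realized as a simple suitability test over each sensor's declared range and weather tolerance. The sensor attributes and the visibility model below are assumptions, not characteristics taken from the description above:

```python
def select_sensor(sensors, target_range_m, visibility_m):
    """Pick the EVS image sensor best suited to the current conditions.

    Each sensor is a dict with assumed keys: 'name', 'max_range_m', and
    'min_visibility_m' (the worst visibility the sensor tolerates).
    """
    candidates = [
        s for s in sensors
        if s["max_range_m"] >= target_range_m
        and s["min_visibility_m"] <= visibility_m
    ]
    if not candidates:
        # Fall back to the longest-range sensor if none fully qualifies.
        return max(sensors, key=lambda s: s["max_range_m"])
    # Prefer the candidate with the most range margin at this distance.
    return max(candidates, key=lambda s: s["max_range_m"])

sensors = [
    {"name": "LWIR", "max_range_m": 8_000, "min_visibility_m": 100},
    {"name": "SWIR", "max_range_m": 12_000, "min_visibility_m": 400},
    {"name": "visible", "max_range_m": 15_000, "min_visibility_m": 3_000},
]
# In 200 m visibility, only the LWIR camera qualifies at 6 km.
print(select_sensor(sensors, target_range_m=6_000, visibility_m=200)["name"])
```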
After the appropriate EVS image sensor 112 is selected, the EVS image sensor 112 supplies image data representative of the traffic entities having high priority levels to the processor 104. An exemplary image that may be captured by the EVS sensor 112 is depicted in FIG. 3.
Thereafter, and as was also noted above, the processor 104 selects a region of interest around at least a portion of the corresponding image coordinates (212). In a preferred embodiment, and as is depicted most clearly in FIG. 4, the region of interest surrounds the entire rendered image of the traffic entity, and at least a portion of the region of interest is highlighted on the EVS display 102.
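By way of a non-limiting example, the highlighted region of interest could be rendered with ordinary OpenCV drawing primitives, as in the Python sketch below; the rectangle-plus-label styling is merely one possible highlighting scheme, and the frame source is assumed:

```python
import cv2

def highlight_roi(frame, roi, label):
    """Draw a highlighted region of interest on a captured EVS frame.

    frame : BGR image (numpy array) from the selected EVS sensor
    roi   : (x0, y0, x1, y1) pixel box from the ROI-selection step
    label : e.g., the traffic entity's flight number from ADS-B
    """
    x0, y0, x1, y1 = (int(c) for c in roi)
    cv2.rectangle(frame, (x0, y0), (x1, y1), (0, 255, 255), 2)
    cv2.putText(frame, label, (x0, max(0, y0 - 8)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 255), 2)
    return frame
```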
With reference now to FIG. 5, an exemplary operational scenario is depicted in which the system 100 provides enhanced situational awareness during simultaneous approaches on parallel runways.
A single system 100 is depicted in FIG. 1 and described herein. It will be appreciated, however, that the system 100 may be deployed either within a vehicle or within a centralized control station.
In addition to the above-described functionality, the visual cues can be further analyzed using advanced image processing techniques to extract additional features. For example, the images captured by individual EVS image sensors 112 may be “mosaicked” or “stitched” together to provide a more comprehensive, seamless view to the pilot. This seamless view may be most important to a pilot flying a curved approach (to a single runway or to parallel runways), during which the pilot may have a limited view of the runway, terrain, and traffic. Moreover, the captured images may be subjected to advanced video analytics, such as object tracking.
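By way of illustration only, off-the-shelf panorama stitching conveys the flavor of the mosaicking operation. The Python sketch below uses OpenCV's high-level Stitcher class; an actual avionics implementation would instead rely on calibrated multi-camera registration:

```python
import cv2

def stitch_evs_frames(frames):
    """Combine overlapping frames from several EVS sensors into one view.

    frames: list of BGR images (numpy arrays) with overlapping coverage.
    Returns the stitched panorama, or None if registration fails.
    """
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    return panorama if status == cv2.Stitcher_OK else None
```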
Although the system 100 and method 200 were described herein as being implemented in the context of an aircraft, they may also be implemented in the context of an air traffic control station. Furthermore, during aircraft ground operations, the visual cues of surrounding aircraft may be up-linked from an aircraft to air traffic control using a suitable data link (e.g., WiMAX) to improve an air traffic controller's situational awareness of ground traffic.
While at least one exemplary embodiment has been presented in the foregoing detailed description of the invention, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the invention, it being understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the invention as set forth in the appended claims.