The present invention relates generally to the field of aircraft navigation and, more particularly, to aircraft navigation systems and display features. Although the present invention particularly relates to aircraft navigation systems and display features, applications could also include marine and terrestrial navigation systems.
Current enhanced vision systems (EVS) are being certified to extend certain non-precision and Category I (CAT I) operations by supplementing natural vision. For example, CAT I conditions allow descent to, at most, 200 ft height above touchdown (HAT) before a go-around decision must be made if the airport environment is not visually acquired by the crew. With an EVS-equipped and certified aircraft, credit is given for the EVS so that, in some conditions, if the crew ‘sees' the airport environment using the enhanced vision system's output, the approach can proceed down to 100 ft HAT before natural vision must acquire the airport environment in order to continue the approach.
Normally, low visibility operations beyond CAT I conditions (such as those described above and enabled by EVS) impose higher demands on the reliability, availability and failure modes of the system being relied upon to perform those operations. These more stringent requirements are typically addressed through system redundancy and/or high design assurance levels. Both solutions come at considerable cost.
What is needed are systems and methods for providing higher enhanced vision system and synthetic vision system (EVS/SVS) integrity without dramatically increasing cost. Higher integrity may advantageously allow special authorization credit even if the EVS does not meet expensive high design assurance levels.
One embodiment of the invention relates to a system for controlling an image displayed on a display unit of an aircraft. The system includes an enhanced vision system that, among other scene features, detects elements of an approach lighting system and provides the detected elements for display on the display unit. The system also includes a synthetic vision system that uses a database of navigation information and a determined aircraft location to generate synthetic display elements representing the approach lighting system. Display electronics of the system cause the detected elements from the enhanced vision system to be simultaneously displayed on the display unit within a same scene as the generated synthetic display elements from the synthetic vision system. Advantageously, the simultaneous display allows a pilot viewing the display unit to check whether information provided by the enhanced vision system matches information provided by the synthetic vision system.
Another embodiment of the invention relates to a method for controlling an image displayed on a display unit of an aircraft. The method includes using an enhanced vision system to detect and optionally enhance elements of an approach lighting system for display on a scene of a display unit. The method further includes using a synthetic vision system and the synthetic vision system's database of navigation information to generate synthetic display elements representing the approach lighting system. The method also includes causing the detected elements from the enhanced vision system to be simultaneously displayed on the display unit within the same scene as the generated synthetic display elements from the synthetic vision system. The synthetic vision system and the enhanced vision system may determine placement of their respective elements on the display unit independently. The synthetic vision system may cause its generated synthetic display elements to be positioned on the scene of the display unit based on aircraft position information and aircraft orientation information obtained from a first set of sensors. In some embodiments, the method may further include using a comparator to automatically compare the graphical distance between the enhanced elements from the enhanced vision system and the generated synthetic display elements from the synthetic vision system. The method may also include causing an indication of a degree of alignment of the enhanced elements and the corresponding elements on the generated synthetic display to be output.
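To make the comparator step concrete, the following is a minimal sketch, assuming each vision system reports the screen positions of its approach lighting elements as pixel coordinates. The function name, the pairing of points, and the tolerance value are illustrative assumptions, not part of the disclosed method.

```python
import math

# Hypothetical sketch of the comparator step: each vision system reports the
# screen positions (in pixels) of the approach lighting elements it placed,
# and the comparator measures how far apart corresponding elements are.
# The function name and tolerance are illustrative, not from the disclosure.

def alignment_score(evs_points, svs_points, tolerance_px=20.0):
    """Return a 0..1 degree of alignment for paired EVS/SVS screen points."""
    if len(evs_points) != len(svs_points) or not evs_points:
        return 0.0
    total = 0.0
    for (ex, ey), (sx, sy) in zip(evs_points, svs_points):
        total += math.hypot(ex - sx, ey - sy)  # graphical (pixel) distance
    mean_distance = total / len(evs_points)
    # Perfect overlap scores 1.0; the score falls off linearly to 0 at tolerance.
    return max(0.0, 1.0 - mean_distance / tolerance_px)

# Example: EVS-detected lights vs. database-generated lights, a few pixels apart.
evs = [(400, 310), (400, 330), (400, 350)]
svs = [(403, 312), (402, 331), (404, 352)]
print(f"degree of alignment: {alignment_score(evs, svs):.2f}")
```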
Another embodiment of the invention relates to computer-readable media including computer executable instructions for completing the following steps: using an enhanced vision system to detect and enhance elements of an approach lighting system for display on a scene of a display unit; using a synthetic vision system and the synthetic vision system's database of navigation information to generate synthetic display elements representing the approach lighting system; and causing the enhanced elements from the enhanced vision system to be simultaneously displayed on the display unit within the same scene as the generated synthetic display elements from the synthetic vision system.
Alternative exemplary embodiments relate to other features and combinations of features as may be generally recited in the claims.
The disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements.
Before turning to the Figures, which illustrate the exemplary embodiments in detail, it should be understood that the disclosure is not limited to the details or methodology set forth in the description or illustrated in the Figures. It should also be understood that the terminology is for the purpose of description only and should not be regarded as limiting.
Referring generally to the Figures, systems and methods for controlling an image displayed on a display unit of an aircraft are shown and described. The system includes an enhanced vision system that detects elements of an approach lighting system and enhances the detected elements for display on the display unit. The system also includes a synthetic vision system that uses a database of navigation information and a determined aircraft location to generate synthetic display elements representing the approach lighting system. Display electronics of the system cause the enhanced elements from the enhanced vision system to be simultaneously displayed on the display unit within a same scene as the generated synthetic display elements from the synthetic vision system. Advantageously, the simultaneous display allows a pilot viewing the display unit to check whether information provided by the enhanced vision system matches information provided by the synthetic vision system.
In conventional systems, low visibility operations beyond CAT I conditions impose higher demands on the reliability, availability and failure modes of the system being relied upon to perform these operations. These more stringent requirements are typically addressed through system redundancy and/or high design assurance levels. Both solutions come at considerable cost.
Embodiments of the systems and methods described herein utilize selected features of the enhanced vision system 204 together with features of the synthetic vision system 202 to provide two independent information threads to a displayed scene. If both systems are already installed on the aircraft, the display electronics or software thereof can advantageously provide the displayed scene without adding another set of vision sensors/electronics. Some enhanced vision systems excel at detecting elements of approach lighting systems (ALS) at or near published approach minima. Synthetic vision system databases contain information for rendering the ALS. Using exemplary embodiments described herein, a "decluttered" or basic synthetic vision system image of a runway or runway features (e.g., an iconic depiction of the location of ALS light sources, a graphical depiction of an extended runway centerline, a graphical depiction of a runway outline, etc.) can be generated by the synthetic vision system and displayed with an enhanced vision system depiction of the same approach features. For example, some embodiments of the present disclosure may overlay a synthetic vision system scene on an enhanced vision system scene, as sketched below. A pilot using embodiments of the present disclosure can readily confirm that the two disparate information sources are providing coherent information. If they are, the information is likely accurate, allowing the pilot to confirm the integrity of the system and to proceed with the approach in accordance with prescribed parameters. In some countries, airport/aircraft regulations may be revised to allow an approach with pilot-confirmed "dual threaded" information source conformity to descend to a lower altitude before visual acquisition of the airport environment is required. Exemplary systems and methods of the present application advantageously improve an aircraft's operational envelope in low visibility conditions without the need to deploy expensive conventional "high-integrity" systems.
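As a purely illustrative sketch of the overlay idea, the snippet below composites a decluttered set of SVS symbols onto an EVS frame modeled as a simple pixel buffer. The `OverlaySymbol` structure, the buffer dimensions, and all values are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of overlaying decluttered SVS symbology on an EVS frame.
# The EVS frame is modeled as a 2-D grayscale buffer and the SVS layer as a
# short list of iconic primitives; the class and field names are invented.

@dataclass
class OverlaySymbol:
    kind: str              # e.g., "als_light", "centerline", "runway_outline"
    points: list = field(default_factory=list)  # screen-space (x, y) vertices

def composite(evs_frame, svs_symbols):
    """Stamp SVS symbols into a copy of the EVS frame (one scene, two sources)."""
    scene = [row[:] for row in evs_frame]
    for symbol in svs_symbols:
        for x, y in symbol.points:
            if 0 <= y < len(scene) and 0 <= x < len(scene[0]):
                scene[y][x] = 255  # draw SVS symbology at full intensity
    return scene

evs_frame = [[0] * 64 for _ in range(48)]  # stand-in for a sensed EVS image
symbols = [OverlaySymbol("als_light", [(32, 20), (32, 24), (32, 28)]),
           OverlaySymbol("centerline", [(32, y) for y in range(30, 48)])]
scene = composite(evs_frame, symbols)  # both information threads, same scene
```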
In an exemplary embodiment, the synthetic vision system 202 and the enhanced vision system 204 determine placement on a shared scene of a display unit independently. For example, the synthetic vision system 202 can be configured to cause its generated synthetic display elements to be positioned on the scene of the display unit based on aircraft position information and aircraft orientation information obtained from a first set of sensors. While the synthetic vision system 202 is causing its generated synthetic display elements to be positioned on the scene using the first set of sensors, the enhanced vision system 204 causes its enhanced detected elements to be positioned in the scene of the display unit based on information from a second set of sensors. In some embodiments, the first set of sensors and the second set of sensors do not share any one sensor.
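The independence of the two placement paths can be sketched as follows, under simplifying assumptions: a north-east-down (NED) frame, a heading-only rotation (roll and pitch omitted for brevity), a pinhole projection, and hypothetical function names. The SVS path projects a known database position through the aircraft pose from the first sensor set, while the EVS path reports where its own imaging sensor detected the element.

```python
import math

# Hypothetical sketch of the two independent placement paths. The SVS path
# projects a database position through the aircraft pose from the first
# sensor set; the EVS path reports where its own imaging sensor detected the
# element. The NED convention and heading-only rotation are simplifications.

def svs_place(light_ned, aircraft_ned, heading_rad, focal_px=800.0):
    """Project a database light position (north, east, down) to screen x, y."""
    dn = light_ned[0] - aircraft_ned[0]
    de = light_ned[1] - aircraft_ned[1]
    dd = light_ned[2] - aircraft_ned[2]
    # Rotate the offset into the body frame (heading only), then project.
    forward = dn * math.cos(heading_rad) + de * math.sin(heading_rad)
    right = -dn * math.sin(heading_rad) + de * math.cos(heading_rad)
    return focal_px * right / forward, focal_px * dd / forward  # pinhole model

def evs_place(detection):
    """EVS path: the imaging sensor already reports screen coordinates."""
    return detection["x_px"], detection["y_px"]

# A light 1000 m ahead of and 350 m below the aircraft, dead on the nose:
print(svs_place((1000.0, 0.0, 50.0), (0.0, 0.0, -300.0), heading_rad=0.0))
print(evs_place({"x_px": 2.0, "y_px": 281.5}))  # independent second thread
```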
The synthetic display elements provided by information from the synthetic vision system may be generated using database information corresponding with the aircraft position information and the aircraft orientation information obtained from the first set of sensors. The aircraft position information may be derived from a satellite-based positioning system such as the global positioning system (GPS) or another global navigation satellite system (GNSS).
The SVS 202 includes or is coupled to an SVS database 206 and a processor 203. The SVS database 206 may be or include a terrain database used to create a three-dimensional perspective of the scene in front of the aircraft (e.g., on a two-dimensional display unit). In an exemplary embodiment, the SVS database 206 includes information regarding approach lighting systems for a plurality of airports. The SVS database 206 may also include terrain information. Accordingly, the SVS 202 may provide a computer-generated scene including synthetic display elements of an approach lighting system and terrain surrounding the approach lighting system. The terrain surrounding the approach lighting system may include an outline of significant terrain aspects as well as a computer-generated representation of the runway or other airport structures. Using SVS 202, the aircraft can provide the pilot with a computer-generated approximation of an approach scene even when visibility of the actual scene is limited or obscured (e.g., by smoke, sand, fog, clouds, rain, snow, etc.).
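A minimal sketch of what an SVS database record for an approach lighting system might look like is given below. The schema, the field names, and the fictional runway key are assumptions for illustration only; an actual terrain/airport database would follow its own format.

```python
# Hypothetical sketch of an SVS database record for an approach lighting
# system. The schema, field names, and fictional runway key are invented.

als_database = {
    "KXYZ_RW09": {                                        # fictional runway
        "runway_threshold": (39.0000, -94.0000, 312.0),   # lat, lon, elev (m)
        "runway_heading_deg": 90.0,
        "als_type": "ALSF-2",
        "als_lights": [(-30.0, 0.0), (-60.0, 0.0), (-90.0, 0.0)],  # m from threshold
    },
}

def als_lights_for(runway_key):
    """Look up the layout used to render synthetic ALS display elements."""
    record = als_database[runway_key]
    return record["als_lights"], record["runway_threshold"]

lights, threshold = als_lights_for("KXYZ_RW09")
```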
SVS 202 generally, and SVS processor 203 more particularly, may be programmed to receive aircraft position information from the GPS receiver 208 as input.
Processors 203 and 205 may be or include a microprocessor, microcontroller, application-specific integrated circuit (ASIC), or other circuitry configured to perform or support the processes described in the present application for each processor's respective parent circuit (e.g., SVS 202, EVS 204). SVS 202 and EVS 204 may be or include computer systems (e.g., embedded computer systems, real-time operating systems, etc.) that use their respective processors 203, 205 to receive inputs, analyze the inputs, and generate and output appropriate outputs.
SVS 202 and EVS 204 may include memory devices (e.g., RAM, ROM, Flash memory, hard disk storage, etc.) for storing data and/or computer code for completing and/or facilitating the various processes described in the present disclosure. The memory devices may be or include volatile memory or non-volatile memory. The memory devices may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures of the present disclosure. The memory devices may also or alternatively be configured to store buffered or generated images (e.g., bitmaps, compressed images, video data, etc.) for outputting to display electronics 212. In other embodiments, generated enhanced vision elements or generated synthetic vision elements are created and output from SVS 202 or EVS 204 in a non-image form for parsing by display electronics 212. For example, SVS 202 may send a mark-up language description of objects for display to display electronics 212. Display electronics 212 may parse the mark-up language and generate display objects/content based on the mark-up language. SVS 202, EVS 204, processor 203, and processor 205 may each include memory having computer code for execution by a processing circuit (e.g., processor 203, processor 205, or another processor of SVS 202 or EVS 204).
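For example, the mark-up exchange might resemble the following sketch, in which the element vocabulary (`als_light`, `runway_centerline`) and attribute names are invented for illustration; the disclosure does not prescribe a particular mark-up language.

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch of a non-image, mark-up exchange between SVS 202 and
# display electronics 212. The element and attribute names are invented.

markup = """
<scene source="SVS">
  <als_light x="412" y="318"/>
  <als_light x="412" y="334"/>
  <runway_centerline x1="412" y1="340" x2="412" y2="470"/>
</scene>
"""

def parse_display_objects(text):
    """Parse the mark-up into drawable objects for the display electronics."""
    objects = []
    for element in ET.fromstring(text):
        attrs = {key: float(value) for key, value in element.attrib.items()}
        objects.append({"type": element.tag, **attrs})
    return objects

for obj in parse_display_objects(markup):
    print(obj)  # e.g., {'type': 'als_light', 'x': 412.0, 'y': 318.0}
```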
HUD 295 may be a head-up display of any suitable technology of the past, present, or future. In an exemplary embodiment, HUD 295 may be a projection/combiner type HUD. Electronics for driving the HUD 295 may be a part of HUD 295, a part of display electronics 212, or distributed among HUD 295 and display electronics 212.
Head down display (HDD) 20 may be a cathode ray tube (CRT), LCD, OLED, or a display unit of another display technology. Electronics for driving the HDD 20 may be a part of HDD 20, a part of display electronics 212, or distributed among HDD 20 and display electronics 212.
Gyroscope system 210 may be or include any suitable gyroscope system for determining aircraft orientation information (e.g., tilt angle, yaw angle, pitch angle, etc.). Outputs from gyroscope system 210 may be provided directly to SVS 202, processed by other flight systems 211, or otherwise provided to the other components described herein.
The processor 215 of display electronics 212 may include or access a memory device of display electronics 212. The processor 215 may execute computer code stored in the memory device to complete the activities described herein with respect to the display electronics. In an exemplary embodiment, display electronics 212 uses a coupled memory device to buffer the enhanced detected elements received from the enhanced vision system and to buffer the synthetic display elements from the synthetic vision system. The display electronics 212 generally, and processor 215 more particularly, may then cause the elements to be shown on the same scene of one of the display units in the aircraft (e.g., one of displays 20, 295, etc.). One or more processing tasks may be conducted by processor 215 in order for the respective display elements of the SVS 202 and EVS 204 to be shown in the same scene. For example, display electronics 212 or its processor 215 may synchronize data received from SVS 202 and EVS 204 such that the timing matches. In an exemplary embodiment, for example, display electronics 212 may provide a clock signal to the SVS 202 and its components and the same clock signal to the EVS 204 for use in synchronizing the SVS 202, EVS 204, and the display electronics 212.
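One way such buffering and synchronization could be sketched, assuming frames are stamped with ticks of the shared clock signal, is shown below. The class name, buffer depth, and skew tolerance are illustrative assumptions.

```python
from collections import deque

# Hypothetical sketch of timing alignment in the display electronics: frames
# from each vision system are buffered with a tick of the shared clock signal,
# and the scene is composed from the freshest pair whose ticks agree.

class FramePairer:
    def __init__(self, max_skew_ticks=2):
        self.svs_buffer = deque(maxlen=8)   # (tick, frame) tuples
        self.evs_buffer = deque(maxlen=8)
        self.max_skew = max_skew_ticks

    def push_svs(self, tick, frame):
        self.svs_buffer.append((tick, frame))

    def push_evs(self, tick, frame):
        self.evs_buffer.append((tick, frame))

    def latest_pair(self):
        """Return the freshest SVS/EVS frame pair whose clock ticks agree."""
        for s_tick, s_frame in reversed(self.svs_buffer):
            for e_tick, e_frame in reversed(self.evs_buffer):
                if abs(s_tick - e_tick) <= self.max_skew:
                    return s_frame, e_frame
        return None  # no sufficiently synchronized pair buffered yet

pairer = FramePairer()
pairer.push_svs(100, "svs_frame_100")
pairer.push_evs(101, "evs_frame_101")
print(pairer.latest_pair())  # -> ('svs_frame_100', 'evs_frame_101')
```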
While the EVS and SVS elements are detected or generated independently, a process may run (e.g., on the display electronics) that estimates whether the elements are aligned. If the elements are aligned, the display electronics may output an indication of the alignment determination. For example, the display electronics may determine whether or not to populate a check box that indicates whether the SVS and EVS elements are aligned.
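A minimal sketch of that indication logic, assuming the alignment estimate is normalized to a 0..1 score and using an illustrative threshold, might look like this:

```python
# Hypothetical sketch of the alignment indication: the check box is populated
# only when the estimated degree of alignment clears a threshold. The 0..1
# normalization and the threshold value are illustrative assumptions.

def alignment_indication(degree_of_alignment, threshold=0.8):
    """Map a 0..1 alignment estimate to a displayed check-box state."""
    checked = degree_of_alignment >= threshold
    label = "EVS/SVS ALIGNED" if checked else "EVS/SVS NOT ALIGNED"
    return {"checkbox_populated": checked, "label": label}

print(alignment_indication(0.93))  # populated check box
print(alignment_indication(0.41))  # unpopulated check box
```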
The construction and arrangement of the systems and methods as shown in the various exemplary embodiments are illustrative only. Although only a few embodiments have been described in detail in this disclosure, many modifications are possible. All such modifications are intended to be included within the scope of the present disclosure. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions and arrangement of the exemplary embodiments without departing from the scope of the present disclosure.
Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
It should be noted that although the figures may show a specific order of method steps, the order of the steps may differ from what is depicted. Also, two or more steps may be performed concurrently or with partial concurrence. Such variations will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques using rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps and decision steps.