The present application is related to application Ser. No. 11/851,323, filed on Sep. 6, 2007, and Ser. No. 11/904,491, filed on Sep. 26, 2007, both of which are herein incorporated by reference in their entireties. The present application is also related to: U.S. patent application Ser. No. 13/241,051, entitled “DUAL THREADED SYSTEM FOR LOW VISIBILITY OPERATIONS” by Tiana et al. ('780); U.S. patent application Ser. No. 12/263,282, entitled “SYSTEM AND METHOD FOR GROUND NAVIGATION” by McCusker et al. ('568); and U.S. patent application Ser. No. 12/180,293, entitled “SYSTEM AND METHOD FOR AIRCRAFT ALTITUDE MEASUREMENT USING RADAR AND KNOWN RUNWAY POSITION” by Woodell et al. ('550), all of which are herein incorporated by reference in their entireties and assigned to the assignees of the present application.
The present disclosure relates generally to weather radar data. More specifically, the disclosure relates to an apparatus and method for displaying weather radar data.
Displays are used in head down display (HDD) systems, head up display (HUD) systems and wearable displays, such as helmet mounted display (HMD) systems. In aircraft applications, HUD, HDD and HMD systems advantageously display information from aircraft systems and sensors in a graphical and alphanumeric format. The display information can include an enhanced vision image from a camera or other imaging sensor (such as a visible light imaging sensor, infrared imaging sensor, millimeter wave radar imager, etc.) and/or a synthetic vision image from a synthetic vision computer in certain applications. The enhanced vision image can be merged with a synthetic vision image to provide a single image to the pilot.
Enhanced vision systems (EVS) have certain disadvantages. For example, enhanced vision systems add extra weight and cost to the aircraft. In addition, enhanced vision sensors are not always able to sense objects or terrain through all types of weather, and the camera or other sensor typically requires a noticeable bump in the nose of the aircraft where it is installed.
FAA-certified enhanced flight vision systems can allow pilots landing under instrument flight rules to operate below certain specified altitudes during instrument approaches even when the airport environment is not visible. Conventional synthetic vision systems (SVS) cannot provide enhanced flight visibility; in particular, they cannot show a real-time image of an aircraft, vehicle or other obstacle on the runway during an impending landing. Although SVS has been approved for flying an instrument approach procedure, SVS has not been approved for operations below authorized decision height (DH) or minimum descent altitude (MDA). The use of an integrity monitor for an SVS may allow for higher design assurance levels, which could lead to the use of monitored SVS for lower landing minimum credit (e.g., a pilot with a monitored SVS display system could land where a pilot with a non-monitored SVS would not otherwise be allowed to land due to current low visibility runway visual range (RVR) restrictions). Accordingly, there have been proposals to provide a monitor for an SVS based upon various instruments. However, the use of additional equipment to provide an integrity monitor for the SVS can add to the cost and weight of the aircraft.
Weather radar systems can provide images of an environment outside the aircraft. However, such images are only available after a radar sweep is completed, which can require a significant period of time. Accordingly, images from weather radar systems are not conventionally used to replace real time images from an EVS.
Accordingly, there is a need for systems for and methods of displaying images derived from weather data. There is still a further need for systems for and methods of providing real time images derived from weather data. Yet further, there is a need for a HUD that includes a merged SVS image and a real time image derived from weather radar data, or that provides a real time image derived from weather radar data. There is also a need for a system for and method of providing an integrity check for an SVS without use of additional systems. There is also a need for systems for and methods of providing integration of a synthetic image and an image based upon weather data.
It would be desirable to provide a system and/or method that provides one or more of these or other advantageous features. Other features and advantages will be made apparent from the present specification. The teachings disclosed extend to those embodiments which fall within the scope of the appended claims, regardless of whether they accomplish one or more of the aforementioned needs.
One embodiment of the disclosure relates to an apparatus for use with an aircraft weather radar system having a radar antenna. The apparatus comprises processing electronics configured to provide image data for an image derived from radar return data from the weather radar system. The radar return data is updated at a first frequency. The processing electronics are configured to update the image data at a second frequency greater than the first frequency.
The image data can be used instead of data from an enhanced vision sensor in one embodiment. In one embodiment, a weather radar system can eliminate the need for an enhanced vision sensor.
Another embodiment of the disclosure relates to a method of displaying a terrain image on an electronic display using radar return data from a weather radar system on an aircraft. The method comprises receiving the radar return data from the weather radar system at a first time, and providing image data in response to the radar return data. The method also includes adjusting the image data in accordance with aircraft movement between the first time and a second time, and displaying the terrain image in response to the image data.
A further embodiment of the disclosure relates to an apparatus for use with a weather radar system. The apparatus comprises means for generating a first frame from weather radar data from the weather radar system, and means for providing a second frame based upon the first frame. The first frame is representative of terrain sensed by the weather radar system. The second frame is provided in accordance with movement of the aircraft since the time of reception of the weather radar data.
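By way of a non-limiting illustration, the two frames described above can be sketched in Python as follows; the names, types and the simple along-track shift are illustrative assumptions only, with the fuller rotation/translation treatment described below in connection with conformal data interpolation module 24.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class RadarFrame:
    """Terrain image frame derived from weather radar return data."""
    pixels: np.ndarray  # 2D top-down terrain image from a completed sweep
    timestamp: float    # time (s) at which the underlying sweep completed

def second_frame(first: RadarFrame, now: float,
                 ground_speed_px_per_s: float) -> RadarFrame:
    """Provide a second frame from the first frame, adjusted for aircraft
    movement since the weather radar data was received (illustrative only)."""
    rows = int(round(ground_speed_px_per_s * (now - first.timestamp)))
    # Shift the image opposite the direction of flight; np.roll wraps pixels
    # around, so a real implementation would pad the vacated rows instead.
    return RadarFrame(np.roll(first.pixels, -rows, axis=0), now)
```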
The disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying drawings.
Before describing in detail the particular improved system and method, it should be observed that the invention resides primarily in a novel structural combination of conventional data/signal processing components and communications circuits, and not in the particular detailed configurations thereof. Accordingly, the structure, methods, functions, control and arrangement of conventional components, software, and circuits have, for the most part, been illustrated in the drawings by readily understandable block representations and schematic diagrams, in order not to obscure the disclosure with structural details which will be readily apparent to those skilled in the art having the benefit of the description herein. Further, the invention is not limited to the particular embodiments depicted in the exemplary diagrams, but should be construed in accordance with the language in the claims.
Referring to the figures, an aircraft control center 11 for an aircraft includes flight displays 20 and a combiner 21.
Flight displays 20 can be used to provide information to the flight crew, thereby increasing visual range and enhancing decision-making abilities. In an exemplary embodiment, flight displays 20 and combiner 21 can include a weather display, a joint display, a weather radar map and a terrain display. Further, flight displays 20 may include images from a synthetic vision system (SVS). For example, flight displays 20 can include a display configured to display a three dimensional perspective image of terrain and/or weather information. Other views of terrain and/or weather information may also be provided (e.g., plan view, horizontal view, vertical view, etc.). Additionally, flight displays 20 can be implemented using any of a variety of display technologies, including CRT, LCD, organic LED, dot matrix display, and others. Aircraft control center 11 preferably includes a combiner 21 associated with a head up display system. Conformal images are provided on combiner 21. Center 11 can also include worn displays, such as helmet mounted displays (HMD), etc.
According to an exemplary embodiment, at least one of displays 20 or combiner 21 is configured to provide an image of terrain derived from weather radar data. In one embodiment, at least one of displays 20 or combiner 21 displays a merged image of terrain derived from radar data and SVS data. Advantageously, the image derived from the radar data is updated at a greater frequency than the radar data is received, providing a real time rendition of the image derived from radar data in one embodiment. The updated data also advantageously allows matching and/or merging with the SVS data, which is provided at a faster rate than the radar data in one embodiment.
Flight displays 20 and/or combiner 21 can be configured to provide an indication to a flight crew as to whether the terrain features associated with the radar data and SVS data displayed on the electronic display are correct or incorrect. In one embodiment, such an indication notifies the crew whether the integrity of the SVS is sufficient, possibly permitting lower authorized decision heights and minimum descent altitudes in low visibility conditions.
Referring to the figures, a display system 10 can be utilized with a weather radar system 102 and an SVS 111.
Radar system 102 is preferably a weather radar system, such as a Multiscan™ radar system from Rockwell Collins, Inc., generally located inside the nose of the aircraft, inside a cockpit of the aircraft, on the top of the aircraft or on the tail of the aircraft. Radar system 102 can include a radar antenna 12 and a processor 15. Radar system 102 can utilize a split or half aperture or other technique for obtaining radar data associated with terrain in one embodiment. The type of radar system 102 and its data gathering techniques are described in a non-limiting fashion.
Radar system 102 preferably includes a computer system or processor 15 including radar data 17. Processor 15 receives radar returns (e.g., weather radar return data) from radar antenna 12, processes the radar returns and provides data stored in radar data 17. The data stored in memory preferably includes an image frame representing terrain.
Display system 10 preferably includes a memory 54 for receiving an image frame derived from radar data 17 and a memory 52 for receiving an image frame from SVS 111. Alternatively, the radar return data can be directly communicated to system 10, and system 10 can provide the image frame derived from radar data in memory 54. In one embodiment, system 10 includes a processor 25 coupled to a head down display (HDD) 32 and a head up display (HUD) 34. HUD 34 can include a HUD computer, a projector and combiner 21.
Processor 25 includes an image merge control configuration module 38, an image merge function module 26, and a conformal data interpolation module 24. Modules 24, 26, and 38 can be implemented in software and can be executed on a computing platform including processor 25. Modules 24, 26, and 38 can be stored on a non-transitory medium.
According to one embodiment, conformal data interpolation module 24 adjusts the frame derived from weather radar data in accordance with updated real time movement of the aircraft derived from sensors. According to one embodiment, the frame associated with terrain sensed by radar system 102 is rotated and/or translated in accordance with aircraft position data (e.g., pitch, roll, speed, etc.) multiple times per second (e.g., at 20 Hz) to interpolate a single five second terrain data radar sweep (frame) and to compute, for each real time 20 Hz video frame, a frame that is merged with the real time SVS video frame.
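A minimal sketch of one such interpolation step is given below, assuming a north-up, top-down terrain frame, a heading rate expressed in degrees per second and SciPy's ndimage routines; the function name, parameters and sign conventions are illustrative assumptions rather than a required implementation.

```python
import numpy as np
from scipy import ndimage

def interpolate_frame(sweep_frame: np.ndarray, dt: float,
                      ground_speed_m_s: float, heading_rate_deg_s: float,
                      px_per_meter: float) -> np.ndarray:
    """Rotate and translate a radar-derived terrain frame to the aircraft's
    estimated position dt seconds after the sweep completed (run at ~20 Hz)."""
    # Rotate about the image center by the heading change since the sweep;
    # ndimage.rotate takes the angle in degrees.
    rotated = ndimage.rotate(sweep_frame, angle=heading_rate_deg_s * dt,
                             reshape=False, order=1, mode="constant")
    # Translate opposite the along-track motion (rows assumed along-track);
    # vacated pixels are filled with a constant until the next sweep arrives.
    rows = ground_speed_m_s * dt * px_per_meter
    return ndimage.shift(rotated, shift=(-rows, 0.0), order=1, mode="constant")
```

Because the adjustment uses only aircraft position data already available on board, each updated frame can, in this sketch, be produced once per 0.05 second display period.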
System 10 can also include an SVS for credit monitor 18 in one embodiment. SVS for credit monitor 18 can receive signals from various aircraft sensors, including an instrument landing system (ILS), radio altimeters, an inertial navigation system (INS) and/or other sensors. Monitor 18 can provide an indication of the result of an integrity check for display on HDD 32 and HUD 34.
Radar system 102 is preferably a weather radar system and provides weather radar data to form an image of terrain derived from the weather radar data for reception by memory 54. System 102 can provide the frame or data for the image in radar data 17 and communicate it to memory 54. Alternatively, processor 25 can build the frame or image based upon radar return data from system 102. Similarly, SVS 111 can provide data or a frame for an SVS image received by memory 52. Alternatively, system 10 can provide the data or image frame to memory 52 in response to data from SVS 111.
In one embodiment, system 102 provides data representing a 120 degree field of view in accordance with a weather radar sweep that takes approximately five seconds to complete. The sweep can be limited during approach to a 30 degree sweep, which requires five seconds before new data is available for display in certain embodiments. The sweep is directed toward the surface of the Earth so that returns are obtained which allow a terrain image to be created. Various types of sweeps, scans and timings of sweeps and scans can be utilized without departing from the scope of the invention.
According to one embodiment, SVS 111 can be any electronic system or device for providing a computer generated image of the external scene topography. The image can be from the perspective of the aircraft flight deck as derived from aircraft attitude, high-precision navigation solutions, and a database of terrain, obstacles and relevant cultural features. Generally, only those terrain, obstacle, and runway features which are contained in the current version of the SVS database are displayed in a conventional system. The pilot uses the synthetic vision images as enhancements to available visual cues.
Preferably, system 10 combines or fuses images from SVS 111 and the images derived from radar data from system 102 to optimize the overall image provided to the pilot. Preferably, the images are fused in a format that utilizes the best features of SVS 111 and the images derived from radar data from system 102.
SVS 111 can include a terrain database and a processor according to one exemplary embodiment. The terrain database can be used to create a three-dimensional perspective of the scene in front of the aircraft on a two-dimensional display or a three dimensional display. The database can employ topographical colors similar to those depicted on standard aeronautical charts. Furthermore, SVS 111 is capable of identifying segments of image data corresponding to various objects in the computer generated image, such as runways, terrain and sky.
SVS 111 can also receive aircraft position data from an aircraft data source, such as the source used by module 24 or other equipment. The aircraft data source can include any system or sensor (or combination thereof) that provides navigation data or aircraft flight parameters. For example, a typical navigation system in an aircraft comprises numerous sub-systems. Sub-systems which provide aircraft position data and flight parameter data could include, but are not limited to, an inertial navigation system (INS), a global navigation satellite system (e.g., global positioning system (GPS)), air data sensors, compasses, and a flight management computing system.
In one embodiment, the SVS image frame from SVS 111 is updated at a frequency of 20 Hz. In a preferred embodiment, individual frames of the SVS image are provided every 0.05 seconds and are received in memory 52 or by processor 25. The image frame or data derived from radar returns received by memory 54 is updated at approximately 0.2 Hz in one embodiment. A new image frame or data associated with the image is provided to processor 25 or memory 54 every five seconds in one embodiment. The timing provided above is exemplary only. Image frames from SVS 111 and derived from system 102 can be provided at different times, frequencies or rates.
According to one embodiment, the frequency of image frames received by memory 54 is slower than the frequency of image frames received by memory 52. Accordingly, processor 25, using conformal data interpolation module 24, updates the image frame derived from weather radar data between radar updates. In one embodiment, conformal data interpolation module 24 receives pitch, speed and roll information to update the image frame with respect to new positions of the aircraft.
With reference to the figures, an image frame w1 derived from radar data from system 102 is received at a first time, and a next image frame w101 derived from radar data is received approximately five seconds later.
As shown in the figures, image frames s1-s101 from SVS 111 are received every 0.05 seconds during that interval, and conformal data interpolation module 24 provides updated images w2-w100 between reception of image frame w1 and image frame w101.
Accordingly, conformal data interpolation module 24 can update the output of radar system 102 to temporally match the output of SVS 111 on an image-for-image basis in one embodiment. Alternatively, other update ratios can be utilized. For example, updates by module 24 can occur at half, or one quarter of, the rate at which images from SVS 111 are received. The timing shown in the figures is exemplary only.
The updated images w2-w100 are provided from conformal data interpolation module 24 to image merge function module 26, where updated images w2-w100 are merged with images s2-s100 from SVS 111. The merged images are displayed on displays 20 and/or combiner 21. Advantageously, updated images w2-w100 derived from radar data allow EVS functionality to be replaced, such that the images w1-w101 derived from radar data can effectively be used by credit monitor 18 for HUD 34, thereby allowing the use of lower landing minima with SVS 111 without an EVS according to one embodiment. Further, applicants have found that the weather radar can effectively sense terrain through weather conditions, so that images w1-w101 derived from weather radar data provide an effective real time representation of the outside environment.
With reference to the figures, SVS for credit monitor 18 compares images derived from radar data from system 102 with images from SVS 111 to perform an integrity check.
Various matching algorithms can be utilized to determine if the SVS data images s1 and s101 are accurate when compared to the real time data images w1 and w101 provided by system 102. In one embodiment, monitor 18 only receives images w1 and w101 from system 102, as opposed to updated images w2-w100 (e.g., only images w1, w101, etc. are compared to images s1, s101, etc.).
The compare function generally looks for specific terrain features (peaks and valleys) and performs such comparisons on a regular, periodic schedule (e.g., every couple of seconds). In one embodiment, terrain features, such as runways, can be matched. Runways are readily ascertainable from radar data. If the compare function indicates that there is not a match, monitor 18 indicates that lower minimum approaches are not to be allowed, as there may be a database error or an error with system 102 or SVS 111. The indication can be provided on combiner 21 associated with HUD 34 or on displays 20.
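By way of a non-limiting illustration, one way such a compare function could be implemented is as a normalized correlation between the radar-derived frame and the corresponding SVS frame, with a threshold deciding match or mismatch; the threshold value and names below are illustrative assumptions.

```python
import numpy as np

def integrity_check(radar_img: np.ndarray, svs_img: np.ndarray,
                    threshold: float = 0.7) -> bool:
    """Compare a radar-derived terrain frame (e.g., w1) against the SVS frame
    rendered for the same time (e.g., s1); True indicates a match."""
    # Standardize both frames so brightness/contrast differences do not
    # dominate the comparison of terrain features (peaks and valleys).
    a = (radar_img - radar_img.mean()) / (radar_img.std() + 1e-9)
    b = (svs_img - svs_img.mean()) / (svs_img.std() + 1e-9)
    # Normalized correlation approaches 1.0 when terrain features line up;
    # the 0.7 threshold here is an illustrative assumption, not a standard.
    score = float((a * b).mean())
    return score >= threshold
```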
Applicants believe that use of monitor 18 and system 10 may allow for a reduction in minima from a 200 foot decision height to a 100 foot decision height. An additional benefit of the system may be its use in low visibility taxi scenarios.
With reference to the figures, display system 10 can perform a process 100 for displaying a terrain image derived from weather radar data.
Steps 104, 106, 108, 110 and 112 are performed in a loop at 20 Hz in one embodiment. Steps 102 and 104 are performed in a loop at 0.2 Hz according to one embodiment.
A new frame (e.g., w101) is captured at step 101. The frame captured at step 101 is set as a current frame and the frame (e.g., w100) created at step 110 is discarded.
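A non-limiting sketch of this loop structure follows, with a new radar frame captured every five seconds (approximately 100 display frames at 20 Hz) and the interpolate, merge and display steps repeated at 20 Hz; the callable parameters stand in for the modules described above and are illustrative assumptions.

```python
import time

RADAR_PERIOD_S = 5.0     # new radar frame roughly every 5 s (0.2 Hz)
DISPLAY_PERIOD_S = 0.05  # display/SVS update every 0.05 s (20 Hz)

def run_display_loop(capture_radar_frame, get_aircraft_motion,
                     interpolate, get_svs_frame, merge, display):
    """Illustrative loop: the injected callables stand in for radar system
    102, the aircraft data source, module 24, SVS 111, module 26 and the
    displays, respectively."""
    current = capture_radar_frame()          # e.g., frame w1
    last_capture = time.monotonic()
    while True:
        now = time.monotonic()
        if now - last_capture >= RADAR_PERIOD_S:
            current = capture_radar_frame()  # e.g., w101 replaces w100
            last_capture = now
        # 20 Hz inner loop: adjust the radar frame for aircraft movement,
        # merge it with the current SVS frame, and display the result.
        updated = interpolate(current, get_aircraft_motion(now - last_capture))
        display(merge(updated, get_svs_frame()))
        time.sleep(DISPLAY_PERIOD_S)
```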
In one embodiment, differences between the frame provided at step 110 and the frame provided at step 104 can be highlighted for the pilot. Such differences may be an indication of weather changes or traffic changes.
Image merge control configuration module 38 can provide format adjustments to data. SVS 111 and system 102 can each have their own specific interface type and format. Also, each of displays 20 or combiner 21 may require specific formatting. A standard format can be a format used in HUD processing functions. Module 38 can be implemented in hardware, software, or combinations thereof.
Processor 25 preferably executes a fusion processing algorithm in module 26. The frames from SVS 111 can be provided as a video signal, and the updated frames from system 102 can be provided as a video signal for fusion processing.
In one embodiment, the fusion processing algorithm fuses the frames from memories 52 and 54 provided as video signals. This fusion process may include special formatting (positioning, sizing, cropping, etc.) of specific features or of the entire image from a specific image source based on other sensor inputs or aircraft data. After the combined or fused image has been completed, the entire image is sized to fit appropriately within the total HUD field-of-view and to conformally overlay the outside scene viewed through the HUD. In addition, the contrast of the overall fused image is standardized to support the brightness/contrast controls of the HUD. Alternatively, images w1-w101 can be presented on displays 20 and combiner 21 without merging with images s1-s101.
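By way of a non-limiting illustration, one such fusion step can be sketched as a weighted per-pixel blend followed by a contrast standardization; the fixed blend weight and the 0 to 1 output range are illustrative assumptions.

```python
import numpy as np

def fuse_frames(radar_frame: np.ndarray, svs_frame: np.ndarray,
                radar_weight: float = 0.5) -> np.ndarray:
    """Blend an updated radar-derived frame with the same-sized SVS frame
    and standardize the result for the HUD's brightness/contrast controls."""
    fused = radar_weight * radar_frame + (1.0 - radar_weight) * svs_frame
    # Stretch the fused image to a standard 0..1 range so downstream
    # brightness/contrast controls behave consistently frame to frame.
    lo, hi = fused.min(), fused.max()
    return (fused - lo) / (hi - lo + 1e-9)
```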
Processor 25 can be any hardware and/or software processor or processing architecture capable of executing instructions and operating on navigational data. Processor 25 can be capable of determining navigational information such as altitude, heading, bearing, and location based on data from aircraft sensors. Applicants note that process 100 and conformal data interpolation can be performed in various equipment on the aircraft, including a HUD computer, a display processor, radar system 102, a navigation system, SVS 111, etc.
Radar system 102 can also be used to detect weather patterns in the vicinity of the aircraft. Further, radar system 102 can provide weather related information on combiner 21 or displays 20.
While the detailed drawings, specific examples, detailed algorithms, and particular configurations given describe preferred and exemplary embodiments, they serve the purpose of illustration only. The inventions disclosed are not limited to the specific forms shown. For example, the methods may be performed in any of a variety of sequences of steps or according to any of a variety of mathematical formulas. The hardware and software configurations shown and described may differ depending on the chosen performance characteristics and physical characteristics of the radar and processing devices. For example, the type of system components and their interconnections may differ. The systems and methods depicted and described are not limited to the precise details and conditions disclosed. The flow charts show preferred exemplary operations only. The specific data types and operations are shown in a non-limiting fashion. For example, the scope of the claims is intended to cover any technique that uses a selectable fractional aperture unless literally delineated from the claims. Furthermore, other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the exemplary embodiments without departing from the scope of the invention as expressed in the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
4914436 | Bateman et al. | Apr 1990 | A |
5047779 | Hager | Sep 1991 | A |
5839080 | Muller et al. | Nov 1998 | A |
5920276 | Frederick | Jul 1999 | A |
5945926 | Ammar et al. | Aug 1999 | A |
5978715 | Briffe et al. | Nov 1999 | A |
6064942 | Johnson et al. | May 2000 | A |
6092009 | Glover | Jul 2000 | A |
6112141 | Briffe et al. | Aug 2000 | A |
6122570 | Muller et al. | Sep 2000 | A |
6128553 | Gordon et al. | Oct 2000 | A |
6138060 | Conner et al. | Oct 2000 | A |
6150901 | Auken | Nov 2000 | A |
6154151 | McElreath et al. | Nov 2000 | A |
6163021 | Mickelson | Dec 2000 | A |
6166661 | Anderson et al. | Dec 2000 | A |
6169770 | Henely | Jan 2001 | B1 |
6178391 | Anderson et al. | Jan 2001 | B1 |
6194980 | Thon | Feb 2001 | B1 |
6199008 | Aratow et al. | Mar 2001 | B1 |
6201494 | Kronfeld | Mar 2001 | B1 |
6219592 | Muller et al. | Apr 2001 | B1 |
6233522 | Morici | May 2001 | B1 |
6259400 | Higgins et al. | Jul 2001 | B1 |
6266114 | Skarohlid | Jul 2001 | B1 |
6281832 | McElreath | Aug 2001 | B1 |
6285298 | Gordon | Sep 2001 | B1 |
6285337 | West et al. | Sep 2001 | B1 |
6285926 | Weiler et al. | Sep 2001 | B1 |
6289277 | Feuereisen et al. | Sep 2001 | B1 |
6311108 | Ammar et al. | Oct 2001 | B1 |
6317468 | Meyer | Nov 2001 | B1 |
6345127 | Mitchell | Feb 2002 | B1 |
6377892 | Johnson et al. | Apr 2002 | B1 |
6388607 | Woodell | May 2002 | B1 |
6388608 | Woodell et al. | May 2002 | B1 |
6411890 | Zimmerman | Jun 2002 | B1 |
6421603 | Pratt et al. | Jul 2002 | B1 |
6424288 | Woodell | Jul 2002 | B1 |
6426717 | Maloratsky | Jul 2002 | B1 |
6441773 | Kelly et al. | Aug 2002 | B1 |
6448922 | Kelly | Sep 2002 | B1 |
6452511 | Kelly et al. | Sep 2002 | B1 |
6456236 | Hauck et al. | Sep 2002 | B1 |
6473240 | Dehmlow | Oct 2002 | B1 |
6492934 | Hwang et al. | Dec 2002 | B1 |
6501424 | Haendel et al. | Dec 2002 | B1 |
6512476 | Woodell | Jan 2003 | B1 |
6525674 | Kelly et al. | Feb 2003 | B1 |
6531669 | Miller et al. | Mar 2003 | B1 |
6549161 | Woodell | Apr 2003 | B1 |
6567728 | Kelly et al. | May 2003 | B1 |
6574030 | Mosier | Jun 2003 | B1 |
6577947 | Kronfeld et al. | Jun 2003 | B1 |
6591171 | Ammar et al. | Jul 2003 | B1 |
6603425 | Woodell | Aug 2003 | B1 |
6650275 | Kelly et al. | Nov 2003 | B1 |
6653947 | Dwyer et al. | Nov 2003 | B2 |
6667710 | Cornell et al. | Dec 2003 | B2 |
6690298 | Barber et al. | Feb 2004 | B1 |
6690299 | Suiter | Feb 2004 | B1 |
6714186 | Mosier et al. | Mar 2004 | B1 |
6741203 | Woodell | May 2004 | B1 |
6741208 | West et al. | May 2004 | B1 |
6744382 | Lapis et al. | Jun 2004 | B1 |
6744408 | Stockmaster | Jun 2004 | B1 |
6757624 | Hwang et al. | Jun 2004 | B1 |
6771626 | Golubiewski et al. | Aug 2004 | B1 |
6804614 | McGraw et al. | Oct 2004 | B1 |
6806846 | West | Oct 2004 | B1 |
6819983 | McGraw | Nov 2004 | B1 |
6822617 | Mather et al. | Nov 2004 | B1 |
6839017 | Dillman | Jan 2005 | B1 |
6850185 | Woodell | Feb 2005 | B1 |
6862501 | He | Mar 2005 | B2 |
6865452 | Burdon | Mar 2005 | B2 |
6879280 | Bull et al. | Apr 2005 | B1 |
6882302 | Woodell et al. | Apr 2005 | B1 |
6918134 | Sherlock et al. | Jul 2005 | B1 |
6950062 | Mather et al. | Sep 2005 | B1 |
6972727 | West et al. | Dec 2005 | B1 |
6977608 | Anderson et al. | Dec 2005 | B1 |
6992614 | Joyce | Jan 2006 | B1 |
6995726 | West et al. | Feb 2006 | B1 |
6998908 | Sternowski | Feb 2006 | B1 |
6999022 | Vesel et al. | Feb 2006 | B1 |
7002546 | Stuppi et al. | Feb 2006 | B1 |
7034753 | Elsallal et al. | Apr 2006 | B1 |
7042387 | Ridenour et al. | May 2006 | B2 |
7064680 | Reynolds et al. | Jun 2006 | B2 |
7089092 | Wood et al. | Aug 2006 | B1 |
7092645 | Sternowski | Aug 2006 | B1 |
7109912 | Paramore et al. | Sep 2006 | B1 |
7109913 | Paramore et al. | Sep 2006 | B1 |
7129885 | Woodell et al. | Oct 2006 | B1 |
7145501 | Manfred et al. | Dec 2006 | B1 |
7148816 | Carrico | Dec 2006 | B1 |
7151507 | Herting | Dec 2006 | B1 |
7158072 | Venkatachalam et al. | Jan 2007 | B1 |
7161525 | Finley et al. | Jan 2007 | B1 |
7170446 | West et al. | Jan 2007 | B1 |
7196329 | Wood et al. | Mar 2007 | B1 |
7205933 | Snodgrass | Apr 2007 | B1 |
7209070 | Gilliland et al. | Apr 2007 | B2 |
7219011 | Barber | May 2007 | B1 |
7242343 | Woodell | Jul 2007 | B1 |
7272472 | McElreath | Sep 2007 | B1 |
7292178 | Woodell et al. | Nov 2007 | B1 |
7312725 | Berson et al. | Dec 2007 | B2 |
7312743 | Ridenour et al. | Dec 2007 | B2 |
7486291 | Berson et al. | Feb 2009 | B2 |
7525448 | Wilson et al. | Apr 2009 | B1 |
7570177 | Reynolds et al. | Aug 2009 | B2 |
7633430 | Wichgers et al. | Dec 2009 | B1 |
7639175 | Woodell | Dec 2009 | B1 |
7664601 | Daly, Jr. | Feb 2010 | B2 |
7675461 | McCusker et al. | Mar 2010 | B1 |
7733264 | Woodell et al. | Jun 2010 | B1 |
7783427 | Woodell et al. | Aug 2010 | B1 |
7859448 | Woodell et al. | Dec 2010 | B1 |
7859449 | Woodell et al. | Dec 2010 | B1 |
8035547 | Flanigan et al. | Oct 2011 | B1 |
8373580 | Bunch et al. | Feb 2013 | B2 |
8493241 | He | Jul 2013 | B2 |
20020185600 | Kerr | Dec 2002 | A1 |
20030071828 | Wilkins et al. | Apr 2003 | A1 |
20030216859 | Martell et al. | Nov 2003 | A1 |
20040044445 | Burdon | Mar 2004 | A1 |
20040059473 | He | Mar 2004 | A1 |
20040083038 | He | Apr 2004 | A1 |
20040181318 | Redmond et al. | Sep 2004 | A1 |
20050174350 | Ridenour et al. | Aug 2005 | A1 |
20060097895 | Reynolds et al. | May 2006 | A1 |
20060227012 | He | Oct 2006 | A1 |
20060290531 | Reynolds et al. | Dec 2006 | A1 |
20070060063 | Wright et al. | Mar 2007 | A1 |
20080180351 | He | Jul 2008 | A1 |
20090207048 | He et al. | Aug 2009 | A1 |
20100033499 | Gannon et al. | Feb 2010 | A1 |
20110282580 | Mohan | Nov 2011 | A1 |
Number | Date | Country |
---|---|---|
196 49 838 | Apr 1998 | DE |
Entry |
---|
Fountain, J.R., “Digital Terrain Systems,” Airborne Navigation Systems Workshop (Digest No. 1997/169), IEE Colloquium, pp. 4/1-4/6, Feb. 21, 1997. |
Johnson, A., et al., Vision Guided Landing of an Autonomous Helicopter in Hazardous Terrain, Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference, pp. 3966-3971, Apr. 18-22, 2005. |
Technical Standard Order, TSO-C115b, Airborne Area Navigation Equipment Using Multi-Sensor Inputs, Sep. 30, 1994, 11 pages, Department of Transportation, Federal Aviation Administration, Washington, DC. |
Office Action for U.S. Appl. No. 11/851,323, mail date Aug. 6, 2009, 23 pages. |
Office Action for U.S. Appl. No. 11/851,323, mail date Dec. 15, 2010, 13 pages. |
Vadlamani, A., et al., Improving the detection capability of spatial failure modes using downward-looking sensors in terrain database integrity monitors, Digital Avionics Systems Conference, 2003. DASC-03. The 22nd, vol. 2, pp. 9.C.5-1-9.C.5-12, Oct. 12-16, 2003. |
Honeywell, RDR-4B Forward looking windshear detection / weather radar system user's manual with radar operating guidelines, Rev. 6, Jul. 2003, 106 pages. |
Kuntman, D., Airborne system to address leading cause of injuries in non-fatal airline accidents, ICAO Journal, Mar. 2000, 4 pages. |
Office Action for U.S. Appl. No. 11/851,323, mail date Jul. 5, 2012, 23 pages. |
Office Action on U.S. Appl. No. 13/241,051 Dated Feb. 27, 2014, 21 pages. |