The present application is related to U.S. patent application Ser. No. 13/857,955 filed on Apr. 5, 2013 by Barber et al., entitled “Extended Runway Centerline Systems And Methods,” which is assigned to the assignee of the present application and incorporated herein by reference.
An enhanced flight vision system (EFVS) is sometimes used by an aircraft flight crew to provide imagery of an airport terminal area and runway environment during times when meteorological conditions prevent a clear natural view of the external surroundings of the aircraft through the windscreen. For example, the EFVS may overlay an image of an airport terminal area and runway environment over the pilot's natural unaided view of the external surroundings of the aircraft through the aircraft's cockpit windscreen. Such imagery may improve the situation awareness of the flight crew during instrument approach procedures in low visibility conditions such as fog. For example, under Title 14 of the Code of Federal Regulations, part 91, a pilot may not descend below decision altitude (DA) or minimum descent altitude (MDA) to 100 feet above the touchdown zone elevation (TDZE) from a straight-in instrument approach procedure (IAP), other than Category II or Category III, unless the pilot can see certain required visual references. Such visual references include, for example, the approach lighting system, the threshold lighting system, and the runway edge lighting system. The pilot may, however, use an EFVS to identify the required visual references in low visibility conditions where the pilot's natural unaided vision is unable to identify these visual references. Accordingly, use of an EFVS may reduce losses that would otherwise result when low visibility conditions prevent the pilot from landing the aircraft and delivering cargo and/or passengers on time.
EFVS imagery is typically presented to the pilot flying (PF) on a head up display (HUD). The HUD is typically a transparent display device that allows the PF to view EFVS imagery while looking at the external surroundings of the aircraft through the cockpit windscreen. As long as visibility conditions outside of the aircraft permit the PF to see the external surroundings of the aircraft through the cockpit windscreen, the PF can verify that the EFVS is functioning properly such that the imagery on the HUD is in alignment with the PF's view of the external surroundings of the aircraft.
EFVS imagery is sometimes also presented to the pilot monitoring (PM) on a head down display (HDD). For example, in some countries, the system must present the EFVS imagery to the PM for confirmation that the EFVS information is a reliable and accurate indicator of the required visual references. The PM may also use the EFVS imagery to determine whether the PF is taking appropriate action during approach and landing procedures. The HDD is typically a non-transparent display device mounted adjacent to or within a console or instrument panel of the aircraft. The HDD is typically positioned such that the PM must look away from the cockpit windscreen in order to see the displayed imagery, and the PM is unable to see the external surroundings of the aircraft through the cockpit windscreen while viewing the HDD. As such, the HDD does not overlay the EFVS image over the PM's natural unaided view of the external surroundings of the aircraft. Without the context of the external surroundings, it is very difficult for the PM to detect problems in the EFVS imagery beyond gross failures such as loss of image or white-out images.
An EFVS typically uses either a passive or active sensing system to acquire data used to generate imagery of the airport terminal area and runway environment. A typical passive sensor, such as a forward looking infrared (FLIR) camera or visible light spectrum camera, receives electromagnetic energy from the environment and outputs data that may be used by the system to generate video images from the point of view of the camera. The camera is installed in an appropriate position, such as in the nose of an aircraft, so that the PF may be presented with an appropriately scaled and positioned video image on the HUD having nearly the same point of view as the PF when viewing the external surroundings of the aircraft through the HUD. However, while passive sensors provide higher quality video imagery, they may be unable to identify required visual references in certain low visibility conditions such as heavy fog.
Active sensing systems, such as millimeter wavelength radar systems (e.g., 94 GHz), transmit electromagnetic energy into the environment and then receive return electromagnetic energy reflected from the environment. The active sensing system is typically installed in an appropriate position, such as in the nose of an aircraft. Active sensing systems do not generate the same video imagery as passive sensing systems, but rather map the received return energy into three-dimensional (3-D) models (e.g., using a polar coordinate system with range, azimuth and elevation from the nose of an aircraft). The 3-D model may then be rendered into a two-dimensional (2-D) image that may be appropriately scaled, positioned, and presented to the PF on the HUD in much the same way as video imagery from a passive sensing system. However, while active millimeter wavelength radar systems provide better identification of required visual references than passive sensing systems in low visibility conditions such as heavy fog, the quality of the imagery is not as good.
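By way of illustration only, the mapping from a single polar radar return to a point in a Cartesian 3-D model might be sketched as follows; the coordinate convention and function name are assumptions for illustration and do not describe any particular sensing system discussed herein.

```python
import math

def polar_return_to_cartesian(range_m, azimuth_deg, elevation_deg):
    """Convert one radar return, given in polar coordinates (range, azimuth,
    elevation) relative to the aircraft nose, into Cartesian (x, y, z) meters:
    x forward along the boresight, y to the right, z up."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horizontal = range_m * math.cos(el)   # projection onto the horizontal plane
    x = horizontal * math.cos(az)         # forward component
    y = horizontal * math.sin(az)         # lateral component
    z = range_m * math.sin(el)            # vertical component
    return x, y, z

# Example: a return 3000 m ahead, 2 degrees right of and 1 degree below the boresight
print(polar_return_to_cartesian(3000.0, 2.0, -1.0))
```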
Additionally, both passive FLIR cameras and active millimeter wavelength radar systems may have limited range in certain low visibility conditions such as heavy fog. Furthermore, regardless of the sensor technology used, current EFVS designs generate image views that are positioned and scaled with respect to a point of view useful for a PF using a HUD and having the additional context of the external surroundings of the aircraft while looking through the cockpit windscreen. The EFVS images are not rescaled or repositioned or provided with any additional flight information symbology or situational context for use by a PM when viewed on an HDD. Such EFVS images alone are of limited use to a PM using an HDD to verify the reliability and accuracy of the EFVS and/or to determine that the PF is taking appropriate action during approach and landing procedures. There is an ongoing need for an improved EFVS having a sensing system tuned to identify visual references required for aircraft approach and landing in low visibility conditions with sufficient accuracy and range. There is a further need for an improved EFVS capable of providing imagery on an HDD that is useful to a PM to verify the reliability and accuracy of the EFVS, and to determine that the PF is taking appropriate action during approach and landing procedures.
According to an exemplary embodiment, an image processing system for enhanced flight vision includes a processor and memory coupled to the processor. The memory contains program instructions that, when executed, cause the processor to receive radar returns data for a runway structure, generate a three-dimensional model representative of the runway structure based on the radar returns data, generate a two-dimensional image of the runway structure from the three-dimensional model, and generate an aircraft situation display image representative of the position of the runway structure with respect to an aircraft based on the two-dimensional image.
According to another exemplary embodiment, a method of providing enhanced flight vision includes receiving radar returns data for a runway structure, generating a three-dimensional model representative of the runway structure based on the radar returns data, generating a two-dimensional image of the runway structure from the three-dimensional model, and generating an aircraft situation display image representative of the position of the runway structure based on the two-dimensional image.
According to another exemplary embodiment, an enhanced flight vision system includes a Doppler weather radar system including an antenna and a transducer configured to enhance reflectivity of radar returns from runway structures, processing electronics in communication with the weather radar system and configured to generate two-dimensional aircraft situation display images representative of the position of the runway structures based on the radar returns, and a display in communication with the processing electronics and configured to display the images to a pilot monitoring.
According to various exemplary embodiments, an EFVS may be provided with radar sensing and imagery displayable to a PM on an aircraft display, such as an HDD. For example, the EFVS may include a Doppler weather radar system including an antenna and a transducer. The Doppler weather radar system may be configured to enhance reflectivity of radar returns from runway structures in an airport terminal or runway environment, such as an approach lighting system, a threshold lighting system, and/or a runway edge lighting system. As will be appreciated, using a Doppler weather radar system configured according to the various exemplary embodiments of an EFVS disclosed herein may provide greater range than millimeter wavelength radar sensing systems or passive FLIR or visible light camera systems in low visibility conditions such as heavy fog, given the Doppler weather radar system's superior ability to penetrate heavy fog. Using a Doppler weather radar system configured according to the various exemplary embodiments may also provide EFVS imagery having sufficient accuracy in low visibility conditions, given that many of the visual references required under Title 14 of the Code of Federal Regulations, part 91, such as approach lighting systems, threshold lighting systems, runway edge lighting systems, and other runway structures, are metallic structures that exhibit high radar reflectivity.
The EFVS may also include a radar processing module in communication with the radar system and configured to generate two-dimensional aircraft situation display images representative of the position of runway structures in an airport terminal or runway environment based on the radar returns. For example, the radar processing module may include an image processing system having a processor and memory containing program instructions that, when executed, cause the processor to receive radar returns data for runway structures and generate a 3-D model representative of the runway structures based on the radar returns data. The program instructions may also be configured to cause the processor to filter the radar returns data to identify areas in the 3-D model having a reflectivity lower than a predetermined value. As will be appreciated, generating a 3-D model representative of the runway structure from active radar return data provides range and/or depth information to the EFVS and allows for 2-D imagery to be produced with multiple viewpoints as opposed to a single point of view aligned with that of the PF looking at the external surroundings of the aircraft through the cockpit windscreen.
The program instructions may also be configured to generate a 2-D image of the runway structure in an airport terminal or runway environment from the 3-D model. For example, the 2-D image may be a top-down or side view of the runway structure. The program instructions may also be configured to generate an aircraft situation display image representative of the position of the runway structure with respect to an aircraft based on the 2-D image. For example, the aircraft situation display image may include a 2-D electronic moving map display of an airport terminal or runway environment having an overlay including the 2-D radar image of the runway structure. In some embodiments, the aircraft situation display image may be a horizontal situation display image including a top-down view of the runway structure. In some embodiments, the aircraft situation display image may be a vertical situation display image including a side view of the runway structure. In some embodiments, both horizontal and vertical situation displays may be presented.
The EFVS may also include a PM display in communication with the radar processing module and configured to display the images. For example, the PM display may be an HDD. The HDD may be configured to present 2-D aircraft situation display map images, as opposed to a transparent HUD configured to overlay sensing system imagery over the pilot's natural unaided view of the external surroundings of the aircraft through the aircraft's cockpit windscreen. As will be appreciated, using a 2-D horizontal or vertical aircraft situation display, such as an electronic moving map display having an overlay including a 2-D top-down or side-view radar image, provides the PM with context in which the reliability and accuracy of the EFVS may be determined, and within which the PM may determine that the PF is taking appropriate action during approach and landing procedures. For example, any misalignment between or among the overlays may indicate to the PM that there may be a problem with the aircraft navigation system, altimeter, radar system, an airport location database, the course selected by the PF, etc.
Referring now to
According to various exemplary embodiments, HUD 22 may be configured for use by a PF in the aircraft, and one of flight displays 20 may be an HDD configured for use by a PM in the aircraft. HUD 22 may be, for example, a transparent or semi-transparent display device that allows a PF to view EFVS imagery. For example, HUD 22 may be configured to overlay an image of the external surroundings of the aircraft (e.g., structures in an airport terminal area or runway environment) over the pilot's natural unaided view of the external surroundings of the aircraft while looking through the aircraft cockpit windscreen 24. HDD 20 may be, for example, a non-transparent CRT, LCD, or LED display device mounted adjacent to or within a console or instrument panel of the aircraft. HDD 20 may be configured to present aircraft situation display map images having EFVS imagery overlaid or otherwise merged therewith. HDD 20 may be positioned such that the PM may need to look away from the cockpit windscreen in order to see the displayed images, and such that the PM may not be able to see the external surroundings of the aircraft through the cockpit windscreen while viewing HDD 20.
EFVS processing electronics on board the aircraft may be configured to provide data regarding the state of the aircraft to flight displays 20 and/or HUD 22. For example, data regarding the aircraft's altitude, heading, velocity, etc., may be provided to flight displays 20 and/or HUD 22 by the processing electronics. The processing electronics may be further configured to provide data regarding the external surroundings of the aircraft to flight displays 20 and/or HUD 22. In some embodiments, the data can be conformally represented on HUD 22 with respect to the PF's natural view of the environment outside the aircraft. In other words, data that appears on HUD 22 may be located precisely in the location of the corresponding feature in the environment outside the aircraft (e.g., a line on HUD 22 may conform to the location of the horizon as the plane moves, etc.). The data may also be provided to an HDD 20 in the context of an aircraft situation display, such as a horizontal situation display or a vertical situation display including an electronic moving map.
The processing electronics of the aircraft may receive data regarding the aircraft's surroundings from onboard sensors. For example, the aircraft may be equipped with a radar that performs vertical and horizontal radar sweeps in front of the aircraft. Radar returns may then be processed by the processing electronics to generate and provide display data to HDD 20 and HUD 22 regarding the external surroundings of the aircraft. For example, HDD 20 may provide a top-down view, a horizontal view, a vertical profile view, or any other view of structures in an airport terminal or runway environment, weather, terrain, objects, and/or other aircraft detected by processing electronics onboard the aircraft.
The processing electronics of the aircraft may also receive data regarding the aircraft's surroundings communicated from an external source (e.g., a satellite, another aircraft, a ground-based communications station, etc.). In various embodiments, communication devices in the aircraft may be configured to receive and/or transmit data with the external sources. For example, the aircraft may request data regarding the location and bearing of nearby aircraft via the communication devices. The returned data may then be processed by the processing electronics and used to provide information regarding the other aircraft to the pilot via HDD 20 and HUD 22.
A database may be stored by the processing electronics. The database may be, for example, a terrain database that may include a terrain elevation database, an obstacle location and elevation database, an aerodrome mapping database, an electronic charts and maps database, a runway database, etc. The database may be used by the processing electronics to generate aircraft situation displays of the aircraft's surroundings, such as during approach and landing procedures. For example, the database may include data regarding the location of an airport's runways, control tower, etc. In other embodiments, the database may incorporate data regarding an airport from another database stored by the processing electronics, such as a chart database configured to store airport diagrams, approach charts, etc. In various embodiments, the processing electronics may use radar returns to generate EFVS imagery and overlay or otherwise merge the EFVS imagery with the aircraft situation displays. For example, the processing electronics may use radar returns to generate image overlays of visual references required for approach and landing procedures, such as approach lighting systems, threshold lighting systems, and runway edge lighting systems.
The processing electronics may generate aircraft situation displays of the aircraft's surroundings using the database, radar returns, other sensor data, and data regarding the aircraft's altitude, bearing, and heading. For example, the processing electronics may generate a 2-D or 3-D representation of an airport terminal or runway environment in front of the aircraft from the viewing perspective of a PF and provide the representation to HUD 22. The rendition may also include various indicia regarding the current state of the aircraft. For example, the rendering on HUD 22 may include data regarding the aircraft's heading, course, altitude, or the like. The processing electronics may also generate a 2-D aircraft situation display image representative of the airport terminal or runway environment and provide the representation to HDD 20 for review by a PM. For example, the aircraft situation display image may include a 2-D electronic moving map display of the airport terminal or runway environment. The aircraft situation display image may further include an overlay including the 2-D radar image of structures in the airport terminal or runway environment. In some embodiments, the aircraft situation display image may be a horizontal situation display image including a top-down view of the airport terminal or runway environment. In some embodiments, the aircraft situation display image may be a vertical situation display image including a side view of the airport terminal or runway environment. In some embodiments, both horizontal and vertical situation displays of the airport terminal or runway environment may be presented.
Referring to
Returns may also be processed to, for example, distinguish between terrain and weather, to determine the height of terrain, or to determine the height of weather. Returns may be processed to, for example, identify structures in an airport terminal or runway environment that may serve as visual reference points during aircraft approach and landing procedures. Such structures may include, for example, runway structures such as an approach lighting system, a threshold lighting system, and/or a runway edge lighting system. In further embodiments, the aircraft may include other forms of sensors similar to radar system 200 that receive data regarding the aircraft's surroundings. Other forms of sensors (not shown) may include, but are not limited to, barometers, altimeters (e.g., radio altimeters, barometric altimeters, etc.), temperature sensors, accelerometers, attitude gyros, or video sensors (e.g., infrared cameras, cameras that operate in the visual spectrum, etc.). Radar system 200 may be any radar system configured to detect or receive data for the systems and methods of the present disclosure. According to exemplary embodiments, radar system 200 may be an RTA-4218 MULTISCAN radar system, a WXR-2100 MULTISCAN radar system, or similar system manufactured by Rockwell Collins and configured in accordance with the principles described herein.
Communication devices 202 may include devices configured to receive and/or transmit data between the aircraft and one or more external sources. For example, communication devices 202 may include antennas located along the top or bottom of the aircraft to communicate with other airborne or ground-based systems. Communication devices 202 may also include communication electronics coupled to the antennas, such as receivers, transmitters, or transceivers. Communication devices 202 may include separate hardware to support different communication protocols and systems. For example, communication devices 202 may include a TCAS antenna and a separate antenna for receiving location data from a satellite-based positioning system (e.g., GPS, GLONASS, etc.). Communication devices 202 may also include shared hardware configured to communicate via multiple communication protocols.
Communication devices 202 may also receive data regarding the aircraft's surroundings. For example, communication devices 202 may receive data regarding another aircraft (e.g., range, altitude, bearing, etc.) or airport from a ground-based communications system, a satellite-based communications system, or from the other aircraft itself. The data may be received by communication devices 202 actively (e.g., in response to a request sent by communication devices 202) or passively (e.g., not in response to a request for the data). Communication devices 202 may also be configured to allow audio or video to be communicated between aircraft control center 10 and an external source. For example, communication devices 202 may transfer audio data between the aircraft's pilot and an air traffic controller or pilot of another aircraft via a radio channel.
Referring to
Processing electronics 302 may be in communication with onboard systems configured to generate data regarding the aircraft and its surroundings. For example, processing electronics 302 may be in communication with a radar system 304 (e.g., similar to radar system 200 shown in
Processing electronics 302 may also be in communication with the communication devices 306 (e.g., similar to communication devices 202 shown in
Processing electronics 302 are shown in communication with aircraft sensors 308. In general, sensors 308 may be any number of sensors that measure aircraft parameters related to the state of the aircraft. For example, sensors 308 may include temperature sensors, humidity sensors, infrared sensors, altitude sensors, pressure sensors, fuel gauges, airspeed sensors, throttle position sensors, ground speed sensors, pitot-static tubes, a gyroscope, a global positioning system (GPS), a camera (e.g., an infrared camera, a microwave camera, etc.), or any other aircraft-mounted sensors that may be used to provide data to processing electronics 302. It should be appreciated that sensors 308 (or any other component shown connected to processing electronics 302) may be indirectly or directly connected to the processing electronics 302. For example, processing electronics 302 may receive a temperature reading directly from a temperature sensor and a throttle position indirectly from a position sensor via an engine controller.
Processing electronics 302 are further shown in communication with avionics equipment 310. In general, avionics equipment 310 may include other electronic control systems in the aircraft. For example, avionics equipment 310 may include a flight management system, a navigation system, a backup navigation system, or another aircraft system configured to provide inputs to processing electronics 302. For example, avionics equipment 310 may include the landing gear system of the aircraft and provide information such as whether or not the landing gear is deployed, a weight on wheels determination, or other parameters to processing electronics 302. In another example, avionics equipment 310 may provide control inputs, such as a desired throttle or power level, to processing electronics 302.
Processing electronics 302 are additionally shown in communication with displays 312 (e.g., HDD 20 and/or HUD 22 shown in
Referring now to
Processing electronics 400 may include hardware circuitry for supporting the execution of the computer code of modules 406, 407, 408, 409, and 410. For example, processing electronics 400 may include hardware interfaces (e.g., output 414) for communicating control signals (e.g., analog or digital signals) from processing electronics 400 to avionics equipment (e.g., avionics equipment 310 shown in
Memory 402 may include a memory buffer 418 for receiving and storing radar return data (e.g., from radar system 304 shown in
Memory 402 may also include configuration data 420. Configuration data 420 may include various parameters used to control which display data is provided to, for example, a HUD or an HDD, and to the other user interface devices of the aircraft. Configuration data 420 may also include one or more parameters that control when EFVS imagery regarding an airport environment, aircraft landing structure, etc. is presented to a HUD or an HDD.
Memory 402 may also include a database 422. Database 422 may include, for example, data regarding the geolocation of natural and man-made terrain. A geolocation may be a set of latitude and longitude coordinates or any other form of values that uniquely define a location along the surface of the Earth. Database 422 may include data regarding the geolocations of naturally occurring terrain, such as oceans, mountains, hills, and the like. Database 422 may also include the geolocations of man-made objects and other terrain, such as airports, buildings, towers, etc. In the case of airports stored in database 422, database 422 may store data regarding the location and layout of the runways of the airport, the terminal of the airport, and the location of any other building or structure associated with the airport. In some embodiments, database 422 may include an ARINC 424 runway database that includes latitude and longitude data for airport runway endpoints. In some embodiments, database 422 may include one or more electronic moving maps for one or more airports, runways, etc.
Memory 402 may include a radar signal processing module 406 configured to receive radar returns data and generate a 3-D model based on the radar returns data (e.g., using a polar coordinate system with range, azimuth and elevation from the nose of an aircraft). In particular, module 406 may receive radar returns data from highly reflective structures in an airport terminal area or runway environment that may serve as visual reference points during aircraft approach and landing procedures. Such structures may include, for example, runway structures such as an approach lighting system, a threshold lighting system, and/or a runway edge lighting system. Module 406 may generate a 3-D model representative of one or more runway structures in the airport environment based on the received radar returns data.
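By way of example only, one way such a 3-D reflectivity model could be accumulated from radar returns data is sketched below; the grid resolution, angular spans, and the tuple format of the returns are assumptions made for illustration and are not a description of module 406 itself.

```python
import numpy as np

# Assumed grid resolution and coverage, for illustration only
N_RANGE, N_AZ, N_EL = 512, 128, 32
RANGE_MAX_M = 15000.0      # maximum range of interest
AZ_SPAN_DEG = 60.0         # azimuth span centered on the aircraft nose
EL_SPAN_DEG = 20.0         # elevation span centered on the aircraft nose

def build_reflectivity_model(returns):
    """Accumulate radar returns into a 3-D polar reflectivity grid indexed by
    range, azimuth, and elevation bins. Each return is assumed to be a
    (range_m, azimuth_deg, elevation_deg, reflectivity) tuple."""
    model = np.zeros((N_RANGE, N_AZ, N_EL))
    for range_m, az_deg, el_deg, refl in returns:
        i = int(range_m / RANGE_MAX_M * (N_RANGE - 1))
        j = int((az_deg + AZ_SPAN_DEG / 2.0) / AZ_SPAN_DEG * (N_AZ - 1))
        k = int((el_deg + EL_SPAN_DEG / 2.0) / EL_SPAN_DEG * (N_EL - 1))
        if 0 <= i < N_RANGE and 0 <= j < N_AZ and 0 <= k < N_EL:
            model[i, j, k] = max(model[i, j, k], refl)  # keep the strongest return per cell
    return model
```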
In order to facilitate generation of clearer images, module 406 may be configured to filter the radar returns data to identify areas in the 3-D model having a reflectivity lower than a predetermined value. In some embodiments, low energy areas in the 3-D model may be zeroed out based on their corresponding reflectivity values, such that those areas will be rendered transparent. Such filtering may result in a 3-D model representative of only highly reflective structures in an airport terminal area or runway environment, such as an approach lighting system, a threshold lighting system, and/or a runway edge lighting system.
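A minimal sketch of such a reflectivity filter is shown below, assuming the 3-D model is held as a numeric array and the threshold value is supplied by the application; both assumptions are illustrative.

```python
import numpy as np

def filter_low_reflectivity(model, threshold):
    """Zero out cells of the 3-D model whose reflectivity falls below a
    predetermined value so that, when rendered, those areas are transparent
    and only highly reflective runway structures remain."""
    filtered = np.array(model, copy=True)
    filtered[filtered < threshold] = 0.0
    return filtered
```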
Memory 402 may also include a radar image processing module 407 configured to generate 2-D images from the 3-D model. In particular, module 407 may receive data from module 406 including a 3-D model of highly reflective structures in an airport terminal area or runway environment that may serve as visual reference points during aircraft approach and landing procedures. Module 407 may generate one or more 2-D images of one or more of the structures in the 3-D model. For example, in some embodiments, module 407 may generate a top-down view of the structures. In some embodiments, module 407 may generate a side view of the structures. In some embodiments, both top-down and side views may be generated. Other 2-D views are contemplated as well.
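One simple way to produce such 2-D views is sketched below under the assumption that the 3-D model is an array ordered as range by azimuth by elevation; a maximum-intensity projection along the collapsed axis preserves the strongest reflectors in each column, though other projections could be used.

```python
import numpy as np

def project_views(model):
    """Collapse a 3-D reflectivity model (range x azimuth x elevation) into
    2-D images using maximum-intensity projections."""
    top_down = model.max(axis=2)   # drop elevation: range x azimuth plan view
    side_view = model.max(axis=1)  # drop azimuth: range x elevation profile view
    return top_down, side_view
```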
Memory 402 may also include a location analyzer module 408 configured to determine the location of the aircraft. Location analyzer module 408 uses sensor data received via input 416 to determine location values for the aircraft, such as the geolocation of the aircraft and the altitude of the aircraft. For example, location analyzer module 408 may use received GPS data and data from a radio altimeter to determine the geolocation and altitude of the aircraft. Location analyzer module 408 may also analyze the received sensor data to determine movement-related parameters, such as the direction, heading, course, pitch, or roll of the aircraft.
Location analyzer module 408 may be configured to determine the location of airports, terrain, weather, other aircraft, and the like relative to the location of the aircraft. In particular, location analyzer module 408 may compare the determined location of the aircraft to data from database 422 to identify terrain near the aircraft. For example, location analyzer module 408 may use the geolocation, altitude, heading, etc. of the aircraft to retrieve data from database 422 regarding an airport terminal or runway environment near the aircraft for use in upcoming approach and landing procedures. Location analyzer module 408 may also compare the determined location of the aircraft to location information contained in radar returns data used by modules 406 and 407 to generate 3-D models and 2-D images of nearby airport terminal or runway environment structures (e.g., distance, altitude and azimuth data determined by radar system 304 shown in
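By way of illustration, the relative position of a database point such as a runway threshold could be computed from the aircraft geolocation with a standard great-circle calculation; the function below is an illustrative sketch and not a description of location analyzer module 408.

```python
import math

EARTH_RADIUS_NM = 3440.065  # mean Earth radius in nautical miles

def distance_and_bearing(lat1_deg, lon1_deg, lat2_deg, lon2_deg):
    """Great-circle distance (NM) and initial true bearing (degrees) from the
    aircraft position (lat1, lon1) to a database point (lat2, lon2)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1_deg, lon1_deg, lat2_deg, lon2_deg))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    distance = 2.0 * EARTH_RADIUS_NM * math.asin(math.sqrt(a))
    bearing = math.degrees(math.atan2(
        math.sin(dlon) * math.cos(lat2),
        math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)))
    return distance, bearing % 360.0

# Example: aircraft position relative to a runway threshold roughly 2.4 NM to the north
print(distance_and_bearing(40.000, -88.000, 40.040, -88.000))
```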
Memory 402 may also include a runway centerline module 409 configured to determine a runway centerline for a nearby runway identified by location analyzer 408. The runway centerline may be calculated in the same plane as the runway or at an angle extending from the runway. For example, the runway centerline may be calculated as extending from the landing threshold of the runway out to a predetermined distance (e.g., the runway centerline may extend out from the runway to 10 NM at an angle of 0.5° from the geometric plane of the runway). In one embodiment, runway centerline module 409 may be configured to generate a runway centerline based on a threshold distance between the aircraft and the runway. For example, runway centerline module 409 may only generate a runway centerline for a runway that is within 30 NM from the aircraft, as determined by location analyzer module 408. Runway centerline module 409 may generate a runway centerline out to any distance extending from the runway (e.g., 15 NM, 10 NM, etc.). In some embodiments, runway centerline module 409 also generates distance markers along the runway centerline that denote the distance between a point on the centerline and the runway. The distance markers may be at predetermined intervals or may be at variable intervals based on a parameter set in configuration data 420. For example, runway centerline module 409 may generate hash marks along the generated centerline and/or symbolic indicators that represent the distance from a point on the centerline to the runway.
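As an illustration of the geometry described above, the sketch below generates distance markers along an extended centerline that climbs at 0.5° from the plane of the runway out to 10 NM; the 1 NM marker spacing and the function name are assumptions for illustration, not parameters of runway centerline module 409.

```python
import math

FEET_PER_NM = 6076.12

def extended_centerline_markers(threshold_elev_ft, out_to_nm=10.0, angle_deg=0.5, step_nm=1.0):
    """Return (distance_from_threshold_nm, height_ft) markers along an extended
    runway centerline rising at angle_deg from the geometric plane of the runway."""
    markers = []
    distance_nm = step_nm
    while distance_nm <= out_to_nm:
        height_ft = threshold_elev_ft + math.tan(math.radians(angle_deg)) * distance_nm * FEET_PER_NM
        markers.append((distance_nm, round(height_ft, 1)))
        distance_nm += step_nm
    return markers

# Example: markers every 1 NM for a runway threshold at 700 ft elevation
print(extended_centerline_markers(700.0))
```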
Memory 402 may also include a display data module 410 configured to generate and provide display data to one or more displays (e.g., a HUD 22 and/or an HDD 20 as shown in
According to various exemplary embodiments, display data module 410 may be configured to generate a 2-D aircraft situation display image representative of the position of structures in an airport terminal or runway environment with respect to an aircraft. For example, display data module 410 may generate an aircraft situation display image including a 2-D electronic moving map display (e.g., a moving map of an airport terminal or runway environment) using data from terrain database 422. The electronic moving map may automatically scroll with movement of the aircraft in real time. The aircraft situation display image may further include an overlay including a 2-D radar image of one or more structures (e.g., runway structures) in an airport terminal or runway environment generated by radar image processing module 407. The overlay may be positioned with respect to the electronic moving map based on comparative location data received from location analyzer module 408, and may also automatically scroll with movement of the aircraft in real time (e.g., using updated imagery from radar image processing module 407). Similarly, the aircraft situation display image may further include an overlay including a runway centerline based on data generated by runway centerline module 409. Display data module 410 may also generate display data that includes indicia regarding the state of the aircraft, such as the altitude, speed, heading, etc. of the aircraft. Display data module 410 may also generate display data that includes other terrain, air traffic, weather, etc.
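For illustration only, positioning one point of the 2-D radar overlay on a track-up moving map can reduce to a simple scale-and-offset step; the pixel scale and own-ship screen position below are assumed inputs and are not parameters of display data module 410.

```python
def radar_offset_to_map_pixels(forward_m, lateral_m, pixels_per_m, ownship_px, ownship_py):
    """Place a point from the top-down radar image onto a track-up moving map.
    In a track-up display the aircraft's forward axis points up the screen, so a
    forward offset decreases screen y and a rightward offset increases screen x."""
    px = ownship_px + lateral_m * pixels_per_m
    py = ownship_py - forward_m * pixels_per_m
    return int(round(px)), int(round(py))

# Example: a reflector 1852 m (1 NM) ahead and 100 m right of the aircraft
print(radar_offset_to_map_pixels(1852.0, 100.0, 0.05, 400, 600))
```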
In some embodiments, display data module 410 may generate a horizontal aircraft situation display image. The horizontal aircraft situation display image may include, for example, a top-down view of the airport terminal or runway environment oriented in a track-up display (e.g., an outline image or airport moving map having a top-down orientation). The aircraft situation display image may further include an overlay including a 2-D top-down radar image of one or more structures in an airport terminal or runway environment generated by radar image processing module 407. Display data module 410 may add additional display data as well.
In some embodiments, display data module 410 may generate a vertical aircraft situation display image. The vertical aircraft situation display image may include, for example, a side view of the airport terminal or runway environment. The aircraft situation display image may further include an overlay including a 2-D side view radar image of one or more structures in an airport terminal or runway environment generated by radar image processing module 407. Display data module 410 may add additional display data as well. In some embodiments, both horizontal and vertical situation displays may be presented.
Referring now to
Horizontal situation display 502 also includes an overlay 512 of a 2-D top-down radar image, such as a Doppler weather radar image. Areas of high reflectivity 514 are indicated as dark areas over a lighter background in the overlay image. Areas of mid to low reflectivity with respect to a predetermined threshold value have been filtered out to improve image clarity. Other coloring, shading, and filtering schemes are contemplated as well. Areas of high reflectivity 514 may correspond to one or more structures (e.g., runway structures) in an airport terminal or runway environment. For example, one or more areas of high reflectivity 514 may correspond to key visual references, such as the approach lighting system for the runway outlined at 506, which may be metallic structures that exhibit high radar reflectivity. As shown in
In some embodiments, additional indicia regarding the state of the aircraft, such as the altitude, speed, heading, etc. of the aircraft, or an extended runway centerline may be overlaid on horizontal situation display 502. Additional overlays may also include other terrain, air traffic, weather, etc. A PM viewing horizontal situation display 502 on an HDD without the benefit of a natural view of the environment outside the aircraft through the cockpit windscreen may use any of these additional indicia to verify the reliability and accuracy of the EFVS, as well as to verify that the PF is taking appropriate action during approach and landing procedures to track the runway. For example, the PM may verify that the aircraft heading is aligned with the runway indicated at 506, that an extended runway centerline is aligned with areas of high reflectivity 514, etc.
Vertical situation display 504 includes an electronic moving map having a side view orientation and showing an outline 516 of the runway, a range indicator 517 (set to 2 NM), an altitude indicator 518, and an indicator 520 of the position of the aircraft (e.g., altitude and range) with respect to the runway. Vertical situation display 504 also includes an overlay 522 of a 2-D side view radar image, such as a Doppler weather radar image. Areas of high reflectivity 524 are indicated as dark areas over a lighter background. Areas of mid to low reflectivity with respect to a predetermined threshold value have been filtered out to improve image clarity. Other coloring, shading, and filtering schemes are contemplated as well. As with overlay 512, areas of high reflectivity 524 may correspond to one or more structures (e.g., runway structures) in an airport terminal or runway environment. For example, one or more areas of high reflectivity 524 may correspond to key visual references, such as the approach lighting system for the runway outlined at 516, which may be metallic structures that exhibit high radar reflectivity. As shown in
In some embodiments, additional indicia regarding the state of the aircraft, such as the altitude, speed, heading, etc. of the aircraft, or an extended runway centerline may be overlaid on vertical situation display 504. Additional overlays may also include other terrain, air traffic, weather, etc. A PM viewing vertical situation display 504 on an HDD without the benefit of a natural view of the environment outside the aircraft through the cockpit windscreen may use any of these additional indicia to verify the reliability and accuracy of the EFVS, as well as to verify that the PF is taking appropriate action during approach and landing procedures. For example, the PM may verify that the aircraft altitude is aligned with the runway indicated at 516, that an extended runway centerline is aligned with areas of high reflectivity 524, etc.
Referring now to
The scope of this disclosure should be determined by the claims, their legal equivalents and the fact that it fully encompasses other embodiments which may become apparent to those skilled in the art. All structural, electrical and functional equivalents to the elements of the above-described disclosure that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the present claims. A reference to an element in the singular is not intended to mean one and only one, unless explicitly so stated, but rather it should be construed to mean at least one. No claim element herein is to be construed under the provisions of 35 U.S.C. §112, sixth paragraph, unless the element is expressly recited using the phrase “means for.” Furthermore, no element, component or method step in the present disclosure is intended to be dedicated to the public, regardless of whether the element, component or method step is explicitly recited in the claims.
The embodiments in the present disclosure have been described with reference to drawings. The drawings illustrate certain details of specific embodiments that implement the systems and methods and programs of the present disclosure. However, describing the embodiments with drawings should not be construed as imposing any limitations that may be present in the drawings. The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing its operations. The embodiments of the present disclosure may be implemented using an existing computer processor, or by a special purpose computer processor incorporated for this or another purpose or by a hardwired system.
As noted above, embodiments within the scope of the present invention include program products comprising non-transitory machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media may be any available media that may be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media may comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to carry or store desired program code in the form of machine-executable instructions or data structures and which may be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Embodiments in the present disclosure have been described in the general context of method steps which may be implemented in one embodiment by a program product including machine-executable instructions, such as program code, for example in the form of program modules executed by machines in networked environments. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Machine-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
As previously indicated, embodiments in the present disclosure may be practiced in a networked environment using logical connections to one or more remote computers having processors. Those skilled in the art will appreciate that such network computing environments may encompass many types of computers, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and so on. Embodiments in the disclosure may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
An exemplary system for implementing the overall system or portions of the disclosure might include one or more computers including a processor, a system memory or database, and a system bus that couples various system components including the system memory to the processor. The database or system memory may include read only memory (ROM) and random access memory (RAM). The database may also include a magnetic hard disk drive for reading from and writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to a removable optical disk such as a CD ROM or other optical media. The drives and their associated machine-readable media provide nonvolatile storage of machine-executable instructions, data structures, program modules and other data for the computer. User interfaces, as described herein, may include a computer with monitor, keyboard, a keypad, a mouse, joystick or other input devices performing a similar function.
It should be noted that although the diagrams herein may show a specific order and composition of method steps, it is understood that the order of these steps may differ from what is depicted. For example, two or more steps may be performed concurrently or with partial concurrence. Also, some method steps that are performed as discrete steps may be combined, steps being performed as a combined step may be separated into discrete steps, the sequence of certain processes may be reversed or otherwise varied, and the nature or number of discrete processes may be altered or varied. The order or sequence of any element or apparatus may be varied or substituted according to alternative embodiments. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. Such variations will depend on the software and hardware systems chosen and on designer choice. It is understood that all such variations are within the scope of the disclosure. Likewise, software and web implementations of the present invention could be accomplished with standard programming techniques with rule based logic and other logic to accomplish the various database searching steps, correlation steps, comparison steps and decision steps.
The foregoing description of embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the subject matter to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the subject matter disclosed herein. The embodiments were chosen and described in order to explain the principles of the disclosed subject matter and its practical application to enable one skilled in the art to utilize the disclosed subject matter in various embodiments and with various modifications as are suited to the particular use contemplated. Other substitutions, modifications, changes and omissions may be made in the design, operating conditions and arrangement of the embodiments without departing from the scope of the presently disclosed subject matter.
Throughout the specification, numerous advantages of the exemplary embodiments have been identified. It will be understood, of course, that it is possible to employ the teachings herein without necessarily achieving the same advantages. Additionally, although many features have been described in the context of a particular data processor, it will be appreciated that such features could also be implemented in the context of other hardware configurations.
While the exemplary embodiments illustrated in the figures and described above are presently preferred, it should be understood that these embodiments are offered by way of example only. Other embodiments may include, for example, structures with different data mapping or different data. The disclosed subject matter is not limited to a particular embodiment, but extends to various modifications, combinations, and permutations that nevertheless fall within the scope and spirit of the appended claims.
7292180 | Schober | Nov 2007 | B2 |
7295150 | Burlet et al. | Nov 2007 | B2 |
7295901 | Little et al. | Nov 2007 | B1 |
7301496 | Honda et al. | Nov 2007 | B2 |
7307576 | Koenigs | Dec 2007 | B1 |
7307577 | Kronfeld et al. | Dec 2007 | B1 |
7307583 | Woodell et al. | Dec 2007 | B1 |
7312725 | Berson et al. | Dec 2007 | B2 |
7312743 | Ridenour et al. | Dec 2007 | B2 |
7317427 | Pauplis et al. | Jan 2008 | B2 |
7321332 | Focke et al. | Jan 2008 | B2 |
7337043 | Bull | Feb 2008 | B2 |
7349154 | Aiura et al. | Mar 2008 | B2 |
7352292 | Alter et al. | Apr 2008 | B2 |
7361240 | Kim | Apr 2008 | B2 |
7372394 | Woodell et al. | May 2008 | B1 |
7373223 | Murphy | May 2008 | B2 |
7375678 | Feyereisen et al. | May 2008 | B2 |
7379014 | Woodell et al. | May 2008 | B1 |
7379796 | Walsdorf et al. | May 2008 | B2 |
7381110 | Sampica et al. | Jun 2008 | B1 |
7417578 | Woodell et al. | Aug 2008 | B1 |
7417579 | Woodell | Aug 2008 | B1 |
7423578 | Tietjen | Sep 2008 | B1 |
7446697 | Burlet et al. | Nov 2008 | B2 |
7446938 | Miyatake et al. | Nov 2008 | B2 |
7452258 | Marzen et al. | Nov 2008 | B1 |
7474262 | Alland | Jan 2009 | B2 |
7479920 | Niv | Jan 2009 | B2 |
7486220 | Kronfeld et al. | Feb 2009 | B1 |
7486291 | Berson et al. | Feb 2009 | B2 |
7492304 | Woodell et al. | Feb 2009 | B1 |
7492305 | Woodell et al. | Feb 2009 | B1 |
7515087 | Woodell et al. | Apr 2009 | B1 |
7515088 | Woodell et al. | Apr 2009 | B1 |
7525448 | Wilson et al. | Apr 2009 | B1 |
7528765 | Woodell et al. | May 2009 | B1 |
7528915 | Choi et al. | May 2009 | B2 |
7541970 | Godfrey et al. | Jun 2009 | B1 |
7541971 | Woodell et al. | Jun 2009 | B1 |
7551451 | Kim et al. | Jun 2009 | B2 |
7557735 | Woodell et al. | Jul 2009 | B1 |
7566254 | Sampica et al. | Jul 2009 | B2 |
7570177 | Reynolds et al. | Aug 2009 | B2 |
7576680 | Woodell | Aug 2009 | B1 |
7603209 | Dwyer | Oct 2009 | B2 |
7609200 | Woodell et al. | Oct 2009 | B1 |
7612706 | Honda et al. | Nov 2009 | B2 |
7616150 | Woodell | Nov 2009 | B1 |
7633428 | McCusker et al. | Dec 2009 | B1 |
7633430 | Wichgers et al. | Dec 2009 | B1 |
7633584 | Umemoto et al. | Dec 2009 | B2 |
7639175 | Woodell | Dec 2009 | B1 |
7664601 | Daly, Jr. | Feb 2010 | B2 |
7675461 | McCusker et al. | Mar 2010 | B1 |
7693621 | Chamas | Apr 2010 | B1 |
7696921 | Finley et al. | Apr 2010 | B1 |
7714767 | Kronfeld et al. | May 2010 | B1 |
7733264 | Woodell et al. | Jun 2010 | B1 |
7783427 | Woodell et al. | Aug 2010 | B1 |
7783429 | Walden et al. | Aug 2010 | B2 |
7791529 | Filias et al. | Sep 2010 | B2 |
7808422 | Woodell et al. | Oct 2010 | B1 |
7814676 | Sampica et al. | Oct 2010 | B2 |
7843380 | Woodell | Nov 2010 | B1 |
7859448 | Woodell et al. | Dec 2010 | B1 |
7859449 | Woodell et al. | Dec 2010 | B1 |
7864103 | Weber et al. | Jan 2011 | B2 |
7868811 | Woodell et al. | Jan 2011 | B1 |
7872594 | Vesel | Jan 2011 | B1 |
7889117 | Woodell et al. | Feb 2011 | B1 |
7889118 | Finley et al. | Feb 2011 | B1 |
7927440 | Matsuhira | Apr 2011 | B2 |
7929086 | Toyama et al. | Apr 2011 | B2 |
7965223 | McCusker | Jun 2011 | B1 |
7965225 | Dickerson et al. | Jun 2011 | B1 |
8035547 | Flanigan et al. | Oct 2011 | B1 |
8038498 | Miyauchi et al. | Oct 2011 | B2 |
8045098 | Kitagawa et al. | Oct 2011 | B2 |
8059025 | D'Addio | Nov 2011 | B2 |
8068984 | Smith et al. | Nov 2011 | B2 |
8072368 | Woodell | Dec 2011 | B1 |
8077078 | Woodell et al. | Dec 2011 | B1 |
8102487 | Kitagawa et al. | Jan 2012 | B2 |
8118075 | Sampica et al. | Feb 2012 | B2 |
8137498 | Sampica et al. | Mar 2012 | B2 |
8140223 | Whitehead et al. | Mar 2012 | B2 |
8159464 | Gribble et al. | Apr 2012 | B1 |
8232917 | Scherzinger et al. | Jul 2012 | B2 |
8296065 | Haynie et al. | Oct 2012 | B2 |
8373580 | Bunch et al. | Feb 2013 | B2 |
8410975 | Bell et al. | Apr 2013 | B1 |
8477062 | Kanellis | Jul 2013 | B1 |
8486535 | Nemeth et al. | Jul 2013 | B1 |
8493241 | He | Jul 2013 | B2 |
8515600 | McCusker | Aug 2013 | B1 |
8540002 | Sampica et al. | Sep 2013 | B2 |
8558731 | Woodell | Oct 2013 | B1 |
8576112 | Garrec et al. | Nov 2013 | B2 |
8583315 | Whitehead et al. | Nov 2013 | B2 |
8594879 | Roberge et al. | Nov 2013 | B2 |
8603288 | Sampica et al. | Dec 2013 | B2 |
8634993 | McClure et al. | Jan 2014 | B2 |
8639416 | Jones et al. | Jan 2014 | B2 |
8643533 | Woodell et al. | Feb 2014 | B1 |
8691043 | Sampica et al. | Apr 2014 | B2 |
8717226 | Bon et al. | May 2014 | B2 |
8936057 | Sampica et al. | Jan 2015 | B2 |
20010050372 | Krijn et al. | Dec 2001 | A1 |
20010053648 | Furukawa et al. | Dec 2001 | A1 |
20020039070 | Ververs et al. | Apr 2002 | A1 |
20020111717 | Scherzinger et al. | Aug 2002 | A1 |
20020116125 | Lin | Aug 2002 | A1 |
20020116126 | Lin | Aug 2002 | A1 |
20020158256 | Yamada et al. | Oct 2002 | A1 |
20020179229 | Chuzles | Dec 2002 | A1 |
20020185600 | Kerr | Dec 2002 | A1 |
20020187284 | Kinoshita et al. | Dec 2002 | A1 |
20030021491 | Brust | Jan 2003 | A1 |
20030038916 | Nakano et al. | Feb 2003 | A1 |
20030043315 | Umemoto et al. | Mar 2003 | A1 |
20030071828 | Wilkins et al. | Apr 2003 | A1 |
20030089214 | Fukuta et al. | May 2003 | A1 |
20030093187 | Walker | May 2003 | A1 |
20030102999 | Bergin et al. | Jun 2003 | A1 |
20030156238 | Hiraishi et al. | Aug 2003 | A1 |
20030160718 | Nagasaku | Aug 2003 | A1 |
20030174396 | Murayama et al. | Sep 2003 | A1 |
20030180528 | Flosenzier et al. | Sep 2003 | A1 |
20030189606 | Moon et al. | Oct 2003 | A1 |
20030195672 | He | Oct 2003 | A1 |
20030216859 | Martell et al. | Nov 2003 | A1 |
20030222887 | Wilkins et al. | Dec 2003 | A1 |
20040044445 | Burdon | Mar 2004 | A1 |
20040059473 | He | Mar 2004 | A1 |
20040066645 | Graf et al. | Apr 2004 | A1 |
20040072575 | Young et al. | Apr 2004 | A1 |
20040083038 | He | Apr 2004 | A1 |
20040160341 | Feyereisen et al. | Aug 2004 | A1 |
20040160364 | Regev | Aug 2004 | A1 |
20040181318 | Redmond et al. | Sep 2004 | A1 |
20040264549 | Hoole | Dec 2004 | A1 |
20050004748 | Pinto et al. | Jan 2005 | A1 |
20050052451 | Servantie | Mar 2005 | A1 |
20050126679 | Kim | Jun 2005 | A1 |
20050136625 | Henseler et al. | Jun 2005 | A1 |
20050174350 | Ridenour et al. | Aug 2005 | A1 |
20050200502 | Reusser et al. | Sep 2005 | A1 |
20050230563 | Corcoran | Oct 2005 | A1 |
20060004497 | Bull | Jan 2006 | A1 |
20060097895 | Reynolds et al. | May 2006 | A1 |
20060098452 | Choi et al. | May 2006 | A1 |
20060164284 | Pauplis et al. | Jul 2006 | A1 |
20060207967 | Bocko et al. | Sep 2006 | A1 |
20060215265 | Miyatake et al. | Sep 2006 | A1 |
20060227012 | He | Oct 2006 | A1 |
20060244636 | Rye et al. | Nov 2006 | A1 |
20060245171 | Kim et al. | Nov 2006 | A1 |
20060290253 | Yeo et al. | Dec 2006 | A1 |
20060290531 | Reynolds et al. | Dec 2006 | A1 |
20070001897 | Alland | Jan 2007 | A1 |
20070002078 | He et al. | Jan 2007 | A1 |
20070008214 | Wasiewicz | Jan 2007 | A1 |
20070013575 | Lee et al. | Jan 2007 | A1 |
20070018887 | Feyereisen et al. | Jan 2007 | A1 |
20070032951 | Tanenhaus et al. | Feb 2007 | A1 |
20070060063 | Wright et al. | Mar 2007 | A1 |
20070146364 | Aspen | Jun 2007 | A1 |
20070171094 | Alter et al. | Jul 2007 | A1 |
20070176794 | Feyereisen et al. | Aug 2007 | A1 |
20070228586 | Merrill et al. | Oct 2007 | A1 |
20070247350 | Ryan | Oct 2007 | A1 |
20070279253 | Priest | Dec 2007 | A1 |
20070297736 | Sherman et al. | Dec 2007 | A1 |
20080018524 | Christianson | Jan 2008 | A1 |
20080051947 | Kemp | Feb 2008 | A1 |
20080074308 | Becker et al. | Mar 2008 | A1 |
20080111731 | Hubbard et al. | May 2008 | A1 |
20080145610 | Muller et al. | Jun 2008 | A1 |
20080180351 | He | Jul 2008 | A1 |
20080305721 | Ohashi et al. | Dec 2008 | A1 |
20090040070 | Alter et al. | Feb 2009 | A1 |
20090040772 | Laney | Feb 2009 | A1 |
20090046229 | Umemoto et al. | Feb 2009 | A1 |
20090148682 | Higuchi | Jun 2009 | A1 |
20090152391 | McWhirk | Jun 2009 | A1 |
20090153783 | Umemoto | Jun 2009 | A1 |
20090164067 | Whitehead et al. | Jun 2009 | A1 |
20090207048 | He et al. | Aug 2009 | A1 |
20090279030 | Toyama et al. | Nov 2009 | A1 |
20090279175 | Laney et al. | Nov 2009 | A1 |
20100033499 | Gannon et al. | Feb 2010 | A1 |
20100103353 | Yamada | Apr 2010 | A1 |
20100297406 | Schaffer et al. | Nov 2010 | A1 |
20100312428 | Roberge et al. | Dec 2010 | A1 |
20100312461 | Haynie et al. | Dec 2010 | A1 |
20110054729 | Whitehead et al. | Mar 2011 | A1 |
20110075070 | Kitagawa et al. | Mar 2011 | A1 |
20110141405 | Kitagawa et al. | Jun 2011 | A1 |
20110165361 | Sherman et al. | Jul 2011 | A1 |
20110282580 | Mohan | Nov 2011 | A1 |
20120053831 | Halder | Mar 2012 | A1 |
20120150426 | Conway | Jun 2012 | A1 |
20120174445 | Jones et al. | Jul 2012 | A1 |
20120215410 | McClure et al. | Aug 2012 | A1 |
20130041529 | He | Feb 2013 | A1 |
Number | Date | Country |
---|---|---|
196 49 838 | Apr 1998 | DE |
0 556 351 | Jun 1995 | EP |
0 962 752 | Dec 1999 | EP |
0 814 744 | Jun 1959 | GB |
01-210328 | Aug 1989 | JP |
05-200880 | Aug 1993 | JP |
05-293895 | Nov 1993 | JP |
06-051484 | Feb 1994 | JP |
H08-220547 | Aug 1996 | JP |
09-057779 | Mar 1997 | JP |
10-156853 | Jun 1998 | JP |
10-244589 | Sep 1998 | JP |
2000-141388 | May 2000 | JP |
2004-233590 | Aug 2004 | JP |
2004-354645 | Dec 2004 | JP |
2006-218658 | Aug 2006 | JP |
2006-334912 | Dec 2006 | JP |
2006-348208 | Dec 2006 | JP |
2007-206559 | Aug 2007 | JP |
2008-238607 | Jan 2008 | JP |
WO-9305634 | Mar 1993 | WO |
WO-2011089474 | Jul 2011 | WO |
Entry |
---|
Airports Authority of India, Chapter 7: Visual aids for navigation-lights, available prior to Jan. 1, 2005, retrieved from the internet at: http://www.aai.aero/aai_employees/chapter_7.pdf on Sep. 26, 2014, 33 pages. |
Brailovsky et al., REVS: A Radar-Based Enhanced Vision System for Degraded Visual Environments, Proc. of SPIE vol. 9087 908708-1, retrieved from the internet at http://proceedings.spiedigitallibrary.org on Jun. 25, 2014, 13 pages. |
Federal Aviation Administration, Advisory Circular AC 90-106, “Enhanced Flight Vision Systems”, initiated by AFS-400, dated Jun. 2, 2010, 55 pages. |
Federal Aviation Administration, Aeronautical Information Manual (AIM) Basic Flight Information and ATC Procedures, dated Jul. 24, 2014, 2 pages. |
Fountain, J.R., Digital Terrain Systems, Airborne Navigation Systems Workshop (Digest No. 1997/169), IEE Colloquium, pp. 4/1-4/6, Feb. 21, 1997. |
Honeywell, RDR-4B Forward looking windshear detection / weather radar system user's manual with radar operating guidelines, Rev. 6, Jul. 2003, 106 pages. |
Johnson, A., et al., Vision Guided Landing of an Autonomous Helicopter in Hazardous Terrain, Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference, pp. 3966-3971, Apr. 18-22, 2005. |
Kuntman, D., Airborne system to address leading cause of injuries in non-fatal airline accidents, ICAO Journal, Mar. 2000, 4 pages. |
Notice of Allowance for U.S. Appl. No. 11/863,221, mail date Aug. 2, 2010, 9 pages. |
Notice of Allowance for U.S. Appl. No. 11/899,801, mail date Aug. 19, 2010, 5 pages. |
U.S. Appl. No. 12/236,464, filed Sep. 23, 2008, Rockwell Collins. |
U.S. Appl. No. 13/627,788, filed Sep. 26, 2012, Rockwell Collins. |
U.S. Appl. No. 13/250,798, filed Sep. 30, 2011, Rockwell Collins. |
U.S. Appl. No. 14/301,199, filed Jun. 10, 2014, Rockwell Collins. |
U.S. Appl. No. 14/482,681, filed Sep. 10, 2014, Rockwell Collins. |
Notice of Allowance for U.S. Appl. No. 11/900,002, mail date Sep. 14, 2010, 5 pages. |
Notice of Allowance for U.S. Appl. No. 12/167,200, mail date Oct. 28, 2010, 5 pages. |
Notice of Allowance for U.S. Appl. No. 12/167,203, mail date Jun. 21, 2013, 7 pages. |
Notice of Allowance for U.S. Appl. No. 12/167,208, mail date Mar. 21, 2011, 8 pages. |
Notice of Allowance for U.S. Appl. No. 12/180,293, mail date Aug. 4, 2011, 8 pages. |
Notice of Allowance on U.S. Appl. No. 13/241,051, dated Aug. 28, 2014, 9 pages. |
Notice of Allowance on U.S. Appl. No. 13/247,742, dated Jul. 30, 2014, 9 pages. |
Office Action for U.S. Appl. No. 11/851,323, mail date Aug. 6, 2009, 23 pages. |
Office Action for U.S. Appl. No. 11/851,323, mail date Dec. 15, 2010, 13 pages. |
Office Action for U.S. Appl. No. 11/851,323, mail date Jul. 5, 2012, 23 pages. |
Office Action for U.S. Appl. No. 12/167,200, mail date Jul. 21, 2010, 6 pages. |
Office Action for U.S. Appl. No. 12/167,203, mail date Aug. 26, 2010, 11 pages. |
Office Action for U.S. Appl. No. 12/167,203, mail date Sep. 21, 2012, 6 pages. |
Office Action for U.S. Appl. No. 12/167,208, mail date Dec. 30, 2009, 10 pages. |
Office Action for U.S. Appl. No. 12/167,208, mail date Jun. 3, 2011, 11 pages. |
Office Action for U.S. Appl. No. 12/167,208, mail date Oct. 19, 2010, 8 pages. |
Office Action for U.S. Appl. No. 12/180,293, mail date Jan. 4, 2011, 5 pages. |
Office Action for U.S. Appl. No. 12/180,293, mail date Jul. 28, 2010, 8 pages. |
Office Action for U.S. Appl. No. 12/976,871, mail date Feb. 15, 2012, 8 pages. |
Office Action for U.S. Appl. No. 12/976,871, mail date Jul. 10, 2012, 4 pages. |
Office Action for U.S. Appl. No. 12/976,871, mail date May 6, 2013, 5 pages. |
Office Action for U.S. Appl. No. 12/976,871, mail date Nov. 21, 2012, 5 pages. |
Office Action for U.S. Appl. No. 12/976,871, mail date Oct. 9, 2013, 5 pages. |
Office Action for U.S. Appl. No. 13/183,314, mail date Aug. 14, 2013, 11 pages. |
Office Action for U.S. Appl. No. 13/183,314, mail date Mar. 28, 2013, 12 pages. |
Office Action for U.S. Appl. No. 13/474,559, mail date Aug. 28, 2013, 10 pages. |
Office Action for U.S. Appl. No. 13/474,559, mail date Dec. 28, 2012, 8 pages. |
Office Action on U.S. Appl. No. 13/241,051, dated Feb. 27, 2014, 21 pages. |
Office Action on U.S. Appl. No. 13/247,742, dated Dec. 3, 2013, 11 pages. |
REVS Product Information Sheet, Sierra Nevada Corporation, dated May 7, 2014, 2 pages. |
Skolnik, Introduction to Radar Systems, McGraw Hill Book Company, 2001, 3 pages. |
Skolnik, Radar Handbook, McGraw Hill Book Company, 1990, 23 pages. |
Technical Standard Order, TSO-C115b, Airborne Area Navigation Equipment Using Multi-Sensor Inputs. Department of Transportation, Federal Aviation Administration, Sep. 30, 1994, 11 pages. |
Office Action on U.S. Appl. No. 11/900,002, dated Jun. 8, 2010. |
Office Action on U.S. Appl. No. 13/247,742, dated Apr. 16, 2014, 15 pages. |
Vadlamani, A., et al., Improving the detection capability of spatial failure modes using downward-looking sensors in terrain database integrity monitors, Digital Avionics Systems Conference, 2003. DASC '03. The 22nd, vol. 2, pp. 9.C.5-1-12, Oct. 12-16, 2003. |
Wang et al., A Simple Based on DSP Antenna Controller of Weather Radar, 2001 CIE International Conference, 4 pages. |
U.S. Appl. No. 13/857,955, filed Apr. 5, 2013, Barber et al. |
Synthetic Vision System, en.wikipedia.org/wiki/Synthetic_vision_system, retrieved Feb. 28, 2013, 4 pages. |
U.S. Appl. No. 11/851,323, filed Sep. 6, 2007, McCusker. |
U.S. Appl. No. 11/863,219, filed Sep. 27, 2007, Woodell. |
U.S. Appl. No. 11/863,221, filed Sep. 27, 2007, Woodell. |
U.S. Appl. No. 11/899,801, filed Sep. 6, 2007, Woodell et al. |
U.S. Appl. No. 11/900,002, filed Sep. 6, 2007, Woodell et al. |
U.S. Appl. No. 12/167,200, filed Jul. 2, 2008, Woodell et al. |
U.S. Appl. No. 12/167,203, filed Jul. 2, 2008, Woodell. |
U.S. Appl. No. 12/167,208, filed Jul. 2, 2008, Dickerson et al. |
U.S. Appl. No. 12/180,293, filed Jul. 25, 2008, Woodell et al. |
U.S. Appl. No. 12/786,169, filed May 24, 2010, Nemeth et al. |
U.S. Appl. No. 13/224,992, filed Sep. 2, 2011, Hufnagel et al. |
U.S. Appl. No. 13/250,307, filed Sep. 30, 2011, Jinkins. |
U.S. Appl. No. 13/250,798, filed Sep. 30, 2011, Jinkins. |
“MountainScope™ on a TabletPC,” PCAvionics™, printed from website www.pcavionics.com on Aug. 28, 2007, 1 page. |
TAWS CLASS A and CLASS B, Terrain Awareness and Warning Systems, Universal® Avionics Systems Corporation, Sep. 2007, 6 pages. |
“TAWS Terrain Awareness and Warning System,” Universal® Avionics, printed from website www.uasc.com on Aug. 28, 2007, 2 pages. |
Adams, Charlotte, “Synthetic Vision: Picturing the Future,” Avionics magazine, Oct. 1, 2006, printed from website www.aviationtoday.com, 4 pages. |
Adams, Charlotte, “Synthetic Vision: Picturing the Future,” Avionics magazine, Solutions for Global Airspace Electronics, Oct. 2006, cover and pp. 22-29. |
Advisory Action for U.S. Appl. No. 12/009,472, mail date Feb. 25, 2013, 3 pages. |
Advisory Action for U.S. Appl. No. 13/538,957, mail date Jun. 14, 2013, 6 pages. |
Blue Mountain Avionics' Products, printed from website www.bluemountainavionics.com on Aug. 28, 2007, 4 pages. |
Carter, S. P., D. D. Blankenship, M. E. Peters, D. A. Young, J. W. Holt, and D. L. Morse (2007), Radar-based subglacial lake classification in Antarctica, Geochem. Geophys. Geosyst., 8, Q03016, doi:10.1029/2006GC001408, 20 pages. |
Final Office Action on U.S. Appl. No. 13/250,798, dated Sep. 4, 2014, 22 pages. |
Final Office Action on U.S. Appl. No. 13/867,556, dated Jul. 3, 2014, 11 pages. |
Final Office Action on U.S. Appl. No. 13/250,307, dated Jun. 11, 2014, 8 pages. |
Final Office Action on U.S. Appl. No. 13/250,798, dated Aug. 7, 2015, 21 pages. |
G2000, Garmin, printed from website https://buy.garmin.com/shop/shop.do?cID=153&pID=97668 on Jun. 28, 2011, 2 pages. |
G3000, Garmin, printed from website https://buy.garmin.com/shop/shop.do?cID=153&pID=66916 on Jun. 28, 2011, 2 pages. |
G5000, Garmin, printed from website https://buy.garmin.com/shop.do?cID=153&pID=90821&ra=true on Apr. 20, 2011, 2 pages. |
Non-Final Office Action on U.S. Appl. No. 13/250,798, dated Mar. 18, 2015, 21 pages. |
Notice of Allowance for U.S. Appl. No. 12/009,372, mail date Oct. 13, 2011, 8 pages. |
Notice of Allowance for U.S. Appl. No. 12/009,373, mail date Jun. 16, 2010, 4 pages. |
Notice of Allowance for U.S. Appl. No. 12/009,472, mail date Sep. 5, 2013, 8 pages. |
Notice of Allowance for U.S. Appl. No. 12/786,169, mail date Mar. 28, 2013, 6 pages. |
Notice of Allowance for U.S. Appl. No. 13/538,957, mail date Oct. 3, 2013, 13 pages. |
Office Action for U.S. Appl. No. 12/009,372, mail date Dec. 20, 2010, 10 pages. |
Office Action for U.S. Appl. No. 12/009,372, mail date Jun. 13, 2011, 9 pages. |
Office Action for U.S. Appl. No. 12/009,373, mail date Dec. 30, 2009, 14 pages. |
Office Action for U.S. Appl. No. 12/009,472, mail date Apr. 16, 2012, 16 pages. |
Office Action for U.S. Appl. No. 12/009,472, mail date Jan. 14, 2011, 14 pages. |
Office Action for U.S. Appl. No. 12/009,472, mail date Mar. 20, 2013, 15 pages. |
Office Action for U.S. Appl. No. 12/009,472, mail date Nov. 3, 2011, 15 pages. |
Office Action for U.S. Appl. No. 12/009,472, mail date Nov. 9, 2012, 15 pages. |
Office Action for U.S. Appl. No. 12/263,282, mail date Jan. 5, 2012, 10 pages. |
Office Action for U.S. Appl. No. 12/786,169, mail date Jan. 18, 2013, 14 pages. |
Office Action for U.S. Appl. No. 12/892,563, mail date Feb. 19, 2013, 12 pages. |
Office Action for U.S. Appl. No. 13/224,992, mail date Feb. 28, 2013, 10 pages. |
Office Action for U.S. Appl. No. 13/250,307, mail date Nov. 5, 2013, 11 pages. |
Office Action for U.S. Appl. No. 13/538,957, mail date Apr. 4, 2013, 19 pages. |
Office Action for U.S. Appl. No. 13/538,957, mail date Oct. 5, 2012, 18 pages. |
Office Action for U.S. Appl. No. 13/743,182, mail date Apr. 8, 2013, 10 pages. |
Office Action for U.S. Appl. No. 12/786,169, mail date Jul. 20, 2012, 8 pages. |
Office Action in Japanese Patent Application 2015-116688, dated Aug. 25, 2015, 4 pages. |
Office Action in Japanese Patent Application 2015-116716, dated Aug. 25, 2015, 3 pages. |
Office Action on U.S. Appl. No. 12/236,464, mail date Feb. 11, 2014, 21 pages. |
Office Action on U.S. Appl. No. 12/236,464, mail date Jun. 22, 2011, 14 pages. |
Office Action on U.S. Appl. No. 13/250,798, dated Apr. 23, 2014, 15 pages. |
Office Action on U.S. Appl. No. 13/867,556, dated Feb. 7, 2014, 11 pages. |
Office Action for U.S. Appl. No. 11/787,460, mail date Aug. 31, 2010, 18 pages. |
Office Action with English Translation received in Korean Patent Application 10-2010-7017278, dated Aug. 26, 2015, 5 pages. |
Pictures of DELPHINS, printed from website www.tunnel-in-the-sky.tudelft.nl on Aug. 28, 2007, 4 pages. |
Restriction Requirement for U.S. Appl. No. 13/867,556, mail date Dec. 26, 2013, 6 pages. |
Van Kasteren, Joost, "Tunnel-in-the-Sky, Synthetic vision simplifies the pilot's job and enhances safety," printed from website www.delftoutlook.tudelft.nl on Aug. 28, 2007, 13 pages. |
Walker, GD-Itronix Dynavue Technology, The Ultimate Outdoor-Readable Touch-Screen Display, Rugged PC Review, 4 pages. |
Non-Final Office Action on U.S. Appl. No. 13/250,798 dated Feb. 26, 2016, 9 pages. |
Notice of Allowance on U.S. Appl. No. 12/263,282 dated Jan. 29, 2016, 8 pages. |