The present application is related to U.S. patent application Ser. No. 14/482,681 filed on Sep. 10, 2014 by Wood et al., U.S. patent application Ser. No. 14/301,199 filed on Jun. 10, 2014 by McCusker et al., U.S. patent application Ser. No. 13/627,788 filed on Sep. 26, 2012, U.S. patent application Ser. No. 12/892,563 filed on Sep. 28, 2010, U.S. patent application Ser. No. 13/250,798 filed on Sep. 30, 2011, U.S. patent application Ser. No. 12/236,464 filed on Sep. 28, 2008, U.S. patent application Ser. No. 12/167,200 filed on Jul. 2, 2008, U.S. patent application Ser. No. 12/180,293 filed on Jul. 25, 2008, U.S. patent application Ser. No. 13/247,742 filed on Sep. 28, 2011, U.S. patent application Ser. No. 11/851,323 filed on Sep. 6, 2007, U.S. patent application Ser. No. 11/904,491 filed on Sep. 26, 2007, U.S. patent application Ser. No. 13/241,051 filed on Sep. 22, 2011, and U.S. patent application Ser. No. 12/263,282 filed on Oct. 31, 2008, all of which are herein incorporated by reference in their entireties and assigned to the assignee of the present application.
An aircraft uses an enhanced vision system (EVS) to provide imagery to the aircraft crew. The imagery can include an airport terminal area and runway environment when meteorological conditions prevent a clear natural view of the external surroundings of the aircraft through the windscreen. For example, the EVS may overlay an image of an airport terminal area and runway environment over the pilot's natural unaided view of the external surroundings of the aircraft through the aircraft's cockpit windscreen. Such imagery can improve the situational awareness of the flight crew during instrument approach procedures in low visibility conditions such as fog. That same enhanced vision system can be used as an FAA-certified enhanced flight vision system (EFVS), which can allow pilots landing under instrument flight rules to operate below certain specified altitudes during instrument approaches even when the airport environment is not visible. For example, under Title 14 of the Code of Federal Regulations, part 91, a pilot may not descend below decision altitude (DA) or minimum descent altitude (MDA) to 100 feet above the touchdown zone elevation (TDZE) on a straight-in instrument approach procedure (IAP), other than Category II or Category III, unless the pilot can see certain required visual references. Such visual references include, for example, the approach lighting system, the threshold lighting system, and the runway edge lighting system. The pilot may, however, use an EFVS to identify the required visual references in low visibility conditions in which the pilot's natural unaided vision cannot identify them. Accordingly, the use of an EFVS may minimize losses due to the pilot's inability to land the aircraft and deliver cargo and/or passengers on time in low visibility conditions.
EVS imagery is typically presented to the pilot flying (PF) on a head up display (HUD). The HUD is typically a transparent display device that allows the PF to view EVS imagery while looking at the external surroundings of the aircraft through the cockpit windscreen. As long as visibility conditions outside of the aircraft permit the PF to see the external surroundings of the aircraft through the cockpit windscreen, the PF can verify that the EVS is functioning properly such that the imagery on the HUD is in alignment with the PF's view of the external surroundings of the aircraft.
EVS imagery is sometimes also presented to the pilot monitoring (PM) on a head down display (HDD). For example, in some countries, the system must present the EVS imagery to the PM for confirmation that the EVS information is a reliable and accurate indicator of the required visual references. The PM may also use the EVS imagery to determine whether the PF is taking appropriate action during approach and landing procedures. The HDD is typically a non-transparent display device mounted adjacent to or within a console or instrument panel of the aircraft.
An EVS typically uses either a passive or active sensing system to acquire data used to generate imagery of the airport terminal area and runway environment. A typical passive sensor, such as a forward looking infrared (FLIR) camera or visible light spectrum camera, receives electromagnetic energy from the environment and outputs data that may be used by the system to generate video images from the point of view of the camera. The camera is installed in an appropriate position, such as in the nose of an aircraft, so that the PF may be presented with an appropriately scaled and positioned video image on the HUD having nearly the same point of view as the PF when viewing the external surroundings of the aircraft through the HUD. However, while passive sensors provide higher quality video imagery, they may be unable to identify required visual references in certain low visibility conditions such as heavy fog.
Active sensing systems, such as millimeter wavelength (MMW) radar systems (e.g., 94 GHz), transmit electromagnetic energy into the environment and then receive return electromagnetic energy reflected from the environment. The active sensing system is typically installed in an appropriate position, such as in the nose of an aircraft. Active sensing systems are expensive and occupy space on board the aircraft that could otherwise be used for other types of equipment. In addition, MMW radar systems require expensive radome technology.
Additionally, both passive FLIR cameras and active millimeter wavelength radar systems may have limited range in certain low visibility conditions such as heavy fog.
Thus, there is a need for real time or near real time sensing systems for and methods of providing enhanced vision at longer ranges and in inclement weather. Further, there is a need for real time or near real time sensing systems for and methods of providing enhanced vision imagery that is less expensive and does not require additional space on the aircraft. There is also a need for display systems for and methods of providing images of the external scene topography using radar data from a weather radar system. There is still a further need for systems for and methods of providing images of the runway environment derived from weather radar data where such images enable operation below certain specified altitudes during instrument approaches. Further still, there is a need for systems and methods that achieve higher resolution imaging using X-band and C-band radar data.
It would be desirable to provide a system and/or method that provides one or more of these or other advantageous features. Other features and advantages will be made apparent from the present specification. The teachings disclosed extend to those embodiments which fall within the scope of the appended claims, regardless of whether they accomplish one or more of the aforementioned needs.
In one aspect, embodiments of the inventive concepts disclosed herein are directed to an image processing system for enhanced vision including a processor and memory coupled to the processor. The memory contains program instructions that, when executed, cause the processor to provide radar beams and receive radar returns with improved angular and/or range resolution for deriving image data of the external scene topography.
In a further aspect, embodiments of the inventive concepts disclosed herein are directed to a vision method using, or a vision system including, a weather radar system configured to enhance resolution in range and in azimuth. The weather radar system generates image data associated with radar returns received by the weather radar system. The radar returns are in an X-band or a C-band. The vision system also includes a display in communication with the weather radar system configured to display an image associated with the image data.
In a further aspect, embodiments of the inventive concepts disclosed herein are directed to an airborne weather radar system which provides enhanced vision. The weather radar system includes an antenna, and a control circuit configured to provide radar beams via the antenna toward external surroundings and configured to receive radar returns. The control circuit is configured to process the radar returns to provide image data associated with the external surroundings. The weather radar system provides increased range resolution and increased angular resolution compared to weather radar sensing functions for the radar returns used to provide the image data. The radar beams are in the X-band or the C-band, and the image data is for providing a visual image of the external scene topography to a pilot.
In a further aspect, embodiments of the inventive concepts disclosed herein are directed to a method that provides a real time sensor image. The method includes receiving radar returns from an X-band or C-band airborne weather radar system. The radar returns can be processed to have increased range resolution and angular resolution and are received from external surroundings. The method also includes providing a visual image of the external scene topography based on the radar returns.
Implementations of the inventive concepts disclosed herein may be better understood when consideration is given to the following detailed description thereof. Such description makes reference to the annexed drawings, which are not necessarily to scale, and in which some features may be exaggerated and some features may be omitted or may be represented schematically in the interest of clarity. Like reference numerals in the figures may represent and refer to the same or similar element, feature, or function. In the drawings:
Before describing in detail the inventive concepts disclosed herein, it should be observed that the inventive concepts disclosed herein reside primarily in a novel structural combination of data/signal processing components and communications circuits, and not in the particular detailed configurations thereof. Accordingly, the structure, methods, functions, control and arrangement of components, software, and circuits have, for the most part, been illustrated in the drawings by readily understandable block representations and schematic diagrams, in order not to obscure the disclosure with structural details which will be readily apparent to those skilled in the art, having the benefit of the description herein. Further, the inventive concepts disclosed herein are not limited to the particular embodiments depicted in the exemplary diagrams, but should be construed in accordance with the language in the claims.
According to various exemplary embodiments, an EVS or display system may be provided with radar sensing and imagery displayable to a pilot or co-pilot on an aircraft display, such as an HDD or HUD. For example, the display system may include or use a weather radar system to display an image based upon radar return data. In some embodiments, a Doppler weather radar system may be configured to have enhanced resolution (e.g., angular resolution and/or range resolution). Reflectivity of radar returns from runway structures in an airport terminal or runway environment, such as an approach lighting system, a threshold lighting system, and/or a runway edge lighting system, can be sensed. As will be appreciated, using a weather radar system configured according to the various exemplary embodiments provides greater range than millimeter wavelength radar sensing systems or passive FLIR or visible light camera systems in low visibility conditions, such as heavy fog, given the weather radar system's superior ability to penetrate heavy fog.
Using the weather radar system configured according to the various exemplary embodiments may also provide EVS imagery having sufficient accuracy in low visibility conditions (given that many of the visual references required under Title 14 of the Code of Federal Regulations, part 91, such as, approach lighting systems, threshold lighting systems, runway edge lighting systems, and other runway structures, are metallic structures that exhibit high radar reflectivity). The imagery may allow lower landing minima (e.g., 100 feet or less) in some embodiments. In some embodiments, the lack of radar returns from the runway surface combined with runway structures and lights can provide a suitable image for runway identification by the pilot.
The display system includes a radar processing module in communication with the radar system and configured to generate high resolution radar image data for display in some embodiments. The image data is processed to provide a two-dimensional aircraft situation display (e.g., a vertical profile display or a plan view display) or a three dimensional or perspective aircraft situation display image representative of the 3-D positions of runway structures in an airport terminal or runway environment based on the radar returns, as described in U.S. patent application Ser. Nos. 14/301,199 and 14/482,681, incorporated herein by reference in their entireties, in some embodiments. For example, the radar processing module can be embodied as a processor and a non-transitory memory containing program instructions that, when executed, cause the processor to provide radar beams, receive radar returns, and generate image data from the radar returns. In some embodiments, program instructions stored on the non-transitory medium can cause the processor to filter the radar returns data to remove noise.
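By way of illustration only, a minimal sketch of such noise filtering is shown below, assuming Python with NumPy and SciPy and an illustrative 2-D frame of radar returns; the median filter and kernel size are examples chosen here and are not details specified by this disclosure.

```python
import numpy as np
from scipy.ndimage import median_filter

def denoise_radar_frame(radar_frame, kernel_size=3):
    """Suppress isolated speckle-like noise in a 2-D radar return frame with a
    small median filter before the frame is rendered for display."""
    return median_filter(np.asarray(radar_frame, dtype=float), size=kernel_size)
```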
According to certain exemplary embodiments, a radar system, such as a weather radar system, can be used to detect features of a runway environment. Utilizing the high radar cross section associated with the metal content of runway lighting advantageously allows detection to be achieved day or night, regardless of whether the runway lights are on or off, in some embodiments. In one embodiment, the regular, periodic, equally spaced nature of visual aids, such as the approach lighting system, runway edge lights, taxiway lights, and center line lights, can be identified from the image generated from the radar data. In certain embodiments, the systems and methods can be utilized as an extension to a combined vision system (CVS).
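As a rough illustration of identifying the regular, periodic spacing of such visual aids, the following sketch (Python with NumPy; the profile, bin size, and spacing values are illustrative assumptions, not values from this disclosure) locates the dominant repeat distance of bright returns along a 1-D reflectivity profile using its autocorrelation.

```python
import numpy as np

def estimate_light_spacing(profile, bin_size_m):
    """Estimate the dominant spacing, in meters, of bright periodic returns
    (e.g., approach lights or runway edge lights) along a 1-D reflectivity
    profile by locating the strongest nonzero lag of its autocorrelation."""
    x = np.asarray(profile, dtype=float)
    x = x - x.mean()                                   # remove the DC component
    ac = np.correlate(x, x, mode="full")[x.size - 1:]  # lags 0 .. N-1
    ac[0] = 0.0                                        # ignore the zero-lag peak
    return np.argmax(ac) * bin_size_m

# Illustrative use: a bright return every 20 bins of 3 m gives ~60 m spacing.
profile = np.zeros(512)
profile[::20] = 1.0
print(estimate_light_spacing(profile, bin_size_m=3.0))   # 60.0
```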
Referring to
The flight displays 20 and the combiner 21 can be used to provide information to the flight crew, thereby increasing visual range and enhancing decision-making abilities. In an exemplary embodiment, the flight displays 20 and the combiner 21 can include a weather display, a joint display, a weather radar map and a terrain display. Further, the flight displays 20 may include images from a synthetic vision system (SVS) or an enhanced vision system (EVS) (e.g., an EFVS). For example, the flight displays 20 can include a display configured to display a perspective image of terrain and/or weather information. Other views of terrain and/or weather information may also be provided (e.g., plan view, horizontal view, vertical view, or combinations thereof). Additionally, the flight displays 20 can be implemented using any of a variety of display technologies, including CRT, LCD, organic LED, dot matrix display, and others.
According to some embodiments, the display system 10 is configured to provide an image based upon radar data to at least one of the displays 20 or the combiner 21. In
In some embodiments, a symbol or icon for the runway 23 and extended centerline 27 can be provided on the displays 20 or the combiner 21. In some embodiments, the runway 23 and extended centerline 27 can be associated with SVS data. A set of runway features 29, such as, approach lighting system or other runway and taxi way lights, can be indicated on the displays 20 or the combiner 21 in some embodiments. The runway features 29 can be associated with radar data in some embodiments.
Referring to
The filter 154, the image renderer 155, the image merge module 160, and the image merge control/configuration module 162 can be embodied as software modules operating on a computing platform or a processor 175 and can be stored on a non-transitory medium. The processor 175 can be part of or integrated with the radar system 102, the SVS 111, the EVS 112, HDD display computer 132, or HUD computer 134 in certain embodiments. In one embodiment, processor 175 is an independent platform.
The radar system 102 is a weather radar system generally located inside the nose of the aircraft, inside a cockpit of the aircraft, on the top of the aircraft or on the tail of the aircraft in some embodiments. The radar system 102 can include a radar data storage unit 180, a radar antenna 182 and a processor 185. The radar system 102 can be a weather radar system, such as, a Multiscan™ radar system from Rockwell Collins, Inc. configured as described herein. The radar system 102 can utilize a split, half or sub-aperture or other technique for obtaining radar data associated with external surroundings in some embodiments. The radar system 102 can use the split or sub-aperture techniques of the radar systems described in U.S. application Ser. Nos. 13/627,788, 12/892,563, 13/250,798, 12/236,464, and 12/167,200 and U.S. Pat. No. 8,077,078, incorporated herein by reference and assigned to the assignee of the present application. The type of the radar system 102 and data gathering techniques are not discussed in the specification in a limiting fashion.
The processor 185 receives radar returns (e.g., weather radar returns data) from the radar antenna 182, processes the radar returns and provides the radar data in radar data storage unit 180. In certain embodiments, the data stored in radar data storage unit 180 can be stored as an image frame representing the data from a radar scan of the external surroundings (e.g., a runway environment).
The radar system 102 provides radar data (e.g., weather radar data) in the storage unit 180 to a filter 154 in one embodiment. In one embodiment, the image renderer 155 or other image generator can generate an image frame from the data stored in the radar data storage unit 180 or filtered by the filter 154 and provide the image frame to the memory 156. Alternatively, the processor 185 can build the frame or image based upon radar return data from the radar system 102. Similarly, the SVS 111 can provide data or a frame for an SVS image received by a memory 152. Alternatively, the display system 10 can provide the data or image frame to the memory 152 in response to data from the SVS 111. Similarly, the EVS 112 can provide data or a frame for an EVS image received by a memory 153. Alternatively, the display system 10 can provide the data or image frame to the memory 153 in response to data from the EVS 112.
The radar data associated with the external surroundings can represent detected targets and the location of the detected targets. Targets include terrain, man-made features, objects, runways, etc. Improved angular resolution and range resolution techniques allow the location of the targets to be more accurately determined and represented in image data in some embodiments. The radar system 102 scans the external surroundings in front of the aircraft to sense the location of targets. The radar system 102 can utilize clutter suppression and Doppler filtering to improve performance in some embodiments.
In some embodiments, the radar system 102 provides data representing a 120 degree field of view in accordance with a weather radar sweep which takes approximately five seconds to complete in one embodiment. The sweep can be limited during approach to a 30 degree sweep, which requires five seconds before new data is available for display, in certain embodiments. The sweep is directed toward the surface of the Earth so that returns are obtained which allow runway environment features to be detected. Various types of sweeps, scans, and timings of sweeps and scans can be utilized without departing from the scope of the invention.
The radar system 102 embodied as a weather radar allows existing avionic equipment to be used as a real-time sensor for providing a radar-derived enhanced image of the external scene topography to the pilot in some embodiments. The image or representation generated by the radar system 102 and provided on the displays 20 or the combiner 21 can function as an EVS to provide situation awareness to the pilot in some embodiments. In other embodiments, the image or representation generated by the radar system 102 and provided on the displays 20 or the combiner 21 can function as an EFVS to allow lower landing minima.
The radar system 102 includes a range resolution module 190 and an angle resolution module 192 in some embodiments. The range resolution module 190 advantageously increases the range resolution of the radar system 102 when compared to conventional weather sensing operations in some embodiments. The angle resolution module 192 advantageously increases the angle resolution of the radar system 102 when compared to conventional weather sensing operations in some embodiments. The increased resolution in range and angle allows a higher resolution image to be provided on the displays 20 and the combiner 21 in some embodiments. The range resolution module 190 and the angle resolution module 192 can be software modules executed by the processor 185.
According to some embodiments, the radar system 102 under control of the angle resolution module 192 can use a beam sharpening method to achieve increased angular resolution. In some embodiments, the radar system 102 can utilize techniques such as beam sharpening (e.g., horizontal beam sharpening) and de-convolution of the beam point spread function for improved angular resolution. In some embodiments, the radar system 102 can use beam sharpening as a process that improves the poor, antenna-induced angular resolution (e.g., due to the beam width). Many methods can be used, such as Doppler beam sharpening, synthetic aperture radar (SAR), monopulse radar, and sub-aperture or split-aperture radar. Mathematical methods can be utilized to determine a center of the radar echo for identifying runway features. Techniques for beam sharpening are discussed in U.S. patent application Ser. Nos. 13/627,788, 12/892,563, 13/250,798, 12/236,464, and 12/167,200 and U.S. Pat. No. 8,077,078, incorporated herein by reference in their entireties.
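One possible form of de-convolution of the beam point spread function is sketched below, assuming Python with NumPy; the Wiener regularization term and the availability of a sampled two-way beam pattern on the same azimuth grid are assumptions for illustration and do not represent the specific implementation of the radar system 102.

```python
import numpy as np

def sharpen_azimuth(scan_row, beam_psf, noise_to_signal=0.05):
    """Wiener de-convolution of one constant-range row of a radar scan.

    scan_row : measured reflectivity versus azimuth, smeared by the beam width
    beam_psf : sampled two-way antenna pattern (point spread function) on the
               same azimuth grid as scan_row"""
    n = scan_row.size
    H = np.fft.rfft(beam_psf, n)                          # beam response in the frequency domain
    Y = np.fft.rfft(scan_row, n)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)   # regularized inverse filter
    return np.fft.irfft(Y * W, n)
```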
The radar system 102 can use the radar antenna 182 configured as a switched aperture antenna for beam sharpening. The radar system 102 can also be configured for sequential lobing or monopulse operation to accurately estimate the angle at which the target was located within the radar beam. In some embodiments, the radar beams provided by the radar antenna 182 and the returns received by the radar antenna 182 associated with the radar system 102 can be separated into two or more portions and can be used to determine an angle from the radar antenna 182 to a target or a vector from the radar antenna 182 to a target such as a runway feature. The vector can be represented as an angle (boresight angle) and a range to the target. Various processes can be utilized to calculate the angle or vector to the target.
The radar system 102 uses the radar antenna 182 that toggles between transmitting and receiving on the full aperture and transmitting on the full aperture while receiving on a partial aperture in some embodiments. These techniques can be used to accurately estimate the angle at which the target was located within the radar beam and to improve the accuracy of the Doppler calculations by correcting for those angles. The received returns can be processed to determine a high resolution estimate of a target angle relative to the boresight of the antenna beam. According to some embodiments, the returns can be processed using a complex conjugate multiplication method to determine the target angle. The processing is related to sequential lobing processing but is executed in the phase domain as opposed to the common amplitude domain in some embodiments.
In some embodiments, the radar system 102 uses sequential lobing techniques in which two antennas located close to one another are used, going back and forth between the two antennas. An amplitude signature or phase signature that varies between the two halves of the antenna may be used to obtain data about target position for detected targets (e.g., objects such as other aircraft, terrain, or towers). Sequential lobing generally does not use phase comparisons with moving targets because Doppler-induced phase changes contaminate the phase center measurement. However, using a complex conjugate multiply method allows the Doppler-induced phase changes to be removed by cancellation. Therefore, a change in phase center between multiple different sub-apertures may be determined and used to determine the angle to the target.
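A minimal sketch of the complex conjugate multiply approach is given below, assuming Python with NumPy and synthetic returns from two sub-apertures; the baseline, wavelength, and Doppler values are illustrative assumptions rather than parameters of the disclosed system. The element-wise conjugate multiply cancels the Doppler phase common to both sub-apertures, leaving the interferometric phase that indicates the target's offset from boresight.

```python
import numpy as np

def off_boresight_angle(left_returns, right_returns, baseline_m, wavelength_m):
    """Estimate target angle from returns received on two sub-apertures.

    The element-wise complex-conjugate multiply cancels the Doppler phase
    common to both sub-apertures, leaving the interferometric phase due to
    the target's offset from the antenna boresight."""
    cross = left_returns * np.conj(right_returns)   # common Doppler phase cancels
    delta_phi = np.angle(np.sum(cross))              # averaged phase difference
    # phase-comparison relation: delta_phi = 2*pi*d*sin(theta)/lambda
    return np.arcsin(delta_phi * wavelength_m / (2.0 * np.pi * baseline_m))

# Illustrative use with synthetic data (all values assumed):
lam, d, theta = 0.032, 0.3, np.deg2rad(2.0)              # wavelength, sub-aperture spacing, true angle
doppler = np.exp(1j * 2 * np.pi * 0.1 * np.arange(64))   # Doppler progression common to both channels
phase = 2 * np.pi * d * np.sin(theta) / lam
left = doppler
right = doppler * np.exp(-1j * phase)
print(np.rad2deg(off_boresight_angle(left, right, d, lam)))   # ~2.0 degrees
```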
In some embodiments, the range resolution module 190 provides higher resolution by increasing the effective waveform bandwidth of the radar system 102. The range resolution module 190 can use stepped-frequency compression in some embodiments. To provide higher range resolution, the range resolution module 190 can control the radar system 102 to provide ultra-wideband (UWB) radar beams (e.g., extremely narrow pulses with high power), or to provide intra-pulse compression (frequency or phase modulation of the transmitted pulse) in some embodiments. Frequency coding techniques, including the common linear frequency modulation (LFM) or chirp method and discrete coded segments within the pulse, can be utilized in some embodiments. Phase coding techniques, including binary phase codes as well as various polyphase codes, can be utilized in some embodiments. To provide higher range resolution, the range resolution module 190 can control the radar system 102 to provide interpulse compression or stepped frequency compression (e.g., successive pulses with discrete increasing frequency steps) in some embodiments. In some embodiments, stepped frequency compression advantageously achieves high effective bandwidth with narrow instantaneous bandwidth. The receive bandwidth is smaller, has lower noise bandwidth, and yields a higher signal to noise ratio in some embodiments. Analog-to-digital sampling rates are lower (versus pulse compression) in some embodiments. In addition, stepped frequency compression has a smaller peak power (e.g., when compared to an impulse), provides flexible transmit frequency control, can “hop” over restricted or undesired transmit frequencies, enables adaptive/cognitive frequency use, and rejects later received clutter from earlier transmit pulses in some embodiments. Further, stepped frequency compression produces returns from clutter in ambiguous ranges at frequencies that differ from the returns from targets, and rejects the ambiguous clutter returns in the receiver IF filter of the radar system 102 in some embodiments. Stepped frequency compression generally does not achieve range resolution with a single pulse, requires transmission, reception, and processing of a group of pulses for any one range bin, and has more pronounced range-Doppler coupling (e.g., different Doppler shifts for each frequency) in some embodiments.
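As an illustration of stepped frequency compression, the sketch below (Python with NumPy; the number of steps, step size, and target range are assumptions) forms a synthetic high resolution range profile by taking an inverse FFT across one burst of frequency-stepped returns, showing how the effective bandwidth becomes the number of steps times the step size even though each pulse is narrowband.

```python
import numpy as np

def stepped_frequency_range_profile(iq_per_step, freq_step_hz):
    """Form a high-resolution range profile from one burst of stepped-frequency pulses.

    iq_per_step : one complex sample per pulse, taken at the same coarse range
                  gate while the carrier is stepped by freq_step_hz each pulse.
    The IFFT across the frequency steps synthesizes a range profile whose
    resolution is c / (2 * N * freq_step_hz)."""
    c = 3.0e8
    n = iq_per_step.size
    profile = np.fft.ifft(iq_per_step, n)
    range_axis = np.arange(n) * c / (2.0 * n * freq_step_hz)   # meters per synthetic bin
    return range_axis, np.abs(profile)

# Illustrative use: a point target 45 m into the coarse gate (values assumed).
n, df, r = 64, 2.0e6, 45.0
iq = np.exp(-1j * 4 * np.pi * (np.arange(n) * df) * r / 3.0e8)
rng, mag = stepped_frequency_range_profile(iq, df)
print(rng[np.argmax(mag)])   # ~45 m, to within one synthetic range bin
```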
According to one embodiment, the SVS 111 can be any electronic system or device for providing a computer generated image of the external scene topography. The image can be from the perspective of the aircraft flight deck as derived from aircraft attitude, high-precision navigation solutions, and a database of terrain, obstacles and runway features. Generally, only those terrain, obstacle, and runway features which are contained in the current version of the SVS database are displayed in a conventional system. In some embodiments, the pilot uses the synthetic vision images as enhancements to available visual cues.
According to one embodiment, the EVS 112 can be any electronic system or device for providing a sensed image of the external scene topography. The EVS 112 can be an infrared camera in one embodiment.
In some embodiments, the display system 10 combines or fuses images from the HUD computer 134, the SVS 111, and/or the EVS 112 with the image derived from radar data from the radar system 102 to provide an overall image presented to the pilot. In some embodiments, the image derived from the radar data is fused with HUD symbology for the displays 20 or the combiner 21.
The SVS 111 can include a terrain database and a processor according to one exemplary embodiment. The terrain database can be used to create a perspective image of the scene in front of the aircraft on a two-dimensional display or a three dimensional display. The terrain database can employ topographical colors similar to those depicted on standard aeronautical charts.
The SVS 111 can also receive aircraft position data from an aircraft data source. The aircraft data source can include any system or sensor (or combination thereof) that provides navigation data or aircraft flight parameters. For example, a typical navigation system in an aircraft has numerous sub-systems. Sub-systems which provide aircraft position data and flight parameter data could include, but are not limited to, an inertial navigation system (INS), a global navigation satellite system (e.g., global positioning system (GPS)), air data sensors, compasses, and a flight management computer (FMC).
In some embodiments, the filter 154 processes the radar data for better image quality. The filter 154 can be located in the radar system 102. The filter 154 can reduce noise and employ anti-speckling filtering, Kalman filtering, Chebyshev filtering, adaptive filtering, smoothing, etc. The filter 154 can also perform anti-aliasing in some embodiments. Techniques for increasing image quality and identifying runway features are discussed in U.S. patent application Ser. No. 14/482,681, which is incorporated herein by reference.
In order to facilitate generation of clearer images, the processor 185 and/or the filter 154 may be configured to filter the radar returns data to identify areas having a reflectivity lower than a predetermined value. In some embodiments, low energy areas may be zeroed out based on their corresponding reflectivity values, such that those areas are rendered transparent. Such filtering may result in a final image containing only highly reflective structures in an airport terminal area or runway environment, such as an approach lighting system, a threshold lighting system, and/or a runway edge lighting system.
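A minimal sketch of such reflectivity-based filtering is shown below, assuming Python with NumPy; the threshold value and the gray-scale RGBA output format are illustrative choices rather than details of the disclosed system.

```python
import numpy as np

def mask_low_reflectivity(reflectivity, threshold):
    """Render low-reflectivity areas transparent so that only highly reflective
    runway structures remain in the final image.

    reflectivity : 2-D array of radar reflectivity values
    threshold    : value below which a pixel is zeroed out / made transparent
    Returns a gray-scale RGBA image (alpha = 0 where the return is weak)."""
    refl = np.asarray(reflectivity, dtype=float)
    norm = np.clip(refl / max(refl.max(), 1e-12), 0.0, 1.0)
    alpha = np.where(refl >= threshold, 1.0, 0.0)
    return np.stack([norm, norm, norm, alpha], axis=-1)
```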
In some embodiments, the radar data from the radar data storage unit 180 is provided to the filter 154 and the image renderer 155, and then provided as image data to the memory 156 and to the HUD computer 134 or the HDD display computer 132 for providing images on the displays 20 or the combiner 21. In another embodiment, the radar data can be provided as image data to an image merge function module 160. The image merge function module 160 receives an EVS frame from the memory 153 or an SVS frame from the memory 152 and merges the data to appropriately display an EVS image or an SVS image with the image derived from the radar data.
The processor 175 executes a fusion processing algorithm for fusing the frames from the memory 152, the memory 153, and the memory 156 provided as video signals in some embodiments. This fusion process may include special formatting (positioning, sizing, cropping, etc.) of specific features or the entire image from a specific image source based on other sensor inputs or aircraft data. After the combined or fused image has been completed, the entire image is sized to fit appropriately within the total HUD field-of-view (e.g., with HUD symbology) and conformally overlay the outside scene, which is viewed through the combiner 21 of the HUD. In addition, the brightness and contrast of the overall fused image are standardized to support the brightness/contrast controls of the HUD.
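For illustration only, the sketch below shows one simple way that already-registered radar, EVS, and SVS frames could be blended per pixel in Python with NumPy; the weights and normalization are assumptions and do not represent the specific fusion processing algorithm executed by the processor 175.

```python
import numpy as np

def fuse_frames(radar_frame, evs_frame, svs_frame, weights=(0.5, 0.3, 0.2)):
    """Per-pixel weighted blend of aligned, equally sized image frames that have
    already been registered to the display field of view and normalized to [0, 1]."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                                   # keep the blend normalized
    fused = w[0] * radar_frame + w[1] * evs_frame + w[2] * svs_frame
    return np.clip(fused, 0.0, 1.0)
```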
The processors 175 and 185 can be any hardware and/or software processor or processing architecture capable of executing instructions and operating on navigational and radar data. The processors 175 and 185 can be capable of determining navigational information such as altitude, heading, bearing, and location based on data from aircraft sensors. Applicants note that flow 300 can be performed by various equipment on the aircraft, including the HUD computer 134, a display processor, the weather radar system 102, a navigation system, the SVS 111, etc., in accordance with an exemplary embodiment. The processors 175 and 185 may be, or may include, one or more microprocessors, an application specific integrated circuit (ASIC), a circuit containing one or more processing components, a group of distributed processing components, circuitry for supporting a microprocessor, or other hardware configured for processing.
The image merge control/configuration module 162 can provide format adjustments to data. The SVS 111 and the radar system 102 can each have their own specific interface type and format. Also, each of the displays 20 and the combiner 21 may require specific formatting. A standard format can be a format used in HUD processing functions. The image merge control/configuration module 162 can be implemented in hardware, software, or combinations thereof.
Real time images derived from radar data allow the pilot exact and very reliable confirmation of the presence of a runway in some embodiments. In one embodiment, localization of the pattern of runway environment features, such as the runway approach lights or the runway edge lights, allows easy recognition of the location of the runway with respect to the aircraft. In some embodiments, the image data can be processed to provide a two-dimensional aircraft situation display (e.g., vertical profile display or plan view display). In other embodiments, the image data can be processed to provide a three dimensional or perspective aircraft situation display image representative of the 3-D positions of runway environment features.
With reference to
At an operation 310, the image associated with the image data is displayed on a display via a display computer such as the HDD display computer 132 or the HUD computer 134. After operation 310, flow 300 returns to operation 302 in some embodiments.
With reference to
With reference to
The radar system 102 generally operates by sweeping a radar beam horizontally and/or vertically along the sky for weather detection. For example, radar system 102 may conduct a first horizontal sweep directly in front of the aircraft and a second horizontal sweep downward at some tilt angle (e.g., 20 degrees down). Returns from different tilt angles may be electronically merged to form a composite image for display on an electronic display, such as the displays 20 and the combiner 21 in the aircraft control center 11. Sensing of the external surroundings can be performed at higher resolutions than the weather sensing and use one or more beams directed toward the external surroundings. Sensing of the external surroundings can be performed in a more forward looking direction with smaller azimuthal sweeps than are used for weather detection in some embodiments. GPS and/or other navigation information can be used to point the radar beam toward the external surroundings associated with an airport in some embodiments.
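A minimal sketch of electronically merging returns from different tilt angles is given below, assuming Python with NumPy and reflectivity images already mapped to a common display grid; keeping the strongest return per pixel is one illustrative combining rule, not necessarily the one used by the radar system 102.

```python
import numpy as np

def merge_tilt_sweeps(sweeps):
    """Combine reflectivity images gathered at several antenna tilt angles into a
    single composite by keeping the strongest return at each pixel."""
    return np.stack(list(sweeps), axis=0).max(axis=0)
```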
In some embodiments, the weather radar system 102 may operate in a weather sense mode until approach or landing. During approach or landing, the weather radar system 102 alternately performs radar data gathering for sensing of the external surroundings, radar data gathering for weather sensing, and radar data gathering for wind shear detection. In some embodiments, during approach or landing, the weather radar system 102 alternately performs radar data gathering for sensing of external surroundings and radar data gathering for wind shear detection or other hazard detection. During approach or landing, the weather radar system 102 alternately performs radar data gathering for sensing of external surroundings and radar data gathering for weather sensing in some embodiments. In some embodiments, weather sensing operations are suspended during approach and landing.
The scope of this disclosure should be determined by the claims, their legal equivalents and the fact that it fully encompasses other embodiments which may become apparent to those skilled in the art. All structural, electrical and functional equivalents to the elements of the above-described disclosure that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the present claims. A reference to an element in the singular is not intended to mean one and only one, unless explicitly so stated, but rather it should be construed to mean at least one. No claim element herein is to be construed under the provisions of 35 U.S.C. §112, sixth paragraph, unless the element is expressly recited using the phrase “means for.” Furthermore, no element, component or method step in the present disclosure is intended to be dedicated to the public, regardless of whether the element, component or method step is explicitly recited in the claims.
Embodiments of the inventive concepts disclosed herein have been described with reference to drawings. The drawings illustrate certain details of specific embodiments that implement the systems and methods and programs of the present disclosure. However, describing the embodiments with drawings should not be construed as imposing any limitations that may be present in the drawings. The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing its operations. Embodiments of the inventive concepts disclosed herein may be implemented using an existing computer processor, or by a special purpose computer processor incorporated for this or another purpose or by a hardwired system.
As noted above, embodiments within the scope of the inventive concepts disclosed herein include program products comprising non-transitory machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media may be any available media that may be accessed by a computer or other machine with a processor. By way of example, such machine-readable media may comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to carry or store desired program code in the form of machine-executable instructions or data structures and which may be accessed by a computer or other machine with a processor. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions comprise, for example, instructions and data which cause a processor to perform a certain function or group of functions.
Embodiments of the inventive concepts disclosed herein have been described in the general context of method steps which may be implemented in one embodiment by a program product including machine-executable instructions, such as program code, for example in the form of program modules executed by machines in networked environments. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Machine-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
As previously indicated, embodiments in the present disclosure may be practiced in a networked environment using logical connections to one or more remote computers having processors. Those skilled in the art will appreciate that such network computing environments may encompass many types of computers, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and so on. Embodiments in the disclosure may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
The database or system memory may include read only memory (ROM) and random access memory (RAM). The database may also include a magnetic hard disk drive for reading from and writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to a removable optical disk such as a CD ROM or other optical media. The drives and their associated machine-readable media provide nonvolatile storage of machine-executable instructions, data structures, program modules and other data for the computer. User interfaces, as described herein, may include a computer with monitor, keyboard, a keypad, a mouse, joystick or other input devices performing a similar function.
It should be noted that although the diagrams herein may show a specific order and composition of method steps, it is understood that the order of these steps may differ from what is depicted. For example, two or more steps may be performed concurrently or with partial concurrence. Also, some method steps that are performed as discrete steps may be combined, steps being performed as a combined step may be separated into discrete steps, the sequence of certain processes may be reversed or otherwise varied, and the nature or number of discrete processes may be altered or varied. The order or sequence of any element or apparatus may be varied or substituted according to alternative embodiments. Accordingly, all such modifications are intended to be included within the scope of the present disclosure.
The foregoing description of embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the subject matter to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the subject matter disclosed herein. The embodiments were chosen and described in order to explain the principles of the disclosed subject matter and its practical application to enable one skilled in the art to utilize the disclosed subject matter in various embodiments and with various modifications as are suited to the particular use contemplated. Other substitutions, modifications, changes and omissions may be made in the design, operating conditions and arrangement of the embodiments without departing from the scope of the presently disclosed subject matter.
While the exemplary embodiments illustrated in the figures and described above are presently preferred, it should be understood that these embodiments are offered by way of example only. Other embodiments may include, for example, structures with different data mapping or different data. The disclosed subject matter is not limited to a particular embodiment, but extends to various modifications, combinations, and permutations that nevertheless fall within the scope and spirit of the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
2416155 | Chubb | Feb 1947 | A |
2849184 | Arden et al. | Aug 1958 | A |
2929059 | Parker | Mar 1960 | A |
2930035 | Altekruse | Mar 1960 | A |
2948892 | White | Aug 1960 | A |
2965894 | Sweeney | Dec 1960 | A |
2994966 | Senitsky et al. | Aug 1961 | A |
3031660 | Young | Apr 1962 | A |
3049702 | Schreitmueller | Aug 1962 | A |
3064252 | Varela | Nov 1962 | A |
3070795 | Chambers | Dec 1962 | A |
3071766 | Fenn | Jan 1963 | A |
3072903 | Meyer | Jan 1963 | A |
3089801 | Tierney et al. | May 1963 | A |
3107351 | Milam | Oct 1963 | A |
3113310 | Standing | Dec 1963 | A |
3129425 | Sanner | Apr 1964 | A |
3153234 | Begeman et al. | Oct 1964 | A |
3175215 | Blasberg et al. | Mar 1965 | A |
3212088 | Alexander et al. | Oct 1965 | A |
3221328 | Walter | Nov 1965 | A |
3241141 | Wall | Mar 1966 | A |
3274593 | Varela et al. | Sep 1966 | A |
3325807 | Burns et al. | Jun 1967 | A |
3334344 | Colby, Jr. | Aug 1967 | A |
3339199 | Pichafroy | Aug 1967 | A |
3373423 | Levy | Mar 1968 | A |
3397397 | Barney | Aug 1968 | A |
3448450 | Alfandari et al. | Jun 1969 | A |
3618090 | Garrison | Nov 1971 | A |
3680094 | Bayle et al. | Jul 1972 | A |
3716855 | Asam | Feb 1973 | A |
3739380 | Burdic et al. | Jun 1973 | A |
3781878 | Kirkpatrick | Dec 1973 | A |
3810175 | Bell | May 1974 | A |
3815132 | Case et al. | Jun 1974 | A |
3816718 | Hall et al. | Jun 1974 | A |
3851758 | Makhijani et al. | Dec 1974 | A |
3866222 | Young | Feb 1975 | A |
3885237 | Kirkpatrick | May 1975 | A |
3956749 | Magorian | May 1976 | A |
4024537 | Hart | May 1977 | A |
4058701 | Gruber et al. | Nov 1977 | A |
4058710 | Altmann | Nov 1977 | A |
4063218 | Basov et al. | Dec 1977 | A |
4235951 | Swarovski | Nov 1980 | A |
4277845 | Smith et al. | Jul 1981 | A |
4405986 | Gray | Sep 1983 | A |
4435707 | Clark | Mar 1984 | A |
4481519 | Margerum | Nov 1984 | A |
4532515 | Cantrell et al. | Jul 1985 | A |
4594676 | Breiholz et al. | Jun 1986 | A |
4595925 | Hansen | Jun 1986 | A |
4598292 | Devino | Jul 1986 | A |
4628318 | Alitz | Dec 1986 | A |
4646244 | Bateman et al. | Feb 1987 | A |
4649388 | Atlas | Mar 1987 | A |
4654665 | Kiuchi et al. | Mar 1987 | A |
4685149 | Smith et al. | Aug 1987 | A |
4760396 | Barney et al. | Jul 1988 | A |
4828382 | Vermilion | May 1989 | A |
4843398 | Houston et al. | Jun 1989 | A |
4912477 | Lory et al. | Mar 1990 | A |
4914436 | Bateman et al. | Apr 1990 | A |
4924401 | Bice et al. | May 1990 | A |
4939513 | Paterson et al. | Jul 1990 | A |
4951059 | Taylor, Jr. | Aug 1990 | A |
4953972 | Zuk | Sep 1990 | A |
4965573 | Gallagher et al. | Oct 1990 | A |
4987419 | Salkeld | Jan 1991 | A |
5045855 | Moreira | Sep 1991 | A |
5047779 | Hager | Sep 1991 | A |
5047781 | Bleakney | Sep 1991 | A |
5049886 | Seitz et al. | Sep 1991 | A |
5166688 | Moreira | Nov 1992 | A |
5173703 | Mangiapane et al. | Dec 1992 | A |
5175554 | Mangiapane et al. | Dec 1992 | A |
5198819 | Susnjara | Mar 1993 | A |
5202690 | Frederick | Apr 1993 | A |
5247303 | Cornelius et al. | Sep 1993 | A |
5273553 | Hoshi et al. | Dec 1993 | A |
5311183 | Mathews et al. | May 1994 | A |
5329391 | Miyamoto et al. | Jul 1994 | A |
5332998 | Avignon et al. | Jul 1994 | A |
5345241 | Huddle | Sep 1994 | A |
5365356 | McFadden | Nov 1994 | A |
5383457 | Cohen | Jan 1995 | A |
5442364 | Lee et al. | Aug 1995 | A |
5539409 | Mathews et al. | Jul 1996 | A |
5559515 | Alimena et al. | Sep 1996 | A |
5559518 | Didomizio | Sep 1996 | A |
5566840 | Waldner et al. | Oct 1996 | A |
5592178 | Chang et al. | Jan 1997 | A |
5678303 | Wichmann | Oct 1997 | A |
5736957 | Raney | Apr 1998 | A |
5820080 | Eschenbach | Oct 1998 | A |
5828332 | Frederick | Oct 1998 | A |
5831570 | Ammar | Nov 1998 | A |
5839080 | Muller et al. | Nov 1998 | A |
5867119 | Corrubia et al. | Feb 1999 | A |
5894286 | Morand et al. | Apr 1999 | A |
5918517 | Malapert et al. | Jul 1999 | A |
5920276 | Frederick | Jul 1999 | A |
5923279 | Bamler et al. | Jul 1999 | A |
5936575 | Azzarelli et al. | Aug 1999 | A |
5942062 | Hassall et al. | Aug 1999 | A |
5945926 | Ammar et al. | Aug 1999 | A |
5950512 | Fields | Sep 1999 | A |
5959762 | Bandettini et al. | Sep 1999 | A |
5978715 | Briffe et al. | Nov 1999 | A |
6002347 | Daly et al. | Dec 1999 | A |
6023240 | Sutton | Feb 2000 | A |
6061016 | Lupinski et al. | May 2000 | A |
6061022 | Menegozzi et al. | May 2000 | A |
6064942 | Johnson et al. | May 2000 | A |
6075484 | Daniel et al. | Jun 2000 | A |
6092009 | Glover | Jul 2000 | A |
6112141 | Briffe et al. | Aug 2000 | A |
6112570 | Hruschak | Sep 2000 | A |
6122570 | Muller et al. | Sep 2000 | A |
6127944 | Daly et al. | Oct 2000 | A |
6128066 | Yokozeki | Oct 2000 | A |
6128553 | Gordon et al. | Oct 2000 | A |
6138060 | Conner et al. | Oct 2000 | A |
6150901 | Auken | Nov 2000 | A |
6154151 | McElreath et al. | Nov 2000 | A |
6154169 | Kuntman | Nov 2000 | A |
6157339 | Sato et al. | Dec 2000 | A |
6157891 | Lin | Dec 2000 | A |
6163021 | Mickelson | Dec 2000 | A |
6166661 | Anderson et al. | Dec 2000 | A |
6169770 | Henely | Jan 2001 | B1 |
6178391 | Anderson et al. | Jan 2001 | B1 |
6184816 | Zheng et al. | Feb 2001 | B1 |
6188330 | Glover | Feb 2001 | B1 |
6194980 | Thon | Feb 2001 | B1 |
6199008 | Aratow et al. | Mar 2001 | B1 |
6201494 | Kronfeld | Mar 2001 | B1 |
6204806 | Hoech | Mar 2001 | B1 |
6205400 | Lin | Mar 2001 | B1 |
6208284 | Woodell et al. | Mar 2001 | B1 |
6219592 | Muller et al. | Apr 2001 | B1 |
6233522 | Morici | May 2001 | B1 |
6236351 | Conner et al. | May 2001 | B1 |
6259400 | Higgins et al. | Jul 2001 | B1 |
6266114 | Skarohlid | Jul 2001 | B1 |
6278799 | Hoffman | Aug 2001 | B1 |
6281832 | McElreath | Aug 2001 | B1 |
6285298 | Gordon | Sep 2001 | B1 |
6285337 | West et al. | Sep 2001 | B1 |
6285926 | Weiler et al. | Sep 2001 | B1 |
6289277 | Feyereisen et al. | Sep 2001 | B1 |
6311108 | Ammar et al. | Oct 2001 | B1 |
6317468 | Meyer | Nov 2001 | B1 |
6317690 | Gia | Nov 2001 | B1 |
6317872 | Gee et al. | Nov 2001 | B1 |
6340946 | Wolfson et al. | Jan 2002 | B1 |
6345127 | Mitchell | Feb 2002 | B1 |
6359585 | Bechman et al. | Mar 2002 | B1 |
6366013 | Leenders et al. | Apr 2002 | B1 |
6373418 | Abbey | Apr 2002 | B1 |
6374286 | Gee et al. | Apr 2002 | B1 |
6377202 | Kropfli et al. | Apr 2002 | B1 |
6377892 | Johnson et al. | Apr 2002 | B1 |
6388607 | Woodell | May 2002 | B1 |
6388608 | Woodell et al. | May 2002 | B1 |
6388724 | Campbell et al. | May 2002 | B1 |
6389354 | Hicks et al. | May 2002 | B1 |
6401038 | Gia | Jun 2002 | B2 |
6411890 | Zimmerman | Jun 2002 | B1 |
6421000 | McDowell | Jul 2002 | B1 |
6421603 | Pratt et al. | Jul 2002 | B1 |
6424288 | Woodell | Jul 2002 | B1 |
6426717 | Maloratsky | Jul 2002 | B1 |
6426720 | Ross et al. | Jul 2002 | B1 |
6427122 | Lin | Jul 2002 | B1 |
6441773 | Kelly et al. | Aug 2002 | B1 |
6445310 | Bateman et al. | Sep 2002 | B1 |
6448922 | Kelly | Sep 2002 | B1 |
6452511 | Kelly et al. | Sep 2002 | B1 |
6456236 | Hauck et al. | Sep 2002 | B1 |
6456238 | Posey | Sep 2002 | B1 |
6462703 | Hedrick | Oct 2002 | B2 |
6473026 | Ali-Mehenni et al. | Oct 2002 | B1 |
6473037 | Vail et al. | Oct 2002 | B2 |
6473240 | Dehmlow | Oct 2002 | B1 |
6481482 | Shimotomai | Nov 2002 | B1 |
6492934 | Hwang et al. | Dec 2002 | B1 |
6501424 | Haendel et al. | Dec 2002 | B1 |
6512476 | Woodell | Jan 2003 | B1 |
6512527 | Barber et al. | Jan 2003 | B1 |
6516272 | Lin | Feb 2003 | B2 |
6516283 | McCall et al. | Feb 2003 | B2 |
6520056 | Nemeth et al. | Feb 2003 | B1 |
6525674 | Kelly et al. | Feb 2003 | B1 |
6531669 | Miller et al. | Mar 2003 | B1 |
6549161 | Woodell | Apr 2003 | B1 |
6567728 | Kelly et al. | May 2003 | B1 |
6574030 | Mosier | Jun 2003 | B1 |
6577947 | Kronfeld et al. | Jun 2003 | B1 |
6590528 | Dewulf | Jul 2003 | B1 |
6591171 | Ammar et al. | Jul 2003 | B1 |
6593875 | Bergin et al. | Jul 2003 | B2 |
6600443 | Landt | Jul 2003 | B2 |
6603425 | Woodell | Aug 2003 | B1 |
6614057 | Silvernail et al. | Sep 2003 | B2 |
6650275 | Kelly et al. | Nov 2003 | B1 |
6650291 | West et al. | Nov 2003 | B1 |
6653947 | Dwyer et al. | Nov 2003 | B2 |
6667710 | Cornell et al. | Dec 2003 | B2 |
6681668 | Smirle | Jan 2004 | B1 |
6690298 | Barber et al. | Feb 2004 | B1 |
6690299 | Suiter | Feb 2004 | B1 |
6690317 | Szeto et al. | Feb 2004 | B2 |
6697008 | Sternowski | Feb 2004 | B1 |
6697012 | Lodwig et al. | Feb 2004 | B2 |
6710663 | Berquist | Mar 2004 | B1 |
6714186 | Mosier et al. | Mar 2004 | B1 |
6724344 | Stockmaster et al. | Apr 2004 | B1 |
6731236 | Hager et al. | May 2004 | B1 |
6738011 | Evans | May 2004 | B1 |
6739929 | Furukawa et al. | May 2004 | B2 |
6741203 | Woodell | May 2004 | B1 |
6741208 | West | May 2004 | B1 |
6744382 | Lapis et al. | Jun 2004 | B1 |
6744408 | Stockmaster | Jun 2004 | B1 |
6757624 | Hwang et al. | Jun 2004 | B1 |
6760155 | Murayama et al. | Jul 2004 | B2 |
6771626 | Golubiewski et al. | Aug 2004 | B1 |
6782392 | Weinberger et al. | Aug 2004 | B1 |
6799095 | Owen et al. | Sep 2004 | B1 |
6803245 | Auch et al. | Oct 2004 | B2 |
6804614 | McGraw et al. | Oct 2004 | B1 |
6806846 | West | Oct 2004 | B1 |
6807538 | Weinberger et al. | Oct 2004 | B1 |
6813777 | Weinberger et al. | Nov 2004 | B1 |
6819983 | McGraw | Nov 2004 | B1 |
6822617 | Mather et al. | Nov 2004 | B1 |
6825804 | Doty | Nov 2004 | B1 |
6832538 | Hwang | Dec 2004 | B1 |
6839017 | Dillman | Jan 2005 | B1 |
6842288 | Liu et al. | Jan 2005 | B1 |
6850185 | Woodell | Feb 2005 | B1 |
6862323 | Loper | Mar 2005 | B1 |
6862501 | He | Mar 2005 | B2 |
6865452 | Burdon | Mar 2005 | B2 |
6879280 | Bull et al. | Apr 2005 | B1 |
6879886 | Wilkins et al. | Apr 2005 | B2 |
6882302 | Woodell et al. | Apr 2005 | B1 |
6908202 | Graf et al. | Jun 2005 | B2 |
6917396 | Hiraishi et al. | Jul 2005 | B2 |
6918134 | Sherlock et al. | Jul 2005 | B1 |
6933885 | Stockmaster et al. | Aug 2005 | B1 |
6938258 | Weinberger et al. | Aug 2005 | B1 |
6950062 | Mather et al. | Sep 2005 | B1 |
6959057 | Tuohino | Oct 2005 | B1 |
6972727 | West et al. | Dec 2005 | B1 |
6977608 | Anderson et al. | Dec 2005 | B1 |
6984545 | Grigg et al. | Jan 2006 | B2 |
6990022 | Morikawa et al. | Jan 2006 | B2 |
6992614 | Joyce | Jan 2006 | B1 |
6995726 | West et al. | Feb 2006 | B1 |
6998648 | Silvernail | Feb 2006 | B2 |
6998908 | Sternowski | Feb 2006 | B1 |
6999022 | Vesel et al. | Feb 2006 | B1 |
6999027 | Stockmaster | Feb 2006 | B1 |
7002546 | Stuppi et al. | Feb 2006 | B1 |
7010398 | Wilkins et al. | Mar 2006 | B2 |
7023375 | Klausing et al. | Apr 2006 | B2 |
7026956 | Wenger et al. | Apr 2006 | B1 |
7028304 | Weinberger et al. | Apr 2006 | B1 |
7030945 | Umemoto et al. | Apr 2006 | B2 |
7034753 | Elsallal et al. | Apr 2006 | B1 |
7042387 | Ridenour et al. | May 2006 | B2 |
7053796 | Barber | May 2006 | B1 |
7057549 | Block | Jun 2006 | B2 |
7064680 | Reynolds et al. | Jun 2006 | B2 |
7069120 | Koenck et al. | Jun 2006 | B1 |
7089092 | Wood et al. | Aug 2006 | B1 |
7092645 | Sternowski | Aug 2006 | B1 |
7098913 | Etherington et al. | Aug 2006 | B1 |
7109912 | Paramore et al. | Sep 2006 | B1 |
7109913 | Paramore et al. | Sep 2006 | B1 |
7123260 | Brust | Oct 2006 | B2 |
7129885 | Woodell et al. | Oct 2006 | B1 |
7145501 | Manfred et al. | Dec 2006 | B1 |
7148816 | Carrico | Dec 2006 | B1 |
7151507 | Herting | Dec 2006 | B1 |
7158072 | Venkatachalam et al. | Jan 2007 | B1 |
7161525 | Finley et al. | Jan 2007 | B1 |
7170446 | West et al. | Jan 2007 | B1 |
7170959 | Abbey | Jan 2007 | B1 |
7180476 | Guell et al. | Feb 2007 | B1 |
7191406 | Barber et al. | Mar 2007 | B1 |
7196329 | Wood et al. | Mar 2007 | B1 |
7205933 | Snodgrass | Apr 2007 | B1 |
7209070 | Gilliland et al. | Apr 2007 | B2 |
7212216 | He et al. | May 2007 | B2 |
7218268 | Vandenberg | May 2007 | B2 |
7219011 | Barber | May 2007 | B1 |
7242343 | Woodell | Jul 2007 | B1 |
7242345 | Raestad et al. | Jul 2007 | B2 |
7250903 | McDowell | Jul 2007 | B1 |
7265710 | Deagro | Sep 2007 | B2 |
7269657 | Alexander et al. | Sep 2007 | B1 |
7272472 | McElreath | Sep 2007 | B1 |
7273403 | Yokota et al. | Sep 2007 | B2 |
7280068 | Lee et al. | Oct 2007 | B2 |
7289058 | Shima | Oct 2007 | B2 |
7292178 | Woodell et al. | Nov 2007 | B1 |
7292180 | Schober | Nov 2007 | B2 |
7295150 | Burlet et al. | Nov 2007 | B2 |
7295901 | Little et al. | Nov 2007 | B1 |
7301496 | Honda et al. | Nov 2007 | B2 |
7307576 | Koenigs | Dec 2007 | B1 |
7307577 | Kronfeld et al. | Dec 2007 | B1 |
7307583 | Woodell et al. | Dec 2007 | B1 |
7312725 | Berson et al. | Dec 2007 | B2 |
7312743 | Ridenour et al. | Dec 2007 | B2 |
7317427 | Pauplis et al. | Jan 2008 | B2 |
7321332 | Focke et al. | Jan 2008 | B2 |
7337043 | Bull | Feb 2008 | B2 |
7349154 | Aiura et al. | Mar 2008 | B2 |
7352292 | Alter et al. | Apr 2008 | B2 |
7361240 | Kim | Apr 2008 | B2 |
7372394 | Woodell et al. | May 2008 | B1 |
7373223 | Murphy | May 2008 | B2 |
7375678 | Feyereisen et al. | May 2008 | B2 |
7379014 | Woodell et al. | May 2008 | B1 |
7379796 | Walsdorf et al. | May 2008 | B2 |
7381110 | Sampica et al. | Jun 2008 | B1 |
7417578 | Woodell et al. | Aug 2008 | B1 |
7417579 | Woodell | Aug 2008 | B1 |
7423578 | Tietjen | Sep 2008 | B1 |
7446697 | Burlet et al. | Nov 2008 | B2 |
7446938 | Miyatake et al. | Nov 2008 | B2 |
7452258 | Marzen et al. | Nov 2008 | B1 |
7474262 | Alland | Jan 2009 | B2 |
7479920 | Niv | Jan 2009 | B2 |
7486220 | Kronfeld et al. | Feb 2009 | B1 |
7486291 | Berson et al. | Feb 2009 | B2 |
7492304 | Woodell et al. | Feb 2009 | B1 |
7492305 | Woodell et al. | Feb 2009 | B1 |
7515087 | Woodell et al. | Apr 2009 | B1 |
7515088 | Woodell et al. | Apr 2009 | B1 |
7525448 | Wilson et al. | Apr 2009 | B1 |
7528765 | Woodell et al. | May 2009 | B1 |
7528915 | Choi et al. | May 2009 | B2 |
7541970 | Godfrey et al. | Jun 2009 | B1 |
7541971 | Woodell et al. | Jun 2009 | B1 |
7551451 | Kim et al. | Jun 2009 | B2 |
7557735 | Woodell et al. | Jul 2009 | B1 |
7566254 | Sampica et al. | Jul 2009 | B2 |
7570177 | Reynolds et al. | Aug 2009 | B2 |
7576680 | Woodell | Aug 2009 | B1 |
7603209 | Dwyer et al. | Oct 2009 | B2 |
7609200 | Woodell et al. | Oct 2009 | B1 |
7612706 | Honda et al. | Nov 2009 | B2 |
7616150 | Woodell | Nov 2009 | B1 |
7633428 | McCusker et al. | Dec 2009 | B1 |
7633430 | Wichgers et al. | Dec 2009 | B1 |
7633584 | Umemoto et al. | Dec 2009 | B2 |
7639175 | Woodell | Dec 2009 | B1 |
7664601 | Daly, Jr. | Feb 2010 | B2 |
7675461 | McCusker et al. | Mar 2010 | B1 |
7693621 | Chamas | Apr 2010 | B1 |
7696921 | Finley et al. | Apr 2010 | B1 |
7714767 | Kronfeld et al. | May 2010 | B1 |
7733264 | Woodell et al. | Jun 2010 | B1 |
7783427 | Woodell et al. | Aug 2010 | B1 |
7783429 | Walden et al. | Aug 2010 | B2 |
7791529 | Filias et al. | Sep 2010 | B2 |
7808422 | Woodell et al. | Oct 2010 | B1 |
7814676 | Sampica et al. | Oct 2010 | B2 |
7843380 | Woodell | Nov 2010 | B1 |
7859448 | Woodell et al. | Dec 2010 | B1 |
7859449 | Woodell et al. | Dec 2010 | B1 |
7864103 | Weber et al. | Jan 2011 | B2 |
7868811 | Woodell et al. | Jan 2011 | B1 |
7872594 | Vesel | Jan 2011 | B1 |
7889117 | Woodell et al. | Feb 2011 | B1 |
7889118 | Finley et al. | Feb 2011 | B1 |
7927440 | Matsuhira | Apr 2011 | B2 |
7929086 | Toyama et al. | Apr 2011 | B2 |
7965223 | McCusker | Jun 2011 | B1 |
7965225 | Dickerson et al. | Jun 2011 | B1 |
8035547 | Flanigan et al. | Oct 2011 | B1 |
8038498 | Miyauchi et al. | Oct 2011 | B2 |
8045098 | Kitagawa et al. | Oct 2011 | B2 |
8059025 | D'Addio | Nov 2011 | B2 |
8068984 | Smith et al. | Nov 2011 | B2 |
8072368 | Woodell | Dec 2011 | B1 |
8077078 | Woodell et al. | Dec 2011 | B1 |
8102487 | Kitagawa et al. | Jan 2012 | B2 |
8118075 | Sampica et al. | Feb 2012 | B2 |
8137498 | Sampica et al. | Mar 2012 | B2 |
8140223 | Whitehead et al. | Mar 2012 | B2 |
8159464 | Gribble et al. | Apr 2012 | B1 |
8232917 | Scherzinger et al. | Jul 2012 | B2 |
8296065 | Haynie et al. | Oct 2012 | B2 |
8373580 | Bunch et al. | Feb 2013 | B2 |
8410975 | Bell et al. | Apr 2013 | B1 |
8477062 | Kanellis | Jul 2013 | B1 |
8486535 | Nemeth et al. | Jul 2013 | B1 |
8493241 | He | Jul 2013 | B2 |
8515600 | McCusker | Aug 2013 | B1 |
8540002 | Sampica et al. | Sep 2013 | B2 |
8558731 | Woodell | Oct 2013 | B1 |
8576112 | Garrec et al. | Nov 2013 | B2 |
8583315 | Whitehead et al. | Nov 2013 | B2 |
8594879 | Roberge et al. | Nov 2013 | B2 |
8603288 | Sampica et al. | Dec 2013 | B2 |
8634993 | McClure et al. | Jan 2014 | B2 |
8639416 | Jones et al. | Jan 2014 | B2 |
8643533 | Woodell et al. | Feb 2014 | B1 |
8691043 | Sampica et al. | Apr 2014 | B2 |
8717226 | Bon et al. | May 2014 | B2 |
8773301 | Woodell | Jul 2014 | B1 |
8896480 | Wilson et al. | Nov 2014 | B1 |
8909471 | Jinkins et al. | Dec 2014 | B1 |
8917191 | Tiana et al. | Dec 2014 | B1 |
8936057 | Sampica et al. | Jan 2015 | B2 |
9354633 | McCusker et al. | May 2016 | B1 |
20010023390 | Gia | Sep 2001 | A1 |
20010050372 | Krijn et al. | Dec 2001 | A1 |
20010053648 | Furukawa et al. | Dec 2001 | A1 |
20020039070 | Ververs et al. | Apr 2002 | A1 |
20020111717 | Scherzinger et al. | Aug 2002 | A1 |
20020116125 | Lin | Aug 2002 | A1 |
20020116126 | Lin | Aug 2002 | A1 |
20020158256 | Yamada et al. | Oct 2002 | A1 |
20020179229 | Chuzles | Dec 2002 | A1 |
20020185600 | Kerr | Dec 2002 | A1 |
20020187284 | Kinoshita et al. | Dec 2002 | A1 |
20030021491 | Brust | Jan 2003 | A1 |
20030038916 | Nakano et al. | Feb 2003 | A1 |
20030043315 | Umemoto et al. | Mar 2003 | A1 |
20030071828 | Wilkins et al. | Apr 2003 | A1 |
20030089214 | Fukuta et al. | May 2003 | A1 |
20030093187 | Walker | May 2003 | A1 |
20030102999 | Bergin et al. | Jun 2003 | A1 |
20030156238 | Hiraishi et al. | Aug 2003 | A1 |
20030160718 | Nagasaku | Aug 2003 | A1 |
20030174396 | Murayama et al. | Sep 2003 | A1 |
20030180528 | Flosenzier et al. | Sep 2003 | A1 |
20030189606 | Moon et al. | Oct 2003 | A1 |
20030195672 | He | Oct 2003 | A1 |
20030216859 | Martell et al. | Nov 2003 | A1 |
20030222887 | Wilkins et al. | Dec 2003 | A1 |
20040044445 | Burdon | Mar 2004 | A1 |
20040059473 | He | Mar 2004 | A1 |
20040066645 | Graf et al. | Apr 2004 | A1 |
20040072575 | Young et al. | Apr 2004 | A1 |
20040083038 | He | Apr 2004 | A1 |
20040145499 | Schmidt et al. | Jul 2004 | A1 |
20040160341 | Feyereisen et al. | Aug 2004 | A1 |
20040160364 | Regev | Aug 2004 | A1 |
20040181318 | Redmond et al. | Sep 2004 | A1 |
20040264549 | Hoole | Dec 2004 | A1 |
20050004748 | Pinto et al. | Jan 2005 | A1 |
20050052451 | Servantie | Mar 2005 | A1 |
20050126679 | Kim | Jun 2005 | A1 |
20050136625 | Henseler et al. | Jun 2005 | A1 |
20050150289 | Osborne | Jul 2005 | A1 |
20050174350 | Ridenour et al. | Aug 2005 | A1 |
20050200502 | Reusser et al. | Sep 2005 | A1 |
20050225481 | Bonthron | Oct 2005 | A1 |
20050230563 | Corcoran, III | Oct 2005 | A1 |
20060004497 | Bull | Jan 2006 | A1 |
20060097895 | Reynolds et al. | May 2006 | A1 |
20060098452 | Choi et al. | May 2006 | A1 |
20060164284 | Pauplis et al. | Jul 2006 | A1 |
20060207967 | Bocko et al. | Sep 2006 | A1 |
20060215265 | Miyatake et al. | Sep 2006 | A1 |
20060227012 | He | Oct 2006 | A1 |
20060244636 | Rye et al. | Nov 2006 | A1 |
20060245171 | Kim et al. | Nov 2006 | A1 |
20060290253 | Yeo et al. | Dec 2006 | A1 |
20060290531 | Reynolds et al. | Dec 2006 | A1 |
20070001897 | Alland | Jan 2007 | A1 |
20070002078 | He et al. | Jan 2007 | A1 |
20070008214 | Wasiewicz | Jan 2007 | A1 |
20070013575 | Lee et al. | Jan 2007 | A1 |
20070018887 | Feyereisen et al. | Jan 2007 | A1 |
20070032951 | Tanenhaus et al. | Feb 2007 | A1 |
20070060063 | Wright et al. | Mar 2007 | A1 |
20070146364 | Aspen | Jun 2007 | A1 |
20070171094 | Alter et al. | Jul 2007 | A1 |
20070176794 | Feyereisen et al. | Aug 2007 | A1 |
20070179684 | He | Aug 2007 | A1 |
20070228586 | Merrill et al. | Oct 2007 | A1 |
20070247350 | Ryan | Oct 2007 | A1 |
20070279253 | Priest | Dec 2007 | A1 |
20070297736 | Sherman et al. | Dec 2007 | A1 |
20080018524 | Christianson | Jan 2008 | A1 |
20080051947 | Kemp | Feb 2008 | A1 |
20080074308 | Becker et al. | Mar 2008 | A1 |
20080111731 | Hubbard et al. | May 2008 | A1 |
20080145610 | Muller et al. | Jun 2008 | A1 |
20080180351 | He | Jul 2008 | A1 |
20080305721 | Ohashi et al. | Dec 2008 | A1 |
20090021397 | Wipf et al. | Jan 2009 | A1 |
20090040070 | Alter et al. | Feb 2009 | A1 |
20090040772 | Laney | Feb 2009 | A1 |
20090046229 | Umemoto et al. | Feb 2009 | A1 |
20090148682 | Higuchi | Jun 2009 | A1 |
20090152391 | McWhirk | Jun 2009 | A1 |
20090153783 | Umemoto | Jun 2009 | A1 |
20090164067 | Whitehead et al. | Jun 2009 | A1 |
20090207048 | He et al. | Aug 2009 | A1 |
20090279030 | Toyama et al. | Nov 2009 | A1 |
20090279175 | Laney et al. | Nov 2009 | A1 |
20100033499 | Gannon et al. | Feb 2010 | A1 |
20100103353 | Yamada | Apr 2010 | A1 |
20100297406 | Schaffer et al. | Nov 2010 | A1 |
20100312428 | Roberge et al. | Dec 2010 | A1 |
20100312461 | Haynie et al. | Dec 2010 | A1 |
20110037616 | Leutelt et al. | Feb 2011 | A1 |
20110054729 | Whitehead et al. | Mar 2011 | A1 |
20110075070 | Kitagawa et al. | Mar 2011 | A1 |
20110141405 | Kitagawa et al. | Jun 2011 | A1 |
20110165361 | Sherman et al. | Jul 2011 | A1 |
20110184594 | Manfred et al. | Jul 2011 | A1 |
20110273325 | Goldman | Nov 2011 | A1 |
20110282580 | Mohan | Nov 2011 | A1 |
20110304479 | Chen et al. | Dec 2011 | A1 |
20120053831 | Halder | Mar 2012 | A1 |
20120150426 | Conway | Jun 2012 | A1 |
20120174445 | Jones et al. | Jul 2012 | A1 |
20120215410 | McClure et al. | Aug 2012 | A1 |
20130041529 | He et al. | Feb 2013 | A1 |
20130234884 | Bunch | Sep 2013 | A1 |
20140009324 | Ranney | Jan 2014 | A1 |
20150211883 | He | Jul 2015 | A1 |
20160131739 | Jinkins et al. | May 2016 | A1 |
Number | Date | Country |
---|---|---|
196 49 838 | Apr 1998 | DE |
19949737 | Apr 2001 | DE |
0 556 351 | Jun 1995 | EP |
0 962 752 | Dec 1999 | EP |
0 814 744 | Jun 1959 | GB |
1 092 821 | Nov 1967 | GB |
01-210328 | Aug 1989 | JP |
05-200880 | Aug 1993 | JP |
05-293895 | Nov 1993 | JP |
06-051484 | Feb 1994 | JP |
H08-220547 | Aug 1996 | JP |
09-057779 | Mar 1997 | JP |
10-156853 | Jun 1998 | JP |
10-244589 | Sep 1998 | JP |
2000-141388 | May 2000 | JP |
2004-233590 | Aug 2004 | JP |
2004-354645 | Dec 2004 | JP |
2006-218658 | Aug 2006 | JP |
2006-334912 | Dec 2006 | JP |
2006-348208 | Dec 2006 | JP |
2007-206559 | Aug 2007 | JP |
2007-302398 | Nov 2007 | JP |
2008-238607 | Jan 2008 | JP |
WO-9305634 | Mar 1993 | WO |
WO-2009133102 | Nov 2009 | WO |
WO-2011089474 | Jul 2011 | WO |
Other Publications |
---|
U.S. Appl. No. 11/851,323, filed Sep. 6, 2007, McCusker. |
U.S. Appl. No. 11/863,219, filed Sep. 27, 2007, Woodell. |
U.S. Appl. No. 11/863,221, filed Sep. 27, 2007, Woodell. |
U.S. Appl. No. 11/899,801, filed Sep. 6, 2007, Woodell et al. |
U.S. Appl. No. 11/900,002, filed Sep. 6, 2007, Woodell et al. |
U.S. Appl. No. 12/167,200, filed Jul. 2, 2008, Woodell et al. |
U.S. Appl. No. 12/167,203, filed Jul. 2, 2008, Woodell. |
U.S. Appl. No. 12/167,208, filed Jul. 2, 2008, Dickerson et al. |
U.S. Appl. No. 12/180,293, filed Jul. 25, 2008, Woodell et al. |
U.S. Appl. No. 12/786,169, filed May 24, 2010, Nemeth et al. |
U.S. Appl. No. 13/224,992, filed Sep. 2, 2011, Hufnagel et al. |
U.S. Appl. No. 13/250,307, filed Sep. 30, 2011, Jinkins. |
U.S. Appl. No. 13/250,798, filed Sep. 30, 2011, Jinkins. |
“MountainScope™ on a TabletPC,” PCAvionics™, printed from website www.pcavionics.com on Aug. 28, 2007, 1 page. |
TAWS Class A and Class B, Terrain Awareness and Warning Systems, Universal® Avionics Systems Corporation, Sep. 2007, 6 pages. |
“TAWS Terrain Awareness and Warning System,” Universal® Avionics, printed from website www.uasc.com on Aug. 28, 2007, 2 pages. |
Adams, Charlotte, “Synthetic Vision: Picturing the Future,” Avionics magazine, Oct. 1, 2006, printed from website www.aviationtoday.com, 4 pages. |
Adams, Charlotte, “Synthetic Vision: Picturing the Future,” Avionics magazine, Solutions for Global Airspace Electronics, Oct. 2006, cover and pp. 22-29. |
Advisory Action for U.S. Appl. No. 12/009,472, dated Feb. 25, 2013, 3 pages. |
Advisory Action for U.S. Appl. No. 13/538,957, dated Jun. 14, 2013, 6 pages. |
Blue Mountain Avionics' Products, printed from website www.bluemountainavionics.com on Aug. 28, 2007, 4 pages. |
Carter, S. P., D. D. Blankenship, M. E. Peters, D. A. Young, J. W. Holt, and D. L. Morse (2007), Radar-based subglacial lake classification in Antarctica, Geochem. Geophys. Geosyst., 8, Q03016, doi:10.1029/2006GC001408, 20 pages. |
Final Office Action on U.S. Appl. No. 13/250,798 dated Sep. 4, 2014, 22 pages. |
Final Office Action on U.S. Appl. No. 13/867,556 dated Jul. 3, 2014, 11 pages. |
Final Office Action on U.S. Appl. No. 13/250,307 dated Jun. 11, 2014, 8 pages. |
Final Office Action on U.S. Appl. No. 13/250,798 dated Aug. 7, 2015, 21 pages. |
G2000, Garmin, printed from website https://buy.garmin.com/shop/shop.do?cID=153&pID=97668 on Jun. 28, 2011, 2 pages. |
G3000, Garmin, printed from website https://buy.garmin.com/shop/shop.do?cID=153&pID=66916 on Jun. 28, 2011, 2 pages. |
G5000, Garmin, printed from website https://buy.garmin.com/shop/shop.do?cID=153&pID=90821&ra=true on Apr. 20, 2011, 2 pages. |
Non-Final Office Action on U.S. Appl. No. 13/250,798 dated Mar. 18, 2015, 21 pages. |
Non-Final Office Action on U.S. Appl. No. 14/301,199 dated Sep. 9, 2015, 18 pages. |
Notice of Allowance for U.S. Appl. No. 12/009,372, dated Oct. 13, 2011, 8 pages. |
Notice of Allowance for U.S. Appl. No. 12/009,373, dated Jun. 16, 2010, 4 pages. |
Notice of Allowance for U.S. Appl. No. 12/009,472, dated Sep. 5, 2013, 8 pages. |
Notice of Allowance for U.S. Appl. No. 12/786,169, dated Mar. 28, 2013, 6 pages. |
Notice of Allowance for U.S. Appl. No. 13/538,957, dated Oct. 3, 2013, 13 pages. |
Office Action for U.S. Appl. No. 12/009,372, dated Dec. 20, 2010, 10 pages. |
Office Action for U.S. Appl. No. 12/009,372, dated Jun. 13, 2011, 9 pages. |
Office Action for U.S. Appl. No. 12/009,373, dated Dec. 30, 2009, 14 pages. |
Office Action for U.S. Appl. No. 12/009,472, dated Apr. 16, 2012, 16 pages. |
Office Action for U.S. Appl. No. 12/009,472, dated Jan. 14, 2011, 14 pages. |
Office Action for U.S. Appl. No. 12/009,472, dated Mar. 20, 2013, 15 pages. |
Office Action for U.S. Appl. No. 12/009,472, dated Nov. 3, 2011, 15 pages. |
Office Action for U.S. Appl. No. 12/009,472, dated Nov. 9, 2012, 15 pages. |
Office Action for U.S. Appl. No. 12/263,282, dated Jan. 5, 2012, 10 pages. |
Office Action for U.S. Appl. No. 12/786,169, dated Jan. 18, 2013, 14 pages. |
Office Action for U.S. Appl. No. 12/892,563, dated Feb. 19, 2013, 12 pages. |
Office Action for U.S. Appl. No. 13/224,992, dated Feb. 28, 2013, 10 pages. |
Office Action for U.S. Appl. No. 13/250,307, dated Nov. 5, 2013, 11 pages. |
Office Action for U.S. Appl. No. 13/538,957, dated Apr. 4, 2013, 19 pages. |
Office Action for U.S. Appl. No. 13/538,957, dated Oct. 5, 2012, 18 pages. |
Office Action for U.S. Appl. No. 13/743,182, dated Apr. 8, 2013, 10 pages. |
Office Action for U.S. Appl. No. 12/786,169, dated Jul. 20, 2012, 8 pages. |
Office Action in Japanese Patent Application 2015-116688, dated Aug. 25, 2015, 4 pages. |
Office Action in Japanese Patent Application 2015-116716, dated Aug. 25, 2015, 3 pages. |
Office Action on U.S. Appl. No. 12/236,464, dated Feb. 11, 2014, 21 pages. |
Office Action on U.S. Appl. No. 12/236,464, dated Jun. 22, 2011, 14 pages. |
Office Action on U.S. Appl. No. 13/250,798 dated Apr. 23, 2014, 15 pages. |
Office Action on U.S. Appl. No. 13/867,556 dated Feb. 7, 2014, 11 pages. |
Office Action U.S. Appl. No. 11/787,460, dated Aug. 31, 2010, 18 pages. |
Office Action with English Translation received in Korean Patent Application 10-2010-7017278, dated Aug. 26, 2015, 5 pages. |
Pictures of DELPHINS, printed from website www.tunnel-in-the-sky.tudelft.nl on Aug. 28, 2007, 4 pages. |
Restriction Requirement for U.S. Appl. No. 13/867,556, dated Dec. 26, 2013, 6 pages. |
Van Kasteren, Joost, “Tunnel-in-the-Sky, Synthetic vision simplifies the pilot's job and enhances safety,” printed from website www.delftoutlook.tudelft.nl on Aug. 28, 2007, 13 pages. |
Walker, GD-Itronix Dynavue Technology, The Ultimate Outdoor-Readable Touch-Screen Display, Rugged PC Review, 4 pages. |
U.S. Appl. No. 12/236,464, filed Sep. 23, 2008, Rockwell Collins. |
U.S. Appl. No. 13/627,788, filed Sep. 26, 2012, Rockwell Collins. |
U.S. Appl. No. 13/857,955, filed Apr. 5, 2013, Barber et al. |
U.S. Appl. No. 13/250,798, filed Sep. 30, 2011, Rockwell Collins. |
U.S. Appl. No. 14/301,199, filed Jun. 10, 2014, Rockwell Collins. |
U.S. Appl. No. 14/482,681, filed Sep. 10, 2014, Rockwell Collins. |
Airports Authority of India, Chapter 7: Visual AIDS for Navigation—Lights, available prior to Jan. 1, 2005, retrieved from the internet at: http://www.aai.aero/aai_employees/chapter_7.pdf on Sep. 26, 2014, 33 pages. |
Brailovsky et al., REVS™: A Radar-Based Enhanced Vision System for Degraded Visual Environments, Proc. of SPIE vol. 9087 908708-1, retrieved from the internet at http://proceedings.spiedigitallibrary.org on Jun. 25, 2014, 13 pages. |
Federal Aviation Administration, Advisory Circular AC 90-106, “Enhanced Flight Vision Systems”, initiated by AFS-400, dated Jun. 2, 2010, 55 pages. |
Federal Aviation Administration, Aeronautical Information Manual (AIM) Basic Flight Information and ATC Procedures, dated Jul. 24, 2014, 2 pages. |
Fountain, J.R., Digital Terrain Systems, Airborne Navigation Systems Workshop (Digest No. 1997/169), IEE Colloquium, pp. 4/1-4/6, Feb. 21, 1997. |
Honeywell, RDR-4B Forward looking windshear detection / weather radar system user's manual with radar operating guidelines, Rev. 6, Jul. 2003, 106 pages. |
Johnson, A., et al., Vision Guided Landing of an Autonomous Helicopter in Hazardous Terrain, Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference, pp. 3966-3971, Apr. 18-22, 2005. |
Kuntman, D., Airborne system to address leading cause of injuries in non-fatal airline accidents, ICAO Journal, Mar. 2000, 4 pages. |
Notice of Allowance for U.S. Appl. No. 11/863,221, dated Aug. 2, 2010, 9 pages. |
Notice of Allowance for U.S. Appl. No. 11/899,801, dated Aug. 19, 2010, 5 pages. |
Notice of Allowance for U.S. Appl. No. 11/900,002, dated Sep. 14, 2010, 5 pages. |
Notice of Allowance for U.S. Appl. No. 12/167,200, dated Oct. 28, 2010, 5 pages. |
Notice of Allowance for U.S. Appl. No. 12/167,203, dated Jun. 21, 2013, 7 pages. |
Notice of Allowance for U.S. Appl. No. 12/167,208, dated Mar. 21, 2011, 8 pages. |
Notice of Allowance for U.S. Appl. No. 12/180,293, dated Aug. 4, 2011, 8 pages. |
Notice of Allowance on U.S. Appl. No. 13/241,051 dated Aug. 28, 2014, 9 pages. |
Notice of Allowance on U.S. Appl. No. 13/247,742 dated Jul. 30, 2014, 9 pages. |
Office Action for U.S. Appl. No. 11/851,323, dated Aug. 6, 2009, 23 pages. |
Office Action for U.S. Appl. No. 11/851,323, dated Dec. 15, 2010, 13 pages. |
Office Action for U.S. Appl. No. 11/851,323, dated Jul. 5, 2012, 23 pages. |
Office Action for U.S. Appl. No. 12/167,200, dated Jul. 21, 2010, 6 pages. |
Office Action for U.S. Appl. No. 12/167,203, dated Aug. 26, 2010, 11 pages. |
Office Action for U.S. Appl. No. 12/167,203, dated Sep. 21, 2012, 6 pages. |
Office Action for U.S. Appl. No. 12/167,208, dated Dec. 30, 2009, 10 pages. |
Office Action for U.S. Appl. No. 12/167,208, dated Jun. 3, 2010, 11 pages. |
Office Action for U.S. Appl. No. 12/167,208, dated Oct. 19, 2010, 8 pages. |
Office Action for U.S. Appl. No. 12/180,293, dated Jan. 4, 2011, 5 pages. |
Office Action for U.S. Appl. No. 12/180,293, dated Jul. 28, 2010, 8 pages. |
Office Action for U.S. Appl. No. 12/976,871, dated Feb. 15, 2012, 8 pages. |
Office Action for U.S. Appl. No. 12/976,871, dated Jul. 10, 2012, 4 pages. |
Office Action for U.S. Appl. No. 12/976,871, dated May 6, 2013, 5 pages. |
Office Action for U.S. Appl. No. 12/976,871, dated Nov. 21, 2012, 5 pages. |
Office Action for U.S. Appl. No. 12/976,871, dated Oct. 9, 2013, 5 pages. |
Office Action for U.S. Appl. No. 13/183,314, dated Aug. 14, 2013, 11 pages. |
Office Action for U.S. Appl. No. 13/183,314, dated Mar. 28, 2013, 12 pages. |
Office Action for U.S. Appl. No. 13/474,559, dated Aug. 28, 2013, 10 pages. |
Office Action for U.S. Appl. No. 13/474,559, dated Dec. 28, 2012, 8 pages. |
Office Action on U.S. Appl. No. 13/241,051 dated Feb. 27, 2014, 21 pages. |
Office Action on U.S. Appl. No. 13/247,742 dated Dec. 3, 2013, 11 pages. |
REVS Product Information Sheet, Sierra Nevada Corporation, dated May 7, 2014, 2 pages. |
Skolnik, Introduction to Radar Systems, McGraw Hill Book Company, 2001, 3 pages. |
Skolnik, Radar Handbook (McGraw Hill Book Company), 1990, 23 pages. |
Synthetic Vision System, en.wikipedia.org/wiki/Synthetic_vision_system, retrieved Feb. 28, 2013, 4 pages. |
Technical Standard Order, TSO-C115b, Airborne Area Navigation Equipment Using Multi-Sensor Inputs, Department of Transportation, Federal Aviation Administration, Sep. 30, 1994, 11 pages. |
U.S. Office Action on U.S. Appl. No. 11/900,002 dated Jun. 8, 2010. |
U.S. Office Action on U.S. Appl. No. 13/247,742 dated Apr. 16, 2014, 15 pages. |
Vadlamani, A., et al., Improving the detection capability of spatial failure modes using downward-looking sensors in terrain database integrity monitors, Digital Avionics Systems Conference, 2003. DASC-03. The 22nd, vol. 2, pp. 9C.5-91-12 vol. 2, Oct. 12-16, 2003. |
Wang et al., A Simple Based on DSP Antenna Controller of Weather Radar, 2001 CIE International Conference, 4 pages. |
Non-Final Office Action on U.S. Appl. No. 13/250,798 dated Feb. 26, 2016, 9 pages. |
Notice of Allowance on U.S. Appl. No. 12/263,282 dated Jan. 29, 2016, 8 pages. |
Notice of Allowance on U.S. Appl. No. 14/301,199 dated Mar. 1, 2016, 11 pages. |
First Office Action on Korean Patent Application No. 10-2016-7013740, dated Sep. 19, 2016, 7 pages. |
U.S. Appl. No. 14/841,558, filed Aug. 31, 2015, Rockwell Collins, Inc. |
McGray et al., Air Operators, Airlines, Manufacturers and Interested Industry Stakeholders & Aero Chart Forum-Utilizing EFVS technology and incorporating it into FAA NextGen, Federal Aviation Administration, Apr. 23, 2014 & Apr. 30, 2014, 34 pages. |
Non-Final Office Action on U.S. Appl. No. 13/250,798, dated Sep. 9, 2016, 6 pages. |
Notice of Allowance on U.S. Appl. No. 13/250,798, dated Sep. 28, 2016, 10 pages. |
Non-Final Office Action on U.S. Appl. No. 14/482,681, dated Dec. 20, 2016, 9 pages. |
English Translation of Japanese Notice of Reasons for Rejection in Japanese Application No. 2016-001165, dated Apr. 25, 2017, 1 page. |
Non-Final Office Action on U.S. Appl. No. 14/270,587, dated May 8, 2017, 16 pages. |
First Office Action with English Translation of Chinese Application No. 201510005057.5, dated Apr. 25, 2017, 8 pages. |
Number | Date | Country
---|---|---|
20160131739 A1 | May 2016 | US |