The present application is related to U.S. patent application Ser. No. 14/162,035 filed on Jan. 23, 2014 by Kronfeld et al., entitled “Weather Radar System and Method With Path Attenuation Shadowing,” U.S. patent application Ser. No. 14/086,844 filed on Nov. 21, 2013 by Breiholz et al., entitled “Weather Radar System and Method for Estimating Vertically Integrated Liquid Content,” and to U.S. patent application Ser. No. 14/465,730 filed on Aug. 21, 2014 by Breiholz et al., entitled “Weather Radar System and Method With Latency Compensation for Data Link Weather Information,” each of which is assigned to the assignee of the present application and incorporated herein by reference in its entirety.
The present disclosure relates generally to the field of weather display systems. More particularly, the present disclosure relates to a weather display system and method configured to provide latency compensation for data-linked weather data.
Aircraft weather radar systems are often used to alert operators of vehicles, such as aircraft pilots, of weather hazards in the area near the aircraft. Such weather radar systems typically include an antenna, a receiver/transmitter, a processor, and a display. The system transmits radar pulses or beams and receives radar return signals indicative of weather conditions. Conventional weather radar systems, such as the WXR 2100 MULTISCAN radar system manufactured by Rockwell Collins, Inc., have Doppler capabilities and can measure or detect parameters such as weather range, weather reflectivity, weather velocity, and weather spectral width or velocity variation. Weather radar systems may also detect outside air temperature, winds at altitude, INS G loads (in-situ turbulence), barometric pressure, humidity, etc.
Weather radar signals are processed to provide graphical images to a radar display. The radar display is typically a color display providing graphical images in color to represent the severity of the weather. Some aircraft systems also include other hazard warning systems such as a turbulence detection system. The turbulence detection system can provide indications of the presence of turbulence or other hazards. Conventional weather display systems are configured to display weather data in two dimensions and often operate according to ARINC 453 and 708 standards. A horizontal plan view provides an overview of weather patterns that may affect an aircraft, mapped onto a horizontal plane. Generally, the horizontal plan view provides images of weather conditions in the vicinity of the aircraft, such as indications of precipitation rates. Red, yellow, and green colors are typically used to symbolize areas of respective precipitation rates, and black symbolizes areas of very little or no precipitation. Each color is associated with a radar reflectivity range that corresponds to a respective precipitation rate range. Red indicates the highest rates of precipitation, while green represents the lowest (non-zero) rates of precipitation. Certain displays may also utilize a magenta color to indicate regions of turbulence.
While aircraft-based weather radar systems may typically provide the most timely and directly relevant weather information to the aircraft crew based on a scan time of a few seconds, the performance of aircraft-based weather systems may be limited in several ways. First, typical radar beam widths of aircraft-based weather radar systems are 3 to 10 degrees. Additionally, the range of aircraft-based weather radar systems is typically limited to about 300 nautical miles, and is typically most effective within about 80-100 nautical miles. Further, aircraft-based weather radar systems may be subject to ground clutter when the radar beam intersects with terrain, or to path attenuation due to intense precipitation or rainfall.
Information provided by aircraft weather radar systems may be used in conjunction with weather information from other aircraft or ground-based systems to, for example, improve range and accuracy and to reduce gaps in radar coverage. For example, the National Weather Service WSR-88D Next Generation Radar (NEXRAD) weather radar system is conventionally used for detection and warning of severe weather conditions in the United States. NEXRAD data is typically more complete than data from aircraft-based weather radar systems due to its use of volume scans of up to 14 different elevation angles with a one degree beam width. Similarly, the National Lightning Detection Network (NLDN) may typically be a reliable source of information for weather conditions exhibiting intense convection. Weather satellite systems, such as the Geostationary Operational Environmental Satellite system (GOES) and Polar Operational Environmental Satellite system (POES) are other sources of data used for weather analyses and forecasts.
While NEXRAD has provided significant advancements in the detection and forecasting of weather, NEXRAD data may have gaps where no data is collected (e.g., due to cone of silence and umbrella of silence regions, insufficient update rates, geographic limitations, or terrain obstructions). Similarly, weather observations and ground infrastructure are conventionally limited over oceans and less-developed land regions. Providing weather radar information from aircraft systems to other aircraft systems and/or ground-based operations may provide significant improvement to weather observations and forecasts by filling such gaps in radar coverage. Similarly, providing weather radar information from ground-based systems to aircraft-based systems may increase the range and accuracy of aircraft-based systems in certain conditions.
One issue with sharing weather data among aircraft-based and ground-based weather radar systems is the discrepancy in apparent location of weather conditions due to the time delay associated with transmitting and displaying shared weather data. For example, depending on circumstances, latencies associated with transmission and delivery of uplinked weather radar data from a ground-based system to an aircraft-based system are commonly on the order of 5-10 minutes, and in some cases may be as long as 20 minutes. Ground-based weather radar systems may also have a lower update rate. Such latency issues have conventionally limited the use of uplinked weather radar data to longer range applications rather than shorter range tactical decisions.
Such latency issues are further compounded when multiple weather radar data sources are combined. Current systems typically provide individual displays for each data source, often with display limitations that require the aircraft crew to form a mental image of an integrated display rather than actually viewing an integrated display. Furthermore, misalignments due to latency issues may increase the size of threat regions such that longer diversions are required for an aircraft to avoid such enlarged threat regions. In addition, the various sensors used by each of the multiple weather data sources may provide dissimilar measurements for a weather condition, creating uncertainty in how to weight the data provided from each source. There is an ongoing need for improved weather radar systems and methods that provide a comprehensive integrated display of weather conditions using multiple weather radar data sources. There is a further need for improved weather radar systems and methods that compensate for latency issues associated with sharing and fusing weather radar data among aircraft-based and ground-based systems to more accurately portray the threat posed by a weather condition. There is a further need for improved weather radar systems and methods that provide a confidence factor reflecting the degree to which various input sources agree in their characterization of the severity of the threat posed by a weather condition. There is a further need for improved weather radar systems and methods that provide a composite estimate of storm top height to provide a more accurate picture of the vertical extent of the threat posed by a weather condition.
According to an exemplary embodiment, a method of displaying an image representative of a weather condition near an aircraft includes receiving weather data representative of the weather condition from a plurality of weather data sources. The weather data includes location data for the weather condition. The method also includes mapping the weather data received from each source to a common locational reference frame based on the location data, adjusting the weather data received from each source to a common hazard scale, determining a hazard level associated with the weather condition for a reference point in the reference frame based on the adjusted weather data for each source, and displaying the image representative of the weather condition near the aircraft based on the hazard level for the reference point.
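By way of a non-limiting illustration, the mapping-to-a-common-hazard-scale and hazard-level-determination steps described above might be sketched as follows. The threshold values, the units, and the max-combining rule are assumptions chosen for illustration only; they are not the disclosed method, and the function names are hypothetical.

```python
def to_common_scale(value, source_scale):
    """Map a source-specific severity value onto a common 0-3 hazard scale.

    source_scale is a list of ascending thresholds; the hazard level is the
    number of thresholds the value meets or exceeds.
    """
    return sum(value >= t for t in source_scale)

def fuse_hazard(levels):
    """Combine per-source hazard levels at one reference point (here: max)."""
    return max(levels) if levels else 0

# Two hypothetical sources reporting the same reference point:
onboard_dbz = 42.0   # onboard radar reflectivity (dBZ), illustrative value
nexrad_vil = 3.6     # ground-based VIL (kg/m^2), illustrative value

level_onboard = to_common_scale(onboard_dbz, [20.0, 30.0, 40.0])  # level 3
level_ground = to_common_scale(nexrad_vil, [0.1, 3.5, 6.9])       # level 2

hazard = fuse_hazard([level_onboard, level_ground])
```

In this sketch, disagreement between the two sources (levels 3 and 2) is resolved conservatively by taking the maximum; other combining rules (weighted averages, confidence-weighted fusion) are equally consistent with the embodiments described herein.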
According to another exemplary embodiment, an aircraft weather radar system includes a processor and a non-transitory memory coupled to the processor. The memory contains program instructions that, when executed, cause the processor to receive weather data representative of a weather condition from a plurality of weather data sources. The weather data includes location data for the weather condition. The instructions further cause the processor to map the weather data received from each source to a common locational reference frame based on the location data, adjust the weather data received from each source to a common hazard scale, determine a hazard level associated with the weather condition for a reference point in the reference frame based on the adjusted weather data for each source, evaluate a confidence level for the hazard level at the reference point, and display an image representative of the weather condition near the aircraft based on the confidence level for the hazard level at the reference point.
According to another exemplary embodiment, a weather radar system includes a processor and a non-transitory memory coupled to the processor. The memory contains program instructions that, when executed, cause the processor to receive echo top data for a weather condition from first and second weather data sources, map the echo top data to a common reference frame, adjust the echo top data received from the first source based on the echo top data received from the second source, and generate an image representative of the weather condition based on the adjusted echo top data.
The disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements, in which:
Before turning to the figures, which illustrate the exemplary embodiments in detail, it should be understood that the application is not limited to the details or methodology set forth in the description or illustrated in the figures. It should also be understood that the terminology is for the purpose of description only and should not be regarded as limiting. As discussed below, the systems and methods can be utilized in a number of display devices for various types of applications or sensing systems. In some embodiments, the systems and methods of the present disclosure may be used for a flight display of an aircraft. According to various other exemplary embodiments, the systems and methods of the present disclosure may be used by any system in any other embodiment for rendering computer graphics and displaying an output (e.g., in another aircraft or spacecraft, a ground-based vehicle, or in a non-vehicle application such as a ground-based weather radar system).
Referring to
In some embodiments, flight displays 20 may provide an output from an aircraft-based weather radar system, LIDAR system, infrared system or other system on the aircraft. For example, flight displays 20 may include a weather display, a joint display, a weather radar map, and a terrain display. Further, flight displays 20 may include an electronic display or a synthetic vision system (SVS). For example, flight displays 20 may include a display configured to display a two-dimensional (2-D) image, a three-dimensional (3-D) perspective image of terrain and/or weather information, or a four-dimensional (4-D) display of weather information or forecast information. Other views of terrain and/or weather information may also be provided (e.g., plan view, horizontal view, vertical view, etc.). The views may include monochrome or color graphical representations of the terrain and/or weather information. Graphical representations of weather or terrain may include an indication of altitude of the weather or terrain or the altitude relative to the aircraft.
Aircraft control center 10 may include one or more user interface (UI) elements 22. UI elements 22 may include, for example, dials, switches, buttons, touch screens, keyboards, a mouse, joysticks, cursor control devices (CCDs) or other multi-function key pads certified for use with avionics systems, etc. UI elements 22 may be configured to, for example, allow an aircraft crew member to interact with various avionics applications and perform functions such as data entry, manipulation of navigation maps, and moving among and selecting checklist items. For example, UI elements 22 may be used to adjust features of flight displays 20, such as contrast, brightness, width, and length. UI elements 22 may also (or alternatively) be used by an occupant to interface with or change the displays of flight displays 20. UI elements 22 may additionally be used to acknowledge or dismiss an indicator provided by flight displays 20. Further, UI elements 22 may be used to correct errors on the electronic display. Other UI elements 22, such as indicator lights, displays, display elements, and audio alerting devices, may be configured to warn of potentially threatening conditions such as severe weather, terrain, obstacles, etc.
Referring to
Radar system 50 may generally work by sweeping a radar beam horizontally back and forth across the sky. For example, radar system 50 may conduct a first horizontal sweep 52 directly in front of the aircraft and a second horizontal sweep 54 downward at a tilt angle 56 (e.g., 20 degrees down). Returns from different tilt angles may be electronically merged to form a composite image for display on an electronic display, such as a flight display 20 in aircraft control center 10. Returns may also be processed to, for example, distinguish among terrain, weather, and other objects, to determine the height of the terrain, to determine the height of the weather, etc.
Radar system 50 may also sweep a radar beam vertically back and forth at varying vertical tilt angles. Results from the different vertical tilt angles may be analyzed to determine the characteristics of weather. For example, the altitude, range, and vertical height of weather may be determined using the vertical scan results. The vertical scan results may be used to form an image for display on an electronic display (e.g., flight display 20, etc.). For example, a vertical profile view of the weather may be generated. The profile may be used by a pilot to determine height, range, hazards and threats, and other relevant information that may be utilized by an aircraft crew member to change the course of the aircraft to avoid the detected weather condition.
Referring to
Additionally, weather radar system 202 may perform multiple radar sweeps. The radar sweeps may include horizontal sweeps, vertical sweeps, or a combination of horizontal and vertical sweeps. Further, the radar sweeps can be performed such that they are substantially orthogonal to one another. According to other exemplary embodiments, weather radar system 202 can be a monopulse radar system, a sequential lobing system, or a radar system with an aperture capable of switching modes. Aircraft sensors 203 may include, for example, one or more lightning sensors, turbulence sensors, pressure sensors, optical systems (e.g., camera system, infrared system), outside air temperature sensors, winds at altitude sensors, INS G load (in-situ turbulence) sensors, barometric pressure sensors, humidity sensors, or any other aircraft sensors or sensing systems that may be used to monitor weather and detect, for example, lightning, convective cells, clear air turbulence, etc. Data from aircraft sensors 203 may be output to processor 204 for further processing and display, or for transmission to a station 220 (e.g., a ground-based weather radar system or terrestrial station) or to other aircraft 230, 240 via communication system 208.
Weather radar system 202 may be a system for detecting weather patterns. Detected weather patterns may be communicated to electronic display system 206 for display to the flight crew. In addition, data from station 220 may be displayed on display system 206. Detected weather patterns may instead or may also be provided to electronics or processor 204 for further analysis or transmission to a station 220 or another aircraft 230, 240 via communication system 208.
Station 220 may direct the aircraft 201, 230, 240 via communication system 208 to scan in specific areas to improve detection accuracy of weather. Alternatively, system 202 may request that station 220 and aircraft 230, 240 direct a scan towards weather of interest to aircraft 201 (e.g., in the flight path) to improve weather detection accuracy. The scans performed by radar system 202 and the requests may be transmitted to station 220 or another aircraft 230, 240 via communication system 208.
Referring to
Various types of channels may be utilized including virtual channels, radio channels, satellite channels, etc. The channels may be bi-directional or uni-directional. Channels may be satellite link channels, VHF channels, INMARSAT channels, etc. Any type of wireless communications may be utilized. Various types of communication protocols, including network and ad hoc network protocols may be used to perform communication operations and establish the channels in
The weather data exchanged among ground station 320 and aircraft 301, 330, and 340 may be in a number of forms. For example, the weather data may include radar data containing location information, motion vector data, time of sensing information, and measured parameter values for a weather condition 390. The location information may be in, for example, a format based on azimuth, elevation, and range from the radar system or another fixed reference point, in a rectangular grid format, a georegistered format, or other format. The radar data may also include radar characteristics associated with the radar used to provide the radar data. The characteristics may include an indication of band-type, radar quality, tilt angle, etc. In some embodiments, station 320 may adjust radar data for its particular band so that comparisons and selection of data are consistent.
In some embodiments, the weather data may be provided from a plurality of sources. Such weather data may also be indicative of one or more types of weather conditions. For example, weather data may be indicative of convective weather systems (e.g., thunderstorms), turbulence, winds aloft, icing, and/or volcanic ash. In some embodiments, data regarding convective weather systems may be provided from a ground-based weather system such as NEXRAD. Such data may include IDs for an adaptable number of weather cells, which may be segmented (e.g., delivered in polygon format) weather cells identified in a series of radar volume scans. Individual weather cells may be, for example, 3-D regions of significant reflectivity or other values above one or more specified threshold values. Individual weather cells may be composed of reflectivity radial run segments, and in turn, 2-D weather components composed of segment groups and occurring at different radar elevation angles. Weather components with calculated mass weighted centroids may be vertically correlated into a cell with an established centroid. Such weather cell data may also include individual data points and trends for each weather cell. For example, current weather cell location may be provided with azimuth, range, direction, and speed information, such as a motion vector using polar and/or Cartesian coordinates along with an estimate of any tracking errors. Other examples include storm base height, storm top height, maximum reflectivity, height of maximum reflectivity, probability of hail, probability of severe hail, cell-based vertically integrated liquid (VIL) content, enhanced echo tops (EET) and centroid height. Weather tracking data may be generated by monitoring movement of weather cells and matching cells in current and prior volume scans. Forecast data may be generated by predicting future centroid locations based on prior volume scans, and growth, decay, and/or shape change estimates. 
Average data for multiple weather cells may be provided as well (e.g., average motion vector data). The weather data may be provided as, for example, a table of alphanumeric values, and/or as a stand-alone display or graphical overlay.
In some embodiments, weather data indicative of weather conditions exhibiting intense convection may include lightning data such as that provided by the NLDN. Such data may include indications of individual discharges or flash rates in a given area. In some embodiments, pilot reports (PIREPs) may be used to indicate turbulence. In some embodiments, observation, nowcast and/or forecast data from weather satellite systems, such as the Geostationary Operational Environmental Satellite system (GOES) and Polar Operational Environmental Satellite system (POES) may also be used (e.g., to track volcanic ash cloud behavior). In some embodiments, radiosonde data from weather balloons may be used. In some embodiments, data from satellite sources or nowcasting weather data sources (e.g., the Corridor Integrated Weather System (CIWS)) may be used.
Referring to
Memory 406 may include any type of machine-readable storage device capable of storing radar returns or associated weather data 417 (shown in
In some embodiments, memory 406 may store in a readily addressable and rapidly retrievable manner at least two sets of weather data 417 resulting from two or more antenna or radar beam sweeps at different angles or tilt angles. Although multi-scan, multi-tilt scanning and data sets are described, it should be understood by one of ordinary skill in the art that a single scan of data may also be used in some embodiments. Memory 406 may also include, according to one embodiment, a three-dimensional storage buffer for storing weather radar parameters according to X, Y, and Z coordinates. The storage of radar data and the form of the weather data 417 stored in the memory 406 are not disclosed in a limiting fashion. A variety of techniques for storing weather data 417 may be used as well.
In some embodiments, weather data 417 may be stored (e.g., in the memory 406) as a mathematical equation representation of the information. The mathematical equation representation may be a piecewise linear function, piecewise nonlinear function, coefficients of a cubic spline, coefficients of a polynomial function, etc., that provides a vertical representation of a weather condition based on the horizontal scan data and/or a horizontal representation of the weather condition based on the vertical scan data. The function may be an equation based on weather parameters that may be sensor driven, model driven, a merger of sensor and model, etc. Although horizontal scan data is described, alternative embodiments may include Cartesian coordinates, rho/theta input, latitude and longitude coordinates, altitude, etc. Weather conditions may be estimated for any desired point in space with the vertical dimension being the subject of the weather equation.
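As a non-limiting sketch of the piecewise-linear representation described above, a vertical reflectivity profile might be stored as a short list of knots and evaluated at any altitude by interpolation. The knot values and units here are illustrative assumptions, not data from any disclosed embodiment.

```python
def make_piecewise_linear(knots):
    """Return f(altitude) that linearly interpolates between (altitude, value) knots."""
    knots = sorted(knots)
    def f(x):
        if x <= knots[0][0]:
            return knots[0][1]           # clamp below the lowest knot
        if x >= knots[-1][0]:
            return knots[-1][1]          # clamp above the highest knot
        for (x0, y0), (x1, y1) in zip(knots, knots[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return f

# Reflectivity (dBZ) sampled at a few altitudes (ft), stored as three knots
# instead of raw scan samples:
profile = make_piecewise_linear([(0, 45.0), (20000, 50.0), (40000, 10.0)])
mid = profile(30000)   # interpolated between the 20,000 ft and 40,000 ft knots
```

Storing a handful of knots (or spline/polynomial coefficients) in place of raw samples allows the weather condition to be estimated at any desired point, consistent with the compact equation-based storage described in this paragraph.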
Processor 408 may be implemented in hardware, firmware, software, or any combination of these methods. System 400 may have one or more processors 408 that use the same or a different processing technology. Additionally, processor 408 may be a separate component of system 400 or may be embedded within another component of system 400. Processor 408 may execute instructions that may be written using one or more programming languages, scripting languages, assembly languages, etc. The instructions may be carried out by, for example, a special purpose computer, logic circuits, or hardware circuits. The term “execute” refers to the process of running an application or carrying out the operation called for by an instruction. Processor 408 may process data and/or execute applications stored in memory 406, such as weather data 417 and weather image application 418 and/or other instructions.
Processor 408 may be included as part of a multi-scan, multi-tilt angle weather radar system and may perform the customary functions performed by a conventional weather radar return processing unit. Processor 408 may also perform several additional operations based upon the additional data and/or instructions provided in memory 406. In general, processor 408 may merge or cross qualify portions, or ranges, of the radar returns of several different antenna sweeps at several different tilt angles, so that a single, relatively clutter-free image may be presented to the pilot based upon the several separate scans. The radar returns may be processed by processor 408 to generate a 2-D, 3-D, or 4-D weather profile of the weather near the aircraft. In some embodiments, processor 408 may merge or cross qualify portions, or ranges, of the radar returns or weather data of several different sources, including weather data from one or more remote sources 414, so that a composite or fused image may be presented to the pilot based upon the several weather data sources.
Processor 408 may process weather radar returns to identify or sense the presence of weather conditions in front of (e.g., in the flight path) or in view of the aircraft. In some embodiments, processor 408 may utilize the altitude and range of the weather condition to generate a vertical profile associated with the weather. Processor 408 may scan across an array of azimuths to generate a 3-D weather profile of the weather near the aircraft, which may be stored for later presentation and/or displayed on display 410. In some embodiments, additional visual indicators other than the representation of weather are provided on display 410. In some embodiments, a range and bearing matrix having range markers indicating distance from a current location of the aircraft and bearing markers indicating azimuths from a current flight path or bearing of the aircraft may be provided and may assist the pilot in cognitive recognition of weather features from the pilot's perspective.
Referring now to
Weather data 417a from returns received by antenna 404 and weather data 417b from remote source 414 may be stored in memory 406. Weather data 417b from remote source 414 may be received via communications unit 416 (shown in
Referring again to
Referring now to
At a step 520, the weather data may be filtered. For example, weather data that is outdated or corrupted may be removed. In some embodiments, radar returns data or other weather data indicative of convective weather conditions may be removed if it is older than one update cycle. In some embodiments, lightning data provided as individual flash records may be managed such that flash density in a time window of interest may be determined, and flash records outside of the window of interest may be discarded. In some embodiments, data outside of a defined spatial coverage area may be removed. For example, if NEXRAD data is to be combined with other data sources, data from the other sources falling outside of a defined spatial coverage for the NEXRAD data may be removed or may be assigned a lower level of confidence. In some embodiments, weather data within a certain range of the physical radar source may be rejected. For example, data within 35 kilometers of a radar source may typically be rejected where it may be assumed that the scan pattern results in inverted cones of silence in the radar returns data, when the volume coverage pattern and altitude of the weather condition indicate a concern. In some embodiments, data from one source may not be used if data from another source is deemed of better quality. For example, data from a radar station using a clear air volume coverage pattern may not be used if there is another station using a convective volume coverage pattern for the same area, and convectivity is the primary focus.
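The filtering rules of step 520 might be sketched as follows. The record field names, the five-minute update cycle, and the 35 km rejection radius are assumptions for illustration (the 35 km figure echoes the example above, but the data structure is hypothetical).

```python
UPDATE_CYCLE_S = 300    # assumed ground-radar update cycle (5 minutes)
MIN_RANGE_KM = 35       # reject returns inside the cone-of-silence range

def filter_weather_data(records, now):
    """Drop records that are outdated, corrupted, or too close to the radar."""
    kept = []
    for rec in records:
        if now - rec["time"] > UPDATE_CYCLE_S:                # older than one cycle
            continue
        if rec.get("range_km", MIN_RANGE_KM) < MIN_RANGE_KM:  # inside cone of silence
            continue
        if rec.get("corrupted"):                              # failed integrity check
            continue
        kept.append(rec)
    return kept

now = 1_000_000.0
records = [
    {"time": now - 60, "range_km": 100},    # fresh and in range: kept
    {"time": now - 600, "range_km": 100},   # older than one cycle: dropped
    {"time": now - 60, "range_km": 10},     # inside 35 km of the radar: dropped
]
fresh = filter_weather_data(records, now)
```

A practical implementation would also consult the volume coverage pattern and weather altitude before applying the range rejection, as the paragraph above notes.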
At a step 530, weather data may be mapped to a common reference frame. In some embodiments, the received weather data may be mapped to a common gridded format using, for example, a 1 kilometer grid. Such a grid may provide a uniform structure in latitude and longitude to which one or more sets of weather data from one or more sources may be mapped. Other formats and grid spacings are contemplated as well, depending on required detail, available memory, and processing capabilities.
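The mapping of step 530 might be sketched as a conversion from latitude/longitude to indices of a roughly 1 km grid. The spherical-earth approximation, the grid origin, and the function name are assumptions for illustration; an operational system could equally use a georegistered or projected grid.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean earth radius, spherical approximation

def to_grid(lat, lon, origin_lat, origin_lon, spacing_km=1.0):
    """Return (row, col) of the grid cell containing (lat, lon).

    Rows count kilometres north of the origin, columns kilometres east,
    using a local flat-earth approximation around the origin.
    """
    dlat = math.radians(lat - origin_lat)
    dlon = math.radians(lon - origin_lon)
    north_km = EARTH_RADIUS_KM * dlat
    east_km = EARTH_RADIUS_KM * math.cos(math.radians(origin_lat)) * dlon
    return (int(north_km // spacing_km), int(east_km // spacing_km))

# Two sources reporting the same storm in slightly different positions land
# in the same 1 km cell of the common reference frame:
cell_a = to_grid(40.01, -105.00, 40.0, -105.0)
cell_b = to_grid(40.012, -104.999, 40.0, -105.0)
```

Once both data sets index into the same grid, per-cell comparison, fusion, and confidence evaluation become straightforward lookups.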
In some embodiments, weather data received in a non-gridded format (e.g., weather cell tracking data, lightning flash data, etc.) may be linked to the common gridded format. For example, referring now to
Referring again to
In some embodiments, the received weather data may include data indicative of a respective first location for each of a plurality of individual weather cells, which may or may not be segmented. The received weather data may further include a respective motion vector for each weather cell. Each respective motion vector may include data indicative of a speed and direction of travel for its corresponding weather cell, or for each grid location within a relevant region. For segmented weather cells, the respective motion vector for each cell may be used to advect the weather cell an appropriate distance and direction from a first location to an updated second location. For gridded weather data, the respective motion vectors for each gridded location may be used to advect the weather condition to an updated second location for each grid point.
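The advection step described above might be sketched as follows: a cell's reported position is moved along its motion vector for the duration of the data's latency. The flat local kilometre frame, the knots/heading vector format, and the function name are illustrative assumptions.

```python
import math

def advect(pos_km, speed_kts, heading_deg, latency_s):
    """Move a (north_km, east_km) position along a heading at a given speed
    for latency_s seconds, returning the updated position."""
    dist_km = speed_kts * 1.852 * latency_s / 3600.0  # knots -> km travelled
    north = pos_km[0] + dist_km * math.cos(math.radians(heading_deg))
    east = pos_km[1] + dist_km * math.sin(math.radians(heading_deg))
    return (north, east)

# A cell reported 10 minutes ago, moving due east at 30 knots, has travelled
# about 9.26 km east by display time:
updated = advect((50.0, 20.0), speed_kts=30.0, heading_deg=90.0, latency_s=600)
```

Applying the same computation per grid point (with per-point vectors) yields the gridded-data variant described in this paragraph.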
In some embodiments, an estimated motion vector may be determined for a grouping of the plurality of weather cells (e.g., based on an average, interpolation, a data quality analysis, etc.), and the estimated motion vector may be used to update the location of one or more of the weather cells in the grouping, the entire grouping, or weather cells outside the grouping (e.g., non-segmented cells, cells without motion vector or tracking data, cells exhibiting erratic motion with respect to surrounding cells, etc.). For example, a motion vector for the grouping of cells may be received from the weather data source (e.g., NEXRAD data or data from an aircraft weather radar system) or may be determined from a plurality of individual motion vectors by the weather radar system receiving the weather data (e.g., based on an average, interpolation, a data quality analysis, etc.). The estimated motion vector may be applied to one or more cells in the grouping, or to cells outside the grouping, such as an adjacent or nearby weather cell. In some embodiments where weather data from multiple sources is to be combined, fused or overlaid, the estimated motion vector may be calculated using data for one or more weather cells from a first one of the weather data sources and applied to weather cells from other data sources (e.g., weather cells that may not be correlated with weather cells in the weather data from the first source). In some embodiments, general atmospheric motion data from numerical analysis and forecasting may be used to provide a default motion vector. In some embodiments the estimated motion vector may be received from a nowcasting or forecasting weather product, such as CIWS. In some embodiments where weather data from multiple sources may be combined to estimate motion vectors for each grid location, a decision based on the quality of the weather data may be made. 
In some embodiments where weather data is not available for all grid locations, general atmospheric motion data from numerical analysis and forecasting may be used to provide a default motion vector for certain grid locations.
In some embodiments, uncertainty in the advection process may be reflected by adjusting the size of a weather condition indicator. For example, the size of a weather condition indicator may be increased in proportion to the distance that the weather condition may be advected. Similarly, the perimeter of a weather cell may be increased by a predetermined number of grid points or image pixels (e.g., one grid point or pixel) to account for changes such as growth or decay or a change in shape. In some embodiments, a weather data source may provide weather data including an estimate of possible errors in the tracking process for one or more weather cells, or an estimate of a change in size of the weather condition due to growth or decay. The error estimate or change estimate may be used to adjust the size of the weather condition indicator. In some embodiments, overlay images may be generated to indicate estimated error or growth and decay. In some embodiments, the size of a weather condition indicator may be increased or decreased according to the estimated error and/or growth or decay, and/or the severity level indicated by the weather condition indicator may be increased or decreased. In some embodiments, when divergent motion vectors will result in gaps in advected data, each gap may be evaluated based on the motion vectors of adjacent grid points. In some embodiments, multiple weather data sources having a common time and grid may be advected using a common motion vector.
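Growing a cell's perimeter by one grid point is essentially a binary dilation. A minimal pure-Python sketch, with the grid represented as nested lists (an assumption of this sketch), might look like:

```python
def dilate(mask):
    """Grow a cell's footprint by one grid point in every direction to
    reflect advection uncertainty (a binary dilation on a 2-D grid)."""
    rows, cols = len(mask), len(mask[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if mask[r][c]:
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            out[rr][cc] = 1
    return out
```

Applying the dilation repeatedly would enlarge the indicator in proportion to the advection distance.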
In some situations, the received weather data may include one or more weather cells with motion vectors indicating motion significantly different from that of adjacent or nearby weather cells (e.g., due to collapse of a large weather cell generating a diverging pool of cold air that spawns new cells around its perimeter). In some embodiments, this may be addressed by analyzing the tracking history or determining a confidence estimate for any divergent weather cells. For divergent weather cells with a short tracking history (e.g., based on a minimum threshold of samples), an average motion vector for a grouping of surrounding or nearby cells may be calculated and used for advecting the divergent weather cell. For divergent weather cells with a sufficient number of samples, the motion vector for the weather cell may be used to advect the weather cell to an updated location. In some embodiments, both the average and actual motion vectors may be used to advect the divergent weather cell to two locations per the calculated distances for each motion vector, and an image of the divergent weather cell may be enlarged to cover both advected locations.
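The divergent-cell logic above might be expressed as follows. The five-sample threshold and the use_both flag are illustrative assumptions; the disclosure mentions a minimum threshold of samples without fixing a value:

```python
def divergent_cell_vectors(own_vector, track_samples, neighbor_average,
                           min_samples=5, use_both=False):
    """Motion vector(s) for a cell whose motion diverges from neighbors:
    fall back on the neighbor average when the track history is short,
    trust the cell's own track once enough samples exist, or optionally
    advect along both (and enlarge the cell image to cover both
    resulting locations)."""
    if use_both:
        return [own_vector, neighbor_average]
    if track_samples < min_samples:
        return [neighbor_average]
    return [own_vector]
```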
Referring now to
In some embodiments, weather data for a weather condition may be received from multiple unsynchronized sources (e.g., multiple NEXRAD radar installations). For example, an aircraft having an aircraft-based weather radar system and receiving weather data from a ground-based weather radar system may be moving from the radar coverage area of one ground-based system to another ground-based system. In such embodiments, the location of a weather condition may be advected using weather data from whichever data source provides the best tracking geometry. For example, the respective range of each of the weather data sources may be used to determine which source may provide the best source of data for advection (e.g., azimuth resolution may deteriorate with increasing distance, data near a data source may be limited due to a low maximum radar elevation range, etc.). In some embodiments, an estimated motion vector may be calculated using data from one or more or all of the available data sources within range of the weather condition. In some embodiments, multilateration may be applied to range data received from each of the weather data sources rather than using motion vector data in order to avoid azimuth resolution issues. In some embodiments, the volume coverage pattern, altitude, range, and age of the weather data for each weather data source may be used to estimate a number of radar slices or beams in order to determine the quality of the weather data from each source.
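Selecting the source with the best tracking geometry could be sketched as below. The usable range window (too close risks gaps in overhead coverage due to limited maximum elevation; too far degrades azimuth resolution) uses assumed limits, and the input representation is an assumption of this sketch:

```python
def best_source(sources, min_nm=10.0, max_nm=120.0):
    """sources: list of (name, range_to_cell_nm) pairs.  Return the name
    of the nearest source whose range to the cell lies inside an assumed
    usable window, or None if no source qualifies."""
    usable = [(rng, name) for name, rng in sources if min_nm <= rng <= max_nm]
    return min(usable)[1] if usable else None
```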
Referring now to
A quality estimate may be determined for weather data from each weather data source. For example, at a step 840, a first criterion may be evaluated on a cell-by-cell basis for weather data received from each of the weather data sources for each corresponding cell. In some embodiments, the first criterion may be indicative of negative factors associated with the weather data for each corresponding weather cell from each data source. According to an exemplary embodiment, these factors may include, but are not limited to, one or more of the following: (a) absence of tracking data; (b) data aged beyond a predetermined limit (e.g., 15 minutes); (c) direction data received without a speed value; (d) the difference between a minimum acceptable number of elevation scans (e.g., five beams) and the number of scans that fall below 40,000 feet of the cell location; (e) tracking error estimates provided by the data source (e.g., NEXRAD error estimates); (f) absence of forecast data; (g) absence of cell structure data; and (h) data outside of a maximum allowable distance range for the weather data source. Appropriate weights may be applied to each factor. The value of the first criterion for each corresponding cell from each weather data source may be compared with a threshold value, and cell data from a source with a value of the first criterion greater than the threshold value may be discarded. If no weather data from any weather data source for a particular cell falls below the threshold value, then the weather cell from the source having the lowest value may be selected for further evaluation. If weather data for one or more corresponding weather cells falls below the threshold, then each of these weather cells may be selected for further evaluation.
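A sketch of the first-criterion scoring follows. The factor names, weights, and threshold below are illustrative assumptions; the disclosure lists the factors but does not fix weights or a threshold value:

```python
# Assumed weights for the negative factors (a)-(h); illustrative only.
NEG_WEIGHTS = {
    "no_tracking": 5.0,
    "stale": 4.0,
    "direction_without_speed": 3.0,
    "missing_scans": 1.0,      # per elevation scan short of the minimum
    "tracking_error": 2.0,
    "no_forecast": 2.0,
    "no_structure": 1.0,
    "out_of_range": 5.0,
}

def first_criterion(flags, threshold=8.0):
    """Weighted sum of negative factors for one cell from one source.
    flags maps factor name -> boolean or count.  Returns (score, discard),
    where discard=True means the cell data from this source is rejected."""
    score = sum(NEG_WEIGHTS[k] * float(v) for k, v in flags.items())
    return score, score > threshold
```

Cells from all sources scoring at or below the threshold would pass to the second criterion; if none qualify, the lowest-scoring source would be carried forward.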
At a step 850, a second criterion may be evaluated on a cell-by-cell basis for corresponding weather cells from each weather data source selected for further evaluation. In some embodiments, the second criterion may be indicative of positive factors associated with the data for each corresponding weather cell from each data source. According to an exemplary embodiment, more highly weighted factors may include, but are not limited to, one or more of the following: (a) the number of forecast values provided (e.g., for NEXRAD data, the number of 15, 30, 45, and 60 minute forecasts received, with higher weighting assigned to the 15 and 30 minute forecasts); (b) the estimated number of beams in the volume scan pattern that fall below a particular aircraft altitude (e.g., 40,000 feet) at the range of the cell multiplied by a weighting factor; and (c) volume scan patterns used under differing ambient conditions (e.g., a higher value may be assigned to a higher number of scans and a higher sweep rate used by NEXRAD for convective weather, and a lower value to a lower number of scans or a lower scan rate used by NEXRAD in stratiform rain, clear air, or winter modes). In some embodiments, lower weighted factors may include: (d) freshness of data (e.g., decremented by the difference between the current time and the time of sensing, with a linear decrement from no decrement for fresh data to a score of zero for data of the maximum allowable age (e.g., 15 minutes)); (e) forecast motion accuracy, with a decreasing value as the error approaches a maximum acceptable error (e.g., 5 nautical miles); (f) VIL for each cell; and (g) maximum radar reflectivity for each cell. Appropriate weights may be applied to each factor. The value of the weighted sum of these factors for each corresponding cell may be used to select the data source for each weather cell having the highest value.
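The second-criterion selection reduces to a weighted sum of positive factors per source, with the maximum chosen. All factor names, values, and weights in this sketch are assumed for illustration:

```python
def select_source(factor_values, weights):
    """factor_values: {source_name: {factor: value}}.  Return the source
    whose weighted sum of positive factors is highest."""
    def score(src):
        return sum(weights.get(f, 0.0) * v
                   for f, v in factor_values[src].items())
    return max(factor_values, key=score)
```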
At a step 860, one or more of the weather cells evaluated in steps 840 and 850 may be selected as providing the best weather data based on the evaluation. For example, a composite set of motion vector data may be generated using the data for each weather cell selected based on the evaluations in steps 840 and 850. At a step 870 each weather cell may be advected to an updated location for display. In some embodiments, if no weather cell from any weather data source for a particular cell is found acceptable, then a motion vector (e.g., an average motion vector, a motion vector based on a quality estimate, etc.) may be calculated from a grouping of adjacent or nearby cells and used to advect that particular cell.
In some situations, weather data received from a weather data source may include one or more weather cells without motion vector or tracking data. In some embodiments, a motion vector based on all other cells having motion vector data may be used as described above, provided that the region of interest represented by the weather cell does not exhibit strong rotation (e.g., in the presence of a strong low-pressure area). In some embodiments, a more accurate estimate may be achieved by calculating the motion vector based on adjacent or nearby cells as described above. In some embodiments, the motion vector may be based on weather model information (e.g., High Resolution Rapid Refresh, Weather Research and Forecasting, etc.).
Referring now to
Referring again to
Referring again to
In some embodiments, lightning data received from one or more weather data sources may be factored in the assignment of each VIP value. As shown in
A VIP value or other hazard level may also be applied to other weather data, such as observations, nowcasts, and/or forecasts of turbulence, icing, High Altitude Ice Water Content areas, volcanic ash, etc. In some embodiments, inferred hail data received from one or more weather data sources may be factored in the assignment of each VIP value. For example, given that hail is indicative of severe weather (e.g., a VIP value of 3 or higher), indication of the presence of hail may be evidence that a weather cell is hazardous. In some embodiments, NEXRAD inferred hail data using multiple elevation angles and referenced to the freezing and −20° C. levels may be used. Similarly, inferred lightning data (e.g., from an aircraft-based weather radar system) may be factored into the assignment of each VIP value. In some embodiments, VIP threat information for turbulence may be used (e.g., Graphical Turbulence Guidance data, aircraft-based radar data). In some embodiments, icing data from in-situ icing detectors (e.g., own-ship or other aircraft) or weather data sources may be used. Volcanic ash information may come from, for example, an infrared detection system (e.g., own-ship, another aircraft, etc.) or derived from satellite weather data sources. In some embodiments, consistency of each VIP level or value may be cross-checked with a received echo top height measurement from one or more weather data sources, given that heavy stratiform rain systems may have heavy rainfall, but lack the high storm top altitudes associated with severe convective weather systems.
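Folding hail and lightning evidence into a VIP value might look like the following. The rule that hail implies at least VIP 3 comes from the passage above; the lightning adjustment and the cap at VIP 6 (the top of the standard VIP scale) are assumptions of this sketch:

```python
def adjust_vip(base_vip, hail=False, lightning_rate=0.0):
    """Raise a cell's VIP level when independent hazard evidence
    (inferred hail, lightning activity) indicates the cell is more
    severe than its reflectivity alone suggests."""
    vip = base_vip
    if hail:
        vip = max(vip, 3)      # hail is indicative of severe weather (VIP >= 3)
    if lightning_rate > 0.0:
        vip = max(vip, 2)      # assumed: any lightning implies at least VIP 2
    return min(vip, 6)         # VIP scale tops out at 6
```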
In some embodiments, sounding data received from one or more weather radar data sources may be factored into the assignment of each VIP value. Sounding data may generally provide profiles of temperature, dew point, and wind speed and direction as a function of altitude, and may provide insight into the likelihood and severity of convective weather. For example, the region between freezing and −20° C. may provide the highest radar reflectivity due to the increased likelihood of liquid coated hail in this region. The tropopause height may generally indicate how high air will continue to rise during convection. Exemplary indices derived from sounding data that may be indicative of convective weather may include Convective Available Potential Energy (CAPE) and Convective Inhibition (CIN). High CAPE may indicate the likelihood of intense storms. High CIN may indicate that a greater atmospheric disturbance will be required to initiate convective weather, but also that if convection does occur, it may be more intense.
In some embodiments, data from an aircraft-based weather radar system, such as ambient temperature, altitude, air pressure, and wind direction and speed may be factored into the assignment of VIP values. In some embodiments, this data may be used in conjunction with sounding data, analyses, or forecasts to adjust estimated atmospheric profiles based on, for example, 12-hour soundings to the current time to provide improved estimates of parameters such as freezing level or tropopause height. In some embodiments, weather data for a predetermined distance below the altitude of the aircraft, (e.g., 10,000 feet), may be suppressed as being less hazardous to the aircraft.
In some embodiments, other parameters may be factored into the assignment of each VIP value. For example, in some embodiments, reported surface data may be used in conjunction with aircraft-based air data measurements to estimate current atmospheric temperature profiles. In some embodiments, satellite-derived weather data may be used, including, for example, images formed at multiple visible and infrared wavelengths. These images may be used individually or combined together with other weather data for specific purposes targeted at convective activity such as cloud cover, cloud top temperature (from which cloud top height may be inferred), and convective initiation. A determination of cloud cover may be useful in removing spurious data from weather data. In some embodiments, the cloud cover image may be used as a mask, which may be advected to compensate for latency, and used to mask out weather radar returns from cloudless areas. Such returns may be due to flocks of birds or insects or to atmospheric anomalies or solar interference. In some embodiments, cloud top height derived from the measured temperature of the clouds referenced to the atmospheric temperature profile derived from soundings may be used in lieu of or in addition to radar cloud top data to gauge the severity of storms and to determine their relevance to aircraft flying at high altitudes. In some embodiments, convective initiation may be used as an early indicator of convective activity and as a source of information about convection beyond the range of airborne weather radar in parts of the world where ground radar data may not be available.
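Applying an advected cloud-cover image as a mask over radar returns can be sketched simply. Grids as nested lists and zero reflectivity standing in for "no weather" are both assumptions of this sketch:

```python
def mask_returns(reflectivity, cloud_mask):
    """Suppress radar returns in cloudless areas using a cloud-cover
    mask, removing spurious returns (e.g., birds, insects, atmospheric
    anomalies, or solar interference)."""
    return [[z if m else 0 for z, m in zip(z_row, m_row)]
            for z_row, m_row in zip(reflectivity, cloud_mask)]
```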
Referring again to
According to an exemplary embodiment, an acceptable match between weather data sources may be defined as any value greater than the value of the data being evaluated at a particular grid point minus two. Under this acceptable match definition, surrounding values higher than the value of the data being evaluated at a particular grid point are always counted as support. In some embodiments, more complex variations on both the region of interest and the acceptable match definition may be implemented by, for example, assigning lower weights to more distant matches or defining fuzzy logic membership functions around the data being evaluated at a particular grid point.
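The acceptable-match definition reduces to a one-line predicate:

```python
def acceptable_match(evaluated, neighbor):
    """A surrounding value supports the value being evaluated if it is
    greater than (evaluated - 2); values above the evaluated value
    therefore always count as support."""
    return neighbor > evaluated - 2
```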
A confidence level of 1.0 may be output for a particular grid point if an acceptable match exists for a certain number or percentage of surrounding weather data values (e.g., 40%). For example, if three weather data sources are used with a region of interest of a 5×5 matrix of grid squares centered about the particular grid point being evaluated, there are three 5×5 matrices of weather data values (75 total values). The center point of one of these matrices contributes the value of the data being evaluated (e.g., a VIP value), leaving 74 values that may support the value of the data being evaluated. If the acceptable match percentage is set at 40%, then 30 of these values must meet the acceptable match definition (e.g., 30 values that are greater than two less than the value of the data being evaluated at a particular grid point) in order to achieve a confidence level of 1.0.
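Counting support across the 5x5 neighborhoods of all sources (74 candidate values for three sources) can be sketched as below; taking the evaluated value from the first source and representing grids as nested lists are assumptions of this sketch:

```python
def support_fraction(grids, r, c):
    """Fraction of surrounding weather data values, across the 5x5
    neighborhoods of grid point (r, c) in every source's grid, that
    support the value being evaluated (taken here from the first
    source).  The evaluated point itself is excluded, leaving 74
    candidates when three sources are used.  A value supports the
    evaluation if it exceeds the evaluated value minus two."""
    evaluated = grids[0][r][c]
    support = total = 0
    for gi, grid in enumerate(grids):
        for rr in range(r - 2, r + 3):
            for cc in range(c - 2, c + 3):
                if gi == 0 and (rr, cc) == (r, c):
                    continue  # exclude the value being evaluated itself
                total += 1
                if grid[rr][cc] > evaluated - 2:
                    support += 1
    return support / total
```

A fraction at or above the required level (e.g., 0.40) would map to a confidence of 1.0.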
A curve may be defined to set the confidence level for cases where fewer than the required number or percentage of surrounding weather data values (e.g., 40%) support the value of the weather data being evaluated. In some embodiments, a linear curve may be used. In some embodiments, an exponential curve may be used. For example, where the required percentage of surrounding weather data values is 40%, an exemplary exponential rise may be defined by C = 10^(Count)/10^(0.40), where C is the confidence level and Count is the fraction (e.g., 0.40 for 40%) of surrounding weather data values that support the value of the weather data being evaluated.
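The exponential curve can be written directly, with Count expressed as a fraction so that the confidence reaches 1.0 exactly at the required support level; the clamp at 1.0 is an assumption for counts above that level:

```python
def confidence_curve(count, required=0.40):
    """Map the fraction of supporting neighbors to a confidence level
    using the exponential rise C = 10**count / 10**required, clamped to
    1.0 once the required fraction is met."""
    return min(1.0, 10.0 ** count / 10.0 ** required)
```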
In some embodiments, the various weather data sources may be weighted according to their reliability at a given grid point. For example, an aircraft-based weather radar may be weighted more heavily than other weather data sources within a certain range of the aircraft (e.g., 80 nautical miles) due to a faster update rate, but may be weighted lower than other weather data sources at greater ranges where it may be subject to beam spreading, shadowing, or ground clutter.
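Range-dependent source weighting might be sketched as below. The 80-nautical-mile crossover comes from the example above, while the specific weight values and source labels are assumptions:

```python
def source_weight(source, range_nm, crossover_nm=80.0):
    """Weight the aircraft's own radar more heavily inside an assumed
    crossover range (faster update rate) and less heavily beyond it,
    where beam spreading, shadowing, or ground clutter degrade it."""
    if source == "onboard":
        return 1.0 if range_nm <= crossover_nm else 0.5
    return 0.75  # assumed default weight for ground-based sources
```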
Referring again to
In some embodiments, the images may provide an indication of the confidence levels determined at step 570. For example, in some embodiments, a separate image is generated for purposes of portraying confidence values. In some embodiments, the color saturation or brightness of a portion of an image may be reduced to indicate regions of lower confidence. In some embodiments, a cross-hatch pattern may be used to distinguish regions of lower confidence, depending on the capabilities of the display.
Some ground-based weather data sources, such as NEXRAD, provide echo top height data in addition to identifying individual weather cells and determining cell height and VIL for each. Similarly, some aircraft-based weather radar systems may perform vertical sweeps through areas of high reflectivity to provide estimates of echo top height. In some embodiments, asynchronous ground-based and aircraft-based echo top height data may be fused to adjust older, but more detailed ground based data.
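One simple way to fuse fresher airborne echo-top samples with an older but denser ground-based grid is a mean bias correction. This sketch (the function name and the bias-correction approach are assumptions) shifts the entire ground grid by the average offset observed at the co-located airborne samples:

```python
def fuse_echo_tops(ground_tops, airborne_samples):
    """Fuse a detailed but older ground-based echo-top grid with fresher
    airborne vertical-sweep estimates: compute the mean offset between
    airborne samples (row, col, altitude_ft) and the co-located ground
    values, then apply that offset to the whole ground grid."""
    diffs = [alt - ground_tops[r][c] for (r, c, alt) in airborne_samples]
    bias = sum(diffs) / len(diffs)
    return [[top + bias for top in row] for row in ground_tops]
```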
Referring now to
In some embodiments, the fused echo top data may be displayed as a separate color-coded relief map. In some embodiments, estimated tops of individual cells may be indicated as numerical tags on a display, such as a reflectivity/VIL VIP display. In some embodiments, a 3-D display may be generated to represent an individual polygon for each grid point. In some embodiments, data more than a predetermined threshold distance (e.g., 10,000 feet) below the aircraft may be suppressed or otherwise visually indicated.
The embodiments in the present disclosure have been described with reference to drawings. The drawings illustrate certain details of specific embodiments that implement the systems and methods and programs of the present disclosure. However, describing the embodiments with drawings should not be construed as imposing any limitations that may be present in the drawings. The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing its operations. The embodiments of the present disclosure may be implemented using an existing computer processor, or by a special purpose computer processor incorporated for this or another purpose or by a hardwired system.
As noted above, embodiments within the scope of the present invention include program products comprising non-transitory machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media may be any available media that may be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media may comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to carry or store desired program code in the form of machine-executable instructions or data structures and which may be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Embodiments in the present disclosure have been described in the general context of method steps which may be implemented in one embodiment by a program product including machine-executable instructions, such as program code, for example, in the form of program modules executed by machines in networked environments. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Machine-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represent examples of corresponding acts for implementing the functions described in such steps.
As previously indicated, embodiments in the present disclosure may be practiced in a networked environment using logical connections to one or more remote computers having processors. Those skilled in the art will appreciate that such network computing environments may encompass many types of computers, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and so on. Embodiments in the disclosure may also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
An exemplary system for implementing the overall system or portions of the disclosure might include one or more computers including a processor, a system memory or database, and a system bus that couples various system components including the system memory to the processor. The database or system memory may include read only memory (ROM) and random access memory (RAM). The database may also include a magnetic hard disk drive for reading from and writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to a removable optical disk such as a CD ROM or other optical media. The drives and their associated machine-readable media provide nonvolatile storage of machine-executable instructions, data structures, program modules and other data for the computer. User interfaces, as described herein, may include a computer with monitor, keyboard, a keypad, a mouse, joystick or other input devices performing a similar function.
It should be noted that although the diagrams herein may show a specific order and composition of method steps, it is understood that the order of these steps may differ from what is depicted. For example, two or more steps may be performed concurrently or with partial concurrence. Also, some method steps that are performed as discrete steps may be combined, steps being performed as a combined step may be separated into discrete steps, the sequence of certain processes may be reversed or otherwise varied, and the nature or number of discrete processes may be altered or varied. The order or sequence of any element or apparatus may be varied or substituted according to alternative embodiments. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. Such variations will depend on the software and hardware systems chosen and on designer choice. It is understood that all such variations are within the scope of the disclosure. Likewise, software and web implementations of the present invention could be accomplished with standard programming techniques with rule based logic and other logic to accomplish the various database searching steps, correlation steps, comparison steps and decision steps.
The foregoing description of embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the subject matter to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the subject matter disclosed herein. The embodiments were chosen and described in order to explain the principles of the disclosed subject matter and its practical application to enable one skilled in the art to utilize the disclosed subject matter in various embodiments and with various modifications as are suited to the particular use contemplated. Other substitutions, modifications, changes and omissions may be made in the design, operating conditions and arrangement of the embodiments without departing from the scope of the presently disclosed subject matter.
Throughout the specification, numerous advantages of the exemplary embodiments have been identified. It will be understood, of course, that it is possible to employ the teachings herein without necessarily achieving the same advantages. Additionally, although many features have been described in the context of a particular data processor, it will be appreciated that such features could also be implemented in the context of other hardware configurations.
While the exemplary embodiments illustrated in the figures and described above are presently preferred, it should be understood that these embodiments are offered by way of example only. Other embodiments may include, for example, structures with different data mapping or different data. The disclosed subject matter is not limited to a particular embodiment, but extends to various modifications, combinations, and permutations that nevertheless fall within the scope and spirit of the appended claims.
7492304 | Woodell et al. | Feb 2009 | B1 |
7492305 | Woodell et al. | Feb 2009 | B1 |
7515087 | Woodell et al. | Apr 2009 | B1 |
7515088 | Woodell et al. | Apr 2009 | B1 |
7528613 | Thompson et al. | May 2009 | B1 |
7541971 | Woodell et al. | Jun 2009 | B1 |
7557735 | Woodell et al. | Jul 2009 | B1 |
7576680 | Woodell | Aug 2009 | B1 |
7581441 | Barny et al. | Sep 2009 | B2 |
7598901 | Tillotson et al. | Oct 2009 | B2 |
7598902 | Woodell et al. | Oct 2009 | B1 |
7633428 | McCusker et al. | Dec 2009 | B1 |
7633431 | Wey et al. | Dec 2009 | B1 |
7664601 | Daly, Jr. | Feb 2010 | B2 |
7696921 | Finley et al. | Apr 2010 | B1 |
7714767 | Kronfeld et al. | May 2010 | B1 |
7728758 | Varadarajan et al. | Jun 2010 | B2 |
7733264 | Woodell et al. | Jun 2010 | B1 |
7859448 | Woodell et al. | Dec 2010 | B1 |
7868811 | Woodell et al. | Jan 2011 | B1 |
7917255 | Finley | Mar 2011 | B1 |
7932853 | Woodell et al. | Apr 2011 | B1 |
7973698 | Woodell et al. | Jul 2011 | B1 |
7982658 | Kauffman et al. | Jul 2011 | B2 |
8022859 | Bunch et al. | Sep 2011 | B2 |
8054214 | Bunch | Nov 2011 | B2 |
8072368 | Woodell | Dec 2011 | B1 |
8081106 | Yannone | Dec 2011 | B2 |
8089391 | Woodell et al. | Jan 2012 | B1 |
8098188 | Costes et al. | Jan 2012 | B2 |
8098189 | Woodell et al. | Jan 2012 | B1 |
8111186 | Bunch et al. | Feb 2012 | B2 |
8159369 | Koenigs | Apr 2012 | B1 |
8217828 | Kirk | Jul 2012 | B2 |
8228227 | Bunch et al. | Jul 2012 | B2 |
8314730 | Musiak et al. | Nov 2012 | B1 |
8332084 | Bailey | Dec 2012 | B1 |
8902100 | Woodell et al. | Dec 2014 | B1 |
9019146 | Finley et al. | Apr 2015 | B1 |
20020039072 | Gremmert et al. | Apr 2002 | A1 |
20020126039 | Dalton | Sep 2002 | A1 |
20030001770 | Cornell et al. | Jan 2003 | A1 |
20030025627 | Wilson et al. | Feb 2003 | A1 |
20030193411 | Price | Oct 2003 | A1 |
20040183695 | Ruokangas | Sep 2004 | A1 |
20040239550 | Daly, Jr. | Dec 2004 | A1 |
20050049789 | Kelly et al. | Mar 2005 | A1 |
20050174350 | Ridenour et al. | Aug 2005 | A1 |
20060036366 | Kelly et al. | Feb 2006 | A1 |
20070005249 | Dupree et al. | Jan 2007 | A1 |
20080158049 | Southard et al. | Jul 2008 | A1 |
20090177343 | Bunch et al. | Jul 2009 | A1 |
20090219197 | Bunch | Sep 2009 | A1 |
20100019938 | Bunch | Jan 2010 | A1 |
20100042275 | Kirk | Feb 2010 | A1 |
20100194628 | Christianson et al. | Aug 2010 | A1 |
20100201565 | Khatwa | Aug 2010 | A1 |
20100245164 | Kauffman | Sep 2010 | A1 |
20100302094 | Bunch | Dec 2010 | A1 |
20110074624 | Bunch | Mar 2011 | A1 |
20110148692 | Christianson | Jun 2011 | A1 |
20110148694 | Bunch et al. | Jun 2011 | A1 |
20120029786 | Calandra et al. | Feb 2012 | A1 |
20120133551 | Pujol et al. | May 2012 | A1 |
20120139778 | Bunch et al. | Jun 2012 | A1 |
20130226452 | Watts | Aug 2013 | A1 |
20130234884 | Bunch et al. | Sep 2013 | A1 |
20140176362 | Sneed | Jun 2014 | A1 |
20140362088 | Veillette et al. | Dec 2014 | A1 |
Number | Date | Country |
---|---|---|
1 329 738 | Jul 2003 | EP |
2658617 | Aug 1991 | FR |
WO-9807047 | Feb 1998 | WO |
WO-9822834 | May 1998 | WO |
WO-03005060 | Jan 2003 | WO |
WO-2009137158 | Nov 2009 | WO |
Entry |
---|
U.S. Appl. No. 12/075,103, filed Mar. 7, 2008, Woodell et al. |
U.S. Appl. No. 13/841,893, filed Mar. 15, 2013, Rockwell Collins, Inc. |
U.S. Appl. No. 13/919,406, filed Jun. 17, 2013, Rockwell Collins, Inc. |
U.S. Appl. No. 14/086,844, filed Nov. 21, 2013, Rockwell Collins, Inc. |
U.S. Appl. No. 14/206,239, filed Mar. 12, 2014, Rockwell Collins. |
U.S. Appl. No. 14/206,651, filed Mar. 12, 2014, Rockwell Collins, Inc. |
U.S. Appl. No. 14/207,034, filed Mar. 12, 2014, Rockwell Collins, Inc. |
3-D Weather Hazard and Avoidance System, Honeywell IntuVue Brochure dated Nov. 2008, 4 pages. |
Bovith et al., Detecting Weather Radar Clutter by Information Fusion with Satellite Images and Numerical Weather Prediction Model Output; Jul. 31-Aug. 4, 2006, 4 pages. |
Burnham et al., Thunderstorm Turbulence and Its Relationship to Weather Radar Echoes, J. Aircraft, Sep.-Oct. 1969, 8 pages. |
Corridor Integrated Weather System (CIWS), www.ll.mit.edu/mission/aviation/faawxsystems/ciws.html, received on Aug. 19, 2009, 3 pages. |
Doviak et al., Doppler Radar and Weather Observations, 1984, 298 pages. |
Dupree et al., FAA Tactical Weather Forecasting in the United States National Airspace, 29 pages. |
Goodman et al., LISDAD Lightning Observations during the Feb. 22-23, 1998 Central Florida Tornado Outbreak, http://www.srh.noaa.gov/topics/attach/html/ssd98-37.htm, Jun. 1, 1998, 5 pages. |
Greene et al., Vertically Integrated Liquid Water—A New Analysis Tool, Monthly Weather Review, Jul. 1972, 5 pages. |
Hodanish, Integration of Lightning Detection Systems in a Modernized National Weather Service Office, http://www.srh.noaa.gov/mlb/hoepub.html, retrieved on Aug. 6, 2007, 5 pages. |
Honeywell, RDR-4B Forward Looking Windshear Detection/Weather Radar System User's Manual with Radar Operation Guidelines, Jul. 2003. |
Keith, Transport Category Airplane Electronic Display Systems, Jul. 16, 1987, 34 pages. |
Klingle-Wilson et al., Description of Corridor Integrated Weather System (CIWS) Weather Products, Aug. 1, 2005, 120 pages. |
Kuntman et al., Turbulence Detection and Avoidance System, Flight Safety Foundation 53rd International Air Safety Seminar (IASS), Oct. 29, 2000. |
Kuntman, Airborne System to Address Leading Cause of Injuries in Non-Fatal Airline Accidents, ICAO Journal, Mar. 2000. |
Kuntman, Satellite Imagery: Predicting Aviation Weather Hazards, ICAO Journal, Mar. 2000, 4 pages. |
Meteorological/KSC/L71557/Lightning Detection and Ranging (LDAR), Jan. 2002, 12 pages. |
Nathanson, Fred E., “Radar and Its Composite Environment,” Radar Design Principles, Signal Processing and the Environment, 1969, 5 pages, McGraw-Hill Book Company, New York et al. |
Pessi et al., On the Relationship Between Lightning and Convective Rainfall Over the Central Pacific Ocean, date unknown, 9 pages. |
RDR-4B Honeywell User Manual for Forward Looking Windshear Detection/Weather Radar System, Rev. 6, Jul. 2003, 106 pages. |
Robinson et al., En Route Weather Depiction Benefits of the Nexrad Vertically Integrated Liquid Water Product Utilized by the Corridor Integrated Weather System, 10th Conference on Aviation, Range, and Aerospace Meteorology (ARAM), 2002, 4 pages. |
Stormscope Lightning Detection Systems, L3 Avionics Systems, retrieved on Jul. 11, 2011, 6 pages. |
Waldvogel et al., The Kinetic Energy of Hailfalls. Part I: Hailstone Spectra, Journal of Applied Meteorology, Apr. 1978, 8 pages. |
Wilson et al., The Complementary Use of Titan-Derived Radar and Total Lightning Thunderstorm Cells, 10 pages. |
Zipser et al., The Vertical Profile of Radar Reflectivity of Convective Cells: A Strong Indicator of Storm Intensity and Lightning Probability? American Meteorological Society, 1994, 9 pages. |
Decision on Appeal for Inter Parties Reexamination Control No. 95/001,860, dated Oct. 17, 2014, 17 pages. |
Final Office Action on U.S. Appl. No. 12/892,663 dated Mar. 7, 2013, 13 pages. |
Final Office Action on U.S. Appl. No. 13/238,606 dated Apr. 1, 2014, 11 pages. |
Final Office Action on U.S. Appl. No. 13/238,606 dated Jan. 22, 2015, 6 pages. |
Final Office Action on U.S. Appl. No. 13/246,769 dated Sep. 16, 2014, 18 pages. |
Non-Final Office Action on U.S. Appl. No. 12/892,663 dated May 29, 2013, 14 pages. |
Non-Final Office Action on U.S. Appl. No. 13/238,606 dated Jul. 8, 2014, 12 pages. |
Non-Final Office Action on U.S. Appl. No. 13/238,606 dated Sep. 23, 2013, 15 pages. |
Non-Final Office Action on U.S. Appl. No. 13/717,052 dated Sep. 9, 2014, 8 pages. |
Notice of Allowance on U.S. Appl. No. 12/075,103 dated Aug. 4, 2014, 10 pages. |
Notice of Allowance on U.S. Appl. No. 13/246,769 dated Jan. 8, 2015, 10 pages. |
Office Action for U.S. Appl. No. 12/892,663, dated Oct. 22, 2012, 12 pages. |
Office Action for U.S. Appl. No. 13/717,052, dated Aug. 22, 2013, 15 pages. |
Office Action on U.S. Appl. No. 12/075,103 dated Jul. 31, 2013, 8 pages. |
Office Action on U.S. Appl. No. 13/246,769 dated Apr. 21, 2014, 18 pages. |
TOA Technology, printed from website: http://www.toasystems.com/technology.html on Dec. 29, 2010, 2 pages. |
Triangulation, from Wikipedia, printed from website: http://en.wikipedia.org/wiki/Triangulation on Dec. 29, 2010, 6 pages. |
U.S. Appl. No. 13/246,769, filed Sep. 27, 2011, Rockwell Collins. |
U.S. Appl. No. 13/717,052, filed Dec. 17, 2012, Woodell et al. |
U.S. Appl. No. 13/837,538, filed Mar. 15, 2013, Kronfeld et al. |
U.S. Appl. No. 14/162,035, filed Jan. 23, 2014, Kronfeld et al. |
U.S. Appl. No. 14/323,766, filed Jul. 3, 2014, Weichbrod et al. |
U.S. Appl. No. 14/465,730, filed Aug. 21, 2014, Breiholz et al. |
U.S. Appl. No. 14/608,071, filed Jan. 28, 2015, Breiholz et al. |
Boudevillain et al., 2003, Assessment of Vertically Integrated Liquid (VIL) Water Content Radar Measurement, J. Atmos. Oceanic Technol., 20, 807-819. |
Greene et al., 1972, Vertically Integrated Liquid Water—A New Analysis Tool, Mon. Wea. Rev., 100, 548-552. |
Lahiff, 2005, Vertically Integrated Liquid Density and Its Associated Hail Size Range Across the Burlington, Vermont County Warning Area, Eastern Regional Technical Attachment, No. 05-01, 20 pages. |
Liu, Chuntao et al., Relationships between lightning flash rates and radar reflectivity vertical structures in thunderstorms over the tropics and subtropics, Journal of Geophysical Research, vol. 117, D06212, doi:10.1029/2011JD017123, American Geophysical Union, 2012, 19 pages. |
Non-Final Office Action on U.S. Appl. No. 13/238,606 dated May 27, 2015, 14 pages. |
Non-Final Office Action on U.S. Appl. No. 14/452,235 dated Apr. 23, 2015, 9 pages. |
Non-Final Office Action on U.S. Appl. No. 14/681,901 dated Jun. 17, 2015, 21 pages. |
Non-Final Office Action on U.S. Appl. No. 13/238,606 dated Mar. 27, 2015, 21 pages. |
Non-Final Office Action on U.S. Appl. No. 13/717,052 dated Feb. 11, 2015, 15 pages. |
Non-Final Office Action on U.S. Appl. No. 13/841,893 dated Jun. 22, 2015, 27 pages. |
Non-Final Office Action on U.S. Appl. No. 13/913,100 dated May 4, 2015, 25 pages. |
Non-Final Office Action on U.S. Appl. No. 13/919,406 dated Jul. 14, 2015, 23 pages. |
Non-Final Office Action on U.S. Appl. No. 14/162,035, dated Feb. 4, 2016, 9 pages. |
Non-Final Office Action on U.S. Appl. No. 14/086,844, dated Nov. 10, 2015, 17 pages. |
Notice of Allowance on U.S. Appl. No. 13/707,438 dated Feb. 25, 2015, 11 pages. |
Notice of Allowance on U.S. Appl. No. 14/681,901, dated Dec. 23, 2015, 8 pages. |
Zipser, Edward J. et al., The Vertical Profile of Radar Reflectivity of Convective Cells: A Strong Indicator of Storm Intensity and Lightning Probability?, American Meteorological Society, Aug. 1994, 9 pages. |
Corrected Notice of Allowability on U.S. Appl. No. 14/086,844, dated Sep. 26, 2016, 2 pages. |
Non-Final Office Action on U.S. Appl. No. 14/162,035 dated Jul. 11, 2016, 10 pages. |