Some embodiments relate to the presentation of three-dimensional (3D) weather, with animation (4D), using augmented reality (AR) and Doppler weather radar (DWR) data. Some embodiments relate to the generation of 2D polygons and weather alerts from weather radar data in real time.
People responsible for weather prediction and monitoring may have a difficult time interpreting weather data due to its large volume, and the volume of weather data continues to increase. For the same reason, they may have a difficult time determining whether a weather alert is warranted.
The present disclosure is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
The following description and the drawings sufficiently illustrate specific embodiments to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. Portions and features of some embodiments may be included in, or substituted for, those of other embodiments. Embodiments set forth in the claims encompass all available equivalents of those claims.
In some embodiments, the 3D or 4D (animation) AR-DWR visualization enables users of the system to more easily analyze the weather situation. In some embodiments, the 3D or 4D (animation) AR-DWR visualization enables a user to interpret radar data more quickly. In some embodiments, the 3D or 4D (animation) AR-DWR visualization enables a user to more quickly identify dangerous weather conditions and determine possible actions. In some embodiments, the 3D or 4D (animation) AR-DWR visualization enables a user to view weather data in 3D rather than 2D.
In some embodiments, the system for AR-DWR visualization 100 complies with Federal Aviation Administration (FAA) standards or regulations, and/or National Oceanic and Atmospheric Administration (NOAA) standards or regulations. The real world 158 may be what the user 146 is looking at (e.g., a map or room) and to which the 4D AR-DWR visualization 132 may add graphics. The system for AR-DWR visualization 100 may utilize an application program interface (API) of an Advanced Weather Interactive Processing System (AWIPS®)/JET® application.
The radars 102 may be Spectrum Efficient National Surveillance Radar (SENSR), Doppler weather radar, legacy next-generation radar (NEXRAD), Terminal Doppler Weather Radar (TDWR), Raytheon® Skyterm™, X-Band Low Power Radar (LPR), satellites, ground-based radar, X-band radar (e.g., approximately 30 seconds to complete a full volume scan), or another type of radar.
The RDA 104 may be a module or application that processes the data from the radars 102 and stores the raw data 108 in a database 106. The RDA 104 may be hosted by a computer, which may be the same computer as the AR-DWR module 112 and database 106 or a different computer. The RDA 104 may reside across a computer network from the database 106 and/or the AR-DWR module 112.
The database 106 may be electronic storage for the raw data 108. The database 106 may reside across a computer network from the RDA 104 and/or the AR-DWR module 112. The raw data 108 may be data from the radars 102. The raw data 108 may have one or more data types 110. The data types 110 may be reflectivity (e.g., composite reflectivity), wind velocity (e.g., storm relative velocity, radial velocity), temperature, etc. In some embodiments, the database 106 may store the raw data 108 in a geographical hash storage system that enables quicker access to the raw data 108.
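The specific form of that geographical hash is not detailed here; the following is a minimal sketch, assuming a simple grid-quantization hash keyed by latitude/longitude, of how raw data 108 samples might be stored and retrieved by location. The cell size, field names, and class name are illustrative assumptions rather than part of the disclosed system.

```python
from collections import defaultdict

# Simplified geographic hash: quantize latitude/longitude into grid cells so
# that raw radar samples near the same location share a key. The 0.1-degree
# cell size is an assumed example value, not a disclosed parameter.
CELL_DEG = 0.1

def geo_key(lat: float, lon: float) -> tuple[int, int]:
    """Return a coarse grid key for a latitude/longitude pair."""
    return (int(lat // CELL_DEG), int(lon // CELL_DEG))

class RawDataStore:
    """Keys raw radar samples by geographic cell for quick spatial lookups."""

    def __init__(self):
        self._cells = defaultdict(list)

    def insert(self, lat: float, lon: float, sample: dict) -> None:
        self._cells[geo_key(lat, lon)].append(sample)

    def query(self, lat: float, lon: float) -> list[dict]:
        """Return all samples stored in the cell containing (lat, lon)."""
        return self._cells[geo_key(lat, lon)]

store = RawDataStore()
store.insert(39.74, -104.99, {"type": "reflectivity", "dbz": 55.0, "alt_m": 3000})
print(store.query(39.74, -104.99))
```

Because lookups only touch the cell containing the queried coordinates, retrieval time stays roughly constant as the stored volume grows, which is the property that supports the real-time updates described below.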
The AR-DWR module 112 may include data processing module 114, interaction module 116, 2D objects 118, and 3D objects 122. The AR-DWR module 112 may be part of Advanced Weather Interactive Processing System (AWIPS®) and/or uFrame™. The data processing module 114 may include routine processing 212 and alert processing 216, one or more of which may be across a computer network. The data processing module 114 may be an AWIPS II® application that takes the raw data 108 and generates the 2D objects 118 and the 3D objects 122. The data processing module 114 and/or interaction module 116 may determine the weather alerts 142 from the raw data 108, 2D objects 118, and/or 3D objects 122. For example, data processing module 114 and/or interaction module 116 may determine that there is a hail core (e.g., a warning 152) based on reflectivity of 2D objects 118, 3D objects 122, and/or raw data 108. The data processing module 114 and/or interaction module 116 may determine an area 160 for the weather alert 142. The data processing module 114, interaction module 116, and/or processing engine 162 may determine colors for the weather alerts 142.
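As one hedged illustration of the hail-core determination mentioned above, the following sketch flags a possible hail core when reflectivity exceeds a threshold over a sufficient depth of a vertical column. The 55 dBZ threshold, the 3 km depth, and the field names are assumptions made for the example, not thresholds or logic disclosed for the data processing module 114 or interaction module 116.

```python
# Illustrative sketch: flag a possible hail core when reflectivity exceeds a
# threshold over a sufficient vertical depth. The 55 dBZ threshold and 3 km
# depth are assumed example values, not values specified by the disclosure.
HAIL_DBZ = 55.0
MIN_DEPTH_M = 3000.0

def possible_hail_core(points: list[dict]) -> bool:
    """points: raw samples with 'dbz' (reflectivity) and 'alt_m' (altitude)."""
    high = [p["alt_m"] for p in points if p["dbz"] >= HAIL_DBZ]
    if not high:
        return False
    return max(high) - min(high) >= MIN_DEPTH_M

column = [
    {"dbz": 58.0, "alt_m": 2000.0},
    {"dbz": 60.0, "alt_m": 4000.0},
    {"dbz": 57.0, "alt_m": 5500.0},
]
print(possible_hail_core(column))  # True for this example column
```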
Table 1 illustrates some types of raw data 108 available through AWIPS. The raw data may include an AWIPS header (not illustrated in Table 1), which may be the data type 110 in conjunction with the product code. The description indicates the type of the data type 110. The range indicates the range of the raw data 108.
Table 2 illustrates additional types of raw data 108. Table 2 may illustrate products available for X-band. The range is smaller and thus there may be more data per volume of weather. The raw data may include an AWIPS header (not illustrated in Table 2), which may be the data type 110 in conjunction with the product code.
Table 3 illustrates additional types of raw data 108. Table 3 may illustrate products available for NEXRAD Level 2 Products. The header for all the rows may indicate NEXRAD Level 2 Products. The data type 110 is NEXRAD Level 2 Products in conjunction with the product code.
The 2D objects 118 may include data type 120. The data type 120 may be reflectivity, wind velocity, temperature, etc. The 3D objects 122 may include data type 124. The data type 124 may be reflectivity, wind velocity, temperature, etc. The data types 120, 124 may be termed metadata because they are derived from the raw data 108. The 3D objects 122 may be faces of objects that are rendered to present the 4D AR-DWR visualization 132. Without the construction of faces for the 3D objects 122, the raw data 108 is information about individual points, e.g., an x, y, z coordinate together with the data for that point such as wind velocity, reflectivity, temperature, etc. The routine processing 212 may perform routine processing for the 3D weather display 134. The alert processing 216 may determine weather alerts 142. For example, the alert processing 216 may be configured to examine the raw data 108 and determine whether a weather alert 142 is indicated by the raw data 108. The alert processing 216 may use artificial intelligence or other techniques. In some embodiments, people examine the raw data 108 and generate weather alerts 142. In some embodiments, the user 146 may indicate that a portion of the 3D weather display 134 should be part of a weather alert 142, which generates a new weather alert 142.
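The disclosure does not prescribe a particular face-construction algorithm; the following sketch shows one common way to go from gridded point values to renderable faces, extracting an isosurface with scikit-image's marching cubes routine. The synthetic reflectivity volume and the 40 dBZ isosurface level are assumptions made for the example.

```python
import numpy as np
from skimage.measure import marching_cubes  # one common isosurfacing routine

# Illustrative sketch: turn gridded reflectivity samples into renderable faces
# by extracting an isosurface at a chosen reflectivity level. The synthetic
# volume and the 40 dBZ level are example assumptions, not the disclosed
# face-construction method.
z, y, x = np.mgrid[0:32, 0:32, 0:32]
reflectivity = 60.0 * np.exp(-(((x - 16) ** 2 + (y - 16) ** 2 + (z - 10) ** 2) / 60.0))

verts, faces, normals, values = marching_cubes(reflectivity, level=40.0)
# verts: (N, 3) vertex coordinates; faces: (M, 3) indices into verts.
# These triangles are the kind of geometry a renderer (e.g., the processing
# engine 162) could draw as the 3D objects 122.
print(verts.shape, faces.shape)
```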
In some embodiments, the AR-DWR module 112 may include or be in communication with Microsoft HoloLens®. In some embodiments, AR-DWR module 112 may make calls to application program interfaces (APIs) of an AR application, e.g., Microsoft HoloLens®. In some embodiments, AR-DWR module 112 may be or include Raytheon® Airline Aviation Services (RAAS). In some embodiments, AR-DWR module 112 may be or include SPARK. In some embodiments, AR-DWR module 112 may be a RAAS 557th radar product generator (RPG) AF weather forecaster (WF). In some embodiments, AR-DWR module 112 may be an RPG.
The interaction module 116 may respond to interactions 148 from the user 146 and/or the 3D AR-DWR hardware 126 with responses 150. The interactions 148 may be an indication of a hand recognition 136, a cube 138, a selection of an item 141 of the menu 140, an operation of a control 144, etc. The responses 150 may be 2D objects 118, 3D objects 122, menus 140, weather alerts 142, raw data 108, etc. The interaction module 116 may control the 3D AR-DWR hardware 126. In some embodiments, the 3D AR-DWR hardware 126 may control the generation of the 4D AR-DWR visualization 132 by making calls (e.g., interactions 148) to the AR-DWR module 112. An example interaction 148 may be a selection of an area of the 4D AR-DWR visualization 132 for greater detail, with the response 150 being greater detail in the 2D objects 118 and/or 3D objects 122 so that greater detail may be displayed for that area of the 4D AR-DWR visualization 132. In some embodiments, the 2D objects 118 and/or the 3D objects 122 may be stored in a geographical hash storage system that enables quicker access, which may enable the 4D AR-DWR visualization 132 to be updated in real time. In some embodiments, the weather alerts 142, 2D objects 118, and/or the 3D objects 122 are stored with a geographical hash to enable quicker retrieval.
The AR-DWR hardware 126 may include a headset 128, a hand control 130, a processing engine 162, and other AR-DWR hardware. The 3D AR-DWR hardware 126 may include other connections to other software/hardware. For example, the 3D AR-DWR hardware 126 may include Microsoft HoloLens 2® and/or SPARK. In some embodiments, the 3D AR-DWR hardware 126 may include additional or different hardware. The AR-DWR hardware 126 may include a wireless connection to the AR-DWR module 112. The AR-DWR hardware 126 may be a standalone headset, with the standalone headset including the AR-DWR module 112, in accordance with some embodiments. The processing engine 162 may render the 4D AR-DWR visualization 132. The processing engine 162 may communicate with the AR-DWR module 112. The processing engine 162 may recognize the hand recognition 136, a selection of the cube 138, operation of the controls 144, a selection of an item 141 of a menu 140, etc. The processing engine 162 may determine coordinates to use to display the AR on the real world 158.
The 4D AR-DWR visualization 132 may include one or more of 3D weather display 134, hand recognition 136, cube 138, menus 140, weather alert 142, controls 144, and feature 164. The 4D AR-DWR visualization 132 may be a mixed reality visualization of the 3D objects 122. The weather alert 142 may include warnings 152, watches 154, advisories 156, area 160, and type 168. The area 160 may be an area of a warning 152, watch 154, and/or advisory 156. The type 168 may be for strong winds, shearing winds, tornado, hail, etc. The controls 144 may be controls for the user 146 to select, e.g., a zoom control, a control to select a feature 164, a control to select a color palette, etc.
Table 4 illustrates an embodiment of weather alerts 142, where each type 168 includes a name and a priority. An active bookmark (NN) may be of a user-defined type 168. A user-defined type 168 may be one where a user 146 has defined a type of weather alert 142 with the system for AR-DWR visualization 100. The user 146 may select an area of the 3D weather display 134 and define the weather alert 142.
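A minimal sketch of how entries like those in Table 4 might be represented in code, each type 168 carrying a name and a priority, with an active-bookmark style user-defined type. The field names and priority values are illustrative assumptions, not the contents of Table 4.

```python
from dataclasses import dataclass

# Illustrative representation of Table 4-style entries: each weather alert
# type 168 carries a name and a priority. Priorities shown are example values.
@dataclass(frozen=True)
class AlertType:
    name: str
    priority: int
    user_defined: bool = False

TORNADO = AlertType("Tornado Warning", priority=1)
HAIL = AlertType("Hail Core", priority=2)
# An "active bookmark" style entry that a user 146 defines over a selected area:
BOOKMARK = AlertType("Active Bookmark", priority=9, user_defined=True)

alerts = sorted([HAIL, BOOKMARK, TORNADO], key=lambda a: a.priority)
print([a.name for a in alerts])  # ordered by priority for display
```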
The hand recognition 136 may be a selection of a menu item 141 or a selection of an object (e.g., 3D objects 122, cube 138, control 144, weather alert 142, etc.). The selection of an object may be based on a hand gesture, an articulated hand gesture, eye tracking, etc. In some embodiments, the 4D AR-DWR visualization 132 may be transmitted across a network to other users 146, e.g., a user 146 may view the 4D AR-DWR visualization 132 on a mobile smartphone. In some embodiments, users 146 may share the 4D AR-DWR visualization 132 with a same set of coordinates, so that the 3D rendering of the 3D objects 122 is based on the same coordinate system. The feature 164 may include an area 166. The feature 164 may be an area of the 4D AR-DWR visualization 132 selected by the user 146. The features 164 may be stored using a geographical hash to enable quicker retrieval. In some embodiments, the feature 164 may be selected and changed into a weather alert 142, which may be shared over a network to other users 146 and/or other 4D AR-DWR visualizations 132.
The user 146 may be one or more people (e.g., forecaster, weather presenter, home user, etc.) that are using the AR-DWR hardware 126 to view the 4D AR-DWR visualization 132. The functionality described in connection with
Returning to
The interaction module 116 may respond to an interaction 202 with B 204, which may be a flow of requests to update the 4D AR-DWR visualization 132, e.g., to update the 3D weather display 134 with a different data type such as wind velocity. The interaction module 116 may generate or retrieve 3D objects 122 with a data type of wind velocity. The interaction module 116 may then send B 204 to the AR-DWR hardware 126 to update the 3D weather display 134 with the 3D objects 122 with data type 124 of wind velocity.
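A minimal sketch, under assumed class and field names, of the exchange described above: an interaction 202 requests a different data type, and the interaction module 116 answers with a B 204-style update carrying 3D objects 122 of that data type. None of these names come from the disclosure.

```python
# Illustrative sketch of the flow in which an interaction 202 asks for a new
# data type (e.g., wind velocity) and the interaction module 116 replies with
# a data flow B 204-style update carrying 3D objects 122. Class, method, and
# field names are assumptions made for the example.
class InteractionModule:
    def __init__(self, object_store: dict):
        self._store = object_store  # e.g., 3D objects keyed by data type 124

    def handle_interaction(self, interaction: dict) -> dict:
        """Return a 'B 204'-style update for the AR-DWR hardware 126."""
        data_type = interaction.get("data_type", "reflectivity")
        objects_3d = self._store.get(data_type, [])
        return {"flow": "B", "data_type": data_type, "objects_3d": objects_3d}

store = {"wind_velocity": [{"vertices": [(0, 0, 0), (1, 0, 0), (0, 1, 0)],
                            "faces": [[0, 1, 2]]}]}
module = InteractionModule(store)
print(module.handle_interaction({"data_type": "wind_velocity"}))
```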
A 206 indicates a flow of data that comprises responses to interactions 202. A 206 may be from the interaction module 116 to the user 146. For example, it may be a confirmation of the interaction 202. The responses may be indicated by the 4D AR-DWR visualization 132 to the user 146.
C 208 may be a flow of data from the 4D AR-DWR visualization 132, which may be updated frequently. The routine display 210 indicates routine 3D weather display 134, e.g., weather 308. A data flow may be from the radars 102 to the routine display 210. For example, raw data 108 may come from the radars 102 and be processed by the RDA 104. E 214 represents the raw data 108 being distributed to the data processing module 114. The raw data 108 may be distributed to many different data processing modules 114 via various computer networks. The data processing module 114 may perform routine processing 212, e.g., for the 3D weather display 134 such as weather 308, and perform alert processing 216, e.g., for a weather alert 142 such as weather alert 312. The alert processing 216 produces a data flow F 218 of alerts that represent high-threat alerts that prompt the forecaster to determine the best course of action to protect life and property. The data processing module 114 may be configured to ensure that data flow F 218 is presented immediately on the 4D AR-DWR visualization 132. Data flow F 218 may be transmitted on a computer network with a higher service level than data flow E 214. The alert processing 216 may include artificial intelligence and other techniques to determine weather alerts 142 from the raw data 108, e.g., a tornado decision algorithm may be included in the alert processing 216.
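The following sketch illustrates, with assumed priority values and message shapes, how high-threat alerts on data flow F 218 might be dispatched ahead of routine updates on data flow E 214 using a priority queue; it is not the disclosed transport or service-level mechanism, only one way to realize "presented immediately".

```python
import heapq

# Illustrative sketch: present high-threat alerts (data flow F 218) ahead of
# routine updates (data flow E 214) using a priority queue. The priority
# numbers and message shapes are assumptions made for the example.
PRIORITY_ALERT = 0      # high-threat alert: dispatched first
PRIORITY_ROUTINE = 10   # routine 3D weather display update

queue: list[tuple[int, int, dict]] = []
_seq = 0  # tie-breaker so equal priorities keep arrival order

def enqueue(priority: int, message: dict) -> None:
    global _seq
    heapq.heappush(queue, (priority, _seq, message))
    _seq += 1

enqueue(PRIORITY_ROUTINE, {"kind": "routine", "data_type": "reflectivity"})
enqueue(PRIORITY_ALERT, {"kind": "alert", "type": "tornado", "area": "SW quadrant"})

while queue:
    _, _, msg = heapq.heappop(queue)
    print(msg["kind"])  # the alert is dispatched before the routine update
```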
Data flow F 218 may continue with display alert 220, where the 4D AR-DWR visualization 132 includes the alert. For example, the data processing module 114 may have sent a response 150 to the AR-DWR hardware 126 indicating that a weather alert 142 had a high priority for display.
Data flow G 224 indicates a user alert 222 from a weather alert 142 that is a high-threat alert. The high-threat alert may invoke data flow D 226. Data flow D 226 may be a data flow where the user 146 makes a decision regarding a high-threat alert. Warning decision 228 may indicate that the user 146 is presented with a decision to issue a weather alert 142 as a warning 152 for the displayed alert 220. In some embodiments, the user alert 222 may be generated based on the user 146 indicating that a portion of the 3D weather display 134 is a weather alert 142.
In some embodiments, the processing may be distributed. For example, the RDA 104 may be on a server across a network, the data processing module 114 may be on another server across a network, and the alert processing 216 may be on yet another server across a network.
The method continues at operation 704 with generating 2D polygons from the weather radar data, where the weather radar data comprises a 3D coordinate and a value indicating a weather condition, and where the 2D polygons are generated based on values of the value indicating the weather condition and an area of coverage. For example, the data processing module 114 may generate 2D objects 118 with a same data type 120 having values. The generation of the 2D objects 118 may be based on an area of coverage (not illustrated) and based on the values of the data types 120 being equal or similar.
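A minimal sketch of operation 704 under assumptions: grid cells whose reflectivity values fall in the same band are grouped into connected regions (the area of coverage), and each region's bounding box is emitted as a simple 2D polygon. The banding, the 4-connectivity rule, and the bounding-box simplification are illustrative choices, not the disclosed polygon-generation method.

```python
import numpy as np

# Illustrative sketch of operation 704: group grid cells whose reflectivity
# falls in the same value band into connected regions and emit each region's
# bounding box as a simple 2D polygon (corners as (row, col) pairs).
def polygons_for_band(grid: np.ndarray, lo: float, hi: float) -> list[list[tuple[int, int]]]:
    mask = (grid >= lo) & (grid < hi)
    seen = np.zeros_like(mask, dtype=bool)
    polygons = []
    for start in zip(*np.nonzero(mask)):
        # flood fill one connected region of similar-valued cells
        stack, cells = [start], []
        while stack:
            r, c = stack.pop()
            if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
                continue
            if seen[r, c] or not mask[r, c]:
                continue
            seen[r, c] = True
            cells.append((r, c))
            stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
        if cells:
            rows, cols = zip(*cells)
            r0, c0 = int(min(rows)), int(min(cols))
            r1, c1 = int(max(rows)), int(max(cols))
            polygons.append([(r0, c0), (r0, c1), (r1, c1), (r1, c0)])
    return polygons

grid = np.zeros((8, 8))
grid[2:5, 3:6] = 50.0  # a patch of 50 dBZ reflectivity
print(polygons_for_band(grid, 45.0, 55.0))  # one rectangular polygon
```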
The method continues at operation 706 with sending the 2D polygons to an AR weather radar visualization system for the AR weather radar system to present a 3D rendering of the 2D polygons. For example, AR-DWR module 112 may send the 2D objects 118 to the AR-DWR hardware 126 for 4D AR-DWR visualization 132.
One or more operations of method 700 may be optional. One or more additional operations may be part of method 700. In some embodiments, the order of the operations of method 700 may be different.
Machine (e.g., computer system) 800 may include a hardware processor 802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 804 and a static memory 806, some or all of which may communicate with each other via an interlink (e.g., bus) 808.
Specific examples of main memory 804 include Random Access Memory (RAM), and semiconductor memory devices, which may include, in some embodiments, storage locations in semiconductors such as registers. Specific examples of static memory 806 include non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; RAM; and CD-ROM and DVD-ROM disks.
The machine 800 may further include a display device 810, an input device 812 (e.g., a keyboard), and a user interface (UI) navigation device 814 (e.g., a mouse). In an example, the display device 810, input device 812, and UI navigation device 814 may be a touch screen display. In an example, the display device 810 may be an AR headset and the navigation device 814 may be a handheld interface pen. The machine 800 may additionally include a mass storage (e.g., drive unit) 816, a signal generation device 818 (e.g., a speaker), a network interface device 820, and one or more sensors 821, such as a global positioning system (GPS) sensor, compass, accelerometer, video camera, or other sensor. The machine 800 may include an output controller 832, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate with or control one or more peripheral devices (e.g., a printer, card reader, etc.). In some embodiments, the processor 802 and/or instructions 824 may comprise processing circuitry and/or transceiver circuitry.
The storage device 816 may include a machine readable medium 822 on which is stored one or more sets of data structures or instructions 824 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. For example, one or more of the AR-DWR module 112, RDA 104, processing engine 162, interaction module 116, and data processing module 114 may be implemented by the machine 800 to form a special purpose machine 800. The instructions 824 may also reside, completely or at least partially, within the main memory 804, within the static memory 806, or within the hardware processor 802 during execution thereof by the machine 800. In an example, one or any combination of the hardware processor 802, the main memory 804, the static memory 806, or the storage device 816 may constitute machine-readable media. Example machine-readable media may include non-transitory machine-readable media, which may include tangible non-transitory media for storing information in a form readable by one or more computers, such as but not limited to read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory, etc.
Specific examples of machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., EPROM or EEPROM) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; RAM; and CD-ROM and DVD-ROM disks.
While the machine readable medium 822 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 824.
The instructions 824 may further be transmitted or received over a communications network 826 using a transmission medium via the network interface device 820 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, Licensed Assisted Access (LAA), the IEEE 802.15.4 family of standards, a Long Term Evolution (LTE) family of standards, a Universal Mobile Telecommunications System (UMTS) family of standards), peer-to-peer (P2P) networks, among others.
In an example, the network interface device 820 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 826. In an example, the network interface device 820 may include one or more antennas 830 to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. In some examples, the network interface device 820 may wirelessly communicate using Multiple User MIMO techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 800, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
It should be appreciated that where software is described in a particular form (such as a component or module) this is merely to aid understanding and is not intended to limit how software that implements those functions may be architected or structured. For example, modules are illustrated as separate modules, but may be implemented as homogenous code or as individual components; some, but not all, of these modules may be combined; or the functions may be implemented in software structured in any other convenient manner.
Furthermore, although the software modules are illustrated as executing on one piece of hardware, the software may be distributed over multiple processors or in any other convenient manner.
The above description is illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of embodiments should therefore be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
In the foregoing description of the embodiments, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting that the claimed embodiments have more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Description of the Embodiments, with each claim standing on its own as a separate exemplary embodiment.
This patent application claims the benefit of U.S. Provisional Patent Application No. 62/711,910, filed Jul. 30, 2018, entitled “AUGMENTED REALITY (AR) DOPPLER WEATHER RADAR (DWR) VISUALIZATION APPLICATION”, which is incorporated by reference herein in its entirety.