The invention pertains to systems and methods for the generation of synthetic environments for training or mission rehearsal. More particularly, the invention pertains to systems and methods to increase speed of creation and accuracy of landscapes for virtual battlefields which might be traversed by computer generated forces.
There is a continuing and ongoing need to be able to generate authentic synthetic environments in connection with training or exercise rehearsal. For example, aircraft or vehicular simulators provide more realistic simulations and enhance the training and/or rehearsal experiences of the participants by using dynamically changing, real time, out the window displays or scenes. Particularly in connection with aircraft, these displays can represent large areas of terrain which can be viewed, preferably in real time, by the participant. Such displays require large databases derived from, for example, satellite images, high altitude photography or the like.
The databases and display equipment must be able to take into account widely changing scenes relative to a common area which could include take offs or landings, as well as high or low altitude engagements with simulated adversaries. One such approach has been disclosed and claimed in published U.S. patent application 2004/0075667 A1, assigned to the Assignee hereof and entitled System and Related Methods for Synthesizing Color Imagery, incorporated by reference herein.
Realistic simulation experiences will likely include computer generated forces (CGF) which move across the displayed terrain and exhibit behavior consistent with terrain features such as water, trees, buildings and the like. Typical forces could include tanks, self-propelled artillery, boats, as well as mechanized or dismounted infantry.
Terrain databases for modeling and simulation are known and commercially available. Commercially available software can be used to process such databases and, for example, extract features or the like. In addition commercially available software can be used to create and automate both friendly and enemy forces.
Another prior art system 10 is disclosed in
The database 12 is processed to produce a full feature set 14. It is recognized that production of the full feature set 14 is both time consuming and is a source of errors, miscorrelations and loss of fidelity.
As is known, the corrected imagery/raster map 12 could be processed to produce out the window image tiling 16 to at least in part produce visual displays for the simulation participants.
The full feature set 14 can in turn be combined with a terrain grid 18, and a model library 20, to produce terrain triangulation and feature placement information 22. The out the window image tiling 16 and the terrain triangulation and feature placement 22 are stored in visual/infrared database 26. Additional databases such as radar database 28 and semi-automated forces (SAF) or CGF database 30 can also be loaded with the terrain triangulation and feature placement information 22.
The full feature set 14 typically would incorporate a plurality of polygons to represent the respective geometric surfaces. Each polygon would be assigned a single type of surface material. At times, such polygons may cover a large area which could include a plurality of materials. As a result, the limit of a single material per polygon reduces the fidelity of the surface material presentation during the simulation or training exercise. The limitation is particularly evident in systems which present a plurality of materials in the area by other means, for example where the area is visualized using overhead image resources.
As noted above, the process of extracting the full feature set 14 from the corrected imagery/raster map database 12 requires extensive time and effort. A significant portion of this time and effort is devoted to obtaining the surface material definition for the various polygons. For example, manual digitization of material outlines from maps or from overhead imagery is often required to provide polygon material definitions or assignments.
There continues to be an ongoing need to produce synthetic or simulated environments and databases for CGF more rapidly than has heretofore been possible. Additionally, it would be desirable to be able to minimize the errors and loss of fidelity that are often associated with the process of developing full feature sets, such as set 14.
While embodiments of this invention can take many different forms, specific embodiments thereof are shown in the drawings and will be described herein in detail with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention, and as a disclosure of the best mode of practicing the invention. It is not intended to limit the invention to the specific embodiment illustrated.
Systems and methods for creating databases with material coded imagery for computer generated forces in accordance with the invention can shorten preparation time thereby incorporating more flexibility into the training process. Missions can be simulated sooner and with greater realism than with prior art systems.
Image classification software 56 of a known type can process data from the corrected image/raster map 12 to form pixel based material coded imagery data 58. For example, each pixel could represent a geographical area, such as 5 meters square, of the region of interest. The pixel based material coded imagery data includes the type of surface material present at some or all of the respective pixels.
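The classification step above can be sketched as follows. This is a minimal illustration only; the thresholds, material codes, and the single-band classifier rule are assumptions for exposition and do not represent the particular classification software referenced in the disclosure.

```python
# Hypothetical sketch: forming pixel based material coded imagery (MCI)
# from a corrected image/raster map. Material codes and thresholds are
# illustrative assumptions only.

GRASS, WATER, FOREST, URBAN = 1, 2, 3, 4  # example material codes


def material_of(sample):
    """Stand-in classifier: assign a material code from one intensity value."""
    if sample < 50:
        return WATER
    if sample < 120:
        return FOREST
    if sample < 200:
        return GRASS
    return URBAN


def classify_to_mci(raster):
    """Return a grid of material codes, one code per pixel of the raster."""
    return [[material_of(sample) for sample in row] for row in raster]
```

In practice the classifier would operate on multi-band imagery rather than a single intensity value; the structure of the output, one material code per geo-referenced pixel, is the point of the sketch.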
The corrected image/raster map 12 is processed, using commercially available software, to produce a reduced feature set 52 which can be represented using a plurality of polygons as would be understood by those of skill in the art. The reduced feature set illustrates three dimensional aspects of the terrain of interest along with key lineals, points or other features that are not adequately represented in the material coded imagery. The reduced feature set is generally much smaller than the full feature set, and can even be an empty set, so it can be created more quickly than the full feature set. The reduced feature set 52 is combined with terrain grid 18 and model library 20 to form terrain triangulation and reduced feature placement data 22′.
Each pixel is assigned a data value which represents the material for that particular geographical area. For example, and without limitation, indicia and types of material could include:
Additionally, each pixel material can be assigned a height, as discussed in Donovan U.S. Pat. No. 4,780,084 for a radar simulator and incorporated by reference herein. In such an instance, the material height for a pixel can be used to modify the underlying elevation for the pixel, increasing fidelity. For example, a pixel with “tree” material may be assigned an elevation (e.g. 10 meters), indicating that the pixel is higher than the underlying surface.
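The elevation modification described above can be sketched as a simple lookup and addition. The per-material heights in the table are assumptions for illustration; only the "tree" value of 10 meters comes from the example in the text.

```python
# Illustrative sketch: a per-material nominal height modifies the
# underlying terrain elevation for a pixel. The table entries other than
# "tree" (10 m, from the example above) are assumptions.

MATERIAL_HEIGHT_M = {"tree": 10.0, "building": 6.0, "grass": 0.0}


def effective_elevation(terrain_elev_m, material):
    """Underlying elevation plus the material's nominal height, if any."""
    return terrain_elev_m + MATERIAL_HEIGHT_M.get(material, 0.0)
```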
The material coded imagery pixels 58 include a geographical position header to identify the location of the respective pixel in the subject environment. For example, each pixel could be identified with either Cartesian or geodetic coordinates. Different resolutions can be provided for different pixels.
More than one type of material can be identified per pixel. In such an instance, pixel data can incorporate multiple codes reflecting multiple types of surfaces present in respective portions of the pixel.
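One possible form for such a pixel record, combining the geographical position header with one or more material codes and per-material coverage fractions, is sketched below. The field names and the coverage-fraction representation are assumptions, not the claimed data format.

```python
# Hypothetical sketch of one MCI pixel record: a geographic position
# header (geodetic coordinates here) plus one or more material codes,
# each with the fraction of the pixel it covers. Field names are assumed.

from dataclasses import dataclass, field


@dataclass
class MciPixel:
    lat_deg: float
    lon_deg: float
    resolution_m: float = 5.0
    materials: dict = field(default_factory=dict)  # material code -> fraction

    def dominant_material(self):
        """Material code covering the largest share of the pixel."""
        return max(self.materials, key=self.materials.get)
```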
Those of skill in the art will understand that the information from the respective pixels 58 will be layered on terrain surface data 22′. Surface data 22′, for example polygons, can exhibit lineals, areas and default material attributes. Conflicts need to be addressed. Lineals, roads for example, will usually take precedence over MCI data 58. If no MCI data is present for respective coordinates, the default terrain material will be used.
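The precedence just described, lineals over MCI data, and MCI data over the polygon's default material, can be sketched as a small resolver. The function and argument names are assumptions for illustration.

```python
# Sketch of the layering rule above: lineals (e.g. roads) usually take
# precedence over MCI data; where no MCI data exists, the terrain
# polygon's default material is used. Names are illustrative assumptions.

def surface_material(default_material, mci_material=None, lineal_material=None):
    """Resolve the surface material at a coordinate per the stated precedence."""
    if lineal_material is not None:
        return lineal_material      # lineals usually win
    if mci_material is not None:
        return mci_material         # otherwise MCI data, when present
    return default_material         # otherwise the polygon's default
```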
Prioritization can be provided to resolve areas where multiple objects are defined for the same area with different materials. For example, a material coded pixel might be coded for a selected material. On the other hand, three dimensional objects or areals might be present at the corresponding coordinates of the terrain data 22′. In such instances, one form of prioritization can correspond to:
1. Where there is a conflict between material coded imagery 58 and 3D objects, lineals or areals at a respective coordinate or region, lineals are always assigned a higher priority than the respective material coded imagery 58. Conflicts with areals can be resolved using the following exemplary priority process:
1. MCI priority designation of “true” indicates that the MCI data take priority over the current areal material.
2. MCI priority designation of “false” indicates that the current areal material takes priority over the MCI coded material.
3. MCI priority designation of “available” indicates that MCI data is available for at least part of the respective polygon.
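The exemplary priority flags above can be sketched as follows. The handling shown for the "available" designation, using MCI data where it exists and falling back to the areal material elsewhere, is an assumption consistent with the description; the names are illustrative.

```python
# Sketch of the exemplary MCI-vs-areal priority process described above.
# The "available" branch (use MCI where present, areal elsewhere) is an
# assumption for illustration.

def resolve_areal_conflict(flag, mci_material, areal_material):
    """Resolve a material conflict per the MCI priority designation."""
    if flag == "true":
        return mci_material                 # MCI data takes priority
    if flag == "false":
        return areal_material               # areal material takes priority
    if flag == "available":
        # MCI covers only part of the polygon: use it where present.
        return mci_material if mci_material is not None else areal_material
    raise ValueError(f"unknown MCI priority designation: {flag}")
```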
It will be understood that databases 26′, 28′ can be used with various simulation programs to present displays for participants (such as visual, IR or radar). Database 30′ can be used by CGF mobilizing software 32 to provide more realistic force behavior. These databases incorporate, respectively, at least the material coded imagery 58 and the reduced feature placement data correlated to the triangulated terrain 22′, for purposes of presenting an appropriate display as well as providing enhanced terrain information for CGF.
The image classification software can process various types of source data to produce the material coded imagery data 58.
Correlated run time information associated with respective databases 26′, 28′ and 30′ is illustrated by colorized out the window visual displays and thermal images 26′-1, -2. The respective radar image, correlated with vector information from the reduced feature set 52, is illustrated in image 28′-1. Finally, trafficability information usable by the computer generated, or semi-automated forces, database 30′ is illustrated by display 30′-1.
Database 30′ thus reflects both material coded imagery data 58 as well as the reduced feature set polygonal-type representation 22′. As would be understood by those of skill in the art, the computer generated forces would behave more realistically during a simulation or training exercise than would be the case without the additional material coded data.
The reduced feature set, such as reduced feature set 52, is combined with terrain grid 18 and model library 20, step 110. The material coded imagery information, such as information 58 can then be stored along with the combined reduced feature set information, terrain grid and library information in respective databases such as 26′, 28′ and 30′, step 112. The stored material coded data and terrain data can be used at simulation run-time, step 114 to improve realism of mobility of computer generated forces.
The material for the current pixel is established, step 136. In step 138 a surface material code is established for the current pixel. If the last pixel has been processed, the material coded pixel data and associated attribute table can be stored in a respective database, step 140. Otherwise, the process returns to step 134 to process the next pixel.
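The per-pixel loop of steps 134 through 140 can be sketched as below. The classifier and the code table are illustrative assumptions; the step numbers in the comments refer to the process described above.

```python
# Sketch of the per-pixel coding loop (steps 134-140 above): for each
# pixel, establish its material, assign a surface material code, and
# collect the coded data for storage. classify() and code_table are
# illustrative assumptions.

def build_mci_pixel_data(pixels, classify, code_table):
    """Return the material coded pixel data for a sequence of pixels."""
    coded = []
    for pixel in pixels:                # step 134: next pixel
        material = classify(pixel)      # step 136: establish the material
        code = code_table[material]     # step 138: assign surface material code
        coded.append(code)
    return coded                        # step 140: store coded pixel data
```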
In a step 162, the next Cartesian coordinate is specified. The respective polygon corresponding to that pair of coordinates is then selected, step 164.
In step 166 a check is made to determine if the material data flag of the respective polygon has been set. If yes, in step 168 an evaluation is carried out to determine if lineals are present. If so, they take priority over any MCI data. If not, the respective coordinates X, Y are mapped to the respective pixel of the material coded imagery 58, step 170. Those of skill in the art will understand that processes are known and available for establishing a correlation between Cartesian coordinates of a region X, Y and the geodetic coordinates of various pixels. One such system has been disclosed in Donovan et al. U.S. Pat. No. 5,751,62 entitled “System and Method for Accurate and Efficient Geodetic Database Retrieval” assigned to the Assignee hereof, and incorporated by reference herein.
In step 172, the respective pixel data is accessed. In step 174 the respective material coded data is extracted for the respective pixel. In step 176 a determination is made whether priority needs to be established between a local areal(s) and the respective MCI data. If not, then in step 178, that respective MCI surface information is associated with the respective polygon. Otherwise the prioritizing process, discussed above, is carried out, step 180. Then the appropriate material data is associated with the subject polygon, step 178. If finished, the composite polygon information, including the overlaid coded imagery information, can be subsequently retrieved and displayed or used in the operation of computer generated forces, step 182. It will be understood that variations in the above processes can be implemented and come within the spirit and scope of the invention.
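The per-polygon flow of steps 166 through 180 can be sketched as a single function. All field and helper names here are assumptions; the step numbers in the comments refer to the process described above.

```python
# Sketch of steps 166-180 for one polygon: if the material data flag is
# set and no lineal is present, map the coordinate to an MCI pixel,
# extract its material, and resolve any areal conflict before associating
# the material with the polygon. All names are illustrative assumptions.

def polygon_material(polygon, mci_lookup, resolve):
    """Return the surface material to associate with one polygon."""
    if not polygon["material_flag"]:               # step 166: flag not set
        return polygon["default_material"]
    if polygon.get("lineal"):                      # step 168: lineals win
        return polygon["lineal"]
    mci = mci_lookup(polygon["x"], polygon["y"])   # steps 170-174: map, access
    if polygon.get("areal") is None:               # step 176: no conflict
        return mci                                 # step 178: associate MCI
    return resolve(mci, polygon["areal"])          # step 180, then step 178
```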
From the foregoing, it will be observed that numerous variations and modifications may be effected without departing from the spirit and scope of the invention. It is to be understood that no limitation with respect to the specific apparatus illustrated herein is intended or should be inferred. It is, of course, intended to cover by the appended claims all such modifications as fall within the scope of the claims.