The present invention relates to the field of automated measuring and/or assessing of property damage in an area affected by a natural catastrophe event. Particularly, the present invention relates to the field of automated risk-measuring systems and associated digital platforms providing precise assessment of natural catastrophic events and their physical impacts on a physical object and property, respectively. Further, the present invention relates to technical improvements leveraging computer vision/deep learning/artificial intelligence on actual post-catastrophe satellite and aerial imagery to detect and measure different types of damage to properties, such as roofs or other property structures. The aerial or satellite imagery can explicitly be taken by airborne and/or spaceborne optical sensing devices, such as digital imagery-based cameras placed in or at manned/unmanned aircraft or drones and/or satellites or spacecraft. In general, the present invention relates to forecast, predictive and/or optical measuring and/or imagery recognition systems for measuring values taken by defined or otherwise selected measuring parameters of the property and/or the natural catastrophe event, and to digital systems and methods for impact forecasting to support emergency management of natural hazards.
Automated forecasting and early warning systems are important technical means to protect lives, properties, and livelihoods. The same is true for systems for quantitative measurements and assessments, and further for the automated precise recognition and classification of physical damages to objects. While early warning systems are frequently used to predict the magnitude, location, and timing of potentially damaging events, these systems rarely provide predictive and quantified impact measures and/or estimates, such as the expected amount and distribution of physical damage, human consequences, disruption of services, or financial loss. Complementing early warning systems with impact forecasts has a twofold advantage: it provides decision makers with more accurate information to take suitable and dedicated decisions about emergency measures, and it focuses the attention of different disciplines on a common target. This also allows capitalizing on synergies between different disciplines and boosting the technical development of multi-hazard early warning systems. The present invention adds the value of impact-based warnings and damage assessment, compared to mere hazard forecasting, for the emergency phase, making it possible to cope with the technical challenges and to overcome the technical pitfalls of the prior art systems regarding impact forecasting for a wide range of natural hazards.
The technical demand for such automated measuring and forecasting systems is obvious: over the last decade, relevant natural loss events worldwide caused on average economic losses in excess of USD 190 billion per year and displaced an average of 24 million people each year. Among the global risks, extreme weather events and geophysical phenomena such as damaging earthquakes and tsunamis are perceived as the first and third risks in terms of likelihood and as the third and fifth risks in terms of impact. Urbanization, population growth, increasing interconnectivity, and interdependence of critical infrastructure are expected to further aggravate the risks imposed by natural hazards. Climate change is also acting as a major driver and amplifier of the losses related to hydrometeorological events. Both heat waves and droughts will become more frequent and are expected to persist over longer time periods under climate change. Similarly, climate-driven increases in river, urban and coastal flooding are a global problem, affecting mainly developing countries but also industrialized regions.
Technical-based forecasting, early warning and the provision of rapid disaster risk information are cornerstones of disaster risk reduction. This was also recognized by the United Nations (UN). For example, the UN Sendai Framework for Disaster Risk Reduction calls for a substantial increase in the availability of precise multi-hazard early warning systems and rapid disaster risk data by 2030 (United Nations International Strategy for Disaster Reduction [UNISDR]), which directly shows the technical need for such forecasting and assessment systems. In the state of the art, forecasting and warning have focused on physical event characteristics, such as magnitude, spatial extent, and duration of the impending event. Although the provision of robust measuring data on the potential event impacts, such as the predicted number and location of affected people, damage to buildings and infrastructure, or disruption of services, has gained attention, there is still a need for technical systems able to cope with these requirements. In general, such systems require considering additional information on exposure, that is, people, property, or other elements present in hazard zones, and on vulnerability, depending on the characteristics of the exposed communities, systems, or assets that make them susceptible to the damaging effects of a hazard. Thus, impact forecasting and warning systems are an emerging and important topic in the technical field of measuring and forecasting systems, i.e. for developing forecasting technology, and at the level of institutions responsible for natural hazards management. For instance, the World Meteorological Organization (WMO) launched a program on multi-hazard impact-based forecast and warning systems in 2015. This program aims to assist WMO members to further develop forecast and warning systems tailored to the needs of users to fully perceive and understand the consequences of severe weather events and, as a consequence, to undertake appropriate mitigating actions.
The document CN 109408965 A for example discloses a platform estimating the risk of loss for a location in the context of earthquakes. The invention provides an analysis method for assessing building damages using a house earthquake damage matrix curve based on earthquake motion parameters. Based on the corresponding relationship between the intensity and the seismic oscillation parameters, a maximum likelihood estimation is adopted. A house vulnerability matrix or a damage ratio result of actual earthquake damage statistics is converted into a dual-parameter vulnerability curve to overcome a possible defect based on the vulnerability curve. On the basis of the relationship between the intensity and the seismic oscillation parameters, seismic vulnerability curve characteristic parameters of various house structures are given, and basic data are provided for house building seismic damage assessment based on the seismic oscillation parameters.
Starting from the state of the art, there is a need for automated and robust forecasting and assessment of impacts of hazardous events for a wide range of geophysical and weather-/climate-related natural hazards. This technical need does not only concern forecasting as the provision of timely information to improve the management in the emergency phase, that is, shortly before, during and after a hazardous event, but also medium- and long-term risk and probability measurements and/or assessments that, for example, are carried out as expert systems or emergency signaling systems to assist decision makers in risk prevention and mitigation activities. To technically cover the whole range, such systems should be capable of impact forecasting and assessment (as a basis for impact-based warnings) and simultaneously of hazard forecasting and assessment (hazard-based warning), should indicate challenges and pitfalls, and should synthesize the review results across hazard types. One further deficiency of the prior art systems in impact forecasting and risk assessment is that they are very different across hazard types and disciplines, which makes a stringent analysis impossible. As forecasting and assessment technology is typically advanced within specific disciplinary contexts, prior art systems are not able to forecast and measure across different hazard types, making it impossible to transfer information and knowledge, to harmonize concepts across discipline borders, and to bridge gaps between different technological approaches.
In the prior art, the document U.S. Ser. No. 10/896,468B1 discloses a system for processing overhead imagery using telemetry data received from unmanned aerial vehicles. In particular, the system accesses aerial images including a property, determines an owner of the property, determines whether the owner of the property is eligible to be a member of a financial institution, determines whether the owner of the property has property insurance, and presents an offer for insurance to insure the property in the aerial image. Further, the system can determine damage estimates and reserve resources to repair the properties based on the damage estimates. The document U.S. Ser. No. 10/354,386B1 shows a system using unmanned vehicles (terrestrial, aerial, nautical, or multi-mode) to survey a property in response to or in anticipation of damage to an object. The system allows determining damage information associated with structures (objects) in aerial images obtained by the unmanned vehicles or other sources. The damage information includes the intensity of damage of a structure in an image. Finally, the document US2014245210A1 discloses a system for providing a damage assessment report. A geographic area potentially affected by an event is identified together with objects in the geographic area. An aerial image of the objects is displayed via an interactive graphic display. An option to select a specific object in the aerial image is provided. A damage assessment report for the selected object is then provided, wherein the damage assessment report includes image data from an aerial vehicle and a damage characteristic for the selected object based on the image data, the damage characteristic identifying potential damage to the selected object based on the event.
It is one object of the present invention to provide an automatable, sensory-based system and a method for measuring and assessing property damage in case of a natural catastrophe event impact, which allow for a fast forecast and analysis of the risk of property damage, efficient damage claim handling and quantified impact measures or estimates, enable damage assessment across different hazard types, and assist in harmonizing damage responses across different disciplines and organizations. Further, it is an object of the present invention to provide an automatable, sensory-based system and a method for impact measure forecasting and prediction to support emergency management of natural hazards by combining precise and automated impact-based forecasting with hazard forecasting, e.g. for the emergency phase and/or for the appropriate and accurate conduct of automated risk-transfer. The sensory-based system and method for forecasting and early warning should be able to predict the magnitude, location, and timing of potentially damaging events, and additionally measure or assess quantified impact measures, such as the expected physical damage, human impacts, impacts by disruption of services, or impacts of financial loss. The digital system and method should be able to measure the effects across a wide range of natural hazards. Further, they should be able to operate as a digital expert system outlining opportunities and key challenges based on the impact forecasting measurements.
According to the present invention, these objects and other objects that will become apparent in the following description are achieved with a digital system and a method for assessing property damage measures and/or estimates in case of a natural catastrophe event comprising the features of the independent claims. In addition, further advantageous embodiments and variants can be derived from the dependent claims and the related descriptions.
According to the present invention, the above-mentioned objects for an aerial and/or satellite imagery-based, optical sensory system for measuring physical impacts to land-based objects and/or structures are achieved, particularly, by measuring impact measurands in case of an occurrence of a natural catastrophe event, the natural catastrophe event impacting the land-based objects and/or structures and causing a physical damage to the land-based objects and/or structures, in that the aerial and/or satellite imagery-based, optical system comprises one or more airborne and/or spaceborne optical remote sensing devices equipped with one or more optical remote sensors being within a frequency band/wavelength range at least comprising infrared to visible multi-spectral sensors and/or synthetic aperture radar and/or hyperspectral sensors for capturing digital aerial and/or satellite imagery of a geographic area affected by the natural catastrophe event and transmitting the digital aerial and/or satellite imagery to a digital ground system, in that the digital ground system comprises a core engine for generating a digital natural catastrophe event footprint of the natural catastrophe event based on the captured digital aerial and/or satellite imagery, the natural catastrophe event footprint at least comprising a topographical map of the natural catastrophe event, in that the digital ground system comprises a data transmission interface for receiving location parameter values defining selected land-based objects and/or structures located in or near the area affected by the natural catastrophe event, in that the digital ground system comprises an object filter for matching the received location parameter values of each land-based object and/or structure to the generated topographical map, wherein land-based objects and/or structures lying in the area affected by the natural catastrophe event are identified and filtered, if the received location parameter values of a land-based object and/or structure are detected to be in a geographic parameter value range of the affected area of the topographical map, and in that the core engine comprises an adaptive vulnerability curve structure for parametrizing impact measurands for the land-based objects and/or structures per event intensity based on the measured topographical map, and for generating an impact measurand value for each of one or more of the land-based objects and/or structures based on an event intensity measured based on the natural catastrophe event footprint using the vulnerability curve structure. The land-based objects and/or structures can e.g. comprise any property such as buildings, service constructions, agricultural land or other assets exposed to natural events. These can be, for example, private housing properties, company or government facilities, energy services, water supply services, infrastructure constructions, crop fields, pastures and the like. A physical damage can be defined as any reduction in value or loss of return; for example, a physical damage to a land-based object and/or structure, e.g. a building or construction, can be a direct damage resulting from the hazardous impact to the property, but may also comprise indirect damage, for example resulting from a breakdown of services provided with or at the property.
The damage assessment according to the invention provides reliable, optical sensory-based property damage measures, which for example can also be used to index/measure a risk of loss or value reduction of assets or services, costs for a need of using alternative services, costs for remediation measures, compensations of third parties, etc. In general, the land-based objects and/or structures and/or formations can e.g. at least comprise building structures and/or agricultural structures and/or artificial landscape formations such as water training systems, dams, or artificial terraces for agricultural or industrial purposes. The natural catastrophe event can e.g. at least comprise a flood event and/or a hurricane event and/or a fire event and/or an earthquake event and/or a drought event and/or a seismic sea wave/tsunami event and/or a coastal erosion event and/or a volcanic eruption event, wherein the natural catastrophe event footprint at least comprises a flood event footprint and/or a hurricane event footprint and/or a fire event footprint and/or an earthquake event footprint and/or a drought event footprint and/or a seismic sea wave/tsunami event footprint and/or a coastal erosion event footprint and/or a volcanic eruption event footprint.
In an embodiment variant, a quantified loss measure value can e.g. be generated by the core engine for each of one or more of the land-based objects and/or structures based on the measured impact measurands for the respective land-based object and/or structure, the quantified loss measure value being given by the percentage portion of physical damage to a land-based object and/or structure, weighted relative to the undamaged land-based object and/or structure. A monetary equivalent of the measured quantified loss measure value of one or more of the land-based objects and/or structures can e.g. be generated by the core engine, giving the monetary equivalent of the measured physical damage of the land-based objects and/or structures.
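For illustration purposes only, the following non-limiting Python sketch shows one possible way of deriving such a quantified loss measure value and its monetary equivalent; the function names and the numerical values are hypothetical and do not form part of the claimed implementation.

def quantified_loss_measure(damaged_portion: float, undamaged_reference: float) -> float:
    """Loss measure as the percentage of physical damage relative to the undamaged object/structure."""
    if undamaged_reference <= 0:
        raise ValueError("undamaged reference must be positive")
    return 100.0 * damaged_portion / undamaged_reference

def monetary_equivalent(loss_measure_percent: float, replacement_value: float) -> float:
    """Monetary equivalent of the measured physical damage."""
    return replacement_value * loss_measure_percent / 100.0

# Example: 120 m2 of damaged roof area out of 400 m2 of undamaged roof area,
# with an assumed replacement value of 250,000 currency units.
loss = quantified_loss_measure(damaged_portion=120.0, undamaged_reference=400.0)
print(loss)                                   # 30.0 (percent)
print(monetary_equivalent(loss, 250_000.0))   # 75000.0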
In an additional embodiment variant, digital representations of the land-based objects and/or structures can e.g. be assembled by the core engine, wherein the digital representations are composed of digital object elements stored in the object elements library, and wherein the monetary equivalent of the measured physical damage of the land-based objects and/or structures is generated from an aggregated monetary equivalent of the digital object elements of a land-based object and/or structure in relation to the measured physical damage of the land-based object and/or structure. Further, monetary equivalent values can e.g. be assigned to each of the digital object elements stored in an object elements library and dynamically updated by the system, wherein the aggregated monetary equivalent of the digital object elements of a land-based object and/or structure is dynamically generated based on the digital object elements of the object elements library. One or more digital images of the land-based object and/or structure can e.g. be captured by the system, the one or more digital images being automatically captured by the remote airborne and/or spaceborne sensors and/or transmitted by an individual associated with the land-based object and/or structure and/or captured from a database accessible via a data transmission network (13), wherein, by means of an identificator and locator unit, elements of a land-based object and/or structure are identified by data processing of the one or more digital images based on the digital elements of the object elements library and located within the land-based object and/or structure, and wherein the core engine assembles the digital representations of the land-based objects and/or structures using the digital elements identified and located within the land-based object and/or structure. Automated pattern recognition can e.g. be applied to the one or more digital images by the identificator and locator unit for identifying and locating the digital elements within the land-based object and/or structure. The automated pattern recognition can e.g. be realized by machine-learning structures or AI-based structures comprised in the identificator and locator unit.
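Purely as a non-limiting illustration of how an object elements library and an assembled digital representation could interact, the following Python sketch aggregates hypothetical monetary equivalents of identified elements; all element names, values and damage fractions are assumed for the example only and do not represent the claimed identificator and locator unit.

from dataclasses import dataclass

@dataclass
class ObjectElement:
    element_id: str
    monetary_equivalent: float   # dynamically updatable value per element

# Object elements library (hypothetical entries).
element_library = {
    "roof": ObjectElement("roof", 60_000.0),
    "facade": ObjectElement("facade", 40_000.0),
    "garage": ObjectElement("garage", 25_000.0),
}

# Elements identified and located within one land-based structure (here simply listed),
# and per-element damage fractions measured from the digital images (hypothetical values).
structure_elements = ["roof", "facade", "garage"]
damage_fraction = {"roof": 0.5, "facade": 0.1, "garage": 0.0}

aggregated_value = sum(element_library[e].monetary_equivalent for e in structure_elements)
monetary_damage = sum(element_library[e].monetary_equivalent * damage_fraction.get(e, 0.0)
                      for e in structure_elements)
print(aggregated_value, monetary_damage)   # 125000.0 34000.0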
As an embodiment variant, the one or more airborne and/or space-based optical remote sensing devices equipped with one or more optical remote sensors can e.g. have a sensor resolution in a spectral band in the infrared range measuring temperatures between −50° C. and 50° C. The one or more airborne and/or space-based optical remote sensing devices and/or optical sensory satellites or spacecraft and/or optical sensory manned/unmanned aircraft or drones equipped with one or more remote airborne or spaceborne optical sensors can e.g. have a radiometric resolution, given by the optical sensor's sensitivity to the magnitude of the electromagnetic energy or the optical sensor's measuring color depth, of at least 8 bit giving at least 256 brightness levels, wherein the radiometric resolution defines the resolution of the system to detect differences in reflected or emitted energy. The one or more airborne and/or space-based optical remote sensing devices equipped with one or more optical remote sensors can e.g. have a spatial resolution of at least 7.5 cm and/or at least 20 cm. The one or more airborne and/or space-based optical remote sensing devices equipped with one or more optical remote sensors can e.g. have a spatial resolution of at least 30×30 with 120×120 thermal infrared (TIR) and/or 30×30 with 60×60 TIR and/or greater than 15×15. The one or more airborne and/or space-based optical remote sensing devices equipped with one or more optical remote sensors can e.g. have a temporal resolution greater than 5 to 10 revisits a day. The one or more airborne and/or space-based optical remote sensing devices equipped with one or more optical remote sensors can e.g. have a spatial coverage of 100×100 km or more.
In an embodiment variant, the vulnerability curve can e.g. be related to one or more characteristic parameter values of the land-based objects and/or structures, comprising at least an aggregated monetary equivalent and/or size and/or quality and/or age and/or type of structure and/or degree of coverage and/or type of coverage and/or occupancy and/or past/historical damage assessment parameter values capturing past damages impacted by former natural catastrophe events and/or deviation parameter values capturing measured deviations in the data imagery of a land-based object and/or structure before and after the natural catastrophe event.
In another embodiment variant, at least one current damage parameter value capturing physical damages resulting from the natural catastrophe event can e.g. be received by the digital ground system, wherein the vulnerability curve is calibrated based on said current damage parameter value. Said current damage parameter value can e.g. be generated by matching a digital image of a land-based object and/or structure prior to the occurrence of the natural catastrophe event to a digital image of the land-based object and/or structure after the impact by the natural catastrophe event, determining the damage parameter value as the detected variance within that land-based object and/or structure.
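As a non-limiting illustration, the following Python sketch derives such a current damage parameter value as the detected variance between a co-registered pre-event and post-event image of one object; the images and values are synthetic and assumed for the example only.

import numpy as np

def current_damage_parameter(pre_image: np.ndarray, post_image: np.ndarray) -> float:
    """Damage parameter in [0, 1] as the mean absolute deviation between pre- and post-event imagery."""
    if pre_image.shape != post_image.shape:
        raise ValueError("images must be co-registered and of equal shape")
    return float(np.mean(np.abs(post_image.astype(float) - pre_image.astype(float))))

# Synthetic example with reflectance values in [0, 1]: part of the object deviates after the event.
pre = np.zeros((100, 100))
post = pre.copy()
post[40:80, 20:60] = 0.8   # deviating (damaged) region of the object
print(current_damage_parameter(pre, post))   # approximately 0.128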
In a further embodiment variant, object and/or structure location parameters can e.g. be received by extracting location data from satellite imagery captured prior to the natural catastrophe event and/or from existing object and/or structure location data listings. Object and/or structure location parameters can e.g. also be derived from portfolio information of a risk-transfer system.
In an embodiment variant, the core engine generates normalized and/or weighted distribution maps of land-based objects and/or structures identified by the location parameters and potentially damaged in the area affected by the natural catastrophe event, wherein the normalized and/or weighted distribution maps at least comprise distribution maps of damage impact strength to land-based objects and/or structures and/or a normalized loss distribution.
In another embodiment variant, an impact measurand value can e.g. be generated for each of one or more of the land-based objects and/or structures based on an event intensity in real-time or quasi real-time with the occurrence of the natural catastrophe event, wherein the generation is automatically triggered by detecting that one or more values associated with the natural catastrophe event, measured by means of the airborne and/or spaceborne optical remote sensing devices and/or satellites, exceed predefined threshold values, the threshold values at least comprising a predefined threshold for the measured extent of the affected area and/or the intensity of the natural catastrophe event and/or the impact strength of the natural catastrophe event. This embodiment variant has, inter alia, the advantage that it technically allows quantitative, real-time, or quasi-real-time measurements of physical natural catastrophe impacts, e.g. flood event impacts on land-based objects or structures, which was not possible with prior art systems mainly relying on historical data and statistical analysis, i.e. which do not include direct physical measurements by sensory devices.
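For illustration only, the following Python sketch shows one conceivable triggering logic; the threshold names and values are hypothetical and merely exemplify that the generation starts as soon as a measured value exceeds its predefined threshold.

# Predefined thresholds (hypothetical): extent of the affected area, event intensity, impact strength.
thresholds = {
    "affected_area_km2": 50.0,
    "event_intensity": 0.7,
    "impact_strength": 0.5,
}

def trigger_assessment(measured: dict) -> bool:
    """Return True if any measured event parameter exceeds its predefined threshold."""
    return any(measured.get(key, 0.0) > limit for key, limit in thresholds.items())

# Example: measurements derived from the airborne/spaceborne optical remote sensing devices.
print(trigger_assessment({"affected_area_km2": 75.0, "event_intensity": 0.4}))   # True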
In a further embodiment variant, the natural catastrophe event footprint can e.g. comprise one or a plurality of measured time series of digital satellite imagery, each digital satellite image comprising an assigned measuring time stamp or time range, wherein, based on the time series of measured digital satellite imagery, the dynamics of a propagation of the natural catastrophe event footprint are measurably captured by the core engine. This embodiment variant has, inter alia, the advantage that it technically allows capturing, measuring and/or monitoring the real-time dynamics of an occurring natural catastrophe event. This allows increasing the precision and accuracy of the measured impact values, since these can e.g. be calibrated by propagating the impact stepwise in time intervals.
In an embodiment variant, the natural catastrophe event footprint can e.g. be generated by measuring the satellite imagery using one or more natural event parameters for locations or grid cells of the topographical map, wherein the natural event parameters comprise measurands measuring at least windspeed and/or precipitation range and/or intensity, flood level and/or hail intensity and/or hail size and/or air temperature and/or humidity and/or earthquake intensity and/or storm surge measure and/or avalanche strength and/or mud slide and/or tsunami strength and/or terrain incline and/or wildfire or conflagration extent.
In an embodiment variant, the natural catastrophe event footprint can e.g. further be generated by measuring the satellite imagery, the natural catastrophe event footprint being based on predicted occurrence probability measures for a selected area to be affected by a future occurrence of a natural catastrophe event.
Finally, in an embodiment variant, the impact measurands and/or loss measures can e.g. be generated to represent quantified measures for an actual physical damage in case of the occurrence of the natural catastrophe event.
Thus, the satellite and aerial imagery-based, optical sensory system for assessing property damage measures and/or estimates in case of a natural catastrophe event can e.g. at least comprise a digital platform configured for receiving digital satellite and aerial imagery of an area affected by the natural catastrophe event and for receiving location information about properties in the area affected by the natural catastrophe event, and a core engine configured for deriving a topographical map of a natural catastrophe event footprint from the digital satellite and aerial imagery for the area affected by the natural catastrophe event, for matching location information to the topographical map and the natural catastrophe event footprint, respectively, and for identifying properties in the area affected by the natural catastrophe event. The core engine is further configured for parameterizing a vulnerability curve based on the natural catastrophe event footprint. The vulnerability curve represents a damage indicator per event intensity. The core engine is also configured for generating damage measures and/or estimate values for one or more properties in the area affected by the natural catastrophe event based on the vulnerability curve.
The satellite and aerial imagery-based, optical sensory system is, inter alia, structured to process and transmit data, in particular measuring data, in digital form. The sensory-based system according to the invention may be realized using elements as used e.g. in wireless communication systems, process control systems or digital instruments, comprising for example central processors, network processors, memory units, input/output units, interfaces, graphic engines, arithmetic and logic devices, gate arrays, interconnect structures, etc. In the context of the present invention, the core engine may for example at least comprise a control unit and an arithmetic and logic unit running the central processing operations of the inventive sensory-based system. The sensory-based system may for example signal the damage measures and/or estimate values for the properties as listings, as pointers in a map or as diagrams in a dashboard style. The measured damage measures and/or forecasted probability values may for example be quantified as a damage percentage in the area, as an estimated absolute amount of loss of property value, as a range of value reduction or the like.
The satellite and aerial imagery can, as a variant, also include imagery provided for example by satellite and aerial imaging companies or government institutions using satellite imaging technology for spaceborne photography of the Earth and aerial imagery based on manned/unmanned aircraft, drones, or balloons. Satellite imaging uses Earth observation satellites or Earth remote sensing satellites designed for Earth observation from orbit. Different imaging technologies and different satellite altitudes achieve different imaging resolutions. The satellite imagery can be provided as digital data to the digital system, which can be used to derive a topographical map of the captured landscape. In case the satellite imagery further includes data about hazard parameters like surface temperature, rain intensity or elevation, the satellite imagery may also be used as a basis for the footprint of the natural catastrophe event.
The method for assessing property damage measures and/or forecasted impact measures in case of a natural catastrophe event impact according to the present invention is designed for rapid, e.g. real-time or quasi real-time, damage assessment. The digital platform receives digital satellite and aerial imagery of an area affected by a natural catastrophe event. Further, the sensory-based system receives location information about properties located in a region including the area affected by the natural catastrophe event. The geographic location data may for example be received in the form of portfolio data for a property portfolio as established, for example, for administration or insurance purposes. The property or portfolio data may include additional information such as, for example, the property value, past damages, etc., as will be explained in more detail below.
In the method for measuring and/or assessing property damage measures according to the invention, the core engine of the sensory-based system can e.g. generate, from the digital satellite and aerial imagery, a topographical map of a natural catastrophe event footprint for the area affected by the natural catastrophe event. Further, the core engine matches the location data of a property to the topographical map to identify properties in the area affected by the natural catastrophe event. The core engine parameterizes a vulnerability curve, which represents a damage indicator per event intensity based on the natural catastrophe event footprint. The damage indicator may for example be expressed as the mean damage degree for a specific hazard intensity. The damage indicator may for example be derived from data about a monetary equivalent value of a property, e.g. additionally using expert opinions about damages at a specific hazard intensity and/or from historic damage data, as will be explained in more detail below. Based on the vulnerability curve, the core engine generates damage measure values for one or more properties in the area affected by the natural catastrophe event. Thus, the method according to the invention can for example directly measure or forecast/predict a potential risk of property loss, a replacement value, a risk of service failure or a crop shortfall before a natural catastrophe event occurs. For example, the core engine can e.g. compare the location data of a property with properties at similar locations and derive an expected damage indicator from the vulnerability curve for the similar location. The method can also provide rapid damage assessment shortly after the impact of a natural catastrophe event and indicate quantified damage measures and/or estimate values related to the properties in the area of impact.
The topographical map of the footprint of the natural catastrophe event serves as the basis for the distribution of hazardous impacts caused by the event (or, in short, event impacts) in the area where the natural catastrophe event occurred. Advantageously, the footprint indicates the distribution of the intensity of the event by laying out hazard parameter values over the topographic map. The event impact is extracted from the satellite imagery measurements, as discussed above. Also, the accuracy of an event impact can e.g. be improved or calibrated by supplemental measurements of one or more hazard parameters by ground measuring stations at locations of the topographical map, for example measuring stations for windspeed and wind direction. The hazard parameter can for example indicate windspeed, rainfall intensity, hail intensity, hail size, temperature (particularly temperatures below 0° C.), earthquake intensity, storm surge, avalanche, mud slide, tsunami, terrain incline and/or wildfire. Advantageously, the natural catastrophe event footprint can include information of a combination of hazard parameters. For example, the footprint may indicate the distribution of rain intensity and temperature because both hazard parameters mutually reinforce the event impact on a property. Or the footprint may indicate measurements for the parameters windspeed and storm surge, which can be correlated and intensify the damaging event impact. The significance of specific hazard parameters depends on the geographic area and the type of natural catastrophe event and can be reflected in the natural catastrophe event footprint.
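As a non-limiting sketch of how hazard parameter values can be laid out over the grid cells of the topographical map, the following Python example interpolates a small set of hypothetical windspeed measurements onto a footprint grid by inverse-distance weighting; coordinates, values and grid size are assumed for illustration only.

import numpy as np

# Point measurements (x, y, windspeed in m/s) extracted from imagery or ground stations (hypothetical).
points = np.array([[2.0, 3.0, 45.0], [8.0, 1.0, 30.0], [5.0, 8.0, 60.0]])

def idw_footprint(points: np.ndarray, nx: int, ny: int, power: float = 2.0) -> np.ndarray:
    """Gridded hazard parameter values obtained by inverse-distance weighting of point measurements."""
    xs, ys = np.meshgrid(np.arange(nx) + 0.5, np.arange(ny) + 0.5)
    values = np.zeros_like(xs)
    weights = np.zeros_like(xs)
    for px, py, value in points:
        w = 1.0 / np.maximum(np.hypot(xs - px, ys - py), 1e-6) ** power
        values += w * value
        weights += w
    return values / weights

footprint = idw_footprint(points, nx=10, ny=10)   # one hazard parameter layer of the event footprint
print(footprint.shape)   # (10, 10)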
The footprint of the natural catastrophe event can also be based on predictive modelling structures and simulation structures of one or more hazard parameters. The hazard model structure may for example indicate the distribution of windspeed, rainfall intensity, hail intensity, hail size, temperature (particularly temperatures below 0° C.), earthquake intensity, storm surge intensity, or tsunami intensity. Further, the hazard model may for example indicate the probability of areas being affected by avalanches, mud slides, earthquakes, tsunamis and/or wildfires. The hazard parameter models can for example be derived from measurements of the hazard parameters during previous hazard impacts in the affected area or elsewhere. Commonly used weather models or any other hazard modelling indicating the progression of a hazard parameter during a natural catastrophe event may serve as a basis for the hazard parameter model.
In a variant of the method for assessing property damage measures, the natural catastrophe event footprint is a flood footprint and/or a hurricane track footprint. The satellite imagery of the area affected by a flood event or hurricane event is used as the basis for the topographical map of the footprint of the natural catastrophe event. Based on the distribution of the flood and/or the hurricane shown in the satellite imagery, the event footprint can be indicated in the topographic map. The core engine can assign the location of a property in the map using the location information and can extract the hazard intensity and impact on the property from the vulnerability curve based on the footprint. Using the vulnerability curve, the hazard intensity can be transformed into a quantified damage measure or estimate for the risk of a property damage.
The vulnerability curve derived by the core engine of the sensory-based system and used for the generation of property damage measures assigns a damage indicator to an identified hazard intensity. The damage indicator provides a measure for the degree of damage that is to be expected for a specific hazard intensity. The vulnerability curve can aggregate multiple hazard parameters which together define a hazard intensity and be designed as a two-dimensional graph. The vulnerability curve can also be three- or multi-dimensional, indicating different hazard parameter intensities on different axes of the curve. The vulnerability curve can also be designed as a vulnerability model including one or more hazard parameters that are included in the natural catastrophe event footprint. The vulnerability model reflects the intensity of one or more physical forces impacting a property during a hazardous event and allows investigating the impact on a property before the actual impact. The intensity of the physical forces can be measured by real-world measurements. In case of missing measurements or measurement gaps, the intensity can be derived from reasonable assumptions, regressions, or simulations.
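Purely for illustration, the following Python sketch represents a simple two-dimensional vulnerability curve as piecewise-linear interpolation between calibration points (hazard intensity versus mean damage degree); the calibration points are hypothetical and would in practice be derived from the property characteristics and data sources described below.

import numpy as np

# Calibration points (hypothetical): flood depth in metres versus mean damage degree.
intensity_points = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
damage_degree_points = np.array([0.0, 0.05, 0.20, 0.55, 0.90])

def damage_indicator(intensity: float) -> float:
    """Mean damage degree expected at the given hazard intensity."""
    return float(np.interp(intensity, intensity_points, damage_degree_points))

print(damage_indicator(1.5))   # approximately 0.375, interpolated between the 1.0 m and 2.0 m points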
The vulnerability curve, respectively the damage indicator, can for example be based on information about one or more measurable property characteristics. The property characteristics may for example indicate a value, a size, a quality, an age, a type of structure, a coverage and/or an occupancy of a property. The value may for example be derived from the size of the property and/or be represented by the purchasing value of the property and/or the construction costs of a building on the property. Further, the property characteristics may for example comprise past damage information defining property damages resulting from natural catastrophe events in the past, wherein the information may include specifications about the past event and hazard intensity. Also, the property characteristics may be based on an expert opinion about a damage level at a specific event or hazard intensity, existing loss data related to natural catastrophe events, and literature reviews or damage survey reports related to natural catastrophe events impacting property. Furthermore, the property characteristics may be based on comparing data imagery of the property before and after the natural catastrophe event. The damage indicator can summarize several property characteristics to represent a risk of damage. Preferably, the damage indicator is provided as a statistical mean damage degree.
The core engine can e.g. derive the vulnerability curve or vulnerability model specifically for the natural catastrophe event monitored by the measured satellite imagery and for the natural catastrophe event footprint derived therefrom, including the hazard parameters relevant for this event as well as the specific property characteristics provided by the property information. Thus, only hazard parameters and property characteristics relevant for the assessment of a property damage in the affected area are included in the vulnerability curve, and the processing time of the core engine for generating the damage measures and/or estimate values for one or more properties can be reduced.
In a variant of the method for assessing property damage measures, the digital ground system can e.g. additionally receive actual and current damage data measuring property damages resulting from the natural catastrophe event, and the vulnerability curve can be weighted and/or calibrated based on said present damage data. For example, the owners or operators of some properties may be able to capture and provide damage data quickly after the hazardous impact of the natural catastrophe event took place. The information may for example be based on automated, electronic property surveillance systems that provide timely information on damages to the property. The surveillance systems may for example use surveillance cameras or fire detectors that are configured to transmit present damage information via a network. The present damage information may be associated with the hazard intensity as indicated in the natural catastrophe event footprint for the location of the property, which may be applied for generating the vulnerability curve. The use of present damage information may increase the accuracy of the vulnerability curve.
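The following non-limiting Python sketch illustrates one possible calibration step using such present damage information: a least-squares scale factor aligns the parametrized vulnerability curve with damage degrees reported for surveyed properties; all numerical values are hypothetical and serve only to exemplify the weighting/calibration idea.

import numpy as np

def calibrate_scale(predicted: np.ndarray, observed: np.ndarray) -> float:
    """Least-squares scale factor aligning curve predictions with observed damage degrees."""
    denominator = float(np.dot(predicted, predicted))
    return float(np.dot(predicted, observed) / denominator) if denominator > 0 else 1.0

# Damage degrees predicted by the curve at the hazard intensities of three surveyed properties,
# and the damage degrees reported by their surveillance systems (hypothetical values).
predicted = np.array([0.20, 0.40, 0.70])
observed = np.array([0.25, 0.45, 0.80])
print(round(calibrate_scale(predicted, observed), 3))   # approximately 1.145, i.e. the curve is scaled up slightly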
In a further variant of the method for assessing property damage measures and/or estimates according to the present invention, the digital platform receives geographic property location data, e.g. latitude and longitude parameter values, by extracting location data from satellite and aerial imagery captured previous to the natural catastrophe event. The core engine may be configured to compare the previous satellite and aerial imagery with the satellite and aerial imagery of the natural catastrophe event and identify imaged properties in the affected area and their location information. The use of previous satellite and aerial imagery to receive location information about properties in the area affected by the natural catastrophe event accelerates the risk assessment, particularly the forecast of potential damages and property loss. The assessment can be refined after additional property information has been received.
In an example embodiment of the sensory-based system for measuring property damage measures, the digital ground system additionally can e.g. be connected to at least one location data database. For example, the digital ground system may receive property location data from existing property location data listings. Such listings are for example established and maintained for legal and administrative purposes. The location information may be publicly accessible or provided on demand. Advantageously, the digital system may include an application programming interface (API) providing communication to at least one location information database to receive location information about properties in the area of the natural catastrophe event.
In another variant of the method for assessing property damage measures, the digital ground system can e.g. receive property location data filtered from portfolio data of a risk-transfer system, as e.g. an automated property insurance system. This way the risk-transfer system is able to assess damage and risk measurement for all of the properties in a portfolio that are in the area affected by the natural catastrophe event. The number of properties in the portfolio can e.g. be smaller than the number of properties in the area, which accelerates the damage impact and risk (impact probability measurements) measurements by focusing on selected properties. The digital ground system may e.g. be connected to an insurer's data processing system using an application programming interface (API) for providing communication to the portfolio information database of the insurer. Ideally, the functionalities of the present inventive sensory-based system can be integrated into the insurer's data processing workflow using the API. The risk-transfer system can receive the results of the risk assessment quickly after providing access to the database.
In still a further variant of the method for assessing property damage measures and/or estimates according to the present invention, the core engine can additionally generate statistics and/or maps of properties identified by geographic location data and potentially damaged in the area affected by the natural catastrophe event. The statistics and/or maps may include data and measures about the damage degree, the type of damage indicator, the hazard intensity and other information associated with the hazard impact and the damage risk. Particularly, the core engine may e.g. generate a map of a damage distribution and/or loss distribution. The statistics provide a quick oversight of the extent of potential property damages and the magnitude of the risks associated with the hazardous event.
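For illustration only, the following Python sketch aggregates hypothetical per-property monetary loss values onto grid cells of the affected area and normalizes them into a loss distribution map, as one simple form of such statistics; the grid size and loss values are assumed for the example.

import numpy as np

# Per-property results as (grid row, grid column, monetary loss equivalent); values are hypothetical.
property_losses = [(0, 1, 75_000.0), (1, 0, 20_000.0), (1, 0, 5_000.0)]

loss_map = np.zeros((2, 2))
for row, col, loss in property_losses:
    loss_map[row, col] += loss

normalized_loss_map = loss_map / loss_map.sum()   # normalized loss distribution over the grid cells
print(normalized_loss_map)
# approximately:
# [[0.   0.75]
#  [0.25 0.  ]]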
Further, the inventive system can e.g. generate the property damage measure values for one or more properties in a short time period (real-time or quasi-real-time) after the event. The time period is preferably smaller than or equal to one week. Particularly, the period of time for generating estimated loss values for the portfolio is equal to or shorter than one week. The sensory method provides a reliable damage measurement with highly increased speed and high accuracy, which further also allows for early detection of risks (impact probabilities in case of a natural catastrophe event) and required mitigation measures. Advantageously, the damage measures generated by the core engine provide a quantified forecast measure for an expected physical damage to a land-based object or structure, an impact on humans, a disruption of services and/or a loss of livelihood. The quantified measures and measuring parameter values allow for fast hazard response and efficient claim handling, for example in respect of the in-time supply of resources to cover and/or mitigate the impact of the natural catastrophe event. In an example embodiment of the sensory-based system for measuring damage impact measures, the digital ground system can e.g. be connected to at least one location data database, particularly providing portfolio location data of a property insurer entity.
The sensory-measurement based system and the automated method for measuring property impact and damage further also enable the reliable prediction of the magnitude and the timing of potentially future or upcoming events and technically allow for measuring quantified impact measures and physically occurring losses caused by the natural catastrophic event, which may include a quantified forecasted physical damage measure, a realistic prediction of the impact on humans, a quantified dimension of service disruption, a quantified monetary loss and a loss of livelihood. Technically, the method is able to aggregate different hazard types using a data-driven common vulnerability curve or modelling structure and allows for coordinated and harmonized risk responses across different disciplines.
The present invention will be explained in more detail, by way of example, with reference to the drawings which merely serve for explanation and should not be construed as being restrictive. The features of the invention becoming obvious from the drawings should be considered to be part of the disclosure of the invention both on their own and in any combination:
The satellite imagery-based system 1 comprises one or more airborne and/or spaceborne optical remote sensing devices 121 and/or optical sensory satellites 12 and/or optical sensory manned/unmanned aircraft or drones 12 equipped with one or more remote airborne or spaceborne sensors 121 being within a frequency band/wavelength range 1211 at least comprising infrared to visible multi-spectral sensors 12111 and/or synthetic aperture radar 12112 and/or hyperspectral sensors 12113 for capturing digital satellite imagery 122 of a geographic area 4 affected by the natural catastrophe event 2 and transmitting the digital aerial and/or satellite imagery 122 to a digital ground system 11. It is explicitly to be noted that the remote sensing devices 12 can be any kind of airborne or spaceborne vehicles, such as, for example, manned and/or unmanned aircraft and/or drones and/or balloons (hot-air balloons, gas balloons etc.) and/or zeppelins and/or satellites and/or spacecraft etc. equipped with optical sensors, in particular imagery sensors for measuring and/or capturing aerial or satellite surface imagery 121. In particular, Unmanned Aerial Systems (UAS) can be used to provide a largely inexpensive, flexible way to capture high spatial and temporal resolution geospatial data. Computer vision technology, as e.g. Structure from Motion (SfM), can be used according to the invention for processing of the UAS or otherwise captured aerial or spaceborne (satellite) imagery to generate three-dimensional point clouds and orthophotos. The manned/unmanned aircraft or drones 12 can e.g. comprise an Unmanned Aerial System (UAS), i.e. an aircraft without an onboard pilot that is operated autonomously or manually by a remote control operator or operator system. The terms unmanned aerial vehicle (UAV), unmanned aircraft systems/vehicles, remotely piloted aircraft (RPA), and drone can be used herein interchangeably. UAS platforms are herein adopted for geospatial purposes, and can e.g. be small UAS (sUAS), weighing between 0.5 lbs (˜0.2 kg) and 55 lbs (˜25 kg) as designated by the U.S. Federal Aviation Administration (FAA; weight limits may vary in other countries). The aerial remote sensing devices 12 can also comprise Rotary Wing devices (RW), i.e. single or multirotor copters with upward-mounted propeller(s) that generate lift, allowing the aircraft to take off and land vertically and hover during flight. To capture specific surface structures, RW platforms typically provide more maneuverability than fixed wing aircraft. The aerial remote sensing devices 12 can also comprise Fixed Wing devices (FW), i.e. devices with a stationary wing and forward mounted propeller(s) to generate lift and continuously move the aircraft forward at varying pitch angles. FW aerial platforms can be useful for the present invention where the airborne devices are required to fly at higher speeds and for longer durations (40 minutes to several hours), increasing aerial coverage in comparison to RW. As mentioned, for the image pre-processing, image processing techniques such as Structure from Motion (SfM) computer vision algorithms can be applied to process digital photos into 3D point clouds and subsequent geospatial data processing such as digital terrain and surface modelling and/or orthophotos. SfM, as used herein, also encompasses multi-view stereo techniques (e.g., MVS, SfM-MVS).
For the present use, it is to be noted that UAS can often be hindered by even slightly windy conditions, requiring frequent confirmation of weather forecasts at/near the site to be optically measured/captured. Although device dependent, FW aircraft are often flown into and with the wind to minimize side-to-side movement, whereas RW aircraft are less restricted in flight direction. FW platforms require a larger staging area than RW platforms for launch and skid landings. During data collection missions, flightlines should be organized to ensure stereoscopic coverage. Further, UAS-based image capture may require considerable overlap (80-90% end-lap and 60% side-lap can e.g. be recommendable) to ensure effective image matching due to the larger distortions introduced by lower flying altitudes and platform instability. Nadir-facing images can be collected, although convergent views can be recommendable (i.e. integration of oblique images).
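As a purely illustrative worked example of the above overlap guidance, the following Python sketch derives the photo base and flightline spacing for 85% end-lap and 60% side-lap from assumed camera and flight parameters (a 13.2 mm × 8.8 mm sensor with 8.8 mm focal length flown at 100 m above ground); these values are hypothetical and not prescribed by the invention.

def ground_footprint(sensor_dimension_m: float, focal_length_m: float, altitude_m: float) -> float:
    """Ground coverage of one image dimension at nadir, using a simple pinhole-camera relation."""
    return altitude_m * sensor_dimension_m / focal_length_m

width_m = ground_footprint(0.0132, 0.0088, 100.0)    # across-track image footprint
height_m = ground_footprint(0.0088, 0.0088, 100.0)   # along-track image footprint

photo_base = height_m * (1 - 0.85)      # spacing between exposures for 85% end-lap
line_spacing = width_m * (1 - 0.60)     # spacing between flightlines for 60% side-lap
print(round(width_m, 1), round(height_m, 1))          # 150.0 100.0
print(round(photo_base, 1), round(line_spacing, 1))   # 15.0 60.0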
UAS capturing imagery can e.g. also comprise off-the-shelf, point-and-shoot digital cameras as a sensor option. It is often recommendable to avoid wide-angle lenses due to high image distortion, and parsing video into still images can often not be recommended because frames may contain blur. It is to be noted that off-the-shelf cameras typically have limited spectral resolution, and reflectance calibration can be challenging, but removal of the internal hot mirror permits capturing of near-infrared wavelengths. For this, spectral targets with known reflectance properties can e.g. be placed in situ to calibrate the optical sensor measurements, or sensors such as the Tetracam ADC Lite sensor allow image capture from UAS with spectral bands matching certain Landsat bands, thereby facilitating imagery mapping and comparison. Georeferencing schemes for UAS-acquired imagery include: (1) direct, which uses known camera locations through GNSS-enabled cameras or onboard GNSS and IMU measurements stored and attached to captured images, (2) indirect, which uses GNSS-located ground control points (GCPs), and (3) a combination of direct and indirect. It is to be noted that airborne sensory devices (unmanned or manned), in contrast to satellite sensory devices, may allow for better non-imagery measuring data capturing. Non-imagery sensory equipment of UAS, e.g. used for improving the optical measurements, can comprise, for example, sensors collecting measurements of temperature, pressure, humidity, and wind for atmospheric sampling and meteorology, or environmental surveillance sensors that can detect CO2, methane, and other gases for pipeline monitoring. Further, lidar sensors can e.g. be employed for terrain and 3D mapping, but sensor size, weight, and cost may be restrictive for application.
In general, images can e.g. be processed to generate very high spatial resolution orthophotos. The herein proposed proper orthophoto production comprises the removal of radiometric effects (e.g., vignetting, brightness variation from image to image, conversion to reflectance values) and geometric effects (e.g., lens distortion, relief displacement). It is to be noted that, for the present application, geometric corrections can e.g. be challenging when using uncalibrated sensors at low altitudes where distortions are magnified.
The optical remote sensing devices 12, as e.g. optical sensory satellites 12 or optical sensory manned/unmanned aircraft or drones 12, are connected to one or more aerial and/or satellite receiving stations 16. The aerial and/or satellite receiving stations 16 can be strategically located across a certain geographic region to ensure coverage of the landmass and waters of said geographic region. These aerial and/or satellite receiving stations 16 track and receive data in real-time from satellites for the inventive mapping and/or surveillance and/or monitoring process. As an embodiment variant, ground-based sensors or aircraft-based sensors can e.g. be used to record additional sensory data about the surface, which is compared with the measurements collected from the satellite sensors. In some cases, this can be used to calibrate and/or weight and/or normalize the measurements of the target which is being imaged by these satellite sensors. Such sensors may be placed on a ladder, scaffolding, a tall building, a cherry-picker, a crane, etc. Aerial platforms are primarily stable wing aircraft, although helicopters can also be used. Aircraft can be used to collect very detailed images and facilitate the collection of data over virtually any portion of the Earth's surface at any time.
In the embodiment variant of UAS, the aerial and/or satellite receiving stations 16 can e.g. comprise Ground Control Stations (GCSs) for the unmanned sensory devices, realized as stationary or transportable hardware/software devices to monitor and command the unmanned aircraft. Despite the word 'ground', a UAS may actually be operated from the ground, sea or air. GCSs are technically as important as the unmanned sensory aircraft themselves, as they enable the interface with the aerial and/or satellite receiving stations 16, wherein any change in the route of the UAS, any eventual error on the aerial platform and/or any outcome of the payload sensors is transmitted to and monitored within the GCS of the aerial and/or satellite receiving stations 16. The UAS can further comprise an autopilot loop repeatedly reading the aircraft's position, velocity and attitude (tPVA, with t standing for time) from the Navigation System (NS) and using the tPVA parameters to feed the Flight Control System (FCS) to guide the aircraft. These measuring parameters can also be transmitted to and monitored by the Ground Control Stations (GCSs) of the aerial and/or satellite receiving stations 16 during optical sensory data capturing.
The one or more airborne and/or spaceborne optical remote sensing devices 12 equipped with one or more optical remote sensors 121 can e.g. have a sensor resolution 1212 in a spectral band in the infrared range measuring temperatures between −50° C. and 50° C. The one or more remote airborne and/or spaceborne sensors 121 can e.g. have a radiometric resolution 12121, given by the optical sensor's 121 sensitivity to the magnitude of the electromagnetic energy or the optical sensor's 121 measuring color depth, of at least 8 bit giving at least 256 brightness levels, wherein the radiometric resolution 12121 defines the resolution of the system 1 to detect differences in reflected or emitted energy. The one or more remote airborne and/or spaceborne sensors 121 can e.g. have a spatial resolution 12122 of at least 2.5 m and/or at least 10 m. Further, the one or more remote airborne and/or spaceborne sensors 121 can e.g. have a spatial resolution 12122 of at least 30×30 with 120×120 thermal infrared (TIR) and/or 30×30 with 60×60 TIR and/or greater than 50×50. The one or more remote airborne and/or spaceborne sensors 121 can e.g. have a temporal resolution 12124 greater than 5 to 10 revisits a day. The one or more remote airborne and/or spaceborne sensors 121 can e.g. have a spatial coverage 12125 of 100×100 km or more.
The digital ground system 11 comprises a core engine 111 for generating a digital natural catastrophe event footprint 1111 of the natural catastrophe event 2 based on the captured digital aerial and/or satellite imagery 122, the natural catastrophe event footprint 1111 at least comprising a topographical map 11111 of the natural catastrophe event 2. The natural catastrophe event footprint 1111 can e.g. be generated by measuring the satellite imagery 122 and extracting one or more natural event parameters 111102 for locations or grid cells 1111021 of the topographical map 11111. The natural event parameters 111102 can e.g. comprise measurands measuring at least wind speed 1111022 and/or precipitation range and/or intensity 1111023 and/or flood level 1111024 and/or hail intensity and/or hail size 1111025 and/or air temperature and/or humidity 1111026 and/or earthquake intensity 1111027 and/or storm surge measure and/or avalanche strength and/or mud slide and/or tsunami strength 1111028 and/or terrain incline and/or wildfire or conflagration extent. Further, as an embodiment variant, the natural catastrophe event footprint 1111 can e.g. be generated by measuring the satellite imagery 122, the natural catastrophe event footprint 1111 being based on predicted occurrence probability measures for a selected area to be affected by a future occurrence of a natural catastrophe event 2.
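For illustration only, an event footprint carrying per-grid-cell natural event parameters could be represented as in the following minimal sketch; the class names, field names and example values are assumptions and not a data model disclosed by the invention.

```python
# Minimal sketch of a natural catastrophe event footprint as grid cells of a
# topographical map, each carrying measured event parameters (illustrative).
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class GridCell:
    lat: float
    lon: float
    parameters: Dict[str, float] = field(default_factory=dict)  # e.g. {"wind_speed": 45.0}

@dataclass
class EventFootprint:
    event_id: str
    cells: Dict[Tuple[int, int], GridCell] = field(default_factory=dict)

    def intensity_at(self, i: int, j: int, parameter: str, default: float = 0.0) -> float:
        """Return the measured intensity of one event parameter for a grid cell."""
        cell = self.cells.get((i, j))
        return cell.parameters.get(parameter, default) if cell else default

# Usage (illustrative): footprint.cells[(10, 12)] = GridCell(27.9, -82.5, {"wind_speed": 58.0})
```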
The digital ground system 11 comprises a data transmission interface 112 for receiving location parameter values 41 defining selected land-based objects and/or structures 3 located in or near the area 4 affected by the natural catastrophe event 2.
The digital ground system 11 comprises an object filter 115 for matching the received location parameter values 41 of each land-based object and/or structure 3 to the generated topographical map 11111, wherein land-based objects and/or structures 3 lying in the area 4 affected by the natural catastrophe event 2 are identified and filtered if the received location parameter values 41 of a land-based object and/or structure 3 are detected to be in a geographic parameter value range 111111 of the affected area 4 of the topographical map 11111.
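Under simplifying assumptions (a rectangular lat/lon value range rather than the actual footprint geometry), the object filter could be sketched as follows; all names and the data layout are illustrative.

```python
# Sketch of the object filter: keep only objects whose location parameters fall
# inside the geographic parameter value range of the affected area.
from typing import Iterable, List, Tuple

def filter_affected_objects(
    objects: Iterable[dict],                 # each with "id", "lat", "lon"
    lat_range: Tuple[float, float],
    lon_range: Tuple[float, float],
) -> List[dict]:
    affected = []
    for obj in objects:
        if lat_range[0] <= obj["lat"] <= lat_range[1] and lon_range[0] <= obj["lon"] <= lon_range[1]:
            affected.append(obj)             # object lies in the affected area
    return affected
```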
The core engine 111 comprises an adaptive vulnerability curve structure 1112 for parametrizing impact measurands 1113 for the land-based objects and/or structures 3 per event intensity 23 based on the measured topographical map 11111, and for generating an impact measurand 1113 value for each of one or more of the land-based objects and/or structures 3 based on an event intensity 23 measured based on the natural catastrophe event footprint 1111 using the vulnerability curve structure 1112. The vulnerability curve structure 1112 can e.g. rely on one or more characteristic parameter values of the land-based objects and/or structures 3, comprising at least aggregated monetary equivalent 11171 and/or size 11172 and/or quality 11173 and/or age 11174 and/or type of structure 11175 and/or degree of coverage 11176 and/or type of coverage 11177 and/or occupancy 11178 and/or past/historical damage assessment parameter values 11182 capturing past damages impacted by former natural catastrophe events 11181 and/or deviation parameter values 11179 capturing measured deviations in the data imagery of a land-based object and/or structure 3 before and after the natural catastrophe event 2.
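A vulnerability curve relating event intensity to an impact ratio can, for illustration, be approximated by piecewise-linear interpolation between supporting points; the sketch below and its curve values are assumptions, not values disclosed by the invention.

```python
# Sketch of a vulnerability curve: piecewise-linear mapping from event
# intensity to an impact ratio (0..1). Curve points are illustrative only.
import bisect

class VulnerabilityCurve:
    def __init__(self, points):
        # points: sorted list of (intensity, impact_ratio) pairs
        self.xs = [p[0] for p in points]
        self.ys = [p[1] for p in points]

    def impact(self, intensity: float) -> float:
        if intensity <= self.xs[0]:
            return self.ys[0]
        if intensity >= self.xs[-1]:
            return self.ys[-1]
        i = bisect.bisect_right(self.xs, intensity)
        x0, x1 = self.xs[i - 1], self.xs[i]
        y0, y1 = self.ys[i - 1], self.ys[i]
        return y0 + (y1 - y0) * (intensity - x0) / (x1 - x0)

# e.g. wind speed (m/s) vs. expected damage ratio for an assumed structure type
curve = VulnerabilityCurve([(20, 0.0), (35, 0.1), (50, 0.4), (70, 0.9)])
impact_measurand = curve.impact(58.0)   # 0.6 for this illustrative curve
```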
The optical sensory-based system 1 and its parameter measurements can e.g. be weighted and/or calibrated using additional measuring parameter values. As such, at least one current damage parameter value can e.g. be received by the digital ground system 11 capturing physical damages resulting from the natural catastrophe event 2, wherein the vulnerability curve structure 1112 is calibrated based on said current damage parameter value. However, the weighting or calibration process can e.g. also be conducted fully automatically by the system 1, wherein said current damage parameter value can e.g. be generated by matching a digital image of a land-based object and/or structure 3 prior to the occurrence of the natural catastrophe event 2 to a digital image of the land-based object and/or structure 3 after the impact by the natural catastrophe event 2, determining the damage parameter value as the detected variance within that land-based object and/or structure 3. Further, object and/or structure 3 location parameters can e.g. be extracted from satellite imagery captured previous to the natural catastrophe event 2 and/or from existing object and/or structure 3 location data listings. As an embodiment variant, object and/or structure 3 location parameters 32 can e.g. also or additionally be derived from portfolio information of a risk-transfer system. As another embodiment variant, normalized and/or weighted distribution maps of land-based objects and/or structures 3 identified by the location parameters 32 and potentially damaged in the area 4 affected by the natural catastrophe event 2 can e.g. be generated by the core engine 111. The normalized and/or weighted distribution maps can e.g. at least comprise distribution maps of damage impact strength to land-based objects and/or structures 3 and/or normalized loss distributions.
An impact measurand 1113 value can e.g. be generated for each of one or more of the land-based objects and/or structures 3 based on an event intensity 23 in real-time or quasi real-time with the occurrence of the natural catastrophe event 2. The real-time or quasi real-time generation can e.g. automatically be triggered when one or more measuring parameters associated with the natural catastrophe event 2, measured by means of the airborne and/or spaceborne optical remote sensing devices and/or satellites 12, exceed predefined threshold values 1213. The threshold values can at least comprise a predefined threshold for measuring the extent of the affected area 12131 and/or the intensity of the natural catastrophe event 12132 and/or the impact strength of the natural catastrophe event 12133. The inventive threshold triggering in combination with the inventive system allows a completely automated monitoring and measuring of the physical impact of a newly occurring natural catastrophe event 2 based on optical sensory measurements. Moreover, by converting the trigger signals in case of exceeding the predefined threshold values, the optical sensory-based system 1 can e.g. also be used to provide a completely automated alarm system. If the system 1 e.g. further comprises electronic or at least electronically activatable alarm devices, such as siren alarm devices and/or alarm lights, the system 1 can e.g. generate, based on the predefined thresholds in case of detecting an occurring natural disaster event 2, an electronic signaling by an electronic signal generator 114 and transmit the electronic signaling to the alarm devices and/or siren alarm devices and/or alarm lights for activation of the alarm devices. This can be particularly useful (i) regarding natural catastrophe events with a spatially delayed propagation, as e.g. flood events where flood in higher zones propagates time-delayed to zones lying closer to the sea level, or fire events where fire typically propagates in wind direction toward new regions, or (ii) regarding natural catastrophe events with a temporally delayed propagation, as e.g. earthquakes where the mainshock typically follows time-delayed preceding foreshocks, or volcanic eruptions which typically go through several stages beginning with earthquake swarms and gas emissions, then moving to initial steam and ash venting, lava dome buildup, dome collapse, magmatic explosions, more dome growth interspersed with dome failures and, finally, ash, lava and pyroclastic eruptions. For such natural catastrophe events 2 with a spatial or temporal delay in propagation, the present inventive optical sensory-based system, comprising automated signaling upon threshold triggering of measuring parameters of real-time or quasi real-time measured topographical maps 11111 with dynamically adapted event footprints 1111, is able to provide a more reliable and more accurate automated alarm system 1 than the prior art systems are able to provide. Therefore, the natural catastrophe event footprint 1111 can e.g. comprise a time series 11101 of measured digital satellite imagery 122, each digital satellite imagery 122 comprising an assigned measuring time stamp or time range 11102, wherein, based on the time series 11101 of measured digital satellite imagery 122, the dynamics of the propagation of the natural catastrophe event footprint 1111 are measurably captured by the core engine 111.
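A hedged sketch of the threshold triggering described above is given below; the threshold names and values are illustrative assumptions, not thresholds prescribed by the invention.

```python
# Sketch of threshold-based triggering: the impact assessment (and optionally
# an alarm signal) is triggered as soon as any measured event parameter
# exceeds its predefined threshold. Values are illustrative only.
THRESHOLDS = {
    "affected_area_km2": 50.0,   # extent of the affected area
    "event_intensity": 0.6,      # normalized event intensity
    "impact_strength": 0.5,      # normalized impact strength
}

def threshold_trigger(measurements: dict, thresholds: dict = THRESHOLDS):
    """Return the list of exceeded thresholds; a non-empty list triggers processing."""
    return [k for k, limit in thresholds.items() if measurements.get(k, 0.0) > limit]

def on_new_footprint(measurements: dict, signal_generator=None):
    exceeded = threshold_trigger(measurements)
    if exceeded and signal_generator is not None:
        signal_generator(exceeded)   # e.g. activate siren alarm devices / alarm lights
    return bool(exceeded)
```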
Further, a quantified loss measure value 1114 can e.g. be generated by the core engine 111 for each of one or more of the land-based objects and/or structures 3 based on the measured impact measurand 1113 for the respective land-based object and/or structure 3, the quantified loss measure value 1114 being given by the percentage portion of physical damage to a land-based object and/or structure 3 weighted by the undamaged land-based object and/or structure 3. In addition, a monetary equivalent value 11142 of the measured quantified loss measure value 1114 of one or more of the land-based objects and/or structures 3, giving the monetary equivalent 11142 of the measured physical damage of the land-based objects and/or structures 3, can e.g. be generated by the core engine 111. Thus, the impact measurands 1113 and/or loss measures 1114 can e.g. be generated representing quantified measures for an actual physical damage in case of the occurrence of the natural catastrophe event 2.
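For illustration only, the quantified loss measure and its monetary equivalent could be derived as in the following sketch, assuming the damaged share of an object and its aggregated value are known; the function and field names are illustrative.

```python
# Sketch: the loss measure is the damaged share of the object, and the monetary
# equivalent scales it by the object's aggregated value (illustrative only).
def quantified_loss_measure(damaged_share: float) -> float:
    """Percentage portion of physical damage relative to the undamaged object (0..1)."""
    return max(0.0, min(1.0, damaged_share))

def monetary_equivalent(loss_measure: float, object_value: float) -> float:
    """Monetary equivalent of the measured physical damage."""
    return quantified_loss_measure(loss_measure) * object_value

# e.g. 35% damage on a structure valued at 400'000 -> 140'000 (illustrative)
loss_value = monetary_equivalent(0.35, 400_000)
```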
Inter alia, to generate the monetary equivalent measure 11142, digital representations 1116 of the land-based objects and/or structures 3 can e.g. automatically be assembled by the core engine 111. The digital representations 1116 are composed of digital object elements 11151 stored in an object elements library 1115. The monetary equivalent 11142 of the measured physical damage of the land-based objects and/or structures 3 is generated from an aggregated monetary equivalent of the digital object elements 11151 of a land-based object and/or structure 3 in relation to the measured physical damage of the land-based object and/or structure 3. To each of the digital object elements 11151 stored in the object elements library 1115, e.g. monetary equivalent values can be assigned and/or dynamically updated. The aggregated monetary equivalent of the digital object elements 11151 of a land-based object and/or structure 3 can e.g. be dynamically generated based on the digital object elements 11151 of the object elements library 1115.
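A minimal sketch of the element-wise aggregation described above follows, assuming a hypothetical object elements library with per-element monetary equivalents; element names and values are illustrative.

```python
# Sketch: aggregate the monetary equivalents of damaged object elements in
# relation to their measured damage share (illustrative element library).
OBJECT_ELEMENTS_LIBRARY = {
    "roof":   {"monetary_equivalent": 60_000},
    "wall":   {"monetary_equivalent": 120_000},
    "garage": {"monetary_equivalent": 30_000},
}

def damaged_monetary_equivalent(elements_damage: dict, library=OBJECT_ELEMENTS_LIBRARY) -> float:
    """elements_damage maps identified element names to their measured damage share (0..1)."""
    total = 0.0
    for element, damage_share in elements_damage.items():
        value = library.get(element, {}).get("monetary_equivalent", 0.0)
        total += value * damage_share
    return total

# e.g. roof 80% damaged, garage 20% damaged -> 48'000 + 6'000 = 54'000
estimate = damaged_monetary_equivalent({"roof": 0.8, "garage": 0.2})
```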
Further, one or more digital images 1119 of the land-based object and/or structure 3 can be captured by the digital ground system 11. The one or more digital images 1119 are automatically captured by the remote satellite sensors 121 and/or transmitted by an individual associated with the land-based object and/or structure 3 and/or captured from a database accessible via a data transmission network 13. By means of an identificator and locator unit 14, elements 34 of a land-based object and/or structure 3 can e.g. be identified by data processing of the one or more digital images 1119 based on the digital elements 11151 of the object elements library 1115 and located within the land-based object and/or structure 3. The core engine 111 assembles the digital representations 1116 of the land-based objects and/or structures 3 using the digital elements 11151 identified and located within the land-based object and/or structure 3. Automated pattern recognition can e.g. be applied to the one or more digital images 1119 by the identificator and locator unit 14 for identifying and locating the digital elements 11151 within the land-based object and/or structure 3 based on processing the one or more digital images 1119.
Risk-Transfer Application
In the following, a method and an optical sensory-based system 1 for assessing property damage measures in case of a natural catastrophe event impact according to the invention are referred to as a Rapid Damage Assessment (RDA) system. The RDA system is described for an insurance use case, where an insurance company is interested in assessing damage measures and/or estimate values for one or more properties in an area affected by a natural catastrophe event.
The industry is facing increasing losses from natural catastrophes and associated operational challenges due to event impact uncertainty and a sudden influx of large volumes of claims. Risk-transfer systems have to balance the urgency of rapid response to customers against the lack of access to impacted areas and the limited availability of adjusting resources. This puts the entire claims operation under significant financial and operational stress due to increases in Loss Adjustment Expenses (2× increase), Claims Leakage (3× increase) and Claims Cycle Time (5× increase).
The RDA system supports risk-transfer systems in effectively managing natural catastrophe claims and addressing the above-mentioned challenges. The RDA system is an end-to-end automated natural catastrophe claims platform allowing claims managers and loss adjustors to make faster and smarter claims decisions from one platform during a hurricane.
The RDA system supports 3 key phases across the event lifecycle:
The RDA system leverages natural catastrophe modelling, imagery, weather, and property data, and augments them with deep AI algorithms to determine damage at the level of every risk-exposed property. The damage is then aggregated at portfolio level (or for other definable sets of risk-exposed objects) and at other geographic levels to deliver a damage assessment. The inventive probabilistic CAT modelling structure helps in estimating the potential impact on a specific portfolio of objects before a hurricane makes landfall, allowing users to start planning their response. The inventive system procures post-event images of impacted areas within 2-4 days from landfall and analyzes them to determine damage severity at portfolio as well as individual property level. This supports claims managers and claims adjusters in accessing relevant data in one place for faster and more accurate processing of claims.
The inventive system has inter alia the advantage to (i) reduce claims OPEX: Mobilize CAT response users well in advance based on highly accurate predicted impact and deploy adjusting resources based on an accurate assessment of damage severity. Minimize the need for field adjustment by enabling remote inspection of properties; (ii) reduce claims leakage: Prioritize and react faster to damages which have the potential of compounding losses. Reduce the risk of fraud and litigation with pre- and post-damage images of individual properties; and (iii) improve customer satisfaction: Support insurers with the ability to proactively reach out to impacted customers and reduce the claims cycle time by enabling claims teams in data-driven adjustment.
In summary,
The table of
As shown, the vulnerability curve of
Thus, the method for assessing property damage measures and/or estimates in case of a natural catastrophe event impact is based on an architecture for a possible implementation of an embodiment of the optical sensory-based system 1 for providing fast and accurate quantified loss measures and estimates and actionable insights for insurers after a natural catastrophe event has occurred.
The inventive digital Rapid Damage Assessment system provides fast and accurate loss estimates and actionable insights for insurers after a natural catastrophe event. The basis for the Rapid Damage Assessment system is given by the location or portfolio information of a user, such as, for example, an insurer. Starting from this information, the system generates loss estimates, corresponding graphical maps of events, and the portfolio impact after large disaster events by tracking natural catastrophe events. As shown by
In summary, as illustrated by
Imagine a large hurricane is approaching the coast, and one of the risk-exposed users has a large number of risks in the area where the storm is expected to make landfall. It will then be interesting to assess the expected loss on this portfolio after, or even before, the hurricane makes landfall. It is further important to understand which of the locations in the portfolio are expected to suffer the largest losses. Such information can also be used to improve claims management. The RDA system and method provide users with the means to quickly answer some of the most pressing questions during a natural catastrophe event in an automated and standardized way.
The system provides an automated generation of an event report for pre-defined portfolios for tropical cyclones and earthquakes, as soon as they occur. The event report can e.g. contain: (i) event information with general event information and/or a visualization of the event footprint, (ii) portfolio information with a visualization of the portfolio, and (iii) loss information with the expected portfolio loss, the locations with the highest loss, and a visualization of the loss map. The system is also able (a) to generate a similar report for other perils (e.g. flood or wildfire), if a satellite footprint of the event is available; however, this is not yet fully automated (see below for the current state of capabilities), and (b) to support non-automated runs on a specific portfolio in MultiSNAP (see the information about the Nat Cat Event Footprint capabilities).
It is to be noted that the inventive system and platform is able to provide full integration into a Geo architecture and to offer, e.g. via a user portal, full automation of perils other than earthquakes and tropical cyclones. The system also allows easy integration and addition of new footprint sources (e.g. wind footprints from Meteomatics) and new perils (e.g. modelled storm surge based on the track forecast).
This description and the accompanying drawings that illustrate aspects and embodiments of the present invention should not be taken as limiting the claims defining the protected invention. In other words, while the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. Various compositional, structural, and operational changes may be made without departing from the spirit and scope of this description and the claims. In some instances, well-known processes, structures, and techniques have not been shown in detail in order not to obscure the invention. Thus, it will be understood that changes and modifications may be made by those of ordinary skill within the scope and spirit of the following claims. In particular, the present invention covers further embodiments with any combination of features from different embodiments described above and below.
Furthermore, in the claims the word “comprising” does not exclude other components or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single unit or step may fulfil the functions of several features recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
The method for assessing property damage measures and/or estimates in case of a natural catastrophe event impact can be realized as a computer program, which may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. In particular, e.g., a computer program can be a computer program product stored on a computer readable medium which computer program product can have computer executable program code adapted to be executed to implement a specific method such as the method according to the invention. Furthermore, a computer program can also be a data structure product or a signal for embodying a specific method such as the method according to the invention.
As described above several aspects, components and steps of the optical sensory-based system 1 and the method for assessing property damage measures and/or estimates in case of a natural catastrophe event impact according to the present invention are based on technical considerations and concepts as for example satellite imagery technologies, measuring of real world parameters representing hazardous events, damage assessments using optical and sensor technologies to identify damage degrees and damage characters, remote sensing and imaging devices for assessing of property locations, automation technologies to integrate data information in existing portfolios and more.
Process to Get the Pre and Post-Event Pictures
An important part of the inventive system is the inventive process to get the pre and post-event pictures.
(i) Identify the Coordinates of the Insured Building
Input: The process can start with customers providing their portfolio data, which consists of the addresses of each insured risk in the portfolio along with the ZIP codes.
Process: The process comprises primarily 2 steps:
Output: The output signaling usually gives the centroid lat/long of a land parcel for the given address
In case there is no absolute match of the input address with the reference address database, interpolation rules are used to identify the approximate coordinates of an address. RDA uses Precisely for geocoding.
(ii) Create a Building Footprint Database
Input: Addresses of insured risks as part of the registered portfolio in RDA.
Assumption and boundary condition: A residential building might have the main structure and then ancillary structures (like a garage, garden shed, guest house, etc.). It is assumed that residential buildings do not have more than 3-5 building footprints in a land parcel. Commercial buildings, on the other hand, can have many building footprints in a land parcel, as they comprise multiple smaller structures.
Process: The process differs for residential buildings and commercial buildings. The system can e.g. use Ecopia as a building footprint data provider.
1. Residential Buildings:
a) If, for a given address, the number of building footprints in a given parcel is less than or equal to 5, combine all of them together to create one combined polygon (technically known as a multi-polygon) having one unique identifier.
Output: Coordinates of the multi-polygon and centroid lat/long for the multi-polygon.
b) If, for a given address, the number of building footprints is greater than 5, split them into individual polygons with unique identifiers for each of these split polygons.
Output: Coordinates of each individual polygon and the centroid lat/long of each of these individual polygons. This is done to handle potential errors in the portfolio where a commercial building is categorized as a residential building, as well as incorrect collection of building footprint data.
2. Commercial Buildings:
a) Irrespective of the number of building footprints, combine all building footprints into one combined polygon having one unique identifier.
Output: Coordinates of the multi-polygon and centroid lat/long for the multi-polygon
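For illustration, the residential and commercial footprint rules above could be implemented as in the following sketch, here using the shapely geometry library; the function name, the data layout and the residential limit of 5 footprints follow the assumptions stated above and are not prescribed by the system.

```python
# Sketch of the building-footprint combining rules described above.
from shapely.geometry import Polygon, MultiPolygon

def combine_footprints(footprints, building_type: str, residential_limit: int = 5):
    """Return one combined multi-polygon, or individual polygons, following the
    residential/commercial rules described above (illustrative only)."""
    polygons = [Polygon(f) for f in footprints]
    if building_type == "commercial" or len(polygons) <= residential_limit:
        combined = MultiPolygon(polygons)        # one combined polygon, one identifier
        return {"multi_polygon": combined, "centroid": combined.centroid}
    # residential with more than the limit: keep individual polygons and centroids
    return [{"polygon": p, "centroid": p.centroid} for p in polygons]

# Usage (illustrative coordinates):
# combine_footprints([[(0, 0), (0, 10), (8, 10), (8, 0)],
#                     [(12, 0), (12, 4), (15, 4), (15, 0)]], "residential")
```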
(iii) Create the Image Bounding Box to Retrieve Images
Input: Lat/Long from Geocoding and Centroid Lat/Long from Building footprint data.
Assumption and boundary condition: Address geocoding is more accurate than determining building footprint for a given address.
Process:
Output: Coordinates of the extended rectangle which is considered as the image bounding box.
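The bounding-box construction step itself is not detailed above; the following sketch shows only one plausible reading, in which the geocoded coordinates and the footprint extent are merged and padded by an assumed buffer. All names and the buffer value are assumptions.

```python
# One plausible (assumed) construction of the image bounding box: extend the
# rectangle enclosing the footprint bounds and the geocoded point by a buffer.
def image_bounding_box(geocode_latlon, footprint_bounds, buffer_deg=0.0005):
    """footprint_bounds = (min_lat, min_lon, max_lat, max_lon); returns an extended rectangle."""
    min_lat, min_lon, max_lat, max_lon = footprint_bounds
    lat, lon = geocode_latlon
    min_lat, max_lat = min(min_lat, lat), max(max_lat, lat)
    min_lon, max_lon = min(min_lon, lon), max(max_lon, lon)
    return (min_lat - buffer_deg, min_lon - buffer_deg,
            max_lat + buffer_deg, max_lon + buffer_deg)
```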
(iv) Retrieve Images for the Image Bounding Box
Input: Image Bounding Box
Assumption and boundary condition: One location can have multiple pre and post event images
Process:
Output: High resolution aerial imageries (blue-sky, i.e. pre-event, and gray-sky, i.e. post-event) for the given image bounding box.
As an embodiment variant, the system can e.g. capture high resolution aerial imageries (7.5 cm-20 cm resolution), e.g. from Vexcel. However, the same process can be replicated for other forms of imagery, such as satellite, drone, etc.
Process to Detect Damage Severity Based on Aerial Imageries
(i) Damage Classification: Damage Severity at an Overall Building Level
Technical objective: The model is trained to classify buildings into classes, e.g. five classes of damage severity, which e.g. are No Damage, Minor, Moderate, Major and Complete Damage.
Input: a) High resolution pre-event and post-event optical aerial imageries for a given insured building. These are ortho imageries (i.e. top-view images), b) DSM images which give the elevation of all objects within an image.
Process:
Output: a) Processed image, b) Rooftop footprint, c) One of the five damage severity classes along with a confidence score. If the rooftop is predominantly covered by vegetation, then it is classified as "background".
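A minimal sketch of how the classification output described above (rooftop footprint, severity class, confidence score, and the vegetation/"background" rule) could be assembled into a record; the record layout is an assumption, not the system's actual output format.

```python
# Sketch of the damage-classification output record (illustrative layout).
DAMAGE_CLASSES = ["No damage", "Minor", "Moderate", "Major", "Complete Damage", "background"]

def classification_record(rooftop_footprint, class_scores: dict, vegetation_covered: bool):
    """class_scores maps each severity class to a model confidence score (0..1)."""
    if vegetation_covered:
        return {"rooftop": rooftop_footprint, "class": "background", "confidence": 1.0}
    best = max(class_scores, key=class_scores.get)
    return {"rooftop": rooftop_footprint, "class": best, "confidence": class_scores[best]}
```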
(ii) Damage Segmentation: Detection of Localized Damaged Segments of the Roof
Technical objective: Highlight the part of the roof that is damaged and determine the percentage of roof that is damaged
Input: a) High resolution Pre-event and Post-event optical aerial imageries for a given insured building after image pre-processing, b) Rooftop segments i.e. bounding box of the rooftops.
Process:
Output: a) Damaged segments of the roof i.e. the pixels which are identified as damaged part of the roof, b) Percentage of overall roof that is damaged, and c) Percentage of damage by each facet (see
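For illustration, the overall and per-facet damage percentages could be derived from boolean pixel masks as sketched below; the mask encoding is an assumption, not the segmentation model's actual output format.

```python
# Sketch: derive the damaged share of the roof, overall and per facet, from
# boolean pixel masks (illustrative encoding).
import numpy as np

def damaged_percentage(roof_mask: np.ndarray, damage_mask: np.ndarray) -> float:
    """Percentage of roof pixels that are also flagged as damaged."""
    roof_pixels = roof_mask.sum()
    if roof_pixels == 0:
        return 0.0
    return 100.0 * np.logical_and(roof_mask, damage_mask).sum() / roof_pixels

def damage_by_facet(facet_masks: dict, damage_mask: np.ndarray) -> dict:
    """facet_masks maps a facet identifier to its boolean pixel mask."""
    return {facet: damaged_percentage(mask, damage_mask) for facet, mask in facet_masks.items()}
```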
(iii) Damage Sub-Class Classification: Detect the Type of Damages
Technical objective: Determination of the type of damage, e.g. missing shingles, holes in the roof, etc.
Input: a) Post-event image with 3 classes from damage segmentation model (Damaged segment of the roof, undamaged segment of the roof and background), b) Damage class from damage classification model.
Process:
Output: Damage sub-class which represents damage type
(iv) Configuration of Damage Severity
Technical objective: The damage severity definition varies from insurer to insurer and based on the state where the insured building is located. Customers should be able to define damage severity based on their internal business definitions.
Input: a) Percentage of overall roof damaged, b) Percentage of damage by facets, c) Damage sub-class i.e. type of damage, d) Property characteristics, e) Geographic details of an insured building.
Process:
Output: Damage severity based on rules set up.
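Since damage severity definitions vary per insurer, the configurable rules could, for example, be expressed as an ordered list of predicates as sketched below; the example thresholds and labels are purely illustrative and not insurer definitions.

```python
# Sketch of customer-configurable severity rules: each rule maps thresholds on
# the model outputs to a severity label; the first matching rule wins.
def classify_severity(features: dict, rules: list) -> str:
    """rules: ordered list of (predicate, label) pairs."""
    for predicate, label in rules:
        if predicate(features):
            return label
    return "No damage"

EXAMPLE_RULES = [
    (lambda f: f["overall_damage_pct"] >= 80 or f.get("sub_class") == "holes in the roof", "Complete Damage"),
    (lambda f: f["overall_damage_pct"] >= 40, "Major"),
    (lambda f: f["overall_damage_pct"] >= 15, "Moderate"),
    (lambda f: f["overall_damage_pct"] > 0, "Minor"),
]

severity = classify_severity({"overall_damage_pct": 22.0, "sub_class": "shingles missing"}, EXAMPLE_RULES)
# -> "Moderate" under these illustrative rules
```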
A Energy source—electromagnetic wave source
B Interaction of energy with atmosphere (passive vs active)
C Interaction of energy with surface and land-based object/structure
D Measuring of energy by remote sensors, in particular optical sensors
E Transmitting of the digital satellite imagery to the digital ground station and monitoring of occurring natural catastrophe events by means of the digital satellite imagery
F Preprocessing of the digital satellite imagery, generating digital natural catastrophe event footprint with the topographical map
G Matching of selected land-based objects to the generated topographical map and measuring the impact measurands for each of the selected objects with respect to the measured event intensity using the vulnerability curve structure
This application is a continuation of and claims benefit under 35 U.S.C. § 120 to International Application No. PCT/EP2022/077235 filed on Sep. 29, 2022, which is based upon and claims priority to Swiss Application No. 070332/2021, filed Sep. 29, 2021, the entire contents of each of which are hereby incorporated herein by reference.
Related application data: parent application PCT/EP2022/077235 (filed Sep. 2022); child U.S. application Ser. No. 18/340,389.