The 2018 hurricane season opened with the arrival of Subtropical Storm Alberto on the Florida coast. Natural disasters such as these regularly imperil human lives and trillions of dollars of infrastructure. Although we can't stop them, we can limit their financial repercussions by developing more accurate predictions built on an updated approach to modeling catastrophic risk.
The Flawed Assumption
Stationarity is the assumption that a system's statistical properties remain unchanged, or stationary, over time. When applied to climate science, it means assuming the earth's climate is not changing, so that the statistics of the past can be projected into the future. The vast majority of climate scientists believe the stationarity assumption is incorrect, and that any approach built on it is fundamentally flawed.
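To make the concept concrete, here is a minimal sketch, in Python, of how one might check a local hazard series for stationarity. The data, thresholds and the crude windowed test are all illustrative assumptions, not any insurer's actual method:

```python
import numpy as np

def looks_stationary(series, n_windows=4):
    """Crude stationarity check: split a series into windows and flag it
    as non-stationary if any window's mean drifts more than ~3 standard
    errors from the full-series mean. A real analysis would use formal
    tests (e.g., Augmented Dickey-Fuller); this only illustrates the idea
    that, under stationarity, past statistics should hold over time."""
    series = np.asarray(series, dtype=float)
    mu, sigma = series.mean(), series.std()
    for window in np.array_split(series, n_windows):
        if abs(window.mean() - mu) > 3 * sigma / np.sqrt(len(window)):
            return False  # the mean has drifted: history misleads here
    return True

# Illustrative annual flood-depth series (synthetic numbers, not real data):
# one with stable statistics, one with a slow warming-driven upward trend.
rng = np.random.default_rng(0)
years = np.arange(100)
stable = rng.normal(1.0, 0.2, size=100)
trending = stable + 0.01 * years

print(looks_stationary(stable))    # True: projecting the past is defensible
print(looks_stationary(trending))  # False: past statistics mislead
```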
Yet traditional catastrophic climate risk models are built on the assumption of stationarity. They project the future from past statistics, assuming a static climate. Insurers use this approach with reasonable success for regional, national and international insurance policy portfolios, because errors at individual sites tend to average out across a large book of business. However, when stationarity is applied to risk analyses for specific structures or large commercial properties, the model breaks down.
Localized Assets
The problem is that risks to localized assets are not homogeneous across regions and properties. Localized predictions require data that accounts for the dynamics of the local environment.
Those dynamics include not only a changing climate but human-engineered alterations as well. Simply breaking ground for a new building affects potential flooding scenarios. To accurately assess and mitigate potential risk, developers, municipalities and insurance companies need models that resolve risk at the level of the individual block and street and that are not constrained by stationarity.
Creating a dynamic model that collects and analyzes data with such localized resolution is not a simple matter of “downscaling” old methods. It requires a different strategy and discipline, with single-site analysis as a core objective.
Risk Modeling Reimagined
Incorporating natural and human-architected factors in a dynamic, integrated model is fundamental to an asset-focused solution that delivers accurate, actionable information. Such a solution requires comprehensive and current data, powerful big data analytics and a flexible design that can easily incorporate new modeling techniques as they become available.
At Jupiter Intelligence, our solution is built on a cloud-based platform designed specifically for the rigors of climate analysis; it links data, probabilistic and scenario-based models, and advanced validation.
ClimateScore runs multiple models that account for a changing climate, such as the Weather Research and Forecasting (WRF) model. ClimateScore's models are continuously fine-tuned using petabytes of constantly refreshed data from millions of ground-based and orbital sensors. Novel machine learning techniques reduce the local biases of scientific simulations and help the system continually improve as new observations become available.
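Jupiter has not published the specifics of these techniques, but a standard building block for this kind of local bias reduction is quantile mapping, which adjusts simulated values so their distribution matches local observations. The sketch below assumes that technique and uses purely synthetic temperature data:

```python
import numpy as np

def quantile_map(simulated_train, observed_train, simulated_new):
    """Empirical quantile mapping, a standard bias-correction technique:
    each new simulated value is replaced by the observed value at the
    same quantile of the training distributions. As fresh observations
    arrive, retraining the mapping lets the correction keep improving."""
    sim_sorted = np.sort(simulated_train)
    obs_sorted = np.sort(observed_train)
    # Quantile of each new simulated value within the simulated training set
    quantiles = np.searchsorted(sim_sorted, simulated_new) / len(sim_sorted)
    quantiles = np.clip(quantiles, 0.0, 1.0)
    # Look up the same quantile in the observed distribution
    return np.quantile(obs_sorted, quantiles)

# Synthetic example: a simulation that runs 2 degrees too warm and is
# too smooth compared with station observations (illustrative only).
rng = np.random.default_rng(1)
observed = rng.normal(30.0, 4.0, size=1000)   # local station record
simulated = rng.normal(32.0, 2.5, size=1000)  # biased model output
new_runs = rng.normal(32.0, 2.5, size=5)

print(quantile_map(simulated, observed, new_runs))  # de-biased values
```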
By forgoing stationarity and combining the flexibility of a cloud platform with current data from multiple sources and state-of-the-art analytics, machine learning and artificial intelligence, the system produces asset-level predictions that are accurate at horizons from two hours to 50 years in the future.
Case Study: Miami
Localized data down to the individual block and street shows how developed Miami's coast has become; that understanding can help insurance companies, municipalities and developers assess potential risk and determine cost-effective solutions.
Engineering firms need this data to evaluate the potential effects of flooding at a particular site and to simulate how well individual coastal protection measures will protect properties and neighborhoods from flooding over the life of those structures.
Public agencies also need this granularity to determine how vulnerable their assets (ports, airports, transit, wastewater treatment and drinking water facilities) are to a changing climate. Similarly, private entities want to assess exposed assets (substations, buildings, generators and data centers) and critical systems that may need to be redesigned to handle changing conditions. One critical condition to evaluate is the capacity of the electrical grid to handle peak demand during longer and more intense heat waves.
New Risk-Transfer Mechanisms
Stationarity-based catastrophic risk models were never intended to assess risks to specific assets. To mitigate asset-level risk, all parts of the private sector, as well as government bodies at the international, national and local levels, must make informed decisions based on accurate, current, highly localized data.
Property values, liability risk and lives are at stake. With dynamic models, current data and modern analytics, mitigating that risk is feasible. This type of information resource will also support new risk-transfer mechanisms, including private insurance, and help reform obsolete mitigation strategies.
This article was originally published at Brink News.