Industry events, like the recent Reinsurance Association of America’s (RAA) Cat Risk Management conference, are always a great way to take the pulse of the industry and connect with people. I’ve been attending RAA’s conference on behalf of SpatialKey for years, and I generally come away invigorated by my conversations with clients and prospects. This year, however, the energy in those conversations was different: there was more urgency and emotion behind it. It’s clear the unprecedented events of 2017 have taken a toll on people, and there’s a compelling need to do something about it. Individuals and teams alike have worked tirelessly; the events have passed, but the emotional fatigue remains in their wake.
I can empathize. While insurers worked diligently to serve insureds during back-to-back events, we at SpatialKey worked around the clock to serve up timely, expert data to our insurance clients. Keeping that data flowing 24/7 put an enormous strain on our own employees, and we have a dedicated data team! Insurers that lack the expertise or resources to consume and work with the sheer volume and complexity of data being put out by multiple providers may well have found it grueling. That exhaustion still lingers in the faces of the people I spoke with at RAA.
And what’s bubbling to the surface now is the underlying problem:
There’s a ton of data and no easy way for insurers to consume it and act on it.
Put more eloquently, there’s a gap between the wealth of data now available and insurers’ ability to quickly process, contextualize and derive insight from that data.
Not just an event-response problem
While this transforming-data-into-insight problem was illuminated by 2017’s catastrophic events, this is not just an event-response problem. This is not an underwriting problem. This is not a new problem! Events like those of 2017 touch the entire insurance community, insurers and solutions providers alike. And together we need to solve the problem. What I heard time and again at RAA is that everyone is generally frustrated by the lack of a process and an easy way to consume the frequent, sophisticated data that expert providers put out during events like Harvey, Irma, Maria, the Mexico City earthquake and the California wildfires. Insurance professionals are expected to use legacy or complex GIS tools to extract and consume expert data from providers. Yet I didn’t speak to a single GIS expert. It doesn’t make sense.
There’s an opportunity cost to the productivity that employees could be generating elsewhere
Nobody has the time to teach himself a complicated GIS solution to look at data while working to deploy help to customers in the wake of catastrophe.
No underwriter has the time to get up to speed on a GIS solution that takes years to learn while trying to win business quickly.
It’s like giving your star quarterback a basketball and expecting him to win the Super Bowl with it. He’s talented, he can throw that ball, but he’ll never throw a winning pass with a basketball. It’s clunky, it’s cumbersome and it just doesn’t fly as fast. In the same way, folks across claims, exposure management and underwriting can’t quickly consume and understand data with legacy or complex tools that weren't created for their specific uses.
With all the data comes challenge, and a call for ways to interpret information more efficiently
We’d all like to think 2017 was an anomaly, that we won’t have a replay of such extreme events. However, 2017 may only be a precursor of what’s to come. Even so, the insurance industry is poised to handle events like these better than ever before because there’s now a wealth of expert data and models. That’s a good thing, and it energizes me! Data quality and modeling are improving all the time, with more accuracy, better science and higher resolution, as we can attest from working with providers like NOAA, USGS, KatRisk, JBA, RedZone, Swiss Re, Impact Forecasting and HazardHub. But with all this data choice comes challenge, and a call for ways to interpret information more efficiently. We know it’s possible because we see our insurance clients succeeding every day when it comes to accessing, analyzing and interpreting data within SpatialKey. While late 2017 was exhausting and overwhelming, I’m inspired to see so much data come to life in platforms like ours at SpatialKey, and energized to see how empowering it is for the people using it.
Insurers, don’t try to solve this problem alone
The solution is collaboration: partnering with experts who have technology purpose-built to consume data quickly and produce intelligence that insurers can readily act on. I’m not advocating collaboration simply because I’m at the helm of a company that fills the data gap. I saw a lot of pain at RAA in the faces of my insurance friends, and, quite honestly, there’s a simple way to solve this.
Processing information is a basic need that has become incredibly complex and time-consuming for insurers.
This can be easily outsourced, so insurance professionals can go about analyzing, managing and mitigating risk. Insurers have an opportunity right now to empower underwriters with the intelligence they need to help prevent losses on the scale of 2017 from happening again, and to understand that data without complex GIS solutions. Start now. Your shareholders will thank you later.