In 2018, there were over 6,000 wildfires in California alone, destroying well over 10,000 structures, producing over $9 billion in property damage and claiming the lives of at least 88 people. Horrific. But there’s hope. More than 95% of the wildfires in California are controlled within the first 24 hours and are contained to 10 acres or less. In other words, fewer than 5% of wildfires grow out of control and move into day 2, producing most of the devastation. And improvements in technology can help us tackle that 5%.
I say that because I had the privilege last week of attending a workshop hosted by the Gordon and Betty Moore Foundation in collaboration with the University of Maryland. (Yes, that Gordon Moore, co-founder of Intel and author of what has come to be known as Moore's law.) The workshop pulled together 28 of the smartest people I have ever met, all committed to curbing the impact of wildfire, not only in California but across the globe.
There are lots of reasons why fires get out of control, and it turns out the biggest is a lack of real-time information. The best technology available to the commanders making strategic decisions and to the firefighters in the field gives them information that is usually six to 12 hours old. Fire can do a lot in six to 12 hours.
The workshop identified a number of technologies that could greatly improve the information flow, three of which stood out to me: offerings from Planet Labs, Jupiter Intelligence and Descartes Labs.
Planet Labs has deployed a string-of-pearls constellation of shoe-box-sized satellites that images the entire globe every 24 hours. The imagery has a resolution of about three meters (you won't be able to pick out a person, but anything the size of a pickup truck or bigger will show up) and will let firefighters watch for danger.
Jupiter Intelligence provides a supercomputing platform that models potential hazards from one hour to 50 years out and could be used to monitor fires in near real time.
Descartes Labs fuses data from many different sources, including sensors and imagery, and refines that data so models can spot and monitor potential problems as well as actual fires.
The folks at the workshop are committed to pulling together these and other technologies to better understand the factors (weather, terrain, vegetation, humidity, wind, etc.) that signal the conditions for an uncontrollable wildfire. The group will also integrate technologies and other capabilities to provide firefighters with real-time data about a fire and its movement.
We have written many times in this commentary about how technology is moving insurance and risk management into a predictive and (we hope) preventive role, rather than simply helping us get better at responding after losses occur. That change will need to extend beyond technological improvements. I learned last week, for instance, that we are probably a decade behind on controlled burns. We should be control-burning about 2 million acres per year to limit wildfires, but we have been burning only about 100,000 acres a year for a long while. No wonder there is so much fuel to feed those wildfires. (Some environmentalists may push back on the need for controlled burning, but the recent wildfires seem to have created a new level of realism all around. Unless we're going to unleash tens of thousands of goats on our forests to eat the brush, controlled burns are the only realistic option.)
As always, we in the insurance industry can help. While the data we use for making decisions about risk and pricing may not be the same as what authorities need to prevent or fight fires, there is significant overlap, and we must all work together in this time of acute need.
I’d love to hear any thoughts about what the Moore/Maryland group or we as an industry should be doing in the face of the wildfire threat.
Wayne Allen
CEO