Close to Home: Fighting wildfires with better science

The models currently used to forecast wildfires have not kept up with today’s conditions.

The views and opinions expressed in this commentary are those of the author and don’t necessarily reflect The Press Democrat editorial board’s perspective. The opinion and news sections operate separately and independently of one another.

Nine of the 10 largest wildfires in California history have occurred over the past decade, and five of those took place in 2020 alone.

The threat of wildfire is rising because climate change has made California hotter and drier, with more frequent extreme weather events such as drought, heat waves and strong winds. Meanwhile, California’s wildlands are laden with fuel from millions of trees killed by drought and insects — perfect conditions for catastrophic wildfires.

To address this risk, Californians must come together to create better wildfire models and adopt a comprehensive, science-based approach to mitigating wildfire risk.

If we commit to using this technology, California can address the many critical issues surrounding wildfire, including evacuation decisions, firefighting tactics, power grid resilience, wildland management and insurance availability.

Success depends on effective modeling of wildfire behavior and risk.

Unfortunately, the models currently used to forecast wildfires have not kept up with today’s conditions. These models, developed decades ago, assume that wildfires spread as a thin line of fire along the forest floor, consuming a light layer of needles and twigs — the type of fire, in other words, that evolved as a natural part of California’s ecosystems.

But because of climate change, today’s fires burn hotter, spread more quickly and grow larger, behaving in ways the old models aren’t equipped to handle.

We ask a lot of wildfire models, because we need them to protect lives and property. If a fire is burning now, we want to know how quickly and in what directions it will spread. We want to know where new fires are most likely to ignite. And we want to know how wildfire risks will change in the years to come.

To gather this knowledge, we need foresters to quantify how much fuel is in forests; physicists to determine what causes a smoldering log to burst into flame and translate this information into mathematical equations; atmospheric scientists to assess what types of weather conditions are linked to fast-moving fires; and computer scientists to take all of this data and build models to more accurately forecast risk.

I’m the principal investigator of the Pyregence project, a consortium of scientists from universities, government agencies and private industry who are tackling all of these tasks. The research is funded largely by a grant from the California Energy Commission.

A beta version of our near-term risk-forecasting tool is available now on a web-based platform, the first time such a powerful resource has been made broadly available to the public. It shows the locations of actively burning fires and forecasts where they are likely to spread over the next few days, much like a hurricane track forecast. It also identifies areas where conditions are right for possible future fires to ignite and spread.

We’re also creating new models to project wildfire risk into the future — from today to the end of this century — and help planners and policymakers make strategic plans for adapting to and mitigating wildfire risk under a changing climate. Relying on these models, electric utilities can design a more resilient grid, government officials can improve land use and development policies, fire management agencies can better target fuel reduction treatment, and insurance companies can more accurately determine a community’s risk profile.

If there’s one thing I’ve learned from working with wildfire modeling, it’s humility in the face of the threat wildfires pose and the complexity of reducing that risk. Guided by better science, and working together, we can make California safer for all its residents.

David Saah is the principal investigator of the Pyregence Consortium, managing principal of Spatial Informatics Group and professor and director of the geospatial analysis lab at the University of San Francisco.

You can send letters to the editor to letters@pressdemocrat.com.
