Quantifying uncertainty in flood modelling

Image of flooding across fields and vegetation

From Pakistan to Australia, Africa, Europe and the UK, flooding cuts a calamitous swathe through communities around the globe.

It is the most prevalent natural hazard, posing potentially fatal risks to people and threatening damage to property worldwide.

And as the climate emergency intensifies, assessing changes to flood hazards becomes more and more crucial, particularly as flat land and waterside locations become increasingly attractive to developers as desirable places on which to build.

For Edinburgh University Professor Lindsay Beevers, Chair of Environmental Engineering and Head of Research Institute, it is a core element of her work as she focuses on developing models to understand and quantify hydrological extremes, how they evolve and how they impact on society and the environment.

Prof Beevers, who joined the University in January 2022, had already looked at the resilience of cities to floods and droughts during an earlier fellowship from the Engineering and Physical Sciences Research Council. Her recent research project, Tailoring Novel Uncertainty Quantification Methods For The Flood Modelling Industry, followed on from that work.

And in her view, as climate change accelerates, alongside the race to net zero, governments across the globe must develop national adaptation strategies in the context of climate uncertainty. A degree of climate change is already baked in: it is already happening, and we are going to see more frequent and bigger events, particularly hydrological extremes such as floods and droughts.

To understand what the footprint of a potential flood event may look like, she runs numerical models of how water might flow overland when it overtops a river, or when sub-surface drainage becomes overwhelmed. But these models are complex, and a single simulation of a single flood event takes an inordinate amount of time and effort: a one-in-ten-year or one-in-100-year event might take several hours to run.

A line in the sand

Traditionally, when developers or regional planners assess which areas are at risk from flooding, a flood model is run to give a flood footprint. That has always been done by running the model once, particularly in light of how long a simulation takes. The result is a line in the sand: one side of the line will be flooded, the other will not. But to understand what might happen in the future in a more robust manner requires a range of probabilities, because uncertainties in the estimates cascade through the modelling process.
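The contrast between the two approaches can be sketched in a few lines. The snippet below uses a deliberately toy flood-depth function (not any real hydraulic model) and hypothetical input distributions: a single deterministic run gives one number, while Monte Carlo sampling of the uncertain inputs turns the same question into a probability of flooding.

```python
import random
import statistics

def toy_flood_depth(rainfall_mm, roughness):
    """Hypothetical stand-in for an expensive hydraulic simulation:
    peak flood depth (m) grows with rainfall and channel roughness."""
    return 0.01 * rainfall_mm * (1.0 + roughness)

# Deterministic "line in the sand": one run with best-guess inputs.
deterministic = toy_flood_depth(120.0, 0.35)

# Probabilistic alternative: propagate input uncertainty with Monte Carlo.
random.seed(42)
depths = [
    toy_flood_depth(random.gauss(120.0, 20.0), random.uniform(0.25, 0.45))
    for _ in range(10_000)
]

threshold = 1.5  # depth (m) above which a site is considered flooded
p_flood = sum(d > threshold for d in depths) / len(depths)
print(f"single run: {deterministic:.2f} m")
print(f"mean depth: {statistics.mean(depths):.2f} m; "
      f"P(depth > {threshold} m) = {p_flood:.2f}")
```

The catch, as the article notes, is that each call to the real model takes hours rather than microseconds, which is why thousands of Monte Carlo runs are normally out of reach.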

Capturing uncertainty is routine in many domains that rely on numerical models. Weather forecasting, for example, now uses a probabilistic approach, with the chance of rain in a given hour communicated to the public through standard mobile apps. That shift in how the population interacts with weather forecasts opens other domains to the same scrutiny: if weather forecasting is geared toward a probabilistic approach, shouldn't the same be true of flood modelling? Prof Beevers believes we need to move away from the line-in-the-sand approach and quantify the uncertainty in projections.

To that end, she has become the UK's first academic to research numerical methods that reduce the time cost of uncertainty quantification in this field, opening up routine probabilistic approaches to the industry. Her project is about recognising that a single deterministic model run cannot capture the full picture, and that the science needs to move to a probabilistic approach.

Artificial Intelligence

One way of addressing this was to take a machine learning approach to reduce the computational burden of running thousands of model simulations.

Since some of these models take a considerable time to run – and thousands of runs are needed to help understand the potential future – she and her team looked at different machine learning approaches, fitting proxy models to reduce simulation times. The result sped up the operation by 99.5%, making the process up to 180 times quicker.
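The proxy-model idea can be illustrated with a minimal sketch. This is not the team's actual method: the "expensive" model below is a hypothetical one-variable function whose cost is faked with a short sleep, and the surrogate is simple piecewise-linear interpolation rather than a trained machine learning emulator, but the workflow is the same: run the slow model a handful of times, fit a cheap stand-in, then do the thousands of Monte Carlo evaluations on the stand-in.

```python
import bisect
import random
import time

def expensive_flood_model(rainfall_mm):
    """Hypothetical stand-in for a hydraulic simulation that takes hours;
    here the cost is faked with a short sleep."""
    time.sleep(0.001)
    return 0.0012 * rainfall_mm ** 1.1  # illustrative peak depth (m)

# Step 1: a small design of experiments -- run the slow model a few times.
train_x = [60.0 + 20.0 * i for i in range(8)]           # rainfall totals (mm)
train_y = [expensive_flood_model(x) for x in train_x]   # simulated depths (m)

def surrogate(rainfall_mm):
    """Cheap emulator: piecewise-linear interpolation of the training runs."""
    i = min(max(bisect.bisect_left(train_x, rainfall_mm), 1), len(train_x) - 1)
    x0, x1 = train_x[i - 1], train_x[i]
    y0, y1 = train_y[i - 1], train_y[i]
    return y0 + (y1 - y0) * (rainfall_mm - x0) / (x1 - x0)

# Step 2: thousands of Monte Carlo evaluations now cost almost nothing.
random.seed(1)
samples = [surrogate(random.gauss(130.0, 25.0)) for _ in range(10_000)]
p_exceed = sum(d > 0.25 for d in samples) / len(samples)
print(f"P(depth > 0.25 m) = {p_exceed:.2f}")
```

In practice the team used machine learning proxies rather than interpolation, and real flood models depend on many inputs, not one, but the trade is identical: a handful of expensive runs buys an emulator fast enough to make probabilistic analysis routine.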

However, the real key was to work with industry so that any approach was tailored to its needs. Prof Beevers and her colleagues teamed up with Edinburgh-based Kaya Consulting to look at potential flood risk assessments for developers, actively producing assessments and running uncertainty analyses to test the transferability of the approaches.

Having shown that these approaches hold promise, the next step is to push policy-makers and regulators towards the probabilistic approach and to build uncertainty into the assessment of flood risk.

Better decision-making

“Doing so would put us in a better position to make better decisions,” explains Prof Beevers. “At the moment we are making decisions that are potentially over- or under-estimates. If we properly quantify uncertainty and cascade that through to adaptation decisions, we will make better decisions, building urban resilience to future hydrological extremes.”

"The challenge now lies in exploring the transferability and robustness of algorithms. To do this we need to work in partnership with Industry and the broader community to ensure that we deliver policy and behaviour change, alongside the enabling technology."

Related links

Kaya Consulting

Institute for Infrastructure and Environment