Alex Saunders, a Ph.D. candidate at the University of Arizona’s School of Geography, Development & Environment, led the research, which illustrates just how consequential data selection can be for the effectiveness of flood insurance. Published in the journal Earth’s Future, the study tested a hypothetical flood insurance program in Bangladesh, assessing how different data sources affect insurance payouts for catastrophic flooding events.
The study scrutinized real-world flood data from five distinct sources: rainfall records, river levels, and flood maps from the national flood agency, plus data from two different satellites. This information was used to calculate water coverage during the monsoon seasons from 2004 to 2023. By studying these variables, the researchers aimed to make flood insurance programs more consistent and to provide better financial protection for people living in flood-prone areas.
Methodology and Findings
The researchers calibrated five different modeling approaches to determine when insurance payouts would have been triggered over the two-decade period, measuring not only how quickly those payouts occurred but also how reliable and predictable they were. One satellite-based approach relied on existing surface water observations; a second used AI models trained specifically to track patterns of monsoon flooding across Bangladesh.
This AI-powered satellite model proved especially advantageous because it could identify floods even through persistent cloud cover. It reduced the variability around expected payouts by more than 20%, which in turn makes insurance more affordable by lowering what consumers pay relative to the expected payouts.
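To make the idea of an index-based trigger concrete, here is a minimal sketch, not the study’s actual models: the index values, threshold, and payout amounts below are hypothetical, but it shows how the same payout rule applied to two different data sources can trigger in different seasons, which is the kind of difference in timing and reliability the researchers measured.

```python
import statistics

# Hypothetical flooded-area index (fraction of region inundated) for ten monsoon
# seasons, as seen by two data sources. Values are invented for illustration only.
cloud_limited_index = [0.12, 0.35, 0.08, 0.10, 0.22, 0.30, 0.05, 0.38, 0.27, 0.26]
all_weather_index   = [0.14, 0.37, 0.10, 0.40, 0.25, 0.31, 0.07, 0.39, 0.28, 0.34]

TRIGGER = 0.30  # index value at or above which a payout is triggered (assumed)
PAYOUT = 100.0  # fixed payout per triggering season, in arbitrary units

def payouts(index_series, trigger=TRIGGER, payout=PAYOUT):
    """Return the per-season payouts implied by one index series."""
    return [payout if value >= trigger else 0.0 for value in index_series]

for name, series in [("cloud-limited", cloud_limited_index),
                     ("all-weather", all_weather_index)]:
    p = payouts(series)
    print(f"{name:14s} seasons triggered={sum(x > 0 for x in p)} "
          f"mean payout={statistics.mean(p):5.1f} stdev={statistics.pstdev(p):5.1f}")
```

In a real program, a data source that misses floods hidden by cloud cover would fail to pay out when losses actually occur, which is why a satellite index that can see through clouds reduces the uncertainty around payouts.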
“Insurance providers may be guided in their decision making based on what data is most easily available, but there are a plethora of choices and more types of data through means like satellite sensors,” – Alex Saunders.
The study highlighted the value of drawing on multiple data sources. Saunders suggested that index-based flood insurance programs conduct extensive data exploration before rolling out coverage. By adopting several indexes, insurance companies can lower the risk of missed or improper payouts.
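One rough way to picture combining several indexes, sketched below with hypothetical values and thresholds rather than any method from the study, is to trigger a payout only when a majority of independent indexes agree that flooding occurred.

```python
def multi_index_payout(index_values, thresholds, payout=100.0, required=2):
    """Pay out only when at least `required` indexes breach their thresholds."""
    breaches = sum(v >= t for v, t in zip(index_values, thresholds))
    return payout if breaches >= required else 0.0

# One season as reported by three hypothetical indexes (e.g. rainfall, river
# level, satellite-derived flood extent); values and thresholds are illustrative.
season_readings = [0.34, 0.28, 0.36]
season_thresholds = [0.30, 0.30, 0.30]
print(multi_index_payout(season_readings, season_thresholds))  # 100.0: 2 of 3 agree
```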
“But just because a specific data set is easily available or based on the newest technology, doesn’t mean it’s the right one for a given scenario. People should consider and compare the full range of available information to look for the best solutions,” – Alex Saunders.
Implications for Global Flood Insurance
Whether the goal is anticipating future flooding or recovering after flood events, the study’s conclusions could matter for residents of flood-prone areas all over the globe. Yet between 2000 and 2023, just 16% of the $1.77 trillion in global economic flood losses was insured, a statistic that highlights a glaring need for better flood insurance coverage built on the most relevant and effective data sources.
Kevin Anchukaitis, Andrew Bennett, Beth Tellman, and Jonathan Giezendanner joined Saunders on the research. Their combined efforts underscore how important it is for insurers to adapt to changing flood patterns, driven largely by climate change, in order to provide meaningful financial protection for affected communities.
“More and more floods are happening every year, and that brings with it an increase in the total damage they cause,” – Alex Saunders.
This research points to the critical need for careful consideration of available data sources, which can help insurance companies respond to disasters more effectively and assist those most in need of protection.