
Analysing footprints in the flood

By [email protected] - 21st August 2018 - 15:28

This article describes an assignment conducted by Ambiental as part of an ambitious initiative to develop an environmental decision support dashboard. The nine-month project, now underway and due for completion this November, is focused on developing a flood data delivery system for London. The article incorporates field reports from a real visit to the metropolis to witness an imaginary flood, and to generate data from the scene as a way of providing baseline testing of a live data system.

The reasoning behind such an unusual field trip lay in the utility of Big Data, and more specifically social media, for supporting flood incident management decisions. The purpose of the visit was to support an AI research component of the core project objective: deploying Ambiental’s FloodWatch® flood forecasting solution to cover Greater London. This software predicts flood events by continuously processing Met Office rainfall forecast data feeds, and can model entire catchments by calculating hydrological flows and then hydraulically modelling flood evolution.
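To make that two-stage modelling idea concrete, here is a minimal, purely illustrative Python sketch. FloodWatch® is proprietary and its internals are not public, so every name, coefficient and formula below is a hypothetical stand-in: a textbook rational-method runoff estimate followed by a crude continuity-based depth calculation.

```python
# Illustrative sketch only: hypothetical stand-ins for the pipeline described
# above (rainfall feed -> hydrological flow -> hydraulic flood depth).
from dataclasses import dataclass

@dataclass
class RainfallForecast:
    catchment_id: str
    rainfall_mm_per_hr: float  # forecast rainfall intensity
    duration_hr: float         # forecast duration

def hydrological_flow(forecast: RainfallForecast, runoff_coeff: float = 0.6,
                      area_km2: float = 10.0) -> float:
    """Crude rational-method estimate of peak runoff, Q = C * i * A (m^3/s)."""
    intensity_m_per_s = forecast.rainfall_mm_per_hr / 1000 / 3600  # mm/hr -> m/s
    return runoff_coeff * intensity_m_per_s * area_km2 * 1e6       # km^2 -> m^2

def hydraulic_depth(peak_flow_m3s: float, channel_width_m: float = 20.0,
                    velocity_m_s: float = 1.5) -> float:
    """Very rough flood depth from continuity: depth = Q / (width * velocity)."""
    return peak_flow_m3s / (channel_width_m * velocity_m_s)

# Example: a heavy convective downpour over a small urban catchment.
forecast = RainfallForecast("hypothetical_catchment", rainfall_mm_per_hr=30.0,
                            duration_hr=2.0)
q_peak = hydrological_flow(forecast)
print(f"Peak flow ~{q_peak:.0f} m3/s, flood depth ~{hydraulic_depth(q_peak):.2f} m")
```

A real forecasting system would replace both toy functions with calibrated hydrological and hydraulic models driven by live Met Office feeds.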

Smarter flood risk detection

Development of this new data delivery platform is funded through the UK Space Agency’s (UKSA) Space for Smarter Government Programme (SSGP). In line with the goals of the latter, the proposed solution is intended to deliver geospatial insights that benefit UK Government by encouraging the adoption of emerging technologies and showcasing the capabilities of the UK space sector. The overall aim of the programme is to increase public sector uptake of satellite and Earth Observation (EO) data through integrated digital services.

The EnviroTracker™ smart city dashboard being developed within this project is designed to help city authorities manage and visualise high resolution flood forecasting footprints. The system is being configured for users across multiple government agencies, with project stakeholders including Transport for London and the Greater London Authority.

Sophisticated early warning systems for river and coastal flooding already exist in the UK, but precise warnings for surface water flooding have not been widely available until now. The reason is that accurately predicting pluvial (rainfall-generated) flooding is notoriously difficult, as it can occur anywhere. Through a UKSA International Partnership Programme to develop Earth and Sea Observation Systems (EASOS), Ambiental has created a new flash flood forecasting system that has been deployed in Malaysia, and now in London, to prove the system in a different setting.

Risky business

According to the Environment Agency, an estimated 140,000 Londoners are at high risk and 230,000 at medium risk of flooding across the capital. Reports from London Underground state that 85 stations are at significant or high risk of flooding, a figure that represents almost a third of all stations. London Bridge has the highest risk, followed by King’s Cross and Waterloo. The annual cost of flooding at just these three stations is estimated at £1.2 million.

Flood risk was probably far from the minds of everyone else on the author’s train as it rolled into an unseasonably hot central London in late June of this year. But it was certainly at the front of his. The ‘mission’ was to gather field intelligence and, specifically, to determine the extent to which capturing temporal and spatial records of flood-related information might provide value in emergency flood situations.

From Victoria Station, the author sent several tweets relating to where he was located and what he was supposedly witnessing. In this imaginary scenario, the tweets described wading through flood water and reporting the prevailing chaos from various parts of the city.

The motivation behind this work was a desire to tackle the serious problem of sudden flash flooding, which can severely impact cities such as London with little or no warning. Ambiental’s modelling approach involves the fusion of quality data inputs to drive better predictions which, in turn, lead to improved warnings, enhanced situational awareness, and more effective response capabilities. The EnviroTracker™ system being developed is delivered through a front-end that employs Hexagon Geospatial SMART M.App technology.

Going with the flow

Back at ‘HQ’ in Brighton, Ambiental’s front-end operator, Elena Puch, was monitoring the ‘live’ data stream harvested from Twitter. The author’s tweets were quickly received by Ambiental’s systems, and his location was easily and accurately determined from shared GPS coordinates. Where he instead specified his whereabouts within the text of a tweet, natural language processing techniques extracted the place name, which was then geocoded using Bing. From Victoria, he headed east on foot, generating a stream of flood reports.
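The two-path location logic described above can be sketched in a few lines of Python. The gazetteer, tweet structure and coordinates here are illustrative assumptions; a production system would use proper named-entity recognition and a full geocoding service such as Bing rather than a hard-coded lookup.

```python
# Hedged sketch: prefer shared GPS coordinates, otherwise fall back to
# matching place names in the tweet text against a tiny illustrative gazetteer.
from typing import Optional, Tuple

GAZETTEER = {  # place name -> (lat, lon); invented sample entries
    "victoria station": (51.4952, -0.1441),
    "vauxhall": (51.4861, -0.1253),
    "south bank": (51.5060, -0.1160),
}

def locate_tweet(text: str, gps: Optional[Tuple[float, float]] = None
                 ) -> Optional[Tuple[float, float]]:
    """Return (lat, lon) for a flood report, preferring device GPS."""
    if gps is not None:
        return gps  # shared coordinates are the most accurate source
    lowered = text.lower()
    for place, coords in GAZETTEER.items():
        if place in lowered:  # naive stand-in for real NLP entity extraction
            return coords
    return None  # no usable location signal in this tweet

print(locate_tweet("Knee-deep water outside Victoria Station!"))
print(locate_tweet("Flooding everywhere", gps=(51.4952, -0.1441)))
```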

The EnviroTracker™ AI component is being developed with input from Ambiental’s research collaborators, Hydas, at Dundee University. The software module captures and classifies flood-related data using AI algorithms that interpret meaning and pinpoint location. The field visit was designed to test the system’s limits and assess its ability to separate useful information from background chatter. This included filtering out potentially misleading posts, whether from people not directly impacted by the flood or from those posting on completely unrelated matters whose tweets happen to contain words commonly associated with flooding.
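As a hedged illustration of this kind of triage, the sketch below trains a small scikit-learn text classifier to separate plausible flood reports from ‘flood’-flavoured noise. The training examples and labels are invented; Ambiental’s actual algorithms are not public, and a real system would need far more data and careful handling of sarcasm, retweets and duplicates.

```python
# Minimal tweet-triage sketch using an off-the-shelf scikit-learn pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = [
    "Road under water outside the station, avoid the area",
    "Basement flooded, water still rising fast",
    "Platform closed due to flash flooding",
    "Flooded with emails after the long weekend",
    "That goal was a flood of pure joy",
    "Sunny afternoon walk along the river",
]
train_labels = [1, 1, 1, 0, 0, 0]  # 1 = plausible flood report, 0 = noise

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(train_texts, train_labels)

for tweet in ["Water pouring into the underpass at Vauxhall",
              "My inbox is a flood of spam today"]:
    label = model.predict([tweet])[0]
    print(f"{'FLOOD' if label else 'noise'}: {tweet}")
```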

Machine learning, a subset of AI, can be used to categorise unstructured data. The London exercise contributed towards an assessment of such techniques for interpreting flood-related data, yielding results that can improve the value of subsequent model iterations.

Decoding a torrent of tweets

As the author headed across the city, he tweeted flood details in a multitude of ways. To avoid any panic or confusion arising from this entirely fictional ‘live from the scene’ incident reporting, he used a Twitter handle with no followers that was labelled as a test account. In parallel, the tweets were replicated in Brighton, where Elena operated a second test account and populated Twitter with even more potentially useful data, as well as generating background ‘noise’ for the processing systems to churn through.

As the author progressed further on his journey, he identified safe refuges from the floodwaters in order to confound the system, effectively registering them as false positives. GPS locational accuracy from his smartphone was generally within several metres of his actual location, demonstrating that accurate geotagging of flood-affected locations is possible. However, it seems reasonable to suppose that, in reality, flood-affected Twitter users would generally tweet about their experiences only once safely away from any flood hazard, adding further complexity to the inherent uncertainties associated with processing such data.
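That accuracy can be checked with a standard great-circle (haversine) calculation between the GPS fix attached to a tweet and a known ground-truth point; the coordinates below are illustrative only.

```python
# Great-circle distance between a reported GPS fix and a ground-truth point.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Distance in metres between two WGS84 points (haversine formula)."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * r * asin(sqrt(a))

reported = (51.49521, -0.14412)  # GPS fix attached to the tweet (illustrative)
actual = (51.49518, -0.14405)    # where the observer actually stood
print(f"GPS error: {haversine_m(*reported, *actual):.1f} m")  # a few metres
```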

Despite these limitations and many complex challenges, flood science recognises that accurate model validation data can be hard to come by; tools such as this, which capture and interpret the digital signature of flood events, can therefore assist with determining flood event footprints.
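One simple way such point reports could be turned into an event footprint, sketched here purely as an illustration of the idea rather than as Ambiental’s actual method, is to snap geotagged reports onto a coarse grid and keep only cells corroborated by multiple reports. The cell size and threshold are illustrative choices.

```python
# Hedged sketch: aggregate scattered geotagged reports into a coarse footprint.
from collections import Counter
from typing import List, Tuple

def flood_footprint(reports: List[Tuple[float, float]],
                    cell_deg: float = 0.005,   # ~500 m cells at London's latitude
                    min_reports: int = 2) -> List[Tuple[float, float]]:
    """Return centres of grid cells corroborated by >= min_reports reports."""
    cells = Counter((round(lat / cell_deg), round(lon / cell_deg))
                    for lat, lon in reports)
    return [(i * cell_deg, j * cell_deg)
            for (i, j), n in cells.items() if n >= min_reports]

reports = [(51.4952, -0.1441), (51.4950, -0.1439),  # two reports, same cell
           (51.5060, -0.1160)]                      # an uncorroborated outlier
print(flood_footprint(reports))  # only the corroborated cell survives
```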

The study supports the view that live social media streams hold value when incorporated within flood modelling dashboard systems.

AI can potentially deliver improvements to actionable intelligence systems for governments, insurers and infrastructure managers, not just in London but worldwide. Future studies are likely to explore such approaches in more isolated settings, where limited digital connectivity may constrain system performance. Even so, the potential for safeguarding lives would be substantial.

Preparing for the worst

The author’s journey through London, as recorded on Twitter, described incidents of flooded roads, damaged property, and the closure of various stations. As he crossed the Thames at Vauxhall, he documented a torrential river flow. A network of defences already safeguards London from fluvial and tidal flooding, but after several hours walking and tweeting through the hot city streets, he was starting to convince himself that it really was flooding!

As he headed north along the South Bank and gazed across to the Houses of Parliament, he concluded his journey by tweeting ‘Is the Government prepared for flooding in London?’ After all, past thunderstorms have generated localised rapid-onset flooding in the city, and climate change predictions of more frequent, higher-intensity storms are likely to push the city’s drainage infrastructure to its limits.

Applying AI to interpret Big Data holds huge potential for making sense of the unstructured streams of data generated by a swarm of human sensors. Unlocking the full potential of this technology will likely require the proactive mobilisation of individuals to ensure that data is captured safely and rigorously.

Dedicated hashtags and specialised phone apps are expected to enhance data capture further. But the simple use of common technology is already demonstrably saving lives around the world and will become increasingly important in dealing with mounting environmental risks in the digital age.

The Twitter account used for testing was @EnviroTrackerPD.

Paul Drury is Product Manager at Ambiental, headquartered in Brighton, Sussex (www.ambientalrisk.com). He can be contacted at [email protected]
