With extreme weather events challenging the resilience of utility grids across the globe, Christian Wirth explores the crucial role of geospatial data in keeping network outages to a minimum
The electrical utility industry is facing mounting environmental change and operational challenges. This is clear from recent extreme weather events such as Storms Christoph and Aiden, which brought waves large enough to cause 33 shipping containers to be lost overboard off northern Scotland.
The UK’s Met Office now estimates that heavy rainfall and floods are seven times more likely as a result of human-induced climate change, while 2019 saw new national summer and winter maximum temperatures set amid a rise of 1.1 degrees Celsius above the 1961-1990 long-term average. Indeed, a significantly higher prevalence of extreme weather events has been recorded globally this century, posing a growing challenge to utility grid resilience and service reliability worldwide.
Gaining visibility and oversight
As the hazards facing utility grids and their consequences have multiplied, maintaining visibility over them has become correspondingly more difficult. The decentralisation of energy generation and information, driven by large-scale renewable energy investment from government and industry, has made it increasingly hard for energy firms to build a single, accurate geospatial view of their ever more complex network assets.
The corresponding rapid growth of smart grids, which use sensors and field workers’ mobile devices to generate crucial utility data at the ‘edge’ of the network, has only reinforced this trend. It can easily translate into an ever-growing backlog of inaccurate information held in legacy, centralised Geographic Information Systems (GIS), or even on paper maps. Siloed applications often give field technicians contradictory or inaccurate data, reducing process efficiency and creating backlogs of as-built data collected in the field.
Even worse, applications and data are often not interoperable, so it is difficult to get a real-time picture of network assets. This makes it hard for grid operators to track degradation or damage caused by natural disasters or man-made changes, and to proactively gauge the potential for further damage. In the case of gas or water utilities, for example, ground movement caused by temperature changes and shifting soil moisture can cause pipes to burst.
End-to-end data for end-to-end efficiency
Solutions of the kind described in our case study (see boxout) deliver an accurate bird’s-eye view of the entire network. They close the data gap between field and office and ensure that designers and on-site engineers can communicate seamlessly and clearly. By increasing productivity and safety, such solutions allow network operators to be both proactive in planning for resilience and faster to react when disaster strikes. Teams can manage, oversee and, crucially, revisit projects when necessary, knowing that they have a full ‘audit trail’ of past activities.
Accessible geospatial data allows field crews to continuously correct or update network information for all stakeholders. While many utility companies face a growing backlog of field as-built information, this approach lets users build disaster resilience over time through incremental improvements to damage response. Whether facing local or national threats, they can rise to the challenge by creating an accurate, comprehensive and accessible view of the network.
Foundations for the future
Beyond the practical application of open and accessible network data to disaster response and resilience, there is another key driver behind the need for a centralised, collaborative approach to utility grid management: new service expectations arising from distributed energy generation have fundamentally shifted the role electrical networks must play if future infrastructure demands are to be met.
Advances such as smart metering and Internet of Things technology, along with the proliferation of electric vehicles and solar power generation, are quickly making optimised network planning and capacity management a business imperative for future smart cities. For example, given the skyrocketing volume of data being produced, collected and analysed by electric vehicles, there is an urgent need to integrate meter data, network operations and population demographics in order to plan local charging stations without causing disruption or congestion.
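To make this kind of data integration concrete, the sketch below shows, in simplified form, how meter data, network capacity records and a demographic proxy might be combined to rank candidate districts for charging stations. All data, district names and field names here are hypothetical illustrations, not any specific utility's or vendor's system.

```python
# Illustrative sketch (hypothetical data): combining smart meter load
# readings, network capacity records and a simple demand indicator to
# rank candidate districts for EV charging stations. A real deployment
# would draw on GIS layers, network models and live demographics.

# Average evening load per substation (kW), from smart meter data
meter_load = {"north": 310.0, "centre": 480.0, "south": 150.0}

# Substation capacity (kW), from network operations records
capacity = {"north": 500.0, "centre": 520.0, "south": 400.0}

# Registered electric vehicles per district (demographic proxy)
ev_count = {"north": 120, "centre": 90, "south": 200}

def rank_charging_sites(load, cap, evs, charger_kw=50.0):
    """Rank districts by EV demand per unit of spare network capacity,
    keeping only districts with headroom for at least one charger."""
    scores = []
    for district in load:
        spare = cap[district] - load[district]
        if spare >= charger_kw:  # enough headroom for a charger
            scores.append((district, evs[district] / spare))
    # Highest demand-per-spare-kW first: these districts serve the most
    # EVs relative to the grid reinforcement they would require.
    return sorted(scores, key=lambda s: s[1], reverse=True)

ranking = rank_charging_sites(meter_load, capacity, ev_count)
print(ranking)  # "centre" is excluded: only 40 kW of headroom
```

The point of the sketch is simply that none of the three datasets answers the siting question alone; the ranking only emerges once meter, network and demographic data are brought together in one view.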
As cities and communities are transformed by these technologies, increasingly complex network infrastructures demand better data quality from electrical utility companies. The first step in adapting to this change must be to automate the all-too-prevalent manual processes of centralised legacy GIS. Once real-time data can be reliably captured and visualised in this way, it can be harnessed to improve priorities such as safety and customer service, enabling utility companies to meet consumers’ rising demands for instant access and support.
It is estimated that the number of connected devices globally will reach 75 billion by the year 2025, and it is therefore crucial that utility providers respond quickly to this rapid move towards distributed energy networks. Whether it be to safeguard critical infrastructure against damage from environmental threats such as extreme weather events, streamline collaboration and communication between operational and field teams or respond to transformational changes in energy networks globally, the key to the effective running of future utility grids will be decentralising and democratising the process of capturing and managing network data.
Christian Wirth is General Manager, Europe, at Cambridge-based IQGeo (https://www.iqgeo.com), a supplier of geospatial software solutions to telecommunication and utility network operators worldwide
Case study - Harmonising and decentralising network data access
One need only look to other countries with decades of experience of extreme weather events to see how geospatial strategies have helped increase response efficiency and promote resilient and reliable grids.
For instance, Japanese power giant TEPCO has addressed frequent storm and typhoon damage by harmonising and decentralising network data access, providing end-to-end visibility of vital network data to everyone, in both the office and the field, so they can react quickly and efficiently.
TEPCO deployed a decentralised, mobile-friendly platform which could easily be accessed and updated by field workers, creating a comprehensive and accurate view of not only the grid damage, but potential future hazards. When Typhoon Faxai caused extensive damage to Tokyo’s utility network in September 2019, the system enabled central managers and field crews to rapidly view and act upon mission-critical network information such as blackout locations and affected areas.
This system, which functions much like the familiar Google Maps technology, helped technicians and construction teams identify and reach unfamiliar locations to restore power promptly. It supported both online and offline working and enabled technicians to instantly call up comprehensive details of the condition and position of nearby assets, accelerating repair rates and facilitating a targeted approach that minimised downtime.
This case study stands as a timely and valuable example for UK utility companies of the value of leveraging open geospatial data to enhance grid visibility and, in doing so, improve operational resilience and disaster recovery. As extreme weather events continue to increase in frequency, it is ever more important that geospatial data be made accessible to those on the ground.