Access to algorithms

By Eli Tamanaha - 20th December 2019

Access to satellite data is increasingly being democratised. What was once the property of wealthy corporations and governments with space agencies is now being used by small teams and individuals to assist urban planning, maximise crop yields with precision agriculture, and even inform stock picks for hedge fund managers.

The non-profit sector has also benefited from wider access to aerial imagery and satellite data. Earlier this year, imagery from declassified Cold War satellites enabled scientists to measure the rate at which Himalayan glaciers are melting, while researchers from the University of California in the US combined GPS and satellite technology to track migratory elk populations responding to climate change.

Online tools such as Sentinel Hub EO Browser, which provides imagery in 13 spectral bands from the Sentinel-2 satellite pair, mean that researchers can now sign up and get started with satellite imagery almost immediately. However, to truly unlock the potential of geodata and glean insights with clear benefits for environmental science, scientists need to analyse this data with the right processing tools and algorithms.
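To give a sense of what that processing can look like, the short sketch below derives a normalised difference vegetation index (NDVI) – a common first step in crop and vegetation monitoring – from two Sentinel-2 bands. It is a minimal illustration only: the file names are placeholders, and it assumes the red (B04) and near-infrared (B08) rasters have already been downloaded on the same grid.

```python
# Minimal NDVI sketch for Sentinel-2 imagery. The file names are
# placeholders; any Level-2A tile with matching B04 (red) and B08
# (near-infrared) rasters on the same grid would work.
import numpy as np
import rasterio

with rasterio.open("S2_tile_B04_10m.jp2") as red_src, \
     rasterio.open("S2_tile_B08_10m.jp2") as nir_src:
    red = red_src.read(1).astype("float32")
    nir = nir_src.read(1).astype("float32")
    profile = red_src.profile

# NDVI = (NIR - red) / (NIR + red), guarding against division by zero.
denom = nir + red
ndvi = np.zeros_like(denom)
np.divide(nir - red, denom, out=ndvi, where=denom != 0)

# Save as a single-band GeoTIFF so the result can be opened in any GIS.
profile.update(driver="GTiff", dtype="float32", count=1)
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)
```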

Algorithms under wraps

With AI and machine learning, scientists can capture and analyse data in entirely new ways. It is this technology that enables investors to monitor the number of cars parked outside retail businesses in order to predict how those companies will perform. Meanwhile, others use it to track the movements of oil containers or the growth of corn fields, helping to forecast the commodity markets.

Unsurprisingly, these proprietary algorithms have traditionally been kept a closely guarded secret, available only to the organisations that developed them and a handful of wealthy clients who can afford to pay for them. In response, the environmental science community has had to develop its own methods of processing and analysing geodata – and, it must be said, with some degree of success.

Organisations such as Global Fishing Watch offer open access to fishing vessel identification and tracking data, enabling scientists to develop their own analysis tools to investigate which nations are most responsible for industrial fishing, as well as to identify where shark populations and fishing activities overlap at sea.
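As a rough illustration of what that kind of self-built analysis can look like, the sketch below aggregates fishing effort by flag state. The file name and column names are assumptions modelled loosely on Global Fishing Watch's published gridded fishing-effort datasets, not a reference to a specific release.

```python
# Rough sketch: rank flag states by total apparent fishing effort.
# "fishing_effort.csv" and the "flag" / "fishing_hours" columns are
# assumptions modelled loosely on Global Fishing Watch's open datasets.
import pandas as pd

effort = pd.read_csv("fishing_effort.csv")

# Sum fishing hours per flag state and list the ten largest contributors.
top_flags = (
    effort.groupby("flag")["fishing_hours"]
          .sum()
          .sort_values(ascending=False)
          .head(10)
)
print(top_flags)
```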

However, difficulty arises when you scale these experiments up and begin collecting huge quantities of data. For example, Queensland researchers this year designed a drone equipped with hyperspectral cameras, which they are now using to capture coral bleaching data from the Great Barrier Reef. Each hyperspectral image of the reef can provide more than 4,000 data points in 270 different bands of light.

With this data, AI can differentiate between coral, sand and algae, as well as identify different coral types and gauge levels of bleaching. But on a regular desktop computer, it would take thousands of hours to process the multiple gigabytes of hyperspectral imagery captured. Without access to online cloud services and open source tools, this would be an impossible task.
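As a toy illustration of that classification step, the sketch below trains a simple random-forest classifier on labelled pixel spectra. The input files, label scheme and choice of model are illustrative assumptions, not the Queensland team's actual pipeline – but they show why per-pixel analysis of 270-band imagery quickly becomes a compute problem.

```python
# Toy sketch: classify hyperspectral reef pixels as coral, sand or algae.
# The .npy files and the random forest are illustrative stand-ins, not the
# researchers' actual data or model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# X: (n_pixels, 270) reflectance spectra; y: labels (0 coral, 1 sand, 2 algae).
X = np.load("reef_pixel_spectra.npy")
y = np.load("reef_pixel_labels.npy")

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```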

Luckily, such tools are becoming increasingly available, including an online repository from Microsoft AI for Earth, which enabled the researchers to cut processing time down significantly.

While these research teams rightly deserve the plaudits they receive, such successes remain the exception: the wider state of geodata in environmental science paints a very different picture.

The environmental data drought

In 2018, the United Nations identified 93 indicators to measure progress towards its environment-related sustainable development goals. Yet earlier this year, the UN Environment Programme released a report finding there is “too little data to formally assess” 68% of these indicators, with “major data gaps” in areas like urbanisation, forest plantations and soil degradation. The report concludes with a number of recommendations for securing this data, including making data management, software and algorithms more accessible.

We’ve already seen evidence of the impact that the democratisation of geodata analysis can have on a country’s progress towards sustainable development goals. The Namibian government has had great success using tools like the Water-Related Ecosystems platform – launched by UN Environment and Google – in order to better understand the distribution of the country’s water resources. In an arid country so reliant on surface water or shallow wells, this is crucial information for engineers or policymakers seeking to make use of water resources for new abstraction methods or investment in hydropower.

This platform has also proved popular in nations with very complex water systems, such as Canada, where ecosystems range from Arctic tundra to seasonal wetlands and rainforests. As the climate warms, many of these systems are expected to undergo significant change – so these tools will also serve to measure the effects of climate change.

With similar access to tools designed to analyse geodata in other fields, it’s entirely possible to address many of the major data gaps identified by the UN Environment Programme. For example, algorithms designed to identify trees and canopy cover are well suited to forestry management, while object detection applied to buildings or vehicles can help track urbanisation trends.
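As one crude example of the forestry case, canopy cover can be approximated as the fraction of pixels whose NDVI exceeds a threshold. The input raster (such as the NDVI output sketched earlier) and the 0.5 cut-off are illustrative assumptions rather than a validated method.

```python
# Crude canopy-cover estimate: fraction of valid pixels above an NDVI
# threshold. The input file and the 0.5 cut-off are illustrative assumptions.
import numpy as np
import rasterio

CANOPY_NDVI_THRESHOLD = 0.5  # assumed cut-off for dense vegetation

with rasterio.open("ndvi.tif") as src:
    ndvi = src.read(1)

valid = ~np.isnan(ndvi)
cover_fraction = (ndvi[valid] > CANOPY_NDVI_THRESHOLD).mean()
print(f"Estimated canopy cover: {cover_fraction:.1%}")
```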

Collaborating for accessibility

In recent years, we’ve seen an encouraging shift in the democratisation of satellite data, which is now readily accessed through SaaS platforms and online tools. Yet, for smaller teams and independent developers, accessing processing tools and algorithms that can analyse this data remains a challenge.

The barrier to entry must be lowered for research teams. Collaboration between satellite owners, geodata providers and developers will be key for the future measurement and management of environmental change. New business models, such as open platforms and marketplaces, can not only help software providers and developers to commercialise their processing tools, but also increase the number of people who can make use of them.

This is more than just a moral issue. Once we are able to put these tools into the hands of the people who understand environmental threats better than anyone else, we are equipping the research community with cutting-edge technologies and consequently maximising our chances of averting climate breakdown.

There remains work to do

In recent years, the scientific community has proven that tenacious teams will quickly learn to use new tools and data sources in order to advance environmental research. With the relatively limited resources at their disposal, researchers have already shown their willingness to embrace geospatial data and its analysis.

It’s essential that both non-profit organisations and for-profit companies continue to support researchers globally, ensuring that they are equipped with the same data and technologies that are revolutionising private-sector fields such as agriculture and hedge fund management.

Only with a data-rich understanding of the challenges that the planet is facing will it be possible to meet sustainable development goals and tackle the causes and effects of climate change.

Eli Tamanaha is CEO at UP42 (www.up42.com)
