
Schrödinger’s geography

By Alistair Maclenan - 8th May 2018

If anything has widened the use of huge datasets, such as satellite imagery or vast vector geographical databases, and helped to deliver the benefits that come with that use, it has been the availability of enormous storage and processing power via the cloud.

Instead of setting a task running on your own puny computer and having enough time to learn a new language before it finishes, you can upload the data to as many servers as there are grains of sand on a beach and have your answer sent back to you in the blink of a crab’s eye (I may have overdone that seashore metaphor).

But such is the restlessness of the human animal that even this computing power is just not enough and has already become yesterday’s news. Seriously, there are some who refer to the cloud as ‘classical computing’, including Dr Talia Gershon of IBM Research.

During the same presentation at last year’s ‘Maker Faire Bay Area’ showcase at which Dr Gershon used that description – a recording is on YouTube as ‘A Beginner’s Guide to Quantum Computing’ (https://www.youtube.com/watch?v=S52rxZG-zi0) – she went on to explain how quantum computing will be the next leap forward for all of us. Since she also used phrases such as ‘qubits can exist in superpositions of zero and one’ and ‘entanglement of qubits’, I’m not 100% sure I can explain exactly how. But since that has never stopped me from trying before, here goes!

Quantum computing differs from its classical predecessor, which at its heart is constrained to describing only one of two states: on or off. That’s called a bit, and so the only way of processing more information is to have more bits. Hence tera-, peta-, exa-, zetta- and yottabytes have entered the language (they would also make a decent line-up for a Norwegian boy band).

But whilst we have got used to this being the way of computing, it has been known for the past 100 years or so that subatomic particles can exist in more than one state at a time – not just on or off, but both at once, neither, or somewhere in between.
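
To put that in the physicists’ shorthand (an addition of mine, not anything from the column): a single qubit’s state is conventionally written as a weighted blend of both classical values, with the weights setting the odds of reading out a 0 or a 1.

```latex
% A qubit's state as a superposition of the two classical bit values.
% alpha and beta are complex amplitudes; measuring the qubit gives 0
% with probability |alpha|^2 and 1 with probability |beta|^2.
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1
\]
```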

If we blindly accept this counter-intuitive phenomenon – often the best policy in certain branches of physics – then it isn’t too hard to see that this vast increase in the number of states at the heart of a computer will lead to an exponential expansion in processing power: every additional qubit doubles the number of states the machine can hold at once, so n qubits can represent 2^n states simultaneously.
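
As a rough illustration – my own sketch, not anything from the IBM talk – here is what that expansion looks like from the classical side: describing an n-qubit machine takes 2^n complex numbers, one per possible state, so the bookkeeping doubles with every qubit added.

```python
import numpy as np

def uniform_superposition(n_qubits):
    """Classical description of n qubits in an equal superposition:
    one complex amplitude for each of the 2**n basis states."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

print(uniform_superposition(3).size, "amplitudes for 3 qubits")

# Each complex128 amplitude takes 16 bytes, so the classical
# bookkeeping doubles with every qubit added.
for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    print(f"{n:2d} qubits -> {amplitudes:>16,} amplitudes "
          f"({amplitudes * 16:>20,} bytes)")
```

By 50 qubits that list of numbers would run to roughly 18 petabytes, which gives a feel for why nobody tries to keep up with quantum machines using classical ones.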

An often-used example of what will be possible using this power is the modelling of more complex molecules, leading to the development of new drugs and chemicals. But imagine what could be achieved in the field of geography.

It is an accepted rule of thumb that doubling the resolution of a climate model requires 10 times the computing power. Moving from a 100km grid-based model to one based on single-kilometre squares is between six and seven doublings, so it would likely require every server farm in existence to create (and that’s ignoring that these models are four-dimensional!).
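
Taking that rule of thumb at face value – the arithmetic here is mine, not from any climate-modelling source – the jump from 100km to 1km squares works out like this:

```python
import math

coarse_km, fine_km = 100, 1       # grid spacing before and after
cost_per_doubling = 10            # the rule of thumb quoted above

doublings = math.log2(coarse_km / fine_km)    # ~6.64 doublings of resolution
cost_factor = cost_per_doubling ** doublings  # 10 ** 6.64

print(f"{doublings:.2f} doublings -> roughly {cost_factor:,.0f}x the computing power")
```

That comes to around 4.4 million times the computing power of the 100km model – hence the queue of server farms.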

Could climate scientists use quantum computing to produce that model? Well, given that IBM’s 20-qubit machine currently remains ‘quantum-capable’ for only around 90 microseconds, they would have to work fast. But if the venerable Moore’s Law can be applied to quantum computing, it may not be long before we see geographical data used in quantities like never before, producing results that today we can only dream about.

Alistair Maclenan is founder of the geospatial B2B marketing agency Quarry One Eleven (www.quarry-one-eleven.com)
