
The Metaverse is Geospatial

By GeoConnexion - 27th January 2022 - 14:27

The real world and the internet will merge thanks to the ‘metaverse’ – and geospatial information and technology will be key to that combination. Simon Chester reports on the latest work going into making the unreal real… and the real unreal

With momentum and interest once again building around the ‘metaverse’, OGC hosted a ‘Metaverse Ad-Hoc Session’ at its virtual 121st Member Meeting in December 2021. The session saw speakers from across industry – from photogrammetry and AI-enhanced semantic remote sensing companies to geospatial, BIM and gaming software companies – discuss how geospatial tech will inform the metaverse, how the metaverse will transform geospatial, and why open standards will be critical for the metaverse’s success.

But before we get too far, what even is the metaverse? I asked Patrick Cozzi, CEO of Cesium, co-host of the Building the Open Metaverse podcast, and panellist at the Metaverse Ad-Hoc Session.

“Ask 10 different people, you’ll get 10 different answers, but what most folks are agreeing on is that the metaverse is a progression of the internet to go from something that’s 2D to fully immersive 3D,” said Patrick Cozzi. “You’ll also hear definitions around it being a persistent, virtual world that allows collaboration of any sense, from gaming to enterprise to DoD [Department of Defense] cases. I think it’s a super exciting time to be in geospatial as this all comes into one place.”

This lines up with the definition put forward by venture capitalist Matthew Ball, who has written extensively about the metaverse in his Metaverse Primer:

“The Metaverse is a massively scaled and interoperable network of real-time rendered 3D virtual worlds which can be experienced synchronously and persistently by an effectively unlimited number of users with an individual sense of presence, and with continuity of data, such as identity, history, entitlements, objects, communications and payments.”

But what will the metaverse look like to the end-user? First of all, virtual/augmented reality hardware won’t be mandatory: just like the internet, it will adapt to the device accessing it, whether it be 2D, 3D, small screen, big screen, headset, etc. Also like the internet, the metaverse will comprise many different interconnected 3D ‘spaces’ (like 3D websites) operated by different entities that together form the much larger metaverse concept.

Metaverse spaces will include those forming completely fabricated virtual worlds as well as those that are modelled after, or augment, the real world. Metaverse spaces will be interconnected, with users being able to cross between them, whether it’s to visit a friend, play a game, go shopping, manage a construction project, train for a new job, model a new warehouse workflow or something else entirely.

Users may also be able to extend and affect the real world, with actions and items able to move between the two. For example, items purchased or earned in a shop on a virtual High Street in the metaverse could be redeemable at its real-world counterpart, or buttons pressed in the metaverse could actuate machines or objects in the real world.

Those metaverse experiences representing the real world are the most obvious place where geospatial technologies, standards, knowledge and best practices will play a major role. However, every metaverse space will be a massive database of physical and semantic environments that needs to be designed for efficient streaming. A metaverse space, then, can be considered an iteration of the geospatial industry’s city- or state-wide ‘digital twin’ technologies in use today for modelling and simulation, citizen engagement and more. As such, just about any 3D Geospatial Standard will be useful in building the metaverse.
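
To make the streaming requirement concrete, the short Python sketch below illustrates (purely as an illustration, not any particular product’s API) the hierarchical level-of-detail approach taken by streaming formats such as OGC’s 3D Tiles, discussed later in this article: the scene is organised as a tree of tiles, and a client only requests finer tiles where a coarser tile’s simplification error would be noticeable from the current viewpoint. The tile names, error values and error budget used here are hypothetical.

```python
# An illustrative sketch (not any particular product's API) of hierarchical
# level-of-detail streaming: a scene is organised as a tree of tiles, and the
# client only refines where a coarser tile's error would be visible.
from dataclasses import dataclass, field

@dataclass
class Tile:
    uri: str                   # where this tile's content can be fetched from
    geometric_error: float     # metres of simplification error in this tile
    children: list = field(default_factory=list)

def select_tiles(tile, distance_m, screen_error_budget=16.0):
    """Return the tile URIs worth streaming for a viewer at the given distance."""
    # A rough screen-space error: the same geometric error matters less
    # the further away the viewer is.
    screen_error = tile.geometric_error / max(distance_m, 1.0) * 1000.0
    if screen_error <= screen_error_budget or not tile.children:
        return [tile.uri]      # the coarse tile is good enough, or it is a leaf
    selected = []
    for child in tile.children:   # otherwise refine into the finer tiles
        selected += select_tiles(child, distance_m, screen_error_budget)
    return selected

# Hypothetical example: a city tile that refines into two district tiles.
city = Tile("city.glb", 50.0,
            [Tile("district_a.glb", 5.0), Tile("district_b.glb", 5.0)])
print(select_tiles(city, distance_m=10_000))  # far away: just the coarse tile
print(select_tiles(city, distance_m=200))     # up close: the finer districts
```

Because only the visible, sufficiently detailed portions of the scene are fetched, a city- or even planet-scale space can be streamed to consumer hardware rather than downloaded wholesale.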

Also worth noting is that the laws of geography that underpin geospatial technologies will also apply to entirely virtual worlds: users will want maps to navigate and make sense of virtual spaces just as they do the real world. As an industry, geospatial clearly has much expertise to contribute to the creation of the metaverse.

Geospatial will be transformed by the metaverse

The metaverse is the internet transformed by real-time 3D technologies, but real-time 3D is also transforming geospatial. The blurring of the lines between ‘real world’ digital twins and virtual metaverse spaces is exemplified by the integration of geospatial data into game engines, which enable the rendering of photo-realistic 3D scenes in real time on consumer hardware.

“Game engines are really changing the game for GIS,” said Marc Petit, VP and general manager of Unreal Engine at Epic Games and co-host of the Building the Open Metaverse podcast, during the OGC Metaverse Ad-Hoc Session. “I think these [real-time 3D] technologies are really enabling for GIS, and the science of ‘knowing where things are’ is going to be hugely important in the metaverse.”

Philip Mielke, 3D web experience product manager at Esri, shared a similar sentiment: “We have about four or five years until the practice of GIS is fundamentally transformed by this convergence of technologies, capabilities and expectations… We at Esri are investing a lot in game engines so that we can transmit services for consumption in [the game engines] Unreal and Unity.”

As an example of the benefits of this convergence, Patrick Cozzi described his experience when Cesium linked its 3D geospatial streaming platform to Epic Games’ immensely popular game engine, Unreal Engine.

“Something magical happened when we built this bridge to Unreal Engine, because I feel that we made 10 years’ progress overnight. I feel like suddenly the decades of investment in games technology was unlocked for geospatial, and then likewise, all of this 3D geospatial data became available to the game technology. And that’s just one example of how when we make these open and interoperable ecosystems, we can move the field forward as fast as possible.”

Indeed, if the metaverse is all about diverse 3D experiences interoperating to form a cohesive whole, open standards and knowledge will be absolutely fundamental to its creation. Just as there would be no functioning internet without open standards, there can be no functioning metaverse without them, either.

Open Standards will underpin the metaverse

Innovation surrounding the metaverse, just as in other information technologies, will move quickly. The standards that gain traction in building the metaverse will be the ones that can keep pace with that innovation. OGC’s new standards development ethos, as seen in its OGC APIs, produces open standards that are modular, lightweight, and extensible – allowing them to evolve alongside technology without breaking, while providing a stable baseline upon which lasting innovations can be built.
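
As a flavour of that modularity, the short Python sketch below queries a hypothetical OGC API – Features deployment over plain HTTP: resources sit at predictable paths, responses default to GeoJSON, and paging is discovered through hypermedia links rather than bespoke endpoints. The server URL and collection name are placeholders.

```python
# A minimal sketch of querying an OGC API - Features endpoint over plain HTTP.
# The server URL and collection name below are hypothetical placeholders.
import requests

BASE = "https://example.com/ogcapi"   # hypothetical OGC API deployment
COLLECTION = "buildings"              # hypothetical feature collection

# OGC API - Features exposes resources at predictable paths and returns
# GeoJSON by default, so a generic HTTP client is all that is needed.
resp = requests.get(
    f"{BASE}/collections/{COLLECTION}/items",
    params={"limit": 10, "bbox": "-0.2,51.4,0.1,51.6"},  # small bounding box
    headers={"Accept": "application/geo+json"},
    timeout=30,
)
resp.raise_for_status()
page = resp.json()

for feature in page.get("features", []):
    print(feature.get("id"), feature.get("properties", {}))

# Paging (and many extensions) hang off hypermedia links rather than bespoke
# endpoints, which is part of what keeps the OGC APIs modular and extensible.
next_link = next((l["href"] for l in page.get("links", []) if l.get("rel") == "next"), None)
if next_link:
    print("more results at:", next_link)
```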

However, because the metaverse is a novel technology, many of the standards that will solve its problems won’t exist when building starts. It is likely, then, that the open technologies and specifications that bubble up as best practice as the metaverse matures will become de facto standards. Recognising the importance of de facto standards, OGC some years ago developed a nimble ‘Community Standard’ process that enables snapshots of such standards to be adopted by OGC, so that they can benefit from the stability that official standardisation brings or be better harmonised with other OGC Standards.

Community Standards can also form useful bridges that support the convergence of previously siloed industries and domains. 3D Tiles, for example, uses technology and know-how from geospatial and 3D graphics to provide a standard for streaming massive heterogeneous 3D datasets that developers from both industries can follow and build to. Other OGC Community Standards relevant to the metaverse include Indexed 3D Scene Layer (I3S), for 3D streaming; Indoor Mapping Data Format (IMDF), for mapping and navigating indoor spaces; and Zarr, currently in the process of endorsement, for the storage of multi-dimensional arrays of data (also known as data cubes).
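
To give a flavour of the last of these, the short Python sketch below uses the zarr-python library to create and read from a chunked data cube. The file name, array shape and chunk sizes are arbitrary examples, and exact keyword arguments can vary between library versions.

```python
# A minimal sketch of the data-cube pattern Zarr standardises: one large
# multi-dimensional array stored as many small, independently readable chunks.
# File name, shape and chunking here are arbitrary examples.
import numpy as np
import zarr

# A (time, latitude, longitude) cube of values, chunked so that a client can
# fetch just the slices it needs rather than the whole dataset.
cube = zarr.open(
    "temperature.zarr", mode="w",
    shape=(365, 1800, 3600), chunks=(1, 180, 360), dtype="f4",
)
cube[0] = np.random.rand(1800, 3600).astype("f4")  # write one time step

# Reading a small spatial window only touches the chunks that overlap it.
window = cube[0, 900:910, 1800:1810]
print(window.shape)  # (10, 10)
```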

OGC Community Standards can harness the expertise of industries outside of, but relevant to, geospatial, building a bridge between geospatial technologies and those of the standard’s industry of origin. The Community Standards process will prove useful, then, in bringing to the geospatial community the knowledge, experience, and technologies developed by the many non-geospatial 3D and internet organisations during the early days of the metaverse.

Similarly, the liaisons and partnerships that help bring outside de facto standards into the OGC Community Standards process will also serve to bring OGC Standards out to the communities that can benefit from them, and even bring those communities – and their perspectives – in to help shape Standards development and evolution.

Building the metaverse together

It is now clear that the metaverse – the internet in real-time 3D – has never been closer. Like the internet, its creation will result in technological advancements and disruptions. Geospatial is already starting to feel this as it adopts, adapts, innovates, and integrates 3D real-time technologies such as game engines and digital twins. However, the metaverse is not assured: it will only reach its true potential if, like the internet, it is based upon open standards and technologies that are easily available to all.

“I really want to see how far we can take the metaverse,” said Patrick Cozzi, “and I believe that to take it far, fast, we need open interoperability.”

As an organisation and community that’s passionate about Findable, Accessible, Interoperable, and Reusable (FAIR) data standards, OGC will continue to provide, design, adapt, and adopt a host of standards relevant to the metaverse; to offer a neutral forum for experts from across industry to meet and share knowledge; and to work as a liaison and bridge-builder between the other industries involved in building the metaverse and their standards organisations.

Simon Chester is communication manager for the Open Geospatial Consortium (www.ogc.org)
