Through the looking glass

By [email protected] - 21st August 2017 - 15:01

My first foray into virtual spaces goes back to May 2015, when Ordnance Survey (OS) was the platinum sponsor of Digital Shoreditch, a celebration of creative, technical and entrepreneurial talent. Previous visitors had highlighted how hard it was to navigate the venue, a Victorian basement with multiple corridors and rooms, and to find the exhibitions. From this came the idea of a visitor’s app to act as a guide.

The first step was to use Blender to create a virtual 3D version of the building, so users could take a virtual walk-through before the event. Then, on the day of the event itself, the app could search for exhibitions and guide users to them with simple Augmented Reality (AR) arrows providing turn-by-turn navigation.

Out of this world

Twelve months later, planetary scientist Peter Grindrod from the UK Space Agency asked OS to create an OS-style paper map of a section of Mars. I thought it would be interesting to focus an AR experience on the dramatic landscape of Schiaparelli crater.

Using height data for the planet captured by NASA, and with Peter’s advice, I produced a greyscale height map. Then, using Blender, I made a 3D terrain model of the crater and its surroundings.
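
As a rough illustration of that step, the sketch below converts a grid of elevation values into a greyscale image that Blender’s Displace modifier can use as a displacement texture. It assumes the NASA data has already been cropped to the Schiaparelli area and saved as a 2D array; the file names are placeholders.

```python
# A minimal sketch of the greyscale height-map step. Assumes the NASA
# elevation data has been cropped to the Schiaparelli area and exported
# as a plain 2D array; file names here are hypothetical.
import numpy as np
from PIL import Image

# Load a 2D grid of elevation values in metres (placeholder file).
elevation = np.load("schiaparelli_elevation.npy")

# Normalise so black = lowest point, white = highest. An 8-bit image is
# enough for a sketch; 16-bit would preserve more vertical detail.
lo, hi = elevation.min(), elevation.max()
heightmap = ((elevation - lo) / (hi - lo) * 255).astype(np.uint8)

# Save as greyscale PNG for use as a displacement texture in Blender,
# applied to a finely subdivided plane.
Image.fromarray(heightmap).save("schiaparelli_heightmap.png")
```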

To complete the Mars AR experience, I used Vuforia to create an image target in the cloud; its image recognition analyses the uploaded image and extracts the natural feature points it will later track. After a few tweaks, the augmentable rating for the image target reached four stars, which is what’s required for reliable target recognition.
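
For reference, targets can be added to a cloud database through the Vuforia Web Services (VWS) API. The sketch below follows Vuforia’s documented HMAC-SHA1 request signing, but the keys, file name and target metadata are placeholders, not the values used for this project.

```python
# A hedged sketch of adding a cloud image target via the Vuforia Web
# Services (VWS) API. Keys and file names are placeholders; the signing
# scheme follows Vuforia's documented HMAC-SHA1 request signature.
import base64, hashlib, hmac, json
from datetime import datetime, timezone
import requests

ACCESS_KEY = "your-server-access-key"   # placeholder
SECRET_KEY = b"your-server-secret-key"  # placeholder

def vws_signature(method, body, content_type, date, path):
    # String-to-sign: verb, hex MD5 of body, content type, date, path.
    content_md5 = hashlib.md5(body).hexdigest()
    string_to_sign = "\n".join([method, content_md5, content_type, date, path])
    digest = hmac.new(SECRET_KEY, string_to_sign.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

with open("mars_target.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

body = json.dumps({
    "name": "schiaparelli",
    "width": 1.0,              # target width in scene units
    "image": image_b64,
    "active_flag": True,
}).encode()

date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
signature = vws_signature("POST", body, "application/json", date, "/targets")

resp = requests.post(
    "https://vws.vuforia.com/targets",
    data=body,
    headers={
        "Authorization": f"VWS {ACCESS_KEY}:{signature}",
        "Content-Type": "application/json",
        "Date": date,
    },
)
print(resp.status_code, resp.json())
```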

Vuforia publishes software development kits (SDKs) for both Android and iOS to help with producing AR apps, with target databases stored either on the device or in the cloud. Since the intention was to possibly change the target later, we decided to experiment with the cloud version.

A proof-of-concept native iOS app was produced that uses the device’s live camera feed to scan continuously for the target. Computer vision techniques run in real time to trigger the augmentation as soon as the target is found. The app then displays the augmentation on the same camera view and snaps it to the real-world target (in this case the area of Mars). With the 3D model attached, the user can move the device around and, for as long as the target is still in view, the app tracks and augments in real time, as if the digital content (the augmentation) were truly part of the real-world object.
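
Vuforia handles the detection and tracking internally, but the underlying idea can be illustrated with a short OpenCV sketch (not the app’s actual code): extract feature points from the target image, match them against each camera frame, and estimate a homography that anchors the augmentation to the target.

```python
# Illustrative only: a minimal OpenCV version of the detect-and-track
# idea. Vuforia does this internally; file names are placeholders.
import cv2
import numpy as np

target = cv2.imread("mars_target.jpg", cv2.IMREAD_GRAYSCALE)
orb = cv2.ORB_create(nfeatures=1000)
kp_t, des_t = orb.detectAndCompute(target, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

cap = cv2.VideoCapture(0)  # live camera feed
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kp_f, des_f = orb.detectAndCompute(gray, None)
    if des_f is not None:
        matches = sorted(matcher.match(des_t, des_f), key=lambda m: m.distance)
        if len(matches) > 30:  # enough matches: the target is in view
            src = np.float32([kp_t[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst = np.float32([kp_f[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
            if H is not None:
                # Project the target's outline into the frame; this is
                # the anchor a renderer would snap the 3D model to.
                h, w = target.shape
                corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
                outline = cv2.perspectiveTransform(corners, H)
                cv2.polylines(frame, [np.int32(outline)], True, (0, 255, 0), 3)
    cv2.imshow("ar-tracking-sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```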

Finding your way – indoors and out

Using the same technique, I created AR experiences for Great Britain’s three highest peaks. Simply point the device’s camera at the relevant OS paper map to pick up its markers and suddenly a 3D recreation of Ben Nevis, Snowdon or Scafell Pike appears out of it. As well as being interesting to look at, it’s also a visual reminder to be prepared and keep safe if you plan to go up one of them.

We then collaborated with our local hospital, Southampton General, to see how AR could be of use. They told us that visiting doctors and consultants often don’t know their way around the hospital, knowledge that is vital when a patient needs urgent treatment.

Expanding upon what we did for Digital Shoreditch, we built a simple turn-by-turn concept around smart signs: pointed at a generic sign within the hospital, the app presents navigational content tailored to the doctor or consultant using it (a minimal sketch of the idea follows). This expanded to include patients, who could receive content relevant to them, including directions to the appropriate department or clinic and other useful information, such as delay times.
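
The core of such a system is a lookup from a recognised sign and the user’s role to the content overlaid on that sign. The sketch below is hypothetical throughout; the sign IDs, roles and messages are invented for illustration.

```python
# A hypothetical sketch of the smart-sign idea: the same physical sign
# yields different directions depending on who scanned it. Sign IDs,
# roles and routes below are all made up for illustration.
from dataclasses import dataclass

@dataclass
class Instruction:
    text: str
    extra: str = ""

# (sign_id, role) -> what to overlay on that sign for that user.
SIGN_CONTENT = {
    ("sign-ward-b-junction", "consultant"): Instruction(
        "Turn left for Cardiology", "Your patient is in Bay 4"),
    ("sign-ward-b-junction", "patient"): Instruction(
        "Turn right for Outpatients Clinic 2", "Current delay: 15 minutes"),
}

def content_for(sign_id: str, role: str) -> Instruction:
    # Fall back to a generic prompt when no tailored content exists.
    return SIGN_CONTENT.get((sign_id, role),
                            Instruction("Follow the corridor to the main desk"))

print(content_for("sign-ward-b-junction", "patient").text)
```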

Then came one of my favourite projects: Beer Maps. I replicated the Mars AR experience, but this time using beer mats as the targets; scanning one triggers an augmentation of an OS map centred on the pub and its surrounding area, complete with walking routes.

Mixing it with HoloLens

For CityVerve, the UK’s Internet of Things (IoT) demonstrator project set in Manchester, OS surveyors captured the interior and exterior of Manchester Town Hall using the latest Leica scanning equipment, and I used that data to create a model of it in HoloLens. This was demoed at the World Institute of Ideas Forum. The HoloLens app was made using HoloToolkit for Unity and Visual Studio: it loads the data, captures the surface mesh of the room and pins the model to that mesh for a mixed reality experience.

For the OS and Geovation sponsored British Library event that celebrated the future of mapping, I created two AR experiences. Using OS single building heights and aerial imagery in QGIS, I generated a HoloLens mixed reality 3D app that showed Canary Wharf and London. Seeing London in this unexpected way, by simply slipping on a pair of glasses, delighted those who tried it, but the uses of such models are genuinely exciting, especially in relation to construction and Building Information Modelling (BIM).
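
The processing itself was done in QGIS, but the essential step, extruding 2D building footprints by a height attribute into simple 3D blocks, can be sketched as follows. The GeoJSON file name and the height property are assumptions for illustration, not the actual OS product schema.

```python
# A minimal sketch of extruding building footprints into 3D blocks,
# assuming a GeoJSON of simple polygons with a height attribute. The
# file name and "height" property are illustrative assumptions.
import json

def extrude_footprint(coords, height):
    """Turn one polygon ring into the vertices and faces of a prism."""
    base = [(x, y, 0.0) for x, y in coords]
    top = [(x, y, height) for x, y in coords]
    verts = base + top
    n = len(base)
    faces = [list(range(n)),               # floor
             list(range(n, 2 * n))]        # roof
    for i in range(n):                     # walls
        j = (i + 1) % n
        faces.append([i, j, n + j, n + i])
    return verts, faces

with open("buildings.geojson") as f:
    features = json.load(f)["features"]

with open("city.obj", "w") as obj:         # Wavefront OBJ output
    offset = 0
    for feat in features:
        ring = feat["geometry"]["coordinates"][0][:-1]  # drop closing point
        verts, faces = extrude_footprint(ring, feat["properties"]["height"])
        for x, y, z in verts:
            obj.write(f"v {x} {y} {z}\n")
        for face in faces:                 # OBJ indices are 1-based
            obj.write("f " + " ".join(str(offset + i + 1) for i in face) + "\n")
        offset += len(verts)
```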

The possibility HoloLens offers of seeing a projected new build in situ, and its relationship with the surrounding area, is an exciting development. For architects and builders, sharing the final vision with clients and other stakeholders in this immersive way, before any actual work has taken place, is a powerful new tool that will aid planning and decision making. In my view, it has the potential to be revolutionary.

Building for the future

The second experience for the event was a marker-based AR app for tablets that used the Geovation logo to trigger a visualisation of the 3D London data. This was again done using Unity and Vuforia and, as a means of communicating construction projects and visualising their final outcomes, it is another powerful new tool for the construction industry.

None of these experiences has had a public release as yet, but later this year OS is releasing its first official AR experience in OS Maps. Users simply point the camera of their Android or iOS device at the landscape and, using GPS and the compass, the app highlights accurate points of interest that sit in that view.

To create this I chose not to rely on external SDKs and instead used Apple’s Core Location and Core Motion frameworks on iOS, with readings from the gyroscope and accelerometer providing the accuracy. The app then calls the OS Placenames API to retrieve OS populated places, which delivers points of interest within a set radius based on the device’s position and orientation.
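
The geometry behind this can be sketched independently of iOS: given the device’s position and compass heading, compute the distance and bearing to each point of interest and keep those that fall inside the camera’s field of view. The sample coordinates and the 60-degree field of view below are illustrative assumptions.

```python
# A sketch of the position-and-orientation filter, independent of iOS.
# Given device lat/lon and compass heading, keep POIs within a radius
# and inside the camera's horizontal field of view. The sample POIs and
# the 60-degree FOV are illustrative assumptions.
import math

EARTH_RADIUS_M = 6_371_000

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Haversine distance (m) and initial bearing (degrees from north)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

def visible_pois(lat, lon, heading, pois, radius_m=5000, fov_deg=60):
    hits = []
    for name, plat, plon in pois:
        dist, bearing = distance_and_bearing(lat, lon, plat, plon)
        off_axis = (bearing - heading + 180) % 360 - 180  # signed angle to heading
        if dist <= radius_m and abs(off_axis) <= fov_deg / 2:
            hits.append((name, dist, off_axis))
    return sorted(hits, key=lambda h: h[1])  # nearest first

# Example: standing near Fort William, looking roughly towards Ben Nevis.
pois = [("Ben Nevis", 56.7969, -5.0036), ("Glen Nevis", 56.8050, -5.0770)]
print(visible_pois(56.8198, -5.1052, 135, pois, radius_m=10000))
```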

AR has been around longer than most people would expect, but my feeling is that the technology is catching up and we’re only scratching the surface of what it can achieve. As with BIM, I think we are on the verge of a revolution that will bend reality and allow us to simulate a lot more besides.
