
See where we’re up to


By [email protected] - 28th October 2015 - 10:34

According to a report from DigiCore published earlier this year, augmented reality (AR) and virtual reality (VR) technologies combined represent a US$4bn market that is expected to grow to US$150bn by 2020.

Today, the lion’s share of the market is focused on VR, but DigiCore anticipates that AR will soon overtake it as the experience of choice and will represent 80% of the entire ‘computer-mediated reality’ market by 2020.

There are several reasons for this. Although VR has advanced to a point where people can connect their smartphones to a VR headset, rather than be tethered to a supercomputer, it’s really only suited for controlled environments, not for field workers looking to visualise gas or water lines.

“Try walking on the street wearing a virtual reality headset and you’ll run into a car or walk into a lamppost,” says Marco Tillmann, augmented reality product manager at HERE. “Outside the isolated space of your couch, virtual reality simply doesn’t work for everyday situations. Augmented reality, which enhances real-world surroundings with digitally rendered elements, is much more suited for business use-cases.”

Smart hardware maker Vuzix was first to market with an enterprise-grade smart glasses solution offering AR applications tailored for a business setting. In combination with pre-packaged software, like that from Pristine EyeSight, or with a bespoke solution, the Vuzix M100 enables workers to collaborate and solve problems hands-free.

“There are so many potential use cases that the technology is really playing catch up to all the ideas,” says Lance Anderson, vice president of enterprise sales at Vuzix. “Which use cases make it to live production today is driven in part by the viability of the form factor itself, which right now is primarily monocular smart glasses. This type of hardware lends itself really well to ‘see what I see’ use cases where a worker in one location can show their colleague what they see and receive guidance on how to complete a particular task.”

Similarly, ORA-1 smart glasses by Optinvent are being built with specific enterprise use cases in mind. In France, electrical providers in the field would like to use AR glasses to identify where underground lines are located. In Germany, meanwhile, engineers are testing the glasses to help service power poles and wind generators.

Indeed, awareness, interest and use of AR among businesses are growing, but for it to really take off, sensor accuracy must improve and developers need more tools to create comfortable user interfaces for their AR applications.

Whereas VR applications rely heavily on processor capability, AR is very sensor-dependent. For example, when wearing smart glasses to identify the building or telephone pole that needs to be serviced, the real world and the virtual world need to align almost perfectly. This means that GPS sensor accuracy has to be within a few centimetres so that the underlying reference data or the virtual world can be called up and overlaid on the real-world location.

“You need to know down to the centimetre, down to a degree, the orientation and positioning of what you are looking at,” says Tillmann. “If the geometries are off and you point the smart glasses anywhere slightly off the centre of the object – say on the roof – those centimetres make a big difference and the smart glasses may not recognise the building at all.”

This alignment of the virtual and real worlds is often not precise with today’s GPS sensor technology. Although the technology is improving rapidly, most of today’s smartphones can only offer accuracy of between two and ten metres. For enterprise applications that need centimetre accuracy, an additional GPS solution may be needed.
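
To put those numbers in perspective, the short sketch below estimates how far an overlay would drift from its real-world target for a given positioning error. It is only an illustration; the field-of-view and display-width figures are assumptions, not specifications of any particular headset.

```python
import math

def overlay_misalignment(position_error_m, distance_m,
                         horizontal_fov_deg=30.0, screen_width_px=640):
    """Estimate how far an AR overlay drifts from its real-world target
    when the reported position is off by position_error_m and the target
    is distance_m away. Returns the angular error in degrees and the
    approximate offset in pixels for the assumed display."""
    # Worst case: the position error is perpendicular to the line of sight.
    angular_error_deg = math.degrees(math.atan2(position_error_m, distance_m))
    pixels_per_degree = screen_width_px / horizontal_fov_deg
    return angular_error_deg, angular_error_deg * pixels_per_degree

# A 5 m smartphone-grade GPS error against a facade 20 m away:
deg, px = overlay_misalignment(position_error_m=5.0, distance_m=20.0)
print(f"{deg:.1f} degrees off, roughly {px:.0f} px on the assumed display")

# The same scene with 5 cm accuracy:
deg, px = overlay_misalignment(position_error_m=0.05, distance_m=20.0)
print(f"{deg:.2f} degrees off, roughly {px:.1f} px")
```

With a five-metre error the overlay lands hundreds of pixels away from the building it is supposed to label; at centimetre accuracy the drift shrinks to a few pixels, which is why the alignment question matters so much for enterprise AR.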

While the sensor technology catches up, the reference world, or the underlying location data, is already quite rich and accurate. Location experts such as HERE have mapped the outside world, including 2D and 3D geometry for many buildings and high definition maps of many of the world’s roads, down to centimetre-level accuracy.

“The LiveSight API from HERE is an off-the-shelf solution for companies that renders our highly precise map specifically for augmented reality applications,” explains Tillmann. “Businesses can use the data, but also layer their own, company-specific location data on top for one of the best referencing solutions available.”

Both Vuzix and Optinvent are testing how to use HERE to imbue location context into their enterprise-grade AR experiences. Vuzix, for example, is testing the ability to overlay its own custom data onto a HERE base map and how to incorporate turn-by-turn drive and walk guidance from the HERE mobile SDK. Many of the features available to developers in the HERE mobile SDK, like navigation and map search, are also available offline, which is a key advantage for workers who may be in a remote area or have an unstable Wi-Fi or data connection.

For heads-up object selection and guidance, developers can use LiveSight from HERE to display 2D and 3D virtual objects directly into the camera input. This means the user can point to objects or areas with their smart glasses to get more information and, in some cases, like underground power lines, literally see what’s hidden from view.
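
LiveSight does this rendering for developers, but the underlying principle of anchoring a geo-referenced object in the camera view can be sketched roughly as below. This is an illustrative approximation, not the HERE API; the flat-earth maths, field-of-view and coordinates are assumptions for the example only.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius; adequate over short ranges

def geo_to_screen(device_lat, device_lon, device_heading_deg,
                  poi_lat, poi_lon,
                  horizontal_fov_deg=30.0, screen_width_px=640):
    """Place a geo-referenced point of interest on the camera image.
    Uses an equirectangular approximation, fine for the tens of metres
    typical of street-level AR. Returns the horizontal pixel position,
    or None if the POI is outside the field of view."""
    # Offset of the POI from the device in metres (north, east).
    north = math.radians(poi_lat - device_lat) * EARTH_RADIUS_M
    east = (math.radians(poi_lon - device_lon) * EARTH_RADIUS_M
            * math.cos(math.radians(device_lat)))

    # Bearing to the POI relative to the direction the camera is facing.
    bearing_deg = math.degrees(math.atan2(east, north)) % 360.0
    relative_deg = (bearing_deg - device_heading_deg + 180.0) % 360.0 - 180.0

    if abs(relative_deg) > horizontal_fov_deg / 2:
        return None  # behind the user or off to the side

    # Map the angle onto the display: 0 degrees lands at the screen centre.
    return (relative_deg / horizontal_fov_deg + 0.5) * screen_width_px

# A pole roughly 21 m north and 21 m east of a worker facing north-east:
print(geo_to_screen(52.5308, 13.3847, device_heading_deg=45.0,
                    poi_lat=52.53099, poi_lon=13.38501))
```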

“We didn’t find another AR location technology to rival LiveSight for our smart glasses. It’s true AR technology that provides just the right amount of information, at the right time, at the right size within your field of vision,” says David Chérel, director of innovative solutions and mobility at Eurogiciel, application maker for Optinvent’s ORA-1 smart glasses. “When your head is down, you see the street map or directions to your point of interest. When your head is up, looking towards the horizon, that is to say when you are walking or driving, you just see what you need to from your POIs [points of interest]. This natural motion of looking down for more information, such as you would when consulting a paper street map, and looking up for visual confirmation, has been translated into clever functionality. Thanks to the gyroscope in the ORA-1 glasses providing stability, the user experiences a smooth transition from one mode to the next.”
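
The head-down/head-up behaviour Chérel describes can be approximated with a simple pitch threshold plus a hysteresis band, which is one way to get the smooth, flicker-free switching he mentions. The sketch below is a hypothetical illustration; the threshold values are assumptions, not figures from the ORA-1 application.

```python
class HeadModeSwitcher:
    """Toggle between a head-down 'map' view and a head-up 'AR' view
    based on the pitch angle reported by the glasses' motion sensors.
    The hysteresis band stops the display flickering between modes when
    the user's head hovers near the threshold."""

    def __init__(self, down_threshold_deg=-25.0, up_threshold_deg=-15.0):
        self.down_threshold = down_threshold_deg
        self.up_threshold = up_threshold_deg
        self.mode = "heads_up"

    def update(self, pitch_deg):
        """pitch_deg: 0 = looking at the horizon, negative = looking down."""
        if self.mode == "heads_up" and pitch_deg < self.down_threshold:
            self.mode = "heads_down"   # show the street map / directions
        elif self.mode == "heads_down" and pitch_deg > self.up_threshold:
            self.mode = "heads_up"     # show only the POI markers
        return self.mode

switcher = HeadModeSwitcher()
for pitch in (0, -10, -30, -20, -10, 0):
    print(pitch, switcher.update(pitch))
```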

The HERE mobile SDK and LiveSight AR solutions are flexible enough to be used across many form factors, from smartphones and tablets to smart glasses, goggles, helmets and even smart headphones that come with an optional, flip-down display.

Of course, continuing innovation in location services will also be needed to help power the next generation of AR applications. “For location companies like HERE, indoor mapping and positioning presents a huge opportunity,” says Vuzix’s Anderson. “The demand exists today for AR solutions that work inside warehouses to help, for example, direct workers to the next best, most efficient task based on where they are now in real time. This requires a detailed indoor map in combination with GPS, Bluetooth beacons, wireless access points or other positioning capabilities to accurately pinpoint the location of workers, equipment and goods within the warehouse.”
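
As a rough illustration of the beacon-based positioning Anderson describes, the sketch below estimates ranges from Bluetooth signal strength and trilaterates a 2D position on a warehouse floor plan. The path-loss model, beacon layout and signal values are assumptions for the example only, not details of any real deployment.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Rough log-distance estimate of the range to a Bluetooth beacon.
    tx_power_dbm is the calibrated signal strength at 1 m; both values
    are typical defaults, not figures from a specific site survey."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def trilaterate(beacons):
    """2D position from three (x, y, distance) beacon measurements,
    with coordinates in metres on a local warehouse floor plan."""
    (x1, y1, r1), (x2, y2, r2), (x3, y3, r3) = beacons
    # Subtracting the circle equations pairwise gives a linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Three beacons at known floor-plan positions, ranges estimated from RSSI:
beacons = [(0.0, 0.0, rssi_to_distance(-65)),
           (10.0, 0.0, rssi_to_distance(-72)),
           (0.0, 8.0, rssi_to_distance(-70))]
print(trilaterate(beacons))
```

In practice such beacon fixes are noisy, which is why Anderson lists them alongside GPS, wireless access points and a detailed indoor map rather than as a standalone solution.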

Outside, location context like that from HERE will also be critical for AR views when navigating drones or other unmanned aerial vehicles. Drone operators would be able to see 3D representations of buildings to help architects or city planners visualise planned construction, or service a field of wind generators using AR images shown on top of the drone’s video feed.

Location awareness and context will be a key feature of AR experiences. With augmented reality, developers and companies can reference a rich virtual world and apply it to the real world, helping users find information about the locations around them and then react to that information.

Sarah Durante is communications manager at HERE (www.here.com)

