
Establishing the right process for mobile LiDAR data

By [email protected] - 21st April 2017 - 14:30

Mobile LiDAR Systems (MLS) are quickly emerging as the dominant technology for surveying and mapping transport corridors. The reasons are clear. Productivity is unrivalled, as data is acquired by a vehicle travelling at road speeds. The MLS data representing the 3D corridor can meet the highest survey standards. And the need for survey or mapping personnel on the roadway is eliminated, as are lane closures and other traffic maintenance measures.

There are many survey/mapping companies operating MLSs today and the number is rapidly increasing. Thus, any government agency or company active in transport infrastructure will need an efficient process to exploit the value of MLS data across their operations.

W Edwards Deming, the father of the quality revolution, said, “If you can’t describe what you do as a process, you don’t know what you’re doing.” This certainly applies to processing MLS data. A poorly defined, inefficient process between data delivery and the extraction of deliverables can render an otherwise productive operation ineffective and unprofitable. This article discusses the requirements, challenges and solutions associated with implementing the processes necessary to extract value from MLS data efficiently.

The MLS data flow process can be broken down into three basic components: Manage, Assess and Extract. Manage refers to the administration and storage of the data. Assess addresses how the data is evaluated to assure its quality is sufficient to meet project objectives. Extract refers to the operations required to develop topographic surveys, asset maps and general information from the data to feed downstream operations.


Even moderate MLS activity will produce terabytes of data very quickly. The first step in implementing a data management process is to decide where data will be stored. Scaling an existing internal network to accommodate terabytes or even petabytes can be very costly in hardware, installation, security and annual maintenance. Another option is the very large-scale cloud-based storage services now available from providers such as Amazon Web Services. These are secure, practically limitless and inexpensive.

Having chosen the optimal storage medium, the next requirement is the user interface. Large amounts of data stored on a cloud server are of little use without the ability to efficiently locate and transfer the required data. Additional administration requirements include determining how the stored data is organised to facilitate identification and location. Control protocols must be implemented to assign access to data efficiently among many remote users, assure all users are accessing the same version of the data, and accommodate additional metadata relevant to an MLS project. Such metadata would include survey control coordinates employed in the project, results of quality control and assurance analyses, extracted CAD files and other relevant material.

As an example of implementing such a solution, Certainty 3D’s TopoDOT software offers a built-in administrator called TopoCloud. As a first step, data is organised within TopoDOT into tiles of manageable size for efficient storage, identification, access and transfer. Geospatial GUI links are automatically generated and associated with each data tile.
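The general idea of breaking a corridor point cloud into geospatially indexed tiles can be sketched as follows. This is a minimal illustration, not TopoDOT’s actual scheme: the 50m tile size and the simple (x, y, z) point format are assumptions.

```python
from collections import defaultdict

# Assumed tile size in metres; real projects choose this to balance
# file size against the number of tiles to manage.
TILE_SIZE = 50.0

def tile_key(x, y, size=TILE_SIZE):
    """Map a point's easting/northing to the index of the tile containing it."""
    return (int(x // size), int(y // size))

def tile_points(points, size=TILE_SIZE):
    """Group (x, y, z) points into tiles keyed by grid index."""
    tiles = defaultdict(list)
    for x, y, z in points:
        tiles[tile_key(x, y, size)].append((x, y, z))
    return tiles

# Three illustrative points: two fall in tile (0, 0), one in tile (1, 0).
points = [(12.0, 7.0, 101.3), (62.0, 7.0, 101.1), (12.5, 7.2, 101.4)]
tiles = tile_points(points)
```

Because each tile key corresponds to a fixed ground footprint, it can double as the geospatial link a map-based GUI uses to locate and fetch that tile.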

Then a stand-alone TopoCloud administrator module allows a project manager to upload all MLS project data and metadata to the storage medium of choice. The administrator may select and grant access to users anywhere in the world. Those users granted access see the data GUI links over a map allowing them to select data where and when needed to download to their workstation hard drive to support extraction operations.

Users with project access download data where and when they need it. To maintain data integrity across the user base, TopoCloud will always check to see if requested data on a local hard drive has been changed on the cloud storage medium. If so, the updated data will be downloaded again from cloud storage. Thus, any changes to the data uploaded by the project administrator will migrate across the user base as the data is accessed.
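The version check described above can be illustrated with a content-hash comparison: before using a cached tile, compare its hash against the hash recorded for the cloud copy and re-download on mismatch. TopoCloud’s actual mechanism is not documented here; this is only a sketch of the idea.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Content fingerprint of a data tile (SHA-256 of its bytes)."""
    return hashlib.sha256(data).hexdigest()

def needs_refresh(local_bytes: bytes, cloud_hash: str) -> bool:
    """True if the locally cached copy no longer matches the cloud version."""
    return file_hash(local_bytes) != cloud_hash

# Illustrative contents only: the administrator uploaded a revised tile,
# so the stale local cache must be refreshed before extraction continues.
cloud_copy = b"tile v2: updated control points"
local_copy = b"tile v1: original points"
stale = needs_refresh(local_copy, file_hash(cloud_copy))      # True
current = needs_refresh(cloud_copy, file_hash(cloud_copy))    # False
```

A check like this is what lets an administrator’s corrections migrate automatically across a distributed user base as each user touches the data.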


Having provided for the management of the MLS data, the next process step is to assess the data quality. MLS data is employed in supporting various corridor design, engineering and construction operations. Workflows for assessing spatial accuracy and characteristics of MLS data are critical to meet requirements for payment, to identify and address anomalies and to determine if the extracted deliverables will meet intended project objectives.

As an example of just one step in the assessment process, relative data accuracy can be established by examining the elevation difference between MLS data acquired during overlapping passes along the same corridor (see Figure 1). This provides a good indicator of overall system calibration and performance.
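In code, the overlap check amounts to differencing elevations at matching horizontal positions. The sketch below assumes the ground points of the two passes have already been paired (real workflows grid or nearest-neighbour-match the clouds first), and the 2cm tolerance is an assumed project threshold, not a standard.

```python
def elevation_deltas(pass_a, pass_b):
    """Absolute elevation differences between paired (x, y, z) ground
    points from two overlapping passes of the same corridor."""
    return [abs(za - zb) for (_, _, za), (_, _, zb) in zip(pass_a, pass_b)]

# Illustrative paired ground points from two passes over the same spot.
pass_a = [(100.0, 200.0, 15.012), (101.0, 200.0, 15.034)]
pass_b = [(100.0, 200.0, 15.004), (101.0, 200.0, 15.021)]

deltas = elevation_deltas(pass_a, pass_b)
TOLERANCE = 0.02  # metres; assumed project threshold
within_spec = max(deltas) <= TOLERANCE
```

A systematic bias or drift in these deltas along the corridor is the kind of calibration anomaly this step is designed to surface.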

A subsequent assessment step then compares the aligned point clouds to geospatial coordinates. This process establishes a documented lineage from the point cloud to the validated survey control.
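The comparison to survey control can be summarised as a residual statistic: at each control point, measure the 3D offset to the corresponding cloud-derived coordinate and roll the offsets up into an RMSE. The coordinates and the 1cm budget below are illustrative values only.

```python
import math

def rmse_3d(control, cloud):
    """Root-mean-square 3D residual between validated control points
    and the matching coordinates derived from the point cloud."""
    sq = [(cx - px) ** 2 + (cy - py) ** 2 + (cz - pz) ** 2
          for (cx, cy, cz), (px, py, pz) in zip(control, cloud)]
    return math.sqrt(sum(sq) / len(sq))

# Illustrative control coordinates and their cloud-derived counterparts.
control = [(1000.000, 5000.000, 20.000), (1100.000, 5000.000, 21.500)]
cloud   = [(1000.004, 5000.002, 20.006), (1100.003, 4999.997, 21.494)]

rmse = rmse_3d(control, cloud)        # roughly 7 mm here
meets_budget = rmse < 0.01            # assumed 1 cm accuracy budget
```

Recording this statistic alongside the data is one way to maintain the documented lineage from point cloud back to validated control.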

As the data might be used at different times for different purposes, the Assess step becomes critical in determining the data’s suitability for meeting the respective objectives of each application. A robust and well-defined assessment protocol thus increases the value of the MLS data as it expands the scope of its use.


Having established a process for managing MLS data and assessing its quality, the next step is the extraction of corridor topographies, asset maps, clearance measurements, 3D models or any other deliverables. An extraction team must be trained to properly extract the required deliverables to feed downstream operations. Of primary importance in organising such a team is matching its extraction capacity to the expected MLS data acquisition rate, lest the team become a bottleneck to the overall workflow.

In traditional land survey operations, surveyors attach ‘intelligence’ to the coordinates as they are collected in the field. CAD software is then relatively quick to draw break lines, topographies and models, as the software is provided the identity of each coordinate and its relationship to neighbouring coordinates. Thus, a single CAD technician might typically support the work of three or more traditional survey field crews.

The nature of MLS data reverses this ratio, as many more extraction technicians are required to support a single MLS. Keep in mind that MLS acquisition is so productive that it covers in hours what a single field crew might take days or weeks to survey. So, the overall process is much more productive, safer and less costly. But the extraction CAD team must be enlarged and trained in the skills necessary to develop the required deliverables. The size of the team can vary, but 15-20 extraction technicians may be processing every day to support an MLS operating at close to maximum acquisition capacity.

To avoid the extraction process becoming a bottleneck, one should first estimate the number of annual kilometres to be acquired by the MLS. It is important to apportion the estimated distance by the type of project – a full corridor survey extracted to meet level one survey accuracy will require much more time than an asset mapping project. Similarly, simple bridge clearance measurements will require even less time than mapping. See ‘Extraction calculations’ for a simple example of this.
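The capacity estimate described above is simple arithmetic: apportion the annual kilometres by project type, divide each portion by that type’s per-technician extraction rate, and sum. All rates and kilometre figures below are assumed example values, not industry benchmarks.

```python
import math

DAYS_PER_YEAR = 220  # assumed working days per technician per year

# Project type -> (annual km acquired, km one technician extracts per day).
# Full-accuracy corridor work is assumed far slower than asset mapping,
# which in turn is slower than simple clearance measurements.
project_types = {
    "full corridor survey": (400, 0.5),
    "asset mapping":        (1200, 4.0),
    "clearance checks":     (800, 20.0),
}

technicians = 0.0
for name, (annual_km, km_per_day) in project_types.items():
    technicians += annual_km / (km_per_day * DAYS_PER_YEAR)

staff_needed = math.ceil(technicians)  # about 5.2, so staff at least 6
```

Even with rough inputs, running this sum before the acquisition season starts shows whether the extraction team, rather than the MLS itself, will set the pace of the whole operation.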

Distributing the process

While there are variations, Manage, Assess and Extract are the ‘fundamental’ process components of the MLS data flow. Understanding these processes, organising a well-trained team around them and assuring the production capacity will match the expected amount of project data being acquired are the keys to a successful MLS data processing operation.

A final point is that while such a process is a requirement for successful MLS data flow, it is often distributed over several entities. For example, MLS consultants may deliver data to a transport agency. That agency may assess its quality and pass the data back to the consultant for extraction. The agency might then assess the deliverable quality. Thus, regardless of how responsibility is distributed, understanding the process and assuring that it is successfully executed is the key to success. We’re sure Mr Deming would agree.

Ted Knaak is president of Certainty 3D
