The proof is in the vegetation map

By Mary Jo Wagner - 22nd June 2018

It isn’t often that ‘exhilarating’ is used to describe vegetation mapping surveys. But Dr Whitney Broussard, a senior scientist at JESCO, an environmental and geotechnical services company in Jennings, Louisiana, in the United States, chose that word – twice – to describe a UAV-based survey along the state’s coastal marsh.

“Our area of interest (AOI) was 4km south of the nearest terra firma,” he says. “We strapped the catapult down on our research boat and launched the drone off the bow over open water. It was like launching a mini airplane off a mini aircraft carrier. We chased it home doing 65km/h down the canal and then right as the drone was finishing, we sent the final command for it to land on open grass near a boat launch. It was exhilarating.”

Part of Broussard’s excitement might have stemmed from that being his first commercial UAV flight. But a larger contributor to that enthusiasm was the fact that the flight was the foundational survey for a successful pilot project testing the viability of using hyperspatial imagery with object-based image analysis (OBIA) technology to improve the accuracy and detail of vegetation mapping in coastal wetlands – environments that are notoriously difficult to survey.

“Coastal marshland is extremely difficult to map, both from the air and the ground,” explains Broussard. “Accessing it via airboat and foot can disrupt the vegetation you’re trying to protect. Accuracy is an issue because it can be difficult to establish control and often the GPS technology used isn’t survey-grade. And traditional aerial surveys and image processing techniques aren’t fine enough to precisely classify the land/water boundary and varied vegetation.”

Broussard’s UAV-OBIA application, however, aims both to resolve these unique wetlands-mapping problems and to improve upon traditional survey methodologies. By pairing the spectral richness of UAV imagery with OBIA’s rapid, intelligent classification, his integrated technological solution is not only beginning to yield new revenue streams for JESCO; it may also help redefine the business of vegetation mapping for state and local authorities.

Bucking tradition

Louisiana’s coast is home to 2.5 million residents – more than half its population – as well as 37% of all the coastal marshes and habitat in the continental US. Despite the importance of these vital natural assets, Louisiana has lost nearly 4,900 square kilometres of land since the 1930s and, without action, the coast could lose up to another 10,600 square kilometres over the next 50 years.

To help combat the natural challenges facing its coastline, the state’s Coastal Protection and Restoration Authority (CPRA), with funding and support from the US Geological Survey (USGS) and the Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA), established 390 monitoring sites under the Coastwide Reference Monitoring System (CRMS) across the entire Louisiana coast to measure the health of the coastal marsh vegetation and the impact of erosion.

Although the CPRA and USGS have been routinely mapping and monitoring the CRMS sites with 1m aerial imagery, the coarser resolution only enables them to map the land/water interface, not vegetation, so ground surveys have been used to measure the vegetation. Teams of scientists walk the marsh along pre-determined transects, drop a one-square-metre PVC quadrat at random locations, record each location with a handheld GPS and visually determine the types of vegetation and their coverage values.

That traditional approach is time-consuming, and the 1m imagery analysis tends to overestimate land coverage. Broussard has been using that CRMS data in his research to calibrate and validate a modelling approach that automatically classifies UAV imagery and creates vegetation models using Trimble’s eCognition OBIA technology. eCognition works by following user-defined processing workflows, called rule sets, to automatically detect, classify and map specified objects.

“The traditional CRMS data detail and accuracy haven’t been exact enough to create precise vegetation models,” says Broussard. “When I began working with JESCO’s UAV data, I thought the hyperspatial, fine-scale imagery would be a natural fit for developing an OBIA-based technique. Unlike traditional image-processing methodologies, the OBIA software could handle the high spectral variance and subtleties of the hyperspatial data. I wanted to test the feasibility of combining the two technologies to produce meaningful coastal vegetation maps that could supplement the state’s traditional monitoring programs.”

In the spring of 2016, Broussard got his chance. JESCO had been contracted to fly its Trimble UX5 Multispectral UAV over a restoration site in Terrebonne Parish, a dense, marshland region near the Gulf of Mexico. With guidance and field support from his postdoctoral mentor Dr Jenneke Visser at the University of Louisiana at Lafayette, Broussard successfully pitched the idea of extending the field work to fly over a CRMS site in the same area to test his proof-of-concept.

All the colours of the marsh

Broussard chose the nearest one square kilometre CRMS site for his AOI. Its moderate-sized footprint and simple vegetation pattern offered a nice testbed for the pilot. The UX5 flights were scheduled for late August 2016, allowing them to capture data during the peak biomass season.

To ensure the reliability and accuracy of the UAV data, the team set out five ground control points (GCPs) for each flight block. They navigated to each pre-defined location, laid down an elevated target – designed not to disturb the vegetation – and recorded the GCP’s position with a Trimble Geo 7X GNSS handheld receiving RTK corrections from a VRS network. After a day’s work, the team had placed 11 GCPs throughout three overlapping flight blocks.

Based on his experience using UAV data for wetlands-mapping research, Broussard knew that collecting imagery over patches of open water would be a photogrammetry challenge. So, in addition to outfitting the UX5 with a Sony Alpha 5100 sensor for natural colour (RGB) imagery, he also added a Sony NEX5 with a modified, near infrared (NIR) sensor to the UAV’s payload. The NIR reflectance values would help them better delineate the land/water interface and differentiate between vegetation species.

Using Trimble’s Aerial Imaging Flight Planning software, Broussard established three flight blocks over the AOI. All three were flown with the RGB sensor; the middle block was also flown with the NIR sensor. Launched off the boat about 4km from the landing site, the UAV flew at an altitude of 75m and a speed of 80km/h. The team followed the UAV by boat, maintaining constant communication and line-of-sight for each flight, and then guided the aircraft back to the ground. In three hours of total flight time, the UX5 collected 4,106 images over the entire AOI at a ground sample distance (GSD) of 2.5cm – some 98.5 billion pixels captured over one square kilometre.
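As a quick sanity check on those numbers, the arithmetic below reproduces the reported figures; the roughly 24-megapixel frame size assumed for the Sony camera is an assumption for illustration, not a value reported from the survey.

```python
# Back-of-the-envelope check of the pixel figures quoted above.
# Assumption: ~24-megapixel frames (Sony Alpha 5100-class sensor).

n_images = 4106                # images collected over the AOI
pixels_per_frame = 24e6        # assumed frame size in pixels

total_pixels = n_images * pixels_per_frame
print(f"Total pixels captured: {total_pixels / 1e9:.1f} billion")   # ~98.5 billion

# Ground coverage implied by a 2.5cm GSD over the 1km^2 AOI if each spot on
# the ground were imaged exactly once (the difference from the total above
# reflects image overlap):
gsd_m = 0.025                  # ground sample distance in metres
aoi_m2 = 1_000_000             # 1 square kilometre in square metres
single_cover_pixels = aoi_m2 / gsd_m**2
print(f"Single-coverage pixels: {single_cover_pixels / 1e9:.1f} billion")  # ~1.6 billion
```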

Of the four flights, Broussard chose to use the 1,984 overlapping RGB and NIR flight images as his test-case sources. Using Trimble’s Inpho UASMaster software, he first generated two digital surface models (DSM), one from the RGB data and one from the NIR data, and then used the DSMs to produce orthomosaics of each. The orthomosaics and DSMs had horizontal and vertical accuracies of 2.4cm. All of those products were used as source data for eCognition.

“A key advantage of UAV data over traditional aerial photography is the spatial resolution,” says Broussard. “You can see a shadow behind a leaf, the individual plant stems, and the different colour tones from one leaf to the next. Those intricate reflectance and elevation values enable you to build point clouds and elevation models that the OBIA technology can use to accurately delineate land from water and classify vegetation.”

A mapping success

For the eCognition process, Broussard needed to start by building a rule set that would instruct the software to methodically isolate and classify image objects according to his user-defined plan.

Integrating the DSM and colour infrared orthomosaic as inputs, Broussard developed a two-tiered rule set to first delineate the land and water, and then further delineate the land into three vegetation classes: Grass (Spartina patens), Reed (Phragmites australis), and Other.

Using a multispectral segmentation algorithm defined by NIR, red, green and DSM thresholds, together with object scale, shape and compactness specifications, the rule set first segmented the data stack into meaningful objects, weighting NIR reflectance and object compactness most heavily while also considering the height of the vegetation. Relying predominantly on the NIR information, Broussard manually defined a water-mask threshold for the software: objects with values below the threshold were classified as Water, and objects above it as Land. To further refine the classification, he applied a minimum mapping unit (MMU) of 1.3 square metres and instructed eCognition to reclassify any water feature smaller than the MMU as land.
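The water/land step can be mirrored outside eCognition for readers who want to experiment. The sketch below re-expresses the logic in Python with NumPy and SciPy; it is a minimal approximation, and the array names, threshold handling and connected-component treatment of the MMU are assumptions, not Broussard’s actual rule set.

```python
# A minimal sketch of the NIR water mask plus minimum-mapping-unit (MMU)
# clean-up described above. Not eCognition's rule set - just the same idea.
import numpy as np
from scipy import ndimage

def classify_water_land(nir, water_threshold, mmu_m2=1.3, gsd_m=0.025):
    """Threshold NIR reflectance into water/land, then reclassify any
    water patch smaller than the MMU as land."""
    water = nir < water_threshold               # low NIR reflectance -> water

    # Label connected water patches and measure their areas in m^2.
    labels, n_patches = ndimage.label(water)
    pixel_area = gsd_m ** 2
    sizes = ndimage.sum(water, labels, index=range(1, n_patches + 1)) * pixel_area

    # Reclassify sub-MMU water patches as land.
    small_ids = np.flatnonzero(sizes < mmu_m2) + 1   # label ids start at 1
    water[np.isin(labels, small_ids)] = False
    return water    # boolean mask: True = Water, False = Land
```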

The Land class objects were then merged and re-segmented, first by running a nearly identical multispectral segmentation algorithm. Then, following user-defined spectral rules, the software identified and combined new objects with similar spectral signatures, creating large, compact, roughly circular objects that accounted for the height of the vegetation. Reeds were classified by analysing each object’s average height (reeds stand taller than the surrounding grasses) and its average height difference from neighbouring objects.
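In the same spirit, a hedged approximation of the height rule is sketched below: per-object mean heights are pulled from the DSM and compared against a reference. Comparing against immediate neighbours, as the actual rule set does, is simplified here to a comparison against the mean of all land objects, and the labels array and height margin are illustrative assumptions.

```python
# A simplified sketch of the height-based Reed rule. Per-object statistics
# stand in for eCognition's object features; the 0.5m margin is illustrative.
import numpy as np
from scipy import ndimage

def classify_reeds(dsm, labels, height_margin_m=0.5):
    """Flag segmented land objects whose mean canopy height exceeds the
    average land-object height by a margin (reeds stand taller than the
    surrounding marsh grasses)."""
    ids = np.unique(labels)
    ids = ids[ids > 0]                               # 0 = water/background
    mean_h = ndimage.mean(dsm, labels, index=ids)    # mean DSM per object
    reference = mean_h.mean()                        # scene-wide stand-in
                                                     # for "surrounding objects"
    reed_ids = ids[mean_h > reference + height_margin_m]
    return np.isin(labels, reed_ids)                 # boolean Reed mask
```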

To classify the Other vegetation, Broussard integrated a Normalised Green Red Difference Index and Grey Level Co-occurrence Matrix (GLCM) dissimilarity and contrast indices to define the texture values. eCognition used that information to identify the Other vegetation objects based on a combination of their greenness values and their texture. The remaining land objects were then classified as Grass, which was the dominant vegetation type for this landscape.
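Both feature types are straightforward to compute with standard tools. The sketch below derives NGRDI from the green and red bands and the two GLCM statistics with scikit-image; the band arrays, patch extraction and GLCM parameters (distance, angles, grey levels) are illustrative assumptions.

```python
# NGRDI greenness and GLCM texture features, as named above.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def ngrdi(green, red):
    """Normalised Green Red Difference Index: (G - R) / (G + R)."""
    return (green - red) / (green + red + 1e-9)      # epsilon avoids /0

def glcm_texture(gray_patch_u8):
    """GLCM dissimilarity and contrast for an 8-bit greyscale patch
    (e.g. the pixels belonging to one segmented object)."""
    glcm = graycomatrix(gray_patch_u8, distances=[1],
                        angles=[0, np.pi / 2],       # horizontal + vertical
                        levels=256, symmetric=True, normed=True)
    return (graycoprops(glcm, "dissimilarity").mean(),
            graycoprops(glcm, "contrast").mean())
```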

“The magic of eCognition is in its segmentation,” says Broussard. “Once you have set your parameters within the segmentation process, eCognition uses those parameters to group pixels into units that share similar attributes and categorise them. It mimics the human brain’s process of identifying objects through pattern recognition. Using a rule set ensures that you are capturing the objects you want classified because the software won’t deviate from the rules. That’s why OBIA is able to methodically and repeatedly do something that humans can’t do.”

Broussard exported the classifications as shapefiles and used ArcGIS to finalise the cartography and perform a spatial analysis, calculating the percentage coverage for each vegetation type and for land versus water. Based on a comparative analysis with the CPRA’s 2012 data, the classification identified 100% of the vegetation types, estimated plant heights to within 88-94% accuracy, and produced a land-water interface map that was “strikingly more detailed”.
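The coverage calculation itself is a simple area summary once the polygons exist. A minimal sketch with GeoPandas is shown below; the file name and the 'class' attribute field are hypothetical (the article’s analysis was done in ArcGIS).

```python
# Percentage coverage per class from the exported classification polygons.
import geopandas as gpd

polys = gpd.read_file("vegetation_classes.shp")    # hypothetical file name
polys["area_m2"] = polys.geometry.area             # requires a projected CRS

coverage = polys.groupby("class")["area_m2"].sum()
print((100 * coverage / coverage.sum()).round(1))  # % cover per class
```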

“With drones and OBIA technology, instead of producing point data every 100m or so, we produce models every few centimetres,” says Broussard. “That gives users an incredibly data-rich product to help them better assess vegetation health and to quantify the rate of wetland loss and changes in the coastal zone.”

Indeed, after the team presented their maps to the CPRA in the spring of 2017, a wetland scientist there expressed interest in developing a new method for marsh creation monitoring and incorporating it into the authority’s traditional monitoring campaigns this fall.

“That kind of response and validation says that the project was a success,” says Broussard. “And it was a significant test-case success for JESCO, too, which hadn’t been focused on vegetation mapping previously. It’s given us the opportunity to take on more of this work.”

New business takes flight

That new business, in fact, began only a few months after completion of the Terrebonne Parish project, when survey company CH Fenstermaker tasked JESCO with surveying and classifying a wetlands mitigation bank at the Rockefeller Wildlife Refuge (RWR), a 290 square kilometre wildlife and fisheries refuge in southwestern Louisiana.

In an effort to offset ecological losses from infrastructure improvements, the RWR developed a 0.4 square kilometre wetlands mitigation bank and, in 2010 and 2012, planted a variety of wetland grasses across the site. The refuge first surveyed the area in 2016, using airboats and visual inspections to determine the vegetation species present and their coverages within a 2m by 2m PVC frame on the ground.

Based on that initial survey, scientists were concerned that repeated measurements with this methodology could harm the wetlands – airboat trails can cause permanent damage – so the RWR wanted a less invasive, more accurate approach. In November 2017, Broussard went to the RWR to fly and map the mitigation bank’s vegetation. Working in tandem with Fenstermaker surveyor Ricardo Johnson, the team established eight GCPs, setting them at 300-600m intervals and using a Trimble R7 base station and R6-4 rover RTK GNSS receiver to record their positions.

They flew four UX5 flights – two RGB missions and two NIR – using the same Sony sensors, and given the size and location of the site, Broussard could control and monitor each 30-minute flight from one location. In total, the UX5 collected 4,899 images with a 2.5cm GSD. Each dataset was used to produce a DSM and an orthomosaic for input into eCognition.

Having created a ‘master’ rule set from the CRMS project, Broussard needed only to slightly modify it to accommodate the different vegetation classes. In less than two hours, eCognition had delineated the land/water boundaries and classified the land into four classes: Grass (Spartina patens), Reed (Phragmites australis), Shrub/Scrub and Impervious.

“One of the key elements that made it possible to distinguish this particular vegetation was the DSM, which you only get with the drone data,” says Broussard. “Having the elevation values allowed me to differentiate the species, particularly the Phragmites australis and the shrub, which are really challenging because they have similar textures and NIR reflectance.”

Although Broussard is still analysing RWR’s 2016 survey report to evaluate the accuracy of the classification, the response from refuge managers to the OBIA-based vegetation map has been incredibly positive.

“A detailed, OBIA-based vegetation map gives managers a meaningful measurement of their wetlands environment,” says Broussard. “It also provides a highly accurate map of their wetland acreage – rather than just an estimation – which they can use in their required reporting to authorities.”

Based on the early successes of this new operational application, Broussard sees a bright future for UAV and OBIA technology.

“Combining these technologies opens up tremendous possibilities for monitoring changes on the ground. Coastal environments will still be challenging and dynamic. But with this integrated approach, we can not only replicate and supplement traditional monitoring methodologies, we can produce precise vegetation and land cover maps at scales and speeds that we could never imagine or achieve as humans.”

If Broussard’s vision proves correct, he may experience more exhilarating fieldwork.

Mary Jo Wagner is a writing and editing consultant and contractor.
