
Data from: Integrating a UAV-derived DEM in object-based image analysis increases habitat classification accuracy on coral reefs

Citation

Nieuwenhuis, Brian Owain et al. (2022), Data from: Integrating a UAV-derived DEM in object-based image analysis increases habitat classification accuracy on coral reefs, Dryad, Dataset, https://doi.org/10.5061/dryad.6m905qg2p

Abstract

Very shallow coral reefs (< 5 m deep) are naturally exposed to strong sea surface temperature variations, UV radiation and other stressors exacerbated by climate change, raising great concern over their future. As such, accurate and ecologically informative coral reef maps are fundamental for their management and conservation. Since traditional mapping and monitoring methods fall short in very shallow habitats, shallow reefs are increasingly mapped with Unmanned Aerial Vehicles (UAVs). UAV imagery is commonly processed with Structure-from-Motion (SfM) to create orthomosaics and Digital Elevation Models (DEMs) spanning several hundred metres. Techniques to convert these SfM products into ecologically relevant habitat maps are still relatively underdeveloped. Here we demonstrate that incorporating geomorphometric variables (the DEM and its derivatives) in addition to spectral information (the orthomosaic) can greatly enhance the accuracy of automatic habitat classification. To this end, we mapped three very shallow reef areas off KAUST on the Saudi Arabian Red Sea coast with an RTK-ready UAV. Imagery was processed with SfM and classified through Object-Based Image Analysis (OBIA). Within our OBIA workflow, we observed overall accuracy increases of up to 11% when training a Random Forest classifier on both spectral and geomorphometric variables, as opposed to traditional methods that use only spectral information. Our work highlights the potential of incorporating a UAV's DEM in OBIA for benthic habitat mapping, a promising but still scarcely exploited asset.
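The comparison described above (a Random Forest trained on spectral features alone versus spectral plus geomorphometric features) can be sketched with scikit-learn. This is a minimal, illustrative example on synthetic per-object features; the feature layout (three spectral bands plus depth and slope), class structure, and data values are assumptions, not the study's dataset or workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for per-object features: three "spectral" bands and
# two "geomorphometric" variables (e.g. depth, slope). Purely illustrative.
rng = np.random.default_rng(42)
n = 600
labels = rng.integers(0, 3, n)                      # three habitat classes
spectral = rng.normal(labels[:, None], 1.0, (n, 3))  # class-dependent means
geomorph = rng.normal(labels[:, None], 0.8, (n, 2))

def evaluate(X, y):
    """Hold-out accuracy of a Random Forest on the given feature matrix."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=0
    )
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_tr, y_tr)
    return accuracy_score(y_te, clf.predict(X_te))

acc_spectral = evaluate(spectral, labels)
acc_combined = evaluate(np.hstack([spectral, geomorph]), labels)
print(f"spectral only: {acc_spectral:.2f}")
print(f"spectral + geomorphometric: {acc_combined:.2f}")
```

On real OBIA segments, the feature matrix would hold per-object statistics (e.g. mean band values, mean depth, mean slope) rather than synthetic draws, but the classifier setup is the same.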

Methods

Data were collected with a DJI Phantom 4 RTK UAV, an Emlid Reach RS2 GNSS base station, and a set of GoPro HERO9 cameras.

UAV imagery was geolocated through Post-Processed Kinematics (PPK).

UAV and GoPro imagery were processed with Structure-from-Motion (SfM) in Pix4D and Agisoft Metashape, respectively.
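Geomorphometric derivatives such as slope can be computed from an SfM-derived DEM raster. A minimal NumPy sketch, using a toy elevation grid and an assumed cell size (in practice the DEM would be read from the dataset's GeoTIFF with a library such as rasterio):

```python
import numpy as np

# Hypothetical ground sampling distance in metres per pixel.
cell_size = 0.05

# Toy elevation surface standing in for an SfM-derived DEM tile.
x = np.linspace(0.0, 1.0, 50)
dem = np.outer(x, x)

# Elevation gradients along rows and columns, scaled by cell size.
dz_dy, dz_dx = np.gradient(dem, cell_size)

# Slope in degrees: arctan of the gradient magnitude.
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
print(slope_deg.shape)
```

The resulting slope raster can then be attached to OBIA segments as an additional classification feature alongside the spectral bands.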

Shapefiles of habitat classifications were created in Trimble eCognition and GIS software.

Usage Notes

GIS software is required to view and use the data. We recommend either ArcGIS or QGIS (open-source).

Funding

King Abdullah University of Science and Technology, Award: BAS11090-01-01

Groninger Universiteitsfonds, Award: 2021AU050