Using deep convolutional neural networks to forecast spatial patterns of Amazonian deforestation: supporting data and outputs
Ball, James; Petrova, Katerina; Coomes, David; Flaxman, Seth (2022), Using deep convolutional neural networks to forecast spatial patterns of Amazonian deforestation: supporting data and outputs, Dryad, Dataset, https://doi.org/10.5061/dryad.hdr7sqvjz
1. Tropical forests are subject to diverse deforestation pressures, yet their conservation is essential to achieving global climate goals. Predicting the location of deforestation is challenging due to the complexity of the natural and human systems involved, but accurate and timely forecasts could enable effective planning and on-the-ground enforcement to curb deforestation rates. New computer vision technologies based on deep learning can be applied to the increasing volume of Earth observation data to generate novel insights and make predictions with unprecedented accuracy.
2. Here, we demonstrate the ability of deep convolutional neural networks (CNNs) to learn spatiotemporal patterns of deforestation from a limited set of freely available global data layers, including multispectral satellite imagery, the Hansen maps of annual forest change (2001-2020) and the ALOS PALSAR digital surface model, to forecast deforestation (2021). We designed four model architectures, based on 2D CNNs, 3D CNNs, and Convolutional Long Short-Term Memory (ConvLSTM) Recurrent Neural Networks (RNNs), to produce spatial maps that indicate the risk to each forested pixel (~30 m) in the landscape of becoming deforested within the next year. They were trained and tested on data from two ~80,000 km² tropical forest regions in the Southern Peruvian Amazon.
3. The networks could predict the location of future forest loss with high accuracy (F1 = 0.58-0.71). Our best-performing model (a 3D CNN) had the highest pixel-wise accuracy (F1 = 0.71) when validated on 2020 forest loss after training on 2014-2019 data. Visual interpretation of the mapped forecasts indicated that the network could automatically discern the drivers of forest loss from the input data. For example, pixels around new access routes (e.g. roads) were assigned high risk, whereas this was not the case for recent, concentrated natural loss events (e.g. remote landslides).
4. CNNs can harness limited time-series data to predict near-future deforestation patterns, an important step in exploiting the growing volume of satellite remote sensing data to curb global deforestation. The modelling framework can be readily applied to any tropical forest location and used by governments and conservation organisations to prevent deforestation and plan protected areas.
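As an illustration of the 3D CNN approach described above, the sketch below builds a minimal PyTorch model that takes a stack of annual data layers (batch, channels, years, height, width) and outputs a per-pixel deforestation risk map. The class name, layer sizes, and input dimensions here are illustrative assumptions, not the architecture released in this dataset; see the DeepForestcast repository for the actual models.

```python
import torch
import torch.nn as nn

class RiskMap3DCNN(nn.Module):
    """Illustrative 3D CNN: (batch, channels, years, H, W) -> per-pixel risk (batch, 1, H, W)."""

    def __init__(self, in_channels=4, hidden=16):
        super().__init__()
        # 3D convolutions mix spatial and temporal neighbourhoods
        self.features = nn.Sequential(
            nn.Conv3d(in_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # 1x1 convolution maps features to a single risk channel in [0, 1]
        self.head = nn.Sequential(
            nn.Conv2d(hidden, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        f = self.features(x)   # (B, hidden, T, H, W)
        f = f.mean(dim=2)      # collapse the temporal axis -> (B, hidden, H, W)
        return self.head(f)    # (B, 1, H, W), one risk value per pixel

model = RiskMap3DCNN()
x = torch.randn(1, 4, 6, 64, 64)  # e.g. 4 data layers over 6 annual snapshots
risk = model(x)
print(risk.shape)  # torch.Size([1, 1, 64, 64])
```

Averaging over the temporal axis is just one simple way to reduce the time dimension; the published models may use strided convolutions or other reductions instead.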
All input raster data are freely available online.
Original input raster data from:
- Global Forest Change - https://glad.earthengine.app/view/global-forest-change
- ALOS JAXA - https://www.eorc.jaxa.jp/ALOS/en/dataset/aw3d30/aw3d30_e.htm
Processed with code at https://github.com/PatBall1/DeepForestcast
This dataset contains:
- Input shapefiles for each study site.
- Input geotiff files (.tif) for each study site.
- Input PyTorch tensors (.pt) for each study site.
- Model weights (.pt) for trained networks (for testing and forecasting).
- Output deforestation forecasts for each study site as geotiffs (.tif).
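The .pt tensor and model-weight files load with standard PyTorch serialization. The round-trip sketch below uses a dummy tensor and placeholder filename (the archive's actual file names may differ), but the same `torch.load` call applies to the dataset's files.

```python
import os
import tempfile

import torch

# Dummy tensor standing in for one of the dataset's input tensors,
# e.g. data layers x years x height x width (shape here is illustrative).
tensor = torch.randn(4, 6, 64, 64)

# Placeholder path; substitute the actual .pt file from the archive.
path = os.path.join(tempfile.mkdtemp(), "example_tensor.pt")
torch.save(tensor, path)

# map_location="cpu" lets GPU-saved files load on machines without CUDA
loaded = torch.load(path, map_location="cpu")
print(loaded.shape)  # torch.Size([4, 6, 64, 64])
```

Model weight files load the same way and can then be passed to a matching architecture via `model.load_state_dict(...)`.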
Python (PyTorch) for processing tensors - https://pytorch.org/
QGIS to view raster and polygon data - https://www.qgis.org/
Natural Environment Research Council, Award: PDAG/501