Summary
Published in ISPRS Journal of Photogrammetry and Remote Sensing 156: 14-26. https://doi.org/10.1016/j.isprsjprs.2019.07.010
Tree mortality is an important forest ecosystem variable with uses in many applications, such as forest health assessment, modelling stand dynamics and productivity, or planning wood harvesting operations. Because tree mortality is a spatially and temporally erratic process, its rates and spatial patterns are difficult to estimate with traditional inventory methods. Remote sensing imagery has the potential to detect tree mortality at the spatial scales required to accurately characterize this process (e.g., landscape, region). Many efforts have been made to this end, mostly using pixel- or object-based methods. In this study, we explored the potential of deep Convolutional Neural Networks (CNNs) to detect and map tree health status and functional type over entire regions. To do this, we built a database of around 290,000 photo-interpreted trees that served to extract and label image windows from 20 cm-resolution digital aerial images, for use in CNN training and evaluation. In this process, we also evaluated the effect of window size and spectral channel selection on classification accuracy, and we assessed whether multiple realizations of a CNN, generated using different weight initializations, can be aggregated to provide more robust predictions. Finally, we extended our model with 5 additional classes to account for the diversity of land covers found in our study area. When predicting tree health status only (live or dead), we obtained test accuracies of up to 94%, and up to 86% when predicting functional type only (broadleaf or needleleaf). Channel selection had a limited impact on overall classification accuracy, while larger window sizes increased the ability of the CNNs to predict plant functional type. Aggregating multiple realizations of a CNN allowed us to avoid selecting suboptimal models and helped to remove much of the speckle effect when predicting on new aerial images. Test accuracies for plant functional type and health status were not affected in the extended model and were all above 95% for the 5 extra classes. Our results demonstrate the robustness of the CNN to between-scene variations in aerial photography and also suggest that this approach can be applied at an operational level to map tree mortality across extensive territories.
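The aggregation step described above — combining several realizations of the same CNN, each trained from a different random weight initialization — can be sketched as follows. This is a minimal illustration of ensemble averaging, not the authors' code; `ensemble_predict` and the toy probability arrays are hypothetical.

```python
import numpy as np

def ensemble_predict(prob_maps):
    """Average per-class probabilities from several CNN realizations
    (same architecture, different weight initializations) and return
    the class with the highest mean probability for each sample."""
    stacked = np.stack(prob_maps)       # (n_models, n_samples, n_classes)
    mean_probs = stacked.mean(axis=0)   # (n_samples, n_classes)
    return mean_probs.argmax(axis=1)    # (n_samples,)

# Two toy realizations that disagree on the second sample;
# averaging their class probabilities resolves the disagreement.
m1 = np.array([[0.9, 0.1], [0.4, 0.6]])
m2 = np.array([[0.8, 0.2], [0.7, 0.3]])
print(ensemble_predict([m1, m2]))  # → [0 0]
```

Averaging probabilities before taking the argmax, rather than majority-voting on hard labels, also smooths spatially isolated misclassifications when applied window by window over an image, which is consistent with the speckle reduction reported above.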
Sector(s):
Forests
Category(ies):
Scientific Article
Theme(s):
Ecosystems and Environment, Forest Ecology, Forest Growth and Yield Modelling, Forests
Author(s):
SYLVAIN, Jean-Daniel, Guillaume DROLET and Nicolas BROWN
Year of publication:
2019
Format:
ISSN
0924-2716
Keywords:
remote sensing, tree mortality, artificial intelligence, machine learning, deep learning, convolutional neural network, ensemble learning, forest ecology, ecosystems and environment, forest growth and yield modelling, forestry research scientific article