Multimodal and Multitemporal Land Use/Land Cover Semantic Segmentation on Sentinel-1 and Sentinel-2 Imagery: An Application on a MultiSenGE Dataset

Document record

Date

27 December 2022

Relations

This document is related to:
info:eu-repo/semantics/altIdentifier/doi/10.3390/rs15010151

Collection

Archives ouvertes

Licenses

http://creativecommons.org/licenses/by-nc/ , info:eu-repo/semantics/OpenAccess




Cite this document

Romain Wenger et al., « Multimodal and Multitemporal Land Use/Land Cover Semantic Segmentation on Sentinel-1 and Sentinel-2 Imagery: An Application on a MultiSenGE Dataset », HAL-SHS : géographie, ID : 10.3390/rs15010151



Abstract (English)

In the context of global change, producing up-to-date land use/land cover (LULC) maps is a major challenge for assessing pressures on natural areas. These maps also allow us to assess the evolution of land cover and to quantify changes over time (such as urban sprawl), which is essential for a precise understanding of a given territory. Few studies have combined information from Sentinel-1 and Sentinel-2 imagery, yet merging radar and optical imagery has been shown to bring several benefits for a range of use cases, such as semantic segmentation or classification. For this study, we used a newly produced dataset, MultiSenGE, which provides a set of multitemporal and multimodal patches over the Grand Est region in France. To merge these data, we propose a CNN approach based on spatio-temporal and spatio-spectral feature fusion, ConvLSTM+Inception-S1S2. We use a U-Net as the base model, a ConvLSTM extractor for the spatio-temporal features, and an Inception module as the spatio-spectral feature extractor. The results show that describing an overrepresented class is preferable for mapping urban fabrics (UF). Furthermore, adding an Inception module on a single date to extract spatio-spectral features improves the classification results. The spatio-spectro-temporal method (ConvLSTM+Inception-S1S2) achieves a higher global weighted F1-Score than all the other methods tested.
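
The abstract describes the fusion architecture only at a high level. The following PyTorch snippet is a minimal, hypothetical sketch of that idea, not the authors' implementation: a ConvLSTM branch processes the multitemporal Sentinel-1/Sentinel-2 stack, a simplified Inception branch processes a single Sentinel-2 date, and the two feature maps are concatenated before a small segmentation head that stands in for the full U-Net decoder. All channel counts, the number of dates, and the number of classes are placeholder assumptions.

# Hedged sketch (assumptions: 12-band time-series input, 10-band single-date
# Sentinel-2 input, 14 classes; the real model uses a full U-Net decoder).
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """Minimal convolutional LSTM cell: all four gates from a single conv."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.hid_ch = hid_ch
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o, g = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o), torch.tanh(g)
        c = f * c + i * g
        h = o * torch.tanh(c)
        return h, c

class InceptionBlock(nn.Module):
    """Simplified Inception block: parallel 1x1 / 3x3 / 5x5 convolutions."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        b = out_ch // 3
        self.b1 = nn.Conv2d(in_ch, b, 1)
        self.b3 = nn.Conv2d(in_ch, b, 3, padding=1)
        self.b5 = nn.Conv2d(in_ch, out_ch - 2 * b, 5, padding=2)

    def forward(self, x):
        return torch.relu(torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1))

class SpatioSpectroTemporalNet(nn.Module):
    """Toy stand-in for ConvLSTM+Inception-S1S2 with a 1x1-conv head."""
    def __init__(self, ts_ch=12, single_ch=10, hid=32, n_classes=14):
        super().__init__()
        self.convlstm = ConvLSTMCell(ts_ch, hid)
        self.inception = InceptionBlock(single_ch, hid)
        self.head = nn.Conv2d(2 * hid, n_classes, 1)

    def forward(self, ts, single_date):
        # ts: (B, T, C_ts, H, W) multitemporal S1/S2 stack
        # single_date: (B, C_s2, H, W) one Sentinel-2 acquisition
        B, T, _, H, W = ts.shape
        h = ts.new_zeros(B, self.convlstm.hid_ch, H, W)
        c = torch.zeros_like(h)
        for t in range(T):                      # unroll over acquisition dates
            h, c = self.convlstm(ts[:, t], (h, c))
        spectro = self.inception(single_date)   # spatio-spectral features
        fused = torch.cat([h, spectro], dim=1)  # feature-level fusion
        return self.head(fused)                 # per-pixel class logits

if __name__ == "__main__":
    model = SpatioSpectroTemporalNet()
    ts = torch.randn(2, 6, 12, 64, 64)          # 6 dates (assumed)
    s2 = torch.randn(2, 10, 64, 64)             # single-date Sentinel-2
    print(model(ts, s2).shape)                  # torch.Size([2, 14, 64, 64])

The "global weighted F1-Score" reported in the abstract plausibly corresponds to a per-class F1 averaged with class-frequency weights (e.g., scikit-learn's f1_score with average="weighted"), which is a common choice for class-imbalanced LULC maps; this correspondence is an assumption, not a statement from the source.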
