Towards better classification of land cover and land use based on convolutional neural networks

authored by
C. Yang, F. Rottensteiner, C. Heipke
Abstract

Land use and land cover are two important variables in remote sensing. Commonly, land use information is stored in geospatial databases. In order to update such databases, we present a new approach to determine land cover and to classify land use objects using convolutional neural networks (CNNs). High-resolution aerial images and derived data such as digital surface models serve as input. An encoder-decoder based CNN is used for land cover classification. We found that a composite including the infrared band and height data outperforms RGB images in land cover classification. We also propose a CNN-based methodology for predicting the land use labels of objects from the geospatial databases, using masks representing object shape, the RGB images and the pixel-wise class scores of the land cover classification as input. For this task, we developed a two-branch network in which the first branch considers the whole area of an image, while the second branch focuses on a smaller relevant area. We evaluated our methods on two sites and achieved an overall accuracy of up to 89.6% and 81.7% for land cover and land use, respectively. We also tested our land cover classification method on the Vaihingen dataset of the ISPRS 2D semantic labelling challenge and achieved an overall accuracy of 90.7%.
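The two-branch idea described in the abstract can be sketched roughly as follows. This is an illustrative PyTorch snippet, not the authors' actual architecture: the channel layout (object mask + RGB + land cover scores stacked channel-wise), the branch depth and the class counts are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class TwoBranchLandUseCNN(nn.Module):
    """Illustrative two-branch classifier: one branch sees the full input,
    the other a crop around the relevant area; features are fused for the
    land use prediction. Channel/class counts are placeholders."""

    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        def branch():
            return nn.Sequential(
                nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
        self.full_branch = branch()   # processes the whole image
        self.crop_branch = branch()   # processes the smaller relevant area
        self.classifier = nn.Linear(2 * 64, num_classes)

    def forward(self, full_img, crop_img):
        f = self.full_branch(full_img)
        c = self.crop_branch(crop_img)
        # concatenate the two feature vectors before the final prediction
        return self.classifier(torch.cat([f, c], dim=1))


# Example: mask (1) + RGB (3) + land cover scores for 8 classes = 12 channels (hypothetical)
model = TwoBranchLandUseCNN(in_channels=12, num_classes=10)
logits = model(torch.randn(2, 12, 256, 256), torch.randn(2, 12, 128, 128))
```

In this sketch, the second branch simply receives a smaller crop of the same channel-stacked input; how the relevant area is selected and how the branches are fused in the paper is not specified in the abstract.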

Organisation(s)
Institute of Photogrammetry and GeoInformation (IPI)
Type
Conference article
Journal
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
Volume
42
Pages
139-146
No. of pages
8
ISSN
1682-1750
Publication date
04.06.2019
Publication status
E-pub ahead of print
Peer reviewed
Yes
ASJC Scopus subject areas
Information Systems, Geography, Planning and Development
Sustainable Development Goals
SDG 15 - Life on Land
Electronic version(s)
https://doi.org/10.5194/isprs-archives-XLII-2-W13-139-2019 (Access: Open)
https://doi.org/10.15488/10183 (Access: Open)