MACU-Net for Semantic Segmentation of Fine-Resolution Remotely Sensed Images

Bibliographic Details
Published in:IEEE Geoscience and Remote Sensing Letters Vol. 19; pp. 1-5
Main Authors: Li, Rui; Duan, Chenxi; Zheng, Shunyi; Zhang, Ce; Atkinson, Peter M.
Format: Journal Article
Language:English
Published: Piscataway, NJ: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
ISSN:1545-598X, 1558-0571
Description
Summary:Semantic segmentation of remotely sensed images plays an important role in land resource management, yield estimation, and economic assessment. U-Net, a deep encoder-decoder architecture, has been used frequently for image segmentation with high accuracy. In this letter, we incorporate multiscale features generated by different layers of U-Net and design a multiscale skip-connected and asymmetric-convolution-based U-Net (MACU-Net) for segmentation using fine-resolution remotely sensed images. Our design has the following advantages: (1) the multiscale skip connections combine and realign semantic features contained in both low-level and high-level feature maps; (2) the asymmetric convolution block strengthens the feature representation and feature extraction capability of a standard convolution layer. Experiments conducted on two remotely sensed data sets captured by different satellite sensors demonstrate that the proposed MACU-Net transcends U-Net, U-Net with pyramid pooling layers (PPL), and U-Net 3+, among other benchmark approaches. Code is available at https://github.com/lironui/MACU-Net.
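The two design elements named in the abstract can be sketched in a few lines of PyTorch. The snippet below is an illustrative reconstruction under our own assumptions, not the authors' implementation (see the GitHub link above for that): the asymmetric convolution block sums a square 3x3 kernel with parallel 1x3 and 3x1 kernels to strengthen a standard convolution layer, and the multiscale fusion module resamples feature maps from several levels to a common resolution before merging them. All class and parameter names here (AsymmetricConvBlock, MultiScaleFuse, in_ch, out_ch) are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AsymmetricConvBlock(nn.Module):
    # Hypothetical asymmetric convolution block: a 3x3 convolution reinforced
    # by parallel 1x3 (horizontal) and 3x1 (vertical) branches, summed before
    # batch normalization and ReLU.
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.square = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.hor = nn.Conv2d(in_ch, out_ch, kernel_size=(1, 3), padding=(0, 1))
        self.ver = nn.Conv2d(in_ch, out_ch, kernel_size=(3, 1), padding=(1, 0))
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.square(x) + self.hor(x) + self.ver(x)))

class MultiScaleFuse(nn.Module):
    # Hypothetical multiscale skip connection: bilinearly resample feature maps
    # from several levels to one target size, concatenate along channels, and
    # project back so low-level detail and high-level semantics are realigned.
    def __init__(self, in_chs, out_ch):
        super().__init__()
        self.proj = nn.Conv2d(sum(in_chs), out_ch, kernel_size=1)

    def forward(self, feats, size):
        resized = [F.interpolate(f, size=size, mode='bilinear', align_corners=False)
                   for f in feats]
        return self.proj(torch.cat(resized, dim=1))

# Usage sketch: fuse three feature maps of different resolutions at 64x64.
f1 = torch.randn(1, 64, 128, 128)
f2 = torch.randn(1, 128, 64, 64)
f3 = torch.randn(1, 256, 32, 32)
fused = MultiScaleFuse([64, 128, 256], 128)([f1, f2, f3], size=(64, 64))
print(fused.shape)  # torch.Size([1, 128, 64, 64])

Summing the three kernel shapes in the block above (rather than stacking them sequentially) keeps the receptive field of a plain 3x3 convolution while emphasizing horizontal and vertical structure, which is one plausible reading of how the asymmetric convolution "strengthens" the standard layer.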
DOI:10.1109/LGRS.2021.3052886