LAENet: Light-weight asymmetric encoder-decoder network for semantic segmentation

Bibliographic Details
Published in: Journal of Physics: Conference Series, Vol. 1966, no. 1, pp. 12047 - 12053
Main Authors: Hong, Liangyi; Duan, Shukai; Wang, Lidan; Pan, Yongbin
Format: Journal Article
Language:English
Published: Bristol: IOP Publishing, 01.07.2021
ISSN:1742-6588, 1742-6596
Description
Summary:The encoder-decoder structure is widely used in deep learning for real-time dense segmentation tasks. Given the limited computational budget of mobile devices, we present a light-weight asymmetric encoder-decoder network, named LAENet, which accomplishes real-time semantic segmentation quickly and efficiently. In the encoder, we employ asymmetric convolution and group convolution combined with dilated convolution and dense connectivity to reduce computation cost and model size while guaranteeing an adequate receptive field and enhancing the model's learning ability. In the decoder, a feature pyramid network (FPN) structure combined with an attention mechanism and an ECRE block is used to strike a balance between network complexity and segmentation performance. Our approach has only 0.84M parameters and reaches 66 FPS on a single GTX 1080 Ti GPU. Experiments on the Cityscapes dataset demonstrate that LAENet outperforms existing segmentation networks in terms of the speed-accuracy trade-off without any post-processing.
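
As a rough, hypothetical illustration only (the authors' implementation is not included in this record), the PyTorch sketch below shows the kind of encoder building block the summary describes: a 3x3 convolution factorized into 3x1 and 1x3 asymmetric convolutions applied as group convolutions, followed by a dilated asymmetric pair to enlarge the receptive field, with a residual connection. All class, layer, and parameter names here are assumptions made for illustration; the decoder components (FPN, attention, ECRE block) are not sketched.

# Minimal sketch, assuming a PyTorch-style implementation; not the authors' code.
import torch
import torch.nn as nn


class AsymmetricDilatedBlock(nn.Module):
    """Illustrative encoder block: asymmetric (factorized) group convolutions
    plus a dilated pair, with a residual connection to keep the block cheap."""

    def __init__(self, channels: int, dilation: int = 2, groups: int = 4):
        super().__init__()
        # 3x3 convolution factorized into 3x1 and 1x3 (asymmetric convolution),
        # computed as group convolutions to cut parameters and FLOPs.
        self.conv3x1 = nn.Conv2d(channels, channels, kernel_size=(3, 1),
                                 padding=(1, 0), groups=groups, bias=False)
        self.conv1x3 = nn.Conv2d(channels, channels, kernel_size=(1, 3),
                                 padding=(0, 1), groups=groups, bias=False)
        # Dilated asymmetric pair to enlarge the receptive field at no extra
        # parameter cost.
        self.dconv3x1 = nn.Conv2d(channels, channels, kernel_size=(3, 1),
                                  padding=(dilation, 0), dilation=(dilation, 1),
                                  groups=groups, bias=False)
        self.dconv1x3 = nn.Conv2d(channels, channels, kernel_size=(1, 3),
                                  padding=(0, dilation), dilation=(1, dilation),
                                  groups=groups, bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.act(self.conv1x3(self.conv3x1(x)))
        out = self.dconv1x3(self.dconv3x1(out))
        # Residual connection, in the spirit of the dense connectivity the
        # abstract mentions, keeps gradients flowing with little overhead.
        return self.act(self.bn(out) + x)


if __name__ == "__main__":
    block = AsymmetricDilatedBlock(channels=64)
    y = block(torch.randn(1, 64, 128, 256))
    print(y.shape)                                       # torch.Size([1, 64, 128, 256])
    print(sum(p.numel() for p in block.parameters()))    # small parameter count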
DOI:10.1088/1742-6596/1966/1/012047