LAENet: Light-weight asymmetric encoder-decoder network for semantic segmentation

Detailed bibliography
Published in: Journal of Physics: Conference Series, Volume 1966; Issue 1; pp. 12047 - 12053
Main authors: Hong, Liangyi; Duan, Shukai; Wang, Lidan; Pan, Yongbin
Format: Journal Article
Language: English
Published: Bristol: IOP Publishing, 01.07.2021
ISSN:1742-6588, 1742-6596
Description
Summary: The encoder-decoder structure is widely used in deep learning for real-time dense segmentation tasks. On account of the limited computational budget of mobile devices, we present a light-weight asymmetric encoder-decoder network in this paper, namely LAENet, which quickly and efficiently accomplishes real-time semantic segmentation. In the encoder, we employ an asymmetric convolution and group convolution structure combined with dilated convolution and dense connectivity to reduce computational cost and model size, while guaranteeing an adequate receptive field and enhancing the model's learning ability. In the decoder, a feature pyramid network (FPN) structure combined with an attention mechanism and an ECRE block is utilized to strike a balance between network complexity and segmentation performance. Our approach has only 0.84M parameters and reaches 66 FPS on a single GTX 1080Ti GPU. Experiments on the Cityscapes dataset demonstrate that LAENet outperforms existing segmentation networks in the speed-accuracy trade-off, without any post-processing.
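The parameter savings from the asymmetric and group convolutions mentioned in the abstract can be illustrated with a quick count. This is only a sketch under assumed settings (a hypothetical channel width of 128 and a group count of 4; the actual LAENet layer configuration is not specified in this record), comparing a standard 3x3 convolution against a factorized 3x1 + 1x3 pair, with and without grouping:

```python
def conv_params(c_in, c_out, kh, kw, groups=1):
    """Weight count of a 2-D convolution (bias terms ignored for simplicity)."""
    return (c_in // groups) * c_out * kh * kw

c = 128  # hypothetical channel width, not taken from the paper

# Plain 3x3 convolution.
standard = conv_params(c, c, 3, 3)

# Asymmetric factorization: a 3x1 convolution followed by a 1x3 convolution.
asymmetric = conv_params(c, c, 3, 1) + conv_params(c, c, 1, 3)

# Same factorization with group convolution (assumed groups=4).
grouped_asym = (conv_params(c, c, 3, 1, groups=4)
                + conv_params(c, c, 1, 3, groups=4))

print(standard, asymmetric, grouped_asym)  # 147456 98304 24576
```

Here the asymmetric pair uses two thirds of the weights of the plain 3x3 kernel, and adding groups divides the count by the group count again, which is consistent with the sub-million parameter budget the abstract reports.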
DOI:10.1088/1742-6596/1966/1/012047