OverSegNet: A convolutional encoder–decoder network for image over-segmentation

Bibliographic Details
Published in: Computers & Electrical Engineering, Volume 107, p. 108610
Main Authors: Li, Peng; Ma, Wei
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.04.2023
ISSN: 0045-7906, 1879-0755
Description
Summary: Efficient and differentiable image over-segmentation is key to superpixel-based research and applications but remains a challenging problem. This paper proposes a fully convolutional deep network, named OverSegNet, for image over-segmentation. OverSegNet consists of an encoder and a decoder, designed for the two core parts of over-segmentation, i.e., feature representation and pixel–superpixel association, respectively. To obtain an edge-sensitive and noise-insusceptible feature representation, the encoder is endowed with rich over-segmentation-specific convolutional kernels via over-parametrization followed by task-driven neural architecture search (NAS). The decoder adopts a multi-scale convolutional structure with cross-large-scale connections to achieve pixel–superpixel association in a coarse-to-fine feed-forward manner while avoiding error accumulation. Extensive ablation studies verify the effectiveness of the specially designed encoder and decoder. Experiments on the BSDS500 and NYUv2 datasets show that the proposed OverSegNet is fast, achieves state-of-the-art accuracy, and generalizes well. Using semantic segmentation and disparity estimation as examples, we also verify the proposed OverSegNet in downstream applications.
DOI: 10.1016/j.compeleceng.2023.108610
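
For orientation only, below is a minimal PyTorch sketch of the general idea described in the summary: a fully convolutional encoder–decoder that predicts, for every pixel, a soft (differentiable) association over the 9 superpixel grid cells surrounding it, the output format commonly used by superpixel CNNs. The layer sizes, the ToySuperpixelNet name, and the plain convolutions are illustrative assumptions; they do not reproduce OverSegNet's searched encoder kernels or its cross-large-scale decoder connections.

```python
# Minimal sketch (not the authors' code): encoder-decoder producing a soft
# pixel-superpixel association map over 9 neighbouring grid cells per pixel.
import torch
import torch.nn as nn

class ToySuperpixelNet(nn.Module):  # hypothetical name, for illustration
    def __init__(self, in_ch=3, feat=32):
        super().__init__()
        # Encoder: plain strided convolutions standing in for the searched kernels.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Decoder: upsample to full resolution and predict 9 association logits
        # (one per candidate neighbouring superpixel cell) for each pixel.
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, 9, 1),
        )

    def forward(self, x):
        logits = self.decoder(self.encoder(x))
        # Softmax over the 9 candidate cells -> differentiable association map.
        return torch.softmax(logits, dim=1)

img = torch.rand(1, 3, 64, 64)
assoc = ToySuperpixelNet()(img)          # shape: (1, 9, 64, 64)
print(assoc.shape, assoc.sum(dim=1).mean().item())  # channels sum to ~1 per pixel
```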