OverSegNet: A convolutional encoder–decoder network for image over-segmentation

Bibliographic Details
Published in: Computers & Electrical Engineering, Vol. 107, p. 108610
Main Authors: Li, Peng; Ma, Wei
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.04.2023
ISSN: 0045-7906, 1879-0755
Description
Summary: Efficient and differentiable image over-segmentation is key to superpixel-based research and applications but remains a challenging problem. This paper proposes a fully convolutional deep network, named OverSegNet, for image over-segmentation. OverSegNet consists of an encoder and a decoder, designed for the two core parts of over-segmentation, i.e., feature representation and pixel–superpixel association, respectively. To obtain a feature representation that is edge-sensitive yet insensitive to noise, the encoder is endowed with rich over-segmentation-specific convolutional kernels via over-parametrization followed by task-driven neural architecture search (NAS). The decoder adopts a multi-scale convolutional structure with cross-large-scale connections to achieve pixel–superpixel association in a coarse-to-fine feed-forward manner while eliminating accumulated errors. We conduct extensive ablation studies to verify the effectiveness of the specially designed encoder and decoder. Experiments on the BSDS500 and NYUv2 datasets show that OverSegNet is fast, achieves state-of-the-art accuracy, and generalizes well. Using semantic segmentation and disparity estimation as examples, we also verify OverSegNet in downstream applications.
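
The abstract names pixel–superpixel association as one of the two core components of over-segmentation but, being an abstract, gives no implementation details. For intuition only, the following is a minimal PyTorch sketch of the generic soft-association step used by differentiable superpixel methods in this family (in the spirit of SSN/SpixelFCN), not OverSegNet's actual decoder; the function name, tensor shapes, and grid-based center initialization are illustrative assumptions.

```python
# Minimal, illustrative sketch of soft pixel-superpixel association
# (NOT the authors' code; names and shapes are assumptions).
import torch
import torch.nn.functional as F

def soft_association(features: torch.Tensor, grid: int) -> torch.Tensor:
    """Assign each pixel a soft membership over superpixel centers.

    features: (B, C, H, W) per-pixel embeddings from an encoder.
    grid:     superpixel cell size in pixels; H and W are assumed
              divisible by `grid` for simplicity.
    Returns:  (B, K, H, W) soft assignment maps, K = (H*W) / grid**2.
    """
    b, c, h, w = features.shape
    # Initialize superpixel centers as the mean feature of each grid cell.
    centers = F.avg_pool2d(features, kernel_size=grid)  # (B, C, H/g, W/g)
    centers = centers.flatten(2)                        # (B, C, K)
    pixels = features.flatten(2)                        # (B, C, H*W)
    # Squared feature distance between every pixel and every center.
    # (A real implementation restricts each pixel to its 9 nearby cells.)
    dist = (pixels.unsqueeze(3) - centers.unsqueeze(2)).pow(2).sum(1)  # (B, HW, K)
    # Softmax over centers turns distances into soft memberships.
    assoc = torch.softmax(-dist, dim=2)
    return assoc.transpose(1, 2).reshape(b, -1, h, w)

if __name__ == "__main__":
    feats = torch.randn(1, 16, 32, 32)
    q = soft_association(feats, grid=8)  # 16 superpixels on a 4x4 grid
    print(q.shape)                       # torch.Size([1, 16, 32, 32])
```

Practical implementations restrict each pixel's candidates to its nine surrounding grid cells and alternate association with center updates; the dense all-pairs distance above is kept only for brevity. The coarse-to-fine, cross-scale decoding described in the abstract would replace this single-scale step.
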
DOI: 10.1016/j.compeleceng.2023.108610