Feature Adaptive Co-Segmentation by Complexity Awareness

Bibliographic Details

Published in: IEEE Transactions on Image Processing, Vol. 22, No. 12, pp. 4809-4824
Authors: Meng, Fanman; Li, Hongliang; Ngan, King Ngi; Zeng, Liaoyuan; Wu, Qingbo
Format: Journal Article
Language: English
Published: New York, NY: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 1 December 2013
ISSN: 1057-7149, 1941-0042
Abstract: In this paper, we propose a novel feature adaptive co-segmentation method that can learn adaptive features for different image groups to segment their common objects accurately. We also propose image complexity awareness for adaptive feature learning. In the proposed method, the original images are first ranked according to their image complexities, which are measured by a superpixel changing cue and an object detection cue. Then, the unsupervised segments of the simple images are used to learn the adaptive features, which is achieved using an expectation-minimization algorithm combining l1-regularized least squares optimization, taking into account both the confidence of the simple-image segmentation accuracies and the fitness of the learned model. Experiments on different image groups verify that the error rate of the final co-segmentation is lower than that of existing state-of-the-art co-segmentation methods.
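
The learning step described in the abstract, an expectation-minimization loop around confidence-weighted l1-regularized least squares, can be illustrated with a minimal sketch. The synthetic segment features, "object-ness" scores, Gaussian confidence weighting, and all names below are assumptions made for illustration; this is not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_segments, n_features = 200, 30
# Assumed inputs: one feature vector and one "object-ness" score per
# unsupervised segment of the simple images.
X = rng.normal(size=(n_segments, n_features))
true_w = np.zeros(n_features)
true_w[:5] = 1.0
y = X @ true_w + 0.1 * rng.normal(size=n_segments)

w = np.zeros(n_features)
for _ in range(10):
    # E-step: confidence of each segment under the current model,
    # here a Gaussian weight on its residual (a stand-in for the
    # paper's segmentation-accuracy confidence).
    resid = y - X @ w
    conf = np.exp(-resid ** 2 / (2 * resid.var() + 1e-12))
    # M-step: confidence-weighted l1-regularized least squares,
    # implemented by rescaling rows with sqrt(confidence).
    s = np.sqrt(conf)
    model = Lasso(alpha=0.01, max_iter=5000)
    model.fit(X * s[:, None], y * s)
    w = model.coef_

print("selected feature indices:", np.flatnonzero(w))
```

The row-rescaling trick expresses a confidence-weighted least squares fit as an ordinary Lasso fit; in practice one would also monitor the fitness of the learned model across iterations, as the abstract describes.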
DOI: 10.1109/TIP.2013.2278461