Feature Adaptive Co-Segmentation by Complexity Awareness

Detailed bibliography
Published in: IEEE Transactions on Image Processing, Volume 22; Issue 12; pp. 4809-4824
Main authors: Meng, Fanman; Li, Hongliang; Ngan, King Ngi; Zeng, Liaoyuan; Wu, Qingbo
Medium: Journal Article
Language: English
Published: New York, NY: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.12.2013
ISSN: 1057-7149, 1941-0042
Description
Summary: In this paper, we propose a novel feature adaptive co-segmentation method that can learn adaptive features of different image groups for accurate common object segmentation. We also propose image complexity awareness for adaptive feature learning. In the proposed method, the original images are first ranked according to the image complexities, which are measured by a superpixel changing cue and an object detection cue. Then, the unsupervised segments of the simple images are used to learn the adaptive features, which are obtained using an expectation-minimization algorithm combining l1-regularized least squares optimization with the consideration of the confidence of the simple-image segmentation accuracies and the fitness of the learned model. Experiments on different image groups verify that the error rate of the final co-segmentation is lower than that of existing state-of-the-art co-segmentation methods.
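The abstract names l1-regularized least squares as the core of the adaptive feature learning step but does not reproduce the formulation. The sketch below shows a generic l1-regularized least squares solver using ISTA (iterative soft-thresholding), assuming a design matrix X of segment feature descriptors and a target vector y; the variable names, the regularization weight lam, and the iteration count are illustrative assumptions, and the paper's actual expectation-minimization alternation and confidence weighting are not modeled here.

```python
# Minimal, illustrative sketch of an l1-regularized least squares step
# (ISTA / iterative soft-thresholding). Names X, y, lam, n_iters are
# assumptions for illustration; this is not the paper's full algorithm.
import numpy as np

def soft_threshold(v, t):
    """Element-wise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_least_squares(X, y, lam=0.1, n_iters=500):
    """Minimize 0.5 * ||X w - y||^2 + lam * ||w||_1 with ISTA."""
    w = np.zeros(X.shape[1])
    # Step size from the Lipschitz constant of the smooth (least-squares) part.
    L = np.linalg.norm(X, ord=2) ** 2
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y)              # gradient of the quadratic term
        w = soft_threshold(w - grad / L, lam / L)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 20))         # e.g., segment feature descriptors
    w_true = np.zeros(20)
    w_true[:3] = [1.5, -2.0, 0.8]             # sparse ground-truth weights
    y = X @ w_true + 0.01 * rng.standard_normal(50)
    print(l1_least_squares(X, y).round(2))    # sparse estimate, close to w_true
```

The soft-thresholding step is what drives most feature weights to exactly zero, which is the usual motivation for an l1 penalty when selecting a small set of discriminative features per image group.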
DOI: 10.1109/TIP.2013.2278461