Feature Adaptive Co-Segmentation by Complexity Awareness
| Published in: | IEEE Transactions on Image Processing, Vol. 22, No. 12, pp. 4809-4824 |
|---|---|
| Main Authors: | , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | New York, NY: IEEE, 01.12.2013 |
| ISSN: | 1057-7149, 1941-0042 |
| Summary: | In this paper, we propose a novel feature adaptive co-segmentation method that can learn adaptive features for different image groups for accurate segmentation of their common objects. We also propose image complexity awareness for adaptive feature learning. In the proposed method, the original images are first ranked according to their image complexities, which are measured by a superpixel changing cue and an object detection cue. The unsupervised segments of the simple images are then used to learn the adaptive features, which are obtained with an expectation-minimization algorithm combining l1-regularized least squares optimization, taking into account the confidence of the simple-image segmentation accuracies and the fitness of the learned model. Experiments on different image groups show that the error rate of the final co-segmentation is lower than that of existing state-of-the-art co-segmentation methods. A minimal sketch of the l1-regularized least-squares step is given after this record. |
|---|---|
| DOI: | 10.1109/TIP.2013.2278461 |
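
The record gives only the abstract, but the central optimization it names, l1-regularized least squares for selecting adaptive features from segments of the simple images, can be illustrated with a short, self-contained sketch. The sketch below uses a generic ISTA (iterative soft-thresholding) solver; the feature matrix `X`, targets `y`, and regularization weight `lam` are illustrative assumptions, and it omits the paper's expectation-minimization loop and its confidence and fitness weighting, so it is not the authors' actual formulation.

```python
import numpy as np

def soft_threshold(v, t):
    """Element-wise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_least_squares(X, y, lam=1.0, n_iter=500):
    """Solve min_w 0.5*||X w - y||^2 + lam*||w||_1 by ISTA.

    X   : (n_segments, n_features) descriptors of segments from the simple images
    y   : (n_segments,) targets (e.g. foreground / background indicators)
    lam : l1 weight controlling how many features are kept (sparse w)
    """
    n, d = X.shape
    w = np.zeros(d)
    L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)             # gradient of the least-squares term
        w = soft_threshold(w - grad / L, lam / L)
    return w

# Toy usage: 200 segment descriptors, 20 candidate features, 3 of them informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
w_true = np.zeros(20)
w_true[:3] = [1.5, -2.0, 0.8]
y = X @ w_true + 0.01 * rng.normal(size=200)
print(np.round(l1_least_squares(X, y, lam=1.0), 2))   # sparse estimate of w_true
```

The l1 penalty drives most feature weights exactly to zero, which is what makes the learned feature set adaptive to a particular image group rather than fixed in advance.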