Multi-Class Segmentation with Relative Location Prior

Bibliographic Details
Published in: International Journal of Computer Vision, Vol. 80, no. 3, pp. 300-316
Main Authors: Gould, Stephen; Rodgers, Jim; Cohen, David; Elidan, Gal; Koller, Daphne
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.12.2008
ISSN: 0920-5691, 1573-1405
Summary: Multi-class image segmentation has made significant advances in recent years through the combination of local and global features. One important type of global feature is that of inter-class spatial relationships. For example, identifying “tree” pixels indicates that pixels above and to the sides are more likely to be “sky” whereas pixels below are more likely to be “grass.” Incorporating such global information across the entire image and between all classes is a computational challenge as it is image-dependent, and hence, cannot be precomputed. In this work we propose a method for capturing global information from inter-class spatial relationships and encoding it as a local feature. We employ a two-stage classification process to label all image pixels. First, we generate predictions which are used to compute a local relative location feature from learned relative location maps. In the second stage, we combine this with appearance-based features to provide a final segmentation. We compare our results to recent published results on several multi-class image segmentation databases and show that the incorporation of relative location information allows us to significantly outperform the current state-of-the-art.
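
The abstract describes the relative-location feature only at a high level. As a rough, non-authoritative sketch (not the authors' implementation), the Python snippet below shows one way first-stage per-pixel class probabilities could be aggregated through learned relative location maps into a per-pixel, per-class score that a second-stage classifier could combine with appearance features; the function name, the array shapes, and the use of a convolution to accumulate offset votes are all illustrative assumptions.

import numpy as np
from scipy.signal import fftconvolve


def relative_location_features(first_stage_probs, offset_maps):
    """Aggregate first-stage class probabilities through relative location maps.

    first_stage_probs : (H, W, C) per-pixel class probabilities from the
                        first-stage classifier.
    offset_maps       : (C, C, 2H-1, 2W-1) array; offset_maps[c_to, c_from]
                        holds, for every spatial offset, a learned weight for
                        class c_to relative to a pixel predicted as c_from.
    Returns an (H, W, C) array of relative-location scores, which would then
    be passed to the second stage alongside appearance-based features.
    """
    H, W, C = first_stage_probs.shape
    features = np.zeros((H, W, C))
    for c_to in range(C):
        for c_from in range(C):
            # Each pixel's first-stage belief in c_from casts offset-weighted
            # votes for c_to across the rest of the image.
            votes = fftconvolve(first_stage_probs[:, :, c_from],
                                offset_maps[c_to, c_from], mode="same")
            features[:, :, c_to] += votes
    return features


# Toy usage: 3 classes on a 32x32 image with uniform (untrained) offset maps.
H, W, C = 32, 32, 3
probs = np.random.dirichlet(np.ones(C), size=(H, W))   # (H, W, C)
maps = np.full((C, C, 2 * H - 1, 2 * W - 1),
               1.0 / ((2 * H - 1) * (2 * W - 1)))
rel_feats = relative_location_features(probs, maps)
print(rel_feats.shape)  # (32, 32, 3)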
DOI: 10.1007/s11263-008-0140-x