Multi-Class Segmentation with Relative Location Prior

Published in: International Journal of Computer Vision, Vol. 80, No. 3, pp. 300-316
Main Authors: Gould, Stephen; Rodgers, Jim; Cohen, David; Elidan, Gal; Koller, Daphne
Format: Journal Article
Language: English
Published: Boston: Springer US, 01.12.2008
ISSN: 0920-5691, 1573-1405
Description
Summary: Multi-class image segmentation has made significant advances in recent years through the combination of local and global features. One important type of global feature is that of inter-class spatial relationships. For example, identifying “tree” pixels indicates that pixels above and to the sides are more likely to be “sky” whereas pixels below are more likely to be “grass.” Incorporating such global information across the entire image and between all classes is a computational challenge as it is image-dependent, and hence, cannot be precomputed. In this work we propose a method for capturing global information from inter-class spatial relationships and encoding it as a local feature. We employ a two-stage classification process to label all image pixels. First, we generate predictions which are used to compute a local relative location feature from learned relative location maps. In the second stage, we combine this with appearance-based features to provide a final segmentation. We compare our results to recent published results on several multi-class image segmentation databases and show that the incorporation of relative location information allows us to significantly outperform the current state-of-the-art.
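The two-stage idea in the abstract can be illustrated with a short sketch: first-stage per-pixel class probabilities cast votes for every other pixel's class through offset-indexed relative location maps, yielding a per-pixel feature that a second-stage classifier would combine with appearance cues. This is a minimal toy illustration, not the paper's implementation; all names (`probs`, `rel_maps`, `relative_location_feature`) and the random maps are assumptions for demonstration.

```python
import numpy as np

# Toy setup (illustrative only; sizes and values are not from the paper).
rng = np.random.default_rng(0)
n_classes, H, W = 3, 8, 8

# First-stage per-pixel class probabilities, shape (H, W, n_classes).
probs = rng.random((H, W, n_classes))
probs /= probs.sum(axis=-1, keepdims=True)

# Hypothetical learned relative-location maps: rel_maps[c_from, c_to] is a
# (2H-1, 2W-1) grid of votes that class c_to occurs at a given (dy, dx)
# offset from a pixel of class c_from; index (H-1, W-1) is zero offset.
rel_maps = rng.random((n_classes, n_classes, 2 * H - 1, 2 * W - 1))

def relative_location_feature(probs, rel_maps):
    """Accumulate, for every pixel and class, the votes cast by all pixels'
    first-stage predictions through the relative-location maps."""
    H, W, n_classes = probs.shape
    feat = np.zeros_like(probs)
    for y in range(H):
        for x in range(W):
            for c_from in range(n_classes):
                w = probs[y, x, c_from]
                # Window of the offset maps centred on the voting pixel,
                # shape (n_classes, H, W): one vote grid per target class.
                win = rel_maps[c_from, :,
                               H - 1 - y : 2 * H - 1 - y,
                               W - 1 - x : 2 * W - 1 - x]
                feat += w * np.transpose(win, (1, 2, 0))
    return feat

feat = relative_location_feature(probs, rel_maps)
print(feat.shape)  # one relative-location score per pixel and class
```

In the paper's setting these per-pixel scores would then be fed, alongside appearance features, into the second-stage classifier; in practice the nested loops above amount to a cross-correlation per class pair and can be computed far more efficiently.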
DOI: 10.1007/s11263-008-0140-x