Creating the Bag-of-Words with Spatial Context Information for Image Retrieval

Bibliographic Details
Published in: Applied Mechanics and Materials, Vol. 556-562, pp. 4788-4791
Main Authors: Li, Zhen Wei, Zhuo, Li, Liu, Xin, Zhang, Jing
Format: Journal Article
Language: English
Published: Zurich: Trans Tech Publications Ltd, 01.05.2014
ISBN:3038351156, 9783038351153
ISSN: 1660-9336, 1662-7482
Description
Summary: Recently, the bag-of-words (BoW) model has been widely used as an image feature in content-based image retrieval. Most existing approaches to creating a BoW ignore spatial context information. To better describe image content, a BoW with spatial context information is created in this paper. Firstly, the image's regions of interest are detected and the focus-of-attention shift is produced through a visual attention model. Color and SIFT features are extracted from the regions of interest, and the BoW is created through a cluster analysis method. Secondly, the spatial context information among objects in an image is generated by a spatial coding method based on the focus-of-attention shift. The image is then represented by the BoW model with spatial context. Finally, this model is applied to image retrieval to evaluate the performance of the proposed method. Experimental results show that the proposed method can effectively improve image retrieval accuracy.
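The core BoW step described in the summary, assigning local descriptors (e.g. SIFT) to visual words and histogramming them, can be sketched as below. This is a minimal illustration, not the authors' implementation: the function name and toy data are hypothetical, and in practice the codebook would be produced by cluster analysis (e.g. k-means) over descriptors from a training set.

```python
import numpy as np

def bow_histogram(features, codebook):
    """Quantize local descriptors against a visual-word codebook.

    features: (n, d) array of local descriptors (e.g. SIFT vectors)
    codebook: (k, d) array of cluster centers (visual words)
    returns:  (k,) normalized histogram of visual-word counts
    """
    # Euclidean distance from every descriptor to every visual word
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    # hard assignment: each descriptor votes for its nearest word
    words = dists.argmin(axis=1)
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    # L1-normalize so images with different descriptor counts are comparable
    return hist / hist.sum()

# toy example: 4 two-dimensional descriptors, codebook of 2 visual words
feats = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
book = np.array([[0.0, 0.0], [1.0, 1.0]])
print(bow_histogram(feats, book))  # [0.5 0.5]
```

The spatial-coding step of the paper would then augment this histogram with the relative positions of objects along the focus-of-attention shift; its exact encoding is specific to the paper and is not reproduced here.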
Bibliography:Selected, peer reviewed papers from the 2014 International Conference on Mechatronics Engineering and Computing Technology (ICMECT 2014), April 9-10, 2014, Shanghai, China
DOI: 10.4028/www.scientific.net/AMM.556-562.4788