Relevance Feedback For Image Retrieval Using Transfer Learning and Improved MQHOA

Detailed bibliography
Published in: Journal of Physics: Conference Series, Volume 1880, Issue 1, p. 012006
Main authors: Wang, Huaqiu; Liu, Qian
Format: Journal Article
Language: English
Publication details: Bristol: IOP Publishing, 01.04.2021
ISSN: 1742-6588, 1742-6596
Online access: Get full text
Description
Summary: Image retrieval is a challenging task in multimedia applications, and existing methods fall short of meeting users' subjective retrieval needs while achieving high retrieval performance. In this work, a relevance feedback image retrieval algorithm based on deep learning and an optimization algorithm (CAMQHOA-RF) is proposed. Transfer learning based on a deep convolutional neural network is applied to extract deeper image features and reduce the semantic gap. The multi-scale quantum harmonic oscillator algorithm, improved by the idea of "aggregation", is introduced to search the feature space effectively. A covariance matrix is used to strengthen the relationship between feature points at different scales, guiding feature points to approach the ideal query point faster. Moreover, the query point is reselected based on the feedback information to explore more of the users' potential areas of interest. Experiments show that, compared with other algorithms, the proposed algorithm requires fewer parameters to be set while achieving higher retrieval accuracy, faster retrieval speed, and stronger robustness, better meeting users' needs.
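The retrieval-with-feedback loop the abstract describes can be sketched in simplified form. This is an illustrative stand-in, not the paper's method: random vectors replace the CNN transfer-learning features, and a classic Rocchio-style query-point update replaces the paper's improved MQHOA search of the feature space. All function names and parameters here are hypothetical.

```python
import numpy as np

def rank_by_distance(query, feats):
    # Rank database images by Euclidean distance to the query point
    # in feature space (nearest first).
    dists = np.linalg.norm(feats - query, axis=1)
    return np.argsort(dists)

def feedback_update(query, feats, relevant, irrelevant,
                    beta=0.75, gamma=0.15):
    # Reselect the query point from user feedback: move it toward the
    # mean of images marked relevant and away from the mean of images
    # marked irrelevant (Rocchio-style update; the paper instead uses
    # an improved MQHOA to search the feature space).
    q = query.copy()
    if relevant:
        q += beta * (feats[relevant].mean(axis=0) - query)
    if irrelevant:
        q -= gamma * (feats[irrelevant].mean(axis=0) - query)
    return q

rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 64))   # stand-in for deep CNN features
query = rng.normal(size=64)          # initial query-image feature vector

order = rank_by_distance(query, feats)
# Suppose the user marks the top result relevant and the last irrelevant:
query2 = feedback_update(query, feats, [int(order[0])], [int(order[-1])])
order2 = rank_by_distance(query2, feats)  # re-ranked results after feedback
```

In a real system the feature vectors would come from a pretrained convolutional network with its classification head removed, and the loop (rank, collect feedback, update query point, re-rank) would repeat until the user is satisfied.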
DOI: 10.1088/1742-6596/1880/1/012006