Pose-guided feature region-based fusion network for occluded person re-identification

Detailed bibliography
Published in: Multimedia Systems, Vol. 29, No. 3, pp. 1771-1783
Main authors: Xie, Gengsheng; Wen, Xianbin; Yuan, Liming; Wang, Jianchen; Guo, Changlun; Jia, Yansong; Li, Minghao
Medium: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.06.2023 (Springer Nature B.V.)
ISSN: 0942-4962, 1432-1882
Description
Summary: Learning distinguishing features from training datasets while filtering out occlusion features is critical in person retrieval scenarios. Most current person re-identification (Re-ID) methods based on classification or deep metric representation learning tend to overlook occlusion issues in the training set: representations of obstacles are easily over-fitted and misleading because they are treated as part of the human body. To alleviate the occlusion problem, we propose a pose-guided feature region-based fusion network (PFRFN) that uses pose landmarks to guide local feature learning toward well-behaved local features and evaluates the representation learning risk on each part loss separately. Compared with using only a global classification loss, jointly considering the local losses and the results of robust pose estimation enables the deep network to learn representations of the body parts that are prominently displayed in the image and to gain discriminative ability in occluded scenes. Experimental results on multiple datasets, i.e., Market-1501, DukeMTMC, and CUHK03, demonstrate the effectiveness of our method in a variety of scenarios.
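
The abstract describes fusing a global identity-classification loss with per-part losses whose contribution is guided by pose estimation. The following is a minimal PyTorch sketch of that loss-fusion idea under stated assumptions: the class name PartLossFusion, the part_vis visibility scores, and the tensor shapes are illustrative and do not reproduce the paper's actual PFRFN architecture or training setup.

import torch
import torch.nn as nn

class PartLossFusion(nn.Module):
    # Hypothetical sketch: fuse a global identity-classification loss with
    # per-part losses, each weighted by a pose-derived visibility score.
    def __init__(self, feat_dim, num_parts, num_classes):
        super().__init__()
        self.global_cls = nn.Linear(feat_dim, num_classes)
        # one classifier head per body-part region
        self.part_cls = nn.ModuleList(
            [nn.Linear(feat_dim, num_classes) for _ in range(num_parts)]
        )
        self.ce = nn.CrossEntropyLoss(reduction="none")

    def forward(self, global_feat, part_feats, part_vis, labels):
        # global_feat: (B, D)    pooled whole-body feature
        # part_feats:  (B, P, D) pooled per-part features
        # part_vis:    (B, P)    pose-landmark confidence that each part is visible
        # labels:      (B,)      identity labels
        loss = self.ce(self.global_cls(global_feat), labels).mean()
        for p, cls in enumerate(self.part_cls):
            part_loss = self.ce(cls(part_feats[:, p]), labels)  # per-sample, shape (B,)
            # occluded parts (low visibility) contribute little to the overall risk
            loss = loss + (part_vis[:, p] * part_loss).mean()
        return loss

# Toy usage with random tensors (751 identities, as in Market-1501's training split):
B, P, D, C = 8, 6, 2048, 751
criterion = PartLossFusion(D, P, C)
loss = criterion(torch.randn(B, D), torch.randn(B, P, D),
                 torch.rand(B, P), torch.randint(0, C, (B,)))
loss.backward()

Weighting each part loss by a visibility score, rather than hard-masking parts, is one simple way to let the pose estimator decide how much an occluded region should influence training; the paper's exact fusion mechanism may differ.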
DOI: 10.1007/s00530-021-00752-2