Counting crowd flow based on feature points

Published in: Neurocomputing (Amsterdam), Vol. 133, pp. 377–384
Main authors: Liang, Ronghua; Zhu, Yuge; Wang, Haixia
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier B.V., 10 June 2014
ISSN: 0925-2312, 1872-8286
Description
Summary: A counting approach for crowd flow based on feature points is proposed. The objective is to obtain the characteristics of the crowd flow in a scene, including the crowd orientation and the number of people. For feature point detection, a three-frame difference algorithm first extracts a foreground containing only the moving objects, so that after SURF feature point detection only the foreground feature points are retained for further processing; this greatly reduces the time complexity of the SURF stage. For feature point clustering, we present an improved DBSCAN clustering algorithm in which the non-motion feature points are further eliminated and only the remaining feature points are clustered. The crowd flow orientation is computed by tracking the feature points with a local Lucas–Kanade optical flow algorithm based on the Hessian matrix. For crowd flow counting, crowd eigenvectors are constructed from the SURF feature points and trained with a support vector regression machine. The experimental results show that the proposed orientation estimation and counting methods are more robust and provide crowd flow statistics with higher accuracy than previous approaches.
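The foreground-extraction step described in the abstract can be illustrated with a minimal pure-Python sketch. This is an assumption-laden illustration, not the authors' implementation: the function names, the list-of-lists grayscale frame representation, and the threshold value are all illustrative. The idea is that a pixel is foreground only if it changed between both consecutive frame pairs, and detected feature points are then kept only where the mask is set:

```python
def three_frame_difference(f_prev, f_curr, f_next, thresh=25):
    """Binary foreground mask from three consecutive grayscale frames
    (lists of lists of intensities). A pixel is marked foreground only
    if it differs from BOTH the previous and the next frame, which
    suppresses ghosting left by simple two-frame differencing."""
    h, w = len(f_curr), len(f_curr[0])
    mask = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            d1 = abs(f_curr[y][x] - f_prev[y][x])  # change vs. previous frame
            d2 = abs(f_next[y][x] - f_curr[y][x])  # change vs. next frame
            mask[y][x] = 1 if (d1 > thresh and d2 > thresh) else 0
    return mask


def filter_keypoints(keypoints, mask):
    """Keep only feature points (x, y) that fall on foreground pixels,
    mimicking the paper's idea of discarding background SURF points
    before clustering and tracking."""
    return [(x, y) for (x, y) in keypoints if mask[y][x] == 1]


# A single-row toy sequence: a bright "object" (value 200) moves one
# pixel to the right per frame across a dark background.
mask = three_frame_difference(
    [[200, 0, 0, 0]],   # frame t-1: object at x=0
    [[0, 200, 0, 0]],   # frame t:   object at x=1
    [[0, 0, 200, 0]],   # frame t+1: object at x=2
)
print(mask)                                   # [[0, 1, 0, 0]]
print(filter_keypoints([(1, 0), (3, 0)], mask))  # [(1, 0)]
```

Only the object's position in the current frame survives the AND of the two differences, and only the feature point lying on that foreground pixel is retained for the later clustering and tracking stages.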
DOI: 10.1016/j.neucom.2013.12.040