Counting crowd flow based on feature points

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 133, pp. 377–384
Main Authors: Liang, Ronghua; Zhu, Yuge; Wang, Haixia
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier B.V., 10.06.2014
ISSN: 0925-2312, 1872-8286
Description
Summary: A counting approach for crowd flow based on feature points is proposed. The objective is to obtain the characteristics of the crowd flow in a scene, including the crowd orientation and the numeric count. For feature point detection, a three-frame difference algorithm is used to obtain a foreground containing only the moving objects, so that after SURF feature point detection only the feature points in the foreground are retained for further processing; this greatly reduces the time complexity of the SURF algorithm. For feature point clustering, an improved DBSCAN clustering algorithm is presented in which non-motion feature points are further eliminated and only the remaining feature points are clustered. For the calculation of the crowd flow orientation, the feature points are tracked using a local Lucas–Kanade optical flow algorithm based on the Hessian matrix. For crowd flow counting, crowd eigenvectors are constructed from the SURF feature points and trained with a support vector regression machine. The experimental results show that the proposed orientation estimation and counting methods are more robust and provide crowd flow statistics with higher accuracy than previous approaches.
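As an illustration of the foreground-restricted feature detection described in the summary, the following is a minimal sketch, assuming Python with OpenCV and the opencv-contrib `xfeatures2d` module for SURF; the function names and threshold values are illustrative assumptions, not the authors' implementation.

```python
import cv2

def foreground_mask(prev_gray, curr_gray, next_gray, diff_thresh=25):
    # Three-frame difference: a pixel counts as foreground only if it
    # changes in both consecutive frame pairs, which keeps moving objects
    # and suppresses the static background (threshold value is an assumption).
    d1 = cv2.absdiff(curr_gray, prev_gray)
    d2 = cv2.absdiff(next_gray, curr_gray)
    _, b1 = cv2.threshold(d1, diff_thresh, 255, cv2.THRESH_BINARY)
    _, b2 = cv2.threshold(d2, diff_thresh, 255, cv2.THRESH_BINARY)
    return cv2.bitwise_and(b1, b2)

def foreground_surf_keypoints(prev_gray, curr_gray, next_gray):
    # Detect SURF keypoints only inside the moving-object foreground,
    # so background regions contribute no keypoints at all.
    mask = foreground_mask(prev_gray, curr_gray, next_gray)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # opencv-contrib
    keypoints, descriptors = surf.detectAndCompute(curr_gray, mask)
    return keypoints, descriptors
```

In the full pipeline summarized above, these foreground keypoints would then be filtered and grouped by the improved DBSCAN step, tracked across frames with the Lucas–Kanade optical flow for orientation, and fed into the support vector regression model for the count.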
DOI: 10.1016/j.neucom.2013.12.040