Compliant Navigation Mechanisms Utilizing Probabilistic Motion Patterns of Humans in a Camera Network

Bibliographic Details
Published in: Advanced Robotics, Vol. 22, No. 9, pp. 929-948
Main Authors: Liang, Zhiwei; Ma, Xudong; Dai, Xianzhong
Format: Journal Article
Language: English
Published: 01.09.2008
ISSN: 0169-1864, 1568-5535
Online Access: Full Text
Description
Summary: Motion trajectories provide rich spatio-temporal information about a person's activities. In this paper we first employ an algorithm for learning collections of these trajectories that characterize representative motion patterns of people. Data recorded with a non-overlapping camera network are clustered hierarchically using a fuzzy K-means algorithm based on spatial and temporal information, respectively; each motion pattern is then represented by a series of Gaussian distributions. Subsequently, a method is proposed to improve the behavior of a mobile robot according to the movement intentions of people. In our approach, whenever the camera network detects a person, it computes a probabilistic estimate of which motion pattern the person might be engaged in, based on the learned models of people's motion behavior. During path planning the robot then uses this prediction to adapt its navigation mechanisms. In practical experiments carried out on a real robot we demonstrate that our approach allows a robot to quickly adjust its navigation tactics according to the activities of people in an office environment.
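The prediction step described in the abstract (estimating which learned motion pattern a detected person is engaged in) can be illustrated with a minimal sketch. The Python snippet below is not the authors' implementation; the function pattern_posteriors, the data layout, and the toy patterns are hypothetical. It assumes each motion pattern is stored as a sequence of 2-D Gaussians (one per trajectory step) and scores a partially observed trajectory against each pattern via Bayes' rule, treating observations as conditionally independent.

```python
import numpy as np
from scipy.stats import multivariate_normal

def pattern_posteriors(observed_xy, patterns, priors=None):
    """Posterior probability of each motion pattern given partial observations.

    observed_xy: (T, 2) array of observed positions so far.
    patterns:    list of dicts with 'means' (K, 2) and 'covs' (K, 2, 2);
                 only the first T Gaussians of each pattern are used.
    priors:      optional per-pattern prior probabilities (defaults to uniform).
    """
    T = len(observed_xy)
    n = len(patterns)
    priors = np.full(n, 1.0 / n) if priors is None else np.asarray(priors)

    log_post = np.log(priors).astype(float)
    for i, p in enumerate(patterns):
        for t in range(min(T, len(p["means"]))):
            # Accumulate the log-likelihood of each observation under the
            # corresponding Gaussian of this motion pattern.
            log_post[i] += multivariate_normal.logpdf(
                observed_xy[t], mean=p["means"][t], cov=p["covs"][t]
            )
    # Normalize in log space for numerical stability.
    log_post -= log_post.max()
    post = np.exp(log_post)
    return post / post.sum()

if __name__ == "__main__":
    # Two toy patterns: one heading right, one heading up.
    right = {"means": np.array([[0.5 * t, 0.0] for t in range(10)]),
             "covs": np.array([np.eye(2) * 0.1] * 10)}
    up = {"means": np.array([[0.0, 0.5 * t] for t in range(10)]),
          "covs": np.array([np.eye(2) * 0.1] * 10)}
    obs = np.array([[0.0, 0.0], [0.45, 0.05], [1.05, -0.02]])
    print(pattern_posteriors(obs, [right, up]))  # favors the "right" pattern
```

A robot's planner could re-run such an estimate each time the camera network reports a new detection and bias its path toward or away from the predicted destination; the paper's actual clustering (hierarchical fuzzy K-means) and planner integration are not reproduced here.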
DOI: 10.1163/156855308X315109