Compliant Navigation Mechanisms Utilizing Probabilistic Motion Patterns of Humans in a Camera Network
| Published in: | Advanced Robotics, Volume 22, Issue 9, pp. 929-948 |
|---|---|
| Main Authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | 01.09.2008 |
| Subjects: | |
| ISSN: | 0169-1864, 1568-5535 |
| Online Access: | Get full text |
| Summary: | Motion trajectories provide rich spatio-temporal information about a person's activities. In this paper we first employ an algorithm for learning collections of these trajectories that characterize representative motion patterns of persons. Data recorded with a non-overlapping camera network are clustered hierarchically using a fuzzy K-means algorithm based on spatial and temporal information, and each motion pattern is then represented by a series of Gaussian distributions. Subsequently, a method is proposed to improve the behavior of a mobile robot according to people's moving intentions. In our approach, whenever the camera network detects a person, it computes a probabilistic estimate of which motion pattern the person might be engaged in, based on the learned models of people's motion behaviors. During path planning the robot then uses this prediction to adapt its navigation mechanisms. In practical experiments carried out on a real robot we demonstrate that our approach allows a robot to quickly adjust its navigation tactics according to the activities of people in an office environment. (A minimal code sketch of the pattern-prediction step follows the record below.) |
|---|---|
| ISSN: | 0169-1864, 1568-5535 |
| DOI: | 10.1163/156855308X315109 |
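The summary describes how the robot predicts which learned motion pattern a detected person is following and adapts its path planning accordingly. The snippet below is a minimal sketch of that prediction step, not the authors' implementation: it assumes each pattern is stored as a series of 2-D Gaussians (means and covariances), evaluates a partially observed trajectory under each pattern, and returns a posterior over patterns under an assumed uniform prior. All function and variable names are illustrative.

```python
# Hypothetical sketch of the pattern-prediction step; names and prior are assumptions.
import numpy as np

def gaussian_logpdf(x, mean, cov):
    """Log-density of a 2-D Gaussian evaluated at position x."""
    diff = x - mean
    inv = np.linalg.inv(cov)
    logdet = np.log(np.linalg.det(cov))
    return -0.5 * (diff @ inv @ diff + logdet + 2 * np.log(2 * np.pi))

def pattern_posterior(observed, patterns, prior=None):
    """Posterior P(pattern | observed partial trajectory).

    observed : (T, 2) array of positions reported by the camera network.
    patterns : list of (means, covs) pairs; means is (K, 2), covs is (K, 2, 2),
               i.e. each learned pattern is a series of Gaussians along the route.
    """
    n = len(patterns)
    prior = np.full(n, 1.0 / n) if prior is None else np.asarray(prior, float)
    log_post = np.log(prior)
    for i, (means, covs) in enumerate(patterns):
        for t, x in enumerate(observed):
            k = min(t, len(means) - 1)   # align observation t with the k-th Gaussian
            log_post[i] += gaussian_logpdf(x, means[k], covs[k])
    log_post -= log_post.max()           # normalise in log space for numerical stability
    post = np.exp(log_post)
    return post / post.sum()

# Toy usage: two hand-made patterns (e.g. "towards the printer" vs "towards the kitchen").
pattern_a = (np.array([[0., 0.], [1., 0.], [2., 0.]]), np.tile(np.eye(2) * 0.1, (3, 1, 1)))
pattern_b = (np.array([[0., 0.], [0., 1.], [0., 2.]]), np.tile(np.eye(2) * 0.1, (3, 1, 1)))
observed = np.array([[0.1, 0.0], [0.9, 0.1]])    # person appears to follow pattern A
print(pattern_posterior(observed, [pattern_a, pattern_b]))
```

Normalising in log space keeps the computation stable as observations accumulate; the resulting posterior could then be thresholded or passed directly to the path planner, as the abstract suggests the robot does when adapting its navigation mechanisms.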