3D ActionSLAM: wearable person tracking in multi-floor environments

Bibliographic Details
Published in: Personal and Ubiquitous Computing, Vol. 19, No. 1, pp. 123-141
Main Authors: Hardegger, Michael; Roggen, Daniel; Tröster, Gerhard
Format: Journal Article
Language: English
Published: London: Springer, 01.01.2015
ISSN: 1617-4909, 1617-4917
Description
Summary: We present 3D ActionSLAM, a stand-alone wearable system that can track people in previously unknown multi-floor environments with sub-room accuracy. ActionSLAM stands for action-based simultaneous localization and mapping: it fuses dead reckoning data from a foot-mounted inertial measurement unit with the recognition of location-related actions to build and update a local landmark map. Simultaneously, this map compensates for the position drift errors that accumulate in open-loop tracking by means of a particle filter. To evaluate the system performance, we analyzed 23 tracks with a total walked distance of 6,489 m in buildings with up to three floors. The algorithm robustly (93 % of runs converged) mapped the areas with a mean landmark positioning error of 0.59 m. As ActionSLAM is fully stand-alone and does not depend on external infrastructure, it is well suited for patient tracking in remote health care applications. The algorithm is computationally lightweight and runs in real-time on a Samsung Galaxy S4, enabling immediate location-aware feedback. Finally, we propose visualization techniques to facilitate the interpretation of tracking data acquired with 3D ActionSLAM.
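The summary outlines the core fusion loop: dead-reckoning steps propagate a particle cloud, and recognized location-related actions act as landmark observations that reweight particles and close loops. Below is a minimal, hypothetical Python sketch of that idea in the FastSLAM style, where each particle carries its own landmark map. All names and parameter values (Particle, propagate, observe_action, STEP_NOISE_XY, OBS_SIGMA) are illustrative assumptions, not the authors' implementation, which additionally handles multi-floor tracking and landmark refinement.

```python
import math
import random

# Hypothetical sketch of action-based landmark SLAM with a particle filter.
# All parameter values below are assumptions for illustration only.
NUM_PARTICLES = 200
STEP_NOISE_XY = 0.05       # assumed per-step length noise (m)
STEP_NOISE_HEADING = 0.02  # assumed per-step heading noise (rad)
OBS_SIGMA = 0.5            # assumed landmark observation std. dev. (m)

class Particle:
    """One trajectory hypothesis carrying its own landmark map."""
    def __init__(self):
        self.x = self.y = self.z = 0.0
        self.heading = 0.0
        self.weight = 1.0 / NUM_PARTICLES
        self.landmarks = {}  # action label -> (x, y, z) of first occurrence

def propagate(particles, step_len, d_heading, d_z):
    """One dead-reckoning step from the foot-mounted IMU, with added noise."""
    for p in particles:
        p.heading += d_heading + random.gauss(0.0, STEP_NOISE_HEADING)
        length = step_len + random.gauss(0.0, STEP_NOISE_XY)
        p.x += length * math.cos(p.heading)
        p.y += length * math.sin(p.heading)
        p.z += d_z  # floor changes enter through vertical displacement

def observe_action(particles, action):
    """A recognized location-related action (e.g. 'open_door') either adds a
    landmark or, on a revisit, reweights particles by proximity to it."""
    for p in particles:
        if action not in p.landmarks:
            p.landmarks[action] = (p.x, p.y, p.z)  # first visit: map it
            continue
        lx, ly, lz = p.landmarks[action]
        d2 = (p.x - lx) ** 2 + (p.y - ly) ** 2 + (p.z - lz) ** 2
        p.weight *= math.exp(-d2 / (2.0 * OBS_SIGMA ** 2))
    total = sum(p.weight for p in particles) or 1.0
    for p in particles:
        p.weight /= total  # normalize so weights stay a distribution

def resample(particles):
    """Multinomial resampling; each surviving copy keeps its own map."""
    chosen = random.choices(particles, weights=[p.weight for p in particles],
                            k=NUM_PARTICLES)
    out = []
    for src in chosen:
        q = Particle()
        q.x, q.y, q.z, q.heading = src.x, src.y, src.z, src.heading
        q.landmarks = dict(src.landmarks)
        out.append(q)
    return out
```

A real implementation would also refine landmark positions on revisits (e.g. with per-landmark Kalman updates) and allow several landmarks per action type; this sketch only shows how revisited landmarks let the particle filter compensate open-loop drift.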
DOI: 10.1007/s00779-014-0815-y