3D ActionSLAM: wearable person tracking in multi-floor environments
| Published in: | Personal and Ubiquitous Computing, Volume 19, Issue 1, pp. 123–141 |
|---|---|
| Main Authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | London: Springer London, 01.01.2015 (Springer Nature B.V.) |
| Subjects: | |
| ISSN: | 1617-4909, 1617-4917 |
| Online Access: | Get full text |
| Summary: | We present 3D ActionSLAM, a stand-alone wearable system that can track people in previously unknown multi-floor environments with sub-room accuracy. ActionSLAM stands for action-based simultaneous localization and mapping: it fuses dead-reckoning data from a foot-mounted inertial measurement unit with the recognition of location-related actions to build and update a local landmark map. Simultaneously, a particle filter uses this map to compensate for the position drift errors that accumulate in open-loop tracking. To evaluate the system performance, we analyzed 23 tracks with a total walked distance of 6,489 m in buildings with up to three floors. The algorithm robustly (93 % of runs converged) mapped the areas with a mean landmark positioning error of 0.59 m. As ActionSLAM is fully stand-alone and does not depend on external infrastructure, it is well suited for patient tracking in remote health care applications. The algorithm is computationally lightweight and runs in real time on a Samsung Galaxy S4, enabling immediate location-aware feedback. Finally, we propose visualization techniques to facilitate the interpretation of tracking data acquired with 3D ActionSLAM. |
|---|---|
| DOI: | 10.1007/s00779-014-0815-y |
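The summary above outlines the core loop of an action-based SLAM filter: dead-reckoned steps drive the particles open-loop, and recognized actions both create landmarks and correct accumulated drift. The sketch below is an illustrative reconstruction of that idea, not the authors' implementation: it assumes a FastSLAM-style filter where each particle carries its own landmark map, and the particle count, noise constants, nearest-landmark association rule, and the "open_door" action label are all assumptions made for the example.

```python
import copy
import math
import random

NUM_PARTICLES = 200   # assumed particle count; the paper's value may differ
STEP_NOISE = 0.05     # assumed step-length noise std dev (m)
HEADING_NOISE = 0.02  # assumed per-step heading noise std dev (rad)
MATCH_SIGMA = 0.5     # assumed landmark-match likelihood scale (m)


class Particle:
    """One pose-plus-map hypothesis; each particle owns its landmark map,
    in the style of FastSLAM-type filters."""
    def __init__(self):
        self.x, self.y = 0.0, 0.0       # position on the current floor (m)
        self.floor = 0                  # discrete floor index
        self.heading = 0.0              # walking direction (rad)
        self.weight = 1.0 / NUM_PARTICLES
        self.landmarks = {}             # action label -> [(x, y, floor), ...]


def propagate(particles, step_len, d_heading, d_floor=0):
    """Open-loop update: apply one noisy dead-reckoned step from the
    foot-mounted IMU. Floor changes (stairs, elevator) arrive as d_floor."""
    for p in particles:
        p.heading += d_heading + random.gauss(0.0, HEADING_NOISE)
        length = step_len + random.gauss(0.0, STEP_NOISE)
        p.x += length * math.cos(p.heading)
        p.y += length * math.sin(p.heading)
        p.floor += d_floor


def observe_action(particles, action):
    """Landmark update: when a location-related action is recognized,
    reweight each particle by its distance to the nearest landmark of the
    same type in its own map, or add a new landmark on a first visit."""
    for p in particles:
        same_floor = [lm for lm in p.landmarks.get(action, [])
                      if lm[2] == p.floor]
        if same_floor:
            d = min(math.hypot(p.x - lx, p.y - ly) for lx, ly, _ in same_floor)
            p.weight *= math.exp(-0.5 * (d / MATCH_SIGMA) ** 2)
        else:
            p.landmarks.setdefault(action, []).append((p.x, p.y, p.floor))
    total = sum(p.weight for p in particles)
    if total == 0.0:                    # all hypotheses died; reset uniform
        total = float(len(particles))
        for p in particles:
            p.weight = 1.0
    for p in particles:
        p.weight /= total
    return _resample(particles)


def _resample(particles):
    """Stratified resampling; landmark maps are deep-copied because every
    particle carries its own map hypothesis."""
    n = len(particles)
    cums, c = [], 0.0
    for p in particles:
        c += p.weight
        cums.append(c)
    new, i = [], 0
    for k in range(n):
        pos = (k + random.random()) / n
        while i < n - 1 and cums[i] < pos:
            i += 1
        new.append(copy.deepcopy(particles[i]))
    for p in new:
        p.weight = 1.0 / n
    return new


if __name__ == "__main__":
    particles = [Particle() for _ in range(NUM_PARTICLES)]
    for _ in range(50):                          # walk ~35 m straight ahead
        propagate(particles, step_len=0.7, d_heading=0.0)
    particles = observe_action(particles, "open_door")  # first visit maps it
    est_x = sum(p.weight * p.x for p in particles)
    print(f"estimated x after 50 steps: {est_x:.2f} m")
```

Because each particle owns a private map, resampling must deep-copy particles. Revisiting a previously mapped landmark concentrates weight on the particles whose accumulated drift is smallest, which is what closes the loop and yields the drift compensation described in the summary.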