Opportunistic sensing for inferring in-the-wild human contexts based on activity pattern recognition using smart computing
| Published in: | Future Generation Computer Systems Vol. 106; pp. 374 - 392 |
|---|---|
| Main authors: | , |
| Format: | Journal Article |
| Language: | English |
| Published: | Elsevier B.V., 01.05.2020 |
| Keywords: | |
| ISSN: | 0167-739X, 1872-7115 |
| Online access: | Full text |
| Abstract: | In recent years, with the evolution of internet-of-things and smart sensing technologies, sensor-based physical activity recognition has gained substantial prominence, and numerous research works have been conducted in this regard. However, the accurate recognition of in-the-wild human activities and the associated contexts remains an open research challenge to be addressed. This research work presents a novel activity-aware human context recognition scheme that explicitly learns human activity patterns in diverse behavioral contexts and infers in-the-wild user contexts based on physical activity recognition. In this aspect, five daily living activities, i.e., lying, sitting, standing, walking, and running, are associated with a total of fourteen different behavioral contexts, including phone positions. A public domain dataset, i.e., ExtraSensory, is used for evaluating the proposed scheme using a series of machine learning classifiers. The Random Forest classifier achieves the best recognition rates of 88.4% and 89.8% in recognizing the five physical activities and the associated behavioral contexts, respectively, which demonstrates the efficacy of the proposed method. |
|---|---|
| Highlights: | • Novel scheme for activity-aware human context recognition (AAHCR) in-the-wild. • Integration of 14 diverse behavioral contexts with five physical activities for AAHCR. • Fusion of smartphone and watch accelerometer data for inferring user activity and context. • Detailed performance analysis of position-independent and position-dependent AAHCR. • Detailed comparative analysis of a series of machine learning classifiers for AAHCR. |
| ISSN: | 0167-739X, 1872-7115 |
| DOI: | 10.1016/j.future.2020.01.003 |
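The abstract reports that a Random Forest classifier performed best at recognizing the five activities from fused phone and watch accelerometer data. As a rough illustration of that kind of pipeline, the following is a minimal sketch using scikit-learn on synthetic per-window features; the feature layout, label encoding, and data are assumptions for demonstration only and do not reproduce the paper's ExtraSensory evaluation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# The five activity classes named in the abstract.
ACTIVITIES = ["lying", "sitting", "standing", "walking", "running"]

# Synthetic stand-in for per-window statistics (e.g., mean, std of each
# accelerometer axis) computed from fused phone + watch streams.
n_windows, n_features = 1000, 12
X = rng.normal(size=(n_windows, n_features))
y = rng.integers(0, len(ACTIVITIES), size=n_windows)

# Inject a weak class-dependent shift so the classifier has a signal to learn.
X[:, 0] += y

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)

acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy on synthetic data: {acc:.3f}")
```

With real data, the per-window features would be extracted from raw accelerometer segments and the labels drawn from the dataset's context annotations; the classifier choice and train/test protocol here are only a sketch of the comparative setup the abstract describes.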