Wireless Sensing-based Daily Activity Tracking System Deployment in Low-Income Senior Housing Environments
| Published in: | Proceedings of the annual International Conference on Mobile Computing and Networking, Vol. 2024; p. 2260 |
|---|---|
| Main authors: | , , , |
| Medium: | Journal Article |
| Language: | English |
| Published: | United States, 04.12.2024 |
| Topics: | |
| ISSN: | 1543-5679 |
| Summary: | Maintaining independence in daily activities and mobility is critical for healthy aging. Older adults who are losing the ability to care for themselves or to ambulate are at high risk of adverse health outcomes and decreased quality of life. It is essential to monitor daily activities and mobility routinely and to capture early decline before a clinical symptom arises. Existing solutions rely on self-reports or on technology-based approaches that depend on cameras or wearables to track daily activities; however, these solutions have various issues (e.g., bias, privacy, the burden of carrying/recharging devices) and are not well suited for seniors. In this study, we discuss a non-invasive, low-cost wireless sensing-based solution to track the daily activities of low-income older adults. The proposed sensing solution relies on deep learning-based fine-grained analysis of ambient WiFi signals and is non-invasive compared to existing video- or wearable-based solutions. We deployed this system in real senior housing settings for a week and evaluated its performance. Our initial results show that this low-cost system can detect a variety of the participants' daily activities with an accuracy of up to 76.90%. |
|---|---|
| DOI: | 10.1145/3636534.3698115 |
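
The summary describes a deep learning classifier operating on fine-grained ambient WiFi measurements, but this record gives no architectural or preprocessing details. The sketch below is only a generic, hypothetical illustration of that kind of pipeline (a small 1D CNN classifying fixed-length windows of WiFi channel state information), not the authors' model; the subcarrier count, window length, activity set, and all names are assumptions.

```python
# Illustrative sketch only: a generic 1D-CNN classifier over windows of WiFi CSI
# amplitude data. The paper's actual architecture, preprocessing, and activity
# labels are not given in this record; all shapes and names below are assumptions.
import torch
import torch.nn as nn

NUM_SUBCARRIERS = 30      # assumed number of CSI subcarriers per packet
WINDOW_LEN = 256          # assumed number of CSI samples per activity window
NUM_ACTIVITIES = 6        # assumed set of daily activities (walking, sitting, ...)

class CSIActivityNet(nn.Module):
    """Minimal 1D CNN: treats subcarriers as channels and time as the 1D axis."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_SUBCARRIERS, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global average pooling over time
        )
        self.classifier = nn.Linear(128, NUM_ACTIVITIES)

    def forward(self, x):              # x: (batch, NUM_SUBCARRIERS, WINDOW_LEN)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = CSIActivityNet()
    dummy_window = torch.randn(8, NUM_SUBCARRIERS, WINDOW_LEN)  # fake CSI batch
    print(model(dummy_window).shape)   # -> torch.Size([8, 6]) activity logits
```

In practice such a classifier would be trained on labeled CSI windows collected in the deployment environment; the details of data collection, labeling, and training for this particular study are not described in this record.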