Fall detection algorithm based on pyramid network and feature fusion



Bibliographic Details
Published in: Evolving Systems, Vol. 15, No. 5, pp. 1957–1970
Authors: Li, Jiangjiao; Gao, Mengqi; Wang, Peng; Li, Bin
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.10.2024 (Springer Nature B.V.)
ISSN: 1868-6478, 1868-6486
Abstract: Accidental falls are the second leading cause of accidental death among the elderly, and early intervention can reduce the resulting harm. However, few studies to date have used a Transformer encoding module for feature extraction in fall detection, and the real-time performance of existing algorithms is limited. We therefore propose a Transformer-based fall detection method that extracts spatiotemporal features. Specifically, we use a convolutional image reduction module to shrink the input image and lower the computational cost. We then design a pyramid network based on an improved Transformer to extract spatial features, and a feature fusion module that merges spatial features of different scales. The fused features are fed into a gated recurrent unit to extract temporal features and classify falls versus normal postures. Experimental results show that the proposed approach achieves accuracies of 99.61% and 99.33% on the UR Fall Detection Dataset and the Le2i Fall Detection Dataset, respectively. Compared with state-of-the-art fall detection algorithms, our method attains high accuracy while maintaining a high detection speed.
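
The abstract describes a four-stage pipeline: convolutional image reduction, a Transformer-based pyramid for multi-scale spatial features, fusion of those features, and a gated recurrent unit for temporal modelling. The PyTorch sketch below only illustrates that general structure under assumed module sizes (embedding width, number of heads, two pyramid stages); it is not the authors' implementation, and names such as FallDetectorSketch are hypothetical.

# Illustrative PyTorch sketch (not the authors' code) of the pipeline named in the
# abstract: CNN-based image reduction, a two-scale Transformer "pyramid" for spatial
# features, fusion of the scales, and a GRU over time for fall / normal classification.
# All sizes (embed_dim, heads, stages) are assumptions made for a runnable example.
import torch
import torch.nn as nn

class FallDetectorSketch(nn.Module):                 # hypothetical name
    def __init__(self, embed_dim=64, num_classes=2):
        super().__init__()
        # Image reduction: a strided convolution shrinks each frame before attention.
        self.reduce = nn.Sequential(
            nn.Conv2d(3, embed_dim, kernel_size=7, stride=4, padding=3),
            nn.ReLU(inplace=True),
        )
        # Two Transformer encoder stages at different spatial resolutions stand in
        # for the improved Transformer pyramid described in the abstract.
        self.stage1 = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=embed_dim, nhead=4, batch_first=True),
            num_layers=1)
        self.down = nn.Conv2d(embed_dim, embed_dim, kernel_size=3, stride=2, padding=1)
        self.stage2 = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=embed_dim, nhead=4, batch_first=True),
            num_layers=1)
        # Feature fusion: concatenate pooled features from both scales and project.
        self.fuse = nn.Linear(2 * embed_dim, embed_dim)
        # Gated recurrent unit models the sequence of per-frame fused features.
        self.gru = nn.GRU(embed_dim, embed_dim, batch_first=True)
        self.head = nn.Linear(embed_dim, num_classes)

    def _spatial_features(self, frame):              # frame: (B, 3, H, W)
        x = self.reduce(frame)                       # (B, C, H/4, W/4)
        f1 = self.stage1(x.flatten(2).transpose(1, 2)).mean(dim=1)   # scale-1 feature
        x2 = self.down(x)                            # (B, C, H/8, W/8)
        f2 = self.stage2(x2.flatten(2).transpose(1, 2)).mean(dim=1)  # scale-2 feature
        return self.fuse(torch.cat([f1, f2], dim=-1))

    def forward(self, clip):                         # clip: (B, T, 3, H, W)
        feats = torch.stack(
            [self._spatial_features(clip[:, t]) for t in range(clip.shape[1])], dim=1)
        out, _ = self.gru(feats)                     # (B, T, C)
        return self.head(out[:, -1])                 # logits: fall vs. normal posture

if __name__ == "__main__":
    model = FallDetectorSketch()
    dummy = torch.randn(2, 8, 3, 64, 64)             # two clips of eight 64x64 frames
    print(model(dummy).shape)                         # torch.Size([2, 2])

A video clip is passed as a (batch, time, channels, height, width) tensor; each frame is reduced and encoded at two scales, the pooled scale features are fused, and the GRU's final state is mapped to two logits for fall versus normal posture.
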
DOI: 10.1007/s12530-024-09601-9