Fall detection algorithm based on pyramid network and feature fusion


Detailed bibliography
Published in: Evolving Systems, Volume 15, Issue 5, pp. 1957–1970
Main authors: Li, Jiangjiao; Gao, Mengqi; Wang, Peng; Li, Bin
Medium: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.10.2024 (Springer Nature B.V.)
ISSN: 1868-6478, 1868-6486
Description
Summary: Accidental falls are the second leading cause of accidental death among the elderly, and early intervention measures can mitigate this problem. However, few studies to date have used Transformer encoding modules for feature extraction in fall detection, and the real-time performance of existing algorithms is limited. We therefore propose a Transformer-based fall detection method that extracts spatiotemporal features. Specifically, we use an image reduction module based on a convolutional neural network to reduce the image size for computation. Then, we design a pyramid network based on an improved Transformer to extract spatial features. Finally, we design a feature fusion module that fuses spatial features of different scales. The fused features are fed into a gated recurrent unit to extract temporal features and classify falls versus normal postures. Experimental results show that the proposed approach achieves accuracies of 99.61% and 99.33% on the UR Fall Detection Dataset and the Le2i Fall Detection Dataset, respectively. Compared with state-of-the-art fall detection algorithms, our method achieves high accuracy while maintaining a high detection speed.
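
The abstract outlines a four-stage pipeline: CNN-based image reduction, a Transformer-based pyramid for spatial features, multi-scale feature fusion, and a GRU for temporal modeling. Below is a minimal PyTorch-style sketch of such a pipeline, assuming standard nn.TransformerEncoderLayer stages in place of the paper's improved Transformer; all module names, dimensions, and hyperparameters are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of the pipeline described in the abstract: CNN-based
# image reduction -> Transformer pyramid (spatial features) -> multi-scale
# fusion -> GRU (temporal features) -> fall / normal classification.
# Module names and sizes are illustrative, not the authors' code.
import torch
import torch.nn as nn

class FallDetectorSketch(nn.Module):
    def __init__(self, num_classes=2, dim=64):
        super().__init__()
        # Image reduction: strided convolutions shrink each frame before attention.
        self.reduce = nn.Sequential(
            nn.Conv2d(3, dim, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(dim, dim, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # "Pyramid": Transformer encoder stages at two resolutions
        # (a stand-in for the paper's improved Transformer).
        self.stage1 = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.stage2 = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.down = nn.Conv2d(dim, dim, kernel_size=3, stride=2, padding=1)
        # Fusion: concatenate pooled features from both scales and project.
        self.fuse = nn.Linear(2 * dim, dim)
        # Temporal model: GRU over per-frame fused features.
        self.gru = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, clip):                      # clip: (B, T, 3, H, W)
        feats = []
        for t in range(clip.shape[1]):
            x = self.reduce(clip[:, t])           # (B, dim, H/4, W/4)
            tok1 = x.flatten(2).transpose(1, 2)   # (B, N1, dim)
            f1 = self.stage1(tok1).mean(dim=1)    # pooled fine-scale feature
            x2 = self.down(x)                     # coarser scale
            tok2 = x2.flatten(2).transpose(1, 2)  # (B, N2, dim)
            f2 = self.stage2(tok2).mean(dim=1)    # pooled coarse-scale feature
            feats.append(self.fuse(torch.cat([f1, f2], dim=-1)))
        seq = torch.stack(feats, dim=1)           # (B, T, dim)
        out, _ = self.gru(seq)
        return self.head(out[:, -1])              # logits: fall vs. normal posture

A video clip would be passed as a tensor of shape (batch, frames, 3, height, width); reading out only the last GRU output for classification is one of several plausible choices and is not specified by the abstract.
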
DOI: 10.1007/s12530-024-09601-9