DBF-YOLO: a fall detection algorithm for complex scenes

Published in: Signal, Image and Video Processing, Vol. 19, No. 7, p. 532
Main authors: Wang, Xu; Xiang, Xiaodong
Format: Journal Article
Language: English
Published: London: Springer London, 01.07.2025
Springer Nature B.V.
ISSN: 1863-1703, 1863-1711
Description
Abstract: In complex environments, factors such as varying light intensities, occluded body shapes, crowding, and significant changes in target scale result in low accuracy, reduced detection efficiency, and frequent missed detections in human fall detection. This paper proposes an optimized version of YOLOv8, introducing an enhanced fall detection model: DBF-YOLO. First, the C2f module in the backbone network is combined with deformable convolution (DCNv2) to enhance feature extraction for irregularly shaped fall targets. Next, in the neck module, a BiFPN (Bidirectional Feature Pyramid Network) replaces the PAN (Path Aggregation Network), strengthening feature fusion and improving model performance. Finally, Focal-EIoU is incorporated into the enhanced network, replacing the original CIoU loss to mitigate the adverse effects of complex environments on detection performance. Compared to YOLOv8n, the DBF-YOLO fall detection algorithm improves precision, recall, mAP@0.5, and mAP@0.5:0.95 by 4%, 1.6%, 2.6%, and 0.9%, respectively, while achieving a frame rate of 243.9 FPS, meeting real-time detection requirements. Experimental results demonstrate that DBF-YOLO significantly enhances the detection accuracy of human falls in complex environments.
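The Focal-EIoU loss mentioned in the abstract extends the IoU loss with an enclosing-box-normalized center-distance term and separate width/height penalty terms, then re-weights the result by IoU^γ so that higher-quality box matches dominate the gradient. A minimal sketch for axis-aligned boxes follows; the function name, the (x1, y1, x2, y2) box layout, and the γ = 0.5 default are illustrative assumptions, not details taken from the paper:

```python
def focal_eiou_loss(box_p, box_g, gamma=0.5, eps=1e-7):
    """Sketch of the Focal-EIoU loss between one predicted and one
    ground-truth box, each given as (x1, y1, x2, y2)."""
    px1, py1, px2, py2 = box_p
    gx1, gy1, gx2, gy2 = box_g

    # Intersection / union -> IoU
    iw = max(0.0, min(px2, gx2) - max(px1, gx1))
    ih = max(0.0, min(py2, gy2) - max(py1, gy1))
    inter = iw * ih
    union = ((px2 - px1) * (py2 - py1)
             + (gx2 - gx1) * (gy2 - gy1) - inter + eps)
    iou = inter / union

    # Smallest enclosing box, used to normalize the penalty terms
    cw = max(px2, gx2) - min(px1, gx1)   # enclosing width
    ch = max(py2, gy2) - min(py1, gy1)   # enclosing height
    c2 = cw ** 2 + ch ** 2 + eps         # enclosing diagonal squared

    # Squared distance between box centers
    rho2 = (((px1 + px2) - (gx1 + gx2)) ** 2
            + ((py1 + py2) - (gy1 + gy2)) ** 2) / 4.0

    # Separate width and height difference terms (the "E" in EIoU)
    dw2 = ((px2 - px1) - (gx2 - gx1)) ** 2
    dh2 = ((py2 - py1) - (gy2 - gy1)) ** 2

    eiou = ((1 - iou) + rho2 / c2
            + dw2 / (cw ** 2 + eps) + dh2 / (ch ** 2 + eps))

    # Focal re-weighting: scale by IoU^gamma so well-overlapping
    # (high-quality) boxes contribute more to the regression gradient
    return (iou ** gamma) * eiou
```

For identical boxes the loss is approximately zero, and it grows as the boxes drift apart or change aspect ratio; in a real detector this would be computed on tensors over all matched anchors rather than on single box pairs.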
DOI:10.1007/s11760-025-04179-4