DBF-YOLO: a fall detection algorithm for complex scenes

Bibliographic Details
Published in: Signal, Image and Video Processing, Vol. 19, No. 7, p. 532
Main Authors: Wang, Xu; Xiang, Xiaodong
Format: Journal Article
Language:English
Published: London: Springer London (Springer Nature B.V.), 01.07.2025
Subjects:
ISSN: 1863-1703, 1863-1711
Description
Summary: In complex environments, factors such as varying light intensity, occluded body shapes, crowding, and large changes in target scale lead to low accuracy, reduced detection efficiency, and frequent missed detections in human fall detection. This paper proposes an optimized version of YOLOv8, introducing an enhanced fall detection model: DBF-YOLO. First, the C2f module in the backbone network is combined with deformable convolution (DCNv2) to enhance feature extraction for irregularly shaped fall targets. Next, in the neck module, a BiFPN (Bidirectional Feature Pyramid Network) replaces the PAN (Path Aggregation Network), strengthening feature fusion and improving model performance. Finally, Focal-EIoU is incorporated into the enhanced network, replacing the original CIoU loss to mitigate the adverse effects of complex environments on detection performance. Compared with YOLOv8n, the DBF-YOLO fall detection algorithm improves precision, recall, mAP@0.5, and mAP@0.5:0.95 by 4%, 1.6%, 2.6%, and 0.9%, respectively, while achieving a frame rate of 243.9 FPS, meeting real-time detection requirements. Experimental results demonstrate that the DBF-YOLO model significantly enhances the detection accuracy of human falls in complex environments.
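The Focal-EIoU substitution mentioned in the summary can be sketched as follows. This is a minimal illustrative Python sketch of the general Focal-EIoU formulation (IoU^gamma weighting of the EIoU loss), not the authors' implementation; the `gamma` value and the small epsilon constants are illustrative assumptions.

```python
def eiou_loss(box_a, box_b):
    """EIoU loss between two axis-aligned boxes (x1, y1, x2, y2).

    EIoU = 1 - IoU
         + centre-distance term  rho^2 / c^2
         + width term  (w_a - w_b)^2 / cw^2
         + height term (h_a - h_b)^2 / ch^2
    where cw, ch, c describe the smallest enclosing box.
    """
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b

    # Intersection and union for plain IoU.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter + 1e-9)

    # Smallest enclosing box.
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    c2 = cw * cw + ch * ch + 1e-9

    # Squared distance between box centres.
    rho2 = ((ax1 + ax2) / 2 - (bx1 + bx2) / 2) ** 2 \
         + ((ay1 + ay2) / 2 - (by1 + by2) / 2) ** 2

    # Width / height difference terms.
    w_term = ((ax2 - ax1) - (bx2 - bx1)) ** 2 / (cw * cw + 1e-9)
    h_term = ((ay2 - ay1) - (by2 - by1)) ** 2 / (ch * ch + 1e-9)

    return 1.0 - iou + rho2 / c2 + w_term + h_term, iou


def focal_eiou_loss(box_a, box_b, gamma=0.5):
    """Focal-EIoU: re-weight the EIoU loss by IoU**gamma so that
    higher-quality (higher-IoU) boxes contribute more to the gradient."""
    l_eiou, iou = eiou_loss(box_a, box_b)
    return (iou ** gamma) * l_eiou
```

The IoU^gamma factor is what distinguishes Focal-EIoU from plain EIoU: it down-weights the many low-overlap boxes that would otherwise dominate training in cluttered scenes.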
DOI: 10.1007/s11760-025-04179-4