Bibliographic Details
| Title: |
Utilizing the YOLOv8 model for accurate hand recognition with complex background. |
| Authors: |
Kristianto, Budhi; Dewi, Christine; Purnomo, Hindriyanto Dwi; Hartomo, Kristoko Dwi; Mohd Hashim, Siti Zaiton |
| Source: |
PeerJ Computer Science; Oct 2025, p1-25, 25p |
| Keywords: |
Computer vision; Object recognition (computer vision); Human activity recognition; Convolutional neural networks |
| Abstract: |
Background: Recognizing human hands is essential in the pre-processing stage of many computer vision tasks, since hands are actively involved in the actions being analyzed. Such tasks include hand posture estimation, hand gesture recognition, and human activity analysis. The human hand has a wide range of motion and undergoes many morphological changes, and the presence of numerous individuals in a limited area complicates the precise identification of distinct hand movements. The motivation of this research is to open up new ways to address these problems. Methods: This article provides a concise analysis of convolutional neural network (CNN)-based object detection algorithms, with particular emphasis on the YOLOv8n and YOLOv8s models trained for 50 and 100 epochs. The research examines various object detection algorithms, including ones specifically applied to hand identification. Furthermore, the proposed method is trained and evaluated on the Oxford Hand Dataset and the EgoHand Dataset using the YOLOv8 framework. Performance is quantified by the number of giga floating-point operations (GFLOPs), the mean average precision (mAP), and the detection time. Results: The experiments show that YOLOv8n trained for 100 epochs yields more reliable results than previously published methods. During training, the model achieved an mAP of 86.7% on the Oxford Hand Dataset and 98.9% on the EgoHand Dataset. Moreover, YOLOv8n with 100 epochs surpasses the highest mAP reported in prior research on both datasets. [ABSTRACT FROM AUTHOR] |
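For readers unfamiliar with the headline metric, the mAP figures quoted in the abstract are means of per-class average precision; with a single "hand" class, mAP reduces to AP. The following is a minimal illustrative sketch, not the authors' code (the function name and inputs are assumptions), of all-points-interpolated AP computed from ranked detections:

```python
def average_precision(detections, num_gt):
    """All-points-interpolated average precision for one class.

    detections: list of (confidence, is_true_positive) pairs, one per
                predicted box; num_gt: number of ground-truth boxes.
    """
    # Rank predictions by confidence, then accumulate TP/FP counts.
    dets = sorted(detections, key=lambda d: -d[0])
    tp = fp = 0
    recalls, precisions = [], []
    for _, is_tp in dets:
        tp += bool(is_tp)
        fp += not is_tp
        recalls.append(tp / num_gt)
        precisions.append(tp / (tp + fp))
    # Precision envelope: make the curve non-increasing from the right.
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    # Integrate precision over recall.
    ap, prev_r = 0.0, 0.0
    for r, p in zip(recalls, precisions):
        ap += (r - prev_r) * p
        prev_r = r
    return ap


# Hypothetical example: 4 ground-truth hands, 4 detections, one false positive.
print(average_precision([(0.9, True), (0.8, True), (0.7, False), (0.6, True)], 4))
# → 0.6875
```

Whether a detection counts as a true positive is decided by an IoU threshold against the ground-truth boxes (e.g. 0.5 for the mAP@0.5 values typically reported for YOLO models).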
|
Copyright of PeerJ Computer Science is the property of PeerJ Inc. and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.) |
| Database: |
Complementary Index |