A modified YOLOv3 detection method for vision-based water surface garbage capture robot

Published in: International Journal of Advanced Robotic Systems, Volume 17, Issue 3
Main authors: Li, Xiali; Tian, Manjun; Kong, Shihan; Wu, Licheng; Yu, Junzhi
Format: Journal Article
Language: English
Published: London, England: SAGE Publications, 1 May 2020
ISSN: 1729-8806, 1729-8814
Description
Summary: To tackle the water surface pollution problem, a vision-based water surface garbage capture robot has been developed in our lab. In this article, we present a modified you only look once v3 (YOLOv3)-based garbage detection method, allowing real-time, high-precision object detection in dynamic aquatic environments. More specifically, to improve real-time detection performance, the detection scales of YOLOv3 are reduced from three to two. In addition, to guarantee detection accuracy, the anchor boxes of our training data set are re-clustered to replace those original YOLOv3 prior anchor boxes that are not appropriate for our data set. By virtue of the proposed detection method, the capture robot is able to clean floating garbage in the field. Experimental results demonstrate that both the detection speed and accuracy of the modified YOLOv3 surpass those of other object detection algorithms. The obtained results provide valuable insight into the autonomous, intelligent high-speed detection and grasping of dynamic objects in complex aquatic environments.
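The abstract itself contains no code, but the anchor re-clustering it describes is commonly implemented in YOLO-style detectors as k-means over the (width, height) pairs of ground-truth boxes, using a 1 − IoU distance so that clustering favors shape similarity over raw size. The sketch below illustrates that general technique only; the function names and the use of NumPy are our assumptions, not the authors' implementation.

```python
import numpy as np

def iou_wh(boxes, anchors):
    """Pairwise IoU between (w, h) pairs, assuming boxes and anchors
    are aligned at a common corner (standard for anchor clustering)."""
    inter_w = np.minimum(boxes[:, None, 0], anchors[None, :, 0])
    inter_h = np.minimum(boxes[:, None, 1], anchors[None, :, 1])
    inter = inter_w * inter_h
    union = (boxes[:, 0] * boxes[:, 1])[:, None] \
          + (anchors[:, 0] * anchors[:, 1])[None, :] - inter
    return inter / union

def kmeans_anchors(boxes, k, iters=100, seed=0):
    """Cluster box (w, h) pairs with k-means under the 1 - IoU distance."""
    rng = np.random.default_rng(seed)
    anchors = boxes[rng.choice(len(boxes), size=k, replace=False)]
    for _ in range(iters):
        # Minimizing 1 - IoU is the same as maximizing IoU.
        assign = np.argmax(iou_wh(boxes, anchors), axis=1)
        new = np.array([boxes[assign == j].mean(axis=0)
                        if np.any(assign == j) else anchors[j]
                        for j in range(k)])
        if np.allclose(new, anchors):
            break
        anchors = new
    # Sort by area, smallest first, as anchors are typically listed.
    return anchors[np.argsort(anchors[:, 0] * anchors[:, 1])]
```

With two detection scales instead of three, one would cluster into, say, k = 6 anchors and assign the three smallest to the finer scale and the three largest to the coarser one (an illustrative split, not a detail given in the abstract).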
DOI: 10.1177/1729881420932715