Towards Autonomous Visual Navigation in Arable Fields


Bibliographic Details
Published in: Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 6585-6592
Main authors: Ahmadi, Alireza; Halstead, Michael; McCool, Chris
Format: Conference paper
Language: English
Published: IEEE, 23 October 2022
ISSN: 2153-0866
Online access: Full text
Description
Summary: Autonomous navigation of a robot in agricultural fields is essential for tasks ranging from crop monitoring to weed management and fertilizer application. Many current approaches rely on accurate GPS; however, such technology is expensive and can be impacted by lack of coverage. Autonomous navigation through sensors that can interpret their environment (such as cameras) is therefore important to achieving autonomy in agriculture. In this paper, we introduce a purely vision-based navigation scheme that reliably guides the robot through row-crop fields using computer vision and signal-processing techniques without manual intervention. Independent of any global localization or mapping, this approach accurately follows the crop rows and switches between rows using only onboard cameras. The proposed navigation scheme can be deployed in a wide range of fields with different canopy shapes at various growth stages, yielding a crop-agnostic navigation approach. It was evaluated under various illumination conditions in simulated and real fields, where we achieve an average navigation accuracy of 3.82 cm on BonnBot-I with minimal human intervention (hyper-parameter tuning).
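The paper itself describes the navigation pipeline in detail; purely as a hypothetical illustration (not the authors' implementation), one common way a vision-only system can localize a crop row is to segment vegetation with an excess-green index, accumulate a column histogram of vegetation pixels, and steer toward the dominant peak. The function names, threshold, and toy image below are invented for this sketch:

```python
# Hypothetical sketch of vision-based crop-row following (NOT the
# method from the paper): segment vegetation with an excess-green
# index, build a column histogram, and derive a lateral offset.

def excess_green_mask(image, threshold=20):
    """image: list of rows of (r, g, b) tuples -> binary mask (0/1).
    Excess green = 2g - r - b highlights plant pixels over soil."""
    return [[1 if 2 * g - r - b > threshold else 0 for (r, g, b) in row]
            for row in image]

def column_histogram(mask):
    """Count vegetation pixels in each image column."""
    width = len(mask[0])
    return [sum(row[c] for row in mask) for c in range(width)]

def row_center_offset(hist):
    """Offset (pixels) of the strongest crop-row peak from image center.
    Sign indicates which way the robot should steer to recenter."""
    peak = max(range(len(hist)), key=lambda c: hist[c])
    return peak - (len(hist) - 1) / 2.0

# Toy 4x5 image: a green "crop row" in column 3, brown soil elsewhere.
soil, plant = (120, 100, 80), (40, 180, 50)
image = [[plant if c == 3 else soil for c in range(5)] for _ in range(4)]
mask = excess_green_mask(image)
offset = row_center_offset(column_histogram(mask))  # row sits right of center
```

A real system would of course operate on camera frames (e.g. via OpenCV) and add temporal filtering and row-switching logic, but the histogram-peak idea conveys how a lateral control signal can be obtained without GPS or a global map.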
DOI: 10.1109/IROS47612.2022.9981299