Visual Servoing Drone Utilizing Ground Robot Detection Model for UAV-UGV Alignment for Retrieval Operations

Bibliographic Details
Title: Visual Servoing Drone Utilizing Ground Robot Detection Model for UAV-UGV Alignment for Retrieval Operations
Authors: Jamaica Mae Pepito, John Mel Bolaybolay, Earl Ryan Aleluya, Francis Jann Alagon, Steve Clar, Immanuel Paradela, Sherwin Guirnaldo, Jeanette Pao, Carl John Salaan, Argel Bandala
Source: IEEE Access, Vol 13, Pp 103866-103879 (2025)
Publisher Information: Institute of Electrical and Electronics Engineers (IEEE), 2025.
Publication Year: 2025
Subject Terms: Visual servoing, multi-robot systems, object detection, unmanned aerial vehicles, computer vision, Electrical engineering. Electronics. Nuclear engineering (LCC: TK1-9971)
Description: Hazardous and remote environments, such as volcanic regions and disaster zones, require regular monitoring due to the significant risks they pose. Innovative solutions like aerial and mobile robots offer safer monitoring options, though drones are limited by short battery life and mobile robots face mobility challenges. Collaborative aerial-ground robot systems have been developed to address these issues, with drones transporting and recovering ground robots. Deploying the ground robot is easy, but recovering it is challenging due to drone Global Positioning System (GPS) inaccuracies, which highlights the need for a precise visual alignment technique. This study utilized the YOLOv8n segmentation model to develop a ground robot detection model. Development involved gathering images of ground robots in various environmental settings, such as rocks, seashores, sand, grasslands, limestone, and soil. Three models were tested: plate detection, ground robot detection, and a combined plate and ground robot detection model. The ground robot detection model emerged as the most robust, achieving an F1 score of 99.7% and an mAP of 95.4% at 0.5 to 0.96 IoU thresholds. The model was integrated into a small onboard computer on a visual servoing drone and tested outdoors. The alignment threshold was set at 65 cm, based on the size of the retrieval mechanism. The experiments indicate that the drone achieved an average alignment error of 41.985 cm within 7.82 seconds. These results demonstrate the effectiveness of the UAV-UGV alignment for retrieval tasks in hazardous and remote areas.
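The alignment approach summarized above can be sketched as a simple image-based visual-servoing loop: the detection model yields the ground robot's centroid in the camera image, and the offset from the image center drives proportional velocity commands until the drone hovers within the retrieval tolerance. The following is a minimal illustrative sketch; the gains, image resolution, pixel tolerance, and function name are assumptions, not details taken from the paper:

```python
# Minimal image-based visual-servoing sketch (illustrative only).
# Assumes a detector that returns the ground robot's mask centroid
# in pixel coordinates; all constants below are hypothetical.

IMAGE_W, IMAGE_H = 640, 480   # camera resolution (assumed)
KP = 0.002                    # proportional gain, m/s per pixel (assumed)
ALIGN_TOL_PX = 20             # pixel tolerance ~ retrieval threshold (assumed)

def servo_command(centroid_x, centroid_y):
    """Map the detected centroid's offset from the image center to
    forward/lateral velocity commands (vx, vy) and an 'aligned' flag."""
    err_x = centroid_x - IMAGE_W / 2   # + means target is right of center
    err_y = centroid_y - IMAGE_H / 2   # + means target is below center
    if abs(err_x) <= ALIGN_TOL_PX and abs(err_y) <= ALIGN_TOL_PX:
        return (0.0, 0.0, True)        # aligned: hover and begin retrieval
    vx = -KP * err_y                   # forward/backward correction
    vy = KP * err_x                    # left/right correction
    return (vx, vy, False)
```

For example, a centroid detected at (400, 300) gives pixel errors (80, 60) and commands vy = 0.16 m/s right and vx = -0.12 m/s backward; a centroid at the image center returns zero velocities and signals alignment.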
Document Type: Article
ISSN: 2169-3536
DOI: 10.1109/access.2025.3578612
Access URL: https://doaj.org/article/08dec48cb2a44517b92565bafb5ea977
Rights: CC BY NC ND
Accession Number: edsair.doi.dedup.....539ca19c5505b2af2c545a83b6050f03
Database: OpenAIRE