Modeling the Bionic Compound Eye Vision System Based on Graph Neural Networks

Bibliographic Details
Published in: IEEE Sensors Journal, Vol. 25, No. 14, pp. 26748–26755
Main Authors: Arngold, Artem; Li, Yuan; Ren, Xuemei
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 15.07.2025
ISSN: 1530-437X, 1558-1748
Description
Summary: A bionic compound eye (CE) vision system is inspired by examples from nature, such as the eyes of dragonflies, mollusks, and other organisms. It is used for visual measurement and 3-D reconstruction at close range: the large number of overlapping miniaturized subeyes allows such systems to be applied in robot navigation, autonomous vehicles, medical endoscopy, and other domains. Calibrating a CE is difficult because of lens distortions and the large number of parameters to optimize. This work proposes a new method for CE modeling based on graph neural networks (GNNs). The model establishes a 2-D-to-3-D correspondence while handling the missing values that arise when an object is not captured by all subeyes. The results show that the proposed model outperforms both a traditional calibration approach based on the pinhole camera model and a multilayer perceptron (MLP) model in which missing values are filled with zeros, both in estimating 3-D object coordinates and in measuring the Euclidean distance between objects. A comparative analysis validates the design of the proposed GNN-based model.
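The abstract's key contrast is how a GNN sidesteps the missing-value problem: an MLP needs a fixed-length input, so absent subeyes must be zero-filled, whereas a graph simply contains one node per subeye that actually observed the point, and a permutation-invariant aggregation works for any node count. The following is a minimal NumPy sketch of that idea, not the paper's architecture; the node features, weights, and aggregation are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: each subeye that observes the point contributes a
# node with features [u, v, cx, cy] -- its 2-D pixel measurement plus the
# subeye's center. Mean aggregation over the *present* nodes replaces the
# zero-filled fixed input an MLP would need.

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 16))   # node encoder (random stand-in weights)
W2 = rng.normal(size=(16, 3))   # readout to a 3-D coordinate estimate

def gnn_estimate(nodes):
    """nodes: (k, 4) array, one row per subeye that saw the object."""
    h = np.tanh(nodes @ W1)      # encode each observation independently
    m = h.mean(axis=0)           # permutation-invariant aggregation over k nodes
    return m @ W2                # predicted (x, y, z)

# Object seen by only 3 of the subeyes: the graph just has 3 nodes.
obs = np.array([[0.12, 0.40, -1.0, 0.0],
                [0.08, 0.37,  0.0, 0.0],
                [0.15, 0.42,  1.0, 0.0]])
p3 = gnn_estimate(obs)       # works with k = 3 ...
p2 = gnn_estimate(obs[:2])   # ... and with k = 2, no zero padding needed
```

Because the aggregation is a mean over however many rows are present, dropping a view changes only `k`, never the input layout; an MLP with weights of shape `(n_subeyes * 4, ...)` would instead need placeholder zeros for every unseen subeye.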
DOI: 10.1109/JSEN.2025.3575172