Modeling the Bionic Compound Eye Vision System Based on Graph Neural Networks

Published in: IEEE Sensors Journal, vol. 25, no. 14, pp. 26748–26755
Main Authors: Arngold, Artem; Li, Yuan; Ren, Xuemei
Format: Journal Article
Language: English
Publication details: New York: IEEE, 15 July 2025
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 1530-437X, 1558-1748
Description
Summary: A bionic compound eye (CE) vision system is inspired by examples from nature, such as the eyes of dragonflies, mollusks, and other creatures. It is used for visual measurement and 3-D reconstruction at close range: the large number of overlapping miniaturized subeyes allows such systems to be applied in robot navigation, autonomous vehicles, medical endoscopy, and other fields. Calibration of the CE is difficult due to distortions and the large number of parameters to optimize. This work proposes a new method for CE modeling based on graph neural networks (GNNs). The model establishes a 2-D-to-3-D correspondence and addresses the missing-value problem that arises when an object is not captured by all subeyes. The results show that the proposed model estimates 3-D object coordinates and measures the Euclidean distance between objects more accurately than both a traditional calibration approach based on the pinhole camera model and a multilayer perceptron (MLP) model in which missing values are filled with zeros. A comparative analysis validates the design of the proposed GNN-based model.
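The key advantage described in the abstract is that a graph representation lets absent observations simply be absent nodes, whereas a fixed-size MLP input must be zero-filled. The following is a minimal, hypothetical sketch of that idea (the function name, edge layout, and combination weights are illustrative assumptions, not the paper's architecture): each subeye that sees the target is a node carrying its 2-D pixel coordinates, edges connect subeyes with overlapping fields of view, and one mean-aggregation message-passing step mixes neighbouring observations.

```python
# Hypothetical sketch of one GNN message-passing step over subeye observations.
# Subeyes that miss the target are simply absent from the graph, so no
# zero-filling is needed (unlike an MLP with a fixed-size input vector).

def aggregate(features, edges):
    """Mean-aggregate neighbour features (one message-passing step)."""
    n = len(features)
    neigh = {i: [] for i in range(n)}
    for a, b in edges:                      # undirected overlap graph
        neigh[a].append(features[b])
        neigh[b].append(features[a])
    out = []
    for i, f in enumerate(features):
        msgs = neigh[i] or [f]              # isolated node keeps its own feature
        mean = [sum(c) / len(msgs) for c in zip(*msgs)]
        # Blend self feature with the aggregated neighbour message (weights
        # chosen arbitrarily here; a trained model would learn them).
        out.append([0.5 * (x + m) for x, m in zip(f, mean)])
    return out

# Three subeyes observed the point; a fourth missed it and is simply omitted.
obs = [[120.0, 80.0], [124.0, 78.0], [118.0, 83.0]]
edges = [(0, 1), (1, 2), (0, 2)]            # overlapping fields of view

print(aggregate(obs, edges))
```

In a full model, a readout layer over the updated node features would regress the 3-D point coordinates; the sketch only illustrates how variable numbers of observations are handled without placeholder zeros.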
DOI: 10.1109/JSEN.2025.3575172