Kinship Verification in Childhood Images Using Vision Transformer
| Published in: | Procedia Computer Science Vol. 258; pp. 3105–3114 |
|---|---|
| Main Authors: | , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | Elsevier B.V., 2025 |
| Subjects: | |
| ISSN: | 1877-0509 |
| Summary: | Facial kinship verification involves determining whether two face images belong to relatives, a task that is particularly challenging because of subtle differences in facial features and large intra-class variations. In recent years, deep learning models have shown great promise in addressing this problem. In this work, we propose a Vision Transformer (ViT) model for facial kinship verification, leveraging the proven effectiveness of Transformer architectures in natural language processing. The Vision Transformer is trained end-to-end on two benchmark datasets: the large-scale Families in the Wild (FIW) dataset, consisting of thousands of face images with corresponding kinship labels, and the smaller KinFaceW-II dataset. Our model employs multiple attention mechanisms to capture complex relationships between facial features and produce a final kinship prediction. Experimental results demonstrate that our approach outperforms state-of-the-art methods, achieving an average accuracy of 92% on the FIW dataset and an F1 score of 0.85. The Euclidean distance metric further enhances the classification of kin and non-kin pairs. These findings confirm the effectiveness of Vision Transformer models for facial kinship verification and underscore their potential for future research in this domain. |
|---|---|
| ISSN: | 1877-0509 |
| DOI: | 10.1016/j.procs.2025.04.568 |
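The abstract notes that a Euclidean distance metric is used to separate kin from non-kin pairs. As a minimal sketch only (not the authors' implementation), the decision rule can be illustrated with placeholder embeddings: here `parent`, `child`, and `stranger` are hypothetical vectors standing in for ViT face embeddings, and the threshold is an assumed value that would in practice be tuned on a validation split of FIW or KinFaceW-II.

```python
import numpy as np

def euclidean_distance(a: np.ndarray, b: np.ndarray) -> float:
    """L2 distance between two face embeddings."""
    return float(np.linalg.norm(a - b))

def verify_kinship(emb_a: np.ndarray, emb_b: np.ndarray,
                   threshold: float = 1.0) -> bool:
    """Predict 'kin' when the embedding distance falls below the threshold.

    The threshold value here is hypothetical; a real system would
    calibrate it on held-out labeled pairs.
    """
    return euclidean_distance(emb_a, emb_b) < threshold

# Placeholder 3-d embeddings standing in for ViT outputs.
parent = np.array([0.1, 0.9, 0.3])
child = np.array([0.2, 0.8, 0.4])
stranger = np.array([2.0, -1.0, 0.0])

print(verify_kinship(parent, child))     # close pair -> True
print(verify_kinship(parent, stranger))  # distant pair -> False
```

The same thresholded-distance rule generalizes to any embedding dimension; only the calibration of the threshold changes.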