Handwriting-based gender and handedness classification using convolutional neural networks

Detailed bibliography
Published in: Multimedia Tools and Applications, Vol. 80, No. 28-29, pp. 35341-35364
Main authors: Rahmanian, Mina; Shayegan, Mohammad Amin
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.11.2021
ISSN: 1380-7501, 1573-7721
Description
Summary: Demographic handwriting classification has many applications in disciplines such as biometrics, forensics, psychology, and archaeology. Finding the best features for differentiating subclasses (e.g. men and women) is one of the major problems in handwriting-based demographic classification. Advanced Convolutional Neural Network (CNN) models have a higher capacity for extracting appropriate features than traditional models. In this paper, the ability of deep CNNs to automatically classify two handwriting-based demographic problems, i.e. gender and handedness, is examined using three advanced CNNs: DenseNet201, InceptionV3, and Xception. Two databases, IAM (English texts) and KHATT (Arabic texts), have been employed in this study. The results show that the proposed CNN architectures improve classification performance, achieving 84% accuracy (a 1.27% improvement) for gender classification on the IAM database and 99.14% accuracy (a 28.23% improvement) for handedness classification on the KHATT database.
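
The abstract reports applying advanced CNN architectures (DenseNet201, InceptionV3, Xception) to binary gender and handedness classification from handwriting images. Below is a minimal Keras sketch of that general setup, training an ImageNet-pretrained DenseNet201 as a binary gender classifier. It is not the authors' code: the data directory, input size, preprocessing, and hyperparameters are illustrative assumptions, and the record does not state whether pretrained weights were actually used in the paper.

# Minimal sketch, not the authors' code: an ImageNet-pretrained DenseNet201
# backbone with a new binary head for handwriting-based gender classification.
# IMG_SIZE, DATA_DIR, and all hyperparameters are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import DenseNet201

IMG_SIZE = (224, 224)              # assumed input resolution
DATA_DIR = "handwriting_patches"   # hypothetical folder: one subdirectory per class (female/male)

# Handwriting image patches labelled by writer gender, one subdirectory per class.
train_ds = tf.keras.utils.image_dataset_from_directory(
    DATA_DIR, image_size=IMG_SIZE, batch_size=32, label_mode="binary")

# Pretrained backbone without its ImageNet classification head.
base = DenseNet201(include_top=False, weights="imagenet",
                   input_shape=IMG_SIZE + (3,), pooling="avg")
base.trainable = False  # freeze the backbone; only the new head is trained

model = models.Sequential([
    layers.Rescaling(1.0 / 255),            # simple [0, 1] scaling (simplified preprocessing)
    base,
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),  # binary output, e.g. female vs. male
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)

The handedness task would follow the same pattern with left-/right-handed labels, and swapping DenseNet201 for InceptionV3 or Xception only changes the backbone import and input size.
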
DOI: 10.1007/s11042-020-10170-7