A Research on Character Feature Extraction for Computer Vision and Pattern Recognition


Bibliographic Details
Published in: International journal of information technologies and systems approach Vol. 17; Issue 1; pp. 1-19
Main authors: Wang, Xiaoyuan, Wang, Hongfei, Wang, Jianping, Ge, Jingjing, Dong, Haiyan
Format: Journal Article
Language: English
Published: Hershey: IGI Global, 09.01.2025
Subjects:
ISSN: 1935-570X, 1935-5718
Online access: Full text
Description
Abstract: Character feature extraction is a key area in computer vision and pattern recognition. Traditional methods often rely on manually designed extractors, which struggle to capture the complex structures and abstract features in character images, limiting their performance. Training and tuning these models also requires considerable computational resources and time, reducing efficiency. This paper explores and compares several character feature extraction methods and integrates two-dimensional wavelet decomposition with grid-based statistical and structural features. A detailed design of wavelet coarse and fine grid feature vectors is presented, starting with the construction and extraction of the wavelet coarse grid feature vectors, followed by the finer grid feature vectors. The wavelet fine grid features show stronger specificity and discriminative power than the coarse grid features. Experimental validation on 108 character samples yielded a 97.4% success rate, confirming the practicality and effectiveness of the proposed feature extraction method.
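
The pipeline the abstract outlines (a two-dimensional wavelet decomposition whose sub-bands are then summarized by grid-based statistics) can be illustrated with a minimal Python sketch. This is an assumption-laden illustration, not the authors' implementation: the Haar wavelet, the 4x4 grid, the mean-absolute-energy statistic, and names such as grid_features and wavelet_grid_vector are chosen here for demonstration only.

import numpy as np
import pywt  # PyWavelets, used here for the 2-D wavelet decomposition


def grid_features(coeffs, grid=4):
    # Split a coefficient map into grid x grid cells and take the mean
    # absolute value of each cell as one statistical feature (illustrative choice).
    h, w = coeffs.shape
    feats = []
    for i in range(grid):
        for j in range(grid):
            cell = coeffs[i * h // grid:(i + 1) * h // grid,
                          j * w // grid:(j + 1) * w // grid]
            feats.append(np.abs(cell).mean())
    return np.array(feats)


def wavelet_grid_vector(image):
    # One-level 2-D decomposition: cA is the coarse approximation sub-band,
    # while cH, cV, cD carry the finer horizontal/vertical/diagonal detail.
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), "haar")
    coarse = grid_features(cA)                                        # coarse grid feature vector
    fine = np.concatenate([grid_features(c) for c in (cH, cV, cD)])   # fine grid feature vector
    return np.concatenate([coarse, fine])


# Toy example: a 32x32 "character" consisting of a single diagonal stroke.
img = np.eye(32)
vec = wavelet_grid_vector(img)
print(vec.shape)  # 16 coarse + 48 fine = 64 features in total

In this sketch the coarse vector comes from the approximation sub-band and the fine vector from the three detail sub-bands, mirroring the coarse/fine grid distinction the abstract draws; a real reimplementation would need the paper's actual grid sizes and statistics.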
DOI: 10.4018/IJITSA.366037