Language-Independent Text-Line Extraction Algorithm for Handwritten Documents

Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 21, No. 9, pp. 1115-1119
Authors: Ryu, Jewoong; Koo, Hyung Il; Cho, Nam Ik
Format: Journal Article
Language: English
Published: New York, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.09.2014
ISSN: 1070-9908, 1558-2361
Description
Abstract: Text-line extraction in handwritten documents is an important step for document image understanding, and a number of algorithms have been proposed to address this problem. However, most of them exploit features of specific languages and work only for a given language. To overcome this limitation, we develop a language-independent text-line extraction algorithm. Our method is based on connected components (CCs); however, unlike conventional methods, we analyze strokes and partition under-segmented CCs into normalized ones. Due to this normalization, the proposed method is able to estimate the states of CCs for a range of different languages and writing styles. From the estimated states, we build a cost function whose minimization yields text-lines. Experimental results show that the proposed method achieves state-of-the-art performance on Latin-based and Chinese script databases. Furthermore, we submitted the proposed algorithm to the ICDAR 2013 handwriting segmentation competition, where it showed the best text-line extraction performance among the 10 participating methods.
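To make the pipeline described in the abstract more concrete, the sketch below shows the general shape of a connected-component (CC) based text-line grouping method. It is a minimal illustration only, not the authors' algorithm: their stroke-level CC normalization, state estimation, and cost-function minimization are replaced here by a simple vertical-overlap heuristic, and the input file name is hypothetical. OpenCV (cv2) is assumed to be available.

import cv2

def extract_ccs(binary_img):
    """Return bounding boxes (x, y, w, h) of connected components."""
    num, _, stats, _ = cv2.connectedComponentsWithStats(binary_img, connectivity=8)
    # Label 0 is the background; very small components are treated as noise.
    return [tuple(stats[i, :4]) for i in range(1, num)
            if stats[i, cv2.CC_STAT_AREA] > 10]

def vertical_overlap(box_a, box_b):
    """Overlap of the two boxes' vertical extents, relative to the shorter box."""
    _, ya, _, ha = box_a
    _, yb, _, hb = box_b
    inter = max(0, min(ya + ha, yb + hb) - max(ya, yb))
    return inter / max(1, min(ha, hb))

def group_into_lines(boxes, min_overlap=0.5):
    """Greedily attach each CC (left to right) to a line it overlaps vertically."""
    lines = []
    for box in sorted(boxes, key=lambda b: b[0]):
        for line in lines:
            if vertical_overlap(box, line[-1]) >= min_overlap:
                line.append(box)
                break
        else:
            lines.append([box])   # no compatible line found: start a new one
    return lines

if __name__ == "__main__":
    img = cv2.imread("handwritten_page.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input file
    binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)[1]
    lines = group_into_lines(extract_ccs(binary))
    print(f"Estimated {len(lines)} text lines")

A greedy overlap rule like this breaks down on skewed or touching lines, which is exactly the regime where the paper's normalized CCs and global cost minimization are intended to help.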
DOI: 10.1109/LSP.2014.2325940