A measure of relative entropy between individual sequences with application to universal classification

Bibliographic Details
Published in: IEEE Transactions on Information Theory, Volume 39, Issue 4, pp. 1270–1279
Main Authors: Ziv, J.; Merhav, N.
Format: Journal Article
Language: English
Published: New York, NY: IEEE (Institute of Electrical and Electronics Engineers), 1 July 1993
ISSN: 0018-9448
Description
Summary: A new notion of empirical informational divergence (relative entropy) between two individual sequences is introduced. If the two sequences are independent realizations of two finite-order, finite-alphabet, stationary Markov processes, the empirical relative entropy converges to the relative entropy almost surely. This empirical divergence is based on a version of the Lempel-Ziv data compression algorithm. A simple universal algorithm for classifying individual sequences into a finite number of classes, based on the empirical divergence, is introduced. The algorithm discriminates between the classes whenever they are distinguishable by some finite-memory classifier, for almost every given training set and almost any test sequence from these classes. It is universal in the sense that it is independent of the unknown sources.
DOI: 10.1109/18.243444
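
The empirical divergence summarized above is built from Lempel-Ziv-style parsing: the test sequence is parsed into successive longest phrases that occur somewhere in the training sequence (cross-parsing), and the resulting phrase count is set against the count from the sequence's own LZ78 incremental parsing. The Python sketch below illustrates this idea, assuming the commonly cited form of the Ziv-Merhav estimate, (1/n)[c(x|z) log n - c(x) log c(x)]; the function names and the two-coin demo are illustrative, not taken from the paper.

import math

def cross_parse_count(x: str, z: str) -> int:
    # Sequentially parse x into the longest phrases found anywhere in z;
    # the number of phrases is the cross-parsing count c(x|z).
    count, i, n = 0, 0, len(x)
    while i < n:
        j = i + 1
        while j <= n and x[i:j] in z:  # greedily extend the current match
            j += 1
        # Step past the matched phrase, or past one unmatched symbol.
        i = j - 1 if j - 1 > i else i + 1
        count += 1
    return count

def lz78_count(x: str) -> int:
    # Number of phrases c(x) in the LZ78 incremental parsing of x.
    phrases, current, count = set(), "", 0
    for ch in x:
        current += ch
        if current not in phrases:  # a new phrase is completed here
            phrases.add(current)
            count += 1
            current = ""
    return count + (1 if current else 0)  # count a trailing partial phrase

def zm_divergence(x: str, z: str) -> float:
    # Per-symbol empirical relative entropy (in bits) of x with respect to z,
    # assuming the estimator form (1/n) * (c(x|z) log n - c(x) log c(x)).
    n = len(x)
    c_cross = cross_parse_count(x, z)
    c_self = lz78_count(x)
    return (c_cross * math.log2(n) - c_self * math.log2(c_self)) / n

# Illustrative two-coin demo (hypothetical data, not from the paper):
# the estimate should be near zero against a same-source training
# sequence and noticeably larger against a different source.
import random
random.seed(1)
test  = "".join(random.choices("01", [0.9, 0.1], k=4000))
same  = "".join(random.choices("01", [0.9, 0.1], k=4000))
other = "".join(random.choices("01", [0.5, 0.5], k=4000))
print(zm_divergence(test, same), zm_divergence(test, other))

A classifier in the spirit of the summary would then assign a test sequence to the class whose training sequence yields the smallest empirical divergence.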