Information-Theoretic Lower Bounds to Error Probability for the Models of Noisy Discrete Source Coding and Object Classification


Detailed Bibliography
Published in: Pattern Recognition and Image Analysis, Volume 32, Issue 3, pp. 570–574
Main Authors: Lange, M. M., Lange, A. M.
Format: Journal Article
Language: English
Published: Moscow: Pleiades Publishing, 01.09.2022
Springer Nature B.V.
ISSN: 1054-6618, 1555-6212
Description
Abstract: Probabilistic models of noisy discrete source coding and object classification are studied. For these models, the appropriate minimal information amounts as functions of a given admissible error probability are defined, and strictly decreasing lower bounds to these functions are constructed. The defined functions are similar to the rate-distortion function known in information theory, and the lower bounds to these functions yield a minimal error probability subject to a given value of the processed information amount. Thus, the obtained bounds provide bifactor fidelity criteria for source coding and object classification tasks.
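The abstract's analogy to the rate-distortion function can be illustrated with the classical textbook case of a Bernoulli(p) source under Hamming distortion, where R(D) = h(p) − h(D) for 0 ≤ D < min(p, 1−p) and R(D) = 0 otherwise (h being the binary entropy). This is a standard information-theoretic formula offered only as background; it is not the bounds derived in the paper itself.

```python
import math

def h2(x):
    """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1.0 - x) * math.log2(1.0 - x)

def rate_distortion_bernoulli(p, d):
    """Rate-distortion function of a Bernoulli(p) source under
    Hamming distortion: R(D) = h(p) - h(D) for D < min(p, 1-p),
    and 0 once the distortion budget makes coding unnecessary."""
    if d >= min(p, 1.0 - p):
        return 0.0
    return h2(p) - h2(d)
```

As the admissible distortion D grows, R(D) decreases strictly to zero, mirroring the strictly decreasing information-versus-error-probability trade-off described in the abstract.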
DOI:10.1134/S105466182203021X