Information-Theoretic Lower Bounds to Error Probability for the Models of Noisy Discrete Source Coding and Object Classification

Bibliographic Details
Published in: Pattern Recognition and Image Analysis, Vol. 32, No. 3, pp. 570–574
Main Authors: Lange, M. M., Lange, A. M.
Format: Journal Article
Language:English
Published: Moscow: Pleiades Publishing / Springer Nature B.V., 01.09.2022
ISSN: 1054-6618, 1555-6212
Description
Summary: Probabilistic models of noisy discrete source coding and object classification are studied. For these models, the minimal information amounts are defined as functions of a given admissible error probability, and strictly decreasing lower bounds to these functions are constructed. The defined functions are analogous to the rate-distortion function known in information theory, and inverting the lower bounds yields the minimal error probability attainable for a given amount of processed information. The obtained bounds therefore serve as bifactor fidelity criteria in source coding and object classification tasks.
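The following is an illustrative sketch of the kind of quantity the summary describes, not the paper's exact formulation. In rate-distortion style, a minimal information amount subject to an admissible error probability \varepsilon can be written as

    I(\varepsilon) = \min_{P(\hat{x} \mid x):\ \Pr\{\hat{X} \neq X\} \le \varepsilon} I(X; \hat{X}),

by analogy with the classical rate-distortion function

    R(D) = \min_{P(\hat{x} \mid x):\ \mathbb{E}[d(X, \hat{X})] \le D} I(X; \hat{X}).

Since such a function is strictly decreasing in \varepsilon, inverting a lower bound to it yields a minimal error probability for a given processed information amount. A standard bound of the same flavor (again, not the paper's specific construction) is Fano's inequality; the hypothetical helper below inverts it numerically for an M-class decision problem:

    import math

    def fano_error_lower_bound(mutual_info_bits, entropy_bits, m):
        """Smallest error probability Pe consistent with Fano's inequality,
        H(X) - I(X; Y) <= h(Pe) + Pe * log2(M - 1),
        for an M-class decision. Illustrative helper, not from the paper."""
        def h(p):  # binary entropy in bits
            if p <= 0.0 or p >= 1.0:
                return 0.0
            return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

        residual = entropy_bits - mutual_info_bits  # equivocation H(X|Y)
        if residual <= 0.0:
            return 0.0  # enough information for error-free decisions
        # The right-hand side h(Pe) + Pe * log2(M - 1) is increasing on
        # [0, (M-1)/M], so bisection finds the smallest admissible Pe.
        # If residual exceeds log2(M), the bound saturates at (M-1)/M.
        lo, hi = 0.0, (m - 1) / m
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if h(mid) + mid * math.log2(m - 1) < residual:
                lo = mid
            else:
                hi = mid
        return lo

For example, with an equiprobable 8-class label (entropy_bits = 3.0) and mutual_info_bits = 1.0, the helper returns roughly 0.37: the smallest error probability consistent with processing one bit of information about the class.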
DOI: 10.1134/S105466182203021X