On Smooth Rényi Entropies: A Novel Information Measure, One-Shot Coding Theorems, and Asymptotic Expansions
| Published in: | IEEE Transactions on Information Theory, Vol. 68, No. 3, pp. 1496-1531 |
|---|---|
| Main Authors: | |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.03.2022 |
| Subjects: | |
| ISSN: | 0018-9448, 1557-9654 |
| Online Access: | Get full text |
| Summary: | This study considers the unconditional smooth Rényi entropy proposed by Renner and Wolf [ASIACRYPT, 2005], the smooth conditional Rényi entropy proposed by Kuzuoka [IEEE Trans. Inf. Th., 66(3), 1674-1690, 2020], and a novel quantity which we term the conditional smooth-⋆ entropy. The latter two quantities specialize to the first in the absence of side-information. We explore the operational roles of these smooth Rényi entropies by establishing one-shot coding theorems for several information-theoretic problems, including Campbell's source coding problem, the Arıkan-Massey guessing problem, and the Bunte-Lapidoth task encoding problem. We consider these problems in cases where the errors are non-vanishing, and for each problem we consider two error formalisms: the average and maximum error criteria, where the averaging and maximization are taken with respect to the side-information. Using the one-shot coding theorems, we conclude that Kuzuoka's smooth conditional Rényi entropy and the conditional smooth-⋆ entropy are the solutions to the problems involving the average and maximum error criteria, respectively. Furthermore, we examine asymptotic expansions of these entropies when the underlying source with its side-information is stationary and memoryless. Applying our asymptotic expansions to the one-shot coding theorems, we derive various fundamental limits for these problems. We show that, under non-degenerate settings, the first-order fundamental limits differ under the average and maximum error criteria. This is in contrast to a different but related setting considered by the present authors [IEEE Trans. Inf. Th., 66(12), 7565-7587, 2020], for variable-length conditional source coding allowing errors, in which the first-order terms are identical but the second-order terms differ under these error criteria. (A sketch of the smoothing construction behind these quantities follows the record below.) |
| DOI: | 10.1109/TIT.2021.3132670 |
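For orientation, the unconditional smooth Rényi entropy mentioned in the summary is obtained by optimizing the ordinary Rényi entropy over an ε-ball of nearby distributions. The LaTeX sketch below spells out one common version of this smoothing idea; the particular ball B^ε(P) (normalized distributions within total-variation distance ε) is an assumption made here for concreteness and need not coincide with the exact definition used by Renner and Wolf or in the paper.

```latex
% Hedged sketch of the smoothing construction; the exact epsilon-ball used by
% Renner-Wolf (and by this paper) may differ, e.g. balls of sub-normalized distributions.
% Ordinary Rényi entropy of order alpha (alpha != 1) of a distribution P on a finite alphabet:
\[
  H_{\alpha}(P) \;=\; \frac{1}{1-\alpha}\,\log \sum_{x \in \mathcal{X}} P(x)^{\alpha},
  \qquad \alpha \in (0,1) \cup (1,\infty).
\]
% One common epsilon-smoothing: optimize over distributions Q within total-variation
% distance epsilon of P (an assumed choice of ball),
\[
  B^{\varepsilon}(P) \;=\; \Bigl\{\, Q \;:\; \tfrac{1}{2}\textstyle\sum_{x \in \mathcal{X}} | P(x) - Q(x) | \le \varepsilon \,\Bigr\},
\]
% and take the infimum of the inner sum; since the prefactor 1/(1-alpha) changes sign at
% alpha = 1, this single expression raises the entropy for alpha > 1 (as with the smooth
% min-entropy) and lowers it for alpha < 1 (as with the smooth max-entropy):
\[
  H_{\alpha}^{\varepsilon}(P) \;=\; \frac{1}{1-\alpha}\,
  \log \inf_{Q \in B^{\varepsilon}(P)} \sum_{x \in \mathcal{X}} Q(x)^{\alpha}.
\]
% Setting epsilon = 0 recovers the ordinary Rényi entropy. The conditional variants in the
% summary additionally handle side-information, averaged or maximized over its values.
```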