Stronger estimations of Csiszár f-divergences

Detailed bibliography
Title: Stronger estimations of Csiszár f-divergences
Authors: Ivelić Bradanović, Slavica
Publisher information: 2023.
Year of publication: 2023
Subjects: Csiszár f-divergences, Kullback-Leibler divergence, Hellinger divergence, Strongly convex functions
Description: In many problems in statistics, the closeness/similarity between two probability distributions needs to be measured. To solve such problems, various statistical divergences have been introduced as an essential and general tool for comparing two distributions. A statistical divergence D(p,q), a mapping of two probability distributions p and q to R, satisfies the conditions D(p,q) ≥ 0 and D(p,q) = 0 if and only if p = q. Two distributions p and q are very similar if D(p,q) is very close to zero. One important class of statistical divergences is defined by means of convex functions and is known as the Csiszár f-divergence. In our work, we establish stronger estimations of Csiszár f-divergences between two distributions by using the class of strongly convex functions, a subclass of convex functions satisfying stronger versions of the analogous properties. As an outcome, we derive stronger estimates for some well-known divergences such as the Kullback-Leibler divergence, the χ²-divergence, the Hellinger divergence, the Bhattacharyya distance and the Jeffreys distance.
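As context for the abstract, here is a minimal sketch of the standard definitions it relies on; the notation below is assumed, not quoted from the paper. For discrete probability distributions p = (p_1, ..., p_n) and q = (q_1, ..., q_n) with q_i > 0, and a convex generator f : (0, ∞) → R with f(1) = 0, the Csiszár f-divergence is

    D_f(p,q) = \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right),

and f is strongly convex with modulus c > 0 if

    f(\lambda x + (1-\lambda) y) \le \lambda f(x) + (1-\lambda) f(y) - \frac{c}{2}\,\lambda(1-\lambda)(x-y)^2, \qquad \lambda \in [0,1].

Since \sum_{i} q_i (p_i/q_i) = 1, the strong form of Jensen's inequality sharpens the usual bound D_f(p,q) \ge 0 to

    D_f(p,q) \ge \frac{c}{2} \sum_{i=1}^{n} \frac{(p_i - q_i)^2}{q_i},

which illustrates the kind of stronger estimate the abstract announces; the paper's own bounds are not reproduced here. The divergences listed above arise from standard generator choices, e.g. f(t) = t \log t for Kullback-Leibler, f(t) = (t-1)^2 for the χ²-divergence, and f(t) = (\sqrt{t} - 1)^2 for the squared Hellinger distance.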
Document type: Conference object
Accession number: edsair.dris...01492..24f9a70cf5b80883202e22b6bc19431c
Database: OpenAIRE