Stronger estimations of Csiszar f-divergences
Saved in:
| Title: | Stronger estimations of Csiszar f-divergences |
|---|---|
| Authors: | Ivelić Bradanović, Slavica |
| Publisher information: | 2023. |
| Publication year: | 2023 |
| Keywords: | Csiszár f-divergences, Kullback-Leibler divergence, Hellinger divergence, Strongly convex functions |
| Description: | In many problems in statistics, the closeness/similarity between two probability distributions needs to be measured. To solve such problems, various statistical divergences have been introduced as an essential and general tool for comparing two distributions. A statistical divergence D(p,q), a mapping of two probability distributions p and q to R, satisfies the conditions D(p,q) ≥ 0 and D(p,q) = 0 iff p = q. Two distributions p and q are very similar if D(p,q) is very close to zero. One important class of statistical divergences is defined by means of convex functions and is known as the Csiszár f-divergence. In our work, we establish stronger estimations of Csiszár f-divergences between two distributions by using the class of strongly convex functions, a subclass of convex functions with stronger versions of the analogous properties. As an outcome, we derive stronger estimates for some well-known divergences such as the Kullback-Leibler divergence, χ-divergence, Hellinger divergence, Bhattacharyya distance and Jeffreys distance. (The standard definitions of these notions are sketched below the record.) |
| Publication type: | Conference object |
| Document code: | edsair.dris...01492..24f9a70cf5b80883202e22b6bc19431c |
| Database: | OpenAIRE |
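
For orientation only (not part of the catalog record or the paper itself): below is a minimal sketch of the standard definitions behind the objects named in the abstract, assuming the usual discrete setting with distributions p = (p_1, …, p_n) and q = (q_1, …, q_n). The modulus c in the strong-convexity condition is a generic symbol, and normalizations of the Hellinger quantity vary by author; the paper's exact conventions may differ.

```latex
% Csiszar f-divergence, for a convex generator f:(0,\infty)\to\mathbb{R} with f(1)=0
D_f(p\,\|\,q) = \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right)

% Kullback-Leibler divergence: the special case f(t) = t\log t
D_{\mathrm{KL}}(p\,\|\,q) = \sum_{i=1}^{n} p_i \log\frac{p_i}{q_i}

% (Squared) Hellinger distance: the special case f(t) = (\sqrt{t}-1)^2
H^2(p,q) = \sum_{i=1}^{n} \bigl(\sqrt{p_i}-\sqrt{q_i}\,\bigr)^2

% Strong convexity with modulus c>0: the subclass of convex functions
% used in the paper to sharpen the usual f-divergence estimates
f(\lambda x + (1-\lambda)y) \le \lambda f(x) + (1-\lambda) f(y)
  - c\,\lambda(1-\lambda)(x-y)^2, \qquad \lambda\in[0,1]
```

Since a strongly convex f satisfies the convexity inequality with the extra negative term −cλ(1−λ)(x−y)², bounds built from it are tighter than those obtained from plain convexity, which is the sense in which the resulting divergence estimates are "stronger."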