Stronger estimations of Csiszár f-divergences

Saved in:
Bibliographic Details
Title: Stronger estimations of Csiszár f-divergences
Authors: Ivelić Bradanović, Slavica
Publisher Information: 2023.
Publication Year: 2023
Subject Terms: Csiszár f-divergences, Kullback-Leibler divergence, Hellinger divergence, Strongly convex functions
Description: In many problems in statistics, the closeness/similarity between two probability distributions needs to be measured. To solve such problems, various statistical divergences have been introduced as an essential and general tool for comparing two distributions. A statistical divergence D(p,q), a mapping of two probability distributions p and q to R, satisfies the conditions D(p,q)≥0 and D(p,q)=0 iff p=q. Two distributions p and q are very similar if D(p,q) is very close to zero. One important class of statistical divergences is defined by means of convex functions and is known as the Csiszár f-divergence. In our work, we establish stronger estimations of Csiszár f-divergences between two distributions by using the class of strongly convex functions, a subclass of convex functions with stronger versions of the analogous properties. As an outcome, we derive stronger estimates for some well-known divergences such as the Kullback-Leibler divergence, χ-divergence, Hellinger divergence, Bhattacharyya distance and Jeffreys distance.
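The abstract defines the Csiszár f-divergence family, in which each choice of convex generator f yields a particular divergence. A minimal Python sketch of this construction for discrete distributions (assuming q has strictly positive entries; the function names and the normalization f(1)=0 are illustrative choices, not taken from the paper):

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(p, q) = sum_i q_i * f(p_i / q_i)
    for finite discrete distributions, assuming q_i > 0 for all i."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Kullback-Leibler divergence: generator f(t) = t * log(t)
def kl_generator(t):
    return t * math.log(t) if t > 0 else 0.0

# Squared Hellinger divergence: generator f(t) = (sqrt(t) - 1)^2
def hellinger_generator(t):
    return (math.sqrt(t) - 1.0) ** 2

p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]

print(f_divergence(p, q, kl_generator))         # nonnegative
print(f_divergence(p, p, kl_generator))         # 0 when p == q
print(f_divergence(p, q, hellinger_generator))  # nonnegative
```

Both defining properties from the abstract are visible here: D_f(p,q) ≥ 0 by convexity of f with f(1)=0 (Jensen's inequality), and D_f(p,p)=0 since every ratio p_i/q_i equals 1.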
Document Type: Conference object
Accession Number: edsair.dris...01492..24f9a70cf5b80883202e22b6bc19431c
Database: OpenAIRE