A Characterization of All Single-Integral, Non-Kernel Divergence Estimators
| Published in: | IEEE Transactions on Information Theory, Volume 65, Issue 12, pp. 7976–7984 |
|---|---|
| Main authors: | , |
| Medium: | Journal Article |
| Language: | English |
| Published: | New York: IEEE, 01.12.2019 (The Institute of Electrical and Electronics Engineers, Inc.) |
| Topics: | |
| ISSN: | 0018-9448, 1557-9654 |
| Online access: | Get full text |
| Summary: | Divergence measures have long been used for different purposes in information theory and statistics. In particular, density-based minimum divergence estimation is a popular tool in the statistical literature. Given the sampled data and a parametric model, we estimate the model parameter by choosing the member of the model family that is closest to the data distribution in terms of the given divergence. In the absolutely continuous setup, when the distributions from the model family and the unknown data-generating distribution are assumed to have densities, the application of kernel-based non-parametric smoothing is sometimes unavoidable to obtain an estimate of the true data density. The use of kernels (or other non-parametric smoothing techniques) makes the estimation process considerably more complex, as one must then impose conditions not just on the model but also on the kernel and its bandwidth. In higher dimensions the efficiency of the kernel density estimator (KDE) often becomes too low for the minimum divergence procedure to be practically useful. It is therefore a significant advantage to have a divergence that allows minimum divergence estimation while bypassing non-parametric smoothing. For the same reason, characterizing the class of such divergences would be a notable achievement. In this work, we provide a characterization of the class of divergences that bypass the use of non-parametric smoothing in the construction of minimum divergence estimators, providing a solution to this important problem. (An illustrative, kernel-free estimation sketch follows this record.) |
|---|---|
| DOI: | 10.1109/TIT.2019.2937527 |
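As a concrete illustration of the idea described in the summary, the sketch below fits a normal model by minimizing the density power divergence (DPD) of Basu et al. (1998), a standard example of a divergence whose empirical objective involves the data only through the model density evaluated at the sample points, so no kernel density estimate is needed. This is an illustrative sketch under assumed choices (normal model, contaminated sample, the hypothetical helper `dpd_objective`), not the characterization developed in the paper.

```python
# Minimal sketch: kernel-free minimum divergence estimation via the density
# power divergence (DPD).  Model, data, and helper names are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, data, alpha=0.5):
    """Empirical DPD objective for a univariate normal model N(mu, sigma^2).

    H_n(theta) = int f_theta(x)^{1+alpha} dx
                 - (1 + 1/alpha) * mean(f_theta(X_i)^alpha)
    The data enter only through f_theta(X_i), so no non-parametric density
    estimate of the data distribution is required.
    """
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parameterize on the log scale to keep sigma > 0
    # Closed form of the integral term for the normal model.
    integral = 1.0 / ((1 + alpha) ** 0.5 * (2 * np.pi) ** (alpha / 2) * sigma ** alpha)
    dens = norm.pdf(data, loc=mu, scale=sigma)
    return integral - (1 + 1.0 / alpha) * np.mean(dens ** alpha)

rng = np.random.default_rng(0)
# A N(0, 1) sample with 5% outliers near 10.
data = np.concatenate([rng.normal(0.0, 1.0, 190), rng.normal(10.0, 1.0, 10)])

res = minimize(dpd_objective, x0=[np.median(data), 0.0], args=(data,))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"DPD fit: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```

The single integral term has a closed form for many standard models (here the normal family), which is exactly what lets the estimation proceed without smoothing the data; the tuning constant `alpha` trades robustness against efficiency.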