Online estimation of the asymptotic variance for averaged stochastic gradient algorithms
| Published in: | Journal of Statistical Planning and Inference, Volume 203, pp. 1-19 |
|---|---|
| Main author: | |
| Format: | Journal Article |
| Language: | English |
| Published: | Elsevier B.V., 01.12.2019 |
| Subjects: | |
| ISSN: | 0378-3758, 1873-1171 |
| Summary: | Stochastic gradient algorithms are increasingly studied since they can deal efficiently and online with large samples in high-dimensional spaces. In this paper, we first establish a Central Limit Theorem for these estimates, as well as for their averaged version, in general Hilbert spaces. Moreover, since the asymptotic normality of estimates is often unusable without an estimate of the asymptotic variance, we introduce a new recursive algorithm for estimating the latter, and we establish its almost sure rate of convergence as well as its rate of convergence in quadratic mean. Finally, two examples are given: estimating the parameters of a logistic regression and estimating geometric quantiles. |
|---|---|
| Highlights: | • The asymptotic normality of averaged stochastic gradient estimates is established. • A recursive estimate of the asymptotic variance is introduced. • Rates of convergence of the recursive estimate of the variance are established. |
| DOI: | 10.1016/j.jspi.2019.01.001 |
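The abstract's logistic regression example can be sketched in a few lines of averaged (Polyak-Ruppert) stochastic gradient descent with an online second-moment accumulator. This is a minimal illustrative sketch, not the paper's exact estimator: the step-size schedule gamma * n^(-alpha), the constants, and the fluctuation-based variance accumulator below are assumptions chosen for demonstration.

```python
import numpy as np

def averaged_sgd_logistic(X, y, gamma=1.0, alpha=0.66):
    """Averaged SGD for logistic regression (illustrative sketch).

    Step sizes gamma_n = gamma * n^(-alpha) with alpha in (1/2, 1) are a
    standard choice for averaged SGD (an assumption here, not necessarily
    the paper's tuning). Returns the averaged iterate and a recursive
    second-moment estimate of the fluctuations around the average.
    """
    n, d = X.shape
    theta = np.zeros(d)        # current SGD iterate
    theta_bar = np.zeros(d)    # running Polyak-Ruppert average
    Sigma = np.zeros((d, d))   # recursive fluctuation accumulator
    for k in range(1, n + 1):
        x, label = X[k - 1], y[k - 1]
        # stochastic gradient of the logistic loss at one sample
        p = 1.0 / (1.0 + np.exp(-x @ theta))
        theta -= gamma * k ** (-alpha) * (p - label) * x
        # online average: theta_bar_k = theta_bar_{k-1} + (theta_k - theta_bar_{k-1}) / k
        theta_bar += (theta - theta_bar) / k
        # recursive accumulation of outer products of the fluctuations
        diff = theta - theta_bar
        Sigma += (np.outer(diff, diff) - Sigma) / k
    return theta_bar, Sigma
```

Both updates are fully online: each sample is processed once and discarded, so memory is O(d^2) regardless of the sample size, which is the practical appeal of recursive variance estimation highlighted in the abstract.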