On the rates of convergence of parallelized averaged stochastic gradient algorithms

Detailed bibliography
Published in: Statistics (Berlin, DDR), Volume 54, Issue 3, pp. 618-635
Main authors: Godichon-Baggioni, Antoine; Saadane, Sofiane
Format: Journal Article
Language: English
Publication details: Abingdon: Taylor & Francis, 03.05.2020
ISSN: 0233-1888, 1029-4910
Description
Summary: The growing interest in high-dimensional and functional data analysis has led, over the last decade, to substantial research and a considerable number of new techniques. Parallelized algorithms, which distribute the data across different machines and process it there, are for instance a good way to handle large samples taking values in high-dimensional spaces. We introduce here a parallelized averaged stochastic gradient algorithm which processes the data efficiently and recursively, without requiring the data to be distributed uniformly across the machines. The rate of convergence in quadratic mean, as well as the asymptotic normality of the parallelized estimates, is given for strongly convex and locally strongly convex objectives.
DOI: 10.1080/02331888.2020.1764557
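
The summary above describes the scheme only informally; the exact recursion, step sizes, and combination of the machines' estimates are given in the paper itself. As a rough, non-authoritative sketch of how such a parallelized averaged stochastic gradient algorithm can look, the Python snippet below has each machine run a Polyak-Ruppert averaged stochastic gradient recursion on its local sample, then combines the local averaged iterates with weights proportional to the local sample sizes, so an uneven split of the data across machines is handled naturally. The function names, the step size gamma_n = c / n^alpha, and the sample-size weighting are illustrative assumptions, not the authors' exact specification.

```python
import numpy as np

def averaged_sgd(local_data, grad, theta0, c=1.0, alpha=0.66):
    """Polyak-Ruppert averaged SGD on one machine's local sample (illustrative)."""
    theta = np.array(theta0, dtype=float)
    theta_bar = theta.copy()
    for n, obs in enumerate(local_data, start=1):
        gamma = c / n ** alpha                    # step size gamma_n = c * n^(-alpha), alpha in (1/2, 1)
        theta = theta - gamma * grad(theta, obs)  # stochastic gradient step
        theta_bar += (theta - theta_bar) / n      # running average of the iterates
    return theta_bar, len(local_data)

def parallelized_estimate(datasets, grad, theta0):
    """Combine the machines' averaged iterates, weighted by local sample size
    (an assumed combination rule; the split across machines need not be uniform)."""
    local_results = [averaged_sgd(d, grad, theta0) for d in datasets]  # one run per machine
    total = sum(n for _, n in local_results)
    return sum(n * t for t, n in local_results) / total

# Toy usage: estimate a Gaussian mean, i.e. minimize E[|theta - X|^2] / 2,
# a strongly convex objective with stochastic gradient theta - x.
rng = np.random.default_rng(0)
true_mean = np.array([1.0, -2.0])
machines = [rng.normal(true_mean, 1.0, size=(n, 2)) for n in (5000, 20000, 500)]  # uneven split
theta_hat = parallelized_estimate(machines, lambda th, x: th - x, np.zeros(2))
print(theta_hat)  # close to [1.0, -2.0]
```

The toy example at the end deliberately gives the three machines very different sample sizes; the sample-size weighting in parallelized_estimate is what keeps the combined estimate sensible under such a non-uniform split.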