Byzantine Resilient Non-Convex SCSG With Distributed Batch Gradient Computations

Published in: IEEE Transactions on Signal and Information Processing over Networks, Volume 7, pp. 754-766
Main Authors: Bulusu, Saikiran; Khanduri, Prashant; Kafle, Swatantra; Sharma, Pranay; Varshney, Pramod K.
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
ISSN: 2373-776X, 2373-7778
Description
Summary: Distributed learning is an important paradigm in current machine learning with large datasets. In this paper, the distributed stochastic optimization problem of minimizing a nonconvex function in an adversarial setting is considered. A robust variant of the stochastic variance-reduced algorithm is proposed. In the distributed setup, we assume that a fraction of the worker nodes (WNs) can be Byzantine. We assume that the batch gradients are computed at the WNs and the stochastic gradients are computed at the central node (CN). We provide the convergence rate of the proposed algorithm, which employs a novel filtering rule that is independent of the problem dimension. Furthermore, we capture the effect of the Byzantines present in the network on the convergence performance of the algorithm. In addition to the theoretical guarantees, we evaluate the performance of the proposed algorithm and present simulation results using real-world datasets.
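
The abstract only states that WNs send batch gradients to the CN and that a dimension-independent filtering rule screens them; the rule itself is not specified in this record. The sketch below is a minimal illustration of that aggregation step, assuming a simple norm-based filter (drop the gradients farthest from the coordinate-wise median) as a stand-in for the paper's actual filter. The function name filter_and_aggregate and the parameter num_byzantine are illustrative assumptions, not the authors' API.

```python
import numpy as np

def filter_and_aggregate(worker_grads: np.ndarray, num_byzantine: int) -> np.ndarray:
    """Aggregate WN batch gradients at the CN after a robust filtering step.

    worker_grads: (num_workers, dim) array, one batch gradient per WN.
    num_byzantine: assumed upper bound on the number of Byzantine WNs.
    Returns the average of the gradients that survive the filter.
    """
    # Coordinate-wise median as a robust reference point.
    median = np.median(worker_grads, axis=0)
    # Distance of each WN's gradient from that reference.
    dists = np.linalg.norm(worker_grads - median, axis=1)
    # Keep the gradients closest to the median, discarding the presumed outliers.
    keep = np.argsort(dists)[: worker_grads.shape[0] - num_byzantine]
    return worker_grads[keep].mean(axis=0)

# Synthetic example: 8 honest WNs reporting gradients near 1.0, 2 Byzantine WNs
# reporting gradients near -10.0; the filtered average stays close to the honest mean.
rng = np.random.default_rng(0)
honest = rng.normal(loc=1.0, scale=0.1, size=(8, 5))
byzantine = rng.normal(loc=-10.0, scale=0.1, size=(2, 5))
aggregated = filter_and_aggregate(np.vstack([honest, byzantine]), num_byzantine=2)
print(aggregated)
```

In an SCSG-style outer iteration, the CN would then use such an aggregated batch gradient as the anchor for its own stochastic gradient steps; the numbers above are synthetic and only illustrate the effect of the filter.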
DOI: 10.1109/TSIPN.2021.3129352