Distributed optimization with faulty nodes: robust aggregation in hyperbolic space

Detailed bibliography
Published in: Neural Computing & Applications, Vol. 37, Issue 26, pp. 21563–21605
Main authors: Ghosh, Subhas Kumar; Vittamsetti, Vijay Monic
Format: Journal Article
Language: English
Published: London: Springer London, 01.09.2025 (Springer Nature B.V.)
ISSN: 0941-0643, 1433-3058
Description
Summary: The increasing deployment of distributed machine learning models necessitates robust optimization methods that can tolerate adversarial or faulty nodes. In this work, we propose a robust gradient aggregation method for distributed stochastic gradient descent that leverages hyperbolic geometry. Specifically, local gradients computed at individual nodes are embedded into hyperbolic space using the Poincaré ball model, and their geometric median is computed as a robust aggregate. This aggregated gradient is then mapped back to Euclidean space for the gradient update. We also show that existing robust gradient aggregation methods such as Krum can be improved using hyperbolic space. Compared to existing robust aggregation methods, our hyperbolic approach offers improved separation of outlier updates. We provide theoretical convergence guarantees and validate our method on benchmark datasets as well as on a traffic forecasting task, demonstrating its efficacy in mitigating Byzantine failures in distributed federated learning environments.
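
The summary describes a three-step pipeline: embed each worker's gradient into the Poincaré ball, take a geometric median there, and map the result back to Euclidean space. The following minimal NumPy sketch illustrates that pipeline only, not the paper's actual implementation: the origin-based exponential and logarithmic maps are the standard Poincaré-ball formulas, while the Euclidean Weiszfeld iteration is used as a stand-in for the paper's (presumably hyperbolic-distance) geometric median. All names (exp0, log0, robust_aggregate), the curvature parameter c, and the toy data are illustrative assumptions.

import numpy as np

# Exponential map at the origin of the Poincare ball of curvature -c:
# sends a Euclidean (tangent) vector into the open unit ball.
def exp0(v, c=1.0, eps=1e-9):
    norm = np.linalg.norm(v) + eps
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

# Logarithmic map at the origin: inverse of exp0, ball point -> Euclidean vector.
def log0(x, c=1.0, eps=1e-9):
    norm = np.linalg.norm(x) + eps
    r = np.clip(np.sqrt(c) * norm, 0.0, 1.0 - eps)  # keep arctanh finite
    return np.arctanh(r) * x / (np.sqrt(c) * norm)

# Geometric median via Weiszfeld iteration. NOTE: this is a Euclidean
# surrogate applied to the coordinates of the embedded points; the paper's
# median is presumably taken with respect to hyperbolic distances.
def geometric_median(points, n_iter=100, eps=1e-9):
    m = points.mean(axis=0)
    for _ in range(n_iter):
        d = np.linalg.norm(points - m, axis=1) + eps  # distances to estimate
        w = 1.0 / d                                   # inverse-distance weights
        m = (w[:, None] * points).sum(axis=0) / w.sum()
    return m

# Full pipeline from the abstract: embed, aggregate robustly, map back.
def robust_aggregate(grads, c=1.0):
    embedded = np.stack([exp0(g, c) for g in grads])
    return log0(geometric_median(embedded), c)

# Toy check: 9 honest workers near the true gradient, 1 Byzantine outlier.
rng = np.random.default_rng(0)
honest = rng.normal(loc=1.0, scale=0.1, size=(9, 4))
byzantine = np.full((1, 4), -50.0)
print(robust_aggregate(np.vstack([honest, byzantine])))

In this toy run the aggregate stays close to the honest cluster despite the extreme outlier, which is the separation behaviour the summary attributes to median-based aggregation in the hyperbolic embedding.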
DOI: 10.1007/s00521-025-11475-0