Distributed optimization with faulty nodes: robust aggregation in hyperbolic space

Bibliographic Details
Published in: Neural Computing & Applications, Vol. 37, No. 26, pp. 21563–21605
Main Authors: Ghosh, Subhas Kumar; Vittamsetti, Vijay Monic
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.09.2025
ISSN: 0941-0643, 1433-3058
Description
Summary: The increasing deployment of distributed machine learning models necessitates robust optimization methods that can tolerate adversarial or faulty nodes. In this work, we propose a robust gradient aggregation method for distributed stochastic gradient descent that leverages hyperbolic geometry. Specifically, local gradients computed at individual nodes are embedded into hyperbolic space using the Poincaré ball model, and their geometric median is computed as a robust aggregate. This aggregated gradient is then mapped back to Euclidean space for the gradient update. We also show that existing robust gradient aggregation methods such as Krum can be improved using hyperbolic space. Compared to existing robust aggregation methods, our hyperbolic approach offers improved separation of outlier updates. We provide theoretical convergence guarantees and validate our method on benchmark datasets as well as on a traffic forecasting task, demonstrating its efficacy in mitigating Byzantine failures in distributed federated learning environments.
DOI: 10.1007/s00521-025-11475-0