FAdagrad: Adaptive federated learning with differential privacy

Bibliographic Details
Published in: 2024 IEEE International Conference on High Performance Computing and Communications (HPCC), pp. 508-515
Main Authors: Luo, Yuling, Pan, Ziyan, Fu, Qiang, Qin, Sheng
Format: Conference Proceeding
Language: English
Published: IEEE, 13.12.2024
Description
Summary: Federated Learning (FL) is a promising distributed learning paradigm that enables model training without centralizing users' sensitive data. However, FL faces several practical challenges, including communication overhead, slow convergence, robustness, and overall efficiency. A central research objective in FL is to achieve rapid convergence without compromising privacy. Existing work on this problem is limited, with most studies focusing either on privacy safeguards or on adaptive learning optimization, but rarely both. To address this gap, this work introduces an adaptive gradient descent algorithm built on Differential Privacy (DP). The approach combines the TensorFlow Federated framework with several techniques, including relaxed DP, subsampling for privacy amplification, and the shuffle model, to speed up convergence while strengthening privacy protection. Empirical results on the MNIST dataset show that the FAdagrad algorithm reaches a test accuracy of 82.29% within just 12 iterations while maintaining tight privacy control (a privacy budget of 0.5). Under the same conditions, the DP-FL, Simple, and Topk baselines, which employ conventional differential privacy techniques, attain peak test accuracies of only 52.17%, 71.09%, and 72.67%, respectively, so FAdagrad markedly improves the model's key performance indicators.
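To make the mechanism described in the abstract concrete, the following is a minimal sketch, not the authors' implementation, of one server round of differentially private federated Adagrad: each client's update is clipped to bound its L2 sensitivity, Gaussian noise calibrated to the clipping norm is added to the aggregate, and the server applies an Adagrad-style adaptive step. Function names such as dp_fadagrad_round and all hyperparameter values are illustrative assumptions; the paper's relaxed-DP accounting, subsampling amplification, and shuffle model are omitted for brevity.

    import numpy as np

    def clip_update(update, clip_norm):
        # Scale the update so its L2 norm is at most clip_norm,
        # bounding each client's contribution (sensitivity).
        norm = np.linalg.norm(update)
        if norm > clip_norm:
            update = update * (clip_norm / norm)
        return update

    def dp_fadagrad_round(model, accumulator, client_updates,
                          clip_norm=1.0, noise_multiplier=1.0,
                          lr=0.1, tau=1e-3, rng=None):
        # One server round: clip client updates, average them under
        # Gaussian noise, then take an Adagrad-style adaptive step.
        rng = rng or np.random.default_rng()
        clipped = [clip_update(u, clip_norm) for u in client_updates]
        noise = rng.normal(0.0, noise_multiplier * clip_norm, size=model.shape)
        delta = (np.sum(clipped, axis=0) + noise) / len(client_updates)
        accumulator = accumulator + delta ** 2   # per-coordinate second moment
        model = model + lr * delta / (np.sqrt(accumulator) + tau)
        return model, accumulator

    # Toy usage: 10 clients send synthetic updates for a 5-parameter model,
    # mirroring the 12-iteration budget reported in the abstract.
    rng = np.random.default_rng(0)
    model, acc = np.zeros(5), np.zeros(5)
    for _ in range(12):
        updates = [rng.normal(size=5) for _ in range(10)]
        model, acc = dp_fadagrad_round(model, acc, updates, rng=rng)
    print(model)

The design choice the sketch highlights is that the Adagrad accumulator adapts the effective step size per coordinate, which is what lets a DP-noised aggregate still converge in few rounds compared with a fixed-step DP-FL baseline.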
DOI:10.1109/HPCC64274.2024.00074