Federated learning meets Bayesian neural network: Robust and uncertainty-aware distributed variational inference
Saved in:
| Published in: | Neural Networks, Vol. 185, Art. 107135 |
|---|---|
| Main authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | United States: Elsevier Ltd, 01.05.2025 |
| Subject terms: | |
| ISSN: | 0893-6080, 1879-2782 |
| Online access: | Full text |
| Abstract: | Federated Learning (FL) is a popular framework for protecting data privacy in distributed machine learning. However, current FL faces several problems and challenges, including the limited amount of client data and data heterogeneity. These make models trained on clients prone to drift and overfitting, so that the aggregated model achieves only suboptimal performance. To tackle these problems, we introduce a novel approach that explicitly integrates Bayesian neural networks (BNNs) into the FL framework and thereby enhances robustness. We refer to this approach as FedUAB, standing for FL with uncertainty-aware BNNs. In the FedUAB algorithm, each FL client independently trains a BNN using the Bayes by backprop algorithm. The weights of the approximating model are modeled as Gaussian distributions, which mitigates the overfitting issue and also ensures better data privacy. In addition, we apply novel methods to overcome other key challenges in the fusion of BNNs and FL, such as selecting an optimal prior distribution, aggregating Gaussian-distributed weights across multiple clients, and rigorously managing weight variances. In a simulated FL environment, FedUAB demonstrated superior performance with both its server-side global model and client-side personalized models, outperforming traditional FL and other Bayesian FL methods. Moreover, it can quantify and decompose uncertainties. We have open-sourced our project at https://github.com/lpf111222/FedUAB/.
•FedUAB integrates FL and BNNs, reducing overfitting and model drift; effective for limited, heterogeneous data.•Novel BNN-FL fusion: prior selection, Gaussian weight aggregation, variance control.•FedUAB improves both server-side and client-side models and adds uncertainty quantification. |
|---|---|
| Bibliography: | ObjectType-Article-1, SourceType-Scholarly-Journals-1, ObjectType-Feature-2 |
| ISSN: | 0893-6080, 1879-2782 |
| DOI: | 10.1016/j.neunet.2025.107135 |
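The abstract mentions aggregating weights characterized by Gaussian distributions across multiple clients. FedUAB's exact aggregation rule is given in the paper; as a minimal illustrative sketch of the general idea, one common way to combine per-client Gaussian posteriors is precision-weighted averaging, where clients with lower-variance (more confident) posteriors contribute more to the global mean. The function name and the precision-weighting rule below are assumptions for illustration, not the paper's method.

```python
import numpy as np

def aggregate_gaussian_weights(means, variances, eps=1e-8):
    """Illustrative precision-weighted aggregation of per-client
    Gaussian weight posteriors (not FedUAB's actual rule).

    means, variances: arrays of shape (num_clients, num_weights),
    holding each client's posterior mean and variance per weight.
    Returns the aggregated (global_mean, global_variance).
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    precisions = 1.0 / (variances + eps)        # per-client precision
    total_precision = precisions.sum(axis=0)    # combined evidence per weight
    # Means are weighted by precision: confident clients dominate.
    global_mean = (precisions * means).sum(axis=0) / total_precision
    global_var = 1.0 / total_precision
    return global_mean, global_var

# Two clients, two weights each; with equal variances the global mean
# is the plain average and the variance shrinks as evidence combines.
m, v = aggregate_gaussian_weights([[0.0, 1.0], [2.0, 1.0]],
                                  [[1.0, 1.0], [1.0, 1.0]])
```

With equal per-client variances this reduces to an unweighted mean, while the aggregated variance halves, reflecting the pooled evidence of two clients.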