APPFL: Open-Source Software Framework for Privacy-Preserving Federated Learning

Detailed Bibliography
Published in: 2022 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), pp. 1074–1083
Main authors: Ryu, Minseok; Kim, Youngdae; Kim, Kibaek; Madduri, Ravi K.
Format: Conference paper
Language: English
Published: IEEE, 1 May 2022
ISBN: 9781665497480
Description
Summary: Federated learning (FL) enables training models at different sites and sharing only the resulting weight updates, instead of transferring data to a central location for training as in classical machine learning. The FL capability is especially important to domains such as biomedicine and smart grid, where data may not be shared freely or stored at a central location because of policy regulations. Thanks to its capability of learning from decentralized datasets, FL is now a rapidly growing research field, and numerous FL frameworks have been developed. In this work we introduce APPFL, the Argonne Privacy-Preserving Federated Learning framework. APPFL allows users to leverage implemented privacy-preserving algorithms, implement new algorithms, and simulate and deploy various FL algorithms with privacy-preserving techniques. The modular framework enables users to customize the components for algorithms, privacy, communication protocols, neural network models, and user data. We also present a new communication-efficient algorithm based on an inexact alternating direction method of multipliers. The algorithm requires significantly less communication between the server and the clients than does the current state of the art. We demonstrate the computational capabilities of APPFL, including differentially private FL on various test datasets and its scalability, by using multiple algorithms and datasets on different computing environments.
DOI: 10.1109/IPDPSW55747.2022.00175