APPFL: Open-Source Software Framework for Privacy-Preserving Federated Learning

Bibliographic Details
Published in: 2022 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), pp. 1074-1083
Main authors: Ryu, Minseok; Kim, Youngdae; Kim, Kibaek; Madduri, Ravi K.
Format: Conference proceeding
Language: English
Published: IEEE, May 1, 2022
Subjects: Collaborative work; communication-efficient algorithm; Data models; data privacy; federated learning; open-source software; Protocols; Regulation; Scalability; Software algorithms; Training
ISBN: 9781665497480
Online access: Full text
Abstract Federated learning (FL) enables training models at different sites and updating the weights from the training instead of transferring data to a central location and training as in classical machine learning. The FL capability is especially important to domains such as biomedicine and smart grid, where data may not be shared freely or stored at a central location because of policy regulations. Thanks to the capability of learning from decentralized datasets, FL is now a rapidly growing research field, and numerous FL frameworks have been developed. In this work we introduce APPFL, the Argonne Privacy-Preserving Federated Learning framework. APPFL allows users to leverage implemented privacy-preserving algorithms, implement new algorithms, and simulate and deploy various FL algorithms with privacy-preserving techniques. The modular framework enables users to customize the components for algorithms, privacy, communication protocols, neural network models, and user data. We also present a new communication-efficient algorithm based on an inexact alternating direction method of multipliers. The algorithm requires significantly less communication between the server and the clients than does the current state of the art. We demonstrate the computational capabilities of APPFL, including differentially private FL on various test datasets and its scalability, by using multiple algorithms and datasets on different computing environments.
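For reference, the "inexact alternating direction method of multipliers" mentioned in the abstract builds on the standard global-consensus ADMM template for federated learning. The sketch below is a generic textbook formulation, not the authors' exact algorithm: it assumes N clients with local losses f_i, local weights w_i, a global model z, dual variables \lambda_i, and penalty parameter \rho; the specific subproblems, update order, and inexactness criterion used in APPFL are defined in the paper itself.

\[
\begin{aligned}
w_i^{k+1} &\approx \operatorname*{arg\,min}_{w_i}\; f_i(w_i) + \langle \lambda_i^{k},\, w_i - z^{k} \rangle + \tfrac{\rho}{2}\,\lVert w_i - z^{k} \rVert^{2}, \qquad i = 1,\dots,N, \\
z^{k+1} &= \frac{1}{N}\sum_{i=1}^{N}\Bigl( w_i^{k+1} + \tfrac{1}{\rho}\,\lambda_i^{k} \Bigr), \\
\lambda_i^{k+1} &= \lambda_i^{k} + \rho\,\bigl( w_i^{k+1} - z^{k+1} \bigr).
\end{aligned}
\]

In an inexact variant, each client's w_i-subproblem is solved only approximately (for example, with a small number of local gradient or SGD steps) rather than to optimality.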
Author
– Ryu, Minseok (mryu@anl.gov), Argonne National Laboratory, Mathematics and Computer Science Division, Lemont, IL, USA
– Kim, Youngdae (youngdae@anl.gov), Argonne National Laboratory, Mathematics and Computer Science Division, Lemont, IL, USA
– Kim, Kibaek (kimk@anl.gov), Argonne National Laboratory, Mathematics and Computer Science Division, Lemont, IL, USA
– Madduri, Ravi K. (madduri@anl.gov), Argonne National Laboratory, Data Science and Learning Division, Lemont, IL, USA
CODEN IEEPAD
ContentType Conference Proceeding
DOI 10.1109/IPDPSW55747.2022.00175
EISBN 1665497475
9781665497473
EndPage 1083
ExternalDocumentID 9835407
Genre orig-research
ISBN 9781665497480
ISICitedReferencesCount 17
Language English
PageCount 10
PublicationDate 2022-05-01
PublicationTitle 2022 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW)
PublicationTitleAbbrev IPDPSW
PublicationYear 2022
Publisher IEEE
StartPage 1074
SubjectTerms Collaborative work
communication-efficient algorithm
Data models
data privacy
federated learning
open-source software
Protocols
Regulation
Scalability
Software algorithms
Training
Title APPFL: Open-Source Software Framework for Privacy-Preserving Federated Learning
URI https://ieeexplore.ieee.org/document/9835407