Wireless Resource Efficient Distributed Learning with Deep Joint Source-Channel Coding
Saved in:
| Published in: | IEEE Globecom Workshops, pp. 1 - 6 |
|---|---|
| Main authors: | Matsumura, Issa; Suto, Katsuya |
| Format: | Conference Proceeding |
| Language: | English |
| Published: | IEEE, 08.12.2024 |
| Subjects: | |
| ISSN: | 2166-0077 |
| Online access: | Full text |
| Abstract | Distributed learning methods such as federated learning and decentralized federated learning are promising approaches to constructing highly accurate models through collaboration among client nodes. In common distributed learning, each node repeatedly executes model training and sharing processes to refine the trained model; however, this consumes substantial wireless resources because the size of deep learning models has been increasing in recent years. As a remedy, this paper introduces a novel distributed learning architecture that employs deep joint source-channel coding (DJSCC) for the sharing process. DJSCC provides wireless-resource-efficient communication thanks to data-driven encoder/decoder optimization and pseudo-analog modulation. The most significant contribution of this paper is that, even with the noise introduced by pseudo-analog modulation, distributed learning trains models smoothly and achieves the desired accuracy with fewer transmission symbols. Through computer simulations, we show that the proposal reduces the required symbols by approximately 90% compared to the conventional distributed learning model. A hypothetical code sketch of the DJSCC-based sharing step follows this record. |
|---|---|
| Author | Matsumura, Issa; Suto, Katsuya |
| Author details | Matsumura, Issa (i.matsumura@uec.ac.jp); Suto, Katsuya (k.suto@uec.ac.jp), both at The University of Electro-Communications, Graduate School of Information and Engineering |
| ContentType | Conference Proceeding |
| DOI | 10.1109/GCWkshp64532.2024.11100602 |
| Discipline | Engineering |
| EISBN | 9798331505677 |
| EISSN | 2166-0077 |
| EndPage | 6 |
| ExternalDocumentID | 11100602 |
| Genre | orig-research |
| Language | English |
| PageCount | 6 |
| PublicationDate | 2024-Dec.-8 |
| PublicationDateYYYYMMDD | 2024-12-08 |
| PublicationTitle | IEEE Globecom Workshops |
| PublicationTitleAbbrev | GC Wkshp |
| PublicationYear | 2024 |
| Publisher | IEEE |
| StartPage | 1 |
| SubjectTerms | Accuracy; Computational modeling; Computer aided instruction; Data models; deep joint source-channel coding; deep learning; Distance learning; Distributed learning; Encoding; Noise; Quantization (signal); Symbols; Wireless communication; wireless resource efficiency |
| Title | Wireless Resource Efficient Distributed Learning with Deep Joint Source-Channel Coding |
| URI | https://ieeexplore.ieee.org/document/11100602 |
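The abstract above describes sharing model updates through a DJSCC encoder/decoder with pseudo-analog modulation. The snippet below is a minimal, hypothetical sketch (written in PyTorch) of how such a sharing step could be structured, not the authors' implementation: a learned encoder maps a flattened weight vector to a small number of power-normalized channel symbols, an AWGN channel stands in for the pseudo-analog wireless link, and a learned decoder reconstructs the weights. The class name `DJSCCWeightCodec`, the layer sizes, the symbol budget, and the SNR are illustrative assumptions.

```python
# Hypothetical DJSCC-style codec for sharing flattened model weights over a
# noisy pseudo-analog channel. Not taken from the paper; a toy illustration
# of the general technique described in the abstract.
import torch
import torch.nn as nn


class DJSCCWeightCodec(nn.Module):
    def __init__(self, weight_dim: int, num_symbols: int):
        super().__init__()
        # Encoder: weight vector -> 2 * num_symbols reals
        # (interpreted as I/Q components of num_symbols complex symbols).
        self.encoder = nn.Sequential(
            nn.Linear(weight_dim, 4 * num_symbols), nn.ReLU(),
            nn.Linear(4 * num_symbols, 2 * num_symbols),
        )
        # Decoder: noisy received symbols -> reconstructed weight vector.
        self.decoder = nn.Sequential(
            nn.Linear(2 * num_symbols, 4 * num_symbols), nn.ReLU(),
            nn.Linear(4 * num_symbols, weight_dim),
        )

    def forward(self, w: torch.Tensor, snr_db: float) -> torch.Tensor:
        x = self.encoder(w)
        # Normalize so the average transmit power per real dimension is 1.
        x = x * torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + 1e-9)
        # AWGN channel: noise power set by the target SNR. The symbols are
        # sent as continuous values (pseudo-analog), not quantized bits.
        noise_std = 10.0 ** (-snr_db / 20.0)
        y = x + noise_std * torch.randn_like(x)
        return self.decoder(y)


if __name__ == "__main__":
    # Toy training loop: reconstruct 2,048 weights from 128 complex symbols.
    codec = DJSCCWeightCodec(weight_dim=2_048, num_symbols=128)
    opt = torch.optim.Adam(codec.parameters(), lr=1e-3)
    for _ in range(200):
        w = torch.randn(8, 2_048)      # stand-in for flattened model updates
        w_hat = codec(w, snr_db=10.0)
        loss = nn.functional.mse_loss(w_hat, w)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Because the symbols are continuous values rather than quantized bits, channel noise appears only as a perturbation of the reconstructed weights; the paper's claim is that distributed training tolerates this perturbation while needing roughly 90% fewer transmission symbols than the conventional distributed learning baseline.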