Wireless Resource Efficient Distributed Learning with Deep Joint Source-Channel Coding

Detailed bibliography
Published in: IEEE Globecom Workshops, pp. 1-6
Main authors: Matsumura, Issa; Suto, Katsuya
Format: Conference proceeding
Language: English
Published: IEEE, 08.12.2024
Subjects:
ISSN: 2166-0077
Abstract Distributed learning methods such as federated learning and decentralized federated learning are promising approaches for constructing highly accurate models through collaboration among client nodes. In common distributed learning, each node repeatedly executes model training and sharing processes to refine the trained model; however, this consumes substantial wireless resources because the data size of deep learning models has been increasing in recent years. As a remedy, this paper introduces a novel distributed learning architecture that employs deep joint source-channel coding (DJSCC) for the sharing process. DJSCC provides wireless-resource-efficient communication owing to its data-driven encoder/decoder optimization and pseudo-analog modulation. The most significant contribution of this paper is showing that, even with the noise introduced by pseudo-analog modulation, distributed learning trains models smoothly and achieves the desired accuracy with fewer transmission symbols. Through computer simulations, we show that the proposal reduces the required symbols by approximately 90% compared to the conventional distributed learning model.
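The sketch below illustrates the general idea the abstract describes, not the authors' implementation: a flattened chunk of model parameters is compressed by a learned encoder into a short block of unquantized (pseudo-analog) channel symbols, passed through an AWGN channel, and reconstructed by a learned decoder, with the encoder/decoder pair trained end to end through the noisy channel. All dimensions, the SNR, and the layer choices are illustrative assumptions.

```python
# Minimal conceptual sketch of DJSCC-style parameter sharing (assumptions throughout).
import torch
import torch.nn as nn

PARAM_DIM = 1024    # length of the flattened parameter chunk (assumed)
SYMBOL_DIM = 128    # real-valued channel symbols per chunk, i.e. ~8x fewer symbols (assumed)
SNR_DB = 10.0       # assumed channel signal-to-noise ratio

class DJSCC(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(PARAM_DIM, 512), nn.ReLU(),
            nn.Linear(512, SYMBOL_DIM),
        )
        self.decoder = nn.Sequential(
            nn.Linear(SYMBOL_DIM, 512), nn.ReLU(),
            nn.Linear(512, PARAM_DIM),
        )

    def forward(self, params, snr_db=SNR_DB):
        z = self.encoder(params)
        # Power normalization: unit average transmit power per symbol.
        z = z * torch.rsqrt(z.pow(2).mean(dim=-1, keepdim=True) + 1e-8)
        # Pseudo-analog transmission: symbols are sent unquantized over an AWGN channel.
        noise_std = 10 ** (-snr_db / 20)
        z_noisy = z + noise_std * torch.randn_like(z)
        return self.decoder(z_noisy)

# Training-loop sketch: the encoder/decoder are optimized jointly with the
# channel in the loop, so the learned code adapts to the channel noise.
model = DJSCC()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(100):
    chunk = torch.randn(32, PARAM_DIM)  # stand-in for real flattened weight chunks
    loss = nn.functional.mse_loss(model(chunk), chunk)
    opt.zero_grad()
    loss.backward()
    opt.step()
```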
Author Suto, Katsuya
Matsumura, Issa
Author_xml – sequence: 1
  givenname: Issa
  surname: Matsumura
  fullname: Matsumura, Issa
  email: i.matsumura@uec.ac.jp
  organization: The University of Electro-Communications,Graduate School of Informatics and Engineering
– sequence: 2
  givenname: Katsuya
  surname: Suto
  fullname: Suto, Katsuya
  email: k.suto@uec.ac.jp
  organization: The University of Electro-Communications,Graduate School of Informatics and Engineering
ContentType Conference Proceeding
DOI 10.1109/GCWkshp64532.2024.11100602
DatabaseName IEEE Electronic Library (IEL) Conference Proceedings
IEEE Xplore POP ALL
IEEE Xplore All Conference Proceedings
IEEE Electronic Library (IEL)
IEEE Proceedings Order Plans (POP All) 1998-Present
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Xplore
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
EISBN 9798331505677
EISSN 2166-0077
EndPage 6
ExternalDocumentID 11100602
Genre orig-research
ISICitedReferencesCount 0
IngestDate Wed Aug 27 01:46:15 EDT 2025
IsPeerReviewed false
IsScholarly false
Language English
LinkModel DirectLink
PageCount 6
ParticipantIDs ieee_primary_11100602
PublicationCentury 2000
PublicationDate 2024-Dec.-8
PublicationDateYYYYMMDD 2024-12-08
PublicationDate_xml – month: 12
  year: 2024
  text: 2024-Dec.-8
  day: 08
PublicationDecade 2020
PublicationTitle IEEE Globecom Workshops
PublicationTitleAbbrev GC Wkshp
PublicationYear 2024
Publisher IEEE
Publisher_xml – name: IEEE
SSID ssj0001682796
SourceID ieee
SourceType Publisher
StartPage 1
SubjectTerms Accuracy
Computational modeling
Computer aided instruction
Data models
deep joint source-channel coding
deep learning
Distance learning
Distributed learning
Encoding
Noise
Quantization (signal)
Symbols
Wireless communication
wireless resource efficiency
Title Wireless Resource Efficient Distributed Learning with Deep Joint Source-Channel Coding
URI https://ieeexplore.ieee.org/document/11100602