Attention-Empowered Residual Autoencoder for End-to-End Communication Systems


Bibliographic Details
Published in: IEEE Communications Letters, Vol. 27, No. 4, p. 1
Main Authors: Lu, Min; Zhou, Bin; Bu, Zhiyong
Format: Journal Article
Language: English
Published: New York: IEEE, 01.04.2023
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN:1089-7798, 1558-2558
Online Access: Get full text
Description
Summary: Channel autoencoders use neural networks to represent and optimize previously block-driven communication systems from an end-to-end perspective. The existing deep fully connected autoencoder must be retrained whenever the input bit-sequence length changes, since it can only handle fixed-length data. A convolutional neural network (CNN)-based autoencoder accepts arbitrary input lengths and is widely adopted, but it has two limitations: 1) an error floor in high signal-to-noise ratio (SNR) regions, and 2) poor performance against interference. We therefore first adopt a residual attention module to enhance the representation ability of the autoencoder, where channel and spatial attention focus on finding fine-grained clues that distinguish signals from noise. An adaptive data-flow merging scheme for the demapper is further proposed to cope with dynamic environments. Simulation results show that the proposed method achieves substantial gains in both coded and uncoded transmission scenarios. Compared with a conventional system, our method is robust and generalizes well against the error floor and burst interference.
DOI:10.1109/LCOMM.2023.3242281