Data-Driven OTFS Channel Estimation Based on Gated Recurrent Convolutional Autoencoder

Detailed Description

Bibliographic Details
Published in: International Symposium on Communications and Information Technologies (Online), pp. 7-12
Main authors: Chen, Junshen; Yuan, Qihao; Zhang, Shiyao; Liu, Chang
Format: Conference paper
Language: English
Published: IEEE, 16 October 2023
ISSN: 2643-6175
Description
Abstract: Considering traffic environments with fast-moving vehicles, orthogonal time frequency space (OTFS) modulation has emerged as a promising technology for handling rapidly time-varying channels in vehicular communications. Owing to the sparse representation of the channel in the delay-Doppler (DD) domain, the channel information can be estimated by means of the embedded-pilot technique. However, uncertainties arising from unknown and burst noise can degrade system performance. To tackle this problem, this paper proposes a novel gated recurrent convolutional autoencoder (GRCAE) model that denoises such complex noise for channel estimation in OTFS systems. Specifically, the proposed model distinguishes and retains the significant features of the signal during denoising through a gated recurrent unit (GRU) network, while the convolutional autoencoder captures the local spatial features of the signal and reconstructs them to obtain a denoised signal. Running the two branches in parallel further improves denoising accuracy and robustness. Simulation results demonstrate that the proposed GRCAE-based approach delivers satisfactory performance in various noise scenarios.
DOI: 10.1109/ISCIT57293.2023.10376091
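
For illustration, below is a minimal sketch of a GRCAE-style denoiser in the spirit of the abstract: a convolutional-autoencoder branch and a GRU branch process a delay-Doppler grid in parallel, and their outputs are fused. It assumes a PyTorch implementation; the layer sizes, the real/imaginary channel encoding, the choice of the Doppler axis as the GRU sequence dimension, and the averaging fusion are all illustrative assumptions, not the authors' exact architecture.

# Hedged sketch of a GRCAE-style denoiser (assumed PyTorch implementation).
# Shapes, layer widths, and the fusion scheme are illustrative, not the
# paper's exact design. Input: (batch, 2, delay_bins, doppler_bins), where
# the 2 channels hold the real and imaginary parts of the DD-domain grid.
import torch
import torch.nn as nn

class GRCAEDenoiser(nn.Module):
    def __init__(self, delay_bins=32, hidden=64):
        super().__init__()
        # Convolutional autoencoder branch: captures local spatial
        # features of the DD-domain signal and reconstructs them.
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 2, 3, padding=1),
        )
        # GRU branch: treats the Doppler axis as a sequence so the gates
        # can retain significant signal features while suppressing noise.
        self.gru = nn.GRU(input_size=2 * delay_bins, hidden_size=hidden,
                          batch_first=True, bidirectional=True)
        self.gru_out = nn.Linear(2 * hidden, 2 * delay_bins)

    def forward(self, x):  # x: (batch, 2, delay_bins, doppler_bins)
        b, c, m, n = x.shape
        cae = self.decoder(self.encoder(x))
        # Flatten real/imag and delay into one feature vector per Doppler bin.
        seq = x.permute(0, 3, 1, 2).reshape(b, n, c * m)
        g, _ = self.gru(seq)
        gru = self.gru_out(g).reshape(b, n, c, m).permute(0, 2, 3, 1)
        # Parallel fusion of the two branches (a simple average here).
        return 0.5 * (cae + gru)

# Usage: denoise a noisy DD-domain pilot grid.
model = GRCAEDenoiser()
noisy = torch.randn(4, 2, 32, 32)
denoised = model(noisy)  # (4, 2, 32, 32)

In a training setup, such a model would typically be fit with a reconstruction loss (e.g., mean squared error) between the denoised output and the clean DD-domain grid, before thresholding for embedded-pilot channel estimation.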