SAE-Net: A Deep Neural Network for SAR Autofocus

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, pp. 1-14
First author: Pu, Wei
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
ISSN: 0196-2892, 1558-0644
Description
Abstract: The sparsity-driven technique is widely used to solve the synthetic aperture radar (SAR) imaging problem, but it is highly sensitive to motion errors. To address this problem, this article proposes a new deep neural network architecture, the sparse autoencoder network (SAE-Net). The proposed SAE-Net performs SAR imaging and autofocus simultaneously: the encoder transforms the SAR echo into an imaging result, and the decoder regenerates the SAR echo from that result. The encoder is designed by unfolding the alternating direction method of multipliers (ADMM), while the decoder is formulated as a linear mapping. A joint reconstruction loss and entropy loss guide the training of SAE-Net. Notably, the algorithm operates in a fully self-supervised manner and requires no external training dataset. The methodology was tested on both synthetic and real SAR data; these tests show that the proposed architecture outperforms other state-of-the-art autofocus methods in sparsity-driven SAR imaging applications.
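The structure the abstract describes can be illustrated with a minimal sketch: an "encoder" built from unrolled ADMM iterations that maps a measurement vector to a sparse image estimate, a linear decoder that regenerates the echo, and a self-supervised loss combining reconstruction error with image entropy. This is not the paper's implementation; the measurement model `y = A x`, all function names, and all hyperparameters (`rho`, `lam`, `beta`, layer count) are illustrative assumptions for a 1-D toy problem.

```python
import numpy as np

def soft_threshold(v, tau):
    # Complex soft-thresholding: proximal operator of the L1 norm.
    mag = np.abs(v)
    return np.where(mag > tau, (1.0 - tau / np.maximum(mag, 1e-12)) * v, 0.0)

def admm_encoder(y, A, n_layers=50, rho=1.0, lam=0.05):
    """'Encoder': n_layers unrolled ADMM iterations for the sparse
    recovery problem min_x ||A x - y||^2 + lam * ||x||_1."""
    m, n = A.shape
    z = np.zeros(n, dtype=complex)
    u = np.zeros(n, dtype=complex)
    # Precompute the x-update system matrix (A^H A + rho I)^-1.
    AhA = A.conj().T @ A
    Ahy = A.conj().T @ y
    M = np.linalg.inv(AhA + rho * np.eye(n))
    for _ in range(n_layers):
        x = M @ (Ahy + rho * (z - u))         # x-update (least squares)
        z = soft_threshold(x + u, lam / rho)  # z-update (sparsity prox)
        u = u + x - z                         # dual-variable update
    return z

def linear_decoder(x, A):
    # 'Decoder': regenerate the echo from the image estimate.
    return A @ x

def sae_loss(y, x, A, beta=0.01):
    """Self-supervised objective: echo reconstruction error plus an
    image-entropy term that rewards well-focused (peaky) images."""
    y_hat = linear_decoder(x, A)
    rec = np.sum(np.abs(y - y_hat) ** 2)
    p = np.abs(x) ** 2
    p = p / max(p.sum(), 1e-12)
    entropy = -np.sum(p * np.log(p + 1e-12))
    return rec + beta * entropy

# Toy example: a 3-point sparse scene, random complex measurement matrix.
rng = np.random.default_rng(0)
n, m, k = 64, 48, 3
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(m)
x_true = np.zeros(n, dtype=complex)
x_true[rng.choice(n, k, replace=False)] = 1.0 + 1.0j
y = A @ x_true
x_hat = admm_encoder(y, A)
```

In the actual SAE-Net, the unrolled iterations would carry learnable parameters and the loss would be backpropagated through them; the sketch only shows the encoder-decoder-loss wiring.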
DOI: 10.1109/TGRS.2021.3139914