SAE-Net: A Deep Neural Network for SAR Autofocus

Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, pp. 1-14
Main author: Pu, Wei
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
ISSN: 0196-2892, 1558-0644
Description
Abstract: The sparsity-driven technique is a widely used tool for solving the synthetic aperture radar (SAR) imaging problem; however, it is highly sensitive to motion errors. To address this problem, this article proposes a new deep neural network architecture, the sparse autoencoder network (SAE-Net), designed to perform SAR imaging and autofocus simultaneously. In SAE-Net, the encoder transforms the SAR echo into an imaging result, and the decoder regenerates the SAR echo from that result. The encoder is built by unfolding the alternating direction method of multipliers (ADMM), while the decoder is formulated as a linear mapping. A joint reconstruction loss and an entropy loss guide the training of SAE-Net. Notably, the algorithm operates in a fully self-supervised manner and requires no external training dataset. The methodology was tested on both synthetic and real SAR data; these tests show that the proposed architecture outperforms other state-of-the-art autofocus methods in sparsity-driven SAR imaging applications.
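To make the abstract's ingredients concrete, here is a minimal NumPy sketch of the two pieces it names: an unfolded-ADMM sparse encoder (soft-thresholding iterations for an l1-regularized least-squares problem) and a joint loss combining echo-reconstruction error with image entropy. This is an illustration under stated assumptions, not the paper's implementation; all function names, the fixed per-layer parameters (in SAE-Net these would be learned), and the entropy weight are hypothetical.

```python
import numpy as np

def soft_threshold(x, tau):
    # Complex soft-thresholding: the proximal operator of the l1 norm,
    # applied element-wise in each unfolded ADMM layer.
    mag = np.abs(x)
    return np.where(mag > tau, (1.0 - tau / np.maximum(mag, 1e-12)) * x, 0.0)

def admm_encoder(y, A, n_layers=10, rho=1.0, tau=0.05):
    """Unfolded ADMM for min_x ||y - A x||^2 + lambda ||x||_1.

    y: measured echo, A: linear measurement (imaging) operator.
    In SAE-Net the per-layer parameters would be trainable; here they
    are fixed scalars for illustration.
    """
    m, n = A.shape
    z = np.zeros(n, dtype=complex)   # sparse image estimate
    u = np.zeros(n, dtype=complex)   # scaled dual variable
    AtA = A.conj().T @ A
    Aty = A.conj().T @ y
    inv = np.linalg.inv(AtA + rho * np.eye(n))   # precomputed x-update solve
    for _ in range(n_layers):
        x = inv @ (Aty + rho * (z - u))          # x-update (least squares)
        z = soft_threshold(x + u, tau / rho)     # z-update (sparsity prox)
        u = u + x - z                            # dual ascent step
    return z

def sae_net_loss(y, A, x, w_entropy=0.1):
    """Joint loss: echo reconstruction error plus image entropy.

    The decoder is the linear mapping A, so the reconstruction term
    compares the regenerated echo A @ x against the measured echo y;
    the entropy term rewards a sharp (well-focused) image.
    """
    recon = np.linalg.norm(y - A @ x) ** 2
    p = np.abs(x) ** 2
    p = p / max(p.sum(), 1e-12)                  # normalized intensity
    entropy = -np.sum(p * np.log(p + 1e-12))
    return recon + w_entropy * entropy
```

A usage pass would run `admm_encoder` on the echo to get the image, evaluate `sae_net_loss`, and backpropagate through the unrolled layers; the sketch omits autograd and the learned phase-error compensation that performs the actual autofocus.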
DOI: 10.1109/TGRS.2021.3139914