SAE-Net: A Deep Neural Network for SAR Autofocus
| Published in: | IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, pp. 1–14 |
|---|---|
| Format: | Journal Article |
| Language: | English |
| Published: | New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022 |
| ISSN: | 0196-2892, 1558-0644 |
| Summary: | Sparsity-driven techniques are widely used to solve the synthetic aperture radar (SAR) imaging problem, but they are highly sensitive to motion errors. To address this, the article proposes a new deep neural network architecture, the sparse autoencoder network (SAE-Net), designed to perform SAR imaging and autofocus simultaneously. In SAE-Net, the encoder transforms the SAR echo into an imaging result, and the decoder regenerates the SAR echo from that result. The encoder is built by unfolding the alternating direction method of multipliers (ADMM), while the decoder is formulated as a linear mapping. A joint reconstruction loss and entropy loss guide the training of SAE-Net. Notably, the algorithm is fully self-supervised and requires no external training dataset. Tests on both synthetic and real SAR data show that the proposed architecture outperforms other state-of-the-art autofocus methods in sparsity-driven SAR imaging applications. |
|---|---|
| DOI: | 10.1109/TGRS.2021.3139914 |
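The pipeline the summary describes — an encoder made of unfolded ADMM iterations, a decoder that is a linear re-mapping back to the echo, and a joint reconstruction-plus-entropy objective — can be sketched in miniature. The NumPy toy below is an illustrative assumption, not the authors' implementation: it unrolls fixed-parameter ADMM steps for a generic sparse inverse problem, regenerates the "echo" with a linear map, and evaluates the two losses. The measurement matrix `A`, the regularization weight, the step size, and the iteration count are all made up for the demo (in SAE-Net such quantities would be network parameters learned self-supervised).

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of the l1 norm (the sparsity prior).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def admm_encoder(y, A, lam=0.1, rho=1.0, n_iters=10):
    # Unrolled ADMM for: min_x 0.5||A x - y||^2 + lam ||x||_1.
    # Each iteration corresponds to one "layer" of an unfolded-ADMM
    # encoder; here all step sizes are fixed rather than learned.
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AtA, Aty = A.T @ A, A.T @ y
    Q = np.linalg.inv(AtA + rho * np.eye(n))  # cached x-update solve
    for _ in range(n_iters):
        x = Q @ (Aty + rho * (z - u))          # x-update (least squares)
        z = soft_threshold(x + u, lam / rho)   # z-update (sparsity prox)
        u = u + x - z                          # dual-variable update
    return z

def image_entropy(x, eps=1e-12):
    # Entropy of the normalized image intensity; a sharper (better
    # focused) image has lower entropy.
    p = np.abs(x) ** 2
    p = p / (p.sum() + eps)
    return -np.sum(p * np.log(p + eps))

# Toy data: a k-sparse "scene" observed through a random linear operator.
rng = np.random.default_rng(0)
n, m, k = 64, 48, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)   # toy measurement operator
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                                  # simulated "echo"

x_hat = admm_encoder(y, A, lam=0.05, rho=1.0, n_iters=50)  # encoder
y_hat = A @ x_hat                               # linear-mapping decoder
recon_loss = float(np.sum((y_hat - y) ** 2))    # reconstruction loss
ent_loss = float(image_entropy(x_hat))          # entropy (focus) loss
```

In the actual SAE-Net the two losses would be combined and back-propagated through the unrolled iterations to adapt the encoder's parameters (including the motion-error model) without any labeled training data; this sketch only evaluates them once with fixed parameters.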