A Generalized Chirp-Scaling Algorithm for Geosynchronous Orbit SAR Staring Observations

Detailed bibliography
Published in: Sensors (Basel, Switzerland), Volume 17, Issue 5, p. 1058
Main authors: Li, Caipin; He, Mingyi
Medium: Journal Article
Language: English
Published: Switzerland: MDPI AG, 6 May 2017
ISSN: 1424-8220
Description
Summary: Geosynchronous Orbit Synthetic Aperture Radar (GEO SAR) has recently received increasing attention due to its ability to perform staring observations of ground targets. However, GEO SAR staring observation involves an ultra-long integration time that conventional frequency-domain algorithms cannot handle, because their assumed slant range model is inaccurate and azimuth aliasing occurs. To overcome this problem, this paper proposes an improved chirp-scaling algorithm that uses a fifth-order slant range model, which accounts for the impact of the "stop-and-go" assumption to overcome the inaccuracy of the conventional slant range model, together with a two-step processing method to remove azimuth aliasing. Furthermore, the expression of the two-dimensional spectrum is derived by the method of series reversion, leading to an improved chirp-scaling algorithm that comprises high-order phase coupling compensation, range compression, and azimuth compression. The key innovations of this algorithm are the use of a fifth-order slant range model and the removal of azimuth aliasing for GEO SAR staring observations. A simulation of an L-band GEO SAR with an 1800 s integration time is used to demonstrate the validity and accuracy of the algorithm.
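
The abstract's central point, the need for a high-order slant range model over an 1800 s aperture, can be illustrated with a small numerical sketch. The Python snippet below fits polynomial range models of increasing order to an entirely made-up slant-range history (the coefficients and geometry are illustrative assumptions, not the paper's GEO SAR parameters) and checks the residual two-way phase error against the common pi/4 criterion.

```python
# Illustrative sketch: why a fifth-order slant range model may be needed
# for very long integration times. All numerical values are assumptions.
import numpy as np

c = 3e8                      # speed of light, m/s
f0 = 1.25e9                  # assumed L-band carrier frequency, Hz
wavelength = c / f0

T = 1800.0                   # integration time from the abstract, s
t = np.linspace(-T / 2, T / 2, 4001)   # slow-time axis

# Placeholder "exact" slant-range history; a real GEO SAR study would
# compute this from the satellite ephemeris and the target position.
R0 = 3.8e7                   # assumed nominal slant range, m
R_exact = (R0 + 50.0 * t + 2.0e-3 * t**2 + 1.0e-7 * t**3
           + 3.0e-11 * t**4 + 5.0e-15 * t**5 + 1.0e-21 * t**6)

def fit_range_model(t, R, order):
    """Least-squares polynomial slant-range model of the given order."""
    coeffs = np.polyfit(t, R, order)
    return np.polyval(coeffs, t)

for order in (2, 4, 5):
    residual = R_exact - fit_range_model(t, R_exact, order)
    phase_err = 4.0 * np.pi / wavelength * np.max(np.abs(residual))
    print(f"order {order}: max two-way phase error {phase_err:.2e} rad "
          f"(< pi/4: {phase_err < np.pi / 4})")
```

With these particular made-up coefficients, the second- and fourth-order fits leave residual phase errors of many radians, while the fifth-order fit stays below pi/4, which mirrors the motivation stated in the abstract; it is a sketch of the argument, not the paper's derivation.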
DOI: 10.3390/s17051058