Geometric Positioning Accuracy Improvement of ZY-3 Satellite Imagery Based on Statistical Learning Theory

Bibliographic details
Published in: Sensors (Basel, Switzerland), Vol. 18, No. 6, p. 1701
Main authors: Jiao, Niangang; Wang, Feng; You, Hongjian; Yang, Mudan; Yao, Xinghui
Format: Journal Article
Language: English
Publication details: Switzerland: MDPI AG, 24 May 2018
ISSN: 1424-8220, 0746-9462
Abstract: With the increasing demand for high-resolution remote sensing images for mapping and monitoring the Earth's environment, geometric positioning accuracy improvement plays a significant role in the image preprocessing step. Based on statistical learning theory, we propose a new method to improve geometric positioning accuracy without ground control points (GCPs). Multi-temporal images from the ZY-3 satellite are tested, and the bias-compensated rational function model (RFM) is applied as the block adjustment model in our experiment. A simple and stable weighting strategy and the fast iterative shrinkage-thresholding (FIST) algorithm, which is widely used in compressive sensing, are improved and used to construct and solve the normal equation matrix. The residual errors remaining after traditional block adjustment are then extracted and compensated with the newly proposed inherent error compensation model based on statistical learning theory. The final results indicate that the geometric positioning accuracy of ZY-3 satellite imagery can be improved considerably with the proposed method.
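
Illustrative note (a sketch prompted by the abstract, not material from the record or the paper): in a bias-compensated RFM, the rational-polynomial projection from ground coordinates (latitude, longitude, height) to image coordinates (row r, column c) is commonly augmented with an affine correction in image space, for example \Delta r = a_0 + a_1 c + a_2 r and \Delta c = b_0 + b_1 c + b_2 r, and the correction coefficients become the unknowns of the block adjustment; the authors' exact parameterization may differ. The fast iterative shrinkage-thresholding algorithm mentioned in the abstract is a standard solver for L1-regularized least squares, min_x 0.5*||A x - b||^2 + lam*||x||_1. The following minimal Python sketch shows the textbook form of that algorithm (Beck and Teboulle, 2009); the matrix A, vector b, and weight lam are placeholders, not values from the paper.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (element-wise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, n_iter=200):
    # Minimize 0.5*||A x - b||^2 + lam*||x||_1 with standard FISTA.
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant: largest eigenvalue of A^T A
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        x_prev = x
        grad = A.T @ (A @ y - b)       # gradient of the least-squares term at y
        x = soft_threshold(y - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # momentum extrapolation
        t = t_next
    return x

# Illustrative use on a synthetic over-determined system:
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 12))
b = A @ rng.standard_normal(12) + 0.01 * rng.standard_normal(50)
x_hat = fista(A, b, lam=0.1)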
DOI: 10.3390/s18061701