Representation learning with convolutional sparse autoencoders for remote sensing


Detailed Bibliography
Published in: 2013 21st Signal Processing and Communications Applications Conference (SIU), pp. 1-4
Main authors: Firat, O., Vural, F. T. Y.
Format: Conference paper
Language: English, Turkish
Published: IEEE, 01.04.2013
ISBN: 9781467355629, 1467355623
Online access: Get full text
Description
Summary: The performance of object recognition and classification on remote sensing imagery is highly dependent on the quality of the extracted features and the amount of labeled data in the dataset. In this study, we concentrate on representation learning from unlabeled remote sensing data and on using these representations to recognize different objects that vary in complexity, characteristics, and ground resolution. In the proposed framework, patches randomly sampled from remote sensing images are first used to train a single-layer sparse autoencoder in order to learn an efficient representation of the dataset. The learned representations turn out to be Gabor filters of various orientations and scales, color and color co-occurrence filters, and edge-detection filters. Subsequently, the representations are used to extract features from the target objects by convolution and pooling. Finally, the extracted features are used to train a machine learning algorithm, and classification performance is evaluated. The proposed method is tested on the recognition of dispersal areas, taxi-routes, parking areas, and airplanes, which are all subparts of an airfield. The performance of the proposed method is competitive with currently used rule-based and supervised methods.
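The pipeline described in the summary (learn filters from random unlabeled patches, then convolve and pool to build features for a classifier) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it substitutes scikit-learn's `MiniBatchDictionaryLearning` for the single-layer sparse autoencoder, uses a synthetic random image in place of remote sensing data, and picks arbitrary patch, filter, and pooling sizes.

```python
import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)

# 1) Randomly sample patches from an (here: synthetic) unlabeled image.
image = rng.random((64, 64))
patches = extract_patches_2d(image, (8, 8), max_patches=500, random_state=0)
X = patches.reshape(len(patches), -1)
X -= X.mean(axis=1, keepdims=True)  # per-patch mean removal

# 2) Learn a sparse dictionary of filters
#    (stand-in for the single-layer sparse autoencoder).
dico = MiniBatchDictionaryLearning(n_components=16, alpha=1.0, random_state=0)
dico.fit(X)
filters = dico.components_.reshape(16, 8, 8)

# 3) Convolve each learned filter over an image and pool the responses.
def conv_pool(img, filt, stride=4):
    h, w = img.shape
    fh, fw = filt.shape
    resp = np.array([[np.sum(img[i:i + fh, j:j + fw] * filt)
                      for j in range(0, w - fw + 1, stride)]
                     for i in range(0, h - fh + 1, stride)])
    return resp.max()  # global max pooling over the response map

features = np.array([conv_pool(image, f) for f in filters])
print(features.shape)  # one pooled feature per learned filter: (16,)
```

The resulting per-image feature vectors would then be stacked into a matrix and fed, together with labels for the target objects, to any standard classifier (e.g. logistic regression or an SVM).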
DOI: 10.1109/SIU.2013.6531525