Two-Stream Deep Architecture for Hyperspectral Image Classification
| Published in: | IEEE Transactions on Geoscience and Remote Sensing, Vol. 56, No. 4, pp. 2349-2361 |
|---|---|
| Main authors: | , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: IEEE, 01.04.2018 (The Institute of Electrical and Electronics Engineers, Inc.) |
| Subjects: | |
| ISSN: | 0196-2892, 1558-0644 |
| Online access: | Full text |
| Abstract: | Most traditional approaches classify hyperspectral image (HSI) pixels using only the spectral values of the input channels. However, the spatial context around a pixel is also important and can enhance classification performance. To effectively exploit and fuse both the spatial context and the spectral structure, we propose a novel two-stream deep architecture for HSI classification, consisting of a two-stream network and a novel fusion scheme. In the two-stream network, one stream employs a stacked denoising autoencoder to encode the spectral values of each input pixel, while the other stream processes the corresponding image patch with deep convolutional neural networks. In the fusion scheme, the prediction probabilities from the two streams are fused by adaptive class-specific weights obtained from a fully connected layer. Finally, a weight regularizer is added to the loss function to alleviate overfitting of the class-specific fusion weights. Experimental results on real HSIs demonstrate that the proposed two-stream deep architecture achieves competitive performance compared with state-of-the-art methods. |
|---|---|
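
The abstract describes the architecture only at a high level. For orientation, here is a minimal PyTorch sketch of one plausible reading: the layer widths, the class count, the plain MLP standing in for the pretrained stacked-denoising-autoencoder encoder, the sigmoid gating, and the names `TwoStreamHSI`, `loss_fn`, and `lam` are all illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the two-stream idea from the abstract.
# All sizes (bands, classes, layer widths) are illustrative, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoStreamHSI(nn.Module):
    def __init__(self, n_bands=200, n_classes=16):
        super().__init__()
        # Spectral stream: stands in for the encoder of a pretrained
        # stacked denoising autoencoder over each pixel's spectrum.
        self.spectral = nn.Sequential(
            nn.Linear(n_bands, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )
        # Spatial stream: a small CNN over the image patch around the pixel.
        self.spatial = nn.Sequential(
            nn.Conv2d(n_bands, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, n_classes),
        )
        # Fusion: a fully connected layer yields adaptive class-specific
        # weights that mix the two streams' per-class probabilities.
        self.fusion = nn.Linear(2 * n_classes, n_classes)

    def forward(self, spectrum, patch):
        p_spec = F.softmax(self.spectral(spectrum), dim=1)
        p_spat = F.softmax(self.spatial(patch), dim=1)
        # One mixing weight per class, squashed to (0, 1).
        w = torch.sigmoid(self.fusion(torch.cat([p_spec, p_spat], dim=1)))
        # Fused per-class scores (treated as class probabilities below).
        return w * p_spec + (1.0 - w) * p_spat

def loss_fn(model, probs, target, lam=1e-3):
    # Negative log-likelihood on the fused scores plus an L2 penalty on
    # the fusion layer, echoing the abstract's weight regularizer.
    nll = F.nll_loss(torch.log(probs + 1e-8), target)
    reg = model.fusion.weight.pow(2).sum()
    return nll + lam * reg

# Usage: 8 pixels, 200 bands, 9x9 patches, 16 classes.
model = TwoStreamHSI()
probs = model(torch.randn(8, 200), torch.randn(8, 200, 9, 9))
loss = loss_fn(model, probs, torch.randint(0, 16, (8,)))
```

The gating design (a sigmoid convex combination per class) is one simple way to realize "adaptive class-specific weights obtained from a fully connected layer"; the paper may parameterize or normalize the fusion differently.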
| DOI: | 10.1109/TGRS.2017.2778343 |