Kernel Feature Extraction in Signal Processing
| Published in: | Digital Signal Processing with Kernel Methods, pp. 543–588 |
|---|---|
| Main authors: | |
| Format: | Chapter |
| Language: | English |
| Published: | Chichester, UK: Wiley, 2018 (John Wiley & Sons, Ltd) |
| Edition: | 1 |
| Series: | Wiley - IEEE |
| Subjects: | |
| ISBN: | 9781118611791, 1118611799 |
| Online access: | Get full text |
| Summary: | Kernel-based feature extraction and dimensionality reduction are becoming increasingly important in advanced signal processing, particularly in applications dealing with very high-dimensional data. Besides changing the data representation space via kernel feature extraction, another possibility is to correct for biases in the data distributions by operating on the samples. This chapter reviews the main kernel feature extraction and dimensionality reduction methods in supervised, unsupervised, and semi-supervised settings, and illustrates them on toy examples as well as on real datasets. The chapter also analyzes the connections between the Hilbert-Schmidt independence criterion (HSIC) and classical feature extraction methods. HSIC measures cross-covariance in an adequate reproducing kernel Hilbert space (RKHS) by using the entire spectrum of the cross-covariance operator. Kernel dimensionality reduction (KDR) is a supervised feature extraction method that seeks a linear transformation of the data that maximizes the conditional HSIC on the labels. |
|---|---|
| DOI: | 10.1002/9781118705810.ch12 |
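
As a rough companion to the summary above: HSIC is commonly estimated from finite samples with the biased estimator tr(K H L H) / (n - 1)^2, where K and L are kernel matrices computed on the two variables and H is the centering matrix. The sketch below is not code from the chapter itself; the Gaussian kernels, bandwidths, and toy data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
    """Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma_x)
    L = rbf_kernel(Y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy check: a variable that depends on X should score higher than an independent one.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Y_dep = X[:, :1] + 0.1 * rng.normal(size=(200, 1))  # dependent on X
Y_ind = rng.normal(size=(200, 1))                   # independent of X
print(hsic(X, Y_dep), hsic(X, Y_ind))
```

Larger values indicate stronger statistical dependence between the two samples; KDR, as described in the summary, searches for a linear projection of the inputs that maximizes such a dependence measure with respect to the labels.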

