Connecting the Out-of-Sample and Pre-Image Problems in Kernel Methods


Bibliographic Details
Published in: 2007 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1-8
Main Authors: Arias, P., Randall, G., Sapiro, G.
Format: Conference Proceedings
Language: English
Published: IEEE, 01.06.2007
ISBN: 9781424411795, 1424411793
ISSN: 1063-6919
Online Access: Full text
Description
Summary: Kernel methods have been widely studied in the field of pattern recognition. These methods implicitly map the data, via the "kernel trick," into a space that is more appropriate for analysis. Many manifold learning and dimensionality reduction techniques are simply kernel methods for which the mapping is explicitly computed. In such cases, two problems related to the mapping arise: the out-of-sample extension and the pre-image computation. In this paper we propose a new pre-image method based on the Nyström formulation for the out-of-sample extension, showing the connections between the two problems. We also address the importance of normalization in the feature space, which has been ignored by standard pre-image algorithms. As an example, we apply these ideas to the Gaussian kernel and relate our approach to other popular pre-image methods. Finally, we show the application of these techniques to the study of dynamic shapes.
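To make the two problems in the summary concrete, here is a minimal sketch in NumPy, assuming a kernel PCA embedding with a Gaussian kernel: the Nyström-style out-of-sample extension embeds an unseen point from its cross-kernel with the training set, and a classical fixed-point iteration (in the style of Mika et al.'s pre-image method for the Gaussian kernel) approximates a pre-image. This is an illustrative setup, not the authors' exact algorithm; all variable names and parameter values are hypothetical.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))          # training samples (hypothetical data)
sigma = 2.0

# Kernel PCA on the training set (uncentered kernel, for brevity).
K = gaussian_kernel(X, X, sigma)
eigvals, eigvecs = np.linalg.eigh(K)
idx = np.argsort(eigvals)[::-1][:2]   # keep the top-2 components
lam, A = eigvals[idx], eigvecs[:, idx]
A = A / np.sqrt(lam)                  # scale so embeddings are K @ A

# Out-of-sample (Nystrom) extension: embed a new point x using only its
# cross-kernel with the training data, without re-running the eigensolver.
x = rng.normal(size=(1, 3))
kx = gaussian_kernel(x, X, sigma)     # shape (1, 50)
z = kx @ A                            # low-dimensional coordinates of x

# Pre-image: find x_hat whose feature-space image is closest to the point
# reconstructed from z. The reconstruction is a weighted combination of the
# training images, and for the Gaussian kernel the minimization reduces to
# a fixed-point iteration over kernel-weighted training points.
gamma = (A @ z.T).ravel()             # reconstruction weights per sample
x_hat = X.mean(axis=0)                # initialize at the data mean
for _ in range(100):
    w = gamma * gaussian_kernel(x_hat[None, :], X, sigma).ravel()
    if abs(w.sum()) < 1e-12:          # guard against a degenerate step
        break
    x_hat = (w[:, None] * X).sum(axis=0) / w.sum()
```

Note that the fixed-point update rewrites each iterate as a convex-like combination of training points, which is exactly why standard pre-image methods are sensitive to how the feature-space reconstruction is normalized, the issue the paper highlights.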
DOI:10.1109/CVPR.2007.383038