Connecting the Out-of-Sample and Pre-Image Problems in Kernel Methods

Bibliographic Details
Published in: 2007 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1-8
Main Authors: Arias, P., Randall, G., Sapiro, G.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2007
Subjects:
ISBN: 9781424411795, 1424411793
ISSN: 1063-6919
Description
Summary: Kernel methods have been widely studied in the field of pattern recognition. These methods implicitly map the data, via the "kernel trick," into a space that is more appropriate for analysis. Many manifold learning and dimensionality reduction techniques are simply kernel methods for which the mapping is explicitly computed. In such cases, two problems related to the mapping arise: the out-of-sample extension and the pre-image computation. In this paper we propose a new pre-image method based on the Nyström formulation for the out-of-sample extension, showing the connections between the two problems. We also address the importance of normalization in the feature space, which has been ignored by standard pre-image algorithms. As an example, we apply these ideas to the Gaussian kernel and relate our approach to other popular pre-image methods. Finally, we show the application of these techniques to the study of dynamic shapes.
DOI: 10.1109/CVPR.2007.383038
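
Illustrative sketch (not part of the record): the summary refers to the Nyström out-of-sample extension and the pre-image problem for the Gaussian kernel. The Python sketch below shows a standard kernel-PCA pipeline with a Nyström-style out-of-sample projection and a common fixed-point pre-image approximation. All function names, the sigma parameter, and the simplifications (e.g., ignoring centering offsets in the pre-image step) are assumptions for illustration only; this is not the method proposed in the paper.

# Hedged sketch: Nystrom-style out-of-sample extension for Gaussian-kernel
# PCA plus a fixed-point pre-image approximation. Names and parameters are
# illustrative assumptions, not the algorithm proposed in the paper.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel values between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kpca_fit(X, n_components=2, sigma=1.0):
    # Kernel PCA on the training set: center the kernel matrix and keep
    # the leading eigenpairs.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    evals, evecs = np.linalg.eigh(H @ K @ H)
    idx = np.argsort(evals)[::-1][:n_components]
    return {"X": X, "K": K, "lam": evals[idx], "V": evecs[:, idx], "sigma": sigma}

def nystrom_embed(model, Z):
    # Out-of-sample extension: project new points Z onto the kernel
    # principal components from their kernel values against the training set.
    X, K, lam, V, sigma = (model[k] for k in ("X", "K", "lam", "V", "sigma"))
    Kz = gaussian_kernel(Z, X, sigma)
    Kz_c = Kz - Kz.mean(1, keepdims=True) - K.mean(0) + K.mean()
    return Kz_c @ V / np.sqrt(lam)

def preimage(model, y, n_iter=100):
    # Fixed-point pre-image for the Gaussian kernel (centering offsets are
    # ignored for brevity): iterate a kernel-weighted mean of training points.
    X, lam, V, sigma = model["X"], model["lam"], model["V"], model["sigma"]
    gamma = V @ (y / np.sqrt(lam))          # expansion coefficients of the target
    z = X[np.argmax(np.abs(gamma))].copy()  # initialize at a training point
    for _ in range(n_iter):
        w = gamma * gaussian_kernel(z[None, :], X, sigma).ravel()
        if abs(w.sum()) < 1e-12:
            break
        z = (w @ X) / w.sum()
    return z

# Usage: embed held-out points, then map one embedding back to input space.
X = np.random.rand(100, 3)
model = kpca_fit(X, n_components=2, sigma=0.5)
Y_new = nystrom_embed(model, np.random.rand(5, 3))
z = preimage(model, Y_new[0])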