Constructive Approximation manuscript No. (will be inserted by the editor)
Power Series Kernels

Stored in:
Detailed bibliography
Title: Power Series Kernels
Authors: Barbara Zwicknagl
Contributors: The Pennsylvania State University CiteSeerX Archives
Source: http://www.mis.mpg.de/preprints/2006/preprint2006_155.pdf
Year of publication: 2006
Collection: CiteSeerX
Subjects: multivariate polynomial approximation, Bernstein theorem, dot product kernels, reproducing kernel Hilbert spaces, error bounds, convergence orders
Description: We introduce a class of analytic positive definite multivariate kernels which includes infinite dot product kernels as sometimes used in machine learning, certain new nonlinearly factorizable kernels, and a kernel which is closely related to the Gaussian. Each such kernel reproduces in a certain 'native' Hilbert space of multivariate analytic functions. If functions from this space are interpolated in scattered locations by translates of the kernel, we prove spectral convergence rates of the interpolants and all derivatives. By truncation of the power series of the kernel-based interpolants, we constructively generalize the classical Bernstein theorem concerning polynomial approximation of analytic functions to the multivariate case. An application to machine learning algorithms is presented.
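The abstract describes interpolating a function at scattered locations by translates of a positive definite kernel. The following is a minimal illustrative sketch of that setting, not code from the paper: it uses the exponential dot product kernel K(x, y) = exp(⟨x, y⟩) = Σₙ ⟨x, y⟩ⁿ/n! as one concrete example of a power series kernel, and all function names, node counts, and the target function are assumptions chosen for illustration.

```python
import numpy as np

def kernel(x, y):
    """Example power series (dot product) kernel: K(x, y) = exp(<x, y>)."""
    return np.exp(x @ y)

def interpolate(nodes, values):
    """Build the kernel interpolant s(x) = sum_j c_j K(x, x_j).

    The coefficients c solve the Gram system K c = f, where
    K_ij = kernel(x_i, x_j) and f_i are the data values.
    """
    gram = np.array([[kernel(xi, xj) for xj in nodes] for xi in nodes])
    coeffs = np.linalg.solve(gram, values)
    return lambda x: sum(c * kernel(x, xj) for c, xj in zip(coeffs, nodes))

rng = np.random.default_rng(0)
nodes = rng.uniform(-1.0, 1.0, size=(8, 2))      # scattered locations in R^2
f = lambda x: np.sin(x[0]) * np.cos(x[1])        # an analytic target function
s = interpolate(nodes, np.array([f(x) for x in nodes]))

# Residual of the interpolation conditions s(x_j) = f(x_j) at the nodes.
residual = max(abs(s(x) - f(x)) for x in nodes)
print(residual)
```

The interpolant reproduces the data at the nodes up to linear-algebra precision; the paper's results concern how fast such interpolants (and their derivatives) converge to f away from the nodes.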
Document type: text
File description: application/pdf
Language: English
Relation: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.86.5528; http://www.mis.mpg.de/preprints/2006/preprint2006_155.pdf
Availability: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.86.5528
http://www.mis.mpg.de/preprints/2006/preprint2006_155.pdf
Rights: Metadata may be used without restrictions as long as the oai identifier remains attached to it.
Accession number: edsbas.C6C1A6E3
Database: BASE