The kernel recursive least-squares algorithm


Bibliographic Details
Published in: IEEE Transactions on Signal Processing, Volume 52, Issue 8, pp. 2275-2285
Main Authors: Engel, Y., Mannor, S., Meir, R.
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2004
ISSN: 1053-587X, 1941-0476
Description
Summary: We present a nonlinear version of the recursive least squares (RLS) algorithm. Our algorithm performs linear regression in a high-dimensional feature space induced by a Mercer kernel and can therefore be used to recursively construct minimum mean-squared-error solutions to nonlinear least-squares problems that are frequently encountered in signal processing applications. In order to regularize solutions and keep the complexity of the algorithm bounded, we use a sequential sparsification process that admits a new input sample into the kernel representation only if its feature-space image cannot be sufficiently well approximated by combining the images of previously admitted samples. This sparsification procedure allows the algorithm to operate online, often in real time. We analyze the behavior of the algorithm, compare its scaling properties to those of support vector machines, and demonstrate its utility in solving two signal processing problems: time-series prediction and channel equalization.
DOI: 10.1109/TSP.2004.830985
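
The summary describes an online recursion: each new sample is tested for approximate linear dependence on the feature-space images of previously admitted samples, only samples that fail the test enter the kernel expansion, and every sample still updates the expansion coefficients recursively. As a rough illustration of that structure, here is a minimal NumPy sketch; the Gaussian kernel, the class name KRLS, the threshold name nu, and the specific update formulas are illustrative assumptions and are not taken from this record.

```python
import numpy as np


def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel; any Mercer kernel could be substituted."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return float(np.exp(-gamma * np.sum(diff ** 2)))


class KRLS:
    """Sketch of online kernel least squares with ALD-style sparsification."""

    def __init__(self, kernel=rbf_kernel, nu=1e-3):
        self.kernel = kernel
        self.nu = nu          # sparsification threshold (assumed name)
        self.dict_ = []       # admitted input samples (the "dictionary")
        self.K_inv = None     # inverse kernel matrix of the dictionary
        self.P = None         # covariance-like matrix of the RLS recursion
        self.alpha = None     # expansion coefficients over the dictionary

    def _kvec(self, x):
        return np.array([self.kernel(xi, x) for xi in self.dict_])

    def predict(self, x):
        return float(self._kvec(x) @ self.alpha)

    def update(self, x, y):
        if not self.dict_:
            ktt = self.kernel(x, x)
            self.dict_ = [x]
            self.K_inv = np.array([[1.0 / ktt]])
            self.P = np.array([[1.0]])
            self.alpha = np.array([y / ktt])
            return

        k = self._kvec(x)                    # kernel values to the dictionary
        a = self.K_inv @ k                   # best dictionary approximation of x
        delta = self.kernel(x, x) - k @ a    # approximation residual (ALD test)
        err = y - k @ self.alpha             # prediction error on the new sample

        if delta > self.nu:
            # x is not well approximated: admit it and grow all quantities
            # using the standard block-inverse formulas.
            m = len(self.dict_)
            self.dict_.append(x)
            K_inv = np.zeros((m + 1, m + 1))
            K_inv[:m, :m] = self.K_inv + np.outer(a, a) / delta
            K_inv[:m, m] = -a / delta
            K_inv[m, :m] = -a / delta
            K_inv[m, m] = 1.0 / delta
            self.K_inv = K_inv
            P = np.zeros((m + 1, m + 1))
            P[:m, :m] = self.P
            P[m, m] = 1.0
            self.P = P
            self.alpha = np.concatenate([self.alpha - a * err / delta,
                                         [err / delta]])
        else:
            # Dictionary unchanged: RLS-style update in the reduced basis.
            Pa = self.P @ a
            q = Pa / (1.0 + a @ Pa)
            self.P = self.P - np.outer(q, Pa)
            self.alpha = self.alpha + (self.K_inv @ q) * err


# Toy usage: learn a noisy sine online and check how sparse the model stays.
rng = np.random.default_rng(0)
model = KRLS(nu=1e-3)
for _ in range(500):
    xt = rng.uniform(0.0, 2.0 * np.pi)
    model.update(xt, np.sin(xt) + 0.05 * rng.normal())
print("dictionary size:", len(model.dict_))
print("prediction at pi/4:", model.predict(np.pi / 4))
```

In this sketch the threshold nu trades dictionary size against approximation accuracy: a larger value admits fewer samples, which bounds memory and per-step cost, at some cost in fit quality.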