Comparison of ℓ1-Norm SVR and Sparse Coding Algorithms for Linear Regression

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 26, No. 8, pp. 1828-1833
Main Authors: Qingtian Zhang, Xiaolin Hu, Bo Zhang
Format: Journal Article
Language: English
Published: IEEE, 1 August 2015
ISSN: 2162-237X, 2162-2388
Description
Abstract: Support vector regression (SVR) is a popular function estimation technique based on Vapnik's concept of the support vector machine. Among its many variants, the ℓ1-norm SVR is known to be good at selecting useful features when the features are redundant. Sparse coding (SC) is a technique widely used in many areas, for which a number of efficient algorithms are available. Both ℓ1-norm SVR and SC can be used for linear regression. In this brief, the close connection between ℓ1-norm SVR and SC is revealed, and some typical algorithms of each are compared for linear regression. The results show that the SC algorithms are more efficient than the Newton linear programming algorithm, an efficient ℓ1-norm SVR solver. The algorithms are then used to design radial basis function (RBF) neural networks. Experiments on several benchmark data sets demonstrate the high efficiency of the SC algorithms. In particular, one of the SC algorithms, orthogonal matching pursuit, is two orders of magnitude faster than a well-known RBF network design algorithm, the orthogonal least squares algorithm.
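To make the compared formulations concrete, below is a minimal sketch (an illustration, not the paper's implementation) of the two sparse linear-regression approaches on a redundant-feature problem. It uses scikit-learn's Lasso as a stand-in for the ℓ1-penalized objective (Lasso uses a squared loss rather than SVR's ε-insensitive loss) and OrthogonalMatchingPursuit for the SC side; the paper's own solvers, such as the Newton linear programming algorithm, are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import Lasso, OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n_samples, n_features, n_informative = 200, 50, 5

# Redundant-feature setting: only a few columns of X carry the signal.
X = rng.standard_normal((n_samples, n_features))
w_true = np.zeros(n_features)
w_true[:n_informative] = rng.standard_normal(n_informative)
y = X @ w_true + 0.01 * rng.standard_normal(n_samples)

# l1-regularized least squares: min_w ||y - Xw||^2 + alpha * ||w||_1,
# a close relative of the l1-norm SVR objective (squared loss here,
# epsilon-insensitive loss in SVR).
lasso = Lasso(alpha=0.01).fit(X, y)

# Orthogonal matching pursuit: greedily selects one feature (atom) per
# iteration; one of the SC algorithms the brief compares.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_informative).fit(X, y)

print("Lasso nonzero coefficients:", np.count_nonzero(lasso.coef_))
print("OMP   nonzero coefficients:", np.count_nonzero(omp.coef_))
```

Both solvers should recover a sparse weight vector supported on the informative features, which is the shared behavior that makes SC algorithms usable in place of ℓ1-norm SVR for linear regression.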
DOI: 10.1109/TNNLS.2014.2377245