A Robust Regression Framework with Laplace Kernel-Induced Loss

Bibliographic Details
Published in: Neural Computation, Volume 29, Issue 11, p. 3014
Main Authors: Yang, Liming; Ren, Zhuo; Wang, Yidan; Dong, Hongwei
Format: Journal Article
Language: English
Published: United States, 01.11.2017
ISSN: 1530-888X
Description
Summary: This work proposes a robust regression framework with a nonconvex loss function. Two regression formulations are presented based on the Laplace kernel-induced loss (LK-loss). Moreover, we show that the LK-loss closely approximates the zero-norm. However, the nonconvexity of the LK-loss makes it difficult to optimize. A continuous optimization method is developed to solve the proposed framework: the problems are formulated as DC (difference of convex functions) programs, and the corresponding DC algorithms (DCAs) converge linearly. Furthermore, the proposed algorithms are applied directly to determining the hardness of licorice seeds from near-infrared spectral data with noisy input. Experiments in eight spectral regions show that the proposed methods improve generalization over traditional support vector regression (SVR), especially in high-frequency regions. Experiments on several benchmark data sets demonstrate that the proposed methods outperform traditional regression methods on most of the data sets considered.
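The record does not reproduce the loss itself. Assuming the common Laplace kernel-induced form L_sigma(r) = 1 - exp(-|r|/sigma) (an assumption, not taken from this record), a minimal sketch of the loss, its zero-norm-like behavior, and one possible DC split that a DCA could linearize is:

```python
import math

def lk_loss(r, sigma=1.0):
    """Assumed Laplace kernel-induced loss: 1 - exp(-|r|/sigma).
    Bounded in [0, 1) and nonconvex in r; as sigma -> 0 it tends to the
    zero-norm indicator 1[r != 0], which caps the influence of outliers."""
    return 1.0 - math.exp(-abs(r) / sigma)

# One standard DC (difference-of-convex) split, lk_loss = g - h,
# with both parts convex in r:
def g(r, sigma=1.0):
    return abs(r) / sigma                         # convex part

def h(r, sigma=1.0):
    return abs(r) / sigma - lk_loss(r, sigma)     # convex remainder
```

In a DCA iteration, h would be replaced by its affine minorant at the current iterate and the resulting convex (weighted L1-type) subproblem solved; this is only a sketch of the generic scheme, not the authors' exact formulations.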
DOI: 10.1162/neco_a_01002