Classification and Regression Using an Outer Approximation Projection-Gradient Method

Published in: IEEE Transactions on Signal Processing, Vol. 65, No. 17, pp. 4635-4644
Main Authors: Barlaud, Michel; Belhajali, Wafa; Combettes, Patrick L.; Fillatre, Lionel
Format: Journal Article
Language: English
Published: IEEE, 1 September 2017
ISSN: 1053-587X, 1941-0476
Description
Summary: This paper deals with sparse feature selection and grouping for classification and regression. The classification or regression problems under consideration consist of minimizing a convex empirical risk function subject to an ℓ1 constraint, a pairwise ℓ∞ constraint, or a pairwise ℓ1 constraint. Existing work, such as the Lasso formulation, has focused mainly on Lagrangian penalty approximations, which often require ad hoc or computationally expensive procedures to determine the penalization parameter. We depart from this approach and address the constrained problem directly via a splitting method. The structure of the method is that of the classical gradient-projection algorithm, which alternates a gradient step on the objective and a projection step onto the lower level set modeling the constraint. The novelty of our approach is that the projection step is implemented via an outer approximation scheme in which the constraint set is approximated by a sequence of simple convex sets consisting of the intersection of two half-spaces. Convergence of the iterates generated by the algorithm is established for a general smooth convex minimization problem with inequality constraints. Experiments on both synthetic and biological data show that our method outperforms penalty methods.
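To illustrate the constrained formulation the abstract describes, the sketch below runs a gradient-projection iteration for a least-squares risk under an ℓ1 constraint. Note this is an illustration only: the projection step here uses the standard exact sort-based ℓ1-ball projection (Duchi et al.), not the paper's outer approximation by intersections of two half-spaces; the function names, step size, and problem data are assumptions for the example.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Exact Euclidean projection of v onto {x : ||x||_1 <= radius}.

    Standard sort-based algorithm; the paper replaces this step with
    an outer approximation of the constraint set by half-spaces.
    """
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # magnitudes, descending
    css = np.cumsum(u)
    k = np.arange(1, v.size + 1)
    rho = np.nonzero(u - (css - radius) / k > 0)[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)  # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def projected_gradient(A, b, radius, step, iters=500):
    """Minimize 0.5*||Ax - b||^2 subject to ||x||_1 <= radius:
    alternate a gradient step on the objective with a projection
    onto the constraint's lower level set."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)               # gradient step
        x = project_l1_ball(x - step * grad, radius)  # projection step
    return x
```

A typical usage would take `step = 1 / ||A||^2` (reciprocal of the squared spectral norm), which guarantees descent for this smooth quadratic risk; the sparsity of the returned `x` is controlled directly by `radius` rather than by a Lagrangian penalty parameter.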
DOI: 10.1109/TSP.2017.2709262