A New DC Algorithm for Sparse Optimal Scoring Problem
| Published in: | IEEE Access, Volume 8; pp. 53962-53971 |
|---|---|
| Main authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Publication details: | Piscataway: IEEE, 2020 (The Institute of Electrical and Electronics Engineers, Inc.) |
| Subject: | |
| ISSN: | 2169-3536 |
| Online access: | Get full text |
| Summary: | Linear discriminant analysis (LDA) has attracted much attention as a classical tool for both classification and dimensionality reduction. Classical LDA performs quite well in simple, low-dimensional settings, but it is not suitable for small-sample-size (SSS) data. Feature selection is an effective way to address this problem. As a variant of LDA, sparse optimal scoring (SOS) with ℓ0-norm regularization is considered in this paper. By using a new continuous nonconvex nonsmooth function to approximate the ℓ0-norm, we propose a novel difference-of-convex-functions algorithm (DCA) for sparse optimal scoring. The most favorable property of the proposed DCA is that its subproblem admits an analytical solution. The effectiveness of the proposed method is validated via theoretical analysis as well as illustrative numerical experiments. |
|---|---|
| DOI: | 10.1109/ACCESS.2020.2981429 |
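
As background for the abstract above: a DCA (difference-of-convex-functions algorithm) splits a nonconvex objective into g - h with g and h convex, linearizes h at the current iterate, and solves the resulting convex subproblem. The sketch below is a hypothetical, minimal illustration of that template on a toy proximal problem where the ℓ0 penalty is replaced by a capped-ℓ1 surrogate, so each subproblem reduces to a single soft-thresholding step. This is not the algorithm of the paper: the SOS objective, the paper's surrogate function, and its closed-form subproblem all differ, and the names `dca_capped_l1` and `soft_threshold` are introduced here for illustration only.

```python
# Hypothetical illustration (not the authors' algorithm): DCA applied to the
# toy sparse problem  min_x 0.5*||x - z||^2 + lam * sum_j phi(|x_j|),
# where the l0 penalty is approximated by the capped-l1 surrogate
#   phi(t) = min(t/a, 1) = (1/a)*t - (1/a)*max(t - a, 0),
# which is an explicit difference of two convex functions.
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, i.e. the prox operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dca_capped_l1(z, lam=1.0, a=0.5, max_iter=100, tol=1e-8):
    """DCA for 0.5*||x - z||^2 + lam*sum_j min(|x_j|/a, 1).

    DC split: g(x) = 0.5*||x - z||^2 + (lam/a)*||x||_1   (convex)
              h(x) = (lam/a)*sum_j max(|x_j| - a, 0)     (convex)
    Each iteration linearizes h at x^k and minimizes g(x) - <v, x>,
    which here has the analytical solution soft_threshold(z + v, lam/a).
    """
    x = z.copy()
    for _ in range(max_iter):
        # A subgradient of h at the current iterate.
        v = (lam / a) * np.sign(x) * (np.abs(x) > a)
        # Convex subproblem solved in closed form.
        x_new = soft_threshold(z + v, lam / a)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A few large entries plus many near-zero entries.
    z = np.concatenate([rng.normal(3.0, 0.5, 5), rng.normal(0.0, 0.1, 45)])
    x = dca_capped_l1(z, lam=0.3, a=0.5)
    print("nonzeros:", np.count_nonzero(x))
```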