A Proximal-Proximal Majorization-Minimization Algorithm for Nonconvex Rank Regression Problems

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, Volume 71, pp. 3502-3517
Main Authors: Tang, Peipei; Wang, Chengjing; Jiang, Bo
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023
ISSN: 1053-587X, 1941-0476
Description
Summary: In this paper, we introduce a proximal-proximal majorization-minimization (PPMM) algorithm for nonconvex rank regression problems. The basic idea is to apply a proximal majorization-minimization algorithm to the nonconvex problem, with the inner subproblems solved by a proximal point algorithm (PPA) based on a sparse semismooth Newton (SSN) method. We adopt a sequential regularization technique and design an implementable stopping criterion to overcome the singularity of the inner subproblems; this stopping criterion is essential to the success of the algorithm. Furthermore, we prove that the PPMM algorithm converges to a stationary point and, owing to the Kurdyka-Łojasiewicz (KL) property of the problem, we establish its convergence rate. Numerical experiments demonstrate that the proposed algorithm outperforms existing state-of-the-art algorithms.
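
The abstract describes the PPMM scheme only at a high level, so the following is a minimal illustrative sketch of a generic proximal majorization-minimization outer loop. The toy least-squares loss, the MCP-style penalty, and the plain proximal-gradient inner solver are assumptions chosen for exposition; they stand in for the paper's nonsmooth rank regression loss and its sparse semismooth Newton based proximal point (SSN-PPA) inner solver, which are not reproduced here. Each outer iteration majorizes the concave part of the penalty at the current iterate and adds a proximal term, mirroring the "proximal majorization-minimization" structure named in the abstract.

    import numpy as np

    # Illustrative sketch only: a generic proximal majorization-minimization (MM)
    # outer loop for a composite nonconvex problem
    #     min_x  (1/2)||Ax - b||^2 + sum_i p(x_i),
    # where p is an MCP-style nonconvex penalty written as lam*|t| + q(t) with q
    # smooth and concave.  The loss, penalty, and inner solver are assumptions for
    # exposition; the paper's PPMM handles a nonsmooth rank regression loss and
    # solves its inner subproblems with an SSN-based proximal point algorithm.

    def mcp_concave_grad(x, lam, gamma):
        """Gradient of the smooth concave part q of the MCP penalty (assumed example)."""
        a = np.abs(x)
        return np.sign(x) * np.where(a <= gamma * lam, -a / gamma, -lam)

    def soft_threshold(z, tau):
        """Proximal operator of tau * ||.||_1."""
        return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

    def proximal_mm(A, b, lam=0.1, gamma=3.0, sigma=1.0,
                    outer_iters=50, inner_iters=200):
        """Outer loop: majorize q at x_k by its linearization, add the proximal
        term (sigma/2)||x - x_k||^2, and approximately solve the resulting convex
        subproblem with a plain proximal-gradient method (stand-in inner solver)."""
        n = A.shape[1]
        x = np.zeros(n)
        step = 1.0 / (np.linalg.norm(A, 2) ** 2 + sigma)   # 1 / Lipschitz constant
        for _ in range(outer_iters):
            x_k = x.copy()
            w = mcp_concave_grad(x_k, lam, gamma)          # linearized concave part
            for _ in range(inner_iters):
                grad = A.T @ (A @ x - b) + w + sigma * (x - x_k)
                x = soft_threshold(x - step * grad, step * lam)
            if np.linalg.norm(x - x_k) <= 1e-8 * (1.0 + np.linalg.norm(x_k)):
                break
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.standard_normal((100, 50))
        x_true = np.zeros(50)
        x_true[:5] = 1.0
        b = A @ x_true + 0.01 * rng.standard_normal(100)
        x_hat = proximal_mm(A, b)
        print("recovered support:", np.nonzero(np.abs(x_hat) > 1e-3)[0])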
DOI: 10.1109/TSP.2023.3315454