Generalized Nonconvex Nonsmooth Low-Rank Matrix Recovery Framework With Feasible Algorithm Designs and Convergence Analysis
| Published in: | IEEE Transactions on Neural Networks and Learning Systems, Vol. 34, no. 9, pp. 5342-5353 |
|---|---|
| Main Authors: | , , , , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.09.2023 |
| Subjects: | |
| ISSN: | 2162-237X, 2162-2388 |
| Summary: | Decomposing a data matrix into a low-rank matrix plus an additive matrix is a commonly used strategy in pattern recognition and machine learning. This article mainly studies the alternating direction method of multipliers (ADMM) with two dual variables, which is used to optimize generalized nonconvex nonsmooth low-rank matrix recovery problems. Furthermore, a minimization framework with a feasible optimization procedure is designed along with a theoretical analysis, in which the variable sequences generated by the proposed ADMM are proved to be bounded. Most importantly, it follows from the Bolzano-Weierstrass theorem that there must exist a subsequence converging to a critical point, which satisfies the Karush-Kuhn-Tucker (KKT) conditions. Meanwhile, we further establish the local and global convergence properties of the generated sequence by constructing a potential objective function. In particular, the detailed convergence analysis is regarded as one of the core contributions, besides the algorithm designs and the model generality. Finally, numerical simulations and real-world applications are both provided to verify the consistency of the theoretical results, and we also validate the superiority in performance over several closely related solvers on the tasks of image inpainting and subspace clustering. |
|---|---|
| ISSN: | 2162-237X, 2162-2388 |
| DOI: | 10.1109/TNNLS.2022.3183970 |
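The abstract describes an ADMM scheme for splitting a data matrix into a low-rank component plus an additive component. As a rough illustration of that general idea only (not the paper's generalized nonconvex algorithm, which uses two dual variables and nonconvex surrogates of the rank), the sketch below solves the classical convex special case, robust PCA with a nuclear-norm plus l1 objective, via a single-dual-variable ADMM. All function names and parameter choices here are assumptions made for illustration.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Entrywise soft thresholding: proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca_admm(D, lam=None, mu=1.0, n_iter=200, tol=1e-7):
    """Decompose D into low-rank L plus sparse S by minimizing
    ||L||_* + lam * ||S||_1  subject to  L + S = D  (convex surrogate)."""
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))   # common default weight for the sparse term
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)                 # dual variable for the constraint L + S = D
    for _ in range(n_iter):
        L = svt(D - S + Y / mu, 1.0 / mu)       # low-rank block update
        S = soft(D - L + Y / mu, lam / mu)      # sparse block update
        R = D - L - S                           # primal residual
        Y = Y + mu * R                          # dual ascent step
        if np.linalg.norm(R, 'fro') <= tol * np.linalg.norm(D, 'fro'):
            break
    return L, S
```

In this convex sketch each subproblem has a closed-form proximal solution; the paper's framework instead admits nonconvex nonsmooth surrogates and analyzes boundedness and KKT-point convergence of the resulting two-dual-variable ADMM iterates.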