An algorithm for low-rank matrix factorization and its applications


Detailed bibliography
Published in: Neurocomputing (Amsterdam), Volume 275, pp. 1012-1020
Main authors: Chen, Baiyu; Yang, Zi; Yang, Zhouwang
Format: Journal Article
Language: English
Publication details: Elsevier B.V., 31 Jan 2018
ISSN: 0925-2312, 1872-8286
Description
Summary: This paper proposes an effective and fast algorithm for low-rank matrix factorization. Low-rank matrix factorization has many applications, and numerous algorithms have been developed to solve it. However, many of these algorithms do not use the rank directly; instead, they minimize a nuclear norm via Singular Value Decomposition (SVD), which incurs a high computational cost. In addition, these algorithms often fix the dimension of the factorized matrix, so one must first find an optimal dimension in order to obtain a solution. Unfortunately, the optimal dimension is unknown in many practical problems, such as matrix completion and recommender systems. It is therefore necessary to develop a faster algorithm that can also estimate the optimal dimension. In this paper, we use the Hidden Matrix Factorized Augmented Lagrangian Method to solve low-rank matrix factorizations. We also add a mechanism that dynamically estimates the optimal dimension and adjusts it while the algorithm is running. Moreover, in the era of Big Data, large, sparse datasets are becoming increasingly common; on such highly sparse data, our algorithm has the potential to be more effective than other algorithms. We applied it to several practical problems, e.g., Low-Rank Representation (LRR) and constrained matrix completion. In numerical experiments, the algorithm performed well on both synthetic and real-world data.
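As a point of reference for the SVD-based approach the summary contrasts against: nuclear-norm minimization typically iterates a singular value thresholding step (the proximal operator of the nuclear norm), and each step requires a full SVD. This is a minimal illustrative sketch of that step only, not the paper's algorithm; the function name `svt` and the toy data are our own.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox of tau * nuclear norm at M.
    Each call needs a full SVD, which is the per-iteration cost that
    nuclear-norm methods pay and direct factorization methods avoid."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # soft-threshold the singular values
    return U @ np.diag(s_shrunk) @ Vt

# Toy example: a rank-2 matrix plus small noise. Thresholding zeroes the
# small noise singular values, leaving a low-rank estimate.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
X = svt(A + 0.01 * rng.standard_normal((50, 40)), tau=0.5)
print(np.linalg.matrix_rank(X, tol=1e-6))
```

Because every iteration repeats this SVD, the cost grows quickly with matrix size, which motivates the factorization-based formulation described in the summary.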
DOI: 10.1016/j.neucom.2017.09.052