Symmetric tensor decomposition by alternating gradient descent

Detailed bibliography
Published in: Numerical Linear Algebra with Applications, Volume 29, Issue 1
Main author: Liu, Haixia
Format: Journal Article
Language: English
Published: Oxford: Wiley Subscription Services, Inc., 01.01.2022
ISSN: 1070-5325, 1099-1506
Description
Summary: The symmetric tensor decomposition problem is fundamental in many fields and has attracted considerable interest. In general, a greedy algorithm is used for tensor decomposition: first find the largest singular value and the corresponding singular vector, subtract the corresponding rank-one component from the tensor, and repeat the process. In this article, we focus on designing an effective algorithm and giving its convergence analysis. We introduce an exceedingly simple and fast algorithm for the rank-one approximation of a symmetric tensor. Through variable splitting, we recast the symmetric tensor decomposition problem as a multiconvex optimization problem, which we solve with an alternating gradient descent algorithm. Although we focus on symmetric tensors in this article, the method can be extended to nonsymmetric tensors in some cases. Additionally, we give a theoretical analysis of the alternating gradient descent algorithm and prove that it converges linearly to the global minimizer. We also provide numerical results to show the effectiveness of the algorithm.
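The summary describes a two-level scheme: a greedy outer loop that peels off one rank-one term at a time, and an inner loop that fits each term by alternating gradient descent on split factors. Below is a minimal NumPy sketch of that generic recipe, not the paper's exact iteration; the function names rank_one_agd and greedy_decompose, the step-size choice, and the iteration count steps are illustrative assumptions.

import numpy as np

def rank_one_agd(T, steps=200, seed=0):
    # Fit T ≈ u ∘ v ∘ w (outer product) by alternating gradient steps on
    # the split factors u, v, w. Illustrative sketch, not the paper's
    # exact method.
    rng = np.random.default_rng(seed)
    n = T.shape[0]
    u = rng.standard_normal(n)
    v = u.copy()   # identical initialization keeps the factors aligned
    w = u.copy()   # when T is symmetric
    for _ in range(steps):
        # Each block objective 0.5*||T - u∘v∘w||^2 is an isotropic
        # quadratic in its block with Lipschitz constant
        # L = ||v||^2 ||w||^2, so a gradient step of size 1/L is stable
        # (it lands exactly on the block minimizer).
        R = T - np.einsum('i,j,k->ijk', u, v, w)          # residual
        u = u + np.einsum('ijk,j,k->i', R, v, w) / ((v @ v) * (w @ w) + 1e-12)
        R = T - np.einsum('i,j,k->ijk', u, v, w)
        v = v + np.einsum('ijk,i,k->j', R, u, w) / ((u @ u) * (w @ w) + 1e-12)
        R = T - np.einsum('i,j,k->ijk', u, v, w)
        w = w + np.einsum('ijk,i,j->k', R, u, v) / ((u @ u) * (v @ v) + 1e-12)
    # For symmetric T the three factors should agree up to scale;
    # average their directions and compute the best scalar weight.
    x = sum(f / np.linalg.norm(f) for f in (u, v, w))
    x /= np.linalg.norm(x)
    lam = np.einsum('ijk,i,j,k->', T, x, x, x)            # <T, x⊗x⊗x>
    return lam, x

def greedy_decompose(T, rank):
    # Greedy deflation: extract a rank-one term, subtract it, repeat.
    terms = []
    for _ in range(rank):
        lam, x = rank_one_agd(T)
        T = T - lam * np.einsum('i,j,k->ijk', x, x, x)
        terms.append((lam, x))
    return terms

As a quick sanity check, one can build a symmetric tensor T as a sum of a few terms lam_r * x_r⊗x_r⊗x_r with random unit vectors x_r and verify that the residual norm shrinks after each deflation step in greedy_decompose.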
Bibliography: Funding information
Hubei Key Laboratory of Engineering Modeling and Scientific Computing; National Natural Science Foundation of China, 11901220
DOI: 10.1002/nla.2406