Symmetric tensor decomposition by alternating gradient descent


Bibliographic Details
Published in: Numerical Linear Algebra with Applications, Vol. 29, No. 1
Main Author: Liu, Haixia
Format: Journal Article
Language: English
Published: Oxford: Wiley Subscription Services, Inc., 01.01.2022
ISSN: 1070-5325, 1099-1506
Description
Summary: The symmetric tensor decomposition problem is fundamental in many fields, which makes it appealing to investigate. In general, a greedy algorithm is used for tensor decomposition: first find the largest singular value and the corresponding singular vector, subtract the corresponding rank-one component from the tensor, and then repeat the process. In this article, we focus on designing an effective algorithm and giving its convergence analysis. We introduce an exceedingly simple and fast algorithm for the rank-one approximation step of symmetric tensor decomposition. Through variable splitting, we recast the symmetric tensor decomposition problem as a multiconvex optimization problem, which we solve with an alternating gradient descent algorithm. Although we focus on symmetric tensors in this article, the method can be extended to nonsymmetric tensors in some cases. We also give a theoretical analysis of our alternating gradient descent algorithm, proving that it converges linearly to the global minimizer. Numerical results demonstrate the effectiveness of the algorithm.
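The greedy rank-one step described in the summary can be sketched as follows. This is only an illustration of the general idea, not the paper's actual method: the record does not give the paper's exact split objective, step sizes, or convergence conditions, so the objective ||T - u ∘ v ∘ w||_F² (one split copy of the symmetric factor per mode), the fixed learning rate, and the function name `rank1_sym_agd` are all assumptions made for this sketch.

```python
import numpy as np

def rank1_sym_agd(T, iters=2000, lr=0.1, seed=0):
    """Greedy rank-one step: fit T ~ u o v o w by alternating gradient
    descent on f(u, v, w) = ||T - u o v o w||_F^2, where u, v, w are
    split copies of the symmetric factor (hypothetical objective; the
    paper's exact formulation is not given in this record)."""
    n = T.shape[0]
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)        # unit-norm init keeps the steps stable
    v, w = u.copy(), u.copy()
    for _ in range(iters):
        # one gradient step per block, refreshing the residual each time
        R = T - np.einsum('i,j,k->ijk', u, v, w)
        u = u + lr * np.einsum('ijk,j,k->i', R, v, w)
        R = T - np.einsum('i,j,k->ijk', u, v, w)
        v = v + lr * np.einsum('ijk,i,k->j', R, u, w)
        R = T - np.einsum('i,j,k->ijk', u, v, w)
        w = w + lr * np.einsum('ijk,i,j->k', R, u, v)
    return np.einsum('i,j,k->ijk', u, v, w)  # best rank-one component found
```

Subtracting the returned rank-one component from T and calling the routine again on the residual gives the greedy deflation loop the summary describes.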
Funding information: Hubei Key Laboratory of Engineering Modeling and Scientific Computing; National Natural Science Foundation of China, Grant 11901220
DOI: 10.1002/nla.2406