Gradient-based Learning Methods Extended to Smooth Manifolds Applied to Automated Clustering


Bibliographic Details
Published in:The Journal of artificial intelligence research Vol. 68; pp. 777 - 816
Main Authors: Koudounas, Alkis, Fiori, Simone
Format: Journal Article
Language:English
Published: San Francisco: AI Access Foundation, 17.08.2020
Subjects:
ISSN:1076-9757, 1943-5037
Description
Summary:Grassmann-manifold-based sparse spectral clustering is a classification technique that consists of learning a sparse latent representation of data, formed by a subspace basis. To learn such a representation, spectral clustering is formulated as a loss-minimization problem over a smooth manifold known as the Grassmannian. This minimization problem cannot be tackled by traditional gradient-based learning algorithms, which are only suitable for optimization in the absence of constraints among the parameters. It is therefore necessary to develop specific optimization/learning algorithms that can efficiently seek a local minimum of a loss function under smooth constraints. Such a need calls for manifold optimization methods. In this paper, we extend classical gradient-based learning algorithms on flat parameter spaces (from classical gradient descent to adaptive momentum) to curved spaces (smooth manifolds) by means of tools from manifold calculus. We compare the clustering performance of these methods with that of known methods from the scientific literature. The obtained results confirm that the proposed learning algorithms are computationally lighter than existing ones without detriment to clustering efficacy.
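The Riemannian gradient-descent scheme the abstract alludes to can be illustrated with a minimal sketch. This is not the paper's spectral-clustering algorithm: it uses a simple Rayleigh-quotient loss (whose minimizer spans a dominant eigenspace), a tangent-space projection of the Euclidean gradient, and a QR-based retraction; all function names and the choice of loss are illustrative assumptions.

```python
import numpy as np

# Points on the Grassmann manifold Gr(n, p) are represented by
# n x p matrices X with orthonormal columns (X^T X = I_p).

def riemannian_gradient(X, euclid_grad):
    # Project the Euclidean gradient onto the tangent space at X:
    # rgrad = (I - X X^T) egrad
    return euclid_grad - X @ (X.T @ euclid_grad)

def retract(X, step):
    # QR-based retraction: map the stepped point back onto the manifold.
    Q, _ = np.linalg.qr(X + step)
    return Q

def grassmann_gradient_descent(A, p, lr=0.02, iters=2000, seed=0):
    # Minimize f(X) = -trace(X^T A X) over Gr(n, p); for symmetric A
    # the minimizer spans the dominant p-dimensional eigenspace.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))  # random start
    for _ in range(iters):
        egrad = -2.0 * (A @ X)                  # Euclidean gradient of f
        rgrad = riemannian_gradient(X, egrad)   # tangent-space projection
        X = retract(X, -lr * rgrad)             # descent step + retraction
    return X

# Example: recover the top-2 eigenspace of a random symmetric matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6))
A = M + M.T
X = grassmann_gradient_descent(A, p=2)

# Compare with the top-2 eigenvectors from an eigendecomposition:
# when the subspaces coincide, ||U^T X||_F is close to sqrt(2).
_, V = np.linalg.eigh(A)
U = V[:, -2:]
overlap = np.linalg.norm(U.T @ X)
```

The projection-plus-retraction pattern is what replaces the unconstrained update `X -= lr * grad` of flat-space gradient descent: the projection keeps the step in the tangent space, and the retraction returns the iterate to the manifold.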
DOI:10.1613/jair.1.12192