Gradient-based Learning Methods Extended to Smooth Manifolds Applied to Automated Clustering

Bibliographic Details
Published in: The Journal of Artificial Intelligence Research, Vol. 68, pp. 777-816
Authors: Koudounas, Alkis; Fiori, Simone
Format: Journal Article
Language: English
Published: San Francisco: AI Access Foundation, 17.08.2020
ISSN: 1076-9757, 1943-5037
Online access: Full text
Description
Abstract: Grassmann-manifold-based sparse spectral clustering is a classification technique that consists in learning a sparse latent representation of data, formed by a subspace basis. To learn such a representation, spectral clustering is formulated as a loss-minimization problem over a smooth manifold known as the Grassmannian. This minimization problem cannot be tackled by traditional gradient-based learning algorithms, which are only suitable for optimization in the absence of constraints among parameters. It is therefore necessary to develop specific optimization/learning algorithms that can efficiently search for a local minimum of a loss function under smooth constraints. Such a need calls for manifold optimization methods. In this paper, we extend classical gradient-based learning algorithms on flat parameter spaces (from classical gradient descent to adaptive momentum) to curved spaces (smooth manifolds) by means of tools from manifold calculus. We compare the clustering performance of these methods with that of known methods from the scientific literature. The obtained results confirm that the proposed learning algorithms are computationally lighter than existing ones without detriment to clustering efficacy.
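
The abstract refers to gradient-based learning carried out directly on the Grassmann manifold. As a rough illustration of the kind of update such methods use (a Riemannian gradient obtained by tangent-space projection, followed by a retraction), the following Python sketch minimizes the standard spectral-clustering trace loss tr(X^T L X) over the Grassmannian. The function name, step size, loss, and QR retraction are illustrative assumptions and are not taken from the paper.

import numpy as np

def grassmann_gradient_descent(L, p, step=0.05, iters=300, seed=0):
    # Minimize tr(X^T L X) over the Grassmann manifold Gr(n, p) by
    # Riemannian (projected) gradient descent with a QR retraction.
    # Generic illustration only; not the algorithm proposed in the paper.
    rng = np.random.default_rng(seed)
    n = L.shape[0]
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))  # random orthonormal start
    for _ in range(iters):
        egrad = 2.0 * L @ X                        # Euclidean gradient of tr(X^T L X)
        rgrad = egrad - X @ (X.T @ egrad)          # project onto the tangent space at X
        X, _ = np.linalg.qr(X - step * rgrad)      # retract the step back onto the manifold
    return X                                       # orthonormal basis of the learned subspace

In a spectral-clustering pipeline, L would be the graph Laplacian built from a pairwise affinity matrix, p the number of clusters, and the rows of the returned basis X would then be clustered, for example with k-means.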
DOI: 10.1613/jair.1.12192