Limited Rank Matrix Learning, discriminative dimension reduction and visualization

Published in: Neural Networks, Vol. 26, pp. 159–173
Main authors: Bunte, Kerstin; Schneider, Petra; Hammer, Barbara; Schleif, Frank-Michael; Villmann, Thomas; Biehl, Michael
Format: Journal Article
Language: English
Publication details: Kidlington: Elsevier Ltd, 01.02.2012
ISSN: 0893-6080, 1879-2782
Description
Summary: We present an extension of the recently introduced Generalized Matrix Learning Vector Quantization algorithm. In the original scheme, adaptive square matrices of relevance factors parameterize a discriminative distance measure. We extend the scheme to matrices of limited rank, corresponding to low-dimensional representations of the data. This allows us to incorporate prior knowledge of the intrinsic dimension and to reduce the number of adaptive parameters efficiently. In particular, for very high-dimensional data, limiting the rank can reduce computation time and memory requirements significantly. Furthermore, two- or three-dimensional representations constitute an efficient visualization method for labeled data sets. The identification of a suitable projection is not treated as a pre-processing step but as an integral part of the supervised training. Several real-world data sets serve as an illustration and demonstrate the usefulness of the suggested method.
DOI: 10.1016/j.neunet.2011.10.001
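
The summary above describes the core idea in prose; the following is a minimal NumPy sketch (not the authors' code) of how a limited-rank matrix can parameterize a discriminative distance measure and simultaneously yield a low-dimensional representation. All names and shapes here are illustrative assumptions; in the method itself the transformation is adapted during supervised training, which this sketch does not show.

```python
# Illustrative sketch only: a rectangular matrix Omega (M x N, M <= N)
# parameterizes the squared distance d(w, x) = (x - w)^T Omega^T Omega (x - w),
# so the effective relevance matrix Lambda = Omega^T Omega has rank at most M,
# and Omega x gives an M-dimensional representation usable for visualization.
import numpy as np

def limited_rank_distance(x, w, omega):
    """Squared distance between data point x and prototype w under Omega."""
    diff = omega @ (x - w)      # project the difference into M dimensions
    return float(diff @ diff)   # equals (x - w)^T Omega^T Omega (x - w)

def low_dim_representation(data, omega):
    """M-dimensional projection of the data, e.g. M = 2 or 3 for plotting."""
    return data @ omega.T

# Assumed toy shapes: N = 10 input dimensions, target dimension M = 2.
rng = np.random.default_rng(0)
omega = rng.normal(size=(2, 10))   # limited-rank transformation (M x N)
x = rng.normal(size=10)            # a data point
w = rng.normal(size=10)            # a prototype vector
print(limited_rank_distance(x, w, omega))
```

With M = 2 or 3, the projected data returned by the (hypothetical) low_dim_representation helper can be scatter-plotted per class label, which is the visualization use case mentioned in the summary.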