HyperNP: Interactive Visual Exploration of Multidimensional Projection Hyperparameters


Bibliographic Details
Published in: Computer Graphics Forum, Vol. 41, Issue 3, pp. 169-181
Authors: Appleby, G., Espadoto, M., Chen, R., Goree, S., Telea, A. C., Anderson, E. W., Chang, R.
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd, 01 June 2022
ISSN: 0167-7055, 1467-8659
Online access: Full text
Description
Abstract: Projection algorithms such as t-SNE or UMAP are useful for the visualization of high-dimensional data, but depend on hyperparameters which must be tuned carefully. Unfortunately, iteratively recomputing projections to find the optimal hyperparameter values is computationally intensive and unintuitive due to the stochastic nature of such methods. In this paper we propose HyperNP, a scalable method that allows for real-time interactive hyperparameter exploration of projection methods by training neural network approximations. A HyperNP model can be trained on a fraction of the total data instances and hyperparameter configurations that one would like to investigate and can compute projections for new data and hyperparameters at interactive speeds. HyperNP models are compact in size and fast to compute, thus allowing them to be embedded in lightweight visualization systems. We evaluate HyperNP across three datasets in terms of accuracy and speed. The results suggest that HyperNP models are accurate, scalable, interactive, and appropriate for use in real-world settings.
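
As a rough illustration of the approach described in the abstract (not the authors' implementation), the sketch below assumes a small PyTorch MLP that takes a high-dimensional data point concatenated with a normalized hyperparameter value (e.g., t-SNE perplexity) and regresses onto 2D coordinates precomputed by the projection algorithm for a handful of hyperparameter settings. All class names, layer sizes, and the single-hyperparameter setup are hypothetical.

```python
# Hypothetical sketch of a neural projection approximator conditioned on a hyperparameter.
import torch
import torch.nn as nn

class ProjectionApproximator(nn.Module):
    def __init__(self, n_features: int, hidden: int = 256):
        super().__init__()
        # Input: data point (n_features) plus one hyperparameter value scaled to [0, 1].
        self.net = nn.Sequential(
            nn.Linear(n_features + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # output: 2D projection coordinates
        )

    def forward(self, x: torch.Tensor, hparam: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features), hparam: (batch, 1)
        return self.net(torch.cat([x, hparam], dim=1))

def train_step(model, optimizer, x, hparam, target_2d):
    """One regression step against projection coordinates precomputed
    (e.g., by t-SNE or UMAP) for a subset of instances and hyperparameter values."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x, hparam), target_2d)
    loss.backward()
    optimizer.step()
    return loss.item()
```

After training on a fraction of the instances and a few hyperparameter configurations, unseen (data point, hyperparameter) pairs can be projected with a single forward pass, which is what would make interactive hyperparameter exploration feasible in this sketch.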
DOI: 10.1111/cgf.14531