Generative autoencoder to prevent overregularization of variational autoencoder

Bibliographic Details
Published in: ETRI Journal, Vol. 47, No. 1, pp. 80-89
Main authors: Ko, YoungMin; Ko, SunWoo; Kim, YoungSoo
Format: Journal Article
Language: English
Published: Electronics and Telecommunications Research Institute (ETRI), 01.02.2025
한국전자통신연구원
ISSN: 1225-6463, 2233-7326
Online access: Full text
Description
Summary: In machine learning, data scarcity is a common problem, and generative models have the potential to solve it. The variational autoencoder is a generative model that performs variational inference to estimate a low-dimensional posterior distribution given high-dimensional data. Specifically, it optimizes the evidence lower bound, which comprises a regularization term and a reconstruction term, but the two terms are generally imbalanced. If the reconstruction error is not small enough for the reconstructions to belong to the data population, the generative model's performance cannot be guaranteed. We propose a generative autoencoder (GAE) that first uses an autoencoder to minimize the reconstruction error and then estimates the distribution using the latent vectors mapped onto a lower dimension through the encoder. We compare the Fréchet inception distance (FID) scores of the proposed GAE and nine other variational autoencoders on the MNIST, Fashion MNIST, CIFAR10, and SVHN datasets. The proposed GAE consistently outperforms the other methods on the MNIST (44.30), Fashion MNIST (196.34), and SVHN (77.53) datasets.
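The two-stage idea in the abstract (first minimize reconstruction error with a plain autoencoder, then fit a distribution to the encoder's latent vectors and sample from it) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: a linear autoencoder trained by gradient descent and a single Gaussian latent model stand in for whatever architectures and density estimator the paper actually uses, and all sizes and hyperparameters below are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 8, 2                 # samples, data dim, latent dim (assumed)
X = rng.normal(size=(n, d))         # toy stand-in for training data

# Stage 1: train a plain (linear) autoencoder on reconstruction error only.
W_enc = rng.normal(scale=0.1, size=(d, k))
W_dec = rng.normal(scale=0.1, size=(k, d))
lr = 0.01
for _ in range(500):
    Z = X @ W_enc                   # encode
    X_hat = Z @ W_dec               # decode
    err = X_hat - X
    # gradient descent on mean squared reconstruction error
    g_dec = Z.T @ err / n
    g_enc = X.T @ (err @ W_dec.T) / n
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

# Stage 2: estimate the latent distribution from the encoded training data
# (here: a single Gaussian fitted by moment matching).
Z = X @ W_enc
mu, cov = Z.mean(axis=0), np.cov(Z, rowvar=False)

# Generation: sample new latents and decode them into new data points.
Z_new = rng.multivariate_normal(mu, cov, size=5)
X_gen = Z_new @ W_dec
print(X_gen.shape)  # (5, 8)
```

Because the reconstruction objective is optimized on its own in stage 1, it is not traded off against a KL regularization term as in the VAE's evidence lower bound, which is the imbalance the abstract points to.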
Bibliography: Funding information
This study was supported by a research grant of Jeonju University in 2022.
https://doi.org/10.4218/etrij.2023-0375
DOI: 10.4218/etrij.2023-0375