Generative autoencoder to prevent overregularization of variational autoencoder

Bibliographic Details
Published in: ETRI Journal, Vol. 47, no. 1, pp. 80-89
Main Authors: Ko, YoungMin; Ko, SunWoo; Kim, YoungSoo
Format: Journal Article
Language: English
Published: Electronics and Telecommunications Research Institute (ETRI) 01.02.2025
ISSN: 1225-6463, 2233-7326
Description
Summary: In machine learning, data scarcity is a common problem, and generative models have the potential to solve it. The variational autoencoder is a generative model that performs variational inference to estimate a low-dimensional posterior distribution given high-dimensional data. Specifically, it optimizes the evidence lower bound, which comprises a regularization term and a reconstruction term, but the two terms are generally imbalanced. If the reconstruction error is not small enough for reconstructions to belong to the data population, generative performance cannot be guaranteed. We propose a generative autoencoder (GAE) that uses an autoencoder to first minimize the reconstruction error and then estimates the distribution of the latent vectors mapped onto a lower dimension through the encoder. We compare the Fréchet inception distance (FID) scores of the proposed GAE and nine other variational autoencoders on the MNIST, Fashion MNIST, CIFAR10, and SVHN datasets. The proposed GAE consistently outperforms the other methods on the MNIST (44.30), Fashion MNIST (196.34), and SVHN (77.53) datasets.
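The two-stage idea in the abstract — first train an autoencoder purely for reconstruction, then fit a distribution to the resulting latent codes and sample from it — can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's architecture: it uses a linear autoencoder on synthetic data and a single Gaussian over the latent space (the paper's encoder/decoder networks and density estimator may differ); all names and dimensions here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 points in 5-D lying near a 2-D subspace (a stand-in for images).
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5)) \
    + 0.01 * rng.normal(size=(100, 5))

# Stage 1: train a linear autoencoder (encoder W_e, decoder W_d) purely to
# minimize reconstruction error -- no KL/regularization term, unlike a VAE.
d, k = 5, 2
W_e = rng.normal(scale=0.1, size=(d, k))
W_d = rng.normal(scale=0.1, size=(k, d))
lr = 0.01
for _ in range(2000):
    Z = X @ W_e                 # encode
    err = Z @ W_d - X           # reconstruction residual
    W_d -= lr * (Z.T @ err) / len(X)
    W_e -= lr * (X.T @ (err @ W_d.T)) / len(X)

# Stage 2: estimate the latent distribution from the encoded training set
# (here a single Gaussian; the paper may use a richer density estimator).
Z = X @ W_e
mu, cov = Z.mean(axis=0), np.cov(Z, rowvar=False)

# Stage 3: generate new samples by drawing latent vectors and decoding them.
Z_new = rng.multivariate_normal(mu, cov, size=10)
X_new = Z_new @ W_d

recon_mse = float(np.mean((X @ W_e @ W_d - X) ** 2))
print(f"reconstruction MSE: {recon_mse:.4f}")
print("generated batch shape:", X_new.shape)
```

Because the reconstruction objective is optimized alone in Stage 1, no regularization term can pull the decoder away from faithful reconstruction; the generative behavior comes entirely from the density fitted to the latents afterward.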
Bibliography: Funding information
This study was supported by a research grant of Jeonju University in 2022.
https://doi.org/10.4218/etrij.2023-0375