Generative autoencoder to prevent overregularization of variational autoencoder
Saved in:
| Published in: | ETRI Journal, Volume 47, Issue 1, pp. 80–89 |
|---|---|
| Main Authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | Electronics and Telecommunications Research Institute (ETRI), 01.02.2025 (한국전자통신연구원) |
| Subjects: | |
| ISSN: | 1225-6463, 2233-7326 |
| Online Access: | Get full text |
| Summary: | In machine learning, data scarcity is a common problem, and generative models have the potential to solve it. The variational autoencoder (VAE) is a generative model that performs variational inference to estimate a low-dimensional posterior distribution given high-dimensional data. Specifically, it optimizes the evidence lower bound, which comprises a regularization term and a reconstruction term, but the two terms are generally imbalanced. If the reconstruction error is not small enough for the generated samples to belong to the population, the generative performance cannot be guaranteed. We propose a generative autoencoder (GAE) that first uses an autoencoder to minimize the reconstruction error and then estimates the distribution of the latent vectors mapped onto a lower dimension by the encoder. We compare the Fréchet inception distance (FID) scores of the proposed GAE and nine other variational autoencoders on the MNIST, Fashion MNIST, CIFAR10, and SVHN datasets. The proposed GAE consistently outperforms the other methods on the MNIST (44.30), Fashion MNIST (196.34), and SVHN (77.53) datasets. |
|---|---|
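The two-stage idea in the summary — first minimize reconstruction error with a plain autoencoder, then estimate a distribution over the encoded latent vectors and sample from it — can be illustrated with a minimal toy sketch. The linear model, synthetic data, and single-Gaussian latent density below are illustrative assumptions for exposition, not the architecture or density estimator used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional data lying near a 2-D subspace (assumption:
# a toy stand-in for real image data such as MNIST).
n, d, k = 500, 10, 2
basis = rng.normal(size=(k, d))
X = rng.normal(size=(n, k)) @ basis + 0.01 * rng.normal(size=(n, d))

# Stage 1: train a (toy, linear) autoencoder to minimize reconstruction
# error only -- no regularization term competing with reconstruction.
W_enc = 0.1 * rng.normal(size=(d, k))   # encoder weights
W_dec = 0.1 * rng.normal(size=(k, d))   # decoder weights
lr = 0.01
for _ in range(2000):
    Z = X @ W_enc                       # latent codes
    err = Z @ W_dec - X                 # reconstruction residual
    g_dec = Z.T @ err / n               # gradient of mean squared error
    g_enc = X.T @ (err @ W_dec.T) / n
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

recon_mse = np.mean((X @ W_enc @ W_dec - X) ** 2)

# Stage 2: with reconstruction already small, estimate the distribution
# of the encoded training set (here a single Gaussian fit to the codes).
Z = X @ W_enc
mu, cov = Z.mean(axis=0), np.cov(Z, rowvar=False)

# Generation: draw latents from the fitted density, then decode them.
z_new = rng.multivariate_normal(mu, cov, size=5)
x_new = z_new @ W_dec
```

Decoupling the two stages avoids the ELBO's built-in tension: the autoencoder is free to drive reconstruction error down, and density estimation happens afterwards in the latent space it produced.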
| Bibliography: | Funding information: This study was supported by a research grant of Jeonju University in 2022. |
|---|---|
| ISSN: | 1225-6463, 2233-7326 |
| DOI: | 10.4218/etrij.2023-0375 |