Posterior Collapse in Variational Gradient Origin Networks


Bibliographic Details
Published in: Proceedings - International Conference on Machine Learning and Applications (ICMLA), pp. 980-987
Main Authors: Clapham, Peter, Grzes, Marek
Format: Conference Proceeding
Language:English
Published: IEEE, 15 December 2023
ISSN:1946-0759
Description
Summary: Posterior collapse is a phenomenon in which the posterior distribution degenerates to the prior, degrading the quality of both latent encodings and generative models. While it is known to occur in Variational Autoencoders (VAEs), it was previously unknown whether it occurs in Variational Gradient Origin Networks (VGONs). The goal of this paper is to compare posterior collapse in Variational Gradient Origin Networks and Variational Autoencoders. By evaluating the latent encodings of VGONs against key posterior collapse metrics, our experiments reveal that VGONs do exhibit posterior collapse, both in the decline of the Kullback-Leibler divergence (KLD) and in the collapse of individual latent variables. Furthermore, the results show that VGONs and VAEs have a similar polarized regime, suggesting that the cause of posterior collapse is not specific to the architecture of the model used to find an encoding. These findings support the claim made in previous research that posterior collapse is a general issue affecting a wide range of latent variable models.
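To illustrate the kind of metric the abstract refers to: for a diagonal-Gaussian posterior, the KL divergence to a standard normal prior decomposes per latent dimension, and a dimension whose average KL falls near zero carries no information about the input, i.e. it has collapsed. The sketch below is illustrative only and not taken from the paper; the threshold `eps` and function names are hypothetical.

```python
import numpy as np

def gaussian_kl_per_dim(mu, logvar):
    # KL( N(mu, sigma^2) || N(0, I) ) for a diagonal Gaussian posterior,
    # computed per latent dimension and averaged over the batch:
    #   0.5 * (mu^2 + sigma^2 - 1 - log sigma^2)
    kl = 0.5 * (mu**2 + np.exp(logvar) - 1.0 - logvar)
    return kl.mean(axis=0)

def collapsed_dims(mu, logvar, eps=1e-2):
    # Flag a dimension as collapsed when its mean KL to the prior falls
    # below a small threshold eps (a hypothetical cutoff for illustration).
    kl = gaussian_kl_per_dim(mu, logvar)
    return np.where(kl < eps)[0]

# Example: a batch of 4 encodings in a 3-dim latent space where only
# dimension 0 deviates from the prior; dimensions 1 and 2 are collapsed.
mu = np.zeros((4, 3))
mu[:, 0] = 1.0
logvar = np.zeros((4, 3))
print(collapsed_dims(mu, logvar))  # -> [1 2]
```

The total KLD reported in posterior-collapse studies is the sum of these per-dimension terms, which is why both the aggregate decline and individual-variable collapse can be read from the same quantity.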
DOI:10.1109/ICMLA58977.2023.00145