Hierarchical decompositional mixtures of variational autoencoders

Ping Liang Tan, Robert Peharz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Variational autoencoders (VAEs) have received considerable attention, since they allow us to learn expressive neural density estimators effectively and efficiently. However, learning and inference in VAEs are still problematic due to the sensitive interplay between the generative model and the inference network. Since these problems generally become more severe in high dimensions, we propose a novel hierarchical mixture model over low-dimensional VAE experts. Our model decomposes the overall learning problem into many smaller problems, which are coordinated by the hierarchical mixture, represented by a sum-product network. In experiments, we show that our model outperforms classical VAEs on almost all of our experimental benchmarks. Moreover, we show that our model is highly data-efficient and degrades very gracefully in extremely low-data regimes.
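As a rough illustration of the architecture the abstract describes, below is a minimal PyTorch sketch (not the authors' implementation) of a shallow decompositional mixture: one sum node mixes K product nodes, each splitting the variables into two disjoint blocks modeled by small VAE experts whose ELBOs serve as leaf log-likelihood proxies. The class names, layer sizes, fixed two-way split, and toy data are all illustrative assumptions; the paper's model uses a richer, hierarchical sum-product network structure.

```python
# Minimal sketch of a sum-product mixture over VAE experts.
# Everything below (names, sizes, data) is illustrative, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallVAE(nn.Module):
    """VAE expert over a low-dimensional slice of the input (an SPN leaf)."""
    def __init__(self, d_in, d_z=2, d_h=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(d_in, d_h), nn.Tanh())
        self.mu, self.logvar = nn.Linear(d_h, d_z), nn.Linear(d_h, d_z)
        self.dec = nn.Sequential(nn.Linear(d_z, d_h), nn.Tanh(),
                                 nn.Linear(d_h, d_in))

    def elbo(self, x):
        """Per-sample ELBO, used as a log-likelihood proxy at the leaf."""
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        logits = self.dec(z)
        rec = -F.binary_cross_entropy_with_logits(
            logits, x, reduction='none').sum(-1)           # Bernoulli log-lik
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        return rec - kl

class MixtureOfVAEs(nn.Module):
    """One sum node over k product nodes; each product decomposes the
    variables into two disjoint blocks, each modeled by a VAE expert."""
    def __init__(self, d, k=3):
        super().__init__()
        self.split = d // 2
        self.left = nn.ModuleList(SmallVAE(self.split) for _ in range(k))
        self.right = nn.ModuleList(SmallVAE(d - self.split) for _ in range(k))
        self.logits = nn.Parameter(torch.zeros(k))  # sum-node weights

    def log_prob(self, x):
        xl, xr = x[:, :self.split], x[:, self.split:]
        # Product node: experts over disjoint scopes, so log-probs add.
        comp = torch.stack([l.elbo(xl) + r.elbo(xr)
                            for l, r in zip(self.left, self.right)], dim=-1)
        # Sum node: weighted mixture taken in log space.
        return torch.logsumexp(comp + F.log_softmax(self.logits, -1), dim=-1)

# Toy training loop on random binary data (illustrative only).
x = (torch.rand(256, 8) > 0.5).float()
model = MixtureOfVAEs(d=8, k=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(100):
    loss = -model.log_prob(x).mean()
    opt.zero_grad(); loss.backward(); opt.step()
print(f"final negative log-likelihood bound: {loss.item():.3f}")
```

Because the experts have disjoint scopes, the product node's log-density is simply the sum of the experts' log-densities, so the decomposition is exact at the SPN level; the approximation enters only through each leaf's ELBO, which keeps every individual learning problem low-dimensional.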

Original language: English
Title of host publication: 36th International Conference on Machine Learning, ICML 2019
Pages: 10701-10711
Number of pages: 11
ISBN (Electronic): 9781510886988
Publication status: Published - 1 Jan 2019
Event: 36th International Conference on Machine Learning (ICML 2019) - Long Beach, United States
Duration: 9 Jun 2019 - 15 Jun 2019
Conference number: 36

Publication series

Name: Proceedings of Machine Learning Research

Conference

Conference: 36th International Conference on Machine Learning (ICML 2019)
Abbreviated title: ICML 2019
Country/Territory: United States
City: Long Beach
Period: 9/06/19 - 15/06/19
