Hierarchical decompositional mixtures of variational autoencoders

Ping Liang Tan, Robert Peharz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer review

Abstract

Variational autoencoders (VAEs) have received considerable attention, since they allow us to learn expressive neural density estimators effectively and efficiently. However, learning and inference in VAEs are still problematic due to the sensitive interplay between the generative model and the inference network. Since these problems generally become more severe in high dimensions, we propose a novel hierarchical mixture model over low-dimensional VAE experts. Our model decomposes the overall learning problem into many smaller problems, which are coordinated by the hierarchical mixture, represented by a sum-product network. In experiments we show that our models outperform classical VAEs on almost all of our experimental benchmarks. Moreover, we show that our model is highly data-efficient and degrades very gracefully in extremely low data regimes.
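To make the architecture described above concrete, the following is a minimal, hypothetical sketch of the idea: several small VAEs, each modelling a low-dimensional slice of the input, combined by product nodes (factorisation over slices) and a single sum node (mixture), as in a sum-product network. Class names, dimensions, and the one-layer SPN structure are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: a one-layer sum-product structure over low-dimensional VAE experts.
import torch
import torch.nn as nn

class SmallVAE(nn.Module):
    """VAE expert over a low-dimensional slice of the input."""
    def __init__(self, x_dim, z_dim=2, h_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, x_dim))

    def elbo(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterisation trick
        recon = self.dec(z)
        rec = -0.5 * ((x - recon) ** 2).sum(-1)                 # Gaussian likelihood, unit variance
        kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum(-1)  # analytic KL to N(0, I)
        return rec - kl                                          # per-sample ELBO for this slice

class MixtureOfVAEs(nn.Module):
    """Sum node over K product nodes; each product node factorises the input into
    slices and assigns each slice to its own small VAE expert."""
    def __init__(self, x_dim, n_components=4, slice_size=4):
        super().__init__()
        self.slices = [(i, min(i + slice_size, x_dim)) for i in range(0, x_dim, slice_size)]
        self.experts = nn.ModuleList(
            nn.ModuleList(SmallVAE(b - a) for a, b in self.slices) for _ in range(n_components)
        )
        self.mix_logits = nn.Parameter(torch.zeros(n_components))

    def log_prob_lower_bound(self, x):
        comp = []
        for experts_k in self.experts:
            # Product node: independence across slices => sum of per-slice ELBOs.
            comp.append(sum(vae.elbo(x[:, a:b]) for vae, (a, b) in zip(experts_k, self.slices)))
        comp = torch.stack(comp, dim=-1)                      # (batch, K)
        logw = torch.log_softmax(self.mix_logits, dim=-1)     # mixture weights
        return torch.logsumexp(comp + logw, dim=-1)           # sum node

# Usage: maximise the lower bound on log-likelihood over a mini-batch.
model = MixtureOfVAEs(x_dim=16)
x = torch.randn(8, 16)
loss = -model.log_prob_lower_bound(x).mean()
loss.backward()
```

The sketch illustrates only the decomposition principle: each expert sees a small slice, so its generative model and inference network stay low-dimensional, while the mixture weights coordinate the experts globally.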

Original language: English
Title: 36th International Conference on Machine Learning, ICML 2019
Publisher: Proceedings of Machine Learning Research
Pages: 10701-10711
Number of pages: 11
Electronic ISBN: 9781510886988
Publication status: Published - 1 Jan 2019
Event: 36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States
Duration: 9 Jun 2019 - 15 Jun 2019

Conference

Conference: 36th International Conference on Machine Learning, ICML 2019
Country: United States
City: Long Beach
Period: 9/06/19 - 15/06/19

