Sum-product autoencoding: Encoding and decoding representations using sum-product networks

  • Antonio Vergari
  • Alejandro Molina
  • Robert Peharz
  • Kristian Kersting
  • Nicola Di Mauro
  • Floriana Esposito
Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Sum-Product Networks (SPNs) are a deep probabilistic architecture that up to now has been successfully employed for tractable inference. Here, we extend their scope towards unsupervised representation learning: we encode samples into continuous and categorical embeddings and show that they can also be decoded back into the original input space by leveraging MPE inference. We characterize when this Sum-Product Autoencoding (SPAE) leads to equivalent reconstructions and extend it towards dealing with missing embedding information. Our experimental results on several multi-label classification problems demonstrate that SPAE is competitive with state-of-the-art autoencoder architectures, even if the SPNs were never trained to reconstruct their inputs.
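To make the idea in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' code) of Sum-Product Autoencoding on a toy SPN: a single sum node mixing two product nodes of Bernoulli leaves over two binary variables. Encoding maps an input to the posterior responsibilities of the sum node's children (a categorical embedding); decoding follows the most probable branch and takes each leaf's mode, in the spirit of MPE inference. All names and the specific SPN structure here are illustrative assumptions.

```python
import numpy as np

# Toy SPN: root sum node with two product children, each a product
# of Bernoulli leaves over binary variables X1, X2. Illustrative only.
weights = np.array([0.6, 0.4])          # sum-node mixture weights
leaf_p = np.array([[0.9, 0.8],          # branch 0: P(X1=1), P(X2=1)
                   [0.1, 0.2]])         # branch 1: P(X1=1), P(X2=1)

def leaf_likelihood(x):
    # P(x | branch k) for each product child k (product of Bernoullis)
    return np.prod(leaf_p * x + (1 - leaf_p) * (1 - x), axis=1)

def encode(x):
    # Categorical embedding: posterior responsibilities of the
    # sum node's children given the observed sample x
    joint = weights * leaf_likelihood(x)
    return joint / joint.sum()

def decode(e):
    # MPE-style decoding: follow the branch the embedding deems most
    # probable, then take each Bernoulli leaf's most likely state
    k = int(np.argmax(e))
    return (leaf_p[k] >= 0.5).astype(int)

x = np.array([1, 1])
e = encode(x)          # embedding sums to 1 (a distribution over branches)
x_rec = decode(e)      # reconstruction in the original input space
```

On this toy model, samples that match a branch's modes are reconstructed exactly, which illustrates the paper's point that SPNs can act as autoencoders even though they were never trained with a reconstruction objective.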

Original language: English
Title of host publication: 32nd AAAI Conference on Artificial Intelligence, AAAI 2018
Publisher: AAAI Press
Pages: 4163-4170
Number of pages: 8
ISBN (Electronic): 9781577358008
Publication status: Published - 1 Jan 2018
Externally published: Yes
Event: 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 - New Orleans, United States
Duration: 2 Feb 2018 - 7 Feb 2018

Conference

Conference: 32nd AAAI Conference on Artificial Intelligence, AAAI 2018
Country/Territory: United States
City: New Orleans
Period: 2/02/18 - 7/02/18

Funding

Acknowledgements: The authors would like to thank the anonymous reviewers for their valuable feedback. RP acknowledges the support by Arm Ltd. AM and KK acknowledge the support by the DFG CRC 876 "Providing Information by Resource-Constrained Analysis", project B4. KK acknowledges the support by the Centre for Cognitive Science at the TU Darmstadt.

