Extended Variational Message Passing for Automated Approximate Bayesian Inference

Research output: Contribution to journal › Article › Academic › Peer-review


Abstract

Variational Message Passing (VMP) provides an automatable and efficient algorithmic framework for approximate Bayesian inference in factorized probabilistic models composed of conjugate exponential-family distributions. Automating Bayesian inference is important because many data processing problems can be formulated as inference tasks on a generative probabilistic model. However, accurate generative models may also contain deterministic, possibly nonlinear variable mappings and non-conjugate factor pairs that complicate the automatic execution of the VMP algorithm. In this paper, we show that executing VMP in such complex models hinges on the ability to compute the expectations of the statistics of hidden variables. We extend the applicability of VMP by approximating the required expectation quantities, where appropriate, by importance sampling and the Laplace approximation. As a result, the proposed Extended VMP (EVMP) approach supports automated, efficient inference for a very wide range of probabilistic model specifications. We implemented EVMP in the Julia language in the probabilistic programming package ForneyLab.jl and show through a number of examples that EVMP yields an almost universal inference engine for factorized probabilistic models.
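The core difficulty the abstract describes is computing expectations of statistics of a hidden variable when a nonlinear mapping breaks conjugacy. The following sketch illustrates one of the two remedies mentioned, self-normalized importance sampling, on a toy model of my own construction (it is not the paper's ForneyLab.jl implementation): a Gaussian prior on `z` and an observation `y = exp(z) + noise`, with the prior used as the proposal to approximate posterior moments such as E[z] and E[z²]. All variable names and the specific model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-conjugate model (illustrative, not from the paper):
#   prior:       z ~ N(0, 1)
#   likelihood:  y ~ N(exp(z), sigma_y^2)   -- nonlinear mapping exp(z)
# VMP messages need expectations like E[z] and E[z^2] under the
# intractable posterior; self-normalized importance sampling with the
# prior as proposal approximates them.
y_obs = 2.0
sigma_y = 0.5

def log_likelihood(z):
    return -0.5 * ((y_obs - np.exp(z)) / sigma_y) ** 2

n = 100_000
z = rng.normal(0.0, 1.0, size=n)      # draws from the prior proposal
logw = log_likelihood(z)
w = np.exp(logw - logw.max())         # stabilized importance weights
w /= w.sum()                          # self-normalize

E_z = np.sum(w * z)                   # approximates E[z | y_obs]
E_z2 = np.sum(w * z ** 2)             # approximates E[z^2 | y_obs]
var_z = E_z2 - E_z ** 2               # posterior variance estimate
```

Because the likelihood peaks near `z = log(2) ≈ 0.69` while the prior pulls toward zero, the estimated posterior mean lands between the two; the paper's Laplace-approximation alternative would instead fit a Gaussian at the posterior mode using its local curvature.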
Original language: English
Article number: 815
Number of pages: 36
Journal: Entropy
Volume: 23
Issue number: 7
DOIs
Publication status: Published - 26 Jun 2021

Keywords

  • Bayesian inference
  • Factor graphs
  • Variational message passing
  • Probabilistic programming
  • Variational inference
