Reparameterization gradient message passing

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


In this paper we consider efficient message passing-based inference in a factor graph representation of a probabilistic model. Current message passing methods, such as belief propagation, variational message passing, and expectation propagation, rely on analytically pre-computed message update rules. In practical models, it is often not feasible to analytically derive update rules for all factors in the graph, and as a result, efficient message passing-based inference cannot proceed. In related research on (non-message passing-based) inference, a "reparameterization trick" has led to a considerable extension of the class of models for which automated inference is possible. In this paper, we introduce Reparameterization Gradient Message Passing (RGMP), a new message passing method based on the reparameterization gradient. In most models, the large majority of messages can be derived analytically, and we resort to RGMP only when necessary. We argue that this kind of hybrid message passing leads naturally to low-variance gradients.
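To illustrate the core idea behind the reparameterization gradient that RGMP builds on, the sketch below is a minimal, generic Monte Carlo example (not code from the paper): it estimates the gradient of E_{z~N(mu, sigma^2)}[f(z)] by rewriting z = mu + sigma * eps with eps ~ N(0, 1), so the gradient passes through the deterministic transform rather than the sampling step. The function `f` and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparam_grad(mu, sigma, n_samples=100_000):
    """Reparameterization-gradient estimate of
    d/d(mu, sigma) E_{z ~ N(mu, sigma^2)}[f(z)] with f(z) = z^2 (toy choice).

    Reparameterize z = mu + sigma * eps, eps ~ N(0, 1), so that
    dz/dmu = 1 and dz/dsigma = eps, and differentiate under the expectation.
    """
    eps = rng.standard_normal(n_samples)
    z = mu + sigma * eps
    df_dz = 2.0 * z          # f'(z) for f(z) = z^2
    grad_mu = np.mean(df_dz * 1.0)    # chain rule via dz/dmu
    grad_sigma = np.mean(df_dz * eps) # chain rule via dz/dsigma
    return grad_mu, grad_sigma

# Sanity check: E[z^2] = mu^2 + sigma^2, so the exact gradients are
# d/dmu = 2*mu and d/dsigma = 2*sigma.
g_mu, g_sigma = reparam_grad(mu=1.5, sigma=0.5)
print(g_mu, g_sigma)
```

In a hybrid scheme like the one the abstract describes, an estimator of this kind would only replace the messages that lack closed-form update rules, while all other messages keep their analytical form, which is what keeps the overall gradient variance low.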
Original language: English
Title of host publication: EUSIPCO 2019 - 27th European Signal Processing Conference
Place of publication: Piscataway
Publisher: Institute of Electrical and Electronics Engineers
Number of pages: 5
ISBN (electronic): 9789082797039
Publication status: Published - Sep 2019
Event: 27th European Signal Processing Conference, EUSIPCO 2019 - A Coruña, Spain
Duration: 2 Sep 2019 – 6 Sep 2019
Conference number: 27


Conference: 27th European Signal Processing Conference, EUSIPCO 2019
Abbreviated title: EUSIPCO 2019
City: A Coruña

