Abstract
The Gamma mixture model is a flexible probability distribution for representing beliefs about scale variables such as precisions. Exact inference for all latent variables in the Gamma mixture model is non-trivial, as it leads to intractable update equations. This paper presents two variants of variational message passing-based inference in a Gamma mixture model, approximating the posterior distributions either by moment matching or by expectation-maximization. The proposed method supports automated inference in factor graphs for large probabilistic models that contain multiple Gamma mixture models as plug-in factors. The Gamma mixture model has been implemented in a factor graph package, and we present experimental results for both synthetic and real-world data sets.
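As a concrete (if simplified) illustration of the two ideas named in the abstract, the sketch below runs classical expectation-maximization on a Gamma mixture, with an M-step that fits each Gamma component by moment matching (shape = mean²/variance, rate = mean/variance). This is not the variational message-passing scheme of the paper, and the function name and structure are our own; it only shows the kind of fixed-point updates involved.

```python
import numpy as np
from scipy.special import gammaln


def em_gamma_mixture(x, k, n_iter=100):
    """EM for a k-component Gamma mixture over positive data x.

    Illustrative only: the M-step fits each Gamma by moment matching
    (shape = mean^2 / var, rate = mean / var) rather than exact MLE,
    and no convergence check is performed.
    """
    x = np.asarray(x, dtype=float)
    n = x.shape[0]
    # Break symmetry: hard-assign sorted data points to k equal groups.
    resp = np.zeros((n, k))
    for j, idx in enumerate(np.array_split(np.argsort(x), k)):
        resp[idx, j] = 1.0
    for _ in range(n_iter):
        # M-step: mixture weights and moment-matched Gamma parameters.
        nk = resp.sum(axis=0)
        w = nk / n
        m = (resp * x[:, None]).sum(axis=0) / nk             # component means
        v = (resp * (x[:, None] - m) ** 2).sum(axis=0) / nk  # component variances
        a = m ** 2 / v   # shape parameters
        b = m / v        # rate parameters
        # E-step: responsibilities under the current parameters.
        log_p = (np.log(w) + a * np.log(b) - gammaln(a)
                 + (a - 1.0) * np.log(x[:, None]) - b * x[:, None])
        log_p -= log_p.max(axis=1, keepdims=True)  # stabilise the exponentials
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
    return w, a, b
```

On data drawn from two well-separated Gamma components, the recovered component means `a / b` land close to the true means, which is the behaviour a message-passing implementation of the same model would also be checked against.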
Original language | English |
---|---|
Title of host publication | 2021 IEEE 31st International Workshop on Machine Learning for Signal Processing (MLSP) |
Publisher | Institute of Electrical and Electronics Engineers |
Number of pages | 6 |
ISBN (Electronic) | 978-1-7281-6338-3 |
ISBN (Print) | 978-1-6654-1184-4 |
DOIs | |
Publication status | Published - 15 Nov 2021 |
Event | 31st IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2021 (Conference number: 31) - Virtual, Gold Coast, Australia. Duration: 25 Oct 2021 → 28 Oct 2021. https://2021.ieeemlsp.org/ |
Conference
Conference | 31st IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2021 |
---|---|
Abbreviated title | MLSP 2021 |
Country/Territory | Australia |
City | Gold Coast |
Period | 25/10/21 → 28/10/21 |
Internet address | https://2021.ieeemlsp.org/ |
Keywords
- Mixture models
- Machine learning
- Signal processing
- Probabilistic logic
- Mathematical models
- Probability distribution
- Gamma Mixture Model
- Expectation-Maximization
- Factor Graphs
- Probabilistic Inference
- Message Passing
- Moment Matching