Abstract
Probabilistic programming provides a structured approach to signal processing algorithm design. The design task is formulated as a generative model, and the algorithm is derived through automatic inference. Efficient inference is a major challenge; for example, the Shafer-Shenoy (SS) algorithm performs poorly on models with large treewidth, which arise in many real-world problems. We focus on reducing the size of discrete models with large treewidth by storing intermediate factors in compressed form, thereby decoupling the variables through conditioning on introduced weights. This work proposes pruning these weights using the Kullback-Leibler divergence. We adapt a strategy from the Gaussian mixture reduction literature, leading to Kullback-Leibler Tensor Belief Propagation (KL-TBP), in which agglomerative hierarchical clustering is used to successively merge pairs of weights. Experiments on benchmark problems show that KL-TBP consistently achieves lower approximation error than existing methods at competitive runtime.
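The abstract describes the reduction step only at a high level, so the following is a rough illustrative sketch rather than the authors' implementation. It shows agglomerative merging of weighted discrete components using a Runnalls-style pairwise Kullback-Leibler cost, as in the Gaussian mixture reduction literature; the function names (`reduce_mixture`, `merge_cost`), the moment-matched merge rule, and the stopping criterion (a target number of weights) are all assumptions for illustration.

```python
import numpy as np


def kl_divergence(p, q, eps=1e-12):
    """KL divergence between two discrete distributions on the same support."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))


def merge_cost(w_i, p_i, w_j, p_j):
    """Cost of merging components i and j: weighted KL between each component
    and their moment-matched merge (a Runnalls-style upper bound, assumed here)."""
    w = w_i + w_j
    merged = (w_i * p_i + w_j * p_j) / w
    return w_i * kl_divergence(p_i, merged) + w_j * kl_divergence(p_j, merged)


def reduce_mixture(weights, components, target_k):
    """Agglomeratively merge the pair with the smallest cost until target_k weights remain."""
    weights = list(weights)
    components = [np.asarray(c, dtype=float) for c in components]
    while len(weights) > target_k and len(weights) >= 2:
        best = None
        for i in range(len(weights)):
            for j in range(i + 1, len(weights)):
                c = merge_cost(weights[i], components[i], weights[j], components[j])
                if best is None or c < best[0]:
                    best = (c, i, j)
        _, i, j = best
        w = weights[i] + weights[j]
        merged = (weights[i] * components[i] + weights[j] * components[j]) / w
        # Replace component i with the merge and drop component j.
        weights[i], components[i] = w, merged
        del weights[j]
        del components[j]
    return np.array(weights), np.stack(components)


# Hypothetical usage: reduce 6 weighted terms over 8 discrete states to 3.
rng = np.random.default_rng(0)
comps = rng.dirichlet(np.ones(8), size=6)   # 6 candidate component distributions
w = rng.dirichlet(np.ones(6))               # their weights
w_red, comps_red = reduce_mixture(w, comps, target_k=3)
```

Under these assumptions, the greedy pairwise merge mirrors agglomerative hierarchical clustering: each step commits to the single cheapest merge, trading global optimality for a simple, deterministic reduction of the introduced weights.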
Original language | English |
---|---|
Title | 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) |
Place of publication | Barcelona, Spain |
Publisher | Institute of Electrical and Electronics Engineers |
Pages | 5850-5854 |
Number of pages | 5 |
ISBN (electronic) | 978-1-5090-6631-5 |
ISBN (print) | 978-1-5090-6632-2 |
DOIs | |
Publication status | Published - 14 May 2020 |
Event | 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2020) - Virtual, Barcelona, Spain. Duration: 4 May 2020 → 8 May 2020. https://2020.ieeeicassp.org/ |
Conference
Conference | 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2020) |
---|---|
Abbreviated title | ICASSP 2020 |
Country/Territory | Spain |
City | Barcelona |
Period | 4/05/20 → 8/05/20 |
Internet address | https://2020.ieeeicassp.org/ |