Approximate Inference by Kullback-Leibler Tensor Belief Propagation

Research output: Chapter in Book/Report/Conference Proceeding › Conference contribution › Academic › peer review



Probabilistic programming provides a structured approach to signal processing algorithm design. The design task is formulated as a generative model, and the algorithm is derived through automatic inference. Efficient inference is a major challenge; e.g., the Shafer-Shenoy algorithm (SS) performs badly on models with large treewidth, which arise from various real-world problems. We focus on reducing the size of discrete models with large treewidth by storing intermediate factors in compressed form, thereby decoupling the variables through conditioning on introduced weights. This work proposes pruning of these weights using Kullback-Leibler divergence. We adapt a strategy from the Gaussian mixture reduction literature, leading to Kullback-Leibler Tensor Belief Propagation (KL-TBP), in which we use agglomerative hierarchical clustering to successively merge pairs of weights. Experiments using benchmark problems show KL-TBP consistently achieves lower approximation error than existing methods with competitive runtime.
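The abstract's core idea, agglomerative merging of weighted components under a Kullback-Leibler cost, can be illustrated on a toy discrete mixture. The sketch below is not the authors' implementation (which operates on weights introduced by compressing intermediate factors); it only shows the generic greedy reduction step borrowed from the Gaussian mixture reduction literature: repeatedly merge the pair of weighted distributions whose moment-matched merge incurs the smallest KL cost, until a target size is reached. All function names here are hypothetical.

```python
import numpy as np

def kl_merge_cost(w_i, p_i, w_j, p_j, eps=1e-12):
    """Cost of merging two weighted discrete components:
    w_i*KL(p_i || q) + w_j*KL(p_j || q), where q is the
    weighted-average (moment-matched) merged distribution."""
    w = w_i + w_j
    q = (w_i * p_i + w_j * p_j) / w
    def kl(p, r):
        return float(np.sum(p * (np.log(p + eps) - np.log(r + eps))))
    return w_i * kl(p_i, q) + w_j * kl(p_j, q), w, q

def agglomerative_reduce(weights, comps, k):
    """Greedily merge the cheapest pair of components (by the KL
    cost above) until only k weighted components remain."""
    weights = list(weights)
    comps = [np.asarray(c, dtype=float) for c in comps]
    while len(comps) > k:
        best = None
        for i in range(len(comps)):
            for j in range(i + 1, len(comps)):
                cost, w, q = kl_merge_cost(weights[i], comps[i],
                                           weights[j], comps[j])
                if best is None or cost < best[0]:
                    best = (cost, i, j, w, q)
        _, i, j, w, q = best
        # Remove the merged pair (higher index first) and append the merge.
        for idx in (j, i):
            weights.pop(idx)
            comps.pop(idx)
        weights.append(w)
        comps.append(q)
    return weights, comps
```

On a mixture of three Bernoulli-like distributions, the two near-identical components are merged first, since their moment-matched merge loses almost no information in KL terms.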
Original language: English
Title: 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Place of publication: Barcelona, Spain
Publisher: Institute of Electrical and Electronics Engineers
Number of pages: 5
Electronic ISBN: 978-1-5090-6631-5
Print ISBN: 978-1-5090-6632-2
Status: Published - 14 May 2020
Event: 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) - Virtual, Barcelona, Spain
Duration: 4 May 2020 - 8 May 2020
Conference number: 2020


Conference: 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Abbreviated title: ICASSP

