TY - JOUR
T1 - Principled Pruning of Bayesian Neural Networks through Variational Free Energy Minimization
AU - Beckers, Jim
AU - van Erp, Bart
AU - Zhao, Ziyue
AU - Kondrashov, Kirill
AU - de Vries, A. (Bert)
PY - 2024
Y1 - 2024
N2 - Bayesian model reduction provides an efficient approach for comparing the performance of all nested sub-models of a model, without re-evaluating any of these sub-models. Until now, Bayesian model reduction has been applied mainly in the computational neuroscience community on simple models. In this paper, we formulate and apply Bayesian model reduction to perform principled pruning of Bayesian neural networks, based on variational free energy minimization. Direct application of Bayesian model reduction, however, gives rise to approximation errors. Therefore, a novel iterative pruning algorithm is presented to alleviate the problems arising with naive Bayesian model reduction, as supported experimentally on the publicly available UCI datasets for different inference algorithms. This novel parameter pruning scheme addresses the shortcomings of current state-of-the-art pruning methods used by the signal processing community. The proposed approach has a clear stopping criterion and minimizes the same objective that is used during training. In addition to these benefits, our experiments indicate better model performance in comparison to state-of-the-art pruning schemes.
AB - Bayesian model reduction provides an efficient approach for comparing the performance of all nested sub-models of a model, without re-evaluating any of these sub-models. Until now, Bayesian model reduction has been applied mainly in the computational neuroscience community on simple models. In this paper, we formulate and apply Bayesian model reduction to perform principled pruning of Bayesian neural networks, based on variational free energy minimization. Direct application of Bayesian model reduction, however, gives rise to approximation errors. Therefore, a novel iterative pruning algorithm is presented to alleviate the problems arising with naive Bayesian model reduction, as supported experimentally on the publicly available UCI datasets for different inference algorithms. This novel parameter pruning scheme addresses the shortcomings of current state-of-the-art pruning methods used by the signal processing community. The proposed approach has a clear stopping criterion and minimizes the same objective that is used during training. In addition to these benefits, our experiments indicate better model performance in comparison to state-of-the-art pruning schemes.
KW - Bayes methods
KW - Bayesian Model Reduction
KW - Bayesian Neural Networks
KW - Biological neural networks
KW - Computational modeling
KW - Parameter Pruning
KW - Probabilistic logic
KW - Reduced order systems
KW - Signal processing
KW - Training
KW - Variational Free Energy
UR - http://www.scopus.com/inward/record.url?scp=85179030776&partnerID=8YFLogxK
U2 - 10.1109/OJSP.2023.3337718
DO - 10.1109/OJSP.2023.3337718
M3 - Article
AN - SCOPUS:85179030776
SN - 2644-1322
VL - 5
SP - 195
EP - 203
JO - IEEE Open Journal of Signal Processing
JF - IEEE Open Journal of Signal Processing
M1 - 10334001
ER -