Abstract
Although discriminative learning in graphical models generally improves classification results, it compromises the generative semantics of the model. In this paper, we introduce a novel hybrid generative-discriminative learning approach for Bayesian networks. We use an SVM-type large-margin formulation for discriminative training, with a likelihood-weighted ℓ1-norm as the SVM norm penalty. This simultaneously optimizes the data likelihood and thus partly preserves the generative character of the model. For many network structures, our method can be formulated as a convex problem, guaranteeing a globally optimal solution. In terms of classification, the resulting models outperform state-of-the-art generative and discriminative learning methods for Bayesian networks, and are comparable with linear and kernelized SVMs. Furthermore, the models achieve likelihoods close to the maximum-likelihood solution and behave robustly in classification experiments with missing features.
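The core idea of the abstract, a large-margin objective whose ℓ1 penalty also keeps the parameters near their generative (maximum-likelihood) values, can be illustrated with a toy sketch. This is not the paper's actual formulation (which is stated over Bayesian network log-parameters and likelihood-derived weights); here `ml_w`, `lam`, and the linear feature map are illustrative assumptions, with the penalty `|w - ml_w|` standing in for the likelihood-weighted norm.

```python
import numpy as np

# Toy data: binary labels in {-1, +1} and a linear "model".
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 0.0])  # stand-in for ML parameters
y = np.sign(X @ true_w + 0.1 * rng.normal(size=100))

def objective(w, X, y, ml_w, lam=0.1):
    """Hinge loss plus an l1 penalty anchored at the ML parameters.

    A plain SVM penalizes |w_i| toward zero; here the penalty
    |w_i - ml_w_i| instead pulls the parameters toward their
    maximum-likelihood estimates ml_w, loosely mimicking the idea of
    trading margin against closeness to the generative solution.
    """
    margins = y * (X @ w)                          # signed margins
    hinge = np.maximum(0.0, 1.0 - margins).mean()  # large-margin term
    penalty = lam * np.abs(w - ml_w).sum()         # likelihood-anchored l1
    return hinge + penalty
```

Because both terms are convex in `w` (a hinge loss and an ℓ1 distance), the sketch mirrors the convexity property claimed in the abstract; setting `lam` large recovers the generative solution, while `lam → 0` yields purely discriminative training.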
Original language | English |
---|---|
Title of host publication | International Conference on Machine Learning (ICML) |
Pages | 1272-1280 |
Number of pages | 9 |
Publication status | Published - 1 Jan 2013 |
Externally published | Yes |
Event | 30th International Conference on Machine Learning (ICML 2013), Atlanta, United States; duration: 16 Jun 2013 → 21 Jun 2013; conference number: 30 |
Publication series
Name | Proceedings of Machine Learning Research |
---|---|
Volume | 28 |
Conference
Conference | 30th International Conference on Machine Learning (ICML 2013) |
---|---|
Abbreviated title | ICML 2013 |
Country/Territory | United States |
City | Atlanta |
Period | 16/06/13 → 21/06/13 |