The most generative maximum margin Bayesian networks

Robert Peharz, Sebastian Tschiatschek, Franz Pernkopf

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

4 Citations (Scopus)


Although discriminative learning in graphical models generally improves classification results, the generative semantics of the model are compromised. In this paper, we introduce a novel approach of hybrid generative-discriminative learning for Bayesian networks. We use an SVM-type large margin formulation for discriminative training, introducing a likelihood-weighted ℓ1-norm for the SVM-norm penalization. This simultaneously optimizes the data likelihood and therefore partly maintains the generative character of the model. For many network structures, our method can be formulated as a convex problem, guaranteeing a globally optimal solution. In terms of classification, the resulting models outperform state-of-the-art generative and discriminative learning methods for Bayesian networks, and are comparable with linear and kernelized SVMs. Furthermore, the models achieve likelihoods close to the maximum likelihood solution and show robust behavior in classification experiments with missing features.
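The hybrid objective described in the abstract can be sketched schematically. The following is a hedged reconstruction from the abstract alone, not the paper's exact formulation: all symbols (the log-parameters w, the empirical frequencies ν̃, the margin d, and the trade-off parameters λ, γ) are hypothetical notation introduced here for illustration.

```latex
% Schematic hybrid objective (reconstruction from the abstract; symbols hypothetical).
% w_j: log-parameters of the Bayesian network (w_j = log theta_j <= 0),
% \tilde{\nu}_j: empirical frequency of the corresponding configuration,
% d_w(c | x) = log P_w(c, x) - max_{c' \neq c} log P_w(c', x): the log-probability margin.
\begin{aligned}
  \min_{\mathbf{w}} \quad
    & \underbrace{\sum_j \tilde{\nu}_j \, \lvert w_j \rvert}_{\text{likelihood-weighted } \ell_1\text{-norm}}
      \;+\; \lambda \sum_{n=1}^{N} \max\!\bigl(0,\; \gamma - d_{\mathbf{w}}(c_n \mid \mathbf{x}_n)\bigr) \\
  \text{s.t.} \quad
    & w_j \le 0 \quad \forall j \qquad \text{(sub-normalized log-probabilities)}
\end{aligned}
```

Under this reading, since w_j ≤ 0, the weighted ℓ1 term coincides with the (normalized) negative data log-likelihood, which is why minimizing the penalty partly preserves the generative character of the model; and because the margin d is linear in w, the overall problem is convex, matching the abstract's claim of a globally optimal solution for many network structures.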

Original language: English
Title of host publication: International Conference on Machine Learning (ICML)
Number of pages: 9
Publication status: Published - 1 Jan 2013
Externally published: Yes
Event: 30th International Conference on Machine Learning (ICML 2013) - Atlanta, United States
Duration: 16 Jun 2013 – 21 Jun 2013
Conference number: 30

Publication series

Name: Proceedings of Machine Learning Research


Conference: 30th International Conference on Machine Learning (ICML 2013)
Abbreviated title: ICML 2013
Country/Territory: United States


