Joints in Random Forests

Research output: Contribution to journal › Conference article › Academic › peer review



Decision Trees (DTs) and Random Forests (RFs) are powerful discriminative learners and tools of central importance to the everyday machine learning practitioner and data scientist. Due to their discriminative nature, however, they lack principled methods to process inputs with missing features or to detect outliers, which requires pairing them with imputation techniques or a separate generative model. In this paper, we demonstrate that DTs and RFs can naturally be interpreted as generative models, by drawing a connection to Probabilistic Circuits, a prominent class of tractable probabilistic models. This reinterpretation equips them with a full joint distribution over the feature space and leads to Generative Decision Trees (GeDTs) and Generative Forests (GeFs), a family of novel hybrid generative-discriminative models. This family of models retains the overall characteristics of DTs and RFs while additionally being able to handle missing features by means of marginalisation. Under certain assumptions, frequently made for Bayes consistency results, we show that consistency in GeDTs and GeFs extends to any pattern of missing input features, provided they are missing at random. Empirically, we show that our models often outperform common routines for treating missing data, such as K-nearest-neighbour imputation, and moreover, that our models can naturally detect outliers by monitoring the marginal probability of input features.
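As a rough illustration of the core idea only (not the authors' actual construction, which derives leaf distributions from the tree's own splits within a Probabilistic Circuit), the following sketch treats a hand-built decision stump as a mixture over its leaves: each leaf stores its reach probability, a fully factorized Gaussian over the features, and a class distribution. Because the leaf densities factorize, a missing feature is marginalised simply by dropping its factor, and a low marginal density of the observed features flags an outlier. All names and the toy data here are hypothetical.

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Toy two-class data: class 0 centred at (-2, -2), class 1 at (+2, +2).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, size=(200, 2)),
               rng.normal(+2, 1, size=(200, 2))])
y = np.array([0] * 200 + [1] * 200)

# A hand-built decision stump: split on feature 0 at 0.0, giving two leaves.
# Each leaf carries P(leaf), a factorized Gaussian fit, and P(class | leaf).
leaves = []
for mask in (X[:, 0] <= 0.0, X[:, 0] > 0.0):
    Xl, yl = X[mask], y[mask]
    leaves.append({
        "weight": mask.mean(),
        "mu": Xl.mean(axis=0),
        "sigma": Xl.std(axis=0) + 1e-6,
        "class_probs": np.bincount(yl, minlength=2) / len(yl),
    })

def joint_density(obs):
    """Density of the observed features only; missing features are
    marginalised out by dropping their factors (obs maps index -> value)."""
    total = 0.0
    for leaf in leaves:
        factor = leaf["weight"]
        for f, v in obs.items():
            factor *= gauss_pdf(v, leaf["mu"][f], leaf["sigma"][f])
        total += factor
    return total

def predict_proba(obs):
    """P(class | observed features) under the same leaf mixture."""
    scores = np.zeros(2)
    for leaf in leaves:
        factor = leaf["weight"]
        for f, v in obs.items():
            factor *= gauss_pdf(v, leaf["mu"][f], leaf["sigma"][f])
        scores += factor * leaf["class_probs"]
    return scores / scores.sum()
```

With feature 0 missing, `predict_proba({1: 2.0})` still yields a confident prediction for class 1, and `joint_density({0: 50.0, 1: 50.0})` is vanishingly small compared to in-distribution points, mirroring the paper's outlier-detection use of the marginal. A real GeF would average such estimates over many trees and use leaf densities consistent with the split thresholds.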
Original language: English
Journal: Proceedings of Machine Learning Research
Issue number: XX
Publication status: Accepted/In press - 2020
Event: Conference on Neural Information Processing Systems
Duration: 6 Dec 2020 to 12 Dec 2020
Conference number: 34


  • cs.LG
  • cs.AI
  • stat.ML

