Towards Robust Classification with Deep Generative Forests

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review



Decision Trees and Random Forests are among the most widely used machine learning models, and they often achieve state-of-the-art performance on tabular, domain-agnostic datasets. Nonetheless, being primarily discriminative models, they lack principled methods for manipulating the uncertainty of predictions. In this paper, we exploit Generative Forests (GeFs), a recent class of deep probabilistic models that addresses this issue by extending Random Forests to generative models representing the full joint distribution over the feature space. We demonstrate that GeFs are uncertainty-aware classifiers, capable of measuring the robustness of each prediction as well as detecting out-of-distribution samples.
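The core idea sketched in the abstract can be illustrated with a minimal generative classifier: each class gets a density model for p(x | y), predictions follow argmax over p(x | y)p(y), and a low marginal log p(x) flags out-of-distribution inputs. The sketch below uses a Gaussian mixture as a stand-in density model; the actual GeF model, its API, and the threshold value are not shown in this page, so everything here is an illustrative assumption, not the paper's implementation.

```python
# Conceptual sketch only: a Gaussian mixture stands in for the paper's
# Generative Forests. The classification and OOD-detection logic is the
# generic generative-classifier recipe, not GeFs' actual algorithm.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two well-separated in-distribution classes (synthetic toy data).
X0 = rng.normal(loc=-2.0, scale=0.5, size=(200, 2))
X1 = rng.normal(loc=+2.0, scale=0.5, size=(200, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

# Fit one density model per class: an estimate of p(x | y = c).
models = {c: GaussianMixture(n_components=1, random_state=0).fit(X[y == c])
          for c in (0, 1)}
priors = {c: float(np.mean(y == c)) for c in (0, 1)}

def predict_with_ood(x, threshold=-10.0):
    """Classify via argmax_c p(x | c) p(c); flag OOD when log p(x) is low.

    `threshold` is an arbitrary illustrative cutoff on the marginal
    log-density, not a value from the paper.
    """
    log_joint = {c: models[c].score_samples(x)[0] + np.log(priors[c])
                 for c in (0, 1)}
    log_px = np.logaddexp(log_joint[0], log_joint[1])  # marginal log p(x)
    label = max(log_joint, key=log_joint.get)
    return label, log_px, bool(log_px < threshold)

in_dist = np.array([[2.0, 2.0]])     # near class 1's mean: confidently classified
far_away = np.array([[50.0, -50.0]]) # far from all training data: flagged as OOD
print(predict_with_ood(in_dist))
print(predict_with_ood(far_away))
```

Because the model represents a full density over the feature space, the same fitted object serves both roles: the class-conditional likelihoods drive classification, and their sum (the marginal) drives out-of-distribution detection.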
Original language: English
Title of host publication: ICML 2020 Workshop on Uncertainty and Robustness in Deep Learning
Publication status: Published - 11 Jul 2020
Event: 37th International Conference on Machine Learning (ICML 2020)
Duration: 12 Jul 2020 – 18 Jul 2020
Conference number: 37


Conference: 37th International Conference on Machine Learning (ICML 2020)
Abbreviated title: ICML 2020

