Meta learning for defaults: symbolic defaults

Jan N. van Rijn, Florian Pfisterer, Janek Thomas, Andreas Müller, Bernd Bischl, J. Vanschoren

Research output: Contribution to conference › Paper › Academic


In this work, we propose to use meta-learning to learn sets of symbolic default hyperparameter configurations that work well across many data sets. A well-known example of such a symbolic default is the logarithmic relation between the number of features of a dataset and the number of features available per split in a Random Forest, as observed by Breiman (2001). Symbolic functions allow for a richer vocabulary in which to define defaults. In the past, symbolic and static default values have been obtained either from hand-crafted heuristics or from empirical evaluations of specific algorithms. We propose to automatically learn such symbolic configurations, i.e., formulas containing meta-features, from a large set of prior evaluations of numeric hyperparameters on multiple data sets via symbolic regression and optimization.
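To make the idea concrete, the following is a minimal sketch (not the paper's actual method or data) of selecting a symbolic default for a Random-Forest-style "features per split" hyperparameter. The prior evaluations and the candidate formula set are invented for illustration; the real approach searches a much richer space of expressions via symbolic regression.

```python
import math

# Hypothetical prior evaluations: for each dataset, described here only by its
# number of features p, the numeric hyperparameter value that performed best.
# These numbers are illustrative, not taken from the paper.
prior_best = {4: 2, 16: 4, 64: 8, 256: 16}

# Candidate symbolic defaults: formulas over the meta-feature p.
candidates = {
    "sqrt(p)": lambda p: math.sqrt(p),
    "log2(p)": lambda p: math.log2(p),
    "p/4":     lambda p: p / 4,
}

def score(formula):
    # Mean squared distance between the formula's suggested value and the
    # empirically best value, averaged across all datasets.
    return sum((formula(p) - best) ** 2
               for p, best in prior_best.items()) / len(prior_best)

# Pick the symbolic default that best matches the prior evaluations.
best_name = min(candidates, key=lambda name: score(candidates[name]))
print(best_name)  # for these illustrative data, sqrt(p) fits exactly
```

A real symbolic-regression search would generate and mutate candidate expressions rather than enumerate a fixed dictionary, and would score them by cross-dataset predictive performance rather than by distance to a single "best" value, but the selection principle is the same.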
Original language: English
Number of pages: 7
Publication status: Published - 8 Dec 2018
Event: Neural Information Processing Workshop on Meta-Learning - Montreal, Canada
Duration: 8 Dec 2018 - 8 Dec 2018




  • Meta-learning
  • Automatic Machine Learning


