Meta learning for defaults: symbolic defaults

Jan N. van Rijn, Florian Pfisterer, Janek Thomas, Andreas Muller, Bernd Bischl, J. Vanschoren

Research output: Contribution to conference › Paper

Abstract

In this work we propose to use meta-learning to learn sets of symbolic default hyperparameter configurations that work well across many data sets. A well-known example of such a symbolic default is the logarithmic relation between the number of features of a dataset and the number of features available per split in a Random Forest, as observed by Breiman (2001). Symbolic functions allow for a richer vocabulary for defining defaults. In the past, symbolic and static default values have been obtained either from hand-crafted heuristics or from empirical evaluations of specific algorithms. We propose to automatically learn such symbolic configurations, i.e., formulas containing meta-features, from a large set of prior evaluations of numeric hyperparameters on multiple data sets via symbolic regression and optimization.
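The idea of a symbolic default can be illustrated with a minimal sketch: a formula over a dataset meta-feature (here, the number of features) yields a concrete hyperparameter value per dataset. The function name and the use of a log2 heuristic for the features-per-split parameter are illustrative assumptions, not the paper's learned formulas or search procedure.

```python
import math

# A symbolic default maps dataset meta-features to a hyperparameter value.
# Hypothetical example: a log2-based heuristic for the number of features
# considered at each split of a Random Forest (cf. Breiman, 2001).
def symbolic_default_max_features(n_features: int) -> int:
    """Number of features per split as a function of the dataset's feature count."""
    return max(1, int(math.log2(n_features)))

# The same symbolic formula yields different concrete values on different datasets:
for p in (4, 64, 1000):
    print(p, symbolic_default_max_features(p))
```

In contrast to a static default (a fixed constant), this single formula adapts to each dataset, which is what makes symbolic defaults attractive as transferable configurations.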
Original language: English
Number of pages: 7
Publication status: Published - 8 Dec 2018
Event: Neural Information Processing Workshop on Meta-Learning - Montreal, Canada
Duration: 8 Dec 2018 - 8 Dec 2018
http://metalearning.ml/2018/


Keywords

  • Meta-learning
  • Automatic Machine Learning


Cite this

van Rijn, J. N., Pfisterer, F., Thomas, J., Muller, A., Bischl, B., & Vanschoren, J. (2018). Meta learning for defaults: symbolic defaults. Paper presented at the Neural Information Processing Workshop on Meta-Learning, Montreal, Canada. http://metalearning.ml/2018/papers/metalearn2018_paper70.pdf