Meta-learning for symbolic hyperparameter defaults

Pieter Gijsbers, Joaquin Vanschoren, Florian Pfisterer, Jan N. van Rijn, Bernd Bischl

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

1 Citation (Scopus)


Hyperparameter optimization in machine learning (ML) deals with the problem of empirically learning an optimal algorithm configuration from data, usually formulated as a black-box optimization problem. In this work, we propose a zero-shot method to meta-learn symbolic default hyperparameter configurations that are expressed in terms of the properties of the dataset. This enables a much faster, but still data-dependent, configuration of the ML algorithm, compared to standard hyperparameter optimization approaches. In the past, symbolic and static default values have usually been obtained as hand-crafted heuristics. We propose an approach to learning such symbolic configurations as formulas of dataset properties from a large set of prior evaluations on multiple datasets by optimizing over a grammar of expressions using an evolutionary algorithm. We evaluate our method on surrogate empirical performance models as well as on real data across 6 ML algorithms on more than 100 datasets and demonstrate that our method indeed finds viable symbolic defaults.
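To make the idea concrete, below is a minimal illustrative sketch of such a search, not the authors' implementation: it evolves expressions over two assumed meta-features (n rows, p columns) from a toy grammar, and scores them with a stand-in surrogate that rewards values close to sqrt(p) (as with mtry in random forests). The grammar, mutation operator, and surrogate are all hypothetical simplifications; in the paper, surrogate performance models are learned from a large set of prior evaluations.

```python
# Illustrative sketch only: evolve symbolic expressions over dataset
# meta-features that map each dataset to a hyperparameter value.
import math
import random

# Hypothetical meta-features for a handful of datasets.
DATASETS = [
    {"n": 150, "p": 4},
    {"n": 1000, "p": 20},
    {"n": 60000, "p": 784},
]

# Toy grammar: an expression is a terminal or a nested tuple (op, args...).
TERMINALS = ["n", "p", 0.5, 1.0, 2.0]
UNARY = {"sqrt": lambda x: math.sqrt(abs(x)),
         "log": lambda x: math.log(abs(x) + 1.0)}
BINARY = {"add": lambda a, b: a + b,
          "mul": lambda a, b: a * b,
          "div": lambda a, b: a / b if abs(b) > 1e-9 else 1.0}

def random_expr(depth=2):
    """Sample a random expression from the grammar."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    if random.random() < 0.5:
        return (random.choice(list(UNARY)), random_expr(depth - 1))
    return (random.choice(list(BINARY)),
            random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, meta):
    """Evaluate an expression on one dataset's meta-features."""
    if isinstance(expr, str):
        return float(meta[expr])
    if isinstance(expr, (int, float)):
        return float(expr)
    op, *args = expr
    vals = [evaluate(a, meta) for a in args]
    return UNARY[op](*vals) if op in UNARY else BINARY[op](*vals)

def surrogate_score(hp_value, meta):
    """Stand-in for a learned surrogate performance model: pretend the
    optimal hyperparameter value is sqrt(p). Higher is better."""
    return -abs(hp_value - math.sqrt(meta["p"]))

def fitness(expr):
    """Average surrogate score of the symbolic default across datasets."""
    return sum(surrogate_score(evaluate(expr, m), m)
               for m in DATASETS) / len(DATASETS)

def mutate(expr, p_replace=0.3):
    """Mutation: sometimes replace this subtree with a fresh one,
    otherwise re-sample terminals and recurse into children."""
    if random.random() < p_replace:
        return random_expr(depth=2)
    if not isinstance(expr, tuple):
        return random.choice(TERMINALS)
    op, *args = expr
    return (op, *[mutate(a) for a in args])

def evolve(pop_size=50, generations=30):
    """(mu + lambda)-style evolutionary loop over the expression grammar."""
    pop = [random_expr() for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(random.choice(pop)) for _ in range(pop_size)]
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return pop[0]

if __name__ == "__main__":
    best = evolve()
    print("best symbolic default:", best, "fitness:", fitness(best))
```

The search tends to recover expressions equivalent to sqrt(p) here because the toy surrogate was constructed that way; the point is only to show the shape of grammar-based evolutionary search for symbolic defaults, with a learned surrogate taking the place of surrogate_score in the actual method.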
Original language: English
Title of host publication: GECCO 2021 Companion - Proceedings of the 2021 Genetic and Evolutionary Computation Conference Companion
Subtitle of host publication: Proceedings of the Genetic and Evolutionary Computation Conference Companion
Publisher: Association for Computing Machinery, Inc
Number of pages: 2
ISBN (Electronic): 978-1-4503-8351-6
Publication status: Published - Jul 2021

Bibliographical note

2-page GECCO poster paper; the full-length original submission is available as a preprint on arXiv:


Keywords

  • hyperparameter optimization
  • metalearning

