Abstract
In this work we propose to use meta-learning to learn sets of symbolic default hyperparameter configurations that work well across many data sets. A well-known example of such a symbolic default is the logarithmic relation between the number of features of a data set and the number of features considered per split in a Random Forest, as observed by Breiman (2001). Symbolic functions allow for a richer vocabulary in which to define defaults. In the past, symbolic and static default values have been obtained either from hand-crafted heuristics or from empirical evaluations of specific algorithms. We propose to automatically learn such symbolic configurations, i.e., formulas containing meta-features, from a large set of prior evaluations of numeric hyperparameters on multiple data sets via symbolic regression and optimization.
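As a rough illustration (not the authors' implementation), the sketch below expresses the Random Forest "features per split" hyperparameter as candidate formulas over a single meta-feature, the number of features p, and scores each formula across a few synthetic data sets. The candidate set, data sets, and scoring procedure are hypothetical stand-ins for the paper's prior evaluations and symbolic regression.

```python
import math
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Candidate symbolic defaults: each maps the number of features p to a value
# for max_features (features considered per split). These candidates are
# illustrative, not the search space used in the paper.
CANDIDATE_FORMULAS = {
    "log2(p)": lambda p: max(1, int(math.log2(p))),
    "sqrt(p)": lambda p: max(1, int(math.sqrt(p))),
    "p / 4":   lambda p: max(1, p // 4),
}

def mean_score(formula, datasets):
    """Average cross-validated accuracy of one symbolic default across data sets."""
    scores = []
    for X, y in datasets:
        mtry = formula(X.shape[1])  # evaluate the formula on the meta-feature p
        clf = RandomForestClassifier(n_estimators=50, max_features=mtry, random_state=0)
        scores.append(cross_val_score(clf, X, y, cv=3).mean())
    return float(np.mean(scores))

# Synthetic stand-ins for prior evaluations on multiple data sets.
datasets = [
    make_classification(n_samples=300, n_features=p, random_state=p)
    for p in (10, 30, 60)
]

# Keep the formula that generalises best across the whole collection.
best = max(CANDIDATE_FORMULAS, key=lambda name: mean_score(CANDIDATE_FORMULAS[name], datasets))
print("best symbolic default for max_features:", best)
```

The paper searches a much richer space of formulas over many meta-features via symbolic regression and optimization; this toy selection over three fixed candidates only conveys the shape of the problem.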
Original language | English |
---|---|
Number of pages | 7 |
Publication status | Published - 8 Dec 2018 |
Event | Neural Information Processing Workshop on Meta-Learning, Montreal, Canada. Duration: 8 Dec 2018 → 8 Dec 2018. http://metalearning.ml/2018/ |
Workshop
Workshop | Neural Information Processing Workshop on Meta-Learning |
---|---|
Country/Territory | Canada |
City | Montreal |
Period | 8/12/18 → 8/12/18 |
Internet address | http://metalearning.ml/2018/ |
Keywords
- Meta-learning
- Automatic Machine Learning