Authors
Pieter Gijsbers, Florian Pfisterer, Jan N. van Rijn, Bernd Bischl, Joaquin Vanschoren
Publication date
2021/7/7
Book
Proceedings of the Genetic and Evolutionary Computation Conference Companion
Pages
151-152
Description
Hyperparameter optimization in machine learning (ML) deals with the problem of empirically learning an optimal algorithm configuration from data, usually formulated as a black-box optimization problem. In this work, we propose a zero-shot method to meta-learn symbolic default hyperparameter configurations that are expressed in terms of the properties of the dataset. This enables a much faster, but still data-dependent, configuration of the ML algorithm, compared to standard hyperparameter optimization approaches. In the past, symbolic and static default values have usually been obtained as hand-crafted heuristics. We propose an approach that learns such symbolic configurations as formulas of dataset properties, from a large set of prior evaluations on multiple datasets, by optimizing over a grammar of expressions using an evolutionary algorithm. We evaluate our method on surrogate empirical performance …
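The description outlines the core mechanism: candidate defaults are symbolic formulas over dataset meta-features, sampled from a grammar and evolved against surrogate performance estimates. Below is a minimal, self-contained Python sketch of that idea. The grammar, the meta-feature names (n_samples, n_features), and the toy surrogate are illustrative assumptions for exposition, not the paper's actual grammar, meta-features, or evolutionary algorithm.

import random

# Hypothetical dataset meta-features; names are illustrative, not from the paper.
META_FEATURES = ["n_samples", "n_features"]

# A tiny expression grammar: terminals are meta-features and constants,
# internal nodes are arithmetic operators.
OPS = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "div": lambda a, b: a / b if b != 0 else 1.0,  # guard against division by zero
}

def random_expr(depth=2):
    """Sample a random expression tree from the grammar."""
    if depth == 0 or random.random() < 0.3:
        if random.random() < 0.5:
            return random.choice(META_FEATURES)
        return round(random.uniform(0.01, 2.0), 2)
    op = random.choice(list(OPS))
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, meta):
    """Evaluate a symbolic expression on one dataset's meta-features."""
    if isinstance(expr, str):
        return meta[expr]
    if isinstance(expr, (int, float)):
        return expr
    op, left, right = expr
    return OPS[op](evaluate(left, meta), evaluate(right, meta))

def fitness(expr, datasets, surrogate):
    """Mean surrogate performance of the induced configurations across datasets."""
    return sum(surrogate(evaluate(expr, m), m) for m in datasets) / len(datasets)

def mutate(expr):
    """Replace a random subtree with a freshly sampled one."""
    if isinstance(expr, tuple) and random.random() < 0.7:
        op, left, right = expr
        if random.random() < 0.5:
            return (op, mutate(left), right)
        return (op, left, mutate(right))
    return random_expr(depth=2)

def evolve(datasets, surrogate, pop_size=30, generations=40):
    """Simple truncation-selection evolutionary search over the grammar."""
    population = [random_expr() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=lambda e: fitness(e, datasets, surrogate),
                        reverse=True)
        parents = ranked[: pop_size // 2]
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - len(parents))]
    return max(population, key=lambda e: fitness(e, datasets, surrogate))

if __name__ == "__main__":
    # Toy usage: a made-up surrogate that rewards configurations near
    # 1/n_features, standing in for the empirical performance surrogates.
    random.seed(0)
    datasets = [{"n_samples": 100 * i, "n_features": 5 * i} for i in range(1, 6)]
    surrogate = lambda value, meta: -abs(value - 1.0 / meta["n_features"])
    best = evolve(datasets, surrogate)
    print("learned symbolic default:", best)

In this toy setup, the search can recover a formula equivalent to div(constant, n_features), mirroring how a learned symbolic default generalizes across datasets rather than fixing a single static value.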
Total citations
2022: 6, 2023: 3, 2024: 1