Core Concepts
This work proposes an extension to the GramML model-free AutoML approach that incorporates hyperparameter search into the grammar-based pipeline configuration process, enabling a more comprehensive exploration of the solution space.
Abstract
The paper presents an extension to the GramML model-free AutoML approach that integrates hyperparameter search capabilities. The key aspects are:
Incorporating hyperparameter values directly into the grammar as production rules, which expands the search space.
Modifying the Monte Carlo Tree Search (MCTS) algorithm to handle the increased complexity, including pruning strategies and non-parametric selection policies.
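The grammar extension above can be illustrated with a toy sketch. This is not the paper's actual grammar; all rule names and value choices below are invented for illustration. The point is that hyperparameter values become ordinary productions, so the same tree search that selects pipeline components also selects their values:

```python
import random

# Toy grammar in the spirit of GramML++ (names and values invented for
# illustration): hyperparameter values appear as ordinary production rules.
GRAMMAR = {
    "pipeline": [["preprocessor", "classifier"]],
    "preprocessor": [["StandardScaler"], ["PCA", "pca_n_components"]],
    "pca_n_components": [["2"], ["5"], ["10"]],   # hyperparameter values as rules
    "classifier": [["RandomForest", "rf_n_estimators"]],
    "rf_n_estimators": [["50"], ["100"], ["200"]],
}

def expand(symbol, rng):
    """Randomly derive one concrete pipeline from the grammar."""
    if symbol not in GRAMMAR:             # terminal: component name or value
        return [symbol]
    production = rng.choice(GRAMMAR[symbol])  # pick one production rule
    result = []
    for s in production:
        result.extend(expand(s, rng))
    return result

print(expand("pipeline", random.Random(0)))
```

Every derivation yields a fully configured pipeline, which is why the search tree grows and motivates the pruning and selection-policy changes to MCTS described above.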
The authors conduct an ablation study to evaluate the efficiency of different selection policies (UCT, BTS, TPE) and compare the extended GramML approach (named GramML++) to other state-of-the-art techniques like AutoSklearn and MOSAIC on the OpenML-CC18 benchmark. The results show that the GramML++ variants, particularly the one using Bootstrap Thompson Sampling (GramML++BTS), significantly outperform the other methods in terms of average ranking and score.
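As a hedged sketch of the winning selection policy, Bootstrap Thompson Sampling can be approximated with an online bootstrap: each child node keeps several reward-estimate replicates, and selection samples one replicate per child and takes the argmax. The class below is an assumption-laden illustration, not the paper's implementation:

```python
import random

class Node:
    """MCTS child node with bootstrap replicates of its reward estimate."""
    def __init__(self, name, n_replicates=10):
        self.name = name
        # Each replicate holds its own (reward_sum, count) pair.
        self.replicates = [[0.0, 0] for _ in range(n_replicates)]

    def update(self, reward, rng):
        # Online bootstrap: each replicate sees the reward with prob 0.5.
        for rep in self.replicates:
            if rng.random() < 0.5:
                rep[0] += reward
                rep[1] += 1

    def sample_value(self, rng):
        s, n = rng.choice(self.replicates)
        return s / n if n else float("inf")  # force exploration of untried arms

def bts_select(children, rng):
    """Pick the child whose sampled bootstrap mean is highest."""
    return max(children, key=lambda c: c.sample_value(rng))
```

Unlike UCT, this policy needs no exploration constant to tune, which is one plausible reason a non-parametric policy fares well across heterogeneous OpenML-CC18 tasks.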
The work demonstrates the effectiveness of integrating hyperparameter search into grammar-based AutoML and provides a promising approach for addressing the challenges of larger search spaces in this domain.
Stats
No specific numerical results are extracted here; the summary focuses on the algorithmic extensions and the empirical evaluation.
Quotes
The paper does not contain any striking quotes that support its key arguments.