Authors
Niki van Stein, Diederick Vermetten, Anna V Kononova, Thomas Bäck
Publication date
2024/1/31
Journal
arXiv preprint arXiv:2401.17842
Description
Benchmarking heuristic algorithms is vital to understanding under which conditions and on what kinds of problems certain algorithms perform well. Most current research into heuristic optimization algorithms explores only a very limited number of scenarios, algorithm configurations and hyper-parameter settings, leading to incomplete and often biased insights and results. This paper presents a novel approach we call explainable benchmarking and introduces the IOH-Xplainer software framework for analyzing and understanding the performance of various optimization algorithms and the impact of their different components and hyper-parameters. We showcase the framework in the context of two modular optimization frameworks, examining the impact of different algorithmic components and configurations and offering insights into their performance across diverse scenarios. We provide a systematic method for evaluating and interpreting the behaviour and efficiency of iterative optimization heuristics in a more transparent and comprehensible manner, allowing for better benchmarking and algorithm design.
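To make the idea of configuration-level benchmarking concrete, here is a minimal, self-contained Python sketch; it is not the IOH-Xplainer API (all function names here are illustrative assumptions), but it shows the kind of experiment the abstract describes: sweeping one hyper-parameter of a simple heuristic across several test problems and tabulating performance per configuration.

```python
# Toy sketch (not the IOH-Xplainer API): sweep the step-size hyper-parameter
# of a simple (1+1)-style heuristic over two test problems and report the
# mean best fitness per configuration, mimicking the configuration-level
# comparisons the paper performs at much larger scale.
import math
import random
import statistics

def sphere(x):
    return sum(v * v for v in x)

def rastrigin(x):
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

def one_plus_one(problem, dim, step_size, budget, rng):
    # (1+1) hill climber with Gaussian mutation; keeps the better of parent/child.
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = problem(x)
    for _ in range(budget):
        y = [v + rng.gauss(0, step_size) for v in x]
        fy = problem(y)
        if fy <= fx:
            x, fx = y, fy
    return fx

rng = random.Random(42)
for name, problem in [("sphere", sphere), ("rastrigin", rastrigin)]:
    for step_size in (0.01, 0.1, 1.0):  # the hyper-parameter under study
        results = [one_plus_one(problem, 5, step_size, 2000, rng) for _ in range(10)]
        print(f"{name:10s} step_size={step_size:<5} mean best={statistics.mean(results):.4f}")
```

The resulting per-problem, per-configuration table is the raw material for the kind of component- and hyper-parameter-impact analysis the framework automates.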