On the Automated Generation of Optimization Heuristics: A Dream or Reality?

Thomas Bäck
Leiden Institute of Advanced Computer Science (LIACS), Leiden University, NL
November 2024

For decades, researchers have looked to paradigms gleaned from nature as inspiration for problem-solving approaches, for example in the domain of optimization. There are many classes of such algorithms, including evolutionary algorithms, particle swarm optimization, differential evolution, and ant colony optimization, and the number of proposed variants is quite large. This makes it hard to keep track of the variants and their respective strengths, and it creates an even more difficult situation for non-experts who want to select the best algorithm for their real-world application problem.

In this presentation, I will introduce and discuss ideas for automatically optimizing the optimization heuristic. This task can be approached as an algorithm configuration problem, and I will present examples illustrating that it can be handled by direct global optimization algorithms – in other words, by “automatically optimizing the optimization algorithm”. I will give an example of how a combinatorial design space of thousands of configuration variants of evolution strategies, in a so-called modular Covariance Matrix Adaptation Evolution Strategy framework, can be searched, and how the results can be analyzed using data mining. This approach provides an opportunity for discovering unexplored areas of the optimization algorithm design space. Extensions towards other algorithm design spaces, such as particle swarm optimization and differential evolution, are also outlined.

In the second part of the presentation, I will discuss a range of real-world engineering design applications for which such an approach could truly provide a competitive advantage. In such cases, optimizing the optimization algorithm requires a proper definition of the problem class for which the optimization is executed. For the example of automotive crash optimization problems, I will present first results demonstrating that these problems differ substantially from the classical benchmark test function sets used by the academic community, and present an automated approach to finding test functions that properly represent the real-world problem. First results on the performance gain that can be achieved by optimizing the optimization algorithm on such real-world problems are also presented.

To conclude, I will briefly show a novel approach that uses a large language model in an iterative loop to automatically generate metaheuristic optimization algorithms, and how this approach, which we call LLaMEA (Large Language Model Evolutionary Algorithm), can generate novel metaheuristics that perform very well on a standard set of benchmark functions.

Brief Biography

Thomas Bäck (Fellow, IEEE) received the Diploma degree in Computer Science in 1990 and the Ph.D. degree in Computer Science in 1994 (under the supervision of H.-P. Schwefel), both from the University of Dortmund, Germany. He is Professor of Computer Science at the Leiden Institute of Advanced Computer Science (LIACS), Leiden University, Netherlands. His research interests include evolutionary computation, machine learning, and their real-world applications, especially in sustainable smart industry and health.

Dr. Bäck has been elected a member of the Royal Netherlands Academy of Arts and Sciences (KNAW, 2021), an IEEE Fellow (class of 2022), and a member of Academia Europaea (2022). He received the IEEE Computational Intelligence Society (CIS) Evolutionary Computation Pioneer Award in 2015, was elected Fellow of the International Society for Genetic and Evolutionary Computation in 2003, and received the best Ph.D. thesis award from the German Informatics Society (GI) in 1995.

He currently serves as Editor-in-Chief of the Evolutionary Computation journal (MIT Press), Associate Editor of the IEEE Transactions on Evolutionary Computation and of the Artificial Intelligence Review journal, and Area Editor of the ACM Transactions on Evolutionary Learning and Optimization. He was also co-editor-in-chief of the Handbook of Evolutionary Computation (CRC Press/Taylor & Francis, 1997), co-editor of the Handbook of Natural Computing (Springer, 2013), author of Evolutionary Computation in Theory and Practice (Oxford University Press, New York, 1996), and co-author of Contemporary Evolution Strategies (Springer, 2013).