Guest Editorial: Special Issue on Theoretical Foundations of Evolutionary Computation
It is our pleasure to introduce this special issue on recent advances in the theoretical foundations of evolutionary computation (EC). While in the early days of the field theoretical analyses inevitably focused on simplified models of evolutionary algorithms (EAs), continuous progress in the development of suitable mathematical techniques now allows us to derive proven statements regarding the performance of off-the-shelf metaheuristics, such as standard generational and steady-state genetic algorithms, with no algorithmic simplifications. The performance of a given EA can nowadays also be compared with the best possible one, allowing us to assess whether the algorithm may be improved upon or whether its performance is optimal for a given class of problems. Such understanding often provides insights for the design of new EAs that provably perform better on given problems. We are glad that examples of results of this kind are present within this special issue. A total of 27 papers were submitted, each of which received at least three independent reviews, and the six manuscripts of the highest quality were selected for publication in the special issue. In the following, we provide a brief summary of these manuscripts.
A General Dichotomy of Evolutionary Algorithms on Monotone Functions
It is known that the (1 + 1)-EA with mutation rate $c/n$ optimizes every monotone function efficiently if $c < 1$, and needs exponential time on some monotone functions (HotTopic functions) if $c \geq 2.2$. We study the same question for …
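To make the algorithm under study concrete, the following is a minimal sketch of the (1 + 1)-EA with standard bit mutation at rate $c/n$, run here on OneMax as the simplest monotone example; the function names and the evaluation budget are our own illustrative choices.

```python
import random

def one_plus_one_ea(f, n, c=1.0, max_evals=100_000):
    """(1+1)-EA with standard bit mutation at rate c/n.

    Maximizes a pseudo-Boolean function f over {0,1}^n and returns the
    number of evaluations until the all-ones optimum is found (or
    max_evals if the budget runs out).
    """
    x = [random.randint(0, 1) for _ in range(n)]
    fx = f(x)
    for evals in range(1, max_evals + 1):
        # flip each bit independently with probability c/n
        y = [1 - b if random.random() < c / n else b for b in x]
        fy = f(y)
        if fy >= fx:            # elitist acceptance, ties kept
            x, fx = y, fy
        if sum(x) == n:         # all-ones string is optimal for monotone f
            return evals
    return max_evals

onemax = lambda x: sum(x)       # the simplest monotone function
t = one_plus_one_ea(onemax, n=50, c=0.9)
```

For $c < 1$ this run typically finishes in $O(n \log n)$ evaluations, in line with the dichotomy stated above.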
Parallel Black-Box Complexity With Tail Bounds
We propose a new black-box complexity model for search algorithms evaluating $\lambda$ search points in parallel. The parallel unary unbiased black-box complexity gives lower bounds on the number of function evaluations every parallel unary unbiased …
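As an illustration of the model, here is a sketch of the (1 + λ)-EA: standard bit mutation is a unary unbiased operator, and the λ offspring of a generation are independent, so a parallel implementation could evaluate them concurrently; the model bounds the number of such parallel rounds. This is an illustrative instance of a parallel unary unbiased algorithm, not the paper's lower-bound construction.

```python
import random

def one_plus_lambda_ea(f, n, lam, max_gens=10_000):
    """(1+lambda)-EA with standard bit mutation (a unary unbiased
    operator).  Each generation creates lambda offspring whose
    evaluations are independent and could run in parallel; we count
    generations, i.e., parallel evaluation rounds."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = f(x)
    for gen in range(1, max_gens + 1):
        if fx == n:                      # all-ones optimum (for OneMax)
            return gen - 1
        offspring = []
        for _ in range(lam):             # lambda evaluations per round
            y = [1 - b if random.random() < 1 / n else b for b in x]
            offspring.append((f(y), y))
        fy, y = max(offspring, key=lambda o: o[0])
        if fy >= fx:
            x, fx = y, fy
    return max_gens

rounds = one_plus_lambda_ea(lambda x: sum(x), n=40, lam=8)
```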
Finite-Sample Analysis of Information Geometric Optimization With Isotropic Gaussian Distribution on Convex Quadratic Functions
We theoretically analyze information geometric optimization (IGO), a unified framework of stochastic search algorithms for black-box optimization. The IGO framework has two strategy parameters, 1) the learning rate and 2) the sample size, which influence the behavior of the algorithm. We investigate these parameters for the IGO with the family of isotropic Gaussian distributions on a general convex quadratic function. In contrast to previous theoretical works, which assume an infinite sample size and study the deterministic dynamics of the algorithm, we investigate the expected improvement of the algorithm with a finite sample size. The analysis finds that the relative decrease rates of the distance from the distribution mean to the landscape optimum and of the distribution standard deviation must be the same, a behavior observed in practice that the infinite-sample-size analysis fails to capture. We derive these rates explicitly as a function of the eigenvalues of the Hessian of the objective function and of the strategy parameters. We also derive the stable value of the ratio of the squared distance to the optimum over the distribution variance, as well as the conditions under which this stable value exists. These theoretical values coincide with our numerical simulations.
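A minimal sketch of an IGO instantiation with isotropic Gaussian distributions may help fix ideas. The learning rate `eta` and sample size `lam` are the two strategy parameters discussed above; the truncation-based utility weights are an illustrative assumption of ours, not necessarily the paper's exact setting.

```python
import numpy as np

def igo_isotropic(f, m, sigma, lam=20, eta=0.1, iters=200, rng=None):
    """Sketch of IGO with the isotropic Gaussian family N(m, sigma^2 I).

    eta is the learning rate and lam the sample size.  Utility weights
    use simple truncation selection (best quarter), an illustrative
    choice.  Returns the final mean, step size, and the history of
    (distance to optimum at 0, sigma) pairs."""
    rng = rng or np.random.default_rng(0)
    d = len(m)
    mu = max(1, lam // 4)
    w = np.zeros(lam)
    w[:mu] = 1.0 / mu                          # weights for the mu best
    history = []
    for _ in range(iters):
        z = rng.standard_normal((lam, d))      # standardized samples
        x = m + sigma * z
        order = np.argsort([f(xi) for xi in x])  # minimization: best first
        z = z[order]
        # natural-gradient (IGO) updates for mean and variance
        m = m + eta * sigma * (w @ z)
        sigma2 = sigma**2 * (1 + eta * (w @ (np.sum(z**2, axis=1) / d - 1)))
        sigma = np.sqrt(sigma2)
        history.append((np.linalg.norm(m), sigma))
    return m, sigma, history

f = lambda x: float(x @ x)                     # sphere, a convex quadratic
m, sigma, hist = igo_isotropic(f, m=np.ones(10) * 5.0, sigma=1.0)
```

On such a run one can observe the behavior described above: the distance to the optimum and the standard deviation shrink at matching relative rates once their ratio has stabilized.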
Significance-Based Estimation-of-Distribution Algorithms
Estimation-of-distribution algorithms (EDAs) are randomized search heuristics that create a probabilistic model of the solution space, which is updated iteratively based on the quality of the solutions sampled according to the model. As previous works show, this iteration-based perspective can lead to erratic updates of the model, in particular, to bit-frequencies approaching a random boundary value. In order to overcome this problem, we propose a new EDA based on the classic compact genetic algorithm (cGA) that takes into account a longer history of samples and updates its model only with respect to information which it classifies as statistically significant. We prove that this significance-based cGA (sig-cGA) optimizes the commonly regarded benchmark functions OneMax (OM), LeadingOnes, and BinVal all in quasilinear time, a result shown for no other EDA or evolutionary algorithm so far. For the recently proposed stable compact genetic algorithm, an EDA that tries to prevent erratic model updates by imposing a bias on the uniformly distributed model, we prove that it optimizes OM only in a time exponential in its hypothetical population size. Similarly, we show that the convex search algorithm cannot optimize OM in polynomial time.
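For reference, a minimal sketch of the classic cGA that the sig-cGA builds on, illustrating the per-iteration frequency updates (here with the usual margins at $1/n$ and $1 - 1/n$); the parameter names and budget are our own illustrative choices.

```python
import random

def cga(f, n, K, max_iters=200_000):
    """Compact GA with hypothetical population size K.

    Maintains one frequency p[i] per bit, samples two solutions per
    iteration, and shifts each differing frequency by 1/K toward the
    winner's bit -- the iteration-based update scheme whose erratic
    behavior motivates the sig-cGA."""
    p = [0.5] * n
    for it in range(1, max_iters + 1):
        x = [1 if random.random() < pi else 0 for pi in p]
        y = [1 if random.random() < pi else 0 for pi in p]
        if f(y) > f(x):
            x, y = y, x                              # x is the winner
        for i in range(n):
            if x[i] != y[i]:
                step = 1 / K if x[i] == 1 else -1 / K
                p[i] = min(1 - 1 / n, max(1 / n, p[i] + step))
        if all(pi >= 1 - 1 / n for pi in p):         # model fixed at optimum
            return it
    return max_iters

iters = cga(lambda x: sum(x), n=20, K=50)            # OneMax
```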
Landscape-Aware Performance Prediction for Evolutionary Multiobjective Optimization
We expose and contrast the impact of landscape characteristics on the performance of search heuristics for black-box multiobjective combinatorial optimization problems. A sound and concise summary of features characterizing the structure of an arbitrary problem instance is identified and related to the expected performance of global and local dominance-based multiobjective optimization algorithms. We provide a critical review of existing features tailored to multiobjective combinatorial optimization problems, and we propose additional ones that do not require any global knowledge of the landscape, making them suitable for large problem instances. Their intercorrelation and their association with algorithm performance are also analyzed. This allows us to assess the individual and joint effects of problem features on algorithm performance, and to highlight the main difficulties encountered by such search heuristics. By providing effective tools for multiobjective landscape analysis, we show that multiple features are required to capture problem difficulty, and we provide further insights into the importance of ruggedness and multimodality in characterizing multiobjective combinatorial landscapes.
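In the same spirit as features that avoid global knowledge, the classic random-walk autocorrelation of fitness values can be estimated from local sampling alone. The single-objective sketch below is ours for illustration and is not one of the paper's multiobjective features.

```python
import random

def walk_autocorrelation(f, n, steps=2000, lag=1):
    """Lag-1 autocorrelation of fitness values along a random walk over
    {0,1}^n with one-bit-flip moves -- a classic ruggedness estimate
    that needs no global knowledge of the landscape (values near 1
    indicate a smooth landscape, near 0 a rugged one)."""
    x = [random.randint(0, 1) for _ in range(n)]
    vals = []
    for _ in range(steps):
        vals.append(f(x))
        i = random.randrange(n)          # move to a random neighbor
        x[i] = 1 - x[i]
    m = sum(vals) / len(vals)
    var = sum((v - m) ** 2 for v in vals)
    cov = sum((vals[t] - m) * (vals[t + lag] - m)
              for t in range(len(vals) - lag))
    return cov / var if var > 0 else 1.0

smooth = walk_autocorrelation(lambda x: sum(x), n=50)   # OneMax: near 1
```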
Population Diversity of Nonelitist Evolutionary Algorithms in the Exploration Phase
This paper discusses the genetic diversity of real-coded populations processed by an evolutionary algorithm (EA). Diversity is expressed as a variance or a covariance matrix of the individuals contained in the population, in the one- and multi-dimensional cases, respectively. We focus on the exploration stage of the optimization; the fitness function is therefore modeled as noise. We prove that the expected value of genetic diversity reaches a level proportional to the mutation covariance matrix, where the proportionality coefficient depends solely on the EA parameters. Formulas are derived to predict the diversity for fitness proportionate, tournament, and truncation selection, with and without arithmetic crossover and with Gaussian mutation. Experimental validation of the multidimensional case shows that the prediction accuracy is satisfactory across a broad spectrum of EA parameter settings.
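A toy simulation along these lines (not the paper's derivation): fitness is pure noise to model the exploration phase, parents are chosen by tournament selection, and offspring receive Gaussian mutation; all parameter names and defaults are our illustrative choices.

```python
import random
import statistics

def variance_ratio(pop_size=200, tau=2, sigma=1.0, gens=300):
    """Simulates a one-dimensional real-coded, nonelitist EA in the
    exploration phase: fitness is pure noise, parents are picked by
    tournament selection of size tau, and offspring get Gaussian
    mutation N(0, sigma^2).  Returns the average ratio of population
    variance to mutation variance over the second half of the run."""
    pop = [0.0] * pop_size
    ratios = []
    for g in range(gens):
        fitness = [random.random() for _ in pop]      # noise-only fitness
        nxt = []
        for _ in range(pop_size):
            contenders = random.sample(range(pop_size), tau)
            winner = max(contenders, key=lambda i: fitness[i])
            nxt.append(pop[winner] + random.gauss(0.0, sigma))
        pop = nxt
        if g >= gens // 2:                            # discard burn-in
            ratios.append(statistics.pvariance(pop) / sigma ** 2)
    return sum(ratios) / len(ratios)

ratio = variance_ratio()
```

Measured ratios of this kind are what the paper's formulas predict analytically for the different selection and crossover variants.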