Toward a Steady-State Analysis of an Evolution Strategy on a Robust Optimization Problem With Noise-Induced Multimodality

A steady-state analysis of the optimization quality of a classical self-adaptive evolution strategy (ES) on a class of robust optimization problems is presented. A novel technique for calculating progress rates on nonquadratic noisy fitness landscapes is introduced; it yields asymptotically exact results in the infinite-population-size limit and is applied to a class of functions with noise-induced multimodality. The resulting progress rate formulas are compared with high-precision experiments. The influence of fitness resampling is considered, and the steady-state behavior of the ES is derived and compared with simulations. The questions of whether one should sample and average fitness values, and how to choose the truncation ratio, are discussed, giving rise to further research perspectives.
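As an illustrative aside (not the analysis from the paper itself), fitness resampling in this setting simply averages m independent noisy evaluations of each candidate before truncation selection. A minimal sketch of one generation of a self-adaptive (mu/mu, lambda)-ES with resampling might look as follows; the noisy sphere test function, the parameter values, and the function names are all assumptions made for the sketch, standing in for the paper's noise-induced-multimodality test functions:

```python
import math
import random

def noisy_sphere(x, noise_sigma=0.1):
    """Sphere fitness plus additive Gaussian evaluation noise (illustrative stand-in)."""
    return sum(xi * xi for xi in x) + random.gauss(0.0, noise_sigma)

def resampled_fitness(x, m):
    """Average m independent noisy evaluations; the noise variance drops by a factor 1/m."""
    return sum(noisy_sphere(x) for _ in range(m)) / m

def es_generation(parents, lam, mu, tau, m):
    """One generation of a (mu/mu, lambda)-ES with log-normal sigma self-adaptation.

    Each parent is a pair (object_vector, step_size).
    """
    n = len(parents[0][0])
    # Intermediate (centroid) recombination of object variables and step sizes.
    cx = [sum(p[0][i] for p in parents) / mu for i in range(n)]
    cs = sum(p[1] for p in parents) / mu
    offspring = []
    for _ in range(lam):
        s = cs * math.exp(tau * random.gauss(0.0, 1.0))      # mutate the step size
        x = [xi + s * random.gauss(0.0, 1.0) for xi in cx]   # mutate object variables
        offspring.append((x, s, resampled_fitness(x, m)))
    # Truncation selection on the resampled (averaged) fitness; minimization.
    offspring.sort(key=lambda child: child[2])
    return [(x, s) for x, s, _ in offspring[:mu]]
```

Running this for a few dozen generations drives the centroid toward the optimum until the residual evaluation noise dominates selection, which is exactly the regime where the choice of the resampling number m starts to matter.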

CACM editor-in-chief steps down: advice for editorial boards

In this month’s Communications of the ACM, Moshe Vardi, the current editor-in-chief, publishes a one-page valedictory article (doi:10.1145/3090801) in which he argues editorial boards must be proactive in soliciting interesting submissions to stop their j…

Parameters, Parameters, Parameters

The practice of evolutionary algorithms involves a mundane yet inescapable phase, namely, finding parameters that work well. How big should the population be? How many generations should the algorithm run? What is the tournament size (for tournament selection)? What probabilities should one assign to crossover and mutation? All these nagging questions need good answers if one is to embrace success. Through an extensive series of experiments over multiple evolutionary algorithm implementations and problems, we show that parameter space tends to be rife with viable parameters. We aver that this renders the life of the practitioner that much easier, and we cap off our study with an advisory digest for the weary.
Wanna learn more? The full paper is here.
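As a toy illustration of the claim (not the paper's actual experimental protocol), one can sample GA parameter configurations at random and count how many perform acceptably on a simple benchmark. The benchmark (OneMax), the parameter ranges, and the GA details below are all assumptions made for this sketch:

```python
import random

def onemax(bits):
    """OneMax fitness: count of 1-bits; maximized at the all-ones string."""
    return sum(bits)

def run_ga(n_bits, pop_size, generations, tourney_k, p_cross, p_mut, rng):
    """A plain generational GA: tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # Tournament selection of two parents.
            p1 = max(rng.sample(pop, tourney_k), key=onemax)
            p2 = max(rng.sample(pop, tourney_k), key=onemax)
            c1, c2 = p1[:], p2[:]
            if rng.random() < p_cross:            # one-point crossover
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (c1, c2):                # per-bit mutation
                for i in range(n_bits):
                    if rng.random() < p_mut:
                        child[i] ^= 1
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(onemax(ind) for ind in pop)

def sample_params(rng):
    """Draw one random configuration from broad, purely illustrative ranges."""
    return dict(pop_size=rng.choice([20, 50, 100]),
                generations=rng.choice([30, 60, 100]),
                tourney_k=rng.choice([2, 3, 5]),
                p_cross=rng.uniform(0.5, 1.0),
                p_mut=rng.uniform(0.005, 0.05))
```

Drawing a handful of configurations with `sample_params` and running `run_ga` on each shows that most random draws already solve the benchmark reasonably well, which is the spirit of "parameter space tends to be rife with viable parameters" in miniature.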
