Paradoxes in Numerical Comparison of Optimization Algorithms

Numerical comparison is often key to verifying the performance of optimization algorithms, especially global optimization algorithms. However, studies have so far neglected issues concerning the comparison strategies needed to rank optimization algorithms properly. To fill this gap, we combine, for the first time, the hitherto disjoint research areas of voting theory and numerical comparison, extending results from the former to the latter for optimization algorithms. In particular, we investigate compatibility issues arising from comparing two algorithms and more than two algorithms, termed “C2” and “C2+” in this article, respectively. By defining and modeling “C2” and “C2+” mathematically, we uncover and illustrate that numerical comparison can be incompatible. Further, two possible paradoxes, namely “cycle ranking” and “survival of the nonfittest,” are discovered and analyzed rigorously. The occurrence probabilities of these two paradoxes are calculated under the no-free-lunch assumption, which provides the first justifiable use of the impartial culture assumption from voting theory and gives a point of reference for how often the paradoxes occur. It is also shown that the number of algorithms and the number of optimization problems in the comparison significantly influence these probabilities. Various limiting probabilities, as the number of optimization problems goes to infinity, are also derived and characterized. The results should help guide the benchmarking and development of optimization and machine learning algorithms.
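
To make the “cycle ranking” paradox concrete, the following minimal Python sketch (hypothetical, not taken from the article) compares three algorithms by pairwise majority over three benchmark problems; the per-problem rankings are invented for illustration and yield an intransitive result analogous to the Condorcet paradox in voting theory.

from itertools import permutations

# rankings[p] lists the algorithms from best to worst on problem p
# (synthetic data chosen to produce a cycle).
rankings = {
    "problem1": ["A", "B", "C"],
    "problem2": ["B", "C", "A"],
    "problem3": ["C", "A", "B"],
}

def beats(x, y):
    """Return True if algorithm x outranks y on a majority of problems."""
    wins = sum(order.index(x) < order.index(y) for order in rankings.values())
    return wins > len(rankings) / 2

for x, y in permutations("ABC", 2):
    if beats(x, y):
        print(f"{x} beats {y} on a majority of problems")

# Output: A beats B, B beats C, and C beats A. No consistent overall ranking
# exists, which is the "cycle ranking" phenomenon described in the abstract.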

Table of contents

Presents the table of contents for this issue of the publication.

Evolutionary Black-Box Topology Optimization: Challenges and Promises

Black-box topology optimization (BBTO) uses evolutionary algorithms and other soft computing techniques to generate near-optimal topologies of mechanical structures. Although evolutionary algorithms are widely used to compensate for the limited applicability of conventional gradient-based optimization techniques, methods based on BBTO have been criticized for numerous drawbacks. In this article, we discuss topology optimization as a black-box optimization problem. We review the main BBTO methods, discuss their challenges, and present approaches to alleviate them. Addressing these challenges effectively can broaden the applicability of topology optimization and enable it to tackle industrial, highly constrained, nonlinear, many-objective, and multimodal problems. Consequently, future research in this area may open the door to new applications in science and engineering that go beyond the classical optimization of mechanical structures. Furthermore, algorithms designed for BBTO can be added to existing topology optimization software toolboxes and packages.

Novel Interactive Preference-Based Multiobjective Evolutionary Optimization for Bolt Supporting Networks

Previous methods of designing a bolt supporting network, which rely on engineering experience, seek optimal bolt supporting schemes in terms of supporting quality. Supporting cost and time, however, have not been considered, which restricts their application in real-world situations. We formulate the design of a bolt supporting network as a three-objective optimization model that simultaneously considers quality, economy, and efficiency. In particular, two surrogate models are constructed by support vector regression, one for roof-to-floor convergence and one for two-sided displacement, so that supporting quality can be evaluated rapidly during optimization. To solve the formulated model, a novel interactive preference-based multiobjective evolutionary algorithm is proposed. Compared with generic methods that interactively articulate preferences, its highlight is to systematically manage the regions of interest in three steps, namely “partitioning-updating-tracking,” in accordance with the human cognition process. The preference regions of a decision-maker (DM) are first articulated and used to narrow down the feasible objective space before the evolution, in terms of the nadir point rather than the commonly used ideal point. Then, the DM’s preferences are tracked by dynamically updating these preference regions based on satisfactory candidates during the evolution. Finally, individuals in the population are evaluated based on the preference regions. We apply the proposed model and algorithm to design the bolt supporting network of a practical roadway. The experimental results show that the proposed method generates an optimal bolt supporting scheme with a good balance between supporting quality and the other demands, while also converging faster.
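
Purely to illustrate the surrogate idea described above, the following Python sketch fits a scikit-learn support vector regression model to synthetic samples standing in for expensive roof-to-floor convergence simulations; the design-variable encoding, bounds, and data are hypothetical and not taken from the article.

import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Assume each candidate bolt supporting scheme is encoded as a vector of
# design variables (e.g., bolt spacing, length, pretension) -- hypothetical.
X_train = rng.uniform(low=[0.5, 1.5, 50.0], high=[1.5, 3.0, 150.0], size=(40, 3))
# Roof-to-floor convergence (mm) from expensive simulations; synthetic here.
y_train = 20.0 + 5.0 * X_train[:, 0] - 3.0 * X_train[:, 1] + rng.normal(0, 0.5, 40)

# Fit the surrogate once on the sampled (design, response) pairs.
surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
surrogate.fit(X_train, y_train)

# During evolutionary search, candidate schemes can be screened cheaply
# through the surrogate instead of rerunning the simulation for each one.
candidates = rng.uniform(low=[0.5, 1.5, 50.0], high=[1.5, 3.0, 150.0], size=(5, 3))
predicted_convergence = surrogate.predict(candidates)
print(predicted_convergence)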

Genetic Programming and Evolvable Machines 2020-07-16 00:13:00

The third issue of Volume 21 of Genetic Programming and Evolvable Machines is now available for download.

This is a special issue on Highlights of Genetic Programming 2019 Events (guest edited by Ting Hu, Miguel Nicolau, and Lukas Sekanina, with associated articles indicated below with “[Highlights]”), which also includes a special section on Integrating Numerical Optimization Methods with Genetic Programming (guest edited by Anna I. Esparcia-Alcázar and Leonardo Trujillo, with associated articles indicated with “[Optimization]”).

Contents:

[Highlights]
Guest Editorial
Special issue on highlights of genetic programming 2019 events
Ting Hu, Miguel Nicolau & Lukas Sekanina

[Highlights]
EA-based resynthesis: an efficient tool for optimization of digital circuits
Jitka Kocnova & Zdenek Vasicek

[Highlights]
Horizontal gene transfer for recombining graphs
Timothy Atkinson, Detlef Plump & Susan Stepney

[Highlights]
On the importance of specialists for lexicase selection
Thomas Helmuth, Edward Pantridge & Lee Spector

[Highlights]
A network perspective on genotype–phenotype mapping in genetic programming
Ting Hu, Marco Tomassini & Wolfgang Banzhaf

[Highlights]
Multi-objective genetic programming for manifold learning: balancing quality and dimensionality
Andrew Lensen, Mengjie Zhang & Bing Xue

[Highlights]
Learning feature spaces for regression with genetic programming
William La Cava & Jason H. Moore

[Optimization]
Guest Editorial
Special Issue on Integrating numerical optimization methods with genetic programming
Anna I. Esparcia-Alcázar & Leonardo Trujillo

[Optimization]
Parameter identification for symbolic regression using nonlinear least squares
Michael Kommenda, Bogdan Burlacu, Gabriel Kronberger & Michael Affenzeller

[Optimization]
Unimodal optimization using a genetic-programming-based method with periodic boundary conditions
Rogério C. B. L. Póvoa, Adriano S. Koshiyama, Douglas M. Dias, Patrícia L. Souza & Bruno A. C. Horta

BOOK REVIEW
Arthur I. Miller: The artist in the machine: the world of AI-powered creativity
Anna Olszewska