Expensive Multiobjective Evolutionary Optimization Assisted by Dominance Prediction

We propose a new surrogate-assisted evolutionary algorithm for expensive multiobjective optimization. Two classification-based surrogate models are used, which predict the Pareto dominance relation and the $\theta$-dominance relation between two solutions, respectively. To make such surrogates as accurate as possible, we formulate dominance prediction as an imbalanced classification problem and address it using deep learning techniques. Furthermore, to integrate the dominance-prediction surrogates with multiobjective evolutionary optimization, we develop a two-stage preselection strategy. This strategy aims to select a promising solution for evaluation among those produced by genetic operations, taking proper account of the balance between convergence and diversity. We conduct an empirical study on a number of well-known multiobjective and many-objective benchmark problems, using a relatively small number of function evaluations. Our experimental results demonstrate the superiority of the proposed algorithm compared with several representative surrogate-assisted algorithms.
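To illustrate how dominance prediction can be cast as a classification problem, the sketch below labels pairs of objective vectors by their Pareto dominance relation. The labeling convention and pair-sampling scheme are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def pareto_dominance_label(f_a, f_b):
    """Return the Pareto dominance relation between two objective
    vectors (minimization): 1 if a dominates b, -1 if b dominates a,
    0 if the two solutions are mutually non-dominated."""
    f_a, f_b = np.asarray(f_a, dtype=float), np.asarray(f_b, dtype=float)
    if np.all(f_a <= f_b) and np.any(f_a < f_b):
        return 1
    if np.all(f_b <= f_a) and np.any(f_b < f_a):
        return -1
    return 0

# With many objectives, most random pairs are mutually non-dominated,
# so class 0 swamps the training data -- the class imbalance that
# motivates treating dominance prediction as imbalanced classification.
rng = np.random.default_rng(0)
points = rng.random((30, 5))           # 30 solutions, 5 objectives
labels = [pareto_dominance_label(a, b) for a in points for b in points]
```

A classifier trained on such labeled pairs can then rank offspring without evaluating the expensive objective functions.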

Call for Papers: Thirtieth Anniversary of Genetic Programming: On the Programming of Computers by Means of Natural Selection

In 1992, John R. Koza published his first book on Genetic Programming (GP): “Genetic Programming: On the Programming of Computers by Means of Natural Selection” [1]. This ground-breaking book paved the way for the establishment of a new field of study. It influenced the work of thousands of researchers and practitioners worldwide, many of whom aimed to continue the exploration, formalization and improvement of the original formulation of GP and/or to apply GP to challenging problems.  

We aim to celebrate the 30th anniversary of [1] with a special issue that focuses on the multiple impacts that the book had, and is still having, on the GP field. We hope that the special issue will illustrate many of the ways in which the ideas proposed by Koza in [1] have influenced and are still influencing the GP community. 

[1] Koza, J. R. (1992). Genetic Programming: On the Programming of Computers by Means of Natural Selection. Cambridge, MA, USA: MIT Press. ISBN: 0-262-11170-5

TOPICS OF INTEREST

We are open to a broad range of submissions that, in various ways, help us better understand the impact of [1] on the GP community. Submissions that would be welcomed might for example present: 

  • High-quality review articles on [1] and on subsequent books or other publications, possibly including a deep discussion of their relationships, similarities, and differences with [1].
  • Discussions of ways in which current techniques and practices are similar to or different from those recommended in [1], and the ways in which the methods presented in [1] have evolved towards more modern and effective methods.
  • Challenges that [1] regarded as open, and the extent to which they have now been fully addressed or still constitute open issues for the GP field. 
  • Ground-breaking ideas that were proposed in [1] and that have inspired research in the past, or may inspire new research in the future.
  • Ideas in [1] that have received little attention to date, which might be beneficial to revisit.
  • Works that deal with the impact that [1] has had in applied domains, exploring and/or surveying the achievements and limits of the methods and ideas proposed in [1] for solving real-world problems.
  • Applications that directly build on the techniques described in [1] and are able to achieve human-competitive results.

TENTATIVE TIMELINE

  • Submission deadline: 30 July 2022
  • Initial reviews: 15 October 2022
  • Resubmissions: 10 December 2022
  • Final notifications: 10 February 2023

GUEST EDITORS

Leonardo Vanneschi 
NOVA IMS, Universidade Nova de Lisboa, Portugal
lvanneschi@novaims.unl.pt
Leonardo Trujillo 
Instituto Tecnológico de Tijuana, Mexico
leonardo.trujillo@tectijuana.edu.mx

SUBMISSION GUIDELINES

Authors are encouraged to submit high-quality, original work that has neither appeared in, nor is under consideration by, other journals. All papers will be reviewed following the Journal's standard reviewing procedures. Papers must be prepared in accordance with the Journal guidelines: www.springer.com/10710.

Submit manuscripts to: http://GENP.edmgr.com.  Select “S.I. Thirtieth Anniversary of Genetic Programming: On the Programming of Computers by Means of Natural Selection” as the article type or when asked if the article is for a special issue.

Springer provides a wealth of information about publishing in a Springer journal on its Journal Author Resources page, including FAQs and Tutorials, along with Help and Support.

Additional information can be found on the official Springer Call for Papers.

Evolutionary Neural Architecture Search for High-Dimensional Skip-Connection Structures on DenseNet Style Networks

Convolutional neural networks hold state-of-the-art results for image classification, and many neural architecture search algorithms have been proposed to discover high-performance convolutional neural networks. However, the use of neural architecture search for the discovery of skip-connection structures, an important element of modern convolutional neural networks, remains limited in the literature. Furthermore, while many neural architecture search algorithms use performance estimation techniques to reduce computation time, empirical evaluations of these techniques remain limited. This work uses evolutionary neural architecture search to examine the space of networks that follow a fundamental DenseNet structure but have no fixed skip connections. In particular, a genetic algorithm is designed that searches the space consisting of all networks between a standard feedforward network and the corresponding DenseNet. To design the algorithm, lower-fidelity performance estimation of this class of networks is examined and presented. The final algorithm finds networks that are more accurate than DenseNets on CIFAR10 and CIFAR100 while having fewer trainable parameters. The structures found by the algorithm are examined to shed light on the importance of different types of skip-connection structures in convolutional neural networks, including the discovery of a simple skip-connection removal that improves DenseNet performance on CIFAR10.
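The search space described above, spanning all networks between a plain feedforward chain and a full DenseNet, can be sketched as a binary genome over candidate skip connections. The encoding and operators below are illustrative assumptions, not the paper's exact design:

```python
import itertools
import random

def candidate_skips(num_layers):
    """Enumerate candidate skip connections (i -> j with j > i + 1) in a
    DenseNet-style block; the direct i -> i+1 edges form the fixed
    feedforward backbone and are not part of the search."""
    return [(i, j) for i, j in itertools.combinations(range(num_layers), 2)
            if j > i + 1]

def random_genome(num_skips, p=0.5, rng=random):
    # One bit per candidate skip connection: 1 = edge present.
    return [int(rng.random() < p) for _ in range(num_skips)]

def uniform_crossover(a, b, rng=random):
    # Each bit is inherited from either parent with equal probability.
    return [x if rng.random() < 0.5 else y for x, y in zip(a, b)]

skips = candidate_skips(6)
# All-zeros genome = plain feedforward network; all-ones = full DenseNet.
feedforward = [0] * len(skips)
densenet = [1] * len(skips)
child = uniform_crossover(feedforward, densenet)
```

Under this encoding the genetic algorithm explores the 2^k networks between the two extremes, where k is the number of candidate skip connections in the block.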

Evolutionary Neural Architecture Search for High-Dimensional Skip-Connection Structures on DenseNet Style Networks

Convolutional neural networks hold state-of-the-art results for image classification, and many neural architecture search algorithms have been proposed to discover high performance convolutional neural networks. However, the use of neural architecture …

Convolutional neural networks hold state-of-the-art results for image classification, and many neural architecture search algorithms have been proposed to discover high performance convolutional neural networks. However, the use of neural architecture search for the discovery of skip-connection structures, an important element in modern convolutional neural networks, is limited within the literature. Furthermore, while many neural architecture search algorithms utilize performance estimation techniques to reduce computation time, empirical evaluations of these performance estimation techniques remain limited. This work focuses on utilizing evolutionary neural architecture search to examine the search space of networks, which follow a fundamental DenseNet structure, but have no fixed skip connections. In particular, a genetic algorithm is designed, which searches the space consisting of all networks between a standard feedforward network and the corresponding DenseNet. To design the algorithm, lower fidelity performance estimation of this class of networks is examined and presented. The final algorithm finds networks that are more accurate than DenseNets on CIFAR10 and CIFAR100, and have fewer trainable parameters. The structures found by the algorithm are examined to shed light on the importance of different types of skip-connection structures in convolutional neural networks, including the discovery of a simple skip-connection removal, which improves DenseNet performance on CIFAR10.