Special Issue on the ‘30th Anniversary of XCS’

Submission open until: October 31, 2024

Guest Editors
Anthony Stein, Tenure Track Professor of Artificial Intelligence in Agricultural Engineering, University of Hohenheim, Germany
Ryan Urbanowicz, Assistant Professor of Computational Biomedicine, Cedars-Sinai, Los Angeles, CA
Will Browne, Professor and Chair of Manufacturing Robotics, Queensland University of Technology, Brisbane, Australia

Learning Classifier Systems (LCSs) are among the first, if not the first, Evolutionary Computation algorithms to adopt machine learning methods. Thus, they belong to the class of evolutionary machine learning algorithms. With a rule-based model representation at their core, they possess unique and valuable properties, such as the inherent interpretability of learned solutions and the ability to model extremely complex and heterogeneous relationships. LCSs were conceived in the mid-1970s by evolutionary computation pioneer John Holland. At that time, these systems were designed to model adaptive agents in his pursuit of understanding complex adaptive systems.

Subsequently, LCSs have proven themselves to be a very effective, flexible, and broadly applicable approach to predictive modeling and sequential problem-solving tasks. They have been successful not only in well-recognized benchmark tasks, e.g., exceeding previous limits in solving multiplexer problems, but, equally importantly, these systems often excel at solving complex classification and regression problems in real-world domains such as biomedicine and intelligent system control.
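For readers less familiar with this classic benchmark: in the k-address-bit multiplexer problem, the first k bits of a binary input address one of the remaining 2^k data bits, and the correct output is the value of the addressed bit. The following minimal Python sketch is purely illustrative; the function name and layout are not taken from any particular LCS implementation.

    def multiplexer(bits, k):
        """k-address-bit multiplexer: the first k bits address one of the
        remaining 2**k data bits; return the addressed data bit's value."""
        assert len(bits) == k + 2 ** k
        address = int("".join(str(b) for b in bits[:k]), 2)
        return bits[k + address]

    # Example: the 6-bit multiplexer (k = 2) on input 1 0 0 1 1 0.
    # Address bits '10' select data bit index 2, whose value is 1.
    print(multiplexer([1, 0, 0, 1, 1, 0], 2))  # prints 1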

What is XCS?
XCS is the archetypal LCS: it embodies many of the core principles while acting as a framework for addressing bespoke problems. It belongs to the category of Michigan-style LCSs, one of the two major families of LCS algorithms. This style is characterized by an online learning strategy and the use of steady-state niche genetic algorithms to optimize coverage of the problem space at hand. XCS differs from earlier Michigan-style LCSs in its accuracy-based fitness, which has been shown to drive rule discovery toward a complete and maximally compact learned problem solution. XCS is an extension of the Zeroth-level Classifier System (ZCS); both were proposed and popularized by Stewart Wilson in the mid-1990s [published in ECJ, hence making it an ideal home for this special issue].
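To make "accuracy-based fitness" more concrete, the sketch below follows the parameter update scheme commonly published for XCS: each rule's reward prediction and prediction error are updated online, the error is mapped to an accuracy value via a threshold and power function, and fitness then tracks the rule's accuracy relative to the other rules in its action set. This is a minimal illustration only; the classifier attributes and the parameter names (beta, epsilon_0, alpha, nu) reflect common usage in the literature rather than any specific implementation.

    # Hedged sketch of XCS-style accuracy-based fitness updates.
    BETA, EPSILON_0, ALPHA, NU = 0.2, 0.01, 0.1, 5.0  # typical parameter values

    def update_action_set(action_set, payoff):
        """Update prediction, error, and fitness of each rule (classifier)
        in the current action set, given the received payoff."""
        # 1) Online updates of reward prediction and prediction error.
        for cl in action_set:
            cl.prediction += BETA * (payoff - cl.prediction)
            cl.error += BETA * (abs(payoff - cl.prediction) - cl.error)

        # 2) Accuracy: 1 for sufficiently accurate rules, a decaying power otherwise.
        kappas = []
        for cl in action_set:
            if cl.error < EPSILON_0:
                kappas.append(1.0)
            else:
                kappas.append(ALPHA * (cl.error / EPSILON_0) ** (-NU))

        # 3) Fitness moves toward the rule's accuracy *relative* to its peers.
        total = sum(k * cl.numerosity for k, cl in zip(kappas, action_set))
        for k, cl in zip(kappas, action_set):
            cl.fitness += BETA * (k * cl.numerosity / total - cl.fitness)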

Since the inception of XCS, interest in LCSs has experienced a new impetus, sparking over three decades of LCS research and leading to outstanding advances in terms of algorithmic innovations, formal theoretical understanding, and a wide range of real-world applications. Even so, there remains enormous potential to expand and improve this class of evolutionary machine learning systems. For example, while the deep learning era has brought many innovations in the utilization of deep neural networks in almost all domains of artificial intelligence, the integration of deep learning with LCSs has, to date, been limited to a handful of promising works that have the potential to lead to a resurgence of interest. Currently, there is growing interest in neurosymbolic systems, where the flexible structure of LCSs provides a framework for integrating connectionist with symbolic learning.

Therefore, for this special issue we solicit papers that explore and contribute to the discussion on open questions such as:

  • How can XCS-based systems be fused with deep learning (or other cutting-edge algorithmic) concepts while maintaining the idiosyncratic advantages of both, e.g., conducting flexible, online, interpretable machine learning combined with the ability to efficiently and accurately model extremely complex problems through hierarchical feature learning?
  • What algorithmic and/or theoretical advances are still needed to overcome persisting limitations of XCS, e.g., the maintenance of long action chains in delayed reward settings within contemporary reinforcement learning tasks?
  • What are novel or potentially untapped application domains, where XCS has been found particularly advantageous over other machine learning techniques?
  • What are the latest deep insights into XCS resulting from mathematical analysis, ablation studies or rigorous method interaction analysis?

Article categories and submission instructions
We solicit manuscripts which belong to the following article types offered by ECJ:
  • Full-length original research articles (including surveys, typically approx. 25 pages)
  • Letters (short articles, typically approx. 6 pages)
ECJ accepts papers that broadly fall into three categories: Applications, Experimental Results, and Theory. Of course, many papers may fall into more than one category.

The focus of this special issue is not exclusively set on original research papers; survey-type papers are equally welcome. For contributions concentrating on novel applications, authors must thoroughly explain why XCS is particularly suited, which algorithmic adaptations facilitate its adoption, and how the presented XCS-based approach compares to alternative methods.
Please carefully follow the general submission guidelines of the Evolutionary Computation journal, which also apply to this special issue. Submissions are handled via the Evolutionary Computation Editorial Manager. Authors must select “Special Issue: 30th Anniversary XCS” as the article type when submitting.

Review Process
All submissions will receive a minimum of two reviews: at least one from a reviewer with a strong LCS background and another from a reviewer with a broader perspective on the EC and EML fields or, in the case of manuscripts focusing on XCS’ application to new domains, from a reviewer selected from the specific application domain.

Please submit your manuscripts by October 31, 2024.

Anticipated timeline:
Manuscript submission: October 31, 2024
Author notification: April 15, 2025
Revision phase until: September 2025
Finalization: October 2025

We invite prospective authors who plan to contribute a survey-type paper to inform the guest editor team upfront in order to prevent potential duplication of effort. In case of any questions, don’t hesitate to send an email to: anthony.stein@uni-hohenheim.de
