Advances at the frontier of LCS: First step completed

The first step of the volume Advances at the frontier of LCS is almost complete. Below is a list of the camera-ready chapters collected so far. These book chapters cover the contributions to IWLCS in 2003 and 2004.

  • Data Mining in Learning Classifier Systems: Comparing XCS with GAssist.
    Bacardit, J. and Butz, M.
  • Bloat Control and Generalization Pressure using the Minimum Description Length Principle for a Pittsburgh approach Learning Classifier System.
    Bacardit, J. and Garrell, J.M.
  • Improving the Performance of a Pittsburgh Learning Classifier System Using a Default Rule.
    Bacardit, J., Goldberg, D.E., and Butz, M.
  • Effect of Pure Error-Based Fitness in XCS.
    Butz, M., Goldberg, D.E., and Lanzi, P.L.
  • A Formal Relationship Between Ant Colony Optimizers and Classifier Systems.
    Davis, D.
  • An Experimental Comparison between ATNoSFERES and ACS.
    Landau, S., Sigaud, O., Picault, S., and Gérard, P.
  • Where to Go Once You Have Evolved a Bunch of Promising Hypotheses?
    Llorà, X., Bernadó, B., Bacardit, J., and Traus, I.
  • A Hyper-Heuristic Framework with XCS: Learning to Create Novel Problem-Solving Algorithms Constructed from Simpler Algorithmic Ingredients.
    Martín-Blazquez, J. and Schulenburg, S.
  • Backpropagation in Accuracy-based Neural Learning Classifier Systems.
    O’Hara, T. and Bull, L.
  • Use of Learning Classifier System for Inferring Natural Language Grammar.
    Unold, O. and Dabrowski, G.
  • Analyzing Parameter Sensitivity and Classifier Representations for Real-valued XCS.
    Wada, A., Takadama, K., Shimohara, K., and Katai, O.
  • Three Architectures for Continuous Action.
    Wilson, S.W.
  • Using XCS to Describe Continuous-Valued Problem Spaces.
    Wyatt, D., Bull, L., and Parmee, I.

Multi-Objective Machine Learning

The book Multi-Objective Machine Learning, edited by Yaochu Jin, contains several chapters on the use of LCS and GBML for multi-objective machine learning. Among other topics, it covers the use of multi-objective optimization to evolve accurate and compact rule sets with LCS and GBML, and the use of GA-based Pareto optimization for rule extraction from neural networks.