Using Coverage as a Model Building Constraint in Learning Classifier Systems

Cited by: 18
Authors
Greene, David Perry [1 ]
Smith, Stephen F. [1 ]
Affiliations
[1] Carnegie Mellon Univ, Inst Robot, Sch Comp Sci, Pittsburgh, PA 15213 USA
DOI
10.1162/evco.1994.2.1.67
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Promoting and maintaining diversity is a critical requirement of search in learning classifier systems (LCSs). What is required of the genetic algorithm (GA) in an LCS context is not convergence to a single global maximum, as in the standard optimization framework, but instead the generation of individuals (i.e., rules) that collectively cover the overall problem space. COGIN (Coverage-based Genetic INduction) is a system designed to exploit genetic recombination for the purpose of constructing rule-based classification models from examples. The distinguishing characteristic of COGIN is its use of coverage of training set examples as an explicit constraint on the search, which acts to promote appropriate diversity in the population of rules over time. By treating training examples as limited resources, COGIN creates an ecological model that simultaneously accommodates a dynamic range of niches while encouraging superior individuals within a niche, leading to concise and accurate decision models. Previous experimental studies with COGIN have demonstrated its performance advantages over several well-known symbolic induction approaches. In this paper, we examine the effects of two modifications to the original system configuration, each designed to inject additional diversity into the search: increasing the carrying capacity of training set examples (i.e., increasing coverage redundancy) and increasing the level of disruption in the recombination operator used to generate new rules. Experimental results are given that show both types of modifications to yield substantial improvements to previously published results.
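The coverage constraint described above — treating each training example as a limited resource that only a bounded number of rules may "consume" — can be illustrated with a small sketch. The function name, data layout, and greedy admit-best-first policy below are assumptions for illustration, not the paper's actual implementation; the `capacity` parameter plays the role of the "carrying capacity" (coverage redundancy) the abstract discusses.

```python
def coverage_filter(rules, examples, matches, capacity=1):
    """Greedily admit rules while they cover under-covered examples.

    Hypothetical sketch of a COGIN-style coverage constraint.
    rules    : rule ids, assumed pre-sorted best-fitness-first
    examples : training example ids
    matches  : dict mapping rule id -> set of example ids it classifies correctly
    capacity : max number of rules allowed to cover each example
               (the "carrying capacity" / coverage redundancy)
    """
    slots = {e: capacity for e in examples}  # remaining coverage slots per example
    kept = []
    for r in rules:
        # A rule survives only if it claims at least one unfilled slot.
        covered = [e for e in matches.get(r, ()) if slots[e] > 0]
        if covered:
            kept.append(r)
            for e in covered:
                slots[e] -= 1  # consume the limited resource
    return kept
```

With `capacity=1`, a weaker rule that matches only already-covered examples is excluded even if it is individually fit, which is what pushes the population toward collectively covering the problem space; raising `capacity` admits redundant coverage and thereby injects extra diversity, one of the two modifications the paper studies.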
Pages: 67-91 (25 pages)
Related papers
50 entries in total
  • [31] Imitation guided learning in learning classifier systems
    Métivier M.
    Lattaud C.
    Natural Computing, 2009, 8 (1) : 29 - 56
  • [32] A secure model for building e-learning systems
    Masadeh, Shadi R
    Turab, Nedal
    Obisat, Farhan
    Network Security, 2012, 2012 (01) : 17 - 20
  • [33] Prediction using Pittsburgh Learning Classifier Systems: APCS use case
    Peroumalnaik, Mathias
    Enee, Gilles
    GECCO-2010 COMPANION PUBLICATION: PROCEEDINGS OF THE 12TH ANNUAL GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, 2010, : 1901 - 1907
  • [34] Using Accuracy-Based Learning Classifier Systems for Imbalance Datasets
    Udomthanapong, Sornchai
    Tamee, Kreangsak
    Pinngern, Ouen
    ECTI-CON 2008: PROCEEDINGS OF THE 2008 5TH INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING/ELECTRONICS, COMPUTER, TELECOMMUNICATIONS AND INFORMATION TECHNOLOGY, VOLS 1 AND 2, 2008, : 21 - 24
  • [35] Robust Bidding in Learning Classifier Systems Using Loan and Bid History
    Workineh, Abrham
    Homaifar, Abdollah
    COMPLEX SYSTEMS, 2011, 19 (03): : 287 - 303
  • [36] Incremental learning with multiple classifier systems using correction filters for classification
    del Campo-Avila, Jose
    Ramos-Jimenez, Gonzalo
    Morales-Bueno, Rafael
    ADVANCES IN INTELLIGENT DATA ANALYSIS VII, PROCEEDINGS, 2007, 4723 : 106 - +
  • [37] Supervised learning with a quantum classifier using multi-level systems
    Soumik Adhikary
    Siddharth Dangwal
    Debanjan Bhowmik
    Quantum Information Processing, 2020, 19
  • [38] A Comprehensive Strategy for Mammogram Image Classification using Learning Classifier Systems
    Siddique, Abubakar
    Iqbal, Muhammad
    Browne, Will N.
    2016 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2016, : 2201 - 2208
  • [39] Coordination number prediction using learning classifier systems: Performance and interpretability
    Bacardit, Jaume
    Stout, Michael
    Krasnogor, Natalio
    Hirst, Jonathan D.
    Blazewicz, Jacek
    GECCO 2006: GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, VOL 1 AND 2, 2006, : 247 - +