An Incremental Network with Local Experts Ensemble

Citations: 0
Authors
Shen, Shaofeng [1 ]
Gan, Qiang [1 ]
Shen, Furao [1 ]
Luo, Chaomin [2 ]
Zhao, Jinxi [1 ]
Affiliations
[1] Nanjing Univ, Dept Comp Sci & Technol, Natl Key Lab Novel Software Technol, Nanjing 210008, Jiangsu, Peoples R China
[2] Univ Detroit Mercy, Dept Elect & Comp Engn, Detroit, MI 48221 USA
DOI
10.1007/978-3-319-26555-1_58
Chinese Library Classification
TP18 [theory of artificial intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Ensemble learning algorithms train a group of classifiers to enhance generalization ability. However, most of these algorithms learn in batches, and the base classifiers (e.g., their number and type) must be predetermined. In this paper, we propose an ensemble algorithm called INLEX (Incremental Network with Local EXperts ensemble) that learns a suitable number of linear classifiers in an online, incremental mode. Specifically, it incrementally learns representational nodes of the input space. During this incremental process, INLEX identifies nodes in the decision-boundary area (boundary nodes) based on the theory of entropy: boundary nodes are considered to be disordered. These boundary nodes are activated as experts, each of which is a local linear classifier. Combining these linear experts with dynamically adjusted weights yields a decision boundary that can solve nonlinear classification tasks. Experimental results show that INLEX obtains promising performance on real-world classification benchmarks.
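As a rough illustration of the mechanism the abstract describes (not the authors' implementation), the sketch below substitutes a fixed grid of representational nodes for the paper's incremental node-learning step, marks nodes whose local samples have high label entropy as boundary nodes, fits a least-squares linear expert at each, and combines the experts with distance-based (dynamic) weights. The grid placement, entropy threshold, Gaussian gating, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear 2-class problem: points inside vs. outside a circle.
X = rng.uniform(-1, 1, size=(400, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 0.5).astype(int)

# Step 1: representational nodes (a simple grid stands in for the
# paper's incremental node-learning procedure).
nodes = np.array([[a, b] for a in np.linspace(-0.8, 0.8, 5)
                         for b in np.linspace(-0.8, 0.8, 5)])

def assign(X, nodes):
    # Index of the nearest node for each sample.
    return np.argmin(((X[:, None, :] - nodes[None]) ** 2).sum(-1), axis=1)

idx = assign(X, nodes)

def entropy(labels):
    # Shannon entropy (bits) of the local class distribution.
    p = np.bincount(labels, minlength=2) / len(labels)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Step 2: a node whose local samples mix both classes is "disordered"
# and is activated as an expert hosting a local linear classifier.
experts = []
for k in range(len(nodes)):
    local = idx == k
    if local.sum() >= 5 and entropy(y[local]) > 0.5:
        # Step 3: least-squares linear fit on +/-1 targets (a stand-in
        # for the paper's expert update rule).
        A = np.hstack([X[local], np.ones((local.sum(), 1))])
        t = 2.0 * y[local] - 1.0
        w, *_ = np.linalg.lstsq(A, t, rcond=None)
        experts.append((nodes[k], w))

# Step 4: predict by a distance-weighted combination of the experts;
# the Gaussian gate plays the role of the dynamic weights.
def predict(x):
    num = den = 0.0
    for c, w in experts:
        g = np.exp(-8.0 * ((x - c) ** 2).sum())
        num += g * (w[:2] @ x + w[2])
        den += g
    return int(num / den > 0) if den > 1e-12 else 0

acc = np.mean([predict(x) == t for x, t in zip(X, y)])
print(f"experts: {len(experts)}, training accuracy: {acc:.2f}")
```

Only boundary cells receive experts, so the ensemble stays small while the gated combination of local linear boundaries approximates the circular decision boundary.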
Pages: 515-522
Number of pages: 8
Related Papers
50 records in total
  • [21] Nonlinear Multi-model Ensemble Prediction Using Dynamic Neural Network with Incremental Learning
    Siek, Michael
    Solomatine, Dimitri
    2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011, : 2873 - 2880
  • [22] A Genetic Neural Network Ensemble Forecast Model for Local Heavy Rain
    Shi, X. -M.
    Liu, S. -D.
    Jin, Long
    Zhao, H. -S.
    Zhao, J. -B.
    2010 8TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION (WCICA), 2010, : 2798 - 2802
  • [23] Boosted mixture of experts: An ensemble learning scheme
    Avnimelech, R
    Intrator, N
    NEURAL COMPUTATION, 1999, 11 (02) : 483 - 497
  • [24] A Self-Organizing Incremental Neural Network based on local distribution learning
    Xing, Youlu
    Shi, Xiaofeng
    Shen, Furao
    Zhou, Ke
    Zhao, Jinxi
    NEURAL NETWORKS, 2016, 84 : 143 - 160
  • [25] Learning to Trade with Incremental Support Vector Regression Experts
    Montana, Giovanni
    Parrella, Francesco
    HYBRID ARTIFICIAL INTELLIGENCE SYSTEMS, 2008, 5271 : 591 - 598
  • [26] Noise Source Recognition Based on Two-level Architecture Neural Network Ensemble for Incremental Learning
    Gao Zhihua
    Ben Kerong
    Cui Lilin
    EIGHTH IEEE INTERNATIONAL CONFERENCE ON DEPENDABLE, AUTONOMIC AND SECURE COMPUTING, PROCEEDINGS, 2009, : 587 - +
  • [27] Adaptive Mixtures of Local Experts
    Jacobs, Robert A.
    Jordan, Michael I.
    Nowlan, Steven J.
    Hinton, Geoffrey E.
    NEURAL COMPUTATION, 1991, 3 (01) : 79 - 87
  • [28] Design of lightweight incremental ensemble learning algorithm
    Ding J.
    Tang J.
    Yu Z.
    Xi Tong Gong Cheng Yu Dian Zi Ji Shu/Systems Engineering and Electronics, 2021, 43 (04): : 861 - 867
  • [29] An incremental learning algorithm of ensemble classifier systems
    Kidera, Takuya
    Ozawa, Seiichi
    Abe, Shigeo
    2006 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORK PROCEEDINGS, VOLS 1-10, 2006, : 3421 - +
  • [30] Infinite Lattice Learner: an ensemble for incremental learning
    Lovinger, Justin
    Valova, Iren
    SOFT COMPUTING, 2020, 24 (09) : 6957 - 6974