Ensembling neural networks: Many could be better than all

Cited by: 1396
Authors
Zhou, ZH [1 ]
Wu, JX [1 ]
Tang, W [1 ]
Affiliation
[1] Nanjing Univ, Natl Lab Novel Software Technol, Nanjing 210093, Peoples R China
Keywords
neural networks; neural network ensemble; machine learning; selective ensemble; boosting; bagging; genetic algorithm; bias-variance decomposition;
DOI
10.1016/S0004-3702(02)00190-X
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A neural network ensemble is a learning paradigm in which several neural networks are jointly used to solve a problem. In this paper, the relationship between an ensemble and its component neural networks is analyzed in the context of both regression and classification, which reveals that it may be better to ensemble many instead of all of the neural networks at hand. This result is interesting because, at present, most approaches ensemble all available neural networks for prediction. Then, to show that the neural networks appropriate for composing an ensemble can be effectively selected from a set of available networks, an approach named GASEN is presented. GASEN first trains a number of neural networks. It then assigns random weights to those networks and employs a genetic algorithm to evolve the weights so that they characterize, to some extent, the fitness of the networks for constituting an ensemble. Finally, it selects some of the networks, based on the evolved weights, to make up the ensemble. A large empirical study shows that, compared with popular ensemble approaches such as Bagging and Boosting, GASEN can generate neural network ensembles of far smaller size but stronger generalization ability. Furthermore, to explain the working mechanism of GASEN, a bias-variance decomposition of the error is provided, which shows that the success of GASEN may lie in its ability to significantly reduce both the bias and the variance. (C) 2002 Elsevier Science B.V. All rights reserved.
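The abstract describes the GASEN procedure only in prose. Below is a minimal runnable sketch of the same idea, not the authors' implementation: it stands in a simple mutation-only evolutionary loop for the paper's genetic algorithm, uses scikit-learn MLPs trained on bootstrap samples as the candidate networks, takes validation MSE of the weighted ensemble as the fitness, and assumes a selection threshold of 1/N for the evolved weights (the paper uses a pre-set threshold λ).

```python
# Minimal GASEN-style sketch (illustrative only, not the authors' code).
# Assumptions: candidates are scikit-learn MLPs trained on bootstrap samples;
# the GA is a simple mutation-only loop; the selection threshold is 1/N.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=400, n_features=8, noise=10.0, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)

# 1. Train N candidate networks on bootstrap samples (as in Bagging).
N = 10
nets = []
for i in range(N):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    nets.append(MLPRegressor(hidden_layer_sizes=(16,), max_iter=500,
                             random_state=i).fit(X_tr[idx], y_tr[idx]))
preds = np.stack([net.predict(X_va) for net in nets])  # shape (N, n_val)

def fitness(w):
    """Validation MSE of the w-weighted ensemble (lower is better)."""
    w = w / w.sum()
    return np.mean((w @ preds - y_va) ** 2)

# 2. Evolve the weight vector with a tiny mutation-only evolutionary loop.
pop = rng.random((30, N))
for _ in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[:10]]             # keep the 10 fittest
    pop = np.clip(parents.repeat(3, axis=0)
                  + rng.normal(0.0, 0.05, (30, N)), 1e-6, None)
best = pop[np.argmin([fitness(w) for w in pop])]
best = best / best.sum()

# 3. Select the networks whose evolved weight exceeds 1/N, then combine the
#    selected networks by simple averaging.
selected = [net for net, w in zip(nets, best) if w > 1.0 / N]
ensemble = np.mean([net.predict(X_va) for net in selected], axis=0)
print(f"kept {len(selected)}/{N} nets, "
      f"ensemble MSE = {np.mean((ensemble - y_va) ** 2):.1f}")
```

For reference, the bias-variance decomposition the abstract refers to is, in its standard squared-error form, E[(f_hat(x) - y)^2] = bias^2 + variance + irreducible noise; the paper's finding is that discarding the poorly weighted networks reduces both the bias and the variance terms.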
Pages: 239-263 (25 pages)
Related Papers
50 items in total
  • [1] Ensembling neural networks: Many could be better than all (vol 137, pg 239, 2002)
    Zhou, Zhi-Hua
    Wu, Jianxin
    Tang, Wei
    ARTIFICIAL INTELLIGENCE, 2010, 174 (18) : 1570 - 1570
  • [2] A mixture of shallow neural networks for virtual sensing: Could perform better than deep neural networks
    Shao, Weiming
    Li, Xu
    Xing, Yupeng
    Chen, Junghui
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 256
  • [3] Mutation Selection: Some Could be Better than All
    Zhang, Zhiyi
    You, Dongjiang
    Chen, Zhenyu
    Zhou, Yuming
    Xu, Baowen
    EAST 2011: EVIDENTIAL ASSESSMENT OF SOFTWARE TECHNOLOGIES, 2011, : 10 - 17
  • [4] Are analog neural networks better than binary neural networks?
    Vidyasagar, M
    CIRCUITS SYSTEMS AND SIGNAL PROCESSING, 1998, 17 (02) : 243 - 270
  • [5] Ensemble modelling or selecting the best model: Many could be better than one
    Barai, SV
    Reich, Y
    AI EDAM-ARTIFICIAL INTELLIGENCE FOR ENGINEERING DESIGN ANALYSIS AND MANUFACTURING, 1999, 13 (05): : 377 - 386
  • [6] Artificial neural networks - better than the real thing
    Ashare, AB
    Chakraborty, DP
    JOURNAL OF NUCLEAR MEDICINE, 1994, 35 (12) : 2048 - 2049
  • [7] Uncertainty in Neural Networks: Approximately Bayesian Ensembling
    Pearce, Tim
    Leibfried, Felix
    Brintrup, Alexandra
    Zaki, Mohamed
    Neely, Andy
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 234 - 243
  • [8] Many Could Be Better Than All: A Novel Instance-Oriented Algorithm for Multi-Modal Multi-Label Problem
    Zhang, Yi
    Zeng, Cheng
    Cheng, Hao
    Wang, Chongjun
    Zhang, Lei
    2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2019, : 838 - 843
  • [9] Ensembling Graph Neural Networks for Node Classification
    Lin, Ke-Ao
    Xie, Xiao-Zhu
    Weng, Wei
    Chen, Yong
    Journal of Network Intelligence, 2024, 9 (02) : 804 - 818