Clustering ensembles of neural network models

Cited by: 99
Authors
Bakker, B [1 ]
Heskes, T [1 ]
Affiliations
[1] Univ Nijmegen, SNN, NL-6525 EZ Nijmegen, Netherlands
Keywords
clustering; bootstrapping; bias/variance analysis; multitask learning; deterministic annealing; expectation-maximization algorithm; multilayered perceptron; nonlinear model;
DOI
10.1016/S0893-6080(02)00187-9
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
We show that large ensembles of (neural network) models, obtained e.g. in bootstrapping or sampling from (Bayesian) probability distributions, can be effectively summarized by a relatively small number of representative models. In some cases this summary may even yield better function estimates. We present a method to find representative models through clustering based on the models' outputs on a data set. We apply the method to an ensemble of neural network models obtained from bootstrapping on the Boston housing data, and use the results to discuss bootstrapping in terms of bias and variance. A parallel application is the prediction of newspaper sales, where we learn a series of parallel tasks. The results indicate that it is not necessary to store all samples in the ensembles: a small number of representative models generally matches, or even surpasses, the performance of the full ensemble. The clustered representation of the ensemble obtained in this way is much better suited for qualitative analysis, and will be shown to yield new insights into the data. (C) 2003 Elsevier Science Ltd. All rights reserved.
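Method sketch (editorial illustration, not taken from the paper): the abstract's core idea is to represent each ensemble member by its output vector on a common data set, cluster those vectors, and keep one member per cluster as a representative model. The paper performs this clustering with a deterministic-annealing EM procedure; the Python sketch below substitutes plain k-means and synthetic data as simplifying assumptions, so the names, sizes, and numbers are illustrative only.

# Minimal sketch: summarize a bootstrap ensemble of neural networks by
# clustering the models' outputs on a common data set and keeping one
# representative per cluster. K-means stands in for the paper's
# deterministic-annealing EM clustering; all data here are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic regression data standing in for, e.g., the Boston housing set.
X = rng.normal(size=(300, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)

# 1. Build a bootstrap ensemble of neural network models.
n_models = 50
ensemble = []
for _ in range(n_models):
    idx = rng.integers(0, len(X), size=len(X))  # bootstrap resample
    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000)
    net.fit(X[idx], y[idx])
    ensemble.append(net)

# 2. Represent each model by its output vector on a common reference set.
outputs = np.array([net.predict(X) for net in ensemble])  # (n_models, n_points)

# 3. Cluster the output vectors; keep the member closest to each cluster
#    centre as that cluster's representative model.
n_clusters = 5
km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(outputs)
representatives = []
for k in range(n_clusters):
    members = np.where(km.labels_ == k)[0]
    dists = np.linalg.norm(outputs[members] - km.cluster_centers_[k], axis=1)
    representatives.append(ensemble[members[np.argmin(dists)]])

# 4. Compare the full ensemble mean with the mean of the representatives.
full_pred = outputs.mean(axis=0)
rep_pred = np.mean([net.predict(X) for net in representatives], axis=0)
print("MSE full ensemble:   ", np.mean((full_pred - y) ** 2))
print("MSE representatives: ", np.mean((rep_pred - y) ** 2))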
Pages: 261-269
Page count: 9
Related papers
50 records in total
  • [31] Network inference with ensembles of bi-clustering trees
    Konstantinos Pliakos
    Celine Vens
    BMC Bioinformatics, 20
  • [32] Symbolic Interpretation of Trained Neural Network Ensembles
    Chakraborty, Manomita
    INTERNATIONAL JOURNAL OF UNCERTAINTY FUZZINESS AND KNOWLEDGE-BASED SYSTEMS, 2024, 32 (05) : 695 - 719
  • [33] Computational Awareness for Learning Neural Network Ensembles
    Liu, Yong
    2017 IEEE INTERNATIONAL CONFERENCE ON INFORMATION AND AUTOMATION (IEEE ICIA 2017), 2017, : 376 - 380
  • [34] Evolving neural network ensembles by fitness sharing
    Liu, Yong
    Yao, Xin
    2006 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION, VOLS 1-6, 2006, : 3274 - +
  • [35] Sharing training patterns in neural network ensembles
    Dara, RA
    Kamel, M
    2004 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2004, : 1157 - 1161
  • [36] A Genetic Algorithm for Designing Neural Network Ensembles
    Soares, Symone G.
    Antunes, Carlos H.
    Araujo, Rui
    PROCEEDINGS OF THE FOURTEENTH INTERNATIONAL CONFERENCE ON GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, 2012, : 681 - 688
  • [37] Research and development of neural network ensembles: a survey
    Li, Hui
    Wang, Xuesong
    Ding, Shifei
    ARTIFICIAL INTELLIGENCE REVIEW, 2018, 49 (04) : 455 - 479
  • [38] Application of neural network ensembles to incident detection
    Chen, Shuyan
    Wang, Wei
    Qu, Gaofeng
    Lu, Jian
    2007 IEEE INTERNATIONAL CONFERENCE ON INTEGRATION TECHNOLOGY, PROCEEDINGS, 2007, : 388 - +
  • [40] Evolving neural network ensembles for control problems
    Pardoe, David
    Ryoo, Michael
    Miikkulainen, Risto
    GECCO 2005: GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, VOLS 1 AND 2, 2005, : 1379 - 1384