Clustering ensembles of neural network models

Cited by: 99
Authors
Bakker, B [1 ]
Heskes, T [1 ]
Affiliation
[1] Univ Nijmegen, SNN, NL-6525 EZ Nijmegen, Netherlands
Keywords
clustering; bootstrapping; bias/variance analysis; multitask learning; deterministic annealing; expectation-maximization algorithm; multilayered perceptron; nonlinear model;
DOI
10.1016/S0893-6080(02)00187-9
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We show that large ensembles of (neural network) models, obtained e.g. in bootstrapping or sampling from (Bayesian) probability distributions, can be effectively summarized by a relatively small number of representative models. In some cases this summary may even yield better function estimates. We present a method to find representative models through clustering based on the models' outputs on a data set. We apply the method to an ensemble of neural network models obtained from bootstrapping on the Boston housing data, and use the results to discuss bootstrapping in terms of bias and variance. A parallel application is the prediction of newspaper sales, where we learn a series of parallel tasks. The results indicate that it is not necessary to store all samples in the ensembles: a small number of representative models generally matches, or even surpasses, the performance of the full ensemble. The clustered representation of the ensemble thus obtained is much better suited for qualitative analysis, and will be shown to yield new insights into the data. (C) 2003 Elsevier Science Ltd. All rights reserved.
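The core idea of the abstract can be sketched in a few lines: bootstrap an ensemble, represent each model by its output vector on a data set, cluster those vectors, and keep one representative model per cluster. This is a simplified, hypothetical illustration only: it uses plain linear least-squares models instead of the paper's multilayer perceptrons, and plain k-means instead of the EM / deterministic-annealing clustering the paper describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (a stand-in for e.g. the Boston housing set).
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.5 * rng.normal(size=200)

# 1. Bootstrap ensemble of simple linear models (the paper uses MLPs;
#    least squares keeps this sketch self-contained and runnable).
n_models = 50
ensemble = []
for _ in range(n_models):
    idx = rng.integers(0, len(X), len(X))
    w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    ensemble.append(w)

# 2. Represent each model by its outputs on the data set, then cluster
#    those output vectors (plain k-means here, not the paper's
#    EM / deterministic-annealing procedure).
outputs = np.array([X @ w for w in ensemble])   # shape (n_models, n_points)
k = 3
centers = outputs[rng.choice(n_models, k, replace=False)]
for _ in range(20):
    d = ((outputs[:, None, :] - centers[None]) ** 2).sum(-1)
    labels = d.argmin(1)
    centers = np.array([outputs[labels == j].mean(0)
                        if np.any(labels == j) else centers[j]
                        for j in range(k)])

# 3. One representative model per cluster: the member whose output
#    vector lies closest to its cluster centre.
reps = [int(np.argmin(((outputs - centers[j]) ** 2).sum(1)
                      + 1e9 * (labels != j)))
        for j in range(k)]

full_pred = outputs.mean(0)          # full-ensemble average prediction
rep_pred = outputs[reps].mean(0)     # average over 3 representatives
print("full-ensemble MSE:", np.mean((full_pred - y) ** 2))
print("3-representative MSE:", np.mean((rep_pred - y) ** 2))
```

With models this simple the representatives track the full ensemble closely; the paper's point is that the same holds for much larger, nonlinear ensembles, while the small set of representatives is far easier to inspect.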
Pages: 261-269 (9 pages)