Mixture of neural networks: Some experiments with the multilayer feedforward architecture

Authors
Torres-Sospedra, Joaquin [1 ]
Hernandez-Espinosa, Carlos [1 ]
Fernandez-Redondo, Mercedes [1 ]
Affiliation
[1] Univ Jaume 1, Dept Ingn & Ciencia Computadores, Castellon de La Plana 12071, Spain
Source
NEURAL INFORMATION PROCESSING, PT 1, PROCEEDINGS | 2006 / Vol. 4232
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A Modular Multi-Net System consists of several networks, each of which partially solves a problem: the original problem is decomposed into subproblems, and each network focuses on solving one subproblem. The Mixture of Neural Networks (MixNN) consists of a set of expert networks, which solve the subproblems, and a gating network, which weights the outputs of the expert networks. The expert networks and the gating network are trained together in order to reduce the correlation among the networks and minimize the error of the system. In this paper we present the Mixture of Multilayer Feedforward (MixMF), a method based on MixNN which uses Multilayer Feedforward networks at the expert level. Finally, we have performed a comparison among Simple Ensemble, MixNN and MixMF, and the results show that MixMF is the best performing method.
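The joint training of expert networks and a gating network described in the abstract can be sketched as follows. This is a minimal illustrative NumPy implementation, not the authors' code: the class name, network sizes, learning rate, and toy dataset are all assumptions. Each expert is a one-hidden-layer feedforward network, the gate is a linear layer with softmax, and both are updated together by gradient descent on the squared error of the gated combination.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class MixMF:
    """Illustrative mixture of multilayer feedforward experts (hypothetical names)."""

    def __init__(self, d_in, d_hidden, n_experts):
        # each expert: tanh hidden layer, linear scalar output
        self.W1 = rng.normal(0, 0.5, (n_experts, d_in, d_hidden))
        self.w2 = rng.normal(0, 0.5, (n_experts, d_hidden))
        # gating network: linear layer followed by softmax
        self.G = rng.normal(0, 0.5, (d_in, n_experts))

    def forward(self, X):
        h = np.tanh(np.einsum('nd,kdh->nkh', X, self.W1))  # hidden activations (N,K,H)
        y_k = np.einsum('nkh,kh->nk', h, self.w2)          # per-expert outputs (N,K)
        g = softmax(X @ self.G)                            # gating weights (N,K)
        return (g * y_k).sum(axis=1), y_k, g, h            # gated combination

    def train_step(self, X, y, lr=0.05):
        y_hat, y_k, g, h = self.forward(X)
        N = len(X)
        err = (y_hat - y)[:, None]                         # (N,1)
        # gradients of 0.5 * mean((y_hat - y)^2), updating experts and gate together
        d_yk = err * g / N                                 # dL/dy_k
        dw2 = np.einsum('nk,nkh->kh', d_yk, h)
        dh = d_yk[:, :, None] * self.w2[None] * (1 - h**2)
        dW1 = np.einsum('nd,nkh->kdh', X, dh)
        dg = err * y_k / N                                 # dL/dg
        dz = g * (dg - (dg * g).sum(axis=1, keepdims=True))  # softmax backward
        dG = X.T @ dz
        self.w2 -= lr * dw2
        self.W1 -= lr * dW1
        self.G -= lr * dG
        return float((err**2).mean())                      # mean squared error

# toy problem with two regimes, so the gate can assign different experts to each
X = rng.uniform(-1, 1, (200, 2))
y = np.where(X[:, 0] > 0, np.sin(3 * X[:, 1]), X[:, 1]**2)
model = MixMF(d_in=2, d_hidden=8, n_experts=3)
losses = [model.train_step(X, y) for _ in range(500)]
```

Because the error of the combined output is backpropagated through the gating weights into every expert at once, the experts specialize on the regions the gate assigns to them, which is the correlation-reducing effect the abstract refers to.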
Pages: 616-625
Page count: 10