Mixture of experts: a literature survey

Cited by: 237
Authors
Masoudnia, Saeed [1]
Ebrahimpour, Reza [2]
Affiliations
[1] Univ Tehran, Sch Math Stat & Comp Sci, Tehran, Iran
[2] Shahid Rajaee Teacher Training Univ, Dept Elect & Comp Engn, Brain & Intelligent Syst Res Lab, Tehran, Iran
Keywords
Classifier combining; Mixture of experts; Mixture of implicitly localised experts; Mixture of explicitly localised experts; INDEPENDENT FACE RECOGNITION; NETWORK STRUCTURE; ENSEMBLE METHODS; MACHINE; CLASSIFICATION; CLASSIFIERS; ALGORITHM; MODEL
DOI
10.1007/s10462-012-9338-y
Chinese Library Classification
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Mixture of experts (ME) is one of the most popular and interesting combining methods, with great potential to improve performance in machine learning. ME is built on the divide-and-conquer principle, in which the problem space is divided among a few neural network experts supervised by a gating network. Earlier works on ME developed different strategies for dividing the problem space between the experts. To survey and analyse these methods more clearly, we present a categorisation of the ME literature based on this difference: ME implementations are classified into two groups according to the partitioning strategy used, and to how and when the gating network is involved in the partitioning and combining procedures. In the first group, the conventional ME and its extensions stochastically partition the problem space into a number of subspaces using a specially designed error function, and experts become specialised in their respective subspaces. In the second group, the problem space is explicitly partitioned by a clustering method before expert training starts, and each expert is then assigned to one of the resulting subspaces. Because the first group partitions the problem space implicitly, through a tacit competitive process between the experts, we call it the mixture of implicitly localised experts (MILE); the second group, which uses pre-specified clusters, is called the mixture of explicitly localised experts (MELE). The properties of the two groups are investigated and compared, and the discussion of their respective advantages and disadvantages shows that the two approaches have complementary features. Moreover, ME is compared with other popular combining methods, including boosting and negative correlation learning. As the investigated methods have complementary strengths and limitations, previous research that attempted to combine their features in integrated approaches is reviewed, and some directions for future work are proposed.
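To make the two partitioning strategies described in the abstract concrete, below is a minimal illustrative sketch in Python/NumPy of the forward pass of a conventional ME (the MILE family) alongside a hard, cluster-based routing in the spirit of MELE. All names, dimensions, and the choice of linear experts are assumptions made for illustration; none of them come from the surveyed paper.

```python
import numpy as np

# Illustrative sketch only: linear experts, a linear-softmax gate, and
# toy dimensions are assumptions, not details from the survey.
rng = np.random.default_rng(0)
n_experts, d_in, d_out = 3, 4, 2

expert_W = rng.standard_normal((n_experts, d_out, d_in))  # one linear expert each
gate_W = rng.standard_normal((n_experts, d_in))           # gating network weights

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def mile_forward(x):
    """Conventional ME: the gate softly partitions the input space and the
    output is the gate-weighted sum of all expert outputs."""
    g = softmax(gate_W @ x)                          # gating weights, sum to 1
    y_experts = np.stack([W @ x for W in expert_W])  # shape (n_experts, d_out)
    return (g[:, None] * y_experts).sum(axis=0), g

def mele_forward(x, centroids):
    """MELE-style routing: the space is pre-partitioned (e.g. by clustering);
    each input goes to the single expert owning its nearest centroid."""
    k = int(np.argmin(((centroids - x) ** 2).sum(axis=1)))
    return expert_W[k] @ x, k

x = rng.standard_normal(d_in)
y_soft, g = mile_forward(x)
centroids = rng.standard_normal((n_experts, d_in))   # stand-in for k-means centres
y_hard, k = mele_forward(x, centroids)
print("MILE gate:", g, "-> soft combination of all experts")
print("MELE route: expert", k, "-> hard assignment to one expert")
```

In the MILE setting the gate and experts would be trained jointly, so the soft partition emerges during learning through the experts' competition; in the MELE setting the centroids would come from a clustering step that runs before expert training, and each expert is fit only on its own cluster.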
Pages: 275-293
Number of pages: 19