Mixture of experts: a literature survey

Cited by: 237
Authors
Masoudnia, Saeed [1 ]
Ebrahimpour, Reza [2 ]
Affiliations
[1] Univ Tehran, Sch Math Stat & Comp Sci, Tehran, Iran
[2] Shahid Rajaee Teacher Training Univ, Dept Elect & Comp Engn, Brain & Intelligent Syst Res Lab, Tehran, Iran
Keywords
Classifier combining; Mixture of experts; Mixture of implicitly localised experts; Mixture of explicitly localised experts; INDEPENDENT FACE RECOGNITION; NETWORK STRUCTURE; ENSEMBLE METHODS; MACHINE; CLASSIFICATION; CLASSIFIERS; ALGORITHM; MODEL;
DOI
10.1007/s10462-012-9338-y
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Mixture of experts (ME) is one of the most popular and interesting combining methods, with great potential to improve performance in machine learning. ME is based on the divide-and-conquer principle, in which the problem space is divided among a few neural network experts supervised by a gating network. Earlier works on ME developed different strategies for dividing the problem space among the experts. To survey and analyse these methods more clearly, we present a categorisation of the ME literature based on this difference. Various ME implementations are classified into two groups according to the partitioning strategy used, and both how and when the gating network is involved in the partitioning and combining procedures. In the first group, the conventional ME and its extensions stochastically partition the problem space into a number of subspaces using a specially designed error function, and the experts become specialised in those subspaces. In the second group, the problem space is explicitly partitioned by a clustering method before the experts' training process starts, and each expert is then assigned to one of these subspaces. Because the first group partitions the problem space implicitly, through a tacit competitive process between the experts, we call it the mixture of implicitly localised experts (MILE); the second group, which uses pre-specified clusters, is called the mixture of explicitly localised experts (MELE). The properties of the two groups are investigated and compared. This comparison of MILE versus MELE, discussing the advantages and disadvantages of each group, shows that the two approaches have complementary features. Moreover, the features of the ME method are compared with those of other popular combining methods, including boosting and negative correlation learning.
As the investigated methods have complementary strengths and limitations, previous research that attempted to combine their features in integrated approaches is reviewed, and some suggestions for future research directions are proposed.
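The gating-network combination described in the abstract can be illustrated with a minimal sketch of an ME forward pass. All names, shapes, and the use of linear experts and a linear gate are illustrative assumptions, not the implementation surveyed in the paper; real ME experts are typically neural networks trained jointly with the gate.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_experts, n_outputs = 4, 3, 2

# Each "expert" is a simple linear model here (hypothetical stand-in for a
# neural network expert specialised on one subspace of the problem).
expert_weights = [rng.standard_normal((n_features, n_outputs))
                  for _ in range(n_experts)]

# The gating network maps the input to a softmax weighting over the experts,
# so the gate outputs are non-negative and sum to one.
gate_weights = rng.standard_normal((n_features, n_experts))

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def mixture_forward(x):
    gate = softmax(x @ gate_weights)                          # (n_experts,)
    expert_outs = np.stack([x @ W for W in expert_weights])   # (n_experts, n_outputs)
    return gate @ expert_outs                                 # gate-weighted sum

x = rng.standard_normal(n_features)
y = mixture_forward(x)
print(y.shape)  # (2,)
```

In the MILE family this gate is trained jointly with the experts through the error function, inducing the implicit competitive partitioning; in the MELE family the assignment of inputs to experts is instead fixed in advance by clustering.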
Pages: 275-293
Page count: 19
Related papers (50 total)
  • [21] Prior Distribution Selection for a Mixture of Experts
    A. V. Grabovoy
    V. V. Strijov
    Computational Mathematics and Mathematical Physics, 2021, 61 : 1140 - 1152
  • [22] Modeling time dependencies in the mixture of experts
    Fancourt, CL
    Principe, JC
    IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE, 1998, : 2324 - 2327
  • [23] SPARSE BAYESIAN HIERARCHICAL MIXTURE OF EXPERTS
    Mossavat, Iman
    Amft, Oliver
    2011 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2011, : 653 - 656
  • [24] Mixture of Experts for Facial Forgery Detection
    Luo, Chaoyang
    Yang, Linxi
    Zhao, Guoqing
    Jiang, Ning
    Pi, Jiatian
    Wu, Zhiyou
    JOURNAL OF IMAGING SCIENCE AND TECHNOLOGY, 2022, 66 (06)
  • [25] Parameters Modeling for a Modified Mixture of Experts
    Jin, Jian
    Huang, Guo-Xing
    Ding, Jianguo
    Hu, Yong-Tao
    2008 7TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-23, 2008, : 4024 - +
  • [26] Scaling Vision with Sparse Mixture of Experts
    Riquelme, Carlos
    Puigcerver, Joan
    Mustafa, Basil
    Neumann, Maxim
    Jenatton, Rodolphe
    Pinto, Andre Susano
    Keysers, Daniel
    Houlsby, Neil
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [27] Localised mixtures of experts for mixture of regressions
    Bouchard, G
    BETWEEN DATA SCIENCE AND APPLIED DATA ANALYSIS, 2003, : 155 - 164
  • [28] A mixture of experts model exhibiting prosopagnosia
    Dailey, MN
    Cottrell, GW
    Padgett, C
    PROCEEDINGS OF THE NINETEENTH ANNUAL CONFERENCE OF THE COGNITIVE SCIENCE SOCIETY, 1997, : 155 - 160
  • [29] Modified Mixture of Experts for Diabetes Diagnosis
    Elif Derya Übeyli
    Journal of Medical Systems, 2009, 33 : 299 - 305
  • [30] Modified Mixture of Experts for Diabetes Diagnosis
    Ubeyli, Elif Derya
    JOURNAL OF MEDICAL SYSTEMS, 2009, 33 (04) : 299 - 305