Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts

Cited by: 0
Authors
Nguyen, Huy [1 ]
Nguyen, TrungTin [2 ,3 ]
Nguyen, Khai [1 ]
Ho, Nhat [1 ]
Affiliations
[1] Univ Texas Austin, Dept Stat & Data Sci, Austin, TX 78712 USA
[2] Univ Queensland, Sch Math & Phys, Brisbane, Qld, Australia
[3] Univ Grenoble Alpes, Inria, LJK, Grenoble INP, CNRS, F-38000 Grenoble, France
Keywords
MAXIMUM-LIKELIHOOD; OF-EXPERTS; HIERARCHICAL MIXTURES; FEATURE-SELECTION; IDENTIFIABILITY; REGRESSION; MODELS
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Originally introduced as a neural network for ensemble learning, the mixture of experts (MoE) model has recently become a fundamental building block of highly successful modern deep neural networks for heterogeneous data analysis in several applications of machine learning and statistics. Despite its popularity in practice, theoretical understanding of the MoE model remains far from complete. To shed new light on this problem, we provide a convergence analysis for maximum likelihood estimation (MLE) in the Gaussian-gated MoE model. The main challenge of this analysis comes from the inclusion of covariates in both the Gaussian gating functions and the expert networks, which leads to an intrinsic interaction between their parameters via a system of partial differential equations. We tackle these issues by designing novel Voronoi loss functions over the parameter space that accurately capture the heterogeneity of parameter estimation rates. Our findings reveal that the MLE exhibits distinct behaviors under two complementary settings for the location parameters of the Gaussian gating functions: when all of these parameters are non-zero versus when at least one of them vanishes. Notably, these behaviors can be characterized by the solvability of two different systems of polynomial equations. Finally, we conduct a simulation study to empirically verify our theoretical results.
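For orientation, here is a minimal LaTeX sketch of the two objects the abstract refers to, assuming a standard Gaussian-gated MoE formulation from the literature. The notation (the weights pi_i, gate parameters a_i and Sigma_i, expert parameters beta_i, b_i, sigma_i, and the exponents r_j) is illustrative and may differ from the paper's exact definitions.

% Gaussian-gated mixture of experts (illustrative formulation): the gate
% weights are ratios of Gaussian densities of the covariate x, and each
% expert is a Gaussian linear regression of y on x.
\[
  p(y \mid x) \;=\; \sum_{i=1}^{k}
    \frac{\pi_i \, \mathcal{N}(x \mid a_i, \Sigma_i)}
         {\sum_{j=1}^{k} \pi_j \, \mathcal{N}(x \mid a_j, \Sigma_j)}
    \;\mathcal{N}\!\left(y \mid \beta_i^{\top} x + b_i,\; \sigma_i^{2}\right).
\]
% A schematic Voronoi-type loss between a fitted mixing measure G (atoms
% \omega_i = (a_i, \Sigma_i, \beta_i, b_i, \sigma_i) with weights \pi_i) and
% the true measure G_*: fitted atoms are partitioned into the Voronoi cells
% \mathcal{A}_j of the true atoms \omega_j^*, and within-cell discrepancies
% are aggregated with cell-dependent exponents r_j that encode the
% heterogeneous estimation rates.
\[
  \mathcal{D}(G, G_*) \;=\;
    \sum_{j=1}^{k_*} \Bigl|\, \sum_{i \in \mathcal{A}_j} \pi_i - \pi_j^* \Bigr|
    \;+\;
    \sum_{j=1}^{k_*} \sum_{i \in \mathcal{A}_j}
      \pi_i \,\bigl\| \omega_i - \omega_j^* \bigr\|^{\,r_j}.
\]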
Pages: 33
Related Papers (50 total)
  • [1] Regularized Estimation and Feature Selection in Mixtures of Gaussian-Gated Experts Models
    Chamroukhi, Faicel
    Lecocq, Florian
    Nguyen, Hien D.
    STATISTICS AND DATA SCIENCE, RSSDS 2019, 2019, 1150 : 42 - 56
  • [2] On Parameter Estimation in Deviated Gaussian Mixture of Experts
    Nguyen, Huy
    Nguyen, Khai
    Ho, Nhat
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [3] Convergence Rates for Gaussian Mixtures of Experts
    Ho, Nhat
    Yang, Chiao-Yu
    Jordan, Michael I.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [5] Rates of convergence for the Gaussian mixture sieve
    Genovese, CR
    Wasserman, L
    ANNALS OF STATISTICS, 2000, 28 (04) : 1105 - 1127
  • [6] Convergence of parameter estimation of a Gaussian mixture model minimizing the Gini index of dissimilarity
    Lobato, Adriana Laura Lopez
    Garrido, Martha Lorena Avendano
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2024, 53 (17) : 6030 - 6037
  • [9] Towards interpreting deep learning models for industry 4.0 with gated mixture of experts
    Chaoub, Alaaeddine
    Cerisara, Christophe
    Voisin, Alexandre
    Iung, Benoit
    2022 30TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2022), 2022 : 1412 - 1416
  • [10] Optimal rates for parameter estimation of stationary Gaussian processes
    Es-Sebaiy, Khalifa
    Viens, Frederi G.
    STOCHASTIC PROCESSES AND THEIR APPLICATIONS, 2019, 129 (09) : 3018 - 3054