Mixture of experts models for multilevel data: Modeling framework and approximation theory

Cited by: 0
Authors
Fung, Tsz Chai [1 ]
Tseung, Spark C. [2 ]
Affiliations
[1] Georgia State Univ, Maurice R Greenberg Sch Risk Sci, 35 Broad St NW, Atlanta, GA 30303 USA
[2] Univ Toronto, Dept Stat Sci, Ontario Power Bldg, 700 Univ Ave, 9th Floor, Toronto, ON M5G 1Z5, Canada
Keywords
Artificial neural network; Crossed and nested random effects; Denseness; Mixed effects models; Universal approximation theorem; MIXED-EFFECTS MODEL; OF-EXPERTS; HIERARCHICAL MIXTURES; REGRESSION-MODELS
DOI
10.1016/j.neucom.2025.129357
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Multilevel data are prevalent in many real-world applications. However, it remains an open research problem to identify and justify a class of models that flexibly captures a wide range of multilevel data. Motivated by the versatility of mixture of experts (MoE) models in fitting regression data, in this article we build on the MoE and study a class of mixed MoE (MMoE) models for multilevel data. Under some regularity conditions, we prove that the MMoE is dense, in the sense of weak convergence, in the space of continuous mixed effects models. As a result, the MMoE has the potential to accurately capture almost all characteristics inherent in multilevel data, including the marginal distributions, dependence structures, regression links, random intercepts, and random slopes. In the particular case where the multilevel data are hierarchical, we further show that a nested version of the MMoE universally approximates a broad range of dependence structures of the random effects among different factor levels.
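To make the model class concrete, the display below sketches one possible form of an MMoE density; the softmax gating and all symbols (g experts, cluster-level random effect u_i) are illustrative assumptions for this record, not the paper's exact specification:

f(y_{it} \mid x_{it}, u_i) = \sum_{j=1}^{g} \pi_j(x_{it}, u_i) \, f_j(y_{it} \mid x_{it}),
\qquad
\pi_j(x_{it}, u_i) = \frac{\exp\{\alpha_j^{\top} x_{it} + \gamma_j^{\top} u_i\}}{\sum_{k=1}^{g} \exp\{\alpha_k^{\top} x_{it} + \gamma_k^{\top} u_i\}},

where y_{it} is the t-th response in cluster i, x_{it} collects fixed-effect covariates, u_i is a cluster-level random effect entering the gating (and can thereby induce random intercepts and slopes), the \pi_j are mixing weights, and the f_j are expert densities.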
Pages: 14
Related papers
50 records in total
  • [31] MULTILEVEL THEORY AND THE UNDERSPECIFICATION OF MULTILEVEL MODELS
Van Den Eeden, P.
    QUALITY & QUANTITY, 1992, 26 (03) : 307 - 322
  • [32] Saddlepoint approximation for mixture models
    Davison, A. C.
    Mastropietro, D.
    BIOMETRIKA, 2009, 96 (02) : 479 - 486
  • [33] MULTILEVEL STATISTICAL SHAPE MODELS: A NEW FRAMEWORK FOR MODELING HIERARCHICAL STRUCTURES
    Lecron, Fabian
    Boisvert, Jonathan
    Benjelloun, Mohammed
    Labelle, Hubert
    Mahmoudi, Said
    2012 9TH IEEE INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI), 2012, : 1284 - 1287
  • [34] Explainable data-driven modeling via mixture of experts: Towards effective blending of gray and black-box models
    Leoni, Jessica
    Breschi, Valentina
    Formentin, Simone
    Tanelli, Mara
    AUTOMATICA, 2025, 173
  • [35] Asymptotic properties of mixture-of-experts models
    Olteanu, M.
    Rynkiewicz, J.
    NEUROCOMPUTING, 2011, 74 (09) : 1444 - 1449
  • [36] Supervised mixture of experts models for population health
    Shou, Xiao
    Mavroudeas, Georgios
    Magdon-Ismail, Malik
    Figueroa, Jose
    Kuruzovich, Jason N.
    Bennett, Kristin P.
    METHODS, 2020, 179 : 101 - 110
  • [37] Model selection for the localized mixture of experts models
    Jiang, Yunlu
    Yu Conglian
    Ji Qinghua
    JOURNAL OF APPLIED STATISTICS, 2018, 45 (11) : 1994 - 2006
  • [38] Mixture of experts regression modeling by deterministic annealing
    Rao, AV
    Miller, D
    Rose, K
    Gersho, A
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 1997, 45 (11) : 2811 - 2820
  • [39] Dynamic Mixture of Experts Models for Online Prediction
    Munezero, Parfait
    Villani, Mattias
    Kohn, Robert
    TECHNOMETRICS, 2022, : 257 - 268
  • [40] The relationship between multilevel models and non-parametric multilevel mixture models: Discrete approximation of intraclass correlation, random coefficient distributions, and residual heteroscedasticity
    Rights, Jason D.
    Sterba, Sonya K.
BRITISH JOURNAL OF MATHEMATICAL & STATISTICAL PSYCHOLOGY, 2016, 69 (03): 316 - 343