Mixture of experts models for multilevel data: Modeling framework and approximation theory

Cited: 0
Authors
Fung, Tsz Chai [1 ]
Tseung, Spark C. [2 ]
Affiliations
[1] Georgia State Univ, Maurice R Greenberg Sch Risk Sci, 35 Broad St NW, Atlanta, GA 30303 USA
[2] Univ Toronto, Dept Stat Sci, Ontario Power Bldg, 700 Univ Ave, 9th Floor, Toronto, ON M5G 1Z5, Canada
Keywords
Artificial neural network; Crossed and nested random effects; Denseness; Mixed effects models; Universal approximation theorem; MIXED-EFFECTS MODEL; OF-EXPERTS; HIERARCHICAL MIXTURES; REGRESSION-MODELS;
DOI
10.1016/j.neucom.2025.129357
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multilevel data are prevalent in many real-world applications. However, it remains an open research problem to identify and justify a class of models that flexibly captures a wide range of multilevel data. Motivated by the versatility of mixture of experts (MoE) models in fitting regression data, in this article we extend the MoE and study a class of mixed MoE (MMoE) models for multilevel data. Under some regularity conditions, we prove that the MMoE is dense in the space of any continuous mixed effects models in the sense of weak convergence. As a result, the MMoE has the potential to accurately resemble almost all characteristics inherent in multilevel data, including the marginal distributions, dependence structures, regression links, random intercepts and random slopes. In the particular case where the multilevel data are hierarchical, we further show that a nested version of the MMoE universally approximates a broad range of dependence structures of the random effects among different factor levels.
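The abstract above does not spell out the MMoE's functional form, so the following is only a minimal numerical sketch of the general idea: a mixture of experts whose gating weights are shifted by a group-level random effect, so that observations sharing a group also share a distortion of the mixing probabilities. The linear-softmax gating, linear Gaussian experts, and the way the random intercept enters the gating logits are all illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # numerically stable softmax
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def mmoe_density(y, x, u_group, gate_w, gate_b, means_w, means_b, sigmas):
    """Density of y given covariate x under a K-expert Gaussian mixture
    whose gating logits are shifted by a group-level random effect u_group."""
    logits = gate_w * x + gate_b + u_group        # (K,) gating scores
    probs = softmax(logits)                       # mixing weights, sum to 1
    mu = means_w * x + means_b                    # expert means (linear experts)
    comp = np.exp(-0.5 * ((y - mu) / sigmas) ** 2) / (sigmas * np.sqrt(2 * np.pi))
    return float(probs @ comp)                    # mixture density at y

# Two experts; one draw u of the group-level effect shifts the gating
# for every observation in that group, inducing within-group dependence.
K = 2
gate_w = np.array([1.0, -1.0]); gate_b = np.zeros(K)
means_w = np.array([2.0, -2.0]); means_b = np.array([0.0, 1.0])
sigmas = np.array([1.0, 0.5])
u = rng.normal(scale=0.7, size=K)

d = mmoe_density(y=0.3, x=1.2, u_group=u, gate_w=gate_w, gate_b=gate_b,
                 means_w=means_w, means_b=means_b, sigmas=sigmas)
print(d)  # a positive density value
```

Conditioning on `u` gives an ordinary MoE for that group; integrating `u` out yields the marginal multilevel model whose approximation properties the paper studies.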
Pages: 14
Related papers
(50 records in total)
  • [41] A mixture-of-experts framework for adaptive Kalman filtering
    Chaer, WS
    Bishop, RH
    Ghosh, J
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 1997, 27 (03): 452-464
  • [42] A Framework of R-Squared Measures for Single-Level and Multilevel Regression Mixture Models
    Rights, Jason D.
    Sterba, Sonya K.
    PSYCHOLOGICAL METHODS, 2018, 23 (03): 434-457
  • [43] Data Fit Comparison of Mixture Item Response Theory Models and Traditional Models
    Yalcin, Seher
    INTERNATIONAL JOURNAL OF ASSESSMENT TOOLS IN EDUCATION, 2018, 5 (02): 301-313
  • [44] Classifying Incomplete Data with a Mixture of Subspace Experts
    Kizaric, Ben A.
    Pimentel-Alarcon, Daniel L.
    2022 58TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2022
  • [45] Mixture of Experts with Entropic Regularization for Data Classification
    Peralta, Billy
    Saavedra, Ariel
    Caro, Luis
    Soto, Alvaro
    ENTROPY, 2019, 21 (02)
  • [46] APPROXIMATION THEORY ON DIELECTRIC PERMITTIVITY OF MIXTURE
    PALETTO, J
    GOUTTE, R
    EYRAUD, L
    JOURNAL OF SOLID STATE CHEMISTRY, 1973, 6 (01): 58-66
  • [47] Mixture Multilevel Vector-Autoregressive Modeling
    Ernst, Anja F.
    Timmerman, Marieke E.
    Ji, Feng
    Jeronimus, Bertus F.
    Albers, Casper J.
    PSYCHOLOGICAL METHODS, 2024, 29 (01): 137-154
  • [48] Mixture cure models with time-varying and multilevel frailties for recurrent event data
    Tawiah, Richard
    McLachlan, Geoffrey
    Ng, Shu Kay
    STATISTICAL METHODS IN MEDICAL RESEARCH, 2020, 29 (05): 1368-1385
  • [49] Model Selection for Multilevel Mixture Rasch Models
    Sen, Sedat
    Cohen, Allan S.
    Kim, Seock-Ho
    APPLIED PSYCHOLOGICAL MEASUREMENT, 2019, 43 (04): 272-289
  • [50] Mixture models for clustering multilevel growth trajectories
    Ng, S. K.
    McLachlan, G. J.
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2014, 71: 43-51