Mixture of experts models for multilevel data: Modeling framework and approximation theory

Cited by: 0
Authors
Fung, Tsz Chai [1 ]
Tseung, Spark C. [2 ]
Affiliations
[1] Georgia State Univ, Maurice R Greenberg Sch Risk Sci, 35 Broad St NW, Atlanta, GA 30303 USA
[2] Univ Toronto, Dept Stat Sci, Ontario Power Bldg, 700 Univ Ave, 9th Floor, Toronto, ON M5G 1Z5, Canada
Keywords
Artificial neural network; Crossed and nested random effects; Denseness; Mixed effects models; Universal approximation theorem; MIXED-EFFECTS MODEL; OF-EXPERTS; HIERARCHICAL MIXTURES; REGRESSION-MODELS;
DOI
10.1016/j.neucom.2025.129357
CLC classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multilevel data are prevalent in many real-world applications. However, it remains an open research problem to identify and justify a class of models that flexibly captures a wide range of multilevel data. Motivated by the versatility of mixture of experts (MoE) models in fitting regression data, in this article we extend the MoE and study a class of mixed MoE (MMoE) models for multilevel data. Under some regularity conditions, we prove that the MMoE is dense, in the sense of weak convergence, in the space of continuous mixed effects models. As a result, the MMoE has the potential to accurately resemble almost all characteristics inherent in multilevel data, including the marginal distributions, dependence structures, regression links, random intercepts and random slopes. In the particular case where the multilevel data are hierarchical, we further show that a nested version of the MMoE universally approximates a broad range of dependence structures of the random effects among different factor levels.
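The mixed MoE idea described in the abstract can be illustrated with a minimal sketch: a mixture density whose gating network receives a group-level random intercept, so that observations from different groups share experts but mix them with group-specific weights. This is only a toy construction assuming Gaussian experts and a logit-linear gate; all names, shapes, and parameter values below are illustrative and not taken from the paper.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with max-subtraction for numerical stability."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mmoe_density(y, x, group, gate_w, gate_b, u, mu, sigma):
    """Toy mixed mixture-of-experts density p(y | x, group).

    Gating: softmax over K logits  x * gate_w + gate_b + u[group],
    where u[group] is a group-level random intercept in the gate.
    Experts: K Gaussian components with means mu and common sd sigma.
    (Illustrative structure only, not the paper's exact specification.)
    """
    logits = np.outer(x, gate_w) + gate_b + u[group]          # (n, K)
    pi = softmax(logits)                                      # mixing weights
    comp = (np.exp(-0.5 * ((y[:, None] - mu) / sigma) ** 2)
            / (sigma * np.sqrt(2.0 * np.pi)))                 # (n, K) densities
    return (pi * comp).sum(axis=1)

# Two experts, two groups: the random intercept shifts the gate per group,
# so the same covariate x yields different conditional densities by group.
gate_w, gate_b = np.array([1.0, -1.0]), np.zeros(2)
u = np.array([[0.5, -0.5],
              [-0.5, 0.5]])           # one gating intercept vector per group
mu, sigma = np.array([-2.0, 2.0]), 1.0

ys = np.linspace(-8.0, 8.0, 2001)
x0 = np.zeros_like(ys)
d_g0 = mmoe_density(ys, x0, np.zeros(len(ys), dtype=int),
                    gate_w, gate_b, u, mu, sigma)
d_g1 = mmoe_density(ys, x0, np.ones(len(ys), dtype=int),
                    gate_w, gate_b, u, mu, sigma)
```

Each row of `u` acts like a random effect realization: it changes only the mixing weights, which is one simple way a mixture of experts can encode group-level heterogeneity while the expert components stay shared.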
Pages: 14