Mixture of experts models for multilevel data: Modeling framework and approximation theory

Cited: 0
Authors
Fung, Tsz Chai [1 ]
Tseung, Spark C. [2 ]
Affiliations
[1] Georgia State Univ, Maurice R Greenberg Sch Risk Sci, 35 Broad St NW, Atlanta, GA 30303 USA
[2] Univ Toronto, Dept Stat Sci, Ontario Power Bldg, 700 Univ Ave, 9th Floor, Toronto, ON M5G 1Z5, Canada
Keywords
Artificial neural network; Crossed and nested random effects; Denseness; Mixed effects models; Universal approximation theorem; MIXED-EFFECTS MODEL; OF-EXPERTS; HIERARCHICAL MIXTURES; REGRESSION-MODELS;
DOI
10.1016/j.neucom.2025.129357
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multilevel data are prevalent in many real-world applications. However, it remains an open research problem to identify and justify a class of models that flexibly captures a wide range of multilevel data. Motivated by the versatility of mixture of experts (MoE) models in fitting regression data, in this article we extend the MoE and study a class of mixed MoE (MMoE) models for multilevel data. Under some regularity conditions, we prove that the MMoE is dense in the space of any continuous mixed effects models in the sense of weak convergence. As a result, the MMoE has the potential to accurately resemble almost all characteristics inherent in multilevel data, including the marginal distributions, dependence structures, regression links, random intercepts and random slopes. In the particular case where the multilevel data are hierarchical, we further show that a nested version of the MMoE universally approximates a broad range of dependence structures of the random effects among different factor levels.
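To make the MoE structure underlying the abstract concrete, the following is a minimal illustrative sketch (not the paper's MMoE, and with arbitrary made-up parameter values) of a Gaussian mixture-of-experts density: a softmax gating network produces covariate-dependent mixing weights, and each expert contributes a covariate-dependent Gaussian component.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_density(y, x, gate_w, mean_w, sigmas):
    """Density of a K-expert Gaussian MoE at response y given scalar covariate x.

    gate_w : (K, 2) gating coefficients (intercept, slope) -> softmax weights
    mean_w : (K, 2) expert mean coefficients (intercept, slope)
    sigmas : (K,)   expert standard deviations
    All parameter names and values here are illustrative, not from the paper.
    """
    feats = np.array([1.0, x])            # design vector (intercept, covariate)
    pi = softmax(gate_w @ feats)          # mixing weights pi_k(x)
    mu = mean_w @ feats                   # expert means mu_k(x)
    comp = np.exp(-0.5 * ((y - mu) / sigmas) ** 2) / (sigmas * np.sqrt(2 * np.pi))
    return float(pi @ comp)               # gate-weighted sum of expert densities

# Two-expert example with hand-picked coefficients.
gate_w = np.array([[0.0, 1.0], [0.0, -1.0]])
mean_w = np.array([[1.0, 0.5], [-1.0, 0.0]])
sigmas = np.array([0.5, 1.0])
density_at_0 = moe_density(0.0, 0.3, gate_w, mean_w, sigmas)
```

The paper's MMoE extends this construction by letting the gating (and/or expert) parameters depend on random effects shared within clusters, which is what induces the within-cluster dependence that the denseness result covers.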
Pages: 14