Mixtures-of-experts of autoregressive time series: Asymptotic normality and model specification

Cited: 17
Authors
Carvalho, AX [1 ]
Tanner, MA
Affiliations
[1] Univ British Columbia, Vancouver, BC V6T 1Z4, Canada
[2] Inst Appl Econ Res, BR-70076900 Brasilia, DF, Brazil
[3] Northwestern Univ, Evanston, IL 60201 USA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2005, Vol. 16, No. 1
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
asymptotic properties; maximum likelihood estimation; mixture-of-experts (ME); nonlinear time series;
DOI
10.1109/TNN.2004.839356
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider a class of nonlinear models based on mixtures of local autoregressive time series. At any given time point, we have a certain number of linear models, denoted as experts, where the vector of covariates may include lags of the dependent variable. Additionally, we assume the existence of a latent multinomial variable, whose distribution depends on the same covariates as the experts and which determines which linear process is observed. This structure, denoted as mixture-of-experts (ME), is considerably flexible in modeling the conditional mean function, as shown by Jiang and Tanner. In this paper, we present a formal treatment of conditions that guarantee the asymptotic normality of the maximum likelihood estimator (MLE), under both stationarity and nonstationarity, and under both correct model specification and model misspecification. The performance of common model selection criteria in selecting the number of experts is explored via Monte Carlo simulations. Finally, we present applications to simulated and real data sets to illustrate the ability of the proposed structure to model not only the conditional mean, but also the whole conditional density.
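As a reading aid, here is a minimal sketch of the ME conditional density in the style of Jiang and Tanner, assuming Gaussian autoregressive experts and multinomial-logit gating; the symbols K, g_j, beta_j, sigma_j, and gamma_j are illustrative notation, not taken from the paper itself:

f(y_t \mid x_t; \theta) = \sum_{j=1}^{K} g_j(x_t; \gamma)\, \phi\big(y_t;\ x_t^\top \beta_j,\ \sigma_j^2\big),
\qquad
g_j(x_t; \gamma) = \frac{\exp(x_t^\top \gamma_j)}{\sum_{l=1}^{K} \exp(x_t^\top \gamma_l)},

where x_t collects an intercept and lags of y_t, and \phi(\cdot\,; \mu, \sigma^2) denotes the Gaussian density. Under these assumptions, the MLE \hat{\theta}_n maximizes \sum_t \log f(y_t \mid x_t; \theta), and the paper's asymptotic-normality results concern estimators of this type under correct specification and misspecification.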
Pages: 39-56
Page count: 18