On plug-in estimation of long memory models

Cited by: 4
Author
Lieberman, O. [1]
Affiliation
[1] Technion Israel Institute of Technology, Faculty of Industrial Engineering & Management, IL-32000 Haifa, Israel
DOI
10.1017/S0266466605050231
Chinese Library Classification
F [Economics]
Discipline Code
02
Abstract
We consider the Gaussian ARFIMA(j, d, l) model with spectral density $f_\theta(\lambda)$, $\theta \in \Theta \subset \mathbb{R}^p$, $\lambda \in (-\pi, \pi)$, $d \in (0, \tfrac{1}{2})$, and an unknown mean $\mu \in \mathbb{R}$. For this class of models, the $n^{-1}$-normalized information matrix of the full parameter vector $(\mu, \theta)$ is asymptotically degenerate. To estimate $\theta$, Dahlhaus (1989, Annals of Statistics 17, 1749-1766) suggested maximizing the plug-in log-likelihood $L_n(\theta, \tilde{\mu}_n)$, where $\tilde{\mu}_n$ is any $n^{(1-2d)/2}$-consistent estimator of $\mu$. The resulting estimator is a plug-in maximum likelihood estimator (PMLE). This estimator is asymptotically normal, efficient, and consistent, but in finite samples it has some serious drawbacks. Primarily, none of the Bartlett identities associated with $L_n(\theta, \tilde{\mu}_n)$ is satisfied for fixed $n$. Cheung and Diebold (1994, Journal of Econometrics 62, 301-316) conducted a Monte Carlo simulation study and reported that the bias of the PMLE is about three to four times the bias of the regular maximum likelihood estimator (MLE). In this paper, we derive asymptotic expansions for the PMLE and show that its second-order bias is contaminated by an additional term, which does not exist in regular cases. This term arises from the failure of the first Bartlett identity to hold and seems to explain Cheung and Diebold's simulated results. We derive similar expansions for the Whittle MLE, which is another estimator tacitly using the plug-in principle. An application to the ARFIMA(0, d, 0) model shows that the additional bias terms are considerable.
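The Whittle MLE's tacit use of the plug-in principle can be illustrated numerically for the ARFIMA(0, d, 0) case discussed in the abstract. The sketch below is illustrative and not taken from the paper: it simulates a Gaussian ARFIMA(0, d, 0) series exactly via the Durbin-Levinson recursion (using the known autocovariance $\gamma(k) = \Gamma(1-2d)\,\Gamma(k+d) / [\Gamma(d)\,\Gamma(1-d)\,\Gamma(k+1-d)]$ for unit innovation variance) and estimates $d$ by minimizing the concentrated Whittle criterion over nonzero Fourier frequencies. Because the periodogram at nonzero Fourier frequencies is invariant to adding a constant, the unknown mean $\mu$ is implicitly removed from the criterion — this is the tacit plug-in. All function names are my own.

```python
import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize_scalar

def arfima0d0_acov(d, nlags):
    """Autocovariances gamma(0..nlags-1) of ARFIMA(0,d,0), unit innovation variance:
    gamma(k) = Gamma(1-2d) Gamma(k+d) / (Gamma(d) Gamma(1-d) Gamma(k+1-d))."""
    k = np.arange(nlags)
    return np.exp(gammaln(1 - 2 * d) + gammaln(k + d)
                  - gammaln(d) - gammaln(1 - d) - gammaln(k + 1 - d))

def simulate_gaussian(acov, rng):
    """Exact draw from a zero-mean Gaussian process with the given autocovariances,
    via the Durbin-Levinson recursion (one-step prediction coefficients phi)."""
    n = len(acov)
    z = rng.standard_normal(n)
    x = np.empty(n)
    phi = np.zeros(n)
    v = acov[0]                       # one-step prediction error variance
    x[0] = np.sqrt(v) * z[0]
    for t in range(1, n):
        k = (acov[t] - phi[:t - 1] @ acov[t - 1:0:-1]) / v   # reflection coefficient
        phi[:t - 1] = phi[:t - 1] - k * phi[:t - 1][::-1]
        phi[t - 1] = k
        v *= 1.0 - k ** 2
        x[t] = phi[:t] @ x[t - 1::-1] + np.sqrt(v) * z[t]
    return x

def whittle_d(x):
    """Concentrated Whittle estimate of d for ARFIMA(0,d,0). Only nonzero Fourier
    frequencies enter, so the periodogram (hence the criterion) does not depend
    on the mean of x: the sample mean is plugged in tacitly."""
    n = len(x)
    m = (n - 1) // 2
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)  # periodogram
    def q(d):
        g = (2 * np.sin(lam / 2)) ** (-2 * d)  # spectral shape, up to sigma^2/(2 pi)
        return np.log(np.mean(I / g)) + np.mean(np.log(g))
    return minimize_scalar(q, bounds=(-0.49, 0.49), method="bounded").x

rng = np.random.default_rng(0)
x = 5.0 + simulate_gaussian(arfima0d0_acov(0.3, 512), rng)  # mean 5, d = 0.3
print(whittle_d(x))
```

Estimating $d$ from the shifted series and from the demeaned series gives the same value, which is exactly why the mean never appears in the Whittle criterion; the finite-sample bias of this estimator is what the paper's expansions quantify.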
Pages: 431-454 (24 pages)