On consistency and sparsity for high-dimensional functional time series with application to autoregressions

Cited by: 6
Authors
Guo, Shaojun [1 ]
Qiao, Xinghao [2 ]
Affiliations
[1] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
[2] London Sch Econ, Dept Stat, London WC2A 2AE, England
Funding
National Natural Science Foundation of China;
Keywords
Functional principal component analysis; functional stability measure; high-dimensional functional time series; non-asymptotics; sparsity; vector functional autoregression; REGRESSION; INEQUALITIES; GUARANTEES; LASSO; NOISY; MODEL;
DOI
10.3150/22-BEJ1464
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
Modelling a large collection of functional time series arises in a broad spectrum of real applications. In such a scenario, not only can the number of functional variables diverge with, or even exceed, the number of temporally dependent functional observations, but each function is itself an infinite-dimensional object, which makes estimation a challenging task. In this paper, we propose a three-step procedure to estimate high-dimensional functional time series models. To provide theoretical guarantees for the three-step procedure, we focus on multivariate stationary processes and propose a novel functional stability measure based on their spectral properties. This stability measure facilitates the development of useful concentration bounds on sample (auto)covariance functions, which serve as a fundamental tool for further convergence analysis in high-dimensional settings. As functional principal component analysis (FPCA) is one of the key dimension reduction techniques in the first step, we also investigate the non-asymptotic properties of the relevant estimated terms under an FPCA framework. To illustrate with an important application, we consider vector functional autoregressive models and develop a regularization approach to estimate the autoregressive coefficient functions under a sparsity constraint. Using the derived non-asymptotic results, we investigate the convergence properties of the regularized estimate under high-dimensional scaling. Finally, the finite-sample performance of the proposed method is examined through both simulations and a public financial dataset.
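The abstract outlines a pipeline of FPCA-based dimension reduction followed by regularized estimation of autoregressive coefficient functions under sparsity. The sketch below is a minimal, illustrative rendering of that idea, not the authors' algorithm: it discretizes curves on a common grid, estimates FPC scores from the grid-evaluated sample covariance, and fits a lasso-penalized VAR(1) to the stacked scores, using scikit-learn's entrywise Lasso in place of the block regularization developed in the paper. The function names (fpca_scores, sparse_var1_on_scores) and tuning parameters (truncation level q, penalty lam) are hypothetical.

```python
# Minimal sketch of an FPCA-then-sparse-VAR pipeline in the spirit of the
# abstract. NOT the authors' implementation: curves are discretized on a grid,
# and an entrywise lasso replaces the paper's block regularization.
import numpy as np
from sklearn.linear_model import Lasso

def fpca_scores(X, q):
    """X: (n, T) curves of one functional component on a common grid.
    Returns the leading q FPC scores (n, q) from an eigendecomposition of
    the sample covariance evaluated on the grid."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / X.shape[0]              # (T, T) sample covariance on the grid
    _, vecs = np.linalg.eigh(cov)             # eigenvalues ascending
    vecs = vecs[:, ::-1][:, :q]               # leading q eigenvectors (grid values)
    return Xc @ vecs / np.sqrt(X.shape[1])    # Riemann-sum approximation of scores

def sparse_var1_on_scores(curves, q=3, lam=0.1):
    """curves: list of p arrays, each (n, T): a p-dimensional functional time
    series of length n. Fits a lasso-penalized VAR(1) on the stacked scores."""
    S = np.hstack([fpca_scores(X, q) for X in curves])   # (n, p*q) score matrix
    Z, Y = S[:-1], S[1:]                                  # lagged design / response
    A = np.zeros((Y.shape[1], Z.shape[1]))
    for j in range(Y.shape[1]):                           # one lasso per response
        A[j] = Lasso(alpha=lam, fit_intercept=False,
                     max_iter=5000).fit(Z, Y[:, j]).coef_
    return A                                              # (p*q, p*q) transition estimate

# Toy usage: p = 5 functional series, n = 200 observations, grid of 50 points.
rng = np.random.default_rng(0)
curves = [np.cumsum(rng.standard_normal((200, 50)), axis=1) for _ in range(5)]
A_hat = sparse_var1_on_scores(curves, q=3, lam=0.05)
print(A_hat.shape, np.mean(A_hat != 0))
```

In practice the truncation level per component and the penalty would be chosen data-adaptively, and a group penalty over the q-by-q score blocks would better reflect the functional sparsity structure described in the abstract.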
Pages: 451 - 472
Number of pages: 22
Related Papers
50 records in total
  • [31] Factor Models for High-Dimensional Tensor Time Series
    Chen, Rong
    Yang, Dan
    Zhang, Cun-Hui
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2022, 117 (537) : 94 - 116
  • [32] High-dimensional functional time series forecasting: An application to age-specific mortality rates
    Gao, Yuan
    Shang, Han Lin
    Yang, Yanrong
    JOURNAL OF MULTIVARIATE ANALYSIS, 2019, 170 : 232 - 243
  • [33] High-dimensional Contextual Bandit Problem without Sparsity
    Komiyama, Junpei
    Imaizumi, Masaaki
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [34] Asymptotic Confidence Regions for High-Dimensional Structured Sparsity
    Stucky, Benjamin
    van de Geer, Sara
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2018, 66 (08) : 2178 - 2190
  • [35] High-dimensional integrative analysis with homogeneity and sparsity recovery
    Yang, Xinfeng
    Yan, Xiaodong
    Huang, Jian
    JOURNAL OF MULTIVARIATE ANALYSIS, 2019, 174
  • [36] COVARIANCE AND PRECISION MATRIX ESTIMATION FOR HIGH-DIMENSIONAL TIME SERIES
    Chen, Xiaohui
    Xu, Mengyu
    Wu, Wei Biao
    ANNALS OF STATISTICS, 2013, 41 (06): : 2994 - 3021
  • [37] Estimation of Constrained Factor Models for High-Dimensional Time Series
    Liu, Yitian
    Pan, Jiazhu
    Xia, Qiang
    JOURNAL OF FORECASTING, 2025,
  • [38] High-dimensional lag structure optimization of fuzzy time series
    Gao, Ruobin
    Duru, Okan
    Yuen, Kum Fai
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 173
  • [39] Empirical Dynamic Quantiles for Visualization of High-Dimensional Time Series
    Peña, Daniel
    Tsay, Ruey S.
    Zamar, Ruben
    TECHNOMETRICS, 2019, 61 (04) : 429 - 444
  • [40] The Art of Sparsity: Mastering High-Dimensional Tensor Storage
    Dong, Bin
    Wu, Kesheng
    Byna, Suren
    2024 IEEE INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM WORKSHOPS, IPDPSW 2024, 2024, : 439 - 446