Limitations on low rank approximations for covariance matrices of spatial data

Cited by: 122
Authors
Stein, Michael L. [1 ]
Affiliation
[1] Univ Chicago, Dept Stat, Chicago, IL 60637 USA
Keywords
Fixed-domain asymptotics; Gaussian processes; Kullback-Leibler divergence; Random effects; Subset of regressors; Total column ozone; PARAMETER-ESTIMATION; STATIONARY PROCESS; LIKELIHOOD;
DOI
10.1016/j.spasta.2013.06.003
CLC classification
P [Astronomy, Earth Sciences]
Subject classification
07
Abstract
Evaluating the likelihood function for Gaussian models when a spatial process is observed irregularly is problematic for larger datasets due to memory and computation constraints. If the covariance structure can be approximated by a diagonal matrix plus a low rank matrix, then both the memory and the calculations needed to evaluate the likelihood function are greatly reduced. When neighboring observations are strongly correlated, much of the variation in the observations can be captured by low frequency components, so the low rank approach might be expected to work well in this setting. Through both theory and numerical results, with the diagonal matrix assumed to be a multiple of the identity, this paper shows that the low rank approximation sometimes performs poorly in this setting. In particular, an approximation in which observations are split into contiguous blocks and independence across blocks is assumed often provides a much better approximation to the likelihood than a low rank approximation requiring similar memory and calculations. An example with satellite-based measurements of total column ozone shows that these results are relevant to real data and that low rank models can also be highly statistically inefficient for spatial interpolation. (C) 2013 Elsevier B.V. All rights reserved.
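The computational savings described in the abstract come from the diagonal-plus-low-rank structure: if the covariance is approximated as Sigma = tau^2 I + U U^T with U an n-by-k matrix (k much smaller than n), the Gaussian log-likelihood can be evaluated in O(n k^2) time via the Woodbury identity and the matrix determinant lemma, rather than O(n^3) for a dense factorization. A minimal sketch of this standard trick (the function name, test dimensions, and rank k below are illustrative and not taken from the paper):

```python
import numpy as np

def lowrank_loglik(y, U, tau2):
    """Log-likelihood of a zero-mean Gaussian vector y with covariance
    Sigma = tau2 * I + U @ U.T, evaluated without forming Sigma.
    U is (n, k); the cost is O(n k^2) instead of O(n^3)."""
    n, k = U.shape
    # k x k capacitance matrix from the Woodbury identity: tau2*I_k + U^T U
    M = tau2 * np.eye(k) + U.T @ U
    # Quadratic form y^T Sigma^{-1} y via
    # Sigma^{-1} = (I - U M^{-1} U^T) / tau2
    Uty = U.T @ y
    quad = (y @ y - Uty @ np.linalg.solve(M, Uty)) / tau2
    # log det Sigma via the matrix determinant lemma:
    # log det(tau2*I_n + U U^T) = (n - k) log tau2 + log det(M)
    _, logdet_M = np.linalg.slogdet(M)
    logdet = (n - k) * np.log(tau2) + logdet_M
    return -0.5 * (n * np.log(2 * np.pi) + logdet + quad)
```

The paper's point is that even though such a likelihood is cheap to evaluate, the low rank covariance it implies can be a poor approximation when neighboring observations are strongly dependent.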
Pages: 1-19
Related papers (50 total)
  • [1] Ye, J. Generalized low rank approximations of matrices. Machine Learning, 2005, 61(1-3): 167-191.
  • [2] Friedland, S., Mehrmann, V., Miedlar, A., Nkengla, M. Fast low rank approximations of matrices and tensors. Electronic Journal of Linear Algebra, 2011, 22: 1031-1048.
  • [3] Shi, J., Yang, W., Zheng, X. Robust generalized low rank approximations of matrices. PLOS ONE, 2015, 10(9).
  • [4] Inoue, K., Hara, K., Urahama, K. Symmetric generalized low rank approximations of matrices. 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2012: 949-952.
  • [5] Belabbas, M.-A., Wolfe, P. J. Fast low-rank approximation for covariance matrices. 2007 2nd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, 2007: 181-184.
  • [6] Visuri, S., Oja, H., Koivunen, V. Multichannel signal processing using spatial rank covariance matrices. Proceedings of the IEEE-EURASIP Workshop on Nonlinear Signal and Image Processing (NSIP'99), 1999: 75-79.
  • [7] Liu, J., Chen, S., Zhou, Z.-H., Tan, X. Generalized low-rank approximations of matrices revisited. IEEE Transactions on Neural Networks, 2010, 21(4): 621-632.
  • [8] Visuri, S., Koivunen, V., Oja, H. Sign and rank covariance matrices. Journal of Statistical Planning and Inference, 2000, 91(2): 557-575.
  • [9] Bahmani, S., Romberg, J. Sketching for simultaneously sparse and low-rank covariance matrices. 2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), 2015: 357-360.