Quantifying the information lost in optimal covariance matrix cleaning

Cited by: 0
Authors
Bongiorno, Christian [1 ]
Lamrani, Lamia [1 ]
Affiliations
[1] Univ Paris Saclay, Lab Math & Informat Complex & Syst, 9 Rue Joliot Curie, F-91192 Gif Sur Yvette, France
Keywords
Random matrix theory; Covariance matrix estimation; Genetic regressor programming; High-dimension statistics; Information theory; DIVERGENCE;
DOI
10.1016/j.physa.2024.130225
Chinese Library Classification
O4 [Physics];
Subject Classification Code
0702;
Abstract
Obtaining an accurate estimate of the underlying covariance matrix from finite sample data is challenging due to noise induced by the finite sample size. In recent years, sophisticated covariance-cleaning techniques based on random matrix theory have been proposed to address this issue. Most of these methods aim to achieve an optimal covariance matrix estimator by minimizing the Frobenius norm distance between the true covariance matrix and the estimator. However, this practice offers limited interpretability in terms of information theory. To better understand this relationship, we focus on the Kullback-Leibler divergence to quantify the information lost by the estimator. Our analysis centers on rotationally invariant estimators, which are state-of-the-art in random matrix theory, and we derive an analytical expression for their Kullback-Leibler divergence. Due to the intricate nature of the calculations, we use genetic programming regressors paired with human intuition. Ultimately, using this approach, we formulate a conjecture, validated through extensive simulations, that the Frobenius distance corresponds to a first-order expansion term of the Kullback-Leibler divergence, thereby establishing a clearer link between the two measures.
Pages: 9
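
To make the comparison in the abstract concrete, the sketch below contrasts the two discrepancy measures for zero-mean Gaussian models: the Frobenius distance and the Kullback-Leibler divergence, whose closed form is D(N(0, Sigma_true) || N(0, Xi)) = 0.5 * [tr(Xi^{-1} Sigma_true) - N + ln(det Xi / det Sigma_true)]. The dimensions, the spectrum, and the simple linear-shrinkage "cleaning" step are illustrative assumptions standing in for the rotationally invariant estimators studied in the paper; this is a minimal sketch, not the authors' implementation.

```python
import numpy as np

def gaussian_kl(sigma_true, sigma_est):
    """KL divergence D(N(0, sigma_true) || N(0, sigma_est)) between zero-mean Gaussians.

    Closed form: 0.5 * [tr(sigma_est^{-1} sigma_true) - N
                        + log det(sigma_est) - log det(sigma_true)].
    """
    n = sigma_true.shape[0]
    inv_est = np.linalg.inv(sigma_est)
    _, logdet_true = np.linalg.slogdet(sigma_true)
    _, logdet_est = np.linalg.slogdet(sigma_est)
    return 0.5 * (np.trace(inv_est @ sigma_true) - n + logdet_est - logdet_true)

def frobenius_distance(a, b):
    """Frobenius norm of the difference between two matrices."""
    return np.linalg.norm(a - b, ord="fro")

rng = np.random.default_rng(0)
n, t = 50, 200                           # illustrative dimension and sample size
spectrum = np.linspace(0.5, 3.0, n)      # hypothetical "true" eigenvalue spectrum
q, _ = np.linalg.qr(rng.standard_normal((n, n)))
sigma_true = q @ np.diag(spectrum) @ q.T

# Sample covariance from t draws of the true Gaussian model.
x = rng.multivariate_normal(np.zeros(n), sigma_true, size=t)
sigma_sample = x.T @ x / t

# Toy linear shrinkage toward the scaled identity -- a stand-in cleaning step,
# not the rotationally invariant estimator analysed in the paper.
alpha = 0.3
sigma_clean = (1 - alpha) * sigma_sample + alpha * (np.trace(sigma_sample) / n) * np.eye(n)

for name, est in (("sample", sigma_sample), ("shrunk", sigma_clean)):
    print(f"{name:>6}: Frobenius = {frobenius_distance(sigma_true, est):6.3f}  "
          f"KL = {gaussian_kl(sigma_true, est):6.3f}")
```

Running the sketch shows how both measures respond to the same cleaning step on the same data, which is the kind of side-by-side behaviour the paper's conjecture (Frobenius distance as a first-order term of the Kullback-Leibler divergence) aims to explain.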