A Constrained l1 Minimization Approach to Sparse Precision Matrix Estimation

Cited by: 630
Authors: Cai, Tony [1]; Liu, Weidong [1]; Luo, Xi [1]
Affiliation: [1] Univ Penn, Wharton Sch, Dept Stat, Philadelphia, PA 19104 USA
Funding: National Science Foundation (USA)
Keywords: Covariance matrix; Frobenius norm; Gaussian graphical model; Precision matrix; Rate of convergence; Spectral norm; variable selection; covariance; convergence; likelihood; recovery; rates; model
DOI: 10.1198/jasa.2011.tm10155
CLC classification: O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject classification codes: 020208; 070103; 0714
Abstract: This article proposes a constrained l1 minimization method for estimating a sparse inverse covariance matrix based on a sample of n iid p-variate random variables. The resulting estimator is shown to have a number of desirable properties. In particular, the rate of convergence between the estimator and the true s-sparse precision matrix under the spectral norm is s*sqrt(log p / n) when the population distribution has either exponential-type or polynomial-type tails. We also present convergence rates under the elementwise l-infinity norm and the Frobenius norm. In addition, we consider graphical model selection. The procedure is easily implemented by linear programming. Numerical performance of the estimator is investigated using both simulated and real data. In particular, the procedure is applied to analyze a breast cancer dataset and is found to perform favorably compared with existing methods.
Pages: 594-607
Page count: 14
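The abstract notes that the estimator reduces to a linear program solved column by column: for each column j of the precision matrix, one minimizes ||b||_1 subject to ||S b - e_j||_inf <= lambda, where S is the sample covariance matrix. A minimal sketch of that idea is below; the function name `clime`, its signature, and the use of `scipy.optimize.linprog` are our illustration under the standard LP reformulation (split b = u - v with u, v >= 0), not the authors' code.

```python
import numpy as np
from scipy.optimize import linprog

def clime(S, lam):
    """Sketch of constrained l1 precision-matrix estimation.

    For each column j, solve  min ||b||_1  s.t.  ||S b - e_j||_inf <= lam
    as a linear program by splitting b = u - v with u, v >= 0.
    """
    p = S.shape[0]
    Omega = np.zeros((p, p))
    c = np.ones(2 * p)                    # objective: sum(u) + sum(v) = ||b||_1
    A = np.block([[S, -S], [-S, S]])      # encodes both sides of |S b - e_j| <= lam
    for j in range(p):
        e = np.zeros(p)
        e[j] = 1.0
        b_ub = np.concatenate([lam + e, lam - e])
        res = linprog(c, A_ub=A, b_ub=b_ub, bounds=(0, None), method="highs")
        u, v = res.x[:p], res.x[p:]
        Omega[:, j] = u - v
    # Symmetrize by keeping the entry of smaller magnitude in each (i, j) pair,
    # the rule described in the paper.
    keep = np.abs(Omega) <= np.abs(Omega.T)
    return np.where(keep, Omega, Omega.T)
```

As a sanity check, with S equal to the identity and lam = 0.1, each column's LP shrinks the diagonal toward zero and the estimate is 0.9 times the identity; in practice S would be the sample covariance of the data and lam a tuning parameter chosen, e.g., by cross-validation.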