Convergence rate of Krasulina estimator

Cited: 0
Author
Chen, Jiangning [1]
Affiliation
[1] Georgia Inst Technol, Sch Math, Atlanta, GA 30313 USA
Keywords
PCA; Incremental; Online updating; Covariance matrix; Rate of convergence; Adaptive estimation; SPECTRAL PROJECTORS; PCA; APPROXIMATION;
DOI
10.1016/j.spl.2019.108562
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
Principal component analysis (PCA) is one of the most commonly used statistical procedures, with a wide range of applications. Consider points X_1, X_2, ..., X_n drawn i.i.d. from a distribution with mean zero and unknown covariance Sigma. Let A_n = X_n X_n^T; then E[A_n] = Sigma. This paper considers the problem of finding the smallest eigenvalue and the corresponding eigenvector of the matrix Sigma. A classical estimator of this type is due to Krasulina (1969). We state Krasulina's convergence proof for the smallest eigenvalue and its corresponding eigenvector, and then derive their rate of convergence. (C) 2019 Elsevier B.V. All rights reserved.
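The Krasulina-type iteration described in the abstract can be sketched as a stochastic descent on the Rayleigh quotient, processing one sample X_n at a time via A_n = X_n X_n^T. The sketch below is an illustration under stated assumptions, not the paper's exact scheme: the function name `krasulina_smallest` and the step-size schedule 1/(n+10) are hypothetical choices.

```python
import numpy as np

def krasulina_smallest(X, step=lambda n: 1.0 / (n + 10), seed=0):
    """Krasulina-style stochastic iteration toward the eigenvector
    of Sigma with the SMALLEST eigenvalue, given rows X[n] drawn
    i.i.d. with mean zero and covariance Sigma.

    Each step uses A_n = X_n X_n^T implicitly and moves against the
    Rayleigh-quotient ascent direction (descent), so the iterate
    drifts toward the bottom eigenvector.  Step sizes and the
    renormalization to the unit sphere are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for n, x in enumerate(X):
        Aw = x * (x @ w)                       # A_n w without forming A_n
        rayleigh = w @ Aw                      # w^T A_n w (w has unit norm)
        w = w - step(n) * (Aw - rayleigh * w)  # descent step on Rayleigh quotient
        w /= np.linalg.norm(w)                 # project back to the unit sphere
    return w
```

For a quick sanity check, samples with diagonal covariance diag(5, 3, 0.1) should drive the iterate toward the third coordinate axis, the eigenvector of the smallest eigenvalue.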
Pages: 11