Principal component analysis (PCA) is one of the most commonly used statistical procedures, with a wide range of applications. Consider points X_1, X_2, ..., X_n drawn i.i.d. from a distribution with mean zero and unknown covariance Sigma. Let A_n = X_n X_n^T; then E[A_n] = Sigma. This paper considers the problem of finding the smallest eigenvalue of Sigma and its corresponding eigenvector. A classical estimator of this type is due to Krasulina (1969). We state a convergence proof of Krasulina's iteration for the smallest eigenvalue and corresponding eigenvector, and then derive their convergence rate. (C) 2019 Elsevier B.V. All rights reserved.
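A minimal sketch of the kind of iteration the abstract refers to. Krasulina's classical update ascends the Rayleigh quotient and so targets the largest eigenvector; here the sign is flipped so the iterate descends it toward the smallest eigenvector. The function name, the step schedule gamma_n = gamma0/n, and the per-step renormalization are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def krasulina_smallest(samples, gamma0=0.5, seed=0):
    """Krasulina-type stochastic iteration aimed at the SMALLEST
    eigenvector of Sigma = E[X X^T].  Sign of the classical update is
    flipped so the iterate descends the Rayleigh quotient; the step
    schedule and renormalization are illustrative choices."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(samples.shape[1])
    w /= np.linalg.norm(w)              # random unit-norm start
    for n, x in enumerate(samples, start=1):
        gamma = gamma0 / n              # decaying step size gamma_n
        Aw = x * (x @ w)                # A_n w = X_n X_n^T w
        rho = w @ Aw                    # Rayleigh quotient (||w|| = 1)
        w -= gamma * (Aw - rho * w)     # descend the Rayleigh quotient
        w /= np.linalg.norm(w)          # renormalize for stability
    return w
```

For instance, feeding samples whose empirical covariance is diag(3, 4/3, 1/3) should drive w toward the third coordinate axis, with w^T Sigma_hat w approaching the smallest eigenvalue.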
Institution:
Hokkaido Univ, Grad Sch Econ & Business Adm, Kita Ku, Sapporo, Hokkaido 0600809, Japan
Igarashi, Gaku
Kakizawa, Yoshihide