Among dimension reduction techniques, Principal Component Analysis (PCA) specializes in vector data, whereas Laplacian embedding is often employed for graph data. Graph regularized PCA, a combination of the two, has also been developed to learn a low-dimensional representation of vector data with the aid of graph data. However, these approaches suffer from the out-of-sample problem: whenever new data arrive, they must be combined with the old data and the eigenvectors recomputed from scratch, at enormous computational cost. To address this problem, we extend graph regularized PCA to graph regularized linear regression PCA (grlrPCA). grlrPCA eliminates the redundant computation on the old data by first learning a linear function and then applying it directly to new data for dimension reduction. Furthermore, we derive an efficient iterative algorithm to solve the grlrPCA optimization problem and show that grlrPCA is closely related to unsupervised Linear Discriminant Analysis in the limit of an infinite regularization parameter. Evaluations on multiple metrics over seven real-world datasets demonstrate that grlrPCA outperforms established unsupervised dimension reduction algorithms.
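The out-of-sample idea in the abstract, learning a linear map once on the training data and then applying it directly to unseen samples, can be illustrated with a minimal sketch. This is not the paper's grlrPCA algorithm: it assumes a simple trace-maximization form of a graph-regularized PCA objective, and the names `grpca_fit` and `embed` are illustrative.

```python
import numpy as np

def grpca_fit(X, L, lam, k):
    """Sketch of a graph-regularized PCA fit (assumed objective:
    maximize tr(W^T (X X^T - lam * X L X^T) W) with W^T W = I).
    X: d x n data matrix (samples as columns), L: n x n graph Laplacian,
    lam: regularization weight, k: target dimension.
    Returns a d x k orthonormal projection W."""
    M = X @ X.T - lam * (X @ L @ X.T)       # symmetric, since L is symmetric
    vals, vecs = np.linalg.eigh(M)          # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:k]        # indices of the k largest eigenvalues
    return vecs[:, top]

def embed(W, Y):
    """Out-of-sample embedding: apply the learned linear map to new
    columns Y without touching the training data."""
    return W.T @ Y

# Toy usage: random data plus a random sparse graph Laplacian.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 20))
A = np.triu((rng.random((20, 20)) > 0.8).astype(float), 1)
A = A + A.T                                  # symmetric adjacency
L = np.diag(A.sum(axis=1)) - A               # combinatorial Laplacian
W = grpca_fit(X, L, lam=0.1, k=2)
Z_new = embed(W, rng.standard_normal((5, 3)))  # 2 x 3 embedding of new samples
```

Because `W` is fixed after fitting, embedding a new sample costs only one matrix-vector product, avoiding the eigen-decomposition over the combined old and new data that the abstract identifies as the bottleneck.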
Affiliations:
Univ Shanghai Sci & Technol, Business Sch, 334 Jungong Rd, Shanghai 200093, Peoples R China
Xu, Jiawen
Perron, Pierre
Affiliations:
Boston Univ, Dept Econ, 270 Bay State Rd, Boston, MA 02215 USA