Graph-dual Laplacian principal component analysis

Cited by: 5
Authors
He, Jinrong [1 ,2 ]
Bi, Yingzhou [3 ]
Liu, Bin [1 ,2 ]
Zeng, Zhigao [4 ]
Affiliations
[1] Northwest A&F Univ, Coll Informat Engn, Yangling 712100, Shaanxi, Peoples R China
[2] Minist Agr Peoples Republ China, Key Lab Agr Internet Things, Yangling 712100, Shaanxi, Peoples R China
[3] Guangxi Teachers Educ Univ, Sci Comp & Intelligent Informat Proc Guangxi High, Nanning 530001, Guangxi, Peoples R China
[4] Hunan Univ Technol, Coll Comp & Commun, Xiangtan 412000, Hunan, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Principal component analysis; Graph-Laplacian PCA; Dual graph; Feature manifold; Graph-Dual Laplacian PCA; MATRIX FACTORIZATION; LP-NORM; OPTIMIZATION; ALGORITHM; NETWORK;
DOI
10.1007/s12652-018-1096-5
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Principal component analysis (PCA) is the most widely used method for linear dimensionality reduction, owing to its effectiveness in exploring the low-dimensional global geometric structure embedded in data. To preserve the intrinsic local geometric structure of data, graph-Laplacian PCA (gLPCA) incorporates Laplacian embedding into the PCA framework to learn local similarities between data points, which leads to significant performance improvements in clustering and classification. Recent work has shown that not only does high-dimensional data reside on a low-dimensional manifold in the data space, but the features also lie on a manifold in the feature space. However, both PCA and gLPCA overlook the local geometric information contained in the feature space. By exploiting the duality between the data manifold and the feature manifold, graph-dual Laplacian PCA (gDLPCA) is proposed; it incorporates both data-graph and feature-graph regularization into the PCA framework to exploit the local geometric structures of the data manifold and the feature manifold simultaneously. Experimental results on four benchmark data sets confirm its effectiveness and show that gDLPCA outperforms gLPCA on classification and clustering tasks.
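The abstract describes gDLPCA as adding a feature-graph regularizer alongside gLPCA's data-graph regularizer. The paper's exact objective is not reproduced in this record, but a standard dual-graph-regularized matrix factorization of this kind can be sketched as follows (the symbols here are illustrative assumptions: data matrix $X \in \mathbb{R}^{d \times n}$, feature-space basis $U$, low-dimensional embedding $V$, data-graph Laplacian $L_V$, feature-graph Laplacian $L_U$, and trade-off weights $\alpha, \beta \geq 0$):

```latex
\min_{U,\,V}\; \|X - U V^{\top}\|_F^2
  \;+\; \alpha \,\operatorname{Tr}\!\left(V^{\top} L_V V\right)
  \;+\; \beta  \,\operatorname{Tr}\!\left(U^{\top} L_U U\right),
\qquad \text{s.t. } V^{\top} V = I .
```

The first term is the usual PCA reconstruction error; the second preserves local similarities between data points, as in gLPCA; and the third is the additional smoothness penalty over the feature manifold that distinguishes a dual-graph formulation. Setting $\beta = 0$ recovers a gLPCA-style objective, and setting $\alpha = \beta = 0$ recovers plain PCA.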
Pages: 3249-3262
Page count: 14
Related papers
50 records in total
  • [31] Principal component analysis
    Hess, Aaron S.
    Hess, John R.
    TRANSFUSION, 2018, 58 (07) : 1580 - 1582
  • [32] Principal component analysis
    Aries, RE
    Lidiard, DP
    Spragg, RA
    CHEMISTRY IN BRITAIN, 1991, 27 (09) : 821 - 824
  • [33] Segmented principal component transform-principal component analysis
    Barros, AS
    Rutledge, DN
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2005, 78 (1-2) : 125 - 137
  • [34] Degrees of freedom estimation in Principal Component Analysis and Consensus Principal Component Analysis
    Hassani, Sahar
    Martens, Harald
    Qannari, El Mostafa
    Kohler, Achim
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2012, 118 : 246 - 259
  • [35] Dual systems for minor and principal component computation
    Hasan, Mohammed A.
    2008 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, VOLS 1-12, 2008 : 1901 - 1904
  • [36] A dual purpose principal and minor component flow
    Manton, JH
    Helmke, U
    Mareels, IMY
    SYSTEMS & CONTROL LETTERS, 2005, 54 (08) : 759 - 769
  • [37] Graph-regularized tensor robust principal component analysis for hyperspectral image denoising
    Nie, Yongming
    Chen, Linsen
    Zhu, Hao
    Du, Sidan
    Yue, Tao
    Cao, Xun
    APPLIED OPTICS, 2017, 56 (22) : 6094 - 6102
  • [38] Graph-Regularized Fast and Robust Principal Component Analysis for Hyperspectral Band Selection
    Sun, Weiwei
    Du, Qian
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2018, 56 (06) : 3185 - 3195
  • [39] Smooth principal component analysis network image recognition algorithm with fusion graph embedding
    Chen F.
    Zhu Y.
    Tian J.
    Jiang K.
    Guofang Keji Daxue Xuebao/Journal of National University of Defense Technology, 2022, 44 (03) : 16 - 22
  • [40] On connections between Renyi entropy Principal Component Analysis, kernel learning and graph embedding
    Ran, Zhi-Yong
    Wang, Wei
    Hu, Bao-Gang
    PATTERN RECOGNITION LETTERS, 2018, 112 : 125 - 130