Graph-dual Laplacian principal component analysis

Cited by: 5
Authors
He, Jinrong [1 ,2 ]
Bi, Yingzhou [3 ]
Liu, Bin [1 ,2 ]
Zeng, Zhigao [4 ]
Affiliations
[1] Northwest A&F Univ, Coll Informat Engn, Yangling 712100, Shaanxi, Peoples R China
[2] Minist Agr Peoples Republ China, Key Lab Agr Internet Things, Yangling 712100, Shaanxi, Peoples R China
[3] Guangxi Teachers Educ Univ, Sci Comp & Intelligent Informat Proc Guangxi High, Nanning 530001, Guangxi, Peoples R China
[4] Hunan Univ Technol, Coll Comp & Commun, Xiangtan 412000, Hunan, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Principal component analysis; Graph-Laplacian PCA; Dual graph; Feature manifold; Graph-Dual Laplacian PCA; MATRIX FACTORIZATION; LP-NORM; OPTIMIZATION; ALGORITHM; NETWORK;
DOI
10.1007/s12652-018-1096-5
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Principal component analysis (PCA) is the most widely used method for linear dimensionality reduction, owing to its effectiveness in exploring the low-dimensional global geometric structures embedded in data. To preserve the intrinsic local geometric structure of data, graph-Laplacian PCA (gLPCA) incorporates Laplacian embedding into the PCA framework to learn local similarities between data points, which leads to significant performance improvements in clustering and classification. Recent works have shown that not only do high-dimensional data reside on a low-dimensional manifold in the data space, but the features also lie on a manifold in the feature space. However, both PCA and gLPCA overlook the local geometric information contained in the feature space. By exploiting the duality between the data manifold and the feature manifold, graph-dual Laplacian PCA (gDLPCA) is proposed, which incorporates data-graph regularization and feature-graph regularization into the PCA framework to exploit the local geometric structures of the data manifold and the feature manifold simultaneously. Experimental results on four benchmark data sets confirm its effectiveness and show that gDLPCA outperforms gLPCA on classification and clustering tasks.
Pages: 3249-3262
Number of pages: 14
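
To make the method described in the abstract concrete, the following is a minimal sketch of a dual-graph-regularized PCA of the kind gDLPCA describes. It assumes the common gLPCA-style objective ||X - U Q^T||_F^2 + alpha*tr(Q^T L_data Q) + beta*tr(U^T L_feat U), where X is a feature-by-sample matrix, U holds the feature loadings, Q the sample embedding, and L_data, L_feat are k-NN graph Laplacians built on the samples and the features respectively. The alternating Sylvester-equation updates and all names here (knn_laplacian, gdlpca_sketch, alpha, beta) are illustrative assumptions, not the authors' published algorithm.

# Sketch only: dual-graph-regularized PCA via alternating least squares.
# Assumed objective (following the gLPCA literature, not the paper's exact method):
#   min_{U,Q} ||X - U Q^T||_F^2 + alpha*tr(Q^T L_data Q) + beta*tr(U^T L_feat U)
import numpy as np
from scipy.linalg import solve_sylvester
from sklearn.neighbors import kneighbors_graph


def knn_laplacian(M, n_neighbors=5):
    """Unnormalized Laplacian L = D - W of a symmetrized k-NN graph on the rows of M."""
    W = kneighbors_graph(M, n_neighbors=n_neighbors, mode="connectivity").toarray()
    W = np.maximum(W, W.T)                      # symmetrize the adjacency matrix
    return np.diag(W.sum(axis=1)) - W


def gdlpca_sketch(X, k=2, alpha=1.0, beta=1.0, n_iter=50):
    """Alternate between U and Q; each subproblem reduces to a Sylvester equation."""
    p, n = X.shape                              # X is features x samples
    L_data = knn_laplacian(X.T)                 # graph over the n samples
    L_feat = knn_laplacian(X)                   # graph over the p features
    rng = np.random.default_rng(0)
    U = rng.standard_normal((p, k))             # feature loadings
    Q = rng.standard_normal((n, k))             # sample embedding
    for _ in range(n_iter):
        # Fix Q, solve for U: beta*L_feat @ U + U @ (Q^T Q) = X @ Q
        U = solve_sylvester(beta * L_feat, Q.T @ Q, X @ Q)
        # Fix U, solve for Q: alpha*L_data @ Q + Q @ (U^T U) = X^T @ U
        Q = solve_sylvester(alpha * L_data, U.T @ U, X.T @ U)
    return U, Q


if __name__ == "__main__":
    X = np.random.default_rng(1).standard_normal((20, 100))   # 20 features, 100 samples
    U, Q = gdlpca_sketch(X, k=2)
    print(U.shape, Q.shape)                                    # (20, 2) (100, 2)

Setting beta to zero removes the feature-graph term and leaves a gLPCA-style model with only the data-graph regularizer, which is one way to compare the two regularization schemes on benchmark tasks like those mentioned in the abstract.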