An Evolutionary Orthogonal Component Analysis Method for Incremental Dimensionality Reduction

Cited by: 2
Authors
Zhang, Tianyue [1 ]
Shen, Furao [1 ]
Zhu, Tao [1 ]
Zhao, Jian [2 ]
Affiliations
[1] Nanjing Univ, Natl Key Lab Novel Software Technol, Dept Comp Sci & Technol, Nanjing 210023, Peoples R China
[2] Nanjing Univ, Sch Elect Sci & Engn, Nanjing 210023, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Dimensionality reduction; Matrix decomposition; Learning systems; Feature extraction; Principal component analysis; Data mining; Estimation; incremental learning; orthogonal component (OC); subspace learning; LINEAR DISCRIMINANT-ANALYSIS; PRINCIPAL COMPONENTS; CLASSIFICATION; REPRESENTATION; ALGORITHMS; PCA
DOI
10.1109/TNNLS.2020.3027852
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
To quickly discover low-dimensional representations of high-dimensional noisy data in online environments, we transform the linear dimensionality reduction problem into one of learning the bases of linear feature subspaces. On this basis, we propose a fast and robust dimensionality reduction framework for incremental subspace learning named evolutionary orthogonal component analysis (EOCA). By setting adaptive thresholds that automatically determine the target dimensionality, the proposed method incrementally extracts the orthogonal subspace bases of the data to realize dimensionality reduction while avoiding complex computations. In addition, EOCA can merge two learned subspaces, each represented by its orthonormal basis, into a new one to eliminate outlier effects, and the new subspace is proved to be unique. Extensive experiments and analysis demonstrate that EOCA is fast and achieves competitive results, especially for noisy data.
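The subspace-merging idea mentioned in the abstract (combining two subspaces given only their orthonormal bases) can be sketched in a generic way: concatenate the two bases, re-orthogonalize via an SVD, and keep the directions carrying most of the energy. This is a minimal illustrative sketch, not the paper's actual EOCA procedure; the function name `merge_subspaces` and the SVD-based truncation rule are assumptions.

```python
import numpy as np

def merge_subspaces(U1, U2, energy=0.99):
    """Merge two subspaces given by orthonormal bases U1, U2 (columns).

    Hypothetical sketch: stack the bases, take a thin SVD to
    re-orthogonalize, and retain the leading left singular vectors
    that account for `energy` of the total spectral energy.
    """
    M = np.hstack([U1, U2])                      # columns span the union
    Q, s, _ = np.linalg.svd(M, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)         # cumulative energy ratio
    k = int(np.searchsorted(cum, energy) + 1)    # smallest k reaching target
    return Q[:, :k]                              # orthonormal merged basis
```

For two generic 2-D subspaces of R^6, the merged basis has four orthonormal columns and contains each original subspace, so projecting either original basis onto it reconstructs that basis exactly.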
Pages: 392-405 (14 pages)
Related Papers
50 records in total
  • [21] RANDOMIZED NONLINEAR COMPONENT ANALYSIS FOR DIMENSIONALITY REDUCTION OF HYPERSPECTRAL IMAGES
    Damodaran, Bharath Bhushan
    Courty, Nicolas
    Tavenard, Romain
    2017 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2017, : 5 - 8
  • [22] Dimensionality Reduction Using Principal Component Analysis Applied to the Gradient
    Berguin, Steven H.
    Mavris, Dimitri N.
    AIAA JOURNAL, 2015, 53 (04) : 1078 - 1090
  • [23] Nonlinear Dimensionality Reduction Via Polynomial Principal Component Analysis
    Kazemipour, Abbas
    Druckmann, Shaul
    2018 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2018), 2018, : 1336 - 1340
  • [24] Curvilinear Component Analysis for nonlinear dimensionality reduction of hyperspectral images
    Lennon, M
    Mercier, G
    Mouchot, MC
    Hubert-Moy, L
    IMAGE AND SIGNAL PROCESSING FOR REMOTE SENSING VII, 2002, 4541 : 157 - 168
  • [25] An Orthogonal Locality and Globality Dimensionality Reduction Method Based on Twin Eigen Decomposition
    Su, Shuzhi
    Zhu, Gang
    Zhu, Yanmin
    IEEE ACCESS, 2021, 9 : 55714 - 55725
  • [26] Dimensionality Reduction in Continuous Evolutionary Optimization
    Kramer, Oliver
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [27] Reducing Dimensionality in Principal Component Analysis – A Method Comparison
    Z. Kánya
    E. Forgács
    T. Cserháti
    Z. Illés
    Chromatographia, 2006, 63 : 129 - 134
  • [29] Approximate Orthogonal Sparse Embedding for Dimensionality Reduction
    Lai, Zhihui
    Wong, Wai Keung
    Xu, Yong
    Yang, Jian
    Zhang, David
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2016, 27 (04) : 723 - 735
  • [30] Orthogonal margin discriminant projection for dimensionality reduction
    Jinrong He
    Di Wu
    Naixue Xiong
    Chuansheng Wu
    The Journal of Supercomputing, 2016, 72 : 2095 - 2110