A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine

Cited by: 432
Authors
Cao, LJ
Chua, KS
Chong, WK
Lee, HP
Gu, QM
Affiliations
[1] Inst High Performance Comp, Singapore 117528, Singapore
[2] Natl Univ Singapore, Dept Math, Singapore 119260, Singapore
[3] Natl Univ Singapore, Singapore MIT Alliance, Singapore 119260, Singapore
[4] Off Nanjing Comm, Nanjing 210008, Peoples R China
Keywords
support vector machines; principal component analysis; kernel principal component analysis; independent component analysis;
DOI
10.1016/S0925-2312(03)00433-8
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, the support vector machine (SVM) has become a popular tool in time series forecasting. In developing a successful SVM forecaster, the first step is feature extraction. This paper proposes applying principal component analysis (PCA), kernel principal component analysis (KPCA) and independent component analysis (ICA) to SVM for feature extraction. PCA linearly transforms the original inputs into new uncorrelated features. KPCA is a nonlinear extension of PCA developed by using the kernel method. In ICA, the original inputs are linearly transformed into features which are mutually statistically independent. By examining the sunspot data, Santa Fe data set A and five real futures contracts, the experiments show that an SVM with feature extraction by PCA, KPCA or ICA can outperform one without feature extraction. Furthermore, among the three methods, KPCA feature extraction performs best, followed by ICA feature extraction. (C) 2003 Elsevier B.V. All rights reserved.
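The pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' exact setup: the lag embedding, number of components, kernel choices and hyperparameters below are illustrative assumptions, using scikit-learn's PCA, KernelPCA and FastICA as stand-ins for the three feature extractors feeding an SVM regressor.

```python
# Sketch: compare an SVM forecaster with no feature extraction vs. PCA,
# KPCA and ICA feature extraction on a synthetic time series.
# All hyperparameters here are illustrative assumptions, not the paper's.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA, FastICA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20 * np.pi, 1200)) + 0.1 * rng.standard_normal(1200)

# Lag embedding: predict x[t] from the previous `lags` observations.
lags = 12
X = np.column_stack([series[i:i - lags] for i in range(lags)])
y = series[lags:]
X_train, X_test = X[:1000], X[1000:]
y_train, y_test = y[:1000], y[1000:]

extractors = {
    "raw inputs": None,
    "PCA": PCA(n_components=5),
    "KPCA": KernelPCA(n_components=5, kernel="rbf", gamma=0.1),
    "ICA": FastICA(n_components=5, random_state=0, max_iter=1000),
}
for name, ext in extractors.items():
    steps = [StandardScaler()] + ([ext] if ext is not None else []) + [SVR(kernel="rbf")]
    model = make_pipeline(*steps)
    model.fit(X_train, y_train)
    print(f"{name}: test R^2 = {model.score(X_test, y_test):.3f}")
```

Each extractor is placed between standardization and the SVR inside one pipeline, so the transform is fitted on the training window only and then applied to the held-out window, mirroring the compare-with-and-without-extraction design of the experiments.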
Pages: 321 - 336
Number of pages: 16
Related Papers
50 records in total
  • [1] Feature extraction in support vector machine: A comparison of PCA, KPCA and ICA
    Cao, LJ
    Chong, WK
    ICONIP'02: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING: COMPUTATIONAL INTELLIGENCE FOR THE E-AGE, 2002, : 1001 - 1005
  • [2] A comparison of ℓ1-regularization, PCA, KPCA and ICA for dimensionality reduction in logistic regression
    Abdallah Bashir Musa
    International Journal of Machine Learning and Cybernetics, 2014, 5 : 861 - 873
  • [3] A comparison of l1-regularization, PCA, KPCA and ICA for dimensionality reduction in logistic regression
    Musa, Abdallah Bashir
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2014, 5 (06) : 861 - 873
  • [4] PCA and KPCA Integrated Support Vector Machine for Multi-Fault Classification
    Yin, Shen
    Jing, Chen
    Hou, Jian
    Kaynak, Okyay
    Gao, Huijun
    PROCEEDINGS OF THE IECON 2016 - 42ND ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY, 2016, : 7215 - 7220
  • [5] Conceptual and empirical comparison of dimensionality reduction algorithms (PCA, KPCA, LDA, MDS, SVD, LLE, ISOMAP, LE, ICA, t-SNE)
    Anowar, Farzana
    Sadaoui, Samira
    Selim, Bassant
    COMPUTER SCIENCE REVIEW, 2021, 40
  • [6] An empirical study of dimensionality reduction in support vector machine
    Cao, L. J.
    Zhang, JingQing
    Cai, Zongwu
    Lim, Kian Guan
    NEURAL NETWORK WORLD, 2006, 16 (03) : 177 - 192
  • [7] An empirical study of dimensionality reduction in support vector machine
    Financial Studies of Fudan University, HanDan Road, ShangHai 200433, China
    Neural Network World, 2006, 3 (177-192)
  • [8] Dimensionality Reduction Methods: Comparative Analysis of methods PCA, PPCA and KPCA
    Arroyo-Hernandez, Jorge
    UNICIENCIA, 2016, 30 (01) : 115 - 122
  • [9] Dimensionality Reduction by Soft-Margin Support Vector Machine
    Dong, Ruipeng
    Meng, Hua
    Long, Zhiguo
    Zhao, Hailiang
    2017 IEEE INTERNATIONAL CONFERENCE ON AGENTS (ICA), 2017, : 154 - 156
  • [10] Parameter Optimization in KPCA for Rotating Machinery Feature Vector Dimensionality Reduction
    Jiang, Lingli
    Li, Ping
    Tang, Siwen
    MECHATRONICS AND INFORMATION TECHNOLOGY, PTS 1 AND 2, 2012, 2-3 : 755 - 760