A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine

Cited by: 432
Authors
Cao, LJ
Chua, KS
Chong, WK
Lee, HP
Gu, QM
Affiliations
[1] Inst High Performance Comp, Singapore 117528, Singapore
[2] Natl Univ Singapore, Dept Math, Singapore 119260, Singapore
[3] Natl Univ Singapore, Singapore MIT Alliance, Singapore 119260, Singapore
[4] Off Nanjing Comm, Nanjing 210008, Peoples R China
Keywords
support vector machines; principal component analysis; kernel principal component analysis; independent component analysis;
DOI
10.1016/S0925-2312(03)00433-8
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, the support vector machine (SVM) has become a popular tool in time series forecasting. In developing a successful SVM forecaster, the first step is feature extraction. This paper proposes applying principal component analysis (PCA), kernel principal component analysis (KPCA) and independent component analysis (ICA) to SVM for feature extraction. PCA linearly transforms the original inputs into new, uncorrelated features. KPCA is a nonlinear extension of PCA developed using the kernel method. In ICA, the original inputs are linearly transformed into features that are mutually statistically independent. Experiments on the sunspot data, the Santa Fe data set A and five real futures contracts show that an SVM with feature extraction by PCA, KPCA or ICA performs better than one without feature extraction. Furthermore, among the three methods, KPCA feature extraction gives the best performance, followed by ICA. (C) 2003 Elsevier B.V. All rights reserved.
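The three pipelines the abstract compares can be sketched with scikit-learn, which provides off-the-shelf implementations of all three extractors and of SVM regression. This is an illustrative reconstruction, not the paper's code: the synthetic data, the number of components, and the RBF hyperparameters are assumptions for the sake of a runnable example.

```python
# Illustrative sketch (not the authors' code): feature extraction by
# PCA, KPCA or ICA feeding an SVM regressor, as compared in the paper.
# Data and hyperparameters are toy assumptions.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA, FastICA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 10))            # 200 samples, 10 lagged inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)  # toy forecasting target

# PCA: linear, uncorrelated features; KPCA: nonlinear PCA via the kernel
# trick; ICA: linear features that are mutually statistically independent.
extractors = {
    "PCA":  PCA(n_components=5),
    "KPCA": KernelPCA(n_components=5, kernel="rbf"),
    "ICA":  FastICA(n_components=5, random_state=0),
}

for name, extractor in extractors.items():
    model = make_pipeline(StandardScaler(), extractor, SVR(kernel="rbf"))
    model.fit(X, y)
    print(name, "R^2 on training data:", round(model.score(X, y), 3))
```

In practice the paper's setting is time series forecasting, so a sliding window of past observations would form the input vectors, and evaluation would be on held-out future data rather than the training fit shown here.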
Pages: 321 - 336
Page count: 16
Related Papers
50 records in total
  • [41] LDA–GA–SVM: improved hepatocellular carcinoma prediction through dimensionality reduction and genetically optimized support vector machine
    Liaqat Ali
    Iram Wajahat
    Noorbakhsh Amiri Golilarz
    Fazel Keshtkar
    Syed Ahmad Chan Bukhari
    Neural Computing and Applications, 2021, 33 : 2783 - 2792
  • [42] Dynamic strain measurement in Brillouin optical correlation-domain sensing facilitated by dimensionality reduction and support vector machine
    Yao, Yuguo
    Mizuno, Yosuke
    OPTICS EXPRESS, 2022, 30 (09) : 15616 - 15633
  • [43] PCA Dimensionality Reduction Method for Image Classification
    Baiting Zhao
    Xiao Dong
    Yongcun Guo
    Xiaofen Jia
    Yourui Huang
    Neural Processing Letters, 2022, 54 : 347 - 368
  • [44] Combining KPCA with Support Vector Regression Machine for Short-term Electricity load Forecasting
    Zhang, Caiqing
    Lu, Pan
    Liu, Zejian
    2008 INTERNATIONAL CONFERENCE ON RISK MANAGEMENT AND ENGINEERING MANAGEMENT, ICRMEM 2008, PROCEEDINGS, 2008, : 305 - 310
  • [45] Predicting Corporate Financial Distress using KPCA and GA-based Support Vector Machine
    Zhou, Jianguo
    Bai, Tao
    2008 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION, VOLS 1-8, 2008, : 909 - 913
  • [46] Dimensionality reduction in higher-order-only ICA
    DeLathauwer, L
    DeMoor, B
    Vandewalle, J
    PROCEEDINGS OF THE IEEE SIGNAL PROCESSING WORKSHOP ON HIGHER-ORDER STATISTICS, 1997, : 316 - 320
  • [47] A Comparison of Extreme Learning Machine and Support Vector Machine Classifiers
    Bucurica, Mihai
    Dogaru, Radu
    Dogaru, Ioana
    2015 IEEE 11TH INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTER COMMUNICATION AND PROCESSING (ICCP), 2015, : 471 - 474
  • [48] Support Vector Machine data reduction for Direct Filter
    Ishiyama, Hiroaki
    Yamakita, Masaki
    2014 IEEE CONFERENCE ON CONTROL APPLICATIONS (CCA), 2014, : 1765 - 1770
  • [49] Local support vector machine based dimension reduction
    Li, Linxi
    Wang, Qin
    Ke, Chenlu
    STATISTICAL ANALYSIS AND DATA MINING, 2022, 15 (06) : 722 - 735
  • [50] Reduction of training data for support vector machine: a survey
    Birzhandi, Pardis
    Kim, Kyung Tae
    Youn, Hee Yong
    SOFT COMPUTING, 2022, 26 (08) : 3729 - 3742