Improvement of the kernel minimum squared error model for fast feature extraction

Cited by: 3
Authors
Wang, Jinghua [1 ]
Wang, Peng [2 ]
Li, Qin [3 ]
You, Jane [1 ]
Affiliations
[1] Hong Kong Polytech Univ, Biometr Res Ctr, Dept Comp, Kowloon, Hong Kong, Peoples R China
[2] Harbin Inst Technol, Biocomp Res Ctr, Sch Comp Sci & Technol, Harbin 150001, Peoples R China
[3] Shenzhen Univ, Coll Optoelect Engn, Shenzhen, Guangdong, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2013 / Vol. 23 / No. 1
Keywords
Machine learning; Kernel minimum squared error; Efficient kernel minimum squared error; Feature extraction; DISCRIMINANT-ANALYSIS; FRAMEWORK; ALGORITHM;
DOI
10.1007/s00521-012-0813-9
CLC number
TP18 [Theory of artificial intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
The kernel minimum squared error (KMSE) model expresses the feature extractor as a linear combination of all the training samples in the high-dimensional kernel space. To extract a feature from a sample, KMSE must evaluate as many kernel functions as there are training samples. Thus, the computational cost of KMSE-based feature extraction grows linearly with the size of the training set. In this paper, we propose an efficient kernel minimum squared error (EKMSE) model for two-class classification. The proposed EKMSE expresses each feature extractor as a linear combination of nodes, which form a small subset of the training samples. To extract a feature from a sample, EKMSE only needs to evaluate as many kernel functions as there are nodes. Since the nodes are typically far fewer than the training samples, EKMSE is much faster than KMSE in feature extraction. EKMSE can achieve the same training accuracy as the standard KMSE, and it also avoids the overfitting problem. We implement the EKMSE model using two algorithms. Experimental results show the feasibility of the EKMSE model.
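The speedup described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the coefficient vectors would be learned by minimizing the squared-error objective, and the paper's node-selection algorithms are not reproduced here; the RBF kernel, the `gamma` value, and the random coefficients below are all placeholder assumptions.

```python
import numpy as np

def rbf_kernel(x, Y, gamma=1.0):
    # Gaussian (RBF) kernel values between one sample x and each row of Y.
    return np.exp(-gamma * np.sum((Y - x) ** 2, axis=1))

def kmse_feature(x, X_train, alpha, gamma=1.0):
    # Standard KMSE extractor: one kernel evaluation per training sample.
    return rbf_kernel(x, X_train, gamma) @ alpha

def ekmse_feature(x, nodes, beta, gamma=1.0):
    # EKMSE-style extractor: kernel evaluations only against the nodes,
    # a small subset of the training samples.
    return rbf_kernel(x, nodes, gamma) @ beta

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 8))   # 500 training samples, 8 features
nodes = X_train[:20]                  # 20 nodes (placeholder selection)
alpha = rng.normal(size=500)          # placeholder KMSE coefficients
beta = rng.normal(size=20)            # placeholder EKMSE coefficients
x = rng.normal(size=8)                # a query sample

f_full = kmse_feature(x, X_train, alpha)   # costs 500 kernel evaluations
f_fast = ekmse_feature(x, nodes, beta)     # costs only 20
```

Both extractors return a scalar feature; the per-sample cost drops from one kernel evaluation per training sample to one per node, which is the source of EKMSE's speedup.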
Pages: 53 / 59
Page count: 7
Related papers
50 records
  • [1] Improvement of the kernel minimum squared error model for fast feature extraction
    Jinghua Wang
    Peng Wang
    Qin Li
    Jane You
    Neural Computing and Applications, 2013, 23 : 53 - 59
  • [2] Sparsity Based Feature Extraction for Kernel Minimum Squared Error
    Jiang, Jiang
    Chen, Xi
    Gan, Haitao
    Sang, Nong
    PATTERN RECOGNITION (CCPR 2014), PT I, 2014, 483 : 273 - 282
  • [3] Using Feature Correlation Measurement to Improve the Kernel Minimum Squared Error Algorithm
    Fan, Zizhu
    Li, Zuoyong
    PATTERN RECOGNITION (CCPR 2016), PT I, 2016, 662 : 563 - 573
  • [4] Incremental kernel minimum squared error (KMSE)
    Zhao, Yong-Ping
    Wang, Kang-Kang
    Liu, Jie
    Huerta, Ramon
    INFORMATION SCIENCES, 2014, 270 : 92 - 111
  • [5] Semiparametric spatial effects kernel minimum squared error model for predicting housing sales prices
    Shim, Jooyong
    Bin, Okmyung
    Hwang, Changha
    NEUROCOMPUTING, 2014, 124 : 81 - 88
  • [6] A safe semi-supervised kernel minimum squared error algorithm
    Gan Haitao
    Meng Ming
    Ma Yuliang
    Gao Yunyuan
    2015 34TH CHINESE CONTROL CONFERENCE (CCC), 2015, : 3723 - 3726
  • [7] Sparse kernel minimum squared error using Householder transformation and givens rotation
    Zhao, Yong-Ping
    Xi, Peng-Peng
    Li, Bing
    Li, Zhi-Qiang
    APPLIED INTELLIGENCE, 2018, 48 (02) : 390 - 415
  • [8] Laplacian regularized kernel minimum squared error and its application to face recognition
    Gan, Haitao
    OPTIK, 2014, 125 (14): : 3524 - 3529
  • [9] Towards a probabilistic semi-supervised Kernel Minimum Squared Error algorithm
    Gan, Haitao
    Huang, Rui
    Luo, Zhizeng
    Fan, Yingle
    Gao, Farong
    NEUROCOMPUTING, 2016, 171 : 149 - 155
  • [10] Enhanced kernel minimum squared error algorithm and its application in face recognition
    Zhao Y.
    He X.
    Chen B.
    Zhao X.
    Journal of Southeast University (English Edition), 2016, 32 (01) : 35 - 38