Improvement of the kernel minimum squared error model for fast feature extraction

Cited by: 3
Authors
Wang, Jinghua [1 ]
Wang, Peng [2 ]
Li, Qin [3 ]
You, Jane [1 ]
Affiliations
[1] Hong Kong Polytech Univ, Biometr Res Ctr, Dept Comp, Kowloon, Hong Kong, Peoples R China
[2] Harbin Inst Technol, Biocomp Res Ctr, Sch Comp Sci & Technol, Harbin 150001, Peoples R China
[3] Shenzhen Univ, Coll Optoelect Engn, Shenzhen, Guangdong, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2013, Vol. 23, No. 1
Keywords
Machine learning; Kernel minimum squared error; Efficient kernel minimum squared error; Feature extraction; Discriminant analysis; Framework; Algorithm
DOI
10.1007/s00521-012-0813-9
CLC number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The kernel minimum squared error (KMSE) model expresses the feature extractor as a linear combination of all the training samples in the high-dimensional kernel space. To extract a feature from a sample, KMSE must therefore evaluate as many kernel functions as there are training samples, so the computational cost of KMSE-based feature extraction grows linearly with the size of the training set. In this paper, we propose an efficient kernel minimum squared error (EKMSE) model for two-class classification. EKMSE expresses each feature extractor as a linear combination of nodes, which form only a small subset of the training samples. To extract a feature from a sample, EKMSE needs to evaluate only as many kernel functions as there are nodes. Since the nodes are usually far fewer than the training samples, EKMSE extracts features much faster than KMSE. EKMSE achieves the same training accuracy as the standard KMSE, and it also avoids the overfitting problem. We implement the EKMSE model using two algorithms. Experimental results demonstrate the feasibility of the EKMSE model.
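As a rough illustration of the cost argument in the abstract, the two extractors can be sketched in a few lines of NumPy. This is a minimal sketch under assumptions, not the paper's implementation: the Gaussian (RBF) kernel, the node-selection step (here simply the first 20 training samples), and all coefficient values are hypothetical placeholders; only the count of kernel evaluations per extracted feature mirrors the abstract's argument.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel values k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kmse_extract(x, X_train, alpha, b=0.0, gamma=1.0):
    # KMSE-style extractor: one kernel evaluation per training sample,
    # so the cost of extracting a feature grows linearly with the training set.
    k = rbf_kernel(x[None, :], X_train, gamma)[0]  # shape (n_train,)
    return k @ alpha + b

def ekmse_extract(x, nodes, beta, b=0.0, gamma=1.0):
    # EKMSE-style extractor: kernel evaluations only against the nodes,
    # a small subset of the training samples.
    k = rbf_kernel(x[None, :], nodes, gamma)[0]    # shape (n_nodes,)
    return k @ beta + b

# Toy usage with synthetic data (placeholder values throughout).
rng = np.random.default_rng(0)
X_train = rng.standard_normal((500, 10))  # 500 training samples, 10 features
alpha = rng.standard_normal(500)          # KMSE weights: one per training sample
nodes = X_train[:20]                      # 20 nodes; real node selection differs
beta = rng.standard_normal(20)            # EKMSE weights: one per node
x = rng.standard_normal(10)
print(kmse_extract(x, X_train, alpha))    # evaluates 500 kernel functions
print(ekmse_extract(x, nodes, beta))      # evaluates only 20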
Pages: 53-59
Page count: 7
Related papers
50 records in total
  • [31] Eldar, Yonina C.; Dvorkind, Tsvi G. A minimum squared-error framework for generalized sampling. IEEE Transactions on Signal Processing, 2006, 54(06): 2155-2167.
  • [32] Stoughton, R.; Strait, S. Source imaging with minimum mean-squared error. Journal of the Acoustical Society of America, 1993, 94(02): 827-834.
  • [33] Eldar, Y. C. Minimum mean-squared error covariance shaping. 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, Vol. VI, Proceedings: Signal Processing Theory and Methods, 2003: 713-716.
  • [34] Baldwin, K. F.; Hoerl, A. E. Bounds on minimum mean squared error in ridge regression. Communications in Statistics Part A-Theory and Methods, 1978, 7(13): 1209-1218.
  • [35] Ohtani, K. Minimum mean squared error estimation of each individual coefficient in a linear regression model. Journal of Statistical Planning and Inference, 1997, 62(02): 301-316.
  • [36] Chen, Badong; Yuan, Zejian; Zheng, Nanning; Principe, Jose C. Kernel minimum error entropy algorithm. Neurocomputing, 2013, 121: 160-169.
  • [37] Vasconcelos, M.; Vasconcelos, N. Some relationships between minimum Bayes error and information theoretical feature extraction. Automatic Target Recognition XV, 2005, 5807: 284-295.
  • [38] Song, F. X.; Yang, J. Y.; Liu, S. H. Pattern recognition based on the minimum norm minimum squared-error classifier. 2004 8th International Conference on Control, Automation, Robotics and Vision, Vols. 1-3, 2004: 1114-1117.
  • [39] Carneiro, G.; Vasconcelos, N. Minimum Bayes error features for visual recognition by sequential feature selection and extraction. 2nd Canadian Conference on Computer and Robot Vision, Proceedings, 2005: 253-260.
  • [40] Chang, Pei-Chann; Wu, Jheng-Long. A critical feature extraction by kernel PCA in stock trading model. Soft Computing, 2015, 19: 1393-1408.