Improvement of the kernel minimum squared error model for fast feature extraction

Cited: 3
Authors
Wang, Jinghua [1 ]
Wang, Peng [2 ]
Li, Qin [3 ]
You, Jane [1 ]
Affiliations
[1] Hong Kong Polytech Univ, Biometr Res Ctr, Dept Comp, Kowloon, Hong Kong, Peoples R China
[2] Harbin Inst Technol, Biocomp Res Ctr, Sch Comp Sci & Technol, Harbin 150001, Peoples R China
[3] Shenzhen Univ, Coll Optoelect Engn, Shenzhen, Guangdong, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2013, Vol. 23, No. 01
Keywords
Machine learning; Kernel minimum squared error; Efficient kernel minimum squared error; Feature extraction; DISCRIMINANT-ANALYSIS; FRAMEWORK; ALGORITHM;
DOI
10.1007/s00521-012-0813-9
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The kernel minimum squared error (KMSE) model expresses the feature extractor as a linear combination of all the training samples in the high-dimensional kernel space. To extract a feature from a sample, KMSE must therefore evaluate as many kernel functions as there are training samples, so the cost of KMSE-based feature extraction grows linearly with the size of the training set. In this paper, we propose an efficient kernel minimum squared error (EKMSE) model for two-class classification. EKMSE expresses each feature extractor as a linear combination of nodes, which form a small subset of the training samples. To extract a feature from a sample, EKMSE only needs to evaluate as many kernel functions as there are nodes. Since the nodes are usually far fewer than the training samples, EKMSE extracts features much faster than KMSE. EKMSE can achieve the same training accuracy as the standard KMSE, and it also avoids overfitting. We implement the EKMSE model using two algorithms. Experimental results show the feasibility of the EKMSE model.
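The contrast described in the abstract can be sketched in code. The snippet below is a minimal, illustrative sketch only, not the paper's algorithms: it trains a standard ridge-regularized KMSE extractor over all training samples, and an EKMSE-style extractor whose coefficients live on a small node subset, so per-sample extraction needs only as many kernel evaluations as there are nodes. All function names (`train_kmse`, `train_ekmse_sketch`, etc.) are hypothetical, and the node subset here is a fixed placeholder; the paper's two algorithms select nodes so that training accuracy matches standard KMSE.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel matrix between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_kmse(X, y, mu=1e-3, gamma=1.0):
    # Standard KMSE: one coefficient per TRAINING SAMPLE.
    # Solves (K + mu*I) alpha = y, where K is the N x N kernel matrix.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + mu * np.eye(len(X)), y)

def train_ekmse_sketch(X, y, node_idx, mu=1e-3, gamma=1.0):
    # EKMSE-style sketch: one coefficient per NODE only.
    # Least-squares fit of the N x m node-kernel matrix to the labels
    # (node selection is a placeholder here, not the paper's method).
    Kn = rbf_kernel(X, X[node_idx], gamma)                    # N x m
    m = len(node_idx)
    return np.linalg.solve(Kn.T @ Kn + mu * np.eye(m), Kn.T @ y)

def extract_feature(coef, X_basis, x, gamma=1.0):
    # Feature = linear combination of kernel evaluations against X_basis;
    # the cost is one kernel evaluation per row of X_basis.
    k = rbf_kernel(X_basis, x[None, :], gamma).ravel()
    return float(coef @ k)

# Toy two-class data with labels +1 / -1.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = np.where(X[:, 0] > 0, 1.0, -1.0)

alpha = train_kmse(X, y)                      # 20 coefficients
f_kmse = extract_feature(alpha, X, X[0])      # 20 kernel evaluations

node_idx = np.arange(0, 20, 4)                # 5 placeholder nodes
beta = train_ekmse_sketch(X, y, node_idx)     # 5 coefficients
f_ekmse = extract_feature(beta, X[node_idx], X[0])  # 5 kernel evaluations
```

The point of the sketch is the cost asymmetry: both extractors produce a scalar feature via `extract_feature`, but the KMSE call touches all 20 training samples while the EKMSE-style call touches only the 5 nodes.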
Pages: 53-59
Page count: 7
Related Papers
50 records in total
  • [41] A critical feature extraction by kernel PCA in stock trading model
    Chang, Pei-Chann
    Wu, Jheng-Long
    SOFT COMPUTING, 2015, 19 (05) : 1393 - 1408
  • [42] MEAN SQUARED ERROR PROPERTIES OF KERNEL ESTIMATES OF REGRESSION QUANTILES
    JONES, MC
    HALL, P
    STATISTICS & PROBABILITY LETTERS, 1990, 10 (04) : 283 - 289
  • [43] Inequalities for mean squared error of multidimensional kernel density estimations
    Ushakov V.G.
    Ushakov N.G.
    Moscow University Computational Mathematics and Cybernetics, 2010, 34 (1) : 16 - 21
  • [44] On the expansion of the mean integrated squared error of a kernel density estimator
    van Es, B
    STATISTICS & PROBABILITY LETTERS, 2001, 52 (04) : 441 - 450
  • [45] Robust tensor decomposition with kernel rescaled error loss for feature extraction and dimensionality reduction
    Zhang, Shuaishuai
    Wang, Xiaofeng
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 270
  • [46] Minimum Mean Squared Error Estimation and Mutual Information Gain
    Gibson, Jerry
    INFORMATION, 2024, 15 (08)
  • [47] MINIMUM MEAN-SQUARED-ERROR ESTIMATORS FOR SIMULATION EXPERIMENTS
    DONNELLY, JH
    SHANNON, RE
    COMMUNICATIONS OF THE ACM, 1981, 24 (04) : 253 - 259
  • [48] Local minimum squared error for face and handwritten character recognition
    Fan, Zizhu
    Wang, Jinghua
    Zhu, Qi
    Fang, Xiaozhao
    Cui, Jinrong
    Li, Chunhua
    JOURNAL OF ELECTRONIC IMAGING, 2013, 22 (03)
  • [49] A Cutting Algorithm for the Minimum Sum-of-Squared Error Clustering
    Peng, Jiming
    Xia, Yu
    PROCEEDINGS OF THE FIFTH SIAM INTERNATIONAL CONFERENCE ON DATA MINING, 2005, : 150 - 160
  • [50] MINIMUM MEAN SQUARED ERROR IMPULSE NOISE ESTIMATION AND CANCELLATION
    KERPEZ, KJ
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 1995, 43 (07) : 1651 - 1662