Improvement of the kernel minimum squared error model for fast feature extraction

Cited: 3
Authors
Wang, Jinghua [1 ]
Wang, Peng [2 ]
Li, Qin [3 ]
You, Jane [1 ]
Affiliations
[1] Hong Kong Polytech Univ, Biometr Res Ctr, Dept Comp, Kowloon, Hong Kong, Peoples R China
[2] Harbin Inst Technol, Biocomp Res Ctr, Sch Comp Sci & Technol, Harbin 150001, Peoples R China
[3] Shenzhen Univ, Coll Optoelect Engn, Shenzhen, Guangdong, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2013, Vol. 23, No. 1
Keywords
Machine learning; Kernel minimum squared error; Efficient kernel minimum squared error; Feature extraction; DISCRIMINANT-ANALYSIS; FRAMEWORK; ALGORITHM
DOI
10.1007/s00521-012-0813-9
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The kernel minimum squared error (KMSE) model expresses the feature extractor as a linear combination of all the training samples in the high-dimensional kernel space. To extract a feature from a sample, KMSE must evaluate as many kernel functions as there are training samples, so the computational cost of KMSE-based feature extraction grows linearly with the size of the training set. In this paper, we propose an efficient kernel minimum squared error (EKMSE) model for two-class classification. The proposed EKMSE expresses each feature extractor as a linear combination of nodes, which are a small portion of the training samples. To extract a feature from a sample, EKMSE only needs to evaluate as many kernel functions as there are nodes. As the nodes are commonly much fewer than the training samples, EKMSE is much faster than KMSE in feature extraction. EKMSE can achieve the same training accuracy as the standard KMSE, and it also avoids the overfitting problem. We implement the EKMSE model using two algorithms. Experimental results show the feasibility of the EKMSE model.
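The cost contrast described in the abstract can be sketched in a few lines. This is only an illustrative sketch, not the authors' exact EKMSE algorithm: it assumes an RBF kernel, a ridge-regularized least-squares fit, and a hypothetical node subset `node_idx` chosen in advance (the paper's two algorithms select the nodes themselves).

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kmse_train(X, y, mu=1e-3, gamma=1.0):
    # Standard KMSE: the extractor is a combination of ALL n training samples.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + mu * np.eye(len(X)), y)

def kmse_extract(x_new, X, alpha, gamma=1.0):
    # Costs n kernel evaluations per new sample.
    return rbf_kernel(x_new, X, gamma) @ alpha

def ekmse_train(X, y, node_idx, mu=1e-3, gamma=1.0):
    # EKMSE-style sketch: restrict the expansion to a few nodes and fit the
    # reduced coefficients by regularized least squares.
    Kn = rbf_kernel(X, X[node_idx], gamma)
    return np.linalg.solve(Kn.T @ Kn + mu * np.eye(len(node_idx)), Kn.T @ y)

def ekmse_extract(x_new, X, node_idx, beta, gamma=1.0):
    # Costs only len(node_idx) kernel evaluations per new sample.
    return rbf_kernel(x_new, X[node_idx], gamma) @ beta
```

With a node subset much smaller than the training set, `ekmse_extract` performs proportionally fewer kernel evaluations per sample than `kmse_extract`, which is the speedup the abstract claims.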
Pages: 53-59
Page count: 7
Related papers
50 records total
  • [21] Fast Kernel Generalized Discriminative Common Vectors for Feature Extraction
    Katerine Diaz-Chito
    Jesús Martínez del Rincón
    Aura Hernández-Sabaté
    Marçal Rusiñol
    Francesc J. Ferri
    Journal of Mathematical Imaging and Vision, 2018, 60 : 512 - 524
  • [22] A FAST ITERATIVE KERNEL PCA FEATURE EXTRACTION FOR HYPERSPECTRAL IMAGES
    Liao, Wenzhi
    Pizurica, Aleksandra
    Philips, Wilfried
    Pi, Youguo
    2010 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, 2010, : 1317 - 1320
  • [23] A Fast Feature Extraction Method for Kernel 2DPCA
    Sun, Ning
    Wang, Hai-xian
    Ji, Zhen-hai
    Zou, Cai-rong
    Zhao, Li
    INTELLIGENT COMPUTING, PART I: INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTING, ICIC 2006, PART I, 2006, 4113 : 767 - 774
  • [24] MINIMUM SQUARED ERROR SYNTHETIC DISCRIMINANT FUNCTIONS
    KUMAR, BVKV
    MAHALANOBIS, A
    SONG, SW
    SIMS, SRF
    EPPERSON, JF
    OPTICAL ENGINEERING, 1992, 31 (05) : 915 - 922
  • [25] Hyperspectral image feature extraction via kernel minimum noise fraction transform
    Lin, N., Editorial Board of Medical Journal of Wuhan University, (38)
  • [26] Reducing the mean squared error in kernel density estimation
    Kim, Jinmi
    Kim, Choongrak
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2013, 42 (03) : 387 - 397
  • [28] A Fast Incremental Kernel Principal Component Analysis for Online Feature Extraction
    Ozawa, Seiichi
    Takeuchi, Yohei
    Abe, Shigeo
    PRICAI 2010: TRENDS IN ARTIFICIAL INTELLIGENCE, 2010, 6230 : 487 - 497
  • [29] IMPROVED MINIMUM SQUARED ERROR METHOD FOR ROBUST CLASSIFICATION
    Zhu, Fangzhi
    Yan, Rui
    Sun, Yong
    2014 IEEE 3RD INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND INTELLIGENCE SYSTEMS (CCIS), 2014, : 71 - 75
  • [30] PROPERTIES OF MINIMUM MEAN SQUARED ERROR BLOCK QUANTIZERS
    GALLAGHER, NC
    BUCKLEW, JA
    IEEE TRANSACTIONS ON INFORMATION THEORY, 1982, 28 (01) : 105 - 107