Feature extraction in kernel space using Bhattacharyya distance as criterion function

Cited by: 0
Authors
Xia, Jian-Tao [1 ]
He, Ming-Yi [1 ]
Affiliation
[1] Dept. of Electron. Eng., Northwestern Polytech. Univ., Xi'an 710072, China
Keywords
Algorithms - Classification (of information) - Computer simulation - Convergence of numerical methods - Quadratic programming
DOI
Not available
Abstract
This paper proposes a novel approach to feature extraction for classification in kernel space, called BKFE, which uses the Bhattacharyya distance, a quantity that determines an upper bound on the Bayes error, as its criterion function. The key idea of BKFE is that the data are first nonlinearly mapped into a high-dimensional kernel space. A set of discriminatively informative features is then found in kernel space that linearly maps the data into a low-dimensional feature space in which the Bhattacharyya distances between classes are maximized. The authors first draw on kernel theory and nonlinear optimization techniques to develop BKFE for the binary classification problem, solving the feature extraction problem by quadratic programming, which endows BKFE with fast and global convergence. They then extend BKFE to the multi-class classification problem. Compared with KPCA (Kernel Principal Component Analysis), KFD (Kernel Fisher Discriminant), and FD (Fisher Discriminant), BKFE has two desirable advantages: (1) the features extracted by BKFE are more effective for classification; (2) it predicts an upper bound on the number of features needed to achieve the same classification accuracy as in the original space for a given pattern recognition problem. Experimental results show that BKFE provides more informative features for pattern classification than the other methods.
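As background for the criterion named in the abstract: under the standard assumption of Gaussian class-conditional densities (an assumption added here for illustration and not stated in this record), the Bhattacharyya distance between classes $\omega_1 \sim N(\mu_1,\Sigma_1)$ and $\omega_2 \sim N(\mu_2,\Sigma_2)$, and the corresponding bound on the Bayes error, are

$$
B \;=\; \frac{1}{8}\,(\mu_2-\mu_1)^{\top}\!\left(\frac{\Sigma_1+\Sigma_2}{2}\right)^{-1}\!(\mu_2-\mu_1)
\;+\; \frac{1}{2}\ln\frac{\left|\frac{\Sigma_1+\Sigma_2}{2}\right|}{\sqrt{|\Sigma_1|\,|\Sigma_2|}},
\qquad
\varepsilon_{\text{Bayes}} \;\le\; \sqrt{P_1 P_2}\,e^{-B},
$$

where $P_1$ and $P_2$ are the class priors. Maximizing $B$ over the extracted features therefore minimizes this upper bound on the Bayes error, which is the sense in which the Bhattacharyya distance "determines the upper bound of the Bayes error"; the paper's specific kernel-space formulation (BKFE) is developed in the full text.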
Pages: 683-689