Feature extraction in kernel space using Bhattacharyya distance as criterion function

Cited: 0
Authors
Xia, Jian-Tao [1]
He, Ming-Yi [1]
Affiliation
[1] Dept. of Electron. Eng., Northwestern Polytech. Univ., Xi'an 710072, China
Keywords
Algorithms; Classification (of information); Computer simulation; Convergence of numerical methods; Quadratic programming
DOI
N/A
Abstract
This paper proposes BKFE, a novel approach to feature extraction for classification in kernel space that uses the Bhattacharyya distance, which determines an upper bound on the Bayes error, as its criterion function. The key idea of BKFE is first to map the data nonlinearly into a high-dimensional kernel space. A set of discriminantly informative features is then found in kernel space that linearly maps the data into a low-dimensional feature space in which the Bhattacharyya distances between classes are maximized. The authors first draw on kernel theory and nonlinear optimization techniques to develop BKFE for the binary classification problem, solving the feature extraction problem by quadratic programming, which endows BKFE with fast and global convergence. They then extend BKFE to the multi-class classification problem. Compared with KPCA (Kernel Principal Component Analysis), KFD (Kernel Fisher Discriminant), and FD (Fisher Discriminant), BKFE has two desirable advantages: (1) the features extracted by BKFE are more effective for classification; (2) it predicts an upper bound on the number of features needed to achieve the same classification accuracy as in the original space for a given pattern recognition problem. Experimental results show that BKFE provides more informative features for pattern classification than the other methods.
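The abstract's criterion rests on a standard result: for two Gaussian class densities, the Bhattacharyya distance has a closed form, and for equal priors it bounds the Bayes error from above by (1/2)·exp(−D_B). A minimal sketch of both formulas (not the paper's kernel-space BKFE algorithm itself, just the underlying criterion, with hypothetical function names):

```python
import numpy as np

def bhattacharyya_gaussian(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two Gaussian class densities.

    D_B = 1/8 (mu1-mu2)^T S^{-1} (mu1-mu2)
          + 1/2 ln( det S / sqrt(det cov1 * det cov2) ),
    where S = (cov1 + cov2) / 2.
    """
    s = (cov1 + cov2) / 2.0
    diff = mu1 - mu2
    # Mean-separation term; solve() avoids forming an explicit inverse.
    term_mean = 0.125 * diff @ np.linalg.solve(s, diff)
    # slogdet is numerically safer than log(det(...)) in high dimensions.
    _, logdet_s = np.linalg.slogdet(s)
    _, logdet_1 = np.linalg.slogdet(cov1)
    _, logdet_2 = np.linalg.slogdet(cov2)
    term_cov = 0.5 * (logdet_s - 0.5 * (logdet_1 + logdet_2))
    return term_mean + term_cov

def bayes_error_upper_bound(d_b):
    """Bhattacharyya bound on the Bayes error for equal class priors."""
    return 0.5 * np.exp(-d_b)
```

Maximizing D_B in the projected space therefore tightens the Bayes-error bound, which is why it serves as the criterion function here. For example, two unit-covariance Gaussians in 2-D with means (0,0) and (1,1) give D_B = 0.25 (only the mean term contributes) and an error bound of about 0.39.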
Pages: 683-689
Related papers (50 records in total)
  • [21] Feature Extraction Using Kernel Inverse FDA
    Sun, Zhongxi
    Sun, Changyin
    Wang, Zhenyu
    Yang, Wankou
    PROCEEDINGS OF THE 31ST CHINESE CONTROL CONFERENCE, 2012, : 3672 - 3675
  • [22] A GMM SUPERVECTOR KERNEL WITH THE BHATTACHARYYA DISTANCE FOR SVM BASED SPEAKER RECOGNITION
    You, Chang Huai
    Lee, Kong Aik
    Li, Haizhou
    2009 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOLS 1- 8, PROCEEDINGS, 2009, : 4221 - 4224
  • [23] Microscene validation using the Bhattacharyya distance
    Basener, William F.
    Flynn, Marty
    MULTISPECTRAL, HYPERSPECTRAL, AND ULTRASPECTRAL REMOTE SENSING TECHNOLOGY, TECHNIQUES AND APPLICATIONS VII, 2018, 10780
  • [24] NON-LINEAR FEATURE EXTRACTION WITH A GENERAL CRITERION FUNCTION
    FUKUNAGA, K
    SHORT, RD
    IEEE TRANSACTIONS ON INFORMATION THEORY, 1978, 24 (05) : 600 - 607
  • [25] Method on Feature Selection of Hyperspectral Images Based on Bhattacharyya Distance
    Cai Simin
    Zhang Rongqun
    Yuan Hui
    PROCEEDINGS OF THE 8TH WSEAS INTERNATIONAL CONFERENCE ON APPLIED COMPUTER AND APPLIED COMPUTATIONAL SCIENCE: APPLIED COMPUTER AND APPLIED COMPUTATIONAL SCIENCE, 2009, : 324 - +
  • [26] Phone clustering using the Bhattacharyya distance
    Mak, B
    Barnard, E
    ICSLP 96 - FOURTH INTERNATIONAL CONFERENCE ON SPOKEN LANGUAGE PROCESSING, PROCEEDINGS, VOLS 1-4, 1996, : 2005 - 2008
  • [27] Optimal feature representation for kernel machines using kernel-target alignment criterion
    Pothin, Jean-Baptiste
    Richard, Cedric
    2007 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL III, PTS 1-3, PROCEEDINGS, 2007, : 1065 - +
  • [28] Feature extraction via prototype margin distance maximizing criterion for subspace learning
    Li, Zikang
    Xu, Jie
    OPTIK, 2021, 231
  • [29] Feature Extraction Using Laplacian Maximum Margin Criterion
    Wankou Yang
    Changyin Sun
    Helen S. Du
    Jingyu Yang
    Neural Processing Letters, 2011, 33 : 99 - 110
  • [30] Feature extraction using fuzzy maximum margin criterion
    Cui, Yan
    Fan, Liya
    NEUROCOMPUTING, 2012, 86 : 52 - 58