Feature extraction in kernel space using Bhattacharyya distance as criterion function

Cited: 0
Authors
Xia, Jian-Tao [1 ]
He, Ming-Yi [1 ]
Affiliation
[1] Dept. of Electron. Eng., Northwestern Polytech. Univ., Xi'an 710072, China
Source
Keywords
Algorithms; Classification (of information); Computer simulation; Convergence of numerical methods; Quadratic programming
DOI
Not available
CLC number
Subject classification code
Abstract
This paper proposes a novel approach to feature extraction for classification in kernel space, called BKFE, which uses the Bhattacharyya distance, a quantity that determines an upper bound on the Bayes error, as its criterion function. The key idea of BKFE is to first map the data nonlinearly into a high-dimensional kernel space and then find a set of discriminatively informative features in that space that linearly map the data into a low-dimensional feature space in which the Bhattacharyya distances between classes are maximized. The authors first draw on kernel theory and nonlinear optimization techniques to develop BKFE for the binary classification problem, solving the feature extraction problem by quadratic programming, which gives BKFE fast and global convergence. They then extend BKFE to the multi-class classification problem. Compared with KPCA (Kernel Principal Component Analysis), KFD (Kernel Fisher Discriminant), and FD (Fisher Discriminant), BKFE has two desirable advantages: (1) the features extracted by BKFE are more effective for classification; (2) for a given pattern recognition problem, it predicts an upper bound on the number of features needed to achieve the same classification accuracy as in the original space. Experimental results show that BKFE provides more informative features for pattern classification than the other methods.
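As background for the criterion named in the abstract: when each class is modeled as a Gaussian N(\mu_i, \Sigma_i) with prior P_i, the Bhattacharyya distance and its bound on the Bayes error take the standard textbook forms below (general facts, not taken from this record; the specific kernel-space formulation of BKFE is given in the paper itself). BKFE maximizes such between-class distances in the kernel-induced feature space.

B = \frac{1}{8}(\mu_2-\mu_1)^{\top}\left[\frac{\Sigma_1+\Sigma_2}{2}\right]^{-1}(\mu_2-\mu_1) + \frac{1}{2}\ln\frac{\left|\frac{\Sigma_1+\Sigma_2}{2}\right|}{\sqrt{|\Sigma_1|\,|\Sigma_2|}}

\varepsilon_{\mathrm{Bayes}} \le \sqrt{P_1 P_2}\, e^{-B}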
Pages: 683 - 689
Related papers
50 records in total
  • [1] Feature extraction using the Bhattacharyya distance
    Lee, C
    Hong, D
    SMC '97 CONFERENCE PROCEEDINGS - 1997 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS, VOLS 1-5: CONFERENCE THEME: COMPUTATIONAL CYBERNETICS AND SIMULATION, 1997, : 2147 - 2150
  • [2] Feature extraction based on the Bhattacharyya distance
    Choi, E
    Lee, CH
    IGARSS 2000: IEEE 2000 INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, VOL I - VI, PROCEEDINGS, 2000, : 2146 - 2148
  • [3] Feature extraction based on the Bhattacharyya distance
    Choi, E
    Lee, C
    PATTERN RECOGNITION, 2003, 36 (08) : 1703 - 1709
  • [4] Representation of a Fisher Criterion Function in a Kernel Feature Space
    Lee, Sang Wan
    Bien, Zeungnam
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2010, 21 (02) : 333 - 339
  • [5] Feature extraction based on the Bhattacharyya distance for multimodal data
    Choi, E
    Lee, C
    IGARSS 2001: SCANNING THE PRESENT AND RESOLVING THE FUTURE, VOLS 1-7, PROCEEDINGS, 2001, : 524 - 526
  • [6] Feature extraction using kernel Laplacian maximum margin criterion
    Sun, Zhongxi
    Sun, Changyin
    Yang, Wankou
    Wang, Zhenyu
    OPTICAL ENGINEERING, 2012, 51 (06)
  • [7] FEATURE COMBINATIONS AND BHATTACHARYYA CRITERION
    DECELL, HP
    MARANI, SK
    COMMUNICATIONS IN STATISTICS PART A-THEORY AND METHODS, 1976, 5 (12) : 1143 - 1152
  • [8] Feature selection based on the Bhattacharyya distance
    Xuan, Guorong
    Zhu, Xiuming
    Chai, Peiqi
    Zhang, Zhenping
    Shi, Yun Q.
    Fu, Dongdong
    18TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL 3, PROCEEDINGS, 2006, : 1232+
  • [9] Feature selection based on Bhattacharyya distance
    Xuan, Guorong
    Chai, Peiqi
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 1996, 9 (04) : 324 - 329
  • [10] A Fuzzy Kernel Maximum Margin Criterion for Image Feature Extraction
    Xuan, Shibin
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2015, 2015