Bayesian approach to feature selection and parameter tuning for support vector machine classifiers

Cited: 50
Authors
Gold, C [1]
Holub, A
Sollich, P
Affiliations
[1] CALTECH, Pasadena, CA 91125 USA
[2] Kings Coll London, Dept Math, London WC2R 2LS, England
Keywords
DOI
10.1016/j.neunet.2005.06.044
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A Bayesian view of SVM classifiers allows the definition of a quantity analogous to the evidence in probabilistic models. By maximizing this quantity one can systematically tune hyperparameters and, via automatic relevance determination (ARD), select relevant input features. Evidence gradients are expressed as averages over the associated posterior and can be approximated using Hybrid Monte Carlo (HMC) sampling. We describe how a Nyström approximation of the Gram matrix can be used to speed up sampling significantly while leaving classification accuracy almost unchanged. In experiments on classification problems with many irrelevant features, this approach to ARD gives a significant improvement in classification performance over more traditional, non-ARD, SVM systems. The final tuned hyperparameter values provide a useful criterion for pruning irrelevant features, and we define a measure of relevance with which to determine systematically how many features should be removed. This use of ARD for hard feature selection can improve classification accuracy in non-ARD SVMs. In the majority of cases, however, we find that on data sets constructed by human domain experts the performance of non-ARD SVMs is largely insensitive to the presence of some less relevant features. Eliminating such features via ARD then does not improve classification accuracy, but leads to impressive reductions, of up to 75%, in the number of features required. © 2005 Elsevier Ltd. All rights reserved.
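The abstract combines two concrete ingredients: an ARD kernel with one length scale per input feature (tuned length scales then indicate feature relevance) and a Nyström low-rank approximation of the Gram matrix to cut sampling cost. A minimal numpy sketch of both is below; the RBF form of the kernel, the landmark count `m`, and all variable names are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

def ard_rbf_kernel(X, Y, length_scales):
    # ARD RBF kernel: one length scale per feature; a large length
    # scale effectively switches the corresponding feature off.
    diff = (X[:, None, :] - Y[None, :, :]) / length_scales
    return np.exp(-0.5 * np.sum(diff ** 2, axis=-1))

def nystrom_gram(X, length_scales, m, rng):
    # Nyström approximation of the full n x n Gram matrix built
    # from m randomly chosen landmark points: K ~ C W^+ C^T.
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = ard_rbf_kernel(X, X[idx], length_scales)  # n x m cross-kernel
    W = C[idx]                                    # m x m landmark Gram
    return C @ np.linalg.pinv(W) @ C.T            # rank-m approximation

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
ls = np.ones(5)                                   # unit length scales (illustrative)
K = ard_rbf_kernel(X, X, ls)                      # exact 200 x 200 Gram matrix
K_approx = nystrom_gram(X, ls, m=50, rng=rng)     # rank-50 Nyström approximation
rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
```

The Nyström factors are exact on the landmark rows and columns, so with enough landmarks the relative error is small while kernel evaluations drop from O(n²) to O(nm), which is what makes the HMC posterior sampling cheaper.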
Pages: 693-701
Page count: 9
Related papers
50 records
  • [21] Evolutionary feature and parameter selection in support vector regression
    Mejia-Guevara, Ivan
    Kuri-Morales, Angel
    MICAI 2007: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2007, 4827 : 399 - +
  • [22] An efficient method for tuning kernel parameter of the support vector machine
    Debnath, R
    Takahashi, H
    IEEE INTERNATIONAL SYMPOSIUM ON COMMUNICATIONS AND INFORMATION TECHNOLOGIES 2004 (ISCIT 2004), PROCEEDINGS, VOLS 1 AND 2: SMART INFO-MEDIA SYSTEMS, 2004, : 1023 - 1028
  • [23] Kernel Parameter Selection for Support Vector Machine Classification
    Liu, Zhiliang
    Xu, Hongbing
    JOURNAL OF ALGORITHMS & COMPUTATIONAL TECHNOLOGY, 2014, 8 (02) : 163 - 177
  • [24] Selection of the Parameter in Gaussian Kernels in Support Vector Machine
    Zhang, Yanyi
    Li, Rui
    2017 2ND IEEE INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND BIG DATA ANALYSIS (ICCCBDA 2017), 2017, : 430 - 433
  • [26] Support Vector Machine Parameter Tuning using Firefly Algorithm
    Tuba, Eva
    Mrkela, Lazar
    Tuba, Milan
    PROCEEDINGS OF THE 26TH INTERNATIONAL CONFERENCE RADIOELEKTRONIKA (RADIOELEKTRONIKA 2016), 2016, : 413 - 418
  • [27] Group feature selection with multiclass support vector machine
    Tang, Fengzhen
    Adam, Lukas
    Si, Bailu
    NEUROCOMPUTING, 2018, 317 : 42 - 49
  • [28] Large Margin Feature Selection for Support Vector Machine
    Pan, Wei
    Ma, Peijun
    Su, Xiaohong
    MECHANICAL ENGINEERING, MATERIALS SCIENCE AND CIVIL ENGINEERING, 2013, 274 : 161 - 164
  • [29] Support vector machine tree based on feature selection
    Xu, Qinzhen
    Pei, Wenjiang
    Yang, Luxi
    He, Zhenya
    NEURAL INFORMATION PROCESSING, PT 1, PROCEEDINGS, 2006, 4232 : 856 - 863
  • [30] Hyper-parameter Tuning for Quantum Support Vector Machine
    Demirtas, Fadime
    Tanyildizi, Erkan
    ADVANCES IN ELECTRICAL AND COMPUTER ENGINEERING, 2022, 22 (04) : 47 - 54