A generalized quadratic loss for support vector machines

Cited by: 0
|
Authors
Portera, F [1 ]
Sperduti, A [1 ]
Affiliation
[1] Univ Padua, Dipartimento Matemat Pura & Applicata, Padua, Italy
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The standard SVM formulation for binary classification is based on the Hinge loss function, which treats errors as uncorrelated. As a consequence, local information in the feature space that could improve the prediction model is disregarded. In this paper we address this problem by defining a generalized quadratic loss in which the co-occurrence of errors is weighted according to a kernel similarity measure in the feature space. In particular, the proposed approach weights pairs of errors according to the distribution of the related patterns in the feature space. The generalized quadratic loss also incorporates target information, so as to penalize errors on pairs of patterns that are similar and belong to the same class. We show that, when the co-occurrence error matrix is invertible, the resulting dual problem can be expressed as a hard-margin SVM in a different feature space. We compare our approach against a standard SVM on several binary classification tasks. Experimental results obtained for different instances of the co-occurrence error matrix on these problems seem to show an improvement in performance.
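The loss described in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' implementation: we assume an RBF kernel as the similarity measure and a hypothetical co-occurrence matrix S = diag(y) K diag(y), so that the quadratic form ξᵀSξ over the slack variables ξ charges extra for correlated errors on similar, same-class patterns (the paper's exact construction of S is not given in this record).

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise RBF similarities K_ij = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def cooccurrence_matrix(X, y, gamma=1.0):
    # Hypothetical instance of the co-occurrence error matrix: kernel
    # similarity weighted by target agreement y_i * y_j, so errors on
    # similar patterns of the same class are penalized together.
    # S = diag(y) K diag(y) is positive semidefinite; for an RBF kernel
    # on distinct points it is invertible, which is the condition under
    # which the abstract's hard-margin reformulation applies.
    K = rbf_kernel(X, gamma)
    return K * np.outer(y, y)

def generalized_quadratic_loss(xi, S):
    # xi: vector of slack variables, one per training pattern.
    return float(xi @ S @ xi)

# Toy data: two nearby points of class +1, one distant point of class -1.
X = np.array([[0.0, 0.0], [0.1, 0.0], [3.0, 3.0]])
y = np.array([1, 1, -1])
xi = np.array([0.5, 0.5, 0.0])  # correlated errors on the two similar points

S = cooccurrence_matrix(X, y, gamma=1.0)
loss = generalized_quadratic_loss(xi, S)
print(loss)  # exceeds the uncorrelated (diagonal-only) cost xi @ xi = 0.5
```

With an identity co-occurrence matrix the loss reduces to the ordinary sum of squared slacks; the off-diagonal kernel terms are what add the penalty for co-occurring errors on similar patterns.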
Pages: 628 - 632
Page count: 5
Related papers (50 in total)
  • [1] Support vector regression with a generalized quadratic loss
    Portera, Filippo
    Sperduti, Alessandro
    BIOLOGICAL AND ARTIFICIAL INTELLIGENCE ENVIRONMENTS, 2005, : 209 - 216
  • [2] Nonlinear Regularization Path for Quadratic Loss Support Vector Machines
    Karasuyama, Masayuki
    Takeuchi, Ichiro
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2011, 22 (10): : 1613 - 1625
  • [3] Generalized Twin Support Vector Machines
    H. Moosaei
    S. Ketabchi
    M. Razzaghi
    M. Tanveer
    Neural Processing Letters, 2021, 53 : 1545 - 1564
  • [5] Support Vector Machines with the Ramp Loss and the Hard Margin Loss
    Brooks, J. Paul
    OPERATIONS RESEARCH, 2011, 59 (02) : 467 - 479
  • [6] Efficient Feature Scaling for Support Vector Machines with a Quadratic Kernel
    Zhizheng Liang
    Ning Liu
    Neural Processing Letters, 2014, 39 : 235 - 246
  • [8] Design and simulation of support vector machines generalized observer
    School of Chemical Engineering and Environment, Beijing Institute of Technology, Beijing 100081, China
    [authors unknown]
    Shiyou Huagong Gaodeng Xuexiao Xuebao, 2008, 4 (95-98):
  • [9] Multiclass Generalized Eigenvalue Proximal Support Vector Machines
    Guarracino, Mario Rosario
    Irpino, Antonio
    Verde, Rosanna
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON COMPLEX, INTELLIGENT AND SOFTWARE INTENSIVE SYSTEMS (CISIS 2010), 2010, : 25 - 32
  • [10] Multiple instance learning with generalized support vector machines
    Andrews, S
    Hofmann, T
    Tsochantaridis, I
    EIGHTEENTH NATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI-02)/FOURTEENTH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE (IAAI-02), PROCEEDINGS, 2002, : 943 - 944