Adjusted support vector machines based on a new loss function

Cited: 0
Authors
Shuchun Wang
Wei Jiang
Kwok-Leung Tsui
Affiliations
[1] Golden Arc Capital, Inc.
[2] Stevens Institute of Technology, Department of Systems Engineering & Engineering Management
[3] Georgia Institute of Technology, School of Industrial & Systems Engineering
Source: Annals of Operations Research
Keywords
Classification error; Cross validation; Dispersion; Sampling bias
DOI: not available
Abstract
The support vector machine (SVM) has attracted considerable attention recently due to its successful applications in various domains. However, by maximizing the margin of separation between the two classes in a binary classification problem, SVM solutions often suffer from two serious drawbacks. First, the SVM separating hyperplane is usually very sensitive to the training sample, since it depends entirely on the support vectors, a small number of points located on the wrong side of their margin boundaries. Second, the separating hyperplane is equidistant from the two classes, which are treated as equally important when its location is optimized, regardless of the number of training points and their dispersion in each class. In this paper, we propose a new SVM solution, the adjusted support vector machine (ASVM), based on a new loss function that adjusts the SVM solution to account for the sample sizes and dispersions of the two classes. Numerical experiments show that ASVM outperforms the conventional SVM, especially when the two classes differ substantially in sample size and dispersion.
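The imbalance sensitivity described in the abstract can be illustrated with standard tools. The sketch below (Python with scikit-learn) uses per-class weighting of the hinge loss via `class_weight`; this is a stand-in illustration of adjusting the SVM for class sizes and dispersions, not the paper's ASVM loss function. The weights, class locations, and scales are assumptions chosen for the demonstration.

```python
# Illustration (not the paper's ASVM): re-weighting the hinge loss per class
# shifts the separating hyperplane that a plain soft-margin SVM places
# symmetrically between the classes, regardless of sample size or dispersion.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Class 0: large and widely dispersed; class 1: small and tightly clustered.
X0 = rng.normal(loc=[-1.0, 0.0], scale=2.0, size=(200, 2))
X1 = rng.normal(loc=[1.5, 0.0], scale=0.5, size=(20, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 20)

# Conventional SVM: both classes carry the same misclassification cost.
plain = SVC(kernel="linear", C=1.0).fit(X, y)
# Weighted SVM: up-weight errors on the small, tight minority class
# (the 10x factor is an arbitrary choice for this sketch).
weighted = SVC(kernel="linear", C=1.0, class_weight={0: 1.0, 1: 10.0}).fit(X, y)

# Compare the minority-class training error of the two hyperplanes.
err1_plain = float(np.mean(plain.predict(X1) != 1))
err1_weighted = float(np.mean(weighted.predict(X1) != 1))
print(f"minority error - plain: {err1_plain:.2f}, weighted: {err1_weighted:.2f}")
```

Up-weighting the minority class pushes the hyperplane away from it, trading some majority-class error for fewer minority-class mistakes; the ASVM paper instead builds this adjustment into the loss function itself.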
Pages: 83-101 (18 pages)
Related papers
50 results in total
  • [1] Adjusted support vector machines based on a new loss function
    Wang, Shuchun
    Jiang, Wei
    Tsui, Kwok-Leung
    ANNALS OF OPERATIONS RESEARCH, 2010, 174 (01) : 83 - 101
  • [2] A New Convex Loss Function For Multiple Instance Support Vector Machines
    Kim, Sang-Baeg
    Bae, Jung-Man
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 9023 - 9029
  • [3] Robust support vector machines based on the rescaled hinge loss function
    Xu, Guibiao
    Cao, Zheng
    Hu, Bao-Gang
    Principe, Jose C.
    PATTERN RECOGNITION, 2017, 63 : 139 - 148
  • [4] Function Approximation Based on Twin Support Vector Machines
    Yang, Chengfu
    Yi, Zhang
    Zuo, Lin
    2008 IEEE CONFERENCE ON CYBERNETICS AND INTELLIGENT SYSTEMS, VOLS 1 AND 2, 2008, : 752 - 757
  • [5] Support vector machines based on hybrid kernel function
    Dept. of Control Science and Engineering, Harbin Institute of Technology, Harbin 150001, China
    HARBIN GONGYE DAXUE XUEBAO, 2007, (11): 1704-1706
  • [6] Training Robust Support Vector Machine Based on a New Loss Function
    Liu, Ye-Qing
    JOURNAL OF DONGHUA UNIVERSITY (ENGLISH EDITION), 2015, 32 (02) : 261 - 263
  • [7] Training robust support vector machine based on a new loss function
    Liu, Ye-Qing
    JOURNAL OF DONGHUA UNIVERSITY (ENGLISH EDITION), 2015, 32 (02) : 261 - 263
  • [8] A Set of new kernel function for Support vector machines: An approach based on Chebyshev polynomials
    Jafarzadeh, Sara Zafar
    Aminian, Mohammad
    Efati, Sohrab
    PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON COMPUTER AND KNOWLEDGE ENGINEERING (ICCKE 2013), 2013, : 412 - 416
  • [9] A new smooth support vector regression based on ε-insensitive logistic loss function
    Yang, HZ
    Shao, XG
    Ding, F
    ADVANCES IN NATURAL COMPUTATION, PT 1, PROCEEDINGS, 2005, 3610 : 25 - 32
  • [10] Twin Support Vector Machines Based on the Mixed Kernel Function
    Wu, Fulin
    Ding, Shifei
    JOURNAL OF COMPUTERS, 2014, 9 (07) : 1690 - 1696