Adjusted support vector machines based on a new loss function

Cited by: 0
Authors
Shuchun Wang
Wei Jiang
Kwok-Leung Tsui
Affiliations
[1] Golden Arc Capital, Inc.
[2] Department of Systems Engineering & Engineering Management, Stevens Institute of Technology
[3] School of Industrial & Systems Engineering, Georgia Institute of Technology
Keywords
Classification error; Cross validation; Dispersion; Sampling bias
DOI
Not available
Abstract
The support vector machine (SVM) has attracted considerable attention recently due to its successful applications in various domains. However, by maximizing the margin of separation between the two classes in a binary classification problem, SVM solutions often suffer from two serious drawbacks. First, the SVM separating hyperplane is usually very sensitive to the training sample, since it depends strongly on the support vectors, which are only a few points located on the wrong side of the corresponding margin boundaries. Second, the separating hyperplane is equidistant from the two classes, which are treated as equally important when optimizing the hyperplane location, regardless of the number of training observations and their dispersion in each class. In this paper, we propose a new SVM solution, the adjusted support vector machine (ASVM), based on a new loss function that adjusts the SVM solution to account for the sample sizes and dispersions of the two classes. Numerical experiments show that ASVM outperforms the conventional SVM, especially when the two classes differ substantially in sample size and dispersion.
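The ASVM loss function itself is not reproduced in this record. As a rough, hypothetical sketch of the underlying idea only (compensating for unequal class sizes when locating the hyperplane), the Python snippet below fits a standard soft-margin SVM with per-class weights via scikit-learn; the synthetic data, the weighting scheme, and all names are assumptions for illustration, not the authors' ASVM.

# Hypothetical sketch (not the authors' ASVM): a standard soft-margin SVM
# with per-class weights, illustrating compensation for unequal class sizes.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic two-class data: a large, tightly dispersed negative class and a
# small, widely dispersed positive class (the setting the abstract discusses).
X_neg = rng.normal(loc=-1.0, scale=0.5, size=(500, 2))
X_pos = rng.normal(loc=1.0, scale=1.5, size=(50, 2))
X = np.vstack([X_neg, X_pos])
y = np.concatenate([-np.ones(500), np.ones(50)])

# class_weight="balanced" scales each class's hinge-loss penalty inversely to
# its frequency, so the larger class does not dominate the hyperplane location.
clf = SVC(kernel="linear", C=1.0, class_weight="balanced")
clf.fit(X, y)
print("w =", clf.coef_, "b =", clf.intercept_)

Unlike this frequency-based reweighting, the ASVM described in the abstract also accounts for the dispersion of each class, so the sketch should be read only as a baseline for the kind of adjustment being proposed.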
Pages: 83 - 101
Page count: 18
Related papers
50 items in total
  • [41] NEW ROBUST UNSUPERVISED SUPPORT VECTOR MACHINES
    Zhao, Kun
    Journal of Systems Science & Complexity, 2011, 24 (03) : 466 - 476
  • [42] New robust unsupervised support vector machines
    Zhao, Kun
    Zhang, Mingyu
    Deng, Naiyang
    JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY, 2011, 24 (03) : 466 - 476
  • [43] A new SMO algorithm for support vector machines
    Zhang, HR
    Wang, XD
    Wu, JB
    Zhang, CJ
    Xu, XL
    Wang, J
    PROGRESS IN INTELLIGENCE COMPUTATION & APPLICATIONS, 2005, : 305 - 311
  • [44] New robust unsupervised support vector machines
    Kun Zhao
    Mingyu Zhang
    Naiyang Deng
    Journal of Systems Science and Complexity, 2011, 24 : 466 - 476
  • [45] A NEW METHOD FOR LEARNING THE SUPPORT VECTOR MACHINES
    Cocianu, Catalina-Lucia
    State, Luminita
    Vlamos, Panayiotis
    ICSOFT 2011: PROCEEDINGS OF THE 6TH INTERNATIONAL CONFERENCE ON SOFTWARE AND DATABASE TECHNOLOGIES, VOL 2, 2011, : 365 - 370
  • [46] Bounded quantile loss for robust support vector machines-based classification and regression
    Zhang, Jiaqi
    Yang, Hu
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 242
  • [47] Simplify decision function of Reduced Support Vector Machines
    Li, YG
    Zhang, WD
    Wang, GL
    Cai, YZ
    MICAI 2005: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2005, 3789 : 435 - 442
  • [48] Nonlinear Regularization Path for Quadratic Loss Support Vector Machines
    Karasuyama, Masayuki
    Takeuchi, Ichiro
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2011, 22 (10) : 1613 - 1625
  • [49] Fast Online Training of Ramp Loss Support Vector Machines
    Wang, Zhuang
    Vucetic, Slobodan
    2009 9TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING, 2009, : 569 - 577
  • [50] On multicategory truncated-hinge-loss support vector machines
    Wu, Yichao
    Liu, Yufeng
    PREDICTION AND DISCOVERY, 2007, 443 : 49 - +