Smooth Twin Support Vector Machines via Unconstrained Convex Minimization

Cited by: 13
Authors
Tanveer, M. [1 ]
Shubham, K. [2 ]
Affiliations
[1] Indian Inst Technol Indore, Discipline Math, Indore 453552, Madhya Pradesh, India
[2] LNM Inst Informat Technol, Dept Elect & Commun Engn, Jaipur 302031, Rajasthan, India
Keywords
Machine learning; Lagrangian support vector machines; Twin support vector machine; Smoothing techniques; Convex minimization; Finite Newton method
DOI
10.2298/FIL1708195T
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Twin support vector machine (TWSVM) exhibits fast training speed with better classification ability compared with standard SVM. However, it suffers from the following drawbacks: (i) the objective functions of TWSVM comprise only the empirical risk and thus may lead to overfitting and suboptimal solutions in some cases; (ii) two convex quadratic programming problems (QPPs) need to be solved, which is relatively complex to implement. To address these problems, we propose two smoothing approaches for implicit Lagrangian TWSVM classifiers by formulating a pair of unconstrained minimization problems in dual variables, whose solutions are obtained by solving two systems of linear equations rather than the two QPPs of TWSVM. Our proposed formulation introduces a regularization term into each objective function with the idea of maximizing the margin. In addition, this term makes the proposed formulation a well-posed model, since it ensures invertibility in the dual formulation. Moreover, the structural risk minimization principle is implemented in our formulation, which embodies the essence of statistical learning theory. Experimental results on several benchmark datasets show better performance of the proposed approach over existing approaches in terms of estimation accuracy with less training time.
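The abstract's central computational claim, replacing the two QPPs with two regularized linear systems whose coefficient matrices are invertible, can be illustrated with a least-squares-style twin SVM sketch. This mirrors the general linear-system idea only; it is not the paper's exact smoothed implicit-Lagrangian formulation, and all function names, parameter values, and the `eps` regularization below are illustrative assumptions:

```python
import numpy as np

def twin_svm_linear_systems(A, B, c1=1.0, c2=1.0, eps=1e-4):
    """Fit two nonparallel hyperplanes by solving two regularized
    linear systems instead of two QPPs (least-squares-style sketch).
    A: samples of class +1, B: samples of class -1.
    Returns z1 = [w1; b1] and z2 = [w2; b2]."""
    m1, m2 = A.shape[0], B.shape[0]
    E = np.hstack([A, np.ones((m1, 1))])   # augmented matrix [A  e]
    F = np.hstack([B, np.ones((m2, 1))])   # augmented matrix [B  e]
    I = np.eye(E.shape[1])

    # Hyperplane 1: close to class +1, at distance from class -1.
    # Minimizing ||E z||^2 + c1 ||F z + e||^2 + eps ||z||^2 gives the
    # normal equations (E'E + c1 F'F + eps I) z1 = -c1 F'e.
    z1 = -c1 * np.linalg.solve(E.T @ E + c1 * F.T @ F + eps * I,
                               F.T @ np.ones((m2, 1)))
    # Hyperplane 2: close to class -1, at distance from class +1.
    z2 = c2 * np.linalg.solve(F.T @ F + c2 * E.T @ E + eps * I,
                              E.T @ np.ones((m1, 1)))
    return z1.ravel(), z2.ravel()

def predict(X, z1, z2):
    """Assign each point to the class of the nearer hyperplane."""
    Xe = np.hstack([X, np.ones((X.shape[0], 1))])
    d1 = np.abs(Xe @ z1) / np.linalg.norm(z1[:-1])
    d2 = np.abs(Xe @ z2) / np.linalg.norm(z2[:-1])
    return np.where(d1 <= d2, 1, -1)
```

The `eps * I` term plays the role the abstract attributes to regularization: without it, `E.T @ E + c1 * F.T @ F` can be singular, whereas with it each system has a unique solution, making the model well-posed.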
Pages: 2195 - 2210
Number of pages: 16
Related Papers
50 records in total
  • [41] Domain Adaptation with Twin Support Vector Machines
    Xijiong Xie
    Shiliang Sun
    Huahui Chen
    Jiangbo Qian
    Neural Processing Letters, 2018, 48 : 1213 - 1226
  • [42] Probabilistic outputs for twin support vector machines
    Shao, Yuan-Hai
    Deng, Nai-Yang
    Yang, Zhi-Min
    Chen, Wei-Jie
    Wang, Zhen
    KNOWLEDGE-BASED SYSTEMS, 2012, 33 : 145 - 151
  • [43] Universal consistency of twin support vector machines
    Weixia Xu
    Dingjiang Huang
    Shuigeng Zhou
    International Journal of Machine Learning and Cybernetics, 2021, 12 : 1867 - 1877
  • [44] Quantum speedup of twin support vector machines
    Zekun Ye
    Lvzhou Li
    Haozhen Situ
    Yuyi Wang
    Science China Information Sciences, 2020, 63
  • [46] Intuitionistic Fuzzy Twin Support Vector Machines
    Rezvani, Salim
    Wang, Xizhao
    Pourpanah, Farhad
    IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2019, 27 (11) : 2140 - 2151
  • [47] Multiple Instance Twin Support Vector Machines
    Shao, Yuan-Hai
    Yang, Zhi-Xia
    Wang, Xiao-Bo
    Deng, Nai-Yang
    OPERATIONS RESEARCH AND ITS APPLICATIONS, 2010, 12 : 433 - +
  • [49] Twin support vector machines for pattern classification
    Jayadeva
    Khemchandani, R.
    Chandra, Suresh
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2007, 29 (05) : 905 - 910
  • [50] Sparse pinball twin support vector machines
    Tanveer, M.
    Tiwari, Aruna
    Choudhary, Rahul
    Jalan, Sanchit
    APPLIED SOFT COMPUTING, 2019, 78 : 164 - 175