Smooth Twin Support Vector Machines via Unconstrained Convex Minimization

Cited by: 13
Authors:
Tanveer, M. [1 ]
Shubham, K. [2 ]
Affiliations:
[1] Indian Inst Technol Indore, Discipline Math, Indore 453552, Madhya Pradesh, India
[2] LNM Inst Informat Technol, Dept Elect & Commun Engn, Jaipur 302031, Rajasthan, India
Keywords:
Machine learning; Lagrangian support vector machines; Twin support vector machine; Smoothing techniques; Convex minimization; Finite Newton method
DOI:
10.2298/FIL1708195T
Chinese Library Classification:
O29 [Applied Mathematics]
Discipline Code:
070104
Abstract:
The twin support vector machine (TWSVM) exhibits fast training speed and better classification ability than the standard SVM. However, it suffers from the following drawbacks: (i) the objective functions of TWSVM consist of the empirical risk only, so the model may suffer from overfitting and a suboptimal solution in some cases; (ii) two convex quadratic programming problems (QPPs) need to be solved, which is relatively complex to implement. To address these problems, we propose two smoothing approaches for implicit Lagrangian TWSVM classifiers by formulating a pair of unconstrained minimization problems in dual variables, whose solutions are obtained by solving two systems of linear equations rather than the two QPPs of TWSVM. Our formulation introduces a regularization term into each objective function with the idea of maximizing the margin. In addition, this term makes the proposed model well-posed, since it ensures invertibility in the dual formulation. Moreover, the structural risk minimization principle is implemented in our formulation, which embodies the essence of statistical learning theory. Experimental results on several benchmark datasets show that the proposed approach outperforms existing approaches in terms of estimation accuracy while requiring less training time.
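For context, the two QPPs that standard TWSVM solves (with A holding the class +1 samples, B the class -1 samples, e_1 and e_2 vectors of ones, and c_1, c_2 > 0 trade-off parameters) are:

    \min_{w_1, b_1, q} \; \tfrac{1}{2}\|A w_1 + e_1 b_1\|^2 + c_1 e_2^\top q \quad \text{s.t.} \quad -(B w_1 + e_2 b_1) + q \ge e_2,\; q \ge 0,
    \min_{w_2, b_2, q} \; \tfrac{1}{2}\|B w_2 + e_2 b_2\|^2 + c_2 e_1^\top q \quad \text{s.t.} \quad (A w_2 + e_1 b_2) + q \ge e_1,\; q \ge 0.

The Python/NumPy sketch below only illustrates the general idea of replacing these QPPs by two regularized linear systems. It follows a least-squares-style route (equality constraints plus a Tikhonov term for invertibility), not the paper's exact implicit Lagrangian smoothing or Newton iteration; the function names and the reg parameter are illustrative assumptions.

    import numpy as np

    def twin_hyperplanes(A, B, c1=1.0, c2=1.0, reg=1e-4):
        # Illustrative least-squares-style twin SVM with a linear kernel.
        # A: (m1, n) samples of class +1; B: (m2, n) samples of class -1.
        # Returns the two non-parallel hyperplanes (w1, b1) and (w2, b2).
        # NOTE: a sketch of the "two linear systems" idea, not the authors'
        # smoothing / implicit Lagrangian formulation.
        m1, n = A.shape
        m2, _ = B.shape
        E = np.hstack([A, np.ones((m1, 1))])   # augmented class +1 matrix
        F = np.hstack([B, np.ones((m2, 1))])   # augmented class -1 matrix

        # Plane close to class +1, far from class -1:
        # minimize ||E z1||^2 + c1 ||F z1 + e2||^2 + reg ||z1||^2.
        H1 = E.T @ E + c1 * (F.T @ F) + reg * np.eye(n + 1)
        z1 = np.linalg.solve(H1, -c1 * F.T @ np.ones(m2))

        # Plane close to class -1, far from class +1:
        # minimize ||F z2||^2 + c2 ||E z2 - e1||^2 + reg ||z2||^2.
        H2 = F.T @ F + c2 * (E.T @ E) + reg * np.eye(n + 1)
        z2 = np.linalg.solve(H2, c2 * E.T @ np.ones(m1))

        return (z1[:n], z1[n]), (z2[:n], z2[n])

    def predict(X, plane1, plane2):
        # Assign each row of X to the class whose hyperplane is nearer.
        (w1, b1), (w2, b2) = plane1, plane2
        d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
        d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
        return np.where(d1 <= d2, 1, -1)

With A and B taken as the rows of the training matrix belonging to each class, plane1 and plane2 play the roles of the two non-parallel TWSVM hyperplanes, and the reg * np.eye(n + 1) term mirrors the regularization that the abstract credits with making the dual formulation invertible.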
Pages: 2195-2210
Page count: 16