Sparse kernel logistic regression based on L1/2 regularization

Cited by: 0
Authors
XU Chen [1]
PENG ZhiMing [2]
JING WenFeng [2]
Affiliations
[1] Department of Statistics, University of British Columbia, Vancouver, BC V6T 1Z2, Canada
[2] Institute for Information and System Science, School of Mathematics and Statistics, Xi'an Jiaotong University
Funding
National Natural Science Foundation of China
Keywords
classification; L1/2 regularization; thresholding algorithm; kernel logistic regression; support vectors
DOI
None
CLC number
O212.1 [General mathematical statistics]
Abstract
Sparsity-driven classification technologies have attracted much attention in recent years due to their capability of providing more compressive representations and clearer interpretation. The two most popular classification approaches are support vector machines (SVMs) and kernel logistic regression (KLR), each having its own advantages. The sparsification of SVM has been well studied, and many sparse versions of the 2-norm SVM, such as the 1-norm SVM (1-SVM), have been developed. The sparsification of KLR, however, has been less studied. The existing sparsifications of KLR are mainly based on L1-norm and L2-norm penalties, which lead to sparse versions whose solutions are not as sparse as they could be. A recent study on L1/2 regularization theory in compressive sensing shows that L1/2 sparse modeling can yield solutions sparser than those based on the L1 and L2 norms, and, furthermore, that the model can be solved efficiently by a simple iterative thresholding procedure. The objective function treated in L1/2 regularization theory is, however, quadratic, so its gradient is linear in its variables (a so-called linear gradient function). In this paper, by extending the L1/2 regularization framework from linear gradient functions to the logistic function, we propose a novel sparse version of KLR: the 1/2 quasi-norm kernel logistic regression (1/2-KLR). This version integrates the advantages of KLR and L1/2 regularization and defines an efficient implementation scheme for sparse KLR. We suggest a fast iterative thresholding algorithm for 1/2-KLR and prove its convergence. We provide a series of simulations to demonstrate that 1/2-KLR often obtains sparser solutions than the existing sparsity-driven versions of KLR, at the same or better accuracy level. The conclusion also holds in comparison with sparse SVMs (1-SVM and 2-SVM). We show an exclusive advantage of 1/2-KLR: the regularization parameter in the algorithm can be set adaptively whenever the sparsity (correspondingly, the number of support vectors) is given, which suggests a methodology for comparing the sparsity-promotion capability of different sparsity-driven classifiers. As an illustration of the benefits of 1/2-KLR, we give two applications of 1/2-KLR in semi-supervised learning, showing that 1/2-KLR can be successfully applied to classification tasks in which only a few data points are labeled.
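To make the thresholding mechanism concrete, the following is a minimal sketch (in Python/NumPy) of the kind of iterative half-thresholding loop the abstract describes: a gradient step on the logistic loss followed by the closed-form L1/2 half-thresholding operator from the regularization theory cited in entry [22] below. The function names (half_threshold, half_klr), the fixed step size, and the iteration count are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def half_threshold(z, lam):
    """Componentwise L1/2 half-thresholding operator (theory of ref. [22]).

    Entries with |z_i| <= (54**(1/3)/4) * lam**(2/3) are set to zero;
    the survivors are shrunk by a closed-form cosine expression.
    """
    t = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    out = np.zeros_like(z)
    keep = np.abs(z) > t
    zk = z[keep]
    phi = np.arccos((lam / 8.0) * (np.abs(zk) / 3.0) ** (-1.5))
    out[keep] = (2.0 / 3.0) * zk * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

def half_klr(K, y, lam=0.1, step=0.5, n_iter=500):
    """Illustrative iterative thresholding for L1/2-regularized KLR.

    Approximately minimizes
        sum_i log(1 + exp(-y_i * (K @ alpha)_i)) + lam * ||alpha||_{1/2}^{1/2}
    by alternating a gradient step on the logistic loss with half-thresholding,
    which zeroes coefficients and thereby prunes support vectors.

    K: (n, n) kernel matrix; y: labels in {-1, +1}.
    """
    alpha = np.zeros(K.shape[0])
    for _ in range(n_iter):
        margins = np.clip(y * (K @ alpha), -50.0, 50.0)  # avoid exp overflow
        grad = -K.T @ (y / (1.0 + np.exp(margins)))      # logistic-loss gradient
        alpha = half_threshold(alpha - step * grad, lam * step)
    return alpha
```

The adaptive parameter setting claimed in the abstract fits naturally into such a scheme: if a target number k of support vectors is prescribed, lam can be re-chosen at each iteration from the (k+1)-th largest magnitude of the gradient-step iterate (lam * step = (sqrt(96)/9) * |z_(k+1)|^(3/2) in the theory of [22]), so that exactly k coefficients survive the threshold.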
Pages: 75 - 90
Number of pages: 16
Related papers
(50 items in total)
  • [21] Sparse Hopfield network reconstruction with l1 regularization
    Huang, Haiping
    EUROPEAN PHYSICAL JOURNAL B, 2013, 86 (11)
  • [22] L1/2 regularization
    Xu, ZongBen
    Zhang, Hai
    Wang, Yao
    Chang, XiangYu
    Liang, Yong
    SCIENCE CHINA INFORMATION SCIENCES, 2010, 53 : 1159 - 1169
  • [24] Application of L1/2 regularization logistic method in heart disease diagnosis
    Zhang, Bowen
    Chai, Hua
    Yang, Ziyi
    Liang, Yong
    Chu, Gejin
    Liu, Xiaoying
    BIO-MEDICAL MATERIALS AND ENGINEERING, 2014, 24 (06) : 3447 - 3454
  • [25] Sparse portfolio optimization via l1 over l2 regularization
    Wu, Zhongming
    Sun, Kexin
    Ge, Zhili
    Allen-Zhao, Zhihua
    Zeng, Tieyong
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2024, 319 (03) : 820 - 833
  • [26] Doubly Sparse Bayesian Kernel Logistic Regression
    Kojima, Atsushi
    Tanaka, Toshihisa
    2018 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC), 2018, : 977 - 982
  • [27] Structural damage identification based on substructure sensitivity and l1 sparse regularization
    Zhou, Shumei
    Bao, Yuequan
    Li, Hui
    SENSORS AND SMART STRUCTURES TECHNOLOGIES FOR CIVIL, MECHANICAL, AND AEROSPACE SYSTEMS 2013, 2013, 8692
  • [28] Robust censored regression with l1-norm regularization
    Beyhum, Jad
    Van Keilegom, Ingrid
    TEST, 2023, 32 (01) : 146 - 162
  • [29] Sparse minimal learning machines via l1/2 norm regularization
    Dias, Madson L. D.
    Freire, Ananda L.
    Souza Junior, Amauri H.
    da Rocha Neto, Ajalmar R.
    Gomes, Joao P. P.
    2018 7TH BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS), 2018, : 206 - 211
  • [30] Improved sparse reconstruction for fluorescence molecular tomography with L1/2 regularization
    Guo, Hongbo
    Yu, Jingjing
    He, Xiaowei
    Hou, Yuqing
    Dong, Fang
    Zhang, Shuling
    BIOMEDICAL OPTICS EXPRESS, 2015, 6 (05) : 1648 - 1664