Efficient and Accelerated Online Learning for Sparse Group Lasso

Cited by: 3
Authors
Li Zhi-Jie [1 ]
Li Yuan-Xiang [1 ]
Wang Feng [1 ]
Yu Fei [1 ]
Xiang Zheng-Long [1 ]
Affiliations
[1] Wuhan Univ, State Key Lab Software Engn, Wuhan 430072, Peoples R China
Keywords
group lasso; sparsity; online learning; dual averaging method; accelerated convergence; SELECTION; REGRESSION;
DOI
10.1109/ICDMW.2014.94
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Batch-mode group lasso algorithms suffer from inefficiency and poor scalability, so online learning algorithms for group lasso are a promising tool for attacking large-scale problems. However, the low per-iteration time complexity of current online algorithms is often accompanied by a slow convergence rate, and a faster convergence rate is key to making online learning algorithms effective. We develop a novel accelerated online learning algorithm for the sparse group lasso model, which achieves sparsity at both the group level and the individual feature level. By adopting the dual averaging method, the worst-case time complexity and memory cost of each iteration are both O(d), where d is the number of dimensions. Moreover, our online algorithm is accelerated: its theoretical convergence rate is O(1/T^2) after T steps. Experimental results on synthetic and real-world datasets demonstrate the merits of the proposed online algorithm for sparse group lasso.
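As a rough illustration of the O(d) per-iteration cost described above, the Python sketch below gives the closed-form dual-averaging (RDA-style) update for the sparse group lasso regularizer lam1*||w||_1 + lam2*sum_g ||w_g||_2. It shows only the plain, non-accelerated step; the names and parameters (avg_grad, groups, lam1, lam2, gamma) are illustrative assumptions, not the paper's notation, and the paper's accelerated O(1/T^2) variant would additionally maintain momentum-style auxiliary sequences that are omitted here.

    import numpy as np

    def sgl_rda_step(avg_grad, groups, lam1, lam2, gamma, t):
        """Closed-form minimizer (illustrative sketch) of
            <avg_grad, w> + lam1*||w||_1 + lam2*sum_g ||w_g||_2
                          + (gamma / sqrt(t)) * 0.5 * ||w||^2.
        Cost is O(d): one elementwise soft-threshold, then one shrinkage per group."""
        scale = np.sqrt(t) / gamma                  # inverse of the prox weight gamma/sqrt(t)
        # feature-level soft-thresholding (lasso part)
        u = np.sign(-avg_grad) * np.maximum(np.abs(avg_grad) - lam1, 0.0)
        w = np.zeros_like(avg_grad)
        for g in groups:                            # group-level shrinkage (group lasso part)
            norm_g = np.linalg.norm(u[g])
            if norm_g > lam2:
                w[g] = scale * (1.0 - lam2 / norm_g) * u[g]
        return w

    # toy usage: 6 features split into two groups of 3
    rng = np.random.default_rng(0)
    g_bar = rng.normal(size=6)                      # running average of subgradients
    w = sgl_rda_step(g_bar, groups=[np.arange(0, 3), np.arange(3, 6)],
                     lam1=0.1, lam2=0.5, gamma=1.0, t=10)
    print(w)  # groups whose thresholded norm falls below lam2 come out exactly zero

The elementwise soft-threshold yields sparsity at the individual feature level, and the subsequent group-norm shrinkage yields sparsity at the group level, matching the two-level sparsity claimed in the abstract.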
Pages: 1171-1177
Number of pages: 7
Related Papers
50 items in total
  • [1] Accelerated Block Coordinate Descent for Sparse Group Lasso
    Catalina, Alejandro
    Alaiz, Carlos M.
    Dorronsoro, Jose R.
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018,
  • [2] A Sparse-Group Lasso
    Simon, Noah
    Friedman, Jerome
    Hastie, Trevor
    Tibshirani, Robert
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2013, 22 (02) : 231 - 245
  • [3] Classification With the Sparse Group Lasso
    Rao, Nikhil
    Nowak, Robert
    Cox, Christopher
    Rogers, Timothy
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2016, 64 (02) : 448 - 463
  • [4] Fast Sparse Group Lasso
    Ida, Yasutoshi
    Fujiwara, Yasuhiro
    Kashima, Hisashi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [5] An Iterative Sparse-Group Lasso
    Laria, Juan C.
    Carmen Aguilera-Morillo, M.
    Lillo, Rosa E.
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2019, 28 (03) : 722 - 731
  • [6] Sparse Damage Detection with Complex Group Lasso and Adaptive Complex Group Lasso
    Dimopoulos, Vasileios
    Desmet, Wim
    Deckers, Elke
    SENSORS, 2022, 22 (08)
  • [7] HYPERSPECTRAL UNMIXING WITH SPARSE GROUP LASSO
    Iordache, Marian-Daniel
    Bioucas-Dias, Jose M.
    Plaza, Antonio
    2011 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2011, : 3586 - 3589
  • [8] Adaptive Group Sparse Multi-task Learning via Trace Lasso
    Liu, Sulin
    Pan, Sinno Jialin
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2358 - 2364
  • [9] A group lasso based sparse KNN classifier
    Zheng, Shuai
    Ding, Chris
    PATTERN RECOGNITION LETTERS, 2020, 131 (131) : 227 - 233
  • [10] Adaptive sparse group LASSO in quantile regression
    Mendez-Civieta, Alvaro
    Aguilera-Morillo, M. Carmen
    Lillo, Rosa E.
    ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2021, 15 (03) : 547 - 573