Accelerating Stochastic Variance Reduced Gradient Using Mini-Batch Samples on Estimation of Average Gradient

Times Cited: 1
Authors
Huang, Junchu [1 ]
Zhou, Zhiheng [1 ]
Xu, Bingyuan [1 ]
Huang, Yu [1 ]
Affiliations
[1] South China Univ Technol, Sch Elect & Informat Engn, Guangzhou, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Optimization algorithms; Stochastic gradient descent; Machine learning;
DOI
10.1007/978-3-319-59072-1_41
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Stochastic gradient descent (SGD) is popular for large-scale optimization but converges slowly. To remedy this, stochastic variance reduced gradient (SVRG) was proposed: it uses the average gradient over the full data set to reduce the variance of the stochastic updates. Because computing this average gradient is expensive, it is held fixed for m iterations, where m is set to the same order as the data size. For large-scale problems this hurts efficiency, since the stale estimate of the average gradient may no longer be accurate. We propose estimating the average gradient from a mini-batch of samples, a method we call stochastic mini-batch variance reduced gradient (SMVRG). SMVRG greatly reduces the cost of estimating the average gradient, so the estimate can be refreshed frequently and therefore kept more accurate. Numerical experiments show the effectiveness of our method in terms of both convergence rate and computational cost.
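As a concrete illustration of the update the abstract describes, below is a minimal NumPy sketch of SVRG-style steps with a mini-batch estimate of the average gradient. The least-squares objective, the function names (smvrg, g), and all hyperparameters (step size, batch size, inner-loop length m) are illustrative assumptions, not the paper's exact algorithm or settings.

```python
import numpy as np

def smvrg(X, y, eta=0.05, batch=64, m=100, epochs=20, seed=0):
    """SVRG-style updates with a mini-batch estimate of the average
    gradient (the SMVRG idea), sketched for least squares
    f_i(w) = 0.5 * (x_i . w - y_i)^2. Hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    g = lambda w, i: X[i] * (X[i] @ w - y[i])  # gradient of one sample
    for _ in range(epochs):
        w_snap = w.copy()
        # Mini-batch estimate of the average gradient at the snapshot,
        # instead of a full pass over all n samples as in plain SVRG;
        # this keeps each refresh cheap so it can be done frequently.
        B = rng.choice(n, size=min(batch, n), replace=False)
        mu = X[B].T @ (X[B] @ w_snap - y[B]) / len(B)
        for _ in range(m):
            i = rng.integers(n)
            # variance-reduced step: per-sample gradient, corrected by the
            # same sample's gradient at the snapshot plus the estimated mean
            w -= eta * (g(w, i) - g(w_snap, i) + mu)
    return w

# toy usage: recover a random linear model
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 10))
w_true = rng.normal(size=10)
y = X @ w_true
print(np.linalg.norm(smvrg(X, y) - w_true))
```

In plain SVRG, mu would be the exact average gradient over all n samples, recomputed once per outer loop; the sketch swaps in a mini-batch estimate so that refreshing mu every m iterations stays cheap even when n is large.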
Pages: 346-353
Page count: 8
Related Papers
50 records in total
  • [1] Accelerating Stochastic Gradient Descent using Adaptive Mini-Batch Size
    Alsadi, Muayyad Saleh
    Ghnemat, Rawan
    Awajan, Arafat
    2019 2ND INTERNATIONAL CONFERENCE ON NEW TRENDS IN COMPUTING SCIENCES (ICTCS), 2019, : 393 - 399
  • [2] A mini-batch stochastic conjugate gradient algorithm with variance reduction
    Kou, Caixia
    Yang, Han
    JOURNAL OF GLOBAL OPTIMIZATION, 2023, 87 (2-4) : 1009 - 1025
  • [3] A MINI-BATCH STOCHASTIC GRADIENT METHOD FOR SPARSE LEARNING TO RANK
    Cheng, Fan
    Wang, Dongliang
    Zhang, Lei
    Su, Yansen
    Qiu, Jianfeng
    Suo, Yi
    INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2018, 14 (04): : 1207 - 1221
  • [4] An adaptive mini-batch stochastic gradient method for AUC maximization
    Cheng, Fan
    Zhang, Xia
    Zhang, Chuang
    Qiu, Jianfeng
    Zhang, Lei
    NEUROCOMPUTING, 2018, 318 : 137 - 150
  • [5] Stronger Adversarial Attack: Using Mini-batch Gradient
    Yu, Lin
    Deng, Ting
    Zhang, Wenxiang
    Zeng, Zhigang
    2020 12TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2020, : 364 - 370
  • [6] A Framework of Convergence Analysis of Mini-batch Stochastic Projected Gradient Methods
    Gu, Jian
    Xiao, Xian-Tao
    JOURNAL OF THE OPERATIONS RESEARCH SOCIETY OF CHINA, 2023, 11 (02) : 347 - 369
  • [7] Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting
    Konecny, Jakub
    Liu, Jie
    Richtarik, Peter
    Takac, Martin
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2016, 10 (02) : 242 - 255
  • [8] Accelerating variance-reduced stochastic gradient methods
    Driggs, Derek
    Ehrhardt, Matthias J.
    Schonlieb, Carola-Bibiane
    MATHEMATICAL PROGRAMMING, 2022, 191 (02) : 671 - 715