The Improved Stochastic Fractional Order Gradient Descent Algorithm

Cited by: 2
Authors
Yang, Yang [1]; Mo, Lipo [1,2]; Hu, Yusen [1]; Long, Fei [3]
Affiliations
[1] Beijing Technol & Business Univ, Sch Math & Stat, Beijing 100048, Peoples R China
[2] Beijing Technol & Business Univ, Sch Future Technol, Beijing 100048, Peoples R China
[3] Guizhou Inst Technol, Sch Artificial Intelligence & Elect Engn, Special Key Lab Artificial Intelligence & Intellig, Guiyang 550003, Peoples R China
Keywords
machine learning; fractional calculus; stochastic gradient descent; convex optimization; online optimization; neural networks
DOI: 10.3390/fractalfract7080631
Chinese Library Classification
O1 [Mathematics]
Discipline Classification Codes
0701; 070101
Abstract
This paper proposes improved stochastic gradient descent (SGD) algorithms with a fractional-order gradient for the online optimization problem. For three scenarios, namely a standard learning rate, an adaptive gradient learning rate, and a momentum learning rate, three new SGD algorithms are designed by combining SGD with a fractional-order gradient, and the corresponding regret functions are shown to converge at a sub-linear rate. The impact of the fractional order on convergence and monotonicity is then analyzed, and it is proved that better performance can be obtained by adjusting the order of the fractional gradient. Finally, several practical examples verify the superiority and validity of the proposed algorithms.
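The abstract does not give the explicit update rules, so the following is only a minimal sketch of the core idea: plain SGD with the gradient replaced by a fractional-order gradient of order alpha. The approximation used here, g * |x - x_prev|^(1 - alpha) / Gamma(2 - alpha), is a common Caputo-style first-order truncation from the fractional gradient descent literature, not necessarily the paper's exact rule, and the adaptive-gradient and momentum variants are omitted.

import numpy as np
from math import gamma

def fractional_sgd(grad_fn, x0, alpha=0.9, lr=0.05, n_steps=500, eps=1e-8):
    # Sketch of SGD with a fractional-order gradient, alpha in (0, 1].
    # Assumption: the Caputo fractional gradient is approximated by its
    # first-order truncation g * |x - x_prev|**(1 - alpha) / Gamma(2 - alpha);
    # the paper's own update rule may differ.
    x = np.asarray(x0, dtype=float).copy()
    x_prev = x.copy()
    for _ in range(n_steps):
        g = grad_fn(x)  # stochastic gradient sampled at the current iterate
        # eps keeps the step nonzero at the first iteration (x == x_prev).
        scale = (np.abs(x - x_prev) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
        x_prev = x.copy()
        x = x - lr * g * scale  # fractional-order SGD step
    return x

# Usage: minimize f(x) = 0.5 * ||x - 1||^2 from noisy gradient samples.
rng = np.random.default_rng(0)
noisy_grad = lambda x: (x - 1.0) + 0.1 * rng.standard_normal(x.shape)
print(fractional_sgd(noisy_grad, np.zeros(3), alpha=0.9))

Note that with alpha = 1 the scaling factor reduces to 1/Gamma(1) = 1 and the update becomes ordinary SGD, which illustrates why tuning the fractional order interpolates around the integer-order method, as discussed in the abstract.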
Pages: 16
Related Papers (50 records in total; items [31]-[40] shown)
  • [31] Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm
    Needell, Deanna; Srebro, Nathan; Ward, Rachel
    Mathematical Programming, 2016, 155(1-2): 549-573
  • [32] FTSGD: An Adaptive Stochastic Gradient Descent Algorithm for Spark MLlib
    Zhang, Hong; Liu, Zixia; Huang, Hai; Wang, Liqiang
    2018 16th IEEE Int. Conf. on Dependable, Autonomic and Secure Computing / Pervasive Intelligence and Computing / Big Data Intelligence and Computing / Cyber Science and Technology Congress (DASC/PiCom/DataCom/CyberSciTech), 2018: 828-835
  • [34] A Novel Stochastic Gradient Descent Algorithm for Learning Principal Subspaces
    Le Lan, Charline; Greaves, Joshua; Farebrother, Jesse; Rowland, Mark; Pedregosa, Fabian; Agarwal, Rishabh; Bellemare, Marc
    arXiv preprint, 2022
  • [35] Stochastic Gradient Descent, Weighted Sampling, and the Randomized Kaczmarz Algorithm
    Needell, Deanna; Srebro, Nathan; Ward, Rachel
    Advances in Neural Information Processing Systems 27 (NIPS 2014), 2014
  • [36] Performance of stochastic parallel gradient descent algorithm in coherent combination
    Li, X.; He, Y.
    Chinese Optical Society, (36)
  • [37] A Stochastic Gradient Descent Algorithm Based on Adaptive Differential Privacy
    Deng, Yupeng; Li, Xiong; He, Jiabei; Liu, Yuzhen; Liang, Wei
    Collaborative Computing: Networking, Applications and Worksharing (CollaborateCom 2022), Part II, 2022, 461: 133-152
  • [38] Numerical and experimental study on coherent beam combining using an improved stochastic parallel gradient descent algorithm
    Song, Jikun; Li, Yuanyang; Che, Dongbo; Wang, Tingfeng
    Laser Physics, 2020, 30(08)
  • [39] Stochastic parallel gradient descent algorithm for adaptive optics system
    Ma, H.; Zhang, P.; Zhang, J.; Fan, C.; Wang, Y.
    Qiangjiguang Yu Lizishu / High Power Laser and Particle Beams, 2010, 22(06): 1206-1210
  • [40] A Novel Stochastic Gradient Descent Algorithm for Learning Principal Subspaces
    Le Lan, Charline; Greaves, Joshua; Farebrother, Jesse; Rowland, Mark; Pedregosa, Fabian; Agarwal, Rishabh; Bellemare, Marc
    International Conference on Artificial Intelligence and Statistics (AISTATS), Vol. 206, 2023