Online gradient descent learning algorithms

Cited by: 93
|
Authors
Ying, Yiming [1 ]
Pontil, Massimiliano [1 ]
Affiliations
[1] UCL, Dept Comp Sci, London WC1E 6BT, England
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
learning theory; online learning; reproducing kernel Hilbert space; gradient descent; error analysis;
DOI
10.1007/s10208-006-0237-y
CLC classification
TP301 [Theory, Methods];
Discipline code
081202;
Abstract
This paper considers the least-squares online gradient descent algorithm in a reproducing kernel Hilbert space (RKHS) without an explicit regularization term. We present a novel capacity-independent approach to deriving error bounds and convergence results for this algorithm. The essential element in our analysis is the interplay between the generalization error and a weighted cumulative error which we define in the paper. We show that, although the algorithm does not involve an explicit RKHS regularization term, choosing the step sizes appropriately yields error rates competitive with those in the literature.
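The update the abstract describes can be sketched as follows: at each step the hypothesis moves against the gradient of the pointwise squared loss, adding a new kernel term centered at the current example, with a decaying step size and no regularization term. This is a minimal illustrative sketch, not the paper's notation; the Gaussian kernel, the polynomial step-size schedule, and all names (`online_gd_rkhs`, `gaussian_kernel`, `eta0`, `theta`, `sigma`) are assumptions made for the example.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # RBF kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2)); choice of kernel is illustrative
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def online_gd_rkhs(stream, eta0=0.5, theta=0.5, sigma=1.0):
    """Unregularized least-squares online gradient descent in an RKHS (sketch).

    Update: f_{t+1} = f_t - eta_t * (f_t(x_t) - y_t) * K(x_t, .)
    with a polynomially decaying step size eta_t = eta0 * t^{-theta}
    (the schedule here is an assumption, not the paper's specific choice).
    f_t is stored through its kernel-expansion coefficients.
    """
    xs, coefs = [], []
    for t, (x, y) in enumerate(stream, start=1):
        # Evaluate the current hypothesis f_t(x) = sum_i coefs[i] * K(xs[i], x)
        fx = sum(c * gaussian_kernel(xi, x, sigma) for c, xi in zip(coefs, xs))
        eta_t = eta0 * t ** (-theta)
        # The gradient step appends one new kernel term centered at x_t
        xs.append(x)
        coefs.append(-eta_t * (fx - y))

    def predict(x):
        return sum(c * gaussian_kernel(xi, x, sigma) for c, xi in zip(coefs, xs))

    return predict

# Usage: learn y = sin(x) from a noisy stream of 400 examples
rng = np.random.default_rng(0)
stream = [(np.array([x]), np.sin(x) + 0.1 * rng.standard_normal())
          for x in rng.uniform(-3.0, 3.0, 400)]
f_hat = online_gd_rkhs(stream)
```

Note that, consistent with the abstract, the step-size decay alone controls the complexity of the iterates; there is no explicit RKHS-norm penalty in the update.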
Pages: 561-596
Page count: 36
Related papers
50 items total
  • [31] Tu, Cheng-Hao; Chen, Hong-You; Carlyn, David; Chao, Wei-Lun. Learning Fractals by Gradient Descent. THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 2, 2023: 2456-2464
  • [32] Sun, Tao; Tang, Ke; Li, Dongsheng. Gradient Descent Learning With Floats. IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (03): 1763-1771
  • [33] Sum, John; Leung, Chi-Sing; Ho, Kevin. A Limitation of Gradient Descent Learning. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (06): 2227-2232
  • [34] Cai, Jia; Wang, Hongyan; Zhou, Ding-Xuan. Gradient learning in a classification setting by gradient descent. JOURNAL OF APPROXIMATION THEORY, 2009, 161 (02): 674-692
  • [35] Ji, Jinlong; Chen, Xuhui; Wang, Qianlong; Yu, Lixing; Li, Pan. Learning to Learn Gradient Aggregation by Gradient Descent. PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019: 2614-2620
  • [36] Wang, Baobin; Hu, Ting. Distributed pairwise algorithms with gradient descent methods. NEUROCOMPUTING, 2019, 333: 364-373
  • [37] Shapiro, A; Wardi, Y. Convergence analysis of gradient descent stochastic algorithms. JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 1996, 91 (02): 439-454
  • [38] Suzuki, Yudai; Yano, Hiroshi; Raymond, Rudy; Yamamoto, Naoki. Normalized Gradient Descent for Variational Quantum Algorithms. 2021 IEEE INTERNATIONAL CONFERENCE ON QUANTUM COMPUTING AND ENGINEERING (QCE 2021) / QUANTUM WEEK 2021, 2021: 1-9
  • [39] Matei, Ion; Zhenirovskyy, Maksym; de Kleer, Johan; Maxwell, John. Sensitivity-Free Gradient Descent Algorithms. JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [40] Cai, Linzhe; Yu, Xinghuo; Li, Chaojie; Eberhard, Andrew; Lien Thuy Nguyen; Chuong Thai Doan. Impact of Mathematical Norms on Convergence of Gradient Descent Algorithms for Deep Neural Networks Learning. AI 2022: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, 13728: 131-144