Almost Sure Convergence and Non-Asymptotic Concentration Bounds for Stochastic Mirror Descent Algorithm

Cited by: 0
Authors
Paul, Anik Kumar [1 ]
Mahindrakar, Arun D. [1 ]
Kalaimani, Rachel K. [1 ]
Affiliations
[1] Indian Inst Technol Madras, Dept Elect Engn, Chennai 600036, India
Keywords
Mirror descent algorithm; almost sure convergence; concentration inequality; sub-Gaussian random vectors; optimization
DOI
10.1109/LCSYS.2024.3482148
CLC classification
TP [automation technology; computer technology]
Discipline code
0812
Abstract
This letter investigates the convergence and concentration properties of the Stochastic Mirror Descent (SMD) algorithm with biased stochastic subgradients. We establish almost sure convergence of the algorithm's iterates under the assumption of diminishing bias. Furthermore, we derive concentration bounds for the discrepancy between the iterates' function values and the optimal value under standard assumptions. Subsequently, leveraging the assumption of sub-Gaussian noise in the stochastic subgradients, we present refined concentration bounds for this discrepancy.
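To make the setting concrete, here is a minimal sketch (not the authors' implementation) of stochastic mirror descent on the probability simplex with the negative-entropy mirror map, which yields the exponentiated-gradient update. The oracle `grad_oracle`, step-size schedule `eta0 / sqrt(k)`, and problem instance below are illustrative assumptions, not details taken from the letter; a diminishing step size is used in the spirit of the diminishing-bias assumption.

```python
import numpy as np

def smd_simplex(grad_oracle, x0, steps, eta0):
    """Stochastic mirror descent on the probability simplex.

    Uses the negative-entropy mirror map, so each mirror step is a
    multiplicative (exponentiated-gradient) update followed by a
    renormalization, which is the Bregman projection onto the simplex.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        g = grad_oracle(x)            # noisy (possibly biased) subgradient
        eta = eta0 / np.sqrt(k)       # diminishing step size (assumed schedule)
        x = x * np.exp(-eta * g)      # mirror step for negative entropy
        x = x / x.sum()               # project back onto the simplex
    return x
```

As a sanity check, minimizing a linear objective `f(x) = <c, x>` over the simplex with a noisy gradient oracle should concentrate the iterate's mass on the coordinate with the smallest entry of `c`.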
Pages: 2397-2402
Page count: 6