OPTIMAL STOCHASTIC APPROXIMATION ALGORITHMS FOR STRONGLY CONVEX STOCHASTIC COMPOSITE OPTIMIZATION, II: SHRINKING PROCEDURES AND OPTIMAL ALGORITHMS

Cited by: 112
Authors
Ghadimi, Saeed [1 ]
Lan, Guanghui [1 ]
Affiliation
[1] Univ Florida, Dept Ind & Syst Engn, Gainesville, FL 32611 USA
Funding
U.S. National Science Foundation;
Keywords
stochastic approximation; convex optimization; strong convexity; complexity; optimal method; large deviation;
DOI
10.1137/110848876
CLC Classification
O29 [Applied Mathematics];
Discipline code
070104;
Abstract
In this paper we study new stochastic approximation (SA) type algorithms, namely, the accelerated SA (AC-SA), for solving strongly convex stochastic composite optimization (SCO) problems. Specifically, by introducing a domain shrinking procedure, we significantly improve the large-deviation results associated with the convergence rate of a nearly optimal AC-SA algorithm presented by Ghadimi and Lan in [SIAM J. Optim., 22 (2012), pp. 1469-1492]. Moreover, we introduce a multistage AC-SA algorithm, which possesses an optimal rate of convergence for solving strongly convex SCO problems in terms of its dependence not only on the target accuracy, but also on a number of problem parameters and the selection of initial points. To the best of our knowledge, this is the first time that such an optimal method has been presented in the literature. Our computational results show that these AC-SA algorithms can substantially outperform the classical SA and some other SA type algorithms for solving certain classes of strongly convex SCO problems.
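The multistage idea described in the abstract follows a restart pattern common to multistage SA methods: run an inner stochastic method for a fixed budget, then restart from its output with a larger budget and smaller steps, exploiting strong convexity to contract the error at each stage. Below is a minimal, hedged sketch of that generic pattern. It is not the paper's AC-SA algorithm: the inner solver here is plain averaged stochastic gradient descent rather than an accelerated prox method, the objective is a synthetic quadratic standing in for an SCO problem, and all parameter choices (stage count, budget doubling, step halving) are illustrative assumptions.

```python
import numpy as np

# Hypothetical strongly convex objective f(x) = 0.5 * mu * ||x - x_star||^2,
# observed only through an unbiased noisy gradient oracle (stand-in for SCO).
rng = np.random.default_rng(0)
dim = 10
mu = 1.0            # strong convexity modulus (assumed known)
sigma = 0.5         # stochastic gradient noise level
x_star = rng.standard_normal(dim)

def stochastic_grad(x):
    # Unbiased noisy gradient of the quadratic above.
    return mu * (x - x_star) + sigma * rng.standard_normal(dim)

def inner_stage(x0, n_iters, step):
    # Averaged stochastic gradient stage; a placeholder for the accelerated
    # (AC-SA-type) inner solver used in the paper.
    x, x_avg = x0.copy(), np.zeros_like(x0)
    for _ in range(n_iters):
        x = x - step * stochastic_grad(x)
        x_avg += x / n_iters
    return x_avg

def multistage(x0, stages=6, n0=50):
    # Restart scheme: each stage doubles the iteration budget and halves the
    # step size, warm-starting from the previous stage's averaged iterate.
    x, n_iters, step = x0, n0, 1.0 / mu
    for s in range(stages):
        x = inner_stage(x, n_iters, step)
        print(f"stage {s}: error = {np.linalg.norm(x - x_star):.4f}")
        n_iters *= 2
        step /= 2
    return x

multistage(np.zeros(dim))
```

Running the sketch shows the per-stage error shrinking geometrically, which is the qualitative behavior a multistage restart scheme is designed to deliver under strong convexity; the precise optimal rates depend on the accelerated inner method analyzed in the paper.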
Pages: 2061-2089
Number of pages: 29