Constrained Stochastic Gradient Descent for Large-scale Least Squares Problem

Cited by: 0
Authors
Mu, Yang [1 ]
Ding, Wei [1 ]
Zhou, Tianyi [2 ]
Tao, Dacheng [2 ]
Affiliations
[1] Univ Massachusetts, 100 Morrissey Blvd, Boston, MA 02125 USA
[2] Univ Technol Sydney, Ultimo, NSW 2007, Australia
Keywords
Stochastic optimization; Large-scale least squares; Online learning; Approximation; Algorithms
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The least squares problem is one of the most important regression problems in statistics, machine learning, and data mining. In this paper, we present the Constrained Stochastic Gradient Descent (CSGD) algorithm for solving the large-scale least squares problem. CSGD improves on Stochastic Gradient Descent (SGD) by imposing a provable constraint that the linear regression line passes through the mean point of all the data points. This constraint yields the optimal regret bound O(log T) and the fastest convergence speed among all first-order approaches. Empirical studies justify the effectiveness of CSGD by comparing it with SGD and other state-of-the-art approaches. An example is also given to show how to use CSGD to optimize SGD-based least squares problems and achieve better performance.
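The abstract describes the core idea of CSGD: run SGD on the least squares objective while forcing the fitted line to pass through the data mean. The following is a minimal sketch of that idea under simplifying assumptions, not the paper's exact algorithm; the function name, step-size schedule, and the choice of enforcing the constraint by re-solving the intercept after each update are all illustrative.

```python
import numpy as np

def constrained_sgd_least_squares(X, y, n_epochs=5, lr0=1.0):
    """Sketch of SGD for least squares where the hyperplane is kept
    through the mean point (x_bar, y_bar) after every update.
    Names and the projection step are illustrative assumptions."""
    n, d = X.shape
    x_bar, y_bar = X.mean(axis=0), y.mean()
    w = np.zeros(d)          # weight vector
    b = y_bar                # intercept chosen so the constraint holds initially
    t = 0
    for _ in range(n_epochs):
        for i in np.random.permutation(n):
            t += 1
            eta = lr0 / t                      # O(1/t) step size
            err = X[i] @ w + b - y[i]          # residual on one sample
            w -= eta * err * X[i]              # stochastic gradient step
            b -= eta * err
            # Enforce the constraint x_bar @ w + b = y_bar
            # (simplest choice: re-solve for the intercept).
            b = y_bar - x_bar @ w
    return w, b

# Usage on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.3 + 0.1 * rng.normal(size=1000)
w_hat, b_hat = constrained_sgd_least_squares(X, y)
print(w_hat, b_hat)
```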
Pages: 883-891 (9 pages)