Stochastic Variance Reduction for Variational Inequality Methods

Cited by: 0
Authors
Alacaoglu, Ahmet [1 ]
Malitsky, Yura [2 ]
Affiliations
[1] Univ Wisconsin, Madison, WI 53706 USA
[2] Linkoping Univ, Linkoping, Sweden
Source
Funding
European Research Council;
Keywords
Variational inequality; extragradient; stochastic methods; variance reduction; oracle complexity; BACKWARD SPLITTING METHOD; EXTRAGRADIENT METHOD; CONVERGENCE; ALGORITHMS;
DOI
None available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
We propose stochastic variance-reduced algorithms for solving convex-concave saddle point problems, monotone variational inequalities, and monotone inclusions. Our framework applies to the extragradient, forward-backward-forward, and forward-reflected-backward methods, in both Euclidean and Bregman setups. All proposed methods converge in the same setting as their deterministic counterparts and either match or improve upon the best-known complexities for solving structured min-max problems. Our results reinforce the correspondence between variance reduction in variational inequalities and in minimization. We also illustrate the improvements of our approach with numerical evaluations on matrix games.
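To make the abstract's central idea concrete, the following is a minimal sketch (not the paper's exact algorithm) of an SVRG-style variance-reduced extragradient method. It is applied to a regularized bilinear saddle point problem rather than a constrained matrix game, so that the unique solution is (0, 0) and no projection is needed; all parameter values (`lam`, `eta`, `snapshot_every`) and the column/row sampling scheme are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((n, n))
lam = 1.0            # regularization -> strongly monotone operator, solution (0, 0)
eta = 0.005          # small step size, since the stochastic estimator adds variance
T = 6000
snapshot_every = 50  # how often the full operator is recomputed

def F(x, y):
    """Monotone operator of the regularized bilinear game
       min_x max_y  x^T A y + (lam/2)||x||^2 - (lam/2)||y||^2."""
    return A @ y + lam * x, -A.T @ x + lam * y

x = rng.standard_normal(n)
y = rng.standard_normal(n)
wx, wy = x.copy(), y.copy()   # snapshot point
Fw_x, Fw_y = F(wx, wy)        # full (expensive) operator at the snapshot

def vr_estimate(x, y):
    """SVRG-style estimator: full operator at the snapshot plus a sampled
       low-cost correction; unbiased since E[n * A[:, j] * v[j]] = A @ v
       for j drawn uniformly from {0, ..., n-1}."""
    j = rng.integers(n)       # sampled column for the x-block
    i = rng.integers(n)       # sampled row for the y-block
    gx = Fw_x + n * A[:, j] * (y[j] - wy[j]) + lam * (x - wx)
    gy = Fw_y - n * A[i, :] * (x[i] - wx[i]) + lam * (y - wy)
    return gx, gy

d0 = np.sqrt(x @ x + y @ y)   # initial distance to the solution (0, 0)
for t in range(T):
    if t % snapshot_every == 0:
        wx, wy = x.copy(), y.copy()
        Fw_x, Fw_y = F(wx, wy)
    gx, gy = vr_estimate(x, y)         # step 1: extrapolate to a midpoint
    xb, yb = x - eta * gx, y - eta * gy
    gx, gy = vr_estimate(xb, yb)       # step 2: fresh estimate at the midpoint
    x, y = x - eta * gx, y - eta * gy

d1 = np.sqrt(x @ x + y @ y)
print(f"distance to solution: {d0:.3f} -> {d1:.6f}")
```

The correction term vanishes as the iterate approaches the snapshot, which is what lets such methods keep cheap per-iteration cost while still converging under the same monotonicity assumptions as the deterministic extragradient method.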
Pages: 778-816 (39 pages)