MGPROX: A NONSMOOTH MULTIGRID PROXIMAL GRADIENT METHOD WITH ADAPTIVE RESTRICTION FOR STRONGLY CONVEX OPTIMIZATION

Times Cited: 0
Authors:
Ang, Andersen [1]
de Sterck, Hans [2]
Vavasis, Stephen [3]
Affiliations:
[1] Univ Southampton, Elect & Comp Sci, Southampton, England
[2] Univ Waterloo, Dept Appl Math, Waterloo, ON, Canada
[3] Univ Waterloo, Dept Combinator & Optimizat, Waterloo, ON, Canada
Funding:
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords:
multigrid; restriction; proximal gradient; subdifferential; convex optimization; obstacle problem; MONOTONE-OPERATORS; CONVERGENCE; ALGORITHM;
DOI:
10.1137/23M1552140
Chinese Library Classification:
O29 [Applied Mathematics];
Discipline Code:
070104;
Abstract:
We study the combination of proximal gradient descent with multigrid for solving a class of possibly nonsmooth strongly convex optimization problems. We propose a multigrid proximal gradient method, called MGProx, which accelerates the proximal gradient method with multigrid by exploiting the hierarchical structure of the optimization problem. MGProx applies a newly introduced adaptive restriction operator to simplify the Minkowski sum of subdifferentials of the nondifferentiable objective function across levels. We provide a theoretical characterization of MGProx. First, we show that the MGProx update operator exhibits a fixed-point property. Next, we show that the coarse correction is a descent direction for the fine variable of the original fine-level problem in the general nonsmooth case. Finally, under some assumptions we establish the convergence rate of the algorithm. In numerical tests on the elastic obstacle problem, an example of a nonsmooth convex optimization problem to which multigrid methods can be applied, we show that MGProx converges faster than competing methods.
Pages: 2788-2820
Number of Pages: 33
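
To make the two-level mechanism described in the abstract concrete, below is a minimal Python sketch, assuming a strongly convex quadratic fine-level objective with a nonnegativity constraint (an obstacle-type nonsmooth term). The Galerkin coarse problem, the fixed damping of the coarse correction, and all function names are illustrative assumptions; in particular, the sketch omits the paper's adaptive restriction operator and its stepsize and line-search details.

```python
import numpy as np

# Illustrative two-level multigrid-accelerated proximal gradient step
# (a toy sketch in the spirit of MGProx, not the authors' implementation).
# Problem: min_x 0.5*x^T A x - b^T x + g(x), g = indicator of {x >= 0}.

def prox_nonneg(x):
    """Proximal operator of the indicator of the nonnegative orthant."""
    return np.maximum(x, 0.0)

def prox_grad_step(x, A, b, step):
    """One proximal gradient step on 0.5*x^T A x - b^T x + g(x)."""
    return prox_nonneg(x - step * (A @ x - b))

def two_level_step(x, A, b, step, R, P, coarse_iters=20):
    """Pre-smoothing, coarse correction, post-smoothing (illustrative only).

    R: restriction (fine -> coarse), P: prolongation (coarse -> fine); here P = R.T.
    """
    # 1) Pre-smoothing: one proximal gradient step on the fine level.
    x = prox_grad_step(x, A, b, step)

    # 2) Galerkin coarse problem with a tau-style correction so that the coarse
    #    gradient at the restricted iterate equals the restricted fine gradient.
    #    (MGProx additionally uses an adaptive restriction for the nonsmooth term,
    #    which this toy sketch glosses over.)
    A_c = R @ A @ P
    x_c0 = R @ x
    b_c = A_c @ x_c0 - R @ (A @ x - b)

    # 3) Approximately solve the coarse problem with a few proximal gradient steps.
    step_c = 1.0 / np.linalg.norm(A_c, 2)
    x_c = x_c0.copy()
    for _ in range(coarse_iters):
        x_c = prox_nonneg(x_c - step_c * (A_c @ x_c - b_c))

    # 4) Prolong the coarse correction; a fixed damping of 0.5 stands in for the
    #    line search used in practice.
    x = prox_nonneg(x + 0.5 * (P @ (x_c - x_c0)))

    # 5) Post-smoothing on the fine level.
    return prox_grad_step(x, A, b, step)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 64
    # 1D Laplacian-like strongly convex quadratic.
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1) + 0.1 * np.eye(n)
    b = rng.standard_normal(n)
    # Full-weighting restriction to a grid of half the size.
    R = np.zeros((n // 2, n))
    for i in range(n // 2):
        R[i, 2 * i] = 0.5
        R[i, 2 * i + 1] = 0.5
    P = R.T
    x = np.zeros(n)
    step = 1.0 / np.linalg.norm(A, 2)
    for _ in range(50):
        x = two_level_step(x, A, b, step, R, P)
    print("objective:", 0.5 * x @ (A @ x) - b @ x)
```

The tau-style correction in step 2 is the standard way to make the coarse problem consistent with the fine one: at the restricted iterate, the coarse gradient reproduces the restricted fine gradient, so the prolonged coarse correction points in a direction informed by the fine-level problem.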