Local Linear Convergence of the ADMM/Douglas-Rachford Algorithms without Strong Convexity and Application to Statistical Imaging

Cited by: 32
Authors
Aspelmeier, Timo [1 ,2 ]
Charitha, C. [3 ]
Luke, D. Russell [3 ]
Affiliations
[1] Georg August Univ Gottingen, Inst Math Stochast, D-37077 Gottingen, Germany
[2] Georg August Univ Gottingen, Felix Bernstein Inst Math Stat Biosci, D-37077 Gottingen, Germany
[3] Georg August Univ Gottingen, Inst Numer & Angew Math, D-37083 Gottingen, Germany
Source
SIAM JOURNAL ON IMAGING SCIENCES | 2016, Vol. 9, No. 2
Keywords
augmented Lagrangian; ADMM; Douglas-Rachford; exact penalization; fixed point theory; image processing; inverse problems; metric regularity; statistical multiscale analysis; piecewise linear-quadratic; linear convergence; ALTERNATING DIRECTION METHOD; DOUGLAS-RACHFORD; PENALTY; APPROXIMATION; MULTIPLIERS; PROJECTIONS
DOI
10.1137/15M103580X
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We consider the problem of minimizing the sum of a convex function and a convex function composed with an injective linear mapping. For such problems, subject to a coercivity condition at fixed points of the corresponding Picard iteration, iterates of the alternating directions method of multipliers converge locally linearly to points from which the solution to the original problem can be computed. Our proof strategy uses duality and strong metric subregularity of the Douglas-Rachford fixed point mapping. Our analysis does not require strong convexity and yields error bounds to the set of model solutions. We show in particular that convex piecewise linear-quadratic functions naturally satisfy the requirements of the theory, guaranteeing eventual linear convergence of both the Douglas-Rachford algorithm and the alternating directions method of multipliers for this class of objectives under mild assumptions on the set of fixed points. We demonstrate this result on quantitative image deconvolution and denoising with multiresolution statistical constraints.
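For concreteness: the problem class is minimize_x f(x) + g(Mx) with f and g convex and M an injective linear mapping, and ADMM is the Douglas-Rachford algorithm applied to the dual of such a problem. The sketch below is an illustrative Python implementation, not the authors' code, of the Douglas-Rachford (Picard) iteration on a toy piecewise linear-quadratic instance (l1-regularized least squares, with M the identity); the matrix A, data b, and the parameters lam and gamma are assumptions chosen only for the demonstration. The theory predicts that, near a fixed point, the ratio of successive fixed-point residuals settles to a constant below one.

```python
# Minimal sketch (assumed setup, not the authors' code): Douglas-Rachford
# iteration for the piecewise linear-quadratic toy problem
#   minimize  0.5*||A x - b||^2 + lam*||x||_1,
# i.e., f(x) = 0.5*||A x - b||^2 (convex quadratic) and g = lam*||.||_1.
import numpy as np

rng = np.random.default_rng(0)
m, n = 60, 40
A = rng.standard_normal((m, n))          # full column rank with probability 1
x_true = np.zeros(n)
x_true[:5] = 1.0                         # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(m)
lam, gamma = 0.1, 1.0                    # regularization weight and prox step size

AtA, Atb = A.T @ A, A.T @ b
H = np.eye(n) + gamma * AtA              # prox_f solves (I + gamma A^T A) x = z + gamma A^T b

def prox_f(z):
    """Proximal map of gamma*f: a linear solve, since f is quadratic."""
    return np.linalg.solve(H, z + gamma * Atb)

def prox_g(z):
    """Proximal map of gamma*lam*||.||_1: soft thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)

z = np.zeros(n)
res = []
for _ in range(200):
    x = prox_f(z)                        # first resolvent step
    y = prox_g(2.0 * x - z)              # reflected second resolvent step
    z_new = z + y - x                    # Douglas-Rachford (Picard) update
    res.append(np.linalg.norm(z_new - z))
    z = z_new

# Local linear convergence shows up as a nearly constant ratio of
# successive fixed-point residuals once the support of x has settled.
print("final residual ratio:", res[-1] / res[-2])
```

In the paper's imaging applications the data-fidelity term involves a convolution operator and the regularizer encodes multiresolution statistical constraints rather than an l1 penalty; structurally, only the two proximal subproblems change.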
Pages: 842-868
Page count: 27
Related papers
32 items in total
  • [21] Optimal Rates of Linear Convergence of Relaxed Alternating Projections and Generalized Douglas-Rachford Methods for Two Subspaces
    Bauschke, Heinz H.
    Bello Cruz, J. Y.
    Nghia, Tran T. A.
    Phan, Hung M.
    Wang, Xianfu
    NUMERICAL ALGORITHMS, 2016, 73 (01) : 33 - 76
  • [23] Preconditioned Douglas-Rachford Algorithms for TV- and TGV-Regularized Variational Imaging Problems
    Bredies, Kristian
    Sun, Hong Peng
    JOURNAL OF MATHEMATICAL IMAGING AND VISION, 2015, 52 (03) : 317 - 344
  • [24] Linear Convergence for Distributed Optimization Without Strong Convexity
    Yi, Xinlei
    Zhang, Shengjun
    Yang, Tao
    Chai, Tianyou
    Johansson, Karl H.
    2020 59TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2020, : 3643 - 3648
  • [25] Linear Convergence of Forward-Backward Accelerated Algorithms Without Knowledge of the Modulus of Strong Convexity
    Li, Bowen
    Shi, Bin
    Yuan, Ya-Xiang
    SIAM JOURNAL ON OPTIMIZATION, 2024, 34 (02) : 2150 - 2168
  • [27] Convergence in quadratic mean of averaged stochastic gradient algorithms without strong convexity nor bounded gradient
    Godichon-Baggioni, Antoine
    STATISTICS, 2023, 57 (03) : 637 - 668
  • [29] Linear convergence of Frank-Wolfe for rank-one matrix recovery without strong convexity
    Garber, Dan
    MATHEMATICAL PROGRAMMING, 2023, 199 (1-2) : 87 - 121
  • [30] On Linear Convergence of Non-Euclidean Gradient Methods without Strong Convexity and Lipschitz Gradient Continuity
    Bauschke, Heinz H.
    Bolte, Jerome
    Chen, Jiawei
    Teboulle, Marc
    Wang, Xianfu
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2019, 182 (03) : 1068 - 1087