Tight global linear convergence rate bounds for Douglas-Rachford splitting

Cited by: 46
Author
Giselsson, Pontus [1]
Affiliation
[1] Lund Univ, Dept Automat Control, Box 118, SE-22100 Lund, Sweden
Keywords
Douglas-Rachford splitting; Linear convergence; Monotone operators; Fixed-point iterations; ALTERNATING DIRECTION METHOD; PROJECTIONS; MULTIPLIERS; ALGORITHMS; ADMM;
DOI
10.1007/s11784-017-0417-1
CLC number
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
Recently, several authors have shown local and global convergence rate results for Douglas-Rachford splitting under strong monotonicity, Lipschitz continuity, and cocoercivity assumptions. Most of these focus on the convex optimization setting. In the more general monotone inclusion setting, Lions and Mercier showed a linear convergence rate bound under the assumption that one of the two operators is strongly monotone and Lipschitz continuous. We show that this bound is not tight, meaning that no problem from the considered class converges exactly with that rate. In this paper, we present tight global linear convergence rate bounds for that class of problems. We also provide tight linear convergence rate bounds under the assumptions that one of the operators is strongly monotone and cocoercive, and that one of the operators is strongly monotone and the other is cocoercive. All our linear convergence results are obtained by proving the stronger property that the Douglas-Rachford operator is contractive.
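As background on the method named in the abstract, here is a minimal sketch of the Douglas-Rachford iteration for the monotone inclusion 0 ∈ Ax + Bx; the resolvent notation \(J_{\gamma A} = (\mathrm{Id} + \gamma A)^{-1}\) and the 1/2-averaged form below are standard conventions assumed here, not necessarily the exact relaxation used in the paper:
\[
\begin{aligned}
R_{\gamma A} &= 2 J_{\gamma A} - \mathrm{Id} \quad \text{(reflected resolvent; } R_{\gamma B} \text{ analogously)},\\
z^{k+1} &= T_{\mathrm{DR}}\, z^{k}, \qquad T_{\mathrm{DR}} = \tfrac{1}{2}\,\mathrm{Id} + \tfrac{1}{2}\, R_{\gamma B} R_{\gamma A},\\
x^{k} &= J_{\gamma A}\, z^{k}.
\end{aligned}
\]
At a fixed point \(\bar z\) of \(T_{\mathrm{DR}}\), the point \(\bar x = J_{\gamma A}\bar z\) satisfies \(0 \in A\bar x + B\bar x\). Contractiveness in the sense of the abstract means there is a \(\delta \in [0,1)\) with \(\|T_{\mathrm{DR}} z - T_{\mathrm{DR}} z'\| \le \delta\|z - z'\|\) for all \(z, z'\), which gives the linear rate \(\|z^{k} - \bar z\| \le \delta^{k}\|z^{0} - \bar z\|\).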
Pages: 2241-2270
Page count: 30
Related papers
50 items in total
  • [21] The error structure of the Douglas-Rachford splitting method for stiff linear problems
    Hansen, Eskil
    Ostermann, Alexander
    Schratz, Katharina
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2016, 303 : 140 - 145
  • [22] Convergence of the preconditioned proximal point method and Douglas-Rachford splitting in the absence of monotonicity
    Evens, Brecht
    Pas, Pieter
    Latafat, Puya
    Patrinos, Panagiotis
    MATHEMATICAL PROGRAMMING, 2025,
  • [23] On the convergence rate of Douglas-Rachford operator splitting method
    He, Bingsheng
    Yuan, Xiaoming
    MATHEMATICAL PROGRAMMING, 2015, 153 : 715 - 722
  • [24] Diagonal Scaling in Douglas-Rachford Splitting and ADMM
    Giselsson, Pontus
    Boyd, Stephen
    2014 IEEE 53RD ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2014 : 5033 - 5039
  • [26] Removing Multiplicative Noise by Douglas-Rachford Splitting Methods
    Steidl, G.
    Teuber, T.
    JOURNAL OF MATHEMATICAL IMAGING AND VISION, 2010, 36 (02) : 168 - 184
  • [27] ON THE O(1/n) CONVERGENCE RATE OF THE DOUGLAS-RACHFORD ALTERNATING DIRECTION METHOD
    He, Bingsheng
    Yuan, Xiaoming
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2012, 50 (02) : 700 - 709
  • [28] A customized Douglas-Rachford splitting algorithm for separable convex minimization with linear constraints
    Han, Deren
    He, Hongjin
    Yang, Hai
    Yuan, Xiaoming
    NUMERISCHE MATHEMATIK, 2014, 127 (01) : 167 - 200
  • [29] Douglas-Rachford splitting and ADMM for pathological convex optimization
    Ryu, Ernest K.
    Liu, Yanli
    Yin, Wotao
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2019, 74 (03) : 747 - 778
  • [30] CONVERGENCE ANALYSIS OF DOUGLAS-RACHFORD SPLITTING METHOD FOR "STRONGLY + WEAKLY" CONVEX PROGRAMMING
    Guo, Ke
    Han, Deren
    Yuan, Xiaoming
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2017, 55 (04) : 1549 - 1577