A partial PPA block-wise ADMM for multi-block linearly constrained separable convex optimization

Cited by: 8
Authors
Shen, Yuan [1 ]
Zhang, Xingying [1 ]
Zhang, Xiayang [2 ]
Affiliations
[1] Nanjing Univ Finance & Econ, Sch Appl Math, Nanjing, Peoples R China
[2] Nanjing Inst Technol, Dept Math & Phys, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Convex optimization; augmented Lagrangian; alternating direction method of multipliers; multi-block; proximal point algorithm; AUGMENTED LAGRANGIAN METHOD; PARALLEL SPLITTING METHOD; LOW-RANK; MINIMIZATION; SPARSE; CONVERGENCE; ALGORITHM;
DOI
10.1080/02331934.2020.1728756
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research];
Subject classification codes
070105; 12; 1201; 1202; 120202;
Abstract
The alternating direction method of multipliers (ADMM) is a classical and effective method for solving two-block convex optimization problems subject to linear constraints. However, its convergence is not guaranteed in the multi-block case without additional assumptions. One remedy is the block-wise ADMM (BADMM), in which the variables are first regrouped into two groups and the augmented Lagrangian function is then minimized with respect to each block variable by the following scheme: the two groups are updated in a Gauss-Seidel fashion, while the variables within each group are updated in a Jacobi fashion. To guarantee convergence, a special proximal term is added to each subproblem. In this paper, we propose a new partial PPA block-wise ADMM in which proximal terms are added only to the subproblems in the first group. At the end of each iteration, an extension step with a fixed step size is performed on all variables. Because the subproblems in the second group are left unmodified, the resulting iterates may be of better quality and the algorithm may converge faster. Preliminary experimental results show that the new algorithm is empirically effective on both synthetic and real problems when compared with several highly efficient ADMM-based algorithms.
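The abstract describes the iteration only in words; the following is a minimal sketch, in LaTeX, of the scheme it suggests, assuming a two-group separable model. All symbols here (\theta_i, \varphi_j, A_i, B_j, the penalty \beta, the proximal weights \tau_i, and the extension step size \alpha) are illustrative; the paper's exact proximal matrices and the admissible step-size range are not given in the abstract.

% Model: two groups of blocks x = (x_1,...,x_p), y = (y_1,...,y_q)
\min_{x,y}\ \sum_{i=1}^{p}\theta_i(x_i)+\sum_{j=1}^{q}\varphi_j(y_j)
\quad\text{s.t.}\quad \sum_{i=1}^{p}A_i x_i+\sum_{j=1}^{q}B_j y_j=b,

% Augmented Lagrangian, with residual r(x,y) = \sum_i A_i x_i + \sum_j B_j y_j - b
\mathcal{L}_\beta(x,y,\lambda)=\sum_{i=1}^{p}\theta_i(x_i)+\sum_{j=1}^{q}\varphi_j(y_j)
-\lambda^{\top}r(x,y)+\tfrac{\beta}{2}\,\|r(x,y)\|^{2}.

% Prediction step at iteration k:
% group 1: Jacobi within the group, with proximal regularization
\tilde{x}_i^{k}=\arg\min_{x_i}\ \mathcal{L}_\beta\bigl(x_1^{k},\dots,x_i,\dots,x_p^{k},\,y^{k},\,\lambda^{k}\bigr)
+\tfrac{\tau_i}{2}\,\|x_i-x_i^{k}\|^{2},\qquad i=1,\dots,p;
% group 2: Gauss-Seidel relative to group 1, Jacobi within the group, no proximal term
\tilde{y}_j^{k}=\arg\min_{y_j}\ \mathcal{L}_\beta\bigl(\tilde{x}^{k},\,y_1^{k},\dots,y_j,\dots,y_q^{k},\,\lambda^{k}\bigr),
\qquad j=1,\dots,q;
% dual update
\tilde{\lambda}^{k}=\lambda^{k}-\beta\,r(\tilde{x}^{k},\tilde{y}^{k}).

% Extension step on all variables, with a fixed step size \alpha:
w^{k+1}=w^{k}-\alpha\,(w^{k}-\tilde{w}^{k}),\qquad w=(x,y,\lambda).

Since the group-2 subproblems coincide with those of the plain block-wise ADMM, each \tilde{y}_j^{k} solves its unregularized subproblem exactly; this is the intuition behind the abstract's claim that the resulting iterates may be of better quality.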
Pages: 631-657
Number of pages: 27