Acceleration of Primal–Dual Methods by Preconditioning and Simple Subproblem Procedures

Cited by: 0
Authors
Yanli Liu
Yunbei Xu
Wotao Yin
Affiliations
[1] University of California, Department of Mathematics
[2] Columbia University, Graduate School of Business
Source
Journal of Scientific Computing | 2021, Vol. 86
Keywords
Primal–dual hybrid gradient; Alternating direction method of multipliers; Preconditioning; Acceleration; 49M29; 65K10; 65Y20; 90C25;
DOI
Not available
Abstract
Primal–dual hybrid gradient (PDHG) and the alternating direction method of multipliers (ADMM) are popular first-order optimization methods. They are easy to implement and have diverse applications. As first-order methods, however, they are sensitive to problem conditioning and can struggle to reach high accuracy. To improve their performance, researchers have proposed techniques such as diagonal preconditioning and inexact subproblem solves. This paper achieves an additional speedup of about one order of magnitude. Specifically, we choose general (non-diagonal) preconditioners, which reduce the total number of PDHG/ADMM iterations far more effectively than diagonal ones. Although the subproblems may then lose their closed-form solutions, we show that it suffices to solve each subproblem approximately with a few proximal-gradient iterations or a few epochs of proximal block-coordinate descent, both of which are simple and have closed-form steps. Global convergence of this approach is proved when the number of inner iterations is fixed. Our method widens the choice of preconditioners while maintaining both low per-iteration cost and global convergence. Consequently, on several typical applications of primal–dual first-order methods, we obtain a 4–95× speedup over the existing state of the art.
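To make the idea described in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of preconditioned PDHG on a LASSO instance, min_x 0.5*||Ax-b||^2 + lam*||x||_1, written as f(x) + g(Ax) with f = lam*||.||_1 and g = 0.5*||.-b||^2. With a general non-diagonal step matrix T the primal prox loses its closed form, so it is solved inexactly by a fixed number of warm-started proximal-gradient steps. All function names and the specific preconditioner choice below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Closed-form prox of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inexact_prox(v, M, lam, x0, n_inner):
    """Approximately solve argmin_x 0.5*(x-v)^T M (x-v) + lam*||x||_1
    with a fixed number of proximal-gradient steps; each step is a
    gradient step on the quadratic followed by soft-thresholding."""
    step = 1.0 / np.linalg.norm(M, 2)  # 1 / Lipschitz constant of the quadratic
    x = x0.copy()                      # warm start at the previous iterate
    for _ in range(n_inner):
        x = soft_threshold(x - step * (M @ (x - v)), step * lam)
    return x

def pdhg_lasso(A, b, lam, T, sigma, n_iter, n_inner):
    """Sketch of PDHG with a general (possibly non-diagonal) primal step
    matrix T; the x-subproblem is solved inexactly by inexact_prox."""
    m, n = A.shape
    x, y = np.zeros(n), np.zeros(m)
    M = np.linalg.inv(T)               # metric of the primal prox
    for _ in range(n_iter):
        x_new = inexact_prox(x - T @ (A.T @ y), M, lam, x, n_inner)
        # dual prox of g*(y) = 0.5*||y||^2 + <b, y> stays closed form
        y = (y + sigma * (A @ (2.0 * x_new - x)) - sigma * b) / (1.0 + sigma)
        x = x_new
    return x
```

As an illustrative choice, T = inv(A.T @ A + I) approximates the inverse Hessian of the smooth part, and a scalar sigma with sigma * ||A T A^T||_2 < 1 keeps the usual PDHG step-size condition.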
Related papers (50 in total)
  • [21] Acceleration procedures for matrix iterative methods
    C. Brezinski
    Numerical Algorithms, 2000, 25 : 63 - 73
  • [22] On the Comparison between Primal and Primal-dual Methods in Decentralized Dynamic Optimization
    Xu, Wei
    Yuan, Kun
    Yin, Wotao
    Ling, Qing
    CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019, : 1501 - 1505
  • [24] Preconditioning in Fast Dual Gradient Methods
    Giselsson, Pontus
    Boyd, Stephen
    2014 IEEE 53RD ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2014, : 5040 - 5045
  • [25] Diagonal preconditioning for first order primal-dual algorithms in convex optimization
    Pock, Thomas
    Chambolle, Antonin
    2011 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2011, : 1762 - 1769
  • [26] Acceleration and Preconditioning Strategies for Higher Order Moment Methods
    Dault, D.
    Shanker, B.
    2014 IEEE ANTENNAS AND PROPAGATION SOCIETY INTERNATIONAL SYMPOSIUM (APSURSI), 2014, : 2142 - 2143
  • [27] Machine criticality measures and subproblem solution procedures in shifting bottleneck methods: A computational study
    Holtsclaw, HH
    Uzsoy, R
    JOURNAL OF THE OPERATIONAL RESEARCH SOCIETY, 1996, 47 (05) : 666 - 677
  • [28] Dual-primal FETI methods for linear elasticity
    Klawonn, Axel
    Widlund, Olof B.
    COMMUNICATIONS ON PURE AND APPLIED MATHEMATICS, 2006, 59 (11) : 1523 - 1572
  • [29] A primal-dual approach to inexact subgradient methods
    Au, KT
    MATHEMATICAL PROGRAMMING, 1996, 72 (03) : 259 - 272
  • [30] Primal and Dual Bregman Methods with Application to Optical Nanoscopy
    Brune, Christoph
    Sawatzky, Alex
    Burger, Martin
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2011, 92 (02) : 211 - 229