EFFICIENT REGULARIZED PROXIMAL QUASI-NEWTON METHODS FOR LARGE-SCALE NONCONVEX COMPOSITE OPTIMIZATION PROBLEMS

Times Cited: 0
Authors:
Kanzow, Christian [1]
Lechner, Theresa [1]
Affiliations:
[1] Univ Wurzburg, Inst Math, Emil Fischer Str 30, D-97074 Wurzburg, Germany
Source:
PACIFIC JOURNAL OF OPTIMIZATION | 2024, Vol. 20, No. 03
Keywords:
composite minimization; regularization; quadratic approximation; proximal quasi-Newton method; global convergence; limited memory methods; proximity operator; local error bound; QUASI-NEWTON MATRICES; CONVEX-OPTIMIZATION; GRADIENT METHODS; LINE-SEARCH; ALGORITHM; MINIMIZATION; SHRINKAGE; LASSO; SUM;
DOI:
10.61208/pjo-2023-036
Chinese Library Classification:
C93 [Management]; O22 [Operations Research];
Discipline Codes:
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract:
Optimization problems with composite functions consist of an objective function which is the sum of a smooth and a (convex) nonsmooth term. This particular structure is exploited by the class of proximal gradient methods and some of their generalizations like proximal Newton and quasi-Newton methods. In this paper, we propose a regularized proximal quasi-Newton method whose main features are: (a) the method is globally convergent to stationary points, (b) the globalization is controlled by a regularization parameter, no line search is required, (c) the method can be implemented very efficiently based on a simple observation which combines recent ideas for the computation of quasi-Newton proximity operators and compact representations of limited-memory quasi-Newton updates. Numerical examples for the solution of convex and nonconvex composite optimization problems indicate that the method outperforms several existing methods.
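To make the structure exploited by this class of methods concrete, here is a minimal sketch of a plain proximal gradient step for the one-dimensional composite problem f(x) + λ|x|, where the nonsmooth term's proximity operator is soft-thresholding. This illustrates only the basic proximal gradient iteration mentioned in the abstract, not the paper's regularized quasi-Newton scheme; the function names and the test problem are illustrative choices, not taken from the paper.

```python
def soft_threshold(v, t):
    # Proximity operator of t*|x|: shrinks v toward zero by t.
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def proximal_gradient(grad_f, lam, x0, step, iters=200):
    # Iterate x_{k+1} = prox_{step*lam*|.|}(x_k - step * grad_f(x_k)),
    # the basic proximal gradient (ISTA) scheme for f(x) + lam*|x|.
    x = x0
    for _ in range(iters):
        x = soft_threshold(x - step * grad_f(x), step * lam)
    return x

# Toy problem: minimize 0.5*(x - 3)^2 + 1*|x|.
# The minimizer is x = 2 (gradient of the smooth part is x - 3).
x_star = proximal_gradient(lambda x: x - 3.0, lam=1.0, x0=0.0, step=1.0)
```

Proximal Newton and quasi-Newton methods replace the scalar step by a (quadratic) model built from curvature information, which is where the paper's compact limited-memory representations and regularization come in.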
Pages: 32
Related Papers
50 records total
  • [21] A COMMUNICATION EFFICIENT QUASI-NEWTON METHOD FOR LARGE-SCALE DISTRIBUTED MULTI-AGENT OPTIMIZATION
    Li, Yichuan
    Voulgaris, Petros G.
    Freris, Nikolaos M.
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4268 - 4272
  • [23] Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
    Ghanbari, Hiva
    Scheinberg, Katya
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2018, 69 (03) : 597 - 627
  • [24] Accelerated Proximal Gradient Method with Line Search for Large-Scale Nonconvex Penalty Problems
    Wu, Zhongming
    Wang, Kai
    Zhou, Zhangjin
    ICBDC 2019: PROCEEDINGS OF 2019 4TH INTERNATIONAL CONFERENCE ON BIG DATA AND COMPUTING, 2019, : 281 - 286
  • [25] Stochastic proximal quasi-Newton methods for non-convex composite optimization
    Wang, Xiaoyu
    Wang, Xiao
    Yuan, Ya-xiang
    OPTIMIZATION METHODS & SOFTWARE, 2019, 34 (05): : 922 - 948
  • [26] ASYNCHRONOUS PARALLEL NONCONVEX LARGE-SCALE OPTIMIZATION
    Cannelli, L.
    Facchinei, F.
    Kungurtsev, V.
    Scutari, G.
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 4706 - 4710
  • [28] Scaling on Diagonal Quasi-Newton Update for Large-Scale Unconstrained Optimization
    Leong, Wah June
    Farid, Mahboubeh
    Abu Hassan, Malik
    BULLETIN OF THE MALAYSIAN MATHEMATICAL SCIENCES SOCIETY, 2012, 35 (02) : 247 - 256
  • [29] Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
    Patrascu, Andrei
    Necoara, Ion
    JOURNAL OF GLOBAL OPTIMIZATION, 2015, 61 (01) : 19 - 46
  • [30] A Sequential Subspace Quasi-Newton Method for Large-Scale Convex Optimization
    Senov, Aleksandr
    Granichin, Oleg
    Granichina, Olga
    2020 AMERICAN CONTROL CONFERENCE (ACC), 2020, : 3627 - 3632