EFFICIENT REGULARIZED PROXIMAL QUASI-NEWTON METHODS FOR LARGE-SCALE NONCONVEX COMPOSITE OPTIMIZATION PROBLEMS

Times Cited: 0
Authors
Kanzow, Christian [1 ]
Lechner, Theresa [1 ]
Affiliations
[1] Univ Wurzburg, Inst Math, Emil Fischer Str 30, D-97074 Wurzburg, Germany
Source
PACIFIC JOURNAL OF OPTIMIZATION | 2024, Vol. 20, No. 3
Keywords
composite minimization; regularization; quadratic approximation; proximal quasi-Newton method; global convergence; limited memory methods; proximity operator; local error bound; QUASI-NEWTON MATRICES; CONVEX-OPTIMIZATION; GRADIENT METHODS; LINE-SEARCH; ALGORITHM; MINIMIZATION; SHRINKAGE; LASSO; SUM;
DOI
10.61208/pjo-2023-036
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research]
Discipline Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
Composite optimization problems have an objective function that is the sum of a smooth term and a (convex) nonsmooth term. This particular structure is exploited by the class of proximal gradient methods and some of their generalizations, such as proximal Newton and proximal quasi-Newton methods. In this paper, we propose a regularized proximal quasi-Newton method whose main features are: (a) the method is globally convergent to stationary points; (b) the globalization is controlled by a regularization parameter, so no line search is required; (c) the method can be implemented very efficiently based on a simple observation that combines recent ideas for the computation of quasi-Newton proximity operators with compact representations of limited-memory quasi-Newton updates. Numerical examples for convex and nonconvex composite optimization problems indicate that the method outperforms several existing methods.
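As an illustration of the kind of iteration the abstract describes, the following minimal Python sketch applies a regularized proximal quasi-Newton step to an l1-regularized least-squares problem. It is not the authors' implementation: a scalar Barzilai-Borwein estimate gamma*I stands in for the limited-memory quasi-Newton matrix B_k (so the regularized subproblem has a closed-form soft-thresholding solution), and the acceptance test and the parameters mu and sigma are illustrative choices; the paper instead works with compact L-BFGS representations and a dedicated computation of the quasi-Newton proximity operator.

import numpy as np

def soft_threshold(v, tau):
    # Proximity operator of tau*||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def regularized_prox_qn(A, b, lam, x0, mu=1.0, sigma=1e-4, max_iter=500, tol=1e-8):
    # Illustrative loop for  min_x 0.5*||A x - b||^2 + lam*||x||_1.
    # gamma*I (Barzilai-Borwein) replaces the quasi-Newton matrix B_k, so the
    # regularized subproblem
    #     min_d  g_k^T d + 0.5*(gamma + mu)*||d||^2 + lam*||x_k + d||_1
    # is solved exactly by soft-thresholding; the parameter mu, not a line
    # search, provides the globalization.
    x = x0.astype(float).copy()
    g = A.T @ (A @ x - b)
    gamma = 1.0
    fval = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.linalg.norm(x, 1)
    for _ in range(max_iter):
        step = 1.0 / (gamma + mu)
        x_trial = soft_threshold(x - step * g, lam * step)
        d = x_trial - x
        if np.linalg.norm(d) < tol:
            break
        # decrease predicted by the regularized quadratic model at d
        pred = -(g @ d) - 0.5 * (gamma + mu) * (d @ d) \
               - lam * (np.linalg.norm(x_trial, 1) - np.linalg.norm(x, 1))
        f_trial = 0.5 * np.linalg.norm(A @ x_trial - b) ** 2 \
                  + lam * np.linalg.norm(x_trial, 1)
        if fval - f_trial >= sigma * pred:
            # accept: update gradient, scalar "quasi-Newton" estimate, relax mu
            g_new = A.T @ (A @ x_trial - b)
            s, y = d, g_new - g
            if s @ y > 1e-12:
                gamma = (s @ y) / (s @ s)
            x, g, fval = x_trial, g_new, f_trial
            mu = max(1e-4, 0.5 * mu)
        else:
            # reject: keep the iterate and increase the regularization parameter
            mu *= 10.0
    return x

# Example usage on random data (hypothetical test problem):
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 500))
b = A @ (rng.standard_normal(500) * (rng.random(500) < 0.05)) + 0.01 * rng.standard_normal(200)
x_hat = regularized_prox_qn(A, b, lam=0.1 * np.max(np.abs(A.T @ b)), x0=np.zeros(500))

A trial step is accepted when the actual decrease of the composite objective is at least a fraction sigma of the decrease predicted by the regularized quadratic model; rejected steps only increase mu, which is exactly the role the abstract assigns to the regularization parameter in place of a line search.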
Pages: 32
Related Papers (50 records in total)
  • [11] An efficient augmented memoryless quasi-Newton method for solving large-scale unconstrained optimization problems
    Cheng, Yulin
    Gao, Jing
    AIMS MATHEMATICS, 2024, 9 (09) : 25232 - 25252
  • [12] ON THE COMPLEXITY OF STEEPEST DESCENT, NEWTON'S AND REGULARIZED NEWTON'S METHODS FOR NONCONVEX UNCONSTRAINED OPTIMIZATION PROBLEMS
    Cartis, C.
    Gould, N. I. M.
    Toint, Ph. L.
    SIAM JOURNAL ON OPTIMIZATION, 2010, 20 (06) : 2833 - 2852
  • [13] Preconditioning Newton-Krylov methods in nonconvex large scale optimization
    Fasano, Giovanni
    Roma, Massimo
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2013, 56 (02) : 253 - 290
  • [14] Fast large-scale optimization by unifying stochastic gradient and quasi-Newton methods
    Sohl-Dickstein, Jascha
    Poole, Ben
    Ganguli, Surya
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 2), 2014, 32 : 604 - 612
  • [15] A STOCHASTIC QUASI-NEWTON METHOD FOR LARGE-SCALE OPTIMIZATION
    Byrd, R. H.
    Hansen, S. L.
    Nocedal, Jorge
    Singer, Y.
    SIAM JOURNAL ON OPTIMIZATION, 2016, 26 (02) : 1008 - 1031
  • [16] Inexact semismooth Newton methods for large-scale complementarity problems
    Kanzow, C
    OPTIMIZATION METHODS & SOFTWARE, 2004, 19 (3-4) : 309 - 325
  • [17] Efficient methods for large-scale unconstrained optimization
    Luksan, L
    Vlcek, J
    LARGE-SCALE NONLINEAR OPTIMIZATION, 2006, 83 : 185+
  • [18] Optimization of Inf-Convolution Regularized Nonconvex Composite Problems
    Laude, Emanuel
    Wu, Tao
    Cremers, Daniel
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89 : 547 - 556
  • [19] Proximal Newton Methods for Convex Composite Optimization
    Patrinos, Panagiotis
    Bemporad, Alberto
    2013 IEEE 52ND ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2013, : 2358 - 2363
  • [20] An adaptive regularized proximal Newton-type methods for composite optimization over the Stiefel manifold
    Wang, Qinsi
    Yang, Wei Hong
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2024, 89 (02) : 419 - 457