A feasible smoothing accelerated projected gradient method for nonsmooth convex optimization

Times cited: 0
Authors
Nishioka, Akatsuki [1 ]
Kanno, Yoshihiro [1 ,2 ]
Affiliations
[1] Univ Tokyo, Dept Math Informat, Hongo 7-3-1, Bunkyo Ku, Tokyo 1138656, Japan
[2] Univ Tokyo, Math & Informat Ctr, Hongo 7-3-1, Bunkyo Ku, Tokyo 1138656, Japan
Keywords
Smoothing method; Accelerated gradient method; Convergence rate; Structural optimization; Eigenvalue optimization;
DOI
10.1016/j.orl.2024.107181
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research];
Discipline codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
Smoothing accelerated gradient methods achieve faster convergence rates than the subgradient method on some nonsmooth convex optimization problems. However, Nesterov's extrapolation may require gradients at infeasible points, so these methods cannot be applied to certain structural optimization problems. We introduce a variant of the smoothing accelerated projected gradient method in which every iterate is feasible. An O(k^{-1} log k) convergence rate is obtained via a Lyapunov function argument. We conduct a numerical experiment on the robust compliance optimization of a truss structure. (c) 2024 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
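The abstract describes combining a decreasing smoothing parameter with an accelerated projected gradient loop while keeping all iterates feasible. The following is a minimal illustrative sketch of that general idea on a toy problem (minimizing ||Ax - b||_1 over a box, with a Huber-type smoothing whose parameter shrinks as 1/k). It is an assumption-laden reconstruction, not the authors' exact algorithm: in particular, projecting the extrapolated point back onto the feasible set is just one simple way to enforce feasibility, and the function names (`smoothing_apg`, `huber_grad`, `project_box`) are made up for this example.

```python
import numpy as np

def huber_grad(r, mu):
    # Gradient of the Huber-smoothed absolute value, applied elementwise;
    # as mu -> 0 this approaches the subgradient sign(r).
    return np.clip(r / mu, -1.0, 1.0)

def project_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi]^n.
    return np.clip(x, lo, hi)

def smoothing_apg(A, b, lo, hi, iters=500):
    # Sketch of a smoothing accelerated projected gradient method for
    # min_x ||Ax - b||_1 subject to lo <= x <= hi (hypothetical setup).
    n = A.shape[1]
    x = project_box(np.zeros(n), lo, hi)
    y = x.copy()
    L0 = np.linalg.norm(A, 2) ** 2  # spectral norm squared of A
    for k in range(1, iters + 1):
        mu = 1.0 / k       # decreasing smoothing parameter
        step = mu / L0     # grad of the smoothed objective is (L0/mu)-Lipschitz
        g = A.T @ huber_grad(A @ y - b, mu)
        x_new = project_box(y - step * g, lo, hi)
        # Nesterov-style extrapolation; the extra projection below keeps the
        # extrapolated point feasible, so gradients are never evaluated
        # outside the constraint set.
        beta = (k - 1) / (k + 2)
        y = project_box(x_new + beta * (x_new - x), lo, hi)
        x = x_new
    return x
```

The key point the abstract emphasizes is exactly the step guarded by the second projection: plain Nesterov extrapolation can leave the feasible set, which is why a feasible variant is needed when the objective is only defined (or only differentiable) on the constraint set.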
Pages: 5