Properties and practicability of convergence-guaranteed optimization methods derived from weak discrete gradients

Cited: 0
Authors
Ushiyama, Kansei [1 ]
Sato, Shun [1 ]
Matsuo, Takayasu [1 ]
Affiliations
[1] Univ Tokyo, Grad Sch Informat Sci & Technol, 7-3-1 Hongo,Bunkyo Ku, Tokyo 1138656, Japan
Keywords
Convex optimization; Proximal gradient method; Numerical analysis; Discrete gradient
DOI
10.1007/s11075-024-01790-3
CLC number
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Ordinary differential equation (ODE) models of optimization methods admit concise convergence-rate proofs via Lyapunov-function arguments. The weak discrete gradient (wDG) framework discretizes such ODEs while preserving their convergence properties, and thus serves as a foundation for deriving optimization methods. Although various optimization methods have been derived through wDG, their properties and practical applicability remain underexplored. This study elucidates these aspects through numerical experiments. In particular, although wDG yields several implicit methods, we highlight the potential utility of these methods when the objective function includes a regularization term.
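The implicit step the abstract alludes to for regularized objectives is exemplified by the proximal gradient method listed in the keywords: the proximal step is an implicit (backward) step on the nonsmooth regularizer. Below is a minimal sketch, not code from the paper or its wDG construction; the lasso instance, the step size, and the helper names (soft_threshold, proximal_gradient) are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Closed-form proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, step, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1.

    Each iteration takes an explicit gradient step on the smooth term,
    then a proximal step on the regularizer; the proximal step is
    implicit in the sense that it solves
    x_{k+1} = y_k - step * p with p in the subdifferential of
    lam*||.||_1 at x_{k+1}.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        y = x - step * A.T @ (A @ x - b)   # explicit step on the smooth part
        x = soft_threshold(y, step * lam)  # implicit (proximal) step
    return x

# Usage on a small random lasso problem; step = 1/L with
# L the Lipschitz constant of the smooth gradient, ||A||_2^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
x = proximal_gradient(A, b, lam=0.1, step=1.0 / np.linalg.norm(A, 2) ** 2)
```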
Pages: 1331-1362
Page count: 32