CONVERGENCE PROPERTIES OF STOCHASTIC PROXIMAL SUBGRADIENT METHOD IN SOLVING A CLASS OF COMPOSITE OPTIMIZATION PROBLEMS WITH CARDINALITY REGULARIZER

Times Cited: 0
Authors
Hu, Xiaoyin [1 ]
Liu, Xin [2 ,3 ]
Xiao, Nachuan [4 ]
Affiliations
[1] Hangzhou City Univ, Sch Comp & Comp Sci, Hangzhou 310015, Peoples R China
[2] Chinese Acad Sci, Acad Math & Syst Sci, State Key Lab Sci & Engn Comp, Beijing, Peoples R China
[3] Univ Chinese Acad Sci, Sch Math Sci, Beijing, Peoples R China
[4] Natl Univ Singapore, Inst Operat Res & Analyt, Singapore, Singapore
Funding
National Natural Science Foundation of China;
Keywords
Nonsmooth optimization; cardinality regularizer; proximal subgradient method; global convergence; conservative field; NONSMOOTH; GRADIENT; ALGORITHMS; PENALTY;
DOI
10.3934/jimo.2023149
Chinese Library Classification (CLC)
T [Industrial Technology];
Discipline Code
08;
Abstract
In this paper, we study a class of composite optimization problems whose objective function is the sum of a collection of nonsmooth nonconvex loss functions and a cardinality regularizer. We first investigate the optimality conditions of these problems and then propose a stochastic proximal subgradient method (SPSG) to solve them. We establish the almost-sure subsequence convergence of SPSG under mild assumptions, and we emphasize that these assumptions are satisfied by a wide range of problems arising from training neural networks. Furthermore, we conduct preliminary numerical experiments to demonstrate the effectiveness and efficiency of SPSG in solving this class of problems.
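To make the structure described in the abstract concrete, the following is a minimal NumPy sketch of a generic stochastic proximal subgradient loop for an objective of the form (1/N)·Σ_i f_i(x) + λ‖x‖₀. It only illustrates the overall pattern (stochastic subgradient step on the loss, then the proximal map of the cardinality term, which has the closed form of hard thresholding); the authors' exact SPSG update, step-size conditions, and convergence assumptions are given in the paper. The subgradient oracle, step-size schedule, batch sampling, and all parameter values below are hypothetical placeholders.

```python
import numpy as np

def prox_cardinality(v, step, lam):
    """Proximal map of lam*||.||_0 with step size `step` (hard thresholding):
    keep v_i when v_i**2 > 2*step*lam, otherwise set it to zero."""
    keep = v**2 > 2.0 * step * lam
    return np.where(keep, v, 0.0)

def spsg_sketch(subgrad_oracle, x0, lam, steps, n_samples, n_iter=1000, batch=32, rng=None):
    """Generic stochastic proximal subgradient loop (illustrative sketch).

    subgrad_oracle(x, idx) is a user-supplied callback returning a stochastic
    subgradient of the averaged loss over the sampled mini-batch `idx`;
    `steps` is an iterable of step sizes, e.g. a diminishing sequence.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    for _, eta in zip(range(n_iter), steps):
        idx = rng.integers(0, n_samples, size=batch)  # sample a mini-batch of data indices
        g = subgrad_oracle(x, idx)                    # stochastic subgradient of the loss part
        x = prox_cardinality(x - eta * g, eta, lam)   # subgradient step, then l0 proximal map
    return x
```

A diminishing step-size sequence such as `steps = (0.1 / (k + 1) ** 0.5 for k in range(1000))` is a common choice in this kind of loop, though the schedule required by the paper's convergence analysis may differ.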
Pages: 1934-1950
Number of pages: 17