Stochastic first-order methods for convex and nonconvex functional constrained optimization

Cited by: 27
Authors
Boob, Digvijay [1 ]
Deng, Qi [2 ]
Lan, Guanghui [1 ]
Affiliations
[1] Georgia Inst Technol, Ind & Syst Engn, Atlanta, GA 30332 USA
[2] Shanghai Univ Finance & Econ, Sch Informat Management & Engn, Shanghai, Peoples R China
Keywords
Functional constrained optimization; Stochastic algorithms; Convex and nonconvex optimization; Acceleration
DOI
10.1007/s10107-021-01742-y
CLC Number
TP31 [Computer Software]
Discipline Code
081202; 0835
Abstract
Functional constrained optimization is becoming increasingly important in machine learning and operations research. Such problems have potential applications in risk-averse machine learning, semisupervised learning, and robust optimization, among others. In this paper, we first present a novel Constraint Extrapolation (ConEx) method for solving convex functional constrained problems, which utilizes linear approximations of the constraint functions to define the extrapolation (or acceleration) step. We show that this method is a unified algorithm that achieves the best-known rate of convergence for solving different functional constrained convex composite problems, including convex or strongly convex, and smooth or nonsmooth problems with stochastic objective and/or stochastic constraints. Many of these rates of convergence were in fact obtained for the first time in the literature. In addition, ConEx is a single-loop algorithm that does not involve any penalty subproblems. Contrary to existing primal-dual methods, it does not require the projection of Lagrangian multipliers onto a (possibly unknown) bounded set. Second, for nonconvex functional constrained problems, we introduce a new proximal point method which transforms the initial nonconvex problem into a sequence of convex problems by adding quadratic terms to both the objective and the constraints. Under a certain MFCQ-type assumption, we establish the convergence and rate of convergence of this method to KKT points when the convex subproblems are solved exactly or inexactly. For large-scale and stochastic problems, we present a more practical proximal point method in which the approximate solutions of the subproblems are computed by the aforementioned ConEx method. Under a strong feasibility assumption, we establish the total iteration complexity of ConEx required by this inexact proximal point method for a variety of problem settings, including nonconvex smooth or nonsmooth problems with stochastic objective and/or stochastic constraints. To the best of our knowledge, most of these convergence and complexity results of the proximal point method for nonconvex problems appear to be new in the literature.
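The proximal-point idea in the abstract — convexifying a nonconvex constrained problem by adding quadratic terms to both the objective and the constraints, then solving each convex subproblem inexactly — can be illustrated with a minimal toy sketch. This is not the paper's algorithm: the inner solver below is a plain gradient descent-ascent loop standing in for ConEx, the function names and step sizes are hypothetical, and the problem is a hand-picked 1-D instance chosen so the KKT point is known.

```python
def prox_point_constrained(fg, g, gg, x0, rho=2.0, outer=40, inner=500, lr=0.05):
    """Inexact proximal-point sketch for min f(x) s.t. g(x) <= 0, f nonconvex:
    each outer step adds rho*(x - x_k)^2 to BOTH the objective and the
    constraint, making the subproblem convex, then solves it approximately.
    The inner gradient descent-ascent loop is a simple stand-in for the
    paper's ConEx solver (hypothetical choice, not the actual method)."""
    x, lam = x0, 0.0
    for _ in range(outer):
        xk = x  # proximal center for this subproblem
        for _ in range(inner):
            # convexified constraint value G(x) = g(x) + rho*(x - xk)^2
            G = g(x) + rho * (x - xk) ** 2
            # gradient in x of the Lagrangian F(x) + lam * G(x),
            # where F(x) = f(x) + rho*(x - xk)^2
            grad = fg(x) + 2 * rho * (x - xk) + lam * (gg(x) + 2 * rho * (x - xk))
            x -= lr * grad
            lam = max(0.0, lam + lr * G)  # projected dual ascent, lam >= 0
    return x, lam

# Toy nonconvex instance: min x^4 - x^2  s.t.  x - 0.6 <= 0.
# The unconstrained minimizers are x = +/- 1/sqrt(2) ~ 0.707, so the
# constraint is active and the KKT point is x* = 0.6.
fg = lambda x: 4 * x**3 - 2 * x   # f'(x)
g  = lambda x: x - 0.6            # constraint function
gg = lambda x: 1.0                # g'(x)

x_star, lam_star = prox_point_constrained(fg, g, gg, x0=0.3)
```

With rho = 2 each subproblem is strongly convex here (f'' >= -2 on the real line), which is what makes the inner loop well behaved; the fixed point of the outer loop is a KKT point of the original problem, matching the abstract's description.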
Pages: 215-279 (65 pages)
Related Papers
50 records in total
  • [31] Safe Online Convex Optimization with First-order Feedback
    Hutchinson, Spencer
    Alizadeh, Mahnoosh
    2024 AMERICAN CONTROL CONFERENCE, ACC 2024, 2024, : 1404 - 1410
  • [32] Optimized first-order methods for smooth convex minimization
    Kim, Donghwan
    Fessler, Jeffrey A.
    MATHEMATICAL PROGRAMMING, 2016, 159 (1-2) : 81 - 107
  • [35] First-order methods for the convex hull membership problem
    Filippozzi, Rafaela
    Goncalves, Douglas S.
    Santos, Luiz-Rafael
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2023, 306 (01) : 17 - 33
  • [36] EXACT WORST-CASE PERFORMANCE OF FIRST-ORDER METHODS FOR COMPOSITE CONVEX OPTIMIZATION
    Taylor, Adrien B.
    Hendrickx, Julien M.
    Glineur, Francois
    SIAM JOURNAL ON OPTIMIZATION, 2017, 27 (03) : 1283 - 1313
  • [37] Runge-Kutta-like scaling techniques for first-order methods in convex optimization
    Porta, Federica
    Cornelio, Anastasia
    Ruggiero, Valeria
    APPLIED NUMERICAL MATHEMATICS, 2017, 116 : 256 - 272
  • [38] An inexact first-order method for constrained nonlinear optimization
    Wang, Hao
    Zhang, Fan
    Wang, Jiashan
    Rong, Yuyang
    OPTIMIZATION METHODS & SOFTWARE, 2022, 37 (01) : 79 - 112
  • [39] First-Order Conditions for Set-Constrained Optimization
    Rovnyak, Steven M.
    Chong, Edwin K. P.
    Rovnyak, James
    MATHEMATICS, 2023, 11 (20)
  • [40] The First-Order Necessary Conditions for Sparsity Constrained Optimization
    Li X.
    Song W.
    Journal of the Operations Research Society of China, 2015, 3 (4) : 521 - 535