Large-Scale Non-convex Stochastic Constrained Distributionally Robust Optimization

Cited by: 0
Authors
Zhang, Qi [1 ]
Zhou, Yi [2 ]
Prater-Bennette, Ashley [3 ]
Shen, Lixin [4 ]
Zou, Shaofeng [1 ]
Affiliations
[1] Univ Buffalo, Buffalo, NY 14260 USA
[2] Univ Utah, Salt Lake City, UT USA
[3] Air Force Res Lab, Wright Patterson AFB, OH USA
[4] Syracuse Univ, Syracuse, NY USA
Funding
US National Science Foundation;
Keywords
DIVERGENCE;
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Distributionally robust optimization (DRO) is a powerful framework for training robust models against data distribution shifts. This paper focuses on constrained DRO, which has an explicit characterization of the robustness level. Existing studies on constrained DRO mostly focus on convex loss functions and exclude the practical and challenging case of non-convex loss functions, e.g., neural networks. This paper develops a stochastic algorithm and its performance analysis for non-convex constrained DRO. The computational complexity of our stochastic algorithm at each iteration is independent of the overall dataset size, so the algorithm is suitable for large-scale applications. We focus on uncertainty sets defined by the general Cressie-Read family of divergences, which includes the χ²-divergence as a special case. We prove that our algorithm finds an ε-stationary point with improved computational complexity compared to existing methods. Our method also applies to the smoothed conditional value at risk (CVaR) DRO.
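
To make the constrained-DRO setting concrete, below is a minimal NumPy sketch of one stochastic update for the χ²-constrained objective, using the standard dual reformulation of the inner maximization, inf_η { √(1+2ρ)·(E[(ℓ−η)₊²])^{1/2} + η } (Cressie-Read family with k = 2; constants may differ under other normalizations of the divergence). This is an illustration only, not the algorithm analyzed in the paper: the naive mini-batch estimate of this objective is biased, which is one of the difficulties a rigorous method must handle. The callables per_sample_loss and per_sample_grad, and the simple joint (w, eta) update, are hypothetical placeholders.

import numpy as np

def chi2_dro_dual_objective(losses, eta, rho):
    """Mini-batch estimate of the chi^2-constrained DRO objective via its dual:
    sqrt(1 + 2*rho) * sqrt(mean((loss - eta)_+^2)) + eta.
    Only the current mini-batch is touched, so one evaluation costs O(batch size),
    independent of the full dataset size."""
    excess = np.maximum(losses - eta, 0.0)          # (loss - eta)_+
    return np.sqrt(1.0 + 2.0 * rho) * np.sqrt(np.mean(excess ** 2)) + eta

def sgd_step(w, eta, batch_x, batch_y, rho, lr,
             per_sample_loss, per_sample_grad):
    """One joint stochastic (sub)gradient step on (w, eta).
    per_sample_loss(w, X, y) -> per-sample losses, shape (B,)
    per_sample_grad(w, X, y) -> per-sample gradients, shape (B, d)
    Both are hypothetical user-supplied callables (e.g., for a neural network)."""
    losses = per_sample_loss(w, batch_x, batch_y)
    grads = per_sample_grad(w, batch_x, batch_y)
    excess = np.maximum(losses - eta, 0.0)          # (loss - eta)_+
    norm = np.sqrt(np.mean(excess ** 2)) + 1e-12    # avoid division by zero
    coeff = np.sqrt(1.0 + 2.0 * rho)
    # Chain rule through the dual objective: each sample is re-weighted by its
    # excess loss, so hard examples are up-weighted (the DRO effect).
    sample_weights = coeff * excess / (norm * len(losses))
    w_grad = (sample_weights[:, None] * grads).sum(axis=0)
    eta_grad = 1.0 - coeff * np.mean(excess) / norm
    return w - lr * w_grad, eta - lr * eta_grad

For a general Cressie-Read parameter k, one standard dual form replaces √(1+2ρ) with (1 + k(k−1)ρ)^{1/k} and the exponent 2 with k/(k−1); CVaR admits the analogous Rockafellar-Uryasev form inf_η { η + E[(ℓ−η)₊]/α }.
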
Pages: 8217-8225
Number of Pages: 9