Large-Scale Non-convex Stochastic Constrained Distributionally Robust Optimization

Cited by: 0
Authors
Zhang, Qi [1 ]
Zhou, Yi [2 ]
Prater-Bennette, Ashley [3 ]
Shen, Lixin [4 ]
Zou, Shaofeng [1 ]
Affiliations
[1] Univ Buffalo, Buffalo, NY 14260 USA
[2] Univ Utah, Salt Lake City, UT USA
[3] Air Force Res Lab, Wright Patterson AFB, OH USA
[4] Syracuse Univ, Syracuse, NY USA
Funding
U.S. National Science Foundation;
Keywords
DIVERGENCE;
DOI
Not available
Chinese Library Classification (CLC) Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Distributionally robust optimization (DRO) is a powerful framework for training models that are robust to data distribution shifts. This paper focuses on constrained DRO, which admits an explicit characterization of the robustness level. Existing studies on constrained DRO mostly focus on convex loss functions and exclude the practical and challenging case of non-convex loss functions, e.g., neural networks. This paper develops a stochastic algorithm, together with its performance analysis, for non-convex constrained DRO. The per-iteration computational complexity of the stochastic algorithm is independent of the overall dataset size, making it suitable for large-scale applications. We focus on uncertainty sets defined by the general Cressie-Read family of divergences, which includes the \chi^2-divergence as a special case. We prove that our algorithm finds an \epsilon-stationary point with improved computational complexity over existing methods. Our method also applies to smoothed conditional value at risk (CVaR) DRO.
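For context, the constrained DRO problem over a Cressie-Read uncertainty set described in the abstract takes the following standard form (a textbook formulation from the DRO literature, not text reproduced from the paper):

    \min_{\theta} \; \sup_{Q :\, D_{f_k}(Q \,\|\, P) \le \rho} \mathbb{E}_{X \sim Q}\big[\ell(\theta; X)\big],
    \qquad D_{f_k}(Q \,\|\, P) = \mathbb{E}_{P}\!\left[ f_k\!\left( \frac{dQ}{dP} \right) \right],
    \qquad f_k(t) = \frac{t^k - k t + k - 1}{k(k-1)},

where k \in (1, \infty) indexes the Cressie-Read family; k = 2 gives f_2(t) = \tfrac{1}{2}(t-1)^2, the \chi^2-divergence, and the k \to \infty limit recovers CVaR.

To see why a per-iteration cost independent of the dataset size is plausible, the sketch below evaluates the well-known dual reformulation of the \chi^2 case on a single mini-batch. This is a minimal illustration under stated assumptions, not the paper's algorithm; the function name and the sample values are hypothetical.

    import numpy as np

    # Minimal sketch, NOT the paper's algorithm: the standard dual form of
    # constrained chi^2-DRO (Cressie-Read k = 2),
    #   sup_{D_{f_2}(Q||P) <= rho} E_Q[loss]
    #     = inf_eta  sqrt(1 + 2*rho) * sqrt(E_P[(loss - eta)_+^2]) + eta,
    # estimated on one mini-batch of per-sample losses.
    def chi2_dro_dual_objective(losses, eta, rho):
        """Mini-batch estimate of the chi^2 constrained-DRO dual objective."""
        excess = np.maximum(losses - eta, 0.0)   # (loss - eta)_+
        moment = np.sqrt(np.mean(excess ** 2))   # empirical root second moment
        return np.sqrt(1.0 + 2.0 * rho) * moment + eta

    # Usage on one mini-batch (hypothetical loss values): the cost is
    # O(batch size), independent of the full dataset size.
    batch_losses = np.array([0.3, 1.2, 0.7, 2.5])
    print(chi2_dro_dual_objective(batch_losses, eta=0.5, rho=0.1))

Note that the square root makes the plug-in mini-batch estimate biased, which is exactly the kind of difficulty a dedicated stochastic algorithm for non-convex constrained DRO must handle.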
Pages: 8217-8225
Number of pages: 9