A FUNCTIONAL MODEL METHOD FOR NONCONVEX NONSMOOTH CONDITIONAL STOCHASTIC OPTIMIZATION

Times Cited: 0
Authors
Ruszczynski, Andrzej [1 ]
Yang, Shangzhe [1 ]
Affiliation
[1] Rutgers State Univ, Dept Management Sci & Informat Syst, Piscataway, NJ 08854 USA
Keywords
conditional stochastic optimization; nonsmooth optimization; stochastic subgradient methods; reparametrization; ALGORITHMS; APPROXIMATIONS
DOI
10.1137/23M1617965
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
We consider stochastic optimization problems involving an expected value of a nonlinear function of a base random vector and a conditional expectation of another function depending on the base random vector, a dependent random vector, and the decision variables. We call such problems conditional stochastic optimization problems. They arise in many applications, such as uplift modeling, reinforcement learning, and contextual optimization. We propose a specialized single time-scale stochastic method for such problems with a Lipschitz smooth outer function and a generalized differentiable inner function. In the method, we approximate the inner conditional expectation with a rich parametric model whose mean squared error satisfies a stochastic version of a Łojasiewicz condition. The model is used by an inner learning algorithm. The main feature of our approach is that unbiased stochastic estimates of the directions used by the method can be generated with one observation from the joint distribution per iteration, which makes it applicable to real-time learning. The directions, however, are not gradients or subgradients of any overall objective function. We prove the convergence of the method with probability one, using the method of differential inclusions and a specially designed Lyapunov function, involving a stochastic generalization of the Bregman distance. Finally, a numerical illustration demonstrates the viability of our approach.
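To make the problem class described in the abstract concrete, a minimal sketch of the formulation follows; the symbols u, X, Y, f, g, and h are illustrative notation and are not taken from the paper itself:

\[
\min_{u \in U} \; \mathbb{E}\bigl[\, f\bigl(X,\, h(X, u)\bigr) \,\bigr],
\qquad
h(x, u) \;=\; \mathbb{E}\bigl[\, g(X, Y, u) \,\bigm|\, X = x \,\bigr],
\]

where X is the base random vector, Y is the dependent random vector, u is the decision vector, f is the (Lipschitz smooth) outer function, and g is the (generalized differentiable) inner function whose conditional expectation h is the quantity approximated by a parametric functional model in the method.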
Pages
3064-3087
Page count
24