A stochastic subspace approach to gradient-free optimization in high dimensions

Cited: 15
Authors
Kozak, David [1 ,2 ]
Becker, Stephen [3 ]
Doostan, Alireza [4 ]
Tenorio, Luis [1 ]
Affiliations
[1] Solea Energy, Overland Pk, KS 66210 USA
[2] Colorado Sch Mines, Dept Appl Math & Stat, Golden, CO 80401 USA
[3] Univ Colorado, Dept Appl Math, Boulder, CO 80309 USA
[4] Univ Colorado, Dept Aerosp Engn Sci, Boulder, CO 80309 USA
Funding
U.S. National Science Foundation;
Keywords
Randomized methods; Gradient-free; Gaussian processes; Stochastic gradients; INVERSE PROBLEMS; CONVEX-PROGRAMS; DESCENT; ALGORITHM; MINIMIZATION; UNCERTAINTY; PERFORMANCE; COMPLEXITY; FLOW;
DOI
10.1007/s10589-021-00271-w
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Subject Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
We present a stochastic descent algorithm for unconstrained optimization that is particularly efficient when the objective function is slow to evaluate and gradients are not easily obtained, as in some PDE-constrained optimization and machine learning problems. The algorithm maps the gradient onto a low-dimensional random subspace of dimension ℓ at each iteration, similar to coordinate descent but without restricting directional derivatives to be along the axes. Without requiring a full gradient, this mapping can be performed by computing ℓ directional derivatives (e.g., via forward-mode automatic differentiation). We give proofs for convergence in expectation under various convexity assumptions as well as probabilistic convergence results under strong convexity. Our method extends the well-known Gaussian smoothing technique to descent in subspaces of dimension greater than one, opening the door to new analysis of Gaussian smoothing when more than one directional derivative is used at each iteration. We also provide a finite-dimensional variant of a special case of the Johnson-Lindenstrauss lemma. Experimentally, we show that our method compares favorably to coordinate descent, Gaussian smoothing, gradient descent, and BFGS (when gradients are calculated via forward-mode automatic differentiation) on problems from the machine learning and shape optimization literature.
Pages: 339-368 (30 pages)
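
As a reading aid (not code from the paper), the iteration the abstract describes can be sketched in a few lines of JAX, where jax.jvp supplies the ℓ forward-mode directional derivatives. This is a minimal sketch under assumed choices: a QR-orthonormalized Gaussian basis for the random subspace, a fixed step size alpha, and the hypothetical helper name ssd_step. Consult the paper for the authors' exact sampling distribution, step-size rules, and convergence conditions.

```python
import jax
import jax.numpy as jnp

def ssd_step(f, x, key, ell, alpha):
    """One step of stochastic subspace descent (illustrative sketch).

    Draws an orthonormal basis P of a random ell-dimensional subspace,
    estimates the projected gradient (d / ell) * P @ (P.T @ grad f(x))
    from ell directional derivatives, and takes a descent step.
    """
    d = x.shape[0]
    # Orthonormalize a Gaussian matrix to get the random subspace basis.
    G = jax.random.normal(key, (d, ell))
    P, _ = jnp.linalg.qr(G)  # P has shape (d, ell), with P.T @ P = I
    # ell directional derivatives via forward-mode AD; no full gradient needed.
    dd = jnp.stack([jax.jvp(f, (x,), (P[:, i],))[1] for i in range(ell)])
    # Scale by d / ell so the projected gradient is unbiased in expectation,
    # since E[P @ P.T] = (ell / d) * I for a uniformly random subspace.
    return x - alpha * (d / ell) * (P @ dd)

# Toy usage on a strongly convex quadratic in d = 500 dimensions with ell = 5.
key = jax.random.PRNGKey(0)
f = lambda x: 0.5 * jnp.dot(x, x)
x = jnp.ones(500)
for _ in range(100):
    key, sub = jax.random.split(key)
    x = ssd_step(f, x, sub, ell=5, alpha=1e-2)
```

Two limiting cases make the connection to the abstract concrete: taking ell = d recovers full gradient descent, while ell = 1 with a single random direction is roughly the one-directional-derivative regime of Gaussian smoothing that the paper generalizes.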