Derivative-Free Method For Composite Optimization With Applications To Decentralized Distributed Optimization

Cited by: 11
Authors
Beznosikov, Aleksandr [1 ,2 ]
Gorbunov, Eduard [2 ,3 ,4 ]
Gasnikov, Alexander [1 ,2 ,4 ,5 ]
Affiliations
[1] Moscow Inst Phys & Technol, Moscow, Russia
[2] Sirius Univ Sci & Technol, Sochi, Russia
[3] Moscow Inst Phys & Technol, Moscow, Russia
[4] Inst Informat Transmiss Problems RAS, Moscow, Russia
[5] Adyghe State Univ, Caucasus Math Ctr, Maykop, Adygea Republic, Russia
Source
IFAC PAPERSONLINE | 2020, Vol. 53, No. 2
Keywords
gradient sliding; zeroth-order optimization; decentralized distributed optimization; composite optimization;
DOI
10.1016/j.ifacol.2020.12.2272
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
In this paper, we propose a new method, based on the Sliding Algorithm from Lan (2016, 2019), for the convex composite optimization problem whose objective is the sum of two terms: a smooth one and a non-smooth one. Our method uses a stochastic noisy zeroth-order oracle for the non-smooth part and a first-order oracle for the smooth part; to the best of our knowledge, it is the first method in the literature to use such a mixed oracle for composite optimization. We prove a convergence rate for the new method that matches the corresponding rate of the first-order method up to a factor proportional to the dimension of the space or, in some cases, to its squared logarithm. We apply this method to decentralized distributed optimization and derive upper bounds on the number of communication rounds that match known lower bounds. Moreover, our bound on the number of zeroth-order oracle calls per node matches the corresponding state-of-the-art bound for first-order decentralized distributed optimization up to a factor proportional to the dimension of the space or, in some cases, to its squared logarithm. Copyright (C) 2020 The Authors.
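The record contains only the abstract, not the algorithm itself. As an illustration of the mixed-oracle idea described above, the following Python sketch pairs a standard two-point zeroth-order gradient estimator for the non-smooth term with an exact gradient for the smooth term, reusing the smooth gradient across a short inner loop in the spirit of gradient sliding. Every name and parameter here (mixed_oracle_sliding, the step sizes gamma and eta, the loop counts, the estimator's direction distribution) is an illustrative assumption, not the authors' method from the paper.

```python
import numpy as np

def zo_grad_estimate(h, x, tau=1e-4, rng=None):
    """Two-point zeroth-order gradient estimate of h at x along a random
    unit direction e:  g = d * (h(x + tau*e) - h(x - tau*e)) / (2*tau) * e.
    A standard smoothing-based estimator; the paper's exact oracle model
    (noise, direction distribution) may differ."""
    rng = rng if rng is not None else np.random.default_rng()
    d = x.size
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)
    return d * (h(x + tau * e) - h(x - tau * e)) / (2.0 * tau) * e

def mixed_oracle_sliding(grad_f, h, x0, n_outer=100, n_inner=10,
                         eta=0.5, gamma=0.01, tau=1e-4, seed=0):
    """Illustrative sliding-style loop with a mixed oracle: the smooth part
    is accessed via its exact gradient grad_f once per outer iteration,
    while the non-smooth part h is accessed only through two function
    values per inner step. Hypothetical parameters, not the paper's."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(n_outer):
        g_f = grad_f(x)                 # first-order oracle: smooth term
        u = x.copy()
        for _ in range(n_inner):        # inner loop: zeroth-order oracle only
            g_h = zo_grad_estimate(h, u, tau=tau, rng=rng)
            u -= gamma * (g_f + g_h)
        x = (1.0 - eta) * x + eta * u   # averaging, as in sliding schemes
    return x

# Toy usage: smooth quadratic plus an l1 term queried only by value.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    grad_f = lambda x: A.T @ (A @ x - b)   # gradient of 0.5*||Ax - b||^2
    h = lambda x: 0.1 * np.abs(x).sum()    # non-smooth, zeroth-order only
    print(mixed_oracle_sliding(grad_f, h, x0=np.zeros(5)))
```

The point of the sliding structure, which this sketch mimics, is that the first-order oracle is called once per outer iteration and its (stale) gradient is reused throughout the inner loop, so the non-smooth term consumes only cheap zeroth-order queries; this is how the abstract's oracle-call bounds can be stated separately for the two oracles.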
Pages: 4038-4043
Page count: 6