MANIFOLD SAMPLING FOR l1 NONCONVEX OPTIMIZATION

Cited: 13
Authors
Larson, Jeffrey [1]
Menickelly, Matt [1,2]
Wild, Stefan M. [1]
Affiliations
[1] Argonne Natl Lab, Math & Comp Sci Div, Lemont, IL 60439 USA
[2] Lehigh Univ, Dept Ind & Syst Engn, HS Mohler Lab, Bethlehem, PA 18015 USA
Keywords
composite nonsmooth optimization; gradient sampling; derivative-free optimization; trust-region algorithms; nonsmooth optimization; convergence
DOI
10.1137/15M1042097
CLC number
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
We present a new algorithm, called manifold sampling, for the unconstrained minimization of a nonsmooth composite function h ∘ F when h has known structure. In particular, by classifying points in the domain of the nonsmooth function h into manifolds, we adapt search directions within a trust-region framework based on knowledge of the manifolds intersecting the current trust region. We motivate this idea through a study of l1 functions, where it is trivial to classify objective function manifolds using zeroth-order information from the constituent functions F_i, and we give an explicit statement of a manifold sampling algorithm in this case. We prove that all cluster points of iterates generated by this algorithm are stationary in the Clarke sense. We prove a similar result for a stochastic variant of the algorithm. Additionally, our algorithm can accept iterates at which h is nondifferentiable and requires only an approximation of the gradients of F at the trust-region center. Numerical results for several variants of the algorithm show that using manifold information from additional points near the current iterate can improve practical performance. The best variants are also shown to be competitive, particularly in terms of robustness, with other nonsmooth, derivative-free solvers.
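To make the abstract's central observation concrete: for h(y) = ||y||_1, the manifold containing F(x) is determined entirely by the sign pattern of the components F_i(x), so classification needs only function values (zeroth-order information), and on a fixed manifold the composite gradient is a sign-weighted sum of the component gradients. The sketch below is an illustration of that idea, not the authors' implementation; the helper names and the boundary tolerance `tol` are assumptions introduced here.

```python
import numpy as np

def manifold_signature(F_vals, tol=1e-12):
    """Classify a point by the sign pattern of its component values F_i(x).

    Only zeroth-order information (the values F_i(x)) is used. Components
    within `tol` of zero lie on a manifold boundary, where h(y) = ||y||_1
    is nondifferentiable; they are flagged with 0. (`tol` is a choice made
    for this sketch, not a parameter from the paper.)
    """
    return tuple(0 if abs(v) <= tol else (1 if v > 0 else -1) for v in F_vals)

def manifold_gradient(signature, JF):
    """On a fixed manifold, grad h(F(x)) = sum_i sign(F_i(x)) * grad F_i(x).

    `JF` is an (approximate) Jacobian of F at the trust-region center,
    with row i holding grad F_i(x); an approximation suffices, matching
    the abstract's derivative-free setting.
    """
    s = np.array(signature, dtype=float)
    return JF.T @ s

# Example: F(x) = (2, -3, 0) puts x on the boundary of the third component.
sig = manifold_signature([2.0, -3.0, 0.0])          # (1, -1, 0)
g = manifold_gradient(sig, np.array([[1.0, 0.0],
                                     [0.0, 1.0],
                                     [1.0, 1.0]]))  # array([1., -1.])
```

A manifold sampling step would gather such signatures from several points near the current iterate and build its trust-region model from the corresponding manifold gradients; the paper's numerical results suggest that using signatures from additional nearby points improves practical performance.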
Pages: 2540-2563 (24 pages)