Exploring high-dimensional optimization by sparse and low-rank evolution strategy

Cited by: 0
Authors
Li, Zhenhua [1 ,2 ]
Wu, Wei [1 ]
Zhang, Qingfu [3 ]
Cai, Xinye [4 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing 211106, Peoples R China
[2] MIIT Key Lab Pattern Anal & Machine Intelligence, Nanjing, Peoples R China
[3] City Univ Hong Kong, Dept Comp Sci, Hong Kong 999077, Peoples R China
[4] Dalian Univ Technol, Sch Control Sci & Engn, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Black-box optimization; Large-scale optimization; Evolution strategies; Sparse plus low-rank model; LOCAL SEARCH; SCALE; ADAPTATION; CMA;
DOI
10.1016/j.swevo.2024.101828
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Evolution strategies (ESs) are a robust family of algorithms for black-box optimization, yet their applicability to high-dimensional problems remains constrained by computational challenges. To address this, we propose a novel evolution strategy, SLR-ES, leveraging a sparse plus low-rank covariance matrix model. The sparse component utilizes a diagonal matrix to exploit separability along coordinate axes, while the low-rank component identifies promising subspaces and parameter dependencies. To maintain distribution fidelity, we introduce a decoupled update mechanism for the model parameters. Comprehensive experiments demonstrate that SLR-ES achieves state-of-the-art performance on both separable and non-separable functions. Furthermore, evaluations on the CEC'2010 and CEC'2013 large-scale global optimization benchmarks reveal consistent superiority in average ranking, highlighting the algorithm's robustness across diverse problem conditions. These results establish SLR-ES as a scalable and versatile solution for high-dimensional optimization.
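The covariance model the abstract describes can be illustrated with a short sketch. This is not the paper's implementation: the factor names (`d`, `V`), the rank `k`, and the sampling routine are assumptions, showing only how a diagonal ("sparse") plus low-rank covariance C = D + VVᵀ lets one draw offspring in O(n·k) per sample instead of O(n²) with a full covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k, lam = 10, 2, 6           # dimension, low-rank order, population size
mean = np.zeros(n)             # distribution mean (illustrative)
sigma = 0.5                    # global step size
d = np.ones(n)                 # diagonal component: D = diag(d**2)
V = rng.standard_normal((n, k)) / np.sqrt(n)  # low-rank factor

def sample(lam):
    # z ~ N(0, I_n) drives the diagonal part, w ~ N(0, I_k) the low-rank part,
    # so x = m + sigma * (d*z + V w) has covariance sigma^2 (D + V V^T)
    # without ever forming or factorizing an n-by-n matrix.
    z = rng.standard_normal((lam, n))
    w = rng.standard_normal((lam, k))
    return mean + sigma * (d * z + w @ V.T)

X = sample(lam)
print(X.shape)
```

With `d` held at the identity and `k = 0` this degenerates to an isotropic ES; the low-rank term is what captures parameter dependencies in a few directions.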
Pages: 16