Multifactorial Evolutionary Algorithm Based on Diffusion Gradient Descent

Cited by: 8
Authors
Liu, Zhaobo [1 ]
Li, Guo [2 ]
Zhang, Haili [3 ]
Liang, Zhengping [2 ]
Zhu, Zexuan [4 ,5 ,6 ]
Affiliations
[1] Shenzhen Univ, Inst Adv Study, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[2] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[3] Shenzhen Polytech, Inst Appl Math, Shenzhen 518055, Peoples R China
[4] Shenzhen Univ, Natl Engn Lab Big Data Syst Comp Technol, Shenzhen 518060, Peoples R China
[5] Shenzhen Pengcheng Lab, Shenzhen 518055, Peoples R China
[6] BGI Shenzhen, Shenzhen 518083, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Task analysis; Optimization; Convergence; Statistics; Sociology; Knowledge transfer; Costs; Convergence analysis; diffusion gradient descent (DGD); evolutionary multitasking (EMT); multifactorial evolutionary algorithm (MFEA); MULTITASKING; OPTIMIZATION; LMS;
DOI
10.1109/TCYB.2023.3270904
CLC (Chinese Library Classification)
TP [automation technology; computer technology]
Discipline code
0812
Abstract
The multifactorial evolutionary algorithm (MFEA) is one of the most widely used evolutionary multitasking (EMT) algorithms. The MFEA implements knowledge transfer among optimization tasks via crossover and mutation operators, and it obtains high-quality solutions more efficiently than single-task evolutionary algorithms. Despite the effectiveness of the MFEA in solving difficult optimization problems, there has been no proof of population convergence nor a theoretical explanation of how knowledge transfer improves algorithm performance. To fill this gap, we propose a new MFEA based on diffusion gradient descent (DGD), termed MFEA-DGD. We prove the convergence of DGD for multiple similar tasks and demonstrate that the local convexity of some tasks can help other tasks escape from local optima via knowledge transfer. On this theoretical foundation, we design complementary crossover and mutation operators for the proposed MFEA-DGD. As a result, the evolving population is governed by a dynamic equation similar to that of DGD; that is, convergence is guaranteed and the benefit of knowledge transfer is explainable. In addition, a hyper-rectangular search strategy is introduced to allow MFEA-DGD to explore more underdeveloped areas in the unified search space of all tasks and in the subspace of each task. The proposed MFEA-DGD is verified experimentally on various multitask optimization problems, and the results demonstrate that it converges faster to competitive results than state-of-the-art EMT algorithms. We also show that the experimental results can be interpreted in terms of the convexity of the different tasks.
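The DGD dynamic that the abstract builds on (each task takes a local gradient step, then iterates are mixed across tasks by a combination matrix) can be sketched as follows. This is a minimal illustration of diffusion gradient descent in the standard adapt-then-combine form, not the paper's MFEA-DGD: the two quadratic objectives, the step size `mu`, and the mixing matrix `A` are all assumptions chosen for the example.

```python
import numpy as np

def grad_f1(x):
    return 2.0 * (x - 1.0)      # f1(x) = (x - 1.0)^2, minimizer at 1.0

def grad_f2(x):
    return 2.0 * (x - 1.2)      # f2(x) = (x - 1.2)^2, minimizer at 1.2

A = np.array([[0.7, 0.3],       # doubly stochastic combination (mixing) matrix
              [0.3, 0.7]])
x = np.array([5.0, -5.0])       # one iterate per task, deliberately far apart
mu = 0.1                        # step size

for _ in range(200):
    # Adapt: each task takes its own local gradient step.
    psi = x - mu * np.array([grad_f1(x[0]), grad_f2(x[1])])
    # Combine: iterates diffuse across tasks through the mixing matrix,
    # which is the knowledge-transfer mechanism in this model.
    x = A @ psi

print(x)  # both iterates settle between the two minimizers, 1.0 and 1.2
```

Because the tasks are similar (nearby minimizers), the diffusion step pulls both iterates toward a common neighborhood; with dissimilar tasks, the fixed point instead balances the mixing weights against each task's own optimum, which is the trade-off the paper's convergence analysis makes precise.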
Pages: 4267-4279
Page count: 13
Related Articles
50 records in total
  • [11] A Group-based Approach to Improve Multifactorial Evolutionary Algorithm
    Tang, Jing
    Chen, Yingke
    Deng, Zixuan
    Xiang, Yanping
    Joy, Colin Paul
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 3870 - 3876
  • [12] Research on the Quadrotor of AHRS based on Gradient Descent Algorithm
    Lin Feng
    He Liuzeng
    2018 EIGHTH INTERNATIONAL CONFERENCE ON INSTRUMENTATION AND MEASUREMENT, COMPUTER, COMMUNICATION AND CONTROL (IMCCC 2018), 2018, : 1831 - 1834
  • [13] Hinge Classification Algorithm Based on Asynchronous Gradient Descent
    Yan, Xiaodan
    Zhang, Tianxin
    Cui, Baojiang
    Deng, Jiangdong
    ADVANCES ON BROAD-BAND WIRELESS COMPUTING, COMMUNICATION AND APPLICATIONS, BWCCA-2017, 2018, 12 : 459 - 468
  • [14] A gradient descent algorithm for LASSO
    Kim, Yongdai
    Kim, Yuwon
    Kim, Jinseog
    PREDICTION AND DISCOVERY, 2007, 443 : 73 - 82
  • [15] A Mirror Descent-Based Algorithm for Corruption-Tolerant Distributed Gradient Descent
    Wang, Shuche
    Tan, Vincent Y. F.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2025, 73 : 827 - 842
  • [16] Multifactorial evolutionary algorithm with adaptive transfer strategy based on decision tree
    Wei Li
    Xinyu Gao
    Lei Wang
    Complex & Intelligent Systems, 2023, 9 : 6697 - 6728
  • [17] Multifactorial evolutionary algorithm with adaptive transfer strategy based on decision tree
    Li, Wei
    Gao, Xinyu
    Wang, Lei
    COMPLEX & INTELLIGENT SYSTEMS, 2023, 9 (06) : 6697 - 6728
  • [18] Helper objective-based multifactorial evolutionary algorithm for continuous optimization
    Yang, Xu
    Li, Wenhua
    Wang, Rui
    Yang, Kang
    SWARM AND EVOLUTIONARY COMPUTATION, 2023, 78
  • [19] IdiffGrad: A Gradient Descent Algorithm for Intrusion Detection Based on diffGrad
    Sun, Weifeng
    Wang, Yiming
    Chang, Kangkang
    Meng, Kelong
    2021 IEEE 20TH INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (TRUSTCOM 2021), 2021, : 1583 - 1590
  • [20] Frequency memory based gradient descent bit flipping algorithm
    Asatani, Jun
    Kawanishi, Hiroaki
    Tokushige, Hitoshi
    Katayama, Kengo
    IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, 2015, 10 (05) : 585 - 591