A fast and efficient stochastic opposition-based learning for differential evolution in numerical optimization

Cited by: 28
Authors
Choi, Tae Jong [1 ]
Togelius, Julian [2 ]
Cheong, Yun-Gyung [3 ]
Affiliations
[1] Kyungil Univ, Dept AI Software, Gyongsan 38428, Gyeongsangbuk-do, South Korea
[2] NYU, Tandon Sch Engn, Brooklyn, NY 11201 USA
[3] Sungkyunkwan Univ, Coll Software, Suwon 16419, Gyeonggi-do, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Artificial intelligence; Evolutionary algorithms; Differential evolution; Opposition-based learning; Numerical optimization; Population diversity; Algorithm; Parameters; Ensemble; Mutation; Search; Strategies; Crossover;
DOI
10.1016/j.swevo.2020.100768
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A fast and efficient stochastic opposition-based learning (OBL) variant is proposed in this paper. OBL is a machine learning concept for accelerating the convergence of soft computing algorithms, which consists of evaluating an original solution and its opposite simultaneously. Recently, a stochastic OBL variant called BetaCOBL was proposed, which is capable of controlling the degree of the opposite solutions, preserving useful information held by the original solutions, and preventing the waste of fitness evaluations. Although it has shown outstanding performance compared with several state-of-the-art OBL variants, the high computational cost of BetaCOBL may hinder its use in cost-sensitive optimization problems. Also, because it assumes that the decision variables of a given problem are independent, BetaCOBL may be ineffective on nonseparable problems. In this paper, we propose an improved BetaCOBL that mitigates these limitations. The proposed algorithm, called iBetaCOBL, reduces the computational cost from O(NP² · D) to O(NP · D) (where NP and D denote the population size and the problem dimension, respectively) by using a linear-time diversity measure. In addition, the proposed algorithm preserves strongly dependent variables that are adjacent to each other by using a multiple exponential crossover. We used differential evolution (DE) variants to evaluate the performance of the proposed algorithm. The results of the performance evaluations on a set of 58 test functions show the excellent performance of iBetaCOBL compared with ten state-of-the-art OBL variants, including BetaCOBL.
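To make the complexity claim in the abstract concrete, the sketch below illustrates three building blocks in plain NumPy: the classical opposite-point calculation used in OBL, a centroid-based diversity measure that runs in O(NP · D) time, and the standard DE exponential crossover, whose contiguous copying is why adjacent, dependent variables tend to be preserved. This is a minimal illustration under stated assumptions, not the authors' implementation: the specific diversity measure, the beta-distribution-based opposite solutions, and the "multiple" form of exponential crossover used by iBetaCOBL are defined in the paper, and all function names here are illustrative.

import numpy as np

def opposite_population(pop, lower, upper):
    """Classical opposition-based learning: the opposite of x_j on
    [lower_j, upper_j] is lower_j + upper_j - x_j."""
    return lower + upper - pop

def centroid_diversity(pop):
    """An assumed example of a linear-time, O(NP * D), diversity measure:
    the mean Euclidean distance of individuals to the population centroid.
    A pairwise-distance measure would instead cost O(NP^2 * D)."""
    centroid = pop.mean(axis=0)
    return np.linalg.norm(pop - centroid, axis=1).mean()

def exponential_crossover(target, mutant, CR, rng):
    """Standard DE exponential crossover: copies one contiguous (circular)
    block of components from the mutant into the target, which keeps
    adjacent variables together; iBetaCOBL's multiple exponential
    crossover builds on this idea (not reproduced here)."""
    D = target.size
    trial = target.copy()
    j = int(rng.integers(D))  # random starting component
    L = 0
    while True:
        trial[(j + L) % D] = mutant[(j + L) % D]
        L += 1
        if L >= D or rng.random() >= CR:
            break
    return trial

# Toy usage: NP = 50 individuals in D = 10 dimensions on [-100, 100]^D.
rng = np.random.default_rng(0)
lower, upper = -100.0, 100.0
pop = rng.uniform(lower, upper, size=(50, 10))
opp = opposite_population(pop, lower, upper)
print(centroid_diversity(pop), centroid_diversity(opp))
print(exponential_crossover(pop[0], opp[0], CR=0.9, rng=rng))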
Pages: 37