Natural Computing Method Based on Nonlinear Dimension Reduction

Cited by: 2
Authors
Ji Weidong [1 ]
Sun Xiaoqing [1 ]
Lin Ping [2 ]
Luo Qiang [1 ]
Xu Haotian [1 ]
Affiliations
[1] Harbin Normal Univ, Coll Comp Sci & Informat Engn, Harbin 150025, Peoples R China
[2] Harbin Med Sci Univ, Harbin 150086, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Natural computing method; Optimization; Dimension reduction; Nonlinearity; PARTICLE SWARM OPTIMIZATION; ALGORITHM;
DOI
10.11999/JEIT190623
CLC Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
As artificial intelligence develops, many optimization problems grow into high-dimensional, large-scale optimization problems. Although high dimensionality can help an algorithm avoid being trapped in local optima, it offers no advantage in convergence speed or time feasibility. Therefore, a Nonlinear Dimension Reduction (NDR) strategy for natural computing methods is proposed. The strategy does not depend on any specific algorithm and is broadly applicable. In this method, the N initialized individuals are regarded as a matrix of N rows and D columns, and a maximal linearly independent set of the matrix's column vectors is computed, which reduces the redundancy of the matrix and lowers its dimension. Because every remaining column vector can be expressed as a linear combination of the maximal linearly independent set, random coefficients are applied to that set to maintain the diversity and integrity of the population. The standard Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) equipped with the NDR strategy are compared with standard PSO, GA, and four mainstream dimension-oriented optimization algorithms. Experiments show that the improved algorithms have stronger global convergence ability and better time complexity on most standard test functions.
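The abstract only outlines the column-selection idea, so a minimal Python sketch of how such a reduction might look is given below. The function name `ndr_reduce`, the greedy rank test, and the uniform random coefficients are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ndr_reduce(population, tol=1e-8, rng=None):
    """Sketch of the NDR idea: keep a maximal linearly independent subset of the
    population's column vectors (dimensions), then apply random coefficients to
    the kept columns to preserve population diversity."""
    rng = np.random.default_rng() if rng is None else rng
    X = np.asarray(population, dtype=float)  # N individuals x D dimensions

    # Greedily select columns whose addition increases the matrix rank;
    # the selected indices form a maximal linearly independent set.
    independent, current_rank = [], 0
    for j in range(X.shape[1]):
        if np.linalg.matrix_rank(X[:, independent + [j]], tol=tol) > current_rank:
            independent.append(j)
            current_rank += 1

    # Dropped columns are linear combinations of the kept ones, so random
    # coefficients on the kept columns (uniform weights here, an assumption)
    # stand in for the discarded information while keeping diversity.
    coeffs = rng.uniform(0.5, 1.5, size=(1, len(independent)))
    return X[:, independent] * coeffs, independent

# Usage: reduce a random 30 x 100 population to its independent dimensions.
if __name__ == "__main__":
    pop = np.random.rand(30, 100)
    reduced, kept = ndr_reduce(pop)
    print(reduced.shape, len(kept))
```

Because the rank of an N x D population matrix is at most N, the reduced search space has at most N dimensions, which is the source of the claimed speed-up when D is much larger than N.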
Pages: 1982-1989
Number of pages: 8