A double parameter self-scaling memoryless BFGS method for unconstrained optimization

Cited by: 4
Authors
Andrei, Neculai [1 ]
Affiliation
[1] Acad Romanian Scientists, Ctr Adv Modeling & Optimizat, Splaiul Independentei 54,Sect 5, Bucharest, Romania
Source
COMPUTATIONAL & APPLIED MATHEMATICS | 2020, Vol. 39, No. 3
Keywords
Unconstrained optimization; Self-scaling memoryless BFGS method; Global convergence; Numerical comparisons; CONVERGENCE CONDITIONS; ALGORITHMS;
DOI
10.1007/s40314-020-01157-z
CLC Classification
O29 [Applied Mathematics];
Subject Classification Code
070104 ;
Abstract
A double parameter self-scaling memoryless BFGS method for unconstrained optimization is presented. In this method, the first two terms of the self-scaling memoryless BFGS matrix are scaled with one positive parameter, while the third term is scaled with another positive parameter. The first parameter, scaling the first two terms, is determined so as to cluster the eigenvalues of the memoryless BFGS matrix. The second parameter, scaling the third term, is computed as a preconditioner to the Hessian of the minimizing function, combined with the minimization of the conjugacy condition, in order to shift the large eigenvalues of the self-scaling memoryless BFGS matrix to the left. The stepsize is determined by the Wolfe line search conditions. The global convergence of this method is proved under the assumption that the minimizing function is uniformly convex. Preliminary computational experiments on a set of 80 unconstrained optimization test functions show that this algorithm is more efficient and more robust than the self-scaling BFGS updates of Oren and Luenberger and of Oren and Spedicato. With respect to the CPU time metric, CG-DESCENT is the top performer. Comparisons with L-BFGS show that our algorithm is more efficient.
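The abstract describes scaling the memoryless BFGS matrix with two separate positive parameters and computing the search direction with a line search. The paper's exact parameter formulas are not reproduced in the abstract, so the sketch below is illustrative only: it assumes one common grouping of the memoryless BFGS terms, uses an Oren-Luenberger-type scaling for the first parameter, a placeholder value for the second, and simple Armijo backtracking in place of the Wolfe line search.

```python
import numpy as np

def ssml_direction(g, s, y, tau, gamma):
    """Search direction d = -H g for a two-parameter self-scaling
    memoryless BFGS matrix of the (assumed) form
        H = tau * (I - (s y' + y s') / (y's)) + gamma * (s s') / (y's).
    Only dot products are used; H is never formed explicitly."""
    ys = float(y @ s)
    Hg = tau * (g - ((s @ g) * y + (y @ g) * s) / ys) \
         + gamma * (s @ g) / ys * s
    return -Hg

def minimize_ssml(f, grad, x0, max_iter=500, tol=1e-8):
    """Illustrative driver: Armijo backtracking stands in for the
    Wolfe line search used in the paper."""
    x = x0.copy()
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, c1 = 1.0, 1e-4                # backtracking Armijo search
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        ys = float(y @ s)
        if ys > 1e-12:                       # curvature condition holds
            tau = ys / float(y @ y)          # Oren-Luenberger-type scaling (placeholder)
            gamma = 1.0 + tau * float(y @ y) / ys  # placeholder second parameter
            d = ssml_direction(g_new, s, y, tau, gamma)
            if g_new @ d >= 0:               # safeguard: require a descent direction
                d = -g_new
        else:
            d = -g_new                       # restart with steepest descent
        x, g = x_new, g_new
    return x
```

On a uniformly convex quadratic the iteration drives the gradient norm below the tolerance, which matches the setting of the paper's convergence result.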
Pages: 14