A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization

Cited by: 0
Authors
Kim, Yongjin [1 ]
Jong, Yunchol [1 ]
Kim, Yong [1 ]
Affiliation
[1] Univ Sci, Dept Math, Unjong Dist 355, Pyongyang 950003, North Korea
Keywords
unconstrained optimization; conjugate gradient method; multi-step secant condition; self-scaling; improved Wolfe line search; QUASI-NEWTON METHODS; GLOBAL CONVERGENCE; DESCENT; ALGORITHM; PROPERTY;
DOI
10.21136/AM.2024.0204-23
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the new conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager and H. Zhang (2005) and by Y. Dai and C. Kou (2013), respectively. It has been noted that these two conjugate gradient methods perform more efficiently than the SSML-BFGS method. Accordingly, C. Kou and Y. Dai (2015) proposed suitable modifications of the SSML-BFGS method such that the sufficient descent condition holds. To improve the modified SSML-BFGS method, in this paper we present an efficient SSML-BFGS-type three-term conjugate gradient method for unconstrained minimization that uses the Ford-Moghrabi secant equation instead of the usual secant equations. The method is shown to be globally convergent under certain assumptions. Numerical results comparing the method with those based on the usual secant equations are reported.
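As a rough illustration of the memoryless construction the abstract refers to, the sketch below computes a self-scaled memoryless BFGS search direction d = -H g from a single secant pair (s, y) = (x_{k+1} - x_k, g_{k+1} - g_k), where H is the BFGS update of the scaled identity tau*I. The function name, the Oren-Luenberger scaling choice tau = s'y / y'y, and the curvature safeguard are illustrative assumptions; the paper itself replaces (s, y) with a multi-step (Ford-Moghrabi) secant pair, which this sketch does not implement.

```python
import numpy as np

def ssml_bfgs_direction(g, s, y, eps=1e-12):
    """Three-term search direction d = -H g, where H is the memoryless
    BFGS update of tau*I built from the secant pair (s, y).

    Expanding H = tau*(I - s y'/s'y)(I - y s'/s'y) + s s'/s'y gives
    d = -tau*g + [tau*(g'y)/(s'y) - (1 + tau*(y'y)/(s'y))*(g's)/(s'y)]*s
        + tau*(g's)/(s'y)*y,
    a three-term combination of g, s, and y (no matrix storage needed).
    """
    sy = s @ y
    if abs(sy) < eps:            # curvature safeguard: fall back to steepest descent
        return -g
    tau = sy / (y @ y)           # Oren-Luenberger self-scaling parameter (assumed choice)
    gs, gy = g @ s, g @ y
    coeff_s = tau * gy / sy - (1.0 + tau * (y @ y) / sy) * gs / sy
    return -tau * g + coeff_s * s + (tau * gs / sy) * y
```

By construction H satisfies the secant condition H y = s, and when s'y > 0 the direction satisfies g'd = -g'H g < 0, i.e., it is a descent direction.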
Pages: 847-866 (20 pages)
Related papers
50 records in total
  • [31] New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization
    Zhang, Li
    COMPUTATIONAL & APPLIED MATHEMATICS, 2009, 28 (01): 111-133
  • [32] An AP-DE Algorithm Based on Multi-step Gradient Method
    Dai Dameng
    Mu Dejun
    CHINESE JOURNAL OF ELECTRONICS, 2016, 25 (01) : 146 - 151
  • [34] A CLASS OF DESCENT FOUR-TERM EXTENSION OF THE DAI-LIAO CONJUGATE GRADIENT METHOD BASED ON THE SCALED MEMORYLESS BFGS UPDATE
    Babaie-Kafaki, Saman
    Ghanbari, Reza
    JOURNAL OF INDUSTRIAL AND MANAGEMENT OPTIMIZATION, 2017, 13 (02) : 649 - 658
  • [35] A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization
    Wumei Sun
    Yufei Li
    Ting Wang
    Hongwei Liu
    Computational and Applied Mathematics, 2022, 41 (04)
  • [37] A Gradient Pursuit Algorithm Based on Multi-Step Quasi-Newton Method
    Hu, Yanjun
    Cheng, Lu
    Jiang, Fang
    Wang, Ren
    2019 IEEE 4TH INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND BIG DATA ANALYSIS (ICCCBDA), 2019, : 559 - 565
  • [38] A Velocity Prediction Method based on Self-Learning Multi-Step Markov Chain
    Zhou, Yang
    Ravey, Alexandre
    Pera, Marie-Cecile
    45TH ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY (IECON 2019), 2019, : 2598 - 2603
  • [39] Gradient Cleaning Method of Potato Based on Multi-Step Operation of Dry-Cleaning and Wet Cleaning
    Yang, Hongguang
    Yan, Jianchun
    Wei, Hai
    Wu, Huichang
    Wang, Shenying
    Ji, Longlong
    Xu, Xiaowei
    Xie, Huanxiong
    AGRICULTURE-BASEL, 2021, 11 (11):
  • [40] Color-based skin segmentation in videos using a multi-step spatial method
    Mohammad Reza Mahmoodi
    Sayed Masoud Sayedi
    Fariba Karimi
    Multimedia Tools and Applications, 2017, 76: 9785-9801