A family of spectral gradient methods for optimization

Cited: 3
Authors
Yu-Hong Dai
Yakui Huang
Xin-Wei Liu
Affiliations
[1] Chinese Academy of Sciences, LSEC, ICMSEC, Academy of Mathematics and Systems Science
[2] University of Chinese Academy of Sciences, Mathematical Sciences
[3] Hebei University of Technology, Institute of Mathematics
Source
Computational Optimization and Applications | 2019, Vol. 74
Keywords
Unconstrained optimization; Steepest descent method; Spectral gradient method; R-linear convergence; R-superlinear convergence
DOI: Not available
Abstract
We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the long Barzilai–Borwein (BB) stepsize and the short BB stepsize. Each member of the family is shown to possess a certain quasi-Newton property in the least-squares sense. The family also includes some other gradient methods as special cases. We prove that the family is R-superlinearly convergent for two-dimensional strictly convex quadratics. Moreover, the family is R-linearly convergent for strictly convex quadratics of any dimension. Numerical results for the family under different settings are presented and demonstrate that the proposed family is promising.
Pages: 43–65
Number of pages: 22
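
The abstract describes a stepsize formed as a convex combination of the long and short Barzilai–Borwein (BB) stepsizes. The following Python sketch illustrates one way such a scheme could look for a strictly convex quadratic; the fixed weight `gamma`, the function names, and the simple combination `gamma * bb1 + (1 - gamma) * bb2` are illustrative assumptions, not the exact family analyzed in the paper.

```python
import numpy as np

def bb_stepsizes(s, y):
    # Long (BB1) and short (BB2) Barzilai-Borwein stepsizes computed from
    # the differences s = x_k - x_{k-1} and y = g_k - g_{k-1}.
    bb1 = s.dot(s) / s.dot(y)   # long BB stepsize
    bb2 = s.dot(y) / y.dot(y)   # short BB stepsize
    return bb1, bb2

def spectral_gradient(A, b, x0, gamma=0.5, tol=1e-8, max_iter=1000):
    # Gradient method for f(x) = 0.5 x^T A x - b^T x whose stepsize is a
    # convex combination of the two BB stepsizes (illustrative scheme only).
    x = x0.copy()
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # simple choice for the first stepsize
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        bb1, bb2 = bb_stepsizes(s, y)
        alpha = gamma * bb1 + (1.0 - gamma) * bb2   # convex combination
        x, g = x_new, g_new
    return x, k

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((50, 50))
    A = Q.T @ Q + 50.0 * np.eye(50)    # strictly convex quadratic
    b = rng.standard_normal(50)
    x_star, iters = spectral_gradient(A, b, np.zeros(50))
    print(iters, np.linalg.norm(A @ x_star - b))
```

Setting gamma = 1 or gamma = 0 in this sketch recovers the long and short BB stepsizes, respectively; the paper studies the family of intermediate choices and its convergence rates.
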
Related papers
50 items in total
  • [21] A new family of conjugate gradient methods to solve unconstrained optimization problems
    Hassan, Basim A.
    Abdullah, Zeyad M.
    Hussein, Saif A.
    JOURNAL OF INFORMATION & OPTIMIZATION SCIENCES, 2022, 43 (04) : 811 - 820
  • [22] A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization
    He, Qing-Rui
    Li, Sheng-Jie
    Zhang, Bo-Ya
    Chen, Chun-Rong
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2024, 89 (03) : 805 - 842
  • [23] Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization
    Sun, Zhongbo
    Li, Hongyang
    Wang, Jing
    Tian, Yantao
    INTERNATIONAL JOURNAL OF COMPUTER MATHEMATICS, 2018, 95 (10) : 2082 - 2099
  • [24] A New Family of Conjugate Gradient Methods for Small-Scale Unconstrained Optimization
    Jusoh, Ibrahim
    Mamat, Mustafa
    Rivaie, Mohd
    PROCEEDINGS OF THE 20TH NATIONAL SYMPOSIUM ON MATHEMATICAL SCIENCES (SKSM20): RESEARCH IN MATHEMATICAL SCIENCES: A CATALYST FOR CREATIVITY AND INNOVATION, PTS A AND B, 2013, 1522 : 1360 - 1365
  • [25] Spectral-like conjugate gradient methods with sufficient descent property for vector optimization
    Yahaya, Jamilu
    Kumam, Poom
    Salisu, Sani
    Sitthithakerngkiet, Kanokwan
    PLOS ONE, 2024, 19 (05):
  • [26] Subsampled Nonmonotone Spectral Gradient Methods
    Bellavia, Stefania
    Krklec Jerinkic, Natasa
    Malaspina, Greta
    COMMUNICATIONS IN APPLIED AND INDUSTRIAL MATHEMATICS, 2020, 11 (01) : 19 - 34
  • [27] Gradient methods exploiting spectral properties
    Huang, Yakui
    Dai, Yu-Hong
    Liu, Xin-Wei
    Zhang, Hongchao
    OPTIMIZATION METHODS & SOFTWARE, 2020, 35 (04) : 681 - 705
  • [28] GRADIENT OPTIMIZATION METHODS WITH MEMORY
    MELESHKO, VI
    ENGINEERING CYBERNETICS, 1973, 11 (01) : 33 - 45
  • [29] A new family of conjugate gradient methods
    Shi, Zhen-Jun
    Guo, Jinhua
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2009, 224 (01) : 444 - 457
  • [30] Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
    Yu, Gaohang
    Guan, Lutai
    Chen, Wufan
    OPTIMIZATION METHODS & SOFTWARE, 2008, 23 (02) : 275 - 293