The adjoint Newton algorithm for large-scale unconstrained optimization in meteorology applications

Cited: 21
Authors
Wang, Z [1 ]
Droegemeier, K
White, L
Affiliations
[1] Univ Oklahoma, Ctr Anal & Predict Storms, Norman, OK 73019 USA
[2] Univ Oklahoma, Sch Meteorol, Norman, OK 73019 USA
[3] Univ Oklahoma, Dept Math, Norman, OK 73019 USA
Keywords
adjoint Newton algorithm; variational data assimilation; LBFGS; truncated Newton algorithm; large-scale unconstrained minimization;
DOI
10.1023/A:1018321307393
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Discipline classification codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
A new algorithm is presented for carrying out the large-scale unconstrained optimization required in variational data assimilation using the Newton method. The algorithm is referred to as the adjoint Newton algorithm. It is based on first- and second-order adjoint techniques that allow the Newton line-search direction to be obtained by integrating a tangent linear model backwards in time (starting from a final condition with negative time steps). The error incurred by approximating the Hessian (the matrix of second-order derivatives) of the cost function with respect to the control variables, as is done in quasi-Newton type algorithms, is thus completely eliminated, and the storage problem associated with the Hessian disappears because the explicit Hessian is never required. The adjoint Newton algorithm is applied to three one-dimensional models and to a two-dimensional limited-area shallow water equations model with both model-generated and First Global Geophysical Experiment data. We compare the performance of the adjoint Newton algorithm with that of the truncated Newton, adjoint truncated Newton, and LBFGS methods. Our numerical tests indicate that the adjoint Newton algorithm is very efficient and finds the minima within three or four iterations for the problems tested here. For the two-dimensional shallow water equations model, the adjoint Newton algorithm improves upon the efficiency of the truncated Newton and LBFGS methods by a factor of at least 14 in terms of the CPU time required to satisfy the same convergence criterion. The Newton, truncated Newton, and LBFGS methods are general-purpose unconstrained minimization methods. The adjoint Newton algorithm is useful only for optimal control problems in which the model equations serve as strong constraints and the corresponding tangent linear model can be integrated backwards in time. When the backwards integration of the tangent linear model is ill-posed in the sense of Hadamard, the adjoint Newton algorithm may not work, so it must be used with some caution. A possible solution to avoid this weakness of the adjoint Newton algorithm is proposed.
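To illustrate the general idea of obtaining a Newton direction without ever forming or storing the Hessian, the minimal sketch below implements a generic matrix-free Newton-CG step in Python/JAX. Hessian-vector products are computed by forward-over-reverse automatic differentiation, standing in for the first- and second-order adjoint model integrations described in the abstract; this is not the authors' adjoint Newton algorithm, and the toy cost function, variable names, and unit step length are hypothetical.

```python
# Matrix-free Newton-CG sketch (illustrative only; not the paper's adjoint
# Newton algorithm). Hessian-vector products replace an explicit Hessian.
import jax
import jax.numpy as jnp

def cost(x):
    # Hypothetical smooth cost standing in for the variational
    # data-assimilation cost function J(x).
    return jnp.sum((x - 1.0) ** 2) + 0.1 * jnp.sum(x ** 4)

grad = jax.grad(cost)

def hvp(x, v):
    # Hessian-vector product H(x) @ v, without forming H explicitly
    # (forward-over-reverse automatic differentiation).
    return jax.jvp(grad, (x,), (v,))[1]

def newton_direction(x, tol=1e-8, maxiter=50):
    # Solve H(x) d = -g(x) with conjugate gradients, using only
    # Hessian-vector products.
    g = grad(x)
    d, _ = jax.scipy.sparse.linalg.cg(lambda v: hvp(x, v), -g,
                                      tol=tol, maxiter=maxiter)
    return d

x = jnp.zeros(5)
for it in range(10):
    d = newton_direction(x)
    x = x + d  # unit step; a line search would be used in practice
    if jnp.linalg.norm(grad(x)) < 1e-6:
        break
print(it, x, cost(x))
```

In the paper's setting, the role played here by automatic differentiation is taken by backward integration of the tangent linear model together with the second-order adjoint, which is what makes the method specific to optimal control problems with strong model constraints.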
Pages: 283-320
Number of pages: 38
Related papers (50 in total)
  • [31] Planar Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization, Part 1: Theory
    G. Fasano
    Journal of Optimization Theory and Applications, 2005, 125 : 523 - 541
  • [32] Designing Large-Scale Metasurfaces with Parameterized Adjoint Optimization
    Mansouree, Mandad
    McClung, Andrew
    Samudrala, Sarath
    Arbabi, Amir
    2020 INTERNATIONAL APPLIED COMPUTATIONAL ELECTROMAGNETICS SOCIETY SYMPOSIUM (2020 ACES-MONTEREY), 2020,
  • [33] Curvilinear stabilization techniques for truncated Newton methods in large scale unconstrained optimization
    Lucidi, S
    Rochetich, F
    Roma, M
    SIAM JOURNAL ON OPTIMIZATION, 1998, 8 (04) : 916 - 939
  • [34] Numerical experiences with new truncated Newton methods in large scale unconstrained optimization
    Lucidi, S
    Roma, M
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 1997, 7 (01) : 71 - 87
  • [35] Preconditioning On Subspace Quasi-Newton Method For Large Scale Unconstrained Optimization
    Sim, Hong Seng
    Leong, Wah June
    Ismail, Fudziah
    STATISTICS AND OPERATIONAL RESEARCH INTERNATIONAL CONFERENCE (SORIC 2013), 2014, 1613 : 297 - 305
  • [36] Numerical Experiences with New Truncated Newton Methods in Large Scale Unconstrained Optimization
    Stefano Lucidi
    Massimo Roma
    Computational Optimization and Applications, 1997, 7 : 71 - 87
  • [37] A PARALLEL ASYNCHRONOUS NEWTON ALGORITHM FOR UNCONSTRAINED OPTIMIZATION
    CONFORTI, D
    MUSMANNO, R
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 1993, 77 (02) : 305 - 322
  • [38] Another Conjugate Gradient Algorithm with Guaranteed Descent and Conjugacy Conditions for Large-scale Unconstrained Optimization
    Neculai Andrei
    Journal of Optimization Theory and Applications, 2013, 159 : 159 - 182
  • [39] A THREE-TERM CONJUGATE GRADIENT ALGORITHM USING SUBSPACE FOR LARGE-SCALE UNCONSTRAINED OPTIMIZATION
    Chen, Yuting
    Yang, Yueting
    COMMUNICATIONS IN MATHEMATICAL SCIENCES, 2020, 18 (05) : 1179 - 1190
  • [40] A Competitive Divide-and-Conquer Algorithm for Unconstrained Large-Scale Black-Box Optimization
    Mei, Yi
    Omidvar, Mohammad Nabi
    Li, Xiaodong
    Yao, Xin
    ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 2016, 42 (02): : 1 - 24