Quasi-static ensemble variational data assimilation: a theoretical and numerical study with the iterative ensemble Kalman smoother
Cited by: 6

Authors:
Fillion, Anthony [1,2,3]
Bocquet, Marc [1,2]
Gratton, Serge [4]

Affiliations:
[1] Univ Paris Est, Joint Lab Ecole Ponts ParisTech, CEREA, Champs Sur Marne, France
[2] Univ Paris Est, EDF R&D, Champs Sur Marne, France
[3] CERFACS, Toulouse, France
[4] INPT IRIT, Toulouse, France
The analysis in nonlinear variational data assimilation is the solution of a non-quadratic minimization. Thus, the efficiency of the analysis relies on the minimization's ability to locate a global minimum of the cost function. If this minimization uses a Gauss-Newton (GN) method, it is critical for the starting point to be in the attraction basin of a global minimum. Otherwise the method may converge to a local extremum, which degrades the analysis. With chaotic models, the number of local extrema often increases with the temporal extent of the data assimilation window, making the former condition harder to satisfy. This is unfortunate because the assimilation performance also increases with this temporal extent. However, a quasi-static (QS) minimization may overcome these local extrema. It accomplishes this by gradually injecting the observations into the cost function. This method was introduced by Pires et al. (1996) in a 4D-Var context. We generalize this approach to four-dimensional strong-constraint nonlinear ensemble variational (EnVar) methods, which are based on both a nonlinear variational analysis and the propagation of dynamical error statistics via an ensemble. This forces one to consider the cost function minimizations in the broader context of cycled data assimilation algorithms. We adapt this QS approach to the iterative ensemble Kalman smoother (IEnKS), an exemplar of nonlinear deterministic four-dimensional EnVar methods. Using low-order models, we quantify the positive impact of the QS approach on the IEnKS, especially for long data assimilation windows. We also examine the computational cost of QS implementations and suggest cheaper algorithms.
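The quasi-static idea described above can be illustrated with a minimal sketch (this is not the authors' IEnKS): a strong-constraint 4D cost function on a small Lorenz-96 model is minimized with a damped Gauss-Newton solver, first with a single observation batch, then warm-started as further batches are injected into the cost function, in the spirit of Pires et al. (1996). The model size, observation operator, error statistics, finite-difference Jacobians, and backtracking safeguard are illustrative assumptions, not settings taken from the paper.

```python
import numpy as np

# Illustrative setup: Lorenz-96 model, every second variable observed.
n, F, dt = 12, 8.0, 0.05                    # state size, forcing, RK4 step
H = np.eye(n)[::2]                          # linear observation operator
STEPS_PER_OBS = 4                           # model steps between observation times

def l96(x):
    """Lorenz-96 tendency."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def step(x, nsteps):
    """RK4 integration over `nsteps` steps of length dt."""
    for _ in range(nsteps):
        k1 = l96(x); k2 = l96(x + 0.5 * dt * k1)
        k3 = l96(x + 0.5 * dt * k2); k4 = l96(x + dt * k3)
        x = x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return x

def hmk(x0, k):
    """Composed operator H o M_k: propagate x0 to observation time k and observe."""
    return H @ step(x0, k * STEPS_PER_OBS)

def jac_fd(f, x, eps=1e-6):
    """Finite-difference Jacobian (a crude stand-in for a tangent-linear model)."""
    fx = f(x)
    J = np.empty((fx.size, x.size))
    for j in range(x.size):
        xp = x.copy(); xp[j] += eps
        J[:, j] = (f(xp) - fx) / eps
    return J

def cost(x, xb, B_inv, obs, R_inv):
    """Strong-constraint 4D cost for the observation batches currently in `obs`."""
    J = 0.5 * (x - xb) @ B_inv @ (x - xb)
    for k, y in obs:
        d = y - hmk(x, k)
        J += 0.5 * d @ R_inv @ d
    return J

def gauss_newton(x0, xb, B_inv, obs, R_inv, iters=8):
    """Damped Gauss-Newton minimization started from x0."""
    x = x0.copy()
    for _ in range(iters):
        A = B_inv.copy()                    # Hessian of the linearized problem
        b = B_inv @ (xb - x)                # gradient terms (background part)
        for k, y in obs:
            Jk = jac_fd(lambda z, k=k: hmk(z, k), x)
            A += Jk.T @ R_inv @ Jk
            b += Jk.T @ R_inv @ (y - hmk(x, k))
        dx, alpha = np.linalg.solve(A, b), 1.0
        # Crude backtracking safeguard (an addition for robustness, not in the paper).
        while alpha > 1e-3 and cost(x + alpha * dx, xb, B_inv, obs, R_inv) > cost(x, xb, B_inv, obs, R_inv):
            alpha *= 0.5
        x = x + alpha * dx
    return x

rng = np.random.default_rng(0)
truth = step(F + rng.standard_normal(n), 200)               # spun-up true state
xb = truth + 0.5 * rng.standard_normal(n)                   # background estimate
B_inv, R_inv = np.eye(n) / 0.5**2, np.eye(H.shape[0]) / 0.2**2
K = 6                                                       # observation times in the window
obs = [(k, hmk(truth, k) + 0.2 * rng.standard_normal(H.shape[0])) for k in range(1, K + 1)]

# Quasi-static minimization: lengthen the window one observation batch at a
# time, warm-starting each Gauss-Newton solve from the previous analysis.
x_qs = xb.copy()
for k in range(1, K + 1):
    x_qs = gauss_newton(x_qs, xb, B_inv, obs[:k], R_inv)

# Direct minimization with all observations injected at once, for comparison.
x_direct = gauss_newton(xb, xb, B_inv, obs, R_inv)

print("QS analysis error:    ", np.linalg.norm(x_qs - truth))
print("direct analysis error:", np.linalg.norm(x_direct - truth))
```

In this sketch the benefit of the QS schedule is expected to appear mainly when the window is long enough for the cost function to develop secondary minima; for short, weakly nonlinear windows the two strategies should give similar analyses.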