Data assimilation applications lead to large-scale nonlinear least squares problems. These are commonly solved using Incremental 4D-Var, a truncated variant of the Gauss-Newton algorithm. Because of the very large problem size, the weighted linear least squares subproblem arising at each Gauss-Newton iteration is typically "solved" only inexactly, using a small number of iterations of the conjugate gradient (CG) method. An additional challenge is that the problem data are typically subject to significant measurement uncertainties.
In this talk, we discuss stochastic error analysis tools that are relevant in this context. First, we consider the first-order propagation of covariance matrices through the CG iteration. In the second part, we generalize this approach by investigating the sensitivity of certain matrix functions to random noise.
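The idea of first-order covariance propagation can be illustrated as follows. This is a sketch under stated assumptions, not the talk's analysis: the solution returned by a (truncated) CG solve is viewed as a function x(b) of the noisy right-hand side, and the covariance of x is approximated to first order as Σ_x ≈ J Σ_b Jᵀ, where J = ∂x/∂b. Here J is estimated by finite differences purely for illustration; the talk's approach is analytical.

```python
import numpy as np

def cg(A, b, maxiter):
    """Conjugate gradient, truncated after maxiter iterations."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < 1e-12:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def propagated_covariance(solver, b, Sigma_b, eps=1e-6):
    """First-order propagation: Sigma_x ~ J Sigma_b J^T, with the
    Jacobian J = d solver(b) / d b estimated column by column via
    forward finite differences (illustrative only)."""
    x0 = solver(b)
    J = np.empty((x0.size, b.size))
    for j in range(b.size):
        db = np.zeros(b.size)
        db[j] = eps
        J[:, j] = (solver(b + db) - x0) / eps
    return J @ Sigma_b @ J.T
```

When CG is run to full convergence on a well-conditioned system, x(b) = A⁻¹b is linear in b and the formula reduces to the exact result σ²A⁻¹A⁻ᵀ; the interesting regime in the talk is truncated CG, where x(b) is a nonlinear function of the data.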
This talk is based on joint work with Serge Gratton (ENSEEIHT and CERFACS, Toulouse), Philippe Toint (FUNDP, Namur), and Jean Tshimanga (ENSEEIHT, Toulouse).