Symmetric quasi-definite systems may be interpreted as regularized linear least-squares problems in appropriate metrics and arise in applications such as regularized interior-point methods for convex optimization and stabilized control problems. We propose two families of Krylov methods well suited to the solution of such systems, based on a preconditioned variant of the Golub-Kahan bidiagonalization process. The first family contains methods operating on the normal and Schur-complement equations, including generalizations of well-known methods such as LSQR and LSMR, as well as a new method, named CRAIG-MR, that aims to minimize the residual of the Schur-complement equations. The second family follows from a related Lanczos process and contains methods operating directly on the augmented system, which generalize the conjugate-gradient and minimum-residual methods. We establish connections between augmented-system and reduced-system methods. In particular, the conjugate-gradient method is well defined despite the indefiniteness of the operator. We provide an explanation for the often-observed staircase behavior of the residual in the minimum-residual method. An additional contribution is to provide explicit stopping criteria for all methods based on estimates of the relative direct error in appropriate norms, as opposed to criteria based on the residual. A lower-bound estimate is available at no additional computational cost, while an upper-bound estimate comes at the cost of a few additional scalar operations per iteration.
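For concreteness, a symmetric quasi-definite (SQD) augmented system has the block form below, where $M$ and $N$ are symmetric positive definite; the notation is a standard convention for such systems rather than taken verbatim from the paper. Eliminating $x$ yields the Schur-complement equations on which the first family of methods operates:

$$
\begin{bmatrix} M & A \\ A^T & -N \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix}
=
\begin{bmatrix} b \\ c \end{bmatrix},
\qquad M = M^T \succ 0, \quad N = N^T \succ 0.
$$

From the first block row, $x = M^{-1}(b - A y)$, and substitution into the second block row gives

$$
\left( A^T M^{-1} A + N \right) y = A^T M^{-1} b - c,
$$

whose coefficient matrix is symmetric positive definite even though the augmented matrix itself is indefinite, which is consistent with the conjugate-gradient method being well defined on such systems.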
Published May 2013, 55 pages