Group for Research in Decision Analysis

On Strong Consistency of the Regularized Least-Squares Estimates of Infinite Autoregressive Models

Yulia R. Gel

The talk addresses on-line parameter estimation of infinite autoregressive (AR) models with exponentially decaying coefficients. The practical importance of the problem follows from the fact that the class of such models includes (but is not limited to) all causal invertible ARMA(p,q) models. On-line parameter estimation means that the length of the observed data sample is not known a priori and may increase indefinitely. Hence the parameter estimates should be refined upon the arrival of every new observation. The maximum likelihood (ML) method is therefore not feasible due to its high computational burden, and recursive estimation procedures are preferable.
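The recursive setting described above can be pictured with a standard recursive least-squares (RLS) update for a finite AR(p) model, in which the estimate is refreshed at each new observation instead of refitting from scratch. This is a minimal illustrative sketch, not the estimator from the talk; the function name, the initialization constant `delta`, and the forgetting factor `lam` are assumptions.

```python
import numpy as np

def rls_ar(y, p, lam=1.0, delta=100.0):
    """Recursive least-squares fit of an AR(p) model (illustrative sketch).

    theta : current AR coefficient estimates (lag 1 first)
    P     : inverse-covariance-like matrix, initialized to delta * I
    lam   : forgetting factor (lam = 1 means no forgetting, i.e. plain LS)
    """
    theta = np.zeros(p)
    P = delta * np.eye(p)
    for t in range(p, len(y)):
        phi = y[t - p:t][::-1]                # regressor: p most recent values
        err = y[t] - phi @ theta              # one-step prediction error
        k = P @ phi / (lam + phi @ P @ phi)   # gain vector
        theta = theta + k * err               # refine estimate on new data
        P = (P - np.outer(k, phi @ P)) / lam  # update P without matrix inversion
    return theta
```

Each update costs O(p^2) operations regardless of how many observations have already been processed, which is the computational advantage over refitting ML on the whole growing sample.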

We usually assume that the true underlying model of the observed process is a finite AR, MA, or mixed ARMA equation. However, this assumption can rarely be justified in practice. A common approach is to approximate the true model by finite AR models whose orders are chosen by information criteria such as AIC, BIC, or PLS, and whose parameters may be estimated by the ordinary Least-Squares method, the Yule-Walker method, and others. In this paper we investigate a limiting case of approximation by finite AR models, i.e., an AR model of infinite order. We discuss the strong consistency of the exponentially and polynomially regularized Least-Squares estimates of AR models of infinite order. The regularizer may be interpreted as a smoothing operator applied to the number of AR coefficients being estimated, and constitutes a link to the model selection criteria. The proposed identification procedure is illustrated by simulations and an application to modeling fMRI data.
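One way to picture a regularized least-squares estimate of this kind (a sketch under assumptions, not the paper's exact estimator) is a ridge-type penalty whose weight grows with the lag index, so that high-lag coefficients are shrunk harder, consistent with an exponentially decaying AR tail. The weight schedule `alpha * rho**(2k)` and the default values of `rho` and `alpha` below are illustrative choices.

```python
import numpy as np

def regularized_ls_ar(y, p, rho=1.5, alpha=0.1):
    """Exponentially weighted ridge estimate of AR(p) coefficients (sketch).

    Solves  min_a ||Y - X a||^2 + alpha * sum_k rho**(2k) * a_k**2,
    where the penalty weight grows exponentially in the lag k, shrinking
    high-lag coefficients toward zero. rho and alpha are hypothetical
    tuning values, not taken from the paper.
    """
    n = len(y)
    # Column k holds the lag-(k+1) values y[t-1-k] for t = p, ..., n-1.
    X = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    Y = y[p:]
    W = np.diag(alpha * rho ** (2 * np.arange(1, p + 1)))
    # Closed-form penalized least-squares solution.
    return np.linalg.solve(X.T @ X + W, X.T @ Y)
```

Fitting with a deliberately large order p then lets the penalty, rather than a hard order cut-off, suppress the spurious high-lag terms, which is the sense in which such a regularizer plays the role of a model selection criterion.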