Estimation of the mean of a multivariate normal distribution is considered. The components of the mean vector are assumed to be intra-block exchangeable; this is modeled in a hierarchical Bayesian fashion, with a multivariate normal distribution as the first-stage prior. The resulting Bayes estimator is calculated and shown to be robust to the presence of outlying coordinates in the vector of observations. More specifically, the hierarchical Bayesian approach is used to express the problem as a general linear model in which the components of the mean vector can be partitioned into smaller subclasses, each consisting of components that may be assumed exchangeable. Under quadratic loss, the Bayes estimator is the posterior mean, which is obtained by standard techniques. For a large class of design matrices, this estimator is seen to be "partially" insensitive to outlying components of the observation vector: most components of the estimator shrink towards 0, or towards a suitably chosen prior mean, while some collapse to the corresponding components of the least-squares estimator. Furthermore, the proposed estimators possess good frequentist properties, such as admissibility. Finally, the use of the proposed estimator is illustrated on data sets from an animal breeding experiment and from a clinical trial. In the first example, the breeding values of sires are computed with the proposed estimator and compared with the more familiar BLUP solution. In the second example, the hierarchical Bayes method leads to shorter confidence intervals, and the corresponding estimator is shown to be less sensitive to outlying values than the least-squares estimator.
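The shrinkage behaviour described above can be sketched in the simplest conjugate normal-normal setting, a deliberate simplification of the paper's model: observations y_i ~ N(theta_i, sigma2) with an exchangeable first-stage prior theta_i ~ N(mu0, tau2). The names `posterior_mean`, `mu0`, `sigma2`, and `tau2` are illustrative, not from the paper, and this plain conjugate prior does not reproduce the paper's outlier robustness — it shrinks every coordinate, outliers included, which is precisely the behaviour the proposed estimator improves upon.

```python
import numpy as np

def posterior_mean(y, mu0, sigma2, tau2):
    """Posterior mean in the conjugate normal-normal model.

    Each coordinate y_i is pulled toward the prior mean mu0 by the
    shrinkage factor b = sigma2 / (sigma2 + tau2); the Bayes estimator
    under quadratic loss is mu0 + (1 - b) * (y - mu0).
    """
    b = sigma2 / (sigma2 + tau2)
    return mu0 + (1.0 - b) * (y - mu0)

# Last coordinate is an outlier; with sigma2 = tau2 = 1 every
# coordinate, the outlier included, is shrunk halfway toward mu0 = 0.
y = np.array([0.5, -0.3, 0.1, 8.0])
est = posterior_mean(y, mu0=0.0, sigma2=1.0, tau2=1.0)
# est = [0.25, -0.15, 0.05, 4.0]
```

Under the estimator proposed in the paper, by contrast, the outlying fourth coordinate would instead collapse toward its least-squares value (here simply y_4 = 8.0), while the well-behaved coordinates would still be shrunk toward the prior mean.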
Published July 1990, 23 pages
This cahier was revised in December 1995