There has been much recent statistical research on inference under constraints. Here we consider the problem of bounded parameter estimation, in particular that of normal and Poisson means, with minimaxity as the criterion of evaluation. Because linear minimax rules are easy to compute, we also study the ratio of their risks to the nonlinear minimax risks for these problems. To find the minimax solution, one often considers the dual problem of finding the least favorable prior distribution. On bounded parameter spaces the least favorable prior is often discrete, so finding the minimax estimator and its risk is equivalent to a constrained global optimization problem. Previously published numerical specifications of these priors have relied on iterative (often heuristic) procedures. We propose two global optimization procedures. The first is a multivariate Lipschitz optimization method that makes use of bounds on the first-order derivatives. The second is a decomposition procedure that exploits the partial concavity of the Bayes risk function. A comparison of the two shows the decomposition method to be much more efficient than Lipschitz optimization. We also show that the Ibragimov-Hasminskii constant, the maximum of the ratio of the linear to the nonlinear minimax risk, differs between the Poisson and normal problems.
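The linear half of the risk ratio studied here has a well-known closed form in the normal case: for X ~ N(theta, sigma^2) with |theta| <= m, a linear rule aX has risk a^2 sigma^2 + (1 - a)^2 theta^2, whose worst case over the interval (attained at theta = ±m) is minimized at a = m^2/(m^2 + sigma^2), giving linear minimax risk m^2 sigma^2/(m^2 + sigma^2). A small numerical check of this standard formula (illustrative only; not the paper's computation):

```python
def linear_minimax_risk(m, sigma, n=2000):
    """Worst-case risk of the best linear rule a*X for estimating a
    normal mean theta in [-m, m] from X ~ N(theta, sigma^2),
    found by a grid search over the coefficient a in [0, 1]."""
    best = float("inf")
    for i in range(n + 1):
        a = i / n
        # For fixed a, the risk a^2 sigma^2 + (1 - a)^2 theta^2 is
        # maximized over |theta| <= m at theta = +/- m.
        worst = a * a * sigma * sigma + (1 - a) ** 2 * m * m
        best = min(best, worst)
    return best

# Closed form m^2 sigma^2 / (m^2 + sigma^2) gives 0.5 when m = sigma = 1.
numeric = linear_minimax_risk(1.0, 1.0)
closed_form = 1.0 * 1.0 / (1.0 + 1.0)
```

The grid search agrees with the closed form to within the grid resolution; the nonlinear minimax risk, by contrast, has no such closed form, which is what motivates the optimization procedures above.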
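The flavor of Lipschitz optimization with derivative bounds can be conveyed in one dimension by the classical Piyavskii-Shubert saw-tooth scheme, sketched below. This is only an illustration of the general technique under an assumed Lipschitz constant, not the paper's multivariate procedure, and the objective here is a toy function rather than a Bayes risk:

```python
import bisect

def shubert_maximize(f, a, b, L, tol=1e-4, max_iter=500):
    """Piyavskii-Shubert global maximization of f on [a, b] for a
    function with known Lipschitz constant L.  Returns (x_best, f_best)."""
    xs = [a, b]            # evaluation points, kept sorted
    fs = [f(a), f(b)]      # function values at those points
    for _ in range(max_iter):
        f_best = max(fs)
        # On [x1, x2] the saw-tooth upper bound peaks at
        # x* = (x1 + x2)/2 + (f2 - f1)/(2L), with peak value
        # (f1 + f2)/2 + L*(x2 - x1)/2.
        cand_x, cand_ub = None, f_best
        for x1, f1, x2, f2 in zip(xs, fs, xs[1:], fs[1:]):
            ub = 0.5 * (f1 + f2) + 0.5 * L * (x2 - x1)
            if ub > cand_ub:
                cand_ub = ub
                cand_x = 0.5 * (x1 + x2) + (f2 - f1) / (2.0 * L)
        if cand_x is None or cand_ub - f_best <= tol:
            break                      # upper bound certifies near-optimality
        i = bisect.bisect(xs, cand_x)  # keep the point list sorted
        xs.insert(i, cand_x)
        fs.insert(i, f(cand_x))
    i = max(range(len(fs)), key=fs.__getitem__)
    return xs[i], fs[i]

# Toy objective with maximizer at x = 0.3; |f'(x)| <= 1.4 on [0, 1], so L = 2 is valid.
x_best, f_best = shubert_maximize(lambda x: -(x - 0.3) ** 2, 0.0, 1.0, L=2.0)
```

The derivative bound enters through L: a larger constant gives a looser saw-tooth envelope and hence more function evaluations, which is one reason a structure-exploiting method such as the decomposition procedure can be much faster.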
Published October 1990, 44 pages.
This cahier was revised in February 1992.