
G-2002-56

Convergence Results for Pattern Search Algorithms are Tight


The convergence theory of generalized pattern search algorithms for unconstrained optimization guarantees, under mild conditions, that the method produces a limit point satisfying first-order optimality conditions related to the local differentiability of the objective function. By exploiting the flexibility allowed by the algorithm, we derive six small-dimensional examples showing that the convergence results are tight, in the sense that they cannot be strengthened without additional assumptions, i.e., that certain requirements imposed on pattern search algorithms are not merely artifacts of the proofs.
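As background for readers unfamiliar with the class of methods being analyzed, the sketch below is a minimal coordinate-search instance of the generalized pattern search framework, written only to make the poll step and the mesh size parameter concrete. It is not the algorithm studied in the paper nor any of its six examples; the polling directions, the rational factor tau, the stopping tolerance, and the quadratic test function are assumptions chosen for illustration.

from fractions import Fraction


def pattern_search(f, x0, delta0=Fraction(1), tau=Fraction(2),
                   tol=Fraction(1, 2**20), max_iter=10_000):
    """Coordinate-search instance of GPS: poll the signed coordinate directions on a mesh."""
    x = list(x0)
    n = len(x)
    delta = delta0                       # mesh size parameter Delta_k
    # Polling directions: the 2n signed coordinate directions (a positive spanning set).
    directions = [tuple((s if j == i else 0) for j in range(n))
                  for i in range(n) for s in (1, -1)]
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for d in directions:             # poll step: evaluate neighbouring mesh points
            trial = [x[j] + delta * d[j] for j in range(n)]
            ft = f(trial)
            if ft < fx:                  # simple decrease suffices in the GPS framework
                x, fx, improved = trial, ft, True
                break                    # opportunistic poll: accept the first improvement
        # Rational expansion/contraction of the mesh size (tau rational, integer exponent).
        delta = delta * tau if improved else delta / tau
        if delta < tol:                  # stop once the mesh is sufficiently fine
            break
    return x, delta


if __name__ == "__main__":
    # Hypothetical smooth test function, chosen only to exercise the sketch.
    quad = lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2
    x_star, mesh = pattern_search(quad, [Fraction(0), Fraction(0)])
    print([float(c) for c in x_star], float(mesh))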

In particular, we show the necessity of the requirement that some algorithmic parameters are rational. We show that the method may generate infinitely many limit points, some of which may have non-zero gradients, even for C¹ functions. We also show that, even when a single limit point is generated, its gradient may be non-zero, and that zero may be excluded from the generalized gradient. Therefore, the method does not necessarily produce a stationary point in the Clarke sense.
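To indicate where the rationality requirement enters, one common way of writing the GPS mesh and its update (a sketch in the spirit of the Audet–Dennis formulation, with the symbols D, Δ_k, τ and w_k assumed here rather than quoted from this paper) is:

\[
M_k = \left\{\, x_k + \Delta_k D z \;:\; z \in \mathbb{N}^{n_D} \,\right\},
\qquad
\Delta_{k+1} = \tau^{w_k}\,\Delta_k,\quad w_k \in \mathbb{Z},\ \tau > 1 \text{ rational}.
\]

Because τ is rational (and the directions in D are built from integer combinations of a fixed basis), all iterates remain on a common scaled lattice; this lattice structure is what the standard convergence analysis relies on, and the examples in the paper show that the rationality requirement cannot simply be dropped.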

21 pages

Publication

Convergence results for pattern search algorithms are tight
Optimization and Engineering, 5(2), 101–122, 2004