Group for Research in Decision Analysis


A MADS Algorithm with a Progressive Barrier for Derivative-Free Nonlinear Programming


We propose a new algorithm for general constrained derivative-free optimization. As in most methods, constraint violations are aggregated into a single constraint violation function. As in filter methods, a threshold, or barrier, is imposed on the constraint violation function, and any trial point whose constraint violation function value exceeds this threshold is discarded from consideration. Unlike in filter methods, however, the new algorithm progressively decreases this barrier threshold on the constraint violation as the algorithm evolves.
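The aggregation and barrier test described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the quadratic aggregation `h(x) = Σ max(0, c_i(x))²` and the names `violation`, `accept`, and `h_max` are assumptions chosen for clarity.

```python
def violation(c_values):
    """Aggregate constraints c_i(x) <= 0 into a single violation
    function h(x), here the sum of squared violations (an assumed form)."""
    return sum(max(0.0, c) ** 2 for c in c_values)

def accept(c_values, h_max):
    """Barrier test: a trial point is kept only if h(x) <= h_max.
    In a progressive-barrier scheme, h_max is decreased over the
    iterations, tightening the set of admissible infeasible points."""
    return violation(c_values) <= h_max
```

For example, a point violating one constraint by 0.5 has `h = 0.25` and is accepted while the barrier is at `h_max = 0.5`, but rejected once the barrier has been tightened below 0.25.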

Using the Clarke nonsmooth calculus, we prove Clarke stationarity results for limit points of the sequences of feasible and infeasible trial points. The new method is effective on two academic test problems with up to 50 variables, which were problematic for our GPS filter method. We also test it on a chemical engineering problem. The proposed method generally outperforms our LTMADS when no feasible initial points are known, and performs comparably when feasible initial points are available.

36 pages