In derivative-free and blackbox optimization, the objective function is often evaluated through the execution of a computer program treated as a blackbox. The evaluation can be noisy, in the sense that its outputs are contaminated by random errors. Sometimes, the source of these errors is identified and controllable, meaning that it is possible to reduce the standard deviation of the stochastic noise it generates. A common strategy in this situation is to monotonically decrease this standard deviation so that it asymptotically converges to zero, which ensures convergence of the algorithms since the noise is eliminated in the limit. This work presents MPMADS, an algorithm that follows this approach. In practice, however, reducing the standard deviation increases the computation time and lengthens the optimization process. Therefore, a second algorithm, called DPMADS, is introduced to explore another strategy, which does not force the standard deviation to decrease monotonically. Although these strategies are proven to be theoretically equivalent, tests on analytical problems and an industrial blackbox are presented to illustrate their practical differences.
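The monotone noise-reduction strategy described above can be illustrated with a minimal sketch. This is not the MPMADS algorithm itself; it is a simple direct-search loop on a hypothetical noisy test objective, where the noise standard deviation is driven to zero as iterations progress, mimicking the idea of tightening evaluation accuracy over time.

```python
import math
import random

def noisy_eval(x, sigma):
    """Hypothetical blackbox: a smooth objective contaminated by
    Gaussian noise whose standard deviation sigma is controllable."""
    return (x - 2.0) ** 2 + random.gauss(0.0, sigma)

def minimize_with_decreasing_noise(x0, iterations=200, step=0.5, sigma0=1.0):
    """Toy random local search in which sigma decreases monotonically
    toward zero, so late evaluations are nearly noise-free."""
    x = x0
    fx = noisy_eval(x, sigma0)
    for k in range(1, iterations + 1):
        sigma = sigma0 / math.sqrt(k)        # monotone decrease toward zero
        candidate = x + random.uniform(-step, step)
        fc = noisy_eval(candidate, sigma)
        if fc < fx:                          # keep only improving points
            x, fx = candidate, fc
        else:
            step *= 0.99                     # slowly shrink the search radius
    return x

random.seed(0)
x_star = minimize_with_decreasing_noise(x0=10.0)
print(x_star)  # should land near the true minimizer x = 2
```

The practical drawback noted in the abstract appears here as well: in a real blackbox, a smaller sigma typically means more replications or a finer simulation, so each late iteration becomes more expensive.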
Published November 2019, 29 pages