The advent of computational science has unveiled large classes of nonlinear optimization problems in which derivatives of the objective and/or constraints are unavailable. These problems are often posed as black-box optimization problems, but rarely is this by necessity. In this talk, we report on our experience extracting additional structure from problems consisting of both black-box and algebraic, or otherwise known, components. We provide examples from nonlinear least-squares/calibration problems, as well as from settings where derivatives of some nonlinear constraints, or derivatives with respect to a subset of the decision variables, are known. In each case, we use quadratic surrogates to model both the black-box and algebraic components, obtaining new grey-box optimization methods.
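To illustrate the surrogate-building step mentioned above, the following is a minimal sketch (not the speaker's actual method) of fitting a quadratic model m(x) = c + gᵀx + ½ xᵀHx to sampled values of a black-box component by linear least squares; the function names and sampling choices are assumptions for illustration only.

```python
import numpy as np

def fit_quadratic_surrogate(X, f):
    """Fit m(x) = c + g.x + 0.5 x^T H x to samples by least squares.

    X: (m, n) array of sample points; f: (m,) black-box values at X.
    Requires m >= (n+1)(n+2)/2 samples for a determined fit.
    """
    m, n = X.shape
    # Monomial basis: constant, linear terms, and quadratic terms x_i*x_j (i <= j).
    cols = [np.ones(m)]
    cols += [X[:, i] for i in range(n)]
    for i in range(n):
        for j in range(i, n):
            cols.append(X[:, i] * X[:, j])
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, f, rcond=None)
    c = coef[0]
    g = coef[1:1 + n]
    # Reassemble the symmetric Hessian H from the quadratic coefficients.
    H = np.zeros((n, n))
    k = 1 + n
    for i in range(n):
        for j in range(i, n):
            if i == j:
                H[i, i] = 2.0 * coef[k]  # 0.5*H[i,i]*x_i^2 matches coef*x_i^2
            else:
                H[i, j] = H[j, i] = coef[k]
            k += 1
    return c, g, H

def surrogate(x, c, g, H):
    """Evaluate the fitted quadratic model at x."""
    return c + g @ x + 0.5 * x @ H @ x
```

In a grey-box least-squares setting, such a model would replace only the black-box residuals, while the algebraic residuals and any known derivatives are used directly; the model is exact when the underlying component is itself quadratic.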
Group for Research in Decision Analysis