Applied to strongly monotone variational inequality problems, Newton's algorithm achieves local quadratic convergence. However, global convergence cannot be guaranteed by monitoring the decrease of a convex objective function. In this paper we show how the basic Newton method can be modified to yield an algorithm that monotonically decreases the so-called "gap function" associated with the variational inequality by solving a sequence of linear programs. Convergence of the algorithm does not depend on strict or strong monotonicity assumptions; under strong monotonicity and strict complementarity assumptions, however, the set of active constraints at the solution is implicitly identified, and quadratic convergence is achieved.
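As background for the abstract above: for a variational inequality defined by a map F over a polyhedral feasible set K, the gap function is commonly defined as g(x) = max_{y in K} F(x)^T (x - y); it is nonnegative on K and vanishes exactly at solutions, and for polyhedral K evaluating it amounts to solving one linear program. The following minimal sketch (not the paper's algorithm, just an illustration of the gap-function evaluation) uses an assumed affine map F(x) = Mx + q and a box constraint set, both chosen for illustration only:

```python
import numpy as np
from scipy.optimize import linprog

def gap(F, x, A_ub, b_ub):
    """Evaluate the gap function g(x) = max_{y in K} F(x)^T (x - y),
    where K = {y : A_ub @ y <= b_ub}. Since F(x) is fixed, the inner
    maximization is the LP: minimize F(x)^T y over K.
    Returns the gap value and the LP minimizer y."""
    c = F(x)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * len(x))
    return float(c @ x - res.fun), res.x

# Illustrative strongly monotone VI: F(x) = M x + q with M positive definite
M = np.array([[2.0, 0.5], [0.5, 1.0]])
q = np.array([-1.0, -1.0])
F = lambda x: M @ x + q

# K = unit box [0, 1]^2, written as A_ub @ y <= b_ub
A_ub = np.vstack([np.eye(2), -np.eye(2)])
b_ub = np.array([1.0, 1.0, 0.0, 0.0])

g, y = gap(F, np.array([0.5, 0.5]), A_ub, b_ub)
# g > 0 here, so x = (0.5, 0.5) does not solve the VI; y is the LP vertex
# toward which a linearization-based (Frank-Wolfe-type) step would move.
```

A Newton-type method of the kind the abstract describes would use such LP solutions to generate directions along which the gap function decreases monotonically.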
Published February 1985, 19 pages