
G-2012-51

Subgradient Optimization in Nonsmooth Optimization (including the Soviet Revolution)


Convex nondifferentiable optimization, also known as convex nonsmooth optimization (NDO), studies problems in which the functions involved are not continuously differentiable. At kinks or corner points the gradient does not exist, so the function cannot be approximated locally by a tangent hyperplane or by a quadratic model. Directional derivatives, however, still exist because of convexity.
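A concrete illustration, added here for the reader and not part of the abstract: a subgradient of a convex function f at a point x is any vector g that defines a supporting hyperplane, and the set of all such vectors, the subdifferential, replaces the gradient at kinks. In LaTeX notation,

\[
\partial f(x) = \{\, g : f(y) \ge f(x) + g^{\top}(y - x) \ \text{for all } y \,\},
\qquad
\partial\,|\cdot|\,(0) = [-1, 1],
\]

so at the kink of the absolute-value function every slope between -1 and 1 is a subgradient.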

NDO problems are widespread, often arising from reformulations of smooth or linear problems into a space with a much smaller number of variables than the original. An example is the reformulation implicit in Dantzig-Wolfe decomposition or column generation (Dantzig and Wolfe, 1960, 1961), which is equivalent by duality to the cutting plane method of Cheney and Goldstein (1959) and Kelley (1960). These methods do not work well if an aggregated formulation is used. Shor's subgradient method (Shor, 1962, 1964) provided a superior alternative, leading to a true Soviet revolution. His work was extended, both in theory and in practice, by numerous authors. Held and Karp (1971), unaware of Shor's work, developed a method for the traveling salesman problem that uses subgradient optimization to compute a bound in a Lagrangean relaxation scheme. This seminal contribution also attracted a huge following; see for instance Fisher (1981).
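The abstract gives no algorithmic detail; as a minimal sketch, the Python fragment below illustrates the basic subgradient iteration x_{k+1} = x_k - t_k g_k with a diminishing step size on a piecewise-linear objective. The objective data a, b, the step rule 1/sqrt(k), and the iteration count are illustrative assumptions, not taken from the paper.

import numpy as np

def subgradient_method(f, subgrad, x0, iters=500):
    # Basic subgradient iteration with diminishing steps t_k = 1/sqrt(k).
    # f may increase along the way, so the best iterate found is tracked.
    x = x0.copy()
    best_x, best_f = x.copy(), f(x)
    for k in range(1, iters + 1):
        g = subgrad(x)                      # any subgradient of f at x
        x = x - g / np.sqrt(k)              # step along the negative subgradient
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

# Illustrative objective: f(x) = max_i (a_i . x + b_i), a piecewise-linear
# convex function whose kinks lie where the maximum is attained twice.
a = np.array([[1.0, 2.0], [-1.0, 1.0], [0.5, -1.5]])
b = np.array([0.0, 1.0, -0.5])
f = lambda x: np.max(a @ x + b)
subgrad = lambda x: a[np.argmax(a @ x + b)]  # gradient of an active piece
x_best, f_best = subgradient_method(f, subgrad, np.zeros(2))
print(x_best, f_best)

With bounded subgradients and steps that vanish but sum to infinity, the best recorded value converges to the minimum, even though no individual step is guaranteed to decrease f.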

14 pages

Publication

Subgradient optimization in nonsmooth optimization (including the Soviet Revolution)
Documenta Math., Extra Volume ISMP, 277–290, 2012