In this note we propose a simple but efficient modification of the well-known Nelder-Mead (NM) simplex search method for unconstrained optimization. Instead of moving all n simplex vertices at once toward the best vertex, our "shrink" step moves them in the same direction one by one until an improvement is obtained. In addition, for solving non-convex problems, we restart the modified NM (MNM) method by constructing an initial simplex around the solution obtained in the previous phase; restarts are repeated as long as the objective function value improves. Thus, our restarted modified NM (RMNM) is a descent, deterministic method and may be seen as an extended local search for continuous optimization. To improve computational complexity and efficiency, we use a heap data structure for storing and updating the simplex vertices. Extensive empirical analysis shows that the modified method outperforms, on average, the original version as well as some other recent successful modifications, and that on global optimization problems it is comparable with state-of-the-art heuristics while being easier to implement and more user-friendly.
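The one-by-one shrink step can be sketched as follows. This is a minimal illustration of the idea described above, not the authors' implementation: the function name, the worst-first processing order, and the shrink coefficient `sigma` are assumptions made for the example.

```python
import numpy as np

def modified_shrink(simplex, f_vals, f, sigma=0.5):
    """Sketch of a one-by-one shrink toward the best vertex.

    Instead of shrinking all vertices at once (as in classical NM),
    move each non-best vertex toward the best vertex in turn and stop
    as soon as one move improves on the current best function value.
    """
    best = int(np.argmin(f_vals))
    x_best = simplex[best]
    order = [i for i in range(len(simplex)) if i != best]
    # Assumption for this sketch: process the worst vertices first.
    order.sort(key=lambda i: f_vals[i], reverse=True)
    for i in order:
        simplex[i] = x_best + sigma * (simplex[i] - x_best)
        f_vals[i] = f(simplex[i])
        if f_vals[i] < f_vals[best]:  # improvement found: stop early
            break
    return simplex, f_vals

# Example on the sphere function f(x) = sum(x**2):
f = lambda x: float(np.sum(x**2))
simplex = np.array([[2.0, 2.0], [3.0, 0.0], [0.0, 3.0]])
f_vals = np.array([f(v) for v in simplex])
simplex, f_vals = modified_shrink(simplex, f_vals, f)
```

In this example the very first moved vertex already improves on the best value, so the remaining vertex is left untouched; a full-simplex shrink would have evaluated every vertex regardless.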
Published March 2008, 24 pages