Power system operators continuously monitor their networks by solving a nonlinear least-squares problem known as state estimation. The problem is nonconvex, so even in the absence of measurement errors, local search algorithms can become stuck at a spurious estimate. Recently, the low-rank matrix recovery problem in machine learning was shown to contain no spurious local minima under a restricted isometry property (RIP). In fact, state estimation is just a special case of low-rank matrix recovery, so it too should admit no spurious local minima under a similar assumption.
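To make the shared structure concrete, here is a minimal numerical sketch, assuming a synthetic rank-1 instance with random symmetric Gaussian measurements (the sizes and variable names are illustrative, not from the talk; real power-flow measurement operators are far more structured):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 40                      # state dimension, number of measurements
x_true = rng.standard_normal(n)   # ground-truth state (rank-1 factor)

# Symmetric Gaussian measurement matrices A_i -- illustrative stand-ins
# for the structured measurement operators of a real power network.
A = rng.standard_normal((m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2
b = np.einsum('mij,i,j->m', A, x_true, x_true)   # b_i = <A_i, x x^T>

def loss(u):
    """Nonconvex least-squares objective f(u) = (1/2) sum_i (<A_i, u u^T> - b_i)^2."""
    r = np.einsum('mij,i,j->m', A, u, u) - b
    return 0.5 * (r @ r)

def grad(u):
    """Gradient sum_i r_i * 2 A_i u (each A_i is symmetric)."""
    r = np.einsum('mij,i,j->m', A, u, u) - b
    return 2 * np.einsum('m,mij,j->i', r, A, u)

# Plain gradient descent from a random start. Because f is nonconvex, the
# iterate can stall at a spurious local minimum instead of recovering x_true.
u = rng.standard_normal(n)
for _ in range(5000):
    u = u - 3e-5 * grad(u)
```

The step size is deliberately conservative; the point is only the shape of the objective, which is the same factorized nonlinear least-squares form shared by state estimation and low-rank matrix recovery.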
In this talk, we show that "no spurious local minima" requires a very strong RIP. By contrast, real-world instances, like power system state estimation, satisfy only much weaker RIPs; in fact, counterexamples are ubiquitous. We give a specific counterexample that satisfies a strong RIP and yet causes stochastic gradient descent to fail 12% of the time. We prove that the counterexample is sharp: strengthening the RIP any further would eliminate all spurious local minima.
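For reference, the RIP in question is the standard one for low-rank matrix recovery: a linear measurement operator \(\mathcal{A}\) satisfies the rank-\(2r\) restricted isometry property with constant \(\delta \in [0,1)\) if

```latex
(1-\delta)\,\|M\|_F^2 \;\le\; \|\mathcal{A}(M)\|_2^2 \;\le\; (1+\delta)\,\|M\|_F^2
\qquad \text{for all } M \text{ with } \operatorname{rank}(M) \le 2r.
```

A "strong" RIP corresponds to \(\delta\) close to 0, and a "weak" RIP to \(\delta\) close to 1. (The specific thresholds established in the talk's results are not reproduced here.)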
At the same time, even a mild RIP eliminates spurious local minima within a neighborhood of the ground truth. Using this insight, we show that adding redundant measurements pushes spurious local minima further away from the ground truth, making local search algorithms more likely to converge to a global minimizer. Accordingly, power system state estimation becomes more likely to succeed when redundant information is available.
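As a toy illustration of this effect, here is a hedged numerical sketch, again assuming synthetic Gaussian measurements rather than actual power-flow models: gradient descent with Armijo backtracking on the factorized objective, run on a measurement-scarce instance and on a redundant one.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4                               # state dimension (toy size)
x_true = rng.standard_normal(n)     # ground-truth rank-1 factor

def make_problem(m):
    """m synthetic symmetric Gaussian measurements of x_true x_true^T
    (hypothetical stand-ins for power-flow measurements)."""
    A = rng.standard_normal((m, n, n))
    A = (A + A.transpose(0, 2, 1)) / 2
    b = np.einsum('mij,i,j->m', A, x_true, x_true)
    return A, b

def descend(A, b, u, iters=2000):
    """Gradient descent with Armijo backtracking on
    f(u) = (1/2) * sum_i (<A_i, u u^T> - b_i)^2."""
    f = lambda v: 0.5 * np.sum((np.einsum('mij,i,j->m', A, v, v) - b) ** 2)
    def g(v):
        r = np.einsum('mij,i,j->m', A, v, v) - b
        return 2 * np.einsum('m,mij,j->i', r, A, v)
    for _ in range(iters):
        d, fu = g(u), f(u)
        t = 1.0
        while f(u - t * d) > fu - 1e-4 * t * (d @ d):
            t /= 2
            if t < 1e-14:
                return u            # effectively stationary
        u = u - t * d
    return u

# Toy comparison: same kind of random starts, scarce vs. redundant measurements.
# With heavy redundancy, the runs below typically drive the loss to near zero,
# i.e., local search reliably finds a global minimizer.
for m in (7, 50):
    A, b = make_problem(m)
    final_losses = []
    for _ in range(5):
        u = descend(A, b, rng.standard_normal(n))
        r = np.einsum('mij,i,j->m', A, u, u) - b
        final_losses.append(0.5 * np.sum(r ** 2))
```

This is only a sketch of the qualitative phenomenon the talk describes, not a reproduction of its experiments.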
Welcome, everyone!