Groupe d’études et de recherche en analyse des décisions


Robust Regression and Lasso


Lasso, or ℓ1-regularized least squares, has been explored extensively for its remarkable sparsity properties. The first result of this paper is that the solution to Lasso, in addition to being sparse, has robustness properties: it is the solution to a robust optimization problem. This has two important consequences. First, robustness connects the regularizer to a physical property, namely protection from noise. This allows a principled choice of regularizer; in particular, by considering different uncertainty sets, we construct generalizations of Lasso that also yield convex optimization problems.
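The equivalence behind this first result can be sketched in a standard form (the specific uncertainty set and bounds below are illustrative and may differ from the paper's exact statement): perturbing each column of the design matrix within an ℓ2 ball turns the worst-case residual into an ℓ1 penalty.

```latex
% Sketch: robust least squares with column-wise bounded perturbations
% of the design matrix A reduces to a Lasso-type problem.
% Uncertainty set (illustrative):
%   \mathcal{U} = \{ \Delta A = (\delta_1,\dots,\delta_p) :
%                    \|\delta_i\|_2 \le c_i,\; i = 1,\dots,p \}
\min_{\beta}\; \max_{\Delta A \in \mathcal{U}}\;
  \bigl\| y - (A + \Delta A)\beta \bigr\|_2
\;=\;
\min_{\beta}\; \bigl\| y - A\beta \bigr\|_2
  \;+\; \sum_{i=1}^{p} c_i\,\lvert \beta_i \rvert .
```

With all bounds equal, c_i = λ, the right-hand side is Lasso with an un-squared ℓ2 loss, so the robust problem and the regularized problem share the same solution.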

Second, robustness can itself be used as an avenue for exploring properties of the solution. In particular, we show that the robustness of the solution explains why the solution is sparse. Both the analysis and the specific results we obtain differ from standard sparsity results, providing different geometric intuition. We then show that the robust optimization formulation is related to kernel density estimation and, following this approach, use robustness directly to re-prove that Lasso is consistent.
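As a small numerical illustration of the sparsity claim (a sketch, not the paper's method: the ISTA solver, the problem sizes, and the regularization weight below are our own choices), a Lasso solve on a problem with a 5-sparse ground truth returns an estimate with only a handful of nonzero coefficients out of 100.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrinks each coordinate toward zero,
    # setting small coordinates exactly to zero (the source of sparsity).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(A, y, lam, n_iter=5000):
    """Solve min_beta 0.5 * ||y - A beta||_2^2 + lam * ||beta||_1 by ISTA."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    beta = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ beta - y)        # gradient of the smooth part
        beta = soft_threshold(beta - grad / L, lam / L)
    return beta

rng = np.random.default_rng(0)
n, p, k = 50, 100, 5                       # 5-sparse ground truth in 100 dims
A = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:k] = rng.standard_normal(k) + 3.0
y = A @ beta_true + 0.01 * rng.standard_normal(n)

beta_hat = lasso_ista(A, y, lam=1.0)
# Only a few coefficients survive the soft-thresholding; the rest are exact zeros.
print(np.count_nonzero(np.abs(beta_hat) > 1e-8))
```

The exact zeros come from the soft-thresholding step rather than from rounding, which is the standard computational counterpart of the sparsity property discussed above.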

36 pages