We discuss the tremendous potential of integer optimization methods for learning predictive models from high-dimensional data via exact sparse regression. We show that novel integer formulations can solve exact sparse regression problems with p in the 100,000s of covariates and n in the 10,000s of samples, two orders of magnitude larger than current state-of-the-art methods can handle. We also indicate that robust optimization methods can help practitioners make data-driven decisions that are safeguarded against over-calibration to any one particular data set. We argue that robust optimization methods have enormous untapped potential for making subsequent decisions based on data.
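To make the notion of exact sparse regression concrete: the goal is to find the subset of at most k covariates whose least-squares fit minimizes the residual sum of squares. The sketch below is a naive brute-force enumeration over supports, purely illustrative and only feasible for tiny p; it is not the integer-optimization formulation of the talk, which scales the same exact problem to very large p. All names and the synthetic data are invented for illustration.

```python
import itertools
import random

def solve_normal_equations(X, y):
    """Least-squares fit via the normal equations, using Gaussian
    elimination with partial pivoting (fine for the tiny k used here)."""
    n, k = len(X), len(X[0])
    # A = X^T X, b = X^T y
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)]
         for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k                          # back substitution
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return beta

def best_subset(X, y, k):
    """Exact sparse regression by enumerating all size-k supports.
    Exponential in p, hence the need for integer optimization at scale."""
    p = len(X[0])
    best_rss, best_support, best_beta = float("inf"), None, None
    for support in itertools.combinations(range(p), k):
        Xs = [[row[j] for j in support] for row in X]
        beta = solve_normal_equations(Xs, y)
        rss = sum((yi - sum(bj * xj for bj, xj in zip(beta, xs))) ** 2
                  for xs, yi in zip(Xs, y))
        if rss < best_rss:
            best_rss, best_support, best_beta = rss, support, beta
    return best_rss, best_support, best_beta

# Synthetic data: y depends only on covariates 0 and 2, plus small noise.
random.seed(0)
n, p = 50, 6
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [3.0 * row[0] - 2.0 * row[2] + random.gauss(0, 0.1) for row in X]

rss, support, beta = best_subset(X, y, k=2)
print(support, [round(b, 2) for b in beta])
```

With a strong signal and mild noise, the enumeration recovers the true support {0, 2} and coefficients near (3, -2); the talk's contribution is doing this exactly when p is in the hundreds of thousands.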
Bio: Bart Van Parys is currently a postdoctoral researcher working with Prof. Dimitris Bertsimas at the MIT Sloan School of Management. His research interests lie at the interface between optimization and machine learning. In 2015 he obtained his Ph.D. in control theory at the Swiss Federal Institute of Technology (ETH) in Zurich under the supervision of Prof. Manfred Morari. He received his M.E. from the University of Leuven in 2011.
Welcome to everyone!