Groupe d’études et de recherche en analyse des décisions

Scaling up exact neural network compression by ReLU stability

Thiago Serra, Bucknell University, United States


Presentation on YouTube

We can compress a neural network while exactly preserving its underlying functionality with respect to a given input domain if some of its neurons are stable. However, current approaches for determining the stability of neurons require solving, or finding a good approximation to, multiple discrete optimization problems. In this talk, we present an algorithm based on solving a single optimization problem to identify all stable neurons. Our approach is, in the median, 21 times faster than the state-of-the-art method, which allows us to explore exact compression on deeper (5 x 100) and wider (2 x 800) networks within minutes. For classifiers trained under an amount of L1 regularization that does not worsen accuracy, we can remove up to 40% of the connections.
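To illustrate the idea of exact compression, the sketch below detects stably-inactive ReLU neurons (those whose pre-activation is never positive anywhere on the input box) and removes them without changing the network's outputs. This is only a minimal illustration using simple interval bound propagation; it is not the single-optimization-problem method of the talk, which identifies strictly more stable neurons (including stably-active ones, which can be merged rather than removed).

```python
import numpy as np

def interval_bounds(W, b, lo, hi):
    # Propagate box bounds [lo, hi] through the affine map W x + b.
    W_pos = np.maximum(W, 0.0)
    W_neg = np.minimum(W, 0.0)
    return W_pos @ lo + W_neg @ hi + b, W_pos @ hi + W_neg @ lo + b

def forward(weights, biases, x):
    # Standard ReLU network forward pass (linear last layer).
    for i, (W, b) in enumerate(zip(weights, biases)):
        x = W @ x + b
        if i < len(weights) - 1:
            x = np.maximum(x, 0.0)
    return x

def prune_stably_inactive(weights, biases, lo, hi):
    """Remove hidden neurons whose pre-activation upper bound is <= 0
    over the whole input box: their ReLU output is identically zero,
    so deleting them (and the matching columns of the next layer)
    preserves the network's function exactly on that box."""
    pruned_w, pruned_b = [], []
    for i, (W, b) in enumerate(zip(weights, biases)):
        pre_lo, pre_hi = interval_bounds(W, b, lo, hi)
        if i < len(weights) - 1:          # hidden layer: apply stability test
            keep = pre_hi > 0.0           # stably-inactive neurons fail this
            W, b = W[keep], b[keep]
            lo = np.maximum(pre_lo[keep], 0.0)   # post-ReLU bounds
            hi = np.maximum(pre_hi[keep], 0.0)
            weights[i + 1] = weights[i + 1][:, keep]
        pruned_w.append(W)
        pruned_b.append(b)
    return pruned_w, pruned_b
```

For example, with inputs in [0, 1]^2, a hidden neuron computing ReLU(-x1 - x2 - 0.5) is stably inactive (its pre-activation is at most -0.5) and can be dropped while every output on the box stays identical.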

This talk is based on joint work with Srikumar Ramalingam (Google Research) and Abhinav Kumar (Michigan State University).

Bio: Thiago Serra is an assistant professor of analytics and operations management at Bucknell University's Freeman College of Management. Previously, he was a visiting research scientist at Mitsubishi Electric Research Labs from 2018 to 2019, and an operations research analyst at Petrobras from 2009 to 2013. His current work focuses on theory and applications of machine learning and mathematical optimization. He has a Ph.D. in operations research from Carnegie Mellon University's Tepper School of Business, for which he received the Gerald L. Thompson Doctoral Dissertation Award in Management Science. He was also awarded the INFORMS Judith Liebman Award in 2016, the best poster award at the Princeton Day of Optimization in 2018, and the Outstanding Reviewer Award at AAAI 2021.

Sign up for email notifications of GERAD's efficient machine learning seminars.