DS4DM Coffee Talk

A Stochastic Proximal Method for Non-smooth Regularized Finite Sum Optimization

May 20, 2022, 11:00 AM – 1:00 PM

Dounia Lakhmiri, Polytechnique Montréal, Canada

Hybrid seminar on Zoom and in the GERAD seminar room.

We consider the problem of training a deep neural network with non-smooth regularization to retrieve a sparse and efficient sub-structure. Our regularizer is only assumed to be lower semi-continuous and prox-bounded. We combine an adaptive quadratic regularization approach with proximal stochastic gradient principles to derive a new solver, called SR2. Our experiments on network instances trained on CIFAR-10 and CIFAR-100 with L1 and L0 regularization show that SR2 achieves higher sparsity than other proximal methods such as ProxGEN and ProxSGD while maintaining satisfactory accuracy.
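
The building block shared by this family of solvers is a stochastic gradient step followed by the proximal operator of the non-smooth regularizer; for the L1 norm this is soft-thresholding, which drives small weights exactly to zero and yields sparsity. The Python sketch below illustrates this on a toy sparse least-squares problem. It is a generic proximal SGD loop under assumed step size and regularization weight, not the SR2 solver presented in the talk.

import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||x||_1: shrinks each entry toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def prox_sgd_l1(A, b, lam=0.1, step=0.01, epochs=50, batch=8, seed=0):
    # Minimize (1/2n) ||A w - b||^2 + lam * ||w||_1 with mini-batch
    # stochastic gradient steps followed by the L1 proximal step.
    # (Illustrative sketch; problem, step size and lam are assumptions.)
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch):
            j = idx[start:start + batch]
            grad = A[j].T @ (A[j] @ w - b[j]) / len(j)       # stochastic gradient
            w = soft_threshold(w - step * grad, step * lam)  # proximal step
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    w_true = np.zeros(50)
    w_true[:5] = rng.standard_normal(5)               # sparse ground truth
    b = A @ w_true + 0.01 * rng.standard_normal(200)
    w_hat = prox_sgd_l1(A, b)
    print("nonzero weights recovered:", np.count_nonzero(np.abs(w_hat) > 1e-6))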

Joint seminar with Alexandre Forel.

Federico Bobbio, organizer
Gabriele Dragotto, organizer

Location

Hybrid activity at GERAD
Zoom and room 4488
Pavillon André-Aisenstadt
Campus de l'Université de Montréal
2920, chemin de la Tour

Montréal, Québec H3T 1J4
Canada
