“Meet a GERAD researcher!” seminar

Multi-Precision Optimization Algorithm: Decreasing Computational Cost and Controlling Computational Error


Feb 22, 2023   11:00 AM — 12:00 PM

Dominique Monnet, Polytechnique Montréal, Canada


Presentation on YouTube.

In this presentation, we focus on the minimization of a smooth, non-convex function. This framework covers many applications, including deep neural network training, whose high computational cost has motivated the development of low-precision hardware. Performing computations in low precision, i.e., with numbers represented with a small number of bits, saves computational effort at the expense of larger computational errors. We introduce, in the deterministic case, the Quadratic Regularization (R2) algorithm, a gradient descent algorithm with adaptive step size, and show how it can be extended into a Multi-Precision (MPR2) version of itself. MPR2 dynamically adapts the precision level of the computations to save as much computational effort as possible while controlling the computational error so as to ensure convergence to a minimum. We highlight some of the pitfalls related to the use of variable precision and how MPR2 avoids them. The performance of MPR2 is illustrated on a collection of test problems.
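To make the idea concrete, here is a minimal sketch of a quadratic-regularization gradient descent of the kind the abstract describes: each step minimizes a quadratic model f(x) + gᵀs + (σ/2)‖s‖², and σ is adjusted from the ratio of actual to predicted decrease. The parameter names and update rules below are illustrative assumptions, not the exact R2 (or MPR2) algorithm presented in the talk.

```python
import numpy as np

def r2_sketch(f, grad, x0, sigma0=1.0, eta1=0.1, eta2=0.75,
              gamma_up=2.0, gamma_down=0.5, tol=1e-6, max_iter=1000):
    """Illustrative quadratic-regularization (R2-style) gradient descent.

    The trial step s = -g / sigma minimizes the regularized model
    m(s) = f(x) + g's + (sigma/2)||s||^2.  sigma is increased when the
    model predicts the decrease poorly and relaxed when it predicts well.
    All thresholds (eta1, eta2, gamma_up, gamma_down) are assumed values.
    """
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:        # approximate first-order stationarity
            break
        s = -g / sigma                      # minimizer of the quadratic model
        pred = np.dot(g, g) / (2.0 * sigma) # decrease predicted by the model
        rho = (f(x) - f(x + s)) / pred      # actual vs. predicted decrease
        if rho >= eta1:                     # successful step: accept it
            x = x + s
            if rho >= eta2:                 # very successful: relax sigma
                sigma = max(sigma * gamma_down, 1e-8)
        else:                               # unsuccessful: tighten sigma
            sigma *= gamma_up
    return x
```

A multi-precision variant in the spirit of MPR2 would additionally pick, at each iteration, the lowest floating-point precision (e.g. float16 vs. float64) whose error bound still guarantees that the decrease test above remains reliable.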

Charles Audet organizer
Dominique Orban organizer
Olivier Bahn organizer

Location

Hybrid activity at GERAD
Zoom and room 4488
Pavillon André-Aisenstadt
Campus de l'Université de Montréal
2920, chemin de la Tour

Montréal Québec H3T 1J4
Canada