Dominique Orban
Cahiers du GERAD
79 results — page 1 of 4
Augmented Lagrangian (AL) methods are a well known class of algorithms for solving constrained optimization problems. They have been extended to the solution...
We develop a worst-case evaluation complexity bound for trust-region methods in the presence of unbounded Hessian approximations. We use the algorithm of Ar...
Corrigendum: A proximal quasi-Newton trust-region method for nonsmooth regularized optimization
The purpose of the present note is to bring clarifications to certain concepts and surrounding notation of Aravkin et al. (2022). All results therein contin...
We develop a trust-region method for minimizing the sum of a smooth term \(f\) and a nonsmooth term \(h\), both of which can be nonconvex. Each iteratio...
The Harwell Subroutine Library (HSL) is a renowned suite of efficient and robust numerical algorithms designed to tackle complex mathematical problems such a...
PLSR1: A limited-memory partitioned quasi-Newton optimizer for partially-separable loss functions
Improving neural network optimizer convergence speed is a long-standing priority. Recently, there has been a focus on quasi-Newton optimization methods, whi...
Historically, the training of deep artificial neural networks has relied on parallel computing to achieve practical effectiveness. However, with the increas...
We introduce an iterative solver named MINARES for symmetric linear systems \(Ax \approx b\), where \(A\) is possibly singular. MINARES is based on t...
The indefinite proximal gradient method
We introduce a variant of the proximal gradient method in which the quadratic term is diagonal but may be indefinite, and is safeguarded by a trust region. ...
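The paper's safeguarded indefinite variant is not reproduced here, but the basic proximal gradient step with a diagonal quadratic model that it generalizes can be sketched as follows, for the common special case \(h = \lambda\|\cdot\|_1\) and a strictly positive diagonal (a minimal illustration, not the authors' algorithm):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def diag_prox_gradient_step(x, grad_f, d, lam):
    """One proximal gradient step for f(x) + lam*||x||_1 with a diagonal
    quadratic model (1/2)(y - x)^T D (y - x), D = diag(d), d > 0.
    The subproblem is separable and solved elementwise:
        min_y  grad_f^T (y - x) + (d/2)(y - x)^2 + lam*|y|.
    (The paper allows D to be indefinite, safeguarded by a trust region;
    that safeguard is omitted in this sketch.)"""
    z = x - grad_f / d          # diagonally scaled gradient step
    return soft_threshold(z, lam / d)

# Toy smooth term f(x) = 1/2 ||x - b||^2, so grad_f(x) = x - b.
b = np.array([3.0, -0.2, 0.05])
x = np.zeros(3)
for _ in range(50):
    x = diag_prox_gradient_step(x, x - b, np.ones(3), lam=0.1)
# Fixed point is the soft-thresholded b: [2.9, -0.1, 0.0]
```

With a unit diagonal this reduces to the classical proximal gradient iteration; the diagonal \(d\) acts as a per-coordinate curvature estimate.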
We present a Julia framework dedicated to partially-separable problems whose element functions are detected automatically. This framework takes advantage of ...
This paper presents \(\texttt{Krylov.jl}\), a Julia package that implements a collection of Krylov processes and methods for solving a variety of linear pr...
We develop a Levenberg-Marquardt method for minimizing the sum of a smooth nonlinear least-squares term \(f(x) = \tfrac{1}{2} \|F(x)\|_2^2\) and a nonsmoo...
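The nonsmooth term of the paper is beyond a short sketch, but the classical Levenberg-Marquardt iteration for the smooth least-squares part can be illustrated on a toy residual (a minimal sketch with a simple damping update, not the authors' method):

```python
import numpy as np

def levenberg_marquardt(F, J, x, mu=1e-3, tol=1e-8, maxit=100):
    """Basic Levenberg-Marquardt for min 1/2 ||F(x)||_2^2.
    Solves the damped normal equations (J^T J + mu I) p = -J^T F
    and adjusts mu based on whether the step reduces the objective."""
    for _ in range(maxit):
        r, Jx = F(x), J(x)
        g = Jx.T @ r                     # gradient of 1/2 ||F||^2
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(Jx.T @ Jx + mu * np.eye(x.size), -g)
        if 0.5 * F(x + p) @ F(x + p) < 0.5 * r @ r:
            x, mu = x + p, mu / 3.0      # accept step, reduce damping
        else:
            mu *= 3.0                    # reject step, increase damping
    return x

# Rosenbrock in residual form: F(x) = [1 - x0, 10 (x1 - x0^2)]
F = lambda x: np.array([1 - x[0], 10 * (x[1] - x[0] ** 2)])
J = lambda x: np.array([[-1.0, 0.0], [-20 * x[0], 10.0]])
x_star = levenberg_marquardt(F, J, np.array([-1.2, 1.0]))
```

As the damping `mu` goes to zero the step approaches the Gauss-Newton step; large `mu` yields a short, gradient-like step.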
This paper presents PDENLPModels.jl, a new Julia package for modeling and discretizing optimization problems with mixed algebraic and partial differential equ...
On GSOR, the generalized successive overrelaxation method for double saddle-point problems
We consider the generalized successive overrelaxation (GSOR) method for solving a class of block three-by-three saddle-point problems. Based on the necessary...
We consider the problem of training a deep neural network with nonsmooth regularization to retrieve a sparse and efficient sub-structure. Our regularizer is ...
The conjugate gradient (CG) method is a classic Krylov subspace method for solving symmetric positive definite linear systems. We introduce an analogous sem...
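The classic CG method that this abstract takes as its starting point can be sketched in a few lines (a standard textbook implementation, not the semismooth analogue the paper introduces):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, maxit=None):
    """Classic conjugate gradient for Ax = b with A symmetric positive
    definite. Builds A-conjugate search directions from residuals."""
    x = np.zeros_like(b)
    r = b.copy()                 # residual b - A x, with x = 0
    p = r.copy()                 # initial search direction
    rs = r @ r
    for _ in range(maxit or len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)    # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # next A-conjugate direction
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # symmetric positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most \(n\) iterations for an \(n \times n\) system, which motivates the default iteration cap above.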
Computing a sparse projection into a box
We describe a procedure to compute a projection of \(w \in ℝ^n\) into the intersection of the so-called zero-norm ball \(k B_0\) of radius \(k\), i....
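The paper's procedure is not reproduced here, but under the assumption that the box contains the origin, a projection onto \(\{x : \|x\|_0 \le k,\ \ell \le x \le u\}\) decomposes coordinatewise: keeping coordinate \(i\) costs \((w_i - \mathrm{clip}(w_i))^2\), zeroing it costs \(w_i^2\), so one keeps the \(k\) coordinates with the largest savings (a rough sketch, not necessarily the authors' algorithm):

```python
import numpy as np

def sparse_box_projection(w, lo, hi, k):
    """Project w onto {x : ||x||_0 <= k, lo <= x <= hi}, assuming
    lo <= 0 <= hi componentwise (so zeroing any coordinate is feasible).
    Keep the k coordinates where clipping saves the most over zeroing."""
    c = np.clip(w, lo, hi)               # box projection, coordinatewise
    savings = w ** 2 - (w - c) ** 2      # gain of keeping vs. zeroing
    keep = np.argsort(savings)[-k:]      # indices of k largest savings
    x = np.zeros_like(w)
    x[keep] = c[keep]
    return x

w = np.array([3.0, -0.5, 2.0, 0.1])
x = sparse_box_projection(w, lo=-1.0, hi=1.0, k=2)
# Keeps the two coordinates with largest savings: [1.0, 0.0, 1.0, 0.0]
```

The assumption \( \ell \le 0 \le u \) matters: if zero lies outside the box, zeroing a coordinate is infeasible and the problem no longer separates this way.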
DCISolver.jl: A Julia solver for nonlinear optimization using dynamic control of infeasibility
This paper presents DCISolver.jl, a new Julia package implementing the Dynamic Control of Infeasibility method (DCI), introduced by Bielschowsky & Gomes (20...
We introduce an iterative method named GPMR for solving 2×2 block unsymmetric linear systems. GPMR is based on a new process that reduces simultaneously...
In this paper, we consider both first- and second-order techniques to address continuous optimization problems arising in machine learning. In the first-orde...