Publications from GERAD

The fruit of our labor

Cahiers du GERAD

A collection of technical reports and working papers, a testimony to the strength and productivity of our group.

Scientific Publications

A repository of all categories of scientific publications produced by our members over the years.

Newsletter

A semi-annual magazine popularizing the scientific research carried out by our members and summarizing our recent activities.

Recent Cahiers

Browse the collection
G-2024-36 Branch-and-Price

Integer (linear) programs are a standard way of formalizing a vast array of optimization problems in industry, services, management, science, and technology....

G-2026-16 Surrogate-based categorical neighborhoods for mixed-variable blackbox optimization

In simulation-based engineering, design choices are often obtained following the optimization of complex blackbox models. These models frequently involve mi...

G-2026-15 Transfer learning in Bayesian optimization for aircraft design

The use of transfer learning within Bayesian optimization addresses the disadvantages of the so-called cold start problem by using source data to aid in th...

G-2026-14 Hierarchical constraint reduction for the penalized security-constrained optimal power flow

We consider the security-constrained optimal power flow (SCOPF) problem in a linearized form where thermal line limits are enforced as soft constraints to re...

G-2026-13 On the completion of AI-based weather models

Predicting the state of the weather, from tracking hurricanes and thunderstorms to assessing daily temperatures, is of crucial importance. Over the past deca...

Recent Scientific Publications

See all publications

book

Artificial Intelligence Based Primary Care: Artificial Intelligence and Human Cognition in General Practice and Family Medicine

article

Advances in Power Consumption Model for Data Centers: Analytical Formulas vs. Machine Learning Models

article

Zeroth-order Kronecker optimization for pretraining language models

Training language models (LMs) under tight GPU memory budgets rules out standard back-propagation and motivates zeroth-order (ZO) optimization. While ZO m...