Publications from GERAD

The fruit of our labor

Cahiers du GERAD

A collection of technical reports and working papers, a testimony to the strength and productivity of our group.

Scientific Publications

A repository of all categories of scientific publications produced by our members over the years.

Newsletter

A semi-annual magazine popularizing the scientific research carried out by our members and summarizing our recent activities.

Recent Cahiers

Browse the collection
G-2024-36 Branch-and-Price

Integer (linear) programs are a standard way of formalizing a vast array of optimization problems in industry, services, management, science, and technology....

G-2026-14 Hierarchical constraint reduction for the penalized security-constrained optimal power flow

We consider the security-constrained optimal power flow (SCOPF) problem in a linearized form where thermal line limits are enforced as soft constraints to re...

G-2026-13 On the completion of AI-based weather models

Predicting the state of the weather, from tracking hurricanes and thunderstorms to assessing daily temperatures, is of crucial importance. Over the past deca...

G-2026-12 Mean field games: Large sparse network limits and Laplexion dynamics

Dynamic games are considered with large subpopulations distributed over large sparse graphs, where each agent has mean field coupling with all agents withi...

G-2026-11 Electric vehicle fast-charging facility location with endogenous queuing in path selection

As fast-charging demand grows, ensuring the operational resilience of electric vehicle (EV) infrastructure under congestion becomes a key challenge in large-...

Recent Scientific Publications

See all publications

book

Artificial Intelligence Based Primary Care: Artificial Intelligence and Human Cognition in General Practice and Family Medicine

article

Advances in Power Consumption Model for Data Centers: Analytical Formulas vs. Machine Learning Models

article

Zeroth-order Kronecker optimization for pretraining language models

Training language models (LMs) under tight GPU memory budgets rules out standard back-propagation and motivates zeroth-order (ZO) optimization. While ZO m...