
G-2020-23-EIW14

Distributed stochastic gradient descent with quantized compressive sensing


One of the major challenges in large-scale distributed machine learning with stochastic gradient methods is the high cost of communicating gradients across multiple nodes. Gradient quantization and sparsification have been studied to reduce this communication cost. In this work we bridge the gap between gradient sparsity and quantization. We propose a quantized compressive sensing approach to gradient communication: our method compresses the gradients with a random matrix and applies 1-bit quantization to reduce the communication cost. We also provide a theoretical analysis of the convergence of our approach under a bounded-gradient assumption.
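As a rough illustration of the compress-and-quantize step described above, the following minimal NumPy sketch projects a gradient through a random Gaussian matrix and transmits only 1-bit signs; the dimensions, function names, and the simple back-projection decoder are illustrative assumptions, not the paper's actual construction.

# A minimal sketch of 1-bit quantized compressive sensing of a gradient.
# The measurement matrix, decoder, and m/d ratio are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

d = 1024          # gradient dimension (illustrative)
m = 256           # number of compressive measurements, m << d (illustrative)

# Random Gaussian measurement matrix, shared between sender and receiver
# (e.g., generated from a common seed), so only the 1-bit signs travel.
A = rng.standard_normal((m, d)) / np.sqrt(m)

def compress(grad):
    """Project the gradient and keep only the signs: m bits per gradient."""
    return np.sign(A @ grad)

def decompress(bits, grad_norm):
    """Simple linear decoder: back-project the signs and rescale.
    The gradient norm is assumed to be sent as a small side channel;
    an iterative sparse-recovery step could replace this decoder."""
    est = A.T @ bits
    return grad_norm * est / np.linalg.norm(est)

# Toy example: a sparse gradient, as compressive sensing assumes.
g = np.zeros(d)
g[rng.choice(d, size=20, replace=False)] = rng.standard_normal(20)

g_hat = decompress(compress(g), np.linalg.norm(g))
print("cosine similarity:", g @ g_hat / (np.linalg.norm(g) * np.linalg.norm(g_hat)))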

10 pages

Document

G2023-EIW14.pdf (370 KB)