G-2020-23-EIW11

State of compact architecture search for deep neural networks

The design of compact deep neural networks is crucial to enabling the widespread adoption of deep neural networks in the real world, particularly in edge and mobile scenarios. Because manually designing compact deep neural networks is time-consuming and challenging, there has been significant recent interest in algorithms that automatically search for compact network architectures. A particularly interesting class of compact architecture search algorithms are those guided by baseline network architectures. In this study, we explore the current state of compact architecture search for deep neural networks through both theoretical and empirical analysis of four state-of-the-art compact architecture search algorithms: i) group lasso regularization, ii) variational dropout, iii) MorphNet, and iv) Generative Synthesis. We examine these methods in detail with respect to efficiency, effectiveness, and scalability across three well-known benchmark datasets, and discuss practical considerations.
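Of the four methods examined, group lasso regularization has the simplest core idea: penalize the L2 norm of each weight group (e.g., all weights belonging to one output filter) so that entire filters are driven to zero and can be pruned. Below is a minimal illustrative sketch of that penalty in NumPy; the function name and the grouping by rows (one row per output filter) are assumptions for illustration, not the specific formulation used in the paper.

```python
import numpy as np

def group_lasso_penalty(weights):
    """Group lasso penalty: sum of L2 norms of the groups.

    `weights` is assumed to be a 2-D array of shape
    (num_filters, weights_per_filter), with each row forming one group.
    Unlike a plain L1 penalty on individual weights, this term pushes
    whole rows (filters) to exactly zero, enabling structured pruning.
    """
    # L2 norm of each row (group), then sum across groups.
    return float(np.sum(np.sqrt(np.sum(weights ** 2, axis=1))))

# Example: the second filter is already all-zero, so it contributes
# nothing to the penalty; the first contributes sqrt(3^2 + 4^2) = 5.
w = np.array([[3.0, 4.0],
              [0.0, 0.0]])
print(group_lasso_penalty(w))  # → 5.0
```

In training, this penalty would be added (scaled by a regularization coefficient) to the task loss; filters whose group norm falls below a threshold afterward can be removed to obtain a smaller architecture.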

8 pages

Document

G2023-EIW11.pdf (250 KB)