Paper ID | SS-NNC.3
Paper Title | ONLINE WEIGHT PRUNING VIA ADAPTIVE SPARSITY LOSS
Authors | George Retsinas, Athena Elafrou, Georgios Goumas, Petros Maragos, National Technical University of Athens, Greece
Session | SS-NNC: Special Session: Neural Network Compression and Compact Deep Features
Location | Area B
Session Time | Tuesday, 21 September, 08:00 - 09:30
Presentation Time | Tuesday, 21 September, 08:00 - 09:30
Presentation | Poster
Topic | Special Sessions: Neural Network Compression and Compact Deep Features: From Methods to Standards
Abstract | Pruning neural networks has regained interest in recent years as a means to compress state-of-the-art deep neural networks and enable their deployment on resource-constrained devices. In this paper, we propose a robust sparsity-controlling framework that efficiently prunes network parameters during training with minimal computational overhead. We incorporate fast mechanisms to prune individual layers and build upon these to automatically prune the entire network under a user-defined budget constraint. Key to our end-to-end network pruning approach is the formulation of an intuitive and easy-to-implement adaptive sparsity loss that explicitly controls sparsity during training, enabling efficient budget-aware optimization.
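To illustrate the general idea of a sparsity loss as described in the abstract, the sketch below shows one generic way such a penalty can be formulated: a smooth estimate of the fraction of near-zero weights is pushed toward a target budget by a quadratic term added to the training objective. This is a minimal NumPy illustration of the concept, not the authors' actual formulation; the Gaussian-bump sparsity estimate, the `beta` sharpness parameter, and the quadratic penalty are all assumptions made for this example.

```python
import numpy as np

def soft_sparsity(weights, beta=50.0):
    # Smooth (differentiable-in-principle) estimate of the fraction of
    # near-zero weights: exp(-beta * w^2) is ~1 when w is near zero and
    # decays toward 0 for large-magnitude weights. (Illustrative choice,
    # not taken from the paper.)
    return float(np.mean(np.exp(-beta * np.asarray(weights) ** 2)))

def sparsity_loss(weights, target_sparsity):
    # Quadratic penalty steering the soft sparsity estimate toward a
    # user-defined budget; in training this term would be added to the
    # task loss so gradients drive small weights toward zero.
    return (soft_sparsity(weights) - target_sparsity) ** 2
```

In an actual training loop, a term like this would be weighted and summed with the task loss, and weights whose magnitude falls below a threshold would be pruned once the target sparsity is reached.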