Video: Use the AC/DC Sparse Training Algorithm for SOTA Neural Network Performance and Accuracy
The consistently rising computational costs and requirements of deep neural networks (DNNs) have led Neural Magic's research team on a quest to explore model compression techniques that produce sparse models without accuracy degradation. The result is an approach we call Alternating Compressed/DeCompressed (AC/DC) training of DNNs. And we'd love to tell you all about it!
AC/DC, a novel sparse training algorithm, allows you to prune your DNNs while they are still being trained, leading to simpler and quicker training workflows. AC/DC outperforms existing sparse training methods in accuracy, even at high sparsity levels. In some cases, it even yields sparse networks that are more accurate than their dense counterparts.
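For intuition, here is a minimal PyTorch-style sketch of the alternation at the heart of AC/DC: compressed phases apply a magnitude-pruning mask and keep it fixed while training continues, and decompressed phases drop the mask so all weights train densely again. The names (`magnitude_mask`, `acdc_train`) and the phase schedule are illustrative assumptions, not the paper's exact recipe, which also specifies warm-up and final fine-tuning details.

```python
import torch

def magnitude_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Binary mask keeping the largest-magnitude (1 - sparsity) fraction of entries."""
    k = max(1, int(weight.numel() * (1.0 - sparsity)))
    threshold = weight.abs().flatten().topk(k).values.min()
    return (weight.abs() >= threshold).float()

def acdc_train(model, optimizer, loss_fn, loader, phases, sparsity=0.9):
    """phases: list like [("dense", 10), ("sparse", 10), ...].
    Ending on a "sparse" phase returns a pruned (sparse) model."""
    masks = {}
    for kind, epochs in phases:
        if kind == "sparse":
            # Compressed phase: prune weight matrices by magnitude and fix the mask.
            masks = {name: magnitude_mask(p.data, sparsity)
                     for name, p in model.named_parameters() if p.dim() > 1}
        else:
            # Decompressed phase: remove the masks; all weights train again.
            masks = {}
        for _ in range(epochs):
            for inputs, targets in loader:
                optimizer.zero_grad()
                loss_fn(model(inputs), targets).backward()
                optimizer.step()
                # Keep pruned weights at zero during compressed phases.
                with torch.no_grad():
                    for name, p in model.named_parameters():
                        if name in masks:
                            p.mul_(masks[name])
    return model
```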
We held a virtual session on February 23, 2023, summarizing this research. Our ML Engineer, Damian Bogunowicz, walked us through AC/DC, including:
- The background on training-aware sparsification and the benefits it offers;
- A deep dive into the AC/DC algorithm, how it works, and the intuition behind it;
- A walk-through of how to use AC/DC on your own models and deploy it for better performance and accuracy (a sketch of that workflow follows this list).
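In practice, you don't have to hand-roll the training loop yourself: Neural Magic's SparseML library applies pruning algorithms such as AC/DC through recipe files. The snippet below sketches the typical recipe-driven workflow; the recipe filename `acdc_recipe.yaml` and its contents are assumptions, so check the current SparseML documentation for the exact AC/DC modifier schema.

```python
from sparseml.pytorch.optim import ScheduledModifierManager

# model, optimizer, loss_fn, and train_loader are your usual PyTorch objects.
# "acdc_recipe.yaml" is an assumed recipe file containing an AC/DC pruning
# modifier; see the SparseML docs for the exact modifier name and parameters.
manager = ScheduledModifierManager.from_yaml("acdc_recipe.yaml")
optimizer = manager.modify(model, optimizer, steps_per_epoch=len(train_loader))

for epoch in range(manager.max_epochs):
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        loss_fn(model(inputs), targets).backward()
        optimizer.step()  # the manager prunes/restores weights on schedule

manager.finalize(model)  # remove modifier hooks, leaving the sparse model
```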