Sparse Training of Neural Networks Using AC/DC

Presenter: Damian Bogunowicz

This video summarizes deep learning research on the "Alternating Compressed/DeCompressed (AC/DC) training of DNNs."

AC/DC, a novel sparse training algorithm, lets you prune your DNNs during training, leading to simpler and quicker training workflows. AC/DC outperforms existing sparse training methods in accuracy, even at high sparsity levels, and in some cases it produces sparse networks that are more accurate than their dense counterparts.

This video covers background on training-aware sparsification and the benefits it offers; a deep dive into the AC/DC algorithm, how it works, and the intuition behind it; and a walk-through of how to apply AC/DC to your own models and deploy them for better performance and accuracy.

More ML Research in Action Videos

Apply Second-Order Pruning Algorithms for SOTA Model Compression
Sparse Training of Neural Networks Using AC/DC
How Well Do Sparse Models Transfer?
How to Achieve the Fastest CPU Inference Performance for Object Detection YOLO Models
Workshop: How to Optimize Deep Learning Models for Production
How to Compress Your BERT NLP Models For Very Efficient Inference
Sparsifying YOLOv5 for 10x Better Performance, 12x Smaller File Size, and Cheaper Deployment
Tissue vs. Silicon: The Future of Deep Learning Hardware
Pruning Deep Learning Models for Success in Production
