Technical Papers, eBooks, and Data Sheets
Neural Magic NeurIPS 2020 Paper: WoodFisher: Efficient Second-Order Approximation for Neural Network Compression
Learn about WoodFisher, a method for efficiently approximating second-order (curvature) information for neural network compression.
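The core idea behind WoodFisher is estimating the inverse of the empirical Fisher matrix one gradient at a time via the Woodbury/Sherman-Morrison identity, then ranking weights for pruning with an Optimal Brain Surgeon-style saliency. The sketch below is a generic illustration of that idea, not the paper's implementation; all function names and the damping value are illustrative.

```python
import numpy as np

def inverse_fisher(grads, damp=1e-3):
    """Inverse of the damped empirical Fisher, damp*I + (1/N) sum g g^T,
    built one gradient at a time with the Sherman-Morrison identity,
    so the full d x d matrix is never explicitly inverted."""
    n, d = grads.shape
    f_inv = np.eye(d) / damp
    for g in grads:
        fg = f_inv @ g
        f_inv -= np.outer(fg, fg) / (n + g @ fg)
    return f_inv

def obs_saliency(w, f_inv):
    """Optimal Brain Surgeon saliency: estimated loss increase from
    zeroing each weight individually."""
    return w ** 2 / (2 * np.diag(f_inv))

rng = np.random.default_rng(0)
d, n = 8, 32
grads = rng.normal(size=(n, d))
w = rng.normal(size=d)

f_inv = inverse_fisher(grads)

# Sanity check: the recursive estimate matches direct inversion.
fisher = 1e-3 * np.eye(d) + grads.T @ grads / n
print(np.allclose(f_inv, np.linalg.inv(fisher)))  # True

prune_idx = obs_saliency(w, f_inv).argmin()  # cheapest weight to remove
```

The recursion is what makes the approach scale: each update costs O(d^2) instead of the O(d^3) of a fresh matrix inversion per gradient.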
Neural Magic NeurIPS 2020 Paper: Relaxed Scheduling for Scalable Belief Propagation
Learn about efficient parallel algorithms for the key machine learning task of inference on graphical models, in particular the fundamental belief propagation algorithm.
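For context, the message-passing update that the paper's relaxed schedulers reorder can be sketched as plain sum-product belief propagation. The example below is a generic synchronous-schedule implementation on a small chain graph (where BP is exact), not Neural Magic's relaxed-scheduling variant; all names are illustrative.

```python
import numpy as np

# Sum-product belief propagation on a 3-node binary chain MRF: x0 -- x1 -- x2.
rng = np.random.default_rng(0)
phi = [rng.uniform(0.5, 1.5, size=2) for _ in range(3)]   # unary potentials
psi = {(0, 1): rng.uniform(0.5, 1.5, size=(2, 2)),        # pairwise potentials
       (1, 2): rng.uniform(0.5, 1.5, size=(2, 2))}

# Messages m[(i, j)] from node i to node j, initialized uniform.
m = {}
for i, j in psi:
    m[(i, j)] = np.ones(2)
    m[(j, i)] = np.ones(2)

def update(i, j):
    """Recompute message i -> j from phi, psi, and incoming messages."""
    incoming = np.ones(2)
    for k, l in m:
        if l == i and k != j:
            incoming *= m[(k, l)]
    pair = psi[(i, j)] if (i, j) in psi else psi[(j, i)].T
    msg = pair.T @ (phi[i] * incoming)
    return msg / msg.sum()

# Synchronous schedule: recompute every message each round.
# (The paper studies relaxing exactly this scheduling decision.)
for _ in range(10):
    m = {key: update(*key) for key in m}

def belief(i):
    """Normalized marginal estimate at node i."""
    b = phi[i].copy()
    for k, l in m:
        if l == i:
            b *= m[(k, l)]
    return b / b.sum()

# Brute-force marginal of x1 for comparison; on trees BP is exact.
joint = np.einsum('a,b,c,ab,bc->abc', phi[0], phi[1], phi[2],
                  psi[(0, 1)], psi[(1, 2)])
exact = joint.sum(axis=(0, 2)) / joint.sum()
print(np.allclose(belief(1), exact))  # True
```

On loopy graphs the choice and ordering of message updates strongly affects convergence, which is what motivates studying relaxed (approximately prioritized) schedules.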
Neural Magic NeurIPS 2020 Paper: Adaptive Gradient Quantization for Data-Parallel SGD
This paper introduces two adaptive quantization schemes, ALQ and AMQ. In both schemes, processors update their compression schemes in parallel by efficiently computing sufficient statistics of a parametric distribution. The schemes improve validation accuracy by almost 2% on CIFAR-10 and 1% on ImageNet in challenging low-cost communication setups. Download the paper to learn more.
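ALQ and AMQ themselves are specified in the paper; as a generic illustration of the underlying idea (fit a parametric distribution to the gradient via cheap sufficient statistics, then place quantization levels adaptively), one might sketch the following. All names are illustrative, and the quantile-placement heuristic and deterministic rounding here are stand-ins, not the paper's schemes.

```python
import numpy as np
from statistics import NormalDist

def adaptive_levels(mean, std, bits=3):
    """Place 2**bits quantization levels at evenly spaced quantiles of a
    normal fitted to the gradient. The fit needs only sufficient statistics
    (sum, sum of squares, count), which workers can all-reduce cheaply."""
    n = 2 ** bits
    qs = (np.arange(n) + 0.5) / n          # midpoints of n equal-mass bins
    dist = NormalDist(mean, std)
    return np.array([dist.inv_cdf(q) for q in qs])

def quantize(grad, levels):
    """Map each entry to its nearest level (deterministic rounding here;
    unbiased stochastic rounding is the usual choice in practice)."""
    idx = np.abs(grad[:, None] - levels[None, :]).argmin(axis=1)
    return levels[idx], idx  # idx (3 bits/entry) is what gets communicated

rng = np.random.default_rng(1)
grad = rng.normal(0.1, 0.5, size=10_000)   # stand-in for a gradient tensor

levels = adaptive_levels(grad.mean(), grad.std(), bits=3)
q, idx = quantize(grad, levels)
print(f"quantization MSE: {np.mean((grad - q) ** 2):.4f}")
```

Adapting the levels to the observed gradient distribution is what lets such schemes keep accuracy at very low bit widths, compared with a fixed uniform grid.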
eBook: Pruning for Success
Get an overview of the best practices for pruning a model, and an in-depth walkthrough of the gradual magnitude pruning algorithm.
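The gradual magnitude pruning loop the eBook walks through can be sketched in a few lines: ramp a sparsity target along a schedule, and at each pruning step zero out the smallest-magnitude weights. This is a minimal generic sketch using the well-known cubic schedule of Zhu & Gupta (2017), with illustrative names, not the eBook's exact code.

```python
import numpy as np

def sparsity_at(step, start_step, end_step, final_sparsity, init_sparsity=0.0):
    """Cubic gradual-pruning schedule (Zhu & Gupta, 2017): sparsity ramps
    from init to final, pruning aggressively early and tapering off."""
    if step <= start_step:
        return init_sparsity
    if step >= end_step:
        return final_sparsity
    progress = (step - start_step) / (end_step - start_step)
    return final_sparsity + (init_sparsity - final_sparsity) * (1 - progress) ** 3

def magnitude_mask(weights, sparsity):
    """Boolean mask that zeroes the smallest-magnitude weights."""
    k = int(round(sparsity * weights.size))
    if k == 0:
        return np.ones_like(weights, dtype=bool)
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.abs(weights) > threshold

rng = np.random.default_rng(0)
weights = rng.normal(size=(64, 64))

# Prune every 100 steps between steps 0 and 1000 toward 90% sparsity.
# A real run interleaves these mask updates with training steps so the
# network can recover accuracy between prunings.
for step in range(0, 1001, 100):
    target = sparsity_at(step, 0, 1000, final_sparsity=0.9)
    weights = weights * magnitude_mask(weights, target)

print(f"final sparsity: {(weights == 0).mean():.2f}")  # final sparsity: 0.90
```

The gradual ramp, rather than one-shot pruning to 90%, is the key practical point: each small pruning step leaves the network close enough to its previous state that fine-tuning can recover the loss.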
Neural Magic ICML 2020 Paper
Learn how to leverage sparsity for significant performance gains.
Neural Magic Company Fact Sheet
Learn more about Neural Magic.