Subscribe to Our Community
Connect With Us
Our team extends beyond our employee base. Whether it's interacting around feature feedback on forums, incorporating pull requests into our product releases, or swapping sparsification stories at events, we wouldn't be where we are without you. Whether you contribute code or simply use our tools, and whether you're a newcomer or an advanced practitioner, we strive to provide a place for people to chat about related topics, such as model deployments across CPUs, the role of sparsity in deep learning models, novel machine learning research, and more.
Get the latest ML performance tidbits.
Get the latest news, webinar and event invites, research papers, and other ML performance tidbits by subscribing to Neural Magic email communications.
Explore Neural Magic Community Slack
Meet like-minded people. Share your expertise. Learn from others.
Join our online communities to interact with our product and engineering teams along with other Neural Magic users and developers interested in model sparsification and accelerating deep learning inference performance.
Install Our ML Tools
Prune and quantize your models for inference performance.
Install and use our model sparsification tools
Our model sparsification tools, SparseML and Sparsify, are available on PyPI. You can find more information on our Docs website, and to see the code, visit our GitHub repo.
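For reference, a typical install from PyPI might look like the following. This is a minimal sketch assuming a working Python environment with pip; see the Docs website for authoritative instructions.

```shell
# Install the Neural Magic sparsification tools from PyPI.
# Using a virtual environment is recommended but not required.
pip install sparseml
pip install sparsify
```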
Run on CPUs?
If you are interested in running your model inference on CPUs, you can install our DeepSparse inference runtime from PyPI. More information can be found on our Docs website.
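As above, the runtime installs from PyPI; this is a minimal sketch assuming a working Python environment with pip, and the Docs website remains the authoritative source.

```shell
# Install the DeepSparse inference runtime from PyPI
# for running model inference on CPUs.
pip install deepsparse
```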
Sparse Model Zoo
Want to get started quickly using one of our optimized models? Our model repo, the SparseZoo, is publicly available.
See how Neural Magic contributes to the research community.
Our employees are passionate about contributing research back to the ML community at large. View our conference-approved research papers below.
On the Predictability of Pruning Across Scales
WoodFisher: Efficient Second-Order Approximation for Neural Network Compression
Relaxed Scheduling for Scalable Belief Propagation
Learn from Neural Magic team and the community.
Introducing the Neural Magic Platform
Go in-depth on the Neural Magic Platform, our open source deep learning sparsification tools and a free CPU inference engine.
Big Brain Burnout: What’s Wrong with AI Computing?
If our brains processed information like today’s ML products consume computing power, you could fry an egg on your head. Learn why we need to rethink how we’re building products that rely on ML and AI.
Pruning Deep Learning Models for Success
Contrary to popular belief, pruning deep learning models is not that hard. Get an overview of pruning, as well as easy ways to prune your deep learning models.
Join the community to learn and have fun, virtually and in person.
Catch up with your favorite community folks. Ask your toughest questions, meet local users, and walk away with some brand new swag.
Do you have a passion for machine learning performance? Want to join us as an event speaker, blog writer, or event panelist? Have other ideas? We’d love to hear from you. Contact us here.