Lunch & Learn with Neural Magic: How Well Do Sparse Models Transfer?

The machine learning (ML) research community moves at near the speed of light. New research papers are released daily, and the number of new models, algorithms, and optimization techniques is growing rapidly. Keeping up with the research is no small task. At Neural Magic, it’s a full-time job!

That’s why we implemented an internal weekly Lunch & Learn, a dedicated time where we discuss new state-of-the-art research that’s shaping ML. Each week, a member of our team presents a paper they find interesting and we discuss it as a group. We learn a lot from each other and we think you would too!

We are excited to open our Lunch & Learn to the wider community of researchers and practitioners interested in simpler and more efficient ML performance. 

We invite you to join us on Wednesday, December 7, 2022, at 12:00 PM Eastern Time, for our inaugural open Lunch & Learn, where we’ll summarize our CVPR-accepted paper “How Well Do Sparse ImageNet Models Transfer?” and our EMNLP-accepted paper “The Optimal BERT Surgeon.” These papers show that sparse models can match or even outperform the transfer performance of dense models, even at high sparsities, while delivering significant inference and even training speedups.

No prep is required. Just bring your curiosity. 

RSVP by filling out the form on the right (below if on mobile).

Moderator:

  • Mark Kurtz, Director of Machine Learning, Neural Magic

Save Your Spot:

Date: December 7, 2022
Time: 12:00 PM ET / 9:00 AM PT
