Neural Magic 1.3 Product Release

Dec 23, 2022

As the year comes to a close and we look forward to celebrating the holidays together with our friends and families, all of us at Neural Magic would like to thank you for your continued community support.

Here are highlights of the 1.3 product release of our DeepSparse, SparseML, and SparseZoo libraries. The full technical release notes are always available in the GitHub release notes linked from each Neural Magic repository. If you have any questions, need assistance, or simply want to say hello to our vibrant ML performance community, join us in the Neural Magic Community Slack.

DeepSparse 1.3 Highlights: Bfloat16 support, MLOps logging capabilities with Prometheus, and new use case support for SQuAD 2.0 and NLP multi-label tasks

Bfloat16 is now supported on CPUs with the AVX512_BF16 extension. For sparse and dense FP32 networks, you can expect performance improvements of up to 30% and 75%, respectively. This feature is opt-in and is enabled with the default_precision parameter in the configuration file. We have also added Max and Min operators to the runtime, improving performance across a wider range of models.
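
For orientation, here is a minimal sketch of what opting into bfloat16 in a DeepSparse Server configuration file could look like. The default_precision parameter is the knob named above; the endpoint fields and the exact accepted precision values are illustrative assumptions, so check the DeepSparse documentation for the actual schema.

```yaml
# config.yaml -- hypothetical deepsparse.server configuration
# default_precision is the opt-in knob from this release; the value strings
# and the endpoint fields below are assumptions for illustration only.
default_precision: bfloat16   # assumed values: float32 (default) | bfloat16

endpoints:
  - task: text_classification
    model: zoo:placeholder/model/stub   # placeholder stub, copy a real one from SparseZoo
```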

New logging capabilities, including metrics logging, custom logging functions, and Prometheus support, have been added to DeepSparse for production MLOps pipelines. You can now stream your logs directly to Prometheus to understand your model's performance and key metrics. More documentation is available.
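
As a hedged sketch of how Prometheus metrics streaming might be wired into the same configuration file, the snippet below shows one plausible shape; the loggers/prometheus key names and the port field are assumptions for illustration, so refer to the logging documentation for the real schema.

```yaml
# Hypothetical logging section of a deepsparse.server config.
# Key names and fields are assumptions; consult the DeepSparse logging docs.
loggers:
  prometheus:
    port: 6100   # assumed: port on which the Prometheus-scrapable metrics endpoint is exposed
```

Prometheus can then scrape that endpoint like any other target, and custom metric functions can be layered on top of the same logging pipeline.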

Key Changes: 

  • PyTorch 1.12 and Python 3.10 support
  • YOLOv5 pipelines upgrade from Ultralytics
  • Transformers pipelines update from Hugging Face
  • DeepSparse performance improvements

View full DeepSparse release notes.

SparseML 1.3 Highlights: Recipe template APIs and new model support

SparseML recipe template APIs have been implemented, enabling you to easily create recipes for custom models using standard sparsification pathways. To get started with the new recipe template APIs, check out the documentation here.
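
To give a sense of what such a recipe contains, below is a minimal pruning-plus-quantization recipe written with standard SparseML modifiers. The modifier names follow the documented recipe format, but the hyperparameter values are placeholders rather than the output of the template API for any particular model.

```yaml
# Illustrative SparseML recipe; epochs and sparsity targets are placeholders.
modifiers:
  - !EpochRangeModifier
    start_epoch: 0.0
    end_epoch: 30.0

  - !GMPruningModifier
    init_sparsity: 0.05
    final_sparsity: 0.80
    start_epoch: 1.0
    end_epoch: 20.0
    update_frequency: 0.5
    params: __ALL_PRUNABLE__

  - !QuantizationModifier
    start_epoch: 25.0
```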

Additionally, as part of our effort to support a broader range of use cases and models, we have introduced new pathway support for many of the new sparse models (EfficientNetV2 and oBERTa) and datasets (SQuAD 2.0, GoEmotions, and GLUE) added to the SparseZoo.

View full SparseML release notes.

SparseZoo 1.3 Highlights: New YOLO models and NLP multi-label dataset additions for BERT

Continuing our efforts to enable new use cases based on user feedback, new models have been added for various datasets:

  • BERT (GoEmotions, SQuAD 2.0)
  • oBERTa base (GLUE)
  • YOLOv5 and YOLOv5p6 (transfer learning) 
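
If you want to pull one of these models down for benchmarking or transfer learning, a minimal sketch using the sparsezoo Python API is shown below. The stub string is a placeholder; copy the actual stub for the model you want from its SparseZoo listing.

```python
from sparsezoo import Model

# Placeholder stub -- replace with the stub copied from the model's SparseZoo page.
stub = "zoo:placeholder/model/stub"

model = Model(stub)   # reference the model in the SparseZoo
model.download()      # download the model files locally
print(model.path)     # local directory containing the downloaded artifacts
```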

View full SparseZoo release notes.


If you have any questions, need assistance, or simply want to say hello to our vibrant ML performance community, join us in the Neural Magic Community Slack.

- Neural Magic Product Team
