New Features:
* More performant YOLOv5s and YOLOv5l sparse quantized models
* YOLOv5 sparse quantized models for [m](https://sparsezoo.neuralmagic.com/models/cv%2Fdetection%2Fyolov5-s%2Fpytorch%2Fultralytics%2Fcoco%2Fpruned85-none), [l](https://sparsezoo.neuralmagic.com/models/cv%2Fdetection%2Fyolov5-l%2Fpytorch%2Fultralytics%2Fcoco%2Fpruned94-none), [x](https://sparsezoo.neuralmagic.com/models/cv%2Fdetection%2Fyolov5-x%2Fpytorch%2Fultralytics%2Fcoco%2Fpruned70_quant-none-vnni) versions (a download sketch follows this list)
* YOLOv5p6 sparse quantized models for n, s, m, l, x versions
* NLP multi-label text classification models for [BERT-base](https://sparsezoo.neuralmagic.com/models/nlp%2Fmultilabel_text_classification%2Fbert-large%2Fpytorch%2Fhuggingface%2Fgoemotions%2Fpruned90_quant-none), DistilBERT, and [BERT-Large](https://sparsezoo.neuralmagic.com/models/nlp%2Fmultilabel_text_classification%2Fbert-large%2Fpytorch%2Fhuggingface%2Fgoemotions%2Fpruned90_quant-none) on the [GoEmotions dataset](https://sparsezoo.neuralmagic.com/?order_by=modified&descending=true&page=1&keywords=&dataset=goemotions)
* Initial oBERTa models (RoBERTa-style models) for SQuAD and GLUE tasks
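The new models listed above can be pulled down programmatically. Below is a minimal sketch, assuming the `sparsezoo` Python package is installed and that the stub (derived from the YOLOv5-l model page linked above) still resolves; swap in the stub for any other listed model as needed.

```python
# Minimal sketch: download one of the newly listed models with the sparsezoo Python API.
# The stub below mirrors the YOLOv5-l model page path linked above and is an assumption;
# adjust it if the model path changes.
from sparsezoo import Model

stub = "zoo:cv/detection/yolov5-l/pytorch/ultralytics/coco/pruned94-none"

model = Model(stub)   # resolve the stub against the SparseZoo API
model.download()      # fetch the model files into the local SparseZoo cache
print(model.path)     # local directory containing the downloaded files
```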
Changes:
* None
Resolved Issues:
* Due to a breaking change in NumPy, its version was pinned to <=1.21.6 to prevent crashes across SparseZoo, SparseML, and DeepSparse (a local check is sketched below).
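For users managing their own environments, the pin can be verified locally. This is only an illustrative sketch, not part of the release; the `packaging` helper and the check itself are assumptions.

```python
# Illustrative check (not part of this release): confirm the installed NumPy
# satisfies the <=1.21.6 pin that SparseZoo, SparseML, and DeepSparse expect.
import numpy
from packaging.version import Version

if Version(numpy.__version__) > Version("1.21.6"):
    raise RuntimeError(
        f"numpy {numpy.__version__} exceeds the pinned 1.21.6; "
        "install an older release, e.g. `pip install 'numpy<=1.21.6'`."
    )
```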
Known Issues:
* None