### Added
- Added `convert_to_onnx` function to the following models:
  - ClassificationModel
  - NERModel
- Converted ONNX models can be loaded (by specifying `onnx: True` in `model_args`) and used for prediction.
- Added `fp16` support for evaluation and prediction (requires PyTorch >= 1.6) to the following models:
  - ClassificationModel
  - ConvAI
  - MultiModalClassificationModel
  - NERModel
  - QuestionAnsweringModel
  - Seq2Seq
  - T5Model
- Added multi-GPU support for prediction/evaluation to the following models:
  - ClassificationModel
  - ConvAI
  - MultiModalClassificationModel
  - NERModel
  - QuestionAnsweringModel
  - Seq2Seq
  - T5Model
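A minimal sketch of how the new options above might be combined through `model_args`. Only `onnx: True` is named in these notes; the `fp16` and `n_gpu` key names are assumptions about the exact options, so check the library documentation before relying on them.

```python
# Hedged sketch: model_args for the new ONNX, fp16, and multi-GPU features.
# Only `onnx: True` is confirmed by the notes above; `fp16` and `n_gpu`
# are assumed option names.
model_args = {
    "onnx": True,   # load a model previously exported with `convert_to_onnx`
    "fp16": True,   # half-precision evaluation/prediction (PyTorch >= 1.6)
    "n_gpu": 2,     # use two GPUs for prediction/evaluation
}

# Hypothetical usage, commented out because it needs a trained model on
# disk and the library installed:
#
#   from simpletransformers.classification import ClassificationModel
#
#   # 1. Export a trained model to ONNX:
#   model = ClassificationModel("roberta", "outputs/best_model")
#   model.convert_to_onnx("outputs/onnx")
#
#   # 2. Reload the converted model and predict:
#   onnx_model = ClassificationModel("roberta", "outputs/onnx", args=model_args)
#   predictions, raw_outputs = onnx_model.predict(["example sentence"])

print(model_args)
```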
### Fixed
- Thread count can now be specified for MultiLabelClassificationModel.
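The thread-count option might be passed via the model's args dict, as in this sketch; the key name `process_count` is an assumption for illustration, not confirmed by the note above.

```python
# Hedged sketch: specifying a worker/thread count for
# MultiLabelClassificationModel. The key name "process_count" is an
# assumed option name -- verify against the library documentation.
args = {"process_count": 4}

# Hypothetical usage (requires the library installed):
#
#   from simpletransformers.classification import MultiLabelClassificationModel
#   model = MultiLabelClassificationModel("roberta", "roberta-base", args=args)

print(args["process_count"])
```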